Intensive blood glucose control and vascular outcomes in patients with type 2 diabetes.
Patel, Anushka; MacMahon, Stephen; Chalmers, John; Neal, Bruce; Billot, Laurent; Woodward, Mark; Marre, Michel; Cooper, Mark; Glasziou, Paul; Grobbee, Diederick; Hamet, Pavel; Harrap, Stephen; Heller, Simon; Liu, Lisheng; Mancia, Giuseppe; Mogensen, Carl Erik; Pan, Changyu; Poulter, Neil; Rodgers, Anthony; Williams, Bryan; Bompoint, Severine; de Galan, Bastiaan E; Joshi, Rohina; Travert, Florence
2008-06-12
In patients with type 2 diabetes, the effects of intensive glucose control on vascular outcomes remain uncertain. We randomly assigned 11,140 patients with type 2 diabetes to undergo either standard glucose control or intensive glucose control, defined as the use of gliclazide (modified release) plus other drugs as required to achieve a glycated hemoglobin value of 6.5% or less. Primary end points were composites of major macrovascular events (death from cardiovascular causes, nonfatal myocardial infarction, or nonfatal stroke) and major microvascular events (new or worsening nephropathy or retinopathy), assessed both jointly and separately. After a median of 5 years of follow-up, the mean glycated hemoglobin level was lower in the intensive-control group (6.5%) than in the standard-control group (7.3%). Intensive control reduced the incidence of combined major macrovascular and microvascular events (18.1%, vs. 20.0% with standard control; hazard ratio, 0.90; 95% confidence interval [CI], 0.82 to 0.98; P=0.01), as well as that of major microvascular events (9.4% vs. 10.9%; hazard ratio, 0.86; 95% CI, 0.77 to 0.97; P=0.01), primarily because of a reduction in the incidence of nephropathy (4.1% vs. 5.2%; hazard ratio, 0.79; 95% CI, 0.66 to 0.93; P=0.006), with no significant effect on retinopathy (P=0.50). There were no significant effects of the type of glucose control on major macrovascular events (hazard ratio with intensive control, 0.94; 95% CI, 0.84 to 1.06; P=0.32), death from cardiovascular causes (hazard ratio with intensive control, 0.88; 95% CI, 0.74 to 1.04; P=0.12), or death from any cause (hazard ratio with intensive control, 0.93; 95% CI, 0.83 to 1.06; P=0.28). Severe hypoglycemia, although uncommon, was more common in the intensive-control group (2.7%, vs. 1.5% in the standard-control group; hazard ratio, 1.86; 95% CI, 1.42 to 2.40; P<0.001). 
A strategy of intensive glucose control, involving gliclazide (modified release) and other drugs as required, that lowered the glycated hemoglobin value to 6.5% yielded a 10% relative reduction in the combined outcome of major macrovascular and microvascular events, primarily as a consequence of a 21% relative reduction in nephropathy. (ClinicalTrials.gov number, NCT00145925.) Copyright © 2008 Massachusetts Medical Society.
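The reported percentage reductions follow directly from the hazard ratios (relative reduction = 1 − HR). A minimal sketch of that arithmetic using the abstract's figures; the function is illustrative, not the trial's analysis code:

```python
def relative_reduction_pct(hazard_ratio: float) -> float:
    """Relative reduction (%) implied by a hazard ratio below 1."""
    return round((1.0 - hazard_ratio) * 100.0, 1)

# Figures reported in the abstract:
combined = relative_reduction_pct(0.90)  # combined macro/microvascular events
nephro = relative_reduction_pct(0.79)    # new or worsening nephropathy
print(combined, nephro)  # → 10.0 21.0
```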
Prentice, Ross L.; Chlebowski, Rowan T.; Stefanick, Marcia L.; Manson, JoAnn E.; Langer, Robert D.; Pettinger, Mary; Hendrix, Susan L.; Hubbell, F. Allan; Kooperberg, Charles; Kuller, Lewis H.; Lane, Dorothy S.; McTiernan, Anne; O’Sullivan, Mary Jo; Rossouw, Jacques E.; Anderson, Garnet L.
2009-01-01
The Women’s Health Initiative randomized controlled trial found a trend (p = 0.09) toward a lower breast cancer risk among women assigned to daily 0.625-mg conjugated equine estrogens (CEEs) compared with placebo, in contrast to an observational literature that mostly reports a moderate increase in risk with estrogen-alone preparations. Using data from 40 US clinical centers in 1993–2004, breast cancer hazard ratio estimates for this CEE regimen were compared between the Women’s Health Initiative clinical trial and observational study, with the aim of understanding this apparent discrepancy and refining hazard ratio estimates. After control for prior use of postmenopausal hormone therapy and for confounding factors, CEE hazard ratio estimates were 43% higher in the observational study than in the clinical trial (p = 0.12). However, after additional control for time from menopause to first use of postmenopausal hormone therapy, the hazard ratios agreed closely between the two cohorts (p = 0.82). For women who begin use soon after menopause, combined analyses of clinical trial and observational study data do not provide clear evidence of either an overall reduction or an increase in breast cancer risk with CEEs, although hazard ratios appeared to be relatively higher among women with certain breast cancer risk factors or a low body mass index. PMID:18448442
Olsen, Morten; Hjortdal, Vibeke E; Mortensen, Laust H; Christensen, Thomas D; Sørensen, Henrik T; Pedersen, Lars
2011-04-01
Congenital heart defect patients may experience neurodevelopmental impairment. We investigated their educational attainments from basic schooling to higher education. Using administrative databases, we identified all Danish patients with a cardiac defect diagnosis born from 1 January, 1977 to 1 January, 1991 and alive at age 13 years. As a comparison cohort, we randomly sampled 10 persons per patient. We obtained information on educational attainment from Denmark's Database for Labour Market Research. The study population was followed until achievement of educational levels, death, emigration, or 1 January, 2006. We estimated the hazard ratio of attaining given educational levels, conditional on completing preceding levels, using discrete-time Cox regression and adjusting for socio-economic factors. Analyses were repeated for a sub-cohort of patients and controls born at term and without extracardiac defects or chromosomal anomalies. We identified 2986 patients. Their probability of completing compulsory basic schooling was approximately 10% lower than that of control individuals (adjusted hazard ratio = 0.79; 95% confidence interval: 0.75-0.82). Their subsequent probability of completing secondary school was lower than that of the controls, both for all patients (adjusted hazard ratio = 0.74; 95% confidence interval: 0.69-0.80) and for the sub-cohort (adjusted hazard ratio = 0.80; 95% confidence interval: 0.73-0.86). The probability of attaining a higher degree, conditional on completion of youth education, was affected both for all patients (adjusted hazard ratio = 0.88; 95% confidence interval: 0.76-1.01) and for the sub-cohort (adjusted hazard ratio = 0.92; 95% confidence interval: 0.79-1.07). The probability of educational attainment was reduced among long-term congenital heart defect survivors.
Sun, Ming-Hui; Liao, Yaping Joyce; Lin, Che-Chen; Chiang, Rayleigh Ping-Ying; Wei, James Cheng-Chung
2018-04-26
Obstructive sleep apnea (OSA) is associated with many systemic diseases including diabetes, hypertension, stroke, and cardiovascular disease. The aim of our study was to investigate the association between OSA and optic neuropathy (ON), and to evaluate the efficacy of treatment for OSA on the risk of ON. We used data from the Longitudinal Health Insurance Database, which covers one million insurants of the Taiwan National Health Insurance program (Taiwan NHI). OSA patients had a 1.95-fold higher risk of ON compared with non-OSA patients across all age groups. The risk was significantly higher in the group aged <45 years (adjusted hazard ratio: 4.21) and in male individuals (adjusted hazard ratio: 1.93). Sleep apnea was associated with ON regardless of the presence of comorbidities. OSA patients treated with continuous positive airway pressure (CPAP) had an adjusted 2.31-fold higher hazard of developing ON compared to controls, and those without any treatment had an adjusted 1.82-fold higher hazard compared to controls. Moreover, ON patients had a 1.45-fold higher risk of OSA, with the highest risk among those aged between 45 and 64 years (hazard ratio: 1.76) and male individuals (hazard ratio: 1.55). Our study showed that OSA increased the risk of developing ON after controlling for comorbidities; however, treatment with CPAP did not reduce the risk of ON. Further large population-based studies with access to medical records on the severity and treatment of OSA are needed to clarify the efficacy of treatment for OSA in reducing the risk of ON.
A balanced hazard ratio for risk group evaluation from survival data.
Branders, Samuel; Dupont, Pierre
2015-07-30
Common clinical studies assess the quality of prognostic factors, such as gene expression signatures, clinical variables or environmental factors, and cluster patients into various risk groups. Typical examples include cancer clinical trials where patients are clustered into high- or low-risk groups. When applied to survival data analysis, such groups are intended to represent patients with similar survival odds and to guide selection of the most appropriate therapy. The relevance of such risk groups, and of the related prognostic factors, is typically assessed through the computation of a hazard ratio. We first stress three limitations of assessing risk groups through the hazard ratio: (1) it may promote the definition of arbitrarily unbalanced risk groups; (2) an apparently optimal group hazard ratio can be largely inconsistent with the p-value commonly associated with it; and (3) marginal changes in risk group proportions may lead to highly different hazard ratio values. These issues could lead to inappropriate comparisons between prognostic factors. Next, we propose the balanced hazard ratio to address these issues. This new performance metric keeps an intuitive interpretation and is just as simple to compute. We also show how the balanced hazard ratio leads to a natural cut-off choice for defining risk groups from continuous risk scores. The proposed methodology is validated through controlled experiments for which a prescribed cut-off value is defined by design. Further results are reported on several cancer prognosis studies, and the proposed methodology could be applied more generally to assess the quality of any prognostic marker. Copyright © 2015 John Wiley & Sons, Ltd.
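To make the paper's starting point concrete, the conventional group hazard ratio it critiques can be estimated from observed and expected event counts (the log-rank O/E estimator). A minimal sketch with toy data; this is the standard metric, not the authors' balanced variant:

```python
def oe_hazard_ratio(group_a, group_b):
    """group_*: list of (time, event) pairs, event=1 for death, 0 for censoring.
    Returns the O/E-based hazard ratio of group A versus group B."""
    event_times = sorted({t for t, e in group_a + group_b if e == 1})
    obs_a = sum(e for _, e in group_a)
    obs_b = sum(e for _, e in group_b)
    exp_a = exp_b = 0.0
    for t in event_times:
        # Risk sets include everyone still under observation at time t.
        at_risk_a = sum(1 for ti, _ in group_a if ti >= t)
        at_risk_b = sum(1 for ti, _ in group_b if ti >= t)
        deaths = sum(1 for ti, e in group_a + group_b if ti == t and e == 1)
        total = at_risk_a + at_risk_b
        exp_a += deaths * at_risk_a / total
        exp_b += deaths * at_risk_b / total
    return (obs_a / exp_a) / (obs_b / exp_b)

# Toy data: group A experiences events earlier, so its hazard ratio exceeds 1.
a = [(1, 1), (2, 1), (3, 0), (4, 1)]
b = [(3, 1), (5, 0), (6, 1), (7, 0)]
print(round(oe_hazard_ratio(a, b), 2))
```

Note how the estimate depends on the two risk-set sizes at every event time, which is exactly where arbitrarily unbalanced group definitions can distort comparisons.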
Song, Yun; Xu, Benjamin; Xu, Richard; Tung, Renee; Frank, Eric; Tromble, Wayne; Fu, Tong; Zhang, Weiyi; Yu, Tao; Zhang, Chunyan; Fan, Fangfang; Zhang, Yan; Li, Jianping; Bao, Huihui; Cheng, Xiaoshu; Qin, Xianhui; Tang, Genfu; Chen, Yundai; Yang, Tianlun; Sun, Ningling; Li, Xiaoying; Zhao, Lianyou; Hou, Fan Fan; Ge, Junbo; Dong, Qiang; Wang, Binyan; Xu, Xiping; Huo, Yong
2016-07-01
Pulse wave velocity (PWV) has been shown to influence the effects of antihypertensive drugs in the prevention of cardiovascular diseases. Data are limited on whether PWV is an independent predictor of stroke above and beyond hypertension control. This longitudinal analysis examined the independent and joint effect of brachial-ankle PWV (baPWV) with hypertension control on the risk of first stroke. This report included 3310 hypertensive adults, a subset of the China Stroke Primary Prevention Trial (CSPPT) with baseline measurements for baPWV. During a median follow-up of 4.5 years, 111 participants developed first stroke. The risk of stroke was higher among participants with baPWV in the highest quartile than among those in the lower quartiles (6.3% versus 2.4%; hazard ratio, 1.66; 95% confidence interval, 1.06-2.60). Similarly, the participants with inadequate hypertension control had a higher risk of stroke than those with adequate control (5.1% versus 1.8%; hazard ratio, 2.32; 95% confidence interval, 1.49-3.61). When baPWV and hypertension control were examined jointly, participants in the highest baPWV quartile and with inadequate hypertension control had the highest risk of stroke compared with their counterparts (7.5% versus 1.3%; hazard ratio, 3.57; 95% confidence interval, 1.88-6.77). There was a significant and independent effect of high baPWV on stroke as shown among participants with adequate hypertension control (4.2% versus 1.3%; hazard ratio, 2.29, 95% confidence interval, 1.09-4.81). In summary, among hypertensive patients, baPWV and hypertension control were found to independently and jointly affect the risk of first stroke. Participants with high baPWV and inadequate hypertension control had the highest risk of stroke compared with other groups. © 2016 American Heart Association, Inc.
Adverse health outcomes in women exposed in utero to diethylstilbestrol.
Hoover, Robert N; Hyer, Marianne; Pfeiffer, Ruth M; Adam, Ervin; Bond, Brian; Cheville, Andrea L; Colton, Theodore; Hartge, Patricia; Hatch, Elizabeth E; Herbst, Arthur L; Karlan, Beth Y; Kaufman, Raymond; Noller, Kenneth L; Palmer, Julie R; Robboy, Stanley J; Saal, Robert C; Strohsnitter, William; Titus-Ernstoff, Linda; Troisi, Rebecca
2011-10-06
Before 1971, several million women were exposed in utero to diethylstilbestrol (DES) given to their mothers to prevent pregnancy complications. Several adverse outcomes have been linked to such exposure, but their cumulative effects are not well understood. We combined data from three studies initiated in the 1970s with continued long-term follow-up of 4653 women exposed in utero to DES and 1927 unexposed controls. We assessed the risks of 12 adverse outcomes linked to DES exposure, including cumulative risks to 45 years of age for reproductive outcomes and to 55 years of age for other outcomes, and their relationships to the baseline presence or absence of vaginal epithelial changes, which are correlated with a higher dose of, and earlier exposure to, DES in utero. Cumulative risks in women exposed to DES, as compared with those not exposed, were as follows: for infertility, 33.3% vs. 15.5% (hazard ratio, 2.37; 95% confidence interval [CI], 2.05 to 2.75); spontaneous abortion, 50.3% vs. 38.6% (hazard ratio, 1.64; 95% CI, 1.42 to 1.88); preterm delivery, 53.3% vs. 17.8% (hazard ratio, 4.68; 95% CI, 3.74 to 5.86); loss of second-trimester pregnancy, 16.4% vs. 1.7% (hazard ratio, 3.77; 95% CI, 2.56 to 5.54); ectopic pregnancy, 14.6% vs. 2.9% (hazard ratio, 3.72; 95% CI, 2.58 to 5.38); preeclampsia, 26.4% vs. 13.7% (hazard ratio 1.42; 95% CI, 1.07 to 1.89); stillbirth, 8.9% vs. 2.6% (hazard ratio, 2.45; 95% CI, 1.33 to 4.54); early menopause, 5.1% vs. 1.7% (hazard ratio, 2.35; 95% CI, 1.67 to 3.31); grade 2 or higher cervical intraepithelial neoplasia, 6.9% vs. 3.4% (hazard ratio, 2.28; 95% CI, 1.59 to 3.27); and breast cancer at 40 years of age or older, 3.9% vs. 2.2% (hazard ratio, 1.82; 95% CI, 1.04 to 3.18). For most outcomes, the risks among exposed women were higher for those with vaginal epithelial changes than for those without such changes. 
In utero exposure of women to DES is associated with a high lifetime risk of a broad spectrum of adverse health outcomes. (Funded by the National Cancer Institute.).
Khan, Muhammad; Lin, Jie; Liao, Guixiang; Li, Rong; Wang, Baiyao; Xie, Guozhu; Zheng, Jieling; Yuan, Yawei
2017-07-01
Whole brain radiotherapy has been a standard treatment for brain metastases. Stereotactic radiosurgery delivers more focal, aggressive radiation with greater normal-tissue sparing, but worse local and distant control. This meta-analysis was performed to assess and compare the effectiveness of whole brain radiotherapy alone, stereotactic radiosurgery alone, and their combination in the treatment of brain metastases, based on randomized controlled trials. Electronic databases (PubMed, MEDLINE, Embase, and Cochrane Library) were searched to identify randomized controlled trials comparing treatment outcomes of whole brain radiotherapy and stereotactic radiosurgery. The meta-analysis was performed using the Review Manager (RevMan) software (version 5.2) provided by the Cochrane Collaboration. The data used were hazard ratios with 95% confidence intervals calculated for time-to-event data extracted from survival curves and local tumor control rate curves. Odds ratios with 95% confidence intervals were calculated for dichotomous data, and mean differences with 95% confidence intervals for continuous data. Fixed-effects or random-effects models were adopted according to heterogeneity. Five studies (n = 763) meeting the inclusion criteria were included in this meta-analysis. All were randomized controlled trials, with sample sizes ranging from 27 to 331. In total, 202 (26%) patients received whole brain radiotherapy alone, 196 (26%) received stereotactic radiosurgery alone, and 365 (48%) were in the whole brain radiotherapy plus stereotactic radiosurgery group.
No significant survival benefit was observed for any treatment approach: the hazard ratio was 1.19 (95% confidence interval: 0.96-1.43, p = 0.12), based on three randomized controlled trials, for whole brain radiotherapy only compared to whole brain radiotherapy plus stereotactic radiosurgery, and 1.03 (95% confidence interval: 0.82-1.29, p = 0.81) for stereotactic radiosurgery only compared to the combined approach. Local control was best achieved when whole brain radiotherapy was combined with stereotactic radiosurgery: hazard ratios of 2.05 (95% confidence interval: 1.36-3.09, p = 0.0006) and 1.84 (95% confidence interval: 1.26-2.70, p = 0.002) were obtained from comparing whole brain radiotherapy only and stereotactic radiosurgery only, respectively, to whole brain radiotherapy plus stereotactic radiosurgery. There was no difference in adverse events between treatments: odds ratio 1.16 (95% confidence interval: 0.77-1.76, p = 0.48) for whole brain radiotherapy plus stereotactic radiosurgery versus whole brain radiotherapy only, and odds ratio 0.92 (95% confidence interval: 0.59-1.42, p = 0.71) versus stereotactic radiosurgery only. Adding stereotactic radiosurgery to whole brain radiotherapy provides better local control than either whole brain radiotherapy only or stereotactic radiosurgery only, with no difference in radiation-related toxicities.
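The pooling step described, inverse-variance combination of log hazard ratios under a fixed-effect model, can be sketched as follows; the study inputs below are illustrative placeholders, not the extracted trial data:

```python
import math

def pool_hazard_ratios(hrs_with_cis):
    """hrs_with_cis: list of (hr, ci_low, ci_high) per study.
    Returns (pooled HR, ci_low, ci_high) under a fixed-effect model."""
    num = den = 0.0
    for hr, lo, hi in hrs_with_cis:
        log_hr = math.log(hr)
        # Standard error recovered from the width of the 95% CI on the log scale.
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2  # inverse-variance weight
        num += w * log_hr
        den += w
    pooled = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

# Illustrative studies (HR, 95% CI low, 95% CI high):
studies = [(1.20, 0.90, 1.60), (1.10, 0.85, 1.42), (1.35, 0.95, 1.92)]
hr, lo, hi = pool_hazard_ratios(studies)
print(round(hr, 2), round(lo, 2), round(hi, 2))
```

The pooled estimate lands between the individual study estimates, weighted toward the most precise study.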
2013-01-01
Background Designs and analyses of clinical trials with a time-to-event outcome almost invariably rely on the hazard ratio to estimate the treatment effect and implicitly, therefore, on the proportional hazards assumption. However, the results of some recent trials indicate that there is no guarantee that the assumption will hold. Here, we describe the use of the restricted mean survival time as a possible alternative tool in the design and analysis of these trials. Methods The restricted mean is a measure of average survival from time 0 to a specified time point, and may be estimated as the area under the survival curve up to that point. We consider the design of such trials according to a wide range of possible survival distributions in the control and research arm(s). The distributions are conveniently defined as piecewise exponential distributions and can be specified through piecewise constant hazards and time-fixed or time-dependent hazard ratios. Such designs can embody proportional or non-proportional hazards of the treatment effect. Results We demonstrate the use of restricted mean survival time and a test of the difference in restricted means as an alternative measure of treatment effect. We support the approach through the results of simulation studies and in real examples from several cancer trials. We illustrate the required sample size under proportional and non-proportional hazards, as well as the significance level and power of the proposed test. Values are compared with those from the standard approach, which utilizes the logrank test. Conclusions We conclude that the hazard ratio cannot be recommended as a general measure of the treatment effect in a randomized controlled trial, nor is it always appropriate when designing a trial. Restricted mean survival time may provide a practical way forward and deserves greater attention. PMID:24314264
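The definition given above, the restricted mean as the area under the survival curve up to a time point, can be computed directly from a Kaplan-Meier estimate. A minimal sketch with toy data, not from the cited trials:

```python
def km_curve(data):
    """data: list of (time, event). Returns [(time, survival)] step points."""
    surv, points = 1.0, []
    for t in sorted({ti for ti, e in data if e == 1}):
        at_risk = sum(1 for ti, _ in data if ti >= t)
        deaths = sum(1 for ti, e in data if ti == t and e == 1)
        surv *= 1 - deaths / at_risk  # Kaplan-Meier product-limit step
        points.append((t, surv))
    return points

def rmst(data, t_star):
    """Restricted mean survival time: area under the KM step function on [0, t_star]."""
    area, prev_t, prev_s = 0.0, 0.0, 1.0
    for t, s in km_curve(data):
        if t >= t_star:
            break
        area += prev_s * (t - prev_t)
        prev_t, prev_s = t, s
    return area + prev_s * (t_star - prev_t)

data = [(2, 1), (3, 0), (5, 1), (7, 1), (8, 0)]
print(round(rmst(data, 6), 3))  # → 4.933
```

Because the area is accumulated only up to t_star, the estimate is well defined even when the proportional hazards assumption fails.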
Bunch, T Jared; May, Heidi T; Bair, Tami L; Anderson, Jeffrey L; Crandall, Brian G; Cutler, Michael J; Jacobs, Victoria; Mallender, Charles; Muhlestein, Joseph B; Osborn, Jeffrey S; Weiss, J Peter; Day, John D
2015-12-01
There is a paucity of data on the long-term natural history of adult Wolff-Parkinson-White syndrome (WPW) patients with regard to risk of mortality and atrial fibrillation. We sought to describe the long-term outcomes of WPW patients and ascertain the impact of ablation on the natural history. Three groups of patients were studied: 2 WPW populations (ablation: 872; no ablation: 1461) and a 1:5 control population (n=11,175). Long-term mortality and atrial fibrillation rates were determined. The average follow-up for the WPW group was 7.9±5.9 (median: 6.9) years and was similar between the ablation and nonablation groups. Death rates were similar between the WPW group and the control group (hazard ratio, 0.96; 95% confidence interval, 0.83-1.11; P=0.56). Nonablated WPW patients had a higher long-term death risk compared with ablated WPW patients (hazard ratio, 2.10; 95% confidence interval: 1.50-20.93; P<0.0001). Incident atrial fibrillation risk was higher in the WPW group compared with the control population (hazard ratio, 1.55; 95% confidence interval, 1.29-1.87; P<0.0001). Nonablated WPW patients had lower risk than ablated patients (hazard ratio, 0.39; 95% confidence interval, 0.28-0.53; P<0.0001). Long-term mortality rates in WPW patients are low and similar to those of an age- and gender-matched control population. WPW patients who underwent the multifactorial process of ablation had lower mortality compared with nonablated WPW patients. Atrial fibrillation rates are high in the long term, and ablation does not reduce this risk. © 2015 American Heart Association, Inc.
Royston, Patrick; Parmar, Mahesh K B
2014-08-07
Most randomized controlled trials with a time-to-event outcome are designed and analysed under the proportional hazards assumption, with a target hazard ratio for the treatment effect in mind. However, the hazards may be non-proportional. We address how to design a trial under such conditions, and how to analyse the results. We propose to extend the usual approach, a logrank test, to also include the Grambsch-Therneau test of proportional hazards. We test the resulting composite null hypothesis using a joint test for the hazard ratio and for time-dependent behaviour of the hazard ratio. We compute the power and sample size for the logrank test under proportional hazards, and from that we compute the power of the joint test. For the estimation of relevant quantities from the trial data, various models could be used; we advocate adopting a pre-specified flexible parametric survival model that supports time-dependent behaviour of the hazard ratio. We present the mathematics for calculating the power and sample size for the joint test. We illustrate the methodology in real data from two randomized trials, one in ovarian cancer and the other in treating cellulitis. We show selected estimates and their uncertainty derived from the advocated flexible parametric model. We demonstrate in a small simulation study that when a treatment effect either increases or decreases over time, the joint test can outperform the logrank test in the presence of both patterns of non-proportional hazards. Those designing and analysing trials in the era of non-proportional hazards need to acknowledge that a more complex type of treatment effect is becoming more common. Our method for the design of the trial retains the tools familiar in the standard methodology based on the logrank test, and extends it to incorporate a joint test of the null hypothesis with power against non-proportional hazards. 
For the analysis of trial data, we propose the use of a pre-specified flexible parametric model that can represent a time-dependent hazard ratio if one is present.
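Assuming the logrank and Grambsch-Therneau chi-square statistics have each been computed elsewhere and are asymptotically independent (as the joint-test construction requires), the combination step reduces to summing them on 2 degrees of freedom, for which the chi-square p-value has the closed form exp(−x/2). A minimal sketch of that final step only; the inputs are hypothetical:

```python
import math

def joint_test_p(logrank_chi2: float, ph_test_chi2: float) -> float:
    """p-value of the 2-df joint test of the composite null hypothesis
    (no treatment effect AND proportional hazards). For a chi-square
    variable with 2 df, P(X > x) = exp(-x/2) exactly."""
    return math.exp(-(logrank_chi2 + ph_test_chi2) / 2.0)

# Example: a weak logrank signal combined with clear non-proportionality
# can still reject the composite null at the 5% level.
print(round(joint_test_p(2.0, 6.0), 4))  # → 0.0183
```

This illustrates the paper's point: evidence of a time-dependent hazard ratio contributes to rejecting the null even when the logrank statistic alone would not.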
Chen, I-Chun; Lee, Ming-Huei; Lin, Hsuan-Hung; Wu, Shang-Liang; Chang, Kun-Min; Lin, Hsiu-Ying
2017-05-01
Interstitial cystitis/bladder pain syndrome (IC/BPS) has several well-known comorbid psychiatric manifestations, including insomnia, anxiety, and depression. We hypothesized that somatoform disorder, a psychosomatic disease, can serve as a sensitive psychiatric phenotype of IC/BPS, and we investigated whether somatoform disorder increases the risk of IC/BPS. A nested case-control study and a retrospective cohort study were conducted over a 12-year period (2002-2013) using the Taiwan Health Insurance Reimbursement Database. In the nested case-control study, 1612 patients with IC/BPS were matched in a 1:2 ratio to 3224 controls based on propensity scores. The odds ratio for somatoform disorder was calculated using conditional logistic regression analysis. In the retrospective cohort study, 1436 patients with somatoform disorder were matched in a 1:2 ratio to 2872 patients without somatoform disorder based on propensity scores. Cox regression analysis was used to estimate the hazard ratio for the development of IC/BPS in patients with somatoform disorder, and the cumulative survival probability was tested using Kaplan-Meier analysis. We found that the odds ratio for somatoform disorder was 2.46 (95% confidence interval [CI], 1.05-5.76). Whereas the average time until IC/BPS development was 11.5 ± 1.3 years in the control subjects, this interval was shorter in patients with somatoform disorder (6.3 ± 3.6 years). The hazard ratio for developing IC/BPS was 2.50 (95% CI 1.23-5.58); the adjusted hazard ratio was 2.26 (95% CI 1.002-5.007). The patients and controls also differed significantly in their cumulative survival probability for IC/BPS (log-rank P < .05). Evidence from the nested case-control study and the retrospective cohort study consistently indicated that somatoform disorder increases the risk of IC/BPS. Our study suggests that somatoform disorder can be used as a sensitive psychiatric phenotype to predict IC/BPS.
Any past history of somatoform disorder should be documented while examining patients with IC/BPS.
Page, W F; Brass, L M
2001-09-01
For the first 30 years after repatriation, former American prisoners of war (POWs) of World War II and the Korean Conflict had lower death rates for heart disease and stroke than non-POW veteran controls and the U.S. population, but subsequent morbidity data suggested that this survival advantage may have disappeared. We used U.S. federal records to obtain death data through 1996 and used proportional hazards analysis to compare the mortality experience of POWs and controls. POWs aged 75 years and older showed a significantly higher risk of heart disease deaths than controls (hazard ratio = 1.25; 95% confidence interval, 1.01-1.56), and their stroke mortality was also increased, although not significantly (hazard ratio = 1.13; 95% confidence interval, 0.66-1.91). These results suggest that circulatory disease sequelae of serious, acute malnutrition and the stresses associated with imprisonment may not appear until after many decades.
Rössler, Roland; Junge, Astrid; Bizzini, Mario; Verhagen, Evert; Chomiak, Jiri; Aus der Fünten, Karen; Meyer, Tim; Dvorak, Jiri; Lichtenstein, Eric; Beaudouin, Florian; Faude, Oliver
2018-06-01
The objective of this study was to assess the efficacy of a newly developed warm-up programme ('11+ Kids') regarding its potential to reduce injuries in children's football. Children's football teams (under 9 years, under 11 years, and under 13 years age groups) from Switzerland, Germany, the Czech Republic and the Netherlands were invited. Clubs were randomised to an intervention group and a control group, and followed for one season. The intervention group replaced their usual warm-up with '11+ Kids', while the control group warmed up as usual. The primary outcome was the overall risk of football-related injuries. Secondary outcomes were the risks of severe and lower extremity injuries. We calculated hazard ratios using extended Cox models, and performed a compliance analysis. In total, 292,749 h of football exposure of 3895 players were recorded. The mean age of players was 10.8 (standard deviation 1.4) years. During the study period, 374 injuries occurred (intervention group = 139; control group = 235). The overall injury rate in the intervention group was reduced by 48% compared with the control group (hazard ratio 0.52; 95% confidence interval 0.32-0.86). Severe (74% reduction, hazard ratio 0.26; 95% confidence interval 0.10-0.64) and lower extremity injuries (55% reduction, hazard ratio 0.45; 95% confidence interval 0.24-0.84) were also reduced. Injury incidence decreased with increasing compliance. '11+ Kids' is efficacious in reducing injuries in children's football. We observed considerable effects for overall, severe and lower extremity injuries. The programme should be performed at least once per week to benefit from its injury-preventive effect. However, two sessions per week can be recommended to further increase the protective benefit. ClinicalTrials.gov identifier: NCT02222025.
Risks of developing psychiatric disorders in pediatric patients with psoriasis.
Kimball, Alexa B; Wu, Eric Q; Guérin, Annie; Yu, Andrew P; Tsaneva, Magda; Gupta, Shiraz R; Bao, Yanjun; Mulani, Parvez M
2012-10-01
Symptoms of psoriasis can be embarrassing and distressing, and may increase risk of developing psychiatric disorders in young people. We sought to compare incidences of psychiatric disorders between pediatric patients with psoriasis and psoriasis-free control subjects. Patients (<18 years) with continuous health plan enrollment 6 months before and after first psoriasis diagnosis (index date) were selected (Thomson Reuters MarketScan database, 2000-2006 [Thomson Reuters, New York, NY]). Patients with psoriasis (N = 7404) were matched 1:5 on age and sex to psoriasis-free control subjects (N = 37,020). Patients were followed from index date to first diagnosis of a psychiatric disorder (ie, alcohol/drug abuse, depression, anxiety disorder, bipolar disorder, suicidal ideation, eating disorder), end of data availability, or disenrollment. Patients with psychiatric diagnoses or psychotropic medication use before the index date were excluded. Cox proportional hazard models controlling for age, sex, and comorbidities were used to estimate the effect of psoriasis on risks of developing psychiatric disorders. Patients with psoriasis were significantly more at risk of developing psychiatric disorders versus control subjects (5.13% vs 4.07%; P = .0001; hazard ratio = 1.25; P = .0001), especially depression (3.01% vs 2.42%; P = .0036; hazard ratio = 1.25; P = .0053) and anxiety (1.81% vs 1.35%; P = .0048; hazard ratio = 1.32; P = .0045). Retrospective, observational studies of medical claims data are typically limited by overall quality and completeness of data and accuracy of coding for diagnoses and procedures. Pediatric patients with psoriasis had an increased risk of developing psychiatric disorders, including depression and anxiety, compared with psoriasis-free control subjects. Copyright © 2011 American Academy of Dermatology, Inc. Published by Mosby, Inc. All rights reserved.
Schwizer, Werner; Menne, Dieter; Schütze, Kurt; Vieth, Michael; Goergens, Reiner; Malfertheiner, Peter; Leodolter, Andreas; Fried, Michael; Fox, Mark R
2013-08-01
This study aimed to resolve controversy regarding the effects of Helicobacter pylori eradication therapy and H. pylori infection in gastro-oesophageal reflux disease. A randomized, double-blind, multicentre trial was performed in patients presenting with reflux symptoms. H. pylori-positive patients were randomized to receive either antibiotics or placebo for 7 days. H. pylori-negative patient controls received placebo. All received esomeprazole 20 mg b.d. for 7 days, followed by 40 mg o.d. to complete an 8-week course, and were followed up for 32 weeks by telephone. In this study, 198/589 (34%) patients were H. pylori-positive and 113 H. pylori-negative patients served as controls. Baseline endoscopy revealed 63% Los Angeles grade 0A and 37% Los Angeles grade BCD oesophagitis with no difference between patient groups. Symptom improvement on esomeprazole was seen in 89%. H. pylori eradication was successful in 82%. H. pylori eradication had no effect on symptomatic relapse (hazard ratio 1.15, 95% CI 0.74-1.8; p = 0.5). Overall, H. pylori-positive patients had a lower probability of relapse compared to H. pylori-negative controls (hazard ratio 0.6, 95% CI 0.43-0.85; p = 0.004). Relapse hazard was also modulated by oesophagitis grade (BCD vs. 0A, hazard ratio 2.1, 95% CI 1.5-3.0). Relapse of gastro-oesophageal reflux disease symptoms after a course of high-dose acid suppression took longer for H. pylori-positive patients than H. pylori-negative controls; however, eradication therapy had no effect on the risk of relapse. (ClinicalTrials.gov number, NCT00574925.)
Malhotra, Konark; Katsanos, Aristeidis H; Bilal, Mohammad; Ishfaq, Muhammad Fawad; Goyal, Nitin; Tsivgoulis, Georgios
2018-02-01
Pharmacokinetic and prior studies on thienopyridine and proton pump inhibitor (PPI) coadministration provide conflicting data for cardiovascular outcomes, whereas there is no established evidence on the association of concomitant use of PPI and thienopyridines with adverse cerebrovascular outcomes. We conducted a systematic review and meta-analysis of randomized controlled trials and cohort studies from inception to July 2017, reporting the following outcomes among patients treated with a thienopyridine plus PPI versus a thienopyridine alone: (1) ischemic stroke, (2) combined ischemic or hemorrhagic stroke, (3) composite outcome of stroke, myocardial infarction (MI), and cardiovascular death, (4) MI, (5) all-cause mortality, and (6) major or minor bleeding events. After the unadjusted analyses of risk ratios, we performed additional analyses of studies reporting hazard ratios adjusted for potential confounders. We identified 22 studies (12 randomized controlled trials and 10 cohort studies) comprising 131 714 patients. Concomitant use of PPI with thienopyridines was associated with increased risk of ischemic stroke (risk ratio, 1.74; 95% confidence interval [CI], 1.41-2.16; P<0.001), composite stroke/MI/cardiovascular death (risk ratio, 1.14; 95% CI, 1.01-1.29; P=0.04), and MI (risk ratio, 1.19; 95% CI, 1.00-1.40; P=0.05). Likewise, in adjusted analyses, concomitant use of PPI with thienopyridines was again associated with increased risk of stroke (adjusted hazard ratio, 1.30; 95% CI, 1.04-1.61; P=0.02) and composite stroke/MI/cardiovascular death (adjusted hazard ratio, 1.23; 95% CI, 1.03-1.47; P=0.02), but not with MI (adjusted hazard ratio, 1.19; 95% CI, 0.93-1.52; P=0.16). Co-prescription of PPI and thienopyridines increases the risk of incident ischemic strokes and composite stroke/MI/cardiovascular death. Our findings corroborate the current guidelines for PPI deprescription and pharmacovigilance, especially in patients treated with thienopyridines.
© 2018 American Heart Association, Inc.
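Pooled risk ratios like those in the meta-analysis above typically come from fixed-effect inverse-variance weighting of study-level log risk ratios; a minimal sketch follows (the input tuples are illustrative placeholders, not the 22 included studies):

```python
import math

def pooled_rr(studies):
    """Fixed-effect inverse-variance pooling of risk ratios.
    `studies` is a list of (rr, ci_lo, ci_hi) tuples; each study's
    weight is 1/SE^2 of its log risk ratio."""
    num = den = 0.0
    for rr, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2
        num += w * math.log(rr)
        den += w
    log_rr = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(log_rr),
            math.exp(log_rr - 1.96 * se_pooled),
            math.exp(log_rr + 1.96 * se_pooled))

# Three hypothetical identical studies pool back to the same RR,
# with a narrower confidence interval:
print(pooled_rr([(1.74, 1.41, 2.16)] * 3))
```

Random-effects pooling (e.g. DerSimonian-Laird) adds a between-study variance term to the weights; the abstract does not state which model was used.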
Huang, Wen-Kuan; Lin, Yung-Chang; Chiou, Meng-Jiun; Yang, Tsai-Sheng; Chang, John Wen-Cheng; Yu, Kuang-Hui; Kuo, Chang-Fu; See, Lai-Chu
2013-01-01
There have been no large-scale population-based studies to estimate the subsequent risk of primary liver cancer (PLC) among patients with pyogenic liver abscess (PLA). This study aimed to provide relevant data. The Taiwan Longitudinal Health Insurance Database for the years 2000 and 2005 was used. The PLA group were adult inpatients who were newly diagnosed with PLA from 2000 to 2008. The control group was randomly selected and matched with the PLA group in terms of age, sex, and the date on which medical treatment was sought for conditions other than PLA. There were 1,987 patients each in the PLA and control groups. In total, 56 had PLC: 48 (2.4%, 601.5 per 100,000 person-years) from the PLA group and 8 from the control group. After adjusting for potential covariates, the hazard ratio of PLC for the PLA group was 3.4 times that of the control group (95% confidence interval = 1.6-7.3, p <0.001). The PLC risk for the PLA group was significantly higher within the first year after PLA diagnosis (hazard ratio: 35.4) as compared with the control group and became insignificant (hazard ratio: 2.0, 95% confidence interval = 0.8-4.9) more than one year after PLA diagnosis. Patients with PLA have a higher rate of PLC than matched controls, especially within the first year after the diagnosis of PLA, suggesting PLA is a warning sign for PLC.
Allopurinol and Cardiovascular Outcomes in Adults With Hypertension.
MacIsaac, Rachael L; Salatzki, Janek; Higgins, Peter; Walters, Matthew R; Padmanabhan, Sandosh; Dominiczak, Anna F; Touyz, Rhian M; Dawson, Jesse
2016-03-01
Allopurinol lowers blood pressure in adolescents and has other vasoprotective effects. Whether similar benefits occur in older individuals remains unclear. We hypothesized that allopurinol is associated with improved cardiovascular outcomes in older adults with hypertension. Data from the United Kingdom Clinical Practice Research Datalink were used. Multivariate Cox-proportional hazard models were applied to estimate hazard ratios for stroke and cardiac events (defined as myocardial infarction or acute coronary syndrome) associated with allopurinol use over a 10-year period in adults aged >65 years with hypertension. A propensity-matched design was used to reduce potential for confounding. Allopurinol exposure was a time-dependent variable and was defined as any exposure and then as high-dose (≥300 mg daily) or low-dose exposure. A total of 2032 allopurinol-exposed patients and 2032 matched nonexposed patients were studied. Allopurinol use was associated with a significantly lower risk of both stroke (hazard ratio, 0.50; 95% confidence interval, 0.32-0.80) and cardiac events (hazard ratio, 0.61; 95% confidence interval, 0.43-0.87) than nonexposed control patients. In exposed patients, high-dose treatment with allopurinol (n=1052) was associated with a significantly lower risk of both stroke (hazard ratio, 0.58; 95% confidence interval, 0.36-0.94) and cardiac events (hazard ratio, 0.65; 95% confidence interval, 0.46-0.93) than low-dose treatment (n=980). Allopurinol use is associated with lower rates of stroke and cardiac events in older adults with hypertension, particularly at higher doses. Prospective clinical trials are needed to evaluate whether allopurinol improves cardiovascular outcomes in adults with hypertension. © 2016 American Heart Association, Inc.
Liu, Jui-Ming; Hsu, Ren-Jun; Chang, Fung-Wei; Chiu, Feng-Hsiang; Yeh, Chia-Lun; Huang, Chun-Fa; Chang, Shu-Ting; Lee, Hung-Chang; Chi, Hsin; Lin, Chien-Yu
2017-01-01
Scabies is a common and annoying disorder. Pernicious anemia (PA) is a serious disease which, when untreated, leads to death. Mounting evidence suggests that immune-mediated inflammatory processes play a role in the pathophysiology of both diseases. The relationship between these two diseases has not been investigated. We conducted this study to explore the potential relationship between scabies and PA. This nationwide, population-based study was conducted using the National Health Insurance Research Database of Taiwan. In total, 5,407 patients with scabies were identified as a study group and 20,089 matched patients were randomly selected as a control group. We tracked patients in both groups for a 7-year period to identify the incidence of PA. The demographic characteristics and comorbidities of the patients were analyzed, and Cox proportional hazards regression was used to calculate the hazard ratios for PA. Of the 25,496 patients in this study, 183 (0.7%) patients with newly diagnosed PA were identified during the 7-year follow-up period: 71 of 5,407 (1.3%) from the scabies group and 112 of 20,089 (0.6%) from the control group. Patients with scabies had a higher risk of subsequent PA, with a crude hazard ratio of 2.368. After adjusting for covariates, the adjusted hazard ratio was 1.51 (95% confidence interval: 1.09-2.08). This study demonstrated an increased risk of PA (adjusted hazard ratio 1.51) among patients with scabies. Immune-mediated inflammatory processes may contribute to this association. Further studies are warranted to investigate the full pathological mechanisms linking these two diseases. Physicians should pay attention to patients with a history of scabies who present with anemia; confirmatory testing for PA may enable correct diagnosis and timely initiation of vitamin B12 supplementation.
Hamer, Mark; Batty, G David; Stamatakis, Emmanuel; Kivimaki, Mika
2010-12-01
Common mental disorders, such as anxiety and depression, are risk factors for mortality among cardiac patients, although this topic has gained little attention in individuals with hypertension. We examined the combined effects of hypertension and common mental disorder on mortality in participants with both treated and untreated hypertension. In a representative, prospective study of 31 495 adults (aged 52.5 ± 12.5 years, 45.7% men) we measured baseline levels of common mental disorder using the 12-item General Health Questionnaire (GHQ-12) and collected data on blood pressure, history of hypertension diagnosis, and medication use. High blood pressure (systolic/diastolic >140/90 mmHg) in study members with an existing diagnosis of hypertension indicated uncontrolled hypertension and, in undiagnosed individuals, untreated hypertension. There were 3200 deaths from all causes [943 cardiovascular disease (CVD)] over 8.4 years follow-up. As expected, the risk of CVD was elevated in participants with controlled [multivariate hazard ratio = 1.63, 95% confidence interval (CI) 1.26-2.12] and uncontrolled (multivariate hazard ratio = 1.57, 95% CI 1.08-2.27) hypertension compared with normotensive participants. Common mental disorder (GHQ-12 score of ≥4) was also associated with CVD death (multivariate hazard ratio = 1.60, 95% CI 1.35-1.90). The risk of CVD death was highest in participants with both diagnosed hypertension and common mental disorder, especially in study members with controlled (multivariate hazard ratio = 2.32, 95% CI 1.70-3.17) hypertension but also in uncontrolled hypertension (multivariate hazard ratio = 1.90, 95% CI 1.18-3.05). The combined effect of common mental disorder was also apparent in participants with undiagnosed (untreated) hypertension, especially for all-cause mortality. These findings suggest that the association of hypertension with total and CVD mortality is stronger when combined with common mental disorder.
Brown, Allen W; Leibson, Cynthia L; Mandrekar, Jay; Ransom, Jeanine E; Malec, James F
2014-01-01
To examine the contribution of co-occurring nonhead injuries to hazard of death after traumatic brain injury (TBI). A random sample of Olmsted County, Minnesota, residents with confirmed TBI from 1987 through 1999 was identified. Each case was assigned an age- and sex-matched, non-TBI "regular control" from the population. For "special cases" with accompanying nonhead injuries, 2 matched "special controls" with nonhead injuries of similar severity were assigned. Vital status was followed from baseline (ie, injury date for cases, comparable dates for controls) through 2008. Cases were compared first with regular controls and second with regular or special controls, depending on case type. In total, 1257 cases were identified (including 221 special cases). For both cases versus regular controls and cases versus regular or special controls, the hazard ratio was increased from baseline to 6 months (10.82 [2.86-40.89] and 7.13 [3.10-16.39], respectively) and from baseline through study end (2.92 [1.74-4.91] and 1.48 [1.09-2.02], respectively). Among 6-month survivors, the hazard ratio was increased for cases versus regular controls (1.43 [1.06-2.15]) but not for cases versus regular or special controls (1.05 [0.80-1.38]). Among 6-month survivors, accounting for nonhead injuries resulted in a nonsignificant effect of TBI on long-term mortality.
Stereotactic Radiosurgery for the Treatment of Primary and Metastatic Spinal Sarcomas
Balagamwala, Ehsan H.; Angelov, Lilyana; Suh, John H.; Djemil, Toufik; Magnelli, Anthony; Qi, Peng; Zhuang, Tingliang; Godley, Andrew
2016-01-01
Purpose: Despite advancements in local and systemic therapy, metastasis remains common in the natural history of sarcomas. Unfortunately, such metastases are the most significant source of morbidity and mortality in this heterogeneous disease. Because sarcoma is a classically radioresistant histology, stereotactic radiosurgery has emerged as a means to control spinal sarcomas and provide palliation. However, there is a lack of data regarding pain relief and relapse following stereotactic radiosurgery. Methods: We queried a retrospective institutional database of patients who underwent spine stereotactic radiosurgery for primary and metastatic sarcomas. The primary outcome was pain relief following stereotactic radiosurgery. Secondary outcomes included progression of pain, radiographic failure, and development of toxicities following treatment. Results: Forty treatment sites were eligible for inclusion; the median prescription dose was 16 Gy in a single fraction. Median time to radiographic failure was 14 months. At 6 and 12 months, radiographic control was 63% and 51%, respectively. Among patients presenting with pain, median time to pain relief was 1 month. Actuarial pain relief at 6 months was 82%. Median time to pain progression was 10 months; at 12 months, actuarial pain progression was 51%. Following multivariate analysis, presence of neurologic deficit at consult (hazard ratio: 2.48, P < .01) and presence of extraspinal bone metastases (hazard ratio: 2.83, P < .01) were associated with pain relief. Greater pain at consult (hazard ratio: 1.92, P < .01), prior radiotherapy (hazard ratio: 4.65, P = .02), and a greater number of irradiated vertebral levels were associated with pain progression. Conclusions: Local treatment of spinal sarcomas has remained a challenge for decades, with poor rates of local control and limited pain relief following conventional radiotherapy.
In this series, pain relief was achieved in 82% of treatments at 6 months, with half of patients experiencing pain progression by 12 months. Given minimal toxicity and suboptimal pain control at 12 months, dose escalation beyond 16 Gy is warranted. PMID:27074915
Lin, Chien-Yu; Chang, Fung-Wei; Yang, Jing-Jung; Chang, Chun-Hung; Yeh, Chia-Lun; Lei, Wei-Te; Huang, Chun-Fa; Liu, Jui-Ming; Hsu, Ren-Jun
2017-11-01
Both scabies and bipolar disorder (BD) are common and troublesome disorders. There are several similarities in both diseases: pruritus, a higher prevalence in crowded environments, and cytokine-mediated inflammatory processes in the pathophysiology. We conducted this nationwide population-based study to investigate the possible relationship between scabies and BD. Based on the National Health Insurance Research Database (NHIRD) of Taiwan, a total of 7096 patients with scabies were identified as a study group and 28,375 matched patients as a control group. We tracked the patients in both groups for a 7-year period to identify those newly diagnosed with BD. The demographic characteristics and comorbidities of the patients were analyzed, and Cox proportional hazard regressions were performed to calculate the hazard ratio (HR) of BD. Of the 35,471 patients in this study, 183 (0.5%) patients with newly diagnosed BD were identified, with 58 (0.8%) from the scabies group and 125 (0.4%) from the control group. The patients with scabies had a higher risk of subsequent BD, with a crude hazard ratio of 1.86 and an adjusted hazard ratio of 1.55 (95% confidence interval: 1.12-2.09, P < 0.05). This study shows there is an increased risk for BD among patients with scabies. Immunopathology may contribute to this association. Copyright © 2017 Elsevier B.V. All rights reserved.
Fealy, Nigel; Aitken, Leanne; du Toit, Eugene; Lo, Serigne; Baldwin, Ian
2017-10-01
To determine whether blood flow rate influences circuit life in continuous renal replacement therapy. Prospective randomized controlled trial. Single-center tertiary-level ICU. Critically ill adults requiring continuous renal replacement therapy. Patients were randomized to receive one of two blood flow rates: 150 or 250 mL/min. The primary outcome was circuit life measured in hours. Circuit and patient data were collected until each circuit clotted or was ceased electively for nonclotting reasons. Data for clotted circuits are presented as median (interquartile range) and compared using the Mann-Whitney U test. Survival probability for clotted circuits was compared using the log-rank test. Circuit clotting data were analyzed for repeated events using hazard ratios. One hundred patients were randomized with 96 completing the study (150 mL/min, n = 49; 250 mL/min, n = 47) using 462 circuits (245 run at 150 mL/min and 217 run at 250 mL/min). Median circuit life for the first circuit (clotted) was similar for both groups (150 mL/min: 9.1 hr [5.5-26 hr] vs 10 hr [4.2-17 hr]; p = 0.37). Continuous renal replacement therapy using a blood flow rate set at 250 mL/min was not more likely to cause clotting compared with 150 mL/min (hazard ratio, 1.00 [0.60-1.69]; p = 0.68). Gender, body mass index, weight, vascular access type, length, site, and mode of continuous renal replacement therapy or international normalized ratio had no effect on clotting risk. Continuous renal replacement therapy without anticoagulation was more likely to cause clotting compared with use of heparin strategies (hazard ratio, 1.62; p = 0.003). Longer activated partial thromboplastin time was associated with a reduced likelihood of circuit clotting (hazard ratio, 0.98; p = 0.002), whereas decreased platelet count was associated with an increased likelihood (hazard ratio, 1.19; p = 0.03). There was no difference in circuit life whether using blood flow rates of 250 or 150 mL/min during continuous renal replacement therapy.
Mueller, C.S.
2010-01-01
I analyze the sensitivity of seismic-hazard estimates in the central and eastern United States (CEUS) to maximum magnitude (mmax) by exercising the U.S. Geological Survey (USGS) probabilistic hazard model with several mmax alternatives. Seismicity-based sources control the hazard in most of the CEUS, but data seldom provide an objective basis for estimating mmax. The USGS uses preferred mmax values of moment magnitude 7.0 and 7.5 for the CEUS craton and extended margin, respectively, derived from data in stable continental regions worldwide. Other approaches, for example analysis of local seismicity or judgment about a source's seismogenic potential, often lead to much smaller mmax. Alternative models span the mmax ranges from the 1980s Electric Power Research Institute/Seismicity Owners Group (EPRI/SOG) analysis. Results are presented as hazard ratios relative to the USGS national seismic hazard maps. One alternative model specifies mmax equal to moment magnitude 5.0 and 5.5 for the craton and margin, respectively, similar to EPRI/SOG for some sources. For 2% probability of exceedance in 50 years (about 0.0004 annual probability), the strong mmax truncation produces hazard ratios equal to 0.35-0.60 for 0.2-sec spectral acceleration, and 0.15-0.35 for 1.0-sec spectral acceleration. Hazard-controlling earthquakes interact with mmax in complex ways. There is a relatively weak dependence on probability level: hazard ratios increase 0-15% for 0.002 annual exceedance probability and decrease 5-25% for 0.00001 annual exceedance probability. Although differences at some sites are tempered when faults are added, mmax clearly accounts for some of the discrepancies that are seen in comparisons between USGS-based and EPRI/SOG-based hazard results.
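The quoted correspondence between "2% probability of exceedance in 50 years" and "about 0.0004 annual probability" follows from the usual time-independent (Poisson) occurrence assumption, under which the probability of at least one exceedance in t years is 1 - exp(-λt):

```python
import math

def annual_rate(prob, years):
    """Annual exceedance rate implied by a probability of exceedance
    over an exposure time, assuming Poisson (memoryless) occurrence:
    prob = 1 - exp(-rate * years)  =>  rate = -ln(1 - prob) / years."""
    return -math.log(1.0 - prob) / years

# "2% probability of exceedance in 50 years" from the abstract:
print(annual_rate(0.02, 50))  # ~0.000404, i.e. "about 0.0004 annual probability"
```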
Predicting risk of cancer during HIV infection: the role of inflammatory and coagulation biomarkers.
Borges, Álvaro H; Silverberg, Michael J; Wentworth, Deborah; Grulich, Andrew E; Fätkenheuer, Gerd; Mitsuyasu, Ronald; Tambussi, Giuseppe; Sabin, Caroline A; Neaton, James D; Lundgren, Jens D
2013-06-01
To investigate the relationship between inflammatory [interleukin-6 (IL-6) and C-reactive protein (CRP)] and coagulation (D-dimer) biomarkers and cancer risk during HIV infection. A prospective cohort. HIV-infected patients on continuous antiretroviral therapy (ART) in the control arms of three randomized trials (N=5023) were included in an analysis of predictors of cancer (any type, infection-related or infection-unrelated). Hazard ratios for IL-6, CRP and D-dimer levels (log2-transformed) were calculated using Cox models stratified by trial and adjusted for demographics and CD4+ cell counts and adjusted also for all biomarkers simultaneously. To assess the possibility that biomarker levels were elevated at entry due to undiagnosed cancer, analyses were repeated excluding early cancer events (i.e. diagnosed during first 2 years of follow-up). During approximately 24,000 person-years of follow-up (PYFU), 172 patients developed cancer (70 infection-related; 102 infection-unrelated). The risk of developing cancer was associated with higher levels (per doubling) of IL-6 (hazard ratio 1.38, P<0.001), CRP (hazard ratio 1.16, P=0.001) and D-dimer (hazard ratio 1.17, P=0.03). However, only IL-6 (hazard ratio 1.29, P=0.003) remained associated with cancer risk when all biomarkers were considered simultaneously. Results for infection-related and infection-unrelated cancers were similar to results for any cancer. Hazard ratios excluding 69 early cancer events were 1.31 (P=0.007), 1.14 (P=0.02) and 1.07 (P=0.49) for IL-6, CRP and D-dimer, respectively. Activated inflammation and coagulation pathways are associated with increased cancer risk during HIV infection. This association was stronger for IL-6 and persisted after excluding early cancer. Trials of interventions may be warranted to assess whether cancer risk can be reduced by lowering IL-6 levels in HIV-positive individuals.
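Because the biomarkers above were log2-transformed, each reported hazard ratio is "per doubling"; rescaling to another fold-change is simple exponent arithmetic on the log-HR (a sketch; the 4-fold figure below is derived for illustration, not reported in the abstract):

```python
import math

def hr_per_kfold(hr_per_doubling, k):
    """Rescale a hazard ratio reported per doubling (log2-transformed
    predictor) to a k-fold change: a k-fold change is log2(k) units
    on the log2 scale, so HR_k = exp(beta * log2(k))."""
    beta = math.log(hr_per_doubling)      # log-HR per unit of log2(x)
    return math.exp(beta * math.log2(k))

# IL-6: HR 1.38 per doubling (from the abstract) implies, per 4-fold rise:
print(round(hr_per_kfold(1.38, 4), 2))  # 1.90, i.e. 1.38 squared
```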
Salim, Agus; Tai, E Shyong; Tan, Vincent Y; Welsh, Alan H; Liew, Reginald; Naidoo, Nasheen; Wu, Yi; Yuan, Jian-Min; Koh, Woon P; van Dam, Rob M
2016-08-01
In western populations, high-sensitivity C-reactive protein (hsCRP), and to a lesser degree serum creatinine and haemoglobin A1c, predict risk of coronary heart disease (CHD). However, data on Asian populations that are increasingly affected by CHD are sparse and it is not clear whether these biomarkers can be used to improve CHD risk classification. We conducted a nested case-control study within the Singapore Chinese Health Study cohort, with incident 'hard' CHD (myocardial infarction or CHD death) as an outcome. We used data from 965 men (298 cases, 667 controls) and 528 women (143 cases, 385 controls) to examine the utility of hsCRP, serum creatinine and haemoglobin A1c in improving the prediction of CHD risk over and above traditional risk factors for CHD included in the ATP III model. For each sex, the performance of models with only traditional risk factors used in the ATP III model was compared with models with the biomarkers added using weighted Cox proportional hazards analysis. The impact of adding these biomarkers was assessed using the net reclassification improvement index. For men, logₑ hsCRP (hazard ratio 1.25, 95% confidence interval: 1.05; 1.49) and logₑ serum creatinine (hazard ratio 4.82, 95% confidence interval: 2.10; 11.04) showed statistically significant associations with CHD risk when added to the ATP III model. We did not observe a significant association between logₑ haemoglobin A1c and CHD risk (hazard ratio 1.83, 95% confidence interval: 0.21; 16.06). Adding hsCRP and serum creatinine to the ATP III model improved risk classification in men with a net gain of 6.3% of cases (p-value = 0.001) being reclassified to a higher risk category, while it did not significantly reduce the accuracy of classification for non-cases.
For women, squared hsCRP was borderline significantly associated with CHD risk (hazard ratio 1.01, 95% confidence interval: 1.00; 1.03), and squared serum creatinine was significantly associated with CHD risk (hazard ratio 1.81, 95% confidence interval: 1.49; 2.21). However, the association between squared haemoglobin A1c and CHD risk was not significant (hazard ratio 1.05, 95% confidence interval: 0.99; 1.12). The addition of hsCRP and serum creatinine to the ATP III model resulted in 3.7% of future cases being reclassified to a higher risk category (p-value = 0.025), while it did not significantly reduce the accuracy of classification for non-cases. Adding hsCRP and serum creatinine, but not haemoglobin A1c, to traditional risk factors improved CHD risk prediction among non-diabetic Singaporean Chinese. The improved risk estimates will allow better identification of individuals at high risk of CHD than existing risk calculators such as the ATP III model. © The European Society of Cardiology 2016.
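The net reclassification improvement quoted above is computed from counts of subjects moved between risk categories by the new model, summed separately over cases and non-cases; a minimal sketch follows (the example counts are hypothetical, chosen to mirror the 6.3% net gain among cases in men):

```python
def nri(up_events, down_events, n_events,
        up_nonevents, down_nonevents, n_nonevents):
    """Net reclassification improvement: net fraction of events moved
    to a HIGHER risk category plus net fraction of non-events moved
    LOWER. Counts here are hypothetical, not the study's data."""
    event_net = (up_events - down_events) / n_events
    nonevent_net = (down_nonevents - up_nonevents) / n_nonevents
    return event_net + nonevent_net

# e.g. 30 of 300 events reclassified up and 11 down (net ~6.3%),
# with non-events unchanged on net:
print(nri(30, 11, 300, 40, 40, 1000))
```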
Stratifying the risks of oral anticoagulation in patients with liver disease.
Efird, Lydia M; Mishkin, Daniel S; Berlowitz, Dan R; Ash, Arlene S; Hylek, Elaine M; Ozonoff, Al; Reisman, Joel I; Zhao, Shibei; Jasuja, Guneet K; Rose, Adam J
2014-05-01
Chronic liver disease presents a relative contraindication to warfarin therapy, but some patients with liver disease nevertheless require long-term anticoagulation. The goal was to identify which patients with liver disease might safely receive warfarin. Among 102 134 patients who received warfarin from the Veterans Affairs health care system from 2007 to 2008, International Classification of Diseases-Ninth Revision codes identified 1763 patients with chronic liver disease. Specific diagnoses and laboratory values (albumin, aspartate aminotransferase, alanine aminotransferase, creatinine, and cholesterol) were examined to identify risk of adverse outcomes, while controlling for available bleeding risk factors. Outcomes included percent time in therapeutic range, a measure of anticoagulation control, and major hemorrhagic events, by International Classification of Diseases-Ninth Revision codes. Patients with liver disease had a lower mean time in therapeutic range (53.5%) when compared with patients without (61.7%; P<0.001) and more hemorrhages (hazard ratio, 2.02; P<0.001). Among patients with liver disease, serum albumin and creatinine levels were the strongest predictors of both outcomes. We created a 4-point score system: patients received 1 point each for albumin (2.5-3.49 g/dL) or creatinine (1.01-1.99 mg/dL), and 2 points each for albumin (<2.5 g/dL) or creatinine (≥2 mg/dL). This score predicted both anticoagulation control and hemorrhage. When compared with patients without liver disease, those with a score of zero had modestly lower time in therapeutic range (56.7%) and no increase in hemorrhages (hazard ratio, 1.16; P=0.59), whereas those with the worst score (4) had poor control (29.4%) and a high hazard of hemorrhage (hazard ratio, 8.53; P<0.001). Patients with liver disease receiving warfarin have poorer anticoagulation control and more hemorrhages. A simple 4-point scoring system using albumin and creatinine identifies those at risk for poor outcomes.
© 2014 American Heart Association, Inc.
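The 4-point score described above maps directly to code; the thresholds are taken verbatim from the abstract, while the function and variable names are mine:

```python
def liver_warfarin_score(albumin_g_dl, creatinine_mg_dl):
    """4-point risk score from the abstract: 1 point for albumin
    2.5-3.49 g/dL or creatinine 1.01-1.99 mg/dL; 2 points for
    albumin <2.5 g/dL or creatinine >=2 mg/dL."""
    score = 0
    if albumin_g_dl < 2.5:
        score += 2
    elif albumin_g_dl < 3.5:
        score += 1
    if creatinine_mg_dl >= 2.0:
        score += 2
    elif creatinine_mg_dl > 1.0:
        score += 1
    return score

# Albumin 2.4 g/dL (2 points) + creatinine 1.5 mg/dL (1 point):
print(liver_warfarin_score(2.4, 1.5))  # 3
```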
Rauch, Geraldine; Brannath, Werner; Brückner, Matthias; Kieser, Meinhard
2018-05-01
In many clinical trial applications, the endpoint of interest corresponds to a time-to-event endpoint. In this case, group differences are usually expressed by the hazard ratio. Group differences are commonly assessed by the logrank test, which is optimal under the proportional hazard assumption. However, there are many situations in which this assumption is violated. Especially in applications where a full population and several subgroups or a composite time-to-first-event endpoint and several components are considered, the proportional hazard assumption usually does not simultaneously hold true for all test problems under investigation. As an alternative effect measure, Kalbfleisch and Prentice proposed the so-called 'average hazard ratio'. The average hazard ratio is based on a flexible weighting function to modify the influence of time and has a meaningful interpretation even in the case of non-proportional hazards. Despite this favorable property, it is hardly ever used in practice, whereas the standard hazard ratio is commonly reported in clinical trials regardless of whether the proportional hazard assumption holds true or not. There exist two main approaches to construct corresponding estimators and tests for the average hazard ratio, where the first relies on weighted Cox regression and the second on a simple plug-in estimator. The aim of this work is to give a systematic comparison of these two approaches and the standard logrank test for different time-to-event settings with proportional and non-proportional hazards and to illustrate the pros and cons in application. We conduct a systematic comparative study based on Monte-Carlo simulations and a real clinical trial example. Our results suggest that the properties of the average hazard ratio depend on the underlying weighting function. The two approaches to construct estimators and related tests show very similar performance for adequately chosen weights.
In general, the average hazard ratio defines a more valid effect measure than the standard hazard ratio under non-proportional hazards, and the corresponding tests provide a power advantage over the common logrank test. As non-proportional hazards are often encountered in clinical practice and the average hazard ratio tests often outperform the common logrank test, this approach should be used more routinely in applications. Schattauer GmbH.
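The weighted tests discussed above generalize the standard (unweighted) log-rank statistic. As a baseline for comparison, a minimal pure-Python log-rank sketch follows; it treats all observations as events (no censoring), a simplification not made in the paper:

```python
from collections import Counter

def logrank(times1, times2):
    """Two-sample log-rank chi-square statistic (1 df). All observed
    times are treated as events; censoring is omitted for brevity."""
    d1, d2 = Counter(times1), Counter(times2)
    o_minus_e = 0.0
    var = 0.0
    for t in sorted(set(times1) | set(times2)):
        n1 = sum(1 for x in times1 if x >= t)  # at risk in group 1
        n2 = sum(1 for x in times2 if x >= t)  # at risk in group 2
        n = n1 + n2
        d = d1[t] + d2[t]                      # events at time t
        o_minus_e += d1[t] - d * n1 / n        # observed minus expected
        if n > 1:                              # hypergeometric variance
            var += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)
    return o_minus_e ** 2 / var if var > 0 else 0.0

# Identical samples: observed equals expected, so the statistic is 0
print(logrank([1, 2, 3], [1, 2, 3]))  # 0.0
```

The average-hazard-ratio tests in the paper replace the implicit unit weight at each event time with a chosen weighting function, which is what restores power under non-proportional hazards.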
How much does a reminder letter increase cervical screening among under-screened women in NSW?
Morrell, Stephen; Taylor, Richard; Zeckendorf, Sue; Niciak, Amanda; Wain, Gerard; Ross, Jayne
2005-02-01
To evaluate a direct mail-out campaign to increase Pap screening rates in women who have not had a test in 48 months. Ninety thousand under-screened women were randomised to be mailed a 48-month reminder letter to have a Pap test (n=60,000), or not to be mailed a letter (n=30,000). Differences in Pap test rates were assessed by Kaplan-Meier survival analysis, by chi-square tests of significance between Pap test rates in letter versus no-letter groups, and by proportional hazards regression modelling of predictors of a Pap test with letter versus no-letter as the main study variable. T-tests were conducted on mean time to Pap test to assess whether time to Pap test was significantly different between the intervention and control groups. After 90 days following each mail-out, Pap test rates in the letter group were significantly higher than in the no-letter group, by approximately two percentage points. After controlling for potential confounders, the hazard ratio of a Pap test within 90 days of a mail-out in the letter group was 1.5 compared with 1.0 in the no-letter group. Hazard ratios of having a Pap test within 90 days decreased significantly with time since last Pap test (p<0.0001); were significantly higher than 1.0 for most non-metropolitan areas of NSW compared with metropolitan areas; and increased significantly with age (p<0.0001). Pap test hazard ratios were not associated with socio-economic status of area of residence, but the hazard ratio was significantly higher than 1.0 if the reminder letter was sent after the Christmas/New Year break. No significant differences in mean time to Pap test were found between the letter and no-letter groups. Being sent a reminder letter is associated with higher Pap testing rates in under-screened women.
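Time to Pap test above was assessed by Kaplan-Meier analysis. In the simplest case (no censoring and one event per listed time, with hypothetical times), the estimator reduces to a running product of survival fractions:

```python
def kaplan_meier(event_times, n_at_risk):
    """Kaplan-Meier survival estimates assuming no censoring and one
    event at each listed time: S(t) multiplies (1 - 1/n_i) over event
    times <= t. Returns [(time, survival)] pairs."""
    surv = 1.0
    out = []
    n = n_at_risk
    for t in sorted(event_times):
        surv *= (n - 1) / n   # one event among n still at risk
        n -= 1
        out.append((t, surv))
    return out

# Three women screened at days 10, 25, 40 out of 3 under follow-up:
print(kaplan_meier([10, 25, 40], 3))  # survival drops 2/3 -> 1/3 -> 0
```

With censoring (women who leave the study unscreened), the at-risk count n would also shrink at censoring times without multiplying the survival term, which is the full estimator used in analyses like the one above.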
Chang, Fung-Wei; Chiu, Feng-Hsiang; Yeh, Chia-Lun; Huang, Chun-Fa; Chang, Shu-Ting; Lee, Hung-Chang; Chi, Hsin; Lin, Chien-Yu
2017-01-01
Objectives Scabies is a common and annoying disorder. Pernicious anemia (PA) is a serious disease which, when untreated, leads to death. Mounting evidence suggests that immune-mediated inflammatory processes play a role in the pathophysiology of both diseases. The relationship between these two diseases has not been investigated. We conducted this study to explore the potential relationship between scabies and PA. Materials and methods This nationwide, population-based study was conducted using the National Health Insurance Research Database of Taiwan. In total, 5,407 patients with scabies were identified as a study group and 20,089 matched patients were randomly selected as a control group. We tracked patients in both groups for a 7-year period to identify the incidence of PA. The demographic characteristics and comorbidities of the patients were analyzed, and Cox proportional hazards regression was used to calculate the hazard ratios for PA. Results Of the 25,496 patients in this study, 183 (0.7%) patients with newly diagnosed PA were identified during the 7-year follow-up period; 71 of 5,407 (1.3%) from the scabies group and 112 of 20,089 (0.6%) from the control group. Patients with scabies had a higher risk of subsequent PA, with a crude hazard ratio of 2.368. After adjusting for covariates, the adjusted hazard ratio was 1.51 (95% confidence interval: 1.09–2.08). Conclusion This study demonstrated an increased risk of PA (adjusted hazard ratio 1.51) among patients with scabies. Immune-mediated inflammatory processes may contribute to this association. Further studies are warranted to investigate the pathological mechanisms linking these two diseases. Physicians should pay attention to patients with a history of scabies who present with anemia; confirmatory testing for PA may contribute to a correct diagnosis and the initiation of vitamin B12 supplementation. PMID:29066901
Clark, Daniel O; Gao, Sujuan; Lane, Kathleen A; Callahan, Christopher M; Baiyewu, Olusegun; Ogunniyi, Adesola; Hendrie, Hugh C
2014-09-01
To compare the effect of obesity and related risk factors on 10-year mortality in two cohorts of older adults of African descent; one from the United States and one from Nigeria. Study participants were community residents aged 70 or older of African descent living in Indianapolis, Indiana (N = 1,269) or Ibadan, Nigeria (N = 1,197). We compared survival curves between the two cohorts by obesity class and estimated the effect of obesity class on mortality in Cox proportional hazards models controlling for age, gender, alcohol use, and smoking history, and the cardiometabolic biomarkers blood pressure, triglycerides, high-density lipoprotein, low-density lipoprotein, and C-reactive protein. We found that underweight was associated with an increased risk of death in both the Yoruba (hazard ratio = 1.35, 95% confidence interval: 1.12-1.63) and African American samples (hazard ratio = 2.49, 95% confidence interval: 1.40-4.43) compared with those with normal weight. The overweight and obese participants in both cohorts experienced survival similar to the normal weight participants. Controlling for cardiometabolic biomarkers had little effect on the obesity-specific hazard ratios in either cohort. Despite significant differences across these two cohorts in terms of obesity and biomarker levels, overall 10-year survival and obesity class-specific survival were remarkably similar. © The Author 2014. Published by Oxford University Press on behalf of The Gerontological Society of America.
Carbon Structure Hazard Control
NASA Technical Reports Server (NTRS)
Yoder, Tommy; Greene, Ben; Porter, Alan
2015-01-01
Carbon composite structures are widely used in virtually all advanced technology industries for a multitude of applications. The high strength-to-weight ratio and resistance to aggressive service environments make them highly desirable. The automotive, aerospace, and petroleum industries extensively use, and will continue to use, this enabling technology. As a result of this broad range of use, field and test personnel are increasingly exposed to hazards associated with these structures. No single published document exists to address the hazards and recommend the controls required for the different exposure possibilities from damaged structures, including airborne fibers, fly, and dust. The potential for personnel exposure varies depending on the application or manipulation of the structure. The effect of exposure to carbon hazards is not limited to personnel; protection of electronics and mechanical equipment must be considered as well. The exposure opportunities defined in this document include pre-manufacturing fly and dust, the cured structure, manufacturing/machining, post-event cleanup, and post-event test and/or evaluation. Hazard controls are defined as applicable to each specific exposure opportunity. The carbon exposure hazard includes fly, dust, fiber (cured/uncured), and matrix vapor/thermal decomposition products. By following the recommendations in this document, a high level of confidence in the protection of personnel and equipment can be achieved.
Thomas, Laine; Svetkey, Laura; Brancati, Frederick L.; Califf, Robert M.; Edelman, David
2013-01-01
BACKGROUND Low and low-normal serum potassium is associated with an increased risk of diabetes. We hypothesized that the protective effect of valsartan on diabetes risk could be mediated by its effect of raising serum potassium. METHODS We analyzed data from the Nateglinide and Valsartan in Impaired Glucose Tolerance Outcomes Research (NAVIGATOR) trial, which randomized participants at risk for diabetes to either valsartan (up to 160 mg daily) or no valsartan. Using Cox models, we evaluated the effect of valsartan on diabetes risk over a median of 4 years of follow-up and calculated the mediation effect of serum potassium as the difference in treatment hazard ratios from models excluding and including 1-year change in serum potassium. The 95% confidence interval (CI) for the difference in log hazard ratios was computed by bootstrapping. RESULTS The hazard ratio for developing diabetes among those on valsartan vs. no valsartan was 0.866 (95% CI = 0.795–0.943) before, and 0.868 (95% CI = 0.797–0.945) after, controlling for 1-year change in potassium. The bootstrap 95% CI for the difference in these log hazard ratios included zero (−0.003 to 0.009), indicating no statistically significant mediation. CONCLUSIONS Serum potassium does not appear to significantly mediate the protective effect of valsartan on diabetes risk. PMID:23417031
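The mediation analysis above quantifies the effect as the difference in log hazard ratios between models excluding and including the mediator, with a percentile bootstrap confidence interval. A minimal sketch, assuming paired bootstrap replicates of each log hazard ratio are already in hand (in the trial, each pair would come from refitting both Cox models on a resampled dataset):

```python
import math

def percentile_ci_of_difference(log_hr_without, log_hr_with, alpha=0.05):
    """Percentile bootstrap CI for a mediation effect defined as the
    difference in log hazard ratios from models excluding vs. including
    the mediator. Inputs are paired bootstrap replicates."""
    diffs = sorted(a - b for a, b in zip(log_hr_without, log_hr_with))
    n = len(diffs)
    lo = diffs[int((alpha / 2) * n)]
    hi = diffs[min(n - 1, int((1 - alpha / 2) * n))]
    return lo, hi

# Point estimate implied by the reported hazard ratios:
print(round(math.log(0.866) - math.log(0.868), 4))  # -0.0023
```

With replicates centered on the two reported hazard ratios, the interval brackets a difference very close to zero, matching the trial's null mediation finding.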
LeBlanc, John C.; Pless, I. Barry; King, W. James; Bawden, Harry; Bernard-Bonnin, Anne-Claude; Klassen, Terry; Tenenbein, Milton
2006-01-01
Background Young children may sustain injuries when exposed to certain hazards in the home. To better understand the relation between several childproofing strategies and the risk of injuries to children in the home, we undertook a multicentre case-control study in which we compared hazards in the homes of children with and without injuries. Methods We conducted this case-control study using records from 5 pediatric hospital emergency departments for the 2-year period 1995–1996. The 351 case subjects were children aged 7 years or younger who presented with injuries from falls, burns or scalds, ingestions or choking. The matched control subjects were children who presented during the same period with acute non-injury-related conditions. A home visitor, blinded to case-control status, assessed 19 injury hazards at the children's homes. Results Hazards found in the homes included baby walkers (21% of homes with infants), no functioning smoke alarm (17% of homes) and no fire extinguisher (51% of homes). Cases did not differ from controls in the mean proportion of home hazards. After controlling for siblings, maternal education and employment, we found that cases differed from controls for 5 hazards: the presence of a baby walker (odds ratio [OR] 9.0, 95% confidence interval [CI] 1.1–71.0), the presence of choking hazards within a child's reach (OR 2.0, 95% CI 1.0–3.7), no child-resistant lids in the bathroom (OR 1.6, 95% CI 1.0–2.5), no smoke alarm (OR 3.2, 95% CI 1.4–7.7) and no functioning smoke alarm (OR 1.7, 95% CI 1.0–2.8). Interpretation Homes of children with injuries differed from those of children without injuries in the proportions of specific hazards for falls, choking, poisoning and burns, with a striking difference noted for the presence of a baby walker.
In addition to counselling parents about specific hazards, clinicians should consider that the presence of some hazards may indicate an increased risk for home injuries beyond those directly related to the hazard found. Families with any home hazard may be candidates for interventions to childproof against other types of home hazards. PMID:16998079
Noise Pollution--What can be Done?
ERIC Educational Resources Information Center
Shaw, Edgar A. G.
1975-01-01
Discusses the ratio of energy dissipated as sound to the mechanical output of devices. Considers noise levels, ranges vs. peaks, noise indexes, and health hazards. Indicates some problems vs. solutions in the technology of noise control. (GH)
Patient Portal Use and Blood Pressure Control in Newly Diagnosed Hypertension.
Manard, William; Scherrer, Jeffrey F; Salas, Joanne; Schneider, F David
2016-01-01
Current evidence that patient portal use improves disease management is inconclusive. Randomized controlled trials have found no benefit of Web-based patient-provider communication for blood pressure (BP) control, but patients in these studies were not selected for uncontrolled hypertension, nor did measures of portal use occur in a real-world setting, as captured in the electronic medical record. This study determined whether patient portal use by patients with treated, incident hypertension was associated with achieving BP control. Between 2008 and 2010, 1571 patients with an incident hypertension diagnosis, ages 21 to >89 years, were identified from an academic medical center primary care patient data registry. Cox proportional hazards models were computed to estimate the association between portal use and incident BP control during follow-up (2011-2015), before and after adjusting for covariates. Covariates included sociodemographics, smoking, obesity and other physical and mental health comorbidities, and volume of health care utilization. After adjusting for age, portal users were more likely than nonusers to achieve BP control (hazard ratio, 1.24; 95% confidence interval, 1.06-1.45). After adjustment for sociodemographics, portal use was no longer associated with BP control (hazard ratio, 0.98; 95% confidence interval, 0.83-1.16). Patient sociodemographic factors, including race, sex, and socioeconomic status, accounted for the unadjusted association between portal use and BP control among persons with newly diagnosed hypertension. Further research is warranted to determine whether there are benefits of portal use for other chronic conditions.
Sickle Cell Trait, Rhabdomyolysis, and Mortality among U.S. Army Soldiers
Nelson, D. Alan; Deuster, Patricia A.; Carter, Robert; Hill, Owen T.; Wolcott, Vickee L.; Kurina, Lianne M.
2016-01-01
Background Studies have suggested that sickle cell trait elevates the risks of exertional rhabdomyolysis and death. We conducted a study of sickle cell trait in relation to these outcomes, controlling for known risk factors for exertional rhabdomyolysis, in a large population of active persons who had undergone laboratory tests for hemoglobin AS (HbAS) and who were subject to exertional-injury precautions. Methods We used Cox proportional-hazards models to test whether the risks of exertional rhabdomyolysis and death varied according to sickle cell trait status among 47,944 black soldiers who had undergone testing for HbAS and who were on active duty in the U.S. Army between January 2011 and December 2014. We used the Stanford Military Data Repository, which contains comprehensive medical and administrative data on all active-duty soldiers. Results There was no significant difference in the risk of death among soldiers with sickle cell trait, as compared with those without the trait (hazard ratio, 0.99; 95% confidence interval [CI], 0.46 to 2.13; P = 0.97), but the trait was associated with a significantly higher adjusted risk of exertional rhabdomyolysis (hazard ratio, 1.54; 95% CI, 1.12 to 2.12; P = 0.008). This effect was similar in magnitude to that associated with tobacco use, as compared with no use (hazard ratio, 1.54; 95% CI, 1.23 to 1.94; P<0.001), and to that associated with having a body-mass index (BMI; the weight in kilograms divided by the square of the height in meters) of 30.0 or more, as compared with a BMI of less than 25.0 (hazard ratio, 1.39; 95% CI, 1.04 to 1.86; P = 0.03). The effect was less than that associated with recent use of a statin, as compared with no use (hazard ratio, 2.89; 95% CI, 1.51 to 5.55; P = 0.001), or an antipsychotic agent (hazard ratio, 3.02; 95% CI, 1.34 to 6.82; P = 0.008). 
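The BMI comparison above uses the definition given in the abstract (weight in kilograms divided by the square of the height in meters) with cut points at 25.0 and 30.0. A minimal sketch of that classification; the example weight and height are hypothetical:

```python
def bmi(weight_kg, height_m):
    """Body-mass index: weight in kilograms divided by the square of
    the height in meters (the definition used in the abstract)."""
    return weight_kg / height_m ** 2

def bmi_category(value):
    """The two cut points compared in the study: <25.0 vs >=30.0."""
    if value >= 30.0:
        return "BMI >= 30.0"
    if value < 25.0:
        return "BMI < 25.0"
    return "25.0 <= BMI < 30.0"

# Hypothetical soldier: 100 kg at 1.78 m
print(round(bmi(100.0, 1.78), 1))        # 31.6
print(bmi_category(bmi(100.0, 1.78)))    # BMI >= 30.0
```

The reported hazard ratio of 1.39 applies to the ">= 30.0" group relative to the "< 25.0" reference group.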
Conclusions Sickle cell trait was not associated with a higher risk of death than absence of the trait, but it was associated with a significantly higher risk of exertional rhabdomyolysis. (Funded by the National Heart, Lung, and Blood Institute and the Uniformed Services University of the Health Sciences.) PMID:27518662
Kendler, Kenneth S.; Lönn, Sara Larsson; Sundquist, Jan; Sundquist, Kristina
2015-01-01
Objective The purpose of this study was to clarify the causes of the smoking-schizophrenia association. Method Using Cox proportional hazard and co-relative control models, the authors predicted future risk for a diagnosis of schizophrenia or nonaffective psychosis from the smoking status of 1,413,849 women and 233,879 men from, respectively, the Swedish birth and conscript registries. Results Smoking was assessed in women at a mean age of 27 and in men at a mean age of 18. The mean age at end of follow-up was 46 for women and 26 for men. Hazard ratios for first-onset schizophrenia were elevated both for light smoking (2.21 [95% CI=1.90–2.56] for women and 2.15 [95% CI=1.25–3.44] for men) and heavy smoking (3.45 [95% CI=2.95–4.03] for women and 3.80 [95% CI=1.19–6.60] for men). These associations did not decline when schizophrenia onsets 3–5 years after smoking assessment were censored. When age, socioeconomic status, and drug abuse were controlled for, hazard ratios declined only modestly in both samples. Women who smoked into late pregnancy had a much higher risk for schizophrenia than those who quit early. Hazard ratios predicting nonaffective psychosis in the general population, in cousins, in half siblings, and in full siblings discordant for heavy smoking were, respectively, 2.67, 2.71, 2.54, and 2.18. A model utilizing all relative pairs predicted a hazard ratio of 1.69 (95% CI=1.17–2.44) for nonaffective psychosis in the heavy-smoking member of discordant monozygotic twin pairs. Conclusions Smoking prospectively predicts risk for schizophrenia. This association does not arise from smoking onset during a schizophrenic prodrome and demonstrates a clear dose-response relationship. While little of this association is explained by epidemiological confounders, a portion arises from common familial/genetic risk factors. 
However, in full siblings and especially monozygotic twins discordant for smoking, risk for nonaffective psychosis is appreciably higher in the smoking member. These results can help in evaluating the plausibility of various etiological hypotheses for the smoking-schizophrenia association. PMID:26046339
Risk of Hypertension among First-Time Symptomatic Kidney Stone Formers
Kittanamongkolchai, Wonngarm; Mara, Kristin C.; Mehta, Ramila A.; Vaughan, Lisa E.; Denic, Aleksandar; Knoedler, John J.; Enders, Felicity T.; Lieske, John C.
2017-01-01
Background and objectives Prior work has suggested a higher risk of hypertension in kidney stone formers but lacked disease validation and adjustment for potential confounders. Certain types of stone formers may also be at higher risk of hypertension. Design, setting, participants, & measurements In our study, incident symptomatic stone formers in Olmsted County from 2000 to 2011 were manually validated by chart review and age and sex matched to Olmsted County controls. We followed up patients through November 20, 2015. Hypertension was also validated by manual chart review, and the risk of hypertension in stone formers compared with controls was assessed both univariately and after adjusting for comorbidities. The risk of hypertension among different subtypes of stone formers was also evaluated. Results Among 3023 coded stone formers from 2000 to 2011, a total of 1515 were validated and matched to 1515 controls (mean age was 45 years old, and 56% were men). After excluding those with baseline hypertension (20% of stone formers and 18% of controls), 154 stone formers and 110 controls developed hypertension. Median follow-up time was 7.8 years in stone formers and 9.6 years in controls. Stone formers were found to have a higher risk of hypertension compared with controls (hazard ratio, 1.50; 95% confidence interval, 1.18 to 1.92), even after adjusting for age, sex, body mass index, serum creatinine, CKD, diabetes, gout, coronary artery disease, dyslipidemia, tobacco use, and alcohol abuse (hazard ratio, 1.58; 95% confidence interval, 1.12 to 2.21). Results were similar after excluding patients who were ever on a thiazide diuretic (hazard ratio, 1.65; 95% confidence interval, 1.16 to 2.38). Stone composition, radiographic stone burden, number of subsequent stone events, and stone removal surgeries were not associated with hypertension (P>0.05 for all). Conclusions The risk of hypertension was higher after the first symptomatic kidney stone event. 
However, kidney stone severity, type, and treatment did not associate with hypertension. PMID:28148559
Pole, Jason D.; Mustard, Cameron A.; To, Teresa; Beyene, Joseph; Allen, Alexander C.
2010-01-01
This study was designed to test the hypothesis that fetal exposure to corticosteroids in the antenatal period is an independent risk factor for the development of asthma in early childhood, with little or no effect in later childhood. A population-based cohort study of all pregnant women who resided in Nova Scotia, Canada, and gave birth to a singleton fetus between 1989 and 1998 was undertaken. After a priori specified exclusions, 80,448 infants were available for analysis. Using linked health care utilization records, incident asthma cases developed after 36 months of age were identified. Extended Cox proportional hazards models were used to estimate hazard ratios while controlling for confounders. Exposure to corticosteroids during pregnancy was associated with a risk of asthma in childhood between 3 and 5 years of age: adjusted hazard ratio of 1.19 (95% confidence interval: 1.03, 1.39), with no association noted after 5 years of age: the adjusted hazard ratio for 5–7 years was 1.06 (95% confidence interval: 0.86, 1.30) and for 8 years or older was 0.74 (95% confidence interval: 0.54, 1.03). Antenatal steroid therapy appears to be an independent risk factor for the development of asthma between 3 and 5 years of age. PMID:21490744
Guertler, Diana; Vandelanotte, Corneel; Kirwan, Morwenna; Duncan, Mitch J
2015-07-15
Data from controlled trials indicate that Web-based interventions generally suffer from low engagement and high attrition. This is important because the level of exposure to intervention content is linked to intervention effectiveness. However, data from real-life Web-based behavior change interventions are scarce, especially when looking at physical activity promotion. The aims of this study were to (1) examine the engagement with the freely available physical activity promotion program 10,000 Steps, (2) examine how the use of a smartphone app may be helpful in increasing engagement with the intervention and in decreasing nonusage attrition, and (3) identify sociodemographic- and engagement-related determinants of nonusage attrition. Users (N=16,948) were grouped based on which platform (website, app) they logged their physical activity: Web only, app only, or Web and app. Groups were compared on sociodemographics and engagement parameters (duration of usage, number of individual and workplace challenges started, and number of physical activity log days) using ANOVA and chi-square tests. For a subsample of users that had been members for at least 3 months (n=11,651), Kaplan-Meier survival curves were estimated to plot attrition over the first 3 months after registration. A Cox regression model was used to determine predictors of nonusage attrition. In the overall sample, user groups differed significantly in all sociodemographics and engagement parameters. Engagement with the program was highest for Web-and-app users. In the subsample, 50.00% (5826/11,651) of users stopped logging physical activity through the program after 30 days. Cox regression showed that user group predicted nonusage attrition: Web-and-app users (hazard ratio=0.86, 95% CI 0.81-0.93, P<.001) and app-only users (hazard ratio=0.63, 95% CI 0.58-0.68, P<.001) showed a reduced attrition risk compared to Web-only users. 
Further, higher numbers of individual challenges (hazard ratio=0.62, 95% CI 0.59-0.66, P<.001), workplace challenges (hazard ratio=0.94, 95% CI 0.90-0.97, P<.001), physical activity logging days (hazard ratio=0.921, 95% CI 0.919-0.922, P<.001), and steps logged per day (hazard ratio=0.99999, 95% CI 0.99998-0.99999, P<.001) were associated with reduced nonusage attrition risk, as were older age (hazard ratio=0.992, 95% CI 0.991-0.994, P<.001), being male (hazard ratio=0.85, 95% CI 0.82-0.89, P<.001), and being non-Australian (hazard ratio=0.87, 95% CI 0.82-0.91, P<.001). Compared to other freely accessible Web-based health behavior interventions, the 10,000 Steps program showed high engagement. The use of an app alone or in addition to the website can enhance program engagement and reduce the risk of attrition. A better understanding of participants' reasons for reducing engagement can assist in clarifying how best to address this issue and maximize behavior change.
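The attrition curves described above are Kaplan-Meier estimates. A minimal product-limit sketch in pure Python, with toy follow-up data in which the "event" is nonusage attrition (the user stops logging) and censored users were still active at last observation:

```python
def kaplan_meier(times, observed):
    """Product-limit (Kaplan-Meier) survival estimates.

    times    -- follow-up time for each user (e.g. days of activity logging)
    observed -- True if the event (here, nonusage attrition) occurred,
                False if the user was censored
    Returns (time, survival probability) pairs at each distinct event time.
    """
    s = 1.0
    curve = []
    event_times = sorted({t for t, e in zip(times, observed) if e})
    for t in event_times:
        at_risk = sum(1 for ti in times if ti >= t)
        events = sum(1 for ti, e in zip(times, observed) if e and ti == t)
        s *= 1 - events / at_risk
        curve.append((t, s))
    return curve

# Toy data: five users, three attrition events, two censored
demo = kaplan_meier([5, 10, 10, 20, 30], [True, True, False, True, False])
print([(t, round(s, 2)) for t, s in demo])  # [(5, 0.8), (10, 0.6), (20, 0.3)]
```

The step at each event time multiplies the running survival estimate by the fraction of at-risk users who did not drop out at that time.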
Oh, Jee-Young; Allison, Matthew A; Barrett-Connor, Elizabeth
2017-01-01
Although increases in the prevalence of hypertension (HTN) and diabetes mellitus are slowing in some high-income countries, HTN and diabetes mellitus remain the two major risk factors for atherosclerotic cardiovascular disease (CVD), the leading cause of death in the United States and worldwide. We aimed to observe the association of HTN and diabetes mellitus with all-cause and CVD mortality in older white adults. All community-dwelling Rancho Bernardo Study participants who were at least 55 years old and had carefully measured blood pressure and plasma glucose from a 75-g oral glucose tolerance test at the baseline visit (1984-1987, n = 2186) were followed up until death or the last clinic visit in 2013 (median 14.3 years, interquartile range 8.4-21.3). In unadjusted analyses, diabetes mellitus was associated with all-cause mortality [hazard ratio 1.40, 95% confidence interval (CI) 1.23-1.60] and CVD mortality (hazard ratio 1.67, 95% CI 1.39-2.00), and HTN with all-cause mortality [hazard ratio 1.93 (1.73-2.15)] and CVD mortality [hazard ratio 2.45 (2.10-2.93)]. After adjustment for cardiovascular risk factors, including age, BMI, triglycerides, HDL-cholesterol, smoking, exercise, and alcohol consumption, diabetes mellitus was associated with CVD mortality only (hazard ratio 1.25, P = 0.0213). Conversely, HTN was associated with both all-cause (hazard ratio 1.34, P < 0.0001) and CVD mortality (hazard ratio 1.40, P = 0.0003). Having both diabetes mellitus and HTN was associated with all-cause (hazard ratio 1.38, P = 0.0002) and CVD mortality (hazard ratio 1.70, P < 0.0001). We report the novel finding that HTN is more strongly associated with all-cause and CVD mortality than diabetes mellitus. Having both confers a modest further increase in the hazards for these types of mortality.
Chang, Y S; Chang, C C; Chen, Y H; Chen, W S; Chen, J H
2017-10-01
Objectives Patients with systemic lupus erythematosus are considered vulnerable to infective endocarditis, and prophylactic antibiotics are recommended before an invasive dental procedure. However, the evidence is insufficient. This nationwide population-based study evaluated the risk and related factors of infective endocarditis in systemic lupus erythematosus. Methods We identified 12,102 systemic lupus erythematosus patients from the National Health Insurance Research Database, and compared the incidence rate of infective endocarditis with that among 48,408 non-systemic lupus erythematosus controls. A Cox multivariable proportional hazards model was employed to evaluate the risk of infective endocarditis in the systemic lupus erythematosus cohort. Results After a mean follow-up of more than six years, the systemic lupus erythematosus cohort had a significantly higher incidence rate of infective endocarditis (42.58 vs 4.32 per 100,000 person-years, incidence rate ratio = 9.86, p < 0.001) than that of the control cohort. The risk was lower in the systemic lupus erythematosus cohort aged 60 years or older (adjusted hazard ratio 11.64) than in the cohort younger than 60 years (adjusted hazard ratio 15.82). Cox multivariate proportional hazards analysis revealed that heart disease (hazard ratio = 5.71, p < 0.001), chronic kidney disease (hazard ratio = 2.98, p = 0.034), receiving a dental procedure within 30 days (hazard ratio = 36.80, p < 0.001), and intravenous steroid therapy within 30 days (hazard ratio = 39.59, p < 0.001) were independent risk factors for infective endocarditis in systemic lupus erythematosus patients. Conclusions A higher risk of infective endocarditis was observed in systemic lupus erythematosus patients.
Risk factors for infective endocarditis in the systemic lupus erythematosus cohort included heart disease, chronic kidney disease, steroid pulse therapy within 30 days, and a recent invasive dental procedure within 30 days.
Muruet, Walter; Rudd, Anthony; Wolfe, Charles D A; Douiri, Abdel
2018-03-01
Intravenous thrombolysis with alteplase is one of the few approved treatments for acute ischemic stroke; nevertheless, little is known about its long-term effects on survival and recovery because clinical trial follow-up times are limited. Patients with first-ever strokes were registered between January 2005 and December 2015 in the population-based South London Stroke Register. Propensity scores were used to match thrombolyzed patients and controls at a 1:2 ratio by demographic and clinical covariates. The primary outcome was survival up to 10 years using Kaplan-Meier estimates, Cox proportional hazards, and restricted mean survival time. Secondary outcomes included stroke recurrence and functional status (Barthel Index and Frenchay Activities Index scores) at 5 years. From 2052 ischemic strokes, 246 treated patients were matched to 492 controls. Median follow-up time was 5.45 years (interquartile range, 4.56). Survival was higher in the treatment group (median, 5.72 years) compared with the control group (4.98 years; stratified log-rank test, P<0.001). The number needed to treat to prevent 1 death was 12 at 5 years and 20 at 10 years. After Cox regression analysis, thrombolysis reduced the risk of mortality by 37% (hazard ratio, 0.63; 95% confidence interval [CI], 0.48-0.82) at 10 years; however, after introducing a multiplicative interaction term into the model, the mortality risk reduction was 42% (hazard ratio, 0.58; 95% CI, 0.40-0.82) at 10 years for those arriving at the hospital within 3 hours. On average, over a 10-year period, treated patients lived 1 year longer than controls. At 5 years, thrombolysis was associated with independence (Barthel Index≥90; odds ratio, 3.76; 95% CI, 1.22-13.34) and increased odds of a higher Frenchay Activities Index (proportional odds ratio, 2.37; 95% CI, 1.16-4.91). There was no difference in stroke recurrence. Thrombolysis with intravenous alteplase is associated with improved long-term survival and functional status after ischemic stroke.
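The number-needed-to-treat figures above follow from the absolute difference in survival probabilities between the arms. A minimal sketch; the survival probabilities used in the demo are hypothetical, chosen only to reproduce the reported 5-year NNT of 12:

```python
import math

def number_needed_to_treat(surv_treated, surv_control):
    """NNT to prevent one death over a horizon: the reciprocal of the
    absolute risk reduction (difference in survival probabilities),
    rounded up to a whole number of patients."""
    arr = surv_treated - surv_control
    if arr <= 0:
        raise ValueError("treatment shows no absolute risk reduction")
    return math.ceil(1 / arr)

# Hypothetical 5-year survival probabilities (not reported in the abstract):
print(number_needed_to_treat(0.60, 0.515))  # 12
```

A larger NNT at 10 years than at 5 years, as reported, simply reflects a smaller absolute survival difference at the longer horizon.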
NASA Technical Reports Server (NTRS)
Kumar, K. V.; Calkins, Dick S.; Waligora, James M.; Gilbert, John H., III; Powell, Michael R.
1992-01-01
This study investigated the association between time at onset of circulating microbubbles (CMB) and symptoms of altitude decompression sickness (DCS), using Cox proportional hazards regression models. The study population consisted of 125 individuals who participated in direct-ascent, simulated extravehicular activity profiles. Using individual CMB status as a time-dependent variable, we found that the hazard for symptoms increased significantly (at the end of 180 min at altitude) in the presence of CMB (hazard ratio = 29.59; 95 percent confidence interval (95 percent CI) = 7.66-114.27), compared to no CMB. Further examination was conducted on the subgroup of individuals who developed microbubbles during the test (n = 49), by using Cox regression. Individuals with late onset of CMB (greater than 60 min at altitude) showed a significantly reduced risk of symptoms (hazard ratio = 0.92; 95 percent CI = 0.89-0.95), compared to those with early onset (equal to or less than 60 min), while controlling for other risk factors. We conclude that time to detection of circulating microbubbles is an independent determinant of symptoms of DCS.
Opioid Analgesics and Adverse Outcomes among Hemodialysis Patients.
Ishida, Julie H; McCulloch, Charles E; Steinman, Michael A; Grimes, Barbara A; Johansen, Kirsten L
2018-05-07
Patients on hemodialysis frequently experience pain and may be particularly vulnerable to opioid-related complications. However, data evaluating the risks of opioid use in patients on hemodialysis are limited. Using the US Renal Data System, we conducted a cohort study evaluating the association between opioid use (modeled as a time-varying exposure and expressed in standardized oral morphine equivalents) and time to first emergency room visit or hospitalization for altered mental status, fall, and fracture among 140,899 Medicare-covered adults receiving hemodialysis in 2011. We evaluated risk according to average daily total opioid dose (>60 mg, ≤60 mg, and per 60-mg dose increment) and specific agents (per 60-mg dose increment). The median age was 61 years old, 52% were men, and 50% were white. Sixty-four percent received opioids, and 17% had an episode of altered mental status (15,658 events), fall (7646 events), or fracture (4151 events) in 2011. Opioid use was associated with risk for all outcomes in a dose-dependent manner: altered mental status (lower dose: hazard ratio, 1.28; 95% confidence interval, 1.23 to 1.34; higher dose: hazard ratio, 1.67; 95% confidence interval, 1.56 to 1.78; hazard ratio, 1.29 per 60 mg; 95% confidence interval, 1.26 to 1.33), fall (lower dose: hazard ratio, 1.28; 95% confidence interval, 1.21 to 1.36; higher dose: hazard ratio, 1.45; 95% confidence interval, 1.31 to 1.61; hazard ratio, 1.04 per 60 mg; 95% confidence interval, 1.03 to 1.05), and fracture (lower dose: hazard ratio, 1.44; 95% confidence interval, 1.33 to 1.56; higher dose: hazard ratio, 1.65; 95% confidence interval, 1.44 to 1.89; hazard ratio, 1.04 per 60 mg; 95% confidence interval, 1.04 to 1.05). All agents were associated with a significantly higher hazard of altered mental status, and several agents were associated with a significantly higher hazard of fall and fracture. 
Opioids were associated with adverse outcomes in patients on hemodialysis, and this risk was present even at lower dosing and for agents that guidelines have recommended for use. Copyright © 2018 by the American Society of Nephrology.
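The exposure above is expressed in standardized oral morphine equivalents, which put different opioids on a common daily-dose scale. A minimal sketch of that conversion, using the widely published CDC morphine-milligram-equivalent (MME) factors (the study's exact factors are an assumption here, as are the agent names and doses):

```python
# Standardized oral morphine equivalents: each agent's daily dose (mg)
# times a published conversion factor. Factors below are the CDC MME
# table values; whether the study used exactly these is an assumption.
MME_FACTOR = {
    "morphine": 1.0,
    "hydrocodone": 1.0,
    "oxycodone": 1.5,
    "hydromorphone": 4.0,
    "codeine": 0.15,
    "tramadol": 0.1,
}

def daily_mme(prescriptions):
    """Sum morphine-equivalent mg/day over (agent, mg_per_day) pairs."""
    return sum(MME_FACTOR[agent] * mg_per_day for agent, mg_per_day in prescriptions)

# Hypothetical day of dosing: 20 mg oxycodone plus 100 mg tramadol.
dose = daily_mme([("oxycodone", 20.0), ("tramadol", 100.0)])
# 20*1.5 + 100*0.1 = 40 MME, i.e. the study's "<=60 mg" stratum
```

The >60 mg vs ≤60 mg strata in the abstract are cut on exactly this kind of standardized daily total.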
Coping strategies and self-esteem in the high-risk offspring of bipolar parents.
Goodday, Sarah M; Bentall, Richard; Jones, Steven; Weir, Arielle; Duffy, Anne
2018-03-01
This study investigated whether there were differences in coping strategies and self-esteem between offspring of parents with bipolar disorder (high-risk) and offspring of unaffected parents (control), and whether these psychological factors predicted the onset and recurrence of mood episodes. High-risk and control offspring were followed longitudinally as part of the Flourish Canadian high-risk bipolar offspring cohort study. Offspring were clinically assessed annually by a psychiatrist using semi-structured interviews and completed a measure of coping strategies and self-esteem. In high-risk offspring, avoidant coping strategies significantly increased the hazard of a new onset Diagnostic and Statistical Manual of Mental Disorders, 4th Edition, text revision (DSM-IV-TR) mood episode or recurrence (hazard ratio: 1.89, p = 0.04), while higher self-esteem significantly decreased this hazard (hazard ratio: 2.50, p < 0.01). Self-esteem and avoidant coping significantly interacted with one another (p < 0.05), such that the risk of a DSM-IV-TR new onset mood episode or recurrence was only significantly increased among high-risk offspring with both high avoidant coping and low self-esteem. A reduction of avoidant coping strategies in response to stress and improvement of self-esteem may be useful intervention targets for preventing the new onset or recurrence of a clinically significant mood disorder among individuals at high familial risk.
Herpes zoster as a risk factor for stroke and TIA: a retrospective cohort study in the UK.
Breuer, Judith; Pacou, Maud; Gautier, Aline; Brown, Martin M
2014-07-08
Stroke and TIA are recognized complications of acute herpes zoster (HZ). Herein, we evaluate HZ as a risk factor for cerebrovascular disease (stroke and TIA) and myocardial infarction (MI) in a UK population cohort. A retrospective cohort of 106,601 HZ cases and 213,202 controls, matched for age, sex, and general practice, was identified from the THIN (The Health Improvement Network) general practice database. Cox proportional hazard models were used to examine the risks of stroke, TIA, and MI in cases and controls, adjusted for vascular risk factors, including body mass index >30 kg/m(2), smoking, cholesterol >6.2 mmol/L, hypertension, diabetes, ischemic heart disease, atrial fibrillation, intermittent arterial claudication, carotid stenosis, and valvular heart disease, over 24 (median 6.3) years after HZ infection. Risk factors for vascular disease were significantly increased in cases of HZ compared with controls. Adjusted hazard ratios for TIA and MI but not stroke were increased in all patients with HZ (adjusted hazard ratios [95% confidence intervals]: 1.15 [1.09-1.21] and 1.10 [1.05-1.16], respectively). However, stroke, TIA, and MI were increased in cases whose HZ occurred when they were younger than 40 years (adjusted hazard ratios [95% confidence intervals]: 1.74 [1.13-2.66], 2.42 [1.34-4.36], and 1.49 [1.04-2.15], respectively). Subjects younger than 40 years were significantly less likely to be asked about vascular risk factors compared with older patients (p < 0.001). HZ is an independent risk factor for vascular disease in the UK population, particularly for stroke, TIA, and MI in subjects affected before the age of 40 years. In older subjects, better ascertainment of vascular risk factors and earlier intervention may explain the reduction in risk of stroke after HZ infection. © 2014 American Academy of Neurology.
Yang, Shih-Neng (Department of Biomedical Imaging and Radiological Science, China Medical University, Taichung, Taiwan); Liao, Chih-Ying
2011-03-15
Purpose: To investigate the prognostic value of the volume reduction rate (VRR) in patients with head-and-neck cancer treated with intensity-modulated radiotherapy (IMRT). Methods and Materials: Seventy-six patients with oropharyngeal cancer (OPC) and another 76 with hypopharyngeal cancer (HPC) were enrolled in volumetric analysis. All patients received allocated radiotherapy courses. Adaptive computed tomography was done 4 to 5 weeks after the start of IMRT. Primary tumor volume measurement was derived using separate images for the pretreatment gross tumor volume (pGTV) and the interval gross tumor volume. Results: In the OPC group, the pGTV ranged from 6.6 to 242.6 mL (mean, 49.9 mL), whereas the value of the VRR ranged from 0.014 to 0.74 (mean, 0.43). In HPC patients, the pGTV ranged from 4.1 to 152.4 mL (mean, 35.6 mL), whereas the VRR ranged from -1.15 to 0.79 (mean, 0.33). Multivariate analysis of the primary tumor relapse-free survival for OPC revealed three prognostic factors: T4 tumor (p = 0.0001, hazard ratio 7.38), pGTV ≥20 mL (p = 0.01, hazard ratio 10.61), and VRR <0.5 (p = 0.001, hazard ratio 6.49). Multivariate analysis of the primary tumor relapse-free survival for HPC showed two prognostic factors: pGTV ≥30 mL (p = 0.001, hazard ratio 2.87) and VRR <0.5 (p = 0.03, hazard ratio 2.25). Conclusion: The VRR is an outcome predictor for local control in OPC and HPC patients treated with IMRT. Those with large tumor volumes or a VRR <0.5 should be considered for a salvage operation or a dose-escalation scheme.
Lantz, Paula M.; Golberstein, Ezra; House, James S.; Morenoff, Jeffrey D.
2012-01-01
Many demographic, socioeconomic, and behavioral risk factors predict mortality in the United States. However, very few population-based longitudinal studies are able to investigate simultaneously the impact of a variety of social factors on mortality. We investigated the degree to which demographic characteristics, socioeconomic variables and major health risk factors were associated with mortality in a nationally-representative sample of 3,617 U.S. adults from 1986-2005, using data from the 4 waves of the Americans’ Changing Lives study. Cox proportional hazard models with time-varying covariates were employed to predict all-cause mortality verified through the National Death Index and death certificate review. The results revealed that low educational attainment was not associated with mortality when income and health risk behaviors were included in the model. The association of low-income with mortality remained after controlling for major behavioral risks. Compared to those in the “normal” weight category, neither overweight nor obesity was significantly associated with the risk of mortality. Among adults age 55 and older at baseline, the risk of mortality was actually reduced for those who were overweight (hazard rate ratio=0.83, 95% C.I. = 0.71 – 0.98) and those who were obese (hazard rate ratio=0.68, 95% C.I. = 0.55 – 0.84), controlling for other health risk behaviors and health status. Having a low level of physical activity was a significant risk factor for mortality (hazard rate ratio=1.58, 95% C.I. = 1.20 – 2.07). The results from this national longitudinal study underscore the need for health policies and clinical interventions focusing on the social and behavioral determinants of health, with a particular focus on income security, smoking prevention/cessation, and physical activity. PMID:20226579
Circulating ferritin concentrations and risk of type 2 diabetes in Japanese individuals.
Akter, Shamima; Nanri, Akiko; Kuwahara, Keisuke; Matsushita, Yumi; Nakagawa, Tohru; Konishi, Maki; Honda, Toru; Yamamoto, Shuichiro; Hayashi, Takeshi; Noda, Mitsuhiko; Mizoue, Tetsuya
2017-07-01
Higher iron storage has been linked to an increased risk of type 2 diabetes, but little is known about the mediator of this association. Here, we prospectively investigated the association between circulating ferritin, a marker of iron storage, and the incidence of type 2 diabetes among Japanese individuals. The participants were 4,754 employees who attended a comprehensive health check-up in 2008-2009 and donated blood for the study. During 5 years of follow up, diabetes was identified based on plasma glucose, glycated hemoglobin and self-report. Two controls matched to each case on sex, age and date of check-up were randomly chosen using density sampling, giving 327 cases and 641 controls with ferritin measurement. Cox proportional hazards regression was used to estimate the hazard ratio while adjusting for a series of potential confounders or mediators. Elevated serum ferritin levels were associated with a significantly increased risk of type 2 diabetes, with the hazard ratio adjusted for known risk factors in the highest vs lowest quartile of 1.42 (95% confidence interval: 1.03-1.96). This association was unchanged after adjustment for C-reactive protein and adiponectin, but attenuated after adjustment for liver enzyme and insulin resistance (hazard ratio 1.04). The ferritin-diabetes association was confined to non-obese participants. These results suggest that elevated iron storage is associated with increased risk of type 2 diabetes in normal weight individuals, and that this association is partly mediated through liver dysfunction and resulting insulin resistance. © 2017 The Authors. Journal of Diabetes Investigation published by Asian Association for the Study of Diabetes (AASD) and John Wiley & Sons Australia, Ltd.
Sim, John J.; Bhandari, Simran K.; Shi, Jiaxiao; Reynolds, Kristi; Calhoun, David A.; Kalantar-Zadeh, Kamyar; Jacobsen, Steven J.
2015-01-01
We sought to compare the risk of end stage renal disease (ESRD), ischemic heart event (IHE), congestive heart failure (CHF), cerebrovascular accident (CVA), and all-cause mortality among 470,386 individuals with resistant and nonresistant hypertension (non-RH). Resistant hypertension (60,327 individuals) was sub-categorized into 2 groups; 23,104 patients with cRH (controlled on 4 or more medicines) and 37,223 patients with uRH (uncontrolled on 3 or more medicines) in a 5 year retrospective cohort study. Cox proportional hazard modeling was used to estimate hazard ratios adjusting for age, gender, race, body mass index, chronic kidney disease (CKD), and co-morbidities. Resistant hypertension (cRH and uRH) compared to non-RH, had multivariable adjusted hazard ratios (95% confidence intervals) of 1.32 (1.27–1.37), 1.24 (1.20–1.28), 1.46 (1.40–1.52), 1.14 (1.10–1.19), and 1.06 (1.03–1.08) for ESRD, IHE, CHF, CVA, and mortality, respectively. Comparison of uRH to cRH had hazard ratios of 1.25 (1.18–1.33), 1.04 (0.99–1.10), 0.94 (0.89–1.01), 1.23 (1.14–1.31), and 1.01 (0.97–1.05) for ESRD, IHE, CHF, CVA, and mortality, respectively. Males and Hispanics had greater risk for ESRD within all 3 cohorts. Resistant hypertension had greater risk for ESRD, IHE, CHF, CVA, and mortality. The risks of ESRD and CVA were 25% and 23% greater, respectively, in uRH compared to cRH, supporting the linkage between blood pressure and both outcomes. PMID:25945406
Risk of Cardiovascular Events in Patients With Diabetes Mellitus on β-Blockers.
Tsujimoto, Tetsuro; Sugiyama, Takehiro; Shapiro, Martin F; Noda, Mitsuhiko; Kajio, Hiroshi
2017-07-01
Although the use of β-blockers may help in achieving maximum effects of intensive glycemic control because of a decrease in the adverse effects after severe hypoglycemia, they pose a potential risk for the occurrence of severe hypoglycemia. This study aimed to evaluate whether the use of β-blockers is effective in patients with diabetes mellitus and whether its use is associated with the occurrence of severe hypoglycemia. Using the ACCORD trial (Action to Control Cardiovascular Risk in Diabetes) data, we performed Cox proportional hazards analyses with a propensity score adjustment. The primary outcome was the first occurrence of a cardiovascular event during the study period, which included nonfatal myocardial infarction, unstable angina, nonfatal stroke, and cardiovascular death. The mean follow-up periods (±SD) were 4.6±1.6 years in patients on β-blockers (n=2527) and 4.7±1.6 years in those not on β-blockers (n=2527). The cardiovascular event rate was significantly higher in patients on β-blockers than in those not on β-blockers (hazard ratio, 1.46; 95% confidence interval, 1.24-1.72; P<0.001). In patients with coronary heart disease or heart failure, the cumulative event rate for cardiovascular events was also significantly higher in those on β-blockers than in those not on β-blockers (hazard ratio, 1.27; 95% confidence interval, 1.02-1.60; P=0.03). The incidence of severe hypoglycemia was significantly higher in patients on β-blockers than in those not on β-blockers (hazard ratio, 1.30; 95% confidence interval, 1.03-1.64; P=0.02). In conclusion, the use of β-blockers in patients with diabetes mellitus was associated with an increased risk for cardiovascular events. © 2017 The Authors.
Haynes, Erin; Lanphear, Bruce P; Tohn, Ellen; Farr, Nick; Rhoads, George G
2002-01-01
Dust control is often recommended to prevent children's exposure to residential lead hazards, but the effect of these controls on children's blood lead concentrations is uncertain. We conducted a systematic review of randomized, controlled trials of low-cost, lead hazard control interventions to determine the effect of lead hazard control on children's blood lead concentration. Four trials met the inclusion criteria. We examined mean blood lead concentration and elevated blood lead concentrations (> or = 10 microg/dL, > or = 15 microg/dL, and > or = 20 microg/dL) and found no significant differences in mean change in blood lead concentration for children by random group assignment (children assigned to the intervention group compared with those assigned to the control group). We found no significant difference between the intervention and control groups in the percentage of children with blood lead > or = 10 microg/dL, 29% versus 32% [odds ratio (OR), 0.85; 95% confidence interval (CI), 0.56-1.3], but there was a significant difference in the percentage of children with blood lead > or = 15 microg/dL between the intervention and control groups, 6% versus 14% (OR, 0.40; 95% CI, 0.21-0.80) and in the percentage of children with blood lead > or = 20 microg/dL between the intervention and control groups, 2% versus 6% (OR, 0.29; 95% CI, 0.10-0.85). We conclude that although low-cost, interior lead hazard control was associated with 50% or greater reduction in the proportion of children who had blood lead concentrations exceeding 15 microg/dL and > or = 20 microg/dL, there was no substantial effect on mean blood lead concentration. PMID:11781171
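The odds ratios above can be approximated directly from the reported group percentages. A minimal sketch for the ≥15 µg/dL comparison (6% intervention vs. 14% control); because it starts from rounded percentages rather than the underlying counts, it only approximates the reported OR of 0.40:

```python
# Odds ratio from two group proportions: OR = odds(p1) / odds(p2).
# Proportions are the rounded percentages from the abstract, so the
# result approximates (not reproduces) the count-based OR of 0.40.
def odds(p):
    """Convert a proportion to odds."""
    return p / (1.0 - p)

p_intervention, p_control = 0.06, 0.14   # blood lead >= 15 ug/dL
or_15 = odds(p_intervention) / odds(p_control)   # ~0.39
```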
Marrie, Thomas J; Minhas-Sandhu, Jasjeet K; Majumdar, Sumit R
2017-01-01
Objective: To determine the attributable risk of community acquired pneumonia on incidence of heart failure throughout the age range of affected patients and severity of the infection. Design: Cohort study. Setting: Six hospitals and seven emergency departments in Edmonton, Alberta, Canada, 2000-02. Participants: 4988 adults with community acquired pneumonia and no history of heart failure were prospectively recruited and matched on age, sex, and setting of treatment (inpatient or outpatient) with up to five adults without pneumonia (controls) or prevalent heart failure (n=23 060). Main outcome measures: Risk of hospital admission for incident heart failure or a combined endpoint of heart failure or death up to 2012, evaluated using multivariable Cox proportional hazards analyses. Results: The average age of participants was 55 years, 2649 (53.1%) were men, and 63.4% were managed as outpatients. Over a median of 9.9 years (interquartile range 5.9-10.6), 11.9% (n=592) of patients with pneumonia had incident heart failure compared with 7.4% (n=1712) of controls (adjusted hazard ratio 1.61, 95% confidence interval 1.44 to 1.81). Patients with pneumonia aged 65 or less had the lowest absolute increase (but greatest relative risk) of heart failure compared with controls (4.8% v 2.2%; adjusted hazard ratio 1.98, 95% confidence interval 1.5 to 2.53), whereas patients with pneumonia aged more than 65 years had the highest absolute increase (but lowest relative risk) of heart failure (24.8% v 18.9%; adjusted hazard ratio 1.55, 1.36 to 1.77). Results were consistent in the short term (90 days) and intermediate term (one year) and whether patients were treated in hospital or as outpatients. Conclusion: Our results show that community acquired pneumonia substantially increases the risk of heart failure across the age and severity range of cases. 
This should be considered when formulating post-discharge care plans and preventive strategies, and assessing downstream episodes of dyspnoea. PMID:28193610
2009-01-01
Background: During the last part of the 1990s the chance of surviving breast cancer increased. Changes in survival functions reflect a mixture of effects. Both the introduction of adjuvant treatments and early screening with mammography played a role in the decline in mortality. Evaluating the contribution of these interventions using mathematical models requires survival functions before and after their introduction. Furthermore, the required survival functions may be different by age groups and are related to disease stage at diagnosis. Sometimes detailed information is not available, as was the case for the region of Catalonia (Spain). Then one may derive the functions using information from other geographical areas. This work presents the methodology used to estimate age- and stage-specific Catalan breast cancer survival functions from scarce Catalan survival data by adapting the age- and stage-specific US functions. Methods: Cubic splines were used to smooth data and obtain continuous hazard rate functions. Afterwards, we fitted a Poisson model to derive hazard ratios. The model included time as a covariate. Then the hazard ratios were applied to US survival functions detailed by age and stage to obtain Catalan estimations. Results: We started by estimating the hazard ratios for Catalonia versus the USA before and after the introduction of screening. The hazard ratios were then multiplied by the age- and stage-specific breast cancer hazard rates from the USA to obtain the Catalan hazard rates. We also compared breast cancer survival in Catalonia and the USA in two time periods, before cancer control interventions (USA 1975–79, Catalonia 1980–89) and after (USA and Catalonia 1990–2001). Survival in Catalonia in the 1980–89 period was worse than in the USA during 1975–79, but the differences disappeared in 1990–2001. Conclusion: Our results suggest that access to better treatments and quality of care contributed to large improvements in survival in Catalonia. 
On the other hand, we obtained detailed breast cancer survival functions that will be used for modeling the effect of screening and adjuvant treatments in Catalonia. PMID:19331670
Chronic Use of Theophylline and Mortality in Chronic Obstructive Pulmonary Disease: A Meta-analysis.
Horita, Nobuyuki; Miyazawa, Naoki; Kojima, Ryota; Inoue, Miyo; Ishigatsubo, Yoshiaki; Kaneko, Takeshi
2016-05-01
Theophylline has been shown to improve respiratory function and oxygenation in patients with chronic obstructive pulmonary disease (COPD). However, the impact of theophylline on mortality in COPD patients has not been sufficiently evaluated. Two investigators independently searched for eligible articles in 4 databases. The eligibility criterion for this meta-analysis was an original research article that provided a hazard ratio for theophylline for all-cause mortality of COPD patients. Both randomized controlled trials and observational studies were accepted. After we confirmed no substantial heterogeneity (I(2)<50%), the fixed-model method with generic inverse variance was used for meta-analysis to estimate the pooled hazard ratio. We screened 364 potentially eligible articles. Of the 364 articles, 259 were excluded on the basis of title and abstract, and 99 were excluded after examination of the full text. Our final analysis included 6 observational studies and no randomized controlled trials. One study reported 2 cohorts. The number of patients in each cohort ranged from 47 to 46,403. Heterogeneity (I(2)=42%, P=.11) and publication bias (Begg's test r=0.21, P=.662) were not substantial. Fixed-model meta-analysis yielded a pooled hazard ratio for theophylline for all-cause death of 1.07 (95% confidence interval: 1.02-1.13, P=.003). This meta-analysis of 7 observational cohorts suggests that theophylline slightly increases all-cause death in COPD patients. Copyright © 2014 SEPAR. Published by Elsevier Espana. All rights reserved.
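The generic-inverse-variance, fixed-effect pooling named above works on the log hazard-ratio scale: each cohort is weighted by the inverse of its squared standard error. A minimal sketch with illustrative numbers (not the study's data):

```python
import math

# Fixed-effect inverse-variance meta-analysis of hazard ratios.
# Pooling happens on the log scale; weights are w_i = 1 / SE_i^2.
# The per-cohort HRs and SEs below are illustrative only.
hrs = [1.05, 1.12, 1.03, 1.08]        # per-cohort hazard ratios
ses = [0.03, 0.06, 0.04, 0.05]        # standard errors of log(HR)

weights = [1.0 / se**2 for se in ses]
log_hrs = [math.log(hr) for hr in hrs]

pooled_log = sum(w * lh for w, lh in zip(weights, log_hrs)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))

pooled_hr = math.exp(pooled_log)
ci_low = math.exp(pooled_log - 1.96 * pooled_se)
ci_high = math.exp(pooled_log + 1.96 * pooled_se)
```

Because the weights shrink the pooled confidence interval relative to any single cohort, even a modest effect such as the study's HR 1.07 can reach significance once several cohorts are combined.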
Guyot, Patricia; Ades, A E; Ouwens, Mario J N M; Welton, Nicky J
2012-02-01
The results of randomized controlled trials (RCTs) on time-to-event outcomes are usually reported as the median time to event and the Cox hazard ratio. These do not constitute the sufficient statistics required for meta-analysis or cost-effectiveness analysis, and their use in secondary analyses requires strong assumptions that may not have been adequately tested. In order to enhance the quality of secondary data analyses, we propose a method which derives from published Kaplan-Meier survival curves a close approximation to the original individual patient time-to-event data from which they were generated. We develop an algorithm that maps from digitised curves back to KM data by finding numerical solutions to the inverted KM equations, using where available information on number of events and numbers at risk. The reproducibility and accuracy of survival probabilities, median survival times and hazard ratios based on reconstructed KM data was assessed by comparing published statistics (survival probabilities, medians and hazard ratios) with statistics based on repeated reconstructions by multiple observers. The validation exercise established there was no material systematic error and that there was a high degree of reproducibility for all statistics. Accuracy was excellent for survival probabilities and medians; for hazard ratios, reasonable accuracy can only be obtained if at least numbers at risk or the total number of events are reported. The algorithm is a reliable tool for meta-analysis and cost-effectiveness analyses of RCTs reporting time-to-event data. It is recommended that all RCTs should report information on numbers at risk and total number of events alongside KM curves.
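The identity at the heart of inverting the KM equations is the Kaplan-Meier step, S_i = S_{i-1}·(n_i − d_i)/n_i, which can be solved for the interval event count d_i when numbers at risk are reported. A simplified sketch on synthetic data (the full published algorithm also handles censoring and digitisation error, which this omits):

```python
# Recover interval event counts from published Kaplan-Meier survival
# probabilities and numbers at risk. Each KM step satisfies
#   S_i = S_{i-1} * (n_i - d_i) / n_i
# so the event count inverts to d_i = n_i * (1 - S_i / S_{i-1}).
# Synthetic, censoring-free data for illustration only.
surv = [1.0, 0.8, 0.7, 0.4]      # KM survival at baseline, t1, t2, t3
n_at_risk = [10, 8, 7]           # numbers at risk just before t1..t3

events = []
for i, n in enumerate(n_at_risk):
    d = round(n * (1.0 - surv[i + 1] / surv[i]))  # invert the KM step
    events.append(d)
# events == [2, 1, 3]
```

With the event counts recovered at each published time point, an approximate individual-patient dataset can be rebuilt and re-analysed, which is what makes the reported numbers-at-risk tables so valuable.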
Mocroft, Amanda; Sterne, Jonathan A C; Egger, Matthias; May, Margaret; Grabar, Sophie; Furrer, Hansjakob; Sabin, Caroline; Fatkenheuer, Gerd; Justice, Amy; Reiss, Peter; d'Arminio Monforte, Antonella; Gill, John; Hogg, Robert; Bonnet, Fabrice; Kitahata, Mari; Staszewski, Schlomo; Casabona, Jordi; Harris, Ross; Saag, Michael
2009-04-15
The extent to which mortality differs following individual acquired immunodeficiency syndrome (AIDS)-defining events (ADEs) has not been assessed among patients initiating combination antiretroviral therapy. We analyzed data from 31,620 patients with no prior ADEs who started combination antiretroviral therapy. Cox proportional hazards models were used to estimate mortality hazard ratios for each ADE that occurred in >50 patients, after stratification by cohort and adjustment for sex, HIV transmission group, number of antiretroviral drugs initiated, regimen, age, date of starting combination antiretroviral therapy, and CD4+ cell count and HIV RNA load at initiation of combination antiretroviral therapy. ADEs that occurred in <50 patients were grouped together to form a "rare ADEs" category. During a median follow-up period of 43 months (interquartile range, 19-70 months), 2880 ADEs were diagnosed in 2262 patients; 1146 patients died. The most common ADEs were esophageal candidiasis (in 360 patients), Pneumocystis jiroveci pneumonia (320 patients), and Kaposi sarcoma (308 patients). The greatest mortality hazard ratio was associated with non-Hodgkin's lymphoma (hazard ratio, 17.59; 95% confidence interval, 13.84-22.35) and progressive multifocal leukoencephalopathy (hazard ratio, 10.0; 95% confidence interval, 6.70-14.92). Three groups of ADEs were identified on the basis of the ranked hazard ratios with bootstrapped confidence intervals: severe (non-Hodgkin's lymphoma and progressive multifocal leukoencephalopathy [hazard ratio, 7.26; 95% confidence interval, 5.55-9.48]), moderate (cryptococcosis, cerebral toxoplasmosis, AIDS dementia complex, disseminated Mycobacterium avium complex, and rare ADEs [hazard ratio, 2.35; 95% confidence interval, 1.76-3.13]), and mild (all other ADEs [hazard ratio, 1.47; 95% confidence interval, 1.08-2.00]). In the combination antiretroviral therapy era, mortality rates subsequent to an ADE depend on the specific diagnosis. 
The proposed classification of ADEs may be useful in clinical end point trials, prognostic studies, and patient management.
Intensive Versus Standard Blood Pressure Control in SPRINT-Eligible Participants of ACCORD-BP.
Buckley, Leo F; Dixon, Dave L; Wohlford, George F; Wijesinghe, Dayanjan S; Baker, William L; Van Tassell, Benjamin W
2017-12-01
We sought to determine the effect of intensive blood pressure (BP) control on cardiovascular outcomes in participants with type 2 diabetes mellitus (T2DM) and additional risk factors for cardiovascular disease (CVD). This study was a post hoc, multivariate, subgroup analysis of ACCORD-BP (Action to Control Cardiovascular Risk in Diabetes Blood Pressure) participants. Participants were eligible for the analysis if they were in the standard glucose control arm of ACCORD-BP and also had the additional CVD risk factors required for SPRINT (Systolic Blood Pressure Intervention Trial) eligibility. We used a Cox proportional hazards regression model to compare the effect of intensive versus standard BP control on CVD outcomes. The "SPRINT-eligible" ACCORD-BP participants were pooled with SPRINT participants to determine whether the effects of intensive BP control interacted with T2DM. The mean baseline Framingham 10-year CVD risk scores were 14.5% and 14.8%, respectively, in the intensive and standard BP control groups. The mean achieved systolic BP values were 120 and 134 mmHg in the intensive and standard BP control groups (P < 0.001). Intensive BP control reduced the composite of CVD death, nonfatal myocardial infarction (MI), nonfatal stroke, any revascularization, and heart failure (hazard ratio 0.79; 95% CI 0.65-0.96; P = 0.02). Intensive BP control also reduced CVD death, nonfatal MI, and nonfatal stroke (hazard ratio 0.69; 95% CI 0.51-0.93; P = 0.01). Treatment-related adverse events occurred more frequently in participants receiving intensive BP control (4.1% vs. 2.1%; P = 0.003). The effect of intensive BP control on CVD outcomes did not differ between patients with and without T2DM (P > 0.62). Intensive BP control reduced CVD outcomes in a cohort of participants with T2DM and additional CVD risk factors. © 2017 by the American Diabetes Association.
Comín-Colet, Josep; Enjuanes, Cristina; Verdú-Rotellar, José M; Linas, Anna; Ruiz-Rodriguez, Pilar; González-Robledo, Gina; Farré, Núria; Moliner-Borja, Pedro; Ruiz-Bustillo, Sonia; Bruguera, Jordi
2016-07-01
The role of telemedicine in the management of patients with chronic heart failure (HF) has not been fully elucidated. We hypothesized that multidisciplinary comprehensive HF care could achieve better results when it is delivered using telemedicine. In this study, 178 eligible patients with HF were randomized to either structured follow-up on the basis of face-to-face encounters (control group, 97 patients) or delivering health care using telemedicine (81 patients). Telemedicine included daily signs and symptoms based on telemonitoring and structured follow-up by means of video or audio-conference. The primary end-point was non-fatal HF events after six months of follow-up. The median age of the patients was 77 years, 41% were female, and 25% were frail patients. The hazard ratio for the primary end-point was 0.35 (95% confidence interval (CI), 0.20-0.59; p-value < 0.001) in favour of telemedicine. HF readmission (hazard ratio 0.39 (0.19-0.77); p-value=0.007) and cardiovascular readmission (hazard ratio 0.43 (0.23-0.80); p-value=0.008) were also reduced in the telemedicine group. Mortality was similar in both groups (telemedicine: 6.2% vs control: 12.4%, p-value > 0.05). The telemedicine group experienced a significant mean net reduction in direct hospital costs of €3546 per patient per six months of follow-up. Among patients managed in the setting of a comprehensive HF programme, the addition of telemedicine may result in better outcomes and reduction of costs. © The Author(s) 2015.
Assessing the Impact of Analytical Error on Perceived Disease Severity.
Kroll, Martin H; Garber, Carl C; Bi, Caixia; Suffin, Stephen C
2015-10-01
The perception of the severity of disease from laboratory results assumes that the results are free of analytical error; however, analytical error creates a spread of results into a band and thus a range of perceived disease severity. To assess the impact of analytical errors by calculating the change in perceived disease severity, represented by the hazard ratio, using non-high-density lipoprotein (nonHDL) cholesterol as an example. We transformed nonHDL values into ranges using the assumed total allowable errors for total cholesterol (9%) and high-density lipoprotein cholesterol (13%). Using a previously determined relationship between the hazard ratio and nonHDL, we calculated a range of hazard ratios for specified nonHDL concentrations affected by analytical error. Analytical error, within allowable limits, created a band of values of nonHDL, with a width spanning 30 to 70 mg/dL (0.78-1.81 mmol/L), depending on the cholesterol and high-density lipoprotein cholesterol concentrations. Hazard ratios ranged from 1.0 to 2.9, a 16% to 50% error. Increased bias widens this range and decreased bias narrows it. Error-transformed results produce a spread of values that straddle the various cutoffs for nonHDL. The range of the hazard ratio obscures the meaning of results, because the spread of ratios at different cutoffs overlap. The magnitude of the perceived hazard ratio error exceeds that for the allowable analytical error, and significantly impacts the perceived cardiovascular disease risk. Evaluating the error in the perceived severity (eg, hazard ratio) provides a new way to assess the impact of analytical error.
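The band of nonHDL values described above follows directly from nonHDL = TC − HDL and the stated total allowable errors (9% for total cholesterol, 13% for HDL cholesterol): the band is widest when the two errors push in opposite directions. A minimal sketch with illustrative concentrations (the abstract's HR-vs-nonHDL relationship is not reproduced here):

```python
# Worst-case band of nonHDL = TC - HDL under allowable analytical
# error: 9% on total cholesterol, 13% on HDL cholesterol (from the
# abstract). TC/HDL concentrations below are illustrative.
TC, HDL = 200.0, 50.0            # mg/dL
tea_tc, tea_hdl = 0.09, 0.13     # total allowable error, fractional

nonhdl = TC - HDL                                 # reported value: 150
hi = TC * (1 + tea_tc) - HDL * (1 - tea_hdl)      # errors oppose: 174.5
lo = TC * (1 - tea_tc) - HDL * (1 + tea_hdl)      # errors oppose: 125.5
width = hi - lo                                   # 49 mg/dL band
```

A 49 mg/dL band for this example sits inside the 30 to 70 mg/dL span reported in the abstract, and a band this wide readily straddles the clinical nonHDL cutoffs, which is why the perceived hazard ratio spreads over a range rather than a single value.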
de León, A Cabrera; Coello, S Domínguez; González, D Almeida; Díaz, B Brito; Rodríguez, J C del Castillo; Hernández, A González; Aguirre-Jaime, A; Pérez, M del Cristo Rodríguez
2012-03-01
To estimate the incidence rate and risk factors for diabetes in the Canary Islands. A total of 5521 adults without diabetes were followed for a median of 3.5 years. Incident cases of diabetes were self-declared and validated in medical records. The following factors were assessed by Cox regression to estimate the hazard ratios for diabetes: impaired fasting glucose (5.6 mmol/l ≤ fasting glucose ≤ 6.9 mmol/l), BMI, waist-to-height ratio (≥ 0.55), insulin resistance (defined as triglycerides/HDL cholesterol ≥ 3), familial antecedents of diabetes, Canarian ancestry, smoking, alcohol intake, sedentary lifestyle, Mediterranean diet, social class and the metabolic syndrome. The incidence rate was 7.5/10³ person-years (95% CI 6.4-8.8). The greatest risks were obtained for impaired fasting glucose (hazard ratio 2.6; 95% CI 1.8-3.8), Canarian ancestry (hazard ratio 1.9; 95% CI 1.0-3.4), waist-to-height ratio (hazard ratio 1.7; 95% CI 1.1-2.5), insulin resistance (hazard ratio 1.5; 95% CI 1.0-2.2) and paternal history of diabetes (hazard ratio 1.5; 95% CI 1.0-2.3). The metabolic syndrome (hazard ratio 1.9; 95% CI 1.3-2.8) and BMI ≥ 30 kg/m² (hazard ratio 1.7; 95% CI 1.0-2.7) were significant only when their effects were not adjusted for impaired fasting glucose and waist-to-height ratio, respectively. The incidence of diabetes in the Canary Islands is 1.5-fold higher than that in continental Spain and 1.7-fold higher than in the UK. The main predictors of diabetes were impaired fasting glucose, Canarian ancestry, waist-to-height ratio and insulin resistance. The metabolic syndrome predicted diabetes only when its effect was not adjusted for impaired fasting glucose. In individuals with Canarian ancestry, genetic susceptibility studies may be advisable. In order to propose preventive strategies, impaired fasting glucose, waist-to-height ratio and triglyceride/HDL cholesterol should be used to identify subjects with an increased risk of developing diabetes.
© 2011 The Authors. Diabetic Medicine © 2011 Diabetes UK.
Correction of Selection Bias in Survey Data: Is the Statistical Cure Worse Than the Bias?
Hanley, James A
2017-04-01
In previous articles in the American Journal of Epidemiology (Am J Epidemiol. 2013;177(5):431-442) and American Journal of Public Health (Am J Public Health. 2013;103(10):1895-1901), Masters et al. reported age-specific hazard ratios for the contrasts in mortality rates between obesity categories. They corrected the observed hazard ratios for selection bias caused by what they postulated was the nonrepresentativeness of the participants in the National Health Interview Study that increased with age, obesity, and ill health. However, it is possible that their regression approach to remove the alleged bias has not produced, and in general cannot produce, sensible hazard ratio estimates. First, one must consider how many nonparticipants there might have been in each category of obesity and of age at entry and how much higher the mortality rates would have to be in nonparticipants than in participants in these same categories. What plausible set of numerical values would convert the ("biased") decreasing-with-age hazard ratios seen in the data into the ("unbiased") increasing-with-age ratios that they computed? Can these values be encapsulated in (and can sensible values be recovered from) one additional internal variable in a regression model? Second, one must examine the age pattern of the hazard ratios that have been adjusted for selection. Without the correction, the hazard ratios are attenuated with increasing age. With it, the hazard ratios at older ages are considerably higher, but those at younger ages are well below one. Third, one must test whether the regression approach suggested by Masters et al. would correct the nonrepresentativeness that increased with age and ill health that I introduced into real and hypothetical data sets. I found that the approach did not recover the hazard ratio patterns present in the unselected data sets: the corrections overshot the target at older ages and undershot it at lower ages.
Migraine and risk of stroke: a national population-based twin study.
Lantz, Maria; Sieurin, Johanna; Sjölander, Arvid; Waldenlind, Elisabet; Sjöstrand, Christina; Wirdefeldt, Karin
2017-10-01
Numerous studies have indicated an increased risk for stroke in patients with migraine, especially migraine with aura; however, many studies used self-reported migraine and only a few controlled for familial factors. We aimed to investigate migraine as a risk factor for stroke in a Swedish population-based twin cohort, and whether familial factors contribute to an increased risk. The study population included twins without prior cerebrovascular disease who answered a headache questionnaire during 1998 and 2002 for twins born 1935-58 and during 2005-06 for twins born between 1959 and 1985. Migraine with and without aura and probable migraine was defined by an algorithm mapping on to clinical diagnostic criteria according to the International Classification of Headache Disorders. Stroke diagnoses were obtained from the national patient and cause of death registers. Twins were followed longitudinally, by linkage of national registers, from date of interview until date of first stroke, death, or end of study on 31 Dec 2014. In total, 8635 twins had any migraineous headache, whereof 3553 had migraine with aura and 5082 had non-aura migraineous headache (including migraine without aura and probable migraine), and 44 769 twins had no migraine. During a mean follow-up time of 11.9 years we observed 1297 incident cases of stroke. The Cox proportional hazards model with attained age as underlying time scale was used to estimate hazard ratios with 95% confidence intervals for stroke including ischaemic and haemorrhagic subtypes related to migraine with aura, non-aura migraineous headache, and any migraineous headache. Analyses were adjusted for gender and cardiovascular risk factors. Where appropriate, within-pair analyses were performed to control for confounding by familial factors.
The age- and gender-adjusted hazard ratio for stroke related to migraine with aura was 1.27 (95% confidence interval 1.00-1.62), P = 0.05, and 1.07 (95% confidence interval 0.91-1.26), P = 0.39 related to any migraineous headache. Multivariable adjusted analyses showed similar results. When stratified by gender and attained age of ≤50 or >50 years, the estimated hazard ratio for stroke was higher in twins younger than 50 years and in females; however, non-significant. In the within-pair analysis, the hazard ratio for stroke related to migraine with aura was attenuated [hazard ratio 1.09 (95% confidence interval 0.81-1.46), P = 0.59]. In conclusion, we observed no increased stroke risk related to migraine overall but there was a modestly increased risk for stroke related to migraine with aura, and within-pair analyses suggested that familial factors might contribute to this association. © The Author (2017). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Bhattacherjee, Ashis; Kunar, Bijay Mihir; Baumann, Michele; Chau, Nearkasen
2013-12-01
The role of occupational hazards in occupational injury may be mediated by individual factors across various age groups. This study assessed the role of occupational hazards as well as the contribution of individual factors to injuries among Indian and French coalminers. We conducted a case-control study on 245 injured workers and on 330 controls without any injuries from Indian coal mines using face-to-face interviews, and a retrospective study on 516 French coalminers using a self-administered questionnaire including potential occupational and personal factors. Data were analyzed using logistic models. The annual rate of injuries was 5.5% for Indian coalminers and 14.9% for the French ones. A logistic model including all occupational factors showed that major injury causes were: hand-tools, material handling, machines, and environment/work-geological/strata conditions among Indian miners (adjusted odds ratios 2.01 to 3.30) and biomechanical exposure score among French miners (adjusted odds ratio 3.01 for score 1-4, 3.47 for score 5-7, and 7.26 for score ≥ 8, vs. score 0). Personal factors among Indian and French coalminers reduced or exacerbated the roles of various occupational hazards to a different extent depending on workers' age. We conclude that the injury roles of occupational hazards were reduced or exacerbated by personal factors depending on workers' age in both populations. This knowledge is useful when designing prevention measures, which should take workers' age into account.
Gaw, Sally; Brooks, Bryan W
2016-04-01
Pharmaceuticals are ubiquitous contaminants in aquatic ecosystems. Adaptive monitoring, assessment, and management programs will be required to reduce the environmental hazards of pharmaceuticals of concern. Potentially underappreciated factors that drive the environmental dose of pharmaceuticals include regulatory approvals, marketing campaigns, pharmaceutical subsidies and reimbursement schemes, and societal acceptance. Sales data for 5 common antidepressants (duloxetine [Cymbalta], escitalopram [Lexapro], venlafaxine [Effexor], bupropion [Wellbutrin], and sertraline [Zoloft]) in the United States from 2004 to 2008 were modeled to explore how environmental hazards in aquatic ecosystems changed after patents were obtained or expired. Therapeutic hazard ratios for Effexor and Lexapro did not exceed 1; however, the therapeutic hazard ratio for Zoloft declined whereas the therapeutic hazard ratio for Cymbalta increased as a function of patent protection and sale patterns. These changes in therapeutic hazard ratios highlight the importance of considering current and future drivers of pharmaceutical use when prioritizing pharmaceuticals for water quality monitoring programs. When urban systems receiving discharges of environmental contaminants are examined, water quality efforts should identify, prioritize, and select target analytes presently in commerce for effluent monitoring and surveillance. © 2015 SETAC.
Jarvis, J; Seed, M; Elton, R; Sawyer, L; Agius, R
2005-01-01
Aims: To investigate quantitatively, relationships between chemical structure and reported occupational asthma hazard for low molecular weight (LMW) organic compounds; to develop and validate a model linking asthma hazard with chemical substructure; and to generate mechanistic hypotheses that might explain the relationships. Methods: A learning dataset used 78 LMW chemical asthmagens reported in the literature before 1995, and 301 control compounds with recognised occupational exposures and hazards other than respiratory sensitisation. The chemical structures of the asthmagens and control compounds were characterised by the presence of chemical substructure fragments. Odds ratios were calculated for these fragments to determine which were associated with a likelihood of being reported as an occupational asthmagen. Logistic regression modelling was used to identify the independent contribution of these substructures. A post-1995 set of 21 asthmagens and 77 controls were selected to externally validate the model. Results: Nitrogen or oxygen containing functional groups such as isocyanate, amine, acid anhydride, and carbonyl were associated with an occupational asthma hazard, particularly when the functional group was present twice or more in the same molecule. A logistic regression model using only statistically significant independent variables for occupational asthma hazard correctly assigned 90% of the model development set. The external validation showed a sensitivity of 86% and specificity of 99%. Conclusions: Although a wide variety of chemical structures are associated with occupational asthma, bifunctional reactivity is strongly associated with occupational asthma hazard across a range of chemical substructures. This suggests that chemical cross-linking is an important molecular mechanism leading to the development of occupational asthma. 
The logistic regression model is freely available on the internet and may offer a useful but inexpensive adjunct to the prediction of occupational asthma hazard. PMID:15778257
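The abstract does not reproduce the published model's coefficients, but the approach of scoring a molecule from binary substructure indicators can be sketched as follows; all fragment names, weights, and the intercept below are hypothetical placeholders, not the fitted values:

```python
import math

# Hypothetical weights for illustration only; the published model's
# coefficients are not given in the abstract.
FRAGMENT_WEIGHTS = {
    "isocyanate": 2.0,
    "amine": 1.2,
    "acid_anhydride": 1.8,
    "carbonyl": 0.9,
    "bifunctional": 1.5,  # functional group present twice or more in the molecule
}
INTERCEPT = -3.0  # hypothetical

def asthma_hazard_probability(fragments):
    """Logistic score from binary substructure-fragment indicators."""
    z = INTERCEPT + sum(FRAGMENT_WEIGHTS.get(f, 0.0) for f in set(fragments))
    return 1.0 / (1.0 + math.exp(-z))

print(asthma_hazard_probability({"isocyanate", "bifunctional"}))
```

The bifunctionality term mirrors the paper's key finding: the same fragment present twice or more raises the predicted hazard beyond the single-occurrence contribution.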
Clemens, Michael S; Stewart, Ian J; Sosnov, Jonathan A; Howard, Jeffrey T; Belenkiy, Slava M; Sine, Christy R; Henderson, Jonathan L; Buel, Allison R; Batchinsky, Andriy I; Cancio, Leopoldo C; Chung, Kevin K
2016-10-01
To evaluate the association between acute respiratory distress syndrome and acute kidney injury with respect to their contributions to mortality in critically ill patients. Retrospective analysis of consecutive adult burn patients requiring mechanical ventilation. A 16-bed burn ICU at a tertiary military teaching hospital. Adult patients more than 18 years old requiring mechanical ventilation during their initial admission to our burn ICU from January 1, 2003, to December 31, 2011. None. A total of 830 patients were included, of whom 48.2% had acute kidney injury (n = 400). These patients had a 73% increased risk of developing acute respiratory distress syndrome after controlling for age, gender, total body surface area burned, and inhalation injury (hazard ratio, 1.73; 95% CI, 1.18-2.54; p = 0.005). In a reciprocal multivariate analysis, acute respiratory distress syndrome (n = 299; 36%) demonstrated a strong trend toward developing acute kidney injury (hazard ratio, 1.39; 95% CI, 0.99-1.95; p = 0.05). There was a 24% overall in-hospital mortality (n = 198). After adjusting for the aforementioned confounders, both acute kidney injury (hazard ratio, 3.73; 95% CI, 2.39-5.82; p < 0.001) and acute respiratory distress syndrome (hazard ratio, 2.16; 95% CI, 1.58-2.94; p < 0.001) significantly contributed to mortality. Age, total body surface area burned, and inhalation injury were also significantly associated with increased mortality. Acute kidney injury increases the risk of acute respiratory distress syndrome in mechanically ventilated burn patients, whereas acute respiratory distress syndrome similarly demonstrates a strong trend toward the development of acute kidney injury. Acute kidney injury and acute respiratory distress syndrome are both independent risks for subsequent death. Future research should look at this interplay for possible early interventions.
Mortality after a diagnosis of dementia in a population aged 75 and over in Spain.
Llinàs-Regla, Jordi; López-Pousa, Secundino; Vilalta-Franch, Joan; Garre-Olmo, Josep; Román, Gustavo C
2008-01-01
To examine the impact of incident dementia on the risk of death, taking into account other chronic illnesses potentially related to death. Six-year, prospective, two-phase, observational cohort study. 8 municipalities from a rural area in Girona (Spain). A representative community-based cohort of 1,153 adults aged over 70 living at home at study enrolment. Surviving participants underwent detailed clinical evaluation and were assessed by means of the Cambridge Examination for Mental Disorders of the Elderly. Relatives of deceased participants were interviewed using the Retrospective Collateral Dementia Interview. Mortality rates and relative risk of death for subjects with a diagnosis of dementia were calculated. The Cox proportional hazards regression model was used to assess the relationship between mortality and the diagnosis of dementia. In this cohort, 40.0% (n = 49) of the subjects with a diagnosis of dementia died. The mortality rate specific to dementia was 1.0 per 100 person-years. Mortality risk ratios for dementia were 1.79 in men [95% confidence interval (CI) = 1.06-3.02], and 3.14 in women (95% CI = 2.04-4.85). The population death risk attributable to the diagnosis of dementia in our cohort was 11.8%. The most important mortality risks were severe dementia (hazard ratio = 5.7, 95% CI = 3.7-8.6), cancer (hazard ratio = 3.2, 95% CI = 2.2-4.5), heart disease, and an age over 85 (hazard ratio = 1.4, 95% CI = 1.1-1.9). Dementia is a major risk factor for death in advanced age, with the highest mortality rates in women. Moderate and severe dementia was associated with an increased mortality risk even after appropriate control of comorbid conditions. Copyright 2008 S. Karger AG, Basel.
Batty, G David; Deary, Ian J; Zaninotto, Paola
2016-02-01
We examined the little-tested associations between general cognitive function in middle and older age and later risk of death from chronic diseases. In the English Longitudinal Study of Ageing (2002-2012), 11,391 study participants who were 50-100 years of age at study induction underwent a battery of cognitive tests and provided a range of collateral data. In an analytical sample of 9,204 people (4,982 women), there were 1,488 deaths during follow-up (mean duration, 9.0 years). When we combined scores from 4 cognition tests that represented 3 acknowledged key domains of cognitive functioning (memory, executive function, and processing speed), cognition was inversely associated with deaths from cancer (per each 1-standard-deviation decrease in general cognitive function score, hazard ratio = 1.21, 95% CI: 1.10, 1.33), cardiovascular disease (hazard ratio = 1.71, 95% CI: 1.55, 1.89), other causes (hazard ratio = 2.07, 95% CI: 1.79, 2.40), and respiratory illness (hazard ratio = 2.48, 95% CI: 2.12, 2.90). Controlling for a range of covariates, such as health behaviors and socioeconomic status, and left-censoring to explore reverse causality had very little impact on the strength of these relationships. These findings indicate that cognitive test scores can provide relatively simple indicators of the risk of death from an array of chronic diseases and that these associations appear to be independent of other commonly assessed risk factors. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Hsiao, Yi-Han; Chen, Yung-Tai; Tseng, Ching-Ming; Wu, Li-An; Perng, Diahn-Warng; Chen, Yuh-Min; Chen, Tzeng-Ji; Chang, Shi-Chuan; Chou, Kun-Ta
2017-10-01
Sleep disorders are common non-motor symptoms in patients with Parkinson's disease. Our study aims to explore the relationship between non-apnea sleep disorders and future Parkinson's disease. This is a cohort study using a nationwide database. The participants were recruited from the Taiwan National Health Insurance Research Database between 2000 and 2003. A total of 91 273 adult patients who had non-apnea sleep disorders without pre-existing Parkinson's disease were enrolled. An age-, gender-, income-, urbanization- and Charlson comorbidity index score-matched control cohort consisting of 91 273 participants was selected for comparison. The two cohorts were followed for the occurrence of Parkinson's disease, death or until the end of 2010, whichever came first. The Kaplan-Meier analyses revealed patients with non-apnea sleep disorders tended to develop Parkinson's disease (log-rank test, P < 0.001). After a multivariate adjustment in a Cox regression model, non-apnea sleep disorders was an independent risk factor for the development of Parkinson's disease [crude hazard ratio: 1.63, 95% confidence interval (CI): 1.54-1.73, P < 0.001; adjusted hazard ratio: 1.18, 95% CI: 1.11-1.26, P < 0.001]. In the subgroup analysis, patients with chronic insomnia (lasting more than 3 months) had the greatest risk (crude hazard ratio: 2.91, 95% CI: 2.59-3.26, P < 0.001; adjusted hazard ratio: 1.37, 95% CI: 1.21-1.55, P < 0.001). In conclusion, this study revealed that non-apnea sleep disorders, especially chronic insomnia, are associated with a higher risk for future Parkinson's disease. © 2017 European Sleep Research Society.
Diagnosis and mortality in 47,XYY persons: a registry study
2010-01-01
Background Sex chromosomal abnormalities are relatively common, yet many aspects of these syndromes remain unexplored. For instance epidemiological data in 47,XYY persons are still limited. Methods Using a national Danish registry, we identified 208 persons with 47,XYY or a compatible karyotype, whereof 36 were deceased; all were diagnosed from 1968 to 2008. For further analyses, we identified age-matched controls from the male background population (n = 20,078) in Statistics Denmark. We report nationwide prevalence data, data regarding age at diagnosis, as well as total and cause specific mortality data in these persons. Results The average prevalence was 14.2 47,XYY persons per 100,000, which is reduced compared to the expected 98 per 100,000. Their median age at diagnosis was 17.1 years. We found a significantly decreased lifespan from 77.9 years (controls) to 67.5 years (47,XYY persons). Total mortality was significantly increased compared to controls, with a hazard ratio of 3.6 (2.6-5.1). Dividing the causes of deaths according to the International Classification of Diseases, we identified an increased hazard ratio in all informative chapters, with a significantly increased ratio in cancer, pulmonary, neurological and unspecified diseases, and trauma. Conclusion We here present national epidemiological data regarding 47,XYY syndrome, including prevalence and mortality data, showing a significant delay in diagnosis, reduced life expectancy and an increased total and cause specific mortality. PMID:20509956
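The gap between the expected and observed prevalence implies that only a small fraction of 47,XYY males are ever diagnosed. A quick check using the figures reported above:

```python
# Figures taken from the abstract above.
expected_per_100k = 98.0   # expected prevalence of 47,XYY among males
observed_per_100k = 14.2   # observed prevalence in the Danish registry

diagnosed_fraction = observed_per_100k / expected_per_100k
print(f"{diagnosed_fraction:.1%} of expected cases ever diagnosed")  # ~14.5%
```

In other words, roughly six out of seven expected cases never appear in the registry, which is also why the mortality estimates apply to the diagnosed subgroup rather than to all 47,XYY males.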
Post-Inhaled Corticosteroid Pulmonary Tuberculosis Increases Lung Cancer in Patients with Asthma.
Jian, Zhi-Hong; Huang, Jing-Yang; Lin, Frank Cheau-Feng; Nfor, Oswald Ndi; Jhang, Kai-Ming; Ku, Wen-Yuan; Ho, Chien-Chang; Lung, Chia-Chi; Pan, Hui-Hsien; Wu, Min-Chen; Wu, Ming-Fang; Liaw, Yung-Po
2016-01-01
To evaluate the association between post-inhaled corticosteroid (ICS) pulmonary tuberculosis (TB), pneumonia and lung cancer in patients with asthma. The study samples were collected from the National Health Insurance Database. Asthmatic patients who were first-time users of ICS between 2003 and 2005 were identified as cases. For each case, 4 control individuals were randomly matched for sex, age and date of ICS use. Cases and matched controls were followed up until the end of 2010. Cox proportional hazard regression was used to determine the hazard ratio for pulmonary infections and lung cancer risk in the ICS users and non-users. A total of 10,904 first-time users of ICS were matched with 43,616 controls. The hazard ratios for lung cancer were: 2.52 (95% confidence interval [CI], 1.22-5.22; p = 0.012) for individuals with post-ICS TB, 1.28 (95% CI, 0.73-2.26; p = 0.389) for post-ICS pneumonia, 2.31 (95% CI, 0.84-6.38; p = 0.105) for post-ICS pneumonia+TB, 1.08 (95% CI, 0.57-2.03; p = 0.815) for TB, 0.99 (95% CI, 0.63-1.55; p = 0.970) for pneumonia, and 0.32 (95% CI, 0.05-2.32; p = 0.261) for pneumonia+TB, respectively. Post-ICS TB increased lung cancer risk in patients with asthma. Because of the high mortality associated with lung cancer, screening tests are recommended for patients with post-ICS TB.
A prospective cohort study of soy product intake and stomach cancer death
Nagata, C; Takatsuka, N; Kawakami, N; Shimizu, H
2002-01-01
The relationship between intake of soy products and death from stomach cancer was examined in a community-based prospective study of Japanese men and women in Takayama, Japan. Over 7 years of follow-up, 121 deaths from stomach cancer (81 men and 40 women) occurred among 30 304 (13 880 men and 16 424 women) participants who were at least 35 years of age. Diet including the intake of soy products and isoflavones was assessed by a validated semiquantitative food-frequency questionnaire at the beginning of the study. In men, the highest compared to the lowest tertile of total soy product intake was significantly inversely associated with death from stomach cancer after controlling for covariates (hazard ratio=0.50; 95% confidence interval (CI) 0.26-0.93, P for trend=0.03). A decreased hazard ratio for the highest compared to the lowest tertile of total soy product intake (hazard ratio=0.49; 95% CI 0.22-1.13) was observed in women, although this association was of marginal significance. These data suggest that soy intake may reduce the risk of death from stomach cancer. British Journal of Cancer (2002) 87, 31-36. doi:10.1038/sj.bjc.6600349 www.bjcancer.com © 2002 Cancer Research UK PMID:12085252
Lee, Sang-Uk; Oh, In-Hwan; Jeon, Hong Jin; Roh, Sungwon
2017-06-01
The relation of income and socioeconomic status with suicide rates remains unclear. Most previous studies have focused on the relationship between suicide rates and macroeconomic factors (e.g., economic growth rate). Therefore, we aimed to identify the relationship between individuals' socioeconomic position and suicide risk. We analyzed suicide mortality rates across socioeconomic positions to identify potential trends using observational data on suicide mortality collected between January 2003 and December 2013 from 1,025,340 national health insurance enrollees. We followed the subjects for 123.5 months on average. Socioeconomic position was estimated using insurance premium levels. To examine the hazard ratios of suicide mortality in various socioeconomic positions, we used Cox proportional hazard models. We found that the hazard ratios of suicide showed an increasing trend as socioeconomic position decreased. After adjusting for gender, age, geographic location, and disability level, Medicaid recipients had the highest suicide hazard ratio (2.28; 95% CI, 1.87-2.77). Among the Medicaid recipients, men had higher hazard ratios than women (2.79; 95% CI, 2.17-3.59 vs. 1.71; 95% CI, 1.25-2.34). Hazard ratios also varied across age groups. The highest hazard ratio was found in the 40-59-year-old group (3.19; 95% CI, 2.31-4.43), whereas the lowest ratio was found in those 60 years and older (1.44; 95% CI, 1.09-1.87). Our results illuminate the relationship between socioeconomic position and suicide rates and can be used to design and implement future policies on suicide prevention. Copyright © 2017 The Authors. Production and hosting by Elsevier B.V. All rights reserved.
Flegal, Katherine M; Kit, Brian K; Graubard, Barry I
2018-01-01
Misclassification of body mass index (BMI) categories arising from self-reported weight and height can bias hazard ratios in studies of BMI and mortality. We examined the effects on hazard ratios of such misclassification using national US survey data for 1976 through 2010 that had both measured and self-reported weight and height along with mortality follow-up for 48,763 adults and a subset of 17,405 healthy never-smokers. BMI was categorized as <22.5 (low), 22.5-24.9 (referent), 25.0-29.9 (overweight), 30.0-34.9 (class I obesity), and ≥35.0 (class II-III obesity). Misreporting at higher BMI categories tended to bias hazard ratios upwards for those categories, but that effect was augmented, counterbalanced, or even reversed by misreporting in other BMI categories, in particular those that affected the reference category. For example, among healthy male never-smokers, misclassifications affecting the overweight and the reference categories changed the hazard ratio for overweight from 0.85 with measured data to 1.24 with self-reported data. Both the magnitude and direction of bias varied according to the underlying hazard ratios in measured data, showing that findings on bias from one study should not be extrapolated to a study with different underlying hazard ratios. Because of misclassification effects, self-reported weight and height cannot reliably indicate the lowest-risk BMI category. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health 2017. This work is written by (a) US Government employee(s) and is in the public domain in the US.
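The reference-category mechanism described above can be illustrated with a toy calculation (all counts hypothetical, not taken from the study), using a crude rate ratio as a simple stand-in for the hazard ratio:

```python
def rate(events, person_years):
    """Crude event rate per person-year."""
    return events / person_years

# Measured-BMI classification (hypothetical counts)
ref_deaths, ref_py = 100, 10_000  # reference category (BMI 22.5-24.9)
ow_deaths, ow_py = 90, 10_000     # overweight (BMI 25.0-29.9)
hr_measured = rate(ow_deaths, ow_py) / rate(ref_deaths, ref_py)  # below 1

# Self-reported classification: suppose some lighter, lower-risk people in the
# overweight group under-report their weight and land in the reference category.
moved_deaths, moved_py = 10, 2_000
hr_self = (rate(ow_deaths - moved_deaths, ow_py - moved_py)
           / rate(ref_deaths + moved_deaths, ref_py + moved_py))

print(round(hr_measured, 2), round(hr_self, 2))
```

Moving low-risk individuals into the reference category lowers the reference rate and raises the overweight rate simultaneously, so the ratio can cross 1, the same direction of reversal as the 0.85-to-1.24 shift reported for healthy male never-smokers.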
Brunstein, Claudio; Zhang, Mei-Jie; Barker, Juliet; St. Martin, Andrew; Bashey, Asad; de Lima, Marcos; Dehn, Jason; Hematti, Peiman; Perales, Miguel-Angel; Rocha, Vanderson; Territo, Mary; Weisdorf, Daniel; Eapen, Mary
2017-01-01
The effects of inter-unit HLA-match on early outcomes with regards to double cord blood transplantation have not been established. Therefore, we studied the effect of inter-unit HLA-mismatching on the outcomes of 449 patients with acute leukemia after double cord blood transplantation. Patients were divided into two groups: one group that included transplantations with inter-unit mismatch at 2 or less HLA-loci (n=381) and the other group with inter-unit mismatch at 3 or 4 HLA-loci (n=68). HLA-matching considered low-resolution matching at HLA-A and -B loci and allele-level matching at HLA-DRB1, the accepted standard for selecting units for double cord blood transplants. Patient, disease, and transplant characteristics were similar in the two groups. We observed no effect of the degree of inter-unit HLA-mismatch on neutrophil (Hazard Ratio 1.27, P=0.11) or platelet (Hazard Ratio 1.13, P=0.42) recovery, acute graft-versus-host disease (Hazard Ratio 1.17, P=0.36), treatment-related mortality (Hazard Ratio 0.92, P=0.75), relapse (Hazard Ratio 1.18, P=0.49), treatment failure (Hazard Ratio 0.99, P=0.98), or overall survival (Hazard Ratio 0.98, P=0.91). There were no differences in the proportion of transplants with engraftment of both units by three months (5% after transplantation of units with inter-unit mismatch at ≤2 HLA-loci and 4% after transplantation of units with inter-unit mismatch at 3 or 4 HLA-loci). Our observations support the elimination of the inter-unit HLA-mismatch criterion when selecting cord blood units in favor of optimizing selection based on individual unit characteristics. PMID:28126967
Preston, Ioana R.; Roberts, Kari E.; Miller, Dave P.; Sen, Ginny P.; Selej, Mona; Benton, Wade W.; Hill, Nicholas S.
2015-01-01
Background— Long-term anticoagulation is recommended in idiopathic pulmonary arterial hypertension (IPAH). In contrast, limited data support anticoagulation in pulmonary arterial hypertension (PAH) associated with systemic sclerosis (SSc-PAH). We assessed the effect of warfarin anticoagulation on survival in IPAH and SSc-PAH patients enrolled in Registry to Evaluate Early and Long-term PAH Disease Management (REVEAL), a longitudinal registry of group I PAH. Methods and Results— Patients who initiated warfarin on study (n=187) were matched 1:1 with patients never on warfarin, by enrollment site, etiology, and diagnosis status. Descriptive analyses were conducted to compare warfarin users and nonusers by etiology. Survival analyses with and without risk adjustment were performed from the time of warfarin initiation or a corresponding quarterly update in matched pairs to avoid immortal time bias. Time-varying covariate models were used as sensitivity analyses. Mean warfarin treatment was 1 year; mean international normalized ratios were 1.9 (IPAH) and 2.0 (SSc-PAH). Two-thirds of patients initiating warfarin discontinued treatment before the last study assessment. There was no survival difference with warfarin in IPAH patients (adjusted hazard ratio, 1.37; P=0.21) or in SSc-PAH patients (adjusted hazard ratio, 1.60; P=0.15) in comparison with matched controls. However, SSc-PAH patients receiving warfarin within the previous year (hazard ratio, 1.57; P=0.031) or any time postbaseline (hazard ratio, 1.49; P=0.046) had increased mortality in comparison with warfarin-naïve patients. Conclusions— No significant survival advantage was observed in IPAH patients who started warfarin. In SSc-PAH patients, long-term warfarin was associated with poorer survival than in patients not receiving warfarin, even after adjusting for confounders. Clinical Trial Registration— URL: http://www.clinicaltrials.gov. Unique identifier: NCT00370214. PMID:26510696
Pfleger, C C H; Flachs, E M; Koch-Henriksen, Nils
2010-07-01
There is a need for follow-up studies of the familial situation of multiple sclerosis (MS) patients. We evaluated the probability that MS patients remain in a marriage or relationship with the same partner after onset of MS, in comparison with the general population. All 2538 Danes with onset of MS in 1980-1989, retrieved from the Danish MS-Registry, and 50,760 matched, randomly drawn control persons were included. Information on family status was retrieved from Statistics Denmark. Cox analyses were used with onset as the starting point. Five years after onset, the cumulative probability of remaining in the same relationship was 86% in patients vs. 89% in controls. The probabilities continued to diverge, and at 24 years the probability was 33% in patients vs. 53% in controls (p < 0.001). Among patients with young onset (< 36 years of age), those with no children had a higher risk of divorce than those with children younger than 7 years (hazard ratio 1.51; p < 0.0001), and men had a higher risk of divorce than women (hazard ratio 1.33; p < 0.01). MS significantly reduces the probability of remaining in the same relationship compared with the background population.
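The cumulative probabilities reported above come from standard survival-analysis machinery; a minimal sketch of the Kaplan-Meier estimator on fabricated toy data (not the registry data) illustrates how such curves are built:

```python
# Illustrative Kaplan-Meier sketch on toy data: the survival probability
# (here, probability of remaining in the same relationship) drops only at
# event (break-up) times; censored couples just leave the risk set.

def kaplan_meier(times, events):
    """times: follow-up in years; events: 1 = break-up, 0 = censored.
    Returns [(time, survival probability)] at each event time."""
    at_risk = len(times)
    surv = 1.0
    curve = []
    for t, e in sorted(zip(times, events)):
        if e == 1:
            surv *= (at_risk - 1) / at_risk
            curve.append((t, round(surv, 3)))
        at_risk -= 1
    return curve

# 5 couples: break-ups at years 2 and 4; censoring at years 3, 5, and 6.
print(kaplan_meier([2, 3, 4, 5, 6], [1, 0, 1, 0, 0]))  # [(2, 0.8), (4, 0.533)]
```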
On the Interpretation of the Hazard Ratio and Communication of Survival Benefit.
Sashegyi, Andreas; Ferry, David
2017-04-01
This brief communication will clarify the difference between a relative hazard and a relative risk. We highlight the importance of this difference, and demonstrate in practical terms that 1 minus the hazard ratio should not be interpreted as a risk reduction in the commonly understood sense of the term. This article aims to provide a better understanding of the type of risk reduction that a hazard ratio implies, thereby clarifying the intent in the communication among practitioners and researchers and establishing an accurate and realistic foundation for communicating with patients. The Oncologist 2017;22:484-486. © AlphaMed Press 2017.
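The distinction the authors draw can be illustrated numerically. Under proportional hazards, S_treated(t) = S_control(t)^HR, so a hazard ratio of 0.5 does not halve absolute risk (the numbers below are hypothetical):

```python
# Numerical sketch of the point above: 1 - HR is a reduction in the
# *hazard*, not in absolute or relative risk. Hypothetical HR of 0.5.

hr = 0.5
s_control = 0.60                      # 60% of controls event-free at time t
s_treated = s_control ** hr           # proportional hazards: S1 = S0 ** HR

risk_control = 1 - s_control          # 40.0% absolute risk in controls
risk_treated = 1 - s_treated          # ~22.5% absolute risk under treatment
rel_risk_reduction = 1 - risk_treated / risk_control

print(round(risk_treated, 3))         # 0.225
print(round(rel_risk_reduction, 3))   # 0.436 -- not the 0.5 that 1 - HR suggests
```

The relative risk reduction also changes with the time point chosen, whereas the hazard ratio is a single constant under the proportional-hazards assumption.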
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fortin, Andre; Wang, Changshu; Vigneault, Eric
2008-09-01
Purpose: To investigate the effect of anemia on outcome of treatment with radiochemotherapy in patients with head-and-neck cancer. Methods and Materials: The data of 196 patients with Stage II-IV head-and-neck cancer treated with concomitant cisplatin-based radiochemotherapy were retrospectively reviewed. Anemia was defined according to World Health Organization criteria as hemoglobin <130 g/L in men and <120 g/L in women. Results: Fifty-three patients were classified as anemic, 143 as nonanemic. The 3-year local control rate of anemic and nonanemic patients was 72% and 85%, respectively (p = 0.01). The 3-year overall survival rate of anemic and nonanemic patients was 52% and 77%, respectively (p = 0.004). In multivariate analysis, anemia was the most significant predictor of local control (hazard ratio, 0.37; p = 0.009) and survival (hazard ratio, 0.47; p = 0.007). A dose-effect relationship was also found for local control (p = 0.04) and survival (p = 0.04) when grouping by hemoglobin concentration: <120, 120-140, and >140 g/L. Conclusions: Anemia was strongly associated with local control and survival in this cohort of patients with head-and-neck cancer receiving radiochemotherapy.
Edoxaban versus warfarin in patients with atrial fibrillation.
Giugliano, Robert P; Ruff, Christian T; Braunwald, Eugene; Murphy, Sabina A; Wiviott, Stephen D; Halperin, Jonathan L; Waldo, Albert L; Ezekowitz, Michael D; Weitz, Jeffrey I; Špinar, Jindřich; Ruzyllo, Witold; Ruda, Mikhail; Koretsune, Yukihiro; Betcher, Joshua; Shi, Minggao; Grip, Laura T; Patel, Shirali P; Patel, Indravadan; Hanyok, James J; Mercuri, Michele; Antman, Elliott M
2013-11-28
Edoxaban is a direct oral factor Xa inhibitor with proven antithrombotic effects. The long-term efficacy and safety of edoxaban as compared with warfarin in patients with atrial fibrillation is not known. We conducted a randomized, double-blind, double-dummy trial comparing two once-daily regimens of edoxaban with warfarin in 21,105 patients with moderate-to-high-risk atrial fibrillation (median follow-up, 2.8 years). The primary efficacy end point was stroke or systemic embolism. Each edoxaban regimen was tested for noninferiority to warfarin during the treatment period. The principal safety end point was major bleeding. The annualized rate of the primary end point during treatment was 1.50% with warfarin (median time in the therapeutic range, 68.4%), as compared with 1.18% with high-dose edoxaban (hazard ratio, 0.79; 97.5% confidence interval [CI], 0.63 to 0.99; P<0.001 for noninferiority) and 1.61% with low-dose edoxaban (hazard ratio, 1.07; 97.5% CI, 0.87 to 1.31; P=0.005 for noninferiority). In the intention-to-treat analysis, there was a trend favoring high-dose edoxaban versus warfarin (hazard ratio, 0.87; 97.5% CI, 0.73 to 1.04; P=0.08) and an unfavorable trend with low-dose edoxaban versus warfarin (hazard ratio, 1.13; 97.5% CI, 0.96 to 1.34; P=0.10). The annualized rate of major bleeding was 3.43% with warfarin versus 2.75% with high-dose edoxaban (hazard ratio, 0.80; 95% CI, 0.71 to 0.91; P<0.001) and 1.61% with low-dose edoxaban (hazard ratio, 0.47; 95% CI, 0.41 to 0.55; P<0.001). 
The corresponding annualized rates of death from cardiovascular causes were 3.17% with warfarin versus 2.74% with high-dose edoxaban (hazard ratio, 0.86; 95% CI, 0.77 to 0.97; P=0.01) and 2.71% with low-dose edoxaban (hazard ratio, 0.85; 95% CI, 0.76 to 0.96; P=0.008), and the corresponding rates of the key secondary end point (a composite of stroke, systemic embolism, or death from cardiovascular causes) were 4.43% versus 3.85% (hazard ratio, 0.87; 95% CI, 0.78 to 0.96; P=0.005) and 4.23% (hazard ratio, 0.95; 95% CI, 0.86 to 1.05; P=0.32). Both once-daily regimens of edoxaban were noninferior to warfarin with respect to the prevention of stroke or systemic embolism and were associated with significantly lower rates of bleeding and death from cardiovascular causes. (Funded by Daiichi Sankyo Pharma Development; ENGAGE AF-TIMI 48 ClinicalTrials.gov number, NCT00781391.)
2013-06-01
Rising IL-6 levels portended worse overall survival (hazard ratio = 1.525, P = 0.02). The report provides a synopsis of year-2 work and lists this significant association between IL-6 levels and patient outcome among its key research accomplishments.
Semi-active control of monopile offshore wind turbines under multi-hazards
NASA Astrophysics Data System (ADS)
Sun, C.
2018-01-01
The present paper studies the control of monopile offshore wind turbines subjected to multi-hazards consisting of wind, wave and earthquake loading. A semi-active tuned mass damper (STMD) with tunable natural frequency and damping ratio is introduced to control the dynamic response. A new fully coupled analytical model of the monopile offshore wind turbine with an STMD is established, and the aerodynamic, hydrodynamic and seismic loading models are derived. Soil effects and damage are considered. The National Renewable Energy Lab monopile 5 MW baseline wind turbine model is employed to examine the performance of the STMD, with a passive tuned mass damper (TMD) used for comparison. Through numerical simulation, it is found that before damage occurs, the wind- and wave-induced response dominates the earthquake-induced response. With damage present in the tower and the foundation, the nacelle and tower response increases dramatically and the natural frequency decreases considerably. As a result, the passive TMD with fixed parameters becomes off-tuned and loses its effectiveness. In comparison, the STMD, retuned in real time, remains consistently effective in controlling the dynamic response of monopile offshore wind turbines under multi-hazards and damage, with a smaller stroke.
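For context, a passive TMD such as the comparison device is typically tuned once using the classical Den Hartog formulas; the sketch below (illustrative parameter values, not those of the NREL 5 MW model) shows why a fixed tuning goes stale when damage lowers the structure's natural frequency:

```python
import math

# Classical Den Hartog tuning of a passive TMD; mu is the damper-to-structure
# mass ratio. Values below are illustrative only.

def den_hartog(f_structure_hz, mu):
    f_opt = f_structure_hz / (1 + mu)                   # optimal TMD frequency
    zeta_opt = math.sqrt(3 * mu / (8 * (1 + mu) ** 3))  # optimal damping ratio
    return f_opt, zeta_opt

# Assume a ~0.32 Hz first tower fore-aft mode and a 2% mass ratio.
f_opt, zeta_opt = den_hartog(0.32, 0.02)
print(round(f_opt, 4), round(zeta_opt, 4))  # 0.3137 0.0841

# If damage drops the structural frequency to, say, 0.28 Hz, the damper tuned
# above is now off-tuned; an STMD would recompute f_opt in real time.
f_opt_damaged, _ = den_hartog(0.28, 0.02)
print(round(f_opt_damaged, 4))
```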
Risk of Suicide Attempt among Adolescents with Conduct Disorder: A Longitudinal Follow-up Study.
Wei, Han-Ting; Lan, Wen-Hsuan; Hsu, Ju-Wei; Bai, Ya-Mei; Huang, Kai-Lin; Su, Tung-Ping; Li, Cheng-Ta; Lin, Wei-Chen; Chen, Tzeng-Ji; Chen, Mu-Hong
2016-10-01
To assess the independent or comorbid effect of conduct and mood disorders on the risk of suicide. The Taiwan National Health Insurance Research Database was used to derive data for 3711 adolescents aged 12-17 years with conduct disorder and 14 844 age- and sex-matched controls between 2001 and 2009. The participants were followed up to the end of 2011, and those who attempted suicide during the follow-up period were identified. Adolescents with conduct disorder had a higher incidence of suicide attempts (0.9% vs 0.1%; P < .001) and attempted suicide at a younger age (17.38 ± 2.04 vs 20.52 ± 1.70 years of age) than did the controls. The Cox proportional hazards regression model, after adjustment for demographic data and psychiatric comorbidities, determined that conduct disorder was an independent risk factor for subsequent suicide attempts (hazard ratio, 5.17; 95% CI, 2.29-11.70). A sensitivity analysis excluding those with other psychiatric comorbidities revealed a consistent finding (hazard ratio, 10.32; 95% CI, 3.71-28.71). Adolescents with conduct disorder had an increased risk of suicide attempts over the next decade. Future studies are required to clarify the underlying pathophysiology and to elucidate whether prompt intervention for conduct disorder could reduce this risk. Copyright © 2016 Elsevier Inc. All rights reserved.
Kimura, Go; Ueda, Takeshi
2017-03-01
A post hoc analysis of interim results from PREVAIL, a Phase III, double-blind, placebo-controlled trial of men with metastatic castration-resistant prostate cancer, demonstrated that the treatment effects, safety and pharmacokinetics of enzalutamide in Japanese patients were generally consistent with those of the overall population. A recent longer term analysis of PREVAIL demonstrated continued benefit of enzalutamide treatment over placebo. Here, we report results from a post hoc analysis of Japanese patients enrolled in PREVAIL at the prespecified number of deaths for the final analysis. In Japanese patients, enzalutamide reduced the risk of death by 35% (hazard ratio, 0.65; 95% confidence interval, 0.28-1.51) and the risk of investigator-assessed radiographic progression or death by 60% (hazard ratio, 0.40; 95% confidence interval, 0.18-0.90). These results show that treatment effects and safety in Japanese patients in the final analysis of PREVAIL continued to be generally consistent with those of the overall population. © The Author 2016. Published by Oxford University Press.
Bonilla-Palomas, J L; Gámez-López, A L; Castillo-Domínguez, J C; Moreno-Conde, M; López-Ibáñez, M C; Anguita-Sánchez, M
2018-03-01
To assess the long-term effect of nutritional intervention on malnourished, hospitalised patients with heart failure (HF). A total of 120 malnourished patients hospitalized for HF were randomised to undergo (or not) an individual nutritional intervention for 6 months. The primary event was the combination of all-cause death and readmission for HF. We performed an intent-to-treat analysis and assessed the effect of the intervention at 24 months. The combined event occurred in 47.5% of the intervention group and in 73.8% of the control group (hazard ratio: 0.45; 95% confidence interval: 0.28-0.72; P=.001). Thirty-nine percent of the intervention group and 59% of the control group died (hazard ratio: 0.53; 95% confidence interval: 0.31-0.89; P=.017). A nutritional intervention for malnourished patients hospitalised for HF maintains its prognostic benefit in the long-term follow-up. Copyright © 2017 Elsevier España, S.L.U. and Sociedad Española de Medicina Interna (SEMI). All rights reserved.
Ko, Naomi Y; Snyder, Frederick R; Raich, Peter C; Paskett, Electra D; Dudley, Donald J; Lee, Ji-Hyun; Levine, Paul H; Freund, Karen M
2016-09-01
Patient navigation was developed to address barriers to timely care and reduce cancer disparities. The current study explored navigation and racial and ethnic differences in time to the diagnostic resolution of a cancer screening abnormality. The authors conducted an analysis of the multisite Patient Navigation Research Program. Participants with an abnormal cancer screening test were allocated to either navigation or control. The unadjusted median time to resolution was calculated for each racial and ethnic group by navigation and control. Multivariable Cox proportional hazards models were fit, adjusting for sex, age, cancer abnormality type, and health insurance and stratifying by center of care. Among a sample of 7514 participants, 29% were non-Hispanic white, 43% were Hispanic, and 28% were black. In the control group, black individuals were found to have a longer median time to diagnostic resolution (108 days) compared with non-Hispanic white individuals (65 days) or Hispanic individuals (68 days) (P<.0001). In the navigated groups, black individuals had a reduction in the median time to diagnostic resolution (97 days) (P<.0001). In the multivariable models, among controls, black race was found to be associated with an increased delay to diagnostic resolution (hazard ratio, 0.77; 95% confidence interval, 0.69-0.84) compared with non-Hispanic white individuals, which was reduced in the navigated arm (hazard ratio, 0.85; 95% confidence interval, 0.77-0.94). Patient navigation appears to have the greatest impact among black patients, who had the greatest delays in care. Cancer 2016;122:2715-2722. © 2016 American Cancer Society.
Migraine and risk of narcolepsy in children: A nationwide longitudinal study.
Yang, Chun-Pai; Hsieh, Meng-Lun; Chiang, Jen-Huai; Chang, Hsing-Yi; Hsieh, Vivian Chia-Rong
2017-01-01
The association between migraine and narcolepsy remains controversial. We aim to investigate whether migraine is associated with an increased risk of developing narcolepsy in children. In this longitudinal study, nationwide medical-claims data of pediatric patients (0-17y) with migraine are identified using the National Health Insurance Research Database (NHIRD) between 1997 and 2010 in Taiwan. Two cohorts are selected: migraine cases (n = 8,923) and propensity score-matched non-migraine controls (n = 35,692). Children with previous history of narcolepsy or headache before the index date are excluded. Cohorts are followed until the end of 2012, their withdrawal from the NHI program, or incidence of narcolepsy (ICD-9-CM: 347). Cox proportional hazards regression models are used to estimate hazard ratios (HRs) and 95% confidence intervals of developing narcolepsy in children with migraine compared to their non-migraine controls. A total of 13 incident cases with narcolepsy are observed during follow-up, with incidence rates of 0.1915 and 0.0278 per 1,000 person-years in migraine and non-migraine children, respectively. After a mean follow-up period of 4.68 and 5.04 years in the case and control cohort, respectively, the former exhibited a greater risk of developing narcolepsy compared to the latter (adjusted hazard ratio (aHR) = 5.30, 95% confidence interval (CI): 1.61, 17.4; p = 0.006). This finding persisted after controlling for potential confounders like baseline comorbidities and concurrent medication uptake, and in our analyses with migraine subtypes. Migraine is an independent risk factor for narcolepsy development in children. Further studies are needed to validate our findings and to explore the exact pathophysiological mechanisms linking migraine and narcolepsy.
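The reported incidence rates can be roughly cross-checked from the cohort sizes and mean follow-up times (an approximation that assumes the mean follow-up applies uniformly to each cohort):

```python
# Back-of-envelope check of the narcolepsy incidence rates per 1,000
# person-years quoted in the abstract, assuming person-time is approximately
# cohort size times mean follow-up.

def rate_per_1000_py(cases, n, mean_follow_up_years):
    return 1000 * cases / (n * mean_follow_up_years)

# Migraine cohort: n = 8,923, mean follow-up 4.68 years; a rate of 0.1915
# per 1,000 person-years implies roughly 8 of the 13 incident cases.
print(round(rate_per_1000_py(8, 8923, 4.68), 4))   # 0.1916

# Control cohort: n = 35,692, mean follow-up 5.04 years, ~5 cases.
print(round(rate_per_1000_py(5, 35692, 5.04), 4))  # 0.0278
```

Both values land on the rates quoted in the abstract, which is consistent with a 13-case split of roughly 8 migraine vs. 5 control cases.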
Effect of Donor and Recipient Factors on Corneal Graft Rejection
Stulting, R. Doyle; Sugar, Alan; Beck, Roy; Belin, Michael; Dontchev, Mariya; Feder, Robert S.; Gal, Robin L.; Holland, Edward J.; Kollman, Craig; Mannis, Mark J.; Price, Francis; Stark, Walter; Verdier, David D.
2014-01-01
Purpose To assess the relationship between donor and recipient factors and corneal allograft rejection in eyes that underwent penetrating keratoplasty (PK) in the Cornea Donor Study. Methods 1090 subjects undergoing corneal transplantation for a moderate risk condition (principally Fuchs’ dystrophy or pseudophakic corneal edema) were followed for up to 5 years. Associations of baseline recipient and donor factors with the occurrence of a probable or definite rejection event were assessed in univariate and multivariate proportional hazards models. Results Eyes with pseudophakic or aphakic corneal edema (N=369) were more likely to experience a rejection event than eyes with Fuchs’ dystrophy (N=676) (34% ± 6% versus 22% ± 4%; hazard ratio = 1.56; 95% confidence interval 1.21 to 2.03). Among eyes with Fuchs’ dystrophy, a higher probability of a rejection event was observed in phakic post-transplant eyes compared with eyes that underwent cataract extraction with or without intraocular lens implantation during PK (29% vs. 19%; hazard ratio = 0.54; 95% confidence interval 0.36 to 0.82). Female recipients had a higher probability of a rejection event than males (29% vs. 21%; hazard ratio = 1.42; 95% confidence interval 1.08 to 1.87), after controlling for the effect of preoperative diagnosis and lens status. Donor age and donor-recipient ABO compatibility were not associated with rejection. Conclusions There was a substantially higher graft rejection rate in eyes with pseudophakic or aphakic corneal edema compared with eyes with Fuchs’ dystrophy. Female recipients were more likely to have a rejection event than males. Graft rejection was not associated with donor age. PMID:22488114
Troy, Jesse D.; Hartge, Patricia; Weissfeld, Joel L.; Oken, Martin M.; Colditz, Graham A.; Mechanic, Leah E.; Morton, Lindsay M.
2010-01-01
Prospective studies of lifestyle and non-Hodgkin lymphoma (NHL) are conflicting, and some are inconsistent with case-control studies. The Prostate, Lung, Colorectal, and Ovarian (PLCO) Cancer Screening Trial was used to evaluate risk of NHL and its subtypes in association with anthropometric factors, smoking, and alcohol consumption in a prospective cohort study. Lifestyle was assessed via questionnaire among 142,982 male and female participants aged 55–74 years enrolled in the PLCO Trial during 1993–2001. Hazard ratios and 95% confidence intervals were calculated using Cox proportional hazards regression. During 1,201,074 person-years of follow-up through 2006, 1,264 histologically confirmed NHL cases were identified. Higher body mass index (BMI; weight (kg)/height (m)2) at ages 20 and 50 years and at baseline was associated with increased NHL risk (Ptrend < 0.01 for all; e.g., for baseline BMI ≥30 vs. 18.5–24.9, hazard ratio = 1.32, 95% confidence interval: 1.13, 1.54). Smoking was not associated with NHL overall but was inversely associated with follicular lymphoma (ever smoking vs. never: hazard ratio = 0.62, 95% confidence interval: 0.45, 0.85). Alcohol consumption was unrelated to NHL (drinks/week: Ptrend = 0.187). These data support previous studies suggesting that BMI is positively associated with NHL, show an inverse association between smoking and follicular lymphoma (perhaps due to residual confounding), and do not support a causal association between alcohol and NHL. PMID:20494998
Memory Binding Test Predicts Incident Amnestic Mild Cognitive Impairment.
Mowrey, Wenzhu B; Lipton, Richard B; Katz, Mindy J; Ramratan, Wendy S; Loewenstein, David A; Zimmerman, Molly E; Buschke, Herman
2016-07-14
The Memory Binding Test (MBT), previously known as Memory Capacity Test, has demonstrated discriminative validity for distinguishing persons with amnestic mild cognitive impairment (aMCI) and dementia from cognitively normal elderly. We aimed to assess the predictive validity of the MBT for incident aMCI. In a longitudinal, community-based study of adults aged 70+, we administered the MBT to 246 cognitively normal elderly adults at baseline and followed them annually. Based on previous work, a subtle reduction in memory binding at baseline was defined by a Total Items in the Paired (TIP) condition score of ≤22 on the MBT. Cox proportional hazards models were used to assess the predictive validity of the MBT for incident aMCI accounting for the effects of covariates. The hazard ratio of incident aMCI was also assessed for different prediction time windows ranging from 4 to 7 years of follow-up, separately. Among 246 controls who were cognitively normal at baseline, 48 developed incident aMCI during follow-up. A baseline MBT reduction was associated with an increased risk for developing incident aMCI (hazard ratio (HR) = 2.44, 95% confidence interval: 1.30-4.56, p = 0.005). When varying the prediction window from 4-7 years, the MBT reduction remained significant for predicting incident aMCI (HR range: 2.33-3.12, p: 0.0007-0.04). Persons with poor performance on the MBT are at significantly greater risk for developing incident aMCI. High hazard ratios up to seven years of follow-up suggest that the MBT is sensitive to early disease.
Yoon, Jin-Ha; Roh, Jaehoon; Kim, Chi-Nyon; Won, Jong-Uk
2016-01-01
The aim of this study was to examine the relationship between noise exposure and the risk of occupational injury. The Korean National Health and Nutrition Examination Survey was used for the current study. Self-report questionnaires were used to investigate occupational injury and exposure to noise, chemicals, and machines and equipment. In separate analyses by occupation and occupational hazard, the proportion of occupational injuries increased with the severity of noise exposure (all P < 0.05). Compared with the non-exposure group, the odds ratios (95% confidence intervals) for occupational injury were 1.39 (1.07-1.80) and 1.67 (1.13-2.46) in the mild and severe noise exposure groups, respectively, after controlling for age, gender, sleep hours, work schedule (shift work), and exposure to hazardous chemicals and hazardous machines and equipment. The current study highlights the association between noise exposure and the risk of occupational injury. Furthermore, the risk of occupational injury increased with the severity of noise exposure.
Arsenault, Benoit J; Boekholdt, S Matthijs; Dubé, Marie-Pierre; Rhéaume, Eric; Wareham, Nicholas J; Khaw, Kay-Tee; Sandhu, Manjinder S; Tardif, Jean-Claude
2014-06-01
Although a previous study has suggested that a genetic variant in the LPA region was associated with the presence of aortic valve stenosis (AVS), no prospective study has suggested a role for lipoprotein(a) levels in the pathophysiology of AVS. Our objective was to determine whether lipoprotein(a) levels and a common genetic variant that is strongly associated with lipoprotein(a) levels are associated with an increased risk of developing AVS. Serum lipoprotein(a) levels were measured in 17 553 participants of the European Prospective Investigation into Cancer (EPIC)-Norfolk study. Among these study participants, 118 developed AVS during a mean follow-up of 11.7 years. The rs10455872 genetic variant in LPA was genotyped in 14 735 study participants, who simultaneously had lipoprotein(a) level measurements, and in a replication study of 379 patients with echocardiography-confirmed AVS and 404 controls. In EPIC-Norfolk, compared with participants in the bottom lipoprotein(a) tertile, those in the top lipoprotein(a) tertile had a higher risk of AVS (hazard ratio, 1.57; 95% confidence interval, 1.02-2.42) after adjusting for age, sex, and smoking. Compared with rs10455872 AA homozygotes, carriers of 1 or 2 G alleles were at increased risk of AVS (hazard ratio, 1.78; 95% confidence interval, 1.11-2.87, versus hazard ratio, 4.83; 95% confidence interval, 1.77-13.20, respectively). In the replication study, the genetic variant rs10455872 also showed a positive association with AVS (odds ratio, 1.57; 95% confidence interval, 1.10-2.26). Patients with high lipoprotein(a) levels are at increased risk for AVS. The rs10455872 variant, which is associated with higher lipoprotein(a) levels, is also associated with increased risk of AVS, suggesting that this association may be causal. © 2014 American Heart Association, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, Terence, E-mail: trdtwk@nccs.com.sg; Lim, Wan-Teck; Fong, Kam-Weng
Purpose: To compare survival, tumor control, toxicities, and quality of life of patients with locally advanced nasopharyngeal carcinoma (NPC) treated with induction chemotherapy and concurrent chemo-radiation (CCRT), against CCRT alone. Patients and Methods: Patients were stratified by N stage and randomized to induction GCP (3 cycles of gemcitabine 1000 mg/m², carboplatin area under the concentration-time curve 2.5, and paclitaxel 70 mg/m² given days 1 and 8 every 21 days) followed by CCRT (radiation therapy 69.96 Gy with weekly cisplatin 40 mg/m²), or CCRT alone. An accrual of 172 patients was planned to detect a 15% difference in 5-year overall survival (OS) at a 5% significance level with 80% power. Results: Between September 2004 and August 2012, 180 patients were accrued, and 172 (GCP 86, control 86) were analyzed by intention to treat. There was no significant difference in OS (3-year OS 94.3% [GCP] vs 92.3% [control]; hazard ratio 1.05; 1-sided P=.494), disease-free survival (hazard ratio 0.77, 95% confidence interval 0.44-1.35, P=.362), or distant metastases-free survival (hazard ratio 0.80, 95% confidence interval 0.38-1.67, P=.547) between the 2 arms. Treatment compliance in the induction phase was good, but the relative dose intensity for concurrent cisplatin was significantly lower in the GCP arm. Overall, the GCP arm had higher rates of grades 3 and 4 leukopenia (52% vs 37%) and neutropenia (24% vs 12%), but grade 3 and 4 acute radiation toxicities were not statistically different between the 2 arms. The global quality of life scores were comparable in both arms. Conclusion: Induction chemotherapy with GCP before concurrent chemo-irradiation did not improve survival in locally advanced NPC.
Bajwa, Ednan K; Yu, Chu-Ling; Gong, Michelle N; Thompson, B Taylor; Christiani, David C
2007-05-01
Pre-B-cell colony-enhancing factor (PBEF) levels are elevated in bronchoalveolar lavage fluid and serum of patients with acute lung injury. There are several suspected functional polymorphisms of the corresponding PBEF gene. We hypothesized that variations in PBEF gene polymorphisms alter the risk of developing acute respiratory distress syndrome (ARDS). Nested case-control study. Tertiary academic medical center. We studied 375 patients with ARDS and 787 at-risk controls genotyped for the PBEF T-1001G and C-1543T polymorphisms. None. Patients with the -1001G (variant) allele had significantly greater odds of developing ARDS than wild-type homozygotes (odds ratio, 1.35; 95% confidence interval, 1.02-1.78). Patients with the -1543T (variant) allele did not have significantly different odds of developing ARDS than wild-type homozygotes (odds ratio, 0.86; 95% confidence interval, 0.65-1.13). When analysis was stratified by ARDS risk factor, -1543T was associated with decreased odds of developing ARDS in septic shock patients (odds ratio, 0.66; 95% confidence interval, 0.45-0.97). Also, -1001G was associated with increased hazard of intensive care unit mortality, whereas -1543T was associated with decreased hazard of 28-day and 60-day ARDS mortality, as well as shorter duration of mechanical ventilation. Similar results were found in analyses of the related GC (-1001G:-1543C) and TT (-1001T:-1543T) haplotypes. The PBEF T-1001G variant allele and related haplotype are associated with increased odds of developing ARDS and increased hazard of intensive care unit mortality among at-risk patients, whereas the C-1543T variant allele and related haplotype are associated with decreased odds of ARDS among patients with septic shock and better outcomes among patients with ARDS.
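The odds ratios quoted here come from standard 2x2-table calculations; a generic sketch with made-up counts (not the actual PBEF genotype data) shows the computation, including the Woolf log-scale confidence interval:

```python
import math

# Generic case-control odds ratio with a Woolf (log-scale) 95% CI from a
# 2x2 table. Counts below are invented for illustration.

def odds_ratio_ci(a, b, c, d):
    """a/b: exposed/unexposed cases; c/d: exposed/unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return round(or_, 2), round(lo, 2), round(hi, 2)

# Hypothetical counts: variant carriers vs. wild-type among cases/controls.
print(odds_ratio_ci(120, 255, 180, 607))  # (1.59, 1.21, 2.09)
```

A confidence interval that excludes 1.0, as in this made-up example, corresponds to a nominally significant association at the 5% level.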
Matching on the Disease Risk Score in Comparative Effectiveness Research of New Treatments
Wyss, Richard; Ellis, Alan R.; Brookhart, M. Alan; Funk, Michele Jonsson; Girman, Cynthia J.; Simpson, Ross J.; Stürmer, Til
2016-01-01
Purpose We use simulations and an empirical example to evaluate the performance of disease risk score (DRS) matching compared with propensity score (PS) matching when controlling large numbers of covariates in settings involving newly introduced treatments. Methods We simulated a dichotomous treatment, a dichotomous outcome, and 100 baseline covariates that included both continuous and dichotomous random variables. For the empirical example, we evaluated the comparative effectiveness of dabigatran versus warfarin in preventing combined ischemic stroke and all-cause mortality. We matched treatment groups on a historically estimated DRS and again on the PS. We controlled for a high-dimensional set of covariates using 20% and 1% samples of Medicare claims data from October 2010 through December 2012. Results In simulations, matching on the DRS versus the PS generally yielded matches for more treated individuals and improved precision of the effect estimate. For the empirical example, PS and DRS matching in the 20% sample resulted in similar hazard ratios (0.88 and 0.87) and standard errors (0.04 for both methods). In the 1% sample, PS matching resulted in matches for only 92.0% of the treated population and a hazard ratio and standard error of 0.89 and 0.19, respectively, while DRS matching resulted in matches for 98.5% and a hazard ratio and standard error of 0.85 and 0.16, respectively. Conclusions When PS distributions are separated, DRS matching can improve the precision of effect estimates and allow researchers to evaluate the treatment effect in a larger proportion of the treated population. However, accurately modeling the DRS can be challenging compared with the PS. PMID:26112690
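Both PS and DRS matching ultimately pair each treated patient with the control whose score is closest. A greedy 1:1 nearest-neighbor sketch on a scalar score; illustrative only, since the abstract does not specify the study's matching algorithm or caliper:

```python
def greedy_match(treated, control, caliper=0.05):
    """Greedy 1:1 nearest-neighbor matching on a scalar score (PS or DRS).

    treated, control: lists of scores. Returns (treated_idx, control_idx)
    pairs; treated units with no control within the caliper go unmatched.
    Real implementations often match on the logit of the score and use
    optimized algorithms, so this is a sketch of the idea only.
    """
    pairs = []
    available = dict(enumerate(control))  # unmatched controls
    for i, t in enumerate(treated):
        if not available:
            break
        j, c = min(available.items(), key=lambda kv: abs(kv[1] - t))
        if abs(c - t) <= caliper:
            pairs.append((i, j))
            del available[j]              # match without replacement
    return pairs
```

The paper's 92.0% vs 98.5% matched proportions correspond to how many treated units find a within-caliper partner: when the score distributions of the two groups are well separated, more treated units fall outside every control's caliper and drop out.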
Katz, Patricia P.; Yelin, Edward H.; Iribarren, Carlos; Knight, Sara J.; Blanc, Paul D.; Eisner, Mark D.
2010-01-01
Background: Psychologic factors affect how patients with COPD respond to attempts to improve their self-management skills. Learned helplessness may be one such factor, but there is no validated measure of helplessness in COPD. Methods: We administered a new COPD Helplessness Index (CHI) to 1,202 patients with COPD. Concurrent validity was assessed through association of the CHI with established psychosocial measures and COPD severity. The association of helplessness with incident COPD exacerbations was then examined by following subjects over a median 2.1 years, defining COPD exacerbations as COPD-related hospitalizations or ED visits. Results: The CHI demonstrated internal consistency (Cronbach α = 0.75); factor analysis was consistent with the CHI representing a single construct. Greater CHI-measured helplessness correlated with greater COPD severity assessed by the BODE (Body-mass, Obstruction, Dyspnea, Exercise) Index (r = 0.34; P < .001). Higher CHI scores were associated with worse generic (Short Form-12, Physical Component Summary Score) and respiratory-specific (Airways Questionnaire 20) health-related quality of life, greater depressive symptoms, and higher anxiety (all P < .001). Controlling for sociodemographics and smoking status, helplessness was prospectively associated with incident COPD exacerbations (hazard ratio = 1.31; P < .001). After also controlling for the BODE Index, helplessness remained predictive of COPD exacerbations among subjects with BODE Index ≤ median (hazard ratio = 1.35; P = .01), but not among subjects with higher BODE Index values (hazard ratio = 0.93; P = .34). Conclusions: The CHI is an internally consistent and valid measure, concurrently associated with health status and predictively associated with COPD exacerbations. The CHI may prove a useful tool in analyzing differential clinical responses mediated by patient-centered attributes. PMID:19837823
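Internal-consistency statistics such as the Cronbach α = 0.75 reported for the CHI combine the item variances with the variance of the total score: α = k/(k-1) · (1 - Σ var(item_i) / var(total)). A minimal sketch using population variances and illustrative data, not the study's items:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale.

    items: list of item-score columns, each a list of the same
    respondents' scores on one item (population variances used).
    """
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # total score per respondent across all k items
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))
```

Perfectly redundant items give α = 1; weakly related items drive α toward 0, which is why 0.75 is conventionally read as acceptable internal consistency for a short index.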
Anveden, Åsa; Taube, Magdalena; Peltonen, Markku; Jacobson, Peter; Andersson-Assarsson, Johanna C.; Sjöholm, Kajsa; Svensson, Per-Arne; Carlsson, Lena M.S.
2017-01-01
Objective To examine the long-term effects of bariatric surgery on female-specific cancer in women with obesity. Methods The prospective, matched Swedish Obese Subjects (SOS) study was designed to examine outcomes after bariatric surgery. This study includes 1420 women from the SOS cohort that underwent bariatric surgery and 1447 contemporaneously matched controls who received conventional obesity treatment. Age was 37–60 years and BMI was ≥38 kg/m². Information on cancer events was obtained from the Swedish National Cancer Registry. Median follow-up time was 18.1 years (interquartile range 14.8–20.9 years, maximum 26 years). This study is registered with ClinicalTrials.gov, NCT01479452. Results Bariatric surgery was associated with reduced risk of overall cancer (hazard ratio=0.71; 95% CI 0.59–0.85; p<0.001). About half of the observed cancers were female-specific, and their incidence was lower in the surgery group than in the control group (hazard ratio=0.68; 95% CI 0.52–0.88; p=0.004). The surgical treatment benefit with respect to female-specific cancer was significantly associated with baseline serum insulin (interaction p value=0.022), with greater relative treatment benefit in patients with medium or high insulin levels. Separate analyses of different types of female-specific cancers showed that bariatric surgery was associated with reduced risk of endometrial cancer (hazard ratio=0.56; 95% CI 0.35–0.89; p=0.014). Conclusions In this long-term study, bariatric surgery was associated with reduced risk of female-specific cancer, especially in women with hyperinsulinemia at baseline. PMID:28259424
Post-Inhaled Corticosteroid Pulmonary Tuberculosis Increases Lung Cancer in Patients with Asthma
Lin, Frank Cheau-Feng; Nfor, Oswald Ndi; Jhang, Kai-Ming; Ku, Wen-Yuan; Ho, Chien-Chang; Lung, Chia-Chi; Pan, Hui-Hsien; Wu, Min-Chen; Wu, Ming-Fang; Liaw, Yung-Po
2016-01-01
Purpose To evaluate the association between post-inhaled corticosteroid (ICS) pulmonary tuberculosis (TB), pneumonia and lung cancer in patients with asthma. Methods The study samples were collected from the National Health Insurance Database. Asthmatic patients who were first-time users of ICS between 2003 and 2005 were identified as cases. For each case, 4 control individuals were randomly matched for sex, age and date of ICS use. Cases and matched controls were followed up until the end of 2010. Cox proportional hazard regression was used to determine the hazard ratio for pulmonary infections and lung cancer risk in the ICS users and non-users. Results A total of 10,904 first-time users of ICS were matched with 43,616 controls. The hazard ratios for lung cancer were: 2.52 (95% confidence interval [CI], 1.22–5.22; p = 0.012) for individuals with post-ICS TB, 1.28 (95%CI, 0.73–2.26; p = 0.389) for post-ICS pneumonia, 2.31(95%CI, 0.84–6.38; p = 0.105) for post-ICS pneumonia+TB, 1.08 (95%CI, 0.57–2.03; p = 0.815) for TB, 0.99 (95%CI, 0.63–1.55; p = 0.970) for pneumonia, and 0.32 (95%CI, 0.05–2.32; p = 0.261) for pneumonia+ TB, respectively. Conclusions Post-ICS TB increased lung cancer risk in patients with asthma. Because of the high mortality associated with lung cancer, screening tests are recommended for patients with post-ICS TB. PMID:27448321
Vu, Thuy C.; Nutt, John G.; Holford, Nicholas H. G.
2012-01-01
AIM To describe the time to clinical events (death, disability, cognitive impairment and depression) in Parkinson's disease using the time course of disease status and treatment as explanatory variables. METHODS Disease status based on the Unified Parkinson's Disease Rating Scale (UPDRS) and the time to clinical outcome events were obtained from 800 patients who initially had early Parkinson's disease. Parametric hazard models were used to describe the time to the events of interest. RESULTS Time course of disease status (severity) was an important predictor of clinical outcome events. There was an increased hazard ratio for death of 1.4 (95% CI 1.31, 1.49), disability 2.75 (95% CI 2.30, 3.28), cognitive impairment 4.35 (95% CI 1.94, 9.74), and depressive state 1.43 (95% CI 1.26, 1.63) with each 10-unit increase of UPDRS. Age at study entry increased the hazard, with hazard ratios of 49.1 (95% CI 8.7, 278) for death, 4.76 (95% CI 1.10, 20.6) for disability and 90.0 (95% CI 63.3, 128) for cognitive impairment at age 60 years. Selegiline treatment had independent effects as a predictor of death at 8-year follow-up with a hazard ratio of 2.54 (95% CI 1.51, 4.25), but had beneficial effects on disability with a hazard ratio of 0.363 (95% CI 0.132, 0.533) and depression with a hazard ratio of 0.372 (95% CI 0.12, 0.552). CONCLUSIONS Our findings show that the time course of disease status based on UPDRS is a much better predictor of future clinical events than any baseline disease characteristic. Continued selegiline treatment appears to increase the hazard of death. PMID:22300470
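Hazard ratios quoted "per 10-unit increase of UPDRS" follow from the proportional-hazards form h(t|x) = h0(t)·exp(βx): a k-unit covariate increase multiplies the hazard by exp(kβ), i.e. the per-unit hazard ratio raised to the k-th power. A sketch with hypothetical helper names:

```python
def hr_for_increase(hr_per_unit, k):
    """Hazard ratio for a k-unit covariate increase, given the
    per-1-unit hazard ratio: exp(k*beta) = (exp(beta)) ** k."""
    return hr_per_unit ** k

def per_unit_hr(hr_per_k_units, k):
    """Inverse conversion: per-unit hazard ratio implied by a
    hazard ratio quoted per k units of the covariate."""
    return hr_per_k_units ** (1 / k)
```

So an HR of 1.4 per 10 UPDRS points corresponds to roughly a 3.4% hazard increase per single point, compounded multiplicatively.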
Kendler, Kenneth S; Lönn, Sara Larsson; Salvatore, Jessica; Sundquist, Jan; Sundquist, Kristina
2017-05-01
The purpose of this study was to clarify the magnitude and nature of the relationship between divorce and risk for alcohol use disorder (AUD). In a population-based Swedish sample of married individuals (N=942,366), the authors examined the association between divorce or widowhood and risk for first registration for AUD. AUD was assessed using medical, criminal, and pharmacy registries. Divorce was strongly associated with risk for first AUD onset in both men (hazard ratio=5.98, 95% CI=5.65-6.33) and women (hazard ratio=7.29, 95% CI=6.72-7.91). Among monozygotic twin pairs discordant for divorce, the hazard ratio for AUD onset given divorce was estimated at 3.45 in men and 3.62 in women. Divorce was also associated with an AUD recurrence in those with AUD registrations before marriage. Furthermore, widowhood increased risk for AUD in men (hazard ratio=3.85, 95% CI=2.81-5.28) and women (hazard ratio=4.10, 95% CI=2.98-5.64). Among divorced individuals, remarriage was associated with a large decline in AUD in both sexes (men: hazard ratio=0.56, 95% CI=0.52-0.64; women: hazard ratio=0.61, 95% CI=0.55-0.69). Divorce produced a greater increase in first AUD onset in those with a family history of AUD or with prior externalizing behaviors. Spousal loss through divorce or bereavement is associated with a large, enduring increase in AUD risk. This association likely reflects both causal and noncausal processes. That the AUD status of the spouse alters this association highlights the importance of spouse characteristics for the behavioral health consequences of spousal loss. The pronounced elevation in AUD risk following divorce or widowhood, and the protective effect of remarriage against subsequent AUD, speak to the profound impact of marriage on problematic alcohol use.
Lim, Wendy; Meade, Maureen; Lauzier, Francois; Zarychanski, Ryan; Mehta, Sangeeta; Lamontagne, Francois; Dodek, Peter; McIntyre, Lauralyn; Hall, Richard; Heels-Ansdell, Diane; Fowler, Robert; Pai, Menaka; Guyatt, Gordon; Crowther, Mark A; Warkentin, Theodore E; Devereaux, P J; Walter, Stephen D; Muscedere, John; Herridge, Margaret; Turgeon, Alexis F; Geerts, William; Finfer, Simon; Jacka, Michael; Berwanger, Otavio; Ostermann, Marlies; Qushmaq, Ismael; Friedrich, Jan O; Cook, Deborah J
2015-02-01
To identify risk factors for failure of anticoagulant thromboprophylaxis in critically ill patients in the ICU. Multivariable regression analysis of thrombosis predictors from a randomized thromboprophylaxis trial. Sixty-seven medical-surgical ICUs in six countries. Three thousand seven hundred forty-six medical-surgical critically ill patients. All patients received anticoagulant thromboprophylaxis with low-molecular-weight heparin or unfractionated heparin at standard doses. Independent predictors for venous thromboembolism, proximal leg deep vein thrombosis, and pulmonary embolism developing during critical illness were assessed. A total of 289 patients (7.7%) developed venous thromboembolism. Predictors of thromboprophylaxis failure as measured by development of venous thromboembolism included a personal or family history of venous thromboembolism (hazard ratio, 1.64; 95% CI, 1.03-2.59; p = 0.04) and body mass index (hazard ratio, 1.18 per 10-point increase; 95% CI, 1.04-1.35; p = 0.01). Increasing body mass index was also a predictor for developing proximal leg deep vein thrombosis (hazard ratio, 1.25; 95% CI, 1.06-1.46; p = 0.007), which occurred in 182 patients (4.9%). Pulmonary embolism occurred in 47 patients (1.3%) and was associated with body mass index (hazard ratio, 1.37; 95% CI, 1.02-1.83; p = 0.035) and vasopressor use (hazard ratio, 1.84; 95% CI, 1.01-3.35; p = 0.046). Low-molecular-weight heparin (in comparison to unfractionated heparin) thromboprophylaxis lowered pulmonary embolism risk (hazard ratio, 0.51; 95% CI, 0.27-0.95; p = 0.034) while statin use in the preceding week lowered the risk of proximal leg deep vein thrombosis (hazard ratio, 0.46; 95% CI, 0.27-0.77; p = 0.004). Failure of standard thromboprophylaxis using low-molecular-weight heparin or unfractionated heparin is more likely in ICU patients with elevated body mass index, those with a personal or family history of venous thromboembolism, and those receiving vasopressors. 
Alternate management or incremental risk reduction strategies may be needed in such patients.
Celiac Disease and Anorexia Nervosa: A Nationwide Study.
Mårild, Karl; Størdal, Ketil; Bulik, Cynthia M; Rewers, Marian; Ekbom, Anders; Liu, Edwin; Ludvigsson, Jonas F
2017-05-01
Previous research suggests an association of celiac disease (CD) with anorexia nervosa (AN), but data are mostly limited to case reports. We aimed to determine whether CD is associated with the diagnosis of AN. Register-based cohort and case-control study including women with CD (n = 17 959) and sex- and age-matched population-based controls (n = 89 379). CD (villous atrophy) was identified through the histopathology records of Sweden's 28 pathology departments. Inpatient and hospital-based outpatient records were used to identify AN. Hazard ratios for incident AN diagnosis were estimated by using stratified Cox regression with CD diagnosis as a time-dependent exposure variable. In the secondary analyses, we used conditional logistic regression to estimate odds ratios for being diagnosed with AN before CD. Median age at CD diagnosis was 28 years. During 1 174 401 person-years of follow-up, 54 patients with CD were diagnosed with AN (27/100 000 person-years) compared with 180 matched controls (18/100 000 person-years). The hazard ratio for later AN was 1.46 (95% confidence interval [CI], 1.08-1.98) overall and 1.31 beyond the first year after CD diagnosis (95% CI, 0.95-1.81). A previous AN diagnosis was also associated with CD (odds ratio, 2.18; 95% CI, 1.45-3.29). Estimates remained largely unchanged when adjusted for socioeconomic characteristics and type 1 diabetes. The bidirectional association between AN diagnosis and CD warrants attention in the initial assessment and follow-up of these conditions, because underdiagnosis and misdiagnosis of these disorders likely cause protracted and unnecessary morbidity. Copyright © 2017 by the American Academy of Pediatrics.
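The incidence figures above (27 vs 18 per 100 000 person-years) are crude rates: events divided by accumulated person-time. A sketch; the person-year denominators below are back-calculated from the reported rates and event counts, so they are illustrative approximations rather than the study's exact figures:

```python
def rate_per_100k(events, person_years):
    """Crude incidence rate per 100 000 person-years."""
    return events / person_years * 100_000

def crude_rate_ratio(e1, py1, e0, py0):
    """Ratio of two crude incidence rates. This is unadjusted;
    the study's stratified Cox hazard ratio of 1.46 additionally
    accounts for the matched design and time-dependent exposure."""
    return (e1 / py1) / (e0 / py0)
```

For example, 54 events at 27/100 000 person-years implies roughly 200 000 person-years among CD patients, and the crude rate ratio 27/18 = 1.5 sits close to the adjusted hazard ratio of 1.46.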
Yang, Peng; Ma, Junhong; Yang, Xin; Li, Wei
2017-01-01
Background To investigate the clinical significance of naïve T cells, memory T cells, CD45RA+CD45RO+ T cells, and the naïve/memory ratio in non-small cell lung cancer (NSCLC) patients. Methods Pretreatment peripheral blood samples from 76 NSCLC patients and 28 age- and sex-matched healthy volunteers were collected and tested for immune cells by flow cytometry. We compared the expression of these immune cells between patients and healthy controls and evaluated their predictive roles for survival in NSCLC using a Cox proportional hazards model. Results Decreased naïve CD4+ T cells, naïve CD8+ T cells, CD4+ naïve/memory ratios and CD4+CD45RA+CD45RO+ T cells, and increased memory CD4+ T cells, were observed in the 76 NSCLC patients compared with healthy volunteers. Univariate analysis revealed that an elevated CD4+ naïve/memory ratio correlated with prolonged progression-free survival (P=0.013). Multivariate analysis confirmed its predictive role with a hazard ratio of 0.35 (95% confidence interval, 0.19-0.75, P=0.012). Conclusions The peripheral CD4+ naïve/memory ratio can be used as a predictive biomarker in NSCLC patients and to optimize personalized treatment strategies. PMID:29137371
Albuminuria and Rapid Loss of GFR and Risk of New Hip and Pelvic Fractures
Gao, Peggy; Clase, Catherine M.; Mente, Andrew; Mann, Johannes F.E.; Sleight, Peter; Yusuf, Salim; Teo, Koon K.
2013-01-01
Summary Background and objectives The microvascular circulation plays an important role in bone health. This study examines whether albuminuria, a marker of renal microvascular disease, is associated with incident hip and pelvic fractures. Design, setting, participants, & measurements This study reanalyzed data from the Ongoing Telmisartan Alone and in combination with Ramipril Global End Point Trial/Telmisartan Randomized Assessment Study in Angiotensin-Converting Enzyme Intolerant Subjects with Cardiovascular Disease trials, which examined the impact of renin angiotensin system blockade on cardiovascular outcomes (n=28,601). Albuminuria was defined as an albumin-to-creatinine ratio≥30 mg/g (n=4597). Cox proportional hazards models were used to determine the association of albuminuria with fracture risk adjusted for known risk factors for fractures, estimated GFR, and rapid decline in estimated GFR (≥5%/yr). Results There were 276 hip and pelvic fractures during a mean of 4.6 years of follow-up. Participants with baseline albuminuria had a significantly increased risk of fracture compared with participants without albuminuria (unadjusted hazard ratio=1.62 [1.22, 2.15], P<0.001; adjusted hazard ratio=1.36 [1.01, 1.84], P=0.05). A dose-dependent relationship was observed, with macroalbuminuria having a large fracture risk (unadjusted hazard ratio=2.01 [1.21, 3.35], P=0.007; adjusted hazard ratio=1.71 [1.007, 2.91], P=0.05) and microalbuminuria associating with borderline or no statistical significance (unadjusted hazard ratio=1.52 [1.10, 2.09], P=0.01; adjusted hazard ratio=1.28 [0.92, 1.78], P=0.15). Estimated GFR was not a predictor of fracture in any model, but rapid loss of estimated GFR over the first 2 years of follow-up predicted subsequent fracture (adjusted hazard ratio=1.47 [1.05, 2.04], P=0.02). Conclusions Albuminuria, especially macroalbuminuria, and rapid decline of estimated GFR predict hip and pelvic fractures. 
These findings support a theoretical model of a relationship between underlying causes of microalbuminuria and bone disease. PMID:23184565
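The exposure definition in the study above turns on the urinary albumin-to-creatinine ratio (albuminuria defined as ACR ≥ 30 mg/g). A classification sketch; the 300 mg/g macroalbuminuria cutoff used below is the conventional clinical one and is not stated explicitly in the abstract:

```python
def albuminuria_category(acr_mg_g):
    """Classify urine albumin-to-creatinine ratio (mg/g).

    Albuminuria is ACR >= 30 mg/g as in the study; the split
    between micro- and macroalbuminuria at 300 mg/g follows the
    conventional cutoff (an assumption, not from the abstract).
    """
    if acr_mg_g < 30:
        return "normoalbuminuria"
    if acr_mg_g < 300:
        return "microalbuminuria"
    return "macroalbuminuria"
```

This dose-dependent coding mirrors the paper's finding: macroalbuminuria carried the larger fracture risk, microalbuminuria a borderline one.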
Evans, M.L.; Carroll, H.A.
1986-07-01
The handbook describes basic types of foams that may be used to control vapor hazards from spilled volatile chemicals. It provides a table to be used by spill-response personnel to choose an appropriate foam based on the type of chemical spill. Six general types of foams are covered, including surfactant (syndet) foams, aqueous film-forming foams (AFFF), alcohol-type or polar-solvent-type foams (ATF), and special foams such as Hazmat NF no. 1, which was developed especially for alkaline spills. The handbook provides the basis for spill responders to evaluate and select a foam for vapor control by using the test methods presented or by considering manufacturers' specifications for foam-expansion ratios and quarter drainage times. The responder is encouraged to maximize the effectiveness of a foam by trying different nozzles, distances of application, and thicknesses of the foam layers.
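One of the manufacturer specifications mentioned, the foam-expansion ratio, is simply the volume of finished foam divided by the volume of foam solution used to produce it. A sketch; the low/medium/high bands in the comment follow common fire-protection practice and are an assumption, not taken from the handbook text:

```python
def expansion_ratio(foam_volume, solution_volume):
    """Foam expansion ratio: finished-foam volume per unit volume
    of foam solution (same units for both arguments).

    Common fire-protection bands (assumed, not from the handbook):
    low expansion < 20, medium 20-200, high > 200.
    """
    return foam_volume / solution_volume
```

For vapor suppression, a higher expansion ratio means a thicker blanket from less solution, but usually at the cost of faster drainage, which is why the handbook pairs expansion ratio with quarter drainage time.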
Scherrer, B; Andrieu, S; Ousset, P J; Berrut, G; Dartigues, J F; Dubois, B; Pasquier, F; Piette, F; Robert, P; Touchon, J; Garnier, P; Mathiex-Fortunet, H; Vellas, B
2015-12-01
Time-to-event analysis is frequently used in medical research to investigate potential disease-modifying treatments in neurodegenerative diseases. Potential treatment effects are generally evaluated using the logrank test, which has optimal power and sensitivity when the treatment effect (hazard ratio) is constant over time. However, there is generally no prior information as to how the hazard ratio for the event of interest actually evolves. In these cases, the logrank test is not necessarily the most appropriate to use. When the hazard ratio is expected to decrease or increase over time, alternative statistical tests, such as the Fleming-Harrington test, provide better sensitivity. An example of this comes from a large, five-year randomised, placebo-controlled prevention trial (GuidAge) in 2854 community-based subjects making spontaneous memory complaints to their family physicians, which evaluated whether treatment with EGb761 can modify the risk of developing AD. The primary outcome measure was the time to conversion from memory complaint to Alzheimer's-type dementia. Although there was no significant difference in the hazard function of conversion between the two treatment groups according to the preplanned logrank test, a significant treatment-by-time interaction for the incidence of AD was observed in a protocol-specified subgroup analysis, suggesting that the hazard ratio is not constant over time. For this reason, additional post hoc analyses were performed using the Fleming-Harrington test to evaluate whether there was a signal of a late effect of EGb761. Applying the Fleming-Harrington test, the hazard function for conversion to dementia in the placebo group was significantly different from that in the EGb761 treatment group (p = 0.0054), suggesting a late effect of EGb761. Since this was a post hoc analysis, no definitive conclusions can be drawn as to the effectiveness of the treatment. 
This post hoc analysis illustrates the value of performing another randomised clinical trial of EGb761 explicitly testing the hypothesis of a late treatment effect, as well as of using better-adapted statistical approaches for long-term preventive trials when prevention is expected to have not an immediate effect but a delayed effect that increases over time.
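The Fleming-Harrington G(rho, gamma) family weights each logrank increment by S(t-)^rho * (1 - S(t-))^gamma, where S is the pooled left-continuous Kaplan-Meier estimate: rho = gamma = 0 recovers the ordinary logrank test, while rho = 0, gamma > 0 up-weights late differences, matching the delayed-effect hypothesis described above. A self-contained sketch for right-censored two-group data (illustrative only, not the GuidAge analysis code):

```python
import math

def weighted_logrank_z(times, events, groups, rho=0.0, gamma=1.0):
    """Fleming-Harrington G(rho, gamma) weighted logrank z-statistic.

    times: event/censoring times; events: 1=event, 0=censored;
    groups: 0/1 group labels. rho=gamma=0 gives the ordinary logrank.
    """
    data = sorted(zip(times, events, groups))
    s = 1.0        # left-continuous pooled Kaplan-Meier estimate
    num = 0.0      # weighted observed-minus-expected events in group 1
    var = 0.0      # accumulated weighted hypergeometric variance
    i = 0
    while i < len(data):
        t = data[i][0]
        at_risk = len(data) - i                        # subjects with time >= t
        r1 = sum(1 for _, _, g in data[i:] if g == 1)  # group-1 subjects at risk
        d = d1 = 0
        while i < len(data) and data[i][0] == t:       # consume ties at time t
            if data[i][1]:
                d += 1
                d1 += data[i][2]
            i += 1
        if d:
            w = (s ** rho) * ((1.0 - s) ** gamma)      # FH weight at time t
            e1 = d * r1 / at_risk                      # expected group-1 events
            num += w * (d1 - e1)
            if at_risk > 1:
                var += (w ** 2) * d * (r1 / at_risk) * (1 - r1 / at_risk) * \
                       (at_risk - d) / (at_risk - 1)
            s *= 1.0 - d / at_risk                     # update KM after time t
    return num / math.sqrt(var) if var > 0 else 0.0
```

Because (1 - S)^gamma is near zero early in follow-up, a gamma > 0 test barely counts early events and concentrates its power on late divergence of the survival curves, which is exactly the scenario the GuidAge post hoc analysis targeted.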
Sommers, Marilyn S.; Lyons, Michael S.; Fargo, Jamison D.; Sommers, Benjamin D.; McDonald, Catherine C.; Shope, Jean T.; Fleming, Michael F.
2014-01-01
Background Risky driving and hazardous drinking are associated with significant human and economic costs. Brief interventions for more than one risky behavior have the potential to reduce health-compromising behaviors in populations with multiple risk-taking behaviors such as young adults. Emergency department (ED) visits provide a window of opportunity for interventions meant to reduce both risky driving and hazardous drinking. Methods We determined the efficacy of a Screening, Brief Intervention, and Referral to Treatment (SBIRT) protocol addressing risky driving and hazardous drinking. We used a randomized controlled trial design with follow-ups through 12 months. ED patients aged 18 to 44 who screened positive for both behaviors (n = 476) were randomized to brief intervention (BIG), contact control (CCG), or no-contact control (NCG) groups. The BIG (n = 150) received a 20-minute assessment and two 20-minute interventions. The CCG (n = 162) received a 20-minute assessment at baseline and no intervention. The NCG (n = 164) were asked for contact information at baseline and had no assessment or intervention. Outcomes at 3, 6, 9, and 12 months were self-reported driving behaviors and alcohol consumption. Results Outcomes were significantly lower in BIG compared with CCG through 6 or 9 months, but not at 12 months: Safety belt use at 3 months (adjusted odds ratio [AOR], 0.22; 95% confidence interval [CI], 0.08 to 0.65); 6 months (AOR, 0.13; 95% CI, 0.04 to 0.42); and 9 months (AOR, 0.18; 95% CI, 0.06 to 0.56); binge drinking at 3 months (adjusted rate ratio [ARR] 0.84; 95% CI, 0.74 to 0.97) and 6 months (ARR, 0.81; 95% CI, 0.67 to 0.97); and ≥ 5 standard drinks/d at 3 months (AOR, 0.43; 95% CI, 0.20 to 0.91) and 6 months (AOR, 0.41; 95% CI, 0.17 to 0.98). No substantial differences were observed between BIG and NCG at 12 months. 
Conclusions Our findings indicate that SBIRT reduced risky driving and hazardous drinking in young adults, but its effects did not persist after 9 months. Future research should explore methods for extending the intervention effect. PMID:23802878
Intensive Blood-Pressure Control in Hypertensive Chronic Kidney Disease
Appel, Lawrence J.; Wright, Jackson T.; Greene, Tom; Agodoa, Lawrence Y.; Astor, Brad C.; Bakris, George L.; Cleveland, William H.; Charleston, Jeanne; Contreras, Gabriel; Faulkner, Marquetta L.; Gabbai, Francis B.; Gassman, Jennifer J.; Hebert, Lee A.; Jamerson, Kenneth A.; Kopple, Joel D.; Kusek, John W.; Lash, James P.; Lea, Janice P.; Lewis, Julia B.; Lipkowitz, Michael S.; Massry, Shaul G.; Miller, Edgar R.; Norris, Keith; Phillips, Robert A.; Pogue, Velvie A.; Randall, Otelio S.; Rostand, Stephen G.; Smogorzewski, Miroslaw J.; Toto, Robert D.; Wang, Xuelei
2013-01-01
BACKGROUND In observational studies, the relationship between blood pressure and end-stage renal disease (ESRD) is direct and progressive. The burden of hypertension-related chronic kidney disease and ESRD is especially high among black patients. Yet few trials have tested whether intensive blood-pressure control retards the progression of chronic kidney disease among black patients. METHODS We randomly assigned 1094 black patients with hypertensive chronic kidney disease to receive either intensive or standard blood-pressure control. After completing the trial phase, patients were invited to enroll in a cohort phase in which the blood-pressure target was less than 130/80 mm Hg. The primary clinical outcome in the cohort phase was the progression of chronic kidney disease, which was defined as a doubling of the serum creatinine level, a diagnosis of ESRD, or death. Follow-up ranged from 8.8 to 12.2 years. RESULTS During the trial phase, the mean blood pressure was 130/78 mm Hg in the intensive-control group and 141/86 mm Hg in the standard-control group. During the cohort phase, corresponding mean blood pressures were 131/78 mm Hg and 134/78 mm Hg. In both phases, there was no significant between-group difference in the risk of the primary outcome (hazard ratio in the intensive-control group, 0.91; P = 0.27). However, the effects differed according to the baseline level of proteinuria (P = 0.02 for interaction), with a potential benefit in patients with a protein-to-creatinine ratio of more than 0.22 (hazard ratio, 0.73; P = 0.01). CONCLUSIONS In overall analyses, intensive blood-pressure control had no effect on kidney disease progression. However, there may be differential effects of intensive blood-pressure control in patients with and those without baseline proteinuria. (Funded by the National Institute of Diabetes and Digestive and Kidney Diseases, the National Center on Minority Health and Health Disparities, and others.) PMID:20818902
Statistical evaluation of the Local Lymph Node Assay.
Hothorn, Ludwig A; Vohr, Hans-Werner
2010-04-01
In the Local Lymph Node Assay, measured endpoints for each animal, such as cell proliferation, cell counts and/or lymph node weight, should be evaluated separately. The primary criterion for a positive response is that the estimated stimulation index is larger than a specified relative threshold that is endpoint- and strain-specific. When the lower confidence limit for ratio-to-control comparisons is larger than a relevance threshold, a biologically relevant increase can be concluded according to the proof of hazard. Alternatively, when the upper confidence limit for ratio-to-control comparisons is smaller than a tolerable margin, harmlessness can be concluded according to a proof of safety. Copyright 2009 Elsevier Inc. All rights reserved.
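The two-sided decision logic described above, proof of hazard versus proof of safety via confidence limits on the ratio-to-control comparison, can be stated compactly. A sketch; the threshold values in the test below are hypothetical, since the actual endpoint- and strain-specific thresholds are not given in the abstract:

```python
def llna_decision(ci_lower, ci_upper, relevance_threshold, safety_margin):
    """Decision rule for a ratio-to-control comparison of an LLNA
    endpoint (e.g. the cell-proliferation stimulation index).

    Proof of hazard: the lower confidence limit exceeds the
    relevance threshold. Proof of safety: the upper confidence
    limit falls below the tolerable margin. Otherwise the assay
    is inconclusive for this endpoint.
    """
    if ci_lower > relevance_threshold:
        return "hazard"
    if ci_upper < safety_margin:
        return "safe"
    return "inconclusive"
```

Framing both directions as confidence-limit comparisons means a wide, noisy interval can satisfy neither criterion, so imprecise experiments are flagged as inconclusive rather than forced into a positive or negative call.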
Tominaga, K; Andow, J; Koyama, Y; Numao, S; Kurokawa, E; Ojima, M; Nagai, M
1998-01-01
Many psychosocial factors have been reported to influence the duration of survival of breast cancer patients. We studied how patients' family members, hobbies, and habits may alter their psychosocial status. Female patients with surgically treated breast cancer diagnosed between 1986 and 1995 at the Tochigi Cancer Center Hospital, who provided information on the above-mentioned factors, were included. Their subsequent physical status was followed up in the outpatient clinic. The Cox regression model was used to evaluate the relationship between the results of the factors examined and the duration of the patients' survival, adjusting for the patients' age, stage of disease at diagnosis and curability, as judged by the physician in charge after the treatment. The following factors were revealed to be significant with regard to the survival of surgically treated breast cancer patients: being a widow (hazard ratio 3.29; 95% confidence interval 1.32-8.20), having a hobby (hazard ratio 0.43; 95% confidence interval 0.23-0.82), number of hobbies (hazard ratio 0.64; 95% confidence interval 0.41-1.00), number of female children (hazard ratio 0.64; 95% confidence interval 0.42-0.98), smoking (hazard ratio 2.08; 95% confidence interval 1.02-4.26) and alcohol consumption (hazard ratio 0.10; 95% confidence interval 0.01-0.72). These results suggest that psychosocial factors, including the family environment, where patients receive emotional support from their spouse and children, hobbies and the patients' habits, may influence the duration of survival in surgically treated breast cancer patients.
Adaptive Servo-Ventilation for Central Sleep Apnea in Systolic Heart Failure.
Cowie, Martin R; Woehrle, Holger; Wegscheider, Karl; Angermann, Christiane; d'Ortho, Marie-Pia; Erdmann, Erland; Levy, Patrick; Simonds, Anita K; Somers, Virend K; Zannad, Faiez; Teschler, Helmut
2015-09-17
Central sleep apnea is associated with poor prognosis and death in patients with heart failure. Adaptive servo-ventilation is a therapy that uses a noninvasive ventilator to treat central sleep apnea by delivering servo-controlled inspiratory pressure support on top of expiratory positive airway pressure. We investigated the effects of adaptive servo-ventilation in patients who had heart failure with reduced ejection fraction and predominantly central sleep apnea. We randomly assigned 1325 patients with a left ventricular ejection fraction of 45% or less, an apnea-hypopnea index (AHI) of 15 or more events (occurrences of apnea or hypopnea) per hour, and a predominance of central events to receive guideline-based medical treatment with adaptive servo-ventilation or guideline-based medical treatment alone (control). The primary end point in the time-to-event analysis was the first event of death from any cause, lifesaving cardiovascular intervention (cardiac transplantation, implantation of a ventricular assist device, resuscitation after sudden cardiac arrest, or appropriate lifesaving shock), or unplanned hospitalization for worsening heart failure. In the adaptive servo-ventilation group, the mean AHI at 12 months was 6.6 events per hour. The incidence of the primary end point did not differ significantly between the adaptive servo-ventilation group and the control group (54.1% and 50.8%, respectively; hazard ratio, 1.13; 95% confidence interval [CI], 0.97 to 1.31; P=0.10). All-cause mortality and cardiovascular mortality were significantly higher in the adaptive servo-ventilation group than in the control group (hazard ratio for death from any cause, 1.28; 95% CI, 1.06 to 1.55; P=0.01; and hazard ratio for cardiovascular death, 1.34; 95% CI, 1.09 to 1.65; P=0.006). 
Adaptive servo-ventilation had no significant effect on the primary end point in patients who had heart failure with reduced ejection fraction and predominantly central sleep apnea, but all-cause and cardiovascular mortality were both increased with this therapy. (Funded by ResMed and others; SERVE-HF ClinicalTrials.gov number, NCT00733343.).
Palatini, Paolo; Bratti, Paolo; Palomba, Daniela; Saladini, Francesca; Zanatta, Nello; Maraglino, Giuseppe
2010-06-01
The objective of this study was to investigate the effect of regular physical activity on the haemodynamic response to public speaking and to evaluate the long-term effect of exercise on development of hypertension. We assessed 75 sedentary and 44 active participants screened for stage 1 hypertension with consistent activity habits and 63 normotensive individuals as controls. The blood pressure (BP) response to public speaking was assessed with beat-to-beat noninvasive recording. Definition of incident hypertension was based either on clinic or 24-h BP measurement. The BP response to public speaking was greater in the hypertensive than the normotensive participants (P=0.018/0.009). Among the former, sedentary participants showed increased BP reactivity to the speech test (45.2±22.6/22.2±11.5 mm Hg, P<0.01/<0.001 versus controls), whereas physically active participants had a response similar to that of controls (35.4±18.5/18.5±11.5 mm Hg, P=not significant). During a median follow-up of 71 months, ambulatory BP remained virtually unchanged in the active participants (-0.9±7.8/-0.0±4.7 mm Hg) and increased in their sedentary peers (2.8±9.8/3.2±7.4 mm Hg, P=0.08/0.003 versus active). Active participants were less likely to develop incident hypertension than sedentary ones. After controlling for several confounders including baseline heart rate, the hazard ratio was 0.53 [95% confidence interval (CI) 0.31-0.94] for clinic hypertension and 0.60 (95% CI 0.37-0.99) for ambulatory hypertension. Inclusion of the BP response to public speaking in the Cox model influenced the strength of the association only marginally [hazard ratio=0.55 (95% CI 0.30-0.97) and hazard ratio=0.59 (95% CI 0.36-0.99), respectively]. Regular physical activity attenuates the BP reaction to psychosocial stressors. However, this mechanism seems to be only partially responsible for the long-term effect of exercise on BP.
Chen-Hussey, Vanessa; Carneiro, Ilona; Keomanila, Hongkham; Gray, Rob; Bannavong, Sihamano; Phanalasy, Saysana; Lindsay, Steven W
2013-01-01
Mosquito vectors of malaria in Southeast Asia readily feed outdoors, making malaria control through indoor insecticides such as long-lasting insecticidal nets (LLINs) and indoor residual spraying more difficult. Topical insect repellents may be able to protect users from outdoor biting, thereby providing additional protection above the current best practice of LLINs. A double blind, household randomised, placebo-controlled trial of insect repellent to reduce malaria was carried out in southern Lao PDR to determine whether the use of repellent and long-lasting insecticidal nets (LLINs) could reduce malaria more than LLINs alone. A total of 1,597 households, including 7,979 participants, were recruited in June 2009 and April 2010. Equal group allocation, stratified by village, was used to randomise 795 households to a 15% DEET lotion and the remainder were given a placebo lotion. Participants, field staff and data analysts were blinded to the group assignment until data analysis had been completed. All households received new LLINs. Participants were asked to apply their lotion to exposed skin every evening and sleep under the LLINs each night. Plasmodium falciparum and P. vivax cases were actively identified by monthly rapid diagnostic tests. Intention-to-treat analysis found no effect from the use of repellent on malaria incidence (hazard ratio: 1.00, 95% CI: 0.99-1.01, p = 0.868). A higher socio-economic score was found to significantly decrease malaria risk (hazard ratio: 0.72, 95% CI: 0.58-0.90, p = 0.004). Women were also found to have a reduced risk of infection (hazard ratio: 0.59, 95% CI: 0.37-0.92, p = 0.020). A per-protocol analysis, which excluded participants who used the lotions less than 90% of the time, found similar results, with no effect from the use of repellent.
This randomised controlled trial suggests that topical repellents are not a suitable intervention in addition to LLINs against malaria amongst agricultural populations in southern Lao PDR. These results are also likely to be applicable to much of the Greater Mekong Sub-region. This trial is registered with number NCT00938379.
Letang, Emilio; Lewis, James J; Bower, Mark; Mosam, Anisa; Borok, Margareth; Campbell, Thomas B; Naniche, Denise; Newsom-Davis, Tom; Shaik, Fahmida; Fiorillo, Suzanne; Miro, Jose M; Schellenberg, David; Easterbrook, Philippa J
2013-06-19
To assess the incidence, predictors, and outcomes of Kaposi sarcoma-associated paradoxical immune reconstitution inflammatory syndrome (KS-IRIS) in antiretroviral therapy (ART)-naive HIV-infected patients with Kaposi sarcoma initiating ART in both well resourced and limited-resourced settings. Pooled analysis of three prospective cohorts of ART-naive HIV-infected patients with Kaposi sarcoma from sub-Saharan Africa (SSA) and one from the UK. KS-IRIS case definition was standardized across sites. Cox regression and Kaplan-Meier survival analysis were used to identify the incidence and predictors of KS-IRIS and Kaposi sarcoma-associated mortality. Fifty-eight of 417 (13.9%) eligible individuals experienced KS-IRIS with an incidence 2.5 times higher in the African vs. European cohorts (P=0.001). ART alone as initial Kaposi sarcoma treatment (hazard ratio 2.97, 95% confidence interval (CI) 1.02-8.69); T1 Kaposi sarcoma stage (hazard ratio 2.96, 95% CI 1.26-6.94); and plasma HIV-1 RNA more than 5 log₁₀ copies/ml (hazard ratio 2.14, 95% CI 1.25-3.67) independently predicted KS-IRIS at baseline. Detectable plasma Kaposi sarcoma-associated herpes virus (KSHV) DNA additionally predicted KS-IRIS among the 259 patients with KSHV DNA assessed (hazard ratio 2.98, 95% CI 1.23-7.19). Nineteen KS-IRIS patients died, all in SSA. Kaposi sarcoma mortality was 3.3-fold higher in Africa, and was predicted by KS-IRIS (hazard ratio 19.24, CI 7.62-48.58), lack of chemotherapy (hazard ratio 2.35, 95% CI 1.09-5.05), pre-ART CD4 cell count less than 200 cells/μl (hazard ratio 2.04, 95% CI 0.99-4.2), and detectable baseline KSHV DNA (hazard ratio 2.12, 95% CI 0.94-4.77). KS-IRIS incidence and mortality are higher in SSA than in the UK. This is largely explained by the more advanced Kaposi sarcoma disease and lower chemotherapy availability. KS-IRIS is a major contributor to Kaposi sarcoma-associated mortality in Africa. 
Our results support the need to increase awareness of KS-IRIS, encourage earlier presentation, referral, and diagnosis of Kaposi sarcoma, and advocate for access to systemic chemotherapy in Africa. © 2013 Wolters Kluwer Health | Lippincott Williams & Wilkins
Flint, Alexander C; Conell, Carol; Ren, Xiushui; Kamel, Hooman; Chan, Sheila L; Rao, Vivek A; Johnston, S Claiborne
2017-07-01
Outpatient statin use reduces the risk of recurrent ischemic stroke among patients with stroke of atherothrombotic cause. It is not known whether statins have similar effects in ischemic stroke caused by atrial fibrillation (AFib). We studied outpatient statin adherence, measured by percentage of days covered, and the risk of recurrent ischemic stroke in patients with or without AFib in a 21-hospital integrated healthcare delivery system. Among 6116 patients with ischemic stroke discharged on a statin over a 5-year period, 1446 (23.6%) had a diagnosis of AFib at discharge. The mean statin adherence rate (percentage of days covered) was 85%, and higher levels of percentage of days covered correlated with greater degrees of low-density lipoprotein suppression. In multivariable survival models of recurrent ischemic stroke over 3 years, after controlling for age, sex, race/ethnicity, medical comorbidities, and hospital center, higher statin adherence predicted reduced stroke risk both in patients without AFib (hazard ratio, 0.78; 95% confidence interval, 0.63-0.97) and in patients with AFib (hazard ratio, 0.59; 95% confidence interval, 0.43-0.81). This association was robust to adjustment for the time in the therapeutic range for international normalized ratio among AFib subjects taking warfarin (hazard ratio, 0.61; 95% confidence interval, 0.41-0.89). The relationship between statin adherence and reduced recurrent stroke risk is as strong among patients with AFib as it is among patients without AFib, suggesting that AFib status should not be a reason to exclude patients from secondary stroke prevention with a statin. © 2017 American Heart Association, Inc.
Effect of motivational interviewing on rates of early childhood caries: a randomized trial.
Harrison, Rosamund; Benton, Tonya; Everson-Stewart, Siobhan; Weinstein, Phil
2007-01-01
The purposes of this randomized controlled trial were to: (1) test motivational interviewing (MI) to prevent early childhood caries; and (2) use Poisson regression for data analysis. A total of 240 South Asian children 6 to 18 months old were enrolled and randomly assigned to either the MI or control condition. Children had a dental exam, and their mothers completed pretested instruments at baseline and 1 and 2 years postintervention. Other covariates that might explain outcomes over and above treatment differences were modeled using Poisson regression. Hazard ratios were produced. Analyses included all participants whenever possible. Poisson regression supported a protective effect of MI (hazard ratio [HR]=0.54; 95% CI=0.35-0.84); that is, the MI group had about a 46% lower rate of dmfs at 2 years than did control children. Similar treatment effect estimates were obtained from models that included, as alternative outcomes, ds, dms, and dmfs, including "white spot lesions." Exploratory analyses revealed that rates of dmfs were higher in children whose mothers had: (1) prechewed their food; (2) been raised in a rural environment; and (3) a higher family income (P<.05). A motivational interviewing-style intervention shows promise to promote preventive behaviors in mothers of young children at high risk for caries.
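For a single binary treatment indicator, a Poisson model with a person-time offset reduces to a closed-form rate ratio, which is the kind of estimate the abstract's Poisson-regression "hazard ratios" correspond to before covariate adjustment. The counts below are illustrative only, not the study's data:

```python
import math

def rate_ratio(events_tx, pyears_tx, events_ctl, pyears_ctl, z=1.96):
    """Incidence-rate ratio and Wald 95% CI from a two-group Poisson model
    with log person-time as offset (closed form, single binary covariate)."""
    rr = (events_tx / pyears_tx) / (events_ctl / pyears_ctl)
    se_log = math.sqrt(1 / events_tx + 1 / events_ctl)  # SE of the log rate ratio
    return rr, rr * math.exp(-z * se_log), rr * math.exp(z * se_log)

# Illustrative: 54 new carious surfaces over 500 child-years in the MI group
# vs. 100 over 500 child-years in the control group.
rr, lo, hi = rate_ratio(54, 500, 100, 500)
```

With covariates in the model there is no closed form and the coefficients are fit by maximum likelihood, but the interpretation of the exponentiated treatment coefficient is the same.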
Felker, G Michael; Fiuzat, Mona; Thompson, Vivian; Shaw, Linda K; Neely, Megan L; Adams, Kirkwood F; Whellan, David J; Donahue, Mark P; Ahmad, Tariq; Kitzman, Dalane W; Piña, Ileana L; Zannad, Faiez; Kraus, William E; O'Connor, Christopher M
2013-11-01
ST2 is involved in cardioprotective signaling in the myocardium and has been identified as a potentially promising biomarker in heart failure (HF). We evaluated ST2 levels and their association with functional capacity and long-term clinical outcomes in a cohort of ambulatory patients with HF enrolled in the Heart Failure: A Controlled Trial Investigating Outcomes of Exercise Training (HF-ACTION) study, a multicenter, randomized study of exercise training in HF. HF-ACTION randomized 2331 patients with left ventricular ejection fraction <0.35 and New York Heart Association class II to IV HF to either exercise training or usual care. ST2 was analyzed in a subset of 910 patients with evaluable plasma samples. Correlations and Cox models were used to assess the relationship among ST2, functional capacity, and long-term outcomes. The median baseline ST2 level was 23.7 ng/mL (interquartile range, 18.6-31.8). ST2 was modestly associated with measures of functional capacity. In univariable analysis, ST2 was significantly associated with death or hospitalization (hazard ratio, 1.48; P<0.0001), cardiovascular death or HF hospitalization (hazard ratio, 2.14; P<0.0001), and all-cause mortality (hazard ratio, 2.33; P<0.0001; all hazard ratios per log2 ng/mL). In multivariable models, ST2 remained independently associated with outcomes after adjustment for clinical variables and amino-terminal pro-B-type natriuretic peptide. However, ST2 did not add significantly to reclassification of risk as assessed by changes in the C statistic, net reclassification improvement, and integrated discrimination improvement. ST2 was modestly associated with functional capacity and was significantly associated with outcomes in a well-treated cohort of ambulatory patients with HF although it did not significantly affect reclassification of risk. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00047437.
Cochrane, Shannon K; Chen, Shyh-Huei; Fitzgerald, Jodi D; Dodson, John A; Fielding, Roger A; King, Abby C; McDermott, Mary M; Manini, Todd M; Marsh, Anthony P; Newman, Anne B; Pahor, Marco; Tudor-Locke, Catrine; Ambrosius, Walter T; Buford, Thomas W
2017-12-02
Data are sparse regarding the value of physical activity (PA) surveillance among older adults, particularly among those with mobility limitations. The objective of this study was to examine longitudinal associations between objectively measured daily PA and the incidence of cardiovascular events among older adults in the LIFE (Lifestyle Interventions and Independence for Elders) study. Cardiovascular events were adjudicated based on medical records review, and cardiovascular risk factors were controlled for in the analysis. Home-based activity data were collected by hip-worn accelerometers at baseline and at 6, 12, and 24 months postrandomization to either a physical activity or health education intervention. At baseline, LIFE study participants (n=1590; age 78.9±5.2 [SD] years; 67.2% women) had an 11% lower incidence of a subsequent cardiovascular event per 500 steps taken per day (hazard ratio, 0.89; 95% confidence interval, 0.84-0.96; P=0.001). At baseline, every 30 minutes spent performing activities ≥500 counts per minute was also associated with a lower incidence of cardiovascular events (hazard ratio, 0.75; confidence interval, 0.65-0.89; P=0.001). Throughout follow-up (6, 12, and 24 months), both the number of steps per day (per 500 steps: hazard ratio, 0.90; confidence interval, 0.85-0.96; P=0.001) and duration of activity ≥500 counts per minute (per 30 minutes: hazard ratio, 0.76; confidence interval, 0.63-0.90; P=0.002) were significantly associated with lower cardiovascular event rates. Objective measurements of physical activity via accelerometry were associated with cardiovascular events among older adults with limited mobility (summary score >10 on the Short Physical Performance Battery) in both baseline and longitudinal analyses. URL: http://www.clinicaltrials.gov. Unique identifier: NCT01072500. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
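A hazard ratio reported "per 500 steps per day" scales multiplicatively on the log-hazard scale, so the ratio for any other step difference follows by exponentiation. A quick sketch of that arithmetic, using the abstract's baseline estimate of 0.89 (extrapolating to other step differences assumes log-linearity, which is an assumption of this illustration):

```python
def hr_for_step_diff(delta_steps, hr_per_500=0.89):
    """Hazard ratio implied for a difference of `delta_steps` steps/day,
    given a proportional-hazards estimate expressed per 500 steps/day."""
    return hr_per_500 ** (delta_steps / 500)
```

For example, `hr_for_step_diff(1000)` gives 0.89 squared, roughly a 21% lower event hazard for an extra 1000 steps per day under that assumption.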
Woodward, M; Zhang, X; Barzi, F; Pan, W; Ueshima, H; Rodgers, A; MacMahon, S
2003-02-01
To provide reliable age- and region-specific estimates of the associations between diabetes and major cardiovascular diseases and death in populations from the Asia-Pacific region. Twenty-four cohort studies from Asia, Australia, and New Zealand (median follow-up, 5.4 years) provided individual participant data from 161,214 people (58% from Asia) of whom 4,873 had a history of diabetes at baseline. The associations of diabetes with the risks of coronary heart disease, stroke, and cause-specific mortality during follow-up were estimated using time-dependent Cox models, stratified by study cohort and sex and adjusted for age at risk. In all, 9,277 deaths occurred (3,635 from cardiovascular disease). The hazard ratio (95% CI) associated with diabetes was 1.97 (1.72-2.25) for fatal cardiovascular disease; there were similar hazard ratios for fatal coronary heart disease, fatal stroke, and composites of fatal and nonfatal outcomes. For all cardiovascular outcomes, hazard ratios were similar in Asian and non-Asian populations and in men and women, but were greater in younger than older individuals. For noncardiovascular death, the hazard ratio was 1.56 (1.38-1.77), with separately significant increases in the risks of death from renal disease, cancer, respiratory infections, and other infective causes. The hazard ratio for all-cause mortality was 1.68 (1.55-1.84), with similar ratios in Asian and non-Asian populations, but with significantly higher ratios in younger than older individuals. The relative effect of diabetes on the risks of cardiovascular disease and death in Asian populations is much the same as that in the largely Caucasian populations of Australia and New Zealand. Hazard ratios were severalfold greater in younger people than older people. The rapidly growing prevalence of diabetes in Asia heralds a large increase in the incidence of diabetes-related death in the coming decades.
Age Variation in the Association Between Obesity and Mortality in Adults.
Wang, Zhiqiang; Peng, Yang; Liu, Meina
2017-12-01
The aim of this study was to evaluate the previously reported finding that the association between obesity and mortality strengthens with increasing age. The data were derived from the National Health Interview Survey. Age-specific hazard ratios of mortality for grade 2/3 obesity (BMI ≥ 35 kg/m²), relative to a BMI of 18.5 kg/m² to < 25 kg/m², were calculated by using a flexible parametric survival model (240,184 white men) and Cox proportional hazards models (51,697 matched pairs). When the model included interaction terms between obesity and age at the survey, hazard ratios appeared to increase with age if those interaction terms were ignored by fixing age at the survey at a single value. However, when recalculated for adults with various ages at the survey, according to the model specifications, hazard ratios were higher for younger adults than for older adults with the same follow-up duration. Based on the matched data, hazard ratios were also higher for younger adults (2.14 [95% CI: 1.90-2.40] for those 40-49 years of age) than for older adults (1.22 [95% CI: 0.91-1.63] for those 90+ years of age). For any given follow-up duration, the association between obesity and mortality weakens with age. The previously reported strengthening of the obesity-mortality association with increasing age was caused by failure to take all the model specifications into consideration when calculating adjusted hazard ratios. © 2017 The Obesity Society.
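The abstract's point about interaction terms can be made concrete: with an obesity x age-at-survey interaction in the model, the age-specific hazard ratio is exp(b_obesity + b_interaction x age), and dropping the interaction contribution silently evaluates the ratio at age zero. The coefficients below are invented for illustration and are not the paper's estimates:

```python
import math

# Hypothetical log-hazard coefficients (illustrative only, not from the study):
B_OBESITY = 1.8          # log-HR for grade 2/3 obesity extrapolated to age 0
B_INTERACTION = -0.015   # change in log-HR per year of age at survey

def hr_at_age(age):
    """Age-specific hazard ratio for obesity, including the interaction term."""
    return math.exp(B_OBESITY + B_INTERACTION * age)
```

With these values the ratio declines with age (stronger at 45 than at 90), mirroring the pattern the authors report once the full model specification is used.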
Magnus, Per; Bakke, Eirin; Hoff, Dominic A; Høiseth, Gudrun; Graff-Iversen, Sidsel; Knudsen, Gun Peggy; Myhre, Ronny; Normann, Per Trygve; Næss, Øyvind; Tambs, Kristian; Thelle, Dag S; Mørland, Jørg
2011-11-22
This study tested the hypothesis that moderate alcohol intake exerts its cardioprotective effect mainly through an increase in the serum level of high-density lipoprotein cholesterol. In the Cohort of Norway (CONOR) study, 149 729 adult participants, recruited from 1994 to 2003, were followed by linkage to the Cause of Death Registry until 2006. At recruitment, questionnaire data on alcohol intake were collected, and the concentration of high-density lipoprotein cholesterol in serum was measured. Using Cox regression, we found that the adjusted hazard ratio for men for dying from coronary heart disease was 0.52 (95% confidence interval, 0.39-0.69) when consuming alcohol more than once a week compared with never or rarely. The ratio changed only slightly, to 0.55 (0.41-0.73), after the regression model included the serum level of high-density cholesterol. For women, the corresponding hazard ratios were 0.62 (0.32-1.23) and 0.68 (0.34-1.34), respectively. Alcohol intake is related to a reduced risk of death from coronary heart disease in the follow-up of a large, population-based Norwegian cohort study with extensive control for confounding factors. Our findings suggest that the serum level of high-density cholesterol is not an important intermediate variable in the possible causal pathway between moderate alcohol intake and coronary heart disease.
Survival from skin cancer and its associated factors in Kurdistan province of Iran.
Ahmadi, Galavizh; Asadi-Lari, Mohsen; Amani, Saeid; Solaymani-Dodaran, Masoud
2015-01-01
We explored survival of skin cancer and its determinants in Kurdistan province of Iran. In a retrospective cohort design, we identified all registered skin cancer patients in the Kurdistan Cancer Registry from 2000 to 2009. Information on time and cause of death was obtained from the Registrar's office, and information on type, stage, and anatomic location was extracted from patients' hospital records. Additional demographic information was collected via a telephone interview. We calculated 3- and 5-year survival. Survival experiences in different groups were compared using the log-rank test. A Cox proportional hazards model was built, and hazard ratios and their 95% confidence intervals were calculated. Of a total of 1353 patients, contact information was available for 667, all of whom were followed up; 472 telephone interviews were conducted. Mean follow-up time was 34 months. We identified 78 deaths in this group of patients, 44 of which were due to skin cancer. After controlling for confounding, tumour type, anatomical location, and disease stage remained significantly associated with survival. The hazard ratio for death from squamous cell carcinoma was 74.5 (95% CI: 4.8-1146) and for melanoma 24.4 (95% CI: 1.3-485), compared with basal cell carcinoma. The hazard ratio for tumours in stage 4 was 16.7 (95% CI: 1.8-156.6) and for stage 3 16.8 (95% CI: 1.07-260), compared with stages 1 and 2. Tumour stage is independently associated with survival. Relatively low survival rates suggest delayed diagnosis. Increasing public awareness through the media about the warning signs of skin cancers could increase the chance of survival in these patients.
Ekamper, Peter; van Poppel, Frans; Stein, Aryeh D; Bijwaard, Govert E; Lumey, L H
2015-02-15
Nutritional conditions in early life may affect adult health, but prior studies of mortality have been limited to small samples. We evaluated the relationship between pre-/perinatal famine exposure during the Dutch Hunger Winter of 1944-1945 and mortality through age 63 years among 41,096 men born in 1944-1947 and examined at age 18 years for universal military service in the Netherlands. Of these men, 22,952 had been born around the time of the Dutch famine in 6 affected cities; the remainder served as unexposed controls. Cox proportional hazards models were used to estimate hazard ratios for death from cancer, heart disease, other natural causes, and external causes. After 1,853,023 person-years of follow-up, we recorded 1,938 deaths from cancer, 1,040 from heart disease, 1,418 from other natural causes, and 523 from external causes. We found no increase in mortality from cancer or cardiovascular disease after prenatal famine exposure. However, there were increases in mortality from other natural causes (hazard ratio = 1.24, 95% confidence interval: 1.03, 1.49) and external causes (hazard ratio = 1.46, 95% confidence interval: 1.09, 1.97) after famine exposure in the first trimester of gestation. Further follow-up of the cohort is needed to provide more accurate risk estimates of mortality from specific causes of death after nutritional disturbances during gestation and very early life. © The Author 2015. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Suicide in Tourette's and Chronic Tic Disorders.
Fernández de la Cruz, Lorena; Rydell, Mina; Runeson, Bo; Brander, Gustaf; Rück, Christian; D'Onofrio, Brian M; Larsson, Henrik; Lichtenstein, Paul; Mataix-Cols, David
2017-07-15
Persons with neuropsychiatric disorders are at increased risk of suicide, but there is little data concerning Tourette's and chronic tic disorders (TD/CTD). We aimed to quantify the risk of suicidal behavior in a large nationwide cohort of patients with TD/CTD, establish the contribution of psychiatric comorbidity to this risk, and identify predictors of suicide. Using a validated algorithm, we identified 7736 TD/CTD cases in the Swedish National Patient Register during a 44-year period (1969-2013). Using a matched case-cohort design, patients were compared with general population control subjects (1:10 ratio). Risk of suicidal behavior was estimated using conditional logistic regressions. Predictors of suicidal behavior in the TD/CTD cohort were studied using Cox regression models. In unadjusted models, TD/CTD patients, compared with control subjects, had an increased risk of both dying by suicide (odds ratio: 4.39; 95% confidence interval [CI]: 2.89-6.67) and attempting suicide (odds ratio: 3.86; 95% CI: 3.50-4.26). After adjusting for psychiatric comorbidities, the risk was reduced but remained substantial. Persistence of tics beyond young adulthood and a previous suicide attempt were the strongest predictors of death by suicide in TD/CTD patients (hazard ratio: 11.39; 95% CI: 3.71-35.02, and hazard ratio: 5.65; 95% CI: 2.21-14.42, respectively). TD/CTD are associated with substantial risk of suicide. Suicidal behavior should be monitored in these patients, particularly in those with persistent tics, history of suicide attempts, and psychiatric comorbidities. Preventive and intervention strategies aimed to reduce the suicidal risk in this group are warranted. Copyright © 2016 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
Increased risk of Bell palsy in patients with migraine: a nationwide cohort study.
Peng, Kuan-Po; Chen, Yung-Tai; Fuh, Jong-Ling; Tang, Chao-Hsiun; Wang, Shuu-Jiun
2015-01-13
To evaluate the association between migraine and Bell palsy and to examine the effects of age, sex, migraine subtype, and comorbid risk factors for Bell palsy. This nationwide cohort study was conducted using data from the Taiwan National Health Insurance Research Database. Subjects aged 18 years or older with neurologist-diagnosed migraine from 2005 to 2009 were included. A nonheadache age- and propensity score-matched control cohort was selected for comparison. All subjects were followed until the end of 2010, death, or the occurrence of a Bell palsy event. Cox proportional hazards regression was used to calculate the adjusted hazard ratios and 95% confidence intervals to compare the risk of Bell palsy between groups. Both cohorts (n = 136,704 each) were followed for a mean of 3.2 years. During the follow-up period, 671 patients (424,372 person-years) in the migraine cohort and 365 matched control subjects (438,677 person-years) were newly diagnosed with Bell palsy (incidence rates, 158.1 and 83.2/100,000 person-years, respectively). The adjusted hazard ratio for Bell palsy was 1.91 (95% confidence interval, 1.68-2.17; p < 0.001). The association between migraine and Bell palsy remained significant in sensitivity analyses, and tests of interaction failed to reach significance in all subgroup analyses. Migraine is a previously unidentified risk factor for Bell palsy. The association between these 2 conditions suggests a linked disease mechanism, which is worthy of further exploration. © 2014 American Academy of Neurology.
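The abstract's incidence figures can be checked directly: the crude rate ratio implied by the reported events and person-years is close to the adjusted hazard ratio of 1.91, consistent with the propensity-score matching having removed most measured confounding. A minimal sketch using the numbers from the abstract:

```python
def incidence_rate(events, person_years, per=100_000):
    """Incidence rate per `per` person-years."""
    return events / person_years * per

migraine_rate = incidence_rate(671, 424_372)  # ~158.1 per 100,000 person-years
control_rate = incidence_rate(365, 438_677)   # ~83.2 per 100,000 person-years
crude_ratio = migraine_rate / control_rate    # ~1.90, near the adjusted 1.91
```

The small gap between the crude ratio and the adjusted Cox estimate is expected, since the Cox model additionally accounts for censoring over the follow-up period.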
Understanding the Risk Factors of Trauma Center Closures
Shen, Yu-Chu; Hsia, Renee Y.; Kuzma, Kristen
2011-01-01
Objectives: We analyze whether hazard rates of shutting down trauma centers are higher due to financial pressures or in areas with vulnerable populations (such as minorities or the poor). Materials and Methods: This is a retrospective study of all hospitals with trauma center services in urban areas in the continental US between 1990 and 2005, identified from the American Hospital Association Annual Surveys. These data were linked with Medicare cost reports and supplemented with other sources, including the Area Resource File. We analyze the hazard rates of trauma center closures across several dimensions of risk factors using discrete-time proportional hazard models. Results: The number of trauma center closures increased from 1990 to 2005, with a total of 339 during this period. The hazard rate of closing trauma centers in hospitals with a negative profit margin is 1.38 times higher than in hospitals without a negative profit margin (P < 0.01). Hospitals receiving more generous Medicare reimbursements face a lower hazard of shutting down trauma centers (ratio: 0.58, P < 0.01) than those receiving below-average reimbursement. Hospitals in areas with higher health maintenance organization penetration face a higher hazard of trauma center closure (ratio: 2.06, P < 0.01). Finally, hospitals in areas with higher shares of minorities face a higher risk of trauma center closure (ratio: 1.69, P < 0.01). Medicaid load and uninsured populations, however, are not risk factors for higher rates of closure after we control for other financial and community characteristics. Conclusions: Our findings indicate how current proposals to cut public spending could exacerbate trauma center closures, particularly in areas with high shares of minorities. In addition, given the negative effect of health maintenance organizations on trauma center survival, the growth of the Medicaid managed-care population should be monitored.
Finally, high shares of Medicaid or uninsured patients are not by themselves independent risk factors for higher closure as long as financial pressures are mitigated. Targeted policy interventions and further research on the causes are needed to address these systems-level disparities. PMID:19704354
Morita, Kazunori; Saruwatari, Junji; Tanaka, Takahiro; Oniki, Kentaro; Kajiwara, Ayami; Miyazaki, Hiroko; Yoshida, Akira; Jinnouchi, Hideaki; Nakagawa, Kazuko
2017-02-01
This study investigated the associations between common hepatocyte nuclear factor-1A (HNF1A) variants and the risk of diabetic retinopathy (DR) in relation to glycemic control and weight status. A retrospective longitudinal analysis was conducted among 354 Japanese patients with type 2 diabetes mellitus (T2DM) (mean follow-up duration: 5.8±2.5 years). The multivariable-adjusted hazard ratio (HR) for the cumulative incidence of DR was calculated using a Cox proportional hazards model. During the observation period, the longitudinal associations of the HNF1A diplotypes with the risk of DR and the clinical parameters were also analyzed using the generalized estimating equations approach. The combination of risk variants, i.e., rs1169288-C, rs1183910-A, and rs2464196-A, was defined as the H1 haplotype. The incidence of DR was higher in the H1/H1 diplotype cases than in the others (HR 2.75 vs. non-H1/non-H1; p=0.02). In normal-weight subjects only, the risks of DR and poor glycemic control were higher in the H1/H1 diplotype cases than in the others (odds ratio 4.08 vs. non-H1/non-H1, p=0.02, and odds ratio 3.03, p=0.01, respectively). This study demonstrated that the common HNF1A diplotype of three risk variants may be an independent risk factor for the development of DR resulting from poor glycemic control in normal-weight patients with T2DM. These results need to be replicated in larger and more varied study populations.
Effect on injuries of assigning shoes based on foot shape in air force basic training.
Knapik, Joseph J; Brosch, Lorie C; Venuto, Margaret; Swedler, David I; Bullock, Steven H; Gaines, Lorraine S; Murphy, Ryan J; Tchandja, Juste; Jones, Bruce H
2010-01-01
This study examined whether assigning running shoes based on the shape of the bottom of the foot (plantar surface) influenced injury risk in Air Force Basic Military Training (BMT) and examined risk factors for injury in BMT. Data were collected from BMT recruits during 2007; analysis took place during 2008. After foot examinations, recruits were randomly assigned to either an experimental group (E, n=1042 men, 375 women) or a control group (C, n=913 men, 346 women). Experimental-group recruits were assigned motion-control, stability, or cushioned shoes for plantar shapes indicative of low, medium, or high arches, respectively. Control-group recruits received a stability shoe regardless of plantar shape. Injuries during BMT were determined from outpatient visits provided by the Defense Medical Surveillance System. Other injury risk factors (fitness, smoking, physical activity, prior injury, menstrual history, and demographics) were obtained from a questionnaire, existing databases, or BMT units. Multivariate Cox regression controlling for other risk factors showed little difference in injury risk between the groups among men (hazard ratio [E/C]=1.11, 95% CI=0.89-1.38) or women (hazard ratio [E/C]=1.20, 95% CI=0.90-1.60). Independent injury risk factors among both men and women included low aerobic fitness and cigarette smoking. This prospective study demonstrated that assigning running shoes based on the shape of the plantar surface had little influence on injury risk in BMT, even after controlling for other injury risk factors.
Health Literacy and Access to Kidney Transplantation
Grubbs, Vanessa; Gregorich, Steven E.; Perez-Stable, Eliseo J.; Hsu, Chi-yuan
2009-01-01
Background and objectives: Few studies have examined health literacy in patients with end-stage kidney disease. We hypothesized that inadequate health literacy in a hemodialysis population is common and is associated with poorer access to kidney transplant wait-lists. Design, setting, participants, & measurements: We enrolled 62 Black and White maintenance hemodialysis patients aged 18 to 75. We measured health literacy using the short-form Test of Functional Health Literacy in Adults. Our primary outcomes were (1) time from dialysis start date to referral date for kidney transplant evaluation and (2) time from referral date to date placed on the kidney transplant wait-list. We used Cox proportional hazards models to examine the association between health literacy (adequate versus inadequate) and our outcomes after controlling for demographics and comorbid conditions. Results: Roughly one third (32.3%) of participants had inadequate health literacy. Forty-seven participants (75.8%) were referred for transplant evaluation. Among those referred, 40 (85.1%) were wait-listed. Participants with inadequate health literacy had a 78% lower hazard of referral for transplant evaluation than those with adequate health literacy (adjusted hazard ratio [AHR] 0.22; 95% confidence interval 0.08, 0.60; P = 0.003). The hazard ratio of being wait-listed by health literacy was not statistically different (AHR 0.80; 95% CI 0.39, 1.61; P = 0.5). Conclusions: Inadequate health literacy is common in our hemodialysis patient population and is associated with a lower hazard of referral for transplant evaluation. Strategies to reduce the impact of health literacy on the kidney transplant process should be explored. PMID:19056617
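For readers translating between the two phrasings used in the abstract above, an adjusted hazard ratio of 0.22 corresponds to the quoted "78% lower hazard"; this minimal sketch (the helper name is ours, purely illustrative) shows the conversion:

```python
# Convert a hazard ratio below 1 into a "percent lower hazard" figure.
# The AHR value of 0.22 is taken from the abstract above.
def percent_lower_hazard(hr):
    """Percent reduction in hazard implied by a hazard ratio < 1."""
    return round((1 - hr) * 100)

print(percent_lower_hazard(0.22))  # 78, matching the "78% lower hazard" phrasing
```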
Prognostic impact of metastatic pattern in stage IV breast cancer at initial diagnosis.
Leone, Bernardo Amadeo; Vallejo, Carlos Teodoro; Romero, Alberto Omar; Machiavelli, Mario Raúl; Pérez, Juan Eduardo; Leone, Julieta; Leone, José Pablo
2017-02-01
To analyze the prognostic influence of metastatic pattern (MP) compared with other biologic and clinical factors in stage IV breast cancer at initial diagnosis (BCID) and to evaluate factors associated with specific sites of metastases (SSM). We evaluated women with stage IV BCID with known metastatic sites, reported to the Surveillance, Epidemiology, and End Results program from 2010 to 2013. MP was categorized as bone-only, visceral, bone and visceral (BV), and other. Univariate and multivariate analyses determined the effects of each variable on overall survival (OS). Logistic regression examined factors associated with SSM. We included 9143 patients. Bone-only metastases represented 37.5% of patients, visceral 21.9%, BV 28.8%, and other 11.9%. Median OS by MP was as follows: bone 38 months, visceral 21 months, BV 19 months, and other 33 months (P < 0.0001). Univariate analysis showed that a higher number of metastatic sites conferred a worse prognosis. In multivariate analysis, older age (hazard ratio 1.9), black race (hazard ratio 1.17), grade 3/4 tumors (hazard ratio 1.6), triple-negative subtype (hazard ratio 2.24), BV MP (hazard ratio 2.07), and unmarried status (hazard ratio 1.25) were associated with significantly shorter OS. As compared with HR+/HER2- tumors, triple-negative and HR-/HER2+ tumors had higher odds of brain, liver, lung, and other metastases. HR+/HER2+ tumors had higher odds of liver metastases. All three subtypes had lower odds of bone metastases. There were substantial differences in OS according to MP. Tumor subtypes have a clear influence, among other factors, on SSM. We identified several prognostic factors that could guide therapy selection in treatment-naïve patients.
Brasky, Theodore M.; Sponholtz, Todd R.; Palmer, Julie R.; Rosenberg, Lynn; Ruiz-Narváez, Edward A.; Wise, Lauren A.
2016-01-01
Dietary long-chain (LC) ω-3 polyunsaturated fatty acids (PUFAs), which derive primarily from intakes of fatty fish, are thought to inhibit inflammation and de novo estrogen synthesis. This study prospectively examined the associations of dietary LC ω-3 PUFAs and fish with endometrial cancer risk in 47,602 African-American women living in the United States, aged 21–69 years at baseline in 1995, and followed them until 2013 (n = 282 cases). Multivariable-adjusted Cox regression models estimated hazard ratios and 95% confidence intervals for associations of LC ω-3 PUFA (quintiled) and fish (quartiled) intake with endometrial cancer risk, overall and by body mass index (BMI; weight (kg)/height (m)²). The hazard ratio for quintile 5 of total dietary LC ω-3 PUFAs versus quintile 1 was 0.79 (95% confidence interval (CI): 0.51, 1.24); there was no linear trend. Hazard ratios for the association were smaller among normal-weight women (BMI <25: hazard ratio (HR) = 0.53, 95% CI: 0.18, 1.58) than among overweight/obese women (BMI ≥25: HR = 0.88, 95% CI: 0.54, 1.43), but these differences were not statistically significant. Fish intake was also not associated with risk (quartile 4 vs. quartile 1: HR = 0.86, 95% CI: 0.56, 1.31). Again, hazard ratios were smaller among normal-weight women (HR = 0.65) than among overweight/obese women (HR = 0.94). While compatible with no association, the hazard ratios observed among leaner African-American women are similar to those from recent prospective studies conducted in predominantly white populations. PMID:26755676
Wicks, Susanne; Hjern, Anders; Dalman, Christina
2010-10-01
Recent studies suggest a role for social factors during childhood in the later development of schizophrenia. Since social conditions in childhood are closely related to parental psychiatric illness, there is a need to disentangle how genes and social environmental factors interact. A total of 13,163 children born in Sweden between 1955 and 1984 and reared in Swedish adoptive families were linked to the National Patient Register until 2006 regarding admissions for nonaffective psychoses, including schizophrenia. Hazard ratios for nonaffective psychoses were estimated in relation to three indicators of socioeconomic position in childhood (household data of the rearing family obtained via linkage to the National Censuses of 1960-1985) and in relation to an indicator of genetic liability (biological parental inpatient care for psychosis). In addition, the total Swedish-born population was investigated. Increased risks for nonaffective psychosis were found among adoptees (without a biological parental history of psychosis) reared in families with a disadvantaged socioeconomic position, namely adoptive parental unemployment (hazard ratio=2.0), single-parent household (hazard ratio=1.2), and living in apartments (hazard ratio=1.3). The risk was also increased among persons with genetic liability for psychosis alone (hazard ratio=4.7). Among those exposed to both genetic liability and a disadvantaged socioeconomic situation in childhood, the risk was considerably higher (hazard ratio=15.0, 10.3, and 5.7 for parental unemployment, single-parent household, and apartment living, respectively). Analyses in the larger population supported these results. The results indicate that children reared in families with a disadvantaged socioeconomic position have an increased risk for psychosis. There was also some support for an interaction effect, suggesting that social disadvantage increases this risk more in children with genetic liability for psychosis.
Relationships between exercise, smoking habit and mortality in more than 100,000 adults.
O'Donovan, Gary; Hamer, Mark; Stamatakis, Emmanuel
2017-04-15
Exercise is associated with reduced risks of all-cause, cardiovascular disease (CVD), and cancer mortality; however, the benefits in smokers and ex-smokers are unclear. The aim of this study was to investigate associations between exercise, smoking habit, and mortality. Self-reported exercise and smoking, and all-cause, CVD, and cancer mortality were assessed in 106,341 adults in the Health Survey for England and the Scottish Health Survey. There were 9149 deaths from all causes, 2839 from CVD, and 2634 from cancer during 999,948 person-years of follow-up. Greater amounts of exercise were associated with decreases, and greater amounts of smoking with increases, in the risks of mortality from all causes, CVD, and cancer. There was no statistically significant evidence of biological interaction; rather, the relative risks of all-cause mortality were additive. In the subgroup of 26,768 ex-smokers, the all-cause mortality hazard ratio was 0.70 (95% CI 0.60, 0.80), the CVD mortality hazard ratio was 0.71 (0.55, 0.92), and the cancer mortality hazard ratio was 0.66 (0.52, 0.84) in those who exercised compared to those who did not. In the subgroup of 28,440 smokers, the all-cause mortality hazard ratio was 0.69 (0.57, 0.83), the CVD mortality hazard ratio was 0.66 (0.45, 0.96), and the cancer mortality hazard ratio was 0.69 (0.51, 0.94) in those who exercised compared to those who did not. Given that an outright smoking ban is unlikely, this study is important because it suggests exercise reduces the risks of all-cause, CVD, and cancer mortality by around 30% in smokers and ex-smokers.
Association between GFR Estimated by Multiple Methods at Dialysis Commencement and Patient Survival
Wong, Muh Geot; Pollock, Carol A.; Cooper, Bruce A.; Branley, Pauline; Collins, John F.; Craig, Jonathan C.; Kesselhut, Joan; Luxton, Grant; Pilmore, Andrew; Harris, David C.
2014-01-01
Background and objectives: The Initiating Dialysis Early and Late study showed that planned early or late initiation of dialysis, based on the Cockcroft and Gault estimation of GFR, was associated with identical clinical outcomes. This study examined the association of all-cause mortality with estimated GFR at dialysis commencement, which was determined using multiple formulas. Design, setting, participants, & measurements: Initiating Dialysis Early and Late trial participants were stratified into tertiles according to the estimated GFR measured by the Cockcroft and Gault, Modification of Diet in Renal Disease, or Chronic Kidney Disease-Epidemiology Collaboration formula at dialysis commencement. Patient survival was determined using multivariable Cox proportional hazards regression. Results: Only Initiating Dialysis Early and Late trial participants who commenced dialysis were included in this study (n=768). A total of 275 patients died during the study. After adjustment for age, sex, racial origin, body mass index, diabetes, and cardiovascular disease, no significant differences in survival were observed between estimated GFR tertiles determined by the Cockcroft and Gault (lowest tertile adjusted hazard ratio, 1.11; 95% confidence interval, 0.82 to 1.49; middle tertile hazard ratio, 1.29; 95% confidence interval, 0.96 to 1.74; highest tertile reference), Modification of Diet in Renal Disease (lowest tertile hazard ratio, 0.88; 95% confidence interval, 0.63 to 1.24; middle tertile hazard ratio, 1.20; 95% confidence interval, 0.90 to 1.61; highest tertile reference), and Chronic Kidney Disease-Epidemiology Collaboration equations (lowest tertile hazard ratio, 0.93; 95% confidence interval, 0.67 to 1.27; middle tertile hazard ratio, 1.15; 95% confidence interval, 0.86 to 1.54; highest tertile reference). Conclusion: Estimated GFR at dialysis commencement was not significantly associated with patient survival, regardless of the formula used.
However, a clinically important association cannot be excluded, because observed confidence intervals were wide. PMID:24178976
Holma, K Mikael; Melartin, Tarja K; Haukka, Jari; Holma, Irina A K; Sokero, T Petteri; Isometsä, Erkki T
2010-07-01
Prospective long-term studies of risk factors for suicide attempts among patients with major depressive disorder have not investigated the course of illness and state at the time of the act. Therefore, the importance of state factors, particularly time spent in risk states, for overall risk remains unknown. In the Vantaa Depression Study, a longitudinal 5-year evaluation of psychiatric patients with major depressive disorder, prospective information on 249 patients (92.6%) was available. Time spent in depressive states and the timing of suicide attempts were investigated with life charts. During the follow-up period, there were 106 suicide attempts per 1,018 patient-years. The incidence rate per 1,000 patient-years was 332 (95% confidence interval [CI]=258.6-419.2) during major depressive episodes, 21-fold the rate during full remission (16 [95% CI=11.2-40.2]), and 62 (95% CI=34.6-92.4) during partial remission, a fourfold difference. In the Cox proportional hazards model, suicide attempts were predicted by the months spent in a major depressive episode (hazard ratio=7.74 [95% CI=3.40-17.6]) or in partial remission (hazard ratio=4.20 [95% CI=1.71-10.3]), history of suicide attempts (hazard ratio=4.39 [95% CI=1.78-10.8]), age (hazard ratio=0.94 [95% CI=0.91-0.98]), lack of a partner (hazard ratio=2.33 [95% CI=0.97-5.56]), and low perceived social support (hazard ratio=3.57 [95% CI=1.09-11.1]). The adjusted population attributable fraction of time spent depressed for suicide attempts was 78%. Among patients with major depressive disorder, the incidence of suicide attempts varies markedly depending on the level of depression, being highest during major depressive episodes. Although previous attempts and poor social support also indicate risk, the time spent depressed is likely the major factor determining overall long-term risk.
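The fold-differences quoted in the abstract above follow directly from the per-1,000-patient-year rates it reports; this sketch (variable names are ours, purely illustrative) verifies the arithmetic:

```python
# Suicide-attempt incidence rates per 1,000 patient-years, from the abstract:
# 332 during major depressive episodes, 62 in partial remission, 16 in full remission.
mde, partial, full = 332, 62, 16

print(round(mde / full))      # 21 -> the "21-fold" figure (332/16 = 20.75)
print(round(partial / full))  # 4  -> the "fourfold" figure (62/16 = 3.875)
```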
Hansen, Richard A.; Khodneva, Yulia; Glasser, Stephen P.; Qian, Jingjing; Redmond, Nicole; Safford, Monika M.
2018-01-01
Background: Mixed evidence suggests second-generation antidepressants may increase the risk of cardiovascular and cerebrovascular events. Objective: To assess whether antidepressant use is associated with acute coronary heart disease, stroke, cardiovascular disease death, and all-cause mortality. Methods: Secondary analyses of the Reasons for Geographic and Racial Differences in Stroke (REGARDS) longitudinal cohort study were conducted. Use of selective serotonin reuptake inhibitors, serotonin and norepinephrine reuptake inhibitors, bupropion, nefazodone, and trazodone was measured during the baseline (2003-2007) in-home visit. Outcomes of coronary heart disease, stroke, cardiovascular disease death, and all-cause mortality were assessed every 6 months and adjudicated by medical record review. Cox proportional hazards time-to-event analysis followed patients until their first event on or before December 31, 2011, iteratively adjusting for covariates. Results: Among 29,616 participants, 3,458 (11.7%) used an antidepressant of interest. Intermediate models adjusting for everything but physical and mental health found an increased risk of acute coronary heart disease (hazard ratio=1.21; 95% CI 1.04-1.41), stroke (hazard ratio=1.28; 95% CI 1.02-1.60), cardiovascular disease death (hazard ratio=1.29; 95% CI 1.09-1.53), and all-cause mortality (hazard ratio=1.27; 95% CI 1.15-1.41) for antidepressant users. Risk estimates trended in this direction for all outcomes in the fully adjusted model, but only remained statistically associated with increased risk of all-cause mortality (hazard ratio=1.12; 95% CI 1.01-1.24). In sensitivity analyses censoring follow-up time at 2 years, the hazard ratio was 1.37 (95% CI 1.11-1.68). Conclusions: In fully adjusted models, antidepressant use was associated with a small increase in all-cause mortality. PMID:26783360
Elevated Plasma CXCL12α Is Associated with a Poorer Prognosis in Pulmonary Arterial Hypertension
Li, Lili; O’Connell, Caroline; Codd, Mary; Lawrie, Allan; Morton, Allison; Kiely, David G.; Condliffe, Robin; Elliot, Charles; McLoughlin, Paul; Gaine, Sean
2015-01-01
Rationale: Recent work in preclinical models suggests that signalling via the pro-angiogenic and pro-inflammatory cytokine CXCL12 (SDF-1) plays an important pathogenic role in pulmonary hypertension (PH). The objective of this study was to establish whether circulating concentrations of CXCL12α were elevated in patients with PAH and related to mortality. Methods: Plasma samples were collected from patients with idiopathic pulmonary arterial hypertension (IPAH) and PAH associated with connective tissue diseases (CTD-PAH) attending two pulmonary hypertension referral centres (n = 95) and from age- and gender-matched healthy controls (n = 44). Patients were subsequently monitored throughout a period of five years. Results: CXCL12α concentrations were elevated in PAH groups compared to controls (P<0.05), and receiver-operating-characteristic analysis showed that plasma CXCL12α concentrations discriminated patients from healthy controls (AUC 0.80, 95% confidence interval 0.73-0.88). Kaplan-Meier analysis indicated that elevated plasma CXCL12α concentration was associated with reduced survival (P<0.01). A multivariate Cox proportional hazards model showed that elevated CXCL12α independently predicted (P<0.05) earlier death in PAH, with a hazard ratio (95% confidence interval) of 2.25 (1.01-5.00). In the largest subset by WHO functional class (Class 3, 65% of patients), elevated CXCL12α independently predicted (P<0.05) earlier death, hazard ratio 2.27 (1.05-4.89). Conclusions: Our data show that elevated concentrations of circulating CXCL12α in PAH predicted poorer survival. Furthermore, elevated circulating CXCL12α was an independent risk factor for death that could potentially be included in a prognostic model and guide therapy. PMID:25856504
Rades, Dirk; Sehmisch, Lena; Janssen, Stefan; Schild, Steven E
2016-12-01
Many patients with brain metastases from melanoma receive whole-brain radiotherapy (WBRT). WBRT regimens must consider the patient's prognosis in order to deliver the best therapy. Seven factors were correlated with intracerebral control and survival after WBRT alone in 92 patients with melanoma: WBRT regimen, age at WBRT, gender, Karnofsky performance score (KPS), number of brain lesions, number of extracranial metastatic sites, and time from melanoma diagnosis to WBRT. On univariate analyses, KPS ≥80 (p=0.075) showed a trend towards improved intracerebral control. Greater WBRT dose (p=0.029), age ≤60 years (p=0.002), KPS ≥80 (p<0.001), and no extracranial site (p=0.008) were positively associated with survival. On multivariate analyses, KPS (hazard ratio=2.11, 95% confidence interval=1.28-3.47; p=0.003) and number of extracranial metastatic sites (hazard ratio=1.27, 95% confidence interval=1.02-1.56; p=0.030) maintained significance regarding survival. The study identified predictors of survival for patients with melanoma receiving WBRT for brain metastases that can contribute to the selection of individualized therapies.
Parkinson disease and musculoskeletal pain: an 8-year population-based cohort study.
Lien, Wei-Hung; Lien, Wei-Chih; Kuan, Ta-Shen; Wu, Shang-Te; Chen, Yi-Ting; Chiu, Ching-Ju
2017-07-01
The aim of this study was to evaluate the incidence and clinical features of musculoskeletal pain (MSP) in patients with Parkinson disease (PD) compared with a control group without the disease. This retrospective cohort study used a subset of the Taiwan National Health Insurance Research Database (NHIRD) comprising information on 1 million beneficiaries randomly sampled from the entire population of Taiwan. A total of 490 patients aged 50 and above with newly diagnosed Parkinson disease were identified during the period from 2000 to 2005. Among them, 199 developed MSP after PD onset. The control group consisted of 1960 participants without PD over the study period, randomly selected by matching PD cases according to the date of PD incidence, age, and sex. The study groups were then followed to the end of 2007. Musculoskeletal pain was the end point. The incidence rate of MSP was higher in the PD group than in the control group, corresponding to an adjusted hazard ratio of 1.31 (95% confidence interval 1.09 to 1.58). PD was associated with a significantly elevated risk of MSP in all sex and age stratifications, with the highest hazard ratio noted for middle-aged male patients with PD, followed by older male patients with PD. This study showed that PD may significantly increase the risk of developing MSP. The risk of developing MSP seems to be greatest for middle-aged male patients with PD. Clinicians should be more alert for MSP in patients with PD, and early intervention should be considered.
Grellety, Emmanuel; Babakazo, Pélagie; Bangana, Amina; Mwamba, Gustave; Lezama, Ines; Zagre, Noël Marie; Ategbo, Eric-Alain
2017-04-26
Cash transfer programs (CTPs) aim to strengthen financial security for vulnerable households. This potentially enables improvements in diet, hygiene, health service access, and investment in food production or income generation. The effect of CTPs on the outcome of children already severely malnourished is not well delineated. The objective of this study was to test whether CTPs would improve the outcome of children treated for severe acute malnutrition (SAM) in the Democratic Republic of the Congo over 6 months. We conducted a cluster-randomised controlled trial in children with uncomplicated SAM who received treatment according to the national protocol and counselling, with or without a cash supplement of US$40 monthly for 6 months. Analyses were by intention to treat. The hazard ratio of reaching full recovery from SAM was 35% higher in the intervention group than in the control group (adjusted hazard ratio 1.35, 95% confidence interval (CI) = 1.10 to 1.69, P = 0.007). The adjusted hazard ratios in the intervention group for relapse to moderate acute malnutrition (MAM) and SAM were 0.21 (95% CI = 0.11 to 0.41, P = 0.001) and 0.30 (95% CI = 0.16 to 0.58, P = 0.001), respectively. Non-response and defaulting were lower when the households received cash. All the nutritional outcomes in the intervention group were significantly better than those in the control group. After 6 months, 80% of cash-intervened children had regained their mid-upper arm circumference measurements and weight-for-height/length Z-scores and showed evidence of catch-up. Less than 40% of the control group had a fully successful outcome, with many deteriorating after discharge. There was a significant increase in diet diversity and food consumption scores for both groups from baseline; the increase was significantly greater in the intervention group than in the control group. CTPs can increase recovery from SAM and decrease default, non-response, and relapse rates during and following treatment.
Household developmental support is critical in food insecure areas to maximise the efficiency of SAM treatment programs. ClinicalTrials.gov, NCT02460848. Registered on 27 May 2015.
Pérez de Prado, Armando; López-Gómez, Juan M.; Quiroga, Borja; Goicoechea, Marian; García-Prieto, Ana; Torres, Esther; Reque, Javier; Luño, José
2016-01-01
Background and objectives: Supraventricular arrhythmias are associated with high morbidity and mortality. Nevertheless, this condition has received little attention in patients on hemodialysis. The objective of this study was to analyze the incidence of intradialysis supraventricular arrhythmia and its long-term prognostic value. Design, setting, participants, & measurements: We designed an observational and prospective study in a cohort of patients on hemodialysis with a 10-year follow-up period. All patients were recruited for study participation and were not recruited for clinical indications. The study population comprised 77 patients (42 men and 35 women; mean age = 58±15 years) with sinus rhythm monitored using a Holter electrocardiogram over six consecutive hemodialysis sessions at recruitment. Results: Hypertension was present in 68.8% of patients, and diabetes was present in 29.9% of patients. Supraventricular arrhythmias were recorded in 38 patients (49.3%); all of these were short, asymptomatic, and self-limiting. Age (hazard ratio, 1.04 per year; 95% confidence interval, 1.00 to 1.08) and right atrial enlargement (hazard ratio, 4.29; 95% confidence interval, 1.30 to 14.09) were associated with supraventricular arrhythmia in the multivariate analysis. During a median follow-up of 40 months, 57 patients died, and cardiovascular disease was the main cause of death (52.6%). The variables associated with all-cause mortality in the Cox model were age (hazard ratio, 1.04 per year; 95% confidence interval, 1.00 to 1.08), C-reactive protein (hazard ratio, 1.04 per 1 mg/L; 95% confidence interval, 1.00 to 1.08), and supraventricular arrhythmia (hazard ratio, 3.21; 95% confidence interval, 1.29 to 7.96). 
Patients with supraventricular arrhythmia also had a higher risk of nonfatal cardiovascular events (hazard ratio, 4.32; 95% confidence interval, 2.11 to 8.83) and symptomatic atrial fibrillation during follow-up (hazard ratio, 17.19; 95% confidence interval, 2.03 to 145.15). Conclusions: The incidence of intradialysis supraventricular arrhythmia was high in our hemodialysis study population. Supraventricular arrhythmias were short, asymptomatic, and self-limiting, and although silent, these arrhythmias were independently associated with mortality and cardiovascular events. PMID:27697781
Verde, Eduardo; Pérez de Prado, Armando; López-Gómez, Juan M; Quiroga, Borja; Goicoechea, Marian; García-Prieto, Ana; Torres, Esther; Reque, Javier; Luño, José
2016-12-07
Supraventricular arrhythmias are associated with high morbidity and mortality. Nevertheless, this condition has received little attention in patients on hemodialysis. The objective of this study was to analyze the incidence of intradialysis supraventricular arrhythmia and its long-term prognostic value. We designed an observational and prospective study in a cohort of patients on hemodialysis with a 10-year follow-up period. All patients were recruited for study participation and were not recruited for clinical indications. The study population comprised 77 patients (42 men and 35 women; mean age = 58±15 years) with sinus rhythm monitored using a Holter electrocardiogram over six consecutive hemodialysis sessions at recruitment. Hypertension was present in 68.8% of patients, and diabetes was present in 29.9% of patients. Supraventricular arrhythmias were recorded in 38 patients (49.3%); all of these were short, asymptomatic, and self-limiting. Age (hazard ratio, 1.04 per year; 95% confidence interval, 1.00 to 1.08) and right atrial enlargement (hazard ratio, 4.29; 95% confidence interval, 1.30 to 14.09) were associated with supraventricular arrhythmia in the multivariate analysis. During a median follow-up of 40 months, 57 patients died, and cardiovascular disease was the main cause of death (52.6%). The variables associated with all-cause mortality in the Cox model were age (hazard ratio, 1.04 per year; 95% confidence interval, 1.00 to 1.08), C-reactive protein (hazard ratio, 1.04 per 1 mg/L; 95% confidence interval, 1.00 to 1.08), and supraventricular arrhythmia (hazard ratio, 3.21; 95% confidence interval, 1.29 to 7.96). Patients with supraventricular arrhythmia also had a higher risk of nonfatal cardiovascular events (hazard ratio, 4.32; 95% confidence interval, 2.11 to 8.83) and symptomatic atrial fibrillation during follow-up (hazard ratio, 17.19; 95% confidence interval, 2.03 to 145.15). 
The incidence of intradialysis supraventricular arrhythmia was high in our hemodialysis study population. Supraventricular arrhythmias were short, asymptomatic, and self-limiting, and although silent, these arrhythmias were independently associated with mortality and cardiovascular events. Copyright © 2016 by the American Society of Nephrology.
Herbst, Christine; Rehan, Fareed A.; Brillant, Corinne; Bohlius, Julia; Skoetz, Nicole; Schulz, Holger; Monsef, Ina; Specht, Lena; Engert, Andreas
2010-01-01
Combined modality treatment (CMT) of chemotherapy followed by localized radiotherapy is standard treatment for patients with early stage Hodgkin’s lymphoma. However, the role of radiotherapy has been questioned recently and some clinical study groups advocate chemotherapy only for this indication. We thus performed a systematic review with meta-analysis of randomized controlled trials comparing chemotherapy alone with CMT in patients with early stage Hodgkin’s lymphoma with respect to response rate, tumor control and overall survival (OS). We searched Medline, EMBASE and the Cochrane Library as well as conference proceedings from January 1980 to February 2009 for randomized controlled trials comparing chemotherapy alone versus the same chemotherapy regimen plus radiotherapy. Progression free survival and similar outcomes were analyzed together as tumor control. Effect measures used were hazard ratios for OS and tumor control as well as relative risks for complete response (CR). Meta-analyses were performed using RevMan5. Five randomized controlled trials involving 1,245 patients were included. The hazard ratio (HR) was 0.41 (95% confidence interval (CI) 0.25 to 0.66) for tumor control and 0.40 (95% CI 0.27 to 0.59) for OS for patients receiving CMT compared to chemotherapy alone. CR rates were similar between treatment groups. In sensitivity analyses another 6 trials were included that did not fulfill the inclusion criteria of our protocol but were considered relevant to the topic. These trials underlined the results of the main analysis. In conclusion, adding radiotherapy to chemotherapy improves tumor control and OS in patients with early stage Hodgkin’s lymphoma. PMID:19951972
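The pooling step described in the meta-analysis above (study-level hazard ratios combined into one estimate) can be illustrated with a fixed-effect inverse-variance computation on the log scale. This is a hedged sketch with made-up study values, not the review's RevMan5 analysis; the HRs and CIs below are hypothetical placeholders:

```python
import numpy as np

# (HR, lower 95% CI, upper 95% CI) per study -- hypothetical values only
studies = [(0.45, 0.22, 0.92), (0.38, 0.19, 0.76), (0.40, 0.18, 0.89)]

log_hr = np.log([s[0] for s in studies])
# standard error recovered from the CI width on the log scale
se = (np.log([s[2] for s in studies]) - np.log([s[1] for s in studies])) / (2 * 1.96)
w = 1.0 / se**2                          # inverse-variance weights
mean = np.sum(w * log_hr) / np.sum(w)    # pooled log hazard ratio
half = 1.96 / np.sqrt(np.sum(w))         # half-width of pooled 95% CI

pooled = float(np.exp(mean))
ci = (float(np.exp(mean - half)), float(np.exp(mean + half)))
print(f"pooled HR {pooled:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f})")
```

A random-effects model would additionally estimate between-study heterogeneity before weighting; the fixed-effect form above is the simplest version of the calculation.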
Yoon, Jin-Ha; Roh, Jaehoon; Kim, Chi-Nyon; Won, Jong-Uk
2016-01-01
Objectives: The aim of this study was to examine the relationship between noise exposure and risk of occupational injury. Materials and Methods: The Korean National Health and Nutrition Examination Survey was used for the current study. Self-report questionnaires were used to investigate occupational injury and exposure to noise, chemicals, and machines and equipment. Results: In separate analyses for occupation and occupational hazard, the proportion of occupational injuries increased according to severity of noise exposure (all P < 0.05). Compared to the non-exposure group, the respective odds ratios (95% confidence intervals) for occupational injury were 1.39 (1.07–1.80) and 1.67 (1.13–2.46) in the mild and severe noise exposure groups, after controlling for age, gender, sleep hours, work schedule (shift work), and exposure status to hazardous chemicals and hazardous machines and equipment. Conclusions: The current study highlights the association between noise exposure and risk of occupational injury. Furthermore, risk of occupational injury increased according to severity of noise exposure. PMID:27991467
Hazard ratio estimation and inference in clinical trials with many tied event times.
Mehrotra, Devan V; Zhang, Yiwei
2018-06-13
The medical literature contains numerous examples of randomized clinical trials with time-to-event endpoints in which large numbers of events accrued over relatively short follow-up periods, resulting in many tied event times. A generally common feature across such examples was that the logrank test was used for hypothesis testing and the Cox proportional hazards model was used for hazard ratio estimation. We caution that this common practice is particularly risky in the setting of many tied event times for two reasons. First, the estimator of the hazard ratio can be severely biased if the Breslow tie-handling approximation for the Cox model (the default in SAS and Stata software) is used. Second, the 95% confidence interval for the hazard ratio can include one even when the corresponding logrank test p-value is less than 0.05. To help establish a better practice, with applicability for both superiority and noninferiority trials, we use theory and simulations to contrast Wald and score tests based on well-known tie-handling approximations for the Cox model. Our recommendation is to report the Wald test p-value and corresponding confidence interval based on the Efron approximation. The recommended test is essentially as powerful as the logrank test, the accompanying point and interval estimates of the hazard ratio have excellent statistical properties even in settings with many tied event times, inferential alignment between the p-value and confidence interval is guaranteed, and implementation is straightforward using commonly used software. Copyright © 2018 John Wiley & Sons, Ltd.
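The Breslow-versus-Efron distinction above can be made concrete with a small self-contained sketch (not the authors' code): it maximizes the Cox partial likelihood for a single binary covariate under each tie-handling approximation, on simulated data coarsened to produce many tied event times. All data-generating values are arbitrary assumptions:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
n = 400
x = rng.integers(0, 2, n)                              # binary treatment covariate
t = rng.exponential(scale=np.where(x == 1, 0.7, 1.0))  # true hazard ratio ~1.43
time = np.ceil(t * 4)                                  # coarse rounding -> many ties
event = np.ones(n, dtype=bool)                         # no censoring, for simplicity

def neg_partial_loglik(beta, method):
    ll = 0.0
    for tj in np.unique(time[event]):
        deaths = (time == tj) & event        # D_j: subjects failing at t_j
        at_risk = time >= tj                 # R_j: risk set at t_j
        d = deaths.sum()
        s = x[deaths].sum()                  # covariate sum over the deaths
        risk_sum = np.exp(beta * x[at_risk]).sum()
        death_sum = np.exp(beta * x[deaths]).sum()
        if method == "breslow":              # same risk-set sum for every tied death
            ll += beta * s - d * np.log(risk_sum)
        else:                                # Efron: downweight the deaths gradually
            ll += beta * s - sum(
                np.log(risk_sum - (l / d) * death_sum) for l in range(d)
            )
    return -ll

hr = {}
for m in ("breslow", "efron"):
    fit = minimize_scalar(neg_partial_loglik, args=(m,),
                          bounds=(-4, 4), method="bounded")
    hr[m] = float(np.exp(fit.x))
print(hr)  # the two tie-handling rules give different hazard ratio estimates
```

With heavily tied data the two optima differ, illustrating why the paper's recommendation to prefer the Efron approximation (rather than the Breslow default in some software) matters for hazard ratio estimation.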
Neighborhood disadvantage and ischemic stroke: the Cardiovascular Health Study (CHS).
Brown, Arleen F; Liang, Li-Jung; Vassar, Stefanie D; Stein-Merkin, Sharon; Longstreth, W T; Ovbiagele, Bruce; Yan, Tingjian; Escarce, José J
2011-12-01
Neighborhood characteristics may influence the risk of stroke and contribute to socioeconomic disparities in stroke incidence. The objectives of this study were to examine the relationship between neighborhood socioeconomic status and incident ischemic stroke and examine potential mediators of these associations. We analyzed data from 3834 whites and 785 blacks enrolled in the Cardiovascular Health Study, a multicenter, population-based, longitudinal study of adults ages≥65 years from 4 US counties. The primary outcome was adjudicated incident ischemic stroke. Neighborhood socioeconomic status was measured using a composite of 6 census tract variables. Race-stratified multilevel Cox proportional hazard models were constructed adjusted for sociodemographic, behavioral, and biological risk factors. Among whites, in models adjusted for sociodemographic characteristics, stroke hazard was significantly higher among residents of neighborhoods in the lowest compared with the highest neighborhood socioeconomic status quartile (hazard ratio, 1.32; 95% CI, 1.01-1.72) with greater attenuation of the hazard ratio after adjustment for biological risk factors (hazard ratio, 1.16; 0.88-1.52) than for behavioral risk factors (hazard ratio, 1.30; 0.99-1.70). Among blacks, we found no significant associations between neighborhood socioeconomic status and ischemic stroke. Higher risk of incident ischemic stroke was observed in the most disadvantaged neighborhoods among whites, but not among blacks. The relationship between neighborhood socioeconomic status and stroke among whites appears to be mediated more strongly by biological than behavioral risk factors.
Yang, Ya-Hsu; Teng, Hao-Wei; Lai, Yen-Ting; Li, Szu-Yuan; Lin, Chih-Ching; Yang, Albert C; Chan, Hsiang-Lin; Hsieh, Yi-Hsuan; Lin, Chiao-Fan; Hsu, Fu-Ying; Liu, Chih-Kuang; Liu, Wen-Sheng
2015-01-01
Patients with late-onset depression (LOD) have been reported to run a higher risk of subsequent dementia. The present study was conducted to assess whether statins can reduce the risk of dementia in these patients. We used data from the National Health Insurance of Taiwan during 1996-2009. Standardized Incidence Ratios (SIRs) were calculated for LOD and subsequent dementia. The criteria for LOD diagnoses included age ≥65 years, diagnosis of depression after 65 years of age, at least three service claims, and treatment with antidepressants. The time-dependent Cox proportional hazards model was applied for multivariate analyses. Propensity scores with the one-to-one nearest-neighbor matching model were used to select matching patients for validation studies. Kaplan-Meier estimates were used to measure survival among patients with dementia after a diagnosis of LOD. In total, 45,973 patients aged ≥65 years were enrolled. The prevalence of LOD was 12.9% (5,952/45,973). Patients with LOD had a higher incidence of subsequent dementia compared with those without LOD (Odds Ratio: 2.785; 95% CI 2.619-2.958). Among patients with LOD, lipid-lowering agent (LLA) users (for at least 3 months) had a lower incidence of subsequent dementia than non-users (Hazard Ratio = 0.781, 95% CI 0.685-0.891). Nevertheless, only statin users showed a reduced risk of dementia (Hazard Ratio = 0.674, 95% CI 0.547-0.832) while users of other LLAs did not; this was further validated by Kaplan-Meier estimates after propensity scores with the one-to-one nearest-neighbor matching model were used to control for confounding factors. Statins may reduce the risk of subsequent dementia in patients with LOD.
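The one-to-one nearest-neighbor propensity-score matching mentioned in the abstract above can be sketched as follows. This is an illustrative reconstruction on simulated data, not the study's code; the confounders, sample size, and the with-replacement matching choice are all assumptions:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)
n = 1000
X = rng.normal(size=(n, 3))                             # hypothetical confounders
treated = rng.random(n) < 1 / (1 + np.exp(-X[:, 0]))    # treatment depends on X

# Step 1: propensity score = P(treated | X) from a logistic model.
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Step 2: for each treated unit, find the closest control by propensity score
# (with replacement here; 1:1 without replacement needs a greedy pass instead).
ctrl_idx = np.flatnonzero(~treated)
nn = NearestNeighbors(n_neighbors=1).fit(ps[ctrl_idx].reshape(-1, 1))
_, pos = nn.kneighbors(ps[treated].reshape(-1, 1))
matched_controls = ctrl_idx[pos.ravel()]

print(treated.sum(), "treated units matched to",
      len(set(matched_controls)), "distinct controls")
```

In practice a caliper (maximum allowed propensity-score distance) is often added so that poorly matched pairs are discarded rather than kept.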
[TECOS: confirmation of the cardiovascular safety of sitagliptin].
Scheen, A J; Paquot, N
2015-10-01
The cardiovascular safety of sitagliptin has been evaluated in TECOS ("Trial Evaluating Cardiovascular Outcomes with Sitagliptin"). TECOS recruited patients with type 2 diabetes and a history of cardiovascular disease who received, as add-on to their usual therapy, either sitagliptin (n = 7,257) or placebo (n = 7,266), with a median follow-up of 3 years. The primary cardiovascular outcome was a composite of cardiovascular death, nonfatal myocardial infarction, nonfatal stroke, or hospitalization for unstable angina. Sitagliptin was noninferior to placebo for the primary composite cardiovascular outcome (hazard ratio, 0.98; 95% confidence interval, 0.88 to 1.09; P<0.001). Rates of hospitalization for heart failure did not differ between the two groups (hazard ratio, 1.00; 95% CI, 0.83 to 1.20; P=0.98). The cardiovascular safety of sitagliptin, which was already shown in meta-analyses of phase II-III randomised controlled trials and in real-life observational cohort studies, is now confirmed in the landmark prospective cardiovascular outcome study TECOS.
Bruckner, Tim A; Yoon, Jangho; Gonzales, Marco
2017-03-01
Increasing attention focuses on cardiovascular disease (CVD) among persons with serious mental illness (SMI). We examined, among persons with SMI, whether co-occurring substance use disorder (SUD) elevates the risk of CVD death. We linked 2002-2007 Medicaid claims data on 121,817 persons with SMI to cause and date of death information. We applied a proportional hazards model that controls for co-morbidity at baseline, atypical antipsychotic prescription medications, age, gender and race/ethnicity. Results among persons with co-occurring SUD indicate a 24% increased risk of CVD death (hazard ratio 1.24; 95% confidence interval 1.17-1.33). We encourage further coordination of services for this population.
Dalager-Pedersen, Michael; Søgaard, Mette; Schønheyder, Henrik Carl; Nielsen, Henrik; Thomsen, Reimar Wernich
2014-04-01
Infections may trigger acute cardiovascular events, but the risk after community-acquired bacteremia is unknown. We assessed the risk for acute myocardial infarction and ischemic stroke within 1 year of community-acquired bacteremia. This population-based cohort study was conducted in Northern Denmark. We included 4389 hospitalized medical patients with positive blood cultures obtained on the day of admission. Patients hospitalized with bacteremia were matched with up to 10 general population controls and up to 5 acutely admitted nonbacteremic controls, matched on age, sex, and calendar time. All incident events of myocardial infarction and stroke during the following 365 days were ascertained from population-based healthcare databases. Multivariable regression analyses were used to assess relative risks with 95% confidence intervals (CIs) for myocardial infarction and stroke among bacteremia patients and their controls. The risk for myocardial infarction or stroke was greatly increased within 30 days of community-acquired bacteremia: 3.6% versus 0.2% among population controls (adjusted relative risk, 20.86; 95% CI, 15.38-28.29) and 1.7% among hospitalized controls (adjusted relative risk, 2.18; 95% CI, 1.80-2.65). The risks for myocardial infarction or stroke remained modestly increased from 31 to 180 days after bacteremia in comparison with population controls (adjusted hazard ratio, 1.64; 95% CI, 1.18-2.27), but not versus hospitalized controls (adjusted hazard ratio, 0.95; 95% CI, 0.69-1.32). No differences in cardiovascular risk were seen after >6 months. Increased 30-day risks were consistently found for a variety of etiologic agents and infectious foci. Community-acquired bacteremia is associated with increased short-term risk of myocardial infarction and stroke.
Apolipoprotein E and mortality in African-Americans and Yoruba.
Lane, Kathleen A; Gao, Sujuan; Hui, Siu L; Murrell, Jill R; Hall, Kathleen S; Hendrie, Hugh C
2003-10-01
The literature on the association between apolipoprotein E (ApoE) and mortality across ethnic and age groups has been inconsistent. No studies have looked at this association in developing countries. We used data from the Indianapolis-Ibadan Dementia study to examine this association between APOE and mortality in 354 African-Americans from Indianapolis and 968 Yoruba from Ibadan, Nigeria. Participants were followed up to 9.5 years for Indianapolis and 8.7 years for Ibadan. Subjects from both sites were divided into 2 groups based upon age at baseline. A Cox proportional hazards regression model adjusting for age at baseline, education, hypertension, smoking history and gender in addition to time-dependent covariates of cancer, diabetes, heart disease, stroke, and dementia was fit for each cohort and age group. Having ApoE epsilon4 alleles significantly increased mortality risk in Indianapolis subjects under age 75 (hazard ratio: 2.00; 95% CI: 1.19-3.35; p = 0.0089). No association was found in Indianapolis subjects 75 and older (hazard ratio: 0.71; 95% CI: 0.45-1.10; p = 0.1238), Ibadan subjects under 75 (hazard ratio: 1.04; 95% CI: 0.78 to 1.40; p = 0.7782), or Ibadan subjects over 75 (hazard ratio: 1.21; 95% CI: 0.83 to 1.75; p = 0.3274).
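A reported hazard ratio and 95% CI imply a standard error on the log scale, and hence a z-statistic and two-sided p-value. The stdlib-only sketch below back-calculates these from the first result above (HR 2.00, 95% CI 1.19-3.35, reported p = 0.0089) as a consistency check; the 1.96 multiplier assumes a Wald-type normal CI:

```python
import math

hr, lo, hi = 2.00, 1.19, 3.35
se = (math.log(hi) - math.log(lo)) / (2 * 1.96)   # SE from CI width on log scale
z = math.log(hr) / se                             # Wald z-statistic
p = math.erfc(abs(z) / math.sqrt(2))              # two-sided normal p-value
print(round(p, 4))  # close to the reported 0.0089 (rounding aside)
```

The small discrepancy from the published p-value is expected, since the reported HR and CI are themselves rounded to two decimal places.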
Apolipoprotein E and mortality in African-Americans and Yoruba
Lane, Kathleen A.; Gao, Sujuan; Hui, Siu L.; Murrell, Jill R.; Hall, Kathleen S.; Hendrie, Hugh C.
2011-01-01
The literature on the association between apolipoprotein E (ApoE) and mortality across ethnic and age groups has been inconsistent. No studies have looked at this association in developing countries. We used data from the Indianapolis-Ibadan Dementia study to examine this association between APOE and mortality in 354 African-Americans from Indianapolis and 968 Yoruba from Ibadan, Nigeria. Participants were followed up to 9.5 years for Indianapolis and 8.7 years for Ibadan. Subjects from both sites were divided into 2 groups based upon age at baseline. A Cox proportional hazards regression model adjusting for age at baseline, education, hypertension, smoking history and gender in addition to time-dependent covariates of cancer, diabetes, heart disease, stroke, and dementia was fit for each cohort and age group. Having ApoE ε4 alleles significantly increased mortality risk in Indianapolis subjects under age 75 (hazard ratio: 2.00; 95% CI: 1.19–3.35; p = 0.0089). No association was found in Indianapolis subjects 75 and older (hazard ratio: 0.71; 95% CI: 0.45–1.10; p = 0.1238), Ibadan subjects under 75 (hazard ratio: 1.04; 95% CI: 0.78 to 1.40; p = 0.7782), or Ibadan subjects over 75 (hazard ratio: 1.21; 95% CI: 0.83 to 1.75; p = 0.3274). PMID:14646029
Radiotherapy for Tracheal-Bronchial Cystic Adenoid Carcinomas.
Levy, A; Omeiri, A; Fadel, E; Le Péchoux, C
2018-01-01
Primary tracheal-bronchial adenoid cystic carcinoma (thoracic adenoid cystic carcinoma; TACC) is a rare and aggressive malignant tumour. Radiotherapy results have not been previously individualised in this setting. Records of 31 patients with TACC (74% tracheal and 26% bronchial) who received radiotherapy between February 1984 and September 2014 were retrospectively analysed. Surgical removal of the primary tumour was carried out for most (71%) patients, and 13/22 (59%) had R1 or R2 (1/22) margins. The mean tumour size was 4.1 cm, 10 (32%) had associated lymph node involvement and 13 (41%) had perineural invasion (PNI). Adjuvant and definitive radiotherapy were delivered for 22 (71%) and nine patients, respectively. The mean delivered dose was 62 Gy (40-70 Gy) and eight patients had a radiotherapy boost (mean 19 Gy, range 9-30 Gy, two with endobronchial brachytherapy). At a median follow-up of 5.7 years, the 5 year overall survival and progression-free survival (PFS) rates were 88% and 61%, respectively. There were three local relapses and 10 metastatic relapses (mean delay 3.2 years), resulting in 5 year local and metastatic relapse rates of 10% and 26%, respectively. The prognostic factors in the univariate analysis for both decreased overall survival and PFS were: age ≥50 years (hazard ratio 6.2 and 3.8) and the presence of PNI (hazard ratio 10.3 and 4.1); and for PFS only: a radiotherapy dose ≤ 60 Gy (hazard ratio 3.1). Late toxicities were: tracheotomy due to symptomatic tracheal stenosis (n = 5), G3 dyspnoea (n = 4), hypothyroidism (n = 5) and pericarditis (n = 4). Radiotherapy dose may affect local control and the presence of PNI should be considered as an adverse prognostic factor. TACC irradiation conferred good local control rates, when comparing these results with historical series. Copyright © 2017 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
Prognostic Significance of Selected Lifestyle Factors in Urinary Bladder Cancer
Wakai, Kenji; Ohno, Yoshiyuki; Obata, Kohji; Aoki, Kunio
1993-01-01
To examine the prognostic significance of lifestyle factors in urinary bladder cancer, we conducted a follow‐up study of 258 incident bladder cancer patients, who were originally recruited in a case‐control study in metropolitan Nagoya. Information on individual survivals was obtained from the computer data‐file of the tumor registry of the Nagoya Bladder Cancer Research Group. Univariate analyses revealed significant associations of 5‐year survivorship with educational attainment, marital status, drinking habits and consumption of green tea in males, and age at first consultation, histological type and grade of tumor, stage and distant metastasis in both sexes. After adjustment for age, stage, histology (histological type and grade) and distant metastasis by means of a proportional hazards model, drinking of alcoholic beverages was significantly associated with the prognosis of bladder cancer in males. Its adjusted hazard ratio was 0.46 (95% confidence interval: 0.26–0.79), favoring patients who had taken alcoholic beverages. In detailed analysis, ex‐drinkers and all levels of current drinkers demonstrated hazard ratios smaller than unity, although no clear dose‐response relationship was detected. No prognostic significance was found for such lifestyle factors as smoking habit, use of artificial sweeteners and hair dye, and consumption of coffee, black tea, matcha (powdered green tea) and cola. PMID:8294212
Prognostic significance of selected lifestyle factors in urinary bladder cancer.
Wakai, K; Ohno, Y; Obata, K; Aoki, K
1993-12-01
To examine the prognostic significance of lifestyle factors in urinary bladder cancer, we conducted a follow-up study of 258 incident bladder cancer patients, who were originally recruited in a case-control study in metropolitan Nagoya. Information on individual survivals was obtained from the computer data-file of the tumor registry of the Nagoya Bladder Cancer Research Group. Univariate analyses revealed significant associations of 5-year survivorship with educational attainment, marital status, drinking habits and consumption of green tea in males, and age at first consultation, histological type and grade of tumor, stage and distant metastasis in both sexes. After adjustment for age, stage, histology (histological type and grade) and distant metastasis by means of a proportional hazards model, drinking of alcoholic beverages was significantly associated with the prognosis of bladder cancer in males. Its adjusted hazard ratio was 0.46 (95% confidence interval: 0.26-0.79), favoring patients who had taken alcoholic beverages. In detailed analysis, ex-drinkers and all levels of current drinkers demonstrated hazard ratios smaller than unity, although no clear dose-response relationship was detected. No prognostic significance was found for such lifestyle factors as smoking habit, use of artificial sweeteners and hair dye, and consumption of coffee, black tea, matcha (powdered green tea) and cola.
Psychosocial factors and shoulder symptom development among workers.
Smith, Caroline K; Silverstein, Barbara A; Fan, Z Joyce; Bao, Stephen; Johnson, Peter W
2009-01-01
Shoulder injuries are a common cause of pain and discomfort. Many work-related factors have been associated with the onset of shoulder symptoms. The psychosocial concepts in the demand-control model have been studied in association with musculoskeletal symptoms but with heterogeneous findings. The purpose of this study was to assess the relationship between the psychosocial concepts of the demand-control model and the incidence of shoulder symptoms in a working population. After following 424 subjects for approximately 1 year, 85 incident cases were identified from self-reported data. Cox proportional hazards modeling was used to assess the associations between shoulder symptoms and demand-control model quadrants. Cases were more likely to be female and report other upper extremity symptoms at baseline (P < 0.05). From the hazard models, being in either a passive or high strain job quadrant was associated with the incidence of shoulder symptoms. Hazard ratios were 2.17, 95% CI 1.02-4.66 and 2.19, 95% CI 1.08-4.42, respectively. Using self-reporting to determine demand-control quadrants was successful in identifying subjects at risk of developing work-related shoulder symptoms. Research is needed to determine if this relationship holds with clinically diagnosed shoulder and other upper extremity musculoskeletal disorders. This may be part of a simple tool for assessing risk of developing these UEMSDs. (c) 2008 Wiley-Liss, Inc.
Coffee and risk of death from hepatocellular carcinoma in a large cohort study in Japan.
Kurozawa, Y; Ogimoto, I; Shibata, A; Nose, T; Yoshimura, T; Suzuki, H; Sakata, R; Fujita, Y; Ichikawa, S; Iwai, N; Tamakoshi, A
2005-09-05
We examined the relation between coffee drinking and hepatocellular carcinoma (HCC) mortality in the Japan Collaborative Cohort Study for Evaluation of Cancer Risk (JACC Study). In total, 110,688 cohort members (46,399 male and 64,289 female subjects) aged 40-79 years were grouped by coffee intake into three categories: one or more cups per day, less than one cup per day and non-coffee drinkers. A Cox proportional hazards model (SAS) was used to obtain the hazard ratio of HCC mortality for each coffee consumption category. The hazard ratios were adjusted for age, gender, educational status, history of diabetes and liver diseases, smoking habits and alcohol. The hazard ratio of death due to HCC for drinkers of one or more cups of coffee per day, compared with non-coffee drinkers, was 0.50 (95% confidence interval 0.31-0.79), and the ratio for drinkers of less than one cup per day was 0.83 (95% confidence interval 0.54-1.25). Our data confirmed an inverse association between coffee consumption and HCC mortality.
Coffee and risk of death from hepatocellular carcinoma in a large cohort study in Japan
Kurozawa, Y; Ogimoto, I; Shibata, A; Nose, T; Yoshimura, T; Suzuki, H; Sakata, R; Fujita, Y; Ichikawa, S; Iwai, N; Tamakoshi, A
2005-01-01
We examined the relation between coffee drinking and hepatocellular carcinoma (HCC) mortality in the Japan Collaborative Cohort Study for Evaluation of Cancer Risk (JACC Study). In total, 110 688 cohort members (46 399 male and 64 289 female subjects) aged 40–79 years were grouped by coffee intake into three categories: one or more cups per day, less than one cup per day and non-coffee drinkers. A Cox proportional hazards model (SAS) was used to obtain the hazard ratio of HCC mortality for each coffee consumption category. The hazard ratios were adjusted for age, gender, educational status, history of diabetes and liver diseases, smoking habits and alcohol. The hazard ratio of death due to HCC for drinkers of one or more cups of coffee per day, compared with non-coffee drinkers, was 0.50 (95% confidence interval 0.31–0.79), and the ratio for drinkers of less than one cup per day was 0.83 (95% confidence interval 0.54–1.25). Our data confirmed an inverse association between coffee consumption and HCC mortality. PMID:16091758
Alexander, Matthew D; Hippe, Daniel S; Cooke, Daniel L; Hallam, Danial K; Hetts, Steven W; Kim, Helen; Lawton, Michael T; Sekhar, Laligam N; Kim, Louis J; Ghodke, Basavaraj V
2018-03-01
High-risk components of brain arteriovenous malformations (BAVMs) can be targeted to reduce the risk of lesion rupture. We performed a case-control analysis to evaluate targeted embolization of BAVM-associated aneurysms against other means of treatment, an approach we have investigated previously. Retrospective analysis of patients with BAVMs was performed, identifying patients treated with the intention to occlude only an aneurysm associated with a BAVM. For each targeted aneurysm embolization (TAE) patient identified, 4 control patients were randomly selected, controlling for rupture status, age, and Spetzler-Martin plus Lawton-Young supplemental score. Analysis was performed to compare rates of adverse events (hemorrhage, new seizure, and death) between the 2 groups. Thirty-two patients met inclusion criteria, and 128 control patients were identified, out of 1103 patients treated during the study period. Thirty-four adverse events occurred (15 ruptures, 15 new seizures, and 11 deaths) during the follow-up period (mean 1157 d for the TAE cohort and 1036 d for the non-TAE cohort). Significantly lower hazards were noted in the TAE group for any adverse event (hazard ratio 0.28, P = .037) and for the composite outcome of hemorrhage or new seizure (hazard ratio 0.20, P = .029). For BAVMs at high risk for surgical resection, TAE can be performed safely and effectively. Patients treated with TAE had better outcomes than matched patients undergoing other combinations of treatment. TAE can be considered for BAVMs with high operative risk prior to radiosurgery or when no other treatment options are available. Copyright © 2017 by the Congress of Neurological Surgeons
Suicide Following Deliberate Self-Harm.
Olfson, Mark; Wall, Melanie; Wang, Shuai; Crystal, Stephen; Gerhard, Tobias; Blanco, Carlos
2017-08-01
The authors sought to identify risk factors for repeat self-harm and completed suicide over the following year among adults with deliberate self-harm. A national cohort of Medicaid-financed adults clinically diagnosed with deliberate self-harm (N=61,297) was followed for up to 1 year. Repeat self-harm per 1,000 person-years and suicide rates per 100,000 person-years (based on cause of death information from the National Death Index) were determined. Hazard ratios of repeat self-harm and suicide were estimated by Cox proportional hazard models. During the 12 months after nonfatal self-harm, the rate of repeat self-harm was 263.2 per 1,000 person-years and the rate of completed suicide was 439.1 per 100,000 person-years, or 37.2 times higher than in a matched general population cohort. The hazard of suicide was higher after initial self-harm events involving violent as compared with nonviolent methods (hazard ratio=7.5, 95% CI=5.5-10.1), especially firearms (hazard ratio=15.86, 95% CI=10.7-23.4; computed with poisoning as reference), and to a lesser extent after events of patients who had recently received outpatient mental health care (hazard ratio=1.6, 95% CI=1.2-2.0). Compared with self-harm patients using nonviolent methods, those who used violent methods were at significantly increased risk of suicide during the first 30 days after the initial event (hazard ratio=17.5, 95% CI=11.2-27.3), but not during the following 335 days. Adults treated for deliberate self-harm frequently repeat self-harm in the following year. Patients who use a violent method for their initial self-harm, especially firearms, have an exceptionally high risk of suicide, particularly right after the initial event, which highlights the importance of careful assessment and close follow-up of this group.
Chuang, Michael L.; Gona, Philimon; Salton, Carol J.; Yeon, Susan B.; Kissinger, Kraig V.; Blease, Susan J.; Levy, Daniel; O'Donnell, Christopher J.; Manning, Warren J.
2013-01-01
We sought to determine whether depressed myocardial contraction fraction (MCF, the ratio of left ventricular (LV) stroke volume to myocardial volume) predicts cardiovascular disease (CVD) events in initially healthy adults. A subset (N=318, 60±9 yrs, 158 men) of the Framingham Heart Study Offspring cohort free of clinical CVD underwent volumetric cardiovascular magnetic resonance (CMR) imaging in 1998–1999. LV ejection fraction (EF), mass and MCF were determined. “Hard” CVD events comprised cardiovascular death, myocardial infarction, stroke or new heart failure. A Cox proportional hazards model adjusting for Framingham Coronary Risk Score (FCRS) was used to estimate hazard ratios for incident hard CVD events for sex-specific quartiles of MCF, LV mass and LVEF. The lowest quartile of LV mass and highest quartiles of MCF and EF served as referent. Kaplan-Meier survival plots and the log rank test were used to compare event-free survival. MCF was greater in women (0.58±0.13) than men (0.52±0.11), p<0.01. Nearly all (99%) participants had EF ≥ 0.55. Over up to 9-year (median 5.2) follow-up, 31 participants (10%) experienced an incident hard CVD event. Participants in the lowest MCF quartile were 7 times more likely to develop hard CVD (hazard ratio 7.11, p=0.010) than those in the referent (highest) quartile, and the elevated hazards persisted even after adjustment for LV mass (hazard ratio=6.09, p=0.020). Participants in the highest quartile of LV mass/height^2.7 had nearly five-fold risk (hazard ratio 4.68, p=0.016). Event-free survival was shorter in the lowest MCF quartile (p=0.0006), but not in the lowest LVEF quartile. Conclusion: In a cohort of adults initially without clinical CVD, lowest-quartile MCF conferred an increased hazard for hard CVD events after adjustment for traditional CVD risk factors and LV mass. PMID:22381161
Chuang, Michael L; Gona, Philimon; Salton, Carol J; Yeon, Susan B; Kissinger, Kraig V; Blease, Susan J; Levy, Daniel; O'Donnell, Christopher J; Manning, Warren J
2012-05-15
We sought to determine whether depressed myocardial contraction fraction (MCF; ratio of left ventricular [LV] stroke volume to myocardial volume) predicts cardiovascular disease (CVD) events in initially healthy adults. A subset (n = 318, 60 ± 9 years old, 158 men) of the Framingham Heart Study Offspring cohort free of clinical CVD underwent volumetric cardiovascular magnetic resonance imaging in 1998 through 1999. LV ejection fraction (EF), mass, and MCF were determined. "Hard" CVD events consisted of cardiovascular death, myocardial infarction, stroke, or new heart failure. A Cox proportional hazards model adjusting for Framingham Coronary Risk Score was used to estimate hazard ratios for incident hard CVD events for gender-specific quartiles of MCF, LV mass, and LVEF. The lowest quartile of LV mass and highest quartiles of MCF and EF served as referents. Kaplan-Meier survival plots and log-rank test were used to compare event-free survival. MCF was greater in women (0.58 ± 0.13) than in men (0.52 ± 0.11, p <0.01). Nearly all participants (99%) had EF ≥0.55. During an up to 9-year follow-up (median 5.2), 31 participants (10%) developed an incident hard CVD event. Lowest-quartile MCF was 7 times more likely to develop a hard CVD (hazard ratio 7.11, p = 0.010) compared to the remaining quartiles, and increased hazards persisted even after adjustment for LV mass (hazard ratio 6.09, p = 0.020). The highest-quartile LV mass/height^2.7 had a nearly fivefold risk (hazard ratio 4.68, p = 0.016). Event-free survival was shorter in lowest-quartile MCF (p = 0.0006) but not in lowest-quartile LVEF. In conclusion, in a cohort of adults initially without clinical CVD, lowest-quartile MCF conferred an increased hazard for hard CVD events after adjustment for traditional CVD risk factors and LV mass. Copyright © 2012 Elsevier Inc. All rights reserved.
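The event-free survival comparisons in the study above rest on the Kaplan-Meier product-limit estimator. A minimal, self-contained sketch of that estimator (toy follow-up data, not the Framingham cohort; the function name is illustrative):

```python
from collections import Counter

def kaplan_meier(times, events):
    """Product-limit survival estimate.
    times: follow-up times; events: 1 = event observed, 0 = censored."""
    deaths = Counter(t for t, e in zip(times, events) if e)
    leaving = Counter(times)          # subjects leaving the risk set at each time
    at_risk = len(times)
    surv, curve = 1.0, []
    for t in sorted(leaving):
        d = deaths.get(t, 0)
        if d:                         # survival only drops at event times
            surv *= 1 - d / at_risk
            curve.append((t, surv))
        at_risk -= leaving[t]
    return curve

# Toy data: events at t=2 (6 at risk) and t=5 (3 at risk); the rest censored
print(kaplan_meier([2, 3, 4, 5, 6, 7], [1, 0, 0, 1, 0, 0]))
```

Curves computed this way for, say, the lowest versus highest MCF quartiles are what the log-rank test then compares.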
Socioeconomic disparities in outcomes after acute myocardial infarction.
Bernheim, Susannah M; Spertus, John A; Reid, Kimberly J; Bradley, Elizabeth H; Desai, Rani A; Peterson, Eric D; Rathore, Saif S; Normand, Sharon-Lise T; Jones, Philip G; Rahimi, Ali; Krumholz, Harlan M
2007-02-01
Patients of low socioeconomic status (SES) have higher mortality after acute myocardial infarction (AMI). Little is known about the underlying mechanisms or the relationship between SES and rehospitalization after AMI. We analyzed data from the PREMIER observational study, which included 2142 patients hospitalized with AMI from 18 US hospitals. Socioeconomic status was measured by self-reported household income and education level. Sequential multivariable modeling assessed the relationship of socioeconomic factors with 1-year all-cause mortality and all-cause rehospitalization after adjustment for demographics, clinical factors, and quality-of-care measures. Both household income and education level were associated with higher risk of mortality (hazard ratio 2.80, 95% CI 1.37-5.72, lowest to highest income group) and rehospitalization after AMI (hazard ratio 1.55, 95% CI 1.17-2.05). Patients with low SES had worse clinical status at admission and received poorer quality of care. In multivariable modeling, the relationship between household income and mortality was attenuated by adjustment for demographic and clinical factors (hazard ratio 1.19, 95% CI 0.54-2.62), with a further small decrement in the hazard ratio after adjustment for quality of care. The relationship between income and rehospitalization was only partly attenuated by demographic and clinical factors (hazard ratio 1.38, 95% CI 1.01-1.89) and was not influenced by adjustment for quality of care. Patients' baseline clinical status largely explained the relationship between SES and mortality, but not rehospitalization, among patients with AMI.
Transplantation Outcomes for Children with Hypodiploid Acute Lymphoblastic Leukemia.
Mehta, Parinda A; Zhang, Mei-Jie; Eapen, Mary; He, Wensheng; Seber, Adriana; Gibson, Brenda; Camitta, Bruce M; Kitko, Carrie L; Dvorak, Christopher C; Nemecek, Eneida R; Frangoul, Haydar A; Abdel-Azim, Hisham; Kasow, Kimberly A; Lehmann, Leslie; Gonzalez Vicent, Marta; Diaz Pérez, Miguel A; Ayas, Mouhab; Qayed, Muna; Carpenter, Paul A; Jodele, Sonata; Lund, Troy C; Leung, Wing H; Davies, Stella M
2015-07-01
Children with hypodiploid acute lymphoblastic leukemia (ALL) have inferior outcomes despite intensive risk-adapted chemotherapy regimens. We describe 78 children with hypodiploid ALL who underwent hematopoietic stem cell transplantation between 1990 and 2010. Thirty-nine (50%) patients had ≤ 43 chromosomes, 12 (15%) had 44 chromosomes, and 27 (35%) had 45 chromosomes. Forty-three (55%) patients underwent transplantation in first remission (CR1) and 35 (45%) underwent transplantation in ≥ second remission (CR2). Twenty-nine patients (37%) received a graft from a related donor and 49 (63%) from an unrelated donor. All patients received a myeloablative conditioning regimen. The 5-year probabilities of leukemia-free survival, overall survival, relapse, and treatment-related mortality for the entire cohort were 51%, 56%, 27%, and 22%, respectively. Multivariate analysis confirmed that mortality risks were higher for patients who underwent transplantation in CR2 (hazard ratio, 2.16; P = .05), with number of chromosomes ≤ 43 (hazard ratio, 2.15; P = .05), and for those who underwent transplantation in the first decade of the study period (hazard ratio, 2.60; P = .01). Similarly, treatment failure risks were higher with number of chromosomes ≤ 43 (hazard ratio, 2.28; P = .04) and the earlier transplantation period (hazard ratio, 2.51; P = .01). Although survival is better with advances in donor selection and supportive care, disease-related risk factors significantly influence transplantation outcomes. Copyright © 2015 American Society for Blood and Marrow Transplantation. Published by Elsevier Inc. All rights reserved.
Association between chronic azotemic kidney disease and the severity of periodontal disease in dogs.
Glickman, Lawrence T; Glickman, Nita W; Moore, George E; Lund, Elizabeth M; Lantz, Gary C; Pressler, Barrak M
2011-05-01
Naturally occurring periodontal disease affects >75% of dogs and has been associated with cardiac lesions and presumptive endocarditis. However, the relationships between periodontal disease and chronic kidney disease (CKD) in dogs have not been studied. In a retrospective longitudinal study the incidence of azotemic CKD was compared between a cohort of 164,706 dogs with periodontal disease and a cohort of age-matched dogs with no periodontal disease from a national primary care practice. These dogs contributed 415,971 dog-years of follow-up from 2002 to 2008. Hazard ratios and 95% confidence intervals from Cox regression were used to compare the incidence of azotemic CKD in dogs with stage 1, 2, or 3/4 periodontal disease to dogs with no periodontal disease. The hazard ratio for azotemic CKD increased with increasing severity of periodontal disease (stage 1 hazard ratio=1.8, 95% confidence interval: 1.6, 2.1; stage 2 hazard ratio=2.0, 95% confidence interval: 1.7, 2.3; stage 3/4 hazard ratio=2.7, 95% confidence interval: 2.3, 3.0; P for trend <0.0001) after adjustment for age, gender, neuter status, breed, body weight, number of hospital visits, and dental procedures. Increasing severity of periodontal disease was also associated with serum creatinine >1.4 mg/dl and blood urea nitrogen >36 mg/dl, independent of a veterinarian's clinical diagnosis of CKD. Copyright © 2011 Elsevier B.V. All rights reserved.
Two models for evaluating landslide hazards
Davis, J.C.; Chung, C.-J.; Ohlmacher, G.C.
2006-01-01
Two alternative procedures for estimating landslide hazards were evaluated using data on topographic digital elevation models (DEMs) and bedrock lithologies in an area adjacent to the Missouri River in Atchison County, Kansas, USA. The two procedures are based on the likelihood ratio model but utilize different assumptions. The empirical likelihood ratio model is based on non-parametric empirical univariate frequency distribution functions under an assumption of conditional independence while the multivariate logistic discriminant model assumes that likelihood ratios can be expressed in terms of logistic functions. The relative hazards of occurrence of landslides were estimated by an empirical likelihood ratio model and by multivariate logistic discriminant analysis. Predictor variables consisted of grids containing topographic elevations, slope angles, and slope aspects calculated from a 30-m DEM. An integer grid of coded bedrock lithologies taken from digitized geologic maps was also used as a predictor variable. Both statistical models yield relative estimates in the form of the proportion of total map area predicted to already contain or to be the site of future landslides. The stabilities of estimates were checked by cross-validation of results from random subsamples, using each of the two procedures. Cell-by-cell comparisons of hazard maps made by the two models show that the two sets of estimates are virtually identical. This suggests that the empirical likelihood ratio and the logistic discriminant analysis models are robust with respect to the conditional independence assumption and the logistic function assumption, respectively, and that either model can be used successfully to evaluate landslide hazards. © 2006.
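The empirical likelihood ratio model described above scores each predictor class by how much more frequently it occurs inside mapped landslide cells than outside them. A minimal sketch with hypothetical class counts (the function name and lithology codes are illustrative, not taken from the study data):

```python
from collections import Counter

def likelihood_ratios(values_landslide, values_stable):
    """Empirical likelihood ratio per predictor class:
    P(class | landslide) / P(class | stable)."""
    in_ls = Counter(values_landslide)
    in_st = Counter(values_stable)
    n_ls, n_st = len(values_landslide), len(values_stable)
    classes = set(in_ls) | set(in_st)
    # Ratios > 1 mark classes over-represented in landslide terrain
    return {c: (in_ls[c] / n_ls) / (in_st[c] / n_st)
            for c in classes if in_st[c] > 0}

# Hypothetical lithology codes observed in landslide vs. stable grid cells
lr = likelihood_ratios(["shale"] * 8 + ["limestone"] * 2,
                       ["shale"] * 4 + ["limestone"] * 6)
print(lr)  # shale over-represented in landslide cells, limestone under-represented
```

Under the conditional-independence assumption, per-layer ratios (lithology, slope angle, slope aspect, elevation) would be multiplied cell by cell to produce the relative hazard map.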
Abortion and mental health: findings from The National Comorbidity Survey-Replication.
Steinberg, Julia R; McCulloch, Charles E; Adler, Nancy E
2014-02-01
To examine whether a first abortion increases risk of mental health disorders compared with a first childbirth with and without considering prepregnancy mental health and adverse exposures, childhood economic status, miscarriage history, age at first abortion or childbirth, and race or ethnicity. A cohort study compared rates of mental disorders (anxiety, mood, impulse-control, substance use, eating disorders, and suicidal ideation) among 259 women postabortion and 677 women postchildbirth aged 18-42 years at the time of interview from The National Comorbidity Survey-Replication. The percentage of women with no, one, two, and three or more mental health disorders before their first abortion was 37.8%, 19.7%, 15.2%, and 27.3% and before their first childbirth was 57.9%, 19.6%, 9.2%, and 13.3%, respectively, indicating that women in the abortion group had more prior mental health disorders than women in the childbirth group (P<.001). Although in unadjusted Cox proportional hazard models, abortion compared with childbirth was associated with statistically significant higher hazards of postpregnancy mental health disorders, associations were reduced and became nonstatistically significant for five disorders after adjusting for the aforementioned factors. Hazard ratios and associated 95% confidence intervals dropped from 1.52 (1.08-2.15) to 1.12 (0.87-1.46) for anxiety disorders; from 1.56 (1.23-1.98) to 1.18 (0.88-1.56) for mood disorders; from 1.62 (1.02-2.57) to 1.10 (0.75-1.62) for impulse-control disorders; from 2.53 (1.09-5.86) to 1.82 (0.63-5.25) for eating disorders; and from 1.62 (1.09-2.40) to 1.25 (0.88-1.78) for suicidal ideation. Only the relationship between abortion and substance use disorders remained statistically significant, although the hazard ratio dropped from 3.05 (1.94-4.79) to 2.30 (1.35-3.92). 
After accounting for confounding factors, abortion was not a statistically significant predictor of subsequent anxiety, mood, impulse-control, and eating disorders or suicidal ideation. LEVEL OF EVIDENCE: II.
Säfsten, Eleonor; Forsell, Yvonne; Ramstedt, Mats; Galanti, Maria Rosaria
2017-06-06
Hazardous and harmful consumption of alcohol is a leading cause of preventable disease and premature deaths. Modifying the amount and pattern of risky alcohol consumption conveys substantial benefits to individuals and to society at large. Telephone helplines provide a feasible alternative to face-to-face counselling in order to increase the reach of brief interventions aiming at modifying the hazardous and harmful use of alcohol. However, there is a lack of studies on the implementation and evaluation of population-based telephone services for the prevention and treatment of alcohol misuse. A randomised controlled trial was designed to compare a brief, structured intervention to usual care within the Swedish National Alcohol Helpline (SAH), concerning their effectiveness on decreasing the hazardous use of alcohol. Between May 2015 and December 2017, about 300 callers are to be individually randomised with a 1:1 ratio to a brief, structured intervention (n = 150) or to usual care (n = 150). The brief, structured intervention consists of the delivery of a self-help booklet followed by one proactive call from SAH counsellors to monitor and give feedback about the client's progression. Callers assigned to usual care receive telephone counselling according to existing practice, i.e., motivational interviewing in a tailored and client-driven combination of proactive and reactive calls. The primary outcome is defined as a change from a higher to a lower AUDIT risk-level category between baseline and follow-up. General linear modeling will be used to calculate risk ratios of the outcome events. The primary analysis will follow an intention-to-treat (ITT) approach. The trial is designed to evaluate the effectiveness in decreasing the hazardous and harmful consumption of alcohol of a brief, structured intervention compared to usual care when delivered at the SAH. The results of the study will be used locally to improve the effectiveness of the service provided at the SAH. 
Additionally, they will expand the evidence base about optimal counselling models in population-based telephone services for alcohol misuse prevention and treatment. ISRCTN.com, ID: ISRCTN13160878. Retrospectively registered on 18 January 2016.
Combination Antifungal Therapy for Cryptococcal Meningitis
Day, Jeremy N.; Chau, Tran T.H.; Wolbers, Marcel; Mai, Pham P.; Dung, Nguyen T.; Mai, Nguyen H.; Phu, Nguyen H.; Nghia, Ho D.; Phong, Nguyen D.; Thai, Cao Q.; Thai, Le H.; Chuong, Ly V.; Sinh, Dinh X.; Duong, Van A.; Hoang, Thu N.; Diep, Pham T.; Campbell, James I.; Sieu, Tran P.M.; Baker, Stephen G.; Chau, Nguyen V.V.; Hien, Tran T.
2014-01-01
BACKGROUND Combination antifungal therapy (amphotericin B deoxycholate and flucytosine) is the recommended treatment for cryptococcal meningitis but has not been shown to reduce mortality, as compared with amphotericin B alone. We performed a randomized, controlled trial to determine whether combining flucytosine or high-dose fluconazole with high-dose amphotericin B improved survival at 14 and 70 days. METHODS We conducted a randomized, three-group, open-label trial of induction therapy for cryptococcal meningitis in patients with human immunodeficiency virus infection. All patients received amphotericin B at a dose of 1 mg per kilogram of body weight per day; patients in group 1 were treated for 4 weeks, and those in groups 2 and 3 for 2 weeks. Patients in group 2 concurrently received flucytosine at a dose of 100 mg per kilogram per day for 2 weeks, and those in group 3 concurrently received fluconazole at a dose of 400 mg twice daily for 2 weeks. RESULTS A total of 299 patients were enrolled. Fewer deaths occurred by days 14 and 70 among patients receiving amphotericin B and flucytosine than among those receiving amphotericin B alone (15 vs. 25 deaths by day 14; hazard ratio, 0.57; 95% confidence interval [CI], 0.30 to 1.08; unadjusted P = 0.08; and 30 vs. 44 deaths by day 70; hazard ratio, 0.61; 95% CI, 0.39 to 0.97; unadjusted P = 0.04). Combination therapy with fluconazole had no significant effect on survival, as compared with monotherapy (hazard ratio for death by 14 days, 0.78; 95% CI, 0.44 to 1.41; P = 0.42; hazard ratio for death by 70 days, 0.71; 95% CI, 0.45 to 1.11; P = 0.13). Amphotericin B plus flucytosine was associated with significantly increased rates of yeast clearance from cerebrospinal fluid (−0.42 log10 colony-forming units [CFU] per milliliter per day vs. −0.31 and −0.32 log10 CFU per milliliter per day in groups 1 and 3, respectively; P<0.001 for both comparisons). 
Rates of adverse events were similar in all groups, although neutropenia was more frequent in patients receiving combination therapy. CONCLUSIONS Amphotericin B plus flucytosine, as compared with amphotericin B alone, is associated with improved survival among patients with cryptococcal meningitis. A survival benefit of amphotericin B plus fluconazole was not found. (Funded by the Wellcome Trust and the British Infection Society; Controlled-Trials.com number, ISRCTN95123928.) PMID:23550668
Meredith, Peter A; Lloyd, Suzanne M; Ford, Ian; Elliott, Henry L
2016-01-01
A retrospective further analysis of the ACTION database evaluated the relationship between cardiovascular outcomes and the "quality" of the control of blood pressure (BP). The study population (n = 6287) comprised those patients with four BP measurements during year 1 subdivided according to the proportion of visits in which BP was controlled in relation to two BP targets: <140/90 mmHg and <130/80 mmHg. Differences between the BP control groups for the major prespecified ACTION outcomes were investigated with Cox proportional hazards models. For all the prespecified cardiovascular endpoints the incidence declined as the proportion of visits with BP control increased. The greatest differences in outcomes between the different BP control groups were observed for the risk of stroke but were still apparent for all the other endpoints. For example, the risks for the primary outcome [hazard ratio (HR) 0.78; 95% confidence interval (CI) 0.67 to 0.90] were significantly less in the group with ≥75% of visits with BP control than in the group with <25% of visits with BP control. There were no significant treatment-related differences. Retrospective analyses are not definitive but these results highlight the importance of the attainment of BP control targets and the consistency of BP control during long-term follow-up.
Feng, Tom; Howard, Lauren E; Vidal, Adriana C; Moreira, Daniel M; Castro-Santamaria, Ramiro; Andriole, Gerald L; Freedland, Stephen J
2017-02-01
To determine if cholesterol is a risk factor for the development of lower urinary tract symptoms in asymptomatic men. A post-hoc analysis of the Reduction by Dutasteride of Prostate Cancer Events (REDUCE) study was carried out in 2323 men with baseline International Prostate Symptom Score <8 and not taking benign prostatic hyperplasia or cholesterol medications. Cox proportional hazards models were used to test the association between cholesterol, high-density lipoprotein, low-density lipoprotein and the cholesterol : high-density lipoprotein ratio with incident lower urinary tract symptoms, defined as first report of medical treatment, surgery or two reports of an International Prostate Symptom Score >14. A total of 253 men (10.9%) developed incident lower urinary tract symptoms. On crude analysis, higher high-density lipoprotein was associated with a decreased lower urinary tract symptoms risk (hazard ratio 0.89, P = 0.024), whereas total cholesterol and low-density lipoprotein showed no association. After multivariable adjustment, the association between high-density lipoprotein and incident lower urinary tract symptoms remained significant (hazard ratio 0.89, P = 0.044), whereas no association was observed for low-density lipoprotein (P = 0.611). There was a trend for higher cholesterol to be linked with higher lower urinary tract symptoms risk, though this was not statistically significant (hazard ratio 1.04, P = 0.054). A higher cholesterol : high-density lipoprotein ratio was associated with increased lower urinary tract symptoms risk on crude (hazard ratio 1.11, P = 0.016) and adjusted models (hazard ratio 1.12, P = 0.012). Among asymptomatic men participating in the REDUCE study, higher cholesterol was associated with increased incident lower urinary tract symptoms risk, though the association was not significant. 
A higher cholesterol : high-density lipoprotein ratio was associated with increased incident lower urinary tract symptoms, whereas higher high-density lipoprotein was protective. These findings suggest dyslipidemia might play a role in lower urinary tract symptoms progression. © 2016 The Japanese Urological Association.
Mackenzie, P; Pryor, D; Burmeister, E; Foote, M; Panizza, B; Burmeister, B; Porceddu, S
2014-10-01
To determine prognostic factors for locoregional relapse (LRR), distant relapse and all-cause death in a contemporary cohort of locoregionally advanced oropharyngeal squamous cell carcinoma (OSCC) treated with definitive chemoradiotherapy or radiotherapy alone. OSCC patients treated with definitive radiotherapy between 2005 and 2010 were identified from a prospective head and neck database. Patient age, gender, smoking history, human papillomavirus (HPV) status, T- and N-category, lowest involved nodal level and gross tumour volume of the primary (GTV-p) and nodal (GTV-n) disease were analysed in relation to LRR, distant relapse and death by way of univariate and multivariate analysis. In total, 130 patients were identified, 88 HPV positive, with a median follow-up of 42 months. On multivariate analysis HPV status was a significant predictor of LRR (hazard ratio 0.15; 95% confidence interval 0.05-0.51) and death (hazard ratio 0.29; 95% confidence interval 0.14-0.59) but not distant relapse (hazard ratio 0.53, 95% confidence interval 0.22-1.27). Increasing T-category was associated with a higher risk of LRR (hazard ratio 1.80 for T3/4 versus T1/2; 95% confidence interval 1.08-2.99), death (hazard ratio 1.37, 95% confidence interval 1.06-1.77) and distant relapse (hazard ratio 1.35; 95% confidence interval 1.00-1.83). Increasing GTV-p was associated with increased risk of distant relapse and death. N3 disease and low neck nodes were significant for LRR, distant relapse and death on univariate analysis only. Tumour HPV status was the strongest predictor of LRR and death. T-category is more predictive of distant relapse and may provide additional prognostic value for LRR and death when accounting for HPV status. Copyright © 2014 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
Giebel, Sebastian; Labopin, Myriam; Socié, Gerard; Beelen, Dietrich; Browne, Paul; Volin, Liisa; Kyrcz-Krzemien, Slawomira; Yakoub-Agha, Ibrahim; Aljurf, Mahmoud; Wu, Depei; Michallet, Mauricette; Arnold, Renate; Mohty, Mohamad; Nagler, Arnon
2017-01-01
Allogeneic hematopoietic cell transplantation is widely used to treat adults with high-risk acute lymphoblastic leukemia. The aim of this study was to analyze whether the results changed over time and to identify prognostic factors. Adult patients treated between 1993 and 2012 with myeloablative allogeneic hematopoietic cell transplantation from HLA matched sibling (n=2681) or unrelated (n=2178) donors in first complete remission were included. For transplantations from sibling donors performed between 2008 and 2012, 2-year probabilities of overall survival were: 76% (18–25 years old), 69% (26–35 and 36–45 years old) and 60% (46–55 years old). Among recipients of transplantations from unrelated donors, the respective survival rates were 66%, 70%, 61%, and 62%. In comparison with the 1993–2007 period, significant improvements were observed for all age groups except for the 26–35-year old patients. In a multivariate model, transplantations performed between 2008 and 2012, when compared to 1993–2007, were associated with significantly reduced risks of non-relapse mortality (Hazard Ratio 0.77, P=0.00006), relapse (Hazard Ratio 0.85, P=0.007), treatment failure (Hazard Ratio 0.81, P<0.00001), and overall mortality (Hazard Ratio 0.79, P<0.00001). In the analysis restricted to transplantations performed between 2008 and 2012, the use of total body irradiation-based conditioning was associated with reduced risk of relapse (Hazard Ratio 0.48, P=0.004) and treatment failure (Hazard Ratio 0.63, P=0.02). We conclude that results of allogeneic hematopoietic cell transplantation for adults with acute lymphoblastic leukemia improved significantly over time. Total body irradiation should be considered as the preferable type of myeloablative conditioning. PMID:27686376
Matarraz, Sergio; Leoz, Pilar; Fernández, Carlos; Colado, Enrique; Chillón, María Carmen; Vidriales, María Belén; González, Marcos; Rivera, Daniel; Osuna, Carlos Salvador; Caballero-Velázquez, Teresa; Van Der Velden, Vincent; Jongen-Lavrencic, Mojca; Gutiérrez, Oliver; Bermejo, Ana Yeguas; Alonso, Luis García; García, Monique Bourgeois; De Ramón Sánchez, Cristina; García-Donas, Gloria; Mateo, Aránzazu García; Recio, Isabel; Sánchez-Real, Javier; Mayado, Andrea; Gutiérrez, María Laura; Bárcena, Paloma; Barrena, Susana; López, Antonio; Van Dongen, Jacques; Orfao, Alberto
2018-03-23
Severe hemorrhagic events occur in a significant fraction of acute promyelocytic leukemia patients, either at presentation and/or early after starting therapy, leading to treatment failure and early deaths. However, identification of independent predictors of high risk of severe bleeding at diagnosis remains a challenge. Here, we investigated the immunophenotype of bone marrow leukemic cells from 109 newly diagnosed acute promyelocytic leukemia patients, particularly focusing on the identification of basophil-related features, and their potential association with severe bleeding episodes and patient overall survival. From all phenotypes investigated on leukemic cells, expression of the CD203c and/or CD22 basophil-associated markers showed the strongest association with the occurrence and severity of bleeding (p ≤ 0.007); moreover, aberrant expression of CD7, coexpression of CD34 + /CD7 + and lack of CD71 was also more frequently found among patients with (mild and severe) bleeding at baseline and/or after starting treatment (p ≤ 0.009). Multivariate analysis showed that CD203c expression (hazard ratio: 26.4; p = 0.003) and older age (hazard ratio: 5.4; p = 0.03) were the best independent predictors for cumulative incidence of severe bleeding after starting therapy. In addition, CD203c expression on leukemic cells (hazard ratio: 4.4; p = 0.01), low fibrinogen levels (hazard ratio: 8.8; p = 0.001), older age (hazard ratio: 9.0; p = 0.002), and high leukocyte count (hazard ratio: 5.6; p = 0.02) were the most informative independent predictors for overall survival. In summary, our results show that the presence of basophil-associated phenotypic characteristics on leukemic cells from acute promyelocytic leukemia patients at diagnosis is a powerful independent predictor for severe bleeding and overall survival, which might contribute in the future to (early) risk-adapted therapy decisions.
Rasouli, B; Ahlbom, A; Andersson, T; Grill, V; Midthjell, K; Olsson, L; Carlsson, S
2013-01-01
We investigated the influence of different aspects of alcohol consumption on the risk of Type 2 diabetes and autoimmune diabetes in adults. We used data from the Nord-Trøndelag Health Survey (HUNT) study, in which all adults aged ≥ 20 years from Nord-Trondelag County were invited to participate in three surveys in 1984-1986, 1995-1997 and 2006-2008. Patients with diabetes were identified using self-reports, and participants with onset age ≥ 35 years were classified as having Type 2 diabetes if they were negative for anti-glutamic acid decarboxylase (n = 1841) and as having autoimmune diabetes if they were positive for anti-glutamic acid decarboxylase (n = 140). Hazard ratios of amount and frequency of alcohol use, alcoholic beverage choice, and binge drinking and alcohol use disorders were estimated. Moderate alcohol consumption (adjusted for confounders) was associated with a reduced risk of Type 2 diabetes in men, but not in women (hazard ratio for men 10-15 g/day 0.48, 95% CI 0.28-0.77; hazard ratio for women ≥ 10 g/day 0.81, 95% CI 0.33-1.96). The reduced risk was primarily linked to consumption of wine [hazard ratio 0.93, 95% CI 0.87-0.99 (per g/day)]. No increased risk was seen in participants reporting binge drinking or in problem drinkers. The results were also compatible with a reduced risk of autoimmune diabetes associated with alcohol consumption [hazard ratio 0.70, 95% CI 0.45-1.08 (frequent consumption) and hazard ratio 0.36, 95% CI 0.13-0.97 (2-7 g/day)]. Moderate alcohol consumption associates with reduced risk of both Type 2 diabetes and autoimmune diabetes. A protective effect of alcohol intake may be limited to men. High alcohol consumption does not seem to carry an increased risk of diabetes. © 2012 The Authors. Diabetic Medicine © 2012 Diabetes UK.
Chen-Hussey, Vanessa; Carneiro, Ilona; Keomanila, Hongkham; Gray, Rob; Bannavong, Sihamano; Phanalasy, Saysana; Lindsay, Steven W.
2013-01-01
Background Mosquito vectors of malaria in Southeast Asia readily feed outdoors making malaria control through indoor insecticides such as long-lasting insecticidal nets (LLINs) and indoor residual spraying more difficult. Topical insect repellents may be able to protect users from outdoor biting, thereby providing additional protection above the current best practice of LLINs. Methods and Findings A double blind, household randomised, placebo-controlled trial of insect repellent to reduce malaria was carried out in southern Lao PDR to determine whether the use of repellent and long-lasting insecticidal nets (LLINs) could reduce malaria more than LLINs alone. A total of 1,597 households, including 7,979 participants, were recruited in June 2009 and April 2010. Equal group allocation, stratified by village, was used to randomise 795 households to a 15% DEET lotion and the remainder were given a placebo lotion. Participants, field staff and data analysts were blinded to the group assignment until data analysis had been completed. All households received new LLINs. Participants were asked to apply their lotion to exposed skin every evening and sleep under the LLINs each night. Plasmodium falciparum and P. vivax cases were actively identified by monthly rapid diagnostic tests. Intention to treat analysis found no effect from the use of repellent on malaria incidence (hazard ratio: 1.00, 95% CI: 0.99–1.01, p = 0.868). A higher socio-economic score was found to significantly decrease malaria risk (hazard ratio: 0.72, 95% CI: 0.58–0.90, p = 0.004). Women were also found to have a reduced risk of infection (hazard ratio: 0.59, 95% CI: 0.37–0.92, p = 0.020). A per-protocol analysis, which excluded participants who used the lotions less than 90% of the time, found similar results, with no effect from the use of repellent. 
Conclusions This randomised controlled trial suggests that topical repellents are not a suitable intervention in addition to LLINs against malaria amongst agricultural populations in southern Lao PDR. These results are also likely to be applicable to much of the Greater Mekong Sub-region. Trial Registration This trial is registered under number NCT00938379. PMID: 23967083
Effective tree hazard control on forested recreation sites...losses and protection costs evaluated
Lee A. Paine
1967-01-01
Effectiveness of hazard control was evaluated by analyzing data on tree failures, accidents, and control costs on California recreation sites. Results indicate that reduction of limb hazard in oaks and bole hazard in conifers is the most effective form of control. Least effective is limb hazard reduction in conifers. After hazard control goals or control budgets have...
Curran, Eileen A; Dalman, Christina; Kearney, Patricia M; Kenny, Louise C; Cryan, John F; Dinan, Timothy G; Khashan, Ali S
2015-09-01
Because the rates of cesarean section (CS) are increasing worldwide, it is becoming increasingly important to understand the long-term effects that mode of delivery may have on child development. To investigate the association between obstetric mode of delivery and autism spectrum disorder (ASD). Perinatal factors and ASD diagnoses based on the International Classification of Diseases, Ninth Revision (ICD-9), and the International Statistical Classification of Diseases, 10th Revision (ICD-10), were identified from the Swedish Medical Birth Register and the Swedish National Patient Register. We conducted stratified Cox proportional hazards regression analysis to examine the effect of mode of delivery on ASD. We then used conditional logistic regression to perform a sibling design study, which consisted of sibling pairs discordant on ASD status. Analyses were adjusted for year of birth (ie, partially adjusted) and then fully adjusted for various perinatal and sociodemographic factors. The population-based cohort study consisted of all singleton live births in Sweden from January 1, 1982, through December 31, 2010. Children were followed up until first diagnosis of ASD, death, migration, or December 31, 2011 (end of study period), whichever came first. The full cohort consisted of 2,697,315 children and 28,290 cases of ASD. Sibling control analysis consisted of 13,411 sibling pairs. Obstetric mode of delivery was defined as unassisted vaginal delivery (VD), assisted VD, elective CS, and emergency CS (defined by before or after onset of labor). ASD status was defined using codes from the ICD-9 (code 299) and ICD-10 (code F84). In adjusted Cox proportional hazards regression analysis, elective CS (hazard ratio, 1.21; 95% CI, 1.15-1.27) and emergency CS (hazard ratio, 1.15; 95% CI, 1.10-1.20) were associated with ASD when compared with unassisted VD.
In the sibling control analysis, elective CS was not associated with ASD in partially (odds ratio [OR], 0.97; 95% CI, 0.85-1.11) or fully adjusted (OR, 0.89; 95% CI, 0.76-1.04) models. Emergency CS was significantly associated with ASD in partially adjusted analysis (OR, 1.20; 95% CI, 1.06-1.36), but this effect disappeared in the fully adjusted model (OR, 0.97; 95% CI, 0.85-1.11). This study confirms previous findings that children born by CS are approximately 20% more likely to be diagnosed as having ASD. However, the association did not persist when using sibling controls, implying that this association is due to familial confounding by genetic and/or environmental factors.
Saito, Yoshihiko; Okada, Sadanori; Ogawa, Hisao; Soejima, Hirofumi; Sakuma, Mio; Nakayama, Masafumi; Doi, Naofumi; Jinnouchi, Hideaki; Waki, Masako; Masuda, Izuru; Morimoto, Takeshi
2017-02-14
The long-term efficacy and safety of low-dose aspirin for primary prevention of cardiovascular events in patients with type 2 diabetes mellitus are still inconclusive. The JPAD trial (Japanese Primary Prevention of Atherosclerosis With Aspirin for Diabetes) was a randomized, open-label, standard care-controlled trial examining whether low-dose aspirin affected cardiovascular events in 2539 Japanese patients with type 2 diabetes mellitus and without preexisting cardiovascular disease. Patients were randomly allocated to receive aspirin (81 or 100 mg daily; aspirin group) or no aspirin (no-aspirin group) in the JPAD trial. After that trial ended in 2008, we followed up with the patients until 2015, with no attempt to change the previously assigned therapy. Primary end points were cardiovascular events, including sudden death, fatal or nonfatal coronary artery disease, fatal or nonfatal stroke, and peripheral vascular disease. For the safety analysis, hemorrhagic events, consisting of gastrointestinal bleeding, hemorrhagic stroke, and bleeding from any other sites, were also analyzed. The primary analysis was conducted for cardiovascular events among patients who retained their original allocation (a per-protocol cohort). Analyses on an intention-to-treat cohort were conducted for hemorrhagic events and statistical sensitivity. The median follow-up period was 10.3 years; 1621 patients (64%) were followed up throughout the study; and 2160 patients (85%) retained their original allocation. Low-dose aspirin did not reduce cardiovascular events in the per-protocol cohort (hazard ratio, 1.14; 95% confidence interval, 0.91-1.42). Multivariable Cox proportional hazard model adjusted for age, sex, glycemic control, kidney function, smoking status, hypertension, and dyslipidemia showed similar results (hazard ratio, 1.04; 95% confidence interval, 0.83-1.30), with no heterogeneity of efficacy in subgroup analyses stratified by each of these factors (all interaction P >0.05). 
Sensitivity analyses on the intention-to-treat cohort yielded consistent results (hazard ratio, 1.01; 95% confidence interval, 0.82-1.25). Gastrointestinal bleeding occurred in 25 patients (2%) in the aspirin group and 12 (0.9%) in the no-aspirin group (P=0.03), and the incidence of hemorrhagic stroke was not different between groups. Low-dose aspirin did not affect the risk for cardiovascular events but increased the risk for gastrointestinal bleeding in patients with type 2 diabetes mellitus in a primary prevention setting. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00110448. © 2016 American Heart Association, Inc.
Germline PARP4 mutations in patients with primary thyroid and breast cancers.
Ikeda, Yuji; Kiyotani, Kazuma; Yew, Poh Yin; Kato, Taigo; Tamura, Kenji; Yap, Kai Lee; Nielsen, Sarah M; Mester, Jessica L; Eng, Charis; Nakamura, Yusuke; Grogan, Raymon H
2016-03-01
Germline mutations in the PTEN gene, which cause Cowden syndrome, are known to be one of the genetic factors for primary thyroid and breast cancers; however, PTEN mutations are found in only a small subset of research participants with non-syndrome breast and thyroid cancers. In this study, we aimed to identify germline variants that may be related to genetic risk of primary thyroid and breast cancers. Genomic DNAs extracted from peripheral blood of 14 PTEN WT female research participants with primary thyroid and breast cancers were analyzed by whole-exome sequencing. Gene-based case-control association analysis using the information of 406 Europeans obtained from the 1000 Genomes Project database identified 34 genes possibly associated with the phenotype with P < 1.0 × 10(-3). Among them, rare variants in the PARP4 gene were detected at significant high frequency (odds ratio = 5.2; P = 1.0 × 10(-5)). The variants, G496V and T1170I, were found in six of the 14 study participants (43%) while their frequencies were only 0.5% in controls. Functional analysis using HCC1143 cell line showed that knockdown of PARP4 with siRNA significantly enhanced the cell proliferation, compared with the cells transfected with siControl (P = 0.02). Kaplan-Meier analysis using Gene Expression Omnibus (GEO), European Genome-phenome Archive (EGA) and The Cancer Genome Atlas (TCGA) datasets showed poor relapse-free survival (P < 0.001, Hazard ratio 1.27) and overall survival (P = 0.006, Hazard ratio 1.41) in a PARP4 low-expression group, suggesting that PARP4 may function as a tumor suppressor. In conclusion, we identified PARP4 as a possible susceptibility gene of primary thyroid and breast cancer. © 2016 Society for Endocrinology.
Germline PARP4 mutations in patients with primary thyroid and breast cancers
Ikeda, Yuji; Kiyotani, Kazuma; Yew, Poh Yin; Kato, Taigo; Tamura, Kenji; Yap, Kai-Lee; Nielsen, Sarah M.; Mester, Jessica L; Eng, Charis; Nakamura, Yusuke; Grogan, Raymon H.
2016-01-01
Germline mutations in the PTEN gene, which cause Cowden syndrome (CS), are known to be one of the genetic factors for primary thyroid and breast cancers; however, PTEN mutations are found in only a small subset of research participants with non-syndrome breast and thyroid cancers. In this study, we aimed to identify germline variants that may be related to genetic risk of primary thyroid and breast cancers. Genomic DNAs extracted from peripheral blood of 14 PTEN-wild-type female research participants with primary thyroid and breast cancers were analyzed by whole-exome sequencing. Gene-based case-control association analysis using the information of 406 Europeans obtained from the 1000 Genomes Project database identified 34 genes possibly associated with the phenotype with P<1.0×10−3. Among them, rare variants in the PARP4 gene were detected at significant high frequency (odds ratio = 5.2, P = 1.0×10−5). The variants, G496V and T1170I, were found in 6 of the 14 study participants (43%) while their frequencies were only 0.5% in controls. Functional analysis using HCC1143 cell line showed that knockdown of PARP4 with siRNA significantly enhanced the cell proliferation, compared with the cells transfected with siControl (P = 0.02). Kaplan-Meier analysis using GEO, EGA and TCGA datasets showed poor progression-free survival (P = 0.006, Hazard ratio 0.71) and overall survival (P < 0.0001, Hazard ratio 0.79) in a PARP4 low-expression group, suggesting that PARP4 may function as a tumor suppressor. In conclusion, we identified PARP4 as a possible susceptibility gene of primary thyroid and breast cancer. PMID:26699384
Fogelfeld, Leon; Hart, Peter; Miernik, Jadwiga; Ko, Jocelyn; Calvin, Donna; Tahsin, Bettina; Adhami, Anwar; Mehrotra, Rajeev; Fogg, Louis
2017-03-01
To evaluate efficacy of a multifactorial-multidisciplinary approach in delaying CKD 3-4 progression to ESRD. Two-year proof-of-concept stratified randomized controlled trial conducted in an outpatient clinic of a large public hospital system. This intervention, led by a team of endocrinologists, nephrologists, nurse practitioners, and registered dietitians, integrated intensive diabetes-renal care with behavioral/dietary and pharmacological interventions. 120 low-income adults with T2DM and CKD 3-4 enrolled; 58% male, 55% African American, 23% Hispanic. Primary outcome was progression rate from CKD 3-4 to ESRD. Fewer intervention (13%) than control (28%) patients developed ESRD, p<0.05. Intervention had greater albumin/creatinine ratio (ACR) decrease (62% vs. 42%, p<0.05) and A1C<7% attainment (50% vs. 30%, p<0.05) and trended toward better lipid/blood pressure control (p=NS). Significant differences between 25 ESRD and 95 ESRD-free patients were baseline eGFR (28 vs. 40 ml/min/1.73 m²), annual eGFR decline (15 vs. 3 ml/min/year), baseline ACR (2362 vs. 1139 mg/g), final ACR (2896 vs. 1201 mg/g), and final A1C (6.9 vs. 7.8%). In multivariate Cox analysis, receiving the intervention reduced the hazard ratio to develop ESRD (0.125, CI 0.029-0.54), as did higher baseline eGFR (0.69, CI 0.59-0.80). Greater annual eGFR decline increased the hazard ratio (1.59, CI 1.34-1.87). The intervention delayed ESRD. Improved A1C and ACR plus not-yet-identified variables may have influenced better outcomes. Multifactorial-multidisciplinary care may serve as a CKD 3-4 treatment paradigm. Copyright © 2017 Elsevier Inc. All rights reserved.
Shahan, M R; Seaman, C E; Beck, T W; Colinet, J F; Mischler, S E
2017-09-01
Float coal dust is produced by various mining methods, carried by ventilating air and deposited on the floor, roof and ribs of mine airways. If deposited, float dust is re-entrained during a methane explosion. Without sufficient inert rock dust quantities, this float coal dust can propagate an explosion throughout mining entries. Consequently, controlling float coal dust is of critical interest to mining operations. Rock dusting, which is the adding of inert material to airway surfaces, is the main control technique currently used by the coal mining industry to reduce the float coal dust explosion hazard. To assist the industry in reducing this hazard, the Pittsburgh Mining Research Division of the U.S. National Institute for Occupational Safety and Health initiated a project to investigate methods and technologies to reduce float coal dust in underground coal mines through prevention, capture and suppression prior to deposition. Field characterization studies were performed to determine quantitatively the sources, types and amounts of dust produced during various coal mining processes. The operations chosen for study were a continuous miner section, a longwall section and a coal-handling facility. For each of these operations, the primary dust sources were confirmed to be the continuous mining machine, longwall shearer and conveyor belt transfer points, respectively. Respirable and total airborne float dust samples were collected and analyzed for each operation, and the ratio of total airborne float coal dust to respirable dust was calculated. During the continuous mining process, the ratio of total airborne float coal dust to respirable dust ranged from 10.3 to 13.8. The ratios measured on the longwall face were between 18.5 and 21.5. The total airborne float coal dust to respirable dust ratio observed during belt transport ranged between 7.5 and 21.8.
Prabhu, Anil; Tully, Phillip J; Bennetts, Jayme S; Tuble, Sigrid C; Baker, Robert A
2013-08-01
Though Indigenous Australian peoples reportedly have poorer survival outcomes after cardiac surgery, few studies have jointly documented the experience of major morbidity and considered the influence of patient geographic remoteness. From January 1998 to September 2008, major morbidity events and survival were recorded for 2748 consecutive patients undergoing coronary artery bypass graft surgery. Morbidity and survival analyses adjusted for propensity deciles based on patient ethnicity and age, sex, left ventricular ejection fraction, recent myocardial infarction, tobacco smoking, diabetes, renal disease and history of stroke. Sensitivity analyses controlled for the patient accessibility/remoteness index of Australia (ARIA). The 297 Indigenous Australian patients (10.8% of total) had greater odds for total morbidity (adjusted odds ratio = 1.55; 95% confidence interval [CI] 1.04-2.30) and prolonged ventilation (adjusted odds ratio = 2.08; 95% CI 1.25-3.44) in analyses adjusted for propensity deciles and geographic remoteness. With a median follow-up of 7.5 years (interquartile range 5.2-10.2), Indigenous Australian patients were found to experience 30% greater mortality risk (unadjusted hazard ratio = 1.30; 95% CI: 1.03-1.64, p = 0.03). The effect size strengthened after adjustment for propensity score (adjusted hazard ratio = 1.49; 95% CI: 1.13-1.96, p = 0.004), and adjustment for ARIA categorisation strengthened it further (adjusted hazard ratio = 1.54; 95% CI: 1.11-2.13, p = 0.009). Indigenous Australian peoples were at greater risk for prolonged ventilation and the combined morbidity outcome, and experienced poorer survival in the longer term. Higher mortality risk among Indigenous Australians was evident even after controlling for remoteness and accessibility to services. Crown Copyright © 2013. Published by Elsevier B.V. All rights reserved.
Cancer Survival Estimates Due to Non-Uniform Loss to Follow-Up and Non-Proportional Hazards
K M, Jagathnath Krishna; Mathew, Aleyamma; Sara George, Preethi
2017-06-25
Background: Cancer survival estimates depend on loss to follow-up (LFU) and non-proportional hazards (non-PH). If LFU is high, survival will be over-estimated. If a hazard is non-PH, rank tests will provide biased inference and the Cox model will provide a biased hazard ratio. We assessed the bias due to LFU and a non-PH factor in cancer survival and provide alternate methods for unbiased inference and hazard ratios. Materials and Methods: Kaplan-Meier survival curves were plotted using a realistic breast cancer (BC) data-set with >40% 5-year LFU and compared with another BC data-set with <15% 5-year LFU to assess the bias in survival due to high LFU. Age at diagnosis in the latter data-set was used to illustrate the bias due to a non-PH factor. The log-rank test was employed to assess the bias in the p-value, and the Cox model was used to assess the bias in the hazard ratio for the non-PH factor. The Schoenfeld statistic was used to test the non-PH of age. For the non-PH factor, we employed the Renyi statistic for inference and a time-dependent Cox model for the hazard ratio. Results: Five-year BC survival was 69% (SE: 1.1%) vs. 90% (SE: 0.7%) for the data with low vs. high LFU, respectively. Age (<45, 46-54 & >54 years) was a non-PH factor (p-value: 0.036). Survival by age was significant by the log-rank test (p-value: 0.026) but not by the Renyi statistic (p = 0.067). The hazard ratio (HR) for age using the Cox model was 1.012 (95% CI: 1.004-1.019), while that from the time-dependent Cox model was in the other direction (HR: 0.997; 95% CI: 0.997-0.998). Conclusion: Over-estimated survival was observed for cancer with high LFU. The log-rank statistic and Cox model provided biased results for the non-PH factor. For data with non-PH factors, the Renyi statistic and a time-dependent Cox model can be used as alternate methods to obtain unbiased inference and estimates. Creative Commons Attribution License
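The estimator at the center of this abstract can be sketched in a few lines: the Kaplan-Meier method multiplies the survival probability by (1 − d/n) at each event time, and subjects lost to follow-up simply leave the risk set. This is a generic illustration with hypothetical data, not the authors' code or data-set; it shows the mechanism by which informative loss to follow-up can bias the curve.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.

    times:  follow-up times for each subject
    events: 1 = event (e.g. death) observed, 0 = censored (lost to follow-up)

    Censored subjects leave the risk set without contributing an event, so if
    losses to follow-up are concentrated among sicker patients, the estimated
    survival S(t) is biased upward -- the over-estimation the abstract notes.
    Returns a list of (time, S(t)) pairs at the distinct event times.
    """
    pairs = sorted(zip(times, events))
    at_risk = len(pairs)
    surv = 1.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = censored = 0
        # collect all subjects with the same time t (ties)
        while i < len(pairs) and pairs[i][0] == t:
            deaths += pairs[i][1]
            censored += 1 - pairs[i][1]
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk  # product-limit step
            curve.append((t, surv))
        at_risk -= deaths + censored  # censored subjects exit the risk set

    return curve

# Hypothetical mini-cohort: one subject censored at t=3.
print(kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 1]))
```

With the censored subject removed from the risk set after t=3, the step at t=4 divides by 2 rather than 3, which is exactly how the estimator redistributes mass when follow-up is incomplete.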
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-27
... Awards for Lead-Based Paint Hazard Control, and Lead Hazard Reduction Demonstration Grant Programs for... (OHHLHC) Lead-Based Paint Hazard Control, and Lead Hazard Reduction Demonstration Grant Program Notices of... Grants.gov on December 3, 2012, and amended on January 18, 2013, for the Lead Based Paint Hazard Control...
Kendzerska, Tetyana; To, Teresa M; Aaron, Shawn D; Lougheed, M Diane; Sadatsafavi, Mohsen; FitzGerald, J Mark; Gershon, Andrea S
2017-03-01
Little is known about the natural history of chronic obstructive pulmonary disease (COPD) that has developed from airway remodeling due to asthma, as compared with other COPD phenotypes. We compared long-term health outcomes of individuals with COPD with and without a history of asthma in a population-based cohort study. All individuals with physician-diagnosed COPD between the ages of 40 and 55 years from 2009 to 2011 were identified and followed until March 2013 through provincial health administrative data (Ontario, Canada). The exposure was a history of asthma at least 2 years before the diagnosis of COPD, to ensure it preceded COPD. The hazards of COPD-, respiratory-, and cardiovascular (CV)-related hospitalizations and all-cause mortality were compared between groups using a Cox regression model controlling for demographic characteristics, comorbidities, and level of health care. Among 9053 patients with COPD, 2717 (30%) had a history of asthma. Over a median of 2.9 years, 712 (8%) individuals had a first COPD hospitalization, 964 (11%) a first respiratory-related and 342 (4%) a first CV-related hospitalization, and 556 (6%) died. Controlling for confounding, a history of asthma was significantly associated with COPD and respiratory-related hospitalizations (hazard ratio, 1.53 [95% CI, 1.29-1.82] and hazard ratio, 1.63 [95% CI, 1.14-1.88], respectively), but not with CV-related hospitalizations or all-cause mortality. Additional analyses confirmed that these findings were not likely a result of unmeasured confounding or misclassification. Middle-aged individuals with physician-diagnosed COPD and a history of asthma had a higher hazard of hospitalizations due to COPD and other respiratory diseases than did those without. Copyright © 2016 American Academy of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.
Health Insurance Trajectories and Long-Term Survival After Heart Transplantation.
Tumin, Dmitry; Foraker, Randi E; Smith, Sakima; Tobias, Joseph D; Hayes, Don
2016-09-01
Health insurance status at heart transplantation influences recipient survival, but implications of change in insurance for long-term outcomes are unclear. Adults aged 18 to 64 receiving first-time orthotopic heart transplants between July 2006 and December 2013 were identified in the United Network for Organ Sharing registry. Patients surviving >1 year were categorized according to trajectory of insurance status (private compared with public) at wait listing, transplantation, and 1-year follow-up. The most common insurance trajectories were continuous private coverage (44%), continuous public coverage (27%), and transition from private to public coverage (11%). Among patients who survived to 1 year (n=9088), continuous public insurance (hazard ratio =1.36; 95% confidence interval 1.19, 1.56; P<0.001) and transition from private to public insurance (hazard ratio =1.25; 95% confidence interval 1.04, 1.50; P=0.017) were associated with increased mortality hazard relative to continuous private insurance. Supplementary analyses of 11 247 patients included all durations of post-transplant survival and examined post-transplant private-to-public and public-to-private transitions as time-varying covariates. In these analyses, transition from private to public insurance was associated with increased mortality hazard (hazard ratio =1.25; 95% confidence interval 1.07, 1.47; P=0.005), whereas transition from public to private insurance was associated with lower mortality hazard (hazard ratio =0.78; 95% confidence interval 0.62, 0.97; P=0.024). Transition from private to public insurance after heart transplantation is associated with worse long-term outcomes, compounding disparities in post-transplant survival attributed to insurance status at transplantation. By contrast, post-transplant gain of private insurance among patients receiving publicly funded heart transplants was associated with improved outcomes. © 2016 American Heart Association, Inc.
Kumar, N B; Cantor, A; Allen, K; Cox, C E
2000-06-15
Although a large body of research exists concerning pathologic prognostic indicators of the rate of incidence and survival from breast carcinoma, to the authors' knowledge very few studies have examined the effects on survival of anthropometric variables such as height, obesity, weight gain in adulthood, timing of weight gain, and body composition, although these variables are related to the incidence rate. The survival status of 166 patients diagnosed with primary breast carcinoma and followed for at least 10 years was obtained from the Cancer Center's registry, and significant anthropometric and other known prognostic indicators regarding survival after diagnosis were determined by Cox proportional hazards analysis. Eighty-three of 166 breast carcinoma patients (50%) with up to 10 years of follow-up died of disease. Android body fat distribution, as indicated by a higher suprailiac:thigh ratio, was a statistically significant (P < 0.0001) prognostic indicator for survival after controlling for stage of disease, with a hazards ratio of 2.6 (95% confidence interval [95% CI], 1.63-4.17). Adult weight gain, as indicated specifically by weight at age 30 years, was a statistically significant (P < 0.05) prognostic indicator for survival with a hazards ratio of 1.15 (95% CI, 1.0-1.28). In addition, the authors observed the Quetelet Index to be a negatively significant (P < 0.01) prognostic indicator for survival, with a hazards ratio of 0.92 (95% CI, 0.87-0.98). Other markers of general obesity such as weight at diagnosis, percent body fat, and body surface area were not significant markers influencing survival. Similarly, height; triceps, biceps, subscapular, suprailiac, abdominal, and thigh skinfolds; waist and hip circumferences; family history; and reproductive and hormonal variables at the time of diagnosis showed no apparent significant relation to survival.
The results of the current study provide some evidence that android body fat distribution at diagnosis and increased weight at age 30 years increases a woman's risk of dying of breast carcinoma. Copyright 2000 American Cancer Society.
Saxagliptin and cardiovascular outcomes in patients with type 2 diabetes mellitus.
Scirica, Benjamin M; Bhatt, Deepak L; Braunwald, Eugene; Steg, P Gabriel; Davidson, Jaime; Hirshberg, Boaz; Ohman, Peter; Frederich, Robert; Wiviott, Stephen D; Hoffman, Elaine B; Cavender, Matthew A; Udell, Jacob A; Desai, Nihar R; Mosenzon, Ofri; McGuire, Darren K; Ray, Kausik K; Leiter, Lawrence A; Raz, Itamar
2013-10-03
The cardiovascular safety and efficacy of many current antihyperglycemic agents, including saxagliptin, a dipeptidyl peptidase 4 (DPP-4) inhibitor, are unclear. We randomly assigned 16,492 patients with type 2 diabetes who had a history of, or were at risk for, cardiovascular events to receive saxagliptin or placebo and followed them for a median of 2.1 years. Physicians were permitted to adjust other medications, including antihyperglycemic agents. The primary end point was a composite of cardiovascular death, myocardial infarction, or ischemic stroke. A primary end-point event occurred in 613 patients in the saxagliptin group and in 609 patients in the placebo group (7.3% and 7.2%, respectively, according to 2-year Kaplan-Meier estimates; hazard ratio with saxagliptin, 1.00; 95% confidence interval [CI], 0.89 to 1.12; P=0.99 for superiority; P<0.001 for noninferiority); the results were similar in the "on-treatment" analysis (hazard ratio, 1.03; 95% CI, 0.91 to 1.17). The major secondary end point of a composite of cardiovascular death, myocardial infarction, stroke, hospitalization for unstable angina, coronary revascularization, or heart failure occurred in 1059 patients in the saxagliptin group and in 1034 patients in the placebo group (12.8% and 12.4%, respectively, according to 2-year Kaplan-Meier estimates; hazard ratio, 1.02; 95% CI, 0.94 to 1.11; P=0.66). More patients in the saxagliptin group than in the placebo group were hospitalized for heart failure (3.5% vs. 2.8%; hazard ratio, 1.27; 95% CI, 1.07 to 1.51; P=0.007). Rates of adjudicated cases of acute and chronic pancreatitis were similar in the two groups (acute pancreatitis, 0.3% in the saxagliptin group and 0.2% in the placebo group; chronic pancreatitis, <0.1% and 0.1% in the two groups, respectively). DPP-4 inhibition with saxagliptin did not increase or decrease the rate of ischemic events, though the rate of hospitalization for heart failure was increased. 
Although saxagliptin improves glycemic control, other approaches are necessary to reduce cardiovascular risk in patients with diabetes. (Funded by AstraZeneca and Bristol-Myers Squibb; SAVOR-TIMI 53 ClinicalTrials.gov number, NCT01107886.).
May, Heidi T; Nelson, John R; Lirette, Seth T; Kulkarni, Krishnaji R; Anderson, Jeffrey L; Griswold, Michael E; Horne, Benjamin D; Correa, Adolfo; Muhlestein, Joseph B
2016-05-01
Dyslipidemia plays a significant role in the progression of cardiovascular disease. The apolipoprotein (apo) A1 remnant ratio (apo A1/[VLDL3-C + IDL-C]) has recently been shown to be a strong predictor of death/myocardial infarction risk among women >50 years undergoing angiography. However, whether this ratio is associated with coronary heart disease risk among other populations is unknown. We evaluated the apo A1 remnant ratio and its components for coronary heart disease incidence. Observational. Participants (N = 4722) of the Jackson Heart Study were evaluated. Baseline clinical characteristics and lipoprotein subfractions (Vertical Auto Profile method) were collected. Cox hazard regression analysis, adjusted for standard cardiovascular risk factors, was utilized to determine associations of lipoproteins with coronary heart disease. Those with new-onset coronary heart disease were older, diabetic, smokers, had less education, used more lipid-lowering medication, and had a more atherogenic lipoprotein profile. After adjustment, the apo A1 remnant ratio (hazard ratio = 0.67 per 1-SD, p = 0.002) was strongly associated with coronary heart disease incidence. This association appears to be driven by the IDL-C denominator (hazard ratio = 1.23 per 1-SD, p = 0.007). Remnants (hazard ratio = 1.21 per 1-SD, p = 0.017), but not apo A1 (hazard ratio = 0.85 per 1-SD, p = 0.121) or VLDL3-C (hazard ratio = 1.13 per 1-SD, p = 0.120), were associated with coronary heart disease. Standard lipids were not associated with coronary heart disease incidence. We found the apo A1 remnant ratio to be strongly associated with coronary heart disease. This ratio appears to better stratify risk than standard lipids, apo A1, and remnants among a primary prevention cohort of African Americans. Its utility requires further study as a lipoprotein management target for risk reduction. © The European Society of Cardiology 2015.
Shirani, Afsaneh; Zhao, Yinshan; Karim, Mohammad Ehsanul; Evans, Charity; Kingwell, Elaine; van der Kop, Mia L; Oger, Joel; Gustafson, Paul; Petkau, John; Tremlett, Helen
2012-07-18
Interferon beta is widely prescribed to treat multiple sclerosis (MS); however, its relationship with disability progression has yet to be established. To investigate the association between interferon beta exposure and disability progression in patients with relapsing-remitting MS. Retrospective cohort study based on prospectively collected data (1985-2008) from British Columbia, Canada. Patients with relapsing-remitting MS treated with interferon beta (n = 868) were compared with untreated contemporary (n = 829) and historical (n = 959) cohorts. The main outcome measure was time from interferon beta treatment eligibility (baseline) to a confirmed and sustained score of 6 (requiring a cane to walk 100 m; confirmed at >150 days with no measurable improvement) on the Expanded Disability Status Scale (EDSS) (range, 0-10, with higher scores indicating higher disability). A multivariable Cox regression model with interferon beta treatment included as a time-varying covariate was used to assess the hazard of disease progression associated with interferon beta treatment. Analyses also included propensity score adjustment to address confounding by indication. The median active follow-up times (first to last EDSS measurement) were as follows: for the interferon beta-treated cohort, 5.1 years (interquartile range [IQR], 3.0-7.0 years); for the contemporary control cohort, 4.0 years (IQR, 2.1-6.4 years); and for the historical control cohort, 10.8 years (IQR, 6.3-14.7 years). The observed outcome rates for reaching a sustained EDSS score of 6 were 10.8%, 5.3%, and 23.1% in the 3 cohorts, respectively. 
After adjustment for potential baseline confounders (sex, age, disease duration, and EDSS score), exposure to interferon beta was not associated with a statistically significant difference in the hazard of reaching an EDSS score of 6 when either the contemporary control cohort (hazard ratio, 1.30; 95% CI, 0.92-1.83; P = .14) or the historical control cohort (hazard ratio, 0.77; 95% CI, 0.58-1.02; P = .07) were considered. Further adjustment for comorbidities and socioeconomic status, where possible, did not change interpretations, and propensity score adjustment did not substantially change the results. Among patients with relapsing-remitting MS, administration of interferon beta was not associated with a reduction in progression of disability.
The Maximum Cumulative Ratio (MCR) quantifies the degree to which a single component of a chemical mixture drives the cumulative risk of a receptor. This study used the MCR, the Hazard Index (HI), and the Hazard Quotient (HQ) to evaluate co-exposures to six phthalates using biomonito...
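Under the standard definitions of these metrics (each HQ is an exposure-to-reference ratio, HI is the sum of the HQs, and MCR is HI divided by the largest HQ), the calculation can be sketched as follows. The exposure and reference values are made-up illustrative numbers, not data from the study.

```python
# Sketch of the cumulative-risk metrics named above, under the standard
# definitions: HQ_i = exposure_i / reference_i, HI = sum of HQs,
# MCR = HI / max(HQ). Values below are illustrative only.

def hazard_quotients(exposures, references):
    return [e / r for e, r in zip(exposures, references)]

def hazard_index(hqs):
    return sum(hqs)

def maximum_cumulative_ratio(hqs):
    # MCR near 1 means one component dominates the cumulative risk;
    # MCR near the number of components means risk is spread evenly.
    return hazard_index(hqs) / max(hqs)

hqs = hazard_quotients([0.2, 0.1, 0.1], [1.0, 1.0, 1.0])
hi = hazard_index(hqs)                # 0.4
mcr = maximum_cumulative_ratio(hqs)   # 0.4 / 0.2 = 2.0
```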
14 CFR 417.413 - Hazard areas.
Code of Federal Regulations, 2014 CFR
2014-01-01
... controls during public access. A launch operator must establish procedural controls that prevent hazardous... that system hazard controls are in place that prevent initiation of a hazardous event. Hazard controls... devices or other restraints on system actuation switches or other controls to eliminate the possibility of...
Sun, Wenjie; Schooling, C Mary; Chan, Wai Man; Ho, Kin Sang; Lam, Tai Hing
2011-04-01
Increasingly, researchers have begun to explore the association between depression and mortality. The current study examined the association between depressive symptoms and all-cause and cause-specific mortality in Chinese older people. We further examined whether any associations differed by sex and health status. We used the Chinese version of the 15-item Geriatric Depression Scale to measure depressive symptoms (Geriatric Depression Scale score ≥ 8) and Cox regression to examine the association with all-cause and cause-specific mortality in a population-based cohort study of all 56,088 enrollees, aged 65 years or older, from July 1998 to December 2000 at all 18 Elderly Health Centers of the Department of Health of Hong Kong. The cohort was followed up for mortality until December 31, 2005. Depressive symptoms were associated with all-cause mortality (hazard ratio 1.21, 95% confidence interval: 1.08-1.37) in men only (p value for sex interaction <.05) and with suicide mortality in men (hazard ratio 2.81, 95% confidence interval: 1.13-7.01) and women (hazard ratio 2.40, 95% confidence interval: 1.18-4.82) but not with other major causes of death after adjusting for age, education, monthly expenditure, smoking, alcohol drinking, physical activity, body mass index, health status, and self-rated health. The associations did not vary with health status. Depressive symptoms were associated with all-cause mortality in men and with suicide in both sexes. Randomized controlled trials concerning the effects of treatment of depression on mortality are needed to clarify the causal pathways.
McKee, Richard H; Tibaldi, Rosalie; Adenuga, Moyinoluwa D; Carrillo, Juan-Carlos; Margary, Alison
2018-02-01
The European chemical control regulation (REACH) requires that data on physical/chemical, toxicological and environmental hazards be compiled. Additionally, REACH requires formal assessments to ensure that substances can be safely used for their intended purposes. For health hazard assessments, reference values (Derived No Effect levels, DNELs) are calculated from toxicology data and compared to estimated exposure levels. If the ratio of the predicted exposure level to the DNEL, i.e. the Risk Characterization Ratio (RCR), is less than 1, the risk is considered controlled; otherwise, additional Risk Management Measures (RMM) must be applied. These requirements pose particular challenges for complex substances. Herein, "white spirit", a complex hydrocarbon solvent, is used as an example to illustrate how these procedures were applied. Hydrocarbon solvents were divided into categories of similar substances. Representative substances were identified for DNEL determinations. Adjustment factors were applied to the no effect levels to calculate the DNELs. Exposure assessments utilized a standardized set of generic exposure scenarios (GES) which incorporated exposure predictions for solvent handling activities. Computer-based tools were developed to automate RCR calculations and identify appropriate RMMs, allowing consistent communications to users via safety data sheets. Copyright © 2017 ExxonMobil Biomedical Sciences Inc. Published by Elsevier Inc. All rights reserved.
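The RCR decision rule described above reduces to a simple comparison. The sketch below uses hypothetical numbers, not values from the white spirit assessment or the REACH tools.

```python
# Minimal sketch of the Risk Characterization Ratio logic described above:
# RCR = predicted exposure / DNEL. RCR < 1 means the risk is considered
# controlled; otherwise additional Risk Management Measures are required.
# The numbers are hypothetical, not values from the assessment.

def risk_characterization_ratio(predicted_exposure, dnel):
    return predicted_exposure / dnel

def risk_controlled(rcr):
    return rcr < 1.0

rcr = risk_characterization_ratio(predicted_exposure=20.0, dnel=100.0)  # 0.2
assert risk_controlled(rcr)
```

The same comparison is what the computer-based tools mentioned in the abstract automate across many exposure scenarios at once.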
Ji, W H; Jiang, Y H; Ji, Y L; Li, B; Mao, W M
2016-07-01
The study aimed to evaluate the prognostic significance of prechemotherapy neutrophil to lymphocyte ratio and platelet to lymphocyte ratio, and preoperative neutrophil to lymphocyte ratio and platelet to lymphocyte ratio in locally advanced esophageal squamous cell cancer. We retrospectively analyzed locally advanced esophageal squamous cell cancer patients who had received neoadjuvant chemotherapy before undergoing a radical esophagectomy between 2009 and 2012. Neutrophil to lymphocyte ratio and platelet to lymphocyte ratio before chemotherapy and before the surgery were calculated. Univariate analyses showed that prechemotherapy neutrophil to lymphocyte ratio >5 (P = 0.048, hazard ratio = 2.86; 95% confidence interval: 1.01-8.12) and prechemotherapy platelet to lymphocyte ratio >130 (P = 0.025, hazard ratio = 5.50; 95% confidence interval: 1.23-24.55) were associated significantly with overall survival (OS), and prechemotherapy platelet to lymphocyte ratio >130 (P = 0.026, hazard ratio = 3.18; 95% confidence interval: 1.15-8.85) was associated significantly with progression-free survival. However, only prechemotherapy neutrophil to lymphocyte ratio >5 (P = 0.024, hazard ratio = 3.50; 95% confidence interval: 1.18-10.40) remained significantly associated with OS in multivariate analyses. Neither preoperative neutrophil to lymphocyte ratio nor platelet to lymphocyte ratio was associated with OS or progression-free survival. The prechemotherapy neutrophil to lymphocyte ratio >5 to preoperative neutrophil to lymphocyte ratio ≤5 group showed significantly worse OS than the prechemotherapy neutrophil to lymphocyte ratio ≤5 to preoperative neutrophil to lymphocyte ratio ≤5 group (P = 0.050).
The prechemotherapy platelet to lymphocyte ratio >130 to preoperative platelet to lymphocyte ratio ≤130 group (P = 0.016) and the platelet to lymphocyte ratio >130 to preoperative platelet to lymphocyte ratio >130 group (P = 0.042) showed significantly worse OS than the prechemotherapy platelet to lymphocyte ratio ≤130 to preoperative platelet to lymphocyte ratio ≤130 group. In conclusion, prechemotherapy neutrophil to lymphocyte ratio is an independent prognostic factor for OS in patients with advanced esophageal squamous cell cancer treated with neoadjuvant chemotherapy, and, as an adverse prognostic predictor, increased prechemotherapy neutrophil to lymphocyte ratio is superior to platelet to lymphocyte ratio. Maintaining a low neutrophil to lymphocyte ratio and platelet to lymphocyte ratio throughout treatment is a predictor of better OS. © 2015 International Society for Diseases of the Esophagus.
Pei, Xueqing; Liu, Yu; Sun, Liwei; Zhang, Jun; Fang, Yuanyuan; Liao, Xin; Liu, Jian; Zhang, Cuntai; Yin, Tiejun
2016-12-01
The aim of this study was to evaluate the efficacy and toxicity of molecular targeted agents plus chemotherapy compared with chemotherapy alone as second-line therapy for patients with metastatic colorectal cancer (mCRC). We identified randomized controlled trials that compared molecular targeted agents plus chemotherapy with chemotherapy alone by searching the PubMed and Embase databases for articles published between January 2000 and September 2015. The outcome measures included progression-free survival, overall survival, objective response rate, and adverse events. Two investigators independently performed the information retrieval, screening, and data extraction. Stata 10.0 software was used to statistically analyze the extracted data. In accordance with our inclusion criteria, 11 trials, with a total of 7440 patients, were included in this meta-analysis through rounds of selection. We divided the biologic agents used into 3 subgroups based on the type of biologic agents-vascular endothelial growth factor (VEGF) inhibitor, epidermal growth factor receptor inhibitor, and other pathway inhibitors. Our results suggested that the regimen of a molecular targeted agent plus chemotherapy had a significant advantage in progression-free survival, overall survival, and objective response rate over chemotherapy alone (hazard ratio, 0.74; 95% confidence interval [CI], 0.70-0.78; hazard ratio, 0.88; 95% CI, 0.83-0.93; risk ratio, 2.24; 95% CI: 1.58-3.17, respectively). However, the rate of grade ≥ 3 adverse events was also higher in the combination therapy arm (risk ratio, 1.25; 95% CI, 1.17-1.33). Subgroup analysis showed that the combination of VEGF inhibitor with chemotherapy had a significant advantage in PFS, OS, and ORR over chemotherapy alone, but there was also a higher risk ratio in adverse events for this combination compared with the control group. 
In conclusion, a molecular targeted agent, especially VEGF inhibitor, plus chemotherapy is a worthwhile combination for patients with metastatic colorectal cancer as second-line therapy. However, more randomized controlled trials on a larger scale are needed for evaluating the value of epidermal growth factor receptor and other pathway inhibitors. Copyright © 2016 Elsevier Inc. All rights reserved.
Watanabe, Shunsuke; Yoshihisa, Akiomi; Kanno, Yuki; Takiguchi, Mai; Yokokawa, Tetsuro; Sato, Akihiko; Miura, Shunsuke; Shimizu, Takeshi; Abe, Satoshi; Sato, Takamasa; Suzuki, Satoshi; Oikawa, Masayoshi; Sakamoto, Nobuo; Yamaki, Takayoshi; Sugimoto, Koichi; Kunii, Hiroyuki; Nakazato, Kazuhiko; Suzuki, Hitoshi; Saitoh, Shu-Ichi; Takeishi, Yasuchika
2016-12-01
Intake of n-3 polyunsaturated fatty acids (n-3 PUFAs) lowers the risk of atherosclerotic cardiovascular events, particularly ischemic heart disease. In addition, the ratio of eicosapentaenoic acid (EPA; n-3 PUFA) to arachidonic acid (AA; n-6 PUFA) has recently been recognized as a risk marker of cardiovascular disease. In contrast, the prognostic impact of the EPA/AA ratio on patients with heart failure (HF) remains unclear. A total of 577 consecutive patients admitted for HF were divided into 2 groups based on the median of the EPA/AA ratio: low EPA/AA (EPA/AA <0.32, n = 291) and high EPA/AA (EPA/AA ≥0.32, n = 286) groups. We compared laboratory data and echocardiographic findings and followed cardiac mortality. Although body mass index, blood pressure, B-type natriuretic peptide, hemoglobin, estimated glomerular filtration rate, total protein, albumin, sodium, C-reactive protein, and left ventricular ejection fraction did not differ between the 2 groups, cardiac mortality was significantly higher in the low EPA/AA group than in the high EPA/AA group (12.7 vs 5.9%, log-rank P = .004). Multivariate Cox proportional hazard analysis revealed that the EPA/AA ratio was an independent predictor of cardiac mortality (hazard ratio 0.677, 95% confidence interval 0.453-0.983, P = .041) in patients with HF. The EPA/AA ratio was an independent predictor of cardiac mortality in patients with HF; therefore, the prognosis of patients with HF may be improved by appropriate management of the EPA/AA balance. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.
A more powerful test based on ratio distribution for retention noninferiority hypothesis.
Deng, Ling; Chen, Gang
2013-03-11
Rothmann et al. (2003) proposed a method for the statistical inference of the fraction retention noninferiority (NI) hypothesis. A fraction retention hypothesis is defined as a ratio of the new treatment effect versus the control effect in the context of a time to event endpoint. One of the major concerns using this method in the design of an NI trial is that, with a limited sample size, the power of the study is usually very low. This makes an NI trial impractical, particularly when using a time to event endpoint. To improve power, Wang et al. (2006) proposed a ratio test based on asymptotic normality theory. Under a strong assumption (equal variance of the NI test statistic under the null and alternative hypotheses), the sample size using Wang's test was much smaller than that using Rothmann's test. However, in practice, the assumption of equal variance is generally questionable for an NI trial design. This assumption is removed in the ratio test proposed in this article, which is derived directly from a Cauchy-like ratio distribution. In addition, using this method, the fundamental assumption used in Rothmann's test, that the observed control effect is always positive, that is, that the observed hazard ratio for placebo over the control is greater than 1, is no longer necessary. Without assuming equal variance under the null and alternative hypotheses, the sample size required for an NI trial can be significantly reduced by using the proposed ratio test for a fraction retention NI hypothesis.
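Why a test built directly on the ratio distribution behaves differently from a normal-theory test can be seen by simulation: the ratio of two independent normal estimates has Cauchy-like heavy tails. The means and variances below are arbitrary illustrative choices, not quantities from the article.

```python
# Illustrative simulation (not the paper's derivation): the ratio of two
# independent normal effect estimates has Cauchy-like heavy tails, which
# is why inference based on the ratio distribution differs from inference
# that assumes the ratio statistic is approximately normal.
import random

random.seed(0)
n = 100_000
# Hypothetical effect estimates: numerator ~ N(1, 1), denominator ~ N(2, 1).
ratios = [random.gauss(1, 1) / random.gauss(2, 1) for _ in range(n)]

# A normal variable essentially never lands 10 standard deviations out,
# but the ratio's heavy tails produce such extremes at a visible rate
# (driven by draws where the denominator is near zero).
extreme = sum(1 for r in ratios if abs(r) > 10) / n
```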
Gough, Michael S.; Morgan, Mary Anne M.; Mack, Cynthia M.; Darling, Denise C.; Frasier, Lauren M.; Doolin, Kathleen P.; Apostolakos, Michael J.; Stewart, Judith C.; Graves, Brian T.; Arning, Erland; Bottiglieri, Teodoro; Mooney, Robert A.; Frampton, Mark W.; Pietropaoli, Anthony P.
2011-01-01
Objective: Arginine deficiency may contribute to microvascular dysfunction, but previous studies suggest that arginine supplementation may be harmful in sepsis. Systemic arginine availability can be estimated by measuring the ratio of arginine to its endogenous inhibitors, asymmetric and symmetric dimethylarginine. We hypothesized that the arginine to dimethylarginine (Arg/DMA) ratio is reduced in patients with severe sepsis and associated with severity of illness and outcomes. Design: Case-control and prospective cohort study. Setting: Medical and surgical intensive care units of an academic medical center. Patients and Subjects: 109 severe sepsis and 50 control subjects. Measurements and Main Results: Plasma and urine were obtained in control subjects and within 48 hours of diagnosis in severe sepsis patients. The Arg/DMA ratio was higher in control subjects vs. sepsis patients (median = 95 [inter-quartile range = 85-114] vs. 34 [24-48], p < 0.001), and in hospital survivors vs. non-survivors (39 [26-52] vs. 27 [19-32], p = 0.004). The Arg/DMA ratio was correlated with Acute Physiology and Chronic Health Evaluation II score (Spearman's correlation coefficient [rho] = -0.40, p < 0.001) and organ-failure free days (rho = 0.30, p = 0.001). A declining Arg/DMA ratio was independently associated with hospital mortality (odds ratio = 1.63 per quartile, 95% confidence interval [CI] = 1.00-2.65, p = 0.048) and risk of death over 6 months (hazard ratio = 1.41 per quartile, 95% CI = 1.01-1.98, p = 0.043). The Arg/DMA ratio was correlated with the urinary nitrate to creatinine ratio (rho = 0.46, p < 0.001). Conclusions: The Arg/DMA ratio is associated with severe sepsis, severity of illness, and clinical outcomes. The Arg/DMA ratio may be a useful biomarker, and interventions designed to augment systemic arginine availability in severe sepsis may still be worthy of investigation. PMID:21378552
Outcomes of anticoagulation therapy in patients with mental health conditions.
Paradise, Helen T; Berlowitz, Dan R; Ozonoff, Al; Miller, Donald R; Hylek, Elaine M; Ash, Arlene S; Jasuja, Guneet K; Zhao, Shibei; Reisman, Joel I; Rose, Adam J
2014-06-01
Patients with mental health conditions (MHCs) experience poor anticoagulation control when using warfarin, but we have limited knowledge of the association between specific mental illness and warfarin treatment outcomes. To examine the relationship between the severity of MHCs and outcomes of anticoagulation therapy. Retrospective cohort analysis. We studied 103,897 patients on warfarin for 6 or more months cared for by the Veterans Health Administration during fiscal years 2007-2008. We identified 28,216 patients with MHCs using ICD-9 codes: anxiety disorders, bipolar disorder, depression, post-traumatic stress disorder, schizophrenia, and other psychotic disorders. Outcomes included anticoagulation control, as measured by percent time in the therapeutic range (TTR), as well as major hemorrhage. Predictors included different categories of MHC, Global Assessment of Functioning (GAF) scores, and psychiatric hospitalizations. Patients with bipolar disorder, depression, and other psychotic disorders experienced TTR decreases of 2.63 %, 2.26 %, and 2.92 %, respectively (p < 0.001), after controlling for covariates. Patients with psychotic disorders other than schizophrenia experienced increased hemorrhage after controlling for covariates [hazard ratio (HR) 1.24, p = 0.03]. Having any MHC was associated with a slightly increased hazard for hemorrhage (HR 1.19, p < 0.001) after controlling for covariates. Patients with specific MHCs (bipolar disorder, depression, and other psychotic disorders) experienced slightly worse anticoagulation control. Patients with any MHC had a slightly increased hazard for major hemorrhage, but the magnitude of this difference is unlikely to be clinically significant. Overall, our results suggest that appropriately selected patients with MHCs can safely receive therapy with warfarin.
Imamura, Fumiaki; Lichtenstein, Alice H; Dallal, Gerard E; Meigs, James B; Jacques, Paul F
2009-07-01
The ability to interpret epidemiologic observations is limited because of potential residual confounding by correlated dietary components. Dietary pattern analyses by factor analysis or partial least squares may overcome the limitation. To examine confounding by dietary pattern as well as standard risk factors and selected nutrients, the authors modeled the longitudinal association between alcohol consumption and 7-year risk of type 2 diabetes mellitus in 2,879 healthy adults enrolled in the Framingham Offspring Study (1991-2001) by Cox proportional hazard models. After adjustment for standard risk factors, consumers of > or =9.0 drinks/week had a significantly lower risk of type 2 diabetes mellitus compared with abstainers (hazard ratio = 0.47, 95% confidence interval (CI): 0.27, 0.81). Adjustment for selected nutrients had little effect on the hazard ratio, whereas adjustment for dietary pattern variables by factor analysis significantly shifted the hazard ratio away from null (hazard ratio = 0.33, 95% CI: 0.17, 0.64) by 40.0% (95% CI: 16.8, 57.0; P = 0.002). Dietary pattern variables by partial least squares showed similar results. Therefore, the observed inverse association, consistent with past studies, was confounded by dietary patterns, and this confounding was not captured by individual nutrient adjustment. The data suggest that alcohol intake, not dietary patterns associated with alcohol intake, is responsible for the observed inverse association with type 2 diabetes mellitus risk.
Potanas, Christopher P; Padgett, Sheldon; Gamblin, Rance M
2015-04-15
Objective-To identify variables associated with prognosis in dogs undergoing surgical excision of anal sac apocrine gland adenocarcinomas (ASACs) with and without adjunctive chemotherapy. Design-Retrospective case series. Animals-42 dogs with ASACs. Procedures-Information on signalment, clinical signs, diagnostic procedures, surgical procedures, adjunctive therapies, survival time, and disease-free interval was obtained from the medical records. Results-Survival time was significantly associated with the presence of sublumbar lymphadenopathy and sublumbar lymph node extirpation, with median survival time significantly shorter for dogs with sublumbar lymphadenopathy (hazard ratio, 2.31) than for those without and for dogs that underwent lymph node extirpation (hazard ratio, 2.31) than for those that did not. Disease-free interval was significantly associated with the presence of sublumbar lymphadenopathy, lymph node extirpation, and administration of platinum-containing chemotherapeutic agents, with median disease-free interval significantly shorter for dogs with sublumbar lymphadenopathy (hazard ratio, 2.47) than for those without, for dogs that underwent lymph node extirpation (hazard ratio, 2.47) than for those that did not, and for dogs that received platinum-containing chemotherapeutic agents (hazard ratio, 2.69) than for those that did not. Survival time and disease-free interval did not differ among groups when dogs were grouped on the basis of histopathologic margins (complete vs marginal vs incomplete excision). Conclusions and Clinical Relevance-Results suggested that in dogs with ASAC undergoing surgical excision, the presence of sublumbar lymphadenopathy and lymph node extirpation were both negative prognostic factors. However, completeness of surgical excision was not associated with survival time or disease-free interval.
Hardy, Dale S; Stallings, Devita T; Garvin, Jane T; Xu, Hongyan; Racette, Susan B
2017-01-01
To determine which anthropometric measures are the strongest discriminators of incident type 2 diabetes (T2DM) among White and Black males and females in a large U.S. cohort. We used Atherosclerosis Risk in Communities study data from 12,121 participants aged 45-64 years without diabetes at baseline who were followed for over 11 years. Anthropometric measures included a body shape index (ABSI), body adiposity index (BAI), body mass index (BMI), waist circumference (WC), waist to hip ratio (WHR), waist to height ratio (WHtR), and waist to hip to height ratio (WHHR). All anthropometric measures were repeated at each visit and converted to Z-scores. Hazard ratios and 95% confidence intervals adjusted for age were calculated using repeated measures Cox proportional hazard regression analysis. Akaike Information Criteria was used to select best-fit models. The magnitude of the hazard ratio effect sizes and the Harrell's C-indexes were used to rank the highest associations and discriminators, respectively. There were 1,359 incident diabetes cases. Higher values of all anthropometric measures increased the risk for development of T2DM (p < 0.0001) except ABSI, which was not significant in White and Black males. Statistically significant hazard ratios ranged from 1.26-1.63 for males and 1.15-1.88 for females. In general, the largest hazard ratios were those that corresponded to the highest Harrell's C-Index and lowest Akaike Information Criteria values. Among White and Black males and females, BMI, WC, WHR, and WHtR were comparable in discriminating cases from non-cases of T2DM. ABSI, BAI, and WHHR were inferior discriminators of incident T2DM across all race-gender groups. BMI, the most commonly used anthropometric measure, and three anthropometric measures that included waist circumference (i.e., WC, WHR, WHtR) were the best anthropometric discriminators of incident T2DM across all race-gender groups in the ARIC cohort.
Increased Rate of Hospitalization for Diabetes and Residential Proximity of Hazardous Waste Sites
Kouznetsova, Maria; Huang, Xiaoyu; Ma, Jing; Lessner, Lawrence; Carpenter, David O.
2007-01-01
Background: Epidemiologic studies suggest that there may be an association between environmental exposure to persistent organic pollutants (POPs) and diabetes. Objective: The aim of this study was to test the hypothesis that residential proximity to POP-contaminated waste sites results in increased rates of hospitalization for diabetes. Methods: We determined the number of hospitalized patients 25-74 years of age diagnosed with diabetes in New York State exclusive of New York City for the years 1993-2000. Descriptive statistics and negative binomial regression were used to compare diabetes hospitalization rates in individuals who resided in ZIP codes containing or abutting hazardous waste sites containing POPs ("POP" sites); ZIP codes containing hazardous waste sites but with wastes other than POPs ("other" sites); and ZIP codes without any identified hazardous waste sites ("clean" sites). Results: Compared with the hospitalization rates for diabetes in clean sites, the rate ratios for diabetes discharges for people residing in POP sites and "other" sites, after adjustment for potential confounders, were 1.23 [95% confidence interval (CI), 1.15-1.32] and 1.25 (95% CI, 1.16-1.34), respectively. In a subset of POP sites along the Hudson River, where there is higher income, less smoking, better diet, and more exercise, the rate ratio was 1.36 (95% CI, 1.26-1.47) compared to clean sites. Conclusions: After controlling for major confounders, we found a statistically significant increase in the rate of hospitalization for diabetes among the population residing in the ZIP codes containing toxic waste sites. PMID:17366823
Herpes zoster correlates with pyogenic liver abscesses in Taiwan.
Mei-Ling, Shen; Kuan-Fu, Liao; Sung-Mao, Tsai; Cheng-Li, Lin Ms; Shih-Wei, Lai
2016-12-01
The purpose of this paper was to explore the relationship between herpes zoster and pyogenic liver abscesses in Taiwan. This was a nationwide cohort study. Using the database of the Taiwan National Health Insurance Program, 33,049 subjects aged 20-84 years who were newly diagnosed with herpes zoster from 1998 to 2010 were selected as the herpes zoster group, and 131,707 randomly selected subjects without herpes zoster served as the non-herpes zoster group. Both groups were matched by sex, age, other comorbidities, and the index year of the herpes zoster diagnosis. The incidence of pyogenic liver abscesses at the end of 2011 was then estimated. A multivariable Cox proportional hazard regression model was used to estimate the hazard ratio and 95% confidence interval for pyogenic liver abscesses associated with herpes zoster and other comorbidities. The overall incidence rate was 1.38-fold higher in the herpes zoster group than in the non-herpes zoster group (4.47 vs. 3.25 per 10000 person-years, 95% confidence interval 1.32, 1.44). After controlling for potential confounding factors, the adjusted hazard ratio of pyogenic liver abscesses was 1.34 in the herpes zoster group (95% confidence interval 1.05, 1.72) when compared with the non-herpes zoster group. Male sex, age, presence of biliary stones, chronic kidney disease, chronic liver disease, cancer, and diabetes mellitus were also significantly associated with pyogenic liver abscesses. Herpes zoster is associated with an increased hazard of developing pyogenic liver abscesses.
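As a quick arithmetic check of the incidence figures quoted in the abstract above:

```python
# The quoted rates per 10,000 person-years (4.47 in the herpes zoster
# group vs. 3.25 in the comparison group) imply a crude incidence rate
# ratio of about 1.38, matching the reported 1.38-fold difference.

def rate_ratio(rate_exposed, rate_unexposed):
    return rate_exposed / rate_unexposed

rr = rate_ratio(4.47, 3.25)
assert round(rr, 2) == 1.38
```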
Maroules, Christopher D; Rosero, Eric; Ayers, Colby; Peshock, Ronald M; Khera, Amit
2013-10-01
To determine the value of two abdominal aortic atherosclerosis measurements at magnetic resonance (MR) imaging for predicting future cardiovascular events. This study was approved by the institutional review board and complied with HIPAA regulations. The study consisted of 2122 participants from the multiethnic, population-based Dallas Heart Study who underwent abdominal aortic MR imaging at 1.5 T. Aortic atherosclerosis was measured by quantifying mean aortic wall thickness (MAWT) and aortic plaque burden. Participants were monitored for cardiovascular death, nonfatal cardiac events, and nonfatal extracardiac vascular events over a mean period of 7.8 years ± 1.5 (standard deviation [SD]). Cox proportional hazards regression was used to assess independent associations of aortic atherosclerosis and cardiovascular events. Increasing MAWT was positively associated with male sex (odds ratio, 3.66; P < .0001), current smoking (odds ratio, 2.53; P < .0001), 10-year increase in age (odds ratio, 2.24; P < .0001), and hypertension (odds ratio, 1.66; P = .0001). A total of 143 participants (6.7%) experienced a cardiovascular event. MAWT conferred an increased risk for composite events (hazard ratio, 1.28 per 1 SD; P = .001). Aortic plaque was not associated with increased risk for composite events. Increasing MAWT and aortic plaque burden both conferred an increased risk for nonfatal extracardiac events (hazard ratio of 1.52 per 1 SD [P < .001] and hazard ratio of 1.46 per 1 SD [P = .03], respectively). MR imaging measures of aortic atherosclerosis are predictive of future adverse cardiovascular events. © RSNA, 2013.
Dhruva, Sanket S; Huang, Chenxi; Spatz, Erica S; Coppi, Andreas C; Warner, Frederick; Li, Shu-Xia; Lin, Haiqun; Xu, Xiao; Furberg, Curt D; Davis, Barry R; Pressel, Sara L; Coifman, Ronald R; Krumholz, Harlan M
2017-07-01
Randomized trials of hypertension have seldom examined heterogeneity in response to treatments over time and the implications for cardiovascular outcomes. Understanding this heterogeneity, however, is a necessary step toward personalizing antihypertensive therapy. We applied trajectory-based modeling to data on 39 763 study participants of the ALLHAT (Antihypertensive and Lipid-Lowering Treatment to Prevent Heart Attack Trial) to identify distinct patterns of systolic blood pressure (SBP) response to randomized medications during the first 6 months of the trial. Two trajectory patterns were identified: immediate responders (85.5%), on average, had a decreasing SBP, whereas nonimmediate responders (14.5%), on average, had an initially increasing SBP followed by a decrease. Compared with those randomized to chlorthalidone, participants randomized to amlodipine (odds ratio, 1.20; 95% confidence interval [CI], 1.10-1.31), lisinopril (odds ratio, 1.88; 95% CI, 1.73-2.03), and doxazosin (odds ratio, 1.65; 95% CI, 1.52-1.78) had higher adjusted odds ratios associated with being a nonimmediate responder (versus immediate responder). After multivariable adjustment, nonimmediate responders had a higher hazard ratio of stroke (hazard ratio, 1.49; 95% CI, 1.21-1.84), combined cardiovascular disease (hazard ratio, 1.21; 95% CI, 1.11-1.31), and heart failure (hazard ratio, 1.48; 95% CI, 1.24-1.78) during follow-up between 6 months and 2 years. The SBP response trajectories provided superior discrimination for predicting downstream adverse cardiovascular events than classification based on difference in SBP between the first 2 measurements, SBP at 6 months, and average SBP during the first 6 months. Our findings demonstrate heterogeneity in response to antihypertensive therapies and show that chlorthalidone is associated with more favorable initial response than the other medications. © 2017 American Heart Association, Inc.
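A minimal caricature of separating responder trajectories can be sketched as follows. The grouping rule here (sign of a least-squares slope over the first months) is a deliberate simplification assumed for illustration; the trial used formal trajectory-based (group-based mixture) modeling, and all readings below are invented.

```python
# Crude illustration of separating blood-pressure response patterns.
# Each series is labeled by the sign of its initial least-squares slope;
# the real analysis fit a trajectory-based statistical model instead.

def slope(xs, ys):
    """Ordinary least-squares slope of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def classify(months, sbp_readings):
    """'immediate' if SBP falls from the start, else 'nonimmediate'."""
    return "immediate" if slope(months, sbp_readings) < 0 else "nonimmediate"

# Invented example series (months 0-6 of treatment):
label_a = classify([0, 2, 4, 6], [160, 152, 148, 145])  # steadily falling
label_b = classify([0, 2, 4, 6], [160, 168, 172, 165])  # initial rise
```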
Dassopoulos, Themistocles; Nguyen, Geoffrey C.; Talor, Monica Vladut; Datta, Lisa Wu; Isaacs, Kim L.; Lewis, James D.; Gold, Michael S.; Valentine, John F.; Smoot, Duane T.; Harris, Mary L.; Oliva-Hemker, Maria; Bayless, Theodore M.; Burek, C. Lynne; Brant, Steven R.
2012-01-01
Background NOD2 mutations and anti-Saccharomyces cerevisiae antibodies (ASCA) are associated with Crohn’s disease (CD), ileal involvement and complicated disease behavior in whites. ASCA and the three common NOD2 mutations have not been assessed in African American (AA) adults with CD. Methods AA patients with CD and controls were recruited by the Mid-Atlantic African American IBD Study (Johns Hopkins Hospital and satellite centers at Howard University, University of Florida, University of North Carolina, University of Pennsylvania, and the Washington Hospital Center, Washington, DC) as part of the NIDDK IBD Genetics Consortium. Genotyping for the three common CD associated NOD2 mutations (Leu1007fsinsC, G908R/2722g>c, and R702W/2104c>t) and ASCA ELISA assays were performed in 183 AA CD patients and 143 controls. Positive ASCA was based on either IgA or IgG above threshold. CD phenotyping was performed using the NIDDK IBD Genetics Consortium guidelines. Logistic regression was used to calculate adjusted odds ratios (OR) for the association between ASCA and disease phenotype. Results ASCA sensitivity and specificity in this AA population were 70.5% and 70.4% respectively. On univariate analysis, ASCA was associated with younger mean age at diagnosis (25.0±11.8 vs. 32.1±14.2 yrs, p<0.001), ileal involvement (73.0% vs. 48.0%, p=0.002), and complicated (stricturing/ penetrating) behavior (60.3% vs. 41.7%, p=0.03). On multivariate analysis, ASCA titer (/25U) was associated with ileal involvement (OR 1.18, 95% CI 1.04-1.34), complicated behavior (OR 1.13, 95% CI 1.01-1.28) and surgery (hazard ratio 1.11, 95% CI 1.02-1.21). Risks for surgery also included smoking (hazard ratio 1.50, 95% CI 1.14-1.99) and CD family history (hazard ratio 2.39, 95% CI 1.11-5.14). NOD2 carriers (all heterozygotes) were more common among CD cases than controls (8.2 vs. 2.1%; OR 4.17, 95% CI: 1.18 - 14.69). NOD2 mutation population attributable risk was 6.2%. 
Conclusions In comparison to whites, ASCA in AAs has a similar sensitivity but a lower specificity for CD. ASCA is associated with ileal involvement, complicated behavior and surgery in AAs with CD. NOD2 is a risk gene for AA CD, although mutation frequency and population attributable risk are much lower than in whites. PMID:19826411
Carlsson, Lena M S; Sjöholm, Kajsa; Ahlin, Sofie; Jacobson, Peter; Andersson-Assarsson, Johanna C; Karlsson Lindahl, Linda; Maglio, Cristina; Karlsson, Cecilia; Hjorth, Stephan; Taube, Magdalena; Carlsson, Björn; Svensson, Per-Arne; Peltonen, Markku
2018-05-24
Obesity increases risk of falling, but the effect of bariatric surgery on fall-related injuries is unknown. The aim of this study was therefore to study the association between bariatric surgery and long-term incidence of fall-related injuries in the prospective, controlled Swedish Obese Subjects study. At inclusion, body mass index was ≥34 kg/m² in men and ≥38 kg/m² in women. The surgery per-protocol group (n = 2007) underwent gastric bypass (n = 266), banding (n = 376), or vertical banded gastroplasty (n = 1365), and controls (n = 2040) received usual care. At the time of analysis (31 December 2013), median follow-up was 19 years (maximum 26 years). Fall-related injuries requiring hospital treatment were captured using data from the Swedish National Patient Register. During follow-up, there were 617 first-time fall-related injuries in the surgery group and 513 in the control group (adjusted hazard ratio 1.21, 95% CI, 1.07-1.36; P = 0.002). The incidence differed between treatment groups (P < 0.001, log-rank test) and was higher after gastric bypass than after usual care, banding and vertical banded gastroplasty (adjusted hazard ratio 0.50-0.52 versus gastric bypass, P < 0.001 for all three comparisons). In conclusion, gastric bypass surgery was associated with increased risk of serious fall-related injury requiring hospital treatment.
76 FR 45600 - Order of Succession for the Office of Healthy Homes and Lead Hazard Control
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-29
This notice designates the Order of Succession for the Office of Healthy Homes and Lead Hazard Control of the Department of Housing and Urban Development.
Bowden, Jack; Seaman, Shaun; Huang, Xin; White, Ian R
2016-04-30
In randomised controlled trials of treatments for late-stage cancer, it is common for control arm patients to receive the experimental treatment around the point of disease progression. This treatment switching can dilute the estimated treatment effect on overall survival and impact the assessment of a treatment's benefit in health economic evaluations. The rank-preserving structural failure time model of Robins and Tsiatis (Comm. Stat., 20:2609-2631) offers a potential solution to this problem and is typically implemented using the logrank test. However, in the presence of substantial switching, this test can have low power because the hazard ratio is not constant over time. Schoenfeld (Biometrika, 68:316-319) showed that when the hazard ratio is not constant, weighted versions of the logrank test become optimal. We present a weighted logrank test statistic for the late-stage cancer trial context given the treatment switching pattern and working assumptions about the underlying hazard function in the population. Simulations suggest that the weighted approach can lead to large efficiency gains in either an intention-to-treat or a causal rank-preserving structural failure time model analysis compared with the unweighted approach. Furthermore, violation of the working assumptions used in the derivation of the weights only affects the efficiency of the estimates and does not induce bias or inflate the type I error rate. The weighted logrank test statistic should therefore be considered for use as part of a careful secondary, exploratory analysis of trial data affected by substantial treatment switching. ©2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
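For readers unfamiliar with the method, a two-sample weighted logrank statistic can be sketched in a few lines (a minimal illustration, not the authors' implementation; the `weight` plug-in function and the follow-up data in the usage note are hypothetical, and `weight(t) = 1` recovers the ordinary logrank test):

```python
import math

def weighted_logrank(times0, events0, times1, events1, weight=lambda t: 1.0):
    """Two-sample weighted logrank Z statistic.

    times*: follow-up times; events*: 1 if the event occurred, 0 if censored.
    weight: weight w(t) applied at each distinct event time.
    """
    data = ([(t, e, 0) for t, e in zip(times0, events0)]
            + [(t, e, 1) for t, e in zip(times1, events1)])
    num = 0.0   # sum of weighted observed-minus-expected events in group 1
    var = 0.0   # sum of weighted hypergeometric variances
    for t in sorted({tt for tt, e, _ in data if e == 1}):
        at_risk = [rec for rec in data if rec[0] >= t]
        n = len(at_risk)                                     # at risk just before t
        n1 = sum(1 for _, _, g in at_risk if g == 1)         # at risk in group 1
        d = sum(1 for tt, e, _ in at_risk if tt == t and e == 1)
        d1 = sum(1 for tt, e, g in at_risk if tt == t and e == 1 and g == 1)
        w = weight(t)
        num += w * (d1 - d * n1 / n)
        if n > 1:
            var += w * w * d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    return num / math.sqrt(var)
```

A negative statistic indicates fewer events than expected in group 1; down-weighting late event times (e.g. `weight=lambda t: 1.0 / t`) de-emphasises the post-switching period, which is the intuition behind the authors' proposal.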
Portman, Michael A.; Slee, April; Olson, Aaron K.; Cohen, Gordon; Karl, Tom; Tong, Elizabeth; Hastings, Laura; Patel, Hitendra; Reinhartz, Olaf; Mott, Antonio R.; Mainwaring, Richard; Linam, Justin; Danzi, Sara
2011-01-01
Background Triiodothyronine (T3) levels decrease in infants and children after cardiopulmonary bypass. We tested the primary hypothesis that T3 repletion is safe in this population and produces improvements in postoperative clinical outcome. Methods and Results The TRICC study was a prospective, multicenter, double-blind, randomized, placebo-controlled trial in children younger than 2 years old undergoing heart surgery with cardiopulmonary bypass. Enrollment was stratified by surgical diagnosis. Time to extubation (TTE) was the primary outcome. Patients received intravenous T3 as Triostat (n=98) or placebo (n=95), and data were analyzed using Cox proportional hazards. Overall, TTE was similar between groups. There were no differences in adverse event rates, including arrhythmia. Prespecified analyses showed a significant interaction between age and treatment (P=0.0012). For patients younger than 5 months, the hazard ratio (chance of extubation) for Triostat was 1.72 (P=0.0216); median TTE was 98 hours (95% confidence interval [CI], 71 to 142) with placebo versus 55 hours (CI, 44 to 92) with Triostat. TTE shortening corresponded to a reduction in inotropic agent use and improvement in cardiac function. For children 5 months of age or older, Triostat produced a significant delay in median TTE: 16 hours (CI, 7–22) for placebo versus 20 hours (CI, 16–45) for Triostat (hazard ratio, 0.60; P=0.0220). Conclusions T3 supplementation is safe. Analyses using age stratification indicate that T3 supplementation provides clinical advantages in patients younger than 5 months and no benefit for those older than 5 months. Clinical Trial Registration URL: http://www.clinicaltrials.gov. Unique identifier: NCT00027417. PMID:20837917
Lin, Sheng-Chieh; Lin, Hui-Wen
2015-04-01
Childhood asthma and premature birth are both common; however, no studies have examined how urbanization affects the association between prematurity and asthma, or how the degree of prematurity affects asthma development. We used the Taiwan Longitudinal Health Insurance Database (LHID) to explore the association between asthma and prematurity among children in a population-based analysis. This is a retrospective cohort study with registration data derived from the Taiwan LHID. We evaluated prematurely born infants and children aged <5 years (n = 532) and age-matched control patients (n = 60505) using Cox proportional hazard regression analysis within a hospital cluster model. Of the 61 037 examinees, 14 012 experienced asthma during the 5-year follow-up, including 161 (72.26 per 1000 person-years) infants and children born prematurely and 13 851 (40.27 per 1000 person-years) controls. The hazard ratio for asthma during the 5-year follow-up period was 1.95 (95% confidence interval = 1.67-2.28) among children born prematurely. Among children aged 0-2 years, boys born prematurely had higher asthma rates than girls in both the non-premature and premature groups. In urban areas, children born prematurely had higher rates of asthma than those born at term. Children born prematurely who lived in the northern region had a higher asthma hazard ratio than those in other regions. Our analyses indicated that sex, age, urbanization level, and geographic region are significantly associated with prematurity and asthma. Based on the cumulative asthma-free survival curve generated using the Kaplan-Meier method, infants born prematurely should be closely monitored for asthma development until the age of 6 years.
Yankey, Barbara A; Rothenberg, Richard; Strasser, Sheryl; Ramsey-White, Kim; Okosun, Ike S
2017-11-01
Background Reports associate marijuana use with cardiovascular emergencies. Studies relating marijuana use to cardiovascular mortality are scarce. The recent advance toward legalization of marijuana use emphasizes the importance of understanding relationships between marijuana use and cardiovascular death, the leading cause of mortality. Recreational marijuana is primarily smoked; we hypothesized that, like cigarette smoking, marijuana use would be associated with increased cardiovascular mortality. Design This study was designed as a mortality follow-up. Method We linked participants aged 20 years and above who responded to questions on marijuana use during the 2005 US National Health and Nutrition Examination Survey to data from the 2011 public-use linked mortality file of the National Center for Health Statistics, Centers for Disease Control and Prevention. Only participants eligible for mortality follow-up were included. We conducted Cox proportional hazards regression analyses to estimate hazard ratios for hypertension, heart disease, and cerebrovascular mortality due to marijuana use. We controlled for cigarette smoking and other relevant variables. Results Of the 1213 eligible participants, 72.5% were presumed to be alive. The total follow-up time was 19,569 person-years. The adjusted hazard ratio for death from hypertension among marijuana users compared to non-marijuana users was 3.42 (95% confidence interval: 1.20-9.79), and for each year of marijuana use it was 1.04 (95% confidence interval: 1.00-1.07). Conclusion From our results, marijuana use may increase the risk of hypertension mortality, and increased duration of marijuana use is associated with increased risk of death from hypertension. Recreational marijuana use potentially has cardiovascular adverse effects, which need further investigation.
Bilder, Deborah; Botts, Elizabeth L.; Smith, Ken R.; Pimentel, Richard; Farley, Megan; Viskochil, Joseph; McMahon, William M.; Block, Heidi; Ritvo, Edward; Ritvo, Riva-Ariella; Coon, Hilary
2013-01-01
This study's purpose was to investigate mortality among individuals with autism spectrum disorders (ASD) ascertained during a 1980s statewide autism prevalence study (n = 305) in relation to controls. Twenty-nine of these individuals (9.5%) died by the time of follow up, representing a hazard rate ratio of 9.9 (95% CI 5.7-17.2) in relation to…
Carfilzomib, lenalidomide, and dexamethasone for relapsed multiple myeloma.
Stewart, A Keith; Rajkumar, S Vincent; Dimopoulos, Meletios A; Masszi, Tamás; Špička, Ivan; Oriol, Albert; Hájek, Roman; Rosiñol, Laura; Siegel, David S; Mihaylov, Georgi G; Goranova-Marinova, Vesselina; Rajnics, Péter; Suvorov, Aleksandr; Niesvizky, Ruben; Jakubowiak, Andrzej J; San-Miguel, Jesus F; Ludwig, Heinz; Wang, Michael; Maisnar, Vladimír; Minarik, Jiri; Bensinger, William I; Mateos, Maria-Victoria; Ben-Yehuda, Dina; Kukreti, Vishal; Zojwalla, Naseem; Tonda, Margaret E; Yang, Xinqun; Xing, Biao; Moreau, Philippe; Palumbo, Antonio
2015-01-08
Lenalidomide plus dexamethasone is a reference treatment for relapsed multiple myeloma. The combination of the proteasome inhibitor carfilzomib with lenalidomide and dexamethasone has shown efficacy in a phase 1 and 2 study in relapsed multiple myeloma. We randomly assigned 792 patients with relapsed multiple myeloma to carfilzomib with lenalidomide and dexamethasone (carfilzomib group) or lenalidomide and dexamethasone alone (control group). The primary end point was progression-free survival. Progression-free survival was significantly improved with carfilzomib (median, 26.3 months, vs. 17.6 months in the control group; hazard ratio for progression or death, 0.69; 95% confidence interval [CI], 0.57 to 0.83; P=0.0001). The median overall survival was not reached in either group at the interim analysis. The Kaplan-Meier 24-month overall survival rates were 73.3% and 65.0% in the carfilzomib and control groups, respectively (hazard ratio for death, 0.79; 95% CI, 0.63 to 0.99; P=0.04). The rates of overall response (partial response or better) were 87.1% and 66.7% in the carfilzomib and control groups, respectively (P<0.001; 31.8% and 9.3% of patients in the respective groups had a complete response or better; 14.1% and 4.3% had a stringent complete response). Adverse events of grade 3 or higher were reported in 83.7% and 80.7% of patients in the carfilzomib and control groups, respectively; 15.3% and 17.7% of patients discontinued treatment owing to adverse events. Patients in the carfilzomib group reported superior health-related quality of life. In patients with relapsed multiple myeloma, the addition of carfilzomib to lenalidomide and dexamethasone resulted in significantly improved progression-free survival at the interim analysis and had a favorable risk-benefit profile. (Funded by Onyx Pharmaceuticals; ClinicalTrials.gov number, NCT01080391.).
Early Probiotic Supplementation for Eczema and Asthma Prevention: A Randomized Controlled Trial.
Cabana, Michael D; McKean, Michelle; Caughey, Aaron B; Fong, Lawrence; Lynch, Susan; Wong, Angela; Leong, Russell; Boushey, Homer A; Hilton, Joan F
2017-09-01
To determine if probiotic administration during the first 6 months of life decreases childhood asthma and eczema. We conducted a randomized, double-blind controlled trial of Lactobacillus rhamnosus GG (LGG) supplementation on the cumulative incidence of eczema (primary end point) and asthma and rhinitis (secondary end points) in high-risk infants. For the first 6 months of life, intervention infants (n = 92) received a daily dose of 10 billion colony-forming units of LGG and 225 mg of inulin (Amerifit Brands, Cromwell, CT), and control infants (n = 92) received 325 mg of inulin alone. We used survival analysis methods to estimate disease incidences in the presence or absence of LGG and to estimate the efficacy of LGG in delaying or preventing these diseases. Infants were accrued over a 6-year period (median follow-up: 4.6 years; 95% retention rate at 2 years). At 2 years of age, the estimated cumulative incidence of eczema was 30.9% (95% confidence interval [CI], 21.4%-40.4%) in the control arm and 28.7% (95% CI, 19.4%-38.0%) in the LGG arm, for a hazard ratio of 0.95 (95% CI, 0.59-1.53) (log-rank P = .83). At 5 years of age, the cumulative incidence of asthma was 17.4% (95% CI, 7.6%-27.1%) in the control arm and 9.7% (95% CI, 2.7%-16.6%) in the LGG arm, for a hazard ratio of 0.88 (95% CI, 0.41-1.87) (log-rank P = .25). For high-risk infants, early LGG supplementation for the first 6 months of life does not appear to prevent the development of eczema or asthma at 2 years of age. Copyright © 2017 by the American Academy of Pediatrics.
Liou, Li-Syue; Chang, Chih-Ya; Chen, Hsuan-Ju; Tseng, Chun-Hung; Chen, Cheng-Yu; Sung, Fung-Chang
2017-01-01
This population-based cohort study investigated the risk of developing peripheral arterial occlusive disease (PAOD) in patients with Bell's palsy. We used longitudinal health insurance claims data from Taiwan to identify 5,152 patients with Bell's palsy newly diagnosed in 2000-2010 and a control cohort of 20,608 patients without Bell's palsy matched by propensity score. Incidence and hazard ratio (HR) of PAOD were assessed by the end of 2013. The incidence of PAOD was approximately 1.5 times greater in the Bell's palsy group than in the non-Bell's palsy controls (7.75 vs. 4.99 per 1000 person-years). Cox proportional hazards regression analysis yielded an adjusted HR of 1.54 (95% confidence interval (CI) = 1.35-1.76) for the Bell's palsy group compared to the non-Bell's palsy group, after adjusting for sex, age, occupation, income and comorbidities. Men were at higher risk of PAOD than women in the Bell's palsy group, but not in the controls. The incidence of PAOD increased with age in both groups, but the Bell's palsy group to control group HR of PAOD decreased as age increased. Systemic steroid treatment reduced the PAOD hazard by 13% for Bell's palsy patients compared with those without the treatment, although the reduction was not statistically significant. Bell's palsy appears to be associated with an increased risk of developing PAOD. Further pathophysiologic, histopathologic and immunologic research is required to explore the underlying biologic mechanism.
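The crude rates quoted above (7.75 vs. 4.99 per 1000 person-years) come from simple person-time arithmetic, done before any Cox adjustment. A minimal sketch; the event and person-year counts below are hypothetical, chosen only so that the resulting rates match those quoted in the abstract:

```python
def rate_per_1000(events, person_years):
    """Crude incidence rate per 1000 person-years of follow-up."""
    return 1000.0 * events / person_years

# Hypothetical counts (the abstract does not report raw denominators).
rate_bp = rate_per_1000(310, 40000)      # Bell's palsy cohort -> 7.75
rate_ctrl = rate_per_1000(799, 160120)   # matched controls    -> ~4.99
crude_ratio = rate_bp / rate_ctrl        # ~1.55, before adjustment
```

The adjusted HR of 1.54 reported above is close to this crude ratio, suggesting that the matching left little residual confounding on the measured covariates.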
Risk of Suicide Attempt in Poststroke Patients: A Population-Based Cohort Study.
Harnod, Tomor; Lin, Cheng-Li; Kao, Chia-Hung
2018-01-10
This nationwide population-based cohort study evaluated the risk of and risk factors for suicide attempt in poststroke patients in Taiwan. The poststroke and nonstroke cohorts consisted of 713,690 patients and 1,426,009 controls, respectively. Adults (aged >18 years) who received new stroke diagnoses according to the International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM; codes 430-438) between 2000 and 2011 were included in the poststroke cohort. We calculated the adjusted hazard ratio for suicide attempt (ICD-9-CM codes E950-E959) after adjustment for age, sex, monthly income, urbanization level, occupation category, and various comorbidities. Kaplan-Meier analysis was used to measure the cumulative incidence of suicide attempt, and the Fine and Gray method was used to treat death as a competing event when estimating subhazard ratios and 95% confidence intervals between groups. The cumulative incidence of suicide attempt was higher in the poststroke cohort, and the adjusted hazard ratio of suicide attempt was 2.20 (95% confidence interval, 2.04-2.37) compared with that of the controls. The leading risk factors for poststroke suicide attempt were earning a low monthly income (<660 US dollars), living in less urbanized regions, doing manual labor, and having a stroke before age 50 years. The attempted suicide risk did not differ significantly between male and female patients in this study. These results convey crucial information to clinicians and governments for preventing suicide attempt in poststroke patients in Taiwan and other Asian countries. © 2018 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
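The Kaplan-Meier cumulative incidence used above can be sketched in a few lines (an illustration with made-up follow-up data, not the study's analysis; a real analysis would also handle ties carefully and apply the Fine and Gray competing-risk adjustment the authors describe):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates; returns (time, S(t)) pairs.

    times: follow-up times; events: 1 if the event occurred, 0 if censored.
    """
    pairs = sorted(zip(times, events))
    s = 1.0
    curve = []
    for t in sorted({tt for tt, e in pairs if e == 1}):
        n = sum(1 for tt, _ in pairs if tt >= t)          # at risk just before t
        d = sum(1 for tt, e in pairs if tt == t and e == 1)  # events at t
        s *= 1.0 - d / n
        curve.append((t, s))
    return curve

# Cumulative incidence at time t is 1 - S(t).
```

With five hypothetical subjects, `kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 0])` steps the survival estimate down only at event times, with the censored subjects removed from later risk sets.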
Matsuo, Koji; Machida, Hiroko; Horowitz, Max P; Shahzad, Mian M K; Guntupalli, Saketh R; Roman, Lynda D; Wright, Jason D
2017-11-01
While there is an increasing trend of ovarian conservation at the time of surgical treatment for young women with stage I cervical cancer, the risk for subsequent ovarian cancer after ovarian conservation has not been well studied. We sought to examine the incidence of and risk factors for metachronous ovarian cancer among young women with stage I cervical cancer who had ovarian conservation at the time of hysterectomy. The Surveillance, Epidemiology, and End Results Program was used to identify women aged <50 years who underwent hysterectomy with ovarian conservation for stage I cervical cancer from 1983 through 2013 (n = 4365). Time-dependent analysis was performed for ovarian cancer risk after cervical cancer diagnosis. Mean age at cervical cancer diagnosis was 37 years, and the majority of patients had stage IA disease (68.2%) and squamous histology (72.9%). Median follow-up time was 10.8 years, and there were 13 women who developed metachronous ovarian cancer. The 10- and 20-year cumulative incidences of metachronous ovarian cancer were 0.2% (95% confidence interval, 0.1-0.4) and 0.5% (95% confidence interval, 0.2-0.8), respectively. Mean age at the time of diagnosis of metachronous ovarian cancer was 47.5 years, and stage III-IV disease was seen in 55.6%. Age (≥45 vs <45 years, hazard ratio, 4.22; 95% confidence interval, 1.16-15.4; P = .018), ethnicity (non-white vs white, hazard ratio, 4.29; 95% confidence interval, 1.31-14.0; P = .009), cervical cancer histology (adenocarcinoma or adenosquamous vs squamous, hazard ratio, 3.50; 95% confidence interval, 1.17-10.5; P = .028), and adjuvant radiotherapy use (yes vs no, hazard ratio, 3.69; 95% confidence interval, 1.01-13.4; P = .034) were significantly associated with metachronous ovarian cancer risk. 
The presence of multiple risk factors was associated with a significantly increased risk of metachronous ovarian cancer compared to the no risk factor group: 1 risk factor (hazard ratio range, 2.96-8.43), 2 risk factors (hazard ratio range, 16.6-31.0), and 3-4 risk factors (hazard ratio range, 62.3-109), respectively. Metachronous ovarian cancer risk after ovarian conservation for women with stage I cervical cancer is <1%. Older age, non-white ethnicity, adenocarcinoma or adenosquamous histology, and adjuvant radiotherapy may be associated with an increased metachronous ovarian cancer risk. Copyright © 2017 Elsevier Inc. All rights reserved.
Larsson, Susanna C; Håkanson, Niclas; Permert, Johan; Wolk, Alicja
2006-06-01
High meat consumption has been associated with increased risk of pancreatic cancer in several, although not all, case-control studies. However, prospective data on this relationship are sparse, and the results have been inconsistent. We prospectively evaluated meat, fish, poultry, and egg consumption in relation to pancreatic cancer incidence in a population-based cohort of 61,433 Swedish women. Diet was assessed with a food-frequency questionnaire at baseline (1987-1990) and again in 1997. Pancreatic cancers were ascertained through linkage to the Swedish Cancer Register. Cox proportional hazards models were used to estimate multivariate hazard ratios with 95% confidence intervals (CI). During the 941,218 person-years of follow-up, from 1987 through 2004, 172 incident cases of pancreatic cancer were diagnosed. Long-term red meat consumption (using data from both dietary questionnaires) was positively associated with risk of pancreatic cancer (p-trend = 0.01), whereas long-term poultry consumption was inversely (p-trend = 0.04) associated with risk. The multivariate hazard ratios for the highest versus the lowest category of consumption were 1.73 (95% CI = 0.99-2.98) for red meat and 0.44 (95% CI = 0.20-0.97) for poultry. There were no significant associations with processed meat, fish or egg consumption. Findings from this prospective study suggest that substituting poultry for red meat might reduce the risk of pancreatic cancer.
Doherty, Sarah M; Jackman, Louise M; Kirwan, John F; Dunne, Deirdre; O'Connor, Kieran G; Rouse, John M
2016-12-01
The incidence of melanoma is rising worldwide. Current Irish guidelines from the National Cancer Control Programme state that suspicious pigmented lesions should not be removed in primary care. Guidelines and research conflict on who should remove possible melanomas. We aimed to determine whether initial diagnostic excision biopsy of cutaneous malignant melanoma in primary versus secondary care leads to poorer survival. We analysed data comprising 7116 cases of cutaneous malignant melanoma from the National Cancer Registry Ireland between January 2002 and December 2011. Single predictor variables were examined by the chi-square or Mann-Whitney U test. The effects of single predictor variables on survival were examined by Cox proportional hazards modelling, and a multivariate Cox model of survival based on excision in a non-hospital versus hospital setting was derived with adjusted and unadjusted hazard ratios. Over a 10-year period, 8.5% of melanomas in Ireland were removed in a non-hospital setting. When comparing melanoma death between the hospital and non-hospital groups, the adjusted hazard ratio was 1.56 (95% CI: 1.08-2.26; P = .02), indicating a non-inferior outcome for the melanoma cases initially treated in the non-hospital group, after adjustment for significant covariates. This study suggests that initial excision biopsy carried out in general practice does not lead to a poorer outcome.
Grossman, Douglas; Farnham, James M; Hyngstrom, John; Klapperich, Marki E; Secrest, Aaron M; Empey, Sarah; Bowen, Glen M; Wada, David; Andtbacka, Robert H I; Grossmann, Kenneth; Bowles, Tawnya L; Cannon-Albright, Lisa A
2018-03-01
Survival data are mixed comparing patients with multiple primary melanomas (MPM) to those with single primary melanomas (SPM). We compared MPM versus SPM patient survival using a matching method that avoids potential biases associated with other analytic approaches. Records of 14,138 individuals obtained from the Surveillance, Epidemiology, and End Results registry of all melanomas diagnosed or treated in Utah between 1973 and 2011 were reviewed. A single matched control patient was selected randomly from the SPM cohort for each MPM patient, with the restriction that they survived at least as long as the interval between the first and second diagnoses for the matched MPM patient. Survival curves (n = 887 for both MPM and SPM groups) without covariates showed a significant survival disadvantage for MPM patients (chi-squared 39.29, P < .001). However, a multivariate Cox proportional hazards model showed no significant survival difference (hazard ratio 1.07, P = .55). Restricting the multivariate analysis to invasive melanomas also showed no significant survival difference (hazard ratio 0.99, P = .96). Breslow depth, ulceration status, and specific cause of death were not available for all patients. Patients with MPM had similar survival times as patients with SPM. Copyright © 2018 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.
The leaching kinetics of cadmium from hazardous Cu-Cd zinc plant residues.
Li, Meng; Zheng, Shili; Liu, Biao; Du, Hao; Dreisinger, David Bruce; Tafaghodi, Leili; Zhang, Yi
2017-07-01
A large amount of Cu-Cd zinc plant residues (CZPR) is produced by hydrometallurgical zinc plant operations. Because these residues contain substantial amounts of heavy metals, including Cd, Zn and Cu, they are considered hazardous wastes. To achieve decontamination and efficient extraction of the valuable metals from the CZPR, a comprehensive recovery process using sulfuric acid as the leaching reagent and air as the oxidizing reagent has been proposed. The effects of temperature, sulfuric acid concentration, particle size, solid/liquid ratio and stirring speed on the cadmium extraction efficiency were investigated, and the leaching kinetics of cadmium was studied. The cadmium leaching process was found to be controlled by solid film diffusion. Moreover, the order of the reaction rate constant with respect to H₂SO₄ concentration, particle size, solid/liquid ratio and stirring speed was calculated. XRD and SEM-EDS analyses showed that the main phases of the secondary sulfuric acid leaching residues were lead sulfate and calcium sulfate. Copyright © 2017 Elsevier Ltd. All rights reserved.
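Product-layer (solid film) diffusion control is conventionally diagnosed by linearity of the shrinking-core function g(x) = 1 - 2x/3 - (1 - x)^(2/3) against time, with the apparent rate constant as the slope. A minimal sketch under that standard assumption; the conversion data below are hypothetical, not the paper's measurements:

```python
def g_diffusion(x):
    """Shrinking-core expression for product-layer diffusion control."""
    return 1.0 - 2.0 * x / 3.0 - (1.0 - x) ** (2.0 / 3.0)

def fit_rate_constant(times, conversions):
    """Zero-intercept least squares: k = sum(t*g) / sum(t*t)."""
    gs = [g_diffusion(x) for x in conversions]
    return sum(t * g for t, g in zip(times, gs)) / sum(t * t for t in times)

# Hypothetical leaching run: cadmium conversion x at time t (minutes).
k = fit_rate_constant([30, 60, 90, 120], [0.35, 0.55, 0.68, 0.77])
```

Repeating the fit at several acid concentrations and regressing ln k on ln[H₂SO₄] is the usual route to the reaction-order estimates the abstract mentions.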
Nochioka, Kotaro; Biering-Sørensen, Tor; Hansen, Kim Wadt; Sørensen, Rikke; Pedersen, Sune; Jørgensen, Peter Godsk; Iversen, Allan; Shimokawa, Hiroaki; Jeger, Raban; Kaiser, Christoph; Pfisterer, Matthias; Galatius, Søren
2017-12-01
Rheumatologic disorders are characterised by inflammation and an increased risk of coronary artery disease (CAD). However, the association between rheumatologic disorders and long-term prognosis in CAD patients undergoing percutaneous coronary intervention (PCI) is unknown. Thus, we aimed to examine the association between rheumatologic disorders and long-term prognosis in CAD patients undergoing PCI. A post-hoc analysis was performed in 4605 patients (age: 63.3 ± 11.0 years; male: 76.6%) with ST-segment elevation myocardial infarction (STEMI; n = 1396), non-STEMI (n = 1541), and stable CAD (n = 1668) from the all-comer stent trials BAsel Stent Kosten-Effektivitäts Trial-PROspective Validation Examination (BASKET-PROVE) I and II. We evaluated the association between rheumatologic disorders and 2-year major adverse cardiac events (MACEs; cardiac death, nonfatal myocardial infarction (MI), and target vessel revascularisation (TVR)) by Cox regression analysis. Patients with rheumatologic disorders (n = 197) were older, more often female, had a higher prevalence of renal disease, multi-vessel coronary disease, and bifurcation lesions, and had longer total stent lengths. During the 2-year follow-up, the MACE rate was 8.6% in the total cohort. After adjustment for potential confounders, rheumatologic disorders were associated with MACEs in the total cohort (adjusted hazard ratio: 1.55; 95% confidence interval (CI): 1.04-2.31), driven by the STEMI subgroup (adjusted hazard ratio: 2.38; 95% CI: 1.26-4.51). In all patients, rheumatologic disorders were associated with all-cause death (adjusted hazard ratio: 2.05; 95% CI: 1.14-3.70), cardiac death (adjusted hazard ratio: 2.63; 95% CI: 1.27-5.43), and non-fatal MI (adjusted hazard ratio: 2.64; 95% CI: 1.36-5.13), but not with TVR (adjusted hazard ratio: 0.81; 95% CI: 0.41-1.58). The presence of rheumatologic disorders appears to be independently associated with worse outcome in CAD patients undergoing PCI. 
This calls for further studies and focus on this high-risk group of patients following PCI.
Prednisolone and Mycobacterium indicus pranii in Tuberculous Pericarditis
Mayosi, Bongani M; Ntsekhe, Mpiko; Bosch, Jackie; Pandie, Shaheen; Jung, Hyejung; Gumedze, Freedom; Pogue, Janice; Thabane, Lehana; Smieja, Marek; Francis, Veronica; Joldersma, Laura; Thomas, Kandithalal M.; Thomas, Baby; Awotedu, Abolade A.; Magula, Nombulelo P.; Naidoo, Datshana P.; Damasceno, Albertino; Banda, Alfred Chitsa; Brown, Basil; Manga, Pravin; Kirenga, Bruce; Mondo, Charles; Mntla, Phindile; Tsitsi, Jacob M.; Peters, Ferande; Essop, Mohammed R.; Russell, James B.W.; Hakim, James; Matenga, Jonathan; Barasa, Ayub F.; Sani, Mahmoud U.; Olunuga, Taiwo; Ogah, Okechukwu; Ansa, Victor; Aje, Akinyemi; Danbauchi, Solomon; Ojji, Dike; Yusuf, Salim
2016-01-01
BACKGROUND Tuberculous pericarditis is associated with high morbidity and mortality even if antituberculosis therapy is administered. We evaluated the effects of adjunctive glucocorticoid therapy and Mycobacterium indicus pranii immunotherapy in patients with tuberculous pericarditis. METHODS Using a 2-by-2 factorial design, we randomly assigned 1400 adults with definite or probable tuberculous pericarditis to either prednisolone or placebo for 6 weeks and to either M. indicus pranii or placebo, administered in five injections over the course of 3 months. Two thirds of the participants had concomitant human immunodeficiency virus (HIV) infection. The primary efficacy outcome was a composite of death, cardiac tamponade requiring pericardiocentesis, or constrictive pericarditis. RESULTS There was no significant difference in the primary outcome between patients who received prednisolone and those who received placebo (23.8% and 24.5%, respectively; hazard ratio, 0.95; 95% confidence interval [CI], 0.77 to 1.18; P = 0.66) or between those who received M. indicus pranii immunotherapy and those who received placebo (25.0% and 24.3%, respectively; hazard ratio, 1.03; 95% CI, 0.82 to 1.29; P = 0.81). Prednisolone therapy, as compared with placebo, was associated with significant reductions in the incidence of constrictive pericarditis (4.4% vs. 7.8%; hazard ratio, 0.56; 95% CI, 0.36 to 0.87; P = 0.009) and hospitalization (20.7% vs. 25.2%; hazard ratio, 0.79; 95% CI, 0.63 to 0.99; P = 0.04). Both prednisolone and M. indicus pranii, each as compared with placebo, were associated with a significant increase in the incidence of cancer (1.8% vs. 0.6%; hazard ratio, 3.27; 95% CI, 1.07 to 10.03; P = 0.03, and 1.8% vs. 0.5%; hazard ratio, 3.69; 95% CI, 1.03 to 13.24; P = 0.03, respectively), owing mainly to an increase in HIV-associated cancer. CONCLUSIONS In patients with tuberculous pericarditis, neither prednisolone nor M. 
indicus pranii had a significant effect on the composite of death, cardiac tamponade requiring pericardiocentesis, or constrictive pericarditis. (Funded by the Canadian Institutes of Health Research and others; IMPI ClinicalTrials.gov number, NCT00810849.) PMID:25178809
Margins of safety provided by COSHH Essentials and the ILO Chemical Control Toolkit.
Jones, Rachael M; Nicas, Mark
2006-03-01
COSHH Essentials, developed by the UK Health and Safety Executive, and the Chemical Control Toolkit (Toolkit) proposed by the International Labour Organization, are 'control banding' approaches to workplace risk management intended for use by proprietors of small and medium-sized businesses. Both systems group chemical substances into hazard bands based on toxicological endpoint and potency. COSHH Essentials uses the European Union's Risk-phrases (R-phrases), whereas the Toolkit uses R-phrases and the Globally Harmonized System (GHS) of Classification and Labeling of Chemicals. Each hazard band is associated with a range of airborne concentrations, termed exposure bands, which are to be attained by the implementation of recommended control technologies. Here we analyze the margin of safety afforded by the systems and, for each hazard band, define the minimal margin as the ratio of the minimum airborne concentration that produced the toxicological endpoint of interest in experimental animals to the maximum concentration in workplace air permitted by the exposure band. We found that the minimal margins were always <100, some even <1, and were inversely related to molecular weight. The Toolkit-GHS system generally produced margins equal to or larger than COSHH Essentials, suggesting that the Toolkit-GHS system is more protective of worker health. Although these systems predict exposures comparable with current occupational exposure limits, we argue that the minimal margins are better indicators of health protection. Further, given the small margins observed, we feel it is important that revisions of these systems provide the exposure bands to users, so as to permit evaluation of control technology capture efficiency.
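The minimal margin defined in this abstract is a simple quotient. As an editor's sketch (the substance and both concentrations are hypothetical, not values from the paper):

```python
def minimal_margin(min_effect_conc_mg_m3, max_exposure_band_mg_m3):
    """Minimal margin of safety: lowest airborne concentration that produced
    the toxicological endpoint in experimental animals, divided by the
    highest concentration the exposure band permits in workplace air."""
    return min_effect_conc_mg_m3 / max_exposure_band_mg_m3

# Hypothetical substance: effects seen at 50 mg/m3, band allows up to 1 mg/m3
margin = minimal_margin(50.0, 1.0)
print(margin)  # 50.0 -> below the conventional 100-fold safety factor
```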
The Hierarchy of Control in the Epidemic of Farm Injury.
Dosman, James; Hagel, Louise; King, Nathan; Koehncke, Niels; Kirychuk, Shelley; Trask, Catherine; Neudorf, Joshua; Day, Lesley; Voaklander, Donald; Pickett, William
2015-01-01
The application of the hierarchy of control (HOC) is a well-established approach to hazard reduction in industrial workplaces. However, it has not been generally applied in farm workplaces. The objective was to determine current practices of farmers in the context of a modified HOC, and the effect of these practices on farm injury outcomes. A self-reported mail survey of 1196 Saskatchewan farm operations was conducted in 2013. Selected survey questions were used as proxy measures of the farm owner-operator's practices relevant to each of the six steps of increasing importance in a modified HOC: (1) hazard identification; (2) risk assessment; (3) personal protection; (4) administrative controls; (5) engineering controls; and (6) elimination of the hazard. Analysis used basic descriptive statistics and logistic regression to examine associations of interest. When four of the six HOC steps were adhered to, there was a significant protective effect: odds ratio (OR) = 0.32 (95% confidence interval [CI]: 0.14-0.74) for any injury and OR = 0.27 (95% CI: 0.07-0.99) for serious injury in the overall study population. For farm owner-operators utilizing four of the six steps in the modified HOC, there was a significant protective effect for any injury (OR = 0.30, 95% CI: 0.11-0.83). Although there is a considerable absence of use of elements of the HOC among farm operators, for farmers who adhere to these steps, there is a significant reduction in their risk for injury. Prevention strategies that embrace the practice of these principles may be effective in the control of farm workplace injury.
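The odds ratios reported above come from logistic regression; the unadjusted version of such an estimate can be computed directly from a 2x2 table. A sketch with hypothetical counts (not the Saskatchewan data), using the Woolf method for the confidence interval:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald (Woolf) 95% CI from a 2x2 table:
    a = injured & adherent,     b = uninjured & adherent,
    c = injured & non-adherent, d = uninjured & non-adherent."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts chosen for illustration only
print(odds_ratio_ci(8, 92, 25, 75))
```

An OR below 1 with an upper CI bound below 1, as in the survey's four-step adherence group, indicates a statistically significant protective association.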
Bosque-Prous, Marina; Espelt, Albert; Borrell, Carme; Bartroli, Montse; Guitart, Anna M; Villalbí, Joan R; Brugal, M Teresa
2015-08-01
The aim of this study was to estimate the magnitude of gender differences in hazardous drinking among middle-aged people and to analyse whether these differences are associated with contextual factors, such as public policies or socioeconomic factors. Cross-sectional design. The study population included 50- to 64-year-old residents of 16 European countries who participated in the Survey of Health, Ageing and Retirement in Europe project conducted in 2010-12 (n = 26 017). We estimated gender differences in hazardous drinking in each country. To determine whether different social context or women's empowerment variables were associated with gender differences in hazardous drinking, we fitted multilevel Poisson regression models adjusted for various individual and country-level variables, which yielded prevalence ratios and their 95% confidence intervals (95% CI). Prevalence of hazardous drinking was significantly higher in men than women [30.2% (95% CI: 29.1-31.4%) and 18.6% (95% CI: 17.7-19.4%), respectively] in most countries, although the extent of these differences varied between countries. Among individuals aged 50-64 years in Europe, risk of becoming a hazardous drinker was 1.69 times higher (95% CI: 1.45-1.97) in men, after controlling for individual and country-level variables. We also found that lower values of the gender empowerment measure and higher unemployment rates were associated with higher gender differences in hazardous drinking. Countries with the greatest gender differences in hazardous drinking were those with the most restrictions on women's behaviour, and the greatest gender inequalities in daily life. Lower gender differences in hazardous drinking seem to be related to higher consumption among women. © The Author 2015. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.
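The study's multilevel Poisson models yield covariate-adjusted prevalence ratios; the crude (unadjusted) quantity behind them is straightforward. A sketch with hypothetical counts chosen to roughly echo the reported 30.2% vs. 18.6% prevalences:

```python
import math

def prevalence_ratio(cases_m, n_m, cases_w, n_w, z=1.96):
    """Crude prevalence ratio (men vs. women) with a Wald CI on the
    log scale. The multilevel Poisson model used in the study
    additionally adjusts for individual and country-level covariates;
    this shows only the unadjusted comparison."""
    pr = (cases_m / n_m) / (cases_w / n_w)
    se = math.sqrt(1/cases_m - 1/n_m + 1/cases_w - 1/n_w)
    return pr, math.exp(math.log(pr) - z * se), math.exp(math.log(pr) + z * se)

# Hypothetical: 302 hazardous drinkers of 1000 men, 186 of 1000 women
print(prevalence_ratio(302, 1000, 186, 1000))
```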
Lee, Sang-Uk; Roh, Sungwon; Kim, Young-Eun; Park, Jong-Ik; Jeon, Boyoung; Oh, In-Hwan
2017-01-01
The elevated risk of suicide in people with disability has been suggested in previous studies; however, the majority of study results have been limited to specific disability types, and there is a lack of research comparing the risk of suicide in people with disability in general. Our aim was to examine the hazard ratio of suicide according to the presence and the types of disability and to identify patterns in the results. In this study, we used National Health Insurance Service-National Sample Cohort data on 990,598 people, and performed analysis on the cause of death from 2003 through 2013. A Cox proportional hazard model was used to estimate the hazard ratio of suicide associated with disability and its types. The hazard ratio of suicide among people with disability was 1.9-fold higher compared to people without disability. The risk of suicide was higher for the disability types of mental disorder, renal failure, brain injury and physical disability. The hazard ratio of suicide in people with disability did not vary by income. The time to death by suicide for people with disability from the onset of their disability was 39.8 months on average. Our findings suggest that when the government plans suicide prevention policies, early and additional interventions specific to people with disability are needed. Disability due to mental disorder or renal failure should be given priority. Copyright © 2016 Elsevier Inc. All rights reserved.
Cole, Stephen R.; Hudgens, Michael G.; Tien, Phyllis C.; Anastos, Kathryn; Kingsley, Lawrence; Chmiel, Joan S.; Jacobson, Lisa P.
2012-01-01
To estimate the association of antiretroviral therapy initiation with incident acquired immunodeficiency syndrome (AIDS) or death while accounting for time-varying confounding in a cost-efficient manner, the authors combined a case-cohort study design with inverse probability-weighted estimation of a marginal structural Cox proportional hazards model. A total of 950 adults who were positive for human immunodeficiency virus type 1 were followed in 2 US cohort studies between 1995 and 2007. In the full cohort, 211 AIDS cases or deaths occurred during 4,456 person-years. In an illustrative 20% random subcohort of 190 participants, 41 AIDS cases or deaths occurred during 861 person-years. Accounting for measured confounders and determinants of dropout by inverse probability weighting, the full cohort hazard ratio was 0.41 (95% confidence interval: 0.26, 0.65) and the case-cohort hazard ratio was 0.47 (95% confidence interval: 0.26, 0.83). Standard multivariable-adjusted hazard ratios were closer to the null, regardless of study design. The precision lost with the case-cohort design was modest given the cost savings. Results from Monte Carlo simulations demonstrated that the proposed approach yields approximately unbiased estimates of the hazard ratio with appropriate confidence interval coverage. Marginal structural model analysis of case-cohort study designs provides a cost-efficient design coupled with an accurate analytic method for research settings in which there is time-varying confounding. PMID:22302074
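The marginal structural model described here relies on stabilized inverse-probability-of-treatment weights. A minimal sketch of the weight formula (the probabilities below are hypothetical, and in practice P(A=a|L) comes from a fitted propensity model):

```python
def stabilized_weight(treated, p_treat_marginal, p_treat_given_covs):
    """Stabilized inverse-probability-of-treatment weight: P(A=a) / P(A=a|L).
    Down-weights over-represented treatment/covariate combinations so the
    weighted sample mimics a trial in which treatment is independent of
    the measured (possibly time-varying) confounders."""
    if treated:
        return p_treat_marginal / p_treat_given_covs
    return (1 - p_treat_marginal) / (1 - p_treat_given_covs)

# A treated subject whose covariates made treatment likely gets weight < 1
print(stabilized_weight(True, 0.40, 0.80))  # 0.5
# An untreated subject whose covariates made treatment likely gets weight > 1
print(stabilized_weight(False, 0.40, 0.80))
```

The weighted Cox model is then fit on these weights; stabilization keeps their mean near 1, which improves precision relative to unstabilized weights.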
Pigmentation Traits, Sun Exposure, and Risk of Incident Vitiligo in Women.
Dunlap, Rachel; Wu, Shaowei; Wilmer, Erin; Cho, Eunyoung; Li, Wen-Qing; Lajevardi, Newsha; Qureshi, Abrar
2017-06-01
Vitiligo is the most common cutaneous depigmentation disorder worldwide, yet little is known about specific risk factors for disease development. Using data from the Nurses' Health Study, a prospective cohort study of 51,337 white women, we examined the associations between (i) pigmentary traits and (ii) reactions to sun exposure and risk of incident vitiligo. Nurses' Health Study participants responded to a question about clinician-diagnosed vitiligo and year of diagnosis (2001 or before, 2002-2005, 2006-2009, 2010-2011, or 2012+). We used Cox proportional hazards regression models to estimate the multivariate-adjusted hazard ratios and 95% confidence intervals of incident vitiligo associated with the exposure variables, adjusting for potential confounders. We documented 271 cases of incident vitiligo over 835,594 person-years. Vitiligo risk was higher in women who had at least one mole larger than 3 mm in diameter on their left arms (hazard ratio = 1.37, 95% confidence interval = 1.02-1.83). Additionally, vitiligo risk was higher among women with better tanning ability (hazard ratio = 2.59, 95% confidence interval = 1.21-5.54) and in women who experienced at least one blistering sunburn (hazard ratio = 2.17, 95% confidence interval = 1.15-4.10). In this study, upper extremity moles, a higher ability to achieve a tan, and history of a blistering sunburn were associated with a higher risk of developing vitiligo in a population of white women. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
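The crude incidence implied by the reported counts (271 cases over 835,594 person-years) can be checked directly:

```python
# Crude incidence rate from the counts reported in the abstract
cases = 271
person_years = 835_594
rate_per_100k = cases / person_years * 100_000
print(round(rate_per_100k, 1))  # ~32.4 incident vitiligo cases per 100,000 person-years
```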
Hlatky, Mark A; Ray, Roberta M; Burwen, Dale R; Margolis, Karen L; Johnson, Karen C; Kucharska-Newton, Anna; Manson, JoAnn E; Robinson, Jennifer G; Safford, Monika M; Allison, Matthew; Assimes, Themistocles L; Bavry, Anthony A; Berger, Jeffrey; Cooper-DeHoff, Rhonda M; Heckbert, Susan R; Li, Wenjun; Liu, Simin; Martin, Lisa W; Perez, Marco V; Tindle, Hilary A; Winkelmayer, Wolfgang C; Stefanick, Marcia L
2015-01-01
Background Data collected as part of routine clinical practice could be used to detect cardiovascular outcomes in pragmatic clinical trials, or in clinical registry studies. The reliability of claims data for documenting outcomes is unknown. Methods and Results We linked records of Women's Health Initiative (WHI) participants aged 65 years and older to Medicare claims data, and compared hospitalizations that had diagnosis codes for acute myocardial infarction (MI) or coronary revascularization with WHI outcomes adjudicated by study physicians. We then compared the hazard ratios for active versus placebo hormone therapy based solely on WHI adjudicated events with corresponding hazard ratios based solely on claims data for the same hormone trial participants. Agreement between WHI adjudicated outcomes and Medicare claims was good for the diagnosis for MI (kappa = 0.71 to 0.74), and excellent for coronary revascularization (kappa=0.88 to 0.91). The hormone:placebo hazard ratio for clinical MI was 1.31 (95% confidence interval (CI) 1.03 to 1.67) based on WHI outcomes, and 1.29 (CI 1.00 to 1.68) based on Medicare data. The hazard ratio for coronary revascularization was 1.09 (CI 0.88 to 1.35) based on WHI outcomes and 1.10 (CI 0.89 to 1.35) based on Medicare data. The differences between hazard ratios derived from WHI and Medicare data were not significant in 1,000 bootstrap replications. Conclusion Medicare claims may provide useful data on coronary heart disease outcomes among patients aged 65 years and older in clinical research studies. Clinical Trials Registration Information www.clinicaltrials.gov, Trial Number NCT00000611 PMID:24399330
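The kappa statistic used to quantify agreement between adjudicated outcomes and claims diagnoses can be computed from a square agreement table. A sketch with a hypothetical 2x2 table (not the WHI-Medicare data):

```python
def cohens_kappa(table):
    """Cohen's kappa from a square agreement table
    (rows: adjudicated outcome, cols: claims-based outcome).
    Kappa = (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = sum(sum(row) for row in table)
    observed = sum(table[i][i] for i in range(len(table))) / n
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    expected = sum(r * c for r, c in zip(row_tot, col_tot)) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical MI yes/no agreement table for illustration
print(round(cohens_kappa([[90, 10], [15, 885]]), 2))  # 0.86
```

By common conventions, kappa in the 0.6-0.8 range is "good" and above 0.8 "excellent", matching the abstract's characterization of MI (0.71-0.74) and revascularization (0.88-0.91).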
Hamilton, S.J.; Buhl, K.J.
1997-01-01
Larval flannelmouth sucker (Catostomus latipinnis) were exposed to arsenate, boron, copper, molybdenum, selenate, selenite, uranium, vanadium, and zinc singly, and to five mixtures of five to nine inorganics. The exposures were conducted in reconstituted water representative of the San Juan River near Shiprock, New Mexico. The mixtures simulated environmental ratios reported for sites along the San Juan River (San Juan River backwater, Fruitland marsh, Hogback East Drain, Mancos River, and McElmo Creek). The rank order of the individual inorganics, from most to least toxic, was: copper > zinc > vanadium > selenite > selenate > arsenate > uranium > boron > molybdenum. All five mixtures exhibited additive toxicity to flannelmouth sucker. In a limited number of tests, 44-day-old and 13-day-old larvae exhibited no difference in sensitivity to three mixtures. Copper was the major toxic component in four mixtures (San Juan backwater, Hogback East Drain, Mancos River, and McElmo Creek), whereas zinc was the major toxic component in the Fruitland marsh mixture, which did not contain copper. The Hogback East Drain was the most toxic mixture tested. Comparison of 96-h LC50 values with reported environmental water concentrations from the San Juan River revealed low hazard ratios for arsenic, boron, molybdenum, selenate, selenite, uranium, and vanadium, moderate hazard ratios for zinc and the Fruitland marsh mixture, and high hazard ratios for copper at three sites and four environmental mixtures representing a San Juan backwater, Hogback East Drain, Mancos River, and McElmo Creek. The high hazard ratios suggest that inorganic contaminants could adversely affect larval flannelmouth sucker in the San Juan River at four sites receiving elevated inorganics.
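The "hazard ratio" in this ecotoxicology sense compares an environmental concentration with a toxicity benchmark, unlike the survival-analysis hazard ratios elsewhere in this list. A sketch of the common hazard-quotient form (the copper values below are hypothetical; the paper may use a related but different convention):

```python
def hazard_quotient(env_conc_ug_l, lc50_ug_l):
    """Screening-level hazard quotient: measured environmental
    concentration divided by the 96-h LC50. Values approaching or
    exceeding 1 flag potential acute risk to the exposed species."""
    return env_conc_ug_l / lc50_ug_l

# Hypothetical copper concentrations for illustration
print(hazard_quotient(20.0, 25.0))  # 0.8 -> near the LC50, high concern
```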
Gratwohl, Alois; Brand, Ronald; McGrath, Eoin; van Biezen, Anja; Sureda, Anna; Ljungman, Per; Baldomero, Helen; Chabannon, Christian; Apperley, Jane
2014-05-01
Competent authorities, healthcare payers and hospitals devote increasing resources to quality management systems but scientific analyses searching for an impact of these systems on clinical outcome remain scarce. Earlier data indicated a stepwise improvement in outcome after allogeneic hematopoietic stem cell transplantation with each phase of the accreditation process for the quality management system "JACIE". We therefore tested the hypothesis that working towards and achieving "JACIE" accreditation would accelerate improvement in outcome over calendar time. Overall mortality of the entire cohort of 107,904 patients who had a transplant (41,623 allogeneic, 39%; 66,281 autologous, 61%) between 1999 and 2006 decreased over the 14-year observation period by a factor of 0.63 per 10 years (hazard ratio: 0.63; 0.58-0.69). Considering "JACIE"-accredited centers as those with programs having achieved accreditation by November 2012, at the latest, this improvement was significantly faster in "JACIE"-accredited centers than in non-accredited centers (approximately 5.3% per year for 49,459 patients versus approximately 3.5% per year for 58,445 patients, respectively; hazard ratio: 0.83; 0.71-0.97). As a result, relapse-free survival (hazard ratio 0.85; 0.75-0.95) and overall survival (hazard ratio 0.86; 0.76-0.98) were significantly higher at 72 months for those patients transplanted in the 162 "JACIE"-accredited centers. No significant effects were observed after autologous transplants (hazard ratio 1.06; 0.99-1.13). Hence, working towards implementation of a quality management system triggers a dynamic process associated with a steeper reduction in mortality over the years and a significantly improved survival after allogeneic stem cell transplantation. Our data support the use of a quality management system for complex medical procedures.
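The per-decade mortality hazard ratio of 0.63 can be converted to an approximate annual change under a constant proportional trend, which is roughly how the per-year percentages above arise:

```python
# Converting a per-decade hazard ratio to an approximate annual reduction,
# assuming the proportional trend is constant over the observation period
hr_per_decade = 0.63
hr_per_year = hr_per_decade ** (1 / 10)
annual_reduction = 1 - hr_per_year
print(round(100 * annual_reduction, 1))  # ~4.5% per year overall
```

The cohort-wide ~4.5% per year sits, as expected, between the ~5.3% for "JACIE"-accredited and ~3.5% for non-accredited centers.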
Filippini, Graziella; Falcone, Chiara; Boiardi, Amerigo; Broggi, Giovanni; Bruzzone, Maria G; Caldiroli, Dario; Farina, Rita; Farinotti, Mariangela; Fariselli, Laura; Finocchiaro, Gaetano; Giombini, Sergio; Pollo, Bianca; Savoiardo, Mario; Solero, Carlo L; Valsecchi, Maria G
2008-02-01
Reliable data on large cohorts of patients with glioblastoma are needed because such studies differ importantly from trials that have a strong bias toward the recruitment of younger patients with a higher performance status. We analyzed the outcome of 676 patients with histologically confirmed newly diagnosed glioblastoma who were treated consecutively at a single institution over a 7-year period (1997-2003) with follow-up to April 30, 2006. Survival probabilities were 57% at 1 year, 16% at 2 years, and 7% at 3 years. Progression-free survival was 15% at 1 year. Prolongation of survival was significantly associated with surgery in patients with a good performance status, whatever the patient's age, with an adjusted hazard ratio of 0.55 (p < 0.001) or a 45% relative decrease in the risk of death. Radiotherapy and chemotherapy improved survival, with adjusted hazard ratios of 0.61 (p = 0.001) and 0.89 (p = 0.04), respectively, regardless of age, performance status, or residual tumor volume. Recurrence occurred in 99% of patients throughout the follow-up. Reoperation was performed in one-fourth of these patients but was not effective, whether performed within 9 months (hazard ratio, 0.86; p = 0.256) or after 9 months (hazard ratio, 0.98; p = 0.860) of initial surgery, whereas second-line chemotherapy with procarbazine, lomustine, and vincristine (PCV) or with temozolomide improved survival (hazard ratio, 0.77; p = 0.008). Surgery followed by radiotherapy and chemotherapy should be considered in all patients with glioblastoma, and these treatments should not be withheld because of increasing age alone. The benefit of second surgery at recurrence is uncertain, and new trials are needed to assess its effectiveness. Chemotherapy with PCV or temozolomide seems to be a reasonable option at tumor recurrence.
Leone, José Pablo; Leone, Julieta; Zwenger, Ariel Osvaldo; Iturbe, Julián; Leone, Bernardo Amadeo; Vallejo, Carlos Teodoro
2017-03-01
The presence of brain metastases at the time of initial breast cancer diagnosis (BMIBCD) is uncommon. Hence, the prognostic assessment and management of these patients is very challenging. The aim of this study was to analyse the influence of tumour subtype compared with other prognostic factors in the survival of patients with BMIBCD. We evaluated women with BMIBCD, reported to the Surveillance, Epidemiology, and End Results (SEER) program from 2010 to 2013. Patients with other primary malignancy were excluded. Univariate and multivariate analyses were performed to determine the effects of each variable on overall survival (OS). We included 740 patients. Median OS for the whole population was 10 months, and 20.7% of patients were alive at 36 months. Tumour subtype distribution was: 46.6% hormone receptor (HR)+/HER2-, 17% HR+/HER2+, 14.1% HR-/HER2+ and 22.3% triple-negative. Univariate analysis showed that the presence of liver metastases, lung metastases and triple-negative patients (median OS 6 months) had worse prognosis. The HR+/HER2+ subtype had the longest OS with a median of 22 months. In multivariate analysis, older age (hazard ratio 1.8), lobular histology (hazard ratio 2.08), triple-negative subtype (hazard ratio 2.25), liver metastases (hazard ratio 1.6) and unmarried patients (hazard ratio 1.39) had significantly shorter OS. Although the prognosis of patients with BMIBCD is generally poor, 20.7% were still alive 3 years after the diagnosis. There were substantial differences in OS according to tumour subtype. In addition to tumour subtype, other independent predictors of OS are age at diagnosis, marital status, histology and liver metastases. Copyright © 2017 Elsevier Ltd. All rights reserved.
Wakai, Kenji; Sugawara, Yumi; Tsuji, Ichiro; Tamakoshi, Akiko; Shimazu, Taichi; Matsuo, Keitaro; Nagata, Chisato; Mizoue, Tetsuya; Tanaka, Keitaro; Inoue, Manami; Tsugane, Shoichiro; Sasazuki, Shizuka
2015-08-01
International reviews have concluded that consumption of fruit and vegetables might decrease the risk of lung cancer. However, the relevant epidemiological evidence still remains insufficient in Japan. Therefore, we performed a pooled analysis of data from four population-based cohort studies in Japan with >200 000 participants and >1700 lung cancer cases. We computed study-specific hazard ratios by quintiles of vegetable and fruit consumption as assessed by food frequency questionnaires. Summary hazard ratios were estimated by pooling the study-specific hazard ratios with a fixed-effect model. In men, we found inverse associations between fruit consumption and the age-adjusted and area-adjusted risk of mortality or incidence of lung cancer. However, the associations were largely attenuated after adjustment for smoking and energy intake. The significant decrease in risk among men remained only for a moderate level of fruit consumption; the lowest summary hazard ratios were found in the third quintile of intake (mortality: 0.71, 95% confidence interval 0.60-0.84; incidence: 0.83, 95% confidence interval 0.70-0.98). This decrease in risk was mainly detected in ever smokers. Conversely, vegetable intake was positively correlated with the risk of incidence of lung cancer after adjustment for smoking and energy intake in men (trend P = 0.024); the summary hazard ratio for the highest quintile was 1.26 (95% confidence interval 1.05-1.50). However, a similar association was not detected for mortality from lung cancer. In conclusion, a moderate level of fruit consumption is associated with a decreased risk of lung cancer in men among the Japanese population. © 2015 The Authors. Cancer Science published by Wiley Publishing Asia Pty Ltd on behalf of Japanese Cancer Association.
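Fixed-effect pooling of study-specific hazard ratios is standard inverse-variance weighting on the log scale. A sketch with hypothetical study inputs (not the four Japanese cohorts); the SE of each log HR is recovered from its reported 95% CI:

```python
import math

def pool_fixed_effect(hrs_with_cis, z=1.96):
    """Inverse-variance fixed-effect pooling of hazard ratios.
    Each input is (HR, lower 95% CI, upper 95% CI); the SE of log HR
    is recovered as (ln(upper) - ln(lower)) / (2 * 1.96)."""
    num = den = 0.0
    for hr, lo, hi in hrs_with_cis:
        se = (math.log(hi) - math.log(lo)) / (2 * z)
        w = 1 / se**2                 # weight = inverse variance
        num += w * math.log(hr)
        den += w
    pooled = math.exp(num / den)
    se_pooled = math.sqrt(1 / den)
    return (pooled,
            math.exp(math.log(pooled) - z * se_pooled),
            math.exp(math.log(pooled) + z * se_pooled))

# Two hypothetical study-specific HRs for illustration
print(pool_fixed_effect([(0.75, 0.60, 0.94), (0.68, 0.50, 0.92)]))
```

The pooled estimate always lies between the study-specific HRs, weighted toward the more precise (narrower-CI) study.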
Abdolvahabi, Alireza; Shi, Yunhua; Rasouli, Sanaz; Croom, Corbin M; Aliyan, Amir; Martí, Angel A; Shaw, Bryan F
2017-06-21
Over 150 mutations in SOD1 (superoxide dismutase-1) cause amyotrophic lateral sclerosis (ALS), presumably by accelerating SOD1 amyloidogenesis. Like many nucleation processes, SOD1 fibrillization is stochastic (in vitro), which inhibits the determination of aggregation rates (and obscures whether rates correlate with patient phenotypes). Here, we diverged from classical chemical kinetics and used Kaplan-Meier estimators to quantify the probability of apo-SOD1 fibrillization (in vitro) from ∼10 3 replicate amyloid assays of wild-type (WT) SOD1 and nine ALS variants. The probability of apo-SOD1 fibrillization (expressed as a Hazard ratio) is increased by certain ALS-linked SOD1 mutations but is decreased or remains unchanged by other mutations. Despite this diversity, Hazard ratios of fibrillization correlated linearly with (and for three mutants, approximately equaled) Hazard ratios of patient survival (R 2 = 0.67; Pearson's r = 0.82). No correlation exists between Hazard ratios of fibrillization and age of initial onset of ALS (R 2 = 0.09). Thus, Hazard ratios of fibrillization might explain rates of disease progression but not onset. Classical kinetic metrics of fibrillization, i.e., mean lag time and propagation rate, did not correlate as strongly with phenotype (and ALS mutations did not uniformly accelerate mean rate of nucleation or propagation). A strong correlation was found, however, between mean ThT fluorescence at lag time and patient survival (R 2 = 0.93); oligomers of SOD1 with weaker fluorescence correlated with shorter survival. This study suggests that SOD1 mutations trigger ALS by altering a property of SOD1 or its oligomers other than the intrinsic rate of amyloid nucleation (e.g., oligomer stability; rates of intercellular propagation; affinity for membrane surfaces; and maturation rate).
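The Kaplan-Meier product-limit estimator applied here to fibrillization lag times is the same one used for patient survival. A self-contained sketch with toy lag times (hours), where a censored replicate is one that never fibrillized before the assay ended:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.
    times: observation times; events: 1 = event observed (fibrillization),
    0 = censored. Returns (time, survival probability) at each event time."""
    order = sorted(zip(times, events))
    n_at_risk = len(order)
    surv, curve = 1.0, []
    i = 0
    while i < len(order):
        t = order[i][0]
        d = n = 0
        while i < len(order) and order[i][0] == t:  # group ties at time t
            d += order[i][1]
            n += 1
            i += 1
        if d:  # multiply in the conditional survival factor (1 - d/n_at_risk)
            surv *= 1 - d / n_at_risk
            curve.append((t, surv))
        n_at_risk -= n
    return curve

# Toy lag times (h); one replicate censored at 40 h without fibrillizing
print(kaplan_meier([10, 20, 20, 30, 40], [1, 1, 0, 1, 0]))
```

From many such replicate curves per variant, a Cox model then yields the hazard ratio of fibrillization for each mutant relative to WT.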
Devillier, Raynier; Dalle, Jean-Hugues; Kulasekararaj, Austin; D'aveni, Maud; Clément, Laurence; Chybicka, Alicja; Vigouroux, Stéphane; Chevallier, Patrice; Koh, Mickey; Bertrand, Yves; Michallet, Mauricette; Zecca, Marco; Yakoub-Agha, Ibrahim; Cahn, Jean-Yves; Ljungman, Per; Bernard, Marc; Loiseau, Pascale; Dubois, Valérie; Maury, Sébastien; Socié, Gérard; Dufour, Carlo; Peffault de Latour, Regis
2016-07-01
Unrelated allogeneic transplantation for severe aplastic anemia is a treatment option after immunosuppressive treatment failure in the absence of a matched sibling donor. Age, delay between disease diagnosis and transplantation, and HLA matching are the key factors in transplantation decisions, but their combined impact on patient outcomes remains unclear. Using the French Society of Bone Marrow Transplantation and Cell Therapies registry, we analyzed all consecutive patients (n=139) who underwent a first allogeneic transplantation for idiopathic severe aplastic anemia from an unrelated donor between 2000 and 2012. In an adjusted multivariate model, age over 30 years (Hazard Ratio=2.39; P=0.011), time from diagnosis to transplantation over 12 months (Hazard Ratio=2.18; P=0.027) and the use of a 9/10 mismatched unrelated donor (Hazard Ratio=2.14; P=0.036) were independent risk factors that significantly worsened overall survival. Accordingly, we built a predictive score using these three parameters, considering patients at low (zero or one risk factors, n=94) or high (two or three risk factors, n=45) risk. High-risk patients had significantly shorter survival (Hazard Ratio=3.04; P<0.001). The score was then confirmed on an independent cohort from the European Group for Blood and Marrow Transplantation database of 296 patients, with shorter survival in patients with at least 2 risk factors (Hazard Ratio=2.13; P=0.005). In conclusion, a simple score using age, transplantation timing and HLA matching would appear useful to help physicians in the daily care of patients with severe aplastic anemia. Copyright© Ferrata Storti Foundation.
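The predictive score described above is a simple count of the three risk factors. A sketch (function name and argument forms are the editor's, not the paper's):

```python
def aplastic_anemia_risk(age_years, months_dx_to_transplant, hla_match_out_of_10):
    """Count the study's three risk factors and classify:
    0-1 factors = low risk, 2-3 factors = high risk."""
    factors = sum([
        age_years > 30,                 # age over 30 years
        months_dx_to_transplant > 12,   # diagnosis-to-transplant over 12 months
        hla_match_out_of_10 < 10,       # 9/10 mismatched unrelated donor
    ])
    return factors, ("high" if factors >= 2 else "low")

# A 35-year-old transplanted 18 months after diagnosis from a 10/10 donor
print(aplastic_anemia_risk(35, 18, 10))  # (2, 'high')
```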
Lim, Angelina; Stewart, Kay; Abramson, Michael J; Walker, Susan P; George, Johnson
2012-12-19
Uncontrolled asthma during pregnancy is associated with the maternal hazards of disease exacerbation, and perinatal hazards including intrauterine growth restriction and preterm birth. Interventions directed at achieving better asthma control during pregnancy should be considered a high priority in order to optimise both maternal and perinatal outcomes. Poor compliance with prescribed asthma medications during pregnancy and suboptimal prescribing patterns to pregnant women have both been shown to be contributing factors that jeopardise asthma control. The aim is to design and evaluate an intervention involving multidisciplinary care for women experiencing asthma in pregnancy. A pilot single-blinded parallel-group randomized controlled trial testing a Multidisciplinary Approach to Management of Maternal Asthma (MAMMA©), which involves education and regular monitoring. Pregnant women with asthma will be recruited from antenatal clinics in Victoria, Australia. Recruited participants, stratified by disease severity, will be allocated to the intervention or the usual care group in a 1:1 ratio. Both groups will be followed prospectively throughout pregnancy, and outcomes will be compared between groups at three and six months after recruitment to evaluate the effectiveness of this intervention. Outcome measures include Asthma Control Questionnaire (ACQ) scores, oral corticosteroid use, asthma exacerbations, asthma-related hospital admissions, days off work, and preventer-to-reliever ratio, along with pregnancy and neonatal adverse events at delivery. The use of FEV1/FEV6 will also be investigated during this trial as a marker for asthma control. If successful, this model of care could be widely implemented in clinical practice and justify more funding for support services and resources for these women.
This intervention will also promote awareness of the risks of poorly controlled asthma and the need for a collaborative, multidisciplinary approach to asthma management during pregnancy. This is also the first study to investigate the use of FEV1/FEV6 as a marker for asthma control during pregnancy. Australian New Zealand Clinical Trials Registry (ACTRN12612000681853).
Ferreira, Mariana P; Coghill, Anna E; Chaves, Claudia B; Bergmann, Anke; Thuler, Luiz C; Soares, Esmeralda A; Pfeiffer, Ruth M; Engels, Eric A; Soares, Marcelo A
2017-02-20
We assessed mortality, treatment response, and relapse among HIV-infected and HIV-uninfected women with cervical cancer in Rio de Janeiro, Brazil. This was a cohort study of 87 HIV-infected and 336 HIV-uninfected women with cervical cancer. Patients at the Brazilian National Institute of Cancer (2001-2013) were matched on age, calendar year of diagnosis, clinical stage, and tumor histology. Staging and treatment with surgery, radiotherapy, and/or chemotherapy followed international guidelines. We used a Markov model to assess responses to initial therapy, and Cox models for mortality and relapse after complete response (CR). Among 234 deaths, most were from cancer (82% in HIV-infected vs. 93% in HIV-uninfected women); only 9% of HIV-infected women died from AIDS. HIV was not associated with mortality during initial follow-up but was associated with increased mortality more than 1-2 years after diagnosis [overall mortality: stage-adjusted hazard ratio 2.02, 95% confidence interval (CI) 1.27-3.22; cancer-specific mortality: 4.35, 1.86-10.2]. Among 222 patients treated with radiotherapy, HIV-infected women had similar response rates to initial cancer therapy as HIV-uninfected women (hazard ratio 0.98, 95% CI 0.58-1.66). However, among women who were treated and had a CR, HIV was associated with elevated risk of subsequent relapse (hazard ratio 3.60, 95% CI 1.86-6.98, adjusted for clinical stage). Among women with cervical cancer, HIV infection was not associated with initial treatment response or early mortality, but relapse after attaining a CR and late mortality were increased in those with HIV. These results point to a role for an intact immune system in control of residual tumor burden among treated cervical cancer patients.
Jackson, Richard; Psarelli, Eftychia-Eirini; Berhane, Sarah; Khan, Harun; Johnson, Philip
2017-02-20
Purpose Following the Sorafenib Hepatocellular Carcinoma Assessment Randomized Protocol (SHARP) trial, sorafenib has become the standard of care for patients with advanced unresectable hepatocellular carcinoma, but the relation between survival advantage and disease etiology remains unclear. To address this, we undertook an individual patient data meta-analysis of three large prospective randomized trials in which sorafenib was the control arm. Methods Of a total of 3,256 patients, 1,643 (50%) who received sorafenib were available. The primary end point was overall survival (OS). A Bayesian hierarchical approach for individual patient data meta-analyses was applied using a piecewise exponential model. Results are presented in terms of hazard ratios comparing sorafenib with alternative therapies according to hepatitis C virus (HCV) or hepatitis B virus (HBV) status. Results Hazard ratios show improved OS for sorafenib in patients who are both HBV negative and HCV positive (log [hazard ratio], -0.27; 95% CI, -0.46 to -0.06). Median unadjusted survival is 12.6 (11.15 to 13.8) months for sorafenib and 10.2 (8.88 to 12.2) months for "other" treatments in this subgroup. There was no evidence of improvement in OS for any other patient subgroups defined by HBV and HCV. Results were consistent across all trials with heterogeneity assessed using Cochran's Q statistic. Conclusion There is consistent evidence that the effect of sorafenib on OS is dependent on patients' hepatitis status. There is an improved OS for patients negative for HBV and positive for HCV when treated with sorafenib. There was no evidence of any improvement in OS attributable to sorafenib for patients positive for HBV and negative for HCV.
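The subgroup effect above is reported on the log scale; exponentiating recovers the familiar hazard-ratio scale. A quick check using the values from the abstract:

```python
import math

# log(hazard ratio) = -0.27 (95% CI, -0.46 to -0.06) for sorafenib
# in HBV-negative/HCV-positive patients, as reported in the abstract.
log_hr, ci_low, ci_high = -0.27, -0.46, -0.06
hr = math.exp(log_hr)                       # about 0.76, i.e. ~24% lower hazard
ci = (math.exp(ci_low), math.exp(ci_high))  # upper bound stays below 1.0,
                                            # consistent with a significant benefit
```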
Borgi, Lea; Muraki, Isao; Satija, Ambika; Willett, Walter C.; Rimm, Eric B.; Forman, John P.
2017-01-01
Increased fruit and vegetable intake lowers blood pressure in short-term interventional studies. However, data on the association of long-term intake of fruits and vegetables with hypertension risk are scarce. We prospectively examined the independent association of whole fruit (excluding juices) and vegetable intake, as well as the change in consumption of whole fruits and vegetables, with incident hypertension in three large longitudinal cohort studies: Nurses' Health Study (n=62,175), Nurses' Health Study II (n=88,475), and Health Professionals Follow-up Study (n=36,803). We calculated hazard ratios and 95% confidence intervals for fruit and vegetable consumption while controlling for hypertension risk factors. Compared with participants whose consumption was ≤4 servings/week, the pooled hazard ratios among those whose intake was ≥4 servings/day were 0.92 (0.87–0.97) for total whole fruit intake and 0.95 (0.86–1.04) for total vegetable intake. Similarly, compared with participants who did not increase their fruit or vegetable consumption, the pooled hazard ratios for those whose intake increased by ≥7 servings/week were 0.94 (0.90–0.97) for total whole fruit intake and 0.98 (0.94–1.01) for total vegetable intake. Analyses of individual fruits and vegetables yielded different results. Consumption levels of ≥4 servings/week (as opposed to <1 serving/month) of broccoli, carrots, tofu or soybeans, raisins, and apples were associated with lower hypertension risk. In conclusion, our results suggest that greater long-term intake and increased consumption of whole fruits may reduce the risk of developing hypertension. PMID:26644239
Ropponen, Annina; Samuelsson, Åsa; Alexanderson, Kristina; Svedberg, Pia
2013-01-01
Background Occupations and psychosocial working conditions have rarely been investigated as predictors of disability pension in population-based samples. This study investigated how occupational groups and psychosocial working conditions are associated with future disability pension due to musculoskeletal diagnoses, accounting for familial factors in the associations. Methods A sample of 24 543 same-sex Swedish twin individuals was followed from 1993 to 2008 using nationwide registries. Baseline data on occupations were categorized into eight sector-defined occupational groups. These were further used to reflect psychosocial working conditions by applying the job strain scores of a Job Exposure Matrix. Cox proportional hazard ratios (HR) were estimated. Results During the 12-year (average) follow-up, 7% of the sample was granted disability pension due to musculoskeletal diagnoses. Workers in health care and social work; agriculture, forestry and fishing; transportation; production and mining; and the service and military work sectors were two to three times more likely to receive a disability pension than those in the administration and management sector. Each single unit decrease in job demands and each single unit increase in job control and social support significantly predicted disability pension. Individuals with high work strain or an active job had a lower hazard ratio of disability pension, whereas a passive job predicted a significantly higher hazard ratio. Accounting for familial confounding did not alter these results. Conclusion Occupational groups and psychosocial working conditions seem to be independent of familial confounding, and hence represent risk factors for disability pension due to musculoskeletal diagnoses. This means that preventive measures in these sector-defined occupational groups and specific psychosocial working conditions might prevent disability pension due to musculoskeletal diagnoses. PMID:24040914
Frankland, Andrew; Roberts, Gloria; Holmes-Preston, Ellen; Perich, Tania; Levy, Florence; Lenroot, Rhoshel; Hadzi-Pavlovic, Dusan; Breakspear, Michael; Mitchell, Philip B
2017-11-07
Identifying clinical features that predict conversion to bipolar disorder (BD) in those at high familial risk (HR) would assist in identifying a more focused population for early intervention. In total 287 participants aged 12-30 (163 HR with a first-degree relative with BD and 124 controls (CONs)) were followed annually for a median of 5 years. We used the baseline presence of DSM-IV depressive, anxiety, behavioural and substance use disorders, as well as a constellation of specific depressive symptoms (as identified by the Probabilistic Approach to Bipolar Depression) to predict the subsequent development of hypo/manic episodes. At baseline, HR participants were significantly more likely to report ⩾4 Probabilistic features (40.4%) when depressed than CONs (6.7%; p < .05). Nineteen HR subjects later developed either threshold (n = 8; 4.9%) or subthreshold (n = 11; 6.7%) hypo/mania. The presence of ⩾4 Probabilistic features was associated with a seven-fold increase in the risk of 'conversion' to threshold BD (hazard ratio = 6.9, p < .05) above and beyond the fourteen-fold increase in risk related to major depressive episodes (MDEs) per se (hazard ratio = 13.9, p < .05). Individual depressive features predicting conversion were psychomotor retardation and ⩾5 MDEs. Behavioural disorders only predicted conversion to subthreshold BD (hazard ratio = 5.23, p < .01), while anxiety and substance disorders did not predict either threshold or subthreshold hypo/mania. This study suggests that specific depressive characteristics substantially increase the risk of young people at familial risk of BD going on to develop future hypo/manic episodes and may identify a more targeted HR population for the development of early intervention programs.
Stoner, Marie C D; Pettifor, Audrey; Edwards, Jessie K; Aiello, Allison E; Halpern, Carolyn T; Julien, Aimée; Selin, Amanda; Twine, Rhian; Hughes, James P; Wang, Jing; Agyei, Yaw; Gomez-Olive, F Xavier; Wagner, Ryan G; MacPhail, Catherine; Kahn, Kathleen
2017-09-24
To estimate the association between school attendance, school dropout, and risk of incident HIV and herpes simplex virus type 2 (HSV-2) infection among young women. We used longitudinal data from a randomized controlled trial in rural Mpumalanga province, South Africa, to assess the association between school days attended, school dropout, and incident HIV and HSV-2 in young women aged 13-23 years. We examined inverse probability of exposure weighted survival curves and used them to calculate 1.5, 2.5, and 3.5-year risk differences and risk ratios for the effect of school attendance on incident HIV and HSV-2. A marginal structural Cox model was used to estimate hazard ratios for the effect of school attendance and school dropout on incident infection. Risk of infection increased over time as young women aged, and was higher in young women with low school attendance (<80% school days) compared with high (≥80% school days). Young women with low attendance were more likely to acquire HIV [hazard ratio (HR): 2.97; 95% confidence interval (CI): 1.62, 5.45] and HSV-2 (HR: 2.47; 95% CI: 1.46, 4.17) over the follow-up period than young women with high attendance. Similarly, young women who dropped out of school had a higher weighted hazard of both HIV (HR 3.25 95% CI: 1.67, 6.32) and HSV-2 (HR 2.70; 95% CI 1.59, 4.59). Young women who attend more school days and stay in school have a lower risk of incident HIV and HSV-2 infection. Interventions to increase frequency of school attendance and prevent dropout should be promoted to reduce risk of infection.
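The marginal structural Cox model above relies on inverse-probability-of-exposure weights; in their stabilized form, each person is weighted by the marginal probability of their observed exposure divided by the probability of that exposure given covariates. A generic sketch of the weight itself (not the trial's actual code; in practice both probabilities come from fitted models):

```python
def stabilized_weight(p_exposed_marginal, p_exposed_given_covariates, exposed):
    """Stabilized inverse-probability-of-exposure weight for one subject.
    exposed: whether this subject's observed exposure (e.g. high school
    attendance) is the 'exposed' level. Probabilities are supplied directly
    here for illustration."""
    if exposed:
        return p_exposed_marginal / p_exposed_given_covariates
    return (1.0 - p_exposed_marginal) / (1.0 - p_exposed_given_covariates)
```

Subjects whose observed exposure is unlikely given their covariates receive larger weights, so the weighted sample mimics one in which exposure is independent of those covariates.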
Fives, Cassie; Nae, Andreea; Roche, Phoebe; O'Leary, Gerard; Fitzgerald, Brendan; Feeley, Linda; Sheahan, Patrick
2017-04-01
Previous studies have reported variable results for the impact of bone invasion on survival in oral cancer. It is unclear whether bone invasion in small (≤4 cm) squamous cell carcinomas (SCC) of the oral cavity is an independent adverse prognosticator. Our objective was to investigate impact on survival of bone invasion in SCC of floor of mouth (FOM), lower alveolus (LA), and retromolar trigone (RMT) ≤4 cm in size. Retrospective study of 96 patients with SCC of the FOM, LA, and RMT undergoing primary surgical treatment. Original pathology reports and slides were reviewed by three pathologists. Level of bone invasion was categorized as cortical or medullary. Main outcome measures were local control (LC) and overall survival (OS). Bone invasion was present in 31 cases (32%). On review of pathology slides, all cases of bone invasion demonstrated medullary involvement. Median follow-up was 36 months for all patients, and 53 months for patients not dying from cancer. Among tumors ≤4 cm, bone invasion was associated with significantly worse LC (P =.04) and OS (P =.0005). Medullary invasion (hazard ratio: 2.2, 95% confidence interval: 1.1-4.4, P =.03), postoperative radiotherapy (hazard ratio: 0.3, 95% confidence interval: 0.1-0.5, P <.001), and positive pathologic nodal status (hazard ratio: 4.1, 95% confidence interval: 1.9-8.6, P <.001) were independent predictors of worse OS among the entire cohort. Mandibular medullary bone invasion is a poor prognosticator in oral cancers, irrespective of small size of primary tumor. Such cases should be considered for postoperative radiotherapy. Level of Evidence: 4. Laryngoscope, 127:849-854, 2017. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.
Liang, Jackson J; Elafros, Melissa A; Muser, Daniele; Pathak, Rajeev K; Santangeli, Pasquale; Zado, Erica S; Frankel, David S; Supple, Gregory E; Schaller, Robert D; Deo, Rajat; Garcia, Fermin C; Lin, David; Hutchinson, Mathew D; Riley, Michael P; Callans, David J; Marchlinski, Francis E; Dixit, Sanjay
2016-11-01
Transformation from persistent to paroxysmal atrial fibrillation (AF) after ablation suggests modification of the underlying substrate. We examined the nature of initial arrhythmia recurrence in patients with nonparoxysmal AF undergoing antral pulmonary vein isolation and nonpulmonary vein trigger ablation and correlated recurrence type with long-term ablation efficacy after the last procedure. Three hundred and seventeen consecutive patients with persistent (n=200) and long-standing persistent (n=117) AF undergoing first ablation were included. AF recurrence was defined as early (≤6 weeks) or late (>6 weeks after ablation) and paroxysmal (either spontaneous conversion or treated with cardioversion ≤7 days) or persistent (lasting >7 days). During median follow-up of 29.8 (interquartile range: 14.8-49.9) months, 221 patients had ≥1 recurrence. Initial recurrence was paroxysmal in 169 patients (76%) and persistent in 52 patients (24%). Patients experiencing paroxysmal (versus persistent) initial recurrence were more likely to achieve long-term freedom off antiarrhythmic drugs (hazard ratio, 2.2; 95% confidence interval, 1.5-3.2; P<0.0001), freedom on/off antiarrhythmic drugs (hazard ratio, 2.5; 95% confidence interval, 1.6-3.8; P<0.0001), and arrhythmia control (hazard ratio, 5.2; 95% confidence interval, 2.9-9.2; P<0.0001) after last ablation. In patients with persistent and long-standing persistent AF, limited ablation targeting pulmonary veins and documented nonpulmonary vein triggers improves the maintenance of sinus rhythm and reverses disease progression. Transformation to paroxysmal AF after initial ablation may be a step toward long-term freedom from recurrent arrhythmia. © 2016 American Heart Association, Inc.
Aspergillus sensitization or carriage in cystic fibrosis patients.
Fillaux, Judith; Brémont, François; Murris, Marlène; Cassaing, Sophie; Tétu, Laurent; Segonds, Christine; Pipy, Bernard; Magnaval, Jean-François
2014-07-01
Aspergillus fumigatus (Af) sensitization and persistent carriage are deleterious to lung function, but no consensus has been reached defining these medical entities. This work aimed to identify possible predictive factors for patients who become sensitized to Af, compared with a control group of non-sensitized Af carriers. Between 1995 and 2007, 117 pediatric patients were evaluated. Demographic data, CFTR gene mutations, body mass index and FEV1 were recorded. The presence of Af in sputum and the levels of Af-precipitin, total IgE (t-IgE) and specific IgE to Af (Af-IgE) were determined. Patients were divided into 2 groups: (1) "sensitization": Af-IgE level > 0.35 IU/mL with t-IgE level < 500 IU/mL and (2) "persistent or transient carriage": Af-IgE level ≤ 0.35 IU/mL with either a transient or persistent positive Af culture. A survival analysis was performed with the appearance of Af-IgE in serum as the outcome variable. Severe mutation (hazard ratio = 3.2), FEV1 baseline over 70% of theoretical value (hazard ratio = 4.9), and absence of Pa colonization, catalase activity and previous azithromycin administration (hazard ratio = 9.8, 4.1 and 1.9, respectively) were predictive factors for sensitization. We propose a timeline of the biological events and a tree diagram for risk calculation. Two profiles of cystic fibrosis patients can be envisaged: (1) patients with a nonsevere mutation but a low FEV1 baseline, who tend to become colonized with Af; and (2) patients with a severe mutation and a high FEV1 baseline, who are more susceptible to Af sensitization and subsequently to an allergic bronchopulmonary aspergillosis event.
Borgi, Lea; Muraki, Isao; Satija, Ambika; Willett, Walter C; Rimm, Eric B; Forman, John P
2016-02-01
Increased fruit and vegetable intake lowers blood pressure in short-term interventional studies. However, data on the association of long-term intake of fruits and vegetables with hypertension risk are scarce. We prospectively examined the independent association of whole fruit (excluding juices) and vegetable intake, as well as the change in consumption of whole fruits and vegetables, with incident hypertension in 3 large longitudinal cohort studies: Nurses' Health Study (n=62 175), Nurses' Health Study II (n=88 475), and Health Professionals Follow-up Study (n=36 803). We calculated hazard ratios and 95% confidence intervals for fruit and vegetable consumption while controlling for hypertension risk factors. Compared with participants whose consumption was ≤4 servings/week, the pooled hazard ratios among those whose intake was ≥4 servings/day were 0.92 (0.87-0.97) for total whole fruit intake and 0.95 (0.86-1.04) for total vegetable intake. Similarly, compared with participants who did not increase their fruit or vegetable consumption, the pooled hazard ratios for those whose intake increased by ≥7 servings/week were 0.94 (0.90-0.97) for total whole fruit intake and 0.98 (0.94-1.01) for total vegetable intake. Analyses of individual fruits and vegetables yielded different results. Consumption levels of ≥4 servings/week (as opposed to <1 serving/month) of broccoli, carrots, tofu or soybeans, raisins, and apples were associated with lower hypertension risk. In conclusion, our results suggest that greater long-term intake and increased consumption of whole fruits may reduce the risk of developing hypertension. © 2015 American Heart Association, Inc.
Abraham, Alison G; Betoko, Aisha; Fadrowski, Jeffrey J; Pierce, Christopher; Furth, Susan L; Warady, Bradley A; Muñoz, Alvaro
2017-04-01
Clinical care decisions to treat chronic kidney disease (CKD) in a growing child must often be made without the benefit of evidence from clinical trials. We used observational data from the Chronic Kidney Disease in Children cohort to estimate the effectiveness of renin-angiotensin-aldosterone system (RAAS) blockade in delaying renal replacement therapy (RRT) in children with CKD. A total of 851 participants (median age: 11 years, median glomerular filtration rate [GFR]: 52 ml/min/1.73 m2, median urine protein to creatinine ratio: 0.35 mg/mg) were included. RAAS use was reported at annual study visits. Both Cox proportional hazards models with time-varying RAAS exposure and Cox marginal structural models (MSM) were used to evaluate the effect of RAAS use on time to RRT. Analyses were adjusted or weighted to control for age, male sex, glomerular diagnosis, GFR, nephrotic-range proteinuria, anemia, elevated blood pressure, acidosis, elevated phosphate and elevated potassium. There were 217 RRT events over a 4.1-year median follow-up. At baseline, 472 children (55%) were prevalent RAAS users, who were more likely to be older, have a glomerular etiology, have higher urine protein, be anemic, have elevated serum phosphate and potassium, and take more medications, but less likely to have elevated blood pressure, compared with non-users. RAAS use was found to reduce the risk of RRT by 21% (hazard ratio: 0.79) to 37% (hazard ratio: 0.63) in standard regression-adjusted and MSM models, respectively. These results support inferences from adult studies of a substantial benefit of RAAS use in pediatric CKD patients.
Ropponen, Annina; Samuelsson, Åsa; Alexanderson, Kristina; Svedberg, Pia
2013-09-16
Occupations and psychosocial working conditions have rarely been investigated as predictors of disability pension in population-based samples. This study investigated how occupational groups and psychosocial working conditions are associated with future disability pension due to musculoskeletal diagnoses, accounting for familial factors in the associations. A sample of 24,543 same-sex Swedish twin individuals was followed from 1993 to 2008 using nationwide registries. Baseline data on occupations were categorized into eight sector-defined occupational groups. These were further used to reflect psychosocial working conditions by applying the job strain scores of a Job Exposure Matrix. Cox proportional hazard ratios (HR) were estimated. During the 12-year (average) follow-up, 7% of the sample was granted disability pension due to musculoskeletal diagnoses. Workers in health care and social work; agriculture, forestry and fishing; transportation; production and mining; and the service and military work sectors were two to three times more likely to receive a disability pension than those in the administration and management sector. Each single unit decrease in job demands and each single unit increase in job control and social support significantly predicted disability pension. Individuals with high work strain or an active job had a lower hazard ratio of disability pension, whereas a passive job predicted a significantly higher hazard ratio. Accounting for familial confounding did not alter these results. Occupational groups and psychosocial working conditions seem to be independent of familial confounding, and hence represent risk factors for disability pension due to musculoskeletal diagnoses. This means that preventive measures in these sector-defined occupational groups and specific psychosocial working conditions might prevent disability pension due to musculoskeletal diagnoses.
Patent foramen ovale and the risk of ischemic stroke in a multiethnic population.
Di Tullio, Marco R; Sacco, Ralph L; Sciacca, Robert R; Jin, Zhezhen; Homma, Shunichi
2007-02-20
We sought to assess the risk of ischemic stroke from a patent foramen ovale (PFO) in the multiethnic prospective cohort of northern Manhattan. Patent foramen ovale has been associated with increased risk of ischemic stroke, mainly in case-control studies. The actual PFO-related stroke risk in the general population is unclear. The presence of PFO was assessed at baseline by using transthoracic 2-dimensional echocardiography with contrast injection in 1,100 stroke-free subjects older than 39 years of age (mean age 68.7 +/- 10.0 years) from the Northern Manhattan Study (NOMAS). The presence of atrial septal aneurysm (ASA) also was recorded. Subjects were followed annually for outcomes. We assessed PFO/ASA-related stroke risk after adjusting for established stroke risk factors. We detected PFO in 164 subjects (14.9%); ASA was present in 27 subjects (2.5%) and associated with PFO in 19 subjects. During a mean follow-up of 79.7 +/- 28.0 months, an ischemic stroke occurred in 68 subjects (6.2%). After adjustment for demographics and risk factors, PFO was not found to be significantly associated with stroke (hazard ratio 1.64, 95% confidence interval [CI] 0.87 to 3.09). The same trend was observed in all age, gender, and race-ethnic subgroups. The coexistence of PFO and ASA did not increase the stroke risk (adjusted hazard ratio 1.25, 95% CI 0.17 to 9.24). Isolated ASA was associated with elevated stroke incidence (2 of 8, or 25%; adjusted hazard ratio 3.66, 95% CI 0.88 to 15.30). Patent foramen ovale, alone or together with ASA, was not associated with an increased stroke risk in this multiethnic cohort. The independent role of ASA needs further assessment in appositely designed and powered studies.
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2013 CFR
2013-04-01
... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2011 CFR
2011-04-01
... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2014 CFR
2014-04-01
... identified food safety hazards, including as appropriate: (i) Critical control points designed to control... control points designed to control food safety hazards introduced outside the processing plant environment... Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF...
Tanimura, Kazuya; Sato, Susumu; Fuseya, Yoshinori; Hasegawa, Koichi; Uemasu, Kiyoshi; Sato, Atsuyasu; Oguma, Tsuyoshi; Hirai, Toyohiro; Mishima, Michiaki; Muro, Shigeo
2016-03-01
Loss of skeletal muscle mass and physical inactivity are important manifestations of chronic obstructive pulmonary disease (COPD), and both are closely related to poor prognoses in patients with COPD. Antigravity muscles are involved in maintaining normal posture and are prone to atrophy with inactivity. The erector spinae muscles (ESM) are one of the antigravity muscle groups, and they can be assessed by chest computed tomography (CT). We hypothesized that the cross-sectional area of ESM (ESMCSA) visualized on chest CT images may serve as a predictor of mortality in patients with COPD. This study was part of the prospective observational study undertaken at Kyoto University Hospital. ESMCSA was measured on a single-slice axial CT image at the level of the 12th thoracic vertebra in patients with COPD. The cross-sectional area of the pectoralis muscles (PMCSA) was also measured. We evaluated the relationship between ESMCSA and clinical parameters, including mortality, in patients with COPD. Age- and height-matched smoking control subjects were also evaluated. In total, 130 male patients and 20 smoking control males were enrolled in this study. ESMCSA was significantly lower in patients with COPD than in the smoking control subjects and was significantly correlated with disease severity. There was a significant but only moderate correlation between ESMCSA and PMCSA. ESMCSA was significantly correlated with previously reported prognostic factors, such as body mass index, dyspnea (modified Medical Research Council dyspnea scale score), FEV1 percent predicted value, inspiratory capacity to total lung capacity ratio, and emphysema severity (percentage of the lung field occupied by low attenuation area). Compared with PMCSA, ESMCSA was more strongly associated with mortality in patients with COPD. 
Stepwise multivariate Cox proportional hazards analysis revealed that, among these known prognostic factors, ESMCSA was the strongest risk factor for mortality (hazard ratio, 0.85; 95% confidence interval, 0.79-0.92; P < 0.001) and mMRC dyspnea scale score was an additional factor (hazard ratio, 2.35; 95% confidence interval, 1.51-3.65; P < 0.001). ESMCSA assessed by chest CT may be a valuable clinical parameter, as ESMCSA correlates significantly with physiological parameters, symptoms, and disease prognosis.
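The ESMCSA hazard ratio above (0.85) is per one-unit increase in cross-sectional area; under the proportional hazards assumption, per-unit ratios multiply across units. A small illustration of that arithmetic (the unit itself, presumably cm², is not stated in the abstract):

```python
def hazard_ratio_for_increase(hr_per_unit, units):
    """Cox-model hazard ratio implied by a multi-unit covariate increase:
    HR(k units) = HR(1 unit) ** k. Illustrative arithmetic only."""
    return hr_per_unit ** units

# e.g. a 5-unit larger ESMCSA implies a hazard about 0.85**5 ≈ 0.44
# times that of the reference, all else equal.
```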
Mason, Malcolm D.; Clarke, Noel W.; James, Nicholas D.; Dearnaley, David P.; Spears, Melissa R.; Ritchie, Alastair W.S.; Attard, Gerhardt; Cross, William; Jones, Rob J.; Parker, Christopher C.; Russell, J. Martin; Thalmann, George N.; Schiavone, Francesca; Cassoly, Estelle; Matheson, David; Millman, Robin; Rentsch, Cyrill A.; Barber, Jim; Gilson, Clare; Ibrahim, Azman; Logue, John; Lydon, Anna; Nikapota, Ashok D.; O’Sullivan, Joe M.; Porfiri, Emilio; Protheroe, Andrew; Srihari, Narayanan Nair; Tsang, David; Wagstaff, John; Wallace, Jan; Walmsley, Catherine; Parmar, Mahesh K.B.; Sydes, Matthew R.
2017-01-01
Purpose Systemic Therapy for Advanced or Metastatic Prostate Cancer: Evaluation of Drug Efficacy is a randomized controlled trial using a multiarm, multistage, platform design. It recruits men with high-risk, locally advanced or metastatic prostate cancer who were initiating long-term hormone therapy. We report survival data for two celecoxib (Cel)-containing comparisons, which stopped accrual early at interim analysis on the basis of failure-free survival. Patients and Methods Standard of care (SOC) was hormone therapy continuously (metastatic) or for ≥ 2 years (nonmetastatic); prostate (± pelvic node) radiotherapy was encouraged for men without metastases. Cel 400 mg was administered twice a day for 1 year. Zoledronic acid (ZA) 4 mg was administered for six 3-weekly cycles, then 4-weekly for 2 years. Stratified random assignment allocated patients 2:1:1 to SOC (control), SOC + Cel, or SOC + ZA + Cel. The primary outcome measure was all-cause mortality. Results were analyzed with Cox proportional hazards and flexible parametric models adjusted for stratification factors. Results A total of 1,245 men were randomly assigned (Oct 2005 to April 2011). Groups were balanced: median age, 65 years; 61% metastatic, 14% N+/X M0, 25% N0M0; 94% newly diagnosed; median prostate-specific antigen, 66 ng/mL. Median follow-up was 69 months. Grade 3 to 5 adverse events were seen in 36% SOC-only, 33% SOC + Cel, and 32% SOC + ZA + Cel patients. There were 303 control arm deaths (83% prostate cancer), and median survival was 66 months. Compared with SOC, the adjusted hazard ratio was 0.98 (95% CI, 0.80 to 1.20; P = .847; median survival, 70 months) for SOC + Cel and 0.86 (95% CI, 0.70 to 1.05; P =.130; median survival, 76 months) for SOC + ZA + Cel. Preplanned subgroup analyses in men with metastatic disease showed a hazard ratio of 0.78 (95% CI, 0.62 to 0.98; P = .033) for SOC + ZA + Cel. Conclusion These data show no overall evidence of improved survival with Cel. 
Preplanned subgroup analyses provide hypotheses for future studies. PMID:28300506
Willey, Joshua; Gardener, Hannah; Cespedes, Sandino; Cheung, Ying K; Sacco, Ralph L; Elkind, Mitchell S V
2017-11-01
There is growing evidence that increased dietary sodium (Na) intake increases the risk of vascular diseases, including stroke, at least in part via an increase in blood pressure. Higher dietary potassium (K), seen with increased intake of fruits and vegetables, is associated with lower blood pressure. The goal of this study was to determine the association of the dietary Na:K ratio with risk of stroke in a multiethnic urban population. Stroke-free participants from the Northern Manhattan Study, a population-based cohort study of stroke incidence, were followed up for incident stroke. Baseline food frequency questionnaires were analyzed for Na and K intake. We estimated the hazard ratios and 95% confidence intervals for the association of Na:K with incident total stroke using multivariable Cox proportional hazards models. Among 2570 participants with dietary data (mean age, 69±10 years; 64% women; 21% white; 55% Hispanic; 24% black), the mean Na:K ratio was 1.22±0.43. Over a mean follow-up of 12 years, there were 274 strokes. In adjusted models, a higher Na:K ratio was associated with increased risk for stroke (hazard ratio, 1.6; 95% confidence interval, 1.2-2.1) and specifically ischemic stroke (hazard ratio, 1.6; 95% confidence interval, 1.2-2.1). The dietary Na:K ratio is an independent predictor of stroke risk. Further studies are required to understand the joint effect of Na and K intake on risk of cardiovascular disease. © 2017 American Heart Association, Inc.
Candida transmission and sexual behaviors as risks for a repeat episode of Candida vulvovaginitis.
Reed, Barbara D; Zazove, Philip; Pierson, Carl L; Gorenflo, Daniel W; Horrocks, Julie
2003-12-01
To assess associations between female and male factors and the risk of recurring Candida vulvovaginitis. A prospective cohort study of 148 women with Candida vulvovaginitis and 78 of their male sexual partners was conducted at two primary care practices in the Ann Arbor, Michigan, area. Thirty-three of 148 women developed at least one further episode of Candida albicans vulvovaginitis within 1 year of follow-up. Cultures of Candida species from various sites of the woman (tongue, feces, vulva, and vagina) and from her partner (tongue, feces, urine, and semen) did not predict recurrences. Female factors associated with recurrence included recent masturbating with saliva (hazard ratio 2.66 [95% CI 1.17-6.06]) or cunnilingus (hazard ratio 2.94 [95% CI 1.12-7.68]) and ingestion of two or more servings of bread per day (p = 0.05). Male factors associated with recurrences in the woman included history of the male masturbating with saliva in the previous month (hazard ratio 3.68 [95% CI 1.24-10.87]) and lower age at first intercourse (hazard ratio 0.83 [95% CI 0.71-0.96]). Sexual behaviors, rather than the presence of Candida species at various body locations of the male partner, are associated with recurrences of C. albicans vulvovaginitis.
Covariate Measurement Error Correction Methods in Mediation Analysis with Failure Time Data
Zhao, Shanshan
2014-01-01
Summary Mediation analysis is important for understanding the mechanisms whereby one variable causes changes in another. Measurement error could obscure the ability of the potential mediator to explain such changes. This paper focuses on developing correction methods for measurement error in the mediator with failure time outcomes. We consider a broad definition of measurement error, including technical error and error associated with temporal variation. The underlying model with the ‘true’ mediator is assumed to be of the Cox proportional hazards model form. The induced hazard ratio for the observed mediator no longer has a simple form independent of the baseline hazard function, due to the conditioning event. We propose a mean-variance regression calibration approach and a follow-up time regression calibration approach, to approximate the partial likelihood for the induced hazard function. Both methods demonstrate value in assessing mediation effects in simulation studies. These methods are generalized to multiple biomarkers and to both case-cohort and nested case-control sampling design. We apply these correction methods to the Women's Health Initiative hormone therapy trials to understand the mediation effect of several serum sex hormone measures on the relationship between postmenopausal hormone therapy and breast cancer risk. PMID:25139469
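The regression calibration idea described above can be sketched in a few lines. This is a simplified classical-measurement-error illustration, not the authors' mean-variance or follow-up time method: it assumes an error model W = X + e and a known reliability ratio (both assumptions for illustration; the data are invented).

```python
def regression_calibration(w, reliability):
    """Approximate the 'true' covariate X by E[X | W] under a classical
    measurement-error model W = X + e:
        E[X | W] = mean(W) + reliability * (W - mean(W)),
    where `reliability` = var(X) / var(W) is assumed known, e.g. estimated
    from replicate measurements. The calibrated values, not the raw ones,
    would then enter the Cox partial likelihood."""
    mean_w = sum(w) / len(w)
    return [mean_w + reliability * (wi - mean_w) for wi in w]

# Error-prone measurements are shrunk toward the mean before model fitting.
print(regression_calibration([0.0, 10.0], reliability=0.5))  # -> [2.5, 7.5]
```

The shrinkage toward the mean counteracts the attenuation of hazard ratio estimates that raw error-prone measurements would produce.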
Quantifying the relative risk of sex offenders: risk ratios for static-99R.
Hanson, R Karl; Babchishin, Kelly M; Helmus, Leslie; Thornton, David
2013-10-01
Given the widespread use of empirical actuarial risk tools in corrections and forensic mental health, it is important that evaluators and decision makers understand how scores relate to recidivism risk. In the current study, we found strong evidence for a relative risk interpretation of Static-99R scores using 8 samples from Canada, the United Kingdom, and Western Europe (N = 4,037 sex offenders). Each increase in Static-99R score was associated with a stable and consistent increase in relative risk (as measured by an odds ratio or hazard ratio of approximately 1.4). Hazard ratios from Cox regression were used to calculate risk ratios that can be reported for Static-99R. We recommend that evaluators consider risk ratios as a useful, nonarbitrary metric for quantifying and communicating risk information. To avoid misinterpretation, however, risk ratios should be presented with recidivism base rates.
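The relative-risk interpretation reported above (a hazard ratio of approximately 1.4 per score point) implies a simple multiplicative calculation: the risk ratio between two scores is 1.4 raised to the score difference. A minimal sketch (the choice of reference score is an arbitrary assumption for illustration):

```python
def risk_ratio(score, reference_score, hr_per_point=1.4):
    """Relative risk of `score` versus `reference_score` when each
    additional point multiplies the hazard by a constant factor
    (~1.4 per point, per the study's Cox regression results)."""
    return hr_per_point ** (score - reference_score)

# A score three points above the reference carries roughly 2.7x the risk.
print(round(risk_ratio(5, 2), 3))  # -> 2.744
```

As the abstract cautions, such ratios quantify risk only relative to a baseline; they must be paired with recidivism base rates to convey absolute risk.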
Survival of cancer patients treated with mistletoe extract (Iscador): a systematic literature review
2009-01-01
Background In Europe, extracts from Viscum album (VA-E), the European white-berry mistletoe, are widely used to treat patients with cancer. Methods We searched several databases such as Cochrane, EMBASE, NCCAM, NLM, DIMDI, CAMbase, and Medline. Inclusion criteria were controlled clinical studies on parameters associated with survival in cancer patients treated with Iscador. Outcome data were extracted as they were given in the publication, and expressed as hazard ratios (HR), their logarithm, and the respective standard errors using standard formulas. Results We found 49 publications on the clinical effects of Iscador usage on survival of cancer patients which met our criteria. Among them, 41 studies and strata provided enough data to extract hazard ratios (HR) and their standard errors (Iscador versus no extra treatment). The majority of studies reported positive effects in favour of the Iscador application. Heterogeneity of study results was moderate (I2 = 38.3%, p < 0.0001). The funnel plots were considerably skewed, indicating a publication bias, a notion which is corroborated by statistical means (AC = -1.3, CI: -1.9 to -0.6, p ≤ 0.0001). A random effect meta-analysis estimated the overall hazard ratio at HR = 0.59 (CI: 0.53 to 0.66, p < 0.0001). Randomized studies showed smaller effects than non-randomized studies (ratio of HRs: 1.24, CI: 0.79 to 1.92, p = 0.35), and matched-pair studies gave significantly better results than others (ratio of HRs: 0.33; CI: 0.17 to 0.65, p = 0.0012). Conclusions Pooled analysis of clinical studies suggests that adjuvant treatment of cancer patients with the mistletoe extract Iscador is associated with better survival. Despite obvious limitations, and strong hints for a publication bias which limits the evidence found in this meta-analysis, one cannot ignore the fact that studies with positive effects of VA-E on survival of cancer patients are accumulating.
Future studies evaluating the effects of Iscador should focus on a transparent design and description of endpoints in order to provide greater insight into a treatment often being depreciated as ineffective, but highly valued by cancer patients. PMID:20021637
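The pooling step described in the abstract above — per-study log hazard ratios and standard errors combined by a random-effects meta-analysis — can be sketched with a DerSimonian–Laird estimator. The numbers in the usage example are invented for illustration; this is a generic sketch, not the authors' exact analysis code.

```python
import math

def pool_hazard_ratios(log_hrs, ses):
    """Random-effects (DerSimonian-Laird) pooled hazard ratio with 95% CI,
    given per-study log hazard ratios and their standard errors."""
    w = [1.0 / se ** 2 for se in ses]                    # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_hrs)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_hrs))  # Cochran's Q
    df = len(log_hrs) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                        # between-study variance
    w_re = [1.0 / (se ** 2 + tau2) for se in ses]        # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_re, log_hrs)) / sum(w_re)
    se_pooled = math.sqrt(1.0 / sum(w_re))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

# Three illustrative studies, each with HR 0.6 and SE 0.1 on the log scale.
hr, lo, hi = pool_hazard_ratios([math.log(0.6)] * 3, [0.1, 0.1, 0.1])
```

With no observed heterogeneity, tau² is truncated at zero and the estimate reduces to the fixed-effect (inverse-variance) pooled hazard ratio.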
Birth Weight, Physical Morbidity, and Mortality: A Population-based Sibling-Comparison Study
Class, Quetzal A.; Rickert, Martin E.; Lichtenstein, Paul; D'Onofrio, Brian M.
2014-01-01
Associations between low birth weight (≤2,500 g) and increased risk of mortality and morbidity provided the foundation for the “developmental origins of health and disease” hypothesis. Previous between-family studies could not control for unmeasured confounders. Therefore, we compared differentially exposed siblings to estimate the extent to which the associations were due to uncontrolled factors. Our population cohort included 3,291,773 persons born in Sweden from 1973 to 2008. Analyses controlled for gestational age, among other covariates, and considered birth weight as both an ordinal and a continuous variable. Outcomes included mortality after 1 year, cardiac-related death, hypertension, ischemic heart disease, pulmonary circulation problems, stroke, and type 2 diabetes mellitus. We fitted fixed-effects models to compare siblings and conducted sensitivity analyses to test alternative explanations. Across the population, the lower the birth weight, the greater the risk of mortality (e.g., cardiac-related death (low birth weight hazard ratio = 2.69, 95% confidence interval: 2.05, 3.53)) and morbidity (e.g., type 2 diabetes mellitus (low birth weight hazard ratio = 1.79, 95% confidence interval: 1.50, 2.14)) outcomes in comparison with normal birth weight. All associations were independent of shared familial confounders and measured covariates. Results emphasize the importance of birth weight as a risk factor for subsequent mortality and morbidity. PMID:24355331
Engels, E A; Chen, J; Viscidi, R P; Shah, K V; Daniel, R W; Chatterjee, N; Klebanoff, M A
2004-08-15
Before 1963, poliovirus vaccine produced in the United States was contaminated with simian virus 40 (SV40), which causes cancer in animals. To examine whether early-life SV40 infection can cause human cancer, the authors studied 54,796 children enrolled in the US-based Collaborative Perinatal Project (CPP) in 1959-1966, 52 of whom developed cancer by their eighth birthday. Those children whose mothers had received pre-1963 poliovirus vaccine during pregnancy (22.5% of the children) had an increased incidence of neural tumors (hazard ratio = 2.6, 95% confidence interval: 1.0, 6.7; 18 cases) and hematologic malignancies (hazard ratio = 2.8, 95% confidence interval: 1.2, 6.4; 22 cases). For 50 CPP children with cancer and 200 CPP control children, the authors tested paired maternal serum samples from pregnancy for SV40 antibodies using a virus-like particle enzyme immunoassay and a plaque neutralization assay. Overall, mothers exhibited infrequent, low-level SV40 antibody reactivity, and only six case mothers seroconverted by either assay. Using the two SV40 assays, maternal SV40 seroconversion during pregnancy was not consistently related to children's case/control status or mothers' receipt of pre-1963 vaccine. The authors conclude that an increased cancer risk in CPP children whose mothers received pre-1963 poliovirus vaccine was unlikely to have been due to SV40 infection transmitted from mothers to their children.
Anema, Johannes R; Steenstra, Ivan A; Bongers, Paulien M; de Vet, Henrica C W; Knol, Dirk L; Loisel, Patrick; van Mechelen, Willem
2007-02-01
Population-based randomized controlled trial. To assess the effectiveness of workplace intervention and graded activity, separately and combined, for multidisciplinary rehabilitation of low back pain (LBP). Effective components for multidisciplinary rehabilitation of LBP are not yet established. Participants sick-listed 2 to 6 weeks due to nonspecific LBP were randomized to workplace intervention (n = 96) or usual care (n = 100). Workplace intervention consisted of workplace assessment, work modifications, and case management involving all stakeholders. Participants still sick-listed at 8 weeks were randomized for graded activity (n = 55) or usual care (n = 57). Graded activity comprised biweekly 1-hour exercise sessions based on operant-conditioning principles. Outcomes were lasting return to work, pain intensity and functional status, assessed at baseline, and at 12, 26, and 52 weeks after the start of sick leave. Time until return to work for workers with workplace intervention was 77 versus 104 days (median) for workers without this intervention (P = 0.02). Workplace intervention was effective on return to work (hazard ratio = 1.7; 95% CI, 1.2-2.3; P = 0.002). Graded activity had a negative effect on return to work (hazard ratio = 0.4; 95% CI, 0.3-0.6; P < 0.001) and functional status. Combined intervention had no effect. Workplace intervention is advised for multidisciplinary rehabilitation of subacute LBP. Graded activity or combined intervention is not advised.
Park, Chang H.; Bonomi, Marcelo; Cesaretti, Jamie
2011-11-01
Purpose: To evaluate whether complex radiotherapy (RT) planning was associated with improved outcomes in a cohort of elderly patients with unresected Stage I-II non-small-cell lung cancer (NSCLC). Methods and Materials: Using the Surveillance, Epidemiology, and End Results registry linked to Medicare claims, we identified 1,998 patients aged >65 years with histologically confirmed, unresected stage I-II NSCLC. Patients were classified into an intermediate or complex RT planning group using Medicare physician codes. To address potential selection bias, we used propensity score modeling. Survival of patients who received intermediate and complex simulation was compared using Cox regression models adjusting for propensity scores and in a stratified and matched analysis according to propensity scores. Results: Overall, 25% of patients received complex RT planning. Complex RT planning was associated with better overall (hazard ratio 0.84; 95% confidence interval, 0.75-0.95) and lung cancer-specific (hazard ratio 0.81; 95% confidence interval, 0.71-0.93) survival after controlling for propensity scores. Similarly, stratified and matched analyses showed better overall and lung cancer-specific survival of patients treated with complex RT planning. Conclusions: The use of complex RT planning is associated with improved survival among elderly patients with unresected Stage I-II NSCLC. These findings should be validated in prospective randomized controlled trials.
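The propensity-score stratification used above can be sketched as follows. The scores and outcomes here are invented for illustration (in the study, scores came from a model of treatment assignment, and the stratified comparison was done with Cox models rather than simple rate differences):

```python
def stratified_rate_difference(scores, treated, died, n_strata=5):
    """Average within-stratum difference in death rates (treated minus
    control), with patients binned into quantiles of their propensity
    score so that groups are compared only against similar patients."""
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    diffs = []
    for s in range(n_strata):
        idx = order[s * len(order) // n_strata:(s + 1) * len(order) // n_strata]
        t = [i for i in idx if treated[i]]
        c = [i for i in idx if not treated[i]]
        if t and c:  # skip strata lacking one of the two groups
            rate_t = sum(died[i] for i in t) / len(t)
            rate_c = sum(died[i] for i in c) / len(c)
            diffs.append(rate_t - rate_c)
    return sum(diffs) / len(diffs)

# Toy data: 4 patients, 2 strata.
print(stratified_rate_difference(
    [0.1, 0.2, 0.8, 0.9],
    [False, True, False, True],
    [True, False, True, True],
    n_strata=2))  # -> -0.5
```

Within each quantile bin, treated and untreated patients have comparable propensities for receiving the treatment, so the averaged difference is less confounded by treatment selection.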
Does private religious activity prolong survival? A six-year follow-up study of 3,851 older adults.
Helm, H M; Hays, J C; Flint, E P; Koenig, H G; Blazer, D G
2000-07-01
Previous studies have linked higher religious attendance and longer survival. In this study, we examine the relationship between survival and private religious activity. A probability sample of elderly community-dwelling adults in North Carolina was assembled in 1986 and followed for 6 years. Level of participation in private religious activities such as prayer, meditation, or Bible study was assessed by self-report at baseline, along with a wide variety of sociodemographic and health variables. The main outcome was time (days) to death or censoring. During a median 6.3-year follow-up period, 1,137 subjects (29.5%) died. Those reporting rarely to never participating in private religious activity had an increased relative hazard of dying over more frequent participants, but this hazard did not remain significant for the sample as a whole after adjustment for demographic and health variables. When the sample was divided into activities of daily living (ADL) impaired and unimpaired groups, the effect did not remain significant for the ADL-impaired group after controlling for demographic variables (relative hazard [RH] 1.11, 95% confidence interval [CI] 0.91-1.35). However, the increased hazard remained significant for the ADL-unimpaired group even after controlling for demographic and health variables (RH 1.63, 95% CI 1.20-2.21), and this effect persisted despite controlling for numerous explanatory variables including health practices, social support, and other religious practices (RH 1.47, 95% CI 1.07-2.03). Older adults who participate in private religious activity before the onset of ADL impairment appear to have a survival advantage over those who do not.
Evaluating MoE and its Uncertainty and Variability for Food Contaminants (EuroTox presentation)
Margin of Exposure (MoE) is a metric for quantifying the relationship between exposure and hazard. Ideally, it is the ratio of the dose associated with hazard to an estimate of exposure. For example, hazard may be characterized by a benchmark dose (BMD), and, for food contami...
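The ratio described above is a one-line calculation; the sketch below uses a benchmark dose and exposure estimate that are invented for illustration:

```python
def margin_of_exposure(bmd, exposure):
    """MoE as described above: a dose associated with hazard (e.g. a
    benchmark dose, BMD) divided by an exposure estimate. Both values
    must be in the same units (e.g. mg per kg body weight per day)."""
    return bmd / exposure

# Hypothetical BMD of 50 mg/kg bw/day against an exposure of 0.25 mg/kg bw/day.
print(margin_of_exposure(50.0, 0.25))  # -> 200.0
```

A larger MoE indicates a wider margin between doses of concern and estimated intake; uncertainty and variability in both numerator and denominator propagate directly into the ratio.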
Ferguson, Kelly K; Meeker, John D; McElrath, Thomas F; Mukherjee, Bhramar; Cantonwine, David E
2017-05-01
Preeclampsia is a prevalent and enigmatic disease, in part characterized by poor remodeling of the spiral arteries. However, preeclampsia does not always clinically present when remodeling has failed to occur. Hypotheses surrounding the "second hit" that is necessary for the clinical presentation of the disease focus on maternal inflammation and oxidative stress. Yet, the studies to date that have investigated these factors have used cross-sectional study designs or small study populations. In the present study, we sought to explore longitudinal trajectories, beginning early in gestation, of a panel of inflammation and oxidative stress markers in women who went on to have preeclamptic or normotensive pregnancies. We examined 441 subjects from the ongoing LIFECODES prospective birth cohort, which included 50 mothers who experienced preeclampsia and 391 mothers with normotensive pregnancies. Participants provided urine and plasma samples at 4 time points during gestation (median, 10, 18, 26, and 35 weeks) that were analyzed for a panel of oxidative stress and inflammation markers. Oxidative stress biomarkers included 8-isoprostane and 8-hydroxydeoxyguanosine. Inflammation biomarkers included C-reactive protein, the cytokines interleukin-1β, -6, and -10, and tumor necrosis factor-α. We created Cox proportional hazard models to calculate hazard ratios based on time of preeclampsia diagnosis in association with biomarker concentrations at each of the 4 study visits. In adjusted models, hazard ratios of preeclampsia were significantly (P<.01) elevated in association with all inflammation biomarkers that were measured at visit 2 (median, 18 weeks; hazard ratios, 1.31-1.83, in association with an interquartile range increase in biomarker). 
Hazard ratios at this time point were the most elevated for C-reactive protein, for interleukin-1β, -6, and -10, and for the oxidative stress biomarker 8-isoprostane (hazard ratio, 1.68; 95% confidence interval, 1.14-2.48) compared to other time points. Hazard ratios for tumor necrosis factor-α were consistently elevated at all 4 of the study visits (hazard ratios, 1.49-1.63; P<.01). In sensitivity analyses, we observed that these associations were attenuated within groups typically at higher risk of experiencing preeclampsia, which include African American mothers, mothers with higher body mass index at the beginning of gestation, and pregnancies that ended preterm. This study provides the most robust data to date on repeated measures of inflammation and oxidative stress in preeclamptic compared with normotensive pregnancies. Within these groups, inflammation and oxidative stress biomarkers show different patterns across gestation, beginning as early as 10 weeks. The start of the second trimester appears to be a particularly important time point for the measurement of these biomarkers. Although biomarkers alone do not appear to be useful in the prediction of preeclampsia, these data are useful in understanding the maternal inflammatory profile in pregnancy before the development of the disease and may be used to further develop an understanding of potentially preventative measures. Published by Elsevier Inc.
Baqui, Abdullah H.; Arifeen, Shams E.; Williams, Emma K.; Ahmed, Saifuddin; Mannan, Ishtiaq; Rahman, Syed M.; Begum, Nazma; Seraji, Habibur R.; Winch, Peter J.; Santosham, Mathuram; Black, Robert E.; Darmstadt, Gary L.
2010-01-01
Background Infections account for about half of neonatal deaths in low-resource settings. Limited evidence supports home-based treatment of newborn infections by community health workers (CHW). Methods In one study arm of a cluster randomized controlled trial, CHWs assessed neonates at home using a 20-sign clinical algorithm and classified sick neonates as having very severe disease or possible very severe disease. Over a two-year period, 10,585 live births were recorded in the study area. CHWs assessed 8,474 (80%) of the neonates within the first week of life and referred neonates with signs of severe disease. If referral failed but parents consented to home treatment, CHWs treated neonates with very severe disease or possible very severe disease with multiple signs, using injectable antibiotics. Results For very severe disease, referral compliance was 34% (162/478 cases), and home treatment acceptance was 43% (204/478 cases). The case fatality rate was 4.4% (9/204) for CHW treatment, 14.2% (23/162) for treatment by qualified medical providers, and 28.5% (32/112) for those who received no treatment or who were treated by other unqualified providers. After controlling for differences in background characteristics and illness signs among treatment groups, newborns treated by CHWs had a hazard ratio of 0.22 (95% confidence interval 0.07–0.71) for death during the neonatal period, and those treated by qualified providers had a hazard ratio of 0.61 (95% confidence interval 0.37–0.99), compared with newborns who received no treatment or were treated by untrained providers. Significantly increased hazard ratios of death were observed for neonates with convulsions (HR 6.54, 95% CI 3.98–10.76), chest in-drawing (HR 2.38, 95% CI 1.29–4.39), temperature < 35.3°C (HR 3.47, 95% CI 1.30–9.24), and unconsciousness (HR 7.92, 95% CI 3.13–20.04). Conclusions Home treatment of very severe disease in neonates by CHWs was effective and acceptable in a low-resource setting in Bangladesh.
PMID:19289979
Aggarwal, Rohit; McBurney, Christine; Schneider, Frank; Yousem, Samuel A; Gibson, Kevin F; Lindell, Kathleen; Fuhrman, Carl R; Oddis, Chester V
2017-03-01
To compare the survival outcomes between myositis-associated usual interstitial pneumonia (MA-UIP) and idiopathic pulmonary fibrosis (IPF-UIP). Adult MA-UIP and IPF-UIP patients were identified using CTD and IPF registries. The MA-UIP cohort included myositis or anti-synthetase syndrome patients with interstitial lung disease who manifested UIP on high-resolution CT chest and/or a lung biopsy revealing UIP histology. IPF subjects met American Thoracic Society criteria and similarly had UIP histopathology. Kaplan-Meier survival curves compared cumulative and pulmonary event-free survival (event = transplant or death) between (i) all MA-UIP and IPF-UIP subjects, and (ii) MA-UIP subjects with biopsy-proven UIP (n = 25) vs IPF-UIP subjects matched for age, gender and baseline forced vital capacity (±10%). Cox proportional hazards models compared survival, controlling for covariates. Eighty-one IPF-UIP and 43 MA-UIP subjects were identified. The median cumulative and event-free survival times in IPF vs MA-UIP were 5.25/1.8 years vs 16.2/10.8 years, respectively. Cumulative and event-free survival was significantly worse in IPF-UIP vs MA-UIP [hazard ratios for IPF-UIP were 2.9 (95% CI: 1.5, 5.6) and 5.0 (95% CI: 2.8, 8.7), respectively; P < 0.001]. IPF-UIP event-free (but not cumulative) survival remained significantly worse than MA-UIP, with a hazard ratio of 6.4 (95% CI: 3.0, 13.8), after controlling for age at interstitial lung disease diagnosis, gender, ethnicity and baseline forced vital capacity %. Respiratory failure was the most common cause of death in both groups. A sub-analysis of 25 biopsy-proven MA-UIP subjects showed similar results. MA-UIP patients demonstrated a significant survival advantage over a matched IPF cohort, suggesting that despite similar histological and radiographic findings at presentation, the prognosis of MA-UIP is superior to that of IPF-UIP. © The Author 2016. Published by Oxford University Press on behalf of the British Society for Rheumatology.
Lee, Ai-Lin; Chen, Bor-Chyuan; Mou, Chih-Hsin; Sun, Mao-Feng; Yen, Hung-Rong
2016-01-01
Abstract With an increasing use of traditional Chinese medicine (TCM) in type 2 diabetes mellitus (T2DM), evidence of long-term benefit with adjunctive TCM treatment is limited. This study investigated whether concurrent TCM treatment reduces the risk of vascular complications in T2DM patients by using a large population from the National Health Insurance Research Database (NHIRD). We identified 33,457 adult patients with newly diagnosed T2DM using anti-diabetic agents from a random sample of one million beneficiaries in the NHIRD between January 1, 2000 and December 31, 2011. We recruited 1049 TCM users (received TCM over 30 days with a diagnosis of T2DM) and randomly selected 4092 controls as the non-TCM cohort at a ratio of 1:4, frequency-matched by age, sex, hypertension, hyperlipidemia, and index year. We investigated the prescription pattern of TCM and conducted a Cox proportional hazards regression to calculate the hazard ratios (HRs) of stroke, chronic kidney disease (CKD), and diabetic foot between the 2 cohorts. In the TCM cohort, the prescription pattern of TCM differed between insulin and noninsulin patients. The most common herbs were Dan-Shen (Radix Salviae Miltiorrhizae) in the noninsulin group and Da-Huang (Radix et Rhizoma Rhei) in the insulin group. The most common formulae were Liu-Wei-Di-Huang-Wan in the noninsulin group and Yu-Quan-Wan in the insulin group. Although there was no significant reduction in the hazard ratios of CKD or diabetic foot, the incidence rate of stroke was 7.19 per 1000 person-years in the TCM cohort versus 10.66 per 1000 person-years in the control cohort. After adjustment for age, sex, hypertension, hyperlipidemia, and antidiabetes agent use (including sulfonylureas, α-glucosidase, metformin, meglitinide, thiazolidinediones, and insulin), the TCM cohort was found to have a 33% decreased risk of stroke (hazard ratio 0.67; 95% CI = 0.46–0.97; P < 0.05).
This population-based retrospective study showed that the complementary TCM therapy might associate with the decreased risk of stroke in T2DM, suggesting TCM as an adjunctive therapy for T2DM to prevent subsequent stroke. PMID:26817897
Brown, Jeremiah R; Pearlman, Daniel M; Marshall, Emily J; Alam, Shama S; MacKenzie, Todd A; Recio-Mayoral, Alejandro; Gomes, Vitor O; Kim, Bokyung; Jensen, Lisette O; Mueller, Christian; Maioli, Mauro; Solomon, Richard J
2016-11-15
We sought to examine the relation between sodium bicarbonate prophylaxis for contrast-associated nephropathy (CAN) and mortality. We conducted an individual patient data meta-analysis from multiple randomized controlled trials. We obtained individual patient data sets for 7 of 10 eligible trials (2,292 of 2,764 participants). For the remaining 3 trials, time-to-event data were imputed based on follow-up periods described in their original reports. We included all trials that compared periprocedural intravenous sodium bicarbonate to periprocedural intravenous sodium chloride in patients undergoing coronary angiography or other intra-arterial interventions. Included trials were determined by consensus according to predefined eligibility criteria. The primary outcome was all-cause mortality hazard, defined as time from randomization to death. In 10 trials with a total of 2,764 participants, sodium bicarbonate was associated with lower mortality hazard than sodium chloride at 1 year (hazard ratio 0.61, 95% confidence interval [CI] 0.41 to 0.89, p = 0.011). Although periprocedural sodium bicarbonate was associated with a reduction in the incidence of CAN (relative risk 0.75, 95% CI 0.62 to 0.91, p = 0.003), there exists a statistically significant interaction between the effect on mortality and the occurrence of CAN (hazard ratio 5.65, 95% CI 3.58 to 8.92, p <0.001) for up to 1-year mortality. Periprocedural intravenous sodium bicarbonate seems to be associated with a reduction in long-term mortality in patients undergoing coronary angiography or other intra-arterial interventions. Copyright © 2016 Elsevier Inc. All rights reserved.
van Hazel, Guy A; Heinemann, Volker; Sharma, Navesh K; Findlay, Michael P N; Ricke, Jens; Peeters, Marc; Perez, David; Robinson, Bridget A; Strickland, Andrew H; Ferguson, Tom; Rodríguez, Javier; Kröning, Hendrik; Wolf, Ido; Ganju, Vinod; Walpole, Euan; Boucher, Eveline; Tichler, Thomas; Shacham-Shmueli, Einat; Powell, Alex; Eliadis, Paul; Isaacs, Richard; Price, David; Moeslein, Fred; Taieb, Julien; Bower, Geoff; Gebski, Val; Van Buskirk, Mark; Cade, David N; Thurston, Kenneth; Gibbs, Peter
2016-05-20
SIRFLOX was a randomized, multicenter trial designed to assess the efficacy and safety of adding selective internal radiation therapy (SIRT) using yttrium-90 resin microspheres to standard fluorouracil, leucovorin, and oxaliplatin (FOLFOX)-based chemotherapy in patients with previously untreated metastatic colorectal cancer. Chemotherapy-naïve patients with liver metastases plus or minus limited extrahepatic metastases were randomly assigned to receive either modified FOLFOX (mFOLFOX6; control) or mFOLFOX6 plus SIRT (SIRT) plus or minus bevacizumab. The primary end point was progression-free survival (PFS) at any site as assessed by independent centralized radiology review blinded to study arm. Between October 2006 and April 2013, 530 patients were randomly assigned to treatment (control, 263; SIRT, 267). Median PFS at any site was 10.2 v 10.7 months in control versus SIRT (hazard ratio, 0.93; 95% CI, 0.77 to 1.12; P = .43). Median PFS in the liver by competing risk analysis was 12.6 v 20.5 months in control versus SIRT (hazard ratio, 0.69; 95% CI, 0.55 to 0.90; P = .002). Objective response rates (ORRs) at any site were similar (68.1% v 76.4% in control v SIRT; P = .113). ORR in the liver was improved with the addition of SIRT (68.8% v 78.7% in control v SIRT; P = .042). Grade ≥ 3 adverse events, including recognized SIRT-related effects, were reported in 73.4% and 85.4% of patients in control versus SIRT. The addition of SIRT to FOLFOX-based first-line chemotherapy in patients with liver-dominant or liver-only metastatic colorectal cancer did not improve PFS at any site but significantly delayed disease progression in the liver. The safety profile was as expected and was consistent with previous studies. © 2016 by American Society of Clinical Oncology.
Stallings, Devita T.; Garvin, Jane T.; Xu, Hongyan; Racette, Susan B.
2017-01-01
Objective To determine which anthropometric measures are the strongest discriminators of incident type 2 diabetes (T2DM) among White and Black males and females in a large U.S. cohort. Methods We used Atherosclerosis Risk in Communities study data from 12,121 participants aged 45–64 years without diabetes at baseline who were followed for over 11 years. Anthropometric measures included a body shape index (ABSI), body adiposity index (BAI), body mass index (BMI), waist circumference (WC), waist to hip ratio (WHR), waist to height ratio (WHtR), and waist to hip to height ratio (WHHR). All anthropometric measures were repeated at each visit and converted to Z-scores. Hazard ratios and 95% confidence intervals adjusted for age were calculated using repeated measures Cox proportional hazard regression analysis. The Akaike Information Criterion was used to select best-fit models. The magnitude of the hazard ratio effect sizes and the Harrell’s C-indexes were used to rank the highest associations and discriminators, respectively. Results There were 1,359 incident diabetes cases. Higher values of all anthropometric measures increased the risk for development of T2DM (p < 0.0001) except ABSI, which was not significant in White and Black males. Statistically significant hazard ratios ranged from 1.26–1.63 for males and 1.15–1.88 for females. In general, the largest hazard ratios were those that corresponded to the highest Harrell’s C-index and lowest Akaike Information Criterion values. Among White and Black males and females, BMI, WC, WHR, and WHtR were comparable in discriminating cases from non-cases of T2DM. ABSI, BAI, and WHHR were inferior discriminators of incident T2DM across all race-gender groups. Conclusions BMI, the most commonly used anthropometric measure, and three anthropometric measures that included waist circumference (i.e., WC, WHR, WHtR) were the best anthropometric discriminators of incident T2DM across all race-gender groups in the ARIC cohort. PMID:28141847
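Harrell's C-index, used above to rank discriminators, is the fraction of comparable subject pairs in which the higher predicted risk belongs to the subject who develops the outcome earlier. A minimal sketch, ignoring the censoring subtleties a full implementation must handle (tied risks get half credit; all names are illustrative):

```python
def harrell_c(times, events, risks):
    """Simple Harrell's C-index.

    A pair (i, j) is comparable when subject i has an observed event
    before subject j's time; it is concordant when i's predicted risk
    is higher. Tied predicted risks count as 0.5.
    """
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if events[i] and times[i] < times[j]:
                comparable += 1
                if risks[i] > risks[j]:
                    concordant += 1.0
                elif risks[i] == risks[j]:
                    concordant += 0.5
    return concordant / comparable
```

A perfectly ordered risk score gives C = 1.0, a perfectly inverted one gives 0.0, and a useless score hovers near 0.5, which is why C-indexes around 0.55-0.58 (as in some abstracts below) indicate weak discrimination.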
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-30
... Federal Register notice published on May 18, 2011 for Lead Based Paint Hazard Control and Lead Hazard... Based Paint Hazard Control Grant Program A total of $43,206,000 was awarded to 22 grantees for the Lead Based Paint Hazard Control Grant Program and an additional $1,999,971 was awarded to 20 out of the 29...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-30
... Federal Register notice published on September 9, 2010 for Lead Based Paint Hazard Control and Lead Hazard... follows: 1. Lead Based Paint Hazard Control Grant Program A total of $69,700,000 was awarded to 29 grantees for the Lead Based Paint Hazard Control Grant Program and an additional $2,388,637 was awarded to...
Sala-Vila, Aleix; Guasch-Ferré, Marta; Hu, Frank B; Sánchez-Tainta, Ana; Bulló, Mònica; Serra-Mir, Mercè; López-Sabater, Carmen; Sorlí, Jose V; Arós, Fernando; Fiol, Miquel; Muñoz, Miguel A; Serra-Majem, Luis; Martínez, J Alfredo; Corella, Dolores; Fitó, Montserrat; Salas-Salvadó, Jordi; Martínez-González, Miguel A; Estruch, Ramón; Ros, Emilio; B
2016-01-26
Epidemiological evidence suggests a cardioprotective role of α-linolenic acid (ALA), a plant-derived ω-3 fatty acid. It is unclear whether ALA is beneficial in a background of high marine ω-3 fatty acids (long-chain n-3 polyunsaturated fatty acids) intake. In persons at high cardiovascular risk from Spain, a country in which fish consumption is customarily high, we investigated whether meeting the International Society for the Study of Fatty Acids and Lipids recommendation for dietary ALA (0.7% of total energy) at baseline was related to all-cause and cardiovascular disease mortality. We also examined the effect of meeting the society's recommendation for long-chain n-3 polyunsaturated fatty acids (≥500 mg/day). We longitudinally evaluated 7202 participants in the PREvención con DIeta MEDiterránea (PREDIMED) trial. Multivariable-adjusted Cox regression models were fitted to estimate hazard ratios. ALA intake correlated to walnut consumption (r=0.94). During a 5.9-y follow-up, 431 deaths occurred (104 cardiovascular disease, 55 coronary heart disease, 32 sudden cardiac death, 25 stroke). The hazard ratios for meeting ALA recommendation (n=1615, 22.4%) were 0.72 (95% CI 0.56-0.92) for all-cause mortality and 0.95 (95% CI 0.58-1.57) for fatal cardiovascular disease. The hazard ratios for meeting the recommendation for long-chain n-3 polyunsaturated fatty acids (n=5452, 75.7%) were 0.84 (95% CI 0.67-1.05) for all-cause mortality, 0.61 (95% CI 0.39-0.96) for fatal cardiovascular disease, 0.54 (95% CI 0.29-0.99) for fatal coronary heart disease, and 0.49 (95% CI 0.22-1.01) for sudden cardiac death. The highest reduction in all-cause mortality occurred in participants meeting both recommendations (hazard ratio 0.63 [95% CI 0.45-0.87]). 
In participants without prior cardiovascular disease and high fish consumption, dietary ALA, supplied mainly by walnuts and olive oil, relates inversely to all-cause mortality, whereas protection from cardiac mortality is limited to fish-derived long-chain n-3 polyunsaturated fatty acids. URL: http://www.Controlled-trials.com/. Unique identifier: ISRCTN35739639. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
Long-term mortality risk and life expectancy following recurrent hypertensive disease of pregnancy.
Theilen, Lauren H; Meeks, Huong; Fraser, Alison; Esplin, M Sean; Smith, Ken R; Varner, Michael W
2018-04-07
Women with a history of hypertensive disease of pregnancy have increased risks for early mortality from multiple causes. The effect of recurrent hypertensive disease of pregnancy on mortality risk and life expectancy is unknown. We sought to determine whether recurrent hypertensive disease of pregnancy is associated with increased mortality risks. In this retrospective cohort study, we used birth certificate data to determine the number of pregnancies affected by hypertensive disease of pregnancy for each woman delivering in Utah from 1939 through 2012. We assigned women to 1 of 3 groups based on number of affected pregnancies: 0, 1, or ≥2. Exposed women had ≥1 affected singleton pregnancy and lived in Utah for ≥1 year postpartum. Exposed women were matched 1:2 to unexposed women by age, year of childbirth, and parity. Underlying cause of death was determined from death certificates. Mortality risks by underlying cause of death were compared between exposed and unexposed women as a function of number of affected pregnancies. Cox regressions controlled for infant sex, gestational age, parental education, ethnicity, and marital status. We identified 57,384 women with ≥1 affected pregnancy (49,598 women with 1 affected pregnancy and 7786 women with ≥2 affected pregnancies). These women were matched to 114,768 unexposed women. As of 2016, 11,894 women were deceased: 4722 (8.2%) exposed and 7172 (6.3%) unexposed. Women with ≥2 affected pregnancies had increased mortality from all causes (adjusted hazard ratio, 2.04; 95% confidence interval, 1.76-2.36), diabetes (adjusted hazard ratio, 4.33; 95% confidence interval, 2.21-8.47), ischemic heart disease (adjusted hazard ratio, 3.30; 95% confidence interval, 2.02-5.40), and stroke (adjusted hazard ratio, 5.10; 95% confidence interval, 2.62-9.92). 
For women whose index pregnancy delivered from 1939 through 1959 (n = 10,488), those with ≥2 affected pregnancies had shorter additional life expectancies than mothers who had only 1 or 0 hypertensive pregnancies (48.92 vs 51.91 vs 55.48 years, respectively). Hypertensive diseases of pregnancy are associated with excess risks for early all-cause mortality and some cause-specific mortality, and these risks increase further with recurrent disease. Copyright © 2018 Elsevier Inc. All rights reserved.
Bohula, Erin A; Aylward, Philip E; Bonaca, Marc P; Corbalan, Ramon L; Kiss, Robert G; Murphy, Sabina A; Scirica, Benjamin M; White, Harvey; Braunwald, Eugene; Morrow, David A
2015-11-17
Vorapaxar antagonizes protease-activated receptor 1, the primary receptor for thrombin on human platelets, and reduces recurrent thrombotic events in stable patients with a previous myocardial infarction (MI). We wished to determine whether the efficacy and safety of antiplatelet therapy with vorapaxar was modified by concurrent thienopyridine use. The Thrombin Receptor Antagonist in Secondary Prevention of Atherothrombotic Ischemic Events-Thrombolysis in Myocardial Infarction 50 (TRA 2°P-TIMI 50) was a randomized, double-blind, placebo-controlled trial of vorapaxar in 26,449 patients with previous atherothrombosis. This prespecified analysis included 16,897 patients who qualified with a MI in the preceding 2 weeks to 12 months and was restricted to patients without a history of stroke or transient ischemic attack given its contraindication in that population. Randomization was stratified on the basis of planned thienopyridine use. Thienopyridine was planned at randomization in 12,410 (73%). Vorapaxar significantly reduced the composite of cardiovascular death, MI, and stroke in comparison with placebo regardless of planned thienopyridine therapy (planned thienopyridine, hazard ratio, 0.80, 0.70-0.91, P<0.001; no planned thienopyridine, hazard ratio, 0.75; 0.60-0.94, P=0.011; P-interaction=0.67). Findings were similar when patients were stratified by actual thienopyridine use at baseline (P-interaction=0.82) and through 18 months (P-interaction=0.44). Global Use of Strategies to Open Occluded Coronary Arteries (GUSTO) moderate or severe bleeding risk was increased with vorapaxar and was not significantly altered by planned thienopyridine (planned, hazard ratio, 1.50; 1.18-1.89, P<0.001; no planned, hazard ratio, 1.90, 1.17-3.07, P=0.009; P-interaction=0.37) or actual thienopyridine use (P-interaction=0.24). 
Vorapaxar reduced cardiovascular death, MI, or stroke in stable patients with a history of previous MI, whether treated concomitantly with a thienopyridine or not. The relative risk of moderate or severe bleeding was similarly increased irrespective of thienopyridine use. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00526474. © 2015 American Heart Association, Inc.
A 3-Month Jump-Landing Training Program: A Feasibility Study Using the RE-AIM Framework
Aerts, Inne; Cumps, Elke; Verhagen, Evert; Mathieu, Niels; Van Schuerbeeck, Sander; Meeusen, Romain
2013-01-01
Context: Evaluating the translatability and feasibility of an intervention program has become as important as determining the effectiveness of the intervention. Objective: To evaluate the applicability of a 3-month jump-landing training program in basketball players, using the RE-AIM (reach, effectiveness, adoption, implementation, and maintenance) framework. Design: Randomized controlled trial. Setting: National and regional basketball teams. Patients or Other Participants: Twenty-four teams of the second highest national division and regional basketball divisions in Flanders, Belgium, were randomly assigned (1:1) to a control group and intervention group. A total of 243 athletes (control group = 129, intervention group = 114), ages 15 to 41 years, volunteered. Intervention(s): All exercises in the intervention program followed a progressive development, emphasizing lower extremity alignment during jump-landing activities. Main Outcome Measure(s): The results of the process evaluation of the intervention program were based on the 5 dimensions of the RE-AIM framework. The injury incidence density, hazard ratios, and 95% confidence intervals were determined. Results: The participation rate of the total sample was 100% (reach). The hazard ratio was different between the intervention group and the control group (0.40 [95% confidence interval = 0.16, 0.99]; effectiveness). Of the 12 teams in the intervention group, 8 teams (66.7%) agreed to participate in the study (adoption). Eight of the participating coaches (66.7%) felt positively about the intervention program and stated that they had implemented the training sessions of the program as intended (implementation). All coaches except 1 (87.5%) intended to continue the intervention program the next season (maintenance). Conclusions: Compliance of the coaches in this coach-supervised jump-landing training program was high. In addition, the program was effective in preventing lower extremity injuries. PMID:23675788
Martínez-González, Miguel Á; Toledo, Estefanía; Arós, Fernando; Fiol, Miquel; Corella, Dolores; Salas-Salvadó, Jordi; Ros, Emilio; Covas, Maria I; Fernández-Crehuet, Joaquín; Lapetra, José; Muñoz, Miguel A; Fitó, Monserrat; Serra-Majem, Luis; Pintó, Xavier; Lamuela-Raventós, Rosa M; Sorlí, Jose V; Babio, Nancy; Buil-Cosiales, Pilar; Ruiz-Gutierrez, Valentina; Estruch, Ramón; Alonso, Alvaro
2014-07-01
The PREDIMED (Prevención con Dieta Mediterránea) randomized primary prevention trial showed that a Mediterranean diet enriched with either extravirgin olive oil or mixed nuts reduces the incidence of stroke, myocardial infarction, and cardiovascular mortality. We assessed the effect of these diets on the incidence of atrial fibrillation in the PREDIMED trial. Participants were randomly assigned to 1 of 3 diets: Mediterranean diet supplemented with extravirgin olive oil, Mediterranean diet supplemented with mixed nuts, or advice to follow a low-fat diet (control group). Incident atrial fibrillation was adjudicated during follow-up by an events committee blinded to dietary group allocation. Among 6705 participants without prevalent atrial fibrillation at randomization, we observed 72 new cases of atrial fibrillation in the Mediterranean diet with extravirgin olive oil group, 82 in the Mediterranean diet with mixed nuts group, and 92 in the control group after median follow-up of 4.7 years. The Mediterranean diet with extravirgin olive oil significantly reduced the risk of atrial fibrillation (hazard ratio, 0.62; 95% confidence interval, 0.45-0.85 compared with the control group). No effect was found for the Mediterranean diet with nuts (hazard ratio, 0.89; 95% confidence interval, 0.65-1.20). In the absence of proven interventions for the primary prevention of atrial fibrillation, this post hoc analysis of the PREDIMED trial suggests that extravirgin olive oil in the context of a Mediterranean dietary pattern may reduce the risk of atrial fibrillation. http://www.controlled-trials.com. Unique identifier: ISRCTN35739639. © 2014 American Heart Association, Inc.
Westerdahl, Christina; Zöller, Bengt; Arslan, Eren; Erdine, Serap; Nilsson, Peter M
2014-12-01
Screening of hypertension has been advocated for early detection and treatment. Severe hypertension (grade 3 hypertension) is a strong predictor for cardiovascular disease. This study aimed to evaluate not only the risk factors for developing severe hypertension, but also the prospective morbidity and mortality risk associated with severe hypertension in a population-based screening and intervention programme. In all, 18,200 individuals from a population-based cohort underwent a baseline examination in 1972-1992 and were re-examined in 2002-2006 in Malmö, Sweden. In total, 300 (1.6%) patients with severe hypertension were identified at re-examination, and predictive risk factors from baseline were calculated. Total and cause-specific morbidity and mortality were followed in national registers in all severe hypertension patients, as well as in age and sex-matched normotensive controls. Cox analyses for hazard ratios were used. Men developing severe hypertension differed from matched controls in baseline variables associated with the metabolic syndrome, as well as paternal history of hypertension (P < 0.001). Women with later severe hypertension were characterized by elevated BMI and a positive maternal history for hypertension at baseline. The risk of mortality, coronary events, stroke and diabetes during follow-up was higher among severe hypertension patients compared to controls. For coronary events, the risk remained elevated adjusted for other risk factors [hazard ratio 2.31, 95% confidence interval (CI) 1.22-4.40, P = 0.011]. Family history and variables associated with metabolic syndrome are predictors for severe hypertension after a long-term follow-up. Severe hypertension is associated with increased mortality, cardiovascular morbidity and incident diabetes in spite of treatment. This calls for improved risk factor control in patients with severe hypertension.
Lee, Wang-Tso; Huang, Hui-Ling; Wong, Lee Chin; Weng, Wen-Chin; Vasylenko, Tamara; Jong, Yuh-Jyh; Lin, Wei-Sheng; Ho, Shinn-Ying
2017-03-01
Tourette syndrome (TS) is associated with a variety of neuropsychiatric comorbidities. However, the relationship between TS and sleep disorders in children has been less investigated. This nationwide population-based case-control study aimed to determine the correlation of TS and sleep disorders in children. Patients aged less than 18 years with newly diagnosed TS from 2001 to 2007 were identified (n = 1124) using data from Taiwan's National Health Insurance Research Database and were compared with a comparison cohort (n = 3372). The adjusted hazard ratio (aHR) for developing sleep disorders was calculated by a multivariate Cox proportional hazards model. TS was more prevalent in boys, with a male to female ratio of 3.16:1. The TS group also had a significantly higher urbanization level of residence than controls (p < .001). The overall incidence rate of sleep disorders was 7.24‰ in children with TS, compared to 3.53‰ in controls. The TS group was associated with a significantly higher rate of sleep disorders, with a crude HR of 2.05 (95% confidence interval [CI] = 1.43-2.95, p < .001). Among the comorbidities of TS, anxiety disorder was associated with the highest risk for sleep disorders (crude HR = 3.26, 95% CI = 1.52-7.00, p < .001). The aHR for the TS cohort to develop sleep disorders was 1.72 (95% CI = 1.16-2.53, p = .007). The increased risk of sleep disorders in children with TS cannot be fully attributed to its comorbidities, and TS is an independent risk factor for sleep disorders in children. © Sleep Research Society 2017. Published by Oxford University Press on behalf of the Sleep Research Society. All rights reserved. For permissions, please e-mail journals.permissions@oup.com.
Janghorbani, Mohsen; Amini, Masoud
2016-09-01
In this study, we evaluated the association between triglyceride to high-density lipoprotein cholesterol (TG/HDL) ratio and total cholesterol (TC) to HDL (TC/HDL) ratio and the risks of type 2 diabetes (T2D) in an Iranian high-risk population. We analysed 7-year follow-up data (n = 1771) in non-diabetic first-degree relatives, aged 30-70 years, of consecutive patients with T2D. The primary outcome was the diagnosis of T2D based on repeated oral glucose tolerance tests. We used Cox proportional hazard models to estimate hazard ratios for incident T2D across tertiles of TG/HDL and TC/HDL ratios and plotted a receiver operating characteristic (ROC) curve to assess discrimination. The highest tertile of TG/HDL and TC/HDL ratios compared with the lowest tertile was not associated with T2D in age- and gender-adjusted models (HR 0.99, 95% CI: 0.88, 1.11 for TG/HDL ratio and 1.10, 95% CI: 0.97, 1.23 for TC/HDL ratio). Further adjustment for waist circumference or body mass index, fasting plasma glucose, and low-density lipoprotein cholesterol did not appreciably alter the hazard ratio compared with the age- and gender-adjusted model. The area under the ROC curve for TG/HDL ratio was 57.7% (95% CI: 54.0, 61.5) and for TC/HDL ratio was 55.1% (95% CI: 51.2, 59.0). TG/HDL and TC/HDL ratios were not robust predictors of T2D in high-risk individuals in Iran. Copyright © 2015 John Wiley & Sons, Ltd.
Gao, Jian; Zhang, Jie; Li, Hong; Li, Lei; Xu, Linghong; Zhang, Yujie; Wang, Zhanshan; Wang, Xuezhong; Zhang, Weiqi; Chen, Yizhen; Cheng, Xi; Zhang, Hao; Peng, Liang; Chai, Fahe; Wei, Yongjie
2018-07-01
Volatile organic compounds (VOCs) can react with atmospheric radicals while being transported after being emitted, resulting in substantial losses. Using only observed VOC mixing ratios to assess VOC pollution is therefore problematic. Observed mixing ratios and initial mixing ratios (which account for chemical loss) were analyzed using data for 90 VOCs in the atmosphere of a typical urban area in Beijing in winter 2013 to gain a more accurate view of VOC pollution. The VOC sources, ambient VOC mixing ratios and compositions, variability and influencing factors, contributions to near-ground ozone, and health risks posed were assessed. Source apportionment should be conducted using initial mixing ratios, but health risks should be assessed using observed mixing ratios. The daytime daily mean initial mixing ratio (72.62 ppbv) was 7.72 ppbv higher than the daytime daily mean observed mixing ratio (64.90 ppbv). Alkenes contributed >70% of the consumed VOCs. The nighttime daily mean observed mixing ratio was 71.66 ppbv, 6.76 ppbv higher than the daytime mixing ratio. The observed mixing ratio for 66 VOCs was 40.31% higher in Beijing than in New York. The ozone formation potential (OFP) calculated from observed daytime mixing ratios (204.14 ppbv) underestimated the OFP calculated from initial daytime mixing ratios (266.54 ppbv) by 23.41%; improving emission control of ethylene and propene would be an effective way of controlling O3. Health risk assessments performed for 28 hazardous VOCs show that benzene, chloroform, 1,2-dichloroethane, and acetaldehyde pose carcinogenic risks and acrolein poses non-carcinogenic risks. Source apportionment results indicated that vehicle exhausts, solvent usage, and industrial processes were the main VOC sources during the study. Copyright © 2018. Published by Elsevier B.V.
Bonofiglio, Federico; Beyersmann, Jan; Schumacher, Martin; Koller, Michael; Schwarzer, Guido
2016-09-01
Meta-analysis of a survival endpoint is typically based on the pooling of hazard ratios (HRs). If competing risks occur, the HRs may lose translation into changes of survival probability. The cumulative incidence functions (CIFs), the expected proportion of cause-specific events over time, re-connect the cause-specific hazards (CSHs) to the probability of each event type. We use CIF ratios to measure treatment effect on each event type. To retrieve information on aggregated, typically poorly reported, competing risks data, we assume constant CSHs. Next, we develop methods to pool CIF ratios across studies. The procedure computes pooled HRs alongside and checks the influence of follow-up time on the analysis. We apply the method to a medical example, showing that follow-up duration is relevant both for pooled cause-specific HRs and CIF ratios. Moreover, if all-cause hazard and follow-up time are large enough, CIF ratios may reveal additional information about the effect of treatment on the cumulative probability of each event type. Finally, to improve the usefulness of such analysis, better reporting of competing risks data is needed. Copyright © 2015 John Wiley & Sons, Ltd.
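Under the constant cause-specific hazards assumption described above, each CIF has a closed form, CIF_k(t) = (h_k/H)(1 - e^(-Ht)) with H the all-cause hazard, so CIF ratios can be computed directly. A minimal sketch (illustrative names, not the authors' pooling code):

```python
import math

def cif(h_event, h_all, t):
    """Cumulative incidence of one event type at time t under constant
    cause-specific hazards; h_all is the all-cause hazard (sum of CSHs)."""
    return (h_event / h_all) * (1.0 - math.exp(-h_all * t))

def cif_ratio(h_event_trt, h_all_trt, h_event_ctl, h_all_ctl, t):
    """Treatment-versus-control ratio of cumulative incidence functions."""
    return (cif(h_event_trt, h_all_trt, t) /
            cif(h_event_ctl, h_all_ctl, t))
```

Near t = 0 the CIF ratio approximates the cause-specific HR, while for large all-cause hazard and long follow-up it tends to the ratio of the h_event/h_all fractions, which is one way to see why the abstract finds follow-up duration relevant to the analysis.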
Preconception B-vitamin and homocysteine status, conception, and early pregnancy loss.
Ronnenberg, Alayne G; Venners, Scott A; Xu, Xiping; Chen, Changzhong; Wang, Lihua; Guang, Wenwei; Huang, Aiqun; Wang, Xiaobin
2007-08-01
Maternal vitamin status contributes to clinical spontaneous abortion, but the role of B-vitamin and homocysteine status in subclinical early pregnancy loss is unknown. Three-hundred sixty-four textile workers from Anqing, China, who conceived at least once during prospective observation (1996-1998), provided daily urine specimens for up to 1 year, and urinary human chorionic gonadotropin was assayed to detect conception and early pregnancy loss. Homocysteine, folate, and vitamins B6 and B12 were measured in preconception plasma. Relative to women in the lowest quartile of vitamin B6, those in the third and fourth quartiles had higher adjusted proportional hazard ratios of conception (hazard ratio (HR)=2.2, 95% confidence interval (CI): 1.3, 3.4; HR=1.6, 95% CI: 1.1, 2.3, respectively), and the adjusted odds ratio for early pregnancy loss in conceptive cycles was lower in the fourth quartile (odds ratio=0.5, 95% CI: 0.3, 1.0). Women with sufficient vitamin B6 had a higher adjusted hazard ratio of conception (HR=1.4, 95% CI: 1.1, 1.9) and a lower adjusted odds ratio of early pregnancy loss in conceptive cycles (odds ratio=0.7, 95% CI: 0.4, 1.1) than did women with vitamin B6 deficiency. Poor vitamin B6 status appears to decrease the probability of conception and to contribute to the risk of early pregnancy loss in this population.
Beenstock, Jane; Adams, Jean; White, Martin
2011-08-01
Heavy alcohol consumption is associated with significant morbidity and mortality. Levels of alcohol consumption among students and young people are particularly high. Time perspective describes the varying value individuals place on outcomes in the present and future. In general, it has been found that individuals prefer to receive a gain today rather than in the future. There is evidence that time perspective is associated with addictive health behaviours, including alcoholism and cigarette smoking, but less evidence of its association with non-addictive, but hazardous, levels of alcohol consumption. The objective was to determine if there is an association between time perspective and hazardous alcohol consumption. A cross-sectional survey using a self-completion questionnaire was administered to willing undergraduate students attending a convenience sample of lectures in two university faculties. Hazardous alcohol consumption was defined as a score of ≥8 on the Alcohol Use Disorders Identification Test (AUDIT) and time perspective was measured using the Consideration of Future Consequences Scale (CFCS). Participants were 322 undergraduate university students in two faculties at a university in Northern England, UK. Hazardous alcohol consumption was reported by 264 (82%) respondents. After controlling for potential confounding by socio-demographic variables, greater consideration of future consequences was associated with lower odds of reporting hazardous drinking [odds ratio = 0.28; 95% confidence interval 0.15-0.54]. Interventions aimed at increasing future orientated time perspective may be effective in decreasing hazardous alcohol consumption in students.
Ma, Yunsheng; Hébert, James R.; Balasubramanian, Raji; Wedick, Nicole M.; Howard, Barbara V.; Rosal, Milagros C.; Liu, Simin; Bird, Chloe E.; Olendzki, Barbara C.; Ockene, Judith K.; Wactawski-Wende, Jean; Phillips, Lawrence S.; LaMonte, Michael J.; Schneider, Kristin L.; Garcia, Lorena; Ockene, Ira S.; Merriam, Philip A.; Sepavich, Deidre M.; Mackey, Rachel H.; Johnson, Karen C.; Manson, JoAnn E.
2013-01-01
Using data from the Women's Health Initiative (1993–2009; n = 158,833 participants, of whom 84.1% were white, 9.2% were black, 4.1% were Hispanic, and 2.6% were Asian), we compared all-cause, cardiovascular, and cancer mortality rates in white, black, Hispanic, and Asian postmenopausal women with and without diabetes. Cox proportional hazard models were used for the comparison from which hazard ratios and 95% confidence intervals were computed. Within each racial/ethnic subgroup, women with diabetes had an approximately 2–3 times higher risk of all-cause, cardiovascular, and cancer mortality than did those without diabetes. However, the hazard ratios for mortality outcomes were not significantly different between racial/ethnic subgroups. Population attributable risk percentages (PARPs) take into account both the prevalence of diabetes and hazard ratios. For all-cause mortality, whites had the lowest PARP (11.1, 95% confidence interval (CI): 10.1, 12.1), followed by Asians (12.9, 95% CI: 4.7, 20.9), blacks (19.4, 95% CI: 15.0, 23.7), and Hispanics (23.2, 95% CI: 14.8, 31.2). To our knowledge, the present study is the first to show that hazard ratios for mortality outcomes were not significantly different between racial/ethnic subgroups when stratified by diabetes status. Because of the “amplifying” effect of diabetes prevalence, efforts to reduce racial/ethnic disparities in the rate of death from diabetes should focus on prevention of diabetes. PMID:24045960
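The abstract's point that PARPs "take into account both the prevalence of diabetes and hazard ratios" can be made concrete with Levin's formula for attributable risk. A minimal sketch (the function name is illustrative, and any example inputs are hypothetical rather than the study's data):

```python
def parp(prevalence, hazard_ratio):
    """Population attributable risk percent via Levin's formula:
    100 * p*(HR - 1) / (1 + p*(HR - 1)),
    where p is the exposure prevalence in the population."""
    excess = prevalence * (hazard_ratio - 1.0)
    return 100.0 * excess / (1.0 + excess)
```

Because the formula is increasing in prevalence for a fixed HR, groups with similar hazard ratios but higher diabetes prevalence get larger PARPs, which is the "amplifying" effect the authors describe.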
Ishikawa, Joji; Ishikawa, Shizukiyo; Kario, Kazuomi
2015-03-01
We attempted to evaluate whether subjects who exhibit prolonged corrected QT (QTc) interval (≥440 ms in men and ≥460 ms in women) on ECG, with and without ECG-diagnosed left ventricular hypertrophy (ECG-LVH; Cornell product, ≥244 mV×ms), are at increased risk of stroke. Among the 10,643 subjects, there were a total of 375 stroke events during the follow-up period (128.7±28.1 months; 114,142 person-years). The subjects with prolonged QTc interval had an increased risk of stroke (hazard ratio, 2.13; 95% confidence interval, 1.22-3.73), even after adjustment for ECG-LVH (hazard ratio, 1.71; 95% confidence interval, 1.22-2.40). When we stratified the subjects into those with neither a prolonged QTc interval nor ECG-LVH, those with a prolonged QTc interval but without ECG-LVH, and those with ECG-LVH, multivariate-adjusted Cox proportional hazards analysis demonstrated that the subjects with prolonged QTc intervals but not ECG-LVH (1.2% of all subjects; incidence, 10.7%; hazard ratio, 2.70; 95% confidence interval, 1.48-4.94) and those with ECG-LVH (incidence, 7.9%; hazard ratio, 1.83; 95% confidence interval, 1.31-2.57) had an increased risk of stroke events, compared with those with neither a prolonged QTc interval nor ECG-LVH. In conclusion, prolonged QTc interval was associated with stroke risk even among patients without ECG-LVH in the general population. © 2014 American Heart Association, Inc.
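The QTc thresholds above (≥440 ms in men, ≥460 ms in women) apply to a heart-rate-corrected QT interval. A minimal sketch assuming Bazett's correction, the most common choice (the abstract does not state which correction formula the study used):

```python
import math

def qtc_bazett(qt_ms, rr_s):
    """Bazett heart-rate correction: QTc = QT / sqrt(RR), with RR in seconds."""
    return qt_ms / math.sqrt(rr_s)

def prolonged_qtc(qtc_ms, sex):
    """Thresholds from the abstract: >=440 ms for men, >=460 ms for women."""
    threshold = 440.0 if sex == "M" else 460.0
    return qtc_ms >= threshold

# At 75 bpm the RR interval is 0.8 s, so a measured QT of 400 ms corrects upward
qtc = qtc_bazett(400.0, 0.8)  # ~447 ms
print(prolonged_qtc(qtc, "M"), prolonged_qtc(qtc, "F"))  # True False
```

Note how the same corrected interval can be "prolonged" for a man but not for a woman, which is why the sex-specific cut-offs matter for the stratified analysis.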
Liou, Li-Syue; Chang, Chih-Ya; Chen, Hsuan-Ju; Tseng, Chun-Hung; Chen, Cheng-Yu
2017-01-01
Objective This population-based cohort study investigated the risk of developing peripheral arterial occlusive disease (PAOD) in patients with Bell’s palsy. Methods We used longitudinal claims data of the health insurance of Taiwan to identify 5,152 patients with Bell’s palsy newly diagnosed in 2000–2010 and a control cohort of 20,608 patients without Bell’s palsy matched by propensity score. Incidence and hazard ratio (HR) of PAOD were assessed by the end of 2013. Results The incidence of PAOD was approximately 1.5 times greater in the Bell’s palsy group than in the non-Bell’s palsy controls (7.75 vs. 4.99 per 1000 person-years). Cox proportional hazards regression analysis yielded an adjusted HR of 1.54 (95% confidence interval (CI) = 1.35–1.76) for the Bell’s palsy group compared to the non-Bell’s palsy group, after adjusting for sex, age, occupation, income and comorbidities. Men were at higher risk of PAOD than women in the Bell’s palsy group, but not in the controls. The incidence of PAOD increased with age in both groups, but the Bell’s palsy group to control group HR of PAOD decreased as age increased. Systemic steroid treatment was associated with a 13% reduction in PAOD hazard for Bell’s palsy patients compared with those without the treatment, but this reduction was not statistically significant. Conclusions Bell’s palsy appears to be associated with an increased risk of developing PAOD. Further pathophysiologic, histopathologic and immunologic research is required to explore the underlying biologic mechanism. PMID:29216223
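The incidences above are expressed per 1000 person-years, and the crude rate ratio implied by them sits close to the adjusted HR. A minimal sketch of the arithmetic (event counts here are hypothetical; only the 7.75 and 4.99 rates come from the abstract):

```python
def rate_per_1000py(events, person_years):
    """Incidence rate expressed per 1000 person-years of follow-up."""
    return 1000.0 * events / person_years

# Crude rate ratio from the rates reported in the abstract
crude_rr = 7.75 / 4.99
print(round(crude_rr, 2))  # 1.55

# Hypothetical illustration of how such a rate is formed
print(rate_per_1000py(5, 1000))  # 5.0
```

The crude ratio (≈1.55) happens to land near the adjusted HR of 1.54; in general the two differ, since the Cox model additionally adjusts for sex, age, occupation, income and comorbidities.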
Chen, Mu-Hong; Wei, Han-Ting; Su, Tung-Ping; Li, Cheng-Ta; Lin, Wei-Chen; Chang, Wen-Han; Chen, Tzeng-Ji; Bai, Ya-Mei
2014-05-01
Herpes zoster results from reactivation of the endogenous varicella zoster virus infection. Previous studies have shown that herpes zoster and postherpetic neuralgia were associated with anxiety, depression, and insomnia. However, no prospective study has investigated the association between herpes zoster and the development of depressive disorder. Subjects were identified through the Taiwan National Health Insurance Research Database. Patients 18 years or older with a diagnosis of herpes zoster and without a psychiatric history were enrolled in 2000 and compared with age-/sex-matched controls (1:4). These participants were followed up to the end of 2010 for new-onset depressive disorder. A total of 1888 patients with herpes zoster were identified and compared with 7552 age-/sex-matched controls in 2000. Those with herpes zoster had a higher incidence of developing major depression (2.2% versus 1.4%, p = .018) and any depressive disorder (4.3% versus 3.2%, p = .020) than did the control group. The follow-up showed that herpes zoster was an independent risk factor for major depression (hazard ratio = 1.49, 95% confidence interval = 1.04-2.13) and any depressive disorder (hazard ratio = 1.32, 95% confidence interval = 1.03-1.70), after adjusting for demographic data and comorbid medical diseases. This is the first study to investigate the temporal association between herpes zoster and depressive disorder. Further studies would be required to clarify the underlying pathophysiology of this association and whether proper treatment of herpes zoster could decrease the long-term risk of depressive disorder.
Dudeja, Divya; Bartarya, Sukesh Kumar; Biyani, A K
2011-10-01
The present study discusses ion sources and assesses the chemical quality of groundwater of the Doon Valley in the Outer Himalayan region for drinking and irrigational purposes. The valley is largely filled with Doon gravels, which form the main aquifers supplying water to its inhabitants. These aquifers are recharged only by meteoric water, so groundwater quality is controlled essentially by chemical processes between water and lithology, locally altered by human activities. Seventy-six water samples were collected from dug wells, hand pumps and tube wells and were analysed for their major ion concentrations. The pH varies from 5.6 to 7.4 and electrical conductivity from 71 to 951 μmho/cm. Groundwater of the Doon Valley is dominated by bicarbonate, which contributes 83% of the anionic abundance, while calcium and magnesium together account for 88% of the cationic concentration. The seasonal and spatial variation in ionic concentration, in general, is related to discharge and lithology. The high (Ca + Mg)/(Na + K) ratio (about 10), the low (Na + K)/TZ+ ratio (about 0.2) and the presence of carbonate lithology in the northern part of the valley indicate carbonate dissolution as the main solute acquisition process in the valley. The low silica content and high HCO₃/H₄SiO₄ ratio also support carbonate dissolution, with silicate weathering playing a less significant role as a source of dissolved ions in the Doon Valley. The analytical results computed for various indices show that the water is of fairly good quality: hard, but with moderate dissolved solids content. It is free from sodium hazard, lying in the C₁-S₁ and C₂-S₁ classes of the USSL diagram, and is in general suitable for drinking and irrigation, except at a few locations with slightly high salinity hazard.
De Stavola, Bianca L; Pizzi, Costanza; Clemens, Felicity; Evans, Sally Ann; Evans, Anthony D; dos Santos Silva, Isabel
2012-04-01
Flight crew are exposed to several potential occupational hazards. This study compares mortality rates in UK flight crew to those in air traffic control officers (ATCOs) and the general population. A total of 19,489 flight crew and ATCOs were identified from the UK Civil Aviation Authority medical records and followed to the end of 2006. Consented access to medical records and questionnaire data provided information on demographic, behavioral, clinical, and occupational variables. Standardized mortality ratios (SMR) were estimated for these two occupational groups using the UK general population. Adjusted mortality hazard ratios (HR) for flight crew versus ATCOs were estimated via Cox regression models. A total of 577 deaths occurred during follow-up. Relative to the general population, both flight crew (SMR 0.32; 95% CI 0.30, 0.35) and ATCOs (0.39; 0.32, 0.47) had lower all-cause mortality, mainly due to marked reductions in mortality from neoplasms and cardiovascular diseases, although flight crew had higher mortality from aircraft accidents (SMR 42.8; 27.9, 65.6). There were no differences in all-cause mortality (HR 0.99; 95% CI 0.79, 1.25), or in mortality from any major cause, between the two occupational groups after adjustment for health-related variables, again except for those from aircraft accidents. The latter ratios, however, declined with increasing number of hours. The low all-cause mortality observed in both occupational groups relative to the general population is consistent with a strong "healthy worker effect" and their low prevalence of smoking and other risk factors. Mortality among flight crew did not appear to be influenced by occupational exposures, except for a rise in mortality from aircraft accidents.
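A standardized mortality ratio compares the deaths observed in a cohort with the number expected if general-population rates applied, usually summed over age (and calendar-period) strata. A minimal sketch with hypothetical stratum data (the study's actual strata are not given in the abstract):

```python
def expected_deaths(strata):
    """Expected deaths from (person_years, reference_rate_per_person_year) pairs."""
    return sum(py * rate for py, rate in strata)

def smr(observed, strata):
    """Standardized mortality ratio: observed deaths / expected deaths."""
    return observed / expected_deaths(strata)

# Hypothetical cohort: two age strata with general-population death rates
strata = [(10_000, 0.002), (5_000, 0.010)]  # expected = 20 + 50 = 70 deaths
print(round(smr(22, strata), 2))  # 0.31
```

An SMR well below 1, as for the flight crew (0.32) and ATCOs (0.39) above, is the signature of the "healthy worker effect" the authors describe.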
Shahan, M.R.; Seaman, C.E.; Beck, T.W.; Colinet, J.F.; Mischler, S.E.
2017-01-01
Float coal dust is produced by various mining methods, carried by ventilating air and deposited on the floor, roof and ribs of mine airways. Once deposited, float coal dust can be re-entrained by a methane explosion and, without sufficient quantities of inert rock dust, can propagate the explosion throughout mining entries. Consequently, controlling float coal dust is of critical interest to mining operations. Rock dusting, which is the adding of inert material to airway surfaces, is the main control technique currently used by the coal mining industry to reduce the float coal dust explosion hazard. To assist the industry in reducing this hazard, the Pittsburgh Mining Research Division of the U.S. National Institute for Occupational Safety and Health initiated a project to investigate methods and technologies to reduce float coal dust in underground coal mines through prevention, capture and suppression prior to deposition. Field characterization studies were performed to determine quantitatively the sources, types and amounts of dust produced during various coal mining processes. The operations chosen for study were a continuous miner section, a longwall section and a coal-handling facility. For each of these operations, the primary dust sources were confirmed to be the continuous mining machine, longwall shearer and conveyor belt transfer points, respectively. Respirable and total airborne float dust samples were collected and analyzed for each operation, and the ratio of total airborne float coal dust to respirable dust was calculated. During the continuous mining process, the ratio of total airborne float coal dust to respirable dust ranged from 10.3 to 13.8. The ratios measured on the longwall face were between 18.5 and 21.5. The total airborne float coal dust to respirable dust ratio observed during belt transport ranged between 7.5 and 21.8. PMID:28936001
Ethnic Differences in Incidence and Outcomes of Childhood Nephrotic Syndrome.
Banh, Tonny H M; Hussain-Shamsy, Neesha; Patel, Viral; Vasilevska-Ristovska, Jovanka; Borges, Karlota; Sibbald, Cathryn; Lipszyc, Deborah; Brooke, Josefina; Geary, Denis; Langlois, Valerie; Reddon, Michele; Pearl, Rachel; Levin, Leo; Piekut, Monica; Licht, Christoph P B; Radhakrishnan, Seetha; Aitken-Menezes, Kimberly; Harvey, Elizabeth; Hebert, Diane; Piscione, Tino D; Parekh, Rulan S
2016-10-07
Ethnic differences in outcomes among children with nephrotic syndrome are unknown. We conducted a longitudinal study at a single regional pediatric center comparing ethnic differences in incidence from 2001 to 2011 census data and longitudinal outcomes, including relapse rates, time to first relapse, frequently relapsing disease, and use of cyclophosphamide. Among 711 children, 24% were European, 33% were South Asian, 10% were East/Southeast Asian, and 33% were of other origins. Over 10 years, the overall incidence increased from 1.99/100,000 to 4.71/100,000 among children ages 1-18 years old. In 2011, South Asians had a higher incidence rate ratio of 6.61 (95% confidence interval, 3.16 to 15.1) compared with Europeans. East/Southeast Asians had a similar incidence rate ratio (0.76; 95% confidence interval, 0.13 to 2.94) to Europeans. We determined outcomes in 455 children from the three largest ethnic groups with steroid-sensitive disease over a median of 4 years. South Asian and East/Southeast Asian children had significantly lower odds of frequently relapsing disease at 12 months (South Asian: adjusted odds ratio, 0.55; 95% confidence interval, 0.39 to 0.77; East/Southeast Asian: adjusted odds ratio, 0.42; 95% confidence interval, 0.34 to 0.51), fewer subsequent relapses (South Asian: adjusted odds ratio, 0.64; 95% confidence interval, 0.50 to 0.81; East/Southeast Asian: adjusted odds ratio, 0.47; 95% confidence interval, 0.24 to 0.91), lower risk of a first relapse (South Asian: adjusted hazard ratio, 0.74; 95% confidence interval, 0.67 to 0.83; East/Southeast Asian: adjusted hazard ratio, 0.65; 95% confidence interval, 0.63 to 0.68), and lower use of cyclophosphamide (South Asian: adjusted hazard ratio, 0.82; 95% confidence interval, 0.53 to 1.28; East/Southeast Asian: adjusted hazard ratio, 0.54; 95% confidence interval, 0.41 to 0.71) compared with European children.
Despite the higher incidence among South Asians, South and East/Southeast Asian children have significantly less complicated clinical outcomes compared with Europeans. Copyright © 2016 by the American Society of Nephrology.
Demetri, George D; von Mehren, Margaret; Jones, Robin L; Hensley, Martee L; Schuetze, Scott M; Staddon, Arthur; Milhem, Mohammed; Elias, Anthony; Ganjoo, Kristen; Tawbi, Hussein; Van Tine, Brian A; Spira, Alexander; Dean, Andrew; Khokhar, Nushmia Z; Park, Youn Choi; Knoblauch, Roland E; Parekh, Trilok V; Maki, Robert G; Patel, Shreyaskumar R
2016-03-10
This multicenter study, to our knowledge, is the first phase III trial to compare trabectedin versus dacarbazine in patients with advanced liposarcoma or leiomyosarcoma after prior therapy with an anthracycline and at least one additional systemic regimen. Patients were randomly assigned in a 2:1 ratio to receive trabectedin or dacarbazine intravenously every 3 weeks. The primary end point was overall survival (OS); secondary end points were disease control (progression-free survival [PFS], time to progression, objective response rate, and duration of response) as well as safety and patient-reported symptom scoring. A total of 518 patients were enrolled and randomly assigned to either trabectedin (n = 345) or dacarbazine (n = 173). In the final analysis of PFS, trabectedin administration resulted in a 45% reduction in the risk of disease progression or death compared with dacarbazine (median PFS for trabectedin v dacarbazine, 4.2 v 1.5 months; hazard ratio, 0.55; P < .001); benefits were observed across all preplanned subgroup analyses. The interim analysis of OS (64% censored) demonstrated a 13% reduction in risk of death in the trabectedin arm compared with dacarbazine (median OS for trabectedin v dacarbazine, 12.4 v 12.9 months; hazard ratio, 0.87; P = .37). The safety profiles were consistent with the well-characterized toxicities of both agents, and the most common grade 3 to 4 adverse effects were myelosuppression and transient elevation of transaminases in the trabectedin arm. Trabectedin demonstrates superior disease control versus conventional dacarbazine in patients who have advanced liposarcoma and leiomyosarcoma after they experience failure of prior chemotherapy. Because disease control in advanced sarcomas is a clinically relevant end point, this study supports the activity of trabectedin for patients with these malignancies. © 2015 by American Society of Clinical Oncology.
Ryödi, Essi; Metso, Saara; Jaatinen, Pia; Huhtala, Heini; Saaristo, Rauni; Välimäki, Matti; Auvinen, Anssi
2015-10-01
Some previous studies have suggested increased cancer risk in hyperthyroid patients treated with radioactive iodine (RAI). It is unclear whether the excess cancer risk is attributable to hyperthyroidism, its treatment, or the shared risk factors of the two diseases. The objective was to assess cancer morbidity and mortality in hyperthyroid patients treated with either RAI or surgery. We identified 4334 patients treated surgically for hyperthyroidism in Finland during 1986-2007 from the Hospital Discharge Registry and 1814 patients treated with RAI for hyperthyroidism at Tampere University Hospital. For each patient, three age- and gender-matched controls were chosen. Information on cancer diagnoses was obtained from the Cancer Registry. The follow-up began 3 months after the treatment and ended at cancer diagnosis, death, emigration, or the common closing date (December 31, 2009). The overall cancer incidence was not increased among the hyperthyroid patients compared to their controls (rate ratio [RR], 1.05; 95% confidence interval [CI], 0.96-1.15). However, the risk of cancers of the respiratory tract (RR, 1.46; 95% CI, 1.05-2.02) and the stomach (RR, 1.64; 95% CI, 1.01-2.68) was increased among the patients. The overall cancer mortality did not differ between the patients and the controls (RR, 1.08; 95% CI, 0.94-1.25). The type of treatment did not affect the overall risk of cancer (hazard ratio for RAI vs thyroidectomy, 1.03; 95% CI, 0.86-1.23) or cancer mortality (hazard ratio, 1.04; 95% CI, 0.91-1.21). In this cohort of Finnish patients with hyperthyroidism treated with thyroidectomy or RAI, the overall risk of cancer was not increased, although an increased risk of gastric and respiratory tract cancers was seen in hyperthyroid patients. Based on this large-scale, long-term follow-up study, the increased cancer risk in hyperthyroid patients is attributable to hyperthyroidism and shared risk factors, not the treatment modality.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-29
... Information Collection: Healthy Homes and Lead Hazard Control Programs Data Collection--Progress Reporting AGENCY: Office of Healthy Homes and Lead Hazard Control, HUD. ACTION: Notice. SUMMARY: The revised... of Healthy Homes and Lead Hazard Control, Department of Housing and Urban Development, 451 7th Street...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-20
... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Hazard Analysis and Risk- Based Preventive Controls for Human Food'' and its information collection... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food.'' IV. How To...
Maternity leave in the ninth month of pregnancy and birth outcomes among working women.
Guendelman, Sylvia; Pearl, Michelle; Graham, Steve; Hubbard, Alan; Hosang, Nap; Kharrazi, Martin
2009-01-01
The health effects of antenatal maternity leave have been scarcely evaluated. In California, women are eligible for paid benefits up to 4 weeks before delivery. We explored whether leave at ≥36 weeks gestation increases gestation and birthweight, and reduces primary cesarean deliveries among full-time working women. Drawing from a 2002-2003 nested case-control study of preterm birth and low birthweight among working women in Southern California, we compared a cohort of women who took leave (n = 62) or worked until delivery (n = 385). Models weighted for probability of sampling were used to calculate hazards ratios for gestational age, odds ratios (OR) for primary cesarean delivery, and multilinear regression coefficients for birthweight. Leave-takers were similar to non-leave-takers on demographic and health characteristics, except that more clerical workers took leave (p = .02). Compared with non-leave-takers, leave-takers had almost 4 times lower odds of cesarean delivery after adjusting for covariates (OR, 0.27; 95% confidence interval [CI], 0.08-0.94). Overall, there were no marked differences in length of gestation or mean birthweight. However, in a subgroup of women whose efforts outstripped their occupational rewards, gestation was prolonged (hazard ratio for delivery each day between 36 and 41 weeks, 0.56; 95% CI, 0.34-0.93). Maternity leave in late pregnancy shows promise for reducing cesarean deliveries and prolonging gestation in occupationally strained women.
Preclinical Alzheimer disease and risk of falls.
Stark, Susan L; Roe, Catherine M; Grant, Elizabeth A; Hollingsworth, Holly; Benzinger, Tammie L; Fagan, Anne M; Buckles, Virginia D; Morris, John C
2013-07-30
We determined the rate of falls among cognitively normal, community-dwelling older adults, some of whom had presumptive preclinical Alzheimer disease (AD) as detected by in vivo imaging of fibrillar amyloid plaques using Pittsburgh compound B (PiB) and PET and/or by assays of CSF to identify Aβ₄₂, tau, and phosphorylated tau. We conducted a 12-month prospective cohort study to examine the cumulative incidence of falls. Participants were evaluated clinically and underwent PiB PET imaging and lumbar puncture. Falls were reported monthly using an individualized calendar journal returned by mail. A Cox proportional hazards model was used to test whether time to first fall was associated with each biomarker and the ratio of CSF tau/Aβ₄₂ and CSF phosphorylated tau/Aβ₄₂, after adjustment for common fall risk factors. The sample (n = 125) was predominantly female (62.4%) and white (96%) with a mean age of 74.4 years. When controlled for ability to perform activities of daily living, higher levels of PiB retention (hazard ratio = 2.95 [95% confidence interval 1.01-6.45], p = 0.05) and of CSF biomarker ratios (p < 0.001) were associated with a faster time to first fall. Presumptive preclinical AD is a risk factor for falls in older adults. This study suggests that subtle noncognitive changes that predispose older adults to falls are associated with AD and may precede detectable cognitive changes.
A Bayesian Hybrid Adaptive Randomisation Design for Clinical Trials with Survival Outcomes.
Moatti, M; Chevret, S; Zohar, S; Rosenberger, W F
2016-01-01
Response-adaptive randomisation designs have been proposed to improve the efficiency of phase III randomised clinical trials and improve the outcomes of the clinical trial population. In the setting of failure time outcomes, Zhang and Rosenberger (2007) developed a response-adaptive randomisation approach that targets an optimal allocation, based on a fixed sample size. The aim of this research is to propose a response-adaptive randomisation procedure for survival trials with an interim monitoring plan, based on the following optimal criterion: for fixed variance of the estimated log hazard ratio, what allocation minimizes the expected hazard of failure? We demonstrate the utility of the design by redesigning a clinical trial on multiple myeloma. To handle continuous monitoring of data, we propose a Bayesian response-adaptive randomisation procedure, where the log hazard ratio is the effect measure of interest. Combining the prior with the normal likelihood, the mean posterior estimate of the log hazard ratio allows derivation of the optimal target allocation. We perform a simulation study to assess and compare the performance of this proposed Bayesian hybrid adaptive design to those of fixed, sequential or adaptive (either frequentist or fully Bayesian) designs. Noninformative normal priors of the log hazard ratio were used, as well as mixtures of enthusiastic and skeptical priors. Stopping rules based on the posterior distribution of the log hazard ratio were computed. The method is then illustrated by redesigning a phase III randomised clinical trial of chemotherapy in patients with multiple myeloma, with mixtures of normal priors elicited from experts. As expected, there was a reduction in the proportion of observed deaths in the adaptive vs. non-adaptive designs; this reduction was maximized using a Bayes mixture prior, with no clear-cut improvement by using a fully Bayesian procedure.
The use of stopping rules allows a slight decrease in the observed proportion of deaths under the alternative hypothesis compared with the adaptive designs with no stopping rules. Such Bayesian hybrid adaptive survival trials may be promising alternatives to traditional designs, reducing the duration of survival trials and better addressing ethical concerns for patients enrolled in the trial.
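The posterior update the authors describe ("combining the prior with the normal likelihood") is the standard conjugate normal-normal calculation on the log hazard ratio scale. A minimal sketch under that assumption, with hypothetical numbers:

```python
import math

def posterior_log_hr(prior_mean, prior_sd, obs_log_hr, obs_se):
    """Conjugate normal-normal update for a log hazard ratio.

    The posterior mean is a precision-weighted average of the prior mean
    and the observed estimate; the posterior variance is the inverse of
    the summed precisions.
    """
    w_prior = 1.0 / prior_sd ** 2
    w_obs = 1.0 / obs_se ** 2
    post_mean = (w_prior * prior_mean + w_obs * obs_log_hr) / (w_prior + w_obs)
    post_sd = math.sqrt(1.0 / (w_prior + w_obs))
    return post_mean, post_sd

# Hypothetical: vague prior N(0, 1) and an observed log HR of -0.5 (SE 0.25)
mean, sd = posterior_log_hr(0.0, 1.0, -0.5, 0.25)
print(round(math.exp(mean), 3))  # posterior-mean hazard ratio, shrunk toward 1
```

The shrinkage toward the prior mean of 0 (hazard ratio 1) is what a skeptical prior contributes; an enthusiastic prior centered below 0 would instead pull the estimate further from the null.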
Effect of Long Working Hours on Self-reported Hypertension among Middle-aged and Older Wage Workers.
Yoo, Dong Hyun; Kang, Mo-Yeol; Paek, Domyung; Min, Bokki; Cho, Sung-Il
2014-01-01
Many studies have reported an association between overwork and hypertension. However, research on the health effects of long working hours has yielded inconclusive results. The objective of this study was to identify an association between overtime work and hypertension in wage workers aged 45 years and over, using prospective data. Wage workers in Korea aged 45 years and over were selected for inclusion in this study from among 10,254 subjects from the Korean Longitudinal Study of Ageing. Workers with baseline hypertension and those with other major diseases were excluded. In the end, a total of 1,079 subjects were included. A Cox proportional hazards model was used to calculate hazard ratios and adjust for baseline characteristics such as sex, age, education, income, occupation, form of employment, body mass index, alcohol habit, smoking habit, regular exercise, and number of working days per week. Additional models were used to calculate hazard ratios after gender stratification. Among the 1,079 subjects, 85 workers were diagnosed with hypertension during 3974.2 person-months. The average number of working hours per week for all subjects was 47.68. The proportion of overtime workers was 61.0% (cutoff, 40 h per week). Compared with those working 40 h or less per week, the hazard ratio of subjects in the final model, which adjusted for all selected variables, working 41-50 h per week was 2.20 (95% confidence interval [CI], 1.19-4.06), that of subjects working 51-60 h per week was 2.40 (95% CI, 1.07-5.39), and that of subjects working 61 h and over per week was 2.87 (95% CI, 1.33-6.20). In gender stratification models, the hazard ratio of the females tended to be higher than that of the males. As the number of working hours per week increased, the hazard ratio for diagnosis of hypertension significantly increased. This result suggests a positive association between overtime work and the risk of hypertension.
Lai, Shih-Wei; Lin, Cheng-Li; Liao, Kuan-Fu
2017-08-01
Very little is known about the association between glaucoma and Parkinson's disease in the elderly. The objective of this study was to determine whether glaucoma is associated with Parkinson's disease in older people in Taiwan. A retrospective cohort study was conducted to analyze the Taiwan National Health Insurance Program database from 2000 to 2010. We included 4330 subjects aged 65 years or older with newly diagnosed glaucoma as the glaucoma group, and 17,000 randomly selected subjects without a glaucoma diagnosis as the non-glaucoma group. Both groups were matched for sex, age, other comorbidities, and index year of glaucoma diagnosis. The incidence of Parkinson's disease at the end of 2011 was measured. A multivariable Cox proportional hazard regression model was used to measure the hazard ratio and 95% confidence intervals for Parkinson's disease associated with glaucoma. The overall incidence of Parkinson's disease was 1.28-fold higher in the glaucoma group than that in the non-glaucoma group (7.73 vs. 6.02 per 1000 person-years; 95% confidence interval 1.18, 1.40). After controlling for potential confounding factors, the adjusted hazard ratio of Parkinson's disease was 1.23 for the glaucoma group (95% confidence interval 1.05, 1.46), compared with the non-glaucoma group. Glaucoma in older people is associated with a small but statistically significant increase in the risk of Parkinson's disease. Whether glaucoma may be a non-motor feature of Parkinson's disease in older people requires further research to confirm.
Zhang, Zhongheng; Xu, Xiao; Ni, Hongying; Deng, Hongsheng
2014-02-01
Urine output (UO) is routinely measured in the intensive care unit (ICU) but its prognostic value remains debated. The study aimed to investigate the association between day 1 UO and hospital mortality. Clinical data were abstracted from the Multiparameter Intelligent Monitoring in Intensive Care II (version 2.6) database. UO was recorded for the first 24 h after ICU entry, and was classified into three categories: UO >0.5, 0.3-0.5 and ≤0.3 ml/kg per hour. The primary endpoint was hospital mortality. Four models were built to adjust the hazard ratios for mortality. A total of 21,207 unselected ICU patients including 2,401 non-survivors and 18,806 survivors were included (mortality rate 11.3%). Mortality rate increased progressively across UO categories: >0.5 (7.67%), 0.3-0.5 (11.27%) and ≤0.3 ml/kg/h (18.29%), and this relationship remained statistically significant after rigorous control of confounding factors with the Cox proportional hazards regression model. With UO >0.5 as the referent group, the hazard ratios for UO 0.3-0.5 and UO ≤0.3 were 1.41 (95% CI 1.29-1.54) and 1.52 (95% CI 1.38-1.67), respectively. UO obtained on ICU entry is an independent predictor of mortality irrespective of diuretic use. It would be interesting to examine whether strategies to increase UO would improve clinical outcome.
Maggio, Marcello; Corsonello, Andrea; Ceda, Gian Paolo; Cattabiani, Chiara; Lauretani, Fulvio; Buttò, Valeria; Ferrucci, Luigi; Bandinelli, Stefania; Abbatecola, Angela Marie; Spazzafumo, Liana; Lattanzio, Fabrizia
2013-04-08
The use of proton pump inhibitors (PPIs) has rapidly increased during the past several years. However, concern remains about risks associated with their long-term use in older populations. We aimed to investigate the relationship between the use of PPIs and the risk of death or the combined end point of death or rehospitalization in older patients discharged from acute care hospitals. We investigated the relationship between PPI use and study outcomes using time-dependent Cox proportional hazards regression in patients 65 years or older discharged from acute care medical wards from April 1 to June 30, 2007. The study covered 11 acute care medical wards and included 491 patients (mean [SD] age, 80.0 [5.9] years); outcomes were mortality and the combined end point of death or rehospitalization. The use of PPIs was independently associated with mortality (hazard ratio, 1.51 [95% CI, 1.03-2.77]) but not with the combined end point (1.49 [0.98-2.17]). An increased risk of mortality was observed among patients exposed to high-dose PPIs vs none (hazard ratio, 2.59 [95% CI, 1.22-7.16]). In older patients discharged from acute care hospitals, the use of high-dose PPIs is associated with increased 1-year mortality. Randomized controlled studies including older frail patients are needed. In the meantime, physicians need to use caution and balance benefits and harms in long-term prescription of high-dose PPIs.
Powe, Desmond G; Voss, Melanie J; Zänker, Kurt S; Habashy, Hany O; Green, Andrew R; Ellis, Ian O; Entschladen, Frank
2010-11-01
Laboratory models show that the beta-blocker propranolol can inhibit norepinephrine-induced breast cancer cell migration. We hypothesised that breast cancer patients receiving beta-blockers for hypertension would show reduced metastasis and improved clinical outcome. Three patient subgroups were identified from the medical records of 466 consecutive female patients (median age 57 years, range 28-71) with operable breast cancer and follow-up (>10 years). Two subgroups comprised 43 and 49 hypertensive patients treated with beta-blockers or other antihypertensives, respectively, prior to cancer diagnosis; 374 patients formed a non-hypertensive control group. Metastasis development, disease-free interval, tumour recurrence and hazard risk were statistically compared between groups. Kaplan-Meier plots were used to model survival and distant metastasis (DM). Beta-blocker-treated patients showed a significant reduction in metastasis development (p=0.026) and tumour recurrence (p=0.001), and a longer disease-free interval (p=0.01). In addition, there was a 57% reduced risk of metastasis (hazard ratio=0.430; 95% CI=0.200-0.926, p=0.031) and a 71% reduction in breast cancer mortality after 10 years (hazard ratio=0.291; 95% CI=0.119-0.715, p=0.007). This proof-of-principle study showed that beta-blocker therapy significantly reduces distant metastases, cancer recurrence, and cancer-specific mortality in breast cancer patients, suggesting a novel role for beta-blocker therapy. A larger epidemiological study leading to randomised clinical trials is needed for breast and other cancer types, including colon, prostate and ovary.
Mortality of rheumatoid arthritis in Japan: a longitudinal cohort study.
Hakoda, M; Oiwa, H; Kasagi, F; Masunari, N; Yamada, M; Suzuki, G; Fujiwara, S
2005-10-01
To determine the mortality risk of Japanese patients with rheumatoid arthritis, taking into account lifestyle and physical factors, including comorbidity. 91 individuals with rheumatoid arthritis were identified during the screening of a cohort of 16,119 Japanese atomic bomb survivors in the period 1958 to 1966. These individuals and the remainder of the cohort were followed for mortality until 1999. Mortality risk of the rheumatoid patients was estimated by the Cox proportional hazards model. In addition to age and sex, lifestyle and physical factors such as smoking status, alcohol consumption, blood pressure, and comorbidity were included as adjustment factors in the analysis of total mortality and in the analysis of mortality from each cause of death. 83 of the rheumatoid patients (91.2%) and 8527 of the non-rheumatoid controls (52.9%) died during mean follow-up periods of 17.8 and 28.0 years, respectively. The age- and sex-adjusted hazard ratio for mortality in the rheumatoid patients was 1.60 (95% confidence interval, 1.29 to 1.99), p < 0.001. Multiple adjustments, including for lifestyle and physical factors, resulted in a similar mortality hazard ratio of 1.57 (1.25 to 1.94), p < 0.001. Although mortality risk tended to be higher in male than in female rheumatoid patients, the difference was not significant. Pneumonia, tuberculosis, and liver disease were significantly increased as causes of death in rheumatoid patients. Rheumatoid arthritis is an independent risk factor for mortality. Infectious events are associated with increased mortality in rheumatoid arthritis.
Kwan, Sharon W; Harris, William P; Gold, Laura S; Hebert, Paul L
2018-06-01
The purpose of this study was to compare the clinical effectiveness of embolization with that of sorafenib in the management of hepatocellular carcinoma as practiced in real-world settings. This population-based observational study was conducted with the Surveillance, Epidemiology, and End Results-Medicare linked database. Patients 65 years old and older with a diagnosis of primary liver cancer between 2007 and 2011 who underwent embolization or sorafenib treatment were identified. Patients were excluded if they had insufficient claims records, a diagnosis of intrahepatic cholangiocarcinoma, or other primary cancer or had undergone liver transplant or combination therapy. The primary outcome of interest was overall survival. Inverse probability of treatment weighting models were used to control for selection bias. The inclusion and exclusion criteria were met by 1017 patients. Models showed good balance between treatment groups. Compared with those who underwent embolization, patients treated with sorafenib had significantly higher hazard of earlier death from time of treatment (hazard ratio, 1.87; 95% CI, 1.46-2.37; p < 0.0001) and from time of cancer diagnosis (hazard ratio, 1.87; 95% CI, 1.46-2.39; p < 0.0001). The survival advantage after embolization was seen in both intermediate- and advanced-stage disease. This comparative effectiveness study of Medicare patients with hepatocellular carcinoma showed significantly longer overall survival after treatment with embolization than with sorafenib. Because these findings conflict with expert opinion-based guidelines for treatment of advanced-stage disease, prospective randomized comparative trials in this subpopulation would be justified.
Outcomes of Kidney Transplantation in HIV-Infected Recipients
Stock, Peter G.; Barin, Burc; Murphy, Barbara; Hanto, Douglas; Diego, Jorge M.; Light, Jimmy; Davis, Charles; Blumberg, Emily; Simon, David; Subramanian, Aruna; Millis, J. Michael; Lyon, G. Marshall; Brayman, Kenneth; Slakey, Doug; Shapiro, Ron; Melancon, Joseph; Jacobson, Jeffrey M.; Stosor, Valentina; Olson, Jean L.; Stablein, Donald M.; Roland, Michelle E.
2010-01-01
BACKGROUND The outcomes of kidney transplantation and immunosuppression in people infected with human immunodeficiency virus (HIV) are incompletely understood. METHODS We undertook a prospective, nonrandomized trial of kidney transplantation in HIV-infected candidates who had CD4+ T-cell counts of at least 200 per cubic millimeter and undetectable plasma HIV type 1 (HIV-1) RNA levels while being treated with a stable antiretroviral regimen. Post-transplantation management was provided in accordance with study protocols that defined prophylaxis against opportunistic infection, indications for biopsy, and acceptable approaches to immunosuppression, management of rejection, and antiretroviral therapy. RESULTS Between November 2003 and June 2009, a total of 150 patients underwent kidney transplantation; survivors were followed for a median period of 1.7 years. Patient survival rates (±SD) at 1 year and 3 years were 94.6±2.0% and 88.2±3.8%, respectively, and the corresponding mean graft-survival rates were 90.4% and 73.7%. In general, these rates fall somewhere between those reported in the national database for older kidney-transplant recipients (≥65 years) and those reported for all kidney-transplant recipients. A multivariate proportional-hazards analysis showed that the risk of graft loss was increased among patients treated for rejection (hazard ratio, 2.8; 95% confidence interval [CI], 1.2 to 6.6; P = 0.02) and those receiving antithymocyte globulin induction therapy (hazard ratio, 2.5; 95% CI, 1.1 to 5.6; P = 0.03); living-donor transplants were protective (hazard ratio, 0.2; 95% CI, 0.04 to 0.8; P = 0.02). A higher-than-expected rejection rate was observed, with 1-year and 3-year estimates of 31% (95% CI, 24 to 40) and 41% (95% CI, 32 to 52), respectively. HIV infection remained well controlled, with stable CD4+ T-cell counts and few HIV-associated complications. 
CONCLUSIONS In this cohort of carefully selected HIV-infected patients, both patient- and graft-survival rates were high at 1 and 3 years, with no increases in complications associated with HIV infection. The unexpectedly high rejection rates are of serious concern and indicate the need for better immunotherapy. PMID:21083386
Suicide After Deliberate Self-Harm in Adolescents and Young Adults.
Olfson, Mark; Wall, Melanie; Wang, Shuai; Crystal, Stephen; Bridge, Jeffrey A; Liu, Shang-Min; Blanco, Carlos
2018-04-01
Among adolescents and young adults with nonfatal self-harm, our objective was to identify risk factors for repeated nonfatal self-harm and suicide death over the following year. A national cohort of patients in the Medicaid program, aged 12 to 24 years (n = 32,395), was followed for up to 1 year after self-harm. Cause of death information was obtained from the National Death Index. Repeat self-harm per 1000 person-years and suicide deaths per 100,000 person-years were determined. Hazard ratios (HRs) of repeat self-harm and suicide were estimated by Cox proportional hazards models. Suicide standardized mortality rate ratios were derived by comparison with demographically matched general population controls. The 12-month suicide standardized mortality rate ratio after self-harm was significantly higher for adolescents (46.0, 95% confidence interval [CI]: 29.9-67.9) than young adults (19.2, 95% CI: 12.7-28.0). Hazards of suicide after self-harm were significantly higher for American Indians and Alaskan Natives than non-Hispanic white patients (HR: 4.69, 95% CI: 2.41-9.13) and for self-harm patients who initially used violent methods (HR: 18.04, 95% CI: 9.92-32.80), especially firearms (HR: 35.73, 95% CI: 15.42-82.79), compared with nonviolent self-harm methods (1.00, reference). The hazards of repeat self-harm were higher for female subjects than male subjects (HR: 1.25, 95% CI: 1.18-1.33); for patients with personality disorders (HR: 1.55, 95% CI: 1.42-1.69); and for patients whose initial self-harm was treated in an inpatient setting (HR: 1.65, 95% CI: 1.49-1.83) compared with an emergency department (HR: 0.62, 95% CI: 0.55-0.69) or outpatient (1.00, reference) setting. After nonfatal self-harm, adolescents and young adults were at markedly elevated risk of suicide. Among these high-risk patients, those who used violent self-harm methods, particularly firearms, were at especially high risk, underscoring the importance of follow-up care to help ensure their safety.
Copyright © 2018 by the American Academy of Pediatrics.
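The standardized mortality rate ratio (SMR) reported above is the count of observed deaths divided by the count expected under a demographically matched general-population rate. A toy calculation; every number is invented for illustration (chosen only to echo the order of magnitude in the abstract, not taken from it):

```python
# Toy standardized mortality ratio (SMR) calculation with invented numbers
observed_deaths = 46
person_years = 25000.0
reference_rate_per_100k = 4.0  # assumed general-population suicide rate

# expected deaths = person-years at risk x reference rate
expected_deaths = person_years * reference_rate_per_100k / 100000
smr = observed_deaths / expected_deaths  # observed / expected
```

An SMR of 46 would mean the cohort died by suicide at 46 times the rate of its matched general-population comparison group.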
van der Velde, A Rogier; Gullestad, Lars; Ueland, Thor; Aukrust, Pål; Guo, Yu; Adourian, Aram; Muntendam, Pieter; van Veldhuisen, Dirk J; de Boer, Rudolf A
2013-03-01
In several cross-sectional analyses, circulating baseline levels of galectin-3, a protein involved in myocardial fibrosis and remodeling, have been associated with increased risk for morbidity and mortality in patients with heart failure (HF). The importance and clinical use of repeated measurements of galectin-3 have not yet been reported. Plasma galectin-3 was measured at baseline and at 3 months in patients enrolled in the Controlled Rosuvastatin Multinational Trial in Heart Failure (CORONA) trial (n=1329), and at baseline and at 6 months in patients enrolled in the Coordinating Study Evaluating Outcomes of Advising and Counseling in Heart Failure (COACH) trial (n=324). Patient results were analyzed by categorical and percentage changes in galectin-3 level. A threshold value of 17.8 ng/mL or a 15% change from baseline was used to categorize patients. Increasing galectin-3 levels over time, from a low to a high galectin-3 category, were associated with significantly more HF hospitalization and mortality compared with stable or decreasing galectin-3 levels (hazard ratio in CORONA, 1.60; 95% confidence interval, 1.13-2.25; P=0.007; hazard ratio in COACH, 2.38; 95% confidence interval, 1.02-5.55; P=0.046). In addition, patients whose galectin-3 increased by >15% between measurements had a 50% higher relative hazard of adverse events than those whose galectin-3 stayed within ±15% of the baseline value, independent of age, sex, diabetes mellitus, left ventricular ejection fraction, renal function, medication (β-blocker, angiotensin-converting enzyme inhibitor, and angiotensin receptor blocker), and N-terminal pro-brain natriuretic peptide (hazard ratio in CORONA, 1.50; 95% confidence interval, 1.17-1.92; P=0.001). The impact of changing galectin-3 levels on other secondary end points was comparable.
In 2 large cohorts of patients with chronic and acute decompensated HF, repeated measurements of galectin-3 level provided important and significant prognostic value in identifying patients with HF at elevated risk for subsequent HF morbidity and mortality.
Markers of nutritional status and mortality in older adults: The role of anemia and hypoalbuminemia.
Corona, Ligiana Pires; de Oliveira Duarte, Yeda Aparecida; Lebrão, Maria Lúcia
2018-01-01
The aim of the present study was to analyze the impact of anemia and hypoalbuminemia on mortality over a 5-year period. This was a longitudinal, population-based observational survey, part of the Saúde, Bem-Estar e Envelhecimento (Health, Well-being and Aging) study, carried out with 1256 older adults from the third wave of the cohort in Sao Paulo, Brazil, who were followed for 5 years until they were contacted for the fourth wave. Anemia was defined as hemoglobin <12 g/dL for women and <13 g/dL for men, and hypoalbuminemia as serum albumin <3.5 g/dL. Survival functions were estimated according to nutritional status in four groups: (i) without nutritional alteration; (ii) anemia only; (iii) hypoalbuminemia only; and (iv) anemia and hypoalbuminemia. Hazard ratios were calculated with the Cox proportional hazards model, controlling for baseline covariates. All analyses considered sample weights and were carried out using Stata 12. After the 5-year period, 12.3% of the participants had died, and 8.2% were lost to follow-up. Those who died had lower hemoglobin and albumin concentrations (13.4 g/dL and 3.7 g/dL) compared with survivors (14.3 g/dL and 3.9 g/dL; P < 0.001). The crude death rate was 27.6/1000 person-years in group i, 124.3 in group ii, 116.0 in group iii and 222.8 in group iv (P < 0.001). In the final Cox models, groups ii and iii had similar effects (hazard ratio 2.23, P = 0.020 and 2.53, P = 0.005, respectively), and group iv had a higher risk (hazard ratio 3.36; P = 0.004). Anemia and hypoalbuminemia are important markers of death risk in older adults and have an additive effect on mortality. Because they are common and cost-effective biomarkers, their use should be encouraged in geriatric evaluation by all health professionals and in population settings such as primary care. Geriatr Gerontol Int 2018; 18: 177-182. © 2017 Japan Geriatrics Society.
Induced abortion and breast cancer among parous women: a Danish cohort study.
Braüner, Christina Marie; Overvad, Kim; Tjønneland, Anne; Attermann, Jørn
2013-06-01
We investigated whether induced abortion is associated with breast cancer when lifestyle confounders, including smoking and alcohol consumption, are adjusted for, in a prospective cohort study of 25,576 Danish women from the Diet, Cancer and Health study. We obtained exposure data from baseline questionnaires completed by the women between 1993 and 1997. Information on breast cancer and emigration was retrieved from Danish national registries. The study power was approximately 85% for a minimum detectable hazard ratio of 1.2. The outcome was the long-term effect of induced abortion on the risk of breast cancer among women above 50 years of age. During follow-up of approximately 12 years, 1215 women were diagnosed with breast cancer. When comparing parous women who had an abortion with parous women who never had one, there was no association between induced abortion (ever vs. never) and breast cancer risk (hazard ratio 0.95; 95% confidence interval 0.83-1.09), regardless of whether the abortion occurred before the first birth (hazard ratio 0.86; 95% confidence interval 0.65-1.14) or after it (hazard ratio 0.97; 95% confidence interval 0.84-1.13). Our study did not show evidence of an association between induced abortion and breast cancer risk. © 2013 The Authors Acta Obstetricia et Gynecologica Scandinavica © 2013 Nordic Federation of Societies of Obstetrics and Gynecology.
Gender-related Long-term Differences after Open Infrainguinal Surgery for Critical Limb Ischemia.
Lejay, A; Schaeffer, M; Georg, Y; Lucereau, B; Roussin, M; Girsowicz, E; Delay, C; Schwein, A; Thaveau, F; Geny, B; Chakfe, N
2015-10-01
The role of gender in long-term outcomes of infrainguinal open surgery for critical limb ischemia (CLI) remains uncertain. The aim of this study was to evaluate gender-specific differences in patient characteristics and long-term clinical outcomes, in terms of survival, primary patency and limb salvage, among patients undergoing infrainguinal open surgery for CLI. All consecutive patients undergoing infrainguinal open surgery for CLI between 2003 and 2012 were included. Survival, limb salvage and primary patency rates were assessed. Independent outcome determinants were identified with Cox proportional hazards models using age and gender as adjustment factors. 584 patients (269 women and 315 men, mean age 76 and 71 years respectively) underwent 658 infrainguinal open procedures (313 in women and 345 in men). The survival rate at 6 years was lower among women than men (53.5% vs 70.9%, p < 0.001). The same applied to primary patency (35.9% vs 52.4%, p < 0.001) and limb salvage (54.3% vs 81.1%, p < 0.001) at 6 years. Female gender was an independent predictor of death (hazard ratio 1.50), thrombosis (hazard ratio 2.37) and limb loss (hazard ratio 7.05) in age- and gender-adjusted analysis. Gender-related disparity in outcomes of open surgical revascularization for CLI persists. Copyright © 2015 European Society for Vascular Surgery. Published by Elsevier Ltd. All rights reserved.
Jafri, Nazia F; Newitt, David C; Kornak, John; Esserman, Laura J; Joe, Bonnie N; Hylton, Nola M
2014-08-01
To evaluate optimal contrast kinetics thresholds for measuring functional tumor volume (FTV) by breast magnetic resonance imaging (MRI) for assessment of recurrence-free survival (RFS). In this Institutional Review Board (IRB)-approved retrospective study of 64 patients (ages 29-72 years, median 48.6) undergoing neoadjuvant chemotherapy (NACT) for breast cancer, all patients underwent breast MRI before (MRI1) and after (MRI4) chemotherapy. Tumor was defined as voxels meeting thresholds for early percent enhancement (PEthresh) and early-to-late signal enhancement ratio (SERthresh); FTV(PEthresh, SERthresh) was computed by summing all voxels meeting the threshold criteria and minimum connectivity requirements. Ranges of PEthresh from 50% to 220% and SERthresh from 0.0 to 2.0 were evaluated. A Cox proportional hazards model determined associations between change in FTV over treatment and RFS at different PE and SER thresholds. The plot of hazard ratios for change in FTV from MRI1 to MRI4 showed a broad peak, with the maximum hazard ratio and highest significance occurring at a PE threshold of 70% and an SER threshold of 1.0 (hazard ratio = 8.71, 95% confidence interval 2.86-25.5, P < 0.00015), indicating optimal model fit. Enhancement thresholds affect the ability of MRI tumor volume to predict RFS. The value is robust over a wide range of thresholds, supporting the use of FTV as a biomarker. © 2013 Wiley Periodicals, Inc.
Heavy Metal Risk Management: Case Analysis
Kim, Ji Ae; Lee, Seung Ha; Choi, Seung Hyun; Jung, Ki Kyung; Park, Mi Sun; Jeong, Ji Yoon; Hwang, Myung Sil; Yoon, Hae Jung; Choi, Dal Woong
2012-01-01
To inform practical policy and the control of heavy metals, we examined the institutions responsible for hazard control in each country, the current state of control by country, and the current state of control for each heavy metal, and compared and analyzed hazard-control cases across countries. In certain countries (e.g., the U.S., the U.K., and Japan), hazardous substances found in foods (e.g., arsenic, lead, cadmium, and mercury) are regulated. In addition, the Joint FAO/WHO Expert Committee on Food Additives (JECFA) recommends calculating the provisional tolerable weekly intake (PTWI) of individual heavy metals, rather than an acceptable daily intake (ADI), to compare pollution levels in a way that accounts for toxicity accumulating in the human body. In Korea, exposure assessments have been conducted, and in other countries hazardous substances are controlled by various governing bodies. Thus, in Korea and elsewhere, food heavy metal monitoring and human exposure assessments are conducted, and reduction measures are prepared accordingly. To reduce the danger of hazardous substances, many countries provide leaflets and guidelines, develop intake recommendations for hazardous heavy metals, and take necessary actions. Analysis of hazard-control cases can help secure consumer safety by establishing systematic and reliable hazard-control methods. PMID:24278603
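A PTWI comparison of the kind JECFA recommends amounts to converting estimated daily dietary intakes into a weekly intake per kilogram of body weight and comparing that with the PTWI. A minimal sketch; every value below (body weight, PTWI, food intakes) is a hypothetical placeholder, not a regulatory figure:

```python
# Hypothetical PTWI check. All numbers are invented placeholders.
body_weight_kg = 60.0
ptwi_ug_per_kg_bw = 25.0  # assumed PTWI, ug per kg body weight per week

# assumed daily dietary intakes of one heavy metal, ug/day
daily_intake_ug = {"rice": 10.0, "fish": 6.0, "vegetables": 4.0}

# weekly intake per kg body weight, then expressed as a share of the PTWI
weekly_intake_ug_per_kg = 7 * sum(daily_intake_ug.values()) / body_weight_kg
percent_of_ptwi = 100 * weekly_intake_ug_per_kg / ptwi_ug_per_kg_bw
```

Because the PTWI is a weekly, body-weight-normalized limit, it tolerates day-to-day variation while still bounding the cumulative exposure that matters for metals that accumulate in the body.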
Federal Register 2010, 2011, 2012, 2013, 2014
2013-04-26
Notice announcing a 120-day extension of the comment period for the proposed rule "... Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food."
Aga, Cathrine; Kartus, Jüri-Tomas; Lind, Martin; Lygre, Stein Håkon Låstad; Granan, Lars-Petter; Engebretsen, Lars
2017-10-01
Double-bundle anterior cruciate ligament (ACL) reconstruction has demonstrated improved biomechanical properties and moderately better objective outcomes compared with single-bundle reconstructions. This could make an impact on the rerupture rate and reduce the risk of revision in patients undergoing double-bundle ACL reconstruction compared with patients reconstructed with a traditional single-bundle technique. The National Knee Ligament Registers in Scandinavia provide information that can be used to evaluate the revision outcome after ACL reconstructions. The purposes of the study were (1) to compare the risk of revision between double-bundle and single-bundle reconstructions, reconstructed with autologous hamstring tendon grafts; (2) to compare the risk of revision between double-bundle hamstring tendon and single-bundle bone-patellar tendon-bone autografts; and (3) to compare the hazard ratios for the same two research questions after Cox regression analysis was performed. Data on primary ACL reconstructions from the National Knee Ligament Registers in Denmark, Norway, and Sweden from July 1, 2005, to December 31, 2014, were retrospectively analyzed. A total of 60,775 patients were included in the study; 994 patients were reconstructed with double-bundle hamstring tendon grafts, 51,991 with single-bundle hamstring tendon grafts, and 7790 with single-bundle bone-patellar tendon-bone grafts. The double-bundle ACL-reconstructed patients were compared with the two other groups. The risk of revision for each research question was expressed as the risk ratio, hazard ratio, and the corresponding 95% confidence intervals. Kaplan-Meier analysis was used to estimate survival at 1, 2, and 5 years for the three groups. Furthermore, a Cox proportional hazards regression model was applied, and the hazard ratios were adjusted for country, age, sex, meniscal or chondral injury, and the fixation devices utilized on the femoral and tibial sides.
There were no differences in the crude risk of revision between patients undergoing the double-bundle technique and the two other groups. A total of 3.7% of patients were revised in the double-bundle group (37 of 994 patients) versus 3.8% in the single-bundle hamstring tendon group (1952 of 51,991; risk ratio, 1.01; 95% confidence interval (CI), 0.73-1.39; p = 0.96) and 2.8% in the bone-patellar tendon-bone group (219 of 7790 patients; risk ratio, 0.76; 95% CI, 0.54-1.06; p = 0.11). Cox regression analysis with adjustment for country, age, sex, meniscal or chondral injury, and the fixation devices utilized on the femoral and tibial sides did not reveal any further difference in the risk of revision between the single-bundle and double-bundle hamstring tendon groups (hazard ratio, 1.18; 95% CI, 0.85-1.62; p = 0.33), but the adjusted hazard ratio showed a lower risk of revision in the single-bundle bone-patellar tendon-bone group compared with the double-bundle group (hazard ratio, 0.62; 95% CI, 0.43-0.90; p = 0.01). Comparison of graft revision rates reported separately for each country revealed that in Sweden double-bundle hamstring tendon reconstructions had a lower hazard ratio than single-bundle hamstring tendon reconstructions (hazard ratio, 1.00 versus 1.89; 95% CI, 1.09-3.29; p = 0.02). Survival at 5 years after index surgery was 96.0% for the double-bundle group, 95.4% for the single-bundle hamstring tendon group, and 97.0% for the single-bundle bone-patellar tendon-bone group. Based on data from all three national registers, the risk of revision was not influenced by reconstruction technique in terms of single- versus double-bundle hamstring tendons, although national differences in survival existed. Using bone-patellar tendon-bone grafts lowered the risk of revision compared with double-bundle hamstring tendon grafts.
These findings should be considered when deciding what reconstruction technique to use in ACL-deficient knees. Future studies identifying the reasons for graft rerupture in single- and double-bundle reconstructions would be of interest to understand the findings of the present study. Level III, therapeutic study.
Espelt, Albert; Marí-Dell'Olmo, Marc; Penelo, Eva; Bosque-Prous, Marina
2016-06-14
To examine the differences between the prevalence ratio (PR) and the odds ratio (OR) in a cross-sectional study, and to provide tools to calculate the PR using two statistical packages widely used in substance use research (Stata and R). We used cross-sectional data from 41,263 participants from 16 European countries in the Survey of Health, Ageing and Retirement in Europe (SHARE). The dependent variable, hazardous drinking, was derived from the Alcohol Use Disorders Identification Test - Consumption (AUDIT-C). The main independent variable was gender; other variables were age, educational level and country of residence. The PR of hazardous drinking in men relative to women was estimated using the Mantel-Haenszel method, log-binomial regression models and Poisson regression models with robust variance, and these estimates were compared with the OR calculated using logistic regression models. The prevalence of hazardous drinking varied among countries. In general, men had a higher prevalence of hazardous drinking than women [PR = 1.43 (1.38-1.47)]. The estimated PR was identical regardless of the method and statistical package used. The OR, however, overestimated the PR to a degree that depended on the prevalence of hazardous drinking in each country. In cross-sectional studies comparing countries that differ in the prevalence of the disease or condition, it is advisable to use the PR instead of the OR.
Pinkston, Christina M; Baumgartner, Richard N; Connor, Avonne E; Boone, Stephanie D; Baumgartner, Kathy B
2015-12-01
We investigated the association of physical activity with survival for 601 Hispanic women and 682 non-Hispanic white women who participated in the population-based New Mexico Women's Health Study, a breast cancer case-control study. We identified 240 deaths among cases diagnosed with a first primary invasive breast cancer between 1992 and 1994, and 88 deaths among controls. Follow-up extended through 2012 for cases and 2008 for controls. Multivariable hazard ratios (HRs) and 95% confidence intervals (CIs) were estimated using Cox proportional hazards regression. Higher levels of total physical activity were inversely associated with all-cause mortality among Hispanic cases (Quartile (Q)4: HR = 0.55, 95% CI 0.31-0.99). A non-significant trend was observed for recreational activity in Hispanic cases as well (Q4: HR = 0.50, 95% CI 0.23-1.09, p for trend = 0.08). No significant associations were noted for non-Hispanic white cases or for controls. The results suggest that increasing physical activity may be protective against mortality in Hispanic women with breast cancer, despite their reporting lower levels of recreational activity than non-Hispanic white women or Hispanic controls. Public health programs in Hispanic communities should promote physical activity in women as a means of decreasing breast cancer risk and improving survival.
Kucharska-Newton, Anna M.; Rosamond, Wayne D.; Schroeder, Jane C.; McNeill, Ann Marie; Coresh, Josef; Folsom, Aaron R.
2008-01-01
This study examined prospectively the association of baseline plasma HDL-cholesterol levels with the incidence of lung cancer in 14,547 members of the Atherosclerosis Risk in Communities (ARIC) cohort. There were 259 cases of incident lung cancer identified during follow-up from 1987 through 2000. Results indicated a relatively weak inverse association of HDL-cholesterol with lung cancer that was dependent on smoking status. The hazard ratio of lung cancer incidence in relation to low HDL-cholesterol, adjusted for race, gender, exercise, alcohol consumption, body mass index, triglycerides, age, and cigarette pack-years of smoking, was 1.45 (95% confidence interval 1.10, 1.92). This association was observed among former smokers (hazard ratio: 1.77, 95% confidence interval 1.05, 2.97) but not current smokers. The number of cases among never smokers (n=13) was too small for meaningful interpretation of effect estimates. Excluding cases occurring within five years of baseline did not appreciably change the point estimates, suggesting a lack of reverse causality. The modest association of low plasma HDL-cholesterol with greater incident lung cancer observed in this study is in agreement with existing case-control studies. PMID:18342390
Nadeau-Fredette, Annie-Claire; Hawley, Carmel M.; Pascoe, Elaine M.; Chan, Christopher T.; Clayton, Philip A.; Polkinghorne, Kevan R.; Boudville, Neil; Leblanc, Martine
2015-01-01
Background and objectives Home dialysis is often recognized as a first-choice therapy for patients initiating dialysis. However, studies comparing clinical outcomes between peritoneal dialysis and home hemodialysis have been very limited. Design, setting, participants, & measurements This Australia and New Zealand Dialysis and Transplantation Registry study assessed all Australian and New Zealand adult patients receiving home dialysis on day 90 after initiation of RRT between 2000 and 2012. The primary outcome was overall survival. The secondary outcomes were on-treatment survival, patient and technique survival, and death-censored technique survival. All results were adjusted with three prespecified models: multivariable Cox proportional hazards model (main model), propensity score quintile–stratified model, and propensity score–matched model. Results The study included 10,710 patients on incident peritoneal dialysis and 706 patients on incident home hemodialysis. Treatment with home hemodialysis was associated with better patient survival than treatment with peritoneal dialysis (5-year survival: 85% versus 44%, respectively; log-rank P<0.001). Using multivariable Cox proportional hazards analysis, home hemodialysis was associated with superior patient survival (hazard ratio for overall death, 0.47; 95% confidence interval, 0.38 to 0.59) as well as better on-treatment survival (hazard ratio for on-treatment death, 0.34; 95% confidence interval, 0.26 to 0.45), composite patient and technique survival (hazard ratio for death or technique failure, 0.34; 95% confidence interval, 0.29 to 0.40), and death-censored technique survival (hazard ratio for technique failure, 0.34; 95% confidence interval, 0.28 to 0.41). Similar results were obtained with the propensity score models as well as sensitivity analyses using competing risks models and different definitions for technique failure and lag period after modality switch, during which events were attributed to the initial modality. Conclusions Home hemodialysis was associated with superior patient and technique survival compared with peritoneal dialysis. PMID:26068181
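The Cox regression results quoted throughout these records all rest on the same log-scale arithmetic: a fitted coefficient and its standard error are exponentiated to give the hazard ratio and its normal-approximation confidence limits. A minimal sketch (the helper name `hazard_ratio_ci` and the coefficient values are illustrative assumptions, chosen so the output matches the overall-death estimate of 0.47 [0.38 to 0.59] reported above):

```python
import math

def hazard_ratio_ci(beta, se, z=1.96):
    """Exponentiate a Cox log-hazard coefficient and its
    normal-approximation 95% confidence limits."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative log-HR of -0.755 with SE 0.112
hr, lo, hi = hazard_ratio_ci(-0.755, 0.112)
# hr ≈ 0.47, with limits ≈ 0.38 to 0.59
```

Reversing the arithmetic (taking logs of a published hazard ratio and its limits) recovers an approximate standard error, which is how estimates from different studies are placed on a common scale in pooled analyses.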
Yoon, Chang-Yun; Noh, Juhwan; Jhee, Jong Hyun; Chang, Tae Ik; Kang, Ea Wha; Kee, Youn Kyung; Kim, Hyoungnae; Park, Seohyun; Yun, Hae-Ryong; Jung, Su-Young; Oh, Hyung Jung; Park, Jung Tak; Han, Seung Hyeok; Kang, Shin-Wook; Kim, Changsoo; Yoo, Tae-Hyun
2017-09-01
The aim of this study is to elucidate the effects of warfarin use in patients with atrial fibrillation undergoing dialysis using a population-based Korean registry. Data were extracted from the Health Insurance Review and Assessment Service, which is a nationwide, mandatory social insurance database of all Korean citizens enrolled in the National Health Information Service between 2009 and 2013. Thromboembolic and hemorrhagic outcomes were analyzed according to warfarin use. Overall and propensity score-matched cohorts were analyzed by Cox proportional hazards models. Among 9974 hemodialysis patients with atrial fibrillation, the mean age was 66.6±12.2 years, 5806 (58.2%) were men, and 2921 (29.3%) used warfarin. After propensity score matching to adjust for all described baseline differences, 5548 subjects remained, and differences in baseline variables were distributed equally between warfarin users and nonusers. During a mean follow-up duration of 15.9±11.1 months, ischemic and hemorrhagic stroke occurred in 678 (6.8%) and 227 (2.3%) patients, respectively. In a multiple Cox model, warfarin use was significantly associated with an increased risk of hemorrhagic stroke (hazard ratio, 1.44; 95% confidence interval, 1.09-1.91; P=0.010) in the overall cohort. Furthermore, a significant relationship between warfarin use and hemorrhagic stroke was found in propensity-matched subjects (hazard ratio, 1.56; 95% confidence interval, 1.10-2.22; P=0.013). However, the ratios for ischemic stroke were not significantly different in either the propensity-matched (hazard ratio, 0.95; 95% confidence interval, 0.78-1.15; P=0.569) or overall cohort (hazard ratio, 1.06; 95% confidence interval, 0.90-1.26; P=0.470). Our findings suggest that warfarin should be used carefully in hemodialysis patients, given the higher risk of hemorrhagic events and the lack of ability to prevent thromboembolic complications. © 2017 American Heart Association, Inc.
Patel, Siddharth; Kwak, Lucia; Agarwal, Sunil K; Tereshchenko, Larisa G; Coresh, Josef; Soliman, Elsayed Z; Matsushita, Kunihiro
2017-11-03
A few studies have recently reported clockwise and counterclockwise rotations of QRS transition zone as predictors of mortality. However, their prospective correlates and associations with individual cardiovascular disease (CVD) outcomes are yet to be investigated. Among 13,567 ARIC (Atherosclerosis Risk in Communities) study participants aged 45 to 64 years, we studied key correlates of changes in the status of clockwise and counterclockwise rotation over time as well as the association of rotation status with incidence of coronary heart disease (2408 events), heart failure (2196 events), stroke (991 events), composite CVD (4124 events), 898 CVD deaths, and 3469 non-CVD deaths over 23 years of follow-up. At baseline, counterclockwise rotation was most prevalent (52.9%), followed by no (40.5%) and clockwise (6.6%) rotation. Of patients with no rotation, 57.9% experienced counterclockwise or clockwise rotation during follow-up, with diabetes mellitus and black race significantly predicting clockwise and counterclockwise conversion, respectively. Clockwise rotation was significantly associated with higher risk of heart failure (hazard ratio, 1.20; 95% confidence interval [CI], 1.02-1.41) and non-CVD death (hazard ratio, 1.28; 95% CI, 1.12-1.46) after adjusting for potential confounders including other ECG parameters. On the contrary, counterclockwise rotation was significantly related to lower risk of composite CVD (hazard ratio, 0.93; 95% CI, 0.87-0.99), CVD mortality (hazard ratio, 0.76; 95% CI, 0.65-0.88), and non-CVD death (hazard ratio, 0.92; 95% CI, 0.85-0.99 [borderline significance with heart failure]). Counterclockwise rotation, the most prevalent QRS transition zone pattern, demonstrated the lowest risk of CVD and mortality, whereas clockwise rotation was associated with the highest risk of heart failure and non-CVD mortality. These results have implications for how QRS transition zone rotation should be interpreted when an ECG is recorded. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
Aspirin and extended-release dipyridamole versus clopidogrel for recurrent stroke.
Sacco, Ralph L; Diener, Hans-Christoph; Yusuf, Salim; Cotton, Daniel; Ounpuu, Stephanie; Lawton, William A; Palesch, Yuko; Martin, Reneé H; Albers, Gregory W; Bath, Philip; Bornstein, Natan; Chan, Bernard P L; Chen, Sien-Tsong; Cunha, Luis; Dahlöf, Björn; De Keyser, Jacques; Donnan, Geoffrey A; Estol, Conrado; Gorelick, Philip; Gu, Vivian; Hermansson, Karin; Hilbrich, Lutz; Kaste, Markku; Lu, Chuanzhen; Machnig, Thomas; Pais, Prem; Roberts, Robin; Skvortsova, Veronika; Teal, Philip; Toni, Danilo; Vandermaelen, Cam; Voigt, Thor; Weber, Michael; Yoon, Byung-Woo
2008-09-18
Recurrent stroke is a frequent, disabling event after ischemic stroke. This study compared the efficacy and safety of two antiplatelet regimens--aspirin plus extended-release dipyridamole (ASA-ERDP) versus clopidogrel. In this double-blind, 2-by-2 factorial trial, we randomly assigned patients to receive 25 mg of aspirin plus 200 mg of extended-release dipyridamole twice daily or to receive 75 mg of clopidogrel daily. The primary outcome was first recurrence of stroke. The secondary outcome was a composite of stroke, myocardial infarction, or death from vascular causes. Sequential statistical testing of noninferiority (margin of 1.075), followed by superiority testing, was planned. A total of 20,332 patients were followed for a mean of 2.5 years. Recurrent stroke occurred in 916 patients (9.0%) receiving ASA-ERDP and in 898 patients (8.8%) receiving clopidogrel (hazard ratio, 1.01; 95% confidence interval [CI], 0.92 to 1.11). The secondary outcome occurred in 1333 patients (13.1%) in each group (hazard ratio for ASA-ERDP, 0.99; 95% CI, 0.92 to 1.07). There were more major hemorrhagic events among ASA-ERDP recipients (419 [4.1%]) than among clopidogrel recipients (365 [3.6%]) (hazard ratio, 1.15; 95% CI, 1.00 to 1.32), including intracranial hemorrhage (hazard ratio, 1.42; 95% CI, 1.11 to 1.83). The net risk of recurrent stroke or major hemorrhagic event was similar in the two groups (1194 ASA-ERDP recipients [11.7%], vs. 1156 clopidogrel recipients [11.4%]; hazard ratio, 1.03; 95% CI, 0.95 to 1.11). The trial did not meet the predefined criteria for noninferiority but showed similar rates of recurrent stroke with ASA-ERDP and with clopidogrel. There is no evidence that either of the two treatments was superior to the other in the prevention of recurrent stroke. (ClinicalTrials.gov number, NCT00153062.) 2008 Massachusetts Medical Society
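The sequential noninferiority test described in this trial reduces to comparing the hazard-ratio confidence bound against the prespecified margin of 1.075. A minimal sketch (the function name is hypothetical; it assumes, consistent with the trial's design, that noninferiority requires the entire 95% CI to lie below the margin):

```python
def meets_noninferiority(ci_upper, margin=1.075):
    """Declare noninferiority only if the upper 95% confidence
    limit of the hazard ratio falls below the prespecified margin."""
    return ci_upper < margin

# Recurrent stroke: hazard ratio 1.01, 95% CI 0.92 to 1.11
result = meets_noninferiority(1.11)
# → False: the criterion was not met, as the trial reported
```

With the reported CI of 0.92 to 1.11, the upper bound exceeds 1.075, which is why the trial "did not meet the predefined criteria for noninferiority" despite near-identical event rates in the two groups.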
Low-Risk Lifestyle, Coronary Calcium, Cardiovascular Events, and Mortality: Results From MESA
Ahmed, Haitham M.; Blaha, Michael J.; Nasir, Khurram; Jones, Steven R.; Rivera, Juan J.; Agatston, Arthur; Blankstein, Ron; Wong, Nathan D.; Lakoski, Susan; Budoff, Matthew J.; Burke, Gregory L.; Sibley, Christopher T.; Ouyang, Pamela; Blumenthal, Roger S.
2013-01-01
Unhealthy lifestyle habits are a major contributor to coronary artery disease. The purpose of the present study was to investigate the associations of smoking, weight maintenance, physical activity, and diet with coronary calcium, cardiovascular events, and mortality. US participants who were 44–84 years of age (n = 6,229) were followed in the Multi-Ethnic Study of Atherosclerosis from 2000 to 2010. A lifestyle score ranging from 0 to 4 was created using diet, exercise, body mass index, and smoking status. Coronary calcium was measured at baseline and a mean of 3.1 (standard deviation, 1.3) years later to assess calcium progression. Participants who experienced coronary events or died were followed for a median of 7.6 (standard deviation, 1.5) years. Participants with lifestyle scores of 1, 2, 3, and 4 were found to have mean adjusted annual calcium progressions that were 3.5 (95% confidence interval (CI): 0.0, 7.0), 4.2 (95% CI: 0.6, 7.9), 6.8 (95% CI: 2.0, 11.5), and 11.1 (95% CI: 2.2, 20.1) points per year slower, respectively, relative to the reference group (P = 0.003). Unadjusted hazard ratios for death by lifestyle score were as follows: for a score of 1, the hazard ratio was 0.79 (95% CI: 0.61, 1.03); for a score of 2, the hazard ratio was 0.61 (95% CI: 0.46, 0.81); for a score of 3, the hazard ratio was 0.49 (95% CI: 0.32, 0.75); and for a score of 4, the hazard ratio was 0.19 (95% CI: 0.05, 0.75) (P < 0.001 by log-rank test). In conclusion, a combination of regular exercise, healthy diet, smoking avoidance, and weight maintenance was associated with lower coronary calcium incidence, slower calcium progression, and lower all-cause mortality over 7.6 years. PMID:23733562
Gratwohl, Alois; Brand, Ronald; McGrath, Eoin; van Biezen, Anja; Sureda, Anna; Ljungman, Per; Baldomero, Helen; Chabannon, Christian; Apperley, Jane
2014-01-01
Competent authorities, healthcare payers and hospitals devote increasing resources to quality management systems but scientific analyses searching for an impact of these systems on clinical outcome remain scarce. Earlier data indicated a stepwise improvement in outcome after allogeneic hematopoietic stem cell transplantation with each phase of the accreditation process for the quality management system “JACIE”. We therefore tested the hypothesis that working towards and achieving “JACIE” accreditation would accelerate improvement in outcome over calendar time. Overall mortality of the entire cohort of 107,904 patients who had a transplant (41,623 allogeneic, 39%; 66,281 autologous, 61%) between 1999 and 2006 decreased over the 14-year observation period by a factor of 0.63 per 10 years (hazard ratio: 0.63; 0.58–0.69). Considering “JACIE”-accredited centers as those with programs having achieved accreditation by November 2012, at the latest, this improvement was significantly faster in “JACIE”-accredited centers than in non-accredited centers (approximately 5.3% per year for 49,459 patients versus approximately 3.5% per year for 58,445 patients, respectively; hazard ratio: 0.83; 0.71–0.97). As a result, relapse-free survival (hazard ratio 0.85; 0.75–0.95) and overall survival (hazard ratio 0.86; 0.76–0.98) were significantly higher at 72 months for those patients transplanted in the 162 “JACIE”-accredited centers. No significant effects were observed after autologous transplants (hazard ratio 1.06; 0.99–1.13). Hence, working towards implementation of a quality management system triggers a dynamic process associated with a steeper reduction in mortality over the years and a significantly improved survival after allogeneic stem cell transplantation. Our data support the use of a quality management system for complex medical procedures. PMID:24488562
Dronedarone in high-risk permanent atrial fibrillation.
Connolly, Stuart J; Camm, A John; Halperin, Jonathan L; Joyner, Campbell; Alings, Marco; Amerena, John; Atar, Dan; Avezum, Álvaro; Blomström, Per; Borggrefe, Martin; Budaj, Andrzej; Chen, Shih-Ann; Ching, Chi Keong; Commerford, Patrick; Dans, Antonio; Davy, Jean-Marc; Delacrétaz, Etienne; Di Pasquale, Giuseppe; Diaz, Rafael; Dorian, Paul; Flaker, Greg; Golitsyn, Sergey; Gonzalez-Hermosillo, Antonio; Granger, Christopher B; Heidbüchel, Hein; Kautzner, Josef; Kim, June Soo; Lanas, Fernando; Lewis, Basil S; Merino, Jose L; Morillo, Carlos; Murin, Jan; Narasimhan, Calambur; Paolasso, Ernesto; Parkhomenko, Alexander; Peters, Nicholas S; Sim, Kui-Hian; Stiles, Martin K; Tanomsup, Supachai; Toivonen, Lauri; Tomcsányi, János; Torp-Pedersen, Christian; Tse, Hung-Fat; Vardas, Panos; Vinereanu, Dragos; Xavier, Denis; Zhu, Jun; Zhu, Jun-Ren; Baret-Cormel, Lydie; Weinling, Estelle; Staiger, Christoph; Yusuf, Salim; Chrolavicius, Susan; Afzal, Rizwan; Hohnloser, Stefan H
2011-12-15
Dronedarone restores sinus rhythm and reduces hospitalization or death in intermittent atrial fibrillation. It also lowers heart rate and blood pressure and has antiadrenergic and potential ventricular antiarrhythmic effects. We hypothesized that dronedarone would reduce major vascular events in high-risk permanent atrial fibrillation. We assigned patients who were at least 65 years of age with at least a 6-month history of permanent atrial fibrillation and risk factors for major vascular events to receive dronedarone or placebo. The first coprimary outcome was stroke, myocardial infarction, systemic embolism, or death from cardiovascular causes. The second coprimary outcome was unplanned hospitalization for a cardiovascular cause or death. After the enrollment of 3236 patients, the study was stopped for safety reasons. The first coprimary outcome occurred in 43 patients receiving dronedarone and 19 receiving placebo (hazard ratio, 2.29; 95% confidence interval [CI], 1.34 to 3.94; P=0.002). There were 21 deaths from cardiovascular causes in the dronedarone group and 10 in the placebo group (hazard ratio, 2.11; 95% CI, 1.00 to 4.49; P=0.046), including death from arrhythmia in 13 patients and 4 patients, respectively (hazard ratio, 3.26; 95% CI, 1.06 to 10.00; P=0.03). Stroke occurred in 23 patients in the dronedarone group and 10 in the placebo group (hazard ratio, 2.32; 95% CI, 1.11 to 4.88; P=0.02). Hospitalization for heart failure occurred in 43 patients in the dronedarone group and 24 in the placebo group (hazard ratio, 1.81; 95% CI, 1.10 to 2.99; P=0.02). Dronedarone increased rates of heart failure, stroke, and death from cardiovascular causes in patients with permanent atrial fibrillation who were at risk for major vascular events. Our data show that this drug should not be used in such patients. (Funded by Sanofi-Aventis; PALLAS ClinicalTrials.gov number, NCT01151137.).
Bruun, C; Guassora, A D; Nielsen, A B S; Siersma, V; Holstein, P E; de Fine Olivarius, N
2014-11-01
To investigate the predictive value of both patients' motivation and effort in their management of Type 2 diabetes and their life circumstances for the development of foot ulcers and amputations. This study was based on the Diabetes Care in General Practice study and Danish population and health registers. The associations between patient motivation, effort and life circumstances and foot ulcer prevalence 6 years after diabetes diagnosis and the incidence of amputation in the following 13 years were analysed using odds ratios from logistic regression and hazard ratios from Cox regression models, respectively. Foot ulcer prevalence 6 years after diabetes diagnosis was 2.93% (95% CI 1.86-4.00) among 956 patients. General practitioners' indication of 'poor' vs 'very good' patient motivation for diabetes management was associated with higher foot ulcer prevalence (odds ratio 6.11, 95% CI 1.22-30.61). The same trend was seen for 'poor' vs 'good' influence of the patient's own effort in diabetes treatment (odds ratio 7.06, 95% CI 2.65-18.84). Of 1058 patients examined at 6-year follow-up, 45 experienced amputation during the following 13 years. 'Poor' vs 'good' influence of the patients' own effort was associated with amputation (hazard ratio 7.12, 95% CI 3.40-14.92). When general practitioners assessed the influence of patients' life circumstances as 'poor' vs 'good', the amputation incidence increased (hazard ratio 2.97, 95% CI 1.22-7.24). 'Poor' vs 'very good' patient motivation was also associated with a higher amputation incidence (hazard ratio 7.57, 95% CI 2.43-23.57), although not in fully adjusted models. General practitioners' existing knowledge of patients' life circumstances, motivation and effort in diabetes management should be included in treatment strategies to prevent foot complications. © 2014 The Authors. Diabetic Medicine © 2014 Diabetes UK.
Ma, Huanhuan; Li, Zifu; Yin, Fubin; Kao, William; Yin, Yi; Bai, Xiaofeng
2014-01-01
Steel-mill waste rolling oil (SmWRO) is considered a hazardous substance with high treatment and disposal fees. Anaerobic digestion can not only transform this hazardous substance into activated sludge, but also generate valuable biogas. This study investigated the biochemical methane potential of SmWRO at inoculum-to-substrate VS ratios (ISRs) of 0.25, 0.5, 1, 1.5, 2 and 3, using septic tank sludge as inoculum under mesophilic and thermophilic conditions, with blank tests as controls. Specific biogas yield (mL/g VS(added)), net biogas yield (mL/g VS(removed)) and VS removal were analyzed. The ANOVA results indicated a strong influence of ISR and temperature on the studied parameters. An ISR of 1.5 at 55°C and ISRs of 1.5 and 2 at 35°C gave the highest specific biogas yields (262-265 and 303 mL/g VS(added), respectively). Kinetic analysis showed that the Gompertz model fit the experimental data best, with the smallest RMSE and largest R(2). Copyright © 2013 Elsevier Ltd. All rights reserved.
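The Gompertz fit mentioned in this abstract is conventionally the modified Gompertz equation for cumulative biogas production. A minimal sketch evaluating it (the parameter values are illustrative, not the study's fitted estimates; only the biogas potential of roughly 265 mL/g VS(added) echoes the figure reported above):

```python
import math

def gompertz(t, P, Rm, lam):
    """Modified Gompertz model: cumulative biogas yield at time t (days),
    with potential P (mL/g VS), max rate Rm (mL/g VS/day), lag lam (days)."""
    return P * math.exp(-math.exp(Rm * math.e / P * (lam - t) + 1))

# Illustrative parameters: P = 265 mL/g VS, Rm = 20, lag = 2 days
yields = [gompertz(t, 265.0, 20.0, 2.0) for t in (0, 5, 10, 30)]
# Cumulative yield rises monotonically toward the potential P
```

Fitting P, Rm and the lag time to measured cumulative yields (for example by nonlinear least squares) and comparing RMSE and R(2) across candidate models is the model-selection step the abstract describes.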
Comorbidity of phobic disorders with alcoholism in a Canadian community sample.
Sareen, J; Chartier, M; Kjernisted, K D; Stein, M B
2001-10-01
To examine the relation between phobic disorders and alcoholism in a Canadian community sample. Data came from the Mental Health Supplement of the Ontario Health Survey. The University of Michigan revision of the Composite International Diagnostic Interview (UM-CIDI) was used to diagnose DSM-III-R psychiatric disorders in 8116 Canadian respondents between ages 15 and 64 years. Since the cross-system agreement (ICD-10 and DSM-III-R or DSM-IV) on the diagnosis of alcohol abuse is much lower than that for alcohol dependence, we also examined a WHO category, "hazardous alcohol use." Logistic regression controlling for age and sex was used to determine odds ratios (ORs) for phobic disorders and alcohol-use diagnoses. Individuals with lifetime alcohol abuse or dependence had two- to threefold increased odds of having a phobic disorder. Simple phobia and social phobia with multiple fears were significantly associated (ORs 1.5 to 2) with hazardous alcohol use (which had a prevalence of approximately 10%). Given the early onset of most phobic disorders, the findings suggest that these are a risk factor for hazardous patterns of alcohol use.
Nam, Jin Young; Choi, Young; Kim, Juyeong; Cho, Kyoung Hee; Park, Eun-Cheol
2017-08-15
The relationships between breastfeeding discontinuation and cesarean section delivery, and the occurrence of postpartum depression (PPD) remain unclear. Therefore, we aimed to investigate the association of breastfeeding discontinuation and cesarean section delivery with PPD during the first 6 months after delivery. Data were extracted from the Korean National Health Insurance Service-National Sample Cohort for 81,447 women who delivered during 2004-2013. PPD status was determined using the diagnosis code at outpatient or inpatient visit during the 6-month postpartum period. Breastfeeding discontinuation and cesarean section delivery were identified from prescription of lactation suppression drugs and diagnosis, respectively. Cox proportional hazards models were used to calculate adjusted hazard ratios. Of the 81,447 women, 666 (0.82%) had PPD. PPD risk was higher in women who discontinued breastfeeding than in those who continued breastfeeding (hazard ratio=3.23, P<0.0001), in women with cesarean section delivery than in those with vaginal delivery (hazard ratio=1.26, P=0.0040), and in women with cesarean section delivery who discontinued breastfeeding than in those with vaginal delivery who continued breastfeeding (hazard ratio=4.92, P<0.0001). Study limitations include low PPD incidence; use of indirect indicators for PPD, breastfeeding discontinuation, and working status, which could introduce selection bias and errors due to miscoding; and potential lack of adjustment for important confounders. Breastfeeding discontinuation and cesarean section delivery were associated with PPD during the 6-month postpartum period. Our results support the implementation of breastfeeding promoting policies, and PPD screening and treatment programs during the early postpartum period. Copyright © 2017 Elsevier B.V. All rights reserved.
Association of Modality with Mortality among Canadian Aboriginals
Hemmelgarn, Brenda; Rigatto, Claudio; Komenda, Paul; Yeates, Karen; Promislow, Steven; Mojica, Julie; Tangri, Navdeep
2012-01-01
Background and objectives Previous studies have shown that Aboriginals and Caucasians experience similar outcomes on dialysis in Canada. Using the Canadian Organ Replacement Registry, this study examined whether dialysis modality (peritoneal or hemodialysis) impacted mortality in Aboriginal patients. Design, setting, participants, & measurements This study identified 31,576 adult patients (hemodialysis: Aboriginal=1839, Caucasian=21,430; peritoneal dialysis: Aboriginal=554, Caucasian=6769) who initiated dialysis between January of 2000 and December of 2009. Aboriginal status was identified by self-report. Dialysis modality was determined 90 days after dialysis initiation. Multivariate Cox proportional hazards and competing risk models were constructed to determine the association between race and mortality by dialysis modality. Results During the study period, 939 (51.1%) Aboriginals and 12,798 (53.3%) Caucasians initiating hemodialysis died, whereas 166 (30.0%) and 2037 (30.1%), respectively, initiating peritoneal dialysis died. Compared with Caucasians, Aboriginals on hemodialysis had a comparable risk of mortality (adjusted hazard ratio=1.04, 95% confidence interval=0.96–1.11, P=0.37). However, on peritoneal dialysis, Aboriginals experienced a higher risk of mortality (adjusted hazard ratio=1.36, 95% confidence interval=1.13–1.62, P=0.001) and technique failure (adjusted hazard ratio=1.29, 95% confidence interval=1.03–1.60, P=0.03) than Caucasians. The risk of technique failure varied by patient age, with younger Aboriginals (<50 years old) more likely to develop technique failure than Caucasians (adjusted hazard ratio=1.76, 95% confidence interval=1.23–2.52, P=0.002). Conclusions Aboriginals on peritoneal dialysis experience higher mortality and technique failure relative to Caucasians. Reasons for this race disparity in peritoneal dialysis outcomes are unclear. PMID:22997343
Olsen, Anne-Marie Schjerning; Fosbøl, Emil L; Lindhardsen, Jesper; Folke, Fredrik; Charlot, Mette; Selmer, Christian; Bjerring Olesen, Jonas; Lamberts, Morten; Ruwald, Martin H; Køber, Lars; Hansen, Peter R; Torp-Pedersen, Christian; Gislason, Gunnar H
2012-10-16
The cardiovascular risk after the first myocardial infarction (MI) declines rapidly during the first year. We analyzed whether the cardiovascular risk associated with using nonsteroidal anti-inflammatory drugs (NSAIDs) was associated with the time elapsed following first-time MI. We identified patients aged 30 years or older admitted with first-time MI in 1997 to 2009 and subsequent NSAID use by individual-level linkage of nationwide registries of hospitalization and drug dispensing from pharmacies in Denmark. We calculated the incidence rates of death and a composite end point of coronary death or nonfatal recurrent MI associated with NSAID use in 1-year time intervals up to 5 years after inclusion and analyzed risk by using multivariable adjusted time-dependent Cox proportional hazards models. Of the 99,187 patients included, 43,608 (44%) were prescribed NSAIDs after the index MI. There were 36,747 deaths and 28,693 coronary deaths or nonfatal recurrent MIs during the 5 years of follow-up. Relative to noncurrent treatment with NSAIDs, the use of any NSAID in the years following MI was persistently associated with an increased risk of death (hazard ratio, 1.59 [95% confidence interval, 1.49-1.69] after 1 year and 1.63 [95% confidence interval, 1.52-1.74] after 5 years) and of coronary death or nonfatal recurrent MI (hazard ratio, 1.30 [95% confidence interval, 1.22-1.39] after 1 year and 1.41 [95% confidence interval, 1.28-1.55] after 5 years). The use of NSAIDs is associated with persistently increased coronary risk regardless of time elapsed after first-time MI. We advise long-term caution in the use of NSAIDs for patients after MI.
Grønhøj, C; Jensen, D; Dehlendorff, C; Nørregaard, C; Andersen, E; Specht, L; Charabi, B; von Buchwald, C
2018-06-01
The distinct difference in disease phenotype between human papillomavirus-positive (HPV+) and -negative (HPV-) oropharyngeal squamous cell cancer (OPSCC) patients might also be apparent when assessing the effect of time to treatment initiation (TTI). We assessed the overall survival and progression-free survival (PFS) effect of increasing TTI for HPV+ and HPV- OPSCC patients. We examined patients who received curative-intended therapy for OPSCC in eastern Denmark between 2000 and 2014. TTI was the number of days from diagnosis to the initiation of curative treatment. Overall survival and PFS were measured from the start of treatment and estimated with the Kaplan-Meier estimator. Hazard ratios and 95% confidence intervals were estimated with Cox proportional hazard regression. At a median follow-up of 3.6 years (interquartile range 1.86-6.07 years), 1177 patients were included (59% HPV+). In the adjusted analysis for the HPV+ and HPV- patient populations, TTI influenced overall survival and PFS, most evidently in the HPV- group, where TTI >60 days significantly influenced overall survival but not PFS (overall survival: hazard ratio 1.60; 95% confidence interval 1.04-2.45; PFS: hazard ratio 1.46; 95% confidence interval 0.96-2.22). For patients with a TTI >60 days in the HPV+ group, TTI affected overall survival and PFS similarly, with slightly lower hazard ratio estimates of 1.44 (95% confidence interval 0.83-2.51) and 1.15 (95% confidence interval 0.70-1.88), respectively. For patients treated for an HPV+ or HPV- OPSCC, TTI affects outcome, with the strongest effect on overall survival among HPV- patients. Reducing TTI is an important tool to improve the prognosis. Copyright © 2018. Published by Elsevier Ltd.
Müller, G; Wellmann, J; Hartwig, S; Greiser, K H; Moebus, S; Jöckel, K-H; Schipf, S; Völzke, H; Maier, W; Meisinger, C; Tamayo, T; Rathmann, W; Berger, K
2015-08-01
To analyse the association of neighbourhood unemployment with incident self-reported physician-diagnosed Type 2 diabetes in a population aged 45-74 years from five German regions. Study participants were linked via their addresses at baseline to particular neighbourhoods. Individual-level data from five population-based studies were pooled and combined with contextual data on neighbourhood unemployment. Type 2 diabetes was assessed according to a self-reported physician diagnosis of diabetes. We estimated proportional hazard models (Weibull distribution) in order to obtain hazard ratios and 95% CIs of Type 2 diabetes mellitus, taking into account interval-censoring and clustering. We included 7250 participants residing in 228 inner city neighbourhoods in five German regions in our analysis. The incidence rate was 12.6 per 1000 person-years (95% CI 11.4-13.8). The risk of Type 2 diabetes mellitus was higher in men [hazard ratio 1.79 (95% CI 1.47-2.18)] than in women and higher in people with a low education level [hazard ratio 1.55 (95% CI 1.18-2.02)] than in those with a high education level. Independently of individual-level characteristics, we found a higher risk of Type 2 diabetes mellitus in neighbourhoods with high levels of unemployment [quintile 5; hazard ratio 1.72 (95% CI 1.23-2.42)] than in neighbourhoods with low unemployment (quintile 1). Low education level and high neighbourhood unemployment were independently associated with an elevated risk of Type 2 diabetes mellitus. Studies examining the impact of the residential environment on Type 2 diabetes mellitus will provide knowledge that is essential for the identification of high-risk populations. © 2014 The Authors. Diabetic Medicine © 2014 Diabetes UK.
Shaikh, Amir Y; Wang, Na; Yin, Xiaoyan; Larson, Martin G; Vasan, Ramachandran S; Hamburg, Naomi M; Magnani, Jared W; Ellinor, Patrick T; Lubitz, Steven A; Mitchell, Gary F; Benjamin, Emelia J; McManus, David D
2016-09-01
The relations of measures of arterial stiffness, pulsatile hemodynamic load, and endothelial dysfunction to atrial fibrillation (AF) remain poorly understood. To better understand the pathophysiology of AF, we examined associations between noninvasive measures of vascular function and new-onset AF. The study sample included participants aged ≥45 years from the Framingham Heart Study offspring and third-generation cohorts. Using Cox proportional hazards regression models, we examined relations between incident AF and tonometry measures of arterial stiffness (carotid-femoral pulse wave velocity), wave reflection (augmentation index), pressure pulsatility (central pulse pressure), endothelial function (flow-mediated dilation), resting brachial arterial diameter, and hyperemic flow. AF developed in 407/5797 participants in the tonometry sample and 270/3921 participants in the endothelial function sample during follow-up (median 7.1 years, maximum 10 years). Higher augmentation index (hazard ratio, 1.16; 95% confidence interval, 1.02-1.32; P=0.02), baseline brachial artery diameter (hazard ratio, 1.20; 95% confidence interval, 1.01-1.43; P=0.04), and lower flow-mediated dilation (hazard ratio, 0.79; 95% confidence interval, 0.63-0.99; P=0.04) were associated with increased risk of incident AF. Central pulse pressure, when adjusted for age, sex, and hypertension (hazard ratio, 1.14; 95% confidence interval, 1.02-1.28; P=0.02) was associated with incident AF. Higher pulsatile load assessed by central pulse pressure and greater apparent wave reflection measured by augmentation index were associated with increased risk of incident AF. Vascular endothelial dysfunction may precede development of AF. These measures may be additional risk factors or markers of subclinical cardiovascular disease associated with increased risk of incident AF. © 2016 American Heart Association, Inc.
Farré, Núria; Aranyó, Júlia; Enjuanes, Cristina; Verdú-Rotellar, José María; Ruiz, Sonia; Gonzalez-Robledo, Gina; Meroño, Oona; de Ramon, Marta; Moliner, Pedro; Bruguera, Jordi; Comin-Colet, Josep
2015-02-15
Obese patients with chronic heart failure (HF) have better outcomes than their lean counterparts, although little is known about the pathophysiology of this obesity paradox. Our aim was to evaluate the hypothesis that patients with chronic HF and obesity (defined as a body mass index (BMI) ≥30 kg/m²) may have attenuated neurohormonal activation in comparison with non-obese patients. The present study is a post-hoc analysis of a cohort of 742 chronic HF patients from a single-center study evaluating sympathetic activation by measuring baseline levels of norepinephrine (NE). Obesity was present in 33% of patients. Higher BMI and obesity were significantly associated with lower NE levels in multivariable linear regression models adjusted for covariates (p<0.001). Adding NE to multivariate Cox proportional hazards models attenuated the prognostic impact of BMI on outcomes. Finally, when we explored the prognostic impact of raised NE levels (>70th percentile) in separate analyses of obese and non-obese patients, we found that in both groups NE remained a significant independent predictor of poorer outcomes, despite the lower NE levels in patients with chronic HF and obesity: all-cause mortality hazard ratio=2.37 (95% confidence interval, 1.14-4.94) and hazard ratio=1.59 (95% confidence interval, 1.05-2.4) in obese and non-obese patients, respectively; and cardiovascular mortality hazard ratio=3.08 (95% confidence interval, 1.05-9.01) in obese patients and hazard ratio=2.08 (95% confidence interval, 1.42-3.05) in non-obese patients. Patients with chronic HF and obesity have significantly lower sympathetic activation. This finding may partially explain the obesity paradox described in chronic HF patients. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Baeten, Jared M; Donnell, Deborah; Kapiga, Saidi H; Ronald, Allan; John-Stewart, Grace; Inambao, Mubiana; Manongi, Rachel; Vwalika, Bellington; Celum, Connie
2010-03-13
Male circumcision reduces female-to-male HIV-1 transmission risk by approximately 60%. Data assessing the effect of circumcision on male-to-female HIV-1 transmission are conflicting, with one observational study among HIV-1-serodiscordant couples showing reduced transmission but a randomized trial suggesting no short-term benefit of circumcision. Data collected as part of a prospective study among African HIV-1-serodiscordant couples were analyzed for the relationship between circumcision status of HIV-1-seropositive men and risk of HIV-1 acquisition among their female partners. Circumcision status was determined by physical examination. Cox proportional hazards analysis was used. A total of 1096 HIV-1-serodiscordant couples in which the male partner was HIV-1-infected were followed for a median of 18 months; 374 (34%) male partners were circumcised. Sixty-four female partners seroconverted to HIV-1 (incidence 3.8 per 100 person-years). Circumcision of the male partner was associated with a nonstatistically significant approximately 40% lower risk of HIV-1 acquisition by the female partner (hazard ratio 0.62, 95% confidence interval 0.35-1.10, P = 0.10). The magnitude of this effect was similar when restricted to the subset of HIV-1 transmission events confirmed by viral sequencing to have occurred within the partnership (n = 50, hazard ratio 0.57, P = 0.11), after adjustment for male partner plasma HIV-1 concentrations (hazard ratio 0.60, P = 0.13), and when excluding follow-up time for male partners who initiated antiretroviral therapy (hazard ratio 0.53, P = 0.07). Among HIV-1-serodiscordant couples in which the HIV-1-seropositive partner was male, we observed no increased risk and potentially decreased risk from circumcision on male-to-female transmission of HIV-1.
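The crude incidence quoted above can be approximated directly from the counts in the abstract. A minimal sketch in Python; it treats the median 18-month follow-up as if it were the mean, so the result is only an approximation of the reported 3.8 per 100 person-years:

```python
def incidence_per_100py(events, person_years):
    """Crude incidence rate per 100 person-years of follow-up."""
    return 100 * events / person_years

# 64 seroconversions among 1096 couples followed a median of 18 months;
# approximating total person-time as 1096 couples * 1.5 years:
rate = incidence_per_100py(64, 1096 * 1.5)
print(round(rate, 1))  # 3.9, close to the reported 3.8 per 100 person-years
```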
Satoh, Michihiro; Ohkubo, Takayoshi; Asayama, Kei; Murakami, Yoshitaka; Sakurai, Masaru; Nakagawa, Hideaki; Iso, Hiroyasu; Okayama, Akira; Miura, Katsuyuki; Imai, Yutaka; Ueshima, Hirotsugu; Okamura, Tomonori
2015-03-01
No large-scale, longitudinal studies have examined the combined effects of blood pressure (BP) and total cholesterol levels on long-term risks for subtypes of cardiovascular death in an Asian population. To investigate these relationships, a meta-analysis of individual participant data, which included 73 916 Japanese subjects (age, 57.7 years; men, 41.1%) from 11 cohorts, was conducted. During a mean follow-up of 15.0 years, deaths from coronary heart disease, ischemic stroke, and intraparenchymal hemorrhage occurred in 770, 724, and 345 cases, respectively. Cohort-stratified Cox proportional hazard models were used. After stratifying the participants by 4 systolic BP × 4 total cholesterol categories, the group with systolic BP ≥160 mm Hg with total cholesterol ≥5.7 mmol/L had the greatest risk for coronary heart disease death (adjusted hazard ratio, 4.39; P<0.0001 versus group with systolic BP <120 mm Hg and total cholesterol <4.7 mmol/L). The adjusted hazard ratios of systolic BP (per 20 mm Hg) increased with increases in total cholesterol categories (hazard ratio, 1.52; P<0.0001 in group with total cholesterol ≥5.7 mmol/L). Similarly, the adjusted hazard ratios of total cholesterol increased with increases in systolic BP categories (P for interaction ≤0.04). Systolic BP was positively associated with ischemic stroke and intraparenchymal hemorrhage death, and total cholesterol was inversely associated with intraparenchymal hemorrhage, but no significant interactions between BP and total cholesterol were observed for stroke. High BP and high total cholesterol can synergistically increase the risk for coronary heart disease death but not for stroke in the Asian population. © 2015 American Heart Association, Inc.
Association between divorce and risks for acute myocardial infarction.
Dupre, Matthew E; George, Linda K; Liu, Guangya; Peterson, Eric D
2015-05-01
Divorce is a major life stressor that can have economic, emotional, and physical health consequences. However, the cumulative association between divorce and risks for acute myocardial infarction (AMI) is unknown. This study investigated the association between lifetime exposure to divorce and the incidence of AMI in US adults. We used nationally representative data from a prospective cohort of ever-married adults aged 45 to 80 years (n=15,827) who were followed biennially from 1992 to 2010. Approximately 14% of men and 19% of women were divorced at baseline and more than one third of the cohort had ≥1 divorce in their lifetime. In 200,524 person-years of follow-up, 8% (n=1211) of the cohort had an AMI and age-specific rates of AMI were consistently higher in those who were divorced compared with those who were continuously married (P<0.05). Results from competing-risk hazard models showed that AMI risks were significantly higher in women who had 1 divorce (hazard ratio, 1.24; 95% confidence interval, 1.01-1.55), ≥2 divorces (hazard ratio, 1.77; 95% confidence interval, 1.30-2.41), and among the remarried (hazard ratio, 1.35; 95% confidence interval, 1.07-1.70) compared with continuously married women after adjusting for multiple risk factors. Multivariable-adjusted risks were elevated only in men with a history of ≥2 divorces (hazard ratio, 1.30; 95% confidence interval, 1.02-1.66) compared with continuously married men. Men who remarried had no significant risk for AMI. Interaction terms for sex were not statistically significant. Divorce is a significant risk factor for AMI. The risks associated with multiple divorces are especially high in women and are not reduced with remarriage. © 2015 American Heart Association, Inc.
A nested case-control study of fatal work related injuries among Brazilian steel workers.
Barreto, S M; Swerdlow, A J; Smith, P G; Higgins, C D
1997-01-01
OBJECTIVES: To estimate the relative risk of death from work related injury in a steelworks, associated with exposure to various occupational hazards, sociodemographic factors, and medical history. MATERIAL AND METHODS: The study was a nested case-control design. It was based on a cohort of men employed in the steel plant of USIMINAS, Brazil between January 1977 and August 1990, who were followed up to November 1992. The cases were defined as all workers in the cohort who died from injury in the study period and whose death had been notified to the Brazilian Ministry of Labour as being related to work. Four controls per case, matched to cases on year of birth, were randomly selected from among workers employed in the plant at the time of death of the matching case. Data on potential risk factors for occupational injury were extracted from company records; for the controls these data were abstracted for the period preceding the death of the matching case. RESULTS: There were 37 deaths related to work injuries during the study period. Four surviving workers were selected as controls for each case, but for eight the personnel records were incomplete, leaving 140 controls in all. Significantly increased risk of fatal injury related to work was associated with exposure to noise, heat, dust and fumes, gases and vapours, rotating shift work, being a manual worker, and working in the steel mill, coke ovens, blast furnaces, and energy and water supply areas. Risk of fatal injury related to work increased with intensity of exposure to noise (P (trend) = 0.004) and heat (P < 0.001), and increased greatly with a hazard score that combined information on noise, heat, dust, and gas exposure (P < 0.001). Number of years of schooling (P = 0.03) and salary level (P = 0.03) were both negatively associated with risk. In a multivariate analysis including all these significant factors, only hazard score and area of work remained associated with death from injury related to work. 
The highest risks were for men exposed to all four environmental hazards (odds ratio (OR) 19.4; 95% confidence interval (95% CI) 1.1 to 352.1) and those working in the energy supply area (OR 18.0; 1.6 to 198.1). CONCLUSIONS: The study identified parts of the steelworks and types of hazard associated with greatly increased risk of fatal accident. Research and measures to prevent accidents need to concentrate on these areas and the people working in them. The use of a hazard score was successful in identifying high risk, and similar scoring might prove useful in other industrial situations. PMID:9326164
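The odds ratios reported above come from comparing exposure frequencies among cases and controls. A minimal 2×2-table sketch with hypothetical counts (not taken from the study, which used matched sets and conditional analysis):

```python
def odds_ratio(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls):
    """Unadjusted odds ratio from a 2x2 exposure table:
    (a/b) / (c/d) = (a*d) / (b*c)."""
    return (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)

# Hypothetical: 30 of 37 cases vs 70 of 140 controls exposed to a hazard
print(round(odds_ratio(30, 7, 70, 70), 1))  # 4.3
```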
Larsen, Malene Rode; Aust, Birgit; Høgelund, Jan
2017-04-18
The aim of this study was to investigate whether a multidimensional public-private partnership intervention, focussing on improving the quality and efficiency of sickness benefit case management, reduced the sickness benefit duration and the duration until self-support. We used a difference-in-difference (DID) design with six intervention municipalities and 12 matched control municipalities in Denmark. The study sample comprised 282,103 sickness benefit spells exceeding four weeks. The intervention group with 110,291 spells received the intervention, and the control group with 171,812 spells received ordinary sickness benefit case management. Using register data, we fitted Cox proportional hazards models, estimating hazard ratios (HR) and confidence intervals (CI). We found no joint effect of the intervention on the sickness benefit duration (HR 1.02, CI 0.97-1.07) or the duration until self-support (HR 0.99, CI 0.96-1.02). The effect varied among the six municipalities, with sickness benefit HRs ranging from 0.96 (CI 0.93-1.00) to 1.13 (CI 1.08-1.18) and self-support HRs ranging from 0.91 (CI 0.82-1.00) to 1.11 (CI 1.06-1.17). Compared to ordinary sickness benefit case management, the intervention had on average no effect on the sickness benefit duration or duration until self-support. However, the effect varied considerably among the six municipalities, possibly due to differences in the implementation or the complexity of the intervention.
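The difference-in-difference logic behind the design above compares the change in the intervention municipalities with the change in the controls, netting out shared time trends. A minimal sketch on hypothetical group means (the study itself estimated hazard ratios from Cox models, not mean differences):

```python
def did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Difference-in-differences: the change in the treated group
    minus the change in the control group."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Hypothetical mean sickness-benefit durations (weeks), before/after:
effect = did_estimate(12.0, 10.5, 12.4, 11.9)
print(round(effect, 2))  # -1.0: one week shorter, net of the shared trend
```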
Robots, systems, and methods for hazard evaluation and visualization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nielsen, Curtis W.; Bruemmer, David J.; Walton, Miles C.
A robot includes a hazard sensor, a locomotor, and a system controller. The robot senses a hazard intensity at a location of the robot, moves to a new location in response to the hazard intensity, and autonomously repeats the sensing and moving to determine multiple hazard levels at multiple locations. The robot may also include a communicator to communicate the multiple hazard levels to a remote controller. The remote controller includes a communicator for sending user commands to the robot and receiving the hazard levels from the robot. A graphical user interface displays an environment map of the environment proximate the robot and a scale for indicating a hazard intensity. A hazard indicator corresponds to a robot position in the environment map and graphically indicates the hazard intensity at the robot position relative to the scale.
Kang, Minyong; Yu, Jiwoong; Sung, Hyun Hwan; Jeon, Hwang Gyun; Jeong, Byong Chang; Park, Se Hoon; Jeon, Seong Soo; Lee, Hyun Moo; Choi, Han Yong; Seo, Seong Il
2018-05-13
To examine the prognostic role of the pretreatment aspartate transaminase/alanine transaminase or De Ritis ratio in patients with metastatic renal cell carcinoma receiving first-line systemic tyrosine kinase inhibitor therapy. We retrospectively searched the medical records of 579 patients with metastatic renal cell carcinoma who visited Samsung Medical Center, Seoul, Korea, from January 2001 through August 2016. After excluding 210 patients, we analyzed 360 patients who received first-line tyrosine kinase inhibitor therapy. Cancer-specific survival and overall survival were defined as the primary and secondary end-points, respectively. A multivariate Cox proportional hazards regression model was used to identify independent prognosticators of survival outcomes. The overall population was divided into two groups according to the pretreatment De Ritis ratio as an optimal cut-off value of 1.2, which was determined by a time-dependent receiver operating characteristic curve analysis. Patients with a higher pretreatment De Ritis ratio (≥1.2) had worse cancer-specific survival and overall survival outcomes, compared with those with a lower De Ritis ratio (<1.2). Notably, a higher De Ritis ratio (≥1.2) was found to be an independent predictor of both cancer-specific survival (hazard ratio 1.61, 95% confidence interval 1.13-2.30) and overall survival outcomes (hazard ratio 1.69, 95% confidence interval 1.19-2.39), along with male sex, multiple metastasis (≥2), non-clear cell histology, advanced pT stage (≥3), previous metastasectomy and the Memorial Sloan Kettering Cancer Center risk classification. Our findings show that the pretreatment De Ritis ratio can provide valuable information about the survival outcomes of metastatic renal cell carcinoma patients receiving first-line tyrosine kinase inhibitor therapy. © 2018 The Japanese Urological Association.
Farr, Amanda M; Sheehan, John J; Davis, Brian M; Smith, David M
2016-01-01
Adherence and persistence to antidiabetes medications are important to control blood glucose levels among individuals with type 2 diabetes mellitus (T2D). The objective of this study was to compare adherence and persistence over a 12-month period between patients initiating saxagliptin and patients initiating linagliptin, two dipeptidyl peptidase-4 inhibitors. This retrospective cohort study was conducted in MarketScan® Commercial and Medicare Supplemental claims databases. Patients with T2D initiating saxagliptin or linagliptin between January 1, 2009, and June 30, 2013, were selected. Patients were required to be at least 18 years old and have 12 months of continuous enrollment prior to and following initiation. Adherence and persistence to initiated medication were measured over the 12 months after initiation using outpatient pharmacy claims. Patients were considered adherent if the proportion of days covered was ≥0.80. Patients were considered nonpersistent (or to have discontinued) if there was a gap of >60 days without initiated medication on hand. Multivariable logistic regression and multivariable Cox proportional hazard models were fit to compare adherence and persistence, respectively, between the two cohorts. There were 21,599 saxagliptin initiators (mean age 55 years; 53% male) and 5,786 linagliptin initiators (mean age 57 years; 54% male) included in the study sample. Over the 12-month follow-up, 46% of saxagliptin initiators and 42% of linagliptin initiators were considered adherent and 47% of saxagliptin initiators and 51% of linagliptin initiators discontinued their initiated medication. After controlling for patient characteristics, saxagliptin initiation was associated with significantly greater odds of being adherent (adjusted odds ratio =1.212, 95% CI 1.140-1.289) and significantly lower hazards of discontinuation (adjusted hazard ratio =0.887, 95% CI 0.850-0.926) compared with linagliptin initiation.
Compared with patients with T2D who initiated linagliptin, patients with T2D who initiated saxagliptin had significantly better adherence and persistence.
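The adherence and persistence definitions above (proportion of days covered, with ≥0.80 counted as adherent, and a gap of more than 60 days counted as discontinuation) can be sketched directly on claims-style fill records. The dates and day supplies below are hypothetical:

```python
from datetime import date

def proportion_of_days_covered(fills, window_days=365):
    """PDC: fraction of days in the window with medication on hand.
    `fills` is a list of (fill_date, days_supply); the window starts
    at the first fill. Overlapping supplies are not double-counted."""
    start = fills[0][0]
    covered = set()
    for fill_date, days_supply in fills:
        offset = (fill_date - start).days
        covered.update(d for d in range(offset, offset + days_supply)
                       if 0 <= d < window_days)
    return len(covered) / window_days

def discontinued(fills, gap_days=60, window_days=365):
    """Nonpersistence: any stretch of more than `gap_days` with no
    drug on hand, including the tail of the observation window."""
    start = fills[0][0]
    runout = 0  # day on which the supply on hand runs out
    for fill_date, days_supply in fills:
        offset = (fill_date - start).days
        if offset - runout > gap_days:
            return True
        runout = max(runout, offset + days_supply)
    return window_days - runout > gap_days

fills = [(date(2020, 1, 1), 30), (date(2020, 2, 5), 30), (date(2020, 6, 1), 90)]
pdc = proportion_of_days_covered(fills)
print(round(pdc, 2), pdc >= 0.80, discontinued(fills))  # 0.41 False True
```

This patient covers only 150 of 365 days and has an 87-day gap between the second and third fills, so they count as both nonadherent and nonpersistent under the study's definitions.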
Asleh, Rabea; Briasoulis, Alexandros; Schettle, Sarah D; Tchantchaleishvili, Vakhtang; Pereira, Naveen L; Edwards, Brooks S; Clavell, Alfredo L; Maltais, Simon; Joyce, David L; Joyce, Lyle D; Daly, Richard C; Kushwaha, Sudhir S; Stulak, John M
2017-11-01
Diabetes mellitus (DM) is a risk factor for morbidity and mortality in patients with heart failure. The effect of DM on post-left ventricular assist device (LVAD) implantation outcomes is unclear. This study sought to investigate whether patients with DM had worse outcomes than patients without DM after LVAD implantation and whether LVAD support resulted in a better control of DM. We retrospectively reviewed 341 consecutive adults who underwent implantation of LVAD from 2007 to 2016. Patient characteristics and adverse events were studied and compared between patients with and without DM. One hundred thirty-one patients (38%) had DM. Compared with patients without DM, those with DM had higher rates of ischemic cardiomyopathy, LVAD implantation as destination therapy, and increased baseline body mass index. In a proportional hazards (Cox) model with adjustment for relevant covariates and median follow-up of 16.1 months, DM was associated with increased risk of all-cause mortality (hazard ratio, 1.73; 95% confidence interval: 1.18-2.53; P =0.005) and increased risk of nonfatal LVAD-related complications, including a composite of stroke, pump thrombosis, and device infection (hazard ratio, 2.1; 95% confidence interval: 1.35-3.18; P =0.001). Preoperative hemoglobin A1c was not significantly associated with mortality or adverse events among patients with DM. LVAD implantation resulted in a remarkable decrease in hemoglobin A1c levels (7.4±1.9 pre-LVAD versus 6.0±1.5 and 6.3±1.4 after 3 and 12 months post-LVAD, respectively; P <0.0001) and a significant reduction in requirements of DM medications. DM is associated with increased rates of all-cause mortality and major adverse events despite favorable glycemic control after LVAD implantation. © 2017 American Heart Association, Inc.
Toyomaki, Haruya; Sekiguchi, Satoshi; Sasaki, Yosuke; Sueyoshi, Masuo; Makita, Kohei
2018-02-01
The objective of this study was to investigate factors that caused rapid spread during the early phase of the porcine epidemic diarrhea (PED) epidemic in Japan in 2013 and 2014. Anonymized datasets from all pig farms were provided by Kagoshima (709 farms) and Miyazaki Prefectures (506 farms). Semi-parametric survival analysis was conducted using the first 180 days from the first case on December 3, 2013 in Kagoshima Prefecture. To compare the hazard between different farm management types, univariable survival analysis was conducted. As farm sizes varied among different farm types, bivariable survival analysis was conducted for farm size categories and farm density per km² for each management type. A case-control study using a postal questionnaire survey was conducted in September 2014, and risk factor analysis was performed using generalized linear models with binomial errors. The hazard was significantly higher in farrow-to-finish farms than fattening farms [hazard ratio (HR) = 1.6, p < 0.01], but was not significantly different between reproduction and fattening farms (HR = 1.3, p = 0.16). In separate bivariable survival analyses for each farm type, large- and middle-scale farms had higher hazard than small-scale farms in fattening (HR = 5.8 and 2.6, respectively, both p < 0.01) and reproduction farms (HR = 4.0 and 3.6, respectively, both p < 0.01). In farrow-to-finish farms, large-scale farms had higher hazard than small-scale farms (HR = 2.8, p < 0.01), and higher farm density per km² was also a risk factor (HR = 7.6, p < 0.01). In the case-control study, questionnaires were returned from 78 PED virus-infected and 91 non-infected farms. The overall response rate was 34%.
Risk factors of the final model were occurrence of porcine reproductive and respiratory syndrome in the past 5 years [odds ratio (OR) = 1.97, 95% confidence interval (CI): 0.97-4.00, p = 0.054], use of a common compost station (OR = 2.51, 95%CI: 1.08-5.83, p = 0.03), and use of a pig excrement disposal service (OR = 2.64, 95%CI: 1.05-6.63, p = 0.04). High hazard in farrow-to-finish farms suggested transmission from slaughterhouses to susceptible suckling piglets. Hazard associated with large-scale farms and high density might be due to frequent vehicle entrance and transmission by roads. Improvement of farm hygiene management and avoidance of risky practices associated with contact with pig excrement were keys in preventing invasion of PED virus to a farm. Copyright © 2017 Elsevier B.V. All rights reserved.
Ibrahim, Fowzia; Lorente-Cánovas, Beatriz; Doré, Caroline J; Bosworth, Ailsa; Ma, Margaret H; Galloway, James B; Cope, Andrew P; Pande, Ira; Walker, David; Scott, David L
2017-11-01
RA patients receiving TNF inhibitors (TNFi) usually maintain their initial doses. The aim of the Optimizing Treatment with Tumour Necrosis Factor Inhibitors in Rheumatoid Arthritis trial was to evaluate whether tapering TNFi doses causes loss of clinical response. We enrolled RA patients receiving etanercept or adalimumab and a DMARD with DAS28 under 3.2 for over 3 months. Initially (months 0-6) patients were randomized to control (constant TNFi) or two experimental groups (tapering TNFi by 33 or 66%). Subsequently (months 6-12) control subjects were randomized to taper TNFi by 33 or 66%. Disease flares (DAS28 increasing ⩾0.6 with at least one additional swollen joint) were the primary outcome. Two hundred and forty-four patients were screened, 103 randomized and 97 treated. In months 0-6 there were 8/50 (16%) flares in controls, 3/26 (12%) with 33% tapering and 6/21 (29%) with 66% tapering. Multivariate Cox analysis showed time to flare was unchanged with 33% tapering but was reduced with 66% tapering compared with controls (adjusted hazard ratio 2.81, 95% CI: 0.99, 7.94; P = 0.051). Analysing all tapered patients after controls were re-randomized (months 6-12) showed differences between groups: there were 6/48 (13%) flares with 33% tapering and 14/39 (36%) with 66% tapering. Multivariate Cox analysis showed 66% tapering reduced time to flare (adjusted hazard ratio 3.47, 95% CI: 1.26, 9.58; P = 0.016). Tapering TNFi by 33% has no impact on disease flares and appears practical in patients in sustained remission and low disease activity states. EudraCT, https://www.clinicaltrialsregister.eu, 2010-020738-24; ISRCTN registry, https://www.isrctn.com, 28955701. © The Author 2017. Published by Oxford University Press on behalf of the British Society for Rheumatology.
Maio, Michele; Mackiewicz, Andrzej; Testori, Alessandro; Trefzer, Uwe; Ferraresi, Virginia; Jassem, Jacek; Garbe, Claus; Lesimple, Thierry; Guillot, Bernard; Gascon, Pere; Gilde, Katalin; Camerini, Roberto; Cognetti, Francesco
2010-04-01
Thymosin alpha 1 (Talpha1) is an immunomodulatory polypeptide that enhances effector T-cell responses. In this large randomized study, we evaluated the efficacy and safety of combining Talpha1 with dacarbazine (DTIC) and interferon alfa (IFN-alpha) in patients with metastatic melanoma. Four hundred eighty-eight patients were randomly assigned to five treatment groups: DTIC+IFN-alpha+Talpha1 (1.6 mg); DTIC+IFN-alpha+Talpha1 (3.2 mg); DTIC+IFN-alpha+Talpha1 (6.4 mg); DTIC+Talpha1 (3.2 mg); DTIC+IFN-alpha (control group). The primary end point was best overall response at study end (12 months). Secondary end points included duration of response, overall survival (OS), and progression-free survival (PFS). Patients were observed for up to 24 months. Ten and 12 tumor responses were observed in the DTIC+IFN-alpha+Talpha1 (3.2 mg) and DTIC+Talpha1 (3.2 mg) groups, respectively, versus four in the control group, which was sufficient to reject the null hypothesis that P(0) < or = .05 (expected response rate of standard therapy) in these two arms. Duration of response ranged from 1.9 to 23.2 months in patients given Talpha1 and from 4.4 to 8.4 months in the control group. Median OS was 9.4 months in patients given Talpha1 versus 6.6 months in the control group (hazard ratio = 0.80; 95% CI, 0.63 to 1.02; P = .08). An increase in PFS was observed in patients given Talpha1 versus the control group (hazard ratio = 0.80; 95% CI, 0.63 to 1.01; P = .06). Addition of Talpha1 to DTIC and IFN-alpha did not lead to any additional toxicity. These results suggest Talpha1 has activity in patients with metastatic melanoma and provide rationale for further clinical evaluation of this agent.
Cole, Stephen R.; Jacobson, Lisa P.; Tien, Phyllis C.; Kingsley, Lawrence; Chmiel, Joan S.; Anastos, Kathryn
2010-01-01
To estimate the net effect of imperfectly measured highly active antiretroviral therapy on incident acquired immunodeficiency syndrome or death, the authors combined inverse probability-of-treatment-and-censoring weighted estimation of a marginal structural Cox model with regression-calibration methods. Between 1995 and 2007, 950 human immunodeficiency virus–positive men and women were followed in 2 US cohort studies. During 4,054 person-years, 374 initiated highly active antiretroviral therapy, 211 developed acquired immunodeficiency syndrome or died, and 173 dropped out. Accounting for measured confounders and determinants of dropout, the weighted hazard ratio for acquired immunodeficiency syndrome or death comparing use of highly active antiretroviral therapy in the prior 2 years with no therapy was 0.36 (95% confidence limits: 0.21, 0.61). This association was relatively constant over follow-up (P = 0.19) and stronger than crude or adjusted hazard ratios of 0.75 and 0.95, respectively. Accounting for measurement error in reported exposure using external validation data on 331 men and women provided a hazard ratio of 0.17, with bias shifted from the hazard ratio to the estimate of precision as seen by the 2.5-fold wider confidence limits (95% confidence limits: 0.06, 0.43). Marginal structural measurement-error models can simultaneously account for 3 major sources of bias in epidemiologic research: validated exposure measurement error, measured selection bias, and measured time-fixed and time-varying confounding. PMID:19934191
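The weighting scheme above reweights each subject by the inverse of the probability of the treatment actually received, with the marginal treatment probability in the numerator to stabilize the weights. A minimal per-subject sketch, with hypothetical propensity scores standing in for the fitted treatment model:

```python
def stabilized_iptw(treated, propensity, marginal_p):
    """Stabilized inverse-probability-of-treatment weight:
    P(A=a) / P(A=a | covariates) for the treatment a actually received."""
    if treated:
        return marginal_p / propensity
    return (1 - marginal_p) / (1 - propensity)

# Hypothetical subjects: (treated?, propensity score from some fitted model)
subjects = [(True, 0.8), (True, 0.3), (False, 0.6), (False, 0.2)]
marginal_p = sum(t for t, _ in subjects) / len(subjects)  # 0.5
weights = [stabilized_iptw(t, p, marginal_p) for t, p in subjects]
print([round(w, 2) for w in weights])  # [0.62, 1.67, 1.25, 0.62]
```

Subjects whose observed treatment was unlikely given their covariates are up-weighted; a weighted Cox model fitted to the reweighted sample then estimates the marginal hazard ratio, as in the study above.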
Todd, Jonathan V.; Cole, Stephen R.; Pence, Brian W.; Lesko, Catherine R.; Bacchetti, Peter; Cohen, Mardge H.; Feaster, Daniel J.; Gange, Stephen; Griswold, Michael E.; Mack, Wendy; Rubtsova, Anna; Wang, Cuiwei; Weedon, Jeremy; Anastos, Kathryn; Adimora, Adaora A.
2017-01-01
Abstract Depression affects up to 30% of human immunodeficiency virus (HIV)-infected individuals. We estimated joint effects of antiretroviral therapy (ART) initiation and depressive symptoms on time to death using a joint marginal structural model and data from a cohort of HIV-infected women from the Women's Interagency HIV Study (conducted in the United States) from 1998–2011. Among 848 women contributing 6,721 years of follow-up, 194 participants died during follow-up, resulting in a crude mortality rate of 2.9 per 100 women-years. Cumulative mortality curves indicated greatest mortality for women who reported depressive symptoms and had not initiated ART. The hazard ratio for depressive symptoms was 3.38 (95% confidence interval (CI): 2.15, 5.33) and for ART was 0.47 (95% CI: 0.31, 0.70). Using a reference category of women without depressive symptoms who had initiated ART, the hazard ratio for women with depressive symptoms who had initiated ART was 3.60 (95% CI: 2.02, 6.43). For women without depressive symptoms who had not started ART, the hazard ratio was 2.36 (95% CI: 1.16, 4.81). Among women reporting depressive symptoms who had not started ART, the hazard ratio was 7.47 (95% CI: 3.91, 14.3). We found a protective effect of ART initiation on mortality, as well as a harmful effect of depressive symptoms, in a cohort of HIV-infected women. PMID:28430844
Kim, Sung-Yong; Le Rademacher, Jennifer; Antin, Joseph H; Anderlini, Paolo; Ayas, Mouhab; Battiwalla, Minoo; Carreras, Jeanette; Kurtzberg, Joanne; Nakamura, Ryotaro; Eapen, Mary; Deeg, H Joachim
2014-12-01
A proportion of patients with aplastic anemia who are treated with immunosuppressive therapy develop clonal hematologic disorders, including post-aplastic anemia myelodysplastic syndrome. Many will proceed to allogeneic hematopoietic stem cell transplantation. We identified 123 patients with post-aplastic anemia myelodysplastic syndrome who from 1991 through 2011 underwent allogeneic hematopoietic stem cell transplantation, and in a matched-pair analysis compared outcome to that in 393 patients with de novo myelodysplastic syndrome. There was no difference in overall survival. There were no significant differences with regard to 5-year probabilities of relapse, non-relapse mortality, relapse-free survival and overall survival; these were 14%, 40%, 46% and 49% for post-aplastic anemia myelodysplastic syndrome, and 20%, 33%, 47% and 49% for de novo myelodysplastic syndrome, respectively. In multivariate analysis, relapse (hazard ratio 0.71; P=0.18), non-relapse mortality (hazard ratio 1.28; P=0.18), relapse-free survival (hazard ratio 0.97; P=0.80) and overall survival (hazard ratio 1.02; P=0.88) of post-aplastic anemia myelodysplastic syndrome were similar to those of patients with de novo myelodysplastic syndrome. Cytogenetic risk was independently associated with overall survival in both groups. Thus, transplant success in patients with post-aplastic anemia myelodysplastic syndrome was similar to that in patients with de novo myelodysplastic syndrome, and cytogenetics was the only significant prognostic factor for post-aplastic anemia myelodysplastic syndrome patients. Copyright© Ferrata Storti Foundation.
Ren, J S; Freedman, N D; Kamangar, F; Dawsey, S M; Hollenbeck, A R; Schatzkin, A; Abnet, C C
2010-07-01
The authors investigated the relationship between consumption of hot tea, iced tea, coffee and carbonated soft drinks and the risk of upper gastrointestinal tract cancers in the NIH-AARP Study. During 2,584,953 person-years of follow-up on 481,563 subjects, 392 oral cavity, 178 pharynx, 307 larynx, 231 gastric cardia, 224 gastric non-cardia cancer, 123 oesophageal squamous cell carcinoma (ESCC) and 305 oesophageal adenocarcinoma (EADC) cases were accrued. Hazard ratios (HRs) and 95% confidence intervals (95% CIs) were calculated by multivariate-adjusted Cox regression. Compared to non-drinking, the hazard ratio for hot tea intake of ≥1 cup/day was 0.37 (95% CI: 0.20, 0.70) for pharyngeal cancer. The authors also observed a significant association between coffee drinking and risk of gastric cardia cancer (compared to <1 cup/day, the hazard ratio for drinking >3 cups/day was 1.57 (95% CI: 1.03, 2.39)), and an inverse association between coffee drinking and EADC for the cases occurring in the last 3 years of follow-up (compared to <1 cup/day, the hazard ratio for drinking >3 cups/day was 0.54 (95% CI: 0.31, 0.92)), but no association in earlier follow-up. In summary, hot tea intake was inversely associated with pharyngeal cancer, and coffee was directly associated with gastric cardia cancer, but was inversely associated with EADC during some follow-up periods. Published by Elsevier Ltd.
Anthropometry and the Risk of Lung Cancer in EPIC.
Dewi, Nikmah Utami; Boshuizen, Hendriek C; Johansson, Mattias; Vineis, Paolo; Kampman, Ellen; Steffen, Annika; Tjønneland, Anne; Halkjær, Jytte; Overvad, Kim; Severi, Gianluca; Fagherazzi, Guy; Boutron-Ruault, Marie-Christine; Kaaks, Rudolf; Li, Kuanrong; Boeing, Heiner; Trichopoulou, Antonia; Bamia, Christina; Klinaki, Eleni; Tumino, Rosario; Palli, Domenico; Mattiello, Amalia; Tagliabue, Giovanna; Peeters, Petra H; Vermeulen, Roel; Weiderpass, Elisabete; Torhild Gram, Inger; Huerta, José María; Agudo, Antonio; Sánchez, María-José; Ardanaz, Eva; Dorronsoro, Miren; Quirós, José Ramón; Sonestedt, Emily; Johansson, Mikael; Grankvist, Kjell; Key, Tim; Khaw, Kay-Tee; Wareham, Nick; Cross, Amanda J; Norat, Teresa; Riboli, Elio; Fanidi, Anouar; Muller, David; Bueno-de-Mesquita, H Bas
2016-07-15
The associations of body mass index (BMI) and other anthropometric measurements with lung cancer were examined in 348,108 participants in the European Prospective Investigation into Cancer and Nutrition (EPIC) between 1992 and 2010. The study population included 2,400 case patients with incident lung cancer, and the average length of follow-up was 11 years. Hazard ratios were calculated using Cox proportional hazards models in which we modeled smoking variables with cubic splines. Overall, there was a significant inverse association between BMI (weight (kg)/height (m)²) and the risk of lung cancer after adjustment for smoking and other confounders (for BMI of 30.0-34.9 versus 18.5-25.0, hazard ratio = 0.72, 95% confidence interval: 0.62, 0.84). The strength of the association declined with increasing follow-up time. Conversely, after adjustment for BMI, waist circumference and waist-to-height ratio were significantly positively associated with lung cancer risk (for the highest category of waist circumference vs. the lowest, hazard ratio = 1.25, 95% confidence interval: 1.05, 1.50). Given the decline of the inverse association between BMI and lung cancer over time, the association is likely at least partly due to weight loss resulting from preclinical lung cancer that was present at baseline. Residual confounding by smoking could also have influenced our findings. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Agüero, Fernando; Forner, Alejandro; Manzardo, Christian; Valdivieso, Andres; Blanes, Marino; Barcena, Rafael; Rafecas, Antoni; Castells, Lluis; Abradelo, Manuel; Torre-Cisneros, Julian; Gonzalez-Dieguez, Luisa; Salcedo, Magdalena; Serrano, Trinidad; Jimenez-Perez, Miguel; Herrero, Jose Ignacio; Gastaca, Mikel; Aguilera, Victoria; Fabregat, Juan; Del Campo, Santos; Bilbao, Itxarone; Romero, Carlos Jimenez; Moreno, Asuncion; Rimola, Antoni; Miro, Jose M
2016-02-01
The impact of human immunodeficiency virus (HIV) infection on patients undergoing liver transplantation (LT) for hepatocellular carcinoma (HCC) is uncertain. This study aimed to assess the outcome of a prospective Spanish nationwide cohort of HIV-infected patients undergoing LT for HCC (2002-2014). These patients were matched (age, gender, year of LT, center, and hepatitis C virus (HCV) or hepatitis B virus infection) with non-HIV-infected controls (1:3 ratio). Patients with incidental HCC were excluded. Seventy-four HIV-infected patients and 222 non-HIV-infected patients were included. All patients had cirrhosis, mostly due to HCV infection (92%). HIV-infected patients were younger (47 versus 51 years) and had undetectable HCV RNA at LT (19% versus 9%) more frequently than non-HIV-infected patients. No significant differences were detected between HIV-infected and non-HIV-infected recipients in the radiological characteristics of HCC at enlisting or in the histopathological findings for HCC in the explanted liver. Survival at 1, 3, and 5 years for HIV-infected versus non-HIV-infected patients was 88% versus 90%, 78% versus 78%, and 67% versus 73% (P = 0.779), respectively. HCV infection (hazard ratio = 7.90, 95% confidence interval 1.07-56.82) and maximum nodule diameter >3 cm in the explanted liver (hazard ratio = 1.72, 95% confidence interval 1.02-2.89) were independently associated with mortality in the whole series. HCC recurred in 12 HIV-infected patients (16%) and 32 non-HIV-infected patients (14%), with a probability of 4% versus 5% at 1 year, 18% versus 12% at 3 years, and 20% versus 19% at 5 years (P = 0.904). Microscopic vascular invasion (hazard ratio = 3.40, 95% confidence interval 1.34-8.64) was the only factor independently associated with HCC recurrence. HIV infection had no impact on recurrence of HCC or survival after LT. Our results support the indication of LT in HIV-infected patients with HCC. 
© 2015 by the American Association for the Study of Liver Diseases.
Failing, Jarrett J; Yan, Yiyi; Porrata, Luis F; Markovic, Svetomir N
2017-12-01
The peripheral blood lymphocyte-to-monocyte ratio (LMR) has been associated with prognosis in many malignancies, including metastatic melanoma. However, it has not been studied in patients treated with immune checkpoint inhibitors. In this study, we analyzed the association of the baseline LMR with progression-free survival (PFS) and overall survival (OS) in metastatic melanoma patients treated with pembrolizumab. A total of 133 patients with metastatic melanoma treated with pembrolizumab were included in this retrospective study. LMR was calculated from pretherapy peripheral blood counts, and the optimal cutoff value was determined by a receiver operating characteristic curve. PFS and OS were evaluated using the Kaplan-Meier method and multivariate Cox proportional hazards modeling. Patients with an LMR of at least 1.7 showed improved PFS (hazard ratio=0.55; 95% confidence interval: 0.34-0.92; P=0.024) and OS (hazard ratio=0.29; 95% confidence interval: 0.15-0.59; P=0.0007). The baseline LMR is associated with PFS and OS in metastatic melanoma patients treated with pembrolizumab, and could represent a convenient and cost-effective prognostic biomarker. Validation of these findings in an independent cohort is needed.
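Several records in this collection estimate survival curves with the Kaplan-Meier method, as in the pembrolizumab study above. A minimal, illustrative estimator (a sketch only; the published analyses used validated statistical software, not this code):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.

    times: follow-up times; events: 1 = event observed, 0 = censored.
    Returns a list of (time, survival probability) at each event time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)    # events at time t
        n_t = sum(1 for tt, _ in data if tt == t)  # subjects leaving the risk set at t
        if d > 0:
            surv *= 1 - d / n_at_risk  # conditional probability of surviving past t
            curve.append((t, surv))
        n_at_risk -= n_t
        i += n_t
    return curve

print(kaplan_meier([1, 2, 3, 4], [1, 0, 1, 1]))  # -> [(1, 0.75), (3, 0.375), (4, 0.0)]
```

Each factor (1 - d/n) is the conditional probability of surviving past an event time; censored observations leave the risk set without contributing an event, which is why the censored subject at time 2 produces no step in the curve.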
Does specific psychopathology predict development of psychosis in ultra high-risk (UHR) patients?
Thompson, Andrew; Nelson, Barnaby; Bruxner, Annie; O'Connor, Karen; Mossaheb, Nilufar; Simmons, Magenta B; Yung, Alison
2013-04-01
Studies have attempted to identify additional risk factors within the group identified as 'ultra high risk' (UHR) for developing psychotic disorders in order to characterise those at highest risk. However, these studies have often neglected clinical symptom types as additional risk factors. We aimed to investigate the relationship between baseline clinical psychotic or psychotic-like symptoms and the subsequent transition to a psychotic disorder in a UHR sample. A retrospective 'case-control' methodology was used. We identified all individuals from a UHR clinic who had subsequently developed a psychotic disorder (cases) and compared these to a random sample of individuals from the clinic who did not become psychotic within the sampling time frame (controls). The sample consisted of 120 patients (60 cases, 60 controls). An audit tool was used to identify clinical symptoms reported at entry to the clinic (baseline) using the clinical file. Diagnosis at transition was assessed using the Operational Criteria for Psychotic Illness (OPCRIT) computer program. The relationship between transition to a psychotic disorder and baseline symptoms was explored using survival analysis. Presence of thought disorder, any delusions and elevated mood significantly predicted transition to a psychotic disorder. When other symptoms were adjusted for, only the presence of elevated mood significantly predicted subsequent transition (hazard ratio 2.69, p = 0.002). Thought disorder was a predictor of transition to a schizophrenia-like psychotic disorder (hazard ratio 3.69, p = 0.008). Few individual clinical symptoms appear to be predictive of transition to a psychotic disorder in the UHR group. Clinicians should be cautious about the use of clinical profile alone in such individuals when determining who is at highest risk.
Ma, Xiao; Yang, Yang; Li, Hong-Lan; Zheng, Wei; Gao, Jing; Zhang, Wei; Yang, Gong; Shu, Xiao-Ou; Xiang, Yong-Bing
2017-03-01
Dietary factors have been hypothesized to affect the risk of liver cancer via various mechanisms, but the influence has not been well studied and the evidence is conflicting. We investigated associations of dietary trace element intake, assessed through a validated food frequency questionnaire, with risk of liver cancer in two prospective cohort studies of 132,765 women (1997-2013) and men (2002-2013) in Shanghai, China. The associations were first evaluated in cohort studies and further assessed in a case-control study nested within these cohorts adjusting for hepatitis B virus infection. For cohort analyses, Cox proportional hazards models were used to estimate hazard ratios and 95% confidence intervals. For nested case-control analyses, conditional logistic regression was used to calculate odds ratios and 95% confidence intervals. After a median follow-up time of 15.2 years for the Shanghai Women's Health Study and 9.3 years for the Shanghai Men's Health Study, 192 women and 344 men developed liver cancer. Dietary intake of manganese was inversely associated with liver cancer risk (highest vs. lowest quintile, HR = 0.51, 95% CI: 0.35-0.73; p trend = 0.001). Further adjustment for hepatitis B virus infection in the nested case-control study yielded a similar result (highest vs. lowest quintile, OR = 0.38, 95% CI: 0.21-0.69; p trend < 0.001). No significant association was found between dietary intake of selenium, iron, zinc, copper and liver cancer risk. The results suggest that higher intake of manganese may be associated with a lower risk of liver cancer in China. © 2016 UICC.
Benefits of the Mediterranean Diet: Insights From the PREDIMED Study.
Martínez-González, Miguel A; Salas-Salvadó, Jordi; Estruch, Ramón; Corella, Dolores; Fitó, Montse; Ros, Emilio
2015-01-01
The PREDIMED (PREvención con DIeta MEDiterránea) multicenter, randomized, primary prevention trial assessed the long-term effects of the Mediterranean diet (MeDiet) on clinical events of cardiovascular disease (CVD). We randomized 7447 men and women at high CVD risk into three diets: MeDiet supplemented with extra-virgin olive oil (EVOO), MeDiet supplemented with nuts, and control diet (advice on a low-fat diet). No energy restriction and no special intervention on physical activity were applied. We observed 288 CVD events (a composite of myocardial infarction, stroke or CVD death) during a median time of 4.8 years; hazard ratios were 0.70 (95% CI, 0.53-0.91) for the MeDiet+EVOO and 0.70 (CI, 0.53-0.94) for the MeDiet+nuts compared to the control group. Respective hazard ratios for incident diabetes (273 cases) among 3541 non-diabetic participants were 0.60 (0.43-0.85) and 0.82 (0.61-1.10) for MeDiet+EVOO and MeDiet+nuts, respectively, versus control. Significant improvements in classical and emerging CVD risk factors also supported a favorable effect of both MeDiets on blood pressure, insulin sensitivity, lipid profiles, lipoprotein particles, inflammation, oxidative stress, and carotid atherosclerosis. In nutrigenomic studies, beneficial effects of the intervention with MeDiets showed interactions with several genetic variants (TCF7L2, APOA2, MLXIPL, LPL, FTO, MC4R, COX-2, GCKR and SERPINE1) with respect to intermediate and final phenotypes. Thus, the PREDIMED trial provided strong evidence that a vegetable-based MeDiet rich in unsaturated fat and polyphenols can be a sustainable and ideal model for CVD prevention. Copyright © 2015 Elsevier Inc. All rights reserved.
Fibromyalgia and Risk of Dementia-A Nationwide, Population-Based, Cohort Study.
Tzeng, Nian-Sheng; Chung, Chi-Hsiang; Liu, Feng-Cheng; Chiu, Yu-Hsiang; Chang, Hsin-An; Yeh, Chin-Bin; Huang, San-Yuan; Lu, Ru-Band; Yeh, Hui-Wen; Kao, Yu-Chen; Chiang, Wei-Shan; Tsao, Chang-Hui; Wu, Yung-Fu; Chou, Yu-Ching; Lin, Fu-Huang; Chien, Wu-Chien
2018-02-01
Fibromyalgia is a syndrome of chronic pain and other symptoms and is associated with patient discomfort and other diseases. This nationwide matched-cohort population-based study aimed to investigate the association between fibromyalgia and the risk of developing dementia, and to clarify the association between fibromyalgia and dementia. A total of 41,612 patients of age ≥50 years with newly diagnosed fibromyalgia between January 1 and December 31, 2000 were selected from the National Health Insurance Research Database of Taiwan, along with 124,836 controls matched for sex and age. After adjusting for any confounding factors, Fine and Gray competing risk analysis was used to compare the risk of developing dementia during the 10 years of follow-up. Of the study subjects, 1,704 of the 41,612 fibromyalgia patients (21.23 per 1,000 person-years) developed dementia, compared to 4,419 of the 124,836 controls (18.94 per 1,000 person-years). Fine and Gray competing risk analysis revealed that the study subjects were more likely to develop dementia (hazard ratio: 2.29, 95% CI: 2.16-2.42; P < 0.001). After adjusting for sex, age, monthly income, urbanization level, geographic region of residence and comorbidities, the hazard ratio was 2.77 (95% CI: 2.61-2.95, P < 0.001). Fibromyalgia was associated with increased risk of all types of dementia in this study. The study subjects with fibromyalgia had a 2.77-fold risk of dementia in comparison to the control group. Therefore, further studies are needed to elucidate the underlying mechanisms of the association between fibromyalgia and the risk of dementia. Copyright © 2018 Southern Society for Clinical Investigation. Published by Elsevier Inc. All rights reserved.
Mental illness and intensification of diabetes medications: an observational cohort study.
Frayne, Susan M; Holmes, Tyson H; Berg, Eric; Goldstein, Mary K; Berlowitz, Dan R; Miller, Donald R; Pogach, Leonard M; Laungani, Kaajal J; Lee, Tina T; Moos, Rudolf
2014-10-22
Mental health condition (MHC) comorbidity is associated with lower intensity care in multiple clinical scenarios. However, little is known about the effect of MHC upon clinicians' decisions about intensifying antiglycemic medications in diabetic patients with poor glycemic control. We examined whether delay in intensification of antiglycemic medications in response to an elevated Hemoglobin A1c (HbA1c) value is longer for patients with MHC than for those without MHC, and whether any such effect varies by specific MHC type. In this observational study of diabetic Veterans Health Administration (VA) patients on oral antiglycemics with poor glycemic control (HbA1c ≥8) (N =52,526) identified from national VA databases, we applied Cox regression analysis to examine time to intensification of antiglycemics after an elevated HbA1c value in 2003-2004, by MHC status. Those with MHC were no less likely to receive intensification: adjusted Hazard Ratio [95% CI] 0.99 [0.96-1.03], 1.13 [1.04-1.23], and 1.12 [1.07-1.18] at 0-14, 15-30 and 31-180 days, respectively. However, patients with substance use disorders were less likely than those without substance use disorders to receive intensification in the first two weeks following a high HbA1c, adjusted Hazard Ratio 0.89 [0.81-0.97], controlling for sex, age, medical comorbidity, other specific MHCs, and index HbA1c value. For most MHCs, diabetic patients with MHC in the VA health care system do not appear to receive less aggressive antiglycemic management. However, the subgroup with substance use disorders does appear to have excess likelihood of non-intensification; interventions targeting this high risk subgroup merit attention.
Thvilum, Marianne; Brandt, Frans; Brix, Thomas Heiberg; Hegedüs, Laszlo
2014-09-01
Hypothyroidism is associated with an increased somatic and psychiatric disease burden. Whether there are any socioeconomic consequences of hypothyroidism, such as early retirement or loss of income, remains unclarified. Our aim was to examine, compared with a matched control group, the risk of receiving disability pension (before the age of 60) and the effect on labor market income in patients diagnosed with hypothyroidism. This was an observational register-based cohort study. By record linkage between different Danish health registers, 1745 hypothyroid singletons diagnosed before the age of 60 were each matched with 4 non-hypothyroid controls and followed for a mean of 5 (range 1-31) years. Additionally, we included 277 same-sex twin pairs discordant for hypothyroidism. The risk of disability pension was evaluated by Cox regression analysis. Changes in labor market income progression over 5 years were evaluated using a difference-in-differences model. With a hazard ratio of 2.24 (95% confidence interval = 1.73-2.89), individuals diagnosed with hypothyroidism had a significantly increased risk of disability pension. This remained significant when adjusting for educational level and comorbidity (hazard ratio = 1.89; 95% confidence interval = 1.42-2.51). In an analysis of labor market income 2 years before versus 2 years after the diagnosis of hypothyroidism, the increase in income among hypothyroid individuals was on average €1605 smaller than that of their euthyroid controls (P < .001). Essentially similar results were found in the twin population. A diagnosis of hypothyroidism before the age of 60 is associated with loss of labor market income and an 89% increased risk of receiving a disability pension.
Risk Factors and Outcomes in Transfusion-associated Circulatory Overload
Murphy, Edward L.; Kwaan, Nicholas; Looney, Mark R.; Gajic, Ognjen; Hubmayr, Rolf D.; Gropper, Michael A.; Koenigsberg, Monique; Wilson, Greg; Matthay, Michael; Bacchetti, Peter; Toy, Pearl
2013-01-01
BACKGROUND Transfusion-associated circulatory overload is characterized by new respiratory distress and hydrostatic pulmonary edema within 6 hours after blood transfusion, but its risk factors and outcomes are poorly characterized. METHODS Using a case control design, we enrolled 83 patients with severe transfusion-associated circulatory overload identified by active surveillance for hypoxemia and 163 transfused controls at the University of California, San Francisco (UCSF) and Mayo Clinic (Rochester, Minn) hospitals. Odds ratios (OR) and 95% confidence intervals (CI) were calculated using multivariable logistic regression, and survival and length of stay were analyzed using proportional hazard models. RESULTS Transfusion-associated circulatory overload was associated with chronic renal failure (OR 27.0; 95% CI, 5.2–143), a past history of heart failure (OR 6.6; 95% CI, 2.1–21), hemorrhagic shock (OR 113; 95% CI, 14.1–903), number of blood products transfused (OR 1.11 per unit; 95% CI, 1.01–1.22), and fluid balance per hour (OR 9.4 per liter; 95% CI, 3.1–28). Patients with transfusion-associated circulatory overload had significantly increased in-hospital mortality (hazard ratio 3.20; 95% CI, 1.23–8.10) after controlling for Acute Physiology and Chronic Health Evaluation-II (APACHE-II) score, and longer hospital and intensive care unit lengths of stay. CONCLUSIONS The risk of transfusion-associated circulatory overload increases with the number of blood products administered and a positive fluid balance, and in patients with pre-existing heart failure and chronic renal failure. These data, if replicated, could be used to construct predictive algorithms for transfusion-associated circulatory overload, and subsequent modifications of transfusion practice might prevent morbidity and mortality associated with this complication. PMID:23357450
76 FR 45592 - Delegation of Authority for the Office of Healthy Homes and Lead Hazard Control
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-29
... the Office of Healthy Homes and Lead Hazard Control AGENCY: Office of the Secretary, HUD. ACTION... of 1992, the Office of Healthy Homes and Lead Hazard Control (OHHLHC) is authorized to develop, demonstrate, and promote measures to correct lead-based paint-related health and safety hazards in the home...
Markus, Hugh S; King, Alice; Shipley, Martin; Topakian, Raffi; Cullinane, Marisa; Reihill, Sheila; Bornstein, Natan M; Schaafsma, Arjen
2010-01-01
Background Whether surgery is beneficial for patients with asymptomatic carotid stenosis is controversial. Better methods of identifying patients who are likely to develop stroke would improve the risk–benefit ratio for carotid endarterectomy. We aimed to investigate whether detection of asymptomatic embolic signals by use of transcranial Doppler (TCD) could predict stroke risk in patients with asymptomatic carotid stenosis. Methods The Asymptomatic Carotid Emboli Study (ACES) was a prospective observational study in patients with asymptomatic carotid stenosis of at least 70% from 26 centres worldwide. To detect the presence of embolic signals, patients had two 1 h TCD recordings from the ipsilateral middle cerebral artery at baseline and one 1 h recording at 6, 12, and 18 months. Patients were followed up for 2 years. The primary endpoint was ipsilateral stroke and transient ischaemic attack. All recordings were analysed centrally by investigators masked to patient identity. Findings 482 patients were recruited, of whom 467 had evaluable recordings. Embolic signals were present in 77 of 467 patients at baseline. The hazard ratio for the risk of ipsilateral stroke and transient ischaemic attack from baseline to 2 years in patients with embolic signals compared with those without was 2·54 (95% CI 1·20–5·36; p=0·015). For ipsilateral stroke alone, the hazard ratio was 5·57 (1·61–19·32; p=0·007). The absolute annual risk of ipsilateral stroke or transient ischaemic attack between baseline and 2 years was 7·13% in patients with embolic signals and 3·04% in those without, and for ipsilateral stroke was 3·62% in patients with embolic signals and 0·70% in those without. 
The hazard ratio for the risk of ipsilateral stroke and transient ischaemic attack for patients who had embolic signals on the recording preceding the next 6-month follow-up compared with those who did not was 2·63 (95% CI 1·01–6·88; p=0·049), and for ipsilateral stroke alone the hazard ratio was 6·37 (1·59–25·57; p=0·009). Controlling for antiplatelet therapy, degree of stenosis, and other risk factors did not alter the results. Interpretation Detection of asymptomatic embolisation on TCD can be used to identify patients with asymptomatic carotid stenosis who are at a higher risk of stroke and transient ischaemic attack, and also those with a low absolute stroke risk. Assessment of the presence of embolic signals on TCD might be useful in the selection of patients with asymptomatic carotid stenosis who are likely to benefit from endarterectomy. Funding British Heart Foundation. PMID:20554250
Roldan-Valadez, Ernesto; Rios, Camilo; Motola-Kuba, Daniel; Matus-Santos, Juan; Villa, Antonio R; Moreno-Jimenez, Sergio
2016-11-01
A long-lasting concern has prevailed for the identification of predictive biomarkers for high-grade gliomas (HGGs) using MRI. However, a consensus on which imaging parameters assemble a significant survival model is still missing in the literature; we investigated the significant positive or negative contribution of several MR biomarkers to this tumour's prognosis. A retrospective cohort of supratentorial HGGs [11 glioblastoma multiforme (GBM) and 17 anaplastic astrocytomas] included 28 patients (9 females and 19 males, with a mean age of 50.4 years, standard deviation: 16.28 years; range: 13-85 years). Oedema and viable tumour measurements were acquired using regions of interest in T1 weighted, T2 weighted, fluid-attenuated inversion recovery, apparent diffusion coefficient (ADC) and MR spectroscopy (MRS) images. We calculated Kaplan-Meier curves and fitted Cox proportional hazards models. During the follow-up period (3-98 months), 17 deaths were recorded. The median survival time was 1.73 years (range, 0.287-8.947 years). Only 3 out of 20 covariates (choline-to-N-acetyl aspartate and lipids-lactate-to-creatine ratios and age) showed significance in explaining the variability in the survival hazards model; score test: χ²(3) = 9.098, p = 0.028. MRS metabolites overcome volumetric parameters of peritumoral oedema and viable tumour, as well as tumour region ADC measurements. Specific MRS ratios (Cho/NAA, L-L/Cr) might be considered in a regular follow-up for these tumours. Advances in knowledge: Cho/NAA ratio is the strongest survival predictor with a log-hazard function of 2.672 in GBM. Low levels of the lipids-lactate/Cr ratio represent up to a 41.6% reduction in the risk of death in GBM.
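In Cox models like the one above, a reported log-hazard coefficient converts to a hazard ratio by exponentiation, HR = exp(β). As a one-line illustration using the Cho/NAA coefficient quoted above (an arithmetic aside, not part of the study's analysis):

```python
import math

# A Cox log-hazard coefficient (beta) converts to a hazard ratio via HR = exp(beta).
beta = 2.672  # log-hazard for the Cho/NAA ratio reported in the abstract above
hr = math.exp(beta)
print(round(hr, 1))  # -> 14.5, i.e. a roughly 14-fold hazard per unit increase
```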
Palatini, Paolo; Reboldi, Gianpaolo; Beilin, Lawrence J; Eguchi, Kazuo; Imai, Yutaka; Kario, Kazuomi; Ohkubo, Takayoshi; Pierdomenico, Sante D; Saladini, Francesca; Schwartz, Joseph E; Wing, Lindon; Verdecchia, Paolo
2013-09-30
Data from prospective cohort studies regarding the association between ambulatory heart rate (HR) and cardiovascular events (CVE) are conflicting. To investigate whether ambulatory HR predicts CVE in hypertension, we performed 24-hour ambulatory blood pressure and HR monitoring in 7600 hypertensive patients aged 52 ± 16 years from Italy, U.S.A., Japan, and Australia, included in the 'ABP-International' registry. All were untreated at baseline examination. Standardized hazard ratios for ambulatory HRs were computed, stratifying for cohort, and adjusting for age, gender, blood pressure, smoking, diabetes, serum total cholesterol and serum creatinine. During a median follow-up of 5.0 years there were 639 fatal and nonfatal CVE. In a multivariable Cox model, night-time HR predicted fatal combined with nonfatal CVE more closely than 24h HR (p=0.007 and p=0.03, respectively). Daytime HR and the night:day HR ratio were not associated with CVE (p=0.07 and p=0.18, respectively). The hazard ratio of the fatal combined with nonfatal CVE for a 10-beats/min increment of the night-time HR was 1.13 (95% CI, 1.04-1.22). This relationship remained significant when subjects taking beta-blockers during the follow-up (hazard ratio, 1.15; 95% CI, 1.05-1.25) or subjects who had an event within 5 years after enrollment (hazard ratio, 1.23; 95% CI, 1.05-1.45) were excluded from analysis. At variance with previous data obtained from general populations, ambulatory HR added to the risk stratification for fatal combined with nonfatal CVE in the hypertensive patients from the ABP-International study. Night-time HR was a better predictor of CVE than daytime HR. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Marui, Akira; Kimura, Takeshi; Nishiwaki, Noboru; Mitsudo, Kazuaki; Komiya, Tatsuhiko; Hanyu, Michiya; Shiomi, Hiroki; Tanaka, Shiro; Sakata, Ryuzo
2014-10-01
Coronary heart disease is a major risk factor for left ventricular (LV) systolic dysfunction. However, limited data are available regarding the long-term benefits of percutaneous coronary intervention (PCI) in the drug-eluting stent era versus coronary artery bypass grafting (CABG) in patients with LV systolic dysfunction and severe coronary artery disease. We identified 3,584 patients with 3-vessel and/or left main disease among the 15,939 patients undergoing first myocardial revascularization enrolled in the CREDO-Kyoto PCI/CABG Registry Cohort-2. Of them, 2,676 patients had preserved LV systolic function, defined as an LV ejection fraction (LVEF) of >50%, and 908 had impaired LV systolic function (LVEF ≤50%). In patients with preserved LV function, 5-year outcomes were not different between PCI and CABG regarding propensity score-adjusted risk of all-cause and cardiac deaths. In contrast, in patients with impaired LV systolic function, the risks of all-cause and cardiac deaths after PCI were significantly greater than those after CABG (hazard ratio 1.49, 95% confidence interval 1.04 to 2.14, p=0.03 and hazard ratio 2.39, 95% confidence interval 1.43 to 3.98, p<0.01). In both patients with moderate (35%
Rosuvastatin to prevent vascular events in men and women with elevated C-reactive protein.
Ridker, Paul M; Danielson, Eleanor; Fonseca, Francisco A H; Genest, Jacques; Gotto, Antonio M; Kastelein, John J P; Koenig, Wolfgang; Libby, Peter; Lorenzatti, Alberto J; MacFadyen, Jean G; Nordestgaard, Børge G; Shepherd, James; Willerson, James T; Glynn, Robert J
2008-11-20
Increased levels of the inflammatory biomarker high-sensitivity C-reactive protein predict cardiovascular events. Since statins lower levels of high-sensitivity C-reactive protein as well as cholesterol, we hypothesized that people with elevated high-sensitivity C-reactive protein levels but without hyperlipidemia might benefit from statin treatment. We randomly assigned 17,802 apparently healthy men and women with low-density lipoprotein (LDL) cholesterol levels of less than 130 mg per deciliter (3.4 mmol per liter) and high-sensitivity C-reactive protein levels of 2.0 mg per liter or higher to rosuvastatin, 20 mg daily, or placebo and followed them for the occurrence of the combined primary end point of myocardial infarction, stroke, arterial revascularization, hospitalization for unstable angina, or death from cardiovascular causes. The trial was stopped after a median follow-up of 1.9 years (maximum, 5.0). Rosuvastatin reduced LDL cholesterol levels by 50% and high-sensitivity C-reactive protein levels by 37%. The rates of the primary end point were 0.77 and 1.36 per 100 person-years of follow-up in the rosuvastatin and placebo groups, respectively (hazard ratio for rosuvastatin, 0.56; 95% confidence interval [CI], 0.46 to 0.69; P<0.00001), with corresponding rates of 0.17 and 0.37 for myocardial infarction (hazard ratio, 0.46; 95% CI, 0.30 to 0.70; P=0.0002), 0.18 and 0.34 for stroke (hazard ratio, 0.52; 95% CI, 0.34 to 0.79; P=0.002), 0.41 and 0.77 for revascularization or unstable angina (hazard ratio, 0.53; 95% CI, 0.40 to 0.70; P<0.00001), 0.45 and 0.85 for the combined end point of myocardial infarction, stroke, or death from cardiovascular causes (hazard ratio, 0.53; 95% CI, 0.40 to 0.69; P<0.00001), and 1.00 and 1.25 for death from any cause (hazard ratio, 0.80; 95% CI, 0.67 to 0.97; P=0.02). Consistent effects were observed in all subgroups evaluated. 
The rosuvastatin group did not have a significant increase in myopathy or cancer but did have a higher incidence of physician-reported diabetes. In this trial of apparently healthy persons without hyperlipidemia but with elevated high-sensitivity C-reactive protein levels, rosuvastatin significantly reduced the incidence of major cardiovascular events. (ClinicalTrials.gov number, NCT00239681.) © 2008 Massachusetts Medical Society.
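As a quick arithmetic illustration (not part of the trial report above), when events are rare the hazard ratio from a Cox model is usually close to the crude incidence-rate ratio, so the rates quoted in this abstract can be sanity-checked directly:

```python
# Illustrative check using the rates quoted in the abstract above:
# primary end point rates of 0.77 and 1.36 events per 100 person-years.
# With rare events, the crude incidence-rate ratio approximates the
# Cox-model hazard ratio (reported as 0.56; 95% CI, 0.46 to 0.69).

rate_rosuvastatin = 0.77  # events per 100 person-years
rate_placebo = 1.36       # events per 100 person-years

rate_ratio = rate_rosuvastatin / rate_placebo
print(f"crude incidence-rate ratio: {rate_ratio:.2f}")

# The small gap between this ratio (~0.57) and the reported 0.56 reflects
# censoring and the time-to-event weighting that the Cox model applies.
```

The same check works for the other end points quoted above, e.g. 0.17/0.37 ≈ 0.46 for myocardial infarction, matching the reported hazard ratio.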
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-13
... and Hazardous Substances Pollution Contingency Plan National Priorities List AGENCY: Environmental... protection, Air pollution control, Chemicals, Hazardous Waste, Hazardous substances, Intergovernmental relations, Penalties, Reporting and recordkeeping requirements, Superfund, Water pollution control, Water...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-18
... protection, Air pollution control, Chemicals, Hazardous Waste, Hazardous substances, Intergovernmental relations, Penalties, Reporting and recordkeeping requirements, Superfund, Water pollution control, Water... and Hazardous Substances Pollution Contingency Plan; National Priorities List AGENCY: Environmental...
21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2010 CFR
2010-04-01
...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard Analysis and Critical Control Point (HACCP...
21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2012 CFR
2012-04-01
...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard Analysis and Critical Control Point (HACCP...
21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2013 CFR
2013-04-01
...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Hazard Analysis and Critical Control Point (HACCP...
21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2014 CFR
2014-04-01
...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Hazard Analysis and Critical Control Point (HACCP...
21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2011 CFR
2011-04-01
...: (i) Critical control points designed to control food hazards that are reasonably likely to occur and could be introduced inside the processing plant environment; and (ii) Critical control points designed... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Hazard Analysis and Critical Control Point (HACCP...
Racial differences in colorectal cancer mortality. The importance of stage and socioeconomic status.
Marcella, S; Miller, J E
2001-04-01
This investigation studies racial and socioeconomic differences in mortality from colorectal cancer, and how they vary by stage and age at diagnosis. Cox proportional hazards models were used to estimate the hazard ratio of dying from colorectal cancer, controlling for tumor characteristics and sociodemographic factors. Black adults had a greater risk of death from colorectal cancer, especially in early stages. The gender gap in mortality is wider among blacks than whites. Differences in tumor characteristics and socioeconomic factors each accounted for approximately one third of the excess risk of death among blacks. Effects of socioeconomic factors and race varied significantly by age. Higher stage-specific mortality rates and more advanced stage at diagnosis both contribute to the higher case-fatality rates from colorectal cancer among black adults, only some of which is due to socioeconomic differences. Socioeconomic and racial factors have their most significant effects in different age groups.
24 CFR 35.1330 - Interim controls.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 24 Housing and Urban Development 1 2012-04-01 2012-04-01 false Interim controls. 35.1330 Section... Lead-Paint Hazard Evaluation and Hazard Reduction Activities § 35.1330 Interim controls. Interim controls of lead-based paint hazards identified in a risk assessment shall be conducted in accordance with...
Agalsidase-beta therapy for advanced Fabry disease: a randomized trial.
Banikazemi, Maryam; Bultas, Jan; Waldek, Stephen; Wilcox, William R; Whitley, Chester B; McDonald, Marie; Finkel, Richard; Packman, Seymour; Bichet, Daniel G; Warnock, David G; Desnick, Robert J
2007-01-16
Fabry disease (alpha-galactosidase A deficiency) is a rare, X-linked lysosomal storage disorder that can cause early death from renal, cardiac, and cerebrovascular involvement. To determine whether agalsidase beta delays the onset of a composite clinical outcome of renal, cardiovascular, and cerebrovascular events and death in patients with advanced Fabry disease. Randomized (2:1 treatment-to-placebo randomization), double-blind, placebo-controlled trial. 41 referral centers in 9 countries. 82 adults with mild to moderate kidney disease, 74 of whom were protocol-adherent. Intravenous infusion of agalsidase beta (1 mg per kg of body weight) or placebo every 2 weeks for up to 35 months (median, 18.5 months). The primary end point was the time to first clinical event (renal, cardiac, or cerebrovascular event or death). Six patients withdrew before reaching an end point: 3 to receive commercial therapy and 3 because of positive or inconclusive serum IgE or skin test results. Three patients assigned to agalsidase beta elected to transition to open-label treatment before reaching an end point. Thirteen (42%) of the 31 patients in the placebo group and 14 (27%) of the 51 patients in the agalsidase-beta group experienced clinical events. Primary intention-to-treat analysis that adjusted for an imbalance in baseline proteinuria showed that, compared with placebo, agalsidase beta delayed the time to first clinical event (hazard ratio, 0.47 [95% CI, 0.21 to 1.03]; P = 0.06). Secondary analyses of protocol-adherent patients showed similar results (hazard ratio, 0.39 [CI, 0.16 to 0.93]; P = 0.034). Ancillary subgroup analyses found larger treatment effects in patients with baseline estimated glomerular filtration rates greater than 55 mL/min per 1.73 m2 (hazard ratio, 0.19 [CI, 0.05 to 0.82]; P = 0.025) than in those with rates of 55 mL/min per 1.73 m2 or less (hazard ratio, 0.85 [CI, 0.32 to 2.3]; P = 0.75) (formal test for interaction, P = 0.09).
Most treatment-related adverse events were mild or moderate infusion-associated reactions, reported by 55% of patients in the agalsidase-beta group and 23% of patients in the placebo group. The study sample was small. Only one third of the patients experienced clinical events, and some patients withdrew before experiencing any event. Agalsidase-beta therapy slowed progression to the composite clinical outcome of renal, cardiac, and cerebrovascular complications and death compared with placebo in patients with advanced Fabry disease. Therapeutic intervention before irreversible organ damage may provide greater clinical benefit.
Li, Wenjin; Ray, Roberta M.; Thomas, David B.; Yost, Michael; Davis, Scott; Breslow, Norman; Gao, Dao Li; Fitzgibbons, E. Dawn; Camp, Janice E.; Wong, Eva; Wernli, Karen J.; Checkoway, Harvey
2013-01-01
Exposure to magnetic fields (MFs) is hypothesized to increase the risk of breast cancer by reducing production of melatonin by the pineal gland. A nested case-cohort study was conducted to investigate the association between occupational exposure to MFs and the risk of breast cancer within a cohort of 267,400 female textile workers in Shanghai, China. The study included 1,687 incident breast cancer cases diagnosed from 1989 to 2000 and 4,702 noncases selected from the cohort. Subjects’ complete work histories were linked to a job–exposure matrix developed specifically for the present study to estimate cumulative MF exposure. Hazard ratios and 95% confidence intervals were calculated using Cox proportional hazards modeling that was adapted for the case-cohort design. Hazard ratios were estimated in relation to cumulative exposure during a woman's entire working years. No association was observed between cumulative exposure to MFs and overall risk of breast cancer. The hazard ratio for the highest compared with the lowest quartile of cumulative exposure was 1.03 (95% confidence interval: 0.87, 1.21). Similar null findings were observed when exposures were lagged and stratified by age at breast cancer diagnosis. The findings do not support the hypothesis that MF exposure increases the risk of breast cancer. PMID:24043439
Optimal Scaling of Aftershock Zones using Ground Motion Forecasts
NASA Astrophysics Data System (ADS)
Wilson, John Max; Yoder, Mark R.; Rundle, John B.
2018-02-01
The spatial distribution of aftershocks following major earthquakes has received significant attention due to the shaking hazard these events pose for structures and populations in the affected region. Forecasting the spatial distribution of aftershock events is an important part of the estimation of future seismic hazard. A simple spatial shape for the zone of activity has often been assumed in the form of an ellipse having semimajor axis to semiminor axis ratio of 2.0. However, since an important application of these calculations is the estimation of ground shaking hazard, an effective criterion for forecasting future aftershock impacts is to use ground motion prediction equations (GMPEs) in addition to the more usual approach of using epicentral or hypocentral locations. Based on these ideas, we present an aftershock model that uses self-similarity and scaling relations to constrain parameters as an option for such hazard assessment. We fit the spatial aspect ratio to previous earthquake sequences in the studied regions, and demonstrate the effect of the fitting on the likelihood of post-disaster ground motion forecasts for eighteen recent large earthquakes. We find that the forecasts in most geographic regions studied benefit from this optimization technique, while some are better suited to the use of the a priori aspect ratio.
Pandis, Nikolaos; Polychronopoulou, Argy; Eliades, Theodore
2009-08-01
The purpose of this study was to investigate the efficiency of copper-nickel-titanium (CuNiTi) vs nickel-titanium (NiTi) archwires in resolving crowding of the anterior mandibular dentition. Sixty patients were included in this single-center, single-operator, double-blind randomized trial. All patients were bonded with the In Ovation-R self-ligating bracket (GAC, Central Islip, NY) with a 0.022-in slot, and the amount of crowding of the mandibular anterior dentition was assessed by using the irregularity index. The patients were randomly allocated into 2 groups of 30 patients, each receiving either a 0.016-in CuNiTi 35 degrees C (Ormco, Glendora, Calif) or a 0.016-in NiTi (ModernArch, Wyomissing, Pa) wire. The type of wire selected for each patient was not disclosed to the provider or the patient. The date that each patient received a wire was recorded, and all patients were followed monthly for a maximum of 6 months. Demographic and clinical characteristics between the 2 wire groups were compared with the t test or the chi-square test and the Fisher exact test. Time to resolve crowding was explored with statistical methods for survival analysis, and alignment rate ratios for wire type and crowding level were calculated with Cox proportional hazards multivariate modeling. The type of wire (CuNiTi vs NiTi) had no significant effect on crowding alleviation (129.4 vs 121.4 days; hazard ratio, 1.3; P >0.05). Severe crowding (>5 on the irregularity index) took significantly longer to resolve than dental arches with an irregularity score of <5 (138.5 vs 113.1 days; hazard ratio, 2.2; P=0.02). The difference in the loading pattern of wires between laboratory and clinical conditions might effectively eliminate the laboratory-derived advantage of CuNiTi wires.
Okazaki, Satoshi; Schirripa, Marta; Loupakis, Fotios; Cao, Shu; Zhang, Wu; Yang, Dongyun; Ning, Yan; Berger, Martin D; Miyamoto, Yuji; Suenaga, Mitsukuni; Iqubal, Syma; Barzi, Afsaneh; Cremolini, Chiara; Falcone, Alfredo; Battaglin, Francesca; Salvatore, Lisa; Borelli, Beatrice; Helentjaris, Timothy G; Lenz, Heinz-Josef
2017-11-15
The hypermethylated in cancer 1/sirtuin 1 (HIC1/SIRT1) axis plays an important role in regulating the nucleotide excision repair pathway, which is the main oxaliplatin-induced damage-repair system. On the basis of prior evidence that the variable number of tandem repeat (VNTR) sequence located near the promoter region of HIC1 is associated with HIC1 gene expression, the authors tested the hypothesis that this VNTR is associated with clinical outcome in patients with metastatic colorectal cancer who receive oxaliplatin-based chemotherapy. Four independent cohorts were tested. Patients who received oxaliplatin-based chemotherapy served as the training cohort (n = 218), and those who received treatment without oxaliplatin served as the control cohort (n = 215). Two cohorts of patients who received oxaliplatin-based chemotherapy were used for validation studies (n = 176 and n = 73). The VNTR sequence near HIC1 was analyzed by polymerase chain reaction analysis and gel electrophoresis and was tested for associations with the response rate, progression-free survival (PFS), and overall survival. In the training cohort, patients who harbored at least 5 tandem repeats (TRs) in both alleles had a significantly shorter PFS compared with those who had fewer than 4 TRs in at least 1 allele (9.5 vs 11.6 months; hazard ratio, 1.93; P = .012), and these findings remained statistically significant after multivariate analysis (hazard ratio, 2.00; 95% confidence interval, 1.13-3.54; P = .018). This preliminary association was confirmed in the validation cohort, in which patients who had at least 5 TRs in both alleles likewise had a worse PFS (7.9 vs 9.8 months; hazard ratio, 1.85; P = .044). The current findings suggest that the VNTR sequence near HIC1 could be a predictive marker for oxaliplatin-based chemotherapy in patients with metastatic colorectal cancer. Cancer 2017;123:4506-14. © 2017 American Cancer Society.
VanderWeele, Tyler J; Yu, Jeffrey; Cozier, Yvette C; Wise, Lauren; Argentieri, M Austin; Rosenberg, Lynn; Palmer, Julie R; Shields, Alexandra E
2017-04-01
Previous longitudinal studies have consistently shown an association between attendance at religious services and lower all-cause mortality, but the literature on associations between other measures of religion and spirituality (R/S) and mortality is limited. We followed 36,613 respondents from the Black Women's Health Study from 2005 through December 31, 2013 to assess the associations between R/S and incident all-cause mortality using proportional hazards models. After control for numerous demographic and health covariates, together with other R/S variables, attending religious services several times per week was associated with a substantially lower mortality rate ratio (mortality rate ratio = 0.64, 95% confidence interval: 0.51, 0.80) relative to never attending services. Engaging in prayer several times per day was not associated with mortality after control for demographic and health covariates, but the association trended towards a higher mortality rate ratio when control was made for other R/S variables (for >2 times/day vs. weekly or less, mortality rate ratio = 1.28, 95% confidence interval: 0.99, 1.67; P-trend < 0.01). Religious coping and self-identification as a very religious/spiritual person were associated with lower mortality when adjustment was made only for age, but the association was attenuated when control was made for demographic and health covariates and was almost entirely eliminated when control was made for other R/S variables. The results indicate that service attendance was the strongest R/S predictor of mortality in this cohort. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Inoue, Manami; Iso, Hiroyasu; Yamamoto, Seiichiro; Kurahashi, Norie; Iwasaki, Motoki; Sasazuki, Shizuka; Tsugane, Shoichiro
2008-07-01
The impact of daily total physical activity level on premature deaths has not been fully clarified in non-Western, relatively lean populations. We prospectively examined the association between daily total physical activity level (METs/day) and subsequent risk of all-cause mortality and mortalities from cancer, heart disease, and cerebrovascular disease. A total of 83,034 general Japanese citizens ages 45-74 years who responded to the questionnaire in 1995-1999 were followed for any cause of death through December 2005. Multivariate-adjusted hazard ratios were calculated with a Cox proportional hazards model controlling for potential confounding factors. During follow-up, a total of 4564 deaths were recorded. Compared with subjects in the lowest quartile, increased daily total physical activity was associated with a significantly decreased risk of all-cause mortality in both sexes (hazard ratios for the second, third, and highest quartiles were: men, 0.79, 0.82, 0.73 and women, 0.75, 0.64, 0.61, respectively). The decreased risk was observed regardless of age, frequency of leisure-time sports or physical exercise, or obesity status, albeit with a degree of risk attenuation among those with a high body mass index. A significantly decreased risk was similarly observed for death from cancer and heart disease in both sexes, and from cerebrovascular disease in women. Greater daily total physical activity level, either from occupation, daily life, or leisure time, may be of benefit in preventing premature death.
Statins Are Associated With Reduced Mortality in Multiple Myeloma
Keller, Jesse; Gage, Brian F.; Luo, Suhong; Wang, Tzu-Fei; Moskowitz, Gerald; Gumbel, Jason; Blue, Brandon; O’Brian, Katiuscia; Carson, Kenneth R.
2016-01-01
Purpose: The 3-hydroxy-3-methylglutaryl-coenzyme A reductase inhibitors (statins) have activity in one of the pathways influenced by nitrogen-containing bisphosphonates, which are associated with improved survival in multiple myeloma (MM). To understand the benefit of statins in MM, we evaluated the association between statin use and mortality in a large cohort of patients with MM. Patients and Methods: From the Veterans Administration Central Cancer Registry, we identified patients diagnosed with MM between 1999 and 2013. We defined statin use as the presence of any prescription for a statin within 3 months before or any time after MM diagnosis. Cox proportional hazards regression assessed the association of statin use with mortality, while controlling for known MM prognostic factors. Results: We identified a cohort of 4,957 patients, of whom 2,294 received statin therapy. Statin use was associated with a 21% decrease in all-cause mortality (adjusted hazard ratio, 0.79; 95% CI, 0.73 to 0.86; P < .001) as well as a 24% decrease in MM-specific mortality (adjusted hazard ratio, 0.76; 95% CI, 0.67 to 0.86; P < .001). This association remained significant across all sensitivity analyses. In addition to reductions in mortality, statin use was associated with a 31% decreased risk of developing a skeletal-related event. Conclusion: In this cohort study of US veterans with MM, statin therapy was associated with a reduced risk of both all-cause and MM-specific mortality. Our findings suggest a potential role for statin therapy in patients with MM. The putative benefit of statin therapy in MM should be corroborated in prospective studies. PMID:27646948
Association of Unfinished Root Canal Treatments with the Risk of Pneumonia Hospitalization.
Lin, Po-Yen; Chiang, Yu-Chih; Chou, Yu-Ju; Chang, Hong-Ji; Chi, Lin-Yang
2017-01-01
The objective of root canal treatments (RCTs) is to control pulpal diseases and salvage infected teeth by eradicating microorganisms within the root canal system. However, an unfinished RCT can leave a space for bacterial accumulation, which can leak into the oral cavity and then be aspirated into the lower respiratory tract and the lungs, causing infection. This study investigated the association of unfinished RCTs with the possible risk of pneumonia hospitalization using a nationwide population-based database. After a matching process, we recruited 116,490 subjects who received an initiated RCT and had no history of pneumonia before 2005, and observed them until the end of 2011. An unfinished RCT was operationally defined as an endodontic session that was started on a tooth but had no subsequent completion records. Cox proportional hazards models and subgroup analyses were used to estimate the association of unfinished RCTs with the risk of pneumonia hospitalization. In total, 1285 subjects were hospitalized for pneumonia during 2005 to 2011, with an overall pneumonia hospitalization incidence rate of 0.22% per person-year. After adjusting for confounding factors, the adjusted pneumonia hospitalization hazard ratio for subjects who had unfinished RCTs was 1.40 (95% confidence interval, 1.24-1.59) compared with subjects without unfinished RCTs (P < .0001). For middle-aged patients, the hazard ratio was 1.81 (95% confidence interval, 1.45-2.24). Patients with unfinished RCTs had a higher risk of pneumonia hospitalization. Thus, dentists are advised to complete endodontic treatments once started. Copyright © 2016 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
Tsinovoi, Cari L; Xun, Pengcheng; McClure, Leslie A; Carioni, Vivian M O; Brockman, John D; Cai, Jianwen; Guallar, Eliseo; Cushman, Mary; Unverzagt, Frederick W; Howard, Virginia J; He, Ka
2018-01-01
The purpose of this case-cohort study was to examine urinary arsenic levels in relation to incident ischemic stroke in the United States. We performed a case-cohort study nested within the REGARDS (REasons for Geographic and Racial Differences in Stroke) cohort. A subcohort (n=2486) of controls was randomly sampled within region-race-sex strata while all incident ischemic stroke cases from the full REGARDS cohort (n=671) were included. Baseline urinary arsenic was measured by inductively coupled plasma-mass spectrometry. Arsenic species, including urinary inorganic arsenic and its metabolites monomethylarsonic acid and dimethylarsinic acid, were measured in a random subset (n=199). Weighted Cox's proportional hazards models were used to calculate hazard ratios and 95% confidence intervals of ischemic stroke by arsenic and its species. The average follow-up was 6.7 years. Although incident ischemic stroke showed no association with total arsenic or total inorganic arsenic, for each unit higher level of urinary monomethylarsonic acid on a log-scale, after adjustment for potential confounders, ischemic stroke risk increased ≈2-fold (hazard ratio=1.98; 95% confidence interval: 1.12-3.50). Effect modification by age, race, sex, or geographic region was not evident. A metabolite of arsenic was positively associated with incident ischemic stroke in this case-cohort study of the US general population, a low-to-moderate exposure area. Overall, these findings suggest a potential role for arsenic methylation in the pathogenesis of stroke, having important implications for future cerebrovascular research. © 2017 American Heart Association, Inc.
Heikkilä, Katriina; Madsen, Ida E H; Nyberg, Solja T; Fransson, Eleonor I; Ahola, Kirsi; Alfredsson, Lars; Bjorner, Jakob B; Borritz, Marianne; Burr, Hermann; Dragano, Nico; Ferrie, Jane E; Knutsson, Anders; Koskenvuo, Markku; Koskinen, Aki; Nielsen, Martin L; Nordin, Maria; Pejtersen, Jan H; Pentti, Jaana; Rugulies, Reiner; Oksanen, Tuula; Shipley, Martin J; Suominen, Sakari B; Theorell, Töres; Väänänen, Ari; Vahtera, Jussi; Virtanen, Marianna; Westerlund, Hugo; Westerholm, Peter J M; Batty, G David; Singh-Manoux, Archana; Kivimäki, Mika
2014-01-01
Many clinicians, patients and patient advocacy groups believe stress to have a causal role in inflammatory bowel diseases, such as Crohn's disease and ulcerative colitis. However, this is not corroborated by clear epidemiological research evidence. We investigated the association between work-related stress and incident Crohn's disease and ulcerative colitis using individual-level data from 95,000 European adults. We conducted individual-participant data meta-analyses in a set of pooled data from 11 prospective European studies. All studies are a part of the IPD-Work Consortium. Work-related psychosocial stress was operationalised as job strain (a combination of high demands and low control at work) and was self-reported at baseline. Crohn's disease and ulcerative colitis were ascertained from national hospitalisation and drug reimbursement registers. The associations between job strain and inflammatory bowel disease outcomes were modelled using Cox proportional hazards regression. The study-specific results were combined in random effects meta-analyses. Of the 95,379 participants who were free of inflammatory bowel disease at baseline, 111 men and women developed Crohn's disease and 414 developed ulcerative colitis during follow-up. Job strain at baseline was not associated with incident Crohn's disease (multivariable-adjusted random effects hazard ratio: 0.83, 95% confidence interval: 0.48, 1.43) or ulcerative colitis (hazard ratio: 1.06, 95% CI: 0.76, 1.48). There was negligible heterogeneity among the study-specific associations. Our findings suggest that job strain, an indicator of work-related stress, is not a major risk factor for Crohn's disease or ulcerative colitis.
Early Posttherapy Hospitalizations Among Survivors of Childhood Leukemia and Lymphoma.
Smitherman, Andrew B; Wilkins, Tania M; Blatt, Julie; Dusetzina, Stacie B
2016-08-01
Long-term survivors of childhood cancers are at increased risk for hospitalization. To test the hypothesis that many treatment-related morbidities are identifiable in the early posttherapy period, we determined the rates and causes for hospitalization among survivors of leukemia and lymphoma during the first 3 years posttherapy. Using a health plan claims database, we identified patients aged 0 to 21 years old treated for leukemia or lymphoma from 2000 to 2010. Survivors were matched 10:1 with similar children without a history of cancer. Hospitalization rates over 3 years were compared using Cox proportional hazards regression and risks of cause-specific hospitalization were compared using log-binomial models. Nineteen percent of childhood leukemia and lymphoma survivors were hospitalized in the first 3 years off therapy. Leukemia survivors (N=529) experienced over 6 times (hazard ratio=6.3; 95% confidence interval [CI], 4.9-8.0) and lymphoma survivors (N=454) over 3 times the hospitalization rate of controls (hazard ratio=3.2; 95% CI, 2.5-4.2). Compared with children without a cancer history, survivors were at increased risk for hospitalization due to infectious causes (leukemia: relative risk [RR], 60.0; 95% CI, 23.4-154.0; lymphoma: RR, 10.0; 95% CI, 4.4-22.9). In addition, lymphoma survivors were at increased risk for cardiovascular-related (RR, 15.0; 95% CI, 5.4-42.0) and pulmonary-related (RR, 8.1; 95% CI, 3.9-16.8) hospitalizations. These findings highlight the morbidity experienced by survivors and suggest that treatment-associated complications may be emerging soon after therapy completion.
Wassertheil-Smoller, Sylvia; McGinn, Aileen P.; Martin, Lisa; Rodriguez, Beatriz L.; Stefanick, Marcia L.; Perez, Marco
2017-01-01
Atrial fibrillation (AF) is a common arrhythmia that poses a significant risk of stroke. Cross-sectional and case-control studies have shown evidence of associations between AF and breast or colorectal cancer, but there have been no longitudinal studies in which this has been assessed. We prospectively examined a cohort of 93,676 postmenopausal women enrolled in the Women's Health Initiative from 1994 to 1998 to determine whether there are relationships between baseline AF and the development of invasive breast or colorectal cancer. The prevalence of self-reported physician diagnosis of AF at baseline was 5.1%. Over approximately 15 years of follow-up, the incidence of invasive breast cancer was 5.7%, and the incidence of colorectal cancer was 1.6%. Adjusted hazard ratios and 95% confidence intervals were obtained using Cox proportional hazards models. We found no significant association between AF and incident colorectal cancer, but we did see a 19% excess risk of invasive breast cancer among those with AF (adjusted hazard ratio (HR) = 1.19, 95% confidence interval (CI): 1.03, 1.38). Additional adjustment for baseline use of cardiac glycosides attenuated the association between AF and invasive breast cancer (HR = 1.01, 95% CI: 0.85, 1.20). Cardiac glycoside use was strongly associated with incident invasive breast cancer (HR = 1.68, 95% CI: 1.33, 2.12) independent of AF and other confounders. Mechanisms of the associations among breast cancer, AF, and cardiac glycosides need further investigation. PMID:28174828
Singh, Jiwan; Lee, Byeong-Kyu
2015-04-01
Automobile shredder residue (ASR) is considered hazardous waste in Japan and European countries because of the presence of heavy metals. This study examined the extraction characteristics of heavy metals (Mn, Fe, Ni, and Cr) from ASR. The effects of pH, temperature, particle size, and liquid/solid (L/S) ratio on the extraction of heavy metals were investigated. The recovery rate of Mn, Fe, Ni, and Cr increased with increasing extraction temperature and L/S ratio. The lowest pH (2), the highest L/S ratio, and the smallest particle size gave the highest recovery of heavy metals from ASR. The recovery rates were in the following order: Mn > Ni > Cr > Fe. A reduction in the mobility factor of the heavy metals was observed in all size fractions after the recovery. Kinetic analysis under the various experimental conditions indicated that the reaction rate of the recovery process followed a second-order reaction model (R(2) ⩾ 0.95). The high availability of water-soluble fractions of Mn, Fe, Ni, and Cr from low-grade ASR could pose a hazard to the environment. The bioavailability and toxicity risk of the heavy metals were reduced significantly at pH 2 in distilled water. Moreover, water is a cost-effective extracting agent for the recovery of heavy metals and could be useful for reducing the toxicity of ASR. Copyright © 2015 Elsevier Ltd. All rights reserved.
Preclinical Alzheimer disease and risk of falls
Roe, Catherine M.; Grant, Elizabeth A.; Hollingsworth, Holly; Benzinger, Tammie L.; Fagan, Anne M.; Buckles, Virginia D.; Morris, John C.
2013-01-01
Objective: We determined the rate of falls among cognitively normal, community-dwelling older adults, some of whom had presumptive preclinical Alzheimer disease (AD) as detected by in vivo imaging of fibrillar amyloid plaques using Pittsburgh compound B (PiB) and PET and/or by assays of CSF to identify Aβ42, tau, and phosphorylated tau. Methods: We conducted a 12-month prospective cohort study to examine the cumulative incidence of falls. Participants were evaluated clinically and underwent PiB PET imaging and lumbar puncture. Falls were reported monthly using an individualized calendar journal returned by mail. A Cox proportional hazards model was used to test whether time to first fall was associated with each biomarker and the ratio of CSF tau/Aβ42 and CSF phosphorylated tau/Aβ42, after adjustment for common fall risk factors. Results: The sample (n = 125) was predominately female (62.4%) and white (96%) with a mean age of 74.4 years. When controlled for ability to perform activities of daily living, higher levels of PiB retention (hazard ratio = 2.95 [95% confidence interval 1.01–6.45], p = 0.05) and of CSF biomarker ratios (p < 0.001) were associated with a faster time to first fall. Conclusions: Presumptive preclinical AD is a risk factor for falls in older adults. This study suggests that subtle noncognitive changes that predispose older adults to falls are associated with AD and may precede detectable cognitive changes. PMID:23803314
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-13
... and Hazardous Substances Pollution Contingency Plan National Priorities List AGENCY: Environmental... pollution control, Chemicals, Hazardous Waste, Hazardous substances, Intergovernmental relations, Penalties, Reporting and recordkeeping requirements, Superfund, Water pollution control, Water supply. Authority: 33 U...
Handbook on Bird Management and Control.
1980-03-01
strike hazard (p. 70); FIGURE 10. Sample application of sharp projections (p. 80); FIGURE 11. Plans for a low profile pigeon trap ... particularly with Domestic Pigeons, Starlings, and House Sparrows, can reduce most of the hazards mentioned in this chapter. 4.2. HEALTH HAZARDS 4.2.1 ... damage, pest bird control, hazardous bird control, bird biology & behavior, altering the concept, altering the situation, exclusion, repulsion, removal
Rates of Atrial Fibrillation in Black Versus White Patients With Pacemakers.
Kamel, Hooman; Kleindorfer, Dawn O; Bhave, Prashant D; Cushman, Mary; Levitan, Emily B; Howard, George; Soliman, Elsayed Z
2016-02-12
Black US residents experience higher rates of ischemic stroke than white residents but have lower rates of clinically apparent atrial fibrillation (AF), a strong risk factor for stroke. It is unclear whether black persons truly have less AF or simply more undiagnosed AF. We obtained administrative claims data from state health agencies regarding all emergency department visits and hospitalizations in California, Florida, and New York. We identified a cohort of patients with pacemakers, the regular interrogation of which reduces the likelihood of undiagnosed AF. We compared rates of documented AF or atrial flutter at follow-up visits using Kaplan-Meier survival statistics and Cox proportional hazards models adjusted for demographic characteristics and vascular risk factors. We identified 10 393 black and 91 380 white patients without documented AF or atrial flutter before or at the index visit for pacemaker implantation. During 3.7 (±1.8) years of follow-up, black patients had a significantly lower rate of AF (21.4%; 95% CI 19.8-23.2) than white patients (25.5%; 95% CI 24.9-26.0). After adjustment for demographic characteristics and comorbidities, black patients had a lower hazard of AF (hazard ratio 0.91; 95% CI 0.86-0.96), a higher hazard of atrial flutter (hazard ratio 1.29; 95% CI 1.11-1.49), and a lower hazard of the composite of AF or atrial flutter (hazard ratio 0.94; 95% CI 0.88-0.99). In a population-based sample of patients with pacemakers, black patients had a lower rate of AF compared with white patients. These findings indicate that the persistent racial disparities in rates of ischemic stroke are likely to be related to factors other than undiagnosed AF. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
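Several records in this list compare event rates with Kaplan-Meier survival statistics, as above. A minimal sketch of the product-limit estimator, with illustrative follow-up times and event flags (not study data):

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.
    times: follow-up times; events: 1 = event observed, 0 = censored.
    Returns [(event_time, survival_probability), ...]."""
    # distinct observed event times, in increasing order
    pts = sorted({t for t, e in zip(times, events) if e == 1})
    curve, s = [], 1.0
    for t in pts:
        n_at_risk = sum(1 for ti in times if ti >= t)   # still under follow-up at t
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        s *= 1.0 - d / n_at_risk                        # step down at each event time
        curve.append((t, s))
    return curve
```

Censored observations shrink the risk set at later times but never step the curve down, which is what makes the estimator robust to incomplete follow-up.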
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-25
... establishment's process control plans, that is, its Hazard Analysis and Critical Control Point plans. DATES... control plans, i.e., its Hazard Analysis and Critical Control Point (HACCP) plans; and (3) make the recall... systematic prevention of biological, chemical, and physical hazards. HACCP plans are establishment-developed...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-24
... Availability for HUD's Fiscal Year (FY) 2010 Lead-Based Paint Hazard Control Grant Program and Lead Hazard Reduction Demonstration Grant Program; Technical Correction AGENCY: Office of Healthy Homes and Lead Hazard...://www.Grants.gov its Notice of Funding Availability (NOFA) for HUD's FY2010 Lead-Based Paint Hazard...
Simpson, Colin R; Steiner, Markus Fc; Cezard, Genevieve; Bansal, Narinder; Fischbacher, Colin; Douglas, Anne; Bhopal, Raj; Sheikh, Aziz
2015-10-01
There is evidence of substantial ethnic variations in asthma morbidity and the risk of hospitalisation, but the picture in relation to lower respiratory tract infections is unclear. We carried out a retrospective, observational cohort study in Scotland to identify ethnic group differences in lower respiratory tract infections, covering 4.65 million people on whom information was available from the 2001 census, followed from May 2001 to April 2010. Adjusted risk ratios for hospitalisation from lower respiratory tract infections, and hazard ratios for death (any time following first hospitalisation), were calculated by ethnicity and sex. We multiplied ratios and confidence intervals by 100, so the reference Scottish White population's risk ratio and hazard ratio was 100. Among men, adjusted risk ratios for lower respiratory tract infection hospitalisation were lower in Other White British (80, 95% confidence interval 73-86) and Chinese (69, 95% confidence interval 56-84) populations and higher in the Pakistani group (152, 95% confidence interval 136-169). In women, results were mostly similar to those in men (e.g. Chinese 68, 95% confidence interval 56-82), although higher adjusted risk ratios were found among women of the Other South Asian group (145, 95% confidence interval 120-175). Survival (adjusted hazard ratio) following lower respiratory tract infection was better than in the reference population for Pakistani men (54, 95% confidence interval 39-74) and women (31, 95% confidence interval 18-53). Substantial differences in the rates of lower respiratory tract infections among ethnic groups in Scotland were found; Pakistani men and women had particularly high rates of hospitalisation. Research into the reasons behind the high rates of lower respiratory tract infection in the Pakistani community is now required. © The Royal Society of Medicine.
Lee, Ai-Lin; Chen, Bor-Chyuan; Mou, Chih-Hsin; Sun, Mao-Feng; Yen, Hung-Rong
2016-01-01
With increasing use of traditional Chinese medicine (TCM) in type 2 diabetes mellitus (T2DM), evidence of long-term benefit from adjunctive TCM treatment is limited. This study investigated whether concurrent TCM treatment reduces the risk of vascular complications in T2DM patients, using a large population from the National Health Insurance Research Database (NHIRD). We identified 33,457 adult patients with newly diagnosed T2DM using anti-diabetic agents from a random sample of one million beneficiaries in the NHIRD between January 1, 2000 and December 31, 2011. We recruited 1049 TCM users (who received TCM over 30 days with a diagnosis of T2DM) and randomly selected 4092 controls as the non-TCM cohort at a 1:4 ratio, frequency-matched by age, sex, hypertension, hyperlipidemia, and index year. We investigated the prescription pattern of TCM and conducted Cox proportional hazards regression to calculate the hazard ratios (HRs) of stroke, chronic kidney disease (CKD), and diabetic foot between the 2 cohorts. In the TCM cohort, the prescription pattern differed between insulin and noninsulin patients. The most common herbs were Dan-Shen (Radix Salviae Miltiorrhizae) in the noninsulin group and Da-Huang (Radix et Rhizoma Rhei) in the insulin group. The most common formulae were Liu-Wei-Di-Huang-Wan in the noninsulin group and Yu-Quan-Wan in the insulin group. Although there was no significant reduction in the hazard ratios for CKD and diabetic foot, the incidence rate of stroke was 7.19 per 1000 person-years in the TCM cohort versus 10.66 per 1000 person-years in the control cohort.
After adjustment for age, sex, hypertension, hyperlipidemia, and antidiabetes agent use (including sulfonylureas, α-glucosidase inhibitors, metformin, meglitinides, thiazolidinediones, and insulin), the TCM cohort had a 33% decreased risk of stroke (95% CI = 0.46-0.97; P < 0.05). This population-based retrospective study showed that complementary TCM therapy might be associated with a decreased risk of stroke in T2DM, suggesting TCM as an adjunctive therapy for T2DM to prevent subsequent stroke.
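The 33% figure above is the standard transformation of a hazard ratio into a percent risk reduction, (1 - HR) x 100, applied to an implied HR of about 0.67. A small sketch; the function name is illustrative:

```python
def percent_risk_reduction(hr, ci_low, ci_high):
    """Convert a hazard ratio and its confidence interval into a
    percent risk reduction. Note the CI bounds swap order under
    the transform: the upper HR bound gives the lower reduction."""
    return ((1 - hr) * 100,        # point estimate of the reduction
            (1 - ci_high) * 100,   # smallest reduction consistent with CI
            (1 - ci_low) * 100)    # largest reduction consistent with CI
```

For example, an HR of 0.67 with 95% CI 0.46-0.97 corresponds to a 33% reduction, with the CI spanning roughly a 3% to 54% reduction.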
Follow-up of glycemic control and cardiovascular outcomes in type 2 diabetes.
Hayward, Rodney A; Reaven, Peter D; Wiitala, Wyndy L; Bahn, Gideon D; Reda, Domenic J; Ge, Ling; McCarren, Madeline; Duckworth, William C; Emanuele, Nicholas V
2015-06-04
The Veterans Affairs Diabetes Trial previously showed that intensive glucose lowering, as compared with standard therapy, did not significantly reduce the rate of major cardiovascular events among 1791 military veterans (median follow-up, 5.6 years). We report the extended follow-up of the study participants. After the conclusion of the clinical trial, we followed participants, using central databases to identify procedures, hospitalizations, and deaths (complete cohort, with follow-up data for 92.4% of participants). Most participants agreed to additional data collection by means of annual surveys and periodic chart reviews (survey cohort, with 77.7% follow-up). The primary outcome was the time to the first major cardiovascular event (heart attack, stroke, new or worsening congestive heart failure, amputation for ischemic gangrene, or cardiovascular-related death). Secondary outcomes were cardiovascular mortality and all-cause mortality. The difference in glycated hemoglobin levels between the intensive-therapy group and the standard-therapy group averaged 1.5 percentage points during the trial (median level, 6.9% vs. 8.4%) and declined to 0.2 to 0.3 percentage points by 3 years after the trial ended. Over a median follow-up of 9.8 years, the intensive-therapy group had a significantly lower risk of the primary outcome than did the standard-therapy group (hazard ratio, 0.83; 95% confidence interval [CI], 0.70 to 0.99; P=0.04), with an absolute reduction in risk of 8.6 major cardiovascular events per 1000 person-years, but did not have reduced cardiovascular mortality (hazard ratio, 0.88; 95% CI, 0.64 to 1.20; P=0.42). No reduction in total mortality was evident (hazard ratio in the intensive-therapy group, 1.05; 95% CI, 0.89 to 1.25; P=0.54; median follow-up, 11.8 years). 
After nearly 10 years of follow-up, patients with type 2 diabetes who had been randomly assigned to intensive glucose control for 5.6 years had 8.6 fewer major cardiovascular events per 1000 person-years than those assigned to standard therapy, but no improvement was seen in the rate of overall survival. (Funded by the VA Cooperative Studies Program and others; VADT ClinicalTrials.gov number, NCT00032487.).
Kong, Melissa H; Shaw, Linda K; O'Connor, Christopher; Califf, Robert M; Blazing, Michael A; Al-Khatib, Sana M
2010-07-01
Although no clinical trial data exist on the optimal management of atrial fibrillation (AF) in patients with diastolic heart failure, it has been hypothesized that rhythm-control is more advantageous than rate-control due to the dependence of these patients' left ventricular filling on atrial contraction. We aimed to determine whether patients with AF and heart failure with preserved ejection fraction (EF) survive longer with a rhythm- versus rate-control strategy. The Duke Cardiovascular Disease Database was queried to identify patients with EF > 50%, heart failure symptoms, and AF between January 1, 1995 and June 30, 2005. We compared baseline characteristics and survival of patients managed with rate- versus rhythm-control strategies. Using a 60-day landmark view, Kaplan-Meier curves were generated and results were adjusted for baseline differences using Cox proportional hazards modeling. Three hundred eighty-two patients met the inclusion criteria (285 treated with rate-control and 97 treated with rhythm-control). The 1-, 3-, and 5-year survival rates were 93.2%, 69.3%, and 56.8%, respectively, in rate-controlled patients and 94.8%, 78.0%, and 59.9%, respectively, in rhythm-controlled patients (P > 0.10). After adjustment for baseline differences, no significant difference in mortality was detected (hazard ratio for rhythm-control vs rate-control = 0.696, 95% CI 0.453-1.07, P = 0.098). Based on our observational data, rhythm-control seems to offer no survival advantage over rate-control in patients with heart failure and preserved EF. Randomized clinical trials are needed to verify these findings and examine the effect of each strategy on stroke risk, heart failure decompensation, and quality of life.
Arnone, Mario; Koppisch, Dorothea; Smola, Thomas; Gabriel, Stefan; Verbist, Koen; Visser, Remco
2015-10-01
Many control banding tools use hazard banding in risk assessments for the occupational handling of hazardous substances. The outcome of these assessments can be combined with advice for the required risk management measures (RMMs). The Globally Harmonised System of Classification and Labelling of Chemicals (GHS) has resulted in a change in the hazard communication elements, i.e. Hazard (H) statements instead of Risk-phrases. Hazard banding schemes that depend on the old form of safety information have to be adapted to the new rules. The purpose of this publication is to outline the rationales for the assignment of hazard bands to H statements under the GHS. Based on this, this publication proposes a hazard banding scheme that uses the information from the safety data sheets as the basis for assignment. The assignment of hazard bands tiered according to the severity of the underlying hazards supports the important principle of substitution. Additionally, the set of assignment rules permits an exposure-route-specific assignment of hazard bands, which is necessary for the proposed route-specific RMMs. Ideally, all control banding tools should apply the same assignment rules. This GHS-compliant hazard banding scheme can hopefully help to establish a unified hazard banding strategy in the various control banding tools. Copyright © 2015 Elsevier Inc. All rights reserved.
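The banding mechanism described above can be sketched as a route-specific lookup from GHS H statements to hazard bands. The band assignments below are placeholders to show the mechanism, not the scheme proposed in the publication:

```python
# Illustrative control-banding lookup. The real H-statement-to-band
# assignments are defined in the cited scheme; these are placeholders.
BANDS = "ABCDE"  # A = lowest hazard ... E = highest

# hypothetical inhalation-route assignments (H codes are real GHS codes)
H_TO_BAND_INHALATION = {
    "H335": "B",   # may cause respiratory irritation
    "H331": "D",   # toxic if inhaled
    "H350": "E",   # may cause cancer
}

def hazard_band(h_statements, table=H_TO_BAND_INHALATION):
    """Return the most severe band among a substance's H statements.
    Unknown statements fall back to band E as a precautionary default."""
    bands = [table.get(h, "E") for h in h_statements]
    return max(bands, key=BANDS.index)
```

Taking the most severe band across all H statements on the safety data sheet, with a precautionary default for unrecognised codes, mirrors how control banding tools map classification information to risk management measures.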
Ropkins, K; Beck, A J
2002-08-01
Hazard analysis by critical control points (HACCP) is a systematic approach to the identification, assessment, and control of hazards. Effective HACCP requires the consideration of all hazards, i.e., chemical, microbiological, and physical. However, to date most 'in-place' HACCP procedures have tended to focus on the control of microbiological and physical food hazards. In general, the chemical component of HACCP procedures is either ignored or limited to applied chemicals, e.g., food additives and pesticides. In this paper we discuss the application of HACCP to a broader range of chemical hazards, using organic chemical contaminants as examples, and the problems that are likely to arise in the food manufacturing sector. Chemical HACCP procedures are likely to yield many of the advantages previously identified for microbiological HACCP procedures: they are more effective, efficient, and economical than conventional end-point-testing methods. However, the high costs of analytical monitoring of chemical contaminants and a limited understanding of formulation and process optimisation as means of controlling chemical contamination of foods are likely to prevent chemical HACCP from becoming as effective as microbiological HACCP.
Medication persistence and the use of generic and brand-name blood pressure-lowering agents.
Corrao, Giovanni; Soranna, Davide; La Vecchia, Carlo; Catapano, Alberico; Agabiti-Rosei, Enrico; Gensini, Gianfranco; Merlino, Luca; Mancia, Giuseppe
2014-05-01
Because of their lower cost, healthcare systems recommend that physicians prefer generic products over brand-name medications. There is therefore considerable interest and debate concerning the safety and effectiveness of generic products. Few studies have compared patients treated with brand-name and generic drugs for adherence to treatment, with somewhat inconsistent results. The primary objective of this study was to compare the risk of discontinuing antihypertensive drug therapy in patients treated with generic or brand-name agents. The 101,618 beneficiaries of the healthcare system of Lombardy, Italy, aged 18 years or older, who were newly treated with antihypertensive generic or brand-name monotherapy during 2008, were followed until the earliest of treatment discontinuation of any antihypertensive drug therapy (outcome) or censoring (death, emigration, or 12 months after treatment initiation). Hazard ratios of discontinuation associated with starting on generic or brand-name products (intention-to-treat analysis), and incidence rate ratios of discontinuation during periods on generic and brand-name products (as-treated analysis), were estimated from cohort and self-controlled case series analyses, respectively. Patients who started on generics did not experience a different risk of discontinuation compared with those starting on brand-name agents (hazard ratio: 1.00; 95% confidence interval 0.98-1.02). Within the same individuals, discontinuation did not occur at different rates during periods covered by generic or brand-name agents (incidence rate ratio: 1.01; 95% confidence interval 0.96-1.11). A number of sensitivity and subgroup analyses confirmed the robustness of these findings. Generic products are not responsible for the high rate of discontinuation from antihypertensive drug therapy. Assuming therapeutic equivalence, the clinical implication is that generic drug therapies can be prescribed.
Aksamit, Timothy; De Soyza, Anthony; Bandel, Tiemo-Joerg; Criollo, Margarita; Elborn, J Stuart; Operschall, Elisabeth; Polverino, Eva; Roth, Katrin; Winthrop, Kevin L; Wilson, Robert
2018-01-01
We evaluated the efficacy and safety of ciprofloxacin dry powder for inhalation (DPI) in patients with non-cystic fibrosis bronchiectasis, two or more exacerbations in the previous year, and predefined sputum bacteria. Patients were randomised 2:1 to twice-daily ciprofloxacin DPI 32.5 mg or placebo in 14- or 28-day on/off treatment cycles for 48 weeks. Primary end-points were time to first exacerbation and frequency of exacerbations. Enrolling countries and α level split (0.049 and 0.001 for 14- and 28-day cycles, respectively) differed from RESPIRE 1. Patients were randomised to ciprofloxacin DPI (14 days on/off (n=176) or 28 days on/off (n=171)) or placebo (14 days on/off (n=88) or 28 days on/off (n=86)). The exacerbation rate was low across treatment arms (mean±sd 0.6±0.9). Active treatment showed trends to prolonged time to first exacerbation (ciprofloxacin DPI 14 days on/off: hazard ratio 0.87, 95.1% CI 0.62-1.21; p=0.3965; ciprofloxacin DPI 28 days on/off: hazard ratio 0.71, 99.9% CI 0.39-1.27; p=0.0511) and reduced frequency of exacerbations (ciprofloxacin DPI 14 days on/off: incidence rate ratio 0.83, 95.1% CI 0.59-1.17; p=0.2862; ciprofloxacin DPI 28 days on/off: incidence rate ratio 0.55, 99.9% CI 0.30-1.02; p=0.0014), although neither achieved statistical significance. Ciprofloxacin DPI was well tolerated. Trends towards clinical benefit were seen with ciprofloxacin DPI, but primary end-points were not met. Copyright ©ERS 2018.
Tao, Meng-Hua; Dai, Qi; Chen, Shande; Freudenheim, Jo L; Rohan, Thomas; Wakelee, Heather; Datta, Mridul; Wactawski-Wende, Jean
2017-08-01
Magnesium and calcium are antagonistic in many physiologic processes. However, few studies have investigated the associations of supplemental calcium with lung cancer risk taking this antagonism into account. We evaluated the effect of calcium and vitamin D supplementation on lung cancer incidence and explored whether the ratio of baseline calcium to magnesium (Ca:Mg) intake modifies the association in the Women's Health Initiative (WHI) calcium plus vitamin D supplementation (CaD) trial. The intervention phase of the WHI CaD was a double-blinded, randomized, placebo-controlled trial in 36,382 postmenopausal women aged 50-79 years, recruited at 40 U.S. centers. Post-intervention follow-up continued among 29,862 (86%) of the surviving participants. Risk of lung cancer in association with CaD supplementation was evaluated using proportional hazards regression models. After 11 years' cumulative follow-up, there were 207 lung cancers (incidence 0.11% per year) in the supplement arm and 241 (0.12%) in the placebo arm (hazard ratio (HR) for the intervention, 0.91; 95% confidence interval (CI), 0.71-1.17). Subgroup analyses suggested that the HR for lung cancer varied by baseline Ca:Mg intake ratio among women who were current smokers at enrollment (p=0.04 for interaction). Over the entire follow-up period, calcium and vitamin D supplementation did not reduce lung cancer incidence among postmenopausal women. In exploratory analyses, an interaction was found between the baseline Ca:Mg intake ratio and lung cancer among current smokers at trial entry. These findings warrant further study of the role of calcium, together with magnesium, in lung carcinogenesis in current smokers. Copyright © 2017 Elsevier B.V. All rights reserved.
Tada, Toshifumi; Kumada, Takashi; Toyoda, Hidenori; Kiriyama, Seiki; Tanikawa, Makoto; Hisanaga, Yasuhiro; Kanamori, Akira; Kitabatake, Shusuke; Yama, Tsuyoki
2015-09-01
It has been reported that the branched-chain amino acid (BCAA) to tyrosine ratio (BTR) is a useful indicator of liver function and BCAA therapy is associated with a decreased incidence of hepatocellular carcinoma (HCC). However, there has not been sufficient research on the relationship between BTR and the effects of BCAA therapy after initial treatment of HCC. We investigated the impact of BTR and BCAA therapy on survival in patients with HCC. A total of 315 patients with HCC who were treated (n = 66) or not treated (n = 249) with BCAA were enrolled; of these, 66 were selected from each group using propensity score matching. Survival from liver-related mortality was analyzed. In patients who did not receive BCAA therapy (n = 249), multivariate analysis for factors associated with survival indicated that low BTR (≤ 4.4) was independently associated with poor prognosis in patients with HCC (hazard ratio, 1.880; 95% confidence interval, 1.125-3.143; P = 0.016). In addition, among patients selected by propensity score matching (n = 132), multivariate analysis indicated that BCAA therapy was independently associated with good prognosis in patients with HCC (hazard ratio, 0.524; 95% confidence interval, 0.282-0.973; P = 0.041). BTR was not significantly associated with survival. Intervention involving BCAA therapy improved survival in patients with HCC versus untreated controls, regardless of BTR. In addition, low BTR was associated with poor prognosis in patients who did not receive BCAA therapy. © 2015 Journal of Gastroenterology and Hepatology Foundation and Wiley Publishing Asia Pty Ltd.
Ito, Yuri; Ioka, Akiko; Tsukuma, Hideaki; Ajiki, Wakiko; Sugimoto, Tomoyuki; Rachet, Bernard; Coleman, Michel P
2009-07-01
We used new methods to examine differences in population-based cancer survival between six prefectures in Japan, after adjustment for age and stage at diagnosis. We applied regression models for relative survival to data from population-based cancer registries covering each prefecture for patients diagnosed with stomach, lung, or breast cancer during 1993-1996. Funnel plots were used to display the excess hazard ratio (EHR) for each prefecture, defined as the excess hazard of death from each cancer within 5 years of diagnosis relative to the mean excess hazard (in excess of national background mortality by age and sex) in all six prefectures combined. The contribution of age and stage to the EHR in each prefecture was assessed from differences in deviance-based R(2) between the various models. No significant differences were seen between prefectures in 5-year survival from breast cancer. For cancers of the stomach and lung, the EHRs in Osaka prefecture were above the upper 95% control limits. For stomach cancer, the age- and stage-adjusted EHRs in Osaka were 1.29 for men and 1.43 for women, compared with Fukui and Yamagata. Differences in the stage at diagnosis of stomach cancer appeared to explain most of this excess hazard (61.3% for men, 56.8% for women), whereas differences in age at diagnosis explained very little (0.8%, 1.3%). This approach offers the potential to quantify the impact of differences in stage at diagnosis on time trends and regional differences in cancer survival. It underlines the utility of population-based cancer registries for improving cancer control.
Navar, Ann Marie; Gallup, Dianne S; Lokhnygina, Yuliya; Green, Jennifer B; McGuire, Darren K; Armstrong, Paul W; Buse, John B; Engel, Samuel S; Lachin, John M; Standl, Eberhard; Van de Werf, Frans; Holman, Rury R; Peterson, Eric D
2017-11-01
Systolic blood pressure (SBP) treatment targets for adults with diabetes mellitus remain unclear. SBP levels among 12 275 adults with diabetes mellitus, prior cardiovascular disease, and treated hypertension were evaluated in the TECOS (Trial Evaluating Cardiovascular Outcomes With Sitagliptin) randomized trial of sitagliptin versus placebo. The association between baseline SBP and recurrent cardiovascular disease was evaluated using multivariable Cox proportional hazards modeling with restricted cubic splines, adjusting for clinical characteristics. Kaplan-Meier curves by baseline SBP were created to assess time to cardiovascular disease and 2 potential hypotension-related adverse events: worsening kidney function and fractures. The association between time-updated SBP and outcomes was examined using multivariable Cox proportional hazards models. Overall, 42.2% of adults with diabetes mellitus, cardiovascular disease, and hypertension had an SBP ≥140 mm Hg. The association between SBP and cardiovascular disease risk was U shaped, with a nadir ≈130 mm Hg. When the analysis was restricted to those with baseline SBP of 110 to 150 mm Hg, the adjusted association between SBP and cardiovascular disease risk was flat (hazard ratio per 10-mm Hg increase, 0.96; 95% confidence interval, 0.91-1.02). There was no association between SBP and risk of fracture. Above 150 mm Hg, higher SBP was associated with increasing risk of worsening kidney function (hazard ratio per 10-mm Hg increase, 1.10; 95% confidence interval, 1.02-1.18). Many patients with diabetes mellitus have uncontrolled hypertension. The U-shaped association between SBP and cardiovascular disease events was largely driven by those with very high or low SBP, with no difference in cardiovascular disease risk between 110 and 150 mm Hg. Lower SBP was not associated with higher risks of fractures or worsening kidney function. © 2017 American Heart Association, Inc.
Deja, Marek A; Kargul, Tomasz; Domaradzki, Wojciech; Stącel, Tomasz; Mazur, Witold; Wojakowski, Wojciech; Gocoł, Radosław; Gaszewska-Żurek, Ewa; Żurek, Paweł; Pytel, Agata; Woś, Stanisław
2012-07-01
This trial was undertaken to determine the safety and efficacy of preoperative aspirin administration in a contemporary cardiac surgical practice setting. This randomized, double-blind, parallel-group, single-center trial involved patients with stable coronary artery disease who were assigned to receive either 300 mg of aspirin or placebo the night before coronary bypass surgery. Using a random digit table, patients were allocated to receive the tablet from 1 of 40 coded bottles containing either aspirin or placebo. Patients, surgeons, anesthetists, and investigators were all masked to treatment allocation. The primary safety end points were as follows: more than 750 mL of bleeding during the first postoperative 12 hours and more than 1000 mL of total discharge from the chest drains. The secondary efficacy end point was a composite of cardiovascular death, myocardial infarction, or repeat revascularization. A total of 390 patients were allocated to aspirin (387 analyzed) and 399 to placebo (396 analyzed). The median follow-up was 53 months. Fifty-four placebo recipients and 86 aspirin recipients bled more than 750 mL in the first 12 hours (odds ratio [OR], 1.81; 95% confidence interval [CI], 1.25-2.63), while total chest drain discharge was above 1000 mL in 96 placebo and 131 aspirin recipients (OR, 1.60; 95% CI, 1.17-2.18). Preoperative aspirin decreased the long-term hazard of a nonfatal coronary event (infarction or repeat revascularization; hazard ratio [HR], 0.58; 95% CI, 0.33-0.99) and tended to decrease the hazard of a major cardiac event (cardiovascular death, infarction, or repeat revascularization; HR, 0.65; 95% CI, 0.41-1.03). Performing coronary grafts on aspirin is associated with increased postoperative bleeding but may decrease the long-term hazard of coronary events. Copyright © 2012 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... processor shall have and implement a written HACCP plan whenever a hazard analysis reveals one or more food...
21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. 123.6 Section 123.6 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF... processor shall have and implement a written HACCP plan whenever a hazard analysis reveals one or more food...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-19
... Manufacturing Practice and Hazard Analysis and Risk- Based Preventive Controls for Human Food; Extension of... Analysis and Risk-Based Preventive Controls for Human Food.'' FOR FURTHER INFORMATION CONTACT: Domini Bean... Practice and Hazard Analysis and Risk-Based Preventive Controls for Human Food'' with a 120-day comment...
Enzalutamide in metastatic prostate cancer before chemotherapy.
Beer, Tomasz M; Armstrong, Andrew J; Rathkopf, Dana E; Loriot, Yohann; Sternberg, Cora N; Higano, Celestia S; Iversen, Peter; Bhattacharya, Suman; Carles, Joan; Chowdhury, Simon; Davis, Ian D; de Bono, Johann S; Evans, Christopher P; Fizazi, Karim; Joshua, Anthony M; Kim, Choung-Soo; Kimura, Go; Mainwaring, Paul; Mansbach, Harry; Miller, Kurt; Noonberg, Sarah B; Perabo, Frank; Phung, De; Saad, Fred; Scher, Howard I; Taplin, Mary-Ellen; Venner, Peter M; Tombal, Bertrand
2014-07-31
Enzalutamide is an oral androgen-receptor inhibitor that prolongs survival in men with metastatic castration-resistant prostate cancer in whom the disease has progressed after chemotherapy. New treatment options are needed for patients with metastatic prostate cancer who have not received chemotherapy, in whom the disease has progressed despite androgen-deprivation therapy. In this double-blind, phase 3 study, we randomly assigned 1717 patients to receive either enzalutamide (at a dose of 160 mg) or placebo once daily. The coprimary end points were radiographic progression-free survival and overall survival. The study was stopped after a planned interim analysis, conducted when 540 deaths had been reported, showed a benefit of the active treatment. The rate of radiographic progression-free survival at 12 months was 65% among patients treated with enzalutamide, as compared with 14% among patients receiving placebo (81% risk reduction; hazard ratio in the enzalutamide group, 0.19; 95% confidence interval [CI], 0.15 to 0.23; P<0.001). A total of 626 patients (72%) in the enzalutamide group, as compared with 532 patients (63%) in the placebo group, were alive at the data-cutoff date (29% reduction in the risk of death; hazard ratio, 0.71; 95% CI, 0.60 to 0.84; P<0.001). The benefit of enzalutamide was shown with respect to all secondary end points, including the time until the initiation of cytotoxic chemotherapy (hazard ratio, 0.35), the time until the first skeletal-related event (hazard ratio, 0.72), a complete or partial soft-tissue response (59% vs. 5%), the time until prostate-specific antigen (PSA) progression (hazard ratio, 0.17), and a rate of decline of at least 50% in PSA (78% vs. 3%) (P<0.001 for all comparisons). Fatigue and hypertension were the most common clinically relevant adverse events associated with enzalutamide treatment. 
Enzalutamide significantly decreased the risk of radiographic progression and death and delayed the initiation of chemotherapy in men with metastatic prostate cancer. (Funded by Medivation and Astellas Pharma; PREVAIL ClinicalTrials.gov number, NCT01212991.).
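As a quick arithmetic aside, the percent "risk reductions" quoted in trial abstracts like the one above (81% for a hazard ratio of 0.19, 29% for 0.71) are simply one minus the hazard ratio, expressed as a percentage. A minimal sketch; the function name is ours, not from the paper:

```python
# Percent risk reduction implied by a hazard ratio: (1 - HR) * 100.
# Illustrative helper only; not code from the PREVAIL study.
def risk_reduction_pct(hazard_ratio):
    """Return the relative risk reduction, rounded to whole percent."""
    return round((1.0 - hazard_ratio) * 100)

print(risk_reduction_pct(0.19))  # 81  (radiographic progression)
print(risk_reduction_pct(0.71))  # 29  (death)
```

Note the converse: a hazard ratio above 1 gives a negative value, i.e. a risk increase.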
Peritoneal dialysis in rural Australia.
Gray, Nicholas A; Grace, Blair S; McDonald, Stephen P
2013-12-20
Australians living in rural areas have lower incidence rates of renal replacement therapy and poorer dialysis survival compared with urban dwellers. This study compares peritoneal dialysis (PD) patient characteristics and outcomes in rural and urban Australia. Non-indigenous Australian adults who commenced chronic dialysis between 1 January 2000 and 31 December 2010 according to the Australia and New Zealand Dialysis and Transplant Registry (ANZDATA) were investigated. Each patient's residence was classified according to the Australian Bureau of Statistics remote area index as major city (MC), inner regional (IR), outer regional (OR), or remote/very remote (REM). A total of 7657 patients underwent PD treatment during the study period. Patient distribution was 69.0% MC, 19.6% IR, 9.5% OR, and 1.8% REM. PD uptake increased with increasing remoteness. Compared with MC, sub-hazard ratios [95% confidence intervals] for commencing PD were 1.70 [1.61-1.79] IR, 2.01 [1.87-2.16] OR, and 2.60 [2.21-3.06] REM. During the first 6 months of PD, technique failure was less likely outside MC (sub-hazard ratio 0.47 [95% CI: 0.35-0.62], P < 0.001), but no difference was seen after 6 months (sub-hazard ratio 1.05 [95% CI: 0.84-1.32], P = 0.6). Technique failure due to technical (sub-hazard ratio 0.57 [95% CI: 0.38-0.84], P = 0.005) and non-medical causes (sub-hazard ratio 0.52 [95% CI: 0.31-0.87], P = 0.01) was less likely outside MC. Time to first peritonitis episode was not associated with remoteness (P = 0.8). Patient survival while on PD or within 90 days of stopping PD did not differ by region (P = 0.2). PD uptake increases with increasing remoteness. In rural areas, PD technique failure is less likely during the first 6 months and time to first peritonitis is comparable to urban areas. Mortality while on PD does not differ by region. PD is therefore a good dialysis modality choice for rural patients in Australia.
Fitzhugh, Courtney D; Hsieh, Matthew M; Allen, Darlene; Coles, Wynona A; Seamon, Cassie; Ring, Michael; Zhao, Xiongce; Minniti, Caterina P; Rodgers, Griffin P; Schechter, Alan N; Tisdale, John F; Taylor, James G
2015-01-01
Adults with sickle cell anemia (HbSS) are inconsistently treated with hydroxyurea. We retrospectively evaluated the effects of elevating fetal hemoglobin with hydroxyurea on organ damage and survival in patients enrolled in our screening study between 2001 and 2010. An electronic medical record facilitated development of a database for comparison of study parameters based on hydroxyurea exposure and dose. This study is registered with ClinicalTrials.gov, number NCT00011648. Three hundred eighty-three adults with homozygous sickle cell disease were analyzed, with 59 deaths during study follow-up. Cox regression analysis revealed deceased subjects had more hepatic dysfunction (elevated alkaline phosphatase, Hazard Ratio = 1.005, 95% CI 1.003-1.006, p<0.0001), kidney dysfunction (elevated creatinine, Hazard Ratio = 1.13, 95% CI 1.00-1.27, p = 0.043), and cardiopulmonary dysfunction (elevated tricuspid jet velocity on echocardiogram, Hazard Ratio = 2.22, 95% CI 1.23-4.02, p = 0.0082). Sixty-six percent of subjects were treated with hydroxyurea, although only 66% of those received a dose within the recommended therapeutic range. Hydroxyurea use was associated with improved survival (Hazard Ratio = 0.58, 95% CI 0.34-0.97, p = 0.040). This effect was most pronounced in those taking the recommended dose of 15-35 mg/kg/day (Hazard Ratio 0.36, 95% CI 0.17-0.73, p = 0.0050). Hydroxyurea use was not associated with changes in organ function over time. Further, subjects with higher fetal hemoglobin responses to hydroxyurea were more likely to survive (p = 0.0004). While alkaline phosphatase was lowest in patients with the best fetal hemoglobin response (95.4 versus 123.6, p = 0.0065 and 96.1 versus 113.6 U/L, p = 0.041 at first and last visits, respectively), other markers of organ damage were not consistently improved over time in patients with the highest fetal hemoglobin levels. 
Our data suggest that adults should be treated with the maximum tolerated hydroxyurea dose, ideally before organ damage occurs. Prospective studies are indicated to validate these findings.
Effects of clopidogrel added to aspirin in patients with recent lacunar stroke.
Benavente, Oscar R; Hart, Robert G; McClure, Leslie A; Szychowski, Jeffrey M; Coffey, Christopher S; Pearce, Lesly A
2012-08-30
Lacunar infarcts are a frequent type of stroke caused mainly by cerebral small-vessel disease. The effectiveness of antiplatelet therapy for secondary prevention has not been defined. We conducted a double-blind, multicenter trial involving 3020 patients with recent symptomatic lacunar infarcts identified by magnetic resonance imaging. Patients were randomly assigned to receive 75 mg of clopidogrel or placebo daily; patients in both groups received 325 mg of aspirin daily. The primary outcome was any recurrent stroke, including ischemic stroke and intracranial hemorrhage. The participants had a mean age of 63 years, and 63% were men. After a mean follow-up of 3.4 years, the risk of recurrent stroke was not significantly reduced with aspirin and clopidogrel (dual antiplatelet therapy) (125 strokes; rate, 2.5% per year) as compared with aspirin alone (138 strokes, 2.7% per year) (hazard ratio, 0.92; 95% confidence interval [CI], 0.72 to 1.16), nor was the risk of recurrent ischemic stroke (hazard ratio, 0.82; 95% CI, 0.63 to 1.09) or disabling or fatal stroke (hazard ratio, 1.06; 95% CI, 0.69 to 1.64). The risk of major hemorrhage was almost doubled with dual antiplatelet therapy (105 hemorrhages, 2.1% per year) as compared with aspirin alone (56, 1.1% per year) (hazard ratio, 1.97; 95% CI, 1.41 to 2.71; P<0.001). Among classifiable recurrent ischemic strokes, 71% (133 of 187) were lacunar strokes. All-cause mortality was increased among patients assigned to receive dual antiplatelet therapy (77 deaths in the group receiving aspirin alone vs. 113 in the group receiving dual antiplatelet therapy) (hazard ratio, 1.52; 95% CI, 1.14 to 2.04; P=0.004); this difference was not accounted for by fatal hemorrhages (9 in the group receiving dual antiplatelet therapy vs. 4 in the group receiving aspirin alone). 
Among patients with recent lacunar strokes, the addition of clopidogrel to aspirin did not significantly reduce the risk of recurrent stroke and did significantly increase the risk of bleeding and death. (Funded by the National Institute of Neurological Disorders and Stroke and others; SPS3 ClinicalTrials.gov number, NCT00059306.).
Apixaban versus warfarin in patients with atrial fibrillation.
Granger, Christopher B; Alexander, John H; McMurray, John J V; Lopes, Renato D; Hylek, Elaine M; Hanna, Michael; Al-Khalidi, Hussein R; Ansell, Jack; Atar, Dan; Avezum, Alvaro; Bahit, M Cecilia; Diaz, Rafael; Easton, J Donald; Ezekowitz, Justin A; Flaker, Greg; Garcia, David; Geraldes, Margarida; Gersh, Bernard J; Golitsyn, Sergey; Goto, Shinya; Hermosillo, Antonio G; Hohnloser, Stefan H; Horowitz, John; Mohan, Puneet; Jansky, Petr; Lewis, Basil S; Lopez-Sendon, Jose Luis; Pais, Prem; Parkhomenko, Alexander; Verheugt, Freek W A; Zhu, Jun; Wallentin, Lars
2011-09-15
Vitamin K antagonists are highly effective in preventing stroke in patients with atrial fibrillation but have several limitations. Apixaban is a novel oral direct factor Xa inhibitor that has been shown to reduce the risk of stroke in a similar population in comparison with aspirin. In this randomized, double-blind trial, we compared apixaban (at a dose of 5 mg twice daily) with warfarin (target international normalized ratio, 2.0 to 3.0) in 18,201 patients with atrial fibrillation and at least one additional risk factor for stroke. The primary outcome was ischemic or hemorrhagic stroke or systemic embolism. The trial was designed to test for noninferiority, with key secondary objectives of testing for superiority with respect to the primary outcome and to the rates of major bleeding and death from any cause. The median duration of follow-up was 1.8 years. The rate of the primary outcome was 1.27% per year in the apixaban group, as compared with 1.60% per year in the warfarin group (hazard ratio with apixaban, 0.79; 95% confidence interval [CI], 0.66 to 0.95; P<0.001 for noninferiority; P=0.01 for superiority). The rate of major bleeding was 2.13% per year in the apixaban group, as compared with 3.09% per year in the warfarin group (hazard ratio, 0.69; 95% CI, 0.60 to 0.80; P<0.001), and the rates of death from any cause were 3.52% and 3.94%, respectively (hazard ratio, 0.89; 95% CI, 0.80 to 0.99; P=0.047). The rate of hemorrhagic stroke was 0.24% per year in the apixaban group, as compared with 0.47% per year in the warfarin group (hazard ratio, 0.51; 95% CI, 0.35 to 0.75; P<0.001), and the rate of ischemic or uncertain type of stroke was 0.97% per year in the apixaban group and 1.05% per year in the warfarin group (hazard ratio, 0.92; 95% CI, 0.74 to 1.13; P=0.42). In patients with atrial fibrillation, apixaban was superior to warfarin in preventing stroke or systemic embolism, caused less bleeding, and resulted in lower mortality. 
(Funded by Bristol-Myers Squibb and Pfizer; ARISTOTLE ClinicalTrials.gov number, NCT00412984.).
Confidence intervals for the first crossing point of two hazard functions.
Cheng, Ming-Yen; Qiu, Peihua; Tan, Xianming; Tu, Dongsheng
2009-12-01
The phenomenon of crossing hazard rates is common in clinical trials with time-to-event endpoints. Many methods have been proposed for testing equality of hazard functions against a crossing-hazards alternative, but relatively few approaches are available in the literature for point or interval estimation of the crossing time point. This paper considers the problem of constructing confidence intervals for the first crossing time point of two hazard functions. After reviewing a recent procedure based on Cox proportional hazards modeling with a Box-Cox transformation of the time to event, a nonparametric procedure using a kernel smoothing estimate of the hazard ratio is proposed. Both procedures are evaluated by Monte Carlo simulations and applied to two clinical trial datasets.
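As a toy illustration of the first-crossing problem (not the paper's kernel-smoothing procedure, and with made-up hazard functions): once a sign change of the difference of the two hazards is bracketed, the crossing time can be located by bisection.

```python
# Hypothetical hazards: h1 constant, h2 rising linearly, so the first
# (and only) crossing solves 0.2 + 0.1*t = 0.5, i.e. t = 3.0.
def h1(t):
    return 0.5            # constant hazard

def h2(t):
    return 0.2 + 0.1 * t  # linearly increasing hazard

def first_crossing(f, g, lo=0.0, hi=10.0, tol=1e-8):
    """Bisect on the sign change of f - g over [lo, hi]."""
    d = lambda t: f(t) - g(t)
    assert d(lo) * d(hi) < 0, "hazards must cross inside the interval"
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if d(lo) * d(mid) <= 0:
            hi = mid   # sign change is in the left half
        else:
            lo = mid   # sign change is in the right half
    return 0.5 * (lo + hi)

print(round(first_crossing(h1, h2), 6))  # 3.0
```

In practice the hazards would be kernel-smoothed estimates from data rather than known closed forms, and the interval estimation around the crossing point is where the paper's contribution lies.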
NASA Technical Reports Server (NTRS)
2012-01-01
One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to protect our personnel from injury and our equipment from damage. The purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Z1 Suit Port Test in Chamber B located in building 32, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments, and activities while interfacing with facility test systems, equipment, and hardware. The goal of this hazard analysis is to identify all hazards that have the potential to harm personnel and/or damage facility equipment, flight hardware, property, or harm the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, JSC Safety and Health Handbook.
2017-11-14
Objective To assess the three year clinical outcomes and cost effectiveness of a strategy of endovascular repair (if aortic morphology is suitable, open repair if not) versus open repair for patients with suspected ruptured abdominal aortic aneurysm. Design Randomised controlled trial. Setting 30 vascular centres (29 in UK, one in Canada), 2009-16. Participants 613 eligible patients (480 men) with a clinical diagnosis of ruptured aneurysm, of whom 502 underwent emergency repair for rupture. Interventions 316 patients were randomised to an endovascular strategy (275 with confirmed rupture) and 297 to open repair (261 with confirmed rupture). Main outcome measures Mortality, with reinterventions after aneurysm repair, quality of life, and hospital costs to three years as secondary measures. Results The maximum follow-up for mortality was 7.1 years, with two patients in each group lost to follow-up by three years. After similar mortality by 90 days, in the mid-term (three months to three years) there were fewer deaths in the endovascular than the open repair group (hazard ratio 0.57, 95% confidence interval 0.36 to 0.90), leading to lower mortality at three years (48% v 56%), but by seven years mortality was about 60% in each group (hazard ratio 0.92, 0.75 to 1.13). Results for the 502 patients with repaired ruptures were more pronounced: three year mortality was lower in the endovascular strategy group (42% v 54%; odds ratio 0.62, 0.43 to 0.88), but after seven years there was no clear difference between the groups (hazard ratio 0.86, 0.68 to 1.08). Reintervention rates up to three years were not significantly different between the randomised groups (hazard ratio 1.02, 0.79 to 1.32); the initial rapid rate of reinterventions was followed by a much slower mid-term reintervention rate in both groups. 
The early higher average quality of life in the endovascular strategy group, coupled with the lower mortality at three years, led to a gain in average quality-adjusted life years (QALYs) at three years of 0.17 (95% confidence interval 0.00 to 0.33). The endovascular strategy group spent fewer days in hospital and had average costs £2605 lower (95% confidence interval -£5966 to £702) (about €2813; $3439). The probability that the endovascular strategy is cost effective was >90% at all levels of willingness to pay for a QALY gain. Conclusions At three years, compared with open repair, an endovascular strategy for suspected ruptured abdominal aortic aneurysm was associated with a survival advantage, a gain in QALYs, similar levels of reintervention, and reduced costs, and this strategy was cost effective. These findings support the increasing use of an endovascular strategy, with wider availability of emergency endovascular repair. Trial registration Current Controlled Trials ISRCTN48334791; ClinicalTrials.gov NCT00746122.
Phenotype at diagnosis predicts recurrence rates in Crohn's disease.
Wolters, F L; Russel, M G; Sijbrandij, J; Ambergen, T; Odes, S; Riis, L; Langholz, E; Politi, P; Qasim, A; Koutroubakis, I; Tsianos, E; Vermeire, S; Freitas, J; van Zeijl, G; Hoie, O; Bernklev, T; Beltrami, M; Rodriguez, D; Stockbrügger, R W; Moum, B
2006-08-01
In Crohn's disease (CD), studies associating phenotype at diagnosis with subsequent disease activity are important for patient counselling and health care planning. To calculate disease recurrence rates and to correlate these with phenotypic traits at diagnosis. A prospectively assembled, uniformly diagnosed, European population-based inception cohort of CD patients was classified according to the Vienna classification for disease phenotype at diagnosis. Surgical and non-surgical recurrence rates throughout a 10 year follow up period were calculated. Multivariate analysis was performed to classify risk factors present at diagnosis for recurrent disease. A total of 358 patients were classified for phenotype at diagnosis, of whom 262 (73.2%) had a first recurrence and 113 (31.6%) a first surgical recurrence during the first 10 years after diagnosis. Patients with upper gastrointestinal disease at diagnosis had an excess risk of recurrence (hazard ratio 1.54 (95% confidence interval (CI) 1.13-2.10)) whereas age ≥40 years at diagnosis was protective (hazard ratio 0.82 (95% CI 0.70-0.97)). Colonic disease was a protective characteristic for resective surgery (hazard ratio 0.38 (95% CI 0.21-0.69)). More frequent resective surgical recurrences were reported from Copenhagen (hazard ratio 3.23 (95% CI 1.32-7.89)). A mild course of disease in terms of disease recurrence was observed in this European cohort. Phenotype at diagnosis had predictive value for disease recurrence, with upper gastrointestinal disease being the most important positive predictor. A phenotypic North-South gradient in CD may be present, illustrated by higher surgery risks in some of the Northern European centres.
Hansen, Morten; Nyby, Sebastian; Eifer Møller, Jacob; Videbæk, Lars; Kassem, Moustapha; Barington, Torben; Thayssen, Per; Diederichsen, Axel Cosmus Pyndt
2014-01-01
Seven years ago, the DanCell study was carried out to test the hypothesis of improvement in left ventricular ejection fraction (LVEF) following repeated intracoronary injections of autologous bone marrow-derived stem cells (BMSCs) in patients suffering from chronic ischemic heart failure. In this post hoc analysis, the long-term effect of therapy is assessed. 32 patients [mean age 61 (SD ± 9), 81% males] with systolic dysfunction (LVEF 33 ± 9%) received two repeated intracoronary infusions (4 months apart) of autologous BMSCs (1,533 ± 765 × 10^6 BMSCs, including 23 ± 11 × 10^6 CD34+ cells and 14 ± 7 × 10^6 CD133+ cells). Patients were followed for 7 years and deaths were recorded. During follow-up, 10 patients died (31%). In univariate regression analysis, the total number of BMSCs, the CD34+ cell count, and the CD133+ cell count did not significantly correlate with survival (hazard ratio: 0.999, 95% CI: 0.998-1.000, p = 0.24; hazard ratio: 0.94, 95% CI: 0.88-1.01, p = 0.10; and hazard ratio: 0.96, 95% CI: 0.87-1.07, p = 0.47, respectively). After adjustment for baseline variables in multivariate regression analysis, the CD34+ cell count was significantly associated with survival (hazard ratio: 0.90, 95% CI: 0.82-1.00, p = 0.04). Intracoronary injections of a high number of CD34+ cells may have a beneficial effect on chronic ischemic heart failure in terms of long-term survival.
Cardiac rehabilitation attendance and outcomes in coronary artery disease patients.
Martin, Billie-Jean; Hauer, Trina; Arena, Ross; Austford, Leslie D; Galbraith, P Diane; Lewin, Adriane M; Knudtson, Merril L; Ghali, William A; Stone, James A; Aggarwal, Sandeep G
2012-08-07
Cardiac rehabilitation (CR) is an efficacious yet underused treatment for patients with coronary artery disease. The objective of this study was to determine the association between CR completion and mortality and resource use. We conducted a prospective cohort study of 5886 subjects (20.8% female; mean age, 60.6 years) who had undergone angiography and were referred for CR in Calgary, AB, Canada, between 1996 and 2009. Outcomes of interest included freedom from emergency room visits, hospitalization, and survival in CR completers versus noncompleters, adjusted for clinical covariates, treatment strategy, and coronary anatomy. Hazard ratios for events for CR completers versus noncompleters were also constructed. A propensity model was used to match completers to noncompleters on baseline characteristics, and each outcome was compared between propensity-matched groups. Of the subjects referred for CR, 2900 (49.3%) completed the program, and an additional 554 subjects started but did not complete CR. CR completion was associated with a lower risk of death, with an adjusted hazard ratio of 0.59 (95% confidence interval, 0.49-0.70). CR completion was also associated with a decreased risk of all-cause hospitalization (adjusted hazard ratio, 0.77; 95% confidence interval, 0.71-0.84) and cardiac hospitalization (adjusted hazard ratio, 0.68; 95% confidence interval, 0.55-0.83) but not with emergency room visits. Propensity-matched analysis demonstrated a persistent association between CR completion and reduced mortality. Among those coronary artery disease patients referred, CR completion is associated with improved survival and decreased hospitalization. There is a need to explore reasons for nonattendance and to test interventions to improve attendance after referral.
Leigh, Lucy; Hudson, Irene L; Byles, Julie E
2015-12-01
The aim of this study is to identify patterns of sleep difficulty in older women, to investigate whether sleep difficulty is an indicator for poorer survival, and to determine whether sleep difficulty modifies the association between disease and death. Data were from the Australian Longitudinal Study on Women's Health, a 15-year longitudinal cohort study, with 10 721 women aged 70-75 years at baseline. Repeated-measures latent class analysis identified four classes of persistent sleep difficulty: troubled sleepers (N = 2429, 22.7%); early wakers (N = 3083, 28.8%); trouble falling asleep (N = 1767, 16.5%); and untroubled sleepers (N = 3442, 32.1%). Sleep difficulty was an indicator for mortality. Compared with untroubled sleepers, hazard ratios and 95% confidence intervals for troubled sleepers, early wakers, and trouble falling asleep were 1.12 (1.03, 1.23), 0.81 (0.75, 0.91), and 0.89 (0.79, 1.00), respectively. Sleep difficulty may modify the prognosis of women with chronic diseases. Hazard ratios (and 95% confidence intervals) for having three or more diseases (compared with 0 diseases) were enhanced for untroubled sleepers, early wakers, and trouble falling asleep [hazard ratio = 1.86 (1.55, 2.22), 1.91 (1.56, 2.35), and 1.98 (1.47, 2.66), respectively], and reduced for troubled sleepers [hazard ratio = 1.57 (1.24, 1.98)]. Sleep difficulty in older women is more complex than the presence or absence of sleep difficulty, and should be considered when assessing the risk of death associated with disease.
Hazard Analysis for Building 34 Vacuum Glove Box Assembly
NASA Technical Reports Server (NTRS)
Meginnis, Ian
2014-01-01
One of the characteristics of an effective safety program is the recognition and control of hazards before mishaps or failures occur. Conducting potentially hazardous tests necessitates a thorough hazard analysis in order to prevent injury to personnel, and to prevent damage to facilities and equipment. The primary purpose of this hazard analysis is to define and address the potential hazards and controls associated with the Building 34 Vacuum Glove Box Assembly, and to provide the applicable team of personnel with the documented results. It is imperative that each member of the team be familiar with the hazards and controls associated with his/her particular tasks, assignments and activities while interfacing with facility test systems, equipment and hardware. In fulfillment of the stated purposes, the goal of this hazard analysis is to identify all hazards that have the potential to harm personnel, damage the facility or its test systems or equipment, test articles, Government or personal property, or the environment. This analysis may also assess the significance and risk, when applicable, of lost test objectives when substantial monetary value is involved. The hazards, causes, controls, verifications, and risk assessment codes have been documented on the hazard analysis work sheets in Appendix A of this document. The preparation and development of this report is in accordance with JPR 1700.1, "JSC Safety and Health Handbook" and JSC 17773 Rev D "Instructions for Preparation of Hazard Analysis for JSC Ground Operations".
Attrition in Psychotherapy: A Survival Analysis
ERIC Educational Resources Information Center
Roseborough, David John; McLeod, Jeffrey T.; Wright, Florence I.
2016-01-01
Purpose: Attrition is a common problem in psychotherapy and can be defined as clients ending treatment before achieving an optimal response. Method: This longitudinal, archival study utilized data for 3,728 clients, using the Outcome Questionnaire 45.2. A Cox proportional hazards regression model (yielding hazard ratios) was used in order to better…
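For readers unfamiliar with the hazard ratios that recur throughout these records: under the simplest (constant-hazard, i.e. exponential) assumption, a hazard ratio between two groups reduces to a ratio of incidence rates, events per unit of person-time. A hypothetical sketch with made-up numbers, not data from the study above:

```python
# Toy hazard-ratio arithmetic under constant hazards; all numbers invented.
def incidence_rate(events, person_time):
    """Events per unit of person-time (e.g. dropouts per person-month)."""
    return events / person_time

def hazard_ratio(events_a, pt_a, events_b, pt_b):
    """Ratio of the two groups' incidence rates."""
    return incidence_rate(events_a, pt_a) / incidence_rate(events_b, pt_b)

# e.g. 30 dropouts over 600 person-months vs 20 over 800 person-months:
print(round(hazard_ratio(30, 600, 20, 800), 2))  # 2.0
```

A Cox model generalizes this by letting the baseline hazard vary over time while assuming the ratio between groups stays constant; the exponentiated regression coefficient is the hazard ratio.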
Population-based studies of antithyroid drugs and sudden cardiac death
van Noord, Charlotte; Sturkenboom, Miriam C J M; Straus, Sabine M J M; Hofman, Albert; Witteman, Jacqueline C M; Stricker, Bruno H Ch
2009-01-01
AIM Thyroid free T4 is associated with QTc-interval prolongation, which is a risk factor for sudden cardiac death (SCD). Hyperthyroidism has been associated with SCD in case reports, but there are no population-based studies confirming this. The aim was to investigate whether use of antithyroid drugs (as a direct cause or as an indicator of poorly controlled hyperthyroidism) is associated with an increased risk of SCD. METHODS We studied the occurrence of SCD in a two-step procedure in two different Dutch populations. First, the prospective population-based Rotterdam Study including 7898 participants (≥55 years old). Second, we used the Integrated Primary Care Information (IPCI) database, which is a longitudinal general practice research database to see whether we could replicate results from the first study. Drug use at the index date was assessed with prescription information from automated pharmacies (Rotterdam Study) or drug prescriptions from general practices (IPCI). We used a Cox proportional hazards model in a cohort analysis, adjusted for age, gender and use of QTc prolonging drugs (Rotterdam Study) and conditional logistic regression analysis in a case–control analysis, matched for age, gender, practice and calendar time and adjusted for arrhythmia and cerebrovascular ischaemia (IPCI). RESULTS In the Rotterdam Study, 375 participants developed SCD during follow-up. Current use of antithyroid drugs was associated with SCD [adjusted hazard ratio 3.9; 95% confidence interval (CI) 1.7, 8.7]. IPCI included 1424 cases with SCD and 14 443 controls. Also in IPCI, current use of antithyroid drugs was associated with SCD (adjusted odds ratio 2.9; 95% CI 1.1, 7.4). CONCLUSIONS Use of antithyroid drugs was associated with a threefold increased risk of SCD. 
Although this might be directly caused by antithyroid drug use, it might be more readily explained by underlying poorly controlled hyperthyroidism, since treated patients who developed SCD still had low thyroid-stimulating hormone levels shortly before death. PMID:19740403
Regional Nodal Irradiation in Early-Stage Breast Cancer.
Whelan, Timothy J; Olivotto, Ivo A; Parulekar, Wendy R; Ackerman, Ida; Chua, Boon H; Nabid, Abdenour; Vallis, Katherine A; White, Julia R; Rousseau, Pierre; Fortin, Andre; Pierce, Lori J; Manchul, Lee; Chafe, Susan; Nolan, Maureen C; Craighead, Peter; Bowen, Julie; McCready, David R; Pritchard, Kathleen I; Gelmon, Karen; Murray, Yvonne; Chapman, Judy-Anne W; Chen, Bingshu E; Levine, Mark N
2015-07-23
Most women with breast cancer who undergo breast-conserving surgery receive whole-breast irradiation. We examined whether the addition of regional nodal irradiation to whole-breast irradiation improved outcomes. We randomly assigned women with node-positive or high-risk node-negative breast cancer who were treated with breast-conserving surgery and adjuvant systemic therapy to undergo either whole-breast irradiation plus regional nodal irradiation (including internal mammary, supraclavicular, and axillary lymph nodes) (nodal-irradiation group) or whole-breast irradiation alone (control group). The primary outcome was overall survival. Secondary outcomes were disease-free survival, isolated locoregional disease-free survival, and distant disease-free survival. Between March 2000 and February 2007, a total of 1832 women were assigned to the nodal-irradiation group or the control group (916 women in each group). The median follow-up was 9.5 years. At the 10-year follow-up, there was no significant between-group difference in survival, with a rate of 82.8% in the nodal-irradiation group and 81.8% in the control group (hazard ratio, 0.91; 95% confidence interval [CI], 0.72 to 1.13; P=0.38). The rates of disease-free survival were 82.0% in the nodal-irradiation group and 77.0% in the control group (hazard ratio, 0.76; 95% CI, 0.61 to 0.94; P=0.01). Patients in the nodal-irradiation group had higher rates of grade 2 or greater acute pneumonitis (1.2% vs. 0.2%, P=0.01) and lymphedema (8.4% vs. 4.5%, P=0.001). Among women with node-positive or high-risk node-negative breast cancer, the addition of regional nodal irradiation to whole-breast irradiation did not improve overall survival but reduced the rate of breast-cancer recurrence. (Funded by the Canadian Cancer Society Research Institute and others; MA.20 ClinicalTrials.gov number, NCT00005957.).
Mediterranean Diet and Cardiovascular Health: Teachings of the PREDIMED Study
Ros, Emilio; Martínez-González, Miguel A.; Estruch, Ramon; Salas-Salvadó, Jordi; Fitó, Montserrat; Martínez, José A.; Corella, Dolores
2014-01-01
The PREDIMED (Prevención con Dieta Mediterránea) study was designed to assess the long-term effects of the Mediterranean diet (MeDiet) without any energy restriction on incident cardiovascular disease (CVD) as a multicenter, randomized, primary prevention trial in individuals at high risk. Participants were randomly assigned to 3 diet groups: 1) MeDiet supplemented with extra-virgin olive oil (EVOO); 2) MeDiet supplemented with nuts; and 3) control diet (advice on a low-fat diet). After 4.8 y, 288 major CVD events occurred in 7447 participants; crude hazard ratios were 0.70 (95% CI: 0.53, 0.91) for the MeDiet + EVOO and 0.70 (95% CI: 0.53, 0.94) for the MeDiet + nuts compared with the control group. Respective hazard ratios for incident diabetes (273 cases) among 3541 participants without diabetes were 0.60 (95% CI: 0.43, 0.85) and 0.82 (95% CI: 0.61, 1.10) compared with the control group. After 1-y follow-up, participants in the MeDiet + nuts group showed a significant 13.7% reduction in prevalence of metabolic syndrome compared with reductions of 6.7% and 2.0% in the MeDiet + EVOO and control groups, respectively. Analyses of intermediate markers of cardiovascular risk demonstrated beneficial effects of the MeDiets on blood pressure, lipid profiles, lipoprotein particles, inflammation, oxidative stress, and carotid atherosclerosis, as well as on the expression of proatherogenic genes involved in vascular events and thrombosis. Nutritional genomics studies demonstrated interactions between a MeDiet and cyclooxygenase-2 (COX-2), interleukin-6 (IL-6), apolipoprotein A2 (APOA2), cholesteryl ester transfer protein plasma (CETP), and transcription factor 7-like 2 (TCF7L2) gene polymorphisms. The PREDIMED study results demonstrate that a high-unsaturated fat and antioxidant-rich dietary pattern such as the MeDiet is a useful tool in the prevention of CVD. PMID:24829485
Gotland, N; Uhre, M L; Mejer, N; Skov, R; Petersen, A; Larsen, A R; Benfield, T
2016-10-01
Data describing long-term mortality in patients with Staphylococcus aureus bacteremia (SAB) are scarce. This study investigated risk factors, causes of death, and temporal trends in long-term mortality associated with SAB in a nationwide population-based matched cohort study. Mortality rates and ratios for 25,855 cases and 258,547 controls were analyzed by Poisson regression. The hazard ratio of death was computed by Cox proportional hazards regression analysis. The majority of deaths occurred within the first year of SAB (44.6%), and a further 15% occurred within the following 2-5 years. The mortality rate was 14-fold higher in the first year after SAB and 4.5-fold higher overall for cases compared to controls. Increasing age, comorbidity, and hospital contact within 90 days of SAB were associated with an increased risk of death. The overall relative risk of death decreased gradually by 38% from 1992-1995 to 2012-2014. Compared to controls, SAB patients were more likely to die from congenital malformation, musculoskeletal/skin disease, digestive system disease, genitourinary disease, infectious disease, endocrine disease, injury, and cancer, and less likely to die from respiratory disease, nervous system disease, unknown causes, psychiatric disorders, cardiovascular disease, and senility. Over time, rates of death decreased or were stable for all disease categories except for musculoskeletal and skin disease, where a trend towards an increase was seen. Long-term mortality after SAB was high but decreased over time. SAB cases were more likely to die of eight specific causes of death and less likely to die of five other causes compared to controls. Rates of death decreased for most disease categories. Risk factors associated with long-term mortality were similar to those found for short-term mortality. To improve long-term survival after SAB, patients should be screened for comorbidity associated with SAB. Copyright © 2016 The British Infection Association. Published by Elsevier Ltd. All rights reserved.
Vannucchi, Alessandro M; Kantarjian, Hagop M; Kiladjian, Jean-Jacques; Gotlib, Jason; Cervantes, Francisco; Mesa, Ruben A; Sarlis, Nicholas J; Peng, Wei; Sandor, Victor; Gopalakrishna, Prashanth; Hmissi, Abdel; Stalbovskaya, Viktoriya; Gupta, Vikas; Harrison, Claire; Verstovsek, Srdan
2015-09-01
Ruxolitinib, a potent Janus kinase 1/2 inhibitor, resulted in rapid and durable improvements in splenomegaly and disease-related symptoms in the 2 phase III COMFORT studies. In addition, ruxolitinib was associated with prolonged survival compared with placebo (COMFORT-I) and best available therapy (COMFORT-II). We present a pooled analysis of overall survival in the COMFORT studies using an intent-to-treat analysis and an analysis correcting for crossover in the control arms. Overall, 301 patients received ruxolitinib (COMFORT-I, n=155; COMFORT-II, n=146) and 227 patients received placebo (n=154) or best available therapy (n=73). After a median of three years of follow-up, intent-to-treat analysis showed that patients who received ruxolitinib had prolonged survival compared with patients who received placebo or best available therapy [hazard ratio=0.65; 95% confidence interval (95% CI): 0.46-0.90; P=0.01]; the crossover-corrected hazard ratio was 0.29 (95% CI: 0.13-0.63). Patients with both intermediate-2- and high-risk disease showed prolonged survival, and patients with high-risk disease in the ruxolitinib group had survival similar to that of patients with intermediate-2-risk disease in the control group. The Kaplan-Meier estimate of overall survival at week 144 was 78% in the ruxolitinib arm, 61% in the intent-to-treat control arm, and 31% in the crossover-adjusted control arm. While larger spleen size at baseline was prognostic for shortened survival, reductions in spleen size with ruxolitinib treatment correlated with longer survival. These findings are consistent with previous reports and support the conclusion that ruxolitinib offers a survival benefit for patients with myelofibrosis compared with conventional therapies. (clinicaltrials.gov identifiers: COMFORT-I, NCT00952289; COMFORT-II, NCT00934544). Copyright © Ferrata Storti Foundation.
Wieder, Robert; Shafiq, Basit; Adam, Nabil
2016-01-01
BACKGROUND: African American race negatively impacts survival from localized breast cancer, but co-variable factors confound the impact. METHODS: Data sets from the Surveillance, Epidemiology and End Results (SEER) registries from 1973 to 2011 were analyzed, consisting of patients with a designated diagnosis of breast adenocarcinoma; race recorded as White or Caucasian, Black or African American, Asian, American Indian or Alaskan Native, or Native Hawaiian or Pacific Islander; age; stage I, II or III; grade 1, 2 or 3; estrogen receptor or progesterone receptor positive or negative status; marital status as single, married, separated, divorced or widowed; and laterality as right or left. The Cox Proportional Hazards Regression model was used to determine hazard ratios for survival. The chi-square test was applied to determine the interdependence of variables found significant in the multivariable Cox Proportional Hazards Regression analysis. Cells with stratified data of patients with identical characteristics except African American or Caucasian race were compared. RESULTS: Age, stage, grade, ER and PR status, and marital status significantly co-varied with race and with each other. Stratifications by single co-variables demonstrated worse hazard ratios for survival for African Americans. Stratification by three and four co-variables demonstrated worse hazard ratios for survival for African Americans in most subgroupings with sufficient numbers of values. Differences in some subgroupings containing poor prognostic co-variables did not reach significance, suggesting that race effects may be partly overcome by additional poor prognostic indicators. CONCLUSIONS: African American race is a poor prognostic indicator for survival from breast cancer, independent of 6 associated co-variables with prognostic significance. PMID:27698895
Intravascular ultrasound-guided unprotected left main coronary artery stenting in the elderly.
Tan, Qiang; Wang, Qingsheng; Liu, Dongtian; Zhang, Shuangyue; Zhang, Yang; Li, Yang
2015-05-01
To investigate whether intravascular ultrasound (IVUS)-guided percutaneous coronary intervention (PCI) could improve clinical outcomes compared with angiography-guided PCI in the treatment of unprotected left main coronary artery stenosis (ULMCA) in the elderly. This controlled study was carried out between October 2009 and September 2012 in Qinhuangdao First Hospital, Hebei Province, China. One hundred and twenty-three consecutive patients with ULMCA, aged 70 or older, were randomized to an IVUS-guided group and a control group. The occurrence of major adverse cardiac events (MACE: death, non-fatal myocardial infarction, or target lesion revascularization) was recorded after 2 years of follow-up. The IVUS-guided group had a lower rate of 2-year MACE than the control group (13.1% versus 29.3%, p=0.031). The incidence of target lesion revascularization was lower in the IVUS-guided group than in the control group (9.1% versus 24%, p=0.045). However, there were no differences in death and myocardial infarction between the 2 groups. On Cox proportional hazards analysis, distal lesion location was an independent predictor of MACE (hazard ratio [HR]: 1.99, confidence interval [CI]: 1.129-2.367; p=0.043), and IVUS guidance was an independent predictor of survival free of MACE (HR: 0.414, CI: 0.129-0.867; p=0.033). The use of IVUS could reduce MACE in elderly patients undergoing ULMCA intervention.
Sharma, R; Kraemer, DF; Torrazza, RM; Mai, V; Neu, J; Shuster, JJ; Hudak, ML
2015-01-01
OBJECTIVE: Recent reports have posited a temporal association between blood transfusion with packed red blood cells (BT) and necrotizing enterocolitis (NEC). We evaluated the relationship between BT and NEC among infants at three hospitals who were consented at birth into a prospective observational study of NEC. STUDY DESIGN: We used a case-control design to match each case of NEC in our study population of infants born at <33 weeks postmenstrual age (PMA) to one control infant using hospital of birth, PMA, birth weight and date of birth. RESULT: The number of transfusions per infant did not differ between 42 NEC cases and their controls (4.0 ± 4.6 vs 5.4 ± 4.1, mean ± s.d., P = 0.063). A matched-pair analysis did not identify an association of transfusion with NEC in either the 48-h or 7-day time periods before the onset of NEC. A Cox proportional hazards model stratified on the matched sets did not identify an association between the total number of BTs and NEC (hazard ratio 0.78, 95% confidence interval 0.57 to 1.07, P = 0.11). CONCLUSION: In contrast to previous studies, our case-control study did not identify a significant temporal association between BT and NEC. Additional large prospective randomized studies are needed to clarify the relationship between BT and NEC. PMID:25144159
Lee, Vivian W Y; Schwander, Bjoern; Lee, Victor H F
2014-06-01
To compare the effectiveness and cost-effectiveness of erlotinib versus gefitinib as first-line treatment of epidermal growth factor receptor-activating mutation-positive non-small-cell lung cancer patients. DESIGN. Indirect treatment comparison and cost-effectiveness assessment. SETTING. Hong Kong. PATIENTS. Those having epidermal growth factor receptor-activating mutation-positive non-small-cell lung cancer. Erlotinib and gefitinib were compared on the basis of four relevant Asian phase III randomised controlled trials: one for erlotinib (OPTIMAL) and three for gefitinib (IPASS; NEJGSG; WJTOG). The cost-effectiveness assessment model simulates the transition between the health states progression-free survival, progression, and death over a lifetime horizon. The World Health Organization criterion (incremental cost-effectiveness ratio <3 times gross domestic product per capita:
Ganz, Peter; Amarenco, Pierre; Goldstein, Larry B; Sillesen, Henrik; Bao, Weihang; Preston, Gregory M; Welch, K Michael A
2017-12-01
Established risk factors do not fully identify patients at risk for recurrent stroke. The SPARCL trial (Stroke Prevention by Aggressive Reduction in Cholesterol Levels) evaluated the effect of atorvastatin on stroke risk in patients with a recent stroke or transient ischemic attack and no known coronary heart disease. This analysis explored the relationships between 13 plasma biomarkers assessed at trial enrollment and the occurrence of outcome strokes. We conducted a case-cohort study of 2176 participants; 562 had outcome strokes and 1614 were selected randomly from those without outcome strokes. Time to stroke was evaluated by Cox proportional hazards models. There was no association between time to stroke and lipoprotein-associated phospholipase A2, monocyte chemoattractant protein-1, resistin, matrix metalloproteinase-9, N-terminal fragment of pro-B-type natriuretic peptide, soluble vascular cell adhesion molecule-1, soluble intercellular adhesion molecule-1, or soluble CD40 ligand. In adjusted analyses, osteopontin (hazard ratio per SD change, 1.362; P<0.0001), neopterin (hazard ratio, 1.137; P=0.0107), myeloperoxidase (hazard ratio, 1.177; P=0.0022), and adiponectin (hazard ratio, 1.207; P=0.0013) were independently associated with outcome strokes. After adjustment for the Stroke Prognostic Instrument-II and treatment, osteopontin, neopterin, and myeloperoxidase remained independently associated with outcome strokes. The addition of these 3 biomarkers to Stroke Prognostic Instrument-II increased the area under the receiver operating characteristic curve by 0.023 (P=0.015) and yielded a continuous net reclassification improvement (29.1%; P<0.0001) and an integrated discrimination improvement (42.3%; P<0.0001). Osteopontin, neopterin, and myeloperoxidase were independently associated with the risk of recurrent stroke and improved risk classification when added to a clinical risk algorithm. URL: http://www.clinicaltrials.gov.
Unique Identifier: NCT00147602. © 2017 American Heart Association, Inc.
Meguid, Robert A.; Hooker, Craig M.; Harris, James; Xu, Li; Westra, William H.; Sherwood, J. Timothy; Sussman, Marc; Cattaneo, Stephen M.; Shin, James; Cox, Solange; Christensen, Joani; Prints, Yelena; Yuan, Nance; Zhang, Jennifer; Yang, Stephen C.
2010-01-01
Background: Survival outcomes of never smokers with non-small cell lung cancer (NSCLC) who undergo surgery are poorly characterized. This investigation compared surgical outcomes of never and current smokers with NSCLC. Methods: This investigation was a single-institution retrospective study of never and current smokers with NSCLC from 1975 to 2004. From an analytic cohort of 4,546 patients with NSCLC, we identified 724 never smokers and 3,822 current smokers. Overall, 1,142 patients underwent surgery with curative intent. For survival analysis by smoking status, hazard ratios (HRs) were estimated using Cox proportional hazards modeling and then further adjusted by other covariates. Results: Never smokers were significantly more likely than current smokers to be women (P < .01), older (P < .01), and to have adenocarcinoma (P < .01) and bronchioloalveolar carcinoma (P < .01). No statistically significant differences existed in stage distribution at presentation for the analytic cohort (P = .35) or for the subgroup undergoing surgery (P = .24). The strongest risk factors for mortality among patients with NSCLC who underwent surgery were advanced stage (adjusted hazard ratio, 3.43; 95% CI, 2.32-5.07; P < .01) and elevated American Society of Anesthesiologists classification (adjusted hazard ratio, 2.18; 95% CI, 1.40-3.40; P < .01). The minor trend toward an elevated risk of death on univariate analysis for current vs never smokers in the surgically treated group (hazard ratio, 1.20; 95% CI, 0.98-1.46; P = .07) was completely eliminated when the model was adjusted for covariates (P = .97). Conclusions: Our findings suggest that smoking status at the time of lung cancer diagnosis has little impact on the long-term survival of patients with NSCLC, especially after curative surgery. Despite the different etiologies of lung cancer in never and current smokers, the prognosis is equally dismal. PMID:20507946
Whitson, Bryan A; Groth, Shawn S; Andrade, Rafael S; Mitiek, Mohi O; Maddaus, Michael A; D'Cunha, Jonathan
2012-03-01
We used a population-based data set to assess the association between the extent of pulmonary resection for bronchoalveolar carcinoma and survival. The reports thus far have been limited to small, institutional series. Using the Surveillance, Epidemiology, and End Results database (1988-2007), we identified patients with bronchoalveolar carcinoma who had undergone wedge resection, segmentectomy, or lobectomy. The bronchoalveolar carcinoma histologic findings were mucinous, nonmucinous, mixed, not otherwise specified, and alveolar carcinoma. To adjust for potential confounders, we used a Cox proportional hazards regression model. A total of 6810 patients met the inclusion criteria. Compared with the sublobar resections (wedge resections and segmentectomies), lobectomy conferred superior 5-year overall (59.5% vs 43.9%) and cancer-specific (67.1% vs 53.1%) survival (P < .0001). After adjusting for potential confounding patient and tumor characteristics, we found that patients who underwent an anatomic resection had significantly better overall (segmentectomy: hazard ratio, 0.59; 95% confidence interval, 0.43-0.81; lobectomy: hazard ratio, 0.50; 95% confidence interval, 0.44-0.57) and cancer-specific (segmentectomy: hazard ratio, 0.51; 95% confidence interval, 0.34-0.75; lobectomy: hazard ratio, 0.46; 95% confidence interval, 0.40-0.53) survival compared with patients who underwent wedge resection. Additionally, gender, race, tumor size, and degree of tumor de-differentiation were negative prognostic factors. Our results were unchanged when we limited our analysis to early-stage disease. Using a population-based data set, we found that anatomic resections for bronchoalveolar carcinoma conferred superior overall and cancer-specific survival rates compared with wedge resection. Bronchoalveolar carcinoma's propensity for intraparenchymal spread might be the underlying biologic basis of our observation of improved survival after anatomic resection. 
Copyright © 2012 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
Budhathoki, Sanjeev; Hidaka, Akihisa; Sawada, Norie; Tanaka-Mizuno, Sachiko; Kuchiba, Aya; Charvat, Hadrien; Goto, Atsushi; Kojima, Satoshi; Sudo, Natsuki; Shimazu, Taichi; Sasazuki, Shizuka; Inoue, Manami; Tsugane, Shoichiro; Iwasaki, Motoki
2018-01-01
OBJECTIVE: To evaluate the association between pre-diagnostic circulating vitamin D concentration and the subsequent risk of overall and site-specific cancer in a large cohort study. DESIGN: Nested case-cohort study within the Japan Public Health Center-based Prospective Study cohort. SETTING: Nine public health centre areas across Japan. PARTICIPANTS: 3301 incident cases of cancer and 4044 randomly selected subcohort participants. EXPOSURE: Plasma concentration of 25-hydroxyvitamin D measured by enzyme immunoassay. Participants were divided into quarters based on the sex- and season-specific distribution of 25-hydroxyvitamin D among subcohorts. Weighted Cox proportional hazards models were used to calculate the multivariable-adjusted hazard ratios for overall and site-specific cancer across categories of 25-hydroxyvitamin D concentration, with the lowest quarter as the reference. MAIN OUTCOME MEASURE: Incidence of overall or site-specific cancer. RESULTS: Plasma 25-hydroxyvitamin D concentration was inversely associated with the risk of total cancer, with multivariable-adjusted hazard ratios for the second to fourth quarters compared with the lowest quarter of 0.81 (95% confidence interval 0.70 to 0.94), 0.75 (0.65 to 0.87), and 0.78 (0.67 to 0.91), respectively (P for trend=0.001). Among the findings for cancers at specific sites, an inverse association was found for liver cancer, with corresponding hazard ratios of 0.70 (0.44 to 1.13), 0.65 (0.40 to 1.06), and 0.45 (0.26 to 0.79) (P for trend=0.006). A sensitivity analysis showed that alternately removing cases of cancer at one specific site from total cancer cases did not substantially change the overall hazard ratios. CONCLUSIONS: In this large prospective study, higher vitamin D concentration was associated with lower risk of total cancer. These findings support the hypothesis that vitamin D has protective effects against cancers at many sites. PMID:29514781
Sáez, María E; González-Pérez, Antonio; Johansson, Saga; Himmelmann, Anders; García Rodríguez, Luis A
2016-07-01
In secondary cardiovascular prevention, discontinuation of acetylsalicylic acid (ASA) is associated with an increased risk of cardiovascular events. This study assessed the impact of ASA reinitiation on the risk of myocardial infarction and coronary heart disease death. Patients prescribed ASA for secondary cardiovascular prevention and who had had a period of ASA discontinuation of ≥90 days in 2000-2007 were identified from The Health Improvement Network (N = 10,453). Incidence of myocardial infarction/coronary heart disease death was calculated. Survival analyses using adjusted Cox proportional hazard models were performed to calculate hazard ratios and 95% confidence intervals for the risk of myocardial infarction/coronary heart disease death associated with ASA use patterns after the initial period of discontinuation. Individuals who were prescribed ASA during follow-up were considered reinitiators. The incidence of myocardial infarction/coronary heart disease death was 8.90 cases per 1000 person-years. Risk of myocardial infarction/coronary heart disease death was similar for current ASA users, who had been continuously exposed since reinitiation, and patients who had not reinitiated ASA (hazard ratio 1.27, 95% confidence interval 0.93-1.73). Among reinitiators, an additional period of ASA discontinuation was associated with increased risk of myocardial infarction/coronary heart disease death compared with no reinitiation (current users: hazard ratio 1.46, 95% confidence interval 1.13-1.90; noncurrent users: hazard ratio 1.70, 95% confidence interval 1.31-2.21). ASA reinitiation was not associated with a decreased risk of myocardial infarction/coronary heart disease death. This may be explained by confounding by indication/comorbidity, whereby higher-risk patients are more likely to reinitiate therapy. An additional period of ASA discontinuation among reinitiators was associated with an increased risk of myocardial infarction/coronary heart disease death. 
© The European Society of Cardiology 2015.
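Several records here report crude incidence as cases per 1000 person-years (in the abstract above, 8.90 for myocardial infarction/coronary heart disease death). A minimal sketch of that arithmetic, using hypothetical event and follow-up counts since the abstract does not give the raw denominators:

```python
# Sketch: crude incidence rate per 1000 person-years.
# The counts below (445 events, 50,000 person-years) are hypothetical,
# chosen only to illustrate the calculation behind figures like
# "8.90 cases per 1000 person-years".

def incidence_per_1000_py(events: int, person_years: float) -> float:
    """Crude incidence rate per 1000 person-years of follow-up."""
    return 1000.0 * events / person_years

rate = incidence_per_1000_py(445, 50_000)
print(round(rate, 2))  # 8.9
```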
History of Childhood Kidney Disease and Risk of Adult End-Stage Renal Disease.
Calderon-Margalit, Ronit; Golan, Eliezer; Twig, Gilad; Leiba, Adi; Tzur, Dorit; Afek, Arnon; Skorecki, Karl; Vivante, Asaf
2018-02-01
The long-term risk associated with childhood kidney disease that had not progressed to chronic kidney disease in childhood is unclear. We aimed to estimate the risk of future end-stage renal disease (ESRD) among adolescents who had normal renal function and a history of childhood kidney disease. We conducted a nationwide, population-based, historical cohort study of 1,521,501 Israeli adolescents who were examined before compulsory military service in 1967 through 1997; data were linked to the Israeli ESRD registry. Kidney diseases in childhood included congenital anomalies of the kidney and urinary tract, pyelonephritis, and glomerular disease; all participants included in the primary analysis had normal renal function and no hypertension in adolescence. Cox proportional-hazards models were used to estimate the hazard ratio for ESRD associated with a history of childhood kidney disease. During 30 years of follow-up, ESRD developed in 2490 persons. A history of any childhood kidney disease was associated with a hazard ratio for ESRD of 4.19 (95% confidence interval [CI], 3.52 to 4.99). The associations between each diagnosis of kidney disease in childhood (congenital anomalies of the kidney and urinary tract, pyelonephritis, and glomerular disease) and the risk of ESRD in adulthood were similar in magnitude (multivariable-adjusted hazard ratios of 5.19 [95% CI, 3.41 to 7.90], 4.03 [95% CI, 3.16 to 5.14], and 3.85 [95% CI, 2.77 to 5.36], respectively). A history of kidney disease in childhood was associated with younger age at the onset of ESRD (hazard ratio for ESRD among adults <40 years of age, 10.40 [95% CI, 7.96 to 13.59]). A history of clinically evident kidney disease in childhood, even if renal function was apparently normal in adolescence, was associated with a significantly increased risk of ESRD, which suggests that kidney injury or structural abnormality in childhood has long-term consequences.
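The hazard ratios above come with 95% confidence intervals (e.g., 4.19, 95% CI 3.52 to 4.99 for any childhood kidney disease). Under the usual Wald/normal approximation, the log-scale standard error and a two-sided P value can be recovered from such an interval; a small sketch of that calculation (illustrative only, not the authors' code):

```python
import math

# Sketch: back-calculating the log-scale standard error and a two-sided
# Wald P value from a reported hazard ratio and its 95% CI, assuming the
# CI was built as exp(log(HR) +/- 1.96 * SE). Numbers are the childhood
# kidney disease result quoted above: HR 4.19, 95% CI 3.52 to 4.99.

def se_from_ci(lo: float, hi: float, z: float = 1.96) -> float:
    """Standard error of log(HR) implied by a symmetric log-scale CI."""
    return (math.log(hi) - math.log(lo)) / (2 * z)

def two_sided_p(hr: float, se: float) -> float:
    """Two-sided Wald P value for H0: HR = 1."""
    z = abs(math.log(hr)) / se
    return math.erfc(z / math.sqrt(2))  # equals 2 * (1 - Phi(z))

se = se_from_ci(3.52, 4.99)
p = two_sided_p(4.19, se)
print(round(se, 3))  # ~0.089; p is vanishingly small, consistent with HR >> 1
```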
Hayes, J F; Bhaskaran, K; Batterham, R; Smeeth, L; Douglas, I
2015-01-01
Background/Objectives: The marketing authorization for the weight loss drug sibutramine was suspended in 2010 following a major trial that showed increased rates of non-fatal myocardial infarction and cerebrovascular events in patients with pre-existing cardiovascular disease. In routine clinical practice, sibutramine was already contraindicated in patients with cardiovascular disease, and so the relevance of these influential clinical trial findings to the 'real world' population of patients receiving or eligible for the drug is questionable. We assessed rates of myocardial infarction and cerebrovascular events in a cohort of patients prescribed sibutramine or orlistat in the United Kingdom. Subjects/Methods: A cohort of patients prescribed weight loss medication was identified within the Clinical Practice Research Datalink. Rates of myocardial infarction or cerebrovascular event, and all-cause mortality, were compared between patients prescribed sibutramine and similar patients prescribed orlistat, using both a multivariable Cox proportional hazard model and a propensity score-adjusted model. Possible effect modification by pre-existing cardiovascular disease and cardiovascular risk factors was assessed. Results: Patients prescribed sibutramine (N=23 927) appeared to have an elevated rate of myocardial infarction or cerebrovascular events compared with those taking orlistat (N=77 047; hazard ratio 1.69, 95% confidence interval 1.12–2.56). However, subgroup analysis showed the elevated rate was larger in those with pre-existing cardiovascular disease (hazard ratio 4.37, 95% confidence interval 2.21–8.64), compared with those with no cardiovascular disease (hazard ratio 1.52, 95% confidence interval 0.92–2.48, P-interaction=0.0076). All-cause mortality was not increased in those prescribed sibutramine (hazard ratio 0.67, 95% confidence interval 0.34–1.32).
Conclusions: Sibutramine was associated with increased rates of acute cardiovascular events in people with pre-existing cardiovascular disease, but there was a low absolute risk in those without. Sibutramine's marketing authorization may have, therefore, been inappropriately withdrawn for people without cardiovascular disease. PMID:25971925
Yen, Amy Ming-Fang; Boucher, Barbara J; Chiu, Sherry Yueh-Hsia; Fann, Jean Ching-Yuan; Chen, Sam Li-Sheng; Huang, Kuo-Chin; Chen, Hsiu-Hsi
2016-08-02
Transgenerational effects of paternal Areca catechu nut chewing on offspring metabolic syndrome (MetS) risk in humans, on obesity and diabetes mellitus experimentally, and of paternal smoking on offspring obesity, are reported, likely attributable to genetic and epigenetic effects previously reported in betel-associated disease. We aimed to determine the effects of paternal smoking and betel chewing on the risks of early MetS in human offspring. The 13 179 parent-child trios identified from 238 364 Taiwanese aged ≥20 years screened at 2 community-based integrated screening sessions were tested for the effects of paternal smoking, areca nut chewing, and their duration prefatherhood on age of detecting offspring MetS at screen by using a Cox proportional hazards regression model. Offspring MetS risk increased with prefatherhood paternal areca nut usage (adjusted hazard ratio, 1.77; 95% confidence interval [CI], 1.23-2.53 versus nonchewing fathers), with adjusted hazard ratios of 3.28 (95% CI, 1.67-6.43) for >10 years of paternal betel chewing, 1.62 (95% CI, 0.88-2.96) for 5 to 9 years, and 1.42 (95% CI, 0.80-2.54) for <5 years of betel usage prefatherhood (Ptrend=0.0002), with increased risk (adjusted hazard ratio, 1.95; 95% CI, 1.26-3.04) for paternal areca nut usage from 20 to 29 years of age, versus from >30 years of age (adjusted hazard ratio, 1.61; 95% CI, 0.22-11.69). Offspring MetS risk for paternal smoking increased dosewise (Ptrend<0.0001) and with earlier age of onset (Ptrend=0.0009), independently. Longer duration of paternal betel quid chewing and smoking prefatherhood independently predicted early occurrence of incident MetS in offspring, corroborating previously reported transgenerational effects of these habits and supporting the need for habit-cessation program provision. © 2016 American Heart Association, Inc.
Damman, Peter; Wallentin, Lars; Fox, Keith A A; Windhausen, Fons; Hirsch, Alexander; Clayton, Tim; Pocock, Stuart J; Lagerqvist, Bo; Tijssen, Jan G P; de Winter, Robbert J
2012-01-31
The present study was designed to investigate the long-term prognostic impact of procedure-related and spontaneous myocardial infarction (MI) on cardiovascular mortality in patients with non-ST-elevation acute coronary syndrome. Five-year follow-up after procedure-related or spontaneous MI was investigated in the individual patient pooled data set of the FRISC-II (Fast Revascularization During Instability in Coronary Artery Disease), ICTUS (Invasive Versus Conservative Treatment in Unstable Coronary Syndromes), and RITA-3 (Randomized Intervention Trial of Unstable Angina 3) non-ST-elevation acute coronary syndrome trials. The principal outcome was cardiovascular death up to 5 years of follow-up. Cumulative event rates were estimated by the Kaplan-Meier method; hazard ratios were calculated with time-dependent Cox proportional hazards models. Adjustments were made for the variables associated with long-term outcomes. Among the 5467 patients, 212 experienced a procedure-related MI within 6 months after enrollment. A spontaneous MI occurred in 236 patients within 6 months. The cumulative cardiovascular death rate was 5.2% in patients who had a procedure-related MI, comparable to that for patients without a procedure-related MI (hazard ratio 0.66; 95% confidence interval, 0.36-1.20, P=0.17). In patients who had a spontaneous MI within 6 months, the cumulative cardiovascular death rate was 22.2%, higher than for patients without a spontaneous MI (hazard ratio 4.52; 95% confidence interval, 3.37-6.06, P<0.001). These hazard ratios did not change materially after risk adjustments. Five-year follow-up of patients with non-ST-elevation acute coronary syndrome from the 3 trials showed no association between a procedure-related MI and long-term cardiovascular mortality. In contrast, there was a substantial increase in long-term mortality after a spontaneous MI.
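The cumulative event rates above were estimated with the Kaplan-Meier method. As an illustration of the product-limit idea behind those curves (toy data, not the trials' actual analysis code), a minimal pure-Python estimator:

```python
# Sketch: a minimal Kaplan-Meier product-limit estimator.
# At each distinct event time t, survival is multiplied by
# (1 - deaths_at_t / number_at_risk_just_before_t); censored
# subjects leave the risk set without contributing an event.

def kaplan_meier(times, events):
    """Return [(t, S(t))] at each distinct event time.

    times  -- follow-up time per subject
    events -- 1 if the event occurred at that time, 0 if censored
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, out = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)   # events at time t
        m = sum(1 for tt, _ in data if tt == t)   # all leaving risk set at t
        if d > 0:
            surv *= 1.0 - d / n_at_risk
            out.append((t, surv))
        n_at_risk -= m
        i += m
    return out

km = kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 0, 1])
# km steps at t=1, 2, and 4: roughly [(1, 0.8), (2, 0.6), (4, 0.0)]
```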
Buys, Roselien; Coeckelberghs, Ellen; Cornelissen, Véronique A; Goetschalckx, Kaatje; Vanhees, Luc
2016-09-01
Peak oxygen uptake is an independent predictor of mortality in patients with coronary artery disease (CAD). However, patients with CAD are not always capable of reaching peak effort, and therefore submaximal gas exchange variables such as the oxygen uptake efficiency slope (OUES) have been introduced. Baseline exercise capacity as expressed by OUES provides prognostic information, and this parameter responds to training. Therefore, we aimed to assess the prognostic value of post-training OUES in patients with CAD. We included 960 patients with CAD (age 60.6 ± 9.5 years; 853 males) who completed a cardiac rehabilitation program between 2000 and 2011. The OUES was calculated before and after cardiac rehabilitation, and information on mortality was obtained. The relationships of post-training OUES with all-cause and cardiovascular (CV) mortality were assessed by Cox proportional hazards regression analyses. Receiver operating characteristic curve analysis was performed in order to obtain the optimal cut-off value. During 7.37 ± 3.20 years of follow-up (range: 0.45-13.75 years), 108 patients died, of whom 47 died of CV causes. The post-training OUES was related to all-cause (hazard ratio: 0.50, p < 0.001) and CV (hazard ratio: 0.40, p < 0.001) mortality. When significant covariates, including baseline OUES, were entered into the Cox regression analysis, post-training OUES remained related to all-cause and CV mortality (hazard ratio: 0.40, p < 0.01 and 0.26, p < 0.01, respectively). In addition, the change in OUES due to exercise training was inversely related to mortality (hazard ratio: 0.49, p < 0.01). Post-training OUES has stronger prognostic value than baseline OUES. The lack of improvement in exercise capacity expressed by OUES after an exercise training program relates to a worse prognosis and can help distinguish patients with favorable and unfavorable prognoses. © The European Society of Cardiology 2016.
Hayes, J F; Bhaskaran, K; Batterham, R; Smeeth, L; Douglas, I
2015-09-01
The marketing authorization for the weight loss drug sibutramine was suspended in 2010 following a major trial that showed increased rates of non-fatal myocardial infarction and cerebrovascular events in patients with pre-existing cardiovascular disease. In routine clinical practice, sibutramine was already contraindicated in patients with cardiovascular disease, and so the relevance of these influential clinical trial findings to the real-world population of patients receiving or eligible for the drug is questionable. We assessed rates of myocardial infarction and cerebrovascular events in a cohort of patients prescribed sibutramine or orlistat in the United Kingdom. A cohort of patients prescribed weight loss medication was identified within the Clinical Practice Research Datalink. Rates of myocardial infarction or cerebrovascular event, and all-cause mortality, were compared between patients prescribed sibutramine and similar patients prescribed orlistat, using both a multivariable Cox proportional hazards model and a propensity score-adjusted model. Possible effect modification by pre-existing cardiovascular disease and cardiovascular risk factors was assessed. Patients prescribed sibutramine (N=23,927) appeared to have an elevated rate of myocardial infarction or cerebrovascular events compared with those taking orlistat (N=77,047; hazard ratio 1.69, 95% confidence interval 1.12-2.56). However, subgroup analysis showed the elevated rate was larger in those with pre-existing cardiovascular disease (hazard ratio 4.37, 95% confidence interval 2.21-8.64) compared with those with no cardiovascular disease (hazard ratio 1.52, 95% confidence interval 0.92-2.48, P-interaction=0.0076). All-cause mortality was not increased in those prescribed sibutramine (hazard ratio 0.67, 95% confidence interval 0.34-1.32).
Sibutramine was associated with increased rates of acute cardiovascular events in people with pre-existing cardiovascular disease, but there was a low absolute risk in those without. Sibutramine's marketing authorization may have, therefore, been inappropriately withdrawn for people without cardiovascular disease.
Jonkman, Nini H; Westland, Heleen; Groenwold, Rolf H H; Ågren, Susanna; Atienza, Felipe; Blue, Lynda; Bruggink-André de la Porte, Pieta W F; DeWalt, Darren A; Hebert, Paul L; Heisler, Michele; Jaarsma, Tiny; Kempen, Gertrudis I J M; Leventhal, Marcia E; Lok, Dirk J A; Mårtensson, Jan; Muñiz, Javier; Otsu, Haruka; Peters-Klimm, Frank; Rich, Michael W; Riegel, Barbara; Strömberg, Anna; Tsuyuki, Ross T; van Veldhuisen, Dirk J; Trappenburg, Jaap C A; Schuurmans, Marieke J; Hoes, Arno W
2016-03-22
Self-management interventions are widely implemented in the care for patients with heart failure (HF). However, trials show inconsistent results, and whether specific patient groups respond differently is unknown. This individual patient data meta-analysis assessed the effectiveness of self-management interventions in patients with HF and whether subgroups of patients respond differently. A systematic literature search identified randomized trials of self-management interventions. Data from 20 studies, representing 5624 patients, were included and analyzed with the use of mixed-effects models and Cox proportional-hazard models, including interaction terms. Self-management interventions reduced the risk of the combined end point of HF-related hospitalization or all-cause death (hazard ratio, 0.80; 95% confidence interval [CI], 0.71-0.89) and of HF-related hospitalization alone (hazard ratio, 0.80; 95% CI, 0.69-0.92), and improved 12-month HF-related quality of life (standardized mean difference, 0.15; 95% CI, 0.00-0.30). Subgroup analysis revealed a protective effect of self-management on the number of HF-related hospital days in patients <65 years of age (mean, 0.70 versus 5.35 days; interaction P=0.03). Patients without depression did not show an effect of self-management on survival (hazard ratio for all-cause mortality, 0.86; 95% CI, 0.69-1.06), whereas in patients with moderate/severe depression, self-management reduced survival (hazard ratio, 1.39; 95% CI, 1.06-1.83; interaction P=0.01). This study shows that self-management interventions had a beneficial effect on time to HF-related hospitalization or all-cause death and on HF-related hospitalization alone, and elicited a small increase in HF-related quality of life. The findings do not endorse limiting self-management interventions to subgroups of patients with HF, but increased mortality in depressed patients warrants caution in applying self-management strategies in these patients.
© 2016 American Heart Association, Inc.
Law, Sabrina P; Oron, Assaf P; Kemna, Mariska S; Albers, Erin L; McMullan, D Michael; Chen, Jonathan M; Law, Yuk M
2018-05-01
Ventricular assist devices have gained popularity in the management of refractory heart failure in children listed for heart transplantation. Our primary aim was to compare the composite endpoint of all-cause pretransplant mortality and loss of transplant eligibility in children who were treated with a ventricular assist device versus a medically managed cohort. This was a retrospective cohort analysis. Data were obtained from the Scientific Registry of Transplant Recipients. The at-risk population (n = 1,380) was less than 18 years old, either on a ventricular assist device (605 cases) or in an equivalent-severity, intensively medically treated group (referred to as MED, 775 cases). The impact of ventricular assist devices was estimated via Cox proportional hazards regression (hazard ratio), dichotomizing 1-year outcomes into "poor" (22%: 193 deaths, 114 too sick) versus all others (940 successful transplants, 41 too healthy, 90 censored), while adjusting for conventional risk factors. Among children 0-12 months old, ventricular assist device use was associated with a higher risk of poor outcomes (hazard ratio, 2.1; 95% CI, 1.5-3.0; p < 0.001). By contrast, ventricular assist device use was associated with improved outcomes for ages 12-18 years (hazard ratio, 0.3; 95% CI, 0.1-0.7; p = 0.003). For candidates 1-5 and 6-11 years old, there were no differences in outcomes between the ventricular assist device and MED groups (hazard ratios, 0.8 and 1.0; p = 0.43 and 0.9). The interaction between ventricular assist devices and age group was strongly significant (p < 0.001). This is a comparative study of ventricular assist devices versus medical therapy in children. Age is a significant modulator of waitlist outcomes for children with end-stage heart failure supported by ventricular assist device, with the impact of ventricular assist devices being more beneficial in adolescents.
Fredman, Lisa; Lyons, Jennifer G; Cauley, Jane A; Hochberg, Marc; Applebaum, Katie M
2015-09-01
Previous studies have shown inconsistent associations between caregiving and mortality. This may be because caregiver status was analyzed at baseline only, and because better health is probably related to both taking on caregiving responsibilities and continuing in that role. The latter is termed the Healthy Caregiver Hypothesis, similar to the Healthy Worker Effect in occupational epidemiology. We applied common approaches from occupational epidemiology to evaluate the association between caregiving and mortality, including treating caregiving as time-varying and lagging exposure by up to 5 years. Caregiving status among 1,068 women (baseline mean age = 81.0 years; 35% caregivers) participating in the Caregiver-Study of Osteoporotic Fractures study was assessed at five interviews conducted between 1999 and 2009. Mortality was determined through January 2012. Cox proportional hazards models were used to estimate hazard ratios and 95% confidence intervals adjusted for sociodemographics, perceived stress, and functional limitations. A total of 483 participants died during follow-up (38.8% of baseline caregivers and 48.7% of baseline noncaregivers). Using baseline caregiving status, the hazard ratio for mortality was 0.77 (95% CI: 0.62-0.95). Models of time-varying caregiving status showed a more pronounced reduction in mortality in current caregivers (hazard ratio = 0.54; 95% CI: 0.38-0.75), which diminished with longer lag periods (3-year lag hazard ratio = 0.68, 95% CI: 0.52-0.88; 5-year lag hazard ratio = 0.76, 95% CI: 0.60-0.95). Overall, caregivers had lower mortality rates than noncaregivers in all analyses. These associations were sensitive to the lagged period, indicating that the timing of leaving caregiving does influence this relationship and should be considered in future investigations. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved.
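The lagged-exposure approach described above classifies each person's exposure at a given time using her status from k years earlier, so that recent transitions (such as leaving caregiving shortly before death) cannot drive the estimate. A minimal sketch of that reclassification step, with hypothetical interview waves:

```python
# Lagging a time-varying exposure: status at `year` is taken from the most
# recent record at or before year - lag, so recent changes (e.g. leaving
# caregiving near death) do not count yet. Waves below are hypothetical.
def lagged_status(waves, year, lag):
    """waves: sorted list of (interview_year, exposed) for one subject.
    Returns the exposure in force at `year` under a `lag`-year lag,
    or None if no record is old enough."""
    status = None
    for wave_year, exposed in waves:
        if wave_year <= year - lag:
            status = exposed  # most recent record old enough to count
        else:
            break
    return status

waves = [(1999, True), (2002, False), (2005, True)]
```

With a 0-year lag, the 2002 transition out of caregiving applies immediately from 2002 onward; with a 3-year lag, the subject is still classified as a caregiver in 2004; with a 5-year lag, there is no usable record before 2000.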
Muramatsu, Takashi; Matsushita, Kunihiro; Yamashita, Kentaro; Kondo, Takahisa; Maeda, Kengo; Shintani, Satoshi; Ichimiya, Satoshi; Ohno, Miyoshi; Sone, Takahito; Ikeda, Nobuo; Watarai, Masato; Murohara, Toyoaki
2012-03-01
It has not been fully examined whether an angiotensin II receptor blocker is superior to a calcium channel blocker in reducing cardiovascular events in hypertensive patients with glucose intolerance. A prospective, open-labeled, randomized, controlled trial was conducted in Japanese hypertensive patients with type 2 diabetes mellitus or impaired glucose tolerance. A total of 1150 patients (women: 34%; mean age: 63 years; diabetes mellitus: 82%) were randomly assigned to receive either valsartan- or amlodipine-based antihypertensive treatment. The primary outcome was a composite of acute myocardial infarction, stroke, coronary revascularization, admission attributed to heart failure, or sudden cardiac death. At baseline, blood pressure was 145/82 and 144/81 mm Hg, and glycosylated hemoglobin was 7.0% and 6.9%, in the valsartan group and the amlodipine group, respectively. Both were controlled comparably in the 2 groups during the study. The median follow-up period was 3.2 years, and the primary outcome occurred in 54 patients in the valsartan group and 56 in the amlodipine group (hazard ratio: 0.97 [95% CI: 0.66-1.40]; P=0.85). Patients in the valsartan group had a significantly lower incidence of heart failure than those in the amlodipine group (hazard ratio: 0.20 [95% CI: 0.06-0.69]; P=0.01). Other components and all-cause mortality were not significantly different between the 2 groups. Composite cardiovascular outcomes were comparable between the valsartan- and amlodipine-based treatments in Japanese hypertensive patients with glucose intolerance. Admission because of heart failure was significantly less frequent in the valsartan group.
Maternal or paternal suicide and offspring's psychiatric and suicide-attempt hospitalization risk.
Kuramoto, S Janet; Stuart, Elizabeth A; Runeson, Bo; Lichtenstein, Paul; Långström, Niklas; Wilcox, Holly C
2010-11-01
We examined whether the risk for psychiatric morbidity requiring inpatient care was higher for offspring who experienced parental suicide, compared with offspring of fatal accident decedents, and whether the association varied according to the deceased parent's gender. Children and adolescents (0-17 years of age) who experienced maternal (N = 5600) or paternal (N = 17,847) suicide in 1973-2003 in Sweden were identified by using national, longitudinal, population-based registries. Cox regression modeling was used to compare psychiatric hospitalization risks among offspring of suicide decedents and propensity score-matched offspring of accident decedents. Offspring of maternal suicide decedents had increased risk of suicide-attempt hospitalization, after controlling for psychiatric hospitalization for decedents and surviving parents, compared with offspring of maternal accident decedents. Offspring of paternal suicide decedents had similar risk of suicide-attempt hospitalization, compared with offspring of accident decedents, but had increased risk of hospitalization attributable to depressive and anxiety disorders. The magnitude of risk for offspring suicide-attempt hospitalization was greater for those who experienced maternal versus paternal suicide, compared with their respective control offspring (interaction P = .05; offspring of maternal decedents, adjusted hazard ratio: 1.80 [95% confidence interval: 1.19-2.74]; offspring of paternal decedents, adjusted hazard ratio: 1.14 [95% confidence interval: 0.96-1.35]). Maternal suicide is associated with increased risk of suicide-attempt hospitalization for offspring, beyond the risk associated with maternal accidental death. However, paternal suicide is not associated with suicide-attempt hospitalization. Future studies should examine factors that might differ between offspring who experience maternal versus paternal suicide, including genetic or early environmental determinants.
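Propensity-score matching, as used above to pair offspring of suicide decedents with comparable offspring of accident decedents, is commonly implemented as greedy 1:1 nearest-neighbor matching within a caliper on the estimated score. A stdlib-only sketch under that assumption; the scores would normally come from a fitted logistic model, and the IDs and values below are hypothetical:

```python
# Greedy 1:1 nearest-neighbor propensity-score matching within a caliper.
# Scores are assumed precomputed (e.g. from logistic regression); the
# subject IDs and score values here are purely illustrative.
def greedy_match(treated, controls, caliper=0.05):
    """treated, controls: {subject_id: propensity_score}.
    Returns {treated_id: matched_control_id}; each control used at most once."""
    available = dict(controls)
    matches = {}
    # Match the highest-scoring treated subjects first: they are the
    # hardest to match, so they get first pick of the control pool.
    for tid, ts in sorted(treated.items(), key=lambda kv: -kv[1]):
        if not available:
            break
        cid = min(available, key=lambda c: abs(available[c] - ts))
        if abs(available[cid] - ts) <= caliper:
            matches[tid] = cid
            del available[cid]  # matching without replacement
    return matches

pairs = greedy_match({"t1": 0.80, "t2": 0.30},
                     {"c1": 0.78, "c2": 0.31, "c3": 0.55})
```

The caliper leaves poorly matched treated subjects unmatched rather than forcing a distant pairing, trading sample size for comparability.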
Burgner, David P; Cooper, Matthew N; Moore, Hannah C; Stanley, Fiona J; Thompson, Peter L; de Klerk, Nicholas H; Carter, Kim W
2015-01-01
Pathogen-specific and overall infection burden may contribute to atherosclerosis and cardiovascular disease (CVD), but the effect of infection severity and timing is unknown. We investigated whether childhood infection-related hospitalisation (IRH, a marker of severity) was associated with subsequent adult CVD hospitalisation. Using longitudinal population-based statutorily-collected administrative health data from Western Australia (1970-2009), we identified adults hospitalised with CVD (ischaemic heart disease, ischaemic stroke, and peripheral vascular disease) and matched them (10:1) to population controls. We used Cox regression to assess relationships between number and type of childhood IRH and adulthood CVD hospitalisation, adjusting for sex, age, Indigenous status, socioeconomic status, and birth weight. 631 subjects with CVD-related hospitalisation in adulthood (≥ 18 years) were matched with 6310 controls. One or more childhood (< 18 years) IRH was predictive of adult CVD-related hospitalisation (adjusted hazard ratio, 1.3; 95% CI 1.1-1.6; P < 0.001). The association showed a dose-response; ≥ 3 childhood IRH was associated with a 2.2 times increased risk of CVD-related hospitalisation in adulthood (adjusted hazard ratio, 2.2; 95% CI 1.7-2.9; P < 0.001). The association was observed across all clinical diagnostic groups of infection (upper respiratory tract infection, lower respiratory tract infection, infectious gastroenteritis, urinary tract infection, skin and soft tissue infection, and other viral infection), and individually with CVD diagnostic categories (ischaemic heart disease, ischaemic stroke and peripheral vascular disease). Severe childhood infection is associated with CVD hospitalisations in adulthood in a dose-dependent manner, independent of population-level risk factors.
Delay discounting rates: a strong prognostic indicator of smoking relapse.
Sheffer, Christine E; Christensen, Darren R; Landes, Reid; Carter, Larry P; Jackson, Lisa; Bickel, Warren K
2014-11-01
Recent evidence suggests that several dimensions of impulsivity and locus of control are likely to be significant prognostic indicators of relapse. One hundred and thirty-one treatment-seeking smokers were enrolled in six weeks of multi-component cognitive-behavioral therapy with eight weeks of nicotine replacement therapy. Cox proportional hazard regressions were used to model days to relapse with each of the following: delay discounting of $100, delay discounting of $1000, six subscales of the Barratt Impulsiveness Scale (BIS), Rotter's Locus of Control (RLOC), Fagerstrom's Test for Nicotine Dependence (FTND), and the Perceived Stress Scale (PSS). Hazard ratios for a one standard deviation increase were estimated with 95% confidence intervals for each explanatory variable. Likelihood ratios were used to examine the level of association with days to relapse for different combinations of the explanatory variables while accounting for nicotine dependence and stress level. These analyses found that the $100 delay discounting rate had the strongest association with days to relapse. Further, when discounting rates were combined with the FTND and PSS, the associations remained significant. When the other measures were combined with the FTND and PSS, their associations with relapse were non-significant. These findings indicate that delay discounting is independently associated with relapse and adds to what is already accounted for by nicotine dependence and stress level. They also signify that delay discounting is a productive new target for enhancing treatment for tobacco dependence. Consequently, adding an intervention designed to decrease discounting rates to a comprehensive treatment for tobacco dependence has the potential to decrease relapse rates. Copyright © 2014 Elsevier Ltd. All rights reserved.
Bergamin, Fabricio S; Almeida, Juliano P; Landoni, Giovanni; Galas, Filomena R B G; Fukushima, Julia T; Fominskiy, Evgeny; Park, Clarice H L; Osawa, Eduardo A; Diz, Maria P E; Oliveira, Gisele Q; Franco, Rafael A; Nakamura, Rosana E; Almeida, Elisangela M; Abdala, Edson; Freire, Maristela P; Filho, Roberto K; Auler, Jose Otavio C; Hajjar, Ludhmila A
2017-05-01
To assess whether a restrictive strategy of RBC transfusion reduces 28-day mortality when compared with a liberal strategy in cancer patients with septic shock. Single-center, randomized, double-blind controlled trial in a teaching hospital. Adult cancer patients with septic shock in the first 6 hours of ICU admission were randomized to a liberal (hemoglobin threshold < 9 g/dL; n = 149) or a restrictive (hemoglobin threshold < 7 g/dL; n = 151) strategy of RBC transfusion during the ICU stay. Patients in the liberal group received more RBC units than patients in the restrictive group (1 [0-3] vs 0 [0-2] units; p < 0.001). At 28 days after randomization, the mortality rate in the liberal group (the primary endpoint of the study) was 45% (67 patients) versus 56% (84 patients) in the restrictive group (hazard ratio, 0.74; 95% CI, 0.53-1.04; p = 0.08), with no differences in ICU and hospital length of stay. At 90 days after randomization, the mortality rate in the liberal group was lower (59% vs 70%) than in the restrictive group (hazard ratio, 0.72; 95% CI, 0.53-0.97; p = 0.03). We observed a survival trend favoring a liberal transfusion strategy in patients with septic shock when compared with the restrictive strategy. These results went in the opposite direction of the a priori hypothesis and of other trials in the field and need to be confirmed.
Impact of Insecticide-Treated Net Ownership on All-Cause Child Mortality in Malawi, 2006-2010.
Florey, Lia S; Bennett, Adam; Hershey, Christine L; Bhattarai, Achuyt; Nielsen, Carrie F; Ali, Doreen; Luhanga, Misheck; Taylor, Cameron; Eisele, Thomas P; Yé, Yazoume
2017-09-01
Insecticide-treated nets (ITNs) have been shown to be highly effective at reducing malaria morbidity and mortality in children. However, there are limited studies that assess the association between increasing ITN coverage and child mortality over time, at the national level, and under programmatic conditions. Two analytic approaches were used to examine this association: a retrospective cohort analysis of individual children and a district-level ecologic analysis. To evaluate the association between household ITN ownership and all-cause child mortality (ACCM) at the individual level, data from the 2010 Demographic and Health Survey (DHS) were modeled in a Cox proportional hazards framework while controlling for numerous environmental, household, and individual confounders through the use of exact matching. To evaluate the population-level association between ITN ownership and ACCM between 2006 and 2010, program ITN distribution data and mortality data from the 2006 Multiple Indicator Cluster Survey and the 2010 DHS were aggregated at the district level and modeled using negative binomial regression. In the Cox model controlling for household, child, and maternal health factors, children between 1 and 59 months in households owning an ITN had significantly lower mortality compared with those without an ITN (hazard ratio = 0.75, 95% confidence interval [CI] = 0.62-0.90). In the district-level model, higher ITN ownership was significantly associated with lower ACCM (incidence rate ratio = 0.77; 95% CI = 0.60-0.98). These findings suggest that increasing ITN ownership may have contributed to the decline in ACCM during 2006-2010 in Malawi and represent a novel use of district-level data from nationally representative surveys.
Robson, Val; Dodd, Susanna; Thomas, Stephen
2009-03-01
This paper is a report of a study to compare a medical grade honey with conventional treatments on the healing rates of wounds healing by secondary intention. There is an increasing body of evidence to support the use of honey to treat wounds, but there is a lack of robust randomized trials on which clinicians can base their clinical judgement. A sample of 105 patients was enrolled in a single-centre, open-label randomized controlled trial in which patients received either a conventional wound dressing or honey. Data were collected between September 2004 and May 2007. The median time to healing in the honey group was 100 days, compared with 140 days in the control group. The healing rate at 12 weeks was 46.2% in the honey group compared with 34.0% in the conventional group, and the difference in healing rates (95% confidence interval, CI) at 12 weeks between the two groups was 12.2% (-13.6%, 37.9%). The unadjusted hazard ratio (95% CI) from a Cox regression was 1.30 (0.77, 2.19), P = 0.321. When the treatment effect was adjusted for confounding factors (sex, wound type, age, and wound area at start of treatment), the hazard ratio increased to 1.51 but was again not statistically significant. Wound area at start of treatment and sex were both highly statistically significant predictors of time to healing. These results support the proposition that there are clinical benefits from using honey in wound care, but further research is needed.
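Several abstracts above report unadjusted hazard ratios from Cox regression for a single two-group comparison. Under a constant-hazard (exponential) assumption, the maximum-likelihood hazard ratio for one binary covariate reduces to a ratio of incidence rates (events per unit of person-time), which can be computed directly; the numbers below are illustrative, not any trial's data:

```python
# Crude hazard ratio as a ratio of incidence rates (events / person-time).
# Under constant (exponential) hazards this equals the maximum-likelihood
# hazard ratio for a single binary covariate; with non-constant hazards it
# is only a rough approximation to the Cox estimate. Illustrative figures.
def incidence_rate(events, person_time):
    """Events per unit of person-time (e.g. per person-year)."""
    return events / person_time

def crude_hazard_ratio(events_exposed, pt_exposed, events_ref, pt_ref):
    """Rate in the exposed group divided by rate in the reference group."""
    return incidence_rate(events_exposed, pt_exposed) / \
           incidence_rate(events_ref, pt_ref)

# 24 events over 2000 person-years vs 20 events over 2500 person-years.
hr = crude_hazard_ratio(24, 2000.0, 20, 2500.0)  # 0.012 / 0.008 = 1.5
```

A full Cox model generalizes this by leaving the baseline hazard unspecified and comparing groups within risk sets at each event time, which is why it tolerates censoring and adjustment for covariates.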