Palkovits, Stefan; Seidel, Gerald; Pertl, Laura; Malle, Eva M; Hausberger, Silke; Makk, Johanna; Singer, Christoph; Osterholt, Julia; Herzog, Sereina A; Haas, Anton; Weger, Martin
2017-12-01
To evaluate the effect of intravitreal bevacizumab on the macular choroidal volume and the subfoveal choroidal thickness in treatment-naïve eyes with exudative age-related macular degeneration. The macular choroidal volume and the subfoveal choroidal thickness were measured using enhanced depth imaging optical coherence tomography. After a screening examination, each patient received 3 monthly intravitreal injections of 1.25 mg bevacizumab. A final assessment was performed one month after the third injection. Forty-seven patients with a mean age of 80 ± 6.4 years were included. The macular choroidal volume decreased significantly from a median of 4.1 mm³ (interquartile range 3.4-5.9) to a median of 3.9 mm³ (interquartile range 3.1-5.6) between the baseline and final examination (difference -0.46 mm³, 95% confidence interval: -0.57 to -0.35, P < 0.001). Similarly, subfoveal choroidal thickness decreased from 157.0 μm (interquartile range 116.0-244.5) at baseline to 139.0 μm (interquartile range 102.5-212.0) at the final examination (P < 0.001). Neither macular choroidal volume at baseline nor subfoveal choroidal thickness at baseline was associated with the response to treatment. The macular choroidal volume and the subfoveal choroidal thickness decreased significantly after 3 monthly bevacizumab injections for exudative age-related macular degeneration.
Goker, Berna; Block, Joel A
2006-01-01
The risk of developing bilateral disease progressing to total hip arthroplasty (THA) among patients who undergo unilateral THA for non-traumatic avascular necrosis (AVN) remains poorly understood. An analysis of the time-course to contralateral THA, as well as the effects of underlying AVN risk factors, is presented. Forty-seven consecutive patients who underwent THA for AVN were evaluated. Peri-operative and annual post-operative antero-posterior pelvis radiographs were examined for evidence of contralateral involvement. Patient age, weight, height, underlying AVN risk factor(s), date of onset of contralateral hip pain if it occurred, and date of contralateral THA if performed, were recorded. Bone scan, computerized tomography and magnetic resonance imaging data were utilized when available. Twenty-one patients (46.6%) underwent contralateral THA for AVN within a median of 9 months after the initial THA (range 0-93 months, interquartile range 28.5 months). The median follow-up for patients without contralateral THA was 75 months (range 3-109, interquartile range 69 months). Thirty-four patients had radiographic findings of contralateral AVN at study entry; 25 were symptomatic bilaterally at entry and 7 developed contralateral symptoms within a mean time of 12 months (median 10 months, interquartile range 12 months). None of the 13 patients who were free of radiographic evidence of contralateral AVN at study entry developed evidence of AVN during the follow-up. AVN associated with glucocorticoid use was more likely to manifest as bilateral disease than either idiopathic AVN or ethanol-associated AVN (P=0.02 and P=0.03, respectively). Radiographically-evident AVN in the contralateral hip at THA is unlikely to remain asymptomatic for a prolonged period of time. Conversely, asymptomatic contralateral hips without radiographic evidence of AVN are unlikely to develop clinically significant AVN.
Nakanishi, Taizo; Goto, Tadahiro; Kobuchi, Taketsune; Kimura, Tetsuya; Hayashi, Hiroyuki; Tokuda, Yasuharu
2017-12-22
To compare bystander cardiopulmonary resuscitation skills retention between conventional learning and flipped learning for first-year medical students. A post-test only control group design. A total of 108 participants were randomly assigned to either conventional learning or flipped learning. The primary outcome measures of time to the first chest compression and the number of total chest compressions during a 2-minute test period 6 months after the training were assessed with the Mann-Whitney U test. Fifty participants (92.6%) in the conventional learning group and 45 participants (83.3%) in the flipped learning group completed the study. There were no statistically significant differences 6 months after the training in the time to the first chest compression of 33.0 seconds (interquartile range, 24.0-42.0) for the conventional learning group and 31.0 seconds (interquartile range, 25.0-41.0) for the flipped learning group (U=1171.0, p=0.73) or in the number of total chest compressions of 101.5 (interquartile range, 90.8-124.0) for the conventional learning group and 104.0 (interquartile range, 91.0-121.0) for the flipped learning group (U=1083.0, p=0.75). The 95% confidence interval of the difference between means of the number of total chest compressions 6 months after the training did not exceed a clinically important difference defined a priori. There were no significant differences between the conventional learning group and the flipped learning group in our main outcomes. Flipped learning might be comparable to conventional learning, and seems to be a promising approach that requires fewer resources and enables student-centered learning without compromising the acquisition of CPR skills.
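The primary comparison above is a rank-based Mann-Whitney U test applied to the two groups' outcome values. Purely as an illustrative sketch (using SciPy and invented example values, not the study's data), such a comparison can be run as follows.

from scipy.stats import mannwhitneyu

# Hypothetical times to first chest compression (seconds); not the study's measurements
conventional = [33.0, 28.5, 41.0, 24.0, 36.0, 30.0, 42.5, 27.0]
flipped = [31.0, 26.0, 39.5, 25.5, 34.0, 29.0, 40.0, 28.0]

# Two-sided Mann-Whitney U test, as named for the primary outcomes above
u_stat, p_value = mannwhitneyu(conventional, flipped, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.3f}")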
Mattioli, Sandro; Ruffato, Alberto; Lugaresi, Marialuisa; Pilotti, Vladimiro; Aramini, Beatrice; D'Ovidio, Frank
2010-11-01
The quality of outcome of the Heller-Dor operation sometimes differs between studies, likely for technical reasons. We analyze the details of myotomy and fundoplication in relation to the results achieved over a 30-year single-center experience. From 1979 to 2008, a long esophagogastric myotomy and a partial anterior fundoplication to protect the surface of the myotomy was routinely performed with intraoperative manometry in 202 patients (97 men; median age, 55.5 years; interquartile range, 43.7-71 years) through a laparotomy and in 60 patients (24 men; median age, 46 years; interquartile range, 36.2-63 years) through a laparoscopy. The follow-up consisted of periodic interviews, endoscopy, and barium swallow, and a semiquantitative scale was used to grade results. Mortality was 1 of 202 in the laparotomy group and 0 of 60 in the laparoscopy group. Median follow-up was 96 months (interquartile range, 48-190.5 months) in the laparotomy group and 48 months (interquartile range, 27-69.5 months) in the laparoscopy group. At intraoperative manometry, complete abolition of the high-pressure zone was obtained in 100%. The Dor-related high-pressure zone length and mean pressure were 4.5 ± 0.4 cm and 13.3 ± 2.2 mm Hg in the laparotomy group and 4.5 ± 0.5 cm and 13.2 ± 2.2 mm Hg in the laparoscopy group (P = .75). In the laparotomy group, poor results (19/201 [9.5%]) were secondary to esophagitis in 15 (7.5%) of 201 patients (in 2 patients after 184 and 252 months, respectively) and to recurrent dysphagia in 4 (2%) of 201 patients, all with end-stage sigmoid achalasia. In the laparoscopy group, 2 (3.3%) of 60 had esophagitis. A long esophagogastric myotomy protected by means of Dor fundoplication cures or substantially reduces dysphagia in the great majority of patients affected by esophageal achalasia and effectively controls postoperative esophagitis. Intraoperative manometry is likely the key factor for achieving the reported results. Copyright © 2010 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
Smerud, Hilde Kloster; Bárány, Peter; Lindström, Karin; Fernström, Anders; Sandell, Anna; Påhlsson, Peter; Fellström, Bengt
2011-10-01
Systemic corticosteroid treatment has been shown to exert some protection against renal deterioration in IgA nephropathy (IgAN) but is not commonly recommended for long-term use due to the well-known systemic side effects. In this study, we investigated the efficacy and safety of a new enteric formulation of the locally acting glucocorticoid budesonide (Nefecon®), designed to release the active compound in the ileocecal region. The primary objective was to evaluate the efficacy of targeted release budesonide on albuminuria. Budesonide 8 mg/day was given to 16 patients with IgAN for 6 months, followed by a 3-month follow-up period. The efficacy was measured as change in 24-h urine albumin excretion, serum creatinine and estimated glomerular filtration rate (eGFR). The median relative reduction in urinary albumin excretion was 23% during the treatment period (interquartile range: -0.36 to -0.04, P = 0.04) with pretreatment values ranging from 0.3 to 6 g/24 h (median: 1.5 g/24 h). The median reduction in urine albumin peaked at 40% (interquartile range: -0.58 to -0.15) 2 months after treatment discontinuation. Serum creatinine was reduced by 6% (interquartile range: -0.12 to -0.02; P = 0.003), and eGFR [Modification of Diet in Renal Disease (MDRD)] increased ∼8% (interquartile range: 0.02-0.16, P = 0.003) during treatment. No major corticosteroid-related side effects were observed. In the present pilot study, enteric budesonide targeted to the ileocecal region had a significant effect on urine albumin excretion, accompanied by a minor reduction of serum creatinine and a modest increase of eGFR calculated by the MDRD equation, while eGFR calculated from the Cockcroft-Gault equation and cystatin C was not changed. Enteric budesonide may represent a new treatment of IgAN warranting further investigation.
St Louis, James D; Jodhka, Upinder; Jacobs, Jeffrey P; He, Xia; Hill, Kevin D; Pasquali, Sara K; Jacobs, Marshall L
2014-12-01
Contemporary outcomes data for complete atrioventricular septal defect (CAVSD) repair are limited. We sought to describe early outcomes of CAVSD repair across a large multicenter cohort, and explore potential associations with patient characteristics, including age, weight, and genetic syndromes. Patients in the Society of Thoracic Surgeons Congenital Heart Surgery Database having repair of CAVSD (2008-2011) were included. Preoperative, operative, and outcomes data were described. Univariate associations between patient factors and outcomes were described. Of 2399 patients (101 centers), 78.4% had Down syndrome. Median age at surgery was 4.6 months (interquartile range, 3.5-6.1 months), with 11.8% (n = 284) aged ≤ 2.5 months. Median weight at surgery was 5.0 kg (interquartile range, 4.3-5.8 kg) with 6.3% (n = 151) < 3.5 kg. Pulmonary artery band removal at CAVSD repair was performed in 122 patients (4.6%). Major complications occurred in 9.8%, including permanent pacemaker implantation in 2.7%. Median postoperative length of stay (PLOS) was 8 days (interquartile range, 5-14 days). Overall hospital mortality was 3.0%. Weight < 3.5 kg and age ≤ 2.5 months were associated with higher mortality, longer PLOS, and increased frequency of major complications. Patients with Down syndrome had lower rates of mortality and morbidities than other patients; PLOS was similar. In a contemporary multicenter cohort, most patients with CAVSD have repair early in the first year of life. Prior pulmonary artery band is rare. Hospital mortality is generally low, although patients at extremes of low weight and younger age have worse outcomes. Mortality and major complication rates are lower in patients with Down syndrome. Copyright © 2014 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
Larochelle, Marc R; Bernson, Dana; Land, Thomas; Stopka, Thomas J; Wang, Na; Xuan, Ziming; Bagley, Sarah M; Liebschutz, Jane M; Walley, Alexander Y
2018-06-19
Opioid overdose survivors have an increased risk for death. Whether use of medications for opioid use disorder (MOUD) after overdose is associated with mortality is not known. To identify MOUD use after opioid overdose and its association with all-cause and opioid-related mortality. Retrospective cohort study. 7 individually linked data sets from Massachusetts government agencies. 17 568 Massachusetts adults without cancer who survived an opioid overdose between 2012 and 2014. Three types of MOUD were examined: methadone maintenance treatment (MMT), buprenorphine, and naltrexone. Exposure to MOUD was identified at monthly intervals, and persons were considered exposed through the month after last receipt. A multivariable Cox proportional hazards model was used to examine MOUD as a monthly time-varying exposure variable to predict time to all-cause and opioid-related mortality. In the 12 months after a nonfatal overdose, 2040 persons (11%) enrolled in MMT for a median of 5 months (interquartile range, 2 to 9 months), 3022 persons (17%) received buprenorphine for a median of 4 months (interquartile range, 2 to 8 months), and 1099 persons (6%) received naltrexone for a median of 1 month (interquartile range, 1 to 2 months). Among the entire cohort, all-cause mortality was 4.7 deaths (95% CI, 4.4 to 5.0 deaths) per 100 person-years and opioid-related mortality was 2.1 deaths (CI, 1.9 to 2.4 deaths) per 100 person-years. Compared with no MOUD, MMT was associated with decreased all-cause mortality (adjusted hazard ratio [AHR], 0.47 [CI, 0.32 to 0.71]) and opioid-related mortality (AHR, 0.41 [CI, 0.24 to 0.70]). Buprenorphine was associated with decreased all-cause mortality (AHR, 0.63 [CI, 0.46 to 0.87]) and opioid-related mortality (AHR, 0.62 [CI, 0.41 to 0.92]). No associations between naltrexone and all-cause mortality (AHR, 1.44 [CI, 0.84 to 2.46]) or opioid-related mortality (AHR, 1.42 [CI, 0.73 to 2.79]) were identified. Few events among naltrexone recipients preclude confident conclusions. A minority of opioid overdose survivors received MOUD. Buprenorphine and MMT were associated with reduced all-cause and opioid-related mortality. National Center for Advancing Translational Sciences of the National Institutes of Health.
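The analysis above treats medication receipt as a monthly time-varying exposure in a Cox proportional hazards model. As an illustrative sketch only (hypothetical person-month records and column names, not the study's linked datasets), such a model can be specified with the lifelines library as follows.

import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Hypothetical long-format data: one row per person-month, exposure may change each month
records = pd.DataFrame({
    "id":    [1, 1, 1, 2, 2, 3, 3, 3, 4, 4, 5, 5, 5, 6, 6],
    "start": [0, 1, 2, 0, 1, 0, 1, 2, 0, 1, 0, 1, 2, 0, 1],   # interval start (months)
    "stop":  [1, 2, 3, 1, 2, 1, 2, 3, 1, 2, 1, 2, 3, 1, 2],   # interval end (months)
    "moud":  [0, 1, 1, 0, 0, 1, 1, 0, 0, 1, 1, 1, 1, 0, 0],   # on medication during this month?
    "event": [0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0],   # death occurring in this interval
})

ctv = CoxTimeVaryingFitter()
ctv.fit(records, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()  # hazard ratio for "moud" illustrates the kind of association estimated above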
Poola, Ashwini Suresh; Rentea, Rebecca M; Weaver, Katrina L; St Peter, Shawn David
2017-05-01
While there is literature on techniques for pectus bar removal, there are limited reports on post-operative management. This can include obtaining a postoperative chest radiograph (CXR) despite the minimal risk of associated intra-thoracic complications. This is a review of our experience with bar removal and lack of routine post-operative CXR. A single institution retrospective chart review was performed from 2000 to 2015. Patients who underwent a pectus bar removal procedure were included. We assessed operative timing of bar placement and removal, procedure length, intra-operative and post-operative complications and post-operative CXR findings, specifically the rate of pneumothoraces. A total of 450 patients were identified in this study. Median duration of bar placement prior to removal was 35 months (interquartile range 30-36 months). Sixty-four patients obtained a post-operative CXR. Of these, only one film revealed a pneumothorax; this was not drained. A CXR was not obtained in 386 (86%) patients with no immediate or delayed complications from this practice. Median follow-up time for all patients was 11 months (interquartile range 7.5-17 months). The risk for a clinically relevant pneumothorax is minimal following bar removal. This suggests that not obtaining routine imaging following bar removal may be a safe practice.
Development of a PICU in Nepal: the experience of the first year.
Basnet, Sangita; Shrestha, Shrijana; Ghimire, Amrit; Timila, Dipsal; Gurung, Jeena; Karki, Utkarsha; Adhikari, Neelam; Andoh, Jennifer; Koirala, Janak
2014-09-01
Analysis of hospitalization data can help elucidate the pattern of morbidity and mortality in any given area. Little data exist on critically ill children admitted to hospitals in the resource-limited nation of Nepal. We sought to characterize the profile, management, and mortality of children admitted to one PICU. Retrospective analysis. A newly established PICU in Nepal. All patients aged 0 to 16 years admitted to the PICU from July 2009 to July 2010. None. In 12 months, 126 children were admitted to the PICU, of whom 43% were female. Sixty-three percent were under 5 years. Twenty-nine percent came from tertiary care hospitals and 38% from rural areas outside Kathmandu. Only 18% were transported by ambulance. Median distance travelled to be admitted was 30 km (interquartile range, 10-193). The highest number of admissions was in spring (40%), followed by summer (25%). Almost half were admitted for shock (45%), particularly septic shock (30%). The second most common reason for admission was neurologic etiologies (15%). Neonatal admissions were also significant (19%). Mortality was 26% and was significantly associated with septic shock (p < 0.01), mechanical ventilation (p < 0.01), and multiple organ dysfunction (p < 0.05). Almost one third of patients required mechanical ventilation; median duration was 4 days (interquartile range, 2-8). Mean length of stay in the hospital was 6.2 days (± 5.3) and the median was 4 days (interquartile range, 2.5-9.0). Median Pediatric Risk of Mortality II score for nonsurvivors was 12 (interquartile range, 7-21), and median Pediatric Index of Mortality II for nonsurvivors was 10 (interquartile range, 3-32). Within a short time of opening, the PICU has been seeing significant numbers of critically ill children. Despite adverse conditions and limited resources, survival of 75% is similar to many units in developing nations. Sepsis was the most common reason for PICU admission and mortality.
Avasare, Rupali S; Canetta, Pietro A; Bomback, Andrew S; Marasa, Maddalena; Caliskan, Yasar; Ozluk, Yasemin; Li, Yifu; Gharavi, Ali G; Appel, Gerald B
2018-03-07
C3 glomerulopathy is a form of complement-mediated GN. Immunosuppressive therapy may be beneficial in the treatment of C3 glomerulopathy. Mycophenolate mofetil is an attractive treatment option given its role in the treatment of other complement-mediated diseases and the results of the Spanish Group for the Study of Glomerular Diseases C3 Study. Here, we study the outcomes of patients with C3 glomerulopathy treated with steroids and mycophenolate mofetil. We conducted a retrospective chart review of patients in the C3 glomerulopathy registry at Columbia University and identified patients treated with mycophenolate mofetil for at least 3 months with follow-up of at least 1 year. We studied clinical, histologic, and genetic data for the whole group and compared data for those who achieved complete or partial remission (responders) with those who did not achieve remission (nonresponders). We compared remission with mycophenolate mofetil with remission with other immunosuppressive regimens. We identified 30 patients who met inclusion criteria. Median age was 25 years (interquartile range, 18-36), median creatinine was 1.07 mg/dl (interquartile range, 0.79-1.69), and median proteinuria was 3200 mg/g creatinine (interquartile range, 1720-6759). The median follow-up time was 32 months (interquartile range, 21-68). Twenty (67%) patients were classified as responders. There were no significant differences in baseline characteristics between responders and nonresponders, although initial proteinuria was lower (median 2468 mg/g creatinine) in responders compared with nonresponders (median 5000 mg/g creatinine) and soluble membrane attack complex levels were higher in responders compared with nonresponders. For those tapered off mycophenolate mofetil, the relapse rate was 50%. Genome-wide analysis of complement genes was performed, and in 12 patients, we found 18 variants predicted to be damaging. None of these variants were previously reported to be pathogenic. Mycophenolate mofetil with steroids outperformed other immunosuppressive regimens. Among patients who tolerated mycophenolate mofetil, combination therapy with steroids induced remission in 67% of this cohort. Heavier proteinuria at the start of therapy and lower soluble membrane attack complex levels were associated with treatment resistance. Copyright © 2018 by the American Society of Nephrology.
Borghol-Kassar, R; Menezo-Rozalén, J L; Harto-Castaño, M A; Desco-Esteban, M C
2015-03-01
The aim of this article is to study the effect of unilateral congenital cataract surgery on ocular growth and corneal flattening. This is a cross-sectional study of 59 patients operated on due to a unilateral congenital cataract. The median age of the patients at the time of diagnosis was 17 months (interquartile range, 5-39 months). The median age at the time of cataract surgery was 28 months (interquartile range, 8-52 months), and the mean follow-up between cataract surgery and assessments was 149.7±69.9 months (range, 30-319 months). Axial length and corneal curvature were measured in both operated and non-operated eyes, comparing the results between them. There were no statistically significant differences for axial length growth or corneal flattening between operated and non-operated eyes: axial length (P=.327, Student t test) and corneal curvature (P=.078, Student t test). A sub-analysis was performed using the visual acuity and the age of the patient at the time of surgery. The only statistically significant finding (P=.007, Student t test) was a lower axial length in operated eyes compared to non-operated eyes, in the non-deep-amblyopia group. No significant axial length growth modifications were observed between operated and non-operated eyes. Only the non-deep-amblyopia group presented with a lower axial length in the operated eyes compared to non-operated eyes. No significant differences in corneal flattening were found between groups after unilateral congenital cataract surgery. Copyright © 2011 Sociedad Española de Oftalmología. Published by Elsevier España, S.L.U. All rights reserved.
Aortoiliac morphologic correlations in aneurysms undergoing endovascular repair.
Ouriel, Kenneth; Tanquilut, Eugene; Greenberg, Roy K; Walker, Esteban
2003-08-01
The feasibility of endovascular aneurysm repair depends on morphologic characteristics of the aortoiliac segment. Knowledge of such characteristics is relevant to safe deployment of a particular device in a single patient and to development of new devices for use in patients with a broader spectrum of anatomic variations. We evaluated findings on computed tomography scans for 277 patients being considered for endovascular aneurysm repair. Aortic neck length and angulation estimates were generated with three-dimensional trigonometry. Specific centerline points were recorded, corresponding to the aorta at the celiac axis, lowest renal artery, cranial aspect of the aneurysm sac, aortic terminus, right hypogastric artery origin, and left hypogastric origin. Aortic neck thrombus and calcium content were recorded, and neck conicity was calculated in degrees. Statistical analysis was performed with the Spearman rank correlation. Data are expressed as median and interquartile range. Median diameter of the aneurysms was 52 mm (interquartile range, 48-59 mm) in minor axis and 56 mm (interquartile range, 51-64 mm) in major axis, and median length was 88 mm (interquartile range, 74-103 mm). Median proximal aortic neck diameter was 26 mm (interquartile range, 22-29 mm), and median neck length was 30 mm (interquartile range, 18-45 mm). The common iliac arteries were similar in diameter (right artery, 16 mm [interquartile range, 13-20 mm]; left artery, 15 mm [interquartile range, 11-18 mm]) and length (right, 59 mm [interquartile range, 50-69 mm]; left, 60 mm [interquartile range, 49-70 mm]). Median angulation of the infrarenal aortic neck was 40 degrees (interquartile range, 29-51 degrees), and median angulation of the suprarenal segment was 45 degrees (interquartile range, 36-57 degrees). By gender, sac diameter, proximal neck diameter, and iliac artery diameter were significantly larger in men. Significant linear associations were identified between sac diameter and sac length, neck angulation, and iliac artery diameter. As the length of the aneurysm sac increased, the proximal aortic neck length decreased. Conversely, as the sac length decreased, sac eccentricity increased. Mural thrombus content within the neck increased with increasing neck diameter. There is considerable variability in aortoiliac morphologic parameters. Significant associations were found between various morphologic variables, links that are presumably related to a shared pathogenesis for aberration in aortoiliac diameter, length, and angulation. Ultimately this information can be used to develop new endovascular devices with broader applicability and improved long-term results.
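The angulation estimates above were generated by three-dimensional trigonometry on recorded centerline points. Purely as an illustration of that kind of computation (the exact angle definition and the coordinates below are hypothetical, not taken from this study), one way to compute an angulation at a centerline point is:

import numpy as np

def angulation_degrees(p_proximal, p_mid, p_distal):
    # Angle (degrees) at p_mid between the segments p_mid->p_proximal and p_mid->p_distal,
    # reported as deviation from a straight (180-degree) course. Points are (x, y, z) in mm.
    a = np.asarray(p_proximal, float) - np.asarray(p_mid, float)
    b = np.asarray(p_distal, float) - np.asarray(p_mid, float)
    cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return 180.0 - np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Hypothetical centerline points: lowest renal artery, top of aneurysm sac, aortic terminus
print(round(angulation_degrees((0, 0, 120), (4, 2, 90), (20, 6, 40)), 1))  # about 10 degrees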
Gamma-knife radiosurgery in acromegaly: a 4-year follow-up study.
Attanasio, Roberto; Epaminonda, Paolo; Motti, Enrico; Giugni, Enrico; Ventrella, Laura; Cozzi, Renato; Farabola, Mario; Loli, Paola; Beck-Peccoz, Paolo; Arosio, Maura
2003-07-01
Stereotactic radiosurgery by gamma-knife (GK) is an attractive therapeutic option after failure of microsurgical removal in patients with pituitary adenoma. In these tumors or remnants of them, it aims to obtain the arrest of cell proliferation and hormone hypersecretion using a single precise high dose of ionizing radiation, sparing surrounding structures. The long-term efficacy and toxicity of GK in acromegaly are only partially known. Thirty acromegalic patients (14 women and 16 men) entered a prospective study of GK treatment. Most were surgical failures, whereas in 3 GK was the primary treatment. Imaging of the adenoma and target coordinates identification were obtained by high resolution magnetic resonance imaging. All patients were treated with multiple isocenters (mean, 8; range, 3-11). The 50% isodose was used in 27 patients (90%). The mean margin dose was 20 Gy (range, 15-35), and the dose to the visual pathways was always less than 8 Gy. After a median follow-up of 46 months (range, 9-96), IGF-I fell from 805 μg/liter (median; interquartile range, 640-994) to 460 μg/liter (interquartile range, 217-654; P = 0.0002), and normal age-matched IGF-I levels were reached in 7 patients (23%). Mean GH levels decreased from 10 μg/liter (interquartile range, 6.4-15) to 2.9 μg/liter (interquartile range, 2-5.3; P < 0.0001), reaching levels below 2.5 μg/liter in 11 (37%). The rate of persistently pathological hormonal levels was still 70% at 5 yr by Kaplan-Meier analysis. The median volume was 1.43 ml (range, 0.20-3.7). Tumor shrinkage (at least 25% of basal volume) occurred after 24 months (range, 12-36) in 11 of 19 patients (58% of assessable patients). The rate of shrinkage was 79% at 4 yr. In no case was further growth observed. Only 1 patient complained of side-effects (severe headache and nausea immediately after the procedure, with full recovery in a few days with steroid therapy). Anterior pituitary failures were observed in 2 patients, who already had partial hypopituitarism, after 2 and 6 yr, respectively. No patient developed visual deficits. GK is a valid adjunctive tool in the management of acromegaly that controls GH/IGF-I hypersecretion and tumor growth, with shrinkage of adenoma and no recurrence of the disease in the considered observation period and with low acute and chronic toxicity.
Kort, Daniel H; Lobo, Roger A
2014-11-01
To determine the effect of cinnamon on menstrual cyclicity and metabolic dysfunction in women with polycystic ovary syndrome (PCOS). In a prospective, placebo controlled, double-blinded randomized trial, 45 women with PCOS were randomized (1:1) to receive cinnamon supplements (1.5 g/d) or placebo for 6 months. Menstrual cyclicity (average cycles/month) during the 6 months study period was compared between the 2 groups using the Mann-Whitney U test. Changes in menstrual cyclicity and insulin resistance between baseline and the 6 month study period were compared between the 2 groups using Wilcoxon signed rank tests. Of the 45 women randomized, 26 completed 3 months of the study and 17 completed the entire 6 months of the study. During the 6 month intervention, menstrual cycles were more frequent in patients taking cinnamon compared with patients taking placebo (median, 0.75; interquartile range, 0.5-0.83 vs median, 0.25; interquartile range, 0-0.54; P = .0085; Mann-Whitney U). In patients taking cinnamon, menstrual cyclicity improved from baseline (+0.23 cycles/month; 95% confidence interval, 0.099-0.36), yet did not improve for women taking placebo (P = .0076, Wilcoxon signed rank). Serum samples (n = 5) from the luteal phase in different patients within the cinnamon group were thawed, and luteal-phase progesterone levels (>3 ng/mL) confirmed ovulatory menses. Measures of insulin resistance or serum androgen levels did not change for either group. These preliminary data suggest that cinnamon supplementation improves menstrual cyclicity and may be an effective treatment option for some women with PCOS. Copyright © 2014 Elsevier Inc. All rights reserved.
A strategy for optimizing staffing to improve the timeliness of inpatient phlebotomy collections.
Morrison, Aileen P; Tanasijevic, Milenko J; Torrence-Hill, Joi N; Goonan, Ellen M; Gustafson, Michael L; Melanson, Stacy E F
2011-12-01
The timely availability of inpatient test results is a key to physician satisfaction with the clinical laboratory, and in an institution with a phlebotomy service may depend on the timeliness of blood collections. In response to safety reports filed for delayed phlebotomy collections, we applied Lean principles to the inpatient phlebotomy service at our institution. Our goal was to improve service without using additional resources by optimizing our staffing model. To evaluate the effect of a new phlebotomy staffing model on the timeliness of inpatient phlebotomy collections. We compared the median time of morning blood collections and average number of safety reports filed for delayed phlebotomy collections during a 6-month preimplementation period and 5-month postimplementation period. The median time of morning collections was 17 minutes earlier after implementation (7:42 am preimplementation; interquartile range, 6:27-8:48 am; versus 7:25 am postimplementation; interquartile range, 6:20-8:26 am). The frequency of safety reports filed for delayed collections decreased 80% from 10.6 per 30 days to 2.2 per 30 days. Reallocating staff to match the pattern of demand for phlebotomy collections throughout the day represents a strategy for improving the performance of an inpatient phlebotomy service.
Use of Intranasal Dexmedetomidine as a Solo Sedative for MRI of Infants.
Olgun, Gokhan; Ali, Mir Hyder
2018-01-23
Dexmedetomidine, a selective α-2 receptor agonist, can be delivered via the intranasal (IN) route and be used for procedural sedation. The drug's favorable hemodynamic profile and relative ease of application make it a promising agent for sedation during radiologic procedures, although there are few studies on its efficacy for MRI. A retrospective chart review was performed between June 2014 and December 2016. Outpatients between 1 and 12 months of age who received 4 μg/kg of IN dexmedetomidine for MRI were included in the analysis. Our aim with this study was to determine the rate of successful completion of the sedation procedure without the need for a rescue drug (other than repeat IN dexmedetomidine). A total of 52 subjects were included in our study. Median (interquartile range) patient age was 7 (5-8) months. Median (interquartile range) procedure length was 40 (35-50) minutes. Overall success rate (including first dose and any rescue dose IN) of dexmedetomidine was 96.2%. None of the patients had significant adverse effects related to dexmedetomidine. IN dexmedetomidine is an effective solo sedative agent for MRI in infants. Copyright © 2018 by the American Academy of Pediatrics.
Tang, Tien; Abbott, Sally; le Roux, Carel W; Wilson, Violet; Singhal, Rishi; Bellary, Srikanth; Tahrani, Abd A
2018-03-01
We examined the relationship between weight changes after preoperative glucagon-like peptide-1 receptor agonist (GLP-1RA) treatment and weight changes from the start of medical weight management (MWM) until 12 months after bariatric surgery in patients with type 2 diabetes in a retrospective cohort study. A total of 45 patients (64.4% women, median [interquartile range] age 49 [45-60] years) were included. The median (interquartile range) weight loss from start of MWM until 12 months post-surgery was 17.9% (13.0%-29.3%). GLP-1RA treatment during MWM resulted in 5.0% (1.9%-7.7%) weight loss. Weight loss during GLP-1RA treatment predicted weight loss from the start of MWM until 12 months post-surgery, but not postoperative weight loss after adjustment. The proportion of weight loss from start of MWM to 12 months post-surgery attributed to GLP-1RA treatment was negatively associated with that attributed to surgery, after adjustment. In conclusion, weight change after GLP-1RA treatment predicted the weight loss achieved by a combination of MWM and bariatric surgery, but not weight loss induced by surgery only. Failure to lose weight after GLP-1RA treatment should not be considered a barrier to undergoing bariatric surgery. © 2017 John Wiley & Sons Ltd.
Early neurodevelopmental outcomes of infants with intestinal failure.
So, Stephanie; Patterson, Catherine; Gold, Anna; Rogers, Alaine; Kosar, Christina; de Silva, Nicole; Burghardt, Karolina Maria; Avitzur, Yaron; Wales, Paul W
2016-10-01
The survival rate of infants and children with intestinal failure is increasing, necessitating a greater focus on their developmental trajectory. To evaluate neurodevelopmental outcomes in children with intestinal failure at 0-15 months corrected age. Analysis of clinical, demographic and developmental assessment results of 33 children followed in an intestinal rehabilitation program between 2011 and 2014. Outcome measures included: Prechtl's Assessment of General Movements, Movement Assessment of Infants, Alberta Infant Motor Scale and Mullen Scales of Early Learning. Clinical factors were correlated with poorer developmental outcomes at 12-15 months corrected age. Thirty-three infants (17 males) were included, with a median gestational age of 34 weeks (interquartile range 29.5-36.0) and birth weight of 1.98 kg (interquartile range 1.17-2.50). Twenty-nine (88%) infants had abnormal General Movements. More than half had suspect or abnormal scores on the Alberta Infant Motor Scale and medium to high-risk scores for future neuromotor delay on the Movement Assessment of Infants. Delays were seen across all Mullen subscales, most notably in gross motor skills. Factors significantly associated with poorer outcomes at 12-15 months included: prematurity, low birth weight, central nervous system co-morbidity, longer neonatal intensive care admission, necrotizing enterocolitis diagnosis, number of operations and conjugated hyperbilirubinemia. Multiple risk factors contribute to early developmental delay in children with intestinal failure, highlighting the importance of close developmental follow-up. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Rituximab for Severe Membranous Nephropathy: A 6-Month Trial with Extended Follow-Up
Debiec, Hanna; Plaisier, Emmanuelle; Cachanado, Marine; Rousseau, Alexandra; Wakselman, Laura; Michel, Pierre-Antoine; Mihout, Fabrice; Dussol, Bertrand; Matignon, Marie; Mousson, Christiane; Simon, Tabassome
2017-01-01
Randomized trials of rituximab in primary membranous nephropathy (PMN) have not been conducted. We undertook a multicenter, randomized, controlled trial at 31 French hospitals (NCT01508468). Patients with biopsy-proven PMN and nephrotic syndrome after 6 months of nonimmunosuppressive antiproteinuric treatment (NIAT) were randomly assigned to 6-month therapy with NIAT and 375 mg/m2 intravenous rituximab on days 1 and 8 (n=37) or NIAT alone (n=38). Median times to last follow-up were 17.0 (interquartile range, 12.5–24.0) months and 17.0 (interquartile range, 13.0–23.0) months in NIAT-rituximab and NIAT groups, respectively. Primary outcome was a combined end point of complete or partial remission of proteinuria at 6 months. At month 6, 13 (35.1%; 95% confidence interval [95% CI], 19.7 to 50.5) patients in the NIAT-rituximab group and eight (21.1%; 95% CI, 8.1 to 34.0) patients in the NIAT group achieved remission (P=0.21). Rates of antiphospholipase A2 receptor antibody (anti–PLA2R-Ab) depletion in NIAT-rituximab and NIAT groups were 14 of 25 (56%) and one of 23 (4.3%) patients at month 3 (P<0.001) and 13 of 26 (50%) and three of 25 (12%) patients at month 6 (P=0.004), respectively. Eight serious adverse events occurred in each group. During the observational phase, remission rates before change of assigned treatment were 24 of 37 (64.9%) and 13 of 38 (34.2%) patients in NIAT-rituximab and NIAT groups, respectively (P<0.01). Positive effect of rituximab on proteinuria remission occurred after 6 months. These data suggest that PLA2R-Ab levels are early markers of rituximab effect and that addition of rituximab to NIAT does not affect safety. PMID:27352623
Comparison of postural ergonomics between laparoscopic and robotic sacrocolpopexy: a pilot study.
Tarr, Megan E; Brancato, Sam J; Cunkelman, Jacqueline A; Polcari, Anthony; Nutter, Benjamin; Kenton, Kimberly
2015-02-01
To compare resident, fellow, and attending urologic and gynecologic surgeons' musculoskeletal and mental strain during laparoscopic and robotic sacrocolpopexy. Prospective cohort study (Canadian Task Force classification II-2). Academic medical center. Patients who underwent robotic or laparoscopic sacrocolpopexy from October 2009 to January 2011. The Body Part Discomfort (BPD) survey was completed before cases, and the National Aeronautics and Space Administration Task Load Index and BPD survey were completed after cases. Higher scores on BPD and the National Aeronautics and Space Administration Task Load Index indicate greater musculoskeletal discomfort and mental strain. BPD scores were averaged over the following body regions: head/neck, back, hand/wrist, arms, and knees/ankles/feet. Changes in body region-specific discomfort scores were the primary outcomes. Multivariable analysis was performed using mixed-effects linear regression with surgeon as a random effect. Sixteen surgeons participated (53% fellows, 34% residents, and 13% attendings). Thirty-three robotic and 53 laparoscopic cases were analyzed, with a median surgical time of 231 minutes (interquartile range, 204-293 minutes) versus 227 minutes (interquartile range, 203-272 minutes; p = .31), a median estimated blood loss of 100 mL (interquartile range, 50-175 mL) versus 150 mL (interquartile range, 50-200 mL; p = .22), and a mean patient body mass index of 27 ± 4 versus 26 ± 4 kg/m(2) (p = .26), respectively. Robotic surgeries were associated with lower neck/shoulder (-0.19 [interquartile range, -0.32 to -0.01], T = -2.49) and back discomfort scores (-0.35 [interquartile range, -0.58 to 0], T = -2.38) than laparoscopic surgeries. Knee/ankle/foot discomfort (0.18 [interquartile range, 0.02-0.3], T = 2.81) and arm discomfort (0.07 [interquartile range, 0.01-0.14], p = .03) increased with case length. Surgeons performing minimally invasive sacrocolpopexy experienced less neck, shoulder, and back discomfort when surgery was performed robotically. Copyright © 2015 AAGL. Published by Elsevier Inc. All rights reserved.
Breast-feeding Duration: Early Weaning-Do We Sufficiently Consider the Risk Factors?
Karall, Daniela; Ndayisaba, Jean-Pierre; Heichlinger, Angelika; Kiechl-Kohlendorfer, Ursula; Stojakovic, Sarah; Leitner, Hermann; Scholl-Bürgi, Sabine
2015-11-01
Breast-feeding is the recommended form of nutrition for the first 6 months. This target is unmet, however, in most industrialized regions. We evaluated aspects of breast-feeding in a cohort of mother-baby dyads. Breast-feeding practices in 555 mother-baby dyads were prospectively studied for 24 months (personal interview at birth and 7 structured telephone interviews). Of the babies, 71.3% were fully breast-fed on discharge from maternity hospitals and 11.9% were partially breast-fed. Median breast-feeding duration was 6.93 (interquartile range 2.57-11.00) months; for full (exclusive) breast-feeding it was 5.62 (interquartile range 3.12-7.77) months. A total of 61.7% received supplemental feedings during the first days of life. Breast-feeding duration in babies receiving supplemental feedings was significantly shorter (median 5.06 months versus 8.21 months, P < 0.001). At 6 months, 9.4% of the mothers were exclusively and 39.5% partially breast-feeding. Risk factors for early weaning were early supplemental feedings (odds ratio [OR] 2.87, 95% CI 1.65-4.98), perceived milk insufficiency (OR 7.35, 95% CI 3.59-15.07), low breast-feeding self-efficacy (a mother's self-confidence in her ability to adequately feed her baby) (OR 3.42, 95% CI 1.48-7.94), lower maternal age (OR 3.89, 95% CI 1.45-10.46), and lower education level of the mother (OR 7.30, 95% CI 2.93-18.20). The recommended full breast-feeding duration of the first 6 months of life was not reached. Sociodemographic variables and factors directly related to breast-feeding practices play an important role in breast-feeding duration/weaning in our region. Understanding risk factors will provide insights to give better support to mothers and prevent short- and long-term morbidity following early weaning.
Ong'uti, Sharon K; Ortega, Gezzer; Onwugbufor, Michael T; Ivey, Gabriel D; Fullum, Terrence M; Tran, Daniel D
2013-01-01
Despite the effectiveness of Roux-en-Y gastric bypass (RYGB) in promoting excess weight loss, 40% of the patients regain weight. Endoscopic gastric plication (EGP) using the StomaphyX device can serve as a less-invasive procedure for promoting the loss of regained weight. Our objective was to evaluate the effectiveness of the StomaphyX device in sustaining ongoing weight loss in patients who have regained weight after RYGB at the Division of Minimally Invasive and Bariatric Surgery, Howard University Hospital. We performed a retrospective chart review of patients undergoing EGP using the StomaphyX device from April 2008 to May 2010. The patient demographics and clinical information were assessed. Effective weight loss and the proportion of weight lost after EGP relative to the weight regained after achieving the lowest weight following RYGB was calculated. A total of 27 patients underwent EGP using the StomaphyX device; of these, most were women (n = 25, 93%) and black (n = 14, 52%), followed by white (n = 11, 42%), and Hispanic (n = 1, 4%). The median interval between RYGB and EGP was 6 years, with an interquartile range of 5-8 years. After the EGP procedure, the median effective weight loss was 37% (interquartile range 24-61%). Of the 27 patients, 18 had ≥6 months of follow-up after EGP. Eleven patients had achieved their lowest weight at 1-3 months, 7 at 6 months, and 3 at 12 months. Of the 18 patients, 13 (72%) experienced an increase in weight after achieving their lowest weight after EGP. The use of the StomaphyX device achieved the maximum effective weight loss during the 1-6-month period after EGP. Copyright © 2013 American Society for Metabolic and Bariatric Surgery. Published by Elsevier Inc. All rights reserved.
Chen, J; Li, Y; Wang, Z; McCulloch, P; Hu, L; Chen, W; Liu, G; Li, J; Lang, J
2018-02-01
To evaluate the clinical outcomes of high-intensity focused ultrasound (HIFU) and surgery in treating uterine fibroids, and prepare for a definitive randomised trial. Prospective multicentre patient choice cohort study (IDEAL Exploratory study) of HIFU, myomectomy or hysterectomy for treating symptomatic uterine fibroids. 20 Chinese hospitals. 2411 Chinese women with symptomatic fibroids. Prospective non-randomised cohort study with learning curve analysis (IDEAL Stage 2b Prospective Exploration Study). Complications, hospital stay, return to normal activities, and quality of life (measured with UFS-Qol and SF-36 at baseline, 6 and 12 months), and need for further treatment. Quality-of-life outcomes were adjusted using regression modelling. HIFU treatment quality was evaluated using LC-CUSUM to identify operator learning curves. A health economic analysis of costs was performed. 1353 women received HIFU, 472 hysterectomy and 586 myomectomy. HIFU patients were significantly younger (P < 0.001), slimmer (P < 0.001), better educated (P < 0.001), and wealthier (P = 0.002) than surgery patients. Both UFS and QoL improved more rapidly after HIFU than after surgery (P = 0.002 and P = 0.001, respectively at 6 months), but absolute differences were small. Major adverse events occurred in 3 (0.2%) of HIFU and in 133 (12.6%) of surgical cases (P < 0.001). Median hospital stay was 4 days (interquartile range, 0-5 days), 10 days (interquartile range, 8-12.5 days) and 8 days (interquartile range, 7-10 days) in the HIFU, hysterectomy and myomectomy groups, respectively. HIFU caused substantially less morbidity than surgery, with similar longer-term QoL. Despite group baseline differences and lack of blinding, these findings support the need for a randomised controlled trial (RCT) of HIFU treatment for fibroids. The IDEAL Exploratory design facilitated RCT protocol development. HIFU had much better short-term outcomes than surgery for fibroids in this 2411-patient Chinese IDEAL-format study. © 2017 Royal College of Obstetricians and Gynaecologists.
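Treatment quality above was monitored with LC-CUSUM to identify when each operator's performance reached an acceptable level. As a rough, hypothetical sketch of that idea (the failure-rate parameters, threshold and outcome sequence below are invented for illustration, not taken from this study), one common formulation of the learning-curve CUSUM is:

import math

def lc_cusum_signal(outcomes, p0=0.20, p1=0.10, h=1.0):
    # outcomes: 1 = procedural failure, 0 = success, in consecutive case order.
    # p0: "unacceptable" failure rate (null hypothesis: not yet competent).
    # p1: "acceptable" failure rate (alternative: competent).
    # h: decision threshold; all parameter values here are illustrative only.
    s = 0.0
    for i, failed in enumerate(outcomes, start=1):
        if failed:
            w = math.log(p1 / p0)              # a failure counts against competence
        else:
            w = math.log((1 - p1) / (1 - p0))  # a success counts toward competence
        s = max(0.0, s + w)
        if s >= h:
            return i  # case number at which competence is signalled
    return None

# Hypothetical operator record: two early failures, then a run of successes
print(lc_cusum_signal([0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]))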
Cyclodiode photocoagulation for refractory glaucoma after penetrating keratoplasty.
Shah, P; Lee, G A; Kirwan, J K; Bunce, C; Bloom, P A; Ficker, L A; Khaw, P T
2001-11-01
This study analyzes the results of intraocular pressure (IOP) reduction by contact diode cycloablation (cyclodiode) in cases of refractory glaucoma after penetrating keratoplasty. Retrospective noncomparative, interventional case series. Twenty-eight eyes in 28 patients attending the Moorfields Eye Hospital. Cyclodiode (40 applications x 1.5 W x 1.5 seconds over 270-300 degrees) was used to control the IOP in refractory glaucoma after penetrating keratoplasty. Postoperative IOP, graft status, visual acuity, and number of antiglaucoma medications were recorded after cyclodiode treatment. Cyclodiode resulted in a reduction of IOP from a median of 33 mmHg (interquartile range [28, 40.5]) to a median of 15 mmHg (interquartile range [12, 20.5]). Most patients had a significant lowering in IOP with a median reduction of 16 mmHg (interquartile range [12, 25]; P < 0.0001). IOPs of 6 to 21 mmHg were achieved in 22 patients (79%). Sixteen patients (57%) required more than one treatment with cyclodiode to control the IOP, with three patients (11%) requiring three treatments and two patients (7%) requiring four treatments. Visual acuity improved (> two Snellen lines of acuity) in three patients (11%) and remained the same (+/- one Snellen line) in 17 patients (61%). The mean number of antiglaucoma medications before cycloablation was 2.6 and was 1.8 after treatment (P < 0.001). Of the 19 patients (68%) with originally clear grafts, three grafts (16%) developed opacification. One patient (4%), with a history of nanophthalmos and recurrent uveal effusion, had delayed hypotony (IOP < 6 mmHg) occurring 46 months after the diode treatment. All patients had at least 6 months of follow-up. These patients have often undergone multiple previous complicated ocular interventions and are often not suitable for filtration surgery. Reduction of IOP with maintenance of visual acuity and a good safety profile was achieved in most patients in this study but may require multiple treatments. We propose cyclodiode as an effective treatment for many patients in the management of refractory glaucoma after penetrating keratoplasty.
Durga, Padmaja; Raavula, Parvathi; Gurajala, Indira; Gunnam, Poojita; Veerabathula, Prardhana; Reddy, Mukund; Upputuri, Omkar; Ramachandran, Gopinath
2015-09-01
To assess the effect of tranexamic acid on the quality of the surgical field. Prospective, randomized, double-blind study. Institutional, tertiary referral hospital. American Society of Anesthesiologists physical status class I patients, aged 8 to 60 months with Group II or III (Balakrishnan's classification) clefts scheduled for cleft palate repair. Children were randomized into two groups. The control group received saline, and the tranexamic acid group received tranexamic acid 10 mg/kg as a bolus, 15 minutes before incision. Grade of surgical field on a 10-point scale, surgeon satisfaction, and primary hemorrhage. Surgeon satisfaction and the median grade of the surgical field (4 [interquartile range, 4 to 6] in the control group vs. 3 [interquartile range, 2 to 4] in the tranexamic acid group; P = .003) were significantly better in the tranexamic acid group than in the control group. Preincision administration of 10 mg/kg of tranexamic acid significantly improved the surgical field during cleft palate repair.
The impact of a dedicated research education month for anesthesiology residents.
Freundlich, Robert E; Newman, Jessica W; Tremper, Kevin K; Mhyre, Jill M; Kheterpal, Sachin; Sanford, Theodore J; Tait, Alan R
2015-01-01
An educational intervention was implemented at the University of Michigan starting in 2008, in which anesthesiology interns complete a dedicated month-long didactic rotation in evidence-based medicine (EBM) and research methodology. We sought to assess its utility. Scores on a validated EBM test before and after the rotation were compared and assessed for significance of improvement. A survey was also given to gauge satisfaction with the quality of the rotation and self-reported improvement in understanding of EBM topics. Fourteen consecutive interns completed the research rotation during the study period. One hundred percent completed both the pre- and postrotation test. The mean pretest score was 7.78 ± 2.46 (median = 7.5, 0-15 scale, and interquartile range 7.0-10.0) and the mean posttest score was 10.00 ± 2.35 (median = 9.5, interquartile range 8.0-12.3), which represented a statistically significant increase (P = 0.011, Wilcoxon signed-rank test). All fourteen of the residents "agreed" or "strongly agreed" that they would recommend the course to future interns and that the course increased their ability to critically review the literature. Our findings demonstrate that this can be an effective means of improving understanding of EBM topics and anesthesiology research.
Hemoglobin Levels Across the Pediatric Critical Care Spectrum: A Point Prevalence Study.
Hassan, Nabil E; Reischman, Diann E; Fitzgerald, Robert K; Faustino, Edward Vincent S
2018-05-01
To determine the prevailing hemoglobin levels in PICU patients, and any potential correlates. Post hoc analysis of prospective multicenter observational data. Fifty-nine PICUs in seven countries. PICU patients on four specific days in 2012. None. Patients' hemoglobin and other clinical and institutional data. Two thousand three hundred eighty-nine patients were included, with a median age of 1.9 years (interquartile range, 0.3-9.8 yr), weight of 11.5 kg (interquartile range, 5.4-29.6 kg), and preceding PICU stay of 4.0 days (interquartile range, 1.0-13.0 d). Their median hemoglobin was 11.0 g/dL (interquartile range, 9.6-12.5 g/dL). The prevalence of transfusion in the 24 hours preceding data collection was 14.2%. Neonates had the highest hemoglobin at 13.1 g/dL (interquartile range, 11.2-15.0 g/dL) compared with other age groups (p < 0.001). A total of 31.3% of the patients had hemoglobin of greater than or equal to 12 g/dL, and 1.1% had hemoglobin of less than 7 g/dL. Blacks had lower median hemoglobin (10.5; interquartile range, 9.3-12.1 g/dL) compared with whites (median, 11.1; interquartile range, 9.0-12.6; p < 0.001). Patients in Spain and Portugal had the highest median hemoglobin (11.4; interquartile range, 10.0-12.6) compared with other regions outside of the United States (p < 0.001), and the highest proportion (31.3%) of transfused patients compared with all regions (p < 0.001). Patients in cardiac PICUs had higher median hemoglobin than those in mixed PICUs or noncardiac PICUs (12.3, 11.0, and 10.6 g/dL, respectively; p < 0.001). Cyanotic heart disease patients had the highest median hemoglobin (12.6 g/dL; interquartile range, 11.1-14.5). Multivariable regression analysis within diagnosis groups revealed that hemoglobin levels were significantly associated with the geographic location and history of complex cardiac disease in most of the models. In children with cancer, none of the variables tested correlated with patients' hemoglobin levels. Patients' hemoglobin levels correlated with demographics such as age, race, geographic location, and cardiac disease, but no such correlates were found in cancer patients. Future investigations should account for the effects of these variables.
Bland, Ruth M; Little, Kirsty E; Coovadia, Hoosen M; Coutsoudis, Anna; Rollins, Nigel C; Newell, Marie-Louise
2008-04-23
We report on a nonrandomized intervention cohort study to increase exclusive breast-feeding rates for 6 months after delivery in HIV-positive and HIV-negative women in KwaZulu-Natal, South Africa. Lay counselors visited women to support exclusive breast-feeding: four times antenatally, four times in the first 2 weeks postpartum and then fortnightly to 6 months. Daily feeding practices were collected at weekly intervals by separate field workers. Cumulative exclusive breast-feeding rates from birth were assessed by Kaplan-Meier analysis and association with maternal and infant variables was quantified in a Cox regression analysis. One thousand, two hundred and nineteen infants of HIV-negative and 1217 infants of HIV-positive women were followed postnatally. Median duration of exclusive breast-feeding was 177 days (range 1-180; interquartile range: 150-180) and 175 days (range 1-180; interquartile range: 137-180) in HIV-negative and HIV-positive women, respectively. Using 24-h recall, exclusive breast-feeding rates at 3 and 5 months were 83.1 and 76.5%, respectively, in HIV-negative women and 72.5 and 66.7%, respectively, in HIV-positive women. Using the most stringent cumulative data, 45% of HIV-negative and 40% of HIV-positive women adhered to exclusive breast-feeding for 6 months. Counseling visits were strongly associated with adherence to cumulative exclusive breast-feeding at 4 months: those who had received the scheduled number of visits were more than twice as likely to still be exclusively breast-feeding as those who had not (HIV-negative women: adjusted odds ratio: 2.07, 95% confidence interval: 1.56-2.74, P < 0.0001; HIV-positive women: adjusted odds ratio: 2.86, 95% CI 2.13-3.83, P < 0.0001). It is feasible to promote and sustain exclusive breast-feeding for 6 months in both HIV-positive and HIV-negative women, with home support from well-trained lay counselors.
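Cumulative exclusive breast-feeding rates above were estimated with Kaplan-Meier analysis of time to cessation of exclusive breast-feeding. Purely as an illustration (the durations and censoring indicators below are invented, not the cohort's data), such an estimate can be obtained with the lifelines library:

from lifelines import KaplanMeierFitter

# Hypothetical data: days of exclusive breast-feeding per infant;
# 1 = exclusive breast-feeding stopped (event), 0 = still exclusive at last contact (censored)
days = [30, 90, 150, 177, 180, 180, 60, 120, 180, 45]
stopped = [1, 1, 1, 1, 0, 0, 1, 1, 0, 1]

kmf = KaplanMeierFitter()
kmf.fit(durations=days, event_observed=stopped, label="exclusive breast-feeding")
print(kmf.median_survival_time_)   # median duration of exclusive breast-feeding
print(kmf.survival_function_)      # cumulative proportion still exclusively breast-feeding over time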
Adherence to Highly Active Antiretroviral Treatment in HIV-Infected Rwandan Women
Musiime, Stephenson; Muhairwe, Fred; Rutagengwa, Alfred; Mutimura, Eugene; Anastos, Kathryn; Hoover, Donald R.; Qiuhu, Shi; Munyazesa, Elizaphane; Emile, Ivan; Uwineza, Annette; Cowan, Ethan
2011-01-01
Background: Scale-up of highly active antiretroviral therapy (HAART) programs in Rwanda has been highly successful, but data on adherence are limited. We examined HAART adherence in a large cohort of HIV+ Rwandan women. Methods: The Rwanda Women's Interassociation Study Assessment (RWISA) was a prospective cohort study that assessed effectiveness and toxicity of ART. We analyzed patient data 12±3 months after HAART initiation to determine adherence rates in HIV+ women who had initiated HAART. Results: Of the 710 HIV+ women at baseline, 490 (87.2%) initiated HAART. Of these, 6 (1.2%) died within 12 months, 15 others (3.0%) discontinued the study and 80 others (19.0%) remained in RWISA but did not have a post-HAART initiation visit that fell within the 12±3 month time points, leaving 389 subjects for analysis. Of these 389, 15 women stopped their medications without being advised to do so by their doctors. Of the remaining 374 persons who reported current HAART use, 354 completed the adherence assessment. All women, 354/354, reported 100% adherence to HAART at the post-HAART visit. The high self-reported level of adherence is supported by changes in laboratory measures that are influenced by HAART. The median (interquartile range) CD4 cell count measured within 6 months prior to HAART initiation was 185 (128, 253) compared to 264 (182, 380) cells/mm³ at the post-HAART visit. Similarly, the median (interquartile range) MCV within 6 months prior to HAART initiation was 88 (83, 93) fL compared to 104 (98, 110) fL at the 12±3 month visit. Conclusion: Self-reported adherence to antiretroviral treatment 12±3 months after initiating therapy was 100% in this cohort of HIV-infected Rwandan women. Future studies should explore country-specific factors that may be contributing to high levels of adherence to HAART in this population. PMID:22114706
Tobin, W Oliver; Guo, Yong; Krecke, Karl N; Parisi, Joseph E; Lucchinetti, Claudia F; Pittock, Sean J; Mandrekar, Jay; Dubey, Divyanshu; Debruyne, Jan; Keegan, B Mark
2017-09-01
Chronic lymphocytic inflammation with pontine perivascular enhancement responsive to steroids (CLIPPERS) is a central nervous system inflammatory syndrome predominantly affecting the brainstem, cerebellum, and spinal cord. Following its initial description, the salient features of CLIPPERS have been confirmed and expanded upon, but the lack of formalized diagnostic criteria has led to reports of patients with dissimilar features purported to have CLIPPERS. We evaluated clinical, radiological and pathological features of patients referred for suspected CLIPPERS and propose diagnostic criteria to discriminate CLIPPERS from non-CLIPPERS aetiologies. Thirty-five patients were evaluated for suspected CLIPPERS. Clinical and neuroimaging data were reviewed by three neurologists to confirm CLIPPERS by consensus agreement. Neuroimaging and neuropathology were reviewed by experienced neuroradiologists and neuropathologists, respectively, both of whom were blinded to the clinical data. CLIPPERS was diagnosed in 23 patients (18 male and five female) and 12 patients had a non-CLIPPERS diagnosis. CLIPPERS patients' median age at onset was 58 years (interquartile range, 24-72), and they were followed for a median of 44 months (interquartile range, 38-63). Non-CLIPPERS patients' median age at onset was 52 years (interquartile range, 39-59), and they were followed for a median of 27 months (interquartile range, 14-47). Clinical symptoms of gait ataxia, diplopia, cognitive impairment, and facial paraesthesia did not discriminate CLIPPERS from non-CLIPPERS. Marked clinical and radiological corticosteroid responsiveness was observed in CLIPPERS (23/23), and clinical worsening occurred in all 12 CLIPPERS cases when corticosteroids were discontinued. Corticosteroid responsiveness was common but not universal in non-CLIPPERS [clinical improvement (8/12); radiological improvement (2/12); clinical worsening on discontinuation (3/8)]. CLIPPERS patients had brainstem predominant perivascular gadolinium enhancing lesions on magnetic resonance imaging that were discriminated from non-CLIPPERS by: homogenous gadolinium enhancing nodules <3 mm in diameter without ring-enhancement or mass effect, and homogenous T2 signal abnormality not significantly exceeding the T1 enhancement. Brain neuropathology on 14 CLIPPERS cases demonstrated marked CD3-positive T-lymphocyte, mild B-lymphocyte and moderate macrophage infiltrates, with perivascular predominance as well as diffuse parenchymal infiltration (14/14), present in meninges, white and grey matter, associated with variable tissue destruction, astrogliosis and secondary myelin loss. Clinical, radiological and pathological features define CLIPPERS syndrome, and neuroradiological and neuropathological findings differentiate it from non-CLIPPERS aetiologies. © The Author (2017). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
The operative outcomes of displaced medial-end clavicle fractures.
Sidhu, Verinder S; Hermans, Deborah; Duckworth, David G
2015-11-01
Nonoperative treatment of displaced medial clavicle fractures often leads to poor functional outcomes and painful nonunions. This study investigates the functional outcomes of patients undergoing operative fixation of these fractures. We investigated 27 patients undergoing operative fixation of a medial clavicle fracture; 24 had an acute, displaced fracture and 3 had fixation for nonunions. Preoperative radiographs or computed tomography scans were obtained, and data collected included age, sex, mechanism of injury, and fixation method. Follow-up included physical examination and radiographs for assessment of union; Disabilities of the Arm, Shoulder, and Hand scores at 12 months; and the recording of complications. The median age was 37 years (interquartile range, 17-47 years). There were 26 male patients and one female patient included, with 7 physeal injuries and 20 adult injuries. The most common mechanism of fracture was vehicular accident (n = 15). Three patients had operations for nonunions and 2 for a periprosthetic fracture medial to an existing plate. The fracture was fixed with plate and screws in 19 cases and with transosseous sutures in 8 cases. The median Disabilities of the Arm, Shoulder, and Hand score at 12 months was 0.4 (interquartile range, 0-5.0), with a union rate of 100% at 12 months. All patients had full shoulder range of motion at final follow-up and were able to return to preinjury occupational activities. There were no significant complications. Operative fixation of displaced medial clavicle fractures results in anatomic reconstruction and excellent functional outcomes, even in the setting of fixation performed for symptomatic nonunion. Early intervention can minimize the risk of painful nonunion. Crown Copyright © 2015. Published by Elsevier Inc. All rights reserved.
Prenatal Exposure to Butylbenzyl Phthalate and Early Eczema in an Urban Cohort
Whyatt, Robin M.; Perzanowski, Matthew S.; Calafat, Antonia M.; Perera, Frederica P.; Goldstein, Inge F.; Chen, Qixuan; Rundle, Andrew G.; Miller, Rachel L.
2012-01-01
Background: Recent cross-sectional studies suggest a link between butylbenzyl phthalate (BBzP) in house dust and childhood eczema. Objectives: We aimed to evaluate whether concentrations of monobenzyl phthalate (MBzP), the main BBzP metabolite in urine, during pregnancy are associated prospectively with eczema in young children, and whether this association varies by the child’s sensitization to indoor allergens or serological evidence of any allergies. Methods: MBzP was measured in spot urine samples during the third trimester of pregnancy from 407 African-American and Dominican women residing in New York City in 1999–2006. Repeated questionnaires asked mothers whether their doctor ever said their child had eczema. Child blood samples at 24, 36, and 60 months of age were analyzed for total, anti-cockroach, dust mite, and mouse IgE. Relative risks (RR) were estimated with multivariable modified Poisson regression. Analyses included a multinomial logistic regression model for early- and late-onset eczema versus no eczema through 60 months of age. Results: MBzP was detected in > 99% of samples (geometric mean = 13.6; interquartile range: 5.7–31.1 ng/mL). By 24 months, 30% of children developed eczema, with the proportion higher among African Americans (48%) than among Dominicans (21%) (p < 0.001). An interquartile range increase in log MBzP concentration was associated positively with early-onset eczema (RR = 1.52 for eczema by 24 months; 95% confidence interval: 1.21, 1.91, p = 0.0003, n = 113 reporting eczema/376 total sample), adjusting for urine specific gravity, sex, and race/ethnicity. MBzP was not associated with allergic sensitization, nor did seroatopy modify consistently the MBzP and eczema association. Conclusions: Prenatal exposure to BBzP may influence the risk of developing eczema in early childhood. PMID:22732598
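The "modified Poisson" relative-risk model referenced above is, in practice, a Poisson regression with a robust (sandwich) variance estimator applied to a binary outcome, with the exposure rescaled so that one unit corresponds to one interquartile-range increase in log MBzP. A minimal sketch under those assumptions, using hypothetical variable names and statsmodels rather than the authors' actual software:

```python
# Sketch of a modified Poisson regression for the relative risk of early-onset eczema.
# Variable names (eczema_24m, log_mbzp, specific_gravity, male, ethnicity) are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("prenatal_cohort.csv")

# Rescale the exposure so that one unit equals one interquartile-range increase in log MBzP
iqr = df["log_mbzp"].quantile(0.75) - df["log_mbzp"].quantile(0.25)
df["log_mbzp_iqr"] = df["log_mbzp"] / iqr

model = smf.glm(
    "eczema_24m ~ log_mbzp_iqr + specific_gravity + male + C(ethnicity)",
    data=df,
    family=sm.families.Poisson(),
).fit(cov_type="HC1")  # robust variance makes Poisson-based RRs valid for a binary outcome

rr = np.exp(model.params["log_mbzp_iqr"])
ci = np.exp(model.conf_int().loc["log_mbzp_iqr"])
print(f"RR per IQR increase in log MBzP: {rr:.2f} (95% CI {ci.iloc[0]:.2f}-{ci.iloc[1]:.2f})")
```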
Hotu, Cheri; Rémond, Marc; Maguire, Graeme; Ekinci, Elif; Cohen, Neale
2018-06-04
To determine the impact of an integrated diabetes service involving specialist outreach and primary health care teams on risk factors for micro- and macrovascular diabetes complications in three remote Indigenous Australian communities over a 12-month period. Quantitative, retrospective evaluation. Primary health care clinics in remote Indigenous communities in Australia. One-hundred-and-twenty-four adults (including 123 Indigenous Australians; 76.6% female) with diabetes living in remote communities. Glycosylated haemoglobin, lipid profile, estimated glomerular filtration rate, urinary albumin : creatinine ratio and blood pressure. Diabetes prevalence in the three communities was high, at 32.8%. A total of 124 patients reviewed by the outreach service had a median consultation rate of 1.0 by an endocrinologist and 0.9 by a diabetes nurse educator over the 12-month period. Diabetes care plans were made in collaboration with local primary health care services, which also provided patients with diabetes care between outreach team visits. A significant reduction was seen in median (interquartile range) glycosylated haemoglobin from baseline to 12 months. Median (interquartile range) total cholesterol was also reduced. The number of patients prescribed glucagon-like peptide-1 analogues and dipeptidyl peptidase-4 inhibitors increased over the 12 months and an increase in the number of patients prescribed insulin trended towards statistical significance. A collaborative health care approach to deliver diabetes care to remote Indigenous Australian communities was associated with an improvement in glycosylated haemoglobin and total cholesterol, both important risk factors, respectively, for micro- and macrovascular diabetes complications. © 2018 National Rural Health Alliance Ltd.
Buonerba, Carlo; Sonpavde, Guru; Vitrone, Francesca; Bosso, Davide; Puglia, Livio; Izzo, Michela; Iaccarino, Simona; Scafuri, Luca; Muratore, Margherita; Foschini, Francesca; Mucci, Brigitta; Tortora, Vincenzo; Pagliuca, Martina; Ribera, Dario; Riccio, Vittorio; Morra, Rocco; Mosca, Mirta; Cesarano, Nicola; Di Costanzo, Ileana; De Placido, Sabino; Di Lorenzo, Giuseppe
2017-01-01
Background: Cabazitaxel is a second-generation taxane that is approved for use with concomitant low dose daily prednisone in metastatic castration resistant prostate cancer (mCRPC) after docetaxel failure. Since the role of daily corticosteroids in improving cabazitaxel efficacy or ameliorating its safety profile has not been adequately investigated so far, we compared outcomes of patients receiving cabazitaxel with or without daily corticosteroids in a retrospective single-institution cohort of mCRPC patients. Patients and methods: Medical records of deceased patients with documented mCRPC treated with cabazitaxel following prior docetaxel between January, 2011 and January, 2017 were reviewed at the single participating center. Patients who were receiving daily doses of systemic corticosteroids other than low dose daily prednisone or prednisolone (≤10 mg a day) were excluded. The primary end point of this analysis was overall survival (OS). Secondary end points were exposure to cabazitaxel as well as incidence of grade 3-4 adverse events. Univariable and multivariable Cox proportional hazards regression was used to evaluate prednisone use and other variables as potentially prognostic for overall survival. Results: Overall, among 91 patients, 57 patients received cabazitaxel concurrently with low dose prednisone and 34 patients did not receive concurrent prednisone. The median overall survival of the population was 9.8 months (interquartile range, 9 to 14). Patients receiving prednisone had an overall survival of 9 months (interquartile range, 8 to 12) vs. 14 months (interquartile range, 9.4 to 16.7) for patients not treated with prednisone. Approximately 45% of patients had a >30% PSA decline at 12 weeks. Prednisone use was not significantly prognostic for overall survival or PSA decline ≥30% rates on regression analyses. Importantly, a >30% PSA decline at 12, but not at 3, 6, or 9 weeks, was prognostic for improved survival on multivariate analysis. Conclusions: The data presented here support the hypothesis that omitting daily corticosteroids in cabazitaxel-treated patients has no negative impact on either survival or safety profile. In the large prospective trial CABACARE, cabazitaxel-treated patients will be randomized to receive or not receive daily prednisone. The CABACARE (EudraCT n. 2016-003646-81) study is currently ongoing at University Federico II of Naples and at multiple other participating centers in Italy.
Timing of infant feeding in relation to childhood asthma and allergic diseases.
Nwaru, Bright I; Takkinen, Hanna-Mari; Niemelä, Onni; Kaila, Minna; Erkkola, Maijaliisa; Ahonen, Suvi; Haapala, Anna-Maija; Kenward, Michael G; Pekkanen, Juha; Lahesmaa, Riitta; Kere, Juha; Simell, Olli; Veijola, Riitta; Ilonen, Jorma; Hyöty, Heikki; Knip, Mikael; Virtanen, Suvi M
2013-01-01
Emerging evidence questions current recommendations on the timing of infant feeding for the prevention of childhood allergies. The evidence for asthma is inconclusive. We sought to investigate the associations between the duration of breast-feeding and timing of introduction of complementary foods and the development of asthma and allergies by the age of 5 years. Data were analyzed for 3781 consecutively born children. The dietary exposures were categorized into thirds and analyzed as time-dependent variables. Asthma, allergic rhinitis, and atopic eczema end points were assessed by using the International Study of Asthma and Allergies in Childhood questionnaire, whereas IgE antibodies were analyzed from serum samples at the age of 5 years. Cox proportional hazard and logistic regressions were used for the analyses. The median duration of exclusive and total breast-feeding was 1.4 months (interquartile range, 0.2-3.5 months) and 7.0 months (interquartile range, 4.0-11.0 months), respectively. Total breast-feeding of 9.5 months or less was associated with an increased risk of nonatopic asthma. Introduction of wheat, rye, oats, or barley at 5 to 5.5 months was inversely associated with asthma and allergic rhinitis, whereas introduction of other cereals at less than 4.5 months increased the risk of atopic eczema. Introduction of egg at 11 months or less was inversely associated with asthma, allergic rhinitis, and atopic sensitization, whereas introduction of fish at 9 months or less was inversely associated with allergic rhinitis and atopic sensitization. Early introduction of wheat, rye, oats, and barley cereals; fish; and egg (respective to the timing of introduction of each food) seems to decrease the risk of asthma, allergic rhinitis, and atopic sensitization in childhood. Longer duration of total breast-feeding, rather than its exclusivity, was protective against the development of nonatopic but not atopic asthma, suggesting a potential differing effect of breast-feeding on different asthma phenotypes. Copyright © 2012 American Academy of Allergy, Asthma & Immunology. Published by Mosby, Inc. All rights reserved.
Crisp, Tom; Khan, Faisal; Padhiar, Nat; Morrissey, Dylan; King, John; Jalan, Rosy; Maffulli, Nicola; Frcr, Otto Chan
2008-01-01
To evaluate a novel conservative management modality for patellar tendinopathy. We recruited nine patients with patellar tendinopathy who had failed conservative management and showed evidence of neovascularisation on power Doppler scanning. A high-volume ultrasound-guided injection was administered at the interface between the patellar tendon and Hoffa's body. The injection contained 10 ml of 0.5% bupivacaine, 25 mg of hydrocortisone, and between 12 and 40 ml of normal saline. Outcomes were assessed with 100 mm visual analogue scales (VAS) for pain and for function, and with Victorian Institute of Sport Assessment - Patellar tendon (VISA-P) questionnaires at an average of 9 months from the injection. All but one patient (whose pain was unchanged) improved (p = 0.028). The mean improvement in function 2 weeks after injection was 58 mm on the VAS (interquartile range 27 - 88, p = 0.018). The mean improvement in pain 2 weeks after injection was 56 mm on the VAS (interquartile range 32 - 80, p = 0.018). At a mean follow-up of 9 months, an improvement of 22 points from a baseline score of 46 on the VISA-P questionnaire (100 being normal) was observed. High-volume injections to mechanically disrupt the neovascularisation in patellar tendinopathy are helpful in the management of this condition. Controlled trials are warranted to investigate this management modality more conclusively.
Percutaneous drainage without sclerotherapy for benign ovarian cysts.
Zerem, Enver; Imamović, Goran; Omerović, Safet
2009-07-01
To evaluate percutaneous short-term catheter drainage in the management of benign ovarian cysts in patients at increased surgical risk. Thirty-eight patients with simple ovarian cysts were treated with drainage of fluid content by catheters until output stopped. All patients were poor candidates for surgery. All procedures were performed under ultrasonographic (US) control and local anesthesia. Cytologic examination was performed in all cases. The patients were followed up monthly with color Doppler US for 12 months. Outcome measure was the recurrence of a cyst. During the 12-month follow-up period, 10 of 38 cysts recurred. Seven of the 10 cysts required further intervention, and three were followed up without intervention. Four of the seven patients who required further intervention underwent repeat transabdominal aspiration and three declined repeat aspiration and subsequently underwent surgery. After repeated aspirations, two of four cysts disappeared, one necessitated follow-up only, and one necessitated surgical intervention. Cyst volume (P = .009) and diameter (P = .001) were significantly larger in the cysts that recurred. No evidence of malignancy was reported in the cytologic examination in any patient. No patients developed malignancy during follow-up. No major complications were observed. The hospital stay was 1 day for all patients. The median duration of drainage in the groups with resolved and recurrent cysts was 1 day (interquartile range, 1-1) and 2 days (interquartile range, 1-3), respectively (P = .04). In patients considered poor candidates for open surgery or laparoscopy, percutaneous treatment of ovarian cysts with short-term catheter drainage without sclerotherapy appears to be a safe and effective alternative, with low recurrence rates.
Enzinger, Andrea C.; Zhang, Baohui; Schrag, Deborah; Prigerson, Holly G.
2015-01-01
Purpose To determine how prognostic conversations influence perceptions of life expectancy (LE), distress, and the patient-physician relationship among patients with advanced cancer. Patients and Methods This was a multicenter observational study of 590 patients with metastatic solid malignancies with progressive disease after ≥ one line of palliative chemotherapy, undergoing follow-up to death. At baseline, patients were asked whether their oncologist had disclosed an estimate of prognosis. Patients also estimated their own LE and completed assessments of the patient-physician relationship, distress, advance directives, and end-of-life care preferences. Results Among this cohort of 590 patients with advanced cancer (median survival, 5.4 months), 71% wanted to be told their LE, but only 17.6% recalled a prognostic disclosure by their physician. Among the 299 (51%) of 590 patients willing to estimate their LE, those who recalled prognostic disclosure offered more realistic estimates as compared with patients who did not (median, 12 months; interquartile range, 6 to 36 months v 48 months; interquartile range, 12 to 180 months; P < .001), and their estimates were less likely to differ from their actual survival by > 2 (30.2% v 49.2%; odds ratio [OR], 0.45; 95% CI, 0.14 to 0.82) or 5 years (9.5% v 35.5%; OR, 0.19; 95% CI, 0.08 to 0.47). In adjusted analyses, recall of prognostic disclosure was associated with a 17.2-month decrease (95% CI, 6.2 to 28.2 months) in patients' LE self-estimates. Longer LE self-estimates were associated with lower likelihood of do-not-resuscitate order (adjusted OR, 0.439; 95% CI, 0.296 to 0.630 per 12-month increase in estimate) and preference for life-prolonging over comfort-oriented care (adjusted OR, 1.493; 95% CI, 1.091 to 1.939). Prognostic disclosure was not associated with worse patient-physician relationship ratings, sadness, or anxiety in adjusted analyses. Conclusion Prognostic disclosures are associated with more realistic patient expectations of LE, without decrements to their emotional well-being or the patient-physician relationship. PMID:26438121
Enzinger, Andrea C; Zhang, Baohui; Schrag, Deborah; Prigerson, Holly G
2015-11-10
To determine how prognostic conversations influence perceptions of life expectancy (LE), distress, and the patient-physician relationship among patients with advanced cancer. This was a multicenter observational study of 590 patients with metastatic solid malignancies with progressive disease after ≥ one line of palliative chemotherapy, undergoing follow-up to death. At baseline, patients were asked whether their oncologist had disclosed an estimate of prognosis. Patients also estimated their own LE and completed assessments of the patient-physician relationship, distress, advance directives, and end-of-life care preferences. Among this cohort of 590 patients with advanced cancer (median survival, 5.4 months), 71% wanted to be told their LE, but only 17.6% recalled a prognostic disclosure by their physician. Among the 299 (51%) of 590 patients willing to estimate their LE, those who recalled prognostic disclosure offered more realistic estimates as compared with patients who did not (median, 12 months; interquartile range, 6 to 36 months v 48 months; interquartile range, 12 to 180 months; P < .001), and their estimates were less likely to differ from their actual survival by > 2 (30.2% v 49.2%; odds ratio [OR], 0.45; 95% CI, 0.14 to 0.82) or 5 years (9.5% v 35.5%; OR, 0.19; 95% CI, 0.08 to 0.47). In adjusted analyses, recall of prognostic disclosure was associated with a 17.2-month decrease (95% CI, 6.2 to 28.2 months) in patients' LE self-estimates. Longer LE self-estimates were associated with lower likelihood of do-not-resuscitate order (adjusted OR, 0.439; 95% CI, 0.296 to 0.630 per 12-month increase in estimate) and preference for life-prolonging over comfort-oriented care (adjusted OR, 1.493; 95% CI, 1.091 to 1.939). Prognostic disclosure was not associated with worse patient-physician relationship ratings, sadness, or anxiety in adjusted analyses. Prognostic disclosures are associated with more realistic patient expectations of LE, without decrements to their emotional well-being or the patient-physician relationship. © 2015 by American Society of Clinical Oncology.
Sharma, Pranav; Ashouri, Kenan; Zargar-Shoshtari, Kamran; Luchey, Adam M; Spiess, Philippe E
2016-03-01
We evaluated sociodemographic and economic differences in overall survival (OS) of patients with penile SCC using the National Cancer Data Base (NCDB). We identified 5,412 patients with a diagnosis of penile squamous cell carcinoma from 1998 to 2011 with clinically nonmetastatic disease and available pathologic tumor and nodal staging. OS was estimated using the Kaplan-Meier method, and differences were determined using the log-rank test. Cox proportional hazard regression was performed to identify independent predictors of OS. Estimated median OS was 91.9 months (interquartile range: 25.8-not reached) at median follow-up of 44.7 months (interquartile range: 17.2-81.0). Survival did not change over the study period (P = 0.28). Black patients presented with a higher stage of disease (pT3/T4: 16.6 vs. 13.2%, P = 0.027) and had worse median OS (68.6 vs. 93.7 months, P<0.01). Patients with private insurance and median income≥$63,000 based on zip code presented with a lower stage of disease (pT3/T4: 11.6 vs. 14.7%, P = 0.002 and 12.0 vs. 14.0%, P = 0.042, respectively) and had better median OS (163.2 vs. 70.8 months, P<0.01 and 105.3 vs. 86.4 months, p = 0.001, respectively). On multivariate analysis, black race (hazard ratio [HR]: 1.39, 95% confidence interval [CI]: 1.21-1.58; P<0.01) was independently associated with worse OS, whereas private insurance (HR = 0.79, 95% CI: 0.63-0.98; P = 0.028) and higher median income≥$63,000 (HR = 0.82; 95% CI: 0.72-0.93; P = 0.001) were independently associated with better OS. Racial and economic differences in the survival of patients with penile cancer exist. An understanding of these differences may help minimize disparities in cancer care. Copyright © 2016 Elsevier Inc. All rights reserved.
Bernard, Stephen A; Nguyen, Vina; Cameron, Peter; Masci, Kevin; Fitzgerald, Mark; Cooper, David J; Walker, Tony; Std, B Paramed; Myles, Paul; Murray, Lynne; David; Taylor; Smith, Karen; Patrick, Ian; Edington, John; Bacon, Andrew; Rosenfeld, Jeffrey V; Judson, Rodney
2010-12-01
To determine whether paramedic rapid sequence intubation in patients with severe traumatic brain injury (TBI) improves neurologic outcomes at 6 months compared with intubation in the hospital. Severe TBI is associated with a high rate of mortality and long-term morbidity. Comatose patients with TBI routinely undergo endotracheal intubation to protect the airway, prevent hypoxia, and control ventilation. In many places, paramedics perform intubation prior to hospital arrival. However, it is unknown whether this approach improves outcomes. In a prospective, randomized, controlled trial, we assigned adults with severe TBI in an urban setting to either prehospital rapid sequence intubation by paramedics or transport to a hospital emergency department for intubation by physicians. The primary outcome measure was the median extended Glasgow Outcome Scale (GOSe) score at 6 months. Secondary end-points were favorable versus unfavorable outcome at 6 months, length of intensive care and hospital stay, and survival to hospital discharge. A total of 312 patients with severe TBI were randomly assigned to paramedic rapid sequence intubation or hospital intubation. The success rate for paramedic intubation was 97%. At 6 months, the median GOSe score was 5 (interquartile range, 1-6) in patients intubated by paramedics compared with 3 (interquartile range, 1-6) in patients intubated in the hospital (P = 0.28). The proportion of patients with favorable outcome (GOSe, 5-8) was 80 of 157 patients (51%) in the paramedic intubation group compared with 56 of 142 patients (39%) in the hospital intubation group (risk ratio, 1.28; 95% confidence interval, 1.00-1.64; P = 0.046). There were no differences in intensive care or hospital length of stay, or in survival to hospital discharge. In adults with severe TBI, prehospital rapid sequence intubation by paramedics increases the rate of favorable neurologic outcome at 6 months compared with intubation in the hospital.
Ehrlich, Rita; Pokroy, Russell; Segal, Ori; Goldstein, Michaella; Pollack, Ayala; Hanhart, Joel; Barak, Yoreh; Kehat, Rinat; Shulman, Shiri; Vidne, Orit; Abu Ahmad, Wiessam; Chowers, Itay
2018-06-01
To evaluate the outcome of second-line intravitreal ranibizumab treatment in eyes with diabetic macular edema having persistent edema following initial therapy with intravitreal bevacizumab. This was a retrospective, multicenter study of diabetic macular edema treated with ranibizumab following bevacizumab failure in Israel. Consecutive eyes with persistent diabetic macular edema following at least three previous intravitreal bevacizumab injections prior to intravitreal ranibizumab, at least three monthly intravitreal ranibizumab injections and at least 12 months of follow-up were included. Data collected included demographics, ocular findings, diabetes control, details of intravitreal bevacizumab and ranibizumab injections, and visual and anatomical measurements before and after intravitreal ranibizumab treatment. In total, 202 eyes of 162 patients treated at 11 medical centers across Israel were included. Patients received a mean (±standard deviation) of 8.8 ± 4.9 intravitreal bevacizumab injections prior to the switch to intravitreal ranibizumab. A mean of 7.0 ± 2.7 intravitreal ranibizumab injections were given during the 12 months following the switch to intravitreal ranibizumab. The median central subfield retinal thickness (±interquartile range) by spectral-domain optical coherence tomography decreased from 436 ± 162 µm at baseline to 319 ± 113 µm at month 12 (p < 0.001). Median logMAR visual acuity (±interquartile range) improved from 0.40 ± 0.48 at baseline to 0.38 ± 0.40 at month 12 (p = 0.001). Linear regression suggested that a higher number of intravitreal ranibizumab injections and higher pre-switch central subfield retinal thickness were associated with favorable visual outcome. A higher number of intravitreal bevacizumab injections and the presence of intraretinal fluid before the switch lessened the odds of favorable outcome. Switching from bevacizumab to ranibizumab in persistent diabetic macular edema was associated with anatomical improvement in the majority of eyes and ≥2 lines of vision improvement in 22% of eyes.
Out-of-pocket costs for childhood stroke: the impact of chronic illness on parents' pocketbooks.
Plumb, Patricia; Seiber, Eric; Dowling, Michael M; Lee, JoEllen; Bernard, Timothy J; deVeber, Gabrielle; Ichord, Rebecca N; Bastian, Rachel; Lo, Warren D
2015-01-01
Direct costs for children who have had a stroke are similar to those for adults. There is no information regarding the out-of-pocket costs families encounter. We described the out-of-pocket costs families encountered in the first year after a child's ischemic stroke. Twenty-two subjects were prospectively recruited at four centers in the United States and Canada in 2008 and 2009 as part of the "Validation of the Pediatric NIH Stroke Scale" study; families' indirect costs were tracked for 1 year. Every 3 months, parents reported hours they did not work, nonreimbursed costs for medical visits or other health care, and mileage. They provided estimates of annual income. We calculated total out-of-pocket costs in US dollars and reported costs as a proportion of annual income. Total median out-of-pocket cost for the year after an ischemic stroke was $4354 (range, $0-$28,666; interquartile range, $1008-$8245). Out-of-pocket costs were greatest in the first 3 months after the incident stroke, with the largest proportion attributable to lost wages, followed by transportation and nonreimbursed health care. For the entire year, median costs represented 6.8% (range, 0%-81.9%; interquartile range, 2.7%-17.2%) of annual income. Out-of-pocket expenses are significant after a child's ischemic stroke. The median costs are noteworthy given that the median American household had cash savings of $3650 at the time of the study. These results, together with previous reports of direct costs, provide a more complete view of the overall costs to families and society. Childhood stroke creates an under-recognized cost to society because of decreased parental productivity. Copyright © 2015 Elsevier Inc. All rights reserved.
Percutaneous Dilational Tracheotomy in Solid-Organ Transplant Recipients.
Ozdemirkan, Aycan; Ersoy, Zeynep; Zeyneloglu, Pinar; Gedik, Ender; Pirat, Arash; Haberal, Mehmet
2015-11-01
Solid-organ transplant recipients may require percutaneous dilational tracheotomy because of prolonged mechanical ventilation or airway issues, but data regarding its safety and effectiveness in solid-organ transplant recipients are scarce. Here, we evaluated the safety, effectiveness, and benefits in terms of lung mechanics, complications, and patient comfort of percutaneous dilational tracheotomy in solid-organ transplant recipients. Medical records from 31 solid-organ transplant recipients (median age of 41.0 years [interquartile range, 18.0-53.0 y]) who underwent percutaneous dilational tracheotomy at our hospital between January 2010 and March 2015 were analyzed, including primary diagnosis, comorbidities, duration of orotracheal intubation and mechanical ventilation, length of intensive care unit and hospital stays, the time interval from transplant to percutaneous dilational tracheotomy, Acute Physiology and Chronic Health Evaluation II score, tracheotomy-related complications, and pulmonary compliance and ratio of partial pressure of arterial oxygen to fraction of inspired oxygen. The median Acute Physiology and Chronic Health Evaluation II score on admission was 24.0 (interquartile range, 18.0-29.0). The median interval from transplant to percutaneous dilational tracheotomy was 105.5 days (interquartile range, 13.0-2165.0 d). The only major complication noted was left-sided pneumothorax in 1 patient. There were no significant differences in ratio of partial pressure of arterial oxygen to fraction of inspired oxygen before and after the procedure (170.0 [interquartile range, 102.2-302.0] vs 210.0 [interquartile range, 178.5-345.5]; P = .052). However, pulmonary compliance results preprocedure and postprocedure were significantly different (0.020 L/cm H2O [interquartile range, 0.015-0.030 L/cm H2O] vs 0.030 L/cm H2O [interquartile range, 0.020-0.041 L/cm H2O]; P = .001). Need for sedation significantly decreased after tracheotomy (from 17 patients [54.8%] to 8 patients [25.8%]; P = .004). Percutaneous dilational tracheotomy with bronchoscopic guidance is an efficacious and safe technique for maintaining airways in solid-organ transplant recipients who require prolonged mechanical ventilation, resulting in possible improvements in ventilatory mechanics and patient comfort.
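The two respiratory measures reported in this abstract are simple ratios. The sketch below shows how they are commonly derived at the bedside; the abstract does not state the exact method, so the static-compliance formula and the example values are assumptions, not study data.

```python
# Illustrative calculations; these are standard bedside definitions, not taken from the study.
def pf_ratio(pao2_mmhg: float, fio2_fraction: float) -> float:
    """Ratio of arterial oxygen partial pressure to fraction of inspired oxygen."""
    return pao2_mmhg / fio2_fraction

def static_compliance(tidal_volume_l: float, plateau_cm_h2o: float, peep_cm_h2o: float) -> float:
    """Static compliance in L/cm H2O (assumed definition: Vt / (Pplat - PEEP))."""
    return tidal_volume_l / (plateau_cm_h2o - peep_cm_h2o)

print(pf_ratio(85, 0.5))                 # 170.0, within the pre-procedure range reported
print(static_compliance(0.45, 35, 10))   # 0.018 L/cm H2O, near the median pre-procedure value
```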
Shkirkova, Kristina; Akam, Eftitan Y; Huang, Josephine; Sheth, Sunil A; Nour, May; Liang, Conrad W; McManus, Michael; Trinh, Van; Duckwiler, Gary; Tarpley, Jason; Vinuela, Fernando; Saver, Jeffrey L
2017-12-01
Background Rapid dissemination and coordination of clinical and imaging data among multidisciplinary team members are essential for optimal acute stroke care. Aim To characterize the feasibility and utility of the Synapse Emergency Room mobile (Synapse ERm) informatics system. Methods We implemented the Synapse ERm system for integration of clinical data, computerized tomography, magnetic resonance, and catheter angiographic imaging, and real-time stroke team communications, in consecutive acute neurovascular patients at a Comprehensive Stroke Center. Results From May 2014 to October 2014, the Synapse ERm application was used by 33 stroke team members in 84 Code Stroke alerts. Mean patient age was 69.6 (±17.1) years, and 41.5% of patients were female. Final diagnosis was: ischemic stroke 64.6%, transient ischemic attack 7.3%, intracerebral hemorrhage 6.1%, and cerebrovascular-mimic 22.0%. Each patient's Synapse ERm record was viewed a median of 10 (interquartile range 6-18) times by a median of 3 (interquartile range 2-4) team members. The most used feature was computerized tomography, magnetic resonance, and catheter angiography image display. In-app "tweet" team communications were sent by a median of 1 (interquartile range 0-1, range 0-13) user per case and viewed by a median of 1 (interquartile range 0-3, range 0-44) team member. Use of the system was associated with rapid treatment times, faster than national guidelines, including median door-to-needle 51.0 min (interquartile range 40.5-69.5) and median door-to-groin 94.5 min (interquartile range 85.5-121.3). In user surveys, the mobile information platform was judged easy to employ in 91% (95% confidence interval 65%-99%) of uses and of added help in stroke management in 50% (95% confidence interval 22%-78%). Conclusion The Synapse ERm mobile platform for stroke team distribution and integration of clinical and imaging data was feasible to implement, showed high ease of use and moderate perceived added utility in therapeutic management.
Response to Antimalarials in Cutaneous Lupus Erythematosus A Prospective Analysis
Chang, Aileen Y.; Piette, Evan W.; Foering, Kristen P.; Tenhave, Thomas R.; Okawa, Joyce; Werth, Victoria P.
2012-01-01
Objective To demonstrate response to antimalarials in patients with cutaneous lupus erythematosus using activity scores from the Cutaneous Lupus Erythematosus Disease Area and Severity Index, a validated outcome measure. Design Prospective, longitudinal cohort study. Setting University cutaneous autoimmune disease clinic. Participants One hundred twenty-eight patients with cutaneous lupus erythematosus who presented from January 2007 to July 2010 and had at least 2 visits with activity scores. Main Outcome Measures Response was defined by a 4-point or 20% decrease in activity score. Response to initiation was determined using the score before treatment and at the first visit at least 2 months after treatment. Response to continuation was determined using the scores at the first visit and at the most recent visit on treatment. Results Of 11 patients initiated on hydroxychloroquine, 55% were responders, with a decrease in median (interquartile range) activity score from 8.0 (3.5-13) to 3.0 (1.8-7.3) (p=0.03). Of 15 patients who had failed hydroxychloroquine, 67% were responders to initiation of hydroxychloroquine-quinacrine, with a decrease in median (interquartile range) activity score from 6.0 (4.8-8.3) to 3.0 (0.75-5.0) (p=0.004). Nine of 21 patients (43%) who continued on hydroxychloroquine and 9 of 21 patients (43%) who continued on hydroxychloroquine-quinacrine were responders, with a decrease in median (interquartile range) activity score from 6.0 (1.5-9.5) to 1.0 (0-4.5) (p=0.009) and from 8.5 (4.25-17.5) to 5.0 (0.5-11.5) (p=0.01), respectively. Conclusion The use of quinacrine with hydroxychloroquine is associated with response in patients who fail hydroxychloroquine monotherapy. Further reduction in disease activity can be associated with continuation of antimalarials. PMID:21768444
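The responder rule quoted above reduces to a simple threshold check on the activity score. A minimal helper capturing that definition (function and argument names are ours, not the authors'):

```python
# Responder rule from the abstract: a 4-point or 20% decrease in the CLASI activity score.
def is_responder(baseline_score: float, follow_up_score: float) -> bool:
    absolute_drop = baseline_score - follow_up_score
    relative_drop = absolute_drop / baseline_score if baseline_score > 0 else 0.0
    return absolute_drop >= 4 or relative_drop >= 0.20

print(is_responder(8.0, 3.0))   # True: a 5-point (62%) decrease
print(is_responder(3.0, 2.5))   # False: a 0.5-point (17%) decrease
```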
Uber, Amy J; Perman, Sarah M; Cocchi, Michael N; Patel, Parth V; Ganley, Sarah E; Portmann, Jocelyn M; Donnino, Michael W; Grossestreuer, Anne V
2018-04-03
To assess whether the amount of heat generated by post-cardiac arrest patients to reach target temperature (Ttarget) during targeted temperature management is associated with outcomes, by serving as a proxy for thermoregulatory ability, and whether it modifies the relationship between time to Ttarget and outcomes. Retrospective cohort study. Urban tertiary-care hospital. Successfully resuscitated targeted temperature management-treated adult postarrest patients between 2008 and 2015 with serial temperature data and Ttarget less than or equal to 34°C. None. Time to Ttarget was defined as time from targeted temperature management initiation to first recorded patient temperature less than or equal to 34°C. Patient heat generation ("heat units") was calculated as the inverse of the average water temperature × hours between initiation and Ttarget × 100. The primary outcome was neurologic status measured by Cerebral Performance Category score; the secondary outcome was survival, both at hospital discharge. Univariate analyses were performed using Wilcoxon rank-sum tests; multivariate analyses used logistic regression. Of 203 patients included, those with Cerebral Performance Category score 3-5 generated less heat before reaching Ttarget (median, 8.1 heat units [interquartile range, 3.6-21.6 heat units] vs median, 20.0 heat units [interquartile range, 9.0-33.5 heat units]; p = 0.001) and reached Ttarget more quickly (median, 2.3 hr [interquartile range, 1.5-4.0 hr] vs median, 3.6 hr [interquartile range, 2.0-5.0 hr]; p = 0.01) than patients with Cerebral Performance Category score 1-2. Nonsurvivors generated less heat than survivors (median, 8.1 heat units [interquartile range, 3.6-20.8 heat units] vs median, 19.0 heat units [interquartile range, 6.5-33.5 heat units]; p = 0.001) and reached Ttarget more quickly (median, 2.2 hr [interquartile range, 1.5-3.8 hr] vs median, 3.6 hr [interquartile range, 2.0-5.0 hr]; p = 0.01). Controlling for average water temperature between initiation and Ttarget, the relationship between outcomes and time to Ttarget was no longer significant. Controlling for location, witnessed arrest, age, initial rhythm, and neuromuscular blockade use, increased heat generation was associated with better neurologic (adjusted odds ratio, 1.01 [95% CI, 1.00-1.03]; p = 0.039) and survival (adjusted odds ratio, 1.01 [95% CI, 1.00-1.03]; p = 0.045) outcomes. Increased heat generation during targeted temperature management initiation is associated with better outcomes at hospital discharge and may affect the relationship between time to Ttarget and outcomes.
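The "heat units" proxy is defined arithmetically in the abstract: the inverse of the average cooling-water temperature, multiplied by the hours from initiation to Ttarget, multiplied by 100. A direct transcription of that formula (variable names are ours, and a plain mean of the recorded water temperatures is assumed, since the abstract does not specify how the average was taken):

```python
# Heat units = (1 / average water temperature) x hours from initiation to Ttarget x 100,
# as defined in the abstract. Variable names and the use of a plain mean are assumptions.
from statistics import mean

def heat_units(water_temps_c: list[float], hours_to_target: float) -> float:
    avg_water_temp = mean(water_temps_c)  # average cooling-water temperature, degrees C
    return (1.0 / avg_water_temp) * hours_to_target * 100.0

# Example: water held near 10 C for 2.3 hours (the reported median time to target) -> 23 heat units
print(heat_units([12.0, 10.0, 8.0], 2.3))
```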
PTH(1-34) for the Primary Prevention of Postthyroidectomy Hypocalcemia: The THYPOS Trial.
Palermo, Andrea; Mangiameli, Giuseppe; Tabacco, Gaia; Longo, Filippo; Pedone, Claudio; Briganti, Silvia Irina; Maggi, Daria; Vescini, Fabio; Naciu, Anda; Lauria Pantano, Angelo; Napoli, Nicola; Angeletti, Silvia; Pozzilli, Paolo; Crucitti, Pierfilippo; Manfrini, Silvia
2016-11-01
There are no studies evaluating teriparatide for prevention of post-thyroidectomy hypocalcemia. Our objective was to evaluate whether teriparatide can prevent postsurgical hypocalcemia and shorten hospitalization in subjects at high risk of hypocalcemia following thyroid surgery. This was a prospective phase II randomized open-label trial. This trial was set on a surgical ward. Twenty-six subjects (six males, 20 females) with intact PTH lower than 10 pg/ml 4 hours after thyroidectomy were included. Subjects were randomized (1:1) to receive SC administration of 20 mcg of teriparatide every 12 hours until discharge (treatment group) or to follow standard clinical care (control group). Adjusted serum calcium, duration of hospitalization, and calcium/calcitriol supplementation were measured. Overall, the incidence of hypocalcemia was 3/13 in the treatment group and 11/13 in the control group (P = .006). Treated patients had a lower risk of hypocalcemia than controls (relative risk, 0.26 [95% confidence interval, 0.09-0.723]). The median duration of hospitalization was 3 days (interquartile range, 1) in control subjects and 2 days (interquartile range, 0) in treated subjects (P = .012). One month after discharge, 10/13 subjects in the treatment group had stopped calcium carbonate supplements, while only 5/13 in the control group had discontinued calcium. The ANOVA for repeated measures showed a significant difference in calcium supplements between groups at the 1-month visit (P = .04), as well as a significant difference between discharge and the 1-month visit in the treatment group (P for time × group interaction = .04). Conclusions: Teriparatide may prevent postsurgical hypocalcemia, shorten the duration of hospitalization, and reduce the need for calcium and vitamin D supplementation after discharge in high-risk subjects after thyroid surgery.
Choi, Woong Gil; Kim, Soo Hyun; Yoon, Hyung Seok; Lee, Eun Joo; Kim, Dong Woon
2015-01-01
Although drug-eluting stents (DESs) effectively reduce restenosis following percutaneous coronary intervention (PCI), they also delay re-endothelialization and impair microvascular function, resulting in adverse clinical outcomes. Endothelial progenitor cell (EPC) capturing stents, by providing a functional endothelial layer on the stent, have beneficial effects on microvascular function. However, data on coronary microvascular function in patients with EPC stents versus DESs are lacking. Seventy-four patients who previously underwent PCI were enrolled in this study. Microvascular function was evaluated 6 months after PCI based on the index of microvascular resistance (IMR) and the coronary flow reserve (CFR). IMR was calculated as the ratio of the mean distal coronary pressure at maximal hyperemia to the inverse of the hyperemic mean transit time (hTmn). The CFR was calculated by dividing the baseline mean transit time by the hTmn. Twenty-one patients (age, 67.2 ± 9.6 years; male:female, 15:6) with an EPC stent and 53 patients (age, 61.5 ± 14.7 years; male:female, 40:13) with second-generation DESs were included in the study. There were no significant differences in the baseline clinical and angiographic characteristics of the two groups. Angiography performed 6 months after PCI did not show significant differences in CFR values between the two groups. However, patients with the EPC stent had a significantly lower IMR than patients with second-generation DESs (median, 25.5 [interquartile range, 12.85 to 28.18] vs. 29.0 [interquartile range, 15.42 to 39.23]; p = 0.043). Microvascular dysfunction was significantly improved after 6 months in patients with EPC stents compared to those with DESs. The complete re-endothelialization achieved with the EPC stent may provide clinical benefits over DESs, especially in patients with microvascular dysfunction.
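Both indices reduce to ratios of thermodilution measurements: IMR is the hyperemic distal pressure divided by the inverse of the hyperemic mean transit time (equivalently, Pd × hTmn), and CFR is the resting mean transit time divided by the hyperemic mean transit time. A minimal sketch under those standard definitions (variable names and example values are illustrative, not study data):

```python
# Thermodilution-derived microvascular indices, under the standard definitions.
def imr(distal_pressure_hyperemia_mmhg: float, hyperemic_tmn_s: float) -> float:
    # Pd / (1 / hTmn) is equivalent to Pd * hTmn
    return distal_pressure_hyperemia_mmhg * hyperemic_tmn_s

def cfr(resting_tmn_s: float, hyperemic_tmn_s: float) -> float:
    return resting_tmn_s / hyperemic_tmn_s

print(imr(70.0, 0.36))  # 25.2, near the median IMR reported for the EPC-stent group
print(cfr(0.90, 0.36))  # 2.5
```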
Hepatocellular Carcinoma Surveillance Among Cirrhotic Patients With Commercial Health Insurance.
Goldberg, David S; Valderrama, Adriana; Kamalakar, Rajesh; Sansgiry, Sujit S; Babajanyan, Svetlana; Lewis, James D
2016-03-01
To evaluate hepatocellular carcinoma (HCC) surveillance rates among commercially insured patients, and evaluate factors associated with compliance with surveillance recommendations. Most HCC occurs in patients with cirrhosis. American Association for the Study of Liver Diseases and European Association for the Study of the Liver guidelines each recommend biannual HCC surveillance for cirrhotic patients to diagnose HCC at an early, curable stage. However, compliance with these guidelines in commercially insured patients is unknown. We used the Truven Health Analytics databases from 2006 to 2010, with January 1, 2006 as the anchor date for evaluating outcomes. The primary outcome was the continuous surveillance measure, defined as the proportion of time "up-to-date" with surveillance (PTUDS), with the 6-month interval immediately following each ultrasound categorized as "up-to-date." During a median follow-up of 22.9 (interquartile range, 16.3 to 33.9) months among 8916 cirrhotic patients, the mean PTUDS was 0.34 (SD, 0.29), and the median was 0.31 (interquartile range, 0.03 to 0.52). These values increased only modestly with inclusion of serum alpha-fetoprotein testing, contrast-enhanced abdominal computed tomographic scans or magnetic resonance imaging, and/or extension of up-to-date time to 12 months. Being diagnosed by a nongastroenterology provider and increasing age were significantly associated with decreased HCC surveillance (P<0.05), whereas a history of a hepatic decompensation event, presence of any component of the metabolic syndrome, and diagnosis of hepatitis B or hepatitis C were significantly associated with increased surveillance (P<0.05). However, even among patients with the most favorable characteristics, surveillance rates remained low. HCC surveillance rates in commercially insured at-risk patients remain poor despite formalized guidelines, highlighting the need to develop interventions to improve surveillance rates.
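The continuous surveillance measure (PTUDS) is the fraction of each patient's follow-up covered by the 6 months after an ultrasound. A sketch of that calculation; the handling of overlapping 6-month windows and the example dates are our interpretation of the definition, not the authors' code.

```python
# Proportion of time "up-to-date" (PTUDS): each ultrasound covers the following ~6 months,
# and PTUDS = covered follow-up time / total follow-up time.
from datetime import date, timedelta

def ptuds(start: date, end: date, ultrasound_dates: list[date]) -> float:
    covered = timedelta(0)
    cursor = start
    for us in sorted(ultrasound_dates):
        window_start = max(us, cursor)                    # avoid double-counting overlapping windows
        window_end = min(us + timedelta(days=182), end)   # ~6 months of "up-to-date" time
        if window_end > window_start:
            covered += window_end - window_start
            cursor = window_end
    total = end - start
    return covered / total if total > timedelta(0) else 0.0

# Example: two ultrasounds during roughly 23 months of follow-up
print(ptuds(date(2006, 1, 1), date(2007, 12, 1), [date(2006, 3, 1), date(2007, 2, 1)]))
```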
Long-term treatment of uterine fibroids with ulipristal acetate.
Donnez, Jacques; Vázquez, Francisco; Tomaszewski, Janusz; Nouri, Kazem; Bouchard, Philippe; Fauser, Bart C J M; Barlow, David H; Palacios, Santiago; Donnez, Olivier; Bestel, Elke; Osterloh, Ian; Loumaye, Ernest
2014-06-01
To investigate the efficacy and safety of ulipristal acetate (UPA) for long-term treatment of symptomatic uterine fibroids. Repeated intermittent open-label UPA courses, each followed by randomized double-blind norethisterone acetate (NETA) or placebo. European clinical gynecology centers. Two hundred and nine women with symptomatic fibroids including heavy menstrual bleeding. Patients received up to four 3-month courses of UPA 10 mg daily, immediately followed by 10-day double-blind treatment with NETA (10 mg daily) or placebo. Amenorrhea, fibroid volume, endometrial histology. After the first UPA course, amenorrhea occurred in 79% of women, with median onset (from treatment start) of 4 days (interquartile range, 2-6 days). Median fibroid volume change was -45% (interquartile range, -66%; -25%). Amenorrhea rates were 89%, 88%, and 90% for the 131, 119, and 107 women who received treatment courses 2, 3, and 4, respectively. Median times to amenorrhea were 2, 3, and 3 days for treatment courses 2, 3, and 4, respectively. Median fibroid volume changes from baseline were -63%, -67%, and -72% after treatment courses 2, 3, and 4, respectively. All endometrial biopsies showed benign histology without hyperplasia; NETA did not affect fibroid volume or endometrial histology. Repeated 3-month UPA courses effectively control bleeding and shrink fibroids in patients with symptomatic fibroids. ClinicalTrials.gov (www.clinicaltrials.gov) registration numbers NCT01156857 (PEARL III) and NCT01252069 (PEARL III extension). Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
Early Life Experiences and Telomere Length in Adult Rhesus Monkeys: An Exploratory Study
Schneper, Lisa M.; Brooks-Gunn, Jeanne; Notterman, Daniel A.; Suomi, Stephen J.
2016-01-01
Objective Child rearing environments have been associated with morbidity in adult rhesus monkeys. We examine whether such links are also seen with leukocyte telomere length. Methods To determine telomere length in leukocytes, blood was collected from 11 adult females aged seven to ten years who had been exposed to different rearing environments between birth and seven months. Four had been reared with their mothers in typical social groups comprised of other females, their offspring, and 1–2 adult males. The other seven had been reared in either small groups of peers or in individual cages with extensive peer interaction daily. After seven months, all shared a common environment. Results Telomere lengths were longer for those adults who had been reared with their mothers in social groups (median = 16.0 kb, interquartile range = 16.5–15.4) than for those who were reared without their mothers (median = 14.0 kb, interquartile range = 14.3–12.7; 2.2 kb/telomere difference, p<0.027). Conclusions This observation adds to emerging knowledge about early adverse child rearing conditions and their potential for influencing later morbidity. As newborns were randomly assigned to the mother or other rearing conditions, the findings are not confounded by other conditions that co-occur with adverse child rearing environments in humans (e.g., prenatal stress, nutrition and health as well as postnatal nutrition and negative life experiences over and above rearing conditions). PMID:27763985
Silove, Derrick; Alonso, Jordi; Bromet, Evelyn; Gruber, Mike; Sampson, Nancy; Scott, Kate; Andrade, Laura; Benjet, Corina; de Almeida, Jose Miguel Caldas; De Girolamo, Giovanni; de Jonge, Peter; Demyttenaere, Koen; Fiestas, Fabian; Florescu, Silvia; Gureje, Oye; He, Yanling; Karam, Elie; Lepine, Jean-Pierre; Murphy, Sam; Villa-Posada, Jose; Zarkov, Zahari; Kessler, Ronald C.
2016-01-01
Objective The age-at-onset criterion for separation anxiety disorder was removed in DSM-5, making it timely to examine the epidemiology of separation anxiety disorder as a disorder with onsets spanning the life course, using cross-country data. Method The sample included 38,993 adults in 18 countries in the World Health Organization (WHO) World Mental Health Surveys. The WHO Composite International Diagnostic Interview was used to assess a range of DSM-IV disorders that included an expanded definition of separation anxiety disorder allowing onsets in adulthood. Analyses focused on prevalence, age at onset, comorbidity, predictors of onset and persistence, and separation anxiety-related role impairment. Results Lifetime separation anxiety disorder prevalence averaged 4.8% across countries (interquartile range [25th–75th percentiles]=1.4%–6.4%), with 43.1% of lifetime onsets occurring after age 18. Significant time-lagged associations were found between earlier separation anxiety disorder and subsequent onset of internalizing and externalizing DSM-IV disorders and conversely between these disorders and subsequent onset of separation anxiety disorder. Other consistently significant predictors of lifetime separation anxiety disorder included female gender, retrospectively reported childhood adversities, and lifetime traumatic events. These predictors were largely comparable for separation anxiety disorder onsets in childhood, adolescence, and adulthood and across country income groups. Twelve-month separation anxiety disorder prevalence was considerably lower than lifetime prevalence (1.0% of the total sample; interquartile range=0.2%–1.2%). Severe separation anxiety-related 12-month role impairment was significantly more common in the presence (42.4%) than absence (18.3%) of 12-month comorbidity. Conclusions Separation anxiety disorder is a common and highly comorbid disorder that can have onset across the lifespan. Childhood adversity and lifetime trauma are important antecedents, and adverse effects on role function make it a significant target for treatment. PMID:26046337
Lengline, Etienne; Drenou, Bernard; Peterlin, Pierre; Tournilhac, Olivier; Abraham, Julie; Berceanu, Ana; Dupriez, Brigitte; Guillerm, Gaelle; Raffoux, Emmanuel; de Fontbrune, Flore Sicre; Ades, Lionel; Balsat, Marie; Chaoui, Driss; Coppo, Paul; Corm, Selim; Leblanc, Thierry; Maillard, Natacha; Terriou, Louis; Socié, Gerard; de Latour, Regis Peffault
2018-02-01
Few therapeutic options are available for patients with aplastic anemia who are ineligible for transplantation or refractory to immunosuppressive therapy. Eltrombopag was recently shown to produce trilineage responses in refractory patients. However, the effects of real-life use of this drug remain unknown. This retrospective study (2012-2016) was conducted by the French Reference Center for Aplastic Anemia on patients with relapsed/refractory aplastic anemia, and patients ineligible for antithymocyte globulin or transplantation, who received eltrombopag for at least 2 months. Forty-six patients with aplastic anemia were given eltrombopag without prior antithymocyte globulin treatment (n=11) or after antithymocyte globulin administration (n=35) in a relapsed/refractory setting. Eltrombopag (median daily dose 150 mg) was introduced 17 months (range, 8-50) after the diagnosis of aplastic anemia. At last follow-up, 49% were still receiving treatment, 9% had stopped due to a robust response, 2% due to toxicity and 40% due to eltrombopag failure. Before eltrombopag treatment, all patients received regular transfusions. The overall rates of red blood cell and platelet transfusion independence were 7%, 33%, 46% and 46% at 1, 3, and 6 months and at last follow-up. Responses were slower to develop in antithymocyte treatment-naïve patients. In patients achieving transfusion independence, hemoglobin concentration and platelet counts improved by 3 g/dL (interquartile range, 1.4-4.5) and 42×10⁹/L (interquartile range, 11-100), respectively. Response in at least one lineage (according to National Institutes of Health criteria) was observed in 64% of antithymocyte treatment-naïve and 74% of relapsed/refractory patients, while trilineage improvement was observed in 27% and 34%, respectively. We found high rates of hematologic improvement and transfusion independence in refractory aplastic anemia patients but also in patients ineligible for antithymocyte globulin receiving first-line treatment. In conclusion, elderly patients unfit for antithymocyte globulin therapy may benefit from eltrombopag. Copyright © 2018 Ferrata Storti Foundation.
Biswas Roy, Sreeja; Elnahas, Shaimaa; Serrone, Rosemarie; Haworth, Cassandra; Olson, Michael T; Kang, Paul; Smith, Michael A; Bremner, Ross M; Huang, Jasmine L
2018-06-01
Gastroesophageal reflux disease (GERD) is prevalent after lung transplantation. Fundoplication slows lung function decline in patients with GERD, but the optimal timing of fundoplication is unknown. We retrospectively reviewed patients who underwent fundoplication after lung transplantation at our center from April 2007 to July 2014. Patients were divided into 2 groups: early fundoplication (<6 months after lung transplantation) and late fundoplication (≥6 months after lung transplantation). Annual decline in percent predicted forced expiratory volume in 1 second (FEV1) was analyzed. Of the 251 patients who underwent lung transplantation during the study period with available pH data, 86 (34.3%) underwent post-transplantation fundoplication for GERD. Thirty of 86 (34.9%) had early fundoplication and 56 of 86 (65.1%) had late fundoplication. Median time from lung transplantation to fundoplication was 4.6 months (interquartile range, 2.0-5.2) and 13.8 months (interquartile range, 9.0-16.1) for the early and late groups, respectively. The median DeMeester score was comparable between groups. One-, 3-, and 5-year actuarial survival rates in the early group were 90%, 70%, and 70%, respectively; in the late group, these rates were 91%, 66%, and 66% (log rank P = .60). Three- and 5-year percent predicted FEV1 was lower in the late group by 8.9% (95% confidence interval, -30.2 to 12.38; P = .46) and 40.7% (95% confidence interval, -73.66 to -7.69; P = .019). A linear mixed model showed a 5.7% lower percent predicted FEV1 over time in the late fundoplication group (P < .001). In this study, patients with early fundoplication had a higher FEV1 at 5 years after lung transplantation. Early fundoplication might protect against GERD-induced lung damage in lung transplant recipients with GERD. Copyright © 2018 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
Demaerschalk, Bart M; Brown, Robert D; Roubin, Gary S; Howard, Virginia J; Cesko, Eldina; Barrett, Kevin M; Longbottom, Mary E; Voeks, Jenifer H; Chaturvedi, Seemant; Brott, Thomas G; Lal, Brajesh K; Meschia, James F; Howard, George
2017-09-01
Multicenter clinical trials attempt to select sites that can move rapidly to randomization and enroll sufficient numbers of patients. However, there are few assessments of the success of site selection. In the CREST-2 (Carotid Revascularization and Medical Management for Asymptomatic Carotid Stenosis Trials), we assess factors associated with the time between site selection and authorization to randomize, the time between authorization to randomize and the first randomization, and the average number of randomizations per site per month. Potential factors included characteristics of the site, specialty of the principal investigator, and site type. For 147 sites, the median time from site selection to authorization to randomize was 9.9 months (interquartile range, 7.7, 12.4), and factors associated with early site activation were not identified. The median time between authorization to randomize and a randomization was 4.6 months (interquartile range, 2.6, 10.5). Sites with authorization to randomize in only the carotid endarterectomy study were slower to randomize, and other factors examined were not significantly associated with time-to-randomization. The recruitment rate was 0.26 (95% confidence interval, 0.23-0.28) patients per site per month. By univariate analysis, factors associated with faster recruitment were authorization to randomize in both trials, principal investigator specialties of interventional radiology and cardiology, pre-trial reported performance of >50 carotid angioplasty and stenting procedures per year, status in the top half of recruitment in the CREST trial, and classification as a private health facility. Participation in StrokeNet was associated with slower recruitment as compared with the non-StrokeNet sites. Overall, selection of sites with high enrollment rates will likely require customization to align the sites selected to the factor under study in the trial. URL: http://www.clinicaltrials.gov. Unique identifier: NCT02089217. © 2017 American Heart Association, Inc.
Effort of breathing in children receiving high-flow nasal cannula.
Rubin, Sarah; Ghuman, Anoopindar; Deakers, Timothy; Khemani, Robinder; Ross, Patrick; Newth, Christopher J
2014-01-01
High-flow humidified nasal cannula is often used to provide noninvasive respiratory support in children. The effect of high-flow humidified nasal cannula on effort of breathing in children has not been objectively studied, and the mechanism by which respiratory support is provided remains unclear. This study uses an objective measure of effort of breathing (Pressure-Rate Product) to evaluate high-flow humidified nasal cannula in critically ill children. Prospective cohort study. Quaternary care free-standing academic children's hospital. ICU patients younger than 18 years receiving high-flow humidified nasal cannula or for whom the medical team planned to extubate to high-flow humidified nasal cannula within 72 hours of enrollment. An esophageal pressure monitoring catheter was placed to measure pleural pressures via a Bicore CP-100 pulmonary mechanics monitor. Change in pleural pressure (ΔPes) and respiratory rate were measured on high-flow humidified nasal cannula at 2, 5, and 8 L/min. ΔPes and respiratory rate were multiplied to generate the Pressure-Rate Product, a well-established objective measure of effort of breathing. Baseline Pes, defined as pleural pressure at end exhalation during tidal breathing, reflected the positive pressure generated on each level of respiratory support. Twenty-five patients had measurements on high-flow humidified nasal cannula. Median age was 6.5 months (interquartile range, 1.3-15.5 mo). Median Pressure-Rate Product was lower on high-flow humidified nasal cannula 8 L/min (median, 329 cm H2O·min; interquartile range, 195-402) compared with high-flow humidified nasal cannula 5 L/min (median, 341; interquartile range, 232-475; p = 0.007) or high-flow humidified nasal cannula 2 L/min (median, 421; interquartile range, 233-621; p < 0.0001) and was lower on high-flow humidified nasal cannula 5 L/min compared with high-flow humidified nasal cannula 2 L/min (p = 0.01). Baseline Pes was higher on high-flow humidified nasal cannula 8 L/min than on high-flow humidified nasal cannula 2 L/min (p = 0.03). Increasing flow rates of high-flow humidified nasal cannula decreased effort of breathing in children, with the most significant impact seen from high-flow humidified nasal cannula 2 to 8 L/min. There are likely multiple mechanisms for this clinical effect, including generation of positive pressure and washout of airway dead space.
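For clarity, the metric above is a simple product: the Pressure-Rate Product equals the inspiratory change in esophageal (pleural) pressure multiplied by the respiratory rate. The Python sketch below only illustrates that arithmetic; the numbers are hypothetical and are not taken from the study.

    def pressure_rate_product(delta_pes_cm_h2o, respiratory_rate_bpm):
        # Effort-of-breathing index: change in pleural pressure (cm H2O) x respiratory rate (breaths/min)
        return delta_pes_cm_h2o * respiratory_rate_bpm

    # Hypothetical values, for illustration only: a dPes of 14 cm H2O at 30 breaths/min
    print(pressure_rate_product(14.0, 30))  # -> 420.0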
Intestinal cytokines in children with pervasive developmental disorders.
DeFelice, Magee L; Ruchelli, Eduardo D; Markowitz, Jonathan E; Strogatz, Melissa; Reddy, Krishna P; Kadivar, Khadijeh; Mulberg, Andrew E; Brown, Kurt A
2003-08-01
A relationship between autism and gastrointestinal (GI) immune dysregulation has been postulated based on incidence of GI complaints as well as macroscopically observed lymphonodular hyperplasia and microscopically determined enterocolitis in pediatric patients with autism. To evaluate GI immunity, we quantitatively assessed levels of proinflammatory cytokines, interleukin (IL)-6, IL-8, and IL-1beta, produced by intestinal biopsies of children with pervasive developmental disorders. Fifteen patients, six with pervasive developmental disorders and nine age-matched controls, presenting for diagnostic colonoscopy were enrolled. Endoscopic biopsies were organ cultured, supernatants were harvested, and IL-6, IL-8, and IL-1beta levels were quantified by ELISA. Tissue histology was evaluated by blinded pathologists. Concentrations of IL-6 from intestinal organ culture supernatants of patients with pervasive developmental disorders (median 318.5 pg/ml, interquartile range 282.0-393.0 pg/ml) when compared with controls (median 436.9 pg/ml, interquartile range 312.6-602.5 pg/ml) were not significantly different (p = 0.0987). Concentrations of IL-8 (median 84,000 pg/ml, interquartile range 16,000-143,000 pg/ml) when compared with controls (median 177,000 pg/ml, interquartile range 114,000-244,000 pg/ml) were not significantly different (p = 0.0707). Concentrations of IL-1beta (median 0.0 pg/ml, interquartile range 0.0-94.7 pg/ml) when compared with controls (median 0.0 pg/ml, interquartile range 0.0-60.2 pg/ml) were not significantly different (p = 0.8826). Tissue histology was nonpathological for all patients. We have demonstrated no significant difference in production of IL-6, IL-8, and IL-1beta between patients with pervasive developmental disorders and age-matched controls. In general, intestinal levels of IL-6 and IL-8 were lower in patients with pervasive developmental disorders than in age-matched controls. These data fail to support an association between autism and GI inflammation.
Evaluation of an artificial intelligence program for estimating occupational exposures.
Johnston, Karen L; Phillips, Margaret L; Esmen, Nurtan A; Hall, Thomas A
2005-03-01
Estimation and Assessment of Substance Exposure (EASE) is an artificial intelligence program developed by the UK's Health and Safety Executive to assess exposure. EASE computes estimated airborne concentrations based on a substance's vapor pressure and the types of controls in the work area. Though EASE is intended only to make broad predictions of exposure from occupational environments, some occupational hygienists might attempt to use EASE for individual exposure characterizations. This study investigated whether EASE would accurately predict actual sampling results from a chemical manufacturing process. Personal breathing zone time-weighted average (TWA) monitoring data for two volatile organic chemicals--a common solvent (toluene) and a specialty monomer (chloroprene)--present in this manufacturing process were compared to EASE-generated estimates. EASE-estimated concentrations for specific tasks were weighted by task durations reported in the monitoring record to yield TWA estimates from EASE that could be directly compared to the measured TWA data. Two hundred and six chloroprene and toluene full-shift personal samples were selected from eight areas of this manufacturing process. The Spearman correlation between EASE TWA estimates and measured TWA values was 0.55 for chloroprene and 0.44 for toluene, indicating moderate predictive values for both compounds. For toluene, the interquartile range of EASE estimates at least partially overlapped the interquartile range of the measured data distributions in all process areas. The interquartile range of EASE estimates for chloroprene fell above the interquartile range of the measured data distributions in one process area, partially overlapped the third quartile of the measured data in five process areas and fell within the interquartile range in two process areas. EASE is not a substitute for actual exposure monitoring. However, EASE can be used in conditions that cannot otherwise be sampled and in preliminary exposure assessment if it is recognized that the actual interquartile range could be much wider and/or offset by a factor of 10 or more.
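The task-duration weighting described above is an ordinary time-weighted average: each task's EASE-estimated concentration is weighted by the time spent on that task and divided by the total shift time. A minimal Python sketch of that calculation follows, assuming hypothetical task concentrations and durations rather than the study's data.

    def time_weighted_average(concentrations, durations_min):
        # TWA = sum(C_i * t_i) / sum(t_i); one concentration estimate per task, durations in minutes
        total_minutes = sum(durations_min)
        return sum(c * t for c, t in zip(concentrations, durations_min)) / total_minutes

    # Hypothetical shift: three tasks with EASE-style concentration estimates (ppm) and durations (min)
    print(round(time_weighted_average([5.0, 50.0, 0.5], [240, 60, 180]), 2))  # -> 8.94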
Variability in Antibiotic Use Across PICUs.
Brogan, Thomas V; Thurm, Cary; Hersh, Adam L; Gerber, Jeffrey S; Smith, Michael J; Shah, Samir S; Courter, Joshua D; Patel, Sameer J; Parker, Sarah K; Kronman, Matthew P; Lee, Brian R; Newland, Jason G
2018-06-01
To characterize and compare antibiotic prescribing across PICUs to evaluate the degree of variability. Retrospective analysis from 2010 through 2014 of the Pediatric Health Information System. Forty-one freestanding children's hospitals. Children aged 30 days to 18 years admitted to a PICU in children's hospitals contributing data to the Pediatric Health Information System. To normalize for potential differences in disease severity and case mix across centers, a subanalysis was performed of children admitted with one of the 20 All Patient Refined-Diagnosis Related Groups with the highest antibiotic use and of children admitted with one of the seven All Patient Refined-Diagnosis Related Groups shared by all PICUs. The study included 3,101,201 hospital discharges from 41 institutions with 386,914 PICU patients. All antibiotic use declined during the study period. The median adjusted antibiotic use among PICU patients was 1,043 days of therapy/1,000 patient-days (interquartile range, 977-1,147 days of therapy/1,000 patient-days) compared with 893 among non-ICU children (interquartile range, 805-968 days of therapy/1,000 patient-days). For PICU patients, the median adjusted use of broad-spectrum antibiotics was 176 days of therapy/1,000 patient-days (interquartile range, 152-217 days of therapy/1,000 patient-days) and was 302 days of therapy/1,000 patient-days (interquartile range, 220-351 days of therapy/1,000 patient-days) for anti-methicillin-resistant Staphylococcus aureus agents, compared with 153 days of therapy/1,000 patient-days (interquartile range, 130-182 days of therapy/1,000 patient-days) and 244 days of therapy/1,000 patient-days (interquartile range, 203-270 days of therapy/1,000 patient-days) for non-ICU children. After adjusting for potential confounders, significant institutional variability existed in antibiotic use in PICU patients, in the 20 All Patient Refined-Diagnosis Related Groups with the highest antibiotic usage, and in the seven All Patient Refined-Diagnosis Related Groups shared by all 41 PICUs. The wide variation in antibiotic use observed across children's hospital PICUs suggests inappropriate antibiotic use.
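The usage metric reported above, days of therapy per 1,000 patient-days, normalizes antibiotic exposure for hospital volume: total antibiotic-days, summed across agents, divided by total patient-days and scaled by 1,000. The short Python sketch below illustrates the normalization with made-up counts, not the study's data.

    def dot_per_1000_patient_days(antibiotic_days, patient_days):
        # Days of therapy (DOT) normalized per 1,000 patient-days
        return 1000.0 * antibiotic_days / patient_days

    # Hypothetical unit: 5,200 antibiotic-days accumulated over 5,000 patient-days
    print(dot_per_1000_patient_days(5200, 5000))  # -> 1040.0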
Prevalence of seizures in children infected with human immunodeficiency virus.
Samia, Pauline; Petersen, Reneva; Walker, Kathleen G; Eley, Brian; Wilmshurst, Jo M
2013-03-01
A retrospective study of 354 human immunodeficiency virus (HIV)-infected patients identified a subgroup of 27 children with seizures (7.6%, 95% confidence interval: 5.1%-10.9%). Of the total group, 13% (n = 46) had identifiable neurologic deficits and 30% (n = 107) had developmental delay. Both observations were significantly more frequent in the subgroup of patients with seizures (P < .001). The median age of patients with seizures was 20 months (range, 8-87 months) and the median baseline CD4 percentage was 13.5% (interquartile range, 8%-23%). Seizures were treated with sodium valproate (n = 11), phenobarbital (n = 3), diazepam (n = 2), lamotrigine (n = 1), and carbamazepine (n = 1). Combination therapy was required for 5 children. Suboptimal valproic acid levels were recorded for 3 patients. When resources are available, antiepileptic drug level monitoring is advised for children who require both antiepileptic and antiretroviral medications to facilitate optimal seizure management.
Mehta, Amar J; Kubzansky, Laura D; Coull, Brent A; Kloog, Itai; Koutrakis, Petros; Sparrow, David; Spiro, Avron; Vokonas, Pantel; Schwartz, Joel
2015-01-27
There is mixed evidence suggesting that air pollution may be associated with increased risk of developing psychiatric disorders. We aimed to investigate the association between air pollution and non-specific perceived stress, often a precursor to development of affective psychiatric disorders. This longitudinal analysis consisted of 987 older men participating in at least one visit for the Veterans Administration Normative Aging Study between 1995 and 2007 (n = 2,244 visits). At each visit, participants were administered the 14-item Perceived Stress Scale (PSS), which quantifies stress experienced in the previous week. Scores ranged from 0 to 56, with higher scores indicating increased stress. Differences in PSS score per interquartile range increase in moving average (1, 2, and 4 weeks) of air pollution exposures were estimated using linear mixed-effects regression after adjustment for age, race, education, physical activity, anti-depressant medication use, seasonality, meteorology, and day of week. We also evaluated effect modification by season (April-September and October-March for the warm and cold seasons, respectively). Fine particles (PM2.5), black carbon (BC), nitrogen dioxide, and particle number counts (PNC) at moving averages of 1, 2, and 4 weeks were associated with higher perceived stress ratings. The strongest associations were observed for PNC; for example, a 15,997 counts/cm³ interquartile range increase in 1-week average PNC was associated with a 3.2 point (95%CI: 2.1-4.3) increase in PSS score. Season modified the associations for specific pollutants; higher PSS scores in association with PM2.5, BC, and sulfate were observed mainly in colder months. Air pollution was associated with higher levels of perceived stress in this sample of older men, particularly in colder months for specific pollutants.
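The effect estimates above are scaled per interquartile-range increase in exposure, i.e., the fitted per-unit regression coefficient is multiplied by the pollutant's IQR so that effects are comparable across pollutants measured in different units. The Python sketch below illustrates that rescaling using the particle-number-count example reported above; the per-unit slope is an assumption chosen only to reproduce the published figure.

    # Rescale a per-unit regression slope into a per-IQR effect on the PSS score.
    beta_per_unit = 3.2 / 15997   # assumed slope (PSS points per count/cm^3), illustrative only
    iqr_pnc = 15997               # reported IQR of 1-week average particle number counts
    print(round(beta_per_unit * iqr_pnc, 1))  # -> 3.2-point PSS increase per IQR increase in PNC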
O'Connor, Jeremy M; Fessele, Kristen L; Steiner, Jean; Seidl-Rathkopf, Kathi; Carson, Kenneth R; Nussbaum, Nathan C; Yin, Emily S; Adelson, Kerin B; Presley, Carolyn J; Chiang, Anne C; Ross, Joseph S; Abernethy, Amy P; Gross, Cary P
2018-05-10
The US Food and Drug Administration (FDA) is increasing its pace of approvals for novel cancer therapeutics, including for immune checkpoint inhibitors of programmed cell death 1 protein (anti-PD-1 agents). However, little is known about how quickly anti-PD-1 agents reach eligible patients in practice or whether such patients differ from those studied in clinical trials that lead to FDA approval (pivotal clinical trials). To assess the speed with which anti-PD-1 agents reached eligible patients in practice and to compare the ages of patients treated in clinical practice with the ages of those treated in pivotal clinical trials. This retrospective cohort study, performed from January 1, 2011, through August 31, 2016, included patients from the Flatiron Health Network who were eligible for anti-PD-1 treatment of selected cancer types, which included melanoma, non-small cell lung cancer (NSCLC), and renal cell carcinoma (RCC). Cumulative proportions of eligible patients receiving anti-PD-1 treatment and their age distributions. The study identified 3089 patients who were eligible for anti-PD-1 treatment (median age, 66 [interquartile range, 56-75] years for patients with melanoma, 66 [interquartile range, 58-72] years for patients with RCC, and 67 [interquartile range, 59-74] years for patients with NSCLC; 1742 male [56.4%] and 1347 [43.6%] female; 2066 [66.9%] white). Of these patients, 2123 (68.7%) received anti-PD-1 treatment, including 439 eligible patients with melanoma (79.1%), 1417 eligible patients with NSCLC (65.6%), and 267 eligible patients with RCC (71.2%). Within 4 months after FDA approval, greater than 60% of eligible patients in each cohort had received anti-PD-1 treatment. Overall, similar proportions of older and younger patients received anti-PD-1 treatment during the first 9 months after FDA approval. However, there were significant differences in age between clinical trial participants and patients receiving anti-PD-1 treatment in clinical practice, with more patients being older than 65 years in clinical practice (range, 327 of 1365 [60.6%] to 46 of 72 [63.9%]) than in pivotal clinical trials (range, 38 of 120 [31.7%] to 223 of 544 [41.0%]; all P < .001). Anti-PD-1 agents rapidly reached patients in clinical practice, and patients treated in clinical practice differed significantly from patients treated in pivotal clinical trials. Future actions are needed to ensure that rapid adoption occurs on the basis of representative trial evidence.
Quantitative electromyography in ambulatory boys with Duchenne muscular dystrophy.
Verma, Sumit; Lin, Jenny; Travers, Curtis; McCracken, Courtney; Shah, Durga
2017-12-01
This study's objective was to evaluate quantitative electromyography (QEMG) using multiple-motor-unit (multi-MUP) analysis in Duchenne muscular dystrophy (DMD). Ambulatory DMD boys, aged 5-15 years, were evaluated with QEMG at 6-month intervals over 14 months. EMG was performed in the right biceps brachii (BB) and tibialis anterior (TA) muscles. Normative QEMG data were obtained from age-matched healthy boys. Wilcoxon signed-rank tests were performed. Eighteen DMD subjects were enrolled, with a median age of 7 (interquartile range 7-10) years. Six-month evaluations were performed on 14 subjects. QEMG showed significantly abnormal mean MUP duration in BB and TA muscles, with no significant change over 6 months. QEMG is a sensitive electrophysiological marker of myopathy in DMD. Preliminary data do not reflect a significant change in MUP parameters over a 6-month interval; long-term follow-up QEMG studies are needed to understand its role as a biomarker for disease progression. Muscle Nerve 56: 1361-1364, 2017. © 2017 Wiley Periodicals, Inc.
Single-Incision Laparoscopic Sterilization of the Cheetah (Acinonyx jubatus).
Hartman, Marthinus J; Monnet, Eric; Kirberger, Robert M; Schmidt-Küntzel, Anne; Schulman, Martin L; Stander, Jana A; Stegmann, George F; Schoeman, Johan P
2015-07-01
To describe laparoscopic ovariectomy and salpingectomy in the cheetah (Acinonyx jubatus) using single-incision laparoscopic surgery (SILS). Prospective cohort. Female cheetahs (Acinonyx jubatus) (n = 21). Cheetahs were randomly assigned to receive either ovariectomy (n = 11) or salpingectomy (n = 10). The use of, and complications associated with, a SILS port were evaluated in all cheetahs. Surgery duration and insufflation volumes of carbon dioxide (CO2) were recorded and compared across procedures. Laparoscopic ovariectomy and salpingectomy were performed without complications using a SILS port. The poorly-developed mesosalpinx and ovarian bursa facilitated access to the uterine tube for salpingectomy in the cheetah. The median surgery duration for ovariectomy was 24 minutes (interquartile range 3) and for salpingectomy was 19.5 minutes (interquartile range 3) (P = .005). The median volume of CO2 used for ovariectomy was 11.25 L (interquartile range 3.08) and for salpingectomy was 4.90 L (interquartile range 2.52) (P = .001). CONCLUSIONS: Laparoscopic ovariectomy and salpingectomy can be performed in the cheetah using SILS without perioperative complications. Salpingectomy is faster than ovariectomy and requires less total CO2 for insufflation. © Copyright 2015 by The American College of Veterinary Surgeons.
Astley, Susan; Connor, Sophie; Lim, Yit; Tate, Catriona; Entwistle, Helen; Morris, Julie; Whiteside, Sigrid; Sergeant, Jamie; Wilson, Mary; Beetles, Ursula; Boggis, Caroline; Gilbert, Fiona
2013-03-01
Digital Breast Tomosynthesis (DBT) provides three-dimensional images of the breast that enable radiologists to discern whether densities are due to overlapping structures or lesions. To aid assessment of the cost-effectiveness of DBT for screening, we have compared the time taken to interpret DBT images and the corresponding two-dimensional Full Field Digital Mammography (FFDM) images. Four Consultant Radiologists experienced in reading FFDM images (4 years 8 months to 8 years) with training in DBT interpretation but more limited experience (137-407 cases in the past 6 months) were timed reading between 24 and 32 two-view FFDM and DBT cases. The images were of women recalled from screening for further assessment and women under surveillance because of a family history of breast cancer. FFDM images were read before DBT, according to local practice. The median time for readers to interpret FFDM images was 17.0 seconds, with an interquartile range of 12.3-23.6 seconds. For DBT, the median time was 66.0 seconds, and the interquartile range was 51.1-80.5 seconds. The difference was statistically significant (p<0.001). Reading times were significantly longer in family history clinics (p<0.01). Although it took approximately four times as long to interpret DBT images as FFDM images, the cases were more complex than would be expected for routine screening and had higher mammographic density. The readers were relatively inexperienced in DBT interpretation and may increase their speed over time. The difference in times between clinics may be due to increased throughput at assessment or decreased density.
Carr, John Alfred; Buterakos, Roxanne; Bowling, William M; Janson, Lisa; Kralovich, Kurt A; Copeland, Craig; Link, Renee; Roiter, Cecilia; Casey, Gregory; Wagner, James W
2011-03-01
There are almost no data describing the long-term functional outcome of patients after penetrating cardiac injury. A retrospective study at a Level I trauma center from 2000 to 2009. Sixty-three patients had penetrating cardiac injuries from 28 stabbings and 35 gunshots. Men comprised 89% (56) of the patients. Overall, there were 21 survivors (33%), and 42 patients died in the emergency room or perioperative period. The mean age did not significantly differ between survivors (36 years ± 12 years) and those who died (30 years ± 11 years; p=0.07). There was an increased chance of survival after being stabbed compared with being shot (17 patients vs. 4 patients; odds ratio=12; p=0.002). Thirteen (62%) had injuries to the right ventricle only. Three patients died during follow-up: one from lung cancer and two from myocardial infarctions, one 9 years later at the age of 45 years and the other 8 years later at the age of 55 years. The survivors had functional follow-up evaluations from 2 months to 114 months (median, 71; interquartile range, 34-92 months) and echocardiographic follow-up from 2 months to 107 months (median, 64; interquartile range, 31-84 months) after their injuries. Functionally, all patients were in NYHA class I status, except one patient in class II who was 54 years old and had a mild exertional limitation. The previously injured area could only be identified by echocardiogram in one patient who had a patch repair of a ventricular septal defect (VSD). The mean ejection fraction improved over time from a mean of 51% ± 8% in the immediate postoperative period to 60% ± 9% after a mean follow-up of 59 months (p=0.01). After surgery, 43% of patients had a mild to moderate pericardial effusion; however, the long-term follow-up studies showed that all these had resolved. Wall motion abnormalities occurred in 33% of patients in the immediate postoperative period and, again, all these resolved during long-term follow-up. Patients who survive penetrating cardiac injuries, without coronary arterial or valvular disruption, have an excellent long-term functional outcome with minimal subsequent cardiac morbidity related to the injury. Full physiologic recovery and normal cardiac function can be expected if the patient survives. Copyright © 2011 by Lippincott Williams & Wilkins
Household Expenditure on Tobacco Consumption in a Poverty-Stricken Rural District in Sri Lanka.
Perera, K Manuja N; Guruge, G N Duminda; Jayawardana, Pushpa L
2017-03-01
Tobacco is a determinant of poverty and a barrier for development. Monaragala, a rural, agricultural district, reports the highest poverty-related indicators in southern Sri Lanka. A cross-sectional study was used to describe the household expenditure on tobacco and its association with food- and education-related expenditures at the household level. This study used a 4-stage cluster sampling method to recruit a representative sample of 1160 households. Response rate was 98.6%. Median monthly household income was LKR 20 000 (interquartile range [IQR] = LKR 12 000-30 000). The median monthly expenditure on tobacco was LKR 1000 (IQR = LKR 400-2000), with the highest spending tertile reporting a median of LKR 2700 (IQR = LKR 2000-3600). The proportionate expenditure from the monthly income ranged from 0.0% to 50% with a median of 5.0% (IQR = 2.0-10.0) and a mean of 7.4% (7.6). The poorest reported the highest mean proportionate expenditure (9.8%, SD = 10) from the household income. Household expenditure on tobacco was negatively associated with expenditure on education.
Rosa, Regis Goulart; Tonietto, Tulio Frederico; da Silva, Daiana Barbosa; Gutierres, Franciele Aparecida; Ascoli, Aline Maria; Madeira, Laura Cordeiro; Rutzen, William; Falavigna, Maicon; Robinson, Caroline Cabral; Salluh, Jorge Ibrain; Cavalcanti, Alexandre Biasi; Azevedo, Luciano Cesar; Cremonese, Rafael Viegas; Haack, Tarissa Ribeiro; Eugênio, Cláudia Severgnini; Dornelles, Aline; Bessel, Marina; Teles, José Mario Meira; Skrobik, Yoanna; Teixeira, Cassiano
2017-10-01
To evaluate the effect of an extended visitation model compared with a restricted visitation model on the occurrence of delirium among ICU patients. Prospective single-center before-and-after study. Thirty-one-bed medical-surgical ICU. All patients greater than or equal to 18 years old with expected length of stay greater than or equal to 24 hours consecutively admitted to the ICU from May 2015 to November 2015. Change of visitation policy from a restricted visitation model (4.5 hr/d) to an extended visitation model (12 hr/d). Two hundred eighty-six patients were enrolled (141 restricted visitation model, 145 extended visitation model). The primary outcome was the cumulative incidence of delirium, assessed twice daily using the Confusion Assessment Method for the ICU. Predefined secondary outcomes included duration of delirium/coma; any ICU-acquired infection; ICU-acquired bloodstream infection, pneumonia, and urinary tract infection; all-cause ICU mortality; and length of ICU stay. The median duration of visits increased from 133 minutes (interquartile range, 97.7-162.0) in the restricted visitation model to 245 minutes (interquartile range, 175.0-272.0) in the extended visitation model (p < 0.001). Fourteen patients (9.6%) developed delirium in the extended visitation model compared with 29 (20.5%) in the restricted visitation model (adjusted relative risk, 0.50; 95% CI, 0.26-0.95). In comparison with restricted visitation model patients, extended visitation model patients had shorter length of delirium/coma (1.5 d [interquartile range, 1.0-3.0] vs 3.0 d [interquartile range, 2.5-5.0]; p = 0.03) and ICU stay (3.0 d [interquartile range, 2.0-4.0] vs 4.0 d [interquartile range, 2.0-6.0]; p = 0.04). The rate of ICU-acquired infections and all-cause ICU mortality did not differ significantly between the two study groups. In this medical-surgical ICU, an extended visitation model was associated with reduced occurrence of delirium and shorter length of delirium/coma and ICU stay.
Finn Davis, Katherine; Napolitano, Natalie; Li, Simon; Buffman, Hayley; Rehder, Kyle; Pinto, Matthew; Nett, Sholeen; Jarvis, J Dean; Kamat, Pradip; Sanders, Ronald C; Turner, David A; Sullivan, Janice E; Bysani, Kris; Lee, Anthony; Parker, Margaret; Adu-Darko, Michelle; Giuliano, John; Biagas, Katherine; Nadkarni, Vinay; Nishisaki, Akira
2017-10-01
To describe promoters and barriers to implementation of an airway safety quality improvement bundle from the perspective of interdisciplinary frontline clinicians and ICU quality improvement leaders. Mixed methods. Thirteen PICUs of the National Emergency Airway Registry for Children network. Remote or on-site focus groups with interdisciplinary ICU staff. Two semistructured interviews with ICU quality improvement leaders, with quantitative and qualitative data-based feedback. Bundle implementation success (compliance) was defined as greater than or equal to 80% use for tracheal intubations for 3 consecutive months. ICUs were classified as early or late adopters. Focus group discussions concentrated on safety concerns and on promoters of and barriers to bundle implementation. Initial semistructured quality improvement leader interviews assessed implementation tactics and provided recommendations. Follow-up interviews assessed degree of acceptance and changes made after the initial interview. Transcripts were thematically analyzed and contrasted by early versus late adopters. Median duration to achieve success was 502 days (interquartile range, 182-781). Five sites were early adopters (median, 153 d; interquartile range, 146-267) and eight sites were late adopters (median, 783 d; interquartile range, 773-845). Focus groups identified common "promoter" themes (interdisciplinary approach, influential champions, and quality improvement bundle customization) and "barrier" themes (time constraints, competing paperwork and quality improvement activities, and poor engagement). Semistructured interviews with quality improvement leaders identified effective and ineffective tactics implemented by early and late adopters. Effective tactics included interdisciplinary quality improvement team involvement (early adopter: 5/5, 100% vs late adopter: 3/8, 38%; p = 0.08); ineffective tactics included physician-only rollouts, lack of interdisciplinary education, lack of data feedback to frontline clinicians, and misconception of the bundle as research instead of a quality improvement intervention. Implementation of an airway safety quality improvement bundle with high compliance takes a long time across diverse ICUs. Both early and late adopters identified similar promoter and barrier themes. Early adopter sites customized the quality improvement bundle and had an interdisciplinary quality improvement team approach.
Kim, Joon-Tae; Chung, Pil-Wook; Starkman, Sidney; Sanossian, Nerses; Stratton, Samuel J; Eckstein, Marc; Pratt, Frank D; Conwit, Robin; Liebeskind, David S; Sharma, Latisha; Restrepo, Lucas; Tenser, May-Kim; Valdes-Sueiras, Miguel; Gornbein, Jeffrey; Hamilton, Scott; Saver, Jeffrey L
2017-02-01
The Los Angeles Motor Scale (LAMS) is a 3-item, 0- to 10-point motor stroke-deficit scale developed for prehospital use. We assessed the convergent, divergent, and predictive validity of the LAMS when performed by paramedics in the field at multiple sites in a large and diverse geographic region. We analyzed early assessment and outcome data prospectively gathered in the FAST-MAG trial (Field Administration of Stroke Therapy-Magnesium phase 3) among patients with acute cerebrovascular disease (cerebral ischemia and intracranial hemorrhage) within 2 hours of onset, transported by 315 ambulances to 60 receiving hospitals. Among 1632 acute cerebrovascular disease patients (age 70±13 years, male 57.5%), time from onset to prehospital LAMS was median 30 minutes (interquartile range 20-50), onset to early postarrival (EPA) LAMS was 145 minutes (interquartile range 119-180), and onset to EPA National Institutes of Health Stroke Scale was 150 minutes (interquartile range 120-180). Between the prehospital and EPA assessments, LAMS scores were stable in 40.5%, improved in 37.6%, and worsened in 21.9%. In tests of convergent validity, against the EPA National Institutes of Health Stroke Scale, correlations were r=0.49 for the prehospital LAMS and r=0.89 for the EPA LAMS. Prehospital LAMS scores did diverge from the prehospital Glasgow Coma Scale, r=-0.22. Predictive accuracy (adjusted C statistics) for nondisabled 3-month outcome was as follows: prehospital LAMS, 0.76 (95% confidence interval 0.74-0.78); EPA LAMS, 0.85 (95% confidence interval 0.83-0.87); and EPA National Institutes of Health Stroke Scale, 0.87 (95% confidence interval 0.85-0.88). In this multicenter, prospective, prehospital study, the LAMS showed good to excellent convergent, divergent, and predictive validity, further establishing it as a validated instrument to characterize stroke severity in the field. © 2017 American Heart Association, Inc.
Goei, Dustin; van Kuijk, Jan-Peter; Flu, Willem-Jan; Hoeks, Sanne E; Chonchol, Michel; Verhagen, Hence J M; Bax, Jeroen J; Poldermans, Don
2011-02-15
Plasma N-terminal pro-B-type natriuretic peptide (NT-pro-BNP) levels improve preoperative cardiac risk stratification in vascular surgery patients. However, single preoperative measurements of NT-pro-BNP cannot take into account the hemodynamic stress caused by anesthesia and surgery. Therefore, the aim of the present study was to assess the incremental predictive value of changes in NT-pro-BNP during the perioperative period for long-term cardiac mortality. Detailed cardiac histories, rest left ventricular echocardiography, and NT-pro-BNP levels were obtained in 144 patients before vascular surgery and before discharge. The study end point was the occurrence of cardiovascular death during a median follow-up period of 13 months (interquartile range 5 to 20). Preoperatively, the median NT-pro-BNP level in the study population was 314 pg/ml (interquartile range 136 to 1,351), which increased to a median level of 1,505 pg/ml (interquartile range 404 to 6,453) before discharge. During the follow-up period, 29 patients (20%) died, 27 (93%) from cardiovascular causes. The median difference in NT-pro-BNP in the survivors was 665 pg/ml, compared to 5,336 pg/ml in the patients who died (p = 0.01). Multivariate Cox regression analyses, adjusted for cardiac history and cardiovascular risk factors (age, angina pectoris, myocardial infarction, stroke, diabetes mellitus, renal dysfunction, body mass index, type of surgery, and left ventricular ejection fraction), demonstrated that the difference in NT-pro-BNP level between pre- and postoperative measurement was the strongest independent predictor of cardiac outcome (hazard ratio 3.06, 95% confidence interval 1.36 to 6.91). In conclusion, the change in NT-pro-BNP, indicated by repeated measurements before surgery and before discharge, is the strongest predictor of cardiac outcomes in patients who undergo vascular surgery. Copyright © 2011 Elsevier Inc. All rights reserved.
Evidence for activation of nuclear factor kappaB in obstructive sleep apnea.
Yamauchi, Motoo; Tamaki, Shinji; Tomoda, Koichi; Yoshikawa, Masanori; Fukuoka, Atsuhiko; Makinodan, Kiyoshi; Koyama, Noriko; Suzuki, Takahiro; Kimura, Hiroshi
2006-12-01
Obstructive sleep apnea (OSA) is a risk factor for atherosclerosis, and atherosclerosis evolves from activation of the inflammatory cascade. We propose that activation of nuclear factor kappaB (NF-kappaB), a key transcription factor in the inflammatory cascade, occurs in OSA. Nine age-matched, nonsmoking, and non-hypertensive men with OSA symptoms and seven similar healthy subjects were recruited for standard polysomnography followed by the collection of blood samples for monocyte nuclear p65 concentrations (OSA and healthy groups). In the OSA group, p65 and monocyte production of tumor necrosis factor alpha (TNF-alpha) were measured at the same time and again after the next night of continuous positive airway pressure (CPAP). Concentrations of p65 in the OSA group were significantly higher than in the control group [median, 0.037 ng/μl (interquartile range, 0.034 to 0.051) vs 0.019 ng/μl (interquartile range, 0.013 to 0.032); p = 0.008], and in the OSA group were significantly correlated with the apnea-hypopnea index and time spent below an oxygen saturation of 90% (r = 0.77 and 0.88, respectively) after adjustment for age and BMI. One night of CPAP resulted in a reduction in p65 [to 0.020 ng/μl (interquartile range, 0.010 to 0.036), p = 0.04] and in levels of TNF-alpha production in cultured monocytes [from 16.26 ng/ml (interquartile range, 7.75 to 24.85) to 7.59 ng/ml (interquartile range, 5.19 to 12.95), p = 0.01]. NF-kappaB activation occurs with sleep-disordered breathing. Such activation of NF-kappaB may contribute to the pathogenesis of atherosclerosis in OSA patients.
Central obesity, leptin and cognitive decline: the Sacramento Area Latino Study on Aging.
Zeki Al Hazzouri, Adina; Haan, Mary N; Whitmer, Rachel A; Yaffe, Kristine; Neuhaus, John
2012-01-01
Central obesity is a risk factor for cognitive decline. Leptin is secreted by adipose tissue and has been associated with better cognitive function. Aging Mexican Americans have higher levels of obesity than non-Hispanic Whites, but no investigations have examined the relationship between leptin and cognitive decline among them or the role of central obesity in this association. We analyzed 1,480 dementia-free older Mexican Americans who were followed over 10 years. Cognitive function was assessed every 12-15 months with the Modified Mini Mental State Exam (3MSE) and the Spanish and English Verbal Learning Test (SEVLT). For females with a small waist circumference (≤35 inches), an interquartile range difference in leptin was associated with 35% fewer 3MSE errors and 22% less decline in the SEVLT score over 10 years. For males with a small waist circumference (≤40 inches), an interquartile range difference in leptin was associated with 44% fewer 3MSE errors and 30% less decline in the SEVLT score over 10 years. There was no association between leptin and cognitive decline among females or males with a large waist circumference. Leptin interacts with central obesity in shaping cognitive decline. Our findings provide valuable information about the effects of metabolic risk factors on cognitive function. Copyright © 2012 S. Karger AG, Basel.
Stephens, John R; Steiner, Michael J; DeJong, Neal; Rodean, Jonathan; Hall, Matt; Richardson, Troy; Berry, Jay G
2018-01-01
We studied constipation-related health care among children before and after constipation admission. Index admissions for constipation in 2010-2011 were identified in the Truven Marketscan Database, which includes children receiving Medicaid in 10 states. We measured number of and spending for outpatient constipation visits 12 months before and after index hospitalizations. We also measured spending for constipation hospitalizations and rehospitalization rate. There were 780 index constipation admissions. The median number of outpatient constipation visits was 1 (interquartile range [IQR] = 0, 3) in the 12 months before and 2 (IQR = 0, 4) after admission (P = .001). Median outpatient spending for constipation was $110 (IQR = 0, 429) before and $132 (IQR = 0, 431) after admission (P = .2). Median spending for index constipation admissions was $5295 (IQR = 2756, 8267); 78 children (10%) were rehospitalized for constipation within 12 months. Constipation-related health care utilization increased after constipation admission. Median spending for one constipation admission was 50 times the median spending for 12 months of outpatient constipation visits.
Wayne, Erik J; Edwards, Matthew S; Stafford, Jeanette M; Hansen, Kimberley J; Corriere, Matthew A
2014-08-01
Renal artery aneurysms (RAAs) are uncommon, and rates of growth and rupture are unknown. Limited evidence therefore exists to guide clinical management of RAAs, particularly small aneurysms that are asymptomatic. To further characterize the natural history of RAAs, we studied anatomic characteristics and changes in diameter during imaging surveillance. Patients evaluated for native RAAs at a single institution during a 5-year period (July 2008 to July 2013) were identified and analyzed retrospectively. Patients with two or more cross-sectional imaging studies (computed tomography or magnetic resonance imaging) more than 1 month apart were included. Demographic and clinical data were collected from medical records, and anatomic data (including aneurysm diameter, calcification, and location) were obtained from electronic images. Changes in RAA diameters over time were evaluated by plots and Wilcoxon signed rank tests. Sixty-eight RAAs in 55 patients were analyzed. Median follow-up was 19.4 months (interquartile range, 11.2-49.0 months). Mean age at presentation was 61.8 ± 9.8 years, and 73% of patients were women. Hypertension was prevalent among 73% of patients. Multiple RAAs were present in 18% of patients, and 24% also had arterial aneurysms of other splanchnic or iliac vessels. The majority of RAAs were calcified and located at the main renal artery bifurcation. Mean initial aneurysm diameter was 16.0 ± 6.4 mm. Median annualized growth rate was 0.06 mm (interquartile range, -0.07 to 0.33 mm; P = .11). No RAA ruptures or acute symptoms occurred during surveillance, and 10.3% of RAAs were repaired electively. Risk of short-term RAA growth or rupture was low. These findings suggest that annual (or less frequent) imaging surveillance is safe in the majority of patients and do not support pre-emptive repair of asymptomatic, small-diameter RAAs. Copyright © 2014 Society for Vascular Surgery. Published by Mosby, Inc. All rights reserved.
Blumenthal, Kimberly G; Shenoy, Erica S; Varughese, Christy A; Hurwitz, Shelley; Hooper, David C; Banerji, Aleena
2015-10-01
Self-reported penicillin allergy infrequently reflects an inability to tolerate penicillins. Inpatients reporting penicillin allergy receive alternative antibiotics that might be broader spectrum, more toxic, or less effective. To develop and assess a clinical guideline for the general inpatient provider that directs taking a history and prescribing antibiotics for patients with penicillin or cephalosporin allergy. A guideline was implemented to assist providers with assessing allergy history and prescribing antibiotics for patients with reported penicillin or cephalosporin allergy. The guideline used a standard 2-step graded challenge or test dose. A quasi-experimental study was performed to assess safety, feasibility, and impact on antibiotic use by comparing treatment 21 months before guideline implementation with 12 months after guideline implementation. Significantly more test doses to β-lactam antibiotics were performed monthly after vs before guideline implementation (median 14.5, interquartile range 13-16.25, vs 2, interquartile range 1-3.25, P < .001). Seven adverse drug reactions occurred during guideline-driven test doses, with no significant difference in rate (3.9% vs 6.1%, P = .44) or severity (P > .5) between periods. Guideline-driven test doses decreased alternative antimicrobial therapy after the test dose, including vancomycin (68.3% vs 37.2%, P < .001), aztreonam (11.5% vs 0.5%, P < .001), aminoglycosides (6.0% vs 1.1%, P = .004), and fluoroquinolones (15.3% vs 3.3%, P < .001). The implementation of an inpatient antibiotic prescribing guideline for patients with penicillin or cephalosporin allergy was associated with an almost 7-fold increase in the number of test doses to β-lactams without increased adverse drug reactions. Patients assessed with guideline-driven test doses were observed to have significantly decreased alternative antibiotic exposure. Copyright © 2015 American College of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.
Bartko, Philipp E; Wiedemann, Dominik; Schrutka, Lore; Binder, Christina; Santos-Gallego, Carlos G; Zuckermann, Andreas; Steinlechner, Barbara; Koinig, Herbert; Heinz, Gottfried; Niessner, Alexander; Zimpfer, Daniel; Laufer, Günther; Lang, Irene M; Distelmaier, Klaus; Goliasch, Georg
2017-07-28
Extracorporeal membrane oxygenation following cardiac surgery safeguards end-organ oxygenation but unfavorably alters cardiac hemodynamics. Along with the detrimental effects of cardiac surgery to the right heart, this might impact outcome, particularly in patients with preexisting right ventricular (RV) dysfunction. We sought to determine the prognostic impact of RV function and to improve established risk-prediction models in this vulnerable patient cohort. Of 240 patients undergoing extracorporeal membrane oxygenation support following cardiac surgery, 111 had echocardiographic examinations at our institution before implantation of extracorporeal membrane oxygenation and were thus included. Median age was 67 years (interquartile range 60-74), and 74 patients were male. During a median follow-up of 27 months (interquartile range 16-63), 75 patients died. Fifty-one patients died within 30 days, 75 during long-term follow-up (median follow-up 27 months, minimum 5 months, maximum 125 months). Metrics of RV function were the strongest predictors of outcome, even stronger than left ventricular function (P<0.001 for receiver operating characteristic comparisons). Specifically, RV free-wall strain was a powerful predictor univariately and after adjustment for clinical variables, Simplified Acute Physiology Score-3, tricuspid regurgitation, surgery type and duration, with adjusted hazard ratios of 0.41 (95% CI 0.24-0.68; P=0.001) for 30-day mortality and 0.48 (95% CI 0.33-0.71; P<0.001) for long-term mortality for a 1-SD (SD=-6%) change in RV free-wall strain. Combined assessment of the additive EuroSCORE and RV free-wall strain improved risk classification by a net reclassification improvement of 57% for 30-day mortality (P=0.01) and 56% for long-term mortality (P=0.02) compared with the additive EuroSCORE alone. RV function is strongly linked to mortality, even after adjustment for baseline variables and clinical risk scores. RV performance improves established risk prediction models for short- and long-term mortality. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
Magnetic versus manual catheter navigation for ablation of free wall accessory pathways in children.
Kim, Jeffrey J; Macicek, Scott L; Decker, Jamie A; Kertesz, Naomi J; Friedman, Richard A; Cannon, Bryan C
2012-08-01
Transcatheter ablation of accessory pathway (AP)-mediated tachycardia is routinely performed in children. Few data exist regarding the use of magnetic navigation (MN) and its potential benefits for ablation of AP-mediated tachycardia in this population. We performed a retrospective review of prospectively gathered data in children undergoing radiofrequency ablation at our institution since the installation of MN (Stereotaxis Inc, St. Louis, MO) in March 2009. The efficacy and safety of an MN-guided approach and standard manual techniques for mapping and ablation of AP-mediated tachycardia were compared. During the 26-month study period, 145 patients underwent radiofrequency ablation for AP-mediated tachycardia. Seventy-three patients were ablated with MN and 72 with a standard manual approach. There were no significant differences in demographic factors between the 2 groups, with a mean cohort age of 13.1±4.0 years. Acute success rates were equivalent, with 68 of 73 (93.2%) patients in the MN group being successfully ablated versus 68 of 72 (94.4%) patients in the manual group (P=0.889). During a median follow-up of 21.4 months, there were no recurrences in the MN group and 2 recurrences in the manual group (P=0.388). There were no differences in time to effect, number of lesions delivered, or average ablation power. There was also no difference in total procedure time, but fluoroscopy time was significantly reduced in the MN group at 14.0 (interquartile range, 3.8-23.9) minutes compared with the manual group at 28.1 (interquartile range, 15.3-47.3) minutes (P<0.001). There were no complications in either group. MN is a safe and effective approach to ablate AP-mediated tachycardia in children.
Kluth, Luis A; Ernst, Lukas; Vetterlein, Malte W; Meyer, Christian P; Reiss, C Philip; Fisch, Margit; Rosenbaum, Clemens M
2017-08-01
To determine success rates, predictors of recurrence, and recurrence management in patients treated for short anterior urethral strictures by direct vision internal urethrotomy (DVIU). We identified 128 patients who underwent DVIU of the anterior urethra between December 2009 and March 2016. Follow-up was conducted by telephone interviews. Success rates were assessed by Kaplan-Meier estimators. Predictors of stricture recurrence and different further therapy strategies were identified by uni- and multivariable Cox regression analyses. The mean age was 63.8 years (standard deviation: 16.3) and the overall success rate was 51.6% (N = 66) at a median follow-up of 16 months (interquartile range: 6-43). Median time to stricture recurrence was six months (interquartile range: 2-12). In uni- and multivariable analyses, only repeat DVIU (hazard ratio [HR] = 1.87, 95% confidence interval [CI] = 1.13-3.11, P = .015; and HR = 1.78, 95% CI = 1.05-3.03, P = .032, respectively) was a risk factor for recurrence. Of 62 patients with recurrence, 35.5% underwent urethroplasty, 29% underwent further endoscopic treatment, and 33.9% did not undergo further interventional therapy. Age (HR = 1.05, 95% CI = 1.01-1.09, P = .019) and diabetes (HR = 2.90, 95% CI = 1.02-8.26, P = .047) were predictors of no further interventional therapy. DVIU seems justifiable as a primary treatment for short urethral strictures. Prior DVIU was a risk factor for recurrence. In the case of recurrence, about one-third of the patients did not undergo any further therapy. Higher age and diabetes predicted forgoing any further interventional treatment. Copyright © 2017 Elsevier Inc. All rights reserved.
Freitas, Amanda Souza; Simoneti, Christian Silva; Ferraz, Erica; Bagatin, Ericson; Brandão, Izaira Tincani; Silva, Celio Lopes; Borges, Marcos Carvalho; Vianna, Elcio Oliveira
2016-05-06
Endotoxin from Gram-negative bacteria is found in different concentrations in dust and on the floors of laboratories dealing with small animals and of animal houses. Cross-sectional study performed in workplaces of two universities. Dust samples were collected from laboratories and animal facilities housing rats, mice, guinea pigs, rabbits or hamsters and analyzed by the "Limulus amebocyte lysate" (LAL) method. We also sampled workplaces without animals. The concentrations of endotoxin detected in the workplaces were tested for association with wheezing in the last 12 months, asthma defined by self-reported diagnosis, and asthma confirmed by bronchial hyperresponsiveness (BHR) to mannitol. Dust samples were obtained at 145 workplaces, 92 with exposure to animals and 53 with no exposure. The exposed group comprised 412 subjects and the non-exposed group 339 subjects. Animal-exposed workplaces had higher concentrations of endotoxin, median of 34.2 endotoxin units (EU) per mg of dust (interquartile range, 12.6-65.4), as compared to the non-exposed group, median of 10.2 EU/mg of dust (interquartile range, 2.6-22.2) (p < 0.001). A high concentration of endotoxin (above the whole-sample median of 20.4 EU/mg) was associated with increased wheezing prevalence (p < 0.001), i.e., 61% of workers exposed to high endotoxin concentrations reported wheezing in the last 12 months compared to 29% of workers exposed to low endotoxin concentrations. The concentration of endotoxin was not associated with asthma report or with BHR-confirmed asthma. Exposure to endotoxin is associated with a higher prevalence of wheezing, but not with asthma as defined by the mannitol bronchial challenge test or by self-reported asthma. Preventive measures are necessary for these workers.
Mechanical Thrombectomy in Perioperative Strokes: A Case-Control Study.
Premat, Kévin; Clovet, Olivier; Frasca Polara, Giulia; Shotar, Eimad; Bartolini, Bruno; Yger, Marion; Di Maria, Federico; Baronnet, Flore; Pistocchi, Silvia; Le Bouc, Raphaël; Pires, Christine; Sourour, Nader; Alamowitch, Sonia; Samson, Yves; Degos, Vincent; Clarençon, Frédéric
2017-11-01
Perioperative strokes (POS) are rare but serious complications for which mechanical thrombectomy could be beneficial. We aimed to compare the technical results and patient outcomes in a population of POS versus non-POS (nPOS) treated by mechanical thrombectomy. From 2010 to 2017, 25 patients with POS (ie, acute ischemic stroke occurring during or within 30 days after a procedure) who underwent mechanical thrombectomy (POS group) were enrolled and paired with 50 consecutive patients with nPOS (control group), based on the occlusion site, National Institutes of Health Stroke Scale, and age. Respectively, mean age was 68.3±16.6 versus 67.2±16.6 years (P=0.70), and median National Institutes of Health Stroke Scale score at admission was 20 (interquartile range, 15-25) versus 19 (interquartile range, 17-25; P=0.79). Good clinical outcome (modified Rankin Scale score of 0-2 at 3 months) was achieved by 33.3% (POS) versus 56.5% (nPOS) of patients (P=0.055). Successful reperfusion (modified Thrombolysis In Cerebral Infarction score of ≥2b) was obtained in 76% (POS) versus 86% (nPOS) of cases (P=0.22). Mortality at 3 months was 33.3% in the POS group versus 4.2% (nPOS) (P=0.002). The rate of major procedural complications was 4% (POS) versus 6% (nPOS); none were lethal. Average time from symptom onset to reperfusion was 4.9 hours (±2.0) in POS versus 5.2 hours (±2.6) in nPOS. Successful reperfusion seems achievable in POS within a reasonable amount of time and with a good level of safety. However, favorable outcome was achieved at a lower rate than in nPOS, owing to a higher mortality rate. © 2017 American Heart Association, Inc.
Lin, Yucong; Xu, Xijin; Dai, Yifeng; Zhang, Yuling; Li, Weiqiu; Huo, Xia
2016-12-15
Data on vaccination effects in children chronically exposed to heavy metals are extremely scarce. This study aims to investigate the immune responsiveness to measles, mumps, and rubella (MMR) vaccination in children from an e-waste recycling area. A total of 378 healthy children from Guiyu (exposed group) and Haojiang (reference group) were surveyed. Blood lead (Pb) levels were measured by graphite furnace atomic absorption spectrometry. Titers of antibodies against MMR were quantified by ELISA. Blood Pb levels of children from the exposed group were significantly higher than those from the reference group (5.61 μg/dL vs. 3.57 μg/dL, p<0.001). In contrast, the antibody titers against MMR of children from the exposed group were significantly lower than those from the reference group. The median titer of the anti-measles antibody in the exposed group was 669.64 mIU/mL (interquartile range 372.88-1068.42 mIU/mL), a decrease of nearly 40% compared with the reference group (median 1046.79 mIU/mL, interquartile range 603.29-1733.10 mIU/mL). For antibody titers against mumps, there was a decrease of about 45% in the exposed group (median 272.24 U/mL, interquartile range 95.19-590.16 U/mL) compared with the reference group (median 491.78 U/mL, interquartile range 183.38-945.96 U/mL). For rubella, the median antibody titer was also significantly lower in the exposed group (37.08 IU/mL, interquartile range 17.67-66.66 IU/mL) than in the reference group (66.50 IU/mL, interquartile range 25.32-105.59 IU/mL); the decrease in this case was nearly 44%. The proportion of children whose antibody titers against MMR were below the protective level was higher in the exposed group than in the reference group. The present study demonstrates that the immune responsiveness to routine vaccination was suppressed in children chronically exposed to lead. Thus, vaccination strategies for children living in this e-waste recycling area should be modified. Copyright © 2016. Published by Elsevier B.V.
Grignard, Lynn; Gonçalves, Bronner P; Early, Angela M; Daniels, Rachel F; Tiono, Alfred B; Guelbéogo, Wamdaogo M; Ouédraogo, Alphonse; van Veen, Elke M; Lanke, Kjerstin; Diarra, Amidou; Nebie, Issa; Sirima, Sodiomon B; Targett, Geoff A; Volkman, Sarah K; Neafsey, Daniel E; Wirth, Dyann F; Bousema, Teun; Drakeley, Chris
2018-05-05
Plasmodium falciparum malaria infections often comprise multiple distinct parasite clones. Few datasets have directly assessed infection complexity in humans and in the mosquitoes they infect. Examining parasites with molecular tools may provide insights into the selective transmissibility of isolates. Using capillary electrophoresis genotyping and next-generation amplicon sequencing, we analysed the complexity of parasite infections in human blood and in the midguts of mosquitoes that became infected in membrane-feeding experiments using the same blood material in two West African settings. Median numbers of clones in humans and mosquitoes were higher in samples from Burkina Faso (4.5, interquartile range 2-8 for humans; 2, interquartile range 1-3 for mosquitoes) than in The Gambia (2, interquartile range 1-3, and 1, interquartile range 1-3, for humans and mosquitoes, respectively). Although the median number of clones was commonly higher in human blood samples, not all transmitted alleles were detectable in the human peripheral blood. In both study sample sets, additional parasite alleles were identified in mosquitoes compared with the matched human samples (10-88.9% of all clones per feeding assay, n = 73 feeding assays). These results are likely due to preferential amplification of the most abundant clones in peripheral blood, but they confirm the presence of low-density clones that produce transmissible sexual-stage parasites. Copyright © 2018. Published by Elsevier Ltd.
Serum Fatty Acid Binding Protein 4 (FABP4) Predicts Pre-eclampsia in Women With Type 1 Diabetes.
Wotherspoon, Amy C; Young, Ian S; McCance, David R; Patterson, Chris C; Maresh, Michael J A; Pearson, Donald W M; Walker, James D; Holmes, Valerie A
2016-10-01
To examine the association between fatty acid binding protein 4 (FABP4) and pre-eclampsia risk in women with type 1 diabetes. Serum FABP4 was measured in 710 women from the Diabetes and Pre-eclampsia Intervention Trial (DAPIT) in early pregnancy and in the second trimester (median 14 and 26 weeks' gestation, respectively). FABP4 was significantly elevated in early pregnancy (geometric mean 15.8 ng/mL [interquartile range 11.6-21.4] vs. 12.7 ng/mL [interquartile range 9.6-17]; P < 0.001) and the second trimester (18.8 ng/mL [interquartile range 13.6-25.8] vs. 14.6 ng/mL [interquartile range 10.8-19.7]; P < 0.001) in women in whom pre-eclampsia later developed. Elevated second-trimester FABP4 level was independently associated with pre-eclampsia (odds ratio 2.87 [95% CI 1.24-6.68], P = 0.03). The addition of FABP4 to established risk factors significantly improved net reclassification improvement at both time points and integrated discrimination improvement in the second trimester. Increased second-trimester FABP4 independently predicted pre-eclampsia and significantly improved reclassification and discrimination. FABP4 shows potential as a novel biomarker for pre-eclampsia prediction in women with type 1 diabetes. © 2016 by the American Diabetes Association.
Tominaga, Ryoji; Sekiguchi, Miho; Yonemoto, Koji; Kakuma, Tatsuyuki; Konno, Shin-Ichi
2018-05-01
The Japanese Orthopaedic Association Back Pain Evaluation Questionnaire (JOABPEQ) was developed in 2007 and includes five domains: Pain-related disorder, Lumbar spine dysfunction, Gait disturbance, Social life disturbance, and Psychological disorder. It is used by physicians to evaluate treatment efficacy by comparing scores before and after treatment. However, the JOABPEQ does not allow the severity of a patient's condition to be evaluated against the general population at a single time point. Given the unavailability of a standard measurement of back pain, we sought to establish reference scores and interquartile ranges using data obtained from a multicenter, cross-sectional survey conducted in Japanese primary care settings. The Lumbar Spinal Stenosis Diagnosis Support Tool project was conducted from 2011 to 2012 in 1657 hospitals in Japan and was used to establish reference scores for the JOABPEQ. Patients aged ≥ 20 years undergoing medical examinations by either non-orthopaedic primary care physicians or general orthopedists were considered for enrollment. A total of 10,651 consecutive low back pain patients (5331 men, 5320 women, 18 subjects with missing sex data) who had undergone a medical examination were included. Reference scores and interquartile ranges for each of the five JOABPEQ domains according to age and sex were recorded. The median score and interquartile range were the same in the Pain-related disorder domain across all ages and sexes. The reference scores for Gait disturbance, Social life disturbance, and Psychological disorder declined with increasing age in both the age- and sex-stratified groups, while the trend in Lumbar spine dysfunction differed somewhat between men and women. Reference scores and interquartile ranges for the JOABPEQ were generated from these examination data. They provide a measurement standard against which patient perceptions of low back pain can be assessed at any time point during evaluation or therapy. Copyright © 2018 The Japanese Orthopaedic Association. Published by Elsevier B.V. All rights reserved.
Williet, Nicolas; Boschetti, Gilles; Fovet, Marion; Di Bernado, Thomas; Claudez, Pierre; Del Tedesco, Emilie; Jarlot, Camille; Rinaldi, Leslie; Berger, Anne; Phelip, Jean-Marc; Flourie, Bernard; Nancey, Stéphane; Paul, Stéphane; Roblin, Xavier
2017-11-01
We investigated whether serum trough levels of vedolizumab, a humanized monoclonal antibody against integrin α4β7, during the induction phase of treatment can determine whether patients will need additional doses (optimization of therapy) within the first 6 months. We conducted an observational study of 47 consecutive patients with Crohn's disease (CD; n = 31) or ulcerative colitis (UC; n = 16) who had not responded to 2 previous treatment regimens with antagonists of tumor necrosis factor and were starting therapy with vedolizumab at 2 hospitals in France, from June 2014 through April 2016. All patients were given a 300-mg infusion of vedolizumab at the start of the study, Week 2, Week 6, and then every 8 weeks; patients were also given corticosteroids during the first 4-6 weeks. Patients not in remission at Week 6 were given additional doses of vedolizumab at Week 10 and then every 4 weeks (extended therapy or optimization). Remission at Week 6 of treatment was defined as a CD activity score below 150 points for patients with CD and a partial Mayo Clinic score of <3 points, without concomitant corticosteroids, for patients with UC. Blood samples were collected each week, and serum levels of vedolizumab and antibodies against vedolizumab were measured using an enzyme-linked immunosorbent assay. Median trough levels of vedolizumab and interquartile ranges were compared using the nonparametric Mann-Whitney test. The primary objective was to determine whether trough levels of vedolizumab measured during the first 6 weeks of induction therapy were associated with the need for extended treatment within the first 6 months. Based on response to therapy at Week 6, extended treatment was required for 30 of the 47 patients (23 patients with CD and 7 patients with UC). At Week 2, trough levels of vedolizumab in patients selected for extended treatment were 23.0 μg/mL (interquartile range, 14.0-37.0 μg/mL), compared with 42.5 μg/mL in patients who did not receive extended treatment (interquartile range, 33.5-50.7 μg/mL; P = .15). At Week 6, trough levels of vedolizumab <18.5 μg/mL were associated with the need for extended therapy within the first 6 months (positive predictive value, 100%; negative predictive value, 46.2%; area under the receiver operating characteristic curve, 0.72). Among patients who required extended treatment at Week 10, all of those with trough levels of vedolizumab <19.0 μg/mL at Week 6 had achieved clinical remission 4 weeks later (secondary responders). In this study of patients with CD or UC receiving induction therapy with vedolizumab, low trough levels of vedolizumab at Week 6 (<19.0 μg/mL) were associated with the need for additional doses (given at Week 10 and then every 4 weeks). All patients receiving these additional doses achieved a clinical response 4 weeks later. Copyright © 2017 AGA Institute. Published by Elsevier Inc. All rights reserved.
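The week-6 cutoff above is characterized by a positive predictive value, a negative predictive value, and an area under the ROC curve. The hedged sketch below (synthetic trough levels and outcomes, not the trial data) shows how those three quantities are typically computed for a "trough below threshold predicts need for extended dosing" rule, using scikit-learn for the AUC.

```python
# Hedged sketch with simulated values; the 18.5 ug/mL cutoff mirrors the abstract,
# everything else is made up for illustration.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
needs_extension = np.array([1] * 30 + [0] * 17)              # 1 = optimized within 6 months
trough = np.where(needs_extension == 1,
                  rng.normal(23, 8, size=47),                # lower troughs in optimized patients
                  rng.normal(40, 10, size=47)).clip(min=1)   # ug/mL

cutoff = 18.5
pred_pos = trough < cutoff                                   # predicted to need extension
tp = np.sum(pred_pos & (needs_extension == 1))
fp = np.sum(pred_pos & (needs_extension == 0))
fn = np.sum(~pred_pos & (needs_extension == 1))
tn = np.sum(~pred_pos & (needs_extension == 0))
ppv = tp / (tp + fp)
npv = tn / (tn + fn)
auc = roc_auc_score(needs_extension, -trough)                # lower trough = higher risk
print(f"PPV {ppv:.2f}, NPV {npv:.2f}, AUROC {auc:.2f}")
```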
Knelson, Lauren P.; Williams, David A.; Gergen, Maria F.; Rutala, William A.; Weber, David J.; Sexton, Daniel J.; Anderson, Deverick J.
2014-01-01
A total of 1,023 environmental surfaces were sampled from 45 rooms with patients infected or colonized with methicillin-resistant Staphylococcus aureus (MRSA) or vancomycin-resistant enterococci (VRE) before terminal room cleaning. Colonized patients had higher median total target colony-forming units (CFU) of MRSA or VRE than did infected patients (median, 25 CFU [interquartile range, 0–106 CFU] vs 0 CFU [interquartile range, 0–29 CFU]; P = .033). PMID:24915217
Fontana, Marianna; Asaria, Perviz; Moraldo, Michela; Finegold, Judith; Hassanally, Khalil; Manisty, Charlotte H; Francis, Darrel P
2014-06-17
Primary prevention guidelines focus on risk, often assuming negligible aversion to medication, yet most patients discontinue primary prevention statins within 3 years. We quantify the real-world distribution of medication disutility and separately calculate the average utilities for a range of risk strata. We randomly sampled 360 members of the general public in London. Medication aversion was quantified as the gain in lifespan required by each individual to offset the inconvenience (disutility) of taking an idealized daily preventative tablet. In parallel, we constructed tables of the expected gain in lifespan (utility) from initiating statin therapy for each age group, sex, and cardiovascular risk profile in the population. This allowed comparison of the widths of the distributions of medication disutility and of group-average expected longevity gain. Observed medication disutility ranged from 1 day to more than 10 years of life required by subjects (median, 6 months; interquartile range, 1-36 months) to make daily preventative therapy worthwhile. The average expected longevity benefit from statins at ages ≥50 years ranges from 3.6 months (low-risk women) to 24.3 months (high-risk men). We can no longer assume that medication disutility is almost zero. Over one-quarter of subjects had disutility exceeding the group-average longevity gain from statins expected even for the highest-risk (ie, highest-gain) group. Future primary prevention studies might explore medication disutility in larger populations. Patients may differ more in disutility than in prospectively definable utility (which provides only group-average estimates). Consultations could be enriched by assessing disutility and exploring its reasons. © 2014 American Heart Association, Inc.
Peng, Song; Hu, Liang; Chen, Wenzhi; Chen, Jinyun; Yang, Caiyong; Wang, Xi; Zhang, Rong; Wang, Zhibiao; Zhang, Lian
2015-04-01
To investigate the value of microbubble contrast-enhanced ultrasound (CEUS) in evaluating the treatment response of uterine fibroids to high-intensity focused ultrasound (HIFU) ablation. Sixty-eight patients with a solitary uterine fibroid from the First Affiliated Hospital of Chongqing Medical University were included and analyzed. All patients underwent pre- and post-treatment magnetic resonance imaging (MRI) with a standardized protocol, as well as pre-evaluation, intraprocedural, and immediate post-treatment CEUS. CEUS and MRI were compared by different radiologists. Compared with MRI, CEUS gave similar measurements of fibroid size, fibroid volume, size of non-perfused regions, non-perfused volume (NPV), and fractional ablation (NPV ratio). On CEUS, the median volume of the fibroids was 75.2 (interquartile range, 34.2-127.3) cm³, the median non-perfused volume was 54.9 (interquartile range, 28.0-98.1) cm³, and the mean fractional ablation was 83.7±13.6 (range, 30.0-100.0)%. On MRI, the median volume of the fibroids was 74.1 (interquartile range, 33.4-116.2) cm³. On contrast-enhanced T1-weighted images obtained immediately after HIFU treatment, the median non-perfused volume was 58.5 (interquartile range, 27.7-100.0) cm³ and the mean fractional ablation was 84.2±14.2 (range, 40.0-100.0)%. CEUS clearly showed the size of the fibroids and the non-perfused areas of the fibroid. Results from CEUS correlated well with those obtained from MRI. Copyright © 2015 Elsevier B.V. All rights reserved.
Cell-Free circulating DNA: a new biomarker for the acute coronary syndrome.
Cui, Ming; Fan, Mengkang; Jing, Rongrong; Wang, Huimin; Qin, Jingfeng; Sheng, Hongzhuan; Wang, Yueguo; Wu, Xinhua; Zhang, Lurong; Zhu, Jianhua; Ju, Shaoqing
2013-01-01
In recent studies, concentrations of cell-free circulating DNA (cf-DNA) have been correlated with clinical characteristics and prognosis in several diseases. The relationship between cf-DNA concentrations and the acute coronary syndrome (ACS) remains unknown. Moreover, no data are available on the detection of cf-DNA in ACS by a branched DNA (bDNA)-based Alu assay. The aim of the present study was to investigate cf-DNA concentrations in ACS and their relationship with clinical features. Plasma cf-DNA concentrations of 137 ACS patients at diagnosis, 60 healthy individuals, and 13 patients with stable angina (SA) were determined using a bDNA-based Alu assay. ACS patients (median 2,285.0, interquartile range 916.4-4,857.3 ng/ml), and especially ST-segment elevation myocardial infarction patients (median 5,745.4, interquartile range 4,013.5-8,643.9 ng/ml), showed a significant increase in plasma cf-DNA concentrations compared with controls (healthy controls: median 118.3, interquartile range 81.1-221.1 ng/ml; SA patients: median 202.3, interquartile range 112.7-256.1 ng/ml). Moreover, we found positive correlations of cf-DNA with the Gensini score and the GRACE (Global Registry of Acute Coronary Events) score in ACS. cf-DNA may be a valuable marker for diagnosing and predicting the severity of coronary artery lesions and for risk stratification in ACS. Copyright © 2013 S. Karger AG, Basel.
Shin, Samuel M; Silverman, Joshua S; Bowden, Greg; Mathieu, David; Yang, Huai-Che; Lee, Cheng-Chia; Tam, Moses; Szelemej, Paul; Kaufmann, Anthony M; Cohen-Inbar, Or; Sheehan, Jason; Niranjan, Ajay; Lunsford, L Dade; Kondziolka, Douglas
2017-01-01
Stereotactic radiosurgery (SRS) can be used as part of multimodality management for patients with primary central nervous system lymphoma (PCNSL). The objective of this study was to evaluate outcomes of SRS for this disease. The International Gamma Knife Research Foundation identified 23 PCNSL patients who underwent SRS for either relapsed (intracerebral in-field or out-of-field tumor recurrences) or refractory disease from 1995 to 2014. All 23 patients presented with RPA class I or II PCNSL and were initially treated with a median of 7 cycles of methotrexate-based chemotherapy (range, 3-26 cycles). Ten had received prior whole brain radiation therapy (WBRT) to a median dose of 43 Gy (range, 24-55 Gy). Sixteen presented with relapsed PCNSL and seven with refractory disease. The 23 patients underwent 26 SRS procedures. The median tumor volume was 4 cm³ (range, 0.1-26 cm³), and the median margin dose was 15 Gy (range, 8-20 Gy). Median follow-up from SRS was 11 months (interquartile range, 5.7-33.2 months). Twenty patients showed a treatment response in 23 tumors (12 complete, 11 partial responses). Fourteen patients relapsed or were refractory after salvage SRS; local control was 95%, 91%, and 75% at 3, 6, and 12 months post SRS. Intracranial (in-field and out-of-field) and distant (systemic) progression-free survival was 86%, 81%, and 55% at 3, 6, and 12 months post SRS. Toxicity of SRS was low, with one patient developing an adverse radiation effect that required no additional intervention. Although methotrexate-based chemotherapy with or without WBRT is the first-line management option for PCNSL, SRS may be used as an alternative in properly selected patients with smaller relapsed or refractory PCNSL tumors.
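Local control at 3, 6, and 12 months in a series like this is usually read off a Kaplan-Meier curve with local failure as the event and censoring at last follow-up. The sketch below uses made-up follow-up times (not the registry data) and the lifelines package to illustrate the calculation.

```python
# Illustrative sketch with fabricated follow-up data; only the method is real.
import numpy as np
from lifelines import KaplanMeierFitter

months = np.array([2, 4, 5, 6, 7, 8, 9, 10, 11, 12, 14, 18, 20, 24, 30, 33, 36, 40, 48, 60])
failed = np.array([0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0])  # 1 = in-field relapse

kmf = KaplanMeierFitter()
kmf.fit(durations=months, event_observed=failed, label="local control")
print(kmf.predict([3, 6, 12]))   # estimated probability of remaining locally controlled
```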
Gao, S; Sun, F-K; Fan, Y-C; Shi, C-H; Zhang, Z-H; Wang, L-Y; Wang, K
2015-08-01
Glutathione-S-transferase P1 (GSTP1) methylation has been demonstrated to be associated with oxidative stress-induced liver damage in acute-on-chronic hepatitis B liver failure (ACHBLF). We aimed to evaluate the methylation level of the GSTP1 promoter in acute-on-chronic hepatitis B liver failure and determine its predictive value for prognosis. One hundred and five patients with acute-on-chronic hepatitis B liver failure, 86 with chronic hepatitis B (CHB) and 30 healthy controls (HC) were retrospectively enrolled. The GSTP1 methylation level in peripheral blood mononuclear cells (PBMCs) was detected by MethyLight. Clinical and laboratory parameters were obtained. GSTP1 methylation levels were significantly higher in patients with acute-on-chronic hepatitis B liver failure (median 16.84%, interquartile range 1.83-59.05%) than in those with CHB (median 1.25%, interquartile range 0.48-2.47%; P < 0.01) and HC (median 0.80%, interquartile range 0.67-1.27%; P < 0.01). In the acute-on-chronic hepatitis B liver failure group, nonsurvivors showed significantly higher GSTP1 methylation levels (P < 0.05) than survivors. The GSTP1 methylation level was significantly correlated with total bilirubin (r = 0.29, P < 0.01), prothrombin time activity (r = -0.24, P = 0.01) and model for end-stage liver disease (MELD) score (r = 0.26, P = 0.01). When used to predict 1- or 2-month mortality of acute-on-chronic hepatitis B liver failure, GSTP1 methylation showed significantly better predictive value than the MELD score [area under the receiver operating characteristic curve (AUC) 0.89 vs. 0.72, P < 0.01; AUC 0.83 vs. 0.70, P < 0.05, respectively]. Meanwhile, patients with GSTP1 methylation levels above the cut-off points showed significantly poorer survival than those with levels below them (P < 0.05). Aberrant GSTP1 promoter methylation exists in acute-on-chronic hepatitis B liver failure and shows high predictive value for short-term mortality. It might serve as a potential prognostic marker for acute-on-chronic hepatitis B liver failure. © 2015 John Wiley & Sons Ltd.
Kitai, Yuichiro; Doi, Yohei; Osaki, Keisuke; Sugioka, Sayaka; Koshikawa, Masao; Sugawara, Akira
2015-12-01
Proteinuria is an established risk factor for progression of renal disease, including diabetic nephropathy. The predictive power of proteinuria, especially nephrotic range proteinuria, for progressive renal deterioration has been well demonstrated in diabetic patients with normal to relatively preserved renal function. However, little is known about the relationship between severity of proteinuria and renal outcome in pre-dialysis diabetic patients with severely impaired renal function. A total of 125 incident dialysis patients with type 2 diabetes were identified. This study retrospectively evaluated the impact of nephrotic range proteinuria (urinary protein-creatinine ratio above 3.5 g/gCr) on renal function decline during the 3 months just prior to dialysis initiation. In total, 103 patients (82.4%) had nephrotic range proteinuria. The median rate of decline in estimated glomerular filtration rate (eGFR) in this study population was 0.98 (interquartile range 0.51-1.46) ml/min/1.73 m² per month. Compared to patients without nephrotic range proteinuria, patients with nephrotic range proteinuria showed significantly faster renal function decline (0.46 [0.24-1.25] versus 1.07 [0.64-1.54] ml/min/1.73 m² per month; p = 0.007). After adjusting for gender, age, systolic blood pressure, serum albumin, calcium-phosphorus product, hemoglobin A1c, and use of an angiotensin-converting enzyme inhibitor or an angiotensin II receptor blocker, patients with nephrotic range proteinuria showed a 3.89-fold (95% CI 1.08-14.5) increased risk for rapid renal function decline, defined as a decline in eGFR ≥0.5 ml/min/1.73 m² per month. Nephrotic range proteinuria is the predominant renal risk factor in type 2 diabetic patients with severely impaired renal function receiving pre-dialysis care.
Gupta, Ayush; Kapil, Arti; Kabra, S K; Lodha, Rakesh; Sood, Seema; Dhawan, Benu; Das, Bimal K; Sreenivas, V
2013-12-01
Healthcare associated infections (HAIs) are responsible for morbidity and mortality among immunocompromised and critically ill patients. We undertook this study to estimate the burden of HAIs in paediatric cancer patients in a tertiary care hospital in north India. This prospective, observational study, based on active surveillance over a period of 11 months, was undertaken in a 4-bedded isolated cubicle for paediatric cancer patients. Patients who stayed in the cubicle for ≥48 h were followed prospectively for the development of HAIs. Of the 138 patients, 13 developed 14 episodes of HAIs during the study period. The total number of patient-days was 1273. The crude infection rate (CIR) and incidence density (ID) of all HAIs were 9.4/100 patients and 11/1000 patient-days, respectively. Of the 14 episodes of HAIs, seven (50%) were blood stream infections (HA-BSI), five (36%) pneumonia (HAP) and two (14%) urinary tract infections (HA-UTI). The CIRs of HA-BSI, HAP and HA-UTI were 5.1, 3.6 and 1.4/100 patients, respectively. The corresponding IDs were 5.5, 3.9 and 1.6/1000 patient-days, respectively. Mean length of stay was significantly longer in patients who developed an HAI [13.8 days (range 7-30), median (interquartile range) 12 (11-14)] than in those who did not [7.5 days, range 2-28, median (interquartile range) 7 (5-9); P<0.0001]. Mortality was also significantly higher in patients who developed an HAI [23% (3/13) vs 3% (4/125), P<0.05]. The incidence of HAIs in the paediatric cancer patients in this study was 11/1000 patient-days, of which HA-BSIs were the commonest. HAIs were associated with increased morbidity and mortality in this high-risk patient population.
Daniels, Tracey; Goodacre, Lynne; Sutton, Chris; Pollard, Kim; Conway, Steven; Peckham, Daniel
2011-08-01
People with cystic fibrosis have a high treatment burden. While uncertainty remains about individual patients' levels of adherence to medication, treatment regimens are difficult to tailor and interventions are difficult to evaluate. Self- and clinician-reported measures are routinely used despite criticism that they overestimate adherence. This study assessed agreement between rates of adherence to prescribed nebulizer treatments measured by self-report, clinician report, and electronic monitoring suitable for long-term use. Seventy-eight adults with cystic fibrosis were questioned about their adherence to prescribed nebulizer treatments over the previous 3 months. Self-report was compared with clinician report and with stored adherence data downloaded from the I-Neb nebulizer system. Adherence measures were expressed as a percentage of the prescribed regimen, and bias was estimated as the paired difference in means (95% CI) between patient- or clinician-reported adherence and actual adherence. Agreement between adherence measures was calculated using intraclass correlation coefficients (95% CI), and disagreements for individuals were displayed using Bland-Altman plots. Patient-identified prescriptions matched the medical record prescription. Median self-reported adherence was 80% (interquartile range, 60%-95%), whereas median adherence measured by nebulizer download was 36% (interquartile range, 5%-84.5%). Nine participants overmedicated and underreported adherence. Median clinician report ranged from 50% to 60%, depending on profession. Extensive discrepancies between self-report and clinician report compared with nebulizer download were identified for individuals. Self- and clinician-reporting of adherence does not provide an accurate measurement of adherence when compared with electronic monitoring. Using inaccurate measures has implications for treatment burden, clinician prescribing practices, cost, and accuracy of trial data.
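The Bland-Altman analysis mentioned above reduces to simple arithmetic on the paired differences: the bias is their mean, and the 95% limits of agreement are the bias plus or minus 1.96 standard deviations. A minimal sketch with toy adherence percentages (not the study data):

```python
# Toy example only; the adherence values are invented for illustration.
import numpy as np

self_report = np.array([95, 80, 100, 60, 75, 90, 50, 85, 70, 100], dtype=float)   # % adherence, self-report
downloaded  = np.array([40, 75,  90, 10, 30, 85,  5, 60, 20,  95], dtype=float)   # % adherence, nebulizer log

diff = self_report - downloaded                     # paired differences
bias = diff.mean()                                  # mean over-reporting
loa_low = bias - 1.96 * diff.std(ddof=1)            # lower 95% limit of agreement
loa_high = bias + 1.96 * diff.std(ddof=1)           # upper 95% limit of agreement
print(f"bias {bias:.1f}%, 95% limits of agreement {loa_low:.1f}% to {loa_high:.1f}%")
```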
Ghosh, Jo Kay C; Wilhelm, Michelle; Su, Jason; Goldberg, Daniel; Cockburn, Myles; Jerrett, Michael; Ritz, Beate
2012-06-15
Few studies have examined associations of birth outcomes with toxic air pollutants (air toxics) in traffic exhaust. This study included 8,181 term low birth weight (LBW) children and 370,922 term normal-weight children born between January 1, 1995, and December 31, 2006, to women residing within 5 miles (8 km) of an air toxics monitoring station in Los Angeles County, California. Additionally, land-use-based regression (LUR)-modeled estimates of levels of nitric oxide, nitrogen dioxide, and nitrogen oxides were used to assess the influence of small-area variations in traffic pollution. The authors examined associations with term LBW (≥37 weeks' completed gestation and birth weight <2,500 g) using logistic regression adjusted for maternal age, race/ethnicity, education, parity, infant gestational age, and gestational age squared. Odds of term LBW increased 2%-5% (95% confidence intervals ranged from 1.00 to 1.09) per interquartile-range increase in LUR-modeled estimates and monitoring-based air toxics exposure estimates in the entire pregnancy, the third trimester, and the last month of pregnancy. Models stratified by monitoring station (to investigate air toxics associations based solely on temporal variations) resulted in 2%-5% increased odds per interquartile-range increase in third-trimester benzene, toluene, ethyl benzene, and xylene exposures, with some confidence intervals containing the null value. This analysis highlights the importance of both spatial and temporal contributions to air pollution in epidemiologic birth outcome studies.
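Odds ratios expressed "per interquartile-range increase" in exposure, as in the analysis above, are usually obtained by rescaling the exposure by its IQR before fitting the logistic model, so that exp(beta) corresponds to an IQR-sized increase. A hedged sketch with simulated data and hypothetical variable names (statsmodels, not the study's actual model specification):

```python
# Simulated data; variable names (benzene, mat_age, term_lbw) are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 5000
df = pd.DataFrame({
    "benzene": rng.gamma(shape=2.0, scale=1.5, size=n),   # third-trimester exposure, arbitrary units
    "mat_age": rng.normal(29, 6, size=n),                 # maternal age
})
logit_p = -4 + 0.15 * df["benzene"] + 0.01 * (df["mat_age"] - 29)
df["term_lbw"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))   # simulated term LBW outcome

iqr = df["benzene"].quantile(0.75) - df["benzene"].quantile(0.25)
df["benzene_iqr"] = df["benzene"] / iqr                   # one unit = one IQR of exposure
fit = smf.logit("term_lbw ~ benzene_iqr + mat_age", data=df).fit(disp=0)
print("OR per IQR:", np.exp(fit.params["benzene_iqr"]),
      "95% CI:", np.exp(fit.conf_int().loc["benzene_iqr"]).values)
```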
Bossi, Matteo; Tozzi, Matteo; Franchin, Marco; Ferraro, Stefania; Rivolta, Nicola; Ferrario, Massimo; Guttadauro, Chiara; Castelli, Patrizio; Piffaretti, Gabriele
2017-12-25
Background: This study aimed to present cases with cryopreserved human allografts (CHAs) for vascular reconstruction in both aortic and peripheral infected prosthetic grafts. Materials and Methods: This is a single center, observational descriptive study with retrospective analysis. In all cases, the infected prosthetic graft material was completely removed. At discharge, patients were administered anticoagulants. Follow-up examinations included clinical visits, echo-color-Doppler ultrasounds, or computed tomography angiography within 30 days and at 3, 6, and 12 months after the treatment, and then twice per year. Results: We treated 21 patients (90% men, n=19) with the mean age of 71±12 years and mean interval between the initial operation and replacement with CHA of 30 months [range, 1-216; interquartile range (IQR), 2-36]. In-hospital mortality was 14% (n=3); no CHA-related complication led to death. Limb salvage was 100%. No patient was lost at the median follow-up of 14 months (range, 2-61; IQR, 6-39). No rupture, aneurysmal degeneration, or re-infection occurred. Estimated freedom from CHA-related adverse events (95% confidence interval, 43-63) was 95% at 3 years. Conclusion: In our experience, CHAs are a viable option for prosthetic graft infections and provide satisfactory clinical results and favorable stability because of a very low rate of CHA-related adverse events during follow-up.
Spagnuolo, Vincenzo; Travi, Giovanna; Galli, Laura; Cossarini, Francesca; Guffanti, Monica; Gianotti, Nicola; Salpietro, Stefania; Lazzarin, Adriano; Castagna, Antonella
2013-08-01
The objective of this study was to compare immunologic, virologic, and clinical outcomes between living human immunodeficiency virus (HIV)-infected individuals who had a diagnosis of lymphoma versus outcomes in a control group of cancer-free, HIV-infected patients. In this matched cohort study, patients in the case group were survivors of incident lymphomas that occurred between 1997 and June 2010. Controls were living, cancer-free, HIV-infected patients who were matched to cases at a 4:1 ratio by age, sex, nadir CD4 cell count, and year of HIV diagnosis. The date of lymphoma diagnosis served as the baseline in cases and in the corresponding controls. In total, 62 patients (cases) who had lymphoma (20 with Hodgkin disease [HD] and 42 with non-Hodgkin lymphoma [NHL]) were compared with 211 controls. The overall median follow-up was 4.8 years (interquartile range, 2.0-7.9 years). The CD4 cell count at baseline was 278 cells/mm³ (interquartile range, 122-419 cells/mm³) in cases versus 421 cells/mm³ (interquartile range, 222-574 cells/mm³) in controls (P = .003). At the last available visit, the CD4 cell count was 412 cells/mm³ (range, 269-694 cells/mm³) in cases versus 518 cells/mm³ (interquartile range, 350-661 cells/mm³) in controls (P = .087). The proportion of patients who achieved virologic success increased from 30% at baseline to 74% at the last available visit in cases (P = .008) and from 51% to 81% in controls (P = .0286). Patients with HD reached higher CD4 cell counts at their last visit than patients with NHL (589 cells/mm³ [range, 400-841 cells/mm³] vs 332 cells/mm³ [interquartile range, 220-530 cells/mm³], respectively; P = .003). Virologic success was similar between patients with HD and patients with NHL at the last visit. Forty cases (65%) and 76 controls (36%) experienced at least 1 clinical event after baseline (P < .0001); cases were associated with a shorter time to occurrence of the first clinical event compared with controls (P < .0001). HIV-infected lymphoma survivors experienced more clinical events than controls, especially during the first year of follow-up, but they reached similar long-term immunologic and virologic outcomes. © 2013 American Cancer Society.
Ambient air pollution, lung function, and airway responsiveness in asthmatic children.
Ierodiakonou, Despo; Zanobetti, Antonella; Coull, Brent A; Melly, Steve; Postma, Dirkje S; Boezen, H Marike; Vonk, Judith M; Williams, Paul V; Shapiro, Gail G; McKone, Edward F; Hallstrand, Teal S; Koenig, Jane Q; Schildcrout, Jonathan S; Lumley, Thomas; Fuhlbrigge, Anne N; Koutrakis, Petros; Schwartz, Joel; Weiss, Scott T; Gold, Diane R
2016-02-01
Although ambient air pollution has been linked to reduced lung function in healthy children, longitudinal analyses of pollution effects in asthmatic patients are lacking. We sought to investigate pollution effects, and their modification by controller medications, in a longitudinal asthma study. We examined associations of lung function and methacholine responsiveness (PC20) with ozone, carbon monoxide (CO), nitrogen dioxide, and sulfur dioxide concentrations in 1003 asthmatic children participating in a 4-year clinical trial. We further investigated whether budesonide and nedocromil modified pollution effects. Daily pollutant concentrations were linked to the ZIP/postal code of residence. Linear mixed models tested associations of within-subject pollutant concentrations with FEV1 and forced vital capacity (FVC) percent predicted, FEV1/FVC ratio, and PC20, adjusting for seasonality and confounders. Same-day and 1-week average CO concentrations were negatively associated with postbronchodilator percent predicted FEV1 (change per interquartile range, -0.33 [95% CI, -0.49 to -0.16] and -0.41 [95% CI, -0.62 to -0.21], respectively) and FVC (-0.19 [95% CI, -0.25 to -0.07] and -0.25 [95% CI, -0.43 to -0.07], respectively). Longer-term 4-month CO averages were negatively associated with prebronchodilator percent predicted FEV1 and FVC (-0.36 [95% CI, -0.62 to -0.10] and -0.21 [95% CI, -0.42 to -0.01], respectively). Four-month averaged CO and ozone concentrations were negatively associated with the FEV1/FVC ratio (P < .05). Increased 4-month average nitrogen dioxide concentrations were associated with reduced postbronchodilator FEV1 and FVC percent predicted. Long-term exposures to sulfur dioxide were associated with reduced PC20 (percent change per interquartile range, -6% [95% CI, -11% to -1.5%]). Treatment augmented the negative short-term CO effect on PC20. Air pollution adversely influences lung function and PC20 in asthmatic children. Treatment with controller medications might not protect against, but rather may worsen, the effects of CO on PC20. This clinical trial design evaluates modification of pollution effects by treatment without confounding by indication. Copyright © 2015 American Academy of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.
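The within-subject design described above pairs repeated pollutant measurements with repeated lung-function measurements and fits a linear mixed model with a random intercept per child, reporting the coefficient per IQR of exposure. The sketch below (simulated repeated measures and hypothetical column names, not the trial data or its full covariate set) illustrates the idea with statsmodels.

```python
# Simulated repeated-measures data; a bare-bones version of the mixed-model idea.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_child, n_visit = 200, 8
subject = np.repeat(np.arange(n_child), n_visit)
co = rng.gamma(2.0, 0.4, size=n_child * n_visit)            # same-day CO, ppm (made up)
child_effect = rng.normal(0, 6, size=n_child)[subject]      # random intercept per child
fev1_pp = 95 + child_effect - 0.4 * co + rng.normal(0, 4, size=n_child * n_visit)

df = pd.DataFrame({"subject": subject, "fev1_pp": fev1_pp, "co": co})
iqr = df["co"].quantile(0.75) - df["co"].quantile(0.25)
df["co_iqr"] = df["co"] / iqr                                # coefficient becomes "per IQR increase"

fit = smf.mixedlm("fev1_pp ~ co_iqr", data=df, groups=df["subject"]).fit()
print("change per IQR:", fit.params["co_iqr"], "95% CI:", fit.conf_int().loc["co_iqr"].values)
```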
Adil, Eelam; Robson, Caroline; Perez-Atayde, Antonio; Heffernan, Colleen; Moritz, Ethan; Goumnerova, Liliana; Rahbar, Reza
2016-09-01
To describe our experience and current management approach for congenital nasal neuroglial heterotopia (NGH) and encephaloceles. Retrospective chart review at a tertiary pediatric hospital from 1970 to 2013. Thirty patients met the inclusion criteria: 21 with NGH and 9 with encephaloceles. Data including demographics, pathology, imaging modality, surgical approach, resection extent, outcomes, and complications were analyzed. Fourteen NGH patients (67%) presented with an internal nasal mass and nasal obstruction. Three patients (14%) presented with an external nasal mass and four (19%) had a mixed lesion. Median age at surgery was 0.51 years (interquartile range 1.32 years). Thirteen (62%) underwent an intranasal endoscopic approach. Median operative time was 1.6 hours (interquartile range 1.2 hours), and there were no major complications. Nine patients with encephalocele were identified: six (67%) presented with transethmoidal encephaloceles, two (22%) with nasoethmoidal encephaloceles, and one (11%) with a nasofrontal lesion. The median age at surgery was 1.25 years (interquartile range 1.4 years). All patients required a craniotomy for intracranial extension. Median operative time was 5 hours (interquartile range 1.9 hours), and eight patients (88%) had a total resection. Length of stay ranged from 3 to 14 days. Nasal neuroglial heterotopia and encephaloceles are very rare lesions that require multidisciplinary evaluation and management. At our institution, there has been a shift to magnetic resonance imaging alone for the evaluation of NGH to avoid radiation exposure. Endoscopic extracranial resection is feasible for most intranasal and mixed NGH without an increase in operative time, residual disease, or complications. Level of evidence: 4. Laryngoscope, 126:2161-2167, 2016. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.
A multicenter study of plasma use in the United States.
Triulzi, Darrell; Gottschall, Jerome; Murphy, Edward; Wu, Yanyun; Ness, Paul; Kor, Daryl; Roubinian, Nareg; Fleischmann, Debra; Chowdhury, Dhuly; Brambilla, Donald
2015-06-01
Detailed information regarding plasma use in the United States is needed to identify opportunities for practice improvement and design of clinical trials of plasma therapy. Ten US hospitals collected detailed medical information from the electronic health records for 1 year (2010-2011) for all adult patients transfused with plasma. A total of 72,167 units of plasma were transfused in 19,596 doses to 9269 patients. The median dose of plasma was 2 units (interquartile range, 2-4; range 1-72); 15% of doses were 1 unit, and 45% were 2 units. When adjusted by patient body weight (kg), the median dose was 7.3 mL/kg (interquartile range, 5.5-12.0). The median pretransfusion international normalized ratio (INR) was 1.9 (25%-75% interquartile range, 1.6-2.6). A total of 22.5% of plasma transfusions were given to patients with an INR of less than 1.6 and 48.5% for an INR of 2.0 or more. The median posttransfusion INR was 1.6 (interquartile range, 1.4-2.0). Only 42% of plasma transfusions resulted in a posttransfusion INR of less than 1.6. Correction of INR increased as the plasma dose increased from 1 to 4 units (p < 0.001). There was no difference in the INR response to different types of plasma. The most common locations where plasma was issued were the general ward (38%) and the intensive care unit (ICU; 42%). This large database describing plasma utilization in the United States provides evidence for both inadequate dosing and unnecessary transfusion. Measures to improve plasma transfusion practice and clinical trials should be directed at patients on medical and surgical wards and in the ICU, where plasma is most commonly used. © 2014 AABB.
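The weight-adjusted dose above is plain arithmetic once a nominal plasma unit volume is assumed. The toy example below (hypothetical 70 kg patient and an assumed unit volume of about 250 mL, values not taken from the study) reproduces a figure close to the reported 7.3 mL/kg median.

```python
# Hypothetical worked example; the 250 mL unit volume and 70 kg weight are assumptions.
units_transfused = 2
ml_per_unit = 250                  # assumed nominal volume of one plasma unit
weight_kg = 70
dose_ml_per_kg = units_transfused * ml_per_unit / weight_kg
print(f"{dose_ml_per_kg:.1f} mL/kg")   # ~7.1 mL/kg, in line with the reported 7.3 mL/kg median
```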
López-Padilla, Daniel; Peghini Gavilanes, Esteban; Revilla Ostolaza, Teresa Yolanda; Trujillo, María Dolores; Martínez Serna, Iván; Arenas Valls, Nuria; Girón Matute, Walther Iván; Larrosa-Barrero, Roberto; Manrique Mutiozabal, Adriana; Pérez Gallán, Marta; Zevallos, Annette; Sayas Catalán, Javier
2016-10-01
To determine the prevalence of arterial stump thrombosis (AST) after pulmonary resection surgery for lung cancer and to describe subsequent radiological follow-up and treatment. This was an observational, descriptive study of AST detected by computed tomography angiography (CT) with intravenous contrast. Clinical and radiological variables were compared, and a survival analysis using Kaplan-Meier curves was performed after dividing patients into 3 groups: patients with AST, patients with pulmonary embolism (PE), and patients without AST or PE. Nine cases of AST were detected after a total of 473 surgeries (1.9%), 6 of them after right-sided surgeries (67% of AST cases). Median time to detection after surgery was 11.3 months (interquartile range 2.7-42.2 months), with an overall range of 67.5 months (1.4-69.0 months). Statistically significant differences were found only in the number of CTs performed in AST patients compared to those without AST or PE, and in tumor recurrence in PE patients compared to the other 2 groups. No differences were found in baseline or oncological characteristics, nor in the survival analysis. In this series, AST prevalence was low and AST tended to occur after right-sided surgeries. Detection over time was variable and unrelated to risk factors present before surgery, histopathology, or tumor stage or recurrence. AST had no impact on patient survival. Copyright © 2016 SEPAR. Published by Elsevier España, S.L.U. All rights reserved.
Utilization and Outcomes of Sentinel Lymph Node Biopsy for Vulvar Cancer.
Cham, Stephanie; Chen, Ling; Burke, William M; Hou, June Y; Tergas, Ana I; Hu, Jim C; Ananth, Cande V; Neugut, Alfred I; Hershman, Dawn L; Wright, Jason D
2016-10-01
To examine the use and predictors of sentinel node biopsy in women with vulvar cancer. The Perspective database, an all-payer database that collects data from more than 500 hospitals, was used to perform a retrospective cohort study of women with vulvar cancer who underwent vulvectomy and lymph node assessment from 2006 to 2015. Multivariable models were used to determine factors associated with sentinel node biopsy. Length of stay and cost were compared between women who underwent sentinel node biopsy and lymphadenectomy. Among 2,273 women, sentinel node biopsy was utilized in 618 (27.2%) and 1,655 (72.8%) underwent inguinofemoral lymphadenectomy. Performance of sentinel node biopsy increased from 17.0% (95% confidence interval [CI] 12.0-22.0%) in 2006 to 39.1% (95% CI 27.1-51.0%) in 2015. In a multivariable model, women treated more recently were more likely to have undergone sentinel node biopsy, whereas women with more comorbidities and those treated at rural hospitals were less likely to have undergone the procedure. The median length of stay was shorter for those undergoing sentinel node biopsy (median 2 days, interquartile range 1-3) compared with women who underwent inguinofemoral lymphadenectomy (median 3 days, interquartile range 2-4). The cost of sentinel node biopsy was $7,599 (interquartile range $5,739-9,922) compared with $8,095 (interquartile range $5,917-11,281) for lymphadenectomy. The use of sentinel node biopsy for vulvar cancer has more than doubled since 2006. Sentinel lymph node biopsy is associated with a shorter hospital stay and decreased cost compared with inguinofemoral lymphadenectomy.
Lima, Fabricio O; Furie, Karen L; Silva, Gisele S; Lev, Michael H; Camargo, Erica C S; Singhal, Aneesh B; Harris, Gordon J; Halpern, Elkan F; Koroshetz, Walter J; Smith, Wade S; Nogueira, Raul G
2014-02-01
Limited data exist regarding the natural history of proximal intracranial arterial occlusions. Our objective was to investigate the outcomes of patients who had an acute ischemic stroke attributed to an anterior circulation proximal intracranial arterial occlusion. This was a prospective cohort study at 2 university-based hospitals from 2003 to 2005 in which nonenhanced computed tomography scans and computed tomography angiograms were obtained at admission for all adult patients suspected of having an ischemic stroke within the first 24 hours of symptom onset. The exposure of interest was anterior circulation proximal intracranial arterial occlusion; the main outcome measures were the frequency of good outcome (defined as a modified Rankin Scale score of ≤2) and mortality at 6 months. A total of 126 patients with a unilateral complete occlusion of the intracranial internal carotid artery (ICA; 26 patients: median National Institutes of Health Stroke Scale [NIHSS] score, 11 [interquartile range, 5-17]), of the M1 segment of the middle cerebral artery (MCA; 52 patients: median NIHSS score, 13 [interquartile range, 6-16]), or of the M2 segment of the MCA (48 patients: median NIHSS score, 7 [interquartile range, 4-15]) were included. Of these 3 groups, 10 patients (38.5%), 20 (38.5%), and 26 (54.2%) with ICA, MCA-M1, and MCA-M2 occlusions, respectively, achieved a modified Rankin Scale score of 2 or less, and 6 (23.1%), 12 (23.1%), and 10 (20.8%) were dead at 6 months. Worse outcomes were seen in patients with a baseline NIHSS score of 10 or higher, among whom a modified Rankin Scale score of 2 or less was achieved in only 7.1% (1 of 14), 23.5% (8 of 34), and 22.7% (5 of 22) of patients, with mortality rates of 35.7% (5 of 14), 32.4% (11 of 34), and 40.9% (9 of 22), for ICA, MCA-M1, and MCA-M2 occlusions, respectively. Age (odds ratio, 0.94 [95% CI, 0.91-0.98]), NIHSS score (odds ratio, 0.73 [95% CI, 0.64-0.83]), and strength of leptomeningeal collaterals (odds ratio, 2.37 [95% CI, 1.08-5.20]) were independently associated with outcome, whereas the level of proximal intracranial arterial occlusion (ICA vs MCA-M1 vs MCA-M2) was not. The natural history of proximal intracranial arterial occlusion is variable, with poor outcomes overall. Stroke severity and collateral flow appear to be more important than the level of occlusion in determining outcomes. Our results provide useful data for proper patient selection and sample size calculations in the design of new clinical trials of recanalization therapies.
Walsh, Jessica A; McFadden, Molly; Woodcock, Jamie; Clegg, Daniel O; Helliwell, Philip; Dommasch, Erica; Gelfand, Joel M; Krueger, Gerald G; Duffin, Kristina Callis
2013-12-01
The Psoriasis Area and Severity Index (PASI) is considered the gold standard assessment tool for psoriasis severity, but PASI is limited by its complexity and insensitivity in people with mild psoriasis. We sought to evaluate the product of a Physician Global Assessment (PGA) and Body Surface Area (BSA) (PGAxBSA) as an alternative to PASI. Psoriasis severity was evaluated at 6-month intervals in participants of the Utah Psoriasis Initiative registry. Correlation coefficients were used to compare PGAxBSA with PASI and the Simplified PASI (SPASI). Between August 2008 and November 2010, 435 assessments were completed in 226 participants. The median PASI score was 3.2 (interquartile range 1.8-5.4) and the median BSA was 3.0% (interquartile range 1.0%-5.0%). PGAxBSA had higher correlations with PASI than SPASI (0.87 vs 0.76, P < .001). PGAxBSA also had higher correlations with a Global Patient Assessment of psoriasis severity (0.65) than both PASI (0.59, P < .001) and SPASI (0.51, P < .001). The use of PGAxBSA for measuring severe psoriasis and response to therapy is unclear, because most participants had mild to moderate psoriasis and data were not collected at predefined intervals in relation to therapy initiation. Interrater reliability was not assessed. PGAxBSA is a simple and sensitive instrument for measuring psoriasis severity. Copyright © 2013 American Academy of Dermatology, Inc. Published by Mosby, Inc. All rights reserved.
Gray, Clive M; Williamson, Carolyn; Bredell, Helba; Puren, Adrian; Xia, Xiaohua; Filter, Ruben; Zijenah, Lynn; Cao, Huyen; Morris, Lynn; Vardas, Efthyia; Colvin, Mark; Gray, Glenda; McIntyre, James; Musonda, Rosemary; Allen, Susan; Katzenstein, David; Mbizo, Mike; Kumwenda, Newton; Taha, Taha; Karim, Salim Abdool; Flores, Jorge; Sheppard, Haynes W
2005-04-01
Defining viral dynamics in natural infection is prognostic of disease progression and could prove important for vaccine trial design, as viremia is a likely secondary end point in phase III HIV efficacy trials. There are limited data available on the early course of plasma viral load in subtype C HIV-1 infection in Africa. Plasma viral load and CD4+ T cell counts were monitored in 51 recently infected subjects for 9 months. Individuals were recruited from four southern African countries: Zambia, Malawi, Zimbabwe, and South Africa, and the median estimated time from seroconversion was 8.9 months (interquartile range, 5.7-14 months). All were infected with subtype C HIV-1, and median viral loads, measured using branched DNA, ranged from 3.82 to 4.02 log10 RNA copies/ml from 2 to 24 months after seroconversion. Viral loads significantly correlated with CD4+ cell counts (r=-0.5, p<0.0001; range, 376-364 cells/mm³), and mathematical modeling defined a median set point of 4.08 log10 (12,143 RNA copies/ml), which was attained approximately 17 months after seroconversion. Comparative measurements using three different viral load platforms (bDNA, Amplicor, and NucliSens) confirmed that viremia in subtype C HIV-1-infected individuals within the first 2 years of infection did not differ significantly from that found in early subtype B infection. In conclusion, the course of plasma viremia described in this study provides a useful baseline comparator for understanding disease progression in an African setting and may be useful in the design of HIV-1 vaccine trials in southern Africa.
The New MIRUS System for Short-Term Sedation in Postsurgical ICU Patients.
Romagnoli, Stefano; Chelazzi, Cosimo; Villa, Gianluca; Zagli, Giovanni; Benvenuti, Francesco; Mancinelli, Paola; Arcangeli, Giulio; Dugheri, Stefano; Bonari, Alessandro; Tofani, Lorenzo; Belardinelli, Andrea; De Gaudio, A Raffaele
2017-09-01
To evaluate the feasibility and safety of the MIRUS system (Pall International, Sarl, Fribourg, Switzerland) for sedation with sevoflurane in postsurgical ICU patients and to evaluate atmospheric pollution during sedation. This was a prospective interventional study conducted in a surgical ICU from February 2016 to December 2016 in postsurgical patients requiring ICU admission, mechanical ventilation, and sedation. Sevoflurane was administered with the MIRUS system targeted to a Richmond Agitation Sedation Scale score of -3 to -5 by adaptation of the minimum alveolar concentration. Data collected included Richmond Agitation Sedation Scale score, minimum alveolar concentration, inspired and expired sevoflurane fraction, wake-up times, duration of sedation, sevoflurane consumption, respiratory and hemodynamic data, Simplified Acute Physiology Score II, Sepsis-related Organ Failure Assessment, and laboratory data and biomarkers of organ injury. Atmospheric pollution was monitored at different sites: before sevoflurane delivery (baseline) and during sedation with the probe placed 15 cm from the MIRUS system (S1) and 15 cm from the filter-reflector group (S2). Sixty-two patients were enrolled in the study. No technical failure occurred. Median Richmond Agitation Sedation Scale score was -4.5 (interquartile range, -5 to -3.6), with sevoflurane delivered at a median minimum alveolar concentration of 0.45% (interquartile range, 0.4-0.53), yielding mean inspiratory and expiratory concentrations of 0.79% (SD, 0.24) and 0.76% (SD, 0.18), respectively. Median awakening time was 4 minutes (2.2-5 min). Median duration of sevoflurane administration was 3.33 hours (2.33-5.75 hr), range 1-19 hours, with a mean consumption of 7.89 mL/hr (SD, 2.99). Hemodynamics remained stable over the study period, and no laboratory data indicated liver or kidney injury or dysfunction. Median sevoflurane room air concentration was 0.10 parts per million (interquartile range, 0.07-0.15), 0.17 parts per million (interquartile range, 0.14-0.27), and 0.15 parts per million (interquartile range, 0.07-0.19) at baseline, S1, and S2, respectively. The MIRUS system is a promising and safe alternative for short-term sedation of ICU patients with sevoflurane. Atmospheric pollution was largely below the recommended thresholds (< 5 parts per million). Studies extended to more heterogeneous populations of patients undergoing longer durations of sedation are needed to confirm these observations.
Atar, Dan; Petzelbauer, Peter; Schwitter, Jürg; Huber, Kurt; Rensing, Benno; Kasprzak, Jaroslaw D; Butter, Christian; Grip, Lars; Hansen, Peter R; Süselbeck, Tim; Clemmensen, Peter M; Marin-Galiano, Marcos; Geudelin, Bernard; Buser, Peter T
2009-02-24
The purpose of this study was to investigate whether FX06 would limit infarct size when given as an adjunct to percutaneous coronary intervention. FX06, a naturally occurring peptide derived from human fibrin, has been shown to reduce myocardial infarct size in animal models by mitigating reperfusion injury. In all, 234 patients presenting with acute ST-segment elevation myocardial infarction were randomized in 26 centers. FX06 or matching placebo was given as intravenous bolus at reperfusion. Infarct size was assessed 5 days after myocardial infarction by late gadolinium enhanced cardiac magnetic resonance imaging. Secondary outcomes included size of necrotic core zone and microvascular obstruction at 5 days, infarct size at 4 months, left ventricular function, troponin I levels, and safety. There were no baseline differences between groups. On day 5, there was no significant difference in total late gadolinium enhanced zone in the FX06 group compared with placebo (reduction by 21%; p = 0.207). The necrotic core zone, however, was significantly reduced by 58% (median 1.77 g [interquartile range 0.0, 9.09 g] vs. 4.20 g [interquartile range 0.3, 9.93 g]; p < 0.025). There were no significant differences in troponin I levels (at 48 h, -17% in the FX06 group). After 4 months, there were no longer significant differences in scar size. There were numerically fewer serious cardiac events in the FX06-treated group, and no differences in adverse events. In this proof-of-concept trial, FX06 reduced the necrotic core zone as one measure of infarct size on magnetic resonance imaging, while total late enhancement was not significantly different between groups. The drug appears safe and well tolerated. (Efficacy of FX06 in the Prevention of Myocardial Reperfusion Injury [F.I.R.E.]; NCT00326976).
Predictors of inguinal hernia after radical prostatectomy.
Rabbani, Farhang; Yunis, Luis Herran; Touijer, Karim; Brady, Mary S
2011-02-01
To determine the significant independent predictors of inguinal hernia development after radical prostatectomy (RP) so that prophylactic measures can be undertaken in those at increased risk. Although inguinal hernia is a recognized complication after RP, the risk factors have not been well elucidated. From January 1999 to June 2007, 4592 consecutive patients underwent open retropubic RP or laparoscopic RP without previous radiotherapy. The median follow-up was 36.9 months (interquartile range 20.3, 60.6). Comorbidities were recorded, as well as the occurrence of inguinal hernia, wound infection, and bladder neck contracture. Multivariate Cox proportional hazards analysis was performed to identify predictors of inguinal hernia after RP. Inguinal hernia developed after RP in 68 men (1.5%) at a median follow-up of 7.9 months (interquartile range 4.3, 18.1). The laterality was bilateral in 7, right in 27, left in 24, and not documented in 10 patients. The significant independent predictors of inguinal hernia included age (hazard ratio [HR] 1.05, 95% confidence interval [CI] 1.01-1.09, P = .016), body mass index (HR 0.91, 95% CI 0.85-0.98, P = .011), history of inguinal hernia repair (HR 3.9, 95% CI 1.8-8.2, P <.001), and bladder neck contracture (HR 2.8, 95% CI 1.3-5.9, P = .007) but not the RP approach (HR 1.08, 95% CI 0.60-1.96, P = .80 for laparoscopic RP vs retropubic RP). The results of our study have indicated that older patients, thinner patients, those with previous inguinal hernia repair, and those developing bladder neck contracture are at increased risk of developing an inguinal hernia. These factors might identify a subset for whom evaluation for subclinical hernia might allow prophylactic inguinal hernia repair at RP. Copyright © 2011 Elsevier Inc. All rights reserved.
Shepard, Morgan A; Silva, Annelise; Starling, Amaal J; Hoerth, Matthew T; Locke, Dona E C; Ziemba, Kristine; Chong, Catherine D; Schwedt, Todd J
2016-01-01
Clinical observations suggest that psychogenic non-epileptic seizure (PNES) patients often have severe migraine, more severe than epilepsy patients. Investigations into migraine characteristics in patients with PNES are lacking. In this study we tested the hypothesis that, compared to epilepsy patients, PNES patients have more severe migraine, with more frequent and longer duration attacks that cause greater disability. In this observational study, 633 patients with video-EEG proven epilepsy or PNES were identified from the Mayo Clinic Epilepsy Monitoring Unit database. Contacted patients were screened for migraine via a validated questionnaire, and when present, data regarding migraine characteristics were collected. Two-sample t-tests, chi-square analyses, and Mann-Whitney U tests were used to compare migraine characteristics in PNES patients to those of epilepsy patients. Data from 43 PNES patients with migraine and 29 epilepsy patients with migraine were available. Compared to epilepsy patients, PNES patients reported having more frequent headaches (mean 15.1 ± 9.8 vs. 8.1 ± 6.6 headache days/month, p<.001), more frequent migraine attacks (mean 6.5 ± 6.3 vs. 3.8 ± 4.1 migraines/month, p=.028), longer duration migraines (mean 39.5 ± 28.3 vs. 27.3 ± 20.1 h, p=.035), and more frequently had non-visual migraine auras (78.6% vs. 46.7% of patients with migraine auras, p=.033). Migraine-related disability scores were not different between PNES and epilepsy patients (median 39, interquartile range 89 vs. 25, interquartile range 60.6, p=.15). Compared to epilepsy patients with migraine, PNES patients with migraine report having a more severe form of migraine with more frequent and longer duration attacks that are more commonly associated with non-visual migraine auras. Copyright © 2015 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Good, Elizabeth J.; Ghent, Darren J.; Bulgin, Claire E.; Remedios, John J.
2017-09-01
The relationship between satellite land surface temperature (LST) and ground-based observations of 2 m air temperature (
Deger, Leylâ; Plante, Céline; Jacques, Louis; Goudreau, Sophie; Perron, Stéphane; Hicks, John; Kosatsky, Tom; Smargiassi, Audrey
2012-01-01
BACKGROUND: Little attention has been devoted to the effects on children’s respiratory health of exposure to sulphur dioxide (SO2) in ambient air from local industrial emissions. Most studies on the effects of SO2 have assessed its impact as part of the regional ambient air pollutant mix. OBJECTIVE: To examine the association between exposure to stack emissions of SO2 from petroleum refineries located in Montreal’s (Quebec) east-end industrial complex and the prevalence of active asthma and poor asthma control among children living nearby. METHODS: The present cross-sectional study used data from a respiratory health survey of Montreal children six months to 12 years of age conducted in 2006. Of 7964 eligible households that completed the survey, 842 children between six months and 12 years of age lived in an area impacted by refinery emissions. Ambient SO2 exposure levels were estimated using dispersion modelling. Log-binomial regression models were used to estimate crude and adjusted prevalence ratios (PRs) and 95% CIs for the association between yearly school and residential SO2 exposure estimates and asthma outcomes. Adjustments were made for child’s age, sex, parental history of atopy and tobacco smoke exposure at home. RESULTS: The adjusted PR for the association between active asthma and SO2 levels was 1.14 (95% CI 0.94 to 1.39) per interquartile range increase in modelled annual SO2. The effect on poor asthma control was greater (PR=1.39 per interquartile range increase in modelled SO2 [95% CI 1.00 to 1.94]). CONCLUSIONS: Results of the present study suggest a relationship between exposure to refinery stack emissions of SO2 and the prevalence of active and poor asthma control in children who live and attend school in proximity to refineries. PMID:22536578
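The adjusted prevalence ratios above come from log-binomial regression, expressed per interquartile-range increase in modelled SO2. A minimal sketch of that kind of calculation is given below; the file name and column names (active_asthma, so2, age, sex, atopy, smoke) are hypothetical assumptions for illustration, not the authors' data or code.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("asthma_survey.csv")  # hypothetical file: one row per child

# Scale the exposure so the coefficient corresponds to one IQR increase in SO2
iqr = df["so2"].quantile(0.75) - df["so2"].quantile(0.25)
df["so2_iqr"] = df["so2"] / iqr

# Log-binomial model: binomial family with a log link yields prevalence ratios
fit = smf.glm(
    "active_asthma ~ so2_iqr + age + sex + atopy + smoke",
    data=df,
    family=sm.families.Binomial(link=sm.families.links.Log()),
).fit()

print("adjusted PR per IQR increase:", round(float(np.exp(fit.params["so2_iqr"])), 2))

Log-binomial models can fail to converge; a Poisson family with robust standard errors is a common fallback, which is a modelling choice rather than anything stated in the abstract.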
Long-term outcomes of sandwich ventral hernia repair paired with hybrid vacuum-assisted closure.
Hicks, Caitlin W; Poruk, Katherine E; Baltodano, Pablo A; Soares, Kevin C; Azoury, Said C; Cooney, Carisa M; Cornell, Peter; Eckhauser, Frederic E
2016-08-01
Sandwich ventral hernia repair (SVHR) may reduce ventral hernia recurrence rates, although with an increased risk of surgical site occurrences (SSOs) and surgical site infections (SSIs). Previously, we found that a modified negative pressure wound therapy (hybrid vacuum-assisted closure [HVAC]) system reduced SSOs and SSIs after ventral hernia repair. We aimed to describe our outcomes after SVHR paired with HVAC closure. We conducted a 4-y retrospective review of all complex SVHRs (biologic mesh underlay and synthetic mesh overlay) with HVAC closure performed at our institution by a single surgeon. All patients had fascial defects that could not be reapproximated primarily using anterior component separation. Descriptive statistics were used to report the incidence of postoperative complications and hernia recurrence. A total of 60 patients (59.3 ± 11.4 y, 58.3% male, 75% American Society of Anesthesiologists class ≥3) with complex ventral hernias underwent sandwich repair with HVAC closure. Major postoperative morbidity (Dindo-Clavien class ≥3) occurred in 14 (23.3%) patients, but the incidence of SSO (n = 13, 21.7%) and SSI (n = 4, 6.7%) was low compared with historical reports. Median follow-up time for all patients was 12 mo (interquartile range 5.8-26.5 mo). Hernia recurrence occurred in eight patients (13.3%) after a median time of 20.6 months (interquartile range 16.4-25.4 months). Use of a dual layer sandwich repair for complex abdominal wall reconstruction is associated with low rates of hernia recurrence at 1 year postoperatively. The addition of the HVAC closure system may reduce the risk of SSOs and SSIs previously reported with this technique and deserves consideration in future prospective studies assessing optimization of ventral hernia repair approaches. Copyright © 2016 Elsevier Inc. All rights reserved.
Helmy, Samir; Marschalek, Julian; Bader, Yvonne; Koch, Marianne; Schmidt, Alice; Kanzler, Marina; Gyoeri, Georg; Polterauer, Stephan; Reinthaller, Alexander; Grimm, Christoph
2016-06-01
Transplantation results in a fivefold elevated risk for a variety of malignancies (Kaposi sarcoma, skin, liver, lung, gastrointestinal cancer). A patient's risk for malignancies could be of particular interest for the follow-up programs of patients and risk adaptation after kidney transplantation. The aim of this study was to identify independent risk factors for de novo malignancies in women after renal transplantation. This is a multicenter cross-sectional study, conducted at the Medical University of Vienna and Hospital Rudolfstiftung, Vienna, Austria. We included female kidney graft recipients who were transplanted between 1980 and 2012 and followed up at our institutions (N = 280). Clinical data of patients were extracted from hospital charts and electronic patient files. Patients were interviewed using a standardized questionnaire regarding their medical history, history of transplantation, and malignant diseases. Detailed information about present and past immunosuppressive regimens, rejection episodes and therapies, renal graft function, and information about primary disease was obtained. Diagnostic work-up and/or surgical exploration was performed if any presence of malignancy was suspected during routine follow-up. Histological specimens were obtained from all patients. The main outcome measure was the presence of de novo malignancy after kidney transplantation. Two hundred sixty-two women were included for statistical analysis. Median (interquartile range) follow-up period after transplantation was 101.1 (27.3-190.7) months. Thirty-two patients (12.2%) developed a malignancy: dermatologic malignancies (5.7%), breast cancer (3.4%), cervical cancer (0.8%), lung cancer (0.4%), gastrointestinal malignancies (1.5%), vulvar cancer (0.4%), and unclassified malignancies (1.9%). Median (interquartile range) time to malignancy after transplantation was 185.9 (92.0-257.6) months. Cumulative cancer rates were 4.9% (1 year), 14.4% (3 years), 16.4% (5 years), and 21.8% (10 years). A second transplantation was identified as an independent risk factor for development of malignancy after transplantation. The long-term risk of developing a malignancy after kidney transplantation is high, which might justify a follow-up of more than 10 years.
Khdour, Maher R; Hallak, Hussein O; Aldeyab, Mamoon A; Nasif, Mowaffaq A; Khalili, Aliaa M; Dallashi, Ahamad A; Khofash, Mohammad B; Scott, Michael G
2018-04-01
Inappropriate use of antibiotics is one of the most important factors contributing to the emergence of drug-resistant pathogens. The purpose of this study was to measure the clinical impact of antimicrobial stewardship programme (ASP) interventions on hospitalized patients in the intensive care unit at the Palestinian Medical Complex. A prospective audit with intervention and feedback by the ASP team within 48-72 h of antibiotic administration began in September 2015. Four months of pre-ASP data were compared with 4 months of post-ASP data. Data collected included clinical and demographic data; use of antimicrobials measured by defined daily doses, duration of therapy, length of stay, readmission and all-cause mortality. Overall, 176 interventions were made by the ASP team with an average acceptance rate of 78.4%. The most accepted interventions were dose optimization (87.0%) followed by de-escalation based on culture results with an acceptance rate of 84.4%. ASP interventions significantly reduced antimicrobial use by 24.3% (87.3 defined daily doses/100 beds vs. 66.1 defined daily doses/100 beds; P < 0.001). The median (interquartile range) length of stay was significantly reduced post-ASP [11 (3-21) vs. 7 (4-19) days; P < 0.01]. Also, the median (interquartile range) duration of therapy was significantly reduced post-ASP [8 (5-12) days vs. 5 (3-9); P = 0.01]. There was no significant difference in overall 30-day mortality or readmission between the pre-ASP and post-ASP groups (26.9% vs. 23.9%; P = 0.1) and (26.1% vs. 24.6%; P = 0.54) respectively. Our prospective audit and feedback programme was associated with a positive impact on antimicrobial use, duration of therapy and length of stay. © 2017 The British Pharmacological Society.
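Antimicrobial consumption above is reported in defined daily doses (DDD) per 100 beds. A small worked example of the standard DDD-based calculation is sketched below with made-up numbers; the drug, quantities and bed-day figure are illustrative assumptions, not the study's data.

# Hypothetical aggregate figures for one antibiotic over a 4-month period
grams_administered = 1250.0   # total grams of ceftriaxone dispensed
who_ddd_grams = 2.0           # WHO-defined daily dose for parenteral ceftriaxone (g)
bed_days = 2850               # occupied bed days over the same period

ddd = grams_administered / who_ddd_grams
ddd_per_100_bed_days = ddd / bed_days * 100
print(f"{ddd_per_100_bed_days:.1f} DDD per 100 bed days")  # about 21.9 with these numbers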
A Trial of Extending Hemodialysis Hours and Quality of Life.
Jardine, Meg J; Zuo, Li; Gray, Nicholas A; de Zoysa, Janak R; Chan, Christopher T; Gallagher, Martin P; Monaghan, Helen; Grieve, Stuart M; Puranik, Rajesh; Lin, Hongli; Eris, Josette M; Zhang, Ling; Xu, Jinsheng; Howard, Kirsten; Lo, Serigne; Cass, Alan; Perkovic, Vlado
2017-06-01
The relationship between increased hemodialysis hours and patient outcomes remains unclear. We randomized (1:1) 200 adult recipients of standard maintenance hemodialysis from in-center and home-based hemodialysis programs to extended weekly (≥24 hours) or standard (target 12-15 hours, maximum 18 hours) hemodialysis hours for 12 months. The primary outcome was change in quality of life from baseline assessed by the EuroQol 5 dimension instrument (3 level) (EQ-5D). Secondary outcomes included medication usage, clinical laboratory values, vascular access events, and change in left ventricular mass index. At 12 months, median weekly hemodialysis hours were 24.0 (interquartile range, 23.6-24.0) and 12.0 (interquartile range, 12.0-16.0) in the extended and standard groups, respectively. Change in EQ-5D score at study end did not differ between groups (mean difference, 0.04 [95% confidence interval, -0.03 to 0.11]; P = 0.29). Extended hours were associated with lower phosphate and potassium levels and higher hemoglobin levels. Blood pressure (BP) did not differ between groups at study end. Extended hours were associated with fewer BP-lowering agents and phosphate-binding medications, but were not associated with erythropoietin dosing. In a substudy with 95 patients, we detected no difference between groups in left ventricular mass index (mean difference, -6.0 [95% confidence interval, -14.8 to 2.7] g/m²; P = 0.18). Five deaths occurred in the extended group and two in the standard group (P = 0.44); two participants in each group withdrew consent. Similar numbers of patients experienced vascular access events in the two groups. Thus, extending weekly hemodialysis hours did not alter overall EQ-5D quality of life score, but was associated with improvement in some laboratory parameters and reductions in medication burden. (Clinicaltrials.gov identifier: NCT00649298). Copyright © 2017 by the American Society of Nephrology.
Abbott, Marvin M.; Tortorelli, R.L.; Becker, M.F.; Trombley, T.J.
2003-01-01
This report is an overview of water resources in and near the Wichita and Affiliated Tribes treaty lands in western Oklahoma. The tribal treaty lands are about 1,140 square miles and are bordered by the Canadian River on the north, the Washita River on the south, 98° west longitude on the east, and 98°40' west longitude on the west. Seventy percent of the study area lies within the Washita River drainage basin and 30 percent of the area lies within the Canadian River drainage basin. March through June are months of greatest average streamflow, with 49 to 57 percent of the annual streamflow occurring in these four months. November through February, July, and August have the least average streamflow with only 26 to 36 percent of the annual streamflow occurring in these six months. Two streamflow-gaging stations, Canadian River at Bridgeport and Cobb Creek near Fort Cobb, indicated peak streamflows generally decrease with regulation. Two other streamflow-gaging stations, Washita River at Carnegie and Washita River at Anadarko, indicated a decrease in peak streamflows after regulation at less than the 100-year recurrence and an increase in peak streamflows greater than the 100-year recurrence. Canadian River at Bridgeport and Washita River at Carnegie had estimated annual low flows that generally increased with regulation. Cobb Creek near Fort Cobb had a decrease of estimated annual low flows after regulation. There are greater than 900 ground-water wells in the tribal treaty lands. Eighty percent of the wells are in Caddo County. The major aquifers in the study area are the Rush Springs Aquifer and portions of the Canadian River and Washita River valley alluvial aquifers. The Rush Springs Aquifer is used extensively for irrigation as well as industrial and municipal purposes, especially near population centers. The Canadian River and Washita River valley alluvial aquifers are not used extensively in the study area. Well yields from the Rush Springs Aquifer ranged from 11 to greater than 850 gallons per minute. The Rush Springs Aquifer is recharged by the infiltration of precipitation. The estimated recharge is about 1.80 inches per year evenly distributed over the outcrop of the aquifer in the study area. Principal factors affecting the water quality in the study area include geology, agricultural practices, and oil and gas production. Calcium, magnesium, sulfate, and bicarbonate are the dominant dissolved constituents in water in the study area. Interquartile dissolved-solids concentrations in surface-water samples in the study area generally were greater than interquartile concentrations in ground-water samples. Median dissolved-solids concentrations for ground-water samples from Canadian River, Ionine Creek, Spring Creek, and Washita River Basins, which ranged from 535 to 1,195 milligrams per liter, exceeded the U.S. Environmental Protection Agency Secondary Drinking Water Standard of 500 milligrams per liter. Interquartile sulfate concentrations in surface-water samples in the study area generally were greater than interquartile concentrations in ground-water samples. Median sulfate concentrations from ground-water samples in the Canadian River, Ionine Creek, and Spring Creek Basins, which ranged from 385 to 570 milligrams per liter, exceeded the U.S. Environmental Protection Agency Secondary Drinking Water Standard of 250 milligrams per liter.
Nitrite plus nitrate as nitrogen concentrations in surface-water samples in the study area generally were less than concentrations in ground-water samples. The median nitrite plus nitrate as nitrogen concentration in ground water was 9.8 milligrams per liter, suggesting almost one-half the ground-water samples exceeded the U.S. Environmental Protection Agency Primary Drinking Water Standard (10 milligrams per liter). An estimated 100 million gallons of water per day were withdrawn from surface and ground water for all uses in
Is Repeat PTA of a Failing Hemodialysis Fistula Durable?
Bountouris, Ioannis; Kristmundsson, Thorarinn; Dias, Nuno; Zdanowski, Zbigniew; Malina, Martin
2014-01-01
Purpose. Our objective was to evaluate the outcome of percutaneous transluminal angioplasty (PTA) and particularly rePTA in a failing arteriovenous fistula (AV-fistula). Are multiple redilations worthwhile? Patients and Methods. All 159 stenoses of AV fistulas that were treated with PTA, with or without stenting, during 2008 and 2009, were included. Occluded fistulas that were dilated after successful thrombolysis were also included. Median age was 68 (interquartile range 61.5-78.5) years and 75% were male. Results. Seventy-nine (50%) of the primary PTAs required no further reintervention. The primary patency was 61% at 6 months and 42% at 12 months. Eighty (50%) of the stenoses needed at least one reintervention. Primary assisted patency (defined as patency after subsequent reinterventions) was 89% at 6 months and 85% at 12 months. The durability of repeated PTAs was similar to the durability of the primary PTA. However, an early primary PTA carried a higher risk for subsequent reinterventions. Successful dialysis was achieved after 98% of treatments. Nine percent of the stenoses eventually required surgical revision and 13% of the fistulas failed permanently. Conclusion. The present study suggests that most failing AV-fistulas can be salvaged endovascularly. Repeated PTA seems similarly durable as the primary PTA.
Kaida, Angela; Matthews, Lynn T; Ashaba, Scholastic; Tsai, Alexander C; Kanters, Steve; Robak, Magdalena; Psaros, Christina; Kabakyenga, Jerome; Boum, Yap; Haberer, Jessica E; Martin, Jeffrey N; Hunt, Peter W; Bangsberg, David R
2014-12-01
Among HIV-infected women, perinatal depression compromises clinical, maternal, and child health outcomes. Antiretroviral therapy (ART) is associated with lower depression symptom severity but the uniformity of effect through pregnancy and postpartum periods is unknown. We analyzed prospective data from 447 HIV-infected women (18-49 years) initiating ART in rural Uganda (2005-2012). Participants completed blood work and comprehensive questionnaires quarterly. Pregnancy status was assessed by self-report. Analysis time periods were defined as currently pregnant, postpartum (0-12 months post-pregnancy outcome), or non-pregnancy-related. Depression symptom severity was measured using a modified Hopkins Symptom Checklist 15, with scores ranging from 1 to 4. Probable depression was defined as >1.75. Linear regression with generalized estimating equations was used to compare mean depression scores over the 3 periods. At enrollment, median age was 32 years (interquartile range: 27-37), median CD4 count was 160 cells per cubic millimeter (interquartile range: 95-245), and mean depression score was 1.75 (s = 0.58) (39% with probable depression). Over 4.1 median years of follow-up, 104 women experienced 151 pregnancies. Mean depression scores did not differ across the time periods (P = 0.75). Multivariable models yielded similar findings. Increasing time on ART, viral suppression, better physical health, and "never married" were independently associated with lower mean depression scores. Findings were consistent when assessing probable depression. Although the lack of association between depression and perinatal periods is reassuring, high depression prevalence at treatment initiation and continued incidence across pregnancy and non-pregnancy-related periods of follow-up highlight the critical need for mental health services for HIV-infected women to optimize both maternal and perinatal health.
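The comparison of mean depression scores across pregnancy, postpartum, and non-pregnancy-related periods above relies on linear regression with generalized estimating equations to handle repeated measures within women. A minimal sketch of that model type follows; the file name and column names (score, period, participant_id) are assumptions for illustration, not the authors' analysis code.

import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("quarterly_visits.csv")  # hypothetical long-format data, one row per visit

# Linear GEE with an exchangeable working correlation within participants
fit = smf.gee(
    "score ~ C(period)",          # period: pregnant / postpartum / non-pregnancy-related
    groups="participant_id",
    data=df,
    cov_struct=sm.cov_struct.Exchangeable(),
    family=sm.families.Gaussian(),
).fit()
print(fit.summary())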
Vaswani, Devin; Wallace, Adam N; Eiswirth, Preston S; Madaelil, Thomas P; Chang, Randy O; Tomasian, Anderanik; Jennings, Jack W
2018-03-14
To evaluate the effectiveness of percutaneous image-guided thermal ablation in achieving local tumor control and pain palliation of sarcoma metastases within the musculoskeletal system. Retrospective review of 64 sarcoma metastases within the musculoskeletal system in 26 women and 15 men (total = 41) treated with ablation between December 2011 and August 2016 was performed. Mean age of the cohort was 42.9 ± 16.0 years. Two subgroups were treated: oligometastatic disease (n = 13) and widely metastatic disease (n = 51). A variety of sarcoma histologies were treated with average tumor volume of 42.5 cm³ (range 0.1-484.7 cm³). Pain scores were recorded before and 4 weeks after therapy for 59% (38/64) of treated lesions. Follow-up imaging was evaluated for local control and to monitor sites of untreated disease as an internal control. Fifty-eight percent (37/64) were lost to imaging follow-up at varying time points over a year. Complication rate was 5% (3/64; one minor and two major events). One-year local tumor control rates were 70% (19/27) in all patients, 67% (12/18) in the setting of progression of untreated metastases, and 100% (10/10) in the setting of oligometastatic disease. Median pain scores decreased from 8 (interquartile range 5.0-9.0) to 3 (interquartile range 0.1-4.0) 1 month after the procedure (P < 0.001). Image-guided percutaneous ablation is an effective option for local tumor control and pain palliation of metastatic sarcomas within the musculoskeletal system. Treatment in the setting of oligometastatic disease offers potential for remission. Level 4, Retrospective Review.
Comparison of Antivenom Dosing Strategies for Rattlesnake Envenomation.
Spyres, Meghan B; Skolnik, Aaron B; Moore, Elizabeth C; Gerkin, Richard D; Padilla-Jones, Angela; Ruha, Anne-Michelle
2018-06-01
This study compares maintenance with clinical- and laboratory-triggered (as-needed [PRN]) antivenom dosing strategies with regard to patient-centered outcomes after rattlesnake envenomation. This is a retrospective cohort study of adult rattlesnake envenomations treated at a regional toxicology center. Data on demographics, envenomation details, antivenom administration, length of stay, and laboratory and clinical outcomes were compared between the PRN and maintenance groups. Primary outcomes were hospital length of stay and total antivenom used, with a hypothesis of no difference between the two dosing strategies. The study was conducted at a single regional toxicology center. Three hundred ten adult patients envenomated by rattlesnakes between 2007 and 2014 were included. Patients were excluded if no antivenom was administered or for receiving an antivenom other than Crofab (BTG International, West Conshohocken, PA). This is a retrospective study of rattlesnake envenomations treated with and without maintenance antivenom dosing. One hundred forty-eight in the maintenance group and 162 in the PRN group were included. There was no difference in demographics or baseline envenomation severity or hemotoxicity (32.7% vs 40.5%, respectively; p = 0.158) between the two groups. Comparing the PRN with the maintenance group, less antivenom was used (8 [interquartile range, 6-12] vs 16 [interquartile range, 12-18] vials, respectively; p < 0.001), and hospital length of stay was shorter (27 hr [interquartile range, 20-44 hr] vs 34 hr [interquartile range, 24-43 hr], respectively; p = 0.014). There were no differences in follow-up outcomes of readmission, retreatment, or bleeding and surgical complications. Hospital length of stay was shorter, and less antivenom was used in patients receiving a PRN antivenom dosing strategy after rattlesnake envenomation.
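Outcomes such as total vials and length of stay above are skewed and are therefore summarized as medians with interquartile ranges and compared nonparametrically. A sketch of that kind of two-group comparison is shown below on hypothetical length-of-stay vectors, not the study data.

import numpy as np
from scipy import stats

# Hypothetical length-of-stay values (hours) for the two dosing strategies
los_prn = np.array([20, 22, 27, 30, 41, 44, 19, 25])
los_maintenance = np.array([24, 31, 34, 38, 43, 45, 28, 36])

def median_iqr(x):
    q1, med, q3 = np.percentile(x, [25, 50, 75])
    return med, q1, q3

print("PRN (median, Q1, Q3):", median_iqr(los_prn))
print("Maintenance (median, Q1, Q3):", median_iqr(los_maintenance))

# Mann-Whitney U test for a difference between the two distributions
u, p = stats.mannwhitneyu(los_prn, los_maintenance, alternative="two-sided")
print(f"U = {u:.0f}, p = {p:.3f}")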
Soon, Reni; Tschann, Mary; Salcedo, Jennifer; Stevens, Katelyn; Ahn, Hyeong Jun; Kaneshiro, Bliss
2017-08-01
To evaluate the efficacy of a paracervical block to decrease pain during osmotic dilator insertion before second-trimester abortion. In this double-blind, randomized trial, 41 women undergoing Laminaria insertion before a second-trimester abortion received either a paracervical block with 18 mL 1% lidocaine and 2 mL sodium bicarbonate or a sham block. Women were between 14 and 23 6/7 weeks of gestation. The primary outcome was pain immediately after insertion of Laminaria. Women assessed their pain on a 100-mm visual analog scale. Secondary outcomes included assessment of pain at other times during the insertion procedure and overall satisfaction with pain control. To detect a 25-mm difference in pain immediately after Laminaria insertion, at an α of 0.05 and 80% power, we aimed to enroll 20 patients in each arm. From May 2015 to December 2015, 20 women received a paracervical block and 21 received a sham block. Groups were similar in demographics, including parity, history of surgical abortion, and number of Laminaria placed. The paracervical block reduced pain after Laminaria insertion (median scores 13 mm [interquartile range 2-39] compared with 54 mm [interquartile range 27-61], P=.01, 95% CI -47.0 to -4.0). Women who received a paracervical block also reported higher satisfaction with overall pain control throughout the entire Laminaria insertion procedure (median scores 95 mm [interquartile range 78-100] compared with 70 mm [interquartile range 44-90], P=.05, 95% CI 0.0-37.0). Paracervical block is effective at reducing the pain of Laminaria insertion. Additionally, a paracervical block increases overall patient satisfaction with pain control during Laminaria placement. ClinicalTrials.gov, NCT02454296.
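The sample size statement above (detecting a 25-mm difference on a 100-mm visual analog scale at α = 0.05 and 80% power with about 20 women per arm) can be reproduced approximately with a standard two-sample power calculation. The within-group standard deviation is not reported in the abstract, so the value below is an assumption chosen to make the arithmetic visible.

from statsmodels.stats.power import TTestIndPower

assumed_sd = 26.0   # mm; an assumption, not stated in the abstract
difference = 25.0   # mm; the difference the trial was powered to detect
effect_size = difference / assumed_sd

n_per_arm = TTestIndPower().solve_power(
    effect_size=effect_size, alpha=0.05, power=0.80, alternative="two-sided"
)
print(f"required sample size per arm: {n_per_arm:.1f}")  # roughly 18 under these assumptions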
Rocha Ferreira, Graziela Santos; de Almeida, Juliano Pinheiro; Landoni, Giovanni; Vincent, Jean Louis; Fominskiy, Evgeny; Gomes Galas, Filomena Regina Barbosa; Gaiotto, Fabio A; Dallan, Luís Oliveira; Franco, Rafael Alves; Lisboa, Luiz Augusto; Palma Dallan, Luis Roberto; Fukushima, Julia Tizue; Rizk, Stephanie Itala; Park, Clarice Lee; Strabelli, Tânia Mara; Gelas Lage, Silvia Helena; Camara, Ligia; Zeferino, Suely; Jardim, Jaquelline; Calvo Arita, Elisandra Cristina Trevisan; Caldas Ribeiro, Juliana; Ayub-Ferreira, Silvia Moreira; Costa Auler, Jose Otavio; Filho, Roberto Kalil; Jatene, Fabio Biscegli; Hajjar, Ludhmila Abrahao
2018-04-30
The aim of this study was to evaluate the efficacy of perioperative intra-aortic balloon pump use in high-risk cardiac surgery patients. A single-center randomized controlled trial and a meta-analysis of randomized controlled trials. Heart Institute of São Paulo University. High-risk patients undergoing elective coronary artery bypass surgery. Patients were randomized to receive preskin incision intra-aortic balloon pump insertion after anesthesia induction versus no intra-aortic balloon pump use. The primary outcome was a composite endpoint of 30-day mortality and major morbidity (cardiogenic shock, stroke, acute renal failure, mediastinitis, prolonged mechanical ventilation, and a need for reoperation). A total of 181 patients (mean [SD] age 65.4 [9.4] yr; 32% female) were randomized. The primary outcome was observed in 43 patients (47.8%) in the intra-aortic balloon pump group and 42 patients (46.2%) in the control group (p = 0.46). The median duration of inotrope use (51 hr [interquartile range, 32-94 hr] vs 39 hr [interquartile range, 25-66 hr]; p = 0.007) and the ICU length of stay (5 d [interquartile range, 3-8 d] vs 4 d [interquartile range, 3-6 d]; p = 0.035) were longer in the intra-aortic balloon pump group than in the control group. A meta-analysis of 11 randomized controlled trials confirmed a lack of survival improvement in high-risk cardiac surgery patients with perioperative intra-aortic balloon pump use. In high-risk patients undergoing cardiac surgery, the perioperative use of an intra-aortic balloon pump did not reduce the occurrence of a composite outcome of 30-day mortality and major complications compared with usual care alone.
THE FUNDUS PHENOTYPE ASSOCIATED WITH THE p.Ala243Val BEST1 MUTATION.
Khan, Kamron N; Islam, Farrah; Moore, Anthony T; Michaelides, Michel
2018-03-01
To describe a highly recognizable and reproducible retinal phenotype associated with a specific BEST1 mutation-p.Ala243Val. Retrospective review of consecutive cases where genetic testing has identified p.Ala243Val BEST1 as the cause of disease. Electronic patient records were used to extract demographic, as well as functional and anatomical data. These data were compared with those observed with the most common BEST1 genotype, p.Arg218Cys. Eight individuals (six families) were identified with the p.Ala243Val BEST1 mutation and seven patients with the pathologic variant p.Arg218Cys. No patients with mutation of codon 243 knowingly had a family history of retinal disease, whereas all patients with the p.Arg218Cys variant did. The maculopathy was bilateral in all cases. The p.Ala243Val mutation was associated with a pattern dystrophy-type appearance, most visible with near-infrared reflectance and fundus autofluorescence imaging. This phenotype was never observed with any other genotype. This mutation was associated with an older median age of symptom onset (median = 42, interquartile range = 22) compared with those harboring the p.Arg218Cys mutation (median = 18, interquartile range = 12; Mann-Whitney U test; P < 0.05). Despite their older age, the final recorded acuity seemed to be better in the p.Ala243Val group (median = 0.55, interquartile range = 0.6475; median = 0.33, interquartile range = 0.358), although this did not reach statistical significance (Mann-Whitney U test; P > 0.05). The mutation p.Ala243Val is associated with highly recognizable and reproducible pattern dystrophy-like phenotype. Patients develop symptoms at a later age and tend to have better preservation of electrooculogram amplitudes.
Syphilis in Drug Users in Low and Middle Income Countries
Coffin, Lara S.; Newberry, Ashley; Hagan, Holly; Cleland, Charles M.; Des Jarlais, Don C.; Perlman, David C.
2009-01-01
Background Genital ulcer disease (GUD), including syphilis, is an important cause of morbidity in low and middle income (LMI) countries and syphilis transmission is associated with HIV transmission. Methods We conducted a literature review to evaluate syphilis infection among drug users in LMI countries for the period 1995–2007. Countries were categorized using the World Bank Atlas method (The World Bank, 2007) according to 2006 gross national income per capita. Results Thirty-two studies were included (N=13,848 subjects), mostly from Southeast Asia with some from Latin America, Eastern Europe, Central and East Asia, North Africa and the Middle East but none from regions such as Sub-Saharan Africa. The median prevalence of overall lifetime syphilis (N=32 studies) was 11.1% (interquartile range: 6.3% to 15.3%) and of HIV (N=31 studies) was 1.1% (interquartile range: 0.22% to 5.50%). There was a modest relation (r=0.27) between HIV and syphilis prevalence. Median syphilis prevalence by gender was 4.0% (interquartile range: 3.4% to 6.6%) among males (N=11 studies) and 19.9% (interquartile range: 11.4% to 36.0%) among females (N=6 studies). There was a strong relation (r= 0.68) between syphilis prevalence and female gender that may be related to female sex work. Conclusion Drug users in LMI countries have a high prevalence of syphilis but data are limited and, in some regions, entirely lacking. Further data are needed, including studies targeting the risks of women. Interventions to promote safer sex, testing, counseling and education, as well as health care worker awareness, should be integrated in harm reduction programs and health care settings to prevent new syphilis infections and reduce HIV transmission among drug users and their partners in LMI countries. PMID:19361976
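The modest study-level relation between syphilis and HIV prevalence (r = 0.27) reported above is a simple correlation across studies. A sketch of that calculation on hypothetical study-level prevalences, not the review's extracted data, follows.

import numpy as np
from scipy import stats

# Hypothetical study-level prevalences (%), one pair per study
syphilis = np.array([6.3, 11.1, 15.3, 4.0, 19.9, 8.5, 12.7, 22.0])
hiv = np.array([0.2, 1.1, 5.5, 0.5, 6.0, 0.9, 2.3, 4.1])

r, p = stats.pearsonr(syphilis, hiv)
print(f"Pearson r = {r:.2f} (p = {p:.3f})")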
The Perilous Road from HIV Diagnosis in the Hospital to Viral Suppression in the Outpatient Clinic.
Colasanti, Jonathan; Goswami, Neela D; Khoubian, Jonathan J; Pennisi, Eugene; Root, Christin; Ziemer, Dorothy; Armstrong, Wendy S; Del Rio, Carlos
2016-08-01
The HIV care continuum has received considerable attention in recent years, however, few care continua focus on the population of patients who are diagnosed during an inpatient hospital admission. We aimed to describe the HIV care continuum for patients newly diagnosed during hospitalization through 24-month follow-up. A retrospective chart review of HIV patients diagnosed at Grady Memorial Hospital from 2011 to 2012 was performed and records were matched to Georgia Department of Public Health HIV/AIDS surveillance data. Descriptive statistics and statistical tests of independence were utilized. Ninety-four new diagnoses were confirmed during the 2-year study period. Median age was 43 years (interquartile range [IQR] 30-51), 77% were male, 72% were non-Hispanic Black, 31% were men who have sex with men (MSM), and 77% were uninsured. Median CD4 count at diagnosis was 134 cells/μL (IQR 30-307). Eighty-four percent received their diagnosis before hospital discharge, 68% linked to care by 90 days, 73% were retained for 12 months, 48% were virologically suppressed by 12 months, 58% were retained for 24 continuous months, and 38% achieved continuous viral suppression (VS) during the initial 24 months after diagnosis. Late diagnosis is a persistent problem in hospitalized patients. Despite relative success with linkage to care and 12-month retention in care, a minority of patients maintained retention and VS for 24 continuous months.
Hansen, Lea K; Becher, Naja; Bastholm, Sara; Glavind, Julie; Ramsing, Mette; Kim, Chong J; Romero, Roberto; Jensen, Jørgen S; Uldbjerg, Niels
2014-01-01
To evaluate the microbial load and the inflammatory response in the distal and proximal parts of the cervical mucus plug. Experimental research. Twenty women with a normal, singleton pregnancy. Vaginal swabs and specimens from the distal and proximal parts of the cervical mucus plug. Immunohistochemistry, enzyme-linked immunosorbent assay, quantitative polymerase chain reaction and histology. The total bacterial load (16S rDNA) was significantly lower in the cervical mucus plug compared with the vagina (p = 0.001). Among women harboring Ureaplasma parvum, the median genome equivalents/g were 1574 (interquartile range 2526) in the proximal part, 657 (interquartile range 1620) in the distal part and 60,240 (interquartile range 96,386) in the vagina. Histological examinations and quantitative polymerase chain reaction revealed considerable amounts of lactobacilli and inflammatory cells in both parts of the cervical mucus plug. The matrix metalloproteinase-8 concentration was decreased in the proximal part of the plug compared with the distal part (p = 0.08). The cervical mucus plug inhibits, but does not block, the passage of Ureaplasma parvum during its ascending route from the vagina through the cervical canal. © 2013 Nordic Federation of Societies of Obstetrics and Gynecology.
Adverse Drug Events and Medication Errors in African Hospitals: A Systematic Review.
Mekonnen, Alemayehu B; Alhawassi, Tariq M; McLachlan, Andrew J; Brien, Jo-Anne E
2018-03-01
Medication errors and adverse drug events are universal problems contributing to patient harm but the magnitude of these problems in Africa remains unclear. The objective of this study was to systematically investigate the literature on the extent of medication errors and adverse drug events, and the factors contributing to medication errors in African hospitals. We searched PubMed, MEDLINE, EMBASE, Web of Science and Global Health databases from inception to 31 August, 2017 and hand searched the reference lists of included studies. Original research studies of any design published in English that investigated adverse drug events and/or medication errors in any patient population in the hospital setting in Africa were included. Descriptive statistics including median and interquartile range were presented. Fifty-one studies were included; of these, 33 focused on medication errors, 15 on adverse drug events, and three studies focused on medication errors and adverse drug events. These studies were conducted in nine (of the 54) African countries. In any patient population, the median (interquartile range) percentage of patients reported to have experienced any suspected adverse drug event at hospital admission was 8.4% (4.5-20.1%), while adverse drug events causing admission were reported in 2.8% (0.7-6.4%) of patients but it was reported that a median of 43.5% (20.0-47.0%) of the adverse drug events were deemed preventable. Similarly, the median mortality rate attributed to adverse drug events was reported to be 0.1% (interquartile range 0.0-0.3%). The most commonly reported types of medication errors were prescribing errors, occurring in a median of 57.4% (interquartile range 22.8-72.8%) of all prescriptions and a median of 15.5% (interquartile range 7.5-50.6%) of the prescriptions evaluated had dosing problems. Major contributing factors for medication errors reported in these studies were individual practitioner factors (e.g. fatigue and inadequate knowledge/training) and environmental factors, such as workplace distraction and high workload. Medication errors in the African healthcare setting are relatively common, and the impact of adverse drug events is substantial but many are preventable. This review supports the design and implementation of preventative strategies targeting the most likely contributing factors.
Palliative shunt surgery for patients with leptomeningeal metastasis.
Murakami, Yuta; Ichikawa, Masahiro; Bakhit, Mudathir; Jinguji, Shinya; Sato, Taku; Fujii, Masazumi; Sakuma, Jun; Saito, Kiyoshi
2018-05-01
Leptomeningeal metastasis (LM) is associated with poor prognosis and affects the quality of life (QOL) of end-stage cancer patients. Severe headache associated with hydrocephalus reduces QOL. We investigated the clinical value of surgical treatment for hydrocephalus in LM patients. The medical records of 11 consecutive patients who underwent lumboperitoneal shunt (LPS) or ventriculoperitoneal shunt (VPS) placement at our institution between 2007 and 2016 were investigated. Primary brain tumor patients were excluded. We assessed the neurological status and therapeutic effects at 1 month after the shunt surgery. The patients were three males and eight females with a median age of 58 years (interquartile range [IR] 52-68 years). The median preoperative neutrophil-to-lymphocyte ratio was 6.4 (IR 4.8-9.2). Symptom improvement was observed in nine patients, and severe headache was relieved in seven (88%) of eight patients. The median Karnofsky performance status scale increased from 40 to 60, and the median overall survival after primary malignancy diagnosis was 27.4 months (IR 19.6-63.1 months). The median survival times after the diagnosis of brain parenchymal metastasis, the diagnosis of LM, and shunt surgery were 7.2 months (IR 5.1-14.1 months), 3.9 months (IR 3.5-6.3 months), and 3.3 months (IR 2.9-5.7 months), respectively. Shunt surgery for hydrocephalus could offer an effective palliative surgical option for symptom relief, especially relief of severe headache, contributing to improved QOL in LM patients. Copyright © 2018 Elsevier B.V. All rights reserved.
Hutchinson, A; Brand, C; Irving, L; Roberts, C; Thompson, P; Campbell, D
2010-05-01
In 2003, chronic obstructive pulmonary disease (COPD) accounted for 46% of the burden of chronic respiratory disease in the Australian community. In the 65-74-year-old age group, COPD was the sixth leading cause of disability for men and the seventh for women. To measure the influence of disease severity, COPD phenotype and comorbidities on acute health service utilization and direct acute care costs in patients admitted with COPD. Prospective cohort study of 80 patients admitted to the Royal Melbourne Hospital in 2001-2002 for an exacerbation of COPD. Patients were followed for 12 months and data were collected on acute care utilization. Direct hospital costs were derived using Transition II, an activity-based costing system. Individual patient costs were then modelled to ascertain which patient factors influenced total direct hospital costs. Direct costs were calculated for 225 episodes of care; the median cost per admission was AU$3124 (interquartile range $1393 to $5045). The median direct cost of acute care management per patient per year was AU$7273 (interquartile range $3957 to $14,448). In a multivariate analysis using linear regression modelling, factors predictive of higher annual costs were increasing age (P= 0.041), use of domiciliary oxygen (P= 0.008) and the presence of chronic heart failure (P= 0.006). This model has identified a number of patient factors that predict higher acute care costs and awareness of these can be used for service planning to meet the needs of patients admitted with COPD.
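The cost model above relates annual direct costs to patient factors by linear regression. A minimal sketch follows with hypothetical variable names (annual_cost, age, home_oxygen, chf); log-transforming the skewed cost outcome is a common choice made here for illustration and is not stated in the abstract.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("copd_costs.csv")   # hypothetical: one row per patient

# Costs are right-skewed, so model the log of annual cost (an assumption)
df["log_cost"] = np.log(df["annual_cost"])

fit = smf.ols("log_cost ~ age + home_oxygen + chf", data=df).fit()
print(fit.summary())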
Valkenburg, Abraham J; Calvier, Elisa A M; van Dijk, Monique; Krekels, Elke H J; O'Hare, Brendan P; Casey, William F; Mathôt, Ron A A; Knibbe, Catherijne A J; Tibboel, Dick; Breatnach, Cormac V
2016-10-01
To compare the pharmacodynamics and pharmacokinetics of IV morphine after cardiac surgery in two groups of children-those with and without Down syndrome. Prospective, single-center observational trial. PICU in a university-affiliated pediatric teaching hospital. Twenty-one children with Down syndrome and 17 without, 3-36 months old, scheduled for cardiac surgery with cardiopulmonary bypass. A loading dose of morphine (100 μg/kg) was administered after coming off bypass; thereafter, morphine infusion was commenced at 40 μg/kg/hr. During intensive care, nurses regularly assessed pain and discomfort with validated observational instruments (COMFORT-Behavior scale and Numeric Rating Scale-for pain). These scores guided analgesic and sedative treatment. Plasma samples were obtained for pharmacokinetic analysis. Median COMFORT-Behavior and Numeric Rating Scale scores were not statistically significantly different between the two groups. The median morphine infusion rate during the first 24 hours after surgery was 31.3 μg/kg/hr (interquartile range, 23.4-36.4) in the Down syndrome group versus 31.7 μg/kg/hr (interquartile range, 25.1-36.1) in the control group (p = 1.00). Population pharmacokinetic analysis revealed no statistically significant differences in any of the pharmacokinetic variables of morphine between the children with and without Down syndrome. This prospective trial showed that there are no differences in pharmacokinetics or pharmacodynamics between children with and without Down syndrome if pain and distress management is titrated to effect based on outcomes of validated assessment instruments. We have no evidence to adjust morphine dosing after cardiac surgery in children with Down syndrome.
Social media as a tool for antimicrobial stewardship.
Pisano, Jennifer; Pettit, Natasha; Bartlett, Allison; Bhagat, Palak; Han, Zhe; Liao, Chuanhong; Landon, Emily
2016-11-01
To increase the reach of our antimicrobial stewardship program (ASP), social media platforms, Facebook and Twitter, were used to increase internal medicine residents' (IMRs') antibiotic (Abx) knowledge and awareness of ASP resources. Fifty-five of 110 (50%) IMRs consented to participate; 39 (71%) completed both pre- and postintervention surveys and followed our ASP on social media. Along with 20 basic Abx and infectious diseases (IDs) questions, this survey assessed IMR awareness of ASP initiatives, social media usage, and attitudes and beliefs surrounding Abx resistance. Over 6 months, IMRs received posts and Tweets of basic Abx/IDs trivia while promoting use of educational tools and clinical pathways on our ASP Web site. To compare pre- and postsurvey responses, McNemar test or Stuart-Maxwell test was used for categorical variables, and paired t test or Wilcoxon signed-rank test was used for continuous variables, as appropriate. Of the IMRs, 98% and 58% use Facebook and Twitter, respectively. Comparing pre- and postintervention surveys, median scores for Abx knowledge increased from 12 (interquartile range, 8-13) to 13 (interquartile range, 11-15; P = .048), and the proportion of IMRs knowing how to access the ASP Web site increased from 70% to 94%. More IMRs indicated that they used the clinical pathways "sometimes, frequently, or always" after the intervention (33% vs 61%, P = .004). Social media is a valuable tool to reinforce ASP initiatives while encouraging the use of ASP resources to promote antimicrobial mindfulness. Copyright © 2016 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
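The pre/post comparisons above rely on paired tests such as the Wilcoxon signed-rank test for the knowledge scores. A sketch on hypothetical paired scores, not the survey data, is given below.

import numpy as np
from scipy import stats

# Hypothetical paired antibiotic-knowledge scores (0-20) for the same residents
pre = np.array([12, 10, 13, 8, 11, 14, 9, 12, 13, 10])
post = np.array([13, 12, 15, 11, 12, 15, 11, 13, 14, 12])

stat, p = stats.wilcoxon(pre, post)
print(f"Wilcoxon signed-rank: W = {stat:.0f}, p = {p:.3f}")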
Measles in a South African paediatric intensive care unit: again!
Coetzee, Saskia; Morrow, Brenda M; Argent, Andrew C
2014-05-01
The aim of this study is to evaluate the outcomes of children with measles-related disease (MRD) admitted to a paediatric intensive care unit (PICU) and the effect of a recent measles epidemic on PICU resources and elective surgery. This was a retrospective observational study of all patients admitted to the PICU of Red Cross War Memorial Children's Hospital, Cape Town, South Africa, with MRD from January to December 2010. Patient admission characteristics, duration of PICU admission and mortality were recorded. Costs were calculated using bed days utilised and estimated daily PICU admission cost. A total of 1274 children were admitted over the study period, 58 (4.6%) with MRD (median (interquartile range) age 7 (5-9) months). Pneumonia was the most common reason for admission (81%) and the main cause of mortality. Non-MRD mortality was 8.8% compared with MRD mortality of 31% (P < 0.0001). Standardised mortality for non-MRD was 0.7 versus 1.7 in MRD (P = 0.002). HIV comorbidity and being underweight for age were associated with increased mortality. Patients with MRD occupied 379 bed days with a median (interquartile range) duration of stay of 5.5 (3.0-9.0) days at an estimated overall cost of R4,813,300 (approximately $543,900). During the study period, 67 children booked for elective surgery and 87 other referrals were refused PICU admission. MRD was associated with significant morbidity and mortality, and substantial strain on scarce PICU resources. © 2013 The Authors. Journal of Paediatrics and Child Health © 2013 Paediatrics and Child Health Division (Royal Australasian College of Physicians).
Gammaknife radiosurgery in patients with acromegaly.
Erdur, Fatih M; Kilic, Türker; Peker, Selcuk; Celik, Ozlem; Kadioglu, Pinar
2011-12-01
We aimed to evaluate the efficacy and reliability of gamma-knife radiosurgery (GKR) in 22 patients with acromegaly at the Endocrinology-Metabolism Clinic of Cerrahpasa Medical School. We collected data retrospectively from hospital records on disease activity and other pituitary functions, pituitary MRI and visual fields, before GKR and 6, 12, 24, 36, 48 and 60 months after GKR. The median follow-up duration after GKR was 60 months (interquartile range [IQR]: 24-60 months). The remission rate was 54.5% after the 60 months of follow-up. The median growth hormone (GH) level at 60 months after GKR (0.99 ng/mL [IQR: 0.36-2.2]) was significantly lower than the median GH level before GKR (5.65 ng/mL [IQR: 3.85-7.2] (p=0.002). The median insulin-like growth factor-1 (IGF-1) level 60 months after GKR (221.5 ng/mL [IQR: 149-535]) was significantly lower than the median IGF-1 level before GKR (582.5 ng/mL [IQR: 515-655]) (p=0.008). Tumour growth was well controlled in 20 patients (95.2%). Six patients (28.6%) developed new-onset hypopituitarism. We concluded that GKR is an effective adjuvant treatment to control tumour growth, lower GH and IGF-1 levels, and to increase remission rates in patients with acromegaly who were refractory to surgical and medical treatment. Copyright © 2011 Elsevier Ltd. All rights reserved.
Length of stay for older adults residing in nursing homes at the end of life.
Kelly, Anne; Conell-Price, Jessamyn; Covinsky, Kenneth; Cenzer, Irena Stijacic; Chang, Anna; Boscardin, W John; Smith, Alexander K
2010-09-01
To describe lengths of stay of nursing home decedents. Retrospective cohort study. The Health and Retirement Study (HRS), a nationally representative survey of U.S. adults aged 50 and older. One thousand eight hundred seventeen nursing home residents who died between 1992 and 2006. The primary outcome was length of stay, defined as the number of months between nursing home admission and date of death. Covariates were demographic, social, and clinical factors drawn from the HRS interview conducted closest to the date of nursing home admission. The mean age of decedents was 83.3 ± 9.0; 59.1% were female, and 81.5% were white. Median and mean length of stay before death were 5 months (interquartile range 1-20) and 13.7 ± 18.4 months, respectively. Fifty-three percent died within 6 months of placement. Large differences in median length of stay were observed according to sex (men, 3 months vs women, 8 months) and net worth (highest quartile, 3 months vs lowest quartile, 9 months) (all P <.001). These differences persisted after adjustment for age, sex, marital status, net worth, geographic region, and diagnosed chronic conditions (cancer, hypertension, diabetes mellitus, lung disease, heart disease, and stroke). Nursing home lengths of stay are brief for the majority of decedents. Lengths of stay varied markedly according to factors related to social support. © 2010, Copyright the Authors. Journal compilation © 2010, The American Geriatrics Society.
Dusetzina, Stacie B; Keating, Nancy L
2016-02-01
Orally administered anticancer medications are among the fastest growing components of cancer care. These medications are expensive, and cost-sharing requirements for patients can be a barrier to their use. For Medicare beneficiaries, the Affordable Care Act will close the Part D coverage gap (doughnut hole), which will reduce cost sharing from 100% in 2010 to 25% in 2020 for drug spending above $2,960 until the beneficiary reaches $4,700 in out-of-pocket spending. How much these changes will reduce out-of-pocket costs is unclear. We used the Medicare July 2014 Prescription Drug Plan Formulary, Pharmacy Network, and Pricing Information Files from the Centers for Medicare & Medicaid Services for 1,114 stand-alone and 2,230 Medicare Advantage prescription drug formularies, which represent all formularies in 2014. We identified orally administered anticancer medications and summarized drug costs, cost-sharing designs used by available plans, and the estimated out-of-pocket costs for beneficiaries without low-income subsidies who take a single drug before and after the doughnut hole closes. Little variation existed in formulary design across plans and products. The average price per month for included products was $10,060 (range, $5,123 to $16,093). In 2010, median beneficiary annual out-of-pocket costs for a typical treatment duration ranged from $6,456 (interquartile range, $6,433 to $6,482) for dabrafenib to $12,160 (interquartile range, $12,102 to $12,262) for sunitinib. With the assumption that prices remain stable, after the doughnut hole closes, beneficiaries will spend approximately $2,550 less. Out-of-pocket costs for Medicare beneficiaries taking orally administered anticancer medications are high and will remain so after the doughnut hole closes. Efforts are needed to improve affordability of high-cost cancer drugs for beneficiaries who need them. © 2015 by American Society of Clinical Oncology.
Comparative durability and costs analysis of ventricular shunts.
Agarwal, Nitin; Kashkoush, Ahmed; McDowell, Michael M; Lariviere, William R; Ismail, Naveed; Friedlander, Robert M
2018-05-11
OBJECTIVE Ventricular shunt (VS) durability has been well studied in the pediatric population and in patients with normal pressure hydrocephalus; however, further evaluation in a more heterogeneous adult population is needed. This study aims to evaluate the effect of diagnosis and valve type-fixed versus programmable-on shunt durability and cost for placement of shunts in adult patients. METHODS The authors retrospectively reviewed the medical records of all patients who underwent implantation of a VS for hydrocephalus at their institution over a 3-year period between August 2013 and October 2016 with a minimum postoperative follow-up of 6 months. The primary outcome was shunt revision, which was defined as reoperation for any indication after the initial procedure. Supply costs, shunt durability, and hydrocephalus etiologies were compared between fixed and programmable valves. RESULTS A total of 417 patients underwent shunt placement during the index time frame, consisting of 62 fixed shunts (15%) and 355 programmable shunts (85%). The mean follow-up was 30 ± 12 (SD) months. The shunt revision rate was 22% for programmable pressure valves and 21% for fixed pressure valves (HR 1.1 [95% CI 0.6-1.8]). Shunt complications, such as valve failure, infection, and overdrainage, occurred with similar frequency across valve types. Kaplan-Meier survival curve analysis showed no difference in durability between fixed (mean 39 months) and programmable (mean 40 months) shunts (p = 0.980, log-rank test). The median shunt supply cost per index case and accounting for subsequent revisions was $3438 (interquartile range $2938-$3876) and $1504 (interquartile range $753-$1584) for programmable and fixed shunts, respectively (p < 0.001, Wilcoxon rank-sum test). Of all hydrocephalus etiologies, pseudotumor cerebri (HR 1.9 [95% CI 1.2-3.1]) and previous shunt malfunction (HR 1.8 [95% CI 1.2-2.7]) were found to significantly increase the risk of shunt revision. Within each diagnosis, there were no significant differences in revision rates between shunts with a fixed valve and shunts with a programmable valve. CONCLUSIONS Long-term shunt revision rates are similar for fixed and programmable shunt pressure valves in adult patients. Hydrocephalus etiology may play a significant role in predicting shunt revision, although programmable valves incur higher supply costs regardless of initial diagnosis. Utilization of fixed pressure valves versus programmable pressure valves may reduce supply costs while maintaining similar revision rates. Given the importance of developing cost-effective management protocols, this study highlights the critical need for large-scale prospective observational studies and randomized clinical trials of ventricular shunt valve revisions and additional patient-centered outcomes.
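The durability comparison above uses Kaplan-Meier survival curves with a log-rank test of fixed versus programmable valves. A compact sketch with the lifelines package follows; the file and column names (months_to_revision, revised, valve_type) are hypothetical and this is illustrative, not the authors' code.

import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("shunts.csv")  # hypothetical: one row per shunt placement

fixed = df[df["valve_type"] == "fixed"]
prog = df[df["valve_type"] == "programmable"]

# Kaplan-Meier estimates of time to revision, censored at last follow-up
kmf = KaplanMeierFitter()
for name, grp in [("fixed", fixed), ("programmable", prog)]:
    kmf.fit(grp["months_to_revision"], event_observed=grp["revised"], label=name)
    print(name, "median time to revision (months):", kmf.median_survival_time_)

# Log-rank test for a difference in revision-free survival between valve types
result = logrank_test(
    fixed["months_to_revision"], prog["months_to_revision"],
    event_observed_A=fixed["revised"], event_observed_B=prog["revised"],
)
print("log-rank p =", round(result.p_value, 3))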
Matsui, Elizabeth C; Perzanowski, Matthew; Peng, Roger D; Wise, Robert A; Balcer-Whaley, Susan; Newman, Michelle; Cunningham, Amparito; Divjan, Adnan; Bollinger, Mary E; Zhai, Shuyan; Chew, Ginger; Miller, Rachel L; Phipatanakul, Wanda
2017-03-14
Professionally delivered integrated pest management (IPM) interventions can reduce home mouse allergen concentrations, but whether they reduce asthma morbidity among mouse-sensitized and exposed children and adolescents is unknown. To determine the effect of an IPM intervention on asthma morbidity among mouse-sensitized and exposed children and adolescents with asthma. Randomized clinical trial conducted in Baltimore, Maryland, and Boston, Massachusetts. Participants were mouse-sensitized and exposed children and adolescents (aged 5-17 years) with asthma randomized to receive professionally delivered IPM plus pest management education or pest management education alone. Enrollment occurred between May 2010 and August 2014; the final follow-up visit occurred on September 25, 2015. Integrated pest management consisted of application of rodenticide, sealing of holes that could serve as entry points for mice, trap placement, targeted cleaning, allergen-proof mattress and pillow encasements, and portable air purifiers. Infestation was assessed every 3 months, and if infestation persisted or recurred, additional treatments were delivered. All participants received pest management education, which consisted of written material and demonstration of the materials needed to set traps and seal holes. The primary outcome was maximal symptom days defined as the highest number of days of symptoms in the previous 2 weeks among 3 types of symptoms (days of slowed activity due to asthma; number of nights of waking with asthma symptoms; and days of coughing, wheezing, or chest tightness) across 6, 9, and 12 months. Of 361 children and adolescents who were randomized (mean [SD] age, 9.8 [3.2] years; 38% female; 181 in IPM plus pest management education group and 180 in pest management education alone group), 334 were included in the primary analysis. For the primary outcome, there was no statistically significant between-group difference for maximal symptom days across 6, 9, and 12 months with a median of 2.0 (interquartile range, 0.7-4.7) maximal symptom days in the IPM plus pest management education group and 2.7 (interquartile range, 1.3-5.0) maximal symptom days in the pest management education alone group (P = .16) and a ratio of symptom frequencies of 0.86 (95% CI, 0.69-1.06). Among mouse-sensitized and exposed children and adolescents with asthma, an intensive year-long integrated pest management intervention plus pest management education vs pest management education alone resulted in no significant difference in maximal symptom days from 6 to 12 months. clinicaltrials.gov Identifier: NCT01251224.
Romesser, Paul B; Pei, Xin; Shi, Weiji; Zhang, Zhigang; Kollmeier, Marisa; McBride, Sean M; Zelefsky, Michael J
2018-01-01
To evaluate the difference in prostate-specific antigen (PSA) recurrence-free, distant metastasis-free, overall, and cancer-specific survival between PSA bounce (PSA-B) and non-bounce patients treated with dose-escalated external beam radiation therapy (DE-EBRT). During 1990-2010, 1898 prostate adenocarcinoma patients were treated with DE-EBRT to ≥75 Gy with ≥5 years follow-up. Patients receiving neoadjuvant/concurrent androgen-deprivation therapy (n=1035) or with fewer than 4 PSA values obtained 6 months or more after post-EBRT completion (n=87) were excluded. The evaluable 776 patients were treated (median, 81.0 Gy). Prostate-specific antigen bounce was defined as a ≥0.2-ng/mL increase above the interval PSA nadir, followed by a decrease to nadir or below. Prostate-specific antigen relapse was defined as post-radiation therapy PSA nadir + 2 ng/mL. Median follow-up was 9.2 years (interquartile range, 6.9-11.3 years). One hundred twenty-three patients (15.9%) experienced PSA-B after DE-EBRT at a median of 24.6 months (interquartile range, 16.1-38.5 months). On multivariate analysis, younger age (P=.001), lower Gleason score (P=.0003), and higher radiation therapy dose (P=.0002) independently predicted PSA-B. Prostate-specific antigen bounce was independently associated with decreased risk for PSA relapse (hazard ratio [HR] 0.53; 95% confidence interval [CI] 0.33-0.85; P=.008), distant metastatic disease (HR 0.34; 95% CI 0.12-0.94; P=.04), and all-cause mortality (HR 0.53; 95% CI 0.29-0.96; P=.04) on multivariate Cox analysis. Because all 50 prostate cancer-specific deaths in patients without PSA-B were in the non-bounce cohort, competing-risks analysis was not applicable. A nonparametric competing-risks test demonstrated that patients with PSA-B had superior cancer-specific survival compared with patients without PSA-B (P=.004). Patients treated with dose-escalated radiation therapy for prostate adenocarcinoma who experience posttreatment PSA-B have improved PSA recurrence-free survival, distant metastasis-free survival, overall survival, and cancer-specific survival outcomes. Copyright © 2017 Elsevier Inc. All rights reserved.
Out-of-pocket costs associated with rotavirus gastroenteritis requiring hospitalization in Malaysia.
Chai, P F; Lee, W S
2009-11-20
From August 2006 to July 2007 a prospective study of out-of-pocket costs incurred by care-givers of children hospitalized for rotavirus gastroenteritis was conducted in a hospital in Malaysia. Data on caretaker out-of-pocket costs were collected from 260 children hospitalized with diarrhoea. A stool sample was collected from 198 of these children of which 46 (23%) were positive for rotavirus by latex agglutination assay. The mean (median; interquartile range) out-of-pocket cost incurred by the care-givers was US$194 (US$169; US$47-738), constituting 26% of average monthly income of the households surveyed. Major components of the cost were hospital expenses (45%) and productivity loss (37%). These findings will allow further assessment of the cost-effectiveness of any future rotavirus immunization program in Malaysia.
The Impact of High-Profile Sexual Abuse Cases in the Media on a Pediatric Emergency Department.
Flannery, Dustin D; Stephens, Clare L; Thompson, Amy D
2016-01-01
High-profile media cases of sexual abuse may encourage disclosures of abuse from victims of unrelated assaults and also influence parental concerns, leading to increased emergency department visits. In the region of the study authors' institution, there have been two recent high-profile sexual abuse cases with media coverage: Earl Bradley, a Delaware pediatrician, and Jerry Sandusky, a Pennsylvania college football coach. This is a retrospective cohort study of children evaluated for sexual abuse at a pediatric emergency department. Patients were classified as either presenting during a media period or non-media period. The media periods were one-month periods immediately following breaking news reports, when the cases were highly publicized in the media. The non-media periods were the 12-month periods directly preceding the first reports. The median number of emergency department visits per month during a non-media period was 9 visits (interquartile range 6-10). There were 11 visits in the month following the Sandusky case and 13 visits following the Bradley case. There was no statistically significant difference in the number of emergency department visits for sexual abuse between the periods (p = .09). These findings have implications regarding use of resources in pediatric EDs after high-profile sexual abuse cases.
Is Repeat PTA of a Failing Hemodialysis Fistula Durable?
Zdanowski, Zbigniew
2014-01-01
Purpose. Our objective was to evaluate the outcome of percutaneous transluminal angioplasty (PTA) and particularly rePTA in a failing arteriovenous fistula (AV-fistula). Are multiple redilations worthwhile? Patients and Methods. All 159 stenoses of AV fistulas that were treated with PTA, with or without stenting, during 2008 and 2009, were included. Occluded fistulas that were dilated after successful thrombolysis were also included. Median age was 68 (interquartile range 61.5–78.5) years and 75% were male. Results. Seventy-nine (50%) of the primary PTAs required no further reintervention. The primary patency was 61% at 6 months and 42% at 12 months. Eighty (50%) of the stenoses needed at least one reintervention. Primary assisted patency (defined as patency after subsequent reinterventions) was 89% at 6 months and 85% at 12 months. The durability of repeated PTAs was similar to the durability of the primary PTA. However, an early primary PTA carried a higher risk for subsequent reinterventions. Successful dialysis was achieved after 98% of treatments. Nine percent of the stenoses eventually required surgical revision and 13% of the fistulas failed permanently. Conclusion. The present study suggests that most failing AV-fistulas can be salvaged endovascularly. Repeated PTA seems similarly durable as the primary PTA. PMID:24587906
Long-term cardiac (valvulopathy) safety of cabergoline in prolactinoma
Khare, Shruti; Lila, Anurag R.; Patil, Rishikesh; Phadke, Milind; Kerkar, Prafulla; Bandgar, Tushar; Shah, Nalini S.
2017-01-01
Background: The clinical relevance of the association between cabergoline use for hyperprolactinemia and cardiac valvulopathy remains unclear. Objective: The aim of the study was to determine the prevalence of valvular heart abnormalities in patients taking cabergoline for the treatment of prolactinoma and to explore any associations with the cumulative dose of drug used. Design: A cross-sectional echocardiographic study was performed in patients who were receiving cabergoline therapy for prolactinoma. Results: One hundred (61 females, 39 males) prolactinoma cases (81 macroprolactinoma and 19 microprolactinoma) were included in the study. The mean age at presentation was 33.9 ± 9.0 years (range: 16–58 years). The mean duration of treatment was 53.11 ± 43.15 months (range: 12–155 months). The mean cumulative dose was 308.6 ± 290.2 mg (range: 26–1196 mg; interquartile range: 104–416 mg). Mild mitral regurgitation was present in one patient (cumulative cabergoline dose 104 mg). Mild tricuspid regurgitation was present in another two patients (cumulative cabergoline dose 52 mg and 104 mg). Aortic and pulmonary valve functioning was normal in all the cases. There were no cases of significant valvular regurgitation (moderate to severe, Grade 3–4). None of the patients had morphological abnormalities such as thickening, calcification, and restricted mobility of any of the cardiac valves. Conclusion: Cabergoline appears to be safe in patients with prolactinoma up to a cumulative dose of ~300 mg. Screening for valvulopathy should be restricted to those with higher cumulative cabergoline exposure. PMID:28217516
Jang, Sae; Vanderpool, Rebecca R; Avazmohammadi, Reza; Lapshin, Eugene; Bachman, Timothy N; Sacks, Michael; Simon, Marc A
2017-09-12
Right ventricular (RV) diastolic function has been associated with outcomes for patients with pulmonary hypertension; however, the relationship between biomechanics and hemodynamics in the right ventricle has not been studied. Rat models of RV pressure overload were obtained via pulmonary artery banding (PAB; control, n=7; PAB, n=5). At 3 weeks after banding, RV hemodynamics were measured using a conductance catheter. Biaxial mechanical properties of the RV free wall myocardium were obtained to extrapolate longitudinal and circumferential elastic modulus in low and high strain regions (E1 and E2, respectively). Hemodynamic analysis revealed significantly increased end-diastolic elastance (Eed) in PAB (control: 55.1 mm Hg/mL [interquartile range: 44.7-85.4 mm Hg/mL]; PAB: 146.6 mm Hg/mL [interquartile range: 105.8-155.0 mm Hg/mL]; P=0.010). Longitudinal E1 was increased in PAB (control: 7.2 kPa [interquartile range: 6.7-18.1 kPa]; PAB: 34.2 kPa [interquartile range: 18.1-44.6 kPa]; P=0.018), whereas there were no significant changes in longitudinal E2 or circumferential E1 and E2. Last, wall stress was calculated from hemodynamic data by modeling the right ventricle as a sphere: stress = (pressure × radius) / (2 × thickness). RV pressure overload in PAB rats resulted in an increase in diastolic myocardial stiffness reflected both hemodynamically, by an increase in Eed, and biomechanically, by an increase in longitudinal E1. Modest increases in tissue biomechanical stiffness are associated with large increases in Eed. Hemodynamic measurements of RV diastolic function can be used to predict biomechanical changes in the myocardium. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
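The spherical wall-stress estimate used above is the thin-walled (Laplace) relation stress = (pressure × radius) / (2 × thickness). A minimal sketch follows; the numeric inputs are hypothetical placeholders, not data from the study.

```python
# Minimal sketch of the spherical (Laplace) wall-stress estimate described above:
# stress = (pressure * radius) / (2 * thickness).
# The numeric inputs below are hypothetical placeholders, not data from the study.

def spherical_wall_stress(pressure_mmhg: float, radius_mm: float, thickness_mm: float) -> float:
    """Wall stress (in the same units as pressure) for a thin-walled spherical chamber."""
    return (pressure_mmhg * radius_mm) / (2.0 * thickness_mm)

if __name__ == "__main__":
    # Assumed example values: end-diastolic pressure 8 mm Hg, cavity radius 4 mm, wall thickness 1 mm
    print(spherical_wall_stress(8.0, 4.0, 1.0))  # prints 16.0 (mm Hg)
```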
Krieger, Eric V; Clair, Mathieu; Opotowsky, Alexander R; Landzberg, Michael J; Rhodes, Jonathan; Powell, Andrew J; Colan, Steven D; Valente, Anne Marie
2013-02-01
The role of exercise testing to risk stratify patients with repaired coarctation of the aorta (CoA) is controversial. Concentric left ventricular (LV) hypertrophy, defined as an increase in the LV mass-to-volume ratio (MVR), is associated with a greater incidence of adverse cardiovascular events. The objective of the present study was to determine whether a hypertensive response to exercise (HRE) is associated with increased LVMVR in patients with repaired CoA. Adults with repaired CoA who had a symptom-limited exercise test and cardiac magnetic resonance imaging examination within 2 years were identified. A hypertensive response to exercise was defined as a peak systolic blood pressure >220 mm Hg during a symptom-limited exercise test. The LV mass and volume were measured using cardiac magnetic resonance by an investigator who was unaware of patient status. We included 47 patients (median age 27.3 years, interquartile range 19.8 to 37.3), who had undergone CoA repair at a median age of 4.6 years (interquartile range 0.4 to 15.7). Those with (n = 11) and without (n = 36) HRE did not differ in age, age at repair, body surface area, arm-to-leg systolic blood pressure gradient, gender, or peak oxygen uptake with exercise. Those with a HRE had a greater mean systolic blood pressure at rest (146 ± 18 vs 137 ± 18 mm Hg, p = 0.04) and greater median LVMVR (0.85, interquartile range 0.7 to 1, vs 0.66, interquartile range 0.6 to 0.7; p = 0.04) than those without HRE. Adjusting for systolic blood pressure at rest, age, age at repair, and gender, the relation between HRE and LVMVR remained significant (p = 0.001). In conclusion, HRE was associated with increased LVMVR, even after adjusting for multiple covariates. Copyright © 2013 Elsevier Inc. All rights reserved.
Measured degree of dehydration in children and adolescents with type 1 diabetic ketoacidosis.
Ugale, Judith; Mata, Angela; Meert, Kathleen L; Sarnaik, Ashok P
2012-03-01
Successful management of diabetic ketoacidosis depends on adequate rehydration while avoiding cerebral edema. Our objectives are to 1) measure the degree of dehydration in children with type 1 diabetes mellitus and diabetic ketoacidosis based on change in body weight; and 2) investigate the relationships between measured degree of dehydration and clinically assessed degree of dehydration, severity of diabetic ketoacidosis, and routine serum laboratory values. Prospective observational study. University-affiliated tertiary care children's hospital. Sixty-six patients <18 yrs of age with type 1 diabetic ketoacidosis. Patients were weighed using a portable scale at admission; 8, 16, and 24 hrs; and daily until discharge. Measured degree of dehydration was based on the difference between admission and plateau weights. Clinical degree of dehydration was assessed by physical examination and severity of diabetic ketoacidosis was assessed by blood gas values as defined by international guidelines. Laboratory values obtained on admission included serum glucose, urea nitrogen, sodium, and osmolality. Median measured degree of dehydration was 5.2% (interquartile range, 3.1% to 7.8%). Fourteen (21%) patients were clinically assessed as mild dehydration, 49 (74%) as moderate, and three (5%) as severe. Patients clinically assessed as moderately dehydrated had a greater measured degree of dehydration (5.8%; interquartile range, 3.6% to 9.6%) than those assessed as mildly dehydrated (3.7%; interquartile range, 2.3% to 6.4%) or severely dehydrated (2.5%; interquartile range, 2.3% to 2.6%). Nine (14%) patients were assessed as mild diabetic ketoacidosis, 18 (27%) as moderate, and 39 (59%) as severe. Diabetic ketoacidosis severity groups did not differ in measured degree of dehydration. Variables independently associated with measured degree of dehydration included serum urea nitrogen and sodium concentration on admission. Hydration status in children with diabetic ketoacidosis cannot be accurately assessed by physical examination or blood gas values. Fluid therapy based on maintenance plus 6% deficit replacement is reasonable for most patients.
Mulligan, Angela A; Kuhnle, Gunter G C; Lentjes, Marleen A H; van Scheltinga, Veronica; Powell, Natasha A; McTaggart, Alison; Bhaniani, Amit; Khaw, Kay-Tee
2013-08-01
A diet rich in phyto-oestrogens has been suggested to protect against a variety of common diseases but UK intake data on phyto-oestrogens or their food sources are sparse. The present study estimates the average intakes of isoflavones, lignans, enterolignans and coumestrol from 7 d food diaries and provides data on total isoflavone, lignan and phyto-oestrogen consumption by food group. Development of a food composition database for twelve phyto-oestrogens and analysis of soya food and phyto-oestrogen consumption in a population-based study. Men and women, aged 40–79 years, from the general population participating in the Norfolk arm of the European Prospective Investigation into Cancer and Nutrition (EPIC-Norfolk) between 1993 and 1997, with nutrient and food data from 7 d food diaries. A subset of 20 437 participants. The median daily phyto-oestrogen intake for all men was 1199 μg (interquartile range 934–1537 μg; mean 1504 μg, SD 1502 μg) and 888 μg for all women (interquartile range 710–1135 μg; mean 1205 μg, SD 1701 μg). In soya consumers, median daily intakes were higher: 2861 μg in men (interquartile range 1304–7269 μg; mean 5051 μg, SD 5031 μg) and 3142 μg in women (interquartile range 1089–7327 μg; mean 5396 μg, SD 6092 μg). In both men and women, bread made the greatest contribution to phyto-oestrogen intake – 40.8% and 35.6%, respectively. In soya consumers, vegetable dishes and soya/goat's/sheep's milks were the main contributors – 45.7% and 21.3% in men and 38.4% and 33.7% in women, respectively. The ability to estimate phyto-oestrogen intake in Western populations more accurately will aid investigations into their suggested effects on health.
Chen, Hung-Yuan; Chiang, Chih-Kang; Wang, Hsi-Hao; Hung, Kuan-Yu; Lee, Yue-Joe; Peng, Yu-Sen; Wu, Kwan-Dun; Tsai, Tun-Jun
2008-08-01
Greater than 50% of dialysis patients experience sleep disturbances. Cognitive-behavioral therapy (CBT) is effective for treating chronic insomnia, but its effectiveness has never been reported in peritoneal dialysis (PD) patients and its association with cytokines is unknown. We investigated the effectiveness of CBT in PD patients by assessing changes in sleep quality and inflammatory cytokines. Randomized control study with parallel-group design. 24 PD patients with insomnia in a tertiary medical center without active medical and psychiatric illness were enrolled. The intervention group (N = 13) received CBT from a psychiatrist for 4 weeks and sleep hygiene education, whereas the control group (N = 11) received only sleep hygiene education. Primary outcomes were changes in the Pittsburgh Sleep Quality Index and Fatigue Severity Scale scores, and secondary outcomes were changes in serum interleukin 6 (IL-6), IL-1beta, IL-18, and tumor necrosis factor alpha levels during the 4-week trial. Median percentages of change in global Pittsburgh Sleep Quality Index scores were -14.3 (interquartile range, -35.7 to - 6.3) and -1.7 (interquartile range, -7.6 to 7.8) in the intervention and control groups, respectively (P = 0.3). Median percentages of change in global Fatigue Severity Scale scores were -12.1 (interquartile range, -59.8 to -1.5) and -10.5 (interquartile range, -14.3 to 30.4) in the intervention and control groups, respectively (P = 0.04). Serum IL-1beta level decreased in the intervention group, but increased in the control group (P = 0.04). There were no significant differences in changes in other cytokines. This study had a small number of participants and short observation period, and some participants concurrently used hypnotics. CBT may be effective for improving the quality of sleep and decreasing fatigue and inflammatory cytokine levels. CBT can be an effective nonpharmacological therapy for PD patients with sleep disturbances.
Index to Estimate the Efficiency of an Ophthalmic Practice.
Chen, Andrew; Kim, Eun Ah; Aigner, Dennis J; Afifi, Abdelmonem; Caprioli, Joseph
2015-08-01
A metric of efficiency, a function of the ratio of quality to cost per patient, will allow the health care system to better measure the impact of specific reforms and compare the effectiveness of each. To develop and evaluate an efficiency index that estimates the performance of an ophthalmologist's practice as a function of cost, number of patients receiving care, and quality of care. Retrospective review of 36 ophthalmology subspecialty practices from October 2011 to September 2012 at a university-based eye institute. The efficiency index (E) was defined as a function of the adjusted number of patients (N_a), total practice adjusted costs (C_a), and a preliminary measure of quality (Q). Constant b limits E between 0 and 1. Constant y modifies the influence of Q on E. Relative value units and geographic cost indices determined by the Centers for Medicare & Medicaid Services for 2012 were used to calculate adjusted costs. The efficiency index is expressed as the following: E = b(N_a/C_a)Q^y. Independent, masked auditors reviewed 20 random patient medical records for each practice and filled out 3 questionnaires to obtain a process-based quality measure. The adjusted number of patients, adjusted costs, quality, and efficiency index were calculated for 36 ophthalmology subspecialties. The median adjusted number of patients was 5516 (interquartile range, 3450-11,863), the median adjusted cost was 1.34 (interquartile range, 0.99-1.96), the median quality was 0.89 (interquartile range, 0.79-0.91), and the median value of the efficiency index was 0.26 (interquartile range, 0.08-0.42). The described efficiency index is a metric that provides a broad overview of performance for a variety of ophthalmology specialties as estimated by resources used and a preliminary measure of quality of care provided. The results of the efficiency index could be used in future investigations to determine its sensitivity to detect the impact of interventions on a practice such as training modules or practice restructuring.
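As a rough illustration of the index defined above, E = b(N_a/C_a)Q^y, the sketch below evaluates it for inputs shaped like the reported medians; the constants b and y are arbitrary placeholders chosen only so that E falls between 0 and 1, not the values used in the study.

```python
# Sketch of the efficiency index E = b * (N_a / C_a) * Q**y described above.
# b and y are the scaling constants from the definition; the values used here are
# arbitrary placeholders (chosen only so E stays between 0 and 1), not study values.

def efficiency_index(n_adjusted: float, cost_adjusted: float, quality: float,
                     b: float = 0.0001, y: float = 1.0) -> float:
    """Efficiency as a function of adjusted patient count, adjusted cost, and quality."""
    return b * (n_adjusted / cost_adjusted) * (quality ** y)

if __name__ == "__main__":
    # Inputs loosely shaped like the reported medians (5516 patients, adjusted cost 1.34, quality 0.89).
    print(round(efficiency_index(5516, 1.34, 0.89), 3))  # roughly 0.366 with these placeholder constants
```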
Ambient Ozone and Emergency Department Visits for Cellulitis
Szyszkowicz, Mieczysław; Porada, Eugeniusz; Kaplan, Gilaad G.; Rowe, Brian H.
2010-01-01
Objectives were to assess and estimate an association between exposure to ground-level ozone and emergency department (ED) visits for cellulitis. All ED visits for cellulitis in Edmonton, Canada, in the period April 1992–March 2002 (N = 69,547) were examined. A case-crossover design was applied to estimate the odds ratio (OR, with 95% confidence interval) per one interquartile range (IQR) increase in ozone concentration (IQR = 14.0 ppb). Delay between exposure and ED visit was probed using 0- to 5-day exposure lags. For all patients across all months (January–December) and lags 0 to 2 days, OR = 1.05 (1.02, 1.07). For male patients during the cold months (October–March): OR = 1.05 (1.02, 1.09) for lags 0 and 2 and OR = 1.06 (1.02, 1.10) for lag 3. For female patients in the warm months (April–September): OR = 1.12 (1.06, 1.18) for lags 1 and 2. Cellulitis developing on uncovered (more exposed) skin was analyzed separately, and the observed effects were stronger. Cellulitis may be associated with exposure to ambient ground-level ozone; the exposure may facilitate cellulitis infection and aggravate acute symptoms. PMID:21139878
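The "per one IQR increase" scaling used above is the standard rescaling of a per-unit log-odds coefficient from a (conditional) logistic model. A minimal sketch follows; the coefficient value is a hypothetical placeholder, and only the IQR of 14.0 ppb is taken from the abstract.

```python
# Rescaling a per-unit log-odds coefficient to an odds ratio per IQR increase in exposure,
# as commonly reported in case-crossover air-pollution analyses: OR_IQR = exp(beta * IQR).
# beta_per_ppb is a hypothetical coefficient; only the IQR of 14.0 ppb comes from the abstract.
import math

def odds_ratio_per_iqr(beta_per_unit: float, iqr: float) -> float:
    """Odds ratio for an IQR-sized increase in exposure."""
    return math.exp(beta_per_unit * iqr)

if __name__ == "__main__":
    beta_per_ppb = 0.0035  # hypothetical log-odds change per 1 ppb of ozone
    print(round(odds_ratio_per_iqr(beta_per_ppb, 14.0), 2))  # about 1.05
```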
Cerezo Espinosa, Cristina; Nieto Caballero, Sergio; Juguera Rodríguez, Laura; Castejón-Mochón, José Francisco; Segura Melgarejo, Francisca; Sánchez Martínez, Carmen María; López López, Carmen Amalia; Pardo Ríos, Manuel
2018-02-01
To compare secondary students' learning of basic life support (BLS) theory and the use of an automatic external defibrillator (AED) through face-to-face classroom instruction versus educational video instruction. A total of 2225 secondary students from 15 schools were randomly assigned to one of the following 5 instructional groups: 1) face-to-face instruction with no audiovisual support, 2) face-to-face instruction with audiovisual support, 3) audiovisual instruction without face-to-face instruction, 4) audiovisual instruction with face-to-face instruction, and 5) a control group that received no instruction. The students took a test of BLS and AED theory before instruction, immediately after instruction, and 2 months later. The median (interquartile range) scores overall were 2.33 (2.17) at baseline, 5.33 (4.66) immediately after instruction (P<.001), and 6.00 (3.33) at 2 months (P<.001). All groups except the control group improved their scores. Scores immediately after instruction and 2 months later were statistically similar after all types of instruction. No significant differences between face-to-face instruction and audiovisual instruction for learning BLS and AED theory were found in secondary school students either immediately after instruction or 2 months later.
Perlis, Nathan; Lo, Kirk C; Grober, Ethan D; Spencer, Leia; Jarvi, Keith
2013-08-01
To determine the coital frequency among infertile couples and which factors are associated with less frequent coitus. Cross-sectional study. Tertiary-level male infertility clinic. A total of 1,298 infertile men. Administration of computer-based survey, semen analysis, and serum hormone evaluation. Monthly coital frequency. A total of 1,298 patients presented to clinic for infertility consultation and completed the computer-based survey. The median male age was 35 years (interquartile range [IQR] 32-39 years) and the median duration of infertility was 2 years (IQR 1-4 years) before consultation. Median monthly coital frequency was seven (IQR 5-10; range 0-40); 24% of couples were having intercourse ≤ 4 times per month. Overall, 0.6%, 2.7%, 4.8%, 5.8%, and 10.8% of the men reported having intercourse 0, 1, 2, 3, and 4 times per month, respectively. When simultaneously taking into account the influence of age, libido, erectile function, and semen volume on coital frequency, older patients had 1.05 times higher odds (per year of age) of less frequent coitus (odds ratio 1.05, 95% confidence interval 1.03-1.08). In addition, patients with better erectile function had 1.12 times higher odds (per point on Sexual Health Inventory for Men scale) of more frequent coitus (odds ratio 1.12, 95% confidence interval 1.09-1.18). Similar to the general population, most infertile couples report having coitus more than four times per month. Older male age and erectile dysfunction are independent risk factors for less frequent coitus among infertile men, which could have an impact on fertility. Coital frequency should be considered in infertility assessments. Copyright © 2013 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
Ekenze, S O; Adiri, C O; Igwilo, I O; Onumaegbu, O O
2014-02-01
Virilization of the external genitalia in young girls (VEG) manifests mostly as ambiguity of the genitalia and elicits concerns and uncertainties especially in settings with poor awareness. This study evaluates the profile and challenges of VEG in southeast Nigeria. We analyzed 23 children with VEG managed in 2 referral centers in southeast Nigeria from June 2005 to January 2013. They presented at median age of 13.3 months (interquartile range [IQR] 3 months-3 years). The cases included 3 (13%) of Prader type 1, 6 (26%) of type 2, 11 (48%) of type 3, and 3 (13%) of type 4. Five of the Prader type 3 and all 3 cases of Prader type 4 were reared as male prior to presentation. Following evaluation, all the cases were assigned female gender at a mean age of 2.7 years (range 2 months-10.5 years). Appropriate feminizing genitoplasty was undertaken in all the cases and after a follow-up period of 3 months to 5 years (mean 2 years), 2 patients developed vaginal stenosis, and 3 cases had surgical wound infection. Poor awareness, delayed presentation, inadequate facilities, and lack of trained manpower were the challenges in the management of the cases. VEG in our setting is associated with delayed management. Focused health education and public awareness programs, and improved healthcare funding may improve outcome and minimize the need for gender reassignment. Copyright © 2014 North American Society for Pediatric and Adolescent Gynecology. Published by Elsevier Inc. All rights reserved.
Sámano, Reyna; Martínez-Rojano, Hugo; Godínez Martínez, Estela; Sánchez Jiménez, Bernarda; Villeda Rodríguez, Gilda Paulina; Pérez Zamora, Julieta; Casanueva, Esther
2013-06-01
Exclusive breastfeeding (EBF) in adolescent mothers has been associated with greater postpartum maternal weight loss. To assess the associations between EBF and weight loss in adolescent and adult mothers and between EBF and weight and length gain of their children. A cohort of 68 adolescent mothers (15 to 19 years), 64 adult mothers (20 to 29 years), and their infants were studied. Anthropometric measurements were performed at 15, 90, 180, and 365 days postpartum in the mothers and children. EBF was defined as consumption of human milk without supplementation of any type (water, juice, nonhuman milk, or food) for 4 months. Sixty-five percent of mothers sustained EBF for 4 months. There were no significant differences in the weight or length of the infants of adolescent and adult mothers at 365 days postpartum. Among infants of adult mothers, there was a significant difference between the weight gain of those who were exclusively breastfed and those who were not exclusively breastfed (6,498 +/- 1,060 vs 6,096 +/- 1,035 g, p < .050) at 365 days postpartum, according to the parameters for weight gain and length established by the World Health Organization (WHO). Among both adult and adolescent mothers, those who practiced EBF lost more weight than those who did not practice EBF (-2.9 kg, 95% interquartile range -5.7 to 0.8 kg, vs -1.8 kg, 95% interquartile range -2.8 to 2.2 kg; p = .004). Gestational weight gain, duration of EBF, and recovery of menstruation explained 21% of the variance (F = 28.184, p = .001) in change in postpartum maternal weight (in kilograms) from 0 to 365 days postpartum in all mothers. Pregestational weight, duration of EBF, and maternal age were factors that explained 14% (F = 22.759, p = .001) of the change in the weight and length of the infants from 0 to 365 days of life. EBF in adolescent and adult mothers influences postpartum weight loss and provides adequate infant growth in accordance with the WHO 2006 standards.
Effects of Prehospital Thrombolysis in Stroke Patients With Prestroke Dependency.
Nolte, Christian H; Ebinger, Martin; Scheitz, Jan F; Kunz, Alexander; Erdur, Hebun; Geisler, Frederik; Braemswig, Tim-Bastian; Rozanski, Michal; Weber, Joachim E; Wendt, Matthias; Zieschang, Katja; Fiebach, Jochen B; Villringer, Kersten; Grittner, Ulrike; Kaczmarek, Sabina; Endres, Matthias; Audebert, Heinrich J
2018-03-01
Data on the effects of intravenous thrombolysis on the outcome of patients with ischemic stroke who are dependent on assistance in activities of daily living prestroke are scarce. Recent registry-based analyses in patients independent in activities of daily living suggest that an earlier start of intravenous thrombolysis in the prehospital setting leads to better outcomes than a treatment start in hospital. We evaluated whether these observations can be corroborated in patients with prestroke dependency. This observational, retrospective analysis included all patients with acute ischemic stroke depending on assistance before stroke who received intravenous thrombolysis either on the Stroke Emergency Mobile (STEMO) or through conventional in-hospital care (CC) in a tertiary stroke center (Charité, Campus Benjamin Franklin, Berlin) during routine care. Prespecified outcomes were modified Rankin Scale scores of 0 to 3 and survival at 3 months, as well as symptomatic intracranial hemorrhage. Outcomes were adjusted in multivariable logistic regression. Between February 2011 and March 2015, 122 of 427 patients (28%) treated on STEMO and 142 of 505 patients (28%) treated via CC needed assistance before stroke. Median onset-to-treatment times were 97 (interquartile range, 69-159; STEMO) and 135 (interquartile range, 98-184; CC; P<0.001) minutes. After 3 months, modified Rankin Scale scores of 0 to 3 were observed in 48 STEMO patients (39%) versus 35 CC patients (25%; P=0.01), and 86 (70%, STEMO) versus 85 (60%, CC) patients were alive (P=0.07). After adjustment, STEMO care was favorable with respect to modified Rankin Scale scores of 0 to 3 (odds ratio, 1.99; 95% confidence interval, 1.02-3.87; P=0.042), with a nonsignificant result for survival (odds ratio, 1.73; 95% confidence interval, 0.95-3.16; P=0.07). Symptomatic intracranial hemorrhage occurred in 5 STEMO versus 12 CC patients (4.2% versus 8.5%; P=0.167). The results of this study suggest that an earlier, prehospital (as compared with in-hospital) start of intravenous thrombolysis in acute ischemic stroke may translate into better clinical outcome in patients with prestroke dependency. URL: http://www.clinicaltrials.gov. Unique identifier: NCT02358772. © 2018 American Heart Association, Inc.
Mueller, Sabine, E-mail: muellers@neuropeds.ucsf.edu; Department of Pediatrics, University of California, San Francisco, California; Department of Neurosurgery, University of California, San Francisco, California
Purpose: To assess, in a retrospective cohort study, rates and predictors of first and recurrent stroke in patients treated with cranial irradiation (CRT) and/or cervical irradiation at ≤18 years of age. Methods and Materials: We performed chart abstraction (n=383) and phone interviews (n=104) to measure first and recurrent stroke in 383 patients who received CRT and/or cervical radiation at a single institution between 1980 and 2009. Stroke was defined as a physician diagnosis and symptoms consistent with stroke. Incidence of first stroke was the number of first strokes per person-years of observation after radiation. We used survival analysis techniques to determine the cumulative incidence of first and recurrent stroke. Results: Among 325 subjects with sufficient follow-up data, we identified 19 first strokes (13 ischemic, 4 hemorrhagic, 2 unknown subtype) occurring at a median age of 24 years (interquartile range 17-33 years) in patients treated with CRT. Imaging was reviewed when available (n=13), and the stroke was confirmed in 12. Overall rate of first stroke was 625 (95% confidence interval [CI] 378-977) per 100,000 person-years. The cumulative incidence of first stroke was 2% (95% CI 0.01%-5.3%) at 5 years and 4% (95% CI 2.0%-8.4%) at 10 years after irradiation. With each 100-cGy increase in the radiation dose, the stroke hazard increased by 5% (hazard ratio 1.05; 95% CI 1.01-1.09; P=.02). We identified 6 recurrent strokes; 5 had available imaging that confirmed the stroke. Median time to recurrence was 15 months (interquartile range 6 months-3.2 years) after first stroke. The cumulative incidence of recurrent stroke was 38% (95% CI 17%-69%) at 5 years and 59% (95% CI 27%-92%) at 10 years after first stroke. Conclusion: Cranial irradiation puts childhood cancer survivors at high risk of both first and recurrent stroke. Stroke prevention strategies for these survivors are needed.
Brunori, Giuliano; Viola, Battista F; Parrinello, Giovanni; De Biase, Vincenzo; Como, Giovanna; Franco, Vincenzo; Garibotto, Giacomo; Zubani, Roberto; Cancarini, Giovanni C
2007-05-01
A supplemented very-low-protein diet (sVLPD) seems to be safe when postponing dialysis therapy. This prospective multicenter randomized controlled study was designed to assess the noninferiority of diet versus dialysis in 1-year mortality, assessed by using intention-to-treat and per-protocol analysis. Participants were Italian uremic patients without diabetes, older than 70 years, with a glomerular filtration rate of 5 to 7 mL/min (0.08 to 0.12 mL/s). Randomization was to an sVLPD (diet group) or dialysis. The sVLPD is a vegan diet (35 kcal/kg body weight daily; proteins, 0.3 g/kg body weight daily) supplemented with keto-analogues, amino acids, and vitamins. Patients following an sVLPD started dialysis therapy in the case of malnutrition, intractable fluid overload, hyperkalemia, or appearance of uremic symptoms. Outcomes were mortality, hospitalization, and metabolic markers. Fifty-six patients were randomly assigned to each group, median follow-up was 26.5 months (interquartile range, 40), and patients in the diet group spent a median of 10.7 months (interquartile range, 11) following an sVLPD. Forty patients in the diet group started dialysis treatment because of either fluid overload or hyperkalemia. There were 31 deaths (55%) in the dialysis group and 28 deaths (50%) in the diet group. One-year observed survival rates at intention to treat were 83.7% (95% confidence interval [CI], 74.5 to 94.0) in the dialysis group versus 87.3% (95% CI, 78.9 to 96.5) in the diet group (log-rank test for noninferiority, P < 0.001; for superiority, P = 0.6): the difference in survival was -3.6% (95% CI, -17 to +10; P = 0.002). The hazard ratio for hospitalization was 1.50 for the dialysis group (95% CI, 1.11 to 2.01; P < 0.01). Limitations included the unblinded nature of the study, exclusion of patients with diabetes, and incomplete enrollment. An sVLPD was effective and safe when postponing dialysis treatment in elderly patients without diabetes.
Bakker, Jessie P.; O'Keeffe, Karyn M.; Neill, Alister M.; Campbell, Angela J.
2011-01-01
Study Objectives: We aimed to investigate the influence of ethnicity on adherence with continuous positive airway pressure (CPAP) in a sample of New Zealand patients. Design: Observational study over one month. Setting: A university-based sleep laboratory. Patients: 126 consecutively consenting CPAP-naïve patients (19.8% Māori, mean±SD apnea-hypopnea index 57.9 ± 38.9 events/h, CPAP 11.1 ± 3.1 cm H2O). Interventions: Patients underwent a 4-week supervised home trial of CPAP following pressure titration. Measurements and Results: Self-identified ethnicity (Māori/non-Māori), Epworth Sleepiness Scale, Self-Efficacy Measure for Sleep Apnea, Rapid Estimate of Adult Literacy in Medicine, New Zealand Deprivation Index (calculated from residential address), New Zealand Individual Deprivation Index (validated 8-item questionnaire), educational history, income, and employment assessed at baseline were compared to objective CPAP adherence after one month. Māori demonstrated significantly lower usage than non-Māori (median 5.11, interquartile range 2.24 h/night compared with median 5.71, interquartile range 2.61 h/night, P = 0.05). There were no significant relationships between adherence and subjective sleepiness, health literacy, or self-efficacy. In a multivariate logistic regression model incorporating 5 variables (ethnicity, eligibility for government-subsidized healthcare, individual deprivation scores, income, and education), non-completion of tertiary education, and high individual socioeconomic deprivation remained significant independent predictors of average CPAP adherence not reaching ≥ 4 h (odds ratio 0.25, 95% CI 0.08-0.83, P = 0.02; odds ratio 0.10, 95% CI 0.02-0.86, P = 0.04, respectively). The overall model explained approximately 23% of the variance in adherence. Conclusions: The disparity in CPAP adherence demonstrated between Māori and non-Māori can be explained in part by lower education levels and socioeconomic status. Citation: Bakker JP; O'Keeffe KM; Neill AM; Campbell AJ. Ethnic disparities in CPAP adherence in New Zealand: effects of socioeconomic status, health literacy and self-efficacy. SLEEP 2011;34(11):1595-1603. PMID:22043130
Craig, Louise E; Bernhardt, Julie; Langhorne, Peter; Wu, Olivia
2010-11-01
Very early mobilization (VEM) is a distinctive characteristic of care in some stroke units; however, evidence of the effectiveness of this approach is limited. To date, only 2 phase II trials have compared VEM with standard care: A Very Early Rehabilitation Trial (AVERT) in Australia and the recently completed Very Early Rehabilitation or Intensive Telemetry after Stroke trial in the United Kingdom. The Very Early Rehabilitation or Intensive Telemetry after Stroke protocol was designed to complement that of AVERT in a number of key areas. The aim of this analysis was to investigate the impact of VEM on independence by pooling data from these 2 comparable trials. Individual data from the 2 trials were pooled. Overall, patients were between 27 and 97 years old, had first or recurring stroke, and were treated within 36 hours after stroke onset. The primary outcome was independence, defined as modified Rankin scale score of 0 to 2 at 3 months. The secondary outcomes included complications of immobility and activities of daily living. Logistic regression was used to assess the effect of VEM on outcome, adjusting for known confounders including age, baseline stroke severity, and premorbid modified Rankin scale score. All patients in AVERT and Very Early Rehabilitation or Intensive Telemetry after Stroke were included, resulting in 54 patients in the VEM group and 49 patients in the standard care group. The baseline characteristics of VEM patients were largely comparable with standard care patients. Time to first mobilization from symptom onset was significantly shorter among VEM patients (median, 21 hours; interquartile range, 15.8-27.8 hours) compared with standard care patients (median, 31 hours; interquartile range, 23.0-41.2 hours). VEM patients had significantly greater odds of independence compared with standard care patients (adjusted odds ratio, 3.11; 95% confidence interval, 1.03-9.33). Planned collaborations between stroke researchers to conduct trials with common protocols and outcome measures can help advance rehabilitation science. VEM was associated with improved independence at 3 months compared with standard care. However, both trials are limited by small sample sizes. Larger trials (such as AVERT phase III) are still needed in this field.
Mazanderani, Ahmad Haeri; Moyo, Faith; Kufa, Tendesayi; Sherman, Gayle G
2018-02-01
To describe baseline HIV-1 RNA viral load (VL) trends within South Africa's Early Infant Diagnosis program 2010-2016, with reference to prevention of mother-to-child transmission guidelines. HIV-1 total nucleic acid polymerase chain reaction (TNA PCR) and RNA VL data from 2010 to 2016 were extracted from the South African National Health Laboratory Service's central data repository. Infants with a positive TNA PCR and subsequent baseline RNA VL taken at age <7 months were included. Descriptive statistics were performed for quantified and lower-than-quantification limit (LQL) results per annum and age in months. Trend analyses were performed using log likelihood ratio tests. Multivariable linear regression was used to model the relationship between RNA VL and predictor variables, whereas logistic regression was used to identify predictors associated with LQL RNA VL results. Among 13,606 infants with a positive HIV-1 TNA PCR linked to a baseline RNA VL, median age of first PCR was 57 days and VL was 98 days. Thirteen thousand one hundred ninety-five (97.0%) infants had a quantified VL and 411 (3.0%) had an LQL result. A significant decline in median VL was observed between 2010 and 2016, from 6.3 log10 (interquartile range: 5.6-6.8) to 5.6 log10 (interquartile range: 4.2-6.5) RNA copies per milliliter, after controlling for age (P < 0.001), with younger age associated with lower VL (P < 0.001). The proportion of infants with a baseline VL <4 Log10 RNA copies per milliliter increased from 5.4% to 21.8%. Subsequent to prevention of mother-to-child transmission Option B implementation in 2013, the proportion of infants with an LQL baseline VL increased from 1.5% to 6.1% (P < 0.001). Between 2010 and 2016, a significant decline in baseline viremia within South Africa's Early Infant Diagnosis program was observed, with loss of detectability among some HIV-infected infants.
Selberherr, Andreas; Hörmann, Marcus; Prager, Gerhard; Riss, Philipp; Scheuba, Christian; Niederle, Bruno
2017-03-01
The purpose of this study was to demonstrate the high number of kidney stones in primary hyperparathyroidism (PHPT) and the low number of in fact "asymptomatic" patients. Forty patients with PHPT (28 female, 12 male; median age 58 (range 33-80) years; interquartile range 17 years [51-68]) without known symptoms of kidney stones prospectively underwent multidetector computed tomography (MDCT) and ultrasound (US) examinations of the urinary tract prior to parathyroid surgery. Images were evaluated for the presence and absence of stones, as well as for the number of stones and sizes in the long axis. The MDCT and US examinations were interpreted by two experienced radiologists who were blinded to all clinical and biochemical data. Statistical analysis was performed using the Wilcoxon signed-rank test. US revealed a total of 4 kidney stones in 4 (10 %) of 40 patients (median size 6.5 mm, interquartile range 11.5 mm). MDCT showed a total of 41 stones (median size was 3 mm, interquartile range 2.25 mm) in 15 (38 %) of 40 patients. The number of kidney stones detected with MDCT was significantly higher compared to US (p = 0.00124). MDCT is a highly sensitive method for the detection of "silent" kidney stones in patients with PHPT. By widely applying this method, the number of asymptomatic courses of PHPT may be substantially reduced. MDCT should be used primarily to detect kidney stones in PHPT and to exclude asymptomatic PHPT.
Effect of low cost public health staff training on exclusive breastfeeding.
Agampodi, Suneth Buddhika; Agampodi, Thilini Chanchala
2008-11-01
To assess the effectiveness and feasibility of on-the-job staff training and supportive supervision to improve six-month exclusive breastfeeding (EBF). A longitudinal study was conducted in a public health field practice area in Sri Lanka in 2006-2007. Three breastfeeding counseling sessions were conducted for public health midwives. Supportive supervision and on-the-job training were provided by two public health physicians. Pre- and post-intervention independent cross-sectional studies were conducted to assess the effectiveness of the programme. The study sample consisted of mother-infant pairs in which the infants were aged 6 to 12 months and attending child welfare clinics. The primary outcome measure was the proportion of infants who received EBF up to 6 months. A logistic regression model was used for analysis of predictors of EBF. The study sample consisted of 336 mother-infant pairs (pre 139, post 197). The proportion of mothers who breastfed their infants exclusively for six months improved from 19% to 70% after the intervention. The median duration of EBF increased from 4 months to 6 months (inter-quartile range 2-6 and 5-6 months, respectively). The unconfounded effect of the intervention on 6-month EBF in the logistic regression model was highly significant (OR=13.67, p<0.001). The intervention significantly reduced the bottle feeding rate (OR=0.212, p<0.001) but not formula feeding (OR=1.146, p=0.642). Of the potential predictors assessed, Sinhalese mothers (compared with Muslim mothers; OR=3.37, p<0.001) and employed mothers (compared with housewives; OR=4.45, p=0.014) were more likely to breastfeed their infants up to six months. Parity, maternal education and maternal age were not significantly associated with six-month EBF. The existing public health infrastructure can be used effectively to improve six-month EBF in places where care is given primarily by the public health system.
Fukuoka, Kahoru; Furuichi, Mihoko; Ito, Kenta; Morikawa, Yoshihiko; Watanabe, Ichiro; Shimizu, Naoki; Horikoshi, Yuho
2018-06-13
Catheter-associated urinary tract infections account for 30% of healthcare-associated infections. To date, few studies have addressed pediatric catheter-associated urinary tract infection in PICUs. The aim of our study was to assess the risk of catheter-associated urinary tract infection in relation to the duration of catheterization in the PICU. Retrospective cohort study. PICU at a tertiary children's hospital. Our study was conducted between April 2012 and June 2015 at Tokyo Metropolitan Children's Medical Center in Japan. Children in the PICU with a urethral catheter were included. Catheter-associated urinary tract infection cases were defined according to the National Healthcare Safety Network criteria. The patients' demographic data and isolated organisms were reviewed. Duration of catheterization and the catheter-associated urinary tract infection occurrence rate were analyzed. There were no interventions. Among 1,890 catheterizations, 23 catheter-associated urinary tract infection cases were identified. The overall occurrence rate was 2.35/1,000 catheter-days. Among the patients with catheter-associated urinary tract infection, 13 were boys. The median age was 11 months (interquartile range, 7-35 mo), and the median duration of catheterization was 7 days (interquartile range, 5-12 d). The isolated bacteria were Escherichia coli (26.5%), Enterococcus faecalis (17.6%), and Klebsiella pneumoniae (11.8%). Two species were isolated in each of 11 cases (47.8%). Each additional day of catheterization increased the risk of catheter-associated urinary tract infection (odds ratio, 1.06; 95% CI, 1.02-1.10; the odds ratio adjusted for contact precaution status and surgical procedures was 1.05; 95% CI, 1.01-1.09). Longer duration of catheterization increased the risk of catheter-associated urinary tract infection by 5% per day in the PICU. Prompt removal of the urethral catheter is strongly recommended whenever feasible.
Human milk IgA concentrations during the first year of lactation
Weaver, L.; Arthur, H.; Bunn, J.; Thomas, J.
1998-01-01
AIMS—To measure the concentrations of total IgA in the milk secreted by both breasts, throughout the first year of lactation, in a cohort of Gambian mothers of infants at high risk of infection. SUBJECTS AND METHODS—Sixty five women and their infants were studied monthly from the 4th to 52nd postpartum week. Samples of milk were obtained from each breast by manual expression immediately before the infant was suckled. Milk intakes were measured by test weighing the infants before and after feeds over 12 hour periods; IgA concentrations were determined by enzyme linked immunosorbent assay. RESULTS—A total of 1590 milk samples was measured. The median (interquartile range) concentration of IgA for all samples was 0.708(0.422-1.105) g/l; that in milk obtained from the left breast was 0.785 (0.458-1.247) g/l, and that in milk obtained from the right breast was 0.645 (0.388-1.011) g/l (p < 0.0001). There was no significant change in milk or IgA intakes with advancing infant age, but there was a close concordance of IgA concentrations between the two breasts, with "tracking" of the output of the left and right breasts. There was a significant (p < 0.01) negative correlation between maternal age and parity, and weight of milk ingested by infants. During the dry season (December to May) the median (interquartile range) IgA concentration was significantly higher at 0.853 (0.571-1.254) g/l than during the rainy season (June to November), when it was 0.518 (0.311-0.909) g/l (p < 0.0001). CONCLUSIONS—Sustained IgA secretion is likely to protect suckling infants from microbial infection. PMID:9613353
Workup for Perinatal Stroke Does Not Predict Recurrence.
Lehman, Laura L; Beaute, Jeanette; Kapur, Kush; Danehy, Amy R; Bernson-Leung, Miya E; Malkin, Hayley; Rivkin, Michael J; Trenor, Cameron C
2017-08-01
Perinatal stroke, including neonatal and presumed perinatal presentation, represents the age in childhood in which stroke occurs most frequently. The roles of thrombophilia, arteriopathy, and cardiac anomalies in perinatal ischemic stroke are currently unclear. We took a uniform approach to perinatal ischemic stroke evaluation to study these risk factors and their association with recurrent stroke. We reviewed records of perinatal stroke patients evaluated from August 2008 to February 2016 at a single referral center. Demographics, echocardiography, arterial imaging, and thrombophilia testing were collected. Statistical analysis was performed using Fisher exact test. Across 215 cases, the median follow-up was 3.17 years (1.49, 6.46). Females comprised 42.8% of cases. Age of presentation was neonatal (110, 51.2%) or presumed perinatal (105, 48.8%). The median age at diagnosis was 2.9 days (interquartile range, 2.0-9.9) for neonatal stroke and 12.9 months (interquartile range, 8.7-32.8) for presumed perinatal stroke. Strokes were classified as arterial (149, 69.3%), venous (60, 27.9%), both (4, 1.9%), or uncertain (2, 0.9%) by consensus imaging review. Of the 215 cases, there were 6 (2.8%) recurrent ischemic cerebrovascular events. Abnormal thrombophilia testing was not associated with recurrent stroke, except for a single patient with combined antithrombin deficiency and protein C deficiency. After excluding venous events, 155 patients were evaluated for arteriopathy and cardioembolic risk factors; neither was associated with recurrent stroke. Positive family history of thrombosis was not predictive of abnormal thrombophilia testing. Thrombophilia, arteriopathy, or cardioembolic risk factors were not predictive of recurrent events after perinatal stroke. Thrombophilia evaluation in perinatal stroke should only rarely be considered. © 2017 American Heart Association, Inc.
Hoenigl, Martin; Chaillon, Antoine; Moore, David J; Morris, Sheldon R; Smith, Davey M; Little, Susan J
2016-04-15
It remains unclear if methamphetamine is merely associated with high-risk behavior or if methamphetamine use causes high-risk behavior. Determining this would require a randomized controlled trial, which is clearly not ethical. A possible surrogate would be to investigate individuals before and after starting the use of methamphetamine. We performed a cohort study to analyze recent self-reported methamphetamine use and sexual risk behavior among 8905 men who have sex with men (MSM) receiving the "Early Test," a community-based HIV screening program in San Diego, CA, between April 2008 and July 2014 (total 17,272 testing encounters). Sexual risk behavior was evaluated using a previously published risk behavior score [San Diego Early Test (SDET) score] that predicts risk of HIV acquisition. Methamphetamine use during the last 12 months (hereafter, recent-meth) was reported by 754/8905 unique MSM (8.5%). SDET scores were significantly higher in the 754 MSM with recent-meth use compared with the 5922 MSM who reported that they have never used methamphetamine (P < 0.001). Eighty-two repeat testers initiated methamphetamine between testing encounter, with significantly higher SDET scores after starting methamphetamine [median 5 (interquartile range, 2-7) at recent-meth versus median 3 (interquartile range, 0-5) at never-meth; P < 0.001, respectively]. Given the ethical impossibility of conducting a randomized controlled trial, the results presented here provide the strongest evidence yet that initiation of methamphetamine use increases sexual risk behavior among HIV-uninfected MSM. Until more effective prevention or treatment interventions are available for methamphetamine users, HIV-uninfected MSM who use methamphetamine may represent ideal candidates for alternative effective prevention interventions (ie, preexposure prophylaxis).
Nardo, Luciano G; Christodoulou, Dimitra; Gould, Della; Roberts, Steve A; Fitzgerald, Cheryl T; Laing, Ian
2007-01-01
The aims of this prospective study were to investigate the relationship between anti-Müllerian hormone (AMH) and antral follicle count (AFC), and to determine whether these markers of ovarian reserve correlate with lifestyle factors, ethnicity, chronological age and reproductive history. Participants were 136 normo-ovulatory women undergoing infertility work-up within 3 months of their first ovarian stimulation cycle for in vitro fertilization. On day 3 of a spontaneous menstrual cycle, a blood sample for measurement of plasma AMH levels was taken and a transvaginal ultrasound scan to determine the AFC (follicles measuring 2-5 mm in diameter) was performed. Information about smoking, body mass index, alcohol consumption, ethnic origin, chronological age, age at menarche, years since menarche and gravidity were recorded using a case report form. The main outcome measures were plasma AMH concentrations and total number of small antral follicles (AFC). Median plasma levels of AMH were 2.0 ng/ml (interquartile range 1.1-3.6) and AFC was 10 (interquartile range 7-15). A positive correlation between AMH and AFC (r = 0.54, p < 0.0001) was found. AMH and AFC correlated negatively with age (r = -0.30, p < 0.001 and r = -0.27, p = 0.001 respectively) and number of years since menarche (r = -0.23, p = 0.007 and r = -0.21, p = 0.015 respectively), but not with any of the other measures. Circulating AMH levels and AFC correlated with each other and declined significantly with age. There were only weak, non-significant, correlations with lifestyle factors and reproductive history. These putative markers could be used individually or together to assess the age-related decline of ovarian function in normo-ovulatory candidates for IVF.
Padayatchi, Nesri; Naidu, Naressa; Yende-Zuma, Nonhlanhla; OʼDonnell, Max Roe; Naidoo, Kogieleum; Augustine, Stanton; Zumla, Alimuddin; Loveday, Marian
2016-09-01
The Xpert MTB/RIF assay has been widely implemented in South Africa for rapid tuberculosis (TB) screening. However, its usefulness for the management of patients with multidrug-resistant TB (MDR-TB) and for improving their treatment outcomes remains undefined. The aim of this study was to evaluate the clinical impact of introduction of the Xpert MTB/RIF assay in patients with MDR-TB. We enrolled 921 patients with MDR-TB, who presented to a specialist drug-resistant TB facility in KwaZulu-Natal, South Africa, pre- and post-rollout and implementation of the Xpert MTB/RIF assay. Clinical, laboratory, chest radiograph, and follow-up data from 108 patients with MDR-TB, post-introduction of the Xpert MTB/RIF assay (Xpert group) in November 2010, were analyzed and compared with data from 813 MDR-TB patients from the pre-Xpert MTB/RIF assay period (Conventional group), July 2008-2010. The primary impact measure was "treatment success" (World Health Organization definition) at 24 months. Secondary outcomes were time to treatment initiation and disease morbidity. There were no significant differences in treatment success rates between the pre-Xpert MTB/RIF and post-Xpert MTB/RIF groups (54% versus 56.5%, P = 0.681). Median time to treatment initiation was 20 days (interquartile range, 13-31) in the Xpert group versus 92 days (interquartile range, 69-120) in the Conventional group (P < 0.001). Although use of the Xpert MTB/RIF assay significantly reduced the time to initiation of MDR-TB treatment, it had no significant impact on treatment outcomes of patients with MDR-TB. Studies on the impact of the Xpert MTB/RIF assay usage on transmission of MDR-TB are required.
Kapel, Gijsbert F L; Reichlin, Tobias; Wijnmaalen, Adrianus P; Piers, Sebastiaan R D; Holman, Eduard R; Tedrow, Usha B; Schalij, Martin J; Stevenson, William G; Zeppenfeld, Katja
2015-02-01
Ventricular tachycardia (VT) is an important cause of late morbidity and mortality in repaired congenital heart disease. The substrate often includes anatomic isthmuses that can be transected by radiofrequency catheter ablation, similar to isthmus block for atrial flutter. This study evaluates the long-term efficacy of isthmus block for treatment of re-entry VT in adults with repaired congenital heart disease. Thirty-four patients (49±13 years; 74% male) with repaired congenital heart disease who underwent radiofrequency catheter ablation of VT in 2 centers were included. Twenty-two (65%) had preserved left and right ventricular function. Patients were inducible for a median of 1 VT (interquartile range, 1-2), with a median cycle length of 295 ms (interquartile range, 242-346 ms). Ablation aimed to transect anatomic isthmuses containing VT re-entry circuit isthmuses. Procedural success was defined as noninducibility of any VT and transection of the anatomic isthmus and was achieved in 25 (74%) patients. During long-term follow-up (46±29 months), all patients with procedural success (18/25 with internal cardiac defibrillators) were free of VT recurrence, but 7 of the 18 experienced internal cardiac defibrillator-related complications. One patient with procedural success and depressed cardiac function received an internal cardiac defibrillator shock for ventricular fibrillation. None of the 18 patients (12/18 with internal cardiac defibrillators) with complete success and preserved cardiac function experienced any ventricular arrhythmia. In contrast, VT recurred in 4 of 9 patients without procedural success. Four patients died from nonarrhythmic causes. In patients with repaired congenital heart disease with preserved ventricular function and isthmus-dependent re-entry, VT isthmus ablation can be curative. © 2014 American Heart Association, Inc.
Barnett, Adam S; Kim, Sunghee; Fonarow, Gregg C; Thomas, Laine E; Reiffel, James A; Allen, Larry A; Freeman, James V; Naccarelli, Gerald; Mahaffey, Kenneth W; Go, Alan S; Kowey, Peter R; Ansell, Jack E; Gersh, Bernard J; Hylek, Elaine M; Peterson, Eric D; Piccini, Jonathan P
2017-11-01
It is unclear how frequently patients with atrial fibrillation receive guideline-concordant (GC) care and whether guideline concordance is associated with improved outcomes. Using data from ORBIT-AF (Outcomes Registry for Better Informed Treatment of Atrial Fibrillation), we determined how frequently patients received care that was concordant with 11 recommendations from the 2014 American Heart Association/American College of Cardiology/Heart Rhythm Society atrial fibrillation guidelines pertaining to antithrombotic therapy, rate control, and antiarrhythmic medications. We also analyzed the association between GC care and clinical outcomes at both the patient level and center level. A total of 9570 patients were included. The median age was 75 years (interquartile range, 67-82), and the median CHA2DS2-VASc score was 4 (interquartile range, 3-5). A total of 5977 patients (62.5%) received care that was concordant with all guideline recommendations for which they were eligible. Rates of GC care were higher in patients treated by providers with greater specialization in arrhythmias (60.0%, 62.4%, and 67.0% for primary care physicians, cardiologists, and electrophysiologists, respectively; P<0.001). During a median of 30 months of follow-up, patients treated with GC care had a higher risk of bleeding hospitalization (hazard ratio=1.21; P=0.021) but a similar risk of death, stroke, major bleeding, and all-cause hospitalization. Over a third of patients with atrial fibrillation in this large outpatient registry received care that differed in some respect from guideline recommendations. There was no apparent association between GC care and improved risk-adjusted outcomes. © 2017 American Heart Association, Inc.
Ehrman, Jonathan K; Brawner, Clinton A; Al-Mallah, Mouaz H; Qureshi, Waqas T; Blaha, Michael J; Keteyian, Steven J
2017-10-01
Little is known about the relationship between change in cardiorespiratory fitness and mortality risk in Black patients. This study assessed change in cardiorespiratory fitness and its association with all-cause mortality risk in Black and White patients. This is a retrospective, longitudinal, observational cohort study of 13,345 patients (age = 55 ± 11 years; 39% women; 26% Black) who completed 2 exercise tests at least 12 months apart at Henry Ford Hospital, Detroit, Mich. All-cause mortality was identified through April 2013. Data were analyzed in 2015-2016 using Cox regression to calculate hazard ratios (HR) for risk of mortality associated with change in sex-specific cardiorespiratory fitness. Mean time between the tests was 3.4 years (interquartile range 1.9-5.6 years). During 9.1 years (interquartile range 6.3-11.6 years) of follow-up, there were 1931 (14%) deaths (16.5% Black, 13.7% White). For both races, change in fitness from Low to the Intermediate/High category resulted in a significant reduction of death risk (HR 0.65 [95% confidence interval (CI), 0.49-0.87] for Black; HR 0.41 [95% CI, 0.34-0.51] for White). Each 1-metabolic-equivalent-of-task increase was associated with a reduced mortality risk in Black (HR 0.84 [95% CI, 0.81-0.89]) and White (HR 0.87 [95% CI, 0.82-0.86]) patients. There was no interaction by race. Among Black and White patients, change in cardiorespiratory fitness from Low to Intermediate/High fitness was associated with a 35% and 59% lower risk of all-cause mortality, respectively. Copyright © 2017 Elsevier Inc. All rights reserved.
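The mortality analysis above relies on Cox proportional-hazards regression, in which the hazard ratio per 1-MET change in fitness is the exponentiated model coefficient. A minimal sketch of such a model using the lifelines package follows; the data frame, column names, and covariates are hypothetical illustrations, not the study's actual analysis set.

    # Minimal sketch of a Cox proportional-hazards model for all-cause
    # mortality versus change in fitness; all data below are hypothetical.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.DataFrame({
        "followup_years": [9.2, 4.1, 11.0, 7.5, 8.8, 2.9, 6.0, 10.1],  # time to death or censoring
        "died":           [0,   1,   0,    1,   0,   1,   1,   0],     # 1 = death observed
        "delta_mets":     [1.5, -0.8, 0.3, -2.1, 0.9, 0.6, -1.2, -0.4],# change in METs between tests
        "age":            [54,  61,  48,   70,  59,  58,  66,   63],
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="followup_years", event_col="died")

    # Hazard ratio per 1-MET increase = exp(coefficient for delta_mets)
    print(cph.hazard_ratios_["delta_mets"])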
Negative-Pressure Ventilation in Pediatric Acute Respiratory Failure.
Hassinger, Amanda B; Breuer, Ryan K; Nutty, Kirsten; Ma, Chang-Xing; Al Ibrahim, Omar S
2017-12-01
The objective of this work was to describe the use of negative-pressure ventilation (NPV) in a heterogeneous, critically ill pediatric population. A retrospective chart review was conducted of all patients admitted to a pediatric ICU with acute respiratory failure supported with NPV from January 1, 2012 to May 15, 2015. Two hundred thirty-three subjects at a median age of 15.5 months were supported with NPV for various etiologies, most commonly bronchiolitis (70%). Median (interquartile range) duration of support was 18.7 (8.7-34.3) h. The majority were NPV responders (70%), defined as not needing escalation to any form of positive-pressure ventilation. In non-responders, escalation occurred at a median (interquartile range) of 6.9 (3.3-16.6) h. More NPV non-responders had upper-airway obstruction (P = .02), and fewer had bronchiolitis (P = .008) compared with responders. A bedside scoring system developed on these data was 98% specific in predicting NPV failure by 4 h after NPV start (area under the curve 0.759, 95% CI 0.675-0.843, P < .001). Complications from NPV were rare (3%); however, delayed enteral nutrition (33%) and continuous intravenous sedation use (51%) in children while receiving NPV were more frequent. The annual percentage of pediatric ICU admissions requiring intubation declined by 28% in the 3 y after NPV introduction, compared with the 3 y prior. NPV is a noninvasive respiratory support for pediatric acute respiratory failure from all causes with few complications and a 70% response rate. Children receiving NPV often required intravenous sedation for comfort, and one third received delayed enteral nutrition. Those who required escalation from NPV worsened within 6 h; this may be predictable with a bedside scoring system. Copyright © 2017 by Daedalus Enterprises.
Hastrup, Sidsel; Damgaard, Dorte; Johnsen, Søren Paaske; Andersen, Grethe
2016-07-01
We designed and validated a simple prehospital stroke scale to identify emergent large vessel occlusion (ELVO) in patients with acute ischemic stroke and compared the scale to other published scales for prediction of ELVO. A national historical test cohort of 3127 patients with information on intracranial vessel status (angiography) before reperfusion therapy was identified. National Institutes of Health Stroke Scale (NIHSS) items with the highest predictive value of occlusion of a large intracranial artery were identified, and the optimal combination meeting predefined criteria to ensure usefulness in the prehospital phase was determined. The predictive performance of the Prehospital Acute Stroke Severity (PASS) scale was compared with other published scales for ELVO. The PASS scale was composed of 3 NIHSS scores: level of consciousness (month/age), gaze palsy/deviation, and arm weakness. The PASS scale was derived on two-thirds of the test cohort, in which it showed an accuracy (area under the curve) of 0.76 for detecting large arterial occlusion. The optimal cut point of ≥2 abnormal items showed sensitivity=0.66 (95% CI, 0.62-0.69), specificity=0.83 (0.81-0.85), and area under the curve=0.74 (0.72-0.76). Validation on the remaining one-third of the test cohort showed similar performance. Patients with a large artery occlusion on angiography with PASS ≥2 had a median NIHSS score of 17 (interquartile range=6) versus a median NIHSS score of 6 (interquartile range=5) for PASS <2. The PASS scale performed as well as other published scales for predicting ELVO while being simpler. The PASS scale is simple and has promising accuracy for prediction of ELVO in the field. © 2016 American Heart Association, Inc.
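Since the PASS score is simply the number of abnormal findings among the three NIHSS-derived items, with two or more flagging a possible large vessel occlusion, the scoring logic can be sketched as follows; the function names and boolean-item representation are illustrative assumptions, not taken from the paper.

    def pass_score(loc_questions_abnormal, gaze_palsy_or_deviation, arm_weakness):
        """Prehospital Acute Stroke Severity (PASS) score: one point for each
        abnormal item (level-of-consciousness questions, gaze palsy/deviation,
        arm weakness)."""
        return sum(bool(x) for x in (loc_questions_abnormal,
                                     gaze_palsy_or_deviation,
                                     arm_weakness))

    def suspect_elvo(score, cutoff=2):
        # The derivation cohort above used a cut point of >=2 abnormal items
        # (sensitivity 0.66, specificity 0.83).
        return score >= cutoff

    print(suspect_elvo(pass_score(True, True, False)))  # -> True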
Allergy Testing in Children With Low-Risk Penicillin Allergy Symptoms.
Vyles, David; Adams, Juan; Chiu, Asriani; Simpson, Pippa; Nimmer, Mark; Brousseau, David C
2017-08-01
Penicillin allergy is commonly reported in the pediatric emergency department (ED). True penicillin allergy is rare, yet the diagnosis results in the denial of first-line antibiotics. We hypothesized that all children presenting to the pediatric ED with symptoms deemed to be low-risk for immunoglobulin E-mediated hypersensitivity would return negative results for true penicillin allergy. Parents of children aged 4 to 18 years presenting to the pediatric ED with a history of parent-reported penicillin allergy completed an allergy questionnaire. A prespecified sample of 100 children categorized as low-risk on the basis of reported symptoms completed penicillin allergy testing using a standard 3-tier testing process. The percentage of children with negative allergy testing results was calculated with a 95% confidence interval. Five hundred ninety-seven parents completed the questionnaire describing their child's reported allergy symptoms. Three hundred two (51%) children had low-risk symptoms and were eligible for testing. Of those, 100 children were tested for penicillin allergy. The median (interquartile range) age at testing was 9 years (5-12). The median (interquartile range) age at allergy diagnosis was 1 year (9 months-3 years). Rash (97 [97%]) and itching (63 [63%]) were the most commonly reported allergy symptoms. Overall, 100 children (100%; 95% confidence interval 96.4%-100%) were found to have negative results for penicillin allergy and had their labeled penicillin allergy removed from their medical record. All children categorized as low-risk by our penicillin allergy questionnaire were found to have negative results for true penicillin allergy. The utilization of this questionnaire in the pediatric ED may facilitate increased use of first-line penicillin antibiotics. Copyright © 2017 by the American Academy of Pediatrics.
Hong, Il Ki; Choi, Jong Bae; Lee, Jong Ha
2012-09-01
Paresis of the upper extremity after stroke is not effectively addressed by existing therapies. We investigated whether mental imagery training combined with electromyogram-triggered electric stimulation improved motor function of the paretic upper extremity in patients with chronic stroke and induced cortical changes. Fourteen subjects with chronic stroke (≥12 months) were randomly allocated to receive mental imagery training combined with electromyogram-triggered electric stimulation (n=7) or generalized functional electric stimulation (n=7) on the forearm extensor muscles of the paretic extremity in two 20-minute daily sessions, 5 days a week for 4 weeks. The upper extremity component of the Fugl-Meyer Motor Assessment, the Motor Activity Log, the modified Barthel Index, and (18)F-fluorodeoxyglucose brain positron emission tomography were measured before and after the intervention. The group receiving mental imagery training combined with electromyogram-triggered electric stimulation exhibited significant improvements in the upper extremity component of the Fugl-Meyer Motor Assessment after intervention (median, 7; interquartile range, 5-8; P<0.05), but the group receiving functional electric stimulation did not (median, 0; interquartile range, 0-3). Differences in score changes between the 2 groups were significant. The mental imagery training combined with electromyogram-triggered electric stimulation group showed significantly increased metabolism in the contralesional supplementary motor, precentral, and postcentral gyri (P(uncorrected)<0.001) after the intervention, but the functional electric stimulation group showed no significant differences. Mental imagery training combined with electromyogram-triggered electric stimulation improved motor function of the paretic extremity in patients with chronic stroke. The intervention increased metabolism in the contralesional motor-sensory cortex. Clinical Trial Registration: URL: https://e-irb.khmccri.or.kr/eirb/receipt/index.html?code=02&status=5. Unique identifier: KHUHMDIRB 1008-02.
Belle, Loic; Motreff, Pascal; Mangin, Lionel; Rangé, Grégoire; Marcaggi, Xavier; Marie, Antoine; Ferrier, Nadine; Dubreuil, Olivier; Zemour, Gilles; Souteyrand, Géraud; Caussin, Christophe; Amabile, Nicolas; Isaaz, Karl; Dauphin, Raphael; Koning, René; Robin, Christophe; Faurie, Benjamin; Bonello, Laurent; Champin, Stanislas; Delhaye, Cédric; Cuilleret, François; Mewton, Nathan; Genty, Céline; Viallon, Magalie; Bosson, Jean Luc; Croisille, Pierre
2016-03-01
Delayed stent implantation after restoration of normal epicardial flow by a minimalist immediate mechanical intervention aims to decrease the rate of distal embolization and impaired myocardial reperfusion after percutaneous coronary intervention. We sought to confirm whether a delayed stenting (DS) approach (24-48 hours) improves myocardial reperfusion, versus immediate stenting, in patients with acute ST-segment-elevation myocardial infarction undergoing primary percutaneous coronary intervention. In the prospective, randomized, open-label Minimalist Immediate Mechanical Intervention (MIMI) trial, patients (n=140) with ST-segment-elevation myocardial infarction of ≤12 hours' duration were randomized to immediate stenting (n=73) or DS (n=67) after Thrombolysis In Myocardial Infarction 3 flow restoration by thrombus aspiration. Patients in the DS group underwent a second coronary arteriography for stent implantation a median of 36 hours (interquartile range 29-46) after randomization. The primary end point was microvascular obstruction (% left ventricular mass) on cardiac magnetic resonance imaging performed 5 days (interquartile range 4-6) after the first procedure. There was a nonsignificant trend toward lower microvascular obstruction in the immediate stenting group compared with the DS group (1.88% versus 3.96%; P=0.051), which became significant after adjustment for the area at risk (P=0.049). Median infarct weight, left ventricular ejection fraction, and infarct size did not differ between groups. No difference in 6-month outcomes was apparent for the rate of major cardiovascular and cerebral events. The present findings do not support a strategy of DS versus immediate stenting in patients with ST-segment-elevation myocardial infarction undergoing primary percutaneous coronary intervention and even suggest a deleterious effect of DS on microvascular obstruction size. URL: http://www.clinicaltrials.gov. Unique identifier: NCT01360242. © 2016 American Heart Association, Inc.
Saifodine, Abuchahama; Gudo, Paula Samo; Sidat, Mohsin; Black, James
2013-06-07
TB control is based on the rapid identification of cases and their effective treatment. However, many studies have shown that there are important delays in the diagnosis and treatment of patients with TB. The purpose of this study was to assess the prevalence of, and identify risk factors associated with, patient delay and health system delay among newly diagnosed patients with pulmonary TB. A cross-sectional study was carried out in Beira city, Mozambique, between September 2009 and February 2010. Patients in the first month of treatment were consecutively selected for this study if they had a diagnosis of pulmonary TB, had no history of previous TB treatment, were 18 years or older, and provided informed consent. Data were obtained through a questionnaire administered to the patients and from patients' files. Among the 622 patients included in the study, the median age was 32 years (interquartile range, 26-40) and 272 (43.7%) were female. The median total delay, patient delay and health system delay were 150 days (interquartile range, 91-240), 61 days (28-113) and 62 days (37-120), respectively. The contribution of patient delay and health system delay to total delay was similar. Farming, visiting a traditional healer first, low TB knowledge and coexistence of a chronic disease were associated with increased patient delay. More than two visits to a health facility, farming and coexistence of a chronic disease were associated with increased health system delay. This study revealed a long total delay with a similar contribution of patient delay and health system delay. To reduce the total delay in this setting we need a combination of interventions to encourage patients to seek appropriate health care earlier and to expedite TB diagnosis within the health care system.
Early Outcomes in Children With Antineutrophil Cytoplasmic Antibody-Associated Vasculitis.
Morishita, Kimberly A; Moorthy, Lakshmi N; Lubieniecka, Joanna M; Twilt, Marinka; Yeung, Rae S M; Toth, Mary B; Shenoi, Susan; Ristic, Goran; Nielsen, Susan M; Luqmani, Raashid A; Li, Suzanne C; Lee, Tzielan; Lawson, Erica F; Kostik, Mikhail M; Klein-Gitelman, Marisa; Huber, Adam M; Hersh, Aimee O; Foell, Dirk; Elder, Melissa E; Eberhard, Barbara A; Dancey, Paul; Charuvanij, Sirirat; Benseler, Susanne M; Cabral, David A
2017-07-01
To characterize the early disease course in childhood-onset antineutrophil cytoplasmic antibody (ANCA)-associated vasculitis (AAV) and the 12-month outcomes in children with AAV. Eligible subjects were children entered into the Pediatric Vasculitis Initiative study who were diagnosed before their eighteenth birthday as having granulomatosis with polyangiitis (Wegener's), microscopic polyangiitis, eosinophilic granulomatosis with polyangiitis (Churg-Strauss), or ANCA-positive pauci-immune glomerulonephritis. The primary outcome measure was achievement of disease remission (Pediatric Vasculitis Activity Score [PVAS] of 0) at 12 months with a corticosteroid dosage of <0.2 mg/kg/day. Secondary outcome measures included the rates of inactive disease (PVAS of 0, with any corticosteroid dosage) and rates of improvement at postinduction (4-6 months after diagnosis) and at 12 months, presence of damage at 12 months (measured by a modified Pediatric Vasculitis Damage Index [PVDI]; score 0 = no damage, score 1 = one damage item present), and relapse rates at 12 months. In total, 105 children with AAV were included in the study. The median age at diagnosis was 13.8 years (interquartile range 10.9-15.8 years). Among the study cohort, 42% of patients achieved remission at 12 months, 49% had inactive disease at postinduction (4-6 months), and 61% had inactive disease at 12 months. The majority of patients improved, even if they did not achieve inactive disease. An improvement in the PVAS score of at least 50% from time of diagnosis to postinduction was seen in 92% of patients. Minor relapses occurred in 12 (24%) of 51 patients after inactive disease had been achieved postinduction. The median PVDI damage score at 12 months was 1 (range 0-6), and 63% of patients had ≥1 PVDI damage item scored as present at 12 months. This is the largest study to date to assess disease outcomes in pediatric AAV. Although the study showed that a significant proportion of patients did not achieve remission, the majority of patients responded to treatment. Unfortunately, more than one-half of this patient cohort experienced damage to various organ systems early in their disease course. © 2017, American College of Rheumatology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vance, Sean M.; Stenmark, Matthew H.; Blas, Kevin
2012-07-01
Purpose: To investigate the prognostic utility of the percentage of cancer volume (PCV) in needle biopsy specimens for prostate cancer patients treated with dose-escalated external beam radiotherapy. Methods and Materials: The outcomes were analyzed for 599 men treated for localized prostate cancer with external beam radiotherapy to a minimal planning target volume dose of 75 Gy (range, 75-79.2). We assessed the effect of PCV and the pretreatment and treatment-related factors on the freedom from biochemical failure, freedom from metastasis, cause-specific survival, and overall survival. Results: The median number of biopsy cores was 7 (interquartile range, 6-12), median PCV was 10% (interquartile range, 2.5-25%), and median follow-up was 62 months. The PCV correlated with the National Comprehensive Cancer Network risk group and individual risk features, including T stage, prostate-specific antigen level, Gleason score, and percentage of positive biopsy cores. On log-rank analysis, the PCV stratified by quartile was prognostic for all endpoints, including overall survival. In addition, the PCV was a stronger prognostic factor than the percentage of positive biopsy cores when the two metrics were analyzed together. On multivariate analysis, the PCV predicted a worse outcome for all endpoints, including freedom from biochemical failure (hazard ratio, 1.9; p = .0035), freedom from metastasis (hazard ratio, 1.7; p = .09), cause-specific survival (hazard ratio, 3.9; p = .014), and overall survival (hazard ratio, 1.8; p = .02). Conclusions: For patients treated with dose-escalated external beam radiotherapy, the volume of cancer in the biopsy specimen adds prognostic value for clinically relevant endpoints, particularly in intermediate- and high-risk patients. Although the PCV determination is more arduous than the percentage of positive biopsy cores, it provides superior risk stratification.
Clinical Performance of a New Bitangential Mini-scleral Lens.
Otten, Henny M; van der Linden, Bart J J J; Visser, Esther-Simone
2018-06-01
New bitangential mini-scleral lens designs provide a highly precise fit, follow the scleral shape, are comfortable to wear, and can correct residual astigmatism. This new scleral lens design complements the arsenal of medical contact lenses available to eye care practitioners. The aim of this study was to evaluate the subjective and objective performance of a new mini-scleral lens design with a bitangential periphery. In this observational study, data were collected for up to 15 months (median, 84 days; interquartile range, 76 days) from the left eyes of 133 patients fitted with this newly designed lens. Data were recorded during regular visits at Visser Contact Lens Practice's scleral lens clinics: diagnosis, clinical indication for scleral lenses, previous contact lens type, subjective performance, horizontal visible iris diameter, corrected distance visual acuity, and scleral lens fitting characteristics. The most common indication was keratoconus (45%), followed by irregular astigmatism (22%), keratoplasty (16.5%), ocular surface disease (13.5%), and other forms of irregular astigmatism (3%). The majority of patients (79%) scored comfort as either a 4 or 5 (out of 5), and 82% wore their lenses 12 hours or longer a day. Most lenses (81%) had a diameter of 16 mm (median, 16 mm; range, 15.5 to 17 mm) and were composed of Boston XO2 (46%), Menicon Z (44%), Boston XO (9%), or Boston Equalens II (1%). The median corrected distance visual acuity was 0.022 logarithm of the minimal angle of resolution (interquartile range, 0.155). The fitting characteristics revealed optimal values for centration and movement in 91% and 83%, respectively. Finally, the median stabilization axis was 50 degrees. New mini-scleral lenses with bitangential peripheral geometry yield satisfactory clinical results and good subjective performance and are therefore an effective option for managing patients who have irregular astigmatism or other corneal pathology.
Pulmonary rehabilitation in lymphangioleiomyomatosis: a controlled clinical trial.
Araujo, Mariana S; Baldi, Bruno G; Freitas, Carolina S G; Albuquerque, André L P; Marques da Silva, Cibele C B; Kairalla, Ronaldo A; Carvalho, Celso R F; Carvalho, Carlos R R
2016-05-01
Lymphangioleiomyomatosis (LAM) is a cystic lung disease frequently associated with reduced exercise capacity. The aim of this study was to assess the safety and efficacy of pulmonary rehabilitation in LAM. This controlled clinical trial included 40 patients with LAM and a low physical activity level. The pulmonary rehabilitation programme comprised 24 aerobic and muscle strength training sessions and education. The primary outcome was exercise capacity (endurance time during a constant work rate exercise test). Secondary outcomes included health-related quality of life (St George's Respiratory Questionnaire (SGRQ)), 6-min walking distance (6MWD), dyspnoea, peak oxygen consumption (V'O2), daily physical activity (pedometer), symptoms of anxiety and depression, lung function and peripheral muscle strength (one-repetition maximum). The baseline characteristics were well balanced between the groups. The pulmonary rehabilitation group exhibited improvements in the following outcomes versus controls: endurance time (median (interquartile range) 169 (2-303) s versus -33 (-129 to 39) s; p=0.001), SGRQ (median (interquartile range) -8 (-16 to 2) versus 2 (-4 to 5); p=0.002) and 6MWD (median (interquartile range) 59 (13-81) m versus 20 (-12 to 30) m; p=0.002). Dyspnoea, peak V'O2, daily physical activity and muscle strength also improved significantly. No serious adverse events were observed. Pulmonary rehabilitation is a safe intervention and improves exercise capacity, dyspnoea, daily physical activity, quality of life and muscle strength in LAM. Copyright ©ERS 2016.
Prehospital Emergency Care in Childhood Arterial Ischemic Stroke.
Stojanovski, Belinda; Monagle, Paul T; Mosley, Ian; Churilov, Leonid; Newall, Fiona; Hocking, Grant; Mackay, Mark T
2017-04-01
Immediately calling an ambulance is the key factor in reducing time to hospital presentation for adult stroke. Little is known about prehospital care in childhood arterial ischemic stroke (AIS). We aimed to determine emergency medical services call-taker and paramedic diagnostic sensitivity and to describe timelines of care in childhood AIS. This is a retrospective study of ambulance-transported children aged <18 years with first radiologically confirmed AIS, from 2008 to 2015. Interhospital transfers of children with a preexisting AIS diagnosis were excluded. Twenty-three children were identified; 4 with unavailable ambulance records were excluded. Nineteen children were included in the study. Median age was 8 years (interquartile range, 3-14); median pediatric National Institutes of Health Stroke Scale score was 8 (interquartile range, 3-16). Emergency medical services call-taker diagnosis was stroke in 4 children (21%). Priority code 1 (lights and sirens) ambulances were dispatched for 13 children (68%). Paramedic diagnosis was stroke in 5 children (26%), hospital prenotification occurred in 8 children (42%), and 13 children (68%) were transported to primary stroke centers. Median prehospital timelines were: onset to emergency medical services contact, 13 minutes; call to scene, 12 minutes; time at scene, 14 minutes; transport time, 43 minutes; and total prehospital time, 71 minutes (interquartile range, 60-85). Emergency medical services call-taker and paramedic diagnostic sensitivity and prenotification rates are low in childhood AIS. © 2017 American Heart Association, Inc.
Location and size of flux ropes in Titan's ionosphere
NASA Astrophysics Data System (ADS)
Martin, C.; Arridge, C. S.; Badman, S. V.; Dieval, C.
2017-12-01
Cassini magnetometer data were surveyed during Titan flybys, yielding 73 instances of flux rope signatures. A force-free flux rope model was utilised to obtain the radii, maximum magnetic field and flux content of flux ropes that adhere to the force-free assumptions. We find that flux ropes at Titan are similar in size and flux content to the giant flux ropes identified at Venus, with a median radius of 280 km and an inter-quartile range of 270 km, a median maximum magnetic field of 8 nT with an inter-quartile range of 7 nT, and a median flux content of 76 Wb with a large inter-quartile range of 250 Wb. We additionally investigate the occurrence of flux ropes with respect to the sunlit hemisphere (zenith angle) and the ram side of Titan within Saturn's corotating magnetosphere (angle of attack of the incoming plasma flow). We find that flux ropes are more commonly detected in sunlit areas of Titan's ionosphere, as well as on the ram side of Titan. We see a statistically significant absence of flux ropes in all Saturn local time (SLT) sectors on the night side of Titan and the anti-ram side of Titan. We also comment on the physical mechanisms associated with the production of these flux ropes, with particular attention on the variability of Titan's environment in Saturn's magnetosphere.
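Force-free flux rope fits of this kind are commonly based on the constant-alpha Lundquist solution, in which the axial and azimuthal field components follow zeroth- and first-order Bessel functions of radial distance. Whether the survey above used exactly this form is an assumption of the sketch below, and the example numbers are only illustrative (the quoted medians of radius, peak field, and flux are independent statistics, so they need not be mutually consistent).

    # Sketch of a constant-alpha (Lundquist) force-free flux rope profile and
    # its axial magnetic flux; an illustration, not the authors' exact model.
    import numpy as np
    from scipy.special import j0, j1

    def lundquist_field(r_m, B0_T, alpha):
        """Axial and azimuthal field components at radial distance r (metres)."""
        return B0_T * j0(alpha * r_m), B0_T * j1(alpha * r_m)

    def axial_flux_wb(B0_nT, radius_km, alpha):
        """Flux through the rope cross-section, integrating the axial field."""
        r = np.linspace(0.0, radius_km * 1e3, 5000)
        dr = r[1] - r[0]
        b_axial, _ = lundquist_field(r, B0_nT * 1e-9, alpha)
        return float(np.sum(b_axial * 2.0 * np.pi * r) * dr)

    # Example: alpha chosen so the axial field first vanishes at the rope
    # boundary (first zero of J0 at ~2.405); values are illustrative only.
    radius_km = 280.0
    alpha = 2.405 / (radius_km * 1e3)
    print(axial_flux_wb(8.0, radius_km, alpha))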
Cheng, S; Teuffel, O; Ethier, M C; Diorio, C; Martino, J; Mayo, C; Regier, D; Wing, R; Alibhai, S M H; Sung, L
2011-01-01
Background: To describe (1) anticipated health-related quality of life during different strategies for febrile neutropaenia (FN) management and (2) attributes of those preferring inpatient management. Methods: Respondents were parents of children 0–18 years and children 12–18 years receiving cancer treatment. Anticipated health-related quality of life was elicited for four different FN management strategies: entire inpatient, early discharge, outpatient oral and outpatient intravenous (i.v.) therapy. Tools used to measure health-related quality of life were visual analogue scale (VAS), willingness to pay and time trade off. Results: A total of 155 parents and 43 children participated. For parents, median VAS scores were highest for early discharge (5.9, interquartile range 4.4–7.2) and outpatient i.v. (5.9, interquartile range 4.4–7.3). For children, median scores were highest for early discharge (6.1, interquartile range 4.6–7.2). In contrast, the most commonly preferred strategy for parents and children was inpatient in 55.0% and 37.2%, respectively. Higher current child health-related quality of life was associated with a stronger preference for outpatient management. Conclusion: Early discharge and outpatient i.v. management are associated with higher anticipated health-related quality of life, although the most commonly preferred strategy was inpatient care. This data may help with determining more cost-effective strategies for paediatric FN. PMID:21694729
Armstrong, David S.; Parker, Gene W.; Richards, Todd A.
2003-01-01
Streamflow characteristics and methods for determining streamflow requirements for habitat protection were investigated at 23 active index streamflow-gaging stations in southern New England. Fish communities sampled near index streamflow-gaging stations in Massachusetts have a high percentage of fish that require flowing-water habitats for some or all of their life cycle. The relatively unaltered flow condition at these sites was assumed to be one factor that has contributed to this condition. Monthly flow durations and low-flow statistics were determined for the index streamflow-gaging stations for a 25-year period from 1976 to 2000. Annual hydrographs were prepared for each index station from median streamflows at the 50-percent monthly flow duration, normalized by drainage area. A median monthly flow of 1 ft³/s/mi² was used to split hydrographs into a high-flow period (November–May) and a low-flow period (June–October). The hydrographs were used to classify index stations into groups with similar median monthly flow durations. Index stations were divided into four regional groups, roughly paralleling the coast, to characterize streamflows for November to May; and into two groups, on the basis of base-flow index and percentage of sand and gravel in the contributing area, for June to October. For the June to October period, for index stations with a high base-flow index and contributing areas greater than 20 percent sand and gravel, median streamflows at the 50-percent monthly flow duration, normalized by drainage area, were 0.57, 0.49, and 0.46 ft³/s/mi² for July, August, and September, respectively. For index stations with a low base-flow index and contributing areas less than 20 percent sand and gravel, median streamflows at the 50-percent monthly flow duration, normalized by drainage area, were 0.34, 0.28, and 0.27 ft³/s/mi² for July, August, and September, respectively. Streamflow variability between wet and dry years can be characterized by use of the interquartile range of median streamflows at selected monthly flow durations. For example, the median Q50 discharge for August had an interquartile range of 0.30 to 0.87 ft³/s/mi² for the high-flow group and 0.16 to 0.47 ft³/s/mi² for the low-flow group. Streamflow requirements for habitat protection were determined for 23 index stations by use of three methods based on hydrologic records: the Range of Variability Approach, the Tennant method, and the New England Aquatic-Base-Flow method. Normalized flow management targets determined by the Range of Variability Approach for July, August, and September ranged between 0.21 and 0.84 ft³/s/mi² for the low monthly flow duration group, and 0.37 and 1.27 ft³/s/mi² for the high monthly flow duration group. Median streamflow requirements for habitat protection during summer for the 23 index streamflow-gaging stations determined by the Tennant method, normalized by drainage area, were 0.81, 0.61, and 0.21 ft³/s/mi² for the Tennant 40-, 30-, and 10-percent of the mean annual flow methods, representing good, fair, and poor stream habitat conditions in summer, according to Tennant. New England Aquatic-Base-Flow streamflow requirements for habitat protection during summer were determined from the median of monthly mean flows for August for index streamflow-gaging stations having drainage areas greater than 50 mi². For five index streamflow-gaging stations in the low median monthly flow group, the average median monthly mean streamflow for August, normalized by drainage area, was 0.48 ft³/s/mi².
Streamflow requirements for habitat protection were determined for riffle habitats near 10 index stations by use of two methods based on hydraulic ratings, the Wetted-Perimeter and R2Cross methods. Hydraulic parameters required by these methods were simulated by calibrated HEC-RAS models. Wetted-Perimeter streamflow requirements for habitat protection, normalized by drainage area, ranged between 0.13 and 0.58 ft³/s/mi², and had a median value of 0.37 ft³/s/mi². Streamflow requirements determined by the R2Cross 3-of-3 criteria method ranged between 0.39 and 2.1 ft³/s/mi², and had a median of 0.84 ft³/s/mi². Streamflow requirements determined by the R2Cross 2-of-3 criteria method, normalized by drainage area, ranged between 0.16 and 0.85 ft³/s/mi² and had a median of 0.36 ft³/s/mi². Streamflow requirements determined by the different methods were evaluated by comparison to streamflow statistics from the index streamflow-gaging stations.
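The drainage-area normalization used throughout this analysis is a simple division of a monthly median flow by the basin area, with the spread across years summarized by the interquartile range. A small sketch of that computation for a daily discharge series is given below; the column name, gaging-station data, and drainage area are hypothetical placeholders, not values from the report.

    # Sketch: drainage-area-normalized monthly median flows (ft3/s/mi2) and
    # their interquartile range across years; inputs are hypothetical.
    import pandas as pd

    def monthly_median_per_sqmi(daily_cfs, drainage_area_mi2):
        """daily_cfs: daily mean discharge (ft3/s) as a Series indexed by date."""
        # 50-percent monthly flow duration approximated by the median daily
        # flow within each year-month, then normalized by drainage area.
        monthly = daily_cfs.groupby(
            [daily_cfs.index.year, daily_cfs.index.month]).median()
        monthly.index.names = ["year", "month"]
        per_area = monthly / drainage_area_mi2
        # Median and interquartile range across years, by calendar month.
        by_month = per_area.groupby(level="month")
        return pd.DataFrame({
            "median": by_month.median(),
            "q25": by_month.quantile(0.25),
            "q75": by_month.quantile(0.75),
        })

    # usage (hypothetical): monthly_median_per_sqmi(discharge, drainage_area_mi2=42.0)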
Tisè, Marco; Mazzarini, Laura; Fabrizzi, Giancarlo; Ferrante, Luigi; Giorgetti, Raffaele; Tagliabracci, Adriano
2011-05-01
Age estimation is mainly important for the assessment of criminal liability and the protection of unaccompanied minor immigrants whose age is unknown. Under Italian law, persons are not criminally responsible before they reach the age of 14. The age of 18 is important when deciding whether juvenile or adult law must be applied. In the case of unaccompanied minors, it is important to assess age in order to establish special protective measures, and correct age estimation may prevent a person over 18 from benefiting from measures reserved for minors. Since the Greulich and Pyle method (GPM) is one of the methods most frequently used in age estimation, the aim of this study was to assess the reproducibility and accuracy of the method on a large Italian sample of teenagers, to ascertain the applicability of the Atlas at the critical age thresholds of 14 and 18 years. This retrospective study examined posteroanterior X-ray projections of hand and wrist from 484 Italian-Caucasian young people (125 females, 359 males) between 11 and 19 years old. All radiographic images were taken from trauma patients hospitalized in the Azienda Ospedaliero Universitaria Ospedali Riuniti of Ancona (Italy) between 2006 and 2007. Two physicians analyzed all radiographic images separately, using a blind method. For an estimated age of 14 years, the true age ranged from 12.2 to 15.9 years (median, 14.3 years; interquartile range, 1.0 years) for males, and from 12.6 to 15.7 years (median, 14.2 years; interquartile range, 1.7 years) for females. For an estimated age of 18 years, the true age ranged from 15.6 to 19.7 years (median, 17.7 years; interquartile range, 1.4 years) for males, and from 16.2 to 20.0 years (median, 18.7 years; interquartile range, 1.8 years) for females. Our study shows that although the GPM is a reproducible and repeatable method, there is a wide margin of error in the estimation of chronological age, mainly at the critical estimated ages of 14 and 18 years in both males and females.
Vedel, Anne G; Holmgaard, Frederik; Rasmussen, Lars S; Langkilde, Annika; Paulson, Olaf B; Lange, Theis; Thomsen, Carsten; Olsen, Peter Skov; Ravn, Hanne Berg; Nilsson, Jens C
2018-04-24
Cerebral injury is an important complication after cardiac surgery with the use of cardiopulmonary bypass. The rate of overt stroke after cardiac surgery is 1% to 2%, whereas silent strokes, detected by diffusion-weighted magnetic resonance imaging, are found in up to 50% of patients. It is unclear whether a higher versus a lower blood pressure during cardiopulmonary bypass reduces cerebral infarction in these patients. In a patient- and assessor-blinded randomized trial, we allocated patients to a higher (70-80 mm Hg) or lower (40-50 mm Hg) target for mean arterial pressure by the titration of norepinephrine during cardiopulmonary bypass. Pump flow was fixed at 2.4 L·min⁻¹·m⁻². The primary outcome was the total volume of new ischemic cerebral lesions (summed in millimeters cubed), expressed as the difference between diffusion-weighted imaging conducted preoperatively and again postoperatively between days 3 and 6. Secondary outcomes included the diffusion-weighted imaging-evaluated total number of new ischemic lesions. Among the 197 enrolled patients, mean (SD) age was 65.0 (10.7) years in the low-target group (n=99) and 69.4 (8.9) years in the high-target group (n=98). Procedural risk scores were comparable between groups. Overall, diffusion-weighted imaging revealed new cerebral lesions in 52.8% of patients in the low-target group versus 55.7% in the high-target group (P=0.76). The primary outcome of volume of new cerebral lesions was comparable between groups: 25 mm³ (interquartile range, 0-118 mm³; range, 0-25 261 mm³) in the low-target group versus 29 mm³ (interquartile range, 0-143 mm³; range, 0-22 116 mm³) in the high-target group (median difference estimate, 0; 95% confidence interval, -25 to 0.028; P=0.99), as was the secondary outcome of number of new lesions (1 [interquartile range, 0-2; range, 0-24] versus 1 [interquartile range, 0-2; range, 0-29], respectively; median difference estimate, 0; 95% confidence interval, 0-0; P=0.71). No significant difference was observed in frequency of severe adverse events. Among patients undergoing on-pump cardiac surgery, targeting a higher versus a lower mean arterial pressure during cardiopulmonary bypass did not seem to affect the volume or number of new cerebral infarcts. URL: https://www.clinicaltrials.gov. Unique identifier: NCT02185885. © 2018 American Heart Association, Inc.
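The primary comparison above is a difference in medians reported with a confidence interval. The trial's exact estimator is not described here, but a generic percentile-bootstrap version of such an interval can be sketched as follows; the function name, settings, and example values are illustrative.

    # Percentile-bootstrap confidence interval for a difference in medians;
    # a generic illustration, not necessarily the trial's exact procedure.
    import numpy as np

    def median_diff_ci(a, b, n_boot=10_000, alpha=0.05, seed=0):
        rng = np.random.default_rng(seed)
        a = np.asarray(a, dtype=float)
        b = np.asarray(b, dtype=float)
        diffs = np.empty(n_boot)
        for i in range(n_boot):
            diffs[i] = (np.median(rng.choice(a, a.size, replace=True))
                        - np.median(rng.choice(b, b.size, replace=True)))
        point = np.median(a) - np.median(b)
        lo, hi = np.quantile(diffs, [alpha / 2, 1 - alpha / 2])
        return point, (float(lo), float(hi))

    # usage (hypothetical lesion volumes in mm^3 per group):
    # print(median_diff_ci([25, 0, 118, 40], [29, 0, 143, 60]))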
Psychoactive medication use in intermediate-care facility residents.
Beers, M; Avorn, J; Soumerai, S B; Everitt, D E; Sherman, D S; Salem, S
1988-11-25
Despite the large number of elderly patients in nursing homes and the intensity of medication use there, few current data are available on patterns of medication use in this setting. We studied all medication use among 850 residents of 12 representative intermediate-care facilities in Massachusetts. Data on all prescriptions and patterns of actual use were recorded for all patients during one month. On average, residents were prescribed 8.1 medications during the month (interquartile range, 7.4 to 8.8) and actually received 4.7 (range, 4.2 to 5.4) medications during this period. More than half of all residents were receiving a psychoactive medication, with 26% receiving antipsychotic medication. Twenty-eight percent of patients were receiving sedative/hypnotics during the study month, primarily on a scheduled rather than an as-needed basis. Of patients receiving a sedative/hypnotic, 26% (range, 14% to 41%) were taking diphenhydramine hydrochloride, a strongly anticholinergic hypnotic. Of those receiving one of the benzodiazepines, 30% were receiving long-acting drugs, generally not recommended for elderly patients. The typical benzodiazepine dose was equivalent to 7.3 mg per patient per day of diazepam. The most commonly used antidepressant was amitriptyline hydrochloride, the most sedating and anticholinergic antidepressant in common use. These data indicate that despite growing evidence of the risks of psychoactive drug use in elderly patients, the nursing home population studied was exposed to high levels of sedative/hypnotic and antipsychotic drug use. Suboptimal choice of medication within a given class was common, and use of standing vs as-needed orders was often not in keeping with current concepts in geriatric psychopharmacology. Additional research is needed to assess the impact of such drug therapy on cognitive and physical functioning, as well as to determine how best to improve patterns of medication use in this vulnerable population.
Lactation is associated with altered metabolomic signatures in women with gestational diabetes.
Much, Daniela; Beyerlein, Andreas; Kindt, Alida; Krumsiek, Jan; Stückler, Ferdinand; Rossbauer, Michaela; Hofelich, Anna; Wiesenäcker, David; Hivner, Susanne; Herbst, Melanie; Römisch-Margl, Werner; Prehn, Cornelia; Adamski, Jerzy; Kastenmüller, Gabi; Theis, Fabian; Ziegler, Anette-G; Hummel, Sandra
2016-10-01
Lactation for >3 months in women with gestational diabetes is associated with a reduced risk of type 2 diabetes that persists for up to 15 years postpartum. However, the underlying mechanisms are unknown. We examined whether in women with gestational diabetes lactation for >3 months is associated with altered metabolomic signatures postpartum. We enrolled 197 women with gestational diabetes at a median of 3.6 years (interquartile range 0.7-6.5 years) after delivery. Targeted metabolomics profiles (including 156 metabolites) were obtained during a glucose challenge test. Comparisons of metabolite concentrations and ratios between women who lactated for >3 months and women who lactated for ≤3 months or not at all were performed using linear regression with adjustment for age and BMI at the postpartum visit, time since delivery, and maternal education level, and correction for multiple testing. Gaussian graphical modelling was used to generate metabolite networks. Lactation for >3 months was associated with a higher total lysophosphatidylcholine/total phosphatidylcholine ratio; in women with short-term follow-up, it was also associated with lower leucine concentrations and a lower total branched-chain amino acid concentration. Gaussian graphical modelling identified subgroups of closely linked metabolites within phosphatidylcholines and branched-chain amino acids that were affected by lactation for >3 months and have been linked to the pathophysiology of type 2 diabetes in previous studies. Lactation for >3 months in women with gestational diabetes is associated with changes in the metabolomics profile that have been linked to the early pathogenesis of type 2 diabetes.
Comparison of oncology drug approval between Health Canada and the US Food and Drug Administration.
Ezeife, Doreen A; Truong, Tony H; Heng, Daniel Y C; Bourque, Sylvie; Welch, Stephen A; Tang, Patricia A
2015-05-15
The drug approval timeline is a lengthy process that often varies between countries. The objective of this study was to delineate the Canadian drug approval timeline for oncology drugs and to compare the time to drug approval between Health Canada (HC) and the US Food and Drug Administration (FDA). In total, 54 antineoplastic drugs that were approved by the FDA between 1989 and 2012 were reviewed. For each drug, the following milestones were determined: the dates of submission and approval for both the FDA and HC and the dates of availability on provincial drug formularies in Canadian provinces and territories. The time intervals between the aforementioned milestones were calculated. Of 54 FDA-approved drugs, 49 drugs were approved by HC at the time of the current study. The median time from submission to approval was 9 months (interquartile range [IQR], 6-14.5 months) for the FDA and 12 months (IQR, 10-21.1 months) for HC (P < .0006). The time from HC approval to the placement of a drug on a provincial drug formulary was a median of 16.7 months (IQR, 5.9-27.2 months), and there was no interprovincial variability among the 5 Canadian provinces that were analyzed (P = .5). The time from HC submission to HC approval takes 3 months longer than the same time interval for the FDA. To the authors' knowledge, this is the first documentation of the time required to bring an oncology drug from HC submission to placement on a provincial drug formulary. © 2015 American Cancer Society.
Standard duplex criteria overestimate the degree of stenosis after eversion carotid endarterectomy.
Benzing, Travis; Wilhoit, Cameron; Wright, Sharee; McCann, P Aaron; Lessner, Susan; Brothers, Thomas E
2015-06-01
The eversion technique for carotid endarterectomy (eCEA) offers an alternative to longitudinal arteriotomy and patch closure (pCEA) for open carotid revascularization. In some reports, eCEA has been associated with a higher rate of >50% restenosis of the internal carotid when restenosis is defined as peak systolic velocity (PSV) >125 cm/s by duplex imaging. Because the conformation of the carotid bifurcation may differ after eCEA compared with native carotid arteries, it was hypothesized that standard duplex criteria might not accurately reflect the presence of restenosis after eCEA. In a case-control study, the outcomes of all patients undergoing carotid endarterectomy by one surgeon during the last 10 years were analyzed retrospectively, with a primary end point of PSV >125 cm/s. Duplex flow velocities were compared with luminal diameter measurements for any carotid computed tomography arteriography or magnetic resonance angiography study obtained within 2 months of duplex imaging, with the degree of stenosis calculated by the methodology used in the North American Symptomatic Carotid Endarterectomy Trial (NASCET) and the European Carotid Surgery Trial (ECST) as well as cross-sectional area (CSA) reduction. Computational model simulations of the eCEA and pCEA arteries were generated and analyzed. Eversion and longitudinal arteriotomy with patch techniques were used in 118 and 177 carotid arteries, respectively. Duplex follow-up was available in 90 eCEA arteries at a median of 16 (range, 2-136) months and in 150 pCEA arteries at a median of 41 (range, 3-115) months postoperatively. PSV >125 cm/s was present at some time during follow-up in 31% each of eCEA and pCEA carotid arteries, and in the most recent duplex examination in 7% after eCEA and 21% after pCEA (P = .003), with no eCEA and two pCEA arteries occluding completely during follow-up (P = .29). In 19 carotid arteries with PSV >125 cm/s after angle correction (median, 160 cm/s; interquartile range, 146-432 cm/s) after eCEA that were subsequently examined by axial imaging, the mean percentage stenosis was 8% ± 11% by NASCET, 11% ± 5% by ECST, and 20% ± 9% by CSA criteria. For eight pCEA arteries with PSV >125 cm/s (median velocity, 148 cm/s; interquartile range, 139-242 cm/s), the corresponding NASCET, ECST, and CSA stenoses were 8% ± 35%, 26% ± 32%, and 25% ± 33%, respectively. NASCET internal carotid diameter reduction of at least 50% was noted by axial imaging after two of the eight pCEAs, and the PSV exceeded 200 cm/s in each case. The presence of hemodynamically significant carotid artery restenosis may be overestimated by standard duplex criteria after eCEA and perhaps after pCEA. Insufficient information currently exists to determine what PSV corresponds to hemodynamically significant restenosis. Published by Elsevier Inc.
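The three stenosis metrics quoted above are all simple diameter ratios: NASCET compares the residual lumen with the normal distal internal carotid, ECST with the estimated original diameter at the level of the stenosis, and the CSA reduction squares the diameter ratio under a circular-lumen assumption. A short sketch of these standard formulas follows; the circular-lumen assumption and the example measurements are assumptions of the illustration, not values from the study.

    def nascet_pct(residual_lumen_mm, distal_ica_mm):
        """NASCET: narrowing relative to the normal distal internal carotid."""
        return 100.0 * (1.0 - residual_lumen_mm / distal_ica_mm)

    def ecst_pct(residual_lumen_mm, estimated_normal_mm):
        """ECST: narrowing relative to the estimated original (bulb) diameter."""
        return 100.0 * (1.0 - residual_lumen_mm / estimated_normal_mm)

    def csa_reduction_pct(residual_lumen_mm, reference_mm):
        """Cross-sectional area reduction, assuming a circular lumen."""
        return 100.0 * (1.0 - (residual_lumen_mm / reference_mm) ** 2)

    # e.g. a 3 mm residual lumen against a 6 mm reference diameter:
    # nascet_pct(3, 6) -> 50.0, csa_reduction_pct(3, 6) -> 75.0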
Presentation to publication: proportion of abstracts published for ESPR, SPR and IPR.
Shelmerdine, Susan C; Lynch, Jeremy O; Langan, Dean; Arthurs, Owen J
2016-09-01
Advancement of knowledge requires presentation and publication of high-quality scientific research. Studies submitted for presentation undergo initial peer review before acceptance, and the rate of subsequent publication may be taken as an indicator of access to publication for pediatric radiology studies. The aim of this study was to evaluate the proportion of abstracts from pediatric radiology conferences that were also published in journals and to identify factors associated with publication success. All Medline articles that originated from oral presentations at the European Society for Paediatric Radiology (ESPR), the Society for Pediatric Radiology (SPR) or the International Pediatric Radiology (IPR) conferences from 2010 to 2012 were evaluated. Descriptive statistics to evaluate published and unpublished groups were calculated overall and split by characteristics of the abstracts such as number of authors. The overall number of abstracts published was 300/715 (41.9%), with most articles published in radiology-specific journals (181/300; 60.3%), with median impact factor 2.31 (interquartile range [IQR]: 1.65-3.14, range: 0-18.03). Those published after the conference (262/300, 87.6%) had a median time to publication of 18 months, and for those published before, the median time was -11 months. Median sample size in published articles was 52 (IQR: 33-105, range: 1-6,351). Of pediatric radiology oral abstracts, 41.9% achieved publication after a period of at least 3 years from presentation. Studies originating from certain countries and on certain subspecialty topics were more likely to get published.
Infant head circumference growth is saltatory and coupled to length growth.
Lampl, Michelle; Johnson, Michael L
2011-05-01
Rapid growth rates of head circumference and body size during infancy have been reported to predict developmental pathologies that emerge during childhood. This study investigated whether growth in head circumference was concordant with growth in body length. Forty infants (16 males) were followed between the ages of 2 days and 21 months for durations ranging from 4 to 21 months (2616 measurements). Longitudinal anthropometric measurements were assessed weekly (n=12), semi-weekly (n=24) and daily (n=4) during home visits. Individual head circumference growth was investigated for the presence of saltatory patterns. Coincident analysis tested the null hypothesis that head growth was randomly coupled to length growth. Head circumference growth during infancy is saltatory (p<0.05), characterized by median increments of 0.20 cm (95% confidence interval, 0.10-0.30 cm) in 24-h, separated by intervals of no growth ranging from 1 to 21 days. Daily assessments identified that head growth saltations were coupled to length growth saltations within a median time frame of 2 days (interquartile 0-4, range 1-8 days). Assessed at semi-weekly and weekly intervals, an average 82% (SD 0.13) of head growth saltations was non-randomly concordant with length growth (p≤0.006). Normal infant head circumference grows by intermittent, episodic saltations that are temporally coupled to growth in total body length by a process of integrated physiology that remains to be described. Copyright © 2011 Elsevier Ltd. All rights reserved.
Kong, Angela; Beresford, Shirley A.A.; Alfano, Catherine M.; Foster-Schubert, Karen E.; Neuhouser, Marian L.; Johnson, Donna B.; Duggan, Catherine; Wang, Ching-Yun; Xiao, Liren; Jeffery, Robert W.; Bain, Carolyn E.; McTiernan, Anne
2012-01-01
Lifestyle-based interventions, which typically promote various behavioral modification strategies, can serve as a setting for evaluating specific behaviors and strategies thought to promote or hinder weight loss. The aim of this study was to test the associations of self-monitoring (self-weighing, food journal completion) and eating-related (dietary intake, diet-related weight-control strategies, and meal patterns) behaviors with weight loss in a sample of postmenopausal overweight-to-obese women enrolled in a 12-month dietary weight loss intervention. Changes in body weight and adoption of self-monitoring and eating-related behaviors were assessed in 123 participants. Generalized linear models tested associations of these behaviors with 12-month weight change after adjusting for potential confounders. Mean percent weight loss was 10.7%. In the final model, completing more food journals was associated with a greater % weight loss (interquartile range, 3.7% greater weight loss; p<0.0001) while skipping meals (4.3% lower weight loss; p<0.05) and eating out for lunch (at least once a week, 2.5% lower weight loss; p<0.01) were associated with a lower amount of weight loss. These findings suggest that a greater focus on dietary self-monitoring, home-prepared meals, and consuming meals at regular intervals may improve 12-month weight loss among postmenopausal women enrolled in a dietary weight loss intervention. PMID:22795495
Frisk, Gabriella; Tinge, Beatrice; Ekberg, Sara; Eloranta, Sandra; Bäcklund, L Magnus; Lidbrink, Elisabet; Smedby, Karin E
2017-12-01
The benefit of whole brain radiotherapy (WBRT) for late-stage breast cancer patients with brain metastases has been questioned. In this study we evaluated survival and level of care (hospital or home) following WBRT in a population-based cohort by personal and tumor characteristics. We identified 241 consecutive patients with breast cancer and brain metastases receiving WBRT in Stockholm, Sweden, 1999-2012. Through review of medical records, we collected data on prognostic determinants including level of care before and after WBRT. Survival was estimated using Cox regression, and odds ratios (OR) of not coming home using logistic regression. Median age at WBRT was 58 years (range 30-88 years). Most patients (n = 212, 88%) were treated with 4 Gray × 5. Median survival following WBRT was 2.9 months (interquartile range 1.1-6.6 months), and 57 patients (24%) were never discharged from hospital. Poor performance status and triple-negative tumors were associated with short survival (WHO 3-4 median survival 0.9 months, HR = 5.96 (3.88-9.17) versus WHO 0-1; triple-negative tumors median survival 2.0 months, HR = 1.87 (1.23-2.84) versus Luminal A). Poor performance status and being hospitalized before WBRT were associated with increased ORs of not coming home whereas cohabitation with children at home was protective. Survival was short following WBRT, and one in four breast cancer patients with brain metastases could never be discharged from hospital. When deciding about WBRT, WHO score, level of care before WBRT, and the patient's choice of level of care in the end-of-life period should be considered.
Luke, Jason J; Lemons, Jeffrey M; Karrison, Theodore G; Pitroda, Sean P; Melotek, James M; Zha, Yuanyuan; Al-Hallaq, Hania A; Arina, Ainhoa; Khodarev, Nikolai N; Janisch, Linda; Chang, Paul; Patel, Jyoti D; Fleming, Gini F; Moroney, John; Sharma, Manish R; White, Julia R; Ratain, Mark J; Gajewski, Thomas F; Weichselbaum, Ralph R; Chmura, Steven J
2018-02-13
Purpose Stereotactic body radiotherapy (SBRT) may stimulate innate and adaptive immunity to augment immunotherapy response. Multisite SBRT is an emerging paradigm for treating metastatic disease. Anti-PD-1-treatment outcomes may be improved with lower disease burden. In this context, we conducted a phase I study to evaluate the safety of pembrolizumab with multisite SBRT in patients with metastatic solid tumors. Patients and Methods Patients progressing on standard treatment received SBRT to two to four metastases. Not all metastases were targeted, and metastases > 65 mL were partially irradiated. SBRT dosing varied by site and ranged from 30 to 50 Gy in three to five fractions with predefined dose de-escalation if excess dose-limiting toxicities were observed. Pembrolizumab was initiated within 7 days after completion of SBRT. Pre- and post-SBRT biopsy specimens were analyzed in a subset of patients to quantify interferon-γ-induced gene expression. Results A total of 79 patients were enrolled; three patients did not receive any treatment and three patients only received SBRT. Patients included in the analysis were treated with SBRT and at least one cycle of pembrolizumab. Most (94.5%) of patients received SBRT to two metastases. Median follow-up for toxicity was 5.5 months (interquartile range, 3.3 to 8.1 months). Six patients experienced dose-limiting toxicities with no radiation dose reductions. In the 68 patients with imaging follow-up, the overall objective response rate was 13.2%. Median overall survival was 9.6 months (95% CI, 6.5 months to undetermined) and median progression-free survival was 3.1 months (95% CI, 2.9 to 3.4 months). Expression of interferon-γ-associated genes from post-SBRT tumor biopsy specimens significantly correlated with nonirradiated tumor response. Conclusion Multisite SBRT followed by pembrolizumab was well tolerated with acceptable toxicity. Additional studies exploring the clinical benefit and predictive biomarkers of combined multisite SBRT and PD-1-directed immunotherapy are warranted.
Pasupathy, Sivabaskari; Tavella, Rosanna; Grover, Suchi; Raman, Betty; Procter, Nathan E K; Du, Yang Timothy; Mahadavan, Gnanadevan; Stafford, Irene; Heresztyn, Tamila; Holmes, Andrew; Zeitz, Christopher; Arstall, Margaret; Selvanayagam, Joseph; Horowitz, John D; Beltrame, John F
2017-09-05
Contemporary ST-segment-elevation myocardial infarction management involves primary percutaneous coronary intervention, with ongoing studies focusing on infarct size reduction using ancillary therapies. N-acetylcysteine (NAC) is an antioxidant with reactive oxygen species scavenging properties that also potentiates the effects of nitroglycerin and thus represents a potentially beneficial ancillary therapy in primary percutaneous coronary intervention. The NACIAM trial (N-acetylcysteine in Acute Myocardial Infarction) examined the effects of NAC on infarct size in patients with ST-segment-elevation myocardial infarction undergoing percutaneous coronary intervention. This randomized, double-blind, placebo-controlled, multicenter study evaluated the effects of intravenous high-dose NAC (29 g over 2 days) with background low-dose nitroglycerin (7.2 mg over 2 days) on early cardiac magnetic resonance imaging-assessed infarct size. Secondary end points included cardiac magnetic resonance-determined myocardial salvage and creatine kinase kinetics. Of 112 randomized patients with ST-segment-elevation myocardial infarction, 75 (37 in NAC group, 38 in placebo group) underwent early cardiac magnetic resonance imaging. Median duration of ischemia pretreatment was 2.4 hours. With background nitroglycerin infusion administered to all patients, those randomized to NAC exhibited an absolute 5.5% reduction in cardiac magnetic resonance-assessed infarct size relative to placebo (median, 11.0%; [interquartile range 4.1, 16.3] versus 16.5%; [interquartile range 10.7, 24.2]; P =0.02). Myocardial salvage was approximately doubled in the NAC group (60%; interquartile range, 37-79) compared with placebo (27%; interquartile range, 14-42; P <0.01) and median creatine kinase areas under the curve were 22 000 and 38 000 IU·h in the NAC and placebo groups, respectively ( P =0.08). High-dose intravenous NAC administered with low-dose intravenous nitroglycerin is associated with reduced infarct size in patients with ST-segment-elevation myocardial infarction undergoing percutaneous coronary intervention. A larger study is required to assess the impact of this therapy on clinical cardiac outcomes. Australian New Zealand Clinical Trials Registry. URL: http://www.anzctr.org.au/. Unique identifier: 12610000280000. © 2017 American Heart Association, Inc.
Balaji, Seshadri; Daga, Ankana; Bradley, David J; Etheridge, Susan P; Law, Ian H; Batra, Anjan S; Sanatani, Shubayan; Singh, Anoop K; Gajewski, Kelly K; Tsao, Sabrina; Singh, Harinder R; Tisma-Dupanovic, Svjetlana; Tateno, Shigeru; Takamuro, Motoki; Nakajima, Hiromichi; Roos-Hesselink, Jolien W; Shah, Maully
2014-08-01
The study objective was to determine whether the extracardiac conduit Fontan confers an arrhythmia advantage over the intracardiac lateral tunnel Fontan. This multicenter study of 1271 patients compared bradyarrhythmia (defined as need for pacing) and tachyarrhythmia (defined as needing antiarrhythmic therapy) between 602 patients undergoing the intracardiac Fontan and 669 patients undergoing the extracardiac Fontan. The median age at the time of the Fontan procedure was 2.1 years (interquartile range, 1.6-3.2 years) for the intracardiac group and 3.0 years (interquartile range, 2.4-3.9) for the extracardiac group (P < .0001). The median follow-up was 9.2 years (interquartile range, 5-12.8) for the intracardiac group and 4.7 years (interquartile range, 2.8-7.7) for the extracardiac group (P < .0001). Early postoperative (<30 days) bradyarrhythmia occurred in 24 patients (4%) in the intracardiac group and 73 patients (11%) in the extracardiac group (P < .0001). Early postoperative (<30 days) tachyarrhythmia occurred in 32 patients (5%) in the intracardiac group and 53 patients (8%) in the extracardiac group (P = not significant). Late (>30 days) bradyarrhythmia occurred in 105 patients (18%) in the intracardiac group and 63 patients (9%) in the extracardiac group (P < .0001). Late (>30 days) tachyarrhythmia occurred in 58 patients (10%) in the intracardiac group and 23 patients (3%) in the extracardiac group (P < .0001). By multivariate analysis factoring time since surgery, more patients in the extracardiac group had early bradycardia (odds ratio, 2.9; 95% confidence interval, 1.8-4.6), with no difference in early tachycardia, late bradycardia, or late tachycardia. Overall arrhythmia burden is similar between the 2 groups, but the extracardiac Fontan group had a higher incidence of early bradyarrhythmias. There was no difference in the incidence of late tachyarrhythmias over time between the 2 operations. Therefore, the type of Fontan performed should be based on factors other than an anticipated reduction in arrhythmia burden from the extracardiac conduit. Copyright © 2014 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
Acute heart failure: perspectives from a randomized trial and a simultaneous registry.
Ezekowitz, Justin A; Hu, Jia; Delgado, Diego; Hernandez, Adrian F; Kaul, Padma; Leader, Rolland; Proulx, Guy; Virani, Sean; White, Michel; Zieroth, Shelley; O'Connor, Christopher; Westerhout, Cynthia M; Armstrong, Paul W
2012-11-01
Randomized controlled trials (RCT) are limited in their generalizability to the broader nontrial population. To provide context for the Acute Study of Nesiritide in Decompensated Heart Failure (ASCEND-HF) trial, we designed a complementary registry to characterize clinical characteristics, practice patterns, and in-hospital outcomes of acute heart failure patients. Eligible patients for the registry included those with a principal diagnosis of acute heart failure (ICD-9-CM 402 and 428; ICD-10 I50.x, I11.0, I13.0, I13.2) from 8 sites participating in ASCEND-HF (n=697 patients, 2007-2010). Baseline characteristics, treatments, and hospital outcomes from the registry were compared with ASCEND-HF RCT patients from 31 Canadian sites (n=465, 2007-2010). Patients in the registry were older, more likely to be female and to have chronic respiratory disease, and less likely to have diabetes mellitus; they had a similar incidence of ischemic HF and atrial fibrillation and similar B-type natriuretic peptide levels. Registry patients had higher systolic blood pressure (registry: median 132 mm Hg [interquartile range 115-151 mm Hg]; RCT: median 120 mm Hg [interquartile range 110-135 mm Hg]) and ejection fraction (registry: median 40% [interquartile range 27-58%]; RCT: median 29% [interquartile range 20-40%]) than RCT patients. Registry patients presented more often via ambulance and had a similar total length of stay as RCT patients. In-hospital mortality was significantly higher in the registry compared with the RCT patients (9.3% versus 1.3%, P<0.001), and this remained after multivariable adjustment (odds ratio 6.6, 95% CI 2.6-16.8, P<0.001). Patients enrolled in a large RCT of acute heart failure differed significantly based on clinical characteristics, treatments, and inpatient outcomes from contemporaneous patients participating in a registry. These results highlight the need to place RCTs in context in order to evaluate the generalizability of their results, and especially the need to improve clinical outcomes in acute heart failure. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00475852.
Nishisaki, Akira; Pines, Jesse M; Lin, Richard; Helfaer, Mark A; Berg, Robert A; Tenhave, Thomas; Nadkarni, Vinay M
2012-07-01
Attending physicians are only required to provide in-hospital coverage during daytime hours in many pediatric intensive care units. An in-hospital 24-hr pediatric intensive care unit attending coverage model has become increasingly popular, but the impact of 24-hr in-hospital attending coverage on care processes and outcomes has not been reported. We compared processes of care and outcomes before and after the implementation of a 24-hr in-hospital pediatric intensive care unit attending physician model. Retrospective comparison of before and after cohorts. A single large, academic tertiary medical/surgical pediatric intensive care unit. Pediatric intensive care unit admissions in 2000-2006. Transition from a 12-hr to a 24-hr in-hospital pediatric critical care attending physician coverage model in January 2004. A total of 18,702 patients were admitted to the intensive care unit: 8,520 during the 24-hr coverage period and 10,182 during the 12-hr coverage period. Duration of mechanical ventilation was lower (median 33 hrs [interquartile range 12-88] vs. 48 hrs [interquartile range 16-133], adjusted reduction of 35% [95% confidence interval 25%-44%], p < .001) and intensive care unit length of stay was shorter (median 2 days [interquartile range 1-4] vs. 2 days [interquartile range 1-5], adjusted p < .001) for 24-hr vs. 12-hr coverage. The reduction in mechanical ventilation hours was similar when noninvasive mechanical ventilation was included in ventilation hours (median 42 hrs vs. 56 hrs, adjusted reduction in ventilation hours: 33% [95% confidence interval 20-45], p < .001). Intensive care unit mortality was not significantly different (2.2% vs. 2.5%, adjusted p = .23). These associations were consistent across daytime and nighttime admissions, weekend and weekday admissions, and among subgroups with higher Pediatric Risk of Mortality III scores, postsurgical patients, and histories of previous intensive care unit admission. Implementation of 24-hr in-hospital pediatric critical care attending coverage was associated with shorter duration of mechanical ventilation and shorter length of intensive care unit stay. After accounting for potential confounders, this finding was consistent across a broad spectrum of critically ill children.
Vitamin D and parathyroid hormone status in a representative population living in Macau, China.
Ke, L; Mason, R S; Mpofu, E; Dibley, M; Li, Y; Brock, K E
2015-04-01
Associations between documented sun-exposure, exercise patterns and fish and supplement intake and 25-hydroxyvitamin D (25OHD) and parathyroid hormone (PTH) were investigated in a random household survey of Macau residents (aged 18-93). Blood samples (566) taken in summer were analyzed for 25OHD and PTH. In this Chinese population, 55% were deficient (25OHD <50nmol/L: median (interquartile range)=47.7 (24.2) nmol/L). Vitamin D deficiency was greatest in those aged <50 years: median (interquartile range)=43.3 (18.2) nmol/L, females: median (interquartile range)=45.5 (19.4) nmol/L and those with higher educational qualifications: median (interquartile range)=43.1 (18.7) nmol/L. In the total Macau population, statistically significant (p<0.01) modifiable associations with lower 25OHD levels were sunlight exposure (β=0.06), physical activity (PA) (measured as hours(hrs)/day: β=0.08), sitting (measured as hrs/day β=-0.20), intake of fish (β=0.08) and calcium (Ca) supplement intake (β=0.06) [linear regression analysis adjusting for demographic risk factors]. On similar analysis, and after adjustment for 25OHD, the only significant modifiable associations in the total population with PTH were sitting (β=-0.17), Body Mass Index (β=0.07) and Ca supplement intake (β=-0.06). In this Macau population less documented sun exposure, fish and Ca supplement intake and exercise were associated with lower 25OHD levels, especially in the younger population, along with the interesting finding that more sitting was associated with both lower 25OHD and high PTH blood levels. In conclusion, unlike findings from Caucasian populations, younger participants were significantly more vitamin D deficient, in particular highly educated single females. This may indicate the desire of young females to be pale and avoid the sun. There are also big differences in lifestyle between the older generation and the younger, in particular with respect to sun exposure and PA. This article is part of a Special Issue entitled '17th Vitamin D Workshop'. Copyright © 2015 Elsevier Ltd. All rights reserved.
Adolescent knowledge and attitudes related to clinical trials.
Brown, Devin L; Cowdery, Joan E; Jones, Toni Stokes; Langford, Aisha; Gammage, Catherine; Jacobs, Teresa L
2015-06-01
Poor enrollment plagues most clinical trials. Furthermore, despite mandates to improve minority representation in clinical trial participation, little progress has been made. We investigated the knowledge and attitudes of adolescents related to clinical trials and made race/ethnicity comparisons in an attempt to identify a possible educational intervention target. Students aged 13-18 years in southeast Michigan were offered participation through a class at one high school or two academic summer enrichment programs that drew from multiple high schools (73% response). Questionnaires previously validated in adults were administered. Non-Hispanic whites were compared with minorities using Wilcoxon rank-sum tests. Of the 82 respondents, the median age was 16 years (interquartile range: 15-17 years); 22 (28%) were white, 41 (51%) were African American, 11 (14%) were multiracial, 2 (2%) were American Indian or Alaska Native, 1 (1%) was Asian, 3 (4%) were Native Hawaiian or other Pacific Islander, and 2 respondents did not report a race (but did report Hispanic ethnicity). Nine (12%) were Hispanic. Only 27 (33%) had ever heard of a clinical trial. On a scale from 1 (most receptive) to 5 (least receptive) for learning more about a clinical trial for a relevant medical condition, the median score was 2 (interquartile range: 1-3) and for participating in a clinical trial for a relevant medical condition was 2 (interquartile range: 2-3). Overall knowledge was poor, with a median of 46% (interquartile range: 23%-62%) of knowledge answers correct. Knowledge was reduced (p = 0.0006) and attitudes were more negative (p = 0.05) in minorities than non-Hispanic whites, while minorities also endorsed more substantial barriers to trial participation (p = 0.0002). Distrust was similar between minority students and non-Hispanic whites (p = 0.15), and self-efficacy was greater in non-Hispanic whites (p = 0.05). Educational interventions directed toward adolescents that address knowledge, attitudes, and distrust in order to improve clinical trial awareness and receptivity overall are needed and may represent a tool to address disparities in minority enrollment in clinical trials. © The Author(s) 2015.
Cranston, Ross D; Baker, Jonathan R; Siegel, Aaron; Brand, Rhonda M; Janocko, Laura; McGowan, Ian
2018-03-01
Imiquimod can be used to treat internal anal high-grade squamous intraepithelial lesions. In HIV-1-infected individuals there is a theoretical concern for increased HIV replication in anorectal tissue secondary to imiquimod-induced mucosal inflammation. The purpose of this study was to assess local virologic, immunologic, and pathologic effects of imiquimod treatment in HIV-infected individuals. This was a pilot study at a single academic center. The study was conducted at the University of Pittsburgh Anal Dysplasia Clinic. HIV-1-infected individuals with biopsy-confirmed internal anal high-grade squamous intraepithelial lesions were included. Imiquimod cream was prescribed for intra-anal use 3 times per week for 9 weeks. Anal human papillomavirus typing, anal and rectal tissue HIV-1 RNA and DNA quantification, cytokine gene expression, and anal histology were measured. Nine evaluable participants (1 participant was lost to follow-up) were all white men with a median age of 46 years (interquartile range = 12 y) and a median CD4 T-cell count of 480 cells per cubic millimeter (interquartile range = 835). All were taking antiretroviral therapy, and 7 of 9 had HIV-1 RNA <50 copies per milliliter. The median dose of imiquimod used was 27.0 (interquartile range = 3.5), and there was a median of 11 days (interquartile range = 10 d) from last dose to assessment. There was no progression to cancer, no significant change in the number of human papillomavirus types detected, and no significant change in quantifiable cytokines/HIV-1 RNA or DNA levels in anal or rectal tissue. Seven (35%) of 20 high-grade lesions resolved to low-grade squamous intraepithelial lesions. The study was limited by the small number of participants and variable time to final assessment. Intra-anal imiquimod showed no evidence of immune activation or increase in HIV-1 viral replication in anal and rectal tissue and confirmed efficacy for intra-anal high-grade squamous intraepithelial lesion treatment morbidity. See Video Abstract at http://links.lww.com/DCR/A498.
Lorenz, Georg; Steubl, Dominik; Kemmner, Stephan; Pasch, Andreas; Koch-Sembdner, Wilhelm; Pham, Dang; Haller, Bernhard; Bachmann, Quirin; Mayer, Christopher C; Wassertheurer, Siegfried; Angermann, Susanne; Lech, Maciej; Moog, Philipp; Bauer, Axel; Heemann, Uwe; Schmaderer, Christoph
2017-10-17
A novel in-vitro test (T50 test) assesses ex-vivo serum calcification propensity, which predicts mortality in HD patients. The association of longitudinal changes of T50 with all-cause and cardiovascular mortality has not been investigated. We assessed T50 in paired sera collected at baseline and at 24 months in 188 prevalent European HD patients from the ISAR cohort, most of whom were Caucasians. Patients were followed for another 19 [interquartile range: 11-37] months. Serum T50 exhibited a significant decline between baseline and 24 months (246 ± 64 to 190 ± 68 minutes; p < 0.001), with serum Δ-phosphate showing the strongest independent association with declining T50 (r = -0.39; p < 0.001) in multivariable linear regression. The rate of decline of T50 over 24 months was a significant predictor of all-cause (HR = 1.51 per 1 SD decline, 95% CI: 1.04 to 2.2; p = 0.03) and cardiovascular mortality (HR = 2.15; 95% CI: 1.15 to 3.97; p = 0.02) in Kaplan-Meier and multivariable Cox-regression analysis, whereas cross-sectional T50 at inclusion and at 24 months were not. Worsening serum calcification propensity was an independent predictor of mortality in this small cohort of prevalent HD patients. Prospective, larger-scale studies are needed to assess the value of calcification propensity as a longitudinal parameter for risk stratification and monitoring of therapeutic interventions.
Moyo, Faith; Chasela, Charles; Brennan, Alana T; Ebrahim, Osman; Sanne, Ian M; Long, Lawrence; Evans, Denise
2016-01-01
Despite the widely documented success of antiretroviral therapy (ART), stakeholders continue to face the challenges of poor HIV treatment outcomes. While many studies have investigated patient-level causes of poor treatment outcomes, data on the effect of health systems on ART outcomes are scarce. We compare treatment outcomes among patients receiving HIV care and treatment at a public and private HIV clinic in Johannesburg, South Africa. This was a retrospective cohort analysis of ART naïve adults (≥18.0 years), initiating ART at a public or private clinic in Johannesburg between July 01, 2007 and December 31, 2012. Cox proportional-hazards regression was used to identify baseline predictors of mortality and loss to follow-up (>3 months late for the last scheduled visit). Generalized estimating equations were used to determine predictors of failure to suppress viral load (≥400 copies/mL) while the Wilcoxon rank-sum test was used to compare the median absolute change in CD4 count from baseline to 12 months post-ART initiation. 12,865 patients initiated ART at the public clinic compared to 610 at the private clinic. The patients were similar in terms of sex and age at initiation. Compared to public clinic patients, private clinic patients initiated ART at higher median CD4 counts (159 vs 113 cells/mm(3)) and World Health Organization stage I/II (76.1% vs 58.5%). Adjusted hazard models showed that compared to public clinic patients, private clinic patients were less likely to die (adjusted hazard ratio [aHR] 0.50; 95% confidence interval [CI] 0.35-0.70) but were at increased risk of loss to follow-up (aHR 1.80; 95% CI 1.59-2.03). By 12 months post-ART initiation, private clinic patients were less likely to have a detectable viral load (adjusted relative risk 0.65; 95% CI 0.49-0.88) and recorded higher median CD4 change from baseline (184 cells/mm(3) interquartile range 101-300 vs 158 cells/mm(3) interquartile range 91-244), when compared to public clinic patients. We identified differences in treatment outcomes between the two HIV clinics. Findings suggest that the type of clinic at which ART patients initiate and receive treatment can have an impact on treatment outcomes. Further research is necessary to provide more conclusive results.
Turaga, Kiran K.; Beasley, Georgia M.; Kane, John M.; Delman, Keith A.; Grobmyer, Stephen R.; Gonzalez, Ricardo J.; Letson, G. Douglas; Cheong, David; Tyler, Douglas S.; Zager, Jonathan S.
2015-01-01
Objective To demonstrate the efficacy of isolated limb infusion (ILI) in limb preservation for patients with locally advanced soft-tissue sarcomas and nonmelanoma cutaneous malignant neoplasms. Background Locally advanced nonmelanoma cutaneous and soft-tissue malignant neoplasms, including soft-tissue sarcomas of the extremities, can pose significant treatment challenges. We report our experience, including responses and limb preservation rates, using ILI in cutaneous and soft-tissue malignant neoplasms. Methods We identified 22 patients with cutaneous and soft-tissue malignant neoplasms who underwent 26 ILIs with melphalan and actinomycin from January 1, 2004, through December 31, 2009, from 5 institutions. Outcome measures included limb preservation and in-field response rates. Toxicity was measured using the Wieberdink scale and serum creatinine phosphokinase levels. Results The median age was 70 years (range, 19-92 years), and 12 patients (55%) were women. Fourteen patients (64%) had sarcomas, 7 (32%) had Merkel cell carcinoma, and 1 (5%) had squamous cell carcinoma. The median length of stay was 5.5 days (interquartile range, 4-8 days). Twenty-five of the 26 ILIs (96%) resulted in Wieberdink grade III or less toxicity, and 1 patient (4%) developed grade IV toxicity. The median serum creatinine phosphokinase level was 127 U/L for upper extremity ILIs and 93 U/L for lower extremity ILIs. Nineteen of 22 patients (86%) underwent successful limb preservation. The 3-month in-field response rate was 79% (21% complete and 58% partial), and the median follow-up was 8.6 months (range, 1-63 months). Five patients underwent resection of disease after an ILI, of whom 80% are disease free at a median of 8.6 months. Conclusions Isolated limb infusion provides an attractive alternative therapy for regional disease control and limb preservation in patients with limb-threatening cutaneous and soft-tissue malignant neoplasms. Short-term response rates appear encouraging, yet durability of response is unknown. PMID:21768436
DI Pierro, Giovanni Battista; Grande, Pietro; Mordasini, Livio; Danuser, Hansjörg; Mattei, Agostino
2016-08-01
To analyze the safety and efficacy of robot-assisted radical prostatectomy (RARP) in a low-volume centre. From 2008 to 2015, 400 consecutive patients undergoing RARP were prospectively enrolled. Complications were classified according to the Modified Clavien System. Biochemical recurrence (BCR) was defined as two consecutive prostate-specific antigen (PSA) values ≥0.2 ng/ml. Functional outcomes were assessed using validated, self-administered questionnaires. Median patient age was 64.5 years. Mean (standard deviation, SD) preoperative PSA level was 11.3 (11.7) ng/ml. Median (interquartile range, IQR) follow-up was 36 (12-48) months. Overall complication rate was 27.7% (minor complication rate 16.2%). Overall 1-, 3- and 6-year BCR-free survival rates were 85.7%, 77.5% and 53.9%, respectively; these rates were 94.1%, 86.2% and 70.1% in pT2 disease. At follow-up, 98.4% of patients were fully continent (median (IQR) time to continence was 2 (1-3) months) and 68.2% were potent (median (IQR) time to potency of 3 (3-4) months). RARP appears to be a valuable option for treating clinically localised prostate cancer, even in a low-volume institution. Copyright © 2016 International Institute of Anticancer Research (Dr. John G. Delinassios). All rights reserved.
Gin, Henri; Demeaux, Jean-Louis; Grelaud, Angela; Grolleau, Adeline; Droz-Perroteau, Cécile; Robinson, Philip; Lassalle, Régis; Abouelfath, Abdelilah; Boisseau, Michel; Toussaint, Christian; Moore, Nicholas
2013-01-01
To estimate the effect of lifestyle adjustment activities in patients with metabolic syndrome treated by prescribed balneotherapy. Observational pilot cohort study with 12-month follow-up after multidimensional lifestyle training (physical, dietary, educational) during a 3-week standard stay in the spa town of Eugénie-les-Bains. Of 145 eligible patients, 97 were included; 63 were followed and analysable. At inclusion, all had ≥3 National Cholesterol Education Program-Adult Treatment Panel III (NCEP-ATPIII) criteria defining metabolic syndrome; 76.2% were female and mean age was 61.2 years. At the end of follow-up (median 10.4 months, interquartile range [6.7;11.4]), 48 of these 63 patients (76.2%) no longer had metabolic syndrome (95%CI [65.7;86.7]). These 48 patients without metabolic syndrome at the end of follow-up represented 49.5% of the 97 included (95%CI [39.5;59.4]). Future studies of lifestyle interventions taking advantage of the spa environment can be expected to find at least one third of patients free of metabolic syndrome at the end of 12-month follow-up in the intervention group. © 2013 Société Française de Pharmacologie et de Thérapeutique.
Seizure Freedom in Children With Pathology-Confirmed Focal Cortical Dysplasia.
Mrelashvili, Anna; Witte, Robert J; Wirrell, Elaine C; Nickels, Katherine C; Wong-Kisiel, Lily C
2015-12-01
We evaluated the temporal course of seizure outcome in children with pathology-confirmed focal cortical dysplasia and explored predictors of sustained seizure freedom. We performed a single-center retrospective study of children ≤ 18 years who underwent resective surgery from January 1, 2000, through December 31, 2012, and had pathology-proven focal cortical dysplasia. Surgical outcome was classified as seizure freedom (Engel class I) or seizure recurrence (Engel classes II-IV). Fisher exact and nonparametric Wilcoxon rank-sum tests were used, as appropriate. Survival analysis was based on seizure-free outcome. Patients were censored at the time of seizure recurrence or seizure freedom at last follow-up. Thirty-eight patients were identified (median age at surgery, 6.5 years; median duration of epilepsy, 3.3 years). Median time to last follow-up was 13.5 months (interquartile range, 7-41 months). Twenty patients (53%) were seizure free and 26 patients (68%) attained seizure freedom for a minimum of 3 months. Median time to seizure recurrence was 38 months (95% confidence interval, 6-109 months), and the cumulative seizure-free rate was 60% at 12 months (95% confidence interval, 43%-77%). Clinical features associated with seizure freedom at last follow-up included older age at seizure onset (P = .02), older age at surgery (P = .04), absent to mild intellectual disability before surgery (P = .05), and seizure freedom for a minimum of 3 months (P < .001). Favorable clinical features associated with sustained seizure freedom included older age at seizure onset, older age at surgery, absent or mild intellectual disability at baseline, and seizure freedom for a minimum of 3 months. Copyright © 2015 Elsevier Inc. All rights reserved.
Athletes with inguinal disruption benefit from endoscopic totally extraperitoneal (TEP) repair.
Roos, M M; Bakker, W J; Goedhart, E A; Verleisdonk, E J M M; Clevers, G J; Voorbrood, C E H; Sanders, F B M; Naafs, D B; Burgmans, J P J
2018-06-01
Inguinal disruption, a common condition in athletes, is a diagnostic and therapeutic challenge. The aim of this study was to evaluate the effect of endoscopic totally extraperitoneal (TEP) repair in athletes with inguinal disruption, selected through a multidisciplinary, systematic work-up. An observational, prospective cohort study was conducted in 32 athletes with inguinal disruption. Athletes were assessed by a sports medicine physician, radiologist and hernia surgeon and underwent subsequent endoscopic TEP repair with placement of polypropylene mesh. The primary outcome was pain reduction during exercise on the numeric rating scale (NRS) 3 months postoperatively. Secondary outcomes were sports resumption, physical functioning and long-term pain intensity. Patients were assessed preoperatively, 3 months postoperatively and after a median follow-up of 19 months. Follow-up was completed in 30 patients (94%). The median pain score decreased from 8 [interquartile range (IQR) 7-8] preoperatively to 2 (IQR 0-5) 3 months postoperatively (p < 0.001). At long-term follow-up, the median pain score was 0 (IQR 0-3) (p < 0.001). At 3 months, 60% of patients were able to complete a full training and match. The median intensity of sport was 50% (IQR 20-70) preoperatively, 95% (IQR 70-100) 3 months postoperatively (p < 0.001), and 100% (IQR 90-100) at long-term follow-up (p < 0.001). The median frequency of sport was 4 (IQR 3-5) times per week before development of symptoms and 3 (IQR 3-4) times per week 3 months postoperatively (p = 0.025). Three months postoperatively, improvement was shown on all physical functioning subscales. Athletes with inguinal disruption, selected through a multidisciplinary, systematic work-up, benefit from TEP repair.
Ghatak, Nishantadeb; Trehan, Amita; Bansal, Deepak
2016-01-01
In a low-income country, a child with cancer has severe financial implications for the family. Invariably, patients have to self-finance their therapy. "Out-of-pocket" expenses tend to be high. Also, parents may face loss of a job or business, resulting in loss of income. Our objective was to assess the financial burden in families with a child with cancer. The cost to a family with a child with acute lymphoblastic leukemia (ALL) during the first month of therapy was analyzed. Fifty families were given a cost diary in which details of expenditure (direct medical costs, living costs, transport cost) and lost income/employment were recorded. The families evaluated came from a distance of 260 ± 218 km from the hospital. Most families belonged to the upper-lower socioeconomic category (62%). The medical expenditure amounted to US dollar (USD) 524 (interquartile range (IQR) 395-777). Nonmedical expenditure was USD 207 (IQR 142-293), the maximum expenditure being on food. The monthly expenses were 7.2 times the monthly per capita income of India, which was Indian rupee (INR) 5729 (USD 97) in 2012-2013. Thirty-nine families received financial help (USD 800-3225) from various sources within 6 months of application. Of the families, 72% suffered loss of income and 34% of fathers lost their jobs. Families spend up to seven times their monthly income over a period of 1 month on an unforeseen illness. Despite financial aid from various sources, nonmedical costs amount to nearly 2.5 times the average per capita income. Universal health insurance is the need of the hour.
Daniels, Benjamin; Kiely, Belinda E; Lord, Sarah J; Houssami, Nehmat; Lu, Christine Y; Ward, Robyn L; Pearson, Sallie-Anne
2018-04-01
Outcomes for patients treated in clinical trials may not reflect the experience in routine clinical care. We aim to describe the real-world treatment patterns and overall survival (OS) for women receiving trastuzumab for metastatic breast cancer (MBC). Retrospective, whole-of-population cohort study using demographic, dispensing, and medical services data for women in the Herceptin Program for HER2+MBC. We estimated time on trastuzumab and OS from first dispensing of trastuzumab for MBC and rates of cardiac monitoring prior to and during treatment. We stratified outcomes by two groups based on year of initiation: 2001-2008 and 2009-2015. We benchmarked outcomes to two key trastuzumab clinical trials: H0648g (median OS 25 months) and CLEOPATRA (control group median OS 41 months). Median age of the 5899 women at first trastuzumab dispensing was 57 years (interquartile range [IQR]: 48-66). Median time on trastuzumab increased from 15 months (7-33) in Group One to 18 months (8-42) in Group Two. Median OS increased from 27 months (12-57) in Group One to 38 months (16-83) in Group Two. Rates of cardiac monitoring increased at baseline (52%-76%), and on-treatment (47%-67%), in Group One and Two, respectively. OS, duration of trastuzumab, and frequency of cardiac monitoring increased over the study period. Outcomes for trastuzumab in this heterogeneous real world population were reassuringly comparable to those from clinical trials, with the median OS > 3 years in Group Two and 25% of patients living 7 years or longer. Copyright © 2017 Elsevier Ltd. All rights reserved.
Normalization methods in time series of platelet function assays
Van Poucke, Sven; Zhang, Zhongheng; Roest, Mark; Vukicevic, Milan; Beran, Maud; Lauwereins, Bart; Zheng, Ming-Hua; Henskens, Yvonne; Lancé, Marcus; Marcus, Abraham
2016-01-01
Platelet function can be quantitatively assessed by specific assays such as light-transmission aggregometry, multiple-electrode aggregometry measuring the response to adenosine diphosphate (ADP), arachidonic acid, collagen, and thrombin-receptor activating peptide, and by viscoelastic tests such as rotational thromboelastometry (ROTEM). The task of extracting meaningful statistical and clinical information from high-dimensional, temporal multivariate clinical data represented as multivariate time series is complex. Building insightful visualizations for multivariate time series demands adequate usage of normalization techniques. In this article, various methods for data normalization (z-transformation, range transformation, proportion transformation, and interquartile range) are presented and visualized, and the most suitable approach for platelet function data series is discussed. Normalization was calculated per assay (test) for all time points and per time point for all tests. Interquartile range, range transformation, and z-transformation demonstrated the correlation (as calculated by the Spearman correlation test) when data were normalized per assay (test) for all time points. When normalizing per time point for all tests, no correlation could be abstracted from the charts, as was also the case when all data were normalized as a single dataset. PMID:27428217
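As a rough illustration of the normalization options named above (this is a minimal sketch, not the article's own code; the assay names and values are invented for the example), the snippet below applies each transformation per assay (column-wise, i.e., per test across all time points) and, alternatively, per time point (row-wise) to a small hypothetical platelet-function table.

# Minimal sketch: four normalization options for a platelet-function time series.
# Column names and values are hypothetical, not taken from the study.
import pandas as pd

# rows = time points, columns = assays (e.g. ADP aggregometry, collagen, a ROTEM parameter)
data = pd.DataFrame(
    {"ADP": [45.0, 52.0, 60.0, 41.0],
     "collagen": [70.0, 65.0, 80.0, 72.0],
     "ROTEM_MCF": [55.0, 58.0, 61.0, 50.0]},
    index=["t0", "t1", "t2", "t3"])

def z_transform(x):
    # centre on the mean, scale by the standard deviation
    return (x - x.mean()) / x.std(ddof=1)

def range_transform(x):
    # rescale to [0, 1] using the observed minimum and maximum
    return (x - x.min()) / (x.max() - x.min())

def proportion_transform(x):
    # express each value as a proportion of the series total
    return x / x.sum()

def iqr_transform(x):
    # centre on the median, scale by the interquartile range
    return (x - x.median()) / (x.quantile(0.75) - x.quantile(0.25))

# "per assay (test) for all time points": apply each transformation column-wise
per_assay = {name: data.apply(f, axis=0)
             for name, f in [("z", z_transform), ("range", range_transform),
                             ("proportion", proportion_transform), ("iqr", iqr_transform)]}

# "per time point for all tests": apply row-wise instead
per_time_point = data.apply(z_transform, axis=1)

print(per_assay["iqr"])

Whether column-wise or row-wise normalization is appropriate depends, as the abstract notes, on whether between-assay correlation over time is to be preserved in the resulting visualization.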
Rositch, Anne F.; Soeters, Heidi M.; Offutt-Powell, Tabatha N.; Wheeler, Bradford S.; Taylor, Sylvia M.; Smith, Jennifer S.
2015-01-01
Objective To systematically review the published literature in order to estimate the incidence and describe the variability of human papillomavirus (HPV) infection in women following treatment for cervical neoplasia. Methods Several scientific literature databases (e.g. PubMed, ISI Web of Science) were searched through January 31, 2012. Eligible articles provided data on (i) baseline HPV infection status within 6 months prior to or at time of treatment (pre-treatment); and (ii) HPV test results for women's first visit after treatment occurring within 36 months (post-treatment). We abstracted and summarized the post-treatment incidence of newly detected HPV genotypes that were not present at pre-treatment, overall and stratified by study and other population characteristics. Results A total of 25 studies were included, reporting post-treatment HPV incidence in nearly 2000 women. Mean patient age ranged from 31 to 43 years (median 36). Most studies used cervical exfoliated cell specimens to test for HPV DNA (n = 20; 80%), using polymerase chain reaction (n = 21; 84%). Cervical neoplasia treatment included loop electrical excision procedure (n = 11; 44%); laser conization (n = 2; 8%); laser ablation, surgical conization, cryotherapy, alpha-interferon (n = 1; 4% each); or multiple treatment regimens (n = 8; 32%). Follow-up times post-treatment ranged from 1.5 to 36 months (median 6). More than half of studies (n = 17; 68%) estimated the incidence of any HPV type following treatment, while 7 (28%) focused specifically on high-risk (HR) HPV. HPV incidence after treatment varied widely, ranging from 0 to 47% (interquartile range: 0%-15%) in up to 3 years of follow-up after treatment. Lower HPV incidence was observed among studies that included relatively younger women, used laser conization, focused on HR-HPV rather than overall HPV infection, and had a lower proportion of recurrent cervical disease. Conclusions These modest summary incidence estimates from the published literature can guide clinicians, epidemiologists and health economists in developing best practices for post-treatment cervical cancer prevention. PMID:24412508
Venigalla, Sriram; Carmona, Ruben; Guttmann, David M; Jain, Varsha; Freedman, Gary M; Clark, Amy S; Shabason, Jacob E
2018-05-24
Although adjuvant endocrine therapy confers a survival benefit among females with hormone receptor (HR)-positive breast cancer, the effectiveness of this treatment among males with HR-positive breast cancer has not been rigorously investigated. To investigate trends, patterns of use, and effectiveness of adjuvant endocrine therapy among men with HR-positive breast cancer. This retrospective cohort study identified patients in the National Cancer Database with breast cancer who had received treatment from 2004 through 2014. Inclusion criteria for the primary study cohort were males at least 18 years old with nonmetastatic HR-positive invasive breast cancer who underwent surgery with or without adjuvant endocrine therapy. A cohort of female patients was also identified using the same inclusion criteria for comparative analyses by sex. Data analysis was conducted from October 1, 2017, to December 15, 2017. Receipt of adjuvant endocrine therapy. Patterns of adjuvant endocrine therapy use were assessed using multivariable logistic regression analyses. Association between adjuvant endocrine therapy use and overall survival was assessed using propensity score-weighted multivariable Cox regression models. The primary study cohort comprised 10 173 men with HR-positive breast cancer (mean [interquartile range] age, 66 [57-75] years). The comparative cohort comprised 961 676 women with HR-positive breast cancer (mean [interquartile range] age, 62 [52-72] years). The median follow-up for the male cohort was 49.6 months (range, 0.1-142.5 months). Men presented more frequently than women with HR-positive disease (94.0% vs 84.3%, P < .001). However, eligible men were less likely than women to receive adjuvant endocrine therapy (67.3% vs 79.0%; OR, 0.61; 95% CI, 0.58-0.63; P < .001). Treatment at academic facilities (odds ratio, 1.13; 95% CI, 1.02-1.25; P = .02) and receipt of adjuvant radiotherapy (odds ratio, 2.83; 95% CI, 2.55-3.15; P < .001) or chemotherapy (odds ratio, 1.20; 95% CI, 1.07-1.34; P < .001) were statistically significantly associated with adjuvant endocrine therapy use in men. A propensity score-weighted analysis indicated that relative to no use, adjuvant endocrine therapy use in men was associated with improved overall survival (hazard ratio, 0.70; 95% CI, 0.63-0.77; P < .001). There is a sex disparate underuse of adjuvant endocrine therapy among men with HR-positive breast cancer despite the use of this treatment being associated with improved overall survival. Further research and interventions may be warranted to bridge gaps in care in this population.
Influence of Antiflatulent Dietary Advice on Intrafraction Motion for Prostate Cancer Radiotherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lips, Irene M., E-mail: I.M.Lips@umcutrecht.nl; Kotte, Alexis N.T.J.; Gils, Carla H. van
Purpose: To evaluate the effect of antiflatulent dietary advice on intrafraction prostate motion in patients treated with intensity-modulated radiotherapy (IMRT) for prostate cancer. Methods and Materials: Between February 2002 and December 2009, 977 patients received five-beam IMRT for prostate cancer to a dose of 76 Gy in 35 fractions combined with fiducial markers for position verification. In July 2008, the diet, consisting of dietary guidelines to obtain regular bowel movements and to reduce intestinal gas by avoiding certain foods and air swallowing, was introduced to reduce the prostate motion. The intrafraction prostate movement was determined from the portal images of the first segment of all five beams. Clinically relevant intrafraction motion was defined as ≥50% of the fractions with an intrafraction motion outside a range of 3 mm. Results: A total of 739 patients were treated without the diet and 105 patients were treated with radiotherapy after introduction of the diet. The median average intrafraction motion per patient was 2.53 mm (interquartile range, 2.2-3.0) without the diet and 3.00 mm (interquartile range, 2.4-3.5) with the diet (p < .0001). The percentage of patients with clinically relevant intrafraction motion showed a statistically significant increase from 19.1% without the diet to 42.9% with the diet (odds ratio, 3.18; 95% confidence interval, 2.07-4.88; p < .0001). Conclusions: The results of the present study suggest that antiflatulent dietary advice for patients undergoing IMRT for prostate cancer does not reduce the intrafraction movement of the prostate. Therefore, antiflatulent dietary advice is not recommended in clinical practice for this purpose.
Aggarwal, Neil R; Brower, Roy G; Hager, David N; Thompson, B Taylor; Netzer, Giora; Shanholtz, Carl; Lagakos, Adrian; Checkley, William
2018-04-01
High fractions of inspired oxygen may augment lung damage to exacerbate lung injury in patients with acute respiratory distress syndrome. Participants enrolled in Acute Respiratory Distress Syndrome Network trials had a goal partial pressure of oxygen in arterial blood range of 55-80 mm Hg, yet the effect of oxygen exposure above this arterial oxygen tension range on clinical outcomes is unknown. We sought to determine if oxygen exposure that resulted in a partial pressure of oxygen in arterial blood above goal (> 80 mm Hg) was associated with worse outcomes in patients with acute respiratory distress syndrome. Longitudinal analysis of data collected in these trials. Ten clinical trials conducted at Acute Respiratory Distress Syndrome Network hospitals between 1996 and 2013. Critically ill patients with acute respiratory distress syndrome. None. We defined above goal oxygen exposure as the difference between the fraction of inspired oxygen and 0.5 whenever the fraction of inspired oxygen was above 0.5 and when the partial pressure of oxygen in arterial blood was above 80 mm Hg. We then summed above goal oxygen exposures in the first five days to calculate a cumulative above goal oxygen exposure. We determined the effect of a cumulative 5-day above goal oxygen exposure on mortality prior to discharge home at 90 days. Among 2,994 participants (mean age, 51.3 yr; 54% male) with a study-entry partial pressure of oxygen in arterial blood/fraction of inspired oxygen that met acute respiratory distress syndrome criteria, average cumulative above goal oxygen exposure was 0.24 fraction of inspired oxygen-days (interquartile range, 0-0.38). Participants with above goal oxygen exposure were more likely to die (adjusted interquartile range odds ratio, 1.20; 95% CI, 1.11-1.31) and have lower ventilator-free days (adjusted interquartile range mean difference of -0.83; 95% CI, -1.18 to -0.48) and lower hospital-free days (adjusted interquartile range mean difference of -1.38; 95% CI, -2.09 to -0.68). We observed a dose-response relationship between the cumulative above goal oxygen exposure and worsened clinical outcomes for participants with mild, moderate, or severe acute respiratory distress syndrome, suggesting that the observed relationship is not primarily influenced by severity of illness. Oxygen exposure resulting in arterial oxygen tensions above the protocol goal occurred frequently and was associated with worse clinical outcomes at all levels of acute respiratory distress syndrome severity.
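To make the exposure metric defined above concrete, here is a minimal sketch of the cumulative above-goal oxygen exposure calculation (not the trial's code; function names are illustrative and, for simplicity, one representative FiO2/PaO2 pair stands in for each day): the excess of FiO2 over 0.5 is counted only on assessments where FiO2 exceeds 0.5 and PaO2 exceeds 80 mm Hg, and these excesses are summed over the first five days.

# Hypothetical sketch of the "above goal" oxygen exposure described above.
def above_goal_exposure(fio2, pao2):
    """Excess FiO2 for a single assessment: counted only when FiO2 > 0.5
    and PaO2 > 80 mm Hg (above the 55-80 mm Hg goal range)."""
    if fio2 > 0.5 and pao2 > 80:
        return fio2 - 0.5
    return 0.0

def cumulative_exposure(daily_fio2, daily_pao2):
    """Sum of above-goal exposures over the first five study days (units: FiO2-days)."""
    return sum(above_goal_exposure(f, p)
               for f, p in zip(daily_fio2[:5], daily_pao2[:5]))

# Example: only days 2 and 3 meet both criteria, contributing 0.3 and 0.1 FiO2-days.
print(cumulative_exposure([0.5, 0.8, 0.6, 0.4, 0.5], [75, 95, 88, 70, 79]))  # about 0.4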
Johnson, L; van Jaarsveld, C H M; Llewellyn, C H; Cole, T J; Wardle, J
2014-01-01
Objective: Infant growth trajectories, in terms of size, tempo and velocity, may programme lifelong obesity risk. Timing of breastfeeding cessation and weaning are both implicated in rapid infant growth; we examined the association of both simultaneously with a range of growth parameters. Design: Longitudinal population-based twin birth cohort. Subjects: The Gemini cohort provided data on 4680 UK infants with a median of 10 (interquartile range=8–15) weight measurements between birth and a median of 6.5 months. Age at breastfeeding cessation and weaning were reported by parents at mean age 8.2 months (s.d.=2.2, range=4–20). Growth trajectories were modelled using SuperImposition by Translation And Rotation (SITAR) to generate three descriptors of individual growth relative to the average trajectory: size (grams), tempo (weeks, indicating the timing of the peak growth rate) and velocity (% difference from average, reflecting mean growth rate). Complex-samples general linear models adjusting for family clustering and confounders examined associations between infant feeding and SITAR parameters. Results: Longer breastfeeding (>4 months vs never) was independently associated with lower growth velocity by 6.8% (s.e.=1.3%) and delayed growth tempo by 1.0 (s.e.=0.2 weeks), but not with smaller size. Later weaning (⩾6 months vs <4 months) was independently associated with lower growth velocity by 4.9% (s.e.=1.1%) and smaller size by 102 g (s.e.=25 g). Conclusions: Infants breastfed for longer grew slower for longer after birth (later peak growth rate) but were no different in size, while infants weaned later grew slower overall and were smaller but the timing of peak growth did not differ. Slower trajectories with a delayed peak in growth may have beneficial implications for programming later obesity risk. Replication in cohorts with longer follow-up, alternative confounding structures or randomised controlled trials are required to confirm the long-term effects and directionality, and to rule out residual confounding. PMID:24722545
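For orientation, the SITAR model referenced above is commonly written (in the growth-modelling literature, not in this abstract itself) as a shifted and scaled version of a shared spline curve h fitted to all infants:

y_{it} = \alpha_i + h\!\left(\frac{t - \beta_i}{e^{-\gamma_i}}\right) + \varepsilon_{it}

where, for infant i, \alpha_i is the size shift (here in grams), \beta_i is the tempo shift along the age axis (here in weeks, locating the timing of peak growth rate), and \gamma_i rescales the age axis and so acts as the velocity parameter (expressed as a percentage difference from the average growth rate).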
Arcinue, Cheryl A.; Ma, Feiyan; Barteselli, Giulio; Sharpsten, Lucie; Gomez, Maria Laura; Freeman, William R.
2014-01-01
Purpose To evaluate 6-month and 1-year outcomes of every 8 weeks (Q8W) aflibercept in patients with resistant neovascular age-related macular degeneration (AMD). Design Retrospective, interventional, consecutive case series. Methods Retrospective review of patients with resistance (multiple recurrences or persistent exudation) to every 4 weeks (Q4W) ranibizumab or bevacizumab that were switched to Q8W aflibercept. Results Sixty-three eyes of 58 patients had a median of 13 (interquartile range (IQR), 7-22) previous anti Vascular Endothelial Growth Factor (anti-VEGF) injections. At 6-months after changing to aflibercept, 60.3% of eyes were completely dry, which was maintained up to one-year. The median maximum retinal thickness improved from 355 microns to 269 microns at 6 months (p<0.0001) and 248 microns at one year (p<0.0001). There was no significant improvement in ETDRS visual acuity at 6 months (p=0.2559) and one-year follow-up (p=0.1081) compared with baseline. The mean difference in ETDRS visual acuity compared to baseline at 6 months was −0.05 logMAR (+2.5 letters) and 0.04 logMAR at 1 year (−2 letters). Conclusion Sixty percent of eyes with resistant AMD while on Q4W ranibizumab or bevacizumab were completely dry after changing to Q8W aflibercept at the 6-month and 1-year follow-ups, but visual acuity did not significantly improve. Only a third of eyes needed to be switched from Q8W to Q4W aflibercept due to persistence of fluid; Q8W dosing of aflibercept without the initial 3 monthly loading doses may be a good alternative in a select group of patients who may have developed ranibizumab or bevacizumab resistance. PMID:25461263
Turner-McGrievy, Gabrielle M; Davidson, Charis R; Wingard, Ellen E; Billings, Deborah L
2014-06-01
The aim of this randomized pilot was to assess the feasibility of a dietary intervention among women with polycystic ovary syndrome (PCOS) comparing a vegan to a low-calorie (low-cal) diet. Overweight (body mass index, 39.9 ± 6.1 kg/m(2)) women with PCOS (n = 18; age, 27.8 ± 4.5 years; 39% black) who were experiencing infertility were recruited to participate in a 6-month randomized weight loss study delivered through nutrition counseling, e-mail, and Facebook. Body weight and dietary intake were assessed at 0, 3, and 6 months. We hypothesized that weight loss would be greater in the vegan group. Attrition was high at 3 (39%) and 6 months (67%). All analyses were conducted as intention-to-treat and presented as median (interquartile range). Vegan participants lost significantly more weight at 3 months (-1.8% [-5.0%, -0.9%] vegan, 0.0 [-1.2%, 0.3%] low-cal; P = .04), but there was no difference between groups at 6 months (P = .39). Use of Facebook groups was significantly related to percent weight loss at 3 (P < .001) and 6 months (P = .05). Vegan participants had a greater decrease in energy (-265 [-439, 0] kcal/d) and fat intake (-7.4% [-9.2%, 0] energy) at 6 months compared with low-cal participants (0 [0, 112] kcal/d, P = .02; 0 [0, 3.0%] energy, P = .02). These preliminary results suggest that engagement with social media and adoption of a vegan diet may be effective for promoting short-term weight loss among women with PCOS; however, a larger trial that addresses potential high attrition rates is needed to confirm these results. Copyright © 2014 Elsevier Inc. All rights reserved.
Cause, timing, and location of death in the Single Ventricle Reconstruction trial.
Ohye, Richard G; Schonbeck, Julie V; Eghtesady, Pirooz; Laussen, Peter C; Pizarro, Christian; Shrader, Peter; Frank, Deborah U; Graham, Eric M; Hill, Kevin D; Jacobs, Jeffrey P; Kanter, Kirk R; Kirsh, Joel A; Lambert, Linda M; Lewis, Alan B; Ravishankar, Chitra; Tweddell, James S; Williams, Ismee A; Pearson, Gail D
2012-10-01
The Single Ventricle Reconstruction trial randomized 555 subjects with a single right ventricle undergoing the Norwood procedure at 15 North American centers to receive either a modified Blalock-Taussig shunt or right ventricle-to-pulmonary artery shunt. Results demonstrated a rate of death or cardiac transplantation by 12 months postrandomization of 36% for the modified Blalock-Taussig shunt and 26% for the right ventricle-to-pulmonary artery shunt, consistent with other publications. Despite this high mortality rate, little is known about the circumstances surrounding these deaths. There were 164 deaths within 12 months postrandomization. A committee adjudicated all deaths for cause and recorded the timing, location, and other factors for each event. The most common cause of death was cardiovascular (42%), followed by unknown cause (24%) and multisystem organ failure (7%). The median age at death for subjects dying during the 12 months was 1.6 months (interquartile range, 0.6 to 3.7 months), with the highest number of deaths occurring during hospitalization related to the Norwood procedure. The most common location of death was at a Single Ventricle Reconstruction trial hospital (74%), followed by home (13%). There were 29 sudden, unexpected deaths (18%), although in retrospect, 12 were preceded by a prodrome. In infants with a single right ventricle undergoing staged repair, the majority of deaths within 12 months of the procedure are due to cardiovascular causes, occur in a hospital, and within the first few months of life. Increased understanding of the circumstances surrounding the deaths of these single ventricle patients may reduce the high mortality rate. Copyright © 2012 The American Association for Thoracic Surgery. All rights reserved.
Mansfield, Robert T; Lin, Kimberly Y; Zaoutis, Theoklis; Mott, Antonio R; Mohamad, Zeinab; Luan, Xianqun; Kaufman, Beth D; Ravishankar, Chitra; Gaynor, J William; Shaddy, Robert E; Rossano, Joseph W
2015-07-01
The use of ventricular assist devices has increased dramatically in adult heart failure patients. However, the overall use, outcome, comorbidities, and resource utilization of ventricular assist devices in pediatric patients have not been well described. We sought to demonstrate that the use of ventricular assist devices in pediatric patients has increased over time and that mortality has decreased. A retrospective study of the Pediatric Health Information System database was performed for patients 20 years old or younger undergoing ventricular assist device placement from 2000 to 2010. None. Four hundred seventy-five pediatric patients were implanted with ventricular assist devices during the study period: 69 in 2000-2003 (era 1), 135 in 2004-2006 (era 2), and 271 in 2007-2010 (era 3). Median age at ventricular assist device implantation was 6.0 years (interquartile range, 0.5-13.8), and the proportion of children who were 1-12 years old increased from 29% in era 1 to 47% in era 3 (p = 0.002). The majority of patients had a diagnosis of cardiomyopathy; this increased from 52% in era 1 to 72% in era 3 (p = 0.003). Comorbidities included arrhythmias (48%), pulmonary hypertension (16%), acute renal failure (34%), cerebrovascular disease (28%), and sepsis/systemic inflammatory response syndrome (34%). Two hundred forty-seven patients (52%) underwent heart transplantation and 327 (69%) survived to hospital discharge. Hospital mortality decreased from 42% in era 1 to 25% in era 3 (p = 0.004). Median hospital length of stay increased (37 d [interquartile range, 12-64 d] in era 1 vs 69 d [interquartile range, 35-130] in era 3; p < 0.001) and median adjusted hospital charges increased ($630,630 [interquartile range, $227,052-$853,318] in era 1 vs $1,577,983 [interquartile range, $874,463-$2,280,435] in era 3; p < 0.001). Factors associated with increased mortality include age less than 1 year (odds ratio, 2.04; 95% CI, 1.01-3.83), acute renal failure (odds ratio, 2.1; 95% CI, 1.26-3.65), cerebrovascular disease (odds ratio, 2.1; 95% CI, 1.25-3.62), and extracorporeal membrane oxygenation (odds ratio, 3.16; 95% CI, 1.79-5.60). Ventricular assist device placement in era 3 (odds ratio, 0.3; 95% CI, 0.15-0.57) and a diagnosis of cardiomyopathy (odds ratio, 0.5; 95% CI, 0.32-0.84), were associated with decreased mortality. Large-volume centers had lower mortality (odds ratio, 0.55; 95% CI, 0.34-0.88), lower use of extracorporeal membrane oxygenation, and higher charges. The use of ventricular assist devices and survival after ventricular assist device placement in pediatric patients have increased over time, with a concomitant increase in resource utilization. Age under 1 year, certain noncardiac morbidities, and the use of extracorporeal membrane oxygenation are associated with worse outcomes. Lower mortality was seen at larger volume ventricular assist device centers.
Nonresponders: prolonged fever among infants with urinary tract infections.
Bachur, R
2000-05-01
The majority of young children with fever and urinary tract infections (UTIs) have evidence of pyelonephritis based on renal scans. Resolution of fever during treatment is one clinical marker of adequate treatment. Theoretically, prolonged fever may be a clue to complications, such as urinary obstruction or renal abscess. The aims were to describe the pattern of fever in febrile children undergoing treatment of a UTI and to compare the clinical characteristics of patients with prolonged fever to those who responded faster to therapy. The study was a medical record review at an urban pediatric hospital. All children ≤2 years old admitted to the pediatric service with a primary discharge diagnosis of pyelonephritis or UTI were reviewed for 65 consecutive months. Patients with previous UTI, known urologic problems, or immunodeficiency were excluded. Only patients with an admitting temperature ≥38 degrees C and those who met standard culture criteria were studied. Temperatures are not recorded hourly on the inpatient unit; therefore, they were assigned to blocks of time. Nonresponders were defined as those above the 90th percentile for the time to defervesce. Nonresponders were then compared with the balance of the study patients, termed responders. Of 288 patients studied, the median age was 5.6 months (interquartile range: 1.3-7.9 months). Median admission temperature was 39.3 degrees C (interquartile range: 38.5 degrees C-40.1 degrees C). The median time to defervesce fell in the 13- to 16-hour time block. Sixty-eight percent were afebrile by 24 hours and 89% by 48 hours. Thirty-one patients had fever >48 hours (nonresponders). Nonresponders were older than responders (9.4 vs 4.1 months old) but had similar initial temperatures (39.8 vs 39.2 degrees C), white blood cell counts (18.4 vs 17.1 × 1000/mm³), and band counts (1.4 vs 1.2 × 1000/mm³). Nonresponders had similar urinalyses with regard to leukocyte esterase-positive results (23/29 vs 211/246), nitrite-positive results (8/28 vs 88/221), and the number of patients with "too numerous to count" white blood cell counts per high-power field (12/28 vs 77/220). Nonresponders were as likely as responders to have bacteremia (3/31 vs 21/256), hydronephrosis by renal ultrasound (1/31 vs 12/232), and significant vesicoureteral reflux (grade 3 or higher; 5/26 vs 30/219). Escherichia coli was the pathogen in 28 of 31 (nonresponder) and 225 of 257 (responder) cultures. The number of cultures with ≥100 colony-forming units/mL was similar (25/31 nonresponders vs 206/257 responders). Repeat urine cultures were performed in 93% of patients during the admission; all culture results were negative. No renal abscesses or pyohydronephrosis were diagnosed. Eighty-nine percent of young children with febrile UTIs were afebrile within 48 hours of initiating parenteral antibiotics. The patients who took longer than 48 hours to defervesce were clinically similar to those whose fevers responded faster to therapy. If antibiotic sensitivities are known, additional diagnostic studies or prolonged hospitalizations may not be justified solely on the basis of persistent fever beyond 48 hours of therapy.
Clément, Marie Caroline; Mahlaoui, Nizar; Mignot, Cécile; Le Bihan, Christine; Rabetrano, Hasina; Hoang, Ly; Neven, Bénédicte; Moshous, Despina; Cavazzana, Marina; Blanche, Stéphane; Fischer, Alain; Audrain, Marie; Durand-Zaleski, Isabelle
2015-06-01
The inclusion of severe combined immunodeficiency (SCID) in a Europe-wide screening program is currently debated. In making a case for inclusion in the French newborn screening program, we explored the costs incurred and potentially saved by early management of SCID. For test costs, a microcosting study documented the resources used in a laboratory piloting a newborn screening test on Guthrie cards using the T-cell receptor excision circle quantification method. For treatment costs, patients with SCID admitted to the national reference center for primary immunodeficiency in France between 2006 and 2010 were included. Costs of admission were estimated from actual national production costs. We estimated the costs for patients who underwent early versus delayed hematopoietic stem cell transplantation (HSCT; age, ≤3 vs. >3 months, respectively). The unit cost of the test varied between €4.69 and €6.79 for 33,800 samples per year, depending on equipment use and saturation. Of the 30 patients included, 27 underwent HSCT after age 3 months. At 1 year after HSCT, 10 of these had died, and all 3 patients undergoing early transplantation survived. The medical costs for HSCT after 3 months were €195,776 (interquartile range, €165,884-€257,160) versus €86,179 (range, €59,014-€272,577) when performed before 3 months of age. In patients undergoing late transplantation, active infection contributed to high cost and poor outcome. Early detection of SCID could reduce the cost of treatment by €50,000-100,000 per case. Assuming a €5 unit cost per test, the incidence required to break even is 1:20,000; however, if the survival advantage of HSCT before 3 months is confirmed, universal screening is likely to be cost-effective. Copyright © 2015 American Academy of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.
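As a rough illustration of the break-even arithmetic above (a sketch only: the €5 unit test cost and the €100,000 upper bound on treatment savings per case are the abstract's figures, and the calculation simply balances screening cost against savings):

    # Break-even incidence: the cost of screening per case detected must not exceed the saving per case.
    test_cost_eur = 5.0               # assumed unit cost per newborn screening test (from the abstract)
    saving_per_case_eur = 100_000.0   # upper estimate of treatment cost saved per early-detected case

    tests_per_detected_case = saving_per_case_eur / test_cost_eur
    print(f"break-even incidence: 1 case per {tests_per_detected_case:,.0f} newborns screened")
    # prints "1 case per 20,000 newborns screened", matching the abstract's 1:20,000 estimate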
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liauw, Stanley L., E-mail: sliauw@radonc.uchicago.edu; Pitroda, Sean P.; Eggener, Scott E.
Purpose: To summarize the results of a 4-year period in which endorectal magnetic resonance imaging (MRI) was considered for all men referred for salvage radiation therapy (RT) at a single academic center; to describe the incidence and location of locally recurrent disease in a contemporary cohort of men with biochemical failure after radical prostatectomy (RP), and to identify prognostic variables associated with MRI findings in order to define which patients may have the highest yield of the study. Methods and Materials: Between 2007 and 2011, 88 men without clinically palpable disease underwent eMRI for detectable prostate-specific antigen (PSA) after RP. The median interval between RP and eMRI was 32 months (interquartile range, 14-57 months), and the median PSA level was 0.30 ng/mL (interquartile range, 0.19-0.72 ng/mL). Magnetic resonance imaging scans consisting of T2-weighted, diffusion-weighted, and dynamic contrast-enhanced imaging were evaluated for features consistent with local recurrence. The prostate bed was scored from 0-4, whereby 0 was definitely normal, 1 probably normal, 2 indeterminate, 3 probably abnormal, and 4 definitely abnormal. Local recurrence was defined as having a score of 3-4. Results: Local recurrence was identified in 21 men (24%). Abnormalities were best appreciated on T2-weighted axial images (90%) as focal hypointense lesions. Recurrence locations were perianastomotic (67%) or retrovesical (33%). The only risk factor associated with local recurrence was PSA; recurrence was seen in 37% of men with PSA >0.3 ng/mL vs 13% if PSA ≤0.3 ng/mL (P<.01). The median volume of recurrence was 0.26 cm³ and was directly associated with PSA (r=0.5, P=.02). The correlation between MRI-based tumor volume and PSA was even stronger in men with positive margins (r=0.8, P<.01). Conclusions: Endorectal MRI can define areas of local recurrence after RP in a minority of men without clinical evidence of disease, with yield related to PSA. Further study is necessary to determine whether eMRI can improve patient selection and success of salvage RT.
Hoffmann, Till; Assmann, Alexander; Dierksen, Angelika; Roussel, Elisabeth; Ullrich, Sebastian; Lichtenberg, Artur; Albert, Alexander; Sixt, Stephan
2018-04-18
Although off-label use of recombinant activated factor VII against refractory bleeding is incorporated in current guideline recommendations, safety concerns persist predominantly with respect to thromboembolic complications. We analyzed the safety and efficacy of recombinant activated factor VII at a very low dose in cardiosurgical patients with refractory bleeding. This prospective study includes 1180 cardiosurgical patients at risk of bleeding. Goal-directed substitution was based on real-time laboratory testing and clinical scoring of the bleeding intensity. All patients who fulfilled the criteria for enhanced risk of bleeding (n = 281) were consequently included in the present analysis. Patients in whom refractory bleeding developed despite substitution with specific hemostatic compounds (n = 167) received a single shot of very low-dose recombinant activated factor VII (≤20 μg/kg). Mortality and risk of thromboembolic complications, and freedom from stroke and acute myocardial infarction in particular, were analyzed (vs patients without recombinant activated factor VII) by multivariable logistic and Cox regression analyses, as well as Kaplan-Meier estimates. There was no increase in rates of mortality (30-day mortality 4.2% vs 7.0% with P = .418; follow-up survival 85.6% at 13.0 [interquartile range, 8.4-15.7] months vs 80.7% at 10.2 [interquartile range, 7.2-16.1] months with P = .151), thromboembolic complications (6.6% vs 9.6% with P = .637), renal insufficiency, need for percutaneous coronary intervention, duration of ventilation, duration of hospital stay, or rehospitalization in patients receiving very low-dose recombinant activated factor VII compared with patients not receiving recombinant activated factor VII. Complete hemostasis without any need for further hemostatic treatment was achieved after very low-dose recombinant activated factor VII administration in the majority of patients (up to 88.6% vs 0% with P < .001). The key results were confirmed after adjustment by propensity score-based analyses. When combined with early and specific restoration of hemostatic reserves after cardiac surgery, very low-dose recombinant activated factor VII treatment of refractory bleeding is effective and not associated with any apparent increase in adverse events. Copyright © 2018 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
Gener, G; Canoui-Poitrine, F; Revuz, J E; Faye, O; Poli, F; Gabison, G; Pouget, F; Viallette, C; Wolkenstein, P; Bastuji-Garin, S
2009-01-01
Antibiotics are frequently used to treat hidradenitis suppurativa (HS); however, few data on their efficacy are available. The aim was to evaluate the efficacy of a combination of systemic clindamycin (300 mg twice daily) and rifampicin (600 mg daily) in the treatment of patients with severe HS. Patients (n = 116) who received this combination were studied retrospectively. The main outcome measure was the severity of the disease, assessed by the Sartorius score, before and after 10 weeks of treatment. The Sartorius score improved markedly over the course of treatment, from a median of 29 (interquartile range = 14.5) before treatment to 14.5 (interquartile range = 11) at the end of treatment (p < 0.001), as did other parameters of severity as well as the quality of life score. Eight patients (6.9%) stopped the treatment because of side effects. The combination of clindamycin and rifampicin is effective in the treatment of severe HS. Copyright 2009 S. Karger AG, Basel.
Geographic variability of adherence to occupational injury treatment guidelines.
Trujillo, Antonio J; Heins, Sara E; Anderson, Gerard F; Castillo, Renan C
2014-12-01
The aim was to determine the geographic variability in compliance with six occupational injury practice guidelines and the relationships between them. Guidelines were developed by an expert panel and evaluated using workers' compensation claims data from a large, national insurance company (1999 to 2010). Percentage compliance for each guideline was adjusted for age and sex using linear regression and mapped by hospital referral region. Regions with the lowest compliance were identified, and correlations between guidelines were calculated. Compliance with the unnecessary home care guideline showed the lowest geographic variation (interquartile range: 97.3 to 99.0), and inappropriate shoulder bracing showed the highest variation (interquartile range: 77.7 to 90.8). Correlation between the guidelines was weak and not always positive. Different guidelines showed different degrees of geographic variation. The lack of correlation between guidelines suggests that these indicators were not associated with a single underlying health care quality or patient severity construct.
ERIC Educational Resources Information Center
Turegun, Mikhail
2011-01-01
Traditional curricular materials and pedagogical strategies have not been effective in developing students' conceptual understanding of statistics topics or their statistical reasoning abilities. Many of the changes proposed by statistics education research and the reform movement over the past decade have supported efforts to transform teaching…
Kumwenda, Chiza; Hemsworth, Jaimie; Phuka, John; Ashorn, Ulla; Arimond, Mary; Maleta, Kenneth; Prado, Elizabeth L; Haskell, Marjorie J; Dewey, Kathryn G; Ashorn, Per
2018-07-01
World Health Organization recommends exclusive breastfeeding for infants for the first 6 months of life, followed by introduction of nutritious complementary foods alongside breastfeeding. Breast milk remains a significant source of nourishment in the second half of infancy and beyond; however, it is not clear whether more breast milk is always better. The present study was designed to determine the association between amount of breast milk intake at 9-10 months of age and infant growth and development by 12-18 months of age. The study was nested in a randomized controlled trial conducted in Malawi. Regression analysis was used to determine associations between breast milk intake and growth and development. Mean (SD) breast milk intake at 9-10 months of age was 752 (244) g/day. Mean (SD) length-for-age z-score at 12 months and change in length-for-age z-score between 12 and 18 months were -1.69 (1.0) and -0.17 (0.6), respectively. At 18 months, mean (SD) expressive vocabulary score was 32 (24) words and median (interquartile range) skills successfully performed for fine, gross, and overall motor skills were 21 (19-22), 18 (16-19), and 38 (26-40), respectively. Breast milk intake (g/day) was not associated with either growth or development. Proportion of total energy intake from breast milk was negatively associated with fine motor (β = -0.18, p = .015) but not other developmental scores in models adjusted for potential confounders. Among Malawian infants, neither breast milk intake nor percent of total energy intake from breast milk at 9-10 months was positively associated with subsequent growth between 12 and 18 months, or development at 18 months. © 2018 John Wiley & Sons Ltd.
Long-term persistence of oral human papillomavirus type 16: the HPV Infection in Men (HIM) study.
Pierce Campbell, Christine M; Kreimer, Aimée R; Lin, Hui-Yi; Fulp, William; O'Keefe, Michael T; Ingles, Donna J; Abrahamsen, Martha; Villa, Luisa L; Lazcano-Ponce, Eduardo; Giuliano, Anna R
2015-03-01
Persistent infection with oral HPV16 is believed to drive the development of most oropharyngeal cancers. However, patterns of oral HPV16 persistence remain understudied, particularly among HIV-negative individuals. Oral HPV16 persistence was evaluated among 1,626 participants of the HPV Infection in Men (HIM) Study. Twenty-three oral HPV16-positive men who provided an oral gargle sample on ≥2 study visits were included in the analysis. Archived oral samples from all follow-up visits were tested for HPV16 using Linear Array and INNO-LiPA detection methods. Persistence was evaluated using consecutive HPV16-positive visits held approximately 6 months apart and using the Kaplan-Meier method. Oral HPV16-positive men were aged 18 to 64 years [median, 36 years; interquartile range (IQR), 25-42] and were followed for a median of 44.4 months (IQR, 29.9-49.5). Of 13 incident infections, 4 (30.8%) persisted ≥12 months, 1 (10.0%) persisted ≥24 months, and none persisted ≥36 months [median infection duration, 7.3 months; 95% confidence interval (CI), 6.4-NA)]. Of 10 prevalent infections, 9 (90.0%) persisted ≥12 months, 8 (80.0%) persisted ≥24 months, 4 (57.1%) persisted ≥36 months, and 2 (40.0%) persisted ≥48 months (median infection duration, NA). Twelve-month persistence of incident infections increased significantly with age (Ptrend = 0.028). Prevalent oral HPV16 infections in men persisted longer than newly acquired infections, and persistence appeared to increase with age. These findings may explain the high prevalence of oral HPV observed at older ages. Understanding oral HPV16 persistence will aid in the identification of men at high-risk of developing HPV-related oropharyngeal cancer. ©2015 American Association for Cancer Research.
Clinical and Endoscopic Features of Metastatic Tumors in the Stomach
Kim, Ga Hee; Ahn, Ji Yong; Jung, Hwoon-Yong; Park, Young Soo; Kim, Min-Ju; Choi, Kee Don; Lee, Jeong Hoon; Choi, Kwi-Sook; Kim, Do Hoon; Lim, Hyun; Song, Ho June; Lee, Gin Hyug; Kim, Jin-Ho
2015-01-01
Background/Aims: Metastasis to the stomach is rare. The aim of this study was to describe and analyze the clinical outcomes of cancers that metastasized to the stomach. Methods: We reviewed the clinicopathological aspects of patients with gastric metastases from solid organ tumors. Thirty-seven cases were identified, and we evaluated the histology, initial presentation, imaging findings, lesion locations, treatment courses, and overall patient survival. Results: Endoscopic findings indicated that solitary lesions presented more frequently than multiple lesions, and a submucosal tumor-like appearance was the most common. Malignant melanoma was the tumor that most frequently metastasized to the stomach. Twelve patients received treatment after the diagnosis of gastric metastasis. The median survival period from the diagnosis of gastric metastasis was 3.0 months (interquartile range, 1.0 to 11.0 months). Patients with solitary lesions and patients who received any treatment survived longer after the diagnosis of metastatic cancer than patients with multiple lesions and patients who did not receive any treatment. Conclusions: Proper treatment with careful consideration of the primary tumor characteristics can increase the survival period in patients with tumors that metastasize to the stomach, especially in cases with a solitary metastatic lesion on endoscopy. PMID:25473071
Brizola, Evelise; Zambrano, Marina Bauer; Pinheiro, Bruna de Souza; Vanz, Ana Paula; Félix, Têmis Maria
2017-01-01
To characterize the fracture pattern and the clinical history at the time of diagnosis of osteogenesis imperfecta. In this retrospective study, all patients with osteogenesis imperfecta, of both genders, aged 0-18 years, who were treated between 2002 and 2014 were included. Medical records were assessed to collect clinical data, including the presence of blue sclerae, dentinogenesis imperfecta, positive familial history of osteogenesis imperfecta, and the site of the fractures. In addition, radiographic findings at the time of the diagnosis were reviewed. Seventy-six patients (42 females) were included in the study. Individuals' age ranged from 0 to 114 months, with a median (interquartile range) age of 38 (6-96) months. Blue sclerae were present in 93.4% of patients, dentinogenesis imperfecta was observed in 27.6% of patients, and wormian bones in 29.4% of them. The number of fractures at diagnosis ranged from 0 to 17, with a median of 3 (2-8) fractures. Forty (57%) patients had fractures of the upper and lower extremities, and 9 patients also had spinal fractures. The diagnosis was performed at birth in 85.7% of patients with type 3, and 39.3% of those with type 4/5 of the disorder. Osteogenesis imperfecta is a genetic disorder with distinctive clinical features such as bone fragility, recurrent fractures, blue sclerae, and dentinogenesis imperfecta. It is important to know how to identify these characteristics in order to facilitate the diagnosis, optimize the treatment, and differentiate osteogenesis imperfecta from other disorders that also can lead to fractures.
Medicare Part D payments for neurologist-prescribed drugs
Burke, James F.; Kerber, Kevin A.; Skolarus, Lesli E.; Callaghan, Brian C.
2016-01-01
Objective: To describe neurologists' Medicare Part D prescribing patterns and the potential effect of generic substitutions and price negotiation, which is currently prohibited. Methods: The 2013 Medicare Part D Prescriber Public Use and Summary files were used. Payments for medications were aggregated by provider and drug (brand or generic). Payment, proportion of generic claims or day's supply, and median payment per monthly supply of medication were calculated by physician specialty and drug. Savings from generic substitution were estimated for brand drugs with a generic available. Medicare prices were compared to drug prices negotiated by the federal government with pharmaceutical manufacturers for the Veterans Administration (VA). Results: Neurologists comprised 13,060 (1.2%) providers with $5.0 billion (4.8%) in total payments, third highest of all specialties, with a median monthly payment of $141 (interquartile range $85–225). Multiple sclerosis drugs had the highest payments ($1.8 billion). Within neurologic disease groups ($3.4 billion in payments), 54.2%–91.8% of monthly supplies were generic, but 11.9%–71.3% of the payment was for generic medications. Generic substitution resulted in a $269 million (6.5%) payment decrease. VA pricing resulted in $1.5 billion (44.5% of $3.4 billion) in savings. Conclusions: High payment per monthly supply of medication underlies the high total neurology drug payments and is driven by multiple sclerosis drugs. Lowering drug expenditures by Medicare should focus on drug prices. PMID:27009256
Kong, Angela; Beresford, Shirley A A; Alfano, Catherine M; Foster-Schubert, Karen E; Neuhouser, Marian L; Johnson, Donna B; Duggan, Catherine; Wang, Ching-Yun; Xiao, Liren; Jeffery, Robert W; Bain, Carolyn E; McTiernan, Anne
2012-09-01
Lifestyle-based interventions, which typically promote various behavior modification strategies, can serve as a setting for evaluating specific behaviors and strategies thought to promote or hinder weight loss. The aim of our study was to test the associations of self-monitoring (ie, self-weighing and food journal completion) and eating-related (ie, dietary intake, diet-related weight-control strategies, and meal patterns) behaviors with weight loss in a sample of postmenopausal overweight-to-obese women enrolled in a 12-month dietary weight loss intervention. Changes in body weight and adoption of self-monitoring and eating-related behaviors were assessed in 123 participants. Generalized linear models tested associations of these behaviors with 12-month weight change after adjusting for potential confounders. Mean percent weight loss was 10.7%. In the final model, completing more food journals was associated with a greater percent weight loss (interquartile range 3.7% greater weight loss; P<0.0001), whereas skipping meals (4.3% lower weight loss; P<0.05) and eating out for lunch (at least once a week, 2.5% lower weight loss; P<0.01) were associated with a lower amount of weight loss. These findings suggest that a greater focus on dietary self-monitoring, home-prepared meals, and consuming meals at regular intervals may improve 12-month weight loss among postmenopausal women enrolled in a dietary weight loss intervention. Copyright © 2012 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
Prenatal and early-life polychlorinated biphenyl (PCB) levels and behavior in Inuit preschoolers.
Verner, Marc-André; Plusquellec, Pierrich; Desjardins, Justine Laura; Cartier, Chloé; Haddad, Sami; Ayotte, Pierre; Dewailly, Éric; Muckle, Gina
2015-05-01
Whereas it is well established that prenatal exposure to polychlorinated biphenyls (PCBs) can disrupt children's behavior, early postnatal exposure has received relatively little attention in environmental epidemiology. The aim was to evaluate prenatal and postnatal exposures to PCB-153, a proxy of total PCB exposure, and their relation to inattention and activity in 5-year-old Inuits from the Cord Blood Monitoring Program. Prenatal exposure to PCBs was informed by cord plasma PCB-153 levels. We used a validated pharmacokinetic model to estimate infants' monthly levels across the first year of life. Inattention and activity were assessed by coding of video recordings of children undergoing fine motor testing. We used multivariable linear regression to evaluate the association between prenatal and postnatal PCB-153 levels and inattention (n=97) and activity (n=98) at 5 years of age. Cord plasma PCB-153 was not associated with inattention or activity. Each interquartile range (IQR) increase in estimated infant PCB-153 levels at 2 months was associated with a 1.02% increase in the duration of inattention (95% CI: 0.04, 2.00). Statistical adjustment for the duration of breastfeeding slightly increased regression coefficients for postnatal level estimates, some of which became statistically significant for inattention (months 2-4) and activity (months 2-5). Our study adds to the growing evidence of postnatal windows of development during which children are more susceptible to neurotoxicants like PCBs. Copyright © 2015 Elsevier Ltd. All rights reserved.
Keim, Sarah A; Daniels, Julie L; Siega-Riz, Anna Maria; Herring, Amy H; Dole, Nancy; Scheidt, Peter C
2012-10-01
The aim of this study was to examine infant feeding and the long-chain polyunsaturated fatty acid (LCPUFA) concentration of breast milk and formulas in relation to infant development. The prospective Pregnancy, Infection and Nutrition Study (n=358) collected data on breastfeeding, breast milk samples and the formulas fed through 4 months post-partum. At 12 months of age, infants' development was assessed (Mullen Scales of Early Learning). Linear regression was used to examine development in relation to breastfeeding, breast milk docosahexaenoic acid (DHA) and arachidonic acid (AA) concentration, and DHA and AA concentration from the combination of breast milk and formula. The median breast milk DHA concentration was 0.20% of total fatty acids [interquartile range (IQR)=0.14, 0.34]; median AA concentration was 0.52% (IQR=0.44, 0.63). Upon adjustment for preterm birth, sex, smoking, race and ethnicity, and education, breastfeeding exclusivity was unrelated to development. Among infants exclusively breastfed, breast milk LCPUFA concentration was not associated with development (Mullen composite, DHA: adjusted β=-1.3, 95% confidence interval: -10.3, 7.7). Variables combining DHA and AA concentrations from breast milk and formula, weighted by their contribution to diet, were unassociated with development. We found no evidence of enhanced infant development related to the LCPUFA content of breast milk or formula consumed during the first four post-natal months. © 2011 Blackwell Publishing Ltd.
Fassier, Thomas; Darmon, Michel; Laplace, Christian; Chevret, Sylvie; Schlemmer, Benoit; Pochard, Frédéric; Azoulay, Elie
2007-01-01
Providing family members with clear, honest, and timely information is a major task for intensive care unit physicians. Time spent informing families has been associated with effectiveness of information but has not been measured in specifically designed studies. The aim was to measure the time spent informing families of intensive care unit patients in a one-day cross-sectional study in 90 intensive care units in France, in which the time spent by physicians informing the families of each of 951 patients hospitalized in the intensive care unit during a 24-hr period was clocked. Median family information time was 16 (interquartile range, 8-30) mins per patient, with 20% of the time spent explaining the diagnosis, 20% on explaining treatments, and 60% on explaining the prognosis. One third of the time was spent listening to family members. Multivariable analysis identified one factor associated with less information time (room with more than one bed) and seven factors associated with more information time, including five patient-related factors (surgery on the study day, higher Logistic Organ Dysfunction score, coma, mechanical ventilation, and worsening clinical status) and two family-related factors (first contact with family and interview with the spouse). Median information time was 20 (interquartile range, 10-39) mins when three factors were present and 106.5 (interquartile range, 103-110) mins when five were present. This study identifies factors associated with information time provided by critical care physicians to family members of critically ill patients. Whether information time correlates with communication difficulties or communication skills needs to be evaluated. Information time provided by residents and nurses should also be studied.
Videogame playing as distraction technique in course of venipuncture.
Minute, M; Badina, L; Cont, G; Montico, M; Ronfani, L; Barbi, E; Ventura, A
2012-01-01
Needle-related procedures (venipuncture, intravenous cannulation) are the most common source of pain and distress for children. Reducing needle-related pain and anxiety could be important in order to prevent further distress, especially for children needing multiple hospital admissions. The aim of the present open randomized controlled trial was to investigate the efficacy of adding an active distraction strategy (videogame) to EMLA premedication for needle-related pain in children. One hundred and nine children (4-10 years of age) were prospectively recruited to enter the study. Ninety-seven were randomized into two groups: the CC group (conventional care: EMLA only) as the control group and the AD group (active distraction: EMLA plus videogame) as the intervention group. Outcome measures were self-reported pain by means of the FPS-R scale (main study outcome), observer-reported pain by the FLACC scale, and the number of attempts needed for a successful procedure. In both groups the median FPS-R score was 0 (interquartile range: 0-2), with significant pain (FPS-R > 4) reported by 9% of subjects. The median FLACC score was 1 in both groups (interquartile range 0-3 in the CC group; 0-2 in the AD group). The percentage of children with major pain (FLACC > 4) was 18% in the CC group and 9% in the AD group (p = 0.2). The median number of attempts needed for a successful procedure was 1 (interquartile range 1-2) in both groups. Active distraction did not improve EMLA analgesia for intravenous cannulation and venipuncture. Nevertheless, it was an easily applicable strategy that was appreciated by the children. This technique could be usefully investigated in other painful procedures.
Foley, J
2008-03-01
The aim was to develop baseline data in relation to paediatric minor oral surgical procedures undertaken with both general anaesthesia and nitrous oxide inhalation sedation within a Hospital Dental Service. Data were collected prospectively over a three-year period from May 2003 to June 2006 for patients attending the Departments of Paediatric Dentistry, Dundee Dental Hospital and Ninewells Hospital, NHS Tayside, Great Britain, for all surgical procedures undertaken with either inhalation sedation or general anaesthetic. Both operator status and the procedure being undertaken were noted. In addition, the operating time was recorded. Data for 166 patients (F: 102; M: 64) with a median age of 12.50 (inter-quartile range 10.00, 14.20) years showed that 195 surgical procedures were undertaken. Of these, 160 were performed with general anaesthetic and 35 with sedation. The surgical removal of impacted, carious and supernumerary unit(s) accounted for 53.8% of all procedures, whilst the exposure of impacted teeth and soft tissue surgery represented 34.9% and 11.3% of procedures respectively. The median surgical time for techniques undertaken with sedation was 30.00 (inter-quartile range 25.00, 43.50) minutes, whilst that for general anaesthetic was similar at 30.00 (inter-quartile range 15.25, 40.00) minutes (not statistically significant; Mann-Whitney U test, W = 3081.5, P = 0.331). The majority of paediatric minor oral surgical procedures entail surgical exposure or removal of impacted teeth. The median treatment time for most procedures undertaken with either general anaesthetic or nitrous oxide sedation was 30 minutes.
Mehra, Tarun; Koljonen, Virve; Seifert, Burkhardt; Volbracht, Jörk; Giovanoli, Pietro; Plock, Jan; Moos, Rudolf Maria
2015-01-01
Reimbursement systems have difficulty reflecting the actual cost of burn treatment, leaving care providers with a significant financial burden. Our aim was to establish a simple and accurate reimbursement model compatible with prospective payment systems. A total of 370 966 electronic medical records of patients discharged in 2012 to 2013 from Swiss university hospitals were reviewed. A total of 828 cases of burns, including 109 cases of severe burns, were retained. Costs, revenues and earnings for severe and nonsevere burns were analysed, and a linear regression model predicting total inpatient treatment costs was established. The median total cost per case for severe burns was tenfold higher than for nonsevere burns (179 949 CHF [167 353 EUR] vs 11 312 CHF [10 520 EUR], interquartile ranges 96 782-328 618 CHF vs 4 874-27 783 CHF, p <0.001). The median earnings per case for nonsevere burns were 588 CHF (547 EUR) (interquartile range -6 720 to 5 354 CHF), whereas severe burns incurred a large financial loss to care providers, with median earnings of -33 178 CHF (-30 856 EUR) (interquartile range -95 533 to 23 662 CHF). Differences were highly significant (p <0.001). Our linear regression model predicting total costs per case, with length of stay (LOS) as the independent variable, had an adjusted R² of 0.67 (p <0.001 for LOS). Severe burns are systematically underfunded within the Swiss reimbursement system. Flat-rate DRG-based refunds poorly reflect the actual treatment costs. In conclusion, we suggest a reimbursement model based on a per diem rate for treatment of severe burns.
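A minimal sketch of the kind of cost model described (ordinary least squares of total cost on length of stay, with an adjusted R²); this is an illustration only, not the authors' model, and the length-of-stay and cost values below are hypothetical:

    import numpy as np

    los_days = np.array([3, 5, 8, 12, 20, 35, 60], dtype=float)            # hypothetical lengths of stay (days)
    cost_chf = np.array([9e3, 1.4e4, 2.3e4, 3.1e4, 6.0e4, 1.1e5, 1.9e5])   # hypothetical total costs per case (CHF)

    X = np.column_stack([np.ones_like(los_days), los_days])                # design matrix with intercept
    beta, *_ = np.linalg.lstsq(X, cost_chf, rcond=None)                    # ordinary least squares fit
    pred = X @ beta

    ss_res = np.sum((cost_chf - pred) ** 2)
    ss_tot = np.sum((cost_chf - cost_chf.mean()) ** 2)
    r2 = 1 - ss_res / ss_tot
    n, p = len(cost_chf), 1                                                # one predictor (LOS)
    adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)

    print(f"intercept = {beta[0]:.0f} CHF, per-day rate = {beta[1]:.0f} CHF, adjusted R² = {adj_r2:.2f}")

The fitted per-day coefficient is what a per diem reimbursement rate, as suggested in the conclusion, would approximate.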
Fecal Markers of Environmental Enteropathy and Subsequent Growth in Bangladeshi Children.
Arndt, Michael B; Richardson, Barbra A; Ahmed, Tahmeed; Mahfuz, Mustafa; Haque, Rashidul; John-Stewart, Grace C; Denno, Donna M; Petri, William A; Kosek, Margaret; Walson, Judd L
2016-09-07
Environmental enteropathy (EE), a subclinical intestinal disorder characterized by mucosal inflammation, reduced barrier integrity, and malabsorption, appears to be associated with increased risk of stunting in children in low- and middle-income countries. Fecal biomarkers indicative of EE (neopterin [NEO], myeloperoxidase [MPO], and alpha-1-antitrypsin [AAT]) have been negatively associated with 6-month linear growth. Associations between fecal markers (NEO, MPO, and AAT) and short-term linear growth were examined in a birth cohort of 246 children in Bangladesh. Marker concentrations were categorized in stool samples based on their distribution (< first quartile, interquartile range, > third quartile), and a 10-point composite EE score was calculated. Piecewise linear mixed-effects models were used to examine the association between markers measured quarterly (in months 3-21, 3-9, and 12-21) and 3-month change in length-for-age z-score (ΔLAZ). Children with high MPO levels at quarterly time points lost significantly more LAZ per 3-month period during the second year of life than those with low MPO (ΔLAZ = -0.100; 95% confidence interval = -0.167 to -0.032). AAT and NEO were not associated with growth; however, composite EE score was negatively associated with subsequent 3-month growth. In this cohort of children from an urban setting in Bangladesh, elevated MPO levels, but not NEO or AAT levels, were associated with decreases in short-term linear growth during the second year of life, supporting previous data suggesting the relevance of MPO as a marker of EE. © The American Society of Tropical Medicine and Hygiene.
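As an illustration of the categorization step (a sketch only: the three-level scoring by distribution follows the abstract's description, but the marker values are hypothetical and the composite 10-point score is not reproduced here):

    import numpy as np

    # Hypothetical fecal myeloperoxidase (MPO) concentrations for a group of children
    mpo = np.array([150.0, 300.0, 600.0, 1200.0, 2500.0, 3400.0, 4700.0, 8900.0])

    q1, q3 = np.percentile(mpo, [25, 75])
    # 0 = below the first quartile, 1 = within the interquartile range, 2 = above the third quartile
    mpo_category = np.where(mpo < q1, 0, np.where(mpo > q3, 2, 1))
    print(list(zip(mpo, mpo_category)))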
Amigo, Hugo; Bustos, Patricia; Muzzo, Santiago; Alarcón, Ana María; Muñoz, Sergio
2010-08-01
Early onset of menarche has been linked to prevalence of obesity; however, this may differ for indigenous females. The aim was to analyse the relationship between age of menarche and nutritional status among indigenous and non-indigenous girls. The design of this study was cross-sectional. Date of menarche was determined through interviews, and all responses were confirmed by the girls' mothers. A total of 8504 adolescents were screened for recent menarche. One hundred and thirty-one girls of Mapuche (indigenous) and 143 girls of Chilean-Spanish background were identified and evaluated by anthropometric measurements. Median age of menarche was 150 months, interquartile range (IR) 143-157, in indigenous girls and 145.5 months, IR 139-153, in non-indigenous girls (p = 0.04). The indigenous females showed a higher prevalence of overweight (36.4% vs 23.1%), although the frequency of obesity was similar (16.8% vs 16.3%). For indigenous girls, age of menarche was delayed by 2.69 months (confidence interval (CI) -0.38 to 5.77). Girls with overweight experienced menarche 7.59 months earlier than those with normal weight (CI -10.78 to -4.41). In the analysis of obesity, the effect on age of menarche was similar, with onset 7.53 months earlier than for those with normal weight (CI -11.34 to -3.72). Age of menarche occurred earlier than previously reported and earlier still in girls with overweight or obesity, whereas indigenous background was not associated with it.
Self-learning basic life support: A randomised controlled trial on learning conditions.
Pedersen, Tina Heidi; Kasper, Nina; Roman, Hari; Egloff, Mike; Marx, David; Abegglen, Sandra; Greif, Robert
2018-05-01
To investigate whether pure self-learning without instructor support, resulted in the same BLS-competencies as facilitator-led learning, when using the same commercially available video BLS teaching kit. First-year medical students were randomised to either BLS self-learning without supervision or facilitator-led BLS-teaching. Both groups used the MiniAnne kit (Laerdal Medical, Stavanger, Norway) in the students' local language. Directly after the teaching and three months later, all participants were tested on their BLS-competencies in a simulated scenario, using the Resusci Anne SkillReporter™ (Laerdal Medical, Stavanger, Norway). The primary outcome was percentage of correct cardiac compressions three months after the teaching. Secondary outcomes were all other BLS parameters recorded by the SkillReporter and parameters from a BLS-competence rating form. 240 students were assessed at baseline and 152 students participated in the 3-month follow-up. For our primary outcome, the percentage of correct compressions, we found a median of 48% (interquartile range (IQR) 10-83) for facilitator-led learning vs. 42% (IQR 14-81) for self-learning (p = 0.770) directly after the teaching. In the 3-month follow-up, the rate of correct compressions dropped to 28% (IQR 6-59) for facilitator-led learning (p = 0.043) and did not change significantly in the self-learning group (47% (IQR 12-78), p = 0.729). Self-learning is not inferior to facilitator-led learning in the short term. Self-learning resulted in a better retention of BLS-skills three months after training compared to facilitator-led training. Copyright © 2018 Elsevier B.V. All rights reserved.
Ji, Ruijun; Du, Wanliang; Shen, Haipeng; Pan, Yuesong; Wang, Penglian; Liu, Gaifen; Wang, Yilong; Li, Hao; Zhao, Xingquan; Wang, Yongjun
2014-11-25
Acute ischemic stroke (AIS) is one of the leading causes of death and adult disability worldwide. In the present study, we aimed to develop a web-based risk model for predicting dynamic functional status at discharge, 3-month, 6-month, and 1-year after acute ischemic stroke (Dynamic Functional Status after Acute Ischemic Stroke, DFS-AIS). The DFS-AIS was developed based on the China National Stroke Registry (CNSR), in which eligible patients were randomly divided into derivation (60%) and validation (40%) cohorts. Good functional outcome was defined as modified Rankin Scale (mRS) score ≤ 2 at discharge, 3-month, 6-month, and 1-year after AIS, respectively. Independent predictors of each outcome measure were obtained using multivariable logistic regression. The area under the receiver operating characteristic curve (AUROC) and plot of observed and predicted risk were used to assess model discrimination and calibration. A total of 12,026 patients were included and the median age was 67 (interquartile range: 57-75). The proportion of patients with good functional outcome at discharge, 3-month, 6-month, and 1-year after AIS was 67.9%, 66.5%, 66.9% and 66.9%, respectively. Age, gender, medical history of diabetes mellitus, stroke or transient ischemic attack, current smoking and atrial fibrillation, pre-stroke dependence, pre-stroke statins using, admission National Institutes of Health Stroke Scale score, admission blood glucose were identified as independent predictors of functional outcome at different time points after AIS. The DFS-AIS was developed from sets of predictors of mRS ≤ 2 at different time points following AIS. The DFS-AIS demonstrated good discrimination in the derivation and validation cohorts (AUROC range: 0.837-0.845). Plots of observed versus predicted likelihood showed excellent calibration in the derivation and validation cohorts (all r = 0.99, P < 0.001). When compared to 8 existing models, the DFS-AIS showed significantly better discrimination for good functional outcome and mortality at discharge, 3-month, 6-month, and 1-year after AIS (all P < 0.0001). The DFS-AIS is a valid risk model to predict functional outcome at discharge, 3-month, 6-month, and 1-year after AIS.
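A minimal sketch of the modelling approach described (multivariable logistic regression for a binary good-outcome label, with discrimination measured by the area under the ROC curve on a held-out validation split); the predictor names, coefficients, and data below are synthetic illustrations, not the CNSR variables or the DFS-AIS model itself:

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 2000
    # Hypothetical baseline predictors: age, admission NIHSS-like severity score, admission glucose
    X = np.column_stack([
        rng.normal(67, 10, n),
        rng.gamma(2.0, 4.0, n),
        rng.normal(7.0, 2.0, n),
    ])
    # Synthetic outcome: probability of a good outcome (mRS <= 2) decreases with each predictor
    logit = 4.0 - 0.03 * X[:, 0] - 0.15 * X[:, 1] - 0.10 * X[:, 2]
    y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    # 60/40 derivation/validation split, mirroring the study design
    X_dev, X_val, y_dev, y_val = train_test_split(X, y, train_size=0.6, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)

    auroc = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])  # discrimination in the validation cohort
    print(f"validation AUROC = {auroc:.3f}")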
Data Analysis and Statistical Methods for the Assessment and Interpretation of Geochronologic Data
NASA Astrophysics Data System (ADS)
Reno, B. L.; Brown, M.; Piccoli, P. M.
2007-12-01
Ages are traditionally reported as a weighted mean with an uncertainty based on least squares analysis of analytical error on individual dates. This method does not take into account geological uncertainties, and cannot accommodate asymmetries in the data. In most instances, this method will understate uncertainty on a given age, which may lead to over-interpretation of age data. Geologic uncertainty is difficult to quantify, but is typically greater than analytical uncertainty. These factors make traditional statistical approaches inadequate to fully evaluate geochronologic data. We propose a protocol to assess populations within multi-event datasets and to calculate age and uncertainty from each population of dates interpreted to represent a single geologic event using robust and resistant statistical methods. To assess whether populations thought to represent different events are statistically separate, exploratory data analysis is undertaken using a box plot, where the range of the data is represented by a 'box' of length given by the interquartile range, divided at the median of the data, with 'whiskers' that extend to the furthest data point that lies within 1.5 times the interquartile range beyond the box. If the boxes representing the populations do not overlap, they are interpreted to represent statistically different sets of dates. Ages are calculated from statistically distinct populations using a robust tool such as the tanh method of Kelsey et al. (2003, CMP, 146, 326-340), which is insensitive to any assumptions about the underlying probability distribution from which the data are drawn. Therefore, this method takes into account the full range of data, and is not drastically affected by outliers. The interquartile range of each population of dates gives a first pass at expressing uncertainty, which accommodates asymmetry in the dataset; outliers have only a minor effect on the uncertainty. To better quantify the uncertainty, a resistant tool that is insensitive to local misbehavior of data is preferred, such as the normalized median absolute deviation proposed by Powell et al. (2002, Chem Geol, 185, 191-204). We illustrate the method using a dataset of 152 monazite dates determined using EPMA chemical data from a single sample from the Neoproterozoic Brasília Belt, Brazil. Results are compared with ages and uncertainties calculated using traditional methods to demonstrate the differences. The dataset was manually culled into three populations representing discrete compositional domains within chemically zoned monazite grains. The weighted mean ages and least squares uncertainties for these populations are 633±6 (2σ) Ma for a core domain, 614±5 (2σ) Ma for an intermediate domain and 595±6 (2σ) Ma for a rim domain. Probability distribution plots indicate asymmetric distributions for all populations, which cannot be accounted for with traditional statistical tools. The three domains record distinct ages, each lying outside the interquartile ranges of the other populations of dates, with the core domain lying in the subrange 642-624 Ma, the intermediate domain in 617-609 Ma and the rim domain in 606-589 Ma. The tanh estimator yields ages of 631±7 Ma (2σ) for the core domain, 616±7 Ma (2σ) for the intermediate domain and 601±8 Ma (2σ) for the rim domain.
Whereas the uncertainties derived using a resistant statistical tool are larger than those derived from traditional statistical tools, the method yields more realistic uncertainties that better address the spread in the dataset and account for asymmetry in the data.
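A minimal sketch of the exploratory step and the resistant uncertainty estimate described above (an illustration only, not the authors' implementation; the date values are hypothetical and the 1.4826 factor is the usual normal-consistency scaling of the median absolute deviation):

    import numpy as np

    dates_ma = np.array([642.0, 638.0, 635.0, 633.0, 631.0, 630.0, 628.0, 626.0, 624.0, 620.0])  # hypothetical dates (Ma)

    q1, med, q3 = np.percentile(dates_ma, [25, 50, 75])
    iqr = q3 - q1                                   # interquartile range of the population
    lower_fence = q1 - 1.5 * iqr                    # box-plot whiskers: furthest points within 1.5 x IQR of the box
    upper_fence = q3 + 1.5 * iqr
    outliers = dates_ma[(dates_ma < lower_fence) | (dates_ma > upper_fence)]

    mad = np.median(np.abs(dates_ma - med))         # median absolute deviation
    nmad = 1.4826 * mad                             # normalized MAD, a resistant spread estimate

    print(f"median = {med:.1f} Ma, IQR = {q1:.1f}-{q3:.1f} Ma ({iqr:.1f} Ma)")
    print(f"whisker fences = [{lower_fence:.1f}, {upper_fence:.1f}] Ma, outliers = {outliers}")
    print(f"resistant uncertainty (nMAD) = +/-{nmad:.1f} Ma")

Non-overlapping boxes (interquartile ranges) for two such populations would be read, following the protocol above, as statistically distinct events.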
Olivotto, Iacopo; Maron, Barry J; Appelbaum, Evan; Harrigan, Caitlin J; Salton, Carol; Gibson, C Michael; Udelson, James E; O'Donnell, Christopher; Lesser, John R; Manning, Warren J; Maron, Martin S
2010-07-15
In hypertrophic cardiomyopathy (HCM), the clinical significance attributable to the broad range of left ventricular (LV) systolic function, assessed as the ejection fraction (EF), is incompletely resolved. We evaluated the EF using cardiovascular magnetic resonance (CMR) imaging in a large cohort of patients with HCM with respect to the clinical status and evidence of left ventricular remodeling with late gadolinium enhancement (LGE). CMR imaging was performed in 310 consecutive patients, aged 42 +/- 17 years. The EF in patients with HCM was 71 +/- 10% (range 28% to 89%), exceeding that of 606 healthy controls without cardiovascular disease (66 +/- 5%, p <0.001). LGE reflecting LV remodeling showed an independent, inverse relation to the EF (B-0.69, 95% confidence interval -0.86 to -0.52; p <0.001) and was greatest in patients with an EF <50%, in whom it constituted a median value of 29% of the LV volume (interquartile range 16% to 40%). However, the substantial subgroup with low-normal EF values of 50% to 65% (n = 45; 15% of the whole cohort), who were mostly asymptomatic or mildly symptomatic (37 or 82% with New York Heart Association functional class I to II), showed substantial LGE (median 5% of LV volume, interquartile range 2% to 10%). This overlapped with the subgroup with systolic dysfunction and significantly exceeded that of patients with an EF of 66% to 75% and >75% (median 2% of the LV volume, interquartile range 1.5% to 4%; p <0.01). In conclusion, in a large cohort of patients with HCM, a subset of patients with low-normal EF values (50% to 65%) was identified by contrast-enhanced CMR imaging as having substantial degrees of LGE, suggesting a transition phase, potentially heralding advanced LV remodeling and systolic dysfunction, with implications for clinical surveillance and management. Copyright (c) 2010. Published by Elsevier Inc.
Ivers, Louise C; Teng, Jessica E; Jerome, J Gregory; Bonds, Matthew; Freedberg, Kenneth A; Franke, Molly F
2014-04-01
The epidemics of food insecurity, malnutrition, and human immunodeficiency virus (HIV) frequently overlap. HIV treatment programs increasingly provide nutrient-dense ready-to-use supplementary foods (RUSFs) to patients living with HIV and food insecurity, but in the absence of wasting, it is not known if RUSF confers benefit above less costly food commodities. We performed a randomized trial in rural Haiti comparing an RUSF with less costly corn-soy blend plus (CSB+) as a monthly supplement to patients with HIV infection who were on antiretroviral therapy (ART) <24 months prior to study start. We compared 6- and 12-month outcomes by ration type in terms of immunologic response, body mass index (BMI), adherence to ART, general health quality of life, household food insecurity, and household wealth. A cohort of 524 patients with HIV receiving ART was randomized and followed over time. Median CD4 cell count at baseline was 339 cells/µL (interquartile range [IQR], 197-475 cells/µL) for the CSB+ group, and 341 cells/µL (IQR, 213-464/µL) for the RUSF group. Measured outcomes improved from baseline over time, but there were no statistically significant differences in change for BMI, household wealth index, hunger, general health perception score, or adherence to ART by ration type at 6 or 12 months. The RUSF group had higher CD4 count at 12 months, but this was also not statistically significant. In 12 months of follow-up, there was no statistically significant difference in outcomes between those receiving RUSF-based compared with CSB+-based rations in a cohort of HIV-infected adults on ART in rural Haiti.
Ford, Mackenzie A.; Almond, Christopher S.; Gauvreau, Kimberlee; Piercey, Gary; Blume, Elizabeth D.; Smoot, Leslie B.; Fynn-Thompson, Francis; Singh, Tajinder P.
2014-01-01
BACKGROUND Previous studies have found no association between graft ischemic time (IT) and survival in pediatric heart transplant (HTx) recipients. However, previous studies were small or analyzed risk only at the extremes of IT, where observations are few. We sought to determine whether graft IT is independently associated with graft survival in a large cohort of children with no a priori assumptions about where the risk threshold may lie. METHODS All children aged <18 years in the U.S. undergoing primary HTx (1987 to 2008) were included. The primary end point was graft loss (death or retransplant) within 6 months. Multivariate analysis was performed to analyze the association between graft IT and graft loss within 6 months after transplant. A secondary end point of longer-term graft loss was assessed among recipients who survived the first 6 months after transplant. RESULTS Of 4,716 pediatric HTxs performed, the median IT was 3.5 hours (interquartile range, 2.7–4.3 hours). Adjusted analysis showed that children with an IT > 3.5 hours were at increased risk of graft loss within 6 months after transplant (hazard ratio, 1.3; 95% confidence interval, 1.1–1.5; p = 0.002). Among 6-month survivors, IT was not associated with longer-term graft loss. CONCLUSIONS IT beyond 3.5 hours is associated with a 30% increase in risk of graft loss within 6 months in pediatric HT recipients. Although the magnitude of risk associated with IT is small compared with the risk associated with recipient factors, these findings may be important during donor assessment for high-risk transplant candidates. PMID:21676628
Berry, Colin; Zimmerli, Lukas U; Steedman, Tracey; Foster, John E; Dargie, Henry J; Berg, Geoffrey A; Dominiczak, Anna F; Delles, Christian
2008-03-01
Morbidity following CABG (coronary artery bypass grafting) is difficult to predict and leads to increased healthcare costs. We hypothesized that pre-operative CMR (cardiac magnetic resonance) findings would predict resource utilization in elective CABG. Over a 12-month period, patients requiring elective CABG were invited to undergo CMR 1 day prior to CABG. Gadolinium-enhanced CMR was performed using a trueFISP inversion recovery sequence on a 1.5 tesla scanner (Sonata; Siemens). Clinical data were collected prospectively. Admission costs were quantified based on standardized actual cost/day. Admission cost greater than the median was defined as 'increased'. Of 458 elective CABG cases, 45 (10%) underwent pre-operative CMR. Pre-operative characteristics [mean (S.D.) age, 64 (9) years, mortality (1%) and median (interquartile range) admission duration, 7 (6-8) days] were similar in patients who did or did not undergo CMR. In the patients undergoing CMR, eight (18%) and 11 (24%) patients had reduced LV (left ventricular) systolic function by CMR [LVEF (LV ejection fraction) <55%] and echocardiography respectively. LE (late enhancement) with gadolinium was detected in 17 (38%) patients. The average cost/day was $2723. The median (interquartile range) admission cost was $19059 ($10891-157917). CMR LVEF {OR (odds ratio), 0.93 [95% CI (confidence interval), 0.87-0.99]; P=0.03} and SV (stroke volume) index [OR 1.07 (95% CI, 1.00-1.14); P=0.02] predicted increased admission cost. CMR LVEF (P=0.08) and EuroScore tended to predict actual admission cost (P=0.09), but SV by CMR (P=0.16) and LV function by echocardiography (P=0.95) did not. In conclusion, in this exploratory investigation, pre-operative CMR findings predicted admission duration and increased admission cost in elective CABG surgery. The cost-effectiveness of CMR in risk stratification in elective CABG surgery merits prospective assessment.
Carpio, D; Jauregui-Amezaga, A; de Francisco, R; de Castro, L; Barreiro-de Acosta, M; Mendoza, J L; Mañosa, M; Ollero, V; Castro, B; González-Conde, B; Hervías, D; Sierra Ausin, M; Sancho Del Val, L; Botella-Mateu, B; Martínez-Cadilla, J; Calvo, M; Chaparro, M; Ginard, D; Guerra, I; Maroto, N; Calvet, X; Fernández-Salgado, E; Gordillo, J; Rojas Feria, M
2016-10-01
Despite having adopted preventive measures, tuberculosis (TB) may still occur in patients with inflammatory bowel disease (IBD) treated with anti-tumour necrosis factor (anti-TNF). Data on the causes and characteristics of TB cases in this scenario are lacking. Our aim was to describe the characteristics of TB in anti-TNF-treated IBD patients after the publication of the Spanish guidelines on TB prevention in IBD patients and to evaluate the safety of restarting anti-TNF after a TB diagnosis. In this multicentre, retrospective, descriptive study, TB cases from Spanish hospitals were collected. Continuous variables were reported as mean and standard deviation or median and interquartile range. Categorical variables were described as absolute and relative frequencies and their confidence intervals when necessary. We collected 50 TB cases in anti-TNF-treated IBD patients, 60% male, median age 37.3 years (interquartile range [IQR] 30.4-47). Median latency between anti-TNF initiation and first TB symptoms was 155.5 days (IQR 88-301); 34% of TB cases were disseminated and 26% extrapulmonary. In 30 patients (60%), TB cases developed despite compliance with recommended preventive measures; not performing a 2-step TST (tuberculin skin test) was the main failure in compliance with recommendations. In 17 patients (34%), anti-TNF was restarted after a median of 13 months (IQR 7.1-17.3) and there were no cases of TB reactivation. Tuberculosis can still occur in anti-TNF-treated IBD patients despite compliance with recommended preventive measures. A significant number of cases developed when these recommendations were not followed. Restarting anti-TNF treatment in these patients seems to be safe. Copyright © 2016 European Crohn’s and Colitis Organisation (ECCO). Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Weissler-Snir, Adaya; Kornowski, Ran; Sagie, Alexander; Vaknin-Assa, Hana; Perl, Leor; Porter, Avital; Lev, Eli; Assali, Abid
2014-11-15
Little is known regarding gender differences in left ventricular (LV) function after anterior wall ST-segment elevation myocardial infarction (STEMI), despite it being a major determinant of patients' morbidity and mortality. We therefore sought to investigate the impact of gender on LV function after primary percutaneous coronary intervention (PCI) for first anterior wall STEMI. Seven hundred eighty-nine consecutive patients (625 men) with first anterior STEMI were included in the analysis. All patients underwent an echocardiographic study within 48 hours of PCI. Women were older and more likely to have diabetes, hypertension, chronic renal failure, and a higher Killip score. Women had prolonged ischemic time, which was driven by prolonged symptom-to-presentation time (2.75 [interquartile range 1.5 to 4] vs 2 [interquartile range 1 to 3.5] hours, p = 0.005). A higher percentage of women had moderate or worse LV dysfunction (LV ejection fraction <40%; 61.6% vs 48%, p = 0.002). In a univariable analysis female gender was associated with moderate or worse LV function (p = 0.002). However, after accounting for variable baseline risk profiles between the 2 groups using multivariable and propensity score techniques, ischemic time >3.5 hours, leukocytosis, and pre-PCI Thrombolysis In Myocardial Infarction flow grade <2 were independent predictors of moderate or worse LV dysfunction, whereas female gender was not. Data on LV function recovery at 6 months, which were available for 45% of female and male patients with moderate or worse LV dysfunction early after PCI, showed no significant gender related difference in LV function recovery. In conclusion, women undergoing PCI for the first event of anterior STEMI demonstrate worse LV function than that of men, which might be partially attributed to delay in presentation. Hence greater efforts should be devoted to increasing women's awareness of cardiac symptoms during the prehospital course of STEMI. Copyright © 2014 Elsevier Inc. All rights reserved.
Physical activity assessment and counseling in Quebec family medicine groups.
Baillot, Aurélie; Baillargeon, Jean-Patrice; Paré, Alex; Poder, Thomas G; Brown, Christine; Langlois, Marie-France
2018-05-01
To determine how often primary health care providers (PHCPs) in family medicine groups (FMGs) assess physical activity (PA) levels, provide PA counseling (PAC), and refer patients to exercise professionals; to describe patients' PA levels, physical fitness, and satisfaction regarding their PA management in FMGs; to describe available PA materials in FMGs and PHCPs' PAC self-efficacy and PA knowledge; and to identify characteristics of patients and PHCPs that determine the assessment of PA and PAC provided by PHCPs. Cross-sectional study using questionnaires and a medical chart audit. Ten FMGs within the Integrated University Health Network of the Centre hospitalier universitaire de Sherbrooke in Quebec. Forty FPs, 24 nurses, and 439 patients. Assessment of PA level and PAC provided by PHCPs. Overall, 51.9% of the patients had had their PA level assessed during the past 18 months, but only 21.6% received PAC from at least 1 of the PHCPs. Similar percentages were found among the inactive (n = 244) and more active (n = 195) patients. The median PAC self-efficacy score of PHCPs was 70.2% (interquartile range 52.0% to 84.7%) and the median PA knowledge score was 45.8% (interquartile range 41.7% to 54.2%), with no significant differences between nurses and FPs. In multivariate analysis, 34% of the variance in PAC provided was explained by assessment of PA level, overweight or obese status, type 2 diabetes or prediabetes, less FP experience, lower patient annual family income, more nurse encounters, and a higher patient physical component summary of quality of life. The rates of assessment of PA and provision of PAC in Quebec FMGs were low, even though most of the patients were inactive. Initiatives to support PHCPs and more resources to assess PA levels and provide PAC should be implemented. Copyright© the College of Family Physicians of Canada.
Kwon, Richard S; Young, Benjamin E; Marsteller, William F; Lawrence, Christopher; Wu, Bechien U; Lee, Linda S; Mullady, Daniel; Klibansky, David A; Gardner, Timothy B; Simeone, Diane M
2016-09-01
This study aimed to determine if the improved pain response to endoscopic retrograde cholangiopancreatography (ERCP) and pancreatic stent placement (EPS) predicts pain response in patients with chronic pancreatitis after modified lateral pancreaticojejunostomy (LPJ). A multi-institutional, retrospective review of patients who underwent successful EPS before LPJ between 2001 and 2010 was performed. The primary outcome was narcotic independence (NI) within 2 months after ERCP or LPJ. A total of 31 narcotic-dependent patients with chronic pancreatitis underwent successful EPS before LPJ. Ten (32%) achieved post-LPJ NI (median follow-up, 8.5 months; interquartile range [IQR], 2-38 months). Eight (80%) of 10 patients with NI post-ERCP achieved NI post-LPJ. Two (10%) without NI post-ERCP achieved NI post-LPJ. Narcotic independence post-EPS was strongly associated with NI post-LPJ, with an odds ratio of 38 (P = 0.0025), and predicted post-LPJ NI with a sensitivity, specificity, positive predictive value, and negative predictive value of 80%, 90.5%, 80%, and 90.5%, respectively. Narcotic independence after EPS is associated with NI after LPJ. Failure to achieve NI post-ERCP predicts failure to achieve NI post-LPJ. These results support the need for larger studies to confirm the predictive value of pancreatic duct stenting for better selection of chronic pancreatitis patients who will benefit from LPJ.
Efficacy of golimumab on recurrent uveitis in HLA-B27-positive ankylosing spondylitis.
Yazgan, Serpil; Celik, Ugur; Işık, Metin; Yeşil, Nesibe Karahan; Baki, Ali Erdem; Şahin, Hatice; Gencer, Ercan; Doğan, İsmail
2017-02-01
To evaluate the efficacy of golimumab on severe and frequently recurrent anterior uveitis in patients with HLA-B27-positive ankylosing spondylitis. In this study, 15 eyes of 12 HLA-B27-positive AS patients with resistant anterior uveitis who received 50 mg of subcutaneous golimumab (Gol) per month due to frequent uveitis recurrences were analyzed retrospectively between May 2013 and October 2015. Assessment criteria were uveitis activity, the number of recurrences of uveitis, visual acuity, and the systemic corticosteroid or other drug requirement for maintenance of remission of anterior uveitis (AU). Twelve patients (15 eyes) with HLA-B27-positive ankylosing spondylitis and anterior uveitis were treated with golimumab 50 mg/month. Remission of uveitis was observed in 12 of 15 eyes. Malignant hypertension developed in one subject after the second dose of golimumab; therefore, the treatment was stopped and this subject was excluded from the study. Median follow-up time was 11 months (interquartile range: 8-18). No uveitic reaction was seen except in the patient who stopped treatment. No topical or systemic corticosteroids were required, except in two cases maintained on 4 mg of oral systemic therapy. Visual acuity was significantly increased (p = 0.002). Golimumab may be a new and effective choice for maintaining remission and preventing recurrences of severe, resistant anterior uveitis in patients with HLA-B27-positive ankylosing spondylitis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pashtan, Itai M.; Recht, Abram; Ancukiewicz, Marek
Purpose: External beam accelerated partial breast irradiation (APBI) is an increasingly popular technique for treatment of patients with early stage breast cancer following breast-conserving surgery. Here we present 5-year results of a prospective trial. Methods and Materials: From October 2003 through November 2005, 98 evaluable patients with stage I breast cancer were enrolled in the first dose step (32 Gy delivered in 8 twice-daily fractions) of a prospective, multi-institutional, dose escalation clinical trial of 3-dimensional conformal external beam APBI (3D-APBI). Median age was 61 years; median tumor size was 0.8 cm; 89% of tumors were estrogen receptor positive; 10% had a triple-negative phenotype; and 1% had a HER-2-positive subtype. Median follow-up was 71 months (range, 2-88 months; interquartile range, 64-75 months). Results: Five patients developed ipsilateral breast tumor recurrence (IBTR), for a 5-year actuarial IBTR rate of 5% (95% confidence interval [CI], 1%-10%). Three of these cases occurred in patients with triple-negative disease and 2 in non-triple-negative patients, for 5-year actuarial IBTR rates of 33% (95% CI, 0%-57%) and 2% (95% CI, 0%-6%; P<.0001), respectively. On multivariable analysis, triple-negative phenotype was the only predictor of IBTR, with borderline statistical significance after adjusting for tumor grade (P=.0537). Conclusions: Overall outcomes were excellent, particularly for patients with estrogen receptor-positive disease. Patients in this study with triple-negative breast cancer had a significantly higher IBTR rate than patients with other receptor phenotypes when treated with 3D-APBI. Larger, prospective 3D-APBI clinical trials should continue to evaluate the effect of hormone receptor phenotype on IBTR rates.
Sudjaritruk, Tavitiya; Maleesatharn, Alan; Prasitsuebsai, Wasana; Fong, Siew Moy; Le, Ngoc Oanh; Le, Thanh Thuy Thi; Lumbiganon, Pagakrong; Kumarasamy, Nagalingeswaran; Kurniati, Nia; Hansudewechakul, Rawiwan; Yusoff, Nik Khairulddin Nik; Razali, Kamarul Azahar Mohd; Kariminia, Azar; Sohn, Annette H.
2013-01-01
A multicenter, retrospective, observational study was conducted to determine prevalence, characteristics, management, and outcome of pulmonary tuberculosis (PTB) in Asian HIV-infected children in the TREAT Asia Pediatric HIV Observational Database (TApHOD). Data on PTB episodes diagnosed during the period between 12 months before antiretroviral therapy (ART) initiation and December 31, 2009 were extracted. A total of 2678 HIV-infected children were included in TApHOD over a 13-year period; 457 developed PTB, giving a period prevalence of 17.1% (range 5.7–33.0% per country). There were a total of 484 PTB episodes; 27 children had 2 episodes each. There were 21 deaths (4.3%). One third of episodes (n=175/484) occurred after ART initiation at a median of 14.1 months (interquartile range [IQR] 2.5–28.8 months). The median (IQR) CD4+ values were 9.0% (3.0–16.0%) and 183.5 (37.8–525.0) cells/mm3 when PTB was diagnosed. Most episodes (n=424/436, 97.3%) had abnormal radiographic findings compatible with PTB, whereas half (n=267/484, 55.2%) presented with clinical characteristics of PTB. One third of those tested (n=42/122, 34.4%) had bacteriological evidence of PTB. Of the 156 episodes (32.2%) that were accompanied by extrapulmonary TB, pleuritis was the most common manifestation (81.4%). After treatment completion, most episodes (n=396/484, 81.9%) were recorded as having positive outcomes (cured, treatment completed and child well, and improvement). The prevalence of PTB among Asian HIV-infected children in our cohort was high. Children with persistent immunosuppression remain vulnerable to PTB even after ART initiation. PMID:24206012
Tate, David J; Awadie, Halim; Bahin, Farzan F; Desomer, Lobke; Lee, Ralph; Heitman, Steven J; Goodrick, Kathleen; Bourke, Michael J
2018-03-01
BACKGROUND AND STUDY AIMS: Large series suggest endoscopic mucosal resection is safe and effective for the removal of large (≥ 10 mm) sessile serrated polyps (SSPs), but it exposes the patient to the risks of electrocautery, including delayed bleeding. We examined the feasibility and safety of piecemeal cold snare polypectomy (pCSP) for the resection of large SSPs. Sequential large SSPs (10 - 35 mm) without endoscopic evidence of dysplasia referred over 12 months to a tertiary endoscopy center were considered for pCSP. A thin-wire snare was used in all cases. Submucosal injection was not performed. High definition imaging of the defect margin was used to ensure the absence of residual serrated tissue. Adverse events were assessed at 2 weeks and surveillance was planned for between 6 and 12 months. 41 SSPs were completely removed by pCSP in 34 patients. The median SSP size was 15 mm (interquartile range [IQR] 14.5 - 20 mm; range 10 - 35 mm). The median procedure duration was 4.5 minutes (IQR 1.4 - 6.3 minutes). There was no evidence of perforation or significant intraprocedural bleeding. At 2-week follow-up, there were no significant adverse events, including delayed bleeding and post-polypectomy syndrome. First follow-up has been undertaken for 15/41 lesions at a median of 6 months with no evidence of recurrence. There is potential for pCSP to become the standard of care for non-dysplastic large SSPs. This could reduce the burden of removing SSPs on patients and healthcare systems, particularly by avoidance of delayed bleeding. © Georg Thieme Verlag KG Stuttgart · New York.
Pashtan, Itai M; Recht, Abram; Ancukiewicz, Marek; Brachtel, Elena; Abi-Raad, Rita F; D'Alessandro, Helen A; Levy, Antonin; Wo, Jennifer Y; Hirsch, Ariel E; Kachnic, Lisa A; Goldberg, Saveli; Specht, Michelle; Gadd, Michelle; Smith, Barbara L; Powell, Simon N; Taghian, Alphonse G
2012-11-01
External beam accelerated partial breast irradiation (APBI) is an increasingly popular technique for treatment of patients with early stage breast cancer following breast-conserving surgery. Here we present 5-year results of a prospective trial. From October 2003 through November 2005, 98 evaluable patients with stage I breast cancer were enrolled in the first dose step (32 Gy delivered in 8 twice-daily fractions) of a prospective, multi-institutional, dose escalation clinical trial of 3-dimensional conformal external beam APBI (3D-APBI). Median age was 61 years; median tumor size was 0.8 cm; 89% of tumors were estrogen receptor positive; 10% had a triple-negative phenotype; and 1% had a HER-2-positive subtype. Median follow-up was 71 months (range, 2-88 months; interquartile range, 64-75 months). Five patients developed ipsilateral breast tumor recurrence (IBTR), for a 5-year actuarial IBTR rate of 5% (95% confidence interval [CI], 1%-10%). Three of these cases occurred in patients with triple-negative disease and 2 in non-triple-negative patients, for 5-year actuarial IBTR rates of 33% (95% CI, 0%-57%) and 2% (95% CI, 0%-6%; P<.0001), respectively. On multivariable analysis, triple-negative phenotype was the only predictor of IBTR, with borderline statistical significance after adjusting for tumor grade (P=.0537). Overall outcomes were excellent, particularly for patients with estrogen receptor-positive disease. Patients in this study with triple-negative breast cancer had a significantly higher IBTR rate than patients with other receptor phenotypes when treated with 3D-APBI. Larger, prospective 3D-APBI clinical trials should continue to evaluate the effect of hormone receptor phenotype on IBTR rates. Copyright © 2012 Elsevier Inc. All rights reserved.
Complementary Feeding and Diarrhea and Respiratory Infection Among HIV-exposed Tanzanian Infants
Kamenju, P; Liu, E; Hertzmark, E; Spiegelman, D; Kisenge, R.R.; Kupka, R; Aboud, S; Manji, K.P.; Duggan, C; Fawzi, W.W.
2016-01-01
Objective To examine the association between complementary feeding (CF) and risks of diarrhea and acute respiratory infection (ARI) among HIV-exposed infants aged 6–24 months. Design We prospectively employed an Infant and Child Feeding Index (ICFI) to measure CF practices (breastfeeding status, food consistency, dietary diversity, food group frequency and meal frequency). We determined the association of ICFI and each of its components with the risk of diarrhea and ARI. Generalized estimating equations (GEE) were used to estimate the relative risks for morbidity episodes. Setting Dar es Salaam, Tanzania. Subjects 2092 HIV-exposed infants followed from 6 months of age to 24 months of age. Results The ICFI score ranged from 0 to 9; the median score was 6 (Inter-Quartile Range; IQR=4, 7). Low ICFI scores were likely associated with increased risk of dysentery (low vs. high tertile Risk Ratio, RR: 1.40; 95% CI: 0.93, 2.10; P for trend=0.02) and respiratory infection (low vs. high tertile RR: 1.16; 95% CI: 0.96, 1.41; P for trend=0.01). Low dietary diversity scores were likely associated with higher risk of dysentery (low vs. high tertile RR: 1.47; 95% CI: 0.92, 2.35; P for trend=0.03) and respiratory infection (low vs. high tertile RR: 1.41; 95% CI: 1.13, 1.76; P for trend=0.01). Low food consistency scores were associated with higher risk of respiratory infection (RR: 1.77; 95% CI: 1.40, 2.26; P<0.01). Conclusions In this setting, low ICFI, dietary diversity and food consistency scores were likely associated with increased risk of diarrhea and acute respiratory infection among HIV-exposed infants. PMID:27861238
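The relative-risk estimation described above (repeated morbidity outcomes per child, clustered with GEE) can be sketched roughly as follows; the file name and columns (diarrhea, icfi_tertile, age_months, child_id) are assumptions for illustration, not the study's dataset:

# Rough sketch: relative risks via a log-link (robust) Poisson GEE with an
# exchangeable working correlation, clustering repeated visits within children.
import numpy as np
import pandas as pd
import statsmodels.api as sm

visits = pd.read_csv("hiv_exposed_infant_visits.csv")   # one row per child-visit

model = sm.GEE.from_formula(
    "diarrhea ~ C(icfi_tertile) + age_months",           # binary outcome
    groups="child_id",
    data=visits,
    family=sm.families.Poisson(),                         # log link -> risk ratios
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = model.fit()
print(np.exp(result.params))   # exponentiated coefficients approximate relative risks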
FDG-PET/CT can rule out malignancy in patients with vocal cord palsy.
Thomassen, Anders; Nielsen, Anne Lerberg; Lauridsen, Jeppe Kiilerich; Blomberg, Björn Alexander; Hess, Søren; Petersen, Henrik; Johansen, Allan; Asmussen, Jon Thor; Sørensen, Jesper Roed; Johansen, Jørgen; Godballe, Christian; Høilund-Carlsen, Poul Flemming
2014-01-01
The aim was to investigate the performance of (18)F-fluorodeoxyglucose PET/CT to rule out malignancy in patients with confirmed vocal cord palsy (VCP). Between January 2011 and June 2013, we retrospectively included consecutive patients referred to PET/CT with paresis or paralysis of one or both vocal cords. PET/CT results were compared to clinical workup and histopathology. The study comprised 65 patients (32 females) with a mean age of 66±12 years (range 37-89). Eleven patients (17%) had antecedent cancer. Twenty-seven (42%) were diagnosed with cancer during follow-up. The palsy was right-sided in 24 patients, left-sided in 37, and bilateral in 4. Median follow-up was 7 months (interquartile range 4-11 months). Patients without cancer were followed for at least three months. PET/CT suggested a malignancy in 35 patients (27 true positives, 8 false positives) and showed none in 30 (30 true negatives, 0 false negatives). Thus, the sensitivity, specificity, positive and negative predictive values, and accuracy were (95% confidence intervals in parentheses): 100% (88%-100%), 79% (64%-89%), 77% (61%-88%), 100% (89%-100%), and 88% (78%-94%), respectively. Sixteen patients had palliative treatment, while 11 were treated with curative intent, emphasising the severity of VCP and the need for a rapid and accurate diagnostic work-up. In this retrospective survey, biopsy-proven malignancy (whether newly diagnosed or relapsed) was the cause of VCP in almost half of patients (42%). PET/CT had a high sensitivity (100%) with a relatively high false positive rate, but was excellent in ruling out malignancy (negative predictive value 100%).
Smith, Danielle M; Werb, Dan; Abramovitz, Daniela; Magis-Rodriguez, Carlos; Vera, Alicia; Patterson, Thomas L; Strathdee, Steffanie A
2016-03-01
Until the early 2000s, there was only one needle exchange program (NEP) offered in Mexico. In 2004, the second Mexican NEP opened in Tijuana, but its utilization has not been studied. We studied predictors of initiating NEP during its early expansion in Tijuana, Mexico. From April 2006 to April 2007, people who inject drugs (PWID) residing in Tijuana who had injected within the last month were recruited using respondent-driven sampling. Weighted Poisson regression incorporating generalized estimating equations was used to identify predictors of initiating NEP, while accounting for correlation between recruiter and recruits. NEP uptake increased from 20% at baseline to 59% after 6 months. Among a subsample of PWID not accessing NEP at baseline (n = 480), 83% were male and median age was 37 years (Interquartile Range: 32-43). At baseline, 4.4% were HIV-infected and 5.9% had syphilis titers >1:8. In multivariate models, factors associated with NEP initiation (p < .05) were attending shooting galleries (Adjusted Relative Risk [ARR]: 1.54); arrest for track-marks (ARR: 1.38); having a family member that ever used drugs (ARR: 1.37); and having a larger PWID network (ARR: 1.01 per 10 persons). NEP initiation was inversely associated with obtaining syringes at pharmacies (ARR: .56); earning >2500 pesos/month (ARR: .66); and reporting needle sharing (ARR: .71). Uptake of NEP expansion in Tijuana was vigorous among PWID. We identified a range of factors that influenced the likelihood of NEP initiation, including police interaction. These findings have important implications for the scale-up of NEP in Mexico. © American Academy of Addiction Psychiatry.
Simonis, Fabienne D; de Iudicibus, Gianfranco; Cremer, Olaf L; Ong, David S Y; van der Poll, Tom; Bos, Lieuwe D; Schultz, Marcus J
2018-01-01
Macrolides have been associated with favorable immunological effects in various inflammatory disease states. We investigated the association between macrolide therapy and mortality in patients with the acute respiratory distress syndrome (ARDS). This was an unplanned secondary analysis of patients with ARDS within a large prospective observational study of critically ill patients in the intensive care units (ICUs) of two university-affiliated hospitals in the Netherlands. The exposure of interest was low-dose macrolide use prescribed for another reason than infection; we excluded patients who received high-dose macrolides for an infection. The primary endpoint was 30-day mortality. The association between macrolide therapy and mortality was determined in the whole cohort, as well as in a propensity score matched cohort; the association was compared between pulmonary versus non-pulmonary ARDS, and between two biological phenotypes based on plasma levels of 20 biomarkers. In total, 873 patients with ARDS were analyzed, of whom 158 patients (18%) received macrolide therapy during stay in ICU for a median duration of 3 (interquartile range, 1-4) days. Erythromycin was the most frequent prescribed macrolide (97%). Macrolide therapy was associated with reduced 30-day mortality in the whole cohort [22.8% vs. 31.6%; crude odds ratio (OR), 0.64 (interquartile range, 0.43-0.96), P=0.03]. The association in the propensity score matched cohort remained significant [22.8% vs. 32.9%; OR, 0.62 (interquartile range, 0.39-0.96), P=0.03]. Propensity matched associations with mortality were different in patients with non-pulmonary ARDS vs. pulmonary ARDS and also varied by biological phenotype. These data together show that low-dose macrolide therapy prescribed for another reason than infection is associated with decreased mortality in patients with ARDS.
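As a rough illustration of the propensity-matched contrast described above (greedy 1:1 nearest-neighbour matching on an estimated propensity score, followed by a crude mortality comparison), with entirely hypothetical variable names:

# Sketch only, not the study's code. Assumes columns: macrolide (0/1),
# died_30d (0/1), and a few baseline covariates used to model treatment.
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("ards_cohort.csv")                      # hypothetical file
covars = ["age", "apache_ii", "pulmonary_ards"]

ps_model = LogisticRegression(max_iter=1000).fit(df[covars], df["macrolide"])
df["ps"] = ps_model.predict_proba(df[covars])[:, 1]      # propensity score

treated = df[df["macrolide"] == 1]
pool = df[df["macrolide"] == 0].copy()
match_idx = []
for _, row in treated.iterrows():                        # greedy nearest neighbour,
    j = (pool["ps"] - row["ps"]).abs().idxmin()          # matching without replacement
    match_idx.append(j)
    pool = pool.drop(index=j)

matched = pd.concat([treated, df.loc[match_idx]])
tab = pd.crosstab(matched["macrolide"], matched["died_30d"])
odds_ratio = (tab.loc[1, 1] * tab.loc[0, 0]) / (tab.loc[1, 0] * tab.loc[0, 1])
print("Matched-cohort odds ratio for 30-day mortality:", round(odds_ratio, 2))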
Scientific Production of Research Fellows at the Zagreb University School of Medicine, Croatia
Polašek, Ozren; Kolčić, Ivana; Buneta, Zoran; Čikeš, Nada; Pećina, Marko
2006-01-01
Aim To evaluate scientific production among research fellows employed at the Zagreb University School of Medicine and identify factors associated with their scientific output. Method We conducted a survey among research fellows and their mentors during June 2005. The main outcome measure was publication success, defined for each fellow as publishing at least 0.5 articles per employment year in journals indexed in the Current Contents bibliographic database. Bivariate methods and binary logistic regression were used in data analysis. Results A total of 117 fellows (response rate 95%) and 83 mentors (100%) were surveyed. The highest scientific production was recorded among research fellows employed in public health departments (median 3.0 articles, interquartile range 4.0), compared with those from pre-clinical (median 0.0, interquartile range 2.0) and clinical departments (median 1.0, interquartile range 2.0) (Kruskal-Wallis, P = 0.003). A total of 36 (29%) research fellows published at least 0.5 articles per employment year and were considered successful. Three variables were associated with fellows’ publication success: mentor’s scientific production (odds ratio [OR], 3.14; 95% confidence interval [CI], 1.31-7.53), positive mentor’s assessment (OR, 3.15; 95% CI, 1.10-9.05), and fellows’ undergraduate publication in journals indexed in the Current Contents bibliographic database (OR, 4.05; 95% CI, 1.07-15.34). Conclusion Undergraduate publication could be used as one of the main criteria in selecting research fellows. One of the crucial factors in a fellow’s scientific production and career advancement is mentor’s input, which is why research fellows would benefit most from working with scientifically productive mentors. PMID:17042070
Prabhu, Malavika; Clapp, Mark A; McQuaid-Hanson, Emily; Ona, Samsiya; O'Donnell, Taylor; James, Kaitlyn; Bateman, Brian T; Wylie, Blair J; Barth, William H
2018-07-01
To evaluate whether a liposomal bupivacaine incisional block decreases postoperative pain and represents an opioid-minimizing strategy after scheduled cesarean delivery. In a single-blind, randomized controlled trial among opioid-naive women undergoing cesarean delivery, liposomal bupivacaine or placebo was infiltrated into the fascia and skin at the surgical site, before fascial closure. Using an 11-point numeric rating scale, the primary outcome was pain score with movement at 48 hours postoperatively. A sample size of 40 women per group was needed to detect a 1.5-point reduction in pain score in the intervention group. Pain scores and opioid consumption, in oral morphine milligram equivalents, at 48 hours postoperatively were summarized as medians (interquartile range) and compared using the Wilcoxon rank-sum test. Between March and September 2017, 249 women were screened, 103 women enrolled, and 80 women were randomized. One woman in the liposomal bupivacaine group was excluded after randomization as a result of a vertical skin incision, leaving 39 patients in the liposomal bupivacaine group and 40 in the placebo group. Baseline characteristics between groups were similar. The median (interquartile range) pain score with movement at 48 hours postoperatively was 4 (2-5) in the liposomal bupivacaine group and 3.5 (2-5.5) in the placebo group (P=.72). The median (interquartile range) opioid use was 37.5 (7.5-60) morphine milligram equivalents in the liposomal bupivacaine group and 37.5 (15-75) morphine milligram equivalents in the placebo group during the first 48 hours postoperatively (P=.44). Compared with placebo, a liposomal bupivacaine incisional block at the time of cesarean delivery resulted in similar postoperative pain scores in the first 48 hours postoperatively. ClinicalTrials.gov, NCT02959996.
England, Timothy J; Hedstrom, Amanda; O'Sullivan, Saoirse; Donnelly, Richard; Barrett, David A; Sarmad, Sarir; Sprigg, Nikola; Bath, Philip M
2017-05-01
Repeated episodes of limb ischemia and reperfusion (remote ischemic conditioning [RIC]) may improve outcome after acute stroke. We performed a pilot blinded placebo-controlled trial in patients with acute ischemic stroke, randomized 1:1 to receive 4 cycles of RIC within 24 hours of ictus. The primary outcome was tolerability and feasibility. Secondary outcomes included safety, clinical efficacy (day 90), putative biomarkers (pre- and post-intervention, day 4), and exploratory hemodynamic measures. Twenty-six patients (13 RIC and 13 sham) were recruited 15.8 hours (SD 6.2) post-onset, age 76.2 years (SD 10.5), blood pressure 159/83 mm Hg (SD 25/11), and National Institutes of Health Stroke Scale (NIHSS) score 5 (interquartile range, 3.75-9.25). RIC was well tolerated with 49 out of 52 cycles completed in full. Three patients experienced vascular events in the sham group: 2 ischemic strokes and 2 myocardial infarcts versus none in the RIC group (P=0.076, log-rank test). Compared with sham, there was a significant decrease in day 90 NIHSS score in the RIC group, median NIHSS score 1 (interquartile range, 0.5-5) versus 3 (interquartile range, 2-9.5; P=0.04); RIC augmented plasma HSP27 (heat shock protein 27; P<0.05, repeated 2-way ANOVA) and phosphorylated HSP27 (P<0.001) but not plasma S100-β, matrix metalloproteinase-9, endocannabinoids, or arterial compliance. RIC after acute stroke is well tolerated and appears safe and feasible. RIC may improve neurological outcome, and protective mechanisms may be mediated through HSP27. A larger trial is warranted. URL: http://www.isrctn.com. Unique identifier: ISRCTN86672015. © 2017 American Heart Association, Inc.
Clinical presentation of patients with Ebola virus disease in Conakry, Guinea.
Bah, Elhadj Ibrahima; Lamah, Marie-Claire; Fletcher, Tom; Jacob, Shevin T; Brett-Major, David M; Sall, Amadou Alpha; Shindo, Nahoko; Fischer, William A; Lamontagne, Francois; Saliou, Sow Mamadou; Bausch, Daniel G; Moumié, Barry; Jagatic, Tim; Sprecher, Armand; Lawler, James V; Mayet, Thierry; Jacquerioz, Frederique A; Méndez Baggi, María F; Vallenas, Constanza; Clement, Christophe; Mardel, Simon; Faye, Ousmane; Faye, Oumar; Soropogui, Baré; Magassouba, Nfaly; Koivogui, Lamine; Pinto, Ruxandra; Fowler, Robert A
2015-01-01
In March 2014, the World Health Organization was notified of an outbreak of Zaire ebolavirus in a remote area of Guinea. The outbreak then spread to the capital, Conakry, and to neighboring countries and has subsequently become the largest epidemic of Ebola virus disease (EVD) to date. From March 25 to April 26, 2014, we performed a study of all patients with laboratory-confirmed EVD in Conakry. Mortality was the primary outcome. Secondary outcomes included patient characteristics, complications, treatments, and comparisons between survivors and nonsurvivors. Of 80 patients who presented with symptoms, 37 had laboratory-confirmed EVD. Among confirmed cases, the median age was 38 years (interquartile range, 28 to 46), 24 patients (65%) were men, and 14 (38%) were health care workers; among the health care workers, nosocomial transmission was implicated in 12 patients (32%). Patients with confirmed EVD presented to the hospital a median of 5 days (interquartile range, 3 to 7) after the onset of symptoms, most commonly with fever (in 84% of the patients; mean temperature, 38.6°C), fatigue (in 65%), diarrhea (in 62%), and tachycardia (mean heart rate, >93 beats per minute). Of these patients, 28 (76%) were treated with intravenous fluids and 37 (100%) with antibiotics. Sixteen patients (43%) died, with a median time from symptom onset to death of 8 days (interquartile range, 7 to 11). Patients who were 40 years of age or older, as compared with those under the age of 40 years, had a relative risk of death of 3.49 (95% confidence interval, 1.42 to 8.59; P=0.007). Patients with EVD presented with evidence of dehydration associated with vomiting and severe diarrhea. Despite attempts at volume repletion, antimicrobial therapy, and limited laboratory services, the rate of death was 43%.
Novel dry cryotherapy system for cooling the equine digit
Stefanovski, Darko; Lenfest, Margret; Chatterjee, Sraboni; Orsini, James
2018-01-01
Objectives Digital cryotherapy is commonly used for laminitis prophylaxis and treatment. Currently validated methods for distal limb cryotherapy involve wet application or compression technology. There is a need for a practical, affordable, dry cryotherapy method that effectively cools the digit. The objective of this study was to evaluate the hoof wall surface temperatures (HWSTs) achieved with a novel dry cryotherapy technology. Design Repeated-measures in vivo experimental study. Setting Experimental intervention at a single site. Participants 6 systemically healthy horses (3 mares, 3 geldings). Interventions Cryotherapy was applied to six horses for eight hours with a commercially available rubber and welded fabric ice boot, which extended proximally to include the foot and pastern. Reusable malleable cold therapy packs were secured against the foot and pastern with the three built-in hook-and-loop fastener panels. Primary and secondary outcome measures HWST and pastern surface temperature of the cryotherapy-treated limb, HWST of the control limb and ambient temperature were recorded every five minutes throughout the study period. Results Results were analysed with mixed-effects multivariable regression analysis. The HWST (median 11.1°C, interquartile range 8.6°C–14.7°C) in the cryotherapy-treated limb was significantly decreased compared with the control limb (median 29.7°C, interquartile range 28.9°C–30.4°C) (P≤0.001). Cryotherapy limb HWST reached a minimum of 6.75°C (median) with an interquartile range of 4.1°C–9.3°C. Minimum HWST was achieved 68 minutes after cryotherapy pack application. Conclusions Dry application of cryotherapy significantly reduced HWST and reached minimums below the therapeutic target of 10°C. This cryotherapy method might offer an effective alternative for digital cooling. PMID:29344364
Ibrahim, Wanis H; Alousi, Faraj H; Al-Khal, Abdulatif; Bener, Abdulbari; AlSalman, Ahmed; Aamer, Aaiza; Khaled, Ahmed; Raza, Tasleem
2016-01-01
To determine the mean and median delays in pulmonary tuberculosis (PTB) diagnosis among adults in one of the countries with the world's highest gross domestic product per capita, and to identify patient- and health system-related reasons for these delays. This is a cross-sectional, face-to-face, prospective study of 100 subjects with confirmed PTB, conducted at the main tuberculosis (TB) admitting facilities in Qatar. The mean and median diagnostic delays were measured. The Chi-square test was used to assess associations between factors and diagnostic delay, with a two-sided P < 0.05 considered significant. The mean and median total diagnostic delays of PTB were 53 (95% confidence interval [CI] 42.61-63.59) and 30 (interquartile range; Q1-Q3, 15-60) days, respectively. The mean patient factor delay was 45.7 (95% CI 28.1-63.4) days, and the median was 30 (interquartile range; Q1-Q3, 15-60) days. The mean health system factor delay was 46.3 (95% CI 35.46-57.06) days, and the median was 30 (interquartile range; Q1-Q3, 18-60) days. The most common cause of patient factor delay was neglect of TB symptoms by patients (in 39% of cases), and the most common cause of health system factor delay was failure of doctors (mostly at the general and private care levels) to suspect PTB (in 57% of cases). There were no significant associations between the presence of a language barrier, patient occupation or nationality, and diagnostic delay. Despite a favorable comparison to other countries, there is a substantial delay in the diagnosis of PTB in Qatar. Relevant actions, including health education on TB, are required to minimize this delay.
CDH1 gene polymorphisms, plasma CDH1 levels and risk of gastric cancer in a Chinese population.
Zhan, Zhen; Wu, Juan; Zhang, Jun-Feng; Yang, Ya-Ping; Tong, Shujuan; Zhang, Chun-Bing; Li, Jin; Yang, Xue-Wen; Dong, Wei
2012-08-01
The genetic polymorphisms in the E-cadherin gene (CDH1) may affect invasive/metastatic development of gastric cancer by altering gene transcriptional activity in epithelial cells. Our study aims to explore the associations between CDH1 gene polymorphisms and predisposition to gastric cancer. We genotyped four potentially functional polymorphisms (rs13689, rs1801552, rs16260 and rs17690554) of the CDH1 gene in a case-control study of 387 incident gastric cancer cases and 392 healthy controls by polymerase chain reaction-ligation detection reaction methods (PCR-LDR) and measured the plasma CDH1 levels using enzyme immunoassay among the subjects. The median and inter-quartile range were used to summarize non-normally distributed data, and we found that plasma CDH1 levels in gastric cancer patients (median: 171.00 pg/ml; inter-quartile range: 257.10 pg/ml) were significantly higher than those of controls (median: 137.40 pg/ml; inter-quartile range: 83.90 pg/ml, P = 0.003). However, none of the four polymorphisms or their haplotypes achieved significant differences in their distributions between gastric cancer cases and controls, and interestingly, in the subgroup analysis of gastric cancer, we found that the CA genotype of rs16260 and the CG genotype of rs17690554 were associated with the risk of diffuse gastric cancer, compared with their wild genotypes (OR = 2.98, 95% CI: 1.60-5.53; OR = 2.10, 95% CI: 1.14-3.85, respectively, P < 0.05). In conclusion, our results indicated that plasma CDH1 levels may serve as a risk marker for gastric cancer and variant genotypes of rs16260 and rs17690554 may contribute to the etiology of diffuse gastric cancer in this study. Further studies are warranted to verify these findings.
de Iudicibus, Gianfranco; Cremer, Olaf L.; Ong, David S. Y.; van der Poll, Tom; Bos, Lieuwe D.; Schultz, Marcus J.
2018-01-01
Background Macrolides have been associated with favorable immunological effects in various inflammatory disease states. We investigated the association between macrolide therapy and mortality in patients with the acute respiratory distress syndrome (ARDS). Methods This was an unplanned secondary analysis of patients with ARDS within a large prospective observational study of critically ill patients in the intensive care units (ICUs) of two university-affiliated hospitals in the Netherlands. The exposure of interest was low-dose macrolide use prescribed for another reason than infection; we excluded patients who received high-dose macrolides for an infection. The primary endpoint was 30-day mortality. The association between macrolide therapy and mortality was determined in the whole cohort, as well as in a propensity score matched cohort; the association was compared between pulmonary versus non-pulmonary ARDS, and between two biological phenotypes based on plasma levels of 20 biomarkers. Results In total, 873 patients with ARDS were analyzed, of whom 158 patients (18%) received macrolide therapy during stay in ICU for a median duration of 3 (interquartile range, 1–4) days. Erythromycin was the most frequent prescribed macrolide (97%). Macrolide therapy was associated with reduced 30-day mortality in the whole cohort [22.8% vs. 31.6%; crude odds ratio (OR), 0.64 (interquartile range, 0.43–0.96), P=0.03]. The association in the propensity score matched cohort remained significant [22.8% vs. 32.9%; OR, 0.62 (interquartile range, 0.39–0.96), P=0.03]. Propensity matched associations with mortality were different in patients with non-pulmonary ARDS vs. pulmonary ARDS and also varied by biological phenotype. Conclusions These data together show that low-dose macrolide therapy prescribed for another reason than infection is associated with decreased mortality in patients with ARDS. PMID:29430441
Cheuk, Queenie K Y; Lo, T K; Lee, C P; Yeung, Anita P C
2015-06-01
To evaluate the efficacy and safety of double balloon catheter for induction of labour in Chinese women with one previous caesarean section and unfavourable cervix at term. Retrospective cohort study. A regional hospital in Hong Kong. Women with previous caesarean delivery requiring induction of labour at term and with an unfavourable cervix from May 2013 to April 2014. The primary outcome was the rate of successful vaginal delivery (spontaneous or instrument-assisted) using the double balloon catheter. Secondary outcomes were double balloon catheter induction-to-delivery and removal-to-delivery interval; cervical score improvement; oxytocin augmentation; maternal or fetal complications during cervical ripening, intrapartum and postpartum period; and risk factors associated with unsuccessful induction. All 24 Chinese women tolerated the double balloon catheter well. After double balloon catheter expulsion or removal, the cervix successfully ripened in 18 (75%) cases. The median improvement in Bishop score of 3 (interquartile range, 2-4) was statistically significant (P<0.001). Overall, 18 (75%) cases were delivered vaginally. The median insertion-to-delivery and removal-to-delivery intervals were 19 (interquartile range, 13.4-23.0) hours and 6.9 (interquartile range, 4.1-10.8) hours, respectively. The interval to delivery was significantly shorter in cases with spontaneous balloon expulsion or spontaneous membrane rupture during ripening than in those without (3.0 vs 7.8 hours; P=0.025). There were no major maternal or neonatal complications. The only factor significantly associated with failed vaginal birth after caesarean was previous caesarean section for failure to progress (P<0.001). This is the first study using the double balloon catheter for induction of labour in Asian Chinese women with previous caesarean section. Using the double balloon catheter, we achieved a vaginal birth after caesarean rate of 75% without major complications.
Farbman, L; Avni, T; Rubinovitch, B; Leibovici, L; Paul, M
2013-12-01
Infections caused by methicillin-resistant Staphylococcus aureus (MRSA) incur significant costs. We aimed to examine the cost and cost-benefit of infection control interventions against MRSA and to examine factors affecting economic estimates. We performed a systematic review of studies assessing infection control interventions aimed at preventing spread of MRSA in hospitals and reporting intervention costs, savings, cost-benefit or cost-effectiveness. We searched PubMed and references of included studies with no language restrictions up to January 2012. We used the Quality of Health Economic Studies tool to assess study quality. We report cost and savings per month in 2011 US$. We calculated the median save/cost ratio and the save-cost difference with interquartile range (IQR). We examined the effects of MRSA endemicity, intervention duration and hospital size on results. Thirty-six studies published between 1987 and 2011 fulfilled inclusion criteria. Fifteen of the 18 studies reporting both costs and savings reported a save/cost ratio >1. The median save/cost ratio across all 18 studies was 7.16 (IQR 1.37-16). The median cost across all studies reporting intervention costs (n = 31) was 8648 (IQR 2025-19,170) US$ per month; median savings were 38,751 (IQR 14,206-75,842) US$ per month (23 studies). Higher save/cost ratios were observed in the intermediate to high endemicity setting compared with the low endemicity setting, in hospitals with <500 beds and with interventions of >6 months. Infection control interventions to reduce spread of MRSA in acute-care hospitals showed a favourable cost/benefit ratio. This was true also for high MRSA endemicity settings. Unresolved economic issues include rapid screening using molecular techniques and universal versus targeted screening. © 2013 The Authors Clinical Microbiology and Infection © 2013 European Society of Clinical Microbiology and Infectious Diseases.
Efficacy of Testosterone Suppression with Sustained-Release Triptorelin in Advanced Prostate Cancer.
Breul, Jürgen; Lundström, Eija; Purcea, Daniela; Venetz, Werner P; Cabri, Patrick; Dutailly, Pascale; Goldfischer, Evan R
2017-02-01
Androgen deprivation therapy (ADT) is a mainstay of treatment against advanced prostate cancer (PC). As a treatment goal, suppression of plasma testosterone levels to <50 ng/dl has been established over decades. Evidence is growing though that suppression to even lower levels may add further clinical benefit. Therefore, we undertook a pooled retrospective analysis on the efficacy of 1-, 3-, and 6-month sustained-release (SR) formulations of the gonadotropin-releasing hormone (GnRH) agonist triptorelin to suppress serum testosterone concentrations beyond current standards. Data of 920 male patients with PC enrolled in 9 prospective studies using testosterone serum concentrations as primary endpoint were pooled. Patients aged 42-96 years had to be eligible for ADT and to be either naïve to hormonal treatment or have undergone appropriate washout prior to enrolment. Patients were treated with triptorelin SR formulations for 2-12 months. Primary endpoints of this analysis were serum testosterone concentrations under treatment and success rates overall and per formulation, based on a testosterone target threshold of 20 ng/dl. After 1, 3, 6, 9, and 12 months of treatment, 79%, 92%, 93%, 90%, and 91% of patients reached testosterone levels <20 ng/dl, respectively. For the 1-, 3-, and 6-month formulations success rates ranged from 80-92%, from 83-93%, and from 65-97% with median (interquartile range) serum testosterone values of 2.9 (2.9-6.5), 5.0 (2.9-8.7), and 8.7 (5.8-14.1) ng/dl at study end, respectively. In the large majority of patients, triptorelin SR formulations suppressed serum testosterone concentrations to even <20 ng/dl. Testosterone should be routinely monitored in PC patients on ADT although further studies on the clinical benefit of very low testosterone levels and the target concentrations are still warranted.
Bouadma, Lila; Mourvillier, Bruno; Deiler, Véronique; Le Corre, Bertrand; Lolom, Isabelle; Régnier, Bernard; Wolff, Michel; Lucet, Jean-Christophe
2010-03-01
To determine the effect of a 2-yr multifaceted program aimed at preventing ventilator-acquired pneumonia on compliance with eight targeted preventive measures. Pre- and postintervention observational study. A 20-bed medical intensive care unit in a teaching hospital. A total of 1649 ventilator-days were observed. The program involved all healthcare workers and included a multidisciplinary task force, an educational session, direct observations with performance feedback, technical improvements, and reminders. It focused on eight targeted measures based on well-recognized published guidelines, easily and precisely defined acts, and directly concerned healthcare workers' bedside behavior. Compliance assessment consisted of five 4-wk periods (before the intervention and 1 month, 6 months, 12 months, and 24 months thereafter). Hand-hygiene and glove-and-gown use compliances were initially high (68% and 80%) and remained stable over time. Compliance with all other preventive measures was initially low and increased steadily over time (before 2-yr level, p < .0001): backrest elevation (5% to 58%) and tracheal cuff pressure maintenance (40% to 89%), which improved after simple technical equipment implementation; orogastric tube use (52% to 96%); gastric overdistension avoidance (20% to 68%); good oral hygiene (47% to 90%); and nonessential tracheal suction elimination (41% to 92%). To assess overall performance of the last six preventive measures, using ventilator-days as the unit of analysis, a composite score for preventive measures applied (range, 0-6) was developed. The median (interquartile range) composite scores for the five successive assessments were 2 (1-3), 4 (3-5), 4 (4-5), 5 (4-6), and 5 (4-6) points; they increased significantly over time (p < .0001). Ventilator-acquired pneumonia prevalence rate decreased by 51% after intervention (p < .0001). Our active, long-lasting program for preventing ventilator-acquired pneumonia successfully increased compliance with preventive measures directly dependent on healthcare workers' bedside performance. The multidimensional framework was critical for this marked, progressive, and sustained change.
Riera, M; Aibar, E
2013-05-01
Some studies suggest that open access articles are more often cited than non-open access articles. However, the relationship between open access and citations count in a discipline such as intensive care medicine has not been studied to date. The present article analyzes the effect of open access publishing of scientific articles in intensive care medicine journals in terms of citations count. We evaluated a total of 161 articles (76% being non-open access articles) published in Intensive Care Medicine in the year 2008. Citation data were compared between the two groups up until April 30, 2011. Potentially confounding variables for citation counts were adjusted for in a linear multiple regression model. The median number (interquartile range) of citations of non-open access articles was 8 (4-12) versus 9 (6-18) in the case of open access articles (p=0.084). In the highest citation range (>8), the citation count was 13 (10-16) and 18 (13-21) (p=0.008), respectively. The mean follow-up was 37.5 ± 3 months in both groups. In the 30-35 months after publication, the average number (mean ± standard deviation) of citations per article per month of non-open access articles was 0.28 ± 0.6 versus 0.38 ± 0.7 in the case of open access articles (p=0.043). Independent factors for citation advantage were the Hirsch index of the first signing author (β=0.207; p=0.015) and open access status (β=3.618; p=0.006). Open access publishing and the Hirsch index of the first signing author increase the impact of scientific articles. The open access advantage is greater for the more highly cited articles, and appears in the 30-35 months after publication. Copyright © 2012 Elsevier España, S.L. and SEMICYUC. All rights reserved.
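A minimal sketch of the adjustment idea reported above, assuming a flat table of articles with hypothetical column names (citations, open_access coded 0/1, h_index_first_author, months_since_publication); this is not the authors' exact model specification:

# Linear multiple regression of citation counts on open-access status,
# adjusting for potential confounders such as the first author's Hirsch index.
import pandas as pd
import statsmodels.formula.api as smf

articles = pd.read_csv("icm_2008_articles.csv")          # hypothetical export
fit = smf.ols("citations ~ open_access + h_index_first_author + months_since_publication",
              data=articles).fit()
print(fit.params["open_access"], fit.pvalues["open_access"])   # adjusted citation advantage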
Automated external defibrillators and simulated in-hospital cardiac arrests.
Rossano, Joseph W; Jefferson, Larry S; Smith, E O'Brian; Ward, Mark A; Mott, Antonio R
2009-05-01
To test the hypothesis that pediatric residents would have shorter time to attempted defibrillation using automated external defibrillators (AEDs) compared with manual defibrillators (MDs). A prospective, randomized, controlled trial of AEDs versus MDs was performed. Pediatric residents responded to a simulated in-hospital ventricular fibrillation cardiac arrest and were randomized to using either an AED or MD. The primary end point was time to attempted defibrillation. Sixty residents, 21 (35%) interns, were randomized to 2 groups (AED = 30, MD = 30). Residents randomized to the AED group had a significantly shorter time to attempted defibrillation [median, 60 seconds (interquartile range, 53 to 71 seconds)] compared with those randomized to the MD group [median, 103 seconds (interquartile range, 68 to 288 seconds)] (P < .001). All residents in the AED group attempted defibrillation at <5 minutes compared with 23 (77%) in the MD group (P = .01). AEDs improve the time to attempted defibrillation by pediatric residents in simulated cardiac arrests. Further studies are needed to help determine the role of AEDs in pediatric in-hospital cardiac arrests.
Changing Epidemiology of Human Brucellosis, China, 1955-2014.
Lai, Shengjie; Zhou, Hang; Xiong, Weiyi; Gilbert, Marius; Huang, Zhuojie; Yu, Jianxing; Yin, Wenwu; Wang, Liping; Chen, Qiulan; Li, Yu; Mu, Di; Zeng, Lingjia; Ren, Xiang; Geng, Mengjie; Zhang, Zike; Cui, Buyun; Li, Tiefeng; Wang, Dali; Li, Zhongjie; Wardrop, Nicola A; Tatem, Andrew J; Yu, Hongjie
2017-02-01
Brucellosis, a zoonotic disease, was made statutorily notifiable in China in 1955. We analyzed the incidence and spatial-temporal distribution of human brucellosis during 1955-2014 in China using notifiable surveillance data: aggregated data for 1955-2003 and individual case data for 2004-2014. A total of 513,034 brucellosis cases were recorded, of which 99.3% were reported in northern China during 1955-2014, and 69.1% (258,462/374,141) occurred during February-July in 1990-2014. Incidence remained high during 1955-1978 (interquartile range 0.42-1.0 cases/100,000 residents), then decreased dramatically in 1979-1994. However, brucellosis has reemerged since 1995 (interquartile range 0.11-0.23 in 1995-2003 and 1.48-2.89 in 2004-2014); the historical high occurred in 2014, and the affected area expanded from northern pastureland provinces to the adjacent grassland and agricultural areas, then to southern coastal and southwestern areas. Control strategies in China should be adjusted to account for these changes by adopting a One Health approach.
Luttmann-Gibson, Heike; Sarnat, Stefanie Ebelt; Suh, Helen H; Coull, Brent A; Schwartz, Joel; Zanobetti, Antonella; Gold, Diane R
2014-02-01
We examine whether ambient air pollution is associated with oxygen saturation in 32 elderly subjects in Steubenville, Ohio. We used linear mixed models to examine the effects of fine particulate matter less than 2.5 μm (PM2.5), sulfate (SO4(2-)), elemental carbon, and gases on median oxygen saturation. An interquartile range increase of 13.4 μg/m3 in PM2.5 on the previous day was associated with a decrease of 0.18% (95% confidence interval: -0.31 to -0.06), and a 5.1 μg/m3 interquartile range increase in SO4(2-) on the previous day was associated with a decrease of 0.16% (95% confidence interval: -0.27 to -0.04), in oxygen saturation during the initial 5-minute rest period of the protocol. Increased exposure to air pollution, including the nontraffic pollutant SO4(2-) from industrial sources, led to changes in oxygen saturation that may reflect particle-induced pulmonary inflammatory or vascular responses.
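The per-IQR effect quoted above is just the fitted slope rescaled by the pollutant's interquartile range; a minimal mixed-model sketch, with assumed file and column names, would look like this:

# Random-intercept linear mixed model of median oxygen saturation on
# previous-day PM2.5, rescaled to a 13.4 ug/m3 (IQR) increment.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("steubenville_visits.csv")           # repeated visits per subject
m = smf.mixedlm("spo2_median ~ pm25_lag1 + temperature",
                data=panel, groups=panel["subject_id"]).fit()
iqr_pm25 = 13.4
print("Change in SpO2 per IQR increase in PM2.5 (%):", m.params["pm25_lag1"] * iqr_pm25)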
Cancer patient experience, hospital performance and case mix: evidence from England.
Abel, Gary A; Saunders, Catherine L; Lyratzopoulos, Georgios
2014-01-01
This study aims to explore differences between crude and case mix-adjusted estimates of hospital performance with respect to the experience of cancer patients. This study analyzed the English 2011/2012 Cancer Patient Experience Survey covering all English National Health Service hospitals providing cancer treatment (n = 160). Logistic regression analysis was used to predict hospital performance for each of the 64 evaluative questions, adjusting for age, gender, ethnic group and cancer diagnosis. The degree of reclassification was explored across three categories (bottom 20%, middle 60% and top 20% of hospitals). There was high concordance between crude and adjusted ranks of hospitals (median Kendall's τ = 0.84; interquartile range: 0.82-0.88). Across all questions, a median of 5.0% (eight) of hospitals (interquartile range: 3.8-6.4%; six to ten hospitals) moved out of the extreme performance categories after case mix adjustment. In this context, patient case mix has only a small impact on measured hospital performance for cancer patient experience.
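One simple way to reproduce the crude-versus-adjusted comparison described above is indirect standardization from a patient-level case-mix model, followed by Kendall's tau on the hospital ranks; the sketch below uses hypothetical column names and is only one of several reasonable adjustment strategies, not necessarily the paper's:

# q_positive is assumed to be a 0/1 response to one evaluative question.
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import kendalltau

resp = pd.read_csv("cpes_2012.csv")                      # one row per respondent

crude = resp.groupby("hospital")["q_positive"].mean()    # crude hospital score

casemix = smf.logit("q_positive ~ age_group + sex + ethnic_group + cancer_site",
                    data=resp).fit(disp=0)               # case-mix-only model
resp["expected"] = casemix.predict(resp)
adjusted = resp.groupby("hospital").apply(
    lambda g: g["q_positive"].mean() / g["expected"].mean())   # observed / expected

tau, p = kendalltau(crude.rank(), adjusted.rank())
print("Kendall's tau between crude and adjusted hospital ranks:", round(tau, 2))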
Sawinski, Deirdre; Patel, Nikunjkumar; Appolo, Brenda; Bloom, Roy
2017-05-01
Hepatitis C virus (HCV) infection is prevalent in the renal transplant population, but direct-acting antiviral agents (DAA) provide an effective cure of HCV infection without risk of allograft rejection. We report our experience treating 43 renal transplant recipients with 4 different DAA regimens. One hundred percent achieved a sustained viral response by 12 weeks after therapy, and DAA regimens were well tolerated. Recipients transplanted with an HCV+ donor kidney responded equally well to DAA therapy as those transplanted with a kidney from an HCV- donor, but recipients of HCV+ organs experienced significantly shorter wait times to transplantation, 485 days (interquartile range, 228-783) versus 969 days (interquartile range, 452-2008; P = 0.02). On this basis, we advocate for a strategy of early posttransplant HCV eradication to facilitate use of HCV+ organs whenever possible. Additional studies are needed to identify the optimal DAA regimen for kidney transplant recipients, accounting for efficacy, timing relative to transplant, posttransplant clinical outcomes, and cost.
Stone, M; Collins, A L; Silins, U; Emelko, M B; Zhang, Y S
2014-03-01
There is increasing global concern regarding the impacts of large scale land disturbance by wildfire on a wide range of water and related ecological services. This study explores the impact of the 2003 Lost Creek wildfire in the Crowsnest River basin, Alberta, Canada on regional scale sediment sources using a tracing approach. A composite geochemical fingerprinting procedure was used to apportion the sediment efflux among three key spatial sediment sources: 1) unburned (reference) 2) burned and 3) burned sub-basins that were subsequently salvage logged. Spatial sediment sources were characterized by collecting time-integrated suspended sediment samples using passive devices during the entire ice free periods in 2009 and 2010. The tracing procedure combines the Kruskal-Wallis H-test, principal component analysis and genetic-algorithm driven discriminant function analysis for source discrimination. Source apportionment was based on a numerical mass balance model deployed within a Monte Carlo framework incorporating both local optimization and global (genetic algorithm) optimization. The mean relative frequency-weighted average median inputs from the three spatial source units were estimated to be 17% (inter-quartile uncertainty range 0-32%) from the reference areas, 45% (inter-quartile uncertainty range 25-65%) from the burned areas and 38% (inter-quartile uncertainty range 14-59%) from the burned-salvage logged areas. High sediment inputs from burned and the burned-salvage logged areas, representing spatial source units 2 and 3, reflect the lasting effects of forest canopy and forest floor organic matter disturbance during the 2003 wildfire including increased runoff and sediment availability related to high terrestrial erosion, streamside mass wasting and river bank collapse. The results demonstrate the impact of wildfire and incremental pressures associated with salvage logging on catchment spatial sediment sources in higher elevation Montane regions where forest growth and vegetation recovery are relatively slow. Copyright © 2013 Elsevier B.V. All rights reserved.
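To make the mass-balance un-mixing step concrete, here is a minimal Python sketch of a constrained apportionment with Monte Carlo resampling of source tracer signatures, in the spirit of the procedure described above; the tracer values, source names and optimiser settings are illustrative assumptions, not the Crowsnest data or the authors' model.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

# Hypothetical tracer concentrations: rows = source samples, columns = tracers.
sources = {
    "reference": rng.normal([10.0, 50.0, 5.0], 1.0, (30, 3)),
    "burned": rng.normal([20.0, 40.0, 8.0], 1.0, (30, 3)),
    "burned_salvage_logged": rng.normal([15.0, 60.0, 4.0], 1.0, (30, 3)),
}
target = np.array([16.0, 51.0, 5.8])   # tracer signature of a downstream suspended-sediment sample

def unmix(source_means, target):
    """Source proportions (non-negative, summing to 1) minimising relative mixing error."""
    def objective(p):
        return np.sum(((target - p @ source_means) / target) ** 2)
    n = source_means.shape[0]
    res = minimize(objective, np.full(n, 1.0 / n), method="SLSQP",
                   bounds=[(0.0, 1.0)] * n,
                   constraints=[{"type": "eq", "fun": lambda p: p.sum() - 1.0}])
    return res.x

# Monte Carlo: bootstrap the source samples, refit, and collect the proportion distributions.
draws = []
for _ in range(500):
    means = np.vstack([s[rng.integers(0, len(s), len(s))].mean(axis=0) for s in sources.values()])
    draws.append(unmix(means, target))
draws = np.array(draws)

for name, (q25, q50, q75) in zip(sources, np.percentile(draws, [25, 50, 75], axis=0).T):
    print(f"{name}: median {q50:.0%} (interquartile uncertainty range {q25:.0%}-{q75:.0%})")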
Sheahan, Anna; Feinstein, Lydia; Dube, Queen; Edmonds, Andrew; Chirambo, Chawanangwa Mahebere; Smith, Emily; Behets, Frieda; Heyderman, Robert; Van Rie, Annelies
2017-07-01
Based on clinical trial results, the World Health Organization recommends infant HIV testing at age 4-6 weeks and immediate antiretroviral therapy (ART) initiation in all HIV-infected infants. Little is known about the outcomes of HIV-infected infants diagnosed with HIV in the first weeks of life in resource-limited settings. We assessed ART initiation and mortality in the first year of life among infants diagnosed with HIV by 12 weeks of age, using a cohort of HIV-infected infants in Kinshasa and Blantyre diagnosed before 12 weeks to estimate 12-month cumulative incidences of ART initiation and mortality while accounting for competing risks. Multivariate models were used to estimate associations between infant characteristics and timing of ART initiation. One hundred and twenty-one infants were diagnosed at a median age of 7 weeks (interquartile range, 6-8). The cumulative incidence of ART initiation was 46% [95% confidence interval (CI), 36%, 55%] at 6 months and 70% (95% CI 60%, 78%) at 12 months. Only age at HIV diagnosis was associated with ART initiation by age 6 months, with a subdistribution hazard ratio of 0.70 (95% CI 0.52, 0.91) for each week increase in age at DNA polymerase chain reaction test. The 12-month cumulative incidence of mortality was 20% (95% CI 13%, 28%). Despite early diagnosis of HIV, ART initiation was slow and mortality remained high, underscoring the complexity of translating clinical trial findings and World Health Organization guidance into real-life practice. Novel and creative health system interventions will be required to ensure that all HIV-infected infants achieve optimal treatment outcomes under routine care settings.
Ng, Khuen F; Tan, Kah K; Sam, Zhi H; Ting, Grace Ss; Gan, Wan Y
2017-04-01
The aim of this study is to describe epidemiology, clinical features, laboratory data and severity of respiratory syncytial virus (RSV) acute lower respiratory infection (ALRI) in Malaysian children and to determine risk factors associated with prolonged hospital stay, paediatric intensive care unit (PICU) admission and mortality. Retrospective data on demographics, clinical presentation, outcomes and laboratory findings of 450 children admitted into Tuanku Jaafar Hospital in Seremban, Malaysia from 2008 to 2013 with documented diagnosis of RSV ALRI were collected and analysed. Most admissions were children below 2 years old (85.8%; 386/450). Commonest symptoms were fever (84.2%; 379/450), cough (97.8%; 440/450) and rhinorrhea (83.6%; 376/450). The median age among febrile patients (n = 379) was 9.0 months with interquartile range (IQR) of 4.0-19.0 months whereas the median age among those who were apyrexial (n = 71) was 2 months with IQR of 1-6 months (P-value <0.001). 15.3% (69/450) needed intensive care and 1.6% (7/450) died. Young age, history of prematurity, chronic comorbidity and thrombocytosis were significantly associated with prolonged hospital stay, PICU admission and mortality. Infants less than 6 months old with RSV ALRI tend to be afebrile at presentation. Younger age, history of prematurity, chronic comorbidity and thrombocytosis are predictors of severe RSV ALRI among Malaysian children. Case fatality rate for Malaysian children below 5 years of age with RSV ALRI in our centre is higher than what is seen in developed countries, suggesting that there is room for improvement. © 2016 Paediatrics and Child Health Division (The Royal Australasian College of Physicians).
Aquilani, Angela; Pires Marafon, Denise; Marasco, Emiliano; Nicolai, Rebecca; Messia, Virginia; Perfetti, Francesca; Magni-Manzoni, Silvia; De Benedetti, Fabrizio
2018-05-01
To evaluate the rate of flare after etanercept (ETN) withdrawal in patients with juvenile idiopathic arthritis (JIA) who attained clinical remission while taking medication, and to identify predictors of flare. Patients with oligoarticular JIA (oJIA) or rheumatoid factor-negative polyarticular JIA (pJIA) were included if they had received a first course of ETN for at least 18 months, maintained clinically inactive disease (CID) for at least 6 months during treatment, and were followed for 12 months after ETN withdrawal. Demographic and clinical features were collected at onset, at baseline (initiation of ETN), and at time of disease flare. After ETN withdrawal, 66 of the 110 patients enrolled (60%) flared with arthritis (of whom 7 flared with concurrent anterior uveitis; none with uveitis alone). The median time to flare was 4.3 months (interquartile range 2.5-6.4) with no evident differences between oJIA and pJIA. The number and type of joints involved at baseline and characteristics of ETN treatment/discontinuation were not associated with flare. Patients who flared were more frequently males (p = 0.034), positive for antinuclear antibody (ANA; p = 0.047), and had higher values of C-reactive protein (CRP; p = 0.012) at baseline. These variables remained significantly associated with flare in a multivariate logistic analysis, although the model accounted for only 14% of the variability in flare occurrence. Our results show that a significant proportion of patients with JIA who maintain CID for at least 6 months experience a relapse after ETN withdrawal. Male sex, presence of ANA, and elevated CRP at baseline were associated with higher risk of flare.
Tang, Jennifer Y-M; Wong, Gloria H-Y; Ng, Carmen K-M; Kwok, Dorothy T-S; Lee, Maggie N-Y; Dai, David L-K; Lum, Terry Y-S
2016-03-01
To examine the neuropsychological and clinical profile of help-seekers in an early-detection community dementia program and to explore any relationship between profiles and time to seek help. Cross-sectional. Early-detection community dementia program. Help-seekers (N = 1,005) with subjective cognitive complaints or complaints from an informant. Neurocognitive testing, including the Cantonese Mini-Mental State Examination (MMSE), Clock Drawing Test, Digit Span, and Fuld Object Memory Evaluation, and other clinical and functioning assessments, including the Clinical Dementia Rating (CDR), activities of daily living (ADLs), instrumental ADLs (IADLs), and depressive symptoms. Time since the person or an informant reported that they first noticed symptoms. Eighty-six percent of help-seekers had at least very mild dementia (CDR score ≥0.5). Cognitive performance was moderately impaired (mean MMSE score 18.4 ± 6.1). They required some assistance with IADLs, had very mild ADL impairments, and had few depressive symptoms. Median time to seek assessment was 12 months (interquartile range 7-30 months) according to the person or the informant (an adult child in 75% of the sample). Using the median-split method, time to seek assessment was classified as early (0-12 months) or late (>12 months). Late help-seekers showed worse cognitive and IADL performance than early help-seekers, whereas ADL performance and depressive symptoms did not differ. Longer intervals between symptom recognition and assessment showed a trend toward further impairment on all measures except ADLs. A time interval of more than 12 months between symptom recognition and early assessment appears to be associated with worse cognitive function upon presentation. © 2016, Copyright the Authors Journal compilation © 2016, The American Geriatrics Society.
Chandy, David D; Kare, Jahnavi; Singh, Shakal N; Agarwal, Anjoo; Das, Vinita; Singh, Urmila; Ramesh, V; Bhatia, Vijayalakshmi
2016-07-01
We assessed the effect of vitamin D supplementation on related biochemistry, infection and dentition of the infant. In a double-blind, placebo-controlled trial conducted in Lucknow, India (latitude 26°N), 230 mother-newborn pairs were randomised to receive, for 9 months, 3000 µg/month oral vitamin D3 taken by the mother (group A), 10 µg/d taken by the infant (group B) or double placebo (group C). All babies received 15 min of sun exposure (unclothed) during massage. Infants' median 25-hydroxyvitamin D (25(OH)D) was lower in group C (median 45.3; interquartile range (IQR) 22-59.5 nmol/l) than in group A (median 60.8; IQR 41.3-80.5 nmol/l) or group B. Elevated alkaline phosphatase (>7.5 µkat/l) was significantly more frequent in group C babies (16%) than in group A (4%) or group B (0%) babies. The number of days with respiratory or diarrhoeal infection by 9 months of age was higher in group C (median 46.5; IQR 14.8-73.3 d) than in group A (median 18.5; IQR 8.8-31.0 d; P<0.01) or group B (median 13.0; IQR 7.0-28.5 d; P<0.05). We conclude that monthly maternal or daily infant supplementation with vitamin D along with sun exposure is superior to sun exposure alone in maintaining normal infant 25(OH)D at 3.5 months, and provides protection from elevated alkaline phosphatase and infectious morbidity.
Dengler, Julius; Sturesson, Bengt; Kools, Djaya; Prestamburgo, Domenico; Cher, Daniel; van Eeckhoven, Eddie; Erk, Emanuel; Pflugmacher, Robert; Vajkoczy, Peter
2016-11-01
The first results from the randomized, controlled iFuse Implant System Minimally Invasive Arthrodesis (iMIA) trial showed that minimally invasive surgical management (MISM) of low back pain originating from the sacroiliac joint (SIJ) by placing transarticular triangular titanium implants reduced pain more effectively than conservative management (CM). We now conducted a separate analysis of the iMIA data to assess whether the referred leg pain (RLP) component of SIJ-associated pain may also be affected by MISM or CM. Data from 101 patients, recruited between June 2013 and May 2015 at nine European spine care centers, were included. Forty-nine patients were randomized to CM and 51 patients to MISM. RLP was defined as pain below the gluteal fold and assessed using the visual analogue scale (VAS). Changes in RLP over 6 months were the primary endpoint. The prevalence of clinically significant RLP was 76.2 %. Over 6 months of follow-up, CM produced no significant change in RLP, which was 51.0 VAS points (interquartile range (IQR) 17.0-75.0) at baseline. In contrast, in the MISM cohort, we found a significant decrease in RLP from VAS 58.0 (IQR 24.5-80.0) at baseline to VAS 13.5 (IQR 0.0-39.3) after 6 months (p < 0.01). Improvement of RLP was associated only with the type of treatment (OR 5.04, p < 0.01), but not with patient age, sex, or different patterns of pain referral. Our analysis shows that RLP is a frequent phenomenon in patients with SIJ-associated pain. At 6 months of follow-up, MISM helped relieve RLP more effectively than CM. Clinical Trial Registration-URL: http://www.clinicaltrials.gov . Unique identifier: NCT01741025.
Exercise Improves Cognition in Parkinson’s Disease: the PRET-PD Randomized Clinical Trial
David, Fabian J.; Robichaud, Julie A.; Leurgans, Sue E.; Poon, Cynthia; Kohrt, Wendy M.; Goldman, Jennifer G.; Comella, Cynthia L.; Vaillancourt, David E.; Corcos, Daniel M.
2015-01-01
Background: This paper reports on the findings of the effect of two structured exercise interventions on secondary cognitive outcomes which were gathered as part of the Progressive Resistance Exercise Training in Parkinson's disease randomized controlled trial. Methods: This study was a prospective, parallel-group, single-center trial. Fifty-one non-demented patients with mild-to-moderate Parkinson's disease were randomly assigned either to modified Fitness Counts or to Progressive Resistance Exercise, and were followed for 24 months. Cognitive outcomes were the Digit Span, Stroop, and Brief Test of Attention. Results: Eighteen patients in modified Fitness Counts and 20 patients in Progressive Resistance Exercise completed the trial. At 12 and at 24 months no differences between groups were observed. At 12 months, relative to baseline, modified Fitness Counts improved on the Digit Span (estimated change, 0.3; Inter-Quartile Range, 0, 0.7; p=0.04) and Stroop (0.3; 0, 0.6; p=0.04), and Progressive Resistance Exercise improved only on the Digit Span (0.7; 0.3, 1; p<0.01). At 24 months, relative to baseline, modified Fitness Counts improved on the Digit Span (0.7; 0.3, 1.7; p<0.01) and Stroop (0.3; 0.1, 0.5; p=0.03), while Progressive Resistance Exercise improved on the Digit Span (0.5; 0.2, 0.8; p<0.01), Stroop (0.2; −0.1, 0.6; p=0.048), and Brief Test of Attention (0.3; 0, 0.8; p=0.048). No neurologic or cognitive adverse events were seen. Conclusions: This study provides Class IV level of evidence that 24 months of Progressive Resistance Exercise or modified Fitness Counts may improve attention and working memory in non-demented patients with mild-to-moderate Parkinson's disease. PMID:26148003
Guillemi, Silvia A; Ling, Sean H; Dahlby, Julia S; Yip, Benita; Zhang, Wendy; Hull, Mark W; Lima, Viviane Dias; Hogg, Robert S; Werb, Ronald; Montaner, Julio S; Harris, Marianne
2016-01-01
Tenofovir disoproxil fumarate (TDF)-associated renal dysfunction may abate when TDF is replaced with abacavir (ABC). The extent to which the third drug atazanavir contributes to renal dysfunction is unclear. A retrospective analysis was conducted on adults who had plasma viral load (pVL) <200 copies/mL for ≥6 months while receiving TDF/lamivudine (3TC)- or TDF/emtricitabine (FTC)-based antiretroviral therapy (ART), then switched to ABC/3TC while retaining the third drug in the ART regimen. CD4, pVL, creatinine, estimated glomerular filtration rate (eGFR), serum phosphorus, urine albumin to creatinine ratio and serum lipids were compared between pre-switch baseline and 3, 6 and 12 months after the switch to ABC. A total of 286 patients switched from TDF to ABC between 2004 and 2014: 232 (81%) male, median age 48 years (interquartile range (IQR) 42, 56). The third drug was atazanavir (± ritonavir) in 141 (49%) cases. The pVL was <50 copies/mL in 93 to 96% at all time points. Median serum creatinine was 93 µmol/L (IQR 80-111) at baseline and decreased to 88 µmol/L (IQR 78-98) at 12 months after the switch to ABC. Median eGFR increased from 74 mL/min (IQR 60-88) at baseline to 80 mL/min (IQR 69-89) at 12 months. Results were not significantly different between patients on atazanavir versus those on another third drug. Viral suppression was maintained among patients who switched from TDF/3TC or TDF/FTC to ABC/3TC. Serum creatinine and eGFR improved up to 12 months after switching to ABC/3TC, irrespective of whether or not patients were also receiving atazanavir ± ritonavir.
Choi, Chahien; Kim, Woo Young; Lee, Dong Hee; Lee, San Hui
2018-03-01
We aimed to evaluate the impact of topical hemostatic sealants and bipolar coagulation during laparoscopic ovarian endometriotic cyst resection on ovarian reserve by comparing the rates of decrease in anti-Müllerian hormone (AMH). Data were collected prospectively from 80 women aged 19-45 years scheduled for laparoscopic ovarian cystectomy at one of two institutions, Kangbuk Samsung Hospital, Seoul, Korea, or National Health Insurance Service Ilsan Hospital, Goyang, Korea, from January 2014 to April 2016. Patients were randomly divided into two groups treated with either a topical hemostatic sealant or bipolar coagulation for hemostasis. The hemostatic group was randomized to the FloSeal or TachoSil subgroups. Preoperative and 3-month postoperative AMH levels were checked and the rates of decrease of AMH were compared. All patients enrolled were treated with dienogest (Visanne) for 6-12 months. None were lost to follow-up at postoperative 3 months, but about one-third of the patients had been lost to follow-up by 6-12 months. AMH was significantly decreased in both groups 3 months postoperatively; however, the rate of decrease in the bipolar coagulation group was greater than that in the hemostatic sealant group, 41.9% (interquartile range [IQR], 22.29-65.24) versus 18.1% (IQR, 10.94-29.90), P = 0.007. Between the two hemostatic subgroups, there was no significant difference in AMH decrease rate, 14.95% (IQR, 11.34-21.21) versus 18.1% (IQR 9.76-40.70), P = 0.204. Hemostatic sealants may be an alternative to bipolar coagulation for preservation of ovarian reserve after laparoscopic ovarian cystectomy for endometriosis. © 2017 Japan Society of Obstetrics and Gynecology.
Njuguna, Njambi; Ngure, Kenneth; Mugo, Nelly; Sambu, Carrole; Sianyo, Christopher; Gakuo, Stephen; Irungu, Elizabeth; Baeten, Jared; Heffron, Renee
2016-06-01
More than half of human immunodeficiency virus (HIV)-infected individuals in Kenya are unaware of their status, and young women carry a disproportionate burden of incident HIV infections. We sought to determine the effect of an SMS intervention on uptake of HIV testing among female Kenyan college students. We conducted a quasi-experimental study to increase HIV testing among women 18 to 24 years old. Four midlevel training colleges in Central Kenya were allocated to have their study participants receive either weekly SMS on HIV and reproductive health topics or no SMS. Monthly 9-question SMS surveys were sent to all participants for 6 months to collect data on HIV testing, sexual behavior, and HIV risk perception. We used multivariate Cox proportional hazards regression to detect differences in the time to the first HIV test reported by women during the study period. We enrolled 600 women between September 2013 and March 2014 of whom 300 received weekly SMS and monthly surveys and 300 received only monthly surveys. On average, women were 21 years of age (interquartile range, 20-22), 71.50% had ever had sex and 72.62% had never tested for HIV. A total of 356 women reported testing for HIV within the 6 months of follow-up: 67% from the intervention arm and 51% from the control arm (hazard ratio, 1.57; 95% confidence interval, 1.28-1.92). Use of weekly text messages about HIV prevention and reproductive health significantly increased rates of HIV testing among young Kenyan women and would be feasible to implement widely among school populations.
Telehealth Protocol to Prevent Readmission Among High-Risk Patients With Congestive Heart Failure.
Rosen, Daniel; McCall, Janice D; Primack, Brian A
2017-11-01
Congestive heart failure is the leading cause of hospital readmissions. We aimed to assess adherence to and effectiveness of a telehealth protocol designed to prevent hospital admissions for congestive heart failure. We recruited a random sample of 50 patients with congestive heart failure (mean age 61 years) from a managed care organization. We developed a telehealth platform allowing for daily, real-time reporting of health status and video conferencing. We defined adherence as the percentage of days on which the patient completed the daily check-in protocol. To assess efficacy, we compared admission and readmission rates between the 6-month intervention period and the prior 6 months. Primary outcomes were admissions and readmissions due to congestive heart failure, and secondary outcomes were admissions and readmissions due to any cause. Forty-eight patients (96%) completed the protocol. Approximately half (46%) were at high risk for readmission according to standardized measures. Median 120-day adherence was 96% (interquartile range, 92%-98%), and adherence did not significantly differ across sex, race, age, living situation, depression, cognitive ability, or risk for readmission. Approximately equal proportions of patients were admitted for all causes during the 6-month intervention period versus the comparison period (37% vs 43%; P = .32). Half as many patients were admitted for congestive heart failure during the 6-month intervention period compared with the comparison period (12% vs 25%; P = .11). Adherence to this telehealth protocol was excellent and consistent, even among high-risk patients. Future research should test the protocol using a more rigorous randomized design. Copyright © 2017 Elsevier Inc. All rights reserved.
Sikkens, Edmée C M; Cahen, Djuna L; de Wit, Jill; Looman, Caspar W N; van Eijck, Casper; Bruno, Marco J
2014-01-01
In cancer of the pancreatic head region, exocrine insufficiency is a well-known complication, leading to steatorrhea, weight loss, and malnutrition. Its presence is frequently overlooked, however, because the primary attention is focused on cancer treatment. To date, the risk of developing exocrine insufficiency is unspecified. Therefore, we assessed this function in patients with tumors of the pancreatic head, distal common bile duct, or ampulla of Vater. Between March 2010 and August 2012, we prospectively included patients diagnosed with cancer of the pancreatic head region at our tertiary center. To preclude the effect of a resection, we excluded operated patients. Each month, the exocrine function was determined with a fecal elastase test. Furthermore, endocrine function, steatorrhea-related symptoms, and body weight were evaluated. Patients were followed for 6 months, or until death. Thirty-two patients were included. The tumor was located in the pancreas in 75%, in the bile duct in 16%, and in the ampullary region in 9%, with a median size of 2.5 cm. At diagnosis, the prevalence of exocrine insufficiency was 66%, which increased to 92% after a median follow-up of 2 months (interquartile range, 1 to 4 mo). Most patients with cancer of the pancreatic head region were already exocrine insufficient at diagnosis, and within several months, this function was impaired in almost all cases. Given this high prevalence, physicians should be focused on diagnosing and treating exocrine insufficiency, to optimize the nutritional status and physical condition, especially for those patients undergoing palliative chemotherapy and/or radiotherapy.
Publication status of contemporary oncology randomised controlled trials worldwide.
Chen, Yu-Pei; Liu, Xu; Lv, Jia-Wei; Li, Wen-Fei; Zhang, Yuan; Guo, Ying; Lin, Ai-Hua; Sun, Ying; Mao, Yan-Ping; Ma, Jun
2016-10-01
Little is known about the extent of selective publication in contemporary oncology randomised controlled trials (RCTs) worldwide. This study aimed to evaluate the rates of publication and timely publication (within 24 months) for contemporary oncology RCTs from all over the world. We also investigated the trial characteristics associated with publication and timely publication. We identified all phase III oncology RCTs registered on ClinicalTrials.gov with a primary completion date between January 2008 and December 2012. We searched PubMed and EMBASE to identify publications. The final search date was 31 December 2015. Our primary outcome measure was the time to publication from the primary completion date to the date of primary publication in a peer-reviewed journal. We identified 598 completed oncology RCTs; overall, 398 (66.6%) had been published. For published trials, the median time to publication was 25 months (interquartile range, 16-37 months). Only 192 trials (32.1%) were published within 24 months. Timely publication was independently associated with trials completed late in 2012. Trials conducted in Asia and other regions were less likely to have timely publication, but trials conducted in different locations were all equally likely to be published. Industry- and NIH-funded trials were equally likely to be published timely or at any time after trial completion. Among 391 published trials with clear primary outcomes, there was a trend for timely publication of positive trials compared with negative trials. Despite the ethical obligations and societal expectations of disclosing findings promptly, oncology RCTs performed poorly. Copyright © 2016 Elsevier Ltd. All rights reserved.
Girma, Tsinuel; Kæstel, Pernille; Mølgaard, Christian; Michaelsen, Kim F; Hother, Anne-Louise; Friis, Henrik
2013-12-06
Severe acute malnutrition (SAM) has two main clinical manifestations, i.e., oedematous and non-oedematous. However, factors associated with oedema are not well established. Children 0.5-14 years of age with SAM (MUAC < 11.0 cm or weight-for-height < 70% of median and/or nutritional oedema) admitted to the nutrition unit were included. Information on infections before and during admission was collected together with anthropometry. Predictors of oedema were analysed separately for younger (< 60 months) and older children (≥ 60 months). 351 children were recruited (median age: 36 months (interquartile range 24 to 60); 43.3% females). Oedema was detected in 61.1%. The prevalence of oedema increased with age, peaked at 37-59 months (75%) and declined thereafter. Infection was more common in the younger group (33% vs. 8.9%, p < 0.001), and in this group children with oedema had fewer infections (25.2% vs. 45.1%, p = 0.001). In the older group the prevalence of infections was not different between oedematous and non-oedematous children (5.5% vs. 14.3%, p = 0.17). In the younger group oedema was less common in children with TB (OR = 0.20, 95% CI: 0.06, 0.70) or diarrhoea (OR = 0.40, 95% CI: 0.21, 0.73). The proportion of oedema in SAM peaked at three to five years of age and a considerable proportion was above 5 years. Furthermore, the prevalence of infection seemed to be lower among children with oedema. Further studies are needed to better understand the role of infection-immunity interaction.
Falls and Depression in Men: A Population-Based Study.
Stuart, Amanda L; Pasco, Julie A; Jacka, Felice N; Berk, Michael; Williams, Lana J
2018-01-01
The link between falls and depression has been researched in the elderly; however, little information is available on this association in younger adults, particularly men. This study sought to investigate the link between major depressive disorder (MDD) and falls in a population-based sample of 952 men (24-97 years). MDD was diagnosed utilizing the Structured Clinical Interview for DSM-IV-TR Research Version, Non-Patient edition, and categorized as 12-month/past/never. Body mass index and gait were measured; falls, smoking status, psychotropic medication use, and alcohol intake were self-reported as part of the Geelong Osteoporosis Study 5-year follow-up assessment. Thirty-four (3.6%) men met criteria for 12-month MDD, and 110 (11.6%) for past MDD. Of the 952 men, 175 (18.4%) reported falling at least once during the past 12 months. Fallers were older (66 [interquartile range: 48-79] vs. 59 [45-72] years, p = .001) and more likely to have uneven gait (n = 16, 10% vs. n = 31, 4%, p = .003) than nonfallers. Participants with 12-month MDD had more than twice the odds of falling (age-adjusted odds ratio: 2.22, 95% confidence interval [1.03, 4.80]). The odds of falling were not associated with past depression (p = .4). Further adjustments for psychotropic drug use, gait, body mass index, smoking status, blood pressure, and alcohol did not explain these associations. Given the 2.2-fold greater likelihood of falling associated with depression was not explained by age or psychotropic drug use, further research is warranted.
Askim, Torunn; Bernhardt, Julie; Churilov, Leonid; Indredavik, Bent
2016-11-11
The National Institutes of Health Stroke Scale (NIHSS) is the first choice among stroke scales. The Scandinavian Stroke Scale (SSS) is an alternative scale that is easy to apply in the clinic. To compare the ability of the SSS with that of the NIHSS in identifying patients who are dead or dependent at 3-month follow-up. A prospective study including patients with acute stroke. NIHSS and SSS measurements were obtained during the index hospital stay. The receiver operating characteristic curve was used to determine the optimal dichotomization of the NIHSS and the SSS by using a modified Rankin Scale (mRS) >2 at 3-month follow-up as the criterion standard. Positive and negative predictive values (PPV and NPV) were calculated. A total of 104 patients (mean age 79 years, 57.7% men) were included. Median (interquartile range (IQR)) NIHSS and SSS scores were 6.0 (2.0-11.8) and 43.5 (30.0-51.0), respectively. The areas under the curve were 0.769 and 0.796 for the NIHSS and the SSS, respectively (χ2 test, p = 0.303). The best cut-off point for the NIHSS was 6/7 points (PPV = 76.2%, NPV = 69.0%), while for the SSS it was 42/43 points (PPV = 71.4%, NPV = 73.2%). The SSS was as good as the NIHSS in identifying patients who had died or were dependent at 3-month follow-up. The measurement properties of the SSS should be investigated further.
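As an aside on the dichotomization procedure just described, the following is a minimal Python sketch of picking a stroke-scale cut-off and computing PPV and NPV against a binary poor-outcome reference; the scores and outcomes are invented for illustration, and the Youden index is one of several reasonable cut-off criteria, not necessarily the one the authors used.

import numpy as np

# Hypothetical scale scores and 3-month poor outcome (1 = mRS > 2).
scores = np.array([2, 3, 4, 5, 6, 7, 8, 9, 11, 13, 15, 17, 20])
poor = np.array([0, 0, 0, 0, 0, 1, 0, 1, 1, 1, 1, 1, 1])

best = None
for cutoff in np.unique(scores):
    pred = scores >= cutoff                          # classify score >= cutoff as predicted poor outcome
    tp = np.sum(pred & (poor == 1)); fp = np.sum(pred & (poor == 0))
    fn = np.sum(~pred & (poor == 1)); tn = np.sum(~pred & (poor == 0))
    sens, spec = tp / (tp + fn), tn / (tn + fp)
    ppv = tp / (tp + fp) if (tp + fp) else float("nan")
    npv = tn / (tn + fn) if (tn + fn) else float("nan")
    youden = sens + spec - 1                         # Youden index as the optimality criterion
    if best is None or youden > best[0]:
        best = (youden, cutoff, ppv, npv)

print(f"best cut-off: >= {best[1]} (PPV {best[2]:.1%}, NPV {best[3]:.1%})")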
Conway, Laurie J; Riley, Linda; Saiman, Lisa; Cohen, Bevin; Alper, Paul; Larson, Elaine L
2014-09-01
Despite substantial evidence to support the effectiveness of hand hygiene for preventing health care-associated infections, hand hygiene practice is often inadequate. Hand hygiene product dispensers that can electronically capture hand hygiene events have the potential to improve hand hygiene performance. A study on an automated group monitoring and feedback system was implemented from January 2012 through March 2013 at a 140-bed community hospital. An electronic system that monitors the use of sanitizer and soap but does not identify individual health care personnel was used to calculate hand hygiene events per patient-hour for each of eight inpatient units and hand hygiene events per patient-visit for the six outpatient units. Hand hygiene was monitored but feedback was not provided during a six-month baseline period and three-month rollout period. During the rollout, focus groups were conducted to determine preferences for feedback frequency and format. During the six-month intervention period, graphical reports were e-mailed monthly to all managers and administrators, and focus groups were repeated. After the feedback began, hand hygiene increased on average by 0.17 events/patient-hour in inpatient units (interquartile range = 0.14, p = .008). In outpatient units, hand hygiene performance did not change significantly. A variety of challenges were encountered, including obtaining accurate census and staffing data, engendering confidence in the system, disseminating information in the reports, and using the data to drive improvement. Feedback via an automated system was associated with improved hand hygiene performance in the short-term.
Brier, Moriah J; Chambless, Dianne; Gross, Robert; Su, H Irene; DeMichele, Angela; Mao, Jun J
2015-09-01
Poor adherence to oral adjuvant hormonal therapy for breast cancer is a common problem, but little is known about the relationship between self-report adherence measures and hormonal suppression. We evaluated the relationship of three self-report measures of medication adherence and oestrogen among patients on aromatase inhibitors (AIs). We recruited 235 women with breast cancer who were prescribed AI therapy. Participants self-reported AI adherence by completing the following: (1) a single item asking whether they took an AI in the last month, (2) a modified Morisky Medication Adherence Scale-8 (MMAS-8) and (3) the Visual Analog Scale (VAS). Serum estrone and estradiol were analysed using organic solvent extraction and Celite column partition chromatography, followed by radioimmunoassay. Ten percent of participants reported they had not taken an AI in the last month and among this group, median estrone (33.2 pg/ml [interquartile range (IQR)=22.3]) and estradiol levels (7.2 pg/mL [IQR=3.3]) were significantly higher than those in participants who reported AI use (median estrone=11.5 pg/mL [IQR=4.9]; median estradiol=3.4 pg/mL [IQR=2.1]; p<0.001). This relationship held when controlling for race and AI drug type. A single-item monthly-recall adherence measure for AIs was associated with oestrogen serum levels. This suggests that patient-reported monthly adherence may be a useful measure to identify early non-adherence behaviour and guide interventions to improve patient adherence to hormonal treatment. Copyright © 2015 Elsevier Ltd. All rights reserved.
Buch, Maya H; Hensor, Elizabeth M A; Rakieh, Chadi; Freeston, Jane E; Middleton, Edward; Horton, Sarah; Das, Sudipto; Peterfy, Charles; Tan, Ai Lyn; Wakefield, Richard J; Emery, Paul
2017-01-01
No proven treatment exists for ACPA-negative undifferentiated arthritis (UA). The aim of this study was to evaluate whether abatacept is effective in treating poor prognosis, ACPA-negative UA, including its effect on power Doppler on US (PDUS). A proof-of-concept, open-label, prospective study of 20 patients with DMARD-naïve, ACPA-negative UA (⩾2 joint synovitis) and PDUS ⩾ 1 with clinical and 20-joint US (grey scale/PDUS) assessments at baseline, 6, 12, 18 and 24 months. All patients received 12 months of abatacept (monotherapy for minimum first 6 months). The primary end point was a composite of the proportion of patients that at 6 months achieved DAS44 remission, a maximum of one swollen joint for at least 3 consecutive months and no radiographic progression (over 0-12 months). Twenty of the 23 patients screened were enrolled [14 female; mean (sd) age 53.4 (11.2) years, symptom duration 7.5 (0.9) months]. Two (10%) achieved the composite primary end point. A reduction in the mean (sd) DAS44 was observed from a baseline value of 2.66 (0.77) to 2.01 (0.81) at 6 months and to 1.78 (0.95) at 12 months. The DAS44 remission rates were 6/20 (30%; 95% CI: 15, 51%) at 6 months and 8/20 (40%; 95% CI: 22, 62%) at 12 months. A striking decrease in the median (interquartile range; IQR) total PDUS score was noted from 10 (4-23) at baseline to 3 (2-12) and 3 (0-5) at 6 and 12 months, respectively. This report is a first in potentially identifying an effective therapy, abatacept monotherapy, for poor-prognosis, ACPA-negative UA, supported by a clear reduction in PDUS. These data justify evaluation in a controlled study. © The Author 2016. Published by Oxford University Press on behalf of the British Society for Rheumatology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Aso, Yoshimasa; Terasawa, Tomoko; Kato, Kanako; Jojima, Teruo; Suzuki, Kunihiro; Iijima, Toshie; Kawagoe, Yoshiaki; Mikami, Shigeru; Kubota, Yoshiro; Inukai, Toshihiko; Kasai, Kikuo
2013-11-01
A soluble form of CD26/dipeptidyl peptidase 4 (sCD26/DPP4) is found in serum and it has DPP4 enzymatic activity. We investigated whether the serum level of sCD26/DPP4 was influenced by the oral glucose tolerance test (OGTT) in healthy subjects. The serum sCD26/DPP4 level increased significantly from 824.5 ng/mL (interquartile range, from 699.0 to 1050 ng/mL) at baseline to a peak of 985.0 ng/mL (interquartile range, from 796.5 to 1215 ng/mL) during the OGTT (P < 0.0001). The peak sCD26/DPP4 level correlated positively with the baseline age and body mass index, and fasting plasma glucose (FPG), homeostasis model assessment of insulin resistance (HOMA-IR), triglycerides (TG), alanine aminotransferase, and γ-glutamyl transpeptidase (GGT) levels whereas it correlated negatively with high-density lipoprotein (HDL) cholesterol and the serum levels of total and high-molecular weight (HMW) adiponectin. Stepwise regression analysis was done with forward selection of variables, including age, FPG, HOMA-IR, TG, HDL cholesterol, uric acid, GGT, C-reactive protein, and HMW adiponectin. In a model that explained 57.5% of the variation of the peak sCD26/DPP4 level, GGT (β = 0.382, P = 0.007) and HOMA-IR (β = 0.307, P = 0.034) were independent determinants of the peak serum level of sCD26/DPP4. Serum HMW adiponectin decreased significantly from 4.43 μg/mL (interquartile range, from 2.80 to 6.65 μg/mL) at baseline to 4.17 μg/mL (interquartile range, from 2.48 to 6.96 μg/mL) 120 minutes after the oral glucose load (P < 0.0001). The baseline serum level of sCD26/DPP4 showed a significant negative correlation with the percent change of HMW adiponectin during the OGTT. In conclusion, the serum level of sCD26/DPP4 increased acutely after an oral glucose load in apparently healthy subjects. The abrupt increase of serum sCD26/DPP4 after a glucose load may be a marker of insulin resistance that could come from liver or muscle. Copyright © 2013 Mosby, Inc. All rights reserved.
Adverse events and dropouts in Alzheimer's disease studies: what can we learn?
Henley, David B; Sundell, Karen L; Sethuraman, Gopalan; Schneider, Lon S
2015-01-01
Interpreting Alzheimer's disease (AD) clinical trial (CT) outcomes is complicated by treatment dropouts and adverse events (AEs). In elderly participants, AE rates, dropouts, and deaths are important considerations as they may undermine the validity of clinical trials. Published discontinuation and safety data are limited. Safety data from 1054 placebo-treated participants in IDENTITY and IDENTITY-2, 76-week, Phase 3 AD studies conducted in 31 countries, were pooled, annualized, and summarized overall, by country and age group. Median age was 74.2 (interquartile range 67.9-79.5) years; 57.4% were female; and median observation time was 63.2 (interquartile range 41.6-77.4) weeks when study drug dosing was halted. Overall annualized rates for discontinuations, discontinuations due to AEs, serious adverse events (SAEs), and deaths were 21.6% (range 19.6%-24.0%), 8.2% (range 8.1%-8.3%), 12.0%, and 1.7%, respectively. AE and discontinuation rates varied by country and age groups. Fall, pneumonia, and atrial fibrillation AEs were more frequent in the oldest age group. These annualized placebo safety data provide insight into the course of enrolled patients with mild-to-moderate AD, and are useful in planning longer term trials and in monitoring safety. Copyright © 2015 The Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
[Fluoride content of bottled natural mineral waters in Spain and prevention of dental caries].
Maraver, Francisco; Vitoria, Isidro; Almerich-Silla, José Manuel; Armijo, Francisco
2015-01-01
The aim of the study was to determine the concentration of fluoride in natural mineral waters marketed in Spain in order to prevent tooth decay without the risk of causing dental fluorosis. Descriptive and cross-sectional study conducted during 2012. Natural mineral waters marketed in Spain. Three bottles with different bottling dates of 109 natural mineral waters (97 Spanish and 12 imported brands). Fluoride was determined by ion chromatography. The median fluoride concentration of the natural mineral waters bottled in Spain was 0.22 mg/L (range 0.00-4.16; interquartile range: 0.37). Most samples (61 brands, 62%) contained less than 0.3 mg/L. Nineteen Spanish brands contained more than 0.6 mg/L. The median level in imported brands was 0.35 mg/L (range 0.10-1.21; interquartile range: 0.23). Only 28 of the 109 brands examined (25.6%) specified the fluoride content on the label. Good correlation was observed between the concentrations indicated and those determined. Fluoride concentrations in natural mineral waters showed high variation. Given the growing consumption of natural mineral waters in Spain, this type of information is important to make proper use of fluoride in the primary prevention of dental caries. Copyright © 2014 Elsevier España, S.L.U. All rights reserved.
Papageorgiou, Spyridon N; Antonoglou, Georgios N; Sándor, George K; Eliades, Theodore
2017-01-01
A priori registration of randomized clinical trials is crucial to the transparency and credibility of their findings. The aim of this study was to assess the frequency with which registered and completed randomized trials in orthodontics are published. We searched ClinicalTrials.gov and ISRCTN for registered randomized clinical trials in orthodontics that had been completed up to January 2017 and judged the publication status and date of registered trials using a systematic protocol. Statistical analysis included descriptive statistics, chi-square or Fisher exact tests, and Kaplan-Meier survival estimates. From the 266 orthodontic trials registered up to January 2017, 80 trials had been completed and included in the present study. Among these 80 included trials, the majority (76%) were registered retrospectively, while only 33 (41%) were published at the time. The median time from completion to publication was 20.1 months (interquartile range: 9.1 to 31.6 months), while survival analysis indicated that less than 10% of the trials were published after 5 years from their completion. Finally, 22 (28%) of completed trials remained unpublished even after 5 years from their completion. Publication rates of registered randomized trials in orthodontics remained low, even 5 years after their completion date.
Clinical spectrum of psychogenic non epileptic seizures in children; an observational study.
Madaan, Priyanka; Gulati, Sheffali; Chakrabarty, Biswaroop; Sapra, Savita; Sagar, Rajesh; Mohammad, Akbar; Pandey, R M; Tripathi, Manjari
2018-07-01
The current study was designed to analyze the clinical spectrum of psychogenic non-epileptic seizures (PNES) in children. Children aged 6-16 years with clinically suspected PNES, confirmed by short-term video electroencephalogram (STVEEG) and induction, were classified as per the Seneviratne classification. Stressors, associated comorbidities, verbal IQ (intelligence quotient) and behavioral abnormalities were assessed using the HTP (house-tree-person) test, DSM-IV-TR (Diagnostic and Statistical Manual of Mental Disorders) criteria, the MISIC (Malin intelligence scale for Indian children) and the CBCL (Child Behaviour Checklist). Eighty children with PNES (45 boys; mean age 10.5 ± 1.6 years) were enrolled. The median delay in diagnosis was 5 months (interquartile range (IQR) 0.5-48 months), and 45% of patients were already on antiepileptic drugs (AEDs). The commonest semiology was dialeptic (42.5%), followed by mixed (28.8%), motor (15%) and nonepileptic aura (13.8%). Family stressors were the commonest, followed by school-related issues. The most common psychiatric comorbidity was adjustment disorder. Somatic complaints were observed in 50% of children. Dialeptic PNES is commonest in children. In resource-constrained settings, STVEEG along with induction is a reliable method to diagnose PNES. A comprehensive assessment protocol (including assessment of stressors) is needed for holistic management of pediatric PNES. Copyright © 2018 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.
Martin, K; Gertler, R; Sterner, A; MacGuill, M; Schreiber, C; Hörer, J; Vogt, M; Tassani, P; Wiesner, G
2011-08-01
ε-Aminocaproic acid (EACA) and tranexamic acid (TXA) are used for antifibrinolytic therapy in neonates undergoing cardiac surgery, although data directly comparing their blood-sparing efficacy are not yet available. We compared two consecutive cohorts of neonates for the effect of these two medications on perioperative blood loss and allogeneic transfusions. Data from the EACA group (n = 77) were collected over a 12-month period; data from the tranexamic acid group (n = 28) were collected over a 5-month period. Blood loss, rate of reoperation due to bleeding, and transfusion requirements were measured. There was no significant difference in blood loss at 6 hours (EACA 24 [17-30] mL/kg [median (interquartile range)] vs. TXA 20 [11-34] mL/kg, P = 0.491), at 12 hours (EACA 31 [22-38] mL/kg vs. TXA 27 [19-43] mL/kg, P = 0.496) or at 24 hours postoperatively (EACA 41 [31-47] mL/kg vs. TXA 39 [27-60] mL/kg; P = 0.625), or in transfusion of blood products. ε-Aminocaproic acid and tranexamic acid are equally effective with respect to perioperative blood loss and transfusion requirements in newborns undergoing cardiac surgery. © Georg Thieme Verlag KG Stuttgart · New York.
Barton, David J; Kumar, Raj G; McCullough, Emily H; Galang, Gary; Arenth, Patricia M; Berga, Sarah L; Wagner, Amy K
2016-01-01
To (1) examine relationships between persistent hypogonadotropic hypogonadism (PHH) and long-term outcomes after severe traumatic brain injury (TBI); and (2) determine whether subacute testosterone levels can predict PHH. Level 1 trauma center at a university hospital. Consecutive sample of men with severe TBI between 2004 and 2009. Prospective cohort study. Post-TBI blood samples were collected during week 1, every 2 weeks until 26 weeks, and at 52 weeks. Serum hormone levels were measured, and individuals were designated as having PHH if 50% or more of samples met criteria for hypogonadotropic hypogonadism. At 6 and 12 months postinjury, we assessed global outcome, disability, functional cognition, depression, and quality of life. We recruited 78 men; median (interquartile range) age was 28.5 (22-42) years. Thirty-four patients (44%) had PHH during the first year postinjury. Multivariable regression, controlling for age, demonstrated PHH status predicted worse global outcome scores, more disability, and reduced functional cognition at 6 and 12 months post-TBI. Two-step testosterone screening for PHH at 12 to 16 weeks postinjury yielded a sensitivity of 79% and specificity of 100%. PHH status in men predicts poor outcome after severe TBI, and PHH can accurately be predicted at 12 to 16 weeks.
Goswami, Gaurav; Upadhyay, Amit; Gupta, Navratan Kumar; Chaudhry, Rajesh; Chawla, Deepak; Sreenivas, V
2013-07-01
To compare the analgesic effect of direct breastfeeding, 25% dextrose solution and placebo during the first intramuscular whole-cell DPT injection in infants aged 6 weeks to 3 months. Randomized, placebo-controlled trial. Immunization clinic of the Department of Pediatrics, LLRM Medical College. Infants coming for their first DPT vaccination were randomized into three groups of 40 each. The primary outcome variable was the duration of cry after vaccination. Secondary outcome variables were the Modified Facial Coding Score (MFCS) and latency of onset of cry. A total of 120 babies were enrolled equally into the breastfed, 25% dextrose-fed and distilled water-fed groups. The median (interquartile range) duration of cry was significantly lower in breastfed babies (33.5 (17-54) seconds) and 25% dextrose-fed babies (47.5 (31-67.5) seconds) than in babies given distilled water (80.5 (33.5-119.5) seconds) (P<0.001). The MFCS at 1 min and 3 min was significantly lower in directly breastfed and dextrose-fed babies. Direct breastfeeding and 25% dextrose act as analgesics during DPT vaccination in young infants less than 3 months of age.
The existence of bronchiectasis predicts worse prognosis in patients with COPD
Mao, Bei; Lu, Hai-Wen; Li, Man-Hui; Fan, Li-Chao; Yang, Jia-Wei; Miao, Xia-Yi; Xu, Jin-Fu
2015-01-01
Bronchiectasis is prevalent in patients with COPD. The objective of this study was to assess the clinical characteristics and prognostic value of bronchiectasis in patients with COPD in China. Data from patients diagnosed with COPD at the Shanghai Pulmonary Hospital between January 2009 and December 2013 were retrospectively collected and analyzed. SPSS statistical software was used to analyze the data. Data from 896 patients with COPD were analyzed. Bronchiectasis was present in 311 patients. The isolation of Pseudomonas aeruginosa (PA) from sputum was the variable most significantly associated with the presence of bronchiectasis in patients with COPD (hazard ratio (HR), 2.93; 95% confidence interval (CI), 1.35–6.37; P = 0.007). During follow-up (median of 21 months; interquartile range: 10-39 months), there were 75 deaths, of which 39 were in the bronchiectasis group. The presence of bronchiectasis (HR, 1.77; 95% CI, 1.02–3.08; P = 0.043) was associated with an increase in all-cause mortality in patients with COPD. These results suggest that bronchiectasis in patients with COPD was associated with the isolation of PA from the sputum. Bronchiectasis was an independent risk factor for all-cause mortality in patients with COPD. PMID:26077673
Quah, H M; Seow-Choen, F
2004-03-01
This study was designed to compare diathermy excision and diathermy coagulation in the treatment of symptomatic prolapsed piles. Forty-five consecutive patients were randomly assigned to diathermy excision hemorrhoidectomy (Group A, n = 25) and diathermy coagulation (Group B, n = 20) under general anesthesia. The median duration of surgery was ten minutes for both groups. There was no statistical difference in the severity of postoperative pain at rest between the two groups, but Group A patients felt less pain during defecation on the third postoperative day (median, 5 (interquartile range, 3-7) vs. 8 (4-9); P = 0.04) and on the sixth postoperative day (median, 5 (interquartile range, 2-6) vs. 9 (5-10); P = 0.02). There was, however, no statistical difference in postoperative oral analgesics use and patients' satisfaction scores between the two groups. Complication rates were similar except that diathermy coagulation tended to leave some residual skin components of external hemorrhoid especially in very large prolapsed piles. Group A patients resumed work earlier (mean, 12 (range, 4-20) vs. 17 (11-21) days); however, this was not statistically significant ( P = 0.1). Diathermy coagulation of hemorrhoids is a simple technique and may be considered in suitable cases.
McAlister, Finlay A; Grover, Steven; Padwal, Raj S; Youngson, Erik; Fradette, Miriam; Thompson, Ann; Buck, Brian; Dean, Naeem; Tsuyuki, Ross T; Shuaib, Ashfaq; Majumdar, Sumit R
2014-12-01
Survivors of ischemic stroke/transient ischemic attack (TIA) are at high risk for other vascular events. We evaluated the impact of 2 types of case management (hard touch with pharmacist or soft touch with nurse) added to usual care on global vascular risk. This is a prespecified secondary analysis of a 6-month trial conducted in outpatients with recent stroke/TIA who received usual care and were randomized to additional monthly visits with either nurse case managers (who counseled patients, monitored risk factors, and communicated results to primary care physicians) or pharmacist case managers (who were also able to independently prescribe according to treatment algorithms). The Framingham Risk Score (FRS) and the Cardiovascular Disease Life Expectancy Model (CDLEM) were used to estimate 10-year risk of any vascular event at baseline, 6 months (trial conclusion), and 12 months (6 months after last trial visit). Mean age of the 275 evaluable patients was 67.6 years. Both study arms were well balanced at baseline and exhibited reductions in absolute global vascular risk estimates at 6 months: median 4.8% (interquartile range (IQR) 0.3%-11.3%) for the pharmacist arm versus 5.1% (IQR 1.9%-12.5%) for the nurse arm on the FRS (P = .44 between arms) and median 10.0% (0.1%-31.6%) versus 12.5% (2.1%-30.5%) on the CDLEM (P = .37). These reductions persisted at 12 months: median 6.4% (1.2%-11.6%) versus 5.5% (2.0%-12.0%) for the FRS (P = .83) and median 8.4% (0.1%-28.3%) versus 13.1% (1.6%-31.6%) on the CDLEM (P = .20). Case management by nonphysician providers is associated with improved global vascular risk in patients with recent stroke/TIA. Reductions achieved during the active phase of the trial persisted after trial conclusion. Copyright © 2014 Mosby, Inc. All rights reserved.
Kelly, Paul J; Lin, Nancy U; Claus, Elizabeth B; Quant, Eudocia C; Weiss, Stephanie E; Alexander, Brian M
2012-04-15
Salvage stereotactic radiosurgery (SRS) is often considered in breast cancer patients previously treated for brain metastases. The goal of this study was to analyze clinical outcomes and prognostic factors for survival in the salvage setting. The authors retrospectively examined 79 consecutive breast cancer patients who received salvage SRS (interval of >3 months after initial therapy), 76 of whom (96%) received prior whole-brain radiation therapy. Overall survival (OS) and central nervous system (CNS) progression-free survival rates were calculated from the date of SRS using the Kaplan-Meier method. Prognostic factors were evaluated using the Cox proportional hazards model. Median age was 50.5 years. Fifty-eight percent of this population was estrogen receptor positive, 62% was HER2 positive, and 10% was triple negative. At the time of SRS, 95% had extracranial metastases, with 81% of extracranial metastases at other visceral sites (lung/pleura/liver). Forty-eight percent had stable extracranial disease. Median interval from initial brain metastases therapy to SRS was 8.4 months. Median CNS progression-free survival after SRS was 5.7 months (interquartile range [IQR], 3.6-11 months), and median OS was 9.8 months (IQR, 3.8-18 months). Eighty-two percent of evaluable patients received further systemic therapy after SRS. HER2 status (adjusted hazard ratio [HR], 2.4; P = .008) and extracranial disease status (adjusted HR, 2.7; P = .004) were significant prognostic factors for survival on multivariate analysis. In patients with good Karnofsky performance status, salvage SRS for breast cancer brain metastases is a reasonable treatment option, given an associated median survival in excess of 9 months. Furthermore, patients with HER2-positive tumors at diagnosis or stable extracranial disease at the time of SRS have an improved clinical course, with median survival of >1 year. Copyright © 2011 American Cancer Society.
Johnson, Skyler; Jackson, William; Li, Darren
2013-07-01
Purpose: To investigate the utility of the interval to biochemical failure (IBF) after salvage radiation therapy (SRT) after radical prostatectomy (RP) for prostate cancer as a surrogate endpoint for distant metastasis (DM), prostate cancer-specific mortality (PCSM), and overall mortality (OM). Methods and Materials: A retrospective analysis of 575 patients treated with SRT after RP from a single institution. Of those, 250 patients experienced biochemical failure (BF), with the IBF defined as the time from commencement of SRT to BF. The IBF was evaluated by Kaplan-Meier and Cox proportional hazards models for its association with DM, PCSM, and OM. Results: The median follow-up time was 85 (interquartile range [IQR] 49.8-121.1) months, with a median IBF of 16.8 (IQR, 8.5-37.1) months. With a cutoff time of 18 months, as previously used, 129 (52%) of patients had IBF ≤18 months. There were no differences among any clinical or pathologic features between those with IBF ≤ and those with IBF >18 months. On log-rank analysis, IBF ≤18 months was prognostic for increased DM (P<.0001, HR 4.9, 95% CI 3.2-7.4), PCSM (P<.0001, HR 4.1, 95% CI 2.4-7.1), and OM (P<.0001, HR 2.7, 95% CI 1.7-4.1). Cox proportional hazards models with adjustment for other clinical variables demonstrated that IBF was independently prognostic for DM (P<.001, HR 4.9), PCSM (P<.0001, HR 4.0), and OM (P<.0001, HR 2.7). IBF showed minimal change in performance regardless of androgen deprivation therapy (ADT) use. Conclusion: After SRT, a short IBF can be used for early identification of patients who are most likely to experience progression to DM, PCSM, and OM. IBF ≤18 months may be useful in clinical practice or as an endpoint for clinical trials.
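To make the surrogate-endpoint analysis concrete, the following is a minimal, hypothetical sketch (invented numbers, not the study's data) of dichotomizing the interval to biochemical failure at 18 months and fitting a Cox proportional hazards model for distant metastasis with the lifelines package; the variable names and cohort are illustrative only.

    # Hypothetical sketch: Cox model with an IBF <= 18 months indicator (not the authors' code).
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.DataFrame({
        "followup_months": [85, 60, 120, 40, 95, 30, 70, 110],   # time to DM or censoring
        "distant_metastasis": [1, 0, 1, 1, 0, 0, 1, 1],          # 1 = DM observed
        "ibf_months": [12, 30, 45, 8, 25, 15, 16, 40],           # interval to biochemical failure
    })
    df["ibf_le_18"] = (df["ibf_months"] <= 18).astype(int)       # early-failure indicator

    cph = CoxPHFitter()
    cph.fit(df[["followup_months", "distant_metastasis", "ibf_le_18"]],
            duration_col="followup_months", event_col="distant_metastasis")
    cph.print_summary()  # hazard ratio for IBF <= 18 months versus > 18 months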
Effect of daily remote monitoring on pacemaker longevity: a retrospective analysis.
Ricci, Renato Pietro; Morichelli, Loredana; Quarta, Laura; Porfili, Antonio; Magris, Barbara; Giovene, Lisa; Torcinaro, Sergio; Gargaro, Alessio
2015-02-01
Energy demand of remote monitoring in cardiac implantable electronic devices has never been investigated. Biotronik Home Monitoring (HM) is characterized by daily transmissions that may affect longevity. The aim of the study was to retrospectively compare longevity of a specific dual-chamber pacemaker model in patients with HM on and patients with HM off. Hospital files of 201 patients (mean age 87 ± 10 years, 78 men) who had received a Biotronik Cylos DR-T pacemaker between April 2006 and May 2010 for standard indication were reviewed. In 134 patients (67%), HM was activated. The primary end point was device replacement due to battery depletion. The median follow-up period was 56.4 months (interquartile range 41.8-65.2 months). The estimated device longevity was 71.1 months (95% confidence interval [CI] 69.1-72.3 months) in the HM-on group and 60.4 months (CI 55.9-65.1 months) in the HM-off group (P < .0001). The frequency of in-hospital visits with significant device reprogramming was higher in the HM-on group than in the HM-off group (33.3% vs 25.0%, respectively; P = .03). Lower ventricular pulse amplitude (2.3 ± 0.4 V vs 2.7 ± 0.5 V; P < .0001) and pacing percentage (49% ± 38% vs 64% ± 38%; P = .02), both calculated as time-weighted averages, were observed with HM on as compared with HM off. Patient attrition was significantly lower in the HM-on group (9.7%; 95% CI 3.0%-28.7%) than in the HM-off group (45.6%; 95% CI 30.3%-64.3%) (P < .0001). In normal practice, energy demand of HM, if present, was overshadowed by programming optimization likely favored by continuous monitoring. Pacemakers controlled remotely with HM showed an 11-month longer longevity. Patient retention was superior. Copyright © 2015 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.
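Since the abstract reports pulse amplitude and pacing percentage as time-weighted averages, here is a minimal, hypothetical sketch of that computation (invented measurement dates and values, not the study's data), weighting each recorded value by the time until the next reading:

    # Illustrative time-weighted average (synthetic follow-up data, not from the study).
    import numpy as np

    days   = np.array([0, 90, 210, 400, 600])        # day of each in-clinic measurement
    values = np.array([2.8, 2.5, 2.4, 2.2, 2.3])     # e.g. ventricular pulse amplitude, V

    weights = np.diff(days, append=days[-1] + 180)   # days each value is assumed to cover
    tw_mean = np.average(values, weights=weights)
    print(f"time-weighted average = {tw_mean:.2f} V")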
Zatzick, Douglas; O'Connor, Stephen S; Russo, Joan; Wang, Jin; Bush, Nigel; Love, Jeff; Peterson, Roselyn; Ingraham, Leah; Darnell, Doyanne; Whiteside, Lauren; Van Eaton, Erik
2015-10-01
Posttraumatic stress disorder (PTSD) and its comorbidities are endemic among injured trauma survivors. Previous collaborative care trials targeting PTSD after injury have been effective, but they have required intensive clinical resources. The present pragmatic clinical trial randomized acutely injured trauma survivors who screened positive on an automated electronic medical record PTSD assessment to collaborative care intervention (n = 60) and usual care control (n = 61) conditions. The stepped measurement-based intervention included care management, psychopharmacology, and psychotherapy elements. Embedded within the intervention were a series of information technology (IT) components. PTSD symptoms were assessed with the PTSD Checklist at baseline prerandomization and again at 1, 3, and 6 months postinjury. IT utilization was also assessed. The technology-assisted intervention required a median of 2.25 hours (interquartile range = 1.57 hours) per patient. The intervention was associated with modest symptom reductions that fell just short of statistical significance in the unadjusted model: F(2, 204) = 2.95, p = .055. The covariate-adjusted regression was significant: F(2, 204) = 3.06, p = .049. The PTSD intervention effect was greatest at the 3-month (Cohen's effect size d = 0.35, F(1, 204) = 4.11, p = .044) and 6-month (d = 0.38, F(1, 204) = 4.10, p = .044) time points. IT-enhanced collaborative care was associated with modest PTSD symptom reductions and reduced delivery times; the intervention model could potentially facilitate efficient PTSD treatment after injury. Copyright © 2015 Wiley Periodicals, Inc., A Wiley Company.
Rueda-Camino, J A; Bernal-Bello, D; Canora-Lebrato, J; Velázquez-Ríos, L; García de Viedma-García, V; Guerrero-Santillán, M; Duarte-Millán, M A; Cristóbal-Bilbao, R; Zapatero-Gaviria, A
2017-12-01
To assess the effect of high doses of corticosteroids in patients hospitalised for exacerbation of chronic obstructive pulmonary disease (COPD). A prospective cohort study was conducted on patients hospitalized with COPD between January and March 2015, grouped according to the glucocorticoid dosage administered (cutoff, 40 mg of prednisone/day). We compared hospital stay, readmission and mortality at 3 months after discharge. We analysed 87 patients. The median daily dose was 60 mg of prednisone (interquartile range, 46.67-82.33 mg/day), and the administration route was intravenous in 96.6% of the cases. We established a relative risk (RR) for hospital stays longer than 8 days of 1.095 (95% CI 0.597-2.007; P=.765) when steroid dosages greater than 40 mg/day were employed. In these patients, the hazard ratio (HR) for readmission in the 3 months after discharge was 0.903 (95% CI 0.392-2.082; P=.811), and the HR for mortality was 1.832 (95% CI 0.229-16.645; P=.568). Neither the RR nor the HR varied in a statistically significant manner after adjusting for confounding factors. A daily dose greater than 40 mg of prednisone in patients hospitalised for COPD exacerbation was not associated with a shorter hospital stay or a reduction in readmissions or mortality at 3 months. Copyright © 2017 Elsevier España, S.L.U. and Sociedad Española de Medicina Interna (SEMI). All rights reserved.
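As a brief, hypothetical illustration of the relative-risk calculation reported above (the counts below are invented, not the cohort's), a relative risk for prolonged hospital stay under high-dose versus standard-dose corticosteroids and its 95% confidence interval can be computed from a 2x2 table on the log scale:

    # Toy 2x2 example (hypothetical counts): RR with a 95% CI via the log-RR standard error.
    import numpy as np

    a, b = 20, 25   # high-dose (>40 mg/day): stay >8 days, stay <=8 days
    c, d = 18, 24   # standard dose (<=40 mg/day): stay >8 days, stay <=8 days

    rr = (a / (a + b)) / (c / (c + d))
    se_log_rr = np.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    ci_low, ci_high = np.exp(np.log(rr) + np.array([-1.96, 1.96]) * se_log_rr)
    print(f"RR = {rr:.3f} (95% CI {ci_low:.3f}-{ci_high:.3f})")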
[Residual pleural opacity in patients treated for pleural tuberculosis in Yaounde].
Balkissou, A D; Pefura-Yone, E W; Netong Gamgne, M; Endale Mangamba, L-M; Onana Ngono, I; Poka Mayap, V; Evouna Mbarga, A; Assamba Mpom, S A; Kanko, N F; Fodjeu, G; Tagne Kamdem, P E; Fogang, D; Kuaban, C
2016-04-01
The aim of this study was to evaluate the incidence and risk factors of residual pleural opacity (RPO) at the end of antituberculosis treatment (ATT) and 6 months later (M12) in adults with pleural tuberculosis. In this prospective cohort study, all patients admitted for pleural tuberculosis between September 2010 and August 2012 in the pneumology A unit of Yaounde Jamot Hospital were included. Each patient was then followed up for 12 months. RPO was considered significant if it measured 10 mm or more on standard chest X-ray. The logistic regression model was used to investigate the risk factors of significant RPO at the end of antituberculosis treatment. Of the 193 patients included, with a median (interquartile range) age of 33 (25-42) years, 115 (59.6%) were men. The incidence (95% CI) of significant RPO was 22.0% (14.9-29.1) and 11.0% (4.9-17.1) at the end of ATT and at M12, respectively. In multivariate analysis, the risk factors of the occurrence of a significant RPO at the end of ATT and at M12 were smoking, associated parenchymal lesions, and hypoglycopleuria. Cumulative incidence of RPO ≥ 10 mm was 22% at the end of ATT and 11% after 12 months from the beginning of treatment. Patients with risk factors of RPO ≥ 10 mm should benefit from greater surveillance and appropriate management. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
Left ventricular hypertrophy with strain and aortic stenosis.
Shah, Anoop S V; Chin, Calvin W L; Vassiliou, Vassilis; Cowell, S Joanna; Doris, Mhairi; Kwok, T'ng Choong; Semple, Scott; Zamvar, Vipin; White, Audrey C; McKillop, Graham; Boon, Nicholas A; Prasad, Sanjay K; Mills, Nicholas L; Newby, David E; Dweck, Marc R
2014-10-28
ECG left ventricular hypertrophy with strain is associated with an adverse prognosis in aortic stenosis. We investigated the mechanisms and outcomes associated with ECG strain. One hundred and two patients (age, 70 years [range, 63-75 years]; male, 66%; aortic valve area, 0.9 cm(2) [range, 0.7-1.2 cm(2)]) underwent ECG, echocardiography, and cardiovascular magnetic resonance. They made up the mechanism cohort. Myocardial fibrosis was determined with late gadolinium enhancement (replacement fibrosis) and T1 mapping (diffuse fibrosis). The relationship between ECG strain and cardiovascular magnetic resonance was then assessed in an external validation cohort (n=64). The outcome cohort was made up of 140 patients from the Scottish Aortic Stenosis and Lipid Lowering Trial Impact on Regression (SALTIRE) study and was followed up for 10.6 years (1254 patient-years). Compared with those without left ventricular hypertrophy (n=51) and left ventricular hypertrophy without ECG strain (n=30), patients with ECG strain (n=21) had more severe aortic stenosis, increased left ventricular mass index, more myocardial injury (high-sensitivity plasma cardiac troponin I concentration, 4.3 ng/L [interquartile range, 2.5-7.3 ng/L] versus 7.3 ng/L [interquartile range, 3.2-20.8 ng/L] versus 18.6 ng/L [interquartile range, 9.0-45.2 ng/L], respectively; P<0.001) and increased diffuse fibrosis (extracellular volume fraction, 27.4±2.2% versus 27.2±2.9% versus 30.9±1.9%, respectively; P<0.001). All patients with ECG strain had midwall late gadolinium enhancement (positive and negative predictive values of 100% and 86%, respectively). Indeed, late gadolinium enhancement was independently associated with ECG strain (odds ratio, 1.73; 95% confidence interval, 1.08-2.77; P=0.02), a finding confirmed in the validation cohort. In the outcome cohort, ECG strain was an independent predictor of aortic valve replacement or cardiovascular death (hazard ratio, 2.67; 95% confidence interval, 1.35-5.27; P<0.01). ECG strain is a specific marker of midwall myocardial fibrosis and predicts adverse clinical outcomes in aortic stenosis. © 2014 American Heart Association, Inc.
Predictors of Timely Access of Oncology Services and Advanced-Stage Cancer in an HIV-Endemic Setting
Suneja, Gita; Tapela, Neo; Mapes, Abigail; Pusoentsi, Malebogo; Mmalane, Mompati; Hodgeman, Ryan; Boyer, Matthew; Musimar, Zola; Ramogola-Masire, Doreen; Grover, Surbhi; Nsingo-Bvochora, Memory; Kayembe, Mukendi; Efstathiou, Jason; Lockman, Shahin; Dryden-Peterson, Scott
2016-01-01
Background. Three-quarters of cancer deaths occur in resource-limited countries, and delayed presentation contributes to poor outcome. In Botswana, where more than half of cancers arise in HIV-infected individuals, we sought to explore predictors of timely oncology care and evaluate the hypothesis that engagement in longitudinal HIV care improves access. Methods. Consenting patients presenting for oncology care from October 2010 to September 2014 were interviewed and their records were reviewed. Cox and logistic models were used to examine the effect of HIV and other predictors on time to oncology care and presentation with advanced cancer (stage III or IV). Results. Of the 1,146 patients analyzed, 584 (51%) had HIV and 615 (54%) had advanced cancer. The initial clinic visit occurred a mean of 144 days (median 29, interquartile range 0–185) after symptom onset, but subsequent mean time to oncology care was 406 days (median 160, interquartile range 59–653). HIV status was not significantly associated with time to oncology care (adjusted hazard ratio [aHR] 0.91, 95% confidence interval [CI] 0.79–1.06). However, patients who reported using traditional medicine/healers engaged in oncology care significantly faster (aHR 1.23, 95% CI 1.09–1.40) and those with advanced cancer entered care earlier (aHR 1.48, 95% CI 1.30–1.70). Factors significantly associated with advanced cancer included income <$50 per month (adjusted odds ratio [aOR] 1.35, 95% CI 1.05–1.75), male sex (aOR 1.45, 95% CI 1.12–1.87), and pain as the presenting symptom (aOR 1.39, 95% CI 1.03–1.88). Conclusion. Longitudinal HIV care did not reduce the substantial delay to cancer treatment. Research focused on reducing health system delay through coordination and navigation is needed. Implications for Practice: The majority (54%) of patients in this large cohort from Botswana presented with advanced-stage cancer despite universal access to free health care. Median time from first symptom to specialized oncology care was 13 months. For HIV-infected patients (51% of total), regular longitudinal contact with the health system, through quarterly doctor visits for HIV management, was not successful in providing faster linkages into oncology care. However, patients who used traditional medicine/healers engaged in cancer care faster, indicating potential for leveraging traditional healers as partners in early cancer detection. New strategies are urgently needed to facilitate diagnosis and timely treatment of cancer in low- and middle-income countries. PMID:27053501
Brown, Carolyn A; Suneja, Gita; Tapela, Neo; Mapes, Abigail; Pusoentsi, Malebogo; Mmalane, Mompati; Hodgeman, Ryan; Boyer, Matthew; Musimar, Zola; Ramogola-Masire, Doreen; Grover, Surbhi; Nsingo-Bvochora, Memory; Kayembe, Mukendi; Efstathiou, Jason; Lockman, Shahin; Dryden-Peterson, Scott
2016-06-01
Three-quarters of cancer deaths occur in resource-limited countries, and delayed presentation contributes to poor outcome. In Botswana, where more than half of cancers arise in HIV-infected individuals, we sought to explore predictors of timely oncology care and evaluate the hypothesis that engagement in longitudinal HIV care improves access. Consenting patients presenting for oncology care from October 2010 to September 2014 were interviewed and their records were reviewed. Cox and logistic models were used to examine the effect of HIV and other predictors on time to oncology care and presentation with advanced cancer (stage III or IV). Of the 1,146 patients analyzed, 584 (51%) had HIV and 615 (54%) had advanced cancer. The initial clinic visit occurred a mean of 144 days (median 29, interquartile range 0-185) after symptom onset, but subsequent mean time to oncology care was 406 days (median 160, interquartile range 59-653). HIV status was not significantly associated with time to oncology care (adjusted hazard ratio [aHR] 0.91, 95% confidence interval [CI] 0.79-1.06). However, patients who reported using traditional medicine/healers engaged in oncology care significantly faster (aHR 1.23, 95% CI 1.09-1.40) and those with advanced cancer entered care earlier (aHR 1.48, 95% CI 1.30-1.70). Factors significantly associated with advanced cancer included income <$50 per month (adjusted odds ratio [aOR] 1.35, 95% CI 1.05-1.75), male sex (aOR 1.45, 95% CI 1.12-1.87), and pain as the presenting symptom (aOR 1.39, 95% CI 1.03-1.88). Longitudinal HIV care did not reduce the substantial delay to cancer treatment. Research focused on reducing health system delay through coordination and navigation is needed. The majority (54%) of patients in this large cohort from Botswana presented with advanced-stage cancer despite universal access to free health care. Median time from first symptom to specialized oncology care was 13 months. For HIV-infected patients (51% of total), regular longitudinal contact with the health system, through quarterly doctor visits for HIV management, was not successful in providing faster linkages into oncology care. However, patients who used traditional medicine/healers engaged in cancer care faster, indicating potential for leveraging traditional healers as partners in early cancer detection. New strategies are urgently needed to facilitate diagnosis and timely treatment of cancer in low- and middle-income countries. ©AlphaMed Press.
Ghent, D.; Good, E.; Bulgin, C.; Remedios, J. J.
2017-12-01
Surface temperatures (ST) over land have traditionally been measured at weather stations. There are many parts of the globe with very few stations, e.g. across much of Africa, leading to gaps in ST datasets, affecting our understanding of how ST is changing, and the impacts of extreme events. Satellites can provide global ST data but these observations represent how hot the land ST (LST; including the uppermost parts of e.g. trees, buildings) is to touch, whereas stations measure the air temperature just above the surface (T2m). Satellite LST data may only be available in cloud-free conditions and data records are frequently <10-15 years in length. Consequently, satellite LST data have not yet featured widely in climate studies. In this study, the relationship between clear-sky satellite LST and all-sky T2m is characterised in space and time using >17 years of data. The analysis uses a new monthly LST climate data record (CDR) based on the Along-Track Scanning Radiometer (ATSR) series, which has been produced within the European Space Agency GlobTemperature project. The results demonstrate the dependency of the global LST-T2m differences on location, land cover, vegetation and elevation. LSTnight (~10 pm local solar time) is found to be closely coupled with minimum T2m (Tmin), and the two temperatures are generally consistent to within ±5 °C (global median LSTnight − Tmin = 1.8 °C, interquartile range = 3.8 °C). The LSTday (~10 am local time) versus maximum T2m (Tmax) variability is higher because LST is strongly influenced by insolation and surface regime (global median LSTday − Tmax = -0.1 °C, interquartile range = 8.1 °C). Correlations for both temperature pairs are typically >0.9 outside of the tropics. A crucial aspect of this study is a comparison between the monthly global anomaly time series of LST and CRUTEM4 T2m. The time series agree remarkably well, with a correlation of 0.9 and 90% of the CDR anomalies falling within the T2m 95% confidence limits. This analysis provides independent verification of the 1995-2012 T2m anomaly time series, suggesting that LST can provide a complementary perspective on global ST change. The results presented give justification for increasing use of satellite LST data in climate and weather science, both as an independent variable, and to augment T2m data acquired at weather stations.
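The paired-difference statistics quoted above (median, interquartile range and correlation of matched LST and T2m values) are straightforward to reproduce; the sketch below uses invented numbers purely to show the computation, not the GlobTemperature data.

    # Illustrative only: median, IQR and correlation of paired LST and T2m series (synthetic values).
    import numpy as np

    lst_night = np.array([12.3, 15.1, 18.4, 21.0, 19.7, 14.2])   # satellite night-time LST, degC
    tmin_2m   = np.array([10.9, 13.5, 16.8, 18.6, 17.9, 12.8])   # station minimum T2m, degC

    diff = lst_night - tmin_2m
    median_diff = np.median(diff)
    iqr_diff = np.percentile(diff, 75) - np.percentile(diff, 25)
    corr = np.corrcoef(lst_night, tmin_2m)[0, 1]
    print(f"median LST - Tmin = {median_diff:.1f} degC, IQR = {iqr_diff:.1f} degC, r = {corr:.2f}")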
2014-01-01
Background Insight in the natural course of care dependency of vulnerable older persons in long-term care facilities (LTCF) is essential to organize and optimize individual tailored care. We examined changes in care dependency in LTCF residents over two 6-month periods, explored the possible predictive factors of change and the effect of care dependency on mortality. Methods A prospective follow-up study in 21 Dutch long-term care facilities. 890 LTCF residents, median age 84 (interquartile range 79–88) years participated. At baseline, 6 and 12 months, care dependency was assessed by the nursing staff with the Care Dependency Scale (CDS), range 15–75 points. Since the median CDS score differed between men and women (47.5 vs. 43.0, P = 0.013), CDS groups (low, middle and high) were based on gender-specific tertiles (33% cut points) of CDS scores at baseline and 6 months. Results At baseline, the CDS groups differed in median length of stay on the ward, urine incontinence and dementia (all P < 0.001); participants in the low CDS group stayed longer, had more frequent urine incontinence and more dementia. They also had the highest mortality rate (log rank 32.2; df = 2; P for trend <0.001). For each point lower in CDS score, the mortality risk increased by 2% (95% CI 1%-3%). Adjustment for age, gender, cranberry use, LTCF, length of stay, comorbidity and dementia showed similar results. A one-point decrease in CDS score between baseline and 6 months was related to an increased mortality risk of 4% (95% CI 3%-6%). At the 6-month follow-up, 10% improved to a higher CDS group, 65% were in the same, and 25% had deteriorated to a lower CDS group; a similar pattern emerged at 12-month follow-up. Gender, age, urine incontinence, dementia, cancer and baseline care dependency status predicted an increase in care dependency over time. Conclusion The majority of residents were stable in their care dependency status over two subsequent 6-month periods. Highly care dependent residents showed an increased mortality risk. Awareness of the natural course of care dependency is essential to residents and their formal and informal caregivers when considering therapeutic and end-of-life care options. PMID:24884563
Caljouw, Monique A A; Cools, Herman J M; Gussekloo, Jacobijn
2014-05-22
Insight in the natural course of care dependency of vulnerable older persons in long-term care facilities (LTCF) is essential to organize and optimize individual tailored care. We examined changes in care dependency in LTCF residents over two 6-month periods, explored the possible predictive factors of change and the effect of care dependency on mortality. A prospective follow-up study in 21 Dutch long-term care facilities. 890 LTCF residents, median age 84 (interquartile range 79-88) years participated. At baseline, 6 and 12 months, care dependency was assessed by the nursing staff with the Care Dependency Scale (CDS), range 15-75 points. Since the median CDS score differed between men and women (47.5 vs. 43.0, P = 0.013), CDS groups (low, middle and high) were based on gender-specific tertiles (33% cut points) of CDS scores at baseline and 6 months. At baseline, the CDS groups differed in median length of stay on the ward, urine incontinence and dementia (all P < 0.001); participants in the low CDS group stayed longer, had more frequent urine incontinence and more dementia. They also had the highest mortality rate (log rank 32.2; df = 2; P for trend <0.001). For each point lower in CDS score, the mortality risk increased by 2% (95% CI 1%-3%). Adjustment for age, gender, cranberry use, LTCF, length of stay, comorbidity and dementia showed similar results. A one-point decrease in CDS score between baseline and 6 months was related to an increased mortality risk of 4% (95% CI 3%-6%). At the 6-month follow-up, 10% improved to a higher CDS group, 65% were in the same, and 25% had deteriorated to a lower CDS group; a similar pattern emerged at 12-month follow-up. Gender, age, urine incontinence, dementia, cancer and baseline care dependency status predicted an increase in care dependency over time. The majority of residents were stable in their care dependency status over two subsequent 6-month periods. Highly care dependent residents showed an increased mortality risk. Awareness of the natural course of care dependency is essential to residents and their formal and informal caregivers when considering therapeutic and end-of-life care options.
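A small sketch of the gender-specific grouping step described above, using invented Care Dependency Scale scores (not the study's data); pd.qcut splits each sex's baseline scores into roughly equal low/middle/high thirds.

    # Hypothetical example: gender-specific low/middle/high CDS groups via within-sex tertiles.
    import pandas as pd

    residents = pd.DataFrame({
        "sex": ["F", "F", "F", "F", "F", "M", "M", "M", "M", "M"],
        "cds_baseline": [43, 55, 30, 50, 28, 47, 62, 35, 41, 58],
    })

    residents["cds_group"] = ""
    for sex, grp in residents.groupby("sex"):
        residents.loc[grp.index, "cds_group"] = pd.qcut(
            grp["cds_baseline"], q=3, labels=["low", "middle", "high"]
        ).astype(str)
    print(residents)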
The cross-national epidemiology of DSM-IV intermittent explosive disorder
Scott, K. M.; Lim, C. C. W.; Hwang, I.; Adamowski, T.; Al-Hamzawi, A.; Bromet, E.; Bunting, B.; Ferrand, M. P.; Florescu, S.; Gureje, O.; Hinkov, H.; Hu, C.; Karam, E.; Lee, S.; Posada-Villa, J.; Stein, D.; Tachimori, H.; Viana, M. C.; Xavier, M.; Kessler, R. C.
2016-01-01
Background This is the first cross-national study of intermittent explosive disorder (IED). Method A total of 17 face-to-face cross-sectional household surveys of adults were conducted in 16 countries (n = 88 063) as part of the World Mental Health Surveys initiative. The World Health Organization Composite International Diagnostic Interview (CIDI 3.0) assessed DSM-IV IED, using a conservative definition. Results Lifetime prevalence of IED ranged across countries from 0.1 to 2.7% with a weighted average of 0.8%; 0.4 and 0.3% met criteria for 12-month and 30-day prevalence, respectively. Sociodemographic correlates of lifetime risk of IED were being male, young, unemployed, divorced or separated, and having less education. The median age of onset of IED was 17 years with an interquartile range across countries of 13–23 years. The vast majority (81.7%) of those with lifetime IED met criteria for at least one other lifetime disorder; co-morbidity was highest with alcohol abuse and depression. Of those with 12-month IED, 39% reported severe impairment in at least one domain, most commonly social or relationship functioning. Prior traumatic experiences involving physical (non-combat) or sexual violence were associated with increased risk of IED onset. Conclusions Conservatively defined, IED is a low prevalence disorder but this belies the true societal costs of IED in terms of the effects of explosive anger attacks on families and relationships. IED is more common among males, the young, the socially disadvantaged and among those with prior exposure to violence, especially in childhood. PMID:27572872
Ahmad, Mirza Sultan; Husain Zaidi, Syed Aizaz; Medhat, Naila; Farooq, Hadia; Ahmad, Danial; Nasir, Waqar
2018-01-01
To determine the frequency of underweight and stunting among children entering the first year of school and to assess its associated factors. This descriptive, analytical study was conducted at 5 schools of Rabwah, Pakistan, from August to September 2015, and comprised all students who got admission in the selected schools during the study period. Name, father's name, gender, weight, height, status of height and weight on Z-score charts, and marks obtained in the admission test were recorded. SPSS 20 was used for statistical analysis. Of the 478 participants, 212 (44.4%) were boys and 266 (55.6%) were girls. The overall mean age was 66.6 ± 5.966 months (range: 41-129 months). Overall, 53 (11.1%) were underweight, 22 (4.6%) were severely underweight, 55 (11.5%) had stunting and 12 (2.5%) had severe stunting. Median marks (interquartile range [IQR]) in the admission test for obese, overweight, normal, underweight and severely underweight children were 76.3% (37.2-84.7), 65.9%, 66.7% (56.4-72.3), 64.6% (47-71), and 67% (55.3-78), respectively. Median marks (IQR) in the admission test for tall, normal height, stunted and severely stunted children were 24.1%, 67% (57.3-73), 57% (31.1-67.8), and 62.6% (49.7-68.3), respectively. Children with stunting scored significantly fewer marks compared to children of normal height (p<0.05). Stunting and underweight were common problems among children starting school. Stunting was found to be associated with lower marks in the admission test.
The incidence of ARDS and associated mortality in severe TBI using the Berlin definition.
Aisiku, Imoigele P; Yamal, Jose-Miguel; Doshi, Pratik; Rubin, Maria Laura; Benoit, Julia S; Hannay, Julia; Tilley, Barbara C; Gopinath, Shankar; Robertson, Claudia S
2016-02-01
The incidence of adult respiratory distress syndrome (ARDS) in severe traumatic brain injury (TBI) is poorly reported. Recently, a new definition for ARDS was proposed, the Berlin definition. The percentage of patients represented by TBI in the Berlin criteria study is limited. This study describes the incidence and associated mortality of ARDS in TBI patients. The study was an analysis of the effect of erythropoietin administration and transfusion threshold on the incidence of ARDS in severe TBI patients. Three reviewers independently assessed all patients enrolled in the study for acute lung injury/ARDS using the Berlin and the American-European Consensus Conference (AECC) definitions. A Cox proportional hazards model was used to assess the relationship between ARDS and mortality and 6-month Glasgow Outcome Scale (GOS) score. Two hundred patients were enrolled in the study. Of the patients, 21% (41 of 200) and 26% (52 of 200) developed ARDS using the AECC and Berlin definitions, respectively, with a median time of 3 days (interquartile range, 3) after injury. ARDS by either definition was associated with increased mortality (p = 0.04) but not with differences in functional outcome as measured by the GOS score at 6 months. Adjusted analysis using the Berlin criteria showed an increased mortality associated with ARDS (p = 0.01). Severe TBI is associated with an incidence of ARDS ranging from 20% to 25%. The incidence is comparable between the Berlin and AECC definitions. ARDS is associated with increased mortality in severe TBI patients, but further studies are needed to validate these findings. Epidemiologic study, level II.
Nwaru, B I; Takkinen, H-M; Niemelä, O; Kaila, M; Erkkola, M; Ahonen, S; Tuomi, H; Haapala, A-M; Kenward, M G; Pekkanen, J; Lahesmaa, R; Kere, J; Simell, O; Veijola, R; Ilonen, J; Hyöty, H; Knip, M; Virtanen, S M
2013-04-01
To study the associations between timing and diversity of introduction of complementary foods during infancy and atopic sensitization in 5-year-old children. In the Finnish DIPP (type 1 diabetes prediction and prevention) birth cohort (n = 3781), data on the timing of infant feeding were collected up to the age of 2 years and serum IgE antibodies toward four food and four inhalant allergens measured at the age of 5 years. Logistic regression was used for the analyses. Median duration of exclusive and total breastfeeding was 1.4 (interquartile range: 0.2-3.5) and 7.0 (4.0-11.0) months, respectively. When all the foods were studied together and adjusted for confounders, short duration of breastfeeding decreased the risk of sensitization to birch allergen; introduction of oats <5.1 months and barley <5.5 months decreased the risk of sensitization to wheat and egg allergens, and oats was additionally associated with milk, timothy grass, and birch allergens. Introduction of rye <7.0 months decreased the risk of sensitization to birch allergen. Introduction of fish <6 months and egg ≤11 months decreased the risk of sensitization to all the specific allergens studied. The introduction of <3 food items at 3 months was associated with sensitization to wheat, timothy grass, and birch allergens; the introduction of 1-2 food items at 4 months and ≤4 food items at 6 months was associated with all endpoints except house dust mite. These results were particularly evident among high-risk children when the results were stratified by atopic history, indicating the potential for reverse causality. The introduction of complementary foods was done consecutively, and with respect to the timing of each food, early introduction of complementary foods may protect against atopic sensitization in childhood, particularly among high-risk children. Less food diversity already at 3 months of age may increase the risk of atopic sensitization. © 2013 John Wiley & Sons A/S. Published by Blackwell Publishing Ltd.
Riancho-Zarrabeitia, Leyre; Calvo-Río, Vanesa; Blanco, Ricardo; Mesquida, Marina; Adan, Alfredo M; Herreras, José M; Aparicio, Ángel; Peiteado-Lopez, Diana; Cordero-Coma, Miguel; García Serrano, José Luis; Ortego-Centeno, Norberto; Maíz, Olga; Blanco, Ana; Sánchez-Bursón, Juan; González-Suárez, Senén; Fonollosa, Alejandro; Santos-Gómez, Montserrat; González-Vela, Carmen; Loricera, Javier; Pina, Trinitario; González-Gay, Miguel A
2015-12-01
To assess anti-TNF-α therapy response in uveitis associated with sarcoidosis refractory to conventional immunosuppressive therapy. Open-label, multicenter, retrospective study on patients with sarcoid uveitis who underwent anti-TNF-α therapy because of inadequate response to conventional therapy including corticosteroids and at least 1 systemic synthetic immunosuppressive drug. The main outcome measurements were degree of anterior and posterior chamber inflammation, visual acuity, macular thickness, and immunosuppression load. A total of 17 patients (8 men; 29 affected eyes; mean ± standard deviation age 38.4 ± 16.8; range: 13-76 years) were studied. The patients had bilateral hilar lymphadenopathy (58.8%), lung parenchyma involvement (47.1%), peripheral lymph nodes (41.2%), and involvement of other organs (52.9%). Angiotensin-converting enzyme was elevated in 58.8%. The most frequent ocular pattern was bilateral chronic relapsing panuveitis. The first biologic agent used was adalimumab in 10 (58.8%) and infliximab in 7 (41.2%) cases. Infliximab 5 mg/kg intravenously every 4-8 weeks and adalimumab 40 mg subcutaneously every 2 weeks were the most common administration patterns. In most cases anti-TNF-α therapy was given in combination with immunosuppressive drugs. The mean duration of follow-up was 33.9 ± 17.1 months. Significant improvement was observed following anti-TNF-α therapy. Baseline results versus results at 2 years from the onset of biologic therapy were the following: the median (interquartile range, IQR) of cells in the ocular anterior chamber 0.5 (0-2) versus 0 (0-0) (p = 0.003), vitritis 0 (0-1.25) versus 0 (0-0) (p = 0.008), macular thickness 391.1 ± 58.8 versus 247 ± 40.5 µm (p = 0.028), and visual acuity 0.60 ± 0.33 versus 0.74 ± 0.27 (p = 0.009). The median daily (interquartile range) dose of prednisone was also reduced from 10 (0-30) mg at the onset of the anti-TNF-α therapy to 0 (0-0) mg at 2 years (p = 0.02). Significant reduction was also achieved in the immunosuppressive load. Anti-TNF-α therapy is effective in sarcoid uveitis patients refractory to conventional immunosuppressive therapy. Infliximab and adalimumab allowed a substantial reduction in prednisone dose despite having failed standard therapy. Copyright © 2015 Elsevier Inc. All rights reserved.
Tóth, Gábor; Sándor, Gábor László; Kleiner, Dénes; Szentmáry, Nóra; Kiss, Huba J; Blázovics, Anna; Nagy, Zoltán Zsolt
2016-11-01
Femtosecond laser is a revolutionary, innovative treatment method used in cataract surgery. The aim was to evaluate free radical quantity in the anterior chamber of the eye during femtosecond laser-assisted capsulotomy in a porcine eye model. Seventy fresh porcine eyes were collected within 2 hours post mortem, transported at 4 °C and treated within 7 hours. Thirty-five eyes were used as the control group and 35 as the femtosecond laser-assisted capsulotomy group. A simple luminol-dependent chemiluminescence method was used to measure the total scavenger capacity in the aqueous humour, as an indicator of free radical production. The emitted photons were expressed in relative light unit %. The relative light unit % was lower in the control group (median 1%, interquartile range [0.4-3%]) than in the femtosecond laser-assisted capsulotomy group (median 4.4%, interquartile range [1.5%-21%]) (p = 0.01). Femtosecond laser-assisted capsulotomy decreases the antioxidant defense of the anterior chamber, which reflects significant free radical production during femtosecond laser-assisted capsulotomy. Orv. Hetil., 2016, 157(47), 1880-1883.
Higher sweat chloride levels in patients with asthma: a case-control study.
Awasthi, Shally; Dixit, Pratibha; Maurya, Nutan
2015-02-01
To screen asthmatic patients by sweat chloride test to identify the proportion with Cystic Fibrosis (CF) (sweat chloride level >60 mmol/L). Also, to compare sweat chloride levels between cases of bronchial asthma and age- and sex-matched healthy children aged 5 mo-15 y. The present case-control study was conducted in a tertiary care hospital in India. Cases of bronchial asthma, diagnosed by GINA guideline 2008, and age-matched healthy controls were included. Case to control ratio was 2:1. Sweat chloride test was done by the pilocarpine iontophoresis method. From April 2010 through May 2012, 216 asthmatics and 112 controls were recruited. Among asthmatics, there was no case of Cystic Fibrosis. Mean sweat chloride levels in asthmatics were 22.39 ± 8.45 mmol/L (interquartile range 15-28 mmol/L) and in controls 19.55 ± 7.04 mmol/L (interquartile range 15-23.5 mmol/L) (p value = 0.048). No Cystic Fibrosis case was identified among asthmatics. Mean sweat chloride levels were higher in asthmatics as compared to controls.
Changing Epidemiology of Human Brucellosis, China, 1955–2014
Lai, Shengjie; Zhou, Hang; Xiong, Weiyi; Gilbert, Marius; Huang, Zhuojie; Yu, Jianxing; Yin, Wenwu; Wang, Liping; Chen, Qiulan; Li, Yu; Mu, Di; Zeng, Lingjia; Ren, Xiang; Geng, Mengjie; Zhang, Zike; Cui, Buyun; Li, Tiefeng; Wang, Dali; Li, Zhongjie; Wardrop, Nicola A.; Tatem, Andrew J.
2017-01-01
Brucellosis, a zoonotic disease, was made statutorily notifiable in China in 1955. We analyzed the incidence and spatial–temporal distribution of human brucellosis during 1955–2014 in China using notifiable surveillance data: aggregated data for 1955–2003 and individual case data for 2004–2014. A total of 513,034 brucellosis cases were recorded, of which 99.3% were reported in northern China during 1955–2014, and 69.1% (258,462/374,141) occurred during February–July in 1990–2014. Incidence remained high during 1955–1978 (interquartile range 0.42–1.0 cases/100,000 residents), then decreased dramatically in 1979–1994. However, brucellosis has reemerged since 1995 (interquartile range 0.11–0.23 in 1995–2003 and 1.48–2.89 in 2004–2014); the historical high occurred in 2014, and the affected area expanded from northern pastureland provinces to the adjacent grassland and agricultural areas, then to southern coastal and southwestern areas. Control strategies in China should be adjusted to account for these changes by adopting a One Health approach. PMID:28098531
Androgenic correlates of genetic variation in the gene encoding 5alpha-reductase type 1.
Ellis, Justine A; Panagiotopoulos, Sianna; Akdeniz, Aysel; Jerums, George; Harrap, Stephen B
2005-01-01
Androgens determine male secondary sexual characteristics and influence a variety of metabolic pathways. Circulating levels of androgens are highly heritable; however, the genes involved are largely unknown. The 5alpha-reductase enzymes types 1 and 2 responsible for converting testosterone to the more potent androgen dihydrotestosterone are encoded by the SRD5A1 and SRD5A2 genes, respectively. We performed indirect genetic association studies of SRD5A1 and SRD5A2 and the dihydrotestosterone/testosterone ratio that reflects the activity of 5alpha-reductase in 57 males with type 2 diabetes. We found evidence of significant association between a single nucleotide polymorphism in SRD5A1 and the dihydrotestosterone/testosterone ratio (median 0.10, interquartile range 0.08 vs. median 0.06, interquartile range 0.04, P = 0.009). The polymorphism was not associated with any diabetic phenotypes. These results suggest that functional genetic variants might exist in or around SRD5A1 that affect the activity of the 5alpha-reductase enzyme type 1 and influence androgen levels.
Göktay, Fatih; Altan, Zeynep Müzeyyen; Talas, Anıl; Akpınar, Esma; Özdemir, Ekin Özge; Aytekin, Sema
2016-01-01
Patient anxiety about nail surgery relates mainly to pain associated with needle puncture, anesthetic flow during the procedure, and postoperative care, as well as possible past traumatic experience. The aims of this study were to compare anxiety levels among patients undergoing nail surgery and skin punch biopsy and to assess the effects of demographic characteristics on anxiety. Forty-eight consecutive patients who were referred to a dermatological surgery unit for nail surgery intervention (group 1) and 50 age- and sex-matched patients referred to the same unit for skin punch biopsy (group 2) were enrolled in the study. Patients' anxiety levels were measured using Spielberger's State-Trait Anxiety Inventory. There was no significant difference in median anxiety level between group 1 (42.00; interquartile range, 6.50) and group 2 (41.00; interquartile range, 8.25) (P = .517). The demographic factors of patient sex, educational status, and prior surgery showed no significant effects on anxiety levels. Nail surgery does not seem to cause significantly greater anxiety than skin punch biopsy. © The Author(s) 2015.
Rathi, Vinay K; Wang, Bo; Ross, Joseph S; Downing, Nicholas S; Kesselheim, Aaron S; Gray, Stacey T
2017-02-01
The US Food and Drug Administration (FDA) approves high-risk medical devices based on premarket pivotal clinical studies demonstrating reasonable assurance of safety and effectiveness and may require postapproval studies (PAS) to further inform benefit-risk assessment. We conducted a cross-sectional analysis using publicly available FDA documents to characterize industry-sponsored pivotal studies and PAS of high-risk devices used in the treatment of otolaryngologic diseases. Between 2000 and 2014, the FDA approved 23 high-risk otolaryngologic devices based on 28 pivotal studies. Median enrollment was 118 patients (interquartile range, 67-181), and median duration of longest primary effectiveness end point follow-up was 26 weeks (interquartile range, 16-96). Fewer than half were randomized (n = 13, 46%), blinded (n = 12, 43%), or controlled (n = 10, 36%). The FDA required 23 PASs for 16 devices (70%): almost two-thirds (n = 15, 65%) monitored long-term performance, and roughly one-third (n = 8, 35%) focused on subgroups. Otolaryngologists should be aware of limitations in the strength of premarket evidence when considering the use of newly approved devices.
Wilkinson, S A; van der Pligt, P; Gibbons, K S; McIntyre, H D
2015-01-01
Failure to return to pregnancy weight by 6 months postpartum is associated with long-term obesity, as well as adverse health outcomes. This research evaluated a postpartum weight management programme for women with a body mass index (BMI) > 25 kg/m² that combined behaviour change principles and a low-intensity delivery format with postpartum nutrition information. Women were randomised at 24-28 weeks to control (supported care; SC) or intervention (enhanced care; EC) groups, stratified by BMI cohort. At 36 weeks of gestation, SC women received a 'nutrition for breastfeeding' resource and EC women received a nutrition assessment and goal-setting session about post-natal nutrition, plus a 6-month correspondence intervention requiring return of self-monitoring sheets. Weight change, anthropometry, diet, physical activity, breastfeeding, fasting glucose and insulin measures were assessed at 6 weeks and 6 months postpartum. Seventy-seven percent (40 EC and 41 SC) of the 105 women approached were recruited; 36 EC and 35 SC women received a programme and 66.7% and 48.6% completed the study, respectively. No significant differences were observed between any outcomes. Median [interquartile range (IQR)] weight change was EC: -1.1 (9.5) kg versus SC: -1.1 (7.5) kg (6 weeks to 6 months) and EC: +1.0 (8.7) kg versus SC: +2.3 (9) kg (prepregnancy to 6 months). Intervention women breastfed for half a month longer than control women (180 versus 164 days; P = 0.10). An average of 2.3 out of six activity sheets per participant was returned. Despite low intervention engagement, the high retention rate suggests this remains an area of interest to women. Future strategies must facilitate women's engagement, be individually tailored, and include features that support behaviour change to decrease women's risk of chronic health issues. © 2013 The British Dietetic Association Ltd.
Jones, David G; Haldar, Shouvik K; Donovan, Jacqueline; McDonagh, Theresa A; Sharma, Rakesh; Hussain, Wajid; Markides, Vias; Wong, Tom
2016-09-01
To investigate the effects of catheter ablation and rate control strategies on cardiac and inflammatory biomarkers in patients with heart failure and persistent atrial fibrillation (AF). Patients were recruited from the ARC-HF trial (catheter Ablation vs Rate Control for management of persistent AF in Heart Failure, NCT00878384), which compared ablation with rate control for persistent AF in heart failure. B-type natriuretic peptide (BNP), midregional proatrial natriuretic peptide (MR-proANP), apelin, and interleukin-6 (IL-6) were assayed at baseline, 3 months, 6 months, and 12 months. The primary end point, analyzed per protocol, was change from baseline to 12 months. Of 52 recruited patients, 24 ablation and 25 rate control subjects were followed to 12 months. After 1.2 ± 0.5 procedures, sinus rhythm was present in 22 (92%) ablation patients; under rate control, rate criteria were achieved in 23 (96%) of 24 patients remaining in AF. At 12 months, MR-proANP fell significantly in the ablation arm (-106.0 pmol/L, interquartile range [IQR] -228.2 to -60.6) compared with rate control (-28.7 pmol/L, IQR -69 to +9.5, P = 0.028). BNP showed a similar trend toward reduction (P = 0.051), with no significant difference in apelin (P = 0.13) or IL-6 (P = 0.68). Changes in MR-proANP and BNP correlated with peak VO2 and ejection fraction, and MR-proANP additionally with quality-of-life score. Catheter ablation, compared with rate control, in patients with heart failure and persistent AF was associated with significant reduction in MR-proANP, which correlated with physiological and symptomatic improvement. Ablation-based rhythm control may induce beneficial cardiac remodeling, unrelated to changes in inflammatory state. This may have prognostic implications, which require confirmation by event end point studies. © 2016 Wiley Periodicals, Inc.
Rannio, Tuomas; Asikainen, Juha; Kokko, Arto; Hannonen, Pekka; Sokka, Tuulikki
2016-04-01
We analyzed remission rates at 3 and 12 months in patients with rheumatoid arthritis (RA) who were naive for disease-modifying antirheumatic drugs (DMARD) and who were treated in a Finnish rheumatology clinic from 2008 to 2011. We compared remission rates and drug treatments between patients with RA and patients with undifferentiated arthritis (UA). Data from all DMARD-naive RA and UA patients from the healthcare district were collected using software that includes demographic and clinical characteristics, disease activity, medications, and patient-reported outcomes. Our rheumatology clinic applies the treat-to-target principle, electronic monitoring of patients, and multidisciplinary care. Out of 409 patients, 406 had data for classification by the 2010 RA criteria of the American College of Rheumatology/European League Against Rheumatism. A total of 68% were female, and mean age (SD) was 58 (16) years. Respectively, 56%, 60%, and 68% were positive for anticyclic citrullinated peptide antibodies (anti-CCP), rheumatoid factor (RF), and RF/anti-CCP, and 19% had erosive disease. The median (interquartile range) duration of symptoms was 6 (4-12) months. A total of 310 were classified as RA and 96 as UA. The patients with UA were younger, had better functional status and lower disease activity, and were more often seronegative than the patients with RA. The 28-joint Disease Activity Score (3 variables) remission rates of RA and UA patients at 3 months were 67% and 58% (p = 0.13), and at 12 months, 71% and 79%, respectively (p = 0.16). Sustained remission was observed in 57%/56% of RA/UA patients. Patients with RA used more conventional synthetic DMARD combinations than did patients with UA. None used biological DMARD at 3 months, and only 2.7%/1.1% of the patients (RA/UA) used them at 12 months (p = 0.36). Remarkably high remission rates are achievable in real-world DMARD-naive patients with RA or UA.
Apondi, Rose; Bunnell, Rebecca; Ekwaru, John Paul; Moore, David; Bechange, Stevens; Khana, Kenneth; King, Rachel; Campbell, James; Tappero, Jordan; Mermin, Jonathan
2011-06-19
Long-term impact of antiretroviral therapy (ART) on sexual HIV-transmission risk in Africa is unknown. We assessed sexual behavior changes and estimated HIV transmission from HIV-infected adults on ART in Uganda. Between 2003 and 2007, we enrolled and followed ART-naive HIV-infected adults in a home-based AIDS program with annual counseling and testing for cohabitating partners, participant transmission risk-reduction plans, condom distribution and prevention support for cohabitating discordant couples. We assessed participants' HIV plasma viral load and partner-specific sexual behaviors. We defined risky sex as intercourse with inconsistent/no condom use with HIV-negative or unknown serostatus partners in previous 3 months. We compared rates using Poisson regression models, estimated transmission risk using established viral load-specific transmission estimates, and documented sero-conversion rates among HIV-discordant couples. Of 928 participants, 755 (81%) had 36 months data: 94 (10%) died and 79 (9%) had missing data. Sexual activity increased from 28% (baseline) to 41% [36 months (P < 0.001)]. Of sexually active participants, 22% reported risky sex at baseline, 8% at 6 months (P < 0.001), and 14% at 36 months (P = 0.018). Median viral load among those reporting risky sex was 122,500 [interquartile range (IQR) 45,100-353,000] copies/ml pre-ART at baseline and undetectable at follow-up. One sero-conversion occurred among 62 cohabitating sero-discordant partners (0.5 sero-conversions/100 person-years). At 36 months, consistent condom use was 74% with discordant partners, 55% with unknown and 46% with concordant partners. Estimated HIV transmission risk reduced 91%, from 47.3 to 4.2/1000 person-years. Despite increased sexual activity among HIV-infected Ugandans over 3 years on ART, risky sex and estimated risk of HIV transmission remained lower than baseline levels. Integrated prevention programs could reduce HIV transmission in Africa.
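Sero-conversion rates of the kind quoted above are events divided by person-years of follow-up; a small sketch with hypothetical person-time (the abstract does not give the exact denominator) and an exact Poisson confidence interval might look like this:

    # Hypothetical person-time: rate per 100 person-years with an exact Poisson 95% CI.
    from scipy.stats import chi2

    events = 1            # observed sero-conversions
    person_years = 200.0  # assumed follow-up time (illustrative, not from the study)

    rate = events / person_years * 100
    lower = chi2.ppf(0.025, 2 * events) / (2 * person_years) * 100 if events > 0 else 0.0
    upper = chi2.ppf(0.975, 2 * (events + 1)) / (2 * person_years) * 100
    print(f"{rate:.2f} per 100 person-years (95% CI {lower:.2f}-{upper:.2f})")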
Mutevedzi, Portia C; Lessells, Richard J; Heller, Tom; Bärnighausen, Till; Cooke, Graham S; Newell, Marie-Louise
2010-01-01
Abstract Objective To describe the scale-up of a decentralized HIV treatment programme delivered through the primary health care system in rural KwaZulu-Natal, South Africa, and to assess trends in baseline characteristics and outcomes in the study population. Methods The programme started delivery of antiretroviral therapy (ART) in October 2004. Information on all patients initiated on ART was captured in the programme database and follow-up status was updated monthly. All adult patients (≥ 16 years) who initiated ART between October 2004 and September 2008 were included and stratified into 6-month groups. Clinical and sociodemographic characteristics were compared between the groups. Retention in care, mortality, loss to follow-up and virological outcomes were assessed at 12 months post-ART initiation. Findings A total of 5719 adults initiated on ART were included (67.9% female). Median baseline CD4+ lymphocyte count was 116 cells/µl (interquartile range, IQR: 53–173). There was an increase in the proportion of women who initiated ART while pregnant but no change in other baseline characteristics over time. Overall retention in care at 12 months was 84.0% (95% confidence interval, CI: 82.6–85.3); 10.9% died (95% CI: 9.8–12.0); 3.7% were lost to follow-up (95% CI: 3.0–4.4). Mortality was highest in the first 3 months after ART initiation: 30.1 deaths per 100 person–years (95% CI: 26.3–34.5). At 12 months 23.0% had a detectable viral load (> 25 copies/ml) (95% CI: 19.5–25.5). Conclusion Outcomes were not affected by rapid expansion of this decentralized HIV treatment programme. The relatively high rates of detectable viral load highlight the need for further efforts to improve the quality of services. PMID:20680124
Rosenblatt, Peter L.; Apostolis, Costas A.; Hacker, Michele R.; DiSciullo, Anthony
2013-01-01
The objective of this retrospective study was to evaluate the feasibility, safety, and efficacy of a new laparoscopic technique for the treatment of uterovaginal prolapse using a transcervical access port to minimize the laparoscopic incision. From February 2008 through August 2010, symptomatic pelvic organ prolapse in 43 patients was evaluated and surgically treated using this novel procedure. Preoperative assessment included pelvic examination, the pelvic organ prolapse quantification scoring system (POP-Q), and complex urodynamic testing with prolapse reduction to evaluate for symptomatic or occult stress urinary incontinence. The surgical procedure consisted of laparoscopic supracervical hysterectomy with transcervical morcellation and laparoscopic sacrocervicopexy with anterior and posterior mesh extension. Concomitant procedures were performed as indicated. All procedures were completed laparoscopically using only 5-mm abdominal port sites, with no intraoperative complications. Patients were followed up postoperatively for pelvic examination and POP-Q at 6 weeks, 6 months, and 12 months. The median (interquartile range) preoperative POP-Q values for point Aa was 0 (−1.0 to 1.0), and for point C was −1.0 (−3.0 to 2.0). Postoperatively, median points Aa and C were significantly improved at 6 weeks, 6 months, and 12 months (all p < .001). One patient was found to have a mesh/suture exposure from the sacrocervicopexy, which was managed conservatively without surgery. We conclude that laparoscopic supracervical hysterectomy with transcervical morcellation and laparoscopic sacrocervicopexy is a safe and feasible surgical approach to treatment of uterovaginal prolapse, with excellent anatomic results at 6 weeks, 6 months, and 12 months. Potential advantages of the procedure include minimizing laparoscopic port site size, decreasing the rate of mesh exposure compared with other published data, and reducing the rate of postoperative cyclic bleeding in premenopausal women by removing the cervical core. Longer follow-up is needed to determine the durability and potential long-term sequelae of the procedure. PMID:23084680
CT Screening for Lung Cancer: Nonsolid Nodules in Baseline and Annual Repeat Rounds.
Yankelevitz, David F; Yip, Rowena; Smith, James P; Liang, Mingzhu; Liu, Ying; Xu, Dong Ming; Salvatore, Mary M; Wolf, Andrea S; Flores, Raja M; Henschke, Claudia I
2015-11-01
To address the frequency of identifying nonsolid nodules, diagnosing lung cancer manifesting as such nodules, and the long-term outcome after treatment in a prospective cohort, the International Early Lung Cancer Action Program. A total of 57,496 participants underwent baseline and subsequent annual repeat computed tomographic (CT) screenings according to an institutional review board-approved, HIPAA-compliant protocol. Informed consent was obtained. The frequency of participants with nonsolid nodules, the course of the nodule at follow-up, and the resulting diagnoses of lung cancer, treatment, and outcome are given separately for baseline and annual repeat rounds of screening. The χ² statistic was used to compare percentages. A nonsolid nodule was identified in 2392 (4.2%) of 57,496 baseline screenings, and pathologic pursuit led to the diagnosis of 73 cases of adenocarcinoma. A new nonsolid nodule was identified in 485 (0.7%) of 64,677 annual repeat screenings, and 11 had a diagnosis of stage I adenocarcinoma; none were in nodules 15 mm or larger in diameter. Nonsolid nodules resolved or decreased more frequently in annual repeat than in baseline rounds (322 [66%] of 485 vs 628 [26%] of 2392, P < .0001). Treatment of the cases of lung cancer was with lobectomy in 55, bilobectomy in two, sublobar resection in 26, and radiation therapy in one. Median time to treatment was 19 months (interquartile range [IQR], 6-41 months). A solid component had developed in 22 cases prior to treatment (median transition time from nonsolid to part-solid, 25 months). The lung cancer survival rate was 100% with median follow-up since diagnosis of 78 months (IQR, 45-122 months). Nonsolid nodules of any size can be safely followed with CT at 12-month intervals to assess transition to part-solid. Surgery was 100% curative in all cases, regardless of the time to treatment. © RSNA, 2015
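The percentage comparison above can be reproduced with a standard chi-square test of a 2x2 table; the counts below are the ones quoted in the abstract, and the code is only an illustrative sketch, not the authors' analysis.

    # Chi-square comparison of resolution/decrease frequency: annual repeat vs baseline rounds.
    from scipy.stats import chi2_contingency

    annual_repeat = [322, 485 - 322]    # resolved or decreased, did not
    baseline      = [628, 2392 - 628]

    chi2, p, dof, expected = chi2_contingency([annual_repeat, baseline])
    print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.3g}")  # expect P < .0001, as reported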
Gallagher, Harry M; Sarwar, Ghulam; Tse, Tracy; Sladden, Timothy M; Hii, Esmond; Yerkovich, Stephanie T; Hopkins, Peter M; Chambers, Daniel C
2015-11-01
Erratic tacrolimus blood levels are associated with liver and kidney graft failure. We hypothesized that erratic tacrolimus exposure would similarly compromise lung transplant outcomes. This study assessed the effect of tacrolimus mean and standard deviation (SD) levels on the risk of chronic lung allograft dysfunction (CLAD) and death after lung transplantation. We retrospectively reviewed 110 lung transplant recipients who received tacrolimus-based immunosuppression. Cox proportional hazard modeling was used to investigate the effect of tacrolimus mean and SD levels on survival and CLAD. At census, 48 patients (44%) had developed CLAD and 37 (34%) had died. Tacrolimus SD was highest for the first 6 post-transplant months (median, 4.01 μg/liter; interquartile range [IQR], 3.04-4.98 μg/liter) before stabilizing at 2.84 μg/liter (IQR, 2.16-4.13 μg/liter) between 6 and 12 months. The SD then remained the same (median, 2.85; IQR, 2.00-3.77 μg/liter) between 12 and 24 months. A high mean tacrolimus level 6 to 12 months post-transplant independently reduced the risk of CLAD (hazard ratio [HR], 0.74; 95% confidence interval [CI], 0.63-0.86; p < 0.001) but not death (HR, 0.96; 95% CI, 0.83-1.12; p = 0.65). In contrast, a high tacrolimus SD between 6 and 12 months independently increased the risk of CLAD (HR, 1.46; 95% CI, 1.23-1.73; p < 0.001) and death (HR, 1.27; 95% CI, 1.08-1.51; p = 0.005). Erratic tacrolimus levels are a risk factor for poor lung transplant outcomes. Identifying and modifying factors that contribute to this variability may significantly improve outcomes. Copyright © 2015 International Society for Heart and Lung Transplantation. Published by Elsevier Inc. All rights reserved.
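As a rough, hypothetical sketch of the exposure metrics described above (invented trough levels, not the cohort's data), the per-patient mean and standard deviation of tacrolimus levels within a follow-up window can be computed with a simple groupby:

    # Hypothetical tacrolimus trough levels: per-patient mean and SD in the 6-12 month window.
    import pandas as pd

    troughs = pd.DataFrame({
        "patient": ["A", "A", "A", "A", "B", "B", "B", "B"],
        "level_ug_per_l": [9.5, 12.1, 7.8, 10.4, 8.9, 15.2, 6.1, 13.0],
    })

    summary = troughs.groupby("patient")["level_ug_per_l"].agg(["mean", "std"])
    print(summary)  # a larger "std" flags more erratic exposure, as in the abstract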
Torok, Kathryn S.; Arkachaisri, Thaschawee
2013-01-01
Objective: To evaluate the effectiveness of a uniform single-center treatment protocol composed of high-dose methotrexate (MTX) and oral corticosteroids in a pediatric localized scleroderma (LS) cohort. Methods: Thirty-six patients with LS were recruited. Patients with active disease, defined as erythematous lesions and/or new lesions, or expansion of existing lesions, were started on oral prednisone 2 mg/kg/day (maximum 60 mg/day) and subcutaneous (SC) MTX at 1 mg/kg/week (maximum 25 mg/week). Prednisone was tapered and kept at 0.25 mg/kg/day for 12 months. MTX SC was continued for 24 months, and then switched to oral administration to complete 36 months of therapy. The modified LS Skin Severity Index (mLoSSI) and the physician global assessment of disease activity (PGA-A) were used as outcome measures. Results: Twenty-five patients with LS were female, with a median age at onset of 7.86 years [interquartile range (IQR) 4.63–11.91]. Median disease duration from onset until start of this treatment regimen was 19.2 months (IQR 8.96–35.35). Median duration of followup was 36.40 months (IQR 29.39–45.36). All patients demonstrated significant improvement in mLoSSI at a median of 1.77 months (IQR 0.76–2.37, 95% CI 1.54, 2.01). PGA-A followed the same trend. No significant adverse reactions or flares were observed during therapy. Conclusion: This single-center LS treatment protocol was effective and well tolerated. Clinical outcome in LS is affected by dose and route of administration of immunosuppressive regimens. Daily tapering dose of corticosteroids and parenteral MTX were effective in controlling LS activity without significant adverse reaction. This regimen should be considered as one of the therapies for LS clinical trials. PMID:22247357
Szabó, Eva; Boehm, Günther; Beermann, Christopher; Weyermann, Maria; Brenner, Hermann; Rothenbacher, Dietrich; Decsi, Tamás
2010-03-01
To compare the fatty acid composition of human milk at 2 different stages of lactation and to investigate the relation between trans isomeric and long-chain polyunsaturated fatty acids (LCPUFAs) in human milk at the sixth month of lactation. We investigated human milk samples obtained at the sixth week and sixth month of lactation from 462 mothers who participated in a large birth cohort study. Fatty acid composition of human milk lipids was determined by high-resolution capillary gas-liquid chromatography. Fat contents of human milk increased significantly between the sixth week and sixth month of lactation (1.63 [2.06] and 3.19 [3.14] g/100 mL; median [interquartile range], P < 0.001). Percentage contributions to human milk fatty acid composition of nearly all polyunsaturated fatty acids also increased significantly (linoleic acid: 10.09 [4.41] and 11.01 [4.53], arachidonic acid: 0.46 [0.32] and 0.48 [0.23], alpha-linolenic acid: 0.69 [0.42] and 0.75 [0.41], and docosahexaenoic acid: 0.17 [0.23] and 0.23 [0.15], % wt/wt, P < 0.001). Values of the 18-carbon trans octadecenoic acid (C18:1n-7/9t) were significantly inversely correlated with linoleic acid (r = -0.24, P < 0.001), alpha-linolenic acid (r = -0.19, P < 0.001), and arachidonic acid (r = -0.43, P < 0.001). In contrast, we found no correlation between the 16-carbon trans hexadecenoic acid (C16:1n-7t) and the same LCPUFAs. Data obtained in the present study indicate increasing fat contents with stable or increasing percentage contributions of LCPUFAs in human milk samples between the sixth week and the sixth month of lactation, and that the availability of 18-carbon trans isomeric fatty acids is inversely associated with the availability of several LCPUFAs in human milk at the sixth month of lactation.
Visser, Marieke M; Heijenbrok-Kal, Majanka H; Van't Spijker, Adriaan; Lannoo, Engelien; Busschbach, Jan J V; Ribbers, Gerard M
2016-01-01
This study investigated whether problem-solving therapy (PST) is an effective group intervention for improving coping strategy and health-related quality of life (HRQoL) in patients with stroke. In this multicenter randomized controlled trial, the intervention group received PST as an add-on to standard outpatient rehabilitation; the control group received outpatient rehabilitation only. Measurements were performed at baseline, directly after the intervention, and 6 and 12 months later. Data were analyzed using linear-mixed models. Primary outcomes were task-oriented coping as measured by the Coping Inventory for Stressful Situations and psychosocial HRQoL as measured by the Stroke-Specific Quality of Life Scale. Secondary outcomes were the EuroQol EQ-5D-5L utility score, emotion-oriented and avoidant coping as measured by the Coping Inventory for Stressful Situations, problem-solving skills as measured by the Social Problem Solving Inventory-Revised, and depression as measured by the Center for Epidemiological Studies Depression Scale. Included were 166 patients with stroke, mean age 53.06 years (SD, 10.19), 53% men, median time poststroke 7.29 months (interquartile range, 4.90-10.61 months). Six months post intervention, the PST group showed significant improvement compared with the control group in task-oriented coping (P=0.008), but not in stroke-specific psychosocial HRQoL. Furthermore, avoidant coping (P=0.039) and the utility value for general HRQoL (P=0.034) improved more in the PST group than in the control group after 6 months. PST seems to improve task-oriented coping, but not disease-specific psychosocial HRQoL, at more than 6 months of follow-up after stroke. Furthermore, we found indications that PST may improve generic HRQoL recovery and avoidant coping. URL: http://www.trialregister.nl/trialreg/admin/rctview.asp?TC=2509. Unique identifier: CNTR2509. © 2015 American Heart Association, Inc.
Predictors of post-partum weight retention in a prospective longitudinal study.
Martin, Julia Elizabeth; Hure, Alexis Jayne; Macdonald-Wicks, Lesley; Smith, Roger; Collins, Clare Elizabeth
2014-10-01
Post-partum weight retention (WR) occurs in 60-80% of women, with some retaining ≥10 kg; contributing factors reportedly include pre-pregnancy body mass index (BMI), gestational weight gain (GWG) and breastfeeding. A longitudinal study of pregnancy, with 12-month post-partum follow-up, was conducted to determine factors associated with WR. Pregnant women (n = 152) were recruited from the John Hunter Hospital antenatal clinic in New South Wales, Australia. Pre-pregnancy weight was self-reported; weight was measured four times during pregnancy (for GWG) and in the first 12 months post-partum. Infant feeding data were obtained via questionnaires. Breastfeeding was categorised as exclusive, predominant, complementary or not breastfeeding. Linear mixed models tested the predictors of WR, with and without adjustment for potential confounders. Compared with pre-pregnancy weight, 68% of women retained weight at 12 months [median (interquartile range) 4.5 kg (2.1-8.9 kg)]. After adjustment, GWG was positively associated with WR (P < 0.01), but pre-pregnancy weight did not predict WR. For each additional week of any breastfeeding, 0.04 kg less weight was retained. Compared with women who retained weight, those who did not had higher rates of exclusive breastfeeding at three months (P < 0.05), but the number of weeks of exclusive breastfeeding failed to predict WR for all women. WR following childbirth is common and associated with GWG, while the number of weeks of 'any' breastfeeding contributed to post-partum weight loss. Whether these factors are modifiable strategies to optimise the weight status of women at this life stage requires further research. © 2012 John Wiley & Sons Ltd.
Application of the FOUR Score in Intracerebral Hemorrhage Risk Analysis.
Braksick, Sherri A; Hemphill, J Claude; Mandrekar, Jay; Wijdicks, Eelco F M; Fugate, Jennifer E
2018-06-01
The Full Outline of Unresponsiveness (FOUR) Score is a validated scale describing the essentials of a coma examination, including motor response, eye opening and eye movements, brainstem reflexes, and respiratory pattern. We incorporated the FOUR Score into the existing ICH Score and evaluated its accuracy of risk assessment in spontaneous intracerebral hemorrhage (ICH). Consecutive patients admitted to our institution from 2009 to 2012 with spontaneous ICH were reviewed. The ICH Score was calculated using patient age, hemorrhage location, hemorrhage volume, evidence of intraventricular extension, and Glasgow Coma Scale (GCS). The FOUR Score was then incorporated into the ICH Score as a substitute for the GCS (ICH Score-FS). The ability of the 2 scores to predict mortality at 1 month was then compared. In total, 274 patients met the inclusion criteria. The median age was 73 years (interquartile range 60-82) and 138 (50.4%) were male. Overall mortality at 1 month was 28.8% (n = 79). The area under the receiver operating characteristic curve was .91 for the ICH Score and .89 for the ICH Score-FS. For ICH Scores of 1, 2, 3, 4, and 5, 1-month mortality was 4.2%, 29.9%, 62.5%, 95.0%, and 100%. In the ICH Score-FS model, mortality was 10.7%, 26.5%, 64.5%, 88.9%, and 100% for scores of 1, 2, 3, 4, and 5, respectively. The ICH Score and the ICH Score-FS predict 1-month mortality with comparable accuracy. As the FOUR Score provides additional clinical information regarding patient status, it may be a reasonable substitute for the GCS in the ICH Score. Copyright © 2018 National Stroke Association. Published by Elsevier Inc. All rights reserved.
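The discrimination comparison described above rests on the area under the receiver operating characteristic curve for each score against 1-month mortality. The sketch below illustrates that comparison only in outline: the outcome rate and the score construction are simulated and hypothetical, not the study's data or scoring rules.

```python
# Illustrative-only AUC comparison of two ordinal risk scores for mortality.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 274
died = rng.binomial(1, 0.29, n)                              # ~29% 1-month mortality
ich_score = np.clip(died * 2 + rng.poisson(1, n), 0, 5)      # simulated stand-in for the GCS-based ICH Score
ich_score_fs = np.clip(died * 2 + rng.poisson(1, n), 0, 5)   # simulated stand-in for the FOUR-based ICH Score-FS

print("AUC, ICH Score   :", round(roc_auc_score(died, ich_score), 2))
print("AUC, ICH Score-FS:", round(roc_auc_score(died, ich_score_fs), 2))
```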
Yarlagadda, Bharath; Turagam, Mohit K; Dar, Tawseef; Jangam, Pragna; Veerapaneni, Vaishnavi; Atkins, Donita; Bommana, Sudharani; Friedman, Paul; Deshmukh, Abhishek J; Doshi, Rahul; Reddy, Vivek Y; Dukkipati, Srinivas R; Natale, Andrea; Lakkireddy, Dhanunjaya
2018-03-01
Atrioventricular node (AVN) ablation and permanent pacing is an established strategy for rate control in the management of symptomatic atrial fibrillation (AF). Leadless pacemakers (LPs) can overcome some of the short-term and long-term limitations of conventional transvenous pacemakers (CTPs). The purpose of this study was to compare the feasibility and safety of LP with those of single-chamber CTP in patients with AF undergoing AVN ablation. We conducted a multicenter observational study of patients undergoing AVN ablation and pacemaker implantation (LP vs single-chamber CTP) between February 2014 and November 2016. The primary efficacy end points were acceptable sensing (R wave ≥5.0 mV) and pacing thresholds (≤2.0 V at 0.4 ms) at follow-up. Safety end points included device-related major and minor (early ≤1 month, late >1 month) adverse events. A total of 127 patients with LP (n = 60) and CTP (n = 67) were studied. The median follow-up was 12 months (interquartile range 12-18 months). Ninety-five percent of the LP group and 97% of the CTP group met the primary efficacy end point at follow-up (57 of 60 vs 65 of 67; P = .66). There was 1 major adverse event (loss of pacing and sensing) in the LP group and 2 (lead dislodgement) in the CTP group (1 of 60 vs 2 of 67; P = 1.00). There were 6 minor adverse events (5 early and 1 late) in the LP group and 3 (early) in the CTP group (6 of 60 vs 3 of 67; P = .30). Our results demonstrate the feasibility and safety of LP compared with CTP in patients undergoing AVN ablation for AF. Copyright © 2018 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.
Cysouw, Matthijs; Bouman-Wammes, Esther; Hoekstra, Otto; van den Eertwegh, Alfons; Piet, Maartje; van Moorselaar, Jeroen; Boellaard, Ronald; Dahele, Max; Oprea-Lager, Daniela
2018-06-01
To investigate the predictive value of [18F]-fluoromethylcholine positron emission tomography/computed tomography (PET/CT)-derived parameters on progression-free survival (PFS) in oligometastatic prostate cancer patients treated with stereotactic body radiation therapy (SBRT). In [18F]-fluoromethylcholine PET/CT scans of 40 consecutive patients with ≤4 metachronous metastases treated with SBRT, we retrospectively measured the number of metastases, standardized uptake values (SUVmean, SUVmax, SUVpeak), metabolically active tumor volume (MATV), and total lesion choline uptake. Partial-volume correction was applied using the iterative deconvolution Lucy-Richardson algorithm. Thirty-seven lymph node and 13 bone metastases were treated with SBRT. Thirty-three patients (82.5%) had 1 lesion, 4 (10%) had 2 lesions, and 3 (7.5%) had 3 lesions. After a median follow-up of 32.6 months (interquartile range, 35.5 months), the median PFS was 11.5 months (95% confidence interval 8.4-14.6 months). Having more than a single metastasis was a significant prognostic factor (hazard ratio 2.74; P = .03), and there was a trend in risk of progression for large MATV (hazard ratio 1.86; P = .10). No SUV or total lesion choline uptake was significantly predictive for PFS, regardless of partial-volume correction. All PET semiquantitative parameters were significantly correlated with each other (P ≤ .013). The number of choline-avid metastases was a significant prognostic factor for progression after [18F]-fluoromethylcholine PET/CT-guided SBRT for recurrent oligometastatic prostate cancer, and there seemed to be a trend toward a higher risk of progression for patients with large MATVs. The lesional level of [18F]-fluoromethylcholine uptake was not prognostic for progression. Copyright © 2018 Elsevier Inc. All rights reserved.
Costello, S P; Ghaly, S; Beswick, L; Pudipeddi, A; Agarwal, A; Sechi, A; O'Connor, S; Connor, S J; Sparrow, M P; Bampton, P; Walsh, A J; Andrews, J M
2015-06-01
The efficacy of infliximab has been demonstrated in patients with both acute severe and moderate-severe ulcerative colitis (UC). However, there is a need for 'real-life data' to ensure that conclusions from trial settings are applicable in usual care. We therefore examined the national experience of anti-tumour necrosis factor-α (TNF-α) therapy in UC. Case notes review of patients with UC who had received compassionate access (CA) anti-TNF-α therapy from prospectively maintained inflammatory bowel disease databases of six Australian adult teaching hospitals. Patients either received drug for acute severe UC (ASUC) failing steroids (n = 29) or for medically refractory UC (MRUC) (n = 35). In ASUC, the treating physicians judged that anti-TNF-α therapy was successful in 20/29 patients (69%); in these cases, anti-TNF-α was able to be discontinued (after 1-3 infusions in 19/20 responders) as clinical remission was achieved. Consistent with this perceived benefit, only 7/29 (24%) subsequently underwent colectomy during a median follow up of 12 months (interquartile range (IQR) 5-16). Eight of the 35 patients with MRUC (23%) required colectomy during a median follow up of 28 months (IQR 11-43). The majority of these patients (20/35 or 57%) had anti-TNF-α therapy for ≥4 months, whereas, 27/29 (93%) of ASUC patients had CA for ≤3 months. These data show an excellent overall benefit for anti-TNF-α therapy in both ASUC and MRUC. In particular, only short-duration anti-TNF-α was required in ASUC. These real-life data thus support the clinical trial data and should lead to broader use of this therapy in UC. © 2015 Royal Australasian College of Physicians.
Colvin, Jeffrey D; Bettenhausen, Jessica L; Anderson-Carpenter, Kaston D; Collie-Akers, Vicki; Plencner, Laura; Krager, Molly; Nelson, Brooke; Donnelly, Sara; Simmons, Julia; Higinio, Valeria; Chung, Paul J
2016-03-01
It is critical that pediatric residents learn to effectively screen families for active and addressable social needs (ie, negative social determinants of health). We sought to determine 1) whether a brief intervention teaching residents about IHELP, a social needs screening tool, could improve resident screening, and 2) how accurately IHELP could detect needs in the inpatient setting. During an 18-month period, interns rotating on 1 of 2 otherwise identical inpatient general pediatrics teams were trained in IHELP. Interns on the other team served as the comparison group. Every admission history and physical examination (H&P) was reviewed for IHELP screening. Social work evaluations were used to establish the sensitivity and specificity of IHELP and document resources provided to families with active needs. During a 21-month postintervention period, every third H&P was reviewed to determine median duration of continued IHELP use. A total of 619 admissions met inclusion criteria. Over 80% of intervention team H&Ps documented use of IHELP. The percentage of social work consults was nearly 3 times greater on the intervention team than on the comparison team (P < .001). Among H&Ps with documented use of IHELP, specificity was 0.96 (95% confidence interval 0.87-0.99) and sensitivity was 0.63 (95% confidence interval 0.50-0.73). Social work provided resources for 78% of positively screened families. The median duration of screening use by residents after the intervention was 8.1 months (interquartile range 1-10 months). A brief intervention increased resident screening and detection of social needs, leading to important referrals to address those needs. Copyright © 2016 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.
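The sensitivity and specificity reported above are the standard screening-test quantities, computed against social work evaluation as the reference standard. The toy calculation below shows the arithmetic; the 2x2 counts are hypothetical, chosen only to be consistent with the reported values, and are not the study's table.

```python
# Hypothetical sketch of sensitivity/specificity for a screening tool
# (screen result vs reference-standard social work evaluation).
def sens_spec(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)   # true positives among all reference-positives
    specificity = tn / (tn + fp)   # true negatives among all reference-negatives
    return sensitivity, specificity

sens, spec = sens_spec(tp=44, fp=8, fn=26, tn=190)   # invented counts
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```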
Jesson, Julie; Coulibaly, Aba; Sylla, Mariam; NʼDiaye, Clémentine; Dicko, Fatoumata; Masson, David; Leroy, Valériane
2017-10-01
We assessed a nutritional support intervention in malnourished HIV-infected children in an HIV-care program of the University Hospital Gabriel Touré, Bamako, Mali. All HIV-infected children younger than 15 years were assessed for malnutrition between July and December 2014. Malnutrition was defined according to the WHO growth standards with Z-scores. Two types were studied: acute malnutrition (AM) and chronic malnutrition (CM). All participants were enrolled in a 6-month prospective interventional cohort, receiving Ready-To-Use Therapeutic Food according to type of malnutrition. The nutritional intervention was offered until child growth reached the -1.5 SD threshold. The 6-month probability of catch-up growth (>-2 SD) was assessed for AM using Kaplan-Meier curves and a Cox model. Among the 348 children screened, 198 (57%) were malnourished, of whom 158 (80%) were included: 97 (61%) for AM (35 with associated CM) and 61 (39%) with CM. Fifty-nine percent were boys, 97% were on antiretroviral therapy, and the median age was 9.5 years (interquartile range: 6.7-12.3). Among children with AM, 74% achieved catch-up growth at 6 months; the probability of catch-up growth was greater for those without associated CM (adjusted hazard ratio = 1.97, 95% CI: 1.13 to 3.44). Anemia decreased significantly from 40% to 12% at the end of the intervention (P < 0.001). This macronutrient intervention showed 6-month benefits for weight gain and reduced anemia among these children, most of whom had been on antiretroviral therapy for years and were older than 5 years at inclusion. Associated CM slows down AM recovery and needs longer support. Integration of nutritional screening and care in the pediatric HIV-care package is needed to optimize growth and prevent metabolic disorders.
Faramarzi, Masumeh; Goharfar, Zahra; Pourabbas, Reza; Kashefimehr, Atabak; Shirmohmmadi, Adileh
2015-08-01
The purpose of this study was to compare the microbial and clinical effects of mechanical debridement (MD) alone or in combination with the application of enamel matrix derivative (EMD) and sustained-release micro-spherical minocycline (MSM) for treatment of peri-implant mucosal inflammation (PIMI). Subjects with at least one implant with PIMI were included and divided into control and two different test groups. In all three groups, MD was performed. In the MSM group, following MD, MSM was placed subgingivally around the implants. In the EMD group, after MD, EMD was placed in the sulcus around the implants. Sampling of peri-implant crevicular fluid for microbial analysis with real-time polymerase chain reaction and recording of probing depth (PD) and bleeding on probing (BOP) were performed prior to as well as two weeks and three months after treatment. Median values and interquartile range were estimated for each variable during the various assessment intervals of the study. In all groups, at two weeks and three months, the counts of Porphyromonas gingivalis decreased significantly compared to baseline. Levels of P. gingivalis were significantly reduced in MSM (P<0.001) and EMD (P=0.026) groups compared to the control group. Also, clinical parameters improved significantly at two weeks and three months. Reduction of PD was significant in MSM (P<0.001) and EMD (P<0.001) groups. The decrease in BOP in the MSM, EMD, and control groups was 60%, 50%, and 20%, respectively. The use of MSM and EMD can be an adjunctive treatment for management of PIMI and improves clinical parameters and reduces P. gingivalis burden three months after treatment.
Mid-Term Vascular Safety of Renal Denervation Assessed by Follow-up MR Imaging
Schmid, Axel, E-mail: axel.schmid@uk-erlangen.de; Schmieder, Raphael; Lell, Michael
Background/Aims: Renal denervation (RDN) emerged as a treatment option for reducing blood pressure (BP) in patients with treatment-resistant hypertension (TRH). However, concerns have been raised regarding the incidence of late renal artery stenosis or thromboembolism after RDN. The goal of the current study was, therefore, to conduct a prospective clinical trial on the mid-term vascular integrity of the renal arteries and the perfusion of the renal parenchyma assessed by magnetic resonance imaging (MRI) in the follow-up after catheter-based RDN. Methods: In our single-centre, investigator-initiated study, 51 patients with true TRH underwent catheter-based RDN using the Symplicity Flex™ catheter (Medtronic Inc., Palo Alto, CA). Follow-up MRI was performed at a median of 11 months (interquartile range 6–18 months) after RDN on a 1.5T MR unit. High-resolution MR angiography (MRA) and MRI results were compared to the baseline digital angiography of renal arteries obtained at the time of RDN. In case of uncertainties (N = 2), catheter angiography was repeated. Results: Both office and 24-h ambulatory BP were significantly reduced 6 and 12 months after RDN. Renal function remained unchanged 6 and 12 months after RDN. In all patients, MRA excluded new or progression of pre-existing low-grade renal artery stenosis as well as focal aneurysms at the sites of radiofrequency ablation. In none of the patients were new segmental perfusion deficits detected in either kidney on MRI. Conclusions: No vascular or parenchymal complications after radiofrequency-based RDN were detected in 51 patients followed up by MRI.
Development of bimanual performance in young children with cerebral palsy.
Klevberg, Gunvor L; Elvrum, Ann-Kristin G; Zucknick, Manuela; Elkjaer, Sonja; Østensjø, Sigrid; Krumlinde-Sundholm, Lena; Kjeken, Ingvild; Jahnsen, Reidun
2018-05-01
To describe the development of bimanual performance among young children with unilateral or bilateral cerebral palsy (CP). A population-based sample of 102 children (53 males, 49 females), median age 28.5 months (interquartile range [IQR] 16mo) at first assessment and 47 months (IQR 18mo) at last assessment, was assessed half-yearly with the Assisting Hand Assessment (AHA) or the Both Hands Assessment (BoHA) for a total of 329 assessments. Developmental limits and rates were estimated by nonlinear mixed-effects models. Developmental trajectories were compared between levels of manual ability (Mini-Manual Ability Classification System [Mini-MACS] and MACS) and AHA or BoHA performance at 18 months of age (AHA-18/BoHA-18) for both CP subgroups, and additionally between children with bilateral CP with symmetric or asymmetric hand use. For both CP subgroups, children classified in Mini-MACS/MACS level I, and those with high AHA-18 or BoHA-18 reached the highest limits of performance. For children with bilateral CP the developmental change was small, and children with symmetric hand use reached the highest limits. Mini-MACS/MACS levels and AHA-18 or BoHA-18 distinguished between various developmental trajectories both for children with unilateral and bilateral CP. Children with bilateral CP changed their performance to a smaller extent than children with unilateral CP. Manual Ability Classification System levels and Assisting Hand Assessment/Both Hands Assessment performance at 18 months are important predictors of hand use development in cerebral palsy (CP). Children with bilateral CP improved less than those with unilateral CP. Children with bilateral CP and symmetric hand use reached higher limits than those with asymmetry. © 2018 Mac Keith Press.
Effectiveness of early intensive therapy on β-cell preservation in type 1 diabetes.
Buckingham, Bruce; Beck, Roy W; Ruedy, Katrina J; Cheng, Peiyao; Kollman, Craig; Weinzimer, Stuart A; DiMeglio, Linda A; Bremer, Andrew A; Slover, Robert; Tamborlane, William V
2013-12-01
To assess effectiveness of inpatient hybrid closed-loop control (HCLC) followed by outpatient sensor-augmented pump (SAP) therapy initiated within 7 days of diagnosis of type 1 diabetes on the preservation of β-cell function at 1 year. Sixty-eight individuals (mean age 13.3 ± 5.7 years; 35% female, 92% Caucasian) were randomized to HCLC followed by SAP therapy (intensive group; N = 48) or to the usual-care group treated with multiple daily injections or insulin pump therapy (N = 20). Primary outcome was C-peptide concentrations during mixed-meal tolerance tests at 12 months. Intensive-group participants initiated HCLC a median of 6 days after diagnosis for a median duration of 71.3 h, during which median participant mean glucose concentration was 140 mg/dL (interquartile range 134-153 mg/dL). During outpatient SAP, continuous glucose monitor (CGM) use decreased over time, and at 12 months, only 33% of intensive participants averaged sensor use ≥6 days/week. In the usual-care group, insulin pump and CGM use were initiated prior to 12 months by 15 and 5 participants, respectively. Mean HbA1c levels were similar in both groups throughout the study. At 12 months, the geometric mean (95% CI) of C-peptide area under the curve was 0.43 (0.34-0.52) pmol/mL in the intensive group and 0.52 (0.32-0.75) pmol/mL in the usual-care group (P = 0.49). Thirty-seven (79%) intensive and 16 (80%) usual-care participants had a peak C-peptide concentration ≥0.2 pmol/mL (P = 0.30). In new-onset type 1 diabetes, HCLC followed by SAP therapy did not provide benefit in preserving β-cell function compared with current standards of care.
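For readers unfamiliar with the primary outcome above, the calculation is a per-participant area under the C-peptide curve from the mixed-meal tolerance test, divided by the test duration and then summarised across the group as a geometric mean. The sketch below uses made-up sampling times and concentrations purely to show the arithmetic; it is not the study's analysis code.

```python
# Minimal sketch: mean C-peptide AUC per subject, then a group geometric mean.
import numpy as np

times_min = np.array([0, 30, 60, 90, 120])   # hypothetical sampling times (min)
cpeptide = np.array([                         # hypothetical concentrations, pmol/mL
    [0.2, 0.5, 0.7, 0.6, 0.4],
    [0.1, 0.3, 0.5, 0.5, 0.3],
    [0.3, 0.6, 0.9, 0.8, 0.5],
])

auc = np.trapz(cpeptide, times_min, axis=1) / times_min[-1]  # time-averaged AUC, pmol/mL
geo_mean = np.exp(np.mean(np.log(auc)))
print("per-subject AUC:", np.round(auc, 2), "geometric mean:", round(geo_mean, 2))
```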
Trends in Internet Use Among Men Who Have Sex With Men in the United States.
Paz-Bailey, Gabriela; Hoots, Brooke E; Xia, Mingjing; Finlayson, Teresa; Prejean, Joseph; Purcell, David W
2017-07-01
Internet-based platforms are increasingly prominent interfaces for social and sexual networking among men who have sex with men (MSM). MSM were recruited through venue-based sampling in 2008, 2011, and 2014 in 20 US cities. We examined changes in internet use (IU) to meet men and in meeting the last partner online among MSM from 2008 to 2014 using Poisson regression with generalized estimating equations to calculate adjusted prevalence ratios (APRs). We also examined factors associated with increased frequency of IU using data from 2014. IU was categorized as never, infrequent use (
Hamidi, Oksana; Callstrom, Matthew R; Lee, Robert A; Dean, Diana; Castro, M Regina; Morris, John C; Stan, Marius N
2018-03-21
To assess the effectiveness, tolerability, and complications of radiofrequency ablation (RFA) in patients with benign large thyroid nodules (TNs). This is a retrospective review of 14 patients with predominantly solid TNs treated with RFA at Mayo Clinic in Rochester, Minnesota, from December 1, 2013, through October 30, 2016. All the patients declined surgery or were poor surgical candidates. The TNs were benign on fine-needle aspiration, enlarging or causing compressive symptoms, and 3 cm or larger in largest diameter. We evaluated TN volume, compressive symptoms, cosmetic concerns, and thyroid function. Median TN volume reduction induced by RFA was 44.6% (interquartile range [IQR], 42.1%-59.3%), from 24.2 mL (IQR, 17.7-42.5 mL) to 14.4 mL (IQR, 7.1-19.2 mL) (P<.001). Median follow-up was 8.6 months (IQR, 3.9-13.9 months). Maximum results were achieved by 6 months. Radiofrequency ablation did not affect thyroid function. In 1 patient with subclinical hyperthyroidism due to toxic adenoma, thyroid function normalized 4 months after ablation of the toxic nodule. Compressive symptoms resolved in 8 of 12 patients (67%) and improved in the other 4 (33%). Cosmetic concerns improved in all 8 patients. The procedure had no sustained complications. In this population, RFA of benign large TNs performed similarly to the reports from Europe and Asia. It induces a substantial volume reduction of predominantly solid TNs, improves compressive symptoms and cosmetic concerns, and does not affect normal thyroid function. Radiofrequency ablation has an acceptable safety profile and should be considered as a low-risk alternative to conventional treatment of symptomatic benign TNs. Copyright © 2017 Mayo Foundation for Medical Education and Research. Published by Elsevier Inc. All rights reserved.
Okada, Masaya; Imagawa, Jun; Tanaka, Hideo; Nakamae, Hirohisa; Hino, Masayuki; Murai, Kazunori; Ishida, Yoji; Kumagai, Takashi; Sato, Seiichi; Ohashi, Kazuteru; Sakamaki, Hisashi; Wakita, Hisashi; Uoshima, Nobuhiko; Nakagawa, Yasunori; Minami, Yosuke; Ogasawara, Masahiro; Takeoka, Tomoharu; Akasaka, Hiroshi; Utsumi, Takahiko; Uike, Naokuni; Sato, Tsutomu; Ando, Sachiko; Usuki, Kensuke; Mizuta, Syuichi; Hashino, Satoshi; Nomura, Tetsuhiko; Shikami, Masato; Fukutani, Hisashi; Ohe, Yokiko; Kosugi, Hiroshi; Shibayama, Hirohiko; Maeda, Yasuhiro; Fukushima, Toshihiro; Yamazaki, Hirohito; Tsubaki, Kazuo; Kukita, Toshimasa; Adachi, Yoko; Nataduka, Toshiki; Sakoda, Hiroto; Yokoyama, Hisayuki; Okamoto, Takahiro; Shirasugi, Yukari; Onishi, Yasushi; Nohgawa, Masaharu; Yoshihara, Satoshi; Morita, Satoshi; Sakamoto, Junichi; Kimura, Shinya
2018-05-01
We previously reported an interim analysis of the DADI (dasatinib discontinuation) trial. The results showed that 48% of patients with chronic myeloid leukemia in the chronic phase who maintained a deep molecular response (DMR) for ≥ 1 year could discontinue second- or subsequent-line dasatinib treatment safely at a median follow-up of 20 months. However, the results from longer follow-up periods would be much more useful from a clinical perspective. The DADI trial was a prospective, multicenter trial conducted in Japan. After confirming a stable DMR for ≥ 1 year, dasatinib treatment subsequent to imatinib or nilotinib was discontinued. After discontinuation, the loss of DMR (even of 1 point) was defined as stringent molecular relapse, thereby triggering therapy resumption. The predictive factors of treatment-free remission (TFR) were analyzed. The median follow-up period was 44.0 months (interquartile range, 40.5-48.0 months). The estimated overall TFR rate at 36 months was 44.4% (95% confidence interval, 32.0%-56.2%). Only 2 patients developed a molecular relapse after the 1-year cutoff point. The presence of imatinib resistance was a significant risk factor for molecular relapse. Moreover, high natural killer cell and low γδ+ T-cell and CD4+ regulatory T-cell (CD25+CD127low) counts before discontinuation correlated significantly with successful therapy discontinuation. These findings suggest that discontinuation of second- or subsequent-line dasatinib after a sustained DMR of ≥ 1 year is feasible, especially for patients with no history of imatinib resistance. In addition, the natural killer cell count was associated with the TFR. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
Impact of Obstructive Sleep Apnoea on Heart Failure with Preserved Ejection Fraction.
Arikawa, Takuo; Toyoda, Shigeru; Haruyama, Akiko; Amano, Hirohisa; Inami, Shu; Otani, Naoyuki; Sakuma, Masashi; Taguchi, Isao; Abe, Shichiro; Node, Koichi; Inoue, Teruo
2016-05-01
The impact of obstructive sleep apnoea on heart failure with preserved ejection fraction is unknown. Fifty-eight patients who had heart failure with a left ventricular ejection fraction ≥50% underwent a sleep study. Brain natriuretic peptide (BNP) levels were determined at enrolment and at one, six, 12 and 36 months after enrolment. Obstructive sleep apnoea was found in 39 patients (67%), and they were all subsequently treated with continuous positive airway pressure. Echocardiography at admission showed that E/E' tended to be higher in the 39 patients with, than in the 19 patients without, obstructive sleep apnoea (15.0±3.6 vs 12.1±1.9, respectively, P=0.05). The median BNP levels at enrolment were similar in patients with and without obstructive sleep apnoea [median (interquartile range): 444 (233-752) vs 316 (218-703) pg/ml]. Although BNP levels decreased over time in both groups, the reduction was less pronounced in patients with obstructive sleep apnoea (P<0.05). Consequently, BNP levels were higher in patients with sleep apnoea at six months [221 (137-324) vs 76 (38-96) pg/ml, P<0.05], 12 months [123 (98-197) vs 52 (38-76) pg/ml, P<0.05] and 36 months [115 (64-174) vs 56 (25-74) pg/ml, P<0.05]. Obstructive sleep apnoea, even when treated appropriately, may worsen long-term cardiac function and outcomes in patients who have heart failure with preserved ejection fraction. Copyright © 2015 Australian and New Zealand Society of Cardiac and Thoracic Surgeons (ANZSCTS) and the Cardiac Society of Australia and New Zealand (CSANZ). Published by Elsevier B.V. All rights reserved.
Chipollini, Juan; Abel, E Jason; Peyton, Charles C; Boulware, David C; Karam, Jose A; Margulis, Vitaly; Master, Viraj A; Zargar-Shoshtari, Kamran; Matin, Surena F; Sexton, Wade J; Raman, Jay D; Wood, Christopher G; Spiess, Philippe E
2018-04-01
To determine the therapeutic value of lymph node dissection (LND) during cytoreductive nephrectomy (CN) and assess predictors of cancer-specific survival (CSS) in metastatic renal-cell carcinoma. We identified 293 consecutive patients treated with CN at 4 academic institutions from March 2000 to May 2015. LND was performed in 187 patients (63.8%). CSS was estimated by the Kaplan-Meier method for the entire cohort and for a propensity score-matched cohort. Cox proportional hazards regression was used to evaluate CSS in a multivariate model and in an inverse probability weighting-adjusted model for patients who underwent dissection. Median follow-up was 12.6 months (interquartile range, 4.47, 30.3), and median survival was 15.9 months. Of the 293 patients, 187 (63.8%) underwent LND. One hundred six patients had nodal involvement (pN+) with a median CSS of 11.3 months (95% confidence interval [CI], 6.6, 15.9) versus 24.2 months (95% confidence interval, 14.1, 34.3) for pN- patients (log-rank P = .002). The hazard ratio for LND was 1.325 (95% CI, 1.002, 1.75) for the whole cohort and 1.024 (95% CI, 0.682, 1.537) in the propensity score-matched cohort. Multivariate analysis revealed that number of positive lymph nodes (P < .001) was a significant predictor of worse CSS. For patients with metastatic renal-cell carcinoma undergoing CN with lymphadenectomy, the number of nodes positive was predictive of survival at short-term follow-up. However, nonstandardized lymphadenectomy only provided prognostic information without therapeutic benefit. Prospective studies with standardized templates are required to further ascertain the therapeutic value of LND. Copyright © 2017 Elsevier Inc. All rights reserved.
Neve, Melinda J; Morgan, Philip J; Collins, Clare E
2012-07-01
As further understanding is required of what behavioural factors are associated with long-term weight-loss success, the aim of the present study was to determine the prevalence of successful weight loss 15 months post-enrolment in a commercial web-based weight-loss programme, and which behavioural factors were associated with success. An online survey was completed 15 months post-enrolment in a commercial web-based weight-loss programme to assess weight-related behaviours and current weight. Participants were classified as successful if they had lost ≥5 % of their starting weight after 15 months. The study included commercial users of the web-based weight-loss programme who enrolled between August 2007 and May 2008. Six hundred and seventy-seven participants completed the survey. The median (interquartile range) weight change was -2·7 (-8·2, 1·6) % of enrolment weight, with 37 % achieving ≥5 % weight loss. Multivariate logistic regression analysis found success was associated with frequency of weight self-monitoring, higher dietary restraint score, lower emotional eating score, not skipping meals, not keeping snack foods in the house and eating takeaway foods less frequently. The findings suggest that individuals trying to achieve or maintain ≥5 % weight loss should be advised to regularly weigh themselves, avoid skipping meals or keeping snack foods in the house, limit the frequency of takeaway food consumption, manage emotional eating and strengthen dietary restraint. Strategies to assist individuals make these changes to behaviour should be incorporated within obesity treatments to improve the likelihood of successful weight loss in the long term.
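A minimal sketch of the kind of multivariate logistic regression described above, relating self-reported behaviours to ≥5 % weight-loss success. Variable names, effect sizes and data are all invented for illustration; the actual survey covered more behaviours (self-weighing frequency, meal skipping, snack foods in the house, takeaway frequency, and restraint and emotional eating scores).

```python
# Hypothetical logistic regression of weight-loss success on behaviours.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 677
df = pd.DataFrame({
    "weigh_freq_per_week": rng.integers(0, 7, n),
    "dietary_restraint":   rng.normal(12, 3, n),
    "emotional_eating":    rng.normal(10, 3, n),
    "skips_meals":         rng.binomial(1, 0.3, n),
})
# Toy data-generating process for the binary outcome (>=5% weight loss).
logit = (-3 + 0.3 * df.weigh_freq_per_week + 0.15 * df.dietary_restraint
         - 0.1 * df.emotional_eating - 0.5 * df.skips_meals)
df["success_5pct"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.logit("success_5pct ~ weigh_freq_per_week + dietary_restraint + "
                  "emotional_eating + skips_meals", data=df).fit()
print(np.exp(model.params))   # odds ratios per unit change in each behaviour
```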
Spataro, Rossella; Bono, Valeria; Marchese, Santino; La Bella, Vincenzo
2012-12-15
Tracheostomy mechanical ventilation (TMV) is performed in amyotrophic lateral sclerosis (ALS) patients with respiratory failure or when non-invasive ventilation (NIV) is no longer effective. We evaluated the clinical characteristics and survival of a cohort of tracheostomized ALS patients followed in a single ALS Clinical Center. Between 2001 and 2010, 87 out of 279 ALS patients underwent TMV. Onset was spinal in 62 and bulbar in 25. After tracheostomy, most patients were followed up through telephone interviews with caregivers. A complete survival analysis could be performed in 52 TMV patients. Overall, 31.3% of ALS patients underwent tracheostomy, with a male prevalence (M/F=1.69) and a median age of 61 years (interquartile range=47-66). After tracheostomy, nearly all patients were under home care. TMV ALS patients were more likely than non-tracheostomized (NT) patients to be implanted with a PEG device, although the bulbar-/spinal-onset ratio did not differ between the two groups. Kaplan-Meier analysis showed that tracheostomy increases median survival (TMV, 47 months vs NT, 31 months, p=0.008), with the greatest effect in patients younger than 60 at onset (TMV ≤ 60 years, 57.5 months vs NT ≤ 60 years, 38.5 months, p=0.002). TMV is increasingly performed in ALS patients. Nearly all TMV patients live at home and most of them are fed through a PEG device. Survival after tracheostomy is generally increased, with the strongest effect in patients younger than 60. This survival advantage is apparently lost when TMV is performed in patients older than 60. The results of this study might be useful for the decision-making process of patients and their families about this advanced palliative care. Copyright © 2012. Published by Elsevier B.V.
Bouchghoul, Hanane; Hornez, Emmanuel; Duval-Arnould, Xavier; Philippe, Henri-Jean; Nizard, Jacky
2015-07-01
To report the first 6 months of experience of a nongovernmental-organization-managed obstetric care unit in a war refugee camp, with problems encountered and solutions implemented. Prospective observational study of the maternity activity of Gynécologie Sans Frontières (GSF). GSF's maternity unit, in Zaatari camp (Jordan). All pregnant women among Syrian refugees who came to the unit for delivery. The GSF's maternity unit is a light structure built with three tents, permitting low-risk pregnancy care and childbirth. Emergency cesarean deliveries were performed in the Moroccan army field hospital. High-risk pregnancies were transferred to Al Mafraq or Amman Hospital (Jordan) after assessment. Delivery characteristics, indications for referral. From September 2012 to February 2013, 371 women attended the unit and 299 delivered in it. Delivery rates increased from 5/month to 112/month over the period. Mean gestational age at birth was 39(+3) gestational weeks (SD = 1.9). Median birthweight was 3100 g (25-75% interquartile range 2840-3430 g). Spontaneous vaginal deliveries were dominant and the major maternal complication was postpartum hemorrhage (n = 13). Eighty-two women were referred to Al Mafraq or Amman hospitals, mainly for preterm labor (32%) and congenital malformations (11%). We managed one case of stillbirth. Maternal mortality did not occur. Despite the difficulties of war, high-risk pregnant women were properly identified, permitting referrals when required. Cooperation with other nongovernmental organizations, including the United Nations High Commissioner for Refugees, was essential for the management of situations at risk of complications and to contain perinatal and maternal mortality. © 2015 Nordic Federation of Societies of Obstetrics and Gynecology.
Aggressive Regimens for Multidrug-Resistant Tuberculosis Reduce Recurrence
Franke, Molly F.; Appleton, Sasha C.; Mitnick, Carole D.; Furin, Jennifer J.; Bayona, Jaime; Chalco, Katiuska; Shin, Sonya; Murray, Megan; Becerra, Mercedes C.
2013-01-01
Background. Recurrent tuberculosis disease occurs within 2 years in as few as 1% and as many as 29% of individuals successfully treated for multidrug-resistant (MDR) tuberculosis. A better understanding of treatment-related factors associated with an elevated risk of recurrent tuberculosis after cure is urgently needed to optimize MDR tuberculosis therapy. Methods. We conducted a retrospective cohort study among adults successfully treated for MDR tuberculosis in Peru. We used multivariable Cox proportional hazards regression analysis to examine whether receipt of an aggressive MDR tuberculosis regimen for ≥18 months following sputum conversion from positive to negative was associated with a reduced rate of recurrent tuberculosis. Results. Among 402 patients, the median duration of follow-up was 40.5 months (interquartile range, 21.2–53.4). Receipt of an aggressive MDR tuberculosis regimen for ≥18 months following sputum conversion was associated with a lower risk of recurrent tuberculosis (hazard ratio, 0.40 [95% confidence interval, 0.17–0.96]; P = .04). A baseline diagnosis of diabetes mellitus also predicted recurrent tuberculosis (hazard ratio, 10.47 [95% confidence interval, 2.17–50.60]; P = .004). Conclusions. Individuals who received an aggressive MDR tuberculosis regimen for ≥18 months following sputum conversion experienced a lower rate of recurrence after cure. Efforts to ensure that an aggressive regimen is accessible to all patients with MDR tuberculosis, such as minimization of sequential ineffective regimens, expanded drug access, and development of new MDR tuberculosis compounds, are critical to reducing tuberculosis recurrence in this population. Patients with diabetes mellitus should be carefully managed during initial treatment and followed closely for recurrent disease. PMID:23223591
Jama, Zimasa V; Chin, Ashley; Mayosi, Bongani M; Badri, Motasim
2015-01-01
Objectives: Little is known about the performance of re-used pacemakers and implantable cardioverter defibrillators (ICDs) in Africa. We sought to compare the risk of infection and the rate of malfunction of re-used pacemakers and ICDs with new devices implanted at Groote Schuur Hospital in Cape Town, South Africa. Methods: This was a retrospective case comparison study of the performance of re-used pacemakers and ICDs in comparison with new devices implanted at Groote Schuur Hospital over a 10-year period. The outcomes were incidence of device infection, device malfunction, early battery depletion, and device removal due to infection, malfunction, or early battery depletion. Results: Data for 126 devices implanted in 126 patients between 2003 and 2013 were analysed, of which 102 (81%) were pacemakers (51 re-used and 51 new) and 24 (19%) were ICDs (12 re-used and 12 new). There was no device infection, malfunction, early battery depletion or device removal in either the re-used or new pacemaker groups over the median follow-up of 15.1 months [interquartile range (IQR), 1.3–36.24 months] for the re-used pacemakers, and 55.8 months (IQR, 20.3–77.8 months) for the new pacemakers. In the ICD group, no device infection occurred over a median follow-up of 35.9 months (IQR, 17.0–70.9 months) for the re-used ICDs and 45.7 months (IQR, 37.6–53.7 months) for the new ICDs. One device delivered inappropriate shocks, which resolved without intervention and with no harm to the patient. This re-used ICD subsequently needed generator replacement 14 months later. In both the pacemaker and ICD groups, there were no procedure-non-related infections documented for the respective follow-up periods. Conclusion: No significant differences were found in performance between re-used and new pacemakers and ICDs with regard to infection rates, device malfunction, battery life and device removal for complications. Pacemaker and ICD re-use is feasible and safe and is a viable option for patients with bradyarrhythmias and tachyarrhythmias. PMID:26407220
Forns, Joan; Dadvand, Payam; Esnaola, Mikel; Alvarez-Pedrerol, Mar; López-Vicente, Mònica; Garcia-Esteban, Raquel; Cirach, Marta; Basagaña, Xavier; Guxens, Mònica; Sunyer, Jordi
2017-11-01
Recently, we showed that exposure to traffic-related air pollutants (TRAPs) at school was negatively associated with cognitive development, specifically working memory and inattentiveness, in primary schoolchildren during a course of 12 months. The persistence of such associations over longer periods remains an open question. To study the longitudinal association between TRAPs at school and cognitive development over a period of 3.5 years. Indoor and outdoor levels of TRAPs (elemental carbon (EC), nitrogen dioxide (NO2), particulate matter (PM2.5) from traffic sources and ultrafine particles (UFP)) were measured at 39 schools across Barcelona during 2012/2013. Working memory, as a measure of cognitive development, was evaluated 4 times in the 2012/2013 assessment and was re-evaluated one more time in 2015 using the computerized n-back test (3-back d' as the main outcome). Linear mixed effects models were used to test the association between TRAPs and 3-back d', adding child and school as random effects to account for the multilevel nature of the data, and school air pollutant levels (one at a time) as predictor. We found detrimental associations between all TRAPs and the annual change in 3-back d' (working memory) (i.e. slower development of working memory in children attending schools with higher levels of air pollution). The associations (per one interquartile range increase in exposure) were strongest for outdoor NO2 (coefficient (Coef) = -4.22, 95% confidence interval (CI), -6.22, -2.22) and indoor UFP (Coef = -4.12, 95% CI, -5.68, -1.83). These reductions were equivalent to -20% (95% CI, -30.1, -10.7) and -19.9% (95% CI, -31.5, -8.4) change in annual working memory development associated with one interquartile range increase in outdoor NO2 and indoor UFP, respectively. Our findings suggest the persistence of the negative association between TRAP exposure at school and cognitive trajectory measured by the n-back test over a period of 3.5 years. Copyright © 2017 Elsevier Inc. All rights reserved.
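The modelling strategy above (repeated working-memory scores nested within children and schools, with a school-level pollutant as predictor) can be sketched with a linear mixed model. The toy below uses simulated data and, for brevity, keeps only a child-level random intercept; adding the school-level random effect, as the study did, is a straightforward extension. All names and effect sizes are assumptions.

```python
# Hypothetical mixed-model sketch: 3-back d' on visit, NO2, and their interaction,
# with a random intercept per child (simulated data only).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_children, n_visits = 120, 5
child = np.repeat(np.arange(n_children), n_visits)
visit = np.tile(np.arange(n_visits), n_children)
no2 = np.repeat(rng.normal(50, 15, n_children), n_visits)        # school-level NO2, μg/m3
child_intercept = np.repeat(rng.normal(0, 0.5, n_children), n_visits)
# Toy process: growth per visit, attenuated at higher NO2, plus child-level variation.
dprime = (1.0 + 0.15 * visit - 0.004 * no2 * visit
          + child_intercept + rng.normal(0, 0.3, child.size))

df = pd.DataFrame({"child": child, "visit": visit, "no2": no2, "dprime": dprime})
model = smf.mixedlm("dprime ~ visit * no2", df, groups="child").fit()
print(model.summary())   # the visit:no2 term captures slower annual change with higher NO2
```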
Non-invasive imaging of oxygen extraction fraction in adults with sickle cell anaemia.
Jordan, Lori C; Gindville, Melissa C; Scott, Allison O; Juttukonda, Meher R; Strother, Megan K; Kassim, Adetola A; Chen, Sheau-Chiann; Lu, Hanzhang; Pruthi, Sumit; Shyr, Yu; Donahue, Manus J
2016-03-01
Sickle cell anaemia is a monogenetic disorder with a high incidence of stroke. While stroke screening procedures exist for children with sickle cell anaemia, no accepted screening procedures exist for assessing stroke risk in adults. The purpose of this study is to use novel magnetic resonance imaging methods to evaluate physiological relationships between oxygen extraction fraction, cerebral blood flow, and clinical markers of cerebrovascular impairment in adults with sickle cell anaemia. The specific goal is to determine to what extent elevated oxygen extraction fraction may be uniquely present in patients with higher levels of clinical impairment and therefore may represent a candidate biomarker of stroke risk. Neurological evaluation, structural imaging, and the non-invasive T2-relaxation-under-spin-tagging magnetic resonance imaging method were applied in sickle cell anaemia (n = 34) and healthy race-matched control (n = 11) volunteers without sickle cell trait to assess whole-brain oxygen extraction fraction, cerebral blood flow, degree of vasculopathy, severity of anaemia, and presence of prior infarct; findings were interpreted in the context of physiological models. Cerebral blood flow and oxygen extraction fraction were elevated (P < 0.05) in participants with sickle cell anaemia (n = 27) not receiving monthly blood transfusions (interquartile range cerebral blood flow = 46.2-56.8 ml/100 g/min; oxygen extraction fraction = 0.39-0.50) relative to controls (interquartile range cerebral blood flow = 40.8-46.3 ml/100 g/min; oxygen extraction fraction = 0.33-0.38). Oxygen extraction fraction (P < 0.0001) but not cerebral blood flow was increased in participants with higher levels of clinical impairment. These data provide support for T2-relaxation-under-spin-tagging being able to quickly and non-invasively detect elevated oxygen extraction fraction in individuals with sickle cell anaemia with higher levels of clinical impairment. Our results support the premise that magnetic resonance imaging-based assessment of elevated oxygen extraction fraction might be a viable screening tool for evaluating stroke risk in adults with sickle cell anaemia. © The Author (2016). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Kooiman, J; Sijpkens, Y W J; van Buren, M; Groeneveld, J H M; Ramai, S R S; van der Molen, A J; Aarts, N J M; van Rooden, C J; Cannegieter, S C; Putter, H; Rabelink, T J; Huisman, M V
2014-10-01
Hydration to prevent contrast-induced acute kidney injury (CI-AKI) induces a diagnostic delay when performing computed tomography-pulmonary angiography (CTPA) in patients suspected of having acute pulmonary embolism. To analyze whether withholding hydration is non-inferior to sodium bicarbonate hydration before CTPA in patients with chronic kidney disease (CKD). We performed an open-label multicenter randomized trial between 2009 and 2013. One hundred thirty-nine CKD patients were randomized, of whom 138 were included in the intention-to-treat population: 67 were randomized to withholding hydration and 71 were randomized to 1-h 250 mL 1.4% sodium bicarbonate hydration before CTPA. Primary outcome was the increase in serum creatinine 48-96 h after CTPA. Secondary outcomes were the incidence of CI-AKI (creatinine increase > 25%/> 0.5 mg dL(-1) ), recovery of renal function, and the need for dialysis within 2 months after CTPA. Withholding hydration was considered non-inferior if the mean relative creatinine increase was ≤ 15% compared with sodium bicarbonate. Mean relative creatinine increase was -0.14% (interquartile range -15.1% to 12.0%) for withholding hydration and -0.32% (interquartile range -9.7% to 10.1%) for sodium bicarbonate (mean difference 0.19%, 95% confidence interval -5.88% to 6.25%, P-value non-inferiority < 0.001). CI-AKI occurred in 11 patients (8.1%): 6 (9.2%) were randomized to withholding hydration and 5 (7.1%) to sodium bicarbonate (relative risk 1.29, 95% confidence interval 0.41-4.03). Renal function recovered in 80.0% of CI-AKI patients within each group (relative risk 1.00, 95% confidence interval 0.54-1.86). None of the CI-AKI patients developed a need for dialysis. Our results suggest that preventive hydration could be safely withheld in CKD patients undergoing CTPA for suspected acute pulmonary embolism. This will facilitate management of these patients and prevents delay in diagnosis as well as unnecessary start of anticoagulant treatment while receiving volume expansion. © 2014 International Society on Thrombosis and Haemostasis.
Lin, J.-C.; Wang, W.-Y.; Liang, W.-M.
Purpose: To evaluate the long-term prognostic impact of plasma Epstein-Barr virus (EBV) DNA concentration measured by real-time quantitative polymerase chain reaction (RTQ-PCR) in nasopharyngeal carcinoma (NPC) patients receiving concurrent chemoradiotherapy (CCRT). Methods and Materials: Epstein-Barr virus DNA was retrospectively measured from stock plasma of 152 biopsy-proven NPC patients with Stage II-IV (M0) disease with an RTQ-PCR using the minor groove binder-probe. All patients received CCRT with a median follow-up of 78 months. We divided patients into three subgroups: (1) low pretreatment EBV DNA (<1,500 copies/mL) and undetectable posttreatment EBV DNA (pre-L/post-U), (2) high pretreatment EBV DNA (≥1,500 copies/mL) and undetectable posttreatment EBV DNA (pre-H/post-U), and (3) low or high pretreatment EBV DNA and detectable posttreatment EBV DNA (pre-L or H/post-D) for prognostic analyses. Results: Epstein-Barr virus DNA (median concentration, 573 copies/mL; interquartile range, 197-3,074) was detected in the pretreatment plasma of 94.1% (143/152) of patients. After treatment, plasma EBV DNA decreased or remained 0 for all patients and was detectable in 31 patients (20.4%), with a median concentration of 0 copies/mL (interquartile range, 0-0). The 5-year overall survival rates of the pre-L/post-U, pre-H/post-U, and pre-L or H/post-D subgroups were 87.2%, 71.0%, and 38.7%, respectively (p < 0.0001). The relapse-free survival showed similar results, with corresponding rates of 85.6%, 75.9%, and 26.9%, respectively (p < 0.0001). Multivariate Cox analysis confirmed the superior effects of plasma EBV DNA compared to other clinical parameters in prognosis prediction. Conclusion: Plasma EBV DNA is the most valuable prognostic factor for NPC. More chemotherapy should be considered for patients with persistently detectable EBV DNA after CCRT.
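Purely as an illustration of the subgroup survival comparison above, the sketch below fits Kaplan-Meier curves for three groups. The follow-up times are simulated (exponential, administratively censored at 78 months) and the group labels are reused from the abstract for readability; nothing here reproduces the study data or its log-rank and Cox analyses.

```python
# Illustrative Kaplan-Meier estimates for three simulated prognostic subgroups.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(4)
groups = {"pre-L/post-U": 80, "pre-H/post-U": 60, "pre-L or H/post-D": 35}  # toy scale parameters
kmf = KaplanMeierFitter()
for label, scale in groups.items():
    t = rng.exponential(scale, 50).clip(max=78)   # months, censored at end of follow-up
    event = (t < 78).astype(int)                  # 1 = death observed, 0 = censored
    kmf.fit(t, event, label=label)
    print(label, "estimated 5-year survival:", round(float(kmf.predict(60)), 2))
```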
Modifying exposure to smoking depicted in movies: a novel approach to preventing adolescent smoking.
Sargent, James D; Dalton, Madeline A; Heatherton, Todd; Beach, Mike
2003-07-01
Most behavioral approaches to adolescent smoking address the behavior directly. We explore an indirect approach: modifying exposure to portrayals of smoking in movies. To describe adolescents' exposure to smoking in movies and to examine factors that could modify such exposure. Occurrences of smoking were counted in each of 601 popular movies. Four thousand nine hundred ten northern New England junior high school students were asked to report which movies they had seen from a randomly generated subsample of 50 films, and responses were used to estimate exposure to the entire sample. Analysis: The outcome variable was exposure to movie smoking, defined as the number of smoking occurrences seen. Risk factors for exposure included access to movies (movie channels, videotape use, and movie theater); parenting (R [restricted]-rated movie restrictions, television restrictions, parenting style); and characteristics of the child (age, sex, school performance, sensation-seeking propensity, rebelliousness, and self-esteem). We used multiple regression to assess the association between risk factors and exposure to movie smoking. Subjects had seen an average of 30% of the movie sample (interquartile range, 20%-44%), from which they were exposed to 1160 (interquartile range, 640-1970) occurrences of smoking. In a multivariate model, exposure to movie smoking increased (all P values <.001) by about 10% for each additional movie channel and for every 2 videos watched per week. Exposure increased by 30% for those going to the movie theater more than once per month compared with those who did not go at all. Parent restriction on viewing R-rated movies resulted in a 50% reduction in exposure to movie smoking. There was no association between parenting style and exposure to movie smoking. Much of the protective effect of parent R-rated movie restriction on adolescent smoking was mediated through lower exposure to movie smoking. Adolescents see thousands of smoking depictions in movies, and this influences their attitudes and behavior. Exposure to movie smoking is reduced when parents limit movie access. Teaching parents to monitor and enforce movie access guidelines could reduce adolescent smoking in an indirect, yet powerful, manner.
Goldberg, Mark S; Burnett, Richard T; Stieb, David M; Brophy, James M; Daskalopoulou, Stella S; Valois, Marie-France; Brook, Jeffrey R
2013-10-01
Persons with underlying health conditions may be at higher risk for the short-term effects of air pollution. We have extended our original mortality time-series study in Montreal, Quebec, among persons 65 years of age and older, for an additional 10 years (1990-2003) to assess whether these associations persisted and to investigate new health conditions. We created subgroups of subjects diagnosed with major health conditions one year before death using billing and prescription data from the Quebec Health Insurance Plan. We used parametric log-linear Poisson models within the distributed lag non-linear models framework, adjusted for long-term temporal trends and daily maximum temperature, to assess associations with NO2, O3, CO, SO2, and particles with aerodynamic diameters of 2.5 μm or less (PM2.5). We found positive associations between daily non-accidental mortality and all air pollutants except O3 (e.g., a mean percent change (MPC) in daily mortality of 1.90% [95% confidence interval: 0.73, 3.08%] per interquartile-range increase (17.56 μg/m³) in NO2, for the cumulative effect over a 3-day lag). Positive associations were found amongst persons having cardiovascular disease (cumulative MPC per interquartile-range increase in NO2 = 2.67%), congestive heart failure (MPC = 3.46%), atrial fibrillation (MPC = 4.21%), diabetes (MPC = 3.45%), and diabetes and cardiovascular disease (MPC = 3.50%). Associations in the warm season were also found for acute and chronic coronary artery disease, hypertension, and cancer. There was no persuasive evidence of seasonal associations for cerebrovascular disease, acute lower respiratory disease (defined within 2 months of death), airways disease, or diabetes and airways disease. These data indicate that individuals with certain health conditions, especially those with diabetes and cardiovascular disease, hypertension, atrial fibrillation, and cancer, may be susceptible to the short-term effects of air pollution. © 2013 Elsevier B.V. All rights reserved.
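For a log-linear Poisson model, the mean percent change (MPC) in daily mortality per interquartile-range increase in a pollutant follows from the fitted coefficient as MPC = 100 × (exp(β × IQR) − 1). The short Python sketch below illustrates this conversion; the coefficient value is an illustrative back-calculation chosen only so that it reproduces the NO2 result quoted above, and is not a figure from the study.

import math

def mean_percent_change(beta_per_unit, iqr):
    # Percent change in the outcome for an IQR increase in exposure,
    # given a log-linear (Poisson) regression coefficient per unit of exposure.
    return 100.0 * (math.exp(beta_per_unit * iqr) - 1.0)

# Illustrative: beta ≈ 0.00107 per μg/m³ with an IQR of 17.56 μg/m³ gives ≈ 1.90%.
print(round(mean_percent_change(0.00107, 17.56), 2))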
Lipoprotein(a) Levels and Recurrent Vascular Events After First Ischemic Stroke.
Lange, Kristin S; Nave, Alexander H; Liman, Thomas G; Grittner, Ulrike; Endres, Matthias; Ebinger, Martin
2017-01-01
The association of elevated lipoprotein(a) (Lp(a)) levels with the incidence of cardiovascular disease, especially coronary heart disease and ischemic stroke, is well established. However, evidence on the association between Lp(a) levels and residual vascular risk in stroke survivors is lacking. We aimed to elucidate the risk of recurrent cardiovascular and cerebrovascular events in patients with first-ever ischemic stroke and elevated Lp(a). All patients with acute ischemic stroke who participated in the prospective Berlin C&S study (Cream & Sugar) between January 2009 and August 2014 with available 12-month follow-up data and stored blood samples were eligible for inclusion. Lp(a) levels were determined in serum samples using an isoform-insensitive nephelometric assay. We assessed the risk for the composite vascular end point of ischemic stroke, transient ischemic attack, myocardial infarction, nonelective coronary revascularization, and cardiovascular death with elevated Lp(a), defined as >30 mg/dL, using Cox regression analyses. Of 465 C&S study participants, 250 patients were included in this substudy, with a median National Institutes of Health Stroke Scale score of 2 (1-4). Twenty-six patients (10%) experienced a recurrent vascular event during follow-up. Among patients with normal Lp(a) levels, 11 of 157 subjects (7%) experienced an event at a median time of 161 days (interquartile range, 19-196 days), whereas among patients with elevated Lp(a) levels, 15 of 93 subjects (16%) experienced an event at a median time of 48 days (interquartile range, 9-194 days; P=0.026). The risk of a recurrent event was significantly higher in patients with elevated Lp(a) levels after adjustment for potential confounders (hazard ratio, 2.60; 95% confidence interval, 1.19-5.67; P=0.016). Elevated Lp(a) levels are associated with a higher risk of combined vascular event recurrence in patients with acute, first-ever ischemic stroke. This finding should be validated in larger, multicenter trials. URL: http://www.clinicaltrials.gov. Unique identifier: NCT01378468. © 2016 American Heart Association, Inc.
Teshome, Wondu; Belayneh, Mehretu; Moges, Mathewos; Mekonnen, Emebet; Endrias, Misganu; Ayele, Sinafiksh; Misganaw, Tebeje; Shiferaw, Mekonnen; Tesema, Tigist
2015-01-01
Decentralization and task shifting have significantly improved access to antiretroviral therapy (ART). Many studies conducted to determine the attrition rate in Ethiopia have not compared attrition between hospitals and health centers in a relatively recent cohort of patients. This study compared death and loss to follow-up (LTFU) rates among ART patients in hospitals and health centers in south Ethiopia. Data routinely collected from patients aged older than 15 years who started ART between July 2011 and August 2012 in 20 selected health facilities (12 of them hospitals) were analyzed. The outcomes of interest were LTFU and death. The data were entered, cleaned, and analyzed using Statistical Package for the Social Sciences version 20.0 and Stata version 12.0. Competing-risk regression models were used. The service years of the facilities were similar (median 8 and 7.5 years for hospitals and health centers, respectively). The mean patient age was 33.7±9.6 years. The median baseline CD4 count was 179 (interquartile range 93-263) cells/mm³. A total of 2,356 person-years of observation were accrued, with a median follow-up duration of 28 (interquartile range 22-31) months; 24.6% of patients were either dead or LTFU, giving a retention rate of 75.4%. The death rates were 3.0 and 1.5 and the LTFU rates were 9.0 and 10.9 per 100 person-years of observation in health centers and hospitals, respectively. The competing-risk regression model showed that the gap between testing and initiation of ART, body mass index, World Health Organization clinical stage, isoniazid prophylaxis, age, facility type, and educational status were independently associated with LTFU. Moreover, baseline tuberculous disease, poor functional status, and follow-up at a health center were associated with an elevated probability of death. We observed a higher death rate and a lower LTFU rate in health centers than in hospitals. Most of the associated variables have also been documented previously. Higher LTFU was observed among patients with a smaller gap between testing and initiation of treatment.
Non-invasive imaging of oxygen extraction fraction in adults with sickle cell anaemia
Gindville, Melissa C.; Scott, Allison O.; Juttukonda, Meher R.; Strother, Megan K.; Kassim, Adetola A.; Chen, Sheau-Chiann; Lu, Hanzhang; Pruthi, Sumit; Shyr, Yu; Donahue, Manus J.
2016-01-01
Sickle cell anaemia is a monogenetic disorder with a high incidence of stroke. While stroke screening procedures exist for children with sickle cell anaemia, no accepted screening procedures exist for assessing stroke risk in adults. The purpose of this study is to use novel magnetic resonance imaging methods to evaluate physiological relationships between oxygen extraction fraction, cerebral blood flow, and clinical markers of cerebrovascular impairment in adults with sickle cell anaemia. The specific goal is to determine to what extent elevated oxygen extraction fraction may be uniquely present in patients with higher levels of clinical impairment and therefore may represent a candidate biomarker of stroke risk. Neurological evaluation, structural imaging, and the non-invasive T2-relaxation-under-spin-tagging magnetic resonance imaging method were applied in sickle cell anaemia (n = 34) and healthy race-matched control (n = 11) volunteers without sickle cell trait to assess whole-brain oxygen extraction fraction, cerebral blood flow, degree of vasculopathy, severity of anaemia, and presence of prior infarct; findings were interpreted in the context of physiological models. Cerebral blood flow and oxygen extraction fraction were elevated (P < 0.05) in participants with sickle cell anaemia (n = 27) not receiving monthly blood transfusions (interquartile range cerebral blood flow = 46.2–56.8 ml/100 g/min; oxygen extraction fraction = 0.39–0.50) relative to controls (interquartile range cerebral blood flow = 40.8–46.3 ml/100 g/min; oxygen extraction fraction = 0.33–0.38). Oxygen extraction fraction (P < 0.0001) but not cerebral blood flow was increased in participants with higher levels of clinical impairment. These data provide support for T2-relaxation-under-spin-tagging being able to quickly and non-invasively detect elevated oxygen extraction fraction in individuals with sickle cell anaemia with higher levels of clinical impairment. Our results support the premise that magnetic resonance imaging-based assessment of elevated oxygen extraction fraction might be a viable screening tool for evaluating stroke risk in adults with sickle cell anaemia. PMID:26823369
False-Positive Rate of AKI Using Consensus Creatinine–Based Criteria
Lin, Jennie; Fernandez, Hilda; Shashaty, Michael G.S.; Negoianu, Dan; Testani, Jeffrey M.; Berns, Jeffrey S.; Parikh, Chirag R.
2015-01-01
Background and objectives Use of small changes in serum creatinine to diagnose AKI allows for earlier detection but may increase diagnostic false–positive rates because of inherent laboratory and biologic variabilities of creatinine. Design, setting, participants, & measurements We examined serum creatinine measurement characteristics in a prospective observational clinical reference cohort of 2267 adult patients with AKI by Kidney Disease Improving Global Outcomes creatinine criteria and used these data to create a simulation cohort to model AKI false–positive rates. We simulated up to seven successive blood draws on an equal population of hypothetical patients with unchanging true serum creatinine values. Error terms generated from laboratory and biologic variabilities were added to each simulated patient’s true serum creatinine value to obtain the simulated measured serum creatinine for each blood draw. We determined the proportion of patients who would be erroneously diagnosed with AKI by Kidney Disease Improving Global Outcomes creatinine criteria. Results Within the clinical cohort, 75.0% of patients received four serum creatinine draws within at least one 48-hour period during hospitalization. After four simulated creatinine measurements that accounted for laboratory variability calculated from assay characteristics and 4.4% of biologic variability determined from the clinical cohort and publicly available data, the overall false–positive rate for AKI diagnosis was 8.0% (interquartile range =7.9%–8.1%), whereas patients with true serum creatinine ≥1.5 mg/dl (representing 21% of the clinical cohort) had a false–positive AKI diagnosis rate of 30.5% (interquartile range =30.1%–30.9%) versus 2.0% (interquartile range =1.9%–2.1%) in patients with true serum creatinine values <1.5 mg/dl (P<0.001). Conclusions Use of small serum creatinine changes to diagnose AKI is limited by high false–positive rates caused by inherent variability of serum creatinine at higher baseline values, potentially misclassifying patients with CKD in AKI studies. PMID:26336912
False-Positive Rate of AKI Using Consensus Creatinine-Based Criteria.
Lin, Jennie; Fernandez, Hilda; Shashaty, Michael G S; Negoianu, Dan; Testani, Jeffrey M; Berns, Jeffrey S; Parikh, Chirag R; Wilson, F Perry
2015-10-07
Use of small changes in serum creatinine to diagnose AKI allows for earlier detection but may increase diagnostic false-positive rates because of inherent laboratory and biologic variabilities of creatinine. We examined serum creatinine measurement characteristics in a prospective observational clinical reference cohort of 2267 adult patients with AKI by Kidney Disease Improving Global Outcomes creatinine criteria and used these data to create a simulation cohort to model AKI false-positive rates. We simulated up to seven successive blood draws on an equal population of hypothetical patients with unchanging true serum creatinine values. Error terms generated from laboratory and biologic variabilities were added to each simulated patient's true serum creatinine value to obtain the simulated measured serum creatinine for each blood draw. We determined the proportion of patients who would be erroneously diagnosed with AKI by Kidney Disease Improving Global Outcomes creatinine criteria. Within the clinical cohort, 75.0% of patients received four serum creatinine draws within at least one 48-hour period during hospitalization. After four simulated creatinine measurements that accounted for laboratory variability calculated from assay characteristics and 4.4% of biologic variability determined from the clinical cohort and publicly available data, the overall false-positive rate for AKI diagnosis was 8.0% (interquartile range =7.9%-8.1%), whereas patients with true serum creatinine ≥1.5 mg/dl (representing 21% of the clinical cohort) had a false-positive AKI diagnosis rate of 30.5% (interquartile range =30.1%-30.9%) versus 2.0% (interquartile range =1.9%-2.1%) in patients with true serum creatinine values <1.5 mg/dl (P<0.001). Use of small serum creatinine changes to diagnose AKI is limited by high false-positive rates caused by inherent variability of serum creatinine at higher baseline values, potentially misclassifying patients with CKD in AKI studies. Copyright © 2015 by the American Society of Nephrology.
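A minimal Monte Carlo sketch, in Python, in the spirit of the simulation described above: hypothetical patients with a fixed true creatinine receive repeated measurements perturbed by laboratory and biologic variability, and a false positive is counted when any later draw exceeds an earlier one by at least 0.3 mg/dl (the small-change KDIGO creatinine criterion). The laboratory coefficient of variation, the assumption that all draws fall within one 48-hour window, and the sample sizes are illustrative choices, not the authors' parameters.

import random

def simulate_false_positive_rate(true_scr, n_patients=10000, n_draws=4,
                                 lab_cv=0.02, bio_cv=0.044, threshold=0.3):
    false_pos = 0
    for _ in range(n_patients):
        # Each measured value = true value plus independent lab and biologic noise.
        measured = [true_scr * (1 + random.gauss(0, lab_cv) + random.gauss(0, bio_cv))
                    for _ in range(n_draws)]
        # Flag "AKI" if any later draw is >= 0.3 mg/dl above any earlier draw.
        if any(measured[j] - measured[i] >= threshold
               for i in range(n_draws) for j in range(i + 1, n_draws)):
            false_pos += 1
    return false_pos / n_patients

print(simulate_false_positive_rate(1.0))  # low true creatinine: few false positives
print(simulate_false_positive_rate(2.0))  # higher true creatinine: far more false positives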
2011-01-01
Introduction The role of ICU design and particularly single-patient rooms in decreasing bacterial transmission between ICU patients has been debated. A recent change in our ICU allowed further investigation. Methods Pre-move ICU-A and pre-move ICU-B were open-plan units. In March 2007, ICU-A moved to single-patient rooms (post-move ICU-A). ICU-B remained unchanged (post-move ICU-B). The same physicians cover both ICUs. Cultures of specified resistant organisms in surveillance or clinical cultures from consecutive patients staying >48 hours were compared for the different ICUs and periods to assess the effect of ICU design on acquisition of resistant organisms. Results Data were collected for 62, 62, 44 and 39 patients from pre-move ICU-A, post-move ICU-A, pre-move ICU-B and post-move ICU-B, respectively. Fewer post-move ICU-A patients acquired resistant organisms (3/62, 5%) compared with post-move ICU-B patients (7/39, 18%; P = 0.043, P = 0.011 using survival analysis) or pre-move ICU-A patients (14/62, 23%; P = 0.004, P = 0.012 on survival analysis). Only the admission period was significant for acquisition of resistant organisms comparing pre-move ICU-A with post-move ICU-A (hazard ratio = 5.18, 95% confidence interval = 1.03 to 16.06; P = 0.025). More antibiotic-free days were recorded in post-move ICU-A (median = 3, interquartile range = 0 to 5) versus post-move ICU-B (median = 0, interquartile range = 0 to 4; P = 0.070) or pre-move ICU-A (median = 0, interquartile range = 0 to 4; P = 0.017). Adequate hand hygiene was observed on 140/242 (58%) occasions in post-move ICU-A versus 23/66 (35%) occasions in post-move ICU-B (P < 0.001). Conclusions Improved ICU design, and particularly use of single-patient rooms, decreases acquisition of resistant bacteria and antibiotic use. This observation should be considered in future ICU design. PMID:21914222
Closed-loop insulin delivery during pregnancy complicated by type 1 diabetes.
Murphy, Helen R; Elleri, Daniela; Allen, Janet M; Harris, Julie; Simmons, David; Rayman, Gerry; Temple, Rosemary; Dunger, David B; Haidar, Ahmad; Nodale, Marianna; Wilinska, Malgorzata E; Hovorka, Roman
2011-02-01
This study evaluated closed-loop insulin delivery with a model predictive control (MPC) algorithm during early (12-16 weeks) and late gestation (28-32 weeks) in pregnant women with type 1 diabetes. Ten women with type 1 diabetes (age 31 years, diabetes duration 19 years, BMI 24.1 kg/m(2), booking A1C 6.9%) were studied over 24 h during early (14.8 weeks) and late pregnancy (28.0 weeks). A nurse adjusted the basal insulin infusion rate from continuous glucose measurements (CGM), fed into the MPC algorithm every 15 min. Mean glucose and time spent in target (63-140 mg/dL), hyperglycemic (>140 to ≥ 180 mg/dL), and hypoglycemic (<63 to ≤ 50 mg/dL) were calculated using plasma and sensor glucose measurements. Linear mixed-effects models were used to compare glucose control during early and late gestation. During closed-loop insulin delivery, median (interquartile range) plasma glucose levels were 117 (100.8-154.8) mg/dL in early and 126 (109.8-140.4) mg/dL in late gestation (P = 0.72). The overnight mean (interquartile range) plasma glucose time in target was 84% (50-100%) in early and 100% (94-100%) in late pregnancy (P = 0.09). Overnight mean (interquartile range) time spent hyperglycemic (>140 mg/dL) was 7% (0-40%) in early and 0% (0-6%) in late pregnancy (P = 0.25) and hypoglycemic (<63 mg/dL) was 0% (0-3%) and 0% (0-0%), respectively (P = 0.18). Postprandial glucose control, glucose variability, insulin infusion rates, and CGM sensor accuracy were no different in early or late pregnancy. MPC algorithm performance was maintained throughout pregnancy, suggesting that overnight closed-loop insulin delivery could be used safely during pregnancy. More work is needed to achieve optimal postprandial glucose control.
Closed-Loop Insulin Delivery During Pregnancy Complicated by Type 1 Diabetes
Murphy, Helen R.; Elleri, Daniela; Allen, Janet M.; Harris, Julie; Simmons, David; Rayman, Gerry; Temple, Rosemary; Dunger, David B.; Haidar, Ahmad; Nodale, Marianna; Wilinska, Malgorzata E.; Hovorka, Roman
2011-01-01
OBJECTIVE This study evaluated closed-loop insulin delivery with a model predictive control (MPC) algorithm during early (12–16 weeks) and late gestation (28–32 weeks) in pregnant women with type 1 diabetes. RESEARCH DESIGN AND METHODS Ten women with type 1 diabetes (age 31 years, diabetes duration 19 years, BMI 24.1 kg/m2, booking A1C 6.9%) were studied over 24 h during early (14.8 weeks) and late pregnancy (28.0 weeks). A nurse adjusted the basal insulin infusion rate from continuous glucose measurements (CGM), fed into the MPC algorithm every 15 min. Mean glucose and time spent in target (63–140 mg/dL), hyperglycemic (>140 to ≥180 mg/dL), and hypoglycemic (<63 to ≤50 mg/dL) were calculated using plasma and sensor glucose measurements. Linear mixed-effects models were used to compare glucose control during early and late gestation. RESULTS During closed-loop insulin delivery, median (interquartile range) plasma glucose levels were 117 (100.8–154.8) mg/dL in early and 126 (109.8–140.4) mg/dL in late gestation (P = 0.72). The overnight mean (interquartile range) plasma glucose time in target was 84% (50–100%) in early and 100% (94–100%) in late pregnancy (P = 0.09). Overnight mean (interquartile range) time spent hyperglycemic (>140 mg/dL) was 7% (0–40%) in early and 0% (0–6%) in late pregnancy (P = 0.25) and hypoglycemic (<63 mg/dL) was 0% (0–3%) and 0% (0–0%), respectively (P = 0.18). Postprandial glucose control, glucose variability, insulin infusion rates, and CGM sensor accuracy were no different in early or late pregnancy. CONCLUSIONS MPC algorithm performance was maintained throughout pregnancy, suggesting that overnight closed-loop insulin delivery could be used safely during pregnancy. More work is needed to achieve optimal postprandial glucose control. PMID:21216859
Hubert, Gordian J; Meretoja, Atte; Audebert, Heinrich J; Tatlisumak, Turgut; Zeman, Florian; Boy, Sandra; Haberl, Roman L; Kaste, Markku; Müller-Barna, Peter
2016-12-01
Intravenous thrombolysis with tissue-type plasminogen activator (tPA) for acute ischemic stroke is more effective when delivered early. Timely delivery is challenging, particularly in rural areas with long distances. We compared delays and treatment rates of a large, decentralized telemedicine-based system and a well-organized, large, centralized single-hospital system. We analyzed the centralized system of the Helsinki University Central Hospital (Helsinki and Province of Uusimaa, Finland, 1.56 million inhabitants, 9096 km²) and the decentralized TeleStroke Unit network in a predominantly rural area (Telemedical Project for Integrative Stroke Care [TEMPiS], South-East Bavaria, Germany, 1.94 million inhabitants, 14 992 km²). All consecutive tPA treatments were prospectively registered. We compared tPA rates per total ischemic stroke admissions in the Helsinki and TEMPiS catchment areas. For delay comparisons, we excluded patients with basilar artery occlusions, in-hospital strokes, and those treated after 270 minutes. From January 1, 2011, to December 31, 2013, 912 patients received tPA in Helsinki University Central Hospital and 1779 in TEMPiS hospitals. Area-based tPA rates were equal (13.0% of 7017 ischemic strokes in the Helsinki University Central Hospital area versus 13.3% of 14 637 ischemic strokes in the TEMPiS area; P=0.078). Median prehospital delays were longer (88 [interquartile range, 60-135] versus 65 [48-101] minutes; P<0.001) but in-hospital delays were shorter (18 [interquartile range, 13-30] versus 39 [26-56] minutes; P<0.001) in Helsinki University Central Hospital compared with TEMPiS, with no difference in overall delays (117 [interquartile range, 81-168] versus 115 [87-155] minutes; P=0.45). A decentralized telestroke thrombolysis service can achieve similar treatment rates and time delays for a rural population as a centralized system can achieve for an urban population. © 2016 American Heart Association, Inc.
Geographic Variance of Cost Associated With Hysterectomy.
Sheyn, David; Mahajan, Sangeeta; Billow, Megan; Fleary, Alexandra; Hayashi, Emi; El-Nashar, Sherif A
2017-05-01
To estimate whether the cost of hysterectomy varies by geographic region. This was a cross-sectional, population-based study using the 2013 Healthcare Cost and Utilization Project National Inpatient Sample of women older than 18 years undergoing inpatient hysterectomy for benign conditions. Hospital charges obtained from the National Inpatient Sample database were converted to actual costs using cost-to-charge ratios provided by the Healthcare Cost and Utilization Project. Multivariate regression was used to assess the effects that demographic factors, concomitant procedures, diagnoses, and geographic region have on hysterectomy cost above the median. Women who underwent hysterectomy for benign conditions were identified (N=38,414). The median cost of hysterectomy was $13,981 (interquartile range $9,075-29,770). The mid-Atlantic region had the lowest median cost of $9,661 (interquartile range $6,243-15,335) and the Pacific region had the highest median cost, $22,534 (interquartile range $15,380-33,797). Compared with the mid-Atlantic region, the Pacific (adjusted odds ratio [OR] 10.43, 95% confidence interval [CI] 9.44-11.45), South Atlantic (adjusted OR 5.39, 95% CI 4.95-5.86), and South Central (adjusted OR 2.40, 95% CI 2.21-2.62) regions were associated with the highest probability of costs above the median. All concomitant procedures were associated with an increased cost with the exception of bilateral salpingectomy (adjusted OR 1.03, 95% CI 0.95-1.12). Compared with vaginal hysterectomy, laparoscopic and robotic modes of hysterectomy were associated with higher probabilities of increased costs (adjusted OR 2.86, 95% CI 2.61-3.15 and adjusted OR 5.66, 95% CI 5.11-6.26, respectively). Abdominal hysterectomy was not associated with a statistically significant increase in cost compared with vaginal hysterectomy (adjusted OR 1.01, 95% CI 0.91-1.09). The cost of hysterectomy varies significantly with geographic region after adjusting for confounders.
A Review of Online Evidence-based Practice Point-of-Care Information Summary Providers
Liberati, Alessandro; Moschetti, Ivan; Tagliabue, Ludovica; Moja, Lorenzo
2010-01-01
Background Busy clinicians need easy access to evidence-based information to inform their clinical practice. Publishers and organizations have designed specific tools to meet doctors’ needs at the point of care. Objective The aim of this study was to describe online point-of-care summaries and evaluate their breadth, content development, and editorial policy against their claims of being “evidence-based.” Methods We searched Medline, Google, librarian association websites, and information conference proceedings from January to December 2008. We included English Web-based point-of-care summaries designed to deliver predigested, rapidly accessible, comprehensive, periodically updated, evidence-based information to clinicians. Two investigators independently extracted data on the general characteristics and content presentation of summaries. We assessed and ranked point-of-care products according to: (1) coverage (volume) of medical conditions, (2) editorial quality, and (3) evidence-based methodology. We explored how these factors were associated. Results We retrieved 30 eligible summaries. Of these products, 18 met our inclusion criteria and were qualitatively described, and 16 provided sufficient data for quantitative evaluation. The median volume of medical conditions covered was 80.6% (interquartile range, 68.9% - 84.2%) and varied for the different products. Similarly, differences emerged for editorial policy (median 8.0, interquartile range 5.8 - 10.3) and evidence-based methodology scores (median 10.0, interquartile range 1.0 - 12.8) on a 15-point scale. None of these dimensions turned out to be significantly associated with the other dimensions (editorial quality and volume, Spearman rank correlation r = -0.001, P = .99; evidence-based methodology and volume, r = -0.19, P = .48; editorial and evidence-based methodology, r = 0.43, P =.09). Conclusions Publishers are moving to develop point-of-care summary products. Some of these have better profiles than others, and there is room for improved reporting of the strengths and weaknesses of these products. PMID:20610379
Gurvitch, R; Wood, D A; Tay, E L; Leipsic, J; Ye, J; Lichtenstein, S V; Thompson, C R; Carere, R G; Wijesinghe, N; Nietlispach, F; Boone, R H; Lauck, S; Cheung, A; Webb, J G
2010-09-28
Although short- and medium-term outcomes after transcatheter aortic valve implantation are encouraging, long-term data on valve function and clinical outcomes are limited. Consecutive high-risk patients who had been declined as surgical candidates because of comorbidities but who underwent successful transcatheter aortic valve implantation with a balloon-expandable valve between January 2005 and December 2006 and survived past 30 days were assessed. Clinical, echocardiographic, and computed tomographic follow-up examinations were performed. Seventy patients who underwent successful procedures and survived longer than 30 days were evaluated at a minimum follow-up of 3 years. At a median follow-up of 3.7 years (interquartile range 3.4 to 4.3 years), survival was 57%. Survival at 1, 2, and 3 years was 81%, 74%, and 61%, respectively. Freedom from reoperation was 98.5% (1 patient with endocarditis). During this early procedural experience, 11 patients died within 30 days, and 8 procedures were unsuccessful. When these patients were included, overall survival was 51%. Transaortic pressure gradients increased from 10.0 mm Hg (interquartile range 8.0 to 12.0 mm Hg) immediately after the procedure to 12.1 mm Hg (interquartile range 8.6 to 16.0 mm Hg) after 3 years (P=0.03). Bioprosthetic valve area decreased from a mean of 1.7±0.4 cm² after the procedure to 1.4±0.3 cm² after 3 years (P<0.01). Aortic incompetence after implantation was trivial or mild in 84% of cases and remained unchanged or improved over time. There were no cases of structural valvular deterioration, stent fracture, deformation, or valve migration. Transcatheter aortic valve implantation demonstrates good medium- to long-term durability and preserved hemodynamic function, with no evidence of structural failure. The procedure appears to offer an adequate and lasting resolution of aortic stenosis in selected patients.
Roca, Bernardino; Mendoza, María A; Roca, Manuel
2016-10-01
To compare the efficacy of extracorporeal shock wave therapy (ESWT) with botulinum toxin type A (BoNT-A) in the treatment of plantar fasciitis (PF). Open-label, prospective, randomized study. A total of 72 patients were included. Among all participants, the median (interquartile range) visual analog scale (VAS) pain score on taking the first steps was 8 (6-9) points before treatment and 6 (4-8) points after treatment (p < 0.001). In the group that received ESWT, the median (interquartile range) improvement in the VAS pain score on taking the first steps was 2 (1-4) points, compared with 1 (0-2) points in the group that received BoNT-A (p = 0.009). In the ESWT group, the median (interquartile range) improvement in the Roles and Maudsley pain scale was 1 (0-1) points, compared with 0 (0-1) points in the BoNT-A group (p = 0.006). In a multivariate analysis, use of ESWT and lower weight were associated with improvement of pain with treatment on at least one of the three VAS pain scales used in the study. ESWT was superior to BoNT-A in the control of pain in patients with PF. Implications for rehabilitation: Plantar fasciitis is characterized by pain at the calcaneal origin of the plantar fascia, exacerbated by weight bearing after prolonged periods of rest. Although studies comparing extracorporeal shock wave therapy or botulinum toxin type A with placebo suggest that the former is superior, reliable comparative data have been lacking. Extracorporeal shock wave therapy was superior to botulinum toxin type A in the control of pain in patients with PF.
Antimalarial Activity of KAF156 in Falciparum and Vivax Malaria.
White, Nicholas J; Duong, Tran T; Uthaisin, Chirapong; Nosten, François; Phyo, Aung P; Hanboonkunupakarn, Borimas; Pukrittayakamee, Sasithon; Jittamala, Podjanee; Chuthasmit, Kittiphum; Cheung, Ming S; Feng, Yiyan; Li, Ruobing; Magnusson, Baldur; Sultan, Marc; Wieser, Daniela; Xun, Xiaolei; Zhao, Rong; Diagana, Thierry T; Pertel, Peter; Leong, F Joel
2016-09-22
KAF156 belongs to a new class of antimalarial agents (imidazolopiperazines), with activity against asexual and sexual blood stages and the preerythrocytic liver stages of malarial parasites. We conducted a phase 2, open-label, two-part study at five centers in Thailand and Vietnam to assess the antimalarial efficacy, safety, and pharmacokinetic profile of KAF156 in adults with acute Plasmodium vivax or P. falciparum malaria. Assessment of parasite clearance rates in cohorts of patients with vivax or falciparum malaria who were treated with multiple doses (400 mg once daily for 3 days) was followed by assessment of the cure rate at 28 days in a separate cohort of patients with falciparum malaria who received a single dose (800 mg). Median parasite clearance times were 45 hours (interquartile range, 42 to 48) in 10 patients with falciparum malaria and 24 hours (interquartile range, 20 to 30) in 10 patients with vivax malaria after treatment with the multiple-dose regimen and 49 hours (interquartile range, 42 to 54) in 21 patients with falciparum malaria after treatment with the single dose. Among the 21 patients who received the single dose and were followed for 28 days, 1 had reinfection and 7 had recrudescent infections (cure rate, 67%; 95% credible interval, 46 to 84). The mean (±SD) KAF156 terminal elimination half-life was 44.1±8.9 hours. There were no serious adverse events in this small study. The most common adverse events included sinus bradycardia, thrombocytopenia, hypokalemia, anemia, and hyperbilirubinemia. Vomiting of grade 2 or higher occurred in 2 patients, 1 of whom discontinued treatment because of repeated vomiting after receiving the single 800-mg dose. More adverse events were reported in the single-dose cohort, which had longer follow-up, than in the multiple-dose cohorts. KAF156 showed antimalarial activity without evident safety concerns in a small number of adults with uncomplicated P. vivax or P. falciparum malaria. (Funded by Novartis and others; ClinicalTrials.gov number, NCT01753323 .).
King, D; Hume, P; Gissane, C; Brughelli, M; Clark, T
2016-02-01
Head impacts and resulting head accelerations cause concussive injuries. There is no standard for reporting head impact data in sports to enable comparison between studies. The aim was to outline methods for reporting head impact acceleration data in sport and the effect of the acceleration thresholds on the number of impacts reported. A systematic review of accelerometer systems utilised to report head impact data in sport was conducted. The effect of using different thresholds on a set of impact data from 38 amateur senior rugby players in New Zealand over a competition season was calculated. Of the 52 studies identified, 42% reported impacts using a >10-g threshold, where g is the acceleration of gravity. Studies reported descriptive statistics as mean ± standard deviation, median, 25th to 75th interquartile range, and 95th percentile. Application of the varied impact thresholds to the New Zealand data set resulted in 20,687 impacts of >10 g, 11,459 (45% less) impacts of >15 g, and 4024 (81% less) impacts of >30 g. Linear and angular raw data were most frequently reported. Metrics combining raw data may be more useful; however, validity of the metrics has not been adequately addressed for sport. Differing data collection methods and descriptive statistics for reporting head impacts in sports limit inter-study comparisons. Consensus on data analysis methods for sports impact assessment is needed, including thresholds. Based on the available data, the 10-g threshold is the most commonly reported impact threshold and should be reported as the median with 25th and 75th interquartile ranges as the data are non-normally distributed. Validation studies are required to determine the best threshold and metrics for impact acceleration data collection in sport. Until in-field validation studies are completed, it is recommended that head impact data should be reported as median and interquartile ranges using the 10-g impact threshold.
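A short Python sketch of the threshold comparison and the recommended reporting format (median with 25th-75th interquartile range at a >10 g threshold) discussed above. The acceleration values below are made-up examples, not the New Zealand data set.

import statistics

def report(impacts_g, threshold=10.0):
    # Keep impacts above the threshold and summarize as count, median, and IQR.
    retained = sorted(a for a in impacts_g if a > threshold)
    q = statistics.quantiles(retained, n=4)  # 25th, 50th, and 75th percentiles
    return {"n": len(retained), "median": statistics.median(retained), "iqr": (q[0], q[2])}

example = [8, 9, 11, 12, 13, 14, 16, 22, 31, 45]  # illustrative peak accelerations in g
print(report(example, threshold=10))
print(report(example, threshold=15))  # fewer impacts are retained at a higher threshold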
Shahian, David M; He, Xia; Jacobs, Jeffrey P; Kurlansky, Paul A; Badhwar, Vinay; Cleveland, Joseph C; Fazzalari, Frank L; Filardo, Giovanni; Normand, Sharon-Lise T; Furnary, Anthony P; Magee, Mitchell J; Rankin, J Scott; Welke, Karl F; Han, Jane; O'Brien, Sean M
2015-10-01
Previous composite performance measures of The Society of Thoracic Surgeons (STS) were estimated at the STS participant level, typically a hospital or group practice. The STS Quality Measurement Task Force has now developed a multiprocedural, multidimensional composite measure suitable for estimating the performance of individual surgeons. The development sample from the STS National Database included 621,489 isolated coronary artery bypass grafting procedures, isolated aortic valve replacement, aortic valve replacement plus coronary artery bypass grafting, mitral, or mitral plus coronary artery bypass grafting procedures performed by 2,286 surgeons between July 1, 2011, and June 30, 2014. Each surgeon's composite score combined their aggregate risk-adjusted mortality and major morbidity rates (each weighted inversely by their standard deviations) and reflected the proportion of case types they performed. Model parameters were estimated in a Bayesian framework. Composite star ratings were examined using 90%, 95%, or 98% Bayesian credible intervals. Measure reliability was estimated using various 3-year case thresholds. The final composite measure was defined as 0.81 × (1 minus risk-standardized mortality rate) + 0.19 × (1 minus risk-standardized complication rate). Risk-adjusted mortality (median, 2.3%; interquartile range, 1.7% to 3.0%), morbidity (median, 13.7%; interquartile range, 10.8% to 17.1%), and composite scores (median, 95.4%; interquartile range, 94.4% to 96.3%) varied substantially across surgeons. Using 98% Bayesian credible intervals, there were 207 1-star (lower performance) surgeons (9.1%), 1,701 2-star (as-expected performance) surgeons (74.4%), and 378 3-star (higher performance) surgeons (16.5%). With an eligibility threshold of 100 cases over 3 years, measure reliability was 0.81. The STS has developed a multiprocedural composite measure suitable for evaluating performance at the individual surgeon level. Copyright © 2015 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
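As a worked check of the composite definition quoted above, plugging the reported median risk-standardized mortality rate (RSMR, 2.3%) and complication rate (RSCR, 13.7%) into the formula gives a value close to the reported median composite score of 95.4% (the two need not match exactly, since the median of a weighted combination is not the combination of the medians):

\[ \text{Composite} = 0.81\,(1 - \text{RSMR}) + 0.19\,(1 - \text{RSCR}) \]
\[ 0.81\,(1 - 0.023) + 0.19\,(1 - 0.137) = 0.791 + 0.164 \approx 0.955 \]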
Weiss, Scott L; Fitzgerald, Julie C; Balamuth, Fran; Alpern, Elizabeth R; Lavelle, Jane; Chilutti, Marianne; Grundmeier, Robert; Nadkarni, Vinay M; Thomas, Neal J
2014-11-01
Delayed antimicrobials are associated with poor outcomes in adult sepsis, but data relating antimicrobial timing to mortality and organ dysfunction in pediatric sepsis are limited. We sought to determine the impact of antimicrobial timing on mortality and organ dysfunction in pediatric patients with severe sepsis or septic shock. Retrospective observational study. PICU at an academic medical center. One hundred thirty patients treated for severe sepsis or septic shock. None. We determined if hourly delays from sepsis recognition to initial and first appropriate antimicrobial administration were associated with PICU mortality (primary outcome); ventilator-free, vasoactive-free, and organ failure-free days; and length of stay. Median time from sepsis recognition to initial antimicrobial administration was 140 minutes (interquartile range, 74-277 min) and to first appropriate antimicrobial was 177 minutes (90-550 min). An escalating risk of mortality was observed with each hour delay from sepsis recognition to antimicrobial administration, although this did not achieve significance until 3 hours. For patients with more than 3-hour delay to initial and first appropriate antimicrobials, the odds ratio for PICU mortality was 3.92 (95% CI, 1.27-12.06) and 3.59 (95% CI, 1.09-11.76), respectively. These associations persisted after adjustment for individual confounders and a propensity score analysis. After controlling for severity of illness, the odds ratio for PICU mortality increased to 4.84 (95% CI, 1.45-16.2) and 4.92 (95% CI, 1.30-18.58) for more than 3-hour delay to initial and first appropriate antimicrobials, respectively. Initial antimicrobial administration more than 3 hours was also associated with fewer organ failure-free days (16 [interquartile range, 1-23] vs 20 [interquartile range, 6-26]; p = 0.04). Delayed antimicrobial therapy was an independent risk factor for mortality and prolonged organ dysfunction in pediatric sepsis.
Delayed Antimicrobial Therapy Increases Mortality and Organ Dysfunction Duration in Pediatric Sepsis
Weiss, Scott L.; Fitzgerald, Julie C.; Balamuth, Fran; Alpern, Elizabeth R.; Lavelle, Jane; Chilutti, Marianne; Grundmeier, Robert; Nadkarni, Vinay M.; Thomas, Neal J.
2014-01-01
Objectives Delayed antimicrobials are associated with poor outcomes in adult sepsis, but data relating antimicrobial timing to mortality and organ dysfunction in pediatric sepsis are limited. We sought to determine the impact of antimicrobial timing on mortality and organ dysfunction in pediatric patients with severe sepsis or septic shock. Design Retrospective observational study. Setting PICU at an academic medical center. Patients One hundred thirty patients treated for severe sepsis or septic shock. Interventions None. Measurements and Main Results We determined if hourly delays from sepsis recognition to initial and first appropriate antimicrobial administration were associated with PICU mortality (primary outcome); ventilator-free, vasoactive-free, and organ failure–free days; and length of stay. Median time from sepsis recognition to initial antimicrobial administration was 140 minutes (interquartile range, 74–277 min) and to first appropriate antimicrobial was 177 minutes (90–550 min). An escalating risk of mortality was observed with each hour delay from sepsis recognition to antimicrobial administration, although this did not achieve significance until 3 hours. For patients with more than 3-hour delay to initial and first appropriate antimicrobials, the odds ratio for PICU mortality was 3.92 (95% CI, 1.27–12.06) and 3.59 (95% CI, 1.09–11.76), respectively. These associations persisted after adjustment for individual confounders and a propensity score analysis. After controlling for severity of illness, the odds ratio for PICU mortality increased to 4.84 (95% CI, 1.45–16.2) and 4.92 (95% CI, 1.30–18.58) for more than 3-hour delay to initial and first appropriate antimicrobials, respectively. Initial antimicrobial administration more than 3 hours was also associated with fewer organ failure–free days (16 [interquartile range, 1–23] vs 20 [interquartile range, 6–26]; p = 0.04). Conclusions Delayed antimicrobial therapy was an independent risk factor for mortality and prolonged organ dysfunction in pediatric sepsis. PMID:25148597
Levin, Phillip D; Golovanevski, Mila; Moses, Allon E; Sprung, Charles L; Benenson, Shmuel
2011-01-01
The role of ICU design and particularly single-patient rooms in decreasing bacterial transmission between ICU patients has been debated. A recent change in our ICU allowed further investigation. Pre-move ICU-A and pre-move ICU-B were open-plan units. In March 2007, ICU-A moved to single-patient rooms (post-move ICU-A). ICU-B remained unchanged (post-move ICU-B). The same physicians cover both ICUs. Cultures of specified resistant organisms in surveillance or clinical cultures from consecutive patients staying >48 hours were compared for the different ICUs and periods to assess the effect of ICU design on acquisition of resistant organisms. Data were collected for 62, 62, 44 and 39 patients from pre-move ICU-A, post-move ICU-A, pre-move ICU-B and post-move ICU-B, respectively. Fewer post-move ICU-A patients acquired resistant organisms (3/62, 5%) compared with post-move ICU-B patients (7/39, 18%; P = 0.043, P = 0.011 using survival analysis) or pre-move ICU-A patients (14/62, 23%; P = 0.004, P = 0.012 on survival analysis). Only the admission period was significant for acquisition of resistant organisms comparing pre-move ICU-A with post-move ICU-A (hazard ratio = 5.18, 95% confidence interval = 1.03 to 16.06; P = 0.025). More antibiotic-free days were recorded in post-move ICU-A (median = 3, interquartile range = 0 to 5) versus post-move ICU-B (median = 0, interquartile range = 0 to 4; P = 0.070) or pre-move ICU-A (median = 0, interquartile range = 0 to 4; P = 0.017). Adequate hand hygiene was observed on 140/242 (58%) occasions in post-move ICU-A versus 23/66 (35%) occasions in post-move ICU-B (P < 0.001). Improved ICU design, and particularly use of single-patient rooms, decreases acquisition of resistant bacteria and antibiotic use. This observation should be considered in future ICU design.
Neurogranin as a Cerebrospinal Fluid Biomarker for Synaptic Loss in Symptomatic Alzheimer Disease
Kester, Maartje I.; Teunissen, Charlotte E.; Crimmins, Daniel L.; Herries, Elizabeth M.; Ladenson, Jack H.; Scheltens, Philip; van der Flier, Wiesje M.; Morris, John C.; Holtzman, David M.; Fagan, Anne M.
2015-01-01
IMPORTANCE Neurogranin (NGRN) seems to be a promising novel cerebrospinal fluid (CSF) biomarker for synaptic loss; however, clinical, and especially longitudinal, data are sparse. OBJECTIVE To examine the utility of NGRN, with repeated CSF sampling, for diagnosis, prognosis, and monitoring of Alzheimer disease (AD). DESIGN, SETTING, AND PARTICIPANTS Longitudinal study of consecutive patients who underwent 2 lumbar punctures between the beginning of 1995 and the end of 2010 within the memory clinic–based Amsterdam Dementia Cohort. The study included 163 patients: 37 cognitively normal participants (mean [SE] age, 64 [2] years; 38% female; and mean [SE] Mini-Mental State Examination [MMSE] score, 28 [0.3]), 61 patients with mild cognitive impairment (MCI) (mean [SE] age, 68 [1] years; 38% female; and mean [SE] MMSE score, 27 [0.3]), and 65 patients with AD (mean [SE] age, 65 [1] years; 45% female; and mean [SE] MMSE score, 22 [0.7]). The mean (SE) interval between lumbar punctures was 2.0 (0.1) years, and the mean (SE) duration of cognitive follow-up was 3.8 (0.2) years. Measurements of CSF NGRN levels were obtained in January and February 2014. MAIN OUTCOME AND MEASURE Levels of NGRN in CSF samples. RESULTS Baseline CSF levels of NGRN in patients with AD (median level, 2381 pg/mL [interquartile range, 1651-3416 pg/mL]) were higher than in cognitively normal participants (median level, 1712 pg/mL [interquartile range, 1206-2724 pg/mL]) (P = .04). Baseline NGRN levels were highly correlated with total tau and tau phosphorylated at threonine 181 in all patient groups (all P < .001), but not with Aβ42. Baseline CSF levels of NGRN were also higher in patients with MCI who progressed to AD (median level, 2842 pg/mL [interquartile range, 1882-3950 pg/mL]) compared with those with stable MCI (median level, 1752 pg/mL [interquartile range, 1024-2438 pg/mL]) (P = .004), and they were predictive of progression from MCI to AD (hazard ratio, 1.8 [95% CI, 1.1-2.9]; stratified by tertiles). Linear mixed-model analyses demonstrated that within-person levels of NGRN increased over time in cognitively normal participants (mean [SE] level, 90 [45] pg/mL per year; P < .05) but not in patients with MCI or AD. CONCLUSIONS AND RELEVANCE Neurogranin is a promising biomarker for AD because levels were elevated in patients with AD compared with cognitively normal participants and predicted progression from MCI to AD. Within-person levels of NGRN increased in cognitively normal participants but not in patients with later stage MCI or AD, which suggests that NGRN may reflect presymptomatic synaptic dysfunction or loss. PMID:26366630
Microcystic macular oedema in multiple sclerosis is associated with disease severity
Gelfand, Jeffrey M.; Nolan, Rachel; Schwartz, Daniel M.; Graves, Jennifer
2012-01-01
Macular oedema typically results from blood–retinal barrier disruption. It has recently been reported that patients with multiple sclerosis treated with FTY-720 (fingolimod) may exhibit macular oedema. Multiple sclerosis is not otherwise thought to be associated with macular oedema except in the context of comorbid clinical uveitis. Despite a lack of myelin, the retina is a site of inflammation and microglial activation in multiple sclerosis and demonstrates significant neuronal and axonal loss. We unexpectedly observed microcystic macular oedema using spectral domain optical coherence tomography in patients with multiple sclerosis who did not have another reason for macular oedema. We therefore evaluated spectral domain optical coherence tomography images in consecutive patients with multiple sclerosis for microcystic macular oedema and examined correlations between macular oedema and visual and ambulatory disability in a cross-sectional analysis. Participants were excluded if there was a comorbidity that could account for the presence of macular oedema, such as uveitis, diabetes or other retinal disease. A microcystic pattern of macular oedema was observed on optical coherence tomography in 15 of 318 (4.7%) patients with multiple sclerosis. No macular oedema was identified in 52 healthy controls assessed over the same period. The microcystic oedema predominantly involved the inner nuclear layer of the retina and tended to occur in small, discrete patches. Patients with multiple sclerosis with microcystic macular oedema had significantly worse disability [median Expanded Disability Status Scale 4 (interquartile range 3–6)] than patients without macular oedema [median Expanded Disability Status Scale 2 (interquartile range 1.5–3.5)], P = 0.0002. Patients with multiple sclerosis with microcystic macular oedema also had higher Multiple Sclerosis Severity Scores, a measure of disease progression, than those without oedema [median of 6.47 (interquartile range 4.96–7.98) versus 3.65 (interquartile range 1.92–5.87), P = 0.0009]. Microcystic macular oedema occurred more commonly in eyes with prior optic neuritis than eyes without prior optic neuritis (50 versus 27%) and was associated with lower visual acuity (median logMAR acuity of 0.17 versus −0.1) and a thinner retinal nerve fibre layer. The presence of microcystic macular oedema in multiple sclerosis suggests that there may be breakdown of the blood–retinal barrier and tight junction integrity in a part of the nervous system that lacks myelin. Microcystic macular oedema may also contribute to visual dysfunction beyond that explained by nerve fibre layer loss. Microcystic changes need to be assessed, and potentially adjusted for, in clinical trials that evaluate macular volume as a marker of retinal ganglion cell survival. These findings also have implications for clinical monitoring in patients with multiple sclerosis on sphingosine 1-phosphate receptor modulating agents. PMID:22539259
Site Variability in Regulatory Oversight for an International Study of Pediatric Sepsis.
Michelson, Kelly N; Reubenson, Gary; Weiss, Scott L; Fitzgerald, Julie C; Ackerman, Kate K; Christie, LeeAnn; Bush, Jenny L; Nadkarni, Vinay M; Thomas, Neal J; Schreiner, Mark S
2018-04-01
Duplicative institutional review board/research ethics committee review for multicenter studies may impose administrative burdens and inefficiencies affecting study implementation and quality. Understanding variability in site-specific institutional review board/research ethics committee assessment and barriers to using a single review committee (an increasingly proposed solution) can inform a more efficient process. We provide needed data about the regulatory oversight process for the Sepsis PRevalence, OUtcomes, and Therapies multicenter point prevalence study. Survey. Sites invited to participate in Sepsis PRevalence, OUtcomes, and Therapies. Investigators at sites that expressed interest and/or participated in Sepsis PRevalence, OUtcomes, and Therapies. None. Using an electronic survey, we collected data about 1) logistics of protocol submission, 2) institutional review board/research ethics committee requested modifications, and 3) use of a single institutional review board (for U.S. sites). We collected surveys from 104 of 167 sites (62%). Of the 97 sites that submitted the protocol for institutional review board/research ethics committee review, 34% conducted full board review, 54% expedited review, and 4% considered the study exempt. Time to institutional review board/research ethics committee approval was a median of 34 (range 3-186) days and was longer at sites that required protocol modifications (median [interquartile range] 50 d [35-131 d] vs 32 d [14-54 d]; p = 0.02). Enrollment was delayed at eight sites due to prolonged (> 50 d) time to approval. Of 49 U.S. sites, 43% considered using a single institutional review board, but only 18% utilized this option. Time to final approval for U.S. sites using the single institutional review board was 62 days (interquartile range, 34-70 d) compared with 34 days (interquartile range, 15-54 d) for non-single institutional review board sites (p = 0.16). Variability in regulatory oversight was evident for this minimal-risk observational research study, most notably in the type of review conducted. Duplicative review prolonged time to protocol approval at some sites. Use of a single institutional review board for U.S. sites was rare and did not improve efficiency of protocol approval. Suggestions for minimizing these challenges are provided.
Colombi, Davide; Dinkel, Julien; Weinheimer, Oliver; Obermayer, Berenike; Buzan, Teodora; Nabers, Diana; Bauer, Claudia; Oltmanns, Ute; Palmowski, Karin; Herth, Felix; Kauczor, Hans Ulrich; Sverzellati, Nicola
2015-01-01
Objectives To describe changes over time in the extent of idiopathic pulmonary fibrosis (IPF) at multidetector computed tomography (MDCT), assessed by semi-quantitative visual scores (VSs) and by fully automatic histogram-based quantitative evaluation, and to test the relationship between these two methods of quantification. Methods Forty IPF patients (median age: 70 years, interquartile range: 62-75 years; M:F, 33:7) who underwent two MDCT examinations at different time points with a median interval of 13 months (interquartile range: 10-17 months) were retrospectively evaluated. The in-house software YACTA automatically quantified the lung density histogram (10th-90th percentile in 5th-percentile steps). Longitudinal changes in VSs and in the percentiles of the attenuation histogram were obtained in 20 untreated patients and 20 patients treated with pirfenidone. Pearson correlation analysis was used to test the relationship between VSs and selected percentiles. Results On follow-up MDCT, the visual overall extent of parenchymal abnormalities (OE) increased by a median of 5%/year (interquartile range: 0 to +11%/year). A substantial difference was found between treated and untreated patients in the HU changes of the 40th and 80th percentiles of the density histogram. Correlation analysis between VSs and selected percentiles showed a higher correlation between the changes (Δ) in OE and Δ 40th percentile (r=0.69; p<0.001) than with Δ 80th percentile (r=0.58; p<0.001); a closer correlation was found between Δ ground-glass extent and Δ 40th percentile (r=0.66, p<0.001) than with Δ 80th percentile (r=0.47, p=0.002), whereas Δ reticulations correlated better with Δ 80th percentile (r=0.56, p<0.001) than with Δ 40th percentile (r=0.43, p=0.003). Conclusions There is a relevant and fully automatically measurable difference at MDCT in VSs and in histogram analysis at one-year follow-up of IPF patients, whether treated or untreated: Δ 40th percentile might reflect the change in overall extent of lung abnormalities, notably of the ground-glass pattern, whereas Δ 80th percentile might reflect the course of reticular opacities. PMID:26110421
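A brief Python sketch of the histogram-based evaluation described above: compute the 10th-90th attenuation percentiles (in 5-percentile steps) of the lung voxel values at each time point, then correlate per-patient changes in the 40th percentile with changes in the visual overall-extent score. All arrays below are placeholder values, not YACTA output or study data.

import numpy as np
from scipy.stats import pearsonr

def density_percentiles(lung_hu_values):
    # Attenuation histogram summary: 10th to 90th percentile in 5-percentile steps (HU).
    return {p: float(np.percentile(lung_hu_values, p)) for p in range(10, 95, 5)}

lung_hu = np.random.normal(-800, 120, size=10000)  # fake lung attenuation values (HU)
print(density_percentiles(lung_hu))

# Per-patient changes (follow-up minus baseline); illustrative values only.
delta_visual_oe = np.array([5.0, 0.0, 11.0, 3.0, -2.0, 8.0])  # change in OE, %/year
delta_p40_hu = np.array([20.0, 2.0, 35.0, 12.0, -5.0, 28.0])  # change in 40th percentile, HU
r, p = pearsonr(delta_visual_oe, delta_p40_hu)
print(f"r = {r:.2f}, p = {p:.3f}")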
Katam, Kishore K; Bhatia, Vijayalakshmi; Dabadghao, Preeti; Bhatia, Eesh
2016-01-01
There is little information regarding the costs of managing type 1 diabetes mellitus (T1DM) from low- and middle-income countries. We estimated direct costs of T1DM in patients attending a referral diabetes clinic in a government-funded hospital in northern India. We prospectively enrolled 88 consecutive T1DM patients (mean [SD] age 15.3 [8] years) with age at onset <18 years presenting to the endocrine clinic of our institution. Data on direct costs were collected over 12 months: 6 months retrospectively followed by 6 months prospectively. Patients belonged predominantly (77%) to the middle socioeconomic strata (SES); 81% had no access to government subsidy or health insurance. The mean direct cost per patient-year of T1DM was ₹27,915 (interquartile range [IQR] ₹19,852-32,856), which was 18.6% (7.1%-30.1%) of the total family income. A greater proportion of income was spent by families of lower compared with middle SES (32.6% v. 6.6%, p<0.001). The mean out-of-pocket payment for diabetes care ranged from 2% to 100% (mean 87%) of the total costs. The largest expenditure was on home blood glucose monitoring (40%) and insulin (39.5%). On multivariate analysis, total direct cost was associated with annual family income (β=0.223, p=0.033), frequency of home blood glucose monitoring (β=0.249, p=0.016) and use of analogue insulin (β=0.225, p=0.016). Direct costs of T1DM were high; in proportion to their income, the costs were greater in the lower SES. The largest expenditure was on home blood glucose monitoring and insulin. Support for insulin and glucose testing strips for T1DM care is urgently required.
Fernández-Salazar, Luis; Barrio, Jesús; Muñoz, Fernando; Muñoz, Concepción; Pajares, Ramón; Rivero, Montserrat; Prieto, Vanesa; Legido, Jesús; Bouhmidi, Abdel; Herranz, Maite; González-Redondo, Guillermo; Fernández, Nereida; Santos, Fernando; Sánchez-Ocaña, Ramón; Joao, Diana
2015-09-01
Infliximab (IFX) therapy intensification in ulcerative colitis (UC) is more common than established in pivotal studies. To establish the frequency and form of intensification for UC in clinical practice, as well as its predictors, and to compare outcomes between intensified and non-intensified treatment. A retrospective study of 10 hospitals and 144 patients with response to infliximab (IFX) induction. Predictive variables for intensification were analyzed using Cox regression analysis. Outcome, loss of response to IFX, and colectomy were compared between intensified and non-intensified therapy. Follow-up time from induction to data collection was 38 months [interquartile range (IQR), 20-62]. Time on IFX therapy was 24 months (IQR, 10-44). In all, 37% of patients required intensification. The interval was shortened for 36 patients, the dose was increased for 7, and 10 subjects received both. Concurrent initiation of thiopurine immunosuppressants (IMM) and IFX was independently associated with a lower risk of intensification [hazard ratio, 0.034; p, 0.006; CI, 0.003-0.371]. In patients on intensified therapy, IFX discontinuation for loss of response (30.4% vs. 10.2%; p, 0.002), steroid reintroduction (35% vs. 18%; p, 0.018), and colectomy (22% vs. 6.4%; p, 0.011) were more common. Of patients on intensification, 17% returned to receiving 5 mg/kg every 8 weeks. Intensification is common and occasionally reversible. IMM initiation at the time of IFX induction predicts non-intensification. Intensification, while effective, is associated with poorer outcomes.
Bacterial contamination of fabric and metal-bead identity card lanyards: a cross-sectional study.
Pepper, Thomas; Hicks, Georgina; Glass, Stephen; Philpott-Howard, John
2014-01-01
In healthcare, fabric or metal-bead lanyards are universally used for carrying identity cards. However there is little information on microbial contamination with potential pathogens that may readily re-contaminate disinfected hands. We examined 108 lanyards from hospital staff. Most grew skin flora but 7/108 (6%) had potentially pathogenic bacteria: four grew methicillin-susceptible Staphylococcus aureus, and four grew probable fecal flora: 3 Clostridium perfringens and 1 Clostridium bifermentans (one lanyard grew both S. aureus and C. bifermentans). Unused (control) lanyards had little or no such contamination. The median duration of lanyard wear was 12 months (interquartile range 3-36 months). 17/108 (16%) of the lanyards had reportedly undergone decontamination including wiping with alcohol, chlorhexidine or chlorine dioxide; and washing with soap and water or by washing machine. Metal-bead lanyards had significantly lower median bacterial counts than those from fabric lanyards (1 vs. 4 CFU/cm(2); Mann-Whitney U=300.5; P<0.001). 12/32 (38%) of the metal-bead lanyards grew no bacteria, compared with 2/76 (3%) of fabric lanyards. We recommend that an effective decontamination regimen be instituted by those who use fabric lanyards, or that fabric lanyards be discarded altogether in preference for metal-bead lanyards or clip-on identity cards. Copyright © 2014 King Saud Bin Abdulaziz University for Health Sciences. Published by Elsevier Ltd. All rights reserved.
Demesh, Daniel; Virbalas, Jordan M; Bent, John P
2015-03-01
Pediatric autoimmune neuropsychiatric disorders associated with streptococcal infections (PANDAS) in children describes neuropsychiatric symptom exacerbations that relate temporally to streptococcal infections. Recent case reports suggest tonsillectomy may effectively reduce these symptoms; however, no consensus treatment guidelines exist. This study examines whether tonsillectomy improves neuropsychiatric symptoms in children with PANDAS who have incomplete response to antibiotic therapy. Ten patients met strict diagnostic criteria for PANDAS. Comparisons were made between parental reports of symptom severity at diagnosis, after antibiotic treatment (in 10 patients), and after tonsillectomy (in 9). From a baseline severity score of 10, antibiotics alone improved symptoms to a median (interquartile range [IQR]) score of 8 (6.5-10.0) (P = .03). Nine children who subsequently underwent tonsillectomy reported symptom improvement in comparison with treatment with antibiotics alone, including those with no response to antibiotics. Symptom severity improved at all periods after tonsillectomy compared with antibiotics alone. The median score [IQR] 3 months postoperatively was 3 (0.0-6.5) (P = .01); 6 months postoperatively, 3 (0.0-5.0) (P = .02); 1 year postoperatively, 3 (0.0-5.0) (P = .02); and 3 years postoperatively, 0.5 (0.0-2.3) (P = .03). Four of the 9 had complete resolution after tonsillectomy. This PANDAS cohort whose neuropsychiatric symptoms did not respond sufficiently to antibiotics may have gained benefit from tonsillectomy.
Progression in children with intestinal failure at a referral hospital in Medellín, Colombia.
Contreras-Ramírez, M M; Giraldo-Villa, A; Henao-Roldan, C; Martínez-Volkmar, M I; Valencia-Quintero, A F; Montoya-Delgado, D C; Ruiz-Navas, P; García-Loboguerrero, F
2016-01-01
Patients with intestinal failure are unable to maintain adequate nutrition and hydration due to a reduction in the functional area of the intestine. Different strategies have the potential to benefit these patients by promoting intestinal autonomy, enhancing quality of life, and increasing survival. To describe the clinical characteristics of children with intestinal failure and disease progression in terms of intestinal autonomy and survival. A retrospective study was conducted, evaluating 33 pediatric patients with intestinal failure that were hospitalized within the time frame of December 2005 and December 2013 at a tertiary care referral center. Patient characteristics were described upon hospital admission, estimating the probability of achieving intestinal autonomy and calculating the survival rate. Patient median age upon hospital admission was 2 months (interquartile range [IQR]: 1-4 months) and 54.5% of the patients were boys. Intestinal autonomy was achieved in 69.7% of the cases with a median time of 148 days (IQR: 63 - 431 days), which decreased to 63 days in patients with a spared ileocecal valve. Survival was 91% during a median follow-up of 281 days (IQR: 161 - 772 days). Medical management of patients with intestinal failure is complex. Nutritional support and continuous monitoring are of the utmost importance and long-term morbidity and mortality depends on the early recognition and management of the associated complications. Copyright © 2016. Published by Masson Doyma México S.A.
Validation of the Leicester Cough Questionnaire in non-cystic fibrosis bronchiectasis.
Murray, M P; Turnbull, K; MacQuarrie, S; Pentland, J L; Hill, A T
2009-07-01
Health-related quality of life is a potentially important marker for evaluating existing and new therapies in bronchiectasis. The Leicester Cough Questionnaire (LCQ) is a symptom specific questionnaire designed to assess the impact of cough severity, a major symptom of bronchiectasis. This study aimed to validate the LCQ in bronchiectasis. The validity, responsiveness and reliability of the LCQ were assessed as follows: ability to discriminate severe and mild disease; change in score following antibiotic treatment for exacerbations; repeatability over a 6-month period in stable disease; and comparison with the St George's Respiratory Questionnaire (SGRQ). In total, 120 patients (51 with severe disease, 29 with moderate disease and 40 with mild disease) completed the LCQ and SGRQ. The area under the receiver-operator curve was good for both severe and mild disease (0.84 and 0.80 respectively, p<0.0001). Following 2 weeks' antibiotic treatment, the median LCQ score (interquartile range) improved from 11.3 (9.3-13.7) to 17.8 (15-18.8) (p<0.0001). The LCQ score was repeatable over 6 months in stable disease (intraclass correlation coefficient of 0.96 (95%CI 0.93-0.97), p<0.0001). Correlation between the LCQ and SGRQ scores was -0.7 in both stable disease and exacerbations (p<0.0001). The LCQ can discriminate disease severity, is responsive to change and is reliable for use in non-cystic fibrosis bronchiectasis.
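For readers unfamiliar with the validation metrics used above, the following sketch shows how the discrimination and convergent-validity steps could be reproduced. It assumes per-patient LCQ and SGRQ totals plus a binary severity label; it is not the authors' analysis code, and the test-retest intraclass correlation is omitted.

```python
# Illustrative sketch (assumed data layout, not the authors' code): discriminative
# ability of a questionnaire score via ROC AUC, and convergent validity against
# another questionnaire via Spearman correlation.
import numpy as np
from sklearn.metrics import roc_auc_score
from scipy.stats import spearmanr

def discrimination_auc(lcq_scores, is_severe):
    """AUC for separating severe (1) from mild (0) disease. Lower LCQ scores
    indicate worse cough-related quality of life, so the score is negated to
    obtain a predictor that increases with severity."""
    return roc_auc_score(np.asarray(is_severe), -np.asarray(lcq_scores))

def convergent_validity(lcq_scores, sgrq_scores):
    """Spearman correlation between LCQ and SGRQ totals (expected to be
    negative, since the two scales run in opposite directions)."""
    rho, p = spearmanr(lcq_scores, sgrq_scores)
    return rho, p
```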
Turak, Osman; Afşar, Barış; Ozcan, Fırat; Öksüz, Fatih; Mendi, Mehmet Ali; Yayla, Çagrı; Covic, Adrian; Bertelsen, Nathan; Kanbay, Mehmet
2016-08-01
The triglyceride (TG) to high-density lipoprotein cholesterol (HDL-C) ratio (TG/HDL-C) has been suggested as a simple method to identify unfavorable cardiovascular outcomes in the general population. The prognostic value of the TG/HDL-C ratio in patients with essential hypertension is unclear. About 900 consecutive essential hypertensive patients (mean age 52.9±12.6 years, 54.2% male) who visited our outpatient hypertension clinic were analyzed. Participants were divided into quartiles based on baseline TG/HDL-C ratio, and medical records were obtained periodically for the occurrence of fatal events and composite major adverse cardiovascular events (MACEs), including transient ischemic attack, stroke, aortic dissection, acute coronary syndrome, and death. Participants were followed for a median of 40 months (interquartile range, 35-44 months). Overall, a higher quartile of TG/HDL-C ratio at baseline was significantly linked with a higher incidence of fatal and nonfatal cardiovascular events. Using multivariate Cox regression analysis, the plasma TG/HDL-C ratio was independently associated with increased risk of fatal events (hazard ratio [HR], 1.25; 95% confidence interval [CI], 1.13-1.37; P≤.001) and MACEs (HR, 1.13; 95% CI, 1.06-1.21; P≤.001). An increased plasma TG/HDL-C ratio was associated with more fatal events and MACEs in essential hypertensive patients. © 2015 Wiley Periodicals, Inc.
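A minimal sketch of the quartile grouping and multivariable Cox model described above, assuming a pandas DataFrame with hypothetical column names ('tg', 'hdl_c', 'followup_months', 'mace', 'age', 'male'); this is not the study's analysis code, and the lifelines library is used for the Cox fit.

```python
# Minimal sketch (assumed column names, not the study's code): quartiles of the
# TG/HDL-C ratio for descriptive comparison, and a multivariable Cox model for
# major adverse cardiovascular events (MACEs) using the continuous ratio.
import pandas as pd
from lifelines import CoxPHFitter

def fit_mace_model(df: pd.DataFrame) -> CoxPHFitter:
    """df is assumed to contain: 'tg', 'hdl_c', 'age', 'male',
    'followup_months' and 'mace' (1 = event, 0 = censored)."""
    df = df.copy()
    df["tg_hdl_ratio"] = df["tg"] / df["hdl_c"]
    # Quartile labels 1-4, used for baseline/incidence comparisons.
    df["tg_hdl_quartile"] = pd.qcut(df["tg_hdl_ratio"], 4, labels=False) + 1

    cph = CoxPHFitter()
    cph.fit(
        df[["followup_months", "mace", "tg_hdl_ratio", "age", "male"]],
        duration_col="followup_months",
        event_col="mace",
    )
    return cph  # cph.hazard_ratios_ gives the HR per unit increase in each covariate
```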
Marcantoni, Lina; Toselli, Tiziano; Urso, Giulia; Pratola, Claudio; Ceconi, Claudio; Bertini, Matteo
2015-11-01
In the last decade, there has been an exponential increase in implantable cardioverter-defibrillator (ICD) implants. Remote monitoring systems allow daily follow-up of patients with ICDs. To evaluate the impact of remote monitoring on the management of cardiovascular events associated with supraventricular and ventricular arrhythmias during long-term follow-up. A total of 207 patients undergoing ICD implantation/replacement were enrolled: 79 patients received remote monitoring systems and were followed up every 12 months, and 128 patients were followed up conventionally every 6 months. All patients were followed up and monitored for the occurrence of supraventricular and ventricular arrhythmia-related cardiovascular events (ICD shocks and/or hospitalizations). During a median follow-up of 842 days (interquartile range 476-1288 days), 32 (15.5%) patients experienced supraventricular arrhythmia-related events and 51 (24.6%) patients experienced ventricular arrhythmia-related events. Remote monitoring had a significant role in the reduction of supraventricular arrhythmia-related events, but it had no effect on ventricular arrhythmia-related events. In multivariable analysis, remote monitoring remained an independent protective factor, reducing the risk of supraventricular arrhythmia-related events by 67% [hazard ratio, 0.33; 95% confidence interval (CI), 0.13-0.82; P = 0.017]. Remote monitoring systems improved outcomes in patients with supraventricular arrhythmias by reducing the risk of cardiovascular events, but no benefits were observed in patients with ventricular arrhythmias.
Direct costs of care for hepatocellular carcinoma in patients with hepatitis C cirrhosis.
Tapper, Elliot B; Catana, Andreea M; Sethi, Nidhi; Mansuri, Daniel; Sethi, Saurabh; Vong, Annie; Afdhal, Nezam H
2016-03-15
Hepatitis C virus (HCV) is the commonest cause of hepatocellular carcinoma (HCC) in the United States. The benefits of HCV therapy may be measured in part by the prevention of HCC and other complications of cirrhosis. The true cost of care of the HCV patient with HCC is unknown. One hundred patients were randomly selected from a cohort of all HCC patients with HCV at a US transplant center between 2003 and 2013. Patients were categorized by the primary treatment modality, Barcelona class, and ultimate transplant status. Costs included the unit costs of procedures, imaging, hospitalizations, medications, and all subsequent care of the HCC patient until either death or the end of follow-up. Associations with survival and cost were assessed in multivariate regression models. Overall costs included a median of $176,456 (interquartile range [IQR], $84,489-$292,192) per patient or $6279 (IQR, $4043-$9720) per patient-month of observation. The median costs per patient-month were $7492 (IQR, $5137-$11,057) for transplant patients and $4830 for nontransplant patients. The highest median monthly costs were for transplant patients with Barcelona A4 disease ($11,349) and patients who received chemoembolization whether they underwent transplantation ($10,244) or not ($8853). Transarterial chemoembolization and radiofrequency ablation were independently associated with a 28% increase and a 22% decrease in costs, respectively, with adjustments for the severity of liver disease and Barcelona class. These data represent real-world estimates of the cost of HCC care provided at a transplant center and should inform economic studies of HCV therapy. © 2015 American Cancer Society.
Wang, Hong; Ji, Xiao-Bin; Li, Cheng-Wei; Lu, Hai-Wen; Mao, Bei; Liang, Shuo; Cheng, Ke-Bin; Bai, Jiu-Wu; Martinez-Garcia, Miguel Angel; Xu, Jin-Fu
2018-05-23
Lung damage related to tuberculosis is a major contributor to the etiology of bronchiectasis in China. It is unknown whether bronchiectasis severity score systems are applicable in these cases. To evaluate the clinical characteristics and validation of bronchiectasis severity score systems for post-tuberculosis bronchiectasis. The study enrolled 596 bronchiectasis patients in Shanghai Pulmonary Hospital between January 2011 and December 2012. The data for calculating FACED and bronchiectasis severity index (BSI) scores, along with mortality, readmission, and exacerbation outcomes, were collected and analyzed within a follow-up period with a median length of 48 months (interquartile range 43-54 months). Of these, 101 were post-tuberculosis bronchiectasis patients and 495 were non-tuberculosis bronchiectasis patients. Compared with non-post-tuberculosis bronchiectasis, post-tuberculosis bronchiectasis patients had bilateral bronchiectasis less often (P=0.004), had a higher frequency of right upper lobe involvement (P<0.001), and showed the cylindrical type more often (P<0.001). Follow-up data indicated that both scoring systems were able to predict 48 (43-54)-month mortality in post-tuberculosis patients as assessed by the area under the receiver operator characteristic curve (AUC) (FACED AUC=0.81, BSI AUC=0.70), but they did not predict readmission (FACED and BSI=0.56) or exacerbation (FACED and BSI=0.52) well. There are apparent differences in radiologic features between bronchiectasis patients with and without a history of pulmonary tuberculosis. Both FACED and BSI can predict mortality in post-tuberculosis bronchiectasis. This article is protected by copyright. All rights reserved. © 2018 John Wiley & Sons Ltd.
Puy, Laurent; De Guio, François; Godin, Ophélia; Duering, Marco; Dichgans, Martin; Chabriat, Hugues; Jouvent, Eric
2017-10-01
Cerebral microbleeds are associated with an increased risk of intracerebral hemorrhage. Recent data suggest that microbleeds may also predict the risk of incident ischemic stroke. However, these results were observed in elderly individuals taking various medications and for whom the causes of microbleeds and ischemic stroke may differ. We aimed to test the relationship between the presence of microbleeds and incident stroke in CADASIL (Cerebral Autosomal Dominant Arteriopathy With Subcortical Infarcts and Leukoencephalopathy)-a severe monogenic small vessel disease known to be responsible for both highly prevalent microbleeds and a high incidence of ischemic stroke in young patients. We assessed microbleeds on baseline MRI in all 378 patients from the Paris-Munich cohort study. Incident ischemic strokes were recorded during 54 months. Survival analyses were used to test the relationship between microbleeds and incident ischemic stroke. Three hundred sixty-nine patients (mean age, 51.4±11.4 years) were followed up for a median time of 39 months (interquartile range, 19 months). The risk of incident ischemic stroke was higher in patients with microbleeds than in patients without (35.8% versus 19.6%; hazard ratio, 1.87; 95% confidence interval, 1.16-3.01; P =0.009). These results persisted after adjustment for history of ischemic stroke, age, sex, vascular risk factors, and antiplatelet agent use (hazard ratio, 1.89; 95% confidence interval, 1.10-3.26; P =0.02). The presence of microbleeds is an independent risk marker of incident ischemic stroke in CADASIL, emphasizing the need to carefully interpret MRI data. © 2017 American Heart Association, Inc.
Peyton, P J; Wu, C; Jacobson, T; Hogg, M; Zia, F; Leslie, K
2017-07-01
Chronic postsurgical pain (CPSP) is a common and debilitating complication of major surgery. We undertook a pilot study at three hospitals to assess the feasibility of a proposed large multicentre placebo-controlled randomised trial of intravenous perioperative ketamine to reduce the incidence of CPSP. Ketamine, 0.5 mg/kg pre-incision, 0.25 mg/kg/hour intraoperatively and 0.1 mg/kg/hour for 24 hours, or placebo, was administered to 80 patients, recruited over a 15-month period, undergoing abdominal or thoracic surgery under general anaesthesia. The primary endpoint was CPSP in the area of the surgery reported at six-month telephone follow-up using a structured questionnaire. Fourteen patients (17.5%) reported CPSP (relative risk [95% confidence interval] if received ketamine 1.18 [0.70 to 1.98], P =0.56). Four patients in the treatment group and three in the control group reported ongoing analgesic use to treat CPSP and two patients in each group reported their worst pain in the previous 24 hours at ≥3/10 at six months. There were no significant differences in adverse event rates, quality of recovery scores, or cumulative morphine equivalents consumption in the first 72 hours. Numeric Rating Scale pain scores (median [interquartile range, IQR]) for average pain in the previous 24 hours among those patients reporting CPSP were 17.5 [0 to 40] /100 with no difference between treatment groups. A large (n=4,000 to 5,000) adequately powered multicentre trial is feasible using this population and methodology.
A program to provide antiretroviral therapy to residents of an urban slum in Nairobi, Kenya.
Marston, Barbara J; Macharia, Doris K; Nga'nga, Lucy; Wangai, Mary; Ilako, Festus; Muhenje, Odylia; Kjaer, Mette; Isavwa, Anthony; Kim, Andrea; Chebet, Kenneth; DeCock, Kevin M; Weidle, Paul J
2007-06-01
To evaluate retention in care and response to therapy for patients enrolled in an antiretroviral treatment program in a severely resource-constrained setting. We evaluated patients enrolled between February 26, 2003, and February 28, 2005, in a community clinic in Kibera, an informal settlement, in Nairobi, Kenya. Midlevel providers offered simplified, standardized antiretroviral therapy (ART) regimens and monitored patients clinically and with basic laboratory tests. Clinical, immunologic, and virologic indicators were used to assess response to ART; adherence was determined by 3-day recall. A total of 283 patients (70% women; median baseline CD4 count, 157 cells/ mm(3); viral load, 5.16 log copies/mL) initiated ART and were followed for a median of 7.1 months (n = 2384 patient-months). At 1 year, the median CD4 count change was +124.5 cells/mm(3) (n = 74; interquartile range, 42 to 180), and 71 (74%) of 96 patients had viral load <400 copies/mL. The proportion of patients reporting 100% adherence over the 3 days before monthly clinic visits was 94% to 100%. As of February 28, 2005, a total of 239 patients (84%) remained in care, 22 (8%) were lost to follow-up, 12 (4%) were known to have died, 5 (2%) had stopped ART, 3 (1%) moved from the area, and 2 (< 1% ) transferred care. Response to ART in this slum population was comparable to that seen in industrialized settings. With government commitment, donor support, and community involvement, it is feasible to implement successful ART programs in extremely challenging social and environmental conditions.
Psychoactive substances, alcohol and tobacco consumption in HIV-infected outpatients.
Jacquet, Jean-Marc; Peyriere, Hélène; Makinson, Alain; Peries, Marianne; Nagot, Nicolas; Donnadieu-Rigole, Hélène; Reynes, Jacques
2018-06-01
To assess the alcohol consumption, tobacco addiction and psychoactive substance use (PSU) of people living with HIV (PLHIV). Cross-sectional study in an HIV outpatient unit. A self-administered questionnaire was systematically offered to all patients during their usual clinical care visit over a 6-month period, covering alcohol (AUDIT test), tobacco (Short Fagerstrom Test) and PSU (ASSIST V3.0 test). Of 1334 distributed questionnaires, 1018 PLHIV responded: 76.8% were men (528 patients were MSM), and the median age was 49 years (interquartile range: 42-46). A prevalence of excessive alcohol drinking was found in 22% [95% confidence interval (CI) 19.5-24.7%], and 44.6% (CI 41.5-47.7%) were current smokers, with high dependence in 29.1% (CI 24.9-33.7%). The prevalence of PSU was 37.8% (CI 34.8-41%) in the past 3 months: cannabis 27.7%, poppers 16.4%, cocaine 8.9%, psychotropic medications 7.1%, gamma-hydroxybutyrate/gamma-butyrolactone (GHB/GBL) 4.7%, stimulants 3.1%, synthetic cathinones 2.7%, hallucinogens 1.5%. In the past 3 months, PSU was more prevalent in MSM than in non-MSM patients (46 versus 30%, P < 0.001). MSM consumed significantly more inhaled solvents (poppers) 31.0% versus 1.1%, GHB/GBL 7.8% versus 0.8%, stimulants 5.0% versus 1.1%, synthetic cathinones 4.9% versus 0.3%, and hallucinogens 2.3% versus 0.5%. Given the high prevalence of PSU and other addictions (alcohol and smoking) among PLHIV, and particularly among MSM, systematic screening for PSU and other addictions should be part of routine clinical care.
Charlton, K; Nichols, C; Bowden, S; Milosavljevic, M; Lambert, K; Barone, L; Mason, M; Batterham, M
2012-11-01
Older malnourished patients experience increased surgical complications and greater morbidity compared with their well-nourished counterparts. This study aimed to assess whether nutritional status at hospital admission predicted clinical outcomes at 18-month follow-up. A retrospective analysis of N=2076 patient admissions (65+ years) from two subacute hospitals, New South Wales, Australia. Analysis of outcomes at 18 months, according to nutritional status at index admission, was performed in a subsample of n = 476. Nutritional status was determined within 72 h of admission using the Mini Nutritional Assessment (MNA). Outcomes, obtained from electronic patient records, included hospital readmission rate, total length of stay (LOS), change in level of care at discharge and mortality. Survival analysis, using a Cox proportional hazards model, included age, sex, Major Disease Classification, mobility and LOS at index admission as covariates. At baseline, 30% of patients were malnourished and 53% were at risk of malnutrition. LOS was higher in malnourished and at-risk patients compared with well-nourished patients (median (interquartile range): 34 (21, 58); 26 (15, 41); 20 (14, 26) days, respectively; P<0.001). The hazard rate for death in the malnourished group was 3.41 times that of the well-nourished group (95% confidence interval: 1.07-10.87; P = 0.038). Discharge to a higher level of residential care occurred in 33.1%, 16.9% and 4.9% of malnourished, at-risk and well-nourished patients, respectively (P ≤ 0.001). Malnutrition in elderly subacute patients predicts adverse clinical outcomes and identifies a need to target this population for nutritional intervention following hospital discharge.
Independence and mobility after infrainguinal lower limb bypass surgery for critical limb ischemia.
Ambler, Graeme K; Dapaah, Andrew; Al Zuhir, Naail; Hayes, Paul D; Gohel, Manjit S; Boyle, Jonathan R; Varty, Kevin; Coughlin, Patrick A
2014-04-01
Critical limb ischemia (CLI) is a common condition associated with high levels of morbidity and mortality. Most work to date has focused on surgeon-oriented outcomes such as patency, but there is increasing interest in patient-oriented outcomes such as mobility and independence. This study was conducted to determine the effect of infrainguinal lower limb bypass surgery (LLBS) on postoperative mobility in a United Kingdom tertiary vascular surgery unit and to investigate causes and consequences of poor postoperative mobility. We collected data on all patients undergoing LLBS for CLI at our institution during a 3-year period and analyzed potential factors that correlated with poor postoperative mobility. During the study period, 93 index LLBS procedures were performed for patients with CLI. Median length of stay was 11 days (interquartile range, 11 days). The 12-month rates of graft patency, major amputation, and mortality were 75%, 9%, and 6%, respectively. Rates of dependence increased fourfold during the first postoperative year, from 5% preoperatively to 21% at 12 months. Predictors of poor postoperative mobility were female sex (P = .04) and poor preoperative mobility (P < .001), both initially and at the 12-month follow-up. Patients with poor postoperative mobility had significantly prolonged hospital length of stay (15 vs 8 days; P < .001). Patients undergoing LLBS for CLI suffer significantly impaired postoperative mobility, and this is associated with prolonged hospital stay, irrespective of successful revascularization. Further work is needed to better predict patients who will benefit from revascularization and in whom a nonoperative strategy is optimal. Copyright © 2014 Society for Vascular Surgery. Published by Mosby, Inc. All rights reserved.
Marquardt, Pascal; Strub, Jörg Rudolf
2006-04-01
The aim of this prospective clinical study was to evaluate the survival rates of IPS Empress 2 (Ivoclar Vivadent) all-ceramic crowns and fixed partial dentures (FPDs) after an observation period of up to 5 years. Forty-three patients (19 women and 24 men) were included in this study. The patients were treated with a total of 58 adhesive bonded IPS Empress 2 restorations. A total of 27 single crowns were placed on molars and premolars, and 31 three-unit FPDs were placed in the anterior and premolar regions. Clinical follow-up examinations took place at 6, 12, 24, 36, 48, and 60 months after insertion. Statistical analysis of the data was calculated using the Kaplan-Meier method. Results of the 50-month analysis (interquartile range, 33 to 61 months) showed that the survival rate was 100% for crowns and 70% for FPDs. Six failures that occurred exclusively in the three-unit FPDs were observed. Framework fractures were recorded in three FPD units where the connector dimensions did not meet the manufacturer specifications. Only one FPD exhibited an irreparable partial veneer fracture, and 2 FPDs showed evidence of biologic failures. The accuracy of fit and esthetic parameters were clinically satisfactory for crowns and FPDs. The results of this 5-year clinical evaluation suggest that IPS Empress 2 ceramic is an appropriate material for the fabrication of single crowns. Because of the reduced survival rates, strict conditions should be considered before the use of IPS Empress 2 material for the fabrication of three-unit FPDs.
Voltolini, Alessandra; Minotti, Anna; Verde, Alessandro; Cipriani, Manlio; Garascia, Andrea; Turazza, Fabio; Macera, Francesca; Perna, Enrico; Russo, Claudio F; Fumagalli, Emilia; Frigerio, Maria
2016-11-01
Heart disease has an impact on patients' identity and self-perception. In contrast to the extensive literature on psychological aspects before and after heart transplantation, there is a clear lack of data on patients undergoing implantation of ventricular assist devices (VAD). The aim of the present study was to explore quality of life and factors correlated with psychological adjustment in patients supported with VAD. From February 2013 to August 2014, 18 patients (17 male, mean age 57 years) under clinical evaluation before and after VAD implantation were enrolled. During interviews, patients were assessed with the EuroQoL-5D questionnaire to monitor improvement in quality of life before implantation and at 3 and 6 months; critical issues, needs and points of view of patients were described. A significant improvement in the quality of life score was observed at 3 months (score 38 [interquartile range 30-40] vs 75 [60-80], p<0.05) and 6 months (38 [30-40] vs 70 [60-80], p<0.05). Overall, patients' psychological state investigated by the test showed a clear and positive trend. All patients needed to be empowered through complete information about the device, related risks and life expectancy. An interdisciplinary approach improved compliance with therapy. Successful treatment and efficient psychological care are closely related to assessment and continuous clinical support. This approach ensures a better selection of patients and improves their compliance. Further data are needed to support our preliminary observations and to explore long-term quality of life.
Early Australian experience in robotic sleeve gastrectomy: a single site series.
Silverman, Candice D; Ghusn, Michael A
2017-05-01
The use of robotic platforms in bariatric surgery has recently gained relevance. With an increased use of this technology come concerns regarding learning curve effects during the initial implementation phase. The sleeve gastrectomy though may represent an ideal training procedure for introducing the robot into bariatric surgical practice. The present review of the first 10 consecutive robotic sleeve gastrectomy procedures performed in an Australian bariatric programme by a single surgeon describes the evolution of the technique, learning curve and initial patient outcomes. Between 2014 and 2015, robotic sleeve gastrectomies were performed as primary and revisional procedures by a consistent surgeon-assistant team. Technique evolution and theatre set-up were documented. Patient demographics, operative time (robot docking and total operation time), additional operative procedures performed, operative and post-operative complications at 1, 3 and 6 months post-procedure and weight loss achieved at 6 months were retrospectively reviewed from a prospectively maintained database. Ten robotic sleeve gastrectomies were performed without significant operative complications. One patient was treated as an outpatient with oral antibiotics for a superficial wound infection. The median total operative time was 123 min (interquartile range (IQR) 108.8-142.5), with a median incision to docking time of 19 min (IQR 15.0-31.8). Length of stay in hospital was 2-3 days. Median excess weight loss achieved at 6 months was 50% (IQR 33.9-66.5). This study describes a method of safely introducing the da Vinci robot into bariatric surgical practice. © 2016 Royal Australasian College of Surgeons.
Cvijić, Marta; Žižek, David; Antolič, Bor; Zupan, Igor
2017-03-01
Cardiac resynchronization therapy (CRT) induces structural and electrical remodeling (ER) in heart failure (HF) patients. Our aim was to assess the time course of ER of native conduction and mechanical remodeling after CRT and the impact of CRT-induced ER on clinical outcome. We prospectively included 62 patients (aged 66 ± 10 years). Echocardiographic and ECG parameters were measured at baseline and 1, 3, 6, 9, and 12 months after implantation. Biventricular pacing was temporarily inhibited during each follow-up to record intrinsic ECG. ER was defined as a decrease in native pre-implantation QRS duration ≥10 ms. During follow-up, HF hospitalizations, cardiovascular death and transplantation (combined end point) were recorded. There were significant changes in intrinsic ECG parameters during follow-up; the narrowing of QRS duration was already observed after 1 month (median 185 ms [interquartile range (IQR) 175-194] vs 180 ms [170-194]; P < .001). Left ventricular (LV) volumes decreased only after 3 months of CRT (median end-systolic volume 167 mL [137-206] vs 140 mL [112-196]; P < .001). Only patients with ER (n = 24) exhibited significant mechanical remodeling and showed superior survival free from the combined end point compared with patients without ER (log-rank P = .028). Electrical remodeling of native conduction precedes detectable left ventricular structural changes after CRT. ER of native conduction is associated with better clinical outcome following CRT. Copyright © 2016 Elsevier Inc. All rights reserved.
On the Use of Rank Tests and Estimates in the Linear Model.
1982-06-01
Under assumption A5, McKean and Hettmansperger (1976) show that $\tau$ may be estimated by $\hat{\tau} = \bigl(W_{(N-c)} - W_{(c+1)}\bigr)/\bigl(2\,z_{\alpha/2}\bigr)$ (14), where $2\,z_{\alpha/2}$ is the $(1-\alpha)$ interpercentile range of the standard normal distribution ... $\bigl(r_{(0.75n)} - r_{(0.25n)}\bigr)$ (13). The window width $h$ incorporates a resistant estimate of scale, the interquartile range of the residuals, and a normalizing constant ... An alternative estimate of $\tau$ is available with the additional assumption of symmetry of the error distribution. Assumption A5: suppose the underlying error distribution ...
O'Bichere, Austin; Green, Colin; Phillips, Robin K S
2004-09-01
Water for colostomy irrigation is largely absorbed by the colon, which may result in less efficient expulsion of stool. This study compared the outcome of colonic cleansing with water and polyethylene glycol solution. In a cross-over study, 41 colostomy irrigators were randomly assigned to water or polyethylene glycol solution irrigation first and then the other regimen, each for one week. Patients recorded fluid inflow time, total washout time, cramps, leakage episodes, number of stoma pouches used, and satisfaction scores (Visual Analog Scale, 1-10: 1 = poor, and 10 = excellent). The median and interquartile range for each variable was calculated, and the two treatments were compared (Wilcoxon's test). Eight patients failed to complete the study. Thirty-three patients (20 females; mean age, 55 (range, 39-73) years) provided 352 irrigation sessions: water (n = 176), and polyethylene glycol solution (n = 176). Irrigation was performed every 24, 48, and 72 hours by 17, 9, and 7 patients respectively, using 500 ml (n = 1), 750 ml (n = 2), 1,000 ml (n = 16), 1,500 ml (n = 11), 2,000 ml (n = 2), and 3,500 ml (n = 1) of fluid. The median and interquartile range for water vs. polyethylene glycol solution were: fluid inflow time (6 (range, 4.4-10.8) vs. 6.3 (range, 4.1-11) minutes; P = 0.48), total washout time (53 (range, 33-69) vs. 38 (range, 28-55) minutes; P = 0.01), leakage episodes (2.3 (range, 1.7-3.8) vs. 0.7 (range, 0.2-1); P < 0.001), satisfaction score (5.8 (range, 4-7.5) vs. 8.8 (range, 8.3-10); P < 0.001), and stoma pouch usage per week (75 (range, 45-80) vs. 43 (range, 0-80); P = 0.008). No difference was demonstrated for frequency of cramps ( P = 0.24). Polyethylene glycol solution performed significantly better than water and may be a superior alternative fluid regimen for colostomy irrigation.
Deak, Dalia; Outterson, Kevin; Powers, John H; Kesselheim, Aaron S
2016-09-06
A weak antibiotic pipeline and the increase in drug-resistant pathogens have led to calls for more new antibiotics. Eight new antibiotics were approved by the U.S. Food and Drug Administration (FDA) between January 2010 and December 2015: ceftaroline, fidaxomicin, bedaquiline, dalbavancin, tedizolid, oritavancin, ceftolozane-tazobactam, and ceftazidime-avibactam. This study evaluates the development course and pivotal trials of these antibiotics for their innovativeness, development process, documented patient outcomes, and cost. Data sources were FDA approval packages and databases (January 2010 to December 2015); the Red Book (Truven Health Analytics); Orange Book: Approved Drug Products with Therapeutic Equivalence Evaluations (FDA); and supplementary information from company filings, press releases, and media reports. Four antibiotics were approved for acute bacterial skin and skin-structure infection. Seven had similar mechanisms of action to those of previously approved drugs. Six were initially developed by small to midsized companies, and 7 are currently marketed by 1 of 3 large companies. The drugs spent a median of 6.2 years in clinical trials (interquartile range [IQR], 5.4 to 8.8 years) and 8 months in FDA review (IQR, 7.5 to 8 months). The median number of patients enrolled in the pivotal trials was 666 (IQR, 553 to 739 patients; full range, 44 to 1005 patients), and median trial duration was 18 months (IQR, 15 to 22 months). Seven drugs were approved on the basis of pivotal trials evaluating noninferiority. One drug demonstrated superiority on an exploratory secondary end point, 2 showed decreased efficacy in patients with renal insufficiency, and 1 showed increased mortality compared with older drugs. Seven of the drugs are substantially more expensive than their trial comparators. Limitations are that future research may show benefit to patients, new drugs from older classes may show superior effectiveness in specific patient populations, and initial U.S. prices for each new antibiotic were obtained from public sources. Recently marketed antibiotics are more expensive but have been approved without evidence of clinical superiority.
Midterm outcomes of the Zenith Renu AAA Ancillary Graft.
Jim, Jeffrey; Rubin, Brian G; Geraghty, Patrick J; Money, Samuel R; Sanchez, Luis A
2011-08-01
The Zenith Renu abdominal aortic aneurysm (AAA) Ancillary Graft (Cook Medical Inc, Bloomington, Ind) provides active proximal fixation for treatment of pre-existing endografts with failed or failing proximal fixation or seal. The purpose of this study was to evaluate the midterm outcomes of treatment with this device. From September 2005 to November 2006, a prospective, nonrandomized, multicenter, postmarket registry was utilized to collect physician experiences from 151 cases (89 converters and 62 main body extensions) at 95 institutions. Preoperative indications and procedural and postimplantation outcomes were collected and analyzed. Technical success and clinical success were determined as defined by the Society for Vascular Surgery reporting standards. Patients were predominantly male (87%) with a mean age of 77 years. The interval between the original endograft implantation and Renu treatment was 43.4 ± 18.7 months. The indications for treatment were endoleak (n = 111), migration (n = 136), or both (n = 94). Technical success was 98.0%, with two cases of intraoperative conversion and one case of persistent type IA endoleak. The median follow-up for the cohort was 45.0 months (range, 0-56 months; interquartile range, 25.0 months). Overall, 32 cases had treatment failures that included at least one of the following: death (n = 5), type I/III endoleak (n = 18), graft infection (n = 1), thrombosis (n = 1), aneurysm enlargement >5 mm (n = 9), rupture (n = 4), conversion (n = 9, with 7 after 30 days), and migration (n = 1). Overall, the clinical success for the entire cohort during the follow-up period was 78.8% (119/151). The postmarket registry data confirm that the Zenith Renu AAA Ancillary Graft can be used to treat endovascular repairs that failed due to proximal attachment failures. The salvage treatment with the Renu device had a high technical success rate and resulted in clinical success in a majority of patients (78.8%). While failed endovascular repairs can be salvaged, a clinical failure in one of five patients still emphasizes the importance of patient and device selection during initial endovascular aneurysm repair to ensure durable success. Copyright © 2011 Society for Vascular Surgery. Published by Mosby, Inc. All rights reserved.
Faranoush, M; Abolghasemi, Hassan; Toogeh, Gh; Karimi, M; Eshghi, P; Managhchi, M; Hoorfar, H; Dehdezi, B Keikhaei; Mehrvar, A; Khoeiny, B; Kamyar, K; Heshmat, R; Baghaeipour, M R; Mirbehbahani, N B; Fayazfar, R; Ahmadinejad, M; Naderi, M
2015-11-01
In order to establish the efficacy and biosimilar nature of AryoSeven relative to NovoSeven in the treatment of congenital factor VII (FVII) deficiency, patients received either agent at 30 μg/kg, intravenously per week for 4 weeks, in a randomized fashion. The primary aim was to compare FVII:coagulation activity (FVII:C), 20 minutes after recombinant activated FVII (rFVIIa) injection, in the 2 groups. A secondary measure was self-reported bleeding. The median (interquartile range) baseline plasma activated FVII (FVIIa) activity in the 2 groups was 1.6 (1.1-14.0) IU/dL and 5.0 (1.1-25.5) IU/dL, respectively. All patients achieved levels of FVIIa (FVII:C) >30 IU/dL, 20 minutes after the injection of rFVIIa. Bleeding was similar between the 2 groups, with a comparable decrease in severity and frequency compared to the last month prior to treatment. AryoSeven is similar to NovoSeven in increasing postinjection FVIIa activity as well as in clinical safety and efficacy. © The Author(s) 2014.
Limmahakhun, S; Chaiwarith, R; Nuntachit, N; Sirisanthana, T; Supparatpinyo, K
2012-06-01
Thailand has been greatly affected by the tuberculosis (TB) and HIV syndemic. This study aimed to determine treatment outcomes among HIV/TB co-infected patients. A retrospective cohort study was conducted at Chiang Mai University Hospital from 1 January 2000 to 31 December 2009. Of 171 patients, 100 patients were male (58.5%) and the mean age was 36.8 ± 8.0 years. Seventy-two patients (42.1%) had pulmonary tuberculosis. Median CD4+ count before TB treatment was 69 cells/mm(3) (interquartile range [IQR] 33, 151). The overall mortality was 3.5% (6 patients). Immune reconstitution inflammatory syndrome (IRIS) occurred in eight patients (6.0%). Disseminated TB infections increased risk of death (odds ratio [OR] = 2.55, 95% confidence interval [CI] 1.25, 5.18) and IRIS (OR = 9.16, 95% CI 1.67, 50.07). Initiating combination antiretroviral therapy (cART) within two months after TB treatment increased risk of IRIS (OR = 6.57, 95% CI 1.61-26.86) and physicians caring for HIV/TB co-infected patients should be aware of this condition.
Birko, Stanislav; Dove, Edward S; Özdemir, Vural
2015-01-01
The extent of consensus (or the lack thereof) among experts in emerging fields of innovation can serve as antecedents of scientific, societal, investor and stakeholder synergy or conflict. Naturally, how we measure consensus is of great importance to science and technology strategic foresight. The Delphi methodology is a widely used anonymous survey technique to evaluate consensus among a panel of experts. Surprisingly, there is little guidance on how indices of consensus can be influenced by parameters of the Delphi survey itself. We simulated a classic three-round Delphi survey building on the concept of clustered consensus/dissensus. We evaluated three study characteristics that are pertinent for design of Delphi foresight research: (1) the number of survey questions, (2) the sample size, and (3) the extent to which experts conform to group opinion (the Group Conformity Index) in a Delphi study. Their impacts on the following nine Delphi consensus indices were then examined in 1000 simulations: Clustered Mode, Clustered Pairwise Agreement, Conger's Kappa, De Moivre index, Extremities Version of the Clustered Pairwise Agreement, Fleiss' Kappa, Mode, the Interquartile Range and Pairwise Agreement. The dependency of a consensus index on the Delphi survey characteristics was expressed from 0.000 (no dependency) to 1.000 (full dependency). The number of questions (range: 6 to 40) in a survey did not have a notable impact whereby the dependency values remained below 0.030. The variation in sample size (range: 6 to 50) displayed the top three impacts for the Interquartile Range, the Clustered Mode and the Mode (dependency = 0.396, 0.130, 0.116, respectively). The Group Conformity Index, a construct akin to measuring stubbornness/flexibility of experts' opinions, greatly impacted all nine Delphi consensus indices (dependency = 0.200 to 0.504), except the Extremity CPWA and the Interquartile Range that were impacted only beyond the first decimal point (dependency = 0.087 and 0.083, respectively). Scholars in technology design, foresight research and future(s) studies might consider these new findings in strategic planning of Delphi studies, for example, in rational choice of consensus indices and sample size, or accounting for confounding factors such as experts' variable degrees of conformity (stubbornness/flexibility) in modifying their opinions.
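Two of the simpler consensus indices named above, the Interquartile Range and Pairwise Agreement, can be computed with a few lines of code. The sketch below is illustrative only; it is not the authors' 1000-run simulation, and the panel data and agreement tolerance are invented for the example.

```python
# Minimal sketch (not the authors' simulation): two simple consensus indices
# for a single Delphi question answered on an ordinal (e.g. Likert) scale.
import numpy as np
from itertools import combinations

def interquartile_range(responses):
    """IQR of the panel's responses; smaller values indicate stronger consensus."""
    q75, q25 = np.percentile(responses, [75, 25])
    return q75 - q25

def pairwise_agreement(responses, tolerance=0):
    """Fraction of expert pairs whose answers differ by at most `tolerance`
    (tolerance=0 means exact agreement)."""
    pairs = list(combinations(responses, 2))
    agree = sum(1 for a, b in pairs if abs(a - b) <= tolerance)
    return agree / len(pairs)

# Example: a 12-expert panel rating one statement on a 1-9 scale.
panel = [7, 8, 7, 6, 8, 7, 9, 7, 6, 8, 7, 7]
print(interquartile_range(panel), pairwise_agreement(panel, tolerance=1))
```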
Ettinger, Adrienne S; Téllez-Rojo, Martha María; Amarasiriwardena, Chitra; González-Cossío, Teresa; Peterson, Karen E; Aro, Antonio; Hu, Howard; Hernández-Avila, Mauricio
2004-06-01
Despite the many well-recognized benefits of breast-feeding for both mothers and infants, detectable levels of lead in breast milk have been documented in population studies of women with no current environmental or occupational exposures. Mobilization of maternal bone lead stores has been suggested as a potential endogenous source of lead in breast milk. We measured lead in breast milk to quantify the relation between maternal blood and bone lead levels and breast-feeding status (exclusive vs. partial) among 310 lactating women in Mexico City, Mexico, at 1 month postpartum. Umbilical cord and maternal blood samples were collected at delivery. Maternal breast milk, blood, and bone lead levels were obtained at 1 month postpartum. Levels of lead in breast milk ranged from 0.21 to 8.02 microg/L (ppb), with a geometric mean (GM) of 1.1 microg/L; blood lead ranged from 1.8 to 29.9 microg/dL (GM = 8.4 microg/dL); bone lead ranged from < 1 to 67.2 microg/g bone mineral (patella) and from < 1 to 76.6 microg/g bone mineral (tibia) at 1 month postpartum. Breast milk lead was significantly correlated with umbilical cord lead [Spearman correlation coefficient (rS) = 0.36, p < 0.0001] and maternal blood lead (rS= 0.38, p < 0.0001) at delivery and with maternal blood lead (rS = 0.42, p < 0.0001) and patella lead (rS= 0.15, p < 0.01) at 1 month postpartum. Mother's age, years living in Mexico City, and use of lead-glazed ceramics, all predictive of cumulative lead exposure, were not significant predictors of breast milk lead levels. Adjusting for parity, daily dietary calcium intake (milligrams), infant weight change (grams), and breast-feeding status (exclusive or partial lactation), the estimated effect of an interquartile range (IQR) increase in blood lead (5.0 microg/dL) was associated with a 33% increase in breast milk lead [95% confidence interval (CI), 24 to 43%], whereas an IQR increase in patella lead (20 microg/g) was associated with a 14% increase in breast milk lead (95% CI, 5 to 25%). An IQR increase in tibia lead (12.0 microg/g) was associated with a 5% increase in breast milk lead (95% CI, -3% to 14%). Our results indicate that even among a population of women with relatively high lifetime exposure to lead, levels of lead in breast milk are low, influenced both by current lead exposure and by redistribution of bone lead accumulated from past environmental exposures.
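The "33% increase per IQR increase" figure above is the usual way a coefficient from a regression with a log-transformed outcome is reported. The sketch below shows that conversion; the coefficient value is a placeholder chosen only to reproduce the order of magnitude reported in the abstract, not the study's estimate, and a natural-log outcome model is assumed.

```python
# Hedged sketch: converting a log-linear regression coefficient into the percent
# change in the outcome for an interquartile-range (IQR) increase in exposure.
# The coefficient below is a placeholder, not the study's estimate.
import numpy as np

def percent_change_per_iqr(beta_per_unit, iqr):
    """If ln(outcome) = a + beta_per_unit * exposure + ..., an IQR-sized increase
    in exposure multiplies the outcome by exp(beta_per_unit * iqr)."""
    return 100.0 * (np.exp(beta_per_unit * iqr) - 1.0)

# Example with a hypothetical coefficient of 0.057 per microgram/dL of blood lead
# and the IQR of 5.0 microgram/dL reported in the abstract:
print(round(percent_change_per_iqr(0.057, 5.0), 1))  # roughly a 33% increase
```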
Ng, Ken K; Nisbet, Mitzi; Damato, Erika M; Sims, Joanne L
2017-05-01
To describe the clinical spectrum of presumed tuberculous (TB) uveitis in a developed, non-endemic country of high immigrant population. Retrospective review of a consecutive case series. All 39 patients diagnosed with presumed TB uveitis at the tertiary uveitis service in Auckland from 2007 to 2014. Clinical chart review. Patient demographics, risk factors, ophthalmic manifestations, management and outcome. The median age was 37 years (interquartile range [IQR] 31-52) and 56% were female. The majority (97%) were born outside of New Zealand, and 77% had no TB-related history. Radiological abnormalities consistent with TB were evident in seven patients, including three who had culture positive pulmonary disease. Anterior uveitis was diagnosed in ten patients (26%), anterior and intermediate uveitis in eight (21%), posterior uveitis in 13 (33%) and panuveitis in eight (21%). Sixteen (41%) had retinal vasculitis, and five (13%) had multifocal serpiginoid choroiditis. Common complications included cataract (51%), ocular hypertension (36%), broad posterior synechiae (33%) and cystoid macular oedema (28%). Anti-TB treatment was initiated in 30 patients (76%). All but three patients completed the intended course of six to 12 months. Following anti-TB treatment, 67% remained in remission for at least 12 months, and all but two patients successfully stopped systemic steroids. The median initial and final visual acuity was 6/9 (IQR 6/6-6/18) and 6/6 (IQR 6/6-6/9), respectively. Despite a wide range of ocular presentations and complications, our cohort demonstrated good remission rate and visual prognosis following anti-TB treatment in carefully selected patients. © 2016 Royal Australian and New Zealand College of Ophthalmologists.
Nathan, Anna Marie; Loo, Hui Yan; de Bruyne, Jessie Anne; Eg, Kah Peng; Kee, Sze Ying; Thavagnanam, Surendran; Bouniu, Marilyn; Wong, Jiat Earn; Gan, Chin Seng; Lum, Lucy Chai See
2017-04-01
Home ventilation (HV) for children is growing rapidly worldwide. The aim was to describe (1) the sociodemographic characteristics of children on HV and (2) the indications for, means and outcome of initiating HV in children from a developing country. This retrospective study included patients sent home on noninvasive or invasive ventilation, over 13 years, by the pediatric respiratory unit in a single center. Children who declined treatment were excluded. Seventy children were initiated on HV: 85.7% on noninvasive ventilation, 14.3% on invasive ventilation. There was about a threefold increase from 2001-2008 (n = 18) to 2009-2014 (n = 52). Median (range) age of initiating HV was 11 (1-169) months and 73% of children were <2 years old. Common indications for HV were respiratory (57.2%), chest/spine anomalies (11.4%), and neuromuscular (10.0%). Fifty-two percent came off their devices with a median (interquartile range) usage duration of 12 (4.8, 21.6) months. Ten children (14.3%) died with one avoidable death. Children with neuromuscular disease were less likely to come off their ventilator (0.0%) compared to children with respiratory disease (62.1%). Forty-one percent of parents bought their equipment, whereas 58.6% borrowed their equipment from the medical social work department and other sources. HV in a resource-limited country is possible. Children with respiratory disease made up a significant proportion of those requiring HV and were more likely to be weaned off. The mortality rate was low. The social work department played an important role in facilitating early discharge. Pediatr Pulmonol. 2017;52:500-507. © 2016 Wiley Periodicals, Inc.
Kornuijt, A; Das, D; Sijbesma, T; van der Weegen, W
2016-05-01
In order to prevent dislocation of the hip after total hip arthroplasty (THA), patients have to adhere to precautions in the early post-operative period. The hypothesis of this study was that a protocol with minimal precautions after primary THA using the posterolateral approach would not increase the short-term (less than three months) risk of dislocation. We prospectively monitored a group of unselected patients undergoing primary THA managed with standard precautions (n = 109, median age 68.9 years; interquartile range (IQR) 61.2 to 77.3) and a group who were managed with fewer precautions (n = 108, median age 67.2 years; IQR 59.8 to 73.2). There were no significant differences between the groups in relation to predisposing risk factors. The diameter of the femoral head ranged from 28 mm to 36 mm; meticulous soft-tissue repair was undertaken in all patients. The medical records were reviewed and all patients were contacted three months post-operatively to confirm whether they had experienced a dislocation. There were no dislocations in the less restricted group and one in the more restricted group (p = 0.32). For experienced surgeons using the posterolateral approach at THA and femoral heads of diameter ≥ 28 mm, it appears safe to manage patients in the immediate post-operative period with minimal precautions to protect against dislocation. Larger studies with adequate statistical power are needed to verify this conclusion. Experienced orthopaedic surgeons using the posterolateral approach for THA should not fear an increased dislocation rate if they manage their patients with a minimal precautions protocol. Cite this article: Bone Joint J 2016;98-B:589-94. ©2016 The British Editorial Society of Bone & Joint Surgery.
Association Between Bottle Size and Formula Intake in 2-Month-Old Infants.
Wood, Charles T; Skinner, Asheley C; Yin, H Shonna; Rothman, Russell L; Sanders, Lee M; Delamater, Alan; Ravanbakht, Sophie N; Perrin, Eliana M
2016-04-01
To determine range of bottle sizes used and examine the relationship between bottle size and total daily consumption of infant formula. Cross-sectional analysis of baseline data collected as part of Greenlight, a cluster randomized trial to prevent childhood obesity at 4 pediatric resident clinics. The Greenlight study included healthy, term infants. For our analysis, parents of exclusively formula-fed infants reported volume per feed, number of feeds per day, and bottle size, which was dichotomized into small (<6 oz) or large (≥6 oz). We identified determinants of bottle size, and then examined relationships between bottle size and volume fed with log-transformed ordinary least squares regression, adjusting for infant age, sex, birth weight, current weight, race/ethnicity, and enrollment in Special Supplemental Nutrition Program for Women, Infants, and Children. Of 865 participants in the Greenlight study, 44% (n = 378; 21.8% white, 40.6% black, 35.3% Hispanic, 2.4% other) of infants were exclusively formula fed at 2 months. Median volume per day was 30 oz (interquartile range 12), and 46.0% of infants were fed with large bottles. Adjusted for covariates, parents using larger bottles reported feeding 4 oz more formula per day (34.2 oz, 95% confidence interval 33.5-34.9 vs 29.7 oz, 95% confidence interval 29.2-30.3, P = .03). Among exclusively formula-fed infants, use of a larger bottle is associated with parental report of more formula intake compared to infants fed with smaller bottles. If infants fed with larger bottles receive more formula, these infants may be overfed and consequently at risk for obesity. Copyright © 2016 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.
McCarthy, Eoghan M; Sutton, Emily; Nesbit, Stephanie; White, James; Parker, Ben; Jayne, David; Griffiths, Bridget; Isenberg, David A; Gordon, Caroline; D'Cruz, David P; Rhodes, Benjamin; Lanyon, Peter; Vital, Edward M; Yee, Chee-Seng; Edwards, Christopher J; Teh, Lee-Suan; Akil, Mohammed; McHugh, Neil J; Zoma, Asad; Bruce, Ian N
2018-01-01
Objectives To describe the baseline characteristics of SLE patients requiring biologic therapy in the UK and to explore short term efficacy and infection rates associated with rituximab (RTX) use. Methods Patients commencing biologic therapy for refractory SLE and who consented to join BILAG-BR were analysed. Baseline characteristics, disease activity (BILAG 2004/SLEDAI-2K) and rates of infection over follow-up were analysed. Response was defined as loss of all A and B BILAG scores to ⩽ 1 B score with no new A/B scores in other organ systems at 6 months. Results Two hundred and seventy SLE patients commenced biologic therapy from September 2010 to September 2015, most commonly RTX (n = 261). Two hundred and fifty (93%) patients were taking glucocorticoids at baseline at a median [interquartile range (IQR)] oral dose of 10 mg (5–20 mg) daily. Response rates at 6 months were available for 68% of patients. The median (IQR) BILAG score was 15 (10–23) at baseline and 3 (2–12) at 6 months (P < 0.0001). The median (IQR) SLEDAI-2K reduced from 8 (5–12) to 4 (0–7) (P < 0.001). Response was achieved in 49% of patients. There was also a reduction in glucocorticoid use to a median (IQR) dose of 7.5 mg (5–12 mg) at 6 months (P < 0.001). Serious infections occurred in 26 (10%) patients, being more frequent in the first 3 months post-RTX therapy. A higher proportion of early infections were non-respiratory (odds ratio = 1.98, 95% CI: 0.99, 3.9; P = 0.049). Conclusion RTX is safe and is associated with improvement in disease activity in refractory SLE patients with concomitant reductions in glucocorticoid use. Early vigilance for infection post-infusion is important to further improve treatment risks and benefits.
Associations between Deceased-Donor Urine MCP-1 and Kidney Transplant Outcomes.
Mansour, S G; Puthumana, J; Reese, P P; Hall, I E; Doshi, M D; Weng, F L; Schröppel, B; Thiessen-Philbrook, H; Bimali, M; Parikh, C R
2017-07-01
Existing methods to predict recipient allograft function during deceased-donor kidney procurement are imprecise. Understanding the potential renal reparative role for monocyte chemoattractant protein-1 (MCP-1), a cytokine involved in macrophage recruitment after injury, might help predict allograft outcomes. We conducted a sub-study of the multicenter prospective Deceased Donor Study cohort, which evaluated deceased kidney donors from five organ procurement organizations from May 2010 to December 2013. We measured urine MCP-1 (uMCP-1) concentrations from donor samples collected at nephrectomy to determine associations with donor acute kidney injury (AKI), recipient delayed graft function (DGF), 6-month estimated GFR (eGFR), and graft failure. We also assessed perfusate MCP-1 concentrations from pumped kidneys for associations with DGF and 6-month eGFR. AKI occurred in 111 (9%) donors. Median (interquartile range) uMCP-1 concentration was higher in donors with AKI compared to donors without AKI (1.35 [0.41-3.93] ng/ml vs. 0.32 [0.11-0.80] ng/ml, p<0.001). DGF occurred in 756 (31%) recipients, but uMCP-1 was not independently associated with DGF. Higher donor uMCP-1 concentrations were independently associated with higher 6-month eGFR in those without DGF [0.77 (0.10, 1.45) ml/min/1.73 m² per doubling of uMCP-1]. However, there were no independent associations between uMCP-1 and graft failure over a median follow-up of about 2 years. Lastly, perfusate MCP-1 concentrations significantly increased during pump perfusion but were not associated with DGF or 6-month eGFR. Donor uMCP-1 concentrations were modestly associated with higher recipient 6-month eGFR in those without DGF. However, the results suggest that donor uMCP-1 has minimal clinical utility given no associations with graft failure.
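The "per doubling of uMCP-1" estimate above is the usual way such associations are obtained: the biomarker enters the model on a log2 scale, so its coefficient is the change in outcome per doubling of the concentration. A minimal sketch with simulated data; the variable names are hypothetical, not the study's code.

```python
# Sketch of a "per doubling" biomarker association: regress the outcome on
# log2 of the concentration so the coefficient is the change per doubling.
# Data and variable names here are simulated/hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
umcp1 = rng.lognormal(mean=-1.0, sigma=1.0, size=n)            # ng/ml
egfr_6mo = 55 + 0.8 * np.log2(umcp1) + rng.normal(0, 10, n)    # ml/min/1.73 m²

df = pd.DataFrame({"umcp1": umcp1, "egfr_6mo": egfr_6mo})
fit = smf.ols("egfr_6mo ~ np.log2(umcp1)", data=df).fit()

# Coefficient on log2(uMCP-1): change in 6-month eGFR per doubling of uMCP-1.
coef = fit.params["np.log2(umcp1)"]
lo, hi = fit.conf_int().loc["np.log2(umcp1)"]
print(f"{coef:.2f} ({lo:.2f}, {hi:.2f}) per doubling")
```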
First experience of liver transplantation with type 2 donation after cardiac death in France.
Savier, Eric; Dondero, Federica; Vibert, Eric; Eyraud, Daniel; Brisson, Hélène; Riou, Bruno; Fieux, Fabienne; Naili-Kortaia, Salima; Castaing, Denis; Rouby, Jean-Jacques; Langeron, Olivier; Dokmak, Safi; Hannoun, Laurent; Vaillant, Jean-Christophe
2015-05-01
Organ donation after unexpected cardiac death [type 2 donation after cardiac death (DCD)] has been authorized in France since 2006. Following the Spanish experience, a national protocol was established to perform liver transplantation (LT) with type 2 DCD donors. After the declaration of death, abdominal normothermic oxygenated recirculation was used to perfuse and oxygenate the abdominal organs until harvesting and cold storage. Such grafts were proposed to consenting patients < 65 years old with liver cancer and without any hepatic insufficiency. Between 2010 and 2013, 13 LTs were performed in 3 French centers. Six patients had a rapid and uneventful postoperative recovery. However, primary nonfunction occurred in 3 patients, with each requiring urgent retransplantation, and 4 early allograft dysfunctions were observed. One patient developed a nonanastomotic biliary stricture after 3 months, whereas 8 patients showed no sign of ischemic cholangiopathy at their 1-year follow-up. In comparison with a control group of patients receiving grafts from brain-dead donors (n = 41), donor age and cold ischemia time were significantly lower in the type 2 DCD group. Time spent on the national organ wait list tended to be shorter in the type 2 DCD group: 7.5 months [interquartile range (IQR), 4.0-11.0 months] versus 12.0 months (IQR, 6.8-16.7 months; P = 0.08). The 1-year patient survival rates were similar (85% in the type 2 DCD group versus 93% in the control group), but the 1-year graft survival rate was significantly lower in the type 2 DCD group (69% versus 93%; P = 0.03). In conclusion, to treat borderline hepatocellular carcinoma, LT with type 2 DCD donors is possible as long as strict donor selection is observed. © 2015 American Association for the Study of Liver Diseases.
Barros, Xoana; Rodríguez, Nestor Y; Fuster, David; Rodas, Lida; Esforzado, Nuria; Mazza, Alberto; Rubello, Domenico; Campos, Francisco; Tapias, Andrés; Torregrosa, José-Vicente
2016-10-01
Vitamin D deficiency is prevalent in kidney transplant recipients (KTR) and recommendations on how to replenish vitamin D stores are scarce. To evaluate, in KTR, the safety and efficacy of calcifediol in two different vitamin D supplementation regimens, in order to assess the most suitable dose. Prospective observational study with two calcifediol supplementation regimens randomly prescribed by clinicians in liquid form, at 266 mcg doses, monthly or biweekly. We analyzed 168 KTR with a functioning allograft for more than 6 months. Patients receiving other vitamin D forms, calcimimetics or bisphosphonates were excluded. Before calcifediol initiation (pre-treatment levels) and after at least 3 months of treatment (post-treatment levels), we measured serum levels of 25-OH vitamin D (25(OH)D), parathyroid hormone (PTH), alkaline phosphatase (ALP), calcium (sCa), phosphate (sPO4) and creatinine (sCreat). In the monthly group (n = 72), 25(OH)D levels increased from 14 ng/ml [interquartile range, IQR 9-22] at baseline to 31 [20-38] (p < 0.001), PTH decreased from 124 pg/ml [87-172] to 114 [78-163] (p = 0.006), while sCa and sPO4 remained stable. In the biweekly group (n = 96), 25(OH)D increased from 14 ng/ml [9-20] at baseline to 39 [28-52] (p < 0.001), PTH decreased from 141 pg/ml [95-221] to 112 [90-180] (p < 0.001), sCa remained stable and sPO4 increased from 3.3 ± 0.6 mg/dl to 3.5 ± 0.6 (p = 0.003). Renal function remained stable in both groups. Vitamin D repletion with oral calcifediol, in a biweekly or monthly regimen, is safe and effective in improving 25(OH)D blood levels and in decreasing PTH in kidney transplant recipients.
Vervoort, Ajmw; van der Voet, L F; Hehenkamp, Wjk; Thurkow, A L; van Kesteren, Pjm; Quartero, H; Kuchenbecker, W; Bongers, M; Geomini, P; de Vleeschouwer, Lhm; van Hooff, Mha; van Vliet, H; Veersema, S; Renes, W B; Oude Rengerink, K; Zwolsman, S E; Brölmann, Ham; Mol, Bwj; Huirne, Jaf
2018-02-01
To compare the effectiveness of a hysteroscopic niche resection versus no treatment in women with postmenstrual spotting and a uterine caesarean scar defect. Multicentre randomised controlled trial. Eleven hospitals collaborating in a consortium for women's health research in the Netherlands. Women reporting postmenstrual spotting after a caesarean section who had a niche with a residual myometrium of ≥3 mm, measured during sonohysterography. Women were randomly allocated to hysteroscopic niche resection or expectant management for 6 months. The primary outcome was the number of days of postmenstrual spotting 6 months after randomisation. Secondary outcomes were spotting at the end of menstruation, intermenstrual spotting, dysuria, sonographic niche measurements, surgical parameters, quality of life, women's satisfaction, sexual function, and additional therapy. Outcomes were measured at 3 months and, except for niche measurements, also at 6 months after randomisation. We randomised 52 women to hysteroscopic niche resection and 51 women to expectant management. The median number of days of postmenstrual spotting at baseline was 8 days in both groups. At 6 months after randomisation, the median number of days of postmenstrual spotting was 4 days (interquartile range, IQR 2-7 days) in the intervention group and 7 days (IQR 3-10 days) in the control group (P = 0.04); on a scale of 0-10, discomfort as a result of spotting had a median score of 2 (IQR 0-7) in the intervention group, compared with 7 (IQR 0-8) in the control group (P = 0.02). In women with a niche with a residual myometrium of ≥3 mm, hysteroscopic niche resection reduced postmenstrual spotting and spotting-related discomfort. A hysteroscopic niche resection is an effective treatment to reduce niche-related spotting. © 2017 The Authors. BJOG An International Journal of Obstetrics and Gynaecology published by John Wiley & Sons Ltd on behalf of Royal College of Obstetricians and Gynaecologists.
Guidet, Bertrand; Leblanc, Guillaume; Simon, Tabassome; Woimant, Maguy; Quenot, Jean-Pierre; Ganansia, Olivier; Maignan, Maxime; Yordanov, Youri; Delerme, Samuel; Doumenc, Benoit; Fartoukh, Muriel; Charestan, Pierre; Trognon, Pauline; Galichon, Bertrand; Javaud, Nicolas; Patzak, Anabela; Garrouste-Orgeas, Maïté; Thomas, Caroline; Azerad, Sylvie; Pateron, Dominique; Boumendil, Ariane
2017-10-17
The high mortality rate in critically ill elderly patients has led to questioning of the beneficial effect of intensive care unit (ICU) admission and to a variable ICU use among this population. To determine whether a recommendation for systematic ICU admission in critically ill elderly patients reduces 6-month mortality compared with usual practice. Multicenter, cluster-randomized clinical trial of 3037 critically ill patients aged 75 years or older, free of cancer, with preserved functional status (Index of Independence in Activities of Daily Living ≥4) and nutritional status (absence of cachexia) who arrived at the emergency department of one of 24 hospitals in France between January 2012 and April 2015 and were followed up until November 2015. Centers were randomly assigned either to use a program to promote systematic ICU admission of patients (n=1519 participants) or to follow standard practice (n=1518 participants). The primary outcome was death at 6 months. Secondary outcomes included ICU admission rate, in-hospital death, functional status, and quality of life (12-Item Short Form Health Survey, ranging from 0 to 100, with higher score representing better self-reported health) at 6 months. One patient withdrew consent, leaving 3036 patients included in the trial (median age, 85 [interquartile range, 81-89] years; 1361 [45%] men). Patients in the systematic strategy group had an increased risk of death at 6 months (45% vs 39%; relative risk [RR], 1.16; 95% CI, 1.07-1.26) despite an increased ICU admission rate (61% vs 34%; RR, 1.80; 95% CI, 1.66-1.95). After adjustments for baseline characteristics, patients in the systematic strategy group were more likely to be admitted to an ICU (RR, 1.68; 95% CI, 1.54-1.82) and had a higher risk of in-hospital death (RR, 1.18; 95% CI, 1.03-1.33) but had no significant increase in risk of death at 6 months (RR, 1.05; 95% CI, 0.96-1.14). Functional status and physical quality of life at 6 months were not significantly different between groups. Among critically ill elderly patients in France, a program to promote systematic ICU admission increased ICU use but did not reduce 6-month mortality. Additional research is needed to understand the decision to admit elderly patients to the ICU. clinicaltrials.gov Identifier: NCT01508819.
Fusobacterial liver abscess: a case report and review of the literature.
Jayasimhan, Dilip; Wu, Linus; Huggan, Paul
2017-06-20
Fusobacteriae are facultative anaerobic gram-negative bacilli which cause a range of invasive infections, amongst which pyogenic liver abscesses are rare. We describe a case of Fusobacterium nucleatum liver abscess and review the relevant literature. A 51-year-old woman presented with a 4-day history of abdominal pain, diarrhoea, fever, rigors, and lethargy. Imaging revealed an abscess which was drained. Cultures of the blood and abscess aspirate grew Fusobacterium nucleatum and Prevotella pleuritidis, respectively. She achieved full recovery following treatment. A MEDLINE search was undertaken using free-text and Medical Subject Headings (MeSH), keywords "Fusobacterium" and "Liver abscess". Non-English language reports and cases without confirmed growth of Fusobacterium species were excluded. Additional cases were identified by surveying the references of each report and by using the same keywords in a web-based search. Forty-eight cases were identified, 41 in men. The median age was 42.5, with an interquartile range of 33. F. nucleatum and F. necrophorum were involved in 22 cases each, and 4 cases were not further speciated. Among cases of F. nucleatum liver abscess, nine were attributed to periodontal disease, four to lower gastrointestinal tract disease, one to Lemierre's Syndrome, and eight were considered cryptogenic. All patients treated made a full recovery. Antimicrobial treatment duration ranged from 2 weeks to 6 months with a median of 6 weeks. Fusobacterium nucleatum is an uncommon cause of liver abscess generally associated with good clinical outcomes with contemporary medical and surgical care.
Caspar, Thibault; Fichot, Marie; Ohana, Mickaël; El Ghannudi, Soraya; Morel, Olivier; Ohlmann, Patrick
2017-08-01
Acute myocarditis (AM) often involves the left ventricular (LV) subepicardium that might be displayed by cardiac magnetic resonance even late after the acute phase. In the absence of global or regional LV dysfunction, conventional transthoracic echocardiography (TTE) does not accurately identify tissue sequelae of AM. We sought to evaluate the diagnostic value of two-dimensional (2D) and three-dimensional (3D) speckle-tracking echocardiography to identify patients with a history of AM with preserved LV ejection fraction (LVEF). Fifty patients (group 1: age, 31.4 ± 10.5 years; 76% males) with a history of cardiac magnetic resonance-confirmed diagnosis of AM (according to the Lake Louise criteria) were retrospectively identified and then (21.7 ± 23.4 months later) evaluated by complete echocardiography including 2D and 3D speckle-tracking analysis, as well as 50 age- and gender-matched healthy controls (group 2: age, 31.2 ± 9.5 years: 76% males). Patients with a history of severe clinical presentation of AM (sudden death, ventricular arrhythmia, heart failure, alteration of LVEF) were excluded. At diagnosis, peak troponin and C-reactive protein were 11.97 (interquartile range, 4.52-25.92) μg/L and 32.3 (interquartile range, 14.85-70.45) mg/L, respectively. Mean delay between acute phase and follow-up study TTE was 21.7 ± 23.4 months. LVEF was not statistically different between groups (62.1% vs 63.5%, P = .099). Two-dimensional global longitudinal strain (GLS) was lower in magnitude in group 1 (-17.8% vs -22.1%, P < .0001) as were 2D layer-specific subepicardial GLS (-15.4% vs -19.7%, P < .0001) and subendocardial GLS (-20.71% vs -25.08%, P < .0001). Three-dimensional global longitudinal, circumferential, area, and radial strains were lower in magnitude in group 1 (-11.80% vs -14.98%, P < .0001; -12.57% vs -15.12%, P < .0001; -22.28% vs -25.87%, P < .0001; 31.47% vs 38.06%, P < .0001, respectively). Receiver operating characteristic curve analysis showed that subepicardial GLS displayed a better diagnostic performance to detect sequelae of AM as compared with GLS (area under the curve = 0.97 vs 0.93, P = .045). In patients with a history of AM, a subtle LV dysfunction can be detected by 2D and 3D speckle-tracking echocardiography, even though LVEF is conserved, adding incremental information over conventional TTE. Copyright © 2017 American Society of Echocardiography. Published by Elsevier Inc. All rights reserved.
Emery, Joanne L; Coleman, Tim; Sutton, Stephen; Cooper, Sue; Leonardi-Bee, Jo; Jones, Matthew; Naughton, Felix
2018-04-19
Smoking in pregnancy is a major public health concern. Pregnant smokers are particularly difficult to reach, with low uptake of support options and few effective interventions. Text message-based self-help is a promising, low-cost intervention for this population, but its real-world uptake is largely unknown. The objective of this study was to explore the uptake and cost-effectiveness of a tailored, theory-guided, text message intervention for pregnant smokers ("MiQuit") when advertised on the internet. Links to a website providing MiQuit initiation information (texting a short code) were advertised on a cost-per-click basis on 2 websites (Google Search and Facebook; £1000 budget each) and free of charge within smoking-in-pregnancy webpages on 2 noncommercial websites (National Childbirth Trust and NHS Choices). Daily budgets were capped to allow the Google and Facebook adverts to run for 1 and 3 months, respectively. We recorded the number of times adverts were shown and clicked on, the number of MiQuit initiations, the characteristics of those initiating MiQuit, and whether support was discontinued prematurely. For the commercial adverts, we calculated the cost per initiation and, using quit rates obtained from an earlier clinical trial, estimated the cost per additional quitter. With equal capped budgets, there were 812 and 1889 advert clicks to the MiQuit website from Google (search-based) and Facebook (banner) adverts, respectively. MiQuit was initiated by 5.2% (42/812) of those clicking via Google (95% CI 3.9%-6.9%) and 2.22% (42/1889) of those clicking via Facebook (95% CI 1.65%-2.99%). Adverts on noncommercial webpages generated 53 clicks over 6 months, with 9 initiations (9/53, 17%; 95% CI 9%-30%). For the commercial websites combined, mean cost per initiation was £24.73; estimated cost per additional quitter, including text delivery costs, was £735.86 (95% CI £227.66-£5223.93). Those initiating MiQuit via Google were typically very early in pregnancy (median gestation 5 weeks, interquartile range 10 weeks); those initiating via Facebook were distributed more evenly across pregnancy (median gestation 16 weeks, interquartile range 14 weeks). Commercial online adverts are a feasible, likely cost-effective method for engaging pregnant smokers in digital cessation support and may generate uptake at a faster rate than noncommercial websites. As a strategy for implementing MiQuit, online advertising has large reach potential and can offer support to a hard-to-reach population of smokers. ©Joanne L Emery, Tim Coleman, Stephen Sutton, Sue Cooper, Jo Leonardi-Bee, Matthew Jones, Felix Naughton. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 19.04.2018.
Coleman, Tim; Sutton, Stephen; Cooper, Sue; Leonardi-Bee, Jo; Jones, Matthew; Naughton, Felix
2018-01-01
Background Smoking in pregnancy is a major public health concern. Pregnant smokers are particularly difficult to reach, with low uptake of support options and few effective interventions. Text message–based self-help is a promising, low-cost intervention for this population, but its real-world uptake is largely unknown. Objective The objective of this study was to explore the uptake and cost-effectiveness of a tailored, theory-guided, text message intervention for pregnant smokers (“MiQuit”) when advertised on the internet. Methods Links to a website providing MiQuit initiation information (texting a short code) were advertised on a cost-per-click basis on 2 websites (Google Search and Facebook; £1000 budget each) and free of charge within smoking-in-pregnancy webpages on 2 noncommercial websites (National Childbirth Trust and NHS Choices). Daily budgets were capped to allow the Google and Facebook adverts to run for 1 and 3 months, respectively. We recorded the number of times adverts were shown and clicked on, the number of MiQuit initiations, the characteristics of those initiating MiQuit, and whether support was discontinued prematurely. For the commercial adverts, we calculated the cost per initiation and, using quit rates obtained from an earlier clinical trial, estimated the cost per additional quitter. Results With equal capped budgets, there were 812 and 1889 advert clicks to the MiQuit website from Google (search-based) and Facebook (banner) adverts, respectively. MiQuit was initiated by 5.2% (42/812) of those clicking via Google (95% CI 3.9%-6.9%) and 2.22% (42/1889) of those clicking via Facebook (95% CI 1.65%-2.99%). Adverts on noncommercial webpages generated 53 clicks over 6 months, with 9 initiations (9/53, 17%; 95% CI 9%-30%). For the commercial websites combined, mean cost per initiation was £24.73; estimated cost per additional quitter, including text delivery costs, was £735.86 (95% CI £227.66-£5223.93). Those initiating MiQuit via Google were typically very early in pregnancy (median gestation 5 weeks, interquartile range 10 weeks); those initiating via Facebook were distributed more evenly across pregnancy (median gestation 16 weeks, interquartile range 14 weeks). Conclusions Commercial online adverts are a feasible, likely cost-effective method for engaging pregnant smokers in digital cessation support and may generate uptake at a faster rate than noncommercial websites. As a strategy for implementing MiQuit, online advertising has large reach potential and can offer support to a hard-to-reach population of smokers. PMID:29674308
Atar, Shaul; Tolstrup, Kirsten; Cercek, Bojan; Siegel, Robert J
2007-07-01
Chlamydia pneumoniae has previously been associated with higher prevalence of valvular and cardiac calcifications. To investigate a possible association between seropositivity for C. pneumoniae and the presence of cardiac calcifications (mitral annular or aortic root calcification, and aortic valve sclerosis). We retrospectively analyzed serological data (immunoglobulin G TWAR antibodies) from the AZACS trial (Azithromycin in Acute Coronary Syndromes), and correlated the serological findings according to titer levels with the presence of cardiac calcifications as detected by transthoracic echocardiography. In 271 patients, age 69 +/- 13 years, who underwent both serological and echocardiographic evaluation, we found no significant difference in the "calcification sum score" (on a scale of 0-3) between seropositive and seronegative patients (1.56 +/- 1.15 vs. 1.35 +/- 1.15, respectively, P = 0.26). The median calcification sum score was 1 (interquartile range 0-3) for the seronegative group, and 2 (interquartile range 0-3) for the seropositive group (P = 0.2757). In addition, we did not find a significant correlation between any of the individual sites of cardiac calcification and C. pneumoniae seropositivity. Our findings suggest that past C. pneumoniae infection may not be associated with the pathogenesis of valvular and cardiac calcifications.
Shaw, Leslee J.; Berman, Daniel S.; Picard, Michael H.; Friedrich, Matthias G.; Kwong, Raymond Y.; Stone, Gregg W.; Senior, Roxy; Min, James K.; Hachamovitch, Rory; Scherrer-Crosbie, Marielle; Mieres, Jennifer H.; Marwick, Thomas H.; Phillips, Lawrence M.; Chaudhry, Farooq A.; Pellikka, Patricia A.; Slomka, Piotr; Arai, Andrew E.; Iskandrian, Ami E.; Bateman, Timothy M.; Heller, Gary V.; Miller, Todd D.; Nagel, Eike; Goyal, Abhinav; Borges-Neto, Salvador; Boden, William E.; Reynolds, Harmony R.; Hochman, Judith S.; Maron, David J.; Douglas, Pamela S.
2014-01-01
The lack of standardized reporting of the magnitude of ischemia on noninvasive imaging contributes to variability in translating the severity of ischemia across stress imaging modalities. We identified the risk of coronary artery disease (CAD) death or myocardial infarction (MI) associated with ≥10% ischemic myocardium on stress nuclear imaging as the risk threshold for stress echocardiography and cardiac magnetic resonance. A narrative review revealed that ≥10% ischemic myocardium on stress nuclear imaging was associated with a median rate of CAD death or MI of 4.9%/year (interquartile range: 3.75% to 5.3%). For stress echocardiography, ≥3 newly dysfunctional segments portend a median rate of CAD death or MI of 4.5%/year (interquartile range: 3.8% to 5.9%). Although imprecisely delineated, moderate-severe ischemia on cardiac magnetic resonance may be indicated by ≥4 of 32 stress perfusion defects or ≥3 dobutamine-induced dysfunctional segments. Risk-based thresholds can define equivalent amounts of ischemia across the stress imaging modalities, which will help to translate a common understanding of patient risk on which to guide subsequent management decisions. PMID:24925328
Issar, Tushar; Arnold, Ria; Kwai, Natalie C G; Pussell, Bruce A; Endre, Zoltan H; Poynten, Ann M; Kiernan, Matthew C; Krishnan, Arun V
2018-05-01
To demonstrate construct validity of the Total Neuropathy Score (TNS) in assessing peripheral neuropathy in subjects with chronic kidney disease (CKD). 113 subjects with CKD and 40 matched controls were assessed for peripheral neuropathy using the TNS. An exploratory factor analysis was conducted and internal consistency of the scale was evaluated using Cronbach's alpha. Construct validity of the TNS was tested by comparing scores between case and control groups. Factor analysis revealed valid item correlations and internal consistency of the TNS was good with a Cronbach's alpha of 0.897. Subjects with CKD scored significantly higher on the TNS (CKD: median, 6, interquartile range, 1-13; controls: median, 0, interquartile range, 0-1; p < 0.001). Subgroup analysis revealed construct validity was maintained for subjects with stages 3-5 CKD with and without diabetes. The TNS is a valid measure of peripheral neuropathy in patients with CKD. The TNS is the first neuropathy scale to be formally validated in patients with CKD. Copyright © 2018 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
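Cronbach's alpha, used above to quantify internal consistency, is a simple function of the item variances and the variance of the total score. A small illustrative sketch with simulated item data (not the study's data) is shown below.

```python
# Sketch of computing Cronbach's alpha for a multi-item scale such as the TNS.
# The item matrix is simulated; rows are subjects, columns are scale items.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: 2-D array, shape (n_subjects, n_items)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the total score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(1)
latent = rng.normal(size=(100, 1))                      # common severity factor
items = latent + rng.normal(scale=0.5, size=(100, 7))   # 7 correlated items
print(round(cronbach_alpha(items), 3))
```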
The Effect of Manual Restraint on Physiological Parameters in Barred Owls (Strix varia).
Doss, Grayson A; Mans, Christoph
2017-03-01
Manual restraint is commonly necessary when working with avian species in medical, laboratory, and field settings. Despite its prevalence, little is known about the stress response in raptorial bird species. To further understand the effect of restraint on the stress response in birds of prey, 12 barred owls (Strix varia) were manually restrained for 15 minutes. Physiological parameters (cloacal temperature, respiratory rate, heart rate) were followed over time and recorded at defined points during the restraint period. Heart rate decreased significantly over the restraint period (mean ± SD change, -73 ± 46 beats/min). Respiratory rate also decreased significantly (median change: -11 breaths/min, interquartile range: -8 to -18). Cloacal temperature increased significantly over time in manually restrained owls (median: +1.5°C [+2.7°F], interquartile range: 1.3°C-2.1°C [2.3°F-3.8°F]). This study is the first to document stress hyperthermia in an owl species. Similar to another raptorial bird, the red-tailed hawk (Buteo jamaicensis), both heart rate and respiratory rate decreased and cloacal temperature increased over time in restrained barred owls. Barred owls appear to cope with restraint stress differently than psittacine species.
Differences in sleep architecture between left and right temporal lobe epilepsy.
Nakamura, Miki; Jin, Kazutaka; Kato, Kazuhiro; Itabashi, Hisashi; Iwasaki, Masaki; Kakisaka, Yosuke; Nakasato, Nobukazu
2017-01-01
To investigate whether seizure lateralization affects sleep macrostructure in patients with left and right temporal lobe epilepsy (TLE), as rapid eye movement (REM) sleep is shorter in patients with right hemispheric cerebral infarction than with left. We retrospectively analyzed data from 16 patients with TLE (6 men and 10 women aged 34.9 ± 11.4 years) who underwent polysomnography as well as long-term video electroencephalography. Ten patients were diagnosed with left TLE and six patients with right TLE. Sleep stages and respiratory events were scored based on the American Academy of Sleep Medicine criteria. Sleep and respiratory parameters were compared between the patient groups. Percentage of REM stage sleep was significantly (p < 0.05) lower in patients with left TLE (median 8.8 %, interquartile range 5.5-13.8 %) than in patients with right TLE (median 17.0 %, interquartile range 14.1-18.3 %). The other parameters showed no significant differences. Shorter REM sleep in patients with left TLE sharply contrasts with the previous report of shorter REM sleep in patients with right cerebral infarction. Laterality of the irritative epileptic focus versus destructive lesion may have different effects on the sleep macrostructures.
van der Woude, Olga C P; Cuper, Natascha J; Getrouw, Chavalleh; Kalkman, Cor J; de Graaff, Jurgen C
2013-06-01
Poor vein visibility can make IV cannulation challenging in children with dark skin color. In the operating room, we studied the effectiveness of a near-infrared vascular imaging device (VascuLuminator) to facilitate IV cannulation in children with dark skin color. In the operating room of a general hospital in Curacao, all consecutive children (0-15 years of age) requiring IV cannulation were included in a pragmatic cluster randomized clinical trial. The VascuLuminator was made available to anesthesiologists at the operating complex in randomized clusters of 1 week. Success at first attempt was 63% (27/43, 95% confidence interval [CI], 47%-77%) in the VascuLuminator group vs 51% (23/45, 95% CI, 36%-66%) in the control group (P = 0.27). Median time to successful cannulation was 53 seconds (interquartile range: 34-154) in the VascuLuminator group and 68 seconds (interquartile range: 40-159) in the control group (P = 0.54), and the hazard ratio was 1.12 (95% CI, 0.73-1.71). The VascuLuminator has limited value in improving first-attempt success of IV cannulation in children with dark skin color.
Multidrug-resistant tuberculosis around the world: what progress has been made?
Mirzayev, Fuad; Wares, Fraser; Baena, Inés Garcia; Zignol, Matteo; Linh, Nguyen; Weyer, Karin; Jaramillo, Ernesto; Floyd, Katherine; Raviglione, Mario
2015-01-01
Multidrug-resistant tuberculosis (MDR-TB) (resistance to at least isoniazid and rifampicin) will influence the future of global TB control. 88% of estimated MDR-TB cases occur in middle- or high-income countries, and 60% occur in Brazil, China, India, the Russian Federation and South Africa. The World Health Organization collects country data annually to monitor the response to MDR-TB. Notification, treatment enrolment and outcome data were summarised for 30 countries, accounting for >90% of the estimated MDR-TB cases among notified TB cases worldwide. In 2012, a median of 14% (interquartile range 6–50%) of estimated MDR-TB cases were notified in the 30 countries studied. In 15 of the 30 countries, the number of patients treated for MDR-TB in 2012 (71 681) was >50% higher than in 2011. Median treatment success was 53% (interquartile range 40–70%) in the 25 countries reporting data for 30 021 MDR-TB cases who started treatment in 2010. Although progress has been noted in the expansion of MDR-TB care, urgent efforts are required in order to provide wider access to diagnosis and treatment in most countries with the highest burden of MDR-TB. PMID:25261327
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, Daniel M., E-mail: daniel.green@stjude.org; Merchant, Thomas E.; Billups, Catherine A.
2015-09-01
Purpose: The treatment of children with embryonal brain tumors (EBT) includes craniospinal irradiation (CSI). There are limited data regarding the effect of CSI on pulmonary function. Methods: Protocol SJMB03 enrolled patients 3 to 21 years of age with EBT. Pulmonary function tests (PFTs) (forced expiratory volume in 1 second [FEV1] and forced vital capacity [FVC] by spirometry, total lung capacity [TLC] by nitrogen washout or plethysmography, and diffusing capacity of the lung for carbon monoxide corrected for hemoglobin [DLCOcorr]) were obtained. Differences between PFTs obtained immediately after the completion of CSI and 24 or 60 months after the completion of treatment (ACT) were compared using exact Wilcoxon signed-rank tests and repeated-measures models. Results: Between June 24, 2003, and March 1, 2010, 303 eligible patients (spine dose: ≤2345 cGy, 201; >2345 cGy, 102; proton beam, 20) were enrolled, 260 of whom had at least 1 PFT. The median age at diagnosis was 8.9 years (range, 3.1-20.4 years). The median thoracic spinal radiation dose was 23.4 Gy (interquartile range [IQR], 23.4-36.0 Gy). The median cyclophosphamide dose was 16.0 g/m² (IQR, 15.7-16.0 g/m²). At 24 and 60 months ACT, DLCOcorr was <75% predicted in 23% (27/118) and 25% (21/84) of patients, FEV1 was <80% predicted in 20% (34/170) and 29% (32/109) of patients, FVC was <80% predicted in 27% (46/172) and 28% (30/108) of patients, and TLC was <75% predicted in 9% (13/138) and 11% (10/92) of patients. DLCOcorr was significantly decreased 24 months ACT (median difference [MD] in % predicted, 3.00%; P=.028) and 60 months ACT (MD in % predicted, 6.00%; P=.033) compared with the end of radiation therapy. These significant decreases in DLCOcorr were also observed in repeated-measures models (P=.011 and P=.032 at 24 and 60 months ACT, respectively). Conclusions: A significant minority of EBT survivors experience PFT deficits after CSI. Continued monitoring of this cohort is planned.
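The paired comparisons above (end of radiation therapy versus 24 or 60 months after completion of therapy) rely on the Wilcoxon signed-rank test. A minimal sketch with invented percent-predicted DLCOcorr values, not the study's data:

```python
# Sketch of the paired Wilcoxon signed-rank comparison used for the pulmonary
# function endpoints above. The percent-predicted DLCOcorr values are invented.
import numpy as np
from scipy import stats

dlco_end_rt = np.array([88, 92, 75, 81, 95, 70, 84, 78, 90, 86])  # % predicted
dlco_24mo   = np.array([82, 90, 70, 79, 88, 66, 80, 74, 85, 83])

diff = dlco_end_rt - dlco_24mo
res = stats.wilcoxon(dlco_end_rt, dlco_24mo)   # paired, two-sided by default
print("median paired difference (% predicted):", np.median(diff))
print("Wilcoxon signed-rank p-value:", res.pvalue)
```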
Petranova, T; Boyanov, M; Shinkov, A; Petkova, R; Intorcia, M; Psachoulia, E
2017-12-21
Persistence with osteoporosis therapy is critical for fracture risk reduction. This observational study evaluated medication-taking behaviour of women with postmenopausal osteoporosis receiving denosumab or oral ibandronate in real-world clinical practice in Bulgaria. Compared with ibandronate, denosumab was associated with a lower discontinuation rate and greater increases in bone mineral density. Persistence with osteoporosis therapy is critical for fracture risk reduction and the effectiveness of such treatments may be reduced by low persistence. Alternative therapies such as denosumab may improve persistence. This study aimed to describe medication-taking behaviour in women with osteoporosis, prescribed denosumab or oral ibandronate, in Bulgarian clinical practice. This retrospective, observational, multicentre chart review (with up to 24 months follow-up) enrolled postmenopausal women initiating 6-monthly denosumab injection or monthly oral ibandronate treatment for osteoporosis between 1 October 2011 and 30 September 2012. Overall, 441 women were enrolled (224 had initiated denosumab, 217 had initiated ibandronate). At baseline, more women in the denosumab group than in the ibandronate group had a previous fracture (25.5% vs 17.5%; p = 0.043) and past exposure to osteoporosis therapy (19.6% vs 12.0%; p = 0.028). At 24 months, 4.5% of women receiving denosumab had discontinued therapy compared with 56.2% of women receiving ibandronate. Median time to discontinuation was longer in the denosumab group (729 days; interquartile range (IQR), 728.3-729.0) than in the ibandronate group (367 days; IQR, 354.0-484.8; p < 0.001). At 24 months, there were significantly greater changes in BMD T-scores at the lumbar spine (p < 0.001) and femoral neck (p < 0.001) in patients receiving denosumab than in those receiving ibandronate. At 24 months, persistence with denosumab was 98.7%. This real-world study demonstrates there is a low discontinuation rate and high persistence with denosumab. Denosumab was associated with greater BMD increases than ibandronate, which could reduce fracture risk.
Sato, Noriaki; Origuchi, Hideki; Yamamoto, Umpei; Takanaga, Yasuhiro; Mohri, Masahiro
2012-09-01
Supervised cardiac rehabilitation provided at dedicated centres ameliorates exercise intolerance in patients with chronic heart failure. To correlate the amount of physical activity outside the hospital with improved exercise tolerance in patients with limited access to centre-based programs. Forty patients (median age 69 years) with stable heart failure due to systolic left ventricular dysfunction participated in cardiac rehabilitation once per week for five months. Using a validated single-axial accelerometer, the number of steps and physical activity-related energy expenditures on nonrehabilitation days were determined. Median (interquartile range) peak oxygen consumption was increased from 14.4 mL/kg/min (range 12.9 mL/kg/min to 17.8 mL/kg/min) to 16.4 mL/kg/min (range 13.9 mL/kg/min to 19.1 mL/kg/min); P<0.0001, in association with a decreased slope of the minute ventilation to carbon dioxide production plot (34.2 [range 31.3 to 38.1] versus 32.7 [range 30.3 to 36.5]; P<0.0001). Changes in peak oxygen consumption were correlated with the daily number of steps (P<0.01) and physical activity-related energy expenditures (P<0.05). Furthermore, these changes were significantly correlated with total exercise time per day and time spent for light (≤3 metabolic equivalents) exercise, but not with time spent for moderate/vigorous (>3 metabolic equivalents) exercise. The number of steps and energy expenditures outside the hospital were correlated with improved exercise capacity. An accelerometer may be useful for guiding home-based cardiac rehabilitation.
Riveros-Perez, Efrain; Wood, Cristina
2018-03-01
To assess the management and maternal outcomes of placenta accreta spectrum (PAS) disorders. A retrospective chart review was conducted of patients diagnosed with PAS disorders (placenta creta, increta, or percreta) who were treated at a US tertiary care center between February 1, 2011, and January 31, 2016. Obstetric management, anesthetic management, and maternal outcomes were analyzed. A total of 43 cases were identified; placenta previa was diagnosed among 33 (77%). Median age was 33 years (range 23-42). Median blood loss was 1500 mL (interquartile range 1000-2500); blood loss was greatest among the 10 patients with placenta percreta (3250 mL, interquartile range 2200-6000). Transfusion of blood products was necessary among 14 (33%) patients, with no difference in frequency according to the degree of placental invasion (P=0.107). Surgical complications occurred among 10 (23%) patients. Overall, 30 (70%) patients received combined spinal-epidural plus general anesthesia, 4 (9%) received only general anesthesia, and 9 (21%) underwent surgery with combined spinal-epidural anesthesia. One patient had a difficult airway and another had an accidental dural puncture. Placenta previa and accreta coexist in many patients, leading to substantial bleeding related to the degree of myometrial invasion. An interdisciplinary team approach, together with combined spinal-epidural anesthesia transitioning to general anesthesia, was advisable and safe. © 2017 International Federation of Gynecology and Obstetrics.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Wilcoxon-Mann-Whitney Test (a) When n and m are less than 21, use Table 1. In order to find the appropriate... trigger (Step 3). The interquartile range (R) is the difference between the quartiles M-1 and M1; these... baseline observations were obtained, calculate the median (M) of all baseline observations: Instructions...
Code of Federal Regulations, 2010 CFR
2010-07-01
... Wilcoxon-Mann-Whitney Test (a) When n and m are less than 21, use Table 1. In order to find the appropriate... trigger (Step 3). The interquartile range (R) is the difference between the quartiles M-1 and M1; these... baseline observations were obtained, calculate the median (M) of all baseline observations: Instructions...
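The two regulation excerpts above are truncated, but the quantities they reference — the median of the baseline observations, the interquartile range as the difference between quartiles, and a Wilcoxon-Mann-Whitney comparison — can be sketched generically as follows. This is ordinary statistics code with invented numbers, not the regulatory procedure itself.

```python
# Rough sketch of the building blocks referenced in the CFR excerpts above:
# median and interquartile range of baseline observations, and a
# Wilcoxon-Mann-Whitney (rank-sum) comparison of two samples.
import numpy as np
from scipy import stats

baseline   = np.array([4.1, 3.8, 5.0, 4.6, 3.9, 4.4, 5.2, 4.0])
compliance = np.array([5.1, 4.9, 5.6, 5.3, 4.8, 5.5, 5.0, 5.2])

median_b = np.median(baseline)
q1, q3 = np.percentile(baseline, [25, 75])
iqr = q3 - q1                                   # difference between quartiles

u_stat, p_value = stats.mannwhitneyu(baseline, compliance,
                                     alternative="two-sided")
print(median_b, iqr, u_stat, p_value)
```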
An Initial Evaluation of the Impact of Pokémon GO on Physical Activity.
Xian, Ying; Xu, Hanzhang; Xu, Haolin; Liang, Li; Hernandez, Adrian F; Wang, Tracy Y; Peterson, Eric D
2017-05-16
Pokémon GO is a location-based augmented reality game. Using GPS and the camera on a smartphone, the game requires players to travel in real world to capture animated creatures, called Pokémon. We examined the impact of Pokémon GO on physical activity (PA). A pre-post observational study of 167 Pokémon GO players who were self-enrolled through recruitment flyers or online social media was performed. Participants were instructed to provide screenshots of their step counts recorded by the iPhone Health app between June 15 and July 31, 2016, which was 3 weeks before and 3 weeks after the Pokémon GO release date. Of 167 participants, the median age was 25 years (interquartile range, 21-29 years). The daily average steps of participants at baseline was 5678 (SD, 2833; median, 5718 [interquartile range, 3675-7279]). After initiation of Pokémon GO, daily activity rose to 7654 steps (SD, 3616; median, 7232 [interquartile range, 5041-9744], pre-post change: 1976; 95% CI, 1494-2458, or a 34.8% relative increase [ P <0.001]). On average, 10 000 "XP" points (a measure of game progression) was associated with 2134 additional steps per day (95% CI, 1673-2595), suggesting a potential dose-response relationship. The number of participants achieving a goal of 10 000+ steps per day increased from 15.3% before to 27.5% after (odds ratio, 2.06; 95% CI, 1.70-2.50). Increased PA was also observed in subgroups, with the largest increases seen in participants who spent more time playing Pokémon GO, those who were overweight/obese, or those with a lower baseline PA level. Pokémon GO participation was associated with a significant increase in PA among young adults. Incorporating PA into gameplay may provide an alternative way to promote PA in persons who are attracted to the game. URL: http://www.clinicaltrials.gov. Unique identifier: NCT02888314. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
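The headline result above is a pre-post mean change in daily steps with a 95% confidence interval, which corresponds to a paired analysis of each participant's before and after step counts. A toy sketch with invented step counts, not the study's data:

```python
# Sketch of a pre-post paired comparison of daily steps: mean change with a
# 95% confidence interval and a paired t-test. Step counts are invented.
import numpy as np
from scipy import stats

pre  = np.array([5200, 6100, 4800, 5900, 6300, 5500, 5000, 5700])
post = np.array([7400, 7900, 6600, 8100, 8200, 7300, 6900, 7600])

diff = post - pre
mean_change = diff.mean()
se = diff.std(ddof=1) / np.sqrt(len(diff))
ci = stats.t.interval(0.95, df=len(diff) - 1, loc=mean_change, scale=se)
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"mean change {mean_change:.0f} steps/day, "
      f"95% CI {ci[0]:.0f} to {ci[1]:.0f}, p = {p_value:.3g}")
```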
Patel, Priyesh A; Liang, Li; Khazanie, Prateeti; Hammill, Bradley G; Fonarow, Gregg C; Yancy, Clyde W; Bhatt, Deepak L; Curtis, Lesley H; Hernandez, Adrian F
2016-07-01
Diabetes mellitus, heart failure (HF), and chronic kidney disease are common comorbidities, but overall use and safety of antihyperglycemic medications (AHMs) among patients with these comorbidities are poorly understood. Using Get With the Guidelines-Heart Failure and linked Medicare Part D data, we assessed AHM use within 90 days of hospital discharge among HF patients with diabetes mellitus discharged from Get With the Guidelines-Heart Failure hospitals between January 1, 2006, and October 1, 2011. We further summarized use by renal function and assessed renal contraindicated AHM use for patients with estimated glomerular filtration rate <30 mL/min/1.73 m². Among 8791 patients meeting inclusion criteria, the median age was 77 (interquartile range 71-83), 62.3% were female, median body mass index was 29.7 (interquartile range 25.5-35.3), median hemoglobin A1c was 6.8 (interquartile range 6.2-7.8), and 34% had ejection fraction <40%. 74.9% of patients filled a prescription for an AHM, with insulin (39.5%), sulfonylureas (32.4%), and metformin (17%) being the most commonly used AHMs. Insulin use was higher and sulfonylurea/metformin use was lower among patients with lower renal function classes. Among 1512 patients with estimated glomerular filtration rate <30 mL/min/1.73 m², 35.4% filled prescriptions for renal contraindicated AHMs per prescribing information, though there was a trend toward lower renal contraindicated AHM use over time (Cochran-Mantel-Haenszel row-mean score test P=0.048). Although use of other AHMs was low overall, thiazolidinediones were used in 6.6% of HF patients, and dipeptidyl peptidase-4 inhibitors were used in 5.1%, with trends for decreasing thiazolidinedione use and increased dipeptidyl peptidase-4 inhibitor use over time (P<0.001). Treatment of diabetes mellitus in patients with HF and chronic kidney disease is complex, and these patients are commonly treated with renal contraindicated AHMs, including over 6% receiving a thiazolidinedione, despite known concerns regarding HF. More research regarding safety and efficacy of various AHMs among HF patients is needed. © 2016 American Heart Association, Inc.
Poeran, Jashvant; Mazumdar, Madhu; Rasul, Rehana; Meyer, Joanne; Sacks, Henry S; Koll, Brian S; Wallach, Frances R; Moskowitz, Alan; Gelijns, Annetine C
2016-02-01
Antibiotic use, particularly type and duration, is a crucial modifiable risk factor for Clostridium difficile. Cardiac surgery is of particular interest because prophylactic antibiotics are recommended for 48 hours or less (vs ≤24 hours for noncardiac surgery), with increasing vancomycin use. We aimed to study associations between antibiotic prophylaxis (duration/vancomycin use) and C difficile among patients undergoing coronary artery bypass grafting. We extracted data on coronary artery bypass grafting procedures from the national Premier Perspective claims database (2006-2013, n = 154,200, 233 hospitals). Multilevel multivariable logistic regressions measured associations between (1) duration (<2 days, "standard" vs ≥2 days, "extended") and (2) type of antibiotic used ("cephalosporin," "cephalosporin + vancomycin," "vancomycin") and C difficile as outcome. Overall C difficile prevalence was 0.21% (n = 329). Most patients (59.7%) received a cephalosporin only; in 33.1% vancomycin was added, whereas 7.2% received vancomycin only. Extended prophylaxis was used in 20.9%. In adjusted analyses, extended prophylaxis (vs standard) was associated with significantly increased C difficile risk (odds ratio, 1.43; confidence interval, 1.07-1.92), whereas no significant associations existed for vancomycin use as adjuvant or primary prophylactic compared with the use of cephalosporins (odds ratio, 1.21; confidence interval, 0.92-1.60, and odds ratio, 1.39; confidence interval, 0.94-2.05, respectively). Substantial inter-hospital variation exists in the percentage of extended antibiotic prophylaxis (interquartile range, 2.5-35.7), use of adjuvant vancomycin (interquartile range, 4.2-61.1), and vancomycin alone (interquartile range, 2.3-10.4). Although extended use of antibiotic prophylaxis was associated with increased C difficile risk after coronary artery bypass grafting, vancomycin use was not. The observed hospital variation in antibiotic prophylaxis practices suggests great potential for efforts aimed at standardizing practices that subsequently could reduce C difficile risk. Copyright © 2016 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
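The "multilevel multivariable logistic regressions" described above account for clustering of patients within hospitals. One common way to sketch such a model is a logistic GEE with an exchangeable correlation structure; the data frame, column names, and simulated counts below are illustrative assumptions only, not the study's model specification.

```python
# Sketch of a hospital-clustered logistic model of the kind described above,
# using generalized estimating equations (GEE). All names and data are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from statsmodels.genmod.cov_struct import Exchangeable

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "hospital_id":  rng.integers(0, 50, size=n),   # clustering unit
    "extended_ppx": rng.integers(0, 2, size=n),    # prophylaxis >= 2 days
    "vanco":        rng.integers(0, 2, size=n),    # vancomycin used
})
logit = -3.5 + 0.36 * df["extended_ppx"] + 0.2 * df["vanco"]
df["cdiff"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.gee("cdiff ~ extended_ppx + vanco", groups="hospital_id", data=df,
                family=sm.families.Binomial(), cov_struct=Exchangeable())
res = model.fit()
print(np.exp(res.params))   # odds ratios for extended prophylaxis and vancomycin
```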
Zifan, Ali; Ledgerwood-Lee, Melissa; Mittal, Ravinder K
2016-12-01
Three-dimensional high-definition anorectal manometry (3D-HDAM) is used to assess anal sphincter function; it determines profiles of regional pressure distribution along the length and circumference of the anal canal. There is no consensus, however, on the best way to analyze data from 3D-HDAM to distinguish healthy individuals from persons with sphincter dysfunction. We developed a computer analysis system to analyze 3D-HDAM data and to aid in the diagnosis and assessment of patients with fecal incontinence (FI). In a prospective study, we performed 3D-HDAM analysis of 24 asymptomatic healthy subjects (control subjects; all women; mean age, 39 ± 10 years) and 24 patients with symptoms of FI (all women; mean age, 58 ± 13 years). Patients completed a standardized questionnaire (FI severity index) to score the severity of FI symptoms. We developed and evaluated a robust prediction model to distinguish patients with FI from control subjects using linear discriminant, quadratic discriminant, and logistic regression analyses. In addition to collecting pressure information from the HDAM data, we assessed regional features based on shape characteristics and the anal sphincter pressure symmetry index. The combination of pressure values, anal sphincter area, and reflective symmetry values distinguished patients with FI from control subjects with an area under the curve value of 1.0. In logistic regression analyses using different predictors, the model identified patients with FI with an area under the curve value of 0.96 (interquartile range, 0.22). In discriminant analysis, results were classified with a minimum error of 0.02, calculated using 10-fold cross-validation; different combinations of predictors produced median classification errors of 0.16 in linear discriminant analysis (interquartile range, 0.25) and 0.08 in quadratic discriminant analysis (interquartile range, 0.25). We developed and validated a novel prediction model to analyze 3D-HDAM data. This system can accurately distinguish patients with FI from control subjects. Copyright © 2016 AGA Institute. Published by Elsevier Inc. All rights reserved.
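As a rough illustration of the model comparison described above (linear discriminant, quadratic discriminant, and logistic regression evaluated with 10-fold cross-validation and area under the curve), here is a sketch on synthetic features standing in for the manometry-derived predictors; it is not the authors' analysis system.

```python
# Sketch of comparing LDA, QDA, and logistic regression with 10-fold
# cross-validation, reporting AUC and classification error. Features are synthetic.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                            QuadraticDiscriminantAnalysis)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-ins for pressure, sphincter-area, and symmetry features
# for 24 patients and 24 controls.
X, y = make_classification(n_samples=48, n_features=5, n_informative=3,
                           random_state=0)

for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("QDA", QuadraticDiscriminantAnalysis()),
                  ("Logistic", LogisticRegression(max_iter=1000))]:
    auc = cross_val_score(clf, X, y, cv=10, scoring="roc_auc")
    err = 1 - cross_val_score(clf, X, y, cv=10, scoring="accuracy")
    print(f"{name}: median AUC {np.median(auc):.2f}, "
          f"median classification error {np.median(err):.2f}")
```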
Gupta, Mihir; Miller, Christopher J; Baker, Jason V; Lazar, Jason; Bogner, Johannes R; Calmy, Alexandra; Soliman, Elsayed Z; Neaton, James D
2013-03-01
We assessed the relation of inflammatory and coagulation biomarkers with electrocardiographic (ECG) evidence of myocardial ischemia. High-sensitivity C-reactive protein (hsCRP), interleukin-6 (IL-6), and D-dimer levels were measured at study entry for 3,085 human immunodeficiency virus-infected participants (mean age 44 years; 26.4% women; 24.6% black) in the Strategies for Management of Antiretroviral Therapy trial. Logistic regression models were used to examine the associations of these biomarkers with prevalent and incident myocardial ischemia. The latter analyses were performed for 1,411 participants who were randomly assigned to receive continuous antiretroviral therapy during follow-up to suppress the human immunodeficiency virus viral load and had ≥1 ECG reading during the follow-up period. The median hsCRP, IL-6, and D-dimer level was 1.65 μg/ml (interquartile range 0.69 to 4.11), 1.60 pg/ml (interquartile range 1.00 to 2.75), and 0.18 μg/ml (interquartile range 0.11 to 0.32), respectively. At baseline, the prevalence of major or minor Q-QS or ST-T ECG abnormalities was 18.6%. The biomarker levels were associated with prevalent major or minor ischemic abnormalities on the univariate analyses; however, adjustment for traditional risk factors attenuated these associations. The adjusted odds ratio for major or minor ischemic abnormalities and 95% confidence intervals for the greatest versus lowest quartiles was 1.3 (95% confidence interval 0.9 to 1.7) for hsCRP, 1.0 (95% confidence interval 0.7 to 1.3) for IL-6, and 1.1 (95% confidence interval 0.9 to 1.5) for D-dimer. During a median follow-up of 2.3 years, new definite or probable ischemic ECG abnormalities developed in 11.7% of participants receiving continuous antiretroviral therapy. Biomarker levels were not associated with incident abnormalities on unadjusted or adjusted analyses. In conclusion, higher levels of hsCRP, IL-6, and D-dimer were not associated with ischemic ECG abnormalities. Elevated biomarker levels and ECG abnormalities indicating myocardial ischemia might reflect different risk pathways for cardiovascular disease. Copyright © 2013 Elsevier Inc. All rights reserved.
Li, Debbie; Baxter, Nancy N; McLeod, Robin S; Moineddin, Rahim; Wilton, Andrew S; Nathens, Avery B
2014-12-01
There is increasing evidence to support the use of percutaneous abscess drainage, laparoscopy, and primary anastomosis in managing acute diverticulitis. The aim of this study was to evaluate how practices have evolved and to determine the effects on clinical outcomes. This is a population-based retrospective cohort study using administrative discharge data. This study was conducted in Ontario, Canada. All patients had been hospitalized for a first episode of acute diverticulitis (2002-2012). Temporal changes in treatment strategies and outcomes were evaluated by using the Cochran-Armitage test for trends. Multivariable logistic regression with generalized estimating equations was used to test for trends while adjusting for patient characteristics. There were 18,543 patients hospitalized with a first episode of diverticulitis, median age 60 years (interquartile range, 48-74). From 2002 to 2012, there was an increase in the proportion of patients admitted with complicated disease (abscess, perforation), 32% to 38%, yet a smaller proportion underwent urgent operation, 28% to 16% (all p < 0.001). The use of percutaneous drainage increased from 1.9% of admissions in 2002 to 3.3% in 2012 (p < 0.001). After adjusting for changes in patient and disease characteristics over time, the odds of urgent operation decreased over time (odds ratio 0.87 per annum; 95% CI, 0.85-0.89). In those undergoing urgent surgery (n = 3873), the use of laparoscopy increased (9% to 18%, p < 0.001), whereas the use of the Hartmann procedure remained unchanged (64%). During this time, in-hospital mortality decreased (2.7% to 1.9%), as did the median length of stay (from 5 days, interquartile range 3-9, to 3 days, interquartile range 2-6; p < 0.001). There is the potential for residual confounding, because clinical parameters available for risk adjustment were limited to fields existing within administrative data. There has been an increase in the use of nonoperative and minimally invasive strategies in treating patients with a first episode of acute diverticulitis. However, the Hartmann procedure remains the most frequently used urgent operative approach. Mortality and length of stay have improved during this time.
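The Cochran-Armitage test for trend used above can be computed directly from per-year admission and event counts. A hand-rolled sketch with invented counts follows; the study's actual yearly counts are not reproduced here.

```python
# Sketch of a Cochran-Armitage test for trend in a binary outcome across
# ordered calendar years. Yearly counts are invented for illustration.
import numpy as np
from scipy import stats

years   = np.array([2002, 2004, 2006, 2008, 2010, 2012])   # ordered scores
n_admit = np.array([1500, 1550, 1600, 1700, 1750, 1800])    # admissions/year
n_op    = np.array([ 420,  400,  380,  350,  310,  290])    # urgent operations

p_bar = n_op.sum() / n_admit.sum()
t = years - years.mean()                                     # centred scores
stat = np.sum(t * (n_op - n_admit * p_bar))
var = p_bar * (1 - p_bar) * (np.sum(n_admit * t**2)
                             - np.sum(n_admit * t)**2 / n_admit.sum())
z = stat / np.sqrt(var)
p_value = 2 * stats.norm.sf(abs(z))                          # two-sided
print(f"z = {z:.2f}, p = {p_value:.3g}")
```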
Al Jaaly, Emad; Fiorentino, Francesca; Reeves, Barnaby C; Ind, Philip W; Angelini, Gianni D; Kemp, Scott; Shiner, Robert J
2013-10-01
We compared the efficacy of noninvasive ventilation with bilevel positive airway pressure added to usual care versus usual care alone in patients undergoing coronary artery bypass grafting. We performed a 2-group, parallel, randomized controlled trial. The primary outcome was time until fit for discharge. Secondary outcomes were partial pressure of carbon dioxide, forced expiratory volume in 1 second, atelectasis, adverse events, duration of intensive care stay, and actual postoperative stay. A total of 129 patients were randomly allocated to bilevel positive airway pressure (66) or usual care (63). Three patients allocated to bilevel positive airway pressure withdrew. The median duration of bilevel positive airway pressure was 16 hours (interquartile range, 11-19). The median duration of hospital stay until fit for discharge was 5 days for the bilevel positive airway pressure group (interquartile range, 4-6) and 6 days for the usual care group (interquartile range, 5-7; hazard ratio, 1.68; 95% confidence interval, 1.08-2.31; P = .019). There was no significant difference in duration of intensive care, actual postoperative stay, and mean percentage of predicted forced expiratory volume in 1 second on day 3. Mean partial pressure of carbon dioxide was significantly reduced 1 hour after bilevel positive airway pressure application, but there was no overall difference between the groups up to 24 hours. Basal atelectasis occurred in 15 patients (24%) in the usual care group and 2 patients (3%) in the bilevel positive airway pressure group. Overall, 30% of patients in the bilevel positive airway pressure group experienced an adverse event compared with 59% in the usual care group. Among patients undergoing elective coronary artery bypass grafting, the use of bilevel positive airway pressure at extubation reduced the recovery time. Supported by trained staff, more than 75% of all patients allocated to bilevel positive airway pressure tolerated it for more than 10 hours. Copyright © 2013 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
Gionfriddo, Ashley; Nonoyama, Mika L; Laussen, Peter C; Cox, Peter N; Clarke, Megan; Floh, Alejandro A
2018-06-01
To promote standardization, the Centers for Disease Control and Prevention introduced a new ventilator-associated pneumonia classification, which was modified for pediatrics (pediatric ventilator-associated pneumonia according to proposed criteria [PVAP]). We evaluated the frequency of PVAP in a cohort of children diagnosed with ventilator-associated pneumonia according to traditional criteria and compared their strength of association with clinically relevant outcomes. Retrospective cohort study. Tertiary care pediatric hospital. Critically ill children (0-18 yr) diagnosed with ventilator-associated pneumonia between January 2006 and December 2015 were identified from an infection control database. Patients were excluded if on high frequency ventilation, extracorporeal membrane oxygenation, or reintubated 24 hours following extubation. None. Patients were assessed for PVAP diagnosis. Primary outcome was the proportion of subjects diagnosed with PVAP. Secondary outcomes included association with intervals of care. Two hundred seventy-seven children who had been diagnosed with ventilator-associated pneumonia were eligible for review; 46 were excluded for being ventilated under 48 hours (n = 16), on high frequency ventilation (n = 12), on extracorporeal membrane oxygenation (n = 8), ineligible bacteria isolated from culture (n = 8), and other causes (n = 4). ICU admission diagnoses included congenital heart disease (47%), neurological (16%), trauma (7%), respiratory (7%), posttransplant (4%), neuromuscular (3%), and cardiomyopathy (3%). Only 16% of subjects (n = 45) met the new PVAP definition, with 18% (n = 49) having any ventilator-associated condition. Failure to fulfill new definitions was based on inadequate increase in mean airway pressure in 90% or FIO2 in 92%. PVAP was associated with prolonged ventilation (median [interquartile range], 29 d [13-51 d] vs 16 d [8-34.5 d]; p = 0.002), ICU (median [interquartile range], 40 d [20-100 d] vs 25 d [14-61 d]; p = 0.004) and hospital length of stay (median [interquartile range], 81 d [40-182 d] vs 54 d [31-108 d]; p = 0.04), and death (33% vs 16%; p = 0.008). Few children with ventilator-associated pneumonia diagnosis met the proposed PVAP criteria. PVAP was associated with increased morbidity and mortality. This work suggests that additional study is required before new definitions for ventilator-associated pneumonia are introduced for children.
Boucher, Maria O; Smitherman, Andrew B; Pahl, Kristy S; Rao, Kathleen W; Deal, Allison M; Blatt, Julie
2016-04-01
RUNX1 (AML1) amplification in patients with B-cell acute lymphoblastic leukemia (B-ALL) has been associated with poor survival for unclear reasons. Our anecdotal experience suggests that children with B-ALL and RUNX1 amplification might be predisposed to thrombosis. We performed a retrospective cohort study of children with B-ALL treated from 2008 to 2014 at the North Carolina Children's Hospital. Patient demographics, cytogenetics, and diagnosis of thrombosis were extracted by blinded chart review. Analysis was performed examining the relationship between RUNX1 amplification and thrombosis. We identified 119 patients with B-ALL and a median age of 4.9 years (interquartile range, 2.9 to 8.6 y) at diagnosis. Four patients (3%) had RUNX1 amplification. The average number of RUNX1 copies among those with amplification was 5 (SD 0.81 [range, 4 to 6]). Eighteen thromboses were diagnosed within 6 months of starting treatment. These events were more likely among patients with RUNX1 amplification than in patients without amplification (75% vs. 13%; RR 5.75, 95% confidence interval, 2.75-12.01). RUNX1 amplification may predispose to early thrombotic events in children with B-ALL, which could, in part, contribute to their poorer outcomes. Treatment implications, including possible prophylactic anticoagulation of patients with RUNX1 amplification, justify larger studies to confirm these findings.
LaRovere, Kerri L; Graham, Robert J; Tasker, Robert C
2013-03-01
Pediatric neurocritical care is a developing specialization within pediatric intensive care and pediatric neurology practice, and the evolving clinical expertise has relevance to training and education in both fields. We describe a model of service using a Neurology Consulting Team in the intensive care unit setting. Medical records were reviewed from a 32-month cohort of Neurology Consulting Team referrals. Six hundred eighty-nine (19%) of 3719 patients admitted to the intensive care unit were assessed by the team. The most common diagnostic categories were seizures, neurosurgical, cerebrovascular, or central nervous system infection. Fifty-seven percent (350 of 615 patients) required mechanical ventilation. Cohort mortality was 7% vs 2% for the general intensive care population (P < 0.01). The team provided 4592 initial and subsequent consultations; on average there were five to six new consultations per week. Each patient had a median of two (interquartile range, 1 to 6) consultations during admission. Three quarters of the cohort required neurodiagnostic investigation (1625 tests), with each patient undergoing a median of two (range, 0 to 3) studies. Taken together, the subset of pediatric intensive care unit patients undergoing neurology consultation, investigation, and management represents a significant practice experience for trainees, which has implications for future curriculum development in both pediatric critical care medicine and pediatric neurology. Copyright © 2013 Elsevier Inc. All rights reserved.
Patient Advocacy Organizations, Industry Funding, and Conflicts of Interest.
Rose, Susannah L; Highland, Janelle; Karafa, Matthew T; Joffe, Steven
2017-03-01
Patient advocacy organizations (PAOs) are influential health care stakeholders that provide direct counseling and education for patients, engage in policy advocacy, and shape research agendas. Many PAOs report having financial relationships with for-profit industry, yet little is known about the nature of these relationships. To describe the nature of industry funding and partnerships between PAOs and for-profit companies in the United States. A survey was conducted from September 1, 2013, to June 30, 2014, of a nationally representative random sample of 439 PAO leaders, representing 5.6% of 7865 PAOs identified in the United States. Survey questions addressed the nature of their activities, their financial relationships with industry, and the perceived effectiveness of their conflict of interest policies. Amount and sources of revenue as well as organizational experiences with and policies regarding financial conflict of interest. Of the 439 surveys mailed to PAO leaders, 289 (65.8%) were returned with at least 80% of the questions answered. The PAOs varied widely in terms of size, funding, activities, and disease focus. The median total revenue among responding organizations was $299 140 (interquartile range, $70 000-$1 200 000). A total of 165 of 245 PAOs (67.3%) reported receiving industry funding, with 19 of 160 PAOs (11.9%) receiving more than half of their funding from industry. Among the subset of PAOs that received industry funding, the median amount was $50 000 (interquartile range, $15 000-$200 000); the median proportion of industry support derived from the pharmaceutical, device, and/or biotechnology sectors was 45% (interquartile range, 0%-100%). A total of 220 of 269 respondents (81.8%) indicated that conflicts of interest are very or moderately relevant to PAOs, and 94 of 171 (55.0%) believed that their organizations' conflict of interest policies were very good. A total of 22 of 285 PAO leaders (7.7%) perceived pressure to conform their positions to the interests of corporate donors. Patient advocacy organizations engage in wide-ranging health activities. Although most PAOs receive modest funding from industry, a minority receive substantial industry support, raising added concerns about independence. Many respondents report a need to improve their conflict of interest policies to help maintain public trust.
Choquette, Anne F.
2014-01-01
This report summarizes pesticide and nitrate (as nitrogen) results from quarterly sampling of 31 surficial-aquifer wells in the Lake Wales Ridge Monitoring Network during April 1999 through January 2005. The wells, located adjacent to citrus orchards and used for monitoring only, were generally screened (sampled) within 5 to 40 feet of the water table. Of the 44 citrus pesticides and pesticide degradates analyzed, 17 were detected in groundwater samples. Parent pesticides and degradates detected in quarterly groundwater samples, ordered by frequency of detection, included norflurazon, demethyl norflurazon, simazine, diuron, bromacil, aldicarb sulfone, aldicarb sulfoxide, deisopropylatrazine (DIA), imidacloprid, metalaxyl, thiazopyr monoacid, oxamyl, and aldicarb. Reconnaissance sampling of five Network wells yielded detection of four additional pesticide degradates (hydroxysimazine, didealkylatrazine, deisopropylhydroxyatrazine, and hydroxyatrazine). The highest median concentration values per well, based on samples collected during the 1999–2005 period (n=14 to 24 samples per well), included 3.05 µg/L (micrograms per liter) (simazine), 3.90 µg/L (diuron), 6.30 µg/L (aldicarb sulfone), 6.85 µg/L (aldicarb sulfoxide), 22.0 µg/L (demethyl norflurazon), 25.0 µg/L (norflurazon), 89 µg/L (bromacil), and 25.5 mg/L (milligrams per liter) (nitrate). Nitrate concentrations exceeded the 10 mg/L (as nitrogen) drinking water standard in one or more groundwater samples from 28 of the wells, and the median nitrate concentration among these wells was 14 mg/L. Sampled groundwater pesticide concentrations exceeded Florida’s health-guidance benchmarks for aldicarb sulfoxide and aldicarb sulfone (4 wells), the sum of aldicarb and its degradates (6 wells), simazine (2 wells), the sum of simazine and DIA (3 wells), diuron (2 wells), bromacil (1 well), and the sum of norflurazon and demethyl norflurazon (1 well). The magnitude of fluctuations in groundwater pesticide concentrations varied between wells and between pesticide compounds. Of the 10 pesticide compounds detected at sufficient frequency to assess temporal variability in quarterly sampling records, median values of the relative interquartile range (ratio of the interquartile range to the median) among wells typically ranged from about 100 to 150 percent. The relative interquartile range of pesticide concentrations at individual wells could be much higher, sometimes exceeding 200 to 500 percent. No distinct spatial patterns were apparent among median pesticide concentrations in sampled wells; nitrate concentrations tended to be greater in samples from wells in the northern part of the study area.
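The relative interquartile range used above is simply the interquartile range of the quarterly concentrations at a well divided by their median. A minimal sketch of that calculation on hypothetical concentration values (the numbers below are illustrative only, not data from the report):

```python
import numpy as np

# Hypothetical quarterly concentrations (µg/L) for one pesticide at one well; illustrative values only.
conc = np.array([0.5, 1.2, 2.0, 3.5, 4.0, 6.0, 8.0, 12.0])

q1, median, q3 = np.percentile(conc, [25, 50, 75])
relative_iqr = (q3 - q1) / median  # ratio of interquartile range to the median, as defined in the report

print(f"IQR = {q3 - q1:.2f} µg/L, median = {median:.2f} µg/L, relative IQR = {relative_iqr:.0%}")
```

With these made-up values the relative interquartile range comes out at roughly 125 percent, which falls within the 100 to 150 percent range described as typical above.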
Daily Alcohol Use as an Independent Risk Factor for HIV Seroconversion Among People Who Inject Drugs
Young, Samantha; Wood, Evan; Dong, Huiru; Kerr, Thomas; Hayashi, Kanna
2015-01-01
Aims To estimate the relationship between daily alcohol use and HIV seroconversion among people who inject drugs (PWID) in a Canadian setting. Design and Setting Data from an open prospective cohort study of PWID in Vancouver, Canada, recruited via snowball sampling and street outreach between May 1996 and November 2013. An interviewer-administered questionnaire including standardized behavioural assessment, and HIV antibody testing were conducted semiannually. Baseline HIV-seronegative participants completing ≥1 follow-up visits were eligible for the present analysis. Participants 1683 eligible participants, including 564 (33.5%) women, were followed for a median of 79.8 (interquartile range [IQR]: 33.3 – 119.1) months. Measurements The primary endpoint was time to HIV seroconversion, with the date of HIV seroconversion estimated as the midpoint between the last negative and the first positive antibody test results. The primary explanatory variable was self-reported daily alcohol use in the previous 6 months assessed semiannually. Other covariates considered included demographic, behavioural, social/structural, and environmental risk factors for HIV infection among PWID (e.g. daily cocaine injection, methadone use, etc.). Findings Of 1683 PWID, there were 176 HIV seroconversions during follow-up with an incidence density of 1.5 (95% confidence interval [CI]: 1.3 – 1.7) cases per 100 person-years. At baseline, 339 (20.1%) consumed alcohol at least daily in the previous six months. In multivariable extended Cox regression analyses, daily alcohol use remained independently associated with HIV seroconversion (Adjusted Hazard Ratio: 1.48; 95% CI: 1.00–2.17). Conclusions Daily alcohol use appears to be an independent risk factor for HIV seroconversion among our cohort of PWID. PMID:26639363
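The incidence density quoted above follows from two conventions described in the abstract: the seroconversion date is imputed as the midpoint between the last negative and the first positive antibody test, and the number of cases is divided by the person-years at risk. A minimal sketch of that arithmetic on made-up visit dates (none of these dates or counts come from the study):

```python
from datetime import date

# Hypothetical records: (enrolment, last HIV-negative test, first HIV-positive test or None).
# All dates are invented for illustration; they are not study data.
participants = [
    (date(2000, 1, 1), date(2001, 6, 1), date(2002, 6, 1)),   # seroconverted during follow-up
    (date(2000, 3, 1), date(2007, 3, 1), None),               # remained HIV-negative
    (date(2001, 5, 1), date(2003, 5, 1), date(2003, 11, 1)),  # seroconverted during follow-up
]

events = 0
person_years = 0.0
for enrolled, last_neg, first_pos in participants:
    if first_pos is None:
        end_at_risk = last_neg  # censored at the last negative test
    else:
        # Seroconversion date imputed as the midpoint between last negative and first positive test.
        end_at_risk = last_neg + (first_pos - last_neg) / 2
        events += 1
    person_years += (end_at_risk - enrolled).days / 365.25

print(f"{events} seroconversions / {person_years:.1f} person-years "
      f"= {100 * events / person_years:.2f} per 100 person-years")
```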
EARLY: a pilot study on early diagnosis of atrial fibrillation in a primary healthcare centre.
Benito, Luisa; Coll-Vinent, Blanca; Gómez, Eva; Martí, David; Mitjavila, Joan; Torres, Ferran; Miró, Òscar; Sisó, Antoni; Mont, Lluís
2015-11-01
Atrial fibrillation (AF) is associated with high morbidity and mortality. Early diagnosis is likely to improve therapy and prognosis. The study objective was to evaluate the usefulness of a programme for early diagnosis of AF in patients from an urban primary care centre. Participants were recruited from a randomized sample of patients not diagnosed with AF but having relevant risk factors: age ≥ 65 years, ischaemic and/or valvular heart disease, congestive heart failure, hypertension, and/or diabetes. Patients were randomly assigned to the intervention group (IG) or control group (CG). The intervention included (i) initial visit with clinical history, electrocardiogram, and instruction about pulse palpation and warning signs and (ii) electrocardiogram every 6 months during a 2-year follow-up. The main endpoint of the study was the proportion of new cases diagnosed at 6 months. Secondary endpoints were number of new AF diagnoses and complications associated with the arrhythmia in both groups. A total of 928 patients were included (463 IG and 465 CG). At 6 months, AF was diagnosed in 8 IG patients and 1 CG patient (1.7 vs. 0.2%, respectively, P = 0.018). After 2 years of follow-up, 11 IG patients and 6 CG patients had newly diagnosed AF (2.5 vs. 1.3%, respectively, P = 0.132). Time to first diagnosis of AF was shorter in IG patients [median (inter-quartile range): 7 (192) days vs. 227 (188.5) days in CG, P = 0.029]. The simple screening proposed could be useful for the early detection of AF in primary care. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2015. For permissions please email: journals.permissions@oup.com.
Salvat, I; Zaldivar, P; Monterde, S; Montull, S; Miralles, I; Castel, A
2017-03-01
Multidisciplinary treatments have been shown to be effective for fibromyalgia. We report detailed functional outcomes of patients with fibromyalgia who attended a 3-month Multidisciplinary treatment program. The hypothesis was that patients would have increased functional status, physical activity level, and exercise regularity after attending this program. We performed a retrospective analysis of a randomized, single-blind clinical trial. The inclusion criteria consisted of female sex, a diagnosis of fibromyalgia, age 18-60 years, and 3-8 years of schooling. Measures from the Fibromyalgia Impact Questionnaire (FIQ) and the COOP/WONCA Functional Health Assessment Charts (WONCA) were obtained before and at the end of the treatment and at 3-, 6-, and 12-month follow-ups. Patients recorded their number of steps per day with pedometers. They performed the six-minute walk test (6 MW) before and after treatment. In total, 155 women participated in the study. Their median (interquartile interval) FIQ score was 68.0 (53.0-77.0) at the beginning of the treatment, and the difference between the Multidisciplinary and Control groups was statistically and clinically significant in all of the measures (except the 6-month follow-up). The WONCA charts showed significant clinical improvements in the Multidisciplinary group, with physical fitness in the normal range across almost all values. In that group, steps/day showed more regularity, the 6 MW results showed an improvement of -33.00 (-59.8 to -8.25) m, and the differences from the Control group were statistically significant. The patients who underwent the Multidisciplinary treatment had improved functional status, physical activity level, and exercise regularity. The functional improvements were maintained 1 year after treatment completion.
FEINSTEIN, Lydia; EDMONDS, Andrew; OKITOLONDA, Vitus; COLE, Stephen R; VAN RIE, Annelies; CHI, Benjamin H; NDJIBU, Papy; LUSIAMA, Jean; CHALACHALA, Jean Lambert; BEHETS, Frieda
2015-01-01
Background Programs to prevent mother-to-child HIV transmission (PMTCT) are plagued by loss to follow-up (LTFU) of HIV-exposed infants. We assessed if providing combination antiretroviral therapy (cART) to HIV-infected mothers was associated with reduced LTFU of their HIV-exposed infants in Kinshasa, DR Congo. Methods We constructed a cohort of mother-infant pairs using routinely collected clinical data. Maternal cART eligibility was based on national guidelines in effect at the time. Infants were considered LTFU following three failed tracking attempts after a missed visit or if more than six months passed since they were last seen in clinic. Statistical methods accounted for competing risks (e.g. death). Results 1318 infants enrolled at a median age of 2.6 weeks (interquartile range [IQR]: 2.1-6.9), at which point 24% of mothers were receiving cART. Overall, 5% of infants never returned to care following enrollment and 18% were LTFU by 18 months. The 18-month cumulative incidence of LTFU was 8% among infants whose mothers initiated cART by infant enrollment and 20% among infants whose mothers were not yet on cART. Adjusted for baseline factors, infants whose mothers were not on cART were over twice as likely to be LTFU, with a subdistribution hazard ratio of 2.75 (95% confidence limit: 1.81, 4.16). The association remained strong regardless of maternal CD4 count at infant enrollment. Conclusion Increasing access to cART for pregnant women could improve retention of HIV-exposed infants, thereby increasing the clinical and population-level impacts of PMTCT interventions and access to early cART for HIV-infected infants. PMID:25886922
Utility of Point-of-care Ultrasound in Children With Pulmonary Tuberculosis.
Bélard, Sabine; Heuvelings, Charlotte C; Banderker, Ebrahim; Bateman, Lindy; Heller, Tom; Andronikou, Savvas; Workman, Lesley; Grobusch, Martin P; Zar, Heather J
2018-07-01
Point-of-care ultrasound (POCUS) detects extrapulmonary tuberculosis (EPTB) in HIV-infected adults but has not been evaluated in children despite their higher risk of EPTB. This study's aims were to investigate the feasibility of POCUS for EPTB in children, the frequency of POCUS findings suggestive of EPTB and the time to sonographic resolution of findings with treatment. This prospective South African cohort study enrolled children with suspected pulmonary tuberculosis (PTB). POCUS for pleural, pericardial or ascitic effusion, abdominal lymphadenopathy or splenic or hepatic microabscesses was performed and repeated at 1, 3 and 6 months of tuberculosis (TB) treatment. Prevalence of POCUS findings and their association with HIV infection was investigated in children with confirmed PTB (microbiologically proven), unconfirmed PTB (clinically diagnosed) or unlikely TB (respiratory disease that improved during follow-up without TB treatment). Of 232 children [median age 37 months (interquartile range, 18-74)], 39 (17%) were HIV infected. Children with confirmed or unconfirmed PTB had a higher prevalence of POCUS findings than children with unlikely TB [18 of 58 (31%) and 36 of 119 (30%) vs. 8 of 55 (15%); P = 0.04 and P = 0.03, respectively]. Pleural effusion [n = 30 (13%)] or abdominal lymphadenopathy [n = 28 (12%)] were the most common findings; splenic microabscesses [n = 12 (5%)] were strongly associated with confirmed PTB. Children coinfected with HIV and TB were more likely than HIV-uninfected children with TB to have abdominal lymphadenopathy (37% vs. 10%; P < 0.001) or splenic microabscesses (23% vs. 3%; P < 0.001). Most ultrasound findings were resolved by 3 months with appropriate TB treatment. POCUS for EPTB in children with PTB is feasible. The high prevalence of findings suggests that POCUS can contribute to timely diagnosis of childhood TB and to monitoring treatment response.
Evans, Jennifer L; Hahn, Judith A; Lum, Paula J; Stein, Ellen S; Page, Kimberly
2009-05-01
Studies of injection drug use cessation have largely sampled adults in drug treatment settings. Little is known about injection cessation and relapse among young injection drug users (IDU) in the community. A total of 365 HCV-negative IDU under age 30 years were recruited by street outreach and interviewed quarterly for a prospective cohort between January 2000 and February 2008. Participants were followed for a total of 638 person-years and 1996 visits. We used survival analysis techniques to identify correlates of injection cessation (≥3 months) and relapse to injection. 67% of subjects were male, median age was 22 years (interquartile range (IQR) 20-26) and median years injecting was 3.6 (IQR 1.3-6.5). 28.8% ceased injecting during the follow-up period. Among those that ceased injecting, nearly one-half resumed drug injection on subsequent visits, one-quarter maintained injecting cessation, and one-quarter were lost to follow-up. Participating in a drug treatment program in the last 3 months and injecting less than 30 times per month were associated with injection cessation. Injecting heroin or heroin mixed with other drugs, injecting the residue from previously used drug preparation equipment, drinking alcohol, and using benzodiazepines were negatively associated with cessation. Younger age was associated with relapse to injection. These results suggest that factors associated with stopping injecting involve multiple areas of intervention, including access to drug treatment and behavioral approaches to reduce injection and sustain cessation. The higher incidence of relapse in the younger subjects in this cohort underscores the need for earlier detection and treatment programs targeted to adolescents and transition-age youth.
Nation-scale adoption of new medicines by doctors: an application of the Bass diffusion model
2012-01-01
Background The adoption of new medicines is influenced by a complex set of social processes that have been widely examined in terms of individual prescribers’ information-seeking and decision-making behaviour. However, quantitative, population-wide analyses of how long it takes for new healthcare practices to become part of mainstream practice are rare. Methods We applied a Bass diffusion model to monthly prescription volumes of 103 often-prescribed drugs in Australia (monthly time series data totalling 803 million prescriptions between 1992 and 2010), to determine the distribution of adoption rates. Our aim was to test the utility of applying the Bass diffusion model to national-scale prescribing volumes. Results The Bass diffusion model was fitted to the adoption of a broad cross-section of drugs using national monthly prescription volumes from Australia (median R2 = 0.97, interquartile range 0.95 to 0.99). The median time to adoption was 8.2 years (IQR 4.9 to 12.1). The model distinguished two classes of prescribing patterns – those where adoption appeared to be driven mostly by external forces (19 drugs) and those driven mostly by social contagion (84 drugs). Those driven more prominently by internal forces were found to have shorter adoption times (p = 0.02 in a non-parametric analysis of variance by ranks). Conclusion The Bass diffusion model may be used to retrospectively represent the patterns of adoption exhibited in prescription volumes in Australia, and distinguishes between adoption driven primarily by external forces such as regulation, or internal forces such as social contagion. The eight-year delay between the introduction of a new medicine and the adoption of the prescribing practice suggests the presence of system inertia in Australian prescribing practices. PMID:22876867
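For readers unfamiliar with the Bass diffusion model, the fitted quantity is the number of new adoptions (here, prescriptions) per month, governed by a coefficient of innovation p (external forces such as regulation and marketing), a coefficient of imitation q (internal forces such as social contagion), and a market potential m. The following is only a hedged sketch of how such a curve could be fitted to one drug's monthly series with scipy; the simulated data, starting values, and bounds are illustrative assumptions and not the study's actual method or data.

```python
import numpy as np
from scipy.optimize import curve_fit

def bass_cumulative(t, m, p, q):
    """Cumulative Bass adoptions at time t (months since launch)."""
    e = np.exp(-(p + q) * t)
    return m * (1.0 - e) / (1.0 + (q / p) * e)

def bass_monthly(t, m, p, q):
    """Expected adoptions in each month: difference of the cumulative curve."""
    return bass_cumulative(t, m, p, q) - bass_cumulative(t - 1.0, m, p, q)

# Illustrative monthly prescription counts for one hypothetical drug (simulated, not study data).
rng = np.random.default_rng(0)
months = np.arange(1, 61, dtype=float)
observed = bass_monthly(months, m=1.2e6, p=0.002, q=0.12) * rng.normal(1.0, 0.05, months.size)

# Fit m, p and q; starting values and bounds are rough guesses.
(m_hat, p_hat, q_hat), _ = curve_fit(
    bass_monthly, months, observed,
    p0=[observed.sum() * 2, 0.01, 0.1],
    bounds=([0.0, 1e-6, 1e-6], [np.inf, 1.0, 1.0]),
)

# R^2 of the fit, analogous to the goodness-of-fit summarised in the abstract.
pred = bass_monthly(months, m_hat, p_hat, q_hat)
r2 = 1.0 - np.sum((observed - pred) ** 2) / np.sum((observed - observed.mean()) ** 2)

# q > p suggests adoption driven mainly by internal (social) forces; p > q suggests external forces.
print(f"m = {m_hat:.0f}, p = {p_hat:.4f}, q = {q_hat:.4f}, R2 = {r2:.3f}")
```

Comparing the fitted p and q for each drug is one way to separate externally driven from internally driven adoption, in the spirit of the two classes of prescribing pattern described above.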
Paz-Bailey, Gabriela; Isern Fernandez, Virginia; Morales Miranda, Sonia; Jacobson, Jerry O; Mendoza, Suyapa; Paredes, Mayte A; Danaval, Damien C; Mabey, David; Monterroso, Edgar
2012-01-01
We conducted a study among HIV-positive men and women in Honduras to describe demographics, HIV risk behaviors and sexually transmitted infection prevalence, and identify correlates of unsafe sex. Participants were recruited from HIV clinics and nongovernmental organizations in Tegucigalpa and San Pedro Sula, Honduras in a cross-sectional study in 2006. We used audio-assisted computer interviews on demographics; behaviors in the past 12 months, 6 months, and 30 days; and access to care. Assays performed included herpes (HSV-2 Herpes Select), syphilis (rapid plasma reagin [RPR] and Treponema pallidum particle agglutination assay [TPPA]) serology, and other sexually transmitted infections by polymerase chain reaction (PCR). Bivariate and multivariate analyses were conducted to assess variables associated with unprotected sex across all partner types in the past 12 months. Of 810 participants, 400 were from Tegucigalpa and 410 from San Pedro Sula; 367 (45%) were men. Mean age was 37 years (interquartile range: 31-43). Consistent condom use for men and women was below 60% for all partner types. In multivariate analysis, unprotected sex was more likely among women (odds ratio [OR]: 1.9, 95% confidence interval [CI]: 1.2-3.1, P = 0.007), those with HIV diagnoses within the past year (OR: 2.0, 95% CI: 1.1-3.7, P = 0.016), those reporting difficulty accessing condoms (OR: 2.6, 95% CI: 1.4-4.7, P = 0.003), and those reporting discrimination (OR: 1.8, 95% CI: 1.1-3.0, P = 0.016). Programs targeting HIV-positive patients need to address gender-based disparities, improve condom access and use, and help establish a protective legal and policy environment free of stigma and discrimination.
NT-pro-BNP predicts worsening renal function in patients with chronic systolic heart failure.
Pfister, R; Müller-Ehmsen, J; Hagemeister, J; Hellmich, M; Erdmann, E; Schneider, C A
2011-06-01
Worsening renal function (WRF) is frequently observed in patients with heart failure and is associated with worse outcome. The aim of this study was to examine the association of the cardiac serum marker N-terminal pro-B-type natriuretic peptide (NT-pro-BNP) with WRF. A total of 125 consecutive patients of a tertiary care outpatient clinic for heart failure prospectively underwent evaluation of renal function every 6 months. The association of baseline NT-pro-BNP with WRF was analysed during a follow-up of 18 months. Twenty-eight (22.4%) patients developed WRF (increase in serum creatinine ≥0.3 mg/dL). Patients with WRF (2870 pg/mL, interquartile range (IQR) 1063-4765) had significantly higher baseline NT-pro-BNP values than patients without WRF (547 pg/mL, IQR 173-1454). The risk for WRF increased by a factor of 4.0 (95% CI 2.1-7.5) for each standard deviation of log NT-pro-BNP. In multivariable analysis including age, baseline renal function, ejection fraction, New York Heart Association class and diuretic dose, only NT-pro-BNP and diabetes were independent predictors of WRF. At a cut-off level of 696 pg/mL, NT-pro-BNP showed a sensitivity of 92.9% and a negative predictive value of 96.4% for WRF. NT-pro-BNP is a strong independent predictor of WRF within 18 months in patients with systolic heart failure with a high negative predictive value. Further studies are needed to evaluate reno-protective strategies in patients with elevated NT-pro-BNP. © 2011 The Authors. Internal Medicine Journal © 2011 Royal Australasian College of Physicians.
Young, Sera L; Israel-Ballard, Kiersten A; Dantzer, Emily A; Ngonyani, Monica M; Nyambo, Margaret T; Ash, Deborah M; Chantry, Caroline J
2010-12-01
To assess feeding practices of infants born to HIV-positive women in Dar es Salaam, Tanzania. These data then served as a proxy to evaluate the adequacy of current infant feeding counselling. A cross-sectional survey of infant feeding behaviours. Four clinics in greater Dar es Salaam in early 2008. A total of 196 HIV-positive mothers of children aged 6-10 months recruited from HIV clinics. Initiation of breast-feeding was reported by 95·4 % of survey participants. In the entire sample, 80·1 %, 34·2 % and 13·3 % of women reported exclusive breast-feeding (EBF) up to 2, 4 and 6 months, respectively. Median duration of EBF among women who ever breast-fed was 3 (interquartile range (IQR): 2·1, 4·0) months. Most non-breast-milk foods fed to infants were low in nutrient density. Complete cessation of breast-feeding occurred within 14 d of the introduction of non-breast-milk foods among 138 of the 187 children (73·8 %) who had ever received any breast milk. Of the 187 infants in the study who ever received breast milk, 19·4 % received neither human milk nor any replacement milks for 1 week or more (median duration of no milk was 14 (IQR: 7, 152) d). Infant feeding practices among these HIV-positive mothers resulted in infants receiving far less breast milk and more mixed complementary feeds than recommended, thus placing them at greater risk of both malnutrition and HIV infection. An environment that better enables mothers to follow national guidelines is urgently needed. More intensive infant feeding counselling programmes would very likely increase rates of optimal infant feeding.
Association between HIV status and psychological symptoms in perimenopausal women.
Looby, Sara E; Psaros, Christina; Raggio, Greer; Rivard, Corinne; Smeaton, Laura; Shifren, Jan; Grinspoon, Steven; Joffe, Hadine
2018-01-29
HIV-infected women are burdened by depression and anxiety, which may impact adherence to antiretroviral therapy and overall quality of life. Yet, little is known about the scope of psychological symptoms in the growing number of HIV-infected women reaching menopause, when affective symptoms are more prevalent in the general population. We conducted a longitudinal study to compare affective symptoms between perimenopausal HIV-infected and non-HIV-infected women. The Center for Epidemiologic Studies Depression Scale (CES-D) and the Generalized Anxiety Disorder scale (GAD-7) were completed at baseline and 12 months among 33 HIV-infected and 33 non-HIV-infected perimenopausal women matched by race, age, menstrual patterns, and BMI. Linear regression models estimated the relationship of baseline GAD-7 and CES-D scores with clinical factors. All women were perimenopausal at baseline, and the vast majority remained perimenopausal throughout follow-up. HIV status was associated with higher baseline CES-D scores (median [interquartile range] 21 [12, 29] vs 10 [5, 14]; P = 0.03) and GAD-7 scores (7 [5, 15] vs 2 [1, 7]; P = 0.01), controlling for smoking, substance use, and antidepressant use. Depressive symptoms and anxiety remained significantly higher in the HIV-infected women at 12 months (P ≤ 0.01). Significant relationships of depressive symptoms (P = 0.048) and anxiety (P = 0.02) with hot flash severity were also observed. Perimenopausal HIV-infected women experienced a disproportionately high level of affective symptom burden over a 12-month observation period. Given the potential for these factors to influence adherence to HIV clinical care and quality of life, careful assessment and referral for treatment of these symptoms are essential.
Moustaghfir, Abdelhamid; Haddak, Mohand; Mechmeche, Rachid
2012-11-01
The burden of cardiovascular diseases is anticipated to rise in developing countries. We sought to describe the epidemiology, management, and clinical outcomes of patients hospitalized with acute coronary syndromes (ACS) in three countries in western North Africa. Adult patients hospitalized with a diagnosis of ACS were enrolled in the prospective ACute Coronary Events - a multinational Survey of current management Strategies (ACCESS) registry over a 13-month period (January 2007 to January 2008). We report on patients enrolled at sites in Algeria, Morocco and Tunisia. A standardized form was used to collect data on patient characteristics, treatments and outcomes. A total of 1687 patients with confirmed ACS were enrolled (median age 59 [interquartile range 52, 68] years; 76% men), 59% with ST-elevation myocardial infarction (STEMI) and 41% with non-ST-elevation ACS (NSTE-ACS). During hospitalization, most patients received aspirin (96%) and a statin (90%), 83% received a beta-blocker and 74% an angiotensin-converting enzyme inhibitor. Among eligible STEMI patients, 42% (419/989) did not receive fibrinolysis or undergo percutaneous coronary intervention. All-cause death at 12 months was 8.1% and did not differ significantly between patients with STEMI or NSTE-ACS (8.3% vs 7.7%, respectively; Log-rank test P=0.82). Clinical factors associated with higher risk of death at 12 months included cardiac arrest, cardiogenic shock, bleeding episodes and diabetes, while percutaneous coronary intervention and male sex were associated with lower risk. In this observational study of ACS patients from three Maghreb countries, the use of evidence-based pharmacological therapies for ACS was quite high; however, 42% of the patients with STEMI were not given any form of reperfusion therapy. Copyright © 2012 Elsevier Masson SAS. All rights reserved.
Ferman, Mutaz; Lim, Amanda H; Hossain, Monowar; Siow, Glenn W; Andrews, Jane M
2018-05-14
Multidisciplinary team meetings (MDTMs) have proven efficacy in cancer management. Whilst widely implemented in inflammatory bowel disease (IBD) care, their value is yet to be investigated. We reviewed the performance of MDTMs for IBD patients. Retrospective review of MDTMs from March 2013 to July 2016. Each patient's first MDTM was considered. Data collected included: report production and location, disease factors, recommendation(s), implementation and barriers to implementation. The MDTM process was considered successful when at least top-level recommendations were implemented within 6 months. MDTM attendance included IBD gastroenterologist, surgeons, radiologist, nurses, dieticians, psychologists and clinical trial staff. Initial MDTM encounters for 166 patients were reviewed: 86 females; mean age 40 years; 140 (84.3%) with Crohn's disease; mean disease duration 10.8 years (interquartile range 15 years). Electronic reports were filed for all patients; hard copies in 84%. In 151/166 episodes, all (n=127) or top-line (n=24) recommendations were implemented, although there was a delay beyond 6 months in 5. Of 146 patients with a successful MDTM, 85 (58.2%) were in clinical remission at last review (median follow-up 27 months). Amongst patients with unsuccessful MDTMs (n=13), only 2 (15.4%) were in clinical remission at follow-up. Barriers to implementation included patients declining recommendations and loss to follow-up. The majority of MDTM encounters were successful from both a process and clinical outcome perspective. System opportunities to improve the process include ensuring 100% reports are available and addressing implementation delays. Patient factors to address include improved engagement and understanding reasons for declining recommendations. This article is protected by copyright. All rights reserved.
Moreno, Lucas; Rubie, Herve; Varo, Amalia; Le Deley, Marie Cecile; Amoroso, Loredana; Chevance, Aurelie; Garaventa, Alberto; Gambart, Marion; Bautista, Francisco; Valteau-Couanet, Dominique; Geoerger, Birgit; Vassal, Gilles; Paoletti, Xavier; Pearson, Andrew D J
2017-01-01
Few randomized trials have been conducted in children with relapsed/refractory neuroblastoma, and data about outcomes, including progression-free survival (PFS), in these patients are scarce. A meta-analysis of three phase II studies of children with relapsed/refractory neuroblastoma conducted in Europe (temozolomide, topotecan-vincristine-doxorubicin and topotecan-temozolomide) was performed. Individual patient data with extended follow-up were collected from the trial databases after publication to describe trial outcomes (response rate, clinical benefit ratio, duration of treatment, PFS, and overall survival [OS]). Characteristics of subjects with relapsed/refractory neuroblastoma were compared. Data from 71 children and adolescents with relapsed/refractory neuroblastoma were collected. Response definitions were not homogeneous in the three trials. Patients were on study for a median of 3.5 months (interquartile range [IQR] 1.9-6.2). Of those, 35.2% achieved a complete or partial response, 26.3% experienced a response after more than two cycles, and 23.9% received more than six cycles. Median PFS from study entry for all, refractory, and relapsed patients was 6.4 ± 1.0, 12.5 ± 6.8, and 5.7 ± 1.0 months, respectively (P = 0.006). Median OS from study entry for all, refractory, and relapsed patients was 16.1 ± 4.3, 27.9 ± 20.2, and 11.0 ± 1.6 months, respectively (P = 0.03). Baseline data for response rate, clinical benefit ratio, duration of treatment, PFS, and OS were provided. Two subpopulations (relapsed/refractory) were clearly distinct and should be included in the interpretation of all trials. These results should help inform the design of forthcoming studies in relapsed/refractory neuroblastoma. © 2016 Wiley Periodicals, Inc.
2011-01-01
Background To better understand the need for paediatric second-line antiretroviral therapy (ART), an ART management survey and a cross-sectional analysis of second-line ART use were conducted in the TREAT Asia Paediatric HIV Observational Database and the IeDEA Southern Africa (International Epidemiologic Databases to Evaluate AIDS) regional cohorts. Methods Surveys were conducted in April 2009. Analysis data from the Asia cohort were collected in March 2009 from 12 centres in Cambodia, India, Indonesia, Malaysia, and Thailand. Data from the IeDEA Southern Africa cohort were finalized in February 2008 from 10 centres in Malawi, Mozambique, South Africa and Zimbabwe. Results Survey responses reflected inter-regional variations in drug access and national guidelines. A total of 1301 children in the TREAT Asia and 4561 children in the IeDEA Southern Africa cohorts met inclusion criteria for the cross-sectional analysis. Ten percent of Asian and 3.3% of African children were on second-line ART at the time of data transfer. Median age (interquartile range) in months at second-line initiation was 120 (78-145) months in the Asian cohort and 66 (29-112) months in the southern African cohort. Regimens varied, and the then current World Health Organization-recommended nucleoside reverse transcriptase combination of abacavir and didanosine was used in less than 5% of children in each region. Conclusions In order to provide life-long ART for children, better use of current first-line regimens and broader access to heat-stable, paediatric second-line and salvage formulations are needed. There will be limited benefit to earlier diagnosis of treatment failure unless providers and patients have access to appropriate drugs for children to switch to. PMID:21306608
Vollmar, A C; Fox, P R
2016-01-01
Dilated cardiomyopathy (DCM) is a common cause of morbidity and mortality in the Irish Wolfhound (IW). However, the benefit of medical treatment in IW dogs with preclinical DCM, atrial fibrillation (AF), or both has not been demonstrated. To compare the time to develop congestive heart failure (CHF) or sudden death in IW dogs with preclinical DCM, AF, or both receiving monotherapy with pimobendan, methyldigoxin, or benazepril hydrochloride. Seventy-five client-owned IW dogs. Irish Wolfhound dogs were prospectively randomized to receive pimobendan (Vetmedin®), benazepril HCl (Fortekor®), or methyldigoxin (Lanitop®) monotherapy in a 1:1:1 ratio in a blinded clinical trial. The prospectively defined composite primary endpoint was onset of CHF or sudden death. To assure stringent evaluation of treatment effect, data from dogs complying with the study protocol were analyzed. Sixty-six IW fulfilling the study protocol included 39 males, 27 females; median (interquartile range) age, 4.0 years (3.0-5.0 years) and weight, 70.0 kg (63.0-75.0 kg). Primary endpoint was reached in 5 of 23 (21.7%) IW receiving pimobendan, 11 of 22 (50.0%) receiving benazepril HCl, and 9 of 21 (42.9%) receiving methyldigoxin. Median time to primary endpoint was significantly longer for pimobendan-treated dogs (1,991 days; 65.4 months) than for methyldigoxin-treated (1,263 days; 41.5 months; P = .031) or benazepril HCl-treated dogs (997 days; 32.8 months; P = .008). In IW dogs with preclinical DCM, AF or both, pimobendan monotherapy significantly prolonged the time to onset of CHF or sudden death compared with monotherapy with benazepril HCl or methyldigoxin. Copyright © 2016 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.
Powis, Kathleen; Lockman, Shahin; Smeaton, Laura; Hughes, Michael D; Fawzi, Wafaie; Ogwu, Anthony; Moyo, Sikhulile; van Widenfelt, Erik; von Oettingen, Julia; Makhema, Joseph; Essex, Max; Shapiro, Roger L
2014-11-01
Low maternal 25(OH)D (vitamin D) values have been associated with higher mortality and impaired growth among HIV-exposed uninfected (HEU) infants of antiretroviral (ART)-naive women. These associations have not been studied among HEU infants of women receiving ART. We performed a nested case-control study in the Botswana Mma Bana Study, a study providing ART to women during pregnancy and breastfeeding. Median maternal vitamin D values, and the proportion with maternal vitamin D insufficiency, were compared between women whose HEU infants experienced morbidity/mortality during 24 months of follow-up and women with nonhospitalized HEU infants. Growth faltering was assessed for never hospitalized infants attending the 24-month-of-life visit. Multivariate logistic regression models determined associations between maternal vitamin D insufficiency and infant morbidity/mortality and growth faltering. Delivery plasma was available and vitamin D levels assayable from 119 (86%) of 139 cases and 233 (84%) of 278 controls, and did not differ significantly between cases and controls [median: 36.7 ng/mL, interquartile range (IQR): 29.1-44.7 vs. 37.1 ng/mL, IQR: 30.0-47.2, P = 0.32]. Vitamin D insufficiency (<32 ng/mL) was recorded among 112 (31.8%) of 352 women at delivery and occurred most frequently among women delivering in winter. Multivariate logistic regression models adjusted for maternal HIV disease progression did not show associations between maternal vitamin D insufficiency at delivery and child morbidity/mortality, or 24-month-of-life growth faltering. Vitamin D insufficiency was common among ART-treated pregnant women in Botswana, but was not associated with morbidity, mortality or growth impairment in their HIV-uninfected children.
Taggar, Jaspal S; Lewis, Sarah; Docherty, Graeme; Bauld, Linda; McEwen, Andy; Coleman, Tim
2015-04-14
Single-item urges to smoke measures have been contemplated as important measures of nicotine dependence. This study aimed to prospectively determine the relationships between measures of craving to smoke and smoking cessation, and compare their ability to predict cessation with the Heaviness of Smoking Index (HSI), an established measure of nicotine dependence. We conducted a secondary analysis of data from the randomised controlled PORTSSS trial. Measures of nicotine dependence, ascertained before making a quit attempt, were the HSI, frequency of urges to smoke (FUTS) and strength of urges to smoke (SUTS). Self-reported abstinence at six months after quitting was the primary outcome measure. Multivariate logistic regression and Receiver Operating Characteristic (ROC) analysis were used to assess associations and abilities of the nicotine dependence measures to predict smoking cessation. Of 2,535 participants, 53.5% were female; the median (interquartile range) age was 38 (28-50) years. Both FUTS and HSI were inversely associated with abstinence six months after quitting; for each point increase in HSI score, participants were 16% less likely to have stopped smoking (OR 0.84, 95% CI 0.78-0.89, p < 0.0001). Compared to participants with the lowest possible FUTS scores, those with greater scores had generally lower odds of cessation (p across frequency-of-urges categories = 0.0026). SUTS was not associated with smoking cessation. ROC analysis suggested the HSI and FUTS had similar predictive validity for cessation. Higher FUTS and HSI scores were inversely associated with successful smoking cessation six months after quit attempts began and both had similar validity for predicting cessation.
Lai, Agnes Y K; Fong, Daniel Y T; Lam, Jamie C M; Weaver, Terri E; Ip, Mary S M
2014-09-01
Poor adherence to CPAP treatment in OSA adversely affects the effectiveness of this therapy. This randomized controlled trial (RCT) examined the efficacy of a brief motivational enhancement education program in improving adherence to CPAP treatment in subjects with OSA. Subjects with newly diagnosed OSA were recruited into this RCT. The control group received usual advice on the importance of CPAP therapy and its care. The intervention group received usual care plus a brief motivational enhancement education program directed at enhancing the subjects' knowledge, motivation, and self-efficacy to use CPAP through the use of a 25-min video, a 20-min patient-centered interview, and a 10-min telephone follow-up. Self-reported daytime sleepiness, adherence-related cognitions, and quality of life were assessed at 1 month and 3 months. CPAP usage data were downloaded at the completion of this 3-month study. One hundred subjects with OSA (mean ± SD, age 52 ± 10 years; Epworth Sleepiness Scale [ESS], 9 ± 5; median [interquartile range] apnea-hypopnea index, 29 [20, 53] events/h) prescribed CPAP treatment were recruited. The intervention group had better CPAP use (higher daily CPAP usage by 2 h/d [Cohen d = 1.33, P < .001], a fourfold increase in the number using CPAP for ≥ 70% of days with ≥ 4 h/d [P < .001]), and greater improvements in daytime sleepiness (ESS) by 2.2 units (P = .001) and treatment self-efficacy by 0.2 units (P = .012) compared with the control group. Subjects with OSA who received motivational enhancement education in addition to usual care were more likely to show better adherence to CPAP treatment, with greater improvements in treatment self-efficacy and daytime sleepiness. ClinicalTrials.gov; No.: NCT01173406; URL: www.clinicaltrials.gov.
Pavlinac, P. B.; Denno, D. M.; John-Stewart, G. C.; Onchiri, F. M.; Naulikha, J. M.; Odundo, E. A.; Hulseberg, C. E.; Singa, B. O.; Manhart, L. E.; Walson, J. L.
2016-01-01
Background Shigella is a leading cause of childhood diarrhea mortality in sub-Saharan Africa. Current World Health Organization guidelines recommend antibiotics for children in non cholera-endemic areas only in the presence of dysentery, a proxy for suspected Shigella infection. Methods To assess the sensitivity and specificity of the syndromic diagnosis of Shigella-associated diarrhea, we enrolled children aged 6 months to 5 years presenting to 1 of 3 Western Kenya hospitals between November 2011 and July 2014 with acute diarrhea. Stool samples were tested using standard methods for bacterial culture and multiplex polymerase chain reaction for pathogenic Escherichia coli. Stepwise multivariable logit models identified factors to increase the sensitivity of syndromic diagnosis. Results Among 1360 enrolled children, median age was 21 months (interquartile range, 11–37), 3.4% were infected with human immunodeficiency virus, and 16.5% were stunted (height-for-age z-score less than −2). Shigella was identified in 63 children (4.6%), with the most common species being Shigella sonnei (53.8%) and Shigella flexneri (40.4%). Dysentery correctly classified 7 of 63 Shigella cases (sensitivity, 11.1%). Seventy-eight of 1297 children without Shigella had dysentery (specificity, 94.0%). The combination of fecal mucous, age over 23 months, and absence of excessive vomiting identified more children with Shigella-infection (sensitivity, 39.7%) but also indicated antibiotics in more children without microbiologically confirmed Shigella (specificity, 82.7%). Conclusions Reliance on dysentery as a proxy for Shigella results in the majority of Shigella-infected children not being identified for antibiotics. Field-ready rapid diagnostics or updated evidence-based algorithms are urgently needed to identify children with diarrhea most likely to benefit from antibiotic therapy. PMID:26407270
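The sensitivity and specificity reported above follow directly from the counts given in the abstract. A short check of that arithmetic, using only the numbers quoted in the text:

```python
# Counts taken from the abstract: 63 children with Shigella, 7 of whom had dysentery;
# 1297 children without Shigella, 78 of whom had dysentery.
shigella_total, shigella_with_dysentery = 63, 7
no_shigella_total, no_shigella_with_dysentery = 1297, 78

# Sensitivity: Shigella cases correctly flagged by dysentery.
sensitivity = shigella_with_dysentery / shigella_total
# Specificity: children without Shigella who did not have dysentery.
specificity = (no_shigella_total - no_shigella_with_dysentery) / no_shigella_total

print(f"sensitivity = {sensitivity:.1%}")  # ~11.1%
print(f"specificity = {specificity:.1%}")  # ~94.0%
```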
Kernéis, Solen; Launay, Odile; Ancelle, Thierry; Iordache, Laura; Naneix-Laroche, Véronique; Méchaï, Frédéric; Fehr, Thierry; Leroy, Jean-Philippe; Issartel, Bertrand; Dunand, Jean; van der Vliet, Diane; Wyplosz, Benjamin; Consigny, Paul-Henri; Hanslik, Thomas
2013-09-01
To assess the safety and immunogenicity of live attenuated yellow fever (YF) 17D vaccine in adults receiving systemic corticosteroid therapy. All adult travelers on systemic corticosteroid therapy who had received the YF17D vaccine in 24 French vaccination centers were prospectively enrolled and matched with healthy controls (1:2) on age and history of YF17D immunization. Safety was assessed in a self-administered standardized questionnaire within 10 days after immunization. YF-specific neutralizing antibody titers were measured 6 months after vaccination in patients receiving corticosteroids. Between July 2008 and February 2011, 102 vaccine recipients completed the safety study (34 receiving corticosteroids and 68 controls). The median age was 54.9 years (interquartile range [IQR] 45.1-60.3 years) and 45 participants had a history of previous YF17D immunization. The median time receiving corticosteroid therapy was 10 months (IQR 1-67 months) and the prednisone or equivalent dosage was 7 mg/day (IQR 5-20). Main indications were autoimmune diseases (n = 14), rheumatoid arthritis (n = 9), and upper respiratory tract infections (n = 8). No serious adverse event was reported; however, patients receiving corticosteroids reported more frequent moderate/severe local reactions than controls (12% and 2%, respectively; relative risk 8.0, 95% confidence interval 1.4-45.9). All subjects receiving corticosteroids who were tested (n = 20) had neutralizing antibody titers >10 after vaccination. After YF17D immunization, moderate/severe local reactions may be more frequent in patients receiving systemic corticosteroid therapy. Immunogenicity seems satisfactory. Large-scale studies are needed to confirm these results. Copyright © 2013 by the American College of Rheumatology.
Legrand, Delphine; Vaes, Bert; Matheï, Catharina; Adriaensen, Wim; Van Pottelbergh, Gijs; Degryse, Jean-Marie
2014-06-01
To evaluate the predictive value of muscle strength and physical performance in the oldest old for all-cause mortality; hospitalization; and the onset of disability, defined as a decline in activities of daily living (ADLs), independent of muscle mass, inflammatory markers, and comorbidities. A prospective, observational, population-based follow-up study. Three well-circumscribed areas of Belgium. Five hundred sixty participants aged 80 and older were followed for 33.5 months (interquartile range 31.1-35.6 months). Grip strength, Short Physical Performance Battery (SPPB) score, and muscle mass were measured at baseline; ADLs at baseline and after 20 months; and all-cause mortality and time to first hospitalization from inclusion onward. Kaplan-Meier curves and Cox proportional hazards models were calculated for all-cause mortality and hospitalization. Logistic regression analysis was used to determine predictors of decline in ADLs. Kaplan-Meier curves showed significantly higher all-cause mortality and hospitalization in subjects in the lowest tertile of grip strength and SPPB score. The adjusted Cox proportional hazards model showed that participants with high grip strength or a high SPPB score had a lower risk of mortality and hospitalization, independent of muscle mass, inflammatory markers, and comorbidity. A relationship was found between SPPB score and decline in ADLs, independent of muscle mass, inflammation, and comorbidity. In people aged 80 and older, physical performance is a strong predictor of mortality, hospitalization, and disability, and muscle strength is a strong predictor of mortality and hospitalization. All of these relationships were independent of muscle mass, inflammatory markers, and comorbidity. © 2014, Copyright the Authors Journal compilation © 2014, The American Geriatrics Society.
Arora, Sohrab; Abaza, Ronney; Adshead, James M; Ahlawat, Rajesh K; Challacombe, Benjamin J; Dasgupta, Prokar; Gandaglia, Giorgio; Moon, Daniel A; Yuvaraja, Thyavihally B; Capitanio, Umberto; Larcher, Alessandro; Porpiglia, Francesco; Porter, James R; Mottrie, Alexander; Bhandari, Mahendra; Rogers, Craig
2018-01-01
To analyse the outcomes of robot-assisted partial nephrectomy (RAPN) in patients with a solitary kidney in a large multi-institutional database. In all, 2755 patients in the Vattikuti Collective Quality Initiative database underwent RAPN by 22 surgeons at 14 centres in nine countries. Of these patients, 74 underwent RAPN with a solitary kidney between 2007 and 2016. We retrospectively analysed the functional and oncological outcomes of these 74 patients. A 'trifecta' of outcomes was assessed, with trifecta defined as a warm ischaemia time (WIT) of <20 min, negative surgical margins, and no complications intraoperatively or within 3 months of RAPN. All 74 patients underwent RAPN successfully, with one conversion to radical nephrectomy. The median (interquartile range [IQR]) operative time was 180 (142-230) min. Early unclamping was used in 11 (14.9%) patients and zero ischaemia was used in 12 (16.2%). Trifecta outcomes were achieved in 38 of 66 patients (57.6%). The median (IQR) WIT was 15.5 (8.75-20.0) min for the entire cohort. The overall complication rate was 24.1% and the rate of Clavien-Dindo grade ≤II complications was 16.3%. Positive surgical margins were present in four cases (5.4%). The median (IQR) follow-up was 10.5 (2.12-24.0) months. The median drop in estimated glomerular filtration rate at 3 months was 7.0 mL/min/1.73 m² (11.01%). Our findings suggest that RAPN is a safe and effective treatment option for select renal tumours in solitary kidneys in terms of a trifecta of negative surgical margins, WIT of <20 min, and low operative and perioperative morbidity. © 2017 The Authors BJU International © 2017 BJU International Published by John Wiley & Sons Ltd.
Loh, Jane; Kennedy, Mary Clare; Wood, Evan; Kerr, Thomas; Marshall, Brandon; Parashar, Surita; Montaner, Julio; Milloy, M-J
2016-11-01
Homelessness is common among people who use drugs (PWUD) and, for those living with HIV/AIDS, an important contributor to sub-optimal HIV treatment outcomes. This study aims to investigate the relationship between the duration of homelessness and the likelihood of plasma HIV-1 RNA viral load (VL) non-detectability among a cohort of HIV-positive PWUD. We used data from the ACCESS study, a long-running prospective cohort study of HIV-positive PWUD linked to comprehensive HIV clinical records including systematic plasma HIV-1 RNA VL monitoring. We estimated the longitudinal relationship between the duration of homelessness and the likelihood of exhibiting a non-detectable VL (i.e., <500 copies/mL plasma) using generalized linear mixed-effects modelling. Between May 1996 and June 2014, 922 highly active antiretroviral therapy-exposed participants were recruited and contributed 8188 observations. Of these, 4800 (59%) were characterized by non-detectable VL. Participants reported they were homeless in 910 (11%) interviews (median: six months, interquartile range: 6-12 months). A longer duration of homelessness was associated with lower odds of VL non-detectability (adjusted odds ratio = 0.71 per six-month period of homelessness, 95% confidence interval: 0.60-0.83) after adjustment for age, ancestry, drug use patterns, engagement in addiction treatment, and other potential confounders. Longer durations of episodes of homelessness in this cohort of HIV-positive illicit drug users were associated with a lower likelihood of plasma VL non-detectability. Our findings suggest that interventions that seek to promptly house homeless individuals, such as Housing First approaches, might assist in maximizing the clinical and public health benefits of antiretroviral therapy among people living with HIV/AIDS.
Parashar, Deepak; Kabra, Sushil K; Lodha, Rakesh; Singh, Varinder; Mukherjee, Aparna; Arya, Tina; Grewal, Harleen M S; Singh, Sarman
2013-06-01
The microbiological confirmation of pulmonary tuberculosis in children relies on cultures of gastric aspirate (GA) specimens. Conventionally, GAs are neutralized to improve culture yields of mycobacteria. However, there are limited data to support this practice. To study the utility of neutralization of GAs with sodium bicarbonate in children with intrathoracic tuberculosis, a total of 116 children of either sex, aged 6 months to 14 years (median age, 120 months; interquartile range [IQR], 7 to 192 months), underwent gastric aspiration on 2 consecutive days. Gastric aspirates were divided into two aliquots, and only one aliquot was neutralized with 1% sodium bicarbonate. Both aliquots were processed for smear and culture examinations. Out of the 232 gastric aspirates, 12 (5.17%) were acid-fast bacilli (AFB) smear positive. There were no differences in smear positivity rates from samples with or without neutralization. The yield of Mycobacterium tuberculosis on a Bactec MGIT 960 culture system was significantly lower in the neutralized samples (16.3% [38/232]) than in the nonneutralized samples (21.5% [50/232]) (P = 0.023). There was no significant difference between the neutralized and the nonneutralized samples in time to detection using the MGIT 960 system (average, 24.6 days; IQR, 12 to 37 days) (P = 0.9). The contamination rates were significantly higher in the neutralized samples than in the nonneutralized samples (17.2% [40/232] versus 3.9% [9/232]) (P = 0.001). The agreement for positive mycobacterial culture between the two approaches was 66.5% (P = 0.001). Hence, we recommend that gastric aspirate samples not be neutralized with sodium bicarbonate prior to culture for M. tuberculosis.
Balestre, Eric; Eholié, Serge P; Lokossue, Amani; Sow, Papa Salif; Charurat, Man; Minga, Albert; Drabo, Joseph; Dabis, François; Ekouevi, Didier K; Thiébaut, Rodolphe
2012-05-15
To assess the effect of aging on the immunological response to antiretroviral therapy (ART) in the West African context. The change in CD4 T-cell count was analysed according to age at the time of ART initiation among HIV-infected patients enrolled in the International epidemiological Database to Evaluate AIDS (IeDEA) Collaboration in the West African region. CD4 gain over 12 months of ART was estimated using linear mixed models. Models were adjusted for baseline CD4 cell count, sex, baseline clinical stage, calendar period and ART regimen. The total number of patients included was 24,107, contributing 50,893 measures of CD4 cell count in the first year of ART. The baseline median CD4 cell count was 144 cells/μl [interquartile range (IQR) 61-235]; median CD4 cell count reached 310 cells/μl (IQR 204-443) after 1 year of ART. The median age at treatment initiation was 36.3 years (10th-90th percentiles = 26.5-50.1). In adjusted analysis, the mean CD4 gain was significantly higher in younger patients (P < 0.0001). At 12 months, patients aged below 30 years recovered an additional 22 cells/μl on average [95% confidence interval (CI) 2-43] compared with patients aged at least 50 years. Among HIV-infected adults in West Africa, the immunological response after 12 months of ART was significantly poorer in elderly patients. As the population of treated patients is likely to get older, the impact of this age effect on immunological response to ART may increase over time.
Antibiotics Are the Most Commonly Identified Cause of Perioperative Hypersensitivity Reactions.
Kuhlen, James L; Camargo, Carlos A; Balekian, Diana S; Blumenthal, Kimberly G; Guyer, Autumn; Morris, Theresa; Long, Aidan; Banerji, Aleena
2016-01-01
Hypersensitivity reactions (HSRs) during the perioperative period are unpredictable and can be life threatening. Prospective studies for the evaluation of perioperative HSRs are lacking, and data on causative agents vary between studies. The objective of this study was to prospectively determine the success of a comprehensive allergy evaluation plan for patients with HSRs during anesthesia, including identification of a causative agent and outcomes during subsequent anesthesia exposure. All patients referred for a perioperative HSR from a Boston teaching hospital between November 2013 and March 2015 were evaluated using a standardized protocol with skin testing (ST) within 6 months of the HSR. The comprehensive allergy evaluation included collection of patient information, such as the characteristics of the HSR during anesthesia. We reviewed the results of ST and/or test doses for all potential causative medications. Event-related tryptase levels were reviewed when available. Over 17 months, 25 patients completed the comprehensive allergy evaluation. Fifty-two percent (13 of 25) were female, with a median age of 52 years (interquartile range 43-66). The most frequently involved organ systems were cutaneous (68%), cardiovascular (64%), and pulmonary (24%). A culprit drug, defined as a positive ST, was identified in 36% (9 of 25) of patients. The most common agent identified was cefazolin (6 of 9). After our comprehensive evaluation and management plan, 7 of 8 patients (88%) tolerated subsequent anesthesia. Cefazolin was the most commonly identified cause of a perioperative HSR in our study population. Skin testing patients within 6 months of a perioperative HSR may improve the odds of finding a positive result. Tolerance of subsequent anesthesia is generally achieved in patients undergoing our comprehensive evaluation.
Li, Linda C; Maetzel, Andreas; Davis, Aileen M; Lineker, Sydney C; Bombardier, Claire; Coyte, Peter C
2006-06-15
To estimate the incremental cost-effectiveness (ICE) of services from a primary therapist compared with traditional physical therapists and/or occupational therapists for managing rheumatoid arthritis (RA), from the societal perspective. Patients with RA were randomly assigned to the primary therapist model (PTM) or traditional treatment model (TTM) for approximately 6 weeks of rehabilitation treatment. Health outcomes were expressed in terms of quality-adjusted life years (QALYs), measured with the EuroQol instrument at baseline, 6 weeks, and 6 months. Direct and indirect costs, including visits to health professionals, use of investigative tests, hospital visits, use of medications, purchases of adaptive aids, and productivity losses incurred by patients and their caregivers, were collected monthly. Of 144 consenting patients, 111 remained in the study after the baseline assessment: 63 PTM (87.3% women, mean age 54.2 years, disease duration 10.6 years) and 48 TTM (79.2% women, mean age 56.8 years, disease duration 13.2 years). From a societal perspective, PTM generated higher QALYs (mean ± SD 0.068 ± 0.22) and resulted in a higher mean cost ($6,848 Canadian, interquartile range [IQR] $1,984-$9,320) compared with TTM (mean ± SD QALY -0.017 ± 0.24; mean cost $6,266, IQR $1,938-$10,194) over 6 months, although the differences were not statistically significant. The estimated ICE ratio was $13,700 per QALY gained (95% nonparametric confidence interval -$73,500, $230,000). The PTM has the potential to be an alternative to traditional physical/occupational therapy, although it is premature to recommend widespread use of this model in other regions. Further research should focus on strategies to reduce costs of the model and assess the long-term economic consequences in managing RA and other rheumatologic conditions.
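For readers unfamiliar with the incremental cost-effectiveness ratio used above, it is the difference in mean costs divided by the difference in mean QALYs between the two arms. The sketch below (Python) plugs the unadjusted summary figures from the abstract into that formula; it does not reproduce the published $13,700 per QALY, which reflects the authors' full analysis (the nonparametric confidence interval suggests bootstrapping), and is for illustration only.

    # Illustration only: naive ICER computed from the unadjusted summary means above.
    def icer(cost_new, cost_old, qaly_new, qaly_old):
        """Incremental cost-effectiveness ratio: delta cost / delta QALY."""
        return (cost_new - cost_old) / (qaly_new - qaly_old)

    print(f"Naive ICER (PTM vs TTM): ${icer(6848, 6266, 0.068, -0.017):,.0f} per QALY gained")
    # -> roughly $6,847 per QALY from the crude means; the published estimate,
    #    based on the authors' full analysis, was $13,700 per QALY.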