Adil, Eelam; Robson, Caroline; Perez-Atayde, Antonio; Heffernan, Colleen; Moritz, Ethan; Goumnerova, Liliana; Rahbar, Reza
2016-09-01
To describe our experience and current management approach for congenital nasal neuroglial heterotopia (NGH) and encephaloceles. Retrospective chart review at a tertiary pediatric hospital from 1970 to 2013. Thirty patients met inclusion criteria: 21 NGH and 9 encephaloceles. Data including demographics, pathology, imaging modality, surgical approach, resection extent, outcomes, and complications were analyzed. Fourteen NGH patients (67%) presented with an internal nasal mass and nasal obstruction. Three patients (14%) presented with an external nasal mass and four (19%) had a mixed lesion. Median age at surgery was 0.51 years (interquartile range 1.32 years). Thirteen (62%) underwent an intranasal endoscopic approach. Median operative time was 1.6 hours (interquartile range 1.2 hours), and there were no major complications. Nine patients with encephalocele were identified: six (67%) presented with transethmoidal encephaloceles, two (22%) presented with nasoethmoidal encephaloceles, and one (11%) presented with a nasofrontal lesion. The median age at surgery was 1.25 years (interquartile range 1.4 years). All patients required a craniotomy for intracranial extension. Median operative time was 5 hours (interquartile range 1.9 hours), and eight patients (88%) had a total resection. Length of stay ranged from 3 to 14 days. Nasal neuroglial heterotopia and encephaloceles are very rare lesions that require multidisciplinary evaluation and management. At our institution, there has been a shift to magnetic resonance imaging alone for the evaluation of NGH to avoid radiation exposure. Endoscopic extracranial resection is feasible for most intranasal and mixed NGH without an increase in operative time, residual disease, or complications. Level of Evidence: 4. Laryngoscope, 126:2161-2167, 2016. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.
Prabhu, Malavika; Clapp, Mark A; McQuaid-Hanson, Emily; Ona, Samsiya; O'Donnell, Taylor; James, Kaitlyn; Bateman, Brian T; Wylie, Blair J; Barth, William H
2018-07-01
To evaluate whether a liposomal bupivacaine incisional block decreases postoperative pain and represents an opioid-minimizing strategy after scheduled cesarean delivery. In a single-blind, randomized controlled trial among opioid-naive women undergoing cesarean delivery, liposomal bupivacaine or placebo was infiltrated into the fascia and skin at the surgical site, before fascial closure. The primary outcome was the pain score with movement at 48 hours postoperatively, measured on an 11-point numeric rating scale. A sample size of 40 women per group was needed to detect a 1.5-point reduction in pain score in the intervention group. Pain scores and opioid consumption, in oral morphine milligram equivalents, at 48 hours postoperatively were summarized as medians (interquartile range) and compared using the Wilcoxon rank-sum test. Between March and September 2017, 249 women were screened, 103 women enrolled, and 80 women were randomized. One woman in the liposomal bupivacaine group was excluded after randomization as a result of a vertical skin incision, leaving 39 patients in the liposomal bupivacaine group and 40 in the placebo group. Baseline characteristics between groups were similar. The median (interquartile range) pain score with movement at 48 hours postoperatively was 4 (2-5) in the liposomal bupivacaine group and 3.5 (2-5.5) in the placebo group (P=.72). The median (interquartile range) opioid use was 37.5 (7.5-60) morphine milligram equivalents in the liposomal bupivacaine group and 37.5 (15-75) morphine milligram equivalents in the placebo group during the first 48 hours postoperatively (P=.44). Compared with placebo, a liposomal bupivacaine incisional block at the time of cesarean delivery resulted in similar pain scores in the first 48 hours after surgery. ClinicalTrials.gov, NCT02959996.
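The opioid totals above are reported in oral morphine milligram equivalents (MME). As a minimal sketch of how such totals are computed, assuming a standard published conversion table (the abstract does not state which table the study used):

```python
# Minimal sketch: summing opioid doses as oral morphine milligram
# equivalents (MME). The conversion factors below are commonly published
# values and are an assumption; the study's exact table is not given.
MME_FACTORS = {
    "oral_morphine": 1.0,
    "oral_oxycodone": 1.5,
    "oral_hydromorphone": 4.0,
    "oral_hydrocodone": 1.0,
}

def total_mme(doses):
    """doses: iterable of (drug_name, milligrams) pairs."""
    return sum(mg * MME_FACTORS[drug] for drug, mg in doses)

# Hypothetical 48-hour consumption: 20 mg oxycodone + 7.5 mg morphine.
print(total_mme([("oral_oxycodone", 20), ("oral_morphine", 7.5)]))  # 37.5
```

The example happens to total 37.5 MME, the median reported in both study arms.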
Nishisaki, Akira; Pines, Jesse M; Lin, Richard; Helfaer, Mark A; Berg, Robert A; Tenhave, Thomas; Nadkarni, Vinay M
2012-07-01
Attending physicians are only required to provide in-hospital coverage during daytime hours in many pediatric intensive care units. An in-hospital 24-hr pediatric intensive care unit attending coverage model has become increasingly popular, but the impact of 24-hr, in-hospital attending coverage on care processes and outcomes has not been reported. We compared processes of care and outcomes before and after the implementation of a 24-hr in-hospital pediatric intensive care unit attending physician model. Retrospective comparison of before and after cohorts. A single large, academic tertiary medical/surgical pediatric intensive care unit. Pediatric intensive care unit admissions in 2000-2006. Transition in January 2004 from a 12-hr to a 24-hr in-hospital pediatric critical care attending physician coverage model. A total of 18,702 patients were admitted to the intensive care unit: 8,520 during 24-hr coverage and 10,182 during 12-hr coverage. Duration of mechanical ventilation was lower (median 33 hrs [interquartile range 12-88] vs. 48 hrs [interquartile range 16-133], adjusted reduction of 35% [95% confidence interval 25%-44%], p < .001) and intensive care unit length of stay was shorter (median 2 days [interquartile range 1-4] vs. 2 days [interquartile range 1-5], adjusted p < .001) for 24-hr vs. 12-hr coverage. The reduction in mechanical ventilation hours was similar when noninvasive mechanical ventilation was included in ventilation hours (median 42 hrs vs. 56 hrs, adjusted reduction in ventilation hours: 33% [95% confidence interval 20-45], p < .001). Intensive care unit mortality was not significantly different (2.2% vs. 2.5%, adjusted p = .23). These associations were consistent across daytime and nighttime admissions, weekend and weekday admissions, and among subgroups with higher Pediatric Risk of Mortality III scores, postsurgical patients, and histories of previous intensive care unit admission. Implementation of 24-hr in-hospital pediatric critical care attending coverage was associated with shorter duration of mechanical ventilation and shorter length of intensive care unit stay. After accounting for potential confounders, this finding was consistent across a broad spectrum of critically ill children.
Yorifuji, Takashi; Suzuki, Etsuji; Kashima, Saori
2014-08-13
Epidemiological studies have shown adverse effects of short-term exposure to air pollution on respiratory disease outcomes; however, few studies have examined this association on an hourly time scale. We evaluated the associations between hourly changes in air pollution and the risk of respiratory disease in the elderly, using the time of the emergency call as the disease onset for each case. We used a time-stratified case-crossover design. Study participants were 6,925 residents of the city of Okayama, Japan, aged 65 or above who were taken to hospital emergency rooms between January 2006 and December 2010 for onset of respiratory disease. We calculated city-representative hourly average concentrations of air pollutants from several monitoring stations. Using conditional logistic regression models, we estimated odds ratios per interquartile-range increase in each pollutant by exposure period prior to the emergency call, adjusting for hourly ambient temperature, hourly relative humidity, and weekly numbers of reported influenza cases among those aged ≥60. Suspended particulate matter (SPM) exposure 24 to <72 hours prior to onset and ozone exposure 48 to <96 hours prior to onset were associated with an increased risk of respiratory disease. For example, following one interquartile-range increase, odds ratios were 1.05 (95% confidence interval: 1.01, 1.09) for SPM exposure 24 to <48 hours prior to onset and 1.13 (95% confidence interval: 1.04, 1.23) for ozone exposure 72 to <96 hours prior to onset. Sulfur dioxide (SO2) exposure 0 to <24 hours prior to onset was associated with an increased risk of pneumonia and influenza: the odds ratio was 1.07 per one interquartile-range increase (95% confidence interval: 1.00, 1.14). The elevated risk for pneumonia and influenza with SO2 was observed at shorter lags (i.e., 8-18 hours) than the elevated risks for respiratory disease with SPM or ozone. Overall, the effect estimates for chronic obstructive pulmonary disease and allied conditions were equivocal. This study provides further evidence that hourly changes in air pollution exposure increase the risks of respiratory disease, and that SO2 may be related to a more immediate onset of disease than other pollutants.
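The odds ratios above are expressed per interquartile-range (IQR) increase in each pollutant. A minimal sketch of the scaling step, assuming the conditional logistic regression yields a log-odds coefficient per unit of exposure (the coefficient and IQR below are illustrative, not the study's values):

```python
import math

def or_per_iqr(beta_per_unit, iqr):
    # Scale a per-unit log-odds coefficient to an odds ratio per IQR increase.
    return math.exp(beta_per_unit * iqr)

beta_spm = 0.00163  # hypothetical log-odds per ug/m3 of SPM
iqr_spm = 30.0      # hypothetical IQR of SPM, ug/m3
print(round(or_per_iqr(beta_spm, iqr_spm), 2))  # 1.05, matching the reported OR
```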
Weiss, Scott L; Fitzgerald, Julie C; Balamuth, Fran; Alpern, Elizabeth R; Lavelle, Jane; Chilutti, Marianne; Grundmeier, Robert; Nadkarni, Vinay M; Thomas, Neal J
2014-11-01
Delayed antimicrobials are associated with poor outcomes in adult sepsis, but data relating antimicrobial timing to mortality and organ dysfunction in pediatric sepsis are limited. We sought to determine the impact of antimicrobial timing on mortality and organ dysfunction in pediatric patients with severe sepsis or septic shock. Retrospective observational study. PICU at an academic medical center. One hundred thirty patients treated for severe sepsis or septic shock. None. We determined whether hourly delays from sepsis recognition to initial and first appropriate antimicrobial administration were associated with PICU mortality (primary outcome); ventilator-free, vasoactive-free, and organ failure-free days; and length of stay. Median time from sepsis recognition to initial antimicrobial administration was 140 minutes (interquartile range, 74-277 min) and to first appropriate antimicrobial was 177 minutes (90-550 min). An escalating risk of mortality was observed with each hour delay from sepsis recognition to antimicrobial administration, although this did not achieve significance until 3 hours. For patients with more than 3-hour delay to initial and first appropriate antimicrobials, the odds ratio for PICU mortality was 3.92 (95% CI, 1.27-12.06) and 3.59 (95% CI, 1.09-11.76), respectively. These associations persisted after adjustment for individual confounders and a propensity score analysis. After controlling for severity of illness, the odds ratio for PICU mortality increased to 4.84 (95% CI, 1.45-16.2) and 4.92 (95% CI, 1.30-18.58) for more than 3-hour delay to initial and first appropriate antimicrobials, respectively. Initial antimicrobial administration delayed more than 3 hours after sepsis recognition was also associated with fewer organ failure-free days (16 [interquartile range, 1-23] vs 20 [interquartile range, 6-26]; p = 0.04). Delayed antimicrobial therapy was an independent risk factor for mortality and prolonged organ dysfunction in pediatric sepsis.
Hemoglobin Levels Across the Pediatric Critical Care Spectrum: A Point Prevalence Study.
Hassan, Nabil E; Reischman, Diann E; Fitzgerald, Robert K; Faustino, Edward Vincent S
2018-05-01
To determine the prevailing hemoglobin levels in PICU patients and any potential correlates. Post hoc analysis of prospective multicenter observational data. Fifty-nine PICUs in seven countries. PICU patients on four specific days in 2012. None. Patients' hemoglobin and other clinical and institutional data. Two thousand three hundred eighty-nine patients with median age of 1.9 years (interquartile range, 0.3-9.8 yr), weight 11.5 kg (interquartile range, 5.4-29.6 kg), and preceding PICU stay of 4.0 days (interquartile range, 1.0-13.0 d) were included. Their median hemoglobin was 11.0 g/dL (interquartile range, 9.6-12.5 g/dL). The prevalence of transfusion in the 24 hours preceding data collection was 14.2%. Neonates had the highest hemoglobin at 13.1 g/dL (interquartile range, 11.2-15.0 g/dL) compared with other age groups (p < 0.001). Overall, 31.3% of the patients had hemoglobin of greater than or equal to 12 g/dL, and 1.1% had hemoglobin of less than 7 g/dL. Blacks had lower median hemoglobin (10.5; interquartile range, 9.3-12.1 g/dL) compared with whites (median, 11.1; interquartile range, 9.0-12.6; p < 0.001). Patients in Spain and Portugal had the highest median hemoglobin (11.4; interquartile range, 10.0-12.6) compared with other regions outside of the United States (p < 0.001), and the highest proportion (31.3%) of transfused patients compared with all regions (p < 0.001). Patients in cardiac PICUs had higher median hemoglobin than those in mixed PICUs or noncardiac PICUs (12.3, 11.0, and 10.6 g/dL, respectively; p < 0.001). Cyanotic heart disease patients had the highest median hemoglobin (12.6 g/dL; interquartile range, 11.1-14.5). Multivariable regression analysis within diagnosis groups revealed that hemoglobin levels were significantly associated with geographic location and history of complex cardiac disease in most of the models. In children with cancer, none of the variables tested correlated with patients' hemoglobin levels. Patients' hemoglobin levels correlated with demographic factors such as age, race, and geographic location, and with cardiac disease, but no such correlates were found in cancer patients. Future investigations should account for the effects of these variables.
Appropriate working hours for surgical training according to Australasian trainees.
O'Grady, Gregory; Harper, Simon; Loveday, Benjamin; Adams, Brandon; Civil, Ian D; Peters, Matthew
2012-04-01
The demands of surgical training, learning and service delivery compete with the need to minimize fatigue and maintain an acceptable lifestyle. The optimal balance of working hours is uncertain. This study aimed to define the appropriate hours to meet these requirements according to trainees. All Australian and New Zealand surgical trainees were surveyed. Roster structures, weekly working hours and weekly 'sleep loss hours' (hours of sleep lost below 8 per night) because of 24-h calls were defined. These work practices were then correlated with sufficiency of training time, time for study, fatigue and its impacts, and work-life balance preferences. Multivariate and univariate analyses were performed. The response rate was 55.3%, with responders representative of the total trainee body. Trainees who worked a median of 60 h/week (interquartile range: 55-65) considered their work hours to be appropriate for 'technical' and 'non-technical' training needs, compared with 55 h/week (interquartile range: 50-60) regarded as appropriate for study/research needs. Working ≥65 h/week, or accruing ≥5.5 weekly 'sleep loss hours', was associated with increased fatigue, reduced ability to study, more frequent dozing while driving and impaired concentration at work. Trainees who considered they had an appropriate work-life balance worked a median of 55 h/week. Approximately 60 h/week proved an appropriate balance of working hours for surgical training, although study and lifestyle demands are better met at around 55 h/week. Sleep loss is an important determinant of fatigue and its impacts, and work hours should not be considered in isolation. © 2012 The Authors. ANZ Journal of Surgery © 2012 Royal Australasian College of Surgeons.
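A minimal sketch of the weekly 'sleep loss hours' measure as defined above (hours of sleep lost below 8 per night because of 24-h calls); the nightly sleep values are hypothetical:

```python
def weekly_sleep_loss(nightly_sleep_hours):
    # Sum the shortfall below 8 hours across each night of the week.
    return sum(max(0.0, 8.0 - h) for h in nightly_sleep_hours)

week = [7, 8, 3, 8, 6, 8, 8]  # hypothetical week with a 24-h call on night 3
print(weekly_sleep_loss(week))  # 8.0, above the >=5.5 h/week fatigue threshold
```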
Cheuk, Queenie K Y; Lo, T K; Lee, C P; Yeung, Anita P C
2015-06-01
To evaluate the efficacy and safety of double balloon catheter for induction of labour in Chinese women with one previous caesarean section and unfavourable cervix at term. Retrospective cohort study. A regional hospital in Hong Kong. Women with previous caesarean delivery requiring induction of labour at term and with an unfavourable cervix from May 2013 to April 2014. The primary outcome was the rate of successful vaginal delivery (spontaneous or instrument-assisted) using double balloon catheter. Secondary outcomes were double balloon catheter induction-to-delivery and removal-to-delivery interval; cervical score improvement; oxytocin augmentation; maternal or fetal complications during cervical ripening, intrapartum and postpartum period; and risk factors associated with unsuccessful induction. All 24 Chinese women tolerated double balloon catheter well. After double balloon catheter expulsion or removal, the cervix successfully ripened in 18 (75%) cases. The median improvement in Bishop score was 3 (interquartile range, 2-4), which was statistically significant (P<0.001). Overall, 18 (75%) cases were delivered vaginally. The median insertion-to-delivery and removal-to-delivery intervals were 19 (interquartile range, 13.4-23.0) hours and 6.9 (interquartile range, 4.1-10.8) hours, respectively. The interval to delivery was statistically significantly shorter in cases with spontaneous balloon expulsion or spontaneous membrane rupture during ripening than in those without (3.0 vs 7.8 hours; P=0.025). There were no major maternal or neonatal complications. The only factor significantly associated with failed vaginal birth after caesarean was previous caesarean section for failure to progress (P<0.001). This is the first study using double balloon catheter for induction of labour in Asian Chinese women with previous caesarean section. Using double balloon catheter, we achieved a vaginal birth after caesarean rate of 75% without major complications.
The New MIRUS System for Short-Term Sedation in Postsurgical ICU Patients.
Romagnoli, Stefano; Chelazzi, Cosimo; Villa, Gianluca; Zagli, Giovanni; Benvenuti, Francesco; Mancinelli, Paola; Arcangeli, Giulio; Dugheri, Stefano; Bonari, Alessandro; Tofani, Lorenzo; Belardinelli, Andrea; De Gaudio, A Raffaele
2017-09-01
To evaluate the feasibility and safety of the MIRUS system (Pall International, Sarl, Fribourg, Switzerland) for sedation with sevoflurane for postsurgical ICU patients and to evaluate atmospheric pollution during sedation. Prospective interventional study. Surgical ICU. February 2016 to December 2016. Postsurgical patients requiring ICU admission, mechanical ventilation, and sedation. Sevoflurane was administered with the MIRUS system targeted to a Richmond Agitation Sedation Scale from -3 to -5 by adaptation of minimum alveolar concentration. Data collected included Richmond Agitation Sedation Scale, minimum alveolar concentration, inspired and expired sevoflurane fraction, wake-up times, duration of sedation, sevoflurane consumption, respiratory and hemodynamic data, Simplified Acute Physiology Score II, Sepsis-related Organ Failure Assessment, and laboratory data and biomarkers of organ injury. Atmospheric pollution was monitored at different sites: before sevoflurane delivery (baseline) and during sedation with the probe 15 cm from the MIRUS system (S1) and 15 cm from the filter-Reflector group (S2). Sixty-two patients were enrolled in the study. No technical failure occurred. Median Richmond Agitation Sedation Scale was -4.5 (interquartile range, -5 to -3.6) with sevoflurane delivered at a median minimum alveolar concentration of 0.45% (interquartile range, 0.4-0.53), yielding mean inspiratory and expiratory concentrations of 0.79% (SD, 0.24) and 0.76% (SD, 0.18), respectively. Median awakening time was 4 minutes (2.2-5 min). Median duration of sevoflurane administration was 3.33 hours (2.33-5.75 hr), range 1-19 hours, with a mean consumption of 7.89 mL/hr (SD, 2.99). Hemodynamics remained stable over the study period, and no laboratory data indicated liver or kidney injury or dysfunction. Median sevoflurane room air concentration was 0.10 parts per million (interquartile range, 0.07-0.15), 0.17 parts per million (interquartile range, 0.14-0.27), and 0.15 parts per million (interquartile range, 0.07-0.19) at baseline, S1, and S2, respectively. The MIRUS system is a promising and safe alternative for short-term sedation of ICU patients with sevoflurane. Atmospheric pollution was largely below the recommended thresholds (<5 parts per million). Studies extended to more heterogeneous populations of patients undergoing longer durations of sedation are needed to confirm these observations.
Antimalarial Activity of KAF156 in Falciparum and Vivax Malaria.
White, Nicholas J; Duong, Tran T; Uthaisin, Chirapong; Nosten, François; Phyo, Aung P; Hanboonkunupakarn, Borimas; Pukrittayakamee, Sasithon; Jittamala, Podjanee; Chuthasmit, Kittiphum; Cheung, Ming S; Feng, Yiyan; Li, Ruobing; Magnusson, Baldur; Sultan, Marc; Wieser, Daniela; Xun, Xiaolei; Zhao, Rong; Diagana, Thierry T; Pertel, Peter; Leong, F Joel
2016-09-22
KAF156 belongs to a new class of antimalarial agents (imidazolopiperazines), with activity against asexual and sexual blood stages and the preerythrocytic liver stages of malarial parasites. We conducted a phase 2, open-label, two-part study at five centers in Thailand and Vietnam to assess the antimalarial efficacy, safety, and pharmacokinetic profile of KAF156 in adults with acute Plasmodium vivax or P. falciparum malaria. Assessment of parasite clearance rates in cohorts of patients with vivax or falciparum malaria who were treated with multiple doses (400 mg once daily for 3 days) was followed by assessment of the cure rate at 28 days in a separate cohort of patients with falciparum malaria who received a single dose (800 mg). Median parasite clearance times were 45 hours (interquartile range, 42 to 48) in 10 patients with falciparum malaria and 24 hours (interquartile range, 20 to 30) in 10 patients with vivax malaria after treatment with the multiple-dose regimen and 49 hours (interquartile range, 42 to 54) in 21 patients with falciparum malaria after treatment with the single dose. Among the 21 patients who received the single dose and were followed for 28 days, 1 had reinfection and 7 had recrudescent infections (cure rate, 67%; 95% credible interval, 46 to 84). The mean (±SD) KAF156 terminal elimination half-life was 44.1±8.9 hours. There were no serious adverse events in this small study. The most common adverse events included sinus bradycardia, thrombocytopenia, hypokalemia, anemia, and hyperbilirubinemia. Vomiting of grade 2 or higher occurred in 2 patients, 1 of whom discontinued treatment because of repeated vomiting after receiving the single 800-mg dose. More adverse events were reported in the single-dose cohort, which had longer follow-up, than in the multiple-dose cohorts. KAF156 showed antimalarial activity without evident safety concerns in a small number of adults with uncomplicated P. vivax or P. falciparum malaria. (Funded by Novartis and others; ClinicalTrials.gov number, NCT01753323.).
Berney, Susan C; Rose, Joleen W; Bernhardt, Julie; Denehy, Linda
2015-08-01
Critical illness can result in impaired physical function. Increased physical activity, additional to rehabilitation, has demonstrated improved functional independence at hospital discharge. The purpose of this study was to measure patterns of physical activity in a group of critically ill patients. This was a single-center, open, observational behavioral mapping study performed in a quaternary intensive care unit (ICU) in Melbourne, Australia. Observations were collected every 10 minutes for 8 hours between 8:00 am and 5:00 pm, recording the highest level of physical activity, patient location, and persons present at the bedside. Two thousand fifty observations were collected across 8 days. Patients spent more than 7 hours in bed (median [interquartile range] of 100% [69%-100%]), participating in little or no activity for approximately 7 hours of the day (median [interquartile range] 96% [76%-96%]). Outside rehabilitation, no activities associated with ambulation were undertaken. Patients who were ventilated at the time of observation were less likely than those who were not to be out of bed (98% reduction in odds). Patients spent up to 30% of their time alone. Outside rehabilitation, patients in ICU are inactive and spend approximately one-third of the 8-hour day alone. Strategies to increase physical activity levels in ICU are required. Copyright © 2015 Elsevier Inc. All rights reserved.
Uber, Amy J; Perman, Sarah M; Cocchi, Michael N; Patel, Parth V; Ganley, Sarah E; Portmann, Jocelyn M; Donnino, Michael W; Grossestreuer, Anne V
2018-04-03
To assess whether the amount of heat generated by postcardiac arrest patients before reaching target temperature (Ttarget) during targeted temperature management is associated with outcomes, by serving as a proxy for thermoregulatory ability, and whether it modifies the relationship between time to Ttarget and outcomes. Retrospective cohort study. Urban tertiary-care hospital. Successfully resuscitated targeted temperature management-treated adult postarrest patients between 2008 and 2015 with serial temperature data and Ttarget less than or equal to 34°C. None. Time to Ttarget was defined as time from targeted temperature management initiation to first recorded patient temperature less than or equal to 34°C. Patient heat generation ("heat units") was calculated as (1/average water temperature) × (hours between initiation and Ttarget) × 100. Primary outcome was neurologic status measured by Cerebral Performance Category score; secondary outcome was survival, both at hospital discharge. Univariate analyses were performed using Wilcoxon rank-sum tests; multivariate analyses used logistic regression. Of 203 patients included, those with Cerebral Performance Category score 3-5 generated less heat before reaching Ttarget (median, 8.1 heat units [interquartile range, 3.6-21.6 heat units] vs median, 20.0 heat units [interquartile range, 9.0-33.5 heat units]; p = 0.001) and reached Ttarget more quickly (median, 2.3 hr [interquartile range, 1.5-4.0 hr] vs median, 3.6 hr [interquartile range, 2.0-5.0 hr]; p = 0.01) than patients with Cerebral Performance Category score 1-2. Nonsurvivors generated less heat than survivors (median, 8.1 heat units [interquartile range, 3.6-20.8 heat units] vs median, 19.0 heat units [interquartile range, 6.5-33.5 heat units]; p = 0.001) and reached Ttarget more quickly (median, 2.2 hr [interquartile range, 1.5-3.8 hr] vs median, 3.6 hr [interquartile range, 2.0-5.0 hr]; p = 0.01). Controlling for average water temperature between initiation and Ttarget, the relationship between outcomes and time to Ttarget was no longer significant. Controlling for location, witnessed arrest, age, initial rhythm, and neuromuscular blockade use, increased heat generation was associated with better neurologic (adjusted odds ratio, 1.01 [95% CI, 1.00-1.03]; p = 0.039) and survival (adjusted odds ratio, 1.01 [95% CI, 1.00-1.03]; p = 0.045) outcomes. Increased heat generation during targeted temperature management initiation is associated with better outcomes at hospital discharge and may affect the relationship between time to Ttarget and outcomes.
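A minimal sketch of the "heat units" definition stated above, (1/average water temperature) × hours to Ttarget × 100; the input values are hypothetical:

```python
def heat_units(avg_water_temp_c, hours_to_target):
    # Inverse of average circulating water temperature (degrees C), times
    # hours from initiation to first temperature <= 34 C, times 100.
    return (1.0 / avg_water_temp_c) * hours_to_target * 100.0

# Hypothetical patient cooled with 10 C average water over 2 hours:
print(heat_units(10.0, 2.0))  # 20.0 heat units, near the survivors' median
```

Colder circulating water and a longer time to target both raise the score, consistent with its use as a proxy for how hard the device worked against the patient's own heat production.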
Rosa, Regis Goulart; Tonietto, Tulio Frederico; da Silva, Daiana Barbosa; Gutierres, Franciele Aparecida; Ascoli, Aline Maria; Madeira, Laura Cordeiro; Rutzen, William; Falavigna, Maicon; Robinson, Caroline Cabral; Salluh, Jorge Ibrain; Cavalcanti, Alexandre Biasi; Azevedo, Luciano Cesar; Cremonese, Rafael Viegas; Haack, Tarissa Ribeiro; Eugênio, Cláudia Severgnini; Dornelles, Aline; Bessel, Marina; Teles, José Mario Meira; Skrobik, Yoanna; Teixeira, Cassiano
2017-10-01
To evaluate the effect of an extended visitation model compared with a restricted visitation model on the occurrence of delirium among ICU patients. Prospective single-center before-and-after study. Thirty-one-bed medical-surgical ICU. All patients greater than or equal to 18 years old with expected length of stay greater than or equal to 24 hours consecutively admitted to the ICU from May 2015 to November 2015. Change of visitation policy from a restricted visitation model (4.5 hr/d) to an extended visitation model (12 hr/d). Two hundred eighty-six patients were enrolled (141 restricted visitation model, 145 extended visitation model). The primary outcome was the cumulative incidence of delirium, assessed twice daily using the Confusion Assessment Method for the ICU. Predefined secondary outcomes included duration of delirium/coma; any ICU-acquired infection; ICU-acquired bloodstream infection, pneumonia, and urinary tract infection; all-cause ICU mortality; and length of ICU stay. The median duration of visits increased from 133 minutes (interquartile range, 97.7-162.0) in the restricted visitation model to 245 minutes (interquartile range, 175.0-272.0) in the extended visitation model (p < 0.001). Fourteen patients (9.6%) developed delirium in the extended visitation model compared with 29 (20.5%) in the restricted visitation model (adjusted relative risk, 0.50; 95% CI, 0.26-0.95). In comparison with restricted visitation model patients, extended visitation model patients had a shorter length of delirium/coma (1.5 d [interquartile range, 1.0-3.0] vs 3.0 d [interquartile range, 2.5-5.0]; p = 0.03) and ICU stay (3.0 d [interquartile range, 2.0-4.0] vs 4.0 d [interquartile range, 2.0-6.0]; p = 0.04). The rate of ICU-acquired infections and all-cause ICU mortality did not differ significantly between the two study groups. In this medical-surgical ICU, an extended visitation model was associated with reduced occurrence of delirium and shorter length of delirium/coma and ICU stay.
Tóth, Gábor; Sándor, Gábor László; Kleiner, Dénes; Szentmáry, Nóra; Kiss, Huba J; Blázovics, Anna; Nagy, Zoltán Zsolt
2016-11-01
Femtosecond laser is a revolutionary, innovative treatment method used in cataract surgery. To evaluate free radical quantity in the anterior chamber of the eye during femtosecond laser assisted capsulotomy in a porcine eye model. Seventy fresh porcine eyes were collected within 2 hours post mortem, transported at 4 °C and treated within 7 hours. Thirty-five eyes served as the control group and 35 as the femtosecond laser assisted capsulotomy group. A simple luminol-dependent chemiluminescence method was used to measure the total scavenger capacity in the aqueous humour, as an indicator of free radical production. The emitted photons were expressed in relative light unit %. The relative light unit % was lower in the control group (median 1%, interquartile range [0.4-3%]) than in the femtosecond laser assisted capsulotomy group (median 4.4%, interquartile range [1.5%-21%]) (p = 0.01). Femtosecond laser assisted capsulotomy decreases the antioxidant defense of the anterior chamber, which indicates significant free radical production during femtosecond laser assisted capsulotomy. Orv. Hetil., 2016, 157(47), 1880-1883.
Bucker, Amber; Boers, Anna M; Bot, Joseph C J; Berkhemer, Olvert A; Lingsma, Hester F; Yoo, Albert J; van Zwam, Wim H; van Oostenbrugge, Robert J; van der Lugt, Aad; Dippel, Diederik W J; Roos, Yvo B W E M; Majoie, Charles B L M; Marquering, Henk A
2017-05-01
Ischemic lesion volume (ILV) on noncontrast computed tomography at 1 week can be used as a secondary outcome measure in patients with acute ischemic stroke. Twenty-four-hour ILV on noncontrast computed tomography has greater availability and potentially allows earlier estimation of functional outcome. We aimed to assess lesion growth 24 hours after stroke onset and compare the associations of 24-hour and 1-week ILV with functional outcome. We included 228 patients from the MR CLEAN trial (Multicenter Randomized Clinical Trial of Endovascular Treatment for Acute Ischemic Stroke in the Netherlands), who received noncontrast computed tomography at 24-hour and 1-week follow-up on which ILV was measured. Relative and absolute lesion growth was determined. Logistic regression models were constructed including either the 24-hour or the 1-week ILV. Ordinal and dichotomous (0-2 and 3-6) modified Rankin Scale scores were, respectively, used as primary and secondary outcome measures. Median ILV was 42 mL (interquartile range, 21-95 mL) at 24 hours and 64 mL (interquartile range, 30-120 mL) at 1 week. Relative lesion growth exceeding 30% occurred in 121 patients (53%) and absolute lesion growth exceeding 20 mL occurred in 83 patients (36%). Both the 24-hour and 1-week ILVs were similarly significantly associated with functional outcome (both P<0.001). In the logistic analyses, the areas under the receiver-operator characteristic curves were similar: 0.85 (95% confidence interval, 0.80-0.90) and 0.87 (95% confidence interval, 0.82-0.91) for the models including the 24-hour and 1-week ILV, respectively. Growth of ILV beyond 24 hours after stroke onset is common. Nevertheless, the 24-hour ILV proved to be a valuable secondary outcome measure, as it is as strongly associated with functional outcome as the 1-week ILV. URL: http://www.isrctn.com. Unique identifier: ISRCTN10888758. © 2017 American Heart Association, Inc.
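A minimal sketch of the two lesion-growth definitions used above (absolute growth in mL and relative growth as a percentage of the 24-hour volume), applied here to the two reported median ILVs for illustration:

```python
def lesion_growth(ilv_24h_ml, ilv_1wk_ml):
    # Absolute growth in mL and relative growth as % of the 24-hour ILV.
    absolute = ilv_1wk_ml - ilv_24h_ml
    relative = absolute / ilv_24h_ml * 100.0
    return absolute, relative

abs_g, rel_g = lesion_growth(42.0, 64.0)  # the reported median ILVs
print(abs_g > 20.0, rel_g > 30.0)         # True True: exceeds both thresholds
```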
Al Jaaly, Emad; Fiorentino, Francesca; Reeves, Barnaby C; Ind, Philip W; Angelini, Gianni D; Kemp, Scott; Shiner, Robert J
2013-10-01
We compared the efficacy of noninvasive ventilation with bilevel positive airway pressure added to usual care versus usual care alone in patients undergoing coronary artery bypass grafting. We performed a 2-group, parallel, randomized controlled trial. The primary outcome was time until fit for discharge. Secondary outcomes were partial pressure of carbon dioxide, forced expiratory volume in 1 second, atelectasis, adverse events, duration of intensive care stay, and actual postoperative stay. A total of 129 patients were randomly allocated to bilevel positive airway pressure (n = 66) or usual care (n = 63). Three patients allocated to bilevel positive airway pressure withdrew. The median duration of bilevel positive airway pressure was 16 hours (interquartile range, 11-19). The median duration of hospital stay until fit for discharge was 5 days for the bilevel positive airway pressure group (interquartile range, 4-6) and 6 days for the usual care group (interquartile range, 5-7; hazard ratio, 1.68; 95% confidence interval, 1.08-2.31; P = .019). There was no significant difference in duration of intensive care, actual postoperative stay, and mean percentage of predicted forced expiratory volume in 1 second on day 3. Mean partial pressure of carbon dioxide was significantly reduced 1 hour after bilevel positive airway pressure application, but there was no overall difference between the groups up to 24 hours. Basal atelectasis occurred in 15 patients (24%) in the usual care group and 2 patients (3%) in the bilevel positive airway pressure group. Overall, 30% of patients in the bilevel positive airway pressure group experienced an adverse event compared with 59% in the usual care group. Among patients undergoing elective coronary artery bypass grafting, the use of bilevel positive airway pressure at extubation reduced the recovery time. Supported by trained staff, more than 75% of all patients allocated to bilevel positive airway pressure tolerated it for more than 10 hours. Copyright © 2013 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
McCully, Belinda H; Connelly, Christopher R; Fair, Kelly A; Holcomb, John B; Fox, Erin E; Wade, Charles E; Bulger, Eileen M; Schreiber, Martin A
2017-07-01
Altered coagulation function after trauma can contribute to development of venous thromboembolism (VTE). Severe trauma impairs coagulation function, but the trajectory for recovery is not known. We hypothesized that enhanced, early recovery of coagulation function increases VTE risk in severely injured trauma patients. Secondary analysis was performed on data from the Pragmatic Randomized Optimal Platelet and Plasma Ratio (PROPPR) trial, excluding patients who died within 24 hours or were on pre-injury anticoagulants. Patient characteristics, adverse outcomes, and parameters of platelet function and coagulation (thromboelastography) were compared from admission to 72 hours between VTE (n = 83) and non-VTE (n = 475) patients. A p value < 0.05 indicates significance. Despite similar patient demographics, VTE patients exhibited hypercoagulable thromboelastography parameters and enhanced platelet function at admission (p < 0.05). Both groups exhibited hypocoagulable thromboelastography parameters, platelet dysfunction, and suppressed clot lysis (low clot lysis at 30 minutes) 2 hours after admission (p < 0.05). The VTE patients exhibited delayed coagulation recovery (a significant change compared with 2 hours) of K-value (48 vs 24 hours), α-angle (no recovery), maximum amplitude (24 vs 12 hours), and clot lysis at 30 minutes (48 vs 12 hours). Platelet function recovery mediated by arachidonic acid (72 vs 4 hours), ADP (72 vs 12 hours), and collagen (48 vs 12 hours) was delayed in VTE patients. The VTE patients had lower mortality (4% vs 13%; p < 0.05), but fewer hospital-free days (0 days [interquartile range 0 to 8 days] vs 10 days [interquartile range 0 to 20 days]; p < 0.05) and higher complication rates (p < 0.05). Recovery from platelet dysfunction and coagulopathy after severe trauma were delayed in VTE patients. Suppressed clot lysis and compensatory mechanisms associated with altered coagulation that can potentiate VTE formation require additional investigation. Copyright © 2017 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
A Trial of Extending Hemodialysis Hours and Quality of Life.
Jardine, Meg J; Zuo, Li; Gray, Nicholas A; de Zoysa, Janak R; Chan, Christopher T; Gallagher, Martin P; Monaghan, Helen; Grieve, Stuart M; Puranik, Rajesh; Lin, Hongli; Eris, Josette M; Zhang, Ling; Xu, Jinsheng; Howard, Kirsten; Lo, Serigne; Cass, Alan; Perkovic, Vlado
2017-06-01
The relationship between increased hemodialysis hours and patient outcomes remains unclear. We randomized (1:1) 200 adult recipients of standard maintenance hemodialysis from in-center and home-based hemodialysis programs to extended weekly (≥24 hours) or standard (target 12-15 hours, maximum 18 hours) hemodialysis hours for 12 months. The primary outcome was change in quality of life from baseline assessed by the EuroQol 5-dimension, 3-level instrument (EQ-5D). Secondary outcomes included medication usage, clinical laboratory values, vascular access events, and change in left ventricular mass index. At 12 months, median weekly hemodialysis hours were 24.0 (interquartile range, 23.6-24.0) and 12.0 (interquartile range, 12.0-16.0) in the extended and standard groups, respectively. Change in EQ-5D score at study end did not differ between groups (mean difference, 0.04 [95% confidence interval, -0.03 to 0.11]; P=0.29). Extended hours were associated with lower phosphate and potassium levels and higher hemoglobin levels. Blood pressure (BP) did not differ between groups at study end. Extended hours were associated with fewer BP-lowering agents and phosphate-binding medications, but were not associated with erythropoietin dosing. In a substudy with 95 patients, we detected no difference between groups in left ventricular mass index (mean difference, -6.0 [95% confidence interval, -14.8 to 2.7] g/m2; P=0.18). Five deaths occurred in the extended group and two in the standard group (P=0.44); two participants in each group withdrew consent. Similar numbers of patients experienced vascular access events in the two groups. Thus, extending weekly hemodialysis hours did not alter overall EQ-5D quality of life score, but was associated with improvement in some laboratory parameters and reductions in medication burden. (Clinicaltrials.gov identifier: NCT00649298). Copyright © 2017 by the American Society of Nephrology.
England, Timothy J; Hedstrom, Amanda; O'Sullivan, Saoirse; Donnelly, Richard; Barrett, David A; Sarmad, Sarir; Sprigg, Nikola; Bath, Philip M
2017-05-01
Repeated episodes of limb ischemia and reperfusion (remote ischemic conditioning [RIC]) may improve outcome after acute stroke. We performed a pilot blinded placebo-controlled trial in patients with acute ischemic stroke, randomized 1:1 to 4 cycles of RIC or sham within 24 hours of ictus. The primary outcome was tolerability and feasibility. Secondary outcomes included safety, clinical efficacy (day 90), putative biomarkers (pre- and post-intervention, day 4), and exploratory hemodynamic measures. Twenty-six patients (13 RIC and 13 sham) were recruited 15.8 hours (SD 6.2) post-onset, age 76.2 years (SD 10.5), blood pressure 159/83 mm Hg (SD 25/11), and National Institutes of Health Stroke Scale (NIHSS) score 5 (interquartile range, 3.75-9.25). RIC was well tolerated, with 49 out of 52 cycles completed in full. Three patients in the sham group experienced vascular events (2 ischemic strokes and 2 myocardial infarcts) versus none in the RIC group (P=0.076, log-rank test). Compared with sham, there was a significant decrease in day 90 NIHSS score in the RIC group, median NIHSS score 1 (interquartile range, 0.5-5) versus 3 (interquartile range, 2-9.5; P=0.04); RIC augmented plasma HSP27 (heat shock protein 27; P<0.05, repeated 2-way ANOVA) and phosphorylated HSP27 (P<0.001) but not plasma S100-β, matrix metalloproteinase-9, endocannabinoids, or arterial compliance. RIC after acute stroke is well tolerated and appears safe and feasible. RIC may improve neurological outcome, and protective mechanisms may be mediated through HSP27. A larger trial is warranted. URL: http://www.isrctn.com. Unique identifier: ISRCTN86672015. © 2017 American Heart Association, Inc.
Cryoprecipitate use in the PROMMTT study.
Holcomb, John B; Fox, Erin E; Zhang, Xuan; White, Nathan; Wade, Charles E; Cotton, Bryan A; del Junco, Deborah J; Bulger, Eileen M; Cohen, Mitchell J; Schreiber, Martin A; Myers, John G; Brasel, Karen J; Phelan, Herb A; Alarcon, Louis H; Muskat, Peter; Rahbar, Mohammad H
2013-07-01
There are few clinical data to guide the use of cryoprecipitate in severely injured trauma patients. Cryoprecipitate is a rich source of fibrinogen and has been associated with improved survival in animal as well as limited human studies. Our objectives were to identify patterns and predictors of cryoprecipitate use and determine whether transfusing cryoprecipitate was associated with improved survival. This secondary analysis included 1,238 of the 1,245 PRospective Observational Multicenter Major Trauma Transfusion (PROMMTT) study patients who had timed transfusion data, of whom 359 (29%) received cryoprecipitate. For this analysis, one dose of cryoprecipitate was defined as 10 U. Unadjusted predictors of cryoprecipitate use were identified using logistic regression. Multivariable time-dependent Cox models were performed to examine the association of cryoprecipitate with time to in-hospital death. Cryoprecipitate use varied significantly by center, ranging from 7% to 82%. Among patients who received cryoprecipitate, the median number of units infused by 24 hours was 10 (interquartile range, 10-20). The median time from admission to first cryoprecipitate unit was 2.7 hours (interquartile range, 1.7-4.4 hours). Of those who died of a hemorrhagic death within 6 hours of admission, 72% received no cryoprecipitate. Other unadjusted predictors of cryoprecipitate use included Injury Severity Score (ISS), initial fibrinogen levels, base deficit, international normalized ratio, prothrombin time/partial thromboplastin time, hemoglobin, damage-control surgery, and surgical intervention of the chest and abdomen. Cryoprecipitate use was not associated with in-hospital mortality after adjusting for initial pH, initial hemoglobin, emergency department systolic blood pressure, emergency department Glasgow Coma Scale (GCS) score, blood product use, ISS, and center. Ten US Level 1 trauma centers varied greatly in their timing and use of cryoprecipitate in severely injured trauma patients. We could not identify any association of cryoprecipitate use with in-hospital mortality, although most patients did not receive this product. Randomized controlled studies are needed to determine whether cryoprecipitate (or fibrinogen concentrates) has a beneficial effect.
Aortoiliac morphologic correlations in aneurysms undergoing endovascular repair.
Ouriel, Kenneth; Tanquilut, Eugene; Greenberg, Roy K; Walker, Esteban
2003-08-01
The feasibility of endovascular aneurysm repair depends on morphologic characteristics of the aortoiliac segment. Knowledge of such characteristics is relevant to safe deployment of a particular device in a single patient and to development of new devices for use in patients with a broader spectrum of anatomic variations. We evaluated findings on computed tomography scans for 277 patients being considered for endovascular aneurysm repair. Aortic neck length and angulation estimates were generated with three-dimensional trigonometry. Specific centerline points were recorded, corresponding to the aorta at the celiac axis, lowest renal artery, cranial aspect of the aneurysm sac, aortic terminus, right hypogastric artery origin, and left hypogastric artery origin. Aortic neck thrombus and calcium content were recorded, and neck conicity was calculated in degrees. Statistical analysis was performed with the Spearman rank correlation. Data are expressed as median and interquartile range. Median diameter of the aneurysms was 52 mm (interquartile range, 48-59 mm) in the minor axis and 56 mm (interquartile range, 51-64 mm) in the major axis, and median length was 88 mm (interquartile range, 74-103 mm). Median proximal aortic neck diameter was 26 mm (interquartile range, 22-29 mm), and median neck length was 30 mm (interquartile range, 18-45 mm). The common iliac arteries were similar in diameter (right artery, 16 mm [interquartile range, 13-20 mm]; left artery, 15 mm [interquartile range, 11-18 mm]) and length (right, 59 mm [interquartile range, 50-69 mm]; left, 60 mm [interquartile range, 49-70 mm]). Median angulation of the infrarenal aortic neck was 40 degrees (interquartile range, 29-51 degrees), and median angulation of the suprarenal segment was 45 degrees (interquartile range, 36-57 degrees). Sac diameter, proximal neck diameter, and iliac artery diameter were significantly larger in men. Significant linear associations were identified between sac diameter and sac length, neck angulation, and iliac artery diameter. As the length of the aneurysm sac increased, the proximal aortic neck length decreased. Conversely, as the sac length decreased, sac eccentricity increased. Mural thrombus content within the neck increased with increasing neck diameter. There is considerable variability in aortoiliac morphologic parameters. Significant associations were found between various morphologic variables, links that are presumably related to a shared pathogenesis for aberration in aortoiliac diameter, length, and angulation. Ultimately this information can be used to develop new endovascular devices with broader applicability and improved long-term results.
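The abstract states that neck angulation was estimated with three-dimensional trigonometry from centerline points. One plausible reading, sketched below, is the angle by which the centerline deviates from a straight line at the neck; the coordinates are hypothetical and the study's exact construction is not specified:

```python
import math

def angulation_deg(a, b, c):
    # Deviation from a straight line at vertex b, given 3D centerline
    # points a -> b -> c (e.g., renal artery, neck end, sac).
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return 180.0 - math.degrees(math.acos(dot / (n1 * n2)))

renal = (0.0, 0.0, 0.0)       # lowest renal artery (hypothetical, mm)
neck_end = (0.0, 5.0, -28.0)  # cranial aspect of the aneurysm sac
sac = (0.0, 25.0, -50.0)      # a point further along the sac centerline
print(round(angulation_deg(renal, neck_end, sac), 1))  # ~32 degrees
```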
Safety and efficiency of emergency department interrogation of cardiac devices.
Neuenschwander, James F; Peacock, W Frank; Migeed, Madgy; Hunter, Sara A; Daughtery, John C; McCleese, Ian C; Hiestand, Brian C
2016-12-01
Patients with implanted cardiac devices may wait extended periods for interrogation in emergency departments (EDs). Our purpose was to determine if device interrogation could be done safely and faster by ED staff. Prospective, randomized, standard therapy-controlled trial of ED staff device interrogation vs. standard process (SP), with 30-day follow-up. Eligibility criteria: ED presentation with a self-report of a potential device-related complaint, with signed informed consent. SP interrogation was by a company representative or hospital employee. Of 60 patients, 42 (70%) were male, all were white, with a median (interquartile range) age of 71 (64 to 82) years. No patient was lost to follow-up. Of all patients, 32 (53%) were enrolled during business hours. The overall median (interquartile range) ED vs. SP time to interrogation was 98.5 (40 to 260) vs. 166.5 (64 to 412) minutes (P=0.013). While ED and SP interrogation times were similar during business hours, 102 (59 to 138) vs. 105 (64 to 172) minutes (P=0.62), ED interrogation times were shorter than SP during non-business hours: 97 (60 to 126) vs. 225 (144 to 412) minutes, P=0.002, respectively. There was no difference in ED length of stay between the ED and SP interrogation, 249 (153 to 390) vs. 246 (143 to 333) minutes (P=0.71), regardless of time of presentation. No patient in any cohort suffered an unplanned medical contact or post-discharge adverse device-related event. ED staff cardiac device interrogations are faster than the SP, with similar 30-day outcomes.
Pulse pressure variation-guided fluid therapy after cardiac surgery: a pilot before-and-after trial.
Suzuki, Satoshi; Woinarski, Nicholas C Z; Lipcsey, Miklos; Candal, Cristina Lluch; Schneider, Antoine G; Glassford, Neil J; Eastwood, Glenn M; Bellomo, Rinaldo
2014-12-01
The aim of this study was to assess the feasibility, safety, and physiological effects of pulse pressure variation (PPV)-guided fluid therapy in patients after cardiac surgery. We conducted a pilot prospective before-and-after study during mandatory ventilation after cardiac surgery in a tertiary intensive care unit. We introduced a protocol to deliver a fluid bolus for a PPV≥13% sustained for more than 10 minutes during the intervention period. We studied 45 control patients and 53 intervention patients. During the intervention period, clinicians administered a fluid bolus on 79% of the defined PPV trigger episodes. Median total fluid intake was similar between the 2 groups during mandatory ventilation (1297 mL [interquartile range 549-1968] vs 1481 mL [807-2563]; P=.17) and during the first 24 hours (3046 mL [interquartile range 2317-3982] vs 3017 mL [2192-4028]; P=.73). After adjusting for several baseline factors, PPV-guided fluid management significantly increased fluid intake during mandatory ventilation (P=.004) but not during the first 24 hours (P=.47). Pulse pressure variation-guided fluid therapy, however, did not significantly affect hemodynamic, renal, or metabolic variables. No serious adverse events were noted. Pulse pressure variation-guided fluid management was feasible and safe during mandatory ventilation after cardiac surgery. However, its advantages may be clinically small. Copyright © 2014 Elsevier Inc. All rights reserved.
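A minimal sketch of the trigger rule described above: a bolus when PPV stays ≥13% for more than 10 minutes. The PPV formula used here is the standard one, 100 × (PPmax − PPmin) / mean(PPmax, PPmin); the abstract does not spell out how the monitor derives it, and the readings are hypothetical:

```python
def ppv_percent(pp_max, pp_min):
    # Standard pulse pressure variation over a respiratory cycle.
    return 100.0 * (pp_max - pp_min) / ((pp_max + pp_min) / 2.0)

def bolus_indicated(ppv_per_minute, threshold=13.0, sustain_min=10):
    # True once PPV has been at or above threshold for more than sustain_min.
    run = 0
    for ppv in ppv_per_minute:
        run = run + 1 if ppv >= threshold else 0
        if run > sustain_min:
            return True
    return False

print(ppv_percent(60, 48))                       # ~22.2%
print(bolus_indicated([14.2] * 12 + [9.0] * 5))  # True: 12 min above threshold
```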
Using lean methodology to decrease wasted RN time in seeking supplies in emergency departments.
Richardson, David M; Rupp, Valerie A; Long, Kayla R; Urquhart, Megan C; Ricart, Erin; Newcomb, Lindsay R; Myers, Paul J; Kane, Bryan G
2014-11-01
Timely stocking of essential supplies in an emergency department (ED) is crucial to efficient and effective patient care. The objective of this study was to decrease wasted nursing time in obtaining needed supplies in an ED through the use of Lean process controls. As part of a Lean project, the team conducted a "before and after" prospective observational study of ED nurses seeking supplies. Nurses were observed for an entire shift for the time spent outside the patient room obtaining supplies, at baseline and after implementation of a point-of-use storage system. Before implementation, nurses left patient rooms a median of 11 times per 8-hour shift (interquartile range [IQR], 8 times per 8-hour shift) and 10 times per 12-hour shift (IQR, 23 times per 12-hour shift). After implementation of the new system, the numbers decreased to 2.5 per 8-hour shift (IQR, 2 per 8-hour shift) and 1 per 12-hour shift (IQR, 1 per 12-hour shift). A redesigned process including a standardized stocking system significantly decreased the number of searches by nurses for supplies.
Weber, Donald; Dulai, Sukhdeep K; Bergman, Joseph; Buckley, Richard; Beaupre, Lauren A
2014-11-01
To evaluate the association between time to surgery, antibiotic administration, Gustilo grade, fracture location, and development of deep infection in open fractures. Prospective cohort study between 2001 and 2009. Three Level 1 Canadian trauma centers. A total of 736 subjects (791 fractures) were enrolled, and 686 subjects (93%; 737 fractures) provided adequate follow-up data (1-year interview and/or clinical follow-up >90 days). Demographics, injury information, time to surgery, and antibiotics were recorded. Subjects were evaluated using standardized data forms until the fracture(s) healed. Phone interviews were undertaken 1 year after the fracture. Infection requiring unplanned surgical debridement and/or sustained antibiotic therapy. Tibia/fibula fractures were most common (n = 413, 52%), followed by upper extremity (UE) (n = 285, 36%) and femoral (n = 93, 12%) fractures. Infection developed in 46 fractures (6%). The median time to surgery was 9 hours 4 minutes (interquartile range, 6 hours 39 minutes to 12 hours 33 minutes) and 7 hours 39 minutes (interquartile range, 6 hours 10 minutes to 9 hours 54 minutes) for those without and with infection, respectively (P = 0.04). Gustilo grade 3B/3C fractures accounted for 17 of 46 infections (37%) (P < 0.001). Four UE (1.5%), 7 femoral (8%), and 35 tibia/fibula (9%) fractures developed infections (P = 0.001). Multivariate regression found no association between infection and time to surgery [odds ratio (OR), 0.97; 95% confidence interval (95% CI), 0.90-1.06] or antibiotics (OR, 1.0; 95% CI, 0.90-1.05). Grades 3A (OR, 6.37; 95% CI, 1.37-29.56) and 3B/3C (OR, 12.87; 95% CI, 2.72-60.95) relative to grade 1 injuries, and tibia/fibula (OR, 3.91; 95% CI, 1.33-11.53) relative to UE fractures, were significantly associated with infection. Infection after open fracture was associated with increasing Gustilo grade or tibia/fibula fractures but not with time to surgery or antibiotics. Prognostic level I. See instructions for authors for a complete description of levels of evidence.
Belle, Loic; Motreff, Pascal; Mangin, Lionel; Rangé, Grégoire; Marcaggi, Xavier; Marie, Antoine; Ferrier, Nadine; Dubreuil, Olivier; Zemour, Gilles; Souteyrand, Géraud; Caussin, Christophe; Amabile, Nicolas; Isaaz, Karl; Dauphin, Raphael; Koning, René; Robin, Christophe; Faurie, Benjamin; Bonello, Laurent; Champin, Stanislas; Delhaye, Cédric; Cuilleret, François; Mewton, Nathan; Genty, Céline; Viallon, Magalie; Bosson, Jean Luc; Croisille, Pierre
2016-03-01
Delayed stent implantation after restoration of normal epicardial flow by a minimalist immediate mechanical intervention aims to decrease the rate of distal embolization and impaired myocardial reperfusion after percutaneous coronary intervention. We sought to confirm whether a delayed stenting (DS) approach (24-48 hours) improves myocardial reperfusion, versus immediate stenting, in patients with acute ST-segment-elevation myocardial infarction undergoing primary percutaneous coronary intervention. In the prospective, randomized, open-label minimalist immediate mechanical intervention (MIMI) trial, patients (n=140) with ST-segment-elevation myocardial infarction of ≤12 hours' duration were randomized to immediate stenting (n=73) or DS (n=67) after Thrombolysis In Myocardial Infarction 3 flow restoration by thrombus aspiration. Patients in the DS group underwent a second coronary arteriography for stent implantation a median of 36 hours (interquartile range 29-46) after randomization. The primary end point was microvascular obstruction (% left ventricular mass) on cardiac magnetic resonance imaging performed 5 days (interquartile range 4-6) after the first procedure. There was a nonsignificant trend toward lower microvascular obstruction in the immediate stenting group compared with the DS group (1.88% versus 3.96%; P=0.051), which became significant after adjustment for the area at risk (P=0.049). Median infarct weight, left ventricular ejection fraction, and infarct size did not differ between groups. No difference in 6-month outcomes was apparent for the rate of major cardiovascular and cerebral events. The present findings do not support a strategy of DS versus immediate stenting in patients with ST-segment-elevation myocardial infarction undergoing primary percutaneous coronary intervention and even suggest a deleterious effect of DS on microvascular obstruction size. URL: http://www.clinicaltrials.gov. Unique identifier: NCT01360242. © 2016 American Heart Association, Inc.
Air pollution influences on exhaled nitric oxide among people with type II diabetes.
Peng, Cheng; Luttmann-Gibson, Heike; Zanobetti, Antonella; Cohen, Allison; De Souza, Celine; Coull, Brent A; Horton, Edward S; Schwartz, Joel; Koutrakis, Petros; Gold, Diane R
2016-04-01
In a population with type 2 diabetes mellitus (T2DM), we examined associations of short-term air pollutant exposures with pulmonary inflammation, measured as fraction of exhaled pulmonary nitric oxide (FeNO). Sixty-nine Boston Metropolitan residents with T2DM completed up to 5 bi-weekly visits with 321 offline FeNO measurements. We measured ambient concentrations of particle mass, number, and components at our stationary central site. Ambient concentrations of gaseous air pollutants were obtained from state monitors. We used linear models with fixed effects for participants, adjusting for 24-hour mean temperature, 24-hour mean water vapor pressure, season, and scrubbed room NO the day of the visit, to estimate associations between FeNO and interquartile range increases in exposure. Interquartile increases in the 6-hour averages of black carbon (BC) (0.5 μg/m(3)) and particle number (PN) (1,000 particles/cm(3)) were associated with increases in FeNO of 3.84% (95% CI 0.60% to 7.18%) and 9.86% (95% CI 3.59% to 16.52%), respectively. We also found significant associations of increases in FeNO with increases in 24-hour moving averages of BC, PN and nitrogen oxides (NOx). Recent studies have focused on FeNO as a marker for eosinophilic pulmonary inflammation in asthmatic populations. This study adds support to the relevance of FeNO as a marker for pulmonary inflammation in diabetic populations, whose underlying chronic inflammatory status is likely to be related to innate immunity and proinflammatory adipokines.
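Assuming FeNO was modeled on the log scale, an effect per interquartile-range increment follows from the regression slope as below; the slope is hypothetical, back-calculated so the example reproduces the reported 3.84% for black carbon:

```python
import numpy as np

def pct_change_per_iqr(beta_per_unit, iqr):
    """Percent change in FeNO for an IQR increase in exposure, given a slope
    from a log(FeNO) linear model (our assumed model form)."""
    return (np.exp(beta_per_unit * iqr) - 1.0) * 100.0

# Hypothetical slope per ug/m3 of black carbon; IQR of 0.5 ug/m3 from the abstract
print(round(pct_change_per_iqr(0.0754, 0.5), 2))  # 3.84
```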
Poeran, Jashvant; Mazumdar, Madhu; Rasul, Rehana; Meyer, Joanne; Sacks, Henry S; Koll, Brian S; Wallach, Frances R; Moskowitz, Alan; Gelijns, Annetine C
2016-02-01
Antibiotic use, particularly type and duration, is a crucial modifiable risk factor for Clostridium difficile. Cardiac surgery is of particular interest because prophylactic antibiotics are recommended for 48 hours or less (vs ≤24 hours for noncardiac surgery), with increasing vancomycin use. We aimed to study associations between antibiotic prophylaxis (duration/vancomycin use) and C difficile among patients undergoing coronary artery bypass grafting. We extracted data on coronary artery bypass grafting procedures from the national Premier Perspective claims database (2006-2013, n = 154,200, 233 hospitals). Multilevel multivariable logistic regressions measured associations between (1) duration (<2 days, "standard" vs ≥2 days, "extended") and (2) type of antibiotic used ("cephalosporin," "cephalosporin + vancomycin," "vancomycin") and C difficile as outcome. Overall C difficile prevalence was 0.21% (n = 329). Most patients (59.7%) received a cephalosporin only; in 33.1% vancomycin was added, whereas 7.2% received vancomycin only. Extended prophylaxis was used in 20.9%. In adjusted analyses, extended prophylaxis (vs standard) was associated with significantly increased C difficile risk (odds ratio, 1.43; confidence interval, 1.07-1.92), whereas no significant associations existed for vancomycin use as adjuvant or primary prophylactic compared with the use of cephalosporins (odds ratio, 1.21; confidence interval, 0.92-1.60, and odds ratio, 1.39; confidence interval, 0.94-2.05, respectively). Substantial inter-hospital variation exists in the percentage of extended antibiotic prophylaxis (interquartile range, 2.5%-35.7%), use of adjuvant vancomycin (interquartile range, 4.2%-61.1%), and vancomycin alone (interquartile range, 2.3%-10.4%). Although extended use of antibiotic prophylaxis was associated with increased C difficile risk after coronary artery bypass grafting, vancomycin use was not. The observed hospital variation in antibiotic prophylaxis practices suggests great potential for efforts aimed at standardizing practices that subsequently could reduce C difficile risk. Copyright © 2016 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
Gionfriddo, Ashley; Nonoyama, Mika L; Laussen, Peter C; Cox, Peter N; Clarke, Megan; Floh, Alejandro A
2018-06-01
To promote standardization, the Centers for Disease Control and Prevention introduced a new ventilator-associated pneumonia classification, which was modified for pediatrics (pediatric ventilator-associated pneumonia according to proposed criteria [PVAP]). We evaluated the frequency of PVAP in a cohort of children diagnosed with ventilator-associated pneumonia according to traditional criteria and compared their strength of association with clinically relevant outcomes. Retrospective cohort study. Tertiary care pediatric hospital. Critically ill children (0-18 yr) diagnosed with ventilator-associated pneumonia between January 2006 and December 2015 were identified from an infection control database. Patients were excluded if on high frequency ventilation, extracorporeal membrane oxygenation, or reintubated within 24 hours following extubation. None. Patients were assessed for PVAP diagnosis. Primary outcome was the proportion of subjects diagnosed with PVAP. Secondary outcomes included association with intervals of care. Two hundred seventy-seven children who had been diagnosed with ventilator-associated pneumonia were eligible for review; 46 were excluded for being ventilated for less than 48 hours (n = 16), being on high frequency ventilation (n = 12) or extracorporeal membrane oxygenation (n = 8), having ineligible bacteria isolated from culture (n = 8), and other causes (n = 4). ICU admission diagnoses included congenital heart disease (47%), neurological (16%), trauma (7%), respiratory (7%), posttransplant (4%), neuromuscular (3%), and cardiomyopathy (3%). Only 16% of subjects (n = 45) met the new PVAP definition, with 18% (n = 49) having any ventilator-associated condition. Failure to fulfill the new definitions was based on an inadequate increase in mean airway pressure in 90% or in FIO2 in 92%. PVAP was associated with prolonged ventilation (median [interquartile range], 29 d [13-51 d] vs 16 d [8-34.5 d]; p = 0.002), ICU (median [interquartile range], 40 d [20-100 d] vs 25 d [14-61 d]; p = 0.004) and hospital length of stay (median [interquartile range], 81 d [40-182 d] vs 54 d [31-108 d]; p = 0.04), and death (33% vs 16%; p = 0.008). Few children with a ventilator-associated pneumonia diagnosis met the proposed PVAP criteria. PVAP was associated with increased morbidity and mortality. This work suggests that additional study is required before new definitions for ventilator-associated pneumonia are introduced for children.
Nazeri, Pantea; Mirmiran, Parvin; Mehrabi, Yadollah; Hedayati, Mehdi; Delshad, Hossein; Azizi, Fereidoun
2010-12-01
Production of iodized salt in Iran for household consumption began in 1990. Previous studies have reported sustainable elimination of iodine deficiency disorders in Iran. The aim of this study was to evaluate the iodine nutritional status in Tehran in 2009. In this cross-sectional study, 383 Tehranian households were enrolled through randomized cluster sampling and a total of 639 adult subjects (242 men and 397 women), aged 19 years and over, participated. A 24-hour urine sample was collected for measurement of urinary iodine, sodium, and creatinine concentrations using the digestion method, flame photometry, and autoanalyzer assay, respectively. Salt intake was estimated and iodine content of household salt was measured by titration. Median (interquartile range) iodine content of household salt and urinary iodine concentration (UIC) in Tehran were 21.2 (3.2-31.7) parts per million and 70.0 (34.0-131.2) μg/L, respectively. There was no statistically significant difference in 24-hour UICs between men and women. Median (interquartile range) daily salt intake was 7.6 (5.5-9.8) g, which was not different between the two genders. According to the WHO/ICCIDD/UNICEF classification, 11.2%, 25.9%, 26.7%, 25.1%, 8.0%, and 3.2% of participants had UIC <20, 20-49, 50-99, 100-199, 200-299, and ≥300 μg/L, respectively. Mild iodine deficiency has recurred in Tehran residents. The results emphasize the need for continuous monitoring in all regions, even in a country with iodine sufficiency.
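The reported distribution follows from bucketing each participant's UIC by the WHO/ICCIDD/UNICEF cut-points; a short sketch (sample values are hypothetical):

```python
from collections import Counter

def who_uic_category(uic_ug_per_l):
    """WHO/ICCIDD/UNICEF urinary iodine categories (ug/L) as listed in the abstract."""
    if uic_ug_per_l < 20:  return "<20"
    if uic_ug_per_l < 50:  return "20-49"
    if uic_ug_per_l < 100: return "50-99"
    if uic_ug_per_l < 200: return "100-199"
    if uic_ug_per_l < 300: return "200-299"
    return ">=300"

sample = [15, 35, 70, 120, 250, 320]  # hypothetical UICs; the study median was 70 ug/L
print(Counter(who_uic_category(u) for u in sample))
```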
Effect of collaborative care on cost variation in an intensive care unit.
Garland, Allan
2013-05-01
Improving the cost-effectiveness of health care requires an understanding of the genesis of health care costs and in particular the sources of cost variation. Little is known about how multiple physicians, caring collaboratively for patients, contribute to costs. To explore the effect of collaborative care by physicians on variation in discretionary costs in an intensive care unit (ICU) by determining the contributions of the attending intensivists and ICU fellows. Prospective, observational study using a multivariable model of median discretionary costs for the first day in the ICU, adjusting for confounding variables. Analysis included 3514 patients who spent more than 2 hours in the ICU on the initial day. Impact of the physicians was assessed via variables representing the specific intensivist and ICU fellow responsible on the first ICU day and allowing for interaction terms. On the initial day, patients spent a median of 10.6 hours (interquartile range, 6.3-16.5) in the ICU, with median discretionary costs of $1343 (interquartile range, $788-2208). There was large variation in adjusted costs attributable to both the intensivists ($359; 95% CI, $244-$474) and the fellows ($756; 95% CI, $550-$965). The interaction terms were not significant (P = .12-.79). In an ICU care model with intensivists and subspecialty fellows, both types of physicians contributed significantly to the observed variation in discretionary costs. However, even in the presence of a hierarchical arrangement of clinical responsibilities, the influences on costs of the 2 types of physicians were independent.
Camazine, Maraya N; Karam, Oliver; Colvin, Ryan; Leteurtre, Stephane; Demaret, Pierre; Tucci, Marisa; Muszynski, Jennifer A; Stanworth, Simon; Spinella, Philip C
2017-05-01
To determine if the use of fresh frozen plasma/frozen plasma 24 hours compared to solvent detergent plasma is associated with international normalized ratio reduction or ICU mortality in critically ill children. This is an a priori secondary analysis of a prospective, observational study. Study groups were defined as those transfused with either fresh frozen plasma/frozen plasma 24 hours or solvent detergent plasma. Outcomes were international normalized ratio reduction and ICU mortality. Multivariable logistic regression was used to determine independent associations. One hundred one PICUs in 21 countries. All critically ill children admitted to a participating unit were included if they received at least one plasma unit during six predefined 1-week (Monday to Friday) periods. All children were exclusively transfused with either fresh frozen plasma/frozen plasma 24 hours or solvent detergent plasma. None. There were 443 patients enrolled in the study. Twenty-four patients (5%) were excluded because no plasma type was recorded; the remaining 419 patients were analyzed. The fresh frozen plasma/frozen plasma 24 hours group included 357 patients, and the solvent detergent plasma group included 62 patients. The median (interquartile range) age and weight were 1 year (0.2-6.4) and 9.4 kg (4.0-21.1), respectively. There was no difference in reason for admission, severity of illness score, pretransfusion international normalized ratio, or lactate values; however, there was a difference in primary indication for plasma transfusion (p < 0.001). There was no difference in median (interquartile range) international normalized ratio reduction between the fresh frozen plasma/frozen plasma 24 hours and solvent detergent plasma study groups, -0.2 (-0.4 to 0) and -0.2 (-0.3 to 0), respectively (p = 0.80). ICU mortality was lower in the solvent detergent plasma versus fresh frozen plasma/frozen plasma 24 hours groups, 14.5% versus 29.1%, respectively (p = 0.02). Upon adjusted analysis, solvent detergent plasma transfusion was independently associated with reduced ICU mortality (odds ratio, 0.40; 95% CI, 0.16-0.99; p = 0.05). Solvent detergent plasma use in critically ill children may be associated with improved survival. These hypothesis-generating data support a randomized controlled trial comparing solvent detergent plasma to fresh frozen plasma/frozen plasma 24 hours.
Pasupathy, Sivabaskari; Tavella, Rosanna; Grover, Suchi; Raman, Betty; Procter, Nathan E K; Du, Yang Timothy; Mahadavan, Gnanadevan; Stafford, Irene; Heresztyn, Tamila; Holmes, Andrew; Zeitz, Christopher; Arstall, Margaret; Selvanayagam, Joseph; Horowitz, John D; Beltrame, John F
2017-09-05
Contemporary ST-segment-elevation myocardial infarction management involves primary percutaneous coronary intervention, with ongoing studies focusing on infarct size reduction using ancillary therapies. N-acetylcysteine (NAC) is an antioxidant with reactive oxygen species scavenging properties that also potentiates the effects of nitroglycerin and thus represents a potentially beneficial ancillary therapy in primary percutaneous coronary intervention. The NACIAM trial (N-acetylcysteine in Acute Myocardial Infarction) examined the effects of NAC on infarct size in patients with ST-segment-elevation myocardial infarction undergoing percutaneous coronary intervention. This randomized, double-blind, placebo-controlled, multicenter study evaluated the effects of intravenous high-dose NAC (29 g over 2 days) with background low-dose nitroglycerin (7.2 mg over 2 days) on early cardiac magnetic resonance imaging-assessed infarct size. Secondary end points included cardiac magnetic resonance-determined myocardial salvage and creatine kinase kinetics. Of 112 randomized patients with ST-segment-elevation myocardial infarction, 75 (37 in NAC group, 38 in placebo group) underwent early cardiac magnetic resonance imaging. Median duration of ischemia pretreatment was 2.4 hours. With background nitroglycerin infusion administered to all patients, those randomized to NAC exhibited an absolute 5.5% reduction in cardiac magnetic resonance-assessed infarct size relative to placebo (median, 11.0% [interquartile range, 4.1-16.3] versus 16.5% [interquartile range, 10.7-24.2]; P=0.02). Myocardial salvage was approximately doubled in the NAC group (60%; interquartile range, 37-79) compared with placebo (27%; interquartile range, 14-42; P<0.01), and median creatine kinase areas under the curve were 22,000 and 38,000 IU·h in the NAC and placebo groups, respectively (P=0.08). High-dose intravenous NAC administered with low-dose intravenous nitroglycerin is associated with reduced infarct size in patients with ST-segment-elevation myocardial infarction undergoing percutaneous coronary intervention. A larger study is required to assess the impact of this therapy on clinical cardiac outcomes. Australian New Zealand Clinical Trials Registry. URL: http://www.anzctr.org.au/. Unique identifier: 12610000280000. © 2017 American Heart Association, Inc.
Vitamin D and parathyroid hormone status in a representative population living in Macau, China.
Ke, L; Mason, R S; Mpofu, E; Dibley, M; Li, Y; Brock, K E
2015-04-01
Associations between documented sun exposure, exercise patterns, and fish and supplement intake and 25-hydroxyvitamin D (25OHD) and parathyroid hormone (PTH) were investigated in a random household survey of Macau residents (aged 18-93). Blood samples (566) taken in summer were analyzed for 25OHD and PTH. In this Chinese population, 55% were deficient (25OHD <50 nmol/L; median (interquartile range)=47.7 (24.2) nmol/L). Vitamin D deficiency was greatest in those aged <50 years: median (interquartile range)=43.3 (18.2) nmol/L, females: median (interquartile range)=45.5 (19.4) nmol/L, and those with higher educational qualifications: median (interquartile range)=43.1 (18.7) nmol/L. In the total Macau population, statistically significant (p<0.01) modifiable factors associated with 25OHD levels were sunlight exposure (β=0.06), physical activity (PA; hours/day: β=0.08), sitting (hours/day: β=-0.20), fish intake (β=0.08), and calcium (Ca) supplement intake (β=0.06) [linear regression analysis adjusting for demographic risk factors]. On similar analysis, and after adjustment for 25OHD, the only significant modifiable associations with PTH in the total population were sitting (β=-0.17), body mass index (β=0.07), and Ca supplement intake (β=-0.06). In this Macau population, less documented sun exposure, fish and Ca supplement intake, and exercise were associated with lower 25OHD levels, especially in the younger population, along with the interesting finding that more sitting was associated with both lower 25OHD and higher PTH blood levels. In conclusion, unlike findings from Caucasian populations, younger participants were significantly more vitamin D deficient, in particular highly educated single females. This may indicate the desire of young females to be pale and avoid the sun. There are also large differences in lifestyle between the older generation and the younger, in particular with respect to sun exposure and PA. This article is part of a Special Issue entitled '17th Vitamin D Workshop'. Copyright © 2015 Elsevier Ltd. All rights reserved.
Craig, Louise E; Bernhardt, Julie; Langhorne, Peter; Wu, Olivia
2010-11-01
Very early mobilization (VEM) is a distinctive characteristic of care in some stroke units; however, evidence of the effectiveness of this approach is limited. To date, only 2 phase II trials have compared VEM with standard care: A Very Early Rehabilitation Trial (AVERT) in Australia and the recently completed Very Early Rehabilitation or Intensive Telemetry after Stroke trial in the United Kingdom. The Very Early Rehabilitation or Intensive Telemetry after Stroke protocol was designed to complement that of AVERT in a number of key areas. The aim of this analysis was to investigate the impact of VEM on independence by pooling data from these 2 comparable trials. Individual data from the 2 trials were pooled. Overall, patients were between 27 and 97 years old, had first or recurring stroke, and were treated within 36 hours after stroke onset. The primary outcome was independence, defined as modified Rankin scale score of 0 to 2 at 3 months. The secondary outcomes included complications of immobility and activities of daily living. Logistic regression was used to assess the effect of VEM on outcome, adjusting for known confounders including age, baseline stroke severity, and premorbid modified Rankin scale score. All patients in AVERT and Very Early Rehabilitation or Intensive Telemetry after Stroke were included, resulting in 54 patients in the VEM group and 49 patients in the standard care group. The baseline characteristics of VEM patients were largely comparable with standard care patients. Time to first mobilization from symptom onset was significantly shorter among VEM patients (median, 21 hours; interquartile range, 15.8-27.8 hours) compared with standard care patients (median, 31 hours; interquartile range, 23.0-41.2 hours). VEM patients had significantly greater odds of independence compared with standard care patients (adjusted odds ratio, 3.11; 95% confidence interval, 1.03-9.33). Planned collaborations between stroke researchers to conduct trials with common protocols and outcome measures can help advance rehabilitation science. VEM was associated with improved independence at 3 months compared with standard care. However, both trials are limited by small sample sizes. Larger trials (such as AVERT phase III) are still needed in this field.
Comparison of postural ergonomics between laparoscopic and robotic sacrocolpopexy: a pilot study.
Tarr, Megan E; Brancato, Sam J; Cunkelman, Jacqueline A; Polcari, Anthony; Nutter, Benjamin; Kenton, Kimberly
2015-02-01
To compare resident, fellow, and attending urologic and gynecologic surgeons' musculoskeletal and mental strain during laparoscopic and robotic sacrocolpopexy. Prospective cohort study (Canadian Task Force classification II-2). Academic medical center. Patients who underwent robotic or laparoscopic sacrocolpopexy from October 2009 to January 2011. The Body Part Discomfort (BPD) survey was completed before cases, and the National Aeronautics and Space Administration Task Load Index and BPD survey were completed after cases. Higher scores on BPD and the National Aeronautics and Space Administration Task Load Index indicate greater musculoskeletal discomfort and mental strain. BPD scores were averaged over the following body regions: head/neck, back, hand/wrist, arms, and knees/ankles/feet. Changes in body region-specific discomfort scores were the primary outcomes. Multivariable analysis was performed using mixed-effects linear regression with surgeon as a random effect. Sixteen surgeons participated (53% fellows, 34% residents, and 13% attendings). Thirty-three robotic and 53 laparoscopic cases were analyzed, with a median surgical time of 231 minutes (interquartile range, 204-293 minutes) versus 227 minutes (interquartile range, 203-272 minutes; p = .31), a median estimated blood loss of 100 mL (interquartile range, 50-175 mL) versus 150 mL (interquartile range, 50-200 mL; p = .22), and a mean patient body mass index of 27 ± 4 versus 26 ± 4 kg/m(2) (p = .26), respectively. Robotic surgeries were associated with lower neck/shoulder (-0.19 [interquartile range, -0.32 to -0.01], T = -2.49) and back discomfort scores (-0.35 [interquartile range, -0.58 to 0], T = -2.38) than laparoscopic surgeries. Knee/ankle/foot and arm discomfort increased with case length (0.18 [interquartile range, 0.02-0.3], T = 2.81) and (0.07 [interquartile range, 0.01-0.14], p = .03), respectively. Surgeons performing minimally invasive sacrocolpopexy experienced less neck, shoulder, and back discomfort when surgery was performed robotically. Copyright © 2015 AAGL. Published by Elsevier Inc. All rights reserved.
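The stated analysis (mixed-effects linear regression with surgeon as a random effect) corresponds to a random-intercept model; a sketch on simulated data with hypothetical variable names:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_cases = 86  # 33 robotic + 53 laparoscopic, as in the study
df = pd.DataFrame({
    "delta_back_bpd": rng.normal(0.0, 0.5, n_cases),  # change in back discomfort score
    "robotic": rng.binomial(1, 0.38, n_cases),
    "case_minutes": rng.normal(230.0, 40.0, n_cases),
    "surgeon": rng.choice(list("ABCDEFGHIJKLMNOP"), n_cases),  # 16 surgeons
})
# Random intercept per surgeon mirrors "surgeon as a random effect"
model = smf.mixedlm("delta_back_bpd ~ robotic + case_minutes", df,
                    groups=df["surgeon"]).fit()
print(model.summary())
```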
Al-Tamimi, Yahia Z; Helmy, Adel; Bavetta, Seb; Price, Stephen J
2009-01-01
Intraparenchymal monitoring devices play an important role in the daily management of head injury and other critically ill neurosurgical patients. Although zero drift data exist for the Camino system (Camino Laboratories, San Diego, CA), only in vitro data exist for the Codman system (Codman and Shurtleff, Inc., Raynham, MA). The aim of this study was to assess the extent of zero drift for the Codman intracranial pressure (ICP) monitor in patients being monitored in 2 neurointensive care units. This was a prospective study conducted at 2 neurointensive care units. Eighty-eight patients who required ICP monitoring and who presented to the 2 neurosurgical departments, Center 1 (n = 48) and Center 2 (n = 40), were recruited for participation. The duration of ICP monitoring was noted, as was the resultant pressure reading in normal saline on removing the ICP monitor (zero drift). The median absolute zero drift for the group was 2.0 mm Hg (interquartile range, 1-3 mm Hg). The median time in situ was 108 hours (interquartile range, 69-201 hours). There was a positive correlation between the drift and the time the probe spent in situ (Spearman's correlation coefficient = 0.342; P = 0.001). Of the readings, 20% and 2% showed a drift greater than 5 and 10 mm Hg in magnitude, respectively. These data demonstrate that a small amount of zero drift exists in ICP monitors and that this drift increases with time. The wide range in the data demonstrates that some drift readings are quite excessive. This reinforces the school of thought that, although ICP readings contribute significantly to the management of neurosurgical patients, they should be interpreted carefully and in conjunction with clinical and radiological assessment of patients.
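The drift-time relationship is a Spearman rank correlation; the same computation on simulated data (durations and drifts below are hypothetical):

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(2)
hours_in_situ = rng.gamma(3.0, 40.0, 88)                    # 88 monitored patients
zero_drift = 0.01 * hours_in_situ + rng.normal(0, 1.5, 88)  # drift loosely rising with time
rho, p = spearmanr(hours_in_situ, np.abs(zero_drift))
print(f"Spearman rho={rho:.3f}, P={p:.4f}")  # the study reported rho=0.342, P=0.001
```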
Novel dry cryotherapy system for cooling the equine digit.
Stefanovski, Darko; Lenfest, Margret; Chatterjee, Sraboni; Orsini, James
2018-01-01
Objectives Digital cryotherapy is commonly used for laminitis prophylaxis and treatment. Currently validated methods for distal limb cryotherapy involve wet application or compression technology. There is a need for a practical, affordable, dry cryotherapy method that effectively cools the digit. The objective of this study was to evaluate the hoof wall surface temperatures (HWSTs) achieved with a novel dry cryotherapy technology. Design Repeated-measures in vivo experimental study. Setting Experimental intervention at a single site. Participants 6 systemically healthy horses (3 mares, 3 geldings). Interventions Cryotherapy was applied to six horses for eight hours with a commercially available rubber and welded fabric ice boot, which extended proximally to include the foot and pastern. Reusable malleable cold therapy packs were secured against the foot and pastern with the three built-in hook-and-loop fastener panels. Primary and secondary outcome measures HWST and pastern surface temperature of the cryotherapy-treated limb, HWST of the control limb and ambient temperature were recorded every five minutes throughout the study period. Results Results were analysed with mixed-effects multivariable regression analysis. The HWST (median 11.1°C, interquartile range 8.6°C–14.7°C) in the cryotherapy-treated limb was significantly decreased compared with the control limb (median 29.7°C, interquartile range 28.9°C–30.4°C) (P≤0.001). Cryotherapy limb HWST reached a minimum of 6.75°C (median) with an interquartile range of 4.1°C–9.3°C. Minimum HWST was achieved 68 minutes after cryotherapy pack application. Conclusions Dry application of cryotherapy significantly reduced HWST and reached minimums below the therapeutic target of 10°C. This cryotherapy method might offer an effective alternative for digital cooling.
Pulmonary disposition and pharmacokinetics of minocycline in adult horses.
Echeverria, Kate O; Lascola, Kara M; Giguère, Steeve; Foreman, Jonathan H; Austin, Scott A
2017-11-01
OBJECTIVE To determine pharmacokinetics and pulmonary disposition of minocycline in horses after IV and intragastric administration. ANIMALS 7 healthy adult horses. PROCEDURES For experiment 1 of the study, minocycline was administered IV (2.2 mg/kg) or intragastrically (4 mg/kg) to 6 horses by use of a randomized crossover design. Plasma samples were obtained before and 16 times within 36 hours after minocycline administration. Bronchoalveolar lavage (BAL) was performed 4 times within 24 hours after minocycline administration for collection of pulmonary epithelial lining fluid (PELF) and BAL cells. For experiment 2, minocycline was administered intragastrically (4 mg/kg, q 12 h, for 5 doses) to 6 horses. Plasma samples were obtained before and 20 times within 96 hours after minocycline administration. A BAL was performed 6 times within 72 hours after minocycline administration for collection of PELF samples and BAL cells. RESULTS Mean bioavailability of minocycline was 48% (range, 35% to 75%). At steady state, mean ± SD maximum concentration (Cmax) of minocycline in plasma was 2.3 ± 1.3 μg/mL, and terminal half-life was 11.8 ± 0.5 hours. Median time to Cmax (Tmax) was 1.3 hours (interquartile range [IQR], 1.0 to 1.5 hours). The Cmax and Tmax of minocycline in the PELF were 10.5 ± 12.8 μg/mL and 9.0 hours (IQR, 5.5 to 12.0 hours), respectively. The Cmax and Tmax for BAL cells were 0.24 ± 0.1 μg/mL and 6.0 hours (IQR, 0 to 6.0 hours), respectively. CONCLUSIONS AND CLINICAL RELEVANCE Minocycline was distributed into the PELF and BAL cells of adult horses.
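The reported parameters follow from standard non-compartmental analysis; a minimal sketch (concentration-time values are fabricated for illustration; doses are from the study design):

```python
import numpy as np

def nca(times_h, conc_ug_ml, n_terminal=4):
    """Cmax/Tmax, terminal half-life from a log-linear fit of the last points,
    and trapezoidal AUC -- a minimal non-compartmental sketch."""
    t = np.asarray(times_h, float)
    c = np.asarray(conc_ug_ml, float)
    i = int(np.argmax(c))
    slope = np.polyfit(t[-n_terminal:], np.log(c[-n_terminal:]), 1)[0]
    auc = float(np.sum(np.diff(t) * (c[:-1] + c[1:]) / 2.0))  # trapezoid rule
    return c[i], t[i], np.log(2) / -slope, auc

def bioavailability_pct(auc_ig, auc_iv, dose_ig=4.0, dose_iv=2.2):
    """F (%) from dose-normalized AUCs; doses in mg/kg per the study design."""
    return (auc_ig / dose_ig) / (auc_iv / dose_iv) * 100.0

t = [0.5, 1, 1.5, 2, 4, 8, 12, 24, 36]               # hours post dose (hypothetical)
c = [1.1, 2.0, 2.3, 2.1, 1.6, 1.0, 0.6, 0.25, 0.12]  # ug/mL (hypothetical)
cmax, tmax, t_half, auc = nca(t, c)
print(round(cmax, 2), tmax, round(t_half, 1), round(auc, 1))
```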
Weissler-Snir, Adaya; Kornowski, Ran; Sagie, Alexander; Vaknin-Assa, Hana; Perl, Leor; Porter, Avital; Lev, Eli; Assali, Abid
2014-11-15
Little is known regarding gender differences in left ventricular (LV) function after anterior wall ST-segment elevation myocardial infarction (STEMI), despite it being a major determinant of patients' morbidity and mortality. We therefore sought to investigate the impact of gender on LV function after primary percutaneous coronary intervention (PCI) for first anterior wall STEMI. Seven hundred eighty-nine consecutive patients (625 men) with first anterior STEMI were included in the analysis. All patients underwent an echocardiographic study within 48 hours of PCI. Women were older and more likely to have diabetes, hypertension, chronic renal failure, and a higher Killip score. Women had prolonged ischemic time, which was driven by prolonged symptom-to-presentation time (2.75 [interquartile range 1.5 to 4] vs 2 [interquartile range 1 to 3.5] hours, p = 0.005). A higher percentage of women had moderate or worse LV dysfunction (LV ejection fraction <40%; 61.6% vs 48%, p = 0.002). In a univariable analysis, female gender was associated with moderate or worse LV dysfunction (p = 0.002). However, after accounting for the differing baseline risk profiles of the 2 groups using multivariable and propensity score techniques, ischemic time >3.5 hours, leukocytosis, and pre-PCI Thrombolysis In Myocardial Infarction flow grade <2 were independent predictors of moderate or worse LV dysfunction, whereas female gender was not. Data on LV function recovery at 6 months, which were available for 45% of female and male patients with moderate or worse LV dysfunction early after PCI, showed no significant gender-related difference in LV function recovery. In conclusion, women undergoing PCI for a first anterior STEMI demonstrate worse LV function than men, which might be partially attributed to delay in presentation. Hence, greater efforts should be devoted to increasing women's awareness of cardiac symptoms during the prehospital course of STEMI. Copyright © 2014 Elsevier Inc. All rights reserved.
Davidson, Anders J; Neff, Lucas P; Grayson, J Kevin; Clement, Nathan F; DeSoucy, Erik S; Simon Logan, Meryl A; Abbot, Christopher M; Sampson, James B; Williams, Timothy K
2017-09-01
The small diameter of temporary vascular shunts for vascular trauma management may restrict flow and result in ischemia or early thrombosis. We have previously reported a clinical experience with direct, open surgical reconstruction using expanded polytetrafluoroethylene stent grafts to create a "sutureless" anastomosis as an alternative to standard temporary vascular shunts. We sought to characterize patency and flow characteristics of these grafts compared with standard shunts in a survival model of porcine vascular injury. Twelve Yorkshire-cross swine received a 2-cm-long near-circumferential defect in the bilateral iliac arteries. A 14 Fr Argyle shunt was inserted into one randomly assigned artery, with a self-expanding expanded polytetrafluoroethylene stent graft deployed in the other. At 72 hours, conduit patency was evaluated by angiography. Arterial flow measurements were obtained at baseline, immediately after intervention, and after 72 hours via direct measurement with perivascular flow meters. Blood pressure proximal and distal to the conduits and arterial samples for histopathology were obtained during the terminal procedure. Angiography revealed no difference in patency at 72 hours (p = 1.0). While there was no difference in baseline arterial flow between arteries (p = 0.63), the stent grafts demonstrated significantly improved blood flow compared with shunts both immediately after intervention (390 ± 36 mL/min vs. 265 ± 25 mL/min, p = 0.002) and at 72 hours (261 ± 29 mL/min vs. 170 ± 36 mL/min, p = 0.005). The pressure gradient across the shunts was greater than that of the stent grafts (11.5 mm Hg [interquartile range, 3-19 mm Hg] vs. 3 mm Hg [interquartile range, 3-5 mm Hg], p = 0.013). The speed of deployment was similar between the two devices. Open "sutureless" direct site repair using commercially available stent grafts to treat vascular injury is a technically feasible strategy for damage control management of peripheral vascular injury and offers increased blood flow when compared with temporary shunts. Furthermore, stent grafts may offer improved durability to extend the window until definitive vascular repair. The combination of these traits may improve outcomes after vascular injury. Epidemiologic/Prognostic, level III.
Effort of breathing in children receiving high-flow nasal cannula.
Rubin, Sarah; Ghuman, Anoopindar; Deakers, Timothy; Khemani, Robinder; Ross, Patrick; Newth, Christopher J
2014-01-01
High-flow humidified nasal cannula is often used to provide noninvasive respiratory support in children. The effect of high-flow humidified nasal cannula on effort of breathing in children has not been objectively studied, and the mechanism by which respiratory support is provided remains unclear. This study uses an objective measure of effort of breathing (Pressure·Rate Product) to evaluate high-flow humidified nasal cannula in critically ill children. Prospective cohort study. Quaternary care free-standing academic children's hospital. ICU patients younger than 18 years receiving high-flow humidified nasal cannula or whom the medical team planned to extubate to high-flow humidified nasal cannula within 72 hours of enrollment. An esophageal pressure monitoring catheter was placed to measure pleural pressures via a Bicore CP-100 pulmonary mechanics monitor. Change in pleural pressure (ΔPes) and respiratory rate were measured on high-flow humidified nasal cannula at 2, 5, and 8 L/min. ΔPes and respiratory rate were multiplied to generate the Pressure·Rate Product, a well-established objective measure of effort of breathing. Baseline Pes, defined as pleural pressure at end exhalation during tidal breathing, reflected the positive pressure generated on each level of respiratory support. Twenty-five patients had measurements on high-flow humidified nasal cannula. Median age was 6.5 months (interquartile range, 1.3-15.5 mo). Median Pressure·Rate Product was lower on high-flow humidified nasal cannula 8 L/min (median, 329 cm H2O·min; interquartile range, 195-402) compared with high-flow humidified nasal cannula 5 L/min (median, 341; interquartile range, 232-475; p = 0.007) or high-flow humidified nasal cannula 2 L/min (median, 421; interquartile range, 233-621; p < 0.0001) and was lower on high-flow humidified nasal cannula 5 L/min compared with high-flow humidified nasal cannula 2 L/min (p = 0.01). Baseline Pes was higher on high-flow humidified nasal cannula 8 L/min than on high-flow humidified nasal cannula 2 L/min (p = 0.03). Increasing flow rates of high-flow humidified nasal cannula decreased effort of breathing in children, with the most significant impact seen from high-flow humidified nasal cannula 2 to 8 L/min. There are likely multiple mechanisms for this clinical effect, including generation of positive pressure and washout of airway dead space.
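The Pressure·Rate Product itself is a simple product; a one-line sketch with hypothetical values in the reported range:

```python
def pressure_rate_product(delta_pes_cm_h2o, respiratory_rate_per_min):
    """Effort-of-breathing measure used in the study: esophageal (pleural)
    pressure swing multiplied by respiratory rate."""
    return delta_pes_cm_h2o * respiratory_rate_per_min

# Hypothetical infant: dPes of 10 cm H2O at 35 breaths/min
print(pressure_rate_product(10, 35))  # 350 cm H2O/min, within the reported range
```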
The 24-hour urine collection: gold standard or historical practice?
Côté, Anne-Marie; Firoz, Tabassum; Mattman, André; Lam, Elaine M; von Dadelszen, Peter; Magee, Laura A
2008-12-01
The objective of the study was to determine completeness of 24-hour urine collection in pregnancy. This was a retrospective laboratory/chart review of 24-hour urine collections at British Columbia Women's Hospital. Completeness was assessed by 24-hour urinary creatinine excretion (UcreatV): expected according to maternal weight for single collections and between-measurement difference for serial collections. For 198 randomly selected pregnant women with a hypertensive disorder (63% preeclampsia), 24-hour urine collections were frequently inaccurate (13-54%) on the basis of an expected UcreatV of 97-220 micromol/kg per day (11.0-25.0 mg/kg per day) or 133-177 micromol/kg per day (15.1-20.1 mg/kg per day) of prepregnancy weight, respectively. Lean body weight resulted in more inaccurate collections (24-68%). The current weight was frequently unavailable (28%) and thus not used. For 161 women (81% proteinuric) with serial 24-hour urine levels, a median [interquartile range] of 11 [5-31] days apart, the between-measurement difference in UcreatV was 14.4% [6.0-24.9]; 40 women (24.8%) had values of 25% or greater, exceeding analytic and biologic variation. Twenty-four-hour urine collection is frequently inaccurate and not a precise measure of proteinuria or creatinine clearance.
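Both completeness checks are simple arithmetic; a sketch, noting that the denominator for the between-measurement difference (mean of the pair) is our assumption:

```python
def collection_complete(ucreatv_umol_day, weight_kg, low=97.0, high=220.0):
    """Flag a single 24-h collection as complete if creatinine excretion per kg
    of weight falls within the expected range used in the study (umol/kg/day)."""
    per_kg = ucreatv_umol_day / weight_kg
    return low <= per_kg <= high

def between_measurement_diff_pct(ucreatv_1, ucreatv_2):
    """Percent difference between serial UcreatV values; >=25% exceeded the
    combined analytic and biologic variation in the study."""
    return abs(ucreatv_1 - ucreatv_2) / ((ucreatv_1 + ucreatv_2) / 2.0) * 100.0

print(collection_complete(14000, 70))                       # True (200 umol/kg/day)
print(round(between_measurement_diff_pct(12000, 9000), 1))  # 28.6 -> flagged
```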
PTH(1-34) for the Primary Prevention of Postthyroidectomy Hypocalcemia: The THYPOS Trial.
Palermo, Andrea; Mangiameli, Giuseppe; Tabacco, Gaia; Longo, Filippo; Pedone, Claudio; Briganti, Silvia Irina; Maggi, Daria; Vescini, Fabio; Naciu, Anda; Lauria Pantano, Angelo; Napoli, Nicola; Angeletti, Silvia; Pozzilli, Paolo; Crucitti, Pierfilippo; Manfrini, Silvia
2016-11-01
There are no studies evaluating teriparatide for prevention of post-thyroidectomy hypocalcemia. Our objective was to evaluate whether teriparatide can prevent postsurgical hypocalcemia and shorten the hospitalization in subjects at high risk of hypocalcemia following thyroid surgery. This was a prospective phase II randomized open-label trial. This trial was set on a surgical ward. Twenty-six subjects (6 males, 20 females) with intact PTH lower than 10 pg/ml 4 hours after thyroidectomy were included. Subjects were randomized (1:1) to receive SC administration of 20 mcg of teriparatide every 12 hours until discharge (treatment group) or to follow standard clinical care (control group). Adjusted serum calcium, duration of hospitalization, and calcium/calcitriol supplementation were measured. Overall, the incidence of hypocalcemia was 3/13 in the treatment group and 11/13 in the control group (P = .006). Treated patients had a lower risk of hypocalcemia than controls (relative risk, 0.26 [95% confidence interval, 0.09-0.72]). The median duration of hospitalization was 3 days (interquartile range, 1) in control subjects and 2 days (interquartile range, 0) in treated subjects (P = .012). One month after discharge, 10/13 subjects in the treatment group had stopped calcium carbonate supplements, while only 5/13 in the control group had discontinued calcium. The ANOVA for repeated measures showed a significant difference in calcium supplements between groups at the 1-month visit (P = .04), as well as a significant difference between discharge and the 1-month visit in the treatment group (P for interaction time × group = .04). Teriparatide may prevent postsurgical hypocalcemia, shorten the duration of hospitalization, and reduce the need for calcium and vitamin D supplementation after discharge in high-risk subjects after thyroid surgery.
Kieneker, Lyanne M; Gansevoort, Ron T; Mukamal, Kenneth J; de Boer, Rudolf A; Navis, Gerjan; Bakker, Stephan J L; Joosten, Michel M
2014-10-01
Previous prospective cohort studies on the association between potassium intake and risk of hypertension have almost exclusively relied on self-reported dietary data, whereas repeated 24-hour urine excretions, as an estimate of dietary intake, may provide a more objective and quantitative estimate of this association. Risk of hypertension (defined as blood pressure ≥140/90 mm Hg or initiation of blood pressure-lowering drugs) was prospectively studied in 5511 normotensive subjects aged 28 to 75 years not using blood pressure-lowering drugs at baseline of the Prevention of Renal and Vascular End-Stage Disease (PREVEND) study. Potassium excretion was measured in two 24-hour urine specimens at baseline (1997-1998) and midway during follow-up (2001-2003). Baseline median potassium excretion was 70 mmol/24 h (interquartile range, 57-85 mmol/24 h), which corresponds to a dietary potassium intake of ≈91 mmol/24 h. During a median follow-up of 7.6 years (interquartile range, 5.0-9.3 years), 1172 subjects developed hypertension. The lowest sex-specific tertile of potassium excretion (men: <68 mmol/24 h; women: <58 mmol/24 h) had an increased risk of hypertension after multivariable adjustment (hazard ratio, 1.20; 95% confidence interval, 1.05-1.37), compared with the upper 2 tertiles (P for nonlinearity=0.008). The proportion of hypertension attributable to low potassium excretion was 6.2% (95% confidence interval, 1.7%-10.9%). No association was found between the sodium to potassium excretion ratio and risk of hypertension after multivariable adjustment. Low urinary potassium excretion was associated with an increased risk of developing hypertension. Dietary strategies to increase potassium intake to the recommended level of 90 mmol/d may have the potential to reduce the incidence of hypertension. © 2014 American Heart Association, Inc.
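The attributable proportion is consistent with Levin's population attributable fraction formula; this reconstruction (ours, not the authors' code) approximately reproduces the reported 6.2% from the tertile prevalence and adjusted hazard ratio:

```python
def population_attributable_fraction(prevalence_exposed, relative_risk):
    """Levin's formula: PAF = p(RR - 1) / (1 + p(RR - 1))."""
    excess = prevalence_exposed * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Lowest sex-specific tertile (p = 1/3) with the adjusted hazard ratio of 1.20
print(round(population_attributable_fraction(1 / 3, 1.20) * 100, 1))  # 6.2
```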
Mechanical Thrombectomy in Perioperative Strokes: A Case-Control Study.
Premat, Kévin; Clovet, Olivier; Frasca Polara, Giulia; Shotar, Eimad; Bartolini, Bruno; Yger, Marion; Di Maria, Federico; Baronnet, Flore; Pistocchi, Silvia; Le Bouc, Raphaël; Pires, Christine; Sourour, Nader; Alamowitch, Sonia; Samson, Yves; Degos, Vincent; Clarençon, Frédéric
2017-11-01
Perioperative strokes (POS) are rare but serious complications for which mechanical thrombectomy could be beneficial. We aimed to compare the technical results and patient outcomes in a population of POS versus non-POS (nPOS) treated by mechanical thrombectomy. From 2010 to 2017, 25 patients with POS (ie, acute ischemic stroke occurring during or within 30 days after a procedure) who underwent mechanical thrombectomy (POS group) were enrolled and paired with 50 consecutive patients with nPOS (control group), based on occlusion site, National Institutes of Health Stroke Scale, and age. Respectively, mean age was 68.3±16.6 versus 67.2±16.6 years (P=0.70), and median National Institutes of Health Stroke Scale score at admission was 20 (interquartile range, 15-25) versus 19 (interquartile range, 17-25; P=0.79). Good clinical outcome (modified Rankin Scale score of 0-2 at 3 months) was achieved by 33.3% (POS) versus 56.5% (nPOS) of patients (P=0.055). Successful reperfusion (modified Thrombolysis In Cerebral Infarction score of ≥2b) was obtained in 76% (POS) versus 86% (nPOS) of cases (P=0.22). Mortality at 3 months was 33.3% in the POS group versus 4.2% in the nPOS group (P=0.002). The rate of major procedural complications was 4% (POS) versus 6% (nPOS); none were lethal. Average time from symptoms' onset to reperfusion was 4.9 hours (±2.0) in POS versus 5.2 hours (±2.6) in nPOS. Successful reperfusion seems accessible in POS within a reasonable amount of time and with a good level of safety. However, favorable outcome was achieved at a lower rate than in nPOS, owing to a higher mortality rate. © 2017 American Heart Association, Inc.
Seizure burden is independently associated with short-term outcome in critically ill children.
Payne, Eric T.; Zhao, Xiu Yan; Frndova, Helena; McBain, Kristin; Sharma, Rohit; Hutchison, James S.
2014-01-01
Seizures are common among critically ill children, but their relationship to outcome remains unclear. We sought to quantify the relationship between electrographic seizure burden and short-term neurological outcome, while controlling for diagnosis and illness severity. Furthermore, we sought to determine whether there is a seizure burden threshold above which there is an increased probability of neurological decline. We prospectively evaluated all infants and children admitted to our paediatric and cardiac intensive care units who underwent clinically ordered continuous video-electroencephalography monitoring over a 3-year period. Seizure burden was quantified by calculating the maximum percentage of any hour that was occupied by electrographic seizures. Outcome measures included neurological decline, defined as a worsening Paediatric Cerebral Performance Category score between hospital admission and discharge, and in-hospital mortality. Two hundred and fifty-nine subjects were evaluated (51% male) with a median age of 2.2 years (interquartile range: 0.3 days–9.7 years). The median duration of continuous video-electroencephalography monitoring was 37 h (interquartile range: 21–56 h). Seizures occurred in 93 subjects (36%, 95% confidence interval = 30–42%), with 23 (9%, 95% confidence interval = 5–12%) experiencing status epilepticus. Neurological decline was observed in 174 subjects (67%), who had a mean maximum seizure burden of 15.7% per hour, compared to 1.8% per hour for those without neurological decline (P < 0.0001). Above a maximum seizure burden threshold of 20% per hour (12 min), both the probability and magnitude of neurological decline rose sharply (P < 0.0001) across all diagnostic categories. On multivariable analysis adjusting for diagnosis and illness severity, the odds of neurological decline increased by 1.13 (95% confidence interval = 1.05–1.21, P = 0.0016) for every 1% increase in maximum hourly seizure burden. Seizure burden was not associated with mortality (odds ratio: 1.003, 95% confidence interval: 0.99–1.02, P = 0.613). We conclude that in this cohort of critically ill children, increasing seizure burden was independently associated with a greater probability and magnitude of neurological decline. Our observation that a seizure burden of more than 12 min in a given hour was strongly associated with neurological decline suggests that early antiepileptic drug management is warranted in this population, and identifies this seizure burden threshold as a potential therapeutic target. These findings support the hypothesis that electrographic seizures independently contribute to brain injury and worsen outcome. Our results motivate and inform the design of future studies to determine whether more aggressive seizure treatment can improve outcome.
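A sketch of the exposure metric (maximum percentage of any hour occupied by seizures); we use fixed one-hour bins for simplicity, whereas the study's "any hour" may have been a sliding window, and the annotation format is hypothetical:

```python
import numpy as np

def max_hourly_seizure_burden(seizure_intervals_min, record_hours):
    """Maximum % of any one-hour bin occupied by electrographic seizures.
    seizure_intervals_min: (start_minute, end_minute) pairs from EEG annotation."""
    minutes = np.zeros(int(record_hours * 60))
    for start, end in seizure_intervals_min:
        minutes[int(start):int(end)] = 1.0
    hourly = minutes[: len(minutes) // 60 * 60].reshape(-1, 60)
    return float(hourly.mean(axis=1).max() * 100.0)

# A single 12-minute seizure in one hour reaches the 20%-per-hour decline threshold
print(max_hourly_seizure_burden([(30, 42)], record_hours=2))  # 20.0
```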
De Bernardi Rodrigues, Ana Maria; da Silva, Cleliani de Cassia; Vasques, Ana Carolina Junqueira; Camilo, Daniella Fernandes; Barreiro, Francieli; Cassani, Roberta Soares Lara; Zambon, Mariana Porto; Antonio, Maria Ângela Reis de Góes Monteiro; Geloneze, Bruno
2016-05-01
The association between short sleep duration and decreased insulin sensitivity in adolescents has been described. However, to our knowledge, no studies have investigated this association measuring insulin sensitivity by the hyperglycemic clamp technique. To compare the distributions of parameters of insulin resistance in adolescents with sleep deprivation vs adequate sleep, and to investigate the association between sleep deprivation and insulin sensitivity. Cross-sectional multicenter study using data from the Brazilian Metabolic Syndrome Study conducted from June 29, 2011, to December 3, 2014, at an obesity outpatient clinic at the University of Campinas and public schools, with a convenience sample of 615 adolescents aged 10 to 19.9 years with a body mass index (BMI; calculated as weight in kilograms divided by height in meters squared) for age and sex at the fifth percentile or higher. A subsample of 81 adolescents underwent the hyperglycemic clamp technique. The self-reported sleep duration was used to classify the population into 2 groups: adolescents with sleep deprivation (<8 hours/night) and adolescents with adequate sleep (≥8 hours/night). Insulin sensitivity was assessed using the hyperglycemic clamp technique. Among the 615 adolescents (56.3% female; median age, 15.9 years [interquartile range, 12.9-17.8 years]) included in the sample, the mean (SD) sleep duration was 7.9 (1.7) hours/night. The adolescents with sleep deprivation (n = 257) compared with those with adequate sleep (n = 358) had a higher median (interquartile range) age (17.0 [15.4-18.3] vs 14.1 [11.8-16.9] years), BMI (25.0 [21.2-29.3] vs 23.1 [19.5-27.6]), waist circumference (83.0 [73.5-95.4] vs 79.0 [68.5-91.0] cm), sagittal abdominal diameter (17.9 [15.8-20.8] vs 17.0 [15.0-19.8] cm), neck circumference (35.2 [33.0-38.0] vs 33.0 [30.0-35.5] cm), uric acid level (4.9 [4.0-5.8] vs 4.5 [3.7-5.5] mg/dL), and white blood cell count (7000 [5900-8200] vs 6600 [5600-7800] cells/μL) (all P < .05). Moreover, the adolescents with sleep deprivation had a lower median (interquartile range) insulin sensitivity index compared with those with adequate sleep (0.10 [0.05-0.21] vs 0.21 [0.09-0.33] mg · kg fat-free mass(-1) · min(-1) · mU/L × 100, respectively; difference, -0.01; 95% CI, -0.01 to -0.00; P = .02). After controlling for age and sex in the multivariate regression model, sleep deprivation remained an independent predictor for those variables. In the sleep deprivation group, BMI and central distribution of fat were higher in all categories of adiposity. Sleep deprivation (<8 hours of sleep per night) is associated with centripetal distribution of fat and decreased insulin sensitivity in adolescents. Therefore, investigations of sleep duration and sleep quality in adolescents should be included in clinical practice to promote, through health education, the eradication of the health risks associated with sleep restriction.
False-Positive Rate of AKI Using Consensus Creatinine-Based Criteria.
Lin, Jennie; Fernandez, Hilda; Shashaty, Michael G S; Negoianu, Dan; Testani, Jeffrey M; Berns, Jeffrey S; Parikh, Chirag R; Wilson, F Perry
2015-10-07
Use of small changes in serum creatinine to diagnose AKI allows for earlier detection but may increase diagnostic false-positive rates because of inherent laboratory and biologic variabilities of creatinine. We examined serum creatinine measurement characteristics in a prospective observational clinical reference cohort of 2267 adult patients with AKI by Kidney Disease Improving Global Outcomes creatinine criteria and used these data to create a simulation cohort to model AKI false-positive rates. We simulated up to seven successive blood draws on an equal population of hypothetical patients with unchanging true serum creatinine values. Error terms generated from laboratory and biologic variabilities were added to each simulated patient's true serum creatinine value to obtain the simulated measured serum creatinine for each blood draw. We determined the proportion of patients who would be erroneously diagnosed with AKI by Kidney Disease Improving Global Outcomes creatinine criteria. Within the clinical cohort, 75.0% of patients received four serum creatinine draws within at least one 48-hour period during hospitalization. After four simulated creatinine measurements that accounted for laboratory variability calculated from assay characteristics and 4.4% of biologic variability determined from the clinical cohort and publicly available data, the overall false-positive rate for AKI diagnosis was 8.0% (interquartile range, 7.9%-8.1%), whereas patients with true serum creatinine ≥1.5 mg/dl (representing 21% of the clinical cohort) had a false-positive AKI diagnosis rate of 30.5% (interquartile range, 30.1%-30.9%) versus 2.0% (interquartile range, 1.9%-2.1%) in patients with true serum creatinine values <1.5 mg/dl (P<0.001). Use of small serum creatinine changes to diagnose AKI is limited by high false-positive rates caused by inherent variability of serum creatinine at higher baseline values, potentially misclassifying patients with CKD in AKI studies. Copyright © 2015 by the American Society of Nephrology.
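A Monte Carlo sketch in the spirit of the described simulation; the 4.4% biologic CV is from the abstract, while the laboratory CV, the four draws, and the simplified rule (any ≥0.3 mg/dl rise above the running minimum) are our assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

def aki_false_positive_rate(true_scr, n_patients=100_000, n_draws=4,
                            lab_cv=0.027, bio_cv=0.044):
    """Unchanged true creatinine; measured values perturbed by combined
    laboratory and biologic variability; AKI flagged on a KDIGO-style
    >=0.3 mg/dl rise above the minimum of all earlier draws."""
    total_cv = np.sqrt(lab_cv**2 + bio_cv**2)
    draws = true_scr * (1 + rng.normal(0, total_cv, size=(n_patients, n_draws)))
    running_min = np.minimum.accumulate(draws, axis=1)
    rises = draws[:, 1:] - running_min[:, :-1]
    return float((rises >= 0.3).any(axis=1).mean())

for scr in (0.8, 1.5, 2.5):  # false positives rise steeply with baseline creatinine
    print(scr, round(aki_false_positive_rate(scr) * 100, 1), "%")
```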
Kim, Joon-Tae; Chung, Pil-Wook; Starkman, Sidney; Sanossian, Nerses; Stratton, Samuel J; Eckstein, Marc; Pratt, Frank D; Conwit, Robin; Liebeskind, David S; Sharma, Latisha; Restrepo, Lucas; Tenser, May-Kim; Valdes-Sueiras, Miguel; Gornbein, Jeffrey; Hamilton, Scott; Saver, Jeffrey L
2017-02-01
The Los Angeles Motor Scale (LAMS) is a 3-item, 0- to 10-point motor stroke-deficit scale developed for prehospital use. We assessed the convergent, divergent, and predictive validity of the LAMS when performed by paramedics in the field at multiple sites in a large and diverse geographic region. We analyzed early assessment and outcome data prospectively gathered in the FAST-MAG trial (Field Administration of Stroke Therapy-Magnesium phase 3) among patients with acute cerebrovascular disease (cerebral ischemia and intracranial hemorrhage) within 2 hours of onset, transported by 315 ambulances to 60 receiving hospitals. Among 1632 acute cerebrovascular disease patients (age 70±13 years, male 57.5%), time from onset to prehospital LAMS was a median of 30 minutes (interquartile range 20-50), onset to early postarrival (EPA) LAMS a median of 145 minutes (interquartile range 119-180), and onset to EPA National Institutes of Health Stroke Scale a median of 150 minutes (interquartile range 120-180). Between the prehospital and EPA assessments, LAMS scores were stable in 40.5%, improved in 37.6%, and worsened in 21.9%. In tests of convergent validity, against the EPA National Institutes of Health Stroke Scale, correlations were r=0.49 for the prehospital LAMS and r=0.89 for the EPA LAMS. Prehospital LAMS scores did diverge from the prehospital Glasgow Coma Scale, r=-0.22. Predictive accuracy (adjusted C statistics) for nondisabled 3-month outcome was as follows: prehospital LAMS, 0.76 (95% confidence interval 0.74-0.78); EPA LAMS, 0.85 (95% confidence interval 0.83-0.87); and EPA National Institutes of Health Stroke Scale, 0.87 (95% confidence interval 0.85-0.88). In this multicenter, prospective, prehospital study, the LAMS showed good to excellent convergent, divergent, and predictive validity, further establishing it as a validated instrument to characterize stroke severity in the field. © 2017 American Heart Association, Inc.
Levin, Phillip D; Golovanevski, Mila; Moses, Allon E; Sprung, Charles L; Benenson, Shmuel
2011-01-01
The role of ICU design and particularly single-patient rooms in decreasing bacterial transmission between ICU patients has been debated. A recent change in our ICU allowed further investigation. Pre-move ICU-A and pre-move ICU-B were open-plan units. In March 2007, ICU-A moved to single-patient rooms (post-move ICU-A). ICU-B remained unchanged (post-move ICU-B). The same physicians cover both ICUs. Cultures of specified resistant organisms in surveillance or clinical cultures from consecutive patients staying >48 hours were compared for the different ICUs and periods to assess the effect of ICU design on acquisition of resistant organisms. Data were collected for 62, 62, 44 and 39 patients from pre-move ICU-A, post-move ICU-A, pre-move ICU-B and post-move ICU-B, respectively. Fewer post-move ICU-A patients acquired resistant organisms (3/62, 5%) compared with post-move ICU-B patients (7/39, 18%; P = 0.043, P = 0.011 using survival analysis) or pre-move ICU-A patients (14/62, 23%; P = 0.004, P = 0.012 on survival analysis). Only the admission period was significant for acquisition of resistant organisms comparing pre-move ICU-A with post-move ICU-A (hazard ratio = 5.18, 95% confidence interval = 1.03 to 16.06; P = 0.025). More antibiotic-free days were recorded in post-move ICU-A (median = 3, interquartile range = 0 to 5) versus post-move ICU-B (median = 0, interquartile range = 0 to 4; P = 0.070) or pre-move ICU-A (median = 0, interquartile range = 0 to 4; P = 0.017). Adequate hand hygiene was observed on 140/242 (58%) occasions in post-move ICU-A versus 23/66 (35%) occasions in post-move ICU-B (P < 0.001). Improved ICU design, and particularly use of single-patient rooms, decreases acquisition of resistant bacteria and antibiotic use. This observation should be considered in future ICU design.
Percutaneous Dilational Tracheotomy in Solid-Organ Transplant Recipients.
Ozdemirkan, Aycan; Ersoy, Zeynep; Zeyneloglu, Pinar; Gedik, Ender; Pirat, Arash; Haberal, Mehmet
2015-11-01
Solid-organ transplant recipients may require percutaneous dilational tracheotomy because of prolonged mechanical ventilation or airway issues, but data regarding its safety and effectiveness in solid-organ transplant recipients are scarce. Here, we evaluated the safety, effectiveness, and benefits in terms of lung mechanics, complications, and patient comfort of percutaneous dilational tracheotomy in solid-organ transplant recipients. Medical records from 31 solid-organ transplant recipients (median age of 41.0 years [interquartile range, 18.0-53.0 y]) who underwent percutaneous dilational tracheotomy at our hospital between January 2010 and March 2015 were analyzed, including primary diagnosis, comorbidities, duration of orotracheal intubation and mechanical ventilation, length of intensive care unit and hospital stays, the time interval between transplant and percutaneous dilational tracheotomy, Acute Physiology and Chronic Health Evaluation II score, tracheotomy-related complications, and pulmonary compliance and ratio of partial pressure of arterial oxygen to fraction of inspired oxygen. The median Acute Physiology and Chronic Health Evaluation II score on admission was 24.0 (interquartile range, 18.0-29.0). The median interval from transplant to percutaneous dilational tracheotomy was 105.5 days (interquartile range, 13.0-2165.0 d). The only major complication noted was left-sided pneumothorax in 1 patient. There were no significant differences in the ratio of partial pressure of arterial oxygen to fraction of inspired oxygen before and after the procedure (170.0 [interquartile range, 102.2-302.0] vs 210.0 [interquartile range, 178.5-345.5]; P = .052). However, pulmonary compliance results preprocedure and postprocedure were significantly different (0.020 L/cm H2O [interquartile range, 0.015-0.030 L/cm H2O] vs 0.030 L/cm H2O [interquartile range, 0.020-0.041 L/cm H2O]; P = .001). Need for sedation significantly decreased after tracheotomy (from 17 patients [54.8%] to 8 patients [25.8%]; P = .004). Percutaneous dilational tracheotomy with bronchoscopic guidance is an efficacious and safe technique for maintaining airways in solid-organ transplant recipients who require prolonged mechanical ventilation, resulting in possible improvements in ventilatory mechanics and patient comfort.
Shkirkova, Kristina; Akam, Eftitan Y; Huang, Josephine; Sheth, Sunil A; Nour, May; Liang, Conrad W; McManus, Michael; Trinh, Van; Duckwiler, Gary; Tarpley, Jason; Vinuela, Fernando; Saver, Jeffrey L
2017-12-01
Background Rapid dissemination and coordination of clinical and imaging data among multidisciplinary team members are essential for optimal acute stroke care. Aim To characterize the feasibility and utility of the Synapse Emergency Room mobile (Synapse ERm) informatics system. Methods We implemented the Synapse ERm system for integration of clinical data, computerized tomography, magnetic resonance, and catheter angiographic imaging, and real-time stroke team communications, in consecutive acute neurovascular patients at a Comprehensive Stroke Center. Results From May 2014 to October 2014, the Synapse ERm application was used by 33 stroke team members in 84 Code Stroke alerts. Mean patient age was 69.6 (±17.1) years, and 41.5% of patients were female. Final diagnosis was: ischemic stroke 64.6%, transient ischemic attack 7.3%, intracerebral hemorrhage 6.1%, and cerebrovascular-mimic 22.0%. Each patient's Synapse ERm record was viewed a median of 10 times (interquartile range 6-18) by a median of 3 (interquartile range 2-4) team members. The most used feature was computerized tomography, magnetic resonance, and catheter angiography image display. In-app "tweet" team communications were sent by a median of 1 (interquartile range 0-1, range 0-13) user per case and viewed by a median of 1 (interquartile range 0-3, range 0-44) team members. Use of the system was associated with rapid treatment times, faster than national guidelines, including median door-to-needle time of 51.0 min (interquartile range 40.5-69.5) and median door-to-groin time of 94.5 min (interquartile range 85.5-121.3). In user surveys, the mobile information platform was judged easy to employ in 91% (95% confidence interval 65%-99%) of uses and of added help in stroke management in 50% (95% confidence interval 22%-78%). Conclusion The Synapse ERm mobile platform for stroke team distribution and integration of clinical and imaging data was feasible to implement, showed high ease of use, and had moderate perceived added utility in therapeutic management.
Excessive Exposure to Secondhand Tobacco Smoke among Hospitality Workers in Kyrgyzstan
Vinnikov, Denis; Brimkulov, Nurlan; Shahrir, Shahida; Breysse, Patrick; Navas-Acien, Ana
2010-01-01
The aim of this study was to assess the levels of secondhand smoke (SHS) exposure of men and women in public places in Kyrgyzstan. This cross-sectional study involved 10 bars and restaurants in Bishkek, the capital city of Kyrgyzstan. Smoking was allowed in all establishments. Median (interquartile range) air nicotine concentrations were 6.82 (2.89, 8.86) μg/m3. Employees were asked about their smoking history and exposure to SHS at work. Employees were exposed to SHS for a mean (SD) of 13.5 (3.6) hours a day and 5.8 (1.4) days a week. Women were exposed to more hours of SHS at work compared to men. Hospitality workers are exposed to excessive amounts of SHS from customers. Legislation to ban smoking in public places including bars and restaurants is urgently needed to protect workers and patrons from the harmful effects of SHS. PMID:20617012
O'Bichere, Austin; Green, Colin; Phillips, Robin K S
2004-09-01
Water for colostomy irrigation is largely absorbed by the colon, which may result in less efficient expulsion of stool. This study compared the outcome of colonic cleansing with water and polyethylene glycol solution. In a cross-over study, 41 colostomy irrigators were randomly assigned to water or polyethylene glycol solution irrigation first and then the other regimen, each for one week. Patients recorded fluid inflow time, total washout time, cramps, leakage episodes, number of stoma pouches used, and satisfaction scores (Visual Analog Scale, 1-10: 1 = poor, and 10 = excellent). The median and interquartile range for each variable were calculated, and the two treatments were compared (Wilcoxon's test). Eight patients failed to complete the study. Thirty-three patients (20 females; mean age, 55 (range, 39-73) years) provided 352 irrigation sessions: water (n = 176), and polyethylene glycol solution (n = 176). Irrigation was performed every 24, 48, and 72 hours by 17, 9, and 7 patients respectively, using 500 ml (n = 1), 750 ml (n = 2), 1,000 ml (n = 16), 1,500 ml (n = 11), 2,000 ml (n = 2), and 3,500 ml (n = 1) of fluid. The median (interquartile range) values for water vs. polyethylene glycol solution were: fluid inflow time, 6 (4.4-10.8) vs. 6.3 (4.1-11) minutes (P = 0.48); total washout time, 53 (33-69) vs. 38 (28-55) minutes (P = 0.01); leakage episodes, 2.3 (1.7-3.8) vs. 0.7 (0.2-1) (P < 0.001); satisfaction score, 5.8 (4-7.5) vs. 8.8 (8.3-10) (P < 0.001); and stoma pouch usage per week, 75 (45-80) vs. 43 (0-80) (P = 0.008). No difference was demonstrated for frequency of cramps (P = 0.24). Polyethylene glycol solution performed significantly better than water and may be a superior alternative fluid regimen for colostomy irrigation.
Ahmad, Tariq; Jackson, Keyanna; Rao, Veena S; Tang, W H Wilson; Brisco-Bacik, Meredith A; Chen, Horng H; Felker, G Michael; Hernandez, Adrian F; O'Connor, Christopher M; Sabbisetti, Venkata S; Bonventre, Joseph V; Wilson, F Perry; Coca, Steven G; Testani, Jeffrey M
2018-05-08
Worsening renal function (WRF) in the setting of aggressive diuresis for acute heart failure treatment may reflect renal tubular injury or simply indicate a hemodynamic or functional change in glomerular filtration. Well-validated tubular injury biomarkers, N-acetyl-β-D-glucosaminidase, neutrophil gelatinase-associated lipocalin, and kidney injury molecule 1, are now available that can quantify the degree of renal tubular injury. The ROSE-AHF trial (Renal Optimization Strategies Evaluation-Acute Heart Failure) provides an experimental platform for the study of mechanisms of WRF during aggressive diuresis for acute heart failure because the ROSE-AHF protocol dictated high-dose loop diuretic therapy in all patients. We sought to determine whether tubular injury biomarkers are associated with WRF in the setting of aggressive diuresis and its association with prognosis. Patients in the multicenter ROSE-AHF trial with baseline and 72-hour urine tubular injury biomarkers were analyzed (n=283). WRF was defined as a ≥20% decrease in glomerular filtration rate estimated with cystatin C. Consistent with protocol-driven aggressive dosing of loop diuretics, participants received a median 560 mg IV furosemide equivalents (interquartile range, 300-815 mg), which induced a urine output of 8425 mL (interquartile range, 6341-10,528 mL) over the 72-hour intervention period. Levels of N-acetyl-β-D-glucosaminidase and kidney injury molecule 1 did not change with aggressive diuresis (both P>0.59), whereas levels of neutrophil gelatinase-associated lipocalin decreased slightly (-8.7 ng/mg; interquartile range, -169 to 35 ng/mg; P<0.001). WRF occurred in 21.2% of the population and was not associated with an increase in any marker of renal tubular injury: neutrophil gelatinase-associated lipocalin (P=0.21), N-acetyl-β-D-glucosaminidase (P=0.46), or kidney injury molecule 1 (P=0.22). Increases in neutrophil gelatinase-associated lipocalin, N-acetyl-β-D-glucosaminidase, and kidney injury molecule 1 were paradoxically associated with improved survival (adjusted hazard ratio, 0.80 per 10 percentile increase; 95% confidence interval, 0.69-0.91; P=0.001). Kidney tubular injury does not appear to have an association with WRF in the context of aggressive diuresis of patients with acute heart failure. These findings reinforce the notion that the small to moderate deteriorations in renal function commonly encountered with aggressive diuresis are dissimilar from traditional causes of acute kidney injury. © 2018 American Heart Association, Inc.
Nonresponders: prolonged fever among infants with urinary tract infections.
Bachur, R
2000-05-01
The majority of young children with fever and urinary tract infections (UTIs) have evidence of pyelonephritis based on renal scans. Resolution of fever during treatment is 1 clinical marker of adequate treatment. Theoretically, prolonged fever may be a clue to complications, such as urinary obstruction or renal abscess. Describe the pattern of fever in febrile children undergoing treatment of a UTI. Compare the clinical characteristics of those patients with prolonged fever to those who respond faster to therapy. An urban pediatric hospital. Medical record review. All children ≤2 years old admitted to the pediatric service with a primary discharge diagnosis of pyelonephritis or UTI were reviewed for 65 consecutive months. Patients with previous UTI, known urologic problems, or immunodeficiency were excluded. Only patients with an admitting temperature ≥38 degrees C and those who met standard culture criteria were studied. Temperatures are not recorded hourly on the inpatient unit; therefore, they were assigned to blocks of time. Nonresponders were defined as those above the 90th percentile for the time to defervesce. Nonresponders were then compared with the balance of the study patients, termed responders. Of 288 patients studied, the median age was 5.6 months (interquartile range: 1.3-7.9 months old). Median admission temperature was 39.3 degrees C (interquartile range: 38.5 degrees C-40.1 degrees C). Median time to defervesce fell within the 13- to 16-hour time block. Sixty-eight percent were afebrile by 24 hours and 89% by 48 hours. Thirty-one patients had fever >48 hours (nonresponders). Nonresponders were older than responders (9.4 vs 4.1 months old) but had similar initial temperatures (39.8 vs 39.2 degrees C), white blood cell counts (18.4 vs 17.1 x 1000/mm(3)), and band counts (1.4 vs 1.2 x 1000/mm(3)). Nonresponders had similar urinalyses with regard to leukocyte esterase positivity (23/29 vs 211/246), nitrite positivity (8/28 vs 88/221), and the number of patients with "too numerous to count" white blood cell counts per high power field (12/28 vs 77/220). Nonresponders were as likely as responders to have bacteremia (3/31 vs 21/256), hydronephrosis by renal ultrasound (1/31 vs 12/232), and significant vesicoureteral reflux (more than or equal to grade 3; 5/26 vs 30/219). Escherichia coli was the pathogen in 28 of 31 (nonresponders) and 225 of 257 (responders) cultures. The number of cultures with ≥100 colony-forming units/mL was similar (25/31 nonresponders vs 206/257 responders). Repeat urine cultures were performed in 93% of patients during the admission; all culture results were negative. No renal abscesses or pyo-hydronephrosis were diagnosed. Eighty-nine percent of young children with febrile UTIs were afebrile within 48 hours of initiating parenteral antibiotics. The patients who took longer than 48 hours to defervesce were clinically similar to those whose fevers responded faster to therapy. If antibiotic sensitivities are known, additional diagnostic studies or prolonged hospitalizations may not be justified solely based on persistent fever beyond 48 hours of therapy.
Intestinal cytokines in children with pervasive developmental disorders.
DeFelice, Magee L; Ruchelli, Eduardo D; Markowitz, Jonathan E; Strogatz, Melissa; Reddy, Krishna P; Kadivar, Khadijeh; Mulberg, Andrew E; Brown, Kurt A
2003-08-01
A relationship between autism and gastrointestinal (GI) immune dysregulation has been postulated based on incidence of GI complaints as well as macroscopically observed lymphonodular hyperplasia and microscopically determined enterocolitis in pediatric patients with autism. To evaluate GI immunity, we quantitatively assessed levels of proinflammatory cytokines, interleukin (IL)-6, IL-8, and IL-1beta, produced by intestinal biopsies of children with pervasive developmental disorders. Fifteen patients, six with pervasive developmental disorders and nine age-matched controls, presenting for diagnostic colonoscopy were enrolled. Endoscopic biopsies were organ cultured, supernatants were harvested, and IL-6, IL-8, and IL-1beta levels were quantified by ELISA. Tissue histology was evaluated by blinded pathologists. Concentrations of IL-6 from intestinal organ culture supernatants of patients with pervasive developmental disorders (median 318.5 pg/ml, interquartile range 282.0-393.0 pg/ml) when compared with controls (median 436.9 pg/ml, interquartile range 312.6-602.5 pg/ml) were not significantly different (p = 0.0987). Concentrations of IL-8 (median 84,000 pg/ml, interquartile range 16,000-143,000 pg/ml) when compared with controls (median 177,000 pg/ml, interquartile range 114,000-244,000 pg/ml) were not significantly different (p = 0.0707). Concentrations of IL-1beta (median 0.0 pg/ml, interquartile range 0.0-94.7 pg/ml) when compared with controls (median 0.0 pg/ml, interquartile range 0.0-60.2 pg/ml) were not significantly different (p = 0.8826). Tissue histology was nonpathological for all patients. We have demonstrated no significant difference in production of IL-6, IL-8, and IL-1beta between patients with pervasive developmental disorders and age-matched controls. In general, intestinal levels of IL-6 and IL-8 were lower in patients with pervasive developmental disorders than in age-matched controls. These data fail to support an association between autism and GI inflammation.
Evaluation of an artificial intelligence program for estimating occupational exposures.
Johnston, Karen L; Phillips, Margaret L; Esmen, Nurtan A; Hall, Thomas A
2005-03-01
Estimation and Assessment of Substance Exposure (EASE) is an artificial intelligence program developed by the UK Health and Safety Executive to assess exposure. EASE computes estimated airborne concentrations based on a substance's vapor pressure and the types of controls in the work area. Though EASE is intended only to make broad predictions of exposure from occupational environments, some occupational hygienists might attempt to use EASE for individual exposure characterizations. This study investigated whether EASE would accurately predict actual sampling results from a chemical manufacturing process. Personal breathing zone time-weighted average (TWA) monitoring data for two volatile organic chemicals--a common solvent (toluene) and a specialty monomer (chloroprene)--present in this manufacturing process were compared to EASE-generated estimates. EASE-estimated concentrations for specific tasks were weighted by task durations reported in the monitoring record to yield TWA estimates from EASE that could be directly compared to the measured TWA data. Two hundred and six chloroprene and toluene full-shift personal samples were selected from eight areas of this manufacturing process. The Spearman correlation between EASE TWA estimates and measured TWA values was 0.55 for chloroprene and 0.44 for toluene, indicating moderate predictive value for both compounds. For toluene, the interquartile range of EASE estimates at least partially overlapped the interquartile range of the measured data distributions in all process areas. The interquartile range of EASE estimates for chloroprene fell above the interquartile range of the measured data distributions in one process area, partially overlapped the third quartile of the measured data in five process areas, and fell within the interquartile range in two process areas. EASE is not a substitute for actual exposure monitoring. However, EASE can be used in conditions that cannot otherwise be sampled and in preliminary exposure assessment, provided it is recognized that the actual interquartile range could be much wider and/or offset by a factor of 10 or more.
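The task-weighting step described above is ordinary time-weighted averaging: each task's estimated concentration is weighted by its duration, and the comparison with measured TWAs uses a rank (Spearman) correlation. A minimal sketch with made-up numbers (none of the values below come from the study):

```python
from scipy.stats import spearmanr

def twa(tasks):
    """Time-weighted average: sum(concentration x duration) / total duration."""
    total = sum(t for _, t in tasks)
    return sum(c * t for c, t in tasks) / total

# Hypothetical shift: (EASE task estimate in ppm, task duration in hours).
shift = [(50.0, 1.5), (10.0, 4.0), (2.0, 2.5)]
print(f"shift TWA: {twa(shift):.1f} ppm over {sum(t for _, t in shift):.0f} h")

# Comparing EASE-derived TWAs with measured TWAs, as in the study
# (paired values below are invented for illustration).
ease_twa = [12.0, 30.0, 7.5, 22.0, 15.0]
measured_twa = [9.0, 41.0, 6.0, 18.0, 25.0]
rho, p = spearmanr(ease_twa, measured_twa)
print(f"Spearman rho = {rho:.2f} (p = {p:.2f})")
```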
Variability in Antibiotic Use Across PICUs.
Brogan, Thomas V; Thurm, Cary; Hersh, Adam L; Gerber, Jeffrey S; Smith, Michael J; Shah, Samir S; Courter, Joshua D; Patel, Sameer J; Parker, Sarah K; Kronman, Matthew P; Lee, Brian R; Newland, Jason G
2018-06-01
To characterize and compare antibiotic prescribing across PICUs to evaluate the degree of variability. Retrospective analysis from 2010 through 2014 of the Pediatric Health Information System. Forty-one freestanding children's hospitals. Children aged 30 days to 18 years admitted to a PICU in children's hospitals contributing data to the Pediatric Health Information System. To normalize for potential differences in disease severity and case mix across centers, a subanalysis was performed of children admitted with one of the 20 All Patient Refined-Diagnosis Related Groups and the seven All Patient Refined-Diagnosis Related Groups shared by all PICUs with the highest antibiotic use. The study included 3,101,201 hospital discharges from 41 institutions with 386,914 PICU patients. All antibiotic use declined during the study period. The median adjusted antibiotic use among PICU patients was 1,043 days of therapy/1,000 patient-days (interquartile range, 977-1,147 days of therapy/1,000 patient-days) compared with 893 among non-ICU children (interquartile range, 805-968 days of therapy/1,000 patient-days). For PICU patients, the median adjusted use of broad-spectrum antibiotics was 176 days of therapy/1,000 patient-days (interquartile range, 152-217 days of therapy/1,000 patient-days) and was 302 days of therapy/1,000 patient-days (interquartile range, 220-351 days of therapy/1,000 patient-days) for antimethicillin-resistant Staphylococcus aureus agents, compared with 153 days of therapy/1,000 patient-days (interquartile range, 130-182 days of therapy/1,000 patient-days) and 244 days of therapy/1,000 patient-days (interquartile range, 203-270 days of therapy/1,000 patient-days) for non-ICU children. After adjusting for potential confounders, significant institutional variability existed in antibiotic use in PICU patients, in the 20 All Patient Refined-Diagnosis Related Groups with the highest antibiotic usage, and in the seven All Patient Refined-Diagnosis Related Groups shared by all 41 PICUs. The wide variation in antibiotic use observed across children's hospital PICUs suggests inappropriate antibiotic use.
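The usage metric throughout this abstract, days of therapy per 1,000 patient-days, is a simple normalization; a sketch with hypothetical PICU totals (the counts below are invented, not Pediatric Health Information System data):

```python
def dot_per_1000_pd(days_of_therapy, patient_days):
    """Antibiotic-use rate normalized as days of therapy per 1,000 patient-days."""
    return 1000.0 * days_of_therapy / patient_days

# Hypothetical PICU: 5,200 antibiotic days of therapy over 4,900 patient-days.
rate = dot_per_1000_pd(5200, 4900)
print(f"{rate:.0f} days of therapy/1,000 patient-days")  # ~1061
```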
Shaw, Kathryn D; Taylor, Nicholas F; Brusco, Natasha K
2013-06-01
Physiotherapy services provided outside of business hours may improve patient and hospital outcomes, but there is limited understanding of what services are provided. This study described current services provided outside of business hours across Australian hospitals. Design: descriptive, cross-sectional, Web-based survey. Participants: a random sample of Australian hospitals from the public or private sector located in either metropolitan or rural/regional areas. A total of 112 completed surveys were submitted. The most common service outside of business hours was a Saturday service, provided by 61% of participating hospitals with a median (interquartile range [IQR]) of 1.0 hour (0.0-3.4) of physiotherapy per 30 beds. Sunday services were provided by 43% of hospitals, and services provided outside of business hours from Monday to Friday were provided by 14% of hospitals. More private hospitals provided some form of physiotherapy service outside of business hours (91%) than public hospitals (48%). More metropolitan hospitals provided some form of physiotherapy service outside of business hours (90%) than rural/regional hospitals (28%). Few of the hospitals providing sub-acute services had weekend physiotherapy (30%), but the majority of highly acute wards provided weekend physiotherapy (81%). Highly acute wards also provided more hours of service on a Saturday (median 8.1 hours per 30 beds, IQR 0.6-22.5) compared with acute wards (median 0.8 hours per 30 beds, IQR 0.0-2.8). There is limited availability of physiotherapy services in Australian hospitals outside of business hours. There are inequalities in physiotherapy services provided outside of business hours, with public, rural/regional and sub-acute facilities receiving fewer services outside of business hours than private, metropolitan and highly acute facilities. Copyright © 2012 John Wiley & Sons, Ltd.
Time to antibiotics for septic shock: evaluating a proposed performance measure.
Venkatesh, Arjun K; Avula, Umakanth; Bartimus, Holly; Reif, Justin; Schmidt, Michael J; Powell, Emilie S
2013-04-01
International guidelines recommend antibiotics within 1 hour of septic shock recognition; however, a recently proposed performance measure is focused on measuring antibiotic administration within 3 hours of emergency department (ED) arrival. Our objective was to describe the time course of septic shock and the subsequent implications for performance measurement. Cross-sectional study of consecutive ED patients ultimately diagnosed with septic shock. All patients were evaluated at an urban, academic ED from 2006 to 2008. Primary outcomes included time to definition of septic shock and performance on 2 measures: antibiotics within 3 hours of ED arrival vs antibiotics within 1 hour of septic shock definition. Of 267 patients with septic shock, the median time to definition was 88 minutes (interquartile range, 37-156), and 217 patients (81.9%) met the definition within 3 hours of arrival. Of the 221 patients (83.4%) who received antibiotics within 3 hours of arrival, 38 (17.2%) did not receive antibiotics within 1 hour of definition. Of 207 patients who received antibiotics within 1 hour of definition, 11.6% (n = 24) did not receive antibiotics within 3 hours of arrival. The arrival measure did not accurately classify performance in 23.4% of patients. Nearly 1 of 5 patients cannot be captured for performance measurement within 3 hours of ED arrival due to the variable progression of septic shock. Use of this measure would misclassify performance in 23% of patients. Measuring antibiotic administration based on the clinical course of septic shock rather than from ED arrival would be more appropriate. Copyright © 2013 Elsevier Inc. All rights reserved.
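The discordance the authors quantify comes from classifying each patient under two clocks: time from ED arrival and time from when the shock definition is met. A schematic sketch with hypothetical timings (the three example patients are invented, not study data):

```python
from dataclasses import dataclass

@dataclass
class Patient:
    minutes_to_definition: float   # ED arrival -> septic shock definition met
    minutes_to_antibiotics: float  # ED arrival -> first antibiotic dose

def classify(p):
    """Pass/fail under each proposed performance measure."""
    arrival_measure = p.minutes_to_antibiotics <= 180          # within 3 h of arrival
    definition_measure = (p.minutes_to_antibiotics
                          - p.minutes_to_definition) <= 60     # within 1 h of definition
    return arrival_measure, definition_measure

# Hypothetical cohort illustrating the discordance described above.
cohort = [Patient(88, 160), Patient(37, 230), Patient(156, 170)]
discordant = sum(1 for p in cohort if len(set(classify(p))) == 2)
print(f"{discordant}/{len(cohort)} patients classified differently by the two measures")
```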
Chaboyer, Wendy; Mills, Peter M; Roberts, Shelley; Latimer, Sharon
2015-02-01
Pressure injury guidelines recommend regular repositioning, yet patients' mobility and repositioning patterns are unknown. An observational study using activity monitors was undertaken to describe the 24-h activity patterns of 84 hospitalized patients at risk of developing a pressure injury. The vast majority of participants' time was spent in the sedentary activity range (94% ± 3%), followed by the light range (5% ± 4%). Patients changed their posture a median of 94 (interquartile range 48) times in the 24-h period (range 11-154), or ≈ 3.8 times per hour. Although a main focus for pressure injury prevention has been on repositioning, this study shows that patients with restricted mobility are actually moving quite often. Therefore, it might be appropriate to focus more attention on other pressure injury prevention strategies such as adequate nutrition, appropriate support surfaces and good skin care. © 2013 Wiley Publishing Asia Pty Ltd.
Single-Incision Laparoscopic Sterilization of the Cheetah (Acinonyx jubatus).
Hartman, Marthinus J; Monnet, Eric; Kirberger, Robert M; Schmidt-Küntzel, Anne; Schulman, Martin L; Stander, Jana A; Stegmann, George F; Schoeman, Johan P
2015-07-01
To describe laparoscopic ovariectomy and salpingectomy in the cheetah (Acinonyx jubatus) using single-incision laparoscopic surgery (SILS). Prospective cohort. Female cheetahs (Acinonyx jubatus) (n = 21). Cheetahs were randomly divided to receive either ovariectomy (n = 11) or salpingectomy (n = 10). The use of and complications associated with a SILS port were evaluated in all cheetahs. Surgery duration and insufflation volumes of carbon dioxide (CO2) were recorded and compared across procedures. Laparoscopic ovariectomy and salpingectomy were performed without complications using a SILS port. The poorly developed mesosalpinx and ovarian bursa facilitated access to the uterine tube for salpingectomy in the cheetah. The median surgery duration for ovariectomy was 24 minutes (interquartile range 3) and for salpingectomy was 19.5 minutes (interquartile range 3) (P = .005). The median volume of CO2 used for ovariectomy was 11.25 L (interquartile range 3.08) and for salpingectomy was 4.90 L (interquartile range 2.52) (P = .001). Laparoscopic ovariectomy and salpingectomy can be performed in the cheetah using SILS without perioperative complications. Salpingectomy is faster than ovariectomy and requires less total CO2 for insufflation. © Copyright 2015 by The American College of Veterinary Surgeons.
Hetta, Diab Fuad; Rezk, Khalid Mohammed
2016-11-01
The aim of this study was to evaluate the analgesic efficacy and safety of pectoralis-serratus interfascial plane block in comparison with thoracic paravertebral block for postmastectomy pain. A prospective randomized controlled study. Tertiary center, university hospital. Sixty-four adult women, American Society of Anesthesiologists physical status classes I, II, and III, scheduled for unilateral modified radical mastectomy with axillary evacuation. Patients were randomized to receive either pectoralis-serratus interfascial plane block, PS group (n=32), or thoracic paravertebral block, PV group (n=32). Twenty-four-hour morphine consumption and the time to rescue analgesic were recorded. The pain intensity, evaluated by visual analog scale (VAS) score at 0, 2, 4, 8, 16, and 24 hours postoperatively, was also recorded. The median (interquartile range) postoperative 24-hour morphine consumption was significantly increased in the PS group in comparison to the PV group (PS vs PV), 20 mg (16-23 mg) vs 12 mg (10-14 mg) (P<.001). The median postoperative time to first analgesic request was significantly shorter in the PS group compared to the PV group (PS, 6 hours [5-7 hours], vs PV, 11 hours [9-13 hours]) (P<.001). The intensity of pain was low in both groups on the VAS at 0, 2, and 4 hours postoperatively. However, there was a significant reduction in VAS scores in the PV group compared to the PS group at 8, 16, and 24 hours postoperatively. Pectoralis-serratus interfascial plane block was safe and easy to perform and decreased the intensity of postmastectomy pain, but it was inferior to thoracic paravertebral block. Copyright © 2016 Elsevier Inc. All rights reserved.
Out-of-pocket costs for childhood stroke: the impact of chronic illness on parents' pocketbooks.
Plumb, Patricia; Seiber, Eric; Dowling, Michael M; Lee, JoEllen; Bernard, Timothy J; deVeber, Gabrielle; Ichord, Rebecca N; Bastian, Rachel; Lo, Warren D
2015-01-01
Direct costs for children who had stroke are similar to those for adults. There is no information regarding the out-of-pocket costs families encounter. We described the out-of-pocket costs families encountered in the first year after a child's ischemic stroke. Twenty-two subjects were prospectively recruited at four centers in the United States and Canada in 2008 and 2009 as part of the "Validation of the Pediatric NIH Stroke Scale" study; families' indirect costs were tracked for 1 year. Every 3 months, parents reported hours they did not work, nonreimbursed costs for medical visits or other health care, and mileage. They provided estimates of annual income. We calculated total out-of-pocket costs in US dollars and reported costs as a proportion of annual income. Total median out-of-pocket cost for the year after an ischemic stroke was $4354 (range, $0-$28,666; interquartile range, $1008-$8245). Out-of-pocket costs were greatest in the first 3 months after the incident stroke, with the largest proportion due to lost wages, followed by transportation and nonreimbursed health care. For the entire year, median costs represented 6.8% (range, 0%-81.9%; interquartile range, 2.7%-17.2%) of annual income. Out-of-pocket expenses are significant after a child's ischemic stroke. The median costs are noteworthy given that the median American household had cash savings of $3650 at the time of the study. These results, together with previous reports of direct costs, provide a more complete view of the overall costs to families and society. Childhood stroke creates an under-recognized cost to society because of decreased parental productivity. Copyright © 2015 Elsevier Inc. All rights reserved.
Rich, David Q; Utell, Mark J; Croft, Daniel P; Thurston, Sally W; Thevenet-Morrison, Kelly; Evans, Kristin A; Ling, Frederick S; Tian, Yilin; Hopke, Philip K
2018-01-01
Prior work has reported acute associations between ST-elevation myocardial infarction (STEMI) and short-term increases in airborne particulate matter. Subsequently, the association between STEMI and hourly measures of Delta-C (a marker of woodsmoke) and black carbon (a marker of traffic pollution) measured at a central site in Rochester, NY, was examined, but no association was found. Therefore, land use regression estimates of Delta-C and black carbon concentrations at each patient's residence were developed for 246 STEMI patients treated at the University of Rochester Medical Center during the winters of 2008-2012. Using case-crossover methods, the rate of STEMI associated with increased Delta-C and BC concentrations on the same and previous 3 days was estimated after adjusting for 3-day mean temperature and relative humidity. Non-statistically significant increased rates of STEMI associated with interquartile range increases in concentrations of BC in the previous 2 days (1.10 μg/m3; OR = 1.12; 95% CI 0.93, 1.35) and Delta-C in the previous 3 days (0.43 μg/m3; OR = 1.16; 95% CI 0.96, 1.40) were found. Rates of STEMI were not significantly increased in association with same-day interquartile range increases in concentrations of BC (1.23 μg/m3; OR = 1.04; 95% CI = 0.87, 1.24) or Delta-C (0.40 μg/m3; OR = 0.94; 95% CI = 0.85, 1.09), likely due, in part, to temporal misalignment. Therefore, sophisticated spatial-temporal models will be needed to minimize exposure error and bias by better predicting concentrations at individual locations for individual hours, especially for outcomes with short-term responses to air pollution (< 24 h).
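Odds ratios "per interquartile range increase", as reported above, are obtained by rescaling a model's per-unit log-odds coefficient by the exposure's IQR. A sketch follows; the coefficient below is back-calculated from the reported OR for illustration rather than taken from the paper:

```python
import math

def or_per_iqr(beta_per_unit, iqr):
    """Rescale a per-unit log-odds coefficient from a (conditional)
    logistic model to an odds ratio per IQR increase in exposure."""
    return math.exp(beta_per_unit * iqr)

# Hypothetical per-ug/m3 coefficient, chosen so the BC odds ratio reproduces
# the 2-day-lag estimate reported above (OR = 1.12 at IQR = 1.10 ug/m3).
beta_bc = math.log(1.12) / 1.10
print(f"OR per IQR (1.10 ug/m3) increase in BC: {or_per_iqr(beta_bc, 1.10):.2f}")
```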
Vinson, David R; Ballard, Dustin W; Huang, Jie; Reed, Mary E; Lin, James S; Kene, Mamata V; Sax, Dana R; Rauchwerger, Adina S; Wang, David H; McLachlan, D Ian; Pleshakov, Tamara S; Silver, Matthew A; Clague, Victoria A; Klonecke, Andrew S; Mark, Dustin G
2017-12-13
Outpatient management of emergency department (ED) patients with acute pulmonary embolism is uncommon. We seek to evaluate the facility-level variation of outpatient pulmonary embolism management and to describe patient characteristics and outcomes associated with home discharge. The Management of Acute Pulmonary Embolism (MAPLE) study is a retrospective cohort study of patients with acute pulmonary embolism undertaken in 21 community EDs from January 2013 to April 2015. We gathered demographic and clinical variables from comprehensive electronic health records and structured manual chart review. We used multivariable logistic regression to assess the association between patient characteristics and home discharge. We report ED length of stay, consultations, 5-day pulmonary embolism-related return visits and 30-day major hemorrhage, recurrent venous thromboembolism, and all-cause mortality. Of 2,387 patients, 179 were discharged home (7.5%). Home discharge varied significantly between EDs, from 0% to 14.3% (median 7.0%; interquartile range 4.2% to 10.9%). Median length of stay for home discharge patients (excluding those who arrived with a new pulmonary embolism diagnosis) was 6.0 hours (interquartile range 4.6 to 7.2 hours) and 81% received consultations. On adjusted analysis, ambulance arrival, abnormal vital signs, syncope or presyncope, deep venous thrombosis, elevated cardiac biomarker levels, and more proximal emboli were inversely associated with home discharge. Thirteen patients (7.2%) who were discharged home had a 5-day pulmonary embolism-related return visit. Thirty-day major hemorrhage and recurrent venous thromboembolism were uncommon and similar between patients hospitalized and those discharged home. All-cause 30-day mortality was lower in the home discharge group (1.1% versus 4.4%). Home discharge of ED patients with acute pulmonary embolism was uncommon and varied significantly between facilities. Patients selected for outpatient management had a low incidence of adverse outcomes. Copyright © 2017 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
Guillemette, Laetitia; Lacroix, Marilyn; Battista, Marie-Claude; Doyon, Myriam; Moreau, Julie; Ménard, Julie; Ardilouze, Jean-Luc; Perron, Patrice; Hivert, Marie-France
2014-05-01
TNFα is suspected to play a role in inflammation and insulin resistance leading to higher risk of metabolic impairment. Controversies exist concerning the role of TNFα in gestational insulin resistance. We investigated the interrelations between TNFα and insulin resistance in a large population-based cohort of pregnant women. Women (n = 756) were followed prospectively at 5-16 weeks and 24-28 weeks of pregnancy. Anthropometric measures and blood samples were collected at both visits. A 75-g oral glucose tolerance test (OGTT) was conducted at the second trimester to assess insulin sensitivity status (homeostasis model of assessment of insulin resistance and Matsuda index). TNFα was measured at the first trimester (nonfasting) and at each time point of the OGTT. Participants were 28.4 ± 4.4 years old and had a mean body mass index of 25.5 ± 5.5 kg/m(2) at first trimester. Median TNFα levels were 1.56 (interquartile range, 1.18-2.06) pg/mL at first trimester and 1.61 (interquartile range, 1.12-2.13) pg/mL at second trimester (1 h after glucose load). At second trimester, higher TNFα levels were associated with higher insulin resistance index levels (r = 0.37 and -0.30 for homeostasis model of assessment of insulin resistance and Matsuda index, respectively; P < .0001), even after adjustment for age, body mass index, triglycerides, and adiponectin. Women with higher insulin resistance showed a continuing decrease in TNFα levels during the OGTT, whereas women who were more insulin sensitive showed an increase in TNFα at hour 1 and a decrease at hour 2 of the test. Higher insulin resistance is associated with higher levels of circulating TNFα at first and second trimesters of pregnancy. TNFα level dynamics during an OGTT at second trimester vary according to insulin-resistance state.
Vision, Training Hours, and Road Testing Results in Bioptic Drivers
Dougherty, Bradley E.; Flom, Roanne E.; Bullimore, Mark A.; Raasch, Thomas W.
2015-01-01
Purpose Bioptic telescopic spectacles (BTS) can be used by people with central visual acuity that does not meet the state standards to obtain an unrestricted driver’s license. The purpose of this study was to examine the relationships among visual and demographic factors, training hours, and the results of road testing for bioptic drivers. Methods A retrospective study of patients who received an initial daylight bioptic examination at the Ohio State University and subsequently received a bioptic license was conducted. Data were collected on vision, including visual acuity, contrast sensitivity, and visual field. Hours of driver training and results of Highway Patrol road testing were extracted from records. Relationships among vision, training hours, and road testing were analyzed. Results Ninety-seven patients who completed a vision examination between 2004 and 2008 and received daylight licensure with BTS were included. Results of the first Highway Patrol road test were available for 74 patients. The median (interquartile range [IQR]) hours of training prior to road testing was 21 (17) hours (range, 9 to 75 hours). Candidates without previous licensure were younger (p < 0.001) and had more documented training (p < 0.001). Lack of previous licensure and more training were significantly associated with having failed a portion of the Highway Patrol test and points deducted on the road test. Conclusions New bioptic drivers without previous non-bioptic driving experience required more training and performed more poorly on road testing for licensure than those who had previous non-bioptic licensure. No visual factor was predictive of road testing results after adjustment for previous experience. The hours of training received remained predictive of road testing outcome even with adjustment for previous experience. These results suggest that previous experience and trainer assessments should be investigated as potential predictors of road safety in bioptic drivers in future studies. PMID:25946098
Mobile pediatric neurosurgery: rapid response neurosurgery for remote or urgent pediatric patients.
Owler, Brian K; Carmo, Kathryn A Browning; Bladwell, Wendy; Fa'asalele, T Arieta; Roxburgh, Jane; Kendrick, Tina; Berry, Andrew
2015-09-01
Time-critical neurosurgical conditions require urgent operative treatment to prevent death or neurological deficits. In New South Wales/Australian Capital Territory, patients' distance from neurosurgical care is often great, presenting a challenge in achieving timely care for patients with acute neurosurgical conditions. A protocol was developed to facilitate consultant neurosurgery locally. Children with acute, time-critical neurosurgical emergencies underwent operations in hospitals that do not normally offer neurosurgery. The authors describe the developed protocol, the outcome of its use, and the lessons learned in the 9 initial cases where the protocol has been used. Three cases are discussed in detail. Nine children were treated by a neurosurgeon at 5 rural hospitals, and 2 children were treated at a smaller metropolitan hospital. Road ambulance, fixed wing aircraft, and medical helicopters were used to transport the Newborn and Paediatric Emergency Transport Service (NETS) team, neurosurgeon, and patients. In each case, the time to definitive neurosurgical intervention was significantly reduced. The median interval from triage at the initial hospital to surgical start time was 3:55 hours (interquartile range [IQR] 03:29-05:20 hours). The median distance traveled to reach a patient was 232 km (range 23-637 km). The median interval from the initial NETS call requesting patient retrieval to surgical start time was 3:15 hours (IQR 00:47-03:37 hours). The estimated median "time saved" was approximately 3:00 hours (IQR 1:44-3:15 hours) compared with the travel time to retrieve the child to the tertiary center: 8:31 hours (IQR 6:56-10:08 hours). Remote urgent neurosurgical interventions can be performed safely and effectively. This practice is relevant to countries where distance limits urgent access for patients to tertiary pediatric care. This practice is lifesaving for some children with head injuries and other acute neurosurgical conditions.
Engdal, Monika; Foss, Olav A; Taraldsen, Kristin; Husby, Vigdis S; Winther, Siri B
2017-07-01
Muscle weakness due to trauma from the surgical approach is anticipated to affect the ability of the patient to undertake daily physical activity early after total hip arthroplasty (THA). The objective of this study was to compare daily physical activity on days 1 to 4 after discharge, in patients following THA performed by 1 of 3 surgical approaches. A cohort study included 60 hip osteoarthritis patients, scheduled for THA, allocated to direct lateral approach, posterior approach, or anterior approach. Daily physical activity was measured by an accelerometer, with upright time per 24 hours as primary outcome and walking time, number of steps, and number of upright events per 24 hours as secondary outcomes. There were no statistically significant group differences in any of the measures of daily physical activity (P > 0.290) or between days of follow-up (P > 0.155). Overall, the median participant had 3.50 hours (interquartile range, 2.85-4.81 hours) of upright time, and participants showed wide variation in all outcomes of daily physical activity. There were no differences in daily physical activity between THA patients undergoing different surgical approaches. The surgical approach may not be a limiting factor for daily physical activity early after surgery in a fast-track treatment course.
Evidence for activation of nuclear factor kappaB in obstructive sleep apnea.
Yamauchi, Motoo; Tamaki, Shinji; Tomoda, Koichi; Yoshikawa, Masanori; Fukuoka, Atsuhiko; Makinodan, Kiyoshi; Koyama, Noriko; Suzuki, Takahiro; Kimura, Hiroshi
2006-12-01
Obstructive sleep apnea (OSA) is a risk factor for atherosclerosis, and atherosclerosis evolves from activation of the inflammatory cascade. We propose that activation of nuclear factor kappaB (NF-kappaB), a key transcription factor in the inflammatory cascade, occurs in OSA. Nine age-matched, nonsmoking, and non-hypertensive men with OSA symptoms and seven similar healthy subjects were recruited for standard polysomnography followed by the collection of blood samples for monocyte nuclear p65 concentrations (OSA and healthy groups). In the OSA group, p65 and monocyte production of tumor necrosis factor alpha (TNF-alpha) were measured at the same time and again after the next night of continuous positive airway pressure (CPAP). Concentrations of p65 in the OSA group were significantly higher than in the control group [median, 0.037 ng/microl (interquartile range, 0.034 to 0.051) vs 0.019 ng/microl (interquartile range, 0.013 to 0.032); p = 0.008], and in the OSA group were significantly correlated with the apnea-hypopnea index and time spent below an oxygen saturation of 90% (r = 0.77 and 0.88, respectively) after adjustment for age and BMI. One night of CPAP resulted in a reduction in p65 [to 0.020 ng/microl (interquartile range, 0.010 to 0.036), p = 0.04] and in levels of TNF-alpha production in cultured monocytes [from 16.26 ng/ml (interquartile range, 7.75 to 24.85) to 7.59 ng/ml (interquartile range, 5.19 to 12.95), p = 0.01]. NF-kappaB activation occurs with sleep-disordered breathing. Such activation of NF-kappaB may contribute to the pathogenesis of atherosclerosis in OSA patients.
Sarkar, Siddhartha Sean; Bhagat, Indira; Bhatt-Mehta, Varsha; Sarkar, Subrata
2015-03-01
We hypothesized that maternal intrapartum antibiotic treatment delays the growth of the organism in the blood culture obtained during the work-up for infants with suspected early-onset sepsis (EOS). Single-center, retrospective review of infants with blood culture-proven EOS over a 13.5-year period. EOS was defined by isolation of a pathogen from a blood culture obtained within 72 hours of birth and antibiotic treatment for ≥ 5 days. Among 81 infants with positive blood cultures, 38 were deemed to have EOS and 43 were deemed contaminants. The organisms grown were as follows: Escherichia coli in 17 infants, Group B streptococcus in 10 infants, and others in 11 infants. Overall, 17 infants with EOS did not receive intrapartum antibiotics and had blood cultures drawn for being symptomatic after birth. The other 21 infants, who received intrapartum antibiotics, had blood cultures drawn primarily for maternal chorioamnionitis. The median (interquartile range [IQR]) incubation time to blood culture positivity was not different in infants who received intrapartum antibiotics compared with infants who did not (19.6 hours, IQR 16-28 hours vs. 19.5 hours, IQR 17.2-21.6 hours; p = 0.7489). Maternal intrapartum antibiotic treatment did not delay the time to blood culture positivity in infants with EOS.
Palkovits, Stefan; Seidel, Gerald; Pertl, Laura; Malle, Eva M; Hausberger, Silke; Makk, Johanna; Singer, Christoph; Osterholt, Julia; Herzog, Sereina A; Haas, Anton; Weger, Martin
2017-12-01
To evaluate the effect of intravitreal bevacizumab on the macular choroidal volume and the subfoveal choroidal thickness in treatment-naïve eyes with exudative age-related macular degeneration. The macular choroidal volume and the subfoveal choroidal thickness were measured using enhanced depth imaging optical coherence tomography. After a screening examination, each patient received 3 monthly intravitreal injections of 1.25 mg bevacizumab. A final assessment took place one month after the third injection. Forty-seven patients with a mean age of 80 ± 6.4 years were included. The macular choroidal volume decreased significantly from a median of 4.1 mm³ (interquartile range 3.4-5.9) to a median of 3.9 mm³ (interquartile range 3.1-5.6) between the baseline and final examination (difference -0.46 mm³, 95% confidence interval: -0.57 to -0.35, P < 0.001). Similarly, subfoveal choroidal thickness decreased from 157.0 μm (interquartile range 116.0-244.5) at baseline to 139.0 μm (interquartile range 102.5-212.0) at the final examination (P < 0.001). Neither the macular choroidal volume at baseline nor the subfoveal choroidal thickness at baseline was associated with the response to treatment. The macular choroidal volume and the subfoveal choroidal thickness decreased significantly after 3 monthly bevacizumab injections for exudative age-related macular degeneration.
Dingley, John; Liu, Xun; Gill, Hannah; Smit, Elisa; Sabir, Hemmen; Tooley, James; Chakkarapani, Ela; Windsor, David; Thoresen, Marianne
2015-06-01
Therapeutic hypothermia is the standard of care after perinatal asphyxia. Preclinical studies show 50% xenon improves outcome if started early. During a 32-patient study randomized between hypothermia only and hypothermia with xenon, 5 neonates were given xenon during retrieval using a closed-circuit, incubator-mounted system. Without xenon availability during retrieval, 50% of eligible infants exceeded the 5-hour treatment window; with the transportable system, 100% were recruited. Xenon delivery lasted 55 to 120 minutes, using a median of 174 mL/h of xenon (interquartile range, 117.5-193.2) after circuit priming (1,300 mL). Xenon delivery during ambulance retrieval was feasible, reduced starting delays, and used very little gas.
The Timing of Early Antibiotics and Hospital Mortality in Sepsis.
Liu, Vincent X; Fielding-Singh, Vikram; Greene, John D; Baker, Jennifer M; Iwashyna, Theodore J; Bhattacharya, Jay; Escobar, Gabriel J
2017-10-01
Prior sepsis studies evaluating antibiotic timing have shown mixed results. To evaluate the association between antibiotic timing and mortality among patients with sepsis receiving antibiotics within 6 hours of emergency department registration. Retrospective study of 35,000 randomly selected inpatients with sepsis treated at 21 emergency departments between 2010 and 2013 in Northern California. The primary exposure was antibiotics given within 6 hours of emergency department registration. The primary outcome was adjusted in-hospital mortality. We used detailed physiologic data to quantify severity of illness within 1 hour of registration and logistic regression to estimate the odds of hospital mortality based on antibiotic timing and patient factors. The median time to antibiotic administration was 2.1 hours (interquartile range, 1.4-3.1 h). The adjusted odds ratio for hospital mortality was 1.09 (95% confidence interval [CI], 1.05-1.13) for each elapsed hour between registration and antibiotic administration. The increase in absolute mortality associated with an hour's delay in antibiotic administration was 0.3% (95% CI, 0.01-0.6%; P = 0.04) for sepsis, 0.4% (95% CI, 0.1-0.8%; P = 0.02) for severe sepsis, and 1.8% (95% CI, 0.8-3.0%; P = 0.001) for shock. In a large, contemporary, and multicenter sample of patients with sepsis in the emergency department, hourly delays in antibiotic administration were associated with increased odds of hospital mortality, even among patients who received antibiotics within 6 hours. The odds increased within each sepsis severity stratum, and the increased odds of mortality were greatest in septic shock.
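To see how a constant per-hour odds ratio maps onto the absolute mortality differences quoted above, one can scale a baseline odds by OR^hours and convert back to a risk. The baseline risk below is an assumption chosen for illustration, not a figure from the study:

```python
import math

def risk_after_delay(baseline_risk, or_per_hour, hours):
    """Mortality risk implied by scaling the baseline odds by OR^hours,
    as in a logistic model with a linear per-hour delay term."""
    odds = baseline_risk / (1.0 - baseline_risk) * or_per_hour ** hours
    return odds / (1.0 + odds)

BASE = 0.05          # assumed baseline mortality -- illustrative, not from the study
OR_PER_HOUR = 1.09   # adjusted OR per elapsed hour (reported above)

for h in (1, 2, 3):
    r = risk_after_delay(BASE, OR_PER_HOUR, h)
    print(f"{h} h delay: mortality {r:.2%} (absolute increase {r - BASE:+.2%})")
```

Note that the implied absolute increase per hour depends on the baseline risk, which is consistent with the abstract's larger absolute effects in severe sepsis and shock than in sepsis overall.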
Asdaghi, Negar; Wang, Kefeng; Ciliberti-Vargas, Maria A; Gutierrez, Carolina Marinovic; Koch, Sebastian; Gardener, Hannah; Dong, Chuanhui; Rose, David Z; Garcia, Enid J; Burgin, W Scott; Zevallos, Juan Carlos; Rundek, Tatjana; Sacco, Ralph L; Romano, Jose G
2018-03-01
Mild stroke is the most common reason for thrombolysis exclusion in patients acutely presenting to the hospital. Thrombolysis administration in this subgroup is highly variable among different clinicians and institutions. We aim to study the predictors of thrombolysis in patients with mild ischemic stroke in the FL-PR CReSD registry (Florida-Puerto Rico Collaboration to Reduce Stroke Disparities). Among 73,712 prospectively enrolled patients with a final diagnosis of ischemic stroke or TIA from January 2010 to April 2015, we identified 7,746 cases with persistent neurological symptoms and National Institutes of Health Stroke Scale ≤5 who arrived within 4 hours of symptom onset. Multilevel logistic regression analysis with generalized estimating equations was used to identify independent predictors of thrombolytic administration in the subgroup of patients without contraindications to thrombolysis. We included 6,826 cases (final diagnosis mild stroke, 74.6%, and TIA, 25.4%). Median age was 72 (interquartile range, 21); 52.7% were men, 70.3% white, 12.9% black, and 16.8% Hispanic; median National Institutes of Health Stroke Scale was 2 (interquartile range, 3). Patients who received thrombolysis (n=1,281, 18.7%) were younger (68 versus 72 years), had fewer vascular risk factors (hypertension, diabetes mellitus, and dyslipidemia), had a lower risk of prior vascular disease (myocardial infarction, peripheral vascular disease, and previous stroke), and had a higher presenting median National Institutes of Health Stroke Scale (4 versus 2). In the multilevel multivariable model, early hospital arrival (arrival within 0-2 hours versus ≥3.5 hours; odds ratio [OR], 8.16; 95% confidence interval [CI], 4.76-13.98), higher National Institutes of Health Stroke Scale (OR, 1.87; 95% CI, 1.77-1.98), aphasia at presentation (OR, 1.35; 95% CI, 1.12-1.62), faster door-to-computed tomography time (OR, 1.81; 95% CI, 1.53-2.15), and presenting to an academic hospital (OR, 2.02; 95% CI, 1.39-2.95) were independent predictors of thrombolysis administration. Patients with mild acute stroke are more likely to receive thrombolysis if they are young, white, or Hispanic and arrive early to the hospital with a more severe neurological presentation. Identification of predictors of thrombolysis is important in the design of future studies to assess the use of thrombolysis for mild stroke. © 2018 American Heart Association, Inc.
Dewey, Marc; Rief, Matthias; Martus, Peter; Kendziora, Benjamin; Feger, Sarah; Dreger, Henryk; Priem, Sascha; Knebel, Fabian; Böhm, Marko; Schlattmann, Peter; Hamm, Bernd; Schönenberger, Eva; Laule, Michael; Zimmermann, Elke
2016-10-24
To evaluate whether invasive coronary angiography or computed tomography (CT) should be performed in patients clinically referred for coronary angiography with an intermediate probability of coronary artery disease. Prospective randomised single centre trial. University hospital in Germany. 340 patients with suspected coronary artery disease and a clinical indication for coronary angiography on the basis of atypical angina or chest pain. 168 patients were randomised to CT and 172 to coronary angiography. After randomisation one patient declined CT and 10 patients declined coronary angiography, leaving 167 patients (88 women) and 162 patients (78 women) for analysis. Allocation could not be blinded, but blinded independent investigators assessed outcomes. The primary outcome measure was major procedural complications within 48 hours of the last procedure related to CT or angiography. Cardiac CT reduced the need for coronary angiography from 100% to 14% (95% confidence interval 9% to 20%, P<0.001) and was associated with a significantly greater diagnostic yield from coronary angiography: 75% (53% to 90%) v 15% (10% to 22%), P<0.001. Major procedural complications were uncommon (0.3%) and similar across groups. Minor procedural complications were less common in the CT group than in the coronary angiography group: 3.6% (1% to 8%) v 10.5% (6% to 16%), P=0.014. CT shortened the median length of stay in the angiography group from 52.9 hours (interquartile range 49.5-76.4 hours) to 30.0 hours (3.5-77.3 hours, P<0.001). Overall median exposure to radiation was similar between the CT and angiography groups: 5.0 mSv (interquartile range 4.2-8.7 mSv) v 6.4 mSv (3.4-10.7 mSv), P=0.45. After a median follow-up of 3.3 years, major adverse cardiovascular events had occurred in seven of 167 patients in the CT group (4.2%) and six of 162 (3.7%) in the coronary angiography group (adjusted hazard ratio 0.90, 95% confidence interval 0.30 to 2.69, P=0.86). 79% of patients stated that they would prefer CT for subsequent testing. The study was conducted at a university hospital in Germany and thus the performance of CT may be different in routine clinical practice. The prevalence was lower than expected, resulting in an underpowered study for the predefined primary outcome. CT increased the diagnostic yield and was a safe gatekeeper for coronary angiography with no increase in long term events. The length of stay was shortened by 22.9 hours with CT, and patients preferred non-invasive testing. Trial registration ClinicalTrials.gov NCT00844220. Published by the BMJ Publishing Group Limited.
Lin, Yucong; Xu, Xijin; Dai, Yifeng; Zhang, Yuling; Li, Weiqiu; Huo, Xia
2016-12-15
Data on vaccination effects in children chronically exposed to heavy metals are extremely scarce. This study aims to investigate the immune responsiveness to measles, mumps, and rubella (MMR) vaccination in children from an e-waste recycling area. 378 healthy children from Guiyu (exposed group) and Haojiang (reference group) were surveyed. Blood lead (Pb) levels were measured by graphite furnace atomic absorption. Titers of antibodies against MMR were quantified by ELISA. Blood Pb levels of children from the exposed group were significantly higher than those from the reference group (5.61 μg/dL vs. 3.57 μg/dL, p<0.001). In contrast, the antibody titers against MMR of the children from the exposed group were significantly lower than those from the reference group. The median titer of the anti-measles antibody of the exposed group was 669.64 mIU/mL, with an interquartile range of 372.88-1068.42 mIU/mL; this was decreased by nearly 40% compared to that of the reference group (median 1046.79 mIU/mL, interquartile range 603.29-1733.10 mIU/mL). For antibody titers against mumps, there was a decrease of about 45% in the exposed group (median 272.24 U/mL, interquartile range 95.19-590.16 U/mL), compared to the reference group (median 491.78 U/mL, interquartile range 183.38-945.96 U/mL). In the case of rubella, the median titer of the antibody was also significantly lower in the exposed group (median 37.08 IU/mL, interquartile range 17.67-66.66 IU/mL) compared to the reference group (median 66.50 IU/mL, interquartile range 25.32-105.59 IU/mL); the decrease in this case was nearly 44%. The proportion of children whose antibody titers against MMR were below the protective level was higher in the exposed group than in the reference group. The present study demonstrates that the immune responsiveness to routine vaccination was suppressed in children chronically exposed to lead. Thus, the vaccination strategies for these children living in an e-waste recycling area should be modified. Copyright © 2016. Published by Elsevier B.V.
Litton, Edward; Elliott, Rosalind; Thompson, Kelly; Watts, Nicola; Seppelt, Ian; Webb, Steven A R
2017-06-01
To use clinically accessible tools to determine unit-level and individual patient factors associated with sound levels and sleep disruption in a range of representative ICUs. A cross-sectional, observational study. Australian and New Zealand ICUs. All patients 16 years or over occupying an ICU bed on one of two Point Prevalence study days in 2015. Ambient sound was measured for 1 minute using an application downloaded to a personal mobile device. Bedside nurses also recorded the total time awake and the number of awakenings for each patient overnight. The study included 539 participants from 39 ICUs. Maximum and mean sound levels were 78 dB (SD, 9) and 62 dB (SD, 8), respectively. Maximum sound levels were higher in ICUs with a sleep policy or protocol than in those without: 81 dB (95% CI, 79-83) versus 77 dB (95% CI, 77-78); mean difference, 4 dB (95% CI, 0-2); p < 0.001. There was no significant difference in sound levels regardless of single room occupancy, mechanical ventilation status, or illness severity. Clinical nursing staff in all 39 ICUs were able to record sleep assessments in 15-minute intervals. The median time awake and number of prolonged disruptions were 3 hours (interquartile range, 1-4) and three (interquartile range, 2-5), respectively. Across a large number of ICUs, patients were exposed to high sound levels and substantial sleep disruption irrespective of factors including previous implementation of a sleep policy. Sound and sleep measurement using simple and accessible tools can facilitate future studies and could feasibly be implemented into clinical practice.
Zhang, Lihua; Desai, Nihar R; Li, Jing; Hu, Shuang; Wang, Qing; Li, Xi; Masoudi, Frederick A; Spertus, John A; Nuti, Sudhakar V; Wang, Sisi; Krumholz, Harlan M; Jiang, Lixin
2015-01-01
Background Early clopidogrel administration to patients with acute myocardial infarction (AMI) has been demonstrated to improve outcomes in a large Chinese trial. However, patterns of use of clopidogrel for patients with AMI in China are unknown. Methods and Results From a nationally representative sample of AMI patients from 2006 and 2011, we identified 11 944 patients eligible for clopidogrel therapy and measured early clopidogrel use, defined as initiation within 24 hours of hospital admission. Among the patients eligible for clopidogrel, the weighted rate of early clopidogrel therapy increased from 45.7% in 2006 to 79.8% in 2011 (P<0.001). In 2006 and 2011, there was significant variation in early clopidogrel use by region, ranging from 1.5% to 58.0% in 2006 (P<0.001) and 48.7% to 87.7% in 2011 (P<0.001). While early use of clopidogrel was uniformly high in urban hospitals in 2011 (median 89.3%; interquartile range: 80.1% to 94.5%), there was marked heterogeneity among rural hospitals (median 50.0%; interquartile range: 11.5% to 84.4%). Patients without reperfusion therapy and those admitted to rural hospitals were less likely to be treated with clopidogrel. Conclusions Although the use of early clopidogrel therapy in patients with AMI has increased substantially in China, there is notably wide variation across hospitals, with much less adoption in rural hospitals. Quality improvement initiatives are needed to increase consistency of early clopidogrel use for patients with AMI. Clinical Trial Registration URL: https://www.clinicaltrials.gov/. Unique identifier: NCT01624883. PMID:26163041
Grignard, Lynn; Gonçalves, Bronner P; Early, Angela M; Daniels, Rachel F; Tiono, Alfred B; Guelbéogo, Wamdaogo M; Ouédraogo, Alphonse; van Veen, Elke M; Lanke, Kjerstin; Diarra, Amidou; Nebie, Issa; Sirima, Sodiomon B; Targett, Geoff A; Volkman, Sarah K; Neafsey, Daniel E; Wirth, Dyann F; Bousema, Teun; Drakeley, Chris
2018-05-05
Plasmodium falciparum malaria infections often comprise multiple distinct parasite clones. Few datasets have directly assessed infection complexity in humans and mosquitoes they infect. Examining parasites using molecular tools may provide insights into the selective transmissibility of isolates. Using capillary electrophoresis genotyping and next generation amplicon sequencing, we analysed complexity of parasite infections in human blood and in the midguts of mosquitoes that became infected in membrane feeding experiments using the same blood material in two West African settings. Median numbers of clones in humans and mosquitoes were higher in samples from Burkina Faso (4.5, interquartile range 2-8 for humans; and 2, interquartile range 1-3 for mosquitoes) than in The Gambia (2, interquartile range 1-3 and 1, interquartile range 1-3, for humans and mosquitoes, respectively). Whilst the median number of clones was commonly higher in human blood samples, not all transmitted alleles were detectable in the human peripheral blood. In both study sample sets, additional parasite alleles were identified in mosquitoes compared with the matched human samples (10-88.9% of all clones/feeding assay, n = 73 feeding assays). The results are likely due to preferential amplification of the most abundant clones in peripheral blood but confirm the presence of low density clones that produce transmissible sexual stage parasites. Copyright © 2018. Published by Elsevier Ltd.
Serum Fatty Acid Binding Protein 4 (FABP4) Predicts Pre-eclampsia in Women With Type 1 Diabetes.
Wotherspoon, Amy C; Young, Ian S; McCance, David R; Patterson, Chris C; Maresh, Michael J A; Pearson, Donald W M; Walker, James D; Holmes, Valerie A
2016-10-01
To examine the association between fatty acid binding protein 4 (FABP4) and pre-eclampsia risk in women with type 1 diabetes. Serum FABP4 was measured in 710 women from the Diabetes and Pre-eclampsia Intervention Trial (DAPIT) in early pregnancy and in the second trimester (median 14 and 26 weeks' gestation, respectively). FABP4 was significantly elevated in early pregnancy (geometric mean 15.8 ng/mL [interquartile range 11.6-21.4] vs. 12.7 ng/mL [interquartile range 9.6-17]; P < 0.001) and the second trimester (18.8 ng/mL [interquartile range 13.6-25.8] vs. 14.6 ng/mL [interquartile range 10.8-19.7]; P < 0.001) in women in whom pre-eclampsia later developed. Elevated second-trimester FABP4 level was independently associated with pre-eclampsia (odds ratio 2.87 [95% CI 1.24-6.68], P = 0.03). The addition of FABP4 to established risk factors significantly improved net reclassification improvement at both time points and integrated discrimination improvement in the second trimester. Increased second-trimester FABP4 independently predicted pre-eclampsia and significantly improved reclassification and discrimination. FABP4 shows potential as a novel biomarker for pre-eclampsia prediction in women with type 1 diabetes. © 2016 by the American Diabetes Association.
Colon Transit Time Test in Korean Children with Chronic Functional Constipation
Yoo, Ha Yeong; Kim, Mock Ryeon; Park, Hye Won; Son, Jae Sung
2016-01-01
Purpose Each ethnic group has a unique lifestyle, including diet. Lifestyle affects bowel movement. The aim of this study is to describe the results of colon transit time (CTT) tests in Korean children with chronic functional constipation based on highly refined data. Methods One hundred ninety (86 males) of 415 children who underwent a CTT test under the diagnosis of chronic constipation according to Rome III criteria at Konkuk University Medical Center from January 2006 through March 2015 were enrolled in this study. Two hundred twenty-five children were excluded on the basis of the CTT test result, defecation diary, and clinical setting. The Shapiro-Wilk, Mann-Whitney U, and chi-square tests were used for statistical analysis. Results The median value and interquartile range (IQR) of CTT were 54 (37.5) hours in the encopresis group and 40.2 (27.9) hours in the non-encopresis group (p<0.001). The difference in subtype distribution between the non-encopresis and encopresis groups was statistically significant (p=0.002). The non-encopresis group (n=154, 81.1%) was divided into a normal transit subgroup (n=84, 54.5%; median [IQR] CTT=26.4 [9.6] hours), an outlet obstruction subgroup (n=18, 11.7%; 62.4 [15.6] hours), and a slow transit subgroup (n=52, 33.8%; 54.6 [21.0] hours). The encopresis group (n=36, 18.9%) was divided into a normal transit subgroup (n=8, 22.2%; median [IQR] CTT=32.4 [9.9] hours), an outlet obstruction subgroup (n=8, 22.2%; 67.8 [34.8] hours), and a slow transit subgroup (n=20, 55.6%; 59.4 [62.7] hours). Conclusion This study provides the basic pattern and values of the CTT test in Korean children with chronic constipation. PMID:27064388
Iglesias, Verónica; Erazo, Marcia; Droppelmann, Andrea; Steenland, Kyle; Aceituno, Paulina; Orellana, Cecilia; Acuña, Marisol; Peruga, Armando; Breysse, Patrick N.; Navas-Acien, Ana
2015-01-01
Objective To evaluate the relative contribution of occupational vs. non-occupational secondhand tobacco smoke exposure to overall hair nicotine concentrations in non-smoking bar and restaurant employees. Method We recruited 76 non-smoking employees from venues that allowed smoking (n = 9), had mixed policies (smoking and non-smoking areas, n = 13), or were smoke-free (n = 2) between April and August 2008 in Santiago, Chile. Employees used personal air nicotine samplers during working and non-working hours for a 24-h period to assess occupational vs. non-occupational secondhand tobacco smoke exposure, and hair nicotine concentrations to assess overall secondhand tobacco smoke exposure. Results The median hair nicotine concentration was 1.5 ng/mg, interquartile range (IQR) 0.7 to 5.2 ng/mg. Time-weighted average personal air nicotine concentrations were higher during working hours (median 9.7, IQR 3.3-25.4 μg/m³) compared to non-working hours (1.7, 1.0-3.1 μg/m³). Hair nicotine concentration was best predicted by personal air nicotine concentration during working hours. After adjustment, a 2-fold increase in personal air nicotine concentration during working hours was associated with a 42% increase in hair nicotine concentration (95% confidence interval 14-70%). Hair nicotine concentration was not associated with personal air nicotine concentration during non-working hours (non-occupational exposure). Conclusions Personal air nicotine concentration during working hours was the major determinant of hair nicotine concentrations in non-smoking employees from Santiago, Chile. Secondhand tobacco smoke exposure during working hours is a health hazard for hospitality employees working in venues where smoking is allowed. PMID:24813578
Efficacy of Liposomal Bupivacaine Infiltration on the Management of Total Knee Arthroplasty.
Sakamoto, Bryan; Keiser, Shelly; Meldrum, Russell; Harker, Gene; Freese, Andrew
2017-01-01
Liposomal bupivacaine is a novel extended-duration anesthetic that has recently been used for local infiltration in total knee arthroplasty (TKA). Although liposomal bupivacaine is widely used, it is unknown if the benefits justify the cost in the veteran population at our institution. To evaluate a change in practice: the effect of local infiltration of liposomal bupivacaine on perioperative outcomes in patients undergoing primary TKA. A retrospective cohort study was conducted among patients who underwent primary TKA at a Veterans Affairs Medical Center before (March 3, 2013-March 2, 2014) and after (March 3, 2014-March 2, 2015) the implementation of liposomal bupivacaine for local infiltration in TKA. Drug utilization evaluation of liposomal bupivacaine for local infiltration in TKA. Use of opioids after discharge from the postanesthesia care unit. Among 199 patients, those who received liposomal bupivacaine after primary TKA (mean [SD] age, 65.3 [6.9] years; 93 males and 5 females) had a reduced median opioid use in the first 24 hours after surgery compared with those who did not receive liposomal bupivacaine (mean [SD] age, 64.9 [8.4] years; 95 males and 6 females) (intravenous morphine equivalents, 12.50 vs 22.50 mg; P = .001). The use of patient-controlled analgesia was also reduced among patients who received liposomal bupivacaine vs those who did not (49 vs 91; P < .001). A reduction in the use of antiemetics was observed in the first 24 hours after surgery (13 vs 34; P = .001) and in the postanesthesia care unit among those who received liposomal bupivacaine vs those who did not (4 vs 20; P = .001). The number of patients in the postanesthesia care unit with no pain was improved among those who received liposomal bupivacaine vs those who did not (44 vs 19; P < .001). Although median (interquartile range) pain scores in the postanesthesia care unit were improved among patients who received liposomal bupivacaine vs those who did not (4.0 [0.0-6.6] vs 5.5 [3.0-7.5]; P = .001), patients who received liposomal bupivacaine had greater median (interquartile range) pain scores 48 hours (5.5 [4.0-7.0] vs 5.0 [3.0-6.0]; P = .01), 72 hours (5.0 [4.0-6.0] vs 4.0 [2.0-6.0]; P = .002), and 96 hours (5.0 [3.0-6.5] vs 4.0 [1.0-5.0]; P = .003) after surgery than those who did not receive liposomal bupivacaine. There was no difference in the median length of stay between the 2 groups. Institutional cost savings was estimated at $27 000 per year. Local infiltration of liposomal bupivacaine reduces use of opioids in the first 24 hours after primary TKA. Similarly, reduction in antiemetic use and improved postoperative pain are also seen in the first 24 hours after surgery but are limited to this time frame. Furthermore, a positive institutional cost savings was observed.
Tominaga, Ryoji; Sekiguchi, Miho; Yonemoto, Koji; Kakuma, Tatsuyuki; Konno, Shin-Ichi
2018-05-01
The Japanese Orthopaedic Association Back Pain Evaluation Questionnaire (JOABPEQ) was developed in 2007 and comprises the five domains of Pain-related disorder, Lumbar spine dysfunction, Gait disturbance, Social life disturbance, and Psychological disorder. It is used by physicians to evaluate treatment efficacy by comparing scores before and after treatment. However, the JOABPEQ does not allow evaluation of the severity of a patient's condition relative to the general population at a single time point. Given the unavailability of a standard measurement of back pain, we sought to establish reference scores and interquartile ranges using data obtained from a multicenter, cross-sectional survey taken in Japanese primary care settings. The Lumbar Spinal Stenosis Diagnosis Support Tool project was conducted from 2011 to 2012 in 1657 hospitals in Japan to investigate the establishment of reference scores using the JOABPEQ. Patients aged ≥ 20 years undergoing medical examinations by either non-orthopaedic primary care physicians or general orthopedists were considered for enrollment. A total of 10,651 consecutive low back pain patients (5331 men, 5320 women, 18 subjects with missing sex data) who had undergone a medical examination were included. Reference scores and interquartile ranges for each of the five domains of the JOABPEQ according to age and sex were recorded. The median score and interquartile range were the same in the domain of Pain-related disorder across all ages and both sexes. The reference scores for Gait disturbance, Social life disturbance, and Psychological disorder declined with increasing age in both age- and sex-stratified groups, while there was a somewhat different trend in Lumbar spine dysfunction between men and women. Reference scores and interquartile ranges for the JOABPEQ were generated from the examination data. These provide a measurement standard to assess patient perceptions of low back pain at any time point during evaluation or therapy. Copyright © 2018 The Japanese Orthopaedic Association. Published by Elsevier B.V. All rights reserved.
Development of a PICU in Nepal: the experience of the first year.
Basnet, Sangita; Shrestha, Shrijana; Ghimire, Amrit; Timila, Dipsal; Gurung, Jeena; Karki, Utkarsha; Adhikari, Neelam; Andoh, Jennifer; Koirala, Janak
2014-09-01
Analysis of hospitalization data can help elucidate the pattern of morbidity and mortality in any given area. Little data exist on critically ill children admitted to hospitals in the resource-limited nation of Nepal. We sought to characterize the profile, management, and mortality of children admitted to one PICU. Retrospective analysis. A newly established PICU in Nepal. All patients between the ages of 0 and 16 years admitted to the PICU from July 2009 to July 2010. None. In 12 months, 126 children were admitted to the PICU, 43% of whom were female. Sixty-three percent were under 5 years old. Twenty-nine percent came from tertiary care hospitals and 38% from rural areas outside Kathmandu. Only 18% were transported by ambulance. Median distance travelled to be admitted was 30 km (interquartile range, 10-193). The highest number of admissions was in spring (40%), followed by summer (25%). Almost half were admitted for shock (45%), particularly septic shock (30%). The second most common reason for admission was neurologic etiologies (15%). Neonatal admissions were also significant (19%). Mortality was 26% and was significantly associated with septic shock (p < 0.01), mechanical ventilation (p < 0.01), and multiple organ dysfunction (p < 0.05). Almost one third of patients required mechanical ventilation; median duration was 4 days (interquartile range, 2-8). Mean length of stay in the hospital was 6.2 days (± 5.3), with a median of 4 days (interquartile range, 2.5-9.0). Median Pediatric Risk of Mortality II score for nonsurvivors was 12 (interquartile range, 7-21), and median Pediatric Index of Mortality II for nonsurvivors was 10 (interquartile range, 3-32). Within a short time of opening, the PICU has been seeing significant numbers of critically ill children. Despite adverse conditions and limited resources, survival of 75% is similar to that in many units in developing nations. Sepsis was the most common reason for PICU admission and mortality.
Maude, Richard J.; Silamut, Kamolrat; Plewes, Katherine; Charunwatthana, Prakaykaew; Ho, May; Abul Faiz, M.; Rahman, Ridwanur; Hossain, Md Amir; Hassan, Mahtab U.; Bin Yunus, Emran; Hoque, Gofranul; Islam, Faridul; Ghose, Aniruddha; Hanson, Josh; Schlatter, Joel; Lacey, Rachel; Eastaugh, Alison; Tarning, Joel; Lee, Sue J.; White, Nicholas J.; Chotivanich, Kesinee; Day, Nicholas P. J.; Dondorp, Arjen M.
2014-01-01
Background. Cytoadherence and sequestration of erythrocytes containing mature stages of Plasmodium falciparum are central to the pathogenesis of severe malaria. The oral anthelminthic drug levamisole inhibits cytoadherence in vitro and reduces sequestration of late-stage parasites in uncomplicated falciparum malaria treated with quinine. Methods. Fifty-six adult patients with severe malaria and high parasitemia admitted to a referral hospital in Bangladesh were randomized to receive a single dose of levamisole hydrochloride (150 mg) or no adjuvant to antimalarial treatment with intravenous artesunate. Results. Circulating late-stage parasites measured as the median area under the parasite clearance curve was 2150 (interquartile range [IQR], 0–28 025) parasites/µL × hour in patients treated with levamisole and 5489 (IQR, 192–25 848) parasites/µL × hour in controls (P = .25). The “sequestration ratios” at 6 and 12 hours for all parasite stages and changes in microvascular blood flow did not differ between treatment groups (all P > .40). The median time to normalization of plasma lactate (<2 mmol/L) was 24 (IQR, 12–30) hours with levamisole vs 28 (IQR, 12–36) hours without levamisole (P = .15). Conclusions. There was no benefit of a single dose of levamisole hydrochloride as adjuvant to intravenous artesunate in the treatment of adults with severe falciparum malaria. Rapid parasite killing by intravenous artesunate might obscure the effects of levamisole. PMID:23943850
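The "area under the parasite clearance curve" summary above is a time integral of serial circulating late-stage parasite counts, reported in parasites/µL × hour. A minimal sketch using the trapezoid rule, with invented sampling times and counts:

```python
import numpy as np

hours = np.array([0.0, 6, 12, 24, 36, 48])          # sampling times after enrolment (h)
late_stage = np.array([120, 310, 240, 90, 20, 0])   # circulating late-stage parasites/uL

auc = np.trapz(late_stage, hours)                   # parasites/uL x hour
print(f"AUC = {auc:.0f} parasites/uL x hour")
```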
Stability of procalcitonin at room temperature.
Milcent, Karen; Poulalhon, Claire; Fellous, Christelle Vauloup; Petit, François; Bouyer, Jean; Gajdos, Vincent
2014-01-01
The aim was to assess procalcitonin (PCT) stability after two days of storage at room temperature. Samples were collected from febrile children aged 7 to 92 days and were rapidly frozen after sampling. PCT levels were measured twice after thawing: immediately (named y) and 48 hours later, after storage at room temperature (named x). PCT values were described with medians and interquartile ranges or by categorizing them into classes with thresholds of 0.25, 0.5, and 2 ng/mL. The relationship between x and y PCT levels was analyzed using fractional polynomials in order to predict the PCT value immediately after thawing (named y') from x. A significant decrease in PCT values was observed after 48 hours of storage at room temperature, both in the median (a 30% decrease; p < 0.001) and as a categorical variable (p < 0.001). The relationship between x and y can be accurately modeled with a simple linear model: y = 1.37 x (R2 = 0.99). The median of the predicted PCT values y' was quantitatively very close to the median of y, and the distributions of y and y' across categories were very similar and not statistically different. PCT levels noticeably decrease after 48 hours of storage at room temperature. It is possible to predict effective PCT values accurately from the values after 48 hours of storage at room temperature with a simple statistical model.
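The reported linear model y = 1.37x translates directly into a one-line correction for samples assayed after 48 hours at room temperature; the sketch below simply applies the published coefficient, with an invented example value.

```python
def predict_fresh_pct(x_after_48h: float) -> float:
    """Predict immediate post-thaw PCT (ng/mL) from a value measured after
    48 h at room temperature, using the published model y = 1.37x."""
    return 1.37 * x_after_48h

# Example: a sample reading 0.40 ng/mL after 48 h at room temperature would
# have been roughly 0.55 ng/mL when fresh -- enough to cross the 0.5 threshold.
print(predict_fresh_pct(0.40))
```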
Stoudenmire, Laura G; Norman, Christy M; Latif, Erin Z
2016-10-01
This study aims to assess the impact of postoperative intravenous (IV) acetaminophen on opioid requirements and pain scores in patients following gynecologic procedures. A retrospective cohort study of patients undergoing gynecologic procedures was conducted to assess the impact of adding scheduled IV acetaminophen to postoperative analgesic regimens. The control group consisted of patients admitted prior to formulary addition of IV acetaminophen; the study group consisted of patients admitted after formulary addition of IV acetaminophen who received scheduled IV acetaminophen for at least the first 24 hours postoperatively. Opioid requirements 0 to 24 hours postoperatively served as the primary end point. Secondary end points included average pain score, cumulative acetaminophen dose, nonopioid analgesic requirements, and rate of adverse events 0 to 24 hours postoperatively. One hundred and thirty-seven patients who underwent a gynecologic procedure from January 2009 to April 2013 were included in this study. Baseline characteristics were similar between the groups. In the first 24 hours postoperatively, there was no difference in opioid requirements between the groups (21 mg [interquartile range, IQR, 15-39.8 mg] vs 32.6 mg [IQR, 16.75-41 mg], P = 0.150). The average pain score and incidence of adverse events did not differ between the 2 groups. Postoperative administration of IV acetaminophen did not provide a significant opioid-sparing effect in patients undergoing gynecologic procedures. © The Author(s) 2015.
Knelson, Lauren P.; Williams, David A.; Gergen, Maria F.; Rutala, William A.; Weber, David J.; Sexton, Daniel J.; Anderson, Deverick J.
2014-01-01
A total of 1,023 environmental surfaces were sampled from 45 rooms with patients infected or colonized with methicillin-resistant Staphylococcus aureus (MRSA) or vancomycin-resistant enterococci (VRE) before terminal room cleaning. Colonized patients had higher median total target colony-forming units (CFU) of MRSA or VRE than did infected patients (median, 25 CFU [interquartile range, 0–106 CFU] vs 0 CFU [interquartile range, 0–29 CFU]; P = .033). PMID:24915217
Peyton, P J; Wu, C; Jacobson, T; Hogg, M; Zia, F; Leslie, K
2017-07-01
Chronic postsurgical pain (CPSP) is a common and debilitating complication of major surgery. We undertook a pilot study at three hospitals to assess the feasibility of a proposed large multicentre placebo-controlled randomised trial of intravenous perioperative ketamine to reduce the incidence of CPSP. Ketamine, 0.5 mg/kg pre-incision, 0.25 mg/kg/hour intraoperatively and 0.1 mg/kg/hour for 24 hours, or placebo, was administered to 80 patients, recruited over a 15-month period, undergoing abdominal or thoracic surgery under general anaesthesia. The primary endpoint was CPSP in the area of the surgery reported at six-month telephone follow-up using a structured questionnaire. Fourteen patients (17.5%) reported CPSP (relative risk [95% confidence interval] if received ketamine 1.18 [0.70 to 1.98], P = 0.56). Four patients in the treatment group and three in the control group reported ongoing analgesic use to treat CPSP, and two patients in each group reported their worst pain in the previous 24 hours at ≥3/10 at six months. There were no significant differences in adverse event rates, quality of recovery scores, or cumulative morphine equivalent consumption in the first 72 hours. Numeric Rating Scale pain scores for average pain in the previous 24 hours among those patients reporting CPSP were a median (interquartile range, IQR) of 17.5/100 [0 to 40], with no difference between treatment groups. A large (n=4,000 to 5,000) adequately powered multicentre trial is feasible using this population and methodology.
Falchook, Aaron D; Tracton, Gregg; Stravers, Lori; Fleming, Mary E; Snavely, Anna C; Noe, Jeanne F; Hayes, David N; Grilley-Olson, Juneko E; Weiss, Jared M; Reeve, Bryce B; Basch, Ethan M; Chera, Bhishamjit S
2016-01-01
Accurate assessment of toxicity allows for timely delivery of supportive measures during radiation therapy for head and neck cancer. The current paradigm requires weekly evaluation of patients by a provider. The purpose of this study is to evaluate the feasibility of monitoring patient-reported symptoms via mobile devices. We developed a mobile application for patients to report symptoms in 5 domains using validated questions. Patients were asked to report symptoms using a mobile device once daily during treatment or more often as needed. Clinicians reviewed patient-reported symptoms during weekly symptom management visits, and patients completed surveys regarding perceptions of the utility of the mobile application. The primary outcome measure was patient compliance with mobile device reporting. Compliance is defined as the number of days with a symptom report divided by the number of days on study. There were 921 symptom reports collected from 22 patients during treatment. Median reporting compliance was 71% (interquartile range, 45%-80%). The median number of reports submitted per patient was 34 (interquartile range, 21-53). The median number of reports submitted per patient per week was similar throughout radiation therapy, and there was significant reporting during nonclinic hours. Patients reported high satisfaction with the use of mobile devices to report symptoms. A substantial percentage of patients used mobile devices to continuously report symptoms throughout a course of radiation therapy for head and neck cancer. Future studies should evaluate the impact of mobile device symptom reporting on improving patient outcomes.
Valkenburg, Abraham J; Calvier, Elisa A M; van Dijk, Monique; Krekels, Elke H J; O'Hare, Brendan P; Casey, William F; Mathôt, Ron A A; Knibbe, Catherijne A J; Tibboel, Dick; Breatnach, Cormac V
2016-10-01
To compare the pharmacodynamics and pharmacokinetics of IV morphine after cardiac surgery in two groups of children-those with and without Down syndrome. Prospective, single-center observational trial. PICU in a university-affiliated pediatric teaching hospital. Twenty-one children with Down syndrome and 17 without, 3-36 months old, scheduled for cardiac surgery with cardiopulmonary bypass. A loading dose of morphine (100 μg/kg) was administered after coming off bypass; thereafter, morphine infusion was commenced at 40 μg/kg/hr. During intensive care, nurses regularly assessed pain and discomfort with validated observational instruments (COMFORT-Behavior scale and Numeric Rating Scale-for pain). These scores guided analgesic and sedative treatment. Plasma samples were obtained for pharmacokinetic analysis. Median COMFORT-Behavior and Numeric Rating Scale scores were not statistically significantly different between the two groups. The median morphine infusion rate during the first 24 hours after surgery was 31.3 μg/kg/hr (interquartile range, 23.4-36.4) in the Down syndrome group versus 31.7 μg/kg/hr (interquartile range, 25.1-36.1) in the control group (p = 1.00). Population pharmacokinetic analysis revealed no statistically significant differences in any of the pharmacokinetic variables of morphine between the children with and without Down syndrome. This prospective trial showed that there are no differences in pharmacokinetics or pharmacodynamics between children with and without Down syndrome if pain and distress management is titrated to effect based on outcomes of validated assessment instruments. We have no evidence to adjust morphine dosing after cardiac surgery in children with Down syndrome.
Beshish, Asaad G; Baginski, Mathew R; Johnson, Thomas J; Deatrick, Barry K; Barbaro, Ryan P; Owens, Gabe E
2018-04-13
The purpose of this study is to describe the functional status of survivors from extracorporeal cardiopulmonary resuscitation instituted during in-hospital cardiac arrest using the Functional Status Scale. We aimed to determine risk factors leading to the development of new morbidity and unfavorable functional outcomes. This was a single-center retrospective chart review abstracting patient characteristics/demographic data, duration of cardiopulmonary resuscitation, duration of extracorporeal membrane oxygenation support, as well as maximum lactate levels within 2 hours before and after extracorporeal cardiopulmonary resuscitation. Cardiac arrest was defined as the administration of chest compressions for a nonperfusing cardiac rhythm. Extracorporeal cardiopulmonary resuscitation was defined by instituting extracorporeal membrane oxygenation during active chest compressions. Functional Status Scale scores were calculated at admission and on hospital discharge for patients who survived. Patients admitted to the pediatric cardiac ICU at C.S. Mott Children's Hospital from January 1, 2005, to December 31, 2015. Children younger than 18 years who underwent extracorporeal cardiopulmonary resuscitation. Not applicable. Of 608 extracorporeal membrane oxygenation events during the study period, 80 (14%) were extracorporeal cardiopulmonary resuscitation. There were 40 female patients (50%). Median age was 40 days (interquartile range, 9-342 d). Survival to hospital discharge was 48% (38/80). Median Functional Status Scale score at admission was 6 (interquartile range, 6-6) and at hospital discharge 9 (interquartile range, 8-11). Of 38 survivors, 19 (50%) had a change in Functional Status Scale score of greater than or equal to 3, consistent with new morbidity, and 26 (68%) had favorable functional outcomes with a change in Functional Status Scale score of less than 5. This is the first extracorporeal cardiopulmonary resuscitation report to examine changes in Functional Status Scale from admission (baseline) to discharge as a measure of overall functional outcome. Half of surviving patients (19/38) had new morbidity, while 68% (26/38) had favorable outcomes. Lactate levels, duration of cardiopulmonary resuscitation, and duration of extracorporeal membrane oxygenation were not found to be risk factors for the development of new morbidity and poor functional outcomes. Functional Status Scale may be used as a metric to monitor improvement of extracorporeal cardiopulmonary resuscitation outcomes and help guide research initiatives to decrease morbidity in this patient population.
Peng, Song; Hu, Liang; Chen, Wenzhi; Chen, Jinyun; Yang, Caiyong; Wang, Xi; Zhang, Rong; Wang, Zhibiao; Zhang, Lian
2015-04-01
To investigate the value of microbubble contrast-enhanced ultrasound (CEUS) in evaluating the treatment response of uterine fibroids to high-intensity focused ultrasound (HIFU) ablation. Sixty-eight patients with a solitary uterine fibroid from the First Affiliated Hospital of Chongqing Medical University were included and analyzed. All patients underwent pre- and post-treatment magnetic resonance imaging (MRI) with a standardized protocol, as well as pre-evaluation, intraprocedure, and immediate post-treatment CEUS. CEUS and MRI were compared by different radiologists. In comparison with MRI, CEUS showed similar fibroid size and volume, size of non-perfused regions, non-perfused volume (NPV), and fractional ablation (NPV ratio). In terms of CEUS examination results, the median volume of fibroids was 75.2 (interquartile range, 34.2-127.3) cm(3), the median non-perfused volume was 54.9 (interquartile range, 28.0-98.1) cm(3), and the mean fractional ablation was 83.7 ± 13.6% (range, 30.0-100.0%). In terms of MRI examination results, the median volume of fibroids was 74.1 (interquartile range, 33.4-116.2) cm(3). On the basis of contrast-enhanced T1-weighted images immediately after HIFU treatment, the median non-perfused volume was 58.5 (interquartile range, 27.7-100.0) cm(3), and the mean fractional ablation was 84.2 ± 14.2% (range, 40.0-100.0%). CEUS clearly showed the size of fibroids and the non-perfused areas of the fibroid. Results from CEUS correlated well with results obtained from MRI. Copyright © 2015 Elsevier B.V. All rights reserved.
Cell-Free circulating DNA: a new biomarker for the acute coronary syndrome.
Cui, Ming; Fan, Mengkang; Jing, Rongrong; Wang, Huimin; Qin, Jingfeng; Sheng, Hongzhuan; Wang, Yueguo; Wu, Xinhua; Zhang, Lurong; Zhu, Jianhua; Ju, Shaoqing
2013-01-01
In recent studies, concentrations of cell-free circulating DNA (cf-DNA) have been correlated with clinical characteristics and prognosis in several diseases. The relationship between cf-DNA concentrations and the acute coronary syndrome (ACS) remains unknown. Moreover, no data are available for the detection of cf-DNA in ACS by a branched DNA (bDNA)-based Alu assay. The aim of the present study was to investigate cf-DNA concentrations in ACS and their relationship with clinical features. Plasma cf-DNA concentrations of 137 ACS patients at diagnosis, of 60 healthy individuals, and of 13 patients with stable angina (SA) were determined using a bDNA-based Alu assay. ACS patients (median 2,285.0, interquartile range 916.4-4,857.3 ng/ml), especially ST-segment elevation myocardial infarction patients (median 5,745.4, interquartile range 4,013.5-8,643.9 ng/ml), showed a significant increase in plasma cf-DNA concentrations compared with controls (healthy controls: median 118.3, interquartile range 81.1-221.1 ng/ml; SA patients: median 202.3, interquartile range 112.7-256.1 ng/ml). Moreover, we found positive correlations between cf-DNA and Gensini scores and GRACE (Global Registry of Acute Coronary Events) scores in ACS. cf-DNA may be a valuable marker for diagnosing and predicting the severity of coronary artery lesions and risk stratification in ACS. Copyright © 2013 S. Karger AG, Basel.
Martin, K; Gertler, R; Sterner, A; MacGuill, M; Schreiber, C; Hörer, J; Vogt, M; Tassani, P; Wiesner, G
2011-08-01
ε-Aminocaproic acid (EACA) and tranexamic acid (TXA) are used for antifibrinolytic therapy in neonates undergoing cardiac surgery, although data directly comparing their blood-sparing efficacy are not yet available. We compared two consecutive cohorts of neonates for the effect of these two medications on perioperative blood loss and allogeneic transfusions. Data from the EACA group (n = 77) were collected over a 12-month period; data from the TXA group (n = 28) were collected over a 5-month period. Blood loss, rate of reoperation due to bleeding, and transfusion requirements were measured. There was no significant difference in blood loss at 6 hours (EACA 24 [17-30] mL/kg [median (interquartile range)] vs. TXA 20 [11-34] mL/kg, P = 0.491), at 12 hours (EACA 31 [22-38] mL/kg vs. TXA 27 [19-43] mL/kg, P = 0.496), or at 24 hours postoperatively (EACA 41 [31-47] mL/kg vs. TXA 39 [27-60] mL/kg; P = 0.625), nor in transfusion of blood products. ε-Aminocaproic acid and tranexamic acid are equally effective with respect to perioperative blood loss and transfusion requirements in newborns undergoing cardiac surgery. © Georg Thieme Verlag KG Stuttgart · New York.
Are EMS call volume predictions based on demand pattern analysis accurate?
Brown, Lawrence H; Lerner, E Brooke; Larmon, Baxter; LeGassick, Todd; Taigman, Michael
2007-01-01
Most EMS systems determine the number of crews they will deploy in their communities and when those crews will be scheduled based on anticipated call volumes. Many systems use historical data to calculate their anticipated call volumes, a method of prediction known as demand pattern analysis. To evaluate the accuracy of call volume predictions calculated using demand pattern analysis. Seven EMS systems provided 73 consecutive weeks of hourly call volume data. The first 20 weeks of data were used to calculate three common demand pattern analysis constructs for call volume prediction: average peak demand (AP), smoothed average peak demand (SAP), and 90th percentile rank (90%R). The 21st week served as a buffer. Actual call volumes in the last 52 weeks were then compared to the predicted call volumes by using descriptive statistics. There were 61,152 hourly observations in the test period. All three constructs accurately predicted peaks and troughs in call volume but not exact call volume. Predictions were accurate (+/-1 call) 13% of the time using AP, 10% using SAP, and 19% using 90%R. Call volumes were overestimated 83% of the time using AP, 86% using SAP, and 74% using 90%R. When call volumes were overestimated, predictions exceeded actual call volume by a median (interquartile range) of 4 (2-6) calls for AP, 4 (2-6) for SAP, and 3 (2-5) for 90%R. Call volumes were underestimated 4% of the time using AP, 4% using SAP, and 7% using 90%R predictions. When call volumes were underestimated, call volumes exceeded predictions by a median (interquartile range; maximum underestimation) of 1 (1-2; 18) call for AP, 1 (1-2; 18) for SAP, and 2 (1-3; 20) for 90%R. Results did not vary between systems. Generally, demand pattern analysis estimated or overestimated call volume, making it a reasonable predictor for ambulance staffing patterns. However, it did underestimate call volume between 4% and 7% of the time. Communities need to determine if these rates of over- and underestimation are acceptable given their resources and local priorities.
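A hedged sketch of the three prediction constructs named above, computed per hour-of-week slot from 20 weeks of hourly counts. The exact operational definitions (how many historical peaks AP averages, the SAP smoothing window) vary between systems and are assumptions here:

```python
import numpy as np

rng = np.random.default_rng(7)
calls = rng.poisson(3.0, size=(20, 168))  # 20 weeks x 168 hour-of-week slots

# Average peak (AP): here taken as the mean of the 4 highest historical counts
# per slot -- the precise definition differs between EMS systems.
ap = np.sort(calls, axis=0)[-4:, :].mean(axis=0)
# Smoothed average peak (SAP): AP smoothed across adjacent hours (3-h window).
sap = np.convolve(ap, np.ones(3) / 3, mode="same")
# 90th percentile rank (90%R): 90th percentile of historical counts per slot.
p90 = np.percentile(calls, 90, axis=0)

slot = 17  # e.g., the 17:00 slot on the first day of the week
print(f"AP={ap[slot]:.1f}, SAP={sap[slot]:.1f}, 90%R={p90[slot]:.1f}")
```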
Siegelaar, Sarah E; Barwari, Temo; Hermanides, Jeroen; van der Voort, Peter H J; Hoekstra, Joost B L; DeVries, J Hans
2013-11-01
Continuous glucose monitoring could be helpful for glucose regulation in critically ill patients; however, its accuracy is uncertain and might be influenced by microcirculation. We investigated the microcirculation and its relation to the accuracy of 2 continuous glucose monitoring devices in patients after cardiac surgery. The present prospective, observational study included 60 patients admitted for cardiac surgery. Two continuous glucose monitoring devices (Guardian Real-Time and FreeStyle Navigator) were placed before surgery. The relative absolute deviation between continuous glucose monitoring and the arterial reference glucose was calculated to assess the accuracy. Microcirculation was measured using the microvascular flow index, perfused vessel density, and proportion of perfused vessels using sublingual sidestream dark-field imaging, and tissue oxygenation using near-infrared spectroscopy. The associations were assessed using a linear mixed-effects model for repeated measures. The median relative absolute deviation of the Navigator was 11% (interquartile range, 8%-16%) and that of the Guardian was 14% (interquartile range, 11%-18%; P = .05). Tissue oxygenation significantly increased during the intensive care unit admission (maximum 91.2% [3.9] after 6 hours) and decreased thereafter, stabilizing after 20 hours. A decrease in perfused vessel density accompanied the increase in tissue oxygenation. Microcirculatory variables were not associated with sensor accuracy. A lower peripheral temperature (Navigator, b = -0.008, P = .003; Guardian, b = -0.006, P = .048) and, for the Navigator, a higher Acute Physiology and Chronic Health Evaluation IV predicted mortality (b = 0.017, P < .001) and age (b = 0.002, P = .037) were associated with decreased sensor accuracy. The results of the present study have shown acceptable accuracy for both sensors in patients after cardiac surgery. The microcirculation was impaired to a limited extent compared with that in patients with sepsis and healthy controls. This impairment was not related to sensor accuracy; however, peripheral temperature (for both sensors) and, for the Navigator, patient age and Acute Physiology and Chronic Health Evaluation IV predicted mortality were. Copyright © 2013 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
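The accuracy metric used above, the relative absolute deviation between a sensor reading and the paired arterial reference glucose, is a simple ratio; a minimal sketch:

```python
def relative_absolute_deviation(sensor: float, reference: float) -> float:
    """|sensor - reference| / reference, usually reported as a percentage."""
    return abs(sensor - reference) / reference

# Example: a CGM reading of 6.7 mmol/L against an arterial reference of 7.5 mmol/L
print(f"{relative_absolute_deviation(6.7, 7.5):.0%}")  # 11%
```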
Koc, Sema; Durna, Zehra; Akin, Semiha
2017-06-14
This cross-sectional study aimed to assess interpretation of symptoms as a cause of delays in patients with acute myocardial infarction (AMI). It was conducted at a university hospital in Istanbul, Turkey. The sample included 93 patients: 73 male, mean age 57.89 (12.13) years. Prehospital delay time ranged from 15 minutes to 10 days, with a median of 2 hours (interquartile range: 9.50). Patients waited for the pain to go away (48.4%) and tried to calm down (39.8%). Most patients attributed AMI-related symptoms to a reason other than heart disease. In a multivariate logistic regression analysis, the type of AMI, classified based on electrocardiography findings (odds ratio 5.18, 95% confidence interval: 1.69-15.91, P=0.004), was independently associated with prehospital delay time, indicating that patients with ST-segment elevation MI would seek medical care earlier. Misinterpretation of symptoms and misconceptions about emergency treatment during AMI cause delays in admission and may affect treatment.
Karaszewski, Bartosz; Thomas, Ralph G R; Dennis, Martin S; Wardlaw, Joanna M
2012-10-18
Pyrexia after stroke (temperature ≥37.5°C) is associated with poor prognosis, but information on the timing of body temperature changes and their relationship to stroke severity and subtype varies. We recruited patients with acute ischemic stroke, measured stroke severity and stroke subtype, and recorded four-hourly tympanic (body) temperature readings from admission to 120 hours after stroke. We sought causes of pyrexia and measured functional outcome at 90 days. We systematically summarised all relevant previous studies. Amongst 44 patients (21 males, mean age 72 years, SD 11) with a median National Institutes of Health Stroke Scale (NIHSS) score of 7 (range 0-28), 14 had total anterior circulation strokes (TACS). On admission all patients, both TACS and non-TACS, were normothermic (median 36.3°C vs 36.5°C, p=0.382, respectively) at a median of 4 hours (interquartile range, IQR, 2-8) after stroke; admission temperature and NIHSS were not associated (r² = 0.0, p=0.353). Peak temperature, occurring at 35.5 (IQR 19.0 to 53.8) hours after stroke, was higher in TACS (37.7°C) than non-TACS (37.1°C, p<0.001) and was associated with admission NIHSS (r² = 0.20, p=0.002). Poor outcome (modified Rankin Scale ≥3) at 90 days was associated with higher admission (36.6°C vs. 36.2°C, p=0.031) and peak (37.4°C vs. 37.0°C, p=0.016) temperatures. Sixteen (36%) patients became pyrexial, in seven (44%) of whom we found no cause other than the stroke. Normothermia is usual within the first 4 hours of stroke. Peak temperature occurs at 1.5 to 2 days after stroke and is related to stroke severity/subtype and more closely associated with poor outcome than admission temperature. Temperature-outcome associations after stroke are complex, but normothermia on admission should not preclude randomisation of patients into trials of therapeutic hypothermia.
Spagnuolo, Vincenzo; Travi, Giovanna; Galli, Laura; Cossarini, Francesca; Guffanti, Monica; Gianotti, Nicola; Salpietro, Stefania; Lazzarin, Adriano; Castagna, Antonella
2013-08-01
The objective of this study was to compare immunologic, virologic, and clinical outcomes between living human immunodeficiency virus (HIV)-infected individuals who had a diagnosis of lymphoma and a control group of cancer-free, HIV-infected patients. In this matched cohort study, patients in the case group were survivors of incident lymphomas that occurred between 1997 and June 2010. Controls were living, cancer-free, HIV-infected patients who were matched to cases at a 4:1 ratio by age, sex, nadir CD4 cell count, and year of HIV diagnosis. The date of lymphoma diagnosis served as the baseline in cases and in the corresponding controls. In total, 62 patients (cases) who had lymphoma (20 with Hodgkin disease [HD] and 42 with non-Hodgkin lymphoma [NHL]) were compared with 211 controls. The overall median follow-up was 4.8 years (interquartile range, 2.0-7.9 years). The CD4 cell count at baseline was 278 cells/mm³ (interquartile range, 122-419 cells/mm³) in cases versus 421 cells/mm³ (interquartile range, 222-574 cells/mm³) in controls (P = .003). At the last available visit, the CD4 cell count was 412 cells/mm³ (interquartile range, 269-694 cells/mm³) in cases versus 518 cells/mm³ (interquartile range, 350-661 cells/mm³) in controls (P = .087). The proportion of patients who achieved virologic success increased from 30% at baseline to 74% at the last available visit in cases (P = .008) and from 51% to 81% in controls (P = .0286). Patients with HD reached higher CD4 cell counts at their last visit than patients with NHL (589 cells/mm³ [interquartile range, 400-841 cells/mm³] vs 332 cells/mm³ [interquartile range, 220-530 cells/mm³], respectively; P = .003). Virologic success was similar between patients with HD and patients with NHL at the last visit. Forty cases (65%) and 76 controls (36%) experienced at least 1 clinical event after baseline (P < .0001); cases were associated with a shorter time to occurrence of the first clinical event compared with controls (P < .0001). HIV-infected lymphoma survivors experienced more clinical events than controls, especially during the first year of follow-up, but they reached similar long-term immunologic and virologic outcomes. © 2013 American Cancer Society.
A multicenter study of plasma use in the United States.
Triulzi, Darrell; Gottschall, Jerome; Murphy, Edward; Wu, Yanyun; Ness, Paul; Kor, Daryl; Roubinian, Nareg; Fleischmann, Debra; Chowdhury, Dhuly; Brambilla, Donald
2015-06-01
Detailed information regarding plasma use in the United States is needed to identify opportunities for practice improvement and design of clinical trials of plasma therapy. Ten US hospitals collected detailed medical information from the electronic health records for 1 year (2010-2011) for all adult patients transfused with plasma. A total of 72,167 units of plasma were transfused in 19,596 doses to 9269 patients. The median dose of plasma was 2 units (interquartile range, 2-4; range 1-72); 15% of doses were 1 unit, and 45% were 2 units. When adjusted by patient body weight (kg), the median dose was 7.3 mL/kg (interquartile range, 5.5-12.0). The median pretransfusion international normalized ratio (INR) was 1.9 (25%-75% interquartile range, 1.6-2.6). A total of 22.5% of plasma transfusions were given to patients with an INR of less than 1.6 and 48.5% for an INR of 2.0 or more. The median posttransfusion INR was 1.6 (interquartile range, 1.4-2.0). Only 42% of plasma transfusions resulted in a posttransfusion INR of less than 1.6. Correction of INR increased as the plasma dose increased from 1 to 4 units (p < 0.001). There was no difference in the INR response to different types of plasma. The most common issue locations were general ward (38%) and intensive care unit (ICU; 42%). This large database describing plasma utilization in the United States provides evidence for both inadequate dosing and unnecessary transfusion. Measures to improve plasma transfusion practice and clinical trials should be directed at patients on medical and surgical wards and in the ICU where plasma is most commonly used. © 2014 AABB.
Utilization and Outcomes of Sentinel Lymph Node Biopsy for Vulvar Cancer.
Cham, Stephanie; Chen, Ling; Burke, William M; Hou, June Y; Tergas, Ana I; Hu, Jim C; Ananth, Cande V; Neugut, Alfred I; Hershman, Dawn L; Wright, Jason D
2016-10-01
To examine the use and predictors of sentinel node biopsy in women with vulvar cancer. The Perspective database, an all-payer database that collects data from more than 500 hospitals, was used to perform a retrospective cohort study of women with vulvar cancer who underwent vulvectomy and lymph node assessment from 2006 to 2015. Multivariable models were used to determine factors associated with sentinel node biopsy. Length of stay and cost were compared between women who underwent sentinel node biopsy and lymphadenectomy. Among 2,273 women, sentinel node biopsy was utilized in 618 (27.2%) and 1,655 (72.8%) underwent inguinofemoral lymphadenectomy. Performance of sentinel node biopsy increased from 17.0% (95% confidence interval [CI] 12.0-22.0%) in 2006 to 39.1% (95% CI 27.1-51.0%) in 2015. In a multivariable model, women treated more recently were more likely to have undergone sentinel node biopsy, whereas women with more comorbidities and those treated at rural hospitals were less likely to have undergone the procedure. The median length of stay was shorter for those undergoing sentinel node biopsy (median 2 days, interquartile range 1-3) compared with women who underwent inguinofemoral lymphadenectomy (median 3 days, interquartile range 2-4). The cost of sentinel node biopsy was $7,599 (interquartile range $5,739-9,922) compared with $8,095 (interquartile range $5,917-11,281) for lymphadenectomy. The use of sentinel node biopsy for vulvar cancer has more than doubled since 2006. Sentinel lymph node biopsy is associated with a shorter hospital stay and decreased cost compared with inguinofemoral lymphadenectomy.
Pommerening, Matthew J; DuBose, Joseph J; Zielinski, Martin D; Phelan, Herb A; Scalea, Thomas M; Inaba, Kenji; Velmahos, George C; Whelan, James F; Wade, Charles E; Holcomb, John B; Cotton, Bryan A
2014-08-01
Failure to achieve primary fascial closure (PFC) after damage control laparotomy is costly and carries great morbidity. We hypothesized that time from the initial laparotomy to the first take-back operation would be predictive of successful PFC. Trauma patients managed with open abdominal techniques after damage control laparotomy were prospectively followed at 14 Level 1 trauma centers during a 2-year period. Time to the first take-back was evaluated as a predictor of PFC using hierarchical multivariate logistic regression analysis. A total of 499 patients underwent damage control laparotomy and were included in this analysis. PFC was achieved in 327 (65.5%) patients. Median time to the first take-back operation was 36 hours (interquartile range 24-48). After we adjusted for patient demographics, resuscitation volumes, and operative characteristics, increasing time to the first take-back was associated with a decreased likelihood of PFC. Specifically, each hour of delay in return to the operating room beyond 24 hours after the initial laparotomy was associated with a 1.1% decrease in the odds of PFC (odds ratio 0.989; 95% confidence interval 0.978-0.999; P = .045). In addition, there was a trend toward increased intra-abdominal complications in patients returning after 48 hours (odds ratio 1.80; 95% confidence interval 1.00-3.25; P = .05). Data from this prospective, multicenter study demonstrate that delays in returning to the operating room after damage control laparotomy are associated with reductions in PFC. These findings suggest that emphasis should be placed on returning to the operating room within 24 hours after the initial laparotomy if possible (and no later than 48 hours). Copyright © 2014 Mosby, Inc. All rights reserved.
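To see what a per-hour odds ratio of 0.989 implies over longer delays, a brief illustrative sketch; the study's actual model was a hierarchical multivariate logistic regression, which is not reproduced here:

```python
# Compounding a per-hour odds ratio over a delay in return to the operating room.
OR_PER_HOUR = 0.989  # reported odds ratio per hour beyond 24 hours

def odds_multiplier(delay_hours: float) -> float:
    """Multiplicative change in the odds of primary fascial closure."""
    return OR_PER_HOUR ** delay_hours

for delay in (12, 24, 48):
    print(f"{delay} h beyond the first 24 h: odds multiplied by {odds_multiplier(delay):.3f}")
# 12 h -> ~0.876, 24 h -> ~0.767, 48 h -> ~0.588
```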
Huber, Carola A; Rosemann, Thomas; Zoller, Marco; Eichler, Klaus; Senn, Oliver
2011-02-01
To investigate the demand for traditional out-of-hours general practitioner (GP) emergency care in Switzerland, including GPs' satisfaction and reasons for encounter (RFEs). During a 2-month period (2009), a questionnaire-based, cross-sectional study was performed among GPs participating in the mandatory out-of-hours service in the city of Zurich, Switzerland. The number and mode of patient contacts were assessed to investigate the demand for GP care in traditional out-of-hours services. GP and patient characteristics, including RFEs according to the International Classification of Primary Care, were noted. Descriptive statistics and non-parametric tests were conducted. Of the 295 out-of-hours duty periods during the study period, 148 (50%) were documented by a total of 93 GPs (75% men) with a mean (SD) age of 48.0 (6.2) years. The median (interquartile range) number of out-of-hours contacts was 5 (3-8), and the demand for home visits was significantly more common compared with practice and telephone consultations. A total of 112 different RFEs accounted for the 382 documented patient contacts, with fever the most common complaint (13.9%). Although 80% of GPs reported being satisfied overall with their profession as primary care providers, 57.6% of them were dissatisfied with the current out-of-hours service. Inappropriate payment and interference with their daily work in practice were the most frequently reported reasons. Our findings indicate that there is still strong patient demand for out-of-hours care, with a special need for home visits, suggesting that new organizational models such as integrating GPs into emergency care may not be an appropriate approach for all patients. Therefore, the ongoing reorganization of the out-of-hours service in many health care systems has to be evaluated carefully in order not to miss important patient needs. © 2010 Blackwell Publishing Ltd.
Somily, Ali Mohammed; Habib, Hanan Ahmed; Torchyan, Armen Albert; Sayyed, Samina B; Absar, Muhammed; Al-Aqeel, Rima; Binkhamis, A Khalifa
2018-01-01
Bloodstream infections are associated with high rates of morbidity and mortality. Rapid detection of bloodstream infections is important in achieving better patient outcomes. To compare the time-to-detection (TTD) of the new BacT/Alert Virtuo and the BACTEC FX automated blood culture systems. Prospective simulated comparison of two instruments using seeded samples. Medical microbiology laboratory. Blood culture bottles were seeded in triplicate with each of the standard ATCC strains of aerobes, anaerobes and yeast. TTD was calculated as the length of time from the beginning of culture incubation to the detection of bacterial growth. TTD for the various tested organisms on the two microbial detection systems. The 99 bottles of seeded blood cultures incubated in each of the blood culture systems included 21 anaerobic, 39 aerobic and 39 pediatric bottles. The BacT/Alert Virtuo system exhibited significantly shorter TTD for 72.7% (24/33) of the tested organisms compared to the BACTEC FX system, with a median difference in mean TTD of 2.1 hours (interquartile range: 1.5-3.5 hours). The BACTEC FX system was faster for 15.2% (5/33) of microorganisms, with a median difference in mean TTD of 25.9 hours (interquartile range: 9.1-29.2 hours). TTD was significantly shorter for most of the microorganisms tested on the new BacT/Alert Virtuo system compared to the BACTEC FX system. Use of simulated cultures to assess TTD may not precisely represent clinical blood cultures. None.
Human milk IgA concentrations during the first year of lactation
Weaver, L.; Arthur, H.; Bunn, J.; Thomas, J.
1998-01-01
AIMS: To measure the concentrations of total IgA in the milk secreted by both breasts, throughout the first year of lactation, in a cohort of Gambian mothers of infants at high risk of infection. SUBJECTS AND METHODS: Sixty-five women and their infants were studied monthly from the 4th to 52nd postpartum week. Samples of milk were obtained from each breast by manual expression immediately before the infant was suckled. Milk intakes were measured by test weighing the infants before and after feeds over 12 hour periods; IgA concentrations were determined by enzyme linked immunosorbent assay. RESULTS: A total of 1590 milk samples were measured. The median (interquartile range) concentration of IgA for all samples was 0.708 (0.422-1.105) g/l; that in milk obtained from the left breast was 0.785 (0.458-1.247) g/l, and that in milk obtained from the right breast was 0.645 (0.388-1.011) g/l (p < 0.0001). There was no significant change in milk or IgA intakes with advancing infant age, but there was a close concordance of IgA concentrations between the two breasts, with "tracking" of the output of the left and right breasts. There was a significant (p < 0.01) negative correlation between maternal age and parity, and the weight of milk ingested by infants. During the dry season (December to May) the median (interquartile range) IgA concentration was significantly higher, at 0.853 (0.571-1.254) g/l, than during the rainy season (June to November), when it was 0.518 (0.311-0.909) g/l (p < 0.0001). CONCLUSIONS: Sustained IgA secretion is likely to protect suckling infants from microbial infection. PMID:9613353
11-Oxygenated C19 Steroids Are the Predominant Androgens in Polycystic Ovary Syndrome.
O'Reilly, Michael W; Kempegowda, Punith; Jenkinson, Carl; Taylor, Angela E; Quanson, Jonathan L; Storbeck, Karl-Heinz; Arlt, Wiebke
2017-03-01
Androgen excess is a defining feature of polycystic ovary syndrome (PCOS), but the exact origin of hyperandrogenemia remains a matter of debate. Recent studies have highlighted the importance of the 11-oxygenated C19 steroid pathway to androgen metabolism in humans. In this study, we analyzed the contribution of 11-oxygenated androgens to androgen excess in women with PCOS. One hundred fourteen women with PCOS and 49 healthy control subjects underwent measurement of serum androgens by liquid chromatography-tandem mass spectrometry. Twenty-four-hour urinary androgen excretion was analyzed by gas chromatography-mass spectrometry. Fasting plasma insulin and glucose were measured for homeostatic model assessment of insulin resistance. Baseline demographic data, including body mass index, were recorded. As expected, serum concentrations of the classic androgens testosterone (P < 0.001), androstenedione (P < 0.001), and dehydroepiandrosterone (P < 0.01) were significantly increased in PCOS. Mirroring this, serum 11-oxygenated androgens 11β-hydroxyandrostenedione, 11-ketoandrostenedione, 11β-hydroxytestosterone, and 11-ketotestosterone were significantly higher in PCOS than in control subjects, as was the urinary 11-oxygenated androgen metabolite 11β-hydroxyandrosterone. The proportionate contribution of 11-oxygenated to total serum androgens was significantly higher in patients with PCOS compared with control subjects [53.0% (interquartile range, 48.7 to 60.3) vs 44.0% (interquartile range, 32.9 to 54.9); P < 0.0001]. Obese (n = 51) and nonobese (n = 63) patients with PCOS had significantly increased 11-oxygenated androgens. Serum 11β-hydroxyandrostenedione and 11-ketoandrostenedione correlated significantly with markers of insulin resistance. We show that 11-oxygenated androgens represent the majority of circulating androgens in women with PCOS, with close correlation to markers of metabolic risk.
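The "proportionate contribution" above is the summed 11-oxygenated androgen concentration divided by the total androgen concentration; a minimal sketch with hypothetical concentrations (placeholders, not patient data):

```python
# Share of 11-oxygenated androgens in total serum androgens.
# All values are hypothetical nmol/L placeholders for illustration.
oxygenated = {"11OHA4": 5.0, "11KA4": 1.2, "11OHT": 0.6, "11KT": 1.1}
classic = {"testosterone": 1.8, "androstenedione": 4.1, "DHEA": 1.0}

total = sum(oxygenated.values()) + sum(classic.values())
share = sum(oxygenated.values()) / total
print(f"11-oxygenated share: {share:.1%}")  # ~53%, mirroring the reported PCOS median
```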
Støving, Kion; Rothe, Christian; Rosenstock, Charlotte V; Aasvang, Eske K; Lundstrøm, Lars H; Lange, Kai H W
2015-01-01
The transversus abdominis plane (TAP) block is a widely used nerve block. However, basic block characteristics are poorly described. The purpose of this study was to assess the cutaneous sensory block area, muscle-relaxing effect, and block duration. Sixteen healthy volunteers were randomized to receive an ultrasound-guided unilateral TAP block with 20 mL 7.5 mg/mL ropivacaine and placebo on the contralateral side. Measurements were performed at baseline and 90 minutes after performing the block. The cutaneous sensory block area was mapped and separated into a medial and a lateral part by a vertical line through the anterior superior iliac spine. We measured muscle thickness of the 3 lateral abdominal muscle layers with ultrasound in the relaxed state and during maximal voluntary muscle contraction. The volunteers reported the duration of the sensory block and the abdominal muscle-relaxing effect. The lateral part of the cutaneous sensory block area was a median of 266 cm2 (interquartile range, 191-310 cm2) and the medial part 76 cm2 (interquartile range, 54-127 cm2). In all the volunteers, lateral wall muscle thickness decreased significantly, by 9.2 mm (6.9-15.7 mm), during a maximal contraction. Sensory block and muscle-relaxing effect durations were 570 minutes (512-716 minutes) and 609 minutes (490-724 minutes), respectively. The cutaneous sensory block area of the TAP block is predominantly located lateral to a vertical line through the anterior superior iliac spine. The distribution is nondermatomal and does not cross the midline. The muscle-relaxing effect is significant and consistent. The block duration is approximately 10 hours, with large variation.
Bhatia, Risha; Morley, Colin J; Argus, Brenda; Tingay, David G; Donath, Susan; Davis, Peter G
2013-01-01
Very preterm infants can be treated with nasal continuous positive airway pressure (CPAP) from birth, but some fail. A rapid test, such as the stable microbubble test (SMT) on gastric aspirate, may identify those who can be managed successfully using CPAP. To determine whether the SMT can identify, soon after birth, very preterm infants who may be successfully managed on CPAP alone. Stable microbubbles (diameter <15 µm) were counted in gastric aspirates taken at <1 h of age from infants <30 weeks' gestation who received CPAP from birth. Infants failed CPAP if intubated at <72 h of age. Clinicians were masked to SMT results. A receiver operating characteristic curve was generated to determine the relationship between the number of microbubbles/mm2 and subsequent intubation. 68 infants of mean (SD) 28.1 (1.4) weeks' gestation received CPAP in the delivery room at a median (interquartile range) pressure of 7 (6-8) cmH2O and FiO2 of 0.25 (0.21-0.3). Gastric aspirates were taken at a median (interquartile range) age of 0.5 (0.3-0.6) hours. The best cut-off point for predicting CPAP success or failure was an SMT count of 8 microbubbles/mm2. The area under the receiver operating characteristic curve was 0.8 (95% CI 0.7-0.9). An SMT count ≥8 microbubbles/mm2 had a sensitivity of 53%, a specificity of 100%, a positive predictive value of 100%, and a negative predictive value of 60% for predicting CPAP success. Infants treated with CPAP from birth who had SMT counts ≥8 microbubbles/mm2 on their gastric aspirate did not fail CPAP. Copyright © 2013 S. Karger AG, Basel.
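The sensitivity, specificity, and predictive values quoted above are the standard 2x2 diagnostic metrics; a short sketch, with hypothetical cell counts chosen only to roughly reproduce the reported percentages:

```python
# Standard 2x2 diagnostic metrics for a test predicting CPAP success.
# Counts are hypothetical, roughly consistent with the reported 53/100/100/60%.
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),  # test-positive among CPAP successes
        "specificity": tn / (tn + fp),  # test-negative among CPAP failures
        "ppv": tp / (tp + fp),          # successes among test-positive infants
        "npv": tn / (tn + fn),          # failures among test-negative infants
    }

print(diagnostic_metrics(tp=21, fp=0, fn=19, tn=28))
# With fp = 0, specificity and PPV are 1.0: no infant with >= 8 microbubbles/mm2 failed CPAP.
```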
Airway Hypersensitivity, Reflux, and Phonation Contribute to Chronic Cough
Francis, David O.; Slaughter, James C.; Ates, Fehmi; Higginbotham, Tina; Stevens, Kristin L.; Garrett, C. Gaelyn; Vaezi, Michael F.
2015-01-01
Background & Aims: Although chronic cough is common, its etiology is often elusive, making patient management a challenge. Gastroesophageal reflux and airway hypersensitivity can cause chronic cough. We explored the relationship between reflux, phonation, and cough in patients with idiopathic chronic cough. Methods: We performed a blinded, cross-sectional study of non-smoking patients with chronic cough (duration > 8 weeks) refractory to reflux treatment referred to the Digestive Disease Center at Vanderbilt University. All underwent 24-hour acoustic recording concurrently and temporally synchronized with ambulatory pH-impedance monitoring. Cough, phonation, and pH-impedance events were recorded. We evaluated the temporal relationship between cough and phonation or reflux events using Poisson and logistic regression. Results: Seventeen patients met the inclusion criteria (88% female; 100% Caucasian; median age, 63 years; interquartile age range, 52-66 years; mean body mass index, 30.6; interquartile range, 27.9-34.0); there were 2048 analyzable coughing events. The probability of subsequent coughing increased with higher burdens of preceding cough, reflux, or phonation. Within the first 15 minutes after a cough event, the cough event itself was the main trigger of subsequent cough events. After this period, de novo coughing increased 1.46-fold in association with reflux alone (95% confidence interval, 1.17-1.82; P<.001) and 1.71-fold in association with the combination of phonation and reflux events. Conclusion: Antecedent phonation and reflux increased the rate of cough events in patients with idiopathic chronic cough. Reflux events were more strongly associated with an increased rate of coughing. Our findings support the concept that airway hypersensitivity is a cause of chronic cough, and that the vocal folds may be an effector in chronic cough. ClinicalTrials.gov number, NCT01263626. PMID:26492842
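A minimal sketch of a Poisson event-rate model in the spirit of the analysis above; the simulated data, variable names, and coefficients are ours, not the study's:

```python
# Poisson regression of cough counts on antecedent reflux and phonation (simulated).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
reflux = rng.integers(0, 2, n)       # antecedent reflux event in window (0/1)
phonation = rng.integers(0, 2, n)    # antecedent phonation event in window (0/1)
lam = np.exp(0.1 + 0.38 * reflux + 0.15 * phonation)  # assumed true rates
coughs = rng.poisson(lam)            # observed cough counts per window

X = sm.add_constant(np.column_stack([reflux, phonation]))
fit = sm.GLM(coughs, X, family=sm.families.Poisson()).fit()
print(np.exp(fit.params))  # rate ratios; exp(0.38) ~ 1.46 echoes the reported effect size
```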
Al Tmimi, Layth; Devroe, Sarah; Dewinter, Geertrui; Van de Velde, Marc; Poortmans, Gert; Meyns, Bart; Meuris, Bart; Coburn, Mark; Rex, Steffen
2017-10-01
Xenon was shown to cause less hemodynamic instability and to reduce vasopressor needs during off-pump coronary artery bypass (OPCAB) surgery when compared with conventionally used anesthetics. As xenon exerts its organ-protective properties even in subanesthetic concentrations, we hypothesized that in patients undergoing OPCAB surgery, 30% xenon added to general anesthesia with propofol results in superior hemodynamic stability when compared to anesthesia with propofol alone. Fifty patients undergoing elective OPCAB surgery were randomized to receive general anesthesia with 30% xenon adjuvant to a target-controlled infusion of propofol or with propofol alone. The primary end point was the total intraoperative dose of norepinephrine required to maintain an intraoperative mean arterial pressure >70 mm Hg. Secondary outcomes included the perioperative cardiorespiratory profile and the incidence of adverse and serious adverse events. Adding xenon to propofol anesthesia resulted in a significant reduction of the norepinephrine required to attain the predefined hemodynamic goals (cumulative intraoperative dose: median [interquartile range]: 370 [116-570] vs 840 [335-1710] µg, P = .001). In the xenon-propofol group, significantly less propofol was required to obtain a similar depth of anesthesia as judged by clinical signs and the bispectral index (propofol effect-site concentration [mean ± SD]: 1.8 ± 0.5 vs 2.8 ± 0.3 µg/mL, P ≤ .0001). Moreover, the xenon-propofol group required significantly less norepinephrine during the first 24 hours in the intensive care unit (median [interquartile range]: 1.5 [0.1-7] vs 5 [2-8] mg, P = .048). Other outcomes and safety parameters were similar in both groups. Thirty percent xenon added to propofol anesthesia improves hemodynamic stability by decreasing norepinephrine requirements in patients undergoing OPCAB surgery.
Gasper, Warren J; Jimenez, Cynthia A; Walker, Joy; Conte, Michael S; Seward, Kirk; Owens, Christopher D
2013-12-01
Endovascular interventions on peripheral arteries are limited by high rates of restenosis. Our hypothesis was that adventitial injection of rapamycin nanoparticles would be safe and reduce luminal stenosis in a porcine femoral artery balloon angioplasty model. Eighteen juvenile male crossbred swine were included. Single-injury (40%-60% femoral artery balloon overstretch injury; n=2) and double-injury models (endothelial denudation injury 2 weeks before a 20%-30% overstretch injury; n=2) were compared. The double-injury model produced significantly more luminal stenosis at 28 days, P=0.002, and no difference in medial fibrosis or inflammation. Four pigs were randomized to the double-injury model and adventitial injection of saline (n=2) or 500 μg of nanoparticle albumin-bound rapamycin (nab-rapamycin; n=2) with an endovascular microinfusion catheter. There was 100% procedural success and no difference in endothelial regeneration. At 28 days, nab-rapamycin led to significant reductions in luminal stenosis, 17% (interquartile range, 12%-35%) versus 10% (interquartile range, 8.3%-14%), P=0.001, medial cell proliferation, P<0.001, and fibrosis, P<0.001. There were significantly fewer adventitial leukocytes at 3 days, P<0.001, but no difference at 28 days. Pharmacokinetic analysis (single-injury model) found rapamycin concentrations 1500× higher in perivascular tissues than in blood at 1 hour. Perivascular rapamycin persisted ≥8 days and was not detectable at 28 days. Adventitial nab-rapamycin injection was safe and significantly reduced luminal stenosis in a porcine femoral artery balloon angioplasty model. Observed reductions in early adventitial leukocyte infiltration and late medial cell proliferation and fibrosis suggest an immunosuppressive and antiproliferative mechanism. An intraluminal microinfusion catheter for adventitial injection represents an alternative to stent- or balloon-based local drug delivery.
Ralph, Rachel; Patel, Jean A.; Postelnick, Michael; Ziauddin, Salma; Flis, Weronika; Galal, Audrey N.
2014-01-01
Background: Alerts issued by clinical decision support systems (CDSS) may be useful to identify and prevent the occurrence of acute kidney injury among patients on nephrotoxic drugs, particularly vancomycin. Objective: The purpose of this study was to determine the effectiveness of using a pharmacist-run CDSS alert of early serum creatinine increases in patients receiving intravenous vancomycin to decrease the proportion of severely elevated vancomycin concentrations. Methods: This was a retrospective study of a prospectively reviewed CDSS alert that triggered in patients with an increase in serum creatinine of 25% from baseline within 24 hours. Severely elevated vancomycin concentrations were divided into a control group (before alert implementation) and a study group (after alert implementation) and considered for study inclusion. The proportion of severely elevated vancomycin concentrations (ie, >30 mg/L) was collected in the control and study groups. Results: There were 1290 and 1501 vancomycin concentrations in the control group and the study group, respectively. A total of 696 CDSS alerts triggered during the study period. The proportion of severely elevated vancomycin troughs decreased from 5.3% (n = 68, median = 36.6 mg/L, interquartile range = 33.75-43.2 mg/L) in the control group to 3.7% (n = 55, median = 34.7 mg/L, interquartile range = 31.3-39.3 mg/L) in the study group. This reflects a statistically significant decrease in the proportion of severely elevated vancomycin concentrations (P = .04). Conclusion: Overall, this analysis of a novel use of CDSS software suggests that the implementation of an alert based on early detection of serum creatinine changes led to a significant decrease in the proportion of severely elevated serum vancomycin concentrations.
2011-01-01
Introduction The timely provision of critical care to hospitalised patients at risk for cardiopulmonary arrest is contingent upon identification and referral by frontline providers. Current approaches require improvement. In a single-centre study, we developed the Bedside Paediatric Early Warning System (Bedside PEWS) score to identify patients at risk. The objective of this study was to validate the Bedside PEWS score in a large patient population at multiple hospitals. Methods We performed an international, multicentre, case-control study of children admitted to hospital inpatient units with no limitations on care. Case patients had experienced a clinical deterioration event involving either an immediate call to a resuscitation team or urgent admission to a paediatric intensive care unit. Control patients had no events. The scores ranged from 0 to 26 and were assessed in the 24 hours prior to the clinical deterioration event. Score performance was assessed using the area under the receiver operating characteristic (AUCROC) curve by comparison with the retrospective rating of nurses and the temporal progression of scores in case patients. Results A total of 2,074 patients were evaluated at 4 participating hospitals. The median (interquartile range) maximum Bedside PEWS scores for the 12 hours ending 1 hour before the clinical deterioration event were 8 (5 to 12) in case patients and 2 (1 to 4) in control patients (P < 0.0001). The AUCROC curve (95% confidence interval) was 0.87 (0.85 to 0.89). In case patients, mean scores were 5.3 at 20 to 24 hours and 8.4 at 0 to 4 hours before the event (P < 0.0001). The AUCROC curve (95% CI) of the retrospective nurse ratings was 0.83 (0.81 to 0.86). This was significantly lower than that of the Bedside PEWS score (P < 0.0001). Conclusions The Bedside PEWS score identified children at risk for cardiopulmonary arrest. Scores were elevated and continued to increase in the 24 hours before the clinical deterioration event. Prospective clinical evaluation is needed to determine whether this score will improve the quality of care and patient outcomes. PMID:21812993
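The AUROC reported above has a useful rank-based interpretation: it equals the Mann-Whitney U statistic divided by the product of the group sizes, i.e. the probability that a randomly chosen case outscores a randomly chosen control. A short sketch with illustrative scores (not the study data):

```python
# AUROC from ranks: AUC = U / (n_cases * n_controls).
import numpy as np
from scipy.stats import mannwhitneyu

case_scores = np.array([8, 5, 12, 9, 7, 10])      # max Bedside PEWS scores, cases
control_scores = np.array([2, 1, 4, 3, 2, 1, 5])  # max Bedside PEWS scores, controls

u, _ = mannwhitneyu(case_scores, control_scores, alternative="greater")
auc = u / (len(case_scores) * len(control_scores))
print(f"AUROC = {auc:.2f}")
```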
Clinical Performance of a New Bitangential Mini-scleral Lens.
Otten, Henny M; van der Linden, Bart J J J; Visser, Esther-Simone
2018-06-01
New bitangential mini-scleral lens designs provide a highly precise fit, follow the scleral shape, are comfortable to wear, and can correct residual astigmatism. This new scleral lens design complements the arsenal of medical contact lenses available to eye care practitioners. The aim of this study was to evaluate the subjective and objective performance of a new mini-scleral lens design with a bitangential periphery. In this observational study, data were collected for up to 15 months (median, 84 days; interquartile range, 76 days) from the left eyes of 133 patients fitted with this newly designed lens. Data were recorded during regular visits at Visser Contact Lens Practice's scleral lens clinics: diagnosis, clinical indication for scleral lenses, previous contact lens type, subjective performance, horizontal visible iris diameter, corrected distance visual acuity, and scleral lens fitting characteristics. The most common indication was keratoconus (45%), followed by irregular astigmatism (22%), keratoplasty (16.5%), ocular surface disease (13.5%), and other forms of irregular astigmatism (3%). The majority of patients (79%) scored comfort as either a 4 or 5 (out of 5), and 82% wore their lenses 12 hours or longer a day. Most lenses (81%) had a diameter of 16 mm (median, 16 mm; range, 15.5 to 17 mm) and were composed of Boston XO2 (46%), Menicon Z (44%), Boston XO (9%), or Boston Equalens II (1%). The median corrected distance visual acuity was 0.022 logarithm of the minimal angle of resolution (interquartile range, 0.155). The fitting characteristics revealed optimal values for centration and movement in 91% and 83%, respectively. Finally, the median stabilization axis was 50 degrees. New mini-scleral lenses with bitangential peripheral geometry yield satisfactory clinical results and good subjective performance and are therefore an effective option for managing patients who have irregular astigmatism or other corneal pathology.
Likhvantsev, Valery V; Landoni, Giovanni; Grebenchikov, Oleg A; Skripkin, Yuri V; Zabelina, Tatiana S; Zinovkina, Liudmila A; Prikhodko, Anastasia S; Lomivorotov, Vladimir V; Zinovkin, Roman A
2017-12-01
To measure the release of plasma nuclear deoxyribonucleic acid (DNA) and to assess the relationship between nuclear DNA level and the occurrence of acute kidney injury in patients undergoing cardiac surgery. Cardiovascular anesthesiology and intensive care unit of a large tertiary-care university hospital. Prospective observational study. Fifty adult patients undergoing cardiac surgery. Nuclear DNA concentration was measured in the plasma. The relationship between the level of nuclear DNA and the incidence of acute kidney injury after coronary artery bypass grafting was investigated. Cardiac surgery led to a significant increase in plasma nuclear DNA, with peak levels 12 hours after surgery (median [interquartile range] 7.0 [9.6-22.5] µg/mL). No difference was observed between off-pump and on-pump surgical techniques. Nuclear DNA was the only predictor of acute kidney injury among baseline and early postoperative risk factors. The authors found an increase of nuclear DNA in the plasma of patients who had undergone coronary artery bypass grafting, with a peak after 12 hours and an association of nuclear DNA with postoperative acute kidney injury. Copyright © 2017 Elsevier Inc. All rights reserved.
Comparison of Antivenom Dosing Strategies for Rattlesnake Envenomation.
Spyres, Meghan B; Skolnik, Aaron B; Moore, Elizabeth C; Gerkin, Richard D; Padilla-Jones, Angela; Ruha, Anne-Michelle
2018-06-01
This study compares maintenance with clinical- and laboratory-triggered (as-needed [PRN]) antivenom dosing strategies with regard to patient-centered outcomes after rattlesnake envenomation. This is a retrospective cohort study of adult rattlesnake envenomations treated at a single regional toxicology center. Data on demographics, envenomation details, antivenom administration, length of stay, and laboratory and clinical outcomes were compared between the PRN and maintenance groups. Primary outcomes were hospital length of stay and total antivenom used, with a hypothesis of no difference between the two dosing strategies. Three hundred ten adult patients envenomated by rattlesnakes between 2007 and 2014 were included. Patients were excluded if no antivenom was administered or if they received an antivenom other than Crofab (BTG International, West Conshohocken, PA). Envenomations were treated with or without maintenance antivenom dosing: 148 patients in the maintenance group and 162 in the PRN group were included. There was no difference in demographics, baseline envenomation severity, or hemotoxicity (32.7% vs 40.5%, respectively; p = 0.158) between the two groups. Comparing the PRN with the maintenance group, less antivenom was used (8 [interquartile range, 6-12] vs 16 [interquartile range, 12-18] vials, respectively; p < 0.001), and hospital length of stay was shorter (27 hr [interquartile range, 20-44 hr] vs 34 hr [interquartile range, 24-43 hr], respectively; p = 0.014). There were no differences in the follow-up outcomes of readmission, retreatment, or bleeding and surgical complications. Hospital length of stay was shorter, and less antivenom was used, in patients receiving a PRN antivenom dosing strategy after rattlesnake envenomation.
Soon, Reni; Tschann, Mary; Salcedo, Jennifer; Stevens, Katelyn; Ahn, Hyeong Jun; Kaneshiro, Bliss
2017-08-01
To evaluate the efficacy of a paracervical block to decrease pain during osmotic dilator insertion before second-trimester abortion. In this double-blind, randomized trial, 41 women undergoing Laminaria insertion before a second-trimester abortion received either a paracervical block with 18 mL 1% lidocaine and 2 mL sodium bicarbonate or a sham block. Women were between 14 and 23 6/7 weeks of gestation. The primary outcome was pain immediately after insertion of Laminaria. Women assessed their pain on a 100-mm visual analog scale. Secondary outcomes included assessment of pain at other times during the insertion procedure and overall satisfaction with pain control. To detect a 25-mm difference in pain immediately after Laminaria insertion, at an α of 0.05 and 80% power, we aimed to enroll 20 patients in each arm. From May 2015 to December 2015, 20 women received a paracervical block and 21 received a sham block. Groups were similar in demographics, including parity, history of surgical abortion, and number of Laminaria placed. The paracervical block reduced pain after Laminaria insertion (median scores 13 mm [interquartile range 2-39] compared with 54 mm [interquartile range 27-61], P=.01, 95% CI -47.0 to -4.0). Women who received a paracervical block also reported higher satisfaction with overall pain control throughout the entire Laminaria insertion procedure (median scores 95 mm [interquartile range 78-100] compared with 70 mm [interquartile range 44-90], P=.05, 95% CI 0.0-37.0). Paracervical block is effective at reducing the pain of Laminaria insertion. Additionally, a paracervical block increases overall patient satisfaction with pain control during Laminaria placement. ClinicalTrials.gov, NCT02454296.
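A sketch of the stated power calculation (80% power, two-sided α = 0.05, 25-mm difference). The abstract does not report a standard deviation; assuming SD = 25 mm (effect size 1.0) lands close to the 20-per-arm target:

```python
# Two-sample t-test sample size for a 25-mm VAS difference.
# SD of 25 mm is an assumption (not reported in the abstract).
from statsmodels.stats.power import TTestIndPower

effect_size = 25 / 25  # difference / assumed SD (Cohen's d)
n_per_arm = TTestIndPower().solve_power(effect_size=effect_size, alpha=0.05, power=0.80)
print(round(n_per_arm))  # ~17; padding for dropout brings this near the 20 enrolled per arm
```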
Nakanishi, Taizo; Goto, Tadahiro; Kobuchi, Taketsune; Kimura, Tetsuya; Hayashi, Hiroyuki; Tokuda, Yasuharu
2017-12-22
To compare bystander cardiopulmonary resuscitation (CPR) skill retention between conventional learning and flipped learning for first-year medical students. A post-test-only control group design. A total of 108 participants were randomly assigned to either conventional learning or flipped learning. The primary outcome measures, the time to the first chest compression and the number of total chest compressions during a 2-minute test period 6 months after the training, were assessed with the Mann-Whitney U test. Fifty participants (92.6%) in the conventional learning group and 45 participants (83.3%) in the flipped learning group completed the study. There were no statistically significant differences 6 months after the training in the time to the first chest compression, at 33.0 seconds (interquartile range, 24.0-42.0) for the conventional learning group and 31.0 seconds (interquartile range, 25.0-41.0) for the flipped learning group (U=1171.0, p=0.73), or in the number of total chest compressions, at 101.5 (interquartile range, 90.8-124.0) for the conventional learning group and 104.0 (interquartile range, 91.0-121.0) for the flipped learning group (U=1083.0, p=0.75). The 95% confidence interval of the difference between means of the number of total chest compressions 6 months after the training did not exceed a clinically important difference defined a priori. There were no significant differences between the conventional learning group and the flipped learning group in our main outcomes. Flipped learning might be comparable to conventional learning, and seems a promising approach that requires fewer resources and enables student-centered learning without compromising the acquisition of CPR skills.
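The abstract does not specify how the confidence interval for the difference in means was computed; one reasonable route is a bootstrap, sketched here with simulated compression counts:

```python
# Bootstrap 95% CI for the between-group difference in mean total compressions.
# Data are simulated placeholders, not the trial's measurements.
import numpy as np

rng = np.random.default_rng(42)
conventional = rng.normal(105, 20, 50)  # simulated counts, conventional group
flipped = rng.normal(104, 20, 45)       # simulated counts, flipped group

diffs = [rng.choice(conventional, conventional.size).mean()
         - rng.choice(flipped, flipped.size).mean()
         for _ in range(10_000)]
lo, hi = np.percentile(diffs, [2.5, 97.5])
print(f"95% CI for difference in means: ({lo:.1f}, {hi:.1f})")
```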
Rocha Ferreira, Graziela Santos; de Almeida, Juliano Pinheiro; Landoni, Giovanni; Vincent, Jean Louis; Fominskiy, Evgeny; Gomes Galas, Filomena Regina Barbosa; Gaiotto, Fabio A; Dallan, Luís Oliveira; Franco, Rafael Alves; Lisboa, Luiz Augusto; Palma Dallan, Luis Roberto; Fukushima, Julia Tizue; Rizk, Stephanie Itala; Park, Clarice Lee; Strabelli, Tânia Mara; Gelas Lage, Silvia Helena; Camara, Ligia; Zeferino, Suely; Jardim, Jaquelline; Calvo Arita, Elisandra Cristina Trevisan; Caldas Ribeiro, Juliana; Ayub-Ferreira, Silvia Moreira; Costa Auler, Jose Otavio; Filho, Roberto Kalil; Jatene, Fabio Biscegli; Hajjar, Ludhmila Abrahao
2018-04-30
The aim of this study was to evaluate the efficacy of perioperative intra-aortic balloon pump use in high-risk cardiac surgery patients. A single-center randomized controlled trial and a meta-analysis of randomized controlled trials. Heart Institute of São Paulo University. High-risk patients undergoing elective coronary artery bypass surgery. Patients were randomized to receive intra-aortic balloon pump insertion after anesthesia induction and before skin incision versus no intra-aortic balloon pump use. The primary outcome was a composite endpoint of 30-day mortality and major morbidity (cardiogenic shock, stroke, acute renal failure, mediastinitis, prolonged mechanical ventilation, and a need for reoperation). A total of 181 patients (mean [SD] age 65.4 [9.4] yr; 32% female) were randomized. The primary outcome was observed in 43 patients (47.8%) in the intra-aortic balloon pump group and 42 patients (46.2%) in the control group (p = 0.46). The median duration of inotrope use (51 hr [interquartile range, 32-94 hr] vs 39 hr [interquartile range, 25-66 hr]; p = 0.007) and the ICU length of stay (5 d [interquartile range, 3-8 d] vs 4 d [interquartile range, 3-6 d]; p = 0.035) were longer in the intra-aortic balloon pump group than in the control group. A meta-analysis of 11 randomized controlled trials confirmed a lack of survival improvement in high-risk cardiac surgery patients with perioperative intra-aortic balloon pump use. In high-risk patients undergoing cardiac surgery, the perioperative use of an intra-aortic balloon pump did not reduce the occurrence of a composite outcome of 30-day mortality and major complications compared with usual care alone.
THE FUNDUS PHENOTYPE ASSOCIATED WITH THE p.Ala243Val BEST1 MUTATION.
Khan, Kamron N; Islam, Farrah; Moore, Anthony T; Michaelides, Michel
2018-03-01
To describe a highly recognizable and reproducible retinal phenotype associated with a specific BEST1 mutation, p.Ala243Val. Retrospective review of consecutive cases in which genetic testing identified p.Ala243Val BEST1 as the cause of disease. Electronic patient records were used to extract demographic, functional, and anatomical data. These data were compared with those observed with the most common BEST1 genotype, p.Arg218Cys. Eight individuals (six families) were identified with the p.Ala243Val BEST1 mutation and seven patients with the pathologic variant p.Arg218Cys. No patients with mutation of codon 243 knowingly had a family history of retinal disease, whereas all patients with the p.Arg218Cys variant did. The maculopathy was bilateral in all cases. The p.Ala243Val mutation was associated with a pattern dystrophy-type appearance, most visible with near-infrared reflectance and fundus autofluorescence imaging. This phenotype was never observed with any other genotype. This mutation was associated with an older median age of symptom onset (median = 42, interquartile range = 22) compared with those harboring the p.Arg218Cys mutation (median = 18, interquartile range = 12; Mann-Whitney U test; P < 0.05). Despite their older age, the final recorded acuity seemed to be better in the p.Ala243Val group (median = 0.55, interquartile range = 0.6475, vs median = 0.33, interquartile range = 0.358), although this did not reach statistical significance (Mann-Whitney U test; P > 0.05). The mutation p.Ala243Val is associated with a highly recognizable and reproducible pattern dystrophy-like phenotype. Patients develop symptoms at a later age and tend to have better preservation of electrooculogram amplitudes.
Syphilis in Drug Users in Low and Middle Income Countries
Coffin, Lara S.; Newberry, Ashley; Hagan, Holly; Cleland, Charles M.; Des Jarlais, Don C.; Perlman, David C.
2009-01-01
Background: Genital ulcer disease (GUD), including syphilis, is an important cause of morbidity in low and middle income (LMI) countries, and syphilis transmission is associated with HIV transmission. Methods: We conducted a literature review to evaluate syphilis infection among drug users in LMI countries for the period 1995-2007. Countries were categorized using the World Bank Atlas method (The World Bank, 2007) according to 2006 gross national income per capita. Results: Thirty-two studies were included (N=13,848 subjects), mostly from Southeast Asia, with some from Latin America, Eastern Europe, Central and East Asia, North Africa and the Middle East, but none from regions such as Sub-Saharan Africa. The median prevalence of overall lifetime syphilis (N=32 studies) was 11.1% (interquartile range: 6.3% to 15.3%) and of HIV (N=31 studies) was 1.1% (interquartile range: 0.22% to 5.50%). There was a modest relation (r = 0.27) between HIV and syphilis prevalence. Median syphilis prevalence by gender was 4.0% (interquartile range: 3.4% to 6.6%) among males (N=11 studies) and 19.9% (interquartile range: 11.4% to 36.0%) among females (N=6 studies). There was a strong relation (r = 0.68) between syphilis prevalence and female gender that may be related to female sex work. Conclusion: Drug users in LMI countries have a high prevalence of syphilis, but data are limited and, in some regions, entirely lacking. Further data are needed, including studies targeting the risks of women. Interventions to promote safer sex, testing, counseling and education, as well as health care worker awareness, should be integrated into harm reduction programs and health care settings to prevent new syphilis infections and reduce HIV transmission among drug users and their partners in LMI countries. PMID:19361976
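The study-level relations quoted above (r = 0.27, r = 0.68) are simple correlation coefficients; a minimal sketch with invented prevalences:

```python
# Correlation between study-level syphilis and HIV prevalence (invented values).
import numpy as np

syphilis = np.array([11.1, 6.3, 15.3, 9.0, 20.5, 4.2])  # % prevalence per study
hiv = np.array([1.1, 0.2, 5.5, 0.9, 3.0, 0.5])          # % prevalence per study

r = np.corrcoef(syphilis, hiv)[0, 1]
print(f"r = {r:.2f}")
```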
Comparison of denture base adaptation between CAD-CAM and conventional fabrication techniques.
Goodacre, Brian J; Goodacre, Charles J; Baba, Nadim Z; Kattadiyil, Mathew T
2016-08-01
Currently no data comparing the denture base adaptation of CAD-CAM and conventional denture processing techniques have been reported. The purpose of this in vitro study was to compare the denture base adaptation of pack and press, pour, injection, and CAD-CAM techniques for fabricating dentures to determine which process produces the most accurate and reproducible adaptation. A definitive cast was duplicated to create 40 gypsum casts that were laser scanned before any fabrication procedures were initiated. A master denture was made using the CAD-CAM process and was then used to create a putty mold for the fabrication of 30 standardized wax festooned dentures, 10 for each of the conventional processing techniques (pack and press, pour, injection). Scan files from 10 casts were sent to Global Dental Science, LLC for fabrication of the CAD-CAM test specimens. After specimens for each of the 4 techniques had been fabricated, they were hydrated for 24 hours, and the intaglio surface was laser scanned. The scan file of each denture was superimposed on the scan file of the corresponding preprocessing cast using surface-matching software. Measurements were made at 60 locations, providing evaluation of fit discrepancies at the following areas: the apex of the denture border, 6 mm from the denture border, the crest of the ridge, the palate, and the posterior palatal seal. Median and interquartile range were used to assess accuracy and reproducibility. Levene's test and the Kruskal-Wallis analysis of variance were used to evaluate differences between processing techniques at the 5 specified locations (α=.05). The ranking of results based on median and interquartile range determined that the accuracy and reproducibility of the CAD-CAM technique were more consistently localized around zero at 3 of the 5 locations. Therefore, the CAD-CAM technique showed the best combination of accuracy and reproducibility among the tested fabrication techniques. The pack and press technique was more accurate at 2 of the 5 locations; however, its interquartile range (reproducibility) was the greatest of the 4 tested processing techniques. The pour technique was the most reproducible at 2 of the 5 locations; however, its accuracy was the lowest of the tested techniques. The CAD-CAM fabrication process was the most accurate and reproducible denture fabrication technique when compared with the pack and press, pour, and injection denture base processing techniques. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
Costs of informal nursing care for patients with neurologic disorders: A systematic review.
Diederich, Freya; König, Hans-Helmut; Mietzner, Claudia; Brettschneider, Christian
2018-01-02
To systematically review the economic burden of informal nursing care (INC), often called informal care, caused by multiple sclerosis (MS), Parkinson disease (PD), and epilepsy, with special attention to disease severity. We systematically searched MEDLINE, PsycINFO, and NHS Economic Evaluation Database for articles on the cost of illness of the diseases specified. Title, abstract, and full-text review were conducted in duplicate by 2 researchers. The distribution of hours and costs of INC were extracted and used to compare the relevance of INC across included diseases and disease severity. Seventy-one studies were included (44 on MS, 17 on PD, and 10 on epilepsy). Studies on epilepsy reported an average of 2.3-54.5 monthly hours of INC per patient. For PD, average values of 42.9-145.9 hours and for MS average values of 9.2-249 hours per patient per month were found. In line with utilized hours, costs of INC were lowest for epilepsy (interquartile range [IQR] 229-1,466 purchasing power parity US dollars [PPP-USD]) and similar for MS (IQR 4,454-11,222 PPP-USD) and PD (IQR 1,440-7,117 PPP-USD). In addition, costs of INC increased with disease severity and accounted for 38% of total health care costs in severe MS stages on average. The course of diseases and disease severity matter for the amount of INC used by patients. For each of the neurologic disorders, an increase in the costs of INC, due to increasing disease severity, considerably contributes to the rise in total health care costs. Copyright © 2017 American Academy of Neurology.
The Impact of Time to Rate Control of Junctional Ectopic Tachycardia After Congenital Heart Surgery.
Lim, Joel Kian Boon; Mok, Yee Hui; Loh, Yee Jim; Tan, Teng Hong; Lee, Jan Hau
2017-11-01
Junctional ectopic tachycardia (JET) after congenital heart disease (CHD) surgery is often self-limiting but is associated with increased risk of morbidity and mortality. Contributing factors and the impact of time to achieve rate control of JET are poorly described. From January 2010 to June 2015, a retrospective, single-center cohort study was performed of children who developed JET after CHD surgery. We classified the cohort into two groups: patients who achieved rate control of JET in ≤24 hours and in >24 hours. We examined factors associated with time to rate control and compared clinical outcomes (mortality, duration of mechanical ventilation, length of intensive care unit [ICU] stay, and hospital stay) between the two groups. Our cohort included 27 children, with a median age of 3 (interquartile range: 0.7-38) months. The most common CHD lesions were ventricular septal defect (n = 10, 37%), tetralogy of Fallot (n = 7, 25.9%), and transposition of the great arteries (n = 4, 14.8%). In all, 15 (55.6%) and 12 (44.4%) patients achieved rate control of JET in ≤24 hours and >24 hours, respectively. There was a difference in median mechanical ventilation time (97 [21-145] vs 311 [100-676] hours; P = .013) and ICU stay (5.0 [2.0-8.0] vs 15.5 [5.5-32.8] days; P = .023) between patients who achieved rate control in ≤24 hours and those who did not. There was no difference in length of hospital stay or mortality between the groups. Our study demonstrated that a longer time to achieve rate control of JET was associated with increased duration of mechanical ventilation and ICU stay.
Emergency department identification and critical care management of a Utah prison botulism outbreak.
Williams, Benjamin T; Schlein, Sarah M; Caravati, E Martin; Ledyard, Holly; Fix, Megan L
2014-07-01
We report botulism poisoning at a state prison after ingestion of homemade wine (pruno). This is an observational case series with data collected retrospectively by chart review. All suspected exposures were referred to a single hospital in October 2011. Twelve prisoners consumed pruno, a homemade alcoholic beverage made from a mixture of ingredients in prison environments. Four drank pruno made without potato and did not develop botulism. Eight drank pruno made with potato, became symptomatic, and were hospitalized. Presenting symptoms included dysphagia, diplopia, dysarthria, and weakness. The median time to symptom onset was 54.5 hours (interquartile range [IQR] 49-88 hours) postingestion. All 8 patients received botulinum antitoxin a median of 12 hours post-emergency department admission (IQR 8.9-18.8 hours). Seven of 8 patients had positive stool samples for type A botulinum toxin. The 3 most severely affected patients had respiratory failure and were intubated 43, 64, and 68 hours postingestion. Their maximal inspiratory force values were -5, -15, and -30 cm H2O. Their forced vital capacity values were 0.91, 2.1, and 2.2 L, whereas the 5 nonintubated patients had median maximal inspiratory force of -60 cm H2O (IQR -60 to -55) and forced vital capacity of 4.5 L (IQR 3.7-4.9). Electromyography abnormalities were observed in 1 of the nonintubated and 2 of the intubated patients. A pruno-associated botulism outbreak resulted in respiratory failure and abnormal pulmonary parameters in the most affected patients. Electromyography abnormalities were observed in the majority of intubated patients. Potato in the pruno recipe was associated with botulism. Copyright © 2013 American College of Emergency Physicians. Published by Mosby, Inc. All rights reserved.
Safety of a rapid diagnostic protocol with accelerated stress testing.
Soremekun, Olan A; Hamedani, Azita; Shofer, Frances S; O'Conor, Katie J; Svenson, James; Hollander, Judd E
2014-02-01
Most patients at low to intermediate risk for an acute coronary syndrome (ACS) receive a 12- to 24-hour "rule out." Recently, trials have found that a coronary computed tomographic angiography-based strategy is more efficient. If stress testing were performed within the same time frame as coronary computed tomographic angiography, the 2 strategies would be more similar. We tested the hypothesis that stress testing can safely be performed within several hours of presentation. We performed a retrospective cohort study of patients presenting to a university hospital from January 1, 2009, to December 31, 2011, with potential ACS. Patients placed in a clinical pathway that performed stress testing after 2 negative troponin values 2 hours apart were included. We excluded patients with ST-elevation myocardial infarction or with an elevated initial troponin. The main outcome was safety of immediate stress testing defined as the absence of death or acute myocardial infarction (defined as elevated troponin within 24 hours after the test). A total of 856 patients who presented with potential ACS were enrolled in the clinical pathway and included in this study. Patients had a median age of 55.0 (interquartile range, 48-62) years. Chest pain was the chief concern in 86%, and pain was present on arrival in 73% of the patients. There were no complications observed during the stress test. There were 0 deaths (95% confidence interval, 0%-0.46%) and 4 acute myocardial infarctions within 24 hours (0.5%; 95% confidence interval, 0.14%-1.27%). The peak troponins were small (0.06, 0.07, 0.07, and 0.19 ng/mL). Patients who present to the ED with potential ACS can safely undergo a rapid diagnostic protocol with stress testing. © 2013.
Some people move it, move it… for pressure injury prevention.
Sonenblum, Sharon E; Sprigle, Stephen H
2018-01-01
To describe differences in in-seat behavior observed between individuals with a spinal cord injury (SCI) with and without a history of recurrent pressure injuries. Cross-sectional cohort study. General community. Twenty-nine adults more than 2 years post SCI, who used a wheelchair as their primary mobility device and had the ability to independently perform weight-shift maneuvers. Participants were grouped according to whether or not they had a history of recurrent pressure injuries (PrIs), with 12 subjects having had two or more pressure injuries in the pelvic area (PrI Group). Not applicable. Daily time in wheelchair, number of transfers, and frequency of pressure reliefs (full unloading), weight shifts (30% load reduction), and in-seat movements (transient center-of-pressure movements or unloading). The median participant spent 10.3 hours in his wheelchair and performed 16 transfers to or from the wheelchair daily. Pressure reliefs were performed less than once every 3 hours in both groups. Weight shifts were performed significantly more often by the No PrI Group (median (interquartile range) 2.5 (1.0-3.6) per hour) than by the PrI Group (1.0 (0.4-1.9)), with P = 0.037 and effect size r = 0.39. In-seat movements were performed 46.5 (28.7-76.7) times per hour by the No PrI Group and 39.6 (24.3-49.7) times per hour by the PrI Group (P = 0.352, effect size r = 0.17). Weight shifts that can be produced by functional activities and that partially unload the buttocks should be considered as an important addition to individuals' PrI prevention regimen.
Ducharme-Crevier, Laurence; Press, Craig A; Kurz, Jonathan E; Mills, Michele G; Goldstein, Joshua L; Wainwright, Mark S
2017-05-01
The role of sleep architecture as a biomarker for prognostication after resuscitation from cardiac arrest in children hospitalized in an ICU remains poorly defined. We sought to investigate the association between features of normal sleep architecture in children after cardiac arrest and a favorable neurologic outcome at 6 months. Retrospective review of medical records and continuous electroencephalography monitoring. Cardiac and PICU of a tertiary children's hospital. All patients from 6 months to 18 years old resuscitated from cardiac arrest who underwent continuous electroencephalography monitoring in the first 24 hours after in- or out-of-hospital cardiac arrest from January 2010 to June 2015. None. Thirty-four patients underwent continuous electroencephalography monitoring after cardiac arrest. The median age was 6.1 years (interquartile range, 1.5-12.5 yr), 20 patients were male (59%). Most cases (n = 23, 68%) suffered from in-hospital cardiac arrest. Electroencephalography monitoring was initiated a median of 9.3 hours (5.8-14.9 hr) after return of spontaneous circulation, for a median duration of 14.3 hours (6.0-16.0 hr) within the first 24-hour period after the cardiac arrest. Five patients had normal spindles, five had abnormal spindles, and 24 patients did not have any sleep architecture. The presence of spindles was associated with a favorable neurologic outcome at 6-month postcardiac arrest (p = 0.001). Continuous electroencephalography monitoring can be used in children to assess spindles in the ICU. The presence of spindles on continuous electroencephalography monitoring in the first 24 hours after resuscitation from cardiac arrest is associated with a favorable neurologic outcome. Assessment of sleep architecture on continuous electroencephalography after cardiac arrest could improve outcome prediction.
Nocturnal cough in children with stable cystic fibrosis.
van der Giessen, Lianne; Loeve, Martine; de Jongste, Johan; Hop, Wim; Tiddens, Harm
2009-09-01
To date no studies have been published on nocturnal cough frequency in children with stable CF. The aim of the study was to assess nocturnal cough frequency in children with CF and to correlate it with parameters of disease severity. During two nights, cough was recorded with a digital audio recorder in 25 patients (mean age 13 years; range 6-19) with clinically stable CF. In addition, oxygen saturation was measured. On the day following the recording, spirometry was carried out. CT scores were obtained from the most recent routine CT scan. Cough was expressed in cough seconds (csec) and in cough seconds per hour (csec/hr). Data shown are median values and interquartile range (IQR). First night: 8 csec (IQR 3-52); 0.9 csec/hr (IQR 0.3-6.1). Second night: 6 csec (IQR 2-32); 0.6 csec/hr (IQR 0.1-3.4). Csec in the 1st night did not correlate significantly with csec in the 2nd night. Only for the 2nd night was a strong correlation found between csec/hr and FEV1 %pred (rs = -0.75, P < 0.001) and FEF75 %pred (rs = -0.71, P < 0.001). The bronchiectasis score showed a borderline correlation with the mean csec/hr of both nights (rs = 0.39, P = 0.08). During both nights cough was significantly more frequent in the first hour of sleep (P ≤ 0.04). The frequency of nocturnal coughing in children with CF was higher than that described for normal children. Nocturnal cough tended to be more severe in children with more advanced CF lung disease. Nocturnal cough was more severe in the first hour of sleep and varied from night to night. Copyright 2009 Wiley-Liss, Inc.
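The rs values above are Spearman rank correlations; a small sketch with invented paired values:

```python
# Spearman rank correlation between nocturnal cough and lung function (invented data).
from scipy.stats import spearmanr

csec_per_hr = [0.1, 0.4, 0.6, 1.2, 3.4, 6.0]  # nocturnal cough, 2nd night
fev1_pct_pred = [105, 98, 90, 82, 65, 50]     # FEV1 % predicted

rs, p = spearmanr(csec_per_hr, fev1_pct_pred)
print(f"rs = {rs:.2f}, p = {p:.3f}")  # strongly negative, as in the reported relation
```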
Hansen, Lea K; Becher, Naja; Bastholm, Sara; Glavind, Julie; Ramsing, Mette; Kim, Chong J; Romero, Roberto; Jensen, Jørgen S; Uldbjerg, Niels
2014-01-01
To evaluate the microbial load and the inflammatory response in the distal and proximal parts of the cervical mucus plug. Experimental research. Twenty women with a normal, singleton pregnancy. Vaginal swabs and specimens from the distal and proximal parts of the cervical mucus plug. Immunohistochemistry, enzyme-linked immunosorbent assay, quantitative polymerase chain reaction and histology. The total bacterial load (16S rDNA) was significantly lower in the cervical mucus plug compared with the vagina (p = 0.001). Among women harboring Ureaplasma parvum, the median genome equivalents/g was 1574 (interquartile range 2526) in the proximal part, 657 (interquartile range 1620) in the distal part and 60,240 (interquartile range 96,386) in the vagina. Histological examinations and quantitative polymerase chain reaction revealed considerable amounts of lactobacilli and inflammatory cells in both parts of the cervical mucus plug. The matrix metalloproteinase-8 concentration was lower in the proximal part of the plug compared with the distal part (p = 0.08). The cervical mucus plug inhibits, but does not block, the passage of Ureaplasma parvum during its ascending route from the vagina through the cervical canal. © 2013 Nordic Federation of Societies of Obstetrics and Gynecology.
Adverse Drug Events and Medication Errors in African Hospitals: A Systematic Review.
Mekonnen, Alemayehu B; Alhawassi, Tariq M; McLachlan, Andrew J; Brien, Jo-Anne E
2018-03-01
Medication errors and adverse drug events are universal problems contributing to patient harm, but the magnitude of these problems in Africa remains unclear. The objective of this study was to systematically investigate the literature on the extent of medication errors and adverse drug events, and the factors contributing to medication errors, in African hospitals. We searched PubMed, MEDLINE, EMBASE, Web of Science and Global Health databases from inception to 31 August 2017 and hand searched the reference lists of included studies. Original research studies of any design published in English that investigated adverse drug events and/or medication errors in any patient population in the hospital setting in Africa were included. Descriptive statistics including median and interquartile range were presented. Fifty-one studies were included; of these, 33 focused on medication errors, 15 on adverse drug events, and three studies focused on both medication errors and adverse drug events. These studies were conducted in nine (of the 54) African countries. In any patient population, the median (interquartile range) percentage of patients reported to have experienced any suspected adverse drug event at hospital admission was 8.4% (4.5-20.1%), while adverse drug events causing admission were reported in 2.8% (0.7-6.4%) of patients; a median of 43.5% (20.0-47.0%) of these adverse drug events were deemed preventable. Similarly, the median mortality rate attributed to adverse drug events was reported to be 0.1% (interquartile range 0.0-0.3%). The most commonly reported types of medication errors were prescribing errors, occurring in a median of 57.4% (interquartile range 22.8-72.8%) of all prescriptions, and a median of 15.5% (interquartile range 7.5-50.6%) of the prescriptions evaluated had dosing problems. Major contributing factors for medication errors reported in these studies were individual practitioner factors (e.g. fatigue and inadequate knowledge/training) and environmental factors, such as workplace distraction and high workload. Medication errors in the African healthcare setting are relatively common, and the impact of adverse drug events is substantial, but many are preventable. This review supports the design and implementation of preventative strategies targeting the most likely contributing factors.
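As a purely illustrative sketch of the descriptive pooling the review reports (median with interquartile range across study-level rates; the numbers below are invented, not the review's data):

```python
# Summarize hypothetical study-level adverse-drug-event rates as
# median (interquartile range), the descriptive approach described above.
import numpy as np

study_rates_pct = np.array([4.5, 8.4, 20.1, 2.9, 11.0, 6.7])  # assumed values
q1, median, q3 = np.percentile(study_rates_pct, [25, 50, 75])
print(f"median {median:.1f}% (IQR {q1:.1f}-{q3:.1f}%)")
```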
Marcaccio, Christina L; Dumas, Ryan P; Huang, Yanlan; Yang, Wei; Wang, Grace J; Holena, Daniel N
2018-02-13
The traditional approach to stable blunt thoracic aortic injury (BTAI) endorsed by the Society for Vascular Surgery is early (<24 hours) thoracic endovascular aortic repair (TEVAR). Recently, some studies have shown improved mortality in stable BTAI patients repaired in a delayed manner (≥24 hours). However, the indications for use of delayed TEVAR for BTAI are not well characterized, and its overall impact on the patient's survival remains poorly understood. We sought to determine whether delayed TEVAR is associated with a decrease in mortality compared with early TEVAR in this population. We conducted a retrospective cohort study of adult patients with BTAI (International Classification of Diseases, Ninth Revision diagnosis code 901.0) who underwent TEVAR (International Classification of Diseases, Ninth Revision procedure code 39.73) from 2009 to 2013 using the National Sample Program data set. Missing physiologic data were imputed using chained multiple imputation. Patients were parsed into groups based on the timing of TEVAR (early, <24 hours, vs delayed, ≥24 hours). The χ², Mann-Whitney, and Fisher exact tests were used to compare baseline characteristics and outcomes of interest between groups. Multivariable logistic regression for mortality was performed, including all variables significant at P ≤ .2 in univariate analyses. A total of 2045 adult patients with BTAI were identified, of whom 534 (26%) underwent TEVAR. Patients with missing data on TEVAR timing were excluded (n = 27), leaving a total of 507 patients for analysis (75% male; 69% white; median age, 40 years [interquartile range, 27-56 years]; median Injury Severity Score [ISS], 34 [interquartile range, 26-41]). Of these, 378 patients underwent early TEVAR and 129 underwent delayed TEVAR. The two groups were similar with regard to age, sex, race, ISS, and presenting physiology. Mortality was 11.9% in the early TEVAR group vs 5.4% in the delayed group, with the early group displaying a higher odds of death (odds ratio, 2.36; 95% confidence interval, 1.03-5.36; P = .042). After adjustment for age, ISS, and admission physiology, the association between early TEVAR and mortality was preserved (adjusted odds ratio, 2.39; 95% confidence interval, 1.01-5.67; P = .047). Consistent with current Society for Vascular Surgery recommendations, more BTAI patients underwent early TEVAR than delayed TEVAR during the study period. However, delayed TEVAR was associated with significantly reduced mortality in this population. Together, these findings support a need for critical appraisal and clarification of existing practice guidelines in management of BTAI. Future studies should seek to understand this survival disparity and to determine optimal selection of patients for early vs delayed TEVAR. Copyright © 2017 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
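A hedged sketch of the kind of adjusted model the abstract describes (multivariable logistic regression for mortality); all data here are synthetic, the multiple-imputation step is omitted, and the variable names are assumptions:

```python
# Synthetic-data sketch of a mortality model adjusting for age, ISS,
# and an early-TEVAR indicator (not the study's actual data or code).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 507
age = rng.uniform(18, 90, n)
iss = rng.uniform(16, 50, n)
early_tevar = rng.integers(0, 2, n).astype(float)
logit = -6 + 0.02 * age + 0.05 * iss + 0.9 * early_tevar
death = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([age, iss, early_tevar]))
fit = sm.Logit(death, X).fit(disp=0)
print("adjusted OR, early vs delayed TEVAR:", round(np.exp(fit.params[3]), 2))
```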
Smerud, Hilde Kloster; Bárány, Peter; Lindström, Karin; Fernström, Anders; Sandell, Anna; Påhlsson, Peter; Fellström, Bengt
2011-10-01
Systemic corticosteroid treatment has been shown to exert some protection against renal deterioration in IgA nephropathy (IgAN) but is not commonly recommended for long-term use due to the well-known systemic side effects. In this study, we investigated the efficacy and safety of a new enteric formulation of the locally acting glucocorticoid budesonide (Nefecon®), designed to release the active compound in the ileocecal region. The primary objective was to evaluate the efficacy of targeted-release budesonide on albuminuria. Budesonide 8 mg/day was given to 16 patients with IgAN for 6 months, followed by a 3-month follow-up period. The efficacy was measured as change in 24-h urine albumin excretion, serum creatinine and estimated glomerular filtration rate (eGFR). The median relative reduction in urinary albumin excretion was 23% during the treatment period (interquartile range: -0.36 to -0.04, P = 0.04) with pretreatment values ranging from 0.3 to 6 g/24 h (median: 1.5 g/24 h). The median reduction in urine albumin peaked at 40% (interquartile range: -0.58 to -0.15) 2 months after treatment discontinuation. Serum creatinine was reduced by 6% (interquartile range: -0.12 to -0.02; P = 0.003), and eGFR [Modification of Diet in Renal Disease (MDRD)] increased ∼8% (interquartile range: 0.02-0.16, P = 0.003) during treatment. No major corticosteroid-related side effects were observed. In the present pilot study, enteric budesonide targeted to the ileocecal region had a significant effect on urine albumin excretion, accompanied by a minor reduction of serum creatinine and a modest increase of eGFR calculated by the MDRD equation, while eGFR calculated from the Cockcroft-Gault equation and cystatin C was not changed. Enteric budesonide may represent a new treatment of IgAN warranting further investigation.
Gardner, Blake; Ling, Frederick; Hopke, Philip K; Frampton, Mark W; Utell, Mark J; Zareba, Wojciech; Cameron, Scott J; Chalupa, David; Kane, Cathleen; Kulandhaisamy, Suresh; Topf, Michael C; Rich, David Q
2014-01-02
We and others have shown that increases in particulate air pollutant (PM) concentrations in the previous hours and days have been associated with increased risks of myocardial infarction, but little is known about the relationships between air pollution and specific subsets of myocardial infarction, such as ST-elevation myocardial infarction (STEMI) and non-ST-elevation myocardial infarction (NSTEMI). Using data from acute coronary syndrome patients with STEMI (n = 338) and NSTEMI (n = 339) and case-crossover methods, we estimated the risk of STEMI and NSTEMI associated with increased ambient fine particle (<2.5 μm) concentrations, ultrafine particle (10-100 nm) number concentrations, and accumulation mode particle (100-500 nm) number concentrations in the previous few hours and days. We found a significant 18% increase in the risk of STEMI associated with each 7.1 μg/m³ increase in PM₂.₅ concentration in the hour prior to acute coronary syndrome onset, with smaller, non-significantly increased risks associated with increased fine particle concentrations in the previous 3, 12, and 24 hours. We found no such pattern with NSTEMI. Estimates of the risk of STEMI associated with interquartile range increases in ultrafine particle and accumulation mode particle number concentrations in the previous 1 to 96 hours were all greater than 1.0, but not statistically significant. Patients with pre-existing hypertension had a significantly greater risk of STEMI associated with increased fine particle concentration in the previous hour than patients without hypertension. Increased fine particle concentrations in the hour prior to acute coronary syndrome onset were associated with an increased risk of STEMI, but not NSTEMI. Patients with pre-existing hypertension and other cardiovascular disease appeared particularly susceptible. Further investigation into mechanisms by which PM can preferentially trigger STEMI over NSTEMI within this rapid time scale is needed.
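The "18% increase per 7.1 μg/m³" phrasing corresponds to rescaling a per-unit log-odds coefficient by the chosen increment; a small sketch follows (the per-unit coefficient below is hypothetical, not the study's estimate):

```python
# Rescale an odds-ratio estimate to a pollutant increment:
# OR(increment) = exp(beta_per_unit * increment).
import math

beta_per_unit = 0.0233  # hypothetical log-odds per 1 ug/m3 of PM2.5
increment = 7.1         # ug/m3, the increment quoted in the abstract
print(f"OR per {increment} ug/m3 = {math.exp(beta_per_unit * increment):.2f}")
# -> ~1.18, i.e. an 18% increase in risk
```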
Yang, Jong Min; Park, Yoo Seok; Chung, Sung Phil; Chung, Hyun Soo; Lee, Hye Sun; You, Je Sung; Lee, Shin Ho; Park, Incheol
2014-08-01
Admission on weekends and off-hours has been associated with poor outcomes and mortality from acute stroke. The purpose of this study was to investigate whether an organized clinical pathway (CP) for ischemic stroke can effectively reduce the time from arrival to evaluation and treatment in the emergency department (ED) and improve outcomes, regardless of the time of arrival in the ED. We conducted a retrospective analysis of all consecutive patients included in the prospective registry database in the Brain Salvage through Emergency Stroke Therapy program, which uses the computerized physician order entry (CPOE) system. Patients were classified based on their time of arrival in the ED: group 1, normal working hours on weekdays; group 2, off-hours on weekdays; group 3, normal working hours on weekends; and group 4, off-hours on weekends. Clinical outcomes were categorized according to 30-day in-hospital mortality, in-hospital mortality, and the modified Rankin score during a single length of stay (LOS). No time intervals differed significantly among the 4 patient groups who received intravenous administration of tissue plasminogen activator (IV-tPA). Use of IV-tPA (P = .5110) was not affected by arrival in the ED on off-days or weekends. The overall mortality rate was 3.9%, and the median LOS was 7 days (interquartile range [IQR], 5-10). By Kaplan-Meier analysis, the cumulative probability of mortality and survival did not differ significantly among the 4 groups over 30 days (P = .1557). An organized CP, based on CPOE, for ischemic stroke can effectively attenuate disparities in the time interval from ED arrival to evaluation and treatment, regardless of ED arrival time. This pathway may also help to eliminate off-hour and weekend effects on outcomes from ischemic stroke. Copyright © 2014 Elsevier Inc. All rights reserved.
Ezeugwu, Victor E; Manns, Patricia J
2017-09-01
The aim of this study was to describe accelerometer-derived sleep duration, sedentary behavior, physical activity, and quality of life and their association with demographic and clinical factors within the first month after inpatient stroke rehabilitation. Thirty people with stroke (mean ± standard deviation, age: 63.8 ± 12.3 years, time since stroke: 3.6 ± 1.1 months) wore an activPAL3 Micro accelerometer (PAL Technologies, Glasgow, Scotland) continuously for 7 days to measure whole-day activity behavior. The Stroke Impact Scale and the Functional Independence Measure were used to assess quality of life and function, respectively. Sleep duration ranged from 6.6 to 11.6 hours/day. Fifteen participants engaged in long sleep of greater than 9 hours/day. Participants spent 74.8% of waking hours in sedentary behavior, 17.9% standing, and 7.3% stepping. Of stepping time, only a median of 1.1 (interquartile range: .3-5.8) minutes was spent walking at a moderate-to-vigorous intensity (≥100 steps/minute). The time spent sedentary, the stepping time, and the number of steps differed significantly by the hemiparetic side (P < .05), but not by sex or the type of stroke. There were moderate to strong correlations between the stepping time and the number of steps with gait speed (Spearman r = .49 and .61, respectively; P < .01). Correlations between accelerometer-derived variables and age, time since stroke, and cognition were not significant. People with stroke sleep for longer than the normal duration, spend about three quarters of their waking hours in sedentary behaviors, and engage in minimal walking following stroke rehabilitation. Our findings provide a rationale for the development of behavior change strategies after stroke. Copyright © 2017 National Stroke Association. Published by Elsevier Inc. All rights reserved.
Staveski, Sandra L; Boulanger, Karen; Erman, Lee; Lin, Li; Almgren, Christina; Journel, Chloe; Roth, Stephen J; Golianu, Brenda
2018-06-14
The purpose of this pilot study was three-fold: 1) to evaluate the safety and feasibility of instituting massage therapy in the immediate postoperative period after congenital heart surgery, 2) to examine the preliminary results on effects of massage therapy versus standard of care plus three reading visits on postoperative pain and anxiety, and 3) to evaluate preliminary effects on opioid and benzodiazepine exposure in patients receiving massage therapy compared with reading controls. Prospective, randomized controlled trial. An academic children's hospital. Sixty pediatric heart surgery patients between ages 6 and 18 years. Massage therapy and reading. There were no adverse events related to massage or reading interventions in either group. Our investigation found no statistically significant difference in Pain or State-Trait Anxiety scores in the initial 24 hours after heart surgery (T1) and within 48 hours of transfer to the acute care unit (T2) after controlling for age, gender, and Risk Adjustment for Congenital Heart Surgery 1 score. However, children receiving massage therapy had significantly lower State-Trait Anxiety scores after receiving massage therapy at time of discharge (T3; p = 0.0075) than children receiving standard of care plus three reading visits. We found no difference in total opioid exposure during the first 3 postoperative days between groups (median [interquartile range], 0.80 mg/kg morphine equivalents [0.29-10.60] vs 1.13 mg/kg morphine equivalents [0.72-6.14]). In contrast, children receiving massage therapy had significantly lower total benzodiazepine exposure in the immediate 3 days following heart surgery (median [interquartile range], 0.002 mg/kg lorazepam equivalents [0-0.03] vs 0.03 mg/kg lorazepam equivalents [0.02-0.09]; p = 0.0253, Wilcoxon rank-sum) and fewer benzodiazepine PRN doses (median [interquartile range], 0.5 [0-2.5] vs 2 [1-4]; p = 0.00346, Wilcoxon rank-sum). Our pilot study demonstrated the safety and feasibility of implementing massage therapy in the immediate postoperative period in pediatric heart surgery patients. We found decreased State-Trait Anxiety scores at discharge and lower total exposure to benzodiazepines. Preventing postoperative complications such as delirium through nonpharmacologic interventions warrants further evaluation.
Teriparatide Therapy and Reduced Postoperative Hospitalization for Postsurgical Hypoparathyroidism.
Shah, Meera; Bancos, Irina; Thompson, Geoffrey B; Richards, Melanie L; Kasperbauer, Jan L; Clarke, Bart L; Drake, Matthew T; Stan, Marius N
2015-09-01
Up to 20% of patients undergoing thyroidectomy develop hypocalcemia after surgery. Although usually transient, severe symptomatic hypocalcemia may occur. Teriparatide acetate (recombinant human parathyroid hormone 1-34) therapy can rapidly raise calcium levels. To test the hypothesis that teriparatide therapy in patients with postthyroidectomy hypoparathyroidism would expedite relief of symptomatic hypocalcemia and reduce the duration of hospitalization compared with standard treatment. Case series of all hospitalized patients 18 years or older treated with teriparatide for symptomatic hypocalcemia occurring immediately after thyroidectomy at Mayo Clinic, Rochester, Minnesota, between January 1, 2008, and June 30, 2014. A secondary analysis was performed with matched control and cohort groups having postthyroidectomy hypocalcemia of similar degree who received standard treatment only. Participants included 8 hospitalized patients who received teriparatide therapy after 24 hours of standard treatment (cases) and 8 control patients, selected from a cohort of 1193 thyroidectomies, matched for age, sex, body mass index, and nadir calcium levels. Teriparatide acetate therapy (20 µg twice daily) was given subcutaneously for 1 week, with the option of continuing at 20 µg/d for up to 3 weeks. Safety, symptom resolution, calcium supplementation, and duration of hospitalization. Among the 16 case and control patients, the median nadir calcium level was 7.1 mg/dL in both groups. Most patients underwent thyroidectomy for thyroid cancer. Teriparatide therapy was safe, with no adverse events noted, and completely eliminated symptomatic hypocalcemia in all treated patients within 24 hours of initiation. Hospital discharge occurred at a median of 1.0 day (interquartile range, 1.0-1.0 day) after teriparatide therapy initiation among cases vs 2.5 days (interquartile range, 1.8-3.0 days) after the equivalent clinical point was reached in controls (P = .01). This value was 2.0 days in the source cohort (P = .02). On hospital discharge, patients had similar calcium levels. Six months after surgery, all patients treated with teriparatide showed partial or complete parathyroid recovery. Calcium supplementation and calcium levels were comparable between the groups. In this pilot study, teriparatide therapy in patients with postthyroidectomy hypoparathyroidism was safe, rapidly eliminated hypocalcemic symptoms, and likely reduced the duration of hospitalization. Given the limitations of this small study, a large-scale randomized trial is needed to verify these results and to assess the long-term effect of teriparatide therapy on clinical outcomes.
Restrictive versus Liberal Fluid Therapy for Major Abdominal Surgery.
Myles, Paul S; Bellomo, Rinaldo; Corcoran, Tomas; Forbes, Andrew; Peyton, Philip; Story, David; Christophi, Chris; Leslie, Kate; McGuinness, Shay; Parke, Rachael; Serpell, Jonathan; Chan, Matthew T V; Painter, Thomas; McCluskey, Stuart; Minto, Gary; Wallace, Sophie
2018-05-09
Background Guidelines to promote the early recovery of patients undergoing major surgery recommend a restrictive intravenous-fluid strategy for abdominal surgery. However, the supporting evidence is limited, and there is concern about impaired organ perfusion. Methods In a pragmatic, international trial, we randomly assigned 3000 patients who had an increased risk of complications while undergoing major abdominal surgery to receive a restrictive or liberal intravenous-fluid regimen during and up to 24 hours after surgery. The primary outcome was disability-free survival at 1 year. Key secondary outcomes were acute kidney injury at 30 days, renal-replacement therapy at 90 days, and a composite of septic complications, surgical-site infection, or death. Results During and up to 24 hours after surgery, 1490 patients in the restrictive fluid group had a median intravenous-fluid intake of 3.7 liters (interquartile range, 2.9 to 4.9), as compared with 6.1 liters (interquartile range, 5.0 to 7.4) in 1493 patients in the liberal fluid group (P<0.001). The rate of disability-free survival at 1 year was 81.9% in the restrictive fluid group and 82.3% in the liberal fluid group (hazard ratio for death or disability, 1.05; 95% confidence interval, 0.88 to 1.24; P=0.61). The rate of acute kidney injury was 8.6% in the restrictive fluid group and 5.0% in the liberal fluid group (P<0.001). The rate of septic complications or death was 21.8% in the restrictive fluid group and 19.8% in the liberal fluid group (P=0.19); rates of surgical-site infection (16.5% vs. 13.6%, P=0.02) and renal-replacement therapy (0.9% vs. 0.3%, P=0.048) were higher in the restrictive fluid group, but the between-group difference was not significant after adjustment for multiple testing. Conclusions Among patients at increased risk for complications during major abdominal surgery, a restrictive fluid regimen was not associated with a higher rate of disability-free survival than a liberal fluid regimen and was associated with a higher rate of acute kidney injury. (Funded by the Australian National Health and Medical Research Council and others; RELIEF ClinicalTrials.gov number, NCT01424150.)
Dean, Dylan; Wetzel, Brian; White, Nathan; Kuppermann, Nathan; Wang, Nancy Ewen; Haukoos, Jason S; Hsia, Renee Y; Mann, N Clay; Barton, Erik D; Newgard, Craig D
2014-03-01
This study aimed to characterize initial clinical presentations of patients served by emergency medical services (EMS) who die following injury, with particular attention to patients with occult ("talk-and-die") presentations. This was a population-based, multiregion, mixed-methods retrospective cohort study of fatally injured children and adults evaluated by 94 EMS agencies transporting to 122 hospitals in seven Western US regions from 2006 to 2008. Fatalities were divided into two main groups: occult injuries (talk-and-die; Glasgow Coma Scale [GCS] score ≥ 13, no cardiopulmonary arrest, and no intubation) versus overt injuries (all other patients). These groups were further subdivided by timing of death: early (<48 hours) versus late (>48 hours). We then compared demographic, physiologic, procedural, and injury patterns using descriptive statistics. We also used qualitative methods to analyze available EMS chart narratives for contextual information from the out-of-hospital encounter. During the 3-year study period, 3,358 persons served by 9-1-1 EMS providers died, with 1,225 (37.1%) in the field, 1,016 (30.8%) early in the hospital, and 1,060 (32.1%) late in the hospital. Of the 2,133 patients transported to a hospital, there were 612 (28.7%) talk-and-die patients, of whom 114 (18.6%) died early. Talk-and-die patients were older (median age, 81 years; interquartile range, 67-87 years), normotensive (median systolic blood pressure, 138 mm Hg; interquartile range, 116-160 mm Hg), commonly injured by falls (71.3%), and frequently (52.4%) died in nontrauma hospitals. Compared with overtly injured patients, talk-and-die patients had relatively fewer serious head injuries (13.7%) but more frequent extremity injuries (20.3% vs. 10.6%) and orthopedic interventions (25.3% vs. 5.0%). EMS personnel often found talk-and-die patients lying on the ground with hip pain or extremity injuries. Patients served by EMS who "talk-and-die" are typically older adults with falls, transported to nontrauma hospitals, with subtle clinical indications of the severity of their injuries. Improving recognition of talk-and-die patients may avoid fatal outcomes in a portion of these patients. Epidemiologic study, level III.
Brain, Matthew; Anderson, Mike; Parkes, Scott; Fowler, Peter
2012-12-01
To describe magnesium flux and serum concentrations in ICU patients receiving continuous venovenous haemodiafiltration (CVVHDF). Samples were collected from 22 CVVHDF circuits using citrate anticoagulation solutions (Prismocitrate 10/2 and Prism0cal) and from 26 circuits using Hemosol B0 and heparin anticoagulation. CVVHDF prescription, magnesium supplementation and anticoagulation choice were determined by the treating intensivist. We analysed 334 sample sets consisting of arterial, prefilter and postfilter blood and effluent. Magnesium loss was calculated from an equation for conservation of mass, and arterial magnesium concentration was described by an equation for exponential decay. Using flow rates typical of adults receiving CVVHDF, we determined a median half-life for arterial magnesium concentration to decay to a new steady state of 4.73 hours (interquartile range [IQR], 3.73-7.32 hours). Median arterial magnesium concentration was 0.88 mmol/L (IQR, 0.83-0.97 mmol/L) in the heparin group and 0.79 mmol/L (IQR, 0.69-0.91 mmol/L) in the citrate group. Arterial magnesium concentrations fell below the reference range regularly in the citrate group and, when low, there was magnesium flux from dialysate to patient. Magnesium loss was greater in patients receiving citrate. Exponential decline in magnesium concentrations was sufficiently rapid that subtherapeutic serum magnesium concentrations may occur well before detection when once-daily sampling is used. Measurements should be interpreted with regard to timing of magnesium infusions. We suggest that continuous renal replacement therapy fluids with higher magnesium concentrations be introduced in the critical care setting.
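The exponential-decay description above implies the standard first-order washout model; a sketch of the implied relations follows (the notation is ours, not the authors'):

```latex
% First-order approach of arterial [Mg] to a new steady state during CVVHDF:
\[
  [\mathrm{Mg}](t) = [\mathrm{Mg}]_{\mathrm{ss}}
    + \left([\mathrm{Mg}]_{0} - [\mathrm{Mg}]_{\mathrm{ss}}\right) e^{-kt},
  \qquad t_{1/2} = \frac{\ln 2}{k}
\]
% The reported median half-life of 4.73 h then corresponds to
% k = ln 2 / 4.73 h, approximately 0.147 per hour.
```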
2012-01-01
Background Pyrexia after stroke (temperature ≥37.5°C) is associated with poor prognosis, but information on the timing of body temperature changes and their relationship to stroke severity and subtypes varies. Methods We recruited patients with acute ischemic stroke, measured stroke severity and stroke subtype, and recorded four-hourly tympanic (body) temperature readings from admission to 120 hours after stroke. We sought causes of pyrexia and measured functional outcome at 90 days. We systematically summarised all relevant previous studies. Results Amongst 44 patients (21 males; mean age 72 years, SD 11) with median National Institutes of Health Stroke Scale (NIHSS) score 7 (range 0–28), 14 had total anterior circulation strokes (TACS). On admission all patients, both TACS and non-TACS, were normothermic (median 36.3°C vs 36.5°C respectively, p=0.382) at median 4 hours (interquartile range, IQR, 2–8) after stroke; admission temperature and NIHSS were not associated (r²=0.0, p=0.353). Peak temperature, occurring at 35.5 (IQR 19.0 to 53.8) hours after stroke, was higher in TACS (37.7°C) than non-TACS (37.1°C, p<0.001) and was associated with admission NIHSS (r²=0.20, p=0.002). Poor outcome (modified Rankin Scale ≥3) at 90 days was associated with higher admission (36.6°C vs. 36.2°C, p=0.031) and peak (37.4°C vs. 37.0°C, p=0.016) temperatures. Sixteen (36%) patients became pyrexial, in seven (44%) of whom we found no cause other than the stroke. Conclusions Normothermia is usual within the first 4 hours of stroke. Peak temperature occurs at 1.5 to 2 days after stroke, and is related to stroke severity/subtype and more closely associated with poor outcome than admission temperature. Temperature-outcome associations after stroke are complex, but normothermia on admission should not preclude randomisation of patients into trials of therapeutic hypothermia.
Runge, Charlotte; Børglum, Jens; Jensen, Jan Mick; Kobborg, Tina; Pedersen, Anette; Sandberg, Jon; Mikkelsen, Lone Ramer; Vase, Morten; Bendtsen, Thomas Fichtner
2016-01-01
Total knee arthroplasty (TKA) is associated with severe pain, and effective analgesia is essential for the quality of postoperative care and ambulation. The analgesic effects of adding an obturator nerve block (ONB) to a femoral triangle block (FTB) after TKA have not been tested previously. We hypothesized that combined ONB and FTB would reduce opioid consumption and pain compared with a single FTB or local infiltration analgesia (LIA). Seventy-eight patients were randomized to combined ONB and FTB, single FTB, or LIA after primary unilateral TKA. The primary outcome was morphine consumption during the first 24 postoperative hours. Secondary outcomes included morphine consumption during the first 48 postoperative hours, pain at rest and during passive knee flexion, nausea and vomiting, cumulated ambulation score, and Timed Up and Go test. Seventy-five patients were included in the analysis. The total intravenous morphine consumption during the first 24 postoperative hours was 2 mg (interquartile range [IQR], 0-15) in the combined ONB and FTB group, 20 mg (IQR, 10-26) in the FTB group (P = 0.0007), and 17 mg (IQR, 10-36) in the LIA group (P = 0.002). The combined ONB and FTB group displayed reduced pain, nausea, and vomiting compared with the other groups. The ambulation tests showed no statistically significant differences between the groups. Addition of ONB to FTB significantly reduced opioid consumption and pain after TKA compared with a single FTB or LIA, without impaired ambulation.
Dietrich, Janan Janine; Laher, Fatima; Hornschuh, Stefanie; Nkala, Busisiwe; Chimoyi, Lucy; Otwombe, Kennedy; Kaida, Angela; Gray, Glenda Elisabeth; Miller, Cari
2016-09-28
Internet access via mobile phones and computers facilitates interaction and potential health communication among individuals through social networking. Many South African adolescents own mobile phones and can access social networks via apps. We investigated sociodemographic factors and HIV risk behaviors of adolescent social networking users in Soweto, South Africa. We conducted an interviewer-administered, cross-sectional survey of adolescents aged 14-19 years. Independent covariates of social networking were assessed by multivariate logistic regression analysis. Of 830 adolescents, 57% (475/830) were females and the median age was 18 years (interquartile range 17-18). Social networking was used by 60% of adolescents (494/830); of these, 87% (396/494) accessed social networks through mobile phones and 56% (275/494) spent more than 4 hours per day using their mobile phones. Social networking was independently associated with mobile usage of 2-4 hours (adjusted odds ratio [AOR]: 3.06, CI: 1.69-5.51) and more than 4 hours per day (AOR: 6.16, CI: 3.46-10.9), and with having one sexual partner (AOR: 3.35, CI: 1.79-6.27) or more than one sexual partner (AOR: 2.58, CI: 1.05-6.36). Mobile phone-based social networking is prevalent among sexually active adolescents living in Soweto and may be used as an entry point for health promotion and initiation of low-cost adolescent health interventions.
Decontamination of stethoscope membranes with chlorhexidine: Should it be recommended?
Álvarez, José A; Ruíz, Susana R; Mosqueda, Juan L; León, Ximena; Arreguín, Virginia; Macías, Alejandro E; Macias, Juan H
2016-11-01
To determine differences in the recontamination of stethoscope membranes after cleaning with chlorhexidine, triclosan, or alcohol. Experimental, controlled, blinded trial to determine differences in the bacterial load on stethoscope membranes. Membranes were cultured by direct imprint after disinfection with 70% isopropyl alcohol, 1% triclosan, or 1% chlorhexidine and normal use for 4 hours. As a baseline and an immediate-effect control, the bacterial load of membranes without disinfection and after 1 minute of disinfection with isopropyl alcohol was determined as well. Three hundred seventy cultures of in-use stethoscopes were taken, 74 from each arm. In the baseline arm, the median growth was 10 CFU (interquartile range [IQR], 32-42 CFU), whereas in the isopropyl alcohol immediate-effect arm it was 0 CFU (IQR, 0-0 CFU). In the arms cultured after 4 hours, median growths of 8 CFU (IQR, 1-28 CFU) in the isopropyl alcohol arm, 4 CFU (IQR, 0-17 CFU) in the triclosan arm, and 0 CFU (IQR, 0-1 CFU) in the chlorhexidine arm were observed. No significant differences were observed between the bacterial load of the chlorhexidine arm (after 4 hours of use) and that of the isopropyl alcohol arm (after 1 minute without use) (Z = 2.41; P > .05). Chlorhexidine can inhibit recontamination of stethoscope membranes and its use could help avoid cross-infection. Copyright © 2016 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
Nitrous Oxide for Treatment-Resistant Major Depression: A Proof-of-Concept Trial.
Nagele, Peter; Duma, Andreas; Kopec, Michael; Gebara, Marie Anne; Parsoei, Alireza; Walker, Marie; Janski, Alvin; Panagopoulos, Vassilis N; Cristancho, Pilar; Miller, J Philip; Zorumski, Charles F; Conway, Charles R
2015-07-01
N-methyl-D-aspartate receptor antagonists, such as ketamine, have rapid antidepressant effects in patients with treatment-resistant depression (TRD). We hypothesized that nitrous oxide, an inhalational general anesthetic and N-methyl-D-aspartate receptor antagonist, may also be a rapidly acting treatment for TRD. In this blinded, placebo-controlled crossover trial, 20 patients with TRD were randomly assigned to 1-hour inhalation of 50% nitrous oxide/50% oxygen or 50% nitrogen/50% oxygen (placebo control). The primary endpoint was the change on the 21-item Hamilton Depression Rating Scale (HDRS-21) 24 hours after treatment. Mean duration of nitrous oxide treatment was 55.6 ± 2.5 (SD) min at a median inspiratory concentration of 44% (interquartile range, 37%-45%). In two patients, nitrous oxide treatment was briefly interrupted, and in three patients the treatment was discontinued. Depressive symptoms improved significantly at 2 hours and 24 hours after receiving nitrous oxide compared with placebo (mean HDRS-21 difference at 2 hours, -4.8 points, 95% confidence interval [CI], -7.8 to -1.8 points, p = .002; at 24 hours, -5.5 points, 95% CI, -8.5 to -2.5 points, p < .001; comparison between nitrous oxide and placebo, p < .001). Four patients (20%) had a treatment response (reduction ≥50% on HDRS-21) and three patients (15%) had a full remission (HDRS-21 ≤ 7 points) after nitrous oxide, compared with one patient (5%) and none, respectively, after placebo (odds ratio for response, 4.0, 95% CI, .45-35.79; odds ratio for remission, 3.0, 95% CI, .31-28.8). No serious adverse events occurred; all adverse events were brief and of mild to moderate severity. This proof-of-concept trial demonstrated that nitrous oxide has rapid and marked antidepressant effects in patients with TRD. Copyright © 2015 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
Time to Appendectomy and Risk of Complicated Appendicitis and Adverse Outcomes in Children.
Serres, Stephanie K; Cameron, Danielle B; Glass, Charity C; Graham, Dionne A; Zurakowski, David; Karki, Mahima; Anandalwar, Seema P; Rangel, Shawn J
2017-08-01
Management of appendicitis as an urgent rather than emergency procedure has become an increasingly common practice in children. Controversy remains as to whether this practice is associated with increased risk of complicated appendicitis and adverse events. To examine the association between time to appendectomy (TTA) and risk of complicated appendicitis and postoperative complications. In this retrospective cohort study using the Pediatric National Surgical Quality Improvement Program appendectomy pilot database, 2429 children younger than 18 years who underwent appendectomy within 24 hours of presentation at 23 children's hospitals from January 1, 2013, through December 31, 2014, were studied. The main exposure was TTA, defined as the time from emergency department presentation to appendectomy. Patients were further categorized into early and late TTA groups based on whether their TTA was shorter or longer than their hospital's median TTA. Exposures were defined in this manner to compare rates of complicated appendicitis within a time frame sensitive to each hospital's existing infrastructure and diagnostic practices. The primary outcome was complicated appendicitis documented at operation. The association between treatment delay and complicated appendicitis was examined across all hospitals by using TTA as a continuous variable and at the level of individual hospitals by using TTA as a categorical variable comparing outcomes between late and early TTA groups. Secondary outcomes included length of stay (LOS) and postoperative complications (incisional and organ space infections, percutaneous drainage procedures, unplanned reoperation, and hospital revisits). Of the 6767 patients who met the inclusion criteria, 2429 were included in the analysis (median age, 10 years; interquartile range, 8-13 years; 1467 [60.4%] male). Median hospital TTA was 7.4 hours (range, 5.0-19.2 hours), and 574 patients (23.6%) were diagnosed with complicated appendicitis (range, 5.2%-51.1% across hospitals). In multivariable analyses, increasing TTA was not associated with risk of complicated appendicitis (odds ratio per 1-hour increase in TTA, 0.99; 95% CI, 0.97-1.02). The odds ratios of complicated appendicitis for late vs early TTA across hospitals ranged from 0.39 to 9.63, and only 1 of the 23 hospitals had a statistically significant increase in their late TTA group (odds ratio, 9.63; 95% CI, 1.08-86.17; P = .03). Increasing TTA was associated with longer LOS (increase in mean LOS for each additional hour of TTA, 0.06 days; 95% CI, 0.03-0.08 days; P < .001) but was not associated with increased risk of any of the other secondary outcomes. Delay of appendectomy within 24 hours of presentation was not associated with increased risk of complicated appendicitis or adverse outcomes. These results support the premise that appendectomy can be safely performed as an urgent rather than emergency procedure.
Jang, Sae; Vanderpool, Rebecca R; Avazmohammadi, Reza; Lapshin, Eugene; Bachman, Timothy N; Sacks, Michael; Simon, Marc A
2017-09-12
Right ventricular (RV) diastolic function has been associated with outcomes for patients with pulmonary hypertension; however, the relationship between biomechanics and hemodynamics in the right ventricle has not been studied. Rat models of RV pressure overload were obtained via pulmonary artery banding (PAB; control, n=7; PAB, n=5). At 3 weeks after banding, RV hemodynamics were measured using a conductance catheter. Biaxial mechanical properties of the RV free wall myocardium were obtained to extrapolate longitudinal and circumferential elastic moduli in the low and high strain regions (E1 and E2, respectively). Hemodynamic analysis revealed significantly increased end-diastolic elastance (Eed) in PAB (control: 55.1 mm Hg/mL [interquartile range: 44.7-85.4 mm Hg/mL]; PAB: 146.6 mm Hg/mL [interquartile range: 105.8-155.0 mm Hg/mL]; P=0.010). Longitudinal E1 was increased in PAB (control: 7.2 kPa [interquartile range: 6.7-18.1 kPa]; PAB: 34.2 kPa [interquartile range: 18.1-44.6 kPa]; P=0.018), whereas there were no significant changes in longitudinal E2 or circumferential E1 and E2. Last, wall stress was calculated from hemodynamic data by modeling the right ventricle as a sphere: stress = (pressure × radius) / (2 × thickness). RV pressure overload in PAB rats resulted in an increase in diastolic myocardial stiffness reflected both hemodynamically, by an increase in Eed, and biomechanically, by an increase in longitudinal E1. Modest increases in tissue biomechanical stiffness are associated with large increases in Eed. Hemodynamic measurements of RV diastolic function can be used to predict biomechanical changes in the myocardium. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
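The spherical wall-stress expression above is the standard thin-walled Laplace relation; for clarity, the reconstructed form is:

```latex
% Laplace wall stress for a thin-walled sphere, as used for the RV model:
\[
  \sigma = \frac{P\,r}{2h}
\]
% P = cavity pressure, r = cavity radius, h = wall thickness.
```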
Krieger, Eric V; Clair, Mathieu; Opotowsky, Alexander R; Landzberg, Michael J; Rhodes, Jonathan; Powell, Andrew J; Colan, Steven D; Valente, Anne Marie
2013-02-01
The role of exercise testing to risk stratify patients with repaired coarctation of the aorta (CoA) is controversial. Concentric left ventricular (LV) hypertrophy, defined as an increase in the LV mass-to-volume ratio (MVR), is associated with a greater incidence of adverse cardiovascular events. The objective of the present study was to determine whether a hypertensive response to exercise (HRE) is associated with increased LVMVR in patients with repaired CoA. Adults with repaired CoA who had a symptom-limited exercise test and cardiac magnetic resonance imaging examination within 2 years were identified. A hypertensive response to exercise was defined as a peak systolic blood pressure >220 mm Hg during a symptom-limited exercise test. The LV mass and volume were measured using cardiac magnetic resonance by an investigator who was unaware of patient status. We included 47 patients (median age 27.3 years, interquartile range 19.8 to 37.3), who had undergone CoA repair at a median age of 4.6 years (interquartile range 0.4 to 15.7). Those with (n = 11) and without (n = 36) HRE did not differ in age, age at repair, body surface area, arm-to-leg systolic blood pressure gradient, gender, or peak oxygen uptake with exercise. Those with a HRE had a greater mean systolic blood pressure at rest (146 ± 18 vs 137 ± 18 mm Hg, p = 0.04) and greater median LVMVR (0.85, interquartile range 0.7 to 1, vs 0.66, interquartile range 0.6 to 0.7; p = 0.04) than those without HRE. Adjusting for systolic blood pressure at rest, age, age at repair, and gender, the relation between HRE and LVMVR remained significant (p = 0.001). In conclusion, HRE was associated with increased LVMVR, even after adjusting for multiple covariates. Copyright © 2013 Elsevier Inc. All rights reserved.
Measured degree of dehydration in children and adolescents with type 1 diabetic ketoacidosis.
Ugale, Judith; Mata, Angela; Meert, Kathleen L; Sarnaik, Ashok P
2012-03-01
Successful management of diabetic ketoacidosis depends on adequate rehydration while avoiding cerebral edema. Our objectives are to 1) measure the degree of dehydration in children with type 1 diabetes mellitus and diabetic ketoacidosis based on change in body weight; and 2) investigate the relationships between measured degree of dehydration and clinically assessed degree of dehydration, severity of diabetic ketoacidosis, and routine serum laboratory values. Prospective observational study. University-affiliated tertiary care children's hospital. Sixty-six patients <18 yrs of age with type 1 diabetic ketoacidosis. Patients were weighed using a portable scale at admission; 8, 16, and 24 hrs; and daily until discharge. Measured degree of dehydration was based on the difference between admission and plateau weights. Clinical degree of dehydration was assessed by physical examination and severity of diabetic ketoacidosis was assessed by blood gas values as defined by international guidelines. Laboratory values obtained on admission included serum glucose, urea nitrogen, sodium, and osmolality. Median measured degree of dehydration was 5.2% (interquartile range, 3.1% to 7.8%). Fourteen (21%) patients were clinically assessed as mild dehydration, 49 (74%) as moderate, and three (5%) as severe. Patients clinically assessed as moderately dehydrated had a greater measured degree of dehydration (5.8%; interquartile range, 3.6% to 9.6%) than those assessed as mildly dehydrated (3.7%; interquartile range, 2.3% to 6.4%) or severely dehydrated (2.5%; interquartile range, 2.3% to 2.6%). Nine (14%) patients were assessed as mild diabetic ketoacidosis, 18 (27%) as moderate, and 39 (59%) as severe. Diabetic ketoacidosis severity groups did not differ in measured degree of dehydration. Variables independently associated with measured degree of dehydration included serum urea nitrogen and sodium concentration on admission. Hydration status in children with diabetic ketoacidosis cannot be accurately assessed by physical examination or blood gas values. Fluid therapy based on maintenance plus 6% deficit replacement is reasonable for most patients.
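A minimal sketch of the weight-based dehydration measure described above; the denominator choice (rehydrated plateau weight) and the example weights are assumptions, since the abstract does not state the formula explicitly:

```python
# Measured degree of dehydration from admission and plateau (rehydrated)
# body weights. Denominator assumed to be the plateau weight.
def dehydration_pct(admission_kg: float, plateau_kg: float) -> float:
    return (plateau_kg - admission_kg) / plateau_kg * 100.0

print(dehydration_pct(admission_kg=28.5, plateau_kg=30.0))  # -> 5.0 (%)
```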
Mulligan, Angela A; Kuhnle, Gunter G C; Lentjes, Marleen A H; van Scheltinga, Veronica; Powell, Natasha A; McTaggart, Alison; Bhaniani, Amit; Khaw, Kay-Tee
2013-08-01
A diet rich in phyto-oestrogens has been suggested to protect against a variety of common diseases, but UK intake data on phyto-oestrogens or their food sources are sparse. The present study estimates the average intakes of isoflavones, lignans, enterolignans and coumestrol from 7 d food diaries and provides data on total isoflavone, lignan and phyto-oestrogen consumption by food group. Development of a food composition database for twelve phyto-oestrogens and analysis of soya food and phyto-oestrogen consumption in a population-based study. Men and women, aged 40–79 years, from the general population participating in the Norfolk arm of the European Prospective Investigation into Cancer and Nutrition (EPIC-Norfolk) between 1993 and 1997, with nutrient and food data from 7 d food diaries. A subset of 20 437 participants. The median daily phyto-oestrogen intake for all men was 1199 μg (interquartile range 934–1537 μg; mean 1504 μg, SD 1502 μg) and 888 μg for all women (interquartile range 710–1135 μg; mean 1205 μg, SD 1701 μg). In soya consumers, median daily intakes were higher: 2861 μg in men (interquartile range 1304–7269 μg; mean 5051 μg, SD 5031 μg) and 3142 μg in women (interquartile range 1089–7327 μg; mean 5396 μg, SD 6092 μg). In both men and women, bread made the greatest contribution to phyto-oestrogen intake (40.8% and 35.6%, respectively). In soya consumers, vegetable dishes and soya/goat's/sheep's milks were the main contributors (45.7% and 21.3% in men and 38.4% and 33.7% in women, respectively). The ability to estimate phyto-oestrogen intake in Western populations more accurately will aid investigations into their suggested effects on health.
Chen, Hung-Yuan; Chiang, Chih-Kang; Wang, Hsi-Hao; Hung, Kuan-Yu; Lee, Yue-Joe; Peng, Yu-Sen; Wu, Kwan-Dun; Tsai, Tun-Jun
2008-08-01
Greater than 50% of dialysis patients experience sleep disturbances. Cognitive-behavioral therapy (CBT) is effective for treating chronic insomnia, but its effectiveness has never been reported in peritoneal dialysis (PD) patients and its association with cytokines is unknown. We investigated the effectiveness of CBT in PD patients by assessing changes in sleep quality and inflammatory cytokines. Randomized controlled study with a parallel-group design. 24 PD patients with insomnia in a tertiary medical center without active medical and psychiatric illness were enrolled. The intervention group (N = 13) received CBT from a psychiatrist for 4 weeks and sleep hygiene education, whereas the control group (N = 11) received only sleep hygiene education. Primary outcomes were changes in the Pittsburgh Sleep Quality Index and Fatigue Severity Scale scores, and secondary outcomes were changes in serum interleukin 6 (IL-6), IL-1beta, IL-18, and tumor necrosis factor alpha levels during the 4-week trial. Median percentages of change in global Pittsburgh Sleep Quality Index scores were -14.3 (interquartile range, -35.7 to -6.3) and -1.7 (interquartile range, -7.6 to 7.8) in the intervention and control groups, respectively (P = 0.3). Median percentages of change in global Fatigue Severity Scale scores were -12.1 (interquartile range, -59.8 to -1.5) and -10.5 (interquartile range, -14.3 to 30.4) in the intervention and control groups, respectively (P = 0.04). Serum IL-1beta level decreased in the intervention group, but increased in the control group (P = 0.04). There were no significant differences in changes in other cytokines. This study had a small number of participants and a short observation period, and some participants concurrently used hypnotics. CBT may be effective for improving the quality of sleep and decreasing fatigue and inflammatory cytokine levels. CBT can be an effective nonpharmacological therapy for PD patients with sleep disturbances.
Index to Estimate the Efficiency of an Ophthalmic Practice.
Chen, Andrew; Kim, Eun Ah; Aigner, Dennis J; Afifi, Abdelmonem; Caprioli, Joseph
2015-08-01
A metric of efficiency, a function of the ratio of quality to cost per patient, will allow the health care system to better measure the impact of specific reforms and compare the effectiveness of each. To develop and evaluate an efficiency index that estimates the performance of an ophthalmologist's practice as a function of cost, number of patients receiving care, and quality of care. Retrospective review of 36 ophthalmology subspecialty practices from October 2011 to September 2012 at a university-based eye institute. The efficiency index (E) was defined as a function of the adjusted number of patients (Na), total practice adjusted costs (Ca), and a preliminary measure of quality (Q). Constant b limits E between 0 and 1. Constant y modifies the influence of Q on E. Relative value units and geographic cost indices determined by the Centers for Medicare and Medicaid for 2012 were used to calculate adjusted costs. The efficiency index is expressed as the following: E = b(Na/Ca)Q^y. Independent, masked auditors reviewed 20 random patient medical records for each practice and filled out 3 questionnaires to obtain a process-based quality measure. The adjusted number of patients, adjusted costs, quality, and efficiency index were calculated for 36 ophthalmology subspecialties. The median adjusted number of patients was 5516 (interquartile range, 3450-11,863), the median adjusted cost was 1.34 (interquartile range, 0.99-1.96), the median quality was 0.89 (interquartile range, 0.79-0.91), and the median value of the efficiency index was 0.26 (interquartile range, 0.08-0.42). The described efficiency index is a metric that provides a broad overview of performance for a variety of ophthalmology specialties as estimated by resources used and a preliminary measure of quality of care provided. The results of the efficiency index could be used in future investigations to determine its sensitivity to detect the impact of interventions on a practice, such as training modules or practice restructuring.
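A small numeric sketch of the index as defined above; the calibration constants b and y are not reported in the abstract, so the values below are placeholders chosen only to keep E within 0 and 1:

```python
# Efficiency index E = b * (Na / Ca) * Q**y, with illustrative inputs
# drawn from the abstract's medians; b and y are assumed placeholders.
def efficiency_index(n_a: float, c_a: float, q: float, b: float, y: float) -> float:
    return b * (n_a / c_a) * q ** y

print(round(efficiency_index(n_a=5516, c_a=1.34, q=0.89, b=1e-4, y=1.0), 2))
```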
Abnormal environmental light exposure in the intensive care environment.
Fan, Emily P; Abbott, Sabra M; Reid, Kathryn J; Zee, Phyllis C; Maas, Matthew B
2017-08-01
We sought to characterize ambient light exposure in the intensive care unit (ICU) environment to identify patterns of light exposure relevant to circadian regulation. A light monitor was affixed to subjects' beds at eye level in a modern intensive care unit and continuously recorded illuminance for at least 24 h per subject. Blood was sampled hourly and measured for plasma melatonin. Subjects underwent hourly vital sign and bedside neurologic assessments. Care protocols and the ICU environment were not modified for the study. A total of 67,324 30-second epochs of light data were collected from 17 subjects. Light intensity peaked in the late morning, median 64.1 (interquartile range 19.7-138.7) lux. The 75th percentile of light intensity exceeded 100 lx only between 9 AM and noon, and never exceeded 150 lx. There was no correlation between melatonin amplitude and daytime, nighttime, or total light exposure (Spearman's correlation coefficients all <0.2 and p>0.5). Patients' environmental light exposure in the intensive care unit is consistently low and follows a diurnal pattern. No effect of nighttime light exposure was observed on melatonin secretion. Inadequate daytime light exposure in the ICU may contribute to abnormal circadian rhythms. Copyright © 2017 Elsevier Inc. All rights reserved.
Extended Infusion of Piperacillin/Tazobactam in Children.
Knoderer, Chad A; Karmire, Lauren C; Andricopulos, Katie L; Nichols, Kristen R
2017-01-01
Extended-infusion piperacillin/tazobactam (TZP) has been associated with positive clinical outcomes in adults, but similar data in children are lacking. The objective of this study was to describe efficacy outcomes in pediatric patients receiving extended-infusion TZP. This was a retrospective case series of children aged 1 month to 17 years who had documented Gram-negative infection and received extended-infusion TZP between April 2011 and March 2012. The primary outcome was 21-day clinical cure, defined as negative follow-up cultures, where available, and infection resolution. Fifty children with a median (interquartile range [IQR]) age of 5 (2-9) years were included in the study. Patients received a median (IQR) TZP dose of 111.4 (100-112.5) mg/kg administered every 8 hours over 4 hours. Clinical and microbiologic cure were observed in 74% and 100% of patients, respectively. Patients not meeting criteria for 21-day clinical cure were younger (1 vs 7 years, p = 0.087) and had a longer length of hospital stay (23 vs 11 days, p = 0.037). The majority of children in this cohort achieved 21-day clinical cure with extended-infusion TZP. Those without clinical cure tended to be younger and critically ill. Additional comparative studies evaluating traditional and extended-infusion TZP in children are needed.
Evaluation of delivery of enteral nutrition in mechanically ventilated Malaysian ICU patients.
Yip, Keng F; Rai, Vineya; Wong, Kang K
2014-01-01
There are numerous challenges in providing nutrition to the mechanically ventilated critically ill ICU patient. Understanding the level of nutritional support and the barriers to enteral feeding in mechanically ventilated patients is important to maximise the nutritional benefits to critically ill patients. Thus, this study aims to evaluate enteral nutrition delivery and identify the reasons for interruptions in mechanically ventilated Malaysian patients receiving enteral feeding. A cross-sectional prospective study of 77 consecutive patients who required mechanical ventilation and were receiving enteral nutrition was done in an open 14-bed intensive care unit of a tertiary hospital. Data were collected prospectively over a 3-month period. Descriptive statistical analyses were made with respect to demographic data, time taken to initiate feeds, type of feeds, quantification of feeds attainment, and reasons for feed interruptions. There were no set feeding protocols in the ICU. The usual initial rate of enteral nutrition observed in the ICU was 20 ml/hour, assessed every 6 hours, and the decision was made thereafter to increase feeds. The target calorie intake for each patient was determined by the clinician alongside the dietitian. Prokinetic agents were prescribed at the discretion of the attending clinician, commonly IV metoclopramide 10 mg three times a day. About 66% of patients achieved 80% of caloric requirements within 3 days, of whom 46.8% achieved full feeds in less than 12 hours. The time to initiate feeds for patients admitted into the ICU ranged from 0 to 110 hours, with a median time to start feeds of 15 hours and an interquartile range (IQR) of 6-59 hours. The mean time to achieve at least 80% of the nutritional target was 1.8 ± 1.5 days. About 79% of patients experienced multiple feeding interruptions. The most prevalent reason for interruption was procedures (45.1%), followed by high gastric residual volume (38.0%), diarrhoea (8.4%), difficulty in nasogastric tube placement (5.6%) and vomiting (2.9%). Nutritional inadequacy in mechanically ventilated Malaysian patients receiving enteral nutrition was not as common as expected. However, there is still room for improvement with regard to decreasing the number of patients who did not achieve their caloric requirement throughout their stay in the ICU.
Rietberg, Marc B; van Wegen, Erwin E; Uitdehaag, Bernard M; de Vet, Henrica C; Kwakkel, Gert
2010-10-01
To determine the reproducibility of 24-hour monitoring of motor activity in patients with multiple sclerosis (MS). Test-retest design; 6 research assistants visited the participants twice within 1 week in the home situation. General community. A convenience sample of ambulatory patients (N=43; mean age ± SD, 48.7±7.0y; 30 women; median Expanded Disability Status Scale score, 3.5; interquartile range, 2.5) was recruited from the outpatient clinic of a university medical center. Not applicable. Dynamic and static activity parameters were recorded by using a portable data logger and classified continuously for 24 hours. Reproducibility was determined by calculating intraclass correlation coefficients (ICCs) for test-retest reliability and by applying the Bland-Altman method for agreement between the 2 measurements. The smallest detectable change (SDC) was calculated based on the standard error of measurement. Test-retest reliability expressed by the ICC(agreement) was .72 for dynamic activity, .74 for transitions, .77 for walking, .71 for static activity, .67 for sitting, .62 for standing, and .55 for lying. Bland-Altman analysis indicated no systematic differences between the first and second assessments for dynamic and static activity. Measurement error expressed by the SDC was 1.23 for dynamic activity, 66 for transitions, .99 for walking, 1.52 for static activity, 4.68 for lying, 3.95 for sitting, and 3.34 for standing. The current study shows that with 24-hour monitoring, a reproducible estimate of physical activity can be obtained in ambulatory patients with MS. Copyright © 2010 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
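For reference, the SDC values above derive from the standard error of measurement in the conventional way (SDC = 1.96 × √2 × SEM). A minimal sketch of that computation, using illustrative inputs rather than the study's raw data, might look like:

```python
import math

def smallest_detectable_change(sd: float, icc: float) -> float:
    """Conventional agreement formulas: SEM = SD * sqrt(1 - ICC),
    SDC = 1.96 * sqrt(2) * SEM. Inputs are illustrative, not study data."""
    sem = sd * math.sqrt(1.0 - icc)
    return 1.96 * math.sqrt(2.0) * sem

# Hypothetical pooled SD of 1.2 units combined with the reported ICC of .72:
print(round(smallest_detectable_change(sd=1.2, icc=0.72), 2))
```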
Macha, Kosmas; Volbers, Bastian; Bobinger, Tobias; Kurka, Natalia; Breuer, Lorenz; Huttner, Hagen B; Schwab, Stefan; Köhrmann, Martin
2016-09-01
Direct oral anticoagulants (DOACs) are increasingly used for secondary prevention of cardioembolic stroke. While DOACs are associated with a reduced long-term risk of intracranial hemorrhage compared with vitamin K antagonists, pivotal trials avoided the very early period after stroke, and few data exist on early initiation of DOAC therapy after stroke. We retrospectively analyzed data from our prospective database of all consecutive transient ischemic attack (TIA) or ischemic stroke patients with atrial fibrillation treated with DOACs during hospital stay. As per our institutional treatment algorithm for patients with cardioembolic ischemia, DOACs are started immediately in TIA and minor stroke (group 1); within days 3-5 in patients with infarcts affecting one third or less of the middle cerebral artery (MCA), anterior cerebral artery, or posterior cerebral artery territories (group 2), as well as in infratentorial stroke (group 3); and after 1-2 weeks in patients with large infarcts (>⅓ of the MCA territory, group 4). We investigated baseline characteristics, time to initiation of DOAC therapy after symptom onset, and hemorrhagic complications. In 243 included patients, DOAC administration was initiated 40.5 hours (interquartile range [IQR] 23.0-65.5) after stroke onset in group 1 (n = 41) and after 76.7 hours (IQR 48.0-134.0), 108.4 hours (IQR 67.3-176.4), and 161.8 hours (IQR 153.9-593.8) in groups 2-4 (n = 170, 28, and 4), respectively. Two cases of asymptomatic intracranial hemorrhage (0.8%) and 1 case of symptomatic intracranial hemorrhage (0.4%) were observed, all in group 2. No severe safety issues were observed with early initiation of DOACs for secondary prevention after acute stroke in our in-patient cohort. Copyright © 2016 National Stroke Association. Published by Elsevier Inc. All rights reserved.
Chaisson, Lelia H; Roemer, Marguerite; Cantu, David; Haller, Barbara; Millman, Alexander J; Cattamanchi, Adithya; Davis, J Lucian
2014-11-15
Placing inpatients with presumed active pulmonary tuberculosis in respiratory isolation pending results of serial sputum acid-fast bacilli (AFB) smear microscopy is standard practice in high-income countries. However, this diagnostic strategy is slow and yields few tuberculosis diagnoses. We sought to determine if replacing microscopy with the GeneXpert MTB/RIF (Xpert) nucleic acid amplification assay could reduce testing time and usage of isolation rooms. We prospectively followed inpatients at San Francisco General Hospital undergoing tuberculosis evaluation. We performed smear microscopy and Xpert testing on concentrated sputum, and calculated diagnostic accuracy for both strategies in reference to serial sputum mycobacterial culture. We measured turnaround time for microscopy and estimated hypothetical turnaround times for Xpert on concentrated and unconcentrated sputum. We compared median and total isolation times for microscopy to those estimated for the 2 Xpert strategies. Among 139 patients with 142 admissions, median age was 54 years (interquartile range [IQR], 43-60 years); 32 (23%) patients were female, and 42 (30%) were HIV seropositive. Serial sputum smear microscopy and a single concentrated sputum Xpert had identical sensitivity (89%; 95% confidence interval [CI], 52%-100%) and similar specificity (99% [95% CI, 96%-100%] vs 100% [95% CI, 97%-100%]). A single concentrated sputum Xpert could have saved a median of 35 hours (IQR, 24-36 hours) in unnecessary isolation compared with microscopy, and a single unconcentrated sputum Xpert, 45 hours (IQR, 35-46 hours). Replacing serial sputum smear microscopy with a single sputum Xpert could eliminate most unnecessary isolation for inpatients with presumed tuberculosis, greatly benefiting patients and hospitals. © The Author 2014. Published by Oxford University Press on behalf of the Infectious Diseases Society of America. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Maas, Matthew B; Naidech, Andrew M; Kim, Minjee; Batra, Ayush; Manno, Edward M; Sorond, Farzaneh A; Prabhakaran, Shyam; Liotta, Eric M
2018-05-01
We evaluated whether reduced platelet activity detected by point-of-care (POC) testing is a better predictor of hematoma expansion and poor functional outcomes in patients with intracerebral hemorrhage (ICH) than a history of antiplatelet medication exposure. Patients presenting with spontaneous ICH were enrolled in a prospective observational cohort study that collected demographic, clinical, laboratory, and radiographic data. We measured platelet activity using the PFA-100 (Siemens AG, Germany) and VerifyNow-ASA (Accumetrics, CA) systems on admission. We performed univariate and adjusted multivariate analyses to assess the strength of association between those measures and (1) hematoma growth at 24 hours and (2) functional outcomes measured by the modified Rankin Scale (mRS) at 3 months. We identified 278 patients for analysis (mean age 65 ± 15 years, median ICH score 1 [interquartile range 0-2]), among whom 164 underwent initial neuroimaging within 6 hours of symptom onset. The univariate association with hematoma growth was stronger for antiplatelet medication history than for POC measures, which was confirmed in multivariable models (β 3.64 [95% confidence interval (CI), 1.02-6.26], P = .007), with a larger effect size measured in the under 6-hour subgroup (β 7.20 [95% CI, 3.35-11.1], P < .001). Moreover, antiplatelet medication history, but not POC measures of platelet activity, was independently associated with poor outcome at 3 months (mRS 4-6) in the under 6-hour subgroup (adjusted OR 3.6 [95% CI, 1.2-11], P = .023). A history of antiplatelet medication use better identifies patients at risk for hematoma growth and poor functional outcomes than POC measures of platelet activity after spontaneous ICH. Copyright © 2018 National Stroke Association. Published by Elsevier Inc. All rights reserved.
Taylor, Kathryn S.; Heneghan, Carl J.; Stevens, Richard J.; Adams, Emily C.; Nunan, David; Ward, Alison
2015-01-01
In addition to mean blood pressure, blood pressure variability is hypothesized to have important prognostic value in evaluating cardiovascular risk. We aimed to assess the prognostic value of blood pressure variability within 24 hours. Searching MEDLINE, EMBASE and the Cochrane Library to April 2013, we conducted a systematic review of prospective studies of adults with at least one year of follow-up and any day, night or 24-hour blood pressure variability measure as a predictor of one or more of the following outcomes: all-cause mortality, cardiovascular mortality, all cardiovascular events, stroke and coronary heart disease. We examined how blood pressure variability is defined and how its prognostic use is reported. We analysed relative risks adjusted for covariates, including the appropriate mean blood pressure, and considered the potential for meta-analysis. Our analysis of methods included 24 studies and our analysis of predictions included 16 studies. There were 36 different measures of blood pressure variability and 13 definitions of night- and day-time periods. Median follow-up was 5.5 years (interquartile range 4.2–7.0). Among measures of dispersion, the coefficient of variation was less well researched than the standard deviation. Night dipping based on percentage change was the most researched measure and the only measure for which data could be meaningfully pooled. Night dipping or lower night-time blood pressure was associated with lower risk of cardiovascular events. The interpretation and use in clinical practice of 24-hour blood pressure variability, as an important prognostic indicator of cardiovascular events, is hampered by insufficient evidence and divergent methodologies. We recommend greater standardisation of methods. PMID:25984791
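As a rough illustration of the dispersion and dipping measures the review compares, the sketch below implements three common definitions (standard deviation, coefficient of variation, and percentage nocturnal dip); published variants differ, and the readings here are hypothetical:

```python
import statistics

def bp_variability(day_sbp, night_sbp):
    """Common 24-hour blood pressure variability measures; definitions
    vary across studies, and these are the usual textbook forms."""
    readings = list(day_sbp) + list(night_sbp)
    sd = statistics.stdev(readings)                   # dispersion, mmHg
    cv = 100.0 * sd / statistics.mean(readings)       # dispersion, % of mean
    day_mean = statistics.mean(day_sbp)
    night_mean = statistics.mean(night_sbp)
    dip = 100.0 * (day_mean - night_mean) / day_mean  # nocturnal dip, %
    return sd, cv, dip

# Hypothetical systolic readings (mmHg); a dip of >=10% is conventionally a "dipper".
print(bp_variability([135, 142, 138, 140], [120, 118, 124]))
```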
Hermes, Wendy A; Alvarez, Jessica A; Lee, Moon J; Chesdachai, Supavit; Lodin, Daud; Horst, Ron; Tangpricha, Vin
2017-08-01
There is little consensus on the most efficacious vehicle substance for vitamin D supplements. Fat malabsorption may impede the ability of patients with cystic fibrosis (CF) to absorb vitamin D in an oil vehicle. We hypothesized that vitamin D contained in a powder vehicle would be absorbed more efficiently than vitamin D contained in an oil vehicle in patients with CF. In this double-blind, randomized controlled trial, hospitalized adults with CF were given a one-time bolus dose of 100,000 IU of cholecalciferol (D3) in a powder-based or oil-based vehicle. Serum D3, 25-hydroxyvitamin D, and parathyroid hormone concentrations were analyzed at 0, 12, 24, and 48 hours posttreatment. The area under the curve for serum D3 and the 12-hour time point were also assessed as indicators of D3 absorption. This trial was completed by 15 patients with CF. The median (interquartile range) age, body mass index, and forced expiratory volume in 1 second were 23.7 (19.9-33.2) years, 19.9 (18.6-22.6) kg/m², and 63% (37%-80%), respectively. The increase in serum D3 and the area under the curve were greater in the powder group (P = .002 and P = .036, respectively). Serum D3 was higher at 12 hours in the powder group compared with the oil group (P = .002), although levels were similar between groups by 48 hours. In adults with CF, cholecalciferol is more efficiently absorbed in a powder compared with an oil vehicle. Physicians should consider prescribing vitamin D in a powder vehicle for patients with CF to improve the absorption of vitamin D from supplements.
Evaluation of Antimicrobial Stewardship-Related Alerts Using a Clinical Decision Support System.
Ghamrawi, Riane J; Kantorovich, Alexander; Bauer, Seth R; Pallotta, Andrea M; Sekeres, Jennifer K; Gordon, Steven M; Neuner, Elizabeth A
2017-11-01
Background: Information technology, including clinical decision support systems (CDSS), has an increasingly important role in identifying opportunities for antimicrobial stewardship-related interventions. Objective: The aim of this study was to describe and compare the types and outcomes of CDSS-built antimicrobial stewardship alerts. Methods: Fifteen alerts were evaluated in the initial antimicrobial stewardship program (ASP) review. Preimplementation, alerts were reviewed retrospectively; postimplementation, alerts were reviewed in real time. Data collection included the total number of actionable alerts, recommendation acceptance rates, and time spent on each alert. Time to de-escalation to narrower-spectrum agents was also collected. Results: In total, 749 alerts were evaluated. Overall, 306 (41%) alerts were actionable (173 preimplementation, 133 postimplementation). Rates of actionable alerts were similar for custom-built and prebuilt alert types (39% [53 of 135] vs 41% [253 of 614], P = .68). In the postimplementation group, an intervention was attempted in 97% of actionable alerts and 70% of interventions were accepted. The median time spent per alert was 7 minutes (interquartile range [IQR], 5-13 minutes; 15 [12-17] minutes for actionable alerts vs 6 [5-7] minutes for nonactionable alerts, P < .001). In cases where the antimicrobial was eventually de-escalated, the median time to de-escalation was 28.8 hours (95% confidence interval [CI], 10.0-69.1 hours) preimplementation vs 4.7 hours (95% CI, 2.4-22.1 hours) postimplementation, P < .001. Conclusions: CDSS have played an important role in ASPs, helping to identify opportunities to optimize antimicrobial use through prebuilt and custom-built alerts. As ASP roles continue to expand, focusing time on customizing institution-specific alerts will be vital to help redistribute the time needed to manage other ASP tasks and opportunities.
Emergency department recidivism in adults older than 65 years treated for fractures.
Southerland, Lauren T; Richardson, Daniel S; Caterino, Jeffrey M; Essenmacher, Alex C; Swor, Robert A
2014-09-01
Fractures in older adults are a commonly diagnosed injury in the emergency department (ED). We performed a retrospective medical record review to determine the rate of return to the same ED within 72 hours (returns) and the risk factors associated with returning. The review covered patients at least 65 years old discharged from a large, academic ED with a new diagnosis of upper extremity, lower extremity, or rib fractures. Risk factors analyzed included demographic data, type of fracture, analgesic prescriptions, assistive devices provided, other concurrent injuries, and comorbidities (Charlson Comorbidity Index). Our primary outcome was return to the ED within 72 hours. Three hundred fifteen patients qualified. Most fractures were in the upper extremity (64% [95% confidence interval {CI}, 58%-69%]). Twenty patients (6.3% [95% CI, 3.9%-9.6%]) returned within 72 hours. Most returns (15/20, 75%) were for reasons associated with the fracture itself, such as cast problems and inadequate pain control. Only 3 patients (<1% of all patients) returned for cardiac etiologies. Patients with distal forearm fractures had higher return rates (10.7% vs 4.5%, P = .03) and most commonly returned for cast or splint problems. Age, sex, other injuries, assistive devices, and Charlson Comorbidity Index score (median, 1 [interquartile range, 1-2] for both groups) did not predict 72-hour returns. Older adults with distal forearm fractures may have more unscheduled health care usage in the first 3 days after fracture diagnosis than older adults with other fracture types. Overall, revisits for cardiac reasons or repeat falls were rare (<1%). Copyright © 2014 Elsevier Inc. All rights reserved.
Dobinson, Hazel C; Gibani, Malick M; Jones, Claire; Thomaides-Brears, Helena B; Voysey, Merryn; Darton, Thomas C; Waddington, Claire S; Campbell, Danielle; Milligan, Iain; Zhou, Liqing; Shrestha, Sonu; Kerridge, Simon A; Peters, Anna; Stevens, Zoe; Podda, Audino; Martin, Laura B; D'Alessio, Flavia; Thanh, Duy Pham; Basnyat, Buddha; Baker, Stephen; Angus, Brian; Levine, Myron M; Blohmke, Christoph J; Pollard, Andrew J
2017-04-15
To expedite the evaluation of vaccines against paratyphoid fever, we aimed to develop the first human challenge model of Salmonella enterica serovar Paratyphi A infection. Two groups of 20 participants underwent oral challenge with S. Paratyphi A following sodium bicarbonate pretreatment at 1 of 2 dose levels (group 1: 1-5 × 10³ colony-forming units [CFU] and group 2: 0.5-1 × 10³ CFU). Participants were monitored in an outpatient setting with daily clinical review and collection of blood and stool cultures. Antibiotic treatment was started when prespecified diagnostic criteria were met (temperature ≥38°C for ≥12 hours and/or bacteremia) or at day 14 postchallenge. The primary study objective was achieved following challenge with 1-5 × 10³ CFU (group 1), which resulted in an attack rate of 12 of 20 (60%). Compared with typhoid challenge, paratyphoid was notable for high rates of subclinical bacteremia (at this dose, 11/20 [55%]). Despite limited symptoms, bacteremia persisted for up to 96 hours after antibiotic treatment (median duration of bacteremia, 53 hours [interquartile range, 24-85 hours]). Shedding of S. Paratyphi A in stool typically preceded onset of bacteremia. Challenge with S. Paratyphi A at a dose of 1-5 × 10³ CFU was well tolerated and associated with an acceptable safety profile. The frequency and persistence of bacteremia in the absence of clinical symptoms was notable, and markedly different from that seen in previous typhoid challenge studies. We conclude that the paratyphoid challenge model is suitable for the assessment of vaccine efficacy using endpoints that include bacteremia and/or symptomatology. NCT02100397. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.
Ford, Mackenzie A.; Almond, Christopher S.; Gauvreau, Kimberlee; Piercey, Gary; Blume, Elizabeth D.; Smoot, Leslie B.; Fynn-Thompson, Francis; Singh, Tajinder P.
2014-01-01
BACKGROUND Previous studies have found no association between graft ischemic time (IT) and survival in pediatric heart transplant (HTx) recipients. However, previous studies were small or analyzed risk only at the extremes of IT, where observations are few. We sought to determine whether graft IT is independently associated with graft survival in a large cohort of children, with no a priori assumptions about where the risk threshold may lie. METHODS All children aged <18 years in the U.S. undergoing primary HTx (1987 to 2008) were included. The primary end point was graft loss (death or retransplant) within 6 months. Multivariate analysis was performed to analyze the association between graft IT and graft loss within 6 months after transplant. A secondary end point of longer-term graft loss was assessed among recipients who survived the first 6 months after transplant. RESULTS Of 4,716 pediatric HTxs performed, the median IT was 3.5 hours (interquartile range, 2.7–4.3 hours). Adjusted analysis showed that children with an IT > 3.5 hours were at increased risk of graft loss within 6 months after transplant (hazard ratio, 1.3; 95% confidence interval, 1.1–1.5; p = 0.002). Among 6-month survivors, IT was not associated with longer-term graft loss. CONCLUSIONS IT beyond 3.5 hours is associated with a 30% increase in risk of graft loss within 6 months in pediatric HTx recipients. Although the magnitude of risk associated with IT is small compared with the risk associated with recipient factors, these findings may be important during donor assessment for high-risk transplant candidates. PMID:21676628
Tan Tanny, Sharman P; Busija, Lucy; Liew, Danny; Teo, Sarah; Davis, Stephen M; Yan, Bernard
2013-08-01
Previous economic studies outside Australia have demonstrated that patients treated with tissue-type plasminogen activator (tPA) within 4.5 hours of stroke onset have lower healthcare costs than those not treated. We aimed to perform a cost-effectiveness analysis of intravenous tPA in an Australian setting. Data on clinical outcomes and costs were derived for 378 patients who received intravenous tPA within 4.5 hours of stroke onset at Royal Melbourne Hospital (Australia) between January 2003 and December 2011. To simulate clinical outcomes and costs for a hypothetical control group assumed not to have received tPA, we applied efficacy data from a meta-analysis of randomized trials to outcomes observed in the tPA group. During a 1-year time horizon, net costs, years of life lived, and quality-adjusted life-years were compared and incremental cost-effectiveness ratios derived for tPA versus no tPA. In the study population, mean (SD) age was 68.2 (13.5) years and 206 (54.5%) were men. Median National Institutes of Health Stroke Scale score (interquartile range) at presentation was 12.5 (8-18). Compared with no tPA, we estimated that tPA would result in 0.02 life-years and 0.04 quality-adjusted life-years saved per person over 1 year. The net cost of tPA was AUD $55.61 per patient. The incremental cost-effectiveness ratios were AUD $2377 per life-year saved and AUD $1478 per quality-adjusted life-year saved. Because the costs of tPA are incurred only once, the incremental cost-effectiveness ratios would decrease with an increasing time horizon. Uncertainty analyses indicated the results to be robust. Intravenous tPA within 4.5 hours represents a cost-effective intervention for acute ischemic stroke.
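The incremental cost-effectiveness ratio itself is simple arithmetic (incremental cost divided by incremental effect). The back-calculation below uses the abstract's net cost; the unrounded effect increments are assumptions chosen to reproduce the published ratios, since the abstract rounds them to 0.02 and 0.04:

```python
# ICER = incremental cost / incremental effect (per patient, 1-year horizon).
net_cost = 55.61     # AUD per patient, from the abstract
qaly_gain = 0.0376   # assumed unrounded QALY increment (abstract rounds to 0.04)
ly_gain = 0.0234     # assumed unrounded life-year increment (abstract rounds to 0.02)

print(f"AUD ${net_cost / qaly_gain:,.0f} per QALY saved")     # ~ $1,479 (reported $1478)
print(f"AUD ${net_cost / ly_gain:,.0f} per life-year saved")  # ~ $2,377 (reported $2377)
```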
Lesbian, gay, bisexual, and transgender-related content in undergraduate medical education.
Obedin-Maliver, Juno; Goldsmith, Elizabeth S; Stewart, Leslie; White, William; Tran, Eric; Brenman, Stephanie; Wells, Maggie; Fetterman, David M; Garcia, Gabriel; Lunn, Mitchell R
2011-09-07
Lesbian, gay, bisexual, and transgender (LGBT) individuals experience health and health care disparities and have specific health care needs. Medical education organizations have called for LGBT-sensitive training, but how and to what extent schools educate students to deliver comprehensive LGBT patient care is unknown. To characterize LGBT-related medical curricula and associated curricular development practices and to determine deans' assessments of their institutions' LGBT-related curricular content. Deans of medical education (or equivalent) at 176 allopathic or osteopathic medical schools in Canada and the United States were invited to complete a 13-question, Web-based questionnaire between May 2009 and March 2010. Reported hours of LGBT-related curricular content. Of 176 schools, 150 (85.2%) responded, and 132 (75.0%) fully completed the questionnaire. The median reported time dedicated to teaching LGBT-related content in the entire curriculum was 5 hours (interquartile range [IQR], 3-8 hours). Of the 132 respondents, 9 (6.8%; 95% CI, 2.5%-11.1%) reported 0 hours taught during the preclinical years and 44 (33.3%; 95% CI, 25.3%-41.4%) reported 0 hours during the clinical years. Median US allopathic clinical hours differed significantly from US osteopathic clinical hours (2 hours [IQR, 0-4 hours] vs 0 hours [IQR, 0-2 hours]; P = .008). Although 128 of the schools (97.0%; 95% CI, 94.0%-99.9%) taught students to ask patients if they "have sex with men, women, or both" when obtaining a sexual history, the reported teaching frequency of 16 LGBT-specific topic areas in the required curriculum was lower: at least 8 topics were covered at 83 schools (62.9%; 95% CI, 54.6%-71.1%) and all topics at 11 schools (8.3%; 95% CI, 3.6%-13.0%). The institutions' LGBT content was rated as "fair" at 58 schools (43.9%; 95% CI, 35.5%-52.4%). Suggested successful strategies to increase content included curricular material focusing on LGBT-related health and health disparities at 77 schools (58.3%; 95% CI, 49.9%-66.7%) and faculty willing and able to teach LGBT-related curricular content at 67 schools (50.8%; 95% CI, 42.2%-59.3%). The median reported time dedicated to LGBT-related topics in 2009-2010 was small across US and Canadian medical schools, but the quantity, content covered, and perceived quality of instruction varied substantially.
An e-learning course in medical immunology: does it improve learning outcome?
Boye, Sondre; Moen, Torolf; Vik, Torstein
2012-01-01
E-learning is used by most medical students almost daily, and several studies have shown e-learning to improve learning outcomes in small-scale interventions. However, few studies have explored the effects of e-learning in immunology. To study the effect of an e-learning package in immunology on learning outcomes in a written integrated examination and to examine student satisfaction with the e-learning package. All second-year students at a Norwegian medical school were offered an animated e-learning package in basic immunology as a supplement to the regular teaching. Each student's log-on time was recorded and linked with the student's score on multiple choice questions included in an integrated end-of-the-year written examination. Student satisfaction was assessed through a questionnaire. The intermediate-range students (interquartile range) on average scored 3.6% better on the immunology part of the examination per hour they had used the e-learning package (p = 0.0046), and log-on time explained 17% of the variance in immunology score. The examination outcomes of the best and the least skilled students were not affected by the e-learning. The e-learning was well appreciated among the students. Use of an e-learning package in immunology in addition to regular teaching improved learning outcomes for intermediate-range students.
Selberherr, Andreas; Hörmann, Marcus; Prager, Gerhard; Riss, Philipp; Scheuba, Christian; Niederle, Bruno
2017-03-01
The purpose of this study was to demonstrate the high number of kidney stones in primary hyperparathyroidism (PHPT) and the correspondingly low number of truly "asymptomatic" patients. Forty patients with PHPT (28 female, 12 male; median age 58 (range 33-80) years; interquartile range 17 years [51-68]) without known symptoms of kidney stones prospectively underwent multidetector computed tomography (MDCT) and ultrasound (US) examinations of the urinary tract prior to parathyroid surgery. Images were evaluated for the presence or absence of stones, as well as for the number of stones and their sizes in the long axis. The MDCT and US examinations were interpreted by two experienced radiologists who were blinded to all clinical and biochemical data. Statistical analysis was performed using the Wilcoxon signed-rank test. US revealed a total of 4 kidney stones in 4 (10 %) of 40 patients (median size 6.5 mm, interquartile range 11.5 mm). MDCT showed a total of 41 stones (median size 3 mm, interquartile range 2.25 mm) in 15 (38 %) of 40 patients. The number of kidney stones detected with MDCT was significantly higher compared with US (p = 0.00124). MDCT is a highly sensitive method for the detection of "silent" kidney stones in patients with PHPT. By applying this method widely, the number of apparently asymptomatic courses of PHPT may be substantially reduced. MDCT should be used primarily to detect kidney stones in PHPT and to exclude asymptomatic PHPT.
Overdiagnosis of Clostridium difficile Infection in the Molecular Test Era.
Polage, Christopher R; Gyorke, Clare E; Kennedy, Michael A; Leslie, Jhansi L; Chin, David L; Wang, Susan; Nguyen, Hien H; Huang, Bin; Tang, Yi-Wei; Lee, Lenora W; Kim, Kyoungmi; Taylor, Sandra; Romano, Patrick S; Panacek, Edward A; Goodell, Parker B; Solnick, Jay V; Cohen, Stuart H
2015-11-01
Clostridium difficile is a major cause of health care-associated infection, but disagreement between diagnostic tests is an ongoing barrier to clinical decision making and public health reporting. Molecular tests are increasingly used to diagnose C difficile infection (CDI), but many molecular test-positive patients lack the toxins that historically defined disease, making it unclear whether they need treatment. To determine the natural history and need for treatment of patients who are toxin immunoassay negative and polymerase chain reaction (PCR) positive (Tox-/PCR+) for CDI. Prospective observational cohort study at a single academic medical center among 1416 hospitalized adults tested for C difficile toxins 72 hours or longer after admission between December 1, 2010, and October 20, 2012. The analysis was conducted in stages with revisions from April 27, 2013, to January 13, 2015. Patients undergoing C difficile testing were grouped by US Food and Drug Administration-approved toxin and PCR tests as Tox+/PCR+, Tox-/PCR+, or Tox-/PCR-. Toxin results were reported clinically. Polymerase chain reaction results were not reported. The main study outcomes were duration of diarrhea during up to 14 days of treatment, rate of CDI-related complications (ie, colectomy, megacolon, or intensive care unit care), and CDI-related death within 30 days. Twenty-one percent (293 of 1416) of hospitalized adults tested for C difficile were positive by PCR, but only 44.7% (131 of 293) had toxins detected by the clinical toxin test. At baseline, Tox-/PCR+ patients had lower C difficile bacterial load and less antibiotic exposure, fecal inflammation, and diarrhea than Tox+/PCR+ patients (P < .001 for all). The median duration of diarrhea was shorter in Tox-/PCR+ patients (2 days; interquartile range, 1-4 days) than in Tox+/PCR+ patients (3 days; interquartile range, 1-6 days) (P = .003) and was similar to that in Tox-/PCR- patients (2 days; interquartile range, 1-3 days), despite minimal empirical treatment of Tox-/PCR+ patients. No CDI-related complications occurred in Tox-/PCR+ patients vs 10 complications in Tox+/PCR+ patients (0% vs 7.6%, P < .001). One Tox-/PCR+ patient had recurrent CDI as a contributing factor to death within 30 days vs 11 CDI-related deaths in Tox+/PCR+ patients (0.6% vs 8.4%, P = .001). Among hospitalized adults with suspected CDI, virtually all CDI-related complications and deaths occurred in patients with positive toxin immunoassay test results. Patients with a positive molecular test result and a negative toxin immunoassay test result had outcomes comparable to those of patients without C difficile by either method. Exclusive reliance on molecular tests for CDI diagnosis, without tests for toxins or host response, is likely to result in overdiagnosis, overtreatment, and increased health care costs.
Lima, Fabricio O; Furie, Karen L; Silva, Gisele S; Lev, Michael H; Camargo, Erica C S; Singhal, Aneesh B; Harris, Gordon J; Halpern, Elkan F; Koroshetz, Walter J; Smith, Wade S; Nogueira, Raul G
2014-02-01
Limited data exist regarding the natural history of proximal intracranial arterial occlusions. We aimed to investigate the outcomes of patients who had an acute ischemic stroke attributed to an anterior circulation proximal intracranial arterial occlusion. A prospective cohort study was conducted at 2 university-based hospitals from 2003 to 2005, in which nonenhanced computed tomography scans and computed tomography angiograms were obtained at admission for all adult patients suspected of having an ischemic stroke within the first 24 hours of symptom onset. Anterior circulation proximal intracranial arterial occlusion. Frequency of good outcome (defined as a modified Rankin Scale score of ≤2) and mortality at 6 months. A total of 126 patients with a unilateral complete occlusion of the intracranial internal carotid artery (ICA; 26 patients: median National Institutes of Health Stroke Scale [NIHSS] score, 11 [interquartile range, 5-17]), of the M1 segment of the middle cerebral artery (MCA; 52 patients: median NIHSS score, 13 [interquartile range, 6-16]), or of the M2 segment of the MCA (48 patients: median NIHSS score, 7 [interquartile range, 4-15]) were included. Of these 3 groups of patients, 10 (38.5%), 20 (38.5%), and 26 (54.2%) with ICA, MCA-M1, and MCA-M2 occlusions, respectively, achieved a modified Rankin Scale score of 2 or less, and 6 (23.1%), 12 (23.1%), and 10 (20.8%) were dead at 6 months. Worse outcomes were seen in patients with a baseline NIHSS score of 10 or higher, with a modified Rankin Scale score of 2 or less achieved in only 7.1% (1 of 14), 23.5% (8 of 34), and 22.7% (5 of 22) of patients and mortality rates of 35.7% (5 of 14), 32.4% (11 of 34), and 40.9% (9 of 22) among patients with ICA, MCA-M1, and MCA-M2 occlusions, respectively. Age (odds ratio, 0.94 [95% CI, 0.91-0.98]), NIHSS score (odds ratio, 0.73 [95% CI, 0.64-0.83]), and strength of leptomeningeal collaterals (odds ratio, 2.37 [95% CI, 1.08-5.20]) were independently associated with outcome, whereas the level of proximal intracranial arterial occlusion (ICA vs MCA-M1 vs MCA-M2) was not. The natural history of proximal intracranial arterial occlusion is variable, with poor outcomes overall. Stroke severity and collateral flow appear to be more important than the level of proximal intracranial arterial occlusion in determining outcomes. Our results provide useful data for proper patient selection and sample size calculations in the design of new clinical trials aimed at recanalization therapies.
An acute cough-specific quality-of-life questionnaire for children: Development and validation.
Anderson-James, Sophie; Newcombe, Peter A; Marchant, Julie M; O'Grady, Kerry-Ann F; Acworth, Jason P; Stone, D Grant; Turner, Catherine T; Chang, Anne B
2015-05-01
Patient-relevant outcome measures are essential for high-quality clinical research, and quality-of-life (QoL) tools are the current standard. Currently, there is no validated children's acute cough-specific QoL questionnaire. The objective of this study was to develop and validate the Parent-proxy Children's Acute Cough-specific QoL Questionnaire (PAC-QoL). Using focus groups, a 48-item PAC-QoL questionnaire was developed and later reduced to 16 items by using the clinical impact method. Parents of children with a current acute cough (<2 weeks) at enrollment completed 2 validated cough score measures, the preliminary 48-item PAC-QoL, and 3 other questionnaires (the State Trait Anxiety Inventory [STAI], the Short-Form 8-item 24-hour recall Health Survey [SF-8], and the Depression, Anxiety, and Stress 21-item Scale [DASS21]). All measures were repeated on days 3 and 14. The median age of the 155 children enrolled was 2.3 years (interquartile range, 1.3-4.6). Median cough duration at enrollment was 3 days (interquartile range, 2-5). The reduced 16-item scale had high internal consistency (Cronbach α = 0.95). Evidence for repeatability and criterion validity was shown by significant correlations between the domains and total PAC-QoL scores and the SF-8 (r = -0.36 and -0.51), STAI (r = -0.27 and -0.39), and DASS21 (r = -0.32 and -0.41) scales on days 0 and 3, respectively. The final PAC-QoL questionnaire was sensitive to change over time, with changes significantly relating to changes in cough score measures (P < .001). The 16-item PAC-QoL is a reliable and valid outcome measure that assesses QoL related to childhood acute cough at a given time point and reflects changes in acute cough-specific QoL over time. Copyright © 2014 American Academy of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.
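The internal consistency quoted above (Cronbach α = 0.95) uses the standard definition; for reference (a textbook formula, not reproduced from the paper), with k = 16 items:

```latex
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)
```

where σ²_{Y_i} is the variance of item i and σ²_X is the variance of the total score.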
Olafiranye, Oladipupo; Ladejobi, Adetola; Wayne, Max; Martin-Gill, Christian; Althouse, Andrew D; Sharbaugh, Michael S; Guyette, Francis X; Reis, Steven E; Kellum, John A; Toma, Catalin
2016-12-01
To assess the impact of remote ischemic peri-conditioning (RIPC) during inter-facility air medical transport of ST-segment elevation myocardial infarction (STEMI) patients on the incidence of acute kidney injury (AKI) following primary percutaneous coronary intervention (pPCI). STEMI patients who receive pPCI have an increased risk of AKI, for which there is no well-defined prophylactic therapy in the setting of emergent pPCI. Using the ACTION Registry-GWTG, we evaluated the impact of RIPC applied during inter-facility helicopter transport of STEMI patients from non-PCI-capable hospitals to 2 PCI hospitals in the United States between March 2013 and September 2015 on the incidence of AKI following pPCI. AKI was defined as a ≥0.3 mg/dL increase in creatinine within 48-72 hours after pPCI. Patients who received RIPC (n = 127), compared with those who did not (n = 92), were less likely to have AKI (11 of 127 patients [8.7%] vs. 17 of 92 patients [18.5%]; adjusted odds ratio = 0.32, 95% CI 0.12-0.85, P = 0.023) and all-cause in-hospital mortality (2 of 127 patients [1.6%] vs. 7 of 92 patients [7.6%]; adjusted odds ratio = 0.14, 95% CI 0.02-0.86, P = 0.034) after adjusting for socio-demographic and clinical characteristics. There was no difference in hospital length of stay (3 days [interquartile range, 2-4] vs. 3 days [interquartile range, 2-5], P = 0.357) between the 2 groups. RIPC applied during inter-facility helicopter transport of STEMI patients for pPCI is associated with a lower incidence of AKI and in-hospital mortality. The use of RIPC for renal protection in STEMI patients warrants further in-depth investigation. © 2016, Wiley Periodicals, Inc.
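For orientation, the unadjusted odds ratio implied by the abstract's raw counts can be computed directly; it differs from the reported 0.32 because the latter is adjusted for socio-demographic and clinical covariates:

```python
# Unadjusted odds ratio for AKI from the abstract's 2x2 counts.
aki_ripc, n_ripc = 11, 127   # RIPC group
aki_ctrl, n_ctrl = 17, 92    # no-RIPC group

odds_ripc = aki_ripc / (n_ripc - aki_ripc)
odds_ctrl = aki_ctrl / (n_ctrl - aki_ctrl)
print(f"Unadjusted OR = {odds_ripc / odds_ctrl:.2f}")  # ~0.42 before adjustment
```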
Cheng, Huaibing; Lu, Minjie; Hou, Cuihong; Chen, Xuhua; Wang, Jing; Yin, Gang; Chu, Jianmin; Zhang, Shu; Prasad, Sanjay K; Pu, Jielin; Zhao, Shihua
2015-02-01
Although N-terminal pro-brain natriuretic peptide (NT-proBNP) is a useful screening test for impaired right ventricular (RV) function in conditions affecting the right-sided cardiac muscle, the role of NT-proBNP remains unclear in patients with arrhythmogenic right ventricular cardiomyopathy (ARVC). This study was designed to clarify the relation between the plasma NT-proBNP level and RV function evaluated by cardiovascular magnetic resonance (CMR) imaging. We selected 56 patients with confirmed ARVC whose blood specimens for NT-proBNP measurement had been collected within 48 hours of a CMR scan. The NT-proBNP level was significantly higher in patients with RV dysfunction than in patients without RV dysfunction (median 655.3 [interquartile range 556.4 to 870.0] vs 347.0 [interquartile range 308.0 to 456.2] pmol/L, p <0.001). The NT-proBNP levels were positively correlated with RV end-diastolic and end-systolic volume indices (r = 0.49 and 0.70, respectively) and negatively correlated with RV ejection fraction (r = -0.76, all p <0.001); these correlations remained significant after adjustment for age, gender, and body mass index. The area under the receiver-operating characteristic curve for NT-proBNP was 0.91 (95% confidence interval 0.80 to 0.97, p <0.001). The cut-off value of NT-proBNP (458 pmol/L) was associated with sensitivity, specificity, and positive and negative predictive values of 91%, 89%, 67%, and 98%, respectively. In conclusion, NT-proBNP is a useful marker for the detection of RV dysfunction and is associated with the extent of RV dilatation and dysfunction determined by CMR in patients with ARVC. Copyright © 2015 Elsevier Inc. All rights reserved.
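A cutoff with the reported sensitivity/specificity profile is typically read off the ROC curve, often by maximizing Youden's J; the abstract does not state the selection rule used, so the sketch below is a generic illustration on made-up values:

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

# Hypothetical NT-proBNP values (pmol/L) with RV-dysfunction labels (1 = impaired).
ntprobnp = np.array([310, 350, 420, 460, 520, 640, 700, 880])
rv_dysfn = np.array([0, 0, 0, 1, 0, 1, 1, 1])

print("AUC:", roc_auc_score(rv_dysfn, ntprobnp))
fpr, tpr, thresholds = roc_curve(rv_dysfn, ntprobnp)
best = np.argmax(tpr - fpr)  # Youden's J = sensitivity + specificity - 1
print("Chosen cutoff:", thresholds[best])
```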
Ma, Li; Roberts, Joan S; Pihoker, Catherine; Richards, Todd L; Shaw, Dennis W W; Marro, Ken I; Vavilala, Monica S
2014-10-01
Impaired cerebral autoregulation may be associated with poor outcome in diabetic ketoacidosis. We examined change in cerebral autoregulation during diabetic ketoacidosis treatment. Prospective observational cohort study. Tertiary care children's hospital. Children admitted to the ICU with diabetic ketoacidosis (venous pH < 7.3, glucose > 300 mg/dL, HCO3 < 15 mEq/L, and ketonuria) constituted cases, and children with type 1 diabetes without diabetic ketoacidosis constituted controls. None. Between 2005 and 2009, 32 cases and 50 controls were enrolled. Transcranial Doppler ultrasonography was used to measure middle cerebral artery flow velocities, and cerebral autoregulation testing was performed via tilt-table testing. Cases underwent two cerebral autoregulation tests and controls underwent one. Cerebral autoregulation was quantified by the autoregulatory index (autoregulatory index < 0.4 = impaired and autoregulatory index 0.4-1.0 = intact autoregulation). The first autoregulation test was obtained early during diabetic ketoacidosis treatment (time 1, 12-24 hr; median [interquartile range], 8 hr [5-18 hr]), and a second autoregulation test was obtained during recovery (time 2, 36-72 hr; median [interquartile range], 46 hr [40-59 hr]), with time 0 defined as the time of insulin start. Cases had a lower autoregulatory index at time 1 than at time 2 (p < 0.001), as well as a lower autoregulatory index than control subjects (p < 0.001). Cerebral autoregulation was impaired in 40% (n = 13) of cases at time 1 and in 6% (n = 2) of cases at time 2. Five cases (17%) showed persistent impairment of cerebral autoregulation between times 1 and 2 of treatment. All control subjects had intact cerebral autoregulation. Impaired cerebral autoregulation was common early during diabetic ketoacidosis treatment. Although the majority improved during diabetic ketoacidosis treatment, 17% of subjects had impairment between 36 and 72 hours after the start of insulin therapy. The observed impairment of cerebral autoregulation appears specific to the diabetic ketoacidosis process in patients with type 1 diabetes.
Barcelo, Antonia; Bauça, Josep Miquel; Yañez, Aina; Fueyo, Laura; Gomez, Cristina; de la Peña, Monica; Pierola, Javier; Rodriguez, Alberto; Sanchez-de-la-Torre, Manuel; Abad, Jorge; Mediano, Olga; Amilibia, Jose; Masdeu, Maria Jose; Teran, Joaquin; Montserrat, Josep Maria; Mayos, Mercè; Sanchez-de-la-Torre, Alicia; Barbé, Ferran
2016-01-01
Background Placental growth factor (PlGF) induces angiogenesis and promotes tissue repair, and plasma PlGF levels change markedly during acute myocardial infarction (AMI). Currently, the impact of obstructive sleep apnea (OSA) in patients with AMI is a subject of debate. Our objective was to evaluate the relationships between PlGF levels and both the severity of acute coronary syndrome (ACS) and short-term outcomes after ACS in patients with and without OSA. Methods A total of 538 consecutive patients (312 OSA patients and 226 controls) admitted for ACS were included in this study. All patients underwent polygraphy in the first 72 hours after hospital admission. The severity of disease and short-term prognoses were evaluated during the hospitalization period. Plasma PlGF levels were measured using an electrochemiluminescence immunoassay. Results Patients with OSA were significantly older and more frequently hypertensive and had higher BMIs than those without OSA. After adjusting for age, smoking status, BMI and hypertension, PlGF levels were significantly elevated in patients with OSA compared with patients without OSA (19.9 pg/mL, interquartile range: 16.6–24.5 pg/mL, vs 18.5 pg/mL, interquartile range: 14.7–22.7 pg/mL; p<0.001), and a higher apnea-hypopnea index (AHI) was associated with higher PlGF concentrations (p<0.003). Patients with higher levels of PlGF also had an increased odds ratio for the presence of 3 or more diseased vessels and for a Killip score >1, even after adjustment. Conclusions The results of this study show that in patients with ACS, elevated plasma levels of PlGF are associated with the presence of OSA and with adverse outcomes during short-term follow-up. Trial Registration ClinicalTrials.gov NCT01335087 PMID:26930634
Gamma-knife radiosurgery in acromegaly: a 4-year follow-up study.
Attanasio, Roberto; Epaminonda, Paolo; Motti, Enrico; Giugni, Enrico; Ventrella, Laura; Cozzi, Renato; Farabola, Mario; Loli, Paola; Beck-Peccoz, Paolo; Arosio, Maura
2003-07-01
Stereotactic radiosurgery by gamma-knife (GK) is an attractive therapeutic option after failure of microsurgical removal in patients with pituitary adenoma. In these tumors or their remnants, it aims to arrest cell proliferation and hormone hypersecretion with a single, precisely targeted high dose of ionizing radiation while sparing surrounding structures. The long-term efficacy and toxicity of GK in acromegaly are only partially known. Thirty acromegalic patients (14 women and 16 men) entered a prospective study of GK treatment. Most were surgical failures, whereas in 3 GK was the primary treatment. Imaging of the adenoma and identification of target coordinates were obtained by high-resolution magnetic resonance imaging. All patients were treated with multiple isocenters (mean, 8; range, 3-11). The 50% isodose was used in 27 patients (90%). The mean margin dose was 20 Gy (range, 15-35), and the dose to the visual pathways was always less than 8 Gy. After a median follow-up of 46 months (range, 9-96), IGF-I fell from 805 µg/liter (median; interquartile range, 640-994) to 460 µg/liter (interquartile range, 217-654; P = 0.0002), and normal age-matched IGF-I levels were reached in 7 patients (23%). Mean GH levels decreased from 10 µg/liter (interquartile range, 6.4-15) to 2.9 µg/liter (interquartile range, 2-5.3; P < 0.0001), reaching levels below 2.5 µg/liter in 11 (37%). The rate of persistently pathological hormonal levels was still 70% at 5 yr by Kaplan-Meier analysis. The median tumor volume was 1.43 ml (range, 0.20-3.7). Tumor shrinkage (at least 25% of basal volume) occurred after 24 months (range, 12-36) in 11 of 19 patients (58% of assessable patients). The rate of shrinkage was 79% at 4 yr. In no case was further growth observed. Only 1 patient complained of side effects (severe headache and nausea immediately after the procedure, with full recovery in a few days with steroid therapy). Anterior pituitary failures were observed in 2 patients, who already had partial hypopituitarism, after 2 and 6 yr, respectively. No patient developed visual deficits. GK is a valid adjunctive tool in the management of acromegaly that controls GH/IGF-I hypersecretion and tumor growth, with shrinkage of the adenoma and no recurrence of the disease during the observation period, and with low acute and chronic toxicity.
Taylor, Lauren J; Nabozny, Michael J; Steffens, Nicole M; Tucholka, Jennifer L; Brasel, Karen J; Johnson, Sara K; Zelenski, Amy; Rathouz, Paul J; Zhao, Qianqian; Kwekkeboom, Kristine L; Campbell, Toby C; Schwarze, Margaret L
2017-06-01
Although many older adults prefer to avoid burdensome interventions with limited ability to preserve their functional status, aggressive treatments, including surgery, are common near the end of life. Shared decision making is critical to achieve value-concordant treatment decisions and minimize unwanted care. However, communication in the acute inpatient setting is challenging. To evaluate the proof of concept of an intervention to teach surgeons to use the Best Case/Worst Case framework as a strategy to change surgeon communication and promote shared decision making during high-stakes surgical decisions. Our prospective pre-post study was conducted from June 2014 to August 2015, and data were analyzed using a mixed methods approach. The data were drawn from decision-making conversations between 32 older inpatients with an acute nonemergent surgical problem, 30 family members, and 25 surgeons at 1 tertiary care hospital in Madison, Wisconsin. A 2-hour training session to teach each study-enrolled surgeon to use the Best Case/Worst Case communication framework. We scored conversation transcripts using OPTION 5, an observer measure of shared decision making, and used qualitative content analysis to characterize patterns in conversation structure, description of outcomes, and deliberation over treatment alternatives. The study participants were patients aged 68 to 95 years (n = 32), 44% of whom had 5 or more comorbid conditions; family members of patients (n = 30); and surgeons (n = 17). The median OPTION 5 score improved from 41 preintervention (interquartile range, 26-66) to 74 after Best Case/Worst Case training (interquartile range, 60-81). Before training, surgeons described the patient's problem in conjunction with an operative solution, directed deliberation over options, listed discrete procedural risks, and did not integrate preferences into a treatment recommendation. After training, surgeons using Best Case/Worst Case clearly presented a choice between treatments, described a range of postoperative trajectories including functional decline, and involved patients and families in deliberation. Using the Best Case/Worst Case framework changed surgeon communication by shifting the focus of decision-making conversations from an isolated surgical problem to a discussion about treatment alternatives and outcomes. This intervention can help surgeons structure challenging conversations to promote shared decision making in the acute setting.
Lochocka, Klaudia; Bajerska, Joanna; Glapa, Aleksandra; Fidler-Witon, Ewa; Nowak, Jan K; Szczapa, Tomasz; Grebowiec, Philip; Lisowska, Aleksandra; Walkowiak, Jaroslaw
2015-07-30
Green tea is known worldwide for its beneficial effects on human health. However, objective data evaluating this influence in humans are scarce. The aim of the study was to assess the impact of green tea extract (GTE) on starch digestion and absorption. The study comprised 28 healthy volunteers, aged 19 to 28 years. In all subjects, a starch ¹³C breath test was performed twice. Subjects randomly ingested naturally ¹³C-abundant cornflakes during the GTE test (GTE 4 g) or the placebo test. The cumulative percentage dose recovery (CPDR) was significantly lower for the GTE test than for the placebo test (median [interquartile range]: 11.4% [5.5-15.5] vs. 16.1% [12.7-19.5]; p = 0.003). Likewise, CPDR expressed per hour was considerably lower at each measurement point. In conclusion, a single dose of green tea extract taken with a test meal decreases starch digestion and absorption.
Pulmonary rehabilitation in lymphangioleiomyomatosis: a controlled clinical trial.
Araujo, Mariana S; Baldi, Bruno G; Freitas, Carolina S G; Albuquerque, André L P; Marques da Silva, Cibele C B; Kairalla, Ronaldo A; Carvalho, Celso R F; Carvalho, Carlos R R
2016-05-01
Lymphangioleiomyomatosis (LAM) is a cystic lung disease frequently associated with reduced exercise capacity. The aim of this study was to assess safety and efficacy of pulmonary rehabilitation in LAM. This controlled clinical trial included 40 patients with LAM and a low physical activity level. The pulmonary rehabilitation programme comprised 24 aerobic and muscle strength training sessions and education. The primary outcome was exercise capacity (endurance time during a constant work rate exercise test). Secondary outcomes included health-related quality of life (St George's Respiratory Questionnaire (SGRQ)), 6-min walking distance (6MWD), dyspnoea, peak oxygen consumption (V'O2), daily physical activity (pedometer), symptoms of anxiety and depression, lung function and peripheral muscle strength (one-repetition maximum). The baseline characteristics were well balanced between the groups. The pulmonary rehabilitation group exhibited improvements in the following outcomes versus controls: endurance time (median (interquartile range) 169 (2-303) s versus -33 (-129-39) s; p=0.001), SGRQ (median (interquartile range) -8 (-16-2) versus 2 (-4-5); p=0.002) and 6MWD (median (interquartile range) 59 (13-81) m versus 20 (-12-30) m; p=0.002). Dyspnoea, peak V'O2, daily physical activity and muscle strength also improved significantly. No serious adverse events were observed. Pulmonary rehabilitation is a safe intervention and improves exercise capacity, dyspnoea, daily physical activity, quality of life and muscle strength in LAM. Copyright ©ERS 2016.
Prehospital Emergency Care in Childhood Arterial Ischemic Stroke.
Stojanovski, Belinda; Monagle, Paul T; Mosley, Ian; Churilov, Leonid; Newall, Fiona; Hocking, Grant; Mackay, Mark T
2017-04-01
Immediately calling an ambulance is the key factor in reducing time to hospital presentation for adult stroke. Little is known about prehospital care in childhood arterial ischemic stroke (AIS). We aimed to determine emergency medical services call-taker and paramedic diagnostic sensitivity and to describe timelines of care in childhood AIS. This is a retrospective study of ambulance-transported children aged <18 years with a first radiologically confirmed AIS, from 2008 to 2015. Interhospital transfers of children with a preexisting AIS diagnosis were excluded. Twenty-three children were identified; 4 with unavailable ambulance records were excluded. Nineteen children were included in the study. Median age was 8 years (interquartile range, 3-14); median Pediatric National Institutes of Health Stroke Scale score was 8 (interquartile range, 3-16). Emergency medical services call-taker diagnosis was stroke in 4 children (21%). Priority code 1 (lights and sirens) ambulances were dispatched for 13 children (68%). Paramedic diagnosis was stroke in 5 children (26%), hospital prenotification occurred for 8 children (42%), and 13 children (68%) were transported to primary stroke centers. Median prehospital timelines were onset to emergency medical services contact 13 minutes, call to scene 12 minutes, time at scene 14 minutes, transport time 43 minutes, and total prehospital time 71 minutes (interquartile range, 60-85). Emergency medical services call-taker and paramedic diagnostic sensitivity and prenotification rates are low in childhood AIS. © 2017 American Heart Association, Inc.
Location and size of flux ropes in Titan's ionosphere
NASA Astrophysics Data System (ADS)
Martin, C.; Arridge, C. S.; Badman, S. V.; Dieval, C.
2017-12-01
Cassini magnetometer data were surveyed during Titan flybys to find 73 instances of flux rope signatures. A force-free flux rope model was utilised to obtain the radii, maximum magnetic field and flux content of flux ropes that adhere to the force-free assumptions. We find that flux ropes at Titan are similar in size and flux content to the giant flux ropes identified at Venus, with a median radius of 280 km and an inter-quartile range of 270 km, a median maximum magnetic field of 8 nT with an inter-quartile range of 7 nT, and a median flux content of 76 Wb with a large inter-quartile range of 250 Wb. We additionally investigate the occurrence of flux ropes with respect to the sunlit hemisphere (zenith angle) and the ram side of Titan within Saturn's corotating magnetosphere (angle of attack of the incoming plasma flow). We find that flux ropes are more commonly detected in sunlit areas of Titan's ionosphere, as well as on the ram side of Titan. We see a statistically significant absence of flux ropes in all SLT sectors on the night side of Titan and on the anti-ram side of Titan. We also comment on the physical mechanisms associated with the production of these flux ropes, with particular attention to the variability of Titan's environment in Saturn's magnetosphere.
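The "force-free" model described here is commonly the constant-α Lundquist solution, in which the axial and azimuthal field components follow zeroth- and first-order Bessel functions; the authors' exact implementation may differ, and the parameters below are only loosely matched to the reported medians:

```python
import numpy as np
from scipy.special import j0, j1

def lundquist_field(r, b0, alpha):
    """Constant-alpha force-free flux rope (Lundquist solution):
    axial field B0*J0(alpha*r), azimuthal field B0*J1(alpha*r)."""
    return b0 * j0(alpha * r), b0 * j1(alpha * r)

# Illustrative values near the reported medians: B0 = 8 nT, radius = 280 km.
# alpha chosen so the axial field first vanishes at the rope edge
# (first zero of J0 is at 2.405).
r = np.linspace(0.0, 280.0, 200)  # km
b_axial, b_azimuthal = lundquist_field(r, b0=8.0, alpha=2.405 / 280.0)
```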
Cheng, S; Teuffel, O; Ethier, M C; Diorio, C; Martino, J; Mayo, C; Regier, D; Wing, R; Alibhai, S M H; Sung, L
2011-01-01
Background: To describe (1) anticipated health-related quality of life during different strategies for febrile neutropaenia (FN) management and (2) attributes of those preferring inpatient management. Methods: Respondents were parents of children 0–18 years and children 12–18 years receiving cancer treatment. Anticipated health-related quality of life was elicited for four different FN management strategies: entire inpatient, early discharge, outpatient oral and outpatient intravenous (i.v.) therapy. Tools used to measure health-related quality of life were visual analogue scale (VAS), willingness to pay and time trade off. Results: A total of 155 parents and 43 children participated. For parents, median VAS scores were highest for early discharge (5.9, interquartile range 4.4–7.2) and outpatient i.v. (5.9, interquartile range 4.4–7.3). For children, median scores were highest for early discharge (6.1, interquartile range 4.6–7.2). In contrast, the most commonly preferred strategy for parents and children was inpatient in 55.0% and 37.2%, respectively. Higher current child health-related quality of life was associated with a stronger preference for outpatient management. Conclusion: Early discharge and outpatient i.v. management are associated with higher anticipated health-related quality of life, although the most commonly preferred strategy was inpatient care. These data may help with determining more cost-effective strategies for paediatric FN. PMID:21694729
Tisè, Marco; Mazzarini, Laura; Fabrizzi, Giancarlo; Ferrante, Luigi; Giorgetti, Raffaele; Tagliabracci, Adriano
2011-05-01
Age estimation is chiefly important for the assessment of criminal liability and the protection of unaccompanied minor immigrants whose age is unknown. Under Italian law, persons are not criminally responsible before they reach the age of 14. The age of 18 is important when deciding whether juvenile or adult law must be applied. In the case of unaccompanied minors, it is important to assess age in order to establish special protective measures, and correct age estimation may prevent a person over 18 from benefiting from measures reserved for minors. Since the Greulich and Pyle method (GPM) is one of the most frequently used in age estimation, the aim of this study was to assess the reproducibility and accuracy of the method in a large Italian sample of teenagers, to ascertain the applicability of the Atlas at the critical age thresholds of 14 and 18 years. This retrospective study examined posteroanterior X-ray projections of the hand and wrist from 484 Italian-Caucasian young people (125 females, 359 males) between 11 and 19 years old. All radiographic images were taken from trauma patients hospitalized in the Azienda Ospedaliero Universitaria Ospedali Riuniti of Ancona (Italy) between 2006 and 2007. Two physicians analyzed all radiographic images separately, using the blind method. In the case of an estimated age of 14 years, the true age ranged from 12.2 to 15.9 years (median, 14.3 years; interquartile range, 1.0 years) for males, and from 12.6 to 15.7 years (median, 14.2 years; interquartile range, 1.7 years) for females. In the case of an estimated age of 18 years, the true age ranged from 15.6 to 19.7 years (median, 17.7 years; interquartile range, 1.4 years) for males, and from 16.2 to 20.0 years (median, 18.7 years; interquartile range, 1.8 years) for females. Our study shows that although the GPM is a reproducible and repeatable method, there is a wide margin of error in the estimation of chronological age, mainly at the critical estimated ages of 14 and 18 years in both males and females.
Vedel, Anne G; Holmgaard, Frederik; Rasmussen, Lars S; Langkilde, Annika; Paulson, Olaf B; Lange, Theis; Thomsen, Carsten; Olsen, Peter Skov; Ravn, Hanne Berg; Nilsson, Jens C
2018-04-24
Cerebral injury is an important complication after cardiac surgery with the use of cardiopulmonary bypass. The rate of overt stroke after cardiac surgery is 1% to 2%, whereas silent strokes, detected by diffusion-weighted magnetic resonance imaging, are found in up to 50% of patients. It is unclear whether a higher versus a lower blood pressure during cardiopulmonary bypass reduces cerebral infarction in these patients. In a patient- and assessor-blinded randomized trial, we allocated patients to a higher (70-80 mm Hg) or lower (40-50 mm Hg) target for mean arterial pressure by the titration of norepinephrine during cardiopulmonary bypass. Pump flow was fixed at 2.4 L·min⁻¹·m⁻². The primary outcome was the total volume of new ischemic cerebral lesions (summed in cubic millimeters), expressed as the difference between diffusion-weighted imaging conducted preoperatively and again postoperatively between days 3 and 6. Secondary outcomes included the diffusion-weighted imaging-evaluated total number of new ischemic lesions. Among the 197 enrolled patients, mean (SD) age was 65.0 (10.7) years in the low-target group (n=99) and 69.4 (8.9) years in the high-target group (n=98). Procedural risk scores were comparable between groups. Overall, diffusion-weighted imaging revealed new cerebral lesions in 52.8% of patients in the low-target group versus 55.7% in the high-target group (P=0.76). The primary outcome of volume of new cerebral lesions was comparable between groups, 25 mm³ (interquartile range, 0-118 mm³; range, 0-25,261 mm³) in the low-target group versus 29 mm³ (interquartile range, 0-143 mm³; range, 0-22,116 mm³) in the high-target group (median difference estimate, 0; 95% confidence interval, -25 to 0.028; P=0.99), as was the secondary outcome of number of new lesions (1 [interquartile range, 0-2; range, 0-24] versus 1 [interquartile range, 0-2; range, 0-29], respectively; median difference estimate, 0; 95% confidence interval, 0-0; P=0.71). No significant difference was observed in the frequency of severe adverse events. Among patients undergoing on-pump cardiac surgery, targeting a higher versus a lower mean arterial pressure during cardiopulmonary bypass did not seem to affect the volume or number of new cerebral infarcts. URL: https://www.clinicaltrials.gov. Unique identifier: NCT02185885. © 2018 American Heart Association, Inc.
Viana, Ricardo Borges; Campos, Mário Hebling; Santos, Douglas de Assis Teles; Xavier, Isabela Cristina Maioni; Vancini, Rodrigo Luiz; Andrade, Marília Santos; de Lira, Claudio Andre Barbosa
2018-04-16
Peer and near-peer teaching programs are common in medical undergraduate courses. However, no studies have investigated the effectiveness of a near-peer teaching program on the academic performance of undergraduate students pursuing sport and exercise science coursework. This study was conducted to analyze the effectiveness of such a program for students who participated in a course on the functional anatomy of the locomotor apparatus. A total of 39 student participants were divided into two groups: students in one group voluntarily attended at least one session of a near-peer teaching program, and students in the other group attended no sessions. The final grade (range 0-100%) was recorded and used as an indicator of academic performance. The final grade of students who attended the near-peer teaching program (69.5 ± 16.0%) was 38.7% higher (P = 0.002, d = 1.06) than that of those who did not (50.1 ± 20.4%). When the academic performance of the same students was evaluated in another course (exercise physiology) that did not offer a near-peer teaching program, there were no significant differences between the groups (students who attended or did not attend the near-peer teaching program). A significant positive association was found between near-peer teaching program attendance frequency and the numbers of students approved and not approved in the course (P = 0.041). A significant difference (P = 0.001) was also found in attendance at regular classes between the group who participated in the near-peer teaching program (median 62 hours; interquartile range [IQR] 4.0 hours) and those who did not (median 58 hours; IQR 4.0 hours). Gender did not moderate academic performance or near-peer teaching program attendance. These results highlight the effectiveness of a near-peer teaching program on the academic performance of students from a sport and exercise science degree program while enrolled in an anatomy course. Anat Sci Educ. © 2018 American Association of Anatomists.
[New definition of metabolic syndrome: does it have the same cardiovascular risk?].
Rodilla, E; González, C; Costa, J A; Pascual, J M
2007-02-01
The International Diabetes Federation (IDF) has recently published new criteria for the diagnosis of metabolic syndrome (MS). The aim of this study was to compare the clinical characteristics and cardiovascular risk of patients newly classified as having MS under the IDF definition with those identified by the previous National Cholesterol Education Program (NCEP) ATP III definition. Cross-sectional study in a hypertension clinic. Coronary risk was calculated (Framingham function, NCEP-ATP III), and other cardiovascular markers, urinary albumin excretion (UAE, in mg/24 hours) and high-sensitivity C-reactive protein (CRP), were assessed. A total of 2,404 patients were evaluated: 1,901 non-diabetic and 503 diabetic hypertensive subjects. Of the non-diabetics, 726 (38.2%) had MS under the previous NCEP ATP-III definition, a number that increased sharply to 1,091 (57.4%) with the new IDF definition. The proportion did not increase in diabetics (93% vs. 92%). Concordance in diagnosis was 78% in non-diabetics and 91% in diabetics. The newly identified patients had a similar coronary risk (Framingham) but lower values of other cardiovascular markers: logUAE 1.00 (0.49) mg/24 hours vs. 1.06 (0.55) mg/24 hours (p = 0.003), and CRP 1.9 (2.7) mg/L vs. 2.5 (3.2) mg/L (median, interquartile range; p < 0.001). The new IDF definition of MS increases the number of patients with MS. These new patients have a similar coronary risk (Framingham), but the other parameters used to assess cardiovascular risk (UAE and CRP) were lower. The relationship between the new definition of MS and cardiovascular risk remains to be defined.
Objectively Measured Activity Patterns among Adults in Residential Aged Care
Reid, Natasha; Eakin, Elizabeth; Henwood, Timothy; Keogh, Justin W. L.; Senior, Hugh E.; Gardiner, Paul A.; Winkler, Elisabeth; Healy, Genevieve N.
2013-01-01
Objectives: To determine the feasibility of using the activPAL3™ activity monitor, and to describe the activity patterns of residential aged care residents. Design: Cross-sectional. Setting: Randomly selected aged care facilities within 100 km of the Gold Coast, Queensland, Australia. Participants: Ambulatory, older (≥60 years) residential aged care adults without cognitive impairment. Measurements: Feasibility was assessed by consent rate, sleep/wear diary completion, and through interviews with staff/participants. Activity patterns (sitting/lying, standing, and stepping) were measured via activPAL3™ monitors worn continuously for seven days. Times spent in each activity were described and then compared across days of the week and hours of the day using linear mixed models. Results: Consent rate was 48% (n = 41). Activity patterns are described for the 31 participants (mean age 84.2 years) who provided at least one day of valid monitor data. In total, 14 (45%) completed the sleep/wear diary. Participants spent a median (interquartile range) of 12.4 (1.7) h sitting/lying (with 73% of this accumulated in unbroken bouts of ≥30 min), 1.9 (1.3) h standing, and 21.4 (36.7) min stepping during their monitored waking hours per day. Activity did not vary significantly by day of the week (p ≥ 0.05); stepping showed significant hourly variation (p = 0.018). Conclusions: Older adults in residential aged care were consistently highly sedentary. Feasibility considerations for objective activity monitoring identified for this population include poor diary completion and lost monitors. PMID:24304508
Kallmünzer, Bernd; Breuer, Lorenz; Hering, Christiane; Raaz-Schrauder, Dorette; Kollmar, Rainer; Huttner, Hagen B; Schwab, Stefan; Köhrmann, Martin
2012-04-01
Anticoagulation is a highly effective secondary prevention in patients with cardioembolic stroke and atrial fibrillation/flutter (AF). However, the condition remains underdiagnosed, because paroxysmal AF may be missed by diagnostic tests in the acute phase. In this study, the sensitivity of AF detection was assessed for serial electrocardiographic recordings and continuous stroke unit telemetric monitoring with or without a structured algorithm to analyze telemetric data (SEA-AF). Three hundred forty-six consecutive patients with acute ischemic stroke were prospectively included and subjected to standard telemetric monitoring. In addition, telemetric data were separately analyzed following SEA-AF, consisting of a structured evaluation of episodes with high risk for AF and a chronological beat-to-beat screening of the full registration. Serial electrocardiograms were conducted at 24-hour intervals. Median effective telemetry monitoring time was 75.5 hours (interquartile range 64-86 hours). Overall, AF was diagnosed in 119 of 346 patients (34.4%). The structured reading algorithm was the most sensitive method for detecting AF; conventional telemetry and serial electrocardiographic assessments were less effective. However, only 35% of patients with previously documented paroxysmal AF and a negative baseline electrocardiogram demonstrated AF episodes during monitoring. Continuous stroke unit telemetry using SEA-AF shows a significantly higher detection rate for AF compared with daily electrocardiographic assessments and standard telemetry without structured reading. The low overall probability of detecting paroxysmal AF with either method during the first days after stroke demonstrates the urgent need for complementary diagnostic strategies such as long-term monitoring and frequent follow-up assessments. Clinical Trial Registration: URL: www.clinicaltrials.gov. Unique identifier: NCT01177748.
Consumption of Caffeinated Products and Cardiac Ectopy.
Dixit, Shalini; Stein, Phyllis K; Dewland, Thomas A; Dukes, Jonathan W; Vittinghoff, Eric; Heckbert, Susan R; Marcus, Gregory M
2016-01-26
Premature cardiac contractions are associated with increased morbidity and mortality. Though experts associate premature atrial contractions (PACs) and premature ventricular contractions (PVCs) with caffeine, there are no data to support this relationship in the general population. As certain caffeinated products may have cardiovascular benefits, recommendations against them may be detrimental. We studied Cardiovascular Health Study participants with a baseline food frequency assessment, 24-hour ambulatory electrocardiography (Holter) monitoring, and without persistent atrial fibrillation. Frequencies of habitual coffee, tea, and chocolate consumption were assessed using a picture-sort food frequency survey. The main outcomes were PACs/hour and PVCs/hour. Among 1388 participants (46% male, mean age 72 years), 840 (61%) consumed ≥1 caffeinated product per day. The median numbers of PACs and PVCs per hour and interquartile ranges were 3 (1-12) and 1 (0-7), respectively. There were no differences in the number of PACs or PVCs per hour across levels of coffee, tea, and chocolate consumption. After adjustment for potential confounders, more frequent consumption of these products was not associated with ectopy. In examining combined dietary intake of coffee, tea, and chocolate as a continuous measure, no relationships were observed after multivariable adjustment: 0.48% fewer PACs/hour (95% CI -4.60 to 3.64) and 2.87% fewer PVCs/hour (95% CI -8.18 to 2.43) per 1-serving/week increase in consumption. In the largest study to evaluate dietary patterns and quantify cardiac ectopy using 24-hour Holter monitoring, we found no relationship between chronic consumption of caffeinated products and ectopy. © 2016 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.
Goker, Berna; Block, Joel A
2006-01-01
The risk of developing bilateral disease progressing to total hip arthroplasty (THA) among patients who undergo unilateral THA for non-traumatic avascular necrosis (AVN) remains poorly understood. An analysis of the time-course to contralateral THA, as well as the effects of underlying AVN risk factors, is presented. Forty-seven consecutive patients who underwent THA for AVN were evaluated. Peri-operative and annual post-operative antero-posterior pelvis radiographs were examined for evidence of contralateral involvement. Patient age, weight, height, underlying AVN risk factor(s), date of onset of contralateral hip pain if occurred, and date of contralateral THA if performed, were recorded. Bone scan, computerized tomography and magnetic resonance imaging data were utilized when available. Twenty-one patients (46.6%) underwent contralateral THA for AVN within a median of 9 months after the initial THA (range 0-93, interquartile range 28.5 months). The median follow-up for patients without contralateral THA was 75 months (range 3-109, interquartile range 69 months). Thirty-four patients had radiographic findings of contralateral AVN at study entry; 25 were symptomatic bilaterally at entry and 7 developed contralateral symptoms within a mean time of 12 months (median 10 months, interquartile range 12 months). None of the 13 patients who were free of radiographic evidence of contralateral AVN at study entry developed evidence of AVN during the follow-up. AVN associated with glucocorticoid use was more likely to manifest as bilateral disease than either idiopathic AVN or ethanol-associated AVN (P=0.02 and P=0.03 respectively). Radiographically-evident AVN in the contralateral hip at THA is unlikely to remain asymptomatic for a prolonged period of time. Conversely, asymptomatic contralateral hips without radiographic evidence of AVN are unlikely to develop clinically significant AVN.
Viani, Rolando M; Alvero, Carmelita; Fenton, Terry; Acosta, Edward P; Hazra, Rohan; Townley, Ellen; Steimers, Debra; Min, Sherene; Wiznia, Andrew
2015-11-01
To assess the pharmacokinetics (PK), safety, and efficacy of dolutegravir plus an optimized background regimen in HIV-infected, treatment-experienced adolescents. Adolescents older than 12 and younger than 18 years received weight-based fixed doses of dolutegravir at approximately 1.0 mg/kg once daily in a phase I/II, multicenter, open-label, 48-week study. Intensive PK evaluation was done at steady state after dolutegravir was added to a failing regimen or started at the end of a treatment interruption. Safety and HIV RNA and CD4 cell count assessments were performed through week 48. Twenty-three adolescents were enrolled and 22 (96%) completed the 48-week study visit. Median age and weight were 15 years and 52 kg, respectively. Median [interquartile range (IQR)] baseline CD4+ cell count was 466 cells/μL (297, 771). Median (IQR) baseline HIV-1 RNA was 4.3 log10 copies/mL (3.9, 4.6). The dolutegravir geometric mean area under the plasma concentration-time curve from administration to 24 hours after dosing (AUC0-24) and 24-hour postdose concentration (C24) were 46.0 μg·h/mL and 0.90 μg/mL, respectively, which were within the study targets based on adult PK ranges. Virologic success with an HIV RNA <400 copies/mL was achieved in 74% [95% confidence interval (CI): 52-90%] at week 48. Additionally, 61% (95% CI: 39-80%) had an HIV RNA <50 copies/mL at week 48. Median (IQR) gain in CD4 cell count at week 48 was 84 cells/μL (-81, 238). Dolutegravir was well tolerated, with no grade 4 adverse events, serious adverse events, or discontinuations because of serious adverse events. Dolutegravir achieved target PK exposures in adolescents. Dolutegravir was safe and well tolerated, providing good virologic efficacy through week 48.
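For readers unfamiliar with the PK endpoints, here is a minimal sketch of how an AUC0-24 and C24 might be computed from sparse concentration-time sampling with the linear trapezoidal rule. The sampling times and concentrations below are hypothetical placeholders, not study data.

```python
# Minimal sketch: AUC(0-24) by the linear trapezoidal rule from sparse samples.
import numpy as np

t = np.array([0, 1, 2, 4, 8, 12, 24.0])           # hours post-dose (hypothetical)
c = np.array([0.2, 2.8, 3.6, 2.9, 2.0, 1.5, 0.9])  # ug/mL (hypothetical)

auc_0_24 = np.trapz(c, t)   # area under the curve, ug*h/mL
c24 = c[-1]                 # trough concentration at 24 h
print(f"AUC(0-24) = {auc_0_24:.1f} ug*h/mL, C24 = {c24:.2f} ug/mL")
```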
Population Pharmacokinetics of Enoxaparin in Pediatric Patients.
Moffett, Brady S; Lee-Kim, YoungNa; Galati, Marianne; Mahoney, Donald; Shah, Mona D; Teruya, Jun; Yee, Donald
2018-02-01
There are no studies evaluating the pharmacokinetics of enoxaparin in the hospitalized pediatric patient population. To characterize the pharmacokinetics of enoxaparin in pediatric patients, a retrospective review of inpatients 1 to 18 years of age admitted to our institution who received enoxaparin with anti-factor Xa activity level monitoring was performed. Demographic variables, enoxaparin dosing, and anti-factor Xa activity levels were collected. Population pharmacokinetic analysis was performed with bootstrap analysis. Simulation (n = 10,000) was performed to determine the percentage who achieved targeted anti-Xa levels at various doses. A total of 853 patients (male 52.1%; median age = 12.2 years; interquartile range [IQR] = 4.6-15.8 years) received a mean enoxaparin dose of 0.86 ± 0.31 mg/kg/dose. A median of 3 (IQR = 1-5) anti-factor Xa levels were sampled at 4.4 ± 1.3 hours after a dose, with a mean anti-factor Xa level of 0.52 ± 0.23 U/mL. A 1-compartment model best fit the data; significant covariates included allometrically scaled weight, serum creatinine, and hematocrit on clearance, and platelets on volume of distribution. Simulations were run for patients both without and with reduced kidney function (creatinine clearance ≤30 mL/min/1.73 m²). A dose of 1 mg/kg every 12 hours had the highest probability (72.3%) of achieving an anti-Xa level within the target range (0.5-1 U/mL), whereas a dose reduction of ~30% achieved the same result in patients with reduced kidney function. Pediatric patients should initially be dosed at 1 mg/kg subcutaneously every 12 hours for treatment of thromboembolism, followed by anti-Xa activity monitoring. Dose reductions of ~30% are required for creatinine clearance ≤30 mL/min/1.73 m².
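A minimal sketch, under assumed illustrative parameters, of the kind of one-compartment model with first-order absorption and allometrically scaled clearance that underlies such a population PK simulation. The values of ka, CL, V, and bioavailability here are hypothetical placeholders, not the study's estimates, and the output is in mg/L (mapping to anti-Xa U/mL would need a separate link function).

```python
# Minimal sketch: one-compartment model, first-order absorption, one SC dose.
import numpy as np

def conc(t, dose_mg_kg, wt_kg, ka=0.7, cl=0.8, v=5.0, f=1.0):
    """Concentration (mg/L) at time t (h) after one subcutaneous dose.

    ka: absorption rate (1/h); cl: clearance (L/h) for a 70 kg reference,
    allometrically scaled; v: volume of distribution (L), linearly scaled.
    All parameter values are illustrative only.
    """
    cl_i = cl * (wt_kg / 70.0) ** 0.75   # allometric scaling of clearance
    v_i = v * (wt_kg / 70.0)             # linear scaling of volume
    ke = cl_i / v_i                      # elimination rate constant (1/h)
    dose = dose_mg_kg * wt_kg
    return f * dose * ka / (v_i * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

# e.g. a level drawn 4.4 h after a 1 mg/kg dose in a 40 kg child
print(f"{conc(4.4, 1.0, 40.0):.2f} mg/L")
```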
Branch-Elliman, Westyn; Stanislawski, Maggie; Strymish, Judith; Barón, Anna E; Gupta, Kalpana; Varosy, Paul D; Gold, Howard S; Ho, P Michael
2016-09-01
BACKGROUND Infections following cardiovascular implantable electronic device (CIED) procedures, including pacemaker and implantable cardioverter-defibrillators, are devastating and costly. Preimplantation prophylactic antimicrobials are effective for reducing postprocedural infections. However, routine postprocedural antimicrobials are not associated with improved outcomes, and they may be harmful. Thus, we sought to characterize antimicrobial use patterns following CIED procedures. DESIGN All patients who underwent CIED procedures from October 1, 2007 to September 30, 2013 and had procedural information entered into the VA Clinical Assessment Reporting and Tracking (CART) software program were included in this study. All antibiotic prescriptions lasting more than 24 hours following device implantation or revision were identified using pharmacy databases, and postprocedural antibiotic use lasting more than 24 hours was characterized. RESULTS In total, 3,712 CIED procedures were performed at 34 VA facilities on 3,570 patients with a mean age of 71.7 years (standard deviation [SD], 11.1 years), 98.4% of whom were male. Postprocedural antibiotics >24 hours were prescribed following 1,579 of 3,712 CIED procedures (42.5%). The median duration of therapy was 5 days (interquartile range [IQR], 3-7 days). The most commonly prescribed antibiotic was cephalexin (1,152 of 1,579; 72.9%), followed by doxycycline (118 of 1,579; 7.47%) and ciprofloxacin (93 of 1,579; 5.9%). Vancomycin was used in 73 of 1,579 prescriptions (4.62%). Among the highest quartile of procedural volume, prescribing practices varied considerably, ranging from 3.2% to 77.6%. CONCLUSIONS Nearly 1 in 2 patients received prolonged postprocedural antimicrobial therapy following CIED procedures, and the rate of postprocedural antimicrobial therapy use varied considerably by facility. Given the lack of demonstrated benefit of routine prolonged antimicrobial therapy following CIED procedures, antimicrobial use following cardiac device interventions may be a potential target for quality improvement programs and antimicrobial stewardship. Infect Control Hosp Epidemiol 2016;37:1005-1011.
Tasker, Robert C; Goodkin, Howard P; Fernández, Iván Sánchez; Chapman, Kevin E; Abend, Nicholas S; Arya, Ravindra; Brenton, James N; Carpenter, Jessica L; Gaillard, William D; Glauser, Tracy A; Goldstein, Joshua; Helseth, Ashley R; Jackson, Michele C; Kapur, Kush; Mikati, Mohamad A; Peariso, Katrina; Wainwright, Mark S; Wilfong, Angus A; Williams, Korwyn; Loddenkemper, Tobias
2016-01-01
Objective To describe pediatric patients with convulsive refractory status epilepticus (RSE) in whom there is intention to use an intravenous anesthetic for seizure control. Design Two-year prospective observational study evaluating patients (age range one month to 21 years) with RSE not responding to two antiepileptic drug classes and treated with continuous infusion of an anesthetic agent. Setting Nine pediatric hospitals in the United States. Patients In a cohort of 111 patients with RSE (median age 3.7 years, 50% male), 54 (49%) underwent continuous infusion of anesthetic treatment. Main Results The median (interquartile range, IQR) intensive care unit length of stay was 10 (3–20) days. Up to four 'cycles' of serial anesthetic therapy were used, and seizure termination was achieved in 94% by the second cycle. Seizure duration in controlled patients was 5.9 (1.9–34) hours for the first cycle, and longer when a second cycle was required (30 [4–120] hours, p=0.048). Midazolam was the most frequent first-line anesthetic agent (78%); pentobarbital was the most frequently used second-line agent after midazolam failure (82%). An electroencephalographic endpoint was used in over half of the patients; higher midazolam dosing was used with a burst-suppression endpoint. In midazolam non-responders, transition to a second agent occurred after a median of one day. Most patients (94%) experienced seizure termination with these two therapies. Conclusions Midazolam and pentobarbital remain the mainstay of continuous infusion therapy for RSE in pediatric patients. The majority of patients experience seizure termination within a median of 30 hours. These data have implications for the design and feasibility of future intervention trials. That is, testing a new anesthetic anticonvulsant after failure of both midazolam and pentobarbital is unlikely to be feasible in a pediatric study, whereas a decision to test an alternative to pentobarbital, after midazolam failure, may be possible in a multicenter multinational study. PMID:27500721
Girardis, Massimo; Busani, Stefano; Damiani, Elisa; Donati, Abele; Rinaldi, Laura; Marudi, Andrea; Morelli, Andrea; Antonelli, Massimo; Singer, Mervyn
2016-10-18
Despite suggestions of potential harm from unnecessary oxygen therapy, critically ill patients spend substantial periods in a hyperoxemic state. A strategy of controlled arterial oxygenation is thus rational but has not been validated in clinical practice. To assess whether a conservative protocol for oxygen supplementation could improve outcomes in patients admitted to intensive care units (ICUs). Oxygen-ICU was a single-center, open-label, randomized clinical trial conducted from March 2010 to October 2012 that included all adults admitted with an expected length of stay of 72 hours or longer to the medical-surgical ICU of Modena University Hospital, Italy. The originally planned sample size was 660 patients, but the study was stopped early due to difficulties in enrollment after inclusion of 480 patients. Patients were randomly assigned to receive oxygen therapy to maintain Pao2 between 70 and 100 mm Hg or arterial oxyhemoglobin saturation (Spo2) between 94% and 98% (conservative group) or, according to standard ICU practice, to allow Pao2 values up to 150 mm Hg or Spo2 values between 97% and 100% (conventional control group). The primary outcome was ICU mortality. Secondary outcomes included occurrence of new organ failure and infection 48 hours or more after ICU admission. A total of 434 patients (median age, 64 years; 188 [43.3%] women) received conventional (n = 218) or conservative (n = 216) oxygen therapy and were included in the modified intent-to-treat analysis. Daily time-weighted Pao2 averages during the ICU stay were significantly higher (P < .001) in the conventional group (median Pao2, 102 mm Hg [interquartile range, 88-116]) vs the conservative group (median Pao2, 87 mm Hg [interquartile range, 79-97]). Twenty-five patients in the conservative oxygen therapy group (11.6%) and 44 in the conventional oxygen therapy group (20.2%) died during their ICU stay (absolute risk reduction [ARR], 0.086 [95% CI, 0.017-0.150]; relative risk [RR], 0.57 [95% CI, 0.37-0.90]; P = .01). Occurrences were lower in the conservative oxygen therapy group for new shock episodes (ARR, 0.068 [95% CI, 0.020-0.120]; RR, 0.35 [95% CI, 0.16-0.75]; P = .006), liver failure (ARR, 0.046 [95% CI, 0.008-0.088]; RR, 0.29 [95% CI, 0.10-0.82]; P = .02), and new bloodstream infection (ARR, 0.05 [95% CI, 0.00-0.09]; RR, 0.50 [95% CI, 0.25-0.998]; P = .049). Among critically ill patients with an ICU length of stay of 72 hours or longer, a conservative protocol for oxygen therapy vs conventional therapy resulted in lower ICU mortality. These preliminary findings were based on unplanned early termination of the trial, and a larger multicenter trial is needed to evaluate the potential benefit of this approach. clinicaltrials.gov Identifier: NCT01319643.
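The headline effect sizes can be recomputed from the reported event counts (25/216 ICU deaths conservative vs 44/218 conventional); here is a minimal sketch using a standard Wald confidence interval on the log relative risk. Small rounding differences from the published CI are expected.

```python
# Minimal sketch: ARR and RR with a Wald 95% CI on the log(RR) scale,
# from the event counts quoted in the abstract.
import math

a, n1 = 25, 216   # deaths / total, conservative oxygen group
b, n2 = 44, 218   # deaths / total, conventional oxygen group

p1, p2 = a / n1, b / n2
arr = p2 - p1                              # absolute risk reduction
rr = p1 / p2                               # relative risk
se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)    # standard error of log(RR)
lo, hi = (math.exp(math.log(rr) + s * 1.96 * se) for s in (-1, 1))
print(f"ARR={arr:.3f}, RR={rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
# -> ARR=0.086, RR=0.57 (95% CI ~0.36-0.90), matching the abstract to rounding
```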
Sleep duration from infancy to adolescence: reference values and generational trends.
Iglowstein, Ivo; Jenni, Oskar G; Molinari, Luciano; Largo, Remo H
2003-02-01
The main purpose of the present study was to calculate percentile curves for total sleep duration per 24 hours and for nighttime and daytime sleep duration from early infancy to late adolescence, to illustrate the developmental course and age-specific variability of these variables among subjects. A total of 493 subjects from the Zurich Longitudinal Studies were followed using structured sleep-related questionnaires at 1, 3, 6, 9, 12, 18, and 24 months after birth and then at annual intervals until 16 years of age. Gaussian percentiles for ages 3 months to 16 years were calculated for total sleep duration (time in bed) and nighttime and daytime sleep duration. The mean sleep duration for ages 1 to 16 years was estimated by generalized additive models based on the loess smoother; a cohort effect also had to be included. The standard deviation (SD) was estimated from the loess-smoothed absolute residuals from the mean curve. For ages 3, 6, and 9 months, an alternative approach with a simple model linear in age was used. For age 1 month, empirical percentiles were calculated. Total sleep duration decreased from an average of 14.2 hours (SD: 1.9 hours) at 6 months of age to an average of 8.1 hours (SD: 0.8 hours) at 16 years of age. The variability showed the same declining trend: the interquartile range at 6 months after birth was 2.5 hours, whereas at 16 years of age it was only 1.0 hour. Total sleep duration decreased across the studied cohorts (1974-1993) because of increasingly later bedtimes but unchanged wake times across decades. Consolidation of nocturnal sleep occurred during the first 12 months after birth, with a decreasing trend of daytime sleep. This resulted in a small increase of nighttime sleep duration by 1 year of age (mean 11.0 ± 1.1 hours at 1 month to 11.7 ± 1.0 hours at 1 year of age). The most prominent decline in napping habits occurred between 1.5 years of age (96.4% of all children) and 4 years of age (35.4%). Percentile curves provide valuable information on the developmental course and age-specific variability of sleep duration for health care professionals who deal with sleep problems in pediatric practice.
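A minimal sketch of the Gaussian-percentile construction (percentile = mean + z·SD at each age). The age grid and the mean/SD values below are illustrative, loosely echoing the reported 14.2 (1.9) hours at 6 months and 8.1 (0.8) hours at 16 years, not the study's fitted loess curves.

```python
# Minimal sketch: Gaussian percentile curves from age-specific means and SDs.
import numpy as np
from scipy.stats import norm

ages = np.array([0.5, 1, 2, 4, 8, 12, 16.0])                     # years (illustrative)
mean_sleep = np.array([14.2, 13.9, 13.2, 11.8, 10.2, 9.0, 8.1])  # hours (illustrative)
sd_sleep = np.array([1.9, 1.8, 1.6, 1.3, 1.1, 0.9, 0.8])         # hours (illustrative)

for p in (2, 10, 25, 50, 75, 90, 98):
    curve = mean_sleep + norm.ppf(p / 100) * sd_sleep  # P-th percentile at each age
    print(f"P{p:02d}:", np.round(curve, 1))
```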
Balaji, Seshadri; Daga, Ankana; Bradley, David J; Etheridge, Susan P; Law, Ian H; Batra, Anjan S; Sanatani, Shubayan; Singh, Anoop K; Gajewski, Kelly K; Tsao, Sabrina; Singh, Harinder R; Tisma-Dupanovic, Svjetlana; Tateno, Shigeru; Takamuro, Motoki; Nakajima, Hiromichi; Roos-Hesselink, Jolien W; Shah, Maully
2014-08-01
The study objective was to determine whether the extracardiac conduit Fontan confers an arrhythmia advantage over the intracardiac lateral tunnel Fontan. This multicenter study of 1271 patients compared bradyarrhythmia (defined as need for pacing) and tachyarrhythmia (defined as needing antiarrhythmic therapy) between 602 patients undergoing the intracardiac Fontan and 669 patients undergoing the extracardiac Fontan. The median age at the time of the Fontan procedure was 2.1 years (interquartile range, 1.6-3.2 years) for the intracardiac group and 3.0 years (interquartile range, 2.4-3.9) for the extracardiac group (P < .0001). The median follow-up was 9.2 years (interquartile range, 5-12.8) for the intracardiac group and 4.7 years (interquartile range, 2.8-7.7) for the extracardiac group (P < .0001). Early postoperative (<30 days) bradyarrhythmia occurred in 24 patients (4%) in the intracardiac group and 73 patients (11%) in the extracardiac group (P < .0001). Early postoperative (<30 days) tachyarrhythmia occurred in 32 patients (5%) in the intracardiac group and 53 patients (8%) in the extracardiac group (P = not significant). Late (>30 days) bradyarrhythmia occurred in 105 patients (18%) in the intracardiac group and 63 patients (9%) in the extracardiac group (P < .0001). Late (>30 days) tachyarrhythmia occurred in 58 patients (10%) in the intracardiac group and 23 patients (3%) in the extracardiac group (P < .0001). By multivariate analysis factoring time since surgery, more patients in the extracardiac group had early bradycardia (odds ratio, 2.9; 95% confidence interval, 1.8-4.6), with no difference in early tachycardia, late bradycardia, or late tachycardia. Overall arrhythmia burden is similar between the 2 groups, but the extracardiac Fontan group had a higher incidence of early bradyarrhythmias. There was no difference in the incidence of late tachyarrhythmias over time between the 2 operations. Therefore, the type of Fontan performed should be based on factors other than an anticipated reduction in arrhythmia burden from the extracardiac conduit. Copyright © 2014 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
Acute heart failure: perspectives from a randomized trial and a simultaneous registry.
Ezekowitz, Justin A; Hu, Jia; Delgado, Diego; Hernandez, Adrian F; Kaul, Padma; Leader, Rolland; Proulx, Guy; Virani, Sean; White, Michel; Zieroth, Shelley; O'Connor, Christopher; Westerhout, Cynthia M; Armstrong, Paul W
2012-11-01
Randomized controlled trials (RCTs) are limited by their generalizability to the broader nontrial population. To provide a context for the Acute Study of Nesiritide in Decompensated Heart Failure (ASCEND-HF) trial, we designed a complementary registry to characterize the clinical characteristics, practice patterns, and in-hospital outcomes of acute heart failure patients. Eligible patients for the registry included those with a principal diagnosis of acute heart failure (ICD-9-CM 402 and 428; ICD-10 I50.x, I11.0, I13.0, I13.2) from 8 sites participating in ASCEND-HF (n=697 patients, 2007-2010). Baseline characteristics, treatments, and hospital outcomes from the registry were compared with those of ASCEND-HF RCT patients from 31 Canadian sites (n=465, 2007-2010). Patients in the registry were older, more likely to be female and to have chronic respiratory disease, and less likely to have diabetes mellitus; they had a similar incidence of ischemic HF and atrial fibrillation and similar B-type natriuretic peptide levels. Registry patients had higher systolic blood pressure (registry: median 132 mm Hg [interquartile range 115-151 mm Hg]; RCT: median 120 mm Hg [interquartile range 110-135 mm Hg]) and ejection fraction (registry: median 40% [interquartile range 27-58%]; RCT: median 29% [interquartile range 20-40%]) than RCT patients. Registry patients presented more often via ambulance and had a similar total length of stay as RCT patients. In-hospital mortality was significantly higher in the registry than in the RCT patients (9.3% versus 1.3%, P<0.001), and this remained after multivariable adjustment (odds ratio 6.6, 95% CI 2.6-16.8, P<0.001). Patients enrolled in a large RCT of acute heart failure differed significantly in clinical characteristics, treatments, and inpatient outcomes from contemporaneous patients participating in a registry. These results highlight the need for context for RCTs to evaluate generalizability of results and especially the need to improve clinical outcomes in acute heart failure. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00475852.
Avasare, Rupali S; Canetta, Pietro A; Bomback, Andrew S; Marasa, Maddalena; Caliskan, Yasar; Ozluk, Yasemin; Li, Yifu; Gharavi, Ali G; Appel, Gerald B
2018-03-07
C3 glomerulopathy is a form of complement-mediated GN. Immunosuppressive therapy may be beneficial in the treatment of C3 glomerulopathy. Mycophenolate mofetil is an attractive treatment option given its role in the treatment of other complement-mediated diseases and the results of the Spanish Group for the Study of Glomerular Diseases C3 Study. Here, we study the outcomes of patients with C3 glomerulopathy treated with steroids and mycophenolate mofetil. We conducted a retrospective chart review of patients in the C3 glomerulopathy registry at Columbia University and identified patients treated with mycophenolate mofetil for at least 3 months and followed for at least 1 year. We studied clinical, histologic, and genetic data for the whole group and compared those who achieved complete or partial remission (responders) with those who did not achieve remission (nonresponders). We compared remission with mycophenolate mofetil with remission with other immunosuppressive regimens. We identified 30 patients who met inclusion criteria. Median age was 25 years old (interquartile range, 18-36), median creatinine was 1.07 mg/dl (interquartile range, 0.79-1.69), and median proteinuria was 3200 mg/g creatinine (interquartile range, 1720-6759). The median follow-up time was 32 months (interquartile range, 21-68). Twenty (67%) patients were classified as responders. There were no significant differences in baseline characteristics between responders and nonresponders, although initial proteinuria was lower in responders (median 2468 mg/g creatinine) than in nonresponders (median 5000 mg/g creatinine), and soluble membrane attack complex levels were higher in responders. For those tapered off mycophenolate mofetil, the relapse rate was 50%. Genome-wide analysis of complement genes was performed, and in 12 patients we found 18 variants predicted to be damaging; none of these variants had previously been reported as pathogenic. Mycophenolate mofetil with steroids outperformed other immunosuppressive regimens. Among patients who tolerated mycophenolate mofetil, combination therapy with steroids induced remission in 67% of this cohort. Heavier proteinuria at the start of therapy and lower soluble membrane attack complex levels were associated with treatment resistance. Copyright © 2018 by the American Society of Nephrology.
Mattioli, Sandro; Ruffato, Alberto; Lugaresi, Marialuisa; Pilotti, Vladimiro; Aramini, Beatrice; D'Ovidio, Frank
2010-11-01
Outcomes of the Heller-Dor operation sometimes differ between studies, likely for technical reasons. We analyze the details of myotomy and fundoplication in relation to the results achieved over a 30-year single-center experience. From 1979-2008, a long esophagogastric myotomy and a partial anterior fundoplication to protect the surface of the myotomy were routinely performed with intraoperative manometry in 202 patients (97 men; median age, 55.5 years; interquartile range, 43.7-71 years) through a laparotomy and in 60 patients (24 men; median age, 46 years; interquartile range, 36.2-63 years) through a laparoscopy. The follow-up consisted of periodic interviews, endoscopy, and barium swallow, and a semiquantitative scale was used to grade results. Mortality was 1 of 202 in the laparotomy group and 0 of 60 in the laparoscopy group. Median follow-up was 96 months (interquartile range, 48-190.5 months) in the laparotomy group and 48 months (interquartile range, 27-69.5 months) in the laparoscopy group. At intraoperative manometry, complete abolition of the high-pressure zone was obtained in 100%. The Dor-related high-pressure zone length and mean pressure were 4.5 ± 0.4 cm and 13.3 ± 2.2 mm Hg in the laparotomy group and 4.5 ± 0.5 cm and 13.2 ± 2.2 mm Hg in the laparoscopy group (P = .75). In the laparotomy group, poor results (19/201 [9.5%]) were secondary to esophagitis in 15 (7.5%) of 201 patients (in 2 patients after 184 and 252 months, respectively) and to recurrent dysphagia in 4 (2%) of 201 patients, all with end-stage sigmoid achalasia. In the laparoscopy group, 2 (3.3%) of 60 had esophagitis. A long esophagogastric myotomy protected by Dor fundoplication cures or substantially reduces dysphagia in the great majority of patients affected by esophageal achalasia and effectively controls postoperative esophagitis. Intraoperative manometry is likely the key factor in achieving the reported results. Copyright © 2010 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
Adolescent knowledge and attitudes related to clinical trials.
Brown, Devin L; Cowdery, Joan E; Jones, Toni Stokes; Langford, Aisha; Gammage, Catherine; Jacobs, Teresa L
2015-06-01
Poor enrollment plagues most clinical trials. Furthermore, despite mandates to improve minority representation in clinical trial participation, little progress has been made. We investigated the knowledge and attitudes of adolescents related to clinical trials and made race/ethnicity comparisons in an attempt to identify a possible educational intervention target. Students aged 13-18 years in southeast Michigan were offered participation through a class at one high school or two academic summer enrichment programs that drew from multiple high schools (73% response). Questionnaires previously validated in adults were administered. Non-Hispanic whites were compared with minorities using Wilcoxon rank-sum tests. Of the 82 respondents, the median age was 16 years (interquartile range: 15-17 years); 22 (28%) were white, 41 (51%) were African American, 11 (14%) were multiracial, 2 (2%) were American Indian or Alaska Native, 1 (1%) was Asian, 3 (4%) were Native Hawaiian or other Pacific Islander, and 2 respondents did not report a race (but did report Hispanic ethnicity). Nine (12%) were Hispanic. Only 27 (33%) had ever heard of a clinical trial. On a scale from 1 (most receptive) to 5 (least receptive) for learning more about a clinical trial for a relevant medical condition, the median score was 2 (interquartile range: 1-3) and for participating in a clinical trial for a relevant medical condition was 2 (interquartile range: 2-3). Overall knowledge was poor, with a median of 46% (interquartile range: 23%-62%) of knowledge answers correct. Knowledge was reduced (p = 0.0006) and attitudes were more negative (p = 0.05) in minorities than non-Hispanic whites, while minorities also endorsed more substantial barriers to trial participation (p = 0.0002). Distrust was similar between minority students and non-Hispanic whites (p = 0.15), and self-efficacy was greater in non-Hispanic whites (p = 0.05). Educational interventions directed toward adolescents that address knowledge, attitudes, and distrust in order to improve clinical trial awareness and receptivity overall are needed and may represent a tool to address disparities in minority enrollment in clinical trials. © The Author(s) 2015.
Cranston, Ross D; Baker, Jonathan R; Siegel, Aaron; Brand, Rhonda M; Janocko, Laura; McGowan, Ian
2018-03-01
Imiquimod can be used to treat internal anal high-grade squamous intraepithelial lesions. In HIV-1-infected individuals there is a theoretical concern for increased HIV replication in anorectal tissue secondary to imiquimod-induced mucosal inflammation. The purpose of this study was to assess the local virologic, immunologic, and pathologic effects of imiquimod treatment in HIV-infected individuals. This was a pilot study at a single academic center, conducted at the University of Pittsburgh Anal Dysplasia Clinic. HIV-1-infected individuals with biopsy-confirmed internal anal high-grade squamous intraepithelial lesions were included. Imiquimod cream was prescribed for intra-anal use 3 times per week for 9 weeks. Anal human papillomavirus typing, anal and rectal tissue HIV-1 RNA and DNA quantification, cytokine gene expression, and anal histology were measured. Nine evaluable participants (1 participant was lost to follow-up) were all white men with a median age of 46 years (interquartile range = 12 y) and a median CD4 T-cell count of 480 cells per cubic millimeter (interquartile range = 835). All were taking antiretroviral therapy, and 7 of 9 had HIV-1 RNA <50 copies per milliliter. The median number of imiquimod doses used was 27.0 (interquartile range = 3.5), and there was a median of 11 days (interquartile range = 10 d) from last dose to assessment. There was no progression to cancer, no significant change in the number of human papillomavirus types detected, and no significant change in quantifiable cytokines or HIV-1 RNA or DNA levels in anal or rectal tissue. Seven (35%) of 20 high-grade lesions resolved to low-grade squamous intraepithelial lesions. The study was limited by the small number of participants and variable time to final assessment. Intra-anal imiquimod showed no evidence of immune activation or increased HIV-1 viral replication in anal and rectal tissue, and was confirmed to be efficacious for the treatment of intra-anal high-grade squamous intraepithelial lesions. See Video Abstract at http://links.lww.com/DCR/A498.
Chang, Andrew K; Bijur, Polly E; Munjal, Kevin G; John Gallagher, E
2014-03-01
The objective was to test the hypothesis that hydrocodone/acetaminophen (Vicodin [5/500]) provides more efficacious analgesia than codeine/acetaminophen (Tylenol #3 [30/300]) in patients discharged from the emergency department (ED). Both are currently Drug Enforcement Administration (DEA) Schedule III narcotics. This was a prospective, randomized, double-blind clinical trial of patients with acute extremity pain who were discharged home from the ED, comparing a 3-day supply of oral hydrocodone/acetaminophen (5 mg/500 mg) to oral codeine/acetaminophen (30 mg/300 mg). Pain was measured on a valid and reproducible verbal numeric rating scale (NRS) ranging from 0 to 10, and patients were contacted by telephone approximately 24 hours after being discharged. The primary outcome was the between-group difference in improvement in pain at 2 hours following the most recent ingestion of the study drug, relative to the time of phone contact after ED discharge. Secondary outcomes compared side-effect profiles and patient satisfaction. The median time from ED discharge to follow-up was 26 hours (interquartile range [IQR] = 24 to 39 hours). The mean NRS pain score before the most recent dose of pain medication after ED discharge was 7.6 NRS units for both groups. The mean decrease in pain score 2 hours after the pain medication was taken was 3.9 NRS units in the hydrocodone/acetaminophen group versus 3.5 NRS units in the codeine/acetaminophen group, a difference of 0.4 NRS units (95% confidence interval [CI] = -0.3 to 1.2 NRS units). No differences were found in side effects or patient satisfaction. Both medications decreased NRS pain scores by approximately 50%. However, oral hydrocodone/acetaminophen failed to provide clinically or statistically superior pain relief compared to oral codeine/acetaminophen when prescribed to patients discharged from the ED with acute extremity pain. Similarly, there were no clinically or statistically important differences in side-effect profiles or patient satisfaction. If the DEA reclassifies hydrocodone as a Schedule II narcotic, as recently recommended by its advisory board, our data suggest that codeine/acetaminophen may be a clinically reasonable Schedule III substitute for hydrocodone/acetaminophen at ED discharge. These findings should be regarded as tentative and require independent validation in similar and other acute pain models. © 2014 by the Society for Academic Emergency Medicine.
Retrospective Evaluation of the Effect of Heart Rate on Survival in Dogs with Atrial Fibrillation.
Pedro, B; Dukes-McEwan, J; Oyama, M A; Kraus, M S; Gelzer, A R
2018-01-01
Atrial fibrillation (AF) usually is associated with a rapid ventricular rate, and the optimal heart rate (HR) during AF is unknown. We hypothesized that heart rate affects survival in dogs with chronic AF. Forty-six dogs with AF and 24-hour ambulatory (Holter) recordings were evaluated in a retrospective study. Holter-derived HR variables were analyzed as follows: mean HR (meanHR, 24-hour average), minimum HR (minHR, 1-minute average), and maximum HR (maxHR, 1-minute average). Survival times were recorded from the time of presumed adequate rate control. The primary endpoint was all-cause mortality. Cox proportional hazards analysis identified variables independently associated with survival; Kaplan-Meier survival analysis estimated the median survival time of dogs with meanHR <125 bpm versus ≥125 bpm. All 46 dogs had structural heart disease; 31 of 46 had congestive heart failure (CHF), and 44 of 46 received antiarrhythmic drugs. Of 15 dogs with cardiac death, 14 had CHF. Median time to all-cause death was 524 days (interquartile range [IQR], 76-1,037 days). MeanHR was 125 bpm (range, 62-203 bpm), minHR was 82 bpm (range, 37-163 bpm), and maxHR was 217 bpm (range, 126-307 bpm); these were significantly correlated with all-cause and cardiac-related mortality. For every 10 bpm increase in meanHR, the risk of all-cause mortality increased by 35% (hazard ratio, 1.35; 95% CI, 1.17-1.55; P < 0.001). Median survival time of dogs with meanHR <125 bpm (n = 23) was significantly longer (1,037 days; range, 524 days to not reached) than that of dogs with meanHR ≥125 bpm (n = 23; 105 days; range, 67-267 days; P = 0.0012). MeanHR was independently associated with all-cause and cardiovascular mortality (P < 0.003). Holter-derived meanHR affects survival in dogs with AF: dogs with meanHR <125 bpm lived longer than those with meanHR ≥125 bpm. Copyright © 2017 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.
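A minimal sketch, assuming the Python lifelines package and a small fabricated dataset, of the Kaplan-Meier comparison reported here (meanHR <125 bpm versus ≥125 bpm); it is illustrative only, not the study's analysis.

```python
# Minimal sketch: Kaplan-Meier curves and a log-rank test by heart-rate group.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.DataFrame({
    "days": [1037, 524, 900, 760, 105, 67, 267, 150],  # follow-up time (fabricated)
    "died": [1, 1, 0, 0, 1, 1, 1, 0],                  # event indicator (fabricated)
    "high_hr": [0, 0, 0, 0, 1, 1, 1, 1],               # 1 if meanHR >= 125 bpm
})

kmf = KaplanMeierFitter()
for label, grp in df.groupby("high_hr"):
    kmf.fit(grp["days"], grp["died"], label=f"high_hr={label}")
    print(label, kmf.median_survival_time_)

res = logrank_test(
    df.loc[df.high_hr == 0, "days"], df.loc[df.high_hr == 1, "days"],
    df.loc[df.high_hr == 0, "died"], df.loc[df.high_hr == 1, "died"],
)
print("log-rank p =", res.p_value)
```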
Topping, Daniel B
2014-01-01
Anatomy educators are being tasked with delivering the same quantity and quality of material in the face of fewer classroom and laboratory hours. As a result, they have turned to computer-aided instruction (CAI) to supplement and augment curriculum delivery. Research on satisfaction with anatomy videos, a form of CAI, and their effect on examination performance continues to grow. The purpose of this study was to describe the usage, and the effect on examination scores, of a series of locally produced anatomy videos after an 11% curriculum reduction. First-year medical students (n = 40) were given access to the videos, and the prior year's students (n = 40) were used as historical controls. There was no significant difference in demographics between the two groups. The survey response rate was 85% (n = 34) in the experimental group. The students found the videos highly satisfying (median = 5 on a five-point Likert scale, interquartile range = 1) and used them on average 1.55 times/week (SD 0.77). Availability of the videos had a statistically significant effect (4% improvement) on the final laboratory examination (p = 0.039). This suggests that the videos were a well-received form of CAI that may be useful in bridging the gap created by a reduction in gross anatomy course contact hours. © 2013 American Association of Anatomists.
Adverse obstetric outcomes in women with previous cesarean for dystocia in second stage of labor.
Jastrow, Nicole; Demers, Suzanne; Gauthier, Robert J; Chaillet, Nils; Brassard, Normand; Bujold, Emmanuel
2013-03-01
To evaluate obstetric outcomes in women undergoing a trial of labor (TOL) after a previous cesarean for dystocia in the second stage of labor. A retrospective cohort study of women with one previous low transverse cesarean undergoing a first TOL was performed. Women with previous cesarean for dystocia in the first stage and those with previous dystocia in the second stage were compared with those with previous cesarean for nonrecurrent reasons (controls). Multivariable regression analyses were performed. Of 1655 women, those with previous dystocia in the second stage of labor (n = 204) were at greater risk than controls (n = 880) of operative delivery [odds ratio (OR): 1.5; 95% confidence interval (CI) 1.1 to 2.2], shoulder dystocia (OR: 2.9; 95% CI 1.1 to 8.0), and uterine rupture in the second stage of labor (OR: 4.9; 95% CI 1.1 to 23), especially in cases of fetal macrosomia (OR: 29.6; 95% CI 4.4 to 202). The median second-stage duration before uterine rupture was 2.5 hours (interquartile range: 1.5 to 3.2 hours) in these women. Previous cesarean for dystocia in the second stage of labor is associated with second-stage uterine rupture at the next delivery, especially in cases of suspected fetal macrosomia and a prolonged second stage of labor.
Szabo, Zoltan; dePaul, Vincent T.; Kraemer, Thomas F.; Parsa, Bahman
2005-01-01
Water in the unconfined Kirkwood-Cohansey aquifer system in the New Jersey Coastal Plain contains elevated concentrations (above 3 pCi/L (picocuries per liter)) of the alpha-particle-emitting radionuclide radium-224. Previously, water from the aquifer system had been found to contain radium-226 and radium-228. This observation is of concern because the previously undetected presence of radium-224 may pose an additional, quantifiable health risk that currently is not accounted for by the Maximum Contaminant Level (MCL) of 5 pCi/L for combined radium (the sum of radium-226 plus radium-228 is termed 'combined radium') in drinking water. Water samples were collected from a regional network of 88 wells for determination of concentrations of radium-224, radium-226, and radium-228; gross alpha-particle activity; and concentrations of major ions and selected trace elements. Both gamma and alpha spectroscopic techniques were used to determine concentrations of radium-224, which ranged from <0.5 to 16.8 pCi/L (median 2.1 pCi/L, interquartile range 1.2-3.7 pCi/L). Concentrations of radium-226 and radium-228 in the same samples ranged from <0.5 to 17.4 pCi/L (median 1.7 pCi/L, interquartile range 0.9-2.9 pCi/L) and <0.5 to 12.8 pCi/L (median 1.6 pCi/L, interquartile range, 0.9-2.6 pCi/L), respectively. Concentrations of radium-224 typically were greater than those of the other two radium radionuclides, as evidenced by the highest median, third quartile, and maximum concentrations, as well as the highest concentration among the three radium radioisotopes in 52 (59 percent) of the 88 samples. Concentrations of 5.0 to 5.5 pCi/L of radium-224 result in a gross alpha-particle activity of about 15 pCi/L (the MCL) 36 to 48 hours, respectively, after sample collection when ingrowth of radium-224 progeny radionuclides is considered, even with the unlikely assumption that no other alpha-particle-emitting radionuclide is present in the water. Concentrations of 3.4 to 3.7 pCi/L radium-224 result in a gross alpha-particle activity of 10 pCi/L 36 to 48 hours, respectively, after sample collection when ingrowth of Ra-224 progeny radionuclides is considered. In this latter case, it is possible that the summed alpha-particle activity from radium-226 present at a concentration less than or equal to 5 pCi/L (the MCL for combined radium) and from radium-224 present at a concentration about 3.4 pCi/L or greater may exceed the 15-pCi/L MCL for gross alpha-particle activity. In this study, gross alpha-particle activities were measured within 48 hours after sample collection and were found to exceed the MCL of 15 pCi/L in nearly half (43) of the 88 samples collected. The concentration of radium-224 exceeded that of radium-226 in 55 (62.5 percent) of the 88 samples. Concentrations of radium-224 correlate strongly with those of both radium-226 and radium-228 (Spearman correlation coefficients r=0.74 and 0.91, respectively). Concentrations of radium-224, radium-226, and radium-228 were greatest in the most acidic ground water. Concentrations of radium-224 and combined radium-226 and radium-228 in samples of ground water with pH less than 4.7 exceeded 5 pCi/L in 33 and 67 percent of the samples, respectively. Concentrations of radium-224, radium-226, and radium-228 (measured separately) were greatest in water from the southern part of the aquifer outcrop area. 
In water from the northern part of the aquifer system outcrop area, radium-224 concentrations were as high as 3.6 pCi/L, and concentrations of combined radium and gross alpha-particle activity in some samples exceeded their respective MCLs. The presence of gross alpha-particle activities greater than 15 pCi/L and combined radium-226 and radium-228 concentrations greater than 5 pCi/L in the southwestern part of the aquifer system outcrop area is common and had been documented before 1997. Results of this study confirm these earlier findings. In northeastern and southeastern parts of the aquifer
Normalization methods in time series of platelet function assays
Van Poucke, Sven; Zhang, Zhongheng; Roest, Mark; Vukicevic, Milan; Beran, Maud; Lauwereins, Bart; Zheng, Ming-Hua; Henskens, Yvonne; Lancé, Marcus; Marcus, Abraham
2016-01-01
Platelet function can be quantitatively assessed by specific assays such as light-transmission aggregometry, multiple-electrode aggregometry measuring the response to adenosine diphosphate (ADP), arachidonic acid, collagen, and thrombin-receptor activating peptide, and by viscoelastic tests such as rotational thromboelastometry (ROTEM). The task of extracting meaningful statistical and clinical information from high-dimensional data spaces in temporal multivariate clinical data represented as multivariate time series is complex. Building insightful visualizations for multivariate time series demands adequate usage of normalization techniques. In this article, various methods for data normalization (z-transformation, range transformation, proportion transformation, and interquartile range) are presented and visualized, and the most suitable approach for platelet function data series is discussed. Normalization was calculated per assay (test) for all time points and per time point for all tests. Interquartile range, range transformation, and z-transformation preserved the correlations calculated by the Spearman correlation test when normalized per assay (test) for all time points. When normalizing per time point for all tests, no correlation could be discerned from the charts, nor when using all data as one dataset for normalization. PMID:27428217
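A minimal sketch of the four normalizations named above, applied per assay across time points on a small fabricated matrix; switching the axis argument normalizes per time point instead. The data values are placeholders, not measurements from the article.

```python
# Minimal sketch: z-, range, proportion, and IQR normalization, column-wise
# (per assay across time points); use axis=1 to normalize per time point.
import numpy as np

x = np.array([[55.0, 62.0, 48.0],
              [60.0, 70.0, 50.0],
              [42.0, 58.0, 39.0],
              [47.0, 65.0, 44.0]])  # rows: time points, cols: assays (fabricated)

def z_transform(a, axis=0):
    return (a - a.mean(axis=axis, keepdims=True)) / a.std(axis=axis, keepdims=True)

def range_transform(a, axis=0):
    lo, hi = a.min(axis=axis, keepdims=True), a.max(axis=axis, keepdims=True)
    return (a - lo) / (hi - lo)

def proportion_transform(a, axis=0):
    return a / a.sum(axis=axis, keepdims=True)

def iqr_transform(a, axis=0):
    q1, q3 = np.percentile(a, [25, 75], axis=axis, keepdims=True)
    return (a - np.median(a, axis=axis, keepdims=True)) / (q3 - q1)

for f in (z_transform, range_transform, proportion_transform, iqr_transform):
    print(f.__name__, np.round(f(x), 2), sep="\n")
```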
Time to Reperfusion and Treatment Effect for Acute Ischemic Stroke: A Randomized Clinical Trial.
Fransen, Puck S S; Berkhemer, Olvert A; Lingsma, Hester F; Beumer, Debbie; van den Berg, Lucie A; Yoo, Albert J; Schonewille, Wouter J; Vos, Jan Albert; Nederkoorn, Paul J; Wermer, Marieke J H; van Walderveen, Marianne A A; Staals, Julie; Hofmeijer, Jeannette; van Oostayen, Jacques A; Lycklama À Nijeholt, Geert J; Boiten, Jelis; Brouwer, Patrick A; Emmer, Bart J; de Bruijn, Sebastiaan F; van Dijk, Lukas C; Kappelle, L Jaap; Lo, Rob H; van Dijk, Ewoud J; de Vries, Joost; de Kort, Paul L M; van den Berg, J S Peter; van Hasselt, Boudewijn A A M; Aerden, Leo A M; Dallinga, René J; Visser, Marieke C; Bot, Joseph C J; Vroomen, Patrick C; Eshghi, Omid; Schreuder, Tobien H C M L; Heijboer, Roel J J; Keizer, Koos; Tielbeek, Alexander V; den Hertog, Heleen M; Gerrits, Dick G; van den Berg-Vos, Renske M; Karas, Giorgos B; Steyerberg, Ewout W; Flach, H Zwenneke; Marquering, Henk A; Sprengers, Marieke E S; Jenniskens, Sjoerd F M; Beenen, Ludo F M; van den Berg, René; Koudstaal, Peter J; van Zwam, Wim H; Roos, Yvo B W E M; van Oostenbrugge, Robert J; Majoie, Charles B L M; van der Lugt, Aad; Dippel, Diederik W J
2016-02-01
Intra-arterial treatment (IAT) for acute ischemic stroke caused by intracranial arterial occlusion leads to improved functional outcome in patients treated within 6 hours after onset. The influence of treatment delay on treatment effect is not yet known. To evaluate the influence of time from stroke onset to the start of treatment and from stroke onset to reperfusion on the effect of IAT. The Multicenter Randomized Clinical Trial of Endovascular Treatment of Acute Ischemic Stroke in the Netherlands (MR CLEAN) was a multicenter, randomized, open-label clinical trial of IAT vs no IAT in 500 patients. The time to the start of treatment was defined as the time from onset of symptoms to groin puncture (TOG). The time from stroke onset to reperfusion (TOR) was defined as the time to reopening of the vessel occlusion or, in cases for which reperfusion was not achieved, to the end of the procedure. Data were collected from December 3, 2010, to June 3, 2014, and analyzed (intention to treat) from July 1, 2014, to September 19, 2015. The main outcome was the modified Rankin Scale (mRS) score for functional outcome (range, 0 [no symptoms] to 6 [death]). Multiple ordinal logistic regression analysis estimated the effect of treatment and tested for the interaction of time to randomization, TOG, and TOR with treatment. The effect of treatment as a risk difference on reaching independence (mRS score, 0-2) was computed as a function of TOG and TOR. Calculations were adjusted for age, National Institutes of Health Stroke Scale score, previous stroke, atrial fibrillation, diabetes mellitus, and intracranial arterial terminus occlusion. Among 500 patients (58% male; median age, 67 years), the median TOG was 260 (interquartile range [IQR], 210-311) minutes; median TOR, 340 (IQR, 274-395) minutes. There was a significant interaction between TOR and treatment (P = .04) but not between TOG and treatment (P = .26). The adjusted risk difference (95% CI) was 25.9% (8.3%-44.4%) when reperfusion was reached at 3 hours, 18.8% (6.6%-32.6%) at 4 hours, and 6.7% (0.4%-14.5%) at 6 hours. For every hour of reperfusion delay, the initially large benefit of IAT decreases; the absolute risk difference for a good outcome is reduced by 6% per hour of delay. Patients with acute ischemic stroke require immediate diagnostic workup and IAT in case of intracranial arterial vessel occlusion. trialregister.nl Identifier: NTR1804.
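To make the reported decay concrete, the sketch below fits a straight line through the three adjusted risk differences quoted above. This is a reading aid only, not the trial's actual multivariable ordinal logistic model.

```python
import numpy as np

hours = np.array([3, 4, 6])              # time from stroke onset to reperfusion
risk_diff = np.array([25.9, 18.8, 6.7])  # adjusted risk difference for mRS 0-2, in %

# Least-squares line through the three published points
slope, intercept = np.polyfit(hours, risk_diff, 1)
print(f"~{abs(slope):.1f} percentage points of benefit lost per hour of delay")
```

Run as written, this prints a loss of about 6.3 percentage points per hour, consistent with the roughly 6% per hour figure in the abstract.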
Moll, Joel; Krieger, Paul; Moreno-Walton, Lisa; Lee, Benjamin; Slaven, Ellen; James, Thea; Hill, Dustin; Podolsky, Susan; Corbin, Theodore; Heron, Sheryl L
2014-05-01
The Institute of Medicine, The Joint Commission, and the U.S. Department of Health and Human Services have all recently highlighted the need for cultural competency and provider education on lesbian, gay, bisexual, and transgender (LGBT) health. Forty percent of LGBT patients cite lack of provider education as a barrier to care. Only a few hours of medical school curriculum are devoted to LGBT education, and little is known about LGBT graduate medical education. The objective of this study was to perform a needs assessment to determine to what degree LGBT health is taught in emergency medicine (EM) residency programs and to determine whether program demographics affect inclusion of LGBT health topics. An anonymous survey link was sent to EM residency program directors (PDs) via the Council of Emergency Medicine Residency Directors listserv. The 12-item descriptive survey asked the number of actual and desired hours of instruction on LGBT health in the past year. Perceived barriers to LGBT health education and program demographics were also sought. There were 124 responses from a potential 160 programs (response rate 78%). Twenty-six percent of respondents reported having ever presented a lecture specific to LGBT health, and 33% had incorporated topics affecting LGBT health in the didactic curriculum. EM programs presented anywhere from 0 to 8 hours on LGBT health, averaging 45 minutes of instruction in the past year (median = 0 minutes, interquartile range [IQR] = 0 to 60 minutes), and PDs supported inclusion of anywhere from 0 to 10 hours of dedicated time on LGBT health, with an average of 2.2 hours (median = 2 hours, IQR = 1 to 3.5 hours) recommended. The majority of respondents had LGBT faculty (64.2%) and residents (56.2%) in their programs. The presence of LGBT faculty and previous LGBT education were associated with a greater number of desired hours on LGBT health. The majority of EM residency programs have not presented curricula specific to LGBT health, although PDs desire inclusion of these topics. Further curriculum development is needed to better serve LGBT patients. © 2014 by the Society for Academic Emergency Medicine.
Influence of Antiflatulent Dietary Advice on Intrafraction Motion for Prostate Cancer Radiotherapy
Lips, Irene M.; Kotte, Alexis N.T.J.; van Gils, Carla H.
Purpose: To evaluate the effect of antiflatulent dietary advice on intrafraction prostate motion in patients treated with intensity-modulated radiotherapy (IMRT) for prostate cancer. Methods and Materials: Between February 2002 and December 2009, 977 patients received five-beam IMRT for prostate cancer to a dose of 76 Gy in 35 fractions, combined with fiducial markers for position verification. In July 2008, the diet, consisting of dietary guidelines to obtain regular bowel movements and to reduce intestinal gas by avoiding certain foods and air swallowing, was introduced to reduce prostate motion. The intrafraction prostate movement was determined from the portal images of the first segment of all five beams. Clinically relevant intrafraction motion was defined as ≥50% of the fractions with an intrafraction motion outside a range of 3 mm. Results: A total of 739 patients were treated without the diet and 105 patients were treated after introduction of the diet. The median of the average intrafraction motion per patient was 2.53 mm (interquartile range, 2.2-3.0) without the diet and 3.00 mm (interquartile range, 2.4-3.5) with the diet (p < .0001). The percentage of patients with clinically relevant intrafraction motion increased significantly, from 19.1% without the diet to 42.9% with the diet (odds ratio, 3.18; 95% confidence interval, 2.07-4.88; p < .0001). Conclusions: The results of the present study suggest that antiflatulent dietary advice for patients undergoing IMRT for prostate cancer does not reduce the intrafraction movement of the prostate. Therefore, antiflatulent dietary advice is not recommended in clinical practice for this purpose.
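The definition of clinically relevant intrafraction motion above is directly computable per patient; here is a minimal sketch under that stated definition, with hypothetical per-fraction displacements standing in for the values the study derived from portal images.

```python
import numpy as np

def clinically_relevant(per_fraction_motion_mm, threshold_mm=3.0, min_share=0.5):
    """True when at least min_share of fractions show motion beyond threshold_mm."""
    outside = np.asarray(per_fraction_motion_mm) > threshold_mm
    return outside.mean() >= min_share

# Hypothetical displacements (mm) for one patient's 35 fractions
patient = np.random.default_rng(1).gamma(shape=4.0, scale=0.7, size=35)
print(clinically_relevant(patient))
```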
Aggarwal, Neil R; Brower, Roy G; Hager, David N; Thompson, B Taylor; Netzer, Giora; Shanholtz, Carl; Lagakos, Adrian; Checkley, William
2018-04-01
High fractions of inspired oxygen may augment lung damage and exacerbate lung injury in patients with acute respiratory distress syndrome. Participants enrolled in Acute Respiratory Distress Syndrome Network trials had a goal partial pressure of oxygen in arterial blood range of 55-80 mm Hg, yet the effect of oxygen exposure above this arterial oxygen tension range on clinical outcomes is unknown. We sought to determine if oxygen exposure that resulted in a partial pressure of oxygen in arterial blood above goal (>80 mm Hg) was associated with worse outcomes in patients with acute respiratory distress syndrome. We performed a longitudinal analysis of data collected in ten clinical trials conducted at Acute Respiratory Distress Syndrome Network hospitals between 1996 and 2013, which enrolled critically ill patients with acute respiratory distress syndrome. We defined above-goal oxygen exposure as the difference between the fraction of inspired oxygen and 0.5 whenever the fraction of inspired oxygen was above 0.5 and the partial pressure of oxygen in arterial blood was above 80 mm Hg. We then summed above-goal oxygen exposures over the first five days to calculate a cumulative above-goal oxygen exposure, and determined its effect on mortality prior to discharge home at 90 days. Among 2,994 participants (mean age, 51.3 yr; 54% male) with a study-entry partial pressure of oxygen in arterial blood/fraction of inspired oxygen ratio that met acute respiratory distress syndrome criteria, the average cumulative above-goal oxygen exposure was 0.24 fraction of inspired oxygen-days (interquartile range, 0-0.38). Participants with above-goal oxygen exposure were more likely to die (adjusted odds ratio per interquartile-range increase, 1.20; 95% CI, 1.11-1.31) and had fewer ventilator-free days (adjusted mean difference per interquartile-range increase, -0.83; 95% CI, -1.18 to -0.48) and fewer hospital-free days (adjusted mean difference per interquartile-range increase, -1.38; 95% CI, -2.09 to -0.68). We observed a dose-response relationship between cumulative above-goal oxygen exposure and worsened clinical outcomes for participants with mild, moderate, or severe acute respiratory distress syndrome, suggesting that the observed relationship is not primarily driven by severity of illness. Oxygen exposure resulting in arterial oxygen tensions above the protocol goal occurred frequently and was associated with worse clinical outcomes at all levels of acute respiratory distress syndrome severity.
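The exposure definition above translates directly into code. The sketch below assumes one (FiO2, PaO2) record per day for simplicity; the trials' actual sampling scheme may have differed.

```python
def cumulative_above_goal_exposure(daily_records):
    """daily_records: list of (fio2, pao2_mmHg) tuples for days 1-5.

    Sums (FiO2 - 0.5) over records where FiO2 > 0.5 and PaO2 > 80 mm Hg,
    yielding the cumulative above-goal exposure in FiO2-days.
    """
    return sum(fio2 - 0.5
               for fio2, pao2 in daily_records[:5]
               if fio2 > 0.5 and pao2 > 80)

# Days 1, 3, and 5 contribute 0.1 + 0.2 + 0.3 = 0.6 FiO2-days
print(cumulative_above_goal_exposure(
    [(0.6, 95), (0.5, 88), (0.7, 120), (0.4, 70), (0.8, 85)]))
```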
The Dynamics of Cardiovascular Biomarkers in non-Elite Marathon Runners.
Roca, Emma; Nescolarde, Lexa; Lupón, Josep; Barallat, Jaume; Januzzi, James L; Liu, Peter; Cruz Pastor, M; Bayes-Genis, Antoni
2017-04-01
The number of recreational/non-elite athletes participating in marathons is increasing, but data regarding the impact of endurance exercise on cardiovascular health are conflicting. This study evaluated 79 recreational athletes in the 2016 Barcelona Marathon (72% men; mean age 39 ± 6 years; 71% ≥35 years). Blood samples were collected at baseline (24-48 h before the race), immediately after the race (1-2 h after finishing), and 48 h post-race. Amino-terminal pro-B type natriuretic peptide (NT-proBNP, a marker of myocardial strain), ST2 (a marker of extracellular matrix remodeling and fibrosis, inflammation, and myocardial strain), and high-sensitivity troponin T (hs-TnT, a marker of myocyte stress/injury) were assayed. The median (interquartile range, IQR) training history was 7 (5-11) years, and the median (IQR) training volume was 6 (5-8) h/week. The median (IQR) race time (h:min:s) was 3:32:44 (3:18:50-3:51:46). Echocardiographic indices were within normal ranges. Immediately after the race, blood concentrations of the three cardiac biomarkers increased significantly, with 1.3-, 1.6-, and 16-fold increases in NT-proBNP, ST2, and hs-TnT, respectively. We found an inverse relationship between weekly training hours and the increase in ST2 (p = 0.007), and a direct relationship between race time and the increases in hs-TnT (p < 0.001) and ST2 (p = 0.05). Our findings indicate that preparation for and participation in marathon running may affect multiple pathways within the cardiovascular system. More data and long-term follow-up studies in non-elite and elite athletes are needed.
Mansfield, Robert T; Lin, Kimberly Y; Zaoutis, Theoklis; Mott, Antonio R; Mohamad, Zeinab; Luan, Xianqun; Kaufman, Beth D; Ravishankar, Chitra; Gaynor, J William; Shaddy, Robert E; Rossano, Joseph W
2015-07-01
The use of ventricular assist devices has increased dramatically in adult heart failure patients. However, the overall use, outcome, comorbidities, and resource utilization of ventricular assist devices in pediatric patients have not been well described. We sought to demonstrate that the use of ventricular assist devices in pediatric patients has increased over time and that mortality has decreased. A retrospective study of the Pediatric Health Information System database was performed for patients 20 years old or younger undergoing ventricular assist device placement from 2000 to 2010. Four hundred seventy-five pediatric patients were implanted with ventricular assist devices during the study period: 69 in 2000-2003 (era 1), 135 in 2004-2006 (era 2), and 271 in 2007-2010 (era 3). Median age at ventricular assist device implantation was 6.0 years (interquartile range, 0.5-13.8), and the proportion of children who were 1-12 years old increased from 29% in era 1 to 47% in era 3 (p = 0.002). The majority of patients had a diagnosis of cardiomyopathy; this increased from 52% in era 1 to 72% in era 3 (p = 0.003). Comorbidities included arrhythmias (48%), pulmonary hypertension (16%), acute renal failure (34%), cerebrovascular disease (28%), and sepsis/systemic inflammatory response syndrome (34%). Two hundred forty-seven patients (52%) underwent heart transplantation and 327 (69%) survived to hospital discharge. Hospital mortality decreased from 42% in era 1 to 25% in era 3 (p = 0.004). Median hospital length of stay increased (37 d [interquartile range, 12-64 d] in era 1 vs 69 d [interquartile range, 35-130] in era 3; p < 0.001) and median adjusted hospital charges increased ($630,630 [interquartile range, $227,052-$853,318] in era 1 vs $1,577,983 [interquartile range, $874,463-$2,280,435] in era 3; p < 0.001). Factors associated with increased mortality include age less than 1 year (odds ratio, 2.04; 95% CI, 1.01-3.83), acute renal failure (odds ratio, 2.1; 95% CI, 1.26-3.65), cerebrovascular disease (odds ratio, 2.1; 95% CI, 1.25-3.62), and extracorporeal membrane oxygenation (odds ratio, 3.16; 95% CI, 1.79-5.60). Ventricular assist device placement in era 3 (odds ratio, 0.3; 95% CI, 0.15-0.57) and a diagnosis of cardiomyopathy (odds ratio, 0.5; 95% CI, 0.32-0.84) were associated with decreased mortality. Large-volume centers had lower mortality (odds ratio, 0.55; 95% CI, 0.34-0.88), lower use of extracorporeal membrane oxygenation, and higher charges. The use of ventricular assist devices and survival after ventricular assist device placement in pediatric patients have increased over time, with a concomitant increase in resource utilization. Age under 1 year, certain noncardiac morbidities, and the use of extracorporeal membrane oxygenation are associated with worse outcomes. Lower mortality was seen at larger volume ventricular assist device centers.
Gener, G; Canoui-Poitrine, F; Revuz, J E; Faye, O; Poli, F; Gabison, G; Pouget, F; Viallette, C; Wolkenstein, P; Bastuji-Garin, S
2009-01-01
Antibiotics are frequently used to treat hidradenitis suppurativa (HS); however, few data on their efficacy are available. To evaluate the efficacy of a combination of systemic clindamycin (300 mg twice daily) and rifampicin (600 mg daily) in the treatment of patients with severe HS. Patients (n = 116) who received this combination were studied retrospectively. The main outcome measure was the severity of the disease, assessed by the Sartorius score, before and after 10 weeks of treatment. The Sartorius score dramatically improved at the end of treatment (median = 29, interquartile range = 14.5, vs. median = 14.5, interquartile range = 11; p < 0.001), as did other parameters of severity as well as the quality of life score. Eight patients (6.9%) stopped the treatment because of side effects. The combination of clindamycin and rifampicin is effective in the treatment of severe HS. Copyright 2009 S. Karger AG, Basel.
Geographic variability of adherence to occupational injury treatment guidelines.
Trujillo, Antonio J; Heins, Sara E; Anderson, Gerard F; Castillo, Renan C
2014-12-01
To determine the geographic variability in adherence to six occupational injury practice guidelines and the relationships among them. Guidelines were developed by an expert panel and evaluated using workers' compensation claims data from a large national insurance company (1999 to 2010). Percentage compliance with each guideline was adjusted for age and sex using linear regression and mapped by hospital referral region. Regions with the lowest compliance were identified, and correlations between guidelines were calculated. Compliance with the unnecessary-home-care guideline showed the lowest geographic variation (interquartile range: 97.3 to 99.0 percent), and compliance with the inappropriate-shoulder-bracing guideline showed the highest (interquartile range: 77.7 to 90.8 percent). Correlation between the guidelines was weak and not always positive. Different guidelines showed different degrees of geographic variation. The lack of correlation between guidelines suggests that these indicators were not associated with a single underlying health care quality or patient severity construct.
Turegun, Mikhail
2011-01-01
Traditional curricular materials and pedagogical strategies have not been effective in developing conceptual understanding of statistics topics and statistical reasoning abilities in students. Many of the changes proposed by statistics education research and the reform movement over the past decade have supported efforts to transform teaching…
Carreiro, Marina Pimenta; Lauria, Márcio W; Naves, Gabriel Nino T; Miranda, Paulo Augusto C; Leite, Ricardo Barsaglini; Rajão, Kamilla Maria Araújo Brandão; de Aguiar, Regina Amélia Lopes Pessoa; Nogueira, Anelise Impeliziere; Ribeiro-Oliveira, Antônio
2016-09-01
To study glucose profiles of gestational diabetes (GDM) patients with 72 h of continuous glucose monitoring (CGM) either before (GDM1) or after (GDM2) dietary counseling, comparing them with nondiabetic (NDM) controls. We performed CGM on 22 GDM patients, 11 before and 11 after dietary counseling, and compared them with 11 healthy controls. Several physiological and clinical characteristics of the glucose profiles were compared across the groups, including pooled 24-h measures and hourly median values, summary measures representing glucose exposure (area under the median curves) and variability (amplitude, standard deviation, interquartile range), and time points related to meals. Most women (81.8%) in the GDM groups had fasting glucose <95 mg/dL, suggesting mild GDM. Variability, glucose levels 1 and 2 h after breakfast and dinner, peak values after dinner, and glucose levels between breakfast and lunch were all significantly higher in GDM1 than NDM (P<0.05 for all comparisons). The GDM2 results were similar to NDM in all aforementioned comparisons (P>0.05). Both GDM groups spent more time with glucose levels above 140 mg/dL than the NDM group. No differences among the groups were found for pooled measurements, hourly comparisons, exposure, or nocturnal, fasting, between-lunch-and-dinner, before-meal, and after-lunch glucose levels (P>0.05 for all). The main differences between the mild GDM1 group and healthy controls were related to glucose variability and excursions above 140 mg/dL, while glucose exposure was similar. Glucose levels after breakfast and dinner also distinguished the GDM1 group. Dietary counseling was able to keep glucose at levels similar to those of healthy controls. © 2016 European Society of Endocrinology.
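The exposure and variability summaries compared above (area under the curve, amplitude, standard deviation, interquartile range, and time above 140 mg/dL) can all be computed from a single CGM trace, as in the sketch below; the synthetic trace and 5-minute sampling interval are assumptions for illustration.

```python
import numpy as np

def summarize(glucose, minutes_per_sample=5, threshold=140):
    """Exposure and variability summaries for one CGM trace (mg/dL)."""
    q1, q3 = np.percentile(glucose, [25, 75])
    dt_h = minutes_per_sample / 60
    return {
        # Trapezoidal area under the curve, in mg/dL x hours (exposure)
        "auc": ((glucose[:-1] + glucose[1:]) / 2).sum() * dt_h,
        "amplitude": glucose.max() - glucose.min(),   # variability measures
        "sd": glucose.std(ddof=1),
        "iqr": q3 - q1,
        "minutes_above_140": int((glucose > threshold).sum()) * minutes_per_sample,
    }

# Synthetic 24-h trace: 288 samples at 5-minute intervals
trace = 100 + 25 * np.sin(np.linspace(0, 6 * np.pi, 288))
print(summarize(trace))
```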
A national survey of emergency pharmacy practice in the United States.
Thomas, Michael C; Acquisto, Nicole M; Shirk, Mary Beth; Patanwala, Asad E
2016-03-15
The results of a survey to characterize pharmacy practice in emergency department (ED) settings are reported. An electronic survey was sent to all members of the American Society of Health-System Pharmacists' Emergency Medicine Connect group and the American College of Clinical Pharmacy's Emergency Medicine Practice and Research Network. Approximately 400 nontrainee pharmacy practitioners were invited to participate in the survey, which was open for 30 days. Descriptive statistics were used for all analyses. Two hundred thirty-three at least partially completed survey responses were received. After the removal of duplicate responses and null records, 187 survey responses were retained. The majority of respondents were from community hospitals (59.6%) or academic medical centers (36.1%). A pharmacist presence in the ED for more than eight hours per day on weekdays and weekends was commonly reported (68.7% of respondents), and 49.4% of institutions provided more than eight hours of coverage daily. Nearly one in three institutions (34.8%) provided no weekend ED staffing. The most frequently reported hours of coverage fell within the 1 p.m.-midnight time frame. The distribution of ED pharmacist activities, by category, was as follows (data are median reported time commitments): clinical, 25% (interquartile range [IQR], 15-40%); emergency response, 15% (IQR, 10-20%); order processing, 15% (IQR, 5-25%); medication reconciliation/history-taking, 10% (IQR, 5-25%); teaching, 10% (IQR, 5-15%); administrative, 5% (IQR, 3-10%); and scholarly endeavors, 0% (IQR, 0-5%). Pharmacists from academic and community EDs perform a variety of clinical, educational, and administrative activities. Copyright © 2016 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
Zanobetti, Antonella; Gold, Diane R.; Stone, Peter H.; Suh, Helen H.; Schwartz, Joel; Coull, Brent A.; Speizer, Frank E.
2010-01-01
Ambient particulate pollution and traffic have been linked to myocardial infarction and cardiac death risk. Possible mechanisms include autonomic cardiac dysfunction. In a repeated-measures study of 46 patients 43-75 years of age, we investigated associations of central-site ambient particulate pollution, including black carbon (BC) (a marker for regional and local traffic), and reported traffic exposure with changes in half-hourly averaged heart rate variability (HRV), a marker of autonomic function measured by 24-hr Holter electrocardiogram monitoring. Each patient was observed up to four times within 1 year after a percutaneous intervention for myocardial infarction, acute coronary syndrome without infarction, or stable coronary artery disease (4,955 half-hour observations). For each half-hour period, diary data defined whether the patient was home or not home, or in traffic. A decrease in high frequency (HF; an HRV marker of vagal tone) of -16.4% [95% confidence interval (CI), -20.7 to -11.8%] was associated with an interquartile-range (0.3-μg/m3) increase in the prior 5-day averaged ambient BC. Decreases in HF were independently associated both with the previous 2-hr averaged BC (-10.4%; 95% CI, -15.4 to -5.2%) and with being in traffic in the previous 2 hr (-38.5%; 95% CI, -57.4 to -11.1%). We also observed independent responses for particulate matter with aerodynamic diameter ≤ 2.5 μm and for gases (ozone or nitrogen dioxide). After hospitalization for coronary artery disease, both particulate pollution and being in traffic, a marker of stress and pollution, were associated with decreased HRV. PMID:20064780
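Effect estimates of this form (percent change in HF per interquartile-range increase in exposure) typically come from a model of log-transformed HF. The sketch below shows that conversion; the coefficient is back-solved from the -16.4% figure above purely for illustration and is not taken from the paper.

```python
import math

def pct_change_per_iqr(beta_per_unit, iqr):
    # Assumes a log-linear model: percent change = (exp(beta * IQR) - 1) * 100
    return (math.exp(beta_per_unit * iqr) - 1.0) * 100.0

# Hypothetical coefficient, chosen so the result reproduces -16.4% per 0.3 ug/m3
beta_bc = math.log(1 - 0.164) / 0.3
print(f"{pct_change_per_iqr(beta_bc, 0.3):+.1f}% change in HF per IQR of BC")
```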
Van den Heede, Koen; Sermeus, Walter; Diya, Luwis; Clarke, Sean P; Lesaffre, Emmanuel; Vleugels, Arthur; Aiken, Linda H
2009-07-01
Studies have linked nurse staffing levels (number and skill mix) to several nurse-sensitive patient outcomes. However, evidence from European countries has been limited. This study examines the association between nurse staffing levels (i.e., acuity-adjusted Nursing Hours per Patient Day and the proportion of registered nurses with a Bachelor's degree) and 10 different patient outcomes potentially sensitive to nursing care. Cross-sectional analyses were performed on linked data from the Belgian Nursing Minimum Dataset (general acute care and intensive care nursing units: n=1,403) and the Belgian Hospital Discharge Dataset (general, orthopedic and vascular surgery patients: n=260,923) of the year 2003 from all acute hospitals (n=115). Logistic regression analyses, estimated using a Generalized Estimating Equation model, were used to study the association between nurse staffing and patient outcomes. The mean acuity-adjusted Nursing Hours per Patient Day in Belgian hospitals was 2.62 (S.D.=0.29). The variability in patient outcome rates between hospitals is considerable: the interquartile ranges for the 10 patient outcomes span from 0.35 for deep venous thrombosis to 3.77 for failure-to-rescue. No significant association was found between the acuity-adjusted Nursing Hours per Patient Day or the proportion of registered nurses with a Bachelor's degree and the selected patient outcomes. The absence of associations between hospital-level nurse staffing measures and patient outcomes should not be taken to imply that nurse staffing has no impact on patient outcomes in Belgian hospitals. To better understand the dynamics of the nurse staffing and patient outcomes relationship in acute hospitals, further analyses (i.e., nursing-unit-level analyses) of these and other outcomes are recommended, in addition to the inclusion of other study variables, including data about nursing practice environments in hospitals.
Cubero Gómez, José M; Acosta Martínez, Juan; Mendias Benítez, Cristina; Díaz De La Llera, Luis S; Fernández-Quero, Mónica; Guisado Rasco, Agustí; Villa Gil-Ortega, Manuel; Sánchez González, Ángel
2015-12-01
Diabetic patients with an acute coronary syndrome undergoing percutaneous coronary intervention frequently exhibit high platelet reactivity while on clopidogrel. We hypothesized that in diabetic patients undergoing percutaneous coronary intervention who exhibit high platelet reactivity after standard treatment with clopidogrel, a 60-mg prasugrel loading dose is superior to standard treatment with clopidogrel for optimal P2Y12 inhibition within the first 24-36 h post-angioplasty. VERDI was a prospective, randomized, single-centre, single-blind, parallel-design study (NCT01684813). Consecutive diabetic patients with a non-ST-segment elevation acute coronary syndrome undergoing percutaneous coronary intervention and loaded with clopidogrel were considered for platelet reactivity assessment immediately before angioplasty with the VerifyNow assay, measured in P2Y12 reaction units (PRU). Fifty of 63 screened patients (79.4%) had high platelet reactivity (PRU ≥ 208) and were randomized to receive a 60-mg prasugrel loading dose (n = 25) or the clopidogrel standard dose (n = 25). Platelet function was assessed again 24 hours post-angioplasty. Prasugrel achieved greater platelet inhibition than clopidogrel 24 hours post-angioplasty (median [interquartile range], 38 [9-72] vs 285 [240-337] PRU; P < 0.001). The rate of patients without high platelet reactivity (PRU < 208) at 24 h post-angioplasty (primary end point) was higher with prasugrel: 25 patients (100%) in the prasugrel group achieved optimal antiaggregation vs 4 patients (16%) in the clopidogrel group (P < 0.001). No significant acute bleeding was documented in either group. Among type 2 diabetic patients with an acute coronary syndrome and high platelet reactivity undergoing percutaneous coronary intervention, switching from clopidogrel to prasugrel was superior to standard clopidogrel treatment for achieving optimal antiaggregation within the first 24 hours post-angioplasty.
Cumulative Probability and Time to Reintubation in U.S. ICUs.
Miltiades, Andrea N; Gershengorn, Hayley B; Hua, May; Kramer, Andrew A; Li, Guohua; Wunsch, Hannah
2017-05-01
Reintubation after liberation from mechanical ventilation is viewed as an adverse event in ICUs. We sought to describe the frequency of reintubation across U.S. ICUs and to propose a standard, appropriate time cutoff for the reporting of reintubation events. We conducted a cohort study using data from the Project IMPACT database of 185 diverse ICUs in the United States. We included patients who received mechanical ventilation and excluded patients who received a tracheostomy, had a do-not-resuscitate order placed, or died prior to first extubation. We assessed the percentage of extubated patients who were reintubated; the cumulative probability of reintubation, with death and do-not-resuscitate orders after extubation modeled as competing risks; and the time to reintubation. Among 98,367 patients who received mechanical ventilation without death or tracheostomy prior to extubation, 9,907 (10.1%) were reintubated, with a cumulative probability of 10.0%. Median time to reintubation was 15 hours (interquartile range, 2-45 hr). Of patients who required reintubation in the ICU, 90% did so within the first 96 hours after initial extubation; this was consistent across patient subtypes (from 89.3% for elective surgical patients to 94.8% for trauma patients) and ICU subtypes (from 88.6% for cardiothoracic ICUs to 93.5% for medical ICUs). The reintubation rate for ICU patients liberated from mechanical ventilation in U.S. ICUs is approximately 10%. We propose a time cutoff of 96 hours for reintubation definitions and benchmarking efforts, as it captures 90% of ICU reintubation events. Reintubation rates can then be reported as simple percentages, without regard for deaths or changes in goals of care that might occur.
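The proposed 96-hour cutoff amounts to asking what share of reintubation events a given window captures; a minimal sketch with hypothetical times from extubation to reintubation:

```python
import numpy as np

def share_captured(hours_to_reintubation, cutoff_h=96):
    """Fraction of reintubation events occurring within cutoff_h of extubation."""
    t = np.asarray(hours_to_reintubation)
    return (t <= cutoff_h).mean()

# Hypothetical event times; an exponential with scale 33 h has median ~23 h,
# loosely echoing the 15 h median reported above
times = np.random.default_rng(2).exponential(scale=33.0, size=1000)
for cutoff in (24, 48, 96):
    print(f"<= {cutoff:3d} h captures {share_captured(times, cutoff):.0%} of events")
```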
Cerebral Blood Flow Response to Hypercapnia in Children with Obstructive Sleep Apnea Syndrome.
Busch, David R; Lynch, Jennifer M; Winters, Madeline E; McCarthy, Ann L; Newland, John J; Ko, Tiffany; Cornaglia, Mary Anne; Radcliffe, Jerilynn; McDonough, Joseph M; Samuel, John; Matthews, Edward; Xiao, Rui; Yodh, Arjun G; Marcus, Carole L; Licht, Daniel J; Tapia, Ignacio E
2016-01-01
Children with obstructive sleep apnea syndrome (OSAS) often experience periods of hypercapnia during sleep, a potent stimulator of cerebral blood flow (CBF). Considering this hypercapnia exposure during sleep, it is possible that children with OSAS have abnormal CBF responses to hypercapnia even during wakefulness. Therefore, we hypothesized that children with OSAS have a blunted CBF response to hypercapnia during wakefulness, compared with snorers and controls. CBF changes during hypercapnic ventilatory response (HCVR) were tested in children with OSAS, snorers, and healthy controls using diffuse correlation spectroscopy (DCS). Peak CBF changes with respect to the pre-hypercapnic baseline were measured for each group. The study was conducted at an academic pediatric sleep center. Twelve children with OSAS (aged 10.1 ± 2.5 [mean ± standard deviation] y, obstructive apnea hypopnea index [AHI] = 9.4 [5.1-15.4] [median, interquartile range] events/hour), eight snorers (11 ± 3 y, 0.5 [0-1.3] events/hour), and 10 controls (11.4 ± 2.6 y, 0.3 [0.2-0.4] events/hour) were studied. The fractional CBF change during hypercapnia, normalized to the change in end-tidal carbon dioxide, was significantly higher in controls (9 ± 1.8 %/mmHg) compared with OSAS (7.1 ± 1.5, P = 0.023) and snorers (6.7 ± 1.9, P = 0.025). Children with OSAS and snorers have a blunted CBF response to hypercapnia during wakefulness compared with controls. Noninvasive DCS blood flow measurements of hypercapnic reactivity offer insights into the pathophysiology of OSAS in children, which could lead to further understanding of the central nervous system complications of OSAS. © 2016 Associated Professional Sleep Societies, LLC.
Saxena, Manoj K; Taylor, Colman; Billot, Laurent; Bompoint, Severine; Gowardman, John; Roberts, Jason A; Lipman, Jeffery; Myburgh, John
2015-01-01
Strategies to prevent pyrexia in patients with acute neurological injury may reduce secondary neuronal damage. The aim of this study was to determine the safety and efficacy of the routine administration of 6 g/day of intravenous paracetamol in reducing body temperature following severe traumatic brain injury, compared with placebo. This was a multicentre, randomised, blinded, placebo-controlled clinical trial in adult patients with traumatic brain injury (TBI). Patients were randomised to receive an intravenous infusion of either 1 g of paracetamol or 0.9% sodium chloride (saline) every 4 hours for 72 hours. The primary outcome was the mean difference in core temperature during the study intervention period. Forty-one patients were included in this study: 21 were allocated to paracetamol and 20 to saline. The median (interquartile range) number of doses of study drug was 18 (17-18) in the paracetamol group and 18 (16-18) in the saline group (P = 0.85). From randomisation until 4 hours after the last dose of study treatment, there were 2,798 temperature measurements (median 73 [67-76] per patient). The mean ± standard deviation temperature was 37.4 ± 0.5°C in the paracetamol group and 37.7 ± 0.4°C in the saline group (absolute difference -0.3°C; 95% confidence interval -0.6 to 0.0; P = 0.09). There were no significant differences in the use of physical cooling, or in episodes of hypotension or hepatic abnormalities, between the two groups. The routine administration of 6 g/day of intravenous paracetamol did not significantly reduce core body temperature in patients with TBI. Australian New Zealand Clinical Trials Registry ACTRN12609000444280.
Strickland, Matthew J; Klein, Mitchel; Flanders, W Dana; Chang, Howard H; Mulholland, James A; Tolbert, Paige E; Darrow, Lyndsey A
2016-01-01
Children may have differing susceptibility to ambient air pollution concentrations depending on various background characteristics. Using emergency department (ED) data linked with birth records from Atlanta, Georgia, we identified ED visits for asthma or wheeze among children aged 2-16 years from 1 January 2002 through 30 June 2010 (n=109,758). We stratified by preterm delivery, term low birth weight, maternal race, Medicaid status, maternal education, maternal smoking, delivery method, and history of a bronchiolitis ED visit. Population-weighted daily average concentrations were calculated for 1-hour maximum carbon monoxide and nitrogen dioxide; 8-hour maximum ozone; and 24-hour average particulate matter less than 10 microns in diameter, particulate matter less than 2.5 microns in diameter (PM2.5), and the PM2.5 components sulfate, nitrate, ammonium, elemental carbon, and organic carbon, using measurements from stationary monitors. Poisson time-series models were used to estimate rate ratios for associations between 3-day moving average pollutant concentrations and daily ED visit counts and to investigate effect-measure modification by the stratification factors. Associations between pollutant concentrations and asthma exacerbations were larger among children born preterm and among children born to African American mothers. Stratification by race and preterm status together suggested that both factors affected susceptibility. The largest estimated effect size (for an interquartile-range increase in pollution) was observed for ozone among preterm births to African American mothers: rate ratio=1.138 (95% confidence interval=1.077-1.203). In contrast, the rate ratio for the ozone association among full-term births to mothers of other races was 1.025 (0.970-1.083). These results support the hypothesis that children vary in their susceptibility to ambient air pollutants. PMID:25192402
Matthews, Lynn T; Ribaudo, Heather B; Kaida, Angela; Bennett, Kara; Musinguzi, Nicholas; Siedner, Mark J; Kabakyenga, Jerome; Hunt, Peter W; Martin, Jeffrey N; Boum, Yap; Haberer, Jessica E; Bangsberg, David R
2016-04-01
HIV-infected women risk sexual and perinatal HIV transmission during conception, pregnancy, childbirth, and breastfeeding. We compared HIV-1 RNA suppression and medication adherence across periconception, pregnancy, and postpartum periods among women on antiretroviral therapy (ART) in Uganda. We analyzed data from women in a prospective cohort study, aged 18-49 years, enrolled at ART initiation and with ≥1 pregnancy between 2005 and 2011. Participants were seen quarterly. The primary exposure of interest was pregnancy period: periconception (the 3 quarters before pregnancy), pregnancy, postpartum (the 6 months after pregnancy outcome), or nonpregnancy related. Regression models using generalized estimating equations compared the likelihood of HIV-1 RNA ≤400 copies per milliliter, of <80% average adherence based on electronic pill caps (medication event monitoring system), and of 72-hour medication gaps across each period. One hundred eleven women contributed 486 person-years of follow-up. Viral suppression was present at 89% of nonpregnancy, 97% of periconception, 93% of pregnancy, and 89% of postpartum visits, and was more likely during periconception (adjusted odds ratio, 2.15) compared with nonpregnant periods. Average ART adherence was 90% [interquartile range (IQR), 70%-98%], 93% (IQR, 82%-98%), 92% (IQR, 72%-98%), and 88% (IQR, 63%-97%) during nonpregnant, periconception, pregnant, and postpartum periods, respectively. Average adherence <80% was less likely during periconception (adjusted odds ratio, 0.68), and 72-hour gaps per 90 days were less frequent during periconception (adjusted relative risk, 0.72) and more frequent during the postpartum period (adjusted relative risk, 1.40). Women who experienced pregnancy were virologically suppressed at most visits, with an increased likelihood of suppression and high adherence during periconception follow-up. The increased frequency of 72-hour gaps suggests a need for increased adherence support during postpartum periods.
Zhang, Q; Lehmann, A; Rigda, R; Dent, J; Holloway, R H
2002-01-01
Background and aims: Transient lower oesophageal sphincter relaxations (TLOSRs) are the major cause of gastro-oesophageal reflux in normal subjects and in most patients with reflux disease. The gamma aminobutyric acid (GABA) receptor type B agonist, baclofen, is a potent inhibitor of TLOSRs in normal subjects. The aim of this study was to investigate the effect of baclofen on TLOSRs and postprandial gastro-oesophageal reflux in patients with reflux disease. Methods: In 20 patients with reflux disease, oesophageal motility and pH were measured, with patients in the sitting position, for three hours after a 3000 kJ mixed nutrient meal. On separate days at least one week apart, 40 mg oral baclofen or placebo was given 90 minutes before the meal. Results: Baclofen reduced the rate of TLOSRs by 40% from 15 (13.8–18.3) to 9 (5.8–13.3) per three hours (p<0.0002) and increased basal lower oesophageal sphincter pressure. Baclofen also significantly reduced the rate of reflux episodes by 43% from 7.0 (4.0–12.0) to 4.0 (1.5–9) per three hours (median (interquartile range); p<0.02). However, baclofen had no effect on oesophageal acid exposure (baclofen 4.9% (1.7–12.4) v placebo 5.0% (2.7–15.5)). Conclusions: In patients with reflux disease, the GABAB agonist baclofen significantly inhibits gastro-oesophageal reflux episodes by inhibition of TLOSRs. These findings suggest that GABAB agonists may be useful as therapeutic agents for the management of reflux disease. PMID:11772961
Fuster-Lluch, Oscar; Zapater-Hernández, Pedro; Gerónimo-Pardo, Manuel
2017-10-01
The pharmacokinetic profile of intravenous acetaminophen administered to critically ill multiple-trauma patients was studied after 4 consecutive doses of 1 g every 6 hours. Eleven blood samples were taken (predose and 15, 30, 45, 60, 90, 120, 180, 240, 300, and 360 minutes postdose), and urine was collected (during the 6-hour intervals between doses) to determine serum and urine acetaminophen concentrations. These were used to calculate the following pharmacokinetic parameters: maximum and minimum concentrations, terminal half-life, area under the serum concentration-time curve from 0 to 6 hours, mean residence time, volume of distribution, and serum and renal clearance of acetaminophen. Daily doses of acetaminophen required to obtain steady-state minimum (bolus dosing) and average plasma concentrations (continuous infusion) of 10 μg/mL were calculated (10 μg/mL is the presumed lower limit of the analgesic range). Data are expressed as median [interquartile range]. Twenty-two patients were studied, mostly young (age 44 [34-64] years), male (68%), and not obese (weight 78 [70-84] kg). Acetaminophen concentrations and pharmacokinetic parameters were as follows: maximum concentration 33.6 [25.7-38.7] μg/mL and minimum concentration 0.5 [0.2-2.3] μg/mL, with all minimum values below 10 μg/mL and 8 below the detection limit; half-life 1.2 [1.0-1.9] hours; area under the curve for 6 hours 34.7 [29.7-52.7] μg·h/mL; mean residence time 1.8 [1.3-2.6] hours; steady-state volume of distribution 50.8 [42.5-66.5] L; and serum and renal clearance 28.8 [18.9-33.7] L/h and 15 [11-19] mL/min, respectively. Theoretically, daily doses for a steady-state minimum concentration of 10 μg/mL would be 12.2 [7.8-16.4] g/day (166 [112-202] mg/[kg·day]); for a steady-state average concentration of 10 μg/mL, they would be 6.9 [4.5-8.1] g/day (91 [59-111] mg/[kg·day]). In conclusion, administration of acetaminophen at the recommended dosage of 1 g every 6 hours to critically ill multiple-trauma patients yields serum concentrations below 10 μg/mL because of increased elimination. To reach the 10 μg/mL target, and from a strictly pharmacokinetic point of view, continuous infusion may be more feasible than bolus dosing. Such a change in dosing strategy would require appropriate pharmacokinetic-pharmacodynamic and safety studies. © 2017, The American College of Clinical Pharmacology.
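The continuous-infusion figure above follows from the standard steady-state relation (daily dose = target average concentration x clearance x 24 h); the sketch below reproduces it from the reported median clearance. The bolus-dosing figure requires the full multiple-dose model and is not reproduced here.

```python
# Steady-state infusion: rate = Css,avg * CL, so daily dose = Css,avg * CL * 24
target_css = 10.0       # ug/mL, equivalent to mg/L; presumed lower analgesic limit
clearance_l_h = 28.8    # median serum clearance reported above, L/h

daily_dose_mg = target_css * clearance_l_h * 24
print(f"{daily_dose_mg / 1000:.1f} g/day")  # ~6.9 g/day, matching the abstract
```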
Fassier, Thomas; Darmon, Michel; Laplace, Christian; Chevret, Sylvie; Schlemmer, Benoit; Pochard, Frédéric; Azoulay, Elie
2007-01-01
Providing family members with clear, honest, and timely information is a major task for intensive care unit physicians. Time spent informing families has been associated with effectiveness of information but has not been measured in specifically designed studies. To measure time spent informing families of intensive care unit patients. One-day cross-sectional study in 90 intensive care units in France. Clocked time spent by physicians informing the families of each of 951 patients hospitalized in the intensive care unit during a 24-hr period. Median family information time was 16 (interquartile range, 8-30) mins per patient, with 20% of the time spent explaining the diagnosis, 20% on explaining treatments, and 60% on explaining the prognosis. One third of the time was spent listening to family members. Multivariable analysis identified one factor associated with less information time (room with more than one bed) and seven factors associated with more information time, including five patient-related factors (surgery on the study day, higher Logistic Organ Dysfunction score, coma, mechanical ventilation, and worsening clinical status) and two family-related factors (first contact with family and interview with the spouse). Median information time was 20 (interquartile range, 10-39) mins when three factors were present and 106.5 (interquartile range, 103-110) mins when five were present. This study identifies factors associated with information time provided by critical care physicians to family members of critically ill patients. Whether information time correlates with communication difficulties or communication skills needs to be evaluated. Information time provided by residents and nurses should be studied.
Videogame playing as distraction technique in course of venipuncture.
Minute, M; Badina, L; Cont, G; Montico, M; Ronfani, L; Barbi, E; Ventura, A
2012-01-01
Needle-related procedures (venipuncture, intravenous cannulation) are the most common source of pain and distress for children. Reducing needle-related pain and anxiety could be important in order to prevent further distress, especially for children needing multiple hospital admissions. The aim of the present open randomized controlled trial was to investigate the efficacy of adding an active distraction strategy (videogame) to EMLA premedication for needle-related pain in children. One hundred and nine children (4-10 years of age) were prospectively recruited to the study. Ninety-seven were randomized into two groups: a CC group (conventional care: EMLA only) as the control group and an AD group (active distraction: EMLA plus videogame) as the intervention group. Outcome measures were self-reported pain by means of the FPS-R scale (main study outcome), observer-reported pain by the FLACC scale, and the number of attempts for a successful procedure. In both groups the FPS-R median rating was 0 (interquartile range: 0-2), with significant pain (FPS-R > 4) reported by 9% of subjects. The FLACC median rating was 1 in both groups (interquartile range 0-3 in the CC group; 0-2 in the AD group). The percentage of children with major pain (FLACC > 4) was 18% in the CC group and 9% in the AD group (p = 0.2). The median number of attempts needed for a successful procedure was 1 (interquartile range 1-2) in both groups. Active distraction does not improve EMLA analgesia for intravenous cannulation and venipuncture. Nevertheless, it is an easily applicable strategy that was appreciated by the children. This technique could usefully be investigated in other painful procedures.
Foley, J
2008-03-01
To develop baseline data in relation to paediatric minor oral surgical procedures undertaken with either general anaesthesia or nitrous oxide inhalation sedation within a Hospital Dental Service. Data were collected prospectively over a three-year period from May 2003 to June 2006 for patients attending the Departments of Paediatric Dentistry, Dundee Dental Hospital and Ninewells Hospital, NHS Tayside, Great Britain, for all surgical procedures undertaken with either inhalation sedation or general anaesthesia. Both operator status and the procedure being undertaken were noted. In addition, the operating time was recorded. Data for 166 patients (F: 102; M: 64) with a median age of 12.50 (interquartile range 10.00, 14.20) years showed that 195 surgical procedures were undertaken, 160 with general anaesthesia and 35 with sedation. The surgical removal of impacted, carious, and supernumerary unit(s) accounted for 53.8% of all procedures, whilst the exposure of impacted teeth and soft tissue surgery represented 34.9% and 11.3% of procedures, respectively. The median surgical time for techniques undertaken with sedation was 30.00 (interquartile range 25.00, 43.50) minutes, whilst that for general anaesthesia was similar at 30.00 (interquartile range 15.25, 40.00) minutes; the difference was not statistically significant (Mann-Whitney U, W = 3081.5, P = 0.331). The majority of paediatric minor oral surgical procedures entail surgical exposure or removal of impacted teeth. The median treatment time for most procedures undertaken with either general anaesthesia or nitrous oxide sedation was 30 minutes.
Mehra, Tarun; Koljonen, Virve; Seifert, Burkhardt; Volbracht, Jörk; Giovanoli, Pietro; Plock, Jan; Moos, Rudolf Maria
2015-01-01
Reimbursement systems have difficulty depicting the actual cost of burn treatment, leaving care providers with a significant financial burden. Our aim was to establish a simple and accurate reimbursement model compatible with prospective payment systems. A total of 370,966 electronic medical records of patients discharged in 2012 to 2013 from Swiss university hospitals were reviewed. A total of 828 cases of burns, including 109 cases of severe burns, were retained. Costs, revenues and earnings for severe and nonsevere burns were analysed, and a linear regression model predicting total inpatient treatment costs was established. The median total cost per case for severe burns was tenfold higher than for nonsevere burns (179,949 CHF [167,353 EUR] vs 11,312 CHF [10,520 EUR]; interquartile ranges 96,782-328,618 CHF vs 4,874-27,783 CHF; p < 0.001). The median earnings per case for nonsevere burns were 588 CHF (547 EUR) (interquartile range -6,720 to 5,354 CHF), whereas severe burns incurred a large financial loss for care providers, with median earnings of -33,178 CHF (-30,856 EUR) (interquartile range -95,533 to 23,662 CHF). Differences were highly significant (p < 0.001). Our linear regression model predicting total costs per case with length of stay (LOS) as the independent variable had an adjusted R2 of 0.67 (p < 0.001 for LOS). Severe burns are systematically underfunded within the Swiss reimbursement system. Flat-rate DRG-based refunds poorly reflect the actual treatment costs. In conclusion, we suggest a reimbursement model based on a per diem rate for the treatment of severe burns.
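A linear regression of total cost on length of stay, as reported above, looks like the following sketch; the data are synthetic stand-ins, with slope, intercept, and noise chosen arbitrarily rather than estimated from the study.

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic cohort of 828 cases: skewed LOS and a noisy linear cost relation
los_days = rng.gamma(shape=2.0, scale=8.0, size=828)
cost_chf = 5_000 + 4_500 * los_days + rng.normal(0, 25_000, size=828)

# Ordinary least squares fit of cost on LOS
slope, intercept = np.polyfit(los_days, cost_chf, 1)
predicted = intercept + slope * los_days
r2 = 1 - ((cost_chf - predicted) ** 2).sum() / ((cost_chf - cost_chf.mean()) ** 2).sum()
print(f"cost ~ {intercept:,.0f} + {slope:,.0f} x LOS (R2 = {r2:.2f})")
```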
Radman, Monique; Mack, Ricardo; Barnoya, Joaquin; Castañeda, Aldo; Rosales, Monica; Azakie, Anthony; Mehta, Nilesh; Keller, Roberta; Datar, Sanjeev; Oishi, Peter; Fineman, Jeffrey
2014-01-01
The objective of this study was to determine the association between preoperative nutritional status and postoperative outcomes in children undergoing surgery for congenital heart defects (CHD). Seventy-one patients with CHD were enrolled in a prospective, 2-center cohort study. We adjusted for baseline risk differences using a standardized risk adjustment score for surgery for CHD. We assigned a World Health Organization z score for each subject's preoperative triceps skin-fold measurement, an assessment of total body fat mass. We obtained preoperative plasma concentrations of markers of nutritional status (prealbumin, albumin) and myocardial stress (B-type natriuretic peptide [BNP]). Associations between indices of preoperative nutritional status and clinical outcomes were sought. Subjects had a median (interquartile range [IQR]) age of 10.2 (33) months. In the University of California at San Francisco (UCSF) cohort, duration of mechanical ventilation (median, 19 hours; IQR, 29 hours), length of intensive care unit stay (median, 5 days; IQR 5 days), duration of any continuous inotropic infusion (median, 66 hours; IQR 72 hours), and preoperative BNP levels (median, 30 pg/mL; IQR, 75 pg/mL) were associated with a lower preoperative triceps skin-fold z score (P < .05). Longer duration of any continuous inotropic infusion and higher preoperative BNP levels were also associated with lower preoperative prealbumin (12.1 ± 0.5 mg/dL) and albumin (3.2 ± 0.1; P < .05) levels. Lower total body fat mass and acute and chronic malnourishment are associated with worse clinical outcomes in children undergoing surgery for CHD at UCSF, a resource-abundant institution. There is an inverse correlation between total body fat mass and BNP levels. Duration of inotropic support and BNP increase concomitantly as measures of nutritional status decrease, supporting the hypothesis that malnourishment is associated with decreased myocardial function. Copyright © 2014 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
Capraro, Geoffrey A; Mader, Timothy J; Coughlin, Bret F; Lovewell, Carolanne; St Louis, Myron R L; Tirabassi, Michael; Wadie, George; Smithline, Howard A
2007-04-01
To assess whether near-infrared spectroscopy can detect testicular hypoxia in a sheep model of testicular torsion within 6 hours of experimental torsion. This was a randomized, controlled, nonblinded study. Trans-scrotal near-infrared spectroscopy-derived testicular tissue oxygen saturation values were obtained from the posterior hemiscrota of 6 anesthetized sheep at baseline and every 15 minutes for 6 hours after either an experimental-side 720-degree unilateral medial testicular torsion with orchidopexy or a control-side sham procedure with orchidopexy, and then for 75 minutes after reduction of torsion and pexy. Color Doppler ultrasonography was performed every 30 minutes to confirm loss of vascular flow on the experimental side, return of flow after torsion reduction, and preserved flow on the control side. Near-infrared spectroscopy detected a prompt, sustained reduction in testicular tissue oxygen saturation after experimental torsion. Further, it documented a rapid return of these values to pretorsion levels after reduction of torsion. Experimental-side testicular tissue oxygen saturation fell from a median value of 59% (interquartile range [IQR] 57% to 69%) at baseline to 14% (IQR 11% to 29%) at 2.5 hours of torsion, and postreduction values were approximately 70%. Control-side testicular tissue oxygen saturation values increased from a median value of 67% (IQR 59% to 68%) at baseline to 77% (IQR 77% to 94%) at 2.5 hours and remained at approximately 80% for the entire protocol. The difference in median testicular tissue oxygen saturation between the experimental and control sides, using the Friedman test, was found to be significant (P=.017). This study demonstrates the feasibility, in a sheep model, of using near-infrared spectroscopy for the noninvasive diagnosis of testicular torsion and for quantification of reperfusion after torsion reduction. The applicability of these findings, from an animal model using complete torsion, to the clinical setting remains to be established.
Liquid plasma use during "super" massive transfusion protocol.
Allen, Casey J; Shariatmadar, Sherry; Meizoso, Jonathan P; Hanna, Mena M; Mora, Jose L; Ray, Juliet J; Namias, Nicholas; Dudaryk, Roman; Proctor, Kenneth G
2015-12-01
A massive transfusion protocol (MTP) presents a logistical challenge for most blood banks and trauma centers. We compare the ratio of packed red blood cells (PRBC) and plasma transfused over serial time points in those requiring MTP (10-30 U PRBC/24 h) to those requiring "super" MTP (S-MTP; >30 U PRBC/24 h) and test the hypothesis that changes in allocation of blood products with use of readily transfusable liquid plasma (LP) improves the ratio of PRBC and plasma during S-MTP. All transfused trauma patients (n = 1305) from January 01, 2009-April, 03, 2015 were reviewed. PRBC:plasma ratio was compared for MTP (n = 277) and S-MTP (n = 61) patients, before and after the availability of LP at our institution. Data are reported as mean ± standard deviation or median (interquartile range). Age was 41 ± 19 y, 52% blunt mechanism, injury severity score 32 ± 16, and 46.3% mortality. In 24 h, requirements were 17 (14) U PRBC and 10 (11) U plasma, with a PRBC:plasma of 1.6 (0.8). Within the first hour, PRBC:plasma for S-MTP versus MTP was 2.1:1 versus 1.7:1 (P = 0.017). With LP, S-MTP patients received significantly lower PRBC:plasma at the first hour (P < 0.001). Before institutional changes, PRBC:plasma positively correlated with PRBC transfused at hour 1 (r = 0.410, R(2) = 0.168, P < 0.001); after institutional changes and the advent of LP, there was no correlation (r = 0.177, R(2) = 0.031, P = 0.219). Within the first hour of transfusion, units of PRBC transfused positively correlated with PRBC:plasma, and patients receiving S-MTP had higher PRBC:plasma than those receiving MTP. Changes in our institution's MTP protocol to include LP improved the early PRBC:plasma transfused in patients requiring S-MTP. Copyright © 2015 Elsevier Inc. All rights reserved.
Kawai, Yu; Weatherhead, Jeffrey R; Traube, Chani; Owens, Tonie A; Shaw, Brenda E; Fraser, Erin J; Scott, Annette M; Wojczynski, Melody R; Slaman, Kristen L; Cassidy, Patty M; Baker, Laura A; Shellhaas, Renee A; Dahmer, Mary K; Shever, Leah L; Malas, Nasuh M; Niedner, Matthew F
2017-01-01
Noise pollution in pediatric intensive care units (PICUs) contributes to poor sleep and may increase the risk of developing delirium. The Environmental Protection Agency (EPA) recommends <45 decibels (dB) in hospital environments. The objectives were to assess the degree of PICU noise pollution, to develop a delirium bundle targeted at reducing noise, and to assess the effect of the bundle on nocturnal noise pollution. This was a quality improvement initiative at an academic PICU. Thirty-five sound sensors were installed in patient bed spaces, hallways, and common areas. The pediatric delirium bundle was implemented in 8 pilot patients (40 patient ICU days) while 108 non-pilot patients received usual care over a 28-day period. A total of 20,609 hourly dB readings were collected. Hourly minimum, average, and maximum dB levels across all occupied bed spaces had medians [interquartile range] of 48.0 [39.0-53.0], 52.8 [48.1-56.2], and 67.0 [63.5-70.5] dB, respectively. Bed spaces were louder during the day (10 AM to 4 PM) than at night (11 PM to 5 AM) (53.5 [49.0-56.8] vs 51.3 [46.0-55.3] dB, P < 0.01). Pilot patient rooms were significantly quieter than non-pilot patient rooms at night (45.3 [39.7-55.9] dB, n=210, vs 51.2 [46.9-54.8] dB, n=1,841; P < 0.01). The pilot rooms compliant with the bundle had the lowest hourly nighttime average dB (44.1 [38.5-55.5]). Substantial noise pollution exists in our PICU; use of the pediatric delirium bundle led to a significant noise reduction that can be perceived as roughly half the loudness, with hourly nighttime average levels meeting the EPA standard when the bundle was followed.
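The "half the loudness" reading rests on the psychoacoustic rule of thumb that a 10 dB drop roughly halves perceived loudness; applying that rule to the nighttime medians above gives about 0.6 times the loudness, in the same ballpark. A minimal sketch:

```python
def perceived_loudness_ratio(delta_db):
    # Rule of thumb: perceived loudness roughly doubles/halves per +/-10 dB
    return 2.0 ** (delta_db / 10.0)

usual_care, bundle = 51.2, 44.1   # median nighttime dB, non-pilot vs pilot rooms
ratio = perceived_loudness_ratio(bundle - usual_care)
print(f"{ratio:.2f}x perceived loudness (~{1 - ratio:.0%} quieter)")
```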
Lewin, Antoine; Buteau, Stéphane; Brand, Allan; Kosatsky, Tom; Smargiassi, Audrey
2013-01-01
Few studies have measured the effect of short-term exposure to industrial emissions on the respiratory health of children. Here we estimate the risk of hospitalization for asthma and bronchiolitis in young children associated with their recent exposure to emissions from an aluminum smelter. We used a case-crossover design to assess the risk of hospitalization, February 1999-December 2008, in relation to short-term variation in levels of exposure among children 0-4 years old living less than 7.5 km from the smelter. The percentage of hours per day that the residence of a hospitalized child was in the shadow of winds crossing the smelter was used to estimate the effect of wind-borne emissions on case and crossover days. Community-wide pollutant exposure was estimated through daily mean and daily maximum SO2 and PM2.5 concentrations measured at a fixed monitoring site near the smelter. Odds ratios (OR) were estimated using conditional logistic regressions. The risk of same-day hospitalization for asthma or bronchiolitis increased with the percentage of hours in a day that a child's residence was downwind of the smelter. For children aged 2-4 years, the OR was 1.27 (95% CI=1.03-1.56; n=103 hospitalizations), for an interquartile range (IQR) of 21% of hours being downwind. In this age group, the OR with PM2.5 daily mean levels was slightly smaller than with the hours downwind (OR: 1.22 for an IQR of 15.7 μg/m³, 95% CI=1.03-1.44; n=94 hospitalizations). Trends were observed between hospitalizations and levels of SO2 for children 2-4 years old. Increasing short-term exposure to emissions from a Quebec aluminum smelter was associated with an increased risk of hospitalization for asthma and bronchiolitis in young children who live nearby. Estimating exposure through records of wind direction allows for the integration of exposure to all pollutants carried from the smelter stack.
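The IQR-scaled odds ratios quoted above come from rescaling a regression coefficient: OR per IQR = exp(beta × IQR). A short sketch, with the per-unit coefficient back-calculated from the published OR purely for illustration:

```python
# Rescaling a conditional-logistic coefficient to the IQR increments reported
# above (e.g., OR 1.22 per 15.7 ug/m^3 of PM2.5). The coefficient is
# back-calculated from the abstract's OR for illustration only.
import math

def or_per_iqr(beta_per_unit: float, iqr: float) -> float:
    """Odds ratio for an IQR-sized increase in exposure."""
    return math.exp(beta_per_unit * iqr)

beta_pm25 = math.log(1.22) / 15.7     # per ug/m^3, implied by the abstract
print(f"OR per IQR (15.7 ug/m^3): {or_per_iqr(beta_pm25, 15.7):.2f}")
```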
Podany, Anthony T.; Bao, Yajing; Swindells, Susan; Chaisson, Richard E.; Andersen, Janet W.; Mwelase, Thando; Supparatpinyo, Khuanchai; Mohapi, Lerato; Gupta, Amita; Benson, Constance A.; Kim, Peter; Fletcher, Courtney V.
2015-01-01
Background. Concomitant use of rifamycins to treat or prevent tuberculosis can result in subtherapeutic concentrations of antiretroviral drugs. We studied the interaction of efavirenz with daily rifapentine and isoniazid in human immunodeficiency virus (HIV)-infected individuals receiving a 4-week regimen to prevent tuberculosis. Methods. Participants receiving daily rifapentine and isoniazid with efavirenz had pharmacokinetic evaluations at baseline and weeks 2 and 4 of concomitant therapy. Efavirenz apparent oral clearance was estimated and the geometric mean ratio (GMR) of values before and during rifapentine and isoniazid was calculated. HIV type 1 (HIV-1) RNA was measured at baseline and week 8. Results. Eighty-seven participants were evaluable: 54% were female, and the median age was 35 years (interquartile range [IQR], 29–44 years). Numbers of participants with efavirenz concentrations ≥1 mg/L were 85 (98%) at week 0; 81 (93%) at week 2; 78 (90%) at week 4; and 75 (86%) at weeks 2 and 4. Median efavirenz apparent oral clearance was 9.3 L/hour (IQR, 6.42–13.22 L/hour) at baseline and 9.8 L/hour (IQR, 7.04–15.59 L/hour) during rifapentine/isoniazid treatment (GMR, 1.04 [90% confidence interval, .97–1.13]). Seventy-nine of 85 (93%) participants had undetectable HIV-1 RNA (<40 copies/mL) at entry; 71 of 75 (95%) participants had undetectable HIV-1 RNA at week 8. Two participants with undetectable HIV-1 RNA at study entry had detectable HIV-1 RNA (43 and 47 copies/mL) at week 8. Conclusions. The proportion of participants with midinterval efavirenz concentrations ≥1 mg/L did not fall below the prespecified threshold of 80%, and virologic suppression was maintained. Four weeks of daily rifapentine plus isoniazid can be coadministered with efavirenz without clinically meaningful reductions in efavirenz mid-dosing concentrations or virologic suppression. Clinical Trials Registration. NCT01404312. PMID:26082504
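A sketch of the standard GMR-with-CI computation behind a figure like "GMR, 1.04 (90% CI, .97-1.13)": the paired clearances are log-transformed, and the CI comes from a t-interval on the mean log-ratio. Values here are simulated, not trial data:

```python
# Geometric mean ratio (GMR) of a PK parameter with a 90% CI from paired,
# log-transformed values. Clearances are simulated, not trial data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
cl_baseline = rng.lognormal(mean=np.log(9.3), sigma=0.35, size=87)
cl_on_rpt = cl_baseline * rng.lognormal(mean=np.log(1.04), sigma=0.15, size=87)

log_ratio = np.log(cl_on_rpt / cl_baseline)
n = len(log_ratio)
se = log_ratio.std(ddof=1) / np.sqrt(n)
t90 = stats.t.ppf(0.95, df=n - 1)          # two-sided 90% interval

gmr = np.exp(log_ratio.mean())
lo, hi = np.exp(log_ratio.mean() - t90 * se), np.exp(log_ratio.mean() + t90 * se)
print(f"GMR {gmr:.2f} (90% CI {lo:.2f}-{hi:.2f})")
```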
Data Analysis and Statistical Methods for the Assessment and Interpretation of Geochronologic Data
NASA Astrophysics Data System (ADS)
Reno, B. L.; Brown, M.; Piccoli, P. M.
2007-12-01
Ages are traditionally reported as a weighted mean with an uncertainty based on least squares analysis of analytical error on individual dates. This method does not take into account geological uncertainties and cannot accommodate asymmetries in the data. In most instances, this method will understate uncertainty on a given age, which may lead to overinterpretation of age data. Geologic uncertainty is difficult to quantify, but is typically greater than analytical uncertainty. These factors make traditional statistical approaches inadequate to fully evaluate geochronologic data. We propose a protocol to assess populations within multi-event datasets and to calculate age and uncertainty from each population of dates interpreted to represent a single geologic event using robust and resistant statistical methods. To assess whether populations thought to represent different events are statistically separate, exploratory data analysis is undertaken using a box plot, where the range of the data is represented by a 'box' of length given by the interquartile range, divided at the median of the data, with 'whiskers' that extend to the furthest datapoint that lies within 1.5 times the interquartile range beyond the box. If the boxes representing the populations do not overlap, they are interpreted to represent statistically different sets of dates. Ages are calculated from statistically distinct populations using a robust tool such as the tanh method of Kelsey et al. (2003, CMP, 146, 326-340), which is insensitive to any assumptions about the underlying probability distribution from which the data are drawn. Therefore, this method takes into account the full range of data and is not drastically affected by outliers. The interquartile range of each population of dates gives a first pass at expressing uncertainty, which accommodates asymmetry in the dataset; outliers have a minor effect on the uncertainty. To better quantify the uncertainty, a resistant tool that is insensitive to local misbehavior of data is preferred, such as the normalized median absolute deviations proposed by Powell et al. (2002, Chem Geol, 185, 191-204). We illustrate the method using a dataset of 152 monazite dates determined using EPMA chemical data from a single sample from the Neoproterozoic Brasília Belt, Brazil. Results are compared with ages and uncertainties calculated using traditional methods to demonstrate the differences. The dataset was manually culled into three populations representing discrete compositional domains within chemically zoned monazite grains. The weighted mean ages and least squares uncertainties for these populations are 633±6 (2σ) Ma for a core domain, 614±5 (2σ) Ma for an intermediate domain and 595±6 (2σ) Ma for a rim domain. Probability distribution plots indicate asymmetric distributions of all populations, which cannot be accounted for with traditional statistical tools. These three domains record distinct ages outside the interquartile range for each population of dates, with the core domain lying in the subrange 642-624 Ma, the intermediate domain 617-609 Ma and the rim domain 606-589 Ma. The tanh estimator yields ages of 631±7 (2σ) Ma for the core domain, 616±7 (2σ) Ma for the intermediate domain and 601±8 (2σ) Ma for the rim domain.
Although the uncertainties derived using a resistant statistical tool are larger than those derived from traditional statistical tools, the method yields more realistic uncertainties that better address the spread in the dataset and account for asymmetry in the data.
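A compact Python sketch of the two tools this protocol describes, the IQR box-overlap check and the resistant uncertainty from normalized median absolute deviations (nMAD = 1.4826 × MAD); the monazite dates are simulated for illustration:

```python
# (1) Box-plot separation of date populations via the IQR, and
# (2) a resistant uncertainty from normalized median absolute deviations.
# Dates are simulated, not the Brasília Belt dataset.
import numpy as np

rng = np.random.default_rng(3)
core = rng.normal(633, 5, 60)       # Ma, illustrative monazite dates
rim = rng.normal(595, 5, 60)

def box(dates):
    q1, med, q3 = np.percentile(dates, [25, 50, 75])
    return q1, med, q3

def overlap(a, b):
    """True if the IQR boxes of two populations overlap."""
    a1, _, a3 = box(a)
    b1, _, b3 = box(b)
    return a1 <= b3 and b1 <= a3

def nmad(dates):
    med = np.median(dates)
    return 1.4826 * np.median(np.abs(dates - med))

print("boxes overlap:", overlap(core, rim))
print(f"core: {np.median(core):.0f} +/- {2 * nmad(core):.0f} Ma (2*nMAD)")
```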
Kawazoe, Yu; Miyamoto, Kyohei; Morimoto, Takeshi; Yamamoto, Tomonori; Fuke, Akihiro; Hashimoto, Atsunori; Koami, Hiroyuki; Beppu, Satoru; Katayama, Yoichi; Itoh, Makoto; Ohta, Yoshinori
2017-01-01
Importance Dexmedetomidine provides sedation for patients undergoing ventilation; however, its effects on mortality and ventilator-free days have not been well studied among patients with sepsis. Objectives To examine whether a sedation strategy with dexmedetomidine can improve clinical outcomes in patients with sepsis undergoing ventilation. Design, Setting, and Participants Open-label, multicenter randomized clinical trial conducted at 8 intensive care units in Japan from February 2013 until January 2016 among 201 consecutive adult patients with sepsis requiring mechanical ventilation for at least 24 hours. Interventions Patients were randomized to receive either sedation with dexmedetomidine (n = 100) or sedation without dexmedetomidine (control group; n = 101). Other agents used in both groups were fentanyl, propofol, and midazolam. Main Outcomes and Measures The co-primary outcomes were mortality and ventilator-free days (over a 28-day duration). Sequential Organ Failure Assessment score (days 1, 2, 4, 6, 8), sedation control, occurrence of delirium and coma, intensive care unit stay duration, renal function, inflammation, and nutrition state were assessed as secondary outcomes. Results Of the 203 screened patients, 201 were randomized. The mean age was 69 years (SD, 14 years); 63% were male. Mortality at 28 days was not significantly different in the dexmedetomidine group vs the control group (19 patients [22.8%] vs 28 patients [30.8%]; hazard ratio, 0.69; 95% CI, 0.38-1.22; P = .20). Ventilator-free days over 28 days were not significantly different between groups (dexmedetomidine group: median, 20 [interquartile range, 5-24] days; control group: median, 18 [interquartile range, 0.5-23] days; P = .20). The dexmedetomidine group had a significantly higher rate of well-controlled sedation during mechanical ventilation (range, 17%-58% vs 20%-39%; P = .01); other outcomes were not significantly different between groups. Adverse events occurred in 8 (8%) and 3 (3%) patients in the dexmedetomidine and control groups, respectively. Conclusions and Relevance Among patients requiring mechanical ventilation, the use of dexmedetomidine compared with no dexmedetomidine did not result in statistically significant improvement in mortality or ventilator-free days. However, the study may have been underpowered for mortality, and additional research may be needed to evaluate this further. Trial Registration clinicaltrials.gov Identifier: NCT01760967 PMID:28322414
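Skewed outcomes such as ventilator-free days are typically compared with the Wilcoxon rank-sum (Mann-Whitney U) test; a sketch under that assumption, with simulated counts rather than trial data:

```python
# Comparing ventilator-free days between arms with the Wilcoxon rank-sum
# (Mann-Whitney U) test. Counts are simulated, not trial data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
vfd_dex = np.clip(rng.normal(18, 9, 100).round(), 0, 28)
vfd_ctrl = np.clip(rng.normal(16, 10, 101).round(), 0, 28)

u, p = stats.mannwhitneyu(vfd_dex, vfd_ctrl, alternative="two-sided")
for name, x in [("dexmedetomidine", vfd_dex), ("control", vfd_ctrl)]:
    q1, med, q3 = np.percentile(x, [25, 50, 75])
    print(f"{name}: median {med:.0f} (IQR {q1:.0f}-{q3:.0f})")
print(f"Mann-Whitney U = {u:.0f}, P = {p:.2f}")
```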
Drewry, Anne M; Fuller, Brian M; Bailey, Thomas C; Hotchkiss, Richard S
2013-09-12
Early treatment of sepsis improves survival, but early diagnosis of hospital-acquired sepsis, especially in critically ill patients, is challenging. Evidence suggests that subtle changes in body temperature patterns may be an early indicator of sepsis, but data are limited. The aim of this study was to examine whether abnormal body temperature patterns, as identified by visual examination, could predict the subsequent diagnosis of sepsis in afebrile critically ill patients. Retrospective case-control study of 32 septic and 29 non-septic patients in an adult medical and surgical ICU. Temperature curves for the period starting 72 hours and ending 8 hours prior to the clinical suspicion of sepsis (for septic patients) and for the 72-hour period prior to discharge from the ICU (for non-septic patients) were rated as normal or abnormal by seven blinded physicians. Multivariable logistic regression was used to compare groups in regard to maximum temperature, minimum temperature, greatest change in temperature in any 24-hour period, and whether the majority of evaluators rated the curve to be abnormal. Baseline characteristics of the groups were similar except the septic group had more trauma patients (31.3% vs. 6.9%, p = .02) and more patients requiring mechanical ventilation (75.0% vs. 41.4%, p = .008). Multivariable logistic regression to control for baseline differences demonstrated that septic patients had significantly larger temperature deviations in any 24-hour period compared to control patients (1.5°C vs. 1.1°C, p = .02). An abnormal temperature pattern was noted by a majority of the evaluators in 22 (68.8%) septic patients and 7 (24.1%) control patients (adjusted OR 4.43, p = .017). This resulted in a sensitivity of 0.69 (95% CI [confidence interval] 0.50, 0.83) and specificity of 0.76 (95% CI 0.56, 0.89) of abnormal temperature curves to predict sepsis. The median time from the temperature plot to the first culture was 9.40 hours (IQR [inter-quartile range] 8.00, 18.20) and to the first dose of antibiotics was 16.90 hours (IQR 8.35, 34.20). Abnormal body temperature curves were predictive of the diagnosis of sepsis in afebrile critically ill patients. Analysis of temperature patterns, rather than absolute values, may facilitate decreased time to antimicrobial therapy.
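The reported operating characteristics can be checked directly from the counts in the abstract (22 of 32 septic and 7 of 29 control curves rated abnormal); Wilson intervals are one common, though here assumed, choice of CI:

```python
# Worked check of the reported sensitivity (22/32 = 0.69) and specificity
# (22/29 = 0.76), with Wilson confidence intervals as one common choice.
from statsmodels.stats.proportion import proportion_confint

tp, fn = 22, 32 - 22          # septic patients rated abnormal / normal
fp, tn = 7, 29 - 7            # non-septic patients rated abnormal / normal

sens = tp / (tp + fn)
spec = tn / (tn + fp)
sens_ci = proportion_confint(tp, tp + fn, alpha=0.05, method="wilson")
spec_ci = proportion_confint(tn, tn + fp, alpha=0.05, method="wilson")
print(f"sensitivity {sens:.2f} (95% CI {sens_ci[0]:.2f}-{sens_ci[1]:.2f})")
print(f"specificity {spec:.2f} (95% CI {spec_ci[0]:.2f}-{spec_ci[1]:.2f})")
```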
Olivotto, Iacopo; Maron, Barry J; Appelbaum, Evan; Harrigan, Caitlin J; Salton, Carol; Gibson, C Michael; Udelson, James E; O'Donnell, Christopher; Lesser, John R; Manning, Warren J; Maron, Martin S
2010-07-15
In hypertrophic cardiomyopathy (HCM), the clinical significance attributable to the broad range of left ventricular (LV) systolic function, assessed as the ejection fraction (EF), is incompletely resolved. We evaluated the EF using cardiovascular magnetic resonance (CMR) imaging in a large cohort of patients with HCM with respect to the clinical status and evidence of left ventricular remodeling with late gadolinium enhancement (LGE). CMR imaging was performed in 310 consecutive patients, aged 42 ± 17 years. The EF in patients with HCM was 71 ± 10% (range 28% to 89%), exceeding that of 606 healthy controls without cardiovascular disease (66 ± 5%, p <0.001). LGE reflecting LV remodeling showed an independent, inverse relation to the EF (B = -0.69; 95% confidence interval, -0.86 to -0.52; p <0.001) and was greatest in patients with an EF <50%, in whom it constituted a median value of 29% of the LV volume (interquartile range 16% to 40%). However, the substantial subgroup with low-normal EF values of 50% to 65% (n = 45; 15% of the whole cohort), who were mostly asymptomatic or mildly symptomatic (37 patients [82%] in New York Heart Association functional class I to II), showed substantial LGE (median 5% of LV volume, interquartile range 2% to 10%). This overlapped with the subgroup with systolic dysfunction and significantly exceeded that of patients with an EF of 66% to 75% and >75% (median 2% of the LV volume, interquartile range 1.5% to 4%; p <0.01). In conclusion, in a large cohort of patients with HCM, a subset of patients with low-normal EF values (50% to 65%) was identified by contrast-enhanced CMR imaging as having substantial degrees of LGE, suggesting a transition phase, potentially heralding advanced LV remodeling and systolic dysfunction, with implications for clinical surveillance and management. Copyright (c) 2010. Published by Elsevier Inc.
Soudorn, Chuleekorn; Muntham, Dittapol; Reutrakul, Sirimon; Chirakalwasan, Naricha
2016-09-01
The addition of heated humidification to CPAP has been shown to improve nasal adverse effects in subjects with obstructive sleep apnea (OSA). However, current data regarding improvement in CPAP adherence are conflicting. Furthermore, there are no data from a tropical climate area with a high humidity level. In this prospective randomized crossover study conducted in Thailand, subjects with moderate to severe OSA with nasopharyngeal symptoms post-split-night study were enrolled in the study. Subjects were randomly assigned to receive CPAP with or without heated humidification for 4 weeks and then crossed over. Information on CPAP adherence, quality of life assessed by the Functional Outcomes of Sleep Questionnaire, nasopharyngeal symptoms assessed by a modified XERO questionnaire, and bedroom ambient humidity and temperature data were obtained. Data were collected on 20 subjects with OSA during the period of January to December 2014. Although the addition of heated humidification appeared to improve average hours of use for all days when compared with conventional CPAP, the difference was not statistically significant (CPAP with heated humidification = 4.6 ± 1.7 h/night; conventional CPAP = 4.0 ± 1.7 h/night, P = .1). However, the addition of heated humidification improved CPAP adherence on the days of use (5.5 ± 1.5 h/night) compared with conventional CPAP (5.2 ± 1.4 h/night), P = .033. Quality of life was also improved according to the Functional Outcomes of Sleep Questionnaire score (median 17.6 [interquartile range 3.5]) in the heated humidification group compared with the conventional CPAP group (median 17.6 [interquartile range 4.5]), P = .046. A significant reduction in the dry throat/sore throat symptom was noted only when CPAP with heated humidification was used. Even in a tropical climate area, CPAP adherence and quality of life appeared to improve when heated humidification was employed in subjects with moderate to severe OSA with nasopharyngeal symptoms post-split-night polysomnography. The improvement may be related to a reduction in the dry throat/sore throat symptom. Copyright © 2016 by Daedalus Enterprises.
Capsaicin-evoked cough responses in asthmatic patients: Evidence for airway neuronal dysfunction.
Satia, Imran; Tsamandouras, Nikolaos; Holt, Kimberley; Badri, Huda; Woodhead, Mark; Ogungbenro, Kayode; Felton, Timothy W; O'Byrne, Paul M; Fowler, Stephen J; Smith, Jaclyn A
2017-03-01
Cough in asthmatic patients is a common and troublesome symptom. It is generally assumed coughing occurs as a consequence of bronchial hyperresponsiveness and inflammation, but the possibility that airway nerves are dysfunctional has not been fully explored. We sought to investigate capsaicin-evoked cough responses in a group of patients with well-characterized mild-to-moderate asthma compared with healthy volunteers and assess the influences of sex, atopy, lung physiology, inflammation, and asthma control on these responses. Capsaicin inhalational challenge was performed, and cough responses were analyzed by using nonlinear mixed-effects modeling to estimate the maximum cough response evoked by any concentration of capsaicin (Emax) and the capsaicin dose inducing half-maximal response (ED50). Ninety-seven patients with stable asthma (median age, 23 years [interquartile range, 21-27 years]; 60% female) and 47 healthy volunteers (median age, 38 years [interquartile range, 29-47 years]; 64% female) were recruited. Asthmatic patients had higher Emax and lower ED50 values than healthy volunteers. Emax values were 27% higher in female subjects (P = .006) and 46% higher in patients with nonatopic asthma (P = .003) compared with healthy volunteers. Also, patients with atopic asthma had a 21% lower Emax value than nonatopic asthmatic patients (P = .04). The ED50 value was 65% lower in female patients (P = .0001) and 71% lower in all asthmatic patients (P = .0008). ED50 values were also influenced by asthma control and serum IgE levels, whereas Emax values were related to 24-hour cough frequency. Age, body mass index, FEV1, PC20, fraction of exhaled nitric oxide, blood eosinophil counts, and inhaled steroid treatment did not influence cough parameters. Patients with stable asthma exhibited exaggerated capsaicin-evoked cough responses consistent with neuronal dysfunction. Nonatopic asthmatic patients had the highest cough responses, suggesting this mechanism might be most important in type 2-low asthma phenotypes. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Prehospital use of magnesium sulfate as neuroprotection in acute stroke.
Saver, Jeffrey L; Starkman, Sidney; Eckstein, Marc; Stratton, Samuel J; Pratt, Franklin D; Hamilton, Scott; Conwit, Robin; Liebeskind, David S; Sung, Gene; Kramer, Ian; Moreau, Gary; Goldweber, Robert; Sanossian, Nerses
2015-02-05
Magnesium sulfate is neuroprotective in preclinical models of stroke and has shown signals of potential efficacy with an acceptable safety profile when delivered early after stroke onset in humans. Delayed initiation of treatment has hindered earlier phase 3 trials of neuroprotective agents. We randomly assigned patients with suspected stroke to receive either intravenous magnesium sulfate or placebo, beginning within 2 hours after symptom onset. A loading dose was initiated by paramedics before the patient arrived at the hospital, and a 24-hour maintenance infusion was started on the patient's arrival at the hospital. The primary outcome was the degree of disability at 90 days, as measured by scores on the modified Rankin scale (range, 0 to 6, with higher scores indicating greater disability). Among the 1700 enrolled patients (857 in the magnesium group and 843 in the placebo group), the mean (±SD) age was 69±13 years, 42.6% were women, and the mean pretreatment score on the Los Angeles Motor Scale of stroke severity (range, 0 to 10, with higher scores indicating greater motor deficits) was 3.7±1.3. The final diagnosis of the qualifying event was cerebral ischemia in 73.3% of patients, intracranial hemorrhage in 22.8%, and a stroke-mimicking condition in 3.9%. The median interval between the time the patient was last known to be free of stroke symptoms and the start of the study-drug infusion was 45 minutes (interquartile range, 35 to 62), and 74.3% of patients received the study-drug infusion within the first hour after symptom onset. There was no significant shift in the distribution of 90-day disability outcomes on the global modified Rankin scale between patients in the magnesium group and those in the placebo group (P=0.28 by the Cochran-Mantel-Haenszel test); mean scores at 90 days did not differ between the magnesium group and the placebo group (2.7 in each group, P=1.00). No significant between-group differences were noted with respect to mortality (15.4% in the magnesium group and 15.5% in the placebo group, P=0.95) or all serious adverse events. Prehospital initiation of magnesium sulfate therapy was safe and allowed the start of therapy within 2 hours after the onset of stroke symptoms, but it did not improve disability outcomes at 90 days. (Funded by the National Institute of Neurological Disorders and Stroke; FAST-MAG ClinicalTrials.gov number, NCT00059332.).
Stewart, Barclay T; Tansley, Gavin; Gyedu, Adam; Ofosu, Anthony; Donkor, Peter; Appiah-Denkyira, Ebenezer; Quansah, Robert; Clarke, Damian L; Volmink, Jimmy; Mock, Charles
2016-08-17
Conditions that can be treated by surgery comprise more than 16% of the global disease burden. However, 5 billion people do not have access to essential surgical care. An estimated 90% of the 87 million disability-adjusted life-years incurred by surgical conditions could be averted by providing access to timely and safe surgery in low-income and middle-income countries. Population-level spatial access to essential surgery in Ghana is not known. To assess the performance of bellwether procedures (ie, open fracture repair, emergency laparotomy, and cesarean section) as a proxy for performing essential surgery more broadly, to map population-level spatial access to essential surgery, and to identify first-level referral hospitals that would most improve access to essential surgery if strengthened in Ghana. Population-based study among all households and public and private not-for-profit hospitals in Ghana. Households were represented by georeferenced census data. First-level and second-level referral hospitals managed by the Ministry of Health and all tertiary hospitals were included. Surgical data were collected from January 1 to December 31, 2014. All procedures performed at first-level referral hospitals in Ghana in 2014 were used to sort each facility into 1 of the following 3 hospital groups: those without capability to perform all 3 bellwether procedures, those that performed 1 to 11 of each procedure, and those that performed at least 12 of each procedure. Candidates for targeted capability improvement were identified by cost-distance and network analysis. Of 155 first-level referral hospitals managed by the Ghana Health Service and the Christian Health Association of Ghana, 123 (79.4%) reported surgical data. Ninety-five (77.2%) did not have the capability in 2014 to perform all 3 bellwether procedures, 24 (19.5%) performed 1 to 11 of each bellwether procedure, and 4 (3.3%) performed at least 12. The essential surgical procedure rate was greater in bellwether procedure-capable first-level referral hospitals than in noncapable hospitals (median, 638 [interquartile range, 440-1418] vs 360 [interquartile range, 0-896] procedures per 100 000 population; P = .03). Population-level spatial access within 2 hours to a hospital that performed 1 to 11 and at least 12 of each bellwether procedure was 83.2% (uncertainty interval [UI], 82.2%-83.4%) and 71.4% (UI, 64.4%-75.0%), respectively. Five hospitals were identified for targeted capability improvement. Almost 30% of Ghanaians cannot access essential surgery within 2 hours. Bellwether capability is a useful metric for essential surgery more broadly. Similar strategic planning exercises might be useful for other low-income and middle-income countries aiming to improve access to essential surgery.
Chen, Jie; Yang, Jian; Hu, Fen; Yu, Si-Hong; Yang, Bing-Xiang; Liu, Qian; Zhu, Xiao-Ping
2018-06-01
Simulation-based curriculum has been demonstrated as crucial to nursing education in the development of students' critical thinking and complex clinical skills during a resuscitation simulation. Few studies have comprehensively examined the effectiveness of a standardised simulation-based emergency and intensive care nursing curriculum on the performance of students in a resuscitation simulation. To evaluate the impact of a standardised simulation-based emergency and intensive care nursing curriculum on nursing students' response time in a resuscitation simulation. Two-group, non-randomised quasi-experimental design. A simulation centre in a Chinese University School of Nursing. Third-year nursing students (N = 39) in the Emergency and Intensive Care course were divided into a control group (CG, n = 20) and an experimental group (EG, n = 19). The experimental group participated in a standardised high-technology, simulation-based emergency and intensive care nursing curriculum. The standardised simulation-based curriculum for third-year nursing students consists of three modules: disaster response, emergency care, and intensive care, which include clinical priorities (e.g. triage), basic resuscitation skills, airway/breathing management, circulation management and teamwork, with eighteen lecture hours, six skill-practice hours and twelve simulation hours. The control group took part in the traditional curriculum. This course included the same three modules with thirty-four lecture hours and two skill-practice hours (trauma). At the end of the course, the experimental group showed a decreased median (interquartile range, IQR) seconds to start compressions [CG 32 (25-75) vs. EG 20 (18-38); p < 0.001] and defibrillation [CG 204 (174-240) vs. EG 167 (162-174); p < 0.001], compared with no between-group differences at the beginning of the course for compressions [CG 41 (32-49) vs. EG 42 (33-46); p > 0.05] and defibrillation [CG 222 (194-254) vs. EG 221 (214-248); p > 0.05]. A simulation-based emergency and intensive care nursing curriculum was created, was well received by third-year nursing students, and was associated with a decreased response time in a resuscitation simulation. Copyright © 2018 Elsevier Ltd. All rights reserved.
Ley, Benedikt; Alam, Mohammad Shafiul; Thriemer, Kamala; Hossain, Mohammad Sharif; Kibria, Mohammad Golam; Auburn, Sarah; Poirot, Eugenie; Price, Ric N; Khan, Wasif Ali
2016-01-01
The Bangladeshi national treatment guidelines for uncomplicated malaria follow WHO recommendations but without G6PD testing prior to primaquine administration. A prospective observational study was conducted to assess the efficacy of the current antimalarial policy. Patients with uncomplicated malaria, confirmed by microscopy, attending a health care facility in the Chittagong Hill Tracts, Bangladesh, were treated with artemether-lumefantrine (days 0-2) plus single dose primaquine (0.75 mg/kg on day 2) for P. falciparum infections, or with chloroquine (days 0-2) plus 14 days of primaquine (3.5 mg/kg total over 14 days) for P. vivax infections. Hb was measured on days 0, 2 and 9 in all patients and also on days 16 and 30 in patients with P. vivax infection. Participants were followed for 30 days. The study was registered with the clinical trials website (NCT02389374). Between September 2014 and February 2015 a total of 181 patients were enrolled (64% P. falciparum, 30% P. vivax and 6% mixed infections). Median parasite clearance times were 22.0 (interquartile range, IQR: 15.2-27.3) hours for P. falciparum, 20.0 (IQR: 9.5-22.7) hours for P. vivax and 16.6 (IQR: 10.0-46.0) hours for mixed infections. All participants were afebrile within 48 hours, although two patients with P. falciparum infection remained parasitaemic at 48 hours. No patient had recurrent parasitaemia within 30 days. Adjusted male median G6PD activity was 7.82 U/g Hb. One male participant (1/174) had severe G6PD deficiency (<10% activity), and five participants (5/174) had mild G6PD deficiency (10-60% activity). The Hb nadir occurred on day 2, prior to primaquine treatment, in both P. falciparum and P. vivax infected patients; the mean fractional fall in Hb was -8.8% (95% CI: -6.7% to -11.0%) and -7.4% (95% CI: -4.5% to -10.4%), respectively. The current antimalarial policy remains effective. The prevalence of G6PD deficiency was low. The main contribution to haemolysis in G6PD normal individuals was attributable to acute malaria rather than primaquine administration. ClinicalTrials.gov NCT02389374.
Pereira, Vinicius B P; Garcia, Renato; Torricelli, Andre A M; Mukai, Adriana; Bechara, Samir J
2017-10-01
Pain after photorefractive keratectomy (PRK) is significant, and the analgesic efficacy and safety of oral opioids in combination with acetaminophen has not been fully investigated in PRK trials. To assess the efficacy and safety of the combination of codeine plus acetaminophen (paracetamol) versus placebo as an add-on therapy for pain control after PRK. Randomized, double-blind, placebo-controlled trial. Single tertiary center. One eye was randomly allocated to the intervention, whereas the fellow eye was treated with a placebo. Eyes were operated on 2 weeks apart. The participants were adults older than 20 years with refractive stability for ≥1 year who underwent PRK for correction of myopia or myopic astigmatism. Codeine (30 mg) plus acetaminophen (500 mg) was given orally 4 times per day for 4 days after PRK. The follow-up duration was 4 months. The study outcomes included pain scores at 1 to 72 hours, as measured by the visual analog scale, McGill Pain Questionnaire, and Brief Pain Inventory, as well as adverse events and corneal wound healing. Of the initial 82 eyes, 80 completed the trial (40 intervention, 40 placebo). Median (interquartile range) pain scores as measured by the visual analog scale were statistically and clinically lower during treatment with codeine/acetaminophen compared with the placebo: 1 hour: 4 (2-4) versus 6 (3-6), P < 0.001; 24 hours: 4 (3-6) versus 7 (6-9), P < 0.001; 48 hours: 1 (0-2) versus 3 (2-5), P < 0.001; and 72 hours: 0 (0-0) versus 0 (0-2), P = 0.001. Virtually identical results were obtained by the McGill Pain Questionnaire and Brief Pain Inventory scales. The most common adverse events with codeine/acetaminophen were drowsiness (42%), nausea (18%), and constipation (5%). No case of delayed epithelial healing was observed in either treatment arm. When added to the usual care therapy, the oral combination of codeine/acetaminophen was safe and significantly superior to the placebo for pain control after PRK. URL: http://www.clinicaltrials.gov. Unique identifier: NCT02625753.
Zatzick, Douglas; O'Connor, Stephen S; Russo, Joan; Wang, Jin; Bush, Nigel; Love, Jeff; Peterson, Roselyn; Ingraham, Leah; Darnell, Doyanne; Whiteside, Lauren; Van Eaton, Erik
2015-10-01
Posttraumatic stress disorder (PTSD) and its comorbidities are endemic among injured trauma survivors. Previous collaborative care trials targeting PTSD after injury have been effective, but they have required intensive clinical resources. The present pragmatic clinical trial randomized acutely injured trauma survivors who screened positive on an automated electronic medical record PTSD assessment to collaborative care intervention (n = 60) and usual care control (n = 61) conditions. The stepped measurement-based intervention included care management, psychopharmacology, and psychotherapy elements. Embedded within the intervention were a series of information technology (IT) components. PTSD symptoms were assessed with the PTSD Checklist at baseline prerandomization and again 1, 3, and 6 months postinjury. IT utilization was also assessed. The technology-assisted intervention required a median of 2.25 hours (interquartile range = 1.57 hours) per patient. The intervention was associated with modest symptom reductions that fell just short of statistical significance in the unadjusted model: F(2, 204) = 2.95, p = .055. The covariate-adjusted regression was significant: F(2, 204) = 3.06, p = .049. The PTSD intervention effect was greatest at the 3-month (Cohen's effect size d = 0.35, F(1, 204) = 4.11, p = .044) and 6-month (d = 0.38, F(1, 204) = 4.10, p = .044) time points. IT-enhanced collaborative care was associated with modest PTSD symptom reductions and reduced delivery times; the intervention model could potentially facilitate efficient PTSD treatment after injury. Copyright © 2015 Wiley Periodicals, Inc., A Wiley Company.
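Cohen's d, the effect size quoted above, is the between-group mean difference divided by the pooled standard deviation; a sketch with simulated PCL change scores (the variable names are hypothetical):

```python
# Cohen's d for a between-group difference in PCL change scores.
# Scores are simulated, not trial data.
import numpy as np

rng = np.random.default_rng(5)
pcl_change_tx = rng.normal(-12, 14, 60)    # intervention arm (hypothetical)
pcl_change_uc = rng.normal(-7, 14, 61)     # usual care arm (hypothetical)

def cohens_d(a, b):
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

print(f"d = {cohens_d(pcl_change_tx, pcl_change_uc):.2f}")
```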
Foreign bodies in a pediatric emergency department in South Africa.
Timmers, Maarten; Snoek, Kitty G; Gregori, Dario; Felix, Janine F; van Dijk, Monique; van As, Sebastian A B
2012-12-01
Foreign body-related pediatric trauma has a high incidence, but studies with large data sets are rare and typically stem from Western settings. The aim of this study was to identify characteristics of foreign body-related trauma in children treated at our trauma unit in South Africa. In this retrospective study, we analyzed all foreign body-related trauma admissions from 1991 to 2009. We collected detailed data including age, sex, type of foreign body, injury severity, and anatomical location of the foreign body. We analyzed 8149 cases. Marginally more boys (54.9%) than girls were involved. The overall median age was 3 years (interquartile range, 2-6 years); 78.8% were younger than 7 years. The predominant anatomical sites were the respiratory tract and the gastrointestinal tract (39.1%); ears (23.9%); nose (19.4%); and extremities (8.8%). The commonest objects were coins (20.8%), (parts of) jewelry (9.5%), and food (8.7%). Three quarters (74.5%) of patients presented between 1 and 2 hours after the injury (median, 1 hour). A total of 164 cases (2.0%) were marked as possible child abuse; 17 cases were filed as confirmed child abuse. Preventive parent education programs targeting foreign body-related injury should mainly focus on both sexes younger than 7 years. Parents should be taught to keep small objects out of reach of young children, especially coins, because these most often result in a trauma unit visit.
Labuschagne, G S; Morris, R W
2017-07-01
Sodium picosulfate, used in combination with magnesium oxide and citric acid for bowel cleansing, can result in dehydration. We investigated whether enhanced carbohydrate fluid intake pre-colonoscopy could mitigate this effect. We enrolled 398 elective colonoscopy patients in a prospective, controlled, single-blinded study. The control group (n=194) fasted routinely (minimum seven hours) whilst the treatment group (n=197) drank 1,200 ml carbohydrate solution leading up to admission (up until two hours pre-colonoscopy). On admission a patient survey was completed, and urine specific gravity obtained. Supine blood pressure and pulse rate were measured, and repeated within three minutes of standing. The carbohydrate group had reduced symptoms and signs of dehydration, including thirst (34% versus 65%, P < 0.001), dry mouth (45% versus 59%, P = 0.008), dizziness (10% versus 20%, P = 0.010), lower mean urine specific gravity (1.007 versus 1.017, P < 0.001), lower incidence of orthostatic hypotension (2.6% versus 11%, P < 0.001), and lower mean erect pulse rate (78 versus 81/minute, P = 0.047). The postural change in systolic blood pressure was less in the treatment group (mean -0.4 mmHg, median -1 mmHg [interquartile range, IQR -7 to 7]) than in the control group (mean -4.1 mmHg, median -1 mmHg [IQR -12 to 3], P = 0.028). These findings indicate that hydration with carbohydrate solution in patients taking sodium picosulfate has clinical benefit.
Vargas, Mariela; Talledo-Ulfe, Lincolth; Heredia, Paula; Quispe-Colquepisco, Sarita; Mejia, Christian R
To determine the influence of habits on depression in medical students from 7 Peruvian regions. Analytical cross-sectional study of a secondary data analysis. The diagnosis of depression was obtained according to the Zung test result, with any level of this condition being considered positive. This was also compared with other social and educational variables that were important according to previous literature. Of the 1922 respondents, 54.5% (1047) were female. The median age was 20 [interquartile range, 18-22] years, and 13.5% (259) had some degree of depression according to the Zung scale. In the multivariate analysis, the frequency of depression increased with the hours of study per day (RPA=1.03; 95%CI: 1.01-1.04; P<.001) and with student work (RPA=1.98; 95%CI: 1.21-3.23; P=.006). On the other hand, the frequency of depression decreased among those having similar meal schedules (RPA=0.59; 95%CI: 0.38-0.93; P=.022) and those having a fixed place in which to get food (RPA=0.66; 95%CI: 0.46-0.96; P=.030), adjusted for the year of college entrance. Some stressors predisposing to depression were found (working and studying more hours a day). On the other hand, order in the daily routine (having a set place and times for meals) decreased this condition. Copyright © 2017 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
Cluster-Randomized, Crossover Trial of Head Positioning in Acute Stroke.
Anderson, Craig S; Arima, Hisatomi; Lavados, Pablo; Billot, Laurent; Hackett, Maree L; Olavarría, Verónica V; Muñoz Venturelli, Paula; Brunser, Alejandro; Peng, Bin; Cui, Liying; Song, Lily; Rogers, Kris; Middleton, Sandy; Lim, Joyce Y; Forshaw, Denise; Lightbody, C Elizabeth; Woodward, Mark; Pontes-Neto, Octavio; De Silva, H Asita; Lin, Ruey-Tay; Lee, Tsong-Hai; Pandian, Jeyaraj D; Mead, Gillian E; Robinson, Thompson; Watkins, Caroline
2017-06-22
The role of supine positioning after acute stroke in improving cerebral blood flow and the countervailing risk of aspiration pneumonia have led to variation in head positioning in clinical practice. We wanted to determine whether outcomes in patients with acute ischemic stroke could be improved by positioning the patient to be lying flat (i.e., fully supine with the back horizontal and the face upwards) during treatment to increase cerebral perfusion. In a pragmatic, cluster-randomized, crossover trial conducted in nine countries, we assigned 11,093 patients with acute stroke (85% of the strokes were ischemic) to receive care in either a lying-flat position or a sitting-up position with the head elevated to at least 30 degrees, according to the randomization assignment of the hospital to which they were admitted; the designated position was initiated soon after hospital admission and was maintained for 24 hours. The primary outcome was degree of disability at 90 days, as assessed with the use of the modified Rankin scale (scores range from 0 to 6, with higher scores indicating greater disability and a score of 6 indicating death). The median interval between the onset of stroke symptoms and the initiation of the assigned position was 14 hours (interquartile range, 5 to 35). Patients in the lying-flat group were less likely than patients in the sitting-up group to maintain the position for 24 hours (87% vs. 95%, P<0.001). In a proportional-odds model, there was no significant shift in the distribution of 90-day disability outcomes on the global modified Rankin scale between patients in the lying-flat group and patients in the sitting-up group (unadjusted odds ratio for a difference in the distribution of scores on the modified Rankin scale in the lying-flat group, 1.01; 95% confidence interval, 0.92 to 1.10; P=0.84). Mortality within 90 days was 7.3% among the patients in the lying-flat group and 7.4% among the patients in the sitting-up group (P=0.83). There were no significant between-group differences in the rates of serious adverse events, including pneumonia. Disability outcomes after acute stroke did not differ significantly between patients assigned to a lying-flat position for 24 hours and patients assigned to a sitting-up position with the head elevated to at least 30 degrees for 24 hours. (Funded by the National Health and Medical Research Council of Australia; HeadPoST ClinicalTrials.gov number, NCT02162017 .).
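The "shift" analysis referenced above is a proportional-odds (ordinal logistic) model of the modified Rankin scale. A sketch using statsmodels' OrderedModel on simulated scores; this illustrates the model family, not the trial's actual adjusted analysis:

```python
# Proportional-odds ("shift") analysis of an ordinal outcome such as the
# 90-day modified Rankin scale. Scores are simulated, not trial data.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(6)
n = 400
lying_flat = rng.integers(0, 2, n)                        # 0 = sitting up
mrs = np.clip(rng.poisson(2.5 + 0.1 * lying_flat), 0, 6)  # simulated mRS 0-6

df = pd.DataFrame({"mrs": mrs, "lying_flat": lying_flat})
res = OrderedModel(df["mrs"], df[["lying_flat"]], distr="logit").fit(
    method="bfgs", disp=False)
print(f"common OR for lying flat: {np.exp(res.params['lying_flat']):.2f}")
```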
Slack, Donald F; Corwin, Douglas S; Shah, Nirav G; Shanholtz, Carl B; Verceles, Avelino C; Netzer, Giora; Jones, Kevin M; Brown, Clayton H; Terrin, Michael L; Hasday, Jeffrey D
2017-07-01
Prior studies suggest hypothermia may be beneficial in acute respiratory distress syndrome, but cooling causes shivering and increases metabolism. The objective of this study was to assess the feasibility of performing a randomized clinical trial of hypothermia in patients with acute respiratory distress syndrome receiving treatment with neuromuscular blockade because they cannot shiver. Retrospective study and pilot, prospective, open-label, feasibility study. Medical ICU. Retrospective review of 58 patients with acute respiratory distress syndrome based on Berlin criteria and PaO2/FIO2 less than 150 who received neuromuscular blockade. Prospective hypothermia treatment in eight acute respiratory distress syndrome patients with PaO2/FIO2 less than 150 receiving neuromuscular blockade. Cooling to 34-36°C for 48 hours. Core temperature, hemodynamics, serum glucose and electrolytes, and P/F were sequentially measured and are presented as medians (interquartile ranges); 28-day ventilator-free days and hospital mortality were calculated in historical controls and the eight cooled patients. Average patient core temperature was 36.7°C (36-37.3°C), and fever occurred during neuromuscular blockade in 30 of 58 retrospective patients. In the prospectively cooled patients, core temperature reached the target range within 4 hours of initiating cooling, remained less than 36°C for 92% of the 48-hour cooling period without adverse events, and was lower than in the controls (34.35°C [34-34.8°C]; p < 0.0001). Compared with historical controls, the cooled patients tended to have higher hospital survival (75% vs 53.4%; p = 0.26), more ventilator-free days (9 [0-21.5] vs 0 [0-12]; p = 0.16), and higher day 3 P/F (255 [160-270] vs 171 [120-214]; p = 0.024). Neuromuscular blockade alone does not cause hypothermia but allowed acute respiratory distress syndrome patients to be effectively cooled. Results support conducting a randomized clinical trial of hypothermia in acute respiratory distress syndrome and the feasibility of studying acute respiratory distress syndrome patients receiving neuromuscular blockade.
Simonis, Fabienne D; de Iudicibus, Gianfranco; Cremer, Olaf L; Ong, David S Y; van der Poll, Tom; Bos, Lieuwe D; Schultz, Marcus J
2018-01-01
Macrolides have been associated with favorable immunological effects in various inflammatory disease states. We investigated the association between macrolide therapy and mortality in patients with the acute respiratory distress syndrome (ARDS). This was an unplanned secondary analysis of patients with ARDS within a large prospective observational study of critically ill patients in the intensive care units (ICUs) of two university-affiliated hospitals in the Netherlands. The exposure of interest was low-dose macrolide use prescribed for another reason than infection; we excluded patients who received high-dose macrolides for an infection. The primary endpoint was 30-day mortality. The association between macrolide therapy and mortality was determined in the whole cohort, as well as in a propensity score matched cohort; the association was compared between pulmonary versus non-pulmonary ARDS, and between two biological phenotypes based on plasma levels of 20 biomarkers. In total, 873 patients with ARDS were analyzed, of whom 158 patients (18%) received macrolide therapy during their stay in the ICU for a median duration of 3 (interquartile range, 1-4) days. Erythromycin was the most frequently prescribed macrolide (97%). Macrolide therapy was associated with reduced 30-day mortality in the whole cohort [22.8% vs. 31.6%; crude odds ratio (OR), 0.64 (95% confidence interval, 0.43-0.96), P=0.03]. The association in the propensity score matched cohort remained significant [22.8% vs. 32.9%; OR, 0.62 (95% confidence interval, 0.39-0.96), P=0.03]. Propensity matched associations with mortality were different in patients with non-pulmonary ARDS vs. pulmonary ARDS and also varied by biological phenotype. These data together show that low-dose macrolide therapy prescribed for another reason than infection is associated with decreased mortality in patients with ARDS.
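A sketch of 1:1 nearest-neighbor propensity-score matching of the kind used here, with a logistic treatment model and the common 0.2-SD-of-logit caliper; the covariates and treatment-assignment mechanism are simulated assumptions, not study data:

```python
# Greedy 1:1 nearest-neighbor propensity-score matching with a caliper.
# Covariates and treatment assignment are simulated, not study data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 873
x = rng.normal(size=(n, 3))                              # severity covariates
p_treat = 1 / (1 + np.exp(-(x[:, 0] - 1.5)))             # ~20% treated
treated = rng.binomial(1, p_treat)

ps = LogisticRegression().fit(x, treated).predict_proba(x)[:, 1]
logit = np.log(ps / (1 - ps))
caliper = 0.2 * logit.std()                              # common caliper choice

t_idx = np.flatnonzero(treated == 1)
c_idx = np.flatnonzero(treated == 0)
avail = np.ones(len(c_idx), dtype=bool)
pairs = []
for i in t_idx:                                          # greedy matching
    d = np.abs(logit[c_idx] - logit[i])
    d[~avail] = np.inf
    k = int(np.argmin(d))
    if d[k] <= caliper:
        avail[k] = False
        pairs.append((i, c_idx[k]))
print(f"matched {len(pairs)} of {len(t_idx)} treated patients")
```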
Scientific Production of Research Fellows at the Zagreb University School of Medicine, Croatia
Polašek, Ozren; Kolčić, Ivana; Buneta, Zoran; Čikeš, Nada; Pećina, Marko
2006-01-01
Aim To evaluate scientific production among research fellows employed at the Zagreb University School of Medicine and identify factors associated with their scientific output. Method We conducted a survey among research fellows and their mentors during June 2005. The main outcome measure was publication success, defined for each fellow as publishing at least 0.5 articles per employment year in journals indexed in the Current Contents bibliographic database. Bivariate methods and binary logistic regression were used in data analysis. Results A total of 117 fellows (response rate 95%) and 83 mentors (100%) were surveyed. The highest scientific production was recorded among research fellows employed in public health departments (median 3.0 articles, interquartile range 4.0), compared with those from pre-clinical (median 0.0, interquartile range 2.0) and clinical departments (median 1.0, interquartile range 2.0) (Kruskal-Wallis, P = 0.003). A total of 36 (29%) research fellows published at least 0.5 articles per employment year and were considered successful. Three variables were associated with fellows’ publication success: mentor’s scientific production (odds ratio [OR], 3.14; 95% confidence interval [CI], 1.31-7.53), positive mentor’s assessment (OR, 3.15; 95% CI, 1.10-9.05), and fellows’ undergraduate publication in journals indexed in the Current Contents bibliographic database (OR, 4.05; 95% CI, 1.07-15.34). Conclusion Undergraduate publication could be used as one of the main criteria in selecting research fellows. One of the crucial factors in a fellow’s scientific production and career advancement is mentor’s input, which is why research fellows would benefit most from working with scientifically productive mentors. PMID:17042070
Response to Antimalarials in Cutaneous Lupus Erythematosus A Prospective Analysis
Chang, Aileen Y.; Piette, Evan W.; Foering, Kristen P.; Tenhave, Thomas R.; Okawa, Joyce; Werth, Victoria P.
2012-01-01
Objective To demonstrate response to antimalarials in patients with cutaneous lupus erythematosus using activity scores from the Cutaneous Lupus Erythematosus Disease Area and Severity Index, a validated outcome measure. Design Prospective, longitudinal cohort study. Setting University cutaneous autoimmune disease clinic. Participants One hundred twenty-eight patients with cutaneous lupus erythematosus who presented from January 2007-July 2010 and had at least 2 visits with activity scores. Main Outcome Measures Response defined by 4-point or 20% decrease in activity score. Response to initiation determined with score before treatment and first visit at least 2 months after treatment. Response to continuation determined with score at first visit and most recent visit on treatment. Results Of 11 patients initiated on hydroxychloroquine, 55% were responders, with a decrease in median (interquartile range) activity score from 8.0 (3.5-13) to 3.0 (1.8-7.3) (p=0.03). Of 15 patients who had failed hydroxychloroquine, 67% were responders to initiation of hydroxychloroquine-quinacrine, with a decrease in median (interquartile range) activity score from 6.0 (4.8-8.3) to 3.0 (0.75-5.0) (p=0.004). Nine of 21 patients (43%) who continued on hydroxychloroquine and 9 of 21 patients (43%) who continued on hydroxychloroquine-quinacrine were responders, with decreases in median (interquartile range) activity score from 6.0 (1.5-9.5) to 1.0 (0-4.5) (p=0.009) and from 8.5 (4.25-17.5) to 5.0 (0.5-11.5) (p=0.01), respectively. Conclusion The use of quinacrine with hydroxychloroquine is associated with response in patients who fail hydroxychloroquine monotherapy. Further reduction in disease activity can be associated with continuation of antimalarials. PMID:21768444
Clinical presentation of patients with Ebola virus disease in Conakry, Guinea.
Bah, Elhadj Ibrahima; Lamah, Marie-Claire; Fletcher, Tom; Jacob, Shevin T; Brett-Major, David M; Sall, Amadou Alpha; Shindo, Nahoko; Fischer, William A; Lamontagne, Francois; Saliou, Sow Mamadou; Bausch, Daniel G; Moumié, Barry; Jagatic, Tim; Sprecher, Armand; Lawler, James V; Mayet, Thierry; Jacquerioz, Frederique A; Méndez Baggi, María F; Vallenas, Constanza; Clement, Christophe; Mardel, Simon; Faye, Ousmane; Faye, Oumar; Soropogui, Baré; Magassouba, Nfaly; Koivogui, Lamine; Pinto, Ruxandra; Fowler, Robert A
2015-01-01
In March 2014, the World Health Organization was notified of an outbreak of Zaire ebolavirus in a remote area of Guinea. The outbreak then spread to the capital, Conakry, and to neighboring countries and has subsequently become the largest epidemic of Ebola virus disease (EVD) to date. From March 25 to April 26, 2014, we performed a study of all patients with laboratory-confirmed EVD in Conakry. Mortality was the primary outcome. Secondary outcomes included patient characteristics, complications, treatments, and comparisons between survivors and nonsurvivors. Of 80 patients who presented with symptoms, 37 had laboratory-confirmed EVD. Among confirmed cases, the median age was 38 years (interquartile range, 28 to 46), 24 patients (65%) were men, and 14 (38%) were health care workers; among the health care workers, nosocomial transmission was implicated in 12 patients (32%). Patients with confirmed EVD presented to the hospital a median of 5 days (interquartile range, 3 to 7) after the onset of symptoms, most commonly with fever (in 84% of the patients; mean temperature, 38.6°C), fatigue (in 65%), diarrhea (in 62%), and tachycardia (mean heart rate, >93 beats per minute). Of these patients, 28 (76%) were treated with intravenous fluids and 37 (100%) with antibiotics. Sixteen patients (43%) died, with a median time from symptom onset to death of 8 days (interquartile range, 7 to 11). Patients who were 40 years of age or older, as compared with those under the age of 40 years, had a relative risk of death of 3.49 (95% confidence interval, 1.42 to 8.59; P=0.007). Patients with EVD presented with evidence of dehydration associated with vomiting and severe diarrhea. Despite attempts at volume repletion, antimicrobial therapy, and limited laboratory services, the rate of death was 43%.
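The age association above is a relative risk with a log-method CI. A worked sketch of that computation; the 2x2 counts below are hypothetical, because the abstract reports only the RR and its CI:

```python
# Relative risk of death with a 95% CI by the log-RR (Katz) method, the
# computation behind figures like "RR 3.49 (95% CI, 1.42-8.59)". The counts
# are hypothetical; the abstract does not report the underlying 2x2 table.
import math

d_old, n_old = 10, 14      # deaths / patients aged >= 40 (hypothetical)
d_young, n_young = 6, 23   # deaths / patients aged < 40 (hypothetical)

rr = (d_old / n_old) / (d_young / n_young)
se = math.sqrt(1 / d_old - 1 / n_old + 1 / d_young - 1 / n_young)
lo, hi = (rr * math.exp(s * 1.96 * se) for s in (-1, 1))
print(f"RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```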
Ibrahim, Wanis H; Alousi, Faraj H; Al-Khal, Abdulatif; Bener, Abdulbari; AlSalman, Ahmed; Aamer, Aaiza; Khaled, Ahmed; Raza, Tasleem
2016-01-01
To determine the mean and median delays in pulmonary tuberculosis (PTB) diagnosis among adults in one of the countries with the world's highest gross domestic product per capita and to identify patient- and health system-related reasons for these delays. This is a cross-sectional, face-to-face, prospective study of 100 subjects with confirmed PTB, conducted at the main tuberculosis (TB) admitting facilities in Qatar. The mean and median diagnostic delays were measured. The Chi-square test, with two-sided P < 0.05 considered significant, was used to determine the association between factors and diagnostic delay. The mean and median total diagnostic delays of PTB were 53 (95% confidence interval [CI] 42.61-63.59) and 30 (interquartile range; Q1-Q3, 15-60) days, respectively. The mean patient factor delay was 45.7 (95% CI 28.1-63.4) days, and the median was 30 (interquartile range; Q1-Q3, 15-60) days. The mean health system factor delay was 46.3 (95% CI 35.46-57.06) days, and the median was 30 (interquartile range; Q1-Q3, 18-60) days. The most common cause of patient factor delay was neglect of TB symptoms by patients (in 39% of cases), and the most common cause of health-care system factor delay was doctors' failure (mostly at general and private care levels) to suspect PTB (in 57% of cases). There were no significant associations between the presence of a language barrier, patient occupation or nationality, and diagnostic delay. Despite a favorable comparison to other countries, there is a substantial delay in the diagnosis of PTB in Qatar. Relevant actions, including health education on TB, are required to minimize this delay.
CDH1 gene polymorphisms, plasma CDH1 levels and risk of gastric cancer in a Chinese population.
Zhan, Zhen; Wu, Juan; Zhang, Jun-Feng; Yang, Ya-Ping; Tong, Shujuan; Zhang, Chun-Bing; Li, Jin; Yang, Xue-Wen; Dong, Wei
2012-08-01
The genetic polymorphisms in the E-cadherin gene (CDH1) may affect invasive/metastatic development of gastric cancer by altering gene transcriptional activity of epithelial cells. Our study aims to explore the associations between CDH1 gene polymorphisms and predisposition to gastric cancer. We genotyped four potentially functional polymorphisms (rs13689, rs1801552, rs16260 and rs17690554) of the CDH1 gene in a case-control study of 387 incident gastric cancer cases and 392 healthy controls by polymerase chain reaction-ligation detection reaction methods (PCR-LDR) and measured the plasma CDH1 levels using enzyme immunoassay among the subjects. The median and inter-quartile range were used to summarize non-normally distributed data, and we found the levels of plasma CDH1 in gastric cancer patients (median: 171.00 pg/ml; inter-quartile range: 257.10 pg/ml) were significantly higher than those of controls (median: 137.40 pg/ml; inter-quartile range: 83.90 pg/ml, P = 0.003). However, none of the four polymorphisms or their haplotypes achieved significant differences in their distributions between gastric cancer cases and controls. Interestingly, in the subgroup analysis of gastric cancer, we found that the CA genotype of rs16260 and the CG genotype of rs17690554 were associated with the risk of diffuse gastric cancer, compared with their wild genotypes (OR = 2.98, 95% CI: 1.60-5.53; OR = 2.10, 95% CI: 1.14-3.85, respectively, P < 0.05). In conclusion, our results indicated that plasma CDH1 levels may serve as a risk marker for gastric cancer and that variant genotypes of rs16260 and rs17690554 may contribute to the etiology of diffuse gastric cancer in this study. Further studies are warranted to verify these findings.
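A sketch of the genotype case-control comparison pattern used above: a chi-square test on a genotype table plus an odds ratio with a log-method CI for one genotype versus wild type; all counts are illustrative, not study data:

```python
# Chi-square test on a 2x3 genotype table and an OR (with 95% CI) for one
# genotype vs. the wild type. Counts are illustrative, not study data.
import numpy as np
from scipy import stats

# rows: cases, controls; cols: CC, CA, AA genotypes (illustrative counts)
table = np.array([[210, 140, 37],
                  [230, 130, 32]])
chi2, p, dof, _ = stats.chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, P = {p:.2f}")

# OR for CA vs. CC with a 95% CI (log method)
a, b = table[0, 1], table[0, 0]    # cases: CA, CC
c, d = table[1, 1], table[1, 0]    # controls: CA, CC
or_ = (a * d) / (b * c)
se = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo, hi = or_ * np.exp(-1.96 * se), or_ * np.exp(1.96 * se)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```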
de Iudicibus, Gianfranco; Cremer, Olaf L.; Ong, David S. Y.; van der Poll, Tom; Bos, Lieuwe D.; Schultz, Marcus J.
2018-01-01
Background Macrolides have been associated with favorable immunological effects in various inflammatory disease states. We investigated the association between macrolide therapy and mortality in patients with the acute respiratory distress syndrome (ARDS). Methods This was an unplanned secondary analysis of patients with ARDS within a large prospective observational study of critically ill patients in the intensive care units (ICUs) of two university-affiliated hospitals in the Netherlands. The exposure of interest was low-dose macrolide use prescribed for a reason other than infection; we excluded patients who received high-dose macrolides for an infection. The primary endpoint was 30-day mortality. The association between macrolide therapy and mortality was determined in the whole cohort, as well as in a propensity score-matched cohort; the association was compared between pulmonary versus non-pulmonary ARDS, and between two biological phenotypes based on plasma levels of 20 biomarkers. Results In total, 873 patients with ARDS were analyzed, of whom 158 (18%) received macrolide therapy during their ICU stay for a median duration of 3 (interquartile range, 1–4) days. Erythromycin was the most frequently prescribed macrolide (97%). Macrolide therapy was associated with reduced 30-day mortality in the whole cohort [22.8% vs. 31.6%; crude odds ratio (OR), 0.64 (95% confidence interval, 0.43–0.96), P=0.03]. The association in the propensity score-matched cohort remained significant [22.8% vs. 32.9%; OR, 0.62 (95% confidence interval, 0.39–0.96), P=0.03]. Propensity-matched associations with mortality differed between patients with non-pulmonary ARDS and pulmonary ARDS and also varied by biological phenotype. Conclusions These data together show that low-dose macrolide therapy prescribed for a reason other than infection is associated with decreased mortality in patients with ARDS. PMID:29430441
St Louis, James D; Jodhka, Upinder; Jacobs, Jeffrey P; He, Xia; Hill, Kevin D; Pasquali, Sara K; Jacobs, Marshall L
2014-12-01
Contemporary outcomes data for complete atrioventricular septal defect (CAVSD) repair are limited. We sought to describe early outcomes of CAVSD repair across a large multicenter cohort and to explore potential associations with patient characteristics, including age, weight, and genetic syndromes. Patients in the Society of Thoracic Surgeons Congenital Heart Surgery Database having repair of CAVSD (2008-2011) were included. Preoperative, operative, and outcomes data were summarized, and univariate associations between patient factors and outcomes were assessed. Of 2399 patients (101 centers), 78.4% had Down syndrome. Median age at surgery was 4.6 months (interquartile range, 3.5-6.1 months), with 11.8% (n = 284) aged ≤ 2.5 months. Median weight at surgery was 5.0 kg (interquartile range, 4.3-5.8 kg), with 6.3% (n = 151) < 3.5 kg. Pulmonary artery band removal at CAVSD repair was performed in 122 patients (4.6%). Major complications occurred in 9.8%, including permanent pacemaker implantation in 2.7%. Median postoperative length of stay (PLOS) was 8 days (interquartile range, 5-14 days). Overall hospital mortality was 3.0%. Weight < 3.5 kg and age ≤ 2.5 months were associated with higher mortality, longer PLOS, and increased frequency of major complications. Patients with Down syndrome had lower rates of mortality and morbidities than other patients; PLOS was similar. In a contemporary multicenter cohort, most patients with CAVSD have repair early in the first year of life. Prior pulmonary artery banding is rare. Hospital mortality is generally low, although patients at extremes of low weight and younger age have worse outcomes. Mortality and major complication rates are lower in patients with Down syndrome. Copyright © 2014 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
Ward, Shan L; Quinn, Carson M; Valentine, Stacey L; Sapru, Anil; Curley, Martha A Q; Willson, Douglas F; Liu, Kathleen D; Matthay, Michael A; Flori, Heidi R
2016-10-01
To determine the frequency of low-tidal volume ventilation in pediatric acute respiratory distress syndrome and assess whether any demographic or clinical factors improve low-tidal volume ventilation adherence. Descriptive post hoc analysis of four multicenter pediatric acute respiratory distress syndrome studies. Twenty-six academic PICUs. Three hundred fifteen pediatric acute respiratory distress syndrome patients. All patients who received conventional mechanical ventilation at hours 0 and 24 of pediatric acute respiratory distress syndrome and who had data to calculate ideal body weight were included. Two cutoff points for low-tidal volume ventilation were assessed: less than or equal to 6.5 mL/kg of ideal body weight and less than or equal to 8 mL/kg of ideal body weight. Of 555 patients, we excluded 240 for other respiratory support modes or missing data. The remaining 315 patients had a median PaO2-to-FIO2 ratio of 140 (interquartile range, 90-201), and there were no differences in demographics between those who did and did not receive low-tidal volume ventilation. With a tidal volume cutoff of less than or equal to 6.5 mL/kg of ideal body weight, the adherence rate was 32% at hour 0 and 33% at hour 24. A low-tidal volume ventilation cutoff of less than or equal to 8 mL/kg of ideal body weight resulted in an adherence rate of 58% at hour 0 and 60% at hour 24. Low-tidal volume ventilation use did not differ by severity of pediatric acute respiratory distress syndrome, nor did adherence improve over time. At hour 0, overweight children were less likely to receive low-tidal volume ventilation of less than or equal to 6.5 mL/kg ideal body weight (11% overweight vs 38% nonoverweight; p = 0.02); no difference was noted by hour 24. Furthermore, in the overweight group, using admission weight instead of ideal body weight resulted in misclassification of up to 14% of patients as receiving low-tidal volume ventilation when they actually were not. Low-tidal volume ventilation is underused in the first 24 hours of pediatric acute respiratory distress syndrome. Age, Pediatric Risk of Mortality-III score, and pediatric acute respiratory distress syndrome severity were not associated with improved low-tidal volume ventilation adherence, nor did adherence improve over time. Overweight children were less likely to receive low-tidal volume ventilation strategies in the first day of illness.
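The adherence check above is simple arithmetic: divide the delivered tidal volume by weight and compare with the cutoff. The sketch below, with hypothetical numbers, also illustrates the misclassification the authors describe when admission weight is used in place of ideal body weight.

```python
# A minimal sketch of the cutoff logic described above. The tidal volume and
# weights are hypothetical; only the 6.5 and 8 mL/kg ideal-body-weight
# cutoffs come from the study.
def is_low_tidal_volume(tidal_volume_ml: float, weight_kg: float,
                        cutoff_ml_per_kg: float) -> bool:
    """True if the delivered tidal volume is at or below the cutoff."""
    return tidal_volume_ml / weight_kg <= cutoff_ml_per_kg

tidal_volume_ml = 280.0     # hypothetical delivered tidal volume
ideal_weight_kg = 35.0      # from a height-based ideal-body-weight estimate
admission_weight_kg = 48.0  # admission weight of a hypothetical overweight child

for cutoff in (6.5, 8.0):
    by_ibw = is_low_tidal_volume(tidal_volume_ml, ideal_weight_kg, cutoff)
    by_admission = is_low_tidal_volume(tidal_volume_ml, admission_weight_kg, cutoff)
    # At 6.5 mL/kg this patient is misclassified as "low" by admission weight.
    print(f"cutoff {cutoff} mL/kg: IBW -> {by_ibw}, admission weight -> {by_admission}")
```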
Automated external defibrillators and simulated in-hospital cardiac arrests.
Rossano, Joseph W; Jefferson, Larry S; Smith, E O'Brian; Ward, Mark A; Mott, Antonio R
2009-05-01
To test the hypothesis that pediatric residents would have a shorter time to attempted defibrillation using automated external defibrillators (AEDs) compared with manual defibrillators (MDs). A prospective, randomized, controlled trial of AEDs versus MDs was performed. Pediatric residents responded to a simulated in-hospital ventricular fibrillation cardiac arrest and were randomized to use either an AED or an MD. The primary end point was time to attempted defibrillation. Sixty residents, 21 (35%) of them interns, were randomized to 2 groups (AED = 30, MD = 30). Residents randomized to the AED group had a significantly shorter time to attempted defibrillation [median, 60 seconds (interquartile range, 53 to 71 seconds)] compared with those randomized to the MD group [median, 103 seconds (interquartile range, 68 to 288 seconds)] (P < .001). All residents in the AED group attempted defibrillation within 5 minutes, compared with 23 (77%) in the MD group (P = .01). AEDs shortened the time to attempted defibrillation by pediatric residents in simulated cardiac arrests. Further studies are needed to help determine the role of AEDs in pediatric in-hospital cardiac arrests.
Changing Epidemiology of Human Brucellosis, China, 1955-2014.
Lai, Shengjie; Zhou, Hang; Xiong, Weiyi; Gilbert, Marius; Huang, Zhuojie; Yu, Jianxing; Yin, Wenwu; Wang, Liping; Chen, Qiulan; Li, Yu; Mu, Di; Zeng, Lingjia; Ren, Xiang; Geng, Mengjie; Zhang, Zike; Cui, Buyun; Li, Tiefeng; Wang, Dali; Li, Zhongjie; Wardrop, Nicola A; Tatem, Andrew J; Yu, Hongjie
2017-02-01
Brucellosis, a zoonotic disease, was made statutorily notifiable in China in 1955. We analyzed the incidence and spatial-temporal distribution of human brucellosis during 1955-2014 in China using notifiable surveillance data: aggregated data for 1955-2003 and individual case data for 2004-2014. A total of 513,034 brucellosis cases were recorded during 1955-2014, of which 99.3% were reported in northern China, and 69.1% (258,462/374,141) occurred during February-July in 1990-2014. Incidence remained high during 1955-1978 (interquartile range 0.42-1.0 cases/100,000 residents), then decreased dramatically in 1979-1994. However, brucellosis has reemerged since 1995 (interquartile range 0.11-0.23 in 1995-2003 and 1.48-2.89 in 2004-2014); the historical high occurred in 2014, and the affected area expanded from northern pastureland provinces to the adjacent grassland and agricultural areas, then to southern coastal and southwestern areas. Control strategies in China should be adjusted to account for these changes by adopting a One Health approach.
Luttmann-Gibson, Heike; Sarnat, Stefanie Ebelt; Suh, Helen H; Coull, Brent A; Schwartz, Joel; Zanobetti, Antonella; Gold, Diane R
2014-02-01
We examined whether ambient air pollution is associated with oxygen saturation in 32 elderly subjects in Steubenville, Ohio. We used linear mixed models to examine the effects of fine particulate matter smaller than 2.5 μm (PM2.5), sulfate (SO4(2-)), elemental carbon, and gases on median oxygen saturation. An interquartile range increase of 13.4 μg/m(3) in PM2.5 on the previous day was associated with a change of -0.18% (95% confidence interval: -0.31 to -0.06) in oxygen saturation during the initial 5-minute rest period of the protocol, and a 5.1 μg/m(3) interquartile range increase in SO4(2-) on the previous day was associated with a change of -0.16% (95% confidence interval: -0.27 to -0.04). Increased exposure to air pollution, including the nontraffic pollutant SO4(2-) from industrial sources, led to changes in oxygen saturation that may reflect particle-induced pulmonary inflammatory or vascular responses.
Cancer patient experience, hospital performance and case mix: evidence from England.
Abel, Gary A; Saunders, Catherine L; Lyratzopoulos, Georgios
2014-01-01
This study aims to explore differences between crude and case mix-adjusted estimates of hospital performance with respect to the experience of cancer patients. This study analyzed the English 2011/2012 Cancer Patient Experience Survey covering all English National Health Service hospitals providing cancer treatment (n = 160). Logistic regression analysis was used to predict hospital performance for each of the 64 evaluative questions, adjusting for age, gender, ethnic group and cancer diagnosis. The degree of reclassification was explored across three categories (bottom 20%, middle 60% and top 20% of hospitals). There was high concordance between crude and adjusted ranks of hospitals (median Kendall's τ = 0.84; interquartile range: 0.82-0.88). Across all questions, a median of 5.0% (eight) of hospitals (interquartile range: 3.8-6.4%; six to ten hospitals) moved out of the extreme performance categories after case mix adjustment. In this context, patient case mix has only a small impact on measured hospital performance for cancer patient experience.
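The analysis above reduces to three reproducible steps: score each hospital crudely and after case-mix adjustment, measure rank concordance with Kendall's τ, and count hospitals that leave the extreme (bottom 20%/top 20%) bands. The sketch below illustrates those steps on simulated scores; in the real analysis the adjusted score would come from a logistic regression on age, gender, ethnic group, and cancer diagnosis, whereas here the "adjustment" is simulated noise.

```python
# A minimal sketch of the rank-concordance and reclassification analysis
# above, on simulated hospital scores (not study data).
import numpy as np
import pandas as pd
from scipy.stats import kendalltau

rng = np.random.default_rng(0)
crude = rng.uniform(0.6, 0.95, size=160)          # crude % positive responses
adjusted = crude + rng.normal(0, 0.01, size=160)  # hypothetical adjusted scores

tau, _ = kendalltau(crude, adjusted)              # concordance of hospital ranks

bands = [0, 0.2, 0.8, 1.0]                        # bottom 20% / middle 60% / top 20%
labels = ["bottom", "middle", "top"]
crude_band = pd.qcut(crude, q=bands, labels=labels)
adjusted_band = pd.qcut(adjusted, q=bands, labels=labels)

# Hospitals in an extreme band on crude scores that leave it after adjustment.
moved_out = ((crude_band != "middle") & (crude_band != adjusted_band)).sum()
print(f"Kendall's tau = {tau:.2f}; hospitals leaving an extreme band = {moved_out}")
```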
Sawinski, Deirdre; Patel, Nikunjkumar; Appolo, Brenda; Bloom, Roy
2017-05-01
Hepatitis C virus (HCV) infection is prevalent in the renal transplant population, but direct-acting antiviral agents (DAAs) provide an effective cure of HCV infection without risk of allograft rejection. We report our experience treating 43 renal transplant recipients with 4 different DAA regimens. One hundred percent achieved a sustained viral response by 12 weeks after therapy, and the DAA regimens were well tolerated. Recipients transplanted with a kidney from an HCV+ donor responded equally well to DAA therapy as those transplanted with a kidney from an HCV- donor, but recipients of HCV+ organs experienced significantly shorter wait times to transplantation: 485 days (interquartile range, 228-783) versus 969 days (interquartile range, 452-2008; P = 0.02). On this basis, we advocate for a strategy of early posttransplant HCV eradication to facilitate use of HCV+ organs whenever possible. Additional studies are needed to identify the optimal DAA regimen for kidney transplant recipients, accounting for efficacy, timing relative to transplant, posttransplant clinical outcomes, and cost.
Durga, Padmaja; Raavula, Parvathi; Gurajala, Indira; Gunnam, Poojita; Veerabathula, Prardhana; Reddy, Mukund; Upputuri, Omkar; Ramachandran, Gopinath
2015-09-01
To assess the effect of tranexamic acid on the quality of the surgical field. Prospective, randomized, double-blind study. Institutional, tertiary referral hospital. American Society of Anesthesiologists physical status class I patients, aged 8 to 60 months, with Group II or III (Balakrishnan's classification) clefts scheduled for cleft palate repair. Children were randomized into two groups: the control group received saline, and the tranexamic acid group received tranexamic acid 10 mg/kg as a bolus, 15 minutes before incision. Outcome measures were the grade of the surgical field on a 10-point scale, surgeon satisfaction, and primary hemorrhage. Surgeon satisfaction and the median grade of the surgical field were significantly better in the tranexamic acid group than in the control group (median grade, 4 [interquartile range, 4 to 6] in the control group vs. 3 [interquartile range, 2 to 4] in the tranexamic acid group; P = .003). Preincision administration of 10 mg/kg of tranexamic acid significantly improved the surgical field during cleft palate repair.
Shehabi, Yahya; Bellomo, Rinaldo; Kadiman, Suhaini; Ti, Lian Kah; Howe, Belinda; Reade, Michael C; Khoo, Tien Meng; Alias, Anita; Wong, Yu-Lin; Mukhopadhyay, Amartya; McArthur, Colin; Seppelt, Ian; Webb, Steven A; Green, Maja; Bailey, Michael J
2018-06-01
In the absence of a universal definition of light or deep sedation, the level of sedation that conveys favorable outcomes is unknown. We quantified the relationship between escalating intensity of sedation in the first 48 hours of mechanical ventilation and 180-day survival, time to extubation, and delirium. Harmonized data from prospective multicenter international longitudinal cohort studies were analyzed. The setting was a diverse mix of ICUs, and the patients were critically ill and expected to be ventilated for longer than 24 hours. Richmond Agitation Sedation Scale and pain were assessed every 4 hours. Delirium and mobilization were assessed daily using the Confusion Assessment Method of ICU and a standardized mobility assessment, respectively. Sedation intensity was assessed using a Sedation Index, calculated as the sum of negative Richmond Agitation Sedation Scale measurements divided by the total number of assessments. We used multivariable Cox proportional hazard models to adjust for relevant covariates. We performed subgroup and sensitivity analyses accounting for immortal time bias using the same variables within 120 and 168 hours. The main outcome was 180-day survival. We assessed 703 patients in 42 ICUs with a mean (SD) Acute Physiology and Chronic Health Evaluation II score of 22.2 (8.5) and a 180-day mortality of 32.3% (227 patients). The median (interquartile range) ventilation time was 4.54 days (2.47-8.43 d). Delirium occurred in 273 patients (38.8%). Sedation intensity, in an escalating dose-dependent relationship, independently predicted an increased risk of death (hazard ratio [HR], 1.29; 95% CI, 1.15-1.46; p < 0.001), delirium (HR, 1.25; 95% CI, 1.10-1.43; p = 0.001), and a reduced chance of early extubation (HR, 0.80; 95% CI, 0.73-0.87; p < 0.001). Agitation level independently predicted subsequent delirium (HR, 1.25; 95% CI, 1.04-1.49; p = 0.02). Delirium or mobilization episodes within 168 hours, adjusted for sedation intensity, were not associated with survival. Sedation intensity independently, in an ascending relationship, predicted an increased risk of death, delirium, and delayed time to extubation. These observations suggest that keeping the sedation level equivalent to a Richmond Agitation Sedation Scale score of 0 is a clinically desirable goal.
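The Sedation Index above is a simple summary statistic: the sum of the negative Richmond Agitation Sedation Scale (RASS) measurements divided by the total number of assessments, so deeper sedation yields a more negative index. A minimal sketch, with a hypothetical series of 4-hourly scores:

```python
# A minimal sketch of the Sedation Index defined above. The example RASS
# scores are hypothetical.
def sedation_index(rass_scores):
    """Sum of negative RASS values divided by the total number of assessments."""
    if not rass_scores:
        raise ValueError("at least one RASS assessment is required")
    return sum(s for s in rass_scores if s < 0) / len(rass_scores)

# Twelve 4-hourly assessments over the first 48 hours (hypothetical patient):
print(sedation_index([-4, -4, -3, -3, -2, -2, -1, 0, 0, -1, -2, -2]))  # -2.0
```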
Stone, M; Collins, A L; Silins, U; Emelko, M B; Zhang, Y S
2014-03-01
There is increasing global concern regarding the impacts of large-scale land disturbance by wildfire on a wide range of water and related ecological services. This study explores the impact of the 2003 Lost Creek wildfire in the Crowsnest River basin, Alberta, Canada on regional-scale sediment sources using a tracing approach. A composite geochemical fingerprinting procedure was used to apportion the sediment efflux among three key spatial sediment sources: 1) unburned (reference), 2) burned, and 3) burned sub-basins that were subsequently salvage logged. Spatial sediment sources were characterized by collecting time-integrated suspended sediment samples using passive devices during the entire ice-free periods in 2009 and 2010. The tracing procedure combines the Kruskal-Wallis H-test, principal component analysis and genetic-algorithm-driven discriminant function analysis for source discrimination. Source apportionment was based on a numerical mass balance model deployed within a Monte Carlo framework incorporating both local optimization and global (genetic algorithm) optimization. The mean relative frequency-weighted average median inputs from the three spatial source units were estimated to be 17% (inter-quartile uncertainty range 0-32%) from the reference areas, 45% (inter-quartile uncertainty range 25-65%) from the burned areas and 38% (inter-quartile uncertainty range 14-59%) from the burned-salvage logged areas. High sediment inputs from the burned and burned-salvage logged areas, representing spatial source units 2 and 3, reflect the lasting effects of forest canopy and forest floor organic matter disturbance during the 2003 wildfire, including increased runoff and sediment availability related to high terrestrial erosion, streamside mass wasting and river bank collapse. The results demonstrate the impact of wildfire and the incremental pressures associated with salvage logging on catchment spatial sediment sources in higher-elevation Montane regions where forest growth and vegetation recovery are relatively slow. Copyright © 2013 Elsevier B.V. All rights reserved.
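At its core, the mass-balance unmixing above asks which mix of the three source signatures best reproduces the downstream sediment's tracer signature. The sketch below shows the Monte Carlo flavor of that idea with two hypothetical tracers and a simple accept-the-best-fits rule; the published procedure additionally weights tracers and layers local and genetic-algorithm optimization on top, so this is only an illustration of the principle.

```python
# A minimal sketch of the Monte Carlo mass-balance idea above: draw candidate
# source proportions, keep those that best reproduce the downstream tracer
# signature, and summarize each source as a median with an interquartile
# uncertainty range. All tracer values are hypothetical.
import numpy as np

rng = np.random.default_rng(42)
source_means = np.array([[1.0, 5.0],   # unburned (reference)
                         [3.0, 2.0],   # burned
                         [2.5, 2.8]])  # burned + salvage logged
mixture = np.array([2.6, 2.9])         # hypothetical downstream sediment signature

draws = rng.dirichlet(np.ones(3), size=100_000)       # proportions summing to 1
errors = np.linalg.norm(draws @ source_means - mixture, axis=1)
accepted = draws[errors < np.quantile(errors, 0.01)]  # keep best-fitting 1%

for name, p in zip(["reference", "burned", "salvage-logged"], accepted.T):
    q1, med, q3 = np.percentile(p, [25, 50, 75])
    print(f"{name}: median {med:.0%} (IQR {q1:.0%}-{q3:.0%})")
```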
Tobin, W Oliver; Guo, Yong; Krecke, Karl N; Parisi, Joseph E; Lucchinetti, Claudia F; Pittock, Sean J; Mandrekar, Jay; Dubey, Divyanshu; Debruyne, Jan; Keegan, B Mark
2017-09-01
Chronic lymphocytic inflammation with pontine perivascular enhancement responsive to steroids (CLIPPERS) is a central nervous system inflammatory syndrome predominantly affecting the brainstem, cerebellum, and spinal cord. Following its initial description, the salient features of CLIPPERS have been confirmed and expanded upon, but the lack of formalized diagnostic criteria has led to reports of patients with dissimilar features purported to have CLIPPERS. We evaluated clinical, radiological and pathological features of patients referred for suspected CLIPPERS and propose diagnostic criteria to discriminate CLIPPERS from non-CLIPPERS aetiologies. Thirty-five patients were evaluated for suspected CLIPPERS. Clinical and neuroimaging data were reviewed by three neurologists to confirm CLIPPERS by consensus agreement. Neuroimaging and neuropathology were reviewed by experienced neuroradiologists and neuropathologists, respectively, both of whom were blinded to the clinical data. CLIPPERS was diagnosed in 23 patients (18 male and five female) and 12 patients had a non-CLIPPERS diagnosis. The median age at onset in CLIPPERS patients was 58 years (interquartile range, 24-72), and they were followed for a median of 44 months (interquartile range, 38-63). The median age at onset in non-CLIPPERS patients was 52 years (interquartile range, 39-59), and they were followed for a median of 27 months (interquartile range, 14-47). Clinical symptoms of gait ataxia, diplopia, cognitive impairment, and facial paraesthesia did not discriminate CLIPPERS from non-CLIPPERS. Marked clinical and radiological corticosteroid responsiveness was observed in CLIPPERS (23/23), and clinical worsening occurred in all 12 CLIPPERS cases when corticosteroids were discontinued. Corticosteroid responsiveness was common but not universal in non-CLIPPERS [clinical improvement (8/12); radiological improvement (2/12); clinical worsening on discontinuation (3/8)]. CLIPPERS patients had brainstem-predominant perivascular gadolinium-enhancing lesions on magnetic resonance imaging that were discriminated from non-CLIPPERS by: homogeneous gadolinium-enhancing nodules <3 mm in diameter without ring enhancement or mass effect, and homogeneous T2 signal abnormality not significantly exceeding the T1 enhancement. Brain neuropathology in 14 CLIPPERS cases demonstrated marked CD3-positive T-lymphocyte, mild B-lymphocyte and moderate macrophage infiltrates, with perivascular predominance as well as diffuse parenchymal infiltration (14/14), present in meninges, white and grey matter, and associated with variable tissue destruction, astrogliosis and secondary myelin loss. Clinical, radiological and pathological features define CLIPPERS syndrome and are differentiated from non-CLIPPERS aetiologies by neuroradiological and neuropathological findings. © The Author (2017). Published by Oxford University Press on behalf of the Guarantors of Brain. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Aso, Yoshimasa; Terasawa, Tomoko; Kato, Kanako; Jojima, Teruo; Suzuki, Kunihiro; Iijima, Toshie; Kawagoe, Yoshiaki; Mikami, Shigeru; Kubota, Yoshiro; Inukai, Toshihiko; Kasai, Kikuo
2013-11-01
A soluble form of CD26/dipeptidyl peptidase 4 (sCD26/DPP4) is found in serum, where it retains DPP4 enzymatic activity. We investigated whether the serum level of sCD26/DPP4 was influenced by the oral glucose tolerance test (OGTT) in healthy subjects. The serum sCD26/DPP4 level increased significantly from 824.5 ng/mL (interquartile range, 699.0 to 1050 ng/mL) at baseline to a peak of 985.0 ng/mL (interquartile range, 796.5 to 1215 ng/mL) during the OGTT (P < 0.0001). The peak sCD26/DPP4 level correlated positively with baseline age and body mass index, and with fasting plasma glucose (FPG), homeostasis model assessment of insulin resistance (HOMA-IR), triglyceride (TG), alanine aminotransferase, and γ-glutamyl transpeptidase (GGT) levels, whereas it correlated negatively with high-density lipoprotein (HDL) cholesterol and the serum levels of total and high-molecular-weight (HMW) adiponectin. Stepwise regression analysis was done with forward selection of variables, including age, FPG, HOMA-IR, TG, HDL cholesterol, uric acid, GGT, C-reactive protein, and HMW adiponectin. In a model that explained 57.5% of the variation in the peak sCD26/DPP4 level, GGT (β = 0.382, P = 0.007) and HOMA-IR (β = 0.307, P = 0.034) were independent determinants of the peak serum level of sCD26/DPP4. Serum HMW adiponectin decreased significantly from 4.43 μg/mL (interquartile range, 2.80 to 6.65 μg/mL) at baseline to 4.17 μg/mL (interquartile range, 2.48 to 6.96 μg/mL) 120 minutes after the oral glucose load (P < 0.0001). The baseline serum level of sCD26/DPP4 showed a significant negative correlation with the percent change of HMW adiponectin during the OGTT. In conclusion, the serum level of sCD26/DPP4 increased acutely after an oral glucose load in apparently healthy subjects. The abrupt increase of serum sCD26/DPP4 after a glucose load may be a marker of insulin resistance originating in the liver or muscle. Copyright © 2013 Mosby, Inc. All rights reserved.
Adverse events and dropouts in Alzheimer's disease studies: what can we learn?
Henley, David B; Sundell, Karen L; Sethuraman, Gopalan; Schneider, Lon S
2015-01-01
Interpreting Alzheimer's disease (AD) clinical trial (CT) outcomes is complicated by treatment dropouts and adverse events (AEs). In elderly participants, AE rates, dropouts, and deaths are important considerations, as they may undermine the validity of clinical trials. Published discontinuation and safety data are limited. Safety data from 1054 placebo-treated participants in IDENTITY and IDENTITY-2, two 76-week Phase 3 AD studies conducted in 31 countries, were pooled, annualized, and summarized overall and by country and age group. Median age was 74.2 (interquartile range 67.9-79.5) years; 57.4% were female; and median observation time was 63.2 (interquartile range 41.6-77.4) weeks when study drug dosing was halted. Overall annualized rates for discontinuations, discontinuations due to AEs, serious adverse events (SAEs), and deaths were 21.6% (range 19.6%-24.0%), 8.2% (range 8.1%-8.3%), 12.0%, and 1.7%, respectively. AE and discontinuation rates varied by country and age group. Fall, pneumonia, and atrial fibrillation AEs were more frequent in the oldest age group. These annualized placebo safety data provide insight into the course of enrolled patients with mild-to-moderate AD and are useful in planning longer-term trials and in monitoring safety. Copyright © 2015 The Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
[Fluoride content of bottled natural mineral waters in Spain and prevention of dental caries].
Maraver, Francisco; Vitoria, Isidro; Almerich-Silla, José Manuel; Armijo, Francisco
2015-01-01
The aim of the study was to determine the concentration of fluoride in natural mineral waters marketed in Spain, in order to make use of fluoride to prevent tooth decay without risking dental fluorosis. This was a descriptive, cross-sectional study conducted during 2012 of natural mineral waters marketed in Spain. Three bottles with different bottling dates were analyzed for each of 109 natural mineral waters (97 Spanish and 12 imported brands), with fluoride determined by ion chromatography. The median fluoride concentration of the natural mineral waters bottled in Spain was 0.22 mg/L (range 0.00-4.16; interquartile range: 0.37). Most samples (61 brands, 62%) contained less than 0.3 mg/L, while 19 Spanish brands contained more than 0.6 mg/L. The median level in imported brands was 0.35 mg/L (range 0.10-1.21; interquartile range: 0.23). Only 28 of the 109 brands examined (25.6%) specified the fluoride content on the label; good correlation was observed between the concentrations indicated and those determined. Fluoride concentrations in natural mineral waters showed high variation. Given the growing consumption of natural mineral waters in Spain, this type of information is important to make proper use of fluoride in the primary prevention of dental caries. Copyright © 2014 Elsevier España, S.L.U. All rights reserved.
Quah, H M; Seow-Choen, F
2004-03-01
This study was designed to compare diathermy excision and diathermy coagulation in the treatment of symptomatic prolapsed piles. Forty-five consecutive patients were randomly assigned to diathermy excision hemorrhoidectomy (Group A, n = 25) or diathermy coagulation (Group B, n = 20) under general anesthesia. The median duration of surgery was ten minutes for both groups. There was no statistical difference in the severity of postoperative pain at rest between the two groups, but Group A patients felt less pain during defecation on the third postoperative day (median, 5 (interquartile range, 3-7) vs. 8 (4-9); P = 0.04) and on the sixth postoperative day (median, 5 (interquartile range, 2-6) vs. 9 (5-10); P = 0.02). There was, however, no statistical difference in postoperative oral analgesic use or patient satisfaction scores between the two groups. Complication rates were similar, except that diathermy coagulation tended to leave some residual skin components of external hemorrhoids, especially in very large prolapsed piles. Group A patients resumed work earlier (mean, 12 (range, 4-20) vs. 17 (11-21) days); however, this was not statistically significant (P = 0.1). Diathermy coagulation of hemorrhoids is a simple technique and may be considered in suitable cases.
Left ventricular hypertrophy with strain and aortic stenosis.
Shah, Anoop S V; Chin, Calvin W L; Vassiliou, Vassilis; Cowell, S Joanna; Doris, Mhairi; Kwok, T'ng Choong; Semple, Scott; Zamvar, Vipin; White, Audrey C; McKillop, Graham; Boon, Nicholas A; Prasad, Sanjay K; Mills, Nicholas L; Newby, David E; Dweck, Marc R
2014-10-28
ECG left ventricular hypertrophy with strain is associated with an adverse prognosis in aortic stenosis. We investigated the mechanisms and outcomes associated with ECG strain. One hundred and two patients (age, 70 years [range, 63-75 years]; male, 66%; aortic valve area, 0.9 cm(2) [range, 0.7-1.2 cm(2)]) underwent ECG, echocardiography, and cardiovascular magnetic resonance. They made up the mechanism cohort. Myocardial fibrosis was determined with late gadolinium enhancement (replacement fibrosis) and T1 mapping (diffuse fibrosis). The relationship between ECG strain and cardiovascular magnetic resonance was then assessed in an external validation cohort (n=64). The outcome cohort was made up of 140 patients from the Scottish Aortic Stenosis and Lipid Lowering Trial Impact on Regression (SALTIRE) study and was followed up for 10.6 years (1254 patient-years). Compared with those without left ventricular hypertrophy (n=51) and left ventricular hypertrophy without ECG strain (n=30), patients with ECG strain (n=21) had more severe aortic stenosis, increased left ventricular mass index, more myocardial injury (high-sensitivity plasma cardiac troponin I concentration, 4.3 ng/L [interquartile range, 2.5-7.3 ng/L] versus 7.3 ng/L [interquartile range, 3.2-20.8 ng/L] versus 18.6 ng/L [interquartile range, 9.0-45.2 ng/L], respectively; P<0.001) and increased diffuse fibrosis (extracellular volume fraction, 27.4±2.2% versus 27.2±2.9% versus 30.9±1.9%, respectively; P<0.001). All patients with ECG strain had midwall late gadolinium enhancement (positive and negative predictive values of 100% and 86%, respectively). Indeed, late gadolinium enhancement was independently associated with ECG strain (odds ratio, 1.73; 95% confidence interval, 1.08-2.77; P=0.02), a finding confirmed in the validation cohort. In the outcome cohort, ECG strain was an independent predictor of aortic valve replacement or cardiovascular death (hazard ratio, 2.67; 95% confidence interval, 1.35-5.27; P<0.01). ECG strain is a specific marker of midwall myocardial fibrosis and predicts adverse clinical outcomes in aortic stenosis. © 2014 American Heart Association, Inc.
Podiatry Ankle Duplex Scan: Readily Learned and Accurate in Diabetes.
Normahani, Pasha; Powezka, Katarzyna; Aslam, Mohammed; Standfield, Nigel J; Jaffer, Usman
2018-03-01
We aimed to train podiatrists to perform a focused duplex ultrasound scan (DUS) of the tibial vessels at the ankle in diabetic patients: the podiatry ankle (PodAnk) duplex scan. Thirteen podiatrists underwent an intensive 3-hour simulation training session. Participants were then assessed while performing bilateral PodAnk duplex scans of 3 diabetic patients with peripheral arterial disease, using the duplex ultrasound objective structured assessment of technical skills (DUOSATS) tool and an "Imaging Score". A total of 156 vessel assessments were performed. All patients had abnormal waveforms with a loss of triphasic flow. Loss of triphasic flow was accurately detected in 145 (92.9%) vessels, and the correct waveform was identified in 139 (89.1%) cases. Participants achieved excellent DUOSATS scores (median 24 [interquartile range: 23-25], maximum attainable score of 26) as well as "Imaging Scores" (8 [8-8], maximum attainable score of 8), indicating proficiency in technical skills. The mean time taken for each bilateral ankle assessment was 20.4 minutes (standard deviation ±6.7). We have demonstrated that a focused DUS for the purpose of vascular assessment of the diabetic foot is readily learned through intensive simulation training.
Stroke Laterality Bias in the Management of Acute Ischemic Stroke.
McCluskey, Gavin; Wade, Carrie; McKee, Jacqueline; McCarron, Peter; McVerry, Ferghal; McCarron, Mark O
2016-11-01
Little is known of the impact of stroke laterality on the management process and outcome of patients with acute ischemic stroke (AIS). Consecutive patients admitted to a general hospital over 1 year with supratentorial AIS were eligible for inclusion in the study. Baseline characteristics and risk factors, delays in hospital admission, imaging, intrahospital transfer to an acute stroke unit, stroke severity and classification, length of hospital admission, as well as 10-year mortality were measured and compared between right and left hemisphere AIS patients. There were 141 patients (77 men, 64 women; median age 73 [interquartile range 63-79] years): 71 with left hemisphere AIS and 70 with right hemisphere AIS. Delays from stroke onset to hospital admission and to neuroimaging were similar for right and left hemisphere AIS patients. Delay in transfer to an acute stroke unit (ASU) following hospital admission was on average 14 hours longer for right hemisphere than for left hemisphere AIS patients (P = .01). Laterality was not associated with any difference in 10-year survival. Patients with mild and nondominant AIS merit particular attention to minimize their intrahospital transfer time to an ASU. Copyright © 2016 National Stroke Association. Published by Elsevier Inc. All rights reserved.
Werber, Dirk; Hoffmann, Alexandra; Santibanez, Sabine; Mankertz, Annette; Sagebiel, Daniel
2017-01-01
The largest measles outbreak in Berlin since 2001 occurred from October 2014 to August 2015. Overall, 1,344 cases were ascertained; 86% of those with available information were unvaccinated, and 146 (12%) were asylum seekers. Median age was 17 years (interquartile range: 4–29 years), 26% were hospitalised and a 1-year-old child died. Measles virus genotyping uniformly revealed the variant ‘D8-Rostov-Don’ and descendants. The virus was likely introduced by and initially spread among asylum seekers before affecting Berlin’s resident population. Among Berlin residents, the highest incidence was in children aged < 2 years, yet most cases (52%) were adults. Post-exposure vaccinations in homes for asylum seekers were not always conducted, occurred later (median: 7.5 days) than the recommended 72 hours after onset of the first case, and reached only half of potential contacts. Asylum seekers should not only have non-discriminatory, equitable access to vaccination, they also need to be offered measles vaccination in a timely fashion, i.e. immediately upon arrival in the receiving country. Supplementary immunisation activities targeting the resident population, particularly adults, are urgently needed in Berlin. PMID:28857043
Yassin, Hany Mahmoud; Abd Elmoneim, Ahmed Tohamy; El Moutaz, Hatem
2017-06-01
Ultrasound-guided rectus sheath blockade has been described to provide analgesia for midline abdominal incisions. We aimed to compare thoracic epidural analgesia (TEA) and rectus sheath analgesia (RSA) with respect to safety and efficacy. Sixty patients who underwent elective laparotomies through a midline incision were assigned randomly to receive either continuous TEA (TEA group, n = 31) or intermittent RSA (RSA group, n = 29). The number of patients who required analgesia, the time to first analgesia request, the interval and cumulative morphine doses consumed during the 72 hours postoperatively, and pain intensity on a visual analog scale (VAS) at rest and upon coughing were recorded, in addition to any side effects related to either technique or the administered drugs. Seventeen (54.84%) patients in the TEA group versus 25 (86.21%) patients in the RSA group required analgesia postoperatively, P = 0.008. Cumulative morphine consumption during the first 72 hours postoperatively was, median (interquartile range), 33 mg (27-39 mg), 95% confidence interval (CI) 28.63-37.37 mg, in the TEA group versus 51 mg (45-57 mg), 95% CI 47.4-54.6 mg, in the RSA group, P < 0.001. The time to the first request for morphine was 256.77 ± 73.45 minutes in the TEA group versus 208.82 ± 64.65 minutes in the RSA group, P = 0.031. VAS scores at rest and on coughing were comparable in both groups at all assessment time points, P > 0.05. The time to ambulation was significantly shorter in the RSA group (38.47 ± 12.34 hours) than in the TEA group (45.89 ± 8.72 hours), P = 0.009. Sedation scores were significantly higher in the RSA group than in the TEA group only at 12 and 24 hours postoperatively, with P = 0.041 and 0.013, respectively. The incidence of other morphine-related side effects, time to pass flatus, and patient satisfaction scores were comparable between the groups. Continuous TEA had markedly better opioid-sparing effects during the first 72 hours postoperatively than intermittent RSA with catheters inserted under real-time ultrasound guidance; both had comparable safety profiles, and RSA had the advantage of earlier ambulation. RSA could be used as an effective alternative when TEA cannot be employed in patients undergoing laparotomies with an extended midline incision, especially after the first postoperative day.
Nitrous Oxide Anesthesia and Plasma Homocysteine in Adolescents
Nagele, Peter; Tallchief, Danielle; Blood, Jane; Sharma, Anshuman; Kharasch, Evan D.
2011-01-01
Background Nitrous oxide inactivates vitamin B12, inhibits methionine synthase and consequently increases plasma total homocysteine (tHcy). Prolonged exposure to nitrous oxide can lead to neuropathy, spinal cord degeneration and even death in children. We tested the hypothesis that nitrous oxide anesthesia causes a significant increase in plasma tHcy in children. Methods Twenty-seven children (age 10-18 years) undergoing elective major spine surgery were enrolled, and serial plasma samples from 0 to 96 hours after induction were obtained. The anesthetic regimen, including the use of nitrous oxide, was at the discretion of the anesthesiologist. Plasma tHcy was measured using standard enzymatic assays. Results The median baseline plasma tHcy concentration was 5.1 μmol/L (interquartile range, 3.9-8.0 μmol/L) and increased in all patients exposed to nitrous oxide (n=26), by an average of +9.4 μmol/L (geometric mean; 95% CI 7.1-12.5 μmol/L) or +228% (mean; 95% CI 178%-279%). Plasma tHcy peaked between 6 and 8 hours after induction of anesthesia. The one patient who did not receive nitrous oxide had no increase in plasma tHcy. Some patients experienced a several-fold increase in plasma tHcy (maximum +567%). The increase in plasma tHcy was strongly correlated with the duration and average concentration of nitrous oxide anesthesia (r=0.80; p<0.001). Conclusions Pediatric patients undergoing nitrous oxide anesthesia develop significantly increased plasma tHcy concentrations. The magnitude of this effect appears to be greater than in adults; however, the clinical relevance is unknown. PMID:21680854
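The abstract reports the rise two ways: a geometric mean of the absolute per-patient increases and an arithmetic mean percent change. A minimal sketch of both summaries, on hypothetical baseline/peak values rather than study data:

```python
# A minimal sketch of the two change summaries quoted above. The values
# below are hypothetical, not study data.
import numpy as np

baseline = np.array([5.1, 4.2, 6.0, 5.5])  # umol/L, hypothetical
peak = np.array([14.8, 13.1, 19.5, 16.2])

increase = peak - baseline
geo_mean_increase = np.exp(np.log(increase).mean())  # geometric mean of increases
pct_change = increase / baseline * 100

print(f"geometric mean increase: {geo_mean_increase:.1f} umol/L")
print(f"mean percent change: +{pct_change.mean():.0f}%")
```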
Brazilian adults' sedentary behaviors by life domain: population-based study.
Mielke, Grégore I; da Silva, Inácio C M; Owen, Neville; Hallal, Pedro C
2014-01-01
There is rapidly-emerging evidence on the harmful health effects of sedentary behaviors. The aim of this paper was to quantify time in sedentary behaviors and document socio-demographic variations in different life domains among adults. A population-based survey was carried out in 2012 through face-to-face interviews with Brazilian adults aged 20+ years (N = 2,927). Information about time spent sedentary in a typical weekday was collected for five different domains (workplace, commuting, school/university, watching TV, and computer use at home). Descriptive and bivariate analyses examined variations in overall and domain-specific sedentary time by gender, age, educational attainment and socioeconomic position. On average, participants reported spending 5.8 (SD 4.5) hours per day sitting. The median value was 4.5 (interquartile range: 2.5-8) hours. Men, younger adults, those with higher schooling and from the wealthiest socioeconomic groups had higher overall sedentary scores. TV time was higher in women, older adults and among those with low schooling and socioeconomic position. Sedentary time in transport was higher in men, younger adults, and participants with high schooling and high socioeconomic position. Computer use at home was more frequent among young adults and those from high socioeconomic groups. Sitting at work was higher in those with higher schooling and from the wealthiest socioeconomic groups. Sedentary behavior at school was related inversely to age and directly to schooling. Patterns of sedentary behavior are different by life domains. Initiatives to reduce prolonged sitting among Brazilian adults will be required on multiple levels for different life domains.
Conway, Laurie J; Riley, Linda; Saiman, Lisa; Cohen, Bevin; Alper, Paul; Larson, Elaine L
2014-09-01
Despite substantial evidence to support the effectiveness of hand hygiene for preventing health care-associated infections, hand hygiene practice is often inadequate. Hand hygiene product dispensers that can electronically capture hand hygiene events have the potential to improve hand hygiene performance. An automated group monitoring and feedback system, implemented from January 2012 through March 2013 at a 140-bed community hospital, was studied. An electronic system that monitors the use of sanitizer and soap but does not identify individual health care personnel was used to calculate hand hygiene events per patient-hour for each of eight inpatient units and hand hygiene events per patient-visit for the six outpatient units. Hand hygiene was monitored but feedback was not provided during a six-month baseline period and a three-month rollout period. During the rollout, focus groups were conducted to determine preferences for feedback frequency and format. During the six-month intervention period, graphical reports were e-mailed monthly to all managers and administrators, and the focus groups were repeated. After the feedback began, hand hygiene increased on average by 0.17 events/patient-hour in inpatient units (interquartile range = 0.14, p = .008). In outpatient units, hand hygiene performance did not change significantly. A variety of challenges were encountered, including obtaining accurate census and staffing data, engendering confidence in the system, disseminating information in the reports, and using the data to drive improvement. Feedback via an automated system was associated with improved hand hygiene performance in the short term.
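The unit-level metrics above are straightforward rates: dispenser-counted events normalized by patient-hours on inpatient units and by patient-visits on outpatient units. A minimal sketch with hypothetical counts:

```python
# A minimal sketch of the rates described above. All counts are hypothetical.
def events_per_patient_hour(events: int, census: int, hours: float) -> float:
    return events / (census * hours)

def events_per_patient_visit(events: int, visits: int) -> float:
    return events / visits

# One month on a hypothetical 24-bed inpatient unit at full occupancy:
print(events_per_patient_hour(events=3_150, census=24, hours=24 * 30))  # ~0.18
print(events_per_patient_visit(events=410, visits=1_200))               # ~0.34
```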
A Study to Draw a Normative Database of Laryngopharynx pH Profile in Chinese
Feng, Guijian; Wang, Junyao; Zhang, Lihong; Liu, Yulan
2014-01-01
Background/Aims To draw a normative database of the laryngopharyngeal pH profile in Chinese subjects. Methods Normal volunteers were recruited from “www.Ganji.com” and People's Hospital between May 2008 and December 2009. The Restech pH probes were calibrated in pH 7 and pH 4 buffer solutions according to the manufacturer's instructions. Each volunteer was asked to wear the device for a 24-hour period and was encouraged to participate in normal daily activities. Results The healthy volunteers consisted of 20 males and 9 females with a median age of 23 years (interquartile range, 21 to 32 years). The 95th percentiles for the percentage of total time at pH < 4, pH < 4.5, pH < 5.0 and pH < 5.5 for the oropharyngeal pH catheter were 0.06%, 1.01%, 7.23% and 27.34%, respectively. The 95th percentiles for the number of reflux events within the 24-hour period at pH < 4, pH < 4.5, pH < 5.0 and pH < 5.5 were 2.0, 18.0, 107.5 and 284.5, respectively. Conclusions This is the first study to systematically assess the degree of reflux detected by the new pH probe in healthy asymptomatic Chinese volunteers and to report normative values in Chinese people. Using an oropharyngeal pH catheter to monitor laryngopharyngeal reflux indicated that in healthy Chinese, reflux should be considered normal if the percent time at pH less than 4.5 is no more than 1%. PMID:24948130
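Normative thresholds like those above are derived per subject and then across subjects: compute each subject's percentage of recording time below a pH cutoff, then take the 95th percentile over the cohort. A minimal sketch, using simulated traces as hypothetical stand-ins for 24-hour oropharyngeal recordings:

```python
# A minimal sketch of deriving normative thresholds like those above.
# The simulated traces are hypothetical, not study data.
import numpy as np

rng = np.random.default_rng(7)
recordings = rng.normal(6.8, 0.6, size=(29, 1000))  # 29 subjects' pH samples

for cutoff in (4.0, 4.5, 5.0, 5.5):
    pct_time_below = (recordings < cutoff).mean(axis=1) * 100  # per subject
    print(f"pH < {cutoff}: 95th percentile = {np.percentile(pct_time_below, 95):.2f}%")
```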
De la Maza, Verónica; Simian, Daniela; Castro, Magdalena; Torres, Juan Pablo; Lucero, Yudeth; Sepúlveda, Fanny; Mazquiaran, Soraya; Salazar, Carolina; Segovia, Lorena; Santolaya, Maria Elena
2015-10-01
Early administration of antimicrobials (AM) is important in children with cancer, fever and neutropenia (FN). The recommendation is to administer the first dose of AM within the first hour of hospital admission. Our aims were to determine the time from hospital admission of a child with FN to the first dose of AM and to determine its association with clinical outcomes. This prospective, multicenter study evaluated the time elapsed from admission to the first dose of AM, comparing this variable by admitting hospital and presentation location (Emergency Department/Oncology Units) and evaluating clinical outcomes by the following variables: days of fever, days of hospitalization, hypotension, transfer to intensive care unit, sepsis and mortality. A total of 226 children with 388 episodes of FN were enrolled from 5 hospitals (July 2012-April 2014). The median time between hospital admission and administration of the first dose of AM was 132 minutes (interquartile range: 60-246 minutes). The median time to AM administration differed significantly between hospitals (70 vs. 200 minutes, P < 0.0001) and between presentation locations (Emergency Department vs. Oncology Units, median: 200 vs. 100 minutes, P < 0.0001). Twenty-five percent of children received AM within 1 hour of admission. Administration of AM after 60 minutes was not associated with worse outcomes. Time to AM administration was longer than recommended. The findings described provide an opportunity to identify gaps and implement programs aimed at improving the equity and excellence of care in children with cancer and FN.
Stone, Will; Sawa, Patrick; Lanke, Kjerstin; Rijpma, Sanna; Oriango, Robin; Nyaurah, Maureen; Osodo, Paul; Osoti, Victor; Mahamar, Almahamoudou; Diawara, Halimatou; Woestenenk, Rob; Graumans, Wouter; van de Vegte-Bolmer, Marga; Bradley, John; Chen, Ingrid; Brown, Joelle; Siciliano, Giulia; Alano, Pietro; Gosling, Roly; Dicko, Alassane; Drakeley, Chris; Bousema, Teun
2017-01-01
Background Single low-dose primaquine (PQ) reduces Plasmodium falciparum infectivity before it impacts gametocyte density. Here, we examined the effect of PQ on gametocyte sex ratio as a possible explanation for this early sterilizing effect. Methods Quantitative reverse-transcription polymerase chain reaction assays were developed to quantify female gametocytes (targeting Pfs25 messenger RNA [mRNA]) and male gametocytes (targeting Pf3D7_1469900 mRNA) in 2 randomized trials in Kenya and Mali, comparing dihydroartemisinin-piperaquine (DP) alone to DP with PQ. Gametocyte sex ratio was examined in relation to time since treatment and infectivity to mosquitoes. Results In Kenya, the median proportion of male gametocytes was 0.33 at baseline. Seven days after treatment, gametocyte density was significantly reduced in the DP-PQ arm relative to the DP arm (females: 0.05% [interquartile range {IQR}, 0.0–0.7%] of baseline; males: 3.4% [IQR, 0.4%–32.9%] of baseline; P < .001). Twenty-four hours after treatment, gametocyte sex ratio became male-biased and was not significantly different between the DP and DP-PQ groups. In Mali, there was no significant difference in sex ratio between the DP and DP-PQ groups (>0.125 mg/kg) 48 hours after treatment, and gametocyte sex ratio was not associated with mosquito infection rates. Conclusions The early sterilizing effects of PQ may not be explained by the preferential clearance of male gametocytes and may be due to an effect on gametocyte fitness. PMID:28931236
Compliance with occlusion therapy for childhood amblyopia.
Wallace, Michael P; Stewart, Catherine E; Moseley, Merrick J; Stephens, David A; Fielder, Alistair R
2013-09-17
To explore compliance with occlusion treatment of amblyopia in the Monitored and Randomized Occlusion Treatment of Amblyopia Studies (MOTAS and ROTAS), using objective monitoring. Both studies had a three-phase protocol: initial assessment, refractive adaptation, and occlusion. In the occlusion phase, participants were instructed to dose for 6 hours/day (MOTAS) or randomized to 6 or 12 hours/day (ROTAS). Dose was monitored continuously using an occlusion dose monitor (ODM). One hundred and fifty-two patients (71 male, 81 female; 122 Caucasian, 30 non-Caucasian) of mean ± SD age 68 ± 18 months participated. Amblyopia was defined as an interocular acuity difference of at least 0.1 logMAR and was associated with anisometropia in 50 patients, strabismus in 44, and both (mixed) in 58. Median duration of occlusion was 99 days (interquartile range 72 days). Mean compliance was 44%, and the mean proportion of days on which no patch was worn was 42%. Compliance was lower (39%) on weekends compared with weekdays (46%, P = 0.04), as was the likelihood of dosing at all (52% vs. 60%, P = 0.028). Compliance was lower when attendance was less frequent (P < 0.001) and with prolonged treatment duration (P < 0.001). Age, sex, amblyopia type, and severity were not associated with compliance. Mixture modeling suggested three subpopulations of patch-day doses: less than 30 minutes; doses that achieve 30% to 80% compliance; and doses that achieve around 100% compliance. This study shows that compliance with patching treatment averages less than 50% and is influenced by several factors. A greater understanding of these influences should improve treatment outcome. (ClinicalTrials.gov number, NCT00274664).
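Compliance here is a ratio: objectively monitored patch-wear time divided by the prescribed daily dose. A minimal sketch, with hypothetical minutes:

```python
# A minimal sketch of the compliance measure above: monitored wear time over
# the prescribed dose, capped at 100%. The minutes are hypothetical.
def daily_compliance_pct(worn_minutes: float, prescribed_minutes: float) -> float:
    return min(worn_minutes / prescribed_minutes, 1.0) * 100

print(daily_compliance_pct(160, 6 * 60))  # ~44% of a 6 h/day prescription
```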
Tasker, Robert C; Goodkin, Howard P; Sánchez Fernández, Iván; Chapman, Kevin E; Abend, Nicholas S; Arya, Ravindra; Brenton, James N; Carpenter, Jessica L; Gaillard, William D; Glauser, Tracy A; Goldstein, Joshua; Helseth, Ashley R; Jackson, Michele C; Kapur, Kush; Mikati, Mohamad A; Peariso, Katrina; Wainwright, Mark S; Wilfong, Angus A; Williams, Korwyn; Loddenkemper, Tobias
2016-10-01
To describe pediatric patients with convulsive refractory status epilepticus in whom there is intention to use an IV anesthetic for seizure control. Two-year prospective observational study evaluating patients (age range, 1 mo to 21 yr) with refractory status epilepticus not responding to two antiepileptic drug classes and treated with continuous infusion of anesthetic agent. Nine pediatric hospitals in the United States. In a cohort of 111 patients with refractory status epilepticus (median age, 3.7 yr; 50% male), 54 (49%) underwent continuous infusion of anesthetic treatment. The median (interquartile range) ICU length of stay was 10 (3-20) days. Up to four "cycles" of serial anesthetic therapy were used, and seizure termination was achieved in 94% by the second cycle. Seizure duration in controlled patients was 5.9 (1.9-34) hours for the first cycle and longer when a second cycle was required (30 [4-120] hr; p = 0.048). Midazolam was the most frequent first-line anesthetic agent (78%); pentobarbital was the most frequently used second-line agent after midazolam failure (82%). An electroencephalographic endpoint was used in over half of the patients; higher midazolam dosing was used with a burst suppression endpoint. In midazolam nonresponders, transition to a second agent occurred after a median of 1 day. Most patients (94%) experienced seizure termination with these two therapies. Midazolam and pentobarbital remain the mainstay of continuous infusion therapy for refractory status epilepticus in the pediatric patient. The majority of patients experience seizure termination within a median of 30 hours. These data have implications for the design and feasibility of future intervention trials. That is, testing a new anesthetic anticonvulsant after failure of both midazolam and pentobarbital is unlikely to be feasible in a pediatric study, whereas a decision to test an alternative to pentobarbital, after midazolam failure, may be possible in a multicenter multinational study.
Kim, Joon-Tae; Fonarow, Gregg C; Smith, Eric E; Reeves, Mathew J; Navalkele, Digvijaya D; Grotta, James C; Grau-Sepulveda, Maria V; Hernandez, Adrian F; Peterson, Eric D; Schwamm, Lee H; Saver, Jeffrey L
2017-01-10
Earlier tissue plasminogen activator treatment improves ischemic stroke outcome, but aspects of the time-benefit relationship still not well delineated are: (1) the degree of additional benefit accrued with treatment in the first 60 minutes after onset, and (2) the shape of the time-benefit curve through 4.5 hours. We analyzed patients who had acute ischemic stroke treated with intravenous tissue plasminogen activator within 4.5 hours of onset from the Get With The Guidelines-Stroke US national program. Onset-to-treatment time was analyzed as a continuous, potentially nonlinear variable and as a categorical variable comparing patients treated within 60 minutes of onset with later epochs. Among 65 384 tissue plasminogen activator-treated patients, the median onset-to-treatment time was 141 minutes (interquartile range, 110-173) and 878 patients (1.3%) were treated within the first 60 minutes. Treatment within 60 minutes, compared with treatment within 61 to 270 minutes, was associated with increased odds of discharge to home (adjusted odds ratio, 1.25; 95% confidence interval, 1.07-1.45), independent ambulation at discharge (adjusted odds ratio, 1.22; 95% confidence interval, 1.03-1.45), and freedom from disability (modified Rankin Scale 0-1) at discharge (adjusted odds ratio, 1.72; 95% confidence interval, 1.21-2.46), without increased hemorrhagic complications or in-hospital mortality. The pace of decline in benefit of tissue plasminogen activator from onset-to-treatment times of 20 through 270 minutes was mildly nonlinear for discharge to home, with more rapid benefit loss in the first 170 minutes than later, and linear for independent ambulation and in-hospital mortality. Thrombolysis started within the first 60 minutes after onset is associated with best outcomes for patients with acute ischemic stroke, and benefit declined more rapidly early after onset for the ability to be discharged home. These findings support intensive efforts to organize stroke systems of care to improve the timeliness of thrombolytic therapy in acute ischemic stroke. © 2016 American Heart Association, Inc.
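As a hedged illustration of modeling onset-to-treatment time as a continuous, potentially nonlinear predictor, the sketch below fits a logistic model for discharge to home using patsy's natural cubic spline basis cr(); the variable names and simulated data are invented stand-ins, not the registry analysis.

```python
# Sketch only: invented data; cr() is patsy's natural cubic spline basis.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 5000
ott = rng.uniform(20, 270, n)                    # onset-to-treatment, minutes
p_home = 1 / (1 + np.exp(-(1.0 - 0.006 * ott)))  # benefit decays with delay
df = pd.DataFrame({"ott": ott, "home": rng.binomial(1, p_home)})

fit = smf.logit("home ~ cr(ott, df=4)", data=df).fit(disp=False)
grid = pd.DataFrame({"ott": np.linspace(30, 260, 6)})
print(fit.predict(grid).round(3))  # predicted P(discharge home) across OTT
```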
Lewin, Antoine; Buteau, Stéphane; Brand, Allan; Kosatsky, Tom; Smargiassi, Audrey
2013-01-01
Few studies have measured the effect of short-term exposure to industrial emissions on the respiratory health of children. Here we estimate the risk of hospitalization for asthma and bronchiolitis in young children associated with their recent exposure to emissions from an aluminum smelter. We used a case–crossover design to assess the risk of hospitalization, February 1999–December 2008, in relation to short-term variation in levels of exposure among children 0–4 years old living less than 7.5 km from the smelter. The percentage of hours per day that the residence of a hospitalized child was in the shadow of winds crossing the smelter was used to estimate the effect of wind-borne emissions on case and crossover days. Community-wide pollutant exposure was estimated through daily mean and daily maximum SO2 and PM2.5 concentrations measured at a fixed monitoring site near the smelter. Odds ratios (OR) were estimated using conditional logistic regressions. The risk of same-day hospitalization for asthma or bronchiolitis increased with the percentage of hours in a day that a child's residence was downwind of the smelter. For children aged 2–4 years, the OR was 1.27 (95% CI=1.03–1.56; n=103 hospitalizations), for an interquartile range (IQR) of 21% of hours being downwind. In this age group, the OR with PM2.5 daily mean levels was slightly smaller than with the hours downwind (OR: 1.22 for an IQR of 15.7 μg/m3, 95% CI=1.03–1.44; n=94 hospitalizations). Trends were observed between hospitalizations and levels of SO2 for children 2–4 years old. Increasing short-term exposure to emissions from a Quebec aluminum smelter was associated with an increased risk of hospitalization for asthma and bronchiolitis in young children who live nearby. Estimating exposure through records of wind direction allows for the integration of exposure to all pollutants carried from the smelter stack. PMID:23695491
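The case-crossover design described above is typically analyzed with conditional logistic regression stratified on the individual. Below is a minimal sketch under that assumption, using statsmodels' ConditionalLogit on simulated child-level strata; names, stratum structure, and effect sizes are illustrative only.

```python
# Sketch only: simulated strata; one case day + 3 control days per child.
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

rng = np.random.default_rng(7)
rows = []
for child in range(103):
    for day in range(4):
        case = int(day == 0)
        # % of hours downwind of the smelter, higher on case days by design
        downwind = rng.normal(30 + 8 * case, 15)
        rows.append({"child": child, "case": case, "downwind": downwind})
df = pd.DataFrame(rows)

res = ConditionalLogit(df["case"], df[["downwind"]],
                       groups=df["child"]).fit()
iqr = 21.0  # IQR of % hours downwind, taken from the abstract
print("OR per IQR of hours downwind:",
      round(float(np.exp(np.asarray(res.params)[0] * iqr)), 2))
```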
Yang, Wen-Yi; Efremov, Ljupcho; Mujaj, Blerim; Zhang, Zhen-Yu; Wei, Fang-Fei; Huang, Qi-Fang; Thijs, Lutgarde; Vanassche, Thomas; Nawrot, Tim S; Staessen, Jan A
2018-01-01
In view of decreasing lead exposure and guidelines endorsing ambulatory over office blood pressure (BP) measurement, we reassessed the association of BP with blood lead (BL) in 236 newly employed men (mean age, 28.6 years) without previous lead exposure who were not treated for hypertension. Office BP was the mean of five auscultatory readings at one visit. Twenty-four-hour BP was recorded at 15- and 30-minute intervals during wakefulness and sleep. BL was determined by inductively coupled plasma mass spectrometry. Systolic/diastolic office BP averaged 120.0/80.7 mm Hg, and the 24-hour, awake, and asleep BP 125.5/73.6, 129.3/77.9, and 117.6/65.0 mm Hg, respectively. The geometric mean of blood lead was 4.5 μg/dL (interquartile range, 2.60-9.15 μg/dL). In multivariable-adjusted analyses, effect sizes associated with BL doubling were 0.79/0.87 mm Hg (P = .11/.043) for office BP and 0.29/-0.25, 0.60/-0.10, and -0.40/-0.43 mm Hg for 24-hour, awake, and asleep BP (P ≥ .33). Neither office nor 24-hour ambulatory hypertension was related to BL (P ≥ .14). A clinically relevant white coat effect (WCE; office minus awake BP, ≥20/≥10 mm Hg) was attributable to exceeding the systolic or diastolic threshold in 1 and 45 workers, respectively. With BL doubling, the systolic/diastolic WCE increased by 0.20/0.97 mm Hg (P = .57/.046). Accounting for the presence of a diastolic WCE reduced the association size of office diastolic BP with BL to 0.39 mm Hg (95% confidence interval, -0.20 to 1.33; P = .15). In conclusion, this cross-sectional analysis of newly hired workers before lead exposure identified the WCE as a confounder of the association between office BP and BL and did not reveal any association between ambulatory BP and BL. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
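One common way to obtain an "effect size per doubling of blood lead", as reported above, is to regress BP on log2-transformed BL so that the coefficient is the mm Hg change per doubling. The sketch below assumes this approach with simulated data and invented covariates; it is not the study's actual model.

```python
# Sketch only: simulated workers; coefficient on log2(BL) = change per doubling.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 236
bl = np.exp(rng.normal(np.log(4.5), 0.8, n))  # blood lead, μg/dL, right-skewed
df = pd.DataFrame({"log2_bl": np.log2(bl),
                   "age": rng.normal(28.6, 5.0, n),
                   "bmi": rng.normal(24.0, 3.0, n)})
df["office_dbp"] = (80 + 0.9 * df["log2_bl"] + 0.1 * df["age"]
                    + rng.normal(0, 8, n))

fit = smf.ols("office_dbp ~ log2_bl + age + bmi", data=df).fit()
print(f"diastolic BP change per BL doubling: "
      f"{fit.params['log2_bl']:.2f} mm Hg")
```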
Early exposure to hyperoxia and mortality in critically ill patients with severe traumatic injuries.
Russell, Derek W; Janz, David R; Emerson, William L; May, Addison K; Bernard, Gordon R; Zhao, Zhiguo; Koyama, Tatsuki; Ware, Lorraine B
2017-02-03
Hyperoxia is common early in the course of resuscitation of critically ill patients. It has been associated with mortality in some, but not all, studies of cardiac arrest patients and other critically ill cohorts. Reasons for the inconsistency are unclear and may depend on unmeasured patient confounders, the timing and duration of hyperoxia, population characteristics, or the way that hyperoxia is defined and measured. We sought to determine whether, in a prospectively collected cohort of mechanically ventilated patients with traumatic injuries with and without head trauma, higher maximum partial pressure of arterial oxygen (PaO2) within 24 hours of admission would be associated with increased risk of in-hospital mortality. Critically ill patients with traumatic injuries undergoing invasive mechanical ventilation enrolled in the Validating Acute Lung Injury biomarkers for Diagnosis (VALID) study were included in this study. All arterial blood gases (ABGs) from the first 24 hours of admission were recorded. The primary analysis was a comparison of the highest PaO2 between hospital survivors and non-survivors. A total of 653 patients were evaluated for inclusion. Of these, 182 were not mechanically ventilated or did not have an ABG measured in the first 24 hours, leaving 471 patients in the primary analysis. In survivors, the maximum PaO2 was 141 mmHg (median; interquartile range, 103-212) compared to 148 mmHg (IQR, 105-209) in non-survivors (p = 0.82). In the subgroup with head trauma (n = 266), the maximum PaO2 was 133 mmHg (IQR, 97-187) among survivors and 152 mmHg (IQR, 108-229) among non-survivors (p = 0.19). After controlling for age, injury severity score, number of arterial blood gases, and fraction of inspired oxygen, maximum PaO2 was not associated with increased mortality (OR 1.27 for every fold increase of PaO2; 95% CI, 0.72-2.25). In mechanically ventilated patients with severe traumatic injuries, hyperoxia in the first 24 hours of admission was not associated with increased risk of death or worsened neurological outcomes in a setting without brain tissue oxygenation monitoring.
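The abstract does not name the test behind its p-values; a rank-based comparison is a common choice for medians with IQRs, so the sketch below assumes a Mann-Whitney U test on simulated PaO2 values shaped to resemble the reported medians.

```python
# Sketch only: simulated PaO2 shaped to the reported medians.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(5)
pao2_surv = rng.lognormal(np.log(141), 0.4, 400)    # hospital survivors
pao2_nonsurv = rng.lognormal(np.log(148), 0.4, 71)  # non-survivors

stat, p = mannwhitneyu(pao2_surv, pao2_nonsurv, alternative="two-sided")
print(f"median survivors {np.median(pao2_surv):.0f} mmHg vs "
      f"non-survivors {np.median(pao2_nonsurv):.0f} mmHg, p = {p:.2f}")
```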
Stewart, Barclay T.; Tansley, Gavin; Gyedu, Adam; Ofosu, Anthony; Donkor, Peter; Appiah-Denkyira, Ebenezer; Quansah, Robert; Clarke, Damian L.; Volmink, Jimmy; Mock, Charles
2017-01-01
IMPORTANCE Conditions that can be treated by surgery comprise more than 16% of the global disease burden. However, 5 billion people do not have access to essential surgical care. An estimated 90% of the 87 million disability-adjusted life-years incurred by surgical conditions could be averted by providing access to timely and safe surgery in low-income and middle-income countries. Population-level spatial access to essential surgery in Ghana is not known. OBJECTIVES To assess the performance of bellwether procedures (ie, open fracture repair, emergency laparotomy, and cesarean section) as a proxy for performing essential surgery more broadly, to map population-level spatial access to essential surgery, and to identify first-level referral hospitals that would most improve access to essential surgery if strengthened in Ghana. DESIGN, SETTING, AND PARTICIPANTS Population-based study among all households and public and private not-for-profit hospitals in Ghana. Households were represented by georeferenced census data. First-level and second-level referral hospitals managed by the Ministry of Health and all tertiary hospitals were included. Surgical data were collected from January 1 to December 31, 2014. MAIN OUTCOMES AND MEASURES All procedures performed at first-level referral hospitals in Ghana in 2014 were used to sort each facility into 1 of the following 3 hospital groups: those without capability to perform all 3 bellwether procedures, those that performed 1 to 11 of each procedure, and those that performed at least 12 of each procedure. Candidates for targeted capability improvement were identified by cost-distance and network analysis. RESULTS Of 155 first-level referral hospitals managed by the Ghana Health Service and the Christian Health Association of Ghana, 123 (79.4%) reported surgical data. Ninety-five (77.2%) did not have the capability in 2014 to perform all 3 bellwether procedures, 24 (19.5%) performed 1 to 11 of each bellwether procedure, and 4 (3.3%) performed at least 12. The essential surgical procedure rate was greater in bellwether procedure–capable first-level referral hospitals than in noncapable hospitals (median, 638; interquartile range, 440–1418 vs 360; interquartile range, 0–896 procedures per 100 000 population; P = .03). Population-level spatial access within 2 hours to a hospital that performed 1 to 11 and at least 12 of each bellwether procedure was 83.2% (uncertainty interval [UI], 82.2%–83.4%) and 71.4% (UI, 64.4%–75.0%), respectively. Five hospitals were identified for targeted capability improvement. CONCLUSIONS AND RELEVANCE Almost 30% of Ghanaians cannot access essential surgery within 2 hours. Bellwether capability is a useful metric for essential surgery more broadly. Similar strategic planning exercises might be useful for other low-income and middle-income countries aiming to improve access to essential surgery. PMID:27331865
Analysis of postdischarge costs following emergent general surgery in elderly patients
Eamer, Gilgamesh J.; Clement, Fiona; Pederson, Jenelle L.; Churchill, Thomas A.; Khadaroo, Rachel G.
2018-01-01
Background As populations age, more elderly patients will undergo surgery. Frailty and complications are considered to increase in-hospital cost in older adults, but little is known on costs following discharge, particularly those borne by the patient. We examined risk factors for increased cost and the type of costs accrued following discharge in elderly surgical patients. Methods Acute abdominal surgery patients aged 65 years and older were prospectively enrolled. We assessed baseline clinical characteristics, including Clinical Frailty Scale (CFS) scores. We calculated 6-month cost (in Canadian dollars) from patient-reported use following discharge according to the validated Health Resource Utilization Inventory. Primary outcomes were 6-month overall cost and cost for health care services, medical products and lost productive hours. Outcomes were log-transformed and assessed in multivariable generalized linear and zero-inflated negative binomial regressions and can be interpreted as adjusted ratios (AR). Complications were assessed according to Clavien–Dindo classification. Results We included 150 patients (mean age 75.5 ± 7.6 yr; 54.1% men) in our analysis; 10.8% had major and 43.2% had minor complications postoperatively. The median 6-month overall cost was $496 (interquartile range $140–$1948). Disaggregated by cost type, frailty independently predicted increasing costs of health care services (AR 1.76, 95% confidence interval [CI] 1.43–2.18, p < 0.001) and medical products (AR 1.61, 95% CI 1.15–2.25, p = 0.005), but decreasing costs in lost productive hours (AR 0.39, p = 0.002). Complications did not predict increased cost. Conclusion Frail patients accrued higher health care services and product costs, but lower costs from lost productive hours. Interventions in elderly surgical patients should consider patient-borne cost in older adults and lost productivity in less frail patients. Trial registration NCT02233153 (clinicaltrials.gov). PMID:29368673
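Reading coefficients as adjusted ratios (AR), as above, typically follows from a log-link model in which exponentiated coefficients act as multiplicative effects. A minimal sketch, assuming a Gamma GLM with a log link on simulated cost data (not the study's actual specification):

```python
# Sketch only: simulated costs; exp(coefficient) reads as an adjusted ratio.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 150
df = pd.DataFrame({"cfs": rng.integers(1, 8, n).astype(float),  # frailty score
                   "age": rng.normal(75.5, 7.6, n)})
df["cost"] = rng.gamma(2.0, 0.5 * np.exp(0.4 * df["cfs"])) + 1.0  # dollars > 0

fit = smf.glm("cost ~ cfs + age", data=df,
              family=sm.families.Gamma(link=sm.families.links.Log())).fit()
print("AR per CFS point:", round(float(np.exp(fit.params["cfs"])), 2))
```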
Warner, Matthew A; Woodrum, David A; Hanson, Andrew C; Schroeder, Darrell R; Wilson, Gregory A; Kor, Daryl J
2016-08-01
To determine the association between prophylactic plasma transfusion and periprocedural red blood cell (RBC) transfusion rates in patients with elevated international normalized ratio (INR) values undergoing interventional radiology procedures. In this retrospective cohort study, adult patients undergoing interventional radiology procedures with a preprocedural INR available within 30 days of the procedure during a study period of January 1, 2009, to December 31, 2013, were eligible for inclusion. Baseline characteristics, coagulation parameters, transfusion requirements, and procedural details were extracted. Univariate and multivariable propensity-matched analyses were used to assess the relationships between prophylactic plasma transfusion and the outcomes of interest, with the a priori primary outcome of RBC transfusion during the procedure or within the first 24 hours after it. A total of 18,204 patients met inclusion criteria, and 1803 (9.9%) had an INR of 1.5 or greater before their procedure. Of these 1803 patients, 196 (10.9%) received prophylactic plasma transfusion, with a median time of 1.9 hours (interquartile range [IQR], 1.1-3.2 hours) between plasma transfusion initiation and procedure initiation. In multivariable propensity-matched analysis, plasma administration was associated with increased periprocedural RBC transfusions (odds ratio, 2.20; 95% CI, 1.38-3.50; P<.001) and postprocedural intensive care unit admission rates (odds ratio, 2.11; 95% CI, 1.41-3.14; P<.001) as compared with those who were not transfused preprocedurally. Similar relationships were seen at higher INR thresholds for plasma transfusion. In patients undergoing interventional radiology procedures, preprocedural plasma transfusions given in the setting of elevated INR values were associated with increased periprocedural RBC transfusions. Additional research is needed to clarify this potential association between preprocedural plasma transfusion and periprocedural RBC transfusion. Copyright © 2016 Mayo Foundation for Medical Education and Research. Published by Elsevier Inc. All rights reserved.
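A propensity-matched comparison like the one above can be sketched as: model the probability of treatment from baseline covariates, then match treated to untreated patients on that score. The toy example below uses simulated data, two invented covariates, and simple 1:1 nearest-neighbor matching with replacement; the study's covariates and matching rules were more elaborate.

```python
# Sketch only: simulated patients; 1:1 nearest-neighbor match with replacement.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(11)
n = 1803
df = pd.DataFrame({"inr": rng.normal(1.8, 0.3, n),
                   "age": rng.normal(60, 15, n)})
df["plasma"] = rng.binomial(1, 1 / (1 + np.exp(-3 * (df["inr"] - 1.9))))
df["rbc"] = rng.binomial(1, 0.05 + 0.05 * df["plasma"])  # toy outcome

ps_model = LogisticRegression().fit(df[["inr", "age"]], df["plasma"])
df["score"] = ps_model.predict_proba(df[["inr", "age"]])[:, 1]

treated = df[df["plasma"] == 1]
control = df[df["plasma"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["score"]])
_, idx = nn.kneighbors(treated[["score"]])
matched = control.iloc[idx.ravel()]
print("RBC rate, plasma vs matched no-plasma:",
      round(treated["rbc"].mean(), 3), round(matched["rbc"].mean(), 3))
```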
Long-term Safety of Living Kidney Donation in an Emerging Economy.
Rizvi, S Adibul Hasan; Zafar, Mirza Naqi; Jawad, Fatema; Aziz, Tahir; Hussain, Zafar; Hashmi, Altaf; Hussain, Manzoor; Akhtar, Fazal; Ahmed, Ejaz; Naqvi, Rubina; Naqvi, S A Anwar
2016-06-01
Long-term follow-up and management of donors was undertaken in a specialist kidney transplant unit in Pakistan to identify risk and prevent adverse outcomes in living related kidney donors. In an observational cohort study between 1985 and 2012, 3748 donors were offered free medical follow-up and treatment 6 to 12 months after donation and annually thereafter. Each visit included history, physical examination, blood tests for renal, lipid, and glucose profiles, and 24-hour urine for proteinuria and creatinine clearance. Preventive intervention was undertaken for new-onset clinical conditions. Donor outcomes were compared with 90 nondonor healthy siblings matched for age, sex, and body mass index. Of the 3748 donors, 2696 (72%) were in regular yearly follow-up for up to 27 years (median, 5.6 years; interquartile range, 7.9). Eleven (0.4%) died 4 to 22 years after donation, with all-cause mortality of 4.0/10 000 person-years. Six (0.2%) developed end-stage renal disease 5 to 17 years after donation (2.7/10 000 person-years). Proteinuria greater than 1000 mg/24 hours developed in 28 patients (1%), hypertension in 371 patients (13.7%), and diabetes in 95 patients (3.6%). Therapeutic intervention controlled proteinuria to less than 1000 mg/24 hours and blood pressure to below 140/90 mm Hg, and maintained glycemic control, in 85% up to 15 years after onset. Creatinine clearance fell from 109.8 ± 22.3 mL/min per 1.73 m(2) predonation to 78 ± 17 at 1 year, 84 ± 19 at 5 years, and 70 ± 20 at 25 years. Comparison of 90 nondonor sibling and donor pairs showed significantly higher fasting glucose and hypertension in nondonors. Long-term follow-up of donors demonstrated end-stage renal disease in 0.6% at 25 years. Regular follow-up identified new onset of disease and allowed interventions that may have prevented adverse outcomes.
Aye, Myint Myint; Kilpatrick, Eric S; Aburima, Ahmed; Wraith, Katie S; Magwenzi, Simbarashe; Spurgeon, B; Rigby, Alan S; Sandeman, Derek; Naseem, Khalid M; Atkin, Stephen L
2014-02-28
Atherothrombosis is associated with platelet hyperactivity. Hypertriglyceridemia and insulin resistance (IR) are features of polycystic ovary syndrome (PCOS). The effect of induced hypertriglyceridemia on IR and platelet function was examined in young women with PCOS. Following overnight fasting, 13 PCOS and 12 healthy women were infused with saline or 20% intralipid for 5 hours on separate days. Insulin sensitivity was measured using a hyperinsulinemic euglycaemic clamp in the final 2 hours of each infusion. Platelet responses to adenosine diphosphate (ADP) and prostacyclin (PGI2) were measured by flow cytometric analysis of platelet fibrinogen binding and P-selectin expression using whole blood taken during each infusion (at 2 hours) and at the end of each clamp. Lipid infusion increased triglycerides and reduced insulin sensitivity (median [interquartile range]) in both controls (5.25 [3.3, 6.48] versus 2.60 [0.88, 3.88] mg kg(-1) min(-1), P<0.001) and PCOS (3.15 [2.94, 3.85] versus 1.06 [0.72, 1.43] mg kg(-1) min(-1), P<0.001). Platelet activation by ADP was enhanced, and the ability of PGI2 to suppress platelet activation diminished, during lipid infusion in both groups when compared to saline. Importantly, insulin infusion decreased lipid-induced platelet hyperactivity by decreasing the response to 1 μmol/L ADP (78.7% [67.9, 82.3] versus 62.8% [51.8, 73.3], P=0.02) and increasing sensitivity to 0.01 μmol/L PGI2 (67.6% [39.5, 83.8] versus 40.9% [23.8, 60.9], P=0.01) in controls, but not in PCOS. Acute hypertriglyceridemia induced IR and increased platelet activation in both groups; in PCOS subjects, unlike controls, the platelet activation was not reversed by insulin. This suggests that platelet hyperactivity induced by acute hypertriglyceridemia and IR could contribute to atherothrombotic risk. www.isrctn.org. Unique Identifier: ISRCTN42448814.
Bodapati, Rohan K; Kizer, Jorge R; Kop, Willem J; Kamel, Hooman; Stein, Phyllis K
2017-07-21
Heart rate variability (HRV) characterizes cardiac autonomic functioning. The association of HRV with stroke is uncertain. We examined whether 24-hour HRV added predictive value to the Cardiovascular Health Study clinical stroke risk score (CHS-SCORE), previously developed at the baseline examination. N=884 stroke-free CHS participants (age 75.3±4.6 years), with 24-hour Holters adequate for HRV analysis at the 1994-1995 examination, had 68 strokes over ≤8-year follow-up (median 7.3 [interquartile range 7.1-7.6] years). The value of adding HRV to the CHS-SCORE was assessed with stepwise Cox regression analysis. The CHS-SCORE predicted incident stroke (HR=1.06 per unit increment, P=0.005). Two HRV parameters, decreased coefficient of variance of NN intervals (CV%, P=0.031) and decreased power law slope (SLOPE, P=0.033) also entered the model, but these did not significantly improve the c-statistic (P=0.47). In a secondary analysis, dichotomization of CV% (LOWCV% ≤12.8%) was found to maximally stratify higher-risk participants after adjustment for CHS-SCORE. Similarly, dichotomizing SLOPE (LOWSLOPE <-1.4) maximally stratified higher-risk participants. When these HRV categories were combined (eg, HIGHCV% with HIGHSLOPE), the c-statistic for the model with the CHS-SCORE and combined HRV categories was 0.68, significantly higher than 0.61 for the CHS-SCORE alone (P=0.02). In this sample of older adults, 2 HRV parameters, CV% and power law slope, emerged as significantly associated with incident stroke when added to a validated clinical risk score. After each parameter was dichotomized based on its optimal cut point in this sample, their composite significantly improved prediction of incident stroke during ≤8-year follow-up. These findings will require validation in separate, larger cohorts. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
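A rough sketch of the modeling pattern above, assuming simulated data: fit a Cox model with the clinical score plus the dichotomized HRV measures (cut points taken from the abstract) and read off the model concordance as the c-statistic. This uses the lifelines package and is illustrative only.

```python
# Sketch only: simulated cohort; cut points for LOWCV%/LOWSLOPE from abstract.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(9)
n = 884
df = pd.DataFrame({"chs_score": rng.normal(10, 3, n),
                   "cv_pct": rng.normal(15, 4, n),      # CV% of NN intervals
                   "slope": rng.normal(-1.3, 0.2, n)})  # power-law slope
df["low_cv"] = (df["cv_pct"] <= 12.8).astype(int)
df["low_slope"] = (df["slope"] < -1.4).astype(int)

risk = 0.05 * df["chs_score"] + 0.4 * df["low_cv"] + 0.4 * df["low_slope"]
df["time"] = rng.exponential(20.0, n) / np.exp(risk)  # latent stroke time, yr
df["stroke"] = (df["time"] <= 8.0).astype(int)
df["time"] = df["time"].clip(upper=8.0)               # censor at 8 years

cph = CoxPHFitter().fit(
    df[["chs_score", "low_cv", "low_slope", "time", "stroke"]],
    duration_col="time", event_col="stroke")
print("c-statistic:", round(cph.concordance_index_, 2))
```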
Low Levels of Physical Activity During Critical Illness and Weaning: The Evidence-Reality Gap.
Connolly, Bronwen A; Mortimore, Jessica L; Douiri, Abdel; Rose, Joleen W; Hart, Nicholas; Berney, Susan C
2017-01-01
Physical rehabilitation can benefit critically ill patients during intensive care unit (ICU) admission, but routine clinical practice remains inconsistent, and activity levels have not been examined in patients receiving prolonged mechanical ventilation after transfer to a specialist ventilator weaning unit (VWU). Behavioral mapping is a sampling approach that allows detailed reporting of physical activity profiles. The objective of this study was to characterize the physical activity profile of critically ill patients in a UK ICU and VWU. Single-center, prospective observational study in a university teaching hospital. Patient observations, conducted Monday through Sunday from 08:30 to 20:00 for 1 minute every 10 minutes, included data points of patient location, people in attendance, and highest level of activity. Descriptive statistics were used to analyze and report data. Forty-two ICU and 11 VWU patients were recruited, with 2646 and 693 observations, respectively, recorded. In the ICU, patients spent a median (interquartile range) of 100% (96%-100%) of the day (10.5 [10.0-10.5] hours) located in bed, with minimal/no activity for 99% (96%-100%) of the day (10.4 [9.7-10.5] hours). Nursing staff were most frequently observed in attendance with patients irrespective of ventilation or sedation status, although patients still spent approximately two-thirds of the day alone. Bed-to-chair transfer was the highest activity level observed. In the VWU, patients spent 94% (73%-100%) of the day (9.9 [7.7-10.5] hours) in bed and 56% (43%-60%) of the time alone. Physical activity levels were higher and included ambulation. All physical activities occurred during physical rehabilitation sessions. These profiles of low physical activity in both ICU and VWU patients highlight the need for targeted strategies to improve activity levels beyond therapeutic rehabilitation, and support a culture shift toward providing patients with, and engaging them in, a multidisciplinary, multiprofessional environment that optimizes overall physical activity.
Lin, Weiwei; Huang, Wei; Hu, Min; Brunekreef, Bert; Zhang, Yuanhang; Liu, Xingang; Cheng, Hong; Gehring, Ulrike; Li, Chengcai; Tang, Xiaoyan
2011-01-01
Background: Epidemiologic evidence for a causative association between black carbon (BC) and health outcomes is limited. Objectives: We estimated associations and exposure–response relationships between acute respiratory inflammation in schoolchildren and concentrations of BC and particulate matter with an aerodynamic diameter of ≤ 2.5 μm (PM2.5) in ambient air before and during the air pollution intervention for the 2008 Beijing Olympics. Methods: We measured exhaled nitric oxide (eNO) as an acute respiratory inflammation biomarker and hourly mean air pollutant concentrations to estimate BC and PM2.5 exposure. We used 1,581 valid observations of 36 subjects over five visits in 2 years to estimate associations of eNO with BC and PM2.5 according to generalized estimating equations with polynomial distributed-lag models, controlling for body mass index, asthma, temperature, and relative humidity. We also assessed the relative importance of BC and PM2.5 with two-pollutant models. Results: Air pollution concentrations and eNO were clearly lower during the 2008 Olympics. BC and PM2.5 concentrations averaged over 0–24 hr were strongly associated with eNO, which increased by 16.6% [95% confidence interval (CI), 14.1–19.2%] and 18.7% (95% CI, 15.0–22.5%) per interquartile range (IQR) increase in BC (4.0 μg/m3) and PM2.5 (149 μg/m3), respectively. In the two-pollutant model, estimated effects of BC were robust, but associations between PM2.5 and eNO decreased with adjustment for BC. We found that eNO was associated with IQR increases in hourly BC concentrations up to 10 hr after exposure, consistent with effects primarily in the first hours after exposure. Conclusions: Recent exposure to BC was associated with acute respiratory inflammation in schoolchildren in Beijing. Lower air pollution levels during the 2008 Olympics also were associated with reduced eNO. PMID:21642045
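As a hedged sketch of the repeated-measures exposure model, the code below fits a GEE with an exchangeable working correlation for log eNO on a same-day BC term plus meteorology; the published model used polynomial distributed lags and additional covariates, and all names and data here are simulated.

```python
# Sketch only: simulated repeated measures; one same-day BC term.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(13)
rows = []
for child in range(36):
    for visit in range(40):
        bc = rng.gamma(2.0, 2.0)            # BC, μg/m3, 0-24 h mean
        log_eno = 1.5 + 0.04 * bc + rng.normal(0, 0.3)
        rows.append({"child": child, "bc": bc,
                     "temp": rng.normal(20, 5), "rh": rng.uniform(30, 80),
                     "log_eno": log_eno})
df = pd.DataFrame(rows)

fit = smf.gee("log_eno ~ bc + temp + rh", groups="child", data=df,
              cov_struct=sm.cov_struct.Exchangeable()).fit()
iqr = 4.0  # μg/m3, the BC IQR reported in the abstract
pct = 100 * (np.exp(fit.params["bc"] * iqr) - 1)
print(f"eNO increase per IQR of BC: {pct:.1f}%")
```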
Mostad, Ingrid L; Bjerve, Kristian S; Basu, Samar; Sutton, Pauline; Frayn, Keith N; Grill, Valdemar
2009-12-01
Fatty acids (FA) can impair glucose metabolism to a varying degree depending on the duration of exposure and the type of FA. Here we tested for acute effects of marine n-3 FA on insulin sensitivity, insulin secretion, energy metabolism, and oxidative stress. This was a randomized, double-blind, crossover study in 11 subjects with type 2 diabetes mellitus. A 4-hour lipid infusion (Intralipid [Fresenius Kabi, Halden, Norway], total of 384 mL) was compared with a similar lipid infusion partly replaced by Omegaven (Fresenius Kabi) that contributed a median of 0.1 g fish oil per kilogram body weight, amounting to 0.04 g/kg of marine n-3 FA. Insulin sensitivity was assessed by isoglycemic hyperinsulinemic clamps; insulin secretion (measured after the clamps), by C-peptide glucagon tests; and energy metabolism, by indirect calorimetry. Infusion of Omegaven increased the proportion of n-3 FA in plasma nonesterified fatty acids (NEFA) compared with Intralipid alone (20:5n-3: median, 1.5% [interquartile range, 0.6%] vs -0.2% [0.2%], P = .001; 22:6n-3: 0.8% [0.4%] vs -0.7% [0.2%], P = .001). However, glucose utilization was not affected; neither was insulin secretion or total energy production (P = .966, .210, and .423, respectively, for the differences between the lipid clamps). Omegaven tended to lower fat oxidation (P = .062) compared with Intralipid alone, correlating with the rise in individual n-3 NEFA (r = 0.627, P = .039). Omegaven did not alter the effects of clamping on phospholipid FA composition or on leptin, adiponectin, or F(2)-isoprostane concentrations. Enrichment of NEFA with n-3 FA during a 4-hour infusion of Intralipid failed to affect insulin sensitivity, insulin secretion, or markers of oxidative stress in subjects with type 2 diabetes mellitus.
Herrero-Cortina, B; Vilaró, J; Martí, D; Torres, A; San Miguel-Pagola, M; Alcaraz, V; Polverino, E
2016-12-01
To compare the efficacy of three slow expiratory airway clearance techniques (ACTs). Randomised crossover trial. Tertiary hospital. Thirty-one outpatients with bronchiectasis and chronic sputum expectoration. Autogenic drainage (AD), slow expiration with glottis opened in lateral posture (ELTGOL), and temporary positive expiratory pressure (TPEP). Sputum expectoration during each session (primary endpoint) and in the 24-hour period after each session. Leicester Cough Questionnaire (LCQ) score and spirometry results were recorded at the beginning and after each week of treatment. Data were summarised as median difference [95% confidence interval (CI)]. Median (interquartile range) daily expectoration at baseline was 21.1 (15.3 to 35.6) g. During physiotherapy sessions, AD and ELTGOL expectorated more sputum than TPEP [AD vs TPEP 3.1 g (95% CI 1.5 to 4.8); ELTGOL vs TPEP 3.6 g (95% CI 2.8 to 7.1)], while overall expectoration in the 24-hour period after each session was similar for all techniques (P=0.8). Sputum clearance at 24 hours post-intervention was lower than at baseline for all techniques [AD vs baseline -10.0 g (95% CI -15.0 to -6.8); ELTGOL vs baseline -9.2 g (95% CI -14.2 to -7.9); TPEP vs baseline -6.0 g (95% CI -12.0 to -6.1)]. The LCQ score increased with all techniques (AD 0.5, 95% CI 0.1 to 0.5; ELTGOL 0.9, 95% CI 0.5 to 2.1; TPEP 0.4, 95% CI 0.1 to 1.2), and was similar for all ACTs (P=0.6). No changes in lung function were observed. Slow expiratory ACTs enhance mucus clearance during treatment sessions and reduce expectoration for the rest of the day in patients with bronchiectasis. NCT01854788. Copyright © 2015 Chartered Society of Physiotherapy. Published by Elsevier Ltd. All rights reserved.
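A "median difference [95% CI]" summary for paired crossover data, as used above, can be produced by bootstrapping the median of within-patient differences. A minimal sketch with simulated sputum weights (the resampling scheme is an assumption, not the paper's stated method):

```python
# Sketch only: simulated paired sputum weights from a crossover comparison.
import numpy as np

rng = np.random.default_rng(17)
n = 31
ad = rng.gamma(4.0, 3.0, n)          # grams expectorated, AD sessions
tpep = ad - rng.normal(3.1, 2.0, n)  # paired TPEP sessions, ~3 g less

diffs = ad - tpep
boot = np.array([np.median(rng.choice(diffs, n, replace=True))
                 for _ in range(5000)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"median difference {np.median(diffs):.1f} g "
      f"(95% CI {lo:.1f} to {hi:.1f})")
```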
Review of a fluid resuscitation protocol: "fluid creep" is not due to nursing error.
Faraklas, Iris; Cochran, Amalia; Saffle, Jeffrey
2012-01-01
Recent reviews of burn resuscitation have included the suggestion that "fluid creep" may be influenced by practitioner error. Our center uses a nursing-driven resuscitation protocol that permits titration of fluid based on hourly urine output, including the addition of colloid when patients fail to respond appropriately. The purpose of this study was to examine protocol compliance. We reviewed 140 patients (26 children) with burns of ≥20% TBSA who received protocol-directed resuscitation from 2005 to 2010. We compared each patient's actual hourly fluid infusion with that predicted by the protocol. Sixty-seven patients (48%) completed resuscitation using crystalloid alone, whereas 73 patients required colloid supplementation. Groups did not differ in age, gender, weight, or time from injury to admission. Patients requiring colloid had larger median total burns (33.0 vs 23.5% TBSA) and full-thickness burns (15.5 vs 4.5% TBSA) and more inhalation injuries (60.3 vs 28.4%; P < .001) than those resuscitated with crystalloid alone. Because we included basic maintenance fluids in the regimen, patients had median predicted requirements of 5.4 ml/kg/%TBSA. Crystalloid-only patients required fluid volumes close to Parkland predictions (4.7 ml/kg/%TBSA), whereas patients who received colloid required more fluid than the predicted volume (7.5 ml/kg/%TBSA). However, the hourly difference between predicted and received fluids was a median of only 1.0% (interquartile range: -6.1 to 11.1%) and did not differ between groups. Pediatric patients had greater calculated differences than adults. Crystalloid patients exhibited higher urine outputs than colloid patients until colloid was started, suggesting that early over-resuscitation did not contribute to fluid creep. Adherence to our protocol for burn shock resuscitation was excellent overall. Fluid creep exhibited by more seriously injured patients was not due to nurses' failure to follow the protocol. This review has illuminated some opportunities for practice improvement, possibly using a computerized decision support system.
Asferg, Camilla L; Andersen, Ulrik B; Linneberg, Allan; Goetze, Jens P; Jeppesen, Jørgen L
2018-05-07
Obese persons have lower circulating natriuretic peptide (NP) concentrations. It has been proposed that this natriuretic handicap plays a role in obesity-related hypertension. In contrast, hypertensive patients with left atrial enlargement have higher circulating NP concentrations. On this background, we investigated whether obese hypertensive men could have lower circulating NP concentrations despite evidence of pressure-induced greater left atrial size. We examined 98 obese men (body mass index [BMI] ≥ 30.0 kg/m2) and 27 lean normotensive men (BMI 20.0-24.9 kg/m2). All men were healthy, medication free, with normal left ventricular ejection fraction. We measured blood pressure using 24-hour ambulatory blood pressure (ABP) recordings. Hypertension was defined as 24-hour ABP ≥ 130/80 mm Hg, and normotension was defined as 24-hour ABP < 130/80 mm Hg. We determined left atrial size using echocardiography, and we measured fasting serum concentrations of midregional proatrial NP (MR-proANP). Of the 98 obese men, 62 had hypertension and 36 were normotensive. The obese hypertensive men had greater left atrial size (mean ± SD: 28.7 ± 6.0 ml/m2) compared with the lean normotensive men (23.5 ± 4.5 ml/m2) and the obese normotensive men (22.7 ± 5.1 ml/m2), P < 0.01. Nevertheless, despite evidence of pressure-induced greater left atrial size, the obese hypertensive men had lower serum MR-proANP concentrations (median [interquartile range]: 48.5 [37.0-64.7] pmol/l) compared with the lean normotensive men (69.3 [54.3-82.9] pmol/l), P < 0.01, whereas the obese normotensive men had serum MR-proANP concentrations in between the 2 other groups (54.1 [43.6-62.9] pmol/l). Despite greater left atrial size, obese hypertensive men have lower circulating MR-proANP concentrations compared with lean normotensive men.
Patarroyo, Maria; Wehbe, Edgard; Hanna, Mazen; Taylor, David O; Starling, Randall C; Demirjian, Sevag; Tang, W H Wilson
2012-11-06
The purpose of this study was to examine the clinical outcomes of using slow continuous ultrafiltration (SCUF) in patients with acute decompensated heart failure (HF) refractory to intensive medical therapy. Several studies have demonstrated the clinical usefulness of early SCUF in patients with acute decompensated HF to improve fluid overload and hemodynamics. We reviewed clinical data from 63 consecutive adult patients with acute decompensated HF admitted to the Heart Failure Intensive Care Unit from 2004 through 2009 who required SCUF because of congestion refractory to hemodynamically guided intensive medical therapy. The mean creatinine level was 1.9 ± 0.8 mg/dl on admission and 2.2 ± 0.9 mg/dl at SCUF initiation. After 48 hours of SCUF, there were significant improvements in hemodynamic variables (mean pulmonary arterial pressure: 40 ± 12 mm Hg vs. 33 ± 8 mm Hg, p = 0.002, central venous pressure: 20 ± 6 mm Hg vs. 16 ± 8 mm Hg, p = 0.007, mean pulmonary wedge pressure: 27 ± 8 mm Hg vs. 20 ± 7 mm Hg, p = 0.02, Fick cardiac index: 2.2 l/min/m(2) [interquartile range: 1.87 to 2.77 l/min/m(2)] vs. 2.6 l/min/m(2) [interquartile range: 2.2 to 2.9 l/min/m(2)], p = 0.0008), and weight loss (102 ± 25 kg vs. 99 ± 23 kg, p < 0.0001). However, there were no significant improvements in serum creatinine levels (2.2 ± 0.9 mg/dl vs. 2.4 ± 1 mg/dl, p = 0.12) and blood urea nitrogen (60 ± 30 mg/dl vs. 60 ± 28 mg/dl, p = 0.97). Fifty-nine percent required conversion to continuous hemodialysis during their hospital course, and 14% were dependent on dialysis at hospital discharge. Thirty percent died during hospitalization, and 6 patients were discharged to hospice care. In our single-center experience, SCUF after admission for acute decompensated HF refractory to standard medical therapy was associated with high incidence of subsequent transition to renal replacement therapy and high in-hospital mortality, despite significant improvement in hemodynamics. Copyright © 2012 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
Richards, Jeremy B; McCallister, Jennifer W; Lenz, Peter H
2016-04-01
Many pulmonary and critical care medicine (PCCM) fellows are interested in improving their teaching skills as well as learning about careers as clinician educators. Educational opportunities in PCCM fellowship programs designed to address these interests have not been well characterized in U.S. training programs. We aimed to characterize educational content and structure for training fellows to teach in PCCM fellowship programs. We evaluated three major domains: (1) existing educational opportunities, (2) PCCM program directors' attitudes toward the importance of teaching fellows how to teach, and (3) potential components of an optimal teaching skills curriculum for PCCM fellows. We surveyed program and associate program directors who were members of the Association of Pulmonary and Critical Care Medicine Program Directors in 2014. Survey domains included existing teaching skills content and structure, presence of a formal medical education curriculum or clinician educator track, perceived barriers to teaching fellows teaching skills, and open-ended qualitative inquiries about the ideal curricula. Data were analyzed both quantitatively and qualitatively. Of 158 invited Association of Pulmonary and Critical Care Medicine Program Directors members, 85 program directors and associate directors responded (53.8% response rate). Annual curricular time dedicated to teaching skills varied widely (median, 3 h; mean, 5.4 h; interquartile range, 2.0-6.3 h), with 17 respondents (20%) allotting no time to teaching fellows to teach and 14 respondents (17%) dedicating more than 10 hours. Survey participants stated that the optimal duration for training fellows in teaching skills was significantly less than what they reported was actually occurring (median optimal duration, 1.5 h/yr; mean, 2.1 h/yr; interquartile range, 1.5-3.5 h/yr; P < 0.001). Only 28 (33.7%) had a formal curriculum for teaching medical education skills. Qualitative analyses identified several barriers to implementing formal teaching skills curricula, including "time," "financial resources," "competing priorities," and "lack of expert faculty." While prior work has demonstrated that fellows are interested in obtaining medical education skills, PCCM program directors and associate directors noted significant challenges to implementing formal educational opportunities to teach fellows these skills. Effective strategies are needed to design, implement, sustain, and assess teaching skills curricula for PCCM fellowships.
Pilot proof of concept clinical trials of Stochastic Targeted (STAR) glycemic control.
Evans, Alicia; Shaw, Geoffrey M; Le Compte, Aaron; Tan, Chia-Siong; Ward, Logan; Steel, James; Pretty, Christopher G; Pfeifer, Leesa; Penning, Sophie; Suhaimi, Fatanah; Signal, Matthew; Desaive, Thomas; Chase, J Geoffrey
2011-09-19
Tight glycemic control (TGC) has shown benefits but has been difficult to achieve consistently. STAR (Stochastic TARgeted) is a flexible, model-based TGC approach directly accounting for intra- and inter-patient variability with a stochastically derived maximum 5% risk of blood glucose (BG) < 4.0 mmol/L. This research assesses the safety, efficacy, and clinical burden of a STAR TGC controller modulating both insulin and nutrition inputs in pilot trials of seven patients covering 660 hours. Insulin and nutrition interventions are given 1-3 hourly, as chosen by the nurse, to allow them to manage workload. Interventions are calculated by using clinically validated computer models of human metabolism and its variability in critical illness to maximize the overlap of the model-predicted (5th-95th percentile) range of BG outcomes with the 4.0-6.5 mmol/L band while ensuring a maximum 5% risk of BG < 4.0 mmol/L. Carbohydrate intake (all sources) was selected to maximize intake up to 100% of the SCCM/ACCP goal (25 kcal/kg/day). Maximum insulin doses and dose changes were limited for safety. Measurements were made with glucometers. Results are compared to those for the SPRINT study, which reduced mortality 25-40% for length of stay ≥3 days. Written informed consent was obtained for all patients, and approval was granted by the NZ Upper South A Regional Ethics Committee. A total of 402 measurements were taken over 660 hours (~14/day), because nurses showed a preference for 2-hourly measurements. Median [interquartile range (IQR)] cohort BG was 5.9 mmol/L [5.2-6.8]. Overall, 63.2%, 75.9%, and 89.8% of measurements were in the 4.0-6.5, 4.0-7.0, and 4.0-8.0 mmol/L bands. There were no hypoglycemic events (BG < 2.2 mmol/L), and the minimum BG was 3.5 mmol/L, with 4.5% < 4.4 mmol/L. Per patient, the median [IQR] duration of TGC was 92 h [29-113] using 53 [19-62] measurements (median, ~13/day). Median [IQR] results: BG, 5.9 mmol/L [5.8-6.3]; carbohydrate nutrition, 6.8 g/h [5.5-8.7] (~70% of goal feed, median); insulin, 2.5 U/h [0.1-5.1]. All patients achieved BG < 6.1 mmol/L. These results match or exceed SPRINT, and clinical workload is reduced by more than 20%. STAR TGC modulating insulin and nutrition inputs provided very tight control with minimal variability by managing intra- and inter-patient variability. Performance and safety exceed those of SPRINT, which reduced mortality and cost in the Christchurch ICU. The use of glucometers did not appear to impact the quality of TGC. Finally, clinical workload was self-managed and reduced 20% compared with SPRINT.
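The dose-selection rule described above can be sketched conceptually: among candidate insulin rates, discard any whose model-predicted 5th-percentile BG falls below 4.0 mmol/L, then pick the rate whose 5th-95th percentile band best overlaps the 4.0-6.5 mmol/L target. In the toy code below the validated metabolic model is replaced by a stand-in random sampler, so only the selection logic reflects the abstract.

```python
# Sketch only: a stand-in sampler replaces the validated metabolic model.
import numpy as np

rng = np.random.default_rng(21)

def predicted_bg(insulin_rate, n=500):
    """Stand-in for the model-predicted BG distribution (mmol/L) at a dose."""
    return rng.normal(8.0 - 0.9 * insulin_rate, 0.8, n)

def band_overlap(samples, lo=4.0, hi=6.5):
    """Fraction of the 5th-95th percentile band inside the target band."""
    p5, p95 = np.percentile(samples, [5, 95])
    return max(0.0, min(hi, p95) - max(lo, p5)) / (p95 - p5)

best = None
for rate in np.arange(0.0, 6.5, 0.5):      # candidate insulin rates, U/h
    bg = predicted_bg(rate)
    if np.percentile(bg, 5) < 4.0:         # >5% hypoglycemia risk: reject
        continue
    score = band_overlap(bg)
    if best is None or score > best[1]:
        best = (rate, score)
print(f"chosen insulin rate: {best[0]} U/h (band overlap {best[1]:.2f})")
```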
Higher sweat chloride levels in patients with asthma: a case-control study.
Awasthi, Shally; Dixit, Pratibha; Maurya, Nutan
2015-02-01
To screen asthmatic patients with the sweat chloride test to identify the proportion with cystic fibrosis (CF) (sweat chloride level >60 mmol/L), and to compare sweat chloride levels between cases of bronchial asthma and age- and sex-matched healthy children aged 5 mo to 15 y. The present case-control study was conducted in a tertiary care hospital in India. Cases of bronchial asthma, diagnosed by the GINA 2008 guideline, and age-matched healthy controls were included. The case-to-control ratio was 2:1. The sweat chloride test was done by the pilocarpine iontophoresis method. From April 2010 through May 2012, 216 asthmatics and 112 controls were recruited. Among asthmatics, there was no case of cystic fibrosis. Mean sweat chloride levels were 22.39 ± 8.45 mmol/L in asthmatics (interquartile range, 15-28 mmol/L) and 19.55 ± 7.04 mmol/L in controls (interquartile range, 15-23.5 mmol/L) (P = 0.048). No cystic fibrosis case was identified among asthmatics. Mean sweat chloride levels were higher in asthmatics as compared to controls.
Changing Epidemiology of Human Brucellosis, China, 1955–2014
Lai, Shengjie; Zhou, Hang; Xiong, Weiyi; Gilbert, Marius; Huang, Zhuojie; Yu, Jianxing; Yin, Wenwu; Wang, Liping; Chen, Qiulan; Li, Yu; Mu, Di; Zeng, Lingjia; Ren, Xiang; Geng, Mengjie; Zhang, Zike; Cui, Buyun; Li, Tiefeng; Wang, Dali; Li, Zhongjie; Wardrop, Nicola A.; Tatem, Andrew J.
2017-01-01
Brucellosis, a zoonotic disease, was made statutorily notifiable in China in 1955. We analyzed the incidence and spatial–temporal distribution of human brucellosis during 1955–2014 in China using notifiable surveillance data: aggregated data for 1955–2003 and individual case data for 2004–2014. A total of 513,034 brucellosis cases were recorded, of which 99.3% were reported in northern China during 1955–2014, and 69.1% (258,462/374,141) occurred during February–July in 1990–2014. Incidence remained high during 1955–1978 (interquartile range 0.42–1.0 cases/100,000 residents), then decreased dramatically in 1979–1994. However, brucellosis has reemerged since 1995 (interquartile range 0.11–0.23 in 1995–2003 and 1.48–2.89 in 2004–2014); the historical high occurred in 2014, and the affected area expanded from northern pastureland provinces to the adjacent grassland and agricultural areas, then to southern coastal and southwestern areas. Control strategies in China should be adjusted to account for these changes by adopting a One Health approach. PMID:28098531
Androgenic correlates of genetic variation in the gene encoding 5alpha-reductase type 1.
Ellis, Justine A; Panagiotopoulos, Sianna; Akdeniz, Aysel; Jerums, George; Harrap, Stephen B
2005-01-01
Androgens determine male secondary sexual characteristics and influence a variety of metabolic pathways. Circulating levels of androgens are highly heritable; however, the genes involved are largely unknown. The 5alpha-reductase enzymes types 1 and 2 responsible for converting testosterone to the more potent androgen dihydrotestosterone are encoded by the SRD5A1 and SRD5A2 genes, respectively. We performed indirect genetic association studies of SRD5A1 and SRD5A2 and the dihydrotestosterone/testosterone ratio that reflects the activity of 5alpha-reductase in 57 males with type 2 diabetes. We found evidence of significant association between a single nucleotide polymorphism in SRD5A1 and the dihydrotestosterone/testosterone ratio (median 0.10, interquartile range 0.08 vs. median 0.06, interquartile range 0.04, P = 0.009). The polymorphism was not associated with any diabetic phenotypes. These results suggest that functional genetic variants might exist in or around SRD5A1 that affect the activity of the 5alpha-reductase enzyme type 1 and influence androgen levels.
Göktay, Fatih; Altan, Zeynep Müzeyyen; Talas, Anıl; Akpınar, Esma; Özdemir, Ekin Özge; Aytekin, Sema
2016-01-01
Patient anxiety about nail surgery relates mainly to pain associated with needle puncture, anesthetic flow during the procedure, and postoperative care, as well as possible past traumatic experience. The aims of this study were to compare anxiety levels among patients undergoing nail surgery and skin punch biopsy and to assess the effects of demographic characteristics on anxiety. Forty-eight consecutive patients who were referred to a dermatological surgery unit for nail surgery intervention (group 1) and 50 age- and sex-matched patients referred to the same unit for skin punch biopsy (group 2) were enrolled in the study. Patients' anxiety levels were measured using Spielberger's State-Trait Anxiety Inventory. There was no significant difference in median anxiety level between group 1 (42.00; interquartile range, 6.50) and group 2 (41.00; interquartile range, 8.25) (P = .517). The demographic factors of patient sex, educational status, and prior surgery showed no significant effects on anxiety levels. Nail surgery does not seem to cause significantly greater anxiety than skin punch biopsy. © The Author(s) 2015.
Rathi, Vinay K; Wang, Bo; Ross, Joseph S; Downing, Nicholas S; Kesselheim, Aaron S; Gray, Stacey T
2017-02-01
The US Food and Drug Administration (FDA) approves high-risk medical devices based on premarket pivotal clinical studies demonstrating reasonable assurance of safety and effectiveness and may require postapproval studies (PAS) to further inform benefit-risk assessment. We conducted a cross-sectional analysis using publicly available FDA documents to characterize industry-sponsored pivotal studies and PAS of high-risk devices used in the treatment of otolaryngologic diseases. Between 2000 and 2014, the FDA approved 23 high-risk otolaryngologic devices based on 28 pivotal studies. Median enrollment was 118 patients (interquartile range, 67-181), and median duration of longest primary effectiveness end point follow-up was 26 weeks (interquartile range, 16-96). Fewer than half were randomized (n = 13, 46%), blinded (n = 12, 43%), or controlled (n = 10, 36%). The FDA required 23 PASs for 16 devices (70%): almost two-thirds (n = 15, 65%) monitored long-term performance, and roughly one-third (n = 8, 35%) focused on subgroups. Otolaryngologists should be aware of limitations in the strength of premarket evidence when considering the use of newly approved devices.
The Development of a Machine Learning Inpatient Acute Kidney Injury Prediction Model.
Koyner, Jay L; Carey, Kyle A; Edelson, Dana P; Churpek, Matthew M
2018-07-01
To develop an acute kidney injury risk prediction model using electronic health record data for longitudinal use in hospitalized patients. Observational cohort study. Tertiary, urban, academic medical center from November 2008 to January 2016. All adult inpatients without pre-existing renal failure at admission, defined as first serum creatinine greater than or equal to 3.0 mg/dL, International Classification of Diseases, 9th Edition, code for chronic kidney disease stage 4 or higher or having received renal replacement therapy within 48 hours of first serum creatinine measurement. None. Demographics, vital signs, diagnostics, and interventions were used in a Gradient Boosting Machine algorithm to predict serum creatinine-based Kidney Disease Improving Global Outcomes stage 2 acute kidney injury, with 60% of the data used for derivation and 40% for validation. Area under the receiver operator characteristic curve (AUC) was calculated in the validation cohort, and subgroup analyses were conducted across admission serum creatinine, acute kidney injury severity, and hospital location. Among the 121,158 included patients, 17,482 (14.4%) developed any Kidney Disease Improving Global Outcomes acute kidney injury, with 4,251 (3.5%) developing stage 2. The AUC (95% CI) was 0.90 (0.90-0.90) for predicting stage 2 acute kidney injury within 24 hours and 0.87 (0.87-0.87) within 48 hours. The AUC was 0.96 (0.96-0.96) for receipt of renal replacement therapy (n = 821) in the next 48 hours. Accuracy was similar across hospital settings (ICU, wards, and emergency department) and admitting serum creatinine groupings. At a probability threshold of greater than or equal to 0.022, the algorithm had a sensitivity of 84% and a specificity of 85% for stage 2 acute kidney injury and predicted the development of stage 2 a median of 41 hours (interquartile range, 12-141 hr) prior to the development of stage 2 acute kidney injury. Readily available electronic health record data can be used to predict impending acute kidney injury prior to changes in serum creatinine with excellent accuracy across different patient locations and admission serum creatinine. Real-time use of this model would allow early interventions for those at high risk of acute kidney injury.
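The modeling setup above can be approximated in a few lines: train a gradient boosting classifier on a 60% derivation split and compute the AUC on the 40% validation split. The features and outcome below are simulated stand-ins for the study's EHR variables.

```python
# Sketch only: simulated EHR-like features; 60/40 derivation/validation split.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(8)
n = 20000
X = np.column_stack([rng.normal(1.0, 0.3, n),   # baseline creatinine
                     rng.normal(75, 15, n),     # heart rate
                     rng.normal(110, 20, n),    # systolic BP
                     rng.integers(18, 90, n)])  # age
logit = -4 + 2.5 * (X[:, 0] - 1.0) + 0.01 * (X[:, 1] - 75)
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))   # stage 2 AKI indicator

X_dev, X_val, y_dev, y_val = train_test_split(X, y, train_size=0.6,
                                              random_state=0)
gbm = GradientBoostingClassifier(random_state=0).fit(X_dev, y_dev)
auc = roc_auc_score(y_val, gbm.predict_proba(X_val)[:, 1])
print("validation AUC:", round(auc, 3))
```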
Finn Davis, Katherine; Napolitano, Natalie; Li, Simon; Buffman, Hayley; Rehder, Kyle; Pinto, Matthew; Nett, Sholeen; Jarvis, J Dean; Kamat, Pradip; Sanders, Ronald C; Turner, David A; Sullivan, Janice E; Bysani, Kris; Lee, Anthony; Parker, Margaret; Adu-Darko, Michelle; Giuliano, John; Biagas, Katherine; Nadkarni, Vinay; Nishisaki, Akira
2017-10-01
To describe promoters and barriers to implementation of an airway safety quality improvement bundle from the perspective of interdisciplinary frontline clinicians and ICU quality improvement leaders. Mixed methods. Thirteen PICUs of the National Emergency Airway Registry for Children network. Remote or on-site focus groups with interdisciplinary ICU staff. Two semistructured interviews with ICU quality improvement leaders, with quantitative and qualitative data-based feedback. Bundle implementation success (compliance) was defined as greater than or equal to 80% use for tracheal intubations for 3 consecutive months. ICUs were classified as early or late adopters. Focus group discussions concentrated on safety concerns and on promoters and barriers to bundle implementation. Initial semistructured quality improvement leader interviews assessed implementation tactics and provided recommendations. Follow-up interviews assessed the degree of acceptance and changes made after the initial interview. Transcripts were thematically analyzed and contrasted by early versus late adopters. Median duration to achieve success was 502 days (interquartile range, 182-781). Five sites were early adopters (median, 153 d; interquartile range, 146-267) and eight sites were late adopters (median, 783 d; interquartile range, 773-845). Focus groups identified common "promoter" themes (interdisciplinary approach, influential champions, and quality improvement bundle customization) and "barrier" themes (time constraints, competing paperwork and quality improvement activities, and poor engagement). Semistructured interviews with quality improvement leaders identified effective and ineffective tactics implemented by early and late adopters. Effective tactics included interdisciplinary quality improvement team involvement (early adopter: 5/5, 100% vs late adopter: 3/8, 38%; p = 0.08); ineffective tactics included physician-only rollouts, lack of interdisciplinary education, lack of data feedback to frontline clinicians, and misconception of the bundle as research instead of a quality improvement intervention. Implementation of an airway safety quality improvement bundle with high compliance takes a long time across diverse ICUs. Both early and late adopters identified similar promoter and barrier themes. Early adopter sites customized the quality improvement bundle and had an interdisciplinary quality improvement team approach.
Closed-loop insulin delivery during pregnancy complicated by type 1 diabetes.
Murphy, Helen R; Elleri, Daniela; Allen, Janet M; Harris, Julie; Simmons, David; Rayman, Gerry; Temple, Rosemary; Dunger, David B; Haidar, Ahmad; Nodale, Marianna; Wilinska, Malgorzata E; Hovorka, Roman
2011-02-01
This study evaluated closed-loop insulin delivery with a model predictive control (MPC) algorithm during early (12-16 weeks) and late gestation (28-32 weeks) in pregnant women with type 1 diabetes. Ten women with type 1 diabetes (age 31 years, diabetes duration 19 years, BMI 24.1 kg/m(2), booking A1C 6.9%) were studied over 24 h during early (14.8 weeks) and late pregnancy (28.0 weeks). A nurse adjusted the basal insulin infusion rate from continuous glucose measurements (CGM), fed into the MPC algorithm every 15 min. Mean glucose and time spent in target (63-140 mg/dL), hyperglycemic (>140 to ≥ 180 mg/dL), and hypoglycemic (<63 to ≤ 50 mg/dL) were calculated using plasma and sensor glucose measurements. Linear mixed-effects models were used to compare glucose control during early and late gestation. During closed-loop insulin delivery, median (interquartile range) plasma glucose levels were 117 (100.8-154.8) mg/dL in early and 126 (109.8-140.4) mg/dL in late gestation (P = 0.72). The overnight mean (interquartile range) plasma glucose time in target was 84% (50-100%) in early and 100% (94-100%) in late pregnancy (P = 0.09). Overnight mean (interquartile range) time spent hyperglycemic (>140 mg/dL) was 7% (0-40%) in early and 0% (0-6%) in late pregnancy (P = 0.25) and hypoglycemic (<63 mg/dL) was 0% (0-3%) and 0% (0-0%), respectively (P = 0.18). Postprandial glucose control, glucose variability, insulin infusion rates, and CGM sensor accuracy were no different in early or late pregnancy. MPC algorithm performance was maintained throughout pregnancy, suggesting that overnight closed-loop insulin delivery could be used safely during pregnancy. More work is needed to achieve optimal postprandial glucose control.
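The repeated-measures comparison above (early vs late gestation within the same women) maps naturally onto a linear mixed-effects model with a random intercept per woman. A minimal sketch with simulated glucose data; the study's actual model specification may have differed.

```python
# Sketch only: simulated glucose; random intercept per woman.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(31)
rows = []
for woman in range(10):
    offset = rng.normal(0, 8)                # per-woman random intercept
    for period in ("early", "late"):
        for _ in range(24):                  # hourly samples over 24 h
            glucose = (120 + (4 if period == "late" else 0)
                       + offset + rng.normal(0, 15))
            rows.append({"woman": woman, "period": period,
                         "glucose": glucose})
df = pd.DataFrame(rows)

fit = smf.mixedlm("glucose ~ period", data=df, groups="woman").fit()
print(fit.summary().tables[1])  # fixed effect: late vs early gestation
```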
Chen, J; Li, Y; Wang, Z; McCulloch, P; Hu, L; Chen, W; Liu, G; Li, J; Lang, J
2018-02-01
To evaluate the clinical outcomes of high-intensity focused ultrasound (HIFU) and surgery in treating uterine fibroids, and prepare for a definitive randomised trial. Prospective multicentre patient choice cohort study (IDEAL Exploratory study) of HIFU, myomectomy or hysterectomy for treating symptomatic uterine fibroids. 20 Chinese hospitals. 2411 Chinese women with symptomatic fibroids. Prospective non-randomised cohort study with learning curve analysis (IDEAL Stage 2b Prospective Exploration Study). Complications, hospital stay, return to normal activities, and quality of life (measured with UFS-QoL and SF-36 at baseline, 6 and 12 months), and need for further treatment. Quality-of-life outcomes were adjusted using regression modelling. HIFU treatment quality was evaluated using LC-CUSUM to identify operator learning curves. A health economic analysis of costs was performed. 1353 women received HIFU, 472 hysterectomy and 586 myomectomy. HIFU patients were significantly younger (P < 0.001), slimmer (P < 0.001), better educated (P < 0.001), and wealthier (P = 0.002) than surgery patients. Both UFS and QoL improved more rapidly after HIFU than after surgery (P = 0.002 and P = 0.001, respectively, at 6 months), but absolute differences were small. Major adverse events occurred in 3 (0.2%) of HIFU and in 133 (12.6%) of surgical cases (P < 0.001). Median hospital stay was 4 days (interquartile range, 0-5 days) after HIFU, 10 days (interquartile range, 8-12.5 days) after hysterectomy, and 8 days (interquartile range, 7-10 days) after myomectomy. HIFU caused substantially less morbidity than surgery, with similar longer-term QoL. Despite group baseline differences and lack of blinding, these findings support the need for a randomised controlled trial (RCT) of HIFU treatment for fibroids. The IDEAL Exploratory design facilitated RCT protocol development. HIFU had much better short-term outcomes than surgery for fibroids in this 2411-patient Chinese IDEAL-format study. © 2017 Royal College of Obstetricians and Gynaecologists.
Hubert, Gordian J; Meretoja, Atte; Audebert, Heinrich J; Tatlisumak, Turgut; Zeman, Florian; Boy, Sandra; Haberl, Roman L; Kaste, Markku; Müller-Barna, Peter
2016-12-01
Intravenous thrombolysis with tissue-type plasminogen activator (tPA) for acute ischemic stroke is more effective when delivered early. Timely delivery is challenging particularly in rural areas with long distances. We compared delays and treatment rates of a large, decentralized telemedicine-based system and a well-organized, large, centralized single-hospital system. We analyzed the centralized system of the Helsinki University Central Hospital (Helsinki and Province of Uusimaa, Finland, 1.56 million inhabitants, 9096 km²) and the decentralized TeleStroke Unit network in a predominantly rural area (Telemedical Project for Integrative Stroke Care [TEMPiS], South-East Bavaria, Germany, 1.94 million inhabitants, 14 992 km²). All consecutive tPA treatments were prospectively registered. We compared tPA rates per total ischemic stroke admissions in the Helsinki and TEMPiS catchment areas. For delay comparisons, we excluded patients with basilar artery occlusions, in-hospital strokes, and those being treated after 270 minutes. From January 1, 2011, to December 31, 2013, 912 patients received tPA in Helsinki University Central Hospital and 1779 in TEMPiS hospitals. Area-based tPA rates were equal (13.0% of 7017 ischemic strokes in the Helsinki University Central Hospital area versus 13.3% of 14 637 ischemic strokes in the TEMPiS area; P=0.078). Median prehospital delays were longer (88; interquartile range, 60-135 versus 65; 48-101 minutes; P<0.001) but in-hospital delays were shorter (18; interquartile range, 13-30 versus 39; 26-56 minutes; P<0.001) in Helsinki University Central Hospital compared with TEMPiS, with no difference in overall delays (117; interquartile range, 81-168 versus 115; 87-155 minutes; P=0.45). A decentralized telestroke thrombolysis service can achieve similar treatment rates and time delays for a rural population as a centralized system can achieve for an urban population. © 2016 American Heart Association, Inc.
Geographic Variance of Cost Associated With Hysterectomy.
Sheyn, David; Mahajan, Sangeeta; Billow, Megan; Fleary, Alexandra; Hayashi, Emi; El-Nashar, Sherif A
2017-05-01
To estimate whether the cost of hysterectomy varies by geographic region. This was a cross-sectional, population-based study using the 2013 Healthcare Cost and Utilization Project National Inpatient Sample of women older than 18 years undergoing inpatient hysterectomy for benign conditions. Hospital charges obtained from the National Inpatient Sample database were converted to actual costs using cost-to-charge ratios provided by the Healthcare Cost and Utilization Project. Multivariate regression was used to assess the effects that demographic factors, concomitant procedures, diagnoses, and geographic region have on hysterectomy cost above the median. Women who underwent hysterectomy for benign conditions were identified (N=38,414). The median cost of hysterectomy was $13,981 (interquartile range $9,075-29,770). The mid-Atlantic region had the lowest median cost of $9,661 (interquartile range $6,243-15,335) and the Pacific region had the highest median cost, $22,534 (interquartile range $15,380-33,797). Compared with the mid-Atlantic region, the Pacific (adjusted odds ratio [OR] 10.43, 95% confidence interval [CI] 9.44-11.45), South Atlantic (adjusted OR 5.39, 95% CI 4.95-5.86), and South Central (adjusted OR 2.40, 95% CI 2.21-2.62) regions were associated with the highest probability of costs above the median. All concomitant procedures were associated with an increased cost with the exception of bilateral salpingectomy (adjusted OR 1.03, 95% CI 0.95-1.12). Compared with vaginal hysterectomy, laparoscopic and robotic modes of hysterectomy were associated with higher probabilities of increased costs (adjusted OR 2.86, 95% CI 2.61-3.15 and adjusted OR 5.66, 95% CI 5.11-6.26, respectively). Abdominal hysterectomy was not associated with a statistically significant increase in cost compared with vaginal hysterectomy (adjusted OR 1.01, 95% CI 0.91-1.09). The cost of hysterectomy varies significantly with geographic region after adjusting for confounders.
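Two computational steps in this abstract, converting charges to costs with a hospital's cost-to-charge ratio and modeling the odds of an above-median cost, can be sketched as follows. Everything here (the data, column names, and coefficients) is a simulated stand-in, not the National Inpatient Sample extract.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500

# Simulated NIS-style records (all names and values illustrative).
df = pd.DataFrame({
    "charges": rng.lognormal(mean=10.5, sigma=0.5, size=n),
    "ccr": rng.uniform(0.25, 0.45, size=n),  # hospital cost-to-charge ratio
    "region": rng.choice(["MidAtlantic", "Pacific", "SouthAtlantic"], size=n),
    "mode": rng.choice(["vaginal", "laparoscopic", "robotic"], size=n),
})

df["cost"] = df["charges"] * df["ccr"]  # charge-to-cost conversion
df["high_cost"] = (df["cost"] > df["cost"].median()).astype(int)

# Logistic model for above-median cost; Treatment() pins the reference
# levels, mirroring comparisons against the mid-Atlantic region and
# vaginal hysterectomy.
model = smf.logit(
    "high_cost ~ C(region, Treatment('MidAtlantic'))"
    " + C(mode, Treatment('vaginal'))",
    data=df,
).fit(disp=0)
print(np.exp(model.params))  # exponentiated coefficients = adjusted ORs
```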
Goei, Dustin; van Kuijk, Jan-Peter; Flu, Willem-Jan; Hoeks, Sanne E; Chonchol, Michel; Verhagen, Hence J M; Bax, Jeroen J; Poldermans, Don
2011-02-15
Plasma N-terminal pro-B-type natriuretic peptide (NT-pro-BNP) levels improve preoperative cardiac risk stratification in vascular surgery patients. However, single preoperative measurements of NT-pro-BNP cannot take into account the hemodynamic stress caused by anesthesia and surgery. Therefore, the aim of the present study was to assess the incremental predictive value of changes in NT-pro-BNP during the perioperative period for long-term cardiac mortality. Detailed cardiac histories, rest left ventricular echocardiography, and NT-pro-BNP levels were obtained in 144 patients before vascular surgery and before discharge. The study end point was the occurrence of cardiovascular death during a median follow-up period of 13 months (interquartile range 5 to 20). Preoperatively, the median NT-pro-BNP level in the study population was 314 pg/ml (interquartile range 136 to 1,351), which increased to a median level of 1,505 pg/ml (interquartile range 404 to 6,453) before discharge. During the follow-up period, 29 patients (20%) died, 27 (93%) from cardiovascular causes. The median difference in NT-pro-BNP was 665 pg/ml in the survivors, compared to 5,336 pg/ml in the patients who died (p = 0.01). Multivariate Cox regression analyses, adjusted for cardiac history and cardiovascular risk factors (age, angina pectoris, myocardial infarction, stroke, diabetes mellitus, renal dysfunction, body mass index, type of surgery, and left ventricular ejection fraction), demonstrated that the difference in NT-pro-BNP level between the pre- and postoperative measurements was the strongest independent predictor of cardiac outcome (hazard ratio 3.06, 95% confidence interval 1.36 to 6.91). In conclusion, the change in NT-pro-BNP, indicated by repeated measurements before surgery and before discharge, is the strongest predictor of cardiac outcomes in patients who undergo vascular surgery. Copyright © 2011 Elsevier Inc. All rights reserved.
A Review of Online Evidence-based Practice Point-of-Care Information Summary Providers
Liberati, Alessandro; Moschetti, Ivan; Tagliabue, Ludovica; Moja, Lorenzo
2010-01-01
Background Busy clinicians need easy access to evidence-based information to inform their clinical practice. Publishers and organizations have designed specific tools to meet doctors’ needs at the point of care. Objective The aim of this study was to describe online point-of-care summaries and evaluate their breadth, content development, and editorial policy against their claims of being “evidence-based.” Methods We searched Medline, Google, librarian association websites, and information conference proceedings from January to December 2008. We included English Web-based point-of-care summaries designed to deliver predigested, rapidly accessible, comprehensive, periodically updated, evidence-based information to clinicians. Two investigators independently extracted data on the general characteristics and content presentation of summaries. We assessed and ranked point-of-care products according to: (1) coverage (volume) of medical conditions, (2) editorial quality, and (3) evidence-based methodology. We explored how these factors were associated. Results We retrieved 30 eligible summaries. Of these products, 18 met our inclusion criteria and were qualitatively described, and 16 provided sufficient data for quantitative evaluation. The median volume of medical conditions covered was 80.6% (interquartile range, 68.9% - 84.2%) and varied for the different products. Similarly, differences emerged for editorial policy (median 8.0, interquartile range 5.8 - 10.3) and evidence-based methodology scores (median 10.0, interquartile range 1.0 - 12.8) on a 15-point scale. None of these dimensions turned out to be significantly associated with the other dimensions (editorial quality and volume, Spearman rank correlation r = -0.001, P = .99; evidence-based methodology and volume, r = -0.19, P = .48; editorial and evidence-based methodology, r = 0.43, P = .09). Conclusions Publishers are moving to develop point-of-care summary products. Some of these have better profiles than others, and there is room for improved reporting of the strengths and weaknesses of these products. PMID:20610379
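The pairwise associations between coverage, editorial quality, and evidence-based methodology were tested with Spearman rank correlations, which can be computed directly; the per-product scores below are hypothetical placeholders.

```python
from scipy.stats import spearmanr

# Hypothetical per-product scores on the two 15-point scales.
editorial_quality = [8, 5, 12, 10, 6, 9, 11, 4]
ebm_methodology = [10, 2, 12, 1, 9, 13, 6, 4]

rho, p_value = spearmanr(editorial_quality, ebm_methodology)
print(f"Spearman r = {rho:.2f}, P = {p_value:.2f}")
```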
Gurvitch, R; Wood, D A; Tay, E L; Leipsic, J; Ye, J; Lichtenstein, S V; Thompson, C R; Carere, R G; Wijesinghe, N; Nietlispach, F; Boone, R H; Lauck, S; Cheung, A; Webb, J G
2010-09-28
Although short- and medium-term outcomes after transcatheter aortic valve implantation are encouraging, long-term data on valve function and clinical outcomes are limited. Consecutive high-risk patients who had been declined as surgical candidates because of comorbidities but who underwent successful transcatheter aortic valve implantation with a balloon-expandable valve between January 2005 and December 2006 and survived past 30 days were assessed. Clinical, echocardiographic, and computed tomographic follow-up examinations were performed. Seventy patients who underwent successful procedures and survived longer than 30 days were evaluated at a minimum follow-up of 3 years. At a median follow-up of 3.7 years (interquartile range 3.4 to 4.3 years), survival was 57%. Survival at 1, 2, and 3 years was 81%, 74%, and 61%, respectively. Freedom from reoperation was 98.5% (1 patient with endocarditis). During this early procedural experience, 11 patients died within 30 days, and 8 procedures were unsuccessful. When these patients were included, overall survival was 51%. Transaortic pressure gradients increased from 10.0 mm Hg (interquartile range 8.0 to 12.0 mm Hg) immediately after the procedure to 12.1 mm Hg (interquartile range 8.6 to 16.0 mm Hg) after 3 years (P=0.03). Bioprosthetic valve area decreased from a mean of 1.7±0.4 cm² after the procedure to 1.4±0.3 cm² after 3 years (P<0.01). Aortic incompetence after implantation was trivial or mild in 84% of cases and remained unchanged or improved over time. There were no cases of structural valvular deterioration, stent fracture, deformation, or valve migration. Transcatheter aortic valve implantation demonstrates good medium- to long-term durability and preserved hemodynamic function, with no evidence of structural failure. The procedure appears to offer an adequate and lasting resolution of aortic stenosis in selected patients.
Roca, Bernardino; Mendoza, María A; Roca, Manuel
2016-10-01
To compare the efficacy of extracorporeal shock wave therapy (ESWT) with botulinum toxin type A (BoNT-A) in the treatment of plantar fasciitis (PF). Open-label, prospective, randomized study. A total of 72 patients were included. Across all participants, the median (interquartile range) visual analog scale (VAS) pain score when taking the first steps was 8 (6-9) points before treatment and 6 (4-8) points after treatment (p < 0.001). In the group that received ESWT, the median (interquartile range) improvement in the first-steps VAS pain score was 2 (1-4) points, versus 1 (0-2) points in the group that received BoNT-A (p = 0.009). In the ESWT group, the median (interquartile range) improvement in the Roles and Maudsley pain scale was 1 (0-1) points, versus 0 (0-1) points in the BoNT-A group (p = 0.006). In a multivariate analysis, use of ESWT and lower weight were associated with improvement of pain with treatment in at least one of the three VAS pain scales used in the study. ESWT was superior to BoNT-A in the control of pain in patients with PF. Implications for Rehabilitation: Plantar fasciitis is characterized by pain at the calcaneal origin of the plantar fascia, exacerbated by weight bearing after prolonged periods of rest. Although studies comparing extracorporeal shock wave therapy or botulinum toxin type A against placebo suggest that the former is superior, reliable head-to-head data have been lacking. Extracorporeal shock wave therapy was superior to botulinum toxin type A in the control of pain in patients with PF.
Cyclodiode photocoagulation for refractory glaucoma after penetrating keratoplasty.
Shah, P; Lee, G A; Kirwan, J K; Bunce, C; Bloom, P A; Ficker, L A; Khaw, P T
2001-11-01
This study analyzes the results of intraocular pressure (IOP) reduction by contact diode cycloablation (cyclodiode) in cases of refractory glaucoma after penetrating keratoplasty. Retrospective noncomparative, interventional case series. Twenty-eight eyes in 28 patients attending the Moorfields Eye Hospital. Cyclodiode (40 applications × 1.5 W × 1.5 seconds over 270-300 degrees) was used to control the IOP in refractory glaucoma after penetrating keratoplasty. Postoperative IOP, graft status, visual acuity, and number of antiglaucoma medications were recorded after cyclodiode treatment. Cyclodiode resulted in a reduction of IOP from a median of 33 mmHg (interquartile range [28, 40.5]) to a median of 15 mmHg (interquartile range [12, 20.5]). Most patients had a significant lowering in IOP with a median reduction of 16 mmHg (interquartile range [12, 25]; P < 0.0001). IOPs of 6 to 21 mmHg were achieved in 22 patients (79%). Sixteen patients (57%) required more than one treatment with cyclodiode to control the IOP, with three patients (11%) requiring three treatments and two patients (7%) requiring four treatments. Visual acuity improved (> two Snellen lines of acuity) in three patients (11%) and remained the same (+/- one Snellen line) in 17 patients (61%). The mean number of antiglaucoma medications before cycloablation was 2.6 and was 1.8 after treatment (P < 0.001). Of the 19 patients (68%) with originally clear grafts, three grafts (16%) developed opacification. One patient (4%), with a history of nanophthalmos and recurrent uveal effusion, had delayed hypotony (IOP < 6 mmHg) occurring 46 months after the diode treatment. All patients had at least 6 months follow-up. These patients have often undergone multiple previous complicated ocular interventions and are often not suitable for filtration surgery. Reduction of IOP with maintenance of visual acuity and a good safety profile was achieved in most patients in this study but may require multiple treatments. We propose cyclodiode as an effective treatment for many patients in the management of refractory glaucoma after penetrating keratoplasty.
King, D; Hume, P; Gissane, C; Brughelli, M; Clark, T
2016-02-01
Head impacts and resulting head accelerations cause concussive injuries. There is no standard for reporting head impact data in sports to enable comparison between studies. The aim was to outline methods for reporting head impact acceleration data in sport and the effect of the acceleration thresholds on the number of impacts reported. A systematic review of accelerometer systems utilised to report head impact data in sport was conducted. The effect of using different thresholds on a set of impact data from 38 amateur senior rugby players in New Zealand over a competition season was calculated. Of the 52 studies identified, 42% reported impacts using a >10-g threshold, where g is the acceleration of gravity. Studies reported descriptive statistics as mean ± standard deviation, median, 25th to 75th interquartile range, and 95th percentile. Application of the varied impact thresholds to the New Zealand data set resulted in 20,687 impacts of >10 g, 11,459 (45% fewer) impacts of >15 g, and 4024 (81% fewer) impacts of >30 g. Linear and angular raw data were most frequently reported. Metrics combining raw data may be more useful; however, the validity of these metrics has not been adequately addressed for sport. Differing data collection methods and descriptive statistics for reporting head impacts in sports limit inter-study comparisons. Consensus on data analysis methods for sports impact assessment is needed, including thresholds. Based on the available data, the 10-g threshold is the most commonly reported impact threshold and should be reported as the median with 25th and 75th interquartile ranges as the data are non-normally distributed. Validation studies are required to determine the best threshold and metrics for impact acceleration data collection in sport. Until in-field validation studies are completed, it is recommended that head impact data should be reported as median and interquartile ranges using the 10-g impact threshold.
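The threshold sensitivity reported here, and the recommended median-with-interquartile-range summary, amount to filtering the impact list at each cutoff and summarizing what remains. A sketch with hypothetical accelerations (not the New Zealand data set):

```python
import numpy as np

# Hypothetical linear head-impact magnitudes (g) recorded over a season.
impacts = np.array([11.2, 9.5, 16.8, 31.4, 12.1, 22.7, 10.3, 48.9, 14.6])

for threshold in (10, 15, 30):
    kept = impacts[impacts > threshold]
    q1, median, q3 = np.percentile(kept, [25, 50, 75])
    print(f">{threshold} g: n={kept.size}, "
          f"median={median:.1f} g (IQR {q1:.1f}-{q3:.1f} g)")
```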
Shahian, David M; He, Xia; Jacobs, Jeffrey P; Kurlansky, Paul A; Badhwar, Vinay; Cleveland, Joseph C; Fazzalari, Frank L; Filardo, Giovanni; Normand, Sharon-Lise T; Furnary, Anthony P; Magee, Mitchell J; Rankin, J Scott; Welke, Karl F; Han, Jane; O'Brien, Sean M
2015-10-01
Previous composite performance measures of The Society of Thoracic Surgeons (STS) were estimated at the STS participant level, typically a hospital or group practice. The STS Quality Measurement Task Force has now developed a multiprocedural, multidimensional composite measure suitable for estimating the performance of individual surgeons. The development sample from the STS National Database included 621,489 isolated coronary artery bypass grafting procedures, isolated aortic valve replacement, aortic valve replacement plus coronary artery bypass grafting, mitral, or mitral plus coronary artery bypass grafting procedures performed by 2,286 surgeons between July 1, 2011, and June 30, 2014. Each surgeon's composite score combined their aggregate risk-adjusted mortality and major morbidity rates (each weighted inversely by their standard deviations) and reflected the proportion of case types they performed. Model parameters were estimated in a Bayesian framework. Composite star ratings were examined using 90%, 95%, or 98% Bayesian credible intervals. Measure reliability was estimated using various 3-year case thresholds. The final composite measure was defined as 0.81 × (1 minus risk-standardized mortality rate) + 0.19 × (1 minus risk-standardized complication rate). Risk-adjusted mortality (median, 2.3%; interquartile range, 1.7% to 3.0%), morbidity (median, 13.7%; interquartile range, 10.8% to 17.1%), and composite scores (median, 95.4%; interquartile range, 94.4% to 96.3%) varied substantially across surgeons. Using 98% Bayesian credible intervals, there were 207 1-star (lower performance) surgeons (9.1%), 1,701 2-star (as-expected performance) surgeons (74.4%), and 378 3-star (higher performance) surgeons (16.5%). With an eligibility threshold of 100 cases over 3 years, measure reliability was 0.81. The STS has developed a multiprocedural composite measure suitable for evaluating performance at the individual surgeon level. Copyright © 2015 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
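Because the composite is a fixed weighted average, a surgeon's score follows directly from the two risk-standardized rates. As a worked check (not STS code), plugging in the median rates quoted above approximately reproduces the median composite score:

```python
# Median rates reported in the abstract, expressed as fractions.
risk_std_mortality = 0.023  # risk-adjusted mortality, 2.3%
risk_std_morbidity = 0.137  # risk-adjusted major morbidity, 13.7%

composite = 0.81 * (1 - risk_std_mortality) + 0.19 * (1 - risk_std_morbidity)
print(f"composite = {composite:.3f}")  # 0.955, near the reported median 95.4%
```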
Neurogranin as a Cerebrospinal Fluid Biomarker for Synaptic Loss in Symptomatic Alzheimer Disease
Kester, Maartje I.; Teunissen, Charlotte E.; Crimmins, Daniel L.; Herries, Elizabeth M.; Ladenson, Jack H.; Scheltens, Philip; van der Flier, Wiesje M.; Morris, John C.; Holtzman, David M.; Fagan, Anne M.
2015-01-01
IMPORTANCE Neurogranin (NGRN) seems to be a promising novel cerebrospinal fluid (CSF) biomarker for synaptic loss; however, clinical, and especially longitudinal, data are sparse. OBJECTIVE To examine the utility of NGRN, with repeated CSF sampling, for diagnosis, prognosis, and monitoring of Alzheimer disease (AD). DESIGN, SETTING, AND PARTICIPANTS Longitudinal study of consecutive patients who underwent 2 lumbar punctures between the beginning of 1995 and the end of 2010 within the memory clinic–based Amsterdam Dementia Cohort. The study included 163 patients: 37 cognitively normal participants (mean [SE] age, 64 [2] years; 38% female; and mean [SE] Mini-Mental State Examination [MMSE] score, 28 [0.3]), 61 patients with mild cognitive impairment (MCI) (mean [SE] age, 68 [1] years; 38% female; and mean [SE] MMSE score, 27 [0.3]), and 65 patients with AD (mean [SE] age, 65 [1] years; 45% female; and mean [SE] MMSE score, 22 [0.7]). The mean (SE) interval between lumbar punctures was 2.0 (0.1) years, and the mean (SE) duration of cognitive follow-up was 3.8 (0.2) years. Measurements of CSF NGRN levels were obtained in January and February 2014. MAIN OUTCOME AND MEASURE Levels of NGRN in CSF samples. RESULTS Baseline CSF levels of NGRN in patients with AD (median level, 2381 pg/mL [interquartile range, 1651-3416 pg/mL]) were higher than in cognitively normal participants (median level, 1712 pg/mL [interquartile range, 1206-2724 pg/mL]) (P = .04). Baseline NGRN levels were highly correlated with total tau and tau phosphorylated at threonine 181 in all patient groups (all P < .001), but not with Aβ42. Baseline CSF levels of NGRN were also higher in patients with MCI who progressed to AD (median level, 2842 pg/mL [interquartile range, 1882-3950 pg/mL]) compared with those with stable MCI (median level, 1752 pg/mL [interquartile range, 1024-2438 pg/mL]) (P = .004), and they were predictive of progression from MCI to AD (hazard ratio, 1.8 [95% CI, 1.1-2.9]; stratified by tertiles). Linear mixed-model analyses demonstrated that within-person levels of NGRN increased over time in cognitively normal participants (mean [SE] level, 90 [45] pg/mL per year; P < .05) but not in patients with MCI or AD. CONCLUSIONS AND RELEVANCE Neurogranin is a promising biomarker for AD because levels were elevated in patients with AD compared with cognitively normal participants and predicted progression from MCI to AD. Within-person levels of NGRN increased in cognitively normal participants but not in patients with later stage MCI or AD, which suggests that NGRN may reflect presymptomatic synaptic dysfunction or loss. PMID:26366630
Microcystic macular oedema in multiple sclerosis is associated with disease severity
Gelfand, Jeffrey M.; Nolan, Rachel; Schwartz, Daniel M.; Graves, Jennifer
2012-01-01
Macular oedema typically results from blood–retinal barrier disruption. It has recently been reported that patients with multiple sclerosis treated with FTY-720 (fingolimod) may exhibit macular oedema. Multiple sclerosis is not otherwise thought to be associated with macular oedema except in the context of comorbid clinical uveitis. Despite a lack of myelin, the retina is a site of inflammation and microglial activation in multiple sclerosis and demonstrates significant neuronal and axonal loss. We unexpectedly observed microcystic macular oedema using spectral domain optical coherence tomography in patients with multiple sclerosis who did not have another reason for macular oedema. We therefore evaluated spectral domain optical coherence tomography images in consecutive patients with multiple sclerosis for microcystic macular oedema and examined correlations between macular oedema and visual and ambulatory disability in a cross-sectional analysis. Participants were excluded if there was a comorbidity that could account for the presence of macular oedema, such as uveitis, diabetes or other retinal disease. A microcystic pattern of macular oedema was observed on optical coherence tomography in 15 of 318 (4.7%) patients with multiple sclerosis. No macular oedema was identified in 52 healthy controls assessed over the same period. The microcystic oedema predominantly involved the inner nuclear layer of the retina and tended to occur in small, discrete patches. Patients with multiple sclerosis with microcystic macular oedema had significantly worse disability [median Expanded Disability Status Scale 4 (interquartile range 3–6)] than patients without macular oedema [median Expanded Disability Status Scale 2 (interquartile range 1.5–3.5)], P = 0.0002. Patients with multiple sclerosis with microcystic macular oedema also had higher Multiple Sclerosis Severity Scores, a measure of disease progression, than those without oedema [median of 6.47 (interquartile range 4.96–7.98) versus 3.65 (interquartile range 1.92–5.87), P = 0.0009]. Microcystic macular oedema occurred more commonly in eyes with prior optic neuritis than eyes without prior optic neuritis (50 versus 27%) and was associated with lower visual acuity (median logMAR acuity of 0.17 versus −0.1) and a thinner retinal nerve fibre layer. The presence of microcystic macular oedema in multiple sclerosis suggests that there may be breakdown of the blood–retinal barrier and tight junction integrity in a part of the nervous system that lacks myelin. Microcystic macular oedema may also contribute to visual dysfunction beyond that explained by nerve fibre layer loss. Microcystic changes need to be assessed, and potentially adjusted for, in clinical trials that evaluate macular volume as a marker of retinal ganglion cell survival. These findings also have implications for clinical monitoring in patients with multiple sclerosis on sphingosine 1-phosphate receptor modulating agents. PMID:22539259
Site Variability in Regulatory Oversight for an International Study of Pediatric Sepsis.
Michelson, Kelly N; Reubenson, Gary; Weiss, Scott L; Fitzgerald, Julie C; Ackerman, Kate K; Christie, LeeAnn; Bush, Jenny L; Nadkarni, Vinay M; Thomas, Neal J; Schreiner, Mark S
2018-04-01
Duplicative institutional review board/research ethics committee review for multicenter studies may impose administrative burdens and inefficiencies affecting study implementation and quality. Understanding variability in site-specific institutional review board/research ethics committee assessment and barriers to using a single review committee (an increasingly proposed solution) can inform a more efficient process. We provide needed data about the regulatory oversight process for the Sepsis PRevalence, OUtcomes, and Therapies multicenter point prevalence study. Survey. Sites invited to participate in Sepsis PRevalence, OUtcomes, and Therapies. Investigators at sites that expressed interest and/or participated in Sepsis PRevalence, OUtcomes, and Therapies. None. Using an electronic survey, we collected data about 1) logistics of protocol submission, 2) institutional review board/research ethics committee requested modifications, and 3) use of a single institutional review board (for U.S. sites). We collected surveys from 104 of 167 sites (62%). Of the 97 sites that submitted the protocol for institutional review board/research ethics committee review, 34% conducted full board review, 54% expedited review, and 4% considered the study exempt. Time to institutional review board/research ethics committee approval required a median of 34 (range 3-186) days, which took longer at sites that required protocol modifications (median [interquartile range] 50 d [35-131 d] vs 32 d [14-54 d]; p = 0.02). Enrollment was delayed at eight sites due to prolonged (> 50 d) time to approval. Of 49 U.S. sites, 43% considered using a single institutional review board, but only 18% utilized this option. Time to final approval for U.S. sites using the single institutional review board was 62 days (interquartile range, 34-70 d) compared with 34 days (interquartile range, 15-54 d) for nonsingle institutional review board sites (p = 0.16). Variability in regulatory oversight was evident for this minimal-risk observational research study, most notably in the category of type of review conducted. Duplicative review prolonged time to protocol approval at some sites. Use of a single institutional review board for U.S. sites was rare and did not improve efficiency of protocol approval. Suggestions for minimizing these challenges are provided.
Treatment and therapeutic monitoring of canine hypothyroidism.
Dixon, R M; Reid, S W J; Mooney, C T
2002-08-01
Thirty-one dogs with spontaneous hypothyroidism were treated with thyroid hormone replacement therapy (THRT) and monitored for approximately three months. Good clinical and laboratory control was ultimately achieved in all cases with a mean L-thyroxine (T4) dose of 0.026 mg/kg administered once daily. There was a significant increase and decrease in circulating total T4 and canine thyroid stimulating hormone (cTSH) concentrations, respectively, after starting THRT. After commencing treatment, 11 cases subsequently required an increase and three cases required a decrease in dose to achieve optimal clinical control. Median (semi-interquartile range [SIR]) circulating six-hour post-pill total T4 (53.6 [27.9] nmol/litre) and cTSH (0.03 [0] microg/litre) concentrations were significantly increased and decreased, respectively, in treated dogs that did not require a dose change; corresponding values in treated dogs in which an increase in dose was required were 29.3 (12.7) nmol/litre and 0.15 (0.62) microg/litre, respectively. However, circulating cTSH measurement was of limited value in assessing therapeutic control because, although increased values were associated with inadequate therapy, reference range cTSH values were common in inadequately treated dogs. Lethargy and mental demeanour were typically the first clinical signs to improve, with significant bodyweight reduction occurring within two weeks of commencing THRT. Routine clinicopathological monitoring was of value in confirming a general metabolic response to THRT, but was of limited value in accurately monitoring cases or tailoring therapy in individual cases.
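The semi-interquartile range (SIR) used in this abstract is half the distance between the first and third quartiles. A short sketch with hypothetical T4 values (only the median is aligned with the text):

```python
import numpy as np

# Hypothetical post-pill total T4 values (nmol/litre) in well-controlled dogs.
t4 = np.array([34.1, 44.8, 51.2, 53.6, 58.9, 66.3, 80.2])

q1, q3 = np.percentile(t4, [25, 75])
sir = (q3 - q1) / 2  # semi-interquartile range: half of Q3 minus Q1
print(f"median {np.median(t4):.1f}, SIR {sir:.1f} nmol/litre")
```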
Kreitzer, Natalie; Lyons, Michael S; Hart, Kim; Lindsell, Christopher J; Chung, Sora; Yick, Andrew; Bonomo, Jordan
2014-10-01
Emergency department (ED) management of mild traumatic brain injury (TBI) patients with any form of traumatic intracranial hemorrhage (ICH) is variable. Since 2000, our center's standard practice has been to obtain a repeat head computed tomography (CT) at least 6 hours after initial imaging. Patients are eligible for discharge if clinical and CT findings are stable. Whether this practice is safe is unknown. This study characterized clinical outcomes in mild TBI patients with acute traumatic ICH seen on initial ED neuroimaging. This retrospective cohort study included patients presenting to the ED with blunt mild TBI with Glasgow Coma Scale (GCS) scores of 14 or 15 and stable vital signs, during the period from January 2001 to January 2010. Patients with any ICH on initial head CT and repeat head CT within 24 hours were eligible. Cases were excluded for initial GCS < 14, injury > 24 hours old, pregnancy, concomitant nonminor injuries, and coagulopathy. A single investigator abstracted data from records using a standardized case report form and data dictionary. Primary endpoints included death, neurosurgical procedures, and, for discharged patients, return to the ED within 7 days. Differences in proportions were computed with 95% confidence intervals (CIs). Of 1,011 patients who presented to the ED and had two head CTs within 24 hours, 323 (32%) met inclusion criteria. The median time between CT scans was 6 hours (interquartile range = 5 to 7 hours). A total of 153 (47%) patients had subarachnoid hemorrhage, 132 (41%) patients had subdural hemorrhage, 11 (3%) patients had epidural hemorrhage, 78 (24%) patients had cerebral contusions, and 59 (18%) patients had intraparenchymal hemorrhage. Four of 323 (1.2%, 95% CI = 0.3% to 3.2%) patients died within 2 weeks of injury. Three of the patients who died had been admitted from the ED on their initial visits, and one had been discharged home. There were 206 patients (64%) discharged from the ED, 28 (13.6%) of whom returned to the ED within 1 week. Of the 92 patients who were hospitalized, three (0.9% of the overall cohort, 95% CI = 0.2% to 2.7%) required neurosurgical intervention. A repeat head CT after a brief period of ED observation allowed early discharge of a cohort of mild TBI patients with traumatic ICH without delayed adverse outcomes. Whether this justifies the cost and radiation exposure involved with this pattern of practice requires further study. © 2014 by the Society for Academic Emergency Medicine.
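The small-count proportions with 95% CIs reported above can be checked with an exact (Clopper-Pearson) interval; the abstract does not name its method, so treating it as exact is an assumption that happens to land close to the published bounds.

```python
from statsmodels.stats.proportion import proportion_confint

# Deaths within 2 weeks: 4 of 323 patients, as reported above.
low, high = proportion_confint(count=4, nobs=323, alpha=0.05, method="beta")
print(f"4/323 = {4 / 323:.1%} (95% CI {low:.1%} to {high:.1%})")
```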
On the Use of Rank Tests and Estimates in the Linear Model.
1982-06-01
assumption A5, McKean and Hettmansperger (1976) show that $\hat{\tau} = (W_{(N-c)} - W_{(c+1)})/(2 z_{\alpha/2})$ (14), where $2 z_{\alpha/2}$ is the $1-\alpha$ interpercentile range of the standard... $(r_{(.75n)} - r_{(.25n)})$ (13). The window width h incorporates a resistant estimate of scale, the interquartile range of the residuals, and a normalizing... an alternative estimate of $\tau$ is available with the additional assumption of symmetry of the error distribution. ASSUMPTION: A5. Suppose the underlying error
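Read through the OCR noise, the excerpt defines a consistent estimate of the scale parameter τ and a resistant estimate of residual spread. The restatement below assumes the standard notation of McKean and Hettmansperger (1976), with W(i) the ordered Walsh averages and r(i) the ordered residuals; it is a reconstruction, not a quotation.

```latex
\[
  \hat{\tau} \;=\; \frac{W_{(N-c)} - W_{(c+1)}}{2\,z_{\alpha/2}}
  \qquad\text{(14)}
\]
\[
  \hat{s} \;=\; r_{(0.75n)} - r_{(0.25n)}
  \qquad\text{(13)}
\]
% Here 2 z_{\alpha/2} is the (1 - \alpha) interpercentile range of the
% standard normal distribution, and \hat{s}, the interquartile range of
% the residuals, is the resistant scale estimate entering the window
% width h.
```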
Basu, Chandrasekhar Bob; Chen, Li-Mei; Hollier, Larry H; Shenaq, Saleh M
2004-12-01
The Accreditation Council for Graduate Medical Education (ACGME) Work-Hours Duty Policy became effective on July 1, 2003, mandating the reduction of resident duty work hours. The Baylor College of Medicine Multi-Institutional Integrated Plastic Surgery Program instituted a resident duty work-hours policy on July 1, 2002 (1 year ahead of the national mandate). Outcomes data are needed to facilitate continuous improvements in plastic surgical residency training while maintaining high-quality patient care. To assess the effect of this policy intervention on plastic surgery resident education as measured through the six core competencies and patient/resident safety, the investigators surveyed all categorical plastic surgery residents 6 months after implementation of the policy. This work represents the first empiric study investigating the effect of duty hours reduction on plastic surgery training and education. The categorical plastic surgery residents at the Baylor College of Medicine Multi-Institutional Integrated Plastic Surgery Program completed a 68-item survey on a five-point Likert scale (1 = strongly disagree to 5 = strongly agree). Residents were asked to rate multiple parameters based on the ACGME six core competencies, including statements on patient care and clinical/operative duties, resident education, resident quality of life, and resident perceptions on this policy. All surveys were completed anonymously. The sample size was n = 12 (program year 3 through program year 6), with a 100 percent response rate. Univariate and bivariate statistical analysis was conducted with SPSS version 10.0 statistical software. Specifically, interquartile deviations were used to find consensus among resident responses to each statement. Descriptive statistics indicated higher percentages of agreement on a majority of statements in three categories, including patient care and clinical/operative duties, academic duties, and resident quality of life. Using interquartile deviation, the highest levels of consensus among the residents were found in positive statements addressing resident alertness (both in and out of the operative environment), time to read/prepare for cases/conferences, efficacy of the didactic curriculum, and overall satisfaction with this policy for surgery resident education. Residents also felt that their patients favored this work hours policy. In addition, there was high consensus that this policy improved overall patient care. The majority of residents identified a negative effect of this policy through an increase in cross-coverage responsibilities, however, and half of the residents perceived that faculty negatively viewed their unavailability postcall. In addition, no consensus among the residents was achieved regarding perceptions on overall weekly operative experience. Plastic surgery residents perceived that the reduction of resident work hours through adherence to the ACGME guidelines has beneficial effects on patient care and clinical/operative duties, academic duties, and resident quality of life. Residents felt, however, that these benefits may increase cross-coverage workloads. Furthermore, residents were concerned about faculty perception of their changes in postcall duties. In contrast to previously published findings in the general surgery literature, the current results indicate that residents do not believe that this policy negatively affects continuity of patient care. 
In fact, the current findings suggest that adherence to this policy improves patient care on multiple levels. The effect on the operative experience remains to be elucidated. Further large-scale and longitudinal research design and analysis is warranted to better assess the results of the ACGME resident duty work-hours policy in plastic surgery resident education.
Birko, Stanislav; Dove, Edward S; Özdemir, Vural
2015-01-01
The extent of consensus (or the lack thereof) among experts in emerging fields of innovation can serve as antecedents of scientific, societal, investor and stakeholder synergy or conflict. Naturally, how we measure consensus is of great importance to science and technology strategic foresight. The Delphi methodology is a widely used anonymous survey technique to evaluate consensus among a panel of experts. Surprisingly, there is little guidance on how indices of consensus can be influenced by parameters of the Delphi survey itself. We simulated a classic three-round Delphi survey building on the concept of clustered consensus/dissensus. We evaluated three study characteristics that are pertinent for design of Delphi foresight research: (1) the number of survey questions, (2) the sample size, and (3) the extent to which experts conform to group opinion (the Group Conformity Index) in a Delphi study. Their impacts on the following nine Delphi consensus indices were then examined in 1000 simulations: Clustered Mode, Clustered Pairwise Agreement, Conger's Kappa, De Moivre index, Extremities Version of the Clustered Pairwise Agreement, Fleiss' Kappa, Mode, the Interquartile Range and Pairwise Agreement. The dependency of a consensus index on the Delphi survey characteristics was expressed from 0.000 (no dependency) to 1.000 (full dependency). The number of questions (range: 6 to 40) in a survey did not have a notable impact whereby the dependency values remained below 0.030. The variation in sample size (range: 6 to 50) displayed the top three impacts for the Interquartile Range, the Clustered Mode and the Mode (dependency = 0.396, 0.130, 0.116, respectively). The Group Conformity Index, a construct akin to measuring stubbornness/flexibility of experts' opinions, greatly impacted all nine Delphi consensus indices (dependency = 0.200 to 0.504), except the Extremity CPWA and the Interquartile Range that were impacted only beyond the first decimal point (dependency = 0.087 and 0.083, respectively). Scholars in technology design, foresight research and future(s) studies might consider these new findings in strategic planning of Delphi studies, for example, in rational choice of consensus indices and sample size, or accounting for confounding factors such as experts' variable degrees of conformity (stubbornness/flexibility) in modifying their opinions.
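The strong sample-size dependency of the Interquartile Range index can be illustrated with a toy simulation in the spirit of, but far simpler than, the authors' design; the 1-9 rating scale, the conformity update rule, and the seed are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

def final_round_iqr(sample_size, rounds=3, conformity=0.5):
    """Toy Delphi: experts rate one question on a 1-9 scale; each round
    they move a `conformity` fraction of the way toward the group median.
    Returns the final-round interquartile range (lower = more consensus)."""
    ratings = rng.integers(1, 10, size=sample_size).astype(float)
    for _ in range(rounds - 1):
        ratings += conformity * (np.median(ratings) - ratings)
    q1, q3 = np.percentile(ratings, [25, 75])
    return q3 - q1

for n in (6, 20, 50):  # spans the sample-size range studied above
    iqrs = [final_round_iqr(n) for _ in range(1000)]  # 1000 simulations
    print(f"n={n}: mean final-round IQR = {np.mean(iqrs):.2f}")
```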
Location characteristics of early perihaematomal oedema
McCarron, M O; McCarron, P; Alberts, M J
2006-01-01
Background The natural history and triggers of perihaematomal oedema (PHO) remain poorly understood. Cerebral amyloid angiopathy (a common cause of lobar haemorrhage) has localised anticoagulant and thrombolytic properties, which may influence PHO. We hypothesised that early (within 24 hours) oedema to haematoma volume ratios are smaller in patients with lobar intracerebral haemorrhage (ICH) than in patients with deep ICH. Methods Haematoma and PHO volumes were measured in consecutive patients admitted to an acute stroke unit with a diagnosis of spontaneous supratentorial ICH proven by computed tomography. The oedema to haematoma volume ratios were calculated and compared in patients with lobar ICH and deep ICH. Results In total, 44 patients with ICH were studied: 19 patients had deep ICH, median haematoma volume 8.4 ml (interquartile range (IQR) 4.8 to 20.8), median PHO 8.2 ml (2.8 to 16), and 25 had lobar ICHs, median haematoma volume 17.6 ml (6.6 to 33.1) and median oedema volume 10.2 ml (3.4 to 24.2). Patients with lobar ICH were older than those with deep ICH (65.7 v 57.4 years, p = 0.009) but ICH location did not differ by sex or race. There was no evidence that haematoma or oedema volumes were related to type of ICH (p = 0.23, p = 0.39 respectively). The median oedema to haematoma volume ratios were similar in patients with lobar and deep ICH (0.67 v 0.58, p = 0.71). Controlling for age, sex, and race made little difference to these comparisons. Conclusions There are no major location specific differences in PHO volumes within 24 hours of ICH onset. Deep and lobar ICH may have common therapeutic targets to reduce early PHO. PMID:16484648
[Evaluation of an Xpert EV (Cepheid®) molecular diagnostic technique for enteroviral meningitis].
Alonso Pérez, Natalia; Sagastizabal Cardelus, Belén; Prieto Tato, Luis Manuel; Guillén Martín, Sara; González Torralba, Ana; García Bermejo, Isabel; Ramos Amador, José Tomás
2017-10-01
Polymerase chain reaction (PCR) assays have been shown to be useful and quick for the diagnosis of enterovirus in aseptic meningitis. The aim of our study was to analyse the changes in clinical practice after the introduction of a real-time polymerase chain reaction (RT-PCR) technique using the Xpert EV (Cepheid®) assay for the qualitative detection of enterovirus RNA in cerebrospinal fluid specimens from children with suspected viral meningitis. A retrospective study was performed in children older than 1 year, diagnosed with enterovirus meningitis in a tertiary hospital from November 2006 to February 2013. The first period, before the availability of Xpert EV (Cepheid®) (Group 1, November 2006-August 2010), was compared with the later period (Group 2, September 2010-February 2013). Clinical characteristics, the mean length of stay, and the cost per inpatient case were compared between the 2 periods. Forty-one patients (60.9% male) were included, with a median age of 64 months (interquartile range 28-96). Twenty-six patients (63.4%) were included in Group 2. There were no statistically significant differences in the epidemiological, disease severity, and laboratory characteristics between the two periods of study. A significant difference was observed in the mean length of stay, which was shorter in Group 2 (48 hours vs 40.5 hours, P=.039), along with a significantly lower inpatient cost per case (€779.77 vs €656.05, P<.05). The Xpert EV (Cepheid®) assay was useful for decreasing the length of hospital stay and the costs associated with hospitalisation in children with enterovirus meningitis. Copyright © 2016 Asociación Española de Pediatría. Published by Elsevier España, S.L.U. All rights reserved.
Epstein, Daniel S; Mitra, Biswadev; Cameron, Peter A; Fitzgerald, Mark; Rosenfeld, Jeffrey V
2016-07-01
Acute traumatic coagulopathy (ATC) has been reported in the setting of isolated traumatic brain injury (iTBI) and is associated with poor outcomes. We aimed to evaluate the effectiveness of procoagulant agents administered to patients with ATC and iTBI during resuscitation, hypothesizing that timely normalization of coagulopathy may be associated with a decrease in mortality. A retrospective review of the Alfred Hospital trauma registry, Australia, was conducted, and patients with iTBI (head Abbreviated Injury Score [AIS] ≥3 and all other body AIS <3) and coagulopathy (international normalized ratio [INR] ≥1.3) were selected for analysis. Data on procoagulant agents used (fresh frozen plasma, platelets, cryoprecipitate, prothrombin complex concentrates, tranexamic acid, vitamin K) were extracted. Among patients who had achieved normalization of INR or survived beyond 24 hours and were not taking oral anticoagulants, the association of normalization of INR with death at hospital discharge was analyzed using multivariable logistic regression analysis. There were 157 patients with ATC, of whom 68 (43.3%) received procoagulant products within 24 hours of presentation. The median time to delivery of first products was 182.5 (interquartile range [IQR] 115-375) minutes, and following administration of coagulants, time to normalization of INR was 605 (IQR 274-1146) minutes. Normalization of INR was independently associated with significantly lower mortality (adjusted odds ratio 0.10; 95% confidence interval 0.03-0.38). Normalization of INR was associated with reduced mortality in patients with ATC in the setting of iTBI. As there was a substantial time lag between delivery of products and eventual normalization of coagulation, specific management of coagulopathy should be implemented as early as possible. Copyright © 2016 Elsevier Ltd. All rights reserved.
Bhogal, Sanjit K; McGillivray, David; Bourbeau, Jean; Benedetti, Andrea; Bartlett, Susan; Ducharme, Francine M
2012-07-01
The variable effectiveness of clinical asthma pathways in reducing hospital admissions may be explained in part by the timing of systemic corticosteroid administration. We examined the effect of early (within 60 minutes [SD 15 minutes] of triage) versus delayed (>75 minutes) administration of systemic corticosteroids on health outcomes. We conducted a prospective observational cohort of children aged 2 to 17 years presenting to the emergency department with moderate or severe asthma, defined as a Pediatric Respiratory Assessment Measure (PRAM) score of 5 to 12. The outcomes were hospital admission, relapse, and length of active treatment; they were analyzed with multivariate logistic and linear regressions adjusted for covariates and potential confounders. Among the 406 eligible children, 88% had moderate asthma and 22% severe asthma. The median age was 4 years (interquartile range 3 to 8 years); 64% were male patients. Fifty percent of patients received systemic corticosteroids early; in 33%, administration was delayed; 17% of children failed to receive any. Overall, 36% of patients were admitted to the hospital. Compared with delayed administration, early administration reduced the odds of admission (odds ratio 0.4; 95% confidence interval 0.2 to 0.7) and shortened the length of active treatment by 0.7 hours (95% confidence interval -1.3 to -0.8 hours), with no significant effect on relapse. Delayed administration was positively associated with triage priority and negatively with PRAM score. In this study of children with moderate or severe asthma, administration of systemic corticosteroids within 75 minutes of triage decreased hospital admission rate and length of active treatment, suggesting that early administration of systemic corticosteroids may allow for optimal effectiveness. Copyright © 2012 American College of Emergency Physicians. Published by Mosby, Inc. All rights reserved.
Shah, Monica R; Whellan, David J; Peterson, Eric D; Nohria, Anju; Hasselblad, Vic; Xue, Zhenyi; Bowers, Margaret T; O'Connor, Christopher M; Califf, Robert M; Stevenson, Lynne W
2008-04-01
Little data exist to help those organizing and managing heart failure (HF) disease management (DM) programs. We aimed to describe the intensity of outpatient HF care (clinic visits and telephone calls) and medical and nonpharmacological interventions in the outpatient setting. This was a prospective substudy of 130 patients enrolled in STARBRITE in HF DM programs at 3 centers. Follow-up occurred 10, 30, 60, 90, and 120 days after discharge. The numbers of clinic visits and calls made by HF cardiologists, nurse practitioners, and nurses were prospectively tracked. The results were reported as medians and interquartile ranges. There were a total of 581 calls with 4 (2, 6) per patient and 467 clinic visits with 3 (2, 5) per patient. Time spent per patient was 8.9 (6, 10.6) minutes per call and 23.8 (20, 28.3) minutes per clinic visit. Nurses and nurse practitioners spent 113 hours delivering care on the phone, and physicians and nurse practitioners spent 187.6 hours in clinic. Issues addressed during calls included HF education (341 times [52.6%]) and fluid overload (87 times [41.8%]). Medical interventions included adjustments to loop diuretics (calls 101 times, clinic 156 times); beta-blockers (calls 18 times, clinic 126 times); and vasodilators (calls 8 times, clinic 55 times). More than a third of clinician time was spent on calls, during which >50% of patient contacts and HF education and >39% of diuretic adjustments occurred. Administrators and public and private insurers need to recognize the amount of medical care delivered over the telephone and should consider reimbursement for these activities.
Long, Ann C; Muni, Sarah; Treece, Patsy D; Engelberg, Ruth A; Nielsen, Elizabeth L; Fitzpatrick, Annette L; Curtis, J Randall
2015-12-01
Discussions about withdrawal of life-sustaining therapies often include family members of critically ill patients. These conversations should address essential components of the dying process, including expected time to death after withdrawal. The study objective was to aid physician communication about the dying process by identifying predictors of time to death after terminal withdrawal of mechanical ventilation. We conducted an observational analysis from a single-center, before-after evaluation of an intervention to improve palliative care. We studied 330 patients who died after terminal withdrawal of mechanical ventilation. Predictors included patient demographics, laboratory, respiratory, and physiologic variables, and medication use. The median time to death for the entire cohort was 0.58 hours (interquartile range (IQR) 0.22-2.25 hours) after withdrawal of mechanical ventilation. Using Cox regression, independent predictors of shorter time to death included higher positive end-expiratory pressure (per 1 cm H2O hazard ratio [HR], 1.07; 95% CI 1.04-1.11); higher static pressure (per 1 cm H2O HR, 1.03; 95% CI 1.01-1.04); extubation prior to death (HR, 1.41; 95% CI 1.06-1.86); and presence of diabetes (HR, 1.75; 95% CI 1.25-2.44). Higher noninvasive mean arterial pressure predicted longer time to death (per 1 mmHg HR, 0.98; 95% CI 0.97-0.99). Comorbid illness and key respiratory and physiologic parameters may inform physician predictions of time to death after withdrawal of mechanical ventilation. An understanding of the predictors of time to death may facilitate discussions with family members of dying patients and improve communication about end-of-life care.
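Hazard ratios per unit of a continuous predictor, as reported here, come from a Cox proportional hazards model. The sketch below fits one on simulated data with the lifelines library; the variable names, effect sizes, and library choice are illustrative assumptions, not the study's analysis.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 200

# Simulated cohort: higher PEEP shortens and higher MAP lengthens the
# time to death after withdrawal (directions match the abstract; the
# magnitudes are invented).
peep = rng.uniform(5, 15, n)      # positive end-expiratory pressure, cm H2O
mean_ap = rng.uniform(55, 95, n)  # noninvasive mean arterial pressure, mmHg
hazard = np.exp(0.07 * peep - 0.02 * mean_ap)
df = pd.DataFrame({
    "hours_to_death": rng.exponential(1.0 / hazard),
    "died": 1,  # terminal withdrawal: every observation is an event
    "peep": peep,
    "mean_ap": mean_ap,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="hours_to_death", event_col="died")
print(cph.hazard_ratios_)  # HR per 1-unit increase in each predictor
```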
Muni, Sarah; Treece, Patsy D.; Engelberg, Ruth A.; Nielsen, Elizabeth L.; Fitzpatrick, Annette L.; Curtis, J. Randall
2015-01-01
Abstract Background: Discussions about withdrawal of life-sustaining therapies often include family members of critically ill patients. These conversations should address essential components of the dying process, including expected time to death after withdrawal. Objectives: The study objective was to aid physician communication about the dying process by identifying predictors of time to death after terminal withdrawal of mechanical ventilation. Methods: We conducted an observational analysis from a single-center, before–after evaluation of an intervention to improve palliative care. We studied 330 patients who died after terminal withdrawal of mechanical ventilation. Predictors included patient demographics, laboratory, respiratory, and physiologic variables, and medication use. Results: The median time to death for the entire cohort was 0.58 hours (interquartile range (IQR) 0.22–2.25 hours) after withdrawal of mechanical ventilation. Using Cox regression, independent predictors of shorter time to death included higher positive end-expiratory pressure (per 1 cm H2O hazard ratio [HR], 1.07; 95% CI 1.04–1.11); higher static pressure (per 1 cm H2O HR, 1.03; 95% CI 1.01–1.04); extubation prior to death (HR, 1.41; 95% CI 1.06–1.86); and presence of diabetes (HR, 1.75; 95% CI 1.25–2.44). Higher noninvasive mean arterial pressure predicted longer time to death (per 1 mmHg HR, 0.98; 95% CI 0.97–0.99). Conclusions: Comorbid illness and key respiratory and physiologic parameters may inform physician predictions of time to death after withdrawal of mechanical ventilation. An understanding of the predictors of time to death may facilitate discussions with family members of dying patients and improve communication about end-of-life care. PMID:26555010
Graham, Eric M; Atz, Andrew M; Gillis, Jenna; Desantis, Stacia M; Haney, A Lauren; Deardorff, Rachael L; Uber, Walter E; Reeves, Scott T; McGowan, Francis X; Bradley, Scott M; Spinale, Francis G
2012-05-01
Factors contributing to postoperative complications include blood loss and a heightened inflammatory response. The objective of this study was to test the hypothesis that aprotinin would decrease perioperative blood product use, reduce biomarkers of inflammation, and result in improved clinical outcome parameters in neonates undergoing cardiac operations. This was a secondary retrospective analysis of a clinical trial whereby neonates undergoing cardiac surgery received either aprotinin (n = 34; before May 2008) or tranexamic acid (n = 42; after May 2008). Perioperative blood product use, clinical course, and measurements of cytokines were compared. Use of perioperative red blood cells, cryoprecipitate, and platelets was reduced in neonates receiving aprotinin compared with tranexamic acid (P < .05). Recombinant activated factor VII use (2/34 [6%] vs 18/42 [43%]; P < .001), delayed sternal closure (12/34 [35%] vs 26/42 [62%]; P = .02), and inotropic requirements at 24 and 36 hours (P < .05) were also reduced in the aprotinin group. Median duration of mechanical ventilation was reduced compared with tranexamic acid: 2.9 days (interquartile range: 1.7-5.1 days) versus 4.2 days (2.9-5.2 days), P = .04. Production of tumor necrosis factor and interleukin-2 activation were attenuated in the aprotinin group at 24 hours postoperatively. No differential effects on renal function were seen between agents. Aprotinin, compared with tranexamic acid, was associated with reduced perioperative blood product use, improved early indices of postoperative recovery, and attenuated indices of cytokine activation, without early adverse effects. These findings suggest that aprotinin may have unique effects in the context of neonatal cardiac surgery and challenge contentions that antifibrinolytics are equivalent with respect to early postoperative outcomes. Copyright © 2012 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
Sargent, James A; Roeder, Hilary A; Ward, Kristy K; Moore, Thomas R; Ramos, Gladys A
2015-12-01
We hypothesized that patients with type 1 diabetes mellitus (T1DM) who were managed during their pregnancy with a continuous subcutaneous insulin infusion (CSII) would have a lower incidence of neonatal hypoglycemia (NH) than patients managed with multiple daily injections (MDI) of insulin. This was a retrospective cohort of 95 women with T1DM who delivered singleton, term neonates between 2007 and 2014. The primary outcome was incidence of NH (capillary plasma glucose ≤ 45 mg/dL) in the first 24 hours after birth. The incidence of NH was 66.0% (62/95). The NH rate was significantly higher in women managed with CSII versus MDI (62 vs. 38%, p = 0.024). Neonates with NH had a higher birth weight (3,867 ± 658 vs. 3,414 ± 619 g, p = 0.002). When analyzing intrapartum glucose management, mothers of neonates with NH had significantly less time managed on an insulin infusion (median [interquartile range] 7 [3.5-30.5] vs. 17.5 [2.0-17.5] hours, p = 0.014). In multivariable analysis, only maternal body mass index (BMI) (p = 0.035) and time on an insulin infusion (p = 0.043) were significantly associated with NH. In our population of patients with T1DM, CSII was more prevalent in the NH group; however, when controlling for other factors, intrapartum glucose management and early maternal BMI were the only variables associated with NH.
Graham, Eric M.; Atz, Andrew M.; Gillis, Jenna; DeSantis, Stacia M.; Haney, A. Lauren; Deardorff, Rachael L.; Uber, Walter E.; Reeves, Scott T.; McGowan, Francis X.; Bradley, Scott M.; Spinale, Francis G.
2011-01-01
Objective Factors contributing to postoperative complications include blood loss and a heightened inflammatory response. The objective of this study was to test the hypothesis that aprotinin would decrease perioperative blood product use, reduce biomarkers of inflammation, and result in improved clinical outcome parameters in neonates undergoing cardiac operations. Methods This was a secondary retrospective analysis of a clinical trial whereby neonates undergoing cardiac surgery received either aprotinin (n = 34; before May 2008) or tranexamic acid (n = 42; after May 2008). Perioperative blood product use, clinical course, and measurements of cytokines were compared. Results Use of perioperative red blood cells, cryoprecipitate, and platelets was reduced in neonates receiving aprotinin compared with tranexamic acid (P < .05). Recombinant activated factor VII use (2/34 [6%] vs 18/42 [43%]; P < .001), delayed sternal closure (12/34 [35%] vs 26/42 [62%]; P = .02), and inotropic requirements at 24 and 36 hours (P < .05) were also reduced in the aprotinin group. Median duration of mechanical ventilation was reduced compared with tranexamic acid: 2.9 days (interquartile range: 1.7–5.1 days) versus 4.2 days (2.9–5.2 days), P = .04. Production of tumor necrosis factor and interleukin-2 activation were attenuated in the aprotinin group at 24 hours postoperatively. No differential effects on renal function were seen between agents. Conclusions Aprotinin, compared with tranexamic acid, was associated with reduced perioperative blood product use, improved early indices of postoperative recovery, and attenuated indices of cytokine activation, without early adverse effects. These findings suggest that aprotinin may have unique effects in the context of neonatal cardiac surgery and challenge contentions that antifibrinolytics are equivalent with respect to early postoperative outcomes. PMID:22075061
Delayed diagnosis of injuries in pediatric trauma: the role of radiographic ordering practices.
Willner, Emily L; Jackson, Hollie A; Nager, Alan L
2012-01-01
We sought to describe the use of radiographic studies in pediatric major trauma patients and determine the extent to which a selective, clinically guided use of imaging contributes to delayed diagnosis of injury (DDI). We conducted a retrospective chart review of 324 consecutive pediatric major trauma patients at our level 1 trauma center. One radiologist reviewed all imaging. Delayed diagnosis of injury was defined as detection after more than 12 hours. Equivalency testing was performed to compare radiology use in patients with and without DDI. Twenty-six (8%) of 324 patients had 36 DDI; 27 (75%) of 36 were orthopedic injuries. Median time to DDI detection was 20.5 hours (interquartile range, 15-60.5). During initial evaluation, DDI patients had similar numbers of plain radiographs (3.5 vs 3, P = .54) but more computed tomographic (CT) scans (4 vs 3, P = .03) compared with patients without DDI. Sixteen percent of all patients received CT thorax; 55%, CT cervical spine; and 56%, CT abdomen. Only 1 clinically important DDI was detected solely on the basis of a later CT scan (0.3%; 95% confidence interval, 0-1.5). No cervical spine, intrathoracic, or intraabdominal DDI was attributable to failure to obtain a CT during initial evaluation. Patients with DDI had higher injury severity scores, intubation rates, and pediatric intensive care unit admission rates than those without DDI. Patients with DDI had similar initial plain x-ray evaluations to patients without DDI, despite DDI patients being more severely injured. Delayed diagnosis of injury was not attributable to inadequate CT use. Most DDIs were orthopedic, highlighting the importance of a tertiary survey and a low threshold for skeletal radiographs. Copyright © 2012 Elsevier Inc. All rights reserved.
Cumulative lactate and hospital mortality in ICU patients
2013-01-01
Background Both hyperlactatemia and persistence of hyperlactatemia have been associated with poor outcome. We compared lactate and lactate-derived variables in outcome prediction. Methods Retrospective observational study. Case records from 2,251 consecutive intensive care unit (ICU) patients admitted between 2001 and 2007 were analyzed. Baseline characteristics, all lactate measurements, and in-hospital mortality were recorded. The time integral of arterial blood lactate levels above the upper normal threshold of 2.2 mmol/L (lactate-time-integral), maximum lactate (max-lactate), and time-to-first-normalization were calculated. Survivors and nonsurvivors were compared and receiver operating characteristic (ROC) analyses were applied. Results A total of 20,755 lactate measurements were analyzed. Data are shown as median [interquartile range]. In nonsurvivors (n = 405), lactate-time-integral (192 [0–1881] min·mmol/L) and time-to-first-normalization (44.0 [0–427] min) were higher than in hospital survivors (n = 1846; 0 [0–134] min·mmol/L and 0 [0–75] min, respectively; all p < 0.001). Normalization of lactate <6 hours after ICU admission revealed better survival compared with normalization of lactate >6 hours (mortality 16.6% vs. 24.4%; p < 0.001). The AUC of the ROC curves to predict in-hospital mortality was largest for max-lactate, whereas it did not differ among the other lactate-derived variables (all p > 0.05). The area under the ROC curves for admission lactate and lactate-time-integral was not different (p = 0.36). Conclusions Hyperlactatemia is associated with in-hospital mortality in a heterogeneous ICU population. In our patients, lactate peak values predicted in-hospital mortality equally well as lactate-time-integral of arterial blood lactate levels above the upper normal threshold. PMID:23446002
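The lactate-time-integral defined above (area of the lactate curve above 2.2 mmol/L) can be approximated from timestamped measurements with the trapezoidal rule. A minimal sketch on hypothetical values:

```python
import numpy as np

THRESHOLD = 2.2  # mmol/L, the upper normal limit used in the study

def lactate_time_integral(times_min, lactate_mmol_l):
    """Approximate area of the lactate curve above THRESHOLD, in min·mmol/L.

    Values are clipped at the threshold so only the excess contributes,
    then integrated with the trapezoidal rule over the sampled points.
    """
    t = np.asarray(times_min, dtype=float)
    excess = np.clip(np.asarray(lactate_mmol_l, dtype=float) - THRESHOLD, 0.0, None)
    return float(np.sum((excess[:-1] + excess[1:]) / 2.0 * np.diff(t)))

# Hypothetical series: lactate normalizes roughly 4 h after ICU admission.
t = [0, 60, 120, 240, 360]        # minutes from admission
lac = [5.0, 4.1, 3.0, 2.0, 1.8]   # mmol/L
print(lactate_time_integral(t, lac))
```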
Reif, Ingalill; Wincent, Anders; Stiller, Carl-Olav
2017-06-01
The objective of this randomized double-blind cross-over trial was to determine if patients with severe cancer-related pain and inadequate response to systemic opioids prefer intrathecal (IT) pain relief with a combination of bupivacaine and morphine or with bupivacaine only. Adult patients with cancer-related pain (n = 23) scheduled for IT analgesia at the Pain Center at the Karolinska University Hospital Solna, Stockholm, Sweden, were included. The optimal individual flow rate of IT bupivacaine (2 mg/mL) in addition to bolus doses was titrated and maintained for 4 days. Morphine (1 mg/mL) was added to bupivacaine either on day 2 or day 4 according to a randomization protocol. The primary outcome was expressed preference for pain relief with morphine rather than control (bupivacaine only). Secondary outcomes were differences in pain intensity, pain relief, total use of bupivacaine per 24 hours, and number of requested bolus doses. Eight patients dropped out during the 4-day study period for reasons not related to the trial. IT bupivacaine significantly decreased median (interquartile range) pain intensity from 5 (3 - 7) at baseline (before catheter insertion) to 1 (0 - 1) (p = 0.0001; Wilcoxon test). Only 1 patient of the 15 with 4-day data expressed any preference for morphine. The addition of IT morphine did not result in any significant change in pain intensity, pain relief score, total use of bupivacaine per 24 hours, or number of requested bolus doses. These results suggest that patients with cancer-related pain treated with high doses of systemic opioids may start IT treatment with an optimal dose of IT bupivacaine without morphine.
Bloos, Frank; Rüddel, Hendrik; Thomas-Rüddel, Daniel; Schwarzkopf, Daniel; Pausch, Christine; Harbarth, Stephan; Schreiber, Torsten; Gründling, Matthias; Marshall, John; Simon, Philipp; Levy, Mitchell M; Weiss, Manfred; Weyland, Andreas; Gerlach, Herwig; Schürholz, Tobias; Engel, Christoph; Matthäus-Krämer, Claudia; Scheer, Christian; Bach, Friedhelm; Riessen, Reimer; Poidinger, Bernhard; Dey, Karin; Weiler, Norbert; Meier-Hellmann, Andreas; Häberle, Helene H; Wöbker, Gabriele; Kaisers, Udo X; Reinhart, Konrad
2017-11-01
Guidelines recommend administering antibiotics within 1 h of sepsis recognition, but this recommendation remains untested by randomized trials. This trial was set up to investigate whether survival is improved by reducing the time before initiation of antimicrobial therapy by means of a multifaceted intervention in compliance with guideline recommendations. The MEDUSA study, a prospective multicenter cluster-randomized trial, was conducted from July 2011 to July 2013 in 40 German hospitals. Hospitals were randomly allocated to receive conventional continuous medical education (CME) measures (control group) or multifaceted interventions including local quality improvement teams, educational outreach, audit, feedback, and reminders. We included 4183 patients with severe sepsis or septic shock in an intention-to-treat analysis comparing the multifaceted intervention (n = 2596) with conventional CME (n = 1587). The primary outcome was 28-day mortality. The 28-day mortality was 35.1% (883 of 2596 patients) in the intervention group and 26.7% (403 of 1587 patients) in the control group (p = 0.01). The intervention was not a risk factor for mortality, since this difference was present from the beginning of the study and remained unaffected by the intervention. Median time to antimicrobial therapy was 1.5 h (interquartile range 0.1-4.9 h) in the intervention group and 2.0 h (0.4-5.9 h; p = 0.41) in the control group. The risk of death increased by 2% per hour of delay in antimicrobial therapy and 1% per hour of delay in source control, independent of group assignment. Delay in antimicrobial therapy and source control was associated with increased mortality, but the multifaceted approach was unable to change time to antimicrobial therapy in this setting and did not affect survival.
Brener, Sorin J; Moliterno, David J; Lincoff, A Michael; Steinhubl, Steven R; Wolski, Kathy E; Topol, Eric J
2004-08-24
Unfractionated heparin (UFH) is the most widely used antithrombin during percutaneous coronary intervention (PCI). Despite significant pharmacological and mechanical advancements in PCI, uncertainty remains about the optimal activated clotting time (ACT) for prevention of ischemic or hemorrhagic complications. We analyzed the outcome of all UFH-treated patients enrolled in 4 large, contemporary PCI trials with independent adjudication of ischemic and bleeding events. Of 9974 eligible patients, maximum ACT was available in 8369 (84%). The median ACT was 297 seconds (interquartile range 256 to 348 seconds). The incidence of death, myocardial infarction, or revascularization at 48 hours, by ACT quartile, was 6.2%, 6.8%, 6.0%, and 5.7%, respectively (P=0.40 for trend). The covariate-adjusted rate of ischemic complications was not correlated with maximal procedural ACT (continuous value, P=0.29). Higher doses of UFH (>5000 U, or up to 90 U/kg) were independently associated with higher rates of events. The incidence of major or minor bleeding at 48 hours, by ACT quartile, was 2.9%, 3.5%, 3.8%, and 4.0%, respectively (P=0.04 for trend). In a multivariable logistic model with a spline transformation for ACT, there was a linear increase in the risk of bleeding as the ACT approached 365 seconds (P=0.01), which leveled off beyond that value. Increasing UFH weight-indexed dose was independently associated with higher bleeding rates (OR 1.04 [1.02 to 1.07] for each 10 U/kg, P=0.001). In patients undergoing PCI with frequent use of stents and potent platelet inhibition, ACT does not correlate with ischemic complications and has a modest association with bleeding complications, driven mainly by minor bleeding. Lower ACT values do not appear to compromise efficacy while improving safety.
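A "logistic model with a spline transformation" of a continuous predictor can be approximated with a B-spline basis in a statsmodels formula. The sketch below uses simulated illustrative data shaped like the abstract's finding (bleeding risk rising with ACT, flattening near 365 s); it is not the study's model or data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data: bleeding log-odds rise with ACT up to ~365 s, then plateau.
rng = np.random.default_rng(0)
act = rng.uniform(200, 450, 500)
log_odds = -6.0 + 0.015 * np.minimum(act, 365)
bleed = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))
df = pd.DataFrame({"bleed": bleed, "act_sec": act})

# bs() from patsy expands ACT into a B-spline basis inside the formula,
# approximating the paper's spline transformation for ACT.
model = smf.logit("bleed ~ bs(act_sec, df=3)", data=df).fit()
print(model.summary())
```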
Ting, Joseph Y; Resende, Maura; More, Kiran; Nicholls, Donna; Weisz, Dany E; El-Khuffash, Afif; Jain, Amish; McNamara, Patrick J
2016-08-01
The postoperative course of preterm babies undergoing surgical closure of a patent ductus arteriosus (PDA) is often complicated by postligation cardiac syndrome (PLCS). Despite targeted milrinone prophylaxis, some infants continue to experience postoperative respiratory deterioration. Our objective was to describe the immediate postoperative course and identify risk factors for respiratory instability when preterm infants undergoing PDA ligation are managed with targeted milrinone treatment. A retrospective review of a cohort of infants undergoing PDA ligation between January 2010 and August 2013 was conducted. All infants had a targeted neonatal echocardiogram performed 1 hour after surgery. Infants received prophylactic milrinone treatment if the left ventricular output was <200 mL/kg/min. The primary outcome measure was the development of respiratory instability within 24 hours of surgery. Multivariable logistic regression was performed to identify predictors of respiratory instability. Eighty-six infants with a median gestational age of 25 weeks (interquartile range [IQR], 24-26) and a birth weight of 740 g (IQR, 640-853) were included in this study. Forty-nine (57.0%) received milrinone prophylaxis. There were 44 (51.2%) infants who developed oxygenation or ventilation failure, and 7 (8.1%) neonates developed PLCS. Infants with a longer isovolumic relaxation time (IVRT ≥30 milliseconds) were more likely to develop oxygenation or ventilation failure. Although the incidence of PLCS has declined after the introduction of targeted milrinone prophylaxis, many preterm infants continue to develop respiratory instability after surgical ligation. In this population, diastolic dysfunction manifested by prolonged IVRT could be associated with an adverse postoperative respiratory course. Copyright © 2016 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
Kiernan, Michael S; Stevens, Susanna R; Tang, W H Wilson; Butler, Javed; Anstrom, Kevin J; Birati, Edo Y; Grodin, Justin L; Gupta, Divya; Margulies, Kenneth B; LaRue, Shane; Dávila-Román, Victor G; Hernandez, Adrian F; de Las Fuentes, Lisa
2018-03-01
Poor response to loop diuretic therapy is a marker of risk during heart failure hospitalization. We sought to describe baseline determinants of diuretic response and to further explore the relationship between this response and clinical outcomes. Patient data from the National Heart, Lung, and Blood Institute Heart Failure Network ROSE-AHF and CARRESS-HF clinical trials were analyzed to determine baseline determinants of diuretic response. Diuretic efficiency (DE) was defined as total 72-hour fluid output per total equivalent loop diuretic dose. Data from DOSE-AHF were then used to determine whether these predictors of DE correlated with response to a high- versus low-dose diuretic strategy. At 72 hours, the high-DE group had a median fluid output of 9071 mL (interquartile range: 7240-11775) with a median furosemide dose of 320 mg (220-480), compared with 8030 mL (6300-9915) and 840 mg (600-1215), respectively, for the low-DE group. Cystatin C was independently associated with DE (odds ratio 0.36 per 1 mg/L increase; 95% confidence interval: 0.24-0.56; P < 0.001). Independent of baseline characteristics, reduced fluid output, weight loss, and DE were each associated with increased 60-day mortality. Among patients with estimated glomerular filtration rate below the median, those randomized to a high-dose strategy had improved symptoms compared with those randomized to a low-dose strategy. Elevated baseline cystatin C, as a biomarker of renal dysfunction, is associated with reduced diuretic response during heart failure hospitalization. Higher loop diuretic doses are required for therapeutic decongestion in patients with renal insufficiency. Poor response identifies a high-risk population. Copyright © 2018 Elsevier Inc. All rights reserved.
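The DE metric defined above is a simple ratio; the sketch below applies it to the medians reported in the abstract to make the contrast between groups concrete.

```python
def diuretic_efficiency(fluid_output_ml, loop_diuretic_mg):
    """Diuretic efficiency (DE): total 72-hour fluid output per total
    equivalent loop diuretic dose, as defined in the abstract."""
    return fluid_output_ml / loop_diuretic_mg

# Median values reported for the high- and low-DE groups at 72 hours.
print(diuretic_efficiency(9071, 320))  # high-DE group: ~28 mL per mg furosemide
print(diuretic_efficiency(8030, 840))  # low-DE group:  ~10 mL per mg furosemide
```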
Moon, Katherine A; Rule, Ana M; Magid, Hoda S; Ferguson, Jacqueline M; Susan, Jolie; Sun, Zhuolu; Torrey, Christine; Abubaker, Salahaddin; Levshin, Vladimir; Çarkoglu, Asli; Radwan, Ghada Nasr; El-Rabbat, Maha; Cohen, Joanna E; Strickland, Paul; Breysse, Patrick N; Navas-Acien, Ana
2018-03-06
Most smoke-free legislation to reduce secondhand smoke (SHS) exposure exempts waterpipe (hookah) smoking venues. Few studies have examined SHS exposure in waterpipe venues and among their employees. We surveyed 276 employees of 46 waterpipe tobacco venues in Istanbul, Moscow, and Cairo. We interviewed venue managers and employees and collected biological samples from employees to measure exhaled carbon monoxide (CO), hair nicotine, saliva cotinine, urine cotinine, urine 4-(methylnitrosamino)-1-(3-pyridyl)-1-butanol (NNAL), and urine 1-hydroxypyrene glucuronide (1-OHPG). We estimated adjusted geometric mean ratios (GMR) of each SHS biomarker by employee characteristics and indoor air SHS measures. There were 73 nonsmoking employees and 203 current smokers of cigarettes or waterpipe. In nonsmokers, the median (interquartile range) concentrations of SHS biomarkers were 1.1 (0.2, 40.9) µg/g creatinine urine cotinine, 5.5 (2, 15) ng/mL saliva cotinine, 0.95 (0.36, 5.02) ng/mg hair nicotine, 1.48 (0.98, 3.97) pg/mg creatinine urine NNAL, 0.54 (0.25, 0.97) pmol/mg creatinine urine 1-OHPG, and 1.67 (1.33, 2.33) ppm exhaled CO. An 8-hour increase in work hours was associated with higher urine cotinine (GMR: 1.68, 95% CI: 1.20, 2.37) and hair nicotine (GMR: 1.22, 95% CI: 1.05, 1.43). Lighting waterpipes was associated with higher saliva cotinine (GMR: 2.83, 95% CI: 1.05, 7.62). Nonsmoking employees of waterpipe tobacco venues were exposed to high levels of SHS, including measurable levels of carcinogenic biomarkers (tobacco-specific nitrosamines and PAHs). Smoke-free regulation should be extended to waterpipe venues to protect nonsmoking employees and patrons from the adverse health effects of SHS.
Mustanoja, Satu; Putaala, Jukka; Gordin, Daniel; Tulkki, Lauri; Aarnio, Karoliina; Pirinen, Jani; Surakka, Ida; Sinisalo, Juha; Lehto, Mika; Tatlisumak, Turgut
2016-06-01
High blood pressure (BP) in acute stroke has been associated with a poor outcome; however, this has not been evaluated in young adults. The relationship between BP and long-term outcome was assessed in 1004 consecutive young, first-ever ischemic stroke patients aged 15 to 49 years enrolled in the Helsinki Young Stroke Registry. BP parameters included systolic (SBP) and diastolic BP, pulse pressure, and mean arterial pressure at admission and at 24 hours. The primary outcome measure was recurrent stroke in the long-term follow-up. Adjusted for demographics and preexisting comorbidities, Cox regression models were used to assess independent BP parameters associated with outcome. Of our patients (63% male), 393 (39%) had prestroke hypertension and 358 (36%) used antihypertensive treatment. The median follow-up period was 8.9 years (interquartile range 5.7-13.2). Patients with a recurrent stroke (n=142, 14%) had significantly higher admission SBP, diastolic BP, pulse pressure, and mean arterial pressure (P<0.001), as well as higher 24-hour SBP, diastolic BP, and mean arterial pressure, compared with patients without recurrent stroke. Patients with SBP ≥160 mm Hg compared with those with SBP <160 mm Hg had significantly more recurrent strokes (hazard ratio 3.3 [95% confidence interval, 2.05-4.55]; P<0.001), occurring earlier (13.9 years [13.0-14.6] versus 16.2 [15.8-16.6]; P<0.001) within the follow-up period. In multivariable analyses, higher admission SBP, diastolic BP, pulse pressure, and mean arterial pressure were independently associated with the risk of recurrent stroke, whereas the 24-hour BP levels were not. In young ischemic stroke patients, high acute-phase BP levels are independently associated with a high risk of recurrent stroke. © 2016 American Heart Association, Inc.
Friedman, Benjamin W; Adewunmi, Victoria; Campbell, Caron; Solorzano, Clemencia; Esses, David; Bijur, Polly E; Gallagher, E John
2013-10-01
We compared metoclopramide 20 mg intravenously, combined with diphenhydramine 25 mg intravenously, with ketorolac 30 mg intravenously in adults with nonmigraine, noncluster recurrent headaches, including tension-type headache. In this emergency department (ED)-based randomized, double-blind study, we enrolled adults with nonmigraine, noncluster recurrent headaches. Patients with tension-type headache were a subgroup of special interest. Our primary outcome was a comparison of the improvement in pain score between baseline and 1 hour later, assessed on a 0 to 10 verbal scale. We defined a between-group difference of 2.0 as the minimum clinically significant difference. Secondary endpoints included need for rescue medication in the ED, achieving headache freedom in the ED and sustaining it for 24 hours, and the patient's desire to receive the same medication again. We included 120 patients in the analysis. The metoclopramide/diphenhydramine arm improved by a median of 5 (interquartile range [IQR] 3, 7) scale units, whereas the ketorolac arm improved by a median of 3 (IQR 2, 6) (95% confidence interval [CI] for difference 0 to 3). Metoclopramide+diphenhydramine was superior to ketorolac for all 3 secondary outcomes: the number needed to treat for not requiring ED rescue medication was 3 (95% CI 2 to 6); for sustained headache freedom, 6 (95% CI 3 to 20); and for wish to receive the same medication again, 7 (95% CI 4 to 65). Tension-type headache subgroup results were similar. For adults who presented to an ED with tension-type headache or with nonmigraine, noncluster recurrent headache, intravenous metoclopramide+diphenhydramine provided more headache relief than intravenous ketorolac. Copyright © 2013 American College of Emergency Physicians. Published by Mosby, Inc. All rights reserved.
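The number needed to treat (NNT) quoted above is the reciprocal of the absolute difference in event rates between arms. The abstract reports the NNTs directly; the proportions below are hypothetical, chosen only to show the arithmetic.

```python
# Hypothetical success rates (no ED rescue medication needed) per arm.
p_success_metoclopramide = 0.80  # assumed for illustration
p_success_ketorolac = 0.47       # assumed for illustration

absolute_risk_difference = p_success_metoclopramide - p_success_ketorolac
nnt = 1 / absolute_risk_difference
print(f"NNT = {nnt:.1f}")  # ~3, matching the reported NNT for avoiding rescue
```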
Warner, Matthew A.; Woodrum, David A.; Hanson, Andrew C.; Schroeder, Darrell R.; Wilson, Gregory A.; Kor, Daryl J.
2016-01-01
Objective To determine the association between prophylactic plasma transfusion and periprocedural RBC transfusion rates in patients with elevated INR values undergoing interventional radiology procedures. Patients and Methods In this retrospective cohort study, adult patients undergoing interventional radiology procedures with a preprocedural INR available within 30 days of the procedure during the study period of January 1, 2009, to December 31, 2013, were eligible for inclusion. Baseline characteristics, coagulation parameters, transfusion requirements, and procedural details were extracted. Univariate and multivariable propensity-matched analyses were used to assess the relationships between prophylactic plasma transfusion and the outcomes of interest, with an a priori primary outcome of RBC transfusion occurring during the procedure or within the first 24 hours post-procedurally. Results A total of 18,204 participants met inclusion criteria, and 1,803 (9.9%) had an INR ≥ 1.5 prior to their procedure. Among these, 196 patients (10.9%) received prophylactic plasma transfusion, with a median (interquartile range) time between plasma initiation and procedural start of 1.9 (1.1 – 3.2) hours. In multivariable propensity-matched analysis, plasma administration was associated with increased periprocedural RBC transfusions [OR (95% CI) = 2.20 (1.38 – 3.50); P<.001] and postprocedural ICU admission rates [OR (95% CI) = 2.11 (1.41 – 3.14); P<.001] compared to those who were not transfused preprocedurally. Similar relationships were seen at higher INR thresholds for plasma transfusion. Conclusion In patients undergoing interventional radiology procedures, preprocedural plasma transfusions given in the setting of elevated INR values were associated with increased periprocedural RBC transfusions. Additional research is needed to clarify this potential association between preprocedural plasma and periprocedural RBC transfusion. PMID:27492911
Shanir, P P Muhammed; Khan, Kashif Ahmad; Khan, Yusuf Uzzaman; Farooq, Omar; Adeli, Hojjat
2017-12-01
Epilepsy, a neurological disorder of the brain, is widely diagnosed using electroencephalography (EEG). EEG signals are nonstationary in nature and show abnormal neural activity during the ictal period. Seizures can be identified by analyzing the EEG signal and extracting features that detect these abnormal activities. The present work proposes a novel morphological feature extraction technique based on the local binary pattern (LBP) operator. LBP assigns a unique decimal value to a sample point by weighting the binary outcomes obtained from thresholding the neighboring samples against the present sample point. These LBP values assist in capturing the rising and falling edges of the EEG signal, thus providing a morphologically featured discriminating pattern for epilepsy detection. In the present work, the variability in the LBP values is measured by calculating the sum of absolute differences of consecutive LBP values. The interquartile range is calculated over the preprocessed EEG signal to provide a dispersion measure. For classification, a K-nearest neighbor classifier is used, and the performance is evaluated on 896.9 hours of data from the CHB-MIT continuous EEG database. A mean accuracy of 99.7% and mean specificity of 99.8% are obtained, with an average false detection rate of 0.47/h and sensitivity of 99.2% for 136 seizures.
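A generic 1-D LBP feature extractor matching this description is sketched below; the neighborhood size and preprocessing are assumptions, as the abstract does not fix them. In the paper, per-epoch features of this kind feed a K-nearest neighbor classifier (e.g., scikit-learn's KNeighborsClassifier).

```python
import numpy as np

def lbp_1d(x, radius=3):
    """One-dimensional local binary pattern codes for a signal.

    Each sample's `radius` neighbors on either side are thresholded against
    the center sample; the resulting bits, weighted by powers of two, give
    one decimal code per sample. This is a generic 1-D LBP sketch; the
    paper's exact neighborhood definition may differ.
    """
    x = np.asarray(x, dtype=float)
    weights = 2 ** np.arange(2 * radius)
    codes = []
    for i in range(radius, len(x) - radius):
        neighbors = np.concatenate([x[i - radius:i], x[i + 1:i + 1 + radius]])
        bits = (neighbors >= x[i]).astype(int)
        codes.append(int(bits @ weights))
    return np.array(codes)

def epoch_features(epoch):
    """Two features per epoch: LBP variability and signal dispersion."""
    codes = lbp_1d(epoch)
    lbp_variability = int(np.abs(np.diff(codes)).sum())  # sum of absolute differences
    q1, q3 = np.percentile(epoch, [25, 75])
    return [lbp_variability, q3 - q1]                    # [variability, IQR]

# Hypothetical 1-s EEG epoch at 256 Hz; the paper uses CHB-MIT recordings.
rng = np.random.default_rng(1)
print(epoch_features(rng.standard_normal(256)))
```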
Timing of malaria messages for target audience on radio airwaves
2012-01-01
Background Due to the limitations of face-to-face communication in teaching families how to manage, control and prevent malaria, national and local malaria programmes try to reach people through the radio. However, information regarding the timing of radio messages for the target audiences is lacking. Methods Within a large-scale trial (Clinicaltrials.gov: NCT00565071), data regarding the time at which people listen to the radio were collected from 1,628 consenting outpatients (and caregivers for minors) attending six rural government primary level health care centres in Bushenyi and Iganga districts of Uganda from February to July 2011. Results The majority of households, 1,099 (67.5%), owned a radio. The majority of participants, 1,221 (86.3%), had heard about malaria on the radio. Some participants started listening to the radio at about 06.00 East African local time (EAT). The peak hours at which people listen to the radio are 12.00-14.00 and 18.00-23.00 local time. The median time of listening to the radio was 20.00 for men (inter-quartile range (IQR): 18.30-21.00) and 19.30 for women (IQR: 13.00-20.30). Conclusion Planners of malaria radio interventions need to broadcast their messages within the two peak listening periods of 12.00-14.00 and 18.00-23.00 EAT. PMID:22905781
Timing of malaria messages for target audience on radio airwaves.
Batwala, Vincent; Magnussen, Pascal; Mirembe, Justine; Mulogo, Edgar; Nuwaha, Fred
2012-08-20
Due to the limitations of face-to-face communication in teaching families how to manage, control and prevent malaria, national and local malaria programmes try to reach people through the radio. However, information regarding the timing of radio messages for the target audiences is lacking. Within a large-scale trial (Clinicaltrials.gov: NCT00565071), data regarding the time at which people listen to the radio were collected from 1,628 consenting outpatients (and caregivers for minors) attending six rural government primary level health care centres in Bushenyi and Iganga districts of Uganda from February to July 2011. The majority of households, 1,099 (67.5%), owned a radio. The majority of participants, 1,221 (86.3%), had heard about malaria on the radio. Some participants started listening to the radio at about 06.00 East African local time (EAT). The peak hours at which people listen to the radio are 12.00-14.00 and 18.00-23.00 local time. The median time of listening to the radio was 20.00 for men (inter-quartile range (IQR): 18.30-21.00) and 19.30 for women (IQR: 13.00-20.30). Planners of malaria radio interventions need to broadcast their messages within the two peak listening periods of 12.00-14.00 and 18.00-23.00 EAT.
Visualizations of Travel Time Performance Based on Vehicle Reidentification Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Young, Stanley Ernest; Sharifi, Elham; Day, Christopher M.
This paper provides a visual reference of the breadth of arterial performance phenomena based on travel time measures obtained from reidentification technology that has proliferated in the past 5 years. These graphical performance measures take the form of overlay charts and statistical distributions presented as cumulative frequency diagrams (CFDs). With overlays of vehicle travel times from multiple days, dominant traffic patterns over a 24-h period are reinforced and reveal the traffic behavior induced primarily by the operation of traffic control at signalized intersections. A cumulative distribution function, in the statistical literature, provides a method for comparing traffic patterns from various time frames or locations in a compact visual format that provides intuitive feedback on arterial performance. The CFD may be accumulated hourly, by peak periods, or by time periods specific to signal timing plans that are in effect. Combined, overlay charts and CFDs provide visual tools with which to assess the quality and consistency of traffic movement for various periods throughout the day efficiently, without sacrificing detail, which is a typical byproduct of numeric-based performance measures. These methods are particularly effective for comparing before-and-after median travel times, as well as changes in interquartile range, to assess travel time reliability.
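A CFD of travel times is simply an empirical cumulative distribution. The sketch below builds and plots one for hypothetical before/after samples; the distributions and sample sizes are assumptions for illustration.

```python
import numpy as np
import matplotlib.pyplot as plt

def cumulative_frequency(travel_times_s):
    """Empirical CDF: sorted travel times vs. cumulative proportion."""
    x = np.sort(np.asarray(travel_times_s, dtype=float))
    y = np.arange(1, x.size + 1) / x.size
    return x, y

# Hypothetical before/after samples of segment travel times (seconds).
rng = np.random.default_rng(2)
samples = {"before": rng.normal(180, 40, 500), "after": rng.normal(150, 25, 500)}

for label, data in samples.items():
    x, y = cumulative_frequency(data)
    plt.step(x, y, where="post", label=label)
plt.xlabel("travel time (s)")
plt.ylabel("cumulative frequency")
plt.legend()
plt.show()
```

A rightward shift between the two curves reads directly as a change in median travel time, and the horizontal spread between the 25th and 75th percentile crossings shows the change in interquartile range.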
How to Predict Oral Rehydration Failure in Children With Gastroenteritis.
Geurts, Dorien; Steyerberg, Ewout W; Moll, Henriëtte; Oostenbrink, Rianne
2017-11-01
Oral rehydration is the standard in most current guidelines for young children with acute gastroenteritis (AGE). Failure of oral rehydration can complicate the disease course, leading to morbidity due to severe dehydration. We aimed to identify prognostic factors for oral rehydration failure in children with AGE. A prospective, observational study was performed at the Emergency Department, Erasmus Medical Centre, Rotterdam, The Netherlands, 2010-2012, including 802 previously healthy children aged 1 month to 5 years with AGE. Failure of oral rehydration was defined as secondary rehydration via nasogastric tube, or hospitalization or revisit for dehydration, within 72 hours after the initial emergency department visit. We observed 167 (21%) failures of oral rehydration in a population of 802 children with AGE (median age 1.03 years, interquartile range 0.4-2.1; 60% boys). In multivariate logistic regression analysis, independent predictors of failure of oral rehydration were a higher Manchester Triage System urgency level, abnormal capillary refill time, and a higher clinical dehydration scale score. Early recognition of young children with AGE at risk of failing oral rehydration therapy is important, as emphasized by the 21% therapy failure rate in our population. Higher Manchester Triage System urgency level, abnormal capillary refill time, and a higher clinical dehydration scale score are associated with oral rehydration failure.
Albawardi, Nada M; Jradi, Hoda; Almalki, Abdulla A; Al-Hazzaa, Hazzaa M
2017-06-19
Research in Saudi Arabia has revealed a strikingly high prevalence of insufficient physical activity among adults, particularly women. The risk of sedentary behavior will likely increase as the number of women with office-based jobs increases. The aim of this study was to determine the level of sedentary behavior, and its associated factors, among Saudi women working office-based jobs in the city of Riyadh. In a cross-sectional study, 420 Saudi female employees at 8 office-based worksites were measured to determine body mass index and were given a self-administered survey to evaluate their level of physical activity and sedentary behavior. Median sitting time was 690 min per day (interquartile range, IQR 541-870) on work days, with nearly half accumulated during work hours, and 575 min per day (IQR 360-780) on non-work days. Predictors of work-day sitting time were level of education, number of children, and working in the private sector. Number of children, being single, and living in a small home predicted non-work-day sitting time. This study identifies Saudi women in office-based jobs as a high-risk group for sedentary behavior. There is a need to promote physical activity at worksites and reduce prolonged sitting.
Early Fever As a Predictor of Paroxysmal Sympathetic Hyperactivity in Traumatic Brain Injury.
Hinson, Holly E; Schreiber, Martin A; Laurie, Amber L; Baguley, Ian J; Bourdette, Dennis; Ling, Geoffrey S F
Paroxysmal sympathetic hyperactivity (PSH) is characterized by episodic, hyperadrenergic alterations in vital signs after traumatic brain injury (TBI). We sought to apply an objective scale to the vital sign alterations of PSH in order to determine whether 1 element might be predictive of developing PSH. We conducted an observational study of consecutive TBI patients (Glasgow Coma Scale score ≤12) and monitored the cohort for clinical evidence of PSH. PSH was defined as a paroxysm of 3 or more of the following characteristics: (1) tachycardia, (2) tachypnea, (3) hypertension, (4) fever, (5) dystonia (rigidity or decerebrate posturing), and (6) diaphoresis, with no other obvious causation (eg, alcohol withdrawal, sepsis). The Modified Clinical Feature Severity Scale (mCFSS) was applied to each participant once daily for the first 5 days of hospitalization. Nineteen (11%) of the 167 patients met criteria for PSH. Patients with PSH had a higher 5-day cumulative mCFSS score than those without PSH (median [interquartile range] = 36 [29-42] vs 29 [22-35], P = .01). Of the 4 components of the mCFSS, elevated temperature appeared to be most predictive of the development of PSH, especially during the first 24 hours (odds ratio = 1.95; 95% confidence interval, 1.12-3.40). Early fever after TBI may signal impending autonomic dysfunction.
Leelarathna, Lalantha; English, Shane W; Thabit, Hood; Caldwell, Karen; Allen, Janet M; Kumareswaran, Kavita; Wilinska, Malgorzata E; Nodale, Marianna; Haidar, Ahmad; Evans, Mark L; Burnstein, Rowan; Hovorka, Roman
2014-02-01
Accurate real-time continuous glucose measurements may improve glucose control in the critical care unit. We evaluated the accuracy of the FreeStyle® Navigator® (Abbott Diabetes Care, Alameda, CA) subcutaneous continuous glucose monitoring (CGM) device in critically ill adults using two methods of calibration. In a randomized trial, paired CGM and reference glucose (hourly arterial blood glucose [ABG]) were collected over a 48-h period from 24 adults with critical illness (mean±SD age, 60±14 years; mean±SD body mass index, 29.6±9.3 kg/m2; mean±SD Acute Physiology and Chronic Health Evaluation score, 12±4 [range, 6-19]) and hyperglycemia. In 12 subjects, the CGM device was calibrated at variable intervals of 1-6 h using ABG. In the other 12 subjects, the sensor was calibrated according to the manufacturer's instructions (1, 2, 10, and 24 h) using arterial blood and the built-in point-of-care glucometer. In total, 1,060 CGM-ABG pairs were analyzed over the glucose range from 4.3 to 18.8 mmol/L. Using enhanced calibration at a median (interquartile range) interval of every 169 (122-213) min, the absolute relative deviation was lower (7.0% [3.5, 13.0] vs. 12.8% [6.3, 21.8], P<0.001), and the percentage of points in the Clarke error grid Zone A was higher (87.8% vs. 70.2%). Accuracy of the Navigator CGM device during critical illness was comparable to that observed in non-critical care settings. Further significant improvements in accuracy may be obtained by frequent calibrations with ABG measurements.
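The absolute relative deviation (ARD) summarized above is a point-wise percentage error of CGM against the ABG reference. A minimal sketch on hypothetical paired readings (the Clarke error grid analysis is more specialized and not shown):

```python
import numpy as np

def absolute_relative_deviation_pct(cgm, abg):
    """Point-wise absolute relative deviation (%) of paired CGM vs. ABG
    readings, the accuracy metric summarized in the abstract."""
    cgm = np.asarray(cgm, dtype=float)
    abg = np.asarray(abg, dtype=float)
    return 100.0 * np.abs(cgm - abg) / abg

# Hypothetical paired readings in mmol/L (the study analyzed 1,060 pairs).
cgm = [8.1, 10.4, 6.9, 12.2, 9.0]
abg = [7.8, 11.0, 7.5, 12.0, 9.6]
ard = absolute_relative_deviation_pct(cgm, abg)
print(f"median ARD {np.median(ard):.1f}% "
      f"(IQR {np.percentile(ard, 25):.1f}, {np.percentile(ard, 75):.1f})")
```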
El-Kersh, Karim; Ruf, Kathryn M; Smith, J Shaun
There is no standard protocol for intravenous treprostinil dose escalation. In most cases, slow up-titration is performed in the outpatient setting. However, rapid up-titration in an inpatient setting is an alternative that provides the opportunity for aggressive treatment of common side effects experienced during dose escalation. In this study, we describe our experience with inpatient rapid up-titration of intravenous treprostinil. This was a single-center, retrospective study in which we reviewed the data of subjects with pulmonary arterial hypertension treated at our center who underwent inpatient rapid up-titration of intravenous treprostinil. Our treprostinil dose-escalation protocol included initiation at 2 ng/kg/min with subsequent up-titration by 1 ng/kg/min every 6 to 8 hours as tolerated by side effects. A total of 16 subjects were identified. Thirteen subjects were treprostinil naive (naive group), and 3 subjects were receiving subcutaneous treprostinil but were hospitalized for further intravenous up-titration of the treprostinil dose (nonnaive group). In the naive group, the median maximum dose achieved was 20 ng/kg/min with an interquartile range (IQR) of 20-23 ng/kg/min. The median up-titration interval was 6 days (IQR: 4-9). In the nonnaive group, the median maximum dose achieved was 20 ng/kg/min (range: 17-30). The median up-titration interval was 8.5 days (range: 1.5-11). Overall, the median maximum dose achieved was 20 ng/kg/min (IQR: 20-23.5), and the median up-titration interval was 6 days (IQR: 4.6-9.25), with no reported significant adverse hemodynamic events. In patients with pulmonary arterial hypertension, rapid inpatient titration of intravenous treprostinil is safe and tolerable.
Leelarathna, Lalantha; English, Shane W.; Thabit, Hood; Caldwell, Karen; Allen, Janet M.; Kumareswaran, Kavita; Wilinska, Malgorzata E.; Nodale, Marianna; Haidar, Ahmad; Evans, Mark L.; Burnstein, Rowan
2014-01-01
Abstract Objective: Accurate real-time continuous glucose measurements may improve glucose control in the critical care unit. We evaluated the accuracy of the FreeStyle® Navigator® (Abbott Diabetes Care, Alameda, CA) subcutaneous continuous glucose monitoring (CGM) device in critically ill adults using two methods of calibration. Subjects and Methods: In a randomized trial, paired CGM and reference glucose (hourly arterial blood glucose [ABG]) were collected over a 48-h period from 24 adults with critical illness (mean±SD age, 60±14 years; mean±SD body mass index, 29.6±9.3 kg/m2; mean±SD Acute Physiology and Chronic Health Evaluation score, 12±4 [range, 6–19]) and hyperglycemia. In 12 subjects, the CGM device was calibrated at variable intervals of 1–6 h using ABG. In the other 12 subjects, the sensor was calibrated according to the manufacturer's instructions (1, 2, 10, and 24 h) using arterial blood and the built-in point-of-care glucometer. Results: In total, 1,060 CGM–ABG pairs were analyzed over the glucose range from 4.3 to 18.8 mmol/L. Using enhanced calibration median (interquartile range) every 169 (122–213) min, the absolute relative deviation was lower (7.0% [3.5, 13.0] vs. 12.8% [6.3, 21.8], P<0.001), and the percentage of points in the Clarke error grid Zone A was higher (87.8% vs. 70.2%). Conclusions: Accuracy of the Navigator CGM device during critical illness was comparable to that observed in non–critical care settings. Further significant improvements in accuracy may be obtained by frequent calibrations with ABG measurements. PMID:24180327
Atar, Shaul; Tolstrup, Kirsten; Cercek, Bojan; Siegel, Robert J
2007-07-01
Chlamydia pneumoniae has previously been associated with a higher prevalence of valvular and cardiac calcifications. We aimed to investigate a possible association between seropositivity for C. pneumoniae and the presence of cardiac calcifications (mitral annular or aortic root calcification, and aortic valve sclerosis). We retrospectively analyzed serological data (immunoglobulin G TWAR antibodies) from the AZACS trial (Azithromycin in Acute Coronary Syndromes), and correlated the serological findings according to titer levels with the presence of cardiac calcifications as detected by transthoracic echocardiography. In 271 patients, aged 69 ± 13 years, who underwent both serological and echocardiographic evaluation, we found no significant difference in the "calcification sum score" (on a scale of 0-3) between seropositive and seronegative patients (1.56 ± 1.15 vs. 1.35 ± 1.15, respectively, P = 0.26). The median calcification sum score was 1 (interquartile range 0-3) for the seronegative group, and 2 (interquartile range 0-3) for the seropositive group (P = 0.2757). In addition, we did not find a significant correlation between any of the individual sites of cardiac calcification and C. pneumoniae seropositivity. Our findings suggest that past C. pneumoniae infection may not be associated with the pathogenesis of valvular and cardiac calcifications.
Shaw, Leslee J.; Berman, Daniel S.; Picard, Michael H.; Friedrich, Matthias G.; Kwong, Raymond Y.; Stone, Gregg W.; Senior, Roxy; Min, James K.; Hachamovitch, Rory; Scherrer-Crosbie, Marielle; Mieres, Jennifer H.; Marwick, Thomas H.; Phillips, Lawrence M.; Chaudhry, Farooq A.; Pellikka, Patricia A.; Slomka, Piotr; Arai, Andrew E.; Iskandrian, Ami E.; Bateman, Timothy M.; Heller, Gary V.; Miller, Todd D.; Nagel, Eike; Goyal, Abhinav; Borges-Neto, Salvador; Boden, William E.; Reynolds, Harmony R.; Hochman, Judith S.; Maron, David J.; Douglas, Pamela S.
2014-01-01
The lack of standardized reporting of the magnitude of ischemia on noninvasive imaging contributes to variability in translating the severity of ischemia across stress imaging modalities. We identified the risk of coronary artery disease (CAD) death or myocardial infarction (MI) associated with ≥10% ischemic myocardium on stress nuclear imaging as the risk threshold for stress echocardiography and cardiac magnetic resonance. A narrative review revealed that ≥10% ischemic myocardium on stress nuclear imaging was associated with a median rate of CAD death or MI of 4.9%/year (interquartile range: 3.75% to 5.3%). For stress echocardiography, ≥3 newly dysfunctional segments portend a median rate of CAD death or MI of 4.5%/year (interquartile range: 3.8% to 5.9%). Although imprecisely delineated, moderate-severe ischemia on cardiac magnetic resonance may be indicated by ≥4 of 32 stress perfusion defects or ≥3 dobutamine-induced dysfunctional segments. Risk-based thresholds can define equivalent amounts of ischemia across the stress imaging modalities, which will help to translate a common understanding of patient risk on which to guide subsequent management decisions. PMID:24925328
Issar, Tushar; Arnold, Ria; Kwai, Natalie C G; Pussell, Bruce A; Endre, Zoltan H; Poynten, Ann M; Kiernan, Matthew C; Krishnan, Arun V
2018-05-01
To demonstrate construct validity of the Total Neuropathy Score (TNS) in assessing peripheral neuropathy in subjects with chronic kidney disease (CKD). A total of 113 subjects with CKD and 40 matched controls were assessed for peripheral neuropathy using the TNS. An exploratory factor analysis was conducted, and internal consistency of the scale was evaluated using Cronbach's alpha. Construct validity of the TNS was tested by comparing scores between case and control groups. Factor analysis revealed valid item correlations, and internal consistency of the TNS was good, with a Cronbach's alpha of 0.897. Subjects with CKD scored significantly higher on the TNS (CKD: median 6, interquartile range 1-13; controls: median 0, interquartile range 0-1; p < 0.001). Subgroup analysis revealed that construct validity was maintained for subjects with stages 3-5 CKD with and without diabetes. The TNS is a valid measure of peripheral neuropathy in patients with CKD and is the first neuropathy scale to be formally validated in this population. Copyright © 2018 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
A strategy for optimizing staffing to improve the timeliness of inpatient phlebotomy collections.
Morrison, Aileen P; Tanasijevic, Milenko J; Torrence-Hill, Joi N; Goonan, Ellen M; Gustafson, Michael L; Melanson, Stacy E F
2011-12-01
The timely availability of inpatient test results is key to physician satisfaction with the clinical laboratory and, in an institution with a phlebotomy service, may depend on the timeliness of blood collections. In response to safety reports filed for delayed phlebotomy collections, we applied Lean principles to the inpatient phlebotomy service at our institution. Our goal was to improve service without using additional resources by optimizing our staffing model. To evaluate the effect of the new phlebotomy staffing model on the timeliness of inpatient phlebotomy collections, we compared the median time of morning blood collections and the average number of safety reports filed for delayed phlebotomy collections during a 6-month preimplementation period and a 5-month postimplementation period. The median time of morning collections was 17 minutes earlier after implementation (7:42 am preimplementation; interquartile range, 6:27-8:48 am; versus 7:25 am postimplementation; interquartile range, 6:20-8:26 am). The frequency of safety reports filed for delayed collections decreased 80%, from 10.6 per 30 days to 2.2 per 30 days. Reallocating staff to match the pattern of demand for phlebotomy collections throughout the day represents a strategy for improving the performance of an inpatient phlebotomy service.
The Effect of Manual Restraint on Physiological Parameters in Barred Owls (Strix varia).
Doss, Grayson A; Mans, Christoph
2017-03-01
Manual restraint is commonly necessary when working with avian species in medical, laboratory, and field settings. Despite their prevalence, little is known about the stress response in raptorial bird species. To further understand the effect of restraint on the stress response in birds of prey, 12 barred owls (Strix varia) were manually restrained for 15 minutes. Physiological parameters (cloacal temperature, respiratory rate, heart rate) were followed over time and recorded at defined points during the restraint period. Heart rate decreased significantly over the restraint period (mean ± SD change: -73 ± 46 beats/min). Respiratory rate also decreased significantly (median change: -11 breaths/min, interquartile range: -8 to -18). Cloacal temperature increased significantly over time in manually restrained owls (median: +1.5°C [+2.7°F], interquartile range: 1.3°C-2.1°C [2.3°F-3.8°F]). This study is the first to document stress hyperthermia in an owl species. Similar to another raptorial bird, the red-tailed hawk (Buteo jamaicensis), both heart rate and respiratory rate decreased and cloacal temperature increased over time in restrained barred owls. Barred owls appear to cope differently with restraint stress compared with psittacine species.
Differences in sleep architecture between left and right temporal lobe epilepsy.
Nakamura, Miki; Jin, Kazutaka; Kato, Kazuhiro; Itabashi, Hisashi; Iwasaki, Masaki; Kakisaka, Yosuke; Nakasato, Nobukazu
2017-01-01
To investigate whether seizure lateralization affects sleep macrostructure in patients with left and right temporal lobe epilepsy (TLE), as rapid eye movement (REM) sleep is shorter in patients with right hemispheric cerebral infarction than with left. We retrospectively analyzed data from 16 patients with TLE (6 men and 10 women aged 34.9 ± 11.4 years) who underwent polysomnography as well as long-term video electroencephalography. Ten patients were diagnosed with left TLE and six patients with right TLE. Sleep stages and respiratory events were scored based on the American Academy of Sleep Medicine criteria. Sleep and respiratory parameters were compared between the patient groups. Percentage of REM stage sleep was significantly (p < 0.05) lower in patients with left TLE (median 8.8 %, interquartile range 5.5-13.8 %) than in patients with right TLE (median 17.0 %, interquartile range 14.1-18.3 %). The other parameters showed no significant differences. Shorter REM sleep in patients with left TLE sharply contrasts with the previous report of shorter REM sleep in patients with right cerebral infarction. Laterality of the irritative epileptic focus versus destructive lesion may have different effects on the sleep macrostructures.
van der Woude, Olga C P; Cuper, Natascha J; Getrouw, Chavalleh; Kalkman, Cor J; de Graaff, Jurgen C
2013-06-01
Poor vein visibility can make IV cannulation challenging in children with dark skin color. In the operating room, we studied the effectiveness of a near-infrared vascular imaging device (VascuLuminator) in facilitating IV cannulation in children with dark skin color. In the operating room of a general hospital in Curacao, all consecutive children (0-15 years of age) requiring IV cannulation were included in a pragmatic cluster-randomized clinical trial. The VascuLuminator was made available to anesthesiologists at the operating complex in randomized clusters of 1 week. Success at first attempt was 63% (27/43; 95% confidence interval [CI], 47%-77%) in the VascuLuminator group vs 51% (23 of 45 patients; 95% CI, 36%-66%) in the control group (P = 0.27). Median time to successful cannulation was 53 seconds (interquartile range: 34-154) in the VascuLuminator group and 68 seconds (interquartile range: 40-159) in the control group (P = 0.54); the hazard ratio was 1.12 (95% CI, 0.73-1.71). The VascuLuminator was of limited value in improving first-attempt success of IV cannulation in children with dark skin color.
Multidrug-resistant tuberculosis around the world: what progress has been made?
Mirzayev, Fuad; Wares, Fraser; Baena, Inés Garcia; Zignol, Matteo; Linh, Nguyen; Weyer, Karin; Jaramillo, Ernesto; Floyd, Katherine; Raviglione, Mario
2015-01-01
Multidrug-resistant tuberculosis (MDR-TB), defined as resistance to at least isoniazid and rifampicin, will influence the future of global TB control. Of estimated MDR-TB cases, 88% occur in middle- or high-income countries, and 60% occur in Brazil, China, India, the Russian Federation and South Africa. The World Health Organization collects country data annually to monitor the response to MDR-TB. Notification, treatment enrolment and outcome data were summarised for 30 countries, accounting for >90% of the estimated MDR-TB cases among notified TB cases worldwide. In 2012, a median of 14% (interquartile range 6–50%) of estimated MDR-TB cases were notified in the 30 countries studied. In 15 of the 30 countries, the number of patients treated for MDR-TB in 2012 (71,681) was >50% higher than in 2011. Median treatment success was 53% (interquartile range 40–70%) in the 25 countries reporting data for 30,021 MDR-TB cases who started treatment in 2010. Although progress has been noted in the expansion of MDR-TB care, urgent efforts are required in order to provide wider access to diagnosis and treatment in most countries with the highest burden of MDR-TB. PMID:25261327
Use of Intranasal Dexmedetomidine as a Solo Sedative for MRI of Infants.
Olgun, Gokhan; Ali, Mir Hyder
2018-01-23
Dexmedetomidine, a selective α-2 receptor agonist, can be delivered via the intranasal (IN) route and used for procedural sedation. The drug's favorable hemodynamic profile and relative ease of administration make it a promising agent for sedation during radiologic procedures, although few studies have evaluated its efficacy for MRI. A retrospective chart review was performed between June 2014 and December 2016. Outpatients between 1 and 12 months of age who received 4 μg/kg of IN dexmedetomidine for MRI were included in the analysis. Our aim with this study was to determine the rate of successful completion of the sedation procedure without the need for a rescue drug (other than repeat IN dexmedetomidine). A total of 52 subjects were included in our study. Median (interquartile range) patient age was 7 (5-8) months. Median (interquartile range) procedure length was 40 (35-50) minutes. The overall success rate (including first dose and any rescue IN dose) of dexmedetomidine was 96.2%. None of the patients had significant adverse effects related to dexmedetomidine. IN dexmedetomidine is an effective solo sedative agent for MRI in infants. Copyright © 2018 by the American Academy of Pediatrics.
Riveros-Perez, Efrain; Wood, Cristina
2018-03-01
To assess the management and maternal outcomes of placenta accreta spectrum (PAS) disorders. A retrospective chart review was conducted of patients diagnosed with PAS disorders (placenta creta, increta, or percreta) who were treated at a US tertiary care center between February 1, 2011, and January 31, 2016. Obstetric management, anesthetic management, and maternal outcomes were analyzed. A total of 43 cases were identified; placenta previa was diagnosed among 33 (77%). Median age was 33 years (range 23-42). Median blood loss was 1500 mL (interquartile range 1000-2500); blood loss was greatest among the 10 patients with placenta percreta (3250 mL, interquartile range 2200-6000). Transfusion of blood products was necessary among 14 (33%) patients, with no difference in frequency according to the degree of placental invasion (P=0.107). Surgical complications occurred among 10 (23%) patients. Overall, 30 (70%) patients received combined spinal-epidural plus general anesthesia, 4 (9%) received only general anesthesia, and 9 (21%) underwent surgery with combined spinal-epidural anesthesia. One patient had a difficult airway and another had an accidental dural puncture. Placenta previa and accreta coexist in many patients, leading to substantial bleeding related to the degree of myometrial invasion. An interdisciplinary team approach, with combined spinal-epidural anesthesia transitioning to general anesthesia, was advisable and safe. © 2017 International Federation of Gynecology and Obstetrics.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Wilcoxon-Mann-Whitney Test (a) When n and m are less than 21, use Table 1. In order to find the appropriate... trigger (Step 3). The interquartile range (R) is the difference between the quartiles M-1 and M1; these... baseline observations were obtained, calculate the median (M) of all baseline observations: Instructions...
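The regulatory text above survives only as truncated fragments, but the quantities it references are standard: the median (M) of all baseline observations and the interquartile range (R) as the difference between the upper and lower quartiles. Below is a minimal Python sketch of that computation, using illustrative data and NumPy's default quartile interpolation; the regulation's own quartile definitions and its Table 1 lookup for the Wilcoxon-Mann-Whitney test are not reproduced here.

import numpy as np

# Hypothetical baseline observations; values are illustrative only.
baseline = np.array([3.1, 4.7, 2.9, 5.2, 4.1, 3.8, 6.0, 4.4])

M = np.median(baseline)                      # median of all baseline observations
q1, q3 = np.percentile(baseline, [25, 75])   # lower and upper quartiles
R = q3 - q1                                  # interquartile range

print(f"M = {M:.2f}, R = {R:.2f}")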
An Initial Evaluation of the Impact of Pokémon GO on Physical Activity.
Xian, Ying; Xu, Hanzhang; Xu, Haolin; Liang, Li; Hernandez, Adrian F; Wang, Tracy Y; Peterson, Eric D
2017-05-16
Pokémon GO is a location-based augmented reality game. Using GPS and the camera on a smartphone, the game requires players to travel in the real world to capture animated creatures, called Pokémon. We examined the impact of Pokémon GO on physical activity (PA). A pre-post observational study of 167 Pokémon GO players who were self-enrolled through recruitment flyers or online social media was performed. Participants were instructed to provide screenshots of their step counts recorded by the iPhone Health app between June 15 and July 31, 2016, which was 3 weeks before and 3 weeks after the Pokémon GO release date. Of 167 participants, the median age was 25 years (interquartile range, 21-29 years). The daily average steps of participants at baseline was 5678 (SD, 2833; median, 5718 [interquartile range, 3675-7279]). After initiation of Pokémon GO, daily activity rose to 7654 steps (SD, 3616; median, 7232 [interquartile range, 5041-9744]; pre-post change: 1976; 95% CI, 1494-2458, or a 34.8% relative increase [P < 0.001]). On average, every 10 000 "XP" points (a measure of game progression) was associated with 2134 additional steps per day (95% CI, 1673-2595), suggesting a potential dose-response relationship. The proportion of participants achieving a goal of 10 000+ steps per day increased from 15.3% before to 27.5% after (odds ratio, 2.06; 95% CI, 1.70-2.50). Increased PA was also observed in subgroups, with the largest increases seen in participants who spent more time playing Pokémon GO, those who were overweight/obese, or those with a lower baseline PA level. Pokémon GO participation was associated with a significant increase in PA among young adults. Incorporating PA into gameplay may provide an alternative way to promote PA in persons who are attracted to the game. URL: http://www.clinicaltrials.gov. Unique identifier: NCT02888314. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.
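As context for the pre-post estimate above, the following sketch shows one way to compute a mean paired change in daily steps with a t-based 95% CI. The arrays are simulated stand-ins generated to resemble the reported summary statistics, not the study's data, and the t-interval is an assumed method rather than the authors' stated one.

import numpy as np
from scipy import stats

# Simulated paired daily-step averages for 167 players (illustrative only).
rng = np.random.default_rng(0)
pre = rng.normal(5678, 2833, size=167)
post = pre + rng.normal(1976, 2500, size=167)

diff = post - pre                # per-player pre-post change in daily steps
mean_change = diff.mean()
lo, hi = stats.t.interval(0.95, df=diff.size - 1,
                          loc=mean_change, scale=stats.sem(diff))
print(f"pre-post change: {mean_change:.0f} steps (95% CI {lo:.0f} to {hi:.0f})")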
Patel, Priyesh A; Liang, Li; Khazanie, Prateeti; Hammill, Bradley G; Fonarow, Gregg C; Yancy, Clyde W; Bhatt, Deepak L; Curtis, Lesley H; Hernandez, Adrian F
2016-07-01
Diabetes mellitus, heart failure (HF), and chronic kidney disease are common comorbidities, but overall use and safety of antihyperglycemic medications (AHMs) among patients with these comorbidities are poorly understood. Using Get With the Guidelines-Heart Failure and linked Medicare Part D data, we assessed AHM use within 90 days of hospital discharge among HF patients with diabetes mellitus discharged from Get With the Guidelines-Heart Failure hospitals between January 1, 2006, and October 1, 2011. We further summarized use by renal function and assessed renal contraindicated AHM use for patients with estimated glomerular filtration rate <30 mL/min/1.73 m2. Among 8791 patients meeting inclusion criteria, the median age was 77 years (interquartile range 71-83), 62.3% were female, median body mass index was 29.7 (interquartile range 25.5-35.3), median hemoglobin A1c was 6.8 (interquartile range 6.2-7.8), and 34% had ejection fraction <40%. Overall, 74.9% of patients filled a prescription for an AHM, with insulin (39.5%), sulfonylureas (32.4%), and metformin (17%) being the most commonly used AHMs. Insulin use was higher and sulfonylurea/metformin use was lower among patients with lower renal function classes. Among 1512 patients with estimated glomerular filtration rate <30 mL/min/1.73 m2, 35.4% filled prescriptions for renal contraindicated AHMs per prescribing information, though there was a trend toward lower renal contraindicated AHM use over time (Cochran-Mantel-Haenszel row-mean score test P=0.048). Although use of other AHMs was low overall, thiazolidinediones were used in 6.6% of HF patients, and dipeptidyl peptidase-4 inhibitors were used in 5.1%, with trends for decreasing thiazolidinedione use and increasing dipeptidyl peptidase-4 inhibitor use over time (P<0.001). Treatment of diabetes mellitus in patients with HF and chronic kidney disease is complex, and these patients are commonly treated with renal contraindicated AHMs, including over 6% receiving a thiazolidinedione, despite known concerns regarding HF. More research regarding safety and efficacy of various AHMs among HF patients is needed. © 2016 American Heart Association, Inc.
Zifan, Ali; Ledgerwood-Lee, Melissa; Mittal, Ravinder K
2016-12-01
Three-dimensional high-definition anorectal manometry (3D-HDAM) is used to assess anal sphincter function; it determines profiles of regional pressure distribution along the length and circumference of the anal canal. There is no consensus, however, on the best way to analyze data from 3D-HDAM to distinguish healthy individuals from persons with sphincter dysfunction. We developed a computer analysis system to analyze 3D-HDAM data and to aid in the diagnosis and assessment of patients with fecal incontinence (FI). In a prospective study, we performed 3D-HDAM analysis of 24 asymptomatic healthy subjects (control subjects; all women; mean age, 39 ± 10 years) and 24 patients with symptoms of FI (all women; mean age, 58 ± 13 years). Patients completed a standardized questionnaire (FI severity index) to score the severity of FI symptoms. We developed and evaluated a robust prediction model to distinguish patients with FI from control subjects using linear discriminant, quadratic discriminant, and logistic regression analyses. In addition to collecting pressure information from the HDAM data, we assessed regional features based on shape characteristics and the anal sphincter pressure symmetry index. The combination of pressure values, anal sphincter area, and reflective symmetry values distinguished patients with FI from control subjects with an area under the curve (AUC) value of 1.0. In logistic regression analyses using different predictors, the model identified patients with FI with an AUC value of 0.96 (interquartile range, 0.22). In discriminant analysis, results were classified with a minimum error of 0.02, calculated using 10-fold cross-validation; different combinations of predictors produced median classification errors of 0.16 in linear discriminant analysis (interquartile range, 0.25) and 0.08 in quadratic discriminant analysis (interquartile range, 0.25). We developed and validated a novel prediction model to analyze 3D-HDAM data. This system can accurately distinguish patients with FI from control subjects. Copyright © 2016 AGA Institute. Published by Elsevier Inc. All rights reserved.
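A minimal sketch of the classification workflow described above (linear discriminant, quadratic discriminant, and logistic regression, each scored by 10-fold cross-validation). The features are synthetic stand-ins for the pressure, area, and symmetry predictors, and scikit-learn defaults are assumed rather than the authors' exact implementation.

from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic two-class data standing in for FI patients vs control subjects.
X, y = make_classification(n_samples=48, n_features=3, n_informative=3,
                           n_redundant=0, random_state=1)

for name, model in [("LDA", LinearDiscriminantAnalysis()),
                    ("QDA", QuadraticDiscriminantAnalysis()),
                    ("logistic", LogisticRegression(max_iter=1000))]:
    acc = cross_val_score(model, X, y, cv=10)     # 10-fold CV accuracy
    print(f"{name}: mean classification error = {1 - acc.mean():.2f}")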
Gao, S; Sun, F-K; Fan, Y-C; Shi, C-H; Zhang, Z-H; Wang, L-Y; Wang, K
2015-08-01
Glutathione-S-transferase P1 (GSTP1) methylation has been demonstrated to be associated with oxidative stress induced liver damage in acute-on-chronic hepatitis B liver failure (ACHBLF). To evaluate the methylation level of GSTP1 promoter in acute-on-chronic hepatitis B liver failure and determine its predictive value for prognosis. One hundred and five patients with acute-on-chronic hepatitis B liver failure, 86 with chronic hepatitis B (CHB) and 30 healthy controls (HC) were retrospectively enrolled. GSTP1 methylation level in peripheral mononuclear cells (PBMC) was detected by MethyLight. Clinical and laboratory parameters were obtained. GSTP1 methylation levels were significantly higher in patients with acute-on-chronic hepatitis B liver failure (median 16.84%, interquartile range 1.83-59.05%) than those with CHB (median 1.25%, interquartile range 0.48-2.47%; P < 0.01) and HC (median 0.80%, interquartile range 0.67-1.27%; P < 0.01). In acute-on-chronic hepatitis B liver failure group, nonsurvivors showed significantly higher GSTP1 methylation levels (P < 0.05) than survivors. GSTP1 methylation level was significantly correlated with total bilirubin (r = 0.29, P < 0.01), prothrombin time activity (r = -0.24, P = 0.01) and model for end-stage liver disease (MELD) score (r = 0.26, P = 0.01). When used to predict 1- or 2-month mortality of acute-on-chronic hepatitis B liver failure, GSTP1 methylation showed significantly better predictive value than MELD score [area under the receiver operating characteristic curve (AUC) 0.89 vs. 0.72, P < 0.01; AUC 0.83 vs. 0.70, P < 0.05 respectively]. Meanwhile, patients with GSTP1 methylation levels above the cut-off points showed significantly poorer survival than those below (P < 0.05). Aberrant GSTP1 promoter methylation exists in acute-on-chronic hepatitis B liver failure and shows high predictive value for short-term mortality. It might serve as a potential prognostic marker for acute-on-chronic hepatitis B liver failure. © 2015 John Wiley & Sons Ltd.
Gupta, Mihir; Miller, Christopher J; Baker, Jason V; Lazar, Jason; Bogner, Johannes R; Calmy, Alexandra; Soliman, Elsayed Z; Neaton, James D
2013-03-01
We assessed the relation of inflammatory and coagulation biomarkers with electrocardiographic (ECG) evidence of myocardial ischemia. High-sensitivity C-reactive protein (hsCRP), interleukin-6 (IL-6), and D-dimer levels were measured at study entry for 3,085 human immunodeficiency virus-infected participants (mean age 44 years; 26.4% women; 24.6% black) in the Strategies for Management of Antiretroviral Therapy trial. Logistic regression models were used to examine the associations of these biomarkers with prevalent and incident myocardial ischemia. The latter analyses were performed for 1,411 participants who were randomly assigned to receive continuous antiretroviral therapy during follow-up to suppress the human immunodeficiency virus viral load and had ≥1 ECG reading during the follow-up period. The median hsCRP, IL-6, and D-dimer levels were 1.65 μg/ml (interquartile range 0.69 to 4.11), 1.60 pg/ml (interquartile range 1.00 to 2.75), and 0.18 μg/ml (interquartile range 0.11 to 0.32), respectively. At baseline, the prevalence of major or minor Q-QS or ST-T ECG abnormalities was 18.6%. The biomarker levels were associated with prevalent major or minor ischemic abnormalities in univariate analyses; however, adjustment for traditional risk factors attenuated these associations. The adjusted odds ratio for major or minor ischemic abnormalities for the greatest versus lowest quartile was 1.3 (95% confidence interval 0.9 to 1.7) for hsCRP, 1.0 (95% confidence interval 0.7 to 1.3) for IL-6, and 1.1 (95% confidence interval 0.9 to 1.5) for D-dimer. During a median follow-up of 2.3 years, new definite or probable ischemic ECG abnormalities developed in 11.7% of participants receiving continuous antiretroviral therapy. Biomarker levels were not associated with incident abnormalities in unadjusted or adjusted analyses. In conclusion, higher levels of hsCRP, IL-6, and D-dimer were not associated with ischemic ECG abnormalities. Elevated biomarker levels and ECG abnormalities indicating myocardial ischemia might reflect different risk pathways for cardiovascular disease. Copyright © 2013 Elsevier Inc. All rights reserved.
Li, Debbie; Baxter, Nancy N; McLeod, Robin S; Moineddin, Rahim; Wilton, Andrew S; Nathens, Avery B
2014-12-01
There is increasing evidence to support the use of percutaneous abscess drainage, laparoscopy, and primary anastomosis in managing acute diverticulitis. The aim of this study was to evaluate how practices have evolved and to determine the effects on clinical outcomes. This is a population-based retrospective cohort study using administrative discharge data. This study was conducted in Ontario, Canada. All patients had been hospitalized for a first episode of acute diverticulitis (2002-2012). Temporal changes in treatment strategies and outcomes were evaluated by using the Cochran-Armitage test for trends. Multivariable logistic regression with generalized estimating equations was used to test for trends while adjusting for patient characteristics. There were 18,543 patients hospitalized with a first episode of diverticulitis, median age 60 years (interquartile range, 48-74). From 2002 to 2012, there was an increase in the proportion of patients admitted with complicated disease (abscess, perforation), 32% to 38%, yet a smaller proportion underwent urgent operation, 28% to 16% (all p < 0.001). The use of percutaneous drainage increased from 1.9% of admissions in 2002 to 3.3% in 2012 (p < 0.001). After adjusting for changes in patient and disease characteristics over time, the odds of urgent operation decreased by a factor of 0.87 per year (95% CI, 0.85-0.89). In those undergoing urgent surgery (n = 3873), the use of laparoscopy increased (9% to 18%, p < 0.001), whereas the use of the Hartmann procedure remained unchanged (64%). During this time, in-hospital mortality decreased (2.7% to 1.9%), as did the median length of stay (5 days, interquartile range, 3-9; to 3 days, interquartile range, 2-6; p < 0.001). There is the potential for residual confounding, because clinical parameters available for risk adjustment were limited to fields existing within administrative data. There has been an increase in the use of nonoperative and minimally invasive strategies in treating patients with a first episode of acute diverticulitis. However, the Hartmann procedure remains the most frequently used urgent operative approach. Mortality and length of stay have improved during this time.
Effect of an independent-capacity protocol on overcrowding in an urban emergency department.
Cha, Won Chul; Shin, Sang Do; Song, Kyoung Jun; Jung, Sung Koo; Suh, Gil Joon
2009-12-01
The authors hypothesized that a new strategy, termed the independent-capacity protocol (ICP), which was defined as primary stabilization at the emergency department (ED) and utilization of community resources via transfer to local hospitals, would reduce ED overcrowding without requiring additional hospital resources. This is a before-and-after trial that included all patients who visited an urban, tertiary care ED in Korea from July 2006 to June 2008. To improve ED throughput, introduction of the ICP gave emergency physicians (EPs) more responsibility and authority over patient disposition, even when the patients belonged to another specific clinical department. The ICP utilizes the ED as a temporary, nonspecific place that cares for any patient for a limited time period. Within 48 hours, EPs, associated specialists, and transfer coordinators perform secondary assessment and determine patient disposition. If the hospital is full and cannot admit these patients after 48 hours, the EP and transfer coordinators move the patients to other appropriate community facilities. We collected clinical data such as sex, age, diagnosis, and treatment. The main outcomes included ED length of stay (LOS), the numbers of admissions to inpatient wards, and the mortality rate. A total of 87,309 patients were included. The median number of daily patients was 114 (interquartile range [IQR] = 104 to 124) in the control phase and 124 (IQR = 112 to 135) in the ICP phase. The mean ED LOS decreased from 15.1 hours (95% confidence interval [CI] = 14.8 to 15.3) to 13.4 hours (95% CI = 13.2 to 13.6; p < 0.001). The mean LOS in the emergency ward decreased from 4.5 days (95% CI = 4.4 to 4.6 days) to 3.1 days (95% CI = 3.0 to 3.2 days; p < 0.001). The percentage of transfers from the ED to other hospitals decreased from 3.5% to 2.5% (p < 0.001). However, transfers from the emergency ward to other hospitals increased from 2.9% to 8.2% (p < 0.001). Admissions to inpatient wards from the ED were significantly reduced, and admission from the emergency ward did not change. The ED mortality and hospital mortality rates did not change (p = 0.15 and p = 0.10, respectively). After introduction of the ICP, ED LOS decreased without an increase in hospital capacity.
Mathews, Kusum S; Durst, Matthew S; Vargas-Torres, Carmen; Olson, Ashley D; Mazumdar, Madhu; Richardson, Lynne D
2018-05-01
ICU admission delays can negatively affect patient outcomes, but emergency department volume and boarding times may also affect these decisions and associated patient outcomes. We sought to investigate the effect of emergency department and ICU capacity strain on ICU admission decisions and to examine the effect of emergency department boarding time of critically ill patients on in-hospital mortality. A retrospective cohort study. Single academic tertiary care hospital. Adult critically ill emergency department patients for whom a consult for medical ICU admission was requested, over a 21-month period. None. Patient data, including severity of illness (Mortality Probability Model III on Admission), outcomes of mortality and persistent organ dysfunction, and hourly census reports for the emergency department, for all ICUs, and for all adult wards were compiled. A total of 854 emergency department requests for ICU admission were logged, with 455 (53.3%) "accept" and 399 (46.7%) "deny" cases; median emergency department boarding times were 4.2 hours (interquartile range, 2.8-6.3 hr) and 11.7 hours (3.2-20.3 hr), respectively, with similar rates of persistent organ dysfunction and/or death (41.5% and 44.6%, respectively). Those accepted were younger (mean ± SD, 61 ± 17 vs 65 ± 18 yr) and more severely ill (median Mortality Probability Model III on Admission score, 15.3% [7.0-29.5%] vs 13.4% [6.3-25.2%]) than those denied admission. In the multivariable model, a full medical ICU was the only hospital-level factor significantly associated with a lower probability of ICU acceptance (odds ratio, 0.55 [95% CI, 0.37-0.81]). Using propensity score analysis to account for imbalances in baseline characteristics between those accepted or denied for ICU admission, longer emergency department boarding time after consult was associated with higher odds of mortality and persistent organ dysfunction (odds ratio, 1.77 [1.07-2.95] per log10-hour increase). ICU admission decisions for critically ill emergency department patients are affected by medical ICU bed availability, though higher emergency department volume and other ICU occupancy did not play a role. Prolonged emergency department boarding times were associated with worse patient outcomes, suggesting a need for improved throughput and targeted care for patients awaiting ICU admission.
Dennekamp, Martine; Straney, Lahn D; Erbas, Bircan; Abramson, Michael J; Keywood, Melita; Smith, Karen; Sim, Malcolm R; Glass, Deborah C; Del Monaco, Anthony; Haikerwal, Anjali; Tonkin, Andrew M
2015-10-01
Millions of people can potentially be exposed to smoke from forest fires, making this an important public health problem in many countries. In this study we aimed to measure the association between out-of-hospital cardiac arrest (OHCA) and forest fire smoke exposures in a large city during a severe forest fire season, and estimate the number of excess OHCAs due to the fire smoke. We investigated the association between particulate matter (PM) and other air pollutants and OHCA using a case-crossover study of adults (≥ 35 years of age) in Melbourne, Australia. Conditional logistic regression models were used to derive estimates of the percent change in the rate of OHCA associated with an interquartile range (IQR) increase in exposure. From July 2006 through June 2007, OHCA data were collected from the Victorian Ambulance Cardiac Arrest Registry. Hourly air pollution concentrations and meteorological data were obtained from a central monitoring site. There were 2,046 OHCAs with presumed cardiac etiology during our study period. Among men during the fire season, greater increases in OHCA were observed with IQR increases in the 48-hr lagged PM with diameter ≤ 2.5 μm (PM2.5) (8.05%; 95% CI: 2.30, 14.13%; IQR = 6.1 μg/m3) or ≤ 10 μm (PM10) (11.1%; 95% CI: 1.55, 21.48%; IQR = 13.7 μg/m3) and carbon monoxide (35.7%; 95% CI: 8.98, 68.92%; IQR = 0.3 ppm). There was no significant association between the rate of OHCA and air pollutants among women. One hundred seventy-four "fire-hours" (i.e., hours in which Melbourne's air quality was affected by forest fire smoke) were identified during 12 days of the 2006/2007 fire season, and 23.9 (95% CI: 3.1, 40.2) excess OHCAs were estimated to occur due to elevations in PM2.5 during these fire-hours. This study found an association between exposure to forest fire smoke and an increase in the rate of OHCA. These findings have implications for public health messages to raise community awareness and for planning of emergency services during forest fire seasons.
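The estimates above are percent changes in OHCA rate per IQR increase in a pollutant, which for a fitted log-rate-ratio coefficient beta equals 100 × (exp(beta × IQR) − 1). A worked example follows; beta here is a hypothetical value back-calculated so that the output approximates the reported 8.05% for PM2.5, since the fitted coefficient itself is not given in the abstract.

import numpy as np

beta = 0.0127   # assumed log rate ratio per 1 ug/m3 of 48-hr lagged PM2.5
iqr = 6.1       # ug/m3, the PM2.5 IQR reported in the study

pct_change = (np.exp(beta * iqr) - 1) * 100
print(f"{pct_change:.2f}% change in OHCA rate per IQR increase")  # ~8%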
Patient Advocacy Organizations, Industry Funding, and Conflicts of Interest.
Rose, Susannah L; Highland, Janelle; Karafa, Matthew T; Joffe, Steven
2017-03-01
Patient advocacy organizations (PAOs) are influential health care stakeholders that provide direct counseling and education for patients, engage in policy advocacy, and shape research agendas. Many PAOs report having financial relationships with for-profit industry, yet little is known about the nature of these relationships. To describe the nature of industry funding and partnerships between PAOs and for-profit companies in the United States. A survey was conducted from September 1, 2013, to June 30, 2014, of a nationally representative random sample of 439 PAO leaders, representing 5.6% of 7865 PAOs identified in the United States. Survey questions addressed the nature of their activities, their financial relationships with industry, and the perceived effectiveness of their conflict of interest policies. Amount and sources of revenue as well as organizational experiences with and policies regarding financial conflict of interest. Of the 439 surveys mailed to PAO leaders, 289 (65.8%) were returned with at least 80% of the questions answered. The PAOs varied widely in terms of size, funding, activities, and disease focus. The median total revenue among responding organizations was $299 140 (interquartile range, $70 000-$1 200 000). A total of 165 of 245 PAOs (67.3%) reported receiving industry funding, with 19 of 160 PAOs (11.9%) receiving more than half of their funding from industry. Among the subset of PAOs that received industry funding, the median amount was $50 000 (interquartile range, $15 000-$200 000); the median proportion of industry support derived from the pharmaceutical, device, and/or biotechnology sectors was 45% (interquartile range, 0%-100%). A total of 220 of 269 respondents (81.8%) indicated that conflicts of interest are very or moderately relevant to PAOs, and 94 of 171 (55.0%) believed that their organizations' conflict of interest policies were very good. A total of 22 of 285 PAO leaders (7.7%) perceived pressure to conform their positions to the interests of corporate donors. Patient advocacy organizations engage in wide-ranging health activities. Although most PAOs receive modest funding from industry, a minority receive substantial industry support, raising added concerns about independence. Many respondents report a need to improve their conflict of interest policies to help maintain public trust.
Choquette, Anne F.
2014-01-01
This report summarizes pesticide and nitrate (as nitrogen) results from quarterly sampling of 31 surficial-aquifer wells in the Lake Wales Ridge Monitoring Network during April 1999 through January 2005. The wells, located adjacent to citrus orchards and used for monitoring only, were generally screened (sampled) within 5 to 40 feet of the water table. Of the 44 citrus pesticides and pesticide degradates analyzed, 17 were detected in groundwater samples. Parent pesticides and degradates detected in quarterly groundwater samples, ordered by frequency of detection, included norflurazon, demethyl norflurazon, simazine, diuron, bromacil, aldicarb sulfone, aldicarb sulfoxide, deisopropylatrazine (DIA), imidacloprid, metalaxyl, thiazopyr monoacid, oxamyl, and aldicarb. Reconnaissance sampling of five Network wells yielded detection of four additional pesticide degradates (hydroxysimazine, didealkylatrazine, deisopropylhydroxyatrazine, and hydroxyatrazine). The highest median concentration values per well, based on samples collected during the 1999–2005 period (n=14 to 24 samples per well), included 3.05 µg/L (micrograms per liter) (simazine), 3.90 µg/L (diuron), 6.30 µg/L (aldicarb sulfone), 6.85 µg/L (aldicarb sulfoxide), 22.0 µg/L (demethyl norflurazon), 25.0 µg/L (norflurazon), 89 µg/L (bromacil), and 25.5 mg/L (milligrams per liter) (nitrate). Nitrate concentrations exceeded the 10 mg/L (as nitrogen) drinking water standard in one or more groundwater samples from 28 of the wells, and the median nitrate concentration among these wells was 14 mg/L. Sampled groundwater pesticide concentrations exceeded Florida’s health-guidance benchmarks for aldicarb sulfoxide and aldicarb sulfone (4 wells), the sum of aldicarb and its degradates (6 wells), simazine (2 wells), the sum of simazine and DIA (3 wells), diuron (2 wells), bromacil (1 well), and the sum of norflurazon and demethyl norflurazon (1 well). The magnitude of fluctuations in groundwater pesticide concentrations varied between wells and between pesticide compounds. Of the 10 pesticide compounds detected at sufficient frequency to assess temporal variability in quarterly sampling records, median values of the relative interquartile range (ratio of the interquartile range to the median) among wells typically ranged from about 100 to 150 percent. The relative interquartile range of pesticide concentrations at individual wells could be much higher, sometimes exceeding 200 to 500 percent. No distinct spatial patterns were apparent among median pesticide concentrations in sampled wells; nitrate concentrations tended to be greater in samples from wells in the northern part of the study area.
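The "relative interquartile range" used above is the ratio of the interquartile range to the median, expressed as a percentage. A minimal sketch, with hypothetical quarterly concentrations for a single well rather than the report's data:

import numpy as np

def relative_iqr(values):
    """Ratio of the interquartile range to the median, as a percentage."""
    q1, q3 = np.percentile(values, [25, 75])
    return (q3 - q1) / np.median(values) * 100

# Hypothetical quarterly pesticide concentrations (micrograms per liter).
concentrations = [0.8, 1.5, 0.3, 2.9, 1.1, 0.6, 2.2, 1.4]
print(f"relative interquartile range = {relative_iqr(concentrations):.0f}%")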
Shin, Andrea; Camilleri, Michael; Busciglio, Irene; Burton, Duane; Smith, Steven A; Vella, Adrian; Ryks, Michael; Rhoten, Deborah; Zinsmeister, Alan R
2013-11-01
RM-131, a synthetic ghrelin agonist, greatly accelerates gastric emptying of solids in patients with type 2 diabetes and delayed gastric emptying (DGE). We investigated the safety and effects of a single dose of RM-131 on gastric emptying and upper gastrointestinal (GI) symptoms in patients with type 1 diabetes and previously documented DGE. In a double-blind cross-over study, 10 patients with type 1 diabetes (age, 45.7 ± 4.4 y; body mass index, 24.1 ± 1.1 kg/m2) and previously documented DGE were assigned in random order to receive a single dose of RM-131 (100 μg, subcutaneously) or placebo. Thirty minutes later, they ate a radiolabeled solid-liquid meal containing EggBeaters (ConAgra Foods, Omaha, NE), and then underwent 4 hours of gastric emptying and 6 hours of colonic filling analyses by scintigraphy. Upper GI symptoms were assessed using a daily diary, gastroparesis cardinal symptom index (total GCSI-DD) and a combination of nausea, vomiting, fullness, and pain (NVFP) scores (each rated on a 0-5 scale). At screening, participants' mean level of hemoglobin A1c was 9.1% ± 0.5%; their total GCSI-DD score was 1.66 ± 0.38 (median, 1.71), and their total NVFP score was 1.73 ± 0.39 (median, 1.9). The t1/2 of solid gastric emptying was 84.9 ± 31.6 minutes when subjects were given RM-131 and 118.7 ± 26.7 minutes when they were given placebo. The median difference (Δ) was -33.9 minutes (interquartile range [IQR], -12 to -49), or -54.7% (IQR, -21% to -110%). RM-131 decreased gastric retention of solids at 1 hour (P = .005) and 2 hours (P = .019). Numeric differences in t1/2 for gastric emptying of liquids, solid gastric emptying lag time, and colonic filling at 6 hours were not significant. Total GCSI-DD scores were 0.79 on placebo (IQR, 0.75, 2.08) and 0.17 on RM-131 (IQR, 0.00, 0.67; P = .026); NVFP scores were lower on RM-131 (P = .041). There were no significant adverse effects. RM-131 significantly accelerates gastric emptying of solids and reduces upper GI symptoms in patients with type 1 diabetes and documented DGE. Copyright © 2013 AGA Institute. Published by Elsevier Inc. All rights reserved.
Heiner, Jason D; Bebarta, Vikhyat S; Varney, Shawn M; Bothwell, Jason D; Cronin, Aaron J
2013-12-01
Annually, more than 100,000 US and international military and civilian personnel work in Afghanistan within terrain harboring venomous snakes. Current literature insufficiently supports Afghan antivenom treatment and stocking guidelines. We report the clinical course and treatments for snakebite victims presenting to US military hospitals in Afghanistan. All snakebite victims presenting to 3 US military emergency departments between July 2010 and August 2011 in northern and southern Afghanistan were examined via chart review. Case information included patient demographics, snake description, bite details and complications, laboratory results, antivenom use and adverse effects, procedures performed, and hospital course. Of 17 cases, median patient age was 20 years (interquartile range [IQR], 12-30), 16 were male, and 82% were Afghans. All bites were to an extremity, and median time to care was 2.8 hours (IQR, 2-5.8). On arrival, 8 had tachycardia and none had hypotension or hypoxia. A viper was implicated in 5 cases. Ten cases received at least 1 dose of polyvalent antivenom, most commonly for coagulopathy, without adverse effects. Six received additional antivenom, 6 had an international normalized ratio (INR) > 10, and none developed delayed coagulopathy. Three received blood transfusions. Hospital stay ranged from 1 to 4 days. None required vasopressors, fasciotomy, or other surgery, and none died. All had resolution of marked coagulopathies and improved swelling and pain on discharge. We report the largest series of snake envenomations treated by US physicians in Afghanistan. Antivenom was tolerated well with improvement of coagulopathy and symptoms. All patients survived with minimal advanced interventions other than blood transfusion. Published by Elsevier Inc.
Moodley, Dhayendre; Srikewal, Jyothi; Msweli, Lindiwe; Maharaj, Niren R
2011-02-01
While countries strengthen their health information systems, local health managers require alternative strategies to monitor their prevention of mother-to-child transmission (PMTCT) programmes to improve coverage and service delivery. To demonstrate the use of a postpartum audit to establish PMTCT coverage and programme deficiencies at hospitals and multiple primary health care facilities. A cross-sectional hospital-based medical chart audit of pregnant women admitted in labour to their regional hospital. Their antenatal hand-held medical records were added to a hospital-issued maternity chart that was used to record further obstetric and perinatal management during their hospital stay. Women recuperating in the postnatal wards up to 48 hours after delivery at two hospitals in KwaZulu-Natal participated. Data included their antenatal attendance, access to HIV counselling and testing (HCT), and access to nevirapine (NVP) for PMTCT. Fifty-three clinics were indirectly evaluated as a result of the postpartum audit. All clinics provided HCT and the average HIV testing rate was 91% (range 40 - 100); 15% (N = 8) of these clinics with HIV testing rates of < 80% were identified. The median frequency of NVP dispensing at 53 clinics was 87% (interquartile range 67 - 100); among these 30% (N = 16) with NVP dispensing frequencies of < 80% were identified. An exit survey by trained nurses at a maternity hospital can provide health services management with a quick estimate of antenatal and PMTCT coverage of multiple primary health facilities in a specified catchment area. Challenges in the PMTCT programme at primary health clinic and hospital levels were highlighted.
Bernardo, Ana Paula; Oliveira, Jose C; Santos, Olivia; Carvalho, Maria J; Cabrita, Antonio; Rodrigues, Anabela
2015-12-07
Insulin resistance has been associated with cardiovascular disease in peritoneal dialysis patients. Few studies have addressed the impact of fast transport status or dialysis prescription on insulin resistance. The aim of this study was to test whether insulin resistance is associated with obesity parameters, peritoneal transport rate, and glucose absorption. Insulin resistance was evaluated with the homeostasis model assessment method (HOMA-IR), additionally corrected by adiponectin (HOMA-AD). Enrolled patients were prevalent nondiabetics attending the Santo António Hospital Peritoneal Dialysis Unit who were free of hospitalization or infectious events in the previous 3 months (51 patients aged 50.4 ± 15.9 years, 59% women). Leptin, adiponectin, insulin-like growth factor-binding protein 1 (IGFBP-1), and daily glucose absorption were also measured. Lean tissue index, fat tissue index (FTI), and relative fat mass (rel.FM) were assessed using multifrequency bioimpedance. Patients were categorized according to dialysate to plasma creatinine ratio at 4 hours, 3.86% peritoneal equilibration test, and obesity parameters. Obesity was present in 49% of patients according to rel.FM. HOMA-IR correlated better with FTI than with body mass index. Significant correlations were found in obese, but not in nonobese patients, between HOMA-IR and leptin, leptin/adiponectin ratio (LAR), and IGFBP-1. HOMA-IR correlated with HOMA-AD, but did not correlate with glucose absorption or transport rate. There were no significant differences in insulin resistance indices, glucose absorption, and body composition parameters between fast and nonfast transporters. A total of 18 patients (35.3%) who had insulin resistance presented with higher LAR and rel.FM (LAR 7.3 [interquartile range, 12.3] versus 0.7 [interquartile range, 1.4], P<0.001; rel.FM 39.4 ± 10.1% versus 27.2 ± 11.5%, P=0.002), lower IGFBP-1 (8.2 ± 7.2 versus 21.0 ± 16.3 ng/ml, P=0.002), but similar glucose absorption and small-solute transport compared with patients without insulin resistance. FTI and LAR were independent correlates of HOMA-IR in multivariate analysis adjusted for glucose absorption and small-solute transport (r=0.82, P<0.001). Insulin resistance in nondiabetic peritoneal dialysis patients is associated with obesity and LAR independent of glucose absorption and small-solute transport status. Fast transport status was not associated with higher likelihood of obesity or insulin resistance. Copyright © 2015 by the American Society of Nephrology.
Outcome of visceral chimney grafts after urgent endovascular repair of complex aortic lesions.
Bin Jabr, Adel; Lindblad, Bengt; Kristmundsson, Thorarinn; Dias, Nuno; Resch, Timothy; Malina, Martin
2016-03-01
Endovascular abdominal aortic repair requires an adequate sealing zone. The chimney graft (CG) technique may be the only option for urgent high-risk patients who are unfit for open repair and have no adequate sealing zone. This single-center experience provides long-term results of CGs with endovascular repair for urgent and complex aortic lesions. Between July 2006 and October 2012, 51 patients (16 women) with a median age of 77 years (interquartile range, 72-81 years), were treated urgently (within 24 hours [61%]) or semiurgently (within 3 days [39%]) with endovascular aortic repair and visceral CGs (n = 73). Median follow-up was 2.3 years (interquartile range, 0.8-5.0 years) for the whole cohort, 3 years for 30-day survivors, and 4.8 years for patients still alive at the end of follow-up. Five patients (10%) died within 30 days. All of them had a sacrificed kidney. All-cause mortality was 57% (n = 29), but the chimney- and procedure-related mortality was 6% (n = 3) and 16% (n = 8), respectively. Chimney-related death was due to bleeding, infection, renal failure, and multiple organ failure. There were two postoperative ruptures; both were fatal although not related to the treated disease. The primary and secondary long-term CG patencies were 89% (65 of 73) and 93% (68 of 73), respectively. Primary type I endoleak (EL-I) occurred in 10% (5 of 51) of the patients, and only one patient had recurrent EL-I (2%; 1 of 51). No secondary endoleak was observed. Chimney-related reintervention was required in 16% (8 of 51) of the patients because of EL-I (n = 3), visceral ischemia (n = 4), and bleeding (n = 2). The reinterventions included stenting (n = 5), embolization (n = 3), and laparotomy (n = 2). Thirty-one visceral branches were sacrificed (9 celiac trunks, 9 right, and 13 left renal arteries). Among the 30-day survivors, 8 of 17 patients (47%) with a sacrificed kidney required permanent dialysis; of these, seven underwent an urgent index operation. The aneurysm sac shrank in 63% (29 of 46) of cases. The 6% chimney-related mortality and 93% long-term patency seem promising in urgent, complex aortic lesions of a high-risk population and may justify continued yet restrictive use of this technique. Most endoleaks could be sealed endovascularly. However, sacrifice of a kidney in this elderly cohort was associated with permanent dialysis in 47% of patients. Copyright © 2016 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
Newth, Christopher J L; Sward, Katherine A; Khemani, Robinder G; Page, Kent; Meert, Kathleen L; Carcillo, Joseph A; Shanley, Thomas P; Moler, Frank W; Pollack, Murray M; Dalton, Heidi J; Wessel, David L; Berger, John T; Berg, Robert A; Harrison, Rick E; Holubkov, Richard; Doctor, Allan; Dean, J Michael; Jenkins, Tammara L; Nicholson, Carol E
2017-11-01
Although pediatric intensivists philosophically embrace lung protective ventilation for acute lung injury and acute respiratory distress syndrome, we hypothesized that ventilator management varies. We assessed ventilator management by evaluating changes to ventilator settings in response to blood gases, pulse oximetry, or end-tidal CO2. We also assessed the potential impact that a pediatric mechanical ventilation protocol adapted from National Heart Lung and Blood Institute acute respiratory distress syndrome network protocols could have on reducing variability by comparing actual changes in ventilator settings to those recommended by the protocol. Prospective observational study. Eight tertiary care U.S. PICUs, October 2011 to April 2012. One hundred twenty patients (age range 17 d to 18 yr) with acute lung injury/acute respiratory distress syndrome. Two thousand one hundred arterial and capillary blood gases, 3,964 oxygen saturation by pulse oximetry values, and 2,757 end-tidal CO2 values were associated with 3,983 ventilator settings. Ventilation mode at study onset was pressure control 60%, volume control 19%, pressure-regulated volume control 18%, and high-frequency oscillatory ventilation 3%. Clinicians changed FIO2 in ±5% or ±10% increments every 8 hours. Positive end-expiratory pressure was limited to ~10 cm H2O as oxygenation worsened, lower than would have been recommended by the protocol. In the first 72 hours of mechanical ventilation, maximum tidal volume/kg using predicted versus actual body weight was 10.3 mL/kg (median [interquartile range], 8.5-12.9) versus 9.2 mL/kg (7.6-12.0) (p < 0.001). Intensivists made changes similar to protocol recommendations 29% of the time, opposite to the protocol's recommendation 12% of the time, and no changes 56% of the time. Ventilator management varies substantially in children with acute respiratory distress syndrome. Opportunities exist to minimize variability and potentially injurious ventilator settings by using a pediatric mechanical ventilation protocol offering adequately explicit instructions for given clinical situations. An accepted protocol could also reduce confounding by mechanical ventilation management in a clinical trial.
Glover, Guy W; Thomas, Richard M; Vamvakas, George; Al-Subaie, Nawaf; Cranshaw, Jules; Walden, Andrew; Wise, Matthew P; Ostermann, Marlies; Thomas-Jones, Emma; Cronberg, Tobias; Erlinge, David; Gasche, Yvan; Hassager, Christian; Horn, Janneke; Kjaergaard, Jesper; Kuiper, Michael; Pellis, Tommaso; Stammet, Pascal; Wanscher, Michael; Wetterslev, Jørn; Friberg, Hans; Nielsen, Niklas
2016-11-26
Targeted temperature management is recommended after out-of-hospital cardiac arrest and may be achieved using a variety of cooling devices. This study was conducted to explore the performance and outcomes for intravascular versus surface devices for targeted temperature management after out-of-hospital cardiac arrest. A retrospective analysis of data from the Targeted Temperature Management trial. N = 934. A total of 240 patients (26%) were managed with intravascular and 694 (74%) with surface devices. Devices were assessed for speed and precision during the induction, maintenance and rewarming phases in addition to adverse events. All-cause mortality, as well as a composite of poor neurological function or death, as evaluated by the Cerebral Performance Category and modified Rankin scale, were analysed. For patients managed at 33 °C there was no difference between intravascular and surface groups in the median time taken to achieve target temperature (210 [interquartile range (IQR) 180] minutes vs. 240 [IQR 180] minutes, p = 0.58), maximum rate of cooling (1.0 [0.7] vs. 1.0 [0.9] °C/hr, p = 0.44), the proportion of patients who reached target temperature within 4 hours (65% vs. 60%, p = 0.30) or ever (100% vs. 97%, p = 0.47), or episodes of overcooling (8% vs. 34%, p = 0.15). In the maintenance phase, cumulative temperature deviation (median 3.2 [IQR 5.0] °C hr vs. 9.3 [IQR 8.0] °C hr, p < 0.001), the number of patients ever out of range (57.0% vs. 91.5%, p = 0.006) and median time out of range (1 [IQR 4.0] hours vs. 8.0 [IQR 9.0] hours, p < 0.001) were all significantly greater in the surface group, although there was no difference in the occurrence of pyrexia. Adverse events were not different between intravascular and surface groups. There was no statistically significant difference in mortality (intravascular 46.3% vs. surface 50.0%; p = 0.32), Cerebral Performance Category scale 3-5 (49.0% vs. 54.3%; p = 0.18) or modified Rankin scale 4-6 (49.0% vs. 53.0%; p = 0.48). Intravascular and surface cooling were equally effective during induction of mild hypothermia. However, surface cooling was associated with less precision during the maintenance phase. There was no difference in adverse events, mortality or poor neurological outcomes between patients treated with intravascular and surface cooling devices. TTM trial ClinicalTrials.gov number NCT01020916 (https://clinicaltrials.gov/ct2/show/NCT01020916); registered 25 November 2009.
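The maintenance-phase precision metric above, cumulative temperature deviation in °C hr, can be illustrated with a short sketch. The trial's exact definition is not stated in the abstract, so this assumes degree-hours spent beyond a hypothetical ±0.5 °C band around the 33 °C target, with made-up hourly temperatures.

import numpy as np

# Hypothetical hourly core temperatures during maintenance (degrees C).
temps = np.array([33.0, 33.1, 33.4, 33.6, 33.2, 33.0, 32.9, 33.5])
target, tolerance = 33.0, 0.5    # assumed target and acceptable band

# Degree-hours outside the band, summed over hourly samples.
excess = np.clip(np.abs(temps - target) - tolerance, 0.0, None)
print(f"cumulative temperature deviation = {excess.sum():.1f} degC hr")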
Unaka, Ndidi I; Statile, Angela; Haney, Julianne; Beck, Andrew F; Brady, Patrick W; Jerardi, Karen E
2017-02-01
The average American adult reads at an 8th-grade level. Discharge instructions written above this level might increase the risk of adverse outcomes for children as they transition from hospital to home. We conducted a cross-sectional study at a large urban academic children's hospital to describe readability levels, understandability scores, and completeness of written instructions given to families at hospital discharge. Two hundred charts for patients discharged from the hospital medicine service were randomly selected for review. Written discharge instructions were extracted and scored for readability (Fry Readability Scale [FRS]), understandability (Patient Education Materials Assessment Tool [PEMAT]), and completeness (5 criteria determined by consensus). Descriptive statistics enumerated the distribution of readability, understandability, and completeness of written discharge instructions. Of the patients included in the study, 51% were publicly insured. Median age was 3.1 years, and median length of stay was 2.0 days. The median readability score corresponded to a 10th-grade reading level (interquartile range, 8-12; range, 1-13). Median PEMAT score was 73% (interquartile range, 64%-82%; range, 45%-100%); 36% of instructions scored below 70%, correlating with suboptimal understandability. The diagnosis was described in only 33% of the instructions. Although explicit warning signs were listed in most instructions, 38% of the instructions did not include information on the person to contact if warning signs developed. Overall, the readability, understandability, and completeness of discharge instructions were subpar. Efforts to improve the content of discharge instructions may promote safe and effective transitions home. Journal of Hospital Medicine 2017;12:98-101. © 2017 Society of Hospital Medicine.
Electromagnetic Navigation Bronchoscopy for Identifying Lung Nodules for Thoracoscopic Resection.
Marino, Katy A; Sullivan, Jennifer L; Weksler, Benny
2016-08-01
Pulmonary nodules smaller than 1 cm can be difficult to identify during minimally invasive resection, necessitating conversion to thoracotomy. We hypothesized that localizing nodules with electromagnetic navigation bronchoscopy and marking them with methylene blue would allow minimally invasive resection and reduce conversion to thoracotomy. We retrospectively identified all patients who underwent electromagnetic navigation bronchoscopy followed by minimally invasive resection of a pulmonary nodule from 2011 to 2014. Lung nodules smaller than 10 mm and nodules smaller than 20 mm that were also located more than 10 mm from the pleural surface were localized and marked with methylene blue. Immediately after marking, all patients underwent resection. Seventy patients underwent electromagnetic navigation bronchoscopy marking followed by minimally invasive resection. The majority of patients (68/70, 97%) had one nodule localized; 2 patients (2/70, 3%) had two nodules localized. The median nodule size was 8 mm (range, 4-17 mm; interquartile range, 5 mm). The median distance from the pleural surface was 6 mm (range, 1-19 mm; interquartile range, 6 mm). There were no conversions to thoracotomy. Nodule marking was successful in 70 of 72 attempts (97.2%); two nodules were identified by palpation. The nodules were most commonly metastases from other sites (31/70, 44.3%). There were no adverse events related to electromagnetic navigation bronchoscopy-guided marking or wedge resection, and minimal adverse events after more extensive resections. Localizing and marking small pulmonary nodules using electromagnetic navigation bronchoscopy is safe and effective for nodule identification before minimally invasive resection. Copyright © 2016 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
Nowak, Rebecca G; Gravitt, Patti E; He, Xin; Ketende, Sosthenes; Dauda, Wuese; Omuh, Helen; Blattner, William A; Charurat, Manhattan E
2016-04-01
Prevalence estimates of anal high-risk human papillomavirus (HR-HPV) are needed in sub-Saharan Africa where HIV is endemic. This study evaluated anal HR-HPV in Nigeria among HIV-positive and HIV-negative men who have sex with men (MSM) for future immunization recommendations. We conducted a cross-sectional study to compare the prevalence of anal HR-HPV infections between 64 HIV-negative and 90 HIV-positive MSM. Multivariate Poisson regression analyses were used to examine demographic and behavioral risk factors associated with any HR-HPV infections. The median age of the 154 participants was 25 years (interquartile range, 22-28 years; range, 16-38 years), and the median age at initiation of anal sex with another man was 16 years (interquartile range, 13-18 years; range, 7-29 years). The prevalence of anal HR-HPV was higher among HIV-positive than HIV-negative MSM (91.1% vs. 40.6%, P < 0.001). In the multivariate analysis, HIV infection (adjusted prevalence ratio [aPR], 2.02; 95% confidence interval [CI], 1.49-2.72), 10 years or more since anal sexual debut (aPR, 1.26; 95% CI, 1.07-1.49), and concurrent relationships with men (aPR, 1.32; 95% CI, 1.04-1.67) were associated with increased anal HR-HPV prevalence. Anal HR-HPV infection is high for young Nigerian MSM, and rates are amplified in those coinfected with HIV. Providing universal coverage as well as catch-up immunization for young MSM may be an effective anal cancer prevention strategy in Nigeria.
Hecht, Silke; Adams, W H; Cunningham, M A; Lane, I F; Howell, N E
2013-01-01
Effective teaching of veterinary radiology can be challenging in a traditional classroom environment. Audience response systems, colloquially known as "clickers," provide a means of encouraging student interaction. The purpose of this study was to compare student performance and course evaluations before and after using the Classroom Performance System™ in the third-year (fifth semester) didactic radiology course at the University of Tennessee College of Veterinary Medicine. Overall student performance was assessed by comparing median numeric final course grades (%) between years without and with use of the Classroom Performance System™. Grades of students were determined for individual instructors' sections. Student evaluations of the radiology course were compared for the years available (2007-2010). Student interactions were also evaluated subjectively by instructors who used the Classroom Performance System™. There was a significant difference (p = 0.009) between the median student grade before (2005-2008: median 82.2%; interquartile range 77.6-85.7%; range 61.9-95.5%) and after use of the Classroom Performance System™ (2009-2010: median 83.6%; interquartile range 79.9-87.9%; range 68.2-93.2%). There was no statistically significant difference in median student grades for individual instructors over the study period. The radiology course student evaluation scores were significantly higher in years where the Classroom Performance System™ was used in comparison to previous years (p = 0.019). Subjectively, students appeared more involved when using clickers. Findings indicated that the Classroom Performance System™ may be a useful tool for enhancing veterinary radiology education. © 2012 Veterinary Radiology & Ultrasound.
O'Connor, Jeremy M; Fessele, Kristen L; Steiner, Jean; Seidl-Rathkopf, Kathi; Carson, Kenneth R; Nussbaum, Nathan C; Yin, Emily S; Adelson, Kerin B; Presley, Carolyn J; Chiang, Anne C; Ross, Joseph S; Abernethy, Amy P; Gross, Cary P
2018-05-10
The US Food and Drug Administration (FDA) is increasing its pace of approvals for novel cancer therapeutics, including for immune checkpoint inhibitors of programmed cell death 1 protein (anti-PD-1 agents). However, little is known about how quickly anti-PD-1 agents reach eligible patients in practice or whether such patients differ from those studied in clinical trials that lead to FDA approval (pivotal clinical trials). To assess the speed with which anti-PD-1 agents reached eligible patients in practice and to compare the ages of patients treated in clinical practice with the ages of those treated in pivotal clinical trials. This retrospective cohort study, performed from January 1, 2011, through August 31, 2016, included patients from the Flatiron Health Network who were eligible for anti-PD-1 treatment of selected cancer types, which included melanoma, non-small cell lung cancer (NSCLC), and renal cell carcinoma (RCC). Cumulative proportions of eligible patients receiving anti-PD-1 treatment and their age distributions. The study identified 3089 patients who were eligible for anti-PD-1 treatment (median age, 66 [interquartile range, 56-75] years for patients with melanoma, 66 [interquartile range, 58-72] years for patients with RCC, and 67 [interquartile range, 59-74] years for patients with NSCLC; 1742 [56.4%] male and 1347 [43.6%] female; 2066 [66.9%] white). Of these patients, 2123 (68.7%) received anti-PD-1 treatment, including 439 eligible patients with melanoma (79.1%), 1417 eligible patients with NSCLC (65.6%), and 267 eligible patients with RCC (71.2%). Within 4 months after FDA approval, greater than 60% of eligible patients in each cohort had received anti-PD-1 treatment. Overall, similar proportions of older and younger patients received anti-PD-1 treatment during the first 9 months after FDA approval. However, there were significant differences in age between clinical trial participants and patients receiving anti-PD-1 treatment in clinical practice, with more patients being older than 65 years in clinical practice (range, 327 of 1365 [60.6%] to 46 of 72 [63.9%]) than in pivotal clinical trials (range, 38 of 120 [31.7%] to 223 of 544 [41.0%]; all P < .001). Anti-PD-1 agents rapidly reached patients in clinical practice, and patients treated in clinical practice differed significantly from patients treated in pivotal clinical trials. Future actions are needed to ensure that rapid adoption occurs on the basis of representative trial evidence.
The Role of Sleep in the Modulation of Gastroesophageal Reflux and Symptoms in NICU Neonates.
Qureshi, Aslam; Malkar, Manish; Splaingard, Mark; Khuhro, Abdul; Jadcherla, Sudarshan
2015-09-01
Newborns sleep about 80% of the time. Gastroesophageal reflux disease is prevalent in about 10% of neonatal intensive care unit infants. Concurrent polysomnography and pH-impedance studies clarify the relationship of gastroesophageal reflux with sleep. To define the spatiotemporal and chemical characteristics of impedance-positive gastroesophageal reflux and symptom associations in sleep and wake states in symptomatic neonates. We hypothesized that the frequency of impedance-positive gastroesophageal reflux events and their association with cardiorespiratory symptoms is greater during sleep. Eighteen neonates underwent concurrent polysomnography with a pH-impedance study. Impedance-positive gastroesophageal reflux events (weakly acidic or acidic) were compared between sleep and wake states: Symptom Index = (number of symptoms with gastroesophageal reflux/total symptoms) × 100; Symptom Sensitivity Index = (number of gastroesophageal reflux events with symptoms/total gastroesophageal reflux events) × 100; Symptom Association Probability = (1 − probability of observed association between reflux and symptoms) × 100. We analyzed 317 gastroesophageal reflux events during 116 hours of polysomnography. During wake versus sleep, respectively, the median (interquartile range) frequency of impedance-positive gastroesophageal reflux was 4.9 (3.1-5.8) versus 1.4 (0.7-1.7) events/hour (P < 0.001) and proximal migration was 2.6 (0.8-3.3) versus 0.2 (0.0-0.9) events/hour (P < 0.001). The Symptom Index for cardiorespiratory symptoms for impedance-positive events was 22.5 (0-55.3) versus 6.1 (0-13), P = 0.04, whereas the Symptom Sensitivity Index was 9.1 (0-23.1) versus 18.4 (0-50), P = 0.04, although the Symptom Association Probability was similar (P = 0.68). Contrary to our hypothesis, the frequency of gastroesophageal reflux in sleep is lower; however, the spatiotemporal and chemical characteristics of gastroesophageal reflux and symptom-generation mechanisms are distinct. For cardiorespiratory symptoms during sleep, a lower Symptom Index entails evaluation for etiologies other than gastroesophageal reflux disease, a higher Symptom Sensitivity Index implies heightened esophageal sensitivity, and a similar Symptom Association Probability indicates other mechanistic possibilities. Copyright © 2015 Elsevier Inc. All rights reserved.
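The Symptom Index and Symptom Sensitivity Index defined above are simple proportions. A minimal sketch with hypothetical counts chosen to reproduce the reported wake-state medians; the Symptom Association Probability additionally requires a windowed contingency test and is not reproduced here.

def symptom_index(symptoms_with_reflux, total_symptoms):
    """Percentage of symptom events accompanied by reflux."""
    return symptoms_with_reflux / total_symptoms * 100

def symptom_sensitivity_index(reflux_with_symptoms, total_reflux):
    """Percentage of reflux events accompanied by symptoms."""
    return reflux_with_symptoms / total_reflux * 100

# Hypothetical counts for one infant (not study data).
print(symptom_index(9, 40))               # 22.5
print(symptom_sensitivity_index(4, 44))   # about 9.1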
Er, Anıl; Çağlar, Aykut; Akgül, Fatma; Ulusoy, Emel; Çitlenbik, Hale; Yılmaz, Durgül; Duman, Murat
2018-06-01
High-flow nasal cannula (HFNC) is a new treatment option for pediatric respiratory distress, and we aimed to assess early predictive factors of unresponsiveness to HFNC therapy in a pediatric emergency department (ED). Patients who presented with respiratory distress and were treated with HFNC were included. The age, gender, weight, medical history, diagnosis, vital signs, oxygen saturation/fraction of inspired oxygen (SpO2/FiO2) ratio, modified Respiratory Distress Assessment Instrument (mRDAI) scores, medical interventions, duration of HFNC therapy, time to escalation, adverse effects, and laboratory test results were obtained from medical and nursing records. Unresponsiveness to HFNC was defined as the requirement of a higher level of respiratory support due to an unchanged or increased respiratory rate (RR) compared with the initial RR, incipient or progressive respiratory acidosis, or incipient hemodynamic instability. The study enrolled 154 children with a median age of 10 months (interquartile range [IQR], 5.7-22.5 months). The diagnosis was acute bronchiolitis in 59 patients (38.3%), bacterial pneumonia in 64 patients (41.6%), and atypical or viral pneumonia in 31 patients (20.1%). Twenty-five patients (16.2%) were in the unresponsive group, and the median time for escalating respiratory support was 7 h (IQR: 4-20 h). The unresponsive group had a lower SpO2 and SpO2/FiO2 (SF) ratio on admission, lower venous pH, and higher partial pressure of carbon dioxide (pCO2) (P = 0.002, P = 0.012, and P = 0.001, respectively). The change in RR, mRDAI score, and SF ratio at the first hour was also greater in the responsive group. The cut-off value of the SF ratio at the first hour of HFNC was 195 for unresponsiveness. A low initial SpO2 and SF ratio, respiratory acidosis, and an SF ratio of less than 195 at the first hour of treatment were related to unresponsiveness to HFNC therapy in our pediatric emergency department. © 2018 Wiley Periodicals, Inc.
Cohrs, Imke; Grünberg, Walter
2018-05-01
Hypophosphatemia is commonly associated with disease and decreased productivity in dairy cows, particularly in early lactation. Oral supplementation with phosphate salts is recognized as suitable for the rapid correction of hypophosphatemia. Little information is available about the differences in efficacy between salts used for oral phosphorus supplementation. Comparison of the efficacy of oral administration of NaH2PO4, Na2HPO4, and MgHPO4 in treating hypophosphatemia in cattle. Twelve healthy dairy cows in the fourth week of their second to fifth lactation. Randomized clinical study. Phosphorus-deficient, hypophosphatemic cows underwent a sham treatment and were afterwards assigned to 1 of 3 treatments: NaH2PO4, Na2HPO4, or MgHPO4 (each providing the equivalent of 60 g of phosphorus). Blood samples were obtained immediately before and repeatedly after treatment. Treatment with NaH2PO4 and Na2HPO4 resulted in rapid and sustained increases of plasma phosphate concentrations ([Pi]). Significant effects were apparent within 1 hour (NaH2PO4: P = .0044; Na2HPO4: P = .0077). Peak increments of plasma [Pi] of 5.33 mg/dL [5.26-5.36] and 4.30 mg/dL [3.59-4.68] (median and interquartile range) were reached after 7 and 6 hours in animals treated with NaH2PO4 and Na2HPO4, respectively, whereas treatment with MgHPO4 led to peak increments 14 hours after treatment (3.19 mg/dL [2.11-4.04]). NaH2PO4 and Na2HPO4 are suitable to rapidly correct hypophosphatemia in cattle. Because of its protracted and weaker effect, MgHPO4 cannot be recommended for this purpose. Despite important differences in the solubility of NaH2PO4 and Na2HPO4, only small plasma [Pi] differences were observed after treatment. Copyright © 2018 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.
Linder, Brian J; Rivera, Marcelino E; Ziegelmann, Matthew J; Elliott, Daniel S
2015-09-01
To evaluate long-term device outcomes following primary artificial urinary sphincter (AUS) implantation. We identified 1802 male patients with stress urinary incontinence who underwent AUS placement from 1983 to 2011. Of these, 1082 (60%) were primary implantations and comprise the study cohort. Multiple clinical and surgical variables were evaluated for potential association with treatment failure, defined as any secondary surgery. Patient follow-up was obtained through office examination, operative report, and written or telephone correspondence. Patients undergoing AUS implantation had a median age of 71 years (interquartile range 66-76) and median follow-up of 4.1 years (interquartile range 0.8-7.7). Overall, 338 of 1082 patients (31.2%) underwent secondary surgery, including 89 for device infection and/or erosion, 131 for device malfunction, 89 for urethral atrophy, and 29 for pump malposition or tubing complications. No patient-related risk factors were independently associated with an increased risk of secondary surgery on multivariable analysis. Secondary surgery-free survival was 90% at 1 year, 74% at 5 years, 57% at 10 years, and 41% at 15 years. Primary AUS implantation is associated with acceptable long-term outcomes. Recognition of long-term success is important for preoperative patient counseling. Copyright © 2015 Elsevier Inc. All rights reserved.
Poola, Ashwini Suresh; Rentea, Rebecca M; Weaver, Katrina L; St Peter, Shawn David
2017-05-01
While there is literature on techniques for pectus bar removal, there are limited reports on post-operative management. This can include obtaining a post-operative chest radiograph (CXR) despite the minimal risk of associated intra-thoracic complications. This is a review of our experience with bar removal without routine post-operative CXR. A single-institution retrospective chart review was performed from 2000 to 2015. Patients who underwent a pectus bar removal procedure were included. We assessed operative timing of bar placement and removal, procedure length, intra-operative and post-operative complications, and post-operative CXR findings, specifically the rate of pneumothoraces. A total of 450 patients were identified. Median duration of bar placement prior to removal was 35 months (interquartile range 30-36 months). Sixty-four patients obtained a post-operative CXR. Of these, only one film (1.6%) revealed a pneumothorax; this was not drained. A CXR was not obtained in 386 (86%) patients, with no immediate or delayed complications from this practice. Median follow-up time for all patients was 11 months (interquartile range 7.5-17 months). The risk of a clinically relevant pneumothorax is minimal following bar removal. This suggests that not obtaining routine imaging following bar removal may be a safe practice.
van der Westhuizen, J; Kuo, P Y; Reed, P W; Holder, K
2011-03-01
Gastric absorption of oral paracetamol (acetaminophen) may be unreliable perioperatively in the starved and stressed patient. We compared plasma concentrations of parenteral paracetamol given preoperatively and oral paracetamol given as premedication. Patients scheduled for elective ear, nose and throat surgery or orthopaedic surgery were randomised to receive either oral or intravenous paracetamol as preoperative medication. The oral dose was given 30 minutes before induction of anaesthesia and the intravenous dose given pre-induction. All patients were given a standardised anaesthetic by the same specialist anaesthetist, who took blood for paracetamol concentrations 30 minutes after the first dose and then at 30-minute intervals for 240 minutes. Therapeutic concentrations of paracetamol were reached in 96% of patients who had received the drug parenterally and 67% of patients who had received it orally. Maximum median plasma concentrations were 19 mg/l (interquartile range 15 to 23 mg/l) and 13 mg/l (interquartile range 0 to 18 mg/l) for the intravenous and oral groups, respectively. The difference between the intravenous and oral groups was less marked after 150 minutes, but the intravenous preparation gave higher plasma concentrations throughout the study period. It can be concluded that paracetamol gives more reliable therapeutic plasma concentrations when given intravenously.
The impact of a dedicated research education month for anesthesiology residents.
Freundlich, Robert E; Newman, Jessica W; Tremper, Kevin K; Mhyre, Jill M; Kheterpal, Sachin; Sanford, Theodore J; Tait, Alan R
2015-01-01
An educational intervention was implemented at the University of Michigan starting in 2008, in which anesthesiology interns complete a dedicated month-long didactic rotation in evidence-based medicine (EBM) and research methodology. We sought to assess its utility. Scores on a validated EBM test before and after the rotation were compared and assessed for significance of improvement. A survey was also given to gauge satisfaction with the quality of the rotation and self-reported improvement in understanding of EBM topics. Fourteen consecutive interns completed the research rotation during the study period. One hundred percent completed both the pre- and postrotation tests. The mean pretest score was 7.78 ± 2.46 (median 7.5 on a 0-15 scale; interquartile range 7.0-10.0) and the mean posttest score was 10.00 ± 2.35 (median 9.5; interquartile range 8.0-12.3), which represented a statistically significant increase (P = 0.011, Wilcoxon signed-rank test). All fourteen residents "agreed" or "strongly agreed" that they would recommend the course to future interns and that the course increased their ability to critically review the literature. Our findings demonstrate that this can be an effective means of improving understanding of EBM topics and anesthesiology research.
Potential Reporting Bias in Neuroimaging Studies of Sex Differences.
David, Sean P; Naudet, Florian; Laude, Jennifer; Radua, Joaquim; Fusar-Poli, Paolo; Chu, Isabella; Stefanick, Marcia L; Ioannidis, John P A
2018-04-17
Numerous functional magnetic resonance imaging (fMRI) studies have reported sex differences. To empirically evaluate this literature for evidence of excessive significance bias, we searched Medline and Scopus over 10 years for published fMRI studies of the human brain that evaluated sex differences, regardless of the topic investigated. We analyzed the prevalence of conclusions in favor of sex differences and the correlation between study sample sizes and the number of significant foci identified. In the absence of bias, larger (better powered) studies should identify a larger number of significant foci. Across 179 papers, the median sample size was n = 32 (interquartile range 23-47.5). A median of 5 foci related to sex differences was reported (interquartile range, 2-9.5). Few articles had titles focused on no differences (n = 2) or on similarities (n = 3) between the sexes. Overall, 158 papers (88%) reached "positive" conclusions in their abstract and presented some foci related to sex differences. There was no statistically significant relationship between sample size and the number of foci (-0.048% increase for every 10 participants, p = 0.63). The extremely high prevalence of "positive" results and the lack of the expected relationship between sample size and the number of discovered foci reflect probable reporting bias and excess significance bias in this literature.
Schmid, Sabrina; Goldberg-Bockhorn, Eva; Schwarz, Silke; Rotter, Nicole; Kassubek, Jan; Del Tredici, Kelly; Pinkhardt, Elmar; Otto, Markus; Ludolph, Albert C; Oeckl, Patrick
2018-06-01
In autopsy cases staged for sporadic Parkinson's disease (PD), the neuropathology is characterized by a preclinical phase that targets the enteric nervous system (ENS) of the gastrointestinal tract (GIT). Therefore, the ENS might be a source of potential (presymptomatic) PD biomarkers. In this clinically based study, we examined the alpha-synuclein (αSyn) concentration in an easily accessible protein storage medium of the GIT, dental calculus, in 21/50 patients with PD and 28/50 age- and gender-matched controls using ELISA. αSyn was detectable in dental calculus, and the median concentration in the control patients was 8.6 pg/mg calculus (interquartile range 2.6-13.1 pg/mg). αSyn concentrations were significantly influenced by blood contamination, and samples with a hemoglobin concentration of > 4000 ng/mL were excluded. There was no significant difference in αSyn concentrations in the dental calculus of PD patients (5.76 pg/mg, interquartile range 2.91-9.74 pg/mg) compared to those in controls (p = 0.40). The total αSyn concentration in dental calculus is not a suitable biomarker for sporadic PD. Disease-related variants such as oligomeric or phosphorylated αSyn in calculus might prove to be more specific.
Central obesity, leptin and cognitive decline: the Sacramento Area Latino Study on Aging.
Zeki Al Hazzouri, Adina; Haan, Mary N; Whitmer, Rachel A; Yaffe, Kristine; Neuhaus, John
2012-01-01
Central obesity is a risk factor for cognitive decline. Leptin is secreted by adipose tissue and has been associated with better cognitive function. Aging Mexican Americans have higher levels of obesity than non-Hispanic Whites, but no investigations have examined the relationship between leptin and cognitive decline among them or the role of central obesity in this association. We analyzed 1,480 dementia-free older Mexican Americans who were followed over 10 years. Cognitive function was assessed every 12-15 months with the Modified Mini Mental State Exam (3MSE) and the Spanish and English Verbal Learning Test (SEVLT). For females with a small waist circumference (≤35 inches), an interquartile range difference in leptin was associated with 35% fewer 3MSE errors and 22% less decline in the SEVLT score over 10 years. For males with a small waist circumference (≤40 inches), an interquartile range difference in leptin was associated with 44% fewer 3MSE errors and 30% less decline in the SEVLT score over 10 years. There was no association between leptin and cognitive decline among females or males with a large waist circumference. Leptin interacts with central obesity in shaping cognitive decline. Our findings provide valuable information about the effects of metabolic risk factors on cognitive function. Copyright © 2012 S. Karger AG, Basel.
Colacino, Justin A.; Arthur, Anna E.; Ferguson, Kelly K.; Rozek, Laura S.
2014-01-01
Chronic cadmium exposure may cause disease through induction of systemic oxidative stress and inflammation. Factors that mitigate cadmium toxicity and could serve as interventions in exposed populations have not been well characterized. We used data from the 2003–2010 National Health and Nutrition Examination Survey to quantify diet's role in modifying associations between cadmium exposure and oxidative stress and inflammation. We created a composite antioxidant and anti-inflammatory diet score (ADS) by ranking participants by quintile of intake across a panel of 19 nutrients. We identified associations and effect modification between ADS, urinary cadmium, and markers of oxidative stress and inflammation by multiple linear regression. An interquartile range increase in urinary cadmium was associated with a 47.5%, 8.8%, and 3.7% increase in C-reactive protein (CRP), gamma glutamyl transferase (GGT), and alkaline phosphatase (ALP), respectively. An interquartile range increase in ADS was associated with a 7.4%, 3.3%, 5.2%, and 2.5% decrease in CRP, GGT, ALP, and total white blood cell count, respectively, and a 3.0% increase in serum bilirubin. ADS significantly attenuated the association between cadmium exposure, CRP and ALP. Dietary interventions may provide a route to reduce the impact of cadmium toxicity on the population level. PMID:24607659
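As a rough illustration of the composite diet-score construction described above (quintile ranking across a 19-nutrient panel), the sketch below scores each nutrient 1-5 by quintile of intake and sums across the panel. The nutrient names, scoring direction, and omission of survey weighting are simplifying assumptions, not the paper's exact method.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Hypothetical intakes: rows = participants, columns = 19 nutrients
# standing in for the paper's antioxidant/anti-inflammatory panel.
nutrients = [f"nutrient_{i}" for i in range(1, 20)]
intake = pd.DataFrame(rng.lognormal(size=(500, 19)), columns=nutrients)

# Score each nutrient 1-5 by quintile of intake (assuming higher intake
# of a protective nutrient earns a higher score), then sum the panel.
quintile_scores = intake.apply(lambda col: pd.qcut(col, 5, labels=False) + 1)
ads = quintile_scores.sum(axis=1)  # possible range: 19 (low) to 95 (high)

print(ads.describe())
```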
Merino-Ingelmo, Raquel; Santos-de Soto, José; Coserria-Sánchez, Félix; Descalzo-Señoran, Alfonso; Valverde-Pérez, Israel
2014-05-01
Percutaneous pulmonary valvuloplasty is the preferred interventional procedure for pulmonary valve stenosis. The aim of this study was to evaluate the effectiveness of this technique, assess the factors leading to its success, and determine the long-term results in the pediatric population. The study included 53 patients with pulmonary valve stenosis undergoing percutaneous balloon valvuloplasty between December 1985 and December 2000. Right ventricular size and functional echocardiographic parameters, such as pulmonary regurgitation and residual transvalvular gradient, were assessed during long-term follow-up. Peak-to-peak transvalvular gradient decreased from 74 mmHg [interquartile range, 65-100 mmHg] to 20 mmHg [interquartile range, 14-34 mmHg]. The procedure was unsuccessful in 2 patients (3.77%). The immediate success rate was 73.58%. Follow-up ranged from 10 years to 24 years (median, 15 years). During follow-up, all patients developed late pulmonary regurgitation, which was assessed as grade II in 58.4% and grade III in 31.2%. There was only 1 case of long-term restenosis (2.1%). Severe right ventricular dilatation was observed in 27.1% of the patients. None of the patients developed significant right ventricular dysfunction. Pulmonary valve replacement was not required in any of the patients. Percutaneous balloon valvuloplasty is an effective technique in the treatment of pulmonary valve stenosis with good long-term results. Copyright © 2013 Sociedad Española de Cardiología. Published by Elsevier España. All rights reserved.
Mendoza, Rosario; Tolentino-Mayo, Lizbeth; Hernández-Barrera, Lucia; Nieto, Claudia; Monterrubio-Flores, Eric A; Barquera, Simón
2018-01-19
A Mexican Committee of Nutrition Experts (MCNE) from the National Institute of Public Health (INSP), free from conflict of interest, established food content standards to place the front-of-package (FOP) logo on foods that meet these nutrition criteria. The objectives were to simulate the effect on nutrient intake in the Mexican adult population (20-59 years old) after replacing commonly consumed processed foods with those that meet the FOP nutrition-labeling criteria. Twenty-four hour dietary recalls were collected from the 2012 Mexican National Health and Nutrition Survey ( n = 2164 adults). A food database from the INSP was used. Weighted medians and 25-75 inter-quartile ranges (IQR) of energy and nutrient intake were calculated for all subjects by sociodemographic characteristics before and after replacing foods. Significant decreases were observed in energy (-5.4%), saturated fatty acids (-18.9%), trans-fatty acids (-20%), total sugar (-36.8%) and sodium (-10.7%) intake and a significant increase in fiber intake (+15.5%) after replacing foods, using the MCNE nutrition criteria. Replacing commonly consumed processed foods in the diet with foods that meet the FOP nutrition-labeling criteria set by the MCNE can lead to improvements in energy and nutrient intake in the Mexican adult population.
Anaerobic antibiotic usage for pneumonia in the medical intensive care unit.
Kioka, Mutsumi J; DiGiovine, Bruno; Rezik, Mohamed; Jennings, Jeffrey H
2017-11-01
Pneumonia is a common admitting diagnosis in the intensive care unit (ICU). When aspiration is suspected, antibiotics to cover anaerobes are frequently used, but in the absence of clear risk factors, current guidelines have questioned their role. It is unknown how frequently these guidelines are followed. We conducted a single-centre observational study on practice patterns of anaerobic antibiotic use in consecutive patients admitted to the ICU with aspiration pneumonia (Asp), community-acquired pneumonia (CAP) and healthcare-associated pneumonia (HCAP). A total of 192 patients were studied (Asp: 20, HCAP: 107, CAP: 65). Overall, 59 patients received anaerobic antibiotics (Asp: 90%, HCAP: 28%, CAP 17%) but a significant proportion of these patients did not meet criteria to receive them. Inappropriate anaerobic antibiotic use was 12/20 for Asp, 27/107 for HCAP and 9/65 for CAP. Mortality probability model III at zero hours (MPM0) score and a diagnosis of Asp were predictors of receiving inappropriate anaerobic antibiotics. Receiving inappropriate anaerobic antibiotics was associated with a longer ICU length of stay (LOS): 7 days (interquartile range [IQR] 7-21) vs 4 days (IQR 2-9), P = 0.017. For patients in the ICU admitted with pneumonia, there is a high occurrence of inappropriately prescribed anaerobic antibiotics, the use of which was associated with a longer ICU LOS. © 2017 Asian Pacific Society of Respirology.
Albawardi, Nada M.; Jradi, Hoda; Almalki, Abdulla A.; Al-Hazzaa, Hazzaa M.
2017-01-01
Research in Saudi Arabia has revealed a strikingly high prevalence of insufficient physical activity among adults, particularly women. The risk of sedentary behavior will likely increase as the number of women with office-based jobs increases. The aim of this study is to determine the level of sedentary behavior, and its associated factors, among Saudi women working office-based jobs in the city of Riyadh. In a cross-sectional study, 420 Saudi female employees at 8 office-based worksites were measured to determine body mass index and given a self-administered survey to evaluate their level of physical activity and sedentary behavior. Median sitting time on work days was 690 min per day (interquartile range, IQR 541–870), with nearly half accumulated during work hours, and 575 min per day (IQR 360–780) on non-work days. Predictors of work day sitting time were level of education, number of children, and working in the private sector. Number of children, being single, and living in a small home predicted non-work day sitting time. This study identifies Saudi women in office-based jobs as a high-risk group for sedentary behavior. There is a need to promote physical activity at worksites and reduce prolonged sitting. PMID:28629200
Household ventilation and tuberculosis transmission in Kampala, Uganda.
Chamie, G; Wandera, B; Luetkemeyer, A; Bogere, J; Mugerwa, R D; Havlir, D V; Charlebois, E D
2013-06-01
To test the feasibility of measuring household ventilation and evaluate whether ventilation is associated with tuberculosis (TB) in household contacts in Kampala, Uganda. Adults with pulmonary TB and their household contacts received home visits to ascertain social and structural household characteristics. Ventilation was measured in air changes per hour (ACH) in each room by raising carbon dioxide (CO₂) levels using dry ice, removing the dry ice, and measuring changes in the natural log of CO₂ (lnCO2) over time. Ventilation was compared in homes with and without co-prevalent TB. Members of 61 of 66 (92%) households approached were enrolled. Households averaged 5.4 residents/home, with a median of one room/home. Twelve homes (20%) reported co-prevalent TB in household contacts. Median ventilation for all rooms was 14 ACH (interquartile range [IQR] 10-18). Median ventilation was 12 vs. 15 ACH in index cases' sleeping rooms in households with vs. those without co-prevalent TB (P = 0.12). Among smear-positive indexes not infected by the human immunodeficiency virus (HIV), median ventilation was 11 vs. 17 ACH in index cases' sleeping rooms in homes with vs. those without co-prevalent TB (P = 0.1). Our findings provide evidence that a simple CO₂ decay method used to measure ventilation in clinical settings can be adapted to homes, adding a novel tool and a neglected variable, ventilation, to the study of household TB transmission.
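The ventilation measurement described above amounts to fitting an exponential decay: after the dry ice is removed, CO₂ relaxes toward the outdoor baseline, and the air-change rate is the negative slope of ln(CO₂ − baseline) against time. A minimal sketch with invented readings and an assumed 400 ppm outdoor baseline:

```python
import numpy as np

# Hypothetical CO2 readings (ppm) taken every 2 minutes after the dry
# ice is removed; 400 ppm is an assumed outdoor baseline, not a value
# reported in the study.
t_hours = np.arange(0, 22, 2) / 60.0
co2_ppm = np.array([3000, 2400, 1950, 1600, 1330, 1120,
                    960, 840, 750, 680, 630])
baseline = 400.0

# Tracer-gas decay: C(t) - C_out = (C0 - C_out) * exp(-ACH * t),
# so ln(C - C_out) falls linearly with time and ACH = -slope (per hour).
slope, intercept = np.polyfit(t_hours, np.log(co2_ppm - baseline), 1)
ach = -slope
print(f"Estimated ventilation: {ach:.1f} air changes per hour")
```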
Ji, Liang; Wang, Gang; Li, Le; Li, Yi-Long; Hu, Ji-Sheng; Zhang, Guang-Quan; Chen, Hong-Ze; Chen, Hua; Kong, Rui; Bai, Xue-Wei; Sun, Bei
2018-04-01
This study aimed to assess the need for surgical necrosectomy after percutaneous catheter drainage (PCD) for infected necrotizing pancreatitis. The clinical data of patients with documented or suspected infected necrotizing pancreatitis who were treated with a step-up approach were extracted and analyzed. Of the 329 patients enrolled, the initial PCD was performed at 12 (interquartile range, 9-15) days after onset, and 35.3% were cured by PCD alone. In the pre-PCD model, mean computed tomographic (CT) density of the necrotic fluid collection (NFC; P < 0.001) and multiple-organ failure (MOF; P < 0.001) within 24 hours before the initial PCD were independent risk factors, and a combination of these 2 factors produced an area under the curve of 0.775. In the post-PCD model, mean CT density of the NFC (P = 0.041), MOF (P = 0.002), and serum procalcitonin level (P = 0.035) 3 days after the initial PCD were independent risk factors, and a combination of these 3 factors produced an area under the curve of 0.642. Both mean CT density of the NFC and MOF are independent pre- and post-PCD risk factors for the need for necrosectomy after PCD. The post-PCD serum procalcitonin level might be a respondent factor that is correlated with the necessity of necrosectomy.
Levac, Danielle; McCormick, Anna; Levin, Mindy F; Brien, Marie; Mills, Richard; Miller, Elka; Sveistrup, Heidi
2018-02-01
To compare changes in gross motor skills and functional mobility between ambulatory children with cerebral palsy who underwent a 1-week clinic-based virtual reality intervention (VR) followed by a 6-week, therapist-monitored home active video gaming (AVG) program and children who completed only the 6-week home AVG program. Pilot non-randomized controlled trial. Five children received 1 hour of VR training for 5 days, followed by a 6-week home AVG program supervised online by a physical therapist. Six children completed only the 6-week home AVG program. The Gross Motor Function Measure Challenge Module (GMFM-CM) and Six Minute Walk Test (6MWT) evaluated change. There were no significant differences between groups. The home AVG-only group demonstrated a statistically and clinically significant improvement in GMFM-CM scores following the 6-week AVG intervention (median difference 4.5 points, interquartile range [IQR] 4.75, p = 0.042). The VR + AVG group demonstrated a statistically and clinically significant decrease in 6MWT distance following the intervention (median decrease 68.2 m, IQR 39.7 m, p = 0.043). All 6MWT scores returned to baseline at 2 months post-intervention. Neither intervention consistently improved outcomes in this small sample. Online mechanisms to support therapist-child communication for exercise progression were insufficient to individualize exercise challenge.
Epidemiology and prognosis of coma in daytime television dramas
Casarett, David; Fishman, Jessica M; MacMoran, Holly Jo; Pickard, Amy; Asch, David A
2005-01-01
Objective To determine how soap operas portray, and possibly misrepresent, the likelihood of recovery for patients in coma. Design Retrospective cohort study. Setting Nine soap operas in the United States reviewed between 1 January 1995 and 15 May 2005. Subjects 64 characters who experienced a period of unconsciousness lasting at least 24 hours. Their final status at the end of the follow-up period was compared with pooled data from a meta-analysis. Results Comas lasted a median of 13 days (interquartile range 7-25 days). Fifty-seven (89%) patients recovered fully, five (8%) died, and two (3%) remained in a vegetative state. Mortality for non-traumatic and traumatic coma was significantly lower than would be predicted from the meta-analysis data (non-traumatic 4% v 53%; traumatic 6% v 67%; Fisher's exact test, both P < 0.001). On the day that patients regained consciousness, most (49/57; 86%) had no evidence of limited function, cognitive deficit, or residual disability needing rehabilitation. Compared with meta-analysis data, patients in this sample had a much better than expected chance of returning to normal function (non-traumatic 91% v 1%; traumatic 89% v 7%; both P < 0.001). Conclusions The portrayal of coma in soap operas is overly optimistic. Although these programmes are presented as fiction, they may contribute to unrealistic expectations of recovery. PMID:16373744
Abd ElHafeez, Samar; Tripepi, Giovanni; Quinn, Robert; Naga, Yasmine; Abdelmonem, Sherif; AbdelHady, Mohamed; Liu, Ping; James, Matthew; Zoccali, Carmine; Ravani, Pietro
2017-12-07
Epidemiology of acute kidney injury (AKI) in developing countries is under-studied. We evaluated the risk and prognosis of AKI in patients admitted to intensive care units (ICUs) in Egypt. We recruited consecutive adults admitted to ICUs in Alexandria Teaching Hospitals over six months. We used the KDIGO criteria for AKI. We followed participants until the earliest of ICU discharge, death, day 30 from entry, or study end. Of the 532 participants (median age 45 [interquartile range, IQR: 30-62] years, 41.7% male, 23.7% diabetics), 39.6% had AKI at ICU admission and 37.4% developed AKI after 24 hours of ICU admission. Previous need of diuretics, sepsis and low education were associated with AKI at ICU admission; the APACHE II score independently predicted AKI after ICU admission. A total of 120 (22.6%) patients died during 30-day follow-up. Compared to patients who remained AKI-free, mortality was significantly higher in patients who had AKI at study entry (Hazard Ratio [HR] 2.14; 95% Confidence Interval [CI] 1.02-4.48) or developed AKI in the ICU (HR 2.74; 95% CI 1.45-5.17). The risk of AKI is high in critically ill people and predicts poor outcomes. Further studies are needed to estimate the burden of AKI among patients before ICU admission.
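For readers unfamiliar with the KDIGO definition used above, AKI is flagged when any one criterion is met: a serum creatinine rise of at least 0.3 mg/dL within 48 hours, a creatinine of at least 1.5 times the baseline from the prior 7 days, or urine output below 0.5 mL/kg/h sustained for 6 hours. A simplified sketch follows; the function name and scalar inputs are ours, and real screening works over time-stamped measurement series.

```python
def kdigo_aki(creatinine_now, creatinine_48h_ago, baseline_7d,
              urine_ml_kg_h_6h=None):
    """Flag AKI by the KDIGO definition (any one criterion suffices):
    creatinine rise >= 0.3 mg/dL within 48 h, creatinine >= 1.5x the
    7-day baseline, or urine output < 0.5 mL/kg/h sustained for 6 h."""
    if creatinine_now - creatinine_48h_ago >= 0.3:
        return True
    if creatinine_now >= 1.5 * baseline_7d:
        return True
    if urine_ml_kg_h_6h is not None and urine_ml_kg_h_6h < 0.5:
        return True
    return False

print(kdigo_aki(1.4, 1.0, 1.0))        # True: +0.4 mg/dL within 48 h
print(kdigo_aki(1.2, 1.1, 1.1, 0.7))   # False: no criterion met
```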
Oei, Ju Lee; Abdel-Latif, Mohamed E; Craig, Fiona; Kee, Aivy; Austin, Marie-Paule; Lui, Kei
2009-04-01
The aim of the present study was to determine the characteristics and short-term outcomes of mother-infant pairs with comorbid drug dependency and psychiatric disorders. A population-based retrospective chart review was carried out of 879 drug-dependent mother-infant pairs in New South Wales (NSW) and the Australian Capital Territory (ACT) who delivered between 1 January and 31 December 2004. Psychiatric comorbidity (dual diagnosis, DD) was identified in 396 (45%) of the 879 drug-dependent women. DSM-IV depression (79%), followed by anxiety (20%), was most prevalent. DD women were more likely to use amphetamines (29% vs 18%, p < 0.05), less likely to use opiates (42% vs 51%, p < 0.05), and more likely to have had no antenatal care (24% vs 8%, p < 0.05). They also had more previous pregnancies (median 4, range 2-5, vs 3, range 2-5; p < 0.05), and domestic violence (29% vs 14%, p < 0.05) was more common. DD infants were less likely to be admitted to a nursery (47% vs 55%, p < 0.05). Withdrawal scores were similar (maximum median Finnegan scores 4 [interquartile range 3-8] vs 10 [interquartile range 7-12], p = 0.30), but fewer needed withdrawal medication (19% vs 27%, p < 0.05). Maternal psychotropic agents did not worsen the severity of neonatal withdrawal. Psychiatric comorbidity, especially depression, is common and affects almost half of drug-using mothers. Antenatal care, drug use and social outcomes are worse for DD mothers and their infants. It is recommended that all drug-using women be assessed antenatally for psychosocial disorders so that timely mental health intervention can be offered, if required.
de Waard, Claudia S; Poot, Antonius J; den Elzen, Wendy P J; Wind, Annet W; Caljouw, Monique A A; Gussekloo, Jacobijn
2018-06-01
Understanding patient satisfaction from the perspective of older adults is important to improve the quality of their care. Since patient and care variables that can be influenced are of specific interest, this study examines the relation between patient satisfaction and the perceived doctor-patient relationship in older persons and their general practitioners (GPs). Cross-sectional survey. Older persons (n = 653, median age 87 years; 69.4% female) living in 41 residential homes. Patient satisfaction (report mark) and perceived doctor-patient relationship (Leiden Perioperative care Patient Satisfaction questionnaire); relationships were examined by comparing medians and by using regression models. The median satisfaction score was 8 (interquartile range 7.5-9; range 0-10) and the median doctor-patient relationship score was 65 (interquartile range 63-65; range 13-65). Higher satisfaction scores were related to higher scores on the doctor-patient relationship (Jonckheere-Terpstra test, p for trend <.001), independent of gender, age, duration of stay in the residential home, and functional and clinical characteristics. Adjusted for these characteristics, per additional point for doctor-patient relationship, satisfaction increased by 0.103 points (β = 0.103, 95% CI 0.092-0.114; p < .001). In those with a 'low' doctor-patient relationship rating, the percentage awarding 'sufficient or good' to their GP was 12% for 'understanding about the personal situation', 22% for 'receiving attention as an individual', 78% for treating the patient kindly, and 94% for being polite. In older persons, the perceived doctor-patient relationship and patient satisfaction are related, irrespective of patient characteristics. GPs may improve patient satisfaction by focusing more on the affective aspects of the doctor-patient relationship. Key Points: Examination of the perceived doctor-patient relationship as a variable might better accommodate patients' expectations and improve satisfaction with the provided primary care.
Increased prevalence of gallbladder polyps in acromegaly.
Annamalai, Anand K; Gayton, Emma L; Webb, Alison; Halsall, David J; Rice, Caiomhe; Ibram, Ferda; Chaudhry, Afzal N; Simpson, Helen L; Berman, Laurence; Gurnell, Mark
2011-07-01
Several studies have suggested an increased prevalence of benign and malignant tumors in acromegaly, particularly colonic neoplasms. The gallbladder's epithelial similarity to the colon raises the possibility that gallbladder polyps (GBP) may occur more frequently in acromegaly. Thirty-one patients with newly diagnosed acromegaly (14 females, 17 males; mean age 54.7 yr, range 27-76 yr) were referred to our center between 2004 and 2008. All had pituitary adenomas and were treated with somatostatin analogs prior to transsphenoidal surgery. Biliary ultrasonography was performed at the time of referral. In a retrospective case-cohort study, we compared the prevalence of GBP in these scans with that in 13,234 consecutive patients (age range 20-80 yr) presenting at the hospital for abdominal/biliary ultrasound during the same time interval. Associations between GH and IGF-I levels and GBP in acromegaly were also examined. There was a higher prevalence of GBP in patients with acromegaly compared with controls (29.03% vs 4.62%, P = 0.000008); the relative risk was 6.29 (95% confidence interval 3.61-10.96). Eight of nine patients with acromegaly and GBP were older than 50 yr of age. GH levels were higher in those with GBP (median 30.8 μg/liter, interquartile range 10.9-39.1) than in those without (8.2 μg/liter, interquartile range 6.0-16.0), but IGF-I levels were comparable. This is the first study to demonstrate an increased prevalence of GBP in patients with newly diagnosed acromegaly. Further studies are required to determine whether these patients are at increased risk of developing gallbladder carcinoma and to define the role, if any, of biliary ultrasound surveillance.
NASA Astrophysics Data System (ADS)
Denis-Bacelar, Ana M.; Chittenden, Sarah J.; Murray, Iain; Divoli, Antigoni; McCready, V. Ralph; Dearnaley, David P.; O'Sullivan, Joe M.; Johnson, Bernadette; Flux, Glenn D.
2017-04-01
Skeletal tumour burden is a biomarker of prognosis and survival in cancer patients. This study proposes a novel method based on the linear quadratic model to predict the reduction in metastatic tumour burden as a function of the absorbed doses delivered from molecular radiotherapy treatments. The range of absorbed doses necessary to eradicate all the bone lesions and to reduce the metastatic burden was investigated in a cohort of 22 patients with bone metastases from castration-resistant prostate cancer. A metastatic burden reduction curve was generated for each patient, which predicts the reduction in metastatic burden as a function of the patient mean absorbed dose, defined as the mean of all the lesion absorbed doses in any given patient. In the patient cohort studied, the median of the patient mean absorbed dose predicted to reduce the metastatic burden by 50% was 89 Gy (interquartile range: 83-105 Gy), whilst a median of 183 Gy (interquartile range: 107-247 Gy) was found necessary to eradicate all metastases in a given patient. The absorbed dose required to eradicate all the lesions was strongly correlated with the variability of the absorbed doses delivered to multiple lesions in a given patient (r = 0.98, P < 0.0001). The metastatic burden reduction curves showed a potential large reduction in metastatic burden for a small increase in absorbed dose in 91% of patients. The results indicate the range of absorbed doses required to potentially obtain a significant survival benefit. The metastatic burden reduction method provides a simple tool that could be used in routine clinical practice for patient selection and to indicate the required administered activity to achieve a predicted patient mean absorbed dose and reduction in metastatic tumour burden.
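To make the mechanics of such a burden-reduction curve concrete, the sketch below applies a linear-only cell-kill term (the quadratic component is largely suppressed at the protracted dose rates of molecular radiotherapy) and counts a lesion as eradicated when its expected surviving clonogens fall below one. All lesion doses, volumes, clonogen density, and the radiosensitivity alpha are illustrative assumptions, not the paper's fitted values.

```python
import numpy as np

# Hypothetical lesion doses per Gy of patient mean absorbed dose, and
# lesion volumes; the spread across lesions is what drives the dose
# needed to eradicate all of them.
rel_doses = np.array([0.4, 0.75, 1.1, 1.5, 2.1])
volumes = np.array([5.0, 2.0, 8.0, 1.5, 3.0])    # mL
alpha = 0.2                                       # Gy^-1, linear kill only
clonogens = 1e7 * volumes                         # assumed 1e7 cells/mL

for mean_dose in [50, 100, 150, 200, 300]:        # patient mean dose, Gy
    doses = rel_doses * mean_dose / rel_doses.mean()
    surviving = clonogens * np.exp(-alpha * doses)
    eradicated = surviving < 1.0                  # tumour-control criterion
    reduction = volumes[eradicated].sum() / volumes.sum()
    print(f"{mean_dose:3d} Gy mean dose -> "
          f"{eradicated.sum()}/5 lesions eradicated, "
          f"burden reduced {reduction:.0%}")
```

Running the sketch shows the pattern the abstract describes: a small increase in mean dose can tip several lesions past the control threshold at once, while one poorly irradiated lesion can dominate the dose needed for complete eradication.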
Gupta, Ayush; Kapil, Arti; Kabra, S K; Lodha, Rakesh; Sood, Seema; Dhawan, Benu; Das, Bimal K; Sreenivas, V
2013-12-01
Healthcare associated infections (HAIs) are responsible for morbidity and mortality among immunocompromised and critically ill patients. We undertook this study to estimate the burden of HAIs in paediatric cancer patients in a tertiary care hospital in north India. This prospective, observational study, based on active surveillance over a period of 11 months, was undertaken in a 4-bedded isolation cubicle for paediatric cancer patients. Patients who stayed in the cubicle for ≥48 h were followed prospectively for the development of HAIs. Of the 138 patients, 13 developed 14 episodes of HAIs during the study period. A total of 1273 patient-days were recorded. The crude infection rate (CIR) and incidence density (ID) of all HAIs were 9.4/100 patients and 11/1000 patient-days, respectively. Of the 14 episodes of HAIs, seven (50%) were blood stream infections (HA-BSI), five (36%) pneumonia (HAP) and two (14%) urinary tract infections (HA-UTI). The CIRs of HA-BSI, HAP and HA-UTI were 5.1, 3.6 and 1.4/100 patients, respectively. The corresponding IDs were 5.5, 3.9 and 1.6/1000 patient-days, respectively. Mean length of stay was significantly higher in patients who developed an HAI [13.8 days (range 7-30); median (interquartile range) 12 (11-14)] vs 7.5 days [range 2-28; median (interquartile range) 7 (5-9); P < 0.0001]. Mortality was also significantly higher in patients who developed an HAI [23% (3/13) vs 3% (4/125), P < 0.05]. The incidence of HAIs in the paediatric cancer patients in the study was 11/1000 patient-days, of which HA-BSIs were the commonest. HAIs were associated with an increase in morbidity and mortality amongst this high-risk patient population.
Hays, Ron D; Tarver, Michelle E; Spritzer, Karen L; Reise, Steve; Hilmantel, Gene; Hofmeister, Elizabeth M; Hammel, Keri; May, Jeanine; Ferris, Frederick; Eydelman, Malvina
2017-01-01
Patient-reported outcome (PRO) measures for laser in situ keratomileusis (LASIK) are needed. To develop PRO measures to assess satisfaction, eye-related symptoms, and their effect on functioning and well-being following LASIK based on patient and expert input. The Patient-Reported Outcomes With LASIK (PROWL) studies were prospective observational studies of patients undergoing LASIK surgery for myopia, hyperopia, or astigmatism. PROWL-1 was a single-center study of active-duty US Navy personnel and PROWL-2 was a 5-center study of civilians. PROWL-1 enrolled 262 active-duty service personnel and PROWL-2 enrolled 312 civilians 21 years or older who spoke English; 241 individuals in PROWL-1 and 280 in PROWL-2 completed a baseline questionnaire before surgery. The analytic sample included those also completing 1 or more follow-up questionnaires: 240 (99.6%) of those in PROWL-1 and 271 (94.4%) of those in PROWL-2. Questionnaires were self-administered through the internet preoperatively and at 1 and 3 months postoperatively in both studies and at 6 months postoperatively in PROWL-1. PROWL-1 began in August 2011 and was completed May 30, 2014; PROWL-2 began in July 2012 and was completed June 27, 2014. Data were analyzed from June 28, 2014, to October 24, 2016. Scales assessing visual symptoms (double images, glare, halos, and starbursts), dry eye symptoms, satisfaction with vision, and satisfaction with LASIK surgery. Items from the National Eye Institute (NEI) Refractive Error Quality of Life Instrument (NEI-RQL-42), NEI Visual Function Questionnaire (NEI-VFQ), and the Ocular Surface Disease Index (OSDI) were included. All scales are scored on a 0 to 100 possible range. Construct validity and responsiveness to change were evaluated (comparing scores before and after surgery). The median age of the 240-person PROWL-1 analytic sample was 27 years (range, 21-52 years); 49 were women (20.4%). The median age of the 271-person PROWL-2 analytic sample was 30 years (range, 21-57 years); 147 were women (54.2%). Internal consistency reliabilities for the 4 visual symptom scales ranged from 0.96 to 0.98 in PROWL-1 and from 0.95 to 0.97 in PROWL-2. The median (interquartile range) test-retest intraclass correlation was 0.69 (0.57-0.79) and 0.76 (0.68-0.84) in PROWL-1 and PROWL-2, respectively. Product-moment correlations of satisfaction with surgery with visual symptom scales at follow-up evaluations ranged from r = 0.24 to r = 0.49. Measures improved from baseline to follow-up, with effect sizes of 0.14 to 1.98, but scores on the NEI-RQL-42 glare scale worsened at the 1-month follow-up. Hours of work did not change significantly from baseline to 1-month follow-up, with the mean number (mean [SD] difference) in PROWL-1 of 41.7 vs 40.9 hours (-0.8 [18.7]) and in PROWL-2 of 38.8 vs 38.2 hours (-0.6 [17.1]). The results of these studies support the reliability and validity of visual symptom scales to evaluate the effects of LASIK surgery in future studies.
Boutin, A
2017-01-01
BACKGROUND: Febrile neonates are at high risk of morbidity and mortality from infectious causes. This risk further increases if antibiotics are not received in a timely manner. Current guidelines recommend early initiation (less than 1 hour) of antibiotics for patients with severe sepsis. Time-to-antibiotic administration (TAA) should also be targeted as a quality-of-care (QOC) measure for febrile neonates. A previous evaluation showed that most of these patients were not receiving antibiotics in the first hour at our emergency department (ED). OBJECTIVES: We evaluated whether a simple quality improvement protocol would improve the proportion of febrile neonates receiving antibiotics within 60 minutes of arrival to the ED. DESIGN/METHODS: This was a pre-post intervention study conducted in the ED of an academic pediatric tertiary care hospital with an annual volume of approximately 83,000 patients in 2014-2016. Participants were a random sample of all children younger than 28 days old visiting the ED for a febrile illness. The new protocol, in which nurses placed patients directly in the resuscitation room after triage for immediate assessment by a physician, was implemented in February 2016. Previously, these children were triaged level 2 on the Canadian Triage and Acuity Scale (CTAS), flagged, and placed in a regular examination room to await physician assessment. With the new protocol, IV access, blood culture, and urine analysis and culture were immediately obtained by the nurse in charge, concomitantly with assessment by the attending physician. Forty charts prior to and 50 charts after protocol initiation were reviewed by an archivist using a standardized form, covering 2014-2015 and 2016, respectively. The primary outcome was TAA, defined as the time from initial ED registration to the beginning of antibiotic infusion. As a secondary outcome, all cases were reviewed individually to determine barriers to rapid antibiotic administration (day, evening, or night shifts; other treatments or investigations; number of attempts at intravenous access) and to elicit new quality improvement strategies. RESULTS: During the study periods a total of 178 (pre) and 135 (post) patients fulfilled the inclusion criteria. Among the random samples, 6/50 (12%) of patients received their antibiotics within 60 minutes in the post-intervention period compared to 0/40 (0%) in the pre-implementation period (difference 12%; 95% CI: 1-24%). Within 90 minutes, the proportion improved from 1/40 (2.5%) to 29/50 (58%) (difference 56%; 95% CI: 38-68%). Median TAA in febrile neonates decreased from 182 minutes (interquartile range, 147-219 minutes) in the pre-implementation period to 85 minutes (interquartile range, 73-115 minutes) in the post-implementation period. The main obstacles to the goal of 60 minutes for TAA were difficulty obtaining IV access and antibiotic availability. CONCLUSION: In this study, a new protocol mandating the immediate transfer of febrile neonates from triage to the resuscitation room improved the proportion of febrile neonates receiving antibiotics in less than 60 minutes in our ED. Our results suggest that simple interventions can reduce TAA in a selected group of patients presenting to the ED.
Bosdou, J K; Venetis, C A; Dafopoulos, K; Zepiridis, L; Chatzimeletiou, K; Anifandis, G; Mitsoli, A; Makedos, A; Messinis, I E; Tarlatzis, B C; Kolibianakis, E M
2016-05-01
Does pretreatment with transdermal testosterone increase the number of cumulus-oocyte complexes (COCs) retrieved by more than 1.5 in poor responders undergoing intracytoplasmic sperm injection (ICSI), using recombinant follicle stimulating hormone (FSH) and gonadotrophin releasing hormone agonists (GnRHa)? Testosterone pretreatment failed to increase the number of COCs by more than 1.5 as compared with no pretreatment in poor responders undergoing ICSI (difference between medians: 0.0, 95% CI: -1.0 to +1.0). Androgens are thought to play an important role in early follicular development by enhancing ovarian sensitivity to FSH. In a recent meta-analysis, testosterone pretreatment resulted in an increase of 1.5 COCs as compared with no pretreatment. However, this effect was based on the analysis of only two randomized controlled trials (RCTs) including 163 patients. Evidently, there is a need for additional RCTs that will allow firmer conclusions to be drawn. The present RCT was designed to detect a difference of 1.5 COCs (sample size required = 48 patients). From 02/2014 until 04/2015, 50 poor responders fulfilling the Bologna criteria were randomized (using a randomization list) to either testosterone pretreatment for 21 days (n = 26) or no pretreatment (n = 24). All patients underwent a long follicular GnRHa protocol. Recombinant FSH stimulation was started on Day 22 following GnRHa initiation. In the testosterone pretreatment group, a daily dose of 10 mg of testosterone gel was applied transdermally for 21 days starting from GnRHa initiation. Results are expressed as median (interquartile range). No differences in baseline characteristics were observed between the two groups compared. Testosterone levels [median (interquartile range)] were significantly higher in the testosterone pretreatment group on the day of initiation of FSH stimulation [114 (99.5) ng/dl versus 20 (20) ng/dl, respectively, P < 0.001]. Duration of FSH stimulation [median (interquartile range)] was similar between the groups compared [12.5 (3.0) days versus 12 (3.0) days, respectively, P = 0.52]. The number of COCs retrieved [median (interquartile range)] was not different between the testosterone pretreatment and the no pretreatment groups [3.5 (4.0) versus 3.0 (3.0), 95% CI for the median: 2.0-5.0 versus 2.7-4.3, respectively; difference between medians: 0.0, 95% CI: -1.0 to +1.0]. Similarly, no differences were observed regarding fertilization rates [median (interquartile range)] [66.7% (32.5) versus 66.7% (42.9), respectively, P = 0.97] and live birth rates per randomized patient (7.7% versus 8.3%, respectively, rate difference: -0.6%, 95% CI: -19.0 to +16.9). The study was not powered to detect differences less than 1.5 COCs, although it is doubtful whether these differences would be clinically relevant. Moreover, due to sample size restrictions, no conclusions can be drawn regarding the probability of live birth. The results of this randomized clinical trial, suggesting that pretreatment with 10 mg of transdermal testosterone for 21 days does not improve ovarian response by more than 1.5 oocytes, could be used to more accurately consult patients with poor ovarian response. However, an improvement in IVF outcome using a higher dose of testosterone or a longer pretreatment period cannot be excluded. The study was partially funded by a Scholarship from the Academy of Athens. C.A.V. reports personal fees and non-financial support from Merck Sharp & Dohme, personal fees and non-financial support from Merck Serono, personal fees and non-financial support from IPSEN Hellas S.A., outside the submitted work. B.C.T. reports grants from Merck Serono, grants from Merck Sharp & Dohme, personal fees from Merck Serono, personal fees from Merck Sharp & Dohme, personal fees from IBSA & Ferring, outside the submitted work. NCT01961336. 10 October 2013. 02/2014. © The Author 2016. Published by Oxford University Press on behalf of the European Society of Human Reproduction and Embryology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
76 FR 50 - Airworthiness Directives; Airbus Model A310 Series Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-03
[Garbled record excerpt: compliance-time tables (Tables 1 and 2) for paragraph (g) of this AD, keyed to airplane configuration and average flight time (AFT) — short/medium range use (AFT equal to or less than 3.6 hours) versus long range use (AFT exceeding 3.6 hours) — including a detailed inspection for long-range Configurations 06, 07, and 08 due at 30,300 flight hours or an elided earlier limit, whichever occurs first.]
Emergency Department Overcrowding and Ambulance Turnaround Time
Lee, Yu Jin; Shin, Sang Do; Lee, Eui Jung; Cho, Jin Seong; Cha, Won Chul
2015-01-01
Objective The aims of this study were to describe overcrowding in regional emergency departments in Seoul, Korea and evaluate the effect of crowdedness on ambulance turnaround time. Methods This study was conducted between January 2010 and December 2010. Patients who were transported by 119-responding ambulances to 28 emergency centers within Seoul were eligible for enrollment. Overcrowding was defined as the average occupancy rate, which was equal to the average number of patients staying in an emergency department (ED) for 4 hours divided by the number of beds in the ED. After selecting groups for final analysis, multi-level regression modeling (MLM) was performed with random effects for EDs, to evaluate associations between occupancy rate and turnaround time. Results Between January 2010 and December 2010, 163,659 patients transported to 28 EDs were enrolled. The median occupancy rate was 0.42 (range: 0.10-1.94; interquartile range [IQR]: 0.20-0.76). Overcrowded EDs were more likely to have older patients, those with normal mentality, and non-trauma patients. Overcrowded EDs were more likely to have longer turnaround intervals and traveling distances. The MLM analysis showed that an increase of 1% in occupancy rate was associated with a 0.02-minute decrease in turnaround interval (95% CI: 0.01 to 0.03). In subgroup analyses limited to EDs with occupancy rates over 100%, we also observed a 0.03-minute decrease in turnaround interval per 1% increase in occupancy rate (95% CI: 0.01 to 0.05). Conclusions In this study, we found wide variation in emergency department crowding in a metropolitan Korean city. Our data indicate that ED overcrowding is negatively associated with turnaround interval, with very small practical significance. PMID:26115183
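Under the crowding definition above, the occupancy rate is simply a patient census averaged over 4-hour windows divided by the bed count. A trivial sketch with invented snapshot counts (the study's actual computation over registration records is more involved):

```python
import numpy as np

# Hypothetical counts of patients present in the ED, sampled at 4-hour
# intervals over one day, for a department with an assumed 30 beds.
patients_present = np.array([18, 25, 41, 37, 29, 22])
beds = 30

occupancy_rate = patients_present.mean() / beds
print(f"Average occupancy rate: {occupancy_rate:.2f}")  # ~0.96
```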
Bergthorsdottir, Ragnhildur; Ragnarsson, Oskar; Skrtic, Stanko; Glad, Camilla A M; Nilsson, Staffan; Ross, Ian Louis; Leonsson-Zachrisson, Maria; Johannsson, Gudmundur
2017-11-01
Patients with Addison's disease (AD) have increased cardiovascular mortality. To study visceral fat and conventional and exploratory cardiovascular risk factors in patients with AD. A cross-sectional, single-center, case-control study. Patients (n = 76; n = 51 women) with AD and 76 healthy control subjects were matched for sex, age, body mass index (BMI), and smoking habits. The primary outcome variable was visceral abdominal adipose tissue (VAT) measured using computed tomography. Secondary outcome variables were the prevalence of metabolic syndrome (MetS) and 92 biomarkers of cardiovascular disease. The mean ± standard deviation age of all subjects was 53 ± 14 years; mean BMI, 25 ± 4 kg/m²; and mean duration of AD, 17 ± 12 years. The median (range) daily hydrocortisone dose was 30 mg (10 to 50 mg). Median (interquartile range) 24-hour urinary free cortisol excretion was increased in patients vs controls [359 nmol (193 to 601 nmol) vs 175 nmol (140 to 244 nmol); P < 0.001]. VAT did not differ between groups. After correction for multiple testing, 17 of the 92 studied biomarkers differed significantly between patients and control subjects. Inflammatory, proinflammatory, and proatherogenic risk biomarkers were increased in patients [fold change (FC), >1] and the vasodilatory protective marker was decreased (FC, <1). Twenty-six patients (34%) vs 12 control subjects (16%) fulfilled the criteria for MetS (P = 0.01). Despite higher cortisol exposure, VAT was not increased in patients with AD. The prevalence of MetS was increased and several biomarkers of cardiovascular disease were adversely affected in patients with AD. Copyright © 2017 Endocrine Society
Logistics and safety of extracorporeal membrane oxygenation in medical retrieval.
Burns, Brian J; Habig, Karel; Reid, Cliff; Kernick, Paul; Wilkinson, Chris; Tall, Gary; Coombes, Sarah; Manning, Ron
2011-01-01
This article reviews the logistics and safety of extracorporeal membrane oxygenation (ECMO) medical retrieval in New South Wales, Australia. We describe the logistics involved in ECMO road and rotary-wing retrieval by a multidisciplinary team during the H1N1 influenza epidemic in winter 2009 (i.e., June 1 to August 31, 2009). Basic patient demographics and key retrieval time lines were analyzed. There were 17 patients retrieved on ECMO, with their ages ranging from 22 to 55 years. The median weight was 110 kg. Four critical events were recorded during retrieval, with no adverse outcomes. The retrieval distance varied from 20.8 to 430 km. There were delays in times from retrieval booking to both retrieval tasking and retrieval team departure in 88% of retrievals. The most common reasons cited were "patient not ready" (4/17, 23.5%); "vehicle not available" (4/17, 23.5%); and "complex retrieval" (7/17, 41.2%). The median time (hours:minutes) from booking with the medical retrieval unit (MRU) to tasking was 4:35 (interquartile range [IQR] 3:27-6:15). The median time lag from tasking to departure was 1:00 (IQR 00:10-2:20). The median stabilization time was 1:30 (IQR 1:20-1:55). The median retrieval duration was 7:35 (IQR 5:50-10:15). The process of development of ECMO retrieval was enabled by the preexistence of a high-volume, experienced medical retrieval service. Although ECMO retrieval is not a new concept, we describe an entire process for ECMO retrieval that we believe will benefit other retrieval service providers. The increased workload of ECMO retrieval during the swine flu pandemic has led to refinement in the system and process for the future.
Gerber, Bernhard; Alberio, Lorenzo; Rochat, Sophie; Stenner, Frank; Manz, Markus G; Buser, Andy; Schanz, Urs; Stussi, Georg
2016-10-01
Curative chemotherapy approaches in patients with malignancies and platelet (PLT) transfusion refractoriness due to alloimmunization may be hampered by the lack of suitable PLT donors. For these patients, transfusion of cryopreserved autologous PLTs is an option, but is time- and resource-consuming. We aimed at further simplifying this process. A retrospective single-center analysis was conducted on the transfusion of cryopreserved autologous PLTs in nine female alloimmunized, PLT transfusion-refractory patients treated for acute leukemia (n = 8) and non-Hodgkin's lymphoma (n = 1). No additional processing was used before transfusion, and most notably, washing and centrifugation steps were omitted. Clinical efficacy and safety, as well as a flow cytometric assessment of structural and functional PLT changes, were analyzed. A total of 40 autologous PLT concentrates were thawed at bedside and transfused a median of 32 (range, 9 to 994) days after cryopreservation. No major bleeds and no severe dimethyl sulfoxide toxicity were observed. The median PLT count increments did not differ at 1 and 18 to 24 hours after transfusion and reached 6 × 10⁹/L (interquartile range [IQR], 3 × 10⁹-7.5 × 10⁹/L) and 6 × 10⁹/L (IQR, 2.5 × 10⁹-9.5 × 10⁹/L), respectively. Cryopreservation resulted in partial activation of one-third of the PLTs. In vitro stimulation with strong agonists induced additional full activation of cryopreserved PLTs: median, 55% (IQR, 42%-60%) after thrombin and 39% (IQR, 36%-39%) after convulxin. The transfusion of cryopreserved autologous PLTs is feasible and safe. Despite the cryopreservation process, PLT functionality is partially maintained. © 2016 AABB.
Emergency Department Patient Burden from an Electronic Dance Music Festival.
Chhabra, Neeraj; Gimbar, Renee P; Walla, Lisa M; Thompson, Trevonne M
2018-04-01
Electronic dance music (EDM) festivals are increasingly common and psychoactive substance use is prevalent. Although prehospital care can obviate the transfer of many attendees to health care facilities (HCFs), little is known regarding the emergency department (ED) burden of patients presenting from EDM festivals. This study describes the patient volume, length of stay (LOS), and presenting complaints of patients from a 3-day EDM festival in close proximity to an area ED. Medical charts of patients presenting to one HCF from an EDM festival were reviewed for substances used, ED LOS, and sedative medications administered. Additionally, preparedness techniques are described. Over the 3-day festival, 28 patients presented to the ED (median age 21 years; range 18-29 years). Twenty-five had complaints related to substance use, including ethanol (n = 18), "molly" or "ecstasy" (n = 13), and marijuana (n = 8). Three patients required intensive care or step-down unit admission for endotracheal intubation, rhabdomyolysis, and protracted altered mental status. The median LOS for discharged patients was 265 min (interquartile range 210-347 min). Eleven patients required the use of sedative medications, with cumulative doses of 42 mg of lorazepam and 350 mg of ketamine. All patients presented between the hours of 5:00 pm and 2:15 am. The majority of ED visits from an EDM festival were related to substance use. ED arrival times clustered during the evening and were associated with prolonged LOS. Few patients required hospital admission, but admitted patients required high levels of care. HCFs should use these data as a guide in planning for future events. Copyright © 2017 Elsevier Inc. All rights reserved.
Tseng, Sheng-Hsuan; Lim, Chuan Poh; Chen, Qi; Tang, Cheng Cai; Kong, Sing Teang; Ho, Paul Chi-Lui
2018-04-01
Bacterial sepsis is a major cause of morbidity and mortality in neonates, especially cases involving methicillin-resistant Staphylococcus aureus (MRSA). Guidelines by the Infectious Diseases Society of America recommend a vancomycin 24-h area under the concentration-time curve to MIC ratio (AUC24/MIC) of >400 as the best predictor of successful treatment against MRSA infections when the MIC is ≤1 mg/liter. The relationship between steady-state vancomycin trough concentrations and AUC24 values (mg·h/liter) has not been studied in an Asian neonatal population. We conducted a retrospective chart review in Singapore hospitals and collected patient characteristics and therapeutic drug monitoring data from neonates on vancomycin therapy over a 5-year period. A one-compartment population pharmacokinetic model was built from the collected data, internally validated, and then used to assess the relationship between steady-state trough concentrations and AUC24. A Monte Carlo simulation sensitivity analysis was also conducted. A total of 76 neonates with 429 vancomycin concentrations were included for analysis. Median (interquartile range) was 30 weeks (28 to 36 weeks) for postmenstrual age (PMA) and 1,043 g (811 to 1,919 g) for weight at the initiation of treatment. Vancomycin clearance was predicted by weight, PMA, and serum creatinine. For MRSA isolates with a vancomycin MIC of ≤1, our major finding was that the minimum steady-state trough concentration range predictive of achieving an AUC24/MIC of >400 was 8 to 8.9 mg/liter. Steady-state troughs within 15 to 20 mg/liter are unlikely to be necessary to achieve an AUC24/MIC of >400, whereas troughs within 10 to 14.9 mg/liter may be more appropriate. Copyright © 2018 American Society for Microbiology.
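At steady state under linear pharmacokinetics, AUC24 is simply the daily dose divided by clearance, so the AUC24/MIC target reduces to one division. A minimal sketch; the function name, dose, and clearance below are hypothetical and not the study's model:

def auc24_over_mic(daily_dose_mg: float, clearance_l_per_h: float,
                   mic_mg_per_l: float) -> float:
    """Steady-state AUC24 (mg*h/L) divided by MIC, assuming linear
    pharmacokinetics so that AUC24 = daily dose / clearance."""
    auc24 = daily_dose_mg / clearance_l_per_h
    return auc24 / mic_mg_per_l

# Hypothetical neonate: 45 mg/day vancomycin, CL 0.1 L/h, MRSA MIC 1 mg/L.
ratio = auc24_over_mic(45, 0.1, 1.0)
print(f"AUC24/MIC = {ratio:.0f} -> target >400 {'met' if ratio > 400 else 'not met'}")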
Crosbie, Emma J; Massiah, Nadine S; Achiampong, Josephine Y; Dolling, Stuart; Slade, Richard J
2012-02-01
To describe the surgical rectus sheath block for post-operative pain relief following major gynaecological surgery. Local anaesthetic (20 ml 0.25% bupivacaine bilaterally) is administered under direct vision to the rectus sheath space at the time of closure of the anterior abdominal wall. We conducted a retrospective case note review of 98 consecutive patients undergoing major gynaecological surgery for benign or malignant disease who received either standard subcutaneous infiltration of the wound with local anaesthetic (LA, n=51) or the surgical rectus sheath block (n=47) for post-operative pain relief. Outcome measures were (1) pain scores on waking, (2) duration of morphine-based patient controlled analgesia (PCA), (3) quantity of morphine used during the first 48 post-operative hours and (4) length of post-operative stay. The groups were similar in age, the range of procedures performed and the type of pathology observed. Patients who received the surgical rectus sheath block had lower pain scores on waking [0 (0-1) vs. 2 (1-3), p<0.001], required less morphine post-operatively [12 mg (9-26) vs. 36 mg (30-48), p<0.001], had their PCAs discontinued earlier [24 h (18-34) vs. 37 h (28-48), p<0.001] and went home earlier [4 days (3-4) vs. 5 days (4-8), p<0.001] [all median (interquartile range)] than patients receiving standard subcutaneous local anaesthetic into the wound. The surgical rectus sheath block appears to provide effective post-operative analgesia for patients undergoing major gynaecological surgery. A randomised controlled clinical trial is required to assess its efficacy further. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Redgrave, Jessica N; Moore, Lucy; Oyekunle, Tosin; Ebrahim, Maryam; Falidas, Konstantinos; Snowdon, Nicola; Ali, Ali; Majid, Arshad
2018-03-23
Invasive vagus nerve stimulation (VNS) has the potential to enhance the effects of physiotherapy for upper limb motor recovery after stroke. Noninvasive, transcutaneous auricular branch VNS (taVNS) may have similar benefits, but this has not been evaluated in stroke recovery. We sought to determine the feasibility of taVNS delivered alongside upper limb repetitive task-specific practice after stroke and its effects on a range of outcome measures evaluating limb function. Thirteen participants at more than 3 months postischemic stroke with residual upper limb dysfunction were recruited from the community of Sheffield, United Kingdom (October-December 2016). Participants underwent 18 × 1-hour sessions over 6 weeks in which they made 30-50 repetitions of 8-10 arm movements concurrently with taVNS (NEMOS; Cerbomed, Erlangen, Germany; 25 Hz, 0.1-millisecond pulse width) at the maximum tolerated intensity (mA). An electrocardiogram and rehabilitation outcome scores were obtained at each visit. Qualitative interviews determined the acceptability of taVNS to participants. Median time after stroke was 1.16 years, and the baseline median (interquartile range) upper limb Fugl-Meyer (UFM) score was 63 (54.5-99.5). Participants attended 92% of the planned treatment sessions. Three participants reported side effects, mainly fatigue, but all performed a mean of more than 300 arm repetitions per session with no serious adverse events. There was a significant change in the UFM score with a mean increase per participant of 17.1 points (standard deviation 7.8). taVNS is feasible and well-tolerated alongside upper limb repetitive movements in poststroke rehabilitation. The motor improvements observed justify a phase 2 trial in patients with residual arm weakness. Copyright © 2018 National Stroke Association. Published by Elsevier Inc. All rights reserved.
Storm, Christian; Danne, Oliver; Ueland, Per Magne; Leithner, Christoph; Hasper, Dietrich; Schroeder, Tim
2013-01-01
Choline is related to phospholipid metabolism and is a marker for global ischaemia with a small reference range in healthy volunteers. The aim of our study was to characterize the early kinetics of plasma free choline in patients after cardiac arrest. Additionally, we investigated the potential of plasma free choline to predict neurological outcome. Twenty patients admitted to our medical intensive care unit were included in this prospective, observational trial. All patients were enrolled between May 2010 and May 2011. They received post cardiac arrest treatment including mild therapeutic hypothermia, which was initiated with a combination of cold fluid and a feedback surface cooling device according to current guidelines. Sixteen blood samples per patient were analysed for plasma free choline levels within the first week after resuscitation. Choline was detected by liquid chromatography-tandem mass spectrometry. Most patients showed elevated choline levels on admission (median 14.8 µmol/L; interquartile range [IQR] 9.9-20.1), which subsequently decreased. Forty-eight hours after cardiac arrest, choline levels in all patients reached subnormal levels at a median of 4.0 µmol/L (IQR 3-4.9; p = 0.001). Subsequently, choline levels normalized within seven days. There was no significant difference in choline levels when groups were analysed in relation to neurological outcome. Our data indicate a choline deficiency in the early postresuscitation phase. This could potentially result in impaired cell membrane recovery. The detailed characterization of the early choline time course may aid in the planning of choline supplementation trials. In a limited number of patients, choline was not promising as a biomarker for outcome prediction. PMID:24098804
Larochelle, Marc R; Bernson, Dana; Land, Thomas; Stopka, Thomas J; Wang, Na; Xuan, Ziming; Bagley, Sarah M; Liebschutz, Jane M; Walley, Alexander Y
2018-06-19
Opioid overdose survivors have an increased risk for death. Whether use of medications for opioid use disorder (MOUD) after overdose is associated with mortality is not known. To identify MOUD use after opioid overdose and its association with all-cause and opioid-related mortality. Retrospective cohort study. Seven individually linked data sets from Massachusetts government agencies. A total of 17 568 Massachusetts adults without cancer who survived an opioid overdose between 2012 and 2014. Three types of MOUD were examined: methadone maintenance treatment (MMT), buprenorphine, and naltrexone. Exposure to MOUD was identified at monthly intervals, and persons were considered exposed through the month after last receipt. A multivariable Cox proportional hazards model was used to examine MOUD as a monthly time-varying exposure variable to predict time to all-cause and opioid-related mortality. In the 12 months after a nonfatal overdose, 2040 persons (11%) enrolled in MMT for a median of 5 months (interquartile range, 2 to 9 months), 3022 persons (17%) received buprenorphine for a median of 4 months (interquartile range, 2 to 8 months), and 1099 persons (6%) received naltrexone for a median of 1 month (interquartile range, 1 to 2 months). Among the entire cohort, all-cause mortality was 4.7 deaths (95% CI, 4.4 to 5.0 deaths) per 100 person-years and opioid-related mortality was 2.1 deaths (CI, 1.9 to 2.4 deaths) per 100 person-years. Compared with no MOUD, MMT was associated with decreased all-cause mortality (adjusted hazard ratio [AHR], 0.47 [CI, 0.32 to 0.71]) and opioid-related mortality (AHR, 0.41 [CI, 0.24 to 0.70]). Buprenorphine was associated with decreased all-cause mortality (AHR, 0.63 [CI, 0.46 to 0.87]) and opioid-related mortality (AHR, 0.62 [CI, 0.41 to 0.92]). No associations between naltrexone and all-cause mortality (AHR, 1.44 [CI, 0.84 to 2.46]) or opioid-related mortality (AHR, 1.42 [CI, 0.73 to 2.79]) were identified. Few events among naltrexone recipients preclude confident conclusions. A minority of opioid overdose survivors received MOUD. Buprenorphine and MMT were associated with reduced all-cause and opioid-related mortality. National Center for Advancing Translational Sciences of the National Institutes of Health.
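Modeling a monthly time-varying exposure requires long-format data, one row per person-interval. A minimal sketch of that setup with the lifelines CoxTimeVaryingFitter, on an invented toy data set (the study's actual covariates and data are not reproduced here):

import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Toy long-format data: one row per person-month, with an on/off MOUD flag.
# All values are invented; this only shows the data layout and model call.
df = pd.DataFrame({
    "id":    [1, 1, 2, 2, 3, 3, 4, 4, 5, 6],
    "start": [0, 1, 0, 1, 0, 1, 0, 1, 0, 0],   # month interval start
    "stop":  [1, 2, 1, 2, 1, 2, 1, 2, 1, 1],   # month interval end
    "moud":  [0, 1, 0, 0, 1, 1, 0, 0, 1, 0],   # on MOUD during the interval?
    "event": [0, 0, 0, 1, 0, 1, 0, 0, 0, 1],   # death at the interval's end
})

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()  # the exp(coef) column is the hazard ratio for MOUD

With real data, each subject contributes one row per month and the moud flag switches on and off as treatment starts and stops, which is how exposure "through the month after last receipt" is encoded.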
Geographic clustering of elevated blood heavy metal levels in pregnant women.
King, Katherine E; Darrah, Thomas H; Money, Eric; Meentemeyer, Ross; Maguire, Rachel L; Nye, Monica D; Michener, Lloyd; Murtha, Amy P; Jirtle, Randy; Murphy, Susan K; Mendez, Michelle A; Robarge, Wayne; Vengosh, Avner; Hoyo, Cathrine
2015-10-09
Cadmium (Cd), lead (Pb), mercury (Hg), and arsenic (As) exposure is ubiquitous and has been associated with higher risk of growth restriction and cardiometabolic and neurodevelopmental disorders. However, cost-efficient strategies to identify at-risk populations and potential sources of exposure to inform mitigation efforts are limited. The objective of this study was to describe the spatial distribution and identify factors associated with Cd, Pb, Hg, and As concentrations in peripheral blood of pregnant women. Heavy metals were measured in whole peripheral blood of 310 pregnant women obtained at gestational age ~12 weeks. Prenatal residential addresses were geocoded, and geospatial analysis (Getis-Ord Gi* statistics) was used to determine whether elevated blood concentrations were geographically clustered. Logistic regression models were used to identify factors associated with elevated blood metal levels and cluster membership. Geospatial clusters for Cd and Pb were identified with high confidence (p-value for Gi* statistic <0.01). The Cd and Pb clusters comprised 10.5% and 9.2% of Durham County residents, respectively. Medians and interquartile ranges of blood concentrations (μg/dL) for all participants were Cd 0.02 (0.01-0.04), Hg 0.03 (0.01-0.07), Pb 0.34 (0.16-0.83), and As 0.04 (0.04-0.05). In the Cd cluster, medians and interquartile ranges of blood concentrations (μg/dL) were Cd 0.06 (0.02-0.16), Hg 0.02 (0.00-0.05), Pb 0.54 (0.23-1.23), and As 0.05 (0.04-0.05). In the Pb cluster, medians and interquartile ranges of blood concentrations (μg/dL) were Cd 0.03 (0.02-0.15), Hg 0.01 (0.01-0.05), Pb 0.39 (0.24-0.74), and As 0.04 (0.04-0.05). Co-exposure to Pb and Cd was also clustered (p-value for the Gi* statistic <0.01). Cluster membership was associated with lower education levels and higher pre-pregnancy BMI. Our data show that elevated blood concentrations of Cd and Pb are spatially clustered in this urban environment compared with the surrounding areas. Spatial analysis of metals concentrations in peripheral blood or urine obtained routinely during prenatal care can be useful in surveillance of heavy metal exposure.
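The hot-spot step names its statistic, so a sketch is straightforward: Getis-Ord Gi* over geocoded points, here with the esda/libpysal libraries on simulated coordinates and concentrations (all values invented):

import numpy as np
from libpysal.weights import KNN
from esda.getisord import G_Local

# Simulated stand-in data: 310 geocoded residences and blood Pb levels.
rng = np.random.default_rng(1)
coords = rng.uniform(0, 10, size=(310, 2))          # x/y in arbitrary units
pb = rng.lognormal(mean=-1.0, sigma=0.8, size=310)  # blood Pb (ug/dL)

w = KNN.from_array(coords, k=8)                     # 8-nearest-neighbour weights
gi_star = G_Local(pb, w, star=True, permutations=999)

# High-value clusters ("hot spots") at the abstract's p < 0.01 threshold.
hot = (gi_star.p_sim < 0.01) & (gi_star.Zs > 0)
print(f"{hot.sum()} of {len(pb)} points fall in Pb hot spots")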
Persaud, Deborah; Patel, Kunjal; Karalius, Brad; Rainwater-Lovett, Kaitlin; Ziemniak, Carrie; Ellis, Angela; Chen, Ya Hui; Richman, Douglas; Siberry, George K.; Van Dyke, Russell B.; Burchett, Sandra; Seage, George R.; Luzuriaga, Katherine
2014-01-01
Importance Combination antiretroviral therapy (cART) initiated within several weeks of HIV infection in adults limits proviral reservoirs that preclude HIV cure. Biomarkers of restricted proviral reservoirs may aid in the monitoring of HIV remission or cure. Objectives To quantify peripheral blood proviral reservoir size in perinatally HIV-infected adolescents and to identify correlates of limited proviral reservoirs. Design, Setting, and Participants A cross-sectional study including 144 perinatally HIV-infected (PHIV+) youth (median age: 14.3 years), enrolled in the US-based Pediatric HIV/AIDS Cohort Study, on durable (median: 10.2 years) cART, stratified by age at virologic control. Main Outcomes and Measures The primary endpoint was peripheral blood mononuclear cell (PBMC) proviral load following virologic control at different ages. Correlations between proviral load and markers of active HIV production (HIV-specific antibodies, 2-long terminal repeat [2-LTR] circles), and markers of immune activation and inflammation, were also assessed. Results Proviral reservoir size was markedly reduced in the PHIV+ youth who achieved virologic control by age 1 year (4.2 [interquartile range, 2.6-8.6] copies per 1 million PBMCs) compared with those who achieved virologic control between 1 and 5 years of age (19.4 [interquartile range, 5.5-99.8] copies per 1 million PBMCs) or after age 5 years (70.7 [interquartile range, 23.2-209.4] copies per 1 million PBMCs; P < .001). A proviral burden <10 copies/million PBMCs was measured in 11 (79%), 20 (40%), and 13 (18%) participants with virologic control at ages <1 year, 1-5 years, and >5 years, respectively (P < .001). Lower proviral load was associated with undetectable 2-LTR circles (P < .001) and HIV-negative or indeterminate serostatus (P < .001), but not with concentrations of the soluble immune activation markers CD14 and CD163. Conclusions and Relevance Early effective cART along with prolonged virologic suppression after perinatal HIV infection leads to negligible peripheral blood proviral reservoirs in adolescence and is associated with negative or indeterminate HIV serostatus. These findings highlight the long-term effect of early effective control of HIV replication on biomarkers of HIV persistence in perinatal infection and the utility of HIV serostatus as a biomarker for small proviral reservoir size, though not necessarily of cure. PMID:25286283
Buonerba, Carlo; Sonpavde, Guru; Vitrone, Francesca; Bosso, Davide; Puglia, Livio; Izzo, Michela; Iaccarino, Simona; Scafuri, Luca; Muratore, Margherita; Foschini, Francesca; Mucci, Brigitta; Tortora, Vincenzo; Pagliuca, Martina; Ribera, Dario; Riccio, Vittorio; Morra, Rocco; Mosca, Mirta; Cesarano, Nicola; Di Costanzo, Ileana; De Placido, Sabino; Di Lorenzo, Giuseppe
2017-01-01
Background: Cabazitaxel is a second-generation taxane that is approved for use with concomitant low dose daily prednisone in metastatic castration resistant prostate cancer (mCRPC) after docetaxel failure. Since the role of daily corticosteroids in improving cabazitaxel efficacy or ameliorating its safety profile has not been adequately investigated so far, we compared outcomes of patients receiving cabazitaxel with or without daily corticosteroids in a retrospective single-institution cohort of mCRPC patients. Patients and methods: Medical records of deceased patients with documented mCRPC treated with cabazitaxel following prior docetaxel between January 2011 and January 2017 were reviewed at the single participating center. Patients who were receiving daily doses of systemic corticosteroids other than low dose daily prednisone or prednisolone (≤10 mg a day) were excluded. The primary end point of this analysis was overall survival (OS). Secondary end points were exposure to cabazitaxel as well as the incidence of grade 3-4 adverse events. Univariable and multivariable Cox proportional hazards regression was used to evaluate prednisone use and other variables as potentially prognostic for overall survival. Results: Overall, among 91 patients, 57 patients received cabazitaxel concurrently with low dose prednisone and 34 patients did not receive concurrent prednisone. The median overall survival of the population was 9.8 months (interquartile range, 9 to 14). Patients receiving prednisone had an overall survival of 9 months (interquartile range, 8 to 12) vs. 14 months (interquartile range, 9.4 to 16.7) for patients not treated with prednisone. Approximately 45% of patients had a >30% PSA decline at 12 weeks. Prednisone use was not significantly prognostic for overall survival or PSA decline ≥30% rates on regression analyses. Importantly, a >30% PSA decline at 12 weeks, but not at 3, 6, or 9 weeks, was prognostic for improved survival in multivariate analysis. Conclusions: The data presented here support the hypothesis that omitting daily corticosteroids in cabazitaxel-treated patients has no negative impact on either survival or safety profile. In the large prospective trial CABACARE, cabazitaxel-treated patients will be randomized to receive or not receive daily prednisone. The CABACARE (EudraCT n. 2016-003646-81) study is currently ongoing at University Federico II of Naples and at multiple other participating centers in Italy. PMID:28928853
Zittema, Debbie; van den Berg, Else; Meijer, Esther; Boertien, Wendy E; Muller Kobold, Anneke C; Franssen, Casper F M; de Jong, Paul E; Bakker, Stephan J L; Navis, Gerjan; Gansevoort, Ron T
2014-09-05
Plasma copeptin, a marker of arginine vasopressin, is elevated in patients with autosomal dominant polycystic kidney disease and predicts disease progression. It is unknown whether elevated copeptin levels result from decreased kidney clearance or as compensation for impaired concentrating capacity. Data from patients with autosomal dominant polycystic kidney disease and healthy kidney donors before and after donation were used, because after donation, overall GFR decreases with a functionally normal kidney. Data were obtained between October of 2008 and January of 2012 from healthy kidney donors who visited the institution for routine measurements predonation and postdonation and patients with autosomal dominant polycystic kidney disease who visited the institution for kidney function measurement. Plasma copeptin levels were measured using a sandwich immunoassay, GFR was measured as ¹²⁵I-iothalamate clearance, and urine concentrating capacity was measured as the urine-to-plasma ratio of urea. In patients with autosomal dominant polycystic kidney disease, total kidney volume was measured with magnetic resonance imaging. Patients with autosomal dominant polycystic kidney disease (n=122, age=40 years, men=56%) had significantly higher copeptin levels (median=6.8 pmol/L; interquartile range=3.4-15.7 pmol/L) compared with donors (n=134, age=52 years, men=49%) both predonation (median=3.8 pmol/L; interquartile range=2.8-6.3 pmol/L; P<0.001) and postdonation (median=4.4 pmol/L; interquartile range=3.6-6.1 pmol/L; P<0.001). In donors, copeptin levels did not change after donation, despite a significant fall in GFR (from 105 ± 17 to 66 ± 10; P<0.001). Copeptin and GFR were significantly associated in patients with autosomal dominant polycystic kidney disease (β=-0.45, P<0.001) but not in donors. In patients with autosomal dominant polycystic kidney disease, GFR and total kidney volume were both significantly associated with the urine-to-plasma ratio of urea (β=0.84, P<0.001; β=-0.51, P<0.001, respectively). On the basis of the finding in donors that kidney clearance is not a main determinant of plasma copeptin levels, it was hypothesized that, in patients with autosomal dominant polycystic kidney disease, kidney damage and associated impaired urine concentration capacity determine copeptin levels. Copyright © 2014 by the American Society of Nephrology.
Scheuermeyer, Frank X; DeWitt, Christopher; Christenson, Jim; Grunau, Brian; Kestler, Andrew; Grafstein, Eric; Buxton, Jane; Barbic, David; Milanovic, Stefan; Torkjari, Reza; Sahota, Indy; Innes, Grant
2018-03-09
Fentanyl overdoses are increasing and few data guide emergency department (ED) management. We evaluate the safety of an ED protocol for patients with presumed fentanyl overdose. At an urban ED, we used administrative data and explicit chart review to identify and describe consecutive patients with uncomplicated presumed fentanyl overdose (no concurrent acute medical issues) from September to December 2016. We linked regional ED and provincial vital statistics databases to ascertain admissions, revisits, and mortality. The primary outcome was a composite of admission and death within 24 hours. Other outcomes included treatment with additional ED naloxone, development of a new medical issue while in the ED, and length of stay. A prespecified subgroup analysis assessed low-risk patients with normal triage vital signs. There were 1,009 uncomplicated presumed fentanyl overdoses, mainly by injection. Median age was 34 years, 85% were men, and 82% received out-of-hospital naloxone. One patient was hospitalized and one discharged patient died within 24 hours (combined outcome 0.2%; 95% confidence interval [CI] 0.04% to 0.8%). Sixteen patients received additional ED naloxone (1.6%; 95% CI 1.0% to 2.6%), none developed a new medical issue (0%; 95% CI 0% to 0.5%), and median length of stay was 173 minutes (interquartile range 101 to 267). Among 752 low-risk patients, none were admitted or developed a new issue, and one died postdischarge; 3 (0.4%; 95% CI 0.01% to 1.3%) received ED naloxone. In our cohort of ED patients with uncomplicated presumed fentanyl overdose (typically after injection), deterioration, admission, mortality, and postdischarge complications appear low; the majority can be discharged after brief observation. Patients with normal triage vital signs are unlikely to require ED naloxone. Copyright © 2018 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
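Intervals such as "0.2%; 95% CI 0.04% to 0.8%" for 2 events in 1,009 patients are consistent with an exact binomial interval. A quick illustrative check, assuming a Clopper-Pearson interval (the paper's CI method is not stated here):

from statsmodels.stats.proportion import proportion_confint

events, n = 2, 1009  # admitted or died within 24 h, out of all overdoses
lo, hi = proportion_confint(events, n, alpha=0.05, method="beta")  # exact CI
print(f"rate {events/n:.1%} (95% CI {lo:.2%} to {hi:.2%})")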
Donor Risk Index for African American liver transplant recipients with hepatitis C virus.
Shores, Nathan J; Dodge, Jennifer L; Feng, Sandy; Terrault, Norah A
2013-10-01
African American (AA) liver transplant (LT) recipients with hepatitis C virus (HCV) have higher rates of graft loss than other racial/ethnic groups. The Donor Risk Index (DRI) predicts graft loss but is neither race- nor disease-specific and may not be optimal for assessing donor risk for AA HCV-positive LT recipients. We developed a DRI for AAs with HCV with the goal of enhancing graft loss predictions. All U.S. HCV-positive adult AA first deceased donor LTs surviving ≥30 days from March 2002 to December 2009 were included. A total of 1,766 AA LT recipients were followed for a median of 2.8 (interquartile range [IQR] 1.3-4.9) years. Independent predictors of graft loss were donor age (40-49 years: hazard ratio [HR] 1.54; 50-59 years: HR 1.80; 60+ years: HR 2.34, P < 0.001), non-AA donor (HR 1.66, P < 0.001), and cold ischemia time (CIT) (HR 1.03 per hour >8 hours, P = 0.03). Importantly, the negative effect of increasing donor age on graft and patient survival among AAs was attenuated by receipt of an AA donor. A new donor risk model for AAs (AADRI-C), consisting of donor age, race, and CIT, yielded 1-year, 3-year, and 5-year predicted graft survival rates of 91%, 77%, and 68% for AADRI <1.60; 86%, 67%, and 55% for AADRI 1.60-2.44; and 78%, 53%, and 39% for AADRI >2.44. In the validation dataset, AADRI-C correctly reclassified 27% of patients (net reclassification improvement P = 0.04) compared with the original DRI. AADRI-C identifies grafts at higher risk of failure, and this information is useful for risk-benefit discussions with recipients. Use of AA donors allows consideration of older donors. Copyright © 2013 by the American Association for the Study of Liver Diseases.
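DRI-style indices are typically the exponential of a sum of log hazard ratios. The published AADRI-C formula is not given in the abstract, so the following is a reconstruction for illustration only, built from the reported hazard ratios:

import math

def aadri_c(donor_age: float, donor_is_aa: bool, cit_hours: float) -> float:
    """Illustrative donor risk index in the exp(sum of log-HRs) form used by
    DRI-style scores, built from the hazard ratios reported in the abstract.
    This is a reconstruction for illustration, not the published formula."""
    beta = 0.0
    if 40 <= donor_age < 50:
        beta += math.log(1.54)
    elif 50 <= donor_age < 60:
        beta += math.log(1.80)
    elif donor_age >= 60:
        beta += math.log(2.34)
    if not donor_is_aa:                 # non-AA donor, HR 1.66
        beta += math.log(1.66)
    if cit_hours > 8:                   # HR 1.03 per hour beyond 8 hours
        beta += math.log(1.03) * (cit_hours - 8)
    return math.exp(beta)

print(aadri_c(donor_age=55, donor_is_aa=False, cit_hours=12))  # ~3.4

Under this reconstruction, the example donor (age 55, non-AA, CIT 12 hours) scores about 3.4, which would fall in the highest (>2.44) risk band.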
Cerebrovascular Reactivity in Young Subjects with Sleep Apnea
Buterbaugh, John; Wynstra, Charles; Provencio, Natalie; Combs, Daniel; Gilbert, Michael; Parthasarathy, Sairam
2015-01-01
Study Objectives: Regional brain alterations may be involved in the pathogenesis and adverse consequences of obstructive sleep apnea (OSA). The objectives of the current study were to (1) determine cerebrovascular reactivity in the motor areas that control upper airway musculature in patients with OSA, and (2) determine whether young patients with OSA have decreased cerebrovascular reactivity in response to breath holding. Design: Case-control study. Setting: Academic center. Participants: Twelve subjects with OSA (age 24–42 y; apnea-hypopnea index [AHI] 17; interquartile range [IQR] 9, 69 per hour) and control subjects (n = 10; age 29–44 y; AHI 2; IQR 1, 3 per hour). Measurements and Results: Subjects underwent blood oxygen level-dependent functional magnetic resonance imaging (BOLD-fMRI) while awake, swallowing, and breath holding. In subjects with OSA, during swallowing, there was less activity in the brainstem than in controls (P = 0.03), and this remained reduced after adjusting for cortical motor strip activity (P = 0.036). In OSA subjects, the volume of brain regions showing increased cerebrovascular reactivity (38 cm³; IQR 17, 96) was smaller than in controls (199 cm³; IQR 5, 423; P = 0.01). In OSA subjects, the volume of brain regions showing decreased cerebrovascular reactivity during breath holding was greater (P = 0.01), and the ratio of increased-to-decreased brain regions was lower than that of controls (P = 0.006). Adjustment for cerebral volumes, body mass index, and white matter lesions did not change these results substantively. Conclusions: In patients with obstructive sleep apnea (OSA), diminished change in brainstem activity during swallowing and reduced cerebrovascular reactivity may contribute to the etiopathogenesis and adverse cerebrovascular consequences, respectively. We speculate that decreased cerebral autoregulation may be causative of gray matter loss in OSA. PMID:25409111
Linnaus, Maria E; Langlais, Crystal S; Garcia, Nilda M; Alder, Adam C; Eubanks, James W; Maxson, R Todd; Letton, Robert W; Ponsky, Todd A; St Peter, Shawn D; Leys, Charles; Bhatia, Amina; Ostlie, Daniel J; Tuggle, David W; Lawson, Karla A; Raines, Alexander R; Notrica, David M
2017-04-01
Nonoperative management (NOM) is the standard of care for most pediatric blunt liver and spleen injuries (BLSI); only 5% of patients fail NOM in retrospective reports. No prospective studies examine failure of NOM of BLSI in children. The aim of this study was to determine the frequency and clinical characteristics of failure of NOM in pediatric BLSI patients. A prospective observational study was conducted on patients 18 years or younger presenting to any of 10 Level I pediatric trauma centers between April 2013 and January 2016 with BLSI on computed tomography. Management of BLSI was based on the Arizona-Texas-Oklahoma-Memphis-Arkansas Consortium pediatric guideline. Failure of NOM was defined as needing laparoscopy or laparotomy. A total of 1008 patients met inclusion criteria; 499 (50%) had liver injury, 410 (41%) spleen injury, and 99 (10%) had both. Most patients were male (n = 624; 62%) with a median age of 10.3 years (interquartile range, 5.9-14.2). A total of 69 (7%) underwent laparotomy or laparoscopy, but only 34 (3%) underwent surgery for spleen or liver bleeding. Other (nonexclusive) operations were for 21 intestinal injuries; 15 hematoma evacuations, washouts, or drain placements; 9 pancreatic injuries; 5 mesenteric injuries; 3 diaphragm injuries; and 2 bladder injuries. Patients who failed were more likely to receive blood (52 of 69 vs. 162 of 939; p < 0.001), and the median time from injury to first blood transfusion was 2.3 hours for those who failed versus 5.9 hours for those who did not (p = 0.002). The overall mortality rate was 24% (8 of 34) in those who failed NOM due to bleeding. NOM fails in 7% of children with BLSI, but only 3% of patients failed for bleeding due to liver or spleen injury. For children failing NOM due to bleeding, the mortality was 24%. Therapeutic study, level II.
Riboldi, Bárbara P; Luft, Vivian C; de Castilhos, Cristina D; de Cardoso, Letícia O; Schmidt, Maria I; Barreto, Sandhi M; de Sander, Maria F; Alvim, Sheila M; Duncan, Bruce B
2015-02-13
To assess glucose and triglyceride excursions 2 hours after the ingestion of a standardized meal and their associations with clinical characteristics and cardiovascular complications in individuals with diabetes. Blood samples of 898 subjects with diabetes were collected at fasting and 2 hours after a meal containing 455 kcal, 14 g of saturated fat and 47 g of carbohydrates. Self-reported morbidity, socio-demographic characteristics and clinical measures were obtained by interview and exams performed at the baseline visit of the ELSA-Brasil cohort study. Median (interquartile range, IQR) fasting glucose was 150.5 (123-198) mg/dL and fasting triglycerides 140 (103-199) mg/dL. The median excursion for glucose was 45 (15-76) mg/dL and for triglycerides 26 (11-45) mg/dL. In multiple linear regression, a greater glucose excursion was associated with higher glycated hemoglobin (10.7; 95% CI 9.1-12.3 mg/dL), duration of diabetes (4.5; 2.6-6.4 mg/dL, per 5 year increase), insulin use (44.4; 31.7-57.1 mg/dL), and age (6.1; 2.5-9.6 mg/dL, per 10 year increase); and with lower body mass index (-5.6; -8.4 to -2.8 mg/dL, per 5 kg/m2 increase). In adjusted logistic regression models, a greater glucose excursion was marginally associated with the presence of cardiovascular comorbidities (coronary heart disease, myocardial infarction and angina) in those with obesity. A greater postprandial glycemic response to a small meal was positively associated with indicators of a decreased capacity for insulin secretion and negatively associated with obesity. No pattern of response was observed with a greater postprandial triglyceride excursion.
Mehta, Amar J.; Kloog, Itai; Zanobetti, Antonella; Coull, Brent A.; Sparrow, David; Vokonas, Pantel; Schwartz, Joel
2014-01-01
Background The underlying mechanisms of the association between ambient temperature and cardiovascular morbidity and mortality are not well understood, particularly for daily temperature variability. We evaluated whether daily mean temperature and the standard deviation of temperature were associated with heart rate-corrected QT interval (QTc) duration, a marker of ventricular repolarization, in a prospective cohort of older men. Methods This longitudinal analysis included 487 older men participating in the VA Normative Aging Study with up to three visits between 2000 and 2008 (n = 743). We analyzed associations between QTc and moving averages (1–7, 14, 21, and 28 days) of the 24-hour mean and standard deviation of temperature as measured from a local weather monitor, and the 24-hour mean temperature estimated from a spatiotemporal prediction model, in time-varying linear mixed-effects regression. Effect modification by season, diabetes, coronary heart disease, obesity, and age was also evaluated. Results Higher mean temperature, as measured from the local monitor and estimated from the prediction model, was associated with longer QTc at moving averages of 21 and 28 days. Increased 24-hour standard deviation of temperature was associated with longer QTc at moving averages from 4 up to 28 days; a 1.9°C interquartile range increase in the 4-day moving average standard deviation of temperature was associated with a 2.8 msec (95% CI: 0.4, 5.2) longer QTc. Associations between the 24-hour standard deviation of temperature and QTc were stronger in colder months and in participants with diabetes and coronary heart disease. Conclusion/Significance In this sample of older men, elevated mean temperature was associated with longer QTc, and increased variability of temperature was associated with longer QTc, particularly during colder months and among individuals with diabetes and coronary heart disease. These findings may offer insight into an important underlying mechanism of temperature-related cardiovascular morbidity and mortality in an older population. PMID:25238150
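The exposure construction here is mechanical: collapse hourly temperature to daily 24-hour means and standard deviations, then take moving averages. A minimal sketch with pandas on simulated hourly data (all values invented):

import numpy as np
import pandas as pd

# Simulated hourly temperature readings over 120 days.
rng = np.random.default_rng(2)
idx = pd.date_range("2004-01-01", periods=24 * 120, freq="h")
hourly = pd.Series(10 + 8 * rng.standard_normal(len(idx)), index=idx)

daily_mean = hourly.resample("D").mean()   # 24-h mean temperature per day
daily_sd = hourly.resample("D").std()      # 24-h SD of temperature per day

ma21_mean = daily_mean.rolling(window=21).mean()  # 21-day MA of the 24-h mean
ma4_sd = daily_sd.rolling(window=4).mean()        # 4-day MA of the 24-h SD

# Effects in the abstract are scaled per IQR increase of the exposure.
iqr = ma4_sd.quantile(0.75) - ma4_sd.quantile(0.25)
print(f"IQR of the 4-day moving average of 24-h temperature SD: {iqr:.2f} C")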
Deiner, Stacie; Luo, Xiaodong; Lin, Hung-Mo; Sessler, Daniel I; Saager, Leif; Sieber, Frederick E; Lee, Hochang B; Sano, Mary; Jankowski, Christopher; Bergese, Sergio D; Candiotti, Keith; Flaherty, Joseph H; Arora, Harendra; Shander, Aryeh; Rock, Peter
2017-08-16
Postoperative delirium occurs in 10% to 60% of elderly patients having major surgery and is associated with longer hospital stays, increased hospital costs, and 1-year mortality. Emerging literature suggests that dexmedetomidine sedation in critical care units is associated with reduced incidence of delirium. However, intraoperative use of dexmedetomidine for prevention of delirium has not been well studied. To evaluate whether an intraoperative infusion of dexmedetomidine reduces postoperative delirium. This study was a multicenter, double-blind, randomized, placebo-controlled trial that randomly assigned patients to dexmedetomidine or saline placebo infused during surgery and for 2 hours in the recovery room. Patients were assessed daily for postoperative delirium (primary outcome) and secondarily for postoperative cognitive decline. Participants were elderly (>68 years) patients undergoing major elective noncardiac surgery. The study dates were February 2008 to May 2014. Dexmedetomidine infusion (0.5 µg/kg/h) during surgery and up to 2 hours in the recovery room. The primary hypothesis tested was that intraoperative dexmedetomidine administration would reduce postoperative delirium. Secondarily, the study examined the correlation between dexmedetomidine use and postoperative cognitive change. In total, 404 patients were randomized; 390 completed in-hospital delirium assessments (median [interquartile range] age, 74.0 [71.0-78.0] years; 51.3% [200 of 390] female). There was no difference in postoperative delirium between the dexmedetomidine and placebo groups (12.2% [23 of 189] vs 11.4% [23 of 201], P = .94). After adjustment for age and educational level, there was no difference in the postoperative cognitive performance between treatment groups at 3 months and 6 months. Adverse events were comparably distributed in the treatment groups. Intraoperative dexmedetomidine does not prevent postoperative delirium. The reduction in delirium previously demonstrated in numerous surgical intensive care unit studies was not observed, which underscores the importance of timing when administering the drug to prevent delirium. clinicaltrials.gov Identifier NCT00561678.
Dynamics of Cough Frequency in Adults Undergoing Treatment for Pulmonary Tuberculosis.
Proaño, Alvaro; Bravard, Marjory A; López, José W; Lee, Gwenyth O; Bui, David; Datta, Sumona; Comina, Germán; Zimic, Mirko; Coronel, Jorge; Caviedes, Luz; Cabrera, José L; Salas, Antonio; Ticona, Eduardo; Vu, Nancy M; Kirwan, Daniela E; Loader, Maria-Cristina I; Friedland, Jon S; Moore, David A J; Evans, Carlton A; Tracey, Brian H; Gilman, Robert H
2017-05-01
Cough is the major determinant of tuberculosis transmission. Despite this, there is a paucity of information regarding characteristics of cough frequency throughout the day and in response to tuberculosis therapy. Here we evaluate the circadian cycle of cough, cough frequency risk factors, and the impact of appropriate treatment on cough and bacillary load. We prospectively evaluated human immunodeficiency virus-negative adults (n = 64) with a new diagnosis of culture-proven, drug-susceptible pulmonary tuberculosis immediately prior to treatment and repeatedly until treatment day 62. At each time point, participant cough was recorded (n = 670) and analyzed using the Cayetano Cough Monitor. Consecutive coughs at least 2 seconds apart were counted as separate cough episodes. Sputum samples (n = 426) were tested with microscopic-observation drug susceptibility broth culture, and in culture-positive samples (n = 252), the time to culture positivity was used to estimate bacillary load. The highest cough frequency occurred from 1 pm to 2 pm, and the lowest from 1 am to 2 am (2.4 vs 1.1 cough episodes/hour, respectively). Cough frequency was higher among participants who had higher sputum bacillary load (P < .01). Pretreatment median cough episodes/hour was 2.3 (interquartile range [IQR], 1.2-4.1), which at 14 treatment days decreased to 0.48 (IQR, 0.0-1.4) and at the end of the study decreased to 0.18 (IQR, 0.0-0.59) (both reductions P < .001). By 14 treatment days, the probability of culture conversion was 29% (95% confidence interval, 19%-41%). Coughs were most frequent during daytime. Two weeks of appropriate treatment significantly reduced cough frequency and resulted in one-third of participants achieving culture conversion. Thus, treatment by 2 weeks considerably diminishes, but does not eliminate, the potential for airborne tuberculosis transmission. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.
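The episode definition ("consecutive coughs at least 2 seconds apart were counted as separate cough episodes") translates directly into code. A minimal sketch on toy timestamps:

def count_episodes(cough_times_s, gap_s=2.0):
    """Group sorted cough timestamps (seconds) into episodes: a cough at
    least `gap_s` after the previous one starts a new episode."""
    episodes = 0
    last = None
    for t in cough_times_s:
        if last is None or t - last >= gap_s:
            episodes += 1
        last = t
    return episodes

# Toy recording: bursts around 0-1 s and 10-11 s count as two episodes.
print(count_episodes([0.0, 0.4, 1.0, 10.0, 10.5, 11.0]))  # -> 2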
The impact of rheumatoid arthritis on work capacity in Chinese patients: a cross-sectional study.
Zhang, Xiaoying; Mu, Rong; Wang, Xiuru; Xu, Chuanhui; Duan, Tianjiao; An, Yuan; Han, Shuling; Li, Xiaofeng; Wang, Lizhi; Wang, Caihong; Wang, Yongfu; Yang, Rong; Wang, Guochun; Lu, Xin; Zhu, Ping; Chen, Lina; Liu, Jinting; Jin, Hongtao; Liu, Xiangyuan; Sun, Lin; Wei, Ping; Wang, Junxiang; Chen, Haiying; Cui, Liufu; Shu, Rong; Liu, Bailu; Zhang, Zhuoli; Li, Guangtao; Li, Zhenbin; Yang, Jing; Li, Junfang; Jia, Bin; Zhang, Fengxiao; Tao, Jiemei; Lin, Jinying; Wei, Meiqiu; Liu, Xiaomin; Ke, Dan; Hu, Shaoxian; Ye, Cong; Yang, Xiuyan; Li, Hao; Huang, Cibo; Gao, Ming; Lai, Pei; Li, Xingfu; Song, Lijun; Wang, Yi; Wang, Xiaoyuan; Su, Yin; Li, Zhanguo
2015-08-01
To evaluate the impact of RA on work capacity and identify factors related to work capacity impairment in patients with RA. A cross-sectional multicentre study was performed in 21 tertiary care hospitals across China. A consecutive sample of 846 patients with RA was recruited, of which 589 patients of working age at disease onset constituted the study population. Information on the socio-demographic, clinical, working and financial conditions of the patients was collected. Logistic regression analyses were used to identify factors associated with work capacity impairment. The rate of work capacity impairment was 48.0% in RA patients with a median disease duration of 60 months (interquartile range 14-134 months), including 11.7% leaving the labour force early, 33.6% working reduced hours and 2.7% changing job. Multivariable logistic regression analysis showed that working reduced hours was significantly related to current smoking [odds ratio (OR) 2.07 (95% CI 1.08, 3.97)], no insurance [OR 1.94 (95% CI 1.20, 3.12)], manual labour [OR 2.66 (95% CI 1.68, 4.20)] and a higher HAQ score [OR 2.22 (95% CI 1.36, 3.60)]. Current smoking [OR 3.75 (95% CI 1.54, 9.15)], manual labour [OR 2.33 (95% CI 1.17, 4.64)], longer disease duration [OR 1.01 (95% CI 1.00, 1.01)] and lower BMI [OR 0.90 (95% CI 0.82, 0.99)] were associated with leaving the labour force early. There is a substantial impact of RA on the work capacity of patients in China. Sociodemographic, disease- and work-related factors are all associated with work capacity impairment. © The Author 2015. Published by Oxford University Press on behalf of the British Society for Rheumatology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
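The odds ratios above are exponentiated logistic regression coefficients. A minimal sketch of that estimation with statsmodels, on simulated data (covariates, effect sizes, and sample are invented, not the study's):

import numpy as np
import statsmodels.api as sm

# Simulated sketch: impairment ~ smoking + manual labour + HAQ score.
rng = np.random.default_rng(3)
n = 589
smoking = rng.integers(0, 2, n)
labour = rng.integers(0, 2, n)
haq = rng.normal(1.0, 0.6, n)
X = sm.add_constant(np.column_stack([smoking, labour, haq]))

# Generate outcomes from an assumed logistic model, then refit it.
logit_p = -1.0 + 0.7 * smoking + 1.0 * labour + 0.8 * haq
y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

res = sm.Logit(y, X).fit(disp=0)
print(np.exp(res.params))      # odds ratios (const, smoking, labour, HAQ)
print(np.exp(res.conf_int()))  # 95% CIs on the odds-ratio scale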
Vinh, Ha; Anh, Vo Thi Cuc; Anh, Nguyen Duc; Campbell, James I.; Hoang, Nguyen Van Minh; Nga, Tran Vu Thieu; Nhu, Nguyen Thi Khanh; Minh, Pham Van; Thuy, Cao Thu; Duy, Pham Thanh; Phuong, Le Thi; Loan, Ha Thi; Chinh, Mai Thu; Thao, Nguyen Thi Thu; Tham, Nguyen Thi Hong; Mong, Bui Li; Bay, Phan Van Be; Day, Jeremy N.; Dolecek, Christiane; Lan, Nguyen Phu Huong; Diep, To Song; Farrar, Jeremy J.; Chau, Nguyen Van Vinh; Wolbers, Marcel; Baker, Stephen
2011-01-01
Background The bacterial genus Shigella is the leading cause of dysentery. There have been significant increases in the proportion of Shigella isolates that demonstrate resistance to nalidixic acid. While nalidixic acid is no longer considered as a therapeutic agent for shigellosis, the fluoroquinolone ciprofloxacin is the current recommendation of the World Health Organization. Resistance to nalidixic acid is a marker of reduced susceptibility to older generation fluoroquinolones, such as ciprofloxacin. We aimed to assess the efficacy of gatifloxacin versus ciprofloxacin in the treatment of uncomplicated shigellosis in children. Methodology/Principal Findings We conducted a randomized, open-label, controlled trial with two parallel arms at two hospitals in southern Vietnam. The study was designed as a superiority trial and children with dysentery meeting the inclusion criteria were invited to participate. Participants received either gatifloxacin (10 mg/kg/day) in a single daily dose for 3 days or ciprofloxacin (30 mg/kg/day) in two divided doses for 3 days. The primary outcome measure was treatment failure; secondary outcome measures were time to the cessation of individual symptoms. Four hundred and ninety-four patients were randomized to receive either gatifloxacin (n = 249) or ciprofloxacin (n = 245), of whom 107 had a positive Shigella stool culture. We could not demonstrate superiority of gatifloxacin and observed similar clinical failure rates in both groups (gatifloxacin, 12.0%; ciprofloxacin, 11.0%; p = 0.72). The median (interquartile range) time from illness onset to cessation of all symptoms was 95 (66–126) hours for gatifloxacin recipients and 93 (68–120) hours for ciprofloxacin recipients (hazard ratio [95% CI] = 0.98 [0.82–1.17], p = 0.83). Conclusions We conclude that in Vietnam, where nalidixic acid-resistant Shigellae are highly prevalent, ciprofloxacin and gatifloxacin are similarly effective for the treatment of acute shigellosis. Trial Registration Controlled trials number ISRCTN55945881 PMID:21829747
2014-01-01
BACKGROUND Although variations in plasma renin activity (PRA) and aldosterone have been examined in whites and blacks, the association of these hormones with blood pressure in multiethnic populations has not been described. METHODS We measured PRA and aldosterone in 1,021 participants in the Multi-Ethnic Study of Atherosclerosis not taking antihypertensives and examined the association between ethnicity and PRA/aldosterone and the association between PRA/aldosterone with systolic blood pressure (SBP). RESULTS Average age was 62 (SD = 9) years, and 49% of participants were women. Median PRA was 0.51 (interquartile range (IQR) = 0.29–0.87) ng/ml/hour, and median aldosterone was 12.6 (IQR = 9.1–17.1) ng/dl. After age and sex adjustment, compared with whites, blacks had 28% lower PRA and 17.4% lower aldosterone, and Hispanics had 20.1% higher PRA but similar aldosterone levels. After multivariable adjustment, compared with whites, only Hispanic ethnicity independently associated with higher PRA (0.18 ng/ml/hour; 95% confidence interval (CI) = 0.06–0.31). Blacks had lower aldosterone (−1.7 ng/dl; 95% CI = −3.2 to −0.2) compared with whites. After multivariable adjustment, PRA was associated with lower SBP in whites (−3.2 mm Hg; 95% CI = −5.2 to −1.2 per standardized unit PRA), Chinese (−3.5 mm Hg; 95% CI = −6.2 to −0.80 per standardized unit), and Hispanics (−2.3 mm Hg; 95% CI = −4.1 to −0.6 per standardized unit) but not blacks. Aldosterone was associated with higher SBP only in Hispanics (2.5 mm Hg; 95% CI = 0.4–4.5 per SD). CONCLUSIONS Compared with whites, blacks have lower aldosterone and Hispanics have higher PRA. Aldosterone had significant associations with higher SBP in Hispanics compared with other groups, a finding that may suggest a different mechanism of hypertension. PMID:24436325
Nontechnical skills performance and care processes in the management of the acute trauma patient.
Pucher, Philip H; Aggarwal, Rajesh; Batrick, Nicola; Jenkins, Michael; Darzi, Ara
2014-05-01
Acute trauma management is a complex process, with effective cooperation among multiple clinicians critical to success. Despite this, the effect of nontechnical skills on performance and outcomes has not been previously investigated in trauma. Trauma calls in an urban, level 1 trauma center were observed directly. Nontechnical performance was measured using T-NOTECHS. Times to disposition and completion of assessment care processes were recorded, as well as any delays or errors. Statistical analysis assessed the effect of T-NOTECHS on performance and outcomes, accounting for Injury Severity Score (ISS) and time of day as potential confounding factors. Meta-analysis was performed for the incidence of delays. Fifty trauma calls were observed, with a median ISS of 13 (interquartile range [IQR], 5-25); duration of stay, 1 day (IQR, 1-8); T-NOTECHS, 20.5 (IQR, 18-23); and time to disposition, 24 minutes (IQR, 18-42). Trauma calls with low T-NOTECHS scores had a greater time to disposition: 35 minutes (IQR, 23-53) versus 20 (IQR, 16-25; P = .046). ISS showed a significant correlation with duration of stay (r = 0.736; P < .001), but not with T-NOTECHS (r = 0.201; P = .219) or time to disposition (r = 0.113; P = .494). There was no difference between "in-hours" and "out-of-hours" trauma calls for T-NOTECHS scores (21 [IQR, 18-22] vs 20 [IQR, 20-23]; P = .361) or time to disposition (34 minutes [IQR, 24-52] vs 17 [IQR, 15-27]; P = .419). Regression analysis revealed T-NOTECHS as the only factor associated with delays (odds ratio [OR], 0.24; 95% confidence interval [CI], 0.06-0.95). Better teamwork and nontechnical performance are associated with significant decreases in disposition time, an important marker of quality in acute trauma care. Addressing team and nontechnical skills has the potential to improve patient assessment, treatment, and outcomes. Copyright © 2014 Mosby, Inc. All rights reserved.
Ling, Qi; Liu, Jimin; Zhuo, Jianyong; Zhuang, Runzhou; Huang, Haitao; He, Xiangxiang; Xu, Xiao; Zheng, Shusen
2018-04-27
Donor characteristics and graft quality were recently reported to play an important role in the recurrence of hepatocellular carcinoma after liver transplantation. Our aim was to establish a prognostic model by using both donor and recipient variables. Data of 1,010 adult patients (training:validation = 2:1) undergoing primary liver transplantation for hepatocellular carcinoma were extracted from the China Liver Transplant Registry database and analyzed retrospectively. A multivariate competing risk regression model was developed and used to generate a nomogram predicting the likelihood of post-transplant hepatocellular carcinoma recurrence. Of the 673 patients in the training cohort, 70 (10.4%) had hepatocellular carcinoma recurrence with a median recurrence time of 6 months (interquartile range: 4-25 months). Cold ischemia time was the only independent donor prognostic factor for predicting hepatocellular carcinoma recurrence (hazard ratio = 2.234, P = .007). The optimal cutoff value was 12 hours when patients were grouped according to cold ischemia time at 2-hour intervals. Integrating cold ischemia time into the Milan criteria (liver transplantation candidate selection criteria) improved the accuracy of predicting hepatocellular carcinoma recurrence in both the training and validation sets (P < .05). A nomogram composed of cold ischemia time, tumor burden, differentiation, and α-fetoprotein level proved to be accurate and reliable in predicting the likelihood of 1-year hepatocellular carcinoma recurrence after liver transplantation. Additionally, donor anti-hepatitis B core antibody positivity, prolonged cold ischemia time, and anhepatic time were linked to intrahepatic recurrence, whereas older donor age, prolonged donor warm ischemia time, cold ischemia time, and ABO incompatibility were relevant to extrahepatic recurrence. The graft quality-integrated models exhibited considerable predictive accuracy in early hepatocellular carcinoma recurrence risk assessment. The identification of donor risks can further help understand the mechanism of different patterns of recurrence. Copyright © 2018 Elsevier Inc. All rights reserved.
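One way to reproduce the cutoff search is to scan candidate cold ischemia time splits at 2-hour intervals and keep the strongest separation. The study used competing-risk regression; the log-rank scan below is a simplified stand-in on simulated data, for illustration only:

import numpy as np
import pandas as pd
from lifelines.statistics import logrank_test

# Simulated cohort: cold ischemia time (h), follow-up, and recurrence flag.
rng = np.random.default_rng(4)
df = pd.DataFrame({
    "cit_h": rng.uniform(4, 20, 300),
    "time_months": rng.exponential(30, 300),
    "recurred": rng.binomial(1, 0.15, 300),
})

best = None
for cutoff in range(6, 19, 2):  # candidate cutoffs at 2-hour intervals
    lo, hi = df[df.cit_h <= cutoff], df[df.cit_h > cutoff]
    res = logrank_test(lo.time_months, hi.time_months,
                       event_observed_A=lo.recurred, event_observed_B=hi.recurred)
    if best is None or res.test_statistic > best[1]:
        best = (cutoff, res.test_statistic)
print(f"best CIT cutoff: {best[0]} h (log-rank statistic {best[1]:.2f})")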
Dynamics of Cough Frequency in Adults Undergoing Treatment for Pulmonary Tuberculosis
Bravard, Marjory A.; López, José W.; Lee, Gwenyth O.; Bui, David; Datta, Sumona; Comina, Germán; Zimic, Mirko; Coronel, Jorge; Caviedes, Luz; Cabrera, José L.; Salas, Antonio; Ticona, Eduardo; Vu, Nancy M.; Kirwan, Daniela E.; Loader, Maria-Cristina I.; Friedland, Jon S.; Moore, David A. J.; Evans, Carlton A.; Tracey, Brian H.; Gilman, Robert H.
2017-01-01
Background. Cough is the major determinant of tuberculosis transmission. Despite this, there is a paucity of information regarding characteristics of cough frequency throughout the day and in response to tuberculosis therapy. Here we evaluate the circadian cycle of cough, cough frequency risk factors, and the impact of appropriate treatment on cough and bacillary load. Methods. We prospectively evaluated human immunodeficiency virus–negative adults (n = 64) with a new diagnosis of culture-proven, drug-susceptible pulmonary tuberculosis immediately prior to treatment and repeatedly until treatment day 62. At each time point, participant cough was recorded (n = 670) and analyzed using the Cayetano Cough Monitor. Consecutive coughs at least 2 seconds apart were counted as separate cough episodes. Sputum samples (n = 426) were tested with microscopic-observation drug susceptibility broth culture, and in culture-positive samples (n = 252), the time to culture positivity was used to estimate bacillary load. Results. The highest cough frequency occurred from 1 pm to 2 pm, and the lowest from 1 am to 2 am (2.4 vs 1.1 cough episodes/hour, respectively). Cough frequency was higher among participants who had higher sputum bacillary load (P < .01). Pretreatment median cough episodes/hour was 2.3 (interquartile range [IQR], 1.2–4.1), which at 14 treatment days decreased to 0.48 (IQR, 0.0–1.4) and at the end of the study decreased to 0.18 (IQR, 0.0–0.59) (both reductions P < .001). By 14 treatment days, the probability of culture conversion was 29% (95% confidence interval, 19%–41%). Conclusions. Coughs were most frequent during daytime. Two weeks of appropriate treatment significantly reduced cough frequency and resulted in one-third of participants achieving culture conversion. Thus, treatment by 2 weeks considerably diminishes, but does not eliminate, the potential for airborne tuberculosis transmission. PMID:28329268
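The episode rule quoted above (consecutive coughs at least 2 seconds apart count as separate episodes) is simple to make concrete. A minimal sketch, assuming cough sounds have already been detected and timestamped:

```python
# Group detected cough timestamps (seconds) into episodes using the
# 2-second gap rule described above; illustrative, not the monitor's code.
def count_episodes(cough_times, gap=2.0):
    """A new episode starts when the gap from the previous cough is >= `gap`."""
    episodes = 0
    last = None
    for t in sorted(cough_times):
        if last is None or t - last >= gap:
            episodes += 1
        last = t
    return episodes

# Three bursts of coughing -> 3 episodes
print(count_episodes([0.0, 0.4, 0.9, 5.0, 5.3, 30.0]))  # prints 3
```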
Rapid HIV-1 testing during labor: a multicenter study.
Bulterys, Marc; Jamieson, Denise J; O'Sullivan, Mary Jo; Cohen, Mardge H; Maupin, Robert; Nesheim, Steven; Webber, Mayris P; Van Dyke, Russell; Wiener, Jeffrey; Branson, Bernard M
2004-07-14
Timely testing of women in labor with undocumented human immunodeficiency virus (HIV) status could enable immediate provision of antiretroviral prophylaxis. The objectives were to determine the feasibility and acceptance of rapid HIV testing among women in labor and to assess rapid HIV assay performance. The Mother-Infant Rapid Intervention At Delivery (MIRIAD) study implemented 24-hour counseling and voluntary rapid HIV testing for women in labor at 16 US hospitals from November 16, 2001, through November 15, 2003. A rapid HIV-1 antibody test for whole blood was used. Main outcome measures were acceptance of HIV testing; sensitivity, specificity, and predictive value of the rapid test; and time from blood collection to patient notification of results. There were 91,707 visits to the labor and delivery units in the study, 7381 of which were by eligible women without documentation of HIV testing. Of these, 5744 (78%) women were approached for rapid HIV testing and 4849 (84%) consented. HIV-1 test results were positive for 34 women (prevalence = 7/1000). Sensitivity and specificity of the rapid test were 100% and 99.9%, respectively; positive predictive value was 90%, compared with 76% for enzyme immunoassay (EIA). Factors independently associated with higher test acceptance included younger age, being black or Hispanic, gestational age less than 32 weeks, and having had no prenatal care. Lower acceptance was associated with being admitted between 4 pm and midnight, particularly on Friday nights, but this may be explained in part by fewer available personnel. Median time from blood collection to patient notification of result was 66 minutes (interquartile range, 45-120 minutes), compared with 28 hours for EIA (P<.001). Rapid HIV testing is feasible and delivers accurate and timely test results for women in labor. It provides HIV-positive women prompt access to intrapartum and neonatal antiretroviral prophylaxis, proven to reduce perinatal HIV transmission, and may be particularly applicable to higher-risk populations.
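The reported predictive values follow directly from Bayes' rule, which makes the figures easy to sanity-check. With sensitivity 100%, specificity 99.9%, and prevalence of roughly 7 per 1,000, the expected positive predictive value lands near the 90% reported (the exact published figure reflects the observed counts rather than these rounded inputs):

```python
# Sanity check of the reported PPV from rounded summary figures.
def ppv(sens, spec, prev):
    true_pos = sens * prev                  # expected true-positive rate
    false_pos = (1 - spec) * (1 - prev)     # expected false-positive rate
    return true_pos / (true_pos + false_pos)

print(f"PPV ≈ {ppv(1.00, 0.999, 0.007):.0%}")  # ≈ 88%, close to the 90% reported
```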
Taniguchi, Tomohiko; Shiomi, Hiroki; Toyota, Toshiaki; Morimoto, Takeshi; Akao, Masaharu; Nakatsuma, Kenji; Ono, Koh; Makiyama, Takeru; Shizuta, Satoshi; Furukawa, Yutaka; Nakagawa, Yoshihisa; Ando, Kenji; Kadota, Kazushige; Horie, Minoru; Kimura, Takeshi
2014-10-15
The influence of preinfarction angina pectoris (AP) on long-term clinical outcomes in patients with ST-segment elevation myocardial infarction (STEMI) who underwent primary percutaneous coronary intervention (PCI) remains controversial. Of 5,429 patients with acute myocardial infarction (AMI) enrolled in the Coronary Revascularization Demonstrating Outcome Study in Kyoto AMI Registry, the present study population consisted of 3,476 patients with STEMI who underwent primary PCI within 24 hours of symptom onset and in whom data on preinfarction AP were available. Preinfarction AP, defined as AP occurring within 48 hours of hospital arrival, was present in 675 patients (19.4%). Patients with preinfarction AP were younger, more often had anterior AMI, and had longer total ischemic time, whereas they less often had a history of heart failure, atrial fibrillation, or shock presentation. The infarct size estimated by peak creatine phosphokinase was significantly smaller in patients with than in patients without preinfarction AP (median [interquartile range] 2,141 [965 to 3,867] IU/L vs 2,462 [1,257 to 4,495] IU/L, p <0.001). The cumulative 5-year incidence of death was significantly lower in patients with preinfarction AP (12.4% vs 20.7%, p <0.001) over a median follow-up of 1,845 days. After adjusting for confounders, preinfarction AP was independently associated with a lower risk of death (hazard ratio 0.69, 95% confidence interval 0.54 to 0.86, p = 0.001). The lower risk of 5-year mortality in patients with preinfarction AP was consistently observed across subgroups stratified by total ischemic time, initial Thrombolysis In Myocardial Infarction flow grade, hemodynamic status, infarct location, and diabetes mellitus. In conclusion, preinfarction AP was independently associated with lower 5-year mortality in patients with STEMI who underwent primary PCI. Copyright © 2014 Elsevier Inc. All rights reserved.
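The adjusted hazard ratio above is the output of a proportional-hazards model. For readers who want to reproduce the form of the analysis, here is a minimal sketch with the lifelines package on simulated data; the column names and effect sizes are assumptions, not registry values.

```python
# Toy Cox proportional-hazards fit illustrating an adjusted hazard ratio
# for preinfarction angina; all data are simulated.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({
    "preinfarction_ap": rng.integers(0, 2, n),
    "age": rng.integers(40, 90, n),
    "anterior_mi": rng.integers(0, 2, n),
})
# Survival times are longer, on average, with preinfarction angina (assumed)
df["days"] = rng.exponential(1500 + 700 * df["preinfarction_ap"])
df["death"] = (rng.random(n) < 0.25).astype(int)

cph = CoxPHFitter().fit(df, duration_col="days", event_col="death")
print(cph.hazard_ratios_)  # HR < 1 for preinfarction_ap indicates lower risk
```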
Gilliom, Robert J.; Helsel, Dennis R.
1986-01-01
A recurring difficulty encountered in investigations of many metals and organic contaminants in ambient waters is that a substantial portion of water sample concentrations are below limits of detection established by analytical laboratories. Several methods were evaluated for estimating distributional parameters for such censored data sets using only uncensored observations. Their reliabilities were evaluated by a Monte Carlo experiment in which small samples were generated from a wide range of parent distributions and censored at varying levels. Eight methods were used to estimate the mean, standard deviation, median, and interquartile range. Criteria were developed, based on the distribution of uncensored observations, for determining the best performing parameter estimation method for any particular data set. The most robust method for minimizing error in censored-sample estimates of the four distributional parameters over all simulation conditions was the log-probability regression method. With this method, censored observations are assumed to follow the zero-to-censoring level portion of a lognormal distribution obtained by a least squares regression between logarithms of uncensored concentration observations and their z scores. When method performance was separately evaluated for each distributional parameter over all simulation conditions, the log-probability regression method still had the smallest errors for the mean and standard deviation, but the lognormal maximum likelihood method had the smallest errors for the median and interquartile range. When data sets were classified prior to parameter estimation into groups reflecting their probable parent distributions, the ranking of estimation methods was similar, but the accuracy of error estimates was markedly improved over those without classification.
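Because the log-probability regression method recurs throughout this abstract, a concrete sketch may help. For the single-detection-limit case the recipe is: assign normal scores (z-scores of plotting positions) to the full ordered sample, regress log concentrations of the uncensored values on their scores, and represent the censored fraction by back-transformed points from the fitted line. The code below is an illustrative simplification under those assumptions, not the authors' implementation.

```python
# Log-probability regression (regression on order statistics) sketch for
# censored data with a single detection limit; illustrative only.
import numpy as np
from scipy import stats

def ros_estimates(uncensored, n_censored):
    n = len(uncensored) + n_censored
    pp = np.arange(1, n + 1) / (n + 1)        # Weibull plotting positions
    z = stats.norm.ppf(pp)                    # normal scores for all ranks
    logs = np.log(np.sort(uncensored))
    # Censored values occupy the lowest ranks; fit on the uncensored ranks
    slope, intercept, *_ = stats.linregress(z[n_censored:], logs)
    # Impute the censored portion from the fitted lognormal line
    imputed = np.exp(intercept + slope * z[:n_censored])
    full = np.concatenate([imputed, np.sort(uncensored)])
    q1, med, q3 = np.percentile(full, [25, 50, 75])
    return {"mean": full.mean(), "sd": full.std(ddof=1),
            "median": med, "iqr": q3 - q1}

detected = [0.8, 1.1, 1.4, 2.0, 3.5, 6.2]     # concentrations above the limit
print(ros_estimates(detected, n_censored=4))  # 4 values below detection limit
```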
Ghosh, Jo Kay C.; Wilhelm, Michelle; Su, Jason; Goldberg, Daniel; Cockburn, Myles; Jerrett, Michael; Ritz, Beate
2012-01-01
Few studies have examined associations of birth outcomes with toxic air pollutants (air toxics) in traffic exhaust. This study included 8,181 term low birth weight (LBW) children and 370,922 term normal-weight children born between January 1, 1995, and December 31, 2006, to women residing within 5 miles (8 km) of an air toxics monitoring station in Los Angeles County, California. Additionally, land-use-based regression (LUR)-modeled estimates of levels of nitric oxide, nitrogen dioxide, and nitrogen oxides were used to assess the influence of small-area variations in traffic pollution. The authors examined associations with term LBW (≥37 weeks’ completed gestation and birth weight <2,500 g) using logistic regression adjusted for maternal age, race/ethnicity, education, parity, infant gestational age, and gestational age squared. Odds of term LBW increased 2%–5% (95% confidence intervals ranged from 1.00 to 1.09) per interquartile-range increase in LUR-modeled estimates and monitoring-based air toxics exposure estimates in the entire pregnancy, the third trimester, and the last month of pregnancy. Models stratified by monitoring station (to investigate air toxics associations based solely on temporal variations) resulted in 2%–5% increased odds per interquartile-range increase in third-trimester benzene, toluene, ethyl benzene, and xylene exposures, with some confidence intervals containing the null value. This analysis highlights the importance of both spatial and temporal contributions to air pollution in epidemiologic birth outcome studies. PMID:22586068
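The "per interquartile-range increase" convention used above rescales a logistic-regression coefficient so that odds ratios are comparable across pollutants measured in different units: OR per IQR = exp(beta × IQR). A short illustration with assumed numbers (not values from the study):

```python
# Rescaling a log-odds coefficient to a per-IQR odds ratio; numbers assumed.
import numpy as np

beta = 0.0008   # log-odds per unit of exposure (assumed)
iqr = 40.0      # interquartile range of the exposure distribution (assumed)
print(f"OR per IQR increase: {np.exp(beta * iqr):.3f}")  # ~1.03, ~3% higher odds
```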
Financial Ties Between Emergency Physicians and Industry: Insights From Open Payments Data.
Fleischman, William; Ross, Joseph S; Melnick, Edward R; Newman, David H; Venkatesh, Arjun K
2016-08-01
The Open Payments program requires reporting of payments by medical product companies to teaching hospitals and licensed physicians. We seek to describe nonresearch, nonroyalty payments made to emergency physicians in the United States. We performed a descriptive analysis of the most recent Open Payments data released to the public by the Centers for Medicare & Medicaid Services covering the 2014 calendar year. We calculated the median payment, the total pay per physician, the types of payments, and the drugs and devices associated with payments to emergency physicians. For context, we also calculated total pay per physician and the percentage of active physicians receiving payments for all specialties. There were 46,405 payments totaling $10,693,310 to 12,883 emergency physicians, representing 30% of active emergency physicians in 2013. The percentage of active physicians within a specialty who received a payment ranged from 14.6% in preventive medicine to 91% in orthopedic surgery. The median payment and median total pay to emergency physicians were $16 (interquartile range $12 to $68) and $44 (interquartile range $16 to $123), respectively. The majority of payments (83%) were less than $100. Food and beverage (86%) was the most frequent type of payment. The most common products associated with payments to emergency physicians were rivaroxaban, apixaban, ticagrelor, ceftaroline, canagliflozin, dabigatran, and alteplase. Nearly a third of emergency physicians received nonresearch, nonroyalty payments from industry in 2014. Most payments were of small monetary value and for activities related to the marketing of antithrombotic drugs. Copyright © 2016 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
Estimation of distributional parameters for censored trace-level water-quality data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilliom, R.J.; Helsel, D.R.
1984-01-01
A recurring difficulty encountered in investigations of many metals and organic contaminants in ambient waters is that a substantial portion of water-sample concentrations are below limits of detection established by analytical laboratories. Several methods were evaluated for estimating distributional parameters for such censored data sets using only uncensored observations. Their reliabilities were evaluated by a Monte Carlo experiment in which small samples were generated from a wide range of parent distributions and censored at varying levels. Eight methods were used to estimate the mean, standard deviation, median, and interquartile range. Criteria were developed, based on the distribution of uncensored observations, for determining the best-performing parameter estimation method for any particular data set. The most robust method for minimizing error in censored-sample estimates of the four distributional parameters over all simulation conditions was the log-probability regression method. With this method, censored observations are assumed to follow the zero-to-censoring level portion of a lognormal distribution obtained by a least-squares regression between logarithms of uncensored concentration observations and their z scores. When method performance was separately evaluated for each distributional parameter over all simulation conditions, the log-probability regression method still had the smallest errors for the mean and standard deviation, but the lognormal maximum likelihood method had the smallest errors for the median and interquartile range. When data sets were classified prior to parameter estimation into groups reflecting their probable parent distributions, the ranking of estimation methods was similar, but the accuracy of error estimates was markedly improved over those without classification.
Daniels, Tracey; Goodacre, Lynne; Sutton, Chris; Pollard, Kim; Conway, Steven; Peckham, Daniel
2011-08-01
People with cystic fibrosis have a high treatment burden. While uncertainty remains about individual patients' levels of adherence to medication, treatment regimens are difficult to tailor, and interventions are difficult to evaluate. Self- and clinician-reported measures are routinely used despite criticism that they overestimate adherence. This study assessed agreement between rates of adherence to prescribed nebulizer treatments when measured by self-report, clinician report, and electronic monitoring suitable for long-term use. Seventy-eight adults with cystic fibrosis were questioned about their adherence to prescribed nebulizer treatments over the previous 3 months. Self-report was compared with clinician report and with stored adherence data downloaded from the I-Neb nebulizer system. Adherence measures were expressed as a percentage of the prescribed regimen; bias was estimated as the paired mean difference (95% CI) between patient-reported or clinician-reported and actual adherence. Agreement between adherence measures was calculated using intraclass correlation coefficients (95% CI), and disagreements for individuals were displayed using Bland-Altman plots. Patient-identified prescriptions matched the medical record prescription. Median self-reported adherence was 80% (interquartile range, 60%-95%), whereas median adherence measured by nebulizer download was 36% (interquartile range, 5%-84.5%). Nine participants overmedicated and underreported adherence. Median clinician report ranged from 50% to 60%, depending on profession. Extensive discrepancies between self-report and clinician report compared with nebulizer download were identified for individuals. Self- and clinician-reporting of adherence does not provide accurate measurement of adherence when compared with electronic monitoring. Using inaccurate measures has implications for treatment burden, clinician prescribing practices, cost, and accuracy of trial data.
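The Bland-Altman analysis mentioned above is straightforward to reproduce: compute paired differences between the two measurement methods, then report the mean difference (bias) and its 95% limits of agreement. A minimal sketch with made-up adherence percentages:

```python
# Bland-Altman bias and 95% limits of agreement between self-reported and
# electronically recorded adherence; the percentages are illustrative.
import numpy as np

self_report = np.array([80, 95, 60, 100, 70, 90, 85, 75])  # % adherence
download    = np.array([36, 84,  5,  90, 20, 88, 40, 30])  # % adherence

diff = self_report - download
bias = diff.mean()
half_width = 1.96 * diff.std(ddof=1)
print(f"bias = {bias:.1f}% (self-report minus download)")
print(f"95% limits of agreement: {bias - half_width:.1f}% to {bias + half_width:.1f}%")
```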
Sympathetic nerve dysfunction is common in patients with chronic intestinal pseudo-obstruction.
Mattsson, Tomas; Roos, Robert; Sundkvist, Göran; Valind, Sven; Ohlsson, Bodil
2008-02-01
To clarify whether disturbances in the autonomic nervous system, reflected in abnormal cardiovascular reflexes, could explain symptoms of impaired heat regulation in patients with intestinal pseudo-obstruction. Chronic intestinal pseudo-obstruction is a clinical syndrome characterized by diffuse, unspecific gastrointestinal symptoms due to damage to the enteric nervous system or the smooth muscle cells. These patients often complain of excessive sweating or feeling cold, suggesting disturbances in the autonomic nervous system. Earlier studies have pointed to a coexistence of autonomic disturbances in the enteric and cardiovascular nervous systems. Thirteen consecutive patients (age range 23 to 79, mean 44 y) fulfilling the criteria for chronic intestinal pseudo-obstruction were investigated. Six of them complained of sweating or a feeling of cold. Examination of autonomic reflexes included heart rate variation during deep breathing (expiration/inspiration index), heart rate reaction to tilt (acceleration index, brake index), and vasoconstriction (VAC) in response to indirect cooling, measured by laser Doppler (VAC index; a high index indicates impaired VAC). Test results in patients were compared with those of healthy individuals. Patients had a significantly higher (more abnormal) median VAC index compared with healthy controls [1.79 (interquartile range 1.89) vs. 0.08 (interquartile range 1.29); P=0.0007]. However, symptoms of impaired heat regulation were not related to the VAC index. There were no differences in expiration/inspiration index, acceleration index, or brake index between patients and controls. The patients with severe gastrointestinal dysmotility showed impaired sympathetic nerve function, which, however, did not seem to be associated with symptoms of impaired heat regulation.
Anomaly Detection for Data Reduction in an Unattended Ground Sensor (UGS) Field
2014-09-01
[Abstract not recovered; only front-matter fragments were extracted. The report's acronym list survives: IQR, interquartile range; MANET, mobile ad hoc network; OSUS, Open Standards for Unattended Sensors; TOC, tactical operations center; UAV, unmanned aerial vehicle. The fragments indicate that sensor clusters communicate over a mobile ad hoc network connected to other nodes.]
Compact near-IR and mid-IR cavity ring down spectroscopy device
NASA Technical Reports Server (NTRS)
Miller, J. Houston (Inventor)
2011-01-01
This invention relates to a compact cavity ring down spectrometer for detection and measurement of trace species in a sample gas, using either a tunable solid-state continuous-wave mid-infrared PPLN OPO laser or a tunable low-power solid-state continuous-wave near-infrared diode laser, together with an algorithm that reduces periodic noise in the voltage decay signal either by subjecting the data to cluster analysis or by averaging over the interquartile range of the data.
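The interquartile-averaging idea named in the abstract can be sketched simply: after each decay is fitted for a ring-down time constant, average only the estimates that fall within the interquartile range, so outliers from periodic noise do not bias the result. This is a hedged illustration of the stated idea with assumed numbers, not the patented algorithm itself.

```python
# Interquartile mean of repeated ring-down decay-time estimates (tau);
# the upstream exponential fitting is assumed to have happened already.
import numpy as np

def interquartile_mean(taus):
    """Mean of the values lying between the 25th and 75th percentiles."""
    taus = np.asarray(taus, dtype=float)
    q1, q3 = np.percentile(taus, [25, 75])
    return taus[(taus >= q1) & (taus <= q3)].mean()

# Repeated tau estimates in microseconds, one corrupted by periodic noise
taus = [12.1, 12.0, 12.2, 11.9, 12.1, 14.8, 12.0, 12.0]
print(f"plain mean: {np.mean(taus):.2f} us, "
      f"interquartile mean: {interquartile_mean(taus):.2f} us")
```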
Ibrahim, Moustafa I; Ramy, Ahmed R; Abdelhamid, Ahmed S; Ellaithy, Mohamed I; Omar, Amna; Harara, Rany M; Fathy, Hayam; Abolouz, Ashraf S
2017-03-01
To assess maternal serum amyloid A (SAA) levels among women with primary unexplained recurrent early pregnancy loss (REPL). A prospective study was conducted among women with missed spontaneous abortion in the first trimester at Ain Shams University Maternity Hospital, Cairo, Egypt, between January 21 and December 25, 2014. Women with at least two consecutive primary unexplained REPLs and no previous live births were enrolled. A control group was formed of women with no history of REPL who had at least one previous uneventful pregnancy with no adverse outcomes. Serum samples were collected to measure SAA levels. The main outcome was the association between SAA and primary unexplained REPL. Each group contained 96 participants. Median SAA level was significantly higher among women with REPL (50.0 μg/mL, interquartile range 26.0-69.0) than among women in the control group (11.6 μg/mL, interquartile range 6.2-15.5; P<0.001). The SAA level was an independent indicator of primary unexplained REPL, after adjusting for maternal age and gestational age (odds ratio 1.12, 95% confidence interval 1.06-1.19; P<0.001). Elevated SAA levels found among women with primary unexplained REPL could represent a novel biomarker for this complication of pregnancy. © 2016 International Federation of Gynecology and Obstetrics.
Autonomic regulation in fetuses with Congenital Heart Disease
Siddiqui, Saira; Wilpers, Abigail; Myers, Michael; Nugent, J. David; Fifer, William P.; Williams, Ismée A.
2015-01-01
Background Exposure to antenatal stressors affects autonomic regulation in fetuses. Whether the presence of congenital heart disease (CHD) alters the developmental trajectory of autonomic regulation is not known. Aims/Study Design This prospective observational cohort study aimed to further characterize autonomic regulation in fetuses with CHD, specifically hypoplastic left heart syndrome (HLHS), transposition of the great arteries (TGA), and tetralogy of Fallot (TOF). Subjects From 11/2010 to 11/2012, 92 fetuses were enrolled: 41 controls and 51 with CHD, consisting of 19 with HLHS, 12 with TGA, and 20 with TOF. Maternal abdominal fetal electrocardiogram (ECG) recordings were obtained at 3 gestational ages: 19-27 weeks (F1), 28-33 weeks (F2), and 34-38 weeks (F3). Outcome measures Fetal ECG was analyzed for mean heart rate along with 3 measures of variability of the fetal heart rate: interquartile range, standard deviation, and the root mean square of successive differences of the heart rate (RMSSD), a measure of parasympathetic activity. Results During the F1 and F2 periods, HLHS fetuses demonstrated significantly lower mean heart rate than controls (p<0.05). Heart rate variability at F3, as measured by standard deviation, interquartile range, and RMSSD, was lower in HLHS fetuses than in controls (p<0.05). Other CHD subgroups showed a similar, though non-significant, trend toward lower variability. Conclusions Autonomic regulation in CHD fetuses differs from controls, with HLHS fetuses most markedly affected. PMID:25662702
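RMSSD, the parasympathetic index referred to above, is a one-line computation once interbeat (R-R) intervals are available. A minimal sketch with illustrative values:

```python
# Root mean square of successive differences (RMSSD) of R-R intervals.
import numpy as np

def rmssd(rr_ms):
    rr_ms = np.asarray(rr_ms, dtype=float)
    return np.sqrt(np.mean(np.diff(rr_ms) ** 2))

rr = [410, 425, 418, 430, 422, 435]   # fetal R-R intervals in ms (made up)
print(f"RMSSD = {rmssd(rr):.1f} ms")  # higher values -> more vagal modulation
```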
Crisp, Tom; Khan, Faisal; Padhiar, Nat; Morrissey, Dylan; King, John; Jalan, Rosy; Maffulli, Nicola; Chan, Otto
2008-01-01
To evaluate a novel conservative management modality for patellar tendinopathy. We recruited nine patients with patellar tendinopathy who had failed conservative management and showed evidence of neovascularisation on power Doppler scanning. Each received a high-volume ultrasound-guided injection at the interface between the patellar tendon and Hoffa's body. The injection contained 10 ml of 0.5% bupivacaine, 25 mg of hydrocortisone, and between 12 and 40 ml of normal saline. Outcomes were 100 mm visual analogue scales (VAS) for pain and for function, and Victorian Institute of Sport Assessment - Patellar tendon (VISA-P) questionnaires at an average of 9 months from the injection. All but one patient (whose pain was unchanged) improved (p = 0.028). The mean improvement in function 2 weeks after injection was 58 mm on the VAS (interquartile range 27 - 88, p = 0.018). The mean improvement in pain 2 weeks after injection was 56 mm on the VAS (interquartile range 32 - 80, p = 0.018). At a mean follow-up of 9 months, an improvement of 22 points from a baseline score of 46 on the VISA-P questionnaire (100 being normal) was observed. High-volume injections to mechanically disrupt the neovascularisation in patellar tendinopathy are helpful in the management of this condition. Controlled trials are warranted to investigate this management modality more conclusively.