Mukkamalla, Shiva Kumar R; Naseri, Hussain M; Kim, Byung M; Katz, Steven C; Armenio, Vincent A
2018-04-01
Background: Cholangiocarcinoma (CCA) includes cancers arising from the intrahepatic and extrahepatic bile ducts. The etiology and pathogenesis of CCA remain poorly understood. This is the first study investigating both incidence patterns of CCA from 1973 through 2012 and demographic, clinical, and treatment variables affecting survival of patients with CCA. Patients and Methods: Using the SEER database, age-adjusted incidence rates were evaluated from 1973-2012 using SEER*Stat software. A retrospective cohort of 26,994 patients diagnosed with CCA from 1973-2008 was identified for survival analysis. Cox proportional hazards models were used to perform multivariate survival analysis. Results: Overall incidence of CCA increased by 65% from 1973-2012. Extrahepatic CCA (ECC) remained more common than intrahepatic CCA (ICC), although incidence rates for ICC increased by 350% compared with a 20% increase for ECC. Men belonging to non-African American and non-Caucasian ethnicities had the highest incidence rates of CCA. This trend persisted throughout the study period, although African Americans and Caucasians saw 50% and 59% increases in incidence rates, respectively, compared with a 9% increase among other races. Median overall survival (OS) was 8 months in patients with ECC compared with 4 months in those with ICC. Our survival analysis found Hispanic women to have the best 5-year survival outcome (P < .0001). OS diminished with age (P < .0001), and ECC had better survival outcomes compared with ICC (P < .0001). Patients who were married, were nonsmokers, belonged to a higher income class, and underwent surgery had better survival outcomes compared with others (P < .0001). Conclusions: This is the most up-to-date study of CCA from the SEER registry that shows temporal patterns of increasing incidence of CCA across different races, sexes, and ethnicities. We identified age, sex, race, marital status, income, smoking status, anatomic location of CCA, tumor grade, tumor stage, radiation, and surgery as independent prognostic factors for OS in patients with CCA. Copyright © 2018 by the National Comprehensive Cancer Network.
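As a rough illustration of the multivariable Cox proportional hazards analysis described above, the sketch below fits a toy overall-survival model in Python with the lifelines library. The data frame, covariates, and values are invented placeholders, not the SEER cohort or the authors' model specification.

```python
# Hedged sketch (not the authors' code): a multivariable Cox proportional
# hazards model of overall survival, of the kind described in the abstract.
# Column names and values below are hypothetical placeholders.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "survival_months": [4, 8, 12, 3, 20, 7, 15, 2, 9, 30, 6, 18],
    "death":           [1, 1, 0, 1, 0, 1, 1, 1, 1, 0, 1, 1],
    "age":             [72, 65, 58, 80, 54, 69, 61, 77, 66, 50, 74, 59],
    "intrahepatic":    [1, 0, 1, 1, 0, 1, 0, 1, 0, 0, 0, 1],  # ICC vs ECC
    "surgery":         [0, 1, 1, 0, 1, 0, 1, 0, 1, 1, 1, 0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="survival_months", event_col="death")
cph.print_summary()          # hazard ratios with 95% CIs and p-values
print(cph.hazard_ratios_)    # e.g., HR > 1 for "intrahepatic" implies worse OS
```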
Beckman, Sarah A.; Sekiya, Naosumi; Chen, William C.W.; Mlakar, Logan; Tobita, Kimimassa; Huard, Johnny
2017-01-01
Introduction Since myoblasts have been limited by poor cell survival after cellular myoplasty, the major goal of the current study was to determine whether improving myoblast survival with an antioxidant could improve cardiac function after the transplantation of the myoblasts into an acute myocardial infarction. Background We previously demonstrated that early myogenic progenitors such as muscle-derived stem cells (MDSCs) exhibited superior cell survival and improved cardiac repair after transplantation into infarcted hearts compared to myoblasts, which we partially attributed to the MDSCs’ higher antioxidant levels. Aim To determine if antioxidant treatment could increase myoblast survival, subsequently improving cardiac function after myoblast transplantation into infarcted hearts. Materials and Methods Myoblasts were pre-treated with the antioxidant N-acetylcysteine (NAC) or the glutathione depleter, diethyl maleate (DEM), and injected into infarcted murine hearts. Regenerative potential was monitored by cell survival and cardiac function. Results At early time points, hearts injected with NAC-treated myoblasts exhibited increased donor cell survival, greater cell proliferation, and decreased cellular apoptosis, compared to untreated myoblasts. NAC-treated myoblasts significantly improved cardiac contractility, reduced fibrosis, and increased vascular density compared to DEM-treated myoblasts, but compared to untreated myoblasts, no difference was noted. Discussion While early survival of myoblasts transplanted into infarcted hearts was augmented by NAC pre-treatment, cardiac function remained unchanged compared to non-treated myoblasts. Conclusion Despite the improved cell survival achieved with NAC-treated myoblast transplantation in an MI heart, cardiac function remained similar to that with untreated myoblasts. These results suggest that the reduced cardiac regenerative potential of myoblasts, when compared to MDSCs, is not only attributable to cell survival but is probably also related to the secretion of paracrine factors by the MDSCs. PMID:28989945
Survival and Neurodevelopmental Outcomes among Periviable Infants
Younge, Noelle; Goldstein, Ricki F.; Bann, Carla M.; Hintz, Susan R.; Patel, Ravi M.; Smith, P. Brian; Bell, Edward F.; Rysavy, Matthew A.; Duncan, Andrea F.; Vohr, Betty R.; Das, Abhik; Goldberg, Ronald N.; Higgins, Rosemary D.; Cotten, C. Michael
2017-01-01
BACKGROUND Data reported during the past 5 years indicate that rates of survival have increased among infants born at the borderline of viability, but less is known about how increased rates of survival among these infants relate to early childhood neurodevelopmental outcomes. METHODS We compared survival and neurodevelopmental outcomes among infants born at 22 to 24 weeks of gestation, as assessed at 18 to 22 months of corrected age, across three consecutive birth-year epochs (2000–2003 [epoch 1], 2004–2007 [epoch 2], and 2008–2011 [epoch 3]). The infants were born at 11 centers that participated in the National Institute of Child Health and Human Development Neonatal Research Network. The primary outcome measure was a three-level outcome — survival without neurodevelopmental impairment, survival with neurodevelopmental impairment, or death. After accounting for differences in infant characteristics, including birth center, we used multinomial generalized logit models to compare the relative risk of survival without neurodevelopmental impairment, survival with neurodevelopmental impairment, and death. RESULTS Data on the primary outcome were available for 4274 of 4458 infants (96%) born at the 11 centers. The percentage of infants who survived increased from 30% (424 of 1391 infants) in epoch 1 to 36% (487 of 1348 infants) in epoch 3 (P<0.001). The percentage of infants who survived without neurodevelopmental impairment increased from 16% (217 of 1391) in epoch 1 to 20% (276 of 1348) in epoch 3 (P = 0.001), whereas the percentage of infants who survived with neurodevelopmental impairment did not change significantly (15% [207 of 1391] in epoch 1 and 16% [211 of 1348] in epoch 3, P = 0.29). After adjustment for changes in the baseline characteristics of the infants over time, both the rate of survival with neurodevelopmental impairment (as compared with death) and the rate of survival without neurodevelopmental impairment (as compared with death) increased over time (adjusted relative risks, 1.27 [95% confidence interval {CI}, 1.01 to 1.59] and 1.59 [95% CI, 1.28 to 1.99], respectively). CONCLUSIONS The rate of survival without neurodevelopmental impairment increased between 2000 and 2011 in this large cohort of periviable infants. (Funded by the National Institutes of Health and others; ClinicalTrials.gov numbers, NCT00063063 and NCT00009633.) PMID:28199816
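The three-level outcome analysis described above uses a multinomial generalized logit model. The following hedged Python sketch (simulated data; the covariates, coding, and sample are assumptions, not the Neonatal Research Network's model, which also adjusted for birth center and other infant characteristics) shows how exponentiated coefficients from such a model yield relative risks of each survival outcome versus death.

```python
# Hedged sketch: a multinomial generalized logit for a three-level outcome
# (0 = death, 1 = survival with NDI, 2 = survival without NDI), with death
# as the reference category. Simulated placeholder data only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "epoch": rng.integers(1, 4, n),        # birth-year epochs 1, 2, 3
    "gest_age": rng.integers(22, 25, n),   # 22-24 completed weeks
})
df["outcome"] = rng.choice([0, 1, 2], size=n, p=[0.65, 0.15, 0.20])

X = sm.add_constant(df[["epoch", "gest_age"]])
fit = sm.MNLogit(df["outcome"], X).fit(disp=False)
print(fit.summary())
# exp(coef) on 'epoch' approximates the relative risk ratio per epoch of each
# survival outcome versus death (cf. the adjusted RRs of 1.27 and 1.59).
print(np.exp(fit.params))
```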
Surgical therapy of canine nasal tumors: A retrospective study (1982-1986)
Laing, Elizabeth J.; Binnington, Allen G.
1988-01-01
The results of surgical therapy in 15 dogs with histologically confirmed nasal tumors were analyzed retrospectively and compared to previous reports. Median survival time for all dogs was seven months. When adjusted for non-tumor-related deaths, median survival increased to nine months. These values are two to three times longer than those in previous reports. To determine possible prognostic indicators, tumor stage, location, and histological type were compared to survival time. Dogs with unilateral nasal tumors had a median survival of 11 months, as compared to three months for dogs with bilateral tumors (p = 0.005). Tumor stage and histological type were not significant factors in comparing survival times. PMID:17423139
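A minimal sketch of how a censoring-adjusted median survival of the kind reported above can be computed with a Kaplan-Meier estimator, treating non-tumor-related deaths as censored; the survival times below are hypothetical, not the study's data.

```python
# Hedged sketch: Kaplan-Meier median survival with non-tumor-related deaths
# treated as censored observations. Hypothetical times in months.
from lifelines import KaplanMeierFitter

months = [2, 3, 4, 6, 7, 7, 9, 10, 11, 12, 14, 15, 18, 20, 24]
# 1 = tumor-related death, 0 = censored (alive or non-tumor-related death)
tumor_death = [1, 1, 1, 1, 1, 0, 1, 1, 0, 1, 1, 0, 1, 0, 0]

kmf = KaplanMeierFitter()
kmf.fit(months, event_observed=tumor_death, label="all dogs")
print(kmf.median_survival_time_)    # adjusted median survival, months
print(kmf.survival_function_.head())  # full KM survival curve
```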
Epinephrine in cardiac arrest: systematic review and meta-analysis
Morales-Cané, Ignacio; Valverde-León, María Del Rocío; Rodríguez-Borrego, María Aurora
2016-01-01
Objective: to evaluate the effectiveness of epinephrine used during cardiac arrest and its effect on survival rates and neurological condition. Method: systematic review of scientific literature with meta-analysis, using a random effects model. The following databases were used to search for clinical trials and observational studies: Medline, Embase and Cochrane, from 2005 to 2015. Results: when the Return of Spontaneous Circulation (ROSC) with administration of epinephrine was compared with ROSC without administration, increased rates were found with administration (OR 2.02; 95% CI 1.49 to 2.75; I2 = 95%). Meta-analysis showed an increase in survival to discharge or 30 days after administration of epinephrine (OR 1.23; 95% CI 1.05-1.44; I2 = 83%). Stratification by shockable and non-shockable rhythms showed an increase in survival for non-shockable rhythm (OR 1.52; 95% CI 1.29-1.78; I2 = 42%). When compared with delayed administration, the administration of epinephrine within 10 minutes showed an increased survival rate (OR 2.03; 95% CI 1.77-2.32; I2 = 0%). Conclusion: administration of epinephrine appears to increase the rate of ROSC, but when compared with other therapies, no positive effect was found on survival rates of patients with favorable neurological status. PMID:27982306
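The pooled odds ratios and I2 values above come from a random-effects meta-analysis. The sketch below implements the standard DerSimonian-Laird estimator on made-up study-level odds ratios; it illustrates the technique only and is not the review's dataset or software.

```python
# Hedged sketch: DerSimonian-Laird random-effects pooling of log odds ratios
# with an I^2 heterogeneity estimate. Study ORs and SEs are illustrative only.
import numpy as np

log_or = np.log(np.array([2.4, 1.6, 2.1, 1.2, 2.9]))   # per-study ORs
se     = np.array([0.30, 0.25, 0.40, 0.20, 0.35])      # SE of log(OR)

w_fixed = 1 / se**2
q = np.sum(w_fixed * (log_or - np.average(log_or, weights=w_fixed))**2)
k = len(log_or)
tau2 = max(0.0, (q - (k - 1)) /
           (w_fixed.sum() - (w_fixed**2).sum() / w_fixed.sum()))
i2 = max(0.0, (q - (k - 1)) / q) * 100   # heterogeneity, %

w_rand = 1 / (se**2 + tau2)
pooled = np.average(log_or, weights=w_rand)
se_pooled = np.sqrt(1 / w_rand.sum())
lo, hi = np.exp([pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled])
print(f"pooled OR = {np.exp(pooled):.2f}, 95% CI {lo:.2f}-{hi:.2f}, I2 = {i2:.0f}%")
```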
Lin, Pei-Jung; Concannon, Thomas W; Greenberg, Dan; Cohen, Joshua T; Rossi, Gregory; Hille, Jeffrey; Auerbach, Hannah R; Fang, Chi-Hui; Nadler, Eric S; Neumann, Peter J
2013-08-01
To investigate the relationship between the framing of survival gains and the perceived value of cancer care. Through a population-based survey of 2040 US adults, respondents were randomized to one of the two sets of hypothetical scenarios, each of which described the survival benefit for a new treatment as either an increase in median survival time (median survival), or an increase in the probability of survival for a given length of time (landmark survival). Each respondent was presented with two randomly selected scenarios with different prognosis and survival improvements, and asked about their willingness to pay (WTP) for the new treatments. Predicted WTP increased with survival benefits and respondents' income, regardless of how survival benefits were described. Framing therapeutic benefits as improvements in landmark rather than median time survival increased the proportion of the population willing to pay for that gain by 11-35%, and the mean WTP amount by 42-72% in the scenarios we compared. How survival benefits are described may influence the value people place on cancer care.
Survival in Very Preterm Infants: An International Comparison of 10 National Neonatal Networks.
Helenius, Kjell; Sjörs, Gunnar; Shah, Prakesh S; Modi, Neena; Reichman, Brian; Morisaki, Naho; Kusuda, Satoshi; Lui, Kei; Darlow, Brian A; Bassler, Dirk; Håkansson, Stellan; Adams, Mark; Vento, Maximo; Rusconi, Franca; Isayama, Tetsuya; Lee, Shoo K; Lehtonen, Liisa
2017-12-01
To compare survival rates and age at death among very preterm infants in 10 national and regional neonatal networks. A cohort study of very preterm infants, born between 24 and 29 weeks' gestation and weighing <1500 g, admitted to participating neonatal units between 2007 and 2013 in the International Network for Evaluating Outcomes of Neonates. Survival was compared by using standardized ratios (SRs), which relate survival in each network to the survival estimate of the whole population. Network populations differed with respect to rates of cesarean birth, exposure to antenatal steroids and birth in nontertiary hospitals. Network SRs for survival were highest in Japan (SR: 1.10; 99% confidence interval: 1.08-1.13) and lowest in Spain (SR: 0.88; 99% confidence interval: 0.85-0.90). Overall survival ranged from 78% to 93% among networks, the difference being greatest at 24 weeks' gestation (range 35%-84%). Survival rates increased and differences between networks diminished with increasing gestational age (GA) (range 92%-98% at 29 weeks' gestation); yet, relative differences in survival followed a similar pattern at all GAs. The median age at death varied from 4 days to 13 days across networks. The network ranking of survival rates for very preterm infants remained largely unchanged as GA increased; however, survival rates showed marked variations at lower GAs. The median age at death also varied among networks. These findings warrant further assessment of the representativeness of the study populations, organization of perinatal services, national guidelines, philosophy of care at extreme GAs, and resources used for decision-making. Copyright © 2017 by the American Academy of Pediatrics.
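A small worked sketch of a standardized survival ratio (SR) of the kind used above: observed survivors in one network divided by the survivors expected if the whole-population, gestational-age-specific rates applied. The strata, counts, and the Poisson-style confidence interval are illustrative assumptions; the study's exact standardization and CI method may differ.

```python
# Hedged sketch: a standardized survival ratio for one network, using
# hypothetical GA strata, admissions, and overall-population survival rates.
import numpy as np

admissions   = np.array([120, 300, 450, 600, 700, 800])     # 24..29 weeks
overall_rate = np.array([0.50, 0.70, 0.85, 0.92, 0.95, 0.97])
observed_survivors = 2610

expected = np.sum(admissions * overall_rate)
sr = observed_survivors / expected
# Crude 99% CI treating the observed count as approximately Poisson;
# the published analysis may use a different interval method.
se_log = 1 / np.sqrt(observed_survivors)
lo, hi = np.exp(np.log(sr) + np.array([-2.576, 2.576]) * se_log)
print(f"SR = {sr:.2f} (99% CI {lo:.2f}-{hi:.2f})")
```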
Salmen, Marcus; Ewy, Gordon A; Sasson, Comilla
2012-01-01
Objective To determine whether the use of cardiocerebral resuscitation (CCR) or AHA/ERC 2005 Resuscitation Guidelines improved patient outcomes from out-of-hospital cardiac arrest (OHCA) compared to older guidelines. Design Systematic review and meta-analysis. Data sources MEDLINE, EMBASE, Web of Science and the Cochrane Library databases. We also hand-searched study references and consulted experts. Study selection Design: randomised controlled trials and observational studies. Population OHCA patients, age >17 years. Comparators ‘Control’ protocol versus ‘Study’ protocol. ‘Control’ protocol defined as AHA/ERC 2000 Guidelines for cardiopulmonary resuscitation (CPR). ‘Study’ protocol defined as AHA/ERC 2005 Guidelines for CPR, or a CCR protocol. Outcome Survival to hospital discharge. Quality High-quality or medium-quality studies, as measured by the Newcastle-Ottawa Scale using predefined categories. Results Twelve observational studies met inclusion criteria. All three studies using CCR demonstrated significantly improved survival compared to use of AHA 2000 Guidelines, as did five of the nine studies using AHA/ERC 2005 Guidelines. Pooled data demonstrate that use of a CCR protocol has an unadjusted OR of 2.26 (95% CI 1.64 to 3.12) for survival to hospital discharge among all cardiac arrest patients. Among witnessed ventricular fibrillation/ventricular tachycardia (VF/VT) patients, CCR increased survival by an OR of 2.98 (95% CI 1.92 to 4.62). Studies using AHA/ERC 2005 Guidelines showed an overall trend towards increased survival, but significant heterogeneity existed among these studies. Conclusions We demonstrate an association with improved survival from OHCA when CCR protocols or AHA/ERC 2005 Guidelines are compared to use of older guidelines. In the subgroup of patients with witnessed VF/VT, there was a threefold increase in OHCA survival when CCR was used. CCR appears to be a promising resuscitation protocol for Emergency Medical Services providers in increasing survival from OHCA. Future research will need to be conducted to directly compare AHA/ERC 2010 Guidelines with the CCR approach. PMID:23036985
Population-based survival-cure analysis of ER-negative breast cancer.
Huang, Lan; Johnson, Karen A; Mariotto, Angela B; Dignam, James J; Feuer, Eric J
2010-08-01
This study investigated the trends over time in age- and stage-specific population-based survival of estrogen receptor negative (ER-) breast cancer patients by examining the fraction of cured patients and the median survival time for uncured patients. Cause-specific survival data from the Surveillance, Epidemiology, and End Results program for cases diagnosed during 1992-1998 were used in mixed survival cure models to evaluate the cure fraction and the extension in survival for uncured patients. Survival trends were compared with adjuvant chemotherapy data available from an overlapping patterns-of-care study. For stage II N+ disease, the largest increase in cure fraction was from 44% to 60% (P = 0.0257) for women aged ≥70, in contrast to a 7-8 percentage point increase for women aged <50 or 50-69 (P = 0.056 and 0.038, respectively). For women with stage III disease, the increases in the cure fraction were not statistically significant, although women aged 50-69 had a 10 percentage point increase (P = 0.103). Increases in cure fraction correspond with increases in the use of adjuvant chemotherapy, particularly for the oldest age group. In this article, for the first time, we estimate the cure fraction for ER- patients. We note that at ages ≥70, the accelerated increase in cure fraction from 1992 to 1998 for women with stage II N+ compared with stage III suggests a selective benefit of chemotherapy in the lower stage group.
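The "mixed survival cure models" named above combine a cured fraction with a survival distribution for uncured patients. The toy sketch below fits the simplest such mixture, S(t) = pi + (1 - pi)·exp(-lambda·t), by maximum likelihood on simulated data; it is not the authors' covariate-adjusted SEER implementation.

```python
# Hedged sketch: a minimal mixture cure model fit by maximum likelihood.
# Cured patients never experience the event; uncured patients have an
# exponential event-time distribution. Simulated data only.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 500
cured = rng.random(n) < 0.45                   # true cure fraction 45%
event_time = rng.exponential(3.0, n)           # years, for the uncured
follow_up = rng.uniform(1, 10, n)
time = np.where(cured, follow_up, np.minimum(event_time, follow_up))
event = (~cured) & (event_time <= follow_up)   # 1 = cancer death observed

def neg_loglik(params):
    logit_pi, log_lam = params
    pi, lam = 1 / (1 + np.exp(-logit_pi)), np.exp(log_lam)
    surv = pi + (1 - pi) * np.exp(-lam * time)          # S(t)
    dens = (1 - pi) * lam * np.exp(-lam * time)         # f(t) for events
    return -np.sum(np.where(event, np.log(dens), np.log(surv)))

res = minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
pi_hat = 1 / (1 + np.exp(-res.x[0]))
median_uncured = np.log(2) / np.exp(res.x[1])
print(f"estimated cure fraction = {pi_hat:.2f}, "
      f"median survival of uncured = {median_uncured:.2f} years")
```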
Lloyd, Penn; Martin, Thomas E.
2016-01-01
Slow life histories are characterized by high adult survival and few offspring, which are thought to allow increased investment per offspring to increase juvenile survival. Consistent with this pattern, south temperate zone birds are commonly longer-lived and have fewer young than north temperate zone species. However, comparative analyses of juvenile survival, including during the first few weeks of the post-fledging period when most juvenile mortality occurs, are largely lacking. We combined our measurements of fledgling survival for eight passerines in South Africa with estimates from published studies of 57 north and south temperate zone songbird species to test three predictions: (1) fledgling survival increases with length of development time in the nest; (2) fledgling survival increases with adult survival and reduced brood size controlled for development time; and (3) south temperate zone species, with their higher adult survival and smaller brood sizes, exhibit higher fledgling survival than north temperate zone species controlled for development time. We found that fledgling survival was higher among south temperate zone species and generally increased with development time and adult survival within and between latitudinal regions. Clutch size did not explain additional variation, but was confounded with adult survival. Given the importance of age-specific mortality to life history evolution, understanding the causes of these geographical patterns of mortality is important.
Improved survival among older acute myeloid leukemia patients - a population-based study.
Shah, Binay Kumar; Ghimire, Krishna Bilas
2014-07-01
Survival in acute myeloid leukemia (AML) has improved in younger patients over the last decade. This study was conducted to evaluate the relative survival rates in older AML patients over two decades in the US. We analyzed the Surveillance, Epidemiology, and End Results (SEER) registry database to evaluate relative survival (RS) rates in the older (≥ 75 years) AML population diagnosed during 1992-2009. We selected AML patients from 13 registries of the SEER 18 database to compare RS during 1992-2000 and 2001-2009. The relative survival rates improved significantly during 2001-2009 compared to 1992-2000 for all age groups and both sexes. For young elderly patients (75-84 years), RS increased from 13.1 ± 0.8% to 17.4 ± 0.9% at one year (Z-value = 3.98, p < 0.0001) and from 2.0 ± 0.4% to 2.6 ± 0.5% at five years (Z-value = 3.61, p < 0.0005). Similarly, for very elderly (≥ 85 years) patients, RS increased from 5.3 ± 1.0% to 8.0 ± 1.0% at one year (Z-value = 3.03, p < 0.005), but no improvement was seen at five years. Relative survival in elderly AML patients increased significantly during 2001-2009 compared to 1992-2000.
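The Z-values reported above are consistent with a two-sample comparison of independent relative-survival estimates. The sketch below shows the generic calculation; treating the ± values as standard errors is an assumption and need not reproduce the published statistics exactly.

```python
# Hedged sketch: a generic two-sample Z-test for comparing two independent
# relative-survival estimates (RS1 +/- SE1 vs RS2 +/- SE2). Assuming the
# quoted +/- values are standard errors may not match the authors' exact
# method; the numbers are used for illustration only.
from scipy.stats import norm

rs1, se1 = 13.1, 0.8   # % relative survival, earlier period (assumed SE)
rs2, se2 = 17.4, 0.9   # % relative survival, later period (assumed SE)

z = (rs2 - rs1) / (se1**2 + se2**2) ** 0.5
p = 2 * (1 - norm.cdf(abs(z)))
print(f"Z = {z:.2f}, two-sided p = {p:.4f}")
```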
Dietary Pectin Increases Intestinal Crypt Stem Cell Survival following Radiation Injury.
Sureban, Sripathi M; May, Randal; Qu, Dongfeng; Chandrakesan, Parthasarathy; Weygant, Nathaniel; Ali, Naushad; Lightfoot, Stan A; Ding, Kai; Umar, Shahid; Schlosser, Michael J; Houchen, Courtney W
2015-01-01
Gastrointestinal (GI) mucosal damage is a devastating adverse effect of radiation therapy. We have recently reported that expression of Dclk1, a Tuft cell and tumor stem cell (TSC) marker, 24 h after high-dose total-body gamma-IR (TBI) can be used as a surrogate marker for crypt survival. Dietary pectin has been demonstrated to possess chemopreventive properties, whereas its radioprotective property has not been studied. The aim of this study was to determine the effects of dietary pectin on ionizing radiation (IR)-induced intestinal stem cell (ISC) deletion, crypt and overall survival following lethal TBI. C57BL/6 mice received a 6% pectin diet and 0.5% pectin drinking water (pre-IR mice received pectin one week before TBI until death; post-IR mice received pectin after TBI until death). Animals were exposed to TBI (14 Gy) and euthanized at 24 and 84 h post-IR to assess ISC deletion and crypt survival respectively. Animals were also subjected to overall survival studies following TBI. In the pre-IR treatment group, we observed a three-fold increase in ISC/crypt survival, a two-fold increase in Dclk1+ stem cells, increased overall survival (median 10 d vs. 7 d), and increased expression of Dclk1, Msi1, Lgr5, Bmi1, and Notch1 (in small intestine) post-TBI in pectin treated mice compared to controls. We also observed increased survival of mice treated with pectin (post-IR) compared to controls. Dietary pectin is a radioprotective agent; prevents IR-induced deletion of potential reserve ISCs; facilitates crypt regeneration; and ultimately promotes overall survival. Given the anti-cancer activity of pectin, our data support a potential role for dietary pectin as an agent that can be administered to patients receiving radiation therapy to protect against radiation-induced mucositis.
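For the overall-survival comparison summarized above (median 10 d vs. 7 d), a log-rank test is the usual approach; the sketch below uses lifelines with hypothetical survival times, not the study's animal data.

```python
# Hedged sketch: a log-rank comparison of overall survival between
# pectin-fed and control mice after TBI. Survival times are hypothetical.
from lifelines.statistics import logrank_test

pectin_days  = [8, 9, 10, 10, 11, 12, 13, 14, 15, 16]
control_days = [5, 6, 6, 7, 7, 7, 8, 8, 9, 10]
pectin_event  = [1] * 10   # 1 = death observed
control_event = [1] * 10

result = logrank_test(pectin_days, control_days,
                      event_observed_A=pectin_event,
                      event_observed_B=control_event)
print(result.test_statistic, result.p_value)
```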
DOE Office of Scientific and Technical Information (OSTI.GOV)
Madonna, G.S.; Ledney, G.D.; Moore, M.M.
Compromise of antimicrobial defenses by irradiation can result in sepsis and death. Additional trauma can further predispose patients to infection and thus increase mortality. We recently showed that injection of synthetic trehalose dicorynomycolate (S-TDCM) significantly augments resistance to infection and increases survival of mice compromised either by whole-body irradiation with gamma radiation or equal mixtures of fission neutron and gamma radiation. In this study, C3H/HeN mice were given a lethal dose of gamma radiation (8.0 Gy) and an open wound (15% of total body surface area, TBSA) 1 hr later while anesthetized. Irradiated/wounded mice became more severely leukopenic and thrombocytopenic than mice exposed to radiation alone, and died from natural wound infection and sepsis within 7 days. S-TDCM given 1 hr postirradiation increased survival of mice exposed to radiation alone. However, this treatment did not increase survival of the irradiated/wounded mice. Systemic antibiotic therapy with gentamicin or ofloxacin for 10 days significantly increased survival time compared with untreated irradiated/wounded mice (p < 0.01). Combination therapy with topical gentamicin cream and systemic oxacillin increased survival from 0% to 100%. Treatment with S-TDCM combined with the suboptimal treatment of topical and systemic gentamicin increased survival compared with antibiotic treatment alone. These studies demonstrate that post-trauma therapy with S-TDCM and antibiotics augments resistance to infection in immunocompromised mice. The data suggest that therapies which combine stimulation of nonspecific host defense mechanisms with antibiotics may increase survival of irradiated patients subjected to accidental or surgical trauma.
The Impact of Race on Survival After Hepatocellular Carcinoma in a Diverse American Population.
Jones, Patricia D; Diaz, Carlos; Wang, Danlu; Gonzalez-Diaz, Joselin; Martin, Paul; Kobetz, Erin
2018-02-01
Hepatocellular carcinoma (HCC) incidence is increasing at differential rates depending on race. We aimed to identify associations between race and survival after HCC diagnosis in a diverse American population. Using the cancer registry from Sylvester Comprehensive Cancer Center, University of Miami and Jackson Memorial Hospitals, we performed a retrospective analysis of 999 patients diagnosed with HCC between 9/24/2004 and 12/19/2014. We identified clinical characteristics by reviewing available electronic medical records. The association between race and survival was analyzed using Cox proportional hazards regression. Median survival in days was 425 in Blacks, 904.5 in non-Hispanic Whites, 652 in Hispanics, 570 in Asians, and 928 in others, p < 0.01. Blacks and Asians presented at more advanced stages with larger tumors. Although Whites had increased severity of liver disease at diagnosis compared to other races, they had a 36% lower rate of death compared to Blacks [hazard ratio (HR) 0.64, 95% confidence interval (CI) 0.51-0.80, p < 0.01]. After adjusting for significant covariates, Whites had a 22% reduced risk of death compared to Blacks (HR 0.78, 95% CI 0.61-0.99, p = 0.04). Transplant significantly reduced the rate of death; however, only 13.3% of Blacks had a liver transplant, compared to 40.1% of Whites, p < 0.01. In this diverse sample of patients, survival among Blacks is the shortest after HCC diagnosis. Survival differences reflect a more advanced tumor stage at presentation rather than severity of underlying liver disease precluding treatment. Improving survival in minority populations, in whom HCC incidence is rapidly increasing, requires identification and modification of factors contributing to late-stage presentation.
Epperly, Michael W.; Wang, Hong; Jones, Jeffrey A.; Dixon, Tracy; Montesinos, Carlos A.; Greenberger, Joel S.
2011-01-01
Many acute and chronic effects of ionizing radiation are mediated by reactive oxygen species and reactive nitrogen species, which deplete antioxidant stores, leading to cellular apoptosis, stem cell depletion and accelerated aging. C57BL/6NHsd mice receiving intravenous MnSOD-PL prior to 9.5 Gy total-body irradiation (TBI) show increased survival from the acute hematopoietic syndrome, and males demonstrated improved long-term survival (Epperly et al., Radiat. Res. 170, 437–444, 2008). We evaluated the effect of an antioxidant-chemopreventive diet compared to a regular diet on long-term survival in female mice. Twenty-four hours before the LD50/30 dose of 9.5 Gy TBI, subgroups of mice were injected intravenously with MnSOD-PL (100 μg plasmid DNA in 100 μl of liposomes). Mice on either diet treated with MnSOD-PL showed decreased death after irradiation compared to irradiated mice on the house diet alone (P = 0.031 for the house diet plus MnSOD-PL or 0.015 for antioxidant diet plus MnSOD-PL). The mice on the antioxidant-chemoprevention diet alone or with MnSOD-PL that survived 30 days after irradiation had a significant increase in survival compared to mice on the regular diet (P = 0.04 or 0.01, respectively). In addition, mice treated with MnSOD-PL only and surviving 30 days after radiation also had increased survival compared to those on the regular diet alone (P = 0.02). Survivors of acute ionizing radiation damage have ameliorated life shortening if they are fed an antioxidant-chemopreventive diet. PMID:21466381
Kidney Transplant Outcomes in the Super Obese: A National Study From the UNOS Dataset.
Kanthawar, Pooja; Mei, Xiaonan; Daily, Michael F; Chandarana, Jyotin; Shah, Malay; Berger, Jonathan; Castellanos, Ana Lia; Marti, Francesc; Gedaly, Roberto
2016-11-01
We evaluated outcomes of super-obese patients (BMI > 50) undergoing kidney transplantation in the US. We performed a review of 190 super-obese patients undergoing kidney transplantation from 1988 through 2013 using the UNOS dataset. Super-obese patients had a mean age of 45.7 years (21-75 years) and 111 (58.4%) were female. The mean BMI of the super-obese group was 56 (range 50.0-74.2). A subgroup analysis demonstrated that patients with BMI > 50 had worse survival compared to any other BMI class. The 30-day perioperative mortality and length of stay were 3.7% and 10.09 days, compared with 0.8% and 7.34 days in the non-super-obese group. On multivariable analysis, BMI > 50 was an independent predictor of 30-day mortality, with a 4.6-fold increased risk of perioperative death. BMI > 50 increased the risk of delayed graft function and the length of stay by twofold. The multivariable analysis of survival showed a 78% increased risk of death in this group. Overall patient survival for super-obese transplant recipients at 1, 3, and 5 years was 88%, 82%, and 76%, compared with 96%, 91%, and 86% in patients transplanted with BMI < 50. A propensity score-adjusted analysis further demonstrated significantly worse survival rates in super-obese patients undergoing kidney transplantation. Super-obese patients had a prolonged length of stay and worse DGF rates. Perioperative mortality was increased 4.6-fold compared to patients with BMI < 50. In a subgroup analysis, super-obese patients who underwent kidney transplantation had significantly worse graft and patient survival compared to underweight, normal weight, and obesity class I, II, and III (BMI 40-50) patients.
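As a sketch of the propensity-score-adjusted analysis named above, the code below estimates the probability of being super-obese from baseline covariates with logistic regression and forms stabilized inverse-probability weights. The covariates and simulated data are placeholders, not the UNOS dataset, and the study's exact adjustment (e.g., matching rather than weighting) may differ.

```python
# Hedged sketch of a propensity-score adjustment: model P(BMI > 50) from
# covariates, then build stabilized inverse-probability-of-treatment weights.
# Simulated placeholder data and covariates only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 1000
df = pd.DataFrame({
    "age": rng.normal(48, 12, n),
    "diabetes": rng.integers(0, 2, n),
})
logit_true = -3 + 0.02 * df["age"] + 0.8 * df["diabetes"]
df["super_obese"] = rng.random(n) < 1 / (1 + np.exp(-logit_true))

X = sm.add_constant(df[["age", "diabetes"]])
ps = sm.Logit(df["super_obese"].astype(int), X).fit(disp=False).predict(X)

p_treat = df["super_obese"].mean()
df["iptw"] = np.where(df["super_obese"], p_treat / ps, (1 - p_treat) / (1 - ps))
print(df.groupby("super_obese")["iptw"].describe())
# These weights would then enter a weighted outcome model (e.g., weighted Cox).
```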
Beker, Mustafa Caglar; Caglayan, Berrak; Yalcin, Esra; Caglayan, Ahmet Burak; Turkseven, Seyma; Gurel, Busra; Kelestemur, Taha; Sertel, Elif; Sahin, Zafer; Kutlu, Selim; Kilic, Ulkan; Baykal, Ahmet Tarik; Kilic, Ertugrul
2018-03-01
Occurrence of stroke cases displays a time-of-day variation in humans. However, the mechanism linking circadian rhythm to the internal response mechanisms against pathophysiological events after ischemic stroke remains largely unknown. To this end, temporal changes in the susceptibility to ischemia/reperfusion (I/R) injury were investigated in mice in which ischemic stroke was induced at four different Zeitgeber time points with 6-h intervals (ZT0, ZT6, ZT12, and ZT18). Besides infarct volume and brain swelling, neuronal survival, apoptosis, ischemia, and circadian rhythm-related proteins were examined using immunohistochemistry, Western blot, planar surface immune assay, and liquid chromatography-mass spectrometry tools. Here, we present evidence that midnight (ZT18; 24:00) I/R injury in mice resulted in significantly reduced infarct volume, brain swelling, and neurological deficit scores, improved neuronal survival, and decreased apoptotic cell death compared with ischemia induced at other time points, which were associated with increased expression of the circadian proteins Bmal1, Per1, and Clock and the survival kinases AKT and Erk-1/2. Moreover, ribosomal protein S6, mTOR, and Bad were also significantly increased, while the levels of PRAS40, a negative regulator of AKT and mTOR, and phosphorylated p53 were decreased at this time point compared to ZT0 (06:00). Furthermore, detailed proteomic analysis revealed significantly decreased CSKP, HBB-1/2, and HBA levels, and increased GNAZ, NEGR1, IMPCT, and PDE1B at midnight as compared with early morning. Our results indicate that nighttime I/R injury results in less severe neuronal damage, with increased neuronal survival and increased levels of survival kinases and circadian clock proteins, and also alters circadian-related proteins.
Brenner, Hermann; Jansen, Lina
2016-02-01
Monitoring cancer survival is a key task of cancer registries, but timely disclosure of progress in long-term survival remains a challenge. We introduce and evaluate a novel method, denoted "boomerang method," for deriving more up-to-date estimates of long-term survival. We applied three established methods (cohort, complete, and period analysis) and the boomerang method to derive up-to-date 10-year relative survival of patients diagnosed with common solid cancers and hematological malignancies in the United States. Using the Surveillance, Epidemiology and End Results 9 database, we compared the most up-to-date age-specific estimates that might have been obtained with the database including patients diagnosed up to 2001 with 10-year survival later observed for patients diagnosed in 1997-2001. For cancers with little or no increase in survival over time, the various estimates of 10-year relative survival potentially available by the end of 2001 were generally rather similar. For malignancies with strongly increasing survival over time, including breast and prostate cancer and all hematological malignancies, the boomerang method provided estimates that were closest to later observed 10-year relative survival in 23 of the 34 groups assessed. The boomerang method can substantially improve up-to-dateness of long-term cancer survival estimates in times of ongoing improvement in prognosis. Copyright © 2016 Elsevier Inc. All rights reserved.
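For context on the established approaches mentioned above, the sketch below illustrates ordinary period analysis: long-term relative survival assembled as a product of conditional interval-specific survival probabilities taken from the most recent calendar period. The numbers are invented, and the boomerang method itself is not reproduced here.

```python
# Hedged sketch of period analysis, one of the three established methods
# named above: 10-year relative survival is built as the product of
# conditional annual relative-survival probabilities estimated only from
# person-time observed in the latest calendar period. Hypothetical values.
import numpy as np

conditional_rs = np.array([0.93, 0.97, 0.975, 0.98, 0.985,
                           0.988, 0.99, 0.991, 0.992, 0.993])

ten_year_rs = np.prod(conditional_rs)
print(f"period-analysis 10-year relative survival ~ {ten_year_rs:.1%}")
```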
Pjetursson, Bjarni E; Asgeirsson, Asgeir G; Zwahlen, Marcel; Sailer, Irena
2014-01-01
The objective of this systematic review was to assess and compare the survival and complication rates of implant-supported prostheses reported in studies published in the year 2000 and before, to those reported in studies published after the year 2000. Three electronic searches complemented by manual searching were conducted to identify 139 prospective and retrospective studies on implant-supported prostheses. The included studies were divided into two groups: a group of 31 older studies published in the year 2000 or before, and a group of 108 newer studies published after the year 2000. Survival and complication rates were calculated using Poisson regression models, and multivariable robust Poisson regression was used to formally compare the outcomes of older and newer studies. The 5-year survival rate of implant-supported prostheses was significantly increased in newer studies compared with older studies. The overall survival rate increased from 93.5% to 97.1%. The survival rate for cemented prostheses increased from 95.2% to 97.9%; for screw-retained reconstruction, from 77.6% to 96.8%; for implant-supported single crowns, from 92.6% to 97.2%; and for implant-supported fixed dental prostheses (FDPs), from 93.5% to 96.4%. The incidence of esthetic complications decreased in more recent studies compared with older ones, but the incidence of biologic complications was similar. The results for technical complications were inconsistent. There was a significant reduction in abutment or screw loosening for implant-supported FDPs. On the other hand, the total number of technical complications and the incidence of fracture of the veneering material were significantly increased in the newer studies. The increased rate of complications may partly be explained by minor complications being reported in more detail in the newer publications. The results of the present systematic review demonstrated a positive learning curve in implant dentistry, represented in higher survival rates and lower complication rates reported in more recent clinical studies. The incidence of esthetic, biologic, and technical complications, however, is still high. Hence, it is important to identify these complications and their etiology to make implant treatment even more predictable in the future.
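The review's survival and complication rates were estimated with Poisson regression on event counts and exposure time. The hedged sketch below shows one way such rate-based estimates translate into 5-year survival proportions, assuming a constant event rate so that survival = exp(-5 × rate); the per-study failure counts and exposure years are invented.

```python
# Hedged sketch: estimate an annual prosthesis failure rate with Poisson
# regression (event counts with exposure time) and convert it to a 5-year
# survival proportion. Made-up study-level counts and exposures.
import numpy as np
import pandas as pd
import statsmodels.api as sm

studies = pd.DataFrame({
    "failures":         [2, 3, 6, 4, 9],
    "prosthesis_years": [400, 650, 300, 900, 250],
    "newer_study":      [1, 1, 0, 1, 0],   # published after 2000?
})

X = sm.add_constant(studies[["newer_study"]])
model = sm.GLM(studies["failures"], X,
               family=sm.families.Poisson(),
               exposure=studies["prosthesis_years"]).fit()
rate_old = np.exp(model.params["const"])
rate_new = np.exp(model.params["const"] + model.params["newer_study"])
print(f"5-year survival, older studies: {np.exp(-5 * rate_old):.1%}")
print(f"5-year survival, newer studies: {np.exp(-5 * rate_new):.1%}")
```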
Weber, Arthur J; Viswanáthan, Suresh; Ramanathan, Chidambaram; Harman, Christine D
2010-01-01
To determine whether application of BDNF to the eye and brain provides a greater level of neuroprotection after optic nerve injury than treatment of the eye alone. Retinal ganglion cell survival and pattern electroretinographic responses were compared in normal cat eyes and in eyes that received (1) a mild nerve crush and no treatment, (2) a single intravitreal injection of BDNF at the time of the nerve injury, or (3) intravitreal treatment combined with 1 to 2 weeks of continuous delivery of BDNF to the visual cortex, bilaterally. Relative to no treatment, administration of BDNF to the eye alone resulted in a significant increase in ganglion cell survival at both 1 and 2 weeks after nerve crush (1 week, 79% vs. 55%; 2 weeks, 60% vs. 31%). Combined treatment of the eye and visual cortex resulted in a modest additional increase (17%) in ganglion cell survival in the 1-week eyes, a further significant increase (55%) in the 2-week eyes, and ganglion cell survival levels for both that were comparable to normal (92%-93% survival). Pattern ERG responses for all the treated eyes were comparable to normal at 1 week after injury; however, at 2 weeks, only the responses of eyes receiving the combined BDNF treatment remained so. Although treatment of the eye alone with BDNF has a significant impact on ganglion cell survival after optic nerve injury, combined treatment of the eye and brain may represent an even more effective approach and should be considered in the development of future optic neuropathy-related neuroprotection strategies.
Østgård, Lene Sofie Granfeldt; Nørgaard, Mette; Medeiros, Bruno C; Friis, Lone Smidstrup; Schoellkopf, Claudia; Severinsen, Marianne Tang; Marcher, Claus Werenberg; Nørgaard, Jan Maxwell
2017-11-10
Purpose Previous US studies have shown that socioeconomic status (SES) affects survival in acute myeloid leukemia (AML). However, no large study has investigated the association between education or income and clinical characteristics, treatment, and outcome in AML. Methods To investigate the effects of education and income in a tax-supported health care system, we conducted a population-based study using individual-level SES and clinical data on all Danish patients with AML (2000 to 2014). We compared treatment intensity, allogeneic transplantation, and response rates by education and income level using logistic regression (odds ratios). We used Cox regression (hazard ratios [HRs]) to compare survival, adjusting for age, sex, SES, and clinical prognostic markers. Results Of 2,992 patients, 1,588 (53.1%) received intensive chemotherapy. Compared with low-education patients, highly educated patients more often received allogeneic transplantation (16.3% v 8.7%). In intensively treated patients younger than 60 years of age, increased mortality was observed in those with lower and medium education (1-year survival, 66.7%; adjusted HR, 1.47; 95% CI, 1.11 to 1.93; and 1-year survival, 67.6%; adjusted HR, 1.55; CI, 1.21 to 1.98, respectively) compared with higher education (1-year survival, 76.9%). Over the study period, 5-year survival improvements were limited to high-education patients (from 39% to 58%), increasing the survival gap between groups. In older patients, low-education patients received less intensive therapy (30% v 48%; adjusted odds ratio, 0.65; CI, 0.44 to 0.98) compared with high-education patients; however, remission rates and survival were not affected in those intensively treated. Income was not associated with therapy intensity, likelihood of complete remission, or survival (high income: adjusted HR, 1.0; medium income: adjusted HR, 0.96; 95% CI, 0.82 to 1.12; low income: adjusted HR, 1.06; CI, .88 to 1.27). Conclusion In a universal health care system, education level, but not income, affects transplantation rates and survival in younger patients with AML. Importantly, recent survival improvement has exclusively benefitted highly educated patients.
Contribution of surgical specialization to improved colorectal cancer survival.
Oliphant, R; Nicholson, G A; Horgan, P G; Molloy, R G; McMillan, D C; Morrison, D S
2013-09-01
Reorganization of colorectal cancer services has led to surgery being increasingly, but not exclusively, delivered by specialist surgeons. Outcomes from colorectal cancer surgery have improved, but the exact determinants remain unclear. This study explored the determinants of outcome after colorectal cancer surgery over time. Postoperative mortality (within 30 days of surgery) and 5-year relative survival rates for patients in the West of Scotland undergoing surgery for colorectal cancer between 1991 and 1994 were compared with rates for those having surgery between 2001 and 2004. The 1823 patients who had surgery in 2001-2004 were more likely to have had stage I or III tumours, and to have undergone surgery with curative intent than the 1715 patients operated on in 1991-1994. The proportion of patients presenting electively who received surgery by a specialist surgeon increased over time (from 14·9 to 72·8 per cent; P < 0·001). Postoperative mortality increased among patients treated by non-specialists over time (from 7·4 to 10·3 per cent; P = 0·026). Non-specialist surgery was associated with an increased risk of postoperative death (adjusted odds ratio 1·72, 95 per cent confidence interval (c.i.) 1·17 to 2·55; P = 0·006) compared with specialist surgery. The 5-year relative survival rate increased over time and was higher among those treated by specialist compared with non-specialist surgeons (62·1 versus 53·0 per cent; P < 0·001). Compared with the earlier period, the adjusted relative excess risk ratio for the later period was 0·69 (95 per cent c.i. 0·61 to 0·79; P < 0·001). Increased surgical specialization accounted for 18·9 per cent of the observed survival improvement. Increased surgical specialization contributed significantly to the observed improvement in longer-term survival following colorectal cancer surgery. © 2013 British Journal of Surgery Society Ltd. Published by John Wiley & Sons Ltd.
Siskind, Eric; Maloney, Caroline; Akerman, Meredith; Alex, Asha; Ashburn, Sarah; Barlow, Meade; Siskind, Tamar; Bhaskaran, Madhu; Ali, Nicole; Basu, Amit; Molmenti, Ernesto; Ortiz, Jorge
2014-09-01
Previously, increasing age has been a part of the exclusion criteria used when determining eligibility for a pancreas transplant. However, the analysis of pancreas transplantation outcomes based on age groupings has largely been based on single-center reports. A UNOS database review of all adult pancreas and kidney-pancreas transplants between 1996 and 2012 was performed. Patients were divided into groups based on age categories: 18-29 (n = 1823), 30-39 (n = 7624), 40-49 (n = 7967), 50-59 (n = 3160), and ≥60 (n = 280). We compared survival outcomes and demographic variables between each age grouping. Of the 20 854 pancreas transplants, 3440 of the recipients were 50 yr of age or above. Graft survival was consistently the greatest in adults 40-49 yr of age. Graft survival was least in adults age 18-29 at one-, three-, and five-yr intervals. At 10- and 15-yr intervals, graft survival was the poorest in adults >60 yr old. Patient survival and age were found to be inversely proportional; as the patient population's age increased, survival decreased. Pancreas transplants performed in patients of increasing age demonstrate decreased patient and graft survival when compared to pancreas transplants in patients <50 yr of age. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Otgaar, Henry; Smeets, Tom; van Bergen, Saskia
2010-01-01
Recent studies have shown that processing words according to a survival scenario leads to superior retention relative to control conditions. Here, we examined whether a survival recall advantage could be elicited by using pictures. Furthermore, in Experiment 1, we were interested in whether survival processing also results in improved memory for details. Undergraduates rated the relevance of pictures in a survival, moving, or pleasantness scenario and were subsequently given a surprise free recall test. We found that survival processing yielded superior retention. We also found that distortions occurred more often in the survival condition than in the pleasantness condition. In Experiment 2, we directly compared the survival recall effect between pictures and words. A comparable survival recall advantage was found for pictures and words. The present findings support the idea that memory is enhanced by processing information in terms of fitness value, yet at the same time, the present results suggest that this may increase the risk for memory distortions.
Pulte, Dianne; Castro, Felipe A; Jansen, Lina; Luttmann, Sabine; Holleczek, Bernd; Nennecke, Alice; Ressing, Meike; Katalinic, Alexander; Brenner, Hermann
2016-03-22
Recent population-based studies in the United States of America (USA) and other countries have shown improvements in survival for patients with chronic lymphocytic leukemia (CLL) diagnosed in the early twenty-first century. Here, we examine the survival for patients diagnosed with CLL in Germany in 1997-2011. Data were extracted from 12 cancer registries in Germany and compared to the data from the USA. Period analysis was used to estimate 5- and 10-year relative survival (RS). Five- and 10-year RS estimates in 2009-2011 of 80.2% and 59.5%, respectively, in Germany and 82.4% and 64.7%, respectively, in the USA were observed. Overall, 5-year RS increased significantly in Germany, and the difference compared with survival in the USA slightly decreased between 2003-2005 and 2009-2011. However, age-specific analyses showed persistently higher survival in the USA for all ages except 15-44. In general, survival decreased with age, but the age-related disparity was small for patients younger than 75. In both countries, 5-year RS was >80% for patients less than 75 years of age but <70% for those age 75+. Overall, 5-year survival for patients with CLL is good, but 10-year survival is significantly lower, and survival was much lower for those age 75+. Major differences in survival between countries were not observed. Further research into ways to increase survival for older CLL patients is needed to reduce the persistent large age-related survival disparity.
Recent cancer survival in Germany: an analysis of common and less common cancers.
Jansen, Lina; Castro, Felipe A; Gondos, Adam; Krilaviciute, Agne; Barnes, Benjamin; Eberle, Andrea; Emrich, Katharina; Hentschel, Stefan; Holleczek, Bernd; Katalinic, Alexander; Brenner, Hermann
2015-06-01
The monitoring of cancer survival by population-based cancer registries is a prerequisite to evaluate the current quality of cancer care. Our study provides 1-, 5- and 10-year relative survival as well as 5-year relative survival conditional on 1-year survival estimates and recent survival trends for Germany using data from 11 population-based cancer registries, covering around one-third of the German population. Period analysis was used to estimate relative survival for 24 common and 11 less common cancer sites for the period 2007-2010. The German and the United States survival estimates were compared using the Surveillance, Epidemiology and End Results 13 database. Trends in cancer survival in Germany between 2002-2004 and 2008-2010 were described. Five-year relative survival increased in Germany from 2002-2004 to 2008-2010 for most cancer sites. Among the 24 most common cancers, largest improvements were seen for multiple myeloma (8.0% units), non-Hodgkin lymphoma (6.2% units), prostate cancer (5.2% units) and colorectal cancer (4.6% units). In 2007-2010, the survival disadvantage in Germany compared to the United States was largest for cancers of the mouth/pharynx (-11.0% units), thyroid (-6.8% units) and prostate (-7.5% units). Although survival estimates were much lower for elderly patients in both countries, differences in age patterns were observed for some cancer sites. The reported improvements in cancer survival might reflect advances in the quality of cancer care on the population level as well as increased use of screening in Germany. The survival differences across countries and the survival disadvantage in the elderly require further investigation. © 2014 UICC.
Powell, D.B.; Palm, R.C.; MacKenzie, A.P.; Winton, J.R.
2009-01-01
The effects of temperature, ionic strength, and new cryopreservatives derived from polar ice bacteria were investigated to help accelerate the development of economical, live attenuated vaccines for aquaculture. Extracts of the extremophile Gelidibacter algens functioned very well as part of a lyophilization cryoprotectant formulation in a 15-week storage trial. The bacterial extract and trehalose additives resulted in significantly higher colony counts of columnaris bacteria (Flavobacterium columnare) compared to nonfat milk or physiological saline at all time points measured. The bacterial extract combined with trehalose appeared to enhance the relative efficiency of recovery and growth potential of columnaris in flask culture compared to saline, nonfat milk, or trehalose-only controls. Pre-lyophilization temperature treatments significantly affected F. columnare survival following rehydration. A 30-min exposure at 0 °C resulted in a 10-fold increase in bacterial survival following rehydration compared to mid-range temperature treatments. The brief 30 and 35 °C pre-lyophilization exposures appeared to be detrimental to the rehydration survival of the bacteria. The survival of F. columnare through the lyophilization process was also strongly affected by changes in ionic strength of the bacterial suspension. Changes in rehydration constituents were also found to be important in promoting increased survival and growth. As the sodium chloride concentration increased, the viability of rehydrated F. columnare decreased. © 2009 Elsevier Inc.
2014-01-01
Background Lower breast cancer survival has been reported for Australian Aboriginal women compared to non-Aboriginal women; however, the reasons for this disparity have not been fully explored. We compared the surgical treatment and survival of Aboriginal and non-Aboriginal women diagnosed with breast cancer in New South Wales (NSW), Australia. Methods We analysed NSW cancer registry records of breast cancers diagnosed in 2001–2007, linked to hospital inpatient episodes and deaths. We used unconditional logistic regression to compare the odds of Aboriginal and non-Aboriginal women receiving surgical treatment. Breast cancer-specific survival was examined using cumulative mortality curves and Cox proportional hazards regression models. Results Of the 27 850 eligible women, 288 (1.03%) identified as Aboriginal. The Aboriginal women were younger and more likely to have advanced spread of disease when diagnosed than non-Aboriginal women. Aboriginal women were less likely than non-Aboriginal women to receive surgical treatment (odds ratio 0.59, 95% confidence interval (CI) 0.42-0.86). The five-year crude breast cancer-specific mortality was 6.1 percentage points higher for Aboriginal women (17.7%, 95% CI 12.9-23.2) than for non-Aboriginal women (11.6%, 95% CI 11.2-12.0). After accounting for differences in age at diagnosis, year of diagnosis, spread of disease and surgical treatment received, the risk of death from breast cancer was 39% higher in Aboriginal women (HR 1.39, 95% CI 1.01-1.86). Finally, after also accounting for differences in comorbidities, socioeconomic disadvantage and place of residence, the hazard ratio was reduced to 1.30 (95% CI 0.94-1.75). Conclusion Preventing comorbidities and increasing rates of surgical treatment may increase breast cancer survival for NSW Aboriginal women. PMID:24606675
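A minimal sketch of the unconditional logistic regression named above, estimating adjusted odds ratios for receipt of surgical treatment; the simulated data and covariates are placeholders, not the NSW registry records.

```python
# Hedged sketch: unconditional logistic regression for the odds of receiving
# surgical treatment by Aboriginal status, adjusted for a couple of covariates.
# Simulated placeholder data only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 2000
df = pd.DataFrame({
    "aboriginal": rng.random(n) < 0.01,
    "age": rng.normal(60, 12, n),
    "advanced_spread": rng.integers(0, 2, n),
})
logit = 2.0 - 0.5 * df["aboriginal"] - 0.01 * df["age"] - 0.8 * df["advanced_spread"]
df["surgery"] = rng.random(n) < 1 / (1 + np.exp(-logit))

X = sm.add_constant(df[["aboriginal", "age", "advanced_spread"]].astype(float))
fit = sm.Logit(df["surgery"].astype(int), X).fit(disp=False)
print(np.exp(fit.params))       # adjusted odds ratios
print(np.exp(fit.conf_int()))   # 95% CIs on the OR scale
```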
Ohayon, Delphine; De Chiara, Alessia; Chapuis, Nicolas; Candalh, Céline; Mocek, Julie; Ribeil, Jean-Antoine; Haddaoui, Lamya; Ifrah, Norbert; Hermine, Olivier; Bouillaud, Frédéric; Frachet, Philippe; Bouscary, Didier; Witko-Sarsat, Véronique
2016-10-19
Cytosolic proliferating cell nuclear antigen (PCNA), a scaffolding protein involved in DNA replication, has been described as a key element in survival of mature neutrophil granulocytes, which are non-proliferating cells. Herein, we demonstrated an active export of PCNA involved in cell survival and chemotherapy resistance. Notably, daunorubicin-resistant HL-60 cells (HL-60R) have a prominent cytosolic PCNA localization due to increased nuclear export compared to daunorubicin-sensitive HL-60 cells (HL-60S). By interacting with nicotinamide phosphoribosyltransferase (NAMPT), a protein involved in NAD biosynthesis, PCNA coordinates glycolysis and survival, especially in HL-60R cells. These cells showed a dramatic increase in intracellular NAD+ concentration as well as glycolysis including increased expression and activity of hexokinase 1 and increased lactate production. Furthermore, this functional activity of cytoplasmic PCNA was also demonstrated in patients with acute myeloid leukemia (AML). Our data uncover a novel pathway of nuclear export of PCNA that drives cell survival by increasing metabolism flux.
Hall, Matthew D; McGee, James L; McGee, Mackenzie C; Hall, Kevin A; Neils, David M; Klopfenstein, Jeffrey D; Elwood, Patrick W
2014-12-01
Stereotactic radiosurgery (SRS) alone is increasingly used in patients with newly diagnosed brain metastases. Stereotactic radiosurgery used together with whole-brain radiotherapy (WBRT) reduces intracranial failure rates, but this combination also causes greater neurocognitive toxicity and does not improve survival. Critics of SRS alone contend that deferring WBRT results in an increased need for salvage therapy and in higher costs. The authors compared the cost-effectiveness of treatment with SRS alone, SRS and WBRT (SRS+WBRT), and surgery followed by SRS (S+SRS) at the authors' institution. The authors retrospectively reviewed the medical records of 289 patients in whom brain metastases were newly diagnosed and who were treated between May 2001 and December 2007. Overall survival curves were plotted using the Kaplan-Meier method. Multivariate proportional hazards analysis (MVA) was used to identify factors associated with overall survival. Survival data were complete for 96.2% of patients, and comprehensive data on the resource use for imaging, hospitalizations, and salvage therapies were available from the medical records. Treatment costs included the cost of initial and all salvage therapies for brain metastases, hospitalizations, management of complications, and imaging. They were computed on the basis of the 2007 Medicare fee schedule from a payer perspective. Average treatment cost and average cost per month of median survival were compared. Sensitivity analysis was performed to examine the impact of variations in key cost variables. No significant differences in overall survival were observed among patients treated with SRS alone, SRS+WBRT, or S+SRS with respective median survival of 9.8, 7.4, and 10.6 months. The MVA detected a significant association of overall survival with female sex, Karnofsky Performance Scale (KPS) score, primary tumor control, absence of extracranial metastases, and number of brain metastases. Salvage therapy was required in 43% of SRS-alone and 26% of SRS+WBRT patients (p < 0.009). Despite an increased need for salvage therapy, the average cost per month of median survival was $2412 per month for SRS alone, $3220 per month for SRS+WBRT, and $4360 per month for S+SRS (p < 0.03). Compared with SRS+WBRT, SRS alone had an average incremental cost savings of $110 per patient. Sensitivity analysis confirmed that the average treatment cost of SRS alone remained less than or was comparable to SRS+WBRT over a wide range of costs and treatment efficacies. Despite an increased need for salvage therapy, patients with newly diagnosed brain metastases treated with SRS alone have similar overall survival and receive more cost-effective care than those treated with SRS+WBRT. Compared with SRS+WBRT, initial management with SRS alone does not result in a higher average cost.
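The cost-effectiveness metric used above reduces to simple arithmetic; in generic notation (mine, not the authors'):

\[
\text{cost per month of median survival} = \frac{\bar{C}}{\tilde{T}},
\qquad
\Delta\bar{C} = \bar{C}_{\text{SRS alone}} - \bar{C}_{\text{SRS+WBRT}},
\]

where $\bar{C}$ is the average total treatment cost per patient (initial and salvage therapy, hospitalizations, management of complications, and imaging, priced from the 2007 Medicare fee schedule) and $\tilde{T}$ is the median overall survival in months; $\Delta\bar{C}$ is the incremental cost per patient quoted above as a $110 saving for SRS alone.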
The effects of exercise and stress on the survival and maturation of adult-generated granule cells
Snyder, Jason S.; Glover, Lucas R.; Sanzone, Kaitlin M.; Kamhi, J. Frances; Cameron, Heather A.
2009-01-01
Stress strongly inhibits proliferation of granule cell precursors in the dentate gyrus, while voluntary running has the opposite effect. Few studies, however, have examined the possible effects of these environmental manipulations on the maturation and survival of young granule cells. We examined number of surviving granule cells and the proportion of young neurons that were functionally mature, as defined by seizure-induced immediate-early gene expression, in 14 and 21 day-old granule cells in mice that were given access to a running wheel, restrained daily for 2 hours, or given no treatment during this period. Importantly, treatments began two days after BrdU injection, to isolate effects on survival from those on cell proliferation. We found a large increase in granule cell survival in running mice compared with controls at both time points. In addition, running increased the proportion of granule cells expressing the immediate-early gene Arc in response to seizures, suggesting that it speeds incorporation into circuits, i.e., functional maturation. Stressed mice showed no change in Arc expression, compared to control animals, but, surprisingly, showed a transient increase in survival of 14-day-old granule cells, which was gone 7 days later. Examination of cell proliferation, using the endogenous mitotic marker proliferating cell nuclear antigen (PCNA) showed an increase in cell proliferation after 12 days of running but not after 19 days of running. The number of proliferating cells was unchanged 24 hours after the 12th or 19th episode of daily restraint stress. These findings demonstrate that running has strong effects on survival and maturation of young granule cells as well as their birth and that stress can have positive but short-lived effects on granule cell survival. PMID:19156854
Impact of Marital Status on Tumor Stage at Diagnosis and on Survival in Male Breast Cancer.
Adekolujo, Orimisan Samuel; Tadisina, Shourya; Koduru, Ujwala; Gernand, Jill; Smith, Susan Jane; Kakarala, Radhika Ramani
2017-07-01
The effect of marital status (MS) on survival varies according to cancer type and gender. There has been no report on the impact of MS on survival in male breast cancer (MBC). This study aims to determine the influence of MS on tumor stage at diagnosis and survival in MBC. Men with MBC ≥18 years of age in the SEER database from 1990 to 2011 were included in the study. MS was classified as married and unmarried (including single, divorced, separated, widowed). Kaplan-Meier method was used to estimate the 5-year cancer-specific survival. Multivariate regression analyses were done to determine the effect of MS on presence of Stage IV disease at diagnosis and on cancer-specific mortality. The study included 3,761 men; 2,647 (70.4%) were married. Unmarried men were more often diagnosed with Stage IV MBC compared with married (10.7% vs. 5.5%, p < .001). Unmarried men (compared with married) were significantly less likely to undergo surgery (92.4% vs. 96.7%, p < .001). Overall unmarried males with Stages II, III, and IV MBC have significantly worse 5-year cancer-specific survival compared with married. On multivariate analysis, being unmarried was associated with increased hazard of death (HR = 1.43, p < .001) and increased likelihood of Stage IV disease at diagnosis ( OR = 1.96, p < .001). Unmarried males with breast cancer are at greater risk for Stage IV disease at diagnosis and poorer outcomes compared with married males.
Halland, Frode; Morken, Nils-Halvdan; DeRoo, Lisa A; Klungsøyr, Kari; Wilcox, Allen J; Skjærven, Rolv
2016-11-24
To assess the association between perinatal losses and mother's long-term mortality and modification by surviving children and attained education. A population-based cohort study. Norwegian national registries. We followed 652 320 mothers with a first delivery from 1967 and completed reproduction before 2003, until 2010 or death. We excluded mothers with plural pregnancies, without information on education (0.3%) and women born outside Norway. Main outcome measures were age-specific (40-69 years) cardiovascular and non-cardiovascular mortality. We calculated mortality in mothers with perinatal losses, compared with mothers without, and in mothers with one loss by number of surviving children in strata of mothers' attained education (<11 years (low), ≥11 years (high)). Mothers with perinatal losses had increased crude mortality compared with mothers without; total: HR 1.3 (95% CI 1.3 to 1.4), cardiovascular: HR 1.8 (1.5 to 2.1), non-cardiovascular: HR 1.3 (1.2 to 1.4). Childless mothers with one perinatal loss had increased mortality compared with mothers with one child and no loss; cardiovascular: low education HR 2.7 (1.7 to 4.3), high education HR 0.91 (0.13 to 6.5); non-cardiovascular: low education HR 1.6 (1.3 to 2.2), high education HR 1.8 (1.1 to 2.9). Mothers with one perinatal loss, surviving children and high education had no increased mortality, whereas corresponding mothers with low education had increased mortality; cardiovascular: two surviving children HR 1.7 (1.2 to 2.4), three or more surviving children HR 1.6 (1.1 to 2.4); non-cardiovascular: one surviving child HR 1.2 (1.0 to 1.5), two surviving children HR 1.2 (1.1 to 1.4). Irrespective of education, we find excess mortality in childless mothers with a perinatal loss. Increased mortality in mothers with one perinatal loss and surviving children was limited to mothers with low education. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Kuo, Phoebe; Sosa, Julie A; Burtness, Barbara A; Husain, Zain A; Mehra, Saral; Roman, Sanziana A; Yarbrough, Wendell G; Judson, Benjamin L
2016-06-15
The current study was performed to characterize trends and survival outcomes for chemotherapy in the definitive and adjuvant treatment of hypopharyngeal cancer in the United States. A total of 16,248 adult patients diagnosed with primary hypopharyngeal cancer without distant metastases between 1998 and 2011 were identified in the National Cancer Data Base. The association between treatment modality and overall survival was analyzed using Kaplan-Meier survival curves and 5-year survival rates. A multivariate Cox regression analysis was performed on a subset of 3357 cases to determine the treatment modalities that predict improved survival when controlling for demographic and clinical factors. There was a significant increase in the use of chemotherapy with radiotherapy both as definitive treatment (P<.001) and as adjuvant chemoradiotherapy with surgery (P=.001). This was accompanied by a decrease in total laryngectomy/pharyngectomy rates (P<.001). Chemoradiotherapy was associated with improved 5-year survival compared with radiotherapy alone in the definitive setting (31.8% vs 25.2%; log rank P<.001). Similarly, in multivariate analysis, definitive radiotherapy was found to be associated with compromised survival compared with definitive chemoradiotherapy (hazard ratio, 1.51; P<.001). Survival analysis revealed that overall 5-year survival rates were higher for chemoradiotherapy compared with radiotherapy alone in the definitive setting, but were comparable between surgery with chemoradiotherapy and surgery with radiotherapy. Cancer 2016;122:1853-60. © 2016 American Cancer Society.
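A Kaplan-Meier comparison with a log-rank test of the kind reported above can be sketched with the lifelines package. This is a toy example on simulated two-arm data (the group labels, event times, and censoring rule are all assumptions), not the National Cancer Data Base analysis.

import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(1)
n = 500
# hypothetical two-arm data: definitive chemoradiotherapy vs radiotherapy alone
group = rng.integers(0, 2, n)                                   # 1 = chemoradiotherapy, 0 = radiotherapy alone
time = rng.exponential(scale=np.where(group == 1, 4.0, 3.0))    # years to death
event = (time < 5).astype(int)                                  # deaths observed within 5 years
time = np.clip(time, None, 5)                                   # administrative censoring at 5 years

km = KaplanMeierFitter()
for g, label in [(1, "chemoradiotherapy"), (0, "radiotherapy alone")]:
    mask = group == g
    km.fit(time[mask], event_observed=event[mask], label=label)
    print(label, "5-year survival:", float(km.survival_function_at_times(5.0).iloc[0]))

res = logrank_test(time[group == 1], time[group == 0],
                   event_observed_A=event[group == 1],
                   event_observed_B=event[group == 0])
print("log-rank p-value:", res.p_value)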
Rogers, Nina Trivedy; Demakakos, Panayotes; Taylor, Mark Steven; Steptoe, Andrew; Hamer, Mark; Shankar, Aparna
2018-01-01
Background Volunteering has been linked to reduced mortality in older adults but the mechanisms explaining this effect remain unclear. This study investigated whether volunteering is associated with increased survival in participants of the English Longitudinal Study of Ageing and whether differences in survival are modified by functional disabilities. Methods A multivariate Cox Proportional Hazards model was used to estimate the association of volunteering with survival over a period of 10.9 years in 10,324 participants, whilst controlling for selected confounders. To investigate effect modification by disability, the analyses were repeated in participants with and without self-reported functional disabilities. Results Volunteering was associated with a reduced probability of death from all-causes in univariate analyses (HR = 0.65, CI 0.58–0.73, P < 0.0001), but adjustment for covariates rendered this association non-significant (HR = 0.90, CI 0.79–1.01, P = 0.07). Able-bodied volunteers had significantly increased survival compared to able-bodied non-volunteers (HR = 0.81, 95% CI: 0.69 – 0.95, P = 0.009). There was no significant survival advantage among disabled volunteers, compared to disabled non-volunteers (HR = 1.06, CI 0.88–1.29, P = 0.53). Conclusion Volunteering is associated with reduced mortality in older adults in England, but this effect appears to be limited to volunteers who report no disabilities. PMID:26811548
Archer, D C; Pinchbeck, G L; Proudman, C J
2011-08-01
Epiploic foramen entrapment (EFE) has been associated with reduced post operative survival compared to other types of colic, but specific factors associated with reduced long-term survival of these cases have not been evaluated in a large number of horses using survival analysis. To describe post operative survival of EFE cases and to identify factors associated with long-term survival. A prospective, multicentre, international study was conducted using clinical data and long-term follow-up information for 126 horses diagnosed with EFE during exploratory laparotomy at 15 clinics in the UK, Ireland and USA. Descriptive data were generated and survival analysis performed to identify factors associated with reduced post operative survival. For the EFE cohort that recovered following anaesthesia, survival to hospital discharge was 78.5%. Survival to 1 and 2 years post operatively was 50.6 and 34.3%, respectively. The median survival time of EFE cases undergoing surgery was 397 days. Increased packed cell volume (PCV) and increased length of small intestine (SI) resected were significantly associated with increased likelihood of mortality in multivariable analysis of pre- and intraoperative variables. When all pre-, intra- and post operative variables were analysed separately, only horses that developed post operative ileus (POI) were shown to be at increased likelihood of mortality. Increased PCV, increased length of SI resected and POI are all associated with increased likelihood of mortality of EFE cases. This emphasises the importance of early diagnosis and treatment and the need for improved strategies in the management of POI in order to reduce post operative mortality in these cases. The present study provides evidence-based information to clinicians and owners of horses undergoing surgery for EFE about long-term survival. These results are applicable to university and large private clinics over a wide geographical area. © 2011 EVJ Ltd.
Wang, Xiangyang; Cao, Weilan; Zheng, Chenguo; Hu, Wanle; Liu, Changbao
2018-06-01
Marital status has been validated as an independent prognostic factor for survival in several cancer types, but is controversial in rectal cancer (RC). The objective of this study was to investigate the impact of marital status on the survival outcomes of patients with RC. We extracted data of 27,498 eligible patients diagnosed with RC between 2004 and 2009 from the Surveillance, Epidemiology and End Results (SEER) database. Patients were categorized into married, never married, divorced/separated and widowed groups. We used Chi-square tests to compare characteristics of patients with different marital status. Rectal cancer-specific survival was compared using the Kaplan-Meier method, and multivariate Cox regression analysis was used to analyze the survival outcome risk factors in the different marital status groups. The widowed group had the highest percentage of elderly patients and women, a higher proportion of adenocarcinomas, and more stage I/II tumors (P < 0.05), but a lower rate of surgery compared to the married group (76.7% vs 85.4%). Compared with the married patients, the never married (HR 1.40), widowed (HR 1.61) and divorced/separated patients (HR 1.16) had an increased overall 5-year mortality. A further analysis showed that widowed patients had worse 5-year cause-specific survival (CSS) than married patients at stage I (HR 1.92), stage II (HR 1.65), stage III (HR 1.73), and stage IV (HR 1.38). Our study showed marriage was associated with better outcomes in RC patients, but unmarried RC patients, especially widowed patients, are at greater risk of cancer-specific mortality. Copyright © 2018 Elsevier Ltd. All rights reserved.
Lockshin, A.; Giovanella, B.C.; Vardeman, D.M.
1985-04-01
Anticancer drugs were tested on NIH-2 nude mice inoculated ip with BRO human melanoma cells, which are rapidly lethal for these hosts. Criteria for drug activity were a) increased host survival and b) an increased rate of radioactivity loss from mice bearing BRO cells prelabeled with (¹²⁵I)5-iodo-2'-deoxyuridine. Diphtheria toxin, which is selectively toxic to human cells compared to mouse cells, prolonged host survival and accelerated ¹²⁵I elimination in a dose-dependent manner. Drugs that increased the rate of ¹²⁵I loss compared to the rate of untreated mice also prolonged the lives of treated mice. With one exception, drugs that did not accelerate ¹²⁵I elimination had little or no effect on the length of survival.
Stephenson, Anne L; Sykes, Jenna; Stanojevic, Sanja; Quon, Bradley S; Marshall, Bruce C; Petren, Kristofer; Ostrenga, Josh; Fink, Aliza K; Elbert, Alexander; Goss, Christopher H
2017-04-18
In 2011, the median age of survival of patients with cystic fibrosis reported in the United States was 36.8 years, compared with 48.5 years in Canada. Direct comparison of survival estimates between national registries is challenging because of inherent differences in methodologies used, data processing techniques, and ascertainment bias. To use a standardized approach to calculate cystic fibrosis survival estimates and to explore differences between Canada and the United States. Population-based study. 42 Canadian cystic fibrosis clinics and 110 U.S. cystic fibrosis care centers. Patients followed in the Canadian Cystic Fibrosis Registry (CCFR) and U.S. Cystic Fibrosis Foundation Patient Registry (CFFPR) between 1990 and 2013. Cox proportional hazards models were used to compare survival between patients followed in the CCFR (n = 5941) and those in the CFFPR (n = 45 448). Multivariable models were used to adjust for factors known to be associated with survival. Median age of survival in patients with cystic fibrosis increased in both countries between 1990 and 2013; however, in 1995 and 2005, survival in Canada increased at a faster rate than in the United States (P < 0.001). On the basis of contemporary data from 2009 to 2013, the median age of survival in Canada was 10 years greater than in the United States (50.9 vs. 40.6 years, respectively). The adjusted risk for death was 34% lower in Canada than the United States (hazard ratio, 0.66 [95% CI, 0.54 to 0.81]). A greater proportion of patients in Canada received transplants (10.3% vs. 6.5%, respectively [standardized difference, 13.7]). Differences in survival between U.S. and Canadian patients varied according to U.S. patients' insurance status. Ascertainment bias due to missing data or nonrandom loss to follow-up might affect the results. Differences in cystic fibrosis survival between Canada and the United States persisted after adjustment for risk factors associated with survival, except for private-insurance status among U.S. patients. Differential access to transplantation, increased posttransplant survival, and differences in health care systems may, in part, explain the Canadian survival advantage. U.S. Cystic Fibrosis Foundation.
Near-Infrared Irradiation Increases Length of Axial Pattern Flap Survival in Rats.
Yasunaga, Yoshichika; Matsuo, Kiyoshi; Tanaka, Yohei; Yuzuriha, Shunsuke
2017-01-01
Objective: We previously reported that near-infrared irradiation nonthermally induces long-lasting vasodilation of the subdermal plexus by causing apoptosis of vascular smooth muscle cells. To clarify the possible application of near-infrared irradiation to prevent skin flap necrosis, we evaluated the length of axial pattern flap survival in rats by near-infrared irradiation. Methods: A bilaterally symmetric island skin flap was elevated under the panniculus carnosus on the rat dorsum. Half of the flap was subjected to near-infrared irradiation just before flap elevation with a device that simulates solar radiation, which has a specialized contact cooling apparatus to avoid thermal effects. The length of flap survival of the near-infrared irradiated side was measured 7 days after flap elevation and compared with the nonirradiated side. Results: The irradiated side showed elongation of flap survival compared with the nonirradiated side (73.3 ± 11.7 mm vs 67.3 ± 14.9 mm, respectively, P = .03). Conclusions: Near-infrared irradiation increases the survival length of axial pattern flaps in rats.
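The abstract reports a within-animal comparison of flap survival length (irradiated vs. nonirradiated side) but does not name the statistical test used; a paired comparison such as the one sketched below is one standard approach for this kind of design. The sample size, means, and spreads here are placeholders loosely echoing the reported values, not the study data.

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n_rats = 20  # hypothetical sample size; not stated in the abstract
nonirradiated = rng.normal(67.3, 14.9, n_rats)             # surviving flap length, mm
irradiated = nonirradiated + rng.normal(6.0, 8.0, n_rats)  # assumed within-animal gain

t_stat, p_value = stats.ttest_rel(irradiated, nonirradiated)
print(f"mean difference = {np.mean(irradiated - nonirradiated):.1f} mm, p = {p_value:.3f}")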
Mitry, E; Rollot, F; Jooste, V; Guiu, B; Lepage, C; Ghiringhelli, F; Faivre, J; Bouvier, A M
2013-09-01
To describe trends in survival of non-resectable metastatic colorectal cancer (MCRC) over a 34-year period in a French population-based registry taking into account major advances in medical therapy. 3804 patients with non-resectable metastatic colorectal cancer diagnosed between 1976 and 2009 were included. Three periods (1976-96, 1997-2004 and 2005-09) were considered. The proportion of patients receiving chemotherapy dramatically increased from 19% to 57% between the first two periods, then increased steadily thereafter reaching 59% during the last period (p<0.001). Median relative survival increased from 5.9 months during the 1976-96 period to 10.2 months during the 1997-2004 period but, despite the availability of targeted therapies, remained at 9.5 months during the 2005-09 period. During the last study period, less than 10% of elderly patients received targeted therapies compared to more than 40% for younger patients. Their median relative survival was 5.0 months compared to 15.6 months in younger patients. There was an improvement in survival in relation with the increased use of more effective medical treatment. However, at a population-based level, patients are not all treated equally and most of them, especially the elderly, do not benefit from the most up-to-date treatment options. Copyright © 2013 Elsevier Ltd. All rights reserved.
The Natural History of Nonobstructive Hypertrophic Cardiomyopathy.
Hebl, Virginia B; Miranda, William R; Ong, Kevin C; Hodge, David O; Bos, J Martijn; Gentile, Federico; Klarich, Kyle W; Nishimura, Rick A; Ackerman, Michael J; Gersh, Bernard J; Ommen, Steve R; Geske, Jeffrey B
2016-03-01
To describe the survival of a large nonobstructive hypertrophic cardiomyopathy (NO-HCM) cohort and to identify risk factors for increased mortality in this population. Patients were identified from the Mayo Clinic HCM database from January 1, 1975, through November 30, 2006, for this retrospective observational study. Patients with resting or provocable left ventricular outflow tract gradients were excluded. Echocardiographic, clinical, and genetic data were compared between subgroups, and survival data were compared with expected population rates. A total of 706 patients with NO-HCM were identified. During median follow-up of 5 years (mean, 7 years), there were 208 deaths. Overall survival was no different than expected compared with age- and sex-matched white US population mortality rates (P=.77). Independent predictors of death were age at diagnosis, "burned out" HCM, and history of transient ischemic attack or stroke; use of an implantable cardioverter defibrillator (ICD) was inversely related to death. After exclusion of patients with an ICD, there was no difference in survival compared with age- and sex- matched individuals (P=.39); age, previous transient ischemic attack/stroke, and burned out HCM were predictors of death. In this cohort, patients with NO-HCM had similar survival rates as age- and sex-matched white US population mortality rates. Although use of an ICD was inversely related to death, no differences in overall survival were seen after those patients were excluded. Burned out HCM was independently associated with an increased risk of death, identifying a subset of patients who may benefit from more aggressive therapies. Copyright © 2016 Mayo Foundation for Medical Education and Research. Published by Elsevier Inc. All rights reserved.
Koch, Jan Christoph; Tönges, Lars; Michel, Uwe; Bähr, Mathias; Lingor, Paul
2014-01-01
The Rho/ROCK pathway is a promising therapeutic target in neurodegenerative and neurotraumatic diseases. Pharmacological inhibition of various pathway members has been shown to promote neuronal regeneration and survival. However, because pharmacological inhibitors are inherently limited in their specificity, shRNA-mediated approaches can add more information on the function of each single kinase involved. Thus, we generated adeno-associated viral vectors (AAV) to specifically downregulate Ras homologous member A (RhoA) via shRNA. We found that specific knockdown of RhoA promoted neurite outgrowth of retinal ganglion cells (RGC) grown on the inhibitory substrate chondroitin sulfate proteoglycan (CSPG) as well as neurite regeneration of primary midbrain neurons (PMN) after scratch lesion. In the rat optic nerve crush (ONC) model in vivo, downregulation of RhoA significantly enhanced axonal regeneration compared to control. Moreover, survival of RGC transduced with AAV expressing RhoA-shRNA was substantially increased at 2 weeks after optic nerve axotomy. Compared to previous data using pharmacological inhibitors to target RhoA, its upstream regulator Nogo or its main downstream target ROCK, the specific effects of RhoA downregulation shown here were most pronounced in regard to promoting RGC survival but neurite outgrowth and axonal regeneration were also increased significantly. Taken together, we show here that specific knockdown of RhoA substantially increases neuronal survival after optic nerve axotomy and modestly increases neurite outgrowth in vitro and axonal regeneration after optic nerve crush. PMID:25249936
Survival from skin cancer and its associated factors in Kurdistan province of Iran.
Ahmadi, Galavizh; Asadi-Lari, Mohsen; Amani, Saeid; Solaymani-Dodaran, Masoud
2015-01-01
We explored survival of skin cancer and its determinants in Kurdistan province of Iran. In a retrospective cohort design, we identified all registered skin cancer patients in Kurdistan Cancer Registry from year 2000 to 2009. Information on time and cause of death were obtained from Registrar's office and information on type, stage and anatomic locations were extracted from patients' hospital records. Additional demographic information was collected via a telephone interview. We calculated the 3 and 5 years survival. Survival experiences in different groups were compared using log rank test. Cox proportional hazard model was built and hazard ratios and their 95% confidence intervals were calculated. Of a total of 1353, contact information for 667 patients were available, all of which were followed up. 472 telephone interviews were conducted. Mean follow-up time was 34 months. We identified 78 deaths in this group of patients and 44 of them were because of skin cancer. After controlling for confounding, tumour type, anatomical location, and diseases stage remained significantly associated with survival. Hazard ratios for death because of squamous cell carcinoma was 74.5 (95%CI: 4.8-1146) and for melanoma was 24.4 (95%CI: 1.3-485) compared with basal cell carcinomas. Hazard ratio for tumours in stage 4 was 16.7 (95%CI: 1.8-156.6) and for stage 3 was 16.8 (95%CI: 1.07-260) compared with stage 1 and 2. Tumour stage is independently associated with survival. Relatively low survival rates suggest delayed diagnosis. Increasing public awareness through media about the warning signs of skin cancers could increase the chance of survival in these patients.
Jansen, L; Buttmann-Schweiger, N; Listl, S; Ressing, M; Holleczek, B; Katalinic, A; Luttmann, S; Kraywinkel, K; Brenner, H
2018-01-01
The epidemiology of squamous cell oral cavity and pharyngeal cancers (OCPC) has changed rapidly during the last years, possibly due to an increase of human papilloma virus (HPV) positive tumors and successes in tobacco prevention. Here, we compare incidence and survival of OCPC by HPV-relation of the site in Germany and the United States (US). Age-standardized and age-specific incidence and 5-year relative survival was estimated using data from population-based cancer registries in Germany and the US Surveillance Epidemiology and End Results (SEER) 13 database. Incidence was estimated for each year between 1999 and 2013. Relative survival for 2002-2005, 2006-2009, and 2010-2013 was estimated using period analysis. The datasets included 52,787 and 48,861 cases with OCPC diagnosis between 1997 and 2013 in Germany and the US. Incidence was much higher in Germany compared to the US for HPV-unrelated OCPC and more recently also for HPV-related OCPC in women. Five-year relative survival differences between Germany and the US were small for HPV-unrelated OCPC. For HPV-related OCPC, men had higher survival in the US (62.1%) than in Germany (45.4%) in 2010-2013. These differences increased over time and were largest in younger patients and stage IV disease without metastasis. In contrast, women had comparable survival for HPV-related OCPC in both countries. Strong survival differences between Germany and the US were observed for HPV-related OCPC in men, which might be explained by differences in HPV-attributable proportions. Close monitoring of the epidemiology of OCPC in each country is needed. Copyright © 2017 Elsevier Ltd. All rights reserved.
Intermittent individual housing increases survival of newly proliferated cells.
Aberg, Elin; Pham, Therese M; Zwart, Mieke; Baumans, Vera; Brené, Stefan
2005-09-08
In this study, we analyzed how intermittent individual housing with or without a running wheel influenced corticosterone levels and survival of newly proliferated cells in the dentate gyrus of the hippocampus. Female Balb/c mice, in standard or enhanced housing, were divided into groups that were individually housed with or without running wheels on every second day. Intermittent individual housing without, but not with, running wheels increased survival of proliferated cells in the dentate gyrus as compared with continuous group housing in standard or enhanced conditions. Thus, changes in housing conditions on every second day can, under certain circumstances, have an impact on the survival of newly proliferated cells in the dentate gyrus.
The importance of calcium in improving resistance of Daphnia to Microcystis.
Akbar, Siddiq; Du, Jingjing; Jia, Yong; Tian, Xingjun
2017-01-01
Changing environmental calcium (Ca) and rising cyanobacterial blooms in lake habitats could strongly reduce Daphnia growth and survival. Here, we assessed the effects of maternal Ca in Daphnia on the transfer of resistance to their offspring against Microcystis aeruginosa PCC7806 (M. aeruginosa). Laboratory microcosm experiments were performed to examine these effects in Daphnia carinata (D. carinata) and Daphnia pulex (D. pulex), and how maternal Ca induces responses in their offspring. The results showed that growth and survival were increased in offspring from exposed Daphnia compared to unexposed Daphnia when raised in high Ca and increasing M. aeruginosa concentrations. Among exposed Daphnia, offspring from high Ca mothers produced more neonates with larger size and higher survival compared to offspring from low maternal Ca. When exposed D. carinata and D. pulex offspring were reared in Ca-deficient medium with increasing M. aeruginosa concentrations, time to first brood increased, body size became larger, and total offspring decreased over three subsequent broods in offspring from low maternal Ca. In contrast, growth and reproduction in offspring from high Ca exposed mothers were consistent across the three subsequent broods. Despite species-specific responses in growth, survival and life-history traits in the two Daphnia species, our results not only show maternal induction in Daphnia but also highlight that offspring response to M. aeruginosa varies with maternal Ca. This study demonstrates that Ca has a role in Daphnia maternal induction against Microcystis, and that the recent Ca decline and increasing Microcystis concentrations in lakes may decrease Daphnia growth and survival. Our data provide insights into the interactive effect of maternal Ca and Microcystis exposure on Daphnia and their outcome on offspring life-history traits and survival.
Baade, Peter D; Dasgupta, Paramita; Dickman, Paul W; Cramb, Susanna; Williamson, John D; Condon, John R; Garvey, Gail
2016-08-01
The survival inequality faced by Indigenous Australians after a cancer diagnosis is well documented; what is less understood is whether this inequality has changed over time and what this means in terms of the impact a cancer diagnosis has on Indigenous people. Survival information for all patients identified as either Indigenous (n=3168) or non-Indigenous (n=211,615) and diagnosed in Queensland between 1997 and 2012 were obtained from the Queensland Cancer Registry, with mortality followed up to 31st December, 2013. Flexible parametric survival models were used to quantify changes in the cause-specific survival inequalities and the number of lives that might be saved if these inequalities were removed. Among Indigenous cancer patients, the 5-year cause-specific survival (adjusted by age, sex and broad cancer type) increased from 52.9% in 1997-2006 to 58.6% in 2007-2012, while it improved from 61.0% to 64.9% among non-Indigenous patients. This meant that the adjusted 5-year comparative survival ratio (Indigenous: non-Indigenous) increased from 0.87 [0.83-0.88] to 0.89 [0.87-0.93], with similar improvements in the 1-year comparative survival. Using a simulated cohort corresponding to the number and age-distribution of Indigenous people diagnosed with cancer in Queensland each year (n=300), based on the 1997-2006 cohort mortality rates, 35 of the 170 deaths due to cancer (21%) expected within five years of diagnosis were due to the Indigenous: non-Indigenous survival inequality. This percentage was similar when applying 2007-2012 cohort mortality rates (19%; 27 out of 140 deaths). Indigenous people diagnosed with cancer still face a poorer survival outlook than their non-Indigenous counterparts, particularly in the first year after diagnosis. The improving survival outcomes among both Indigenous and non-Indigenous cancer patients, and the decreasing absolute impact of the Indigenous survival disadvantage, should provide increased motivation to continue and enhance current strategies to further reduce the impact of the survival inequalities faced by Indigenous people diagnosed with cancer. Copyright © 2016 Elsevier Ltd. All rights reserved.
Steenkamp, Retha; Shaw, Catriona; Feest, Terry
2013-01-01
These analyses examine a) survival from the start of renal replacement therapy (RRT) based on the total incident UK RRT population reported to the UK Renal Registry, b) survival of prevalent patients. Changes in survival between 1997 and 2011 are also reported. Survival was calculated for both incident and prevalent patients on RRT and compared between the UK countries after adjustment for age. Survival of incident patients (starting RRT during 2010) was calculated both from the start of RRT and from 90 days after starting RRT, both with and without censoring at transplantation. Prevalent dialysis patients were censored at transplantation; this means that the patient is considered alive up to the point of transplantation, but the patient's status post-transplant is not considered. Both Kaplan-Meier and Cox adjusted models were used to calculate survival. Causes of death were analysed for both groups. The relative risk of death was calculated compared with the general UK population. The unadjusted 1 year after 90 day survival for patients starting RRT in 2010 was 87.3%, representing an increase from the previous year (86.6%). In incident patients aged 18-64 years, the unadjusted 1 year survival had risen from 86.0% in patients starting RRT in 1997 to 92.6% in patients starting RRT in 2010 and for those aged ≥65 it had increased from 63.9% to 77.0% over the same period. The age-adjusted one year survival (adjusted to age 60) of prevalent dialysis patients increased from 88.1% in the 2001 cohort to 89.8% in the 2010 cohort. Prevalent diabetic patient one year survival rose from 82.1% in the 2002 cohort to 84.7% in the 2010 cohort. The age-standardised mortality ratio for prevalent RRT patients compared with the general population was 18 for age group 30-34 and 2.5 at age 85+ years. In the prevalent RRT dialysis population, cardiovascular disease accounted for 22% of deaths, infection and treatment withdrawal 18% each and 25% were recorded as other causes of death. Treatment withdrawal was a more frequent cause of death in those incident patients aged ≥65 than in younger patients. The median life years remaining for a 25-29 year old on RRT was 18 years and approximately three years for a 75+ year old. Survival of patients starting RRT has improved in the 2010 incident cohort. The relative risk of death on RRT compared with the general population has fallen since 2001. Copyright © 2013 S. Karger AG, Basel.
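The age-standardised mortality ratio quoted above is the ratio of observed deaths in the RRT population to the deaths expected if general-population mortality rates applied; in generic notation (mine, not the registry's):

\[
\mathrm{SMR} = \frac{O}{E} = \frac{\sum_a d_a}{\sum_a n_a \, m_a},
\]

where, within each age band $a$, $d_a$ is the number of observed deaths among RRT patients, $n_a$ the person-years at risk, and $m_a$ the corresponding general-population mortality rate.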
Deventer, S A; Herberstein, M E; Mayntz, D; O'Hanlon, J C; Schneider, J M
2017-12-01
Many hypotheses explaining the evolution and maintenance of sexual cannibalism incorporate the nutritional aspect of the consumption of males. Most studies have focused on a fecundity advantage through consumption of a male; however, recent studies have raised the intriguing possibility that consumption of a male may also affect offspring quality. In particular, recent studies suggest prolonged survival for offspring from sexually cannibalistic females. Here, we measured the protein and lipid content of males compared to insect prey (crickets), quantified female nutrient intake of both prey types and finally assessed how sexual cannibalism affects female fecundity and spiderling quality in the orb-web spider Larinioides sclopetarius. We found no evidence that sexual cannibalism increased fecundity when compared to a female control group fed a cricket. Contrary to previous studies, spiderlings from females fed a male showed reduced survival under food deprivation compared to spiderlings from the control group. Offspring from females fed a male also tended to begin web construction sooner. The low lipid content of males compared to crickets may have reduced offspring survival duration. Whether additional proteins obtained through consumption of a male translate to enhanced silk production in offspring requires further investigation. © 2017 European Society For Evolutionary Biology. Journal of Evolutionary Biology © 2017 European Society For Evolutionary Biology.
Landscape‐level patterns in fawn survival across North America
Gingery, Tess M.; Diefenbach, Duane R.; Wallingford, Bret D.; Rosenberry, Christopher S.
2018-01-01
A landscape‐level meta‐analysis approach to examining early survival of ungulates may elucidate patterns in survival not evident from individual studies. Despite numerous efforts, the relationship between fawn survival and habitat characteristics remains unclear and there has been no attempt to examine trends in survival across landscape types with adequate replication. In 2015–2016, we radiomarked 98 white‐tailed deer (Odocoileus virginianus) fawns in 2 study areas in Pennsylvania. By using a meta‐analysis approach, we compared fawn survival estimates from across North America using published data from 29 populations in 16 states to identify patterns in survival and cause‐specific mortality related to landscape characteristics, predator communities, and deer population density. We modeled fawn survival relative to percentage of agricultural land cover and deer density. Estimated average survival to 3–6 months of age was 0.414 ± 0.062 (SE) in contiguous forest landscapes (no agriculture) and for every 10% increase in land area in agriculture, fawn survival increased 0.049 ± 0.014. We classified cause‐specific mortality as human‐caused, natural (excluding predation), and predation according to agriculturally dominated, forested, and mixed (i.e., both agricultural and forest cover) landscapes. Predation was the greatest source of mortality in all landscapes. Landscapes with mixed forest and agricultural cover had greater proportions and rates of human‐caused mortalities, and lower proportions and rates of mortality due to predators, when compared to forested landscapes. Proportion and rate of natural deaths did not differ among landscapes. We failed to detect any relationship between fawn survival and deer density. The results highlight the need to consider multiple spatial scales when accounting for factors that influence fawn survival. Furthermore, variation in mortality sources and rates among landscapes indicate the potential for altered landscape mosaics to influence fawn survival rates. Wildlife managers can use the meta‐analysis to identify factors that will facilitate comparisons of results among studies and advance a better understanding of patterns in fawn survival.
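The reported landscape effect implies a simple linear relation between fawn survival and the percentage of agricultural land cover. The sketch below is a point prediction built only from the two published coefficients (0.414 in contiguous forest, +0.049 per 10 percentage points of agriculture); it ignores the standard errors and assumes linearity across the observed range.

def predicted_fawn_survival(pct_agriculture):
    # survival to 3-6 months of age implied by the reported estimates:
    # 0.414 in contiguous forest (0% agriculture) + 0.049 per 10 points of agricultural cover
    return 0.414 + 0.049 * (pct_agriculture / 10.0)

print(round(predicted_fawn_survival(0), 3))   # 0.414, contiguous forest
print(round(predicted_fawn_survival(40), 3))  # 0.61 in a 40% agricultural landscape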
Biddle, Martha; Moser, Debra; Song, Eun Kyeung; Heo, Seongkum; Payne-Emerson, Heather; Dunbar, Sandra B.; Pressler, Susan; Lennie, Terry
2012-01-01
Background The antioxidant lycopene may be beneficial for patients with heart failure (HF). Processed tomato products are a major source of lycopene, although they are also high in sodium. Increased sodium intake may counter the positive antioxidant effect of lycopene. Methods This was a prospective study of 212 patients with HF. Dietary intake of lycopene and sodium was obtained from weighted 4-day food diaries. Patients were grouped by the median split of lycopene of 2471 μg/day and stratified by daily sodium levels above and below 3 g/day. Patients were followed for 1 year to collect survival and hospitalization data. Cox proportional hazards modeling was used to compare cardiac event-free survival between lycopene groups within each stratum of sodium intake. Results Higher lycopene intake was associated with longer cardiac event-free survival compared with lower lycopene intake (p = 0.003). The worst cardiac event-free survival was observed in the low lycopene intake group regardless of sodium intake (> 3 g/day HR = 3.01; p = 0.027 and ≤ 3 g/day HR= 3.34; p = 0.023). Conclusion These findings suggest that increased lycopene intake has the potential to improve cardiac event-free survival in patients with HF independent of sodium intake. PMID:23076979
Castillo, V.; Pessina, P.; Hall, P.; Blatter, M.F. Cabrera; Miceli, D.; Arias, E. Soler; Vidal, P.
2016-01-01
The objective of the present study was to compare the effects of isotretinoin 9-cis (RA9-cis) as a post-surgery treatment of thyroid carcinoma to a traditional treatment (doxorubicin) and no treatment. Owners who did not want their dogs to receive treatment were placed into the control group A (GA; n=10). The remaining dogs were randomly placed into either group B (GB; n=12) and received doxorubicin at a dose of 30 mg/m2 every three weeks, for six complete cycles or group C (GC; n=15) and treated with RA9-cis at a dose of 2 mg/kg/day for 6 months. The time of the recurrence was significantly shorter in the GA and GB compared to GC (P=0.0007; P=0.0015 respectively), while we did not detect differences between GA and GB. The hazard ratio of recurrence between GA and GB compared to GC were 7.25 and 5.60 times shorter, respectively. We did not detect any differences between the other groups. The risk ratio of recurrence was 2.0 times higher in GA compared to GC and 2.1 times higher in GB compared to GC. The type of carcinoma had an effect on time of survival with follicular carcinomas having an increased mean survival time than follicular-compact carcinomas (P<0.0001) and follicular-compact carcinomas had a longer mean survival time than compact carcinomas. The interaction among treatment and type was significant, but survival time in follicular carcinomas did not differ between treatments. In follicular-compact carcinomas the survival time of GC was greater than GB (P<0.05), but we did not detect a difference between GA and GB. In conclusion, this study shows that the use of surgery in combination with RA9-cis treatment significantly increases survival rate and decreases the time to tumor recurrence when compared to doxorubicin treated or untreated dogs. The histological type of carcinoma interacted with treatment for time to recurrence and survival time, with more undifferentiated carcinomas having a worse prognosis than differentiated carcinomas. PMID:26862515
Enhanced Microbial Survivability in Subzero Brines.
Heinz, Jacob; Schirmack, Janosch; Airo, Alessandro; Kounaves, Samuel P; Schulze-Makuch, Dirk
2018-04-17
It is well known that dissolved salts can significantly lower the freezing point of water and thus extend habitability to subzero conditions. However, most investigations thus far have focused on sodium chloride as a solute. In this study, we report on the survivability of the bacterial strain Planococcus halocryophilus in sodium, magnesium, and calcium chloride or perchlorate solutions at temperatures ranging from +25°C to -30°C. In addition, we determined the survival rates of P. halocryophilus when subjected to multiple freeze/thaw cycles. We found that cells suspended in chloride-containing samples have markedly increased survival rates compared with those in perchlorate-containing samples. In both cases, the survival rates increase with lower temperatures; however, this effect is more pronounced in chloride-containing samples. Furthermore, we found that higher salt concentrations increase survival rates when cells are subjected to freeze/thaw cycles. Our findings have important implications not only for the habitability of cold environments on Earth but also for extraterrestrial environments such as that of Mars, where cold brines might exist in the subsurface and perhaps even appear temporarily at the surface such as at recurring slope lineae. Key Words: Brines-Halophile-Mars-Perchlorate-Subzero-Survival. Astrobiology 18, xxx-xxx.
Ouweneel, Dagmar M; Schotborgh, Jasper V; Limpens, Jacqueline; Sjauw, Krischan D; Engström, A E; Lagrand, Wim K; Cherpanath, Thomas G V; Driessen, Antoine H G; de Mol, Bas A J M; Henriques, José P S
2016-12-01
Veno-arterial extracorporeal life support (ECLS) is increasingly used in patients during cardiac arrest and cardiogenic shock, to support both cardiac and pulmonary function. We performed a systematic review and meta-analysis of cohort studies comparing mortality in patients treated with and without ECLS support in the setting of refractory cardiac arrest and cardiogenic shock complicating acute myocardial infarction. We systematically searched MEDLINE, EMBASE, the Cochrane Central Register of Controlled Trials and the publisher subset of PubMed updated to December 2015. Thirteen studies were included of which nine included cardiac arrest patients (n = 3098) and four included patients with cardiogenic shock after acute myocardial infarction (n = 235). Data were pooled by a Mantel-Haenszel random effects model and heterogeneity was examined by the I² statistic. In cardiac arrest, the use of ECLS was associated with an absolute increase in 30-day survival of 13 % compared with patients in which ECLS was not used [95 % CI 6-20 %; p < 0.001; number needed to treat (NNT) 7.7] and a higher rate of favourable neurological outcome at 30 days (absolute risk difference 14 %; 95 % CI 7-20 %; p < 0.0001; NNT 7.1). Propensity matched analysis, including 5 studies and 438 patients (219 in both groups), showed similar results. In cardiogenic shock, ECLS showed a 33 % higher 30-day survival compared with IABP (95 % CI, 14-52 %; p < 0.001; NNT 13) but no difference when compared with TandemHeart/Impella (-3 %; 95 % CI -21 to 14 %; p = 0.70; NNH 33). In cardiac arrest, the use of ECLS was associated with an increased survival rate as well as an increase in favourable neurological outcome. In the setting of cardiogenic shock there was an increased survival with ECLS compared with IABP.
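The numbers needed to treat quoted above follow directly from the absolute risk differences; in generic notation (mine, not the authors'):

\[
\mathrm{NNT} = \frac{1}{\mathrm{ARD}}, \qquad
\text{e.g. } \mathrm{ARD} = 0.13 \;\Rightarrow\; \mathrm{NNT} = \frac{1}{0.13} \approx 7.7,
\]

which matches the reported 13 % absolute increase in 30-day survival (NNT 7.7) and, likewise, the 14 % absolute difference in favourable neurological outcome (NNT 7.1).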
Agar, Nita Sally; Wedgeworth, Emma; Crichton, Siobhan; Mitchell, Tracey J; Cox, Michael; Ferreira, Silvia; Robson, Alistair; Calonje, Eduardo; Stefanato, Catherine M; Wain, Elizabeth Mary; Wilkins, Bridget; Fields, Paul A; Dean, Alan; Webb, Katherine; Scarisbrick, Julia; Morris, Stephen; Whittaker, Sean J
2010-11-01
We have analyzed the outcome of mycosis fungoides (MF) and Sézary syndrome (SS) patients using the recent International Society for Cutaneous Lymphomas (ISCL)/European Organisation for Research and Treatment of Cancer (EORTC) revised staging proposal. Overall survival (OS), disease-specific survival (DSS), and risk of disease progression (RDP) were calculated for a cohort of 1,502 patients using univariate and multivariate models. The mean age at diagnosis was 54 years, and 71% of patients presented with early-stage disease. Disease progression occurred in 34%, and 26% of patients died due to MF/SS. A significant difference in survival and progression was noted for patients with early-stage disease having patches alone (T1a/T2a) compared with those having patches and plaques (T1b/T2b). Univariate analysis established that (1) advanced skin and overall clinical stage, increased age, male sex, increased lactate dehydrogenase (LDH), and large-cell transformation were associated with reduced survival and increased RDP; (2) hypopigmented MF, MF with lymphomatoid papulosis, and poikilodermatous MF were associated with improved survival and reduced RDP; and (3) folliculotropic MF was associated with an increased RDP. Multivariate analysis established that (1) advanced skin (T) stage, the presence in peripheral blood of the tumor clone without Sézary cells (B0b), increased LDH, and folliculotropic MF were independent predictors of poor survival and increased RDP; (2) large-cell transformation and tumor distribution were independent predictors of increased RDP only; and (3) N, M, and B stages; age; male sex; and poikilodermatous MF were only significant for survival. This study has validated the recently proposed ISCL/EORTC staging system and identified new prognostic factors.
Comparison of cancer survival in New Zealand and Australia, 2006-2010.
Aye, Phyu S; Elwood, J Mark; Stevanovic, Vladimir
2014-12-19
Previous studies have shown substantially higher mortality rates from cancer in New Zealand compared to Australia, but these studies have not included data on patient survival. This study compares the survival of cancer patients diagnosed in 2006-10 in the whole populations of New Zealand and Australia. Identical period survival methods were used to calculate relative survival ratios for all cancers combined, and for 18 cancers each accounting for more than 50 deaths per year in New Zealand, from 1 to 10 years from diagnosis. Cancer survival was lower in New Zealand, with 5-year relative survival being 4.2% lower in women, and 3.8% lower in men for all cancers combined. Of 18 cancers, 14 showed lower survival in New Zealand; the exceptions, with similar survival in each country, being melanoma, myeloma, mesothelioma, and cervical cancer. For most cancers, the differences in survival were maximum at 1 year after diagnosis, becoming smaller later; however, for breast cancer, the survival difference increased with time after diagnosis. The lower survival in New Zealand, and the higher mortality rates shown earlier, suggest that further improvements in recognition, diagnosis, and treatment of cancer in New Zealand should be possible. As the survival differences are seen soon after diagnosis, issues of early management in primary care and time intervals to diagnosis and treatment may be particularly important.
Conditional relative survival of oral cavity cancer: Based on Korean Central Cancer Registry.
Min, Seung-Ki; Choi, Sung Weon; Ha, Johyun; Park, Joo Yong; Won, Young-Joo; Jung, Kyu-Won
2017-09-01
Conditional relative survival (CRS) describes the survival chance of patients who have already survived for a certain period of time after diagnosis and treatment of cancer. Thus, CRS can complement the conventional 5-year relative survival, which does not consider the time patients have survived after their diagnosis. This study aimed to assess the 5-year CRS among Korean patients with oral cancer and the related risk factors. We identified 15,329 oral cavity cancer cases with a diagnosis between 1993 and 2013 in the Korea Central Cancer Registry. The CRS rates were calculated according to sex, age, subsite, histology, and stage at diagnosis. The 5-year relative survival was 57.2%, and further analysis revealed that the 5-year CRS increased during the first 2 years and reached a plateau at 86.5% after 5 years of survival. Women had better 5-year CRS than men after 5 years of survival (90.0% vs. 83.3%), and ≤45-year-old patients had better 5-year CRS than older patient groups (93.3% vs. 86.4% or 86.7%). Subsite-specific differences in 5-year CRS were observed (tongue: 91% vs. mouth floor: 73.9%). Squamous cell carcinoma had a CRS of 87.3%, compared to 85.5% for other histological types. Localized disease had a CRS of 95.7%, compared to 87.3% for regional metastasis. Patients with oral cavity cancer exhibited increasing CRS rates, which varied according to sex, age, subsite, histology, and stage at diagnosis. Thus, CRS analysis provides a more detailed perspective regarding survival during the years after the initial diagnosis or treatment. Copyright © 2017 Elsevier Ltd. All rights reserved.
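The conditional survival idea reduces to a short calculation: the probability of surviving a further 5 years, given y years already survived, is S(y + 5) / S(y). The Python sketch below uses hypothetical data and a Kaplan-Meier estimate as a stand-in for the registry's relative survival; it only illustrates the arithmetic, not the study's methods.

```python
# Sketch of conditional survival: the chance of surviving 5 more years given
# that a patient has already survived `y` years, i.e. CS(5 | y) = S(y + 5) / S(y).
# A Kaplan-Meier estimate stands in for relative survival; data are hypothetical.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(0)
durations = rng.exponential(scale=8.0, size=500)   # hypothetical follow-up, years
events = rng.random(500) < 0.7                      # hypothetical death indicator

kmf = KaplanMeierFitter().fit(durations, event_observed=events)

def conditional_survival(kmf, extra, already_survived):
    s_now = float(kmf.predict(already_survived))
    s_later = float(kmf.predict(already_survived + extra))
    return s_later / s_now

for y in range(0, 6):
    print(f"5-year survival conditional on {y} years survived: "
          f"{conditional_survival(kmf, 5, y):.1%}")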
Guzauskas, Gregory F; Villa, Kathleen F; Vanhove, Geertrui F; Fisher, Vicki L; Veenstra, David L
2017-03-01
To estimate the risk-benefit trade-off of a pediatric-inspired regimen versus hyperfractionated cyclophosphamide, vincristine, doxorubicin, and dexamethasone (hyper-CVAD) for first-line treatment of adolescents/young adult (AYA; ages 16-39 years) patients with Philadelphia-negative acute lymphoblastic leukemia. Patient outcomes were simulated using a 6-state Markov model, including complete response (CR), no CR, first relapse, second CR, second relapse, and death. A Weibull distribution was fit to the progression-free survival curve of hyper-CVAD-treated AYA patients from a single-center study, and comparable patient data from a retrospective study of pediatric regimen-treated AYA patients were utilized to estimate a relative progression difference (hazard ratio = 0.51) and model survival differences. Health-state utilities were estimated based on treatment stage, with an assumption that the pediatric protocol had 0.10 disutility compared with hyper-CVAD before the maintenance phase of treatment. Total life-years and quality-adjusted life-years (QALYs) were compared between treatment protocols at 1, 5, and 10 years, with additional probabilistic sensitivity analyses. Treatment with the pediatric-inspired protocol was associated with a 0.04 increase in life-years, but a 0.01 decrease in QALYs at 1 year. By years 5 and 10, the pediatric-inspired protocol resulted in 0.18 and 0.24 increase in life-years and 0.25 and 0.32 increase in QALYs, respectively, relative to hyper-CVAD. The lower quality of life associated with the induction and intensification phases of pediatric treatment was offset by more favorable progression-free survival and overall survival relative to hyper-CVAD. Our exploratory analysis suggests that, compared with hyper-CVAD, pediatric-inspired protocols may increase life-years throughout treatment stages and QALYs in the long term.
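The survival-modelling step behind this comparison can be sketched as follows: fit a Weibull curve to one arm's progression-free survival, derive the comparator curve from the reported hazard ratio of 0.51 under proportional hazards, and integrate the curves to compare (quality-adjusted) life-years. The follow-up data, the flat utility weight, and the 10-year horizon in this Python sketch are hypothetical, and the study's full 6-state Markov model is not reproduced.

```python
# Hedged sketch: Weibull fit to one arm's PFS, comparator via HR = 0.51
# (from the abstract), and life-year/QALY comparison by integrating the curves.
# Follow-up data and the utility weight are hypothetical.
import numpy as np
from lifelines import WeibullFitter

rng = np.random.default_rng(1)
pfs_months = rng.weibull(1.2, size=200) * 30        # hypothetical hyper-CVAD PFS
observed = rng.random(200) < 0.8

wf = WeibullFitter().fit(pfs_months, event_observed=observed)
lam, rho = wf.lambda_, wf.rho_

t = np.linspace(0, 120, 1201)                       # 10 years, in months
surv_hypercvad = np.exp(-(t / lam) ** rho)
surv_pediatric = surv_hypercvad ** 0.51             # proportional hazards, HR = 0.51

def life_years(surv, utility=1.0):
    # trapezoidal area under the survival curve, converted from months to years
    area = np.sum((surv[:-1] + surv[1:]) / 2.0 * np.diff(t))
    return area * utility / 12.0

print("Life-years, hyper-CVAD: ", round(life_years(surv_hypercvad), 2))
print("Life-years, pediatric:  ", round(life_years(surv_pediatric), 2))
print("QALY gain (utility 0.8):", round(life_years(surv_pediatric, 0.8)
                                        - life_years(surv_hypercvad, 0.8), 2))
```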
Survival of Kidney Retransplant Compared With First Kidney Transplant: A Report From Southern Iran.
Roozbeh, Jamshid; Malekmakan, Leila; Monavarian, Mehri; Daneshian, Arghavan; Karimi, Zeynab
2016-11-18
Kidney retransplant is increasingly performed, but patient survival is controversial. The aim of this study was to evaluate the outcomes of patients with second kidney grafts and compare survival rates of recipients with first and second kidney transplant procedures. This was a retrospective study analyzing records from the Shiraz University of Medical Sciences transplant ward. Survival rates of retransplanted patients were compared with a randomly selected group of first kidney recipients. Factors related to retransplant survival were evaluated. Data were analyzed by SPSS version 16.0, and P < .05 was considered significant. This study included 200 patients with first kidney transplants and 68 patients with kidney retransplants. We found that 1-, 3-, 5-, and 7-year graft survival rates were 91.9%, 87.2%, 86.3%, and 86.3% among retransplanted patients versus 98.3%, 95.4%, 90.2%, and 88.7% among the first transplant group (P = .130). Hospital stay duration after transplant, kidney rejection rate during hospitalization, delayed graft function, and creatinine levels at discharge were significantly associated with survival in retransplanted patients (P < .05). Kidney retransplant can yield desirable outcomes and is the treatment of choice in patients who have lost their graft. Careful screening for risk factors should be considered to obtain better results in second kidney transplant procedures.
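A minimal sketch of this kind of graft-survival comparison, Kaplan-Meier curves for first transplants versus retransplants followed by a log-rank test, is shown below in Python with lifelines. The durations and censoring indicators are simulated, not the Shiraz data.

```python
# Minimal sketch of the graft-survival comparison: Kaplan-Meier curves for first
# transplants versus retransplants and a log-rank test. Data are hypothetical.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(2)
t_first = rng.exponential(12, 200)     # hypothetical graft survival, years
e_first = rng.random(200) < 0.25
t_retx  = rng.exponential(10, 68)
e_retx  = rng.random(68) < 0.30

km_first = KaplanMeierFitter().fit(t_first, e_first, label="first transplant")
km_retx  = KaplanMeierFitter().fit(t_retx,  e_retx,  label="retransplant")

for yr in (1, 3, 5, 7):
    print(yr, "yr:", round(float(km_first.predict(yr)), 3),
          "vs", round(float(km_retx.predict(yr)), 3))

result = logrank_test(t_first, t_retx,
                      event_observed_A=e_first, event_observed_B=e_retx)
print("log-rank P =", round(result.p_value, 3))
```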
Go, Vivian F; Frangakis, Constantine; Le Minh, Nguyen; Ha, Tran Viet; Latkin, Carl A; Sripaipan, Teerada; Zelaya, Carla E; Davis, Wendy W; Celentano, David D; Quan, Vu Minh
2017-02-01
In Vietnam, where 58% of prevalent HIV cases are attributed to people who inject drugs, we evaluated whether a multi-level intervention could improve care outcomes and increase survival. We enrolled 455 HIV-infected males who inject drugs from 32 communes in Thai Nguyen Province. Communes were randomized to a community stigma reduction intervention or standard of care and then, within each commune, to an individual enhanced counseling intervention or standard of care, resulting in 4 arms: Arm 1 (standard of care); Arm 2 (community intervention alone); Arm 3 (individual intervention alone); and Arm 4 (community + individual interventions). Follow-up was conducted at 6, 12, 18, and 24 months to assess survival. Overall mortality was 23% (n = 103/455) over 2 years. There were no losses to follow-up for the mortality endpoint. Survival at 24 months differed across arms: Arm 4 (87%) vs Arm 1 (82%) vs Arm 2 (68%) vs Arm 3 (73%); log-rank test for comparison among arms: P = 0.001. Among those with CD4 cell count <200 cells/mm³ and not on antiretroviral therapy at baseline (n = 162), survival at 24 months was higher in Arm 4 (84%) compared with other arms (Arm 1: 61%; Arm 2: 50%; Arm 3: 53%; P-value = 0.002). Overall, Arm 4 (community + individual interventions) had increased uptake of antiretroviral therapy compared with Arms 1, 2, and 3. This multi-level behavioral intervention seemed to increase survival of HIV-infected participants over a 2-year period. Relative to the standard of care, the greatest intervention effect was among those with lower CD4 cell counts.
Organista-Nava, Jorge; Gómez-Gómez, Yazmín; Illades-Aguiar, Berenice; Rivera-Ramírez, Ana Bertha; Saavedra-Herrera, Mónica Virginia; Leyva-Vázquez, Marco Antonio
2018-06-01
Dihydrofolate reductase (DHFR) has an important function in DNA synthesis and is a target of methotrexate, which is a crucial treatment option for acute lymphoblastic leukemia (ALL). However, the number of studies conducted to date on DHFR expression in childhood ALL is limited. The aim of the present study was to determine whether the expression of DHFR is associated with survival in childhood ALL. The expression of DHFR in 96 children with ALL and 100 control individuals was determined using reverse transcription-quantitative polymerase chain reaction. The results of the present study demonstrated that the expression of DHFR mRNA in children with ALL was significantly increased (P<0.001), compared with that in the control group. In addition, increased levels of DHFR mRNA were observed in patients with B-cell lineage, compared with patients with T-cell lineage ALL (P<0.05). The Kaplan-Meier estimator analysis revealed that children with ALL who exhibited increased levels of DHFR mRNA had a decreased overall survival time (P<0.05). It was observed that certain patient prognostic features (including age, sex, white blood cell count and high DHFR expression), are associated with poor survival (log-rank test, P<0.05). Therefore, the results of the present study indicated that DHFR upregulation is a factor for poor survival in ALL.
Temperature mediated moose survival in Northeastern Minnesota
Lenarz, M.S.; Nelson, M.E.; Schrage, M.W.; Edwards, A.J.
2009-01-01
The earth is in the midst of a pronounced warming trend, and temperatures in Minnesota, USA, as elsewhere, are projected to increase. Northern Minnesota represents the southern edge of the circumpolar distribution of moose (Alces alces), a species intolerant of heat. Moose increase their metabolic rate to regulate their core body temperature as temperatures rise. We hypothesized that moose survival rates would be a function of the frequency and magnitude with which ambient temperatures exceeded the upper critical temperature of moose. We compared annual and seasonal moose survival in northeastern Minnesota between 2002 and 2008 with a temperature metric. We found that models based on January temperatures above the critical threshold were inversely correlated with subsequent survival and explained >78% of the variability in spring, fall, and annual survival. Models based on late-spring temperatures also explained a high proportion of the variability in survival during the subsequent fall. A model based on warm-season temperatures was important in explaining survival during the subsequent winter. Our analyses suggest that temperatures may have a cumulative influence on survival. We expect that continuation or acceleration of current climate trends will result in decreased survival, a decrease in moose density, and ultimately, a retreat of moose northward from their current distribution.
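The temperature-survival models can be illustrated with a simple least-squares regression of annual survival on a January temperature metric (for example, degree-hours above the upper critical temperature). The numbers in this Python sketch are hypothetical and serve only to show the calculation that yields a slope and an R².

```python
# Sketch of the temperature-survival relationship: ordinary least squares of an
# annual survival estimate on a January temperature metric. Values are hypothetical;
# the paper reports that such models explained >78% of the variability in survival.
import numpy as np
import statsmodels.api as sm

# Hypothetical January temperature metric and subsequent annual survival, 2002-2008
temp_metric = np.array([120., 95., 140., 160., 110., 180., 150.])
survival    = np.array([0.86, 0.90, 0.83, 0.79, 0.88, 0.74, 0.81])

X = sm.add_constant(temp_metric)
model = sm.OLS(survival, X).fit()

print("slope:", round(model.params[1], 4))   # expected to be negative
print("R^2:  ", round(model.rsquared, 3))
```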
Boerner, T; Graichen, A; Jeiter, T; Zemann, F; Renner, P; März, L; Soeder, Y; Schlitt, H J; Piso, P; Dahlke, M H
2016-11-01
Peritoneal carcinomatosis (PC) is a dismal feature of gastric cancer that most often is treated by systemic palliative chemotherapy. In this retrospective matched pairs-analysis, we sought to establish whether specific patient subgroups alternatively should be offered a multimodal therapy concept, including cytoreductive surgery (CRS) and intraoperative hyperthermic chemotherapy (HIPEC). Clinical outcomes of 38 consecutive patients treated with gastrectomy, CRS and HIPEC for advanced gastric cancer with PC were compared to patients treated by palliative management (with and without gastrectomy) and to patients with advanced gastric cancer with no evidence of PC. Kaplan-Meier survival curves and multivariate Cox regression models were applied. Median survival time after gastrectomy was similar between patients receiving CRS-HIPEC and matched control patients operated for advanced gastric cancer without PC [18.1 months, confidence interval (CI) 10.1-26.0 vs. 21.8 months, CI 8.0-35.5 months], resulting in comparable 5-year survival (11.9 vs. 12.1 %). The median survival time after first diagnosis of PC for gastric cancer was 17.2 months (CI 10.1-24.2 months) in the CRS-HIPEC group compared with 11.0 months (CI 7.4-14.6 months) for those treated by gastrectomy and chemotherapy alone, resulting in a twofold increase of 2-year survival (35.8 vs. 16.9 %). We provide retrospective evidence that multimodal treatment with gastrectomy, CRS, and HIPEC is associated with improved survival for patients with PC of advanced gastric cancer compared with gastrectomy and palliative chemotherapy alone. We also show that patients treated with CRS-HIPEC have comparable survival to matched control patients without PC. However, regardless of treatment scheme, all patients subsequently recur and die of disease.
Freudenberg, Robert; Wendisch, Maria; Runge, Roswitha; Wunderlich, Gerd; Kotzerke, Jörg
2012-12-01
Cellular radionuclide uptake increases the heterogeneity of absorbed dose to biological structures. Dose increase depends on uptake yield and emission characteristics of radioisotopes. We used an in vitro model to compare the impact of cellular uptake of (188)Re-perrhenate and (99m)Tc-pertechnetate on cellular survival. Rat thyroid PC Cl3 cells in culture were incubated with (188)Re or (99m)Tc in the presence or absence of perchlorate for 1 hour. Clonogenic cell survival was measured by colony formation. In addition, intracellular radionuclide uptake was quantified. Dose effect curves were established for (188)Re and (99m)Tc for various extra- and intracellular distributions of the radioactivity. In the presence of perchlorate, no uptake of radionuclides was detected and (188)Re reduced cell survival more efficiently than (99m)Tc. A(37), the activity that is necessary to yield 37% cell survival was 14 MBq/ml for (188)Re and 480 MBq/ml for (99m)Tc. In the absence of perchlorate, both radionuclides showed similar uptakes; however, A(37) was reduced by 30% for the beta-emitter and by 95% for (99m)Tc. The dose D(37) that yields 37% cell survival was between 2.3 and 2.8 Gy for both radionuclides. Uptake of (188)Re and (99m)Tc decreased cell survival. Intracellular (99m)Tc yielded a dose increase that was higher compared to (188)Re due to emitted Auger and internal conversion-electrons. Up to 5 Gy there was no difference in radiotoxicity of (188)Re and (99m)Tc. At doses higher than 5 Gy intracellular (99m)Tc became less radiotoxic than (188)Re, probably due to a non-uniform lognormal radionuclide uptake.
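The D(37) reported here can be read off a fitted dose-effect curve; for a mono-exponential model S(D) = exp(-D/D0), D37 equals D0. The sketch below fits such a curve to hypothetical colony-survival data in Python with scipy; it is illustrative only and does not reproduce the study's measurements.

```python
# Sketch of how D37 (the dose giving 37% clonogenic survival) can be read off a
# mono-exponential dose-effect curve S(D) = exp(-D / D0), for which D37 = D0.
# The colony-survival data below are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

dose_gy  = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
survival = np.array([1.0, 0.68, 0.45, 0.31, 0.20, 0.14])   # surviving fraction

def mono_exponential(d, d0):
    return np.exp(-d / d0)

(d0_fit,), _ = curve_fit(mono_exponential, dose_gy, survival, p0=[2.5])
print(f"D37 ≈ {d0_fit:.2f} Gy")   # S(D37) = 1/e ≈ 37%
```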
Hazel, A R; Heins, B J; Hansen, L B
2017-11-01
Montbéliarde (MO) × Holstein (HO) and Viking Red (VR) × HO crossbred cows were compared with pure HO cows in 8 large, high-performance dairy herds in Minnesota. All cows calved for the first time from December 2010 to April 2014. Fertility and survival traits were calculated from records of insemination, pregnancy diagnosis, calving, and disposal that were recorded via management software. Body condition score and conformation were subjectively scored once during early lactation by trained evaluators. The analysis of survival to 60 d in milk included 536 MO × HO, 560 VR × HO, and 1,033 HO cows during first lactation. Cows analyzed for other fertility, survival, and conformation traits had up to 13% fewer cows available for analysis. The first service conception rate of the crossbred cows (both types combined) increased 7%, as did the conception rate across the first 5 inseminations, compared with the HO cows during first lactation. Furthermore, the combined crossbred cows (2.11 ± 0.05) had fewer times bred than HO cows (2.30 ± 0.05) and 10 fewer d open compared with their HO herdmates. Across the 8 herds, breed groups did not differ for survival to 60 d in milk; however, the superior fertility of the crossbred cows allowed an increased proportion of the combined crossbreds (71 ± 1.5%) to calve a second time within 14 mo compared with the HO cows (63 ± 1.5%). For survival to second calving, the combined crossbred cows had 4% superior survival compared with the HO cows. The MO × HO and VR × HO crossbred cows both had increased body condition score (+0.50 ± 0.02 and +0.25 ± 0.02, respectively) but shorter stature and less body depth than HO cows. The MO × HO cows had less set to the hock and a steeper foot angle than the HO cows, and the VR × HO cows had more set to the hock with a similar foot angle to the HO cows. The combined crossbred cows had less udder clearance from the hock than HO cows, more width between both front and rear teats, and longer teat length than the HO cows; however, the frequency of first-lactation cows culled for udder conformation was uniformly low (<1%) across the breed groups. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Mahmoud, K. Gh. M; Scholkamy, T. H; Darwish, S. F
2015-01-01
Cryopreservation and sexing of embryos are integrated into commercial embryo transfer technologies. To improve the effectiveness of vitrification of in vitro produced buffalo embryos, two experiments were conducted. The first evaluated the effect of exposure time (2 and 3 min) and developmental stage (morulae and blastocysts) on the viability and development of vitrified buffalo embryos. Morphologically normal embryo and survival (re-expansion) rates significantly increased when vitrified morulae were exposed for 2 min compared to 3 min (P<0.001). On the other hand, morphologically normal and survival rates of blastocysts significantly increased when exposed for 3 min compared to 2 min (P<0.001). However, there were no significant differences between the two developmental stages (morulae and blastocysts) in the percentages of morphologically normal embryos and re-expansion rates after a 24 h culture. The second experiment aimed to evaluate the effect of viability on the sex ratio of buffalo embryos after vitrification and whether male and female embryos survived vitrification differently. A total of 61 blastocysts were vitrified for 3 min with the same cryoprotectant as in experiment 1. Higher percentages of males were recorded for live as compared to dead embryos; however, this difference was not significant. In conclusion, the post-thaw survival and development of in vitro produced morulae and blastocysts were found to be affected by exposure time rather than developmental stage. Survivability had no significant effect on the sex ratio of vitrified blastocysts; nevertheless, the number of surviving male embryos was higher than that of dead male embryos. PMID:27175197
Redfield, Robert R; Scalea, Joseph R; Zens, Tiffany J; Mandelbrot, Didier A; Leverson, Glen; Kaufman, Dixon B; Djamali, Arjang
2016-10-01
We sought to determine whether the mode of sensitization in highly sensitized patients contributed to kidney allograft survival. An analysis of the United Network for Organ Sharing dataset involving all kidney transplants between 1997 and 2014 was undertaken. Highly sensitized adult kidney transplant recipients [panel reactive antibody (PRA) ≥98%] were compared with adult, primary non-sensitized and re-transplant recipients. Kaplan-Meier survival analyses were used to determine allograft survival rates. Cox proportional hazards regression analyses were conducted to determine the association of graft loss with key predictors. Fifty-three percent of highly sensitized patients transplanted were re-transplants. Pregnancy and transfusion were the only sensitizing event in 20 and 5%, respectively. The 10-year actuarial graft survival for highly sensitized recipients was 43.9% compared with 52.4% for non-sensitized patients, P < 0.001. The combination of being highly sensitized by either pregnancy or blood transfusion increased the risk of graft loss by 23% [hazard ratio (HR) 1.230, confidence interval (CI) 1.150-1.315, P < 0.001], and the combination of being highly sensitized from a prior transplant increased the risk of graft loss by 58.1% (HR 1.581, CI 1.473-1.698, P < 0.001). The mode of sensitization predicts graft survival in highly sensitized kidney transplant recipients (PRA ≥98%). Patients who are highly sensitized from re-transplants have inferior graft survival compared with patients who are highly sensitized from other modes of sensitization. © The Author 2016. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
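A hedged sketch of the kind of Cox model used here, with indicator variables for the mode of sensitization and non-sensitized recipients as the reference category, is given below in Python with lifelines. The data frame is simulated for illustration, not the UNOS registry.

```python
# Sketch of a Cox model relating graft loss to mode of sensitization, with
# non-sensitized recipients as the reference group. The dataset is hypothetical.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 1000
mode = rng.choice(["none", "pregnancy_or_transfusion", "prior_transplant"],
                  size=n, p=[0.6, 0.15, 0.25])
df = pd.DataFrame({
    "time_years": rng.exponential(8, n),
    "graft_loss": (rng.random(n) < 0.5).astype(int),
    # indicator columns; "none" is the implicit reference category
    "preg_or_txf": (mode == "pregnancy_or_transfusion").astype(int),
    "prior_tx":    (mode == "prior_transplant").astype(int),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_years", event_col="graft_loss")
print(cph.hazard_ratios_)   # HRs vs. the non-sensitized reference
```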
Baili, Paolo; Di Salvo, Francesca; Marcos-Gragera, Rafael; Siesling, Sabine; Mallone, Sandra; Santaquilani, Mariano; Micheli, Andrea; Lillini, Roberto; Francisci, Silvia
2015-10-01
Overall survival after cancer is frequently used when assessing a health care service's performance as a whole. It is mainly used by the public, politicians and the media, and is often dismissed by clinicians because of the heterogeneous mix of different cancers, risk factors and treatment modalities. Here we give survival details for all cancers combined in Europe, correlating it with economic variables to suggest reasons for differences. We computed age and cancer site case-mix standardised relative survival for all cancers combined (ACRS) for 29 countries participating in the EUROCARE-5 project with data on more than 7.5 million cancer cases from 87 population-based cancer registries, using complete and period approach. Denmark, United Kingdom (UK) and Eastern European countries had lower survival than neighbouring countries. Five-year ACRS has been increasing throughout Europe, and substantial increases, between 1999-2001 and 2005-2007, have been achieved in countries where survival was lower in the past. Five-year ACRS for men and women are positively correlated with macro-economic variables like the Gross Domestic Product (GDP) and Total National Expenditure on Health (TNEH) (R² about 70%). Countries with recent larger increases in GDP and TNEH had greater increases in cancer survival. ACRS serves to compare all cancer survival in Europe taking account of the geographical variability in case-mixes. The EUROCARE-5 data on ACRS confirm previous EUROCARE findings. Survival appears to correlate with macro-economic determinants, particularly with investments in the health care system. Copyright © 2015 Elsevier Ltd. All rights reserved.
Martino, Suella; Jamme, Mathieu; Deligny, Christophe; Busson, Marc; Loiseau, Pascale; Azoulay, Elie; Galicier, Lionel; Pène, Frédéric; Provôt, François; Dossier, Antoine; Saheb, Samir; Veyradier, Agnès; Coppo, Paul
2016-01-01
Black people are at increased risk of thrombotic thrombocytopenic purpura (TTP). Whether clinical presentation of TTP in Black patients has specific features is unknown. We assessed here differences in TTP presentation and outcome between Black and White patients. Clinical presentation was comparable between both ethnic groups. However, prognosis differed with a lower death rate in Black patients than in White patients (2.7% versus 11.6%, respectively, P = .04). Ethnicity, increasing age and neurologic involvement were retained as risk factors for death in a multivariable model (P < .05 all). Sixty-day overall survival estimated by the Kaplan-Meier curves and compared with the Log-Rank test confirmed that Black patients had a better survival than White patients (P = .03). Salvage therapies were similarly performed between both groups, suggesting that disease severity was comparable. The comparison of HLA-DRB1*11, -DRB1*04 and -DQB1*03 allele frequencies between Black patients and healthy Black individuals revealed no significant difference. However, the protective allele against TTP, HLA-DRB1*04, was dramatically decreased in Black individuals in comparison with White individuals. Black people with TTP may have a better survival than White patients despite a comparable disease severity. A low natural frequency of HLA-DRB1*04 in Black ethnicity may account for the greater risk of TTP in this population.
Clark, Andrew L; Knosalla, Christoph; Birks, Emma; Loebe, Matthias; Davos, Constantinos H; Tsang, Sui; Negassa, Abdissa; Yacoub, Magdi; Hetzer, Roland; Coats, Andrew J S; Anker, Stefan D
2007-08-01
Heart transplantation is an important treatment for end-stage chronic heart failure. We studied the effect of body mass index (BMI), and the effect of subsequent weight change, on survival following transplantation in 1902 consecutive patients. Patients were recruited from: London (n=553), Berlin (N=971) and Boston (N=378). Patients suitable for transplantation due to symptoms, low left ventricular ejection fraction (
Mantziari, Styliani; Allemann, Pierre; Winiker, Michael; Sempoux, Christine; Demartines, Nicolas; Schäfer, Markus
2017-09-01
Lymph node (LN) involvement by esophageal cancer is associated with compromised long-term prognosis. This study assessed whether LN downstaging by neoadjuvant treatment (NAT) might offer a survival benefit compared to patients with a priori negative LN. Patients undergoing esophagectomy for cancer between 2005 and 2014 were screened for inclusion. Group 1 included cN0 patients confirmed as pN0 who were treated with surgery first, whereas group 2 included patients initially cN+ and down-staged to ypN0 after NAT. Survival analysis was performed with the Kaplan-Meier and Cox regression methods. Fifty-seven patients were included in our study, 24 in group 1 and 33 in group 2. Group 2 patients had more locally advanced lesions compared to a priori negative patients, and despite complete LN sterilization by NAT they still had worse long-term survival. Overall 3-year survival was 86.8% for a priori LN negative versus 63.3% for downstaged patients (P = 0.013), while disease-free survival was 79.6% and 57.9%, respectively (P = 0.021). Tumor recurrence was also earlier and more disseminated for the down-staged group. Downstaged LN, despite the systemic effect of NAT, still inherit an increased risk for early tumor recurrence and worse long-term survival compared to a priori negative LN. © 2017 Wiley Periodicals, Inc.
Rogers, Nina Trivedy; Demakakos, Panayotes; Taylor, Mark Steven; Steptoe, Andrew; Hamer, Mark; Shankar, Aparna
2016-06-01
Volunteering has been linked to reduced mortality in older adults, but the mechanisms explaining this effect remain unclear. This study investigated whether volunteering is associated with increased survival in participants of the English Longitudinal Study of Ageing and whether differences in survival are modified by functional disabilities. A multivariate Cox Proportional Hazards model was used to estimate the association of volunteering with survival over a period of 10.9 years in 10 324 participants, while controlling for selected confounders. To investigate effect modification by disability, the analyses were repeated in participants with and without self-reported functional disabilities. Volunteering was associated with a reduced probability of death from all causes in univariate analyses (HR=0.65, CI 0.58 to 0.73, p<0.0001), but adjustment for covariates rendered this association non-significant (HR=0.90, CI 0.79 to 1.01, p=0.07). Able-bodied volunteers had significantly increased survival compared with able-bodied non-volunteers (HR=0.81, 95% CI 0.69 to 0.95, p=0.009). There was no significant survival advantage among disabled volunteers, compared with disabled non-volunteers (HR=1.06, CI 0.88 to 1.29, p=0.53). Volunteering is associated with reduced mortality in older adults in England, but this effect appears to be limited to volunteers who report no disabilities. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
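The subgroup approach described here, refitting the volunteering model separately in participants with and without functional disabilities, can be sketched as follows in Python with lifelines. The cohort is simulated, and a single age column stands in for the full set of confounders adjusted for in the study.

```python
# Sketch of the subgroup analysis: the Cox model for volunteering is refit
# separately in participants with and without disabilities, yielding one
# hazard ratio per subgroup. Data are hypothetical.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 10324
df = pd.DataFrame({
    "years":     rng.exponential(15, n),
    "died":      (rng.random(n) < 0.25).astype(int),
    "volunteer": rng.integers(0, 2, n),
    "disabled":  rng.integers(0, 2, n),
    "age":       rng.normal(65, 9, n),   # stand-in confounder
})

for label, subset in df.groupby("disabled"):
    cph = CoxPHFitter().fit(subset[["years", "died", "volunteer", "age"]],
                            duration_col="years", event_col="died")
    hr = cph.hazard_ratios_["volunteer"]
    print(f"disabled={label}: HR for volunteering = {hr:.2f}")
```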
Magaji, Bello Arkilla; Moy, Foong Ming; Roslani, April Camilla; Law, Chee Wei
2017-05-18
Colorectal cancer is the third most commonly diagnosed malignancy and the fourth leading cause of cancer-related death globally. It is the second most common cancer among both males and females in Malaysia. The economic burden of colorectal cancer is likely to increase over time owing to its current trend and aging population. Cancer survival analysis is an essential indicator for early detection and improvement in cancer treatment. However, there was a scarcity of studies concerning survival of colorectal cancer patients as well as its predictors. Therefore, we aimed to determine the 1-, 3- and 5-year survival rates, compare survival rates among ethnic groups and determine the predictors of survival among colorectal cancer patients. This was an ambidirectional cohort study conducted at the University Malaya Medical Centre (UMMC) in Kuala Lumpur, Malaysia. All Malaysian citizens or permanent residents with histologically confirmed diagnosis of colorectal cancer seen at UMMC from 1 January 2001 to 31 December 2010 were included in the study. Demographic and clinical characteristics were extracted from the medical records. Patients were followed-up until death or censored at the end of the study (31st December 2010). Censored patients' vital status (whether alive or dead) were cross checked with the National Registration Department. Survival analyses at 1-, 3- and 5-year intervals were performed using the Kaplan-Meier method. Log-rank test was used to compare the survival rates, while Cox proportional hazard regression analysis was carried out to determine the predictors of 5-year colorectal cancer survival. Among 1212 patients, the median survival for colorectal, colon and rectal cancers were 42.0, 42.0 and 41.0 months respectively; while the 1-, 3-, and 5-year relative survival rates ranged from 73.8 to 76.0%, 52.1 to 53.7% and 40.4 to 45.4% respectively. The Chinese patients had the lowest 5-year survival compared to Malay and Indian patients. Based on the 814 patients with data on their Duke's staging, independent predictors of poor colorectal cancer (5-year) survival were male sex (Hazard Ratio [HR]: 1.41; 95% CI: 1.12, 1.76), Chinese ethnicity (HR: 1.41; 95% CI: 1.07,1.85), elevated (≥ 5.1 ng/ml) pre-operative carcino-embryonic antigen (CEA) level (HR: 2.13; 95% CI: 1.60, 2.83), Duke's stage C (HR: 1.68; 95% CI: 1.28, 2.21), Duke's stage D (HR: 4.61; 95% CI: 3.39, 6.28) and emergency surgery (HR: 1.52; 95% CI: 1.07, 2.15). The survival rates of colorectal cancer among our patients were comparable with those of some Asian countries but lower than those found in more developed countries. Males and patients from the Chinese ethnic group had lower survival rates compared to their counterparts. More advanced staging and late presentation were important predictors of colorectal cancer survival. Health education programs targeting high risk groups and emphasizing the importance of screening and early diagnosis, as well as the recognition of symptoms and risk factors should be implemented. A nationwide colorectal cancer screening program should be designed and implemented to increase early detection and improve survival outcomes.
Epelboym, Irene; Zenati, Mazen S; Hamad, Ahmad; Steve, Jennifer; Lee, Kenneth K; Bahary, Nathan; Hogg, Melissa E; Zeh, Herbert J; Zureikat, Amer H
2017-09-01
Receipt of 6 cycles of adjuvant chemotherapy (AC) is standard of care in pancreatic cancer (PC). Neoadjuvant chemotherapy (NAC) is increasingly utilized; however, the optimal number of cycles needed alone or in combination with AC remains unknown. We sought to determine the optimal number and sequence of perioperative chemotherapy cycles in PC. We performed a single-institution review of all resected PCs from 2008 to 2015. The impact of the cumulative number of chemotherapy cycles received (0, 1-5, and ≥6 cycles) and their sequence (NAC, AC, or NAC + AC) on overall survival was evaluated with Cox proportional hazards modeling, using 6 cycles of AC as the reference. A total of 522 patients were analyzed. Based on sample size distribution, four combinations were evaluated: 0 cycles = 12.1%, 1-5 cycles of combined NAC + AC = 29%, 6 cycles of AC = 25%, and ≥6 cycles of combined NAC + AC = 34%, with corresponding survival of 13.1, 18.5, 37, and 36.8 months. On multivariable analysis (P < 0.0001), tumor stage [hazard ratio (HR) 1.35], LNR (HR 4.3), and R1 margins (HR 1.77) were associated with an increased hazard of death. Compared with 6 cycles of AC, receipt of 0 cycles [HR 3.57, confidence interval (CI) 2.47-5.18] or 1-5 cycles in any combination (HR 2.37, CI 1.73-3.23) was associated with an increased hazard of death, whereas receipt of ≥6 cycles in any sequence was associated with optimal and comparable survival (HR 1.07, CI 0.78-1.47). Receipt of 6 or more perioperative cycles of chemotherapy, either as combined neoadjuvant and adjuvant therapy or as adjuvant therapy alone, may be associated with optimal and comparable survival in resected PC.
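The grouping of perioperative chemotherapy cycles, with 6 cycles of adjuvant therapy as the reference level, can be encoded as indicator variables in a Cox model, as in the hypothetical Python sketch below (lifelines); it mirrors the structure of the comparison, not the actual cohort.

```python
# Sketch of the cycle-count comparison: indicator variables for 0 cycles,
# 1-5 cycles, and >=6 combined cycles, with 6 cycles of adjuvant chemotherapy
# as the omitted reference category in a Cox model. Data are hypothetical.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 522
group = rng.choice(["0_cycles", "1_5_cycles", "6_AC", "6plus_combined"],
                   size=n, p=[0.12, 0.29, 0.25, 0.34])
df = pd.DataFrame({
    "os_months": rng.exponential(24, n),
    "death":     (rng.random(n) < 0.7).astype(int),
    # "6_AC" is the omitted reference level
    "cycles_0":     (group == "0_cycles").astype(int),
    "cycles_1_5":   (group == "1_5_cycles").astype(int),
    "cycles_6plus": (group == "6plus_combined").astype(int),
})

cph = CoxPHFitter().fit(df, duration_col="os_months", event_col="death")
print(cph.hazard_ratios_)   # interpreted relative to 6 cycles of adjuvant therapy
```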
Seneviratne, Sanjeewa; Campbell, Ian; Scott, Nina; Shirley, Rachel; Lawrenson, Ross
2015-01-31
Indigenous Māori women experience a 60% higher breast cancer mortality rate compared with European women in New Zealand. We explored the impact of differences in rates of screen detected breast cancer on inequities in cancer stage at diagnosis and survival between Māori and NZ European women. All primary breast cancers diagnosed in screening age women (as defined by the New Zealand National Breast Cancer Screening Programme) during 1999-2012 in the Waikato area (n = 1846) were identified from the Waikato Breast Cancer Register and the National Screening Database. Stage at diagnosis and survival were compared for screen detected (n = 1106) and non-screen detected (n = 740) breast cancer by ethnicity and socioeconomic status. Indigenous Māori women were significantly more likely to be diagnosed with more advanced cancer compared with NZ European women (OR = 1.51), and approximately a half of this difference was explained by lower rate of screen detected cancer for Māori women. For non-screen detected cancer, Māori had significantly lower 10-year breast cancer survival compared with NZ European (46.5% vs. 73.2%) as did most deprived compared with most affluent socioeconomic quintiles (64.8% vs. 81.1%). No significant survival differences were observed for screen detected cancer by ethnicity or socioeconomic deprivation. The lower rate of screen detected breast cancer appears to be a key contributor towards the higher rate of advanced cancer at diagnosis and lower breast cancer survival for Māori compared with NZ European women. Among women with screen-detected breast cancer, Māori women do just as well as NZ European women, demonstrating the success of breast screening for Māori women who are able to access screening. Increasing breast cancer screening rates has the potential to improve survival for Māori women and reduce breast cancer survival inequity between Māori and NZ European women.
Kennedy, Christopher; Redden, David; Gray, Stephen; Eckhoff, Devin; Massoud, Omar; McGuire, Brendan; Alkurdi, Basem; Bloomer, Joseph; DuBay, Derek A
2012-09-01
Orthotopic liver transplantation (LT) in non-alcoholic steatohepatitis (NASH) is increasing in parallel with the obesity epidemic. This study retrospectively reviewed the clinical outcomes of LTs in NASH (n = 129) and non-NASH (n = 775) aetiologies carried out at a single centre between 1999 and 2009. Rates of 1-, 3- and 5-year overall survival in NASH (90%, 88% and 85%, respectively) were comparable with those in non-NASH (92%, 86% and 80%, respectively) patients. Mortality within 4 months of LT was twice as high in NASH as in non-NASH patients (8.5% vs. 4.2%; P = 0.04). Compared with non-NASH patients, post-LT mortality in NASH patients was more commonly caused by infectious (38% vs. 26%; P < 0.05) or cardiac (19% vs. 7%; P < 0.05) aetiologies. Five-year survival was lower in NASH patients with a high-risk phenotype (age >60 years, body mass index >30 kg/m(2), with hypertension and diabetes) than in NASH patients without these characteristics (72% vs. 87%; P = 0.02). Subgroup analyses revealed that 5-year overall survival in NASH was equivalent to that in Laennec's cirrhosis (85% vs. 80%; P = 0.87), but lower than that in cirrhosis of cryptogenic aetiology (85% vs. 96%; P = 0.04). Orthotopic LT in NASH was associated with increased early postoperative mortality, but 1-, 3- and 5-year overall survival rates were equivalent to those in non-NASH patients. © 2012 International Hepato-Pancreato-Biliary Association.
Dumitrascu, T; Dima, S; Brasoveanu, V; Stroescu, C; Herlea, V; Moldovan, S; Ionescu, M; Popescu, I
2014-12-01
The impact of venous resection (VR) in pancreaticoduodenectomy (PD) for pancreatic adenocarcinoma (PDAC) is controversial. The aim of the study was to comparatively assess the postoperative outcomes after PD with and without VR for PDAC and to identify predictors of morbidity and survival in the subgroup of PD with VR. The data of 51 PD with VR were compared with those of 183 PD without VR. Binary logistic regression and Cox survival analyses were performed. Both the operative time and the estimated blood loss were significantly higher in the VR group (P<0.001). A trend towards increased 90-day mortality (9.8% vs. 5.5%) and severe morbidity (20% vs. 13%) was observed when a VR was performed (P ≥0.264). The median overall survival time after PD with and without VR was 13 months and 17 months, respectively (P=0.845). The absence of histological tumor invasion of the VR was the only independent predictor of better survival (HR=0.359; 95% CI 0.161-0.803; P=0.013). A PD with VR can be safely incorporated into a pancreatic surgeon's armamentarium. However, a trend towards increased mortality and severe morbidity rates should be expected, along with longer operative time and greater blood loss, compared with PD without VR. Associated VR does not appear to significantly impair the prognosis after PD for PDAC; however, histological tumor invasion of the VR has a negative impact on survival.
Long-term survival following in-hospital cardiac arrest: A matched cohort study
Feingold, Paul; Mina, Michael J.; Burke, Rachel M.; Hashimoto, Barry; Gregg, Sara; Martin, Greg S.; Leeper, Kenneth; Buchman, Timothy
2016-01-01
Background Each year, 200,000 patients undergo an in-hospital cardiac arrest (IHCA), with approximately 15–20% surviving to discharge. Little is known, however, about the long-term prognosis of these patients after discharge. Previous efforts to describe out-of-hospital survival of IHCA patients have been limited by small sample sizes and narrow patient populations. Methods A single-institution matched cohort study was undertaken to describe mortality following IHCA. Patients surviving to discharge following an IHCA between 2008 and 2010 were matched on age, sex, race and hospital admission criteria with non-IHCA hospital controls, with follow-up of 9 to 45 months. Kaplan–Meier curves and Cox PH models assessed differences in survival. Results Of the 1262 IHCAs, 20% survived to hospital discharge. Of those discharged, survival at 1 year post-discharge was 59% for IHCA patients and 82% for controls (p < 0.0001). Hazard ratios (IHCA vs. controls) for mortality were greatest within the 90 days following discharge (HR = 2.90, p < 0.0001) and decreased linearly thereafter, with those surviving to one year post-discharge having an HR for mortality below 1.0. Survival after discharge varied amongst IHCA survivors. When grouped by discharge destination, out-of-hospital survival varied; in fact, IHCA patients discharged home without services demonstrated no survival difference compared to their non-IHCA controls (HR 1.10, p = 0.72). IHCA patients discharged to long-term hospital care or hospice, however, had a significantly higher mortality compared to matched controls (HR 3.91 and 20.3, respectively; p < 0.0001). Conclusion Among IHCA patients who survive to hospital discharge, the highest risk of death is within the first 90 days after discharge. Additionally, IHCA survivors overall have increased long-term mortality vs. controls. Survival rates varied widely across discharge destinations, and those discharged to home, skilled nursing facilities or rehabilitation services had survival rates no different from controls. Thus, the increased mortality was primarily driven by patients discharged to long-term care or hospice. PMID:26703463
Bae, Soo Youn; Jung, Seung Pil; Jung, Eun Sung; Park, Sung Min; Lee, Se Kyung; Yu, Jong Han; Lee, Jeong Eon; Kim, Seok Won; Nam, Seok Jin
2018-06-18
Pregnancy-associated breast cancer (PABC) is rare and is generally defined as breast cancer diagnosed during pregnancy or within 1 year of delivery. The average ages of marriage and childbearing are increasing, and PABC is expected to also increase. This study is intended to increase understanding of the characteristics of PABC. A database of 2,810 patients with breast cancer diagnosed when they were less than 40 years of age was reviewed. The clinicopathological factors and survival of PABC (40 patients) were compared to those of patients with young breast cancer (YBC, non-pregnant or over 12 months after delivery; 2,770 patients). PABC had significantly lower estrogen receptor (ER) and progesterone receptor (PR) expression (ER-positive 50.0%, PR-positive 45.0%) and higher HER2 overexpression (38.5%) than YBC. The most common subtype of PABC was triple-negative breast cancer (TNBC; 35.9%), and luminal A subtype represented only 7.7% of cases. In univariate analysis, PABC had significantly worse disease-free survival (DFS) and breast cancer-specific survival (BCSS) compared to YBC. In multivariate analysis, PABC was associated with worse BCSS (HR 4.0, 95% CI 1.2-12.9, p = 0.019) and survival, but there was no difference in DFS between PABC and YBC. In subgroup analysis by subtype, luminal B subtype of PABC showed worse DFS (HR 3.5; 95% CI 1.1-11.2, p = 0.039) and BCSS (HR 10.2, 95% CI 1.2-87.1, p = 0.035), especially with high Ki67. However, no differences were demonstrated in other subtypes. In this study, PABC showed lower expression of ER/PR, higher overexpression of HER2, fewer luminal A subtype, and more TNBC subtype compared to YBC. PABC had worse BCSS, especially luminal B subtype, compared to YBC. © 2018 S. Karger AG, Basel.
Dokko, H.; Min, P.S.; Cherrick, H.M.
1991-04-01
Low doses of ultraviolet (UV) light, x-rays, photodynamic treatment, or aflatoxins increase the survival of UV-irradiated virus in cells. This effect is postulated to occur by enhancement of the error-prone cellular repair function, which could also be associated with oncogenic cell transformation. The present study was designed to investigate whether treatment of green monkey kidney cells with water extract of snuff (snuff extract), benzo(a)pyrene, nicotine, or tobacco-specific N'-nitrosamines would result in enhanced survival of UV-irradiated herpes simplex virus (HSV). Exposure of the cells to snuff extract, benzo(a)pyrene, N'-nitrosonornicotine, or 4-(N-methyl-N'-nitrosamino)-1-(3-pyridyl)-1-butanone resulted in an enhancement of survival of UV-irradiated HSV type 1 compared with the control, whereas exposure of the cells to nicotine did not. These data indicate that the water-extractable component of snuff and tobacco-related chemical carcinogens enhance the cellular repair mechanism and provide for increased survival of UV-irradiated HSV.
The significance of serum urea and renal function in patients with heart failure.
Gotsman, Israel; Zwas, Donna; Planer, David; Admon, Dan; Lotan, Chaim; Keren, Andre
2010-07-01
Renal function and urea are frequently abnormal in patients with heart failure (HF) and are predictive of increased mortality. The relative importance of each parameter is less clear. We prospectively compared the predictive value of renal function and serum urea on clinical outcome in patients with HF. Patients hospitalized with definite clinical diagnosis of HF (n = 355) were followed for short-term (1 yr) and long-term (mean, 6.5 yr) survival and HF rehospitalization. Increasing tertiles of discharge estimated glomerular filtration rate (eGFR) were an independent predictor of increased long-term survival (hazard ratio [HR], 0.65; 95% confidence interval [CI], 0.47-0.91; p = 0.01) but not short-term survival. Admission and discharge serum urea and blood urea nitrogen (BUN)/creatinine ratio were predictors of reduced short- and long-term survival on multivariate Cox regression analysis. Increasing tertiles of discharge urea were a predictor of reduced 1-year survival (HR, 2.13; 95% CI, 1.21-3.73; p = 0.009) and long-term survival (HR, 1.93; 95% CI, 1.37-2.71; p < 0.0001). Multivariate analysis including discharge eGFR and serum urea demonstrated that only serum urea remained a significant predictor of long-term survival; however, eGFR and BUN/creatinine ratio were both independently predictive of survival. Urea was more discriminative than eGFR in predicting long-term survival by area under the receiver operating characteristic curve (0.803 vs. 0.787; p = 0.01). Increasing tertiles of discharge serum urea and BUN/creatinine were independent predictors of HF rehospitalization and combined death and HF rehospitalization. This study suggests that serum urea is a more powerful predictor of survival than eGFR in patients with HF. This may be due to urea's relation to key biological parameters including renal, hemodynamic, and neurohormonal parameters pertaining to the overall clinical status of the patient with chronic HF.
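The discrimination comparison, area under the ROC curve for urea versus eGFR in predicting long-term death, can be reproduced in outline as below (Python with scikit-learn). The biomarker values are simulated, not the study cohort; the abstract reports AUCs of 0.803 and 0.787.

```python
# Sketch of the discrimination comparison: area under the ROC curve for
# predicting long-term death from discharge urea versus eGFR. Data are hypothetical.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(6)
n = 355
died = rng.integers(0, 2, n)
# Hypothetical biomarker values: higher urea / lower eGFR in those who died
urea = rng.normal(8, 2, n) + 3 * died
egfr = rng.normal(60, 15, n) - 10 * died

auc_urea = roc_auc_score(died, urea)
auc_egfr = roc_auc_score(died, -egfr)   # lower eGFR predicts death, so negate
print(f"AUC urea: {auc_urea:.3f}  AUC eGFR: {auc_egfr:.3f}")
```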
Racial disparities in advanced stage colorectal cancer survival
Wallace, Kristin; Hill, Elizabeth G.; Lewin, David N.; Williamson, Grace; Oppenheimer, Stephanie; Ford, Marvella E.; Wargovich, Michael J.; Berger, Franklin G.; Bolick, Susan W.; Thomas, Melanie B.; Alberg, Anthony J.
2013-01-01
Purpose African Americans (AA) have a higher incidence of and lower survival from colorectal cancer (CRC) compared to European Americans (EA). In the present study, statewide, population-based data from the South Carolina Central Cancer Registry (SCCCR) are used to investigate the relationship of race and age with advanced stage CRC survival. Methods The study population comprised 3865 advanced, pathologically documented colon and rectal adenocarcinoma cases diagnosed between 01 January 1996 and 31 December 2006: 2673 (69%) EA and 1192 (31%) AA. Kaplan-Meier methods were used to generate median survival time and corresponding 95% confidence intervals (CI) by race, age, and gender. Factors associated with survival were evaluated by fitting Cox proportional hazards (CPH) regression models to generate Hazard Ratios (HR) and 95% CI. Results We observed a significant interaction between race and age on CRC survival (p = 0.04). Among younger patients (< 50 years), AA race was associated with a 1.34-fold (95% CI 1.06-1.71) higher risk of death compared to EA. Among older patients, we observed a modestly increased risk of death among AA men compared to EA (HR 1.16; 95% CI 1.01-1.32) but no difference by race among women (HR 0.94; 95% CI 0.82-1.08). Moreover, we observed that the disparity in survival has worsened over the past 15 years. Conclusions Future studies that integrate clinical, molecular, and treatment-related data are needed for advancing understanding of the racial disparity in CRC survival, especially for those < 50 years old. PMID:23296454
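The race-by-age interaction reported here can be modelled with a product term in a Cox model, and the subgroup hazard ratios recovered from the coefficients, as in the hypothetical Python sketch below (lifelines). The simulated data only illustrate the structure of the analysis.

```python
# Sketch of a Cox model with an African American indicator, a younger-than-50
# indicator, and their product, so the race hazard ratio can differ by age group.
# Data are hypothetical.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 3865
aa    = (rng.random(n) < 0.31).astype(int)   # African American
young = (rng.random(n) < 0.10).astype(int)   # diagnosed before age 50
df = pd.DataFrame({
    "months": rng.exponential(20, n),
    "death":  (rng.random(n) < 0.8).astype(int),
    "aa": aa,
    "young": young,
    "aa_x_young": aa * young,
})

cph = CoxPHFitter().fit(df, duration_col="months", event_col="death")
coefs = cph.params_
# HR for AA vs EA among younger patients = exp(coef_aa + coef_interaction)
print("HR (AA vs EA, <50 yrs): ", float(np.exp(coefs["aa"] + coefs["aa_x_young"])))
print("HR (AA vs EA, >=50 yrs):", float(np.exp(coefs["aa"])))
```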
Falardeau, Justin; Walji, Khalil; Haure, Maxime; Fong, Karen; Taylor, Gregory A; Ma, Yussanne; Smukler, Sean; Wang, Siyun
2018-05-18
Soil is an important reservoir for Listeria monocytogenes, a foodborne pathogen implicated in numerous produce-related outbreaks. Our objective was to (i) compare the survival of L. monocytogenes between three soils, (ii) compare the native bacterial communities across these soils, and (iii) investigate relationships between L. monocytogenes survival, native bacterial communities, and soil properties. Listeria spp. populations were monitored on PALCAM agar in three soils inoculated with L. monocytogenes (~5 × 10^6 CFU/g): conventionally farmed (CS), grassland transitioning to conventionally farmed (TS), and uncultivated grassland (GS). Bacterial diversity of the soils was analyzed using 16S rRNA targeted amplicon sequencing. A two-log reduction of Listeria spp. was observed in all soils within 10 days, but at a significantly lower rate in GS (Fisher's LSD; p < 0.05). Survival correlated with increased moisture and a neutral pH. GS showed the highest microbial diversity. Acidobacteria was the dominant phylum differentiating CS and TS from GS, and was negatively correlated with pH, carbon, nitrogen, and moisture. High moisture content and neutral pH are likely to increase the ability of L. monocytogenes to persist in soil. This study confirmed that native bacterial communities and short-term survival of L. monocytogenes vary across soils.
Garin, Etienne; Rakotonirina, Hervé; Lejeune, Florence; Denizot, Benoit; Roux, Jerome; Noiret, Nicolas; Mesbah, Habiba; Herry, Jean-Yves; Bourguet, Patrick; Lejeune, Jean-Jacques
2006-04-01
It has been shown that the use of a cocktail of isotopes of different ranges of action leads to an increase in the effectiveness of metabolic radiotherapy. The purpose of the present study was to compare with a control group the effectiveness of three different treatments in rats bearing hepatocellular carcinoma (HCC), using (1) a mixture of lipiodol labelled with both I and Re, (2) lipiodol labelled with I alone and (3) lipiodol labelled with Re alone. Four groups were made up, each containing 14 rats with the N1-S1 tumour cell line. Group 1 received a mixture composed of 22 MBq of Re-SSS lipiodol and 7 MBq I-lipiodol. Group 2 received 14 MBq I-lipiodol. Group 3 received 44 MBq of Re-SSS lipiodol and group 4 acted as the control. The survival of the various groups was compared by a non-parametric test of log-rank, after a follow-up of 60, 180 and 273 days. Compared with the controls, the rats treated with a mixture of Re-SSS lipiodol and I-lipiodol show an increase in survival, but only from day 60 onwards (P=0.05 at day 60 and 0.13 at days 180 and 273). For the rats treated with I-lipiodol, there was a highly significant increase in survival compared with the controls at day 60, day 180 and day 273 (P=0.03, 0.04 and 0.04, respectively). There is no significant increase in survival for the rats treated with Re-SSS lipiodol, irrespective of the follow-up duration (P=0.53 at day 60, 0.48 at day 180, and 0.59 at day 273). In this study, I-lipiodol is the most effective treatment in HCC-bearing rats, because this is the only method that leads to a prolonged improvement of survival. These results cannot necessarily be extrapolated to humans because of the relatively small size and unifocal nature of the lesions in this study. It appears necessary to carry out a study in humans with larger tumours in order to compare these three treatments, particularly with a view to replacing I-labelled lipiodol by Re-labelled lipiodol. However, this study clearly demonstrated that, for small tumours, as in an adjuvant setting for example, I-labelled lipiodol should be a better option than Re-labelled lipiodol.
Serine/threonine protein phosphatase 6 modulates the radiation sensitivity of glioblastoma
Shen, Y; Wang, Y; Sheng, K; Fei, X; Guo, Q; Larner, J; Kong, X; Qiu, Y; Mi, J
2011-01-01
Increasing the sensitivity of glioblastoma cells to radiation is a promising approach to improve survival in patients with glioblastoma multiforme (GBM). This study aims to determine whether the serine/threonine protein phosphatase 6 (PP6) is a molecular target for GBM radiosensitization treatment. An orthotopic GBM xenograft mouse model was used in this study. Our data demonstrated that the protein level of the PP6 catalytic subunit (PP6c) was upregulated in the GBM tissue of about 50% of patients compared with the surrounding tissue or control tissue. Both the in vitro survival fraction of GBM cells and the patient survival time were highly correlated or inversely correlated with PP6c expression (R² = 0.755 and −0.707, respectively). We also found that siRNA knockdown of PP6c reduced DNA-dependent protein kinase (DNA-PK) activity in three different GBM cell lines, increasing their sensitivity to radiation. In the orthotopic mouse model, overexpression of PP6c in GBM U87 cells attenuated the effect of radiation treatment and reduced the survival time of mice compared with control mice, while PP6c knockdown improved the effect of radiation treatment and increased the survival time of mice. These findings demonstrate that PP6 regulates the sensitivity of GBM cells to radiation, and suggest that small molecules disrupting or inhibiting the association of PP6 with DNA-PK are potential radiosensitizers for GBM. PMID:22158480
Growth of longleaf and loblolly pine planted on South Carolina Sandhill sites.
Cram, Michelle, M.; Outcalt, Kenneth, W.; Zarnoch, Stanley, J.
2010-07-01
Performance of longleaf (Pinus palustris Mill.) and loblolly pine (P. taeda L.) was compared 15–19 years after outplanting on 10 different sites in the sandhills of South Carolina. The study was established from 1988 to 1992 with bareroot seedlings artificially inoculated with Pisolithus tinctorius (Pt) or naturally inoculated with mycorrhizae in the nursery. A containerized longleaf pine treatment with and without Pt inoculation was added to two sites in 1992. Effects of the Pt nursery treatment were mixed, with a decrease in survival of bareroot longleaf pine on two sites and an increase in survival on another site. The containerized longleaf pine treatment substantially increased survival, which led to greater volume compared with bareroot longleaf pine. Loblolly pine yielded more volume than longleaf pine on all sites but one, where survival was negatively affected by fire. Depth of the sandy surface horizon affected mean annual height growth of both loblolly and longleaf pine. Height growth per year decreased with an increase in sand depth for both species. Multiple regression analysis of volume growth (ft³/ac per year) for both species indicated a strong relationship to depth of sandy soil and survival. After 15–19 years, loblolly pine has been more productive than longleaf pine, although longleaf pine productivity may be equal to or greater than that of loblolly pine on the soils with the deepest sandy surface layers over longer rotations.
Sylvester, Peter T.; Evans, John A.; Zipfel, Gregory J.; Chole, Richard A.; Uppaluri, Ravindra; Haughey, Bruce H.; Getz, Anne E.; Silverstein, Julie; Rich, Keith M.; Kim, Albert H.; Dacey, Ralph G.
2014-01-01
Purpose The clinical benefit of combined intraoperative magnetic resonance imaging (iMRI) and endoscopy for transsphenoidal pituitary adenoma resection has not been completely characterized. This study assessed the impact of microscopy, endoscopy, and/or iMRI on progression-free survival, extent of resection status (gross-, near-, and subtotal resection), and operative complications. Methods Retrospective analyses were performed on 446 transsphenoidal pituitary adenoma surgeries at a single institution between 1998 and 2012. Multivariate analyses were used to control for differences in baseline characteristics in the extent of resection and progression-free survival analyses. Results Additional surgery was performed after iMRI in 56/156 cases (35.9 %), which led to increased extent of resection status in 15/156 cases (9.6 %). Multivariate ordinal logistic regression revealed no increase in extent of resection status following iMRI or endoscopy alone; however, combining these modalities increased extent of resection status (odds ratio 2.05, 95 % CI 1.21–3.46) compared to conventional transsphenoidal microsurgery. Multivariate Cox regression revealed that reduced extent of resection status shortened progression-free survival for near- versus gross-total resection [hazard ratio (HR) 2.87, 95 % CI 1.24–6.65] and sub- versus near-total resection (HR 2.10; 95 % CI 1.00–4.40). Complication comparisons between microscopy, endoscopy, and iMRI revealed increased perioperative deaths for endoscopy versus microscopy (4/209 and 0/237, respectively), but this difference was non-significant considering multiple post hoc comparisons (Fisher exact, p = 0.24). Conclusions Combined use of endoscopy and iMRI increased pituitary adenoma extent of resection status compared to conventional transsphenoidal microsurgery, and increased extent of resection status was associated with longer progression-free survival. Treatment modality combination did not significantly impact complication rate. PMID:24599833
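The post hoc safety comparison of perioperative deaths (4 of 209 endoscopy cases versus 0 of 237 microscopy cases) corresponds to a Fisher exact test on a 2x2 table, sketched below in Python with scipy. The printed P value is the unadjusted test; the study additionally accounted for multiple post hoc comparisons.

```python
# Sketch of the post hoc safety comparison: a Fisher exact test on perioperative
# deaths, 4/209 endoscopy versus 0/237 microscopy, as reported in the abstract.
from scipy.stats import fisher_exact

deaths_endoscopy, n_endoscopy = 4, 209
deaths_microscopy, n_microscopy = 0, 237

table = [[deaths_endoscopy, n_endoscopy - deaths_endoscopy],
         [deaths_microscopy, n_microscopy - deaths_microscopy]]

odds_ratio, p_value = fisher_exact(table)
print(f"Fisher exact P = {p_value:.3f}")   # unadjusted for the multiple
                                           # post hoc comparisons in the study
```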
Results of Liver Transplantation With Donors Older than 75 Years: A Case-Control Study.
León Díaz, F J; Fernández Aguilar, J L; Sánchez Pérez, B; Montiel Casado, C; Aranda Narváez, J M; Pérez Daga, J A; Suárez Muñoz, M Á; Santoyo Santoyo, J
2016-09-01
The inclusion of elderly donors can increase the pool of organs available for transplantation. The objective of this study was to compare clinical outcomes and survival rates of patients who received livers from donors aged ≥75 years versus younger donors. We considered all liver transplantations performed in our unit from January 2006 to January 2015. Thirty-two patients received a liver from a cadaveric donor aged ≥75 years (study group), and their outcomes were compared with those of patients who received a liver from a younger donor (control group) immediately before and after each transplantation in the study group. This is a descriptive, retrospective, case-control study carried out to analyze the characteristics of donors and recipients as well as the clinical course and survival of recipients of older and younger donors. Statistically significant differences were observed according to donors' age (53.3 ± 13.6 vs 79 ± 3.4 years; P < .001). In total, 6.2% of the recipients of a liver from a donor aged <75 years required retransplantation versus 15.6% of recipients of donors ≥75 years. Patient survivals at 1, 3, and 5 years, respectively, were 89%, 78.6%, and 74.5% for recipients of donors <75 years versus 83.4%, 79.4%, and 59.6% for the study group. Livers from older donors can be safely used for transplantation with acceptable survival rates. However, survival rates are lower for recipients of livers from older donors compared with younger donors, and survival only increased with retransplantation. Copyright © 2016 Elsevier Inc. All rights reserved.
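A minimal sketch of how the 1-, 3-, and 5-year survival comparison described above could be reproduced with Kaplan-Meier estimation, assuming a hypothetical DataFrame (months_to_event, died, old_donor); these toy values are not the study's data.

```python
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.DataFrame({
    "months_to_event": [12, 40, 66, 8, 25, 70, 15, 60],   # follow-up in months (toy values)
    "died":            [1, 0, 0, 1, 1, 0, 1, 0],           # 1 = death observed
    "old_donor":       [1, 1, 1, 1, 0, 0, 0, 0],           # 1 = donor aged >= 75 years
})

for label, grp in df.groupby("old_donor"):
    kmf = KaplanMeierFitter()
    kmf.fit(grp["months_to_event"], event_observed=grp["died"],
            label=f"donor >= 75: {bool(label)}")
    # Survival probabilities at 1, 3, and 5 years (12, 36, 60 months)
    print(kmf.survival_function_at_times([12, 36, 60]))
```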
Yannopoulos, Demetris; Bartos, Jason A.; George, Stephen A.; Sideris, George; Voicu, Sebastian; Oestreich, Brett; Matsuura, Timothy; Shekar, Kadambari; Rees, Jennifer; Aufderheide, Tom P.
2017-01-01
Introduction Sodium nitroprusside (SNP) enhanced CPR (SNPeCPR) demonstrates increased vital organ blood flow and survival in multiple porcine models. We developed a new, coronary occlusion/ischemia model of prolonged resuscitation, mimicking the majority of out-of-hospital cardiac arrests presenting with shockable rhythms. Hypothesis SNPeCPR will increase short term (4-hour) survival compared to standard 2015 Advanced Cardiac Life Support (ACLS) guidelines in an ischemic refractory ventricular fibrillation (VF), prolonged CPR model. Methods Sixteen anesthetized pigs had the ostial left anterior descending artery occluded leading to ischemic VF arrest. VF was untreated for 5 minutes. Basic life support was performed for 10 minutes. At minute 10 (EMS arrival), animals received either SNPeCPR (n=8) or standard ACLS (n=8). Defibrillation (200J) occurred every 3 minutes. CPR continued for a total of 45 minutes, then the balloon was deflated simulating revascularization. CPR continued until return of spontaneous circulation (ROSC) or a total of 60 minutes, if unsuccessful. SNPeCPR animals received 2 mg of SNP at minute 10 followed by 1 mg every 5 minutes until ROSC. Standard ACLS animals received 0.5 mg epinephrine every 5 minutes until ROSC. Primary endpoints were ROSC and 4-hour survival. Results All SNPeCPR animals (8/8) achieved sustained ROSC versus 2/8 standard ACLS animals within one hour of resuscitation (p=0.04). The 4-hour survival was significantly improved with SNPeCPR compared to standard ACLS, 7/8 versus 1/8 respectively, p=0.0019. Conclusion SNPeCPR significantly improved ROSC and 4-hour survival compared with standard ACLS CPR in a porcine model of prolonged ischemic, refractory VF cardiac arrest. PMID:27771299
Role of survivor bias in pancreatic cancer case-control studies.
Hu, Zhen-Huan; Connett, John E; Yuan, Jian-Min; Anderson, Kristin E
2016-01-01
The purpose of this study was to evaluate the impact of survivor bias on pancreatic cancer case-control studies. The authors constructed five case-loss scenarios based on the Iowa Women's Health Study cohort to reflect how case recruitment in population-based studies varies by case survival time. Risk factors for disease incidence included smoking, body mass index (BMI), waist circumference, diabetes, and alcohol consumption. Odds ratios (ORs) were estimated by conditional logistic regression and quantitatively compared by the interactions between risk factors and 3-month survival time. Additionally, Kaplan-Meier estimates for overall survival were compared within the subset cohort of pancreatic cancer cases. BMI and waist circumference showed a significant inverse relationship with survival time. Decreasing trends in ORs for BMI and waist circumference were observed with increasing case survival time. The interaction between BMI and survival time based on a cutpoint of 3 months was significant (P < .01) as was the interaction between waist circumference and survival time (P < .01). The findings suggested that case losses could result in survivor bias causing underestimated odds ratios for both BMI and waist circumference, whereas other risk factors were not significantly affected by case losses. Copyright © 2016 Elsevier Inc. All rights reserved.
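The interaction test described above (risk factor x 3-month survival) can be illustrated as follows. The study used conditional logistic regression on matched case-control sets; this simplified sketch uses ordinary logistic regression on synthetic data with hypothetical column names, purely to show the interaction-term idea.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "case": rng.integers(0, 2, n),          # 1 = pancreatic cancer case (synthetic)
    "bmi": rng.normal(27, 4, n),            # body mass index
    "surv_gt_3mo": rng.integers(0, 2, n),   # stratum indicator: case survived > 3 months
})

model = smf.logit("case ~ bmi * surv_gt_3mo", data=df).fit(disp=0)
print(model.summary())   # the bmi:surv_gt_3mo coefficient is the interaction of interest
```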
Rahman, Sajjad; Salameh, Khalil; Al-Rifai, Hilal; Masoud, Ahmed; Lutfi, Samawal; Salama, Husam; Abdoh, Ghassan; Omar, Fahmi; Bener, Abdulbari
2011-09-01
To analyze and compare the current gestational age specific neonatal survival rates between Qatar and international benchmarks. An analytical comparative study. Women's Hospital, Hamad Medical Corporation, Doha, Qatar, from 2003-2008. Six years' (2003-2008) gestational age specific neonatal mortality data were stratified for each completed week of gestation at birth from 24 weeks until term. The data from World Health Statistics by WHO (2010), Vermont Oxford Network (VON, 2007) and National Statistics United Kingdom (2006) were used as international benchmarks for comparative analysis. A total of 82,002 babies were born during the study period. Qatar's neonatal mortality rate (NMR) dropped from 6/1000 in 2003 to 4.3/1000 in 2008 (p < 0.05). The overall and gestational age specific neonatal mortality rates of Qatar were comparable with international benchmarks. The survival of < 27 weeks and term babies was better in Qatar (p=0.01 and p < 0.001 respectively) as compared to VON. The survival of > 32 weeks babies was better in UK (p=0.01) as compared to Qatar. The relative risk (RR) of death decreased with increasing gestational age (p < 0.0001). Preterm babies (45%) followed by lethal chromosomal and congenital anomalies (26.5%) were the two leading causes of neonatal deaths in Qatar. The current total and gestational age specific neonatal survival rates in the State of Qatar are comparable with international benchmarks. In Qatar, persistently high rates of low birth weight and lethal chromosomal and congenital anomalies significantly contribute towards neonatal mortality.
Bessell-Browne, Pia; Stat, Michael; Thomson, Damian; Clode, Peta L.
2014-09-01
Colonies of Coscinaraea marshae corals from Rottnest Island, Western Australia have survived for more than 11 months in various bleached states following a severe heating event in the austral summer of 2011. These colonies are situated in a high-latitude, mesophotic environment, which has made their long-term survival of particular interest because such environments typically experience minimal thermal pressures. We have investigated corals that remain unbleached, moderately bleached, or severely bleached to better understand potential survival mechanisms utilised in response to thermal stress. Specifically, Symbiodinium (algal symbiont) density and genotype, chlorophyll-a concentrations, and δ13C and δ15N levels were compared between colonies in the three bleaching categories. Severely bleached colonies housed significantly fewer Symbiodinium cells (p < 0.05) and significantly reduced chlorophyll-a concentrations (p < 0.05), compared with unbleached colonies. Novel Symbiodinium clade associations were observed for this coral in both severely and moderately bleached colonies, with clade C and a mixed clade population detected. In unbleached colonies, only clade B was observed. Levels of δ15N indicate that severely bleached colonies are utilising heterotrophic feeding mechanisms to aid survival whilst bleached. Collectively, these results suggest that these C. marshae colonies can survive with low symbiont and chlorophyll densities, in response to prolonged thermal stress and extended bleaching, and increase heterotrophic feeding levels sufficiently to meet energy demands, thus enabling some colonies to survive and recover over long time frames. This is significant as it suggests that corals in mesophotic and high-latitude environments may possess considerable plasticity and an ability to tolerate and adapt to large environmental fluctuations, thereby improving their chances of survival as climate change impacts coral ecosystems worldwide.
Chen, Ming; Bao, Yong; Ma, Hong-Lian; Wang, Jin; Wang, Yan; Peng, Fang; Zhou, Qi-Chao; Xie, Cong-Hua
2013-01-01
This prospective randomized study was designed to evaluate locoregional failure and its impact on survival by comparing involved field radiotherapy (IFRT) with elective nodal irradiation (ENI) in combination with concurrent chemotherapy for locally advanced non-small cell lung cancer. It appears that a higher dose could be delivered in the IFRT arm than in the ENI arm, and IFRT did not increase the risk of initially uninvolved or isolated nodal failures. Both a tendency toward improved locoregional progression-free survival and a significantly increased overall survival rate are in favor of the IFRT arm in this study. PMID:23762840
Chapman, William C; Vachharajani, Neeta; Collins, Kelly M; Garonzik-Wang, Jackie; Park, Yikyung; Wellen, Jason R; Lin, Yiing; Shenoy, Surendra; Lowell, Jeffrey A; Doyle, M B Majella
2015-07-01
The shortage of donor organs has led to increasing use of extended criteria donors, including older donors. The upper limit of donor age that produces acceptable outcomes continues to be explored. We hypothesized that in liver transplantation, with appropriate selection, graft survival and patient outcomes would be comparable regardless of donor age. We performed a retrospective analysis of 1,036 adult orthotopic liver transplantations (OLT) from a prospectively maintained database performed between January 1, 2000 and December 31, 2013. The study focus group was liver transplantations performed using grafts from older (older than 60 years) deceased donors. Deceased donor liver transplantations done during the same time period using grafts from younger donors (younger than 60 years) were analyzed for comparison. Both groups were further divided based on recipient age (less than 60 years and 60 years or older). Donor age was the primary variable. Recipient variables included were demographics, indication for transplantation, Model for End-Stage Liver Disease (MELD), graft survival, and patient survival. Operative details and postoperative complications were analyzed. Patient demographics and perioperative details were similar between groups. Patient and graft survival rates were similar in the 4 groups. Rates of rejection (p = 0.07), bile leak (p = 0.17), and hepatic artery thrombosis were comparable across all groups (p = 0.84). Hepatitis C virus recurrence was similar across all groups (p = 0.10). Thirty-one young recipients (less than 60 years) received grafts from donors aged 70 or older. Their survival and other complication rates were comparable to those in the young donor to young recipient group. Comparable outcomes in graft and patient survival were achieved using older donors (60 years or more), regardless of recipient age, without increased rate of complications. Copyright © 2015 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
Diethelm, A G; Blackstone, E H; Naftel, D C; Hudson, S L; Barber, W H; Deierhoi, M H; Barger, B O; Curtis, J J; Luke, R G
1988-01-01
Multiple risk factors contribute to the allograft survival of patients who have cadaveric renal transplantation. A retrospective review of 19 such factors in 426 patients identified race, DR match, B + DR match, number of transplants, and preservation time to have a significant influence. The parametric analysis confirmed the effect to be primarily in the early phase, i.e., first 6 months. All patients received cyclosporine with other methods of immunosuppression resulting in an overall 1-year graft survival rate of 66%. The overall 1-year graft survival rate in the white race was 73% and in the black race was 57% (p = 0.002). Allograft survival and DR match showed white recipients with a 1 DR match to have 75% survival at 1 year compared with 57% in the black patient (p = 0.009). If HLA B + DR match was considered, the white recipient allograft survival increased to 76%, 84%, and 88% for 1, 2, and 3 match kidneys by parametric analysis. Patients receiving first grafts had better graft survival (68%) than those undergoing retransplantation (58%) (p = 0.05). Organ preservation less than 12 hours influenced allograft survival with a 78% 1-year survival rate compared with 63% for kidneys with 12-18 hours of preservation. Despite the benefits of B + DR typing, short preservation time, and first transplants to the white recipient, the allograft survival in the black recipient remained uninfluenced by these parameters. PMID:3288138
Lorentz, C Adam; Liang, Zhe; Meng, Mei; Chen, Ching-Wen; Yoseph, Benyam P; Breed, Elise R; Mittal, Rohit; Klingensmith, Nathan J; Farris, Alton B; Burd, Eileen M; Koval, Michael; Ford, Mandy L; Coopersmith, Craig M
2017-06-07
Sepsis-induced intestinal hyperpermeability is mediated by disruption of the epithelial tight junction, which is closely associated with the peri-junctional actin-myosin ring. Myosin light chain kinase (MLCK) phosphorylates the myosin regulatory light chain, resulting in increased permeability. The purpose of this study was to determine whether genetic deletion of MLCK would alter gut barrier function and survival from sepsis. MLCK -/- and wild type (WT) mice were subjected to cecal ligation and puncture and assayed for both survival and mechanistic studies. Survival was significantly increased in MLCK -/- mice (95% vs. 24%, p<0.0001). Intestinal permeability increased in septic WT mice compared to unmanipulated mice. In contrast, permeability in septic MLCK -/- mice was similar to that seen in unmanipulated animals. Improved gut barrier function in MLCK -/- mice was associated with increases in the tight junction mediators ZO-1 and claudin 15 without alterations in claudin 1, 2, 3, 4, 5, 7, 8, 13, occludin or JAM-A. Other components of intestinal integrity (apoptosis, proliferation and villus length) were unaffected by MLCK deletion as were local peritoneal inflammation and distant lung injury. Systemic IL-10 was decreased greater than 10-fold in MLCK -/- mice; however, survival was similar between septic MLCK -/- mice given exogenous IL-10 or vehicle. These data demonstrate that deletion of MLCK improves survival following sepsis, associated with normalization of intestinal permeability and selected tight junction proteins.
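A small sketch of the kind of two-group survival comparison described above (knockout vs. wild-type after cecal ligation and puncture), using a log-rank test on invented times and event flags; this is not the study's dataset.

```python
# Toy log-rank comparison between wild-type and knockout groups.
from lifelines.statistics import logrank_test

wt_hours = [18, 24, 30, 36, 48, 72, 96, 168]
wt_event = [1, 1, 1, 1, 1, 1, 1, 0]          # 1 = death observed, 0 = alive at study end
ko_hours = [72, 120, 168, 168, 168, 168, 168, 168]
ko_event = [1, 1, 0, 0, 0, 0, 0, 0]

result = logrank_test(wt_hours, ko_hours,
                      event_observed_A=wt_event, event_observed_B=ko_event)
print(result.p_value)
```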
Elliott, Thomas B.; Bolduc, David L.; Ledney, G. David; Kiang, Juliann G.; Fatanmi, Oluseyi O.; Wise, Stephen Y.; Romaine, Patricia L. P.; Newman, Victoria L.; Singh, Vijay K.
2015-01-01
Purpose: A combination therapy for combined injury (CI) using a non-specific immunomodulator, synthetic trehalose dicorynomycolate and monophosphoryl lipid A (STDCM-MPL), was evaluated to augment oral antimicrobial agents, levofloxacin (LVX) and amoxicillin (AMX), to eliminate endogenous sepsis and modulate cytokine production. Materials and methods: Female B6D2F1/J mice received 9.75 Gy cobalt-60 gamma-radiation and wound. Bacteria were isolated and identified in three tissues. Incidence of bacteria and cytokines were compared between treatment groups. Results: Results demonstrated that the lethal dose for 50% at 30 days (LD50/30) of B6D2F1/J mice was 9.42 Gy. Antimicrobial therapy increased survival in radiation-injured (RI) mice. Combination therapy increased survival after RI and extended survival time but did not increase survival after CI. Sepsis began five days earlier in CI mice than RI mice with Gram-negative species predominating early and Gram-positive species increasing later. LVX plus AMX eliminated sepsis in CI and RI mice. STDCM-MPL eliminated Gram-positive bacteria in CI and most RI mice but not Gram-negative. Treatments significantly modulated 12 cytokines tested, which pertain to wound healing or elimination of infection. Conclusions: Combination therapy eliminates infection and prolongs survival time but does not assure CI mouse survival, suggesting that additional treatment for proliferative-cell recovery is required. PMID:25994812
Surgery Increases Survival in Patients With Gastrinoma
Norton, Jeffrey A.; Fraker, Douglas L.; Alexander, H R.; Gibril, Fathia; Liewehr, David J.; Venzon, David J.; Jensen, Robert T.
2006-01-01
Objective: To determine whether the routine use of surgical exploration for gastrinoma resection/cure in 160 patients with Zollinger-Ellison syndrome (ZES) altered survival compared with 35 ZES patients who did not undergo surgery. Summary Background Data: The role of routine surgical exploration for resection/cure in patients with ZES has been controversial since the original description of this disease in 1955. This controversy continues today, not only because medical therapy for acid hypersecretion is so effective, but also in large part because no studies have shown an effect of tumor resection on survival. Methods: Long-term follow-up of 160 ZES patients who underwent routine surgery for gastrinoma resection/cure was compared with 35 patients who had similar disease but did not undergo surgery for a variety of reasons. All patients had preoperative CT, MRI, and ultrasound to determine resectability; if these were unclear, angiography was performed, and somatostatin receptor scintigraphy was added from 1994 onward. At surgery, all had the same standard ZES operation. All patients were evaluated yearly with imaging studies and disease activity studies. Results: The 35 nonsurgical patients did not differ from the 160 operated in clinical, laboratory, or tumor imaging results. The 2 groups did not differ in follow-up time since initial evaluation (range, 11.8–12 years). At surgery, 94% had a tumor removed, 51% were cured immediately, and 41% at last follow-up. Significantly more unoperated patients developed liver metastases (29% vs. 5%, P = 0.0002), died of any cause (54 vs. 21%, P = 0.0002), or died a disease-related death (23 vs. 1%, P < 0.00001). Survival plots showed operated patients had a better disease-related survival (P = 0.0012); however, there was no difference in non-disease-related survival. Fifteen-year disease-related survival was 98% for operated and 74% for unoperated (P = 0.0002). Conclusions: These results demonstrate that routine surgical exploration increases survival in patients with ZES by increasing disease-related survival and decreasing the development of advanced disease. Routine surgical exploration should be performed in ZES patients. PMID:16926567
The Vitamin B12 Analog Cobinamide Is an Effective Antidote for Oral Cyanide Poisoning.
Lee, Jangwoen; Mahon, Sari B; Mukai, David; Burney, Tanya; Katebian, Behdod S; Chan, Adriano; Bebarta, Vikhyat S; Yoon, David; Boss, Gerry R; Brenner, Matthew
2016-12-01
Cyanide is a major chemical threat, and cyanide ingestion carries a higher risk for a supra-lethal dose exposure compared to inhalation but provides an opportunity for effective treatment due to a longer treatment window and a gastrointestinal cyanide reservoir that could be neutralized prior to systemic absorption. We hypothesized that orally administered cobinamide may function as a high-binding affinity scavenger and that gastric alkalinization would reduce cyanide absorption and concurrently increase cobinamide binding, further enhancing antidote effectiveness. Thirty New Zealand white rabbits were divided into five groups and were given a lethal oral dose of cyanide (50 mg). Survival time was monitored after oral cyanide alone, oral cyanide with gastric alkalinization using oral sodium bicarbonate buffer (500 mg), and gastric alkalinization combined with either aquohydroxocobinamide or dinitrocobinamide (250 mM). Red blood cell cyanide concentration, plasma cobinamide, and thiocyanate concentrations were measured from blood samples. In cyanide-ingested animals, oral sodium bicarbonate alone significantly prolonged survival time to 20.3 ± 8.6 min compared to 10.5 ± 4.3 min in saline-treated controls, but did not result in overall survival. Aquohydroxocobinamide and dinitrocobinamide increased survival time to 64 ± 41 (p < 0.05) and 75 ± 16.4 min (p < 0.001), respectively. Compared to aquohydroxocobinamide, dinitrocobinamide showed greater systemic absorption and reduced blood pressure. Dinitrocobinamide also markedly increased the red blood cell cyanide concentration. Under all conditions, the plasma thiocyanate concentration gradually increased with time. This study demonstrates a promising new approach to treat high-dose cyanide ingestion, with gastric alkalinization alone and in combination with oral cobinamide for treating a supra-lethal dose of orally administered cyanide in rabbits.
Ventetuolo, Corey E; Hess, Edward; Austin, Eric D; Barón, Anna E; Klinger, James R; Lahm, Tim; Maddox, Thomas M; Plomondon, Mary E; Thompson, Lauren; Zamanian, Roham T; Choudhary, Gaurav; Maron, Bradley A
2017-01-01
Women have an increased risk of pulmonary hypertension (PH) but better survival compared to men. Few studies have explored sex-based differences in population-based cohorts with PH. We sought to determine whether sex was associated with hemodynamics and survival in US veterans with PH (mean pulmonary artery pressure [mPAP] ≥ 25 mm Hg) from the Veterans Affairs Clinical Assessment, Reporting, and Tracking database. The relationship between sex and hemodynamics was assessed with multivariable linear mixed modeling. Cox proportional hazards models were used to compare survival by sex for those with PH and precapillary PH (mPAP ≥ 25 mm Hg, pulmonary artery wedge pressure [PAWP] ≤ 15 mm Hg and pulmonary vascular resistance [PVR] > 3 Wood units) respectively. The study population included 15,464 veterans with PH, 516 (3%) of whom were women; 1,942 patients (13%) had precapillary PH, of whom 120 (6%) were women. Among those with PH, women had higher PVR and pulmonary artery pulse pressure, and lower right atrial pressure and PAWP (all p <0.001) compared with men. There were no significant differences in hemodynamics according to sex in veterans with precapillary PH. Women with PH had 18% greater survival compared to men with PH (adjusted HR 0.82, 95% CI 0.69-0.97, p = 0.020). Similarly, women with precapillary PH were 29% more likely to survive as compared to men with PH (adjusted HR 0.71, 95% CI 0.52-0.98, p = 0.040). In conclusion, female veterans with PH have better survival than males despite higher pulmonary afterload.
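A hedged sketch of the multivariable linear mixed model described above for a hemodynamic outcome by sex. The column names and the grouping variable (site) are hypothetical stand-ins, not the VA CART data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "pvr": rng.normal(4.0, 2.0, n),      # pulmonary vascular resistance (Wood units), synthetic
    "female": rng.integers(0, 2, n),
    "age": rng.normal(65, 10, n),
    "site": rng.integers(0, 10, n),      # clustering unit for the random intercept
})

mixed = smf.mixedlm("pvr ~ female + age", data=df, groups=df["site"]).fit()
print(mixed.summary())
```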
Campbell, Kevin; Gastier-Foster, Julie M; Mann, Meegan; Naranjo, Arlene H; Van Ryn, Collin; Bagatell, Rochelle; Matthay, Katherine K; London, Wendy B; Irwin, Meredith S; Shimada, Hiroyuki; Granger, M Meaghan; Hogarty, Michael D; Park, Julie R; DuBois, Steven G
2017-11-01
High-level MYCN amplification (MNA) is associated with poor outcome and unfavorable clinical and biological features in patients with neuroblastoma. To the authors' knowledge, less is known regarding these associations in patients with low-level MYCN copy number increases. In this retrospective study, the authors classified patients as having MYCN wild-type tumors, MYCN gain (2-4-fold increase in MYCN signal compared with the reference probe), or MNA (>4-fold increase). Tests of trend were used to investigate ordered associations between MYCN copy number category and features of interest. Log-rank tests and Cox models compared event-free survival and overall survival by subgroup. Among 4672 patients, 3694 (79.1%) had MYCN wild-type tumors, 133 (2.8%) had MYCN gain, and 845 (18.1%) had MNA. For each clinical/biological feature, the percentage of patients with an unfavorable feature was lowest in the MYCN wild-type category, intermediate in the MYCN gain category, and highest in the MNA category (P<.0001), except for 11q aberration, for which the highest rates were in the MYCN gain category. Patients with MYCN gain had inferior event-free survival and overall survival compared with those with MYCN wild-type. Among patients with high-risk disease, MYCN gain was associated with the lowest response rate after chemotherapy. Patients with non-stage 4 disease (according to the International Neuroblastoma Staging System) and patients with non-high-risk disease with MYCN gain had a significantly increased risk for death, a finding confirmed on multivariable testing. Increasing MYCN copy number is associated with an increasingly higher rate of unfavorable clinical/biological features, with 11q aberration being an exception. Patients with MYCN gain appear to have inferior outcomes, especially in otherwise more favorable groups. Cancer 2017;123:4224-4235. © 2017 American Cancer Society.
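One common way to run a "test of trend" of the kind described above is to code the copy-number category as an ordinal score and test its coefficient against an unfavorable-feature indicator; the sketch below does this on synthetic data with hypothetical names, and is not the authors' analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 1000
mycn_score = rng.choice([0, 1, 2], size=n, p=[0.79, 0.03, 0.18])  # 0=WT, 1=gain, 2=amplified
# make the unfavorable feature more likely at higher copy-number categories
p_unfav = 0.1 + 0.25 * mycn_score
unfavorable = rng.binomial(1, p_unfav)

df = pd.DataFrame({"mycn_score": mycn_score, "unfavorable": unfavorable})
trend = smf.logit("unfavorable ~ mycn_score", data=df).fit(disp=0)
print(trend.params, trend.pvalues)   # the mycn_score p-value is the trend test
```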
Yoshikawa, Kyoko; Iwasa, Motoh; Kojima, Shinichi; Yoshizawa, Naohiko; Tempaku, Mina; Sugimoto, Ryosuke; Yamamoto, Norihiko; Sugimoto, Kazushi; Kobayashi, Yoshinao; Hasegawa, Hiroshi; Takei, Yoshiyuki
2017-01-01
Chronic liver disease patients often have complications, such as hepatocellular carcinoma (HCC) and acute bacterial infection. Model for end-stage liver disease and Child-Pugh scores are useful prognostic factors for chronic liver diseases but not for all chronic conditions, such as HCC. Our investigative aim targeted the prognostic abilities of neutrophil gelatinase-associated lipocalin (NGAL) in rat and human chronic liver diseases. Blood NGAL levels were measured by enzyme-linked immunosorbent assay in rats with cirrhosis and 96 patients with chronic liver disease and HCC. We examined the correlation between blood NGAL levels and liver functions as well as survival. In our rat model, liver NGAL expression was assessed by immunostaining, real-time quantitative polymerase chain reaction, and immunoblot. In rats with cirrhosis, blood NGAL levels were continuously and significantly elevated in the deceased group and were significantly correlated with liver functions. Liver NGAL, toll-like receptor 4, and interleukin-6 levels were increased in the deceased group compared to the survival group. Blood NGAL levels were significantly correlated with liver NGAL levels, indicating blood NGAL was derived from the liver. In patients with chronic liver disease, blood NGAL levels were associated with liver function and renal function. Blood NGAL levels were significantly increased in patients with chronic liver disease with HCC compared to without HCC. In the survival analysis, 38 of 96 patients died during an average follow-up period of 9.9 months. Patients with blood NGAL ≤119 ng/mL had significantly longer survival compared to patients with blood NGAL >119 ng/mL. Conclusion: Blood NGAL predicts the survival rate in rat and human chronic liver diseases. Our findings suggest blood NGAL may be prognostic of survival in chronic liver diseases complicated by HCC. (Hepatology Communications 2017;1:946–956) PMID:29404502
Shadyab, Aladdin H.; Macera, Caroline A.; Shaffer, Richard A.; Jain, Sonia; Gallo, Linda C.; Gass, Margery L.S.; Waring, Molly E.; Stefanick, Marcia L.; LaCroix, Andrea Z.
2016-01-01
Objective To investigate associations between reproductive factors and survival to age 90 years. Methods This was a prospective study of postmenopausal women from the Women's Health Initiative recruited from 1993-1998 and followed until the last outcomes evaluation on August 29, 2014. Participants included 16,251 women born on or before August 29, 1924 for whom survival to age 90 during follow-up was ascertained. Women were classified as having survived to age 90 (exceptional longevity) or died before age 90. Multivariable logistic regression models were used to evaluate associations of ages at menarche and menopause (natural or surgical) and reproductive lifespan with longevity, adjusting for demographic, lifestyle, and reproductive characteristics. Results Participants were on average aged 74.7 years (range, 69-81 years) at baseline. Of 16,251 women, 8,892 (55%) survived to age 90. Women aged ≥12 years at menarche had modestly increased odds of longevity (odds ratio [OR], 1.09; 95% confidence interval [CI], 1.00-1.19). There was a significant trend toward increased longevity for later age at menopause (natural or surgical; P for trend = 0.01), with ORs (95% CIs) of 1.19 (1.04-1.36) and 1.18 (1.02-1.36) for 50-54 and ≥55 compared with <40 years, respectively. Later age at natural menopause as a separate exposure was also significantly associated with increased longevity (P for trend = 0.02). Longer reproductive lifespan was significantly associated with increased longevity (P for trend = 0.008). The odds of longevity were 13% (OR, 1.13; 95% CI, 1.03-1.25) higher in women with >40 compared with <33 reproductive years. Conclusions Reproductive characteristics were associated with late-age survival in older women. PMID:27465713
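A minimal sketch of the multivariable logistic regression behind odds ratios of the kind reported above, fit on synthetic data with hypothetical stand-in variables (not the WHI dataset); exponentiated coefficients are the adjusted ORs.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 2000
df = pd.DataFrame({
    "survived_to_90": rng.integers(0, 2, n),
    "reproductive_years": rng.normal(36, 4, n),   # age at menopause minus age at menarche
    "smoker": rng.integers(0, 2, n),
    "bmi": rng.normal(27, 5, n),
})

fit = smf.logit("survived_to_90 ~ reproductive_years + smoker + bmi", data=df).fit(disp=0)
odds_ratios = np.exp(fit.params)                  # exponentiated coefficients are ORs
print(pd.concat([odds_ratios, np.exp(fit.conf_int())], axis=1))
```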
Kirol, Christopher P.; Sutphin, Andrew L.; Bond, Laura; Fuller, Mark R.; Maechtle, Thomas L.
2015-01-01
Sagebrush (Artemisia spp.) habitats being developed for oil and gas reserves are inhabited by sagebrush obligate species—including the greater sage-grouse (Centrocercus urophasianus; sage-grouse) that is currently being considered for protection under the U.S. Endangered Species Act. Numerous studies suggest increasing oil and gas development may exacerbate species extinction risks. Therefore, there is a great need for effective on-site mitigation to reduce impacts to co-occurring wildlife such as sage-grouse. Nesting success is a primary factor in avian productivity and declines in nesting success are also thought to be an important contributor to population declines in sage-grouse. From 2008 to 2011 we monitored 296 nests of radio-marked female sage-grouse in a natural gas (NG) field in the Powder River Basin, Wyoming, USA and compared nest survival in mitigated and non-mitigated development areas and relatively unaltered areas to determine if specific mitigation practices were enhancing nest survival. Nest survival was highest in relatively unaltered habitats followed by mitigated, and then non-mitigated NG areas. Reservoirs used for holding NG discharge water had the greatest support as having a direct relationship to nest survival. Within a 5 km2 area surrounding a nest, the probability of nest failure increased by about 15% for every 1.5 km increase in reservoir water edge. Reducing reservoirs was a mitigation focus and sage-grouse nesting in mitigated areas were exposed to almost half of the amount of water edge compared to those in non-mitigated areas. Further, we found that an increase in sagebrush cover was positively related to nest survival. Consequently, mitigation efforts focused on reducing reservoir construction and reducing surface disturbance, especially when the surface disturbance results in sagebrush removal, are important to enhancing sage-grouse nesting success. PMID:26366042
Ohara, M; Lu, H; Shiraki, K; Ishimura, Y; Uesaka, T; Katoh, O; Watanabe, H
2001-12-01
The radioprotective effect of miso, a fermentation product from soy bean, was investigated with reference to the survival time, crypt survival and jejunum crypt length in male B6C3F1 mice. Miso at three different fermentation stages (early-, medium- and long-term fermented miso) was mixed into the MF diet as biscuits at 10% and administered from 1 week before irradiation. Animal survival in the long-term fermented miso group was significantly prolonged as compared with the short-term fermented miso and MF diet groups after 8 Gy of 60Co gamma-ray irradiation at a dose rate of 2 Gy min(-1). Delay in mortality was evident in all three miso groups, with significantly increased survival. At doses of 10 and 12 Gy X-irradiation at a dose rate of 4 Gy min(-1), the treatment with long-term fermented miso significantly increased crypt survival. The protective influence against irradiation in terms of crypt length in the long-term fermented miso group was also significantly greater than in the short-term or medium-term fermented miso and MF diet groups. Thus, prolonged fermentation appears to be very important for protection against radiation effects.
Angiodrastic Chemokines in Colorectal Cancer: Clinicopathological Correlations.
Emmanouil, George; Ayiomamitis, George; Zizi-Sermpetzoglou, Adamantia; Tzardi, Maria; Moursellas, Andrew; Voumvouraki, Argyro; Kouroumalis, Elias
2018-01-01
To study the expression of angiodrastic chemokines in colorectal tumors and correlate findings with clinicopathological parameters and survival. The proangiogenic factor VEGF, the angiogenic chemokines CXCL8 and CXCL6, and the angiostatic chemokine CXCL4 were measured by ELISA in tumor and normal tissue of 35 stage II and III patients and correlated with the histopathology markers Ki67, p53, p21, bcl2, EGFR, and MLH1 and 5-year survival. The Wilcoxon and chi-square tests were used for statistical comparisons. There was a significant increase of CXCL6 (p = 0.005) and VEGF (p = 0.003) in cancerous tissue compared to normal. Patients with lower levels of CXCL8 and CXCL4 lived significantly longer. Patients with loss of EGFR expression had higher levels of CXCL8 while p21 loss was associated with higher levels of CXCL6. Chemokine levels were not correlated with TNM or Dukes classification. Strong expression of p53 was accompanied by decreased survival. (1) The angiogenic factors CXCL6 and VEGF are increased in colorectal cancer tissue with no association with the clinical stage of the disease or survival. (2) However, increased levels of tissue CXCL8 and CXCL4 are associated with poor survival. (3) Strong expression of p53 is found in patients with poor survival.
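The tumor-versus-normal comparison above relies on the Wilcoxon test on paired tissue samples; the toy example below shows that paired test on invented values (arbitrary units), not the study's measurements.

```python
from scipy.stats import wilcoxon

cxcl6_tumor  = [12.1, 8.4, 15.0, 9.9, 20.3, 11.2, 7.8, 14.6]   # paired with the row below
cxcl6_normal = [ 6.3, 7.1,  9.8, 5.4, 12.0,  8.9, 6.1,  9.7]

stat, p = wilcoxon(cxcl6_tumor, cxcl6_normal)   # paired, two-sided by default
print(f"Wilcoxon statistic = {stat}, p = {p:.3f}")
```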
Bergt, Stefan; Wagner, Nana-Maria; Heidrich, Manja; Butschkau, Antje; Nöldge-Schomburg, Gabriele E F; Vollmar, Brigitte; Roesner, Jan P
2013-11-01
Toll-like receptors (TLRs) play a crucial role in early host defense against microorganisms. Toll-like receptor 2 (TLR2) polymorphisms have a prevalence of 10%; functional defects of TLR2 are associated with higher susceptibility toward gram-positive bacteria, and TLR2 deficiency has been associated with an impaired adrenal stress response. In the present study, we compared endogenous corticosterone production of wild-type (WT) and TLR2-deficient (TLR2) mice and analyzed survival after hydrocortisone therapy during sepsis induced by cecal ligation and puncture (CLP). Male C57BL/6J (WT); and B6.129-Tlr2tm1Kir/J (TLR2) mice were subjected to CLP or sham operation and randomly assigned to postoperative treatment with either hydrocortisone (5 mg/kg) or vehicle (n = 10 mice/group). Survival was documented for an observation period of 48 h. Endogenous corticosterone production following hydrocortisone treatment and lipoteichoic acid (LTA) exposure, interleukin 6 (IL-6) and IL-1β plasma levels, and blood counts were determined following sham operation or CLP using another n = 5 mice/group. Statistical analysis was performed using analysis of variance/Bonferroni. TLR2 mice exhibited a lack of suppression and an attenuated increase in endogenous corticosterone production following hydrocortisone or LTA treatment, respectively. After CLP, TLR2 mice exhibited an uncompromised adrenal stress response, higher IL-6 levels, and increased survival compared with WT controls (75 vs. 35%; P < 0.05). Hydrocortisone therapy of TLR2 mice completely abolished this advantage (decrease in survival to 45%, P < 0.05 vs. vehicle-treated TLR2 mice) and was associated with decreased IL-1β plasma concentrations. Toll-like receptor 2 deficiency is associated with an uncompromised adrenal stress response and increased survival rates during polymicrobial sepsis. Hydrocortisone treatment increases mortality of septic TLR2 mice, suggesting that hydrocortisone therapy might be harmful for individuals with functional TLR2 polymorphisms.
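The ANOVA/Bonferroni approach mentioned above can be sketched as a one-way ANOVA across the four treatment groups followed by pairwise t-tests with Bonferroni-adjusted p-values; the corticosterone-like values and group labels below are invented placeholders.

```python
from itertools import combinations
from scipy.stats import f_oneway, ttest_ind
from statsmodels.stats.multitest import multipletests

groups = {
    "WT_vehicle":   [310, 290, 275, 330, 305],
    "WT_hc":        [260, 240, 280, 255, 270],
    "TLR2_vehicle": [390, 410, 370, 405, 385],
    "TLR2_hc":      [300, 280, 310, 295, 305],
}

print("one-way ANOVA p =", f_oneway(*groups.values()).pvalue)

pairs = list(combinations(groups, 2))
raw_p = [ttest_ind(groups[a], groups[b]).pvalue for a, b in pairs]
adj_p = multipletests(raw_p, method="bonferroni")[1]
for (a, b), p in zip(pairs, adj_p):
    print(f"{a} vs {b}: Bonferroni-adjusted p = {p:.3f}")
```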
Norton, Nadine; Fox, Nicholas; McCarl, Christie-Ann; Tenner, Kathleen S; Ballman, Karla; Erskine, Courtney L; Necela, Brian M; Northfelt, Donald; Tan, Winston W; Calfa, Carmen; Pegram, Mark; Colon-Otero, Gerardo; Perez, Edith A; Clynes, Raphael; Knutson, Keith L
2018-06-14
Resected HER2 breast cancer patients treated with adjuvant trastuzumab and chemotherapy have superior survival compared to patients treated with chemotherapy alone. We previously showed that trastuzumab and chemotherapy induce HER2-specific antibodies which correlate with improved survival in HER2 metastatic breast cancer patients. It remains unclear whether the generation of immunity required trastuzumab and whether endogenous antibody immunity is associated with improved disease-free survival in the adjuvant setting. In this study, we addressed this question by analyzing serum anti-HER2 antibodies from a subset of patients enrolled in the NCCTG trial N9831, which includes an arm (Arm A) in which trastuzumab was not used. Arms B and C received trastuzumab sequentially or concurrently to chemotherapy, respectively. Pre- and post-treatment initiation sera were obtained from 50 women enrolled in N9831. Lambda IgG antibodies (to avoid detection of trastuzumab) to HER2 were measured and compared between arms and with disease-free survival. Prior to therapy, across all three arms, N9831 patients had similar mean anti-HER2 IgG levels. Following treatment, the mean levels of antibodies increased in the trastuzumab arms but not the chemotherapy-only arm. The proportion of patients who demonstrated antibodies increased by 4% in Arm A and by 43% in Arms B and C combined (p = 0.003). Cox modeling demonstrated that larger increases in antibodies were associated with improved disease-free survival in all patients (HR = 0.23; p = 0.04). These results show that the increased endogenous antibody immunity observed in adjuvant patients treated with combination trastuzumab and chemotherapy is clinically significant, in view of its correlation with improved disease-free survival. The findings may have important implications for predicting treatment outcomes in patients treated with trastuzumab in the adjuvant setting. ClinicalTrials.gov, NCT00005970. Registered on July 5, 2000.
Movement and survival of an amphibian in relation to sediment and culvert design
Honeycutt, R.K; Lowe, W.H.; Hossack, Blake R.
2016-01-01
Habitat disturbance from stream culverts can affect aquatic organisms by increasing sedimentation or forming barriers to movement. Land managers are replacing many culverts to reduce these negative effects, primarily for stream fishes. However, these management actions are likely to have broad implications for many organisms, including amphibians in small streams. To assess the effects of culverts on movement and survival of the Idaho giant salamander (Dicamptodon aterrimus), we used capture-mark-recapture surveys and measured sediment in streams with 2 culvert types (i.e., unimproved culverts, improved culverts) and in streams without culverts (i.e., reference streams). We predicted culverts would increase stream sediment levels, limit movement, and reduce survival of Idaho giant salamanders. We also determined the effect of sediment levels on survival of salamanders because although sediment is often associated with distribution and abundance of stream amphibians, links with vital rates remain unclear. To estimate survival, we used a spatial Cormack–Jolly–Seber (CJS) model that explicitly incorporated information on movement, eliminating bias in apparent survival estimated from traditional (i.e., non-spatial) CJS models caused by permanent emigration beyond the study area. To demonstrate the importance of using spatial data in studies of wildlife populations, we compared estimates from the spatial CJS to estimates of apparent survival from a traditional CJS model. Although high levels of sediment reduced survival of salamanders, culvert type was unrelated to sediment levels or true survival of salamanders. Across all streams, we documented only 15 movement events between study reaches. All movement events were downstream, and they occurred disproportionately in 1 stream, which precluded measuring the effect of culvert design on movement. Although movement was low overall, the variance among streams was high enough to bias estimates of apparent survival compared to true survival. Our results suggest that where sedimentation occurs from roads and culverts, survival of the Idaho giant salamander could be reduced. Though culverts clearly do not completely block downstream movements of Idaho giant salamanders, the degree to which culvert improvements affect movements under roads in comparison to unimproved culverts remains unclear, especially for rare, but potentially important, upstream movements.
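The bias the spatial CJS model corrects for can be seen with simple arithmetic: a traditional CJS "apparent" survival estimate is roughly the product of true survival and the probability of remaining in the study area, so any permanent emigration pushes it below true survival. The numbers below are illustrative only, not estimates from the salamander study.

```python
# Why emigration biases apparent survival low relative to true survival.
true_survival = 0.80      # probability an individual survives the interval (assumed)
site_fidelity = 0.90      # probability it remains within the surveyed reaches (assumed)

apparent_survival = true_survival * site_fidelity
bias = true_survival - apparent_survival
print(f"apparent survival = {apparent_survival:.2f} (bias of {bias:.2f} below true survival)")
```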
Association of Low-Dose Aspirin and Survival of Women With Endometrial Cancer.
Matsuo, Koji; Cahoon, Sigita S; Yoshihara, Kosuke; Shida, Masako; Kakuda, Mamoru; Adachi, Sosuke; Moeini, Aida; Machida, Hiroko; Garcia-Sayre, Jocelyn; Ueda, Yutaka; Enomoto, Takayuki; Mikami, Mikio; Roman, Lynda D; Sood, Anil K
2016-07-01
To examine the survival outcomes in women with endometrial cancer who were taking low-dose aspirin (81-100 mg/d). A multicenter retrospective study was conducted examining patients with stage I-IV endometrial cancer who underwent hysterectomy-based surgical staging between January 2000 and December 2013 (N=1,687). Patient demographics, medical comorbidities, medication types, tumor characteristics, and treatment patterns were correlated to survival outcomes. A Cox proportional hazard regression model was used to estimate adjusted hazard ratio for disease-free and disease-specific overall survival. One hundred fifty-eight patients (9.4%, 95% confidence interval [CI] 8.8-11.9) were taking low-dose aspirin. Median follow-up time for the study cohort was 31.5 months. One hundred twenty-seven patients (7.5%) died of endometrial cancer. Low-dose aspirin use was significantly correlated with concurrent obesity, hypertension, diabetes mellitus, and hypercholesterolemia (all P<.001). Low-dose aspirin users were more likely to take other antihypertensive, antiglycemic, and anticholesterol agents (all P<.05). Low-dose aspirin use was not associated with histologic subtype, tumor grade, nodal metastasis, or cancer stage (all P>.05). On multivariable analysis, low-dose aspirin use remained an independent prognostic factor associated with an improved 5-year disease-free survival rate (90.6% compared with 80.9%, adjusted hazard ratio 0.46, 95% CI 0.25-0.86, P=.014) and disease-specific overall survival rate (96.4% compared with 87.3%, adjusted hazard ratio 0.23, 95% CI 0.08-0.64, P=.005). The increased survival effect noted with low-dose aspirin use was greatest in patients whose age was younger than 60 years (5-year disease-free survival rates, 93.9% compared with 84.0%, P=.013), body mass index was 30 or greater (92.2% compared with 81.4%, P=.027), who had type I cancer (96.5% compared with 88.6%, P=.029), and who received postoperative whole pelvic radiotherapy (88.2% compared with 61.5%, P=.014). These four factors remained significant for disease-specific overall survival (all P<.05). Our results suggest that low-dose aspirin use is associated with improved survival outcomes in women with endometrial cancer, especially in those who are young, obese, with low-grade disease, and who receive postoperative radiotherapy.
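A hedged sketch of a Cox proportional hazards model for disease-free survival with low-dose aspirin as a covariate, in the spirit of the analysis above. The tiny DataFrame and its column names are hypothetical, not the multicenter dataset, and adjusted hazard ratios here would be meaningless beyond illustrating the workflow.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "months":  [12, 40, 60, 18, 55, 30, 70, 24, 48, 36],   # follow-up time (toy values)
    "recurred": [1, 0, 0, 1, 0, 1, 0, 1, 0, 1],             # 1 = recurrence/progression
    "low_dose_aspirin": [1, 1, 1, 0, 1, 0, 1, 0, 0, 0],
    "age": [55, 62, 48, 70, 58, 66, 52, 74, 60, 65],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="recurred")
cph.print_summary()   # exp(coef) column gives the adjusted hazard ratios
```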
Kovacs, Alexander; Vadeboncoeur, Tyler F; Stolz, Uwe; Spaite, Daniel W; Irisawa, Taro; Silver, Annemarie; Bobrow, Bentley J
2015-07-01
We evaluated the association between chest compression release velocity (CCRV) and outcomes after out-of-hospital cardiac arrest (OHCA). CPR quality was measured using a defibrillator with accelerometer-based technology (E Series, ZOLL Medical) during OHCA resuscitations by 2 EMS agencies in Arizona between 10/2008 and 06/2013. All non-EMS-witnessed adult (≥ 18 years) arrests of presumed cardiac etiology were included. The association between mean CCRV (assessed as an appropriate measure of central tendency) and both survival to hospital discharge and neurologic outcome (Cerebral Performance Category score = 1 or 2) was analyzed using multivariable logistic regression to control for known and potential confounders and multiple imputation to account for missing data. 981 OHCAs (median age 68 years, 65% male, 11% survival to discharge) were analyzed with 232 (24%) missing CPR quality data. All-rhythms survival varied significantly with CCRV [fast (≥ 400 mm/s) = 18/79 (23%); moderate (300-399.9 mm/s) = 50/416 (12%); slow (<300 mm/s) = 17/255 (7%); p < 0.001], as did favorable neurologic outcome [fast = 14/79 (18%); moderate = 43/415 (10%); slow = 11/255 (4%); p < 0.001]. Fast CCRV was associated with increased survival compared to slow [adjusted odds ratio (aOR) 4.17 (95% CI: 1.61, 10.82)] and moderate CCRV [aOR 3.08 (1.39, 6.83)]. Fast CCRV was also associated with improved favorable neurologic outcome compared to slow [4.51 (1.57, 12.98)]. There was a 5.2% increase in the adjusted odds of survival for each 10 mm/s increase in CCRV [aOR 1.052 (1.001, 1.105)]. CCRV was independently associated with improved survival and favorable neurologic outcome at hospital discharge after adult OHCA. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
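The reported aOR of 1.052 per 10 mm/s can be rescaled to other increments if one assumes, as the fitted logistic model does, that the log-odds are linear in CCRV; the short calculation below shows that rescaling (the 100 mm/s figure is an extrapolation under that assumption, not a result from the paper).

```python
import math

aor_per_10 = 1.052                       # adjusted OR per 10 mm/s, from the abstract
beta_per_mm_s = math.log(aor_per_10) / 10
print(f"per 1 mm/s OR   = {math.exp(beta_per_mm_s):.4f}")
print(f"per 100 mm/s OR = {math.exp(100 * beta_per_mm_s):.3f}")   # ~1.66 under this model
```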
Racial disparities in advanced-stage colorectal cancer survival.
Wallace, Kristin; Hill, Elizabeth G; Lewin, David N; Williamson, Grace; Oppenheimer, Stephanie; Ford, Marvella E; Wargovich, Michael J; Berger, Franklin G; Bolick, Susan W; Thomas, Melanie B; Alberg, Anthony J
2013-03-01
African-Americans (AA) have a higher incidence of and lower survival from colorectal cancer (CRC) compared with European Americans (EA). In the present study, statewide, population-based data from South Carolina Central Cancer Registry are used to investigate the relationship between race and age on advanced-stage CRC survival. The study population was comprised of 3,865 advanced pathologically documented colon and rectal adenocarcinoma cases diagnosed between 01 January 1996 and 31 December 2006: 2,673 (69 %) EA and 1,192 (31 %) AA. Kaplan-Meier methods were used to generate median survival time and corresponding 95 % confidence intervals (CI) by race, age, and gender. Factors associated with survival were evaluated by fitting Cox proportional hazards regression models to generate hazard ratios (HR) and 95 % CI. We observed a significant interaction between race and age on CRC survival (p = 0.04). Among younger patients (<50 years), AA race was associated with a 1.34 times (95 % CI 1.06-1.71) higher risk of death compared with EA. Among older patients, we observed a modest increase in risk of death among AA men compared with EA [HR 1.16 (95 % CI 1.01-1.32)] but no difference by race between women [HR 0.94 (95 % CI 0.82-1.08)]. Moreover, we observed that the disparity in survival has worsened over the past 15 years. Future studies that integrate clinical, molecular, and treatment-related data are needed for advancing understanding of the racial disparity in CRC survival, especially for those <50 years old.
Halliday, Jane; Collins, Veronica; Riley, Merilyn; Youssef, Danielle; Muggli, Evelyne
2009-01-01
With this study we aimed to compare survival rates for children with Down syndrome in 2 time periods, 1 before prenatal screening (1988-1990) and 1 contemporaneous with screening (1998-2000), and to examine the frequency of comorbidities and their influence on survival rates. Record-linkage was performed between the population-based Victorian Birth Defects Register and records of deaths in children up to 15 years of age collected under the auspice of the Consultative Council on Obstetric and Pediatric Mortality and Morbidity. Cases of Down syndrome were coded according to the presence or absence of comorbidities by using the International Classification of Diseases, Ninth Revision classification of birth defects. Kaplan-Meier survival functions and log rank tests for equality of survival distributions were performed. Of infants liveborn with Down syndrome in 1998-2000, 90% survived to 5 years of age, compared with 86% in the earlier cohort. With fetal deaths excluded, the proportion of isolated Down syndrome cases in the earlier cohort was 48.7% compared with 46.1% in the most recent cohort. In 1988-1990 there was at least 1 cardiac defect in 41.1% of cases and in 45.4% in 1998-2000. There was significant variation in survival rates for the different comorbidity groupings in the 1988-1990 cohort, but this was not so evident in the 1998-2000 cohort. Survival of children with Down syndrome continues to improve, and there is an overall survival figure of 90% to at least 5 years of age. It is clear from this study that prenatal screening technologies are not differentially ascertaining fetuses with Down syndrome and additional defects, because there has been no proportional increase in births of isolated cases with Down syndrome.
Survival of metastatic colorectal cancer patients treated with chemotherapy in Alberta (1995-2004).
Chen, Yiqun; Qiu, Zhenguo; Kamruzzaman, Anmmd; Snodgrass, Tom; Scarfe, Andrew; Bryant, Heather E
2010-02-01
Clinical trials have suggested that advances in chemotherapy significantly improve the survival of patients with metastatic colorectal cancer. Comparable evidence from clinical practice is scarce. This study aims to investigate the survival of patients with metastatic colorectal cancer treated with chemotherapy in Alberta, Canada. Trends of relative survival of patients diagnosed in 1994-2003 were assessed using Alberta Cancer Registry (ACR) data. The median overall survival (OS) of patients diagnosed in 2004 was determined by linking Cancer Registry data with Electronic Medical Records (EMR). Cox regression models were fitted to calculate the hazard ratio for patients treated with chemotherapy. The 2-year relative survival for patients with metastatic colorectal cancer who received chemotherapy increased significantly from 29% to 41% over the 10 years (1994-2003, p < 0.015). A 69% reduction in the risk of mortality was observed in the 168 patients who received chemotherapy compared to the 87 patients who did not, after adjusting for age, gender, and number of metastases. The median OS of patients who received chemotherapy was 17.5 months. This is comparable to the 18-20 months seen in recently published clinical trials, considering the patients in this study were from the real clinical practice, nearly half of them were older than 70, and many of them might have important co-morbidities. The survival of patients diagnosed with metastatic colorectal cancer in Alberta has improved in recent years; this is most likely attributable in large part to the use of chemotherapy.
Polanco, Patricio M; Ding, Ying; Knox, Jordan M; Ramalingam, Lekshmi; Jones, Heather; Hogg, Melissa E; Zureikat, Amer H; Holtzman, Matthew P; Pingpank, James; Ahrendt, Steven; Zeh, Herbert J; Bartlett, David L; Choudry, Haroon A
2016-02-01
High-grade (HG) mucinous appendiceal neoplasms (MAN) have a worse prognosis than low-grade histology. Our objective was to assess the safety and efficacy of cytoreductive surgery with hyperthermic intraperitoneal chemoperfusion (CRS/HIPEC) in patients with high-grade, high-volume (HG-HV) peritoneal metastases in whom the utility of this aggressive approach is controversial. Prospectively collected perioperative data were compared between patients with peritoneal metastases from HG-HV MAN, defined as simplified peritoneal cancer index (SPCI) ≥12, and those with high-grade, low-volume (HG-LV; SPCI <12) disease. Kaplan-Meier curves and multivariate Cox regression models identified prognostic factors affecting oncologic outcomes. Overall, 54 patients with HG-HV and 43 with HG-LV peritoneal metastases underwent CRS/HIPEC. The HG-HV group had longer operative time, increased blood loss/transfusion, and increased intensive care unit length of stay (p < 0.05). Incomplete macroscopic cytoreduction (CC-1/2/3) was higher in the HG-HV group compared with the HG-LV group (68.5 vs. 32.6 %; p = 0.005). Patients with HG-HV disease demonstrated worse survival than those with HG-LV disease (overall survival [OS] 17 vs. 42 m, p = 0.009; time to progression (TTP) 10 vs. 14 m, p = 0.024). However, when complete macroscopic resection (CC-0) was achieved, the OS and progression-free survival of patients with HG-HV disease were comparable with HG-LV disease (OS 56 vs. 52 m, p = 0.728; TTP 20 vs. 19 m, p = 0.393). In a multivariate Cox proportional hazard regression model, CC-0 resection was the only significant predictor of improved survival for patients with HG-HV disease. Although patients with HG-HV peritoneal metastases from MAN have worse prognosis compared with patients with HG-LV disease, their survival is comparable when complete macroscopic cytoreduction is achieved.
Zhu, Lin; Qualls, Whitney A; Marshall, John M; Arheart, Kris L; DeAngelis, Donald L; McManus, John W; Traore, Sekou F; Doumbia, Seydou; Schlein, Yosef; Müller, Günter C; Beier, John C
2015-02-05
Agent-based modelling (ABM) has been used to simulate mosquito life cycles and to evaluate vector control applications. However, most models lack sugar-feeding and resting behaviours or are based on mathematical equations lacking individual level randomness and spatial components of mosquito life. Here, a spatial individual-based model (IBM) incorporating sugar-feeding and resting behaviours of the malaria vector Anopheles gambiae was developed to estimate the impact of environmental sugar sources and resting sites on survival and biting behaviour. A spatial IBM containing An. gambiae mosquitoes and humans, as well as the village environment of houses, sugar sources, resting sites and larval habitat sites was developed. Anopheles gambiae behaviour rules were attributed at each step of the IBM: resting, host seeking, sugar feeding and breeding. Each step represented one second of time, and each simulation was set to run for 60 days and repeated 50 times. Scenarios of different densities and spatial distributions of sugar sources and outdoor resting sites were simulated and compared. When the number of natural sugar sources was increased from 0 to 100 while the number of resting sites was held constant, mean daily survival rate increased from 2.5% to 85.1% for males and from 2.5% to 94.5% for females, mean human biting rate increased from 0 to 0.94 bites per human per day, and mean daily abundance increased from 1 to 477 for males and from 1 to 1,428 for females. When the number of outdoor resting sites was increased from 0 to 50 while the number of sugar sources was held constant, mean daily survival rate increased from 77.3% to 84.3% for males and from 86.7% to 93.9% for females, mean human biting rate increased from 0 to 0.52 bites per human per day, and mean daily abundance increased from 62 to 349 for males and from 257 to 1120 for females. All increases were significant (P < 0.01). Survival was greater when sugar sources were randomly distributed in the whole village compared to clustering around outdoor resting sites or houses. Increases in densities of sugar sources or outdoor resting sites significantly increase the survival and human biting rates of An. gambiae mosquitoes. Survival of An. gambiae is more supported by random distribution of sugar sources than clustering of sugar sources around resting sites or houses. Density and spatial distribution of natural sugar sources and outdoor resting sites modulate vector populations and human biting rates, and thus malaria parasite transmission.
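The individual-based model described above steps each mosquito through behavioural states in a simulated village environment. The toy Python sketch below is not the published model; it only shows the general mechanics of such a simulation, with invented rules and parameters, and illustrates why adding sugar sources can raise both survival and biting.

```python
# Toy individual-based sketch (not the published model): each mosquito loses
# energy daily, may refuel at a sugar source, and may bite while alive.
# All rules and parameters are invented for illustration only.
import random

def simulate(n_mosquitoes=500, n_sugar_sources=50, days=60, seed=1):
    random.seed(seed)
    p_find_sugar = min(1.0, 0.01 * n_sugar_sources)  # more sources -> easier refuelling
    p_bite = 0.05                                    # daily biting chance for a live mosquito
    energy = [3] * n_mosquitoes                      # days of reserves per mosquito
    bites = 0
    for _ in range(days):
        for i in range(n_mosquitoes):
            if energy[i] <= 0:
                continue                             # dead mosquitoes do nothing
            if random.random() < p_find_sugar:
                energy[i] = 3                        # sugar meal restores reserves
            else:
                energy[i] -= 1                       # unfed mosquitoes run down reserves
            if energy[i] > 0 and random.random() < p_bite:
                bites += 1
    survivors = sum(1 for e in energy if e > 0)
    return survivors / n_mosquitoes, bites / days

for sources in (0, 10, 50, 100):
    surviving_fraction, bites_per_day = simulate(n_sugar_sources=sources)
    print(f"sugar sources={sources:3d}  surviving fraction={surviving_fraction:.2f}  bites/day={bites_per_day:.1f}")
```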
Effects of electrical stimulation in the treatment of osteonecrosis of the femoral head.
Fornell, Salvador; Ribera, Juan; Mella, Mario; Carranza, Andrés; Serrano-Toledano, David; Domecq, Gabriel
2017-10-16
The aim of this study was to examine whether the use of an internal electrostimulator could improve the results obtained with core decompression alone in the treatment of osteonecrosis of the femoral head. We performed a retrospective study of 41 patients (55 hips) treated for osteonecrosis of the femoral head between 2005 and 2014. Mean follow-up time was 56 (12-108) months. We recorded 3 parameters: time to recurrence of pain, time to conversion to arthroplasty and time to radiographic failure. Survival was estimated using the Kaplan-Meier method. The equality of the survival distributions was assessed with the log-rank test. An implanted electrostimulator was associated with increased hip survival at a preoperative Steinberg stage of II or below, whereas survival was unchanged at stage III or higher. The addition of an internal electrostimulator provides increased survival compared to core decompression alone at stages below III.
Roberts, Tracy E
1998-01-01
Objective: To compare the resource implications and short term outcomes of extracorporeal membrane oxygenation and conventional management for term babies with severe respiratory failure. Design: Cost effectiveness evaluation alongside a randomised controlled trial. Setting: 55 approved recruiting hospitals in the United Kingdom. These hospitals provided conventional management, but infants randomised to extracorporeal membrane oxygenation were transferred to one of five specialist centres. Subjects: 185 mature newborn infants (gestational age at birth >35 weeks, birth weight >2 kg) with severe respiratory failure (oxygenation index >40) recruited between 1993 and 1995. The commonest diagnoses were persistent pulmonary hypertension due to meconium aspiration, congenital diaphragmatic hernia, isolated persistent fetal circulation, sepsis, and idiopathic respiratory distress syndrome. Main outcome measure: Cost effectiveness based on survival at 1 year of age without severe disability. Results: 63 (68%) of the 93 infants randomised to extracorporeal membrane oxygenation survived to 1 year compared with 38 (41%) of the 92 infants who received conventional management. Of those that survived, one infant in each arm was lost to follow up and the proportion with disability at 1 year was similar in the two arms of the trial. One child in each arm had severe disability. The estimated additional cost of extracorporeal membrane oxygenation per additional surviving infant without severe disability was £51 222 and the cost per surviving infant with no disability was £75 327. Conclusions: Extracorporeal membrane oxygenation for term neonates with severe respiratory failure would increase overall survival without disability. Although the policy will increase costs of neonatal health care, it is likely to be as cost effective as other life extending technologies. Key messages: Extracorporeal membrane oxygenation increases survival for term neonates in respiratory failure. The technique was three times more costly than conventional management. If extracorporeal membrane oxygenation is adopted it will increase the cost of neonatal health care. Extracorporeal membrane oxygenation may be as cost effective as other life extending technologies, but long term follow up studies are needed to confirm this. PMID:9756807
Twenty-Five Year Survival of Children with Intellectual Disability in Western Australia.
Bourke, Jenny; Nembhard, Wendy N; Wong, Kingsley; Leonard, Helen
2017-09-01
To investigate survival up to early adulthood for children with intellectual disability and compare their risk of mortality with that of children without intellectual disability. This was a retrospective cohort study of all live births in Western Australia between January 1, 1983 and December 31, 2010. Children with an intellectual disability (n = 10 593) were identified from the Western Australian Intellectual Disability Exploring Answers Database. Vital status was determined from linkage to the Western Australian Mortality database. Kaplan-Meier product limit estimates and 95% CIs were computed by level of intellectual disability. Hazard ratios (HRs) and 95% CIs were calculated from Cox proportional hazard regression models adjusting for potential confounders. After adjusting for potential confounders, compared with those without intellectual disability, children with intellectual disability had a 6-fold increased risk of mortality at 1-5 years of age (adjusted HR [aHR] = 6.0, 95%CI: 4.8, 7.6), a 12-fold increased risk at 6-10 years of age (aHR = 12.6, 95% CI: 9.0, 17.7) and a 5-fold increased risk at 11-25 years of age (aHR = 4.9, 95% CI: 3.9, 6.1). Children with severe intellectual disability were at even greater risk. No difference in survival was observed for Aboriginal children with intellectual disability compared with non-Aboriginal children with intellectual disability. Although children with intellectual disability experience higher mortality at all ages compared with those without intellectual disability, the greatest burden is for those with severe intellectual disability. However, even children with mild to moderate intellectual disability have increased risk of death compared with unaffected children. Copyright © 2017 Elsevier Inc. All rights reserved.
Qin, Rosie; Olson, Adam; Singh, Bhavana
Purpose: Ipilimumab and radiation therapy (RT) are standard treatments for advanced melanoma; preclinical models suggest the potential for synergy. However, limited clinical information exists regarding safety and optimal timing of the combination. Methods and Materials: We reviewed the records of consecutive patients with unresectable stage 3 or 4 melanoma treated with ipilimumab. Patients were categorized as having received RT or not. Differences were estimated between these 2 cohorts. Results: We identified 88 patients treated with ipilimumab. At baseline, the ipilimumab-plus-RT group (n=44) had more unfavorable characteristics. Despite this, overall survival, progression-free survival, and both immune-related and non–immune-related toxicity were not statistically different (P=.67). Patients who received ipilimumab before RT had an increased duration of irradiated tumor response compared with patients receiving ipilimumab after RT (74.7% vs 44.8% at 12 months; P=.01, log-rank test). In addition, patients receiving ablative RT had non–statistically significantly improved median overall survival (19.6 vs 10.2 months), as well as 6-month (95.1% vs 72.7%) and 12-month (79.7% vs 48.5%) survival rates, compared with those treated with conventionally fractionated RT. Conclusions: We found that both ablative and conventionally fractionated RT can be safely administered with ipilimumab without a clinically apparent increase in toxicity. Patients who received ipilimumab before RT had an increased duration of irradiated tumor response.
Does colour polymorphism enhance survival of prey populations?
Wennersten, Lena; Forsman, Anders
2009-01-01
That colour polymorphism may protect prey populations from predation is an old but rarely tested hypothesis. We examine whether colour polymorphic populations of prey exposed to avian predators in an ecologically valid visual context were exposed to increased extinction risk compared with monomorphic populations. We made 2976 artificial pastry prey, resembling Lepidoptera larvae, in four different colours and presented them in 124 monomorphic and 124 tetramorphic populations on tree trunks and branches such that they would be exposed to predation by free-living birds, and monitored their ‘survival’. Among monomorphic populations, there was a significant effect of prey coloration on survival, confirming that coloration influenced susceptibility to visually oriented predators. Survival of polymorphic populations was inferior to that of monomorphic green populations, but did not differ significantly from monomorphic brown, yellow or red populations. Differences in survival within polymorphic populations paralleled those seen among monomorphic populations; the red morph most frequently went extinct first and the green morph most frequently survived the longest. Our findings do not support the traditional protective polymorphism hypothesis and are in conflict with those of earlier studies. As a possible explanation to our findings, we offer a competing ‘giveaway cue’ hypothesis: that polymorphic populations may include one morph that attracts the attention of predators and that polymorphic populations therefore may suffer increased predation compared with some monomorphic populations. PMID:19324729
Kemna, Mariska; Albers, Erin; Bradford, Miranda C; Law, Sabrina; Permut, Lester; McMullan, D Mike; Law, Yuk
2016-03-01
The effect of donor-recipient sex matching on long-term survival in pediatric heart transplantation is not well known. Adult data have shown worse survival when male recipients receive a sex-mismatched heart, with conflicting results in female recipients. We analyzed 5795 heart transplant recipients ≤ 18 yr in the Scientific Registry of Transplant Recipients (1990-2012). Recipients were stratified based on donor and recipient sex, creating four groups: MM (N = 1888), FM (N = 1384), FF (N = 1082), and MF (N = 1441). Males receiving sex-matched donor hearts had increased unadjusted allograft survival at five yr (73.2 vs. 71%, p = 0.01). However, this survival advantage disappeared with longer follow-up and when adjusted for additional risk factors by multivariable Cox regression analysis. In contrast, for females, receiving a sex-mismatched heart was associated with an 18% higher risk of allograft loss over time compared to receiving a sex-matched heart (HR 1.18, 95% CI: 1.00-1.38) and a 26% higher risk compared to sex-matched male recipients (HR 1.26, 95% CI: 1.10-1.45). Females who receive a heart from a male donor appear to have a distinct long-term survival disadvantage compared to all other groups. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Trends in incidence and survival for anal cancer in New South Wales, Australia, 1972-2009.
Soeberg, Matthew J; Rogers, Kris; Currow, David C; Young, Jane M
2015-12-01
Little is known about the incidence and survival of anal cancer in New South Wales (NSW), Australia, as anal cancer cases are often grouped together with other colorectal cancers in descriptive epidemiological analyses. We studied patterns and trends in the incidence and survival of people diagnosed with anal cancer in NSW, Australia, 1972-2009 (n=2724). We also predicted anal cancer incidence in NSW during 2010-2032. Given the human papilloma virus-associated aetiology for most anal cancers, we quantified these changes over time in incidence and survival by histological subtype: anal squamous cell carcinoma (ASCC); and anal adenocarcinoma (AAC). There was a linear increase in incident anal cancer cases in NSW with an average annual percentage change (AAPC) of 1.6 (95% CI 1.1-2.0) such that, in combination with age-period-cohort modelling, we predict there will be 198 cases of anal cancer in the 2032 calendar year (95% CI 169-236). Almost all of these anal cancer cases are projected to be ASCC (94%). Survival improved over time regardless of histological subtype. However, five-year relative survival was substantially higher for people with ASCC (70% (95% CI 66-74%)) compared to AAC (51% (95% CI 43-59%)), a 37% difference. Survival was also greater for women (69% (95% CI 64-73%)) with ASCC compared to men (55% (95% CI 50-60%)). It was not possible to estimate survival by stage at diagnosis particularly given that 8% of all cases were recorded as having distant stage and 22% had missing stage data. Aetiological explanations, namely exposure to oncogenic types of human papillomavirus, along with demographic changes most likely explain the actual and projected increase in ASCC case numbers. Survival differences by gender and histological subtype point to areas where further research is warranted to improve treatment and outcomes for all anal cancer patients. Copyright © 2015 Elsevier Ltd. All rights reserved.
Demographic effects of canine parvovirus on a free-ranging wolf population over 30 years
Mech, L.D.; Goyal, S.M.; Paul, W.J.; Newton, W.E.
2008-01-01
We followed the course of canine parvovirus (CPV) antibody prevalence in a subpopulation of wolves (Canis lupus) in northeastern Minnesota from 1973, when antibodies were first detected, through 2004. Annual early pup survival was reduced by 70%, and wolf population change was related to CPV antibody prevalence. In the greater Minnesota population of 3,000 wolves, pup survival was reduced by 40-60%. This reduction limited the Minnesota wolf population rate of increase to about 4% per year compared with increases of 16-58% in other populations. Because it is young wolves that disperse, reduced pup survival may have caused reduced dispersal and reduced recolonization of new range in Minnesota. © Wildlife Disease Association 2008.
Bohorquez, H; Seal, J B; Cohen, A J; Kressel, A; Bugeaud, E; Bruce, D S; Carmody, I C; Reichman, T W; Battula, N; Alsaggaf, M; Therapondos, G; Bzowej, N; Tyson, G; Joshi, S; Nicolau-Raducu, R; Girgrah, N; Loss, G E
2017-08-01
Donation after circulatory death (DCD) liver transplantation (LT) reportedly yields inferior survival and increased complication rates compared with donation after brain death (DBD). We compare 100 consecutive DCD LT using a protocol that includes thrombolytic therapy (late DCD group) to an historical DCD group (early DCD group n = 38) and a cohort of DBD LT recipients (DBD group n = 435). Late DCD LT recipients had better 1- and 3-year graft survival rates than early DCD LT recipients (92% vs. 76.3%, p = 0.03 and 91.4% vs. 73.7%, p = 0.01). Late DCD graft survival rates were comparable to those of the DBD group (92% vs. 93.3%, p = 0.24 and 91.4% vs. 88.2%, p = 0.62). Re-transplantation occurred in 18.4% versus 1% for the early and late DCD groups, respectively (p = 0.001). Patient survival was similar in all three groups. Ischemic-type biliary lesions (ITBL) occurred in 5%, 3%, and 0.2% for early DCD, late DCD, and DBD groups, respectively, but unlike in the early DCD group, in the late DCD group ITBL was endoscopically managed and resolved in each case. Using a protocol that includes a thrombolytic therapy, DCD LT yielded patient and graft survival rates comparable to DBD LT. © 2017 The American Society of Transplantation and the American Society of Transplant Surgeons.
Schmidt, Morten; Pedersen, Susanne Bendesgaard; Farkas, Dóra Körmendiné; Hjortshøj, Søren Pihlkjær; Bøtker, Hans Erik; Nielsen, Jens Cosedis; Sørensen, Henrik Toft
2015-09-01
Long-term data on trends in the use of implantable cardioverter-defibrillators (ICDs) and associated outcomes are scarce. We examined 13-year nationwide trends in ICD implantation and survival rates in Denmark. Using medical databases, we identified all first-time ICD recipients in Denmark during 2000-2012 (N = 8460) and ascertained all-cause mortality. We computed standardized annual implantation rates and mortality rate ratios according to age, sex, comorbidity level, indication, and device type. The standardized annual implantation rate increased from 42 per million persons in 2000 to 213 per million persons in 2012 (from 34 to 174 for men and from 8 to 39 for women). The increase was driven by secondary prophylactic ICDs until 2006 and primary prophylactic ICDs thereafter. The increase occurred particularly in older patients and those with a high level of comorbidity. Independent of indication, 76% of all patients with ICD were alive after 5 years. Men had a higher mortality rate compared with women (mortality rate ratio 1.28; 95% confidence interval 1.10-1.49). Compared with low comorbidity level, moderate, severe, and very severe comorbidity levels were associated with 1.6-, 2.5-, and 4.9-fold increased mortality rates, respectively. The most influential individual comorbidities were heart failure, diabetes, liver disease, and renal disease. The annual implantation rate of ICDs increased 5-fold in Denmark between 2000 and 2012. The rate increase occurred for both men and women, but particularly in the elderly and patients with severe comorbidity. Five-year survival probability was high, but severe comorbidity and male sex were associated with shorter survival. Copyright © 2015 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.
Shimizu, Fumitaka; Muto, Satoru; Taguri, Masataka; Ieda, Takeshi; Tsujimura, Akira; Sakamoto, Yoshiro; Fujita, Kazuhiko; Okegawa, Takatsugu; Yamaguchi, Raizo; Horie, Shigeo
2017-05-01
To evaluate the clinical benefit of adjuvant platinum-based chemotherapy after radical cystectomy for muscle-invasive bladder cancer in routine clinical practice. The present observational study was carried out to compare the effectiveness of adjuvant chemotherapy versus observation post-radical cystectomy in patients with clinically muscle-invasive bladder cancer. Cancer-specific survival and overall survival between the adjuvant chemotherapy group and radical cystectomy alone group were compared using the Kaplan-Meier method and the log-rank test. After adjusting for background factors using propensity score weighting, differences in cancer-specific survival and overall survival between these two groups were compared. Subgroup analyses by pathological characteristics were carried out. In total, 322 patients were included in the present study. Of these, 23% received adjuvant chemotherapy post-radical cystectomy. Clinicopathological characteristics showed that patients in the adjuvant chemotherapy group were pathologically more advanced and were at higher risk than the radical cystectomy alone group. In the unadjusted population, although the difference was not statistically significant, the adjuvant chemotherapy group had lower overall survival (3-year overall survival; 61.5% vs 73.6%, HR 1.33, P = 0.243, log-rank test, adjuvant chemotherapy vs radical cystectomy alone). In the weighted propensity score analysis, although the difference was not statistically significant, the adjuvant chemotherapy group was superior to the radical cystectomy alone group (overall survival: HR 0.65, 95% CI 0.39-1.09, P = 0.099, log-rank test, adjuvant chemotherapy vs radical cystectomy alone). Subgroup analyses showed that adjuvant chemotherapy significantly reduced the hazard ratio for overall survival and cancer-specific survival in the ≥pT3, pN+, ly+ and v+ subgroups. Platinum-based adjuvant chemotherapy might be associated with increased cancer-specific survival and overall survival in patients with high-risk invasive bladder cancer. © 2017 The Japanese Urological Association.
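The propensity-score-weighted comparison described above can be sketched generically: model each patient's probability of receiving adjuvant chemotherapy from baseline covariates, form inverse-probability-of-treatment weights, and fit a weighted Cox model. The Python sketch below is a generic IPTW illustration on invented data (assuming scikit-learn and a lifelines version that accepts weights_col), not the authors' analysis:

```python
# Generic inverse-probability-of-treatment weighting (IPTW) sketch on invented
# data; assumes scikit-learn and lifelines (CoxPHFitter with weights_col).
# The toy dataset is far too small for a real analysis.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "adjuvant_chemo": [1, 0, 1, 0, 1, 0, 0, 1, 0, 0],
    "pt3_or_higher":  [1, 0, 1, 0, 1, 1, 0, 1, 0, 0],
    "node_positive":  [1, 0, 1, 0, 0, 1, 0, 1, 0, 0],
    "age":            [63, 71, 58, 75, 66, 69, 72, 61, 70, 68],
    "os_months":      [40, 55, 28, 60, 35, 12, 48, 50, 58, 22],
    "died":           [0, 0, 1, 0, 1, 1, 0, 0, 0, 1],
})

# 1) Propensity score: probability of receiving adjuvant chemotherapy
#    given baseline covariates.
covariates = ["pt3_or_higher", "node_positive", "age"]
model = LogisticRegression(max_iter=1000).fit(df[covariates], df["adjuvant_chemo"])
ps = model.predict_proba(df[covariates])[:, 1]

# 2) Inverse-probability-of-treatment weights (treated: 1/ps, untreated: 1/(1-ps)).
df["iptw"] = df["adjuvant_chemo"] / ps + (1 - df["adjuvant_chemo"]) / (1 - ps)

# 3) Weighted Cox model; exp(coef) for adjuvant_chemo approximates the hazard
#    ratio in the weighted (pseudo-balanced) population.
cph = CoxPHFitter()
cph.fit(df[["os_months", "died", "adjuvant_chemo", "iptw"]],
        duration_col="os_months", event_col="died",
        weights_col="iptw", robust=True)
print(cph.summary[["exp(coef)", "p"]])
```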
Loman, Zachary G.; Monroe, Adrian; Riffell, Samuel K.; Miller, Darren A.; Vilella, Francisco; Wheat, Bradley R.; Rush, Scott A.; Martin, James A.
2018-01-01
Switchgrass (Panicum virgatum) intercropping is a novel forest management practice for biomass production intended to generate cellulosic feedstocks within intensively managed loblolly pine‐dominated landscapes. These pine plantations are important for early‐successional bird species, as short rotation times continually maintain early‐successional habitat. We tested the efficacy of using community models compared to individual surrogate species models in understanding influences on nest survival. We analysed nest data to test for differences in habitat use for 14 bird species in plots managed for switchgrass intercropping and controls within loblolly pine (Pinus taeda) plantations in Mississippi, USA. We adapted hierarchical models using hyper‐parameters to incorporate information from both common and rare species to understand community‐level nest survival. This approach incorporates rare species that are often discarded due to low sample sizes, but can inform community‐level demographic parameter estimates. We illustrate use of this approach in generating both species‐level and community‐wide estimates of daily survival rates for songbird nests. We were able to include rare species with low sample size (minimum n = 5) to inform a hyper‐prior, allowing us to estimate effects of covariates on daily survival at the community level, then compare this with a single‐species approach using surrogate species. Using single‐species models, we were unable to generate estimates below a sample size of 21 nests per species. Community model species‐level survival and parameter estimates were similar to those generated by five single‐species models, with improved precision in community model parameters. Covariates of nest placement indicated that switchgrass at the nest site (<4 m) reduced daily nest survival, although intercropping at the forest stand level increased daily nest survival. Synthesis and applications. Community models represent a viable method for estimating community nest survival rates and effects of covariates while incorporating limited data for rarely detected species. Intercropping switchgrass in loblolly pine plantations slightly increased daily nest survival at the research plot scale (0.1 km²), although at a local scale (50 m²) switchgrass negatively influenced nest survival. A likely explanation is intercropping shifted community composition, favouring species with greater disturbance tolerance.
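The community model above uses hyper-parameters so that species with very few nests still inform, and are informed by, community-level daily survival estimates. The sketch below is not the authors' hierarchical Bayesian model; it is a much simpler empirical-Bayes stand-in that shrinks per-species daily nest survival toward a community mean using beta-binomial pseudo-counts, with invented counts and an assumed prior weight.

```python
# Empirical-Bayes beta-binomial shrinkage of per-species daily nest survival
# toward a community mean -- a simplified stand-in for a hierarchical
# community model. Species names, counts, and the prior weight are invented.
# For each species: exposure_days = total nest-days observed, failures = nests lost.
species = {
    "Indigo Bunting":       {"exposure_days": 800, "failures": 40},
    "Yellow-breasted Chat": {"exposure_days": 450, "failures": 30},
    "Rare species A":       {"exposure_days": 60,  "failures": 6},   # only a handful of nests
}

# Community-level prior: pooled daily failure rate plus a pseudo-exposure
# controlling how strongly sparse species shrink toward the community mean.
total_exposure = sum(s["exposure_days"] for s in species.values())
total_failures = sum(s["failures"] for s in species.values())
community_failure_rate = total_failures / total_exposure
prior_exposure = 200.0                       # assumed prior weight (tuning choice)
alpha0 = community_failure_rate * prior_exposure

for name, s in species.items():
    raw_dsr = 1 - s["failures"] / s["exposure_days"]
    shrunk_failure = (s["failures"] + alpha0) / (s["exposure_days"] + prior_exposure)
    shrunk_dsr = 1 - shrunk_failure
    print(f"{name:22s}  raw daily survival={raw_dsr:.3f}  shrunk daily survival={shrunk_dsr:.3f}")
```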
Suh, Charles P-C
2017-01-01
The boll weevil, Anthonomus grandis grandis Boheman (Coleoptera: Curculionidae), has been the most important pest of cotton (Gossypium spp.) wherever it occurs. Although eradication programs in the United States have reduced the range of this pest, the weevil remains an intractable problem in subtropical Texas, Mexico, and much of South America. A key to managing the weevil in the subtropics and tropics might lie in better understanding its diapause and overwintering survival in regions characterized by relatively high late-season temperatures. We examined the temporal patterns of acquisition of diapause characters at 18.3, 23.9, and 29.4°C, and the effects of temperature during the diapause-induction period on subsequent host-free survival at 23.9°C. Occurrence of the diapause characters generally increased with weevil age at all temperatures but appeared more rapidly at higher temperatures. Acquisition of the diapause characters tended to occur slightly earlier in female weevils compared with the male weevils. Despite the slower development of diapause characters at lower temperatures, when adult weevils were fed under low temperatures, subsequent host-free survival was enhanced. These results are consistent with reports of increased weevil survival with delayed entry into overwintering. Our findings also suggest that the potential host-free survival facilitated by diapause occurring in subtropical or tropical production regions may be reduced compared with dormancy developing in southern temperate regions. This reduced survival potential emphasizes the importance of a maximized host-free season and suggests that the late-season diapause spray intervals should be short enough to ensure that the number of dormant weevils developing in late-season cotton is minimized.
Antitoxin Treatment of Inhalation Anthrax: A Systematic Review
Huang, Eileen; Pillai, Satish K.; Bower, William A.; Hendricks, Katherine A.; Guarnizo, Julie T.; Hoyle, Jamechia D.; Gorman, Susan E.; Boyer, Anne E.; Quinn, Conrad P.; Meaney-Delman, Dana
2016-01-01
Concern about use of anthrax as a bioweapon prompted development of novel anthrax antitoxins for treatment. Clinical guidelines for the treatment of anthrax recommend antitoxin therapy in combination with intravenous antimicrobials; however, a large-scale or mass anthrax incident may exceed antitoxin availability and create a need for judicious antitoxin use. We conducted a systematic review of antitoxin treatment of inhalation anthrax in humans and experimental animals to inform antitoxin recommendations during a large-scale or mass anthrax incident. A comprehensive search of 11 databases and the FDA website was conducted to identify relevant animal studies and human reports: 28 animal studies and 3 human cases were identified. Antitoxin monotherapy at or shortly after symptom onset demonstrates increased survival compared to no treatment in animals. With early treatment, survival did not differ between antimicrobial monotherapy and antimicrobial-antitoxin therapy in nonhuman primates and rabbits. With delayed treatment, antitoxin-antimicrobial treatment increased rabbit survival. Among human cases, addition of antitoxin to combination antimicrobial treatment was associated with survival in 2 of the 3 cases treated. Despite the paucity of human data, limited animal data suggest that adjunctive antitoxin therapy may improve survival. Delayed treatment studies suggest improved survival with combined antitoxin-antimicrobial therapy, although a survival difference compared with antimicrobial therapy alone was not demonstrated statistically. In a mass anthrax incident with limited antitoxin supplies, antitoxin treatment of individuals who have not demonstrated a clinical benefit from antimicrobials, or those who present with more severe illness, may be warranted. Additional pathophysiology studies are needed, and a point-of-care assay correlating toxin levels with clinical status may provide important information to guide antitoxin use during a large-scale anthrax incident. PMID:26690378
Mortality in the Vertebroplasty Population
McDonald, Robert J.; Achenbach, Sara; Atkinson, Elizabeth; Gray, Leigh A.; Cloft, Harry J.; Melton, L. Joseph; Kallmes, David F.
2011-01-01
Purpose: Vertebroplasty is an effective treatment for painful compression fractures refractory to conservative management. Since there are limited data regarding the survival characteristics of this patient population, we compared the survival of a treated to an untreated vertebral fracture cohort to determine if vertebroplasty affects mortality rates. Materials and Methods: The survival of a treated cohort, comprising 524 vertebroplasty recipients with refractory osteoporotic vertebral compression fractures, was compared to a separate, historical cohort of 589 subjects with fractures not treated by vertebroplasty who were identified from the Rochester Epidemiology Project. Mortality was compared between cohorts using Cox proportional hazard models adjusting for age, gender, and Charlson indices of co-morbidity. Mortality was also correlated with pre-, peri-, and post-procedural clinical metrics (e.g., cement volume utilization, Roland-Morris Disability Questionnaire score, analog pain scales, frequency of narcotic use, and improvements in mobility) within the treated cohort. Results: Vertebroplasty recipients demonstrated 77% of the survival expected for individuals of similar age, ethnicity, and gender within the US population. When compared to individuals with both symptomatic and asymptomatic untreated vertebral fractures, vertebroplasty recipients retained a 17% greater mortality risk. However, when compared to symptomatic untreated vertebral fractures, vertebroplasty recipients had no increased mortality following adjustment for differences in age, sex and co-morbidity (HR 1.02; CI 0.82–1.25). In addition, no clinical metrics used to assess the efficacy of vertebroplasty were predictive of survival. Conclusion: Vertebroplasty recipients have mortality rates similar to individuals with untreated symptomatic fractures but worse mortality compared to those with asymptomatic vertebral fractures. PMID:21998109
Efird, Jimmy T; Griffin, William F; Gudimella, Preeti; O'Neal, Wesley T; Davies, Stephen W; Crane, Patricia B; Anderson, Ethan J; Kindell, Linda C; Landrine, Hope; O'Neal, Jason B; Alwair, Hazaim; Kypson, Alan P; Nifong, Wiley L; Chitwood, W Randolph
2015-09-01
Conditional survival is defined as the probability of surviving an additional number of years beyond that already survived. The aim of this study was to compute conditional survival in patients who received a robotically assisted, minimally invasive mitral valve repair procedure (RMVP). Patients who received RMVP with annuloplasty band from May 2000 through April 2011 were included. A 5- and 10-year conditional survival model was computed using a multivariable product-limit method. Non-smoking men (≤65 years) who presented in sinus rhythm had a 96% probability of surviving at least 10 years if they survived their first year following surgery. In contrast, recent female smokers (>65 years) with preoperative atrial fibrillation only had an 11% probability of surviving beyond 10 years if alive after one year post-surgery. In the context of an increasingly managed healthcare environment, conditional survival provides useful information for patients needing to make important treatment decisions, physicians seeking to select patients most likely to benefit long-term following RMVP, and hospital administrators needing to comparatively assess the life-course economic value of high-tech surgical procedures.
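Conditional survival as used above is the ratio of two points on the survival curve: the probability of surviving to year s + t given survival to year s is CS(t | s) = S(s + t) / S(s). A minimal sketch of that calculation from a Kaplan-Meier fit, assuming the lifelines package and invented follow-up data:

```python
# Conditional survival CS(t | s) = S(s + t) / S(s), read off a Kaplan-Meier
# curve; uses the `lifelines` package on invented follow-up data.
from lifelines import KaplanMeierFitter

years_followed = [0.5, 2, 3, 4, 5, 6, 7, 8, 9, 10, 10, 10, 10, 10]
died           = [1,   1, 0, 1, 0, 1, 0, 0, 1, 0,  0,  0,  0,  0]

kmf = KaplanMeierFitter().fit(years_followed, event_observed=died)

def conditional_survival(kmf, s, t):
    """P(survive to s + t | already survived to s)."""
    return float(kmf.predict(s + t)) / float(kmf.predict(s))

# e.g. probability of reaching 10 years given the patient is alive at 1 year
print(round(conditional_survival(kmf, s=1, t=9), 2))
```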
Su, Xiaowei W; Li, Xiao-Yuan; Banasr, Mounira; Koo, Ja Wook; Shahid, Mohammed; Henry, Brian; Duman, Ronald S
2009-10-01
Currently available antidepressants upregulate hippocampal neurogenesis and prefrontal gliogenesis after chronic administration, which could block or reverse the effects of stress. Allosteric alpha-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid receptor potentiators (ARPs), which have novel targets compared to current antidepressants, have been shown to have antidepressant properties in neurogenic and behavioral models. This study analyzed the effect of the ARP Org 26576 on the proliferation, survival, and differentiation of neurons and glia in the hippocampus and prelimbic cortex of adult rats. Male Sprague-Dawley rats received acute (single day) or chronic (21 day) twice-daily intraperitoneal injections of Org 26576 (1-10 mg/kg). Bromodeoxyuridine (BrdU) immunohistochemistry was conducted 24 h or 28 days after the last drug injection for the analysis of cell proliferation or survival, respectively. Confocal immunofluorescence analysis was used to determine the phenotype of surviving cells. Acute administration of Org 26576 did not increase neuronal cell proliferation. However, chronic administration of Org 26576 increased progenitor cell proliferation in dentate gyrus (approximately 40%) and in prelimbic cortex (approximately 35%) at the 10-mg/kg dosage. Cells born in response to chronic Org 26576 in dentate gyrus exhibited increased rates of survival (approximately 30%) with the majority of surviving cells expressing a neuronal phenotype. Findings suggest that Org 26576 may have antidepressant properties, which may be attributed, in part, to upregulation of hippocampal neurogenesis and prelimbic cell proliferation.
Di, W; Jia, M X; Xu, J; Li, B L; Liu, Y
Reactive oxygen species (ROS)-induced oxidative damage is responsible for viability loss in plant tissues following cryopreservation. Antioxidants may improve viability by preventing or repairing the injury. This work aimed at studying the effect of catalase (CAT) and pyruvate dehydrogenase (PDH), which are involved in ROS metabolism and are differentially expressed during pollen cryopreservation, on the cryopreservation of Dendrobium nobile Lindl. 'Hamana Lake Dream' protocorm-like bodies (PLBs). Different concentrations of exogenous CAT or PDH were added at the loading, PVS2 treatment, and unloading steps during vitrification-cryopreservation of PLBs. Their survival and regeneration were evaluated and correlated with physiological oxidative indexes. PLB survival increased significantly when CAT and PDH were added separately to the unloading solution at a suitable concentration. CAT at 400 U·ml⁻¹ increased PLB survival and regeneration by 33.5 and 14.6 percent, respectively. It had no impact on the production of superoxide anion radical (·O₂⁻) and on superoxide dismutase (SOD) activity, but it reduced the hydrogen peroxide (H₂O₂) and malondialdehyde (MDA) contents and enhanced ascorbic acid (AsA) and endogenous CAT levels compared to PLBs cryopreserved using the standard vitrification protocol (CK1). PDH at 0.1 U·ml⁻¹ significantly improved PLB survival (by 2.5 percent), but it had no marked effect on regeneration compared to the CK1 group. It induced the same variations in ·O₂⁻, AsA and endogenous CAT levels that were observed following CAT addition. However, PDH did not affect the H₂O₂ and MDA content but significantly increased SOD activity. These results indicate that the addition of 400 U·ml⁻¹ CAT and 0.1 U·ml⁻¹ PDH at the unloading step increased survival of cryopreserved PLBs and that this improvement was associated with scavenging of H₂O₂ and the repair of oxidative damage. Exogenous CAT also significantly improved PLB regeneration after cryopreservation, while PDH had no obvious effect. The effect of exogenous CAT on PLB survival and regeneration was stronger than that of PDH, which may be due to the increase in SOD activity caused by PDH addition.
Adipose-Derived Mesenchymal Stem Cell Administration Does Not Improve Corneal Graft Survival Outcome
Fuentes-Julián, Sherezade; Arnalich-Montiel, Francisco; Jaumandreu, Laia; Leal, Marina; Casado, Alfonso; García-Tuñon, Ignacio; Hernández-Jiménez, Enrique; López-Collazo, Eduardo; De Miguel, Maria P.
2015-01-01
The effect of local and systemic injections of mesenchymal stem cells derived from adipose tissue (AD-MSC) into rabbit models of corneal allograft rejection with either normal-risk or high-risk vascularized corneal beds was investigated. The models we present in this study are more similar to human corneal transplants than previously reported murine models. Our aim was to prevent transplant rejection and increase the length of graft survival. In the normal-risk transplant model, in contrast to our expectations, the injection of AD-MSC into the graft junction during surgery resulted in the induction of increased signs of inflammation such as corneal edema with increased thickness, and a higher level of infiltration of leukocytes. This process led to a lower survival of the graft compared with the sham-treated corneal transplants. In the high-risk transplant model, in which immune ocular privilege was undermined by the induction of neovascularization prior to graft surgery, we found the use of systemic rabbit AD-MSCs prior to surgery, during surgery, and at various time points after surgery resulted in a shorter survival of the graft compared with the non-treated corneal grafts. Based on our results, local or systemic treatment with AD-MSCs to prevent corneal rejection in rabbit corneal models at normal or high risk of rejection does not increase survival but rather can increase inflammation and neovascularization and break the innate ocular immune privilege. This result can be partially explained by the immunomarkers, lack of immunosuppressive ability and immunophenotypical secretion molecules characterization of AD-MSC used in this study. Parameters including the risk of rejection, the inflammatory/vascularization environment, the cell source, the time of injection, the immunosuppression, the number of cells, and the mode of delivery must be established before translating the possible benefits of the use of MSCs in corneal transplants to clinical practice. PMID:25730319
Jacobs, R.M.; Boyce, J.T.; Kociba, G.J.
This study demonstrates the potential usefulness of a flow cytometric technique to measure platelet survival time in cats utilizing autologous platelets labeled in vitro with fluorescein isothiocyanate (FITC). When compared with a 51Cr method, no significant differences in estimated survival times were found. Both the 51Cr and FITC-labeling procedures induced similar changes in platelet shape and collagen-induced aggregation. Platelets labeled with FITC had significantly greater volumes compared with those of glutaraldehyde-fixed platelets. These changes were primarily related to the platelet centrifugation and washing procedures rather than the labels themselves. This novel technique potentially has wide applicability to cell circulation time studies as flow cytometry equipment becomes more readily available. Problems with the technique are discussed. In a preliminary study of the platelet survival time in feline leukemia virus (FeLV)-infected cats, two of three cats had significantly reduced survival times using both flow cytometric and radioisotopic methods. These data suggest increased platelet turnover in FeLV-infected cats.
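Platelet survival time in studies like the one above is estimated from the disappearance of the labelled platelet fraction over successive blood samples. The sketch below is a deliberately crude linear-disappearance estimate (real analyses typically use multiple-hit or weighted-mean models), with invented sampling days and fractions:

```python
# Crude sketch of estimating platelet survival time from the disappearance of
# labelled platelets: fit a line to the labelled fraction over time and take
# its x-intercept. Real studies use more elaborate models; the sampling days
# and fractions below are invented.
import numpy as np

days = np.array([0, 1, 2, 3, 4, 5, 6], dtype=float)
labelled_fraction = np.array([1.00, 0.85, 0.69, 0.55, 0.41, 0.26, 0.12])

slope, intercept = np.polyfit(days, labelled_fraction, 1)
survival_time_days = -intercept / slope          # where the fitted line hits zero
print(f"estimated platelet survival time ≈ {survival_time_days:.1f} days")
```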
Shen, Rong; Liu, Hongliang; Wen, Juyi; Liu, Zhensheng; Wang, Li-E; Wang, Qiming; Tan, Dongfeng; Ajani, Jaffer A; Wei, Qingyi
2015-09-01
Thymidylate synthase (TYMS) plays a crucial role in folate metabolism as well as DNA synthesis and repair. We hypothesized that functional polymorphisms in the 3' UTR of TYMS are associated with gastric cancer risk and survival. In the present study, we tested our hypothesis by genotyping three potentially functional (at miRNA binding sites) TYMS SNPs (rs16430 6bp del/ins, rs2790 A>G and rs1059394 C>T) in 379 gastric cancer patients and 431 cancer-free controls. Compared with the rs16430 6bp/6bp + 6bp/0bp genotypes, the 0bp/0bp genotype was associated with significantly increased gastric cancer risk (adjusted OR = 1.72, 95% CI = 1.15-2.58). Similarly, rs2790 GG and rs1059394 TT genotypes were also associated with significantly increased risk (adjusted OR = 2.52, 95% CI = 1.25-5.10 and adjusted OR = 1.57, 95% CI = 1.04-2.35, respectively), compared with AA + AG and CC + CT genotypes, respectively. In the haplotype analysis, the T-G-0bp haplotype was associated with significantly increased gastric cancer risk, compared with the C-A-6bp haplotype (adjusted OR = 1.34, 95% CI = 1.05-1.72). Survival analysis revealed that rs16430 0bp/0bp and rs1059394 TT genotypes were also associated with poor survival in gastric cancer patients who received chemotherapy treatment (adjusted HR = 1.61, 95% CI = 1.05-2.48 and adjusted HR = 1.59, 95% CI = 1.02-2.48, respectively). These results suggest that these three variants in the miRNA binding sites of TYMS may be associated with cancer risk and survival of gastric cancer patients. Larger population studies are warranted to verify these findings. © 2014 Wiley Periodicals, Inc.
Merz, Maximilian; Jansen, Lina; Castro, Felipe A; Hillengass, Jens; Salwender, Hans; Weisel, Katja; Scheid, Christof; Luttmann, Sabine; Emrich, Katharina; Holleczek, Bernd; Katalinic, Alexander; Nennecke, Alice; Straka, Christian; Langer, Christian; Engelhardt, Monika; Einsele, Hermann; Kröger, Nicolaus; Beelen, Dietrich; Dreger, Peter; Brenner, Hermann; Goldschmidt, Hartmut
2016-07-01
The aim of this study was to determine the value of upfront autologous transplantation (ASCT) in elderly patients (60-79 years) with myeloma. We analysed relative survival (RS) of patients diagnosed in 1998-2011 and treated with ASCT within 12 months after diagnosis in Germany (n = 3591; German Registry of Stem Cell Transplantation) and compared RS with survival of myeloma patients diagnosed in the same years in Germany (n = 13,903; population-based German Cancer Registries). Utilisation of ASCT has increased rapidly between 2000-2002 and 2009-2011 (60-64 years: 7.0-43.0%; 65-69 years: 6.6-23.7%; 70-79 years: 0.4-4.0%). Comparison of 5-year RS of patients from the general German myeloma population who have survived the first year after diagnosis with 5-year RS of patients treated with ASCT revealed higher survival for transplanted patients among all age groups (60-64: 59.2% versus 66.1%; 65-69: 57.4% versus 61.7%; 70-79: 51.0% versus 56.6%). RS increased strongly between 2003-2005 and 2009-2011 for the general German myeloma population (+8.5%) and for patients treated with ASCT (+11.8%). Differences in RS between these groups increased over time from +1.9% higher age-standardised survival in transplanted patients in 2003-2005 to 5.2% higher survival in 2009-2011. We conclude that upfront ASCT might be a major contributor to improved survival for elderly myeloma patients in Germany. Copyright © 2016 Elsevier Ltd. All rights reserved.
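Relative survival, the measure used above, divides the observed survival of the patient cohort by the survival expected for a demographically matched group from general-population life tables. A minimal illustration with invented numbers:

```python
# Relative survival = observed survival of the patient cohort divided by the
# expected survival of a matched general population (from life tables).
# All numbers below are invented for illustration.
observed_5yr_survival = 0.52     # proportion of the patient cohort alive at 5 years
expected_5yr_survival = 0.88     # life-table survival for matched age/sex/calendar year

relative_survival = observed_5yr_survival / expected_5yr_survival
print(f"5-year relative survival ≈ {relative_survival:.1%}")   # ≈ 59.1%
```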
Kruse, M A; Holmes, E S; Balko, J A; Fernandez, S; Brown, D C; Goldschmidt, M H
2013-07-01
Osteosarcoma is the most common bone tumor in dogs. However, current literature focuses primarily on appendicular osteosarcoma. This study examined the prognostic value of histological and clinical factors in flat and irregular bone osteosarcomas and hypothesized that clinical factors would have a significant association with survival time while histological factors would not. All osteosarcoma biopsy samples of the vertebra, rib, sternum, scapula, or pelvis were reviewed while survival information and clinical data were obtained from medical records, veterinarians, and owners. Forty-six dogs were included in the analysis of histopathological variables and 27 dogs with complete clinical data were included in the analysis of clinical variables. In the histopathologic Cox regression model, there was no significant association between any histologic feature of osteosarcoma, including grade, and survival time. In the clinical Cox regression model, there was a significant association between the location of the tumor and survival time as well as between the percent elevation of alkaline phosphatase (ALP) above normal and survival time. Controlling for ALP elevation, dogs with osteosarcoma located in the scapula had a significantly greater hazard of death (hazard ratio 2.8) compared to dogs with tumors in other locations. Controlling for tumor location, every 100% increase in ALP from normal increased the hazard of death 1.7-fold. For canine osteosarcomas of the flat and irregular bones, histopathological features, including grade, do not appear to be rigorous predictors of survival. Clinical variables such as increased ALP levels and tumor location in the scapula were associated with decreased survival times.
Bock, Matthew J; Pahl, Elfriede; Rusconi, Paolo G; Boyle, Gerard J; Parent, John J; Twist, Clare J; Kirklin, James K; Pruitt, Elizabeth; Bernstein, Daniel
2017-08-01
We aimed to determine whether malignancy after pediatric HTx for ACM affects overall post-HTx survival. Patients <18y listed for HTx for ACM in the PHTS database between 1993 and 2014 were compared to those with DCM. A 2:1 matched DCM cohort was also compared. Wait-list and post-HTx survival, along with freedom from common HTx complications, were compared. Eighty subjects were listed due to ACM, whereas 1985 were listed for DCM. Although wait-list survival was higher in the ACM group, post-HTx survival was lower for the ACM cohort. Neither difference persisted in the matched cohort analysis. Primary cause of death in the ACM group was infection, which was higher than the DCM group. Malignancy rates were not different. All ACM malignancies were due to PTLD without primary cancer recurrence or SMN. Long-term graft survival after pediatric HTx for ACM is no different than for matched DCM peers, nor is there an increased risk of any malignancy. However, risk of infection and death from infection after HTx are higher in the ACM group. Further studies are needed to assess the effects of prior chemotherapy on susceptibility to infection in this group. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Yang, Ya’nan; Yin, Xue; Sheng, Lei; Xu, Shan; Dong, Lingling; Liu, Lian
2015-01-01
To clarify the effect of neoadjuvant chemotherapy (NAC) on the survival outcomes of operable gastric cancers, we searched PubMed, Embase, and Cochrane Library for randomized clinical trials published until June 2014 that compared NAC-containing strategies with NAC-free strategies in patients with adenocarcinoma of the stomach or the esophagogastric junction, who had undergone potentially curative resection. The adjusted pooled hazard ratio (HR) for overall survival (OS) was not statistically significant when comparing the NAC-containing arm with the NAC-free arm. Subgroup analysis showed that the OS of the treatment arm that involved both adjuvant chemotherapy (AC) and NAC was significantly improved over the control arm (AC only) (HR = 0.48, 95% CI: 0.35–0.67; P < 0.001). In contrast, NAC alone plus surgery did not show any survival benefit over surgery alone. Perioperative chemotherapy (PC) also showed a significant increase in PFS and a significant reduction in distant metastasis compared to surgery alone. Therefore, in patients with resectable gastric cancer, neither NAC alone nor AC alone is sufficient to clearly improve OS. Collectively, PC combined with surgery could maximize the survival benefit for patients with resectable gastric cancer. PMID:26242393
Castleberry, A W; Güller, U; Tarantino, I; Berry, M F; Brügger, L; Warschkow, R; Cerny, T; Mantyh, C R; Candinas, D; Worni, M
2014-06-01
Recently, multiple clinical trials have demonstrated improved outcomes in patients with metastatic colorectal cancer. This study investigated if the improved survival is race dependent. Overall and cancer-specific survival of 77,490 White and Black patients with metastatic colorectal cancer from the 1988-2008 Surveillance Epidemiology and End Results registry were compared using unadjusted and multivariable adjusted Cox proportional hazard regression as well as competing risk analyses. Median age was 69 years, 47.4 % were female and 86.0 % White. Median survival was 11 months overall, with an overall increase from 8 to 14 months between 1988 and 2008. Overall survival increased from 8 to 14 months for White, and from 6 to 13 months for Black patients. After multivariable adjustment, the following parameters were associated with better survival: White, female, younger, better educated and married patients, patients with higher income and living in urban areas, patients with rectosigmoid junction and rectal cancer, undergoing cancer-directed surgery, having well/moderately differentiated, and N0 tumors (p < 0.05 for all covariates). Discrepancies in overall survival based on race did not change significantly over time; however, there was a significant decrease of cancer-specific survival discrepancies over time between White and Black patients with a hazard ratio of 0.995 (95 % confidence interval 0.991-1.000) per year (p = 0.03). A clinically relevant overall survival increase was found from 1988 to 2008 in this population-based analysis for both White and Black patients with metastatic colorectal cancer. Although both White and Black patients benefitted from this improvement, a slight discrepancy between the two groups remained.
Cold Temperatures Increase Cold Hardiness in the Next Generation Ophraella communa Beetles
Zhou, Zhong-Shi; Rasmann, Sergio; Li, Min; Guo, Jian-Ying; Chen, Hong-Song; Wan, Fang-Hao
2013-01-01
The leaf beetle, Ophraella communa, has been introduced to control the spread of the common ragweed, Ambrosia artemisiifolia, in China. We hypothesized that the beetle, to be able to track host-range expansion into colder climates, can phenotypically adapt to cold temperatures across generations. Therefore, we asked whether parental experience of colder temperatures increases cold tolerance of the progeny. Specifically, we studied the demography, including development, fecundity, and survival, as well as physiological traits, including supercooling point (SCP), water content, and glycerol content, of O. communa progeny whose parents were maintained under different temperature regimes. Overall, survival across the entire immature stage decreased by about 0.2%–4.2% when parents had experienced cold temperatures, compared with control individuals obtained from parents raised at room temperature. However, the intrinsic capacity for increase (r), net reproductive rate (R0) and finite rate of increase (λ) of O. communa progeny were highest when parents had experienced cold temperatures. Glycerol content of both female and male progeny was significantly higher when maternal and paternal adults were cold acclimated as compared to other treatments. This resulted in the supercooling point of the progeny adults being significantly lower compared to beetles emerging from parents that experienced room temperature. These results suggest that cold hardiness of O. communa can be promoted by cold acclimation in the previous generation, and that this might counterbalance reduced survival in the next generation, especially when insects are tracking their host plants into colder climates. PMID:24098666
Obstetric acute renal failure 1956-1987.
Turney, J H; Ellis, C M; Parsons, F M
1989-06-01
A total of 142 women with severe acute renal failure (ARF) resulting from obstetric causes were treated by dialysis at a single centre from 1956 to 1987. One-year survival was 78.6%, which compares favourably with other causes of ARF. Abortion, haemorrhage and pre-eclampsia accounted for 95% of cases, with survival being best (82.9%) after abortion. Survival was adversely affected by increasing age. Acute cortical necrosis (12.7% of patients) carried 100% mortality after 6 years. Follow-up of survivors showed normal renal function up to 31 years following ARF; 25-year patient survival was 71.6%. Improvements in obstetric care and the disappearance of illegal abortions have resulted in a dramatic decline in the incidence of obstetric ARF.
Second Primary Malignant Neoplasms and Survival in Adolescent and Young Adult Cancer Survivors.
Keegan, Theresa H M; Bleyer, Archie; Rosenberg, Aaron S; Li, Qian; Goldfarb, Melanie
2017-11-01
Although the increased incidence of second primary malignant neoplasms (SPMs) is a well-known late effect after cancer, few studies have compared survival after an SPM to survival of the same cancer occurring as first primary malignant neoplasm (PM) by age. The objective was to assess the survival impact of SPMs in adolescents and young adults (AYAs) (15-39 years) compared with that of pediatric (<15 years) and older adult (≥40 years) patients with the same SPMs. This was a population-based, retrospective cohort study of patients with cancer in 13 Surveillance, Epidemiology and End Results regions in the United States diagnosed from 1992 to 2008 and followed through 2013. Data analysis was performed between June 2016 and January 2017. Five-year relative survival was calculated overall and for each cancer occurring as a PM or SPM by age at diagnosis. The impact of SPM status on cancer-specific death was examined using multivariable Cox proportional hazards regression. A total of 15 954 pediatric, 125 750 AYAs, and 878 370 older adult patients diagnosed as having 14 cancers occurring as a PM or SPM were included. Overall, 5-year survival after an SPM was 33.1% lower for children, 20.2% lower for AYAs, and 8.3% lower for older adults compared with a PM at the same age. For the most common SPMs in AYAs, 5-year survival was lower by an absolute 42% for secondary non-Hodgkin lymphoma, 19% for secondary breast carcinoma, 15% for secondary thyroid carcinoma, and 13% for secondary soft-tissue sarcoma. Survival by SPM status was significantly worse in younger vs older patients for thyroid, Hodgkin lymphoma, non-Hodgkin lymphoma, acute myeloid leukemia, soft-tissue sarcoma, and central nervous system cancer. Adolescents and young adults with secondary Hodgkin lymphoma (hazard ratio [95% CI], 3.5 [1.7-7.1]); soft-tissue sarcoma (2.8 [2.1-3.9]); breast carcinoma (2.1 [1.8-2.4]); acute myeloid leukemia (1.9 [1.5-2.4]); and central nervous system cancer (1.8 [1.2-2.8]) experienced worse survival compared with AYAs with the same PMs. The adverse impact of SPMs on survival is substantial for AYAs and may partially explain the relative lack of survival improvement in AYAs compared with other age groups. The impact of a particular SPM diagnosis on survival may inform age-specific prevention, screening, treatment, and survivorship recommendations.
A realistic appraisal of methods to enhance desiccation tolerance of entomopathogenic nematodes.
Perry, Roland N; Ehlers, Ralf-Udo; Glazer, Itamar
2012-06-01
Understanding the desiccation survival attributes of infective juveniles of entomopathogenic nematodes (EPN) of the genera Steinernema and Heterorhabditis is central to evaluating realistically the prospects of enhancing the shelf-life and field persistence of commercial formulations. Early work on the structural and physiological aspects of desiccation survival focused on the role of the molted cuticle in controlling the rate of water loss and the importance of energy reserves, particularly neutral lipids. The accumulation of trehalose was also found to enhance desiccation survival. Isolation of natural populations that can survive harsh environments, such as deserts, indicated that some populations have enhanced abilities to survive desiccation. However, survival abilities of EPN are limited compared with those of some species of plant-parasitic nematodes inhabiting aerial parts of plants. Research on EPN stress tolerance has expanded along two main lines: i) selecting strains of species currently in commercial use that have increased tolerance to environmental extremes; and ii) utilizing molecular information, including expressed sequence tags and genome sequence data, to determine the underlying genetic factors that control longevity and stress tolerance of EPN. However, given the inherent limitations of EPN survival ability, it is likely that improved formulation will be the major factor to enhance EPN longevity and, perhaps, increase the range of applications.
Falkowski-Temporini, Gislaine Janaina; Lopes, Carina Ribeiro; Massini, Paula Fernanda; Brustolin, Camila Fernanda; Ferraz, Fabiana Nabarro; Sandri, Patricia Flora; Hernandes, Luzmarina; Aleixo, Denise Lessa; Barion, Terezinha Fátima; Esper, Luiz Gilson; de Araújo, Silvana Marques
2017-09-01
Recent evidence implicates apoptosis as a defense against Trypanosoma cruzi infection, which promotes a host immune response mediated by type 1, 2 and 17 T cells. Currently, no medicine completely prevents the progression of this disease. We investigated the immunological and apoptotic effects, morbidity and survival of mice infected with T. cruzi and treated with dynamized homeopathic compounds 13c: Kalium causticum (GCaus), Conium maculatum (GCon), Lycopodium clavatum (GLy) and a 7% alcohol solution (control, compound vehicle, GCI). Apoptosis increased significantly in the treated groups compared with GCI, which might indicate an action of the compounds on these cells. Infected animals treated with Lycopodium clavatum performed better than the other groups: GLy showed a higher proportion of hepatocytes and splenocytes undergoing apoptosis, a higher number of apoptotic bodies in the liver, a predominant Th1 response, increased TNF-α and decreased IL-6, higher survival, lower morbidity, higher water consumption and body temperature, and a tendency toward higher feed intake and weight gain compared with GCI. Conium maculatum had worse results, with an increased Th2 response, increased IL-4, and worsening of the infection with early mortality of the animals. Together, these data suggest that highly diluted medicines modulate the immune response and apoptosis, affecting the morbidity of animals infected with a highly virulent strain of T. cruzi and potentially minimizing the course of infection, thereby providing additional alternative approaches to the treatment of Chagas disease. Copyright © 2017 Elsevier Ltd. All rights reserved.
Powell, Brian D; Saxon, Leslie A; Boehmer, John P; Day, John D; Gilliam, F Roosevelt; Heidenreich, Paul A; Jones, Paul W; Rousseau, Matthew J; Hayes, David L
2013-10-29
This study sought to determine if the risk of mortality associated with inappropriate implantable cardioverter-defibrillator (ICD) shocks is due to the underlying arrhythmia or the shock itself. Shocks delivered from ICDs are associated with an increased risk of mortality. It is unknown if all patients who experience inappropriate ICD shocks have an increased risk of death. We evaluated survival outcomes in patients with an ICD and a cardiac resynchronization therapy defibrillator enrolled in the LATITUDE remote monitoring system (Boston Scientific Corp., Natick, Massachusetts) through January 1, 2010. First shock episode rhythms from 3,809 patients who acutely survived the initial shock were adjudicated by 7 electrophysiologists. Patients with a shock were matched to patients without a shock (n = 3,630) by age at implant, implant year, sex, and device type. The mean age of the study group was 64 ± 13 years, and 78% were male. Compared with no shock, there was an increased rate of mortality in those who received their first shock for monomorphic ventricular tachycardia (hazard ratio [HR]: 1.65, p < 0.0001), ventricular fibrillation/polymorphic ventricular tachycardia (HR: 2.10, p < 0.0001), and atrial fibrillation/flutter (HR: 1.61, p = 0.003). In contrast, mortality after first shocks due to sinus tachycardia and supraventricular tachycardia (HR: 0.97, p = 0.86) and noise/artifact/oversensing (HR: 0.91, p = 0.76) was comparable to that in patients without a shock. Compared with no shock, those who received their first shock for ventricular rhythms and atrial fibrillation had an increased risk of death. There was no significant difference in survival after inappropriate shocks for sinus tachycardia or noise/artifact/oversensing. In this study, the adverse prognosis after first shock appears to be more related to the underlying arrhythmia than to an adverse effect from the shock itself. Copyright © 2013 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
Developmental and Evolutionary History Affect Survival in Stressful Environments
Hopkins, Gareth R.; Brodie, Edmund D.; French, Susannah S.
2014-01-01
The world is increasingly impacted by a variety of stressors that have the potential to differentially influence life history stages of organisms. Organisms have evolved to cope with some stressors, while with others they have little capacity. It is thus important to understand the effects of both developmental and evolutionary history on survival in stressful environments. We present evidence of the effects of both developmental and evolutionary history on survival of a freshwater vertebrate, the rough-skinned newt (Taricha granulosa) in an osmotically stressful environment. We compared the survival of larvae in either NaCl or MgCl2 that were exposed to salinity either as larvae only or as embryos as well. Embryonic exposure to salinity led to greater mortality of newt larvae than larval exposure alone, and this reduced survival probability was strongly linked to the carry-over effect of stunted embryonic growth in salts. Larval survival was also dependent on the type of salt (NaCl or MgCl2) the larvae were exposed to, and was lowest in MgCl2, a widely-used chemical deicer that, unlike NaCl, amphibian larvae do not have an evolutionary history of regulating at high levels. Both developmental and evolutionary history are critical factors in determining survival in this stressful environment, a pattern that may have widespread implications for the survival of animals increasingly impacted by substances with which they have little evolutionary history. PMID:24748021
Brenner, Hermann; Castro, Felipe A; Eberle, Andrea; Emrich, Katharina; Holleczek, Bernd; Katalinic, Alexander; Jansen, Lina
2016-01-01
The proportion of cases notified by death certificate only (DCO) is a commonly used data quality indicator in studies comparing cancer survival across regions and over time. We aimed to assess dependence of DCO proportions on the age structure of cancer patients. Using data from a national cancer survival study in Germany, we determined age specific and overall (crude) DCO proportions for 24 common forms of cancer. We then derived overall (crude) DCO proportions expected in case of shifts of the age distribution of the cancer populations by 5 and 10 years, respectively, assuming age specific DCO proportions to remain constant. Median DCO proportions across the 24 cancers were 2.4, 3.7, 5.5, 8.5 and 23.9% in age groups 15-44, 45-54, 55-64, 65-74, and 75+, respectively. A decrease of ages by 5 and 10 years resulted in decreases of cancer specific crude DCO proportions ranging from 0.4 to 4.8 and from 0.7 to 8.6 percent units, respectively. Conversely, an increase of ages by 5 and 10 years led to increases of cancer specific crude DCO proportions ranging from 0.8 to 4.8 and from 1.8 to 9.6 percent units, respectively. These changes were of similar magnitude (but in opposite direction) as changes in crude 5-year relative survival resulting from the same shifts in age distribution. The age structure of cancer patient populations has a substantial impact on DCO proportions. DCO proportions should therefore be age adjusted in comparative studies on cancer survival across regions and over time. Copyright © 2015 Elsevier Ltd. All rights reserved.
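The dependence of crude DCO proportions on age structure can be made concrete with a direct age-standardization, sketched in Python below. The age-specific proportions are the medians quoted above; the cohort age structure and the reference weights are invented for illustration.

import numpy as np

age_groups  = ["15-44", "45-54", "55-64", "65-74", "75+"]
dco_prop    = np.array([0.024, 0.037, 0.055, 0.085, 0.239])  # age-specific DCO proportions (medians above)
n_patients  = np.array([500, 900, 1800, 2500, 2300])         # hypothetical cohort age structure
ref_weights = np.array([0.10, 0.15, 0.25, 0.28, 0.22])       # hypothetical standard age distribution

crude   = np.sum(dco_prop * n_patients) / np.sum(n_patients)
age_std = np.sum(dco_prop * ref_weights) / np.sum(ref_weights)

print(f"crude DCO proportion:            {crude:.1%}")
print(f"age-standardized DCO proportion: {age_std:.1%}")
# Shifting n_patients toward older ages raises the crude proportion even though
# the age-specific proportions are unchanged, which is the effect quantified above.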
Kurian, Allison W; Canchola, Alison J; Gomez, Scarlett L; Clarke, Christina A
2016-11-01
Nipple-sparing mastectomy, which may improve cosmesis, body image, and sexual function in comparison to non-nipple-sparing mastectomy, is increasingly used to treat early-stage breast cancer; however, long-term survival data are lacking. We evaluated survival after nipple-sparing mastectomy versus non-nipple-sparing mastectomy in a population-based cancer registry. We conducted an observational study using the California Cancer Registry, considering all stage 0-III breast cancers diagnosed in California from 1988 to 2013. We compared breast cancer-specific and overall survival time after nipple-sparing versus non-nipple-sparing mastectomy, using multivariable analysis. Among 157,592 stage 0-III female breast cancer patients treated with unilateral mastectomy from 1988-2013, 993 (0.6 %) were reported as having nipple-sparing and 156,599 (99.4 %) non-nipple-sparing mastectomies; median follow-up was 7.9 years. The proportion of mastectomies that were nipple-sparing increased over time (1988, 0.2 %; 2013, 5.1 %) and with neighborhood socioeconomic status, and decreased with age and stage. On multivariable analysis, nipple-sparing mastectomy was associated with a lower risk of breast cancer-specific mortality compared to non-nipple-sparing mastectomy [hazard ratio (HR) 0.71, 95 % confidence interval (CI) 0.51-0.98]. However, when restricting to diagnoses 1996 or later and adjusting for a larger set of covariates, risk was attenuated (HR 0.86, 95 % CI 0.52-1.42). Among California breast cancer patients diagnosed from 1988-2013, nipple-sparing mastectomy was not associated with worse survival than non-nipple-sparing mastectomy. These results may inform the decisions of patients and doctors deliberating between these surgical approaches for breast cancer treatment.
Dijk, F Nicole; McKay, Karen; Barzi, Federica; Gaskin, Kevin J; Fitzgerald, Dominic A
2011-12-01
Newborn screening (NBS) for cystic fibrosis (CF) is associated with improved early nutritional outcomes and improved spirometry in children. The aim of this study was to determine whether early diagnosis and treatment of CF following the introduction of NBS in New South Wales in 1981 led to better clinical outcomes and survival into early adulthood. This was a retrospective observational study comprising two original cohorts born in the 3 years before ('non-screened cohort', n=57) and after ('screened'; n=60) the introduction of NBS. Patient records were assessed at transfer from paediatric to adult care by age 19 years and survival was documented to age 25 years. Non-screened patients (n=38), when compared with screened patients (n=41), had a higher rate and lower age of Pseudomonas aeruginosa acquisition by age 18 years (p ≤ 0.01). Height, weight and body mass index (BMI) z scores (all p<0.01) and forced expiratory volume in 1 s (FEV1%) were better in the screened group (n=41) (difference: 16.7 ± 6.4%; p=0.01) compared to non-screened (n=38) subjects on transfer to adult care. Each 1% increase in FEV1% was associated with a 3% (95% CI 1% to 5%; p=0.001) decrease in risk of death and each 1.0 kg/m² increase in BMI contributed to a 44% (95% CI 31% to 55%; p<0.001) decrease in risk of death. This culminated in a significant survival difference at age 25 years (25 vs 13 deaths or lung transplants; p=0.01). NBS for CF leads to better lung function, nutritional status and improved survival in screened patients in early adulthood.
Adabag, Selcuk; Hodgson, Lucinda; Garcia, Santiago; Anand, Vidhu; Frascone, Ralph; Conterato, Marc; Lick, Charles; Wesley, Keith; Mahoney, Brian; Yannopoulos, Demetris
2017-01-01
Despite many advances in resuscitation science the outcomes of sudden cardiac arrest (SCA) remain poor. The Minnesota Resuscitation Consortium (MRC) is a statewide integrated resuscitation program, established in 2011, to provide standardized, evidence-based resuscitation and post-resuscitation care. The objective of this study is to assess the outcomes of a state-wide integrated resuscitation program. We examined the trends in resuscitation metrics and outcomes in Minnesota since 2011 and compared these to the results from the national Cardiac Arrest Registry to Enhance Survival (CARES) program. Since 2011 MRC has expanded significantly providing service to >75% of Minnesota's population. A total of 5192 SCA occurred in counties covered by MRC from 2011 to 2014. In this period, bystander cardiopulmonary resuscitation (CPR) and use of hypothermia, automatic CPR device and impedance threshold device increased significantly (p<0.0001 for all). Compared to CARES, SCA cases in Minnesota were more likely to be ventricular fibrillation (31% vs. 23%, p<0.0001) but less likely to receive bystander CPR (33% vs. 39%, p<0.0001). Survival to hospital discharge with good or moderate cerebral performance (12% vs. 8%, p<0.0001), survival in SCA with a shockable rhythm (Utstein survival) (38% vs. 33%, p=0.0003) and Utstein survival with bystander CPR (44% vs. 37%, p=0.003) were greater in Minnesota than CARES. State-wide integration of resuscitation services in Minnesota was feasible. Survival rate after cardiac arrest is greater in Minnesota compared to the mean survival rate in CARES. Published by Elsevier Ireland Ltd.
Jung, Enjae; Perrone, Erin E; Brahmamdan, Pavan; McDonough, Jacquelyn S; Leathersich, Ann M; Dominguez, Jessica A; Clark, Andrew T; Fox, Amy C; Dunne, W Michael; Hotchkiss, Richard S; Coopersmith, Craig M
2013-01-01
World conditions place large populations at risk from ionizing radiation (IR) from detonation of dirty bombs or nuclear devices. In a subgroup of patients, ionizing radiation exposure would be followed by a secondary infection. The effects of radiation combined injury are potentially more lethal than either insult in isolation. The purpose of this study was to determine mechanisms of mortality and possible therapeutic targets in radiation combined injury. Mice were exposed to IR with 2.5 Gray (Gy) followed four days later by intratracheal methicillin-resistant Staphylococcus aureus (MRSA). While either IR or MRSA alone yielded 100% survival, animals with radiation combined injury had 53% survival (p = 0.01). Compared to IR or MRSA alone, mice with radiation combined injury had increased gut apoptosis, local and systemic bacterial burden, decreased splenic CD4 T cells, CD8 T cells, B cells, NK cells, and dendritic cells, and increased BAL and systemic IL-6 and G-CSF. In contrast, radiation combined injury did not alter lymphocyte apoptosis, pulmonary injury, or intestinal proliferation compared to IR or MRSA alone. In light of the synergistic increase in gut apoptosis following radiation combined injury, transgenic mice that overexpress Bcl-2 in their intestine and wild type mice were subjected to IR followed by MRSA. Bcl-2 mice had decreased gut apoptosis and improved survival compared to WT mice (92% vs. 42%; p<0.01). These data demonstrate that radiation combined injury results in significantly higher mortality than could be predicted based upon either IR or MRSA infection alone, and that preventing gut apoptosis may be a potential therapeutic target.
Muraleedharan, Vakkat; Marsh, Hazel; Kapoor, Dheeraj; Channer, Kevin S; Jones, T Hugh
2013-12-01
Men with type 2 diabetes are known to have a high prevalence of testosterone deficiency. No long-term data are available regarding testosterone and mortality in men with type 2 diabetes or any effect of testosterone replacement therapy (TRT). We report a 6-year follow-up study to examine the effect of baseline testosterone and TRT on all-cause mortality in men with type 2 diabetes and low testosterone. A total of 581 men with type 2 diabetes who had testosterone levels measured between 2002 and 2005 were followed up for a mean period of 5.8 ± 1.3 (S.D.) years. Mortality rates were compared between men with total testosterone ≤10.4 nmol/l (300 ng/dl; n=343) and those with testosterone >10.4 nmol/l (n=238). The effect of TRT (as per normal clinical practice: 85.9% testosterone gel and 14.1% intramuscular testosterone undecanoate) was assessed retrospectively within the low testosterone group. Mortality was increased in the low testosterone group (17.2%) compared with the normal testosterone group (9%; P=0.003) when controlled for covariates. In the Cox regression model, the multivariate-adjusted hazard ratio (HR) for decreased survival was 2.02 (P=0.009, 95% CI 1.2-3.4). TRT (mean duration 41.6±20.7 months; n=64) was associated with a reduced mortality of 8.4% compared with 19.2% (P=0.002) in the untreated group (n=174). The multivariate-adjusted HR for decreased survival in the untreated group was 2.3 (95% CI 1.3-3.9, P=0.004). Low testosterone levels predict an increase in all-cause mortality during long-term follow-up. Testosterone replacement may improve survival in hypogonadal men with type 2 diabetes.
Durán-Aniotz, Claudia; Segal, Gabriela; Salazar, Lorena; Pereda, Cristián; Falcón, Cristián; Tempio, Fabián; Aguilera, Raquel; González, Rodrigo; Pérez, Claudio; Tittarelli, Andrés; Catalán, Diego; Nervi, Bruno; Larrondo, Milton; Salazar-Onfray, Flavio; López, Mercedes N
2013-04-01
Immunization with autologous dendritic cells (DCs) loaded with a heat shock-conditioned allogeneic melanoma cell lysate caused lysate-specific delayed type hypersensitivity (DTH) reactions in a number of patients. These responses correlated with a threefold prolonged long-term survival of DTH(+) with respect to DTH(-) unresponsive patients. Herein, we investigated whether the immunological reactions associated with prolonged survival were related to dissimilar cellular and cytokine responses in blood. Healthy donors and melanoma patient's lymphocytes obtained from blood before and after vaccinations and from DTH biopsies were analyzed for T cell population distribution and cytokine release. Peripheral blood lymphocytes from melanoma patients have an increased proportion of Th3 (CD4(+) TGF-β(+)) regulatory T lymphocytes compared with healthy donors. Notably, DTH(+) patients showed a threefold reduction of Th3 cells compared with DTH(-) patients after DCs vaccine treatment. Furthermore, DCs vaccination resulted in a threefold augment of the proportion of IFN-γ releasing Th1 cells and in a twofold increase of the IL-17-producing Th17 population in DTH(+) with respect to DTH(-) patients. Increased Th1 and Th17 cell populations in both blood and DTH-derived tissues suggest that these profiles may be related to a more effective anti-melanoma response. Our results indicate that increased proinflammatory cytokine profiles are related to detectable immunological responses in vivo (DTH) and to prolonged patient survival. Our study contributes to the understanding of immunological responses produced by DCs vaccines and to the identification of follow-up markers for patient outcome that may allow a closer individual monitoring of patients.
Contrasting nest survival patterns for ducks and songbirds in northern mixed-grass prairie
Grant, Todd; Shaffer, Terry L.; Madden, Elizabeth M.; Nenneman, Melvin P.
2017-01-01
Management actions intended to protect or improve habitat for ducks may benefit grassland-nesting passerines, but scant information is available to explore this assumption. During 1998–2003, we examined nest survival of ducks and songbirds to determine whether effects of prescribed fire and other habitat features (e.g., shrub cover and distance to habitat edges) were similar for ducks and passerines breeding in North Dakota. We used the logistic-exposure method to estimate survival of duck and songbird nests (n = 3,171). We used an information-theoretic approach to identify factors that most influenced nest survival. Patterns of nest survival were markedly different between taxonomic groups. For ducks, nest survival was greater during the first postfire nesting season (daily survival rate [DSR] = 0.957, 85% CI = 0.951–0.963), relative to later postfire nesting seasons (DSR = 0.946, 85% CI = 0.942–0.950). Furthermore duck nest survival and nest densities were inversely related. Duck nest survival also was greater as shrub cover decreased and as distance from cropland and wetland edges increased. Passerines had lower nest survival during the first postfire nesting season (DSR = 0.934, 85% CI = 0.924–0.944), when densities also were low compared to subsequent postfire nesting seasons (DSR = 0.947, 85% CI = 0.944–0.950). Parasitism by brown-headed cowbirds (Molothrus ater) reduced passerine nest survival and this effect was more pronounced during the first postfire nesting season compared to subsequent nesting seasons. Passerine nest survival was greater as shrub cover decreased and perhaps for more concealed nests. Duck and songbird nest survival rates were not correlated during this study and for associated studies that examined additional variables using the same dataset, suggesting that different mechanisms influenced their survival. Based on our results, ducks should not be considered direct surrogates for passerines when predicting effects of prescribed fire, shrub cover, and habitat edges on nest survival.
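As context for the daily survival rates (DSR) reported above, the Python sketch below estimates a constant DSR from nest-visit interval data by maximum likelihood, which is the core quantity that the logistic-exposure method models; the full logistic-exposure model additionally regresses logit-scale DSR on covariates such as postfire nesting season and shrub cover. The interval data here are invented and are not the 3,171-nest dataset.

import numpy as np
from scipy.optimize import minimize_scalar

# Each row: (exposure days in the interval, 1 if the nest survived the interval)
intervals = np.array([(5, 1), (7, 1), (4, 0), (6, 1), (3, 0), (8, 1)], dtype=float)
t, survived = intervals[:, 0], intervals[:, 1]

def neg_log_lik(dsr):
    p_interval = dsr ** t                       # probability of surviving every day of the interval
    lik = np.where(survived == 1, p_interval, 1.0 - p_interval)
    return -np.sum(np.log(lik))

res = minimize_scalar(neg_log_lik, bounds=(0.5, 0.999999), method="bounded")
print(f"estimated DSR = {res.x:.3f}")           # compare with the DSRs of ~0.93-0.96 above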
Wagner, Michael R; Chen, Zhong
2004-12-01
The southwestern pine tip moth, Rhyacionia neomexicana (Dyar) (Lepidoptera: Tortricidae), is a native forest pest that attacks seedlings and saplings of ponderosa pine, Pinus ponderosa Dougl. ex Laws, in the southwestern United States. Repeated attacks can cause severe deformation of host trees and significant long-term growth loss. Alternatively, effective control of R. neomexicana, vegetative competition, or both in young pine plantations may increase survival and growth of trees for many years after treatments are applied. We tested the null hypothesis that 4 yr of R. neomexicana and weed control with insecticide, weeding, and insecticide plus weeding would have no residual effect on survival and growth of trees in a ponderosa pine plantation in northern Arizona 14 yr post-treatment, when the trees were 18 yr old. Both the insecticide and weeding treatments increased tree growth and reduced the incidence of southwestern pine tip moth damage compared with the control. However, weeding alone also significantly increased tree survival, whereas insecticide alone did not. The insecticide plus weeding treatment had the greatest tree growth and survival, and the lowest rate of tip moth damage. Based on these results, we rejected our null hypothesis and concluded that there were detectable increases in the survival and growth of ponderosa pines 14 yr after treatments were applied to control R. neomexicana and weeds.
Prognostic value of Child-Turcotte criteria in medically treated cirrhosis.
Christensen, E; Schlichting, P; Fauerholdt, L; Gluud, C; Andersen, P K; Juhl, E; Poulsen, H; Tygstrup, N
1984-01-01
The Child-Turcotte criteria (CTC) (based on serum bilirubin and albumin, ascites, neurological disorder and nutrition) are established prognostic factors in patients with cirrhosis undergoing portacaval shunt surgery. The objective of this study was to evaluate the prognostic value of CTC in conservatively treated cirrhosis. Patients (n = 245) with histologically verified cirrhosis from the control group of a controlled clinical trial were studied. Data at entry into the trial were used to classify patients according to CTC. Survival curves for up to 16 years were constructed, and survival rates were compared using the log-rank test. Survival decreased significantly with increasing degree of abnormality (A to B to C) of albumin (p < 0.001), ascites (p < 0.001), bilirubin (p = 0.02) and nutritional status (p = 0.03). Survival was not significantly influenced by neurological status (p = 0.11), probably because none of the patients had hepatic coma at entry into the trial. The five variables in CTC were combined into a score. With increasing score, the median survival time decreased from 6.4 years (score 5) to 2 months (scores 12 or more). Furthermore, the mortality from hepatic failure, gastrointestinal bleeding or hepatocellular carcinoma increased significantly with increasing score. CTC provide valuable and easily obtainable prognostic information in cirrhosis. However, CTC are inferior to a prognostic index based on multivariate analysis of prognostic factors.
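A minimal Python sketch of the analysis pattern described above is given below: the five graded CTC components are summed into a score, and survival is compared across score strata with Kaplan-Meier curves and a log-rank test. The file name, column names, grading, and score cut-points are assumptions for illustration, not the trial's actual coding.

import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

df = pd.read_csv("cirrhosis_controls.csv")   # hypothetical file and columns

# Assume each component has already been graded 1 (A), 2 (B) or 3 (C).
components = ["bilirubin_grade", "albumin_grade", "ascites_grade",
              "neuro_grade", "nutrition_grade"]
df["ctc_score"] = df[components].sum(axis=1)                 # possible range 5-15
df["score_group"] = pd.cut(df["ctc_score"], bins=[4, 6, 9, 15],
                           labels=["5-6", "7-9", "10-15"])

# Log-rank comparison of survival across the score strata
test = multivariate_logrank_test(df["years_observed"], df["score_group"], df["died"])
print(test.p_value)

# Kaplan-Meier curve for each stratum
for label, grp in df.groupby("score_group"):
    KaplanMeierFitter().fit(grp["years_observed"], grp["died"],
                            label=str(label)).plot_survival_function()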
Gram-Negative Bacterial Wound Infections
2014-05-01
shows an effect with increasing concentration; however, survival analysis does not show a significant difference between treatment groups and controls ... with 3 dead larvae in the 25 mM group compared to a single dead larva in the control group (Fig. 7). Probit analysis estimates the lethal ... statistically different from that of the control group. The levels (CFU/g) of bacteria in lung tissue correlated with the survival curves. The median
Ståhle, Lars; Granström, Elisabeth; Ljungdahl Ståhle, Ewa; Isaksson, Sven; Samuelsson, Anders; Rudling, Mats; Sepp, Harry
2013-06-01
To describe clinical chemistry and weight changes after short-term food or sleep deprivation or multiple deprivations during civilian survival training. Data from one baseline-controlled two-period crossover study designed to compare sleep deprivation for up to 50 hours with food deprivation for up to 66 hours (n = 12) and data from regular multiple-deprivations survival training comparing participants (n = 33) with nondeprived instructors (n = 10). Food deprivation was associated with decreased body weight, blood glucose, serum triglycerides, sodium, chloride, and urine pH, and there were increases in blood and urine ketones and serum free fatty acids. Sleep deprivation was associated with a minor decrease in hemoglobin and erythrocyte particle count and volume fraction and an increase in leukocytes. The clinical chemistry and body weight changes associated with food deprivation were qualitatively similar to those observed in fasting obese patients but developed quicker in the survival training setting. Sleep deprivation had few effects on the clinical chemistry profile except for hematological variables. Physicians evaluating clinical chemistry data from patients subjected to short-term food or sleep deprivation should take the physiological state into account in their assessment. Copyright © 2013 Wilderness Medical Society. Published by Elsevier Inc. All rights reserved.
Zhu, Qianqian; Li, Na; Zeng, Xiaoyan; Han, Qunying; Li, Fang; Yang, Cuiling; Lv, Yi; Zhou, Zhihua; Liu, Zhengwen
2015-02-28
Hepatocellular carcinoma (HCC) is among the most common and lethal cancers worldwide, especially in China. We retrospectively analyzed data from patients who were diagnosed with and treated for HCC between 2002 and 2011 in a large hospital in northwest China and compared the data between the periods 2002-2006 (P1) and 2007-2011 (P2). A total of 2045 patients were included in the analysis. HCC stage at diagnosis, according to the Barcelona Clinic Liver Cancer staging system, showed no significant change. Use of liver transplantation, transcatheter arterial chemoembolization and other therapy decreased, while percutaneous local ablation and supportive care increased from P1 to P2. Use of surgical resection and systemic therapy showed no significant change. Patient survival rates at 1, 3 and 5 years improved significantly from P1 to P2. Treatments whose use increased over time showed a greater magnitude of survival improvement, and vice versa. Over the last 10 years, patient survival increased significantly, mainly as a result of therapeutic selection according to disease stage in this center. However, the proportion of patients diagnosed at early stages of HCC remained low and did not increase, a result calling for implementation of a surveillance system for at-risk patients.
NASA Technical Reports Server (NTRS)
Evans, Helen H.; Horng, Min-Fen; Ricanati, Marlene; Diaz-Insua, Mireya; Jordan, Robert; Schwartz, Jeffrey L.
2002-01-01
Genomic instability in the human lymphoblast cell line TK6 was studied in clones surviving 36 generations after exposure to accelerated 56Fe ions. Clones were assayed for 20 characteristics, including chromosome aberrations, plating efficiency, apoptosis, cell cycle distribution, response to a second irradiation, and mutant frequency at two loci. The primary effect of the 56Fe-ion exposure on the surviving clones was a significant increase in the frequency of unstable chromosome aberrations compared to the very low spontaneous frequency, along with an increase in the phenotypic complexity of the unstable clones. The radiation-induced increase in the frequency of unstable chromosome aberrations was much greater than that observed previously in clones of the related cell line, WTK1, which in comparison to the TK6 cell line expresses an increased radiation resistance, a mutant TP53 protein, and an increased frequency of spontaneous unstable chromosome aberrations. The characteristics of the unstable clones of the two cell lines also differed. Most of the TK6 clones surviving exposure to 56Fe ions showed unstable cytogenetic abnormalities, while the phenotype of the WTK1 clones was more diverse. The results underscore the importance of genotype in the characteristics of instability after radiation exposure.
Sundaram, Vinay; Choi, Gina; Jeon, Christie Y; Ayoub, Walid S; Nissen, Nicholas N; Klein, Andrew S; Tran, Tram T
2015-05-01
Primary sclerosing cholangitis (PSC) patients suffer from comorbidities unaccounted for by the model for end-stage liver disease scoring system and may benefit from the increased donor organ pool provided by donation after cardiac death (DCD) liver transplantation. However, the impact of DCD transplantation on PSC graft outcomes is unknown. We studied 41,018 patients using the United Network for Organ Sharing database from 2002 through 2012. Kaplan-Meier analysis and Cox regression were used to evaluate graft survival and risk factors for graft failure, respectively. The PSC patients receiving DCD livers (n=75) showed greater overall graft failure (37.3% vs. 20.4%, P = 0.001), graft failure from biliary complications (47.4% vs. 13.9%, P = 0.002), and shorter graft survival time (P = 0.003), compared to PSC patients receiving donation after brain death organs (n=1592). Among DCD transplants (n=1943), PSC and non-PSC patients showed similar prevalence of graft failure and graft survival time, though a trend existed toward increased biliary-induced graft failure among PSC patients (47.4 vs. 26.4%, P = 0.063). Cox modeling demonstrated that PSC patients have a positive graft survival advantage compared to non-PSC patients (hazard ratio [HR]=0.72, P < 0.001), whereas DCD transplantation increased risk of graft failure (HR = 1.28, P < 0.001). Furthermore, the interaction between DCD transplant and PSC was significant (HR = 1.76, P = 0.015), indicating that use of DCD organs impacts graft survival more in PSC than non-PSC patients. Donation after cardiac death liver transplantation leads to significantly worse outcomes in PSC. We recommend cautious use of DCD transplantation in this population.
Peacock, Elizabeth; Laake, Jeff; Laidre, Kristin L.; Born, Erik W.; Atkinson, Stephen N.
2012-01-01
Management of polar bear (Ursus maritimus) populations requires the periodic assessment of life history metrics such as survival rate. This information is frequently obtained during short-term capture and marking efforts (e.g., over the course of three years) that result in hundreds of marked bears remaining in the population after active marking is finished. Using 10 additional years of harvest recovery subsequent to a period of active marking, we provide updated estimates of annual survival for polar bears in the Baffin Bay population of Greenland and Canada. Our analysis suggests a decline in survival of polar bears since the period of active marking that ended in 1997; some of the decline in survival can likely be attributed to a decline in springtime ice concentration over the continental shelf of Baffin Island. The variance around the survival estimates is comparatively high because of the declining number of marks available; therefore, results must be interpreted with caution. The variance of the estimates of survival increased most substantially in the sixth year post-marking. When survival estimates calculated with recovery-only and recapture-recovery data sets from the period of active marking were compared, survival rates were indistinguishable. However, for the period when fewer marks were available, survival estimates were lower using the recovery-only data set, which indicates that part of the decline we detected for 2003 – 09 may be due to using only harvest recovery data. Nevertheless, the decline in the estimates of survival is consistent with population projections derived from harvest numbers and earlier vital rates, as well as with an observed decline in the extent of sea ice habitat.
Campbell, Ian; Scott, Nina; Seneviratne, Sanjeewa; Kollias, James; Walters, David; Taylor, Corey; Roder, David
2015-01-01
The Quality Audit (BQA) program of the Breast Surgeons of Australia and New Zealand (NZ) collects data on early female breast cancer and its treatment. BQA data covered approximately half of all early breast cancers diagnosed in NZ during roll-out of the BQA program in 1998-2010. Coverage increased progressively to about 80% by 2008. This is the biggest NZ breast cancer database outside the NZ Cancer Registry and it includes cancer and clinical management data not collected by the Registry. We used these BQA data to compare socio-demographic and cancer characteristics and survivals by ethnicity. BQA data for 1998-2010 diagnoses were linked to NZ death records using the National Health Index (NHI). Live cases were followed up to December 31st 2010. Socio-demographic and invasive cancer characteristics and disease-specific survivals were compared by ethnicity. Five-year survivals were 87% for Maori, 84% for Pacific, 91% for other NZ cases and 90% overall. This compared with the 86% survival reported for all female breast cancer cases covered by the NZ Cancer Registry, which also included more advanced stages. Patterns of survival by clinical risk factors accorded with patterns expected from the scientific literature. Compared with Other cases, Maori and Pacific women were younger, came from more deprived areas, and had larger cancers with more ductal and fewer lobular histology types. Their cancers were also less likely to have a triple negative phenotype. More of the Pacific women had vascular invasion. Maori women were more likely to reside in areas more remote from regional cancer centres, whereas Pacific women generally lived closer to these centres than Other NZ cases. NZ BQA data indicate previously unreported differences in breast cancer biology by ethnicity. Maori and Pacific women had reduced breast cancer survival compared with Other NZ women, after adjusting for socio-demographic and cancer characteristics. The potential contributions to survival differences of variations in service access, timeliness and quality of care need to be examined, along with effects of comorbidity and biological factors.
Outcomes of Male Patients with Alport Syndrome Undergoing Renal Replacement Therapy
Temme, Johanna; Kramer, Anneke; Jager, Kitty J.; Lange, Katharina; Peters, Frederick; Müller, Gerhard-Anton; Kramar, Reinhard; Heaf, James G.; Finne, Patrik; Palsson, Runolfur; Reisæter, Anna V.; Hoitsma, Andries J.; Metcalfe, Wendy; Postorino, Maurizio; Zurriaga, Oscar; Santos, Julio P.; Ravani, Pietro; Jarraya, Faical; Verrina, Enrico; Dekker, Friedo W.
2012-01-01
Summary Background and objectives Patients with the hereditary disease Alport syndrome commonly require renal replacement therapy (RRT) in the second or third decade of life. This study compared age at onset of RRT, renal allograft, and patient survival in men with Alport syndrome receiving various forms of RRT (peritoneal dialysis, hemodialysis, or transplantation) with those of men with other renal diseases. Design, setting, participants, & measurements Patients with Alport syndrome receiving RRT identified from 14 registries in Europe were matched to patients with other renal diseases. A linear spline model was used to detect changes in the age at start of RRT over time. Kaplan-Meier method and Cox regression analysis were used to examine patient and graft survival. Results Age at start of RRT among patients with Alport syndrome remained stable during the 1990s but increased by 6 years between 2000–2004 and 2005–2009. Survival of patients with Alport syndrome requiring dialysis or transplantation did not change between 1990 and 2009. However, patients with Alport syndrome had better renal graft and patient survival than matched controls. Numbers of living-donor transplantations were lower in patients with Alport syndrome than in matched controls. Conclusions These data suggest that kidney failure in patients with Alport syndrome is now being delayed compared with previous decades. These patients appear to have superior patient survival while undergoing dialysis and superior patient and graft survival after deceased-donor kidney transplantation compared with patients receiving RRT because of other causes of kidney failure. PMID:22997344
Outcomes of male patients with Alport syndrome undergoing renal replacement therapy.
Temme, Johanna; Kramer, Anneke; Jager, Kitty J; Lange, Katharina; Peters, Frederick; Müller, Gerhard-Anton; Kramar, Reinhard; Heaf, James G; Finne, Patrik; Palsson, Runolfur; Reisæter, Anna V; Hoitsma, Andries J; Metcalfe, Wendy; Postorino, Maurizio; Zurriaga, Oscar; Santos, Julio P; Ravani, Pietro; Jarraya, Faical; Verrina, Enrico; Dekker, Friedo W; Gross, Oliver
2012-12-01
Patients with the hereditary disease Alport syndrome commonly require renal replacement therapy (RRT) in the second or third decade of life. This study compared age at onset of RRT, renal allograft, and patient survival in men with Alport syndrome receiving various forms of RRT (peritoneal dialysis, hemodialysis, or transplantation) with those of men with other renal diseases. Patients with Alport syndrome receiving RRT identified from 14 registries in Europe were matched to patients with other renal diseases. A linear spline model was used to detect changes in the age at start of RRT over time. Kaplan-Meier method and Cox regression analysis were used to examine patient and graft survival. Age at start of RRT among patients with Alport syndrome remained stable during the 1990s but increased by 6 years between 2000-2004 and 2005-2009. Survival of patients with Alport syndrome requiring dialysis or transplantation did not change between 1990 and 2009. However, patients with Alport syndrome had better renal graft and patient survival than matched controls. Numbers of living-donor transplantations were lower in patients with Alport syndrome than in matched controls. These data suggest that kidney failure in patients with Alport syndrome is now being delayed compared with previous decades. These patients appear to have superior patient survival while undergoing dialysis and superior patient and graft survival after deceased-donor kidney transplantation compared with patients receiving RRT because of other causes of kidney failure.
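The linear spline model mentioned above can be sketched as a piecewise-linear regression of age at the start of RRT on calendar year, with a single knot at which the slope is allowed to change. The Python code below illustrates the idea; the data file, column names, and knot year are assumptions for illustration, not the registry analysis itself.

import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("alport_rrt_onset.csv")   # hypothetical file: columns year, age_at_rrt
knot = 2000                                # assumed knot year

X = pd.DataFrame({
    "year": df["year"] - 1990,                          # slope before the knot (centred year)
    "post_knot": np.clip(df["year"] - knot, 0, None),   # additional slope after the knot
})
X = sm.add_constant(X)
fit = sm.OLS(df["age_at_rrt"], X).fit()
print(fit.summary())   # a nonzero 'post_knot' coefficient indicates a change in slope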
Arshad, Hafiz Muhammad Sharjeel; Kabir, Christopher; Tetangco, Eula; Shah, Natahsa; Raddawi, Hareth
2017-09-01
Recently published data indicate increasing incidence of colorectal adenocarcinoma (CRC) in young-onset (<50 years) patients. This study examines racial disparities in presentation and survival times among non-Hispanic Blacks (NHB) and Hispanics compared with non-Hispanic Whites (NHW). A retrospective single-center cohort study was conducted from 2004 through 2014 using 96 patient medical charts with a diagnosis of young-onset CRC. Age, gender, primary site, and histological stage at the time of diagnosis were assessed for survival probabilities by racial group over a minimum follow-up period of 5 years. Among subjects with CRC diagnosis before 50 years of age, the majority of subjects were between 40 and 50 years, with CRC presentation occurring among this age group for 51 (79.7%) of NHW, 18 (81.8%) of NHB, and 5 (50.0%) of Hispanics. The majority of all patients presented with advanced stages of CRC (31.3% with stage III and 27.1% with stage IV). NHB exhibited statistically significantly worse survival compared to NHW (adjusted hazard ratio for death = 2.09; 95% confidence interval 1.14-3.84; P = 0.02). A possible trend of worse survival was identified for Hispanics compared to NHW, but this group was low in numbers and results were not statistically significant. Disparities between racial groups among young-onset CRC cases were identified in overall survival and reflect growing concern in rising incidence and differentiated care management.
Climate Change, Precipitation and Impacts on an Estuarine Refuge from Disease
Levinton, Jeffrey; Doall, Michael; Ralston, David; Starke, Adam; Allam, Bassem
2011-01-01
Background Oysters play important roles in estuarine ecosystems but have suffered recently due to overfishing, pollution, and habitat loss. A tradeoff between growth rate and disease prevalence as a function of salinity makes the estuarine salinity transition of special concern for oyster survival and restoration. Estuarine salinity varies with discharge, so increases or decreases in precipitation with climate change may shift regions of low salinity and disease refuge away from optimal oyster bottom habitat, negatively impacting reproduction and survival. Temperature is an additional factor for oyster survival, and recent temperature increases have increased vulnerability to disease in higher salinity regions. Methodology/Principal Findings We examined growth, reproduction, and survival of oysters in the New York Harbor-Hudson River region, focusing on a low-salinity refuge in the estuary. Observations were during two years when rainfall was above average and comparable to projected future increases in precipitation in the region and a past period of about 15 years with high precipitation. We found a clear tradeoff between oyster growth and vulnerability to disease. Oysters survived well when exposed to intermediate salinities during two summers (2008, 2010) with moderate discharge conditions. However, increased precipitation and discharge in 2009 reduced salinities in the region with suitable benthic habitat, greatly increasing oyster mortality. To evaluate the estuarine conditions over longer periods, we applied a numerical model of the Hudson to simulate salinities over the past century. Model results suggest that much of the region with suitable benthic habitat that historically had been a low salinity refuge region may be vulnerable to higher mortality under projected increases in precipitation and discharge. Conclusions/Significance Predicted increases in precipitation in the northeastern United States due to climate change may lower salinities past important thresholds for oyster survival in estuarine regions with appropriate substrate, potentially disrupting metapopulation dynamics and impeding oyster restoration efforts, especially in the Hudson estuary where a large basin constitutes an excellent refuge from disease. PMID:21552552
Moore, Suzanne P; Green, Adèle C; Bray, Freddie; Coory, Michael; Garvey, Gail; Sabesan, Sabe; Valery, Patricia C
2016-06-01
While Indigenous people in Queensland have lower colorectal cancer (CRC) incidence and mortality than the rest of the population, CRC remains the third most frequent cancer among Australian Indigenous people overall. This study aimed to investigate patterns of care and survival between Indigenous and non-Indigenous Australians with CRC. Through a matched-cohort design we compared 80 Indigenous and 85 non-Indigenous people all diagnosed with CRC and treated in Queensland public hospitals during 1998-2004 (frequency matched on age, sex, geographical remoteness). We compared clinical and treatment data (Pearson's chi-square) and all-cause and cancer survival (Cox regression analysis). Indigenous patients with CRC were not significantly more likely to have comorbidity, advanced disease at diagnosis or less treatment than non-Indigenous people. There was also no statistically significant difference in all-cause survival (HR 1.14, 95% CI 0.69, 1.89) or cancer survival (HR 1.01, 95% CI 0.60, 1.69) between the two groups. Similar CRC mortality among Indigenous and other Australians may reflect both the lower incidence and adequate management. Increasing life expectancy and exposures to risk factors suggests that Indigenous people are vulnerable to a growing burden of CRC. Primary prevention and early detection will be of paramount importance to future CRC control among Indigenous Australians. Current CRC management must be maintained and include prevention measures to ensure that predicted increases in CRC burden are minimized. © 2014 Wiley Publishing Asia Pty Ltd.
Seiler, Stefanie; Di Santo, Stefano; Sahli, Sebastian; Andereggen, Lukas; Widmer, Hans Rudolf
2017-08-01
Cell transplantation using ventral mesencephalic tissue is an experimental approach to treat Parkinson's disease. This approach is limited by poor survival of the transplants and the high number of dopaminergic neurons needed for grafting. Increasing the yield of dopaminergic neurons in donor tissue is of great importance. We have previously shown that antagonization of the Nogo-receptor 1 by NEP1-40 promoted survival of cultured dopaminergic neurons and exposure to neurotrophin-4/5 increased dopaminergic cell densities in organotypic midbrain cultures. We investigated whether a combination of both treatments offers a novel tool to further improve dopaminergic neuron survival. Rat embryonic ventral mesencephalic neurons grown as organotypic free-floating roller tube or primary dissociated cultures were exposed to neurotrophin-4/5 and NEP1-40. The combined and single-factor treatments resulted in significantly higher numbers of tyrosine hydroxylase positive neurons compared to controls. Significantly stronger tyrosine hydroxylase signal intensity was detected by Western blotting in the combination-treated cultures compared to controls but not compared to single factor treatments. Neurotrophin-4/5 and the combined treatment showed significantly higher signals for the neuronal marker microtubule-associated protein 2 in Western blots compared to controls, while no effects were observed for the astroglial marker glial fibrillary acidic protein between groups, suggesting that neurotrophin-4/5 targets mainly neuronal cells. Finally, NEP1-40 and the combined treatment significantly augmented tyrosine hydroxylase positive neurite length. In summary, our findings substantiate that antagonization of the Nogo-receptor 1 promotes dopaminergic neuron survival but, when combined with neurotrophin-4/5, does not further increase the yield of dopaminergic neurons or their morphological complexity, hinting that these treatments might exert their effects by activating common downstream pathways. Copyright © 2017 Elsevier B.V. All rights reserved.
Auluck, Ajit; Hislop, Greg; Bajdik, Chris; Hay, John; Bottorff, Joan L; Zhang, Lewei; Rosin, Miriam P
2012-12-01
A shift in etiology of oral cancers has been associated with a rise in incidence for oropharyngeal cancers (OPC) and a decrease for oral cavity cancers (OCC); however, there is limited information about population-based survival trends. We report epidemiological transitions in survival for both OPC and OCC from a population-based cancer registry, focusing upon gender and ethnic differences. All primary oral cancers diagnosed between 1980 and 2005 were identified from the British Columbia Cancer Registry and regrouped into OPC and OCC by topographical subsites, time periods (1980-1993 and 1994-2005), stage at diagnosis, and ethnicity. Cases were then followed up to December 2009. In gender-based analyses, actuarial life tables were used to calculate survival rates, which were compared using Kaplan-Meier curves and log-rank tests. For OPC, survival improved, with the improvement significant for tonsil and base of tongue in men and marginally significant for base of tongue in women. This improvement occurred in spite of an increase in late-stage diagnosis for OPC in both genders. Interestingly, there was no difference in survival for early- and late-stage disease for OPC in men. For OCC, there was a decrease in survival for floor of mouth cancers in both genders although significant in women only. South Asians had the poorest survival for OCC in both genders. Survival for OPC improved, more dramatically in men than women, in spite of late-stage diagnosis and increasing nodal involvement. Given the poor survival rates and need for early detection, targeted OCC screening programs are required for South Asians.
Improved Survival After Heart Failure: A Community‐Based Perspective
Joffe, Samuel W.; Webster, Kristy; McManus, David D.; Kiernan, Michael S.; Lessard, Darleen; Yarzebski, Jorge; Darling, Chad; Gore, Joel M.; Goldberg, Robert J.
2013-01-01
Background Heart failure is a highly prevalent, morbid, and costly disease with a poor long‐term prognosis. Evidence‐based therapies utilized over the past 2 decades hold the promise of improved outcomes, yet few contemporary studies have examined survival trends in patients with acute heart failure. The primary objective of this population‐based study was to describe trends in short‐ and long‐term survival in patients hospitalized with acute decompensated heart failure (ADHF). A secondary objective was to examine patient characteristics associated with decreased long‐term survival. Methods and Results We reviewed the medical records of 9748 patients hospitalized with ADHF at all 11 medical centers in central Massachusetts during 1995, 2000, 2002, and 2004. Patients hospitalized with ADHF were more likely to be elderly and to have been diagnosed with multiple comorbidities in 2004 compared with 1995. Over this period, survival was significantly improved in‐hospital, and at 1, 2, and 5 years postdischarge. Five‐year survival rates increased from 20% in 1995 to 29% in 2004. Although survival improved substantially over time, older patients and patients with chronic kidney disease, chronic obstructive pulmonary disease, anemia, low body mass index, and low blood pressures had consistently lower postdischarge survival rates than patients without these comorbidities. Conclusion Between 1995 and 2004, patients hospitalized with ADHF have become older and increasingly comorbid. Although there has been a significant improvement in survival among these patients, their long‐term prognosis remains poor, as fewer than 1 in 3 patients hospitalized with ADHF in 2004 survived more than 5 years. PMID:23676294
Yang, Haibing; Zhang, Xiao; Gaxiola, Roberto A.; Xu, Guohua; Peer, Wendy Ann; Murphy, Angus S.
2014-01-01
Phosphorus (P), an element required for plant growth, fruit set, fruit development, and fruit ripening, can be deficient or unavailable in agricultural soils. Previously, it was shown that over-expression of a proton-pyrophosphatase gene AVP1/AVP1D (AVP1DOX) in Arabidopsis, rice, and tomato resulted in the enhancement of root branching and overall mass with the result of increased mineral P acquisition. However, although AVP1 over-expression also increased shoot biomass in Arabidopsis, this effect was not observed in tomato under phosphate-sufficient conditions. AVP1DOX tomato plants exhibited increased rootward auxin transport and root acidification compared with control plants. AVP1DOX tomato plants were analysed in detail under limiting P conditions in greenhouse and field trials. AVP1DOX plants produced 25% (P=0.001) more marketable ripened fruit per plant under P-deficient conditions compared with the controls. Further, under low phosphate conditions, AVP1DOX plants displayed increased phosphate transport from leaf (source) to fruit (sink) compared to controls. AVP1DOX plants also showed an 11% increase in transplant survival (P<0.01) in both greenhouse and field trials compared with the control plants. These results suggest that selection of tomato cultivars for increased proton pyrophosphatase gene expression could be useful when selecting for cultivars to be grown on marginal soils. PMID:24723407
Incidence, Survival, and Mortality of Malignant Cutaneous Melanoma in Wisconsin, 1995-2011.
Peterson, Molly; Albertini, Mark R; Remington, Patrick
2015-10-01
To assess trends in malignant melanoma incidence, survival, and mortality in Wisconsin. Incidence data for Wisconsin were obtained from the Wisconsin Cancer Reporting System Bureau of Health Information using Wisconsin Interactive Statistics on Health, while incidence data for the United States were obtained from the Surveillance, Epidemiology, and End Results system (SEER). The mortality to incidence ratio [1 - (mortality/incidence)] was used as a proxy to estimate relative 5-year survival in Wisconsin, while observed 5-year survival rates for the United States were obtained from SEER. Mortality data for both Wisconsin and the United States were extracted using the Centers for Disease Control and Prevention Wide-ranging Online Data for Epidemiologic Research. During the past decade, malignant melanoma incidence rates increased 57% in Wisconsin (from 12.1 to 19.0 cases per 100,000) versus a 33% increase (from 20.9 to 27.7 cases per 100,000) in the United States during the same time period. The greatest Wisconsin increase in incidence was among women ages 45-64 years and among men ages 65 years and older. Overall relative percent difference in 5-year survival in Wisconsin rose 10% (from 77% to 85%) and was unchanged (82%) for the United States. Wisconsin overall mortality rates were unchanged at 2.8 deaths per 100,000, compared to a 10% increase in the United States (from 3.1 to 3.4 deaths per 100,000). Wisconsin mortality rates improved for women ages 45-64 and for men ages 25-44. Despite improvements in malignant melanoma survival rates, increases in incidence represent a major public health challenge for physicians and policymakers.
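The survival proxy used in the Wisconsin analysis above is plain arithmetic, 1 minus the mortality-to-incidence ratio. A minimal, purely illustrative Python check using the rates quoted in the abstract reproduces the 77% and 85% figures; the function name is invented for the example.

    def survival_proxy(mortality_rate, incidence_rate):
        # proxy for relative 5-year survival: 1 - (mortality / incidence)
        return 1.0 - mortality_rate / incidence_rate

    # Wisconsin melanoma rates quoted above (per 100,000)
    print(round(survival_proxy(2.8, 12.1), 2))   # 0.77 at the start of the period
    print(round(survival_proxy(2.8, 19.0), 2))   # 0.85 at the end of the period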
Exosomes Derived from Squamous Head and Neck Cancer Promote Cell Survival after Ionizing Radiation
Mutschelknaus, Lisa; Peters, Carsten; Winkler, Klaudia; Yentrapalli, Ramesh; Heider, Theresa; Atkinson, Michael John; Moertl, Simone
2016-01-01
Exosomes are nanometer-sized extracellular vesicles that are believed to function as intercellular communicators. Here, we report that exosomes are able to modify the radiation response of the head and neck cancer cell lines BHY and FaDu. Exosomes were isolated from the conditioned medium of irradiated as well as non-irradiated head and neck cancer cells by serial centrifugation. Quantification using NanoSight technology indicated an increased exosome release from irradiated compared to non-irradiated cells 24 hours after treatment. To test whether the released exosomes influence the radiation response of other cells, the exosomes were transferred to non-irradiated and irradiated recipient cells. We found an enhanced uptake of exosomes isolated from both irradiated and non-irradiated cells by irradiated recipient cells compared to non-irradiated recipient cells. Functional analyses by exosome transfer indicated that all exosomes (from non-irradiated and irradiated donor cells) increase the proliferation of non-irradiated recipient cells and the survival of irradiated recipient cells. The survival-promoting effects are more pronounced when exosomes isolated from irradiated rather than non-irradiated donor cells are transferred. A possible mechanism for the increased survival after irradiation could be the increase in DNA double-strand break repair monitored at 6, 8 and 10 h after the transfer of exosomes isolated from irradiated cells. This effect is abrogated by destabilization of the exosomes. Our results demonstrate that radiation influences both the abundance and action of exosomes on recipient cells. Exosomes transmit prosurvival effects by promoting the proliferation and radioresistance of head and neck cancer cells. Taken together, this study indicates a functional role of exosomes in the response of tumor cells to radiation exposure within a therapeutic dose range and suggests that exosomes are useful objects of study for a better understanding of the tumor radiation response. PMID:27006994
Peeler, C; Bronk, L; UT Graduate School of Biomedical Sciences at Houston, Houston, TX
2015-06-15
Purpose: High throughput in vitro experiments assessing cell survival following proton radiation indicate that both the alpha and the beta parameters of the linear quadratic model increase with increasing proton linear energy transfer (LET). We investigated the relative biological effectiveness (RBE) of double-strand break (DSB) induction as a means of explaining the experimental results. Methods: Experiments were performed with two lung cancer cell lines and a range of proton LET values (0.94 – 19.4 keV/µm) using an experimental apparatus designed to irradiate cells in a 96 well plate such that each column encounters protons of different dose-averaged LET (LETd). Traditional linear quadratic survival curve fitting was performed, and alpha, beta, and RBE values obtained. Survival curves were also fit with a model incorporating RBE of DSB induction as the sole fit parameter. Fitted values of the RBE of DSB induction were then compared to values obtained using Monte Carlo Damage Simulation (MCDS) software and energy spectra calculated with Geant4. Other parameters including alpha, beta, and number of DSBs were compared to those obtained from traditional fitting. Results: Survival curve fitting with RBE of DSB induction yielded alpha and beta parameters that increase with proton LETd, which follows from the standard method of fitting; however, relying on a single fit parameter provided more consistent trends. The fitted values of RBE of DSB induction increased beyond what is predicted from MCDS data above proton LETd of approximately 10 keV/µm. Conclusion: In order to accurately model in vitro proton irradiation experiments performed with high throughput methods, the RBE of DSB induction must increase more rapidly than predicted by MCDS above LETd of 10 keV/µm. This can be explained by considering the increased complexity of DSBs or the nature of intra-track pairwise DSB interactions in this range of LETd values. NIH Grant 2U19CA021239-35.
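As context for the linear quadratic fitting described above, surviving fraction is conventionally modeled as S(D) = exp(-(alpha*D + beta*D^2)). The Python sketch below fits alpha and beta to synthetic clonogenic-survival data; it illustrates the model only and is not the authors' high-throughput pipeline, and the dose-response values are made up.

    import numpy as np
    from scipy.optimize import curve_fit

    def lq_survival(dose, alpha, beta):
        # linear quadratic model: surviving fraction as a function of dose (Gy)
        return np.exp(-(alpha * dose + beta * dose ** 2))

    dose = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 8.0])
    sf = np.array([1.0, 0.78, 0.55, 0.22, 0.07, 0.02])   # synthetic data for illustration

    (alpha, beta), _ = curve_fit(lq_survival, dose, sf, p0=(0.2, 0.02))
    print(f"alpha = {alpha:.3f} 1/Gy, beta = {beta:.4f} 1/Gy^2")

Repeating such a fit for each LETd column of the 96 well plate is what yields the alpha and beta trends discussed above.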
Demanelis, Kathryn; Sriplung, Hutcha; Meza, Rafael; Wiangnon, Surapon; Rozek, Laura S.; Scheurer, Michael E.; Lupo, Philip J.
2015-01-01
BACKGROUND Childhood leukemia incidence and survival vary globally, and this variation may be attributed to environmental risk factors, genetics, and/or disparities in diagnosis and treatment. PROCEDURE We analyzed childhood leukemia incidence and survival trends in children age 0–19 years from 1990 to 2011 in Songkhla, Thailand (n=316) and compared these results to US data from the Surveillance, Epidemiology, and End Results (SEER) registry (n=6,738). We computed relative survival using Ederer II and estimated survival functions using the Kaplan-Meier method. Changes in incidence and five-year survival by year of diagnosis were evaluated using joinpoint regression and are reported as annual percent changes (APC). RESULTS The age-standardized incidence of leukemia was 3.2 and 4.1 cases per 100,000 in Songkhla and SEER-9, respectively. In Songkhla, incidence from 1990–2011 significantly increased for leukemia (APC=1.7%, p=0.031) and acute lymphoblastic leukemia (ALL) (APC=1.8%, p=0.033). Acute myeloid leukemia (AML) incidence significantly increased (APC=4.2%, p=0.044) and was significantly different from the US (p=0.026), where incidence was stable during the same period (APC=0.3%, p=0.541). The overall five-year relative survival for leukemia was lower than that reported in the US (43% vs. 79%). Five-year survival significantly improved by at least 2% per year from 1990–2011 in Songkhla for leukemia, ALL, and AML (p<0.050). CONCLUSIONS While leukemia and ALL incidence increased in Songkhla, differences in leukemia trends, particularly AML incidence, may suggest etiologic or diagnostic differences between Songkhla and the US. This work highlights the importance of evaluating childhood cancer trends in low- and middle-income countries. PMID:25962869
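The annual percent change values reported above come from log-linear trend fitting; joinpoint regression fits one such segment between each pair of joinpoints. The short Python sketch below illustrates the arithmetic on a made-up incidence series, not the Songkhla or SEER data.

    import numpy as np

    def annual_percent_change(years, rates):
        # log-linear trend: log(rate) = intercept + slope * year; APC = 100 * (exp(slope) - 1)
        slope, _ = np.polyfit(years, np.log(rates), 1)
        return 100.0 * (np.exp(slope) - 1.0)

    years = np.arange(1990, 2012)
    rates = 3.0 * 1.017 ** (years - 1990)     # synthetic series growing about 1.7% per year
    print(round(annual_percent_change(years, rates), 1))   # ~1.7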
On the susceptibility of adaptive memory to false memory illusions.
Howe, Mark L; Derbish, Mary H
2010-05-01
Previous research has shown that survival-related processing of word lists enhances retention for that material. However, the claim that survival-related memories are more accurate has only been examined when true recall and recognition of neutral material has been measured. In the current experiments, we examined the adaptive memory superiority effect for different types of processing and material, measuring accuracy more directly by comparing true and false recollection rates. Survival-related information and processing was examined using word lists containing backward associates of neutral, negative, and survival-related critical lures and type of processing (pleasantness, moving, survival) was varied using an incidental memory paradigm. Across four experiments, results showed that survival-related words were more susceptible than negative and neutral words to the false memory illusion and that processing information in terms of its relevance to survival independently increased this susceptibility to the false memory illusion. Overall, although survival-related processing and survival-related information resulted in poorer, not more accurate, memory, such inaccuracies may have adaptive significance. These findings are discussed in the context of false memory research and recent theories concerning the importance of survival processing and the nature of adaptive memory. Copyright 2009 Elsevier B.V. All rights reserved.
Beltman, M E; Lonergan, P; Diskin, M G; Roche, J F; Crowe, M A
2009-04-15
Progesterone is essential for establishment and maintenance of pregnancy in mammals. The objective of this study was to examine the effect of elevating progesterone during the different physiological stages of early embryo development on embryo survival. Estrus was synchronized in cross-bred beef heifers (n=197, approximately 2-years old) and they were inseminated 12-18h after estrus onset (=Day 0). Inseminated heifers were randomly assigned to 1 of 3 treatments: (1) Control, n=69; (2) progesterone supplementation using a Controlled Internal Drug Release Device (CIDR) from Day 3 to 6.5, n=64; or (3) progesterone supplementation using a CIDR from Day 4.5 to 8, n=64. Body condition (BCS) and locomotion scores (scale of 1-5) were recorded for all animals. Animals with a locomotion score >/=4 (very lame) were excluded. Embryo survival rate was determined at slaughter on Day 25. Conceptus length and weight were recorded and the corpus luteum (CL) of all pregnant animals was dissected and weighed. Supplementation with exogenous progesterone increased (P<0.05) peripheral progesterone concentrations, but did not affect embryo survival rate compared with controls. Mean CL weight, conceptus length and conceptus weight were not different between treatments. There was a positive relationship (P<0.04) between the increase in progesterone concentrations from Days 3 to 6.5 and embryo survival rate in treated heifers and a similar trend existed between the increase from Days 4.5 to 8 (P<0.06). There was also a positive relationship (P<0.05) between the progesterone concentration on Day 6.5 and the embryo survival rate in treated heifers. A direct correlation was seen between locomotion score and embryo survival rate, with higher (P<0.05) early embryo survival rates in heifers with a lower locomotion score. In conclusion, supplementation with progesterone at different stages of early embryo development increased peripheral progesterone concentration and resulted in a positive association between changes in progesterone concentration during the early luteal phase and embryo survival rate. Supplementation with progesterone had no effect on either CL weight or conceptus size in pregnant animals. Lameness had a significant negative effect on early embryo survival.
Maxwell, Jessica Hooton; Kumar, Bhavna; Feng, Felix Y.; Worden, Francis P.; Lee, Julia; Eisbruch, Avraham; Wolf, Gregory T.; Prince, Mark E.; Moyer, Jeffrey S.; Teknos, Theodoros N.; Chepeha, Douglas B.; McHugh, Jonathan B.; Urba, Susan; Stoerker, Jay; Walline, Heather; Kurnit, David; Cordell, Kitrina G.; Davis, Samantha J.; Ward, Preston D.; Bradford, Carol R.; Carey, Thomas E.
2009-01-01
Purpose The goal of this study was to examine the effect of tobacco use on disease recurrence (local/regional recurrence, distant metastasis, or second primary) among HPV-positive patients with squamous cell carcinoma of the oropharynx (SCCOP) following a complete response to chemoradiation therapy. Experimental Design Between 1999 and 2007, 124 patients with advanced SCCOP (86% with stage IV) and adequate tumor tissue for HPV analysis who were enrolled in one of two consecutive University of Michigan treatment protocols were prospectively included in this study. Patients were categorized as never, former, or current tobacco users. The primary end-points were risk of disease recurrence and time to recurrence; secondary end-points were disease-specific survival and overall survival. Results One hundred and two patients (82.3%) had HPV-positive tumors. Over two-thirds (68%) of patients with HPV-positive tumors were tobacco users. Among HPV-positive patients, current tobacco users were at significantly higher risk of disease recurrence than never-tobacco users (hazard ratio = 5.2; confidence interval [1.1-24.4]; p=0.038). Thirty-five percent of HPV-positive ever tobacco users recurred compared to only 6% of HPV-positive never users and 50% of HPV-negative patients. All HPV-negative patients were tobacco users and had significantly shorter times to recurrence (p=0.002) and reduced disease-specific survival (p=0.004) and overall survival (p<0.001) compared to HPV-positive patients. Compared to HPV-positive never-tobacco users, those with a tobacco history showed a trend for reduced disease-specific survival (p=0.064) but not overall survival (p=0.221). Conclusion Current tobacco users with advanced, HPV-positive SCCOP are at higher risk of disease recurrence compared to never-tobacco users. PMID:20145161
Dynamic patient counseling: a novel concept in idiopathic pulmonary fibrosis.
Brown, A Whitney; Shlobin, Oksana A; Weir, Nargues; Albano, Maria C; Ahmad, Shahzad; Smith, Mary; Leslie, Kevin; Nathan, Steven D
2012-10-01
The characteristics of long-term survivors with idiopathic pulmonary fibrosis (IPF) have never been fully elucidated. We sought to illustrate the attenuated mortality and describe the characteristics of patients with IPF who survived at least 5 years beyond their initial presentation. Patients with IPF evaluated between 1997 and 2006 were identified through the clinic database. Patients who survived beyond 5 years from the time of their evaluation were compared with those who died or underwent lung transplantation within 5 years. Survival analyses were performed from the time of initial evaluation and contingent on annualized survival thereafter. Eighty-seven patients who survived at least 5 years formed the comparator group to whom other patients were contrasted. These patients had a higher BMI, FVC % predicted, FEV1 % predicted, total lung capacity % predicted, and diffusing capacity of lung for carbon monoxide % predicted, but a lower FEV1/FVC ratio and lower mean pulmonary artery pressures. More than one-half of these patients had moderate or severe disease at the time of presentation. Our annualized contingent survival analyses revealed a progressively increasing median survival dependent on the duration of the disease. Although we were able to demonstrate differences in our 5-year survivors, rather than being a distinct group, these patients appear to exist within a continuum of improving survival dependent on prior disease duration. This progressively improving time-dependent prognosis mandates the serial reevaluation of an individual patient’s projected outcomes. The implementation of dynamic counseling is an important concept in more accurately predicting life expectancy for patients with IPF who are frequently haunted by the prospects of a dismal survival.
Kirol, Christopher P.; Sutphin, Andrew L.; Bond, Laura S.; Fuller, Mark R.; Maechtle, Thomas L.
2015-01-01
Sagebrush Artemisia spp. habitats being developed for oil and gas reserves are inhabited by sagebrush obligate species — including the greater sage-grouse Centrocercus urophasianus (sage-grouse) that is currently being considered for protection under the U.S. Endangered Species Act. Numerous studies suggest increasing oil and gas development may exacerbate species extinction risks. Therefore, there is a great need for effective on-site mitigation to reduce impacts to co-occurring wildlife such as sage-grouse. Nesting success is a primary factor in avian productivity and declines in nesting success are also thought to be an important contributor to population declines in sage-grouse. From 2008 to 2011 we monitored 296 nests of radio-marked female sage-grouse in a natural gas (NG) field in the Powder River Basin, Wyoming, USA, and compared nest survival in mitigated and non-mitigated development areas and relatively unaltered areas to determine if specific mitigation practices were enhancing nest survival. Nest survival was highest in relatively unaltered habitats followed by mitigated, and then non-mitigated NG areas. Reservoirs used for holding NG discharge water had the greatest support as having a direct relationship to nest survival. Within a 5-km2 area surrounding a nest, the probability of nest failure increased by about 15% for every 1.5 km increase in reservoir water edge. Reducing reservoirs was a mitigation focus and sage-grouse nesting in mitigated areas were exposed to almost half of the amount of water edge compared to those in non-mitigated areas. Further, we found that an increase in sagebrush cover was positively related to nest survival. Consequently, mitigation efforts focused on reducing reservoir construction and reducing surface disturbance, especially when the surface disturbance results in sagebrush removal, are important to enhancing sage-grouse nesting success.
Local delivery of cancer-cell glycolytic inhibitors in high-grade glioma
Wicks, Robert T.; Azadi, Javad; Mangraviti, Antonella; Zhang, Irma; Hwang, Lee; Joshi, Avadhut; Bow, Hansen; Hutt-Cabezas, Marianne; Martin, Kristin L.; Rudek, Michelle A.; Zhao, Ming; Brem, Henry; Tyler, Betty M.
2015-01-01
Background 3-bromopyruvate (3-BrPA) and dichloroacetate (DCA) are inhibitors of cancer-cell specific aerobic glycolysis. Their application in glioma is limited by 3-BrPA's inability to cross the blood-brain-barrier and DCA's dose-limiting toxicity. The safety and efficacy of intracranial delivery of these compounds were assessed. Methods Cytotoxicity of 3-BrPA and DCA were analyzed in U87, 9L, and F98 glioma cell lines. 3-BrPA and DCA were incorporated into biodegradable pCPP:SA wafers, and the maximally tolerated dose was determined in F344 rats. Efficacies of the intracranial 3-BrPA wafer and DCA wafer were assessed in a rodent allograft model of high-grade glioma, both as a monotherapy and in combination with temozolomide (TMZ) and radiation therapy (XRT). Results 3-BrPA and DCA were found to have similar IC50 values across the 3 glioma cell lines. 5% 3-BrPA wafer-treated animals had significantly increased survival compared with controls (P = .0027). The median survival of rats with the 50% DCA wafer increased significantly compared with both the oral DCA group (P = .050) and the controls (P = .02). Rats implanted on day 0 with a 5% 3-BrPA wafer in combination with TMZ had significantly increased survival over either therapy alone. No statistical difference in survival was noted when the wafers were added to the combination therapy of TMZ and XRT, but the 5% 3-BrPA wafer given on day 0 in combination with TMZ and XRT resulted in long-term survivorship of 30%. Conclusion Intracranial delivery of 3-BrPA and DCA polymer was safe and significantly increased survival in an animal model of glioma, a potential novel therapeutic approach. The combination of intracranial 3-BrPA and TMZ provided a synergistic effect. PMID:25053853
Shadyab, Aladdin H; Macera, Caroline A; Shaffer, Richard A; Jain, Sonia; Gallo, Linda C; Gass, Margery L S; Waring, Molly E; Stefanick, Marcia L; LaCroix, Andrea Z
2017-01-01
The aim of the present study was to investigate associations between reproductive factors and survival to age 90 years. This was a prospective study of postmenopausal women from the Women's Health Initiative recruited from 1993 to 1998 and followed until the last outcomes evaluation on August 29, 2014. Participants included 16,251 women born on or before August 29, 1924 for whom survival to age 90 during follow-up was ascertained. Women were classified as having survived to age 90 (exceptional longevity) or died before age 90. Multivariable logistic regression models were used to evaluate associations of ages at menarche and menopause (natural or surgical) and reproductive lifespan with longevity, adjusting for demographic, lifestyle, and reproductive characteristics. Participants were on average aged 74.7 years (range, 69-81 y) at baseline. Of 16,251 women, 8,892 (55%) survived to age 90. Women aged at least 12 years at menarche had modestly increased odds of longevity (odds ratio [OR], 1.09; 95% CI, 1.00-1.19). There was a significant trend toward increased longevity for later age at menopause (natural or surgical; Ptrend = 0.01), with ORs (95% CIs) of 1.19 (1.04-1.36) and 1.18 (1.02-1.36) for 50 to 54 and at least 55 compared with less than 40 years, respectively. Later age at natural menopause as a separate exposure was also significantly associated with increased longevity (Ptrend = 0.02). Longer reproductive lifespan was significantly associated with increased longevity (Ptrend = 0.008). The odds of longevity were 13% (OR 1.13; 95% CI, 1.03-1.25) higher in women with more than 40 compared with less than 33 reproductive years. Reproductive characteristics were associated with late-age survival in older women.
Earlam, S; Glover, C; Davies, M; Fordy, C; Allen-Mersh, T G
1997-05-01
Since systemic and regional (HAI) fluorinated pyrimidine chemotherapies offer similar survival benefit in treatment of colorectal liver metastases (CLM), we sought to identify their impact on quality of life (QoL), which might be a useful indicator of treatment preference. We compared QoL in 135 CLM patients managed by symptom control (n = 49 patients), systemic fluorouracil (5FU)/folinic acid (n = 35), or hepatic arterial floxuridine (FUDR) (n = 51). Full blood count and liver function tests, World Health Organization (WHO) toxicity criteria, and QoL (Rotterdam Symptom Checklist [RSC], the Sickness Impact Profile [SIP], and the Hospital Anxiety and Depression scale [HAD]) were measured monthly in all patients. The HAD anxiety score was significantly increased in symptom control compared with chemotherapy patients 1 month after randomization. There was a significant increase in RSC physical score (repeated measures, P = .05), and in scores for sore mouth (P < .01), dry mouth (P < .01), and tingling hands and feet (P < .01) in systemic chemotherapy compared with symptom control patients. Significant QoL differences (repeated measures and Mann-Whitney U [MWU]) between HAI and symptom control patients were not detected. Systemic chemotherapy patients lived for significantly longer (log-rank test, P ≤ .0001) with abnormal HAD anxiety, RSC psychosocial, or RSC sore mouth scores compared with HAI patients, but there were no overall survival differences. Randomization to symptom control only was associated with increased anxiety. QoL with systemic chemotherapy was impaired by side effects. HAI was associated with similar survival to systemic chemotherapy but with better sustained QoL.
Bishop, Andrew J; McDonald, Mark W; Chang, Andrew L; Esiashvili, Natia
2012-01-01
To evaluate the incidence of infant brain tumors and survival outcomes by disease and treatment variables. The Surveillance, Epidemiology, and End Results (SEER) Program November 2008 submission database provided age-adjusted incidence rates and individual case information for primary brain tumors diagnosed between 1973 and 2006 in infants less than 12 months of age. Between 1973 and 1986, the incidence of infant brain tumors increased from 16 to 40 cases per million (CPM), and from 1986 to 2006, the annual incidence rate averaged 35 CPM. Leading histologies by annual incidence in CPM were gliomas (13.8), medulloblastoma and primitive neuroectodermal tumors (6.6), and ependymomas (3.6). The annual incidence was higher in whites than in blacks (35.0 vs. 21.3 CPM). Infants with low-grade gliomas had the highest observed survival, and those with atypical teratoid rhabdoid tumors (ATRTs) or primary rhabdoid tumors of the brain had the lowest. Between 1979 and 1993, the annual rate of cases treated with radiation within the first 4 months from diagnosis declined from 20.5 CPM to <2 CPM. For infants with medulloblastoma, desmoplastic histology and treatment with both surgery and upfront radiation were associated with improved survival, but on multivariate regression, only combined surgery and radiation remained associated with improved survival, with a hazard ratio for death of 0.17 compared with surgery alone (p = 0.005). For ATRTs, those treated with surgery and upfront radiation had a 12-month survival of 100% compared with 24.4% for those treated with surgery alone (p = 0.016). For ependymomas survival was higher in patients treated in more recent decades (p = 0.001). The incidence of infant brain tumors has been stable since 1986. Survival outcomes varied markedly by histology. For infants with medulloblastoma and ATRTs, improved survival was observed in patients treated with both surgery and early radiation compared with those treated with surgery alone. Copyright © 2012 Elsevier Inc. All rights reserved.
Wang, Dan; Ding, Xiaoming; Xue, Wujun; Zheng, Jin; Tian, Xiaohui; Li, Yang; Wang, Xiaohong; Song, Huanjin; Liu, Hua; Luo, Xiaohui
2017-01-01
It is unknown whether a scaffold containing both small intestinal submucosa (SIS) and mesenchymal stem cells (MSCs) for transplantation may improve pancreatic islet function and survival. In this study, we examined the effects of a SIS-MSC scaffold on islet function and survival in vitro and in vivo. MSCs and pancreatic islets were isolated from Sprague-Dawley rats, and SIS was isolated from Bamei pigs. The islets were apportioned among 3 experimental groups as follows: SIS-islets, SIS-MSC-islets and control-islets. In vitro, islet function was measured by a glucose-stimulated insulin secretion test; cytokines in cultured supernatants were assessed by enzyme-linked immunosorbent assay; and gene expression was analyzed by reverse transcription-quantitative PCR. In vivo, islet transplantation was performed in rats, and graft function and survival were monitored by measuring the blood glucose levels. In vitro, the SIS-MSC scaffold was associated with improved islet viability and enhanced insulin secretion compared with the controls, as well as with increased expression of insulin 1 (Ins1), pancreatic and duodenal homeobox 1 (Pdx1), platelet endothelial cell adhesion molecule 1 [Pecam1; also known as cluster of differentiation 31 (CD31)] and vascular endothelial growth factor A (Vegfa) in the islets, increased growth factor secretion, and decreased tumor necrosis factor (TNF) secretion. In vivo, the SIS-MSC scaffold was associated with improved islet function and graft survival compared with the SIS and control groups. On the whole, our findings demonstrate that the SIS-MSC scaffold significantly improved pancreatic islet function and survival in vitro and in vivo. This improvement may be associated with the upregulation of insulin expression, the improvement of islet microcirculation and the secretion of cytokines. PMID:27909715
Chua, Hui Lin; Plett, P Artur; Sampson, Carol H; Katz, Barry P; Carnathan, Gilbert W; MacVittie, Thomas J; Lenden, Keith; Orschell, Christie M
2014-01-01
In an effort to expand the worldwide pool of available medical countermeasures (MCM) against radiation, the PEGylated G-CSF (PEG-G-CSF) molecules Neulasta and Maxy-G34, a novel PEG-G-CSF designed for increased half-life and enhanced activity compared to Neulasta, were examined in a murine model of the Hematopoietic Syndrome of the Acute Radiation Syndrome (H-ARS), along with the lead MCM for licensure and stockpiling, G-CSF. Both PEG-G-CSFs were shown to retain significant survival efficacy when administered as a single dose 24 h post-exposure, compared to the 16 daily doses of G-CSF required for survival efficacy. Furthermore, 0.1 mg/kg of either PEG-G-CSF produced survival of lethally irradiated mice similar to that obtained with a 10-fold higher dose. The one-dose/low-dose administration schedules are attractive attributes of radiation MCM given the logistical challenges of medical care in a mass casualty event. Maxy-G34-treated mice that survived H-ARS were examined for residual bone marrow damage (RBMD) up to 9 mo post-exposure. Despite differences in Sca-1 expression and cell cycle position in some hematopoietic progenitor phenotypes, Maxy-G34-treated mice exhibited the same degree of hematopoietic stem cell (HSC) insufficiency as vehicle-treated H-ARS survivors in competitive transplantation assays of 150 purified Sca-1+ cKit+ lin- CD150+ cells. These data suggest that Maxy-G34, at the dose, schedule, and time frame examined, did not mitigate RBMD but significantly increased survival from H-ARS at one-tenth the dose previously tested, providing strong support for advanced development of Maxy-G34, as well as Neulasta, as MCM against radiation.
Soghomonyan, Diana; Trchounian, Armen
2013-01-01
The effects of low-intensity electromagnetic irradiation (EMI) at frequencies of 51.8 and 53 GHz on Lactobacillus acidophilus growth and survival were examined. These effects were compared with the antibacterial effects of the antibiotic ceftazidime. The decrease in bacterial growth rate caused by EMI was comparable with the inhibitory effect of ceftazidime (minimal inhibitory concentration, 16 μM), and no enhanced action was observed with combined application of EMI and the antibiotic. However, EMI enhanced the inhibitory effect of the antibiotic on bacterial survival. The kinetics of the oxidation-reduction potential of the bacterial suspension during the first 24 h of growth were changed by EMI and ceftazidime. The changes were more strongly expressed by the combined application of EMI and antibiotic, especially up to 12 h. Moreover, EMI did not change overall energy (glucose)-dependent H(+) efflux across the membrane, but it increased N,N'-dicyclohexylcarbodiimide (DCCD)-inhibited H(+) efflux. In contrast, EMI in combination with ceftazidime decreased DCCD-sensitive H(+) efflux. Low-intensity EMI had an inhibitory effect on L. acidophilus growth and survival. The effect on bacterial survival was more pronounced in combination with ceftazidime. The H(+)-translocating F0F1-ATPase, for which DCCD is a specific inhibitor, might be a target for EMI and ceftazidime. The bactericidal effects revealed on L. acidophilus can be applied in biotechnology, food production and safety technology.
Sasidhar, Manda V; Itoh, Noriko; Gold, Stefan M; Lawson, Gregory W; Voskuhl, Rhonda R
2012-08-01
Many autoimmune diseases are characterised by a female predominance. This may be caused by sex hormones, sex chromosomes or both. This report uses a transgenic mouse model to investigate how sex chromosome complement, not confounded by differences in gonadal type, might contribute to lupus pathogenesis. Transgenic NZM2328 mice were created by deletion of the Sry gene from the Y chromosome, thereby separating genetic from gonadal sex. Survival, renal histopathology and markers of immune activation were compared in mice carrying the XX versus the XY(-) sex chromosome complement, with each genotype being ovary bearing. Mice with XX sex chromosome complement compared with XY(-) exhibited poorer survival rates and increased kidney pathology. Splenic T lymphocytes from XX mice demonstrated upregulated X-linked CD40 ligand expression and higher levels of activation markers ex vivo. Increased MMP, TGF and IL-13 production was found, while IL-2 was lower in XX mice. An accumulation of splenic follicular B cells and peritoneal marginal zone B cells was observed, coupled with upregulated costimulatory marker expression on B cells in XX mice. These data show that the XX sex chromosome complement, compared with XY(-), is associated with accelerated spontaneous lupus.
Comparative optimism in older adults' future health expectations.
Vanderzanden, Karen; Ruthig, Joelle C
2018-05-13
Despite a common belief that health declines with age, many older adults remain optimistic about their future health. However, the longitudinal impact of personal and comparatively optimistic future health estimates (FHEs) is unclear. Among 408 older adults (M age = 70.32 years), this study identified the prevalence, source, and two-year stability of comparatively optimistic FHEs; examined demographic, psychosocial, and health correlates of comparative FHEs; and assessed the role of comparative FHEs in predicting eight-year survival odds. Nearly half of participants were comparatively optimistic due to interpersonal pessimism more so than personal optimism. Regarding stability, comparative optimism declined over the two-year period. Being younger and having more perceived control, dispositional optimism, and recent positive emotions were associated with better FHEs for oneself and a similar other. Beyond effects of age, gender, relationship status, and dispositional optimism, optimistic personal FHEs predicted eight-year survival odds. Findings have implications for predicting survival and advancing the conceptual understanding of comparative FHEs. Statement of contribution What is already known on the subject? Previous research has demonstrated that older adults tend to believe diminished health accompanies increasing age. Despite this notion, older adults remain comparatively optimistic about their health. What does this study add? The longitudinal results of the current study indicated that nearly half of participants were categorized as comparative optimists, primarily due to interpersonal pessimism. The current study demonstrated that there is little distinction between personal FHEs and those for a similar other in terms of demographic, psychosocial, and health correlates. The current study identified factors that predicted eight-year survival among older adults, such as being female, younger, in a committed relationship, and better personal FHEs. © 2018 The British Psychological Society.
Zheng, Jun; Xiang, Jie; Zhou, Jie; Li, Zhiwei; Hu, Zhenhua; Lo, Chung Mau; Wang, Weilin
2014-01-01
Patients with a history of diabetes mellitus (DM) have worse survival than those without DM after liver transplantation. However, the effect of liver grafts from DM donors on the post-transplantation survival of recipients is unclear. Using the Scientific Registry of Transplant Recipients database (2004–2008), 25,413 patients were assessed. Among them, 2,469 recipients received grafts from donors with DM. The demographics and outcome of patients were assessed. Patient survival was assessed using Kaplan–Meier methodology and Cox regression analyses. Recipients from DM donors experienced worse graft survival than recipients from non-DM donors (one-year survival: 81% versus 85%, and five-year survival: 67% versus 74%, P<0.001, respectively). Graft survival was significantly lower for recipients from DM donors with DM duration >5 years (P<0.001) compared with those with DM duration <5 years. Cox regression analyses showed that DM donors were independently associated with worse graft survival (hazard ratio, 1.11; 95% confidence interval, 1.02–1.19). The effect of DM donors was more pronounced on certain underlying liver diseases of recipients. Increases in the risk of graft loss were noted among recipients from DM donors with hepatitis-C virus (HCV) infection, whereas those without HCV experienced similar outcomes compared with recipients from non-DM donors. These data suggest that recipients from DM donors experience significantly worse patient survival after liver transplantation. However, in patients without HCV infection, using DM donors was not independently associated with worse post-transplantation graft survival. Matching these DM donors to recipients without HCV may be safe. PMID:24847864
High tie versus low tie of the inferior mesenteric artery in colorectal cancer: A meta-analysis.
Yang, Yafan; Wang, Guiying; He, Jingli; Zhang, Jianfeng; Xi, Jinchuan; Wang, Feifei
2018-04-01
Colorectal cancer surgery includes "high tie" and "low tie" of the inferior mesenteric artery (IMA). However, the level of ligation is closely related to the blood supply of the anastomosis, which may affect the leakage rate, and it is unclear which technique confers a lower anastomotic leakage (AL) rate and a survival advantage. We aimed to compare the effectiveness and impact of IMA high ligation versus IMA low ligation on anastomotic leakage, lymph node yield, and 5-year survival. A list of relevant studies, published in English from 1990 to 2017, was obtained independently by two reviewers from databases such as PubMed, Medline, ScienceDirect and Web of Science. Anastomotic leakage rate, the yield of lymph nodes and 5-year survival were compared using Review Manager 5.3. There was no significant difference in anastomotic leakage, number of lymph nodes retrieved or 5-year survival rate between the two techniques. Neither the high-tie nor the low-tie strategy shows a clear advantage in terms of anastomotic leakage rate, harvested lymph nodes, or 5-year survival rate; further RCTs are needed. Copyright © 2018 IJS Publishing Group Ltd. Published by Elsevier Ltd. All rights reserved.
Lung transplantation in idiopathic pulmonary fibrosis: a systematic review of the literature
2014-01-01
Background Idiopathic pulmonary fibrosis (IPF) is a distinct form of interstitial pneumonia with unknown origin and poor prognosis. Current pharmacologic treatments are limited and lung transplantation is a viable option for appropriate patients. The aim of this review was to summarize lung transplantation survival in IPF patients overall, between single (SLT) vs. bilateral lung transplantation (BLT), pre- and post Lung Allocation Score (LAS), and summarize wait-list survival. Methods A systematic review of English-language studies published in Medline or Embase between 1990 and 2013 was performed. Eligible studies were those of observational design reporting survival post-lung transplantation or while on the wait list among IPF patients. Results Median survival post-transplantation among IPF patients is estimated at 4.5 years. From ISHLT and OPTN data, one year survival ranged from 75% - 81%; 3-year: 59% - 64%; and 5-year: 47% - 53%. Post-transplant survival is lower for IPF vs. other underlying pre-transplant diagnoses. The proportion of IPF patients receiving BLT has steadily increased over the last decade and a half. Unadjusted analyses suggest improved long-term survival for BLT vs. SLT; after adjustment for patient characteristics, the differences tend to disappear. IPF patients account for the largest proportion of patients on the wait list and while wait list time has decreased, the number of transplants for IPF patients has increased over time. OPTN data show that wait list mortality is higher for IPF patients vs. other diagnoses. The proportion of IPF patients who died while awaiting transplantation ranged from 14% to 67%. While later transplant year was associated with increased survival, no significant differences were noted pre vs. post LAS implementation; however a high LAS vs low LAS was associated with decreased one-year survival. Conclusions IPF accounts for the largest proportion of patients awaiting lung transplants, and IPF is associated with higher wait-list and post-transplant mortality vs. other diagnoses. Improved BLT vs. SLT survival may be the result of selection bias. Survival pre- vs. post LAS appears to be similar except for IPF patients with high LAS, who have lower survival compared to pre-LAS. Data on post-transplant morbidity outcomes are sparse. PMID:25127540
Wu, Mengrui; Wang, Yiping; Deng, Lianfu; Chen, Wei; Li, Yi-Ping
2012-01-01
Osteoclasts are the principal bone-resorbing cells. Precise control of balanced osteoclast activity is indispensable for bone homeostasis. Osteoclast activation mediated by the RANK-TRAF6 axis has been clearly identified. However, a negative regulation machinery in osteoclasts remains unclear. TRAF family member-associated NF-κB activator (TANK) is induced about 10-fold during osteoclastogenesis, according to a genome-wide analysis of gene expression before and after osteoclast maturation, and confirmed by western blot and quantitative RT-PCR. Bone marrow macrophages (BMMs) transduced with lentivirus carrying tank-shRNA were induced to form osteoclasts in the presence of RANKL and M-CSF. Tank expression was downregulated by 90% by Tank-shRNA, which was confirmed by western blot. Compared with wild-type (WT) cells, osteoclastogenesis of Tank-silenced BMMs was increased, according to tartrate-resistant acid phosphatase (TRAP) staining on days 5 and 7. The number of bone resorption pits formed by Tank-silenced osteoclasts was increased by 176% compared with WT cells, as shown by wheat germ agglutinin (WGA) staining and scanning electron microscopy (SEM) analysis. The survival rate of Tank-silenced mature osteoclasts was also increased. However, acid production of Tank-knockdown cells was not changed compared with control cells. IκBα phosphorylation was increased in Tank-silenced cells, indicating that TANK may negatively regulate NF-κB activity in osteoclasts. In conclusion, Tank, whose expression is increased during osteoclastogenesis, inhibits osteoclast formation, activity and survival by regulating NF-κB activity and c-FLIP expression, placing Tank in a negative feedback loop in bone resorption. These results may provide means for therapeutic intervention in diseases of excessive bone resorption. PMID:23139637
Survival of postfledging Forster's terns in relation to mercury exposure in San Francisco Bay
Ackerman, Joshua T.; Eagles-Smith, Collin A.; Takekawa, John Y.; Iverson, S.A.
2008-01-01
We examined factors influencing mercury concentrations in 90 fledgling Forster's terns (Sterna forsteri) and evaluated whether mercury influenced postfledging survival in San Francisco Bay, California. Mercury concentrations (±SE) in chicks 21-29 days old (just before fledging) were 0.33 ± 0.01 μg g-1 ww for blood and 6.44 ± 0.28 μg g-1 fw for breast feathers. Colony site had an overriding influence on fledgling contamination, however hatching date and age also affected blood, but not feather, mercury concentrations. Blood mercury concentrations decreased by 28% during the 50-day hatching period and increased with chick age by 30% during the last week prior to fledging. Using radio-telemetry, we calculated that cumulative survival during the 35-day postfledging time period was 0.81 ± 0.09 (SE). Postfledging survival rates increased with size-adjusted mass, and cumulative survival probability was 61% lower for terns with the lowest, compared to the highest, observed masses. Conversely, survival was not influenced by blood mercury concentration, time since fledging, sex, or hatch date. Mercury concentrations in breast feathers of fledglings found dead at nesting colonies also were no different than those in live chicks. Our results indicate that colony site, hatching date, and age influenced mercury concentrations in fledgling Forster's terns, but that mercury did not influence postfledging survival. © 2008 Springer Science+Business Media, LLC.
Survival benefit of neoadjuvant chemotherapy for resectable breast cancer: A meta-analysis.
Chen, Yan; Shi, Xiu-E; Tian, Jin-Hui; Yang, Xu-Juan; Wang, Yong-Feng; Yang, Ke-Hu
2018-05-01
Neoadjuvant chemotherapy (NAC) increases breast conservation rates in patients with resectable breast cancer at the associated cost of higher locoregional recurrence rates; however, the magnitude of the survival benefits of NAC for these patients remains undefined. Therefore, we aimed to clarify the survival benefit of NAC versus postoperative chemotherapy by conducting an updated meta-analysis of randomized clinical trials (RCTs). The authors searched the Cochrane Library, PubMed, Embase, Web of Science, Chinese biomedical literature database, and Chinese Scientific Journals full-text database from their inception to December 2016. The authors identified relevant RCTs that compared NAC with postoperative chemotherapy in the treatment of operable breast cancer. The main endpoints were overall survival (OS) and recurrence-free survival (RFS). A total of 21 citations representing 16 unique studies were eligible. There were 787 deaths among 2794 patients assigned to NAC groups and 816 deaths among 2799 patients assigned to adjuvant chemotherapy groups. A meta-analysis of the data indicated that there was no significant difference in OS (hazard ratio [HR] = 1.03, 95% confidence interval [CI]: 0.94-1.13, P = .51) or RFS (HR = 1.01, 95% CI: 0.93-1.10, P = .80) between the NAC and postoperative chemotherapy groups. The pooled HR estimate for OS was not influenced by NAC cycles, the total number of chemotherapy cycles, administration of tamoxifen, administration of adjuvant chemotherapy, or type of NAC regimen. Subgroup analysis showed that the pooled HR estimate for RFS was influenced by anthracycline-containing regimens. Patients with a pathological complete response had superior survival outcomes compared with patients who had residual disease. The survival benefits for patients with operable breast cancer who received either NAC or adjuvant chemotherapy based on anthracycline regimens were comparable.
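Pooled hazard ratios such as the OS estimate above are commonly obtained by inverse-variance weighting of trial-level log hazard ratios. The sketch below illustrates a generic fixed-effect version of that calculation on hypothetical trials; it is not a reproduction of the authors' Review Manager analysis, and the trial values and function name are invented.

    import math

    def pool_hazard_ratios(hrs, ci_lows, ci_highs):
        # fixed-effect inverse-variance pooling on the log-HR scale;
        # SE recovered from the 95% CI width: SE = (ln(upper) - ln(lower)) / (2 * 1.96)
        log_hrs = [math.log(hr) for hr in hrs]
        ses = [(math.log(u) - math.log(l)) / (2 * 1.96) for l, u in zip(ci_lows, ci_highs)]
        weights = [1.0 / se ** 2 for se in ses]
        pooled = sum(w * lh for w, lh in zip(weights, log_hrs)) / sum(weights)
        pooled_se = math.sqrt(1.0 / sum(weights))
        return (math.exp(pooled),
                math.exp(pooled - 1.96 * pooled_se),
                math.exp(pooled + 1.96 * pooled_se))

    # hypothetical trial-level hazard ratios, not the 16 RCTs analysed above
    print(pool_hazard_ratios([0.95, 1.10, 1.05], [0.80, 0.92, 0.85], [1.13, 1.32, 1.30]))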
Hung, Giun-Yi; Yen, Hsiu-Ju; Yen, Chueh-Chuan; Wu, Po-Kuei; Chen, Cheng-Fong; Chen, Paul C-H; Wu, Hung-Ta H; Chiou, Hong-Jen; Chen, Wei-Ming
2016-04-01
The aim of this study was to compare survival before and after 2004 and define the prognostic factors for high-grade osteosarcomas beyond those of typical young patients with localized extremity disease. Few studies have reported the long-term treatment outcomes of high-grade osteosarcoma in Taiwan. A total of 202 patients with primary high-grade osteosarcoma who received primary chemotherapy at Taipei Veterans General Hospital between January 1995 and December 2011 were retrospectively evaluated and compared by period (1995-2003 vs 2004-2011). Patients of all ages and tumor sites and those following or not following controlled protocols were included in analysis of demographic, tumor-related, and treatment-related variables and survival. Overall survival and progression-free survival at 5 years were, respectively, 67.7% and 48% for all patients (n = 202), 77.3% and 57.1% for patients without metastasis (n = 157), and 33.9% and 14.8% for patients with metastasis (n = 45). The survival rates of patients treated after 2004 were significantly higher (by 13%-16%) compared with those of patients treated before 2004, with an accompanying 30% increase in histological good response rate (P = .002). Factors significantly contributing to inferior survival in univariate and multivariate analyses were diagnosis before 2004, metastasis at diagnosis, and being a noncandidate for a controlled treatment protocol. By comparison with the regimens used at our institution before 2004, the current results support the effectiveness of the post-2004 regimens, which consisted of substantially reduced cycles of high-dose methotrexate and a higher dosage of ifosfamide per cycle, cisplatin, and doxorubicin, for treating high-grade osteosarcoma in Asian patients.
Trends in overall survival and costs of multiple myeloma, 2000-2014.
Fonseca, R; Abouzaid, S; Bonafede, M; Cai, Q; Parikh, K; Cosler, L; Richardson, P
2017-09-01
Little real-world evidence is available to describe the recent trends in treatment costs and outcomes for patients with multiple myeloma (MM). Using the Truven Health MarketScan Research Databases linked with social security administration death records, this study found that the percentage of MM patients using novel therapy continuously increased from 8.7% in 2000 to 61.3% in 2014. Compared with MM patients diagnosed in earlier years, those diagnosed after 2010 had higher rates of novel therapy use and better survival outcomes; patients diagnosed in 2012 were 1.25 times more likely to survive 2 years than those diagnosed in 2006. MM patients showed improved survival over the study period, with the 2-year survival gap between MM patients and matched controls decreasing at a rate of 3% per year. Total costs among MM patients have increased in all healthcare services over the years; however, the relative contribution of drug costs has remained fairly stable since 2009 despite new novel therapies coming to market. Findings from this study corroborate clinical data, suggesting a paradigm shift in MM treatment over the past decade that is associated with substantial survival gains. Future studies should focus on the impact of specific novel agents on patients' outcomes.
Disparities in survival after Hodgkin lymphoma: a population-based study
Keegan, Theresa H.M.; Clarke, Christina A.; Chang, Ellen T.; Shema, Sarah J.; Glaser, Sally L.
2009-01-01
Survival after Hodgkin lymphoma (HL) is generally favorable, but may vary by patient demographic characteristics. The authors examined HL survival according to race/ethnicity and neighborhood socioeconomic status (SES), determined from residential census block group at diagnosis. For 12,492 classical HL patients ≥15 years diagnosed in California during 1988-2006 and followed through 2007, we determined risk of overall and HL-specific death using Cox proportional hazards regression; analyses were stratified by age and Ann Arbor stage. Irrespective of disease stage, patients with lower neighborhood SES had worse overall and HL-specific survival than patients with higher SES. Patients in the lowest quintile of neighborhood SES had a 64% (patients aged 15-44 years) and 36% (≥45 years) increased risk of HL-death compared to patients in the highest quintile of SES; SES results were similar for overall survival. Even after adjustment for neighborhood SES, black and Hispanic patients had risks of HL-death that were 74% and 43% higher (ages 15-44 years) and 40% and 17% higher (≥45 years), respectively, than those of white patients. The racial/ethnic differences in survival were evident for all stages of disease. These data provide evidence for substantial, and probably remediable, racial/ethnic and neighborhood SES disparities in HL outcomes. PMID:19557531
Liu, George Y; Essex, Anthony; Buchanan, John T; Datta, Vivekanand; Hoffman, Hal M; Bastian, John F; Fierer, Joshua; Nizet, Victor
2005-07-18
Golden color imparted by carotenoid pigments is the eponymous feature of the human pathogen Staphylococcus aureus. Here we demonstrate a role of this hallmark phenotype in virulence. Compared with the wild-type (WT) bacterium, a S. aureus mutant with disrupted carotenoid biosynthesis is more susceptible to oxidant killing, has impaired neutrophil survival, and is less pathogenic in a mouse subcutaneous abscess model. The survival advantage of WT S. aureus over the carotenoid-deficient mutant is lost upon inhibition of neutrophil oxidative burst or in human or murine nicotinamide adenine dinucleotide phosphate oxidase-deficient hosts. Conversely, heterologous expression of the S. aureus carotenoid in the nonpigmented Streptococcus pyogenes confers enhanced oxidant and neutrophil resistance and increased animal virulence. Blocking S. aureus carotenogenesis increases oxidant sensitivity and decreases whole-blood survival, suggesting a novel target for antibiotic therapy.
Winter, Alexander; Sirri, Eunice; Jansen, Lina; Wawroschek, Friedhelm; Kieschke, Joachim; Castro, Felipe A; Krilaviciute, Agne; Holleczek, Bernd; Emrich, Katharina; Waldmann, Annika; Brenner, Hermann
2017-04-01
To better understand the influence of prostate-specific antigen (PSA) screening and other health system determinants on prognosis of prostate cancer, up-to-date relative survival (RS), stage distributions, and trends in survival and incidence in Germany were evaluated and compared with the United States of America (USA). Incidence and mortality rates for Germany and the USA for the period 1999-2010 were obtained from the Centre for Cancer Registry Data at the Robert Koch Institute and the USA Surveillance Epidemiology and End Results (SEER) database. For analyses on stage and survival, data from 12 population-based cancer registries in Germany and from the SEER-13 database were analysed. Patients (aged ≥ 15 years) diagnosed with prostate cancer (1997-2010) and mortality follow-up to December 2010 were included. The 5- and 10-year RS and survival trends (2002-2010) were calculated using standard and model-based period analysis. Between 1999 and 2010, prostate cancer incidence decreased in the USA but increased in Germany. Nevertheless, incidence remained higher in the USA throughout the study period (99.8 vs 76.0 per 100,000 in 2010). The proportion of localised disease significantly increased from 51.9% (1998-2000) to 69.6% (2007-2010) in Germany and from 80.5% (1998-2000) to 82.6% (2007-2010) in the USA. Mortality slightly decreased in both countries (1999-2010). Overall, 5- and 10-year RS was lower in Germany (93.3%; 90.7%) than in the USA (99.4%; 99.6%) but comparable after adjustment for stage. The same patterns were seen in age-specific analyses. Improvements seen in prostate cancer survival between 2002-2004 and 2008-2010 (5-year RS: 87.4% and 91.2%; +3.8% units) in Germany disappeared after adjustment for stage (P = 0.8). The survival increase in Germany and the survival advantage in the USA might be explained by differences in incidence and stage distributions over time and across countries. Effects of early detection or a lead-time bias due to the more widespread utilisation and earlier introduction of PSA testing in the USA are likely to explain the observed patterns. © 2016 The Authors BJU International © 2016 BJU International Published by John Wiley & Sons Ltd.
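As background for the relative survival figures quoted above, relative survival is the ratio of the observed all-cause survival in the patient cohort to the expected survival of a demographically matched general population taken from life tables. A minimal sketch with made-up values (not the German or SEER-13 estimates):

```python
# Relative survival (RS) as the ratio of observed to expected survival.
# Inputs are illustrative placeholders, not values from the study.
observed_5y = 0.84   # all-cause survival of the patient cohort at 5 years
expected_5y = 0.90   # 5-year survival of age-matched men from general-population life tables

relative_5y = observed_5y / expected_5y
print(f"5-year relative survival: {relative_5y:.1%}")   # about 93%
```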
Levosimendan improves postresuscitation outcomes in a rat model of CPR.
Huang, Lei; Weil, Max Harry; Sun, Shijie; Cammarata, Gianluca; Cao, Lan; Tang, Wanchun
2005-11-01
In this study we sought to determine whether a calcium sensitizer, levosimendan, would have a more favorable effect on postresuscitation myocardial function and, consequently, postresuscitation survival than beta-adrenergic dobutamine. The very low rate of survival to hospital discharge among resuscitated victims is attributed, in part, to postresuscitation myocardial failure, and dobutamine has been recommended for the management of postresuscitation myocardial failure. We studied a total of 15 animals. Ventricular fibrillation was induced in Sprague-Dawley rats weighing 450 to 550 g. Cardiopulmonary resuscitation (CPR), including chest compressions and mechanical ventilation, was begun after 8 minutes of untreated cardiac arrest. Electrical defibrillation was attempted after 6 minutes of CPR. Each animal was resuscitated. Animals were randomized to undergo treatment with levosimendan, dobutamine, or saline-solution placebo. These agents were administered 10 minutes after the return of spontaneous circulation. Levosimendan was administered in a loading dose of 12 microg kg(-1) over a 10-minute period, followed by infusion of 0.3 microg kg(-1) min(-1) over the next 230 minutes. Dobutamine was continuously infused at a dosage of 3 microg kg(-1) min(-1). Saline-solution placebo was administered in the same volume and over the same amount of time as levosimendan. Levosimendan and dobutamine produced comparable increases in cardiac output and in the rate of left-ventricular pressure increase. However, administration of levosimendan resulted in lower heart rates and smaller increases in left ventricular diastolic pressure compared with both dobutamine and placebo. The duration of postresuscitation survival was significantly greater with levosimendan (16 +/- 2 hours), intermediate with dobutamine (11 +/- 2 hours), and least with saline-solution placebo (8 +/- 1 hours). Levosimendan and dobutamine both improved postresuscitation myocardial function. However, levosimendan produced more favorable postresuscitation myocardial function and increased the duration of postresuscitation survival.
Pankhurst, Laura; Hudson, Alex; Mumford, Lisa; Willicombe, Michelle; Galliford, Jack; Shaw, Olivia; Thuraisingham, Raj; Puliatti, Carmelo; Talbot, David; Griffin, Sian; Torpey, Nicholas; Ball, Simon; Clark, Brendan; Briggs, David; Fuggle, Susan V.; Higgins, Robert M.
2017-01-01
Background: ABO and HLA antibody incompatible (HLAi) renal transplants (AIT) now comprise around 10% of living donor kidney transplants. However, the relationships between pretransplant factors and medium-term outcomes are not fully understood, especially in relation to factors that may vary between centers. Methods: The comprehensive national registry of AIT in the United Kingdom was investigated to describe the donor, recipient and transplant characteristics of AIT. Kaplan-Meier analysis was used to compare survival of AIT with that of all other compatible kidney transplants performed in the United Kingdom. Cox proportional hazards regression modeling was used to determine which pretransplant factors were associated with transplant survival in HLAi and ABOi separately. The primary outcome was transplant survival, taking account of death and graft failure. Results: For 522 HLAi and 357 ABO incompatible (ABOi) transplants, 5-year transplant survival rates were 71% (95% confidence interval [CI], 66-75%) for HLAi and 83% (95% CI, 78-87%) for ABOi, compared with 88% (95% CI, 87-89%) for 7290 standard living donor transplants, and 78% (95% CI, 77-79%) for 15,322 standard deceased donor transplants (P < 0.0001). Increased chance of transplant loss in HLAi was associated with increasing number of donor-specific HLA antibodies, center performing the transplant, antibody level at the time of transplant, and an interaction between donor age and dialysis status. In ABOi, transplant loss was associated with no use of IVIg, cytomegalovirus-seronegative recipient, 000 HLA donor-recipient mismatch, and increasing recipient age. Conclusions: Results of AIT were acceptable, certainly in the context of a choice between living donor AIT and an antibody-compatible deceased donor transplant. Several factors were associated with increased chance of transplant loss, and these can lead to testable hypotheses for further improving therapy. PMID:28706984
Pinheiro, Paulo S; Morris, Cyllene R; Liu, Lihua; Bungum, Timothy J; Altekruse, Sean F
2014-11-01
The accuracy of cancer survival statistics relies on the quality of death linkages and follow-up information collected by population-based cancer registries. Methodological issues in survival data by race-ethnicity in the United States, in particular for Hispanics and Asians, have not been well studied and may undermine our understanding of survival disparities. Based on Surveillance, Epidemiology, and End Results (SEER)-18 data, we analyzed existing biases in survival statistics when comparing the four largest racial-ethnic groups in the United States: whites, blacks, Hispanics, and Asians. We compared the "reported alive" method for calculation of survival, which is appropriate when date of last alive contact is available for all cases, with the "presumed alive" method used when dates of last contact are unavailable. Cox regression was applied to calculate the likelihood of incomplete follow-up (those with less than 5 years of vital status information) according to racial-ethnic group and stage at diagnosis. Finally, potentially missed deaths were estimated based on the numbers of cases with incomplete follow-up for highly fatal cancers. The presumed alive method overestimated survival compared with the reported alive method by as much as 0.9-6.2 percentage points, depending on the cancer site, among Hispanics and by 0.4-2.7 percentage points among Asians. In SEER data, Hispanics and Asians are more likely to have incomplete follow-up than whites or blacks. The assumption of random censoring across race-ethnicity is not met, as among non-white cases, those who have a worse prognosis are more likely to have incomplete follow-up than those with a better prognosis (P < .05). Moreover, death ascertainment is not equal across racial-ethnic groups. Overall, 3% of cancer deaths were missed among Hispanics and Asians compared with less than 0.5% among blacks and whites. Cancer survival studies involving Hispanics and Asians should be interpreted with caution because the currently available data inflate survival in these populations. Censoring is clearly nonrandom across race-ethnicity, meaning that findings of Hispanic and Asian survival advantages may be biased. Problematic death linkages among Hispanics and Asians contribute to missing deaths and overestimated survival. More complete follow-up with at least 5 years of information on vital status, as well as improved death linkages, will decisively increase the validity of survival estimates for these growing populations. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
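To make the "reported alive" versus "presumed alive" contrast concrete, the sketch below applies both censoring conventions to a tiny made-up registry; the data and the simple product-limit function are illustrative assumptions, not the SEER-18 analysis itself. Censoring at the last known contact ("reported alive") differs from carrying every case without a linked death forward to the study end date ("presumed alive"), and the latter tends to push the estimate upward when deaths are missed.

```python
import numpy as np

def km_survival(times, events, horizon):
    """Kaplan-Meier (product-limit) survival probability at `horizon`."""
    times, events = np.asarray(times, float), np.asarray(events, int)
    surv = 1.0
    for t in np.unique(times[events == 1]):
        if t > horizon:
            break
        at_risk = np.sum(times >= t)
        deaths = np.sum((times == t) & (events == 1))
        surv *= 1 - deaths / at_risk
    return surv

# Hypothetical registry: years to death (event=1) or to last known contact (event=0).
last_contact = [0.5, 1.2, 2.0, 3.1, 4.0, 4.5, 5.0, 5.0]
event        = [1,   0,   1,   0,   0,   1,   0,   0]
study_end    = 5.0

# "Reported alive": censor each case at its date of last contact.
reported = km_survival(last_contact, event, horizon=study_end)

# "Presumed alive": every case without a linked death is carried forward to study end,
# which keeps such cases in later risk sets and raises the survival estimate.
presumed_times = [t if e == 1 else study_end for t, e in zip(last_contact, event)]
presumed = km_survival(presumed_times, event, horizon=study_end)

print(f"reported-alive 5-year survival: {reported:.2f}")
print(f"presumed-alive 5-year survival: {presumed:.2f}")
```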
Liver transplantation for HCV cirrhosis at Karolinska University Hospital Huddinge, Stockholm.
Gjertsen, H; Weiland, O; Oksanen, A; Söderdahl, G; Broomé, U; Ericzon, B-G
2006-10-01
Hepatitis C virus (HCV)-induced cirrhosis is the major indication for liver transplantation globally, and an increasing indication for liver transplantation in Sweden. We retrospectively examined the 120 patients transplanted for HCV cirrhosis from 1987 through 2005, including 11 who received more than one graft. The 1-, 3-, and 5-year postoperative survival rates for all patients transplanted for HCV with or without hepatocellular cancer (HCC) were 77%, 66%, and 53%, respectively. HCV patients without HCC had 1-, 3-, and 5-year survival rates of 78%, 73%, and 61%, compared with 84%, 79%, and 74%, respectively, for patients transplanted for chronic liver diseases without cancer or HCV. The number of patients with HCV cirrhosis transplanted in our center is increasing. Results among patients with HCV cirrhosis were inferior to those among patients transplanted for other chronic liver diseases.
Metabolic and Kidney Diseases in the Setting of Climate Change, Water Shortage, and Survival Factors
Johnson, Richard J; Stenvinkel, Peter; Jensen, Thomas; Lanaspa, Miguel A; Roncal, Carlos; Song, Zhilin; Bankir, Lise; Sánchez-Lozada, Laura G
2016-08-01
Climate change (global warming) is leading to an increase in heat extremes and coupled with increasing water shortage, provides a perfect storm for a new era of environmental crises and potentially, new diseases. We use a comparative physiologic approach to show that one of the primary mechanisms by which animals protect themselves against water shortage is to increase fat mass as a means for providing metabolic water. Strong evidence suggests that certain hormones (vasopressin), foods (fructose), and metabolic products (uric acid) function as survival signals to help reduce water loss and store fat (which also provides a source of metabolic water). These mechanisms are intricately linked with each other and stimulated by dehydration and hyperosmolarity. Although these mechanisms were protective in the setting of low sugar and low salt intake in our past, today, the combination of diets high in fructose and salty foods, increasing temperatures, and decreasing available water places these survival signals in overdrive and may be accelerating the obesity and diabetes epidemics. The recent discovery of multiple epidemics of CKD occurring in agricultural workers in hot and humid environments may represent harbingers of the detrimental consequences of the combination of climate change and overactivation of survival pathways. Copyright © 2016 by the American Society of Nephrology.
Survival of the House Fly (Diptera: Muscidae) on Truvia and Other Sweeteners.
Fisher, Michael L; Fowler, Fallon E; Denning, Steven S; Watson, David W
2017-07-01
The house fly, Musca domestica L. (Diptera: Muscidae), is a disease vector of mechanically transmitted pathogens including bacteria, viruses, and protozoans. Opportunities for pathogen transmission can increase as fly longevity increases. Dietary preferences play an important role in insect longevity; therefore, we investigated house fly preferences, sucrose availability, and caloric constraints on house fly longevity. Experimental goals were: 1) to test the effects of calorie restriction on survival of house flies by manipulating concentrations of erythritol (low caloric content) and sucrose (high caloric content), and comparing commercial sweeteners of differing calorie content, 2) to identify house fly preferences for either erythritol or sucrose, and 3) to evaluate the insecticidal activity or toxicity of erythritol on house flies. Our data show that house flies may prefer high calorie options when given a choice and that house fly longevity likely increases as calorie content increases. Additionally, no significant differences in longevity were observed between the water only control (zero calories) and erythritol treatments. This suggests that decreased survival rates and death could be the result of starvation rather than insecticidal activity. This research furthers our understanding of house fly survival and sugar-feeding behavior. © The Authors 2017. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Smits, Jacqueline M A; Vanhaecke, Johan; Haverich, Axel; de Vries, Erwin; Smith, Mike; Rutgrink, Ellis; Ramsoebhag, Annemarie; Hop, Alinde; Persijn, Guido; Laufer, Gunther
2003-01-01
The definition of proper patient selection criteria remains a prominent item in constant need of attention. While the concept of gathering evidence in order to determine practice remains ambiguous, it cannot be emphasized enough that these univariate results are only a first foray into analysing predictors of survival; all of the following results should be interpreted in this perspective. HEART TRANSPLANT SURVIVAL: The 3-year survival rate for heart transplant recipients under age 16 was 83% versus 72% for adult recipients. Acutely retransplanted adult heart recipients had a 3-year survival rate of 36% compared with 72% for recipients of a first heart allograft. Patients suffering from DCM had the best survival rate at 3 years (74%) compared with patients suffering from CAD (70%) or from another end-stage heart disease (67%). With advancing age of the adult recipient, the mortality risk increased. Patients aged 16-40 had a 3-year survival rate of 77%, compared with 74%, 70% and 61% for transplant recipients aged 41-55, 56-65 and over age 65, respectively. The 3-year survival rates for adult recipients transplanted with a heart allograft from a donor aged under 16 or 16-44 were 78% and 74%, compared with 66% and 63% for donors aged 45-55 and over 55, respectively. The 3-year survival rates for recipients of hearts with cold ischemic times under 2 hours, 2-3, 3-4, 4-5, 5-6 and more than 6 hours were 74%, 75%, 70%, 65%, 54% and 40%, respectively. Transplanting a female donor heart into a male recipient was associated with the worst prognosis: the 3-year survival rates were 73%, 71%, 66% and 76% for the donor/recipient groups male/male, male/female, female/male and female/female, respectively. When the donor-to-recipient body weight ratio was below 0.8, the 3-year survival rate was 64%, compared to 72% for weight-matched pairs and 74% for patients who received a heart from an oversized donor (p=0.004). Better survival rates were obtained for better HLA-matched transplants: the 3-year survival rates were 75%, 89%, 78%, 78%, 69%, 72%, and 71% for the HLA-A, -B, -DR 0, 1, 2, 3, 4, 5 and 6 mismatch groups, respectively (p=0.04). Survival was significantly associated with the CMV serologic status of the donor and recipient; the 3-year survival rates were: D+/R+, 71%; D+/R-, 69%; D-/R-, 76%; and D-/R+, 76% (p=0.04). Patients in an ICU had a 3-year survival rate of 62%, compared to 72% for patients in a general ward and 74% for outpatients (p<0.0001). Patients who were on a VAD and subsequently transplanted had a 3-year survival rate of 65%, compared to 73% for patients without a VAD (p=0.004). Being on a ventilator was a major risk factor for death after transplantation; patients on ventilator support at the time of the transplant had a 3-year survival rate of 52% compared to 73% for the other patients (p<0.0001). LUNG TRANSPLANT SURVIVAL: The 3-year survival rate for children (73%) appeared to be better than the adult rate (61%; p=0.8). Adult lung transplant survival was significantly worse in the case of a repeat lung transplant; a 3-year retransplant survival rate of 42% was obtained compared with 61% for first transplants (p=0.049). With respect to the underlying end-stage lung disease, no statistically significant difference in long-term survival could be detected in this cohort. The 3-year survival rates were: 62% for COPD/Emphysema, 70% for CF, 58% for IPF, 64% for Alpha-1 ATD and 56% for PPH (p=0.2).
Our data demonstrated no effect of the recipient's age on long-term lung transplant survival, except for 2 senior patients in this cohort. At 3 years, the survival rates for recipients aged 16-40, 41-55 and 56-65 were 65%, 60% and 62%, respectively (p=0.05). The 3-year survival rates for transplants performed with lungs from donors aged under 16, 16-44, 45-55 and over 55 were 57%, 64%, 55% and 62%, respectively (p=0.1). No association between the duration of cold ischemic time and 3-year survival was observed; under 3 hours, 3-4, 4-5, 5-6 and over 6 hours of ischemia resulted in 3-year survival rates of 53%, 59%, 64%, 68% and 57%, respectively (p=0.2). Early posttransplant outcome tended to be better for gender-matched transplants, while transplanting a female donor lung into a male recipient was associated with the worst prognosis. The 3-year survival rates were 65% for male/male, 63% for male/female, 48% for female/male and 61% for female/female (p=0.009). No effect of donor-to-recipient weight match was observed in this Eurotransplant cohort; when the donor-to-recipient weight ratio was below 0.8, the 3-year survival rate was 57%, compared with 59% for weight-matched pairs and 64% for patients who received a lung from an oversized donor (p=0.5). Long-term survival after lung transplantation was influenced by HLA matching. The 3-year survival rates were 100%, 68%, 70%, 65%, 54% and 55% for the HLA-A, -B, -DR 1, 2, 3, 4, 5 and 6 mismatch groups, respectively (p=0.06). A donor CMV+ and recipient CMV- match was a risk factor for long-term mortality, with 3-year survival rates of 56% for D+/R+, 55% for D+/R-, 71% for D-/R- and 62% for D-/R+ transplants (p=0.046). En-bloc transplantation of both lungs yielded worse early results, but the 3-year survival rates for patients who underwent single (60%), bilateral sequential double lung (63%) and en-bloc double lung transplantation (56%) were not different (p=0.2). Ventilator dependency was associated with a significantly reduced survival at 3 years: patients on ventilator support at the time of the transplant had a 3-year survival rate of 48% compared with 63% for other patients (p=0.006).
Hurthle cell carcinoma: an update on survival over the last 35 years.
Nagar, Sapna; Aschebrook-Kilfoy, Briseis; Kaplan, Edwin L; Angelos, Peter; Grogan, Raymon H
2013-12-01
Hurthle cell carcinoma (HCC) of the thyroid is a variant of follicular cell carcinoma (FCC). A low incidence and lack of long-term follow-up data have caused controversy regarding the survival characteristics of HCC. We aimed to clarify this controversy by analyzing HCC survival over a 35-year period using the Surveillance, Epidemiology, and End Results (SEER) database. Cases of HCC and FCC were extracted from the SEER-9 database (1975-2009). Five- and 10-year survival rates were calculated. We compared changes in survival over time by grouping cases into 5-year intervals. We identified 1,416 cases of HCC and 4,973 cases of FCC. For cases diagnosed from 1975 to 1979, HCC showed worse survival than FCC (5-year survival, 75% [95% confidence interval (CI), 60.2-85] vs 88.7% [95% CI, 86-90.8]; 10-year survival, 66.7% [95% CI, 51.5-78.1] vs 79.7% [95% CI, 76.5-82.6]). For cases diagnosed from 2000 to 2004, we found no difference in 5-year survival between HCC and FCC (91.1% [95% CI, 87.6-93.7] vs 89.1% [95% CI, 86.5-91.2]). For cases diagnosed from 1995 to 1999, there was no difference in 10-year survival between HCC and FCC (80.9% [95% CI, 75.6-85.2] vs 83.9% [95% CI, 80.8-86.6]). HCC survival improved over the study period while FCC survival rates remained stable (increase in survival at 5 years, 21.7% vs 0.4%; at 10 years, 21.3% vs 5.2%). Improvement in HCC survival was observed for both sexes, for age ≥45 years, for local and regional disease, for tumors >4 cm, and for white race. HCC survival has improved dramatically over time such that HCC and FCC survival rates are now the same. These findings explain how studies over the last 4 decades have shown conflicting results regarding HCC survival; however, our data do not explain why HCC survival has improved. Copyright © 2013 Mosby, Inc. All rights reserved.
Reinholz, Monica M; Zinnen, Shawn P; Dueck, Amylou C; Dingli, David; Reinholz, Gregory G; Jonart, Leslie A; Kitzmann, Kathleen A; Bruzek, Amy K; Negron, Vivian; Abdalla, Abdalla K; Arendt, Bonnie K; Croatt, Anthony J; Sanchez-Perez, Luis; Sebesta, David P; Lönnberg, Harri; Yoneda, Toshiyuki; Nath, Karl A; Jelinek, Diane F; Russell, Stephen J; Ingle, James N; Spelsberg, Thomas C; Dixon, Henry B F Hal; Karpeisky, Alexander; Lingle, Wilma L
2010-07-01
Despite palliative treatments, tumor-induced bone disease (TIBD) remains highly debilitating for many cancer patients and progression typically results in death within two years. Therefore, more effective therapies with enhanced anti-resorptive and cytotoxic characteristics are needed. We developed bisphosphonate-chemotherapeutic conjugates designed to bind bone and hydrolyze, releasing both compounds, thereby targeting both osteoclasts and tumor cells. This study examined the effects of our lead compound, MBC-11 (the anhydride formed between arabinocytidine (AraC)-5'-phosphate and etidronate), on bone tumor burden, bone volume, femur bone mineral density (BMD), and overall survival using two distinct mouse models of TIBD, the 4T1/luc breast cancer and the KAS-6/1-MIP1alpha multiple myeloma models. In mice orthotopically inoculated with 4T1/luc mouse mammary cells, MBC-11 (0.04 microg/day; s.c.) reduced the incidence of bone metastases to 40% (4/10), compared to 90% (9/10; p=0.057) and 100% (5/5; p=0.04) of PBS- or similarly-dosed, zoledronate-treated mice, respectively. MBC-11 also significantly decreased bone tumor burden compared to PBS- or zoledronate-treated mice (p=0.021, p=0.017, respectively). MBC-11 and zoledronate (0.04 microg/day) significantly increased bone volume by two- and four-fold, respectively, compared to PBS-treated mice (p=0.005, p<0.001, respectively). In mice systemically injected with human multiple myeloma KAS-6/1-MIP1alpha cells, 0.04 and 4.0 microg/day MBC-11 improved femur BMD by 13% and 16%, respectively, compared to PBS (p=0.025, p=0.017, respectively) at 10 weeks post-tumor cell injection and increased mean survival to 95 days compared to 77 days in mice treated with PBS (p=0.047). Similar doses of zoledronate also improved femur BMD (p< or =0.01 vs PBS) and increased mean survival to 86 days, but this was not significantly different than in PBS-treated mice (p=0.53). These results demonstrate that MBC-11 decreases bone tumor burden, maintains bone structure, and may increase overall survival, warranting further investigation as a treatment for TIBD. 2010 Elsevier Inc. All rights reserved.
Metamorphosis of two amphibian species after chronic cadmium exposure in outdoor aquatic mesocosms
James, S.M.; Little, E.E.; Semlitsch, R.D.
2005-01-01
Amphibian larvae at contaminated sites may experience an alteration of metamorphic traits and survival compared to amphibians in uncontaminated conditions. Effects of chronic cadmium (Cd) exposure on the metamorphosis of American toads (Bufo americanus) and southern leopard frogs (Rana sphenocephala) were determined. The two species were reared separately from shortly after hatching through metamorphosis in outdoor mesocosms (1,325-L polyethylene cattle tanks) that simulated natural ponds and enhanced environmental realism relative to the laboratory. Both species exhibited a decrease in survival with increasing initial nominal aqueous Cd concentration. Cadmium treatment did not influence mass at metamorphosis for either species when survival was included as a covariate, but increased the age at metamorphosis for the American toads. The whole body Cd content of metamorphs increased with aqueous Cd treatment level for both species, and the American toads tended to possess more elevated residues. Cadmium quickly partitioned out of the water column and accumulated in and altered the abundance of the tadpoles' diet. Cadmium-contaminated sites may produce fewer metamorphs, and those that survive will metamorphose later and contain Cd. Interspecific differences in the response variables illustrate the importance of testing multiple species when assessing risk.
Differential parental care by adult Mountain Plovers, Charadrius montanus
Dinsmore, S.J.; Knopf, F.L.
2005-01-01
We studied chick survival of the Mountain Plover (Charadrius montanus) in Montana and found that chicks tended by females had higher survival rates than chicks tended by males, and that chick survival generally increased during the nesting season. Differences in chick survival were most pronounced early in the nesting season, and may be related to a larger sample of nests during this period. When compared to information about the nest survival of male- and female-tended plover nests, our chick data suggest a trade-off for adult plovers between the egg and chick phases of reproduction. Because Mountain Plover pairs have clutches at two nests at two different locations and show differential success between the sexes during the egg and chick phases, we offer that the Mountain Plover breeding system favours optimizing annual recruitment in a dynamic ecologic setting driven by annually unpredictable drought, grazing, and predation pressures.
Survival Analysis of Patients with End Stage Renal Disease
NASA Astrophysics Data System (ADS)
Urrutia, J. D.; Gayo, W. S.; Bautista, L. A.; Baccay, E. B.
2015-06-01
This paper provides a survival analysis of End Stage Renal Disease (ESRD) using Kaplan-Meier estimates and the Weibull distribution. The data were obtained from the records of V. L. Makabali Memorial Hospital with respect to time t (patient's age), covariates such as developed secondary disease (pulmonary congestion and cardiovascular disease), gender, and the event of interest: the death of ESRD patients. Survival and hazard rates were estimated using NCSS for the Weibull distribution and SPSS for the Kaplan-Meier estimates. Both approaches lead to the same conclusion: the hazard rate increases and the survival rate decreases over time for ESRD patients diagnosed with pulmonary congestion, cardiovascular disease, or both diseases. The analysis also shows that female patients have a greater risk of death compared to males. The probability of risk was given by the equation R = 1 - e^(-H(t)), where e^(-H(t)) is the survival function and H(t) is the cumulative hazard function, which was estimated using Cox regression.
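The closing formula can be made concrete with a short sketch: a Nelson-Aalen estimate of the cumulative hazard H(t) on toy follow-up data (not the hospital records used in the paper), from which the survival function S(t) = e^(-H(t)) and the risk R(t) = 1 - e^(-H(t)) follow directly.

```python
import numpy as np

def nelson_aalen(times, events):
    """Nelson-Aalen estimate of the cumulative hazard H(t).

    times  : follow-up time for each patient
    events : 1 if the patient died, 0 if censored
    """
    order = np.argsort(times)
    times, events = np.asarray(times)[order], np.asarray(events)[order]
    estimate, h_cum = [], 0.0
    for t in np.unique(times[events == 1]):
        at_risk = np.sum(times >= t)                  # patients still under observation
        deaths = np.sum((times == t) & (events == 1))
        h_cum += deaths / at_risk                     # hazard increment at time t
        estimate.append((t, h_cum))
    return estimate

# Toy follow-up data (years) -- illustrative only.
t = [1.0, 2.5, 2.5, 3.0, 4.2, 5.0, 6.1]
d = [1,   1,   0,   1,   0,   1,   0]

for time, H_t in nelson_aalen(t, d):
    surv = np.exp(-H_t)   # survival function S(t) = e^(-H(t))
    risk = 1 - surv       # probability of the event by time t: R = 1 - e^(-H(t))
    print(f"t={time:>4}: H(t)={H_t:.3f}  S(t)={surv:.3f}  R(t)={risk:.3f}")
```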
Nutritional status and survival of maintenance hemodialysis patients receiving lanthanum carbonate.
Komaba, Hirotaka; Kakuta, Takatoshi; Wada, Takehiko; Hida, Miho; Suga, Takao; Fukagawa, Masafumi
2018-04-16
Hyperphosphatemia and poor nutritional status are associated with increased mortality. Lanthanum carbonate is an effective, calcium-free phosphate binder, but little is known about the long-term impact on mineral metabolism, nutritional status and survival. We extended the follow-up period of a historical cohort of 2292 maintenance hemodialysis patients that was formed in late 2008. We examined 7-year all-cause mortality according to the serum phosphate levels and nutritional indicators in the entire cohort and then compared the mortality rate of the 562 patients who initiated lanthanum with that of the 562 propensity score-matched patients who were not treated with lanthanum. During a mean ± SD follow-up of 4.9 ± 2.3 years, 679 patients died in the entire cohort. Higher serum phosphorus levels and lower nutritional indicators (body mass index, albumin and creatinine) were each independently associated with an increased risk of death. In the propensity score-matched analysis, patients who initiated lanthanum had a 23% lower risk for mortality compared with the matched controls. During the follow-up period, the serum phosphorus levels tended to decrease comparably in both groups, but the lanthanum group maintained a better nutritional status than the control group. The survival benefit associated with lanthanum was unchanged after adjustment for time-varying phosphorus or other mineral metabolism parameters, but was attenuated by adjustments for time-varying indicators of nutritional status. Treatment with lanthanum is associated with improved survival in hemodialysis patients. This effect may be partially mediated by relaxation of dietary phosphate restriction and improved nutritional status.
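The propensity-score-matched comparison described above can be sketched as follows; the covariates, cohort size, and greedy 1:1 nearest-neighbour matching are illustrative assumptions, not the authors' actual matching procedure.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical cohort: three standardized covariates (e.g. age, albumin, phosphorus)
# and a treatment flag (lanthanum initiated or not). Data are made up for illustration.
n = 2000
X = rng.normal(size=(n, 3))
treated = rng.random(n) < 1 / (1 + np.exp(-X[:, 0]))   # treatment depends on covariates

# 1. Propensity score: probability of initiating lanthanum given the covariates.
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# 2. Greedy 1:1 nearest-neighbour matching on the propensity score, without replacement.
treated_idx = np.where(treated)[0]
available = set(np.where(~treated)[0])
matches = []
for i in treated_idx:
    if not available:
        break
    pool = np.fromiter(available, dtype=int)
    j = pool[np.argmin(np.abs(ps[pool] - ps[i]))]
    matches.append((i, j))
    available.remove(j)

print(f"{len(matches)} matched pairs formed; survival in the two matched groups "
      "would then be compared, e.g. with a Cox model.")
```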
Neoadjuvant treatments for locally advanced, resectable esophageal cancer: A network meta-analysis.
Chan, Kelvin K W; Saluja, Ronak; Delos Santos, Keemo; Lien, Kelly; Shah, Keya; Cramarossa, Gemma; Zhu, Xiaofu; Wong, Rebecca K S
2018-02-14
The relative survival benefits and postoperative mortality among the different types of neoadjuvant treatments (such as chemotherapy only, radiotherapy only or chemoradiotherapy) for esophageal cancer patients are not well established. To evaluate the relative efficacy and safety of neoadjuvant therapies in resectable esophageal cancer, a Bayesian network meta-analysis was performed. MEDLINE, EMBASE and the Cochrane Central Register of Controlled Trials were searched for publications up to May 2016. ASCO and ASTRO annual meeting abstracts were also searched up to the 2015 conferences. Randomized controlled trials that compared at least two of the following treatments for resectable esophageal cancer were included: surgery alone, surgery preceded by neoadjuvant chemotherapy, neoadjuvant radiotherapy or neoadjuvant chemoradiotherapy. The primary outcome assessed from the trials was overall survival. Thirty-one randomized controlled trials involving 5496 patients were included in the quantitative analysis. The network meta-analysis showed that neoadjuvant chemoradiotherapy improved overall survival when compared to all other treatments, including surgery alone (HR 0.75, 95% CR 0.67-0.85), neoadjuvant chemotherapy (HR 0.83, 95% CR 0.70-0.96) and neoadjuvant radiotherapy (HR 0.82, 95% CR 0.67-0.99). However, the risk of postoperative mortality increased when comparing neoadjuvant chemoradiotherapy to either surgery alone (RR 1.46, 95% CR 1.00-2.14) or neoadjuvant chemotherapy (RR 1.58, 95% CR 1.00-2.49). In conclusion, neoadjuvant chemoradiotherapy improves overall survival but may also increase the risk of postoperative mortality in patients with locally advanced, resectable esophageal carcinoma. © 2018 UICC.
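As a rough illustration of how the reported hazard ratios relate to one another on the multiplicative scale (a simple Bucher-style indirect comparison, not the authors' Bayesian network meta-analysis), the chemoradiotherapy-versus-surgery and chemoradiotherapy-versus-chemotherapy estimates imply a chemotherapy-versus-surgery contrast of roughly 0.90:

```python
# Illustrative indirect comparison on the hazard-ratio scale; point estimates only,
# ignoring the credible intervals and the full Bayesian model used in the study.
hr_crt_vs_surgery = 0.75   # neoadjuvant chemoradiotherapy vs surgery alone
hr_crt_vs_chemo = 0.83     # neoadjuvant chemoradiotherapy vs neoadjuvant chemotherapy

hr_chemo_vs_surgery = hr_crt_vs_surgery / hr_crt_vs_chemo
print(f"implied HR, neoadjuvant chemotherapy vs surgery alone: {hr_chemo_vs_surgery:.2f}")
```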
Bergenfelz, Anders; Bladström, Anna; Their, Mark; Nordenström, Erik; Valdemarsson, Stig; Westerdahl, Johan
2007-07-01
Primary hyperparathyroidism (pHPT) is associated with an increased mortality attributable to cardiovascular disease (CVD), suggested to be alleviated by surgery. The exact mechanism of the beneficial influence of parathyroidectomy on survival is unknown. Furthermore, studies from recent years suggest that there is no increased mortality compared with the mortality rate in the general population. This study therefore investigated relative survival (RS), as well as overall mortality associated with clinical and biochemical variables, in patients undergoing operation for sporadic pHPT. Furthermore, the influence of surgery on biochemical variables associated with pHPT was analyzed. A group of 323 patients with sporadic pHPT who underwent surgery between September 1989 and July 2003 was followed from surgery over a 10-year period. The median and mean follow-up times were 69 and 70 months, respectively (range: 1-120 months). RS was calculated, and the impact of clinical and biochemical variables on overall death was evaluated. Postoperatively, serum levels of triglycerides and uric acid decreased. Glucose levels and glomerular filtration rate remained unchanged. A decreased RS was evident during the latter part of the 10-year follow-up period. In the multivariate Cox analysis, diabetes mellitus (hazard ratio [HR] = 2.8, 95% confidence interval [CI] 1.2-6.7) and the combination of an increased level of serum uric acid and CVD (HR = 8.6, 95% CI 1.5-49.7) were associated with higher mortality. The increased risk of death was evident for patients with persistently increased levels of uric acid postoperatively (HR = 4.8, 95% CI 1.4-16.01). Patients undergoing operation for pHPT had a decreased RS during a 10-year follow-up compared to the general population. This decrease in RS was associated with diabetes mellitus and increased levels of uric acid pre- and postoperatively.
Côté, Steeve D; Festa-Bianchet, Marco
2001-04-01
In temperate environments, early-born ungulates may enjoy a longer growth period before winter, and so attain a higher body mass and an increased probability of survival compared to late-born ones. We assessed the effects of maternal characteristics, forage quality and population density on kid birthdate, mass and survival in a population of marked mountain goats (Oreamnos americanus) in Alberta. The duration and timing of the birth season were similar in all years. Births were highly synchronised: 80% of kids were born within 2 weeks of the first birth. Maternal age, maternal social rank and density did not affect kid birthdate or mass. Previous breeding experience was not related to kid birthdate, but kids born to pluriparous mothers were heavier during summer than kids born to primiparous mothers. Male and female kids had similar mass and accumulated mass linearly during summer. Early-born kids were heavier than late-born kids. Faecal crude protein (FCP) in late spring and maternal mass were positively related to kid mass. Survival to weaning appeared higher for males (90%) than for females (78%), but survival to 1 year was 65% for both sexes. FCP in late spring, density, birthdate and mass did not affect kid survival to weaning in either sex. Survival to 1 year increased with FCP in late spring for females, but not for males. Survival to 1 year was independent of birthdate for both sexes, but heavy females survived better than light ones. Multiple logistic regression revealed a positive effect of mass on survival to 1 year when the sexes were pooled. Our results suggest that mountain goats are constrained to give birth in a short birth season synchronised with forage productivity.
Ethnic differences in breast cancer in Hawai'i: age, stage, hormone receptor status, and survival.
Braun, Kathryn L; Fong, Megan; Gotay, Carolyn C; Chong, Clayton D K
2004-09-01
Previous examinations of breast cancer and survival in Hawai'i's 5 major ethnic groups have found that Native Hawaiian women have the highest breast cancer mortality rates. Although ethnic disparities in survival are reduced when age and stage at diagnosis are controlled for statistically, prior studies could not explain ethnic variation in survival among women who were diagnosed at the same stage. We examined variations in breast tumor characteristics for a multiethnic sample of 4,583 women diagnosed in 1990-1997 by stage and age group and extended previous multivariate analyses by adding a new prognostic variable: estrogen receptor (ER) and progesterone receptor (PR) status. Logistic regression was used to examine the influence of age, stage, and hormone status on 5-year survival. With a few exceptions, greater proportions of Native Hawaiian women were diagnosed both in later stages of disease and at earlier ages compared to women of other ethnicities, and smaller proportions of Native Hawaiians survived 5 years post diagnosis in each stage and age group. Surprisingly, greater proportions of Native Hawaiian women in all age groups had ER/PR positive tumors, which is a prognostic indicator for better, not worse, survival. Native Hawaiian women had an increased risk of death and Japanese women had an increased chance of survival after controlling for age, stage, and ER/PR status. Future studies should examine other reasons for better survival of Japanese women and worse survival of Native Hawaiian women, including socioeconomic status, access to health insurance, adequacy of recommended screening frequency, co-morbid conditions, treatment appropriateness and compliance, and genetic markers of tumor aggressiveness.
Renal Salvage with Renal Artery Stenting Improves Long-term Survival.
Modrall, J Gregory; Trimmer, Clayton; Tsai, Shirling; Kirkwood, Melissa L; Ali, Mujtaba; Rectenwald, John E; Timaran, Carlos H; Rosero, Eric B
2017-11-01
The Cardiovascular Outcomes in Renal Atherosclerotic Lesions (CORAL) Trial cast doubt on the benefits of renal artery stenting (RAS). However, the outcomes for patients with chronic kidney disease (CKD) were not analyzed separately in the CORAL Trial. We hypothesized that patients who experienced a significant improvement in renal function after RAS would have improved long-term survival, compared with patients whose renal function was not improved by stenting. This single-center retrospective study included 60 patients with stage 3 or worse CKD and renal artery occlusive disease who were treated with RAS for renal salvage. Patients were categorized as "responders" or "nonresponders" based on postoperative changes in estimated glomerular filtration rate (eGFR) after RAS. "Responders" were those patients with an improvement of at least 20% in eGFR over baseline; all others were categorized as "nonresponders." Survival was analyzed using the Kaplan-Meier method. Cox proportional hazards regression was used to identify predictors of long-term survival. The median age of the cohort was 66 years (interquartile range [IQR], 60-73). Median preoperative eGFR was 34 mL/min/1.73 m2 (IQR, 24-45). At late follow-up (median, 35 months; IQR, 22-97 months), 16 of 60 patients (26.7%) were categorized as "responders" with a median increase in postoperative eGFR of 40% (IQR, 21-67). Long-term survival was superior for responders, compared with nonresponders (P = 0.046 by log-rank test). Cox proportional hazards regression identified improved renal function after RAS as the only significant predictor of increased long-term survival (hazard ratio = 0.235, 95% confidence interval = 0.075-0.733; P = 0.0126 for improved versus worsened renal function after RAS). Successful salvage of renal function by RAS is associated with improved long-term survival. These data provide an important counterargument to the prior negative clinical trials that found no benefit to RAS. Published by Elsevier Inc.
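A minimal sketch of the responder definition used above (the ≥20% eGFR improvement threshold); the patient values are hypothetical, not study data.

```python
# Classify hypothetical patients as "responders" using the >=20% eGFR improvement rule.
# eGFR in mL/min/1.73 m2; values are illustrative only.
patients = [
    {"id": 1, "egfr_pre": 34, "egfr_post": 48},
    {"id": 2, "egfr_pre": 28, "egfr_post": 31},
    {"id": 3, "egfr_pre": 41, "egfr_post": 39},
]

for p in patients:
    change = (p["egfr_post"] - p["egfr_pre"]) / p["egfr_pre"]
    label = "responder" if change >= 0.20 else "nonresponder"
    print(f"patient {p['id']}: eGFR change {change:+.0%} -> {label}")
```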
Singal, Amit G; Mittal, Sahil; Yerokun, Olutola A; Ahn, Chul; Marrero, Jorge A; Yopp, Adam C; Parikh, Neehar D; Scaglione, Steve J
2017-09-01
Professional societies recommend hepatocellular carcinoma screening in patients with cirrhosis, but high-quality data evaluating its effectiveness to improve early tumor detection and survival in "real world" clinical practice are needed. We aim to characterize the association between hepatocellular carcinoma screening and early tumor detection, curative treatment, and overall survival among patients with cirrhosis. We performed a retrospective cohort study of patients diagnosed with hepatocellular carcinoma between June 2012 and May 2013 at 4 health systems in the US. Patients were categorized in the screening group if hepatocellular carcinoma was detected by imaging performed for screening purposes. Generalized linear models and multivariate Cox regression with frailty adjustment were used to compare early detection, curative treatment, and survival between screen-detected and non-screen-detected patients. Among 374 hepatocellular carcinoma patients, 42% (n = 157) were detected by screening. Screen-detected patients had a significantly higher proportion of early tumors (Barcelona Clinic Liver Cancer stage A 63.1% vs 36.4%, P <.001) and were more likely to undergo curative treatment (31% vs 13%, P = .02). Hepatocellular carcinoma screening was significantly associated with improved survival in multivariate analysis (hazards ratio 0.41; 95% confidence interval, 0.26-0.65) after adjusting for patient demographics, Child-Pugh class, and performance status. Median survival of screen-detected patients was 14.6 months, compared with 6.0 months for non-screen-detected patients, with the difference remaining significant after adjusting for lead-time bias (hazards ratio 0.59, 95% confidence interval, 0.37-0.93). Hepatocellular carcinoma screening is associated with increased early tumor detection and improved survival; however, a minority of hepatocellular carcinoma patients are detected by screening. Interventions to increase screening use in patients with cirrhosis may help curb hepatocellular carcinoma mortality rates. Copyright © 2017 Elsevier Inc. All rights reserved.
Dominguez, Jessica A; Samocha, Alexandr J; Liang, Zhe; Burd, Eileen M; Farris, Alton B; Coopersmith, Craig M
2013-10-01
Nuclear factor-κB (NF-κB) is a critical regulator of cell-survival genes and the host inflammatory response. The purpose of this study was to investigate the role of enterocyte-specific NF-κB in sepsis through selective ablation of IκB kinase. Prospective, randomized controlled study. Animal laboratories in university medical centers. Mice lacking functional NF-κB in their intestinal epithelium (Vil-Cre/Ikkβ) and wild-type mice were subjected to sham laparotomy or cecal ligation and puncture. Animals were killed at 24 hours or followed 7 days for survival. Septic wild-type mice had decreased villus length compared with sham mice, whereas villus atrophy was further exacerbated in septic Vil-Cre/Ikkβ mice. Sepsis induced an increase in intestinal epithelial apoptosis compared with sham mice, which was further exacerbated in Vil-Cre/Ikkβ mice. Sepsis induced intestinal hyperpermeability in wild-type mice compared with sham mice, which was further exacerbated in septic Vil-Cre/Ikkβ mice. This was associated with increased intestinal expression of claudin-2 in septic wild-type mice, which was further increased in septic Vil-Cre/Ikkβ mice. Both pro-inflammatory and anti-inflammatory cytokines were increased in serum following cecal ligation and puncture, and interleukin 10 and monocyte chemoattractant protein-1 levels were higher in septic Vil-Cre/Ikkβ mice than in septic wild-type mice. All septic mice were bacteremic, but no differences in bacterial load were identified between wild-type and Vil-Cre/Ikkβ mice. To determine the functional significance of these results, animals were followed for survival. Septic wild-type mice had lower mortality than septic Vil-Cre/Ikkβ mice (47% vs 80%, p<0.05). Anti-tumor necrosis factor administration decreased intestinal apoptosis, permeability, and mortality in wild-type septic mice, and a similar improvement in intestinal integrity and survival was seen when anti-tumor necrosis factor was given to Vil-Cre/Ikkβ mice. Enterocyte-specific NF-κB has a beneficial role in sepsis by partially preventing sepsis-induced increases in apoptosis and permeability, which are associated with worsening mortality.
Daters, A T; Mauldin, G E; Mauldin, G N; Brodsky, E M; Post, G S
2010-03-01
The purpose of this study was to evaluate the efficacy of adding mitoxantrone to a cyclophosphamide, doxorubicin, vincristine, L-asparaginase and prednisone containing protocol. Sixty-five dogs with multicentric lymphoma were evaluated for overall remission and survival times. Remission and survival time versus stage, substage, pretreatment hypercalcaemia and pretreatment steroid administration were also evaluated. Overall median remission for dogs with multicentric lymphoma was 302 days and overall median survival was 622 days. Of the dogs with multicentric lymphoma, 23 (35%) received all scheduled mitoxantrone doses. Only median survival versus substage was found to be significant (substage a median survival was 679 days and substage b median survival was 302 days, P = 0.025). Increasing the total combined dose of doxorubicin and mitoxantrone may improve remission times when compared with historical controls, and further studies are needed to determine how best to utilize mitoxantrone in multidrug chemotherapy protocols for canine multicentric lymphoma.
Survival estimation and the effects of dependency among animals
Schmutz, Joel A.; Ward, David H.; Sedinger, James S.; Rexstad, Eric A.
1995-01-01
Survival models assume that fates of individuals are independent, yet the robustness of this assumption has been poorly quantified. We examine how empirically derived estimates of the variance of survival rates are affected by dependency in survival probability among individuals. We used Monte Carlo simulations to generate known amounts of dependency among pairs of individuals and analyzed these data with Kaplan-Meier and Cormack-Jolly-Seber models. Dependency significantly increased these empirical variances as compared to theoretically derived estimates of variance from the same populations. Using resighting data from 168 pairs of black brant, we used a resampling procedure and program RELEASE to estimate empirical and mean theoretical variances. We estimated that the relationship between paired individuals caused the empirical variance of the survival rate to be 155% larger than the empirical variance for unpaired individuals. Monte Carlo simulations and use of this resampling strategy can provide investigators with information on how robust their data are to this common assumption of independent survival probabilities.
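The Monte Carlo approach described above can be illustrated with a small sketch: fates are simulated for pairs whose members share the same outcome with some probability, and the empirical variance of the estimated survival rate is compared with the binomial variance that assumes independent fates. The parameters and mixing scheme are made-up assumptions, not the brant data or the authors' RELEASE/Kaplan-Meier workflow.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_survival_rate(n_pairs, p, rho):
    """Simulate one cohort of n_pairs animal pairs and return the estimated survival rate.

    Each member copies a shared pair-level fate with probability rho; otherwise its
    fate is drawn independently, so rho controls within-pair dependency.
    """
    shared = rng.random(n_pairs) < p                  # pair-level fate
    use_shared = rng.random((n_pairs, 2)) < rho       # copy the pair fate?
    independent = rng.random((n_pairs, 2)) < p        # otherwise an independent fate
    fates = np.where(use_shared, shared[:, None], independent)
    return fates.mean()

n_pairs, p, reps = 168, 0.85, 5000
est_dependent   = [simulate_survival_rate(n_pairs, p, rho=0.5) for _ in range(reps)]
est_independent = [simulate_survival_rate(n_pairs, p, rho=0.0) for _ in range(reps)]

theoretical_var = p * (1 - p) / (2 * n_pairs)         # binomial variance, independence assumed
print("theoretical variance           :", theoretical_var)
print("empirical variance, independent:", np.var(est_independent))
print("empirical variance, paired     :", np.var(est_dependent))   # inflated by dependency
```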
Benard, Vicki B; Watson, Meg; Saraiya, Mona; Harewood, Rhea; Townsend, Julie S; Stroup, Antoinette M; Weir, Hannah K; Allemani, Claudia
2017-12-15
Overall, cervical cancer survival in the United States has been reported to be among the highest in the world, despite slight decreases over the last decade. The objective of the current study was to describe cervical cancer survival trends among US women and to examine differences by race and stage. This study used data from the CONCORD-2 study to compare survival among women (aged 15-99 years) diagnosed in 37 states covering 80% of the US population. Survival was adjusted for background mortality (net survival) with state- and race-specific life tables and was age-standardized with the International Cancer Survival Standard weights. Five-year survival was compared by race (all races, blacks, and whites). Two time periods, 2001-2003 and 2004-2009, were considered because of changes in how the staging variable was collected. From 2001 to 2009, 90,620 women were diagnosed with invasive cervical cancer. The proportion of cancers diagnosed at a regional or distant stage increased over time in most states. Overall, the 5-year survival was 63.5% in 2001-2003 and 62.8% in 2004-2009. Survival was lower for black women than for white women in both calendar periods and in most states; black women had a higher proportion of distant-stage cancers. The stability of the overall survival over time and the persistent differences in survival between white and black women in all US states suggest a need for targeted interventions and improved access to screening, timely treatment, and follow-up care, especially among black women. Cancer 2017;123:5119-37. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.
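The age-standardization step mentioned here is a weighted average of age-specific net survival, with weights taken from the International Cancer Survival Standard (ICSS). The age bands, weights, and survival values in the sketch below are placeholders chosen for illustration, not the actual ICSS weights or US estimates.

```python
# Age-standardized 5-year net survival as a weighted average of age-specific estimates.
# Bands, weights, and survival values are illustrative placeholders only.
age_bands    = ["15-44", "45-54", "55-64", "65-74", "75-99"]
weights      = [0.28, 0.17, 0.21, 0.20, 0.14]     # assumed standard weights (sum to 1)
net_survival = [0.78, 0.70, 0.64, 0.58, 0.47]     # hypothetical age-specific net survival

standardized = sum(w * s for w, s in zip(weights, net_survival))
print(f"age-standardized 5-year net survival: {standardized:.1%}")
```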
A small increase in UV-B increases the susceptibility of tadpoles to predation
Alton, Lesley A.; Wilson, Robbie S.; Franklin, Craig E.
2011-01-01
Increased ultraviolet-B (UV-B) radiation as a consequence of ozone depletion is one of the many potential drivers of ongoing global amphibian declines. Both alone and in combination with other environmental stressors, UV-B is known to have detrimental effects on the early life stages of amphibians, but our understanding of the fitness consequences of these effects remains superficial. We examined the independent and interactive effects of UV-B and predatory chemical cues (PCC) on a suite of traits of Limnodynastes peronii embryos and tadpoles, and assessed tadpole survival time in a predator environment to evaluate the potential fitness consequences. Exposure to a 3 to 6 per cent increase in UV-B, which is comparable to changes in terrestrial UV-B associated with ozone depletion, had no effect on any of the traits measured, except survival time in a predator environment, which was reduced by 22 to 28 per cent. Exposure to PCC caused tadpoles to hatch earlier, have reduced hatching success, have improved locomotor performance and survive for longer in a predator environment, but had no effect on tadpole survival, behaviour or morphology. Simultaneous exposure to UV-B and PCC resulted in no interactive effects. These findings demonstrate that increased UV-B has the potential to reduce tadpole fitness, while exposure to PCCs improves their fitness. PMID:21270039
Alhamad, Tarek; Spatz, Christin; Uemura, Tadahiro; Lehman, Eric; Farooq, Umar
2014-12-15
There has been a remarkable increase in simultaneous liver and kidney transplantations (SLK). As organ demand has increased, so has the use of donation after cardiac death (DCD). However, little is known about the outcomes of DCD in SLK. We performed a retrospective analysis using the United Network for Organ Sharing database to compare the outcomes of DCD SLK to donation after brain death (DBD) and determine the impact of donor and recipient factors on allograft and patient survival. Between 2002 and 2011, a total of 3,026 subjects received SLK from DBD and 98 from DCD. Kidney, liver, and patient survival from DCD donors were inferior to DBD at 1, 3, and 5 years (P=0.0056, P=0.0035, and P=0.0205, respectively). With the use of the Cox model, DCD was a significant risk factor for kidney and liver allograft failure and patient mortality. Recipient factors that were associated with worse allograft and patient outcomes included black race, diabetes, being on a ventilator, hospitalization, delayed graft function, hepatocellular carcinoma, and intensive care unit stay. Older age of the donor was also associated with worse outcomes. Despite the decreased allograft and patient survival compared with DBD, DCD SLK provides an acceptable option for SLK, with a survival probability of more than 50% at 5 years.
Wong, Robert J; Aguilar, Maria; Cheung, Ramsey; Perumpail, Ryan B; Harrison, Stephen A; Younossi, Zobair M; Ahmed, Aijaz
2015-03-01
Nonalcoholic steatohepatitis (NASH) has been predicted to become the leading indication for liver transplantation (LT) in the United States. However, few studies have evaluated changes in the etiology of liver diseases among patients awaiting LT, and none have focused on the effects of NASH on liver transplant waitlists in the United States. We collected data from the United Network for Organ Sharing and Organ Procurement and Transplantation Network registry from 2004 through 2013, on liver transplant waitlist registrants with hepatitis C virus (HCV) infection, NASH, alcoholic liver disease (ALD), or a combination of HCV infection and ALD. We compared differences in survival within 90 days of registration (90-day survival) and probability of LT among patients with different diseases using Kaplan-Meier and multivariate logistic regression models. Between 2004 and 2013, new waitlist registrants with NASH increased by 170% (from 804 to 2174), with ALD increased by 45% (from 1400 to 2024), and with HCV increased by 14% (from 2887 to 3291); registrants with HCV and ALD decreased by 9% (from 880 to 803). In 2013, NASH became the second-leading disease among liver transplant waitlist registrants, after HCV. Patients with ALD had a significantly higher mean Model for End-Stage Liver Disease score at time of waitlist registration than other registrants. However, after multivariate adjustment, patients with ALD were less likely to die within 90 days when compared with patients with NASH (odds ratio [OR] = 0.77; 95% confidence interval [CI]: 0.67-0.89; P < .001); patients with HCV infection or HCV and ALD had similar odds for 90-day survival compared with NASH patients. Compared with patients with NASH, patients with HCV (OR = 1.45; 95% CI: 1.35-1.55; P < .001), ALD (OR = 1.15; 95% CI: 1.06-1.24; P < .001), or HCV and ALD (OR = 1.29; 95% CI: 1.18-1.42; P < .001) had higher odds for 90-day survival. Based on data from US adult LT databases, since 2004 the number of adults with NASH awaiting LTs has almost tripled. However, patients with NASH are less likely to undergo LT and less likely to survive for 90 days on the waitlist than patients with HCV, ALD, or HCV and ALD. Copyright © 2015 AGA Institute. Published by Elsevier Inc. All rights reserved.
Delabaere, A; Accoceberry, M; Niro, J; Velemir, L; Laurichesse-Delmas, H; Coste, K; Bœuf, B; Labbe, A; Storme, B; Lemery, D; Gallot, D
2011-09-01
Our objective was to report perinatal outcome during the first three years of an emerging centre for laser photocoagulation in twin-twin transfusion syndrome (TTTS) and to compare it with the outcome observed earlier in the same centre, when management consisted of recurrent amniodrainage. We conducted a single-centre retrospective study. We compared the perinatal outcome of 19 consecutive cases of mid-trimester TTTS managed by amniodrainage over a 10-year period with that of 49 cases of TTTS managed by laser photocoagulation over a 3-year period. Laser photocoagulation increased the survival rate at birth (P=0.02) and at postnatal day 28 (P=0.01). Neurologic and cardiologic complications did not differ significantly (P=0.5 and P=0.3, respectively). We also observed a significant increase in survival at birth for the donor twin after laser coagulation (P=0.04). Our study demonstrated better outcome after laser photocoagulation. Early results of an emerging centre appeared comparable to those of more experienced centres. Copyright © 2011 Elsevier Masson SAS. All rights reserved.
Life course evolution of body size and breast cancer survival in the E3N cohort.
His, Mathilde; Le Guélennec, Marine; Mesrine, Sylvie; Boutron-Ruault, Marie-Christine; Clavel-Chapelon, Françoise; Fagherazzi, Guy; Dossus, Laure
2018-04-15
Although adult obesity has been associated with poor breast cancer survival, data on adiposity at different periods in life and its lifelong evolution are scarce. Our aims were to assess the associations between breast cancer survival and body size during childhood, puberty and early adulthood and body size trajectories from childhood to adulthood. Self-assessed body size at age 8, at puberty, at age 20-25 and at age 35-40 and trajectories of body size of 4,662 breast cancer survivors from the prospective E3N cohort were studied in relation to risk of death from any cause, death from breast cancer and second invasive cancer event using multivariate Cox regression models. Four trajectories of body size were identified (T1 "moderate increase," T2 "stable/low increase," T3 "increase at puberty" and T4 "constantly high"). Compared with stable body size, an increase in body size during adult life was associated with an increased risk of death from any cause (HR T1 vs. T2 = 1.27; 95% CI = 1.01-1.60) and an increased risk of second invasive cancer event (HR T1 vs. T2 = 1.25; 95% CI = 1.06-1.47). Silhouettes at various ages were not associated with survival. Our results suggest that the evolution of body size from childhood to adulthood has a long-term influence on breast cancer survival. Although these results need to be confirmed, this work sheds light on the need to combine lifelong approaches to current BMI to better identify breast cancer survivors who are at higher risk of recurrence or second primary cancer, or of death. © 2017 UICC.
Stiasny, Martina H.; Jutfelt, Fredrik; Riebesell, Ulf; Clemmesen, Catriona
2018-01-01
In the coming decades, environmental change like warming and acidification will affect life in the ocean. While data on single stressor effects on fish are accumulating rapidly, we still know relatively little about interactive effects of multiple drivers. Of particular concern in this context are the early life stages of fish, for which direct effects of increased CO2 on growth and development have been observed. Whether these effects are further modified by elevated temperature was investigated here for the larvae of Atlantic herring (Clupea harengus), a commercially important fish species. Over a period of 32 days, larval survival, growth in size and weight, and instantaneous growth rate were assessed in a crossed experimental design of two temperatures (10°C and 12°C) with two CO2 levels (400 μatm and 900 μatm CO2) at food levels mimicking natural levels using natural prey. Elevated temperature alone led to increased swimming activity, as well as decreased survival and instantaneous growth rate (Gi). The comparatively high sensitivity to elevated temperature in this study may have been influenced by low food levels offered to the larvae. Larval size, Gi and swimming activity were not affected by CO2, indicating tolerance of this species to projected "end of the century" CO2 levels. A synergistic effect of elevated temperature and CO2 was found for larval weight, where no effect of elevated CO2 concentrations was detected in the 12°C treatment, but a negative CO2 effect was found in the 10°C treatment. Contrasting CO2 effects were found for survival between the two temperatures. Under ambient CO2 conditions survival was increased at 12°C compared to 10°C. In general, CO2 effects were minor and considered negligible compared to the effect of temperature under these mimicked natural food conditions. These findings emphasize the need to include biotic factors such as energy supply via prey availability in future studies on interactive effects of multiple stressors. PMID:29370273
Abdelmaksoud, Ahmed Hosni; Mandooh, Safaa; Nabeel, Mohamed Mahmoud; Elbaz, Tamer Mahmoud; Shousha, Hend Ibrahim; Monier, Ashraf; Elattar, Inas Anwar; Abdelaziz, Ashraf Omar
2017-01-01
Objective: Hepatocellular carcinoma with portal vein thrombosis is considered a relative contraindication for transarterial chemoembolization (TACE). The aim of our study was to evaluate the prognostic factors and management in patients with hepatocellular carcinoma with portal vein thrombosis (PVT). Methods: Between February 2011 and February 2015, 140 patients presented to our specialized multidisciplinary HCC clinic. All were assessed by imaging at regular intervals for tumor response and the data compared with baseline laboratory and imaging characteristics obtained before treatment. Results: At the end of the follow-up in February 2015, 78 (55.7%) of the 140 patients had died, 33.1% in the 1st year and 20.7% in the 2nd year. The overall median survival was 10 months from the date of diagnosis. Clinical progression was noted in 45 (32.1%). Univariate analysis revealed that the Child-Pugh score, performance status (Eastern Cooperative Oncology Group [ECOG] 0-1), and the presence of ascites had no significant effect on survival. Similarly, the serum albumin level and AFP >400 ng/ml had no influence. However, patients with ≥2 tumors, abdominal lymphadenopathy, or serum bilirubin >2 mg/dl had a significantly worse prognosis. Specific treatment significantly increased survival compared with patients left untreated (P = 0.027). Conclusion: Application of specific treatments (curative or palliative) significantly increased survival in HCC patients with PVT. TACE can be considered a promising procedure for unresectable PVT-associated HCCs. The main predictors of survival in our study were the serum bilirubin level and specific treatment application. PMID:28240515
Suo, Biao; Yang, Hua; Wang, Yuexia; Lv, Haipeng; Li, Zhen; Xu, Chao; Ai, Zhilu
2018-01-01
When frozen, Staphylococcus aureus survives in a sublethally injured state. However, S. aureus can recover at a suitable temperature, which poses a threat to food safety. To elucidate the resuscitation mechanism of freezing survived S. aureus, we used cells stored at -18°C for 90 days as controls. After resuscitating the survived cells at 37°C, the viable cell numbers were determined on tryptic soy agar with 0.6% yeast extract (TSAYE), and the non-injured-cell numbers were determined on TSAYE supplemented with 10% NaCl. The results showed that the total viable cell number did not increase within the first 3 h of resuscitation, but the osmotic regulation ability of freezing survived cells gradually recovered to the level of healthy cells, which was evidenced by the lack of difference between the two samples seen by differential cell enumeration. Scanning electron microscopy (SEM) showed that, compared to late exponential stage cells, some frozen survived cells underwent splitting and cell lysis due to deep distortion and membrane rupture. Transmission electron microscopy (TEM) showed that, in most of the frozen survived cells, the nucleoids (low electronic density area) were loose, and the cytoplasmic matrices (high electronic density area) were sparse. Additionally, a gap was seen to form between the cytoplasmic membranes and the cell walls in the frozen survived cells. The morphological changes were restored when the survived cells were resuscitated at 37°C. We also analyzed the differential proteome after resuscitation using non-labeled high-performance liquid chromatography–mass spectrometry (HPLC-MS). The results showed that, compared with freezing survived S. aureus cells, the cells resuscitated for 1 h had 45 upregulated and 73 downregulated proteins. The differentially expressed proteins were functionally categorized by gene ontology enrichment, KEGG pathway, and STRING analyses. Cell membrane synthesis-related proteins, oxidative stress resistance-related proteins, metabolism-related proteins, and virulence factors exhibited distinct expression patterns during resuscitation. These findings have implications in the understanding of the resuscitation mechanism of freezing survived S. aureus, which may facilitate the development of novel technologies for improved detection and control of foodborne pathogens in frozen food. PMID:29774015
Kulaylat, Audrey S; Hollenbeak, Christopher S; Stewart, David B
2017-09-01
Squamous cell cancers of the anus are rare GI malignancies for which neoadjuvant chemoradiation is the first-line treatment for nonmetastatic disease. Squamous cancers of the rectum are far less common, and it is unclear to what degree chemoradiotherapy improves their outcomes. The purpose of this study was to compare stage-specific survival for anal and rectal squamous cancers stratified by treatment approach. This was a retrospective cohort study. The study was conducted at Commission on Cancer designated hospitals. Patients (2006-2012) identified in the National Cancer Database with pretreatment clinical stage I to III cancers who underwent chemoradiotherapy, with and without subsequent salvage surgical resection (low anterior resection or abdominoperineal resection), ≥12 weeks after chemoradiotherapy were included in the study. Overall survival and the need for salvage surgery were measured. Anal cancers (n = 11,224) typically presented with stage II (45.7%) or III (36.3%) disease, whereas rectal cancer stages (n = 1049) were more evenly distributed (p < 0.001). More patients with rectal cancer underwent low anterior or abdominoperineal resections 12 weeks or later after chemoradiotherapy versus those undergoing abdominoperineal resection for anal cancer (3.8% versus 1.2%; p < 0.001). Stage I and II rectal cancer was associated with poorer survival compared with anal cancer (stage I, p = 0.017; stage II, p < 0.001); survival was similar for stage III disease. Salvage surgery for anal cancer was associated with worse survival for stage I to III cancers; salvage surgery did not significantly affect survival for rectal cancer. This was a retrospective study without cancer-specific survival measures. Squamous rectal cancers are associated with significantly worse survival than squamous cancers of the anus for clinical stage I and II disease. Despite both cancers exhibiting squamous histology, rectal cancers may be less radiosensitive than anal cancers, as suggested by the greater incidence of salvage surgery that does not appear to significantly improve overall survival. See Video Abstract at http://links.lww.com/DCR/A422.
Saumell, Yaimarelis; Sanchez, Lizet; González, Sandra; Ortiz, Ramón; Medina, Edadny; Galán, Yaima; Lage, Agustin
2017-12-01
Despite improvements in surgical techniques and treatments introduced into clinical practice, the overall survival of patients with esophageal squamous cell carcinoma remains low. Several epidermal growth factor receptor inhibitors are being evaluated in the context of clinical trials, but there is little evidence of effectiveness in real-world conditions. This study aimed at assessing the effectiveness of nimotuzumab combined with onco-specific treatment in Cuban real-life patients with locally advanced or metastatic esophageal squamous cell carcinoma. A comparative and retrospective effectiveness study was performed. The 93 patients treated with nimotuzumab were matched, with use of propensity score matching, with patients who received a diagnosis of locally advanced or metastatic squamous cell carcinoma of the esophagus in three Cuban provinces reported between 2011 and 2015 to the National Cancer Registry. The Kaplan-Meier method was used to estimate event-time distributions. Log-rank statistics were used for comparisons of overall survival between groups. A two-component mixture model assuming a Weibull distribution was fitted to assess the effect of nimotuzumab on short-term and long-term survival populations. There was an increase in median overall survival in patients treated with nimotuzumab (11.9 months versus 6.5 months without treatment) and an increase in the 1-year survival rate (54.0% versus 21.9% without treatment). The 2-year survival rates were 21.1% for patients treated with nimotuzumab and 0% in the untreated cohort. There were statistically significant differences in survival between groups treated and not treated with nimotuzumab, both in the short-term survival population (6.0 months vs 4.0 months, p = 0.009) and in the long-term survival population (18.0 months vs 11.0 months, p = 0.001). Our study shows that nimotuzumab treatment concurrent with chemoradiotherapy increases the survival of real-world patients with locally advanced or metastatic esophageal squamous cell carcinoma. Further prospective studies are required to confirm the therapeutic effectiveness of nimotuzumab in esophageal cancer.
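The survival contrast reported above combines Kaplan-Meier estimation with a log-rank comparison between the nimotuzumab-treated and untreated cohorts. A minimal sketch of that step is shown below, assuming a data frame with hypothetical columns months, died, and nimotuzumab; the propensity matching and the two-component Weibull mixture model would be separate steps.

```python
# Sketch of the Kaplan-Meier / log-rank style comparison described above,
# using the lifelines library. Column names (months, died, nimotuzumab) are
# assumptions; real registry data would be matched before this step.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

def compare_survival(df: pd.DataFrame):
    treated = df[df["nimotuzumab"] == 1]
    control = df[df["nimotuzumab"] == 0]

    km_treated, km_control = KaplanMeierFitter(), KaplanMeierFitter()
    km_treated.fit(treated["months"], treated["died"], label="nimotuzumab + onco-specific")
    km_control.fit(control["months"], control["died"], label="onco-specific only")

    # Median overall survival per arm and a log-rank test of the difference
    result = logrank_test(treated["months"], control["months"],
                          treated["died"], control["died"])
    return km_treated.median_survival_time_, km_control.median_survival_time_, result.p_value
```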
Changes in Risk Profile Over Time in the Population of a Pediatric Heart Transplant Program.
Reinhartz, Olaf; Maeda, Katsuhide; Reitz, Bruce A; Bernstein, Daniel; Luikart, Helen; Rosenthal, Daniel N; Hollander, Seth A
2015-09-01
Single-center data on pediatric heart transplantation spanning long time frames are sparse. We analyzed how the risk profile and survival outcomes of pediatric heart transplantation at a large center changed over time. We divided 320 pediatric heart transplants done at Stanford University between 1974 and 2014 into three groups by era: the first 20 years (95 transplants), the subsequent 10 years (87 transplants), and the most recent 10 years (138 transplants). Differences in age at transplant, indication, mechanical support, and survival were analyzed. Follow-up was 100% complete. Average age at time of transplantation was 10.4 years, 11.9 years, and 5.6 years in eras 1, 2, and 3, respectively. The percentage of infants who received transplants by era was 21%, 7%, and 18%, respectively. The proportion of transplants performed for end-stage congenital heart disease (vs cardiomyopathy) was 24%, 22%, and 49%, respectively. Only 1 patient (1%) was on mechanical support at transplant in era 1 compared with 15% in era 2 and 30% in era 3. Overall survival was 72% at 5 years and 57% at 10 years. Long-term survival increased significantly with each subsequent era. Patients with cardiomyopathy generally had a survival advantage over those with congenital heart disease. The risk profile of pediatric transplant patients in our institution has increased over time. In the last 10 years, median age has decreased and ventricular assist device support has increased dramatically. Transplantation for end-stage congenital heart disease is increasingly common. Despite this, long-term survival has significantly and consistently improved. Copyright © 2015 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
Lean Body Mass Predicts Long-Term Survival in Chinese Patients on Peritoneal Dialysis
Huang, Jenq-Wen; Lien, Yu-Chung; Wu, Hon-Yen; Yen, Chung-Jen; Pan, Chun-Chun; Hung, Tsai-Wei; Su, Chi-Ting; Chiang, Chih-Kang; Cheng, Hui-Teng; Hung, Kuan-Yu
2013-01-01
Background Reduced lean body mass (LBM) is one of the main indicators in malnutrition inflammation syndrome among patients on dialysis. However, the influence of LBM on peritoneal dialysis (PD) patients’ outcomes and the factors related to increasing LBM are seldom reported. Methods We enrolled 103 incident PD patients between 2002 and 2003, and followed them until December 2011. Clinical characteristics, PD-associated parameters, residual renal function, and serum chemistry profiles of each patient were collected at 1 month and 1 year after initiating PD. LBM was estimated using creatinine index corrected with body weight. Multiple linear regression analysis, Kaplan–Meier survival analysis, and Cox regression proportional hazard analysis were used to define independent variables and compare survival between groups. Results Using the median LBM value (70% for men and 64% for women), patients were divided into group 1 (n = 52; low LBM) and group 2 (n = 51; high LBM). Group 1 patients had higher rates of peritonitis (1.6 vs. 1.1/100 patient months; p<0.05) and hospitalization (14.6 vs. 9.7/100 patient months; p<0.05). Group 1 patients also had shorter overall survival and technique survival (p<0.01). Each percentage point increase in LBM reduced the hazard ratio for mortality by 8% after adjustment for diabetes, age, sex, and body mass index (BMI). Changes in residual renal function and protein catabolic rate were independently associated with changes in LBM in the first year of PD. Conclusions LBM serves as a good parameter in addition to BMI to predict the survival of patients on PD. Preserving residual renal function and increasing protein intake can increase LBM. PMID:23372806
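The key result above, an approximately 8% drop in mortality hazard per percentage point of lean body mass after adjustment for diabetes, age, sex, and BMI, corresponds to a multivariable Cox model of the general form sketched below; the column names are assumptions for illustration, not the study's dataset.

```python
# Minimal sketch of the adjusted Cox model described above: hazard ratio per
# percentage point of lean body mass, adjusted for diabetes, age, sex, and BMI.
# Column names are illustrative assumptions.
import pandas as pd
from lifelines import CoxPHFitter

def lbm_hazard_ratio(df: pd.DataFrame) -> float:
    cph = CoxPHFitter()
    cph.fit(
        df[["followup_months", "death", "lbm_pct", "age", "male", "diabetes", "bmi"]],
        duration_col="followup_months",
        event_col="death",
    )
    cph.print_summary()                   # per-covariate HRs with 95% CIs
    return cph.hazard_ratios_["lbm_pct"]  # e.g. ~0.92 would mean ~8% lower hazard per point
```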
Dugger, Katie M.; Ainley, David G.; Lyver, Phil O'B.; Barton, Kerry; Ballard, Grant
2010-01-01
High survival and breeding philopatry were previously confirmed for the Adélie penguin (Pygoscelis adeliae) during a period of stable environmental conditions. However, movements of breeding adults as a result of an unplanned natural experiment within a four-colony meta-population provided interesting insights into this species’ population dynamics. We used multistate mark-recapture models to investigate apparent survival and dispersal of breeding birds in the southwestern Ross Sea during 12 breeding seasons (1996–2007). The natural experiment was facilitated by the temporary grounding of two immense icebergs that (i) erected a veritable fence that separated colonies and altered migration routes and (ii) imposed additional stress by trapping extensive sea ice in the region during 5 of the 12 years. Colony size varied by orders of magnitude, allowing investigation of apparent survival and dispersal rates in relation to both environmental conditions and colony size within this meta-population. Apparent survival was lowest for the smallest colony (4,000 pairs) and similar for the medium (45,000 pairs) and large colonies (155,000 pairs), despite increased foraging effort expended by breeders at the largest colony. Dispersal of breeding birds was low (<1%), except during years of difficult environmental conditions when movements increased, especially away from the smallest colony (3.5%). Decreased apparent survival at the smallest colony could reflect differences in migration chronology and winter habitat use compared with the other colonies, or it may reflect increased permanent emigration to colonies outside this meta-population. Contrary to current thought, breeding penguins are not always philopatric. Rather, stressful conditions can significantly increase dispersal rates. PMID:20566874
Wu, Qian Qian; You, Hyun Ju; Ahn, Hyung Jin; Kwon, Bin; Ji, Geun Eog
2012-06-15
Bifidobacterium adolescentis Int57 (Int57) and Propionibacterium freudenreichii subsp. shermanii ATCC 13673 (ATCC 13673) were grown either in coculture or as pure cultures in different media, such as cow's milk, soybean milk, and modified MRS medium. The viable cell counts of bacteria, changes in pH, concentrations of organic acids, and contents of various sugars were analyzed during incubation up to 7 days. In soy milk, the survival of cocultured Int57 was six times higher than that of monocultured cells, and ATCC 13673 cocultured with Int57 consumed 69.4% of the lactic acid produced by Int57 at the end of fermentation. In cow's milk, coculture with ATCC 13673 increased the growth of Int57 from 24 h until 120 h by approximately tenfold and did not affect the survival of Int57 cells. After 96 h of fermentation of modified MRS, the survival of ATCC 13673 cells cocultured with Int57 increased 3.2- to 7.4-fold compared with ATCC 13673 monoculture, whereas the growth of Int57 cells was unaffected. The growth and metabolic patterns of the two strains during coculture showed noticeable differences between food-grade media and laboratory media. The consumption of stachyose in soy milk during coculture of Int57 with ATCC 13673 more than doubled compared with Int57 monoculture and was completed within 24 h. The combinational use of Bifidobacterium and Propionibacterium could be applied to the development of fermented milk or soy milk products. Copyright © 2012 Elsevier B.V. All rights reserved.
Moreira, Alvaro; Leisgang, Waltraud; Schuler, Gerold; Heinzerling, Lucie
2017-01-01
The prognostic role of eosinophils in cancer has been controversial. In some entities, such as gastrointestinal cancers, eosinophilia is associated with better survival, whereas in others, such as Hodgkin's lymphoma, it is associated with worse survival. Patients who exhibited an increase in eosinophils upon therapy with ipilimumab or pembrolizumab were shown to survive longer. We wanted to investigate whether eosinophilia is a prognostic marker in metastatic melanoma. In total, 173 patients with metastatic melanoma from our database (median age 60 years; n = 86 with immunotherapy, n = 87 without immunotherapy) were analyzed for eosinophil counts and survival over the course of 12 years. Eosinophil counts were determined from peripheral blood smears. The ethics committee approved this retrospective study. Melanoma patients with eosinophilia at any point in their course of disease showed a trend toward longer survival independent of therapy. The difference was statistically significant for patients who survived at least 12 months (p < 0.005). Among patients receiving checkpoint inhibitor therapy, survival was significantly prolonged in those with eosinophilia (p < 0.05). Furthermore, 69% of the patients treated with immunotherapy experienced an eosinophilia of 5% or greater at least once, compared with 46% in the immunotherapy-naive group; for an eosinophilia of 10%, the values were 30% and 9%, respectively. Interestingly, in patients with more than 20% eosinophils (n = 7), survival was prolonged, with a median of 35 months (range 19-60 months) compared with 16 months (range 1-117 months). Eosinophilia is a prognostic marker in patients with metastatic melanoma.
Outcomes following total laryngectomy for squamous cell carcinoma: one centre experience.
Leong, S C; Kartha, S-S; Kathan, C; Sharp, J; Mortimore, S
2012-12-01
To evaluate the clinical outcomes of total laryngectomy (TL), complications and factors affecting survival. Retrospective review of hospital electronic database for head and neck squamous cell carcinoma (SCCa). Large district general hospital in England, United Kingdom. Patients who had TL between January 1994 and January 2008. 5-year disease specific survival (DSS) and disease-free survival (DFS). Seventy-one patients were reviewed, of whom 38 (54%) had laryngeal SCCa and 33 (46%) hypopharyngeal SCCa. The overall mean survival period following TL was 42.4 months. The 5-year DSS and DFS were better for laryngeal SCCa compared with hypopharyngeal SCCa, although not statistically significant (P=0.090, P=0.54 respectively). Patients treated for laryngeal SCCa had a mean survival period of 47.5 months compared to 36.5 months for hypopharyngeal disease. Those who had laryngeal recurrence after primary radiotherapy (RT) demonstrated statistically better survival probability than those who had hypopharyngeal recurrence (P=0.011). Patients without cervical lymphadenopathy had statistically better survival (P=0.049). The most common early complication was related to the cardiorespiratory system. One fatal complication of erosion of the brachiocephalic artery due to the laryngectomy tube was noted. The most common late complication was neopharyngeal stenosis. The commonest cause of death was locoregional recurrence, followed by medical co-morbidities. Patients referred to a specialised head and neck clinic had a better survival probability than those referred to a general ENT clinic, although this difference was not statistically significant (P=0.37). While there is an increasing tendency towards laryngeal conservation, total laryngectomy remains a robust treatment option in selected patients. Copyright © 2012. Published by Elsevier Masson SAS.
Frouws, M A; Rademaker, E; Bastiaannet, E; van Herk-Sukel, M P P; Lemmens, V E; Van de Velde, C J H; Portielje, J E A; Liefers, G J
2017-05-01
Several studies have suggested that the association between aspirin and improved cancer survival is mediated through aspirin's action as a thrombocyte aggregation inhibitor (TAI). The aim of this study was to provide epidemiological evidence for this mechanism by assessing the association between overall survival and the use of aspirin and non-aspirin TAI in patients with colorectal cancer. In this observational study, data from the Netherlands Comprehensive Cancer Organisation were linked to the PHARMO Database Network. Patients using aspirin or aspirin in combination with non-aspirin TAI (dual users) were selected and compared with non-users. The association between overall survival and the use of (non-)aspirin TAI was analysed using Cox regression models with the use of (non-)aspirin TAI as a time-varying covariate. In total, 9196 patients were identified with colorectal cancer and 1766 patients used TAI after diagnosis. Non-aspirin TAI were mostly clopidogrel and dipyridamole. Aspirin use was associated with significantly improved overall survival (hazard ratio [HR] 0.41; 95% confidence interval [CI] 0.37-0.47), whereas the use of non-aspirin TAI was not associated with survival (HR 0.92; 95% CI 0.70-1.22). Dual users did not have improved overall survival compared with patients using solely aspirin. Aspirin use after diagnosis of colorectal cancer was associated with significantly lower mortality rates and this effect remained significant after adjusting for potential confounders. No additional survival benefit was observed in patients using both aspirin and another TAI. Copyright © 2017 Elsevier Ltd. All rights reserved.
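The distinguishing methodological point above is that TAI exposure enters the Cox model as a time-varying covariate, so person-time before the first prescription is not credited to the exposed group. A minimal lifelines sketch is given below, assuming a long-format table (one row per patient-interval) with hypothetical column names; it is not the authors' code.

```python
# Sketch of a Cox model with (non-)aspirin TAI use as a time-varying covariate,
# the approach named above. The data frame must be in long format (one row per
# patient-interval); all column names here are assumptions for illustration.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

def fit_time_varying_tai_model(intervals: pd.DataFrame) -> CoxTimeVaryingFitter:
    """Expected columns: id, start, stop, aspirin, non_aspirin_tai, event."""
    ctv = CoxTimeVaryingFitter()
    ctv.fit(intervals,
            id_col="id", start_col="start", stop_col="stop", event_col="event")
    ctv.print_summary()  # hazard ratios for aspirin and non-aspirin TAI exposure
    return ctv
```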
Kairiene, Igne; Pasauliene, Ramune; Lipunova, Nadezda; Vaitkeviciene, Goda; Rageliene, Lina; Rascon, Jelena
2017-10-01
The reported treatment outcomes of children treated for cancer in Eastern European countries are inferior to those in Northern/Western Europe. We hypothesized that recent survival rates could be comparable to the current standards and performed a population-based analysis of treatment outcome of childhood acute myeloid leukemia (AML) in Lithuania, a small Eastern European country. Children < 18 years old who were treated for AML from 2000 to 2013 were included (n = 54). Estimates of 5-year event-free survival (EFS) and overall survival (OS) rates were analyzed. Comparing the periods 2000-2006 (n = 32) and 2007-2013 (n = 22), 5-year EFS improved from 31 to 63% (p = 0.04), and 5-year OS improved from 31 to 72% (p = 0.02) because of reductions in toxicity-related mortality (42 vs. 15%, p = 0.08) and relapse (43 vs. 25%, p = 0.08). The most significant improvement was demonstrated in high-risk patients (5-year OS improved from 26 to 75%, p = 0.02) who benefited from hematopoietic stem cell transplantation: post-transplant 5-year EFS increased from 13 to 86% (p = 0.01). The current survival rate of Lithuanian children treated for AML was comparable to the expected rate in other parts of Europe. What is Known: • In the last three decades, significant improvement has been achieved in treating childhood cancer, with an overall survival (OS) rate of > 80% in high-income countries. The difference in survival rates between Northern/Western and Eastern European countries as well as between high- and middle-/low-income countries is as much as 20%. Recently, the 5-year event-free survival rate of acute myeloid leukemia (AML) has reached > 60% in high-income countries. The survival rates for myeloproliferative diseases were the lowest in Eastern European countries. • The reported inferior survival rates were calculated based on outcome data of patients treated until 2007. The recent survival rates in Eastern European countries are unknown. What is New: • Being a small Eastern European country, Lithuania has experienced good economic growth during the last decade. We hypothesized that economic growth and gain of experience could result in better survival rates of children treated for cancer in our country in recent years. • A population-based analysis of treatment outcome of childhood AML treated in Lithuania in recent years was performed for the first time. The survival rates of childhood AML in Lithuania are comparable to those of other high-income countries. Current survival rates of children treated for cancer in Eastern European countries could be comparable to the best current standards contributing to better European survival rates of childhood cancer in general.
Survival in cats with primary and secondary cardiomyopathies.
Spalla, Ilaria; Locatelli, Chiara; Riscazzi, Giulia; Santagostino, Sara; Cremaschi, Elena; Brambilla, Paola
2016-06-01
Feline cardiomyopathies (CMs) represent a heterogeneous group of myocardial diseases. The most common CM is hypertrophic cardiomyopathy (HCM), followed by restrictive cardiomyopathy (RCM). Studies comparing survival and outcome for different types of CM are scant. Furthermore, little is known about the cardiovascular consequences of systemic diseases on survival. The aim of this retrospective study was to compare survival and prognostic factors in cats affected by HCM, RCM or secondary CM referred to our institution over a 10 year period. The study included 94 cats with complete case records and echocardiographic examination. Fifty cats presented HCM, 14 RCM and 30 secondary CM. A statistically significant difference in survival time was identified for cats with HCM (median survival time of 865 days), RCM (273 days) and secondary CM (<50% cardiac death rate). In the overall population and in the primary CM group (HCM + RCM), risk factors in the multivariate analysis, regardless of the CM considered, were the presence of clinical signs, an increased left atrial to aortic root (LA/Ao) ratio and a hypercoagulable state. Primary CMs in cats share some common features (ie, LA dimension and hypercoagulable state) linked to feline cardiovascular physiology, which influence survival greatly in end-stage CM. The presence of clinical signs has to be regarded as a marker of disease severity, regardless of the underlying CM. Secondary CMs are more benign conditions, but if the primary disease is not properly managed, the prognosis might also be poor in this group of patients. © ISFM and AAFP 2015.
Cystic Fibrosis Associated with Worse Survival After Liver Transplantation.
Black, Sylvester M; Woodley, Frederick W; Tumin, Dmitry; Mumtaz, Khalid; Whitson, Bryan A; Tobias, Joseph D; Hayes, Don
2016-04-01
Survival in cystic fibrosis patients after liver transplantation and liver-lung transplantation is not well studied. To discern survival rates after liver transplantation and liver-lung transplantation in patients with and without cystic fibrosis. The United Network for Organ Sharing database was queried from 1987 to 2013. Univariate Cox proportional hazards, multivariate Cox models, and propensity score matching were performed. Liver transplant and liver-lung transplant were performed in 212 and 53 patients with cystic fibrosis, respectively. Univariate Cox proportional hazards regression identified lower survival in cystic fibrosis after liver transplant compared to a reference non-cystic fibrosis liver transplant cohort (HR 1.248; 95% CI 1.012, 1.541; p = 0.039). Supplementary analysis found graft survival was similar across the 3 recipient categories (log-rank test: χ2 = 2.68; p = 0.262). Multivariate Cox models identified increased mortality hazard among cystic fibrosis patients undergoing liver transplantation (HR 2.439; 95% CI 1.709, 3.482; p < 0.001) and liver-lung transplantation (HR 2.753; 95% CI 1.560, 4.861; p < 0.001). Propensity score matching of cystic fibrosis patients undergoing liver transplantation to non-cystic fibrosis controls identified a greater mortality hazard in the cystic fibrosis cohort using a Cox proportional hazards model stratified on matched pairs (HR 3.167; 95% CI 1.265, 7.929; p = 0.014). Liver transplantation in cystic fibrosis is associated with poorer long-term patient survival compared to non-cystic fibrosis patients, although the difference is not due to graft survival.
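The propensity score matching named above pairs each cystic fibrosis recipient with a comparable non-cystic fibrosis control before the stratified Cox comparison. The sketch below is a generic 1:1 nearest-neighbour matcher on logistic-regression scores, with placeholder covariate names; it is not the authors' code, and a real analysis would also enforce a caliper and check covariate balance.

```python
# Generic sketch of 1:1 nearest-neighbour propensity-score matching, the kind
# of step named above before a Cox model stratified on matched pairs.
# Covariate names are placeholders, not the registry's variables.
import pandas as pd
from sklearn.linear_model import LogisticRegression

def match_one_to_one(df: pd.DataFrame, treat_col: str, covariates: list) -> pd.DataFrame:
    ps = LogisticRegression(max_iter=1000).fit(df[covariates], df[treat_col])
    df = df.assign(pscore=ps.predict_proba(df[covariates])[:, 1])

    treated = df[df[treat_col] == 1]
    controls = df[df[treat_col] == 0].copy()

    pairs = []
    for idx, row in treated.iterrows():
        if controls.empty:
            break
        j = (controls["pscore"] - row["pscore"]).abs().idxmin()  # closest unused control
        pairs.append((idx, j))
        controls = controls.drop(index=j)                        # match without replacement
    matched_ids = [i for pair in pairs for i in pair]
    return df.loc[matched_ids]
```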
Roué, Tristan; Labbé, Sylvain; Belliardo, Sophie; Plenet, Juliette; Douine, Maylis; Nacher, Mathieu
2016-08-01
The prognosis of patients with breast cancer in French Guiana is worse than in France, with 23 deaths per 100 incident cases versus 17 per 100 in metropolitan France. This study aimed to compare the relative survival of patients with invasive breast cancer (IBC) between women from French Guiana and metropolitan France and to determine risk factors influencing breast cancer survival in French Guiana. Data were collected from the Cancer Registry of French Guiana. We compared the relative survival of women with IBC between French Guiana and metropolitan France. We used the Cox proportional hazard regression to evaluate the effect of prognostic factors on cancer-specific mortality in French Guiana. We included all 269 cases of IBC in women diagnosed in French Guiana between 2003 and 2009. The overall 5-year relative survival rate of patients with IBC was 79% in French Guiana and 86% in metropolitan France. The place of birth (foreign country vs. French territory), the tumor stage at the time of diagnosis, the mode of diagnosis (symptoms vs. screening), the presence of hormone receptors in the tumor, and the histologic type were the variables associated with survival differences. None of the other study variables were significantly associated with prognosis. Access to care for migrants is challenging, which leads to health inequalities. Early detection through prevention programs is crucial to increase IBC survival, notably for foreign-born patients. Copyright © 2016 Elsevier Inc. All rights reserved.
Effects of radiomarking on prairie falcons: Attachment failures provide insights about survival
Steenhof, Karen; Bates, Kirk K.; Fuller, Mark R.; Kochert, Michael N.; McKinley, James O.; Lukacs, Paul M.
2006-01-01
From 1999–2002, we attached satellite-received platform transmitter terminals (PTTs) to 40 adult female prairie falcons (Falco mexicanus) on their nesting grounds in the Snake River Birds of Prey National Conservation Area (NCA) in southwest Idaho. We used 3 variations of a backpack harness design that had been used previously on raptors. Each radiomarked falcon also received a color leg band with a unique alphanumeric code. We monitored survival of birds using radiotelemetry and searched for marked birds on their nesting grounds during breeding seasons after marking. Because 6 falcons removed their harnesses during the first year, we were able to compare survival rates of birds that shed PTTs with those that retained them. We describe a harness design that failed prematurely as well as designs that proved successful for long-term PTT attachment. We resighted 21 marked individuals on nesting areas 1–5 years after they were radiomarked and documented 13 mortalities of satellite-tracked falcons. We used a Cormack-Jolly-Seber model to estimate apparent survival probability based on band resighting and telemetry data. Platform transmitter terminals had no short-term effects on falcons or their nesting success during the nesting season they were marked, but birds that shed their transmitters increased their probability of survival. Estimated annual survival for birds that shed their transmitters was 87% compared to 49% for birds wearing transmitters. We discuss possible reasons for differences in apparent survival rates and offer recommendations for future marking of falcons.
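The Cormack-Jolly-Seber framework mentioned above estimates apparent survival and resighting probability jointly from encounter histories, which is what lets birds that shed transmitters be followed through band resighting alone. Below is a deliberately simplified sketch with constant parameters (the study's model allowed group and time structure), fit by maximum likelihood to toy 0/1 histories; it is an illustration under those assumptions, not the analysis used in the paper.

```python
# Minimal Cormack-Jolly-Seber sketch: constant apparent survival (phi) and
# resighting probability (p), fit by maximum likelihood to 0/1 capture
# histories. A toy simplification of the group-structured model described above.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

def cjs_neg_log_lik(params, histories):
    phi, p = expit(params)                       # keep both parameters in (0, 1)
    nll = 0.0
    K = histories.shape[1]
    for h in histories:
        seen = np.flatnonzero(h)
        if seen.size == 0:
            continue
        first, last = seen[0], seen[-1]
        # occasions between first and last sighting: survived, then seen or missed
        for t in range(first + 1, last + 1):
            nll -= np.log(phi) + np.log(p if h[t] else 1.0 - p)
        # chi: probability of never being seen again after the last sighting
        chi = 1.0
        for _ in range(K - 1 - last):
            chi = (1.0 - phi) + phi * (1.0 - p) * chi
        nll -= np.log(chi)
    return nll

def fit_cjs(histories):
    res = minimize(cjs_neg_log_lik, x0=[0.0, 0.0],
                   args=(np.asarray(histories),), method="Nelder-Mead")
    return expit(res.x)   # (phi_hat, p_hat)

# Toy usage: rows are individuals, columns are annual occasions
print(fit_cjs([[1, 1, 0, 1], [1, 0, 0, 0], [1, 1, 1, 1], [1, 0, 1, 0]]))
```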
Lefbom, Bonnie K; Peckens, Neal K
2016-07-01
OBJECTIVE To assess the effects of in-person collaborative care by primary care veterinarians (pcDVMs) and board-certified veterinary cardiologists (BCVCs) on survival time of dogs after onset of congestive heart failure (CHF) and on associated revenue for the attending pcDVMs. DESIGN Retrospective cohort study. ANIMALS 26 small-breed dogs treated for naturally occurring CHF secondary to myxomatous mitral valve disease at a multilocation primary care veterinary hospital between 2008 and 2013. PROCEDURES Electronic medical records were reviewed to identify dogs with confirmed CHF secondary to myxomatous mitral valve disease and collect information on patient care, survival time, and pcDVM revenue. Data were compared between dogs that received collaborative care from the pcDVM and a BCVC and dogs that received care from the pcDVM alone. RESULTS Dogs that received collaborative care had a longer median survival time (254 days) than did dogs that received care from the pcDVM alone (146 days). A significant positive correlation was identified between pcDVM revenue and survival time for dogs that received collaborative care (ie, the longer the dog survived, the greater the pcDVM revenue generated from caring for that patient). CONCLUSIONS AND CLINICAL RELEVANCE Findings suggested that collaborative care provided to small-breed dogs with CHF by a BCVC and pcDVM could result in survival benefits for affected dogs and increased revenue for pcDVMs, compared with care provided by a pcDVM alone.
Carcinogenicity studies after intraperitoneal injection of two types of stone wool fibres in rats.
Kamstrup, O; Ellehauge, A; Collier, C G; Davis, J M G
2002-03-01
A summary is given of the pathology results after intraperitoneal (i.p.) injection in rats of insulation wool HT, representing the new biosoluble types. The pathology results are compared with a previously conducted i.p. study with traditional stone wool D6 (with similar chemical composition to MMVF21). The HT fibre is characterized by a relatively high content of aluminium and a relatively low content of silica compared to MMVF21. HT has a high in vitro dissolution rate at pH 4.5, a relatively low dissolution rate at pH 7.5 and is less biopersistent than the MMVF21 fibre. Female Wistar rats received a dose of 2 x 10(9) WHO HT fibres by i.p. injection. The fibres had been size-selected to be largely rat respirable. The negative control group was exposed to saline. Following exposure, the animals were maintained until survival in one group fell below 20%. At this time, all animals were killed. All animals were subjected to a necropsy examination; any gross abnormalities observed at necropsy were subjected to histopathological examination. In addition, histopathology was carried out on a predefined list of tissues. The incidences of lesions and survival in the control and fibre dosed animals were compared using appropriate statistical methods to determine whether the dosed animals showed adverse effects on survival or a positive carcinogenic response. The main protocol for the previously conducted study with D6 (MMVF21) was similar, but the animals were maintained as long as they survived, and the WHO fibre dose was lower. The results of the comparative study showed a marked difference in the i.p. pathogenicity of D6 (MMVF21) and HT in terms of their carcinogenic potential. D6 (MMVF21) caused a statistically significant increase of mesotheliomas in the peritoneal cavity compared to the negative control, but the HT fibre did not cause any mesotheliomas or any increase in other tumour types.
Roland, Michelle E.; Barin, Burc; Huprikar, Shirish; Murphy, Barbara; Hanto, Douglas W.; Blumberg, Emily; Olthoff, Kim; Simon, David; Hardy, William D.; Beatty, George; Stock, Peter G.
2016-01-01
Objectives To evaluate the impact of liver and kidney transplantation on survival in HIV-positive transplant candidates and compare outcomes between HIV-positive and negative recipients. Design Observational cohort of HIV-positive transplant candidates and recipients and secondary analysis comparing study recipients to HIV-negative national registry controls. Methods We fit proportional hazards models to assess transplantation impact on mortality among recipients and candidates. We compared time to graft failure and death with HIV-negative controls in unmatched, demographic-matched, and risk-adjusted models. Results There were 17 (11.3%) and 46 (36.8%) deaths among kidney and liver recipients during a median follow-up of 4.0 and 3.5 years, respectively. Transplantation was associated with survival benefit for HIV-infected liver recipients with model for end-stage liver disease (MELD) greater than or equal 15 [hazard ratio (HR) 0.1; 95% confidence interval (CI) 0.05, 0.01; P <0.0001], but not for MELD less than 15 (HR 0.7; 95% CI 0.3, 1.8; P =0.43) or for kidney recipients (HR 0.6; 95% CI 0.3, 1.4; P =0.23). In HIV-positive kidney recipients, unmatched and risk-matched analyses indicated a marginally significant HR for graft loss [1.3 (P =0.07) and HR 1.4 (P =0.052)]; no significant increase in risk of death was observed. All models demonstrated a higher relative hazard of graft loss or death in HIV-positive liver recipients; the absolute difference in the proportion of deaths was 6.7% in the risk-matched analysis. Conclusion Kidney transplantation should be standard of care for well managed HIV-positive patients. Liver transplant in candidates with high MELD confers survival benefit; transplant is a viable option in selected candidates. The increased mortality risk compared with HIV-negative recipients was modest. Trial Registration ClinicalTrials.Gov; NCT00074386; http://clinicaltrials.gov/. PMID:26765937
Can we maximize both value and quality in gynecologic cancer care? A work in progress.
Havrilesky, Laura J; Fountain, Cynthia
2014-01-01
Value is defined as desirable health outcomes achieved per monetary unit spent. Comparative effectiveness research and cost-effectiveness research are methods that have been developed to quantify effectiveness and value to inform management decisions. In this article we review the comparative and cost-effectiveness literature in the field of ovarian cancer treatment. Studies have shown that improved ovarian cancer survival is associated with complete primary surgical cytoreduction, with treatment at high volume facilities by subspecialist providers (gynecologic oncologists) and with National Comprehensive Cancer Network (NCCN) guideline-adherent care in both surgical staging and chemotherapy regimens. Intraperitoneal/intravenous chemotherapy (compared with intravenous alone) has been associated with improved survival and cost-effectiveness. Bevacizumab for primary and maintenance therapy has been found to not be cost-effective (even in selective subsets) despite a small progression-free survival (PFS) advantage. For platinum-sensitive recurrent ovarian cancer, secondary cytoreduction and platinum-based combinations are associated with improved overall survival (OS); several platinum-based combinations have also been found cost-effective. For platinum-resistant recurrence, single agent therapy and supportive care are cost-effective compared with combination therapies. Although little prospective clinical research has been done around end-of-life care, one study reported that for platinum-resistant ovarian cancer, palliative intervention would potentially reduce costs and increase quality adjusted life years compared with usual care (based on improvement in quality of life [QOL]). Overall, cost comparisons of individual chemotherapy regimens are highly dependent on market prices of novel therapeutic agents.
Wang, Tao; Tziviskou, Effie; Chu, Maggie; Bargman, Joanne; Jassal, Vanita; Vas, Stephen; Oreopoulos, Dimitrios G
2003-01-01
Recently it has been suggested that the survival of dialysis patients may differ among different races. Both registry data and data from Asian countries indicate that Asians on peritoneal dialysis may survive longer than their Caucasian counterparts. In the present study, we performed a detailed analysis of survival differences between oriental Asians and Caucasians on peritoneal dialysis in our multiethnic, multicultural program. Retrospectively we analyzed the survival data for patients who started peritoneal dialysis after January 1, 1996 and before December 31, 1999, in our hospital. They were followed for at least two years. Excluded from the present analysis were those who survived for less than three months on peritoneal dialysis. The patient demographic characteristics, comorbidities, and residual renal function at the start of dialysis were collected. Indices for adequacy of dialysis were collected 1-3 months after the initiation of dialysis. Actuarial survival rates were determined by the Kaplan-Meier method. The Cox proportional hazards model was used to identify risk factors for higher mortality. There were 87 Caucasians and 29 Oriental Asian peritoneal dialysis patients. No differences were found in age, gender, primary renal disease, and residual renal function between the two groups. The Caucasians had significantly higher body surface area and urea volume and higher incidence of cardiovascular diseases. Even with a slightly higher dialysis dose, the peritoneal creatinine clearance was significantly lower among the Caucasians than among Asians. There was no difference in the peritoneal D/P value between the two groups. However, compared to the Caucasians, the 24hr peritoneal fluid removal and total fluid removal volumes were significantly lower in the Asian patients. The one, two, three and four year survival rates were 95.8%, 91%, 86% and 80% for Asians and 91.3%, 78.1%, 64.7% and 54.1% for Caucasians. Significant predictors of higher mortality were the presence of cardiovascular disease (42% increase in risk), Caucasian race (39% increase in risk) and older age (37% increase in risk for age older than 65). Our study confirms that oriental Asian peritoneal dialysis patients survive much longer than their Caucasian counterparts, partly because Asian patients had less cardiovascular disease when they began peritoneal dialysis. Due to their smaller body size, the Asians tended to have higher peritoneal small solute clearances despite their smaller dialysis doses, indicating that, to achieve the same solute clearance targets, Asians need a smaller dialysis dose compared to Caucasians.
Chemo Before Surgery May Help Stomach Cancer
Chemotherapy given before surgery for cancer of the lower esophagus and stomach increased the number of patients surviving for five years compared to surgery alone, according to findings presented at the 2007 ASCO meeting in Chicago.
Hepatitis C virus recurrence after liver transplantation: a 10-year evaluation.
Gitto, Stefano; Belli, Luca Saverio; Vukotic, Ranka; Lorenzini, Stefania; Airoldi, Aldo; Cicero, Arrigo Francesco Giuseppe; Vangeli, Marcello; Brodosi, Lucia; Panno, Arianna Martello; Di Donato, Roberto; Cescon, Matteo; Grazi, Gian Luca; De Carlis, Luciano; Pinna, Antonio Daniele; Bernardi, Mauro; Andreone, Pietro
2015-04-07
To evaluate the predictors of 10-year survival of patients with hepatitis C recurrence. Data from 358 patients transplanted between 1989 and 2010 in two Italian transplant centers and with evidence of hepatitis C recurrence were analyzed. The χ2 test, Fisher's exact test, and the Kruskal-Wallis test were used for categorical and continuous variables, respectively. Survival analysis was performed at 10 years after transplant using the Kaplan-Meier method, and a log-rank test was used to compare groups. A P value less than 0.05 was considered significant for all tests. Multivariate analysis of the predictive role of different variables on 10-year survival was performed by stepwise Cox regression. The ten-year survival of the entire population was 61.2%. Five groups of patients were identified according to the virological response or lack of a response to antiviral treatment and, among those who were not treated, according to the clinical status (mild hepatitis C recurrence, "too sick to be treated" and patients with comorbidities contraindicating the treatment). While the 10-year survival of treated and untreated patients was not different (59.1% vs 64.7%, P = 0.192), patients with a sustained virological response had a higher 10-year survival rate than both the "non-responders" (84.7% vs 39.8%, P < 0.0001) and the "too sick to be treated" group (84.7% vs 0%, P < 0.0001). Sustained virological responders had a survival rate comparable to that of untreated patients with mild recurrence (84.7% vs 89.3%). A sustained virological response and young donor age were independent predictors of 10-year survival. Sustained virological response significantly increased long-term survival. Pending the global availability of interferon-free regimens, antiviral treatment might be questionable in selected subjects with mild hepatitis C recurrence.
Effect of oxygen on survival of faecal pollution indicators in drinking water.
Roslev, P; Bjergbaek, L A; Hesselsoe, M
2004-01-01
The aim of this study was to determine the effect of oxygen on the survival of faecal pollution indicators including Escherichia coli in nondisinfected drinking water. Aerobic and anaerobic drinking water microcosms were inoculated with E. coli ATCC 25922 or raw sewage. Survival of E. coli was monitored by membrane filtration combined with cultivation on standard media, and by in situ hybridization with 16S rRNA-targeted fluorescent oligonucleotide probes. Anaerobic conditions significantly increased the survival of E. coli in drinking water compared with aerobic conditions. Escherichia coli ATCC 25922 showed a biphasic decrease in survival under aerobic conditions with an initial first-order decay rate of -0.11 day(-1) followed by a more rapid rate of -0.35 day(-1). In contrast, the first-order decay rate under anaerobic conditions was only -0.02 day(-1). After 35 days, <0.01% of the initial E. coli ATCC 25922 population remained detectable in aerobic microcosms compared with 48% in anaerobic microcosms. A poor survival was observed under aerobic conditions regardless of whether E. coli ATCC 25922 or sewage-derived E. coli was examined, and regardless of the detection method used (CFU or fluorescent in situ hybridization). Aerobic conditions in drinking water also appeared to decrease the survival of faecal enterococci, somatic coliphages and coliforms other than E. coli. The results indicate that oxygen is a major regulator of the survival of E. coli in nondisinfected drinking water. The results also suggest that faecal pollution indicators other than E. coli may persist longer in drinking water under anaerobic conditions. The effect of oxygen should be considered when evaluating the survival potential of enteric pathogens in oligotrophic environments.
Effect of the lung allocation score on lung transplantation in the United States.
Egan, Thomas M; Edwards, Leah B
2016-04-01
On May 4, 2005, the system for allocation of deceased donor lungs for transplant in the United States changed from allocation based on waiting time to allocation based on the lung allocation score (LAS). We sought to determine the effect of the LAS on lung transplantation in the United States. Organ Procurement and Transplantation Network data on listed and transplanted patients were analyzed for 5 calendar years before implementation of the LAS (2000-2004), and compared with data from 6 calendar years after implementation (2006-2011). Counts were compared between eras using the Wilcoxon rank sum test. The rates of transplant increase within each era were compared using an F-test. Survival rates computed using the Kaplan-Meier method were compared using the log-rank test. After introduction of the LAS, waitlist deaths decreased significantly, from 500/year to 300/year; the number of lung transplants increased, with double the annual increase in rate of lung transplants, despite no increase in donors; the distribution of recipient diagnoses changed dramatically, with significantly more patients with fibrotic lung disease receiving transplants; age of recipients increased significantly; and 1-year survival had a small but significant increase. Allocating lungs for transplant based on urgency and benefit instead of waiting time was associated with fewer waitlist deaths, more transplants performed, and a change in distribution of recipient diagnoses to patients more likely to die on the waiting list. Copyright © 2016 International Society for Heart and Lung Transplantation. All rights reserved.
Freireich, E J; Lichtiger, B; Mattiuzzi, G; Martinez, F; Reddy, V; Kyle Wathen, J
2013-01-01
A prospective, randomized double-blind study comparing the effects of irradiated and unirradiated white blood cells was conducted in 108 acute leukemia patients with life-threatening infections refractory to antibiotics. The study demonstrated no significant improvement in 30-day survival or overall survival. Transfusion of unirradiated white cells did not compromise the patient's opportunity to undergo allogeneic stem cell transplant, nor the success rate or overall survival after allogeneic transplant. The important positive finding in this study was that the unirradiated white cells produced a significantly higher increment in circulating granulocytes, and in a higher proportion of patients the granulocyte count exceeded 1000 per microliter, approaching normal concentrations. The increase in the number and the improved survival of the unirradiated granulocytes suggest that this procedure might potentially be a method to improve the utility of granulocyte transfusions and merits further investigation. The study demonstrated non-inferiority for unirradiated white cells. There were no harmful effects such as graft-versus-host disease, indicating that such studies would be safe to conduct in the future. PMID:23072780
Quinn, Casey; Ma, Qiufei; Kudlac, Amber; Palmer, Stephen; Barber, Beth; Zhao, Zhongyun
2016-04-01
Few randomized controlled trials have compared new treatments for metastatic melanoma. We sought to examine the relative treatment effect of talimogene laherparepvec compared with ipilimumab and vemurafenib. A systematic literature review of treatments for metastatic melanoma was undertaken but a valid network of evidence could not be established because of a lack of comparative data or studies with sufficient common comparators. A conventional adjusted indirect treatment comparison via network meta-analysis was, therefore, not feasible. Instead, a meta-analysis of absolute efficacy was undertaken, adjusting overall survival (OS) data for differences in prognostic factors between studies using a published algorithm. Four trials were included in the final indirect treatment comparison: two of ipilimumab, one of vemurafenib, and one of talimogene laherparepvec. Median OS for ipilimumab and vemurafenib increased significantly when adjustment was applied, demonstrating that variation in disease and patient characteristics was biasing OS estimates; adjusting for this made the survival data more comparable. For both ipilimumab and vemurafenib, the adjustments improved Kaplan-Meier OS curves; the observed talimogene laherparepvec OS curve remained above the adjusted OS curves for ipilimumab and vemurafenib, showing that long-term survival could differ from the observed medians. Even with limited data, talimogene laherparepvec, ipilimumab, and vemurafenib could be compared following adjustments, thereby providing a more reliable understanding of the relative effect of treatment on survival in a more comparable patient population. The results of this analysis suggest that OS with talimogene laherparepvec is at least as good as with ipilimumab and vemurafenib and improvement was more pronounced in patients with no bone, brain, lung or other visceral metastases. Amgen Inc.
Johansson, Inger; Andersson, Rune; Friman, Vanda; Selimovic, Nedim; Hanzen, Lars; Nasic, Salmir; Nyström, Ulla; Sigurdardottir, Vilborg
2015-12-24
Cytomegalovirus (CMV) is associated with an increased risk of cardiac allograft vasculopathy (CAV), the major limiting factor for long-term survival after heart transplantation (HTx). The purpose of this study was to evaluate the impact of CMV infection during long-term follow-up after HTx. A retrospective, single-centre study analyzed 226 HTx recipients (mean age 45 ± 13 years, 78 % men) who underwent transplantation between January 1988 and December 2000. The incidence and risk factors for CMV infection during the first year after transplantation were studied. Risk factors for CAV were included in an analysis of CAV-free survival within 10 years post-transplant. The effect of CMV infection on the grade of CAV was analyzed. Survival to 10 years post-transplant was higher in patients with no CMV infection (69 %) compared with patients with CMV disease (55 %; p = 0.018) or asymptomatic CMV infection (54 %; p = 0.053). CAV-free survival time was higher in patients with no CMV infection (6.7 years; 95 % CI, 6.0-7.4) compared with CMV disease (4.2 years; CI, 3.2-5.2; p < 0.001) or asymptomatic CMV infection (5.4 years; CI, 4.3-6.4; p = 0.013). In univariate analysis, recipient age, donor age, coronary artery disease (CAD), asymptomatic CMV infection and CMV disease were significantly associated with CAV-free survival. In multivariate regression analysis, CMV disease, asymptomatic CMV infection, CAD and donor age remained independent predictors of CAV-free survival at 10 years post-transplant. CAV-free survival was significantly reduced in patients with CMV disease and asymptomatic CMV infection compared to patients without CMV infection. These findings highlight the importance of close monitoring of CMV viral load and appropriate therapeutic strategies for preventing asymptomatic CMV infection.
Survival of patients with colon and rectal cancer in central and northern Denmark, 1998–2009
Ostenfeld, Eva B; Erichsen, Rune; Iversen, Lene H; Gandrup, Per; Nørgaard, Mette; Jacobsen, Jacob
2011-01-01
Objective The prognosis for colon and rectal cancer has improved in Denmark over the past decades but is still poor compared with that in our neighboring countries. We conducted this population-based study to monitor recent trends in colon and rectal cancer survival in the central and northern regions of Denmark. Material and methods Using the Danish National Registry of Patients, we identified 9412 patients with an incident diagnosis of colon cancer and 5685 patients diagnosed with rectal cancer between 1998 and 2009. We determined survival, and used Cox proportional hazard regression analysis to compare mortality over time, adjusting for age and gender. Among surgically treated patients, we computed 30-day mortality and corresponding mortality rate ratios (MRRs). Results The annual numbers of colon and rectal cancer cases increased from 1998 through 2009. For colon cancer, 1-year survival improved from 65% to 70%, and 5-year survival improved from 37% to 43%. For rectal cancer, 1-year survival improved from 73% to 78%, and 5-year survival improved from 39% to 47%. Men aged 80+ showed the most pronounced improvements. The 1- and 5-year adjusted MRRs decreased: for colon cancer 0.83 (95% confidence interval [CI]: 0.76–0.92) and 0.84 (95% CI: 0.78–0.90) respectively; for rectal cancer 0.79 (95% CI: 0.68–0.91) and 0.81 (95% CI: 0.73–0.89) respectively. The 30-day postoperative mortality after resection also declined over the study period. Compared with 1998–2000, the 30-day MRRs in 2007–2009 were 0.68 (95% CI: 0.53–0.87) for colon cancer and 0.59 (95% CI: 0.37–0.96) for rectal cancer. Conclusion Survival after colon and rectal cancer improved in central and northern Denmark during the 1998–2009 period, and 30-day postoperative mortality declined. PMID:21814467
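A minimal sketch of how adjusted mortality rate ratios of this kind are typically obtained from a Cox model is shown below; the data are synthetic and the variable names hypothetical, not the Danish registry data.

```python
# Sketch: Cox proportional hazards model giving a mortality rate ratio for a later
# diagnosis period versus a reference period, adjusted for age and gender (synthetic data).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "time_years": rng.exponential(scale=3, size=n),   # follow-up time
    "died": rng.binomial(1, 0.6, size=n),              # event indicator
    "age": rng.normal(70, 10, size=n),
    "male": rng.binomial(1, 0.55, size=n),
    "period_2007_2009": rng.binomial(1, 0.5, size=n),  # vs. an earlier reference period
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_years", event_col="died")
# exp(coef) for period_2007_2009 is the adjusted mortality rate ratio (hazard ratio)
cph.print_summary()
```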
Gadgil, Anita; Roy, Nobhojit; Sankaranarayanan, Rengaswamy; Muwonge, Richard; Sauvaget, Catherine
2012-01-01
Breast cancer is the second most common cancer in women in India and the disease burden is increasing annually. The lack of awareness initiatives, structured screening, and affordable treatment facilities continue to result in poor survival. We present a breast cancer survival scenario in an urban population in India, where standardised care is distributed equitably and free of charge through an employees' healthcare scheme. We studied 99 patients who were treated at our hospital during the period 2005 to 2010; the follow-up rate was 95.95%. Patients received evidence-based standardised care in line with the tertiary cancer centre in Mumbai. One-, three- and five-year survival rates were calculated using the Kaplan-Meier method. Socio-demographic, reproductive and tumor factors, relevant to survival, were analysed. Mortality hazard ratios (HR) were calculated using the Cox proportional hazards method. Survival in this series was compared to that in registries across India and discrepancies were discussed. Patients' mean age was 56 years, mean tumor size was 3.2 cm, 85% of the tumors belonged to T1 and T2 stages, and 45% of the patients belonged to the composite stages I and IIA. Overall 5-year survival was 74.9%. Patients who presented with large-sized tumors (HR 3.06; 95% CI 0.4-9.0), higher composite stage (HR 1.91; 0.55-6.58) and had undergone mastectomy (HR 2.94; 0.63-13.62) had a higher risk of mortality than women who had higher levels of education (HR 0.25; 0.05-1.16), although none of these results reached statistical significance. We observed 25% better survival compared to other Indian populations. Our results are comparable to those from the European Union and North America, owing to early presentation, equitable access to standardised free healthcare and complete follow-up ensured under the scheme. This emphasises that equitable and affordable delivery of standardised healthcare can translate into early presentation and better survival in India.
Rajeshkumar, N V; Yabuuchi, Shinichi; Pai, Shweta G; Tong, Zeen; Hou, Shihe; Bateman, Scott; Pierce, Daniel W; Heise, Carla; Von Hoff, Daniel D; Maitra, Anirban; Hidalgo, Manuel
2016-08-09
The albumin-bound paclitaxel (nab-paclitaxel, nab-PTX) plus gemcitabine (GEM) combination has demonstrated efficient antitumour activity and a statistically significant improvement in overall survival of patients with metastatic pancreatic ductal adenocarcinoma (PDAC) compared with GEM monotherapy. This regimen is currently approved as a standard of care treatment option for patients with metastatic PDAC. It is unclear whether cremophor-based PTX combined with GEM provides a similar level of therapeutic efficacy in PDAC. We comprehensively explored the antitumour efficacy, effect on metastatic dissemination, tumour stroma and survival advantage following GEM, PTX and nab-PTX as monotherapy or in combination with GEM in a locally advanced and a highly metastatic orthotopic model of human PDAC. Nab-PTX treatment resulted in a significantly higher paclitaxel tumour-to-plasma ratio (1.98-fold), robust stromal depletion, antitumour efficacy (3.79-fold) and survival benefit compared with PTX treatment. PTX plus GEM treatment showed no survival gain over GEM monotherapy. However, nab-PTX in combination with GEM decreased primary tumour burden, metastatic dissemination and significantly increased median survival of animals compared with either agent alone. These therapeutic effects were accompanied by depletion of dense fibrotic tumour stroma and decreased proliferation of carcinoma cells. Notably, nab-PTX monotherapy was equivalent to nab-PTX plus GEM in providing survival advantage to mice in a highly aggressive metastatic PDAC model, indicating that nab-PTX could potentially stop the progression of late-stage pancreatic cancer. Our data confirmed that the therapeutic efficacies of PTX and nab-PTX vary widely, and the contention that these agents elicit a similar antitumour response was not supported. The addition of PTX to GEM showed no survival advantage, suggesting that a clinical combination of PTX and GEM is unlikely to provide a significant survival advantage over GEM monotherapy and may not be a viable alternative to the current standard-of-care nab-PTX plus GEM regimen for the treatment of PDAC patients.
Survival after initial diagnosis of Alzheimer disease.
Larson, Eric B; Shadlen, Marie-Florence; Wang, Li; McCormick, Wayne C; Bowen, James D; Teri, Linda; Kukull, Walter A
2004-04-06
Alzheimer disease is an increasingly common condition in older people. Knowledge of life expectancy after the diagnosis of Alzheimer disease and of associations of patient characteristics with survival may help planning for future care. To investigate the course of Alzheimer disease after initial diagnosis and examine associations hypothesized to correlate with survival among community-dwelling patients with Alzheimer disease. Prospective observational study. An Alzheimer disease patient registry from a base population of 23 000 persons age 60 years and older in the Group Health Cooperative, Seattle, Washington. 521 newly recognized persons with Alzheimer disease enrolled from 1987 to 1996 in an Alzheimer disease patient registry. Baseline measurements included patient demographic features, Mini-Mental State Examination score, Blessed Dementia Rating Scale score, duration since reported onset of symptoms, associated symptoms, comorbid conditions, and selected signs. Survival was the outcome of interest. The median survival from initial diagnosis was 4.2 years for men and 5.7 years for women with Alzheimer disease. Men had poorer survival across all age groups compared with females. Survival was decreased in all age groups compared with the life expectancy of the U.S. population. Predictors of mortality based on proportional hazards models included a baseline Mini-Mental State Examination score of 17 or less, baseline Blessed Dementia Rating Scale score of 5.0 or greater, presence of frontal lobe release signs, presence of extrapyramidal signs, gait disturbance, history of falls, congestive heart failure, ischemic heart disease, and diabetes at baseline. The base population, although typical of the surrounding Seattle community, may not be representative of other, more diverse populations. In this sample of community-dwelling elderly persons who received a diagnosis of Alzheimer disease, survival duration was shorter than predicted on the basis of U.S. population data, especially for persons with onset at relatively younger ages. Features significantly associated with reduced survival at diagnosis were increased severity of cognitive impairment, decreased functional level, history of falls, physical examination findings of frontal release signs, and abnormal gait. The variables most strongly associated with survival were measures of disease severity at the time of diagnosis. These results should be useful to patients and families experiencing Alzheimer disease, other caregivers, clinicians, and policymakers when planning for future care needs.
Incidence, prevalence, and survival of chronic pancreatitis: a population-based study.
Yadav, Dhiraj; Timmons, Lawrence; Benson, Joanne T; Dierkhising, Ross A; Chari, Suresh T
2011-12-01
Population-based data on chronic pancreatitis (CP) in the United States are scarce. We determined incidence, prevalence, and survival of CP in Olmsted County, MN. Using Mayo Clinic Rochester's Medical Diagnostic Index followed by a detailed chart review, we identified 106 incident CP cases from 1977 to 2006 (89 clinical cases, 17 diagnosed only at autopsy); CP was defined by previously published Mayo Clinic criteria. We calculated age- and sex-adjusted incidence (for each decade) and prevalence rate (1 January 2006) per 100,000 population (adjusted to 2000 US White population). We compared the observed survival rate for patients with expected survival for age- and sex-matched Minnesota White population. Median age at diagnosis of CP was 58 years, 56% were male, and 51% had alcoholic CP. The overall (clinical cases or diagnosed only at autopsy) age- and sex-adjusted incidence was 4.05/100,000 person-years (95% confidence interval (CI) 3.27-4.83). The incidence rate for clinical cases increased significantly from 2.94/100,000 during 1977-1986 to 4.35/100,000 person-years during 1997-2006 (P<0.05) because of an increase in the incidence of alcoholic CP. There were 51 prevalent CP cases on 1 January 2006 (57% male, 53% alcoholic). The age- and sex-adjusted prevalence rate per 100,000 population was 41.76 (95% CI 30.21-53.32). At last follow-up, 50 patients were alive. Survival among CP patients was significantly lower than age- and sex-specific expected survival in Minnesota White population (P<0.001). Incidence and prevalence of CP are low, and ∼50% are alcohol related. The incidence of CP cases diagnosed during life is increasing. Survival of CP patients is lower than in the Minnesota White population.
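Direct age and sex standardization of the kind used in this study reduces to a weighted average of stratum-specific rates, with the weights taken from a standard population (here, the 2000 US White population). A small sketch with hypothetical rates and weights:

```python
# Sketch of direct standardization (hypothetical numbers, not the Olmsted County data).
age_specific_rates = {"50-59": 3.1, "60-69": 5.0, "70-79": 6.2, "80+": 4.4}      # per 100,000 person-years
standard_weights   = {"50-59": 0.35, "60-69": 0.30, "70-79": 0.22, "80+": 0.13}  # proportions of the standard population, summing to 1

adjusted_rate = sum(age_specific_rates[g] * standard_weights[g] for g in age_specific_rates)
print(f"Age-adjusted incidence: {adjusted_rate:.2f} per 100,000 person-years")
```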
Pérez-Cuevas, Ricardo; Doubova, Svetlana V; Zapata-Tarres, Marta; Flores-Hernández, Sergio; Frazier, Lindsay; Rodríguez-Galindo, Carlos; Cortes-Gallo, Gabriel; Chertorivski-Woldenberg, Salomon; Muñoz-Hernández, Onofre
2013-02-01
In 2006, the Mexican government launched the Fund for Protection Against Catastrophic Expenditures (FPGC) to financially support healthcare for high-cost illnesses. This study aimed to determine whether the FPGC improved coverage for cancer care and to measure survival of FPGC-affiliated children with cancer. A retrospective cohort study (2006-2009) was conducted in 47 public hospitals. Information on children and adolescents with cancer was analyzed. Coverage was estimated from the expected number of incident cases and the number registered with the FPGC. Survival was analyzed using Kaplan-Meier survival curves and Cox proportional hazards regression modeling. The study included 3,821 patients. From 2006 to 2009, coverage of new cancer cases increased from 3.3% to 55.3%. Principal diagnoses were acute lymphoblastic leukemia (ALL, 46.4%), central nervous system (CNS) tumors (8.2%), and acute myeloid leukemia (AML, 7.4%). The survival rates at 36 months were ALL (50%), AML (30.5%), Hodgkin lymphoma (74.5%), Non-Hodgkin lymphoma (40.1%), CNS tumors (32.8%), renal tumors (58.4%), bone tumors (33.4%), retinoblastoma (59.2%), and other solid tumors (52.6%). The 3-year overall survival rates varied among the regions; children from the east and south-southeast regions had higher risks of death from disease (hazard ratios 3.0; 95% CI: 2.3-3.9 and 2.4; 95% CI: 2.0-2.8) when compared with those from the central region. The FPGC has increased coverage of cancer cases. Survival rates were different throughout the country. It is necessary to evaluate the effectiveness of this policy to increase access and identify opportunities to reduce the differences in survival. Copyright © 2012 Wiley Periodicals, Inc.
Newborn survival in Malawi: a decade of change and future implications.
Zimba, Evelyn; Kinney, Mary V; Kachale, Fannie; Waltensperger, Karen Z; Blencowe, Hannah; Colbourn, Tim; George, Joby; Mwansambo, Charles; Joshua, Martias; Chanza, Harriet; Nyasulu, Dorothy; Mlava, Grace; Gamache, Nathalie; Kazembe, Abigail; Lawn, Joy E
2012-07-01
Malawi is one of two low-income sub-Saharan African countries on track to meet Millennium Development Goal 4 (MDG 4) for child survival despite high fertility and HIV and low health worker density. With neonatal deaths becoming an increasing proportion of under-five deaths, addressing newborn survival is critical for achieving MDG 4. We examine change for newborn survival in the decade 2000-10, analysing mortality and coverage indicators whilst considering other contextual factors. We assess national and donor funding, as well as policy and programme change for newborn survival using standard analyses and tools being applied as part of a multi-country analysis. Compared with the 1990s, progress towards MDGs 4 and 5 accelerated considerably from 2000 to 2010. Malawi's neonatal mortality rate (NMR) declined more slowly than mortality among children aged 1-59 months and maternal mortality (the NMR fell by 3.5% annually). Yet the NMR declined at a greater pace than the regional and global averages. A significant increase in facility births and other health system changes, including increased human resources, likely contributed to this decline. High-level attention to maternal health and associated comprehensive policy change has provided a platform for a small group of technical and programme experts to link in high-impact interventions for newborn survival. The initial entry point for newborn care in Malawi was mainly through facility initiatives, such as Kangaroo Mother Care. This transitioned to an integrated and comprehensive approach at community and facility level through the Community-Based Maternal and Newborn Care package, now being implemented in 17 of 28 districts. Addressing quality gaps, especially for care at birth in facilities, and including newborn interventions in child health programmes, will be critical to the future agenda of newborn survival in Malawi.
Gomes, Simone A; Paula, Adriano R; Ribeiro, Anderson; Moraes, Catia O P; Santos, Jonathan W A B; Silva, Carlos P; Samuels, Richard I
2015-12-30
Entomopathogenic fungi are potential candidates for use in integrated vector management and many isolates are compatible with synthetic and natural insecticides. Neem oil was tested separately and in combination with the entomopathogenic fungus Metarhizium anisopliae against larvae of the dengue vector Aedes aegypti. Our aim was to increase the effectiveness of the fungus for the control of larval mosquito populations. Commercially available neem oil was used at concentrations ranging from 0.0001 to 1%. Larval survival rates were monitored over a 7-day period following exposure to neem. The virulence of the fungus M. anisopliae was confirmed using five conidial concentrations (1 × 10⁵ to 1 × 10⁹ conidia mL⁻¹) and survival was monitored over 7 days. Two concentrations of fungal conidia were then tested together with neem (0.001%). Survival curve comparisons were carried out using the log-rank test and end-point survival rates were compared using one-way ANOVA. Neem at 1% was toxic to A. aegypti larvae, reducing survival to 18% with an S50 of 2 days. Neem had no effect on conidial germination or fungal vegetative growth in vitro. Larval survival rates were reduced to 24% (S50 = 3 days) when using 1 × 10⁹ conidia mL⁻¹. Using 1 × 10⁸ conidia mL⁻¹, 30% survival (S50 = 3 days) was observed. We tested a "sub-lethal" neem concentration (0.001%) together with these concentrations of conidia. For combinations of neem + fungus, the survival rates were significantly lower than the survival rates seen for fungus alone or for neem alone. Using a combination of 1 × 10⁷ conidia mL⁻¹ + neem (0.001%), the survival rates were 36%, whereas exposure to the fungus alone resulted in 74% survival and exposure to neem alone resulted in 78% survival. When using 1 × 10⁸ conidia mL⁻¹, the survival curves were modified, with a combination of the fungus + neem resulting in 12% survival, whilst the fungus alone at this concentration also significantly reduced survival rates (28%). The use of adjuvants is an important strategy for maintaining/increasing fungal virulence and/or shelf-life. The addition of neem to conidial suspensions improved virulence, significantly reducing larval survival times and percentages.
Malignant melanoma in 63 dogs (2001-2011): the effect of carboplatin chemotherapy on survival.
Brockley, L K; Cooper, M A; Bennett, P F
2013-01-01
The aim of the study was to compare the effect of carboplatin chemotherapy on the survival of canine patients diagnosed with malignant melanoma after loco-regional control or as a sole therapy. A retrospective study of 63 dogs with oral, digital or cutaneous malignant melanoma treated with surgery and/or chemotherapy was undertaken. Dogs were grouped based on the anatomical site of melanoma development. For oral melanoma, dogs were subclassified into two groups: loco-regional control and gross disease. All patients in the digital and cutaneous groups had achieved loco-regional control with surgery. Comparisons between survival data for each group at each anatomical site were then made. Within the loco-regional control groups, survival time was compared between those treated with and without chemotherapy after surgery. For the oral melanoma patients with gross disease, survival was compared between those treated with chemotherapy and palliative therapy. The toxicity of carboplatin chemotherapy was evaluated overall. The overall median survival times for patients with oral, digital and cutaneous melanoma were 389 days, 1,350 days and not reached (with a median follow-up of 776 days), respectively. Median survival time was defined as "not reached" when fewer than 50% of the subjects had died of the disease by the end of the follow-up period or by the time they were lost to follow-up. The addition of chemotherapy to surgery did not confer a survival benefit in the loco-regional control setting when assessing survival for each anatomical site. For oral melanoma patients with gross disease there was no difference between survival of patients treated with chemotherapy and palliative intent therapy. There was, however, an improvement in survival in the three dogs that responded to chemotherapy (978 days; p=0.039) compared to the eight non-responders (147 days). On univariate and multivariate analysis, anatomic location was the only variable that was significantly related to survival (p=0.0002 and p=0.009, respectively). The addition of chemotherapy to local treatments for canine melanoma at oral, digital and cutaneous sites did not lead to a significant increase in survival times. Carboplatin was well tolerated and appeared to have activity against oral melanoma in a subset of patients with gross disease that responded to treatment. Carboplatin with piroxicam could be considered for patients with gross disease when more traditional therapies, such as surgery or radiation therapy, are declined or are not available. In the loco-regional control setting, prospective randomised blinded studies with matched control groups are required to determine whether chemotherapy has a role in the treatment of these types of cancer.
Obesity does not affect survival outcomes in extremity soft tissue sarcoma.
Alamanda, Vignesh K; Moore, David C; Song, Yanna; Schwartz, Herbert S; Holt, Ginger E
2014-09-01
Obesity is a growing epidemic and has been associated with an increased frequency of complications after various surgical procedures. Studies also have shown adipose tissue to promote a microenvironment favorable for tumor growth. Additionally, the relationship between obesity and prognosis of soft tissue sarcomas has yet to be evaluated. We sought to assess (1) whether obesity affects survival outcomes (local recurrence, distant metastasis, and death attributable to disease) in patients with extremity soft tissue sarcomas; and (2) whether obesity affected wound healing and other surgical complications after treatment. A BMI of 30 kg/m² or greater was used to define obesity. Querying our prospective database between 2001 and 2008, we identified 397 patients for the study; 154 were obese and 243 were not obese. Mean followup was 4.5 years (SD, 3.1 years) in the obese group and 3.9 years (SD, 3.2 years) in the nonobese group; the group with a BMI of 30 kg/m² or greater had a higher proportion of patients with followups of at least 2 years compared with the group with a BMI less than 30 kg/m² (76% versus 62%). Outcomes, including local recurrence, distant metastasis, and overall survival, were analyzed after patients were stratified by BMI. Multivariable survival models were used to identify independent predictors of survival outcomes. The Wilcoxon rank sum test was used to compare continuous variables. Based on the accrual interval of 8 years, the additional followup of 5 years after data collection, and the median survival time for the patients with a BMI less than 30 kg/m² of 3 years, we were able to detect true median survival times in the patients with a BMI of 30 kg/m² of 2.2 years or less with 80% power and a type I error rate of 0.05. Patients who were obese had similar survival outcomes and wound complication rates when compared with their nonobese counterparts. Patients who were obese were more likely to have lower-grade tumors (31% versus 20%; p = 0.021) and additional comorbidities including diabetes mellitus (26% versus 7%; p < 0.001), hypertension (63% versus 38%; p < 0.001), and smoking (49% versus 37%; p = 0.027). Regression analysis confirmed that even after accounting for certain tumor characteristics and comorbidities, obesity did not serve as an independent risk factor in affecting survival outcomes. Although the prevalence of obesity continues to increase and lead to many negative health consequences, it does not appear to adversely affect survival, local recurrence, or wound complication rates for patients with extremity soft tissue sarcomas. Level III, therapeutic study. See the Instructions for Authors for a complete description of levels of evidence.
Marcos-Gragera, Rafael; Allemani, Claudia; Tereanu, Carmen; De Angelis, Roberta; Capocaccia, Riccardo; Maynadie, Marc; Luminari, Stefano; Ferretti, Stefano; Johannesen, Tom Børge; Sankila, Risto; Karjalainen-Lindsberg, Marja-Liisa; Simonetti, Arianna; Martos, Maria Carmen; Raphaël, Martine; Giraldo, Pilar; Sant, Milena
2011-01-01
Background The European Cancer Registry-based project on hematologic malignancies (HAEMACARE), set up to improve the availability and standardization of data on hematologic malignancies in Europe, used the European Cancer Registry-based project on survival and care of cancer patients (EUROCARE-4) database to produce a new grouping of hematologic neoplasms (defined by the International Classification of Diseases for Oncology, Third Edition and the 2001/2008 World Health Organization classifications) for epidemiological and public health purposes. We analyzed survival for lymphoid neoplasms in Europe by disease group, comparing survival between different European regions by age and sex. Design and Methods Incident neoplasms recorded between 1995 and 2002 in 48 population-based cancer registries in 20 countries participating in EUROCARE-4 were analyzed. The period approach was used to estimate 5-year relative survival rates for patients diagnosed in 2000–2002, who did not have 5 years of follow-up. Results The 5-year relative survival rate was 57% overall but varied markedly between the defined groups. Variation in survival within the groups was relatively limited across European regions and less than in previous years. Survival differences between men and women were small. The relative survival for patients with all lymphoid neoplasms decreased substantially after the age of 50. The proportion of 'not otherwise specified' diagnoses increased with advancing age. Conclusions This is the first study to analyze survival of patients with lymphoid neoplasms, divided into groups characterized by similar epidemiological and clinical characteristics, providing a benchmark for more detailed analyses. This Europe-wide study suggests that previously noted differences in survival between regions have tended to decrease. The survival of patients with all neoplasms decreased markedly with age, while the proportion of 'not otherwise specified' diagnoses increased with advancing age. Thus the quality of diagnostic work-up and care decreased with age, suggesting that older patients may not be receiving optimal treatment. PMID:21330324
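Relative survival, the measure estimated here, is in essence observed survival among the patients divided by the survival expected in an age- and sex-matched general population; the period approach only changes which person-time contributes to the estimate. A toy sketch with hypothetical inputs (not the EUROCARE-4 figures, though chosen to be of similar magnitude):

```python
# Sketch of the basic relative-survival ratio (hypothetical proportions).
observed_5yr = 0.45   # proportion of patients observed alive at 5 years
expected_5yr = 0.79   # expected 5-year survival of a matched general population (from life tables)

relative_survival = observed_5yr / expected_5yr
print(f"5-year relative survival: {relative_survival:.0%}")
```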
Rotics, Shay; Kaatz, Michael; Resheff, Yehezkel S; Turjeman, Sondra Feldman; Zurell, Damaris; Sapir, Nir; Eggers, Ute; Flack, Andrea; Fiedler, Wolfgang; Jeltsch, Florian; Wikelski, Martin; Nathan, Ran
2016-07-01
Migration poses an immense challenge, especially for juvenile birds coping with enduring and risky journeys shortly after fledging. Accordingly, juveniles exhibit considerably lower survival rates compared to adults, particularly during migration. Juvenile white storks (Ciconia ciconia), which are known to rely on adults during their first fall migration, presumably for navigational purposes, also display much lower annual survival than adults. Using detailed GPS and body acceleration data, we examined the patterns and potential causes of age-related differences in fall migration properties of white storks by comparing first-year juveniles and adults. We compared juvenile and adult parameters of movement, behaviour and energy expenditure (estimated from overall dynamic body acceleration) and placed this in the context of the juveniles' lower survival rate. Juveniles used flapping flight vs. soaring flight 23% more than adults and were estimated to expend 14% more energy during flight. Juveniles did not compensate for their higher flight costs by increased refuelling or resting during migration. When juveniles and adults migrated together in the same flock, the juvenile flew mostly behind the adult and was left behind when they separated. Juveniles showed greater improvement in flight efficiency throughout migration compared to adults, which appears crucial because juveniles exhibiting higher flight costs suffered increased mortality. Our findings demonstrate the conflict between the juveniles' inferior flight skills and their urge to keep up with mixed adult-juvenile flocks. We suggest that increased flight costs are an important proximate cause of juvenile mortality in white storks and likely in other soaring migrants and that natural selection is operating on juvenile variation in flight efficiency. © 2016 The Authors. Journal of Animal Ecology © 2016 British Ecological Society.
Raheel, Shafay; Shbeeb, Izzat; Crowson, Cynthia S; Matteson, Eric L
2017-08-01
To determine time trends in the incidence and survival of polymyalgia rheumatica (PMR) over a 15-year period in Olmsted County, Minnesota, and to examine trends in incidence of PMR in the population by comparing this time period to a previous incidence cohort from the same population base. All cases of incident PMR among Olmsted County, Minnesota residents in 2000-2014 were identified to extend the previous 1970-1999 cohort. Detailed review of all individual medical records was performed. Incidence rates were age- and sex-adjusted to the US white 2010 population. Survival rates were compared with the expected rates in the population of Minnesota. There were 377 incident cases of PMR during the 15-year study period. Of these, 64% were female and the mean age at incidence was 74.1 years. The overall age- and sex-adjusted annual incidence of PMR was 63.9 (95% confidence interval [95% CI] 57.4-70.4) per 100,000 population ages ≥50 years. Incidence rates increased with age in both sexes, but incidence fell after age 80 years. There was a slight increase in incidence of PMR in the recent time period compared to 1970-1999 (P = 0.063). Mortality among individuals with PMR was not significantly worse than that expected in the general population (standardized mortality ratio 0.70 [95% CI 0.57-0.85]). The incidence of PMR has increased slightly in the past 15 years compared to previous decades. Survivorship in patients with PMR is not worse than in the general population. © 2016, American College of Rheumatology.
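The standardized mortality ratio reported above is the ratio of observed deaths in the cohort to the deaths expected from population rates. A small sketch with hypothetical counts and an exact Poisson confidence interval (the counts are illustrative, not those of the Olmsted County cohort):

```python
# Sketch of a standardized mortality ratio (SMR) with an exact Poisson 95% CI.
from scipy.stats import chi2

observed = 70        # deaths observed in the cohort (hypothetical)
expected = 100.0     # deaths expected from age- and sex-specific population rates (hypothetical)

smr = observed / expected
lower = chi2.ppf(0.025, 2 * observed) / (2 * expected)          # exact lower bound
upper = chi2.ppf(0.975, 2 * (observed + 1)) / (2 * expected)    # exact upper bound
print(f"SMR = {smr:.2f} (95% CI {lower:.2f}-{upper:.2f})")
```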
Stack, A M; Saladino, R A; Siber, G R; Thompson, C; Marra, M N; Novitsky, T J; Fleisher, G R
1997-01-01
To compare a recombinant bactericidal/permeability-increasing protein variant and a recombinant endotoxin-neutralizing protein. Randomized, blinded, controlled study, using a rat model of sepsis. Animal research facility. Male Wistar rats. An inoculum of 1.5 × 10⁷ to 1.8 × 10⁸ Escherichia coli O18ac K1, implanted in the peritoneum, produced bacteremia in 95% of animals after 1 hr. One hour after E. coli challenge, animals received recombinant bactericidal/permeability-increasing protein variant, recombinant endotoxin-neutralizing protein, or saline intravenously, followed by ceftriaxone and gentamicin intramuscularly. Twenty-four (85.7%) of 28 animals receiving recombinant endotoxin-neutralizing protein (p < .001 vs. control) survived 7 days compared with nine (33.3%) of 27 recombinant bactericidal/permeability-increasing protein variant-treated (p < .001 vs. control) and two (6.5%) of 31 control animals. Both recombinant endotoxin-neutralizing protein and recombinant bactericidal/permeability-increasing protein variant improved survival. Recombinant endotoxin-neutralizing protein was superior to recombinant bactericidal/permeability-increasing protein variant in its protective effect at the doses tested. Our results suggest that both proteins may be useful in the treatment of human Gram-negative sepsis.
São Julião, Guilherme Pagin; Habr-Gama, Angelita; Vailati, Bruna Borba; Aguilar, Patricia Bailão; Sabbaga, Jorge; Araújo, Sérgio Eduardo Alonso; Mattacheo, Adrian; Alexandre, Flavia Andrea; Fernandez, Laura Melina; Gomes, Diogo Bugano; Gama-Rodrigues, Joaquim; Perez, Rodrigo Oliva
2018-01-01
Patients with cT3 rectal cancer are less likely to develop complete response to neoadjuvant chemoradiation (nCRT) and still face significant risk for systemic relapse. In this setting, radiation (RT) dose-escalation and consolidation chemotherapy in "extended" nCRT regimens have been suggested to improve primary tumor response and decrease the risks of systemic recurrences. For these reasons, we compared surgery-free and distant-metastases-free survival among cT3 patients undergoing standard or extended nCRT. Patients with distal and non-metastatic T3 rectal cancer managed by nCRT were retrospectively reviewed. Patients undergoing standard CRT (50.4 Gy and 2 cycles of 5FU-based chemotherapy) were compared to those undergoing extended CRT (54 Gy and 6 cycles of 5FU-based chemotherapy). Patients were assessed for tumor response at 8-10 weeks. Patients with complete clinical response (cCR) underwent an organ-preservation strategy (Watch & Wait). Patients were referred to salvage surgery in the event of local recurrence during follow-up. Cox regression was performed to identify independent features associated with improved surgery-free survival after cCR and distant-metastases-free survival. In total, 155 patients underwent standard CRT and 66 underwent extended CRT. Patients undergoing extended CRT were more likely to harbor larger initial tumor size (p = 0.04), baseline nodal metastases (cN+; p < 0.001) and higher tumor location (p = 0.02). Cox-regression analysis revealed that the type of nCRT regimen was not independently associated with distinct surgery-free survival after cCR or distant-metastases-free survival (p > 0.05). Dose-escalation and consolidation chemotherapy are insufficient to increase long-term surgery-free survival among cT3 rectal cancer patients and provide no advantage in distant-metastases-free survival. Copyright © 2017 Elsevier Ltd, BASO ~ The Association for Cancer Surgery, and the European Society of Surgical Oncology. All rights reserved.
Harshman, Lauren C.; Chen, Yu-Hui; Liu, Glenn; Carducci, Michael A.; Jarrard, David; Dreicer, Robert; Hahn, Noah; Garcia, Jorge A.; Hussain, Maha; Shevrin, Daniel; Eisenberger, Mario; Kohli, Manish; Plimack, Elizabeth R.; Cooney, Matthew; Vogelzang, Nicholas J.; Picus, Joel; Dipaola, Robert
2018-01-01
Purpose We evaluated the relationship between prostate-specific antigen (PSA) and overall survival in the context of a prospectively randomized clinical trial comparing androgen-deprivation therapy (ADT) plus docetaxel with ADT alone for initial metastatic hormone-sensitive prostate cancer. Methods We performed a landmark survival analysis at 7 months using the E3805 Chemohormonal Therapy Versus Androgen Ablation Randomized Trial for Extensive Disease in Prostate Cancer (CHAARTED) database (ClinicalTrials.gov identifier: NCT00309985). Inclusion required at least 7 months of follow-up and PSA levels at 7 months from ADT initiation. We used the prognostic classifiers identified in a previously reported trial (Southwest Oncology Group 9346) of PSA ≤ 0.2, > 0.2 to 4, and > 4 ng/mL. Results Seven hundred nineteen of 790 patients were eligible for this subanalysis; 358 were treated with ADT plus docetaxel, and 361 were treated with ADT alone. Median follow-up time was 23.1 months. On multivariable analysis, achieving a 7-month PSA ≤ 0.2 ng/mL was more likely with docetaxel, low-volume disease, prior local therapy, and lower baseline PSAs (all P ≤ .01). Across all patients, median overall survival was significantly longer if 7-month PSA reached ≤ 0.2 ng/mL compared with > 4 ng/mL (median survival, 60.4 v 22.2 months, respectively; P < .001). On multivariable analysis, 7-month PSA ≤ 0.2 and low volume disease were prognostic of longer overall survival (all P < 0.01). The addition of docetaxel increased the likelihood of achieving a PSA ≤ 0.2 ng/mL at 7 months (45.3% v 28.8% of patients on ADT alone). Patients on ADT alone who achieved a 7-month PSA ≤ 0.2 ng/mL had the best survival and were more likely to have low-volume disease (56.7%). Conclusion PSA ≤ 0.2 ng/mL at 7 months is prognostic for longer overall survival with ADT for metastatic hormone-sensitive prostate cancer irrespective of docetaxel administration. Adding docetaxel increased the likelihood of a lower PSA and improved survival. PMID:29261442
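A landmark analysis of this kind conditions on patients being alive and under follow-up at the landmark time, classifies them by a variable measured at that time (here, the 7-month PSA), and measures survival from the landmark onward. A minimal sketch with synthetic data and the lifelines package; the column names and numbers are illustrative, not those of the CHAARTED dataset.

```python
# Sketch of a landmark survival analysis at 7 months (synthetic data).
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({
    "followup_months": rng.exponential(40, n),
    "died": rng.binomial(1, 0.6, n),
    "psa_7mo": rng.choice(["<=0.2", "0.2-4", ">4"], size=n, p=[0.35, 0.40, 0.25]),
})

landmark = 7.0
# Keep only patients followed at least to the landmark, then restart the clock there
lm = df[df["followup_months"] >= landmark].copy()
lm["time_from_landmark"] = lm["followup_months"] - landmark

kmf = KaplanMeierFitter()
for group, sub in lm.groupby("psa_7mo"):
    kmf.fit(sub["time_from_landmark"], event_observed=sub["died"], label=f"PSA {group}")
    print(group, "median OS from landmark (months):", kmf.median_survival_time_)
```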
Systemic treatments for metastatic cutaneous melanoma.
Pasquali, Sandro; Hadjinicolaou, Andreas V; Chiarion Sileni, Vanna; Rossi, Carlo Riccardo; Mocellin, Simone
2018-02-06
The prognosis of people with metastatic cutaneous melanoma, a skin cancer, is generally poor. Recently, new classes of drugs (e.g. immune checkpoint inhibitors and small-molecule targeted drugs) have significantly improved patient prognosis, which has drastically changed the landscape of melanoma therapeutic management. This is an update of a Cochrane Review published in 2000. To assess the beneficial and harmful effects of systemic treatments for metastatic cutaneous melanoma. We searched the following databases up to October 2017: the Cochrane Skin Group Specialised Register, CENTRAL, MEDLINE, Embase and LILACS. We also searched five trials registers and the ASCO database in February 2017, and checked the reference lists of included studies for further references to relevant randomised controlled trials (RCTs). We considered RCTs of systemic therapies for people with unresectable lymph node metastasis and distant metastatic cutaneous melanoma compared to any other treatment. We checked the reference lists of selected articles to identify further references to relevant trials. Two review authors extracted data, and a third review author independently verified extracted data. We implemented a network meta-analysis approach to make indirect comparisons and rank treatments according to their effectiveness (as measured by the impact on survival) and harm (as measured by occurrence of high-grade toxicity). The same two review authors independently assessed the risk of bias of eligible studies according to Cochrane standards and assessed evidence quality based on the GRADE criteria. We included 122 RCTs (28,561 participants). Of these, 83 RCTs, encompassing 21 different comparisons, were included in meta-analyses. Included participants were men and women with a mean age of 57.5 years who were recruited from hospital settings. Twenty-nine studies included people whose cancer had spread to their brains. Interventions were categorised into five groups: conventional chemotherapy (including single agent and polychemotherapy), biochemotherapy (combining chemotherapy with cytokines such as interleukin-2 and interferon-alpha), immune checkpoint inhibitors (such as anti-CTLA4 and anti-PD1 monoclonal antibodies), small-molecule targeted drugs used for melanomas with specific gene changes (such as BRAF inhibitors and MEK inhibitors), and other agents (such as anti-angiogenic drugs). Most interventions were compared with chemotherapy. In many cases, trials were sponsored by pharmaceutical companies producing the tested drug: this was especially true for new classes of drugs, such as immune checkpoint inhibitors and small-molecule targeted drugs. When compared to single agent chemotherapy, the combination of multiple chemotherapeutic agents (polychemotherapy) did not translate into significantly better survival (overall survival: HR 0.99, 95% CI 0.85 to 1.16, 6 studies, 594 participants; high-quality evidence; progression-free survival: HR 1.07, 95% CI 0.91 to 1.25, 5 studies, 398 participants; high-quality evidence). Those who received combined treatment are probably burdened by higher toxicity rates (RR 1.97, 95% CI 1.44 to 2.71, 3 studies, 390 participants; moderate-quality evidence). 
(We defined toxicity as the occurrence of grade 3 (G3) or higher adverse events according to the World Health Organization scale.) Compared to chemotherapy, biochemotherapy (chemotherapy combined with both interferon-alpha and interleukin-2) improved progression-free survival (HR 0.90, 95% CI 0.83 to 0.99, 6 studies, 964 participants; high-quality evidence), but did not significantly improve overall survival (HR 0.94, 95% CI 0.84 to 1.06, 7 studies, 1317 participants; high-quality evidence). Biochemotherapy had higher toxicity rates (RR 1.35, 95% CI 1.14 to 1.61, 2 studies, 631 participants; high-quality evidence). With regard to immune checkpoint inhibitors, anti-CTLA4 monoclonal antibodies plus chemotherapy probably increased the chance of progression-free survival compared to chemotherapy alone (HR 0.76, 95% CI 0.63 to 0.92, 1 study, 502 participants; moderate-quality evidence), but may not significantly improve overall survival (HR 0.81, 95% CI 0.65 to 1.01, 2 studies, 1157 participants; low-quality evidence). Compared to chemotherapy alone, anti-CTLA4 monoclonal antibodies are likely to be associated with higher toxicity rates (RR 1.69, 95% CI 1.19 to 2.42, 2 studies, 1142 participants; moderate-quality evidence). Compared to chemotherapy, anti-PD1 monoclonal antibodies (immune checkpoint inhibitors) improved overall survival (HR 0.42, 95% CI 0.37 to 0.48, 1 study, 418 participants; high-quality evidence) and probably improved progression-free survival (HR 0.49, 95% CI 0.39 to 0.61, 2 studies, 957 participants; moderate-quality evidence). Anti-PD1 monoclonal antibodies may also result in less toxicity than chemotherapy (RR 0.55, 95% CI 0.31 to 0.97, 3 studies, 1360 participants; low-quality evidence). Anti-PD1 monoclonal antibodies performed better than anti-CTLA4 monoclonal antibodies in terms of overall survival (HR 0.63, 95% CI 0.60 to 0.66, 1 study, 764 participants; high-quality evidence) and progression-free survival (HR 0.54, 95% CI 0.50 to 0.60, 2 studies, 1465 participants; high-quality evidence). Anti-PD1 monoclonal antibodies may result in better toxicity outcomes than anti-CTLA4 monoclonal antibodies (RR 0.70, 95% CI 0.54 to 0.91, 2 studies, 1465 participants; low-quality evidence). Compared to anti-CTLA4 monoclonal antibodies alone, the combination of anti-CTLA4 plus anti-PD1 monoclonal antibodies was associated with better progression-free survival (HR 0.40, 95% CI 0.35 to 0.46, 2 studies, 738 participants; high-quality evidence). 
There may be no significant difference in toxicity outcomes (RR 1.57, 95% CI 0.85 to 2.92, 2 studies, 764 participants; low-quality evidence) (no data for overall survival were available). The class of small-molecule targeted drugs, BRAF inhibitors (which are active exclusively against BRAF-mutated melanoma), performed better than chemotherapy in terms of overall survival (HR 0.40, 95% CI 0.28 to 0.57, 2 studies, 925 participants; high-quality evidence) and progression-free survival (HR 0.27, 95% CI 0.21 to 0.34, 2 studies, 925 participants; high-quality evidence), and there may be no significant difference in toxicity (RR 1.27, 95% CI 0.48 to 3.33, 2 studies, 408 participants; low-quality evidence). Compared to chemotherapy, MEK inhibitors (which are active exclusively against BRAF-mutated melanoma) may not significantly improve overall survival (HR 0.85, 95% CI 0.58 to 1.25, 3 studies, 496 participants; low-quality evidence), but they probably lead to better progression-free survival (HR 0.58, 95% CI 0.42 to 0.80, 3 studies, 496 participants; moderate-quality evidence). However, MEK inhibitors probably have higher toxicity rates (RR 1.61, 95% CI 1.08 to 2.41, 1 study, 91 participants; moderate-quality evidence). Compared to BRAF inhibitors, the combination of BRAF plus MEK inhibitors was associated with better overall survival (HR 0.70, 95% CI 0.59 to 0.82, 4 studies, 1784 participants; high-quality evidence). BRAF plus MEK inhibitors was also probably better in terms of progression-free survival (HR 0.56, 95% CI 0.44 to 0.71, 4 studies, 1784 participants; moderate-quality evidence), and there appears likely to be no significant difference in toxicity (RR 1.01, 95% CI 0.85 to 1.20, 4 studies, 1774 participants; moderate-quality evidence). Compared to chemotherapy, the combination of chemotherapy plus anti-angiogenic drugs was probably associated with better overall survival (HR 0.60, 95% CI 0.45 to 0.81; moderate-quality evidence) and progression-free survival (HR 0.69, 95% CI 0.52 to 0.92; moderate-quality evidence). There may be no difference in terms of toxicity (RR 0.68, 95% CI 0.09 to 5.32; low-quality evidence). All results for this comparison were based on 324 participants from 2 studies. Network meta-analysis focused on chemotherapy as the common comparator and currently approved treatments for which high- to moderate-quality evidence of efficacy (as represented by treatment effect on progression-free survival) was available (based on the above results) for: biochemotherapy (with both interferon-alpha and interleukin-2); anti-CTLA4 monoclonal antibodies; anti-PD1 monoclonal antibodies; anti-CTLA4 plus anti-PD1 monoclonal antibodies; BRAF inhibitors; MEK inhibitors; and BRAF plus MEK inhibitors. 
Analysis (which included 19 RCTs and 7632 participants) generated 21 indirect comparisons. The best evidence (moderate-quality evidence) for progression-free survival was found for the following indirect comparisons: • both combinations of immune checkpoint inhibitors (HR 0.30, 95% CI 0.17 to 0.51) and small-molecule targeted drugs (HR 0.17, 95% CI 0.11 to 0.26) probably improved progression-free survival compared to chemotherapy; • both BRAF inhibitors (HR 0.40, 95% CI 0.23 to 0.68) and combinations of small-molecule targeted drugs (HR 0.22, 95% CI 0.12 to 0.39) were probably associated with better progression-free survival compared to anti-CTLA4 monoclonal antibodies; • biochemotherapy (HR 2.81, 95% CI 1.76 to 4.51) probably led to worse progression-free survival compared to BRAF inhibitors; • the combination of small-molecule targeted drugs probably improved progression-free survival (HR 0.38, 95% CI 0.21 to 0.68) compared to anti-PD1 monoclonal antibodies; • both biochemotherapy (HR 5.05, 95% CI 3.01 to 8.45) and MEK inhibitors (HR 3.16, 95% CI 1.77 to 5.65) were probably associated with worse progression-free survival compared to the combination of small-molecule targeted drugs; and • biochemotherapy was probably associated with worse progression-free survival (HR 2.81, 95% CI 1.54 to 5.11) compared to the combination of immune checkpoint inhibitors. The best evidence (moderate-quality evidence) for toxicity was found for the following indirect comparisons: • combination of immune checkpoint inhibitors (RR 3.49, 95% CI 2.12 to 5.77) probably increased toxicity compared to chemotherapy; • combination of immune checkpoint inhibitors probably increased toxicity (RR 2.50, 95% CI 1.20 to 5.20) compared to BRAF inhibitors; • the combination of immune checkpoint inhibitors probably increased toxicity (RR 3.83, 95% CI 2.59 to 5.68) compared to anti-PD1 monoclonal antibodies; and • biochemotherapy was probably associated with lower toxicity (RR 0.41, 95% CI 0.24 to 0.71) compared to the combination of immune checkpoint inhibitors. Network meta-analysis-based ranking suggested that the combination of BRAF plus MEK inhibitors is the most effective strategy in terms of progression-free survival, whereas anti-PD1 monoclonal antibodies are associated with the lowest toxicity. Overall, the risk of bias of the included trials can be considered as limited. When considering the 122 trials included in this review and the seven types of bias we assessed, we performed 854 evaluations, only seven of which (< 1%) assigned high risk to six trials. We found high-quality evidence that many treatments offer better efficacy than chemotherapy, especially recently implemented treatments, such as small-molecule targeted drugs, which are used to treat melanoma with specific gene mutations. Compared with chemotherapy, biochemotherapy (in this case, chemotherapy combined with both interferon-alpha and interleukin-2) and BRAF inhibitors improved progression-free survival; BRAF inhibitors (for BRAF-mutated melanoma) and anti-PD1 monoclonal antibodies improved overall survival. However, there was no difference between polychemotherapy and monochemotherapy in terms of achieving progression-free survival and overall survival. 
Biochemotherapy did not significantly improve overall survival and has higher toxicity rates compared with chemotherapy. There was some evidence that combined treatments worked better than single treatments: anti-PD1 monoclonal antibodies, alone or with anti-CTLA4, improved progression-free survival compared with anti-CTLA4 monoclonal antibodies alone. Anti-PD1 monoclonal antibodies performed better than anti-CTLA4 monoclonal antibodies in terms of overall survival, and a combination of BRAF plus MEK inhibitors was associated with better overall survival for BRAF-mutated melanoma, compared to BRAF inhibitors alone. The combination of BRAF plus MEK inhibitors (which can only be administered to people with BRAF-mutated melanoma) appeared to be the most effective treatment (based on results for progression-free survival), whereas anti-PD1 monoclonal antibodies appeared to be the least toxic, and most acceptable, treatment. Evidence quality was reduced due to imprecision, between-study heterogeneity, and substandard reporting of trials. Future research should ensure that those diminishing influences are addressed. Clinical areas of future investigation should include the longer-term effect of new therapeutic agents (i.e. immune checkpoint inhibitors and targeted therapies) on overall survival, as well as the combination of drugs used in melanoma treatment; research should also investigate the potential influence of biomarkers.
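The adjusted indirect comparisons that underlie a network meta-analysis of this kind can be illustrated with the basic Bucher calculation on the log hazard ratio scale: given trials of A versus a common comparator C and of B versus C, the indirect A-versus-B estimate is the difference of the log hazard ratios, with their variances added. The sketch below uses hypothetical hazard ratios, not the review's estimates.

```python
# Sketch of a Bucher-style adjusted indirect comparison on the log HR scale.
import math

def indirect_comparison(hr_ac, ci_ac, hr_bc, ci_bc, z=1.96):
    """Indirect HR (and 95% CI) for A vs B from A-vs-C and B-vs-C results."""
    se = lambda ci: (math.log(ci[1]) - math.log(ci[0])) / (2 * z)  # SE recovered from a 95% CI
    log_hr = math.log(hr_ac) - math.log(hr_bc)
    se_ab = math.sqrt(se(ci_ac) ** 2 + se(ci_bc) ** 2)
    return (math.exp(log_hr),
            math.exp(log_hr - z * se_ab),
            math.exp(log_hr + z * se_ab))

# Hypothetical inputs: treatment A vs chemotherapy HR 0.50 (0.40-0.62),
# treatment B vs chemotherapy HR 0.80 (0.65-0.98) -> indirect A vs B
print(indirect_comparison(0.50, (0.40, 0.62), 0.80, (0.65, 0.98)))
```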
Wanjugi, P; Fox, G A; Harwood, V J
2016-10-01
Nutrient levels, competition from autochthonous microorganisms, and protozoan predation may all influence survival of fecal microorganisms as they transition from the gastrointestinal tract to aquatic habitats. Although Escherichia coli is an important indicator of waterborne pathogens, the effects of environmental stressors on its survival in aquatic environments remain poorly understood. We manipulated organic nutrient, predation, and competition levels in outdoor microcosms containing natural river water, sediments, and microbial populations to determine their relative contribution to E. coli survival. The activities of predator (protozoa) and competitor (indigenous bacteria) populations were inhibited by adding cycloheximide or kanamycin. We developed a statistical model of E. coli density over time that fit the data under all experimental conditions. Predation and competition had significant negative effects on E. coli survival, while higher nutrient levels increased survival. Among the main effects, predation accounted for the greatest proportion of variation (40%), compared with nutrients (25%) and competition (15%). The highest nutrient level mitigated the effect of predation on E. coli survival. Thus, elevated organic nutrients may disproportionately enhance the survival of E. coli, and potentially that of other enteric bacteria, in aquatic habitats.
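A factorial microcosm experiment of this kind is often summarised by modelling log-transformed counts and partitioning the explained variation among the manipulated factors. The sketch below is a rough illustration of that idea under assumed file and column names; the study's actual statistical model is not specified here.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical microcosm data: log10 E. coli counts under manipulated
# predation, competition, and nutrient levels sampled over time.
df = pd.read_csv("microcosm_counts.csv")  # assumed columns: log_count, day, predation, competition, nutrients

model = smf.ols("log_count ~ day + C(predation) + C(competition) + C(nutrients)", data=df).fit()
anova = sm.stats.anova_lm(model, typ=2)

# Share of the explained sum of squares attributable to each term.
explained = anova["sum_sq"].drop("Residual")
print((explained / explained.sum()).round(2))
```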
Rajtak, Ursula; Boland, Fiona; Bolton, Declan; Fanning, Séamus
2012-01-01
The persistence of Salmonella in the environment is an important factor influencing the transmission of infection in pig production. This study evaluated the effects of acid tolerance response (ATR), organic acid supplementation, and physical properties of feed on the survival of a five-strain Salmonella mixture in porcine feces held at 4 and 22°C for 88 days. Acid-adapted or non-acid-adapted nalidixic acid-resistant Salmonella strains were used to inoculate feces of pigs fed four different diets, which consisted of a nonpelleted, finely ground meal feed or a finely ground, pelleted feed that was left unsupplemented or was supplemented with K-diformate. Organic acid supplementation and physical properties of feed markedly influenced Salmonella survival, but the effects were highly dependent on storage temperature; survival was unaffected by ATR. The most pronounced effects were observed at 22°C, a temperature similar to that of finishing pig houses. The supplementation of meal diets with K-diformate significantly reduced the duration of survival (P < 0.1) and increased rates of decline (P < 0.0001) of salmonellae in feces compared to survival in feces of pigs fed unsupplemented meal. The pelleting of feed, compared to feeding meal, significantly reduced (P < 0.1) the duration of survival in feces held at 22°C. Only minor effects of feed form and acid supplementation on survivor numbers were observed at 4°C. Differences in the fecal survival of Salmonella could not be related to diet-induced changes in fecal physiochemical parameters. The predominant survival of S. enterica serovar Typhimurium DT193 and serotype 4,[5],12:i:- in porcine feces demonstrates the superior ability of these serotypes to survive in this environment. Fecal survival and transmission of Salmonella in pig herds may be reduced by dietary approaches, but effects are highly dependent on environmental temperature. PMID:22038599
Glioblastoma multiforme (GBM) in the elderly: initial treatment strategy and overall survival.
Glaser, Scott M; Dohopolski, Michael J; Balasubramani, Goundappa K; Flickinger, John C; Beriwal, Sushil
2017-08-01
The EORTC trial, which solidified the role of external beam radiotherapy (EBRT) plus temozolomide (TMZ) in the management of GBM, excluded patients over age 70. Randomized studies of elderly patients showed that hypofractionated EBRT (HFRT) alone or TMZ alone was at least equivalent to conventionally fractionated EBRT (CFRT) alone. We sought to investigate the practice patterns and survival in elderly patients with GBM. We identified patients aged 65-90 in the National Cancer Data Base (NCDB) with histologically confirmed GBM from 1998 to 2012 and known chemotherapy and radiotherapy status. We analyzed factors predicting treatment with EBRT alone vs. EBRT plus concurrent single-agent chemotherapy (CRT) using multivariable logistic regression. Similarly, within the EBRT alone cohort we compared CFRT (54-65 Gy at 1.7-2.1 Gy/fraction) to HFRT (34-60 Gy at 2.5-5 Gy/fraction). A multivariable Cox proportional hazards model (MVA) with propensity score adjustment was used to compare survival. A total of 38,862 patients were included. Initial treatments for 1998 versus 2012 were: EBRT alone = 50 versus 10%; CRT = 6 versus 50%; chemotherapy alone = 1.6% (70% single-agent) versus 3.2% (94% single-agent). Among EBRT alone patients, use of HFRT (compared to CFRT) increased from 13 to 41%. Numerous factors predictive for utilization of CRT over EBRT alone and for HFRT over CFRT were identified. Median survival and 1-year overall survival were higher in the CRT versus EBRT alone group at 8.6 months vs. 5.1 months and 36.0 versus 15.7% (p < 0.0005 by log-rank, multivariable HR 0.65 [95% CI = 0.61-0.68, p < 0.0005], multivariable HR with propensity adjustment 0.66 [95% CI = 0.63-0.70, p < 0.0005]). For elderly GBM patients in the United States, CRT is the most common initial treatment and appears to offer a survival advantage over EBRT alone. Adoption of hypofractionation has increased over time but continues to be low.
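The survival comparison above rests on a multivariable Cox model with propensity score adjustment. As a rough sketch of one common way to implement that approach (not the NCDB analysis itself), the Python snippet below estimates a propensity score for receiving CRT and includes it alongside baseline covariates in a Cox model; the file name and column names (months, dead, crt, age, kps, extent_resection) are assumptions for illustration.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

# Hypothetical registry extract: one row per patient.
df = pd.read_csv("gbm_cohort.csv")  # assumed columns: months, dead, crt, age, kps, extent_resection

# 1. Propensity score: probability of receiving CRT given baseline covariates.
covs = ["age", "kps", "extent_resection"]
ps_model = LogisticRegression(max_iter=1000).fit(df[covs], df["crt"])
df["ps"] = ps_model.predict_proba(df[covs])[:, 1]

# 2. Cox model for overall survival, adjusting for covariates and the propensity score.
cph = CoxPHFitter()
cph.fit(df[["months", "dead", "crt", "ps"] + covs], duration_col="months", event_col="dead")
print(cph.summary)  # hazard ratio for "crt" is exp(coef) in the summary table
```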
Living-donor vs deceased-donor liver transplantation for patients with hepatocellular carcinoma.
Akamatsu, Nobuhisa; Sugawara, Yasuhiko; Kokudo, Norihiro
2014-09-27
With the increasing prevalence of living-donor liver transplantation (LDLT) for patients with hepatocellular carcinoma (HCC), some authors have reported a potential increase in the HCC recurrence rates among LDLT recipients compared to deceased-donor liver transplantation (DDLT) recipients. The aim of this review is to encompass current opinions and clinical reports regarding differences in the outcome, especially the recurrence of HCC, between LDLT and DDLT. While some studies report impaired recurrence-free survival and increased recurrence rates among LDLT recipients, others, including large database studies, report comparable recurrence-free survival and recurrence rates between LDLT and DDLT. Studies supporting the increased recurrence in LDLT have linked graft regeneration to tumor progression, but we found no association between graft regeneration/initial graft volume and tumor recurrence among our 125 consecutive LDLTs for HCC cases. In the absence of a prospective study regarding the use of LDLT vs DDLT for HCC patients, there is no evidence to support the higher HCC recurrence after LDLT than DDLT, and LDLT remains a reasonable treatment option for HCC patients with cirrhosis.
Frequent shopping by men and women increases survival in the older Taiwanese population.
Chang, Yu-Hung; Chen, Rosalind Chia-Yu; Wahlqvist, Mark L; Lee, Meei-Shyuan
2012-07-01
Active ageing is a key to healthy ageing; shopping behaviour is an economically relevant activity of the elderly. Analysis was based on the NAHSIT 1999-2000 dataset. A total of 1841 representative free-living elderly Taiwanese people were selected, and information included demographics, socioeconomic status, health behaviours, shopping frequencies, physical function and cognitive function. These data were linked to official death records. Cox proportional hazard models were used to evaluate the effect of shopping frequency on death from 1999-2008 with covariate adjustment. Compared with never or rarely shopping, highly frequent shopping predicted survival (HR 0.54, 95% CI 0.43 to 0.67); with adjustment for physical function, cognitive function and other covariates, the HR was 0.73 (95% CI 0.56 to 0.93). Elderly people who shopped every day had a 27% lower risk of death than the least frequent shoppers. Men benefited more from everyday shopping than women, with HRs reduced by 28% versus 23% relative to the least frequent shoppers. Shopping behaviour favourably predicts survival. Highly frequent shopping may favour men more than women. Shopping captures several dimensions of personal well-being, health and security as well as contributing to the community's cohesiveness and economy, and may represent or actually confer increased longevity.
Santarmaki, Valentini; Kourkoutas, Yiannis; Zoumpopoulou, Georgia; Mavrogonatou, Eleni; Kiourtzidis, Mikis; Chorianopoulos, Nikos; Tassou, Chrysoula; Tsakalidou, Effie; Simopoulos, Constantinos; Ypsilantis, Petros
2017-09-01
Survival during transit through the gastrointestinal tract, intestinal mucosa adhesion, and a potential immunomodulatory effect of Lactobacillus plantarum strains 2035 and ACA-DC 2640 were investigated in a rat model. According to microbiological and multiplex PCR analysis, both strains were detected in feces 24 h after either single-dose or daily administration for 7 days. Intestinal mucosa adhesion of L. plantarum 2035 was noted in the large intestine at 24 h after single-dose administration, while it was not detected at 48 h. Daily dosing prolonged detection of the strain up to 48 h post-administration and expanded adhesion to the small intestine. Adhesion of L. plantarum ACA-DC 2640 to the intestinal mucosa after single-dose administration was prolonged and more extensive compared to that of L. plantarum 2035. Daily dosing increased both the levels and the rate of positive cultures of the strains compared to those of the single-dose scheme. In addition, both strains increased total IgG while decreasing IgM and IgA serum levels. In conclusion, L. plantarum 2035 and L. plantarum ACA-DC 2640 survived transit through the gastrointestinal tract, exhibited transient, distinct adhesion to the intestinal mucosa, and modulated the systemic immune response.
Response to alkaline stress by root canal bacteria in biofilms.
Chávez de Paz, L E; Bergenholtz, G; Dahlén, G; Svensäter, G
2007-05-01
To determine whether bacteria isolated from infected root canals survive alkaline shifts better in biofilms than in planktonic cultures. Clinical isolates of Enterococcus faecalis, Lactobacillus paracasei, Olsenella uli, Streptococcus anginosus, S. gordonii, S. oralis and Fusobacterium nucleatum in biofilm and planktonic cultures were stressed at pH 10.5 for 4 h, and cell viability was determined using the fluorescent LIVE/DEAD BacLight bacterial viability kit. In addition, proteins released into extracellular culture fluids were identified by Western blotting. Enterococcus faecalis, L. paracasei, O. uli and S. gordonii survived in high numbers in both planktonic cultures and biofilms after alkaline challenge. S. anginosus, S. oralis and F. nucleatum showed increased viability in biofilms compared with planktonic cultures. Alkaline exposure caused all planktonic cultures to aggregate into clusters and resulted in a greater extrusion of cellular proteins compared with cells in biofilms. Increased levels of DnaK, HPr and fructose-1,6-bisphosphate aldolase were observed in culture fluids, especially amongst streptococci. In general, bacteria isolated from infected root canals resisted alkaline stress better in biofilms than in planktonic cultures; however, planktonic cells appeared to use aggregation and the extracellular transport of specific proteins as survival mechanisms.
Fehring, Richard J; Schneider, Mary; Raviele, Kathleen; Rodriguez, Dana; Pruszynski, Jessica
2013-07-01
The aim was to compare the efficacy and acceptability of two Internet-supported fertility-awareness-based methods of family planning. Six hundred and sixty-seven women and their male partners were randomized into either an electronic hormonal fertility monitor (EHFM) group or a cervical mucus monitoring (CMM) group. Both groups utilized a Web site with instructions, charts and support. Acceptability was assessed online at 1, 3 and 6 months. Pregnancy rates were determined by survival analysis. The EHFM participants (N=197) had a total pregnancy rate of 7 per 100 users over 12 months of use compared with 18.5 for the CMM group (N=164). The log rank survival test showed a significant difference (p<.01) in survival functions. Mean acceptability for both groups increased significantly over time (p<.0001). Continuation rates at 12 months were 40.6% for the monitor group and 36.6% for the mucus group. In comparison with the CMM, the EHFM method of family planning was more effective. All users had an increase in acceptability over time. Results are tempered by the high dropout rate. Copyright © 2013 Elsevier Inc. All rights reserved.
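Pregnancy rates in the trial above were determined by survival analysis and the two arms were compared with a log-rank test. The sketch below illustrates that general approach in Python with the lifelines package, treating pregnancy as the event of interest; the data file and column names are hypothetical and are not the trial's actual dataset.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("fabm_followup.csv")  # assumed columns: months_followed, pregnant (0/1), group ("EHFM"/"CMM")
ehfm = df[df["group"] == "EHFM"]
cmm = df[df["group"] == "CMM"]

# Kaplan-Meier "survival" here is the probability of remaining not pregnant,
# so the cumulative pregnancy rate at 12 months is 1 minus the survival estimate.
kmf = KaplanMeierFitter()
kmf.fit(ehfm["months_followed"], event_observed=ehfm["pregnant"], label="EHFM")
print(1 - kmf.survival_function_at_times(12))

# Log-rank test comparing the two time-to-pregnancy curves.
result = logrank_test(ehfm["months_followed"], cmm["months_followed"],
                      event_observed_A=ehfm["pregnant"], event_observed_B=cmm["pregnant"])
print(result.p_value)
```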
Encounter with mesoscale eddies enhances survival to settlement in larval coral reef fishes
Shulzitski, Kathryn; Sponaugle, Su; Hauff, Martha; Walter, Kristen D.; Cowen, Robert K.
2016-01-01
Oceanographic features, such as eddies and fronts, enhance and concentrate productivity, generating high-quality patches that dispersive marine larvae may encounter in the plankton. Although broad-scale movement of larvae associated with these features can be captured in biophysical models, direct evidence of processes influencing survival within them, and subsequent effects on population replenishment, are unknown. We sequentially sampled cohorts of coral reef fishes in the plankton and nearshore juvenile habitats in the Straits of Florida and used otolith microstructure analysis to compare growth and size-at-age of larvae collected inside and outside of mesoscale eddies to those that survived to settlement. Larval habitat altered patterns of growth and selective mortality: Thalassoma bifasciatum and Cryptotomus roseus that encountered eddies in the plankton grew faster than larvae outside of eddies and likely experienced higher survival to settlement. During warm periods, T. bifasciatum residing outside of eddies in the oligotrophic Florida Current experienced high mortality and only the slowest growers survived early larval life. Such slow growth is advantageous in nutrient poor habitats when warm temperatures increase metabolic demands but is insufficient for survival beyond the larval stage because only fast-growing larvae successfully settled to reefs. Because larvae arriving to the Straits of Florida from distant sources must spend long periods of time outside of eddies, our results indicate that they have a survival disadvantage. High productivity features such as eddies not only enhance the survival of pelagic larvae, but also potentially increase the contribution of locally spawned larvae to reef populations. PMID:27274058
Hansen, Lisbeth Truelstrup; Vogel, Birte Fonnesbech
2011-03-15
The foodborne bacterial pathogen, Listeria monocytogenes, commonly contaminates foods during processing, where the microorganisms are potentially subjected to low relative humidity (RH) conditions for extended periods of time. The objective of this study was to examine survival during desiccation (43% RH and 15 °C) of biofilm L. monocytogenes N53-1 cells on stainless steel coupons and to assess subsequent transfer to salmon products. Formation of static biofilm (2 days at 100% RH and 15 °C) prior to desiccation for 23 days significantly (P<0.05) improved survival of cells desiccated in initial low salt concentrations (0.5%) compared to the survival of non-biofilm cells also desiccated in low salt, indicating the protective effect of the biofilm matrix. Osmoadaptation of cells in 5% NaCl before formation of the static biofilm significantly (P<0.05) increased long-term desiccation survival (49 days) irrespective of the initial salt levels (0.5% and 5% NaCl). The efficiency of transfer (EOT) of desiccated biofilm cells was significantly (P<0.05) lower than the EOTs for desiccated non-biofilm bacteria; however, because biofilm formation enhanced desiccation survival, more bacteria were still transferred to smoked and fresh salmon. In conclusion, the current work shows the protective effect of biofilm formation, salt and osmoadaptation on the desiccation survival of L. monocytogenes, which in turn increases the potential for cross-contamination during food processing. Copyright © 2011 Elsevier B.V. All rights reserved.
Fuller, Brian M; Mohr, Nicholas M; Drewry, Anne M; Ferguson, Ian T; Trzeciak, Stephen; Kollef, Marin H; Roberts, Brian W
2017-10-01
To describe the prevalence of hypocapnia and hypercapnia during the earliest period of mechanical ventilation, and determine the association between PaCO2 and mortality. A cohort study using an emergency department registry of mechanically ventilated patients. PaCO2 was categorized: hypocapnia (<35 mmHg), normocapnia (35-45 mmHg), and hypercapnia (>45 mmHg). The primary outcome was survival to hospital discharge. A total of 1,491 patients were included. Hypocapnia occurred in 375 (25%) patients and hypercapnia in 569 (38%). Survival was higher with hypercapnia (85%) than with normocapnia (74%) or hypocapnia (66%), P<0.001. PaCO2 was an independent predictor of survival to hospital discharge [hypocapnia, aOR 0.65 (95% confidence interval [CI] 0.48-0.89); normocapnia, reference category; hypercapnia, aOR 1.83 (95% CI 1.32-2.54)]. Over ascending ranges of PaCO2, there was a linear trend of increasing survival up to a PaCO2 range of 66-75 mmHg, which had the strongest survival association, aOR 3.18 (95% CI 1.35-7.50). Hypocapnia and hypercapnia occurred frequently after initiation of mechanical ventilation. Higher PaCO2 levels were associated with increased survival. These data provide rationale for a trial examining the optimal PaCO2 in the critically ill. Copyright © 2017 Elsevier Inc. All rights reserved.
Poor horse traders: large mammals trade survival for reproduction during the process of feralization
Grange, Sophie; Duncan, Patrick; Gaillard, Jean-Michel
2009-01-01
We investigated density dependence of the demographic parameters of a population of Camargue horses (Equus caballus), individually monitored and unmanaged for eight years. We also analysed the contributions of individual demographic parameters to changes in the population growth rates. The decrease in resources caused a loss of body condition. Adult male survival was not affected, but the survival of foals and adult females decreased with increasing density. Prime-aged females maintained high reproductive performance at high density, and their survival decreased. The higher survival of adult males compared with females at high density presumably results from higher investment in reproduction by mares. The high fecundity in prime-aged females, even when at high density, may result from artificial selection for high reproductive performance, which is known to have occurred in all the major domestic ungulates. Other studies suggest that feral ungulates, including cattle and sheep, respond to increases in density as these horses do, differing from wild ungulates by trading adult survival for reproduction. As a consequence, populations of feral animals should oscillate more strongly than their wild counterparts, since they should be both more invasive (as they breed faster) and more sensitive to harsh environmental conditions (as the population growth rate of long-lived species is consistently more sensitive to a given proportional change in adult survival than to the same change in any other vital rate). If this principle proves to be general, it has important implications for management of populations of feral ungulates. PMID:19324787
Survival in Adult Lung Transplant Recipients Receiving Pediatric Versus Adult Donor Allografts.
Hayes, Don; Whitson, Bryan A; Ghadiali, Samir N; Lloyd, Eric A; Tobias, Joseph D; Mansour, Heidi M; Black, Sylvester M
2015-10-01
Recent evidence showed that pediatric donor lungs increased rates of allograft failure in adult lung transplant recipients; however, the influence on survival is unclear. The United Network for Organ Sharing (UNOS) database was queried from 2005 to 2013 for adult lung transplant recipients (≥18 years) to assess survival differences among donor age categories (<18 years, 18 to 29 years, 30 to 59 years, ≥60 years). Of 12,297 adult lung transplants, 12,209 were used for univariate Cox models and Kaplan-Meier (KM) analysis and 11,602 for multivariate Cox models. A total of 1,187 adult recipients received pediatric donor lungs compared with 11,110 receiving adult donor organs. Univariate and multivariate Cox models found no difference in survival between donor ages 0 to 17 and donor ages 18 to 29, whereas donor ages 60 and older were significantly associated with increased mortality hazard, relative to the modal category of donor ages 30 to 59 (adjusted hazard ratio = 1.381; 95% confidence interval = 1.188 to 1.606; p < 0.001). Interaction analyses between recipient and donor age range showed that the oldest donor age range was negatively associated with survival among middle-aged (30 to 59) and older (≥60) lung transplant recipients. Pediatric donor lung allografts were not negatively associated with survival in adult lung transplant recipients; however, the oldest donor age range was associated with increased mortality hazard for adult lung transplant recipients. Copyright © 2015 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
Heubner, Martin; Wimberger, Pauline; Otterbach, Friedrich; Kasimir-Bauer, Sabine; Siffert, Winfried; Kimmig, Rainer; Nückel, Holger
2009-01-01
Bcl-2 plays a key role in the regulation of apoptosis. Recently, a novel regulatory single nucleotide polymorphism (-938C>A) in the inhibitory P2 BCL2 promoter was described. In this study we investigated its potential association with survival in epithelial ovarian cancer. Patients (n=110) with primary epithelial ovarian cancer were retrospectively genotyped by pyrosequencing. Genotype distribution was not significantly different between 110 ovarian cancer patients and 120 healthy controls, suggesting that genotypes of this polymorphism do not increase the susceptibility to ovarian cancer. Kaplan-Meier curves showed a significant association of the AA genotype with increased survival (p=0.002). Multivariate analysis revealed that the BCL2-938AC/CC genotype (hazard ratio 4.5; p=0.003) was an independent prognostic factor compared to other prognostic factors such as age, histological grade or tumor stage. The results suggest a role for the BCL2-938C>A polymorphism as a marker for survival in patients with epithelial ovarian cancer.
Kim, Eun Chul; Meng, Huan; Jun, Albert S
2014-10-01
The present study evaluated survival effects of N-acetylcysteine (NAC) on cultured corneal endothelial cells exposed to oxidative and endoplasmic reticulum (ER) stress and in a mouse model of early-onset Fuchs endothelial corneal dystrophy (FECD). Cultured bovine corneal endothelial cell viability against oxidative and ER stress was determined by CellTiter-Glo(®) luminescent reagent. Two-month-old homozygous knock-in Col8a2(L450W/L450W) mutant (L450W) and C57/Bl6 wild-type (WT) animals were divided into two groups of 15 mice. Group I received 7 mg/mL NAC in drinking water and Group II received control water for 7 months. Endothelial cell density and morphology were evaluated with confocal microscopy. Antioxidant gene (iNos) and ER stress/unfolded protein response gene (Grp78 and Chop) mRNA levels and protein expression were measured in corneal endothelium by real time PCR and Western blotting. Cell viability of H2O2 and thapsigargin exposed cells pre-treated with NAC was significantly increased compared to untreated controls (p < 0.01). Corneal endothelial cell density (CD) was higher (p = 0.001) and percent polymegathism was lower (p = 0.04) in NAC treated L450W mice than in untreated L450W mice. NAC treated L450W endothelium showed significant upregulation of iNos, whereas Grp78 and Chop were downregulated compared to untreated L450W endothelium by real time PCR and Western blotting. NAC increases survival in cultured corneal endothelial cells exposed against ER and oxidative stress. Systemic NAC ingestion increases corneal endothelial cell survival which is associated with increased antioxidant and decreased ER stress markers in a mouse model of early-onset FECD. Our study presents in vivo evidence of a novel potential medical treatment for FECD. Copyright © 2014 Elsevier Ltd. All rights reserved.
Ono, Junya; Shime, Hiroaki; Takaki, Hiromi; Takashima, Ken; Funami, Kenji; Yoshida, Sumito; Takeda, Yohei; Matsumoto, Misako; Kasahara, Masanori; Seya, Tsukasa
2017-10-17
Intestinal tumorigenesis is promoted by myeloid differentiation primary response gene 88 (MyD88) activation in response to the components of microbiota in ApcMin/+ mice. Microbiota also contains double-stranded RNA (dsRNA), a ligand for TLR3, which activates the toll-like receptor adaptor molecule 1 (TICAM-1, also known as TRIF) pathway. We established ApcMin/+ Ticam1-/- mice and their survival was compared to survival of ApcMin/+ Myd88-/- and wild-type (WT) mice. The properties of polyps were investigated using immunofluorescence staining and RT-PCR analysis. We demonstrate that TICAM-1 is essential for suppression of polyp formation in ApcMin/+ mice. TICAM-1 knockout resulted in shorter survival of mice compared to WT mice or mice with knockout of MyD88 in the ApcMin/+ background. Polyps were more frequently formed in the distal intestine of ApcMin/+ Ticam1-/- mice than in ApcMin/+ mice. Infiltration of immune cells such as CD11b+ and CD8α+ cells into the polyps was detected histologically. CD11b and CD8α mRNAs were increased in polyps of ApcMin/+ Ticam1-/- mice compared to ApcMin/+ mice. Gene expression of inducible nitric oxide synthase (iNOS), interferon (IFN)-γ, CXCL9 and IL-12p40 was increased in polyps of ApcMin/+ Ticam1-/- mice. mRNA and protein expression of c-Myc, a critical transcription factor for inflammation-associated polyposis, were increased in polyps of ApcMin/+ Ticam1-/- mice. A Lactobacillus strain producing dsRNA was detected in feces of ApcMin/+ mice. These results imply that the TLR3/TICAM-1 pathway inhibits polyposis through suppression of c-Myc expression and supports long survival in ApcMin/+ mice.
Matsumura, Y; Matsumoto, J; Idoguchi, K; Kondo, H; Ishida, T; Kon, Y; Tomita, K; Ishida, K; Hirose, T; Umakoshi, K; Funabiki, T
2017-08-22
Resuscitative endovascular balloon occlusion of the aorta (REBOA) is now a feasible and less invasive resuscitation procedure. This study aimed to compare the clinical course of trauma and non-trauma patients undergoing REBOA. Patient demographics, etiology, bleeding sites, hemodynamic response, length of critical care, and cause of death were recorded. Characteristics and outcomes were compared between non-trauma and trauma patients. Kaplan-Meier survival analysis was then conducted. Between August 2011 and December 2015, 142 (36 non-trauma; 106 trauma) cases were analyzed. Non-traumatic etiologies included gastrointestinal bleeding, obstetrics and gynecology-derived events, visceral aneurysm, abdominal aortic aneurysm, and post-abdominal surgery. The abdomen was a common bleeding site (69%), followed by the pelvis or extra-pelvic retroperitoneum. None of the non-trauma patients had multiple bleeding sites, whereas 45% of trauma patients did (P < 0.001). No non-trauma patients required resuscitative thoracotomy compared with 28% of the trauma patients (P < 0.001). Non-trauma patients had a lower 24-h mortality than trauma patients (19 vs. 51%, P = 0.001). The non-trauma cases demonstrated a gradual but prolonged increase in mortality, whereas survival in trauma cases rapidly declined (P = 0.009), with similar hospital mortality (68 vs. 64%). Non-trauma patients who survived for 24 h had a median of 0 ventilator-free days and 0 ICU-free days vs. 19 and 12, respectively, for trauma patients (P = 0.33 and 0.39, respectively). Non-hemorrhagic death was more common in non-trauma vs. trauma patients (83 vs. 33%, P < 0.001). Non-traumatic hemorrhagic shock often resulted from a single bleeding site and led to better 24-h survival than traumatic hemorrhage among Japanese patients who underwent REBOA. However, hospital mortality increased steadily in non-trauma patients affected by non-hemorrhagic causes after a longer period of critical care.
Dhillon, R K; Hillman, S C; Pounds, R; Morris, R K; Kilby, M D
2015-11-01
To compare the Solomon and selective techniques for fetoscopic laser ablation (FLA) for the treatment of twin-twin transfusion syndrome (TTTS) in monochorionic-diamniotic twin pregnancies. This was a systematic review conducted in accordance with the PRISMA statement. Electronic searches were performed for relevant citations published from inception to September 2014. Selected studies included pregnancies undergoing FLA for TTTS that reported on recurrence of TTTS, occurrence of twin anemia-polycythemia sequence (TAPS) or survival. From 270 possible citations, three studies were included, two cohort studies and one randomized controlled trial (RCT), which directly compared the Solomon and selective techniques for FLA. The odds ratios (OR) of recurrent TTTS when using the Solomon vs the selective technique in the two cohort studies (n = 249) were 0.30 (95% CI, 0.00-4.46) and 0.45 (95% CI, 0.07-2.20). The RCT (n = 274) demonstrated a statistically significant reduction in risk of recurrent TTTS with the Solomon technique (OR, 0.21 (95% CI, 0.04-0.98); P = 0.03). The ORs for the development of TAPS following the Solomon and the selective techniques were 0.20 (95% CI, 0.00-2.46) and 0.61 (95% CI, 0.05-5.53) in the cohort studies and 0.16 (95% CI, 0.05-0.49) in the RCT, with statistically significant differences for the RCT only (P < 0.001). Observational evidence suggested overall better survival with the Solomon technique, which was statistically significant for survival of at least one twin. The RCT did not demonstrate a significant difference in survival between the two techniques, most probably owing to the small sample size and lack of power. This systematic review of observational, comparative cohort and RCT data suggests a trend towards a reduction in TAPS and recurrent TTTS and an increase in twin survival, with no increase in the occurrence of complications or adverse events, when using the Solomon compared to the selective technique for the treatment of TTTS. These findings need to be confirmed by an appropriately-powered RCT with long-term neurological follow-up. Copyright © 2015 ISUOG. Published by John Wiley & Sons Ltd.
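The odds ratios pooled in this review are derived from event counts in each trial arm. As a generic illustration with purely hypothetical counts (not the included trials' data), a single-study odds ratio and its Woolf-type 95% confidence interval can be computed as follows.

```python
import math

def odds_ratio(events_a, total_a, events_b, total_b):
    """OR of the event in arm A vs arm B, with a Woolf (log-scale) 95% CI.
    All counts passed to this function are hypothetical, for illustration only."""
    a, b = events_a, total_a - events_a
    c, d = events_b, total_b - events_b
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lower = math.exp(math.log(or_) - 1.96 * se)
    upper = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lower, upper)

# Hypothetical example: 4/139 recurrences with technique A vs 15/135 with technique B.
print(odds_ratio(4, 139, 15, 135))
```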
Model for equitable care and outcomes for remote full care hemodialysis units.
Bernstein, Keevin; Zacharias, James; Blanchard, James F; Yu, B Nancy; Shaw, Souradet Y
2010-04-01
Remotely located patients who do not live close to a nephrologist present major challenges for providing care. Various models of remotely delivered care have been developed, with a gap in knowledge regarding the outcomes of these heterogeneous models. This report describes a satellite care model for remote full-care hemodialysis units managed homogeneously in the province of Manitoba, Canada, without onsite nephrologists. Survival in remotely located full-care units is compared with a large, urban full-care center with onsite nephrologists. Data from a Canadian provincial dialysis registry were extracted on 2663 patients between 1990 and 2005. All-cause mortality after initiation of chronic hemodialysis was assessed with Cox proportional hazards regression. Both short-term (1 year) and long-term (2 to 5 years) survival were analyzed. Survival for patients receiving remotely delivered care was shown to be better than for those receiving care in the urban care center with this particular Canadian model of care. Furthermore, there was no difference when assessing short- and long-term survival. This was independent of distance from the urban center. Chronic hemodialysis patients receiving remotely delivered care in a specialized facility attain comparable, if not better, survival outcomes than their urban counterparts with direct onsite nephrology care. This model can potentially be adapted to other underserviced areas, including increasingly larger urban centers.
Noppakun, Kajohnsak; Ingsathit, Atiporn; Pongskul, Cholatip; Premasthian, Nalinee; Avihingsanon, Yingyos; Lumpaopong, Adisorn; Vareesangthip, Kriangsak; Sumethkul, Vasant
2015-03-01
To report the kidney transplant activity and survival data during the past 25 years from the Thai Transplant Registry. By using the registry database that was collected and updated yearly by 26 transplant centres across the country, we have reported the donor, recipient, and transplant characteristics during the past 25 years from 1987 to 2012. The primary outcome was graft loss, defined as return to dialysis, graft removal, retransplant, or patient death. A total of 465 kidney transplants were performed in 2012, representing 8.1% and 23.0% increases in living and deceased donor transplants, respectively, compared with the previous year. Between 1987 and 2012, based on data from 3808 recipients, patient survival and graft survival improved significantly. Traffic accident was the most common cause of death in brain-dead donors. Additionally, the most common cause of end-stage kidney disease was glomerulonephritis. Infection has been among the most common causes of death in kidney transplant recipients. We have reported the total number of kidney transplants and the graft and patient survival data for recipients in Thailand for the period from 1987 to 2012. Although the number of patients is much lower than in developed countries, patient and graft survival rates are comparable. © 2014 Asian Pacific Society of Nephrology.
Evaluation of red cell distribution width in dogs with pulmonary hypertension.
Swann, James W; Sudunagunta, Siddharth; Covey, Heather L; English, Kate; Hendricks, Anke; Connolly, David J
2014-12-01
To compare red cell distribution width (RDW) between dogs with different causes of pulmonary hypertension (PH) and a control dog population to determine whether RDW was correlated with severity of PH as measured by echocardiography. A further aim was to determine the prognostic significance of increased RDW for dogs with PH. Forty-four client-owned dogs with PH and 79 control dogs presented to a single tertiary referral institution. Signalment, clinical pathological and echocardiographic data were obtained retrospectively from the medical records of dogs with PH, and RDW measured on a Cell-Dyn 3500 was compared between dogs with pre- and post-capillary PH and a control population. Referring veterinary surgeons were contacted for follow-up information and Kaplan-Meier analysis was conducted to investigate differences in survival time between affected dogs with different RDW values. The RDW was significantly greater in dogs with pre-capillary PH compared to control dogs. There was no difference in median survival times between dogs with PH divided according to RDW values. The RDW was positively correlated with mean corpuscular volume and haematocrit in dogs with PH, but did not correlate with echocardiographic variables. An association was found between dogs with PH and increased RDW; however there was considerable overlap in values between control dogs and dogs with PH. The RDW was not associated with survival in this study. Copyright © 2014 Elsevier B.V. All rights reserved.
Seasonal survival of radiomarked emperor geese in western Alaska
Hupp, Jerry W.; Schmutz, Joel A.; Ely, Craig R.
2008-01-01
The population of emperor geese (Chen canagica) in western Alaska, USA, declined by >50% from the 1960s to the mid-1980s and has increased only slightly since. Rates of population increase among arctic geese are especially sensitive to changes in adult survival. Improving adult survival in seasons or geographic areas where survival is low may be the best means of increasing the emperor goose population. We monitored fates of 133 adult female emperor geese that were radiomarked with surgically implanted very high frequency or satellite radiotransmitters from 1999 to 2004 to assess whether monthly survival varied among years, seasons, or geographic areas. Because of uncertainties in determining whether a bird had died based on the radio signal, we analyzed 2 versions of the data. One version used conservative criteria to identify which birds had died based on radio signals and the other used more liberal criteria. In the conservative version of the data we detected 12 mortalities of emperor geese, whereas in the liberal interpretation there were 18 mortalities. In both versions, the models with greatest support indicated that monthly survival varied seasonally and that, compared to most seasons, estimated monthly survival was lower (0.95-0.98) in May and August, when emperor geese were mainly on the Yukon-Kuskokwim Delta. From 44% to 47% of annual mortality occurred in those months. Estimated monthly survival was higher (0.98-1.0) from September through March, when emperor geese were at autumn staging or wintering areas, and in June and July, when birds were nesting, rearing broods, or molting. Estimated annual survival was 0.85 (95% CI = 0.77-0.92) in the best-supported model when we used conservative criteria to identify mortalities and 0.79 (95% CI = 0.74-0.85) under the best model using liberal mortality criteria. Lower survival in August and May corresponded to periods when subsistence harvest of emperor geese was likely highest. Managers may be able to most effectively influence population growth rate of emperor geese by reducing subsistence harvest on the Yukon-Kuskokwim Delta in May and August.
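The annual survival estimates reported above combine season-specific monthly survival probabilities multiplicatively across the year. The short sketch below shows only that bookkeeping step with rounded, assumed monthly values; it is not the study's known-fate model.

```python
import math  # math.prod requires Python 3.8+

# Assumed illustrative monthly survival probabilities: lower values in May and
# August (when birds are on the Yukon-Kuskokwim Delta), higher in the other months.
monthly = {"May": 0.95, "Aug": 0.95}
monthly.update({m: 0.99 for m in ["Jan", "Feb", "Mar", "Apr", "Jun",
                                  "Jul", "Sep", "Oct", "Nov", "Dec"]})

annual_survival = math.prod(monthly.values())
print(round(annual_survival, 2))  # roughly 0.82 with these assumed values
```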
Desai, A; Wu, H; Sun, L; Sesterhenn, I A; Mostofi, F K; McLeod, D; Amling, C; Kusuda, L; Lance, R; Herring, J; Foley, J; Baldwin, D; Bishoff, J T; Soderdahl, D; Moul, J W
2002-01-01
The objectives of this work were to evaluate the efficacy of controlled close step-sectioned and whole-mounted radical prostatectomy specimen processing in prediction of clinical outcome as compared to the traditional processing techniques. Two hundred and forty-nine radical prostatectomy (RP) specimens were whole-mounted and close step-sectioned at caliper-measured 2.2-2.3 mm intervals. A group of 682 radical prostatectomy specimens were partially sampled as control. The RPs were performed during 1993-1999 with a mean follow-up of 29.3 months, pretreatment PSA of 0.1-40, and biopsy Gleason sums of 5-8. Disease-free survival based on biochemical or clinical recurrence and secondary intervention was computed using a Kaplan-Meier analysis. There were no significant differences in age at diagnosis, age at surgery, PSA at diagnosis, or biopsy Gleason between the two groups (P<0.05). Compared with the non-close step-sectioned group, the close step-sectioned group showed higher detection rates of extra-prostatic extension (215 (34.1%) vs. 128 (55.4%), P<0.01), and seminal vesicle invasion (50 (7.6%) vs 35 (14.7%), P<0.01). The close step-sectioned group correlated with greater 3-y disease-free survival in organ-confined (P<0.01) and specimen-confined (P<0.01) cases, over the non-uniform group. The close step-sectioned group showed significantly higher disease-free survival for cases with seminal vesicle invasion (P=0.046). No significant difference in disease-free survival was found for the positive margin group (P=0.39) between the close step-sectioned and non-uniform groups. The close step-sectioned technique correlates with increased disease-free survival rates for organ and specimen confined cases, possibly due to higher detection rates of extra-prostatic extension and seminal vesicle invasion. Close step-sectioning provides better assurance of organ-confined disease, resulting in enhanced prediction of outcome by pathological (TNM) stage.
High morale is associated with increased survival in the very old.
Niklasson, Johan; Hörnsten, Carl; Conradsson, Mia; Nyqvist, Fredrica; Olofsson, Birgitta; Lövheim, Hugo; Gustafson, Yngve
2015-07-01
High morale is defined as future-oriented optimism. Previous research suggests that a high morale independently predicts increased survival among old people, though very old people have not been specifically studied. To investigate whether high morale is associated with increased survival among very old people. The Umeå 85+/GErontological Regional DAtabase-study (GERDA) recruited participants aged 85 years and older in northern Sweden and western Finland during 2000-02 and 2005-07, of whom 646 were included in this study. Demographic, functional- and health-related data were collected in this population-based study through structured interviews and assessments carried out during home visits and from reviews of medical records. The 17-item Philadelphia Geriatric Center Morale Scale (PGCMS) was used to assess morale. The 5-year survival rate was 31.9% for participants with low morale, 39.4% for moderate and 55.6% for those with high morale. In an unadjusted Cox model, the relative risk (RR) of mortality was higher among participants with low morale (RR = 1.86, P < 0.001) and moderate morale (RR = 1.59, P < 0.001) compared with participants with high morale. Similar results were found after adjustment for age and gender. In a Cox model adjusted for several demographic, health- and function-related confounders, including age and gender, mortality was higher among participants with low morale (RR = 1.36, P = 0.032) than those with high morale. There was a similar but non-significant pattern towards increased mortality in participants with moderate morale (RR = 1.21, P value = 0.136). High morale is independently associated with increased survival among very old people. © The Author 2015. Published by Oxford University Press on behalf of the British Geriatrics Society. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Chen, Yongqiang; Henson, Elizabeth S; Xiao, Wenyan; Huang, Daniel; McMillan-Ward, Eileen M; Israels, Sara J; Gibson, Spencer B
2016-06-02
Autophagy is an intracellular lysosomal degradation pathway whose primary function is to allow cells to survive under stressful conditions. Autophagy is, however, a double-edged sword that can either promote cell survival or cell death. In cancer, hypoxic regions contribute to poor prognosis due to the ability of cancer cells to adapt to hypoxia in part through autophagy. In contrast, autophagy could contribute to hypoxia-induced cell death in cancer cells. In this study, we showed that autophagy increased during hypoxia. At 4 h of hypoxia, autophagy promoted cell survival whereas, after 48 h of hypoxia, autophagy increased cell death. Furthermore, we found that the tyrosine phosphorylation of EGFR (epidermal growth factor receptor) decreased after 16 h in hypoxia. In addition, EGFR binding to BECN1 in hypoxia was significantly higher at 4 h compared to 72 h. Knocking down or inhibiting EGFR resulted in an increase in autophagy, contributing to increased cell death under hypoxia. In contrast, when EGFR was reactivated by the addition of EGF, the level of autophagy was reduced, which led to decreased cell death. Hypoxia led to autophagic degradation of the lipid raft protein CAV1 (caveolin 1), which is known to bind and activate EGFR in a ligand-independent manner during hypoxia. Knocking down CAV1 decreased EGFR phosphorylation in hypoxia and increased autophagy and cell death. This indicates that the activation of EGFR plays a critical role in the switch between cell survival and cell death induced by autophagy in hypoxia.
Craciun, Florin L; Schuller, Elizabeth R; Remick, Daniel G
2010-12-01
Neutrophils are critical for the rapid eradication of bacterial pathogens, but they also contribute to the development of multiple organ failure in sepsis. We hypothesized that increasing early recruitment of neutrophils to the focus of infection will increase bacterial clearance and improve survival. Sepsis was induced in mice, using cecal ligation and puncture (CLP); blood samples were collected at 6 and 24 h; and survival was followed for 28 d. In separate experiments, peritoneal bacteria and inflammatory cells were measured. Septic mice predicted to die based on IL-6 levels (Die-P) had higher concentrations of CXCL1 and CXCL2 in the peritoneum and plasma compared with those predicted to live (Live-P). At 6 h, Live-P and Die-P had equivalent numbers of peritoneal neutrophils and bacteria. In Die-P mice the number of peritoneal bacteria increased between 6 and 24 h post-CLP, whereas in Live-P it decreased. The i.p. injection of CXCL1 and CXCL2 in naive mice resulted in local neutrophil recruitment. When given immediately after CLP, CXC chemokines increased peritoneal neutrophil recruitment at 6 h after CLP. This early increase in neutrophils induced by exogenous chemokines resulted in significantly fewer peritoneal bacteria by 24 h [CFU (log) = 6.04 versus 4.99 for vehicle versus chemokine treatment; p < 0.05]. Chemokine treatment significantly improved survival at both 5 d (40 versus 72%) and 28 d (27 versus 52%; p < 0.02 vehicle versus chemokines). These data demonstrate that early, local treatment with CXC chemokines enhances neutrophil recruitment and clearance of bacteria as well as improves survival in the CLP model of sepsis.
Davis, Bryce H; Morimoto, Yoshihisa; Sample, Chris; Olbrich, Kevin; Leddy, Holly A; Guilak, Farshid; Taylor, Doris A
2012-10-01
One of the primary limitations of cell therapy for myocardial infarction is the low survival of transplanted cells, with a loss of up to 80% of cells within 3 days of delivery. The aims of this study were to investigate the distribution of nutrients and oxygen in infarcted myocardium and to quantify how macromolecular transport properties might affect cell survival. Transmural myocardial infarction was created by controlled cryoablation in pigs. At 30 days post-infarction, oxygen and metabolite levels were measured in the peripheral skeletal muscle, normal myocardium, the infarct border zone, and the infarct interior. The diffusion coefficients of fluorescein or FITC-labeled dextran (0.3-70 kD) were measured in these tissues using fluorescence recovery after photobleaching. The vascular density was measured via endogenous alkaline phosphatase staining. To examine the influence of these infarct conditions on cells therapeutically used in vivo, skeletal myoblast survival and differentiation were studied in vitro under the oxygen and glucose concentrations measured in the infarct tissue. Glucose and oxygen concentrations, along with vascular density were significantly reduced in infarct when compared to the uninjured myocardium and infarct border zone, although the degree of decrease differed. The diffusivity of molecules smaller than 40 kD was significantly higher in infarct center and border zone as compared to uninjured heart. Skeletal myoblast differentiation and survival were decreased stepwise from control to hypoxia, starvation, and ischemia conditions. Although oxygen, glucose, and vascular density were significantly reduced in infarcted myocardium, the rate of macromolecular diffusion was significantly increased, suggesting that diffusive transport may not be inhibited in infarct tissue, and thus the supply of nutrients to transplanted cells may be possible. in vitro studies mimicking infarct conditions suggest that increasing nutrients available to transplanted cells may significantly increase their ability to survive in infarct.
Wang, Xiaohong; Zhao, Tiemin; Huang, Wei; Wang, Tao; Qian, Jiang; Xu, Meifeng; Kranias, Evangelia G.; Wang, Yigang; Fan, Guo-Chang
2009-01-01
Although heat-shock preconditioning has been shown to promote cell survival under oxidative stress, the nature of heat-shock response from different cells is variable and complex. Therefore, it remains unclear whether mesenchymal stem cells (MSCs) modified with a single heat-shock protein (Hsp) gene are effective in the repair of a damaged heart. In this study, we genetically engineered rat MSCs with Hsp20 gene (Hsp20-MSCs) and examined cell survival, revascularization, and functional improvement in rat left anterior descending ligation (LAD) model via intracardial injection. We observed that overexpression of Hsp20 protected MSCs against cell death triggered by oxidative stress in vitro. The survival of Hsp20-MSCs was increased by approximately twofold by day 4 after transplantation into the infarcted heart, compared with that of vector-MSCs. Furthermore, Hsp20-MSCs improved cardiac function of infarcted myocardium as compared with vector-MSCs, accompanied by reduction of fibrosis and increase in the vascular density. The mechanisms contributing to the beneficial effects of Hsp20 were associated with enhanced Akt activation and increased secretion of growth factors (VEGF, FGF-2, and IGF-1). The paracrine action of Hsp20-MSCs was further validated in vitro by cocultured adult rat cardiomyocytes with a stress-conditioned medium from Hsp20-MSCs. Taken together, these data support the premise that genetic modification of MSCs before transplantation could be salutary for treating myocardial infarction. PMID:19816949
Liu, Zun Chang; Chang, Thomas M.S.
2012-01-01
We implanted artificial cell bioencapsulated bone marrow mesenchymal stem cells (MSCs) into the spleens of 90% hepatectomized (PH) rats. The resulting 14-day survival rate was 91%. This compares with survival rates of 21% in 90% hepatectomized rats and 25% in those receiving free MSCs transplanted the same way. Unlike free MSCs, the bioencapsulated MSCs are retained in the spleens, and their hepatotrophic factors can continue to drain directly into the liver without dilution, resulting in improved hepatic regeneration. In addition, with time, the transdifferentiation of MSCs into hepatocyte-like cells in the spleen renders the spleen an ectopic liver support. PMID:19132579
Moore, Tamanna; Hennessy, Enid M; Myles, Jonathan; Johnson, Samantha J; Draper, Elizabeth S; Costeloe, Kate L; Marlow, Neil
2012-12-04
To determine outcomes at age 3 years in babies born before 27 completed weeks' gestation in 2006, and to evaluate changes in outcome since 1995 for babies born between 22 and 25 weeks' gestation. Prospective national cohort studies, EPICure and EPICure 2. Hospital and home based evaluations, England. 1031 surviving babies born in 2006 before 27 completed weeks' gestation. Outcomes for 584 babies born at 22-25 weeks' gestation were compared with those of 260 surviving babies of the same gestational age born in 1995. Survival to age 3 years, impairment (2008 consensus definitions), and developmental scores. Multiple imputation was used to account for the high proportion of missing data in the 2006 cohort. Of the 576 babies evaluated after birth in 2006, 13.4% (n=77) were categorised as having severe impairment and 11.8% (n=68) moderate impairment. The prevalence of neurodevelopmental impairment was significantly associated with length of gestation, with greater impairment as gestational age decreased: 45% at 22-23 weeks, 30% at 24 weeks, 25% at 25 weeks, and 20% at 26 weeks (P<0.001). Cerebral palsy was present in 83 (14%) survivors. Mean developmental quotients were lower than those of the general population (normal values 100 (SD 15)) and showed a direct relation with gestational age: 80 (SD 21) at 22-23 weeks, 87 (19) at 24 weeks, 88 (19) at 25 weeks, and 91 (18) at 26 weeks. These results did not differ significantly after imputation. Comparing imputed outcomes between the 2006 and 1995 cohorts, the proportion of survivors born between 22 and 25 weeks' gestation with severe disability, using 1995 definitions, was 18% (95% confidence interval 14% to 24%) in 1995 and 19% (14% to 23%) in 2006. Fewer survivors had shunted hydrocephalus or seizures. Survival of babies admitted for neonatal care increased from 39% (35% to 43%) in 1995 to 52% (49% to 55%) in 2006, an increase of 13% (8% to 18%), and survival without disability increased from 23% (20% to 26%) in 1995 to 34% (31% to 37%) in 2006, an increase of 11% (6% to 16%). Survival and impairment in early childhood are both closely related to gestational age for babies born at less than 27 weeks' gestation. Using multiple imputation to account for the high proportion of missing values, a higher proportion of babies admitted for neonatal care now survive without disability, particularly those born at gestational ages 24 and 25 weeks.
Ou, Judy Y; Spraker-Perlman, Holly; Dietz, Andrew C; Smits-Seemann, Rochelle R; Kaul, Sapna; Kirchhoff, Anne C
2017-10-01
Survival estimates for soft tissue sarcomas (STS) and malignant bone tumors (BT) diagnosed in pediatric, adolescent, and young adult patients are not easily available. We present survival estimates based on a patient having survived a defined period of time (conditional survival). Conditional survival estimates for the short term were calculated for patients from diagnosis to the first five years after diagnosis and for patients surviving in the long term (up to 20 years after diagnosis). We identified 703 patients who were diagnosed with an STS or BT at age ≤25 years from January 1, 1986 to December 31, 2012 at a large pediatric oncology center in Salt Lake City, Utah, United States. We obtained cancer type, age at diagnosis, primary site, and demographic data from medical records, and vital status through the National Death Index. Cancer stage was available for a subset of the cohort through the Utah Cancer Registry. Cox proportional hazards models, adjusted for age and sex, calculated survival estimates for all analyses. Short-term survival improves over time for both sarcomas. Short-term survival for STS from diagnosis (Year 0) did not differ by sex, but short-term survival starting from 1 year post-diagnosis was significantly worse for male patients (survival probability 1 year post-diagnosis [SP1]: 77% [95% CI: 71-83]) than female patients (SP1: 86% [81-92]). Survival for patients who were diagnosed at age ≤10 years (survival probability at diagnosis [SP0]: 85% [79-91]) compared to diagnosis at ages 16-25 years (SP0: 67% [59-75]) was significantly better at all time points from diagnosis to 5 years post-diagnosis. Survival for axial sites (SP0: 69% [63-75]) compared to extremities (SP0: 84% [79-90]) was significantly worse from diagnosis to 1 year post-diagnosis. Survival for axial BT (SP0: 64% [54-74]) was significantly worse than for BT in the extremities (SP0: 73% [68-79]) from diagnosis to 3 years post-diagnosis. Relapsed patients of both sarcoma types had significantly worse short-term survival than non-relapsed patients. Long-term survival for STS in this cohort is 65% at diagnosis, and improves to 86% 5 years post-diagnosis. BT survival improves from 51% at diagnosis to 78% at 5 years post-diagnosis. Conditional survival for short- and long-term STS and BT improves as time from diagnosis increases. Short-term survival was significantly affected by patients' sex, age at diagnosis, cancer site, and relapse status. Copyright © 2017 Elsevier Ltd. All rights reserved.
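Conditional survival, as used in this study, is the probability of surviving an additional period given survival to a landmark time, i.e. S(t + x) / S(x). The sketch below derives such estimates from an unadjusted Kaplan-Meier curve with hypothetical file and column names; the study itself used Cox models adjusted for age and sex, so this is only an approximation of the concept.

```python
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.read_csv("sts_cohort.csv")  # assumed columns: years_from_dx, died (0/1)

kmf = KaplanMeierFitter()
kmf.fit(df["years_from_dx"], event_observed=df["died"])

def conditional_survival(kmf, already_survived, additional):
    """P(surviving 'additional' more years | alive at 'already_survived' years)."""
    s_landmark = kmf.survival_function_at_times(already_survived).iloc[0]
    s_total = kmf.survival_function_at_times(already_survived + additional).iloc[0]
    return s_total / s_landmark

# e.g. probability of reaching 6 years post-diagnosis given survival to 1 year
print(round(conditional_survival(kmf, 1, 5), 2))
```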
Pell, Jill P; Sirel, Jane M; Marsden, Andrew K; Ford, Ian; Walker, Nicola L; Cobbe, Stuart M
2002-01-01
Objective To estimate the potential impact of public access defibrillators on overall survival after out of hospital cardiac arrest. Design Retrospective cohort study using data from an electronic register. A statistical model was used to estimate the effect on survival of placing public access defibrillators at suitable or possibly suitable sites. Setting Scottish Ambulance Service. Subjects Records of all out of hospital cardiac arrests due to heart disease in Scotland in 1991-8. Main outcome measures Observed and predicted survival to discharge from hospital. Results Of 15 189 arrests, 12 004 (79.0%) occurred in sites not suitable for the location of public access defibrillators, 453 (3.0%) in sites where they may be suitable, and 2732 (18.0%) in suitable sites. Defibrillation was given in 67.9% of arrests that occurred in possibly suitable sites for locating defibrillators and in 72.9% of arrests that occurred in suitable sites. Compared with an actual overall survival of 744 (5.0%), the predicted survival with public access defibrillators ranged from 942 (6.3%) to 959 (6.5%), depending on the assumptions made regarding defibrillator coverage. Conclusions The predicted increase in survival from targeted provision of public access defibrillators is less than the increase achievable through expansion of first responder defibrillation to non-ambulance personnel, such as police or firefighters, or of bystander cardiopulmonary resuscitation. Additional resources for wide scale coverage of public access defibrillators are probably not justified by the marginal improvement in survival. What is already known on this topic: Three quarters of all deaths from acute coronary events occur before the patient reaches a hospital. Defibrillation is an independent predictor of survival from out of hospital cardiac arrest. The probability of a rhythm being amenable to defibrillation declines with time. Interest in providing public access defibrillators to reduce the time to defibrillation has been growing, but their potential impact on overall survival is unknown. What this study adds: Most arrests occur in sites unsuitable for locating public access defibrillators. Arrests that occur in sites suitable for locating defibrillators already have the best profile in terms of ambulance response time, use of defibrillation, and survival of the patient. Public access defibrillators are less likely to increase survival than expansion of first responder defibrillation or bystander cardiopulmonary resuscitation. PMID:12217989
A comparison of lamellar and penetrating keratoplasty outcomes: a registry study.
Coster, Douglas J; Lowe, Marie T; Keane, Miriam C; Williams, Keryn A
2014-05-01
To investigate changing patterns of practice of keratoplasty in Australia, graft survival, visual outcomes, the influence of experience, and the surgeon learning curve for endothelial keratoplasty. Observational, prospective cohort study. From a long-standing national corneal transplantation register, 13 920 penetrating keratoplasties, 858 deep anterior lamellar keratoplasties (DALKs), and 2287 endokeratoplasties performed between January 1996 and February 2013 were identified. Kaplan-Meier functions were used to assess graft survival and surgeon experience, the Pearson chi-square test was used to compare visual acuities, and linear regression was used to examine learning curves. Graft survival. The total number of corneal grafts performed annually is increasing steadily. More DALKs but fewer penetrating grafts are being performed for keratoconus, and more endokeratoplasties but fewer penetrating grafts are being performed for Fuchs' dystrophy and pseudophakic bullous keratopathy. In 2012, 1482 grafts were performed, compared with 955 in 2002, translating to a requirement for 264 extra corneal donors across the country in 2012. Comparing penetrating grafts and DALKs performed for keratoconus over the same era, both graft survival (P <0.001) and visual outcomes (P <0.001) were significantly better for penetrating grafts. Survival of endokeratoplasties performed for Fuchs' dystrophy or pseudophakic bullous keratopathy was poorer than survival of penetrating grafts for the same indications over the same era (P <0.001). Visual outcomes were significantly better for penetrating grafts than for endokeratoplasties performed for Fuchs' dystrophy (P <0.001), but endokeratoplasties achieved better visual outcomes than penetrating grafts for pseudophakic bullous keratopathy (P <0.001). Experienced surgeons (>100 registered keratoplasties) achieved significantly better survival of endokeratoplasties (P <0.001) than surgeons who had performed fewer grafts (<100 registered keratoplasties). In the hands of experienced, high-volume surgeons, endokeratoplasty failures occurred even after 100 grafts had been performed. More corneal transplants, especially DALKs and endokeratoplasties, are being performed in Australia than ever before. Survival of DALKs and endokeratoplasties is worse than the survival of penetrating grafts performed for the same indications over the same timeframe. Many endokeratoplasties fail early, but the evidence for a surgeon learning curve is unconvincing. Copyright © 2014 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
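For readers who want to reproduce the kind of Kaplan-Meier graft-survival comparison described above, the following is a minimal, hypothetical sketch in Python with lifelines; the survival times, group labels, and any resulting effect sizes are invented and do not come from the Australian registry.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(1)
# Invented graft survival times (months) and failure indicators for two graft types
t_pk, e_pk = rng.exponential(120, 500), rng.integers(0, 2, 500)   # penetrating keratoplasty
t_ek, e_ek = rng.exponential(90, 500), rng.integers(0, 2, 500)    # endokeratoplasty

kmf = KaplanMeierFitter()
kmf.fit(t_pk, event_observed=e_pk, label="penetrating")
print(kmf.median_survival_time_)          # Kaplan-Meier summary for one graft type

# Log-rank comparison of the two survival curves, analogous to a registry graft-survival comparison
result = logrank_test(t_pk, t_ek, event_observed_A=e_pk, event_observed_B=e_ek)
print(result.p_value)
```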
Bishop, Andrew J.; McDonald, Mark W., E-mail: mwmcdona@iupui.edu; Indiana University Health Proton Therapy Center, Bloomington, IN
2012-01-01
Purpose: To evaluate the incidence of infant brain tumors and survival outcomes by disease and treatment variables. Methods and Materials: The Surveillance, Epidemiology, and End Results (SEER) Program November 2008 submission database provided age-adjusted incidence rates and individual case information for primary brain tumors diagnosed between 1973 and 2006 in infants less than 12 months of age. Results: Between 1973 and 1986, the incidence of infant brain tumors increased from 16 to 40 cases per million (CPM), and from 1986 to 2006, the annual incidence rate averaged 35 CPM. Leading histologies by annual incidence in CPM were gliomas (13.8), medulloblastoma and primitive neuroectodermal tumors (6.6), and ependymomas (3.6). The annual incidence was higher in whites than in blacks (35.0 vs. 21.3 CPM). Infants with low-grade gliomas had the highest observed survival, and those with atypical teratoid rhabdoid tumors (ATRTs) or primary rhabdoid tumors of the brain had the lowest. Between 1979 and 1993, the annual rate of cases treated with radiation within the first 4 months from diagnosis declined from 20.5 CPM to <2 CPM. For infants with medulloblastoma, desmoplastic histology and treatment with both surgery and upfront radiation were associated with improved survival, but on multivariate regression, only combined surgery and radiation remained associated with improved survival, with a hazard ratio for death of 0.17 compared with surgery alone (p = 0.005). For ATRTs, those treated with surgery and upfront radiation had a 12-month survival of 100% compared with 24.4% for those treated with surgery alone (p = 0.016). For ependymomas survival was higher in patients treated in more recent decades (p = 0.001). Conclusion: The incidence of infant brain tumors has been stable since 1986. Survival outcomes varied markedly by histology. For infants with medulloblastoma and ATRTs, improved survival was observed in patients treated with both surgery and early radiation compared with those treated with surgery alone.
Nemes, Balázs; Gámán, György; Polak, Wojciech G; Gelley, Fanni; Hara, Takanobu; Ono, Shinichiro; Baimakhanov, Zhassulan; Piros, Laszlo; Eguchi, Susumu
2016-07-01
Extended-criteria donors (ECDs) have an impact on early allograft dysfunction (EAD), biliary complications, relapse of hepatitis C virus (HCV), and survival. Early allograft dysfunction was frequently seen in grafts with moderate and severe steatosis. Donors after cardiac death (DCD) have been associated with higher rates of graft failure and biliary complications compared to donors after brain death. Extended warm ischemia, reperfusion injury and endothelial activation trigger a cascade leading to microvascular thrombosis, resulting in biliary necrosis, cholangitis, and graft failure. The risk of HCV recurrence increased with donor age and was associated with the use of moderately and severely steatotic grafts. With the administration of protease inhibitors, sustained virological response was achieved in the majority of patients. The donor risk index and extended-criteria donor score (DS) are reported to be useful for assessing outcome. The 1-year survival rates were 87% and 40%, respectively, for donors with a DS of 0 and 3. Graft survival was excellent up to a DS of 2; however, a DS >2 should be avoided in higher-risk recipients. The 1-, 3- and 5-year survival of DCD recipients was comparable to that of recipients of optimal donors; however, recipients of ECD grafts had somewhat lower survival rates of 85%, 78.6%, and 72.3%, respectively. The graft survival of split liver transplantation (SLT) was comparable to that of whole-liver orthotopic liver transplantation, and SLT is no longer regarded as an ECD factor in the MELD era. Full-right-full-left split liver transplantation has a significant advantage in extending the pool of high-quality donors. Hypothermic oxygenated machine perfusion can be applied clinically to DCD liver grafts; its feasibility and safety have been confirmed, and reperfusion injury was rare in machine-perfused DCD livers.
McKenzie, Nicole; Williams, Teresa A; Ho, Kwok M; Inoue, Madoka; Bailey, Paul; Celenza, Antonio; Fatovich, Daniel; Jenkins, Ian; Finn, Judith
2018-05-02
To compare survival outcomes of adults with out-of-hospital cardiac arrest (OHCA) of medical aetiology directly transported to a percutaneous-coronary-intervention-capable (PCI-capable) hospital (direct transport) with patients transferred to a PCI-capable hospital via another hospital without PCI services available (indirect transport) by emergency medical services (EMS). This retrospective cohort study used the St John Ambulance Western Australia OHCA Database and medical chart review. We included OHCA patients (≥18 years) admitted to any one of five PCI-capable hospitals in Perth between January 2012 and December 2015. Survival to hospital discharge (STHD) and survival up to 12 months after OHCA were compared between the direct and indirect transport groups using multivariable logistic and Cox proportional hazards regression, respectively, while adjusting for so-called "Utstein variables" and other potential confounders. Of the 509 included patients, 404 (79.4%) were directly transported to a PCI-capable hospital and 105 (20.6%) were transferred via another hospital to a PCI-capable hospital; 274/509 (53.8%) patients survived to hospital discharge and 253/509 (49.7%) survived to 12 months after OHCA. Directly transported patients were twice as likely to survive to hospital discharge (adjusted odds ratio 1.97, 95% confidence interval [CI] 1.13-3.43) as those transferred via another hospital. Indirect transport was also associated with a possible increased risk of death up to 12 months, compared to direct transport (adjusted hazard ratio 1.36, 95% CI 1.00-1.84). Direct transport to a PCI-capable hospital for post-resuscitation care is associated with a survival advantage for adults with OHCA of medical aetiology. This has implications for EMS transport protocols for patients with OHCA. Copyright © 2018 Elsevier B.V. All rights reserved.
Agaku, Israel T; Adisa, Akinyele O
2014-04-01
Nativity status is a major determinant of health and healthcare access in the United States. This study compared oral squamous cell carcinoma (OSCC) survival between US-born and foreign-born patients. Data were obtained from the 1988-2008 Surveillance, Epidemiology and End Results database. A Cox proportional hazards multivariate model was used to assess the effect of birthplace on OSCC survival, adjusting for other sociodemographic and clinical covariates. US-born patients had a higher median survival time (19.3 years; 95% confidence interval [CI]: 18.6-19.7) compared to foreign-born patients (10.7 years; 95% CI: 10.1-11.3). After adjusting for other factors, being born in the US conferred a modest protective effect from OSCC mortality (hazard ratio [HR] = 0.93, 95% CI: 0.87-0.99). Other factors that conferred better survival included involvement of paired structures (HR = 0.65; 95% CI: 0.58-0.74), lip involvement rather than tongue lesions (HR = 0.76; 95% CI: 0.71-0.82), and receipt of either surgery (HR = 0.89; 95% CI: 0.84-0.94) or radiation therapy (HR = 0.92; 95% CI: 0.87-0.97). US-born patients had significantly better OSCC survival compared to their foreign-born counterparts. This underscores the need for enhanced and sustained efforts to improve access to healthcare among immigrant populations. In addition, oral health professionals such as general dentists, oral pathologists, and oral surgeons providing care to immigrant patients should ensure that reasonable efforts are made to communicate effectively with patients with language barriers, especially in high-stake conditions such as cancer. This may help increase such patients' awareness of treatment provided and the critical issues regarding cancer care, resulting in enhanced treatment outcome.
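A Cox proportional hazards model of this form, with birthplace and other covariates as predictors of survival, can be sketched as follows. The data frame, column names, and any coefficients produced are purely illustrative assumptions, not the SEER analysis itself.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 400
# Invented patient-level data; column names are illustrative only
df = pd.DataFrame({
    "us_born": rng.integers(0, 2, n),
    "age": rng.normal(60, 10, n).round(),
    "received_surgery": rng.integers(0, 2, n),
})
df["survival_months"] = rng.exponential(120, n) * np.exp(
    0.1 * df["us_born"] + 0.1 * df["received_surgery"])  # loosely tied to covariates
df["died"] = rng.integers(0, 2, n)

cph = CoxPHFitter()
cph.fit(df, duration_col="survival_months", event_col="died")
print(cph.hazard_ratios_)   # hazard ratios analogous to those reported in the abstract
```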
Survival after out-of-hospital cardiac arrest in relation to sex: a nationwide registry-based study.
Wissenberg, Mads; Hansen, Carolina Malta; Folke, Fredrik; Lippert, Freddy K; Weeke, Peter; Karlsson, Lena; Rajan, Shahzleen; Søndergaard, Kathrine Bach; Kragholm, Kristian; Christensen, Erika Frischknecht; Nielsen, Søren L; Køber, Lars; Gislason, Gunnar H; Torp-Pedersen, Christian
2014-09-01
Crude survival has increased following an out-of-hospital cardiac arrest (OHCA). We aimed to study sex-related differences in patient characteristics and survival during a 10-year study period. Patients≥12 years old with OHCA of a presumed cardiac cause, and in whom resuscitation was attempted, were identified through the Danish Cardiac Arrest Registry 2001-2010. A total of 19,372 patients were included. One-third were female, with a median age of 75 years (IQR 65-83). Compared to females, males were five years younger; and less likely to have severe comorbidities, e.g., chronic obstructive pulmonary disease (12.8% vs. 16.5%); but more likely to have arrest outside of the home (29.4% vs. 18.7%), receive bystander CPR (32.9% vs. 25.9%), and have a shockable rhythm (32.6% vs. 17.2%), all p<0.001. Thirty-day crude survival increased in males (3.0% in 2001 to 12.9% in 2010); and in females (4.8% in 2001 to 6.7% in 2010), p<0.001. Multivariable logistic regression analyses adjusted for patient characteristics including comorbidities, showed no survival difference between sexes in patients with a non-shockable rhythm (OR 1.00; CI 0.72-1.40), while female sex was positively associated with survival in patients with a shockable rhythm (OR 1.31; CI 1.07-1.59). Analyses were rhythm-stratified due to interaction between sex and heart rhythm; there was no interaction between sex and calendar-year. Temporal increase in crude survival was more marked in males due to poorer prognostic characteristics in females with a lower proportion of shockable rhythm. In an adjusted model, female sex was positively associated with survival in patients with a shockable rhythm. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
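The rhythm-stratified analysis follows the usual pattern of first testing a sex-by-rhythm interaction in a multivariable logistic model and then fitting separate models within each rhythm stratum. Below is a hedged sketch with statsmodels on simulated data; the variable names and effect sizes are assumptions, not registry values.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 5000
df = pd.DataFrame({
    "female": rng.integers(0, 2, n),
    "shockable": rng.integers(0, 2, n),
    "age": rng.normal(70, 12, n),
})
# Simulated 30-day survival with a built-in sex-by-rhythm interaction
lin = -3 + 1.5 * df["shockable"] + 0.3 * df["female"] * df["shockable"] - 0.02 * (df["age"] - 70)
df["survived"] = rng.binomial(1, 1 / (1 + np.exp(-lin)))

# Full model with the interaction term, then rhythm-stratified models as in the abstract
full = smf.logit("survived ~ female * shockable + age", data=df).fit(disp=0)
print(full.params["female:shockable"])

for rhythm, grp in df.groupby("shockable"):
    m = smf.logit("survived ~ female + age", data=grp).fit(disp=0)
    print(rhythm, np.exp(m.params["female"]))   # odds ratio for female sex in each stratum
```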
Shi, Minghan; Fortin, David; Sanche, Léon; Paquette, Benoit
2015-01-01
The prognosis for patients with glioblastoma remains poor with current treatments. Although platinum-based drugs are sometimes offered at relapse, their efficacy in this setting is still disputed. In this study, we use convection-enhanced delivery (CED) to deliver the platinum-based drugs (cisplatin, carboplatin, and Lipoplatin™, a liposomal formulation of cisplatin) directly into the tumor of F98 glioma-bearing rats that were subsequently treated with γ radiation (15 Gy). CED increased the concentration of these platinum-based drugs in the brain tumor by factors of 17 to 111 compared to intravenous (i.v.) administration, and by 9- to 34-fold compared to intra-arterial (i.a.) administration. Furthermore, CED resulted in better systemic tolerance to platinum drugs compared to their i.a. injection. Among the drugs tested, carboplatin showed the highest maximum tolerated dose (MTD). Treatment with carboplatin resulted in the best median survival time (MeST) (38.5 days), which was further increased by the addition of radiotherapy (54.0 days). Although DNA-bound platinum adducts were higher at 4 h after CED than at 24 h in the carboplatin group, combination with radiotherapy led to a similar improvement of median survival time. However, less toxicity was observed in animals irradiated 24 h after CED-based chemotherapy. In conclusion, CED increased the accumulation of platinum drugs in the tumor, reduced toxicity, and resulted in a higher median survival time. The best treatment was obtained in animals treated with carboplatin and irradiated 24 h later. PMID:25784204
Impact of breast cancer subtypes on 3-year survival among adolescent and young adult women
2013-01-01
Introduction Young women have poorer survival after breast cancer than do older women. It is unclear whether this survival difference relates to the unique distribution of hormone receptor (HR) and human epidermal growth factor receptor 2 (HER2)-defined molecular breast cancer subtypes among adolescent and young adult (AYA) women aged 15 to 39 years. The purpose of our study was to examine associations between breast cancer subtypes and short-term survival in AYA women, as well as to determine whether the distinct molecular subtype distribution among AYA women explains the unfavorable overall breast cancer survival statistics reported for AYA women compared with older women. Methods Data for 5,331 AYA breast cancers diagnosed between 2005 and 2009 were obtained from the California Cancer Registry. Survival by subtype (triple-negative; HR+/HER2-; HR+/HER2+; HR-/HER2+) and age-group (AYA versus 40- to 64-year-olds) was analyzed with Cox proportional hazards regression with follow-up through 2010. Results With up to 6 years of follow-up and a mean survival time of 3.1 years (SD = 1.5 years), AYA women diagnosed with HR-/HER2+ and triple-negative breast cancer experienced a 1.6-fold and 2.7-fold increased risk of death, respectively, from all causes (HR-/HER2+ hazard ratio: 1.55; 95% confidence interval (CI): 1.10 to 2.18; triple-negative HR: 2.75; 95% CI, 2.06 to 3.66) and breast cancer (HR-/HER2+ hazard ratio: 1.63; 95% CI, 1.12 to 2.36; triple-negative hazard ratio: 2.71; 95% CI, 1.98 to 3.71) than AYA women with HR+/HER2- breast cancer. AYA women who resided in lower socioeconomic status neighborhoods, had public health insurance, and were of Black, compared with White, race/ethnicity experienced worse survival. This race/ethnicity association was attenuated somewhat after adjusting for breast cancer subtypes (hazard ratio, 1.33; 95% CI, 0.98 to 1.82). AYA women had similar all-cause and breast cancer-specific short-term survival as older women for all breast cancer subtypes and across all stages of disease. Conclusions Among AYA women with breast cancer, short-term survival varied by breast cancer subtypes, with the distribution of breast cancer subtypes explaining some of the poorer survival observed among Black, compared with White, AYA women. Future studies should consider whether distribution of breast cancer subtypes and other factors, including differential receipt of treatment regimens, influences long-term survival in young compared with older women. PMID:24131591
Zhang, Lei; Kundu, Soumi; Feenstra, Tjerk; Li, Xiujuan; Jin, Chuan; Laaniste, Liisi; El Hassan, Tamador Elsir Abu; Ohlin, K Elisabet; Yu, Di; Olofsson, Tommie; Olsson, Anna-Karin; Pontén, Fredrik; Magnusson, Peetra U; Nilsson, Karin Forsberg; Essand, Magnus; Smits, Anja; Dieterich, Lothar C; Dimberg, Anna
2015-12-08
Glioblastomas are aggressive astrocytomas characterized by endothelial cell proliferation and abnormal vasculature, which can cause brain edema and increase patient morbidity. We identified the heparin-binding cytokine pleiotrophin as a driver of vascular abnormalization in glioma. Pleiotrophin abundance was greater in high-grade human astrocytomas and correlated with poor survival. Anaplastic lymphoma kinase (ALK), which is a receptor that is activated by pleiotrophin, was present in mural cells associated with abnormal vessels. Orthotopically implanted gliomas formed from GL261 cells that were engineered to produce pleiotrophin showed increased microvessel density and enhanced tumor growth compared with gliomas formed from control GL261 cells. The survival of mice with pleiotrophin-producing gliomas was shorter than that of mice with gliomas that did not produce pleiotrophin. Vessels in pleiotrophin-producing gliomas were poorly perfused and abnormal, a phenotype that was associated with increased deposition of vascular endothelial growth factor (VEGF) in direct proximity to the vasculature. The growth of pleiotrophin-producing GL261 gliomas was inhibited by treatment with the ALK inhibitor crizotinib, the ALK inhibitor ceritinib, or the VEGF receptor inhibitor cediranib, whereas control GL261 tumors did not respond to either inhibitor. Our findings link pleiotrophin abundance in gliomas with survival in humans and mice, and show that pleiotrophin promotes glioma progression through increased VEGF deposition and vascular abnormalization. Copyright © 2015, American Association for the Advancement of Science.
Selenium and mercury have a synergistic negative effect on fish reproduction.
Penglase, S; Hamre, K; Ellingsen, S
2014-04-01
Selenium (Se) can reduce the negative impacts of mercury (Hg) toxicity on growth and survival, but little is known about how these two elements interact in reproduction. In the following study we explored the effects of organic Hg and Se on the growth, survival and reproduction of female zebrafish (Danio rerio). Fish were fed one of four diets from 73 until 226 dpf in a 2 × 2 factorial design, using selenomethionine (SeMet) and methylmercury (MeHg) as the Se and Hg sources, respectively. Each diet contained Se at either requirement (0.7 mg Se/kg DM) or elevated levels (10 mg Se/kgDM), and Hg at either low (0.05 mg Hg/kg DM) or elevated (12 mg Hg/kg DM) levels. Between 151 and 206 dpf the female fish were pairwise crossed against untreated male fish and the mating success, fecundity, embryo survival, and subsequent overall reproductive success were measured. Elevated dietary Se reduced Hg levels in both the adult fish and their eggs. Elevated dietary Hg and Se increased egg Se levels to a greater extent than when dietary Se was elevated alone. At elevated maternal intake levels, egg concentrations of Se and Hg reflected the maternal dietary levels and not the body burdens of the adult fish. Elevated dietary Hg reduced the growth and survival of female fish, but these effects were largely prevented with elevated dietary Se. Elevated dietary Se alone did not affect fish growth or survival. Compared to other treatments, elevated dietary Hg alone increased both mating and overall reproductive success with <100 days of exposure, but decreased these parameters with >100 days exposure. Elevated dietary Se decreased fecundity, embryo survival, and overall reproductive success. The combination of elevated Se and Hg had a synergistic negative effect on all aspects of fish reproduction compared to those groups fed elevated levels of either Se or Hg. Overall the data demonstrate that while increased dietary Se may reduce adverse effects of Hg on the growth and survival in adult fish, it can negatively affect fish reproductive potential, and the effect on reproduction is enhanced in the presence of elevated Hg. Copyright © 2014 Elsevier B.V. All rights reserved.
Kovic, Bruno; Guyatt, Gordon; Brundage, Michael; Thabane, Lehana; Bhatnagar, Neera; Xie, Feng
2016-01-01
Introduction There is an increasing number of new oncology drugs being studied, approved and put into clinical practice based on improvement in progression-free survival, when no overall survival benefits exist. In oncology, the association between progression-free survival and health-related quality of life is currently unknown, despite its importance for patients with cancer, and the unverified assumption that longer progression-free survival indicates improved health-related quality of life. Thus far, only 1 study has investigated this association, providing insufficient evidence and inconclusive results. The objective of this study protocol is to provide increased transparency in supporting a systematic summary of the evidence bearing on this association in oncology. Methods and analysis Using the OVID platform in MEDLINE, Embase and Cochrane databases, we will conduct a systematic review of randomised controlled human trials addressing oncology issues published starting in 2000. A team of reviewers will, in pairs, independently screen and abstract data using standardised, pilot-tested forms. We will employ numerical integration to calculate mean incremental area under the curve between treatment groups in studies for health-related quality of life, along with total related error estimates, and a 95% CI around incremental area. To describe the progression-free survival to health-related quality of life association, we will construct a scatterplot for incremental health-related quality of life versus incremental progression-free survival. To estimate the association, we will use a weighted simple regression approach, comparing mean incremental health-related quality of life with either median incremental progression-free survival time or the progression-free survival HR, in the absence of overall survival benefit. Discussion Identifying direction and magnitude of association between progression-free survival and health-related quality of life is critically important in interpreting results of oncology trials. Systematic evidence produced from our study will contribute to improvement of patient care and practice of evidence-based medicine in oncology. PMID:27591026
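The protocol's two computational steps, numerical integration of the health-related quality-of-life curve and a weighted regression of incremental quality of life on incremental progression-free survival, can be illustrated as follows. The trapezoidal rule, the choice of trial sample size as the weight, and all numbers in this sketch are assumptions for illustration rather than the authors' specification.

```python
import numpy as np
import statsmodels.api as sm

def trapezoid_auc(times, scores):
    """Area under a quality-of-life curve by the trapezoidal rule."""
    times, scores = np.asarray(times, float), np.asarray(scores, float)
    return float(np.sum((scores[1:] + scores[:-1]) / 2.0 * np.diff(times)))

# Invented mean HRQoL trajectories (months, score) for the two arms of one trial
t = [0, 3, 6, 9, 12]
incremental_auc = trapezoid_auc(t, [70, 72, 71, 69, 68]) - trapezoid_auc(t, [70, 68, 66, 64, 63])
print(incremental_auc)

# Across trials: weighted regression of incremental HRQoL on incremental PFS
incr_pfs = np.array([1.2, 2.5, 0.4, 3.1, 1.8])        # months, invented
incr_qol = np.array([15.0, 30.0, -2.0, 40.0, 20.0])   # AUC units, invented
weights = np.array([120, 300, 80, 450, 200])          # assumed weights (e.g. trial size)
fit = sm.WLS(incr_qol, sm.add_constant(incr_pfs), weights=weights).fit()
print(fit.params)   # slope summarises the PFS-HRQoL association of interest
```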
Han, Hyuk-Soo; Kang, Seung-Baik
2013-05-01
The long-term survivorship of TKA in Asian countries is comparable to that in Western countries. High-flexion TKA designs were introduced to improve flexion after TKA. However, several studies suggest high-flexion designs are at greater risk of femoral component loosening compared with conventional TKA designs. We previously reported a revision rate of 21% at 11 to 45 months; this report is intended as a followup to that study. Do implant survival and function decrease with time and do high-flexion activities increase the risk of premature failure? We prospectively followed 72 Nexgen LPS-flex fixed TKAs in 47 patients implanted by a single surgeon between March 2003 and September 2004. We determined the probability of survival using revision as an end point and compared survival between those who could and those who could not perform high-flexion activities. Minimum followup was 0.9 years (median, 6.5 years; range, 0.9-8.6 years). Twenty-five patients (33 knees) underwent revision for aseptic loosening of the femoral component at a mean of 4 years (range, 1-8 years). The probability of revision-free survival for aseptic loosening was 67% and 52% at 5 and 8 years, respectively. Eight-year cumulative survivorship was lower in patients capable of squatting, kneeling, or sitting crosslegged (31% compared with 78%). There were no differences in the pre- and postoperative mean Hospital for Special Surgery scores and maximum knee flexion degrees whether or not high-flexion activities could be achieved. Overall midterm high-flexion TKA survival in our Asian cohort was lower than that of conventional and other high-flexion designs. This unusually high rate of femoral component loosening was associated with postoperative high-flexion activities.
Zheng, Xiangpeng; Schipper, Matthew; Department of Biostatistics, the University of Michigan, Ann Arbor, Michigan
Purpose: This study compared treatment outcomes of stereotactic body radiation therapy (SBRT) with those of surgery in stage I non-small cell lung cancer (NSCLC). Methods and Materials: Eligible studies of SBRT and surgery were retrieved through extensive searches of the PubMed, Medline, Embase, and Cochrane library databases from 2000 to 2012. Original English publications of stage I NSCLC with adequate sample sizes and adequate SBRT doses were included. A multivariate random effects model was used to perform a meta-analysis to compare survival between treatments while adjusting for differences in patient characteristics. Results: Forty SBRT studies (4850 patients) and 23 surgery studies (7071 patients) published in the same period were eligible. The median age and follow-up duration were 74 years and 28.0 months for SBRT patients and 66 years and 37 months for surgery patients, respectively. The mean unadjusted overall survival rates at 1, 3, and 5 years with SBRT were 83.4%, 56.6%, and 41.2% compared to 92.5%, 77.9%, and 66.1% with lobectomy and 93.2%, 80.7%, and 71.7% with limited lung resections. In SBRT studies, overall survival improved with increasing proportion of operable patients. After we adjusted for proportion of operable patients and age, SBRT and surgery had similar estimated overall and disease-free survival. Conclusions: Patients treated with SBRT differ substantially from patients treated with surgery in age and operability. After adjustment for these differences, OS and DFS do not differ significantly between SBRT and surgery in patients with operable stage I NSCLC. A randomized prospective trial is warranted to compare the efficacy of SBRT and surgery.
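The pooling step in a meta-analysis like this one is often illustrated with the DerSimonian-Laird random-effects estimator. The sketch below is a simplified univariate version applied to invented study-level survival proportions; it does not reproduce the authors' multivariate random-effects model, which additionally adjusts for operability and age.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Univariate random-effects pooling (DerSimonian-Laird estimator)."""
    y, v = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)                 # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)            # between-study variance
    w_star = 1.0 / (v + tau2)
    pooled = np.sum(w_star * y) / np.sum(w_star)
    return pooled, np.sqrt(1.0 / np.sum(w_star))

# Invented study-level 3-year overall survival proportions and their variances
surv = [0.55, 0.60, 0.52, 0.78, 0.81]
var = [0.002, 0.003, 0.004, 0.001, 0.002]
pooled, se = dersimonian_laird(surv, var)
print(f"pooled 3-year survival {pooled:.2f} (SE {se:.3f})")
```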
Tehrani, Behnam; Truesdell, Alexander; Singh, Ramesh; Murphy, Charles; Saulino, Patricia
2018-06-28
The development and implementation of a Cardiogenic Shock initiative focused on increased disease awareness, early multidisciplinary team activation, rapid initiation of mechanical circulatory support, and hemodynamic-guided management, with the aim of improving outcomes in cardiogenic shock (CS). The objectives of this study are (1) to collect retrospective clinical outcomes for acute decompensated heart failure cardiogenic shock and acute myocardial infarction cardiogenic shock, and compare current versus historical survival rates and clinical outcomes; (2) to evaluate Inova Heart and Vascular Institute site-specific outcomes before and after initiation of the Cardiogenic Shock team on January 1, 2017; (3) to compare outcomes related to early implementation of mechanical circulatory support and hemodynamic-guided management versus historical controls; (4) to assess the survival to discharge rate in patients receiving intervention from the designated shock team; and (5) to create a clinical archive of Cardiogenic Shock patient characteristics for future analysis and the support of translational research studies. This is an observational, retrospective, single-center study. Retrospective and prospective data will be collected in patients treated at the Inova Heart and Vascular Institute with documented cardiogenic shock as a result of acute decompensated heart failure or acute myocardial infarction. This registry will include data from patients prior to and after the initiation of the multidisciplinary Cardiogenic Shock team on January 1, 2017. Clinical outcomes associated with early multidisciplinary team intervention will be analyzed. In the study group, all patients evaluated for documented cardiogenic shock (acute decompensated heart failure cardiogenic shock, acute myocardial infarction cardiogenic shock) treated at the Inova Heart and Vascular Institute by the Cardiogenic Shock team will be included. An additional historical Inova Heart and Vascular Institute control group will be analyzed as a comparator. Means with standard deviations will be reported for outcomes. For categorical variables, frequencies and percentages will be presented. For continuous variables, the number of subjects, mean, standard deviation, minimum, 25th percentile, median, 75th percentile and maximum will be reported. Reported differences will include standard errors and 95% CI. Preliminary data analysis for the year 2017 has been completed. Compared to a baseline 2016 survival rate of 47.0%, from 2017 to 2018, CS survival rates were increased to 57.9% (58/110) and 81.3% (81/140), respectively (P=.01 for both). Study data will continue to be collected until December 31, 2018. The preliminary results of this study demonstrate that the INOVA SHOCK team approach to the treatment of Cardiogenic Shock with early team activation, rapid initiation of mechanical circulatory support, hemodynamic-guided management, and strict protocol adherence is associated with superior clinical outcomes: survival to discharge and overall survival when compared to 2015 and 2016 outcomes prior to Shock team initiation. Factors that may limit the generalization of these results to other populations are site specific: the expertise of the team, strict algorithm adherence based on the INOVA SHOCK protocol, and staff commitment to timely team activation.
Retrospective clinical outcomes (acute decompensated heart failure cardiogenic shock, acute myocardial infarction cardiogenic shock) demonstrated an increase in current survival rates, compared to the period before Cardiogenic Shock team initiation, with rapid team activation and diagnosis and timely utilization of mechanical circulatory support. ClinicalTrials.gov NCT03378739; https://clinicaltrials.gov/ct2/show/NCT03378739 (Archived by WebCite at http://www.webcitation.org/701vstDGd). ©Behnam Tehrani, Alexander Truesdell, Ramesh Singh, Charles Murphy, Patricia Saulino. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 28.06.2018.
Gutowski, Stacie M.; Shoemaker, James T.; Templeman, Kellie L.; Wei, Yang; Latour, Robert A.; Bellamkonda, Ravi V.; LaPlaca, Michelle C.; García, Andrés J.
2015-01-01
Neural electrodes are an important part of brain-machine interface devices that can restore functionality to patients with sensory and movement disorders. Chronically implanted neural electrodes induce an unfavorable tissue response which includes inflammation, scar formation, and neuronal cell death, eventually causing loss of electrode function. We developed a poly(ethylene glycol) hydrogel coating for neural electrodes with non-fouling characteristics, incorporated an anti-inflammatory agent, and engineered a stimulus-responsive degradable portion for on-demand release of the anti-inflammatory agent in response to inflammatory stimuli. This coating reduces in vitro glial cell adhesion, cell spreading, and cytokine release compared to uncoated controls. We also analyzed the in vivo tissue response using immunohistochemistry and microarray qRT-PCR. Although no differences were observed among coated and uncoated electrodes for inflammatory cell markers, lower IgG penetration into the tissue around PEG+IL-1Ra coated electrodes indicates an improvement in blood-brain barrier integrity. Gene expression analysis showed higher expression of IL-6 and MMP-2 around PEG+IL-1Ra samples, as well as an increase in CNTF expression, an important marker for neuronal survival. Importantly, increased neuronal survival around coated electrodes compared to uncoated controls was observed. Collectively, these results indicate promising findings for an engineered coating to increase neuronal survival and improve tissue response around implanted neural electrodes. PMID:25617126
Viglianti, BL; Lora-Michiels, M; Poulson, JM; Lan, Lan; Yu, D; Sanders, L; Craciunescu, O; Vujaskovic, Z; Thrall, DE; MacFall, J; Charles, HC; Wong, T; Dewhirst, MW
2009-01-01
Purpose This study tests whether DCE-MRI parameters obtained from canine patients with soft tissue sarcomas, treated with hyperthermia and radiotherapy, are predictive of therapeutic outcome. Experimental Design 37 dogs with soft tissue sarcomas had DCE-MRI performed prior to and following the first hyperthermia. Signal enhancement for tumor and reference muscle were fitted empirically, yielding a washin/washout rate for the contrast agent and tumor AUC calculated from 0 to 60 s, 90 s, and the time of maximal enhancement in the reference muscle. These parameters were then compared to local tumor control, metastasis free survival, and overall survival. Results Pre-therapy rate of contrast agent washout was positively predictive of improved overall survival and metastasis free survival, with hazard ratios of 0.67 (p = 0.015) and 0.68 (p = 0.012), respectively. After the first hyperthermia, washin rate, AUC60, AUC90, and AUCt-max were predictive of improved overall survival and metastasis free survival, with hazard ratios ranging from 0.46 to 0.53 (p < 0.002) and 0.44 to 0.55 (p < 0.004), respectively. DCE-MRI parameters were compared with extracellular pH and 31-P-MR spectroscopy results (previously published) in the same patients, demonstrating a correlation. This suggested that an increase in perfusion after therapy was effective in eliminating excess acid from the tumor. Conclusions This study demonstrates that DCE-MRI has utility in predicting overall survival and metastasis free survival in canine patients with soft tissue sarcomas. To our knowledge, this is the first time that DCE-MRI parameters have been shown to be predictive of clinical outcome for soft tissue sarcomas. PMID:19622579
Karlin, Andrew W.; Ly, Trang T.; Pyle, Laura; Forlenza, Gregory P.; Messer, Laurel; Wadwa, R. Paul; DeSalvo, Daniel J.; Payne, Sydney L.; Hanes, Sarah; Clinton, Paula; Buckingham, Bruce
2016-01-01
Abstract Background: Improved insulin infusion set survival and faster insulin action are important issues for pump users and for the development of an artificial pancreas. The current recommendation is to change infusion sets every 3 days. Our objectives were to determine the effect of lipohypertrophy (LH) on infusion set survival and continuous glucose monitoring glucose levels. Research Design and Methods: In this multicenter crossover trial, we recruited 20 subjects (age 28.1 ± 9.0 years) with type 1 diabetes (duration 17.5 ± 8.8 years) and an area of lipohypertrophied tissue >3 cm. Subjects alternated weekly wearing a Teflon infusion set in an area of either LH or non-LH for 4 weeks. Sets were changed after (a) failure or (b) surviving 7 days of use. Results: The least-squares mean duration of infusion set survival for sets that lasted <7 days in lipohypertrophied tissue was 4.31 days compared with 4.12 days in nonlipohypertrophied tissue (P = 0.71). The average duration of set survival for individual subjects ranged from 2.2 to 7.0 days. Infusion sets in lipohypertrophied tissue failed due to hyperglycemia in 35% of subjects compared with 23% in nonlipohypertrophied tissue (P = 0.22). Both lipohypertrophied and nonlipohypertrophied tissues displayed a general increase in mean daily glucose after the third day of infusion set wear, but daily mean glucose did not differ by tissue type (P > 0.38 on each day). Conclusion: LH did not significantly affect infusion set survival or mean glucose. Achieving optimal infusion set performance requires research into factors affecting set survival. Additionally, the recommendation for duration of set change may need to be individualized. PMID:27227290
Propensity-Matched Mortality Comparison of Incident Hemodialysis and Peritoneal Dialysis Patients
Weinhandl, Eric D.; Gilbertson, David T.; Arneson, Thomas J.; Snyder, Jon J.; Collins, Allan J.
2010-01-01
Contemporary comparisons of mortality in matched hemodialysis and peritoneal dialysis patients are lacking. We aimed to compare survival of incident hemodialysis and peritoneal dialysis patients by intention-to-treat analysis in a matched-pair cohort and in subsets defined by age, cardiovascular disease, and diabetes. We matched 6337 patient pairs from a retrospective cohort of 98,875 adults who initiated dialysis in 2003 in the United States. In the primary intention-to-treat analysis of survival from day 0, cumulative survival was higher for peritoneal dialysis patients than for hemodialysis patients (hazard ratio 0.92; 95% CI 0.86 to 1.00, P = 0.04). Cumulative survival probabilities for peritoneal dialysis versus hemodialysis were 85.8% versus 80.7% (P < 0.01), 71.1% versus 68.0% (P < 0.01), 58.1% versus 56.7% (P = 0.25), and 48.4% versus 47.3% (P = 0.50) at 12, 24, 36, and 48 months, respectively. Peritoneal dialysis was associated with improved survival compared with hemodialysis among subgroups with age <65 years, no cardiovascular disease, and no diabetes. In a sensitivity analysis of survival from 90 days after initiation, we did not detect a difference in survival between modalities overall (hazard ratio 1.05; 95% CI 0.96 to 1.16), but hemodialysis was associated with improved survival among subgroups with cardiovascular disease and diabetes. In conclusion, despite hazard ratio heterogeneity across patient subgroups and nonconstant hazard ratios during the follow-up period, the overall intention-to-treat mortality risk after dialysis initiation was 8% lower for peritoneal dialysis than for matched hemodialysis patients. These data suggest that increased use of peritoneal dialysis may benefit incident ESRD patients. PMID:20133483
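Propensity-matched comparisons of this kind pair each peritoneal dialysis patient with a hemodialysis patient who has a similar estimated probability of receiving peritoneal dialysis. A minimal sketch with scikit-learn on invented covariates is shown below; it uses greedy nearest-neighbour matching with replacement, which is simpler than the matched-pair construction used in the study.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(4)
n = 2000
# Invented baseline covariates; 1 = peritoneal dialysis (PD), 0 = hemodialysis (HD)
df = pd.DataFrame({
    "pd": rng.integers(0, 2, n),
    "age": rng.normal(62, 14, n),
    "diabetes": rng.integers(0, 2, n),
    "cvd": rng.integers(0, 2, n),
})

covariates = ["age", "diabetes", "cvd"]
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["pd"])
df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]   # propensity score

treated = df[df["pd"] == 1]
control = df[df["pd"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched_controls = control.iloc[idx.ravel()]              # 1:1 matches (with replacement)

# The matched pairs would then feed an intention-to-treat survival comparison
print(len(treated), "pairs; mean propensity-score gap",
      float(np.abs(treated["ps"].to_numpy() - matched_controls["ps"].to_numpy()).mean()))
```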
Celastrol supports survival of retinal ganglion cells injured by optic nerve crush.
Kyung, Haksu; Kwong, Jacky M K; Bekerman, Vlad; Gu, Lei; Yadegari, Daniel; Caprioli, Joseph; Piri, Natik
2015-06-03
The present study evaluates the effect of celastrol on the survival of retinal ganglion cells (RGCs) injured by optic nerve crush (ONC). Celastrol, a quinine methide triterpene extracted from the perennial vine Tripterygium wilfordii (Celastraceae), has been identified as a potential neuroprotective candidate in a comprehensive drug screen against various neurodegenerative diseases. Two weeks after ONC, the average density of remaining RGCs in retinas of animals treated with daily intraperitoneal (i.p.) injections of celastrol (1 mg/kg) was approximately 1332 cells/mm², or 40.8% of the Celastrol/Control group. In retinas of the Vehicle/ONC group about 381 RGCs/mm² were counted, which is 9.6% of the total number of RGCs in the DMSO/Control group. This corresponds to approximately a 250% increase in RGC survival mediated by celastrol treatment compared to Vehicle/ONC group. Furthermore, the average RGC number in retinas of ONC animals treated with a single intravitreal injection of 1 mg/kg or 5 mg/kg of celastrol was increased by approximately 80% (760 RGCs/mm²) and 78% (753 RGCs/mm²), respectively, compared to Vehicle/ONC controls (422 cells/mm²). Injection of 0.2 mg/kg of celastrol had no significant effect on cell survival, with the average number of RGCs being 514 cells/mm² in celastrol-treated animals versus 422 cells/mm² in controls. The expression levels of Hsp70, Hsf1, Hsf2, HO-1 and TNF-alpha in the retina were analyzed to evaluate the roles of these proteins in the celastrol-mediated protection of injured RGCs. No statistically significant change in HO-1, Hsf1 and Hsp70 levels was seen in animals with ONC. An approximately 2-fold increase in Hsf2 level was observed in celastrol-treated animals with or without injury. Hsf2 level was also increased 1.8-fold in DMSO-treated animals with ONC injury compared to DMSO-treated animals with no injury suggesting that Hsf2 induction has an injury-induced component. Expression of TNF-alpha in retinas of celastrol-treated uninjured and ONC animals was reduced by approximately 2- and 1.5-fold compared to vehicle treated animals, respectively. The observed results suggest that mechanisms underlying celastrol's RGC protective effect are associated with inhibition of TNF-alpha-mediated cell death. Copyright © 2015 Elsevier B.V. All rights reserved.
Schneider, Karolin; Marbaix, Etienne; Bouzin, Caroline; Hamoir, Marc; Mahy, Pierre; Bol, Vanesa; Grégoire, Vincent
2018-03-01
Human papillomavirus (HPV) prevalence in oropharynx squamous cell carcinoma (OPSCC) is on the rise. HPV-linked OPSCCs represent a distinct clinical entity with a better treatment response and patient survival compared to tumors not linked to HPV. An emerging role in treatment response has been attributed to immune cell infiltration in human tumors. In this study, we investigated immune cell infiltration in human SCC of the head and neck region and its relation to overall survival after treatment with surgery (with or without radiotherapy) or concomitant chemo (or cetuximab)-radiotherapy. Paraffin-embedded tumor samples of 136 patients with SCC of the larynx, hypopharynx, oral cavity and oropharynx were processed for immunohistochemical detection of CD3+ T-cells, CD8+ cytotoxic T-cells, CD20+ B-cells and CD163+ M2 macrophages within the tumor infiltrated area. Clinico-pathological data were analyzed as a function of tumor location and p16-status. Immune cell infiltration was represented as stained area on the whole tumor infiltrated area, compared for the different tumor locations and correlated to patient survival. Patients with oropharynx tumors expressing significant p16 levels (p16-sg) had a 5-year overall survival of 85% compared to 43% for patients with no significant p16 (p16-ns) expression (HR: 0.3; 95% CI: 0.1-0.6). Median immune cell infiltration (T- and B-lymphocytes) was significantly elevated in p16-sg oropharyngeal tumors, compared to p16-ns oropharyngeal tumors and to all other head and neck tumor locations. No difference in CD163+ macrophage infiltration was observed across the different patient groups. In the whole population, a high infiltration by CD3+ T-lymphocytes was associated with a significantly (p = .03; HR: 0.6, 95% CI: 0.4-0.97) better overall survival. Oropharynx cancer with significant p16 expression showed an increased overall survival and elevated T- and B-lymphocyte infiltration, which suggests a prognostic relevance of immune cell infiltration.
Probiotics improve survival of septic rats by suppressing conditioned pathogens in ascites
Liu, Da-Quan; Gao, Qiao-Ying; Liu, Hong-Bin; Li, Dong-Hua; Wu, Shang-Wei
2013-01-01
AIM: To investigate the benefits of probiotics treatment in septic rats. METHODS: Sepsis was induced in rats by cecal ligation and puncture. Animals in the control and septic model groups received vehicle, and those in the probiotics-treated group received mixed probiotics. The mixture of probiotics included Bifidobacterium longum, Lactobacillus bulgaricus and Streptococcus thermophilus. We observed the survival of septic rats using different amounts of mixed probiotics. We also detected the bacterial population in ascites and blood of experimental sepsis using cultivation and real-time polymerase chain reaction. The severity of mucosal inflammation in colonic tissues was determined. RESULTS: Probiotics treatment improved survival of the rats significantly and this effect was dose dependent. The survival rate was 30% for the vehicle-treated septic model group. However, 1 and 1/4 doses of probiotics treatment increased the survival rate significantly compared with the septic model group (80% and 55% vs 30%, P < 0.05). The total viable counts of bacteria in ascites decreased significantly in the probiotics-treated group compared with the septic model group (5.20 ± 0.57 vs 9.81 ± 0.67, P < 0.05). The total positive rate of hemoculture decreased significantly in the probiotics-treated group compared with the septic model group (33.3% vs 100.0%, P < 0.05). The populations of Escherichia coli and Staphylococcus aureus in ascites of the probiotics-treated group were decreased significantly compared with those of the septic model group (3.93 ± 0.73 vs 8.80 ± 0.83, P < 0.05; 2.80 ± 1.04 vs 5.39 ± 1.21, P < 0.05). With probiotics treatment, there was a decrease in the scores of inflammatory cell infiltration into the intestinal mucosa in septic animals (1.50 ± 0.25 vs 2.88 ± 0.14, P < 0.01). CONCLUSION: Escherichia coli and Staphylococcus aureus may be primary pathogens in septic rats. Probiotics improve survival of septic rats by suppressing these conditioned pathogens. PMID:23840152
Berthold, E; Månsson, B; Gullstrand, B; Geborek, P; Saxne, T; Bengtsson, A A; Kahn, R
2018-01-01
To study whether serum levels of tumour necrosis factor-α (TNF-α), free or bound to etanercept, in biological-naïve adults with rheumatoid arthritis (RA) could predict the long-term efficacy of etanercept, measured as drug survival. We identified 145 biological-naïve patients with RA starting treatment with etanercept at the Department of Rheumatology, Skåne University Hospital (1999-2008), of whom 16 had seronegative and 129 seropositive RA. TNF-α in serum was quantified using enzyme-linked immunosorbent assay in samples from the onset of treatment and at 6 week follow-up. Drug survival time was used to evaluate the long-term efficacy of etanercept. Levels of TNF-α were significantly increased at follow-up compared to at the start. At the 6 week follow-up, circulating TNF-α mainly comprised TNF-α in complex with etanercept. Longer drug survival time correlated with increased TNF-α at 6 week follow-up in the patients with seronegative RA, but not in the seropositive patients. We demonstrated that levels of circulating TNF-α increased in almost all individuals after initiation of treatment with etanercept and that this increase mainly comprised TNF-α in complex with etanercept. More importantly, this increase may predict drug survival in adults with seronegative, but not seropositive, RA and suggests that measuring TNF-α/etanercept complexes in serum may be relevant in patients with seronegative RA.
Rigby, Elizabeth A.; Haukos, David A.
2015-01-01
Mottled ducks (Anas fulvigula) on the western Gulf Coast have exhibited a steep population decline since the mid 1990s. Low rates of breeding incidence and nest success have been implicated in this decline, but duckling survival and the habitat needs of broods have not been previously investigated in this region. We fitted mottled duck ducklings and adult females with radio transmitters and tracked broods to estimate duckling survival and brood habitat selection on the upper Texas Gulf Coast. Duckling survival to 30 days was high (range among models 0.354–0.567) compared to other dabbling duck species. Estimated fecundity was low (range among models 0.398–0.634), however, indicating that overall reproductive output is low. Within coastal marsh, broods selected home ranges with more water cover and less upland and fresh marsh landcover than was available in the study area. Within coastal marsh home ranges, broods selected for water cover relative to other landcover types, and there was some evidence that broods avoided unvegetated landcover. Although high quality brood habitat is undeniably important, management efforts to increase mottled duck population growth on the western Gulf Coast may best be spent on increasing nesting habitat quality to increase nest success and breeding incidence.
Caffeine-enhanced survival of radiation-sensitive, repair-deficient Chinese hamster cells
Utsumi, H.; Elkind, M.M.
1983-11-01
A clone of V79 Chinese hamster cells (V79-AL162/S-10) with unique properties has been isolated after a challenge of parental cells (V79-AL162) with 1 mM ouabain. Compared with parental cells, or with other clones isolated after the ouabain challenge, these cells form smaller colonies, are more sensitive to both x rays and fission-spectrum neutrons, and respond atypically to a postirradiation treatment with caffeine. Their enhanced response to x rays results mainly from a large reduction in the shoulder of their survival curve, probably because in late S phase, the most resistant phase in the cell cycle, the survival curve of these cells has a reduced shoulder width. Caffeine, and to a lesser extent theophylline, added to the colony-forming medium immediately after exposure appreciably increases the width of the shoulder of these sensitive cells, whereas caffeine has the opposite effect on the response of normal V79 cells. Thus the unique response of the V79-AL162/S-10 cells to a radiation posttreatment with caffeine (increased survival) results from a net increase in their ability to repair damage that is otherwise lethal; caffeine treatment ordinarily prevents normal V79 cells from repairing damage that is only potentially lethal.
Yu, James B; Wilson, Lynn D; Dasgupta, Tina; Castrucci, William A; Weidhaas, Joanne B
2008-07-01
The role of postmastectomy radiotherapy (PMRT) for lymph node-negative locally advanced breast carcinoma (T3N0M0) after modified radical mastectomy (MRM) with regard to improvement in survival remains an area of controversy. The 1973-2004 National Cancer Institute (NCI) Surveillance, Epidemiology, and End Results (SEER) database was examined for patients with T3N0M0 ductal, lobular, or mixed ductal and lobular carcinoma of the breast who underwent MRM, treated from 1988-2003. Patients who were men, who had positive lymph nodes, who survived ≤6 months, for whom breast cancer was not the first malignancy, who had nonbeam radiation, intraoperative or preoperative radiation were excluded. The average treatment effect of PMRT on mortality was estimated with a propensity score case-matched analysis. In all, 1777 patients were identified; 568 (32%) patients received PMRT. Median tumor size was 6.3 cm. The median number of lymph nodes examined was 14 (range, 1-49). Propensity score matched case-control analysis showed no improvement in overall survival with the delivery of PMRT in this group. Older patients, patients with ER- disease (compared with ER+), and patients with high-grade tumors (compared with well differentiated) had increased mortality. The use of PMRT for T3N0M0 breast carcinoma after MRM is not associated with an increase in overall survival. It was not possible to analyze local control in this study given the limitations of the SEER database. The impact of potential improvement in local control as it relates to overall survival should be the subject of further investigation. Copyright © 2008 American Cancer Society.
Finocchiaro, Liliana M E; Fondello, Chiara; Gil-Cardeza, María L; Rossi, Úrsula A; Villaverde, Marcela S; Riveros, María D; Glikin, Gerardo C
2015-06-01
We present here a nonviral immunogene therapy trial for canine malignant melanoma, an aggressive disease displaying significant clinical and histopathological overlapping with human melanoma. As a surgery adjuvant approach, it comprised the co-injection of lipoplexes bearing herpes simplex virus thymidine kinase and canine interferon-β genes at the time of surgery, combined with the periodic administration of a subcutaneous genetic vaccine composed of tumor extracts and lipoplexes carrying the genes of human interleukin-2 and human granulocyte-macrophage colony-stimulating factor. Following complete surgery (CS), the combined treatment (CT) significantly raised the portion of local disease-free canine patients from 11% to 83% and distant metastases-free (M0) from 44% to 89%, as compared with surgery-only-treated controls (ST). Even after partial surgery (PS), CT better controlled the systemic disease (M0: 82%) than ST (M0: 48%). Moreover, compared with ST, CT caused a significant 7-fold (CS) and 4-fold (PS) rise of overall survival, and >17-fold (CS) and >13-fold (PS) rise of metastasis-free survival. The dramatic increase of PS metastasis-free survival (>1321 days) and CS recurrence- and metastasis-free survival (both >2251 days) demonstrated that CT was shifting a rapidly lethal disease into a chronic one. In conclusion, this surgery adjuvant CT was able of significantly delaying or preventing postsurgical recurrence and distant metastasis, increasing disease-free and overall survival, and maintaining the quality of life. The high number of canine patients involved in CT (301) and the extensive follow-up (>6 years) with minimal or absent toxicity warrant the long-term safety and efficacy of this treatment. This successful clinical outcome justifies attempting a similar scheme for human melanoma.
Satyamitra, Merriline; Kumar, Vidya P.; Biswas, Shukla; Cary, Lynnette; Dickson, Leonora; Venkataraman, Srinivasan; Ghosh, Sanchita P.
2017-01-01
Filgrastim (Neupogen®, granulocyte-colony stimulating factor) is among the few countermeasures recommended for management of patients in the event of lethal total-body irradiation. Despite the plethora of studies using filgrastim as a radiation countermeasure, relatively little is known about the optimal dose schedule of filgrastim to mitigate radiation lethality. We evaluated the efficacy of filgrastim in improving 30-day survival of CD2F1 mice irradiated with a lethal dose (LD70/30) in the AFRRI cobalt-60 facility. We tested different schedules of 1, 3, 5, 10 or 16 once-daily injections of filgrastim initiated one day after irradiation. Time optimization studies with filgrastim treatment were also performed, beginning 6–48 h postirradiation. Maximum survival was observed with 3 daily doses of 0.17 mg/kg filgrastim. Survival efficacy of the 3-day treatment was compared against the conventional 16-day filgrastim treatment after irradiation in four mouse strains with varying radiation sensitivities: C3H/HeN, C57BL/6, B6C3F1 and CD2F1. Blood indices, bone marrow histopathology and colony forming unit assays were also evaluated. Filgrastim significantly increased 30-day survival (P < 0.001) with a 3-day treatment compared to 16-day treatment. Filgrastim did not prevent cytopenia nadirs, but facilitated faster recovery of white blood cells, neutrophils, red blood cells, platelets, lymphocytes and hematocrits in all four strains. Accelerated hematopoietic recovery was also reflected in faster bone marrow reconstitution and significant increase in hematopoietic progenitors (P < 0.001) in all four mouse strains. These data indicate that prompt and abbreviated filgrastim treatment has potential benefit for triage in the event of a radiological incident for treating acute hematopoietic syndrome. PMID:28362168
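For reference, comparing 30-day survival between two dosing schedules reduces to a 2×2 contingency test; the sketch below runs Fisher's exact test on made-up survivor and decedent counts (placeholders, not the counts behind the study's P < 0.001 result).

```python
from scipy.stats import fisher_exact

# Hypothetical 30-day outcomes for two filgrastim schedules (illustrative counts).
#              survived  died
three_day   = [      24,    6]
sixteen_day = [      15,   15]

table = [three_day, sixteen_day]
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"Odds ratio = {odds_ratio:.2f}, Fisher exact P = {p_value:.4f}")
```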
Cannon, Richard B; Carpenter, Patrick S; Boothe, Dustin; Buchmann, Luke O; Hunt, Jason P; Lloyd, Shane; Hitchcock, Ying J; Houlton, Jeffrey J; Weis, John R; Shepherd, Hailey M; Monroe, Marcus M
2018-04-01
Objectives To investigate clinicopathologic and treatment factors associated with survival in adult head and neck sarcomas in the National Cancer Database (NCDB). To analyze whether treatment settings and therapies received influence survival outcomes and to compare trends in utilization via an aggregated national data set. Study Design Prospectively gathered data. Setting NCDB. Subjects and Methods The study comprised a total of 6944 adult patients treated for a head and neck sarcoma from January 2004 to December 2013. Overall survival (OS) was the primary outcome. Results Increased age, larger tumor size, nodal involvement, and poorly differentiated histology were associated with significantly reduced OS (P < .001). Angiosarcoma, malignant nerve sheath tumor, malignant fibrous histiocytoma, osteosarcoma, and rhabdomyosarcoma histologic subtypes had significantly reduced OS, while liposarcoma, chondrosarcoma, and chordoma had improved OS (P < .001). Utilization of surgical therapy was associated with improved OS, while positive surgical margins were associated with treatment at a community-based cancer program and with reduced OS (P < .001). On multivariate analysis, treatment with radiation and/or chemotherapy was not significantly associated with OS; however, primary treatment with definitive chemoradiotherapy had significantly reduced OS. Patients treated at academic/research cancer programs (n = 3874) had significantly improved 5- and 10-year OS (65% and 54%, respectively) when compared with patients treated at community-based cancer programs (n = 3027; 49% and 29%; P < .001). The percentage utilization of these programs (56% vs 44%) did not change over the study period. Conclusion For adult head and neck sarcomas, treatment at an academic/research cancer program was associated with improved survival; however, despite increasing medical specialization, the percentage utilization of these programs for this rare tumor remains constant.
Preterm birth-associated cost of early intervention services: an analysis by gestational age.
Clements, Karen M; Barfield, Wanda D; Ayadi, M Femi; Wilber, Nancy
2007-04-01
Characterizing the cost of preterm birth is important in assessing the impact of increasing prematurity rates and evaluating the cost-effectiveness of therapies to prevent preterm delivery. To assess early intervention costs that are associated with preterm births, we estimated the program cost of early intervention services for children who were born in Massachusetts, by gestational age at birth. Using the Pregnancy to Early Life Longitudinal Data Set, birth certificates for infants who were born in Massachusetts between July 1999 and June 2000 were linked to early intervention claims through 2003. We determined total program costs, in 2003 dollars, of early intervention and mean cost per surviving infant by gestational age. Costs by plurality, eligibility criteria, provider discipline, and annual costs for children's first 3 years also were examined. Overall, 14,033 of 76,901 surviving infants received early intervention services. Program costs totaled almost $66 million, with mean cost per surviving infant of $857. Mean cost per infant was highest for children who were 24 to 31 weeks' gestational age ($5393) and higher for infants who were 32 to 36 weeks' gestational age ($1578) compared with those who were born at term ($725). Cost per surviving infant generally decreased with increasing gestational age. Among children in early intervention, mean cost per child was higher for preterm infants than for term infants. At each gestational age, mean cost per surviving infant was higher for multiples than for singletons, and annual early intervention costs were higher for toddlers than for infants. Compared with their term counterparts, preterm infants incurred higher early intervention costs. This information along with data on birth trends will inform budget forecasting for early intervention programs. Costs that are associated with early childhood developmental services must be included when considering the long-term costs of prematurity.
McLay, L K; Green, M P; Jones, T M
2017-07-01
The presence of artificial light at night is expanding in geographical range and increasing in intensity to such an extent that species living in urban environments may never experience natural darkness. The negative ecological consequences of artificial night lighting have been identified in several key life history traits across multiple taxa (albeit with a strong vertebrate focus); comparable data for invertebrates are lacking. In this study, we explored the effect of chronic exposure to different night-time lighting intensities on growth, reproduction and survival in Drosophila melanogaster. We reared three generations of flies under identical daytime light conditions (2600 lx) and one of four ecologically relevant ALAN treatments (0, 1, 10 or 100 lx), then explored variation in oviposition, number of eggs produced, juvenile growth and survival and adult survival. We found that, in the presence of light at night (1, 10 and 100 lx treatments), the probability of a female commencing oviposition and the number of eggs laid were significantly reduced. This did not translate into differences at the juvenile phase: juvenile development times and the probability of eclosing as an adult were comparable across all treatments. However, we demonstrate for the first time a direct link between chronic exposure to light at night (greater than 1 lx) and adult survival. Our data highlight that ALAN has the capacity to cause dramatic shifts in multiple life history traits at both the individual and population level. Such shifts are likely to be species-specific; however, a more in-depth understanding of the broad-scale impact of ALAN and the relevant mechanisms driving biological change is urgently required as we move into an increasingly brightly lit future. Copyright © 2017 Elsevier Ltd. All rights reserved.
Cho, Hyunsoon; Mariotto, Angela B; Schwartz, Lisa M; Luo, Jun; Woloshin, Steven
2014-11-01
It is often assumed that increases in cancer survival reflect true progress against cancer. This is true when these increases are accompanied by decreased burden of disease: Fewer people being diagnosed or dying from cancer (ie, decreased incidence and mortality). But increased survival can also occur even when incidence is increasing and mortality is unchanged. To use trends in cancer burden-incidence and mortality-to illustrate when changes in survival reflect true progress. Using data from 1975 to 2010 collected by the Surveillance, Epidemiology, and End Results Program (incidence, survival) and the National Center for Health Statistics (mortality), we analyzed US trends in five-year relative survival, age-adjusted incidence, and mortality for selected cancers to identify patterns that do and do not reflect progress. Among the nine common cancers examined, survival increased in seven, and changed little or not at all for two. In some cases, increased survival was accompanied by decreased burden of disease, reflecting true progress. For example, from 1975 to 2010, five-year survival for colon cancer patients improved (from 48% to 68%) while cancer burden fell: Fewer cases (incidence decreased from 60 to 41 per 100,000) and fewer deaths (mortality decreased from 28 to 16 per 100,000), a pattern explained by both increased early detection (with removal of cancer precursors) and more effective treatment. In other cases, however, increased survival did not reflect true progress. In melanoma, kidney, and thyroid cancer, five-year survival increased but incidence increased with no change in mortality. This pattern suggests overdiagnosis from increased early detection, an increase in cancer burden. Changes in survival must be interpreted in the context of incidence and mortality. Increased survival only represents progress when accompanied by a reduction in incidence, mortality, or ideally both. Published by Oxford University Press 2014.
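As a point of reference for the statistic discussed above, five-year relative survival is the observed survival of the cancer cohort divided by the expected survival of a demographically matched, cancer-free population; the numbers in the sketch below are invented purely to show the ratio.

```python
# Relative survival = observed survival in the cancer cohort / expected survival
# of a matched general population. Numbers below are illustrative only.
observed_5yr = 0.55   # proportion of patients alive 5 years after diagnosis
expected_5yr = 0.81   # proportion of a matched cancer-free cohort alive at 5 years

relative_survival = observed_5yr / expected_5yr
print(f"Five-year relative survival: {relative_survival:.0%}")
```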
Mariotto, Angela B.; Schwartz, Lisa M.; Luo, Jun; Woloshin, Steven
2014-01-01
Background It is often assumed that increases in cancer survival reflect true progress against cancer. This is true when these increases are accompanied by decreased burden of disease: Fewer people being diagnosed or dying from cancer (ie, decreased incidence and mortality). But increased survival can also occur even when incidence is increasing and mortality is unchanged. Objective To use trends in cancer burden—incidence and mortality—to illustrate when changes in survival reflect true progress. Methods Using data from 1975 to 2010 collected by the Surveillance, Epidemiology, and End Results Program (incidence, survival) and the National Center for Health Statistics (mortality), we analyzed US trends in five-year relative survival, age-adjusted incidence, and mortality for selected cancers to identify patterns that do and do not reflect progress. Results Among the nine common cancers examined, survival increased in seven, and changed little or not at all for two. In some cases, increased survival was accompanied by decreased burden of disease, reflecting true progress. For example, from 1975 to 2010, five-year survival for colon cancer patients improved (from 48% to 68%) while cancer burden fell: Fewer cases (incidence decreased from 60 to 41 per 100000) and fewer deaths (mortality decreased from 28 to 16 per 100000), a pattern explained by both increased early detection (with removal of cancer precursors) and more effective treatment. In other cases, however, increased survival did not reflect true progress. In melanoma, kidney, and thyroid cancer, five-year survival increased but incidence increased with no change in mortality. This pattern suggests overdiagnosis from increased early detection, an increase in cancer burden. Conclusions Changes in survival must be interpreted in the context of incidence and mortality. Increased survival only represents progress when accompanied by a reduction in incidence, mortality, or ideally both. PMID:25417232
Elshaikh, Mohamed A; Ruterbusch, Julie; Cote, Michele L; Cattaneo, Richard; Munkarah, Adnan R
2013-11-01
To study the prognostic impact of the baby boomer (BB) generation on survival end-points of patients with early-stage endometrial carcinoma (EC). Data were obtained from the SEER registry between 1988 and 2009. Inclusion criteria included women who underwent hysterectomy for stage I-II EC. Patients were divided into two birth cohorts: BB (women born between 1946 and 1964) and pre-boomers (PB) (born between 1926 and 1945). A total of 30,956 patients were analyzed. Considering that women in the PB group were older than those of the BB generation, the statistical analysis was limited to women 50-59 years of age at the time of diagnosis (n=11,473). Baby boomers had a significantly higher percentage of endometrioid histology (p<0.0001), a higher percentage of African American women (p<0.0001), lower tumor grade (p<0.0001), a higher number of dissected lymph nodes (LN) (p<0.0001), and less utilization of adjuvant radiation therapy (p=0.0003). Overall survival was improved in women in the BB generation compared to the PB generation (p=0.0003), with a trend for improved uterine cancer-specific survival (p=0.0752). On multivariate analysis, birth cohort (BB vs. PB) was not a significant predictor of survival end-points. Factors predictive of survival included tumor grade, FIGO stage, African-American race, and increased number of dissected LN. Our study suggests that the survival of BB women between 50 and 60 years of age is better than that of women in the PB generation. As more BB patients are diagnosed with EC, further research is warranted.
The effect of the AED and AED programs on survival of individuals, groups and populations.
Stokes, Nathan Allen; Scapigliati, Andrea; Trammell, Antoine R; Parish, David C
2012-10-01
The automated external defibrillator (AED) is a tool that contributes to survival with mixed outcomes. This review assesses the effectiveness of the AED, consistencies and variations among studies, and how varying outcomes can be resolved. A worksheet for the International Liaison Committee on Resuscitation (ILCOR) 2010 science review focused on hospital survival in AED programs was the foundation of the articles reviewed. Articles identified in the search covering a broader range of topics were added. All articles were read by at least two authors; consensus discussions resolved differences. AED use developed sequentially. Use of AEDs by emergency medical technicians (EMTs) compared to manual defibrillators showed equal or superior survival. AED use was extended to trained responders likely to be near victims, such as fire/rescue, police, airline attendants, and casino security guards, with improvement in all venues but not all programs. Broad public access initiatives demonstrated increased survival despite low rates of AED use. Home AED programs have not improved survival; in-hospital trials have had mixed results. Successful programs have placed devices in high-risk sites, maintained the AEDs, recruited a team with a duty to respond, and conducted ongoing assessment of the program. The AED can affect survival among patients with sudden ventricular fibrillation (VF). Components of AED programs that affect outcome include the operator, location, the emergency response system, ongoing maintenance and evaluation. Comparing outcomes is complicated by variations in definitions of populations and variables. The effect of AEDs on individuals can be dramatic, but the effect on populations is limited.
Dzul, Maria C.; Yackulic, Charles B.; Stone, Dennis M.; Van Haverbeke, David R.
2016-01-01
Ecologists estimate vital rates, such as growth and survival, to better understand population dynamics and identify sensitive life history parameters for species or populations of concern. Here, we assess spatiotemporal variation in growth, movement, density, and survival of subadult humpback chub living in the Little Colorado River, Grand Canyon, AZ from 2001–2002 and 2009–2013. We divided the Little Colorado River into three reaches and used a multistate mark-recapture model to determine rates of movement and differences in survival and density between sites for different cohorts. Additionally, site-specific and year-specific effects on growth were evaluated using a linear model. Results indicate that summer growth was higher for upstream sites compared with downstream sites. In contrast, there was not a consistent spatial pattern across years in winter growth; however, river-wide winter growth was negatively related to the duration of floods from 1 October to 15 May. Apparent survival was estimated to be lower at the most downstream site compared with the upstream sites; however, this could be due in part to increased emigration into the Colorado River at downstream sites. Furthermore, the 2010 cohort (i.e. fish that are age 1 in 2010) exhibited high apparent survival relative to other years. Movement between reaches varied with year, and some years exhibited preferential upstream displacement. Improving understanding of spatiotemporal effects on age 1 humpback chub survival can help inform current management efforts to translocate humpback chub into new locations and give us a better understanding of the factors that may limit this tributary's carrying capacity for humpback chub.
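As a rough illustration of what a multistate mark-recapture model parameterizes, the sketch below combines per-interval apparent survival (S) and conditional movement probabilities (ψ) for three reaches plus a dead state into a single transition matrix; every number is invented, not an estimate from this study.

```python
import numpy as np

# Illustrative per-interval survival and movement probabilities (not estimates
# from the Little Colorado River study). States: reach 1, reach 2, reach 3, dead.
S   = np.array([0.80, 0.75, 0.65])            # apparent survival by reach
psi = np.array([[0.85, 0.10, 0.05],           # movement probabilities among
                [0.15, 0.70, 0.15],           # reaches, conditional on survival
                [0.05, 0.20, 0.75]])

# Full transition matrix: survive-and-move, or die.
T = np.zeros((4, 4))
T[:3, :3] = S[:, None] * psi
T[:3, 3] = 1 - S
T[3, 3] = 1.0

# Expected state distribution after two intervals for fish starting in reach 1.
start = np.array([1.0, 0.0, 0.0, 0.0])
print(start @ np.linalg.matrix_power(T, 2))
```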
Bemanian, Amin; Beyer, Kirsten M.M.
2017-01-01
Background The Black to White disparity in breast cancer survival is increasing, and racial residential segregation is a potential driver for this trend. However, study findings have been mixed, and no study has comprehensively compared the effectiveness of different local level segregation metrics in explaining cancer survival. Methods We proposed a set of new local segregation metrics named LEx/Is (Local Exposure and Isolation) and compared our new local isolation metric to two related metrics - the location quotient (LQ) and the index of concentration at extremes (ICE) - across the 102 largest US metropolitan areas. Then, using case data from the Milwaukee, WI metropolitan area, we used proportional hazards models to explore associations between segregation and breast cancer survival. Results Across the 102 metropolitan areas, the new local isolation metric was less skewed than the LQ or ICE. Across all races, Hispanic isolation was associated with poorer all-cause survival, and Hispanic LQ and Hispanic-White ICE were found to be associated with poorer survival for both breast cancer specific and all-cause mortality. For Black patients, Black LQ was associated with lower all-cause mortality and Black local isolation was associated with reduced all-cause and breast cancer specific mortality. ICE was found to suffer from high multicollinearity. Conclusions Local segregation is associated with breast cancer survival, but associations varied based on patient race and metric employed. Impact We highlight how selection of a segregation measure can alter study findings. These relationships need to be validated in other geographic areas. PMID:28325737
Survival of Parents and Siblings of Supercentenarians
Perls, Thomas; Kohler, Iliana V.; Andersen, Stacy; Schoenhofen, Emily; Pennington, JaeMi; Young, Robert; Terry, Dellara; Elo, Irma T.
2011-01-01
Background Given previous evidence of familial predisposition for longevity, we hypothesized that siblings and parents of supercentenarians (age ≥ 110 years) were predisposed to survival to very old age and that, relative to their birth cohorts, their relative survival probabilities (RSPs) are even higher than what has been observed for the siblings of centenarians. Methods Mean age at death conditional upon survival to ages 20 and 50 and survival probabilities from ages 20 and 50 to higher ages were determined for 50 male and 56 female siblings and 54 parents of 29 supercentenarians. These estimates were contrasted with comparable estimates based on birth cohort-specific mortality experience for the United States and Sweden. Results Conditional on survival to age 20 years, mean age at death of supercentenarians’ siblings was ~81 years for men and women. Compared with respective Swedish and U.S. birth cohorts, these estimates were 17%–20% (12–14 years) higher for the brothers and 11%–14% (8–10 years) higher for the sisters. Sisters had a 2.9 times greater probability and brothers had a 4.3 times greater probability of survival from age 20 to age 90. Mothers of supercentenarians had a 5.8 times greater probability of surviving from age 50 to age 90. Fathers also experienced an increased survival probability from age 50 to age 90 of 2.7, but it failed to attain statistical significance. Conclusions The RSPs of siblings and mothers of supercentenarians revealed a substantial survival advantage and were most pronounced at the oldest ages. The RSP to age 90 for siblings of supercentenarians was approximately the same as that reported for siblings of centenarians. It is possible that greater RSPs are observed for reaching even higher ages such as 100 years, but a larger sample of supercentenarians and their siblings and parents is needed to investigate this possibility. PMID:17895443
Survival of parents and siblings of supercentenarians.
Perls, Thomas; Kohler, Iliana V; Andersen, Stacy; Schoenhofen, Emily; Pennington, JaeMi; Young, Robert; Terry, Dellara; Elo, Irma T
2007-09-01
Given previous evidence of familial predisposition for longevity, we hypothesized that siblings and parents of supercentenarians (age ≥110 years) were predisposed to survival to very old age and that, relative to their birth cohorts, their relative survival probabilities (RSPs) are even higher than what has been observed for the siblings of centenarians. Mean age at death conditional upon survival to ages 20 and 50 and survival probabilities from ages 20 and 50 to higher ages were determined for 50 male and 56 female siblings and 54 parents of 29 supercentenarians. These estimates were contrasted with comparable estimates based on birth cohort-specific mortality experience for the United States and Sweden. Conditional on survival to age 20 years, mean age at death of supercentenarians' siblings was approximately 81 years for men and women. Compared with respective Swedish and U.S. birth cohorts, these estimates were 17%-20% (12-14 years) higher for the brothers and 11%-14% (8-10 years) higher for the sisters. Sisters had a 2.9 times greater probability and brothers had a 4.3 times greater probability of survival from age 20 to age 90. Mothers of supercentenarians had a 5.8 times greater probability of surviving from age 50 to age 90. Fathers also experienced an increased survival probability from age 50 to age 90 of 2.7, but it failed to attain statistical significance. The RSPs of siblings and mothers of supercentenarians revealed a substantial survival advantage and were most pronounced at the oldest ages. The RSP to age 90 for siblings of supercentenarians was approximately the same as that reported for siblings of centenarians. It is possible that greater RSPs are observed for reaching even higher ages such as 100 years, but a larger sample of supercentenarians and their siblings and parents is needed to investigate this possibility.
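The relative survival probabilities (RSPs) reported above are ratios of a relative's probability of surviving between two ages to the corresponding birth-cohort probability; the sketch below shows the arithmetic with invented probabilities, so its output is not one of the study's estimates.

```python
# RSP = P(relative survives from age 20 to age 90) / P(same for the birth cohort).
# The probabilities below are invented for illustration.
p_sibling_20_to_90 = 0.24   # hypothetical sibling survival probability
p_cohort_20_to_90  = 0.08   # hypothetical birth-cohort survival probability

rsp = p_sibling_20_to_90 / p_cohort_20_to_90
print(f"Relative survival probability, age 20 to 90: {rsp:.1f}")
```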
Zhao, Zhao-Hua; Deng, Bin; Xu, Hao; Zhang, Jun-Feng; Mi, Ya-Jing; Meng, Xiang-Zhong; Gou, Xing-Chun; Xu, Li-Xian
2017-05-01
Previous studies have shown that paired immunoglobulin-like receptor B (PirB) plays a crucial suppressive role in neurite outgrowth and neuronal plasticity after central nervous system injury. However, the role of PirB in neuronal survival after cerebral ischemic injury, and its mechanisms, remain unclear. In the present study, the role of PirB was investigated in the survival and apoptosis of primary cultured cerebral cortical neurons after oxygen and glucose deprivation (OGD)-induced injury. The results showed that PirB overexpression exacerbates early neuronal apoptosis and impairs survival. PirB gene silencing remarkably decreased early apoptosis and promoted neuronal survival after OGD. The expression of bcl-2 markedly increased and the expression of bax significantly decreased in PirB RNAi-treated neurons, as compared with the control- and control RNAi-treated ones. Further, phosphorylated TrkB and mTOR levels were significantly downregulated in the damaged neurons, whereas PirB silencing markedly upregulated phosphorylated TrkB and mTOR levels in the neurons after OGD. Taken together, the overexpression of PirB inhibits neuronal survival by increasing neuronal apoptosis, and inhibition of TrkB and mTOR phosphorylation may be one of its mechanisms.
Biagi, Federico; Marchese, Alessandra; Ferretti, Francesca; Ciccocioppo, Rachele; Schiepatti, Annalisa; Volta, Umberto; Caio, Giacomo; Ciacci, Carolina; Zingone, Fabiana; D'Odorico, Anna; Carroccio, Antonio; Ambrosiano, Giuseppe; Mansueto, Pasquale; Gasbarrini, Antonio; Piscaglia, Anna Chiara; Andrealli, Alida; Astegiano, Marco; Segato, Sergio; Neri, Matteo; Meggio, Alberto; de Pretis, Giovanni; De Vitis, Italo; Gobbi, Paolo; Corazza, Gino Roberto
2014-08-07
Coeliac disease is a common enteropathy characterized by an increased mortality mainly due to its complications. The natural history of complicated coeliac disease is characterised by two different types of course: patients with a new diagnosis of coeliac disease that do not improve despite a strict gluten-free diet (type A cases) and previously diagnosed coeliac patients that initially improved on a gluten-free diet but then relapsed despite a strict diet (type B cases). Our aim was to study the prognosis and survival of A and B cases. Clinical and laboratory data from coeliac patients who later developed complications (A and B cases) and sex- and age-matched coeliac patients who normally responded to a gluten-free diet (controls) were collected among 11 Italian centres. 87 cases and 136 controls were enrolled. Complications tended to occur rapidly after the diagnosis of coeliac disease and cumulative survival dropped in the first months after diagnosis of complicated coeliac disease. Thirty-seven cases died (30/59 in group A, 7/28 in group B). Type B cases presented an increased survival rate compared to A cases. Complicated coeliac disease is an extremely serious condition with a high mortality and a short survival. Survival depends on the type of natural history.
Effects of Thai piperaceae plant extracts on Neospora caninum infection.
Leesombun, Arpron; Boonmasawai, Sookruetai; Nishikawa, Yoshifumi
2017-06-01
Neosporosis has a worldwide distribution and causes economic losses in farming, particularly by increasing the risk of abortion in cattle. This study investigated the effects of Thai piperaceae (Piper betle, P. nigrum, and P. sarmentosum) extracts on Neospora caninum infections in vitro and in vivo. In an in vitro parasite growth assay based on the green fluorescent protein (GFP) signal, P. betle was the most effective extract at inhibiting parasite growth in human foreskin fibroblast cells (IC50 of GFP-expressing N. caninum parasites, 22.1 μg/ml). The P. betle extract, at 25 μg/ml, inhibited parasite invasion into host cells. Furthermore, in two independent experiments, treating N. caninum-infected mice with the P. betle extract for 7 days post-infection increased their survival. In trial one, the anti-N. caninum effects of the P. betle extract reduced the mouse clinical scores for 30 days post-infection (dpi). The survival rate of the mice treated with 400 mg/kg was 100% compared with 66.6% for those treated with 100 mg/kg and the non-treated controls. In trial two, treating the infected mice with the P. betle extract increased their survival at 50 dpi. All mice in the non-treatment group died; however, the survival rates of the 400 mg/kg-treated and 100 mg/kg-treated mice were 83.3% and 33.3%, respectively. Also, a trend towards a reduced parasite burden was noted in the brains of the P. betle extract-treated mice, compared with the control mice. Therefore P. betle extract has potential as a medicinal plant for treating neosporosis. Copyright © 2017 Elsevier B.V. All rights reserved.
Female reproductive success in a species with an age-inversed hierarchy.
DE Vries, Dorien; Koenig, Andreas; Borries, Carola
2016-11-01
In most group-living mammals, reproductive success declines with increasing age and increases with increasing rank. Such effects have mainly been studied in matrilineal and in "age positive" hierarchies, which are stable and in which high ranking females often outperform low ranking ones. These relationships are less well-understood in age-inversed dominance hierarchies, in which a female's rank changes over time. We analyzed demographic data of 2 wild, unprovisioned groups of gray langurs (Semnopithecus schistaceus) near Ramnagar, Nepal covering periods of 5 years each. Female rank was unstable and age-inversed. We measured reproductive success via birth rates (57 births), infant survival (proportion of infants surviving to 2 years) and number of offspring surviving to 2 years of age (successful births) for 3 age and 3 rank classes. We found that old females performed significantly worse than expected (birth rate P = 0.04; successful births P = 0.03). The same was true for low ranking females (P = 0.04, and P < 0.01, respectively). Infant survival was highest for young and middle-aged as well as for high and middle ranking females. Overall, the results for these unstable hierarchies were rather similar to those for stable hierarchies of other mammals, particularly several nonhuman primates. Compared to a provisioned population of a closely related species, the wild and unprovisioned population examined (i) showed stronger age effects, while (ii) female reproductive success was equally affected by rank. Future comparative studies are needed to examine whether captive or provisioned populations deviate predictably from wild populations. © 2016 International Society of Zoological Sciences, Institute of Zoology/Chinese Academy of Sciences and John Wiley & Sons Australia, Ltd.
Islet grafting and imaging in a bioengineered intramuscular space.
Witkowski, Piotr; Sondermeijer, Hugo; Hardy, Mark A; Woodland, David C; Lee, Keagan; Bhagat, Govind; Witkowski, Kajetan; See, Fiona; Rana, Abbas; Maffei, Antonella; Itescu, Silviu; Harris, Paul E
2009-11-15
Because the hepatic portal system may not be the optimal site for islet transplantation, several extrahepatic sites have been studied. Here, we examine an intramuscular transplantation site, bioengineered to better support islet neovascularization, engraftment, and survival, and we demonstrate that at this novel site, grafted beta cell mass may be quantitated in a real-time noninvasive manner by positron emission tomography (PET) imaging. Streptozotocin-induced diabetic rats were pretreated intramuscularly with a biocompatible angiogenic scaffold and received syngeneic islet transplants 2 weeks later. The recipients were monitored serially by blood glucose and glucose tolerance measurements and by PET imaging of the transplant site with [11C] dihydrotetrabenazine. Parallel histopathologic evaluation of the grafts was performed using insulin staining and evaluation of microvascularity. Reversal of hyperglycemia by islet transplantation was most successful in recipients pretreated with bioscaffolds containing angiogenic factors when compared with those who received no bioscaffolds or bioscaffolds not treated with angiogenic factors. PET imaging with [11C] dihydrotetrabenazine, insulin staining, and microvascular density patterns were consistent with islet survival, increased levels of angiogenesis, and with reversal of hyperglycemia. Induction of increased neovascularization at an intramuscular site significantly improves islet transplant engraftment and survival compared with controls. The use of a nonhepatic transplant site may avoid intrahepatic complications and permit the use of PET imaging to measure and follow transplanted beta cell mass in real time. These findings have important implications for effective islet implantation outside of the liver and offer promising possibilities for improving islet survival, monitoring, and even prevention of islet loss.
Locke, Barbara; Forsgren, Eva; de Miranda, Joachim R.
2014-01-01
The honey bee ectoparasitic mite, Varroa destructor, has a world-wide distribution and inflicts more damage than all other known apicultural diseases. However, Varroa-induced colony mortality is more accurately a result of secondary virus infections vectored by the mite. This means that honey bee resistance to Varroa may include resistance or tolerance to virus infections. The aim of this study was to see if this is the case for a unique population of mite-resistant (MR) European honey bees on the island of Gotland, Sweden. This population has survived uncontrolled mite infestation for over a decade, developing specific mite-related resistance traits to do so. Using RT-qPCR techniques, we monitored late season virus infections, Varroa mite infestation and honey bee colony population dynamics in the Gotland MR population and compared this to mite-susceptible (MS) colonies in a nearby apiary. From summer to autumn the deformed wing virus (DWV) titres increased similarly between the MR and MS populations, while the black queen cell virus (BQCV) and sacbrood virus (SBV) titres decreased substantially in the MR population compared to the MS population by several orders of magnitude. The MR colonies all survived the following winter with high mite infestation, high DWV infection, small colony size and low proportions of autumn brood, while the MS colonies all perished. Possible explanations for these changes in virus titres and their relevance to Varroa resistance and colony winter survival are discussed. PMID:24926792
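Virus titre changes measured by RT-qPCR, like those summarized above, are commonly expressed as fold changes with the 2^-ΔΔCt method (target Ct normalized to a reference gene, then one condition compared with another); the Ct values in the sketch are hypothetical and simply show the calculation.

```python
# 2^-ΔΔCt fold change for a virus target relative to a host reference gene.
# All Ct values are hypothetical.
ct_target_summer, ct_ref_summer = 24.0, 18.0
ct_target_autumn, ct_ref_autumn = 28.5, 18.2

delta_ct_summer = ct_target_summer - ct_ref_summer
delta_ct_autumn = ct_target_autumn - ct_ref_autumn
delta_delta_ct = delta_ct_autumn - delta_ct_summer

fold_change = 2 ** (-delta_delta_ct)
print(f"Autumn vs summer titre fold change: {fold_change:.3f}")
```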
Filgrastim Improves Survival in Lethally Irradiated Nonhuman Primates
Farese, Ann M.; Cohen, Melanie V.; Katz, Barry P.; Smith, Cassandra P.; Gibbs, Allison; Cohen, Daniel M.; MacVittie, Thomas J.
2015-01-01
Treatment of individuals exposed to potentially lethal doses of radiation is of paramount concern to health professionals and government agencies. We evaluated the efficacy of filgrastim to increase survival of nonhuman primates (NHP) exposed to an approximate mid-lethal dose (LD50/60) (7.50 Gy) of LINAC-derived photon radiation. Prior to total-body irradiation (TBI), nonhuman primates were randomized to either a control (n = 22) or a filgrastim-treated (n = 24) cohort. Filgrastim (10 μg/kg/d) was administered beginning 1 day after TBI and continued daily until the absolute neutrophil count (ANC) was >1,000/μL for 3 consecutive days. All nonhuman primates received medical management as per protocol. The primary end point was all-cause overall mortality over the 60-day in-life study. Secondary end points included mean survival time of decedents and all hematologic-related parameters. Filgrastim significantly (P < 0.004) reduced 60-day overall mortality [20.8% (5/24)] compared to the controls [59.1% (13/22)]. Filgrastim significantly decreased the duration of neutropenia, but did not affect the absolute neutrophil count nadir. Febrile neutropenia (ANC <500/μL and body temperature ≥103°F) was experienced by 90.9% (20/22) of controls compared to 79.2% (19/24) of filgrastim-treated animals (P = 0.418). Survival was significantly increased by 38.3% over controls. Filgrastim, administered at this dose and schedule, effectively mitigated the lethality of the hematopoietic subsyndrome of the acute radiation syndrome. PMID:23210705
Gaber, Timo; Tran, Cam Loan; Schellmann, Saskia; Hahne, Martin; Strehl, Cindy; Hoff, Paula; Radbruch, Andreas; Burmester, Gerd-Rüdiger; Buttgereit, Frank
2013-06-01
Inflamed areas are characterized by infiltration of immune cells, local hypoxia and alterations of cellular redox states. We investigated the impact of hypoxia on survival, proliferation, cytokine secretion, intracellular energy and redox state of human CD4(+) T cells. We found that pathophysiological hypoxia (<2% O2) significantly decreased CD4(+) T-cell survival after mitogenic stimulation. This effect was not due to an increased caspase-3/7-mediated apoptosis or adenosine-5'-triphosphate (ATP) consumption/depletion. However, the ability of stimulated T cells to proliferate was reduced under hypoxic conditions, despite increased expression of CD25. Pathophysiological hypoxia was also found to modify intracellular ROS (iROS) levels in stimulated T cells over time as compared with levels found in normoxia. Physiological hypoxia (5% O2) did not decrease CD4(+) T-cell survival and proliferation or modify iROS levels as compared with normoxia. We conclude that pathophysiological hypoxia affects T-cell proliferation and viability via disturbed IL-2R signalling downstream of STAT5a phosphorylation, but not as a result of impaired cellular energy homeostasis. We suggest iROS links early events in T-cell stimulation to the inhibition of the lymphoproliferative response under pathophysiological hypoxic conditions. The level of iROS may therefore act as a mediator of immune functions leading to down-regulation of long-term T-cell activity in inflamed tissues. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Sánchez-Hidalgo, J M; Salamanca-Bustos, J J; Arjona-Sánchez, Á; Campos-Hernández, J P; Ruiz Rabelo, J; Rodríguez-Benot, A; Requena-Tapia, M J; Briceño-Delgado, J
2018-03-01
Some factors affect the pancreas of a marginal donor, and although their influence on graft survival has been determined, there is an increasing consensus to accept marginal organs in a controlled manner to increase the pool of organs. Certain factors related to the recipient have also been proposed as having a negative influence on graft prognosis. The objective of this study was to analyze the influence of these factors on the results of our simultaneous pancreas-kidney (SPK) transplantation series. Retrospective analysis of 126 SPK transplants. Donors and recipients were stratified into an optimal group (<2 expanded donor criteria) and a risk group (≥2 criteria). A pancreatic graft survival analysis was performed using a Kaplan-Meier test and log-rank test. Prognostic variables for graft survival were studied by Cox regression. Postoperative complications (graded by Clavien classification) were compared by χ2 test or Fisher exact test. Median pancreas graft survival was 66 months, with no significant difference between groups (P > .05). Multivariate analysis showed risk factors to be donor age, cold ischemia time, donor body mass index, recipient body mass index, and recipient panel-reactive antibody. In our series, the use of pancreatic grafts from donors with expanded criteria is safe and has increased the pool of grafts. Different variables, both donor and recipient, influence the survival of the pancreatic graft and should be taken into account in organ distribution systems. Copyright © 2017 Elsevier Inc. All rights reserved.
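A Kaplan-Meier and log-rank comparison of graft survival between two donor-risk groups, as described above, can be sketched with the lifelines Python package (assumed available); the follow-up times, events, and group labels below are placeholders, not data from this series.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical graft-survival data: months of follow-up, graft-loss indicator,
# and donor risk group (optimal vs expanded-criteria "risk").
df = pd.DataFrame({
    "months": [12, 30, 66, 80, 24, 50, 70, 15, 40, 90],
    "event":  [ 1,  1,  0,  0,  1,  1,  0,  1,  0,  0],
    "group":  ["risk", "risk", "risk", "risk", "risk",
               "optimal", "optimal", "optimal", "optimal", "optimal"],
})

kmf = KaplanMeierFitter()
for name, sub in df.groupby("group"):
    kmf.fit(sub["months"], event_observed=sub["event"], label=name)
    print(name, "median graft survival (months):", kmf.median_survival_time_)

risk = df[df["group"] == "risk"]
opt = df[df["group"] == "optimal"]
res = logrank_test(risk["months"], opt["months"],
                   event_observed_A=risk["event"], event_observed_B=opt["event"])
print("Log-rank P =", round(res.p_value, 3))
```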
Hydrogen sulfide increases survival during sepsis: Protective effect of CHOP inhibition
Ferlito, Marcella; Wang, Qihong; Fulton, William B; Colombani, Paul; Marchionni, Luigi; Fox-Talbot, Karen; Paolocci, Nazareno; Steenbergen, Charles
2014-01-01
Sepsis is a major cause of mortality, and dysregulation of the immune response plays a central role in this syndrome. Hydrogen sulfide (H2S), a recently discovered gasotransmitter, is endogenously generated by many cell types, regulating a number of physiologic processes and pathophysiologic conditions. Here we report that H2S increased survival after experimental sepsis induced by cecal ligation and puncture (CLP) in mice. Exogenous H2S decreased the systemic inflammatory response, reduced apoptosis in the spleen, and accelerated bacterial eradication. We found that CHOP, a mediator of the endoplasmic reticulum (ER) stress response, was elevated in several organs after CLP and its expression was inhibited by H2S treatment. Using CHOP knockout (KO) mice, we demonstrated for the first time that genetic deletion of Chop increased survival after lipopolysaccharide (LPS) injection or CLP. CHOP KO mice displayed diminished splenic caspase-3 activation and apoptosis, decreased cytokine production and augmented bacterial clearance. Furthermore, septic CHOP KO mice treated with H2S showed no additive survival benefit compared to untreated septic CHOP KO mice. Finally, we showed that H2S inhibited CHOP expression in macrophages by a mechanism involving Nrf2 activation. In conclusion, our findings show a protective effect of H2S treatment afforded, at least partially, by inhibition of CHOP expression. The data reveal a major negative role for the transcription factor CHOP in overall survival during sepsis and suggest a new target for clinical intervention as well as potential strategies for treatment. PMID:24403532
Balamuthusamy, Saravanan; Paramesh, Anil; Zhang, Rubin; Florman, Sander; Shenava, Rajesh; Islam, Tareq; Wagner, Janis; Killackey, Mary; Alper, Brent; Simon, Eric E; Slakey, Douglas
2009-01-01
There is insufficient data on the impact of recipient body mass index (BMI) on the long-term graft survival of adult patients transplanted with single pediatric kidneys. We performed a retrospective analysis of adult patients transplanted with single pediatric kidneys at our center. The recipients were classified into 2 groups: group 1 (BMI ≥30) and group 2 (BMI <30). Donor/recipient demographics, postoperative outcomes and survival rates were compared between the 2 groups. There was no significant difference in donor/recipient demographics between the 2 groups. In group 1, the death-censored graft survival (DCGS) at 1, 3 and 5 years was 90% at all 3 time points, and in group 2 it was 86, 68 and 60%, respectively (p = 0.05). The mean glomerular filtration rate (with standard deviation in parentheses) at 1, 3 and 5 years was, respectively, 55 (15), 59 (19) and 55 (28) ml/min for group 1, compared to 65 (28), 69 (23) and 67 (20) ml/min in group 2 (p = NS). Multivariate analysis revealed a hazard ratio of 5.12 (95% confidence interval 1.06-24.7; p = 0.04) for graft loss in nonobese patients when compared to obese patients. Obese patients had an increased risk for acute rejections within the first month of transplant (p = 0.02). Patients with a BMI ≥30 transplanted with single pediatric kidneys have better DCGS rates when compared to nonobese patients. Copyright © 2008 S. Karger AG, Basel.
Mifuji-Moroka, Rumi; Hara, Nagisa; Miyachi, Hirohide; Sugimoto, Ryosuke; Tanaka, Hideaki; Fujita, Naoki; Gabazza, Esteban C.; Takei, Yoshiyuki
2013-01-01
Long-term supplementation with branched-chain amino acids (BCAA) is associated with prolonged survival and decreased frequency of development of hepatocellular carcinoma (HCC) in patients with liver cirrhosis. However, the pharmaceutical mechanism underlying this association is still unclear. We investigated whether continuous BCAA supplementation increases the survival rate of rats exposed to a fibrogenic agent and influences iron accumulation, oxidative stress, fibrosis, and gluconeogenesis in the liver. Further, the effects of BCAA on gluconeogenesis in cultured cells were also investigated. A significant improvement in cumulative survival was observed in BCAA-supplemented rats with advanced cirrhosis compared to untreated rats with cirrhosis (P<0.05). The prolonged survival due to BCAA supplementation was associated with reduction of iron contents, reactive oxygen species production and attenuated fibrosis in the liver. In addition, BCAA ameliorated glucose metabolism via the forkhead box protein O1 pathway in the liver. BCAA prolongs survival in cirrhotic rats, and this was likely a consequence of reduced iron accumulation, oxidative stress and fibrosis and improved glucose metabolism in the liver. PMID:23936183
Jung, Chiau-Jing; Zheng, Quan-Hau; Shieh, Ya-Hsiung; Lin, Chi-Shuan; Chia, Jean-San
2009-11-01
Streptococcus mutans, a commensal of the human oral cavity, can survive in the bloodstream and cause infective endocarditis (IE). However, the virulence factors associated with this manifestation of disease are not known. Here, we demonstrate that AtlA, an autolysin of S. mutans, is a newly identified fibronectin (Fn)-binding protein and contributes to bacterial resistance to phagocytosis and survival in the bloodstream. Interestingly, prior exposure to plasma at low concentrations was sufficient to enhance bacterial survival in the circulation. Calcium ions at physiological plasma concentrations induced maturation of AtlA from the 104-kDa to the 90-kDa isoform, resulting in increased Fn binding and resistance to phagocytosis. An isogenic mutant strain defective in AtlA expression exhibited reduced survival and virulence when tested in a rat model of IE compared with the wild-type and complemented strains. The data presented suggest that plasma components utilized by S. mutans enhanced survival in the circulation and that AtlA is a virulence factor associated with infective endocarditis.
Motorboat noise impacts parental behaviour and offspring survival in a reef fish.
Nedelec, Sophie L; Radford, Andrew N; Pearl, Leanne; Nedelec, Brendan; McCormick, Mark I; Meekan, Mark G; Simpson, Stephen D
2017-06-14
Anthropogenic noise is a pollutant of international concern, with mounting evidence of disturbance and impacts on animal behaviour and physiology. However, empirical studies measuring survival consequences are rare. We use a field experiment to investigate how repeated motorboat-noise playback affects parental behaviour and offspring survival in the spiny chromis (Acanthochromis polyacanthus), a brooding coral reef fish. Repeated observations were made for 12 days at 38 natural nests with broods of young. Exposure to motorboat-noise playback compared to ambient-sound playback increased defensive acts, and reduced both feeding and offspring interactions by brood-guarding males. Anthropogenic noise did not affect the growth of developing offspring, but reduced the likelihood of offspring survival; while offspring survived at all 19 nests exposed to ambient-sound playback, six of the 19 nests exposed to motorboat-noise playback suffered complete brood mortality. Our study, providing field-based experimental evidence of the consequences of anthropogenic noise, suggests potential fitness consequences of this global pollutant. © 2017 The Authors.
Frascone, Ralph J; Wayne, Marvin A; Swor, Robert A; Mahoney, Brian D; Domeier, Robert M; Olinger, Michael L; Tupper, David E; Setum, Cindy M; Burkhart, Nathan; Klann, Lucinda; Salzman, Joshua G; Wewerka, Sandi S; Yannopoulos, Demetris; Lurie, Keith G; O'Neil, Brian J; Holcomb, Richard G; Aufderheide, Tom P
2013-09-01
A recent out-of-hospital cardiac arrest (OHCA) clinical trial showed improved survival to hospital discharge (HD) with favorable neurologic function for patients with cardiac arrest of cardiac origin treated with active compression decompression cardiopulmonary resuscitation (CPR) plus an impedance threshold device (ACD+ITD) versus standard (S) CPR. The current analysis examined whether treatment with ACD+ITD is more effective than standard CPR (S-CPR) for all cardiac arrests of non-traumatic origin, regardless of the etiology. This is a secondary analysis of data from a randomized, prospective, multicenter, intention-to-treat, OHCA clinical trial. Adults with presumed non-traumatic cardiac arrest were enrolled and followed for one year post arrest. The primary endpoint was survival to hospital discharge (HD) with favorable neurologic function (Modified Rankin Scale score ≤ 3). Between October 2005 and July 2009, 2738 patients were enrolled (S-CPR = 1335; ACD+ITD = 1403). Survival to HD with favorable neurologic function was greater with ACD+ITD compared with S-CPR: 7.9% versus 5.7% (OR 1.42, 95% CI 1.04, 1.95, p=0.027). One-year survival was also greater: 7.9% versus 5.7% (OR 1.43, 95% CI 1.04, 1.96, p=0.026). Nearly all survivors in both groups had returned to their baseline neurological function by one year. Major adverse event rates were similar between groups. Treatment of out-of-hospital non-traumatic cardiac arrest patients with ACD+ITD resulted in a significant increase in survival to hospital discharge with favorable neurological function when compared with S-CPR. A significant increase in survival rates was observed up to one year after arrest in subjects treated with ACD+ITD, regardless of the etiology of the cardiac arrest. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
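The reported odds ratio can be approximately reconstructed from the published proportions and group sizes using the standard log odds-ratio confidence interval; the counts below are rounded from 7.9% of 1403 and 5.7% of 1335, so the output only approximates the published 1.42 (95% CI 1.04 to 1.95).

```python
import math

# Counts implied by the published proportions (rounded): ACD+ITD 111/1403 and
# S-CPR 76/1335 survivors to discharge with favorable neurologic function.
a, b = 111, 1403 - 111        # ACD+ITD: events, non-events
c, d = 76, 1335 - 76          # S-CPR:   events, non-events

odds_ratio = (a / b) / (c / d)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
```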
Obi, Yoshitsugu; Streja, Elani; Mehrotra, Rajnish; Rivara, Matthew B; Rhee, Connie M; Soohoo, Melissa; Gillen, Daniel L; Lau, Wei-Ling; Kovesdy, Csaba P; Kalantar-Zadeh, Kamyar
2018-06-01
The prevalence of severe obesity, often considered a contraindication to peritoneal dialysis (PD), has increased over time. However, mortality has decreased more rapidly in the PD population than the hemodialysis (HD) population in the United States. The association between obesity and clinical outcomes among patients with end-stage kidney disease remains unclear in the current era. Historical cohort study. 15,573 incident PD patients from a large US dialysis organization (2007-2011). Body mass index (BMI). Modality longevity, residual renal creatinine clearance, peritonitis, and survival. Higher BMI was significantly associated with shorter time to transfer to HD therapy (P for trend < 0.001), longer time to kidney transplantation (P for trend < 0.001), and, with borderline significance, more frequent peritonitis-related hospitalization (P for trend = 0.05). Compared with lean patients, obese patients had faster declines in residual kidney function (P for trend < 0.001) and consistently achieved lower total Kt/V over time (P for trend < 0.001) despite greater increases in dialysis Kt/V (P for trend < 0.001). There was a U-shaped association between BMI and mortality, with the greatest survival associated with the BMI range of 30 to <35 kg/m² in the case-mix adjusted model. Compared with matched HD patients, PD patients had lower mortality in the BMI categories of <25 and 25 to <35 kg/m² and had equivalent survival in the BMI category ≥35 kg/m² (P for interaction = 0.001 [vs <25 kg/m²]). This attenuation in survival difference among patients with severe obesity was observed only in patients with diabetes, but not those without diabetes. Inability to evaluate causal associations. Potential indication bias. Whereas obese PD patients had higher risk for complications than nonobese PD patients, their survival was no worse than matched HD patients. Copyright © 2017 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.
Chua, Hui Lin; Plett, P. Artur; Sampson, Carol H.; Katz, Barry P.; Carnathan, Gilbert W.; MacVittie, Thomas J.; Lenden, Keith; Orschell, Christie M.
2013-01-01
In an effort to expand the worldwide pool of available medical countermeasures (MCM) against radiation, the PEGylated G-CSF (PEG-G-CSF) molecules Neulasta and Maxy-G34, a novel PEG-G-CSF designed for increased half-life and enhanced activity compared to Neulasta, were examined in a murine model of the Hematopoietic Syndrome of the Acute Radiation Syndrome (H-ARS), along with the lead MCM for licensure and stockpiling, G-CSF. Both PEG-G-CSFs were shown to retain significant survival efficacy when administered as a single dose 24 h post-exposure, compared to the 16 daily doses of G-CSF required for survival efficacy. Furthermore, 0.1 mg kg−1 of either PEG-G-CSF effected survival of lethally-irradiated mice that was similar to a 10-fold higher dose. The one dose/low dose administration schedules are attractive attributes of radiation MCM given the logistical challenges of medical care in a mass casualty event. Maxy-G34-treated mice that survived H-ARS were examined for residual bone marrow damage (RBMD) up to 9 months post-exposure. Despite differences in Sca-1 expression and cell cycle position in some hematopoietic progenitor phenotypes, Maxy-G34-treated mice exhibited the same degree of hematopoietic stem cell (HSC) insufficiency as vehicle-treated H-ARS survivors in competitive transplantation assays of 150 purified Sca-1+cKit+lin-CD150+ cells. These data suggest that Maxy-G34, at the dose, schedule, and time frame examined, did not mitigate RBMD, but significantly increased survival from H-ARS at one-tenth the dose previously tested, providing strong support for advanced development of Maxy-G34, as well as Neulasta, as MCM against radiation. PMID:24276547
Frascone, Ralph J; Wayne, Marvin A; Swor, Robert A; Mahoney, Brian D; Domeier, Robert M; Olinger, Michael L; Tupper, David E; Setum, Cindy M; Burkhart, Nathan; Klann, Lucinda; Salzman, Joshua G; Wewerka, Sandi S; Yannopoulos, Demetris; Lurie, Keith G; O’Neil, Brian J.; Holcomb, Richard G; Aufderheide, Tom P
2013-01-01
Background A recent out-of-hospital cardiac arrest (OHCA) clinical trial showed improved survival to hospital discharge (HD) with favorable neurologic function for patients with cardiac arrest of cardiac origin treated with active compression decompression cardiopulmonary resuscitation (CPR) plus an impedance threshold device (ACD+ITD) versus standard (S) CPR. The current analysis examined whether treatment with ACD+ITD is more effective than standard CPR (S-CPR) for all cardiac arrests of non-traumatic origin, regardless of the aetiology. Methods This is a secondary analysis of data from a randomized, prospective, multicenter, intention-to-treat, OHCA clinical trial. Adults with presumed non-traumatic cardiac arrest were enrolled and followed for one year post arrest. The primary endpoint was survival to hospital discharge (HD) with favorable neurologic function (modified Rankin Scale score ≤3). Results Between October 2005 and July 2009, 2738 patients were enrolled (S-CPR = 1335; ACD+ITD = 1403). Survival to HD with favorable neurologic function was greater with ACD+ITD compared with S-CPR: 7.9% versus 5.7% (OR 1.42, 95% CI 1.04, 1.95, p=0.027). One-year survival was also greater: 7.9% versus 5.7% (OR 1.43, 95% CI 1.04, 1.96, p=0.026). Nearly all survivors in both groups had returned to their baseline neurological function by one year. Major adverse event rates were similar between groups. Conclusions Treatment of out-of-hospital non-traumatic cardiac arrest patients with ACD+ITD resulted in a significant increase in survival to hospital discharge with favorable neurological function when compared with S-CPR. A significant increase in survival rates was observed up to one year after arrest in subjects treated with ACD+ITD, regardless of the etiology of the cardiac arrest. Clinical Trial Registration NCT 00189423 (http://www.clinicaltrials.gov) PMID:23669489
Mehta, H B; Vargas, G M; Adhikari, D; Dimou, F; Riall, T S
2017-06-01
The objectives were to determine trends in the use of chemotherapy as the initial treatment and to evaluate the comparative effectiveness of initial chemotherapy vs resection of the primary tumour on survival (intention-to-treat analysis) in Stage IV colorectal cancer (CRC). This cohort study used 2000-2011 data from the Surveillance, Epidemiology, and End Results (SEER)-Medicare linked database, including patients ≥ 66 years of age presenting with Stage IV CRC. Cox proportional hazards models and instrumental variable analysis were used to compare the effectiveness of chemotherapy as the initial treatment with resection of the primary tumour as the initial treatment, with 2-year survival as the end point. The use of chemotherapy as the first treatment increased over time, from 26.8% in 2001 to 46.9% in 2009 (P < 0.0001). The traditional Cox model showed that chemotherapy as the initial treatment was associated with a higher risk of mortality [hazard ratio (HR) = 1.35; 95% CI: 1.27-1.44]. When accounting for known and unknown confounders in an instrumental variable analysis, chemotherapy as the initial treatment suggested benefit on 2-year survival (HR = 0.68; 95% CI: 0.44-1.04); however, the association did not reach statistical significance. The study findings were similar in six subgroup analyses. The use of chemotherapy as the initial therapy for CRC increased substantially from 2001 to 2009. Instrumental variable analysis found that, compared with resection, chemotherapy as the initial treatment offers similar or better 2-year survival in patients with Stage IV CRC. Given the morbidity and mortality associated with colorectal resection in elderly patients, chemotherapy provides an option to patients who are not good candidates for resection. Colorectal Disease © 2017 The Association of Coloproctology of Great Britain and Ireland.
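Instrumental variable analyses of treatment effects, as invoked above, are often implemented as two-stage least squares; the sketch below is a deliberately simplified linear-probability version on simulated data (instrument, treatment choice, 2-year survival indicator) and is not the authors' hazard-based specification.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5000

# Simulated data: z = instrument (e.g., regional treatment propensity),
# u = unmeasured confounder, chemo = initial treatment, surv2 = alive at 2 years.
z = rng.random(n)
u = rng.normal(size=n)
chemo = (0.5 * z + 0.3 * u + rng.normal(scale=0.5, size=n)) > 0.5
surv2 = (0.2 * chemo - 0.4 * u + rng.normal(size=n)) > 0

# Stage 1: predict treatment from the instrument.
X1 = sm.add_constant(z)
stage1 = sm.OLS(chemo.astype(float), X1).fit()
chemo_hat = stage1.predict(X1)

# Stage 2: regress the outcome on the predicted treatment.
X2 = sm.add_constant(chemo_hat)
stage2 = sm.OLS(surv2.astype(float), X2).fit()
print("IV estimate of chemo effect on 2-year survival:", round(stage2.params[1], 3))
```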
Zhang, Kecheng; Huang, Xiaohui; Gao, Yunhe; Liang, Wenquan; Xi, Hongqing; Cui, Jianxin; Li, Jiyang; Zhu, Minghua; Liu, Guoxiao; Zhao, Huazhou; Hu, Chong; Liu, Yi; Qiao, Zhi; Wei, Bo; Chen, Lin
2018-01-01
With the increasing incidence of early gastric cancer in the upper stomach, minimally invasive function-preserving gastrectomy has received growing attention. This study aimed to compare oncological outcomes, surgical stress, and nutritional status between robot-assisted proximal gastrectomy (RAPG) and laparoscopy-assisted proximal gastrectomy (LAPG). Eighty-nine patients were enrolled in this retrospective study between November 2011 and December 2013. Among them, 27 patients underwent RAPG and 62 underwent LAPG. Perioperative parameters, surgical stress, nutritional status, disease-free survival, and overall survival were compared between the 2 groups. Sex, age, and comorbidity were similar in the RAPG and LAPG groups. There were also similar perioperative outcomes regarding operation time, complications, and length of hospital stay between the groups. The reflux esophagitis rates following RAPG and LAPG were 18.5% and 14.5%, respectively (P = .842). However, patients in the RAPG group had less blood loss (P = .024), more harvested lymph nodes (P = .021), and higher costs (P < .001) than those in the LAPG group. With regard to surgical stress, no significant differences were observed in C-reactive protein concentrations and white blood cell count on postoperative days 1, 3, and 7 between the groups (Ps > .05). There appeared to be higher hemoglobin levels at 6 months (P = .053) and a higher body mass index at 12 months (P = .056) postoperatively in patients in the RAPG group compared with those in the LAPG group, but these differences were not significant. Similar disease-free survival and overall survival rates were observed between the groups. RAPG could be an alternative to LAPG for patients with early gastric cancer in the upper stomach with comparable oncological safety and nutritional status. Further well-designed, prospective, large-scale studies are needed to validate these results.
Dhahri, Wahiba; Drolet, Marie-Claude; Roussel, Elise; Couet, Jacques; Arsenault, Marie
2014-09-24
The composition of a diet can influence myocardial metabolism and development of left ventricular hypertrophy (LVH). The impact of a high-fat diet in chronic left ventricular volume overload (VO) causing eccentric LVH is unknown. This study examined the effects of chronic ingestion of a high-fat diet in rats with chronic VO caused by severe aortic valve regurgitation (AR) on LVH, function, myocardial energetics and survival. Male Wistar rats were divided into four groups: Shams on control or high-fat (HF) diet (15 rats/group) and AR rats fed with the same diets (ARC (n = 56) and ARHF (n = 32)). The HF diet was started one week before AR induction and the protocol was stopped 30 weeks later. As expected, AR caused significant LV dilation and hypertrophy, and this was exacerbated in the ARHF group. Moreover, survival in the ARHF group was significantly decreased compared with the ARC group. Although the sham animals on HF also developed significant obesity compared to those on the control diet, this was not associated with heart hypertrophy. The HF diet in AR rats partially countered the expected shift in myocardial energy substrate preference usually observed in heart hypertrophy (from fatty acids towards glucose). Systolic function was decreased in AR rats, but the HF diet had no impact on this parameter. The response of different fatty acid oxidation markers to the HF diet, as well as the increase in glucose transporter-4 translocation to the plasma membrane compared with ARC, was blunted in AR animals compared with those on the control diet. The HF diet for 30 weeks decreased survival of AR rats and worsened eccentric hypertrophy without affecting systolic function. The expected adaptation of myocardial energetics to volume-overload left ventricular hypertrophy in AR animals seemed to be impaired by the high-fat diet, suggesting less metabolic flexibility.
Smith, John D; Ibrahim, Mohamed W; Newell, Helen; Danskine, Anna J; Soresi, Simona; Burke, Margaret M; Rose, Marlene L; Carby, Martin
2014-10-01
The impact of Luminex-detected HLA antibodies on outcomes after lung transplantation is unclear. Herein we have undertaken a retrospective study of pre-transplant sera from 425 lung transplants performed between 1991 and 2003. Pre-transplant sera, originally screened by complement-dependent cytotoxicity (CDC) assays, were retrospectively tested for the presence of HLA-specific antibodies using HLA-coated Luminex beads and C4d deposition on Luminex beads. The results were correlated with graft survival at 1 year. Twenty-seven patients were retrospectively identified as having been transplanted against donor-specific HLA antibodies (DSA) and 36 patients against non-donor-specific HLA antibodies (NDSA). DSA-positive patients had 1-year survival of 51.9% compared with 77.8% for NDSA and 71.8% for antibody-negative patients (p = 0.029). One-year survival of patients with complement-fixing DSA was 12.5% compared with 62.5% for non-complement-fixing DSA, 75.8% for non-complement-fixing NDSA and 71.8% for antibody-negative patients (p < 0.0001). DSA-positive patients with mean fluorescence intensity (MFI) >5,000 had 1-year survival of 33.3% compared with 71.4% for MFI 2,000 to 5000 and 62.5% for MFI <2,000 (p = 0.0046). Multivariable analysis revealed DSA to be an independent predictor of poor patient survival within 1 year (p = 0.0010, hazard ratio [HR] = 3.569) as well as complement-fixing DSA (p < 0.0001, HR = 11.083) and DSA with MFI >5,000 (p = 0.0001, HR = 5.512). Pre-formed DSA, particularly complement-fixing DSA, and high MFI are associated with poor survival within the first year after lung transplantation. Risk stratification according to complement fixation or MFI levels may allow for increased transplantation in sensitized patients. Copyright © 2014 International Society for Heart and Lung Transplantation. Published by Elsevier Inc. All rights reserved.
Goeree, Ron; Villeneuve, Julie; Goeree, Jeff; Penrod, John R; Orsini, Lucinda; Tahami Monfared, Amir Abbas
2016-06-01
Background Lung cancer is the most common type of cancer in the world and is associated with significant mortality. Nivolumab demonstrated statistically significant improvements in progression-free survival (PFS) and overall survival (OS) for patients with advanced squamous non-small cell lung cancer (NSCLC) who were previously treated. The cost-effectiveness of nivolumab has not been assessed in Canada. A contentious component of projecting long-term cost and outcomes in cancer relates to the modeling approach adopted, with the two most common approaches being partitioned survival (PS) and Markov models. The objectives of this analysis were to estimate the cost-utility of nivolumab and to compare the results using these alternative modeling approaches. Methods Both PS and Markov models were developed using docetaxel and erlotinib as comparators. A three-health-state model was used, consisting of progression-free, progressed disease, and death. Disease progression and time to progression were estimated by identifying best-fitting survival curves from the clinical trial data for PFS and OS. Expected costs and health outcomes were calculated by combining health-state occupancy with medical resource use and the quality of life assigned to each of the three health states. The health outcomes included in the model were survival and quality-adjusted life-years (QALYs). Results Nivolumab was found to have the highest expected per-patient cost, but also improved per-patient life years (LYs) and QALYs. Nivolumab cost an additional $151,560 and $140,601 per QALY gained compared to docetaxel and erlotinib, respectively, using a PS model approach. The cost-utility estimates using a Markov model were very similar ($152,229 and $141,838, respectively, per QALY gained). Conclusions Nivolumab was found to involve a trade-off between improved patient survival and QALYs, and increased cost. It was found that the use of a PS or Markov model produced very similar estimates of expected cost, outcomes, and incremental cost-utility.
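A minimal sketch of the partitioned survival bookkeeping described above: state occupancy is read directly off the PFS and OS curves (progressed = OS minus PFS) and combined with utilities and costs over a fixed horizon. All curves, utilities, costs and the discount rate below are illustrative placeholders, not the values used in this analysis.

```python
# Three-state partitioned survival sketch: progression-free, progressed, dead.
import numpy as np

months = np.arange(0, 120)                     # 10-year horizon, monthly cycles
years = months / 12.0

# Hypothetical exponential PFS and OS curves (OS must lie above PFS).
pfs = np.exp(-0.12 * months)
os_ = np.exp(-0.05 * months)

progression_free = pfs
progressed = np.clip(os_ - pfs, 0.0, None)     # area between the OS and PFS curves
dead = 1.0 - os_

utilities = {'progression_free': 0.75, 'progressed': 0.55}
monthly_cost = {'progression_free': 8000.0, 'progressed': 4000.0}
discount = 1.015 ** (-years)                   # 1.5% annual discount rate

qalys = np.sum((progression_free * utilities['progression_free'] +
                progressed * utilities['progressed']) * discount) / 12.0
costs = np.sum((progression_free * monthly_cost['progression_free'] +
                progressed * monthly_cost['progressed']) * discount)
life_years = np.sum(os_ * discount) / 12.0
print(f"LYs={life_years:.2f}, QALYs={qalys:.2f}, cost={costs:,.0f}")
```

Running the same occupancy-costing logic with transition probabilities instead of curve areas gives the Markov variant; the comparison above found the two approaches nearly equivalent in this setting.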
Habimana, Olivier; Møretrø, Trond; Langsrud, Solveig; Vestby, Lene K; Nesse, Live L; Heir, Even
2010-11-02
The presence of Salmonella enterica serovars in feed ingredients, products and processing facilities is a well recognized problem worldwide. In Norwegian feed factories, strict control measures are implemented to avoid establishment and spreading of Salmonella throughout the processing chain. There is limited knowledge on the presence and survival of the resident microflora in feed production plants. Information on interactions between Salmonella and other bacteria in feed production plants and how they affect survival and biofilm formation of Salmonella is also limited. The aim of this study was to identify resident microbiota found in feed production environments, and to compare the survival of resident flora strains and Salmonella under stress factors typically found in feed processing environments. Moreover, the role of dominant resident flora strains in the biofilm development of Salmonella was determined. Characterization of the surface microflora from two feed production plants, by means of 16S rDNA sequencing, revealed a wide diversity of bacteria. Survival, disinfection and biofilm formation experiments were conducted on selected dominant resident flora strains and Salmonella. Results showed higher survival of resident flora isolates under desiccation and disinfection compared with Salmonella isolates. Dual-species biofilms favored Salmonella growth compared to Salmonella in mono-species biofilms, with biovolume increases of 2.8-fold and 3.2-fold in the presence of Staphylococcus and Pseudomonas, respectively. These results offer an overview of the microflora composition found in feed industry processing environments, their survival under relevant stresses and their potential effect on biofilm formation in the presence of Salmonella. Preventing the establishment of resident flora isolates on feed industry surfaces is therefore of interest for impeding Salmonella colonization and growth on these surfaces. In-depth investigations are still needed to determine whether the resident flora has a definite role in the persistence of Salmonella in feed processing environments.
A general framework for parametric survival analysis.
Crowther, Michael J; Lambert, Paul C
2014-12-30
Parametric survival models are being increasingly used as an alternative to the Cox model in biomedical research. Through direct modelling of the baseline hazard function, we can gain greater understanding of the risk profile of patients over time, obtaining absolute measures of risk. Commonly used parametric survival models, such as the Weibull, make restrictive assumptions of the baseline hazard function, such as monotonicity, which is often violated in clinical datasets. In this article, we extend the general framework of parametric survival models proposed by Crowther and Lambert (Journal of Statistical Software 53:12, 2013), to incorporate relative survival, and robust and cluster robust standard errors. We describe the general framework through three applications to clinical datasets, in particular, illustrating the use of restricted cubic splines, modelled on the log hazard scale, to provide a highly flexible survival modelling framework. Through the use of restricted cubic splines, we can derive the cumulative hazard function analytically beyond the boundary knots, resulting in a combined analytic/numerical approach, which substantially improves the estimation process compared with only using numerical integration. User-friendly Stata software is provided, which significantly extends parametric survival models available in standard software. Copyright © 2014 John Wiley & Sons, Ltd.
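As a deliberately minimal instance of this framework, the sketch below models the log hazard as a linear function of log time, which reduces to the Weibull model and admits a closed-form cumulative hazard; the restricted-cubic-spline extension described above adds spline basis terms on the log-hazard scale and combines analytic and numerical integration of the hazard. The data below are simulated, and this is not the authors' Stata implementation.

```python
# Maximum likelihood for a parametric survival model specified on the log-hazard
# scale: log h(t) = b0 + b1*log(t)  (the Weibull special case).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 400
t_true = rng.weibull(1.5, n) * 10.0                 # true shape 1.5, scale 10
censor = rng.uniform(2, 15, n)
time = np.minimum(t_true, censor)
event = (t_true <= censor).astype(float)

def neg_log_lik(params, t, d):
    b0, b1 = params
    if b1 <= -1.0:                                   # hazard must be integrable at zero
        return np.inf
    log_h = b0 + b1 * np.log(t)
    cum_h = np.exp(b0) * t ** (b1 + 1.0) / (b1 + 1.0)   # analytic integral of h(u) on (0, t]
    return -(np.sum(d * log_h) - np.sum(cum_h))      # events contribute log h; all contribute -H

res = minimize(neg_log_lik, x0=[-2.0, 0.0], args=(time, event), method='Nelder-Mead')
b0, b1 = res.x
p = b1 + 1.0                                         # Weibull shape
lam = np.exp((np.log(p) - b0) / p)                   # Weibull scale
print(f"shape p = {p:.2f}, scale = {lam:.2f}")
```

Replacing the linear term with a restricted cubic spline in log time is what gives the framework its flexibility; the log-likelihood keeps exactly the same structure, only the cumulative hazard then requires (partly) numerical integration.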
Mengual-Ballester, Mónica; Pellicer-Franco, Enrique; Valero-Navarro, Graciela; Soria-Aledo, Victoriano; García-Marín, José Andrés; Aguayo-Albasini, José Luis
2016-08-01
Population-based screening programmes for colorectal cancer (CRC) allow an early diagnosis, even before the onset of symptoms, but there are few studies and none in Spain on the influence they have on patient survival. The aim of the present study is to show that patients receiving surgery for CRC following diagnosis via a screening programme have a higher survival and disease-free survival rate than those diagnosed in the symptomatic stage. Prospective study of all the patients undergoing programmed surgery for CRC at the JM Morales Meseguer Hospital in Murcia (Spain) between 2004 and 2010. The patients were divided into two groups: (a) those diagnosed through screening (125 cases); and (b) those diagnosed in the symptomatic stage (565 cases). Survival and disease-free survival were analysed and compared for both groups using the Mantel method. The screen-detected CRC patients show a higher rate of survival (86.3% versus 72.1% at 5 years, p<0.05) and a lower rate of tumour recurrence (73.4% versus 88.3% at 5 years, p<0.05). Population-based screening for CRC is an effective strategic measure for reducing mortality specific to this neoplasia. Copyright © 2016. Published by Elsevier Ltd.
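A minimal sketch of this kind of survival comparison (a Kaplan-Meier estimate plus a log-rank/Mantel test) is shown below, assuming the Python lifelines package is available. The arrays are simulated stand-ins for the screen-detected and symptomatic groups, not the study data.

```python
# Kaplan-Meier estimate and log-rank (Mantel) comparison of two groups.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(2)
t_screen = rng.exponential(90, 125)                 # follow-up times in months
e_screen = rng.integers(0, 2, 125)                  # 1 = death observed
t_sympt = rng.exponential(60, 565)
e_sympt = rng.integers(0, 2, 565)

kmf = KaplanMeierFitter()
kmf.fit(t_screen, e_screen, label='screen-detected')
surv_5y = kmf.survival_function_at_times(60).iloc[0]    # survival at 60 months

result = logrank_test(t_screen, t_sympt,
                      event_observed_A=e_screen, event_observed_B=e_sympt)
print(f"5-year survival (screen-detected) = {surv_5y:.2f}, log-rank p = {result.p_value:.3f}")
```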
Quality of life in children and adolescents surviving cancer.
Bradley Eilertsen, Mary-Elizabeth; Jozefiak, Thomas; Rannestad, Toril; Indredavik, Marit S; Vik, Torstein
2012-04-01
To explore subjective and proxy-reported QoL (Quality of Life) in children and adolescents surviving cancer three years after diagnosis compared with healthy controls. Case-control study including 50 children and adolescents diagnosed with cancer between January 1, 1993 and January 1, 2003 and treated at the Paediatric Department of St. Olav's University Hospital in Trondheim, Norway. Data were collected using The Inventory of Life Quality in Children and Adolescents (ILC) and the KINDL QoL questionnaires (parent and self-reports), as well as by collecting data on any somatic late effects and psychological problems from the medical records of children surviving cancer. Adolescents surviving cancer as a group assessed their QoL as similar to that of their peers. However, adolescents surviving brain tumours or those with late effects reported lower QoL and an increased number of QoL domains perceived as problematic, even many years after diagnosis and treatment. Parents generally report a poorer QoL for their children surviving cancer and a greater number of QoL domains experienced as problematic compared with parent controls. We conclude that, to improve the child's total functioning and well-being, the planning of long-term follow-up care and rehabilitation for children and adolescents with cancer, especially survivors with brain tumours and those with late effects, should take into account their subjectively perceived and proxy-reported QoL, in addition to their psychological problems and psychosocial functioning. Copyright © 2011 Elsevier Ltd. All rights reserved.
A clinical study of 407 cases of nasopharyngeal carcinoma in Hong Kong
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teo, P.; Tsao, S.Y.; Shiu, W.
Four hundred and seven cases of nasopharyngeal carcinoma were analyzed retrospectively; 403/407 were evaluable for recurrence and survival. Parapharyngeal boost significantly decreased local recurrences in parapharyngeal diseases without base of skull involvement (T2p), but not with base of skull involvement (T3p). Enhanced local control of T2p with boost was significant without neoadjuvant chemotherapy. Tumors localized within the nasopharynx (T1) and tumors with nasal involvement (T2n) suffering from local persistences after external radiation therapy were treated with an intracavitary afterloading method. They had survival and recurrence rates comparable to complete responders to external radiation therapy. Patients with bulky cervical nodes (maximal diameter greater than or equal to 4 cm, N1-N3), treated with neoadjuvant chemotherapy with cis-diamminedichloroplatinum II and 5-fluorouracil, had a regional failure rate, distant metastasis rate, actuarial survival rate, and disease-free survival rate comparable to those with smaller nodes treated with external radiation therapy alone. A simple modification of Ho's classification by regrouping the T-stages into 'early T-stages' and 'advanced T-stages' and by combining the N1 and the N2 has greatly increased the power of the system in predicting local recurrence and distant metastasis, respectively. There was an overall improvement of the actuarial survival rate and disease-free survival rate over the historical control, and its significance is discussed.
Welaga, Paul; Hodgson, Abraham; Debpuur, Cornelius; Aaby, Peter; Binka, Fred; Azongo, Daniel; Oduro, Abraham
2018-01-01
Measles vaccine (MV) administered as the last vaccine after the third dose of diphtheria-tetanus-pertussis (DTP) may be associated with better child survival unrelated to prevention of measles infection. Other studies have shown that MV administered after DTP was more beneficial and was associated with lower mortality compared with DTP administered after MV or DTP administered simultaneously with MV. We compared the difference in mortality between children measles-vaccinated after DTP3 and measles-unvaccinated children in Navrongo, Ghana. This was a follow-up study involving annual cohorts of children aged 9-23 months from 1996 to 2012. We assessed survival in relation to measles vaccination status within the first 12 months from the interview date and until 5 years of age using Cox proportional hazards models. In all, 38,333 children were included in the study. The proportion of children vaccinated with MV-after-DTP3 increased from 45% in 1996 to 95% in 2012. The adjusted hazard ratio (HR) for measles-unvaccinated compared with MV-after-DTP3-vaccinated children was 1.38 (1.15-1.66) in the first 12 months after assessment of vaccination status and 1.22 (1.05-1.41) with follow-up to 5 years of age. The national immunization days campaigns with oral polio vaccine or MV might have reduced the effect of being MV-after-DTP3 vaccinated versus MV-unvaccinated. For 12 months of follow-up, the HR before a campaign for MV-unvaccinated children was 1.63 (1.23-2.17) compared with those who received MV-after-DTP3. After the campaign, the HR was reduced to 1.23 (0.97-1.54). When the analysis was stratified by sex, measles-unvaccinated boys had an HR of 1.69 (1.33-2.61), compared with an HR of 1.06 (0.79-1.40) for measles-unvaccinated girls, during 1-year follow-up. In 1989, only 7% of children in the area had received MV-after-DTP3; the increase in MV-after-DTP3 coverage from 1989 to 2012 may have lowered the mortality rate among children aged 9 months to 3 years by 24%. Though an observational study, our findings suggest that measles vaccination, administered in the recommended sequence, is associated with improved child survival and may have contributed importantly to the mortality decline toward the achievement of Millennium Development Goal 4.
Monticelli, David; Ramos, Jaime A.; Hines, James E.; Nichols, James D.; Spendelow, Jeffrey A.
2008-01-01
Many demographic studies on long-lived seabirds have focused on the estimation of adult survival, but much less is known about survival during the early years of life, especially in tropical species. We report analyses of a capture–recapture dataset of 685 roseate terns ringed as fledglings and adults between 1998 and 2005 on Aride Island, Seychelles, and recaptured/resighted at the same colony site over a 5 yr (2002 to 2006) period. A multistate model was used to estimate survival for different age classes, including juvenile (first-year) birds returning as non-breeding prospectors. The effect of infestation by parasites (ticks) on survival was also examined. Overall, the estimated return of first-year individuals to the natal colony was very variable, ranging from 2 to 22%. Conditioned on survival, the probability of returning from Age 2 yr onwards increased to 70%. Survival rates were best modeled as time-specific, with estimates varying from 0.02 to 1.00 (mean 0.69) in first-year birds with a marked negative effect of tick infestation. In older birds (minimum age of 2 yr), the annual estimates fell between 0.69 and 0.86 (mean 0.77). Using a components of variance approach for estimation of year-to-year variation, we found high temporal variability for first-year individuals (coefficient of variation [CV] = 65%) compared to much less variation in the survival rate of older birds (CV = 9%). These findings agree with the life-history prediction that demographic rates of juveniles are usually lower and more variable than those of older individuals. Our results are also consistent with the predicted negative effect of tick parasitism on juvenile survival. Compared with data from other roseate tern populations, survival over the first 2 yr (Age 0 to 2 yr) was 18 to 40% higher in this study, suggesting that a high ‘young’ survival rate may be an important demographic trait in this tropical population to compensate for the low annual reproductive success. Our data show that estimating survival of young individuals may be crucial to elucidating the demographic tactics of seabirds.
Adams, Noah S.; Hansel, Hal C.; Perry, Russell W.; Evans, Scott D.
2012-01-01
We analyzed 6 years (2004-09) of passage and survival data collected at McNary Dam to examine how spill bay operations affect survival of juvenile salmonids passing through the spillway at McNary Dam. We also examined the relations between spill bay operations and survival through the juvenile fish bypass in an attempt to determine if survival through the bypass is influenced by spill bay operations. We used a Cormack-Jolly-Seber release-recapture model (CJS model) to determine how the survival of juvenile salmonids passing through McNary Dam relates to spill bay operations. Results of these analyses, while not designed to yield predictive models, can be used to help develop dam-operation strategies that optimize juvenile salmonid survival. For example, increasing total discharge typically had a positive effect on both spillway and bypass survival for all species except sockeye salmon (Oncorhynchus nerka). Likewise, an increase in spill bay discharge improved spillway survival for yearling Chinook salmon (Oncorhynchus tshawytscha), and an increase in spillway discharge positively affected spillway survival for juvenile steelhead (Oncorhynchus mykiss). The strong linear relation between increased spill and increased survival indicates that increasing the amount of water through the spillway is one strategy that could be used to improve spillway survival for yearling Chinook salmon and juvenile steelhead. However, increased spill did not improve spillway survival for subyearling Chinook salmon and sockeye salmon. Our results indicate that a uniform spill pattern would provide the highest spillway survival and bypass survival for subyearling Chinook salmon. Conversely, a predominantly south spill pattern provided the highest spillway survival for yearling Chinook salmon and juvenile steelhead. Although spill pattern was not a factor for spillway survival of sockeye salmon, spill bay operations that optimize passage through the north and south spill bays maximized spillway survival for this species. Bypass survival of yearling Chinook salmon could be improved by optimizing conditions to facilitate bypass passage at night, but the method to do so is not apparent from this analysis because photoperiod was the only factor affecting bypass survival based on the best and only supported model. Bypass survival of juvenile steelhead would benefit from lower water temperatures and increased total and spillway discharge. Likewise, subyearling Chinook salmon bypass survival would improve with lower water temperatures, increased total discharge, and a uniform spill pattern.
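The sketch below illustrates the likelihood structure of a Cormack-Jolly-Seber model in its simplest form, with constant survival (phi) and detection (p) probabilities fit by maximum likelihood to simulated capture histories. The dam-passage analysis above used far richer, covariate-dependent release-recapture models; this only shows the basic building block.

```python
# Minimal CJS model: constant phi and p, conditional-on-first-capture likelihood.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
n_occasions, n_fish, true_phi, true_p = 5, 800, 0.8, 0.6

# Simulate capture histories: all fish released (detected) at occasion 0.
alive = np.ones(n_fish, dtype=bool)
hist = np.zeros((n_fish, n_occasions), dtype=int)
hist[:, 0] = 1
for t in range(1, n_occasions):
    alive &= rng.random(n_fish) < true_phi           # survive the interval
    hist[:, t] = alive & (rng.random(n_fish) < true_p)   # detected only if alive

def neg_log_lik(params):
    phi, p = 1 / (1 + np.exp(-np.asarray(params)))   # parameters on the logit scale
    # chi[t]: probability of never being detected after occasion t, given alive at t
    chi = np.ones(n_occasions)
    for t in range(n_occasions - 2, -1, -1):
        chi[t] = (1 - phi) + phi * (1 - p) * chi[t + 1]
    ll = 0.0
    for h in hist:
        seen = np.flatnonzero(h)
        first, last = seen[0], seen[-1]
        for t in range(first + 1, last + 1):
            ll += np.log(phi) + (np.log(p) if h[t] else np.log(1 - p))
        ll += np.log(chi[last])
    return -ll

res = minimize(neg_log_lik, x0=[0.0, 0.0], method='Nelder-Mead')
phi_hat, p_hat = 1 / (1 + np.exp(-res.x))
print(f"phi = {phi_hat:.2f} (true {true_phi}), p = {p_hat:.2f} (true {true_p})")
```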
Epidemiology of Out-of-Hospital Cardiac Arrests Among Japanese Centenarians: 2005 to 2013.
Kitamura, Tetsuhisa; Kiyohara, Kosuke; Matsuyama, Tasuku; Izawa, Junichi; Shimamoto, Tomonari; Hatakeyama, Toshihiro; Fujii, Tomoko; Nishiyama, Chika; Iwami, Taku
2016-03-15
Although the number of centenarians has been rapidly increasing in industrialized countries, no clinical studies have evaluated their characteristics and outcomes from out-of-hospital cardiac arrests (OHCAs). This nationwide, population-based observational study of the whole population of Japan enrolled consecutive OHCA centenarians with resuscitation attempts before emergency medical service arrival from 2005 to 2013. The primary outcome measure was 1-month survival from OHCAs. A multivariate logistic regression model was used to assess factors associated with 1-month survival in this population. Among a total of 4,937 OHCA centenarians before emergency medical service arrival, the number of those with OHCAs increased from 70 in 2005 to 136 in 2013 in men and from 227 in 2005 to 587 in 2013 in women. Women accounted for 80.3%. Ventricular fibrillation (VF) as the first documented rhythm was seen in 2.5%. The proportion of victims receiving bystander cardiopulmonary resuscitation was 64.2%. The proportion with 1-month survival from OHCAs in centenarians was only 1.1%. In a multivariate analysis, age was not associated with 1-month survival from OHCAs (adjusted odds ratio [OR] for one increment of age 1.01; 95% confidence interval [CI] 0.87 to 1.18). Witness by a bystander (adjusted OR 3.45; 95% CI 1.88 to 6.31) and VF as the first documented rhythm (adjusted OR 5.49; 95% CI 2.24 to 13.43) were significant positive predictors for 1-month survival. Cardiac origin was associated with significantly poorer 1-month survival compared with noncardiac origin (adjusted OR 0.37; 95% CI 0.21 to 0.64). In conclusion, survival from OHCAs in centenarians was very poor, but witness by a bystander and VF as the first documented rhythm were associated with improved survival. Copyright © 2016 Elsevier Inc. All rights reserved.
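A sketch of the kind of multivariable logistic model reported above is shown below, with odds ratios obtained by exponentiating the coefficients. The DataFrame, column names, simulated outcome, and coefficient values are hypothetical, and the example assumes statsmodels is available; it is not the registry analysis itself.

```python
# Multivariable logistic regression for a binary survival outcome, reported as ORs.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 4937
df = pd.DataFrame({
    'age': rng.integers(100, 110, n),
    'witnessed': rng.integers(0, 2, n),
    'vf_rhythm': (rng.random(n) < 0.025).astype(int),
    'cardiac_origin': rng.integers(0, 2, n),
})
# Simulate a rare outcome with hypothetical effect sizes, just to make it runnable.
logit_p = -4.5 + 1.2 * df['witnessed'] + 1.7 * df['vf_rhythm'] - 1.0 * df['cardiac_origin']
df['survived_1m'] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

X = sm.add_constant(df[['age', 'witnessed', 'vf_rhythm', 'cardiac_origin']])
fit = sm.Logit(df['survived_1m'], X).fit(disp=0)
odds_ratios = np.exp(fit.params)
ci = np.exp(fit.conf_int())
print(pd.concat([odds_ratios.rename('OR'),
                 ci.rename(columns={0: '2.5%', 1: '97.5%'})], axis=1))
```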
Health characteristics of heart transplant recipients surviving into their 80s.
Tabachnick, Deborah R; Bowen, Megan E; Stehlik, Josef; Kfoury, Abdallah G; Caine, William T; Selzman, Craig H; McKellar, Stephen H
2017-08-01
Heart transplantation (HTx) is the preferred treatment for patients with end-stage heart failure and has been successful for >30 y. The clinical course of recipients at the extreme of age is unknown. We reviewed our experience to determine the overall health and prevalence of Tx-related medical problems for recipients in their ninth decade. We reviewed the UCTP experience from 1985 to the present to identify patients who survived into their 80s and matched them (1:1) by gender and age at HTx with other recipients who did not survive to ≥80 y. The end point was the prevalence of medical problems. Since 1985, 1129 adult HTx have been performed and 14 patients (1.2%) survived to ≥80 y old. The mean age at HTx was 63 ± 4 y. Of the octogenarians, the majority were males with ischemic cardiomyopathy. The average survival after transplant was 19 ± 5 y in the octogenarians and 5 ± 5 y in the controls (P < 0.01). Over time, the prevalence of comorbidities increased. Compared with nonoctogenarians, we observed a higher prevalence of dyslipidemia (P = 0.02) and chronic renal insufficiency (P = 0.02) during follow-up. Cardiac function was normal (ejection fraction > 55%) for all octogenarians at age 80 y. Despite improvements in posttransplant care, survival of HTx patients into the ninth decade is rare (1%). For those surviving into their 80s, cardiac function is preserved but dyslipidemia, renal insufficiency, and skin cancers are common. As the age of HTx patients continues to increase, posttransplant care should be tailored to minimize post-HTx complications and further extend survival. Copyright © 2017 Elsevier Inc. All rights reserved.
Quintero-Fong, L; Toledo, J; Ruiz, L; Rendón, P; Orozco-Dávila, D; Cruz, L; Liedo, P
2016-10-01
The sexual performance of Anastrepha ludens males of the Tapachula-7 genetic sexing strain, produced via selection based on mating success, was compared with that of males produced without selection in competition with wild males. Mating competition, development time, survival, mass-rearing quality parameters and pheromone production were compared. The results showed that selection based on mating competitiveness significantly improved the sexual performance of offspring. Development time, survival of larvae, pupae and adults, and weights of larvae and pupae increased with each selection cycle. Differences in the relative quantity of the pheromone compounds (Z)-3-nonenol and anastrephin were observed when comparing the parental males with the F4 and wild males. The implications of this colony management method on the sterile insect technique are discussed.
Hernandez-Alejandro, Roberto; Croome, Kris P; Quan, Douglas; Mawardi, Mohamed; Chandok, Natasha; Dale, Cheryl; McAlister, Vivian; Levstik, Mark A; Wall, William; Marotta, Paul
2011-09-27
In hepatitis C virus (HCV) recipients of donation after cardiac death (DCD) grafts, there is a suggestion of lower rates of graft survival, indicating that DCD grafts themselves may represent a significant risk factor for severe recurrence of HCV. We evaluated all DCD liver transplant recipients from August 2006 to February 2011 at our center. Recipients with HCV who received a DCD graft (group 1, HCV+ DCD, n=17) were compared with non-HCV recipients transplanted with a DCD graft (group 2, HCV- DCD, n=15), and with a matched group of HCV recipients transplanted with a donation after brain death (DBD) graft (group 3, HCV+ DBD, n=42). A trend toward poorer graft survival was seen in HCV+ patients who underwent a DCD transplant (group 1) compared with HCV- patients who underwent a DCD transplant (group 2) (P=0.14). Importantly, a statistically significant difference in graft survival was seen in HCV+ patients undergoing DCD transplant (group 1) (73%) as compared with DBD transplant (group 3) (93%) (P=0.01). There was a statistically significant increase in HCV recurrence at 3 months (76% vs. 16%, P=0.005) and in severe HCV recurrence within the first year (47% vs. 10%, P=0.004) in the DCD group. HCV recurrence is more severe and progresses more rapidly in HCV+ recipients who receive grafts from DCD compared with those who receive grafts from DBD. DCD liver transplantation in HCV+ recipients is associated with a higher rate of graft failure compared with those who receive grafts from DBD. Caution must be taken when using DCD grafts in HCV+ recipients.
Barnes, Jammie; Mayes, Maureen D
2012-03-01
To identify the recent data regarding prevalence, incidence, survival, and risk factors for systemic sclerosis (SSc) and to compare these data to previously published findings. SSc disease occurrence data are now available for Argentina, Taiwan, and India and continue to show wide variation across geographic regions. The survival rate is negatively impacted by older age of onset, male sex, scleroderma renal crisis, pulmonary fibrosis, pulmonary arterial hypertension, cancer, and antitopoisomerase and anti-U1 antibodies. It appears that silica exposure confers an increased risk for developing scleroderma, but this exposure accounts for a very small proportion of male patients. Smoking is not associated with increased SSc susceptibility. Malignancies are reported in scleroderma at an increased rate, but the magnitude of this risk and the type of cancer vary among reports. Prevalence and incidence of SSc appears to be greater in populations of European ancestry and lower in Asian groups. Exposure to silica dust appears to be an environmental trigger, but this only accounts for a small proportion of male cases. Evidence for increased risk of neoplasia is suggestive, but the magnitude of the risk and the types of malignancies vary among reports.
Laryngeal cancer in the United States: changes in demographics, patterns of care, and survival.
Hoffman, Henry T; Porter, Kimberly; Karnell, Lucy H; Cooper, Jay S; Weber, Randall S; Langer, Corey J; Ang, Kie-Kian; Gay, Greer; Stewart, Andrew; Robinson, Robert A
2006-09-01
Survival has decreased among patients with laryngeal cancer during the past 2 decades in the United States. During this same period, there has been an increase in the nonsurgical treatment of laryngeal cancer. The objectives of this study were to identify trends in the demographics, management, and outcome of laryngeal cancer in the United States and to analyze factors contributing to the decreased survival. The authors conducted a retrospective, longitudinal study of laryngeal cancer cases. Review of the National Cancer Data Base (NCDB) revealed 158,426 cases of laryngeal squamous cell carcinoma (excluding verrucous carcinoma) diagnosed between the years 1985 and 2001. Analysis of these case records addressed demographics, management, and survival for cases grouped according to stage, site, and specific TNM classifications. This review of data from the NCDB analysis confirms the previously identified trend toward decreasing survival among patients with laryngeal cancer from the mid-1980s to mid-1990s. Patterns of initial management across this same period indicated an increase in the use of chemoradiation with a decrease in the use of surgery despite an increase in the use of endoscopic resection. The most notable decline in the 5-year relative survival between the 1985 to 1990 period and the 1994 to 1996 period occurred among advanced-stage glottic cancer, early-stage supraglottic cancers, and supraglottic cancers classified as T3N0M0. Initial treatment of T3N0M0 laryngeal cancer (all sites) in the 1994 to 1996 period resulted in poor 5-year relative survival for those receiving either chemoradiation (59.2%) or irradiation alone (42.7%) when compared with that of patients after surgery with irradiation (65.2%) and surgery alone (63.3%). In contrast, identical 5-year relative survival (65.6%) rates were observed during this same period for the subset of T3N0M0 glottic cancers initially treated with either chemoradiation or surgery with irradiation. The decreased survival recorded for patients with laryngeal cancer in the mid-1990s may be related to changes in patterns of management. Future studies are warranted to further evaluate these associations.
Yamamoto, Brent J; Shadiack, Annette M; Carpenter, Sarah; Sanford, Daniel; Henning, Lisa N; O'Connor, Edward; Gonzales, Nestor; Mondick, John; French, Jonathan; Stark, Gregory V; Fisher, Alan C; Casey, Leslie S; Serbina, Natalya V
2016-10-01
Inhalational anthrax has high mortality even with antibiotic treatment, and antitoxins are now recommended as an adjunct to standard antimicrobial regimens. The efficacy of obiltoxaximab, a monoclonal antibody against anthrax protective antigen (PA), was examined in multiple studies conducted in two animal models of inhalational anthrax. A single intravenous bolus of 1 to 32 mg/kg of body weight obiltoxaximab or placebo was administered to New Zealand White rabbits (two studies) and cynomolgus macaques (4 studies) at disease onset (significant body temperature increase or detection of serum PA) following lethal challenge with aerosolized Bacillus anthracis spores. The primary endpoint was survival. The relationship between efficacy and disease severity, defined by pretreatment bacteremia and toxemia levels, was explored. In rabbits, single doses of 1 to 16 mg/kg obiltoxaximab led to 17 to 93% survival. In two studies, survival following 16 mg/kg obiltoxaximab was 93% and 62% compared to 0% and 0% for placebo (P = 0.0010 and P = 0.0013, respectively). Across four macaque studies, survival was 6.3% to 78.6% following 4 to 32 mg/kg obiltoxaximab. In two macaque studies, 16 mg/kg obiltoxaximab reduced toxemia and led to survival rates of 31%, 35%, and 47% versus 0%, 0%, and 6.3% with placebo (P = 0.0085, P = 0.0053, P = 0.0068). Pretreatment bacteremia and toxemia levels inversely correlated with survival. Overall, obiltoxaximab monotherapy neutralized PA and increased survival across the range of disease severity, indicating clinical benefit of toxin neutralization with obiltoxaximab in both early and late stages of inhalational anthrax. Copyright © 2016 Yamamoto et al.
Myatt, Theodore A; Kaufman, Matthew H; Allen, Joseph G; MacIntosh, David L; Fabian, M Patricia; McDevitt, James J
2010-09-03
Laboratory research studies indicate that aerosolized influenza viruses survive for longer periods at low relative humidity (RH) conditions. Further analysis has shown that absolute humidity (AH) may be an improved predictor of virus survival in the environment. Maintaining airborne moisture levels that reduce survival of the virus in the air and on surfaces could be another tool for managing public health risks of influenza. A multi-zone indoor air quality model was used to evaluate the ability of portable humidifiers to control moisture content of the air and the potential related benefit of decreasing survival of influenza viruses in single-family residences. We modeled indoor AH and influenza virus concentrations during winter months (Northeast US) using the CONTAM multi-zone indoor air quality model. A two-story residential template was used under two different ventilation conditions - forced hot air and radiant heating. Humidity was evaluated on a room-specific and whole house basis. Estimates of emission rates for influenza virus were particle-size specific and derived from published studies and included emissions during both tidal breathing and coughing events. The survival of the influenza virus was determined based on the established relationship between AH and virus survival. The presence of a portable humidifier with an output of 0.16 kg water per hour in the bedroom resulted in an increase in median sleeping hours AH/RH levels of 11 to 19% compared to periods without a humidifier present. The associated percent decrease in influenza virus survival was 17.5 - 31.6%. Distribution of water vapor through a residence was estimated to yield 3 to 12% increases in AH/RH and 7.8-13.9% reductions in influenza virus survival. This modeling analysis demonstrates the potential benefit of portable residential humidifiers in reducing the survival of aerosolized influenza virus by controlling humidity indoors.
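The humidity arithmetic underlying such a model can be sketched as follows: absolute humidity (g/m^3) computed from indoor temperature and relative humidity via the Magnus saturation-vapour-pressure approximation. Mapping AH to influenza virus survival would then apply the published AH-survival regression, whose coefficients are not reproduced here; the temperatures and RH values below are illustrative.

```python
# Absolute humidity from temperature and relative humidity (Magnus approximation).
import numpy as np

def absolute_humidity(temp_c, rh_percent):
    """Absolute humidity in g/m^3 from air temperature (degC) and RH (%)."""
    e_s = 6.112 * np.exp(17.62 * temp_c / (243.12 + temp_c))   # saturation vapour pressure, hPa
    e = (rh_percent / 100.0) * e_s                              # actual vapour pressure, hPa
    return 216.7 * e / (temp_c + 273.15)

# Winter bedroom at 21 degC: a humidifier raising RH during sleeping hours
# raises AH roughly proportionally.
for rh in (20, 30, 40):
    print(f"RH {rh}% -> AH {absolute_humidity(21.0, rh):.1f} g/m^3")
```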
Krok-Schoen, Jessica L; Fisher, James L; Baltic, Ryan D; Paskett, Electra D
2016-11-01
Increased life expectancy, growth of minority populations, and advances in cancer screening and treatment have resulted in an increasing number of older, racially diverse cancer survivors. Potential black/white disparities in cancer incidence, stage, and survival among the oldest old (≥85 years) were examined using data from the SEER Program of the National Cancer Institute. Differences in cancer incidence and stage at diagnosis were examined for cases diagnosed within the most recent 5-year period, and changes in these differences over time were examined for white and black cases aged ≥85 years. The 5-year relative cancer survival rate was also examined by race. Among those aged ≥85 years, black men had higher colorectal, lung and bronchus, and prostate cancer incidence rates than white men. From 1973 to 2012, lung and bronchus and female breast cancer incidence increased, while colorectal and prostate cancer incidence decreased among this population. Blacks had higher rates of unstaged cancer compared with whites. The 5-year relative survival rate for all invasive cancers combined was higher for whites than blacks. Notably, whites had more than three times the relative survival rate of blacks for lung and bronchus cancer diagnosed at localized (35.1% vs. 11.6%) and regional (12.2% vs. 3.2%) stages. White and black differences in cancer incidence, stage, and survival exist in the ≥85 population. Continued efforts are needed to reduce white and black differences in cancer prevention and treatment among the ≥85 population. Cancer Epidemiol Biomarkers Prev; 25(11); 1517-23. ©2016 American Association for Cancer Research.
Willenbacher, Ella; Weger, Roman; Rochau, Ursula; Siebert, Uwe; Willenbacher, Wolfgang
2016-01-01
Clinical trials demonstrate improving survival in patients with multiple myeloma (MM) after treatment. However, it is unclear whether increased survival translates to a similar benefit in a real-world setting. We analyzed the overall survival of 347 multiple myeloma patients in Austria by means of a national registry (AMR), focusing on results from 3rd and later lines of therapy. This benchmark was chosen to define a baseline prior to the broad application of upcoming 2nd-generation drugs (carfilzomib, pomalidomide). Projected 10-year survival for patients with MM in Austria is estimated to be 56% in patients diagnosed between 2011 and 2014, 21% in patients diagnosed between 2000 and 2005, and 39% in those diagnosed between 2006 and 2010. Over the same intervals, a significant increase could be shown in the use of bortezomib and of the so-called IMiDs lenalidomide and thalidomide (from 2005 onwards), as well as in their simultaneous use in combination therapies (from 2010 onwards). The use of autologous transplantation (ASCT) remained more or less constant at ~35% of patients in the 1st-line setting over the whole period, comparing well with international practice patterns, while the use of 2nd-line ASCT increased from 5.5% to 18.7% of patients. In patients in 3rd- or later-line treatment (n = 105), median survival was 27 months even in relapsed and refractory disease, with a considerable proportion of long-term survivors (~20%). With the expected emergence of additional active anti-myeloma compounds, we aim to assess survival in patients with relapsed and refractory MM.
USDA-ARS's Scientific Manuscript database
Water mold infestations on channel catfish eggs lower the hatch rate (egg survival) and ultimately the number of catfish fry available for stocking in production ponds. This study compared the potential of two hydrogen peroxide (HP) and two copper sulfate pentahydrate (CSP) treatments to increase c...
Prezioso, Domenico; Iacono, Fabrizio; Romeo, Giuseppe; Ruffo, Antonio; Russo, Nicola; Illiano, Ester
2014-06-01
The objective of this work is to compare the effectiveness of hormonal treatment (luteinizing hormone-releasing hormone agonists and/or antiandrogens) as an early or as a deferred intervention for patients with locally advanced prostate cancer (LAPC) and/or asymptomatic metastasis. Systematic review of trials published in 1950-2007. Sources included MEDLINE and bibliographies of retrieved articles. Eligible trials included adults with a history of LAPC who were not suitable for curative local treatment of prostate cancer. We retrieved 22 articles for detailed review, of which 8 met inclusion criteria. The Veterans Administration Cooperative Urological Research Group suggested that delaying hormonal therapy did not compromise overall survival and that many of the patients died of causes other than prostate cancer. In the European Organisation for Research and Treatment of Cancer (EORTC) 30846 trial, the median survival was 6.1 years for delayed endocrine treatment and 7.6 years for immediate treatment; the HR for survival on delayed versus immediate treatment was 1.23 (95% CI 0.88-1.71), indicating a 23% nonsignificant trend in favour of early treatment. In EORTC 30891, immediate androgen deprivation resulted in a modest but statistically significant increase in overall survival. The protocol SAKK 08/88 showed the lack of any major advantage of immediate compared with deferred hormonal treatment regarding quality of life or overall survival. Early intervention with hormonal treatment for patients with LAPC provides important reductions in all-cause mortality, prostate cancer-specific mortality, overall progression, and distant progression compared with deferring its use until standard care has failed to halt the disease.
Use of automated external defibrillators for in-hospital cardiac arrest : Any time, any place?
Wutzler, A; Kloppe, C; Bilgard, A K; Mügge, A; Hanefeld, C
2017-11-07
Acute treatment of in-hospital cardiac arrest (IHCA) is challenging and overall survival rates are low. However, data on the use of public-access automated external defibrillators (AEDs) for IHCA remain controversial. The aim of our study was to evaluate the characteristics of patients experiencing IHCA and the feasibility of public-access AED use for resuscitation in a university hospital. IHCA events outside the intensive care unit were analysed over a period of 21 months. Patients' characteristics, AED performance, return of spontaneous circulation (ROSC) and 24 h survival were evaluated. Outcomes following adequate and inadequate AED use were compared. During the study period, 59 IHCAs occurred. An AED was used in 28 (47.5%) of the cases. However, the AED was used adequately in only 42.8% of these cases. AED use was not associated with an increased survival rate (12.9 vs. 10.7%, p = 0.8) compared to non-AED use. However, adequate AED use was associated with a higher survival rate (25 vs. 0%, p = 0.034) compared to inadequate AED use. A time from emergency call to application of the AED of >3 min was the most important factor in inadequate AED use. Adequate AED use was more often observed between 7:30 and 13:30 and in the internal medicine department. AEDs were applied in less than 50% of the IHCA events. Furthermore, AED use was inadequate in the majority of the cases. Since adequate AED use is associated with improved survival, AEDs should be available in hospital areas with patients at high risk of a shockable rhythm.
Deol, Abhinav; Sengsayadeth, Salyka; Ahn, Kwang Woo; Wang, Hai-Lin; Aljurf, Mahmoud; Antin, Joseph Harry; Battiwalla, Minoo; Bornhauser, Martin; Cahn, Jean-Yves; Camitta, Bruce; Chen, Yi-Bin; Cutler, Corey S; Gale, Robert Peter; Ganguly, Siddhartha; Hamadani, Mehdi; Inamoto, Yoshihiro; Jagasia, Madan; Kamble, Rammurti; Koreth, John; Lazarus, Hillard M; Liesveld, Jane; Litzow, Mark R; Marks, David I; Nishihori, Taiga; Olsson, Richard F; Reshef, Ran; Rowe, Jacob M; Saad, Ayman A; Sabloff, Mitchell; Schouten, Harry C; Shea, Thomas C; Soiffer, Robert J; Uy, Geoffrey L; Waller, Edmond K; Wiernik, Peter H; Wirk, Baldeep; Woolfrey, Ann E; Bunjes, Donald; Devine, Steven; de Lima, Marcos; Sandmaier, Brenda M; Weisdorf, Dan; Khoury, Hanna Jean; Saber, Wael
2016-10-01
Patients with FMS-like tyrosine kinase 3 (FLT3)-mutated acute myeloid leukemia (AML) have a poor prognosis and are referred for early allogeneic hematopoietic stem cell transplantation (HCT). Data from the Center for International Blood and Marrow Transplant Research (CIBMTR) were used to evaluate 511 adult patients with de novo AML who underwent HCT during 2008 through 2011 to determine whether FLT3 mutations had an impact on HCT outcomes. In total, 158 patients (31%) had FLT3 mutations. Univariate and multivariate analyses revealed an increased risk of relapse at 3 years in the FLT3-mutated group compared with the wild-type (WT) group (38% [95% confidence interval (CI), 30%-45%] vs 28% [95% CI, 24%-33%]; P = .04; relative risk, 1.60 [95% CI, 1.15-2.22]; P = .0048). However, FLT3 mutation status was not significantly associated with nonrelapse mortality, leukemia-free survival, or overall survival. Although more patients in the FLT3-mutated group died from relapsed primary disease compared with those in the WT group (60% vs 46%), the 3-year overall survival rate was comparable for the 2 groups (mutated group: 49%; 95% CI, 40%-57%; WT group: 55%, 95% CI, 50%-60%; P = .20). The current data indicate that FLT3 mutation status did not adversely impact overall survival after HCT, and about 50% of patients with this mutation who underwent HCT were long-term survivors. Cancer 2016;122:3005-3014. © 2016 American Cancer Society.
He, Liru; Chapple, Andrew; Liao, Zhongxing; Komaki, Ritsuko; Thall, Peter F; Lin, Steven H
2016-10-01
To evaluate radiation modality effects on pericardial effusion (PCE), pleural effusion (PE) and survival in esophageal cancer (EC) patients. We analyzed data from 470 EC patients treated with definitive concurrent chemoradiotherapy (CRT). Bayesian semi-competing risks (SCR) regression models were fit to assess effects of radiation modality and prognostic covariates on the risks of PCE and PE, and death either with or without these preceding events. Bayesian piecewise exponential regression models were fit for overall survival, the time to PCE or death, and the time to PE or death. All models included propensity score as a covariate to correct for potential selection bias. Median times to onset of PCE and PE after RT were 7.1 and 6.1months for IMRT, and 6.5 and 5.4months for 3DCRT, respectively. Compared to 3DCRT, the IMRT group had significantly lower risks of PE, PCE, and death. The respective probabilities of a patient being alive without either PCE or PE at 3-years and 5-years were 0.29 and 0.21 for IMRT compared to 0.13 and 0.08 for 3DCRT. In the SCR regression analyses, IMRT was associated with significantly lower risks of PCE (HR=0.26) and PE (HR=0.49), and greater overall survival (probability of beneficial effect (pbe)>0.99), after controlling for known clinical prognostic factors. IMRT reduces the incidence and postpones the onset of PCE and PE, and increases survival probability, compared to 3DCRT. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
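As a simplified illustration of the piecewise exponential building block used in such analyses, the sketch below fits one constant hazard per time interval by maximum likelihood to simulated data. The study above fits Bayesian piecewise exponential regression models with covariates and a propensity-score adjustment, none of which is shown here; the interval cut points and data are hypothetical.

```python
# Piecewise exponential survival model: one constant hazard per interval, fit by MLE.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
t_true = rng.exponential(24.0, 470)                 # true event times (months)
censor = rng.uniform(6, 60, 470)
time, event = np.minimum(t_true, censor), (t_true <= censor).astype(float)

cuts = np.array([0.0, 6.0, 12.0, 24.0, np.inf])     # interval boundaries (months)

def neg_log_lik(log_hazards):
    h = np.exp(log_hazards)                          # one hazard per interval
    ll = 0.0
    for t, d in zip(time, event):
        k = np.searchsorted(cuts, t, side='right') - 1        # interval containing t
        exposure = np.minimum(t, cuts[1:]) - cuts[:-1]         # time at risk in each interval
        exposure = np.clip(exposure, 0.0, None)
        ll += d * np.log(h[k]) - np.sum(h * exposure)          # event term minus cumulative hazard
    return -ll

res = minimize(neg_log_lik, x0=np.full(4, np.log(0.05)), method='Nelder-Mead')
print("interval hazards per month:", np.round(np.exp(res.x), 4))
```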
Finotello, R; Stefanello, D; Zini, E; Marconato, L
2017-03-01
Canine hemangiosarcoma (HSA) is a neoplasm of vascular endothelial origin that has an aggressive biological behaviour, with less than 10% of dogs alive at 12 months postdiagnosis. The treatment of choice consists of surgery followed by adjuvant doxorubicin-based chemotherapy. We prospectively compared adjuvant doxorubicin and dacarbazine (ADTIC) with a traditional doxorubicin and cyclophosphamide (AC) treatment, aiming to determine safety and to assess whether this regimen prolongs survival and time to metastasis (TTM). Twenty-seven dogs were enrolled; following staging work-up, 18 were treated with AC and 9 with ADTIC. Median TTM and survival time were longer for dogs treated with ADTIC compared with those receiving AC (>550 versus 112 days, P = 0.021 and >550 versus 142 days, P = 0.011, respectively). Both protocols were well tolerated, without the need for dose reduction or an increased interval between treatments. A protocol consisting of combined doxorubicin and dacarbazine is safe in dogs with HSA and prolongs TTM and survival time. © 2015 John Wiley & Sons Ltd.
Shen, Jian Guo; Cheong, Jae Ho; Hyung, Woo Jin; Kim, Junuk; Choi, Seung Ho; Noh, Sung Hoon
2006-09-01
To investigate the interactions between splenectomy and perioperative transfusion in gastric cancer patients. Medical records of 449 gastric cancer patients who had undergone total gastrectomies with curative intent between 1991 and 1995 were reviewed. The influence of splenectomy on tumor recurrence and survival in both transfused and nontransfused patients was evaluated by univariate and multivariate analysis. The recurrence rate in the splenectomy group was 48.1% as compared with 22.6% in the spleen-preserved group among transfused patients (P=.001); it was 40.7% compared with 26.5% among nontransfused patients (P=.086). There was no significant difference in the mean survival between the splenectomy group and the spleen-preserved group in a subgroup analysis by stage. Multivariate analysis identified splenectomy as an independent risk factor for recurrence but not as a predictor of survival among transfused patients. Splenectomy does not appear to abrogate the adverse effect of perioperative transfusion on prognosis in gastric cancer patients. Moreover, it may increase postoperative recurrence in transfused patients.
Changes in fertility patterns can improve child survival in Southeast Asia.
Greenspan, A
1993-12-01
This analysis of 1988 Philippine Demographic Survey data provides information on the direct and indirect effects of several major determinants of childhood mortality in the Philippines. Data are compared with rates in Indonesia and Thailand. The odds of infant mortality in the Philippines are reduced by 39% by spacing children more than two years apart. This finding is significant because infant mortality rates have not declined over the past 20 years. Child survival is related to the number of children in the family, the spacing of the children, the mother's age and education, and the risks of malnutrition and infection. Direct effects on child survival are related to the infant survival status of the preceding child and the length of the preceding birth interval, while key indirect or background variables are maternal age and education, birth order, and place of residence. The two-stage causation model is tested with data on 13,716 ever-married women aged 15-49 years and 20,015 index children born between January 1977 and February 1987. Results in the Philippines confirm that maternal age, birth order, mortality of the previous child, and maternal education are directly related to birth interval, while mortality of the previous child, birth order, and maternal educational status are directly related to infant mortality. Thailand, Indonesia, and the Philippines all show similar explanatory factors that directly influence infant mortality. The survival status of the preceding child is the most important predictor in all three countries and is particularly strong in Thailand. This factor acts through the limited time interval for rejuvenation of the mother's body, nutritional deficiencies, and transmission of infectious disease among siblings. The conclusion is that poor environmental conditions increase vulnerability to illness and death. There are 133% greater odds of having a short birth interval among young urban women than among older rural women. There is a 29% increase in odds for second-parity births compared with third or higher order parities. Maternal education is a strong predictor of infant survival only in the Philippines and Indonesia. Adolescent pregnancy is a risk only in Indonesia. Socioeconomic factors are not as important as birth interval, birth order, and maternal education in determining survival status.
Inhibition of IKKβ in enterocytes exacerbates sepsis-induced intestinal injury and worsens mortality
Dominguez, Jessica A.; Samocha, Alexandr J.; Liang, Zhe; Burd, Eileen M.; Farris, Alton B.; Coopersmith, Craig M.
2013-01-01
Objective NF-κB is a critical regulator of cell survival genes and the host inflammatory response. The purpose of this study was to investigate the role of enterocyte-specific NF-κB in sepsis through selective ablation of IκB kinase (IKK)-β. Design Prospective, randomized, controlled study. Setting Animal laboratories in university medical centers. Subjects and Interventions Mice lacking functional NF-κB in their intestinal epithelium (Vil-Cre/Ikkβf/Δ) and wild type (WT) mice were subjected to sham laparotomy or cecal ligation and puncture (CLP). Animals were sacrificed at 24 hours or followed seven days for survival. Measurements and Main Results Septic WT mice had decreased villus length compared to sham mice, while villus atrophy was further exacerbated in septic Vil-Cre/Ikkβf/Δ mice. Sepsis induced an increase in intestinal epithelial apoptosis compared to sham mice, which was further exacerbated in Vil-Cre/Ikkβf/Δ mice. Sepsis induced intestinal hyperpermeability in WT mice compared to sham mice, which was further exacerbated in septic Vil-Cre/Ikkβf/Δ mice. This was associated with increased intestinal expression of claudin-2 in septic WT mice, which was further increased in septic Vil-Cre/Ikkβf/Δ mice. Both pro-inflammatory and anti-inflammatory cytokines were increased in serum following CLP, and IL-10 and MCP-1 levels were higher in septic Vil-Cre/Ikkβf/Δ mice than septic WT mice. All septic mice were bacteremic, but no differences in bacterial load were identified between WT and Vil-Cre/Ikkβf/Δ mice. To determine the functional significance of these results, animals were followed for survival. Septic WT mice had lower mortality than septic Vil-Cre/Ikkβf/Δ mice (47% vs. 80%, p<0.05). Anti-TNF administration decreased intestinal apoptosis, permeability and mortality in WT septic mice, and a similar improvement in intestinal integrity and survival was seen when anti-TNF was given to Vil-Cre/Ikkβf/Δ mice. Conclusions Enterocyte-specific NF-κB has a beneficial role in sepsis by partially preventing sepsis-induced increases in apoptosis and permeability, which are associated with worsening mortality. PMID:23939348
Carlo, Michael A; Riddell, Eric A; Levy, Ofir; Sears, Michael W
2018-01-01
The capacity to tolerate climate change often varies across ontogeny in organisms with complex life cycles. Recently developed species distribution models incorporate traits across life stages; however, these life-cycle models primarily evaluate effects of lethal change. Here, we examine impacts of recurrent sublethal warming on development and survival in ecological projections of climate change. We reared lizard embryos in the laboratory under temperature cycles that simulated contemporary conditions and warming scenarios. We also artificially warmed natural nests to mimic laboratory treatments. In both cases, recurrent sublethal warming decreased embryonic survival and hatchling sizes. Incorporating survivorship results into a mechanistic species distribution model reduced annual survival by up to 24% compared to models that did not incorporate sublethal warming. Contrary to models without sublethal effects, our model suggests that modest increases in developmental temperatures influence species ranges due to effects on survivorship. © 2017 John Wiley & Sons Ltd/CNRS.
Hustinx, W; Benaissa-Trouw, B; Van Kessel, K; Kuenen, J; Tavares, L; Kraaijeveld, K; Verhoef, J; Hoepelman, A
1997-12-01
Combined prophylactic treatment with recombinant murine granulocyte colony-stimulating factor (G-CSF) and a suboptimal dose of anti-K1 capsular IgM monoclonal antibody (MAb) significantly enhanced survival in an experimental mouse Escherichia coli O7:K1 peritonitis model compared with untreated animals (67% vs. 11% survival; P < 0.001) and with either treatment alone (67% vs. 29% and 27% survival, respectively; P < 0.01), which suggests synergism between these agents. Enhanced survival by combined treatment was associated with increased neutrophil counts in blood and peritoneal lavage fluid, lower systemic and higher local levels of tumour necrosis factor (TNF), and lower bacterial counts in blood cultures. Mouse neutrophils treated with G-CSF but not infected with E. coli showed enhanced phagocytic and respiratory burst capacity, down-regulation of L-selectin receptors and enhanced expression of FcγRII-III receptors but not of complement receptors.
Use and Effectiveness of Intraperitoneal Chemotherapy for Treatment of Ovarian Cancer
Wright, Alexi A.; Cronin, Angel; Milne, Dana E.; Bookman, Michael A.; Burger, Robert A.; Cohn, David E.; Cristea, Mihaela C.; Griggs, Jennifer J.; Keating, Nancy L.; Levenback, Charles F.; Mantia-Smaldone, Gina; Matulonis, Ursula A.; Meyer, Larissa A.; Niland, Joyce C.; Weeks, Jane C.; O'Malley, David M.
2015-01-01
Purpose A 2006 randomized trial demonstrated a 16-month survival benefit with intraperitoneal and intravenous (IP/IV) chemotherapy administered to patients who had ovarian cancer, compared with IV chemotherapy alone, but more treatment-related toxicities. The objective of this study was to examine the use and effectiveness of IP/IV chemotherapy in clinical practice. Patients and Methods Prospective cohort study of 823 women with stage III, optimally cytoreduced ovarian cancer diagnosed at six National Comprehensive Cancer Network institutions. We examined IP/IV chemotherapy use in all patients diagnosed between 2003 and 2012 (N = 823), and overall survival and treatment-related toxicities with Cox regression and logistic regression, respectively, in a propensity score–matched sample (n = 402) of patients diagnosed from 2006 to 2012, excluding trial participants, to minimize selection bias. Results Use of IP/IV chemotherapy increased from 0% to 33% between 2003 and 2006, increased to 50% from 2007 to 2008, and plateaued thereafter. Between 2006 and 2012, adoption of IP/IV chemotherapy varied by institution from 4% to 67% (P < .001) and 43% of patients received modified IP/IV regimens at treatment initiation. In the propensity score–matched sample, IP/IV chemotherapy was associated with significantly improved overall survival (3-year overall survival, 81% v 71%; hazard ratio, 0.68; 95% CI, 0.47 to 0.99), compared with IV chemotherapy, but also more frequent alterations in chemotherapy delivery route (adjusted rates of discontinuation or change, 20.4% v 10.0%; adjusted odds ratio, 2.83; 95% CI, 1.47 to 5.47). Conclusion Although the use of IP/IV chemotherapy increased significantly at National Comprehensive Cancer Network centers between 2003 and 2012, fewer than 50% of eligible patients received it. Increasing IP/IV chemotherapy use in clinical practice may be an important and underused strategy to improve ovarian cancer outcomes. PMID:26240233
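For readers unfamiliar with the propensity-score approach used above, the sketch below outlines the usual recipe of a logistic propensity model followed by 1:1 nearest-neighbour matching within a caliper. It is a generic illustration under assumed column names (`ip_iv` as the treatment indicator), not the authors' analysis code; the matched sample would then feed the Cox and logistic models described in the abstract.

```python
# A minimal sketch of propensity-score matching (logistic propensity model plus
# greedy 1:1 nearest-neighbour matching on the logit scale); column names and
# the caliper choice are hypothetical, not taken from the study.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def match_ip_iv(df: pd.DataFrame, covariates, caliper=0.2):
    X = sm.add_constant(df[covariates])
    ps_model = sm.Logit(df["ip_iv"], X).fit(disp=False)   # 1 = IP/IV, 0 = IV only
    ps = ps_model.predict(X)
    df = df.assign(logit_ps=np.log(ps / (1 - ps)))

    treated = df[df["ip_iv"] == 1]
    control = df[df["ip_iv"] == 0].copy()
    max_dist = caliper * df["logit_ps"].std()              # a common caliper choice

    pairs = []
    for i, row in treated.iterrows():                      # greedy 1:1 matching
        dist = (control["logit_ps"] - row["logit_ps"]).abs()
        j = dist.idxmin()
        if dist[j] <= max_dist:
            pairs.append((i, j))
            control = control.drop(j)                      # match without replacement
    matched_idx = [k for pair in pairs for k in pair]
    return df.loc[matched_idx]                             # input to Cox / logistic models
```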
Conditional survival of all primary brain tumor patients by age, behavior, and histology.
Porter, Kimberly R; McCarthy, Bridget J; Berbaum, Michael L; Davis, Faith G
2011-01-01
Survival statistics commonly reflect survival from the time of diagnosis but do not take into account survival already achieved after a diagnosis. The objective of this study was to provide conditional survival estimates for brain tumor patients as a more accurate measure of survival for those who have already survived for a specified amount of time after diagnosis. Data on primary malignant and nonmalignant brain tumor cases diagnosed from 1985-2005 from selected SEER state cancer registries were obtained. Relative survival up to 15 years postdiagnosis and varying relative conditional survival rates were computed using the life-table method. The overall 1-year relative survival estimate derived from time of diagnosis was 67.8% compared to the 6-month relative conditional survival rate of 85.7% for 6-month survivors (the probability of surviving to 1 year given survival to 6 months). The 10-year overall relative survival rate was 49.5% from time of diagnosis compared to the 8-year relative conditional survival rate of 79.2% for 2-year survivors. Conditional survival estimates and standard survival estimates varied by histology, behavior, and age at diagnosis. The 5-year relative survival estimate derived from time of diagnosis for glioblastoma was 3.6% compared to the 3-year relative conditional survival rate of 36.4% for 2-year survivors. For most nonmalignant tumors, the differences between relative survival and the corresponding conditional survival estimates were minimal. Older age groups had greater numeric gains in survival but lower conditional survival estimates than other age groups. Similar findings were seen for other conditional survival intervals. Conditional survival is a useful disease surveillance measure for clinicians and brain tumor survivors to provide them with better 'real-time' estimates and hope. Copyright © 2011 S. Karger AG, Basel.
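As a worked illustration of the conditional-survival arithmetic described above, the probability of surviving an additional t years given survival to s years is S(s + t)/S(s). The short Python sketch below is not part of the original study: apart from the two quoted figures, the curve values are illustrative, and the 6-month value is back-calculated from the published numbers.

```python
# Conditional survival from a tabulated survival curve: CS(t | s) = S(s + t) / S(s).
import numpy as np

def conditional_survival(times, surv, s, t):
    """S(s + t) / S(s), interpolating a tabulated survival curve."""
    S = lambda x: np.interp(x, times, surv)
    return S(s + t) / S(s)

# Illustrative curve: S(1) = 0.678 (the quoted 1-year relative survival) and
# S(0.5) = 0.79 back-calculated from 0.678 / 0.857; the other points are made up.
times = np.array([0.0, 0.5, 1.0, 2.0])
surv = np.array([1.0, 0.79, 0.678, 0.60])
print(conditional_survival(times, surv, s=0.5, t=0.5))   # ≈ 0.86, i.e. the ~85.7% figure
```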
Inhibition of necroptosis attenuates lung injury and improves survival in neonatal sepsis.
Bolognese, Alexandra C; Yang, Weng-Lang; Hansen, Laura W; Denning, Naomi-Liza; Nicastro, Jeffrey M; Coppa, Gene F; Wang, Ping
2018-04-27
Neonatal sepsis represents a unique therapeutic challenge owing to an immature immune system. Necroptosis is a form of programmed cell death that has been identified as an important mechanism of inflammation-induced cell death. Receptor-interacting protein kinase 1 plays a key role in mediating this process. We hypothesized that pharmacologic blockade of receptor-interacting protein kinase 1 activity would be protective in neonatal sepsis. Sepsis was induced in C57BL/6 mouse pups (5-7 days old) by intraperitoneal injection of adult cecal slurry. At 1 hour after cecal slurry injection, the receptor-interacting protein kinase 1 inhibitor necrostatin-1 (10 µg/g body weight) or vehicle (5% dimethyl sulfoxide in phosphate buffered saline) was administered via retro-orbital injection. At 20 hours after cecal slurry injection, blood and lung tissues were collected for various analyses. At 20 hours after sepsis induction, vehicle-treated pups showed a marked increase in serum levels of interleukin 6, interleukin 1-beta, and interleukin 18 compared to sham. With necrostatin-1 treatment, serum levels of interleukin 6, interleukin 1-beta, and interleukin 18 were decreased by 77%, 81%, and 63%, respectively, compared to vehicle. In the lungs, sepsis induction resulted in a 232-, 10-, and 2.8-fold increase in interleukin 6, interleukin 1-beta, and interleukin 18 mRNA levels compared to sham, while necrostatin-1 treatment decreased these levels to 40-, 4-, and 0.8-fold, respectively. Expressions of the neutrophil chemokines keratinocyte chemoattractant and macrophage-inflammatory-protein-2 were also increased in the lungs in sepsis, while necrostatin-1 treatment decreased these levels by 81% and 61%, respectively, compared to vehicle. In addition, necrostatin-1 treatment significantly improved the lung histologic injury score and decreased lung apoptosis in septic pups. Finally, treatment with necrostatin-1 increased the 7-day survival rate from 0% in the vehicle-treated septic pups to 29% (P = .11). Inhibition of receptor-interacting protein kinase 1 by necrostatin-1 decreases systemic and pulmonary inflammation, decreases lung injury, and increases survival in neonatal mice with sepsis. Targeting the necroptosis pathway might represent a new therapeutic strategy for neonatal sepsis. Copyright © 2018 Elsevier Inc. All rights reserved.
Jones, Crystal L; Singh, Shweta S; Alamneh, Yonas; Casella, Leila G; Ernst, Robert K; Lesho, Emil P; Waterman, Paige E; Zurawski, Daniel V
2017-03-01
The loss of fitness in colistin-resistant (CR) Acinetobacter baumannii was investigated using longitudinal isolates from the same patient. Early CR isolates were outcompeted by late CR isolates for growth in broth and survival in the lungs of mice. Fitness loss was associated with an increased susceptibility to oxidative stress since early CR strains had reduced in vitro survival in the presence of hydrogen peroxide and decreased catalase activity compared to that of late CR and colistin-susceptible (CS) strains. Copyright © 2017 Jones et al.
Jänne, P A; Smith, I; McWalter, G; Mann, H; Dougherty, B; Walker, J; Orr, M C M; Hodgson, D R; Shaw, A T; Pereira, J R; Jeannin, G; Vansteenkiste, J; Barrios, C H; Franke, F A; Crinò, L; Smith, P
2015-07-14
Selumetinib (AZD6244, ARRY-142886)+docetaxel increases median overall survival (OS) and significantly improves progression-free survival (PFS) and objective response rate (ORR) compared with docetaxel alone in patients with KRAS mutant, stage IIIB/IV non-small-cell lung cancer (NSCLC; NCT00890825). Retrospective analysis of OS, PFS, ORR and change in tumour size at week 6 for different sub-populations of KRAS codon mutations. In patients receiving selumetinib+docetaxel and harbouring KRAS G12C or G12V mutations there were trends towards greater improvement in OS, PFS and ORR compared with other KRAS mutations. Different KRAS mutations in NSCLC may influence selumetinib/docetaxel sensitivity.
Gordon, Jonathan A R; Sodek, Jaro; Hunter, Graeme K; Goldberg, Harvey A
2009-08-15
Bone sialoprotein (BSP) is a secreted glycoprotein found in mineralized tissues; however, BSP is aberrantly expressed in a variety of osteotropic tumors. Elevated BSP expression in breast and prostate primary carcinomas is directly correlated with increased bone metastases and tumor progression. In this study, the intracellular signaling pathways responsible for BSP-induced migration and tumor survival were examined in breast and prostate cancer cells (MDA-MB-231, Hs578T and PC3). Additionally, the effects of exogenous TGF-beta1 and EGF, cytokines associated with tumor metastasis and present at high levels in the bone microenvironment, were examined in BSP-expressing cancer cells. Expression of BSP but not an integrin-binding mutant (BSP-KAE) in tumor cell lines resulted in increased levels of alpha(v)-containing integrins and an increased number of mature focal adhesions. Adhesion of cells to recombinant BSP or the expression of BSP stimulated focal adhesion kinase and ERK phosphorylation, as well as activated AP-1-family proteins. Activation of these pathways by BSP expression increased the expression of the matrix metalloproteinases MMP-2, MMP-9, and MMP-14. The BSP-mediated activation of the FAK-associated pathway resulted in increased cancer cell invasion in a Matrigel-coated Boyden-chamber assay and increased cell survival upon withdrawal of serum. Addition of EGF or TGF-beta1 to the BSP-expressing cell lines significantly increased ERK phosphorylation, AP-1 activation, MMP-2 expression, cell migration and survival compared to untreated cells expressing BSP. This study thus defines the cooperative mechanisms by which BSP can enhance specific factors associated with a metastatic phenotype in tumor cell lines, an effect that is increased by circulating TGF-beta1 and EGF. © 2009 Wiley-Liss, Inc.
Hecker, Peter A; Galvao, Tatiana F; O'Shea, Karen M; Brown, Bethany H; Henderson, Reney; Riggle, Heather; Gupte, Sachin A; Stanley, William C
2012-05-01
A high-sugar intake increases heart disease risk in humans. In animals, sugar intake accelerates heart failure development by increasing reactive oxygen species (ROS). Glucose-6-phosphate dehydrogenase (G6PD) can fuel ROS production by providing reduced nicotinamide adenine dinucleotide phosphate (NADPH) for superoxide generation by NADPH oxidase. Conversely, G6PD also facilitates ROS scavenging using the glutathione pathway. We hypothesized that a high-sugar intake would increase flux through G6PD to increase myocardial NADPH and ROS and accelerate cardiac dysfunction and death. Six-week-old TO-2 hamsters, a non-hypertensive model of genetic cardiomyopathy caused by a δ-sarcoglycan mutation, were fed a long-term diet of high starch or high sugar (57% of energy from sucrose plus fructose). After 24 wk, the δ-sarcoglycan-deficient animals displayed expected decreases in survival and cardiac function associated with cardiomyopathy (ejection fraction: control 68.7 ± 4.5%, TO-2 starch 46.1 ± 3.7%, P < 0.05 for TO-2 starch versus control; TO-2 sugar 58.0 ± 4.2%, NS, versus TO-2 starch or control; median survival: TO-2 starch 278 d, TO-2 sugar 318 d, P = 0.133). Although the high-sugar intake was expected to exacerbate cardiomyopathy, surprisingly, there was no further decrease in ejection fraction or survival with high sugar compared with starch in cardiomyopathic animals. Cardiomyopathic animals had systemic and cardiac metabolic abnormalities (increased serum lipids and glucose and decreased myocardial oxidative enzymes) that were unaffected by diet. The high-sugar intake increased myocardial superoxide, but NADPH and lipid peroxidation were unaffected. A sugar-enriched diet did not further impair ventricular function, worsen metabolic abnormalities, or reduce survival in heart failure despite an increase in superoxide production. Copyright © 2012 Elsevier Inc. All rights reserved.
Tobacco Cessation May Improve Lung Cancer Patient Survival.
Dobson Amato, Katharine A; Hyland, Andrew; Reed, Robert; Mahoney, Martin C; Marshall, James; Giovino, Gary; Bansal-Travers, Maansi; Ochs-Balcom, Heather M; Zevon, Michael A; Cummings, K Michael; Nwogu, Chukwumere; Singh, Anurag K; Chen, Hongbin; Warren, Graham W; Reid, Mary
2015-07-01
This study characterizes tobacco cessation patterns and the association of cessation with survival among lung cancer patients at Roswell Park Cancer Institute: an NCI Designated Comprehensive Cancer Center. Lung cancer patients presenting at this institution were screened with a standardized tobacco assessment, and those who had used tobacco within the past 30 days were automatically referred to a telephone-based cessation service. Demographic, clinical information, and self-reported tobacco use at last contact were obtained via electronic medical records and the Roswell Park Cancer Institute tumor registry for all lung cancer patients referred to the service between October 2010 and October 2012. Descriptive statistics and Cox proportional hazards models were used to assess whether tobacco cessation and other factors were associated with lung cancer survival through May 2014. Calls were attempted to 313 of 388 lung cancer patients referred to the cessation service. Eighty percent of patients (250 of 313) were successfully contacted and participated in at least one telephone-based cessation call; 40.8% (102 of 250) of persons contacted reported having quit at the last contact. After controlling for age, pack year history, sex, Eastern Cooperative Oncology Group performance status, time between diagnosis and last contact, tumor histology, and clinical stage, a statistically significant increase in survival was associated with quitting compared with continued tobacco use at last contact (HR = 1.79; 95% confidence interval: 1.14-2.82) with a median 9 month improvement in overall survival. Tobacco cessation among lung cancer patients after diagnosis may increase overall survival.
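The multivariable Cox model described above can be expressed compactly with the lifelines package (an assumption here; the authors do not state their software). The column names are hypothetical placeholders for the covariates listed in the abstract, pre-coded as numeric variables.

```python
# A minimal sketch of a multivariable Cox proportional hazards fit along the
# lines described above; not the authors' code or dataset.
import pandas as pd
from lifelines import CoxPHFitter

def fit_survival_model(df: pd.DataFrame) -> CoxPHFitter:
    # df columns (hypothetical): months_survived, died (0/1), quit_tobacco (0/1),
    # age, pack_years, sex, ecog, clinical_stage, histology (coded numerically)
    cph = CoxPHFitter()
    cph.fit(df, duration_col="months_survived", event_col="died")
    return cph

# cph.print_summary() would show the hazard ratio for continued tobacco use
# versus quitting, adjusted for the other columns.
```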
Tran, Duong T; Gay, Isabel C; Diaz-Rodriguez, Janice; Parthasarathy, Kavitha; Weltman, Robin; Friedman, Lawrence
2016-01-01
To compare dental implant survival rates when placed in native bone and grafted sites. Additionally, risk factors associated with dental implant loss were identified. This study was based on the hypothesis that bone grafting has no effect on implant survival rates. A retrospective chart review was conducted for patients receiving dental implants at the University of Texas School of Dentistry from 1985 to 2012. Exclusion criteria included patients with genetic diseases, radiation and chemotherapy, or an age less than 18 years. To avoid misclassification bias, implants were excluded if bone grafting had been performed only at the time of implant placement. Data on age, sex, tobacco use, diabetes, osteoporosis, anatomical location of the implant, implant length and width, bone graft, and professional maintenance were collected for analysis. A total of 1,222 patients with 2,729 implants were included. The cumulative survival rates at 5 and 10 years were 92% and 87% for implants placed in native bone and 90% and 79% for implants placed in grafted bone, respectively. The results from multivariate analysis (Cox regression) indicated no significant difference in survival between the two groups; having maintenance therapy after implant placement reduced the failure rate by 80% (P < .001), and using tobacco increased the failure rate by 2.6-fold (P = .001). There was no difference in the dental implant survival rate when implants were placed in native bone or bone-grafted sites. Smoking and lack of professional maintenance were significantly related to increased implant loss.
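The cumulative survival rates quoted above come from the Kaplan-Meier product-limit estimator; a self-contained Python sketch of that estimator follows. The input arrays are placeholders rather than the chart-review data; in practice one curve would be computed per group (native bone versus grafted sites) and read off at 5 and 10 years.

```python
# A self-contained sketch of the Kaplan-Meier product-limit estimator.
import numpy as np

def kaplan_meier(time, event):
    """Return the event times and the KM survival estimate at each of them."""
    time, event = np.asarray(time, float), np.asarray(event, int)
    order = np.argsort(time)
    time, event = time[order], event[order]
    surv, s = [], 1.0
    uniq = np.unique(time[event == 1])
    for t in uniq:
        at_risk = np.sum(time >= t)              # implants still under follow-up at t
        failures = np.sum((time == t) & (event == 1))
        s *= 1.0 - failures / at_risk            # product-limit update
        surv.append(s)
    return uniq, np.array(surv)

# Usage: event = 1 for implant loss, 0 for censoring; compute one curve per
# group and compare S(5 years) and S(10 years).
```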
Evaluating Conservation Breeding Success for an Extinct-in-the-Wild Antelope.
Little, Holly A; Gilbert, Tania C; Athorn, Marie L; Marshall, Andrew R
2016-01-01
With the number of threatened species increasing globally, conservation breeding is now more vital than ever. However, no previous peer-reviewed study has attempted to determine how the varying conditions across zoos have influenced breeding by an extinct-in-the-wild species. We therefore use questionnaires and studbook data to evaluate the influence of husbandry practices and enclosure design on scimitar-horned oryx (Oryx dammah) breeding success, at the herd level. Regression models were used to identify the variables that best predicted breeding success among 29 zoos across a five-year period. Calf survival decreased with herd age and the use of soft substrates in hardstand areas (yard area usually adjacent to the indoor housing), explaining 30.7% of overall variation. Calf survival also decreased where herds were small and where food provisions were not raised (and hence likely incited competition), although these were less influential. Likewise, birth rate decreased with soft substrates in hardstand areas and unraised food provisions, although these were less influential than for calf survival. Birth rate increased with year-round male presence, yet this decreased calf survival. Compared to previous studies, the number of enclosure/husbandry influences on breeding was relatively small. Nevertheless, these few enclosure/husbandry influences explained over one third of the variation in calf survival. Our data therefore suggest some potential improvements and hence that extinct-in-the-wild species stand a greater chance of survival with empirical design of zoo enclosures and husbandry methods.
Ferretti, Stefano; Bossard, Nadine; Binder-Fouchard, Florence; Faivre, Jean; Bordoni, Andrea; Biavati, Patrizia; Frassoldati, Antonio
2017-01-01
Liver cancer represents a major clinical challenge. The aim of the SUDCAN collaborative study was to compare the net survival from liver cancer between six European Latin countries (Belgium, France, Italy, Portugal, Spain and Switzerland) and provide trends in net survival and dynamics of excess mortality rates (EMRs) up to 5 years after diagnosis. The data were extracted from the EUROCARE-5 database. First, net survival was studied over the period 2000-2004 using the Pohar-Perme estimator. For trend analyses, the study period was specific to each country. Results are reported from 1992 to 2004 in France, Italy, Spain and Switzerland and from 2000 to 2004 in Belgium and Portugal. These trend analyses were carried out using a flexible excess-rate modeling strategy. There was little difference between the six countries in the 5-year age-standardized net survival (2000-2004): it ranged from 13% (France and Portugal) to 16% (Belgium). An increase in the net age-standardized survival was observed in all countries between 1992 and 2004, both at 1 year and at 5 years (the highest in Spain, the lowest in France). Generally, patients aged 60 years showed the highest increase. There was a progressive decrease in EMR over the 5-year period following diagnosis. The study confirmed the poor prognosis of liver cancer. Innovative treatments, as well as preventive screening of cirrhotic patients with good liver function, might improve the prognosis. Efforts are also needed to improve registration practices.
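The quantities behind this kind of net-survival analysis can be summarised in a few lines: relative survival is observed divided by expected survival, and the excess mortality rate is the observed minus the expected hazard. The Python sketch below uses illustrative numbers (roughly in line with the figures quoted) and does not reproduce the Pohar-Perme estimator or the flexible excess-rate models themselves.

```python
# Relative survival and excess mortality rate from observed and expected curves;
# all values are illustrative, not EUROCARE-5 data.
import numpy as np

t = np.array([1.0, 3.0, 5.0])          # years since diagnosis
s_obs = np.array([0.38, 0.18, 0.11])   # observed (all-cause) survival, illustrative
s_exp = np.array([0.97, 0.92, 0.86])   # expected survival from population life tables

relative_survival = s_obs / s_exp      # ≈ net survival under standard assumptions

h_obs = -np.gradient(np.log(s_obs), t) # observed hazard from the survival curve
h_exp = -np.gradient(np.log(s_exp), t) # expected (background) hazard
excess_mortality_rate = h_obs - h_exp  # EMR(t), the quantity modelled flexibly above

print(relative_survival, excess_mortality_rate)
```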
Cohen, Todd J; Asheld, Wilbur J; Germano, Joseph; Islam, Shahidul; Patel, Dhimesh
2015-06-01
The purpose of the study was to examine survival in the implantable defibrillator subset of implanted leads at a large-volume implanting hospital. Implantable lead survival has been the subject of many multicenter studies over the past decade. Fewer single-hospital studies from large-volume implanting centers have examined defibrillator lead failure as it relates to patient survival and lead construction. This investigator-initiated retrospective study examined defibrillator lead failure in those who underwent implantation of a defibrillator between February 1, 1996 and December 31, 2011. Lead failure was defined as: failure to capture/sense, abnormal pacing and/or defibrillator impedance, visual insulation defect or lead fracture, extracardiac stimulation, cardiac perforation, tricuspid valve entrapment, lead tip fracture and/or lead dislodgment. Patient characteristics, implant approach, lead manufacturers, lead models, recalled status, patient mortality, and core lead design elements were compared using methods that include Kaplan-Meier analysis, univariate and multivariable Cox regression models. A total of 4078 defibrillator leads were implanted in 3802 patients (74% male; n = 2812) with a mean age of 70 ± 13 years at Winthrop University Hospital. Lead manufacturers included: Medtronic: [n = 1834; 801 recalled]; St. Jude Medical: [n = 1707; 703 recalled]; Boston Scientific: [n = 537; 0 recalled]. Kaplan-Meier analysis adjusted for multiple comparisons revealed that both Boston Scientific's and St. Jude Medical's leads had better survival than Medtronic's leads (P<.001 and P=.01, respectively). Lead survival was comparable between Boston Scientific and St. Jude Medical (P=.80). A total of 153 leads failed (3.5% of all leads) during the study. There were 99 lead failures from Medtronic (5.4% failure rate); 56 were recalled Sprint Fidelis leads. There were 36 lead failures from St. Jude (2.1% failure rate); 20 were recalled Riata or Riata ST leads. There were 18 lead failures from Boston Scientific (3.35% failure rate); none were recalled. Kaplan-Meier analysis also showed lead failure occurred sooner in the recalled leads (P=.01). A total of 1493 patients died during the study (mechanism of death was largely unknown). There was a significant increase in mortality in the recalled lead group as compared with non-recalled leads (P=.01), but no significant difference in survival when comparing recalled leads from Medtronic with St. Jude Medical (P=.67). A multivariable Cox regression model revealed younger age, history of percutaneous coronary intervention, baseline rhythm other than atrial fibrillation or atrial flutter, combination polyurethane and silicone lead insulation, a second defibrillation coil, and recalled lead status all contributed to lead failure. This study demonstrated a significantly improved lead performance in the Boston Scientific and St. Jude leads as compared with Medtronic leads. Some lead construction variables (insulation and number of coils) also had a significant impact on lead failure, which was independent of the manufacturer. Recalled St. Jude leads performed better than recalled Medtronic leads in our study. Recalled St. Jude leads had no significant difference in lead failure when compared with the other manufacturer's non-recalled leads. Recalled defibrillator lead status was associated with an increased mortality as compared with non-recalled leads. This correlation was independent of the lead manufacturer and clinically significant even when considering known mortality risk factors. These results must be tempered by the largely unknown mechanism of death in these patients.
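A two-group comparison of lead survival like the recalled versus non-recalled contrast above is typically made with Kaplan-Meier curves and a log-rank test; the sketch below shows that pattern using the lifelines package (assumed here, not stated by the authors), with placeholder arrays rather than the hospital's registry data.

```python
# A minimal sketch of a two-group lead-survival comparison with a log-rank test.
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

def compare_lead_groups(t_recalled, e_recalled, t_other, e_other):
    # e_* = 1 if the lead failed, 0 if censored (still functioning or follow-up ended)
    km_recalled = KaplanMeierFitter().fit(t_recalled, event_observed=e_recalled,
                                          label="recalled")
    km_other = KaplanMeierFitter().fit(t_other, event_observed=e_other,
                                       label="non-recalled")
    result = logrank_test(t_recalled, t_other,
                          event_observed_A=e_recalled, event_observed_B=e_other)
    return km_recalled, km_other, result.p_value
```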
Reproductive success of Horned Lark and McCown's Longspur in relation to wind energy infrastructure
Mahoney, Anika; Chalfoun, Anna D.
2016-01-01
Wind energy is a rapidly expanding industry with potential indirect effects to wildlife populations that are largely unexplored. In 2011 and 2012, we monitored 211 nests of 2 grassland songbirds, Horned Lark (Eremophila alpestris) and McCown's Longspur (Rhynchophanes mccownii), at 3 wind farms and 2 undeveloped reference sites in Wyoming, USA. We evaluated several indices of reproductive investment and success: clutch size, size-adjusted nestling mass, daily nest survival rate, and number of fledglings. We compared reproductive success between wind farms and undeveloped sites and modeled reproductive success within wind farms as a function of wind energy infrastructure and habitat. Size-adjusted nestling mass of Horned Lark was weakly negatively related to turbine density. In 2011, nest survival of Horned Lark decreased 55% as turbine density increased from 10 to 39 within 2 km of the nest. In 2012, however, nest survival of Horned Lark was best predicted by the combination of vegetation height, distance to shrub edge, and turbine density, with survival increasing weakly with increasing vegetation height. McCown's Longspur nest survival was weakly positively related to vegetation density at the nest site when considered with the amount of grassland habitat in the neighborhood and turbine density within 1 km of the nest. Habitat and distance to infrastructure did not explain clutch size or number of fledglings for either species, or size-adjusted nestling mass for McCown's Longspur. Our results suggest that the influence of wind energy infrastructure varies temporally and by species, even among species using similar habitats. Turbine density was repeatedly the most informative measure of wind energy development. Turbine density could influence wildlife responses to wind energy production and may become increasingly important to consider as development continues in areas with high-quality wind resources.
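Daily nest survival rates such as those modelled above are, at their simplest, Mayfield-type estimates: failures per nest-day of exposure, raised to the length of the nesting period to give overall nest success. The Python sketch below shows only that constant-rate arithmetic with made-up numbers; the study's actual models additionally let daily survival vary with covariates such as turbine density and vegetation.

```python
# Mayfield-style daily nest survival; the inputs are placeholders, not field data.
def daily_survival_rate(failed_nests: int, exposure_days: float) -> float:
    """Constant daily survival rate: 1 minus failures per nest-day observed."""
    return 1.0 - failed_nests / exposure_days

def period_survival(dsr: float, nesting_period_days: int) -> float:
    """Probability a nest survives the whole nesting period."""
    return dsr ** nesting_period_days

dsr = daily_survival_rate(failed_nests=40, exposure_days=1500)   # DSR ≈ 0.973
print(period_survival(dsr, nesting_period_days=25))              # ≈ 0.51
```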
Julin, Jaakko; Jämsen, Esa; Puolakka, Timo; Konttinen, Yrjö T; Moilanen, Teemu
2010-08-01
Total knee replacements (TKRs) are being increasingly performed in patients aged ≤ 65 years who often have high physical demands. We investigated the relation between age of the patient and prosthesis survival following primary TKR using nationwide data collected from the Finnish Arthroplasty Register. From Jan 1, 1997 through Dec 31, 2003, 32,019 TKRs for primary or secondary osteoarthritis were reported to the Finnish Arthroplasty Register. The TKRs were followed until the end of 2004. During the follow-up, 909 TKRs were revised, 205 (23%) due to infection and 704 for other reasons. Crude overall implant survival improved with increasing age between the ages of 40 and 80. The 5-year survival rates were 92% and 95% in patients aged ≤ 55 and 56-65 years, respectively, compared to 97% in patients who were > 65 years of age (p < 0.001). The difference was mainly attributable to reasons other than infections. Sex, diagnosis, type of TKR (condylar, constrained, or hinge), use of patellar component, and fixation method were also associated with higher revision rates. However, the differences in prosthesis survival between the age groups ≤ 55, 56-65, and > 65 years remained after adjustment for these factors (p < 0.001). Young age impairs the prognosis of TKR and is associated with increased revision rates for non-infectious reasons. Diagnosis, sex, type of TKR, use of patellar component, and fixation method partly explain the differences, but the effects of physical activity, patient demands, and obesity on implant survival in younger patients warrant further research.
Diaz, Abbey; Baade, Peter D; Valery, Patricia C; Whop, Lisa J; Moore, Suzanne P; Cunningham, Joan; Garvey, Gail; Brotherton, Julia M L; O'Connell, Dianne L; Canfell, Karen; Sarfati, Diana; Roder, David; Buckley, Elizabeth; Condon, John R
2018-01-01
Little is known about the impact of comorbidity on cervical cancer survival in Australian women, including whether Indigenous women's higher prevalence of comorbidity contributes to their lower survival compared to non-Indigenous women. Data for cervical cancers diagnosed in 2003-2012 were extracted from six Australian state-based cancer registries and linked to hospital inpatient records to identify comorbidity diagnoses. Five-year cause-specific and all-cause survival probabilities were estimated using the Kaplan-Meier method. Flexible parametric models were used to estimate excess cause-specific mortality by Charlson comorbidity index score (0,1,2+), for Indigenous women compared to non-Indigenous women. Of 4,467 women, Indigenous women (4.4%) compared to non-Indigenous women had more comorbidity at diagnosis (score ≥1: 24.2% vs. 10.0%) and lower five-year cause-specific survival (60.2% vs. 76.6%). Comorbidity was associated with increased cervical cancer mortality for non-Indigenous women, but there was no evidence of such a relationship for Indigenous women. There was an 18% reduction in the Indigenous: non-Indigenous hazard ratio (excess mortality) when comorbidity was included in the model, yet this reduction was not statistically significant. The excess mortality for Indigenous women was only evident among those without comorbidity (Indigenous: non-Indigenous HR 2.5, 95%CI 1.9-3.4), indicating that factors other than those measured in this study are contributing to the differential. In a subgroup of New South Wales women, comorbidity was associated with advanced-stage cancer, which in turn was associated with elevated cervical cancer mortality. Survival was lowest for women with comorbidity. However, there wasn't a clear comorbidity-survival gradient for Indigenous women. Further investigation of potential drivers of the cervical cancer survival differentials is warranted. The results highlight the need for cancer care guidelines and multidisciplinary care that can meet the needs of complex patients. Also, primary and acute care services may need to pay more attention to Indigenous Australian women who may not obviously need it (i.e. those without comorbidity).
Lalramliana; Yadav, Arun K
2016-12-01
Three locally isolated strains of entomopathogenic nematodes (EPNs), viz. Heterorhabditis indica, Steinernema thermophilum and Steinernema glaseri, from Meghalaya, India, were characterized in terms of the effects of storage temperature on the survival and infectivity of their infective juveniles (IJs). The survival and infectivity of nematode IJs were studied at 5 ± 2 and 25 ± 2 °C for a period of 120 days, using deionized water as the storage medium. The viability of nematode IJs was checked by a mobility criterion at different storage periods, while the infectivity of nematode IJs was ascertained on the basis of establishment of IJs, using Galleria mellonella larva mortality tests in petri dishes. The results of this study revealed that storage temperature markedly affects the survival as well as the establishment of nematode IJs of the three EPN species. At 5 °C, a comparatively high rate of IJ survival (74-86%) was observed for 15 days of storage, but this declined drastically to 28-32% after 30 days of storage for H. indica and S. thermophilum. On the other hand, at 25 °C, the survival of nematode IJs was observed up to 120 days for all three studied EPNs. In the case of S. thermophilum and S. glaseri, a higher rate of IJ survival (>75%) was observed at 15 and 30 days of observation, respectively. The study also showed that the establishment of IJs of the three EPN species declines with increasing storage period at both test temperatures. In general, the nematodes stored at 25 °C showed comparatively better establishment than those stored at 5 °C. Among the three EPNs studied, the establishment of S. glaseri was comparatively better than that of the other species at both temperatures and for different storage durations. In conclusion, our study adds further valuable information about the effect of storage temperature on survival and infectivity of three indigenous EPN species of Meghalaya, India, which appear to be promising biocontrol agents of local insect pests.
2012-01-01
Background We explore the benefits of applying a new proportional hazard model to analyze survival of breast cancer patients. As a parametric model, the hypertabastic survival model offers a closer fit to experimental data than Cox regression, and furthermore provides explicit survival and hazard functions which can be used as additional tools in the survival analysis. In addition, one of our main concerns is utilization of multiple gene expression variables. Our analysis treats the important issue of interaction of different gene signatures in the survival analysis. Methods The hypertabastic proportional hazards model was applied in survival analysis of breast cancer patients. This model was compared, using statistical measures of goodness of fit, with models based on the semi-parametric Cox proportional hazards model and the parametric log-logistic and Weibull models. The explicit functions for hazard and survival were then used to analyze the dynamic behavior of hazard and survival functions. Results The hypertabastic model provided the best fit among all the models considered. Use of multiple gene expression variables also provided a considerable improvement in the goodness of fit of the model, as compared to use of only one. By utilizing the explicit survival and hazard functions provided by the model, we were able to determine the magnitude of the maximum rate of increase in hazard, and the maximum rate of decrease in survival, as well as the times when these occurred. We explore the influence of each gene expression variable on these extrema. Furthermore, in the cases of continuous gene expression variables, represented by a measure of correlation, we were able to investigate the dynamics with respect to changes in gene expression. Conclusions We observed that use of three different gene signatures in the model provided a greater combined effect and allowed us to assess the relative importance of each in determination of outcome in this data set. These results point to the potential to combine gene signatures to a greater effect in cases where each gene signature represents some distinct aspect of the cancer biology. Furthermore we conclude that the hypertabastic survival models can be an effective survival analysis tool for breast cancer patients. PMID:23241496
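The goodness-of-fit comparison strategy described above can be illustrated with censored maximum likelihood and AIC. The sketch below fits Weibull and log-logistic models as stand-ins (the hypertabastic distribution itself is not implemented here), with t as follow-up times and d as event indicators; lower AIC indicates the better-fitting parametric form.

```python
# Censored maximum-likelihood fits and AIC for two stand-in parametric models.
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, t, d, logpdf, logsf):
    # d = 1 contributes the log-density, d = 0 (censored) the log-survival
    return -np.sum(d * logpdf(t, *params) + (1 - d) * logsf(t, *params))

def weibull_logpdf(t, k, lam):
    return np.log(k / lam) + (k - 1) * np.log(t / lam) - (t / lam) ** k

def weibull_logsf(t, k, lam):
    return -(t / lam) ** k

def loglogistic_logpdf(t, a, b):
    z = (t / a) ** b
    return np.log(b / a) + (b - 1) * np.log(t / a) - 2 * np.log1p(z)

def loglogistic_logsf(t, a, b):
    return -np.log1p((t / a) ** b)

def fit_aic(t, d, logpdf, logsf, x0):
    # optimise on the log scale so both parameters stay positive
    obj = lambda p: neg_loglik(np.exp(p), t, d, logpdf, logsf)
    res = minimize(obj, np.log(x0), method="Nelder-Mead")
    return 2 * len(x0) + 2 * res.fun          # AIC = 2k - 2 log L

# t = follow-up times, d = 1 for death / 0 for censored; lower AIC wins:
# aic_w  = fit_aic(t, d, weibull_logpdf, weibull_logsf, x0=[1.0, np.median(t)])
# aic_ll = fit_aic(t, d, loglogistic_logpdf, loglogistic_logsf, x0=[np.median(t), 1.0])
```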
The HLA-matching effect in different cohorts of kidney transplant recipients: 10 years later.
Sasaki, Nori; Idica, Adam
2010-01-01
Almost all the HLA-matching effects found by the 2000 analysis were confirmed by this study. The only HLA-matching effect found in the 2000 analysis that disappeared were those of "small matching effect" found in sub-populations of type I diabetes (PRA < 10%, donor age 20-35). The 2000 analysis found a lack of HLA matching effect in non-African American kidney transplant patients with type I diabetes between 1987 and 2000. The 2000 analysis found that a patients' ethnic group was a factor in graft survival; African American patients were found to have a significantly lower 10-year graft survival in the 5 or 6 mismatched group (27%) compared to Caucasian patients (40%). In addition, Asian patients (42%) had higher graft survival compared to that of Caucasian patients. In this study, we observe a similar pattern with death-censored graft analysis for all ethnic groups with 10-year graft survivals at 72.9% for Asians, 69.5% for Caucasians, and 49.3% for African Americans. There was an overall lack of HLA-matching effect on patient survival in the 2000 analysis. In our current analysis, the patient survivals remained virtually the same despite moderate increase in graft survival over the same period of time. The HLA-C locus mismatch was found to have additive effect to the 10-year graft survival trends observed in A and B mismatch cases. HLA-DQ mismatch on the other hand, showed limited HLA-matching effect and did not show the same additive effect as C. There are various possible issues in the DQ mismatch analysis, from the consistency of DQ typing results, lack of diversity in the DQ antigen, to the possibility of DQ mismatch having little effect on the graft survival. Utilizing kidney transplant cases performed from 1995 through 2000, the 2000 analysis projected 10-year survivals of 64% and 47% for the 0 ABDR mismatch and 5 or 6 ABDR mismatched cases respectively; the 2000 projection only missed actual death-censored survivals by 9% lower for the 0 mismatch and 17% lower for the 5 or 6 mismatch cases. Utilizing the transplant cases of 2005 through 2009, we projected their 10-year graft survivals for year 2020. The 10-year graft survival for 0 ABDR mismatched patients is expected to be over 85% and nearly 70% for 5 or 6 ABDR mismatched patients. The general upward trend of graft survival we have observed in the last 10 years has been dependent upon the development of novel transplant protocols and use of novel immunomodulatory reagents. This trend is likely to continue given the promise of new drugs and personalized healthcare. The decreasing range of the differences in the 10-year graft survival between best matched and worst matched HLA groups is also likely to continue. One interesting trend that is clearly evident is the increasing difference between the best and worst HLA-matching in terms of the associated graft half-life. The positive HLA-matching effect on long-term graft survival is clearly evident and should be taken into consideration for all kidney transplants.
Subsequent leukaemia in autoimmune disease patients.
Hemminki, Kari; Liu, Xiangdong; Försti, Asta; Ji, Jianguang; Sundquist, Jan; Sundquist, Kristina
2013-06-01
Previous studies have shown that patients diagnosed with some autoimmune (AI) diseases are at an increased risk of leukaemia but limited data are available on survival. We systematically analysed the risks (standardized incidence ratio, SIR) and survival (hazard ratio, HR) in nine types of leukaemia among 402 462 patients hospitalized for any of 33 AI diseases and compared to persons not hospitalized for AI diseases. Risk for all leukaemia was increased after 13 AI diseases and survival was decreased after six AI diseases. SIRs were increased after all AI diseases for seven types of leukaemia, including SIR 1·69 (95% confidence interval (CI): 1·29-2·19) for acute lymphoblastic leukaemia (ALL), 1·85 (95% CI: 1·65-2·07) for acute myeloid leukaemia, 1·68 (95% CI: 1·37-2·04) for chronic myeloid leukaemia, 2·20 (95% CI: 1·69-2·81) for 'other myeloid leukaemia', 2·45 (95% CI: 1·99-2·98) for 'other and unspecified leukaemia', 1·81 (95% CI: 1·11-2·81) for monocytic leukaemia, and 1·36 (95% CI: 1·08-1·69) for myelofibrosis. The HRs were increased for four types of leukaemia, most for myelofibrosis (1·74, 95% CI: 1·33-2·29) and ALL (1·42, 95% CI: 1·03-1·95). Some AI diseases, including rheumatoid arthritis, were associated with increased SIRs and HRs in many types of leukaemia. The present data showed increases in risk and decreases in survival for many types of leukaemia after various AI diseases. Leukaemia is a rare complication in AI disease but findings about this comorbidity at the time of leukaemia diagnosis may help to optimize the treatment and improve survival. © 2013 John Wiley & Sons Ltd.
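A standardized incidence ratio of the kind reported above is the observed case count divided by the count expected when reference rates are applied to the cohort's person-years, with an exact Poisson confidence interval. The Python sketch below uses made-up strata and rates purely for illustration.

```python
# SIR = observed / expected, with an exact (Garwood) Poisson 95% CI.
import numpy as np
from scipy.stats import chi2

def sir(observed: int, person_years: np.ndarray, reference_rates: np.ndarray):
    """SIR from strata (e.g. age-sex groups) of person-years and reference rates."""
    expected = np.sum(person_years * reference_rates)
    estimate = observed / expected
    lower = chi2.ppf(0.025, 2 * observed) / 2 / expected
    upper = chi2.ppf(0.975, 2 * (observed + 1)) / 2 / expected
    return estimate, (lower, upper)

# Example with made-up strata (person-years; leukaemia incidence per person-year):
py = np.array([50_000.0, 80_000.0, 40_000.0])
rates = np.array([2e-5, 5e-5, 1e-4])      # expected cases = 9
print(sir(observed=16, person_years=py, reference_rates=rates))   # SIR ≈ 1.78
```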
Flap preconditioning by pressure-controlled cupping in a rat model.
Koh, Kyung S; Park, Sung Woo; Oh, Tae Suk; Choi, Jong Woo
2016-08-01
Flap survival is essential for the success of soft-tissue reconstruction. Accordingly, various surgical and medical methods aim to increase flap survival. Because flap survival is affected by the innate vascular supply, traditional preconditioning methods mainly target vasodilatation or vascular reorientation to increase blood flow to the tissue. External stress on the skin, such as an external volume expander or cupping, induces vascular remodeling, and these approaches have been used in the fat grafting field and in traditional Asian medicine. In the present study, we used a rat random-pattern dorsal flap model to study the effectiveness of preconditioning with an externally applied device (cupping) at the flap site that directly applied negative pressure to the skin. The device, a pressure-controlled cup, is connected to a negative-pressure vacuum unit providing accurate pressure control from 0 mm Hg to -200 mm Hg. Flap surgery was performed after preconditioning under -25 mm Hg suction pressure for 30 min a day for 5 d, followed by 9 d of postoperative observation. Flap survival was assessed as the area of viable tissue and was compared between the preconditioned group and a control group. The preconditioned group showed an absolute increase in flap viability of 19.0 ± 7.6% of the total flap area (average 70.1% versus 51.0%). Tissue perfusion of the entire flap, evaluated with a laser Doppler imaging system, improved by an absolute 24.2 ± 10.4% (average 77.4% versus 53.1%). Histologic analysis of hematoxylin and eosin, CD31, and Masson-trichrome staining showed increased vascular density in the subdermal plexus and more organized collagen production with hypertrophy of the attached muscle. Our study suggests that flap preconditioning by controlled noninvasive suction induces vascular remodeling that increases tissue perfusion and improves flap survival in a rat model. Copyright © 2016 Elsevier Inc. All rights reserved.
Longitudinal change in the BODE index predicts mortality in severe emphysema.
Martinez, Fernando J; Han, Meilan K; Andrei, Adin-Cristian; Wise, Robert; Murray, Susan; Curtis, Jeffrey L; Sternberg, Alice; Criner, Gerard; Gay, Steven E; Reilly, John; Make, Barry; Ries, Andrew L; Sciurba, Frank; Weinmann, Gail; Mosenifar, Zab; DeCamp, Malcolm; Fishman, Alfred P; Celli, Bartolome R
2008-09-01
The predictive value of longitudinal change in BODE (Body mass index, airflow Obstruction, Dyspnea, and Exercise capacity) index has received limited attention. We hypothesized that decrease in a modified BODE (mBODE) would predict survival in National Emphysema Treatment Trial (NETT) patients. To determine how the mBODE score changes in patients with lung volume reduction surgery versus medical therapy and correlations with survival. Clinical data were recorded using standardized instruments. The mBODE was calculated and patient-specific mBODE trajectories during 6, 12, and 24 months of follow-up were estimated using separate regressions for each patient. Patients were classified as having decreasing, stable, increasing, or missing mBODE based on their absolute change from baseline. The predictive ability of mBODE change on survival was assessed using multivariate Cox regression models. The index of concordance was used to directly compare the predictive ability of mBODE and its separate components. The entire cohort (610 treated medically and 608 treated surgically) was characterized by severe airflow obstruction, moderate breathlessness, and increased mBODE at baseline. A wide distribution of change in mBODE was seen at follow-up. An increase in mBODE of more than 1 point was associated with increased mortality in surgically and medically treated patients. Surgically treated patients were less likely to experience death or an increase greater than 1 in mBODE. Indices of concordance showed that mBODE change predicted survival better than its separate components. The mBODE demonstrates short- and intermediate-term responsiveness to intervention in severe chronic obstructive pulmonary disease. Increase in mBODE of more than 1 point from baseline to 6, 12, and 24 months of follow-up was predictive of subsequent mortality. Change in mBODE may prove a good surrogate measure of survival in therapeutic trials in severe chronic obstructive pulmonary disease. Clinical trial registered with www.clinicaltrials.gov (NCT 00000606).
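The index of concordance used above to compare predictive ability is Harrell's C: among usable pairs of patients, the fraction in which the one with the higher predicted risk (here, the larger mBODE change) is the one who dies earlier. A self-contained, if naive O(n²), Python sketch follows; the inputs are placeholders.

```python
# Harrell's concordance index for right-censored survival data.
import numpy as np

def concordance_index(time, event, risk):
    """Fraction of usable pairs in which the higher predicted risk dies earlier."""
    time, event, risk = map(np.asarray, (time, event, risk))
    concordant, ties, usable = 0, 0, 0
    n = len(time)
    for i in range(n):
        for j in range(n):
            # a pair is usable when subject i is observed to fail before time[j]
            if event[i] == 1 and time[i] < time[j]:
                usable += 1
                if risk[i] > risk[j]:
                    concordant += 1
                elif risk[i] == risk[j]:
                    ties += 1
    return (concordant + 0.5 * ties) / usable

# e.g. risk = change in mBODE from baseline; higher C means better discrimination.
```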
Raedel, Michael; Fiedler, Cliff; Jacoby, Stephan; Boening, Klaus W
2015-07-01
Scientific data about the long-term survival of teeth treated with cast post and cores are scarce. Retrospective studies often use different target events for their analyses. A comparison is therefore complicated. For associated tooth-, jaw-, and patient-related factors little evidence exists as to their effect on survival. The purpose of this study was to extend the knowledge on the survival of teeth treated with cast post and cores for observation periods of more than 10 years. A decrease or increase in survival times according to the presence or absence of associated parameters needs to be evaluated. A retrospective evaluation was conducted of all cast post and cores inserted in 1 university clinic between January 1992 and June 2011. A Kaplan-Meier survival analysis was carried out by using extraction as the target event. The survival curves for different tooth types, the presence or absence of adjacent teeth, and the prosthetic restoration of the respective jaws were compared by using the log-rank test (α=.05). A Cox regression model was calculated for multivariate analyses. A total of 717 cast post and cores for 343 patients were recorded. The mean survival time was 13.5 years. A statistically significant decrease in survival times was found for canines (11.9 years) and premolars (13.4 years) versus molars (14.1 years), no adjacent teeth (10.6 years) versus at least 1 adjacent tooth (13.8 years), and the restoration with removable dental prostheses (12.5 years) versus fixed dental prostheses and single crowns (13.9 years). The largest reduction in survival time was found for teeth being used as an abutment for a double crown-retained removable partial dental prosthesis (telescopic denture) (9.8 years). Tooth type and adjacent tooth status remained as significant variables within the multivariate Cox regression model. Cast post and cores have an acceptable long-term survival time. Because different factors may influence survival, considering these factors in treatment planning may increase the long-term success of these restorations. Copyright © 2015 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
EOSINOPHIL INFLUX TO THE NASAL AIRWAY FOLLOWING LOCAL, LOW-LEVEL LPS CHALLENGE IN HUMANS
Background: Recent observations show that atopic asthmatic subjects have increased sensitivity to respirable endotoxin (or LPS) compared with normal persons. In vitro studies demonstrate that LPS enhances eosinophil survival. These observations suggest that the effects of inhal...
Pulte, Dianne; Redaniel, Maria Theresa; Bird, Jenny; Jeffreys, Mona
2015-06-01
Chronic lymphocytic leukemia (CLL) and chronic myeloid leukemia (CML) are highly treatable conditions occurring primarily in older patients. Lower survival among older people has been reported in both conditions, but newer treatments may change both the overall survival rate and the relative risk associated with aging. Here, we examine survival for patients with CLL and CML in the United States (US) and England. Patients with CLL and CML were identified from the Surveillance, Epidemiology, and End Results (US) and National Cancer Registry (England). Five-year relative survival was calculated by major age group. Excess hazard ratios (EHR) by age were calculated for each condition, and multivariable analysis was performed to adjust for the following potential confounders: gender, race or ethnic group (US only), period of diagnosis, and a measure of socioeconomic deprivation (England only). Five-year relative survival increased for both CLL and CML in both England and the US between 1996-2000 and 2006-2010. However, relative age-related disparities persisted. For CLL, the EHR for death was 9.44 (7.84-11.36) in the US and 6.14 (5.65-6.68) in England for ages 85+ compared to ages 55-64. For CML, the EHR was 3.52 (3.17-3.90) in the US and 4.54 (4.13-4.98) in England for ages 75+ compared to ages 45-64. Survival improved for patients with chronic leukemias in the early 21st century. However, age-related disparities persist, despite clinical trial evidence that treatment in older adults with chronic leukemia can be safe and effective. Further research to determine the reasons for the lower survival in older patients and greater awareness of this problem may improve survival for older patients with chronic leukemia. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Is There an Age Limit to Lung Transplantation?
Biswas Roy, Sreeja; Alarcon, Diana; Walia, Rajat; Chapple, Kristina M; Bremner, Ross M; Smith, Michael A
2015-08-01
Lung transplantation in patients older than 65 years is increasingly common, but questions remain regarding risk vs benefit and procedure choice. We identified short-term and long-term outcomes in older single-lung transplant (SLT) and bilateral-lung transplant (BLT) recipients. We performed a retrospective review of United Network for Organ Sharing data for patients who underwent lung transplantation between May 2005 and December 2012. Patients were grouped by age, and we calculated short-term and long-term survival rates and compared survival distributions. Of the 11,776 patients who received lung transplants, 9,317 (79%) were aged 12 to 64 years, 1,902 (16%) were 65 to 69, 486 (4%) were 70 to 74, and 71 (1%) were 75 to 79. Short-term survival was similar across all age groups and procedure types except those aged 75 to 79, who had lower short-term survival for BLT. Those aged 12 to 64 had higher 5-year survival for SLT and BLT than all other groups (p < 0.001), and BLT offered a long-term survival advantage over SLT in this group (p < 0.0001). Older age groups trended toward better long-term survival for BLT compared with SLT (65 to 69, p = 0.059; 70 to 74, p = 0.079). Although data were lacking for 5-year survival for those aged 75 to 79, the 3-year survival for BLT in this group was inferior. Lung transplant can be offered to select older patients up to age 74 with acceptable outcomes. SLT may be preferred for elderly patients, but BLT offers acceptable long-term outcomes without significant short-term risk. Patients older than 75 have acceptable short-term outcomes for SLT, but long-term outcomes for SLT and BLT in this group are poor. Copyright © 2015 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
Miller, Jessica A; Teel, David J; Peterson, William T; Baptista, Antonio M
2014-01-01
Research on regulatory mechanisms in biological populations often focuses on environmental covariates. An integrated approach that combines environmental indices with organismal-level information can provide additional insight on regulatory mechanisms. Survival of spring/summer Snake River Chinook salmon (Oncorhynchus tshawytscha) is consistently low whereas some adjacent populations with similar life histories experience greater survival. It is not known if populations with differential survival respond similarly during early marine residence, a critical period in the life history. Ocean collections, genetic stock identification, and otolith analyses were combined to evaluate the growth-mortality and match-mismatch hypotheses during early marine residence of spring/summer Snake River Chinook salmon. Interannual variation in juvenile attributes, including size at marine entry and marine growth rate, was compared with estimates of survival and physical and biological metrics. Multiple linear regression and multi-model inference were used to evaluate the relative importance of biological and physical metrics in explaining interannual variation in survival. There was relatively weak support for the match-mismatch hypothesis and stronger evidence for the growth-mortality hypothesis. Marine growth and size at capture were strongly, positively related to survival, a finding similar to spring Chinook salmon from the Mid-Upper Columbia River. In hindcast models, basin-scale indices (Pacific Decadal Oscillation (PDO) and the North Pacific Gyre Oscillation (NPGO)) and biological indices (juvenile salmon catch-per-unit-effort (CPUE) and a copepod community index (CCI)) accounted for substantial and similar portions of variation in survival for juvenile emigration years 1998-2008 (R2>0.70). However, in forecast models for emigration years 2009-2011, there was an increasing discrepancy between predictions based on the PDO (50-448% of observed value) compared with those based on the NPGO (68-212%) or biological indices (CPUE and CCI: 83-172%). Overall, the PDO index was remarkably informative in earlier years but other basin-scale and biological indices provided more accurate indications of survival in recent years.
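Multi-model inference of the kind referenced here is commonly implemented by ranking candidate regressions by AIC and converting AIC differences into Akaike weights; the exact approach of the authors is not reproduced. The sketch below uses simulated yearly values and hypothetical column names:

```python
# Simulated yearly indices and hypothetical column names, not the study's data
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_years = 11                                   # e.g., emigration years 1998-2008
df = pd.DataFrame({
    "survival": rng.normal(1.0, 0.3, n_years), # hypothetical survival index
    "pdo":      rng.normal(0, 1, n_years),     # Pacific Decadal Oscillation
    "npgo":     rng.normal(0, 1, n_years),     # North Pacific Gyre Oscillation
    "cpue":     rng.normal(0, 1, n_years),     # juvenile salmon catch-per-unit-effort
})

candidates = {"PDO": ["pdo"], "NPGO": ["npgo"], "CPUE": ["cpue"], "PDO+CPUE": ["pdo", "cpue"]}
aic = {name: sm.OLS(df["survival"], sm.add_constant(df[cols])).fit().aic
       for name, cols in candidates.items()}

# Akaike weights: w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2)
delta = {k: v - min(aic.values()) for k, v in aic.items()}
raw = {k: np.exp(-d / 2.0) for k, d in delta.items()}
total = sum(raw.values())
for k in sorted(raw, key=raw.get, reverse=True):
    print(f"{k:9s} AIC={aic[k]:7.2f}  weight={raw[k] / total:.2f}")
```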
Immunogenicity and protective efficacy of the Mycobacterium tuberculosis fadD26 mutant
Infante, E; Aguilar, L D; Gicquel, B; Pando, R Hernandez
2005-01-01
The Mycobacterium tuberculosis fadD26 mutant has impaired synthesis of phthiocerol dimycocerosates (DIM) and is attenuated in BALB/c mice. Survival analysis following direct intratracheal infection confirmed the attenuation: 60% survival at 4 months post-infection versus 100% mortality at 9 weeks post-infection with the wild-type strain. The fadD26 mutant induced less pneumonia and larger DTH reactions. It induced lower but progressive production of interferon (IFN)-γ, interleukin (IL)-4 and tumour necrosis factor (TNF)-α. Used as a subcutaneous vaccine 60 days before intratracheal challenge with a hypervirulent strain of M. tuberculosis (Beijing code 9501000), the mutant induced a higher level of protection than did Bacille Calmette–Guérin (BCG). Seventy per cent of the mice vaccinated with the fadD26 mutant survived at 16 weeks after challenge compared to 30% of those vaccinated with BCG. Similarly, there was less tissue damage (pneumonia) and lower colony-forming units (CFU) in the mice vaccinated with the fadD26 mutant compared to the findings in mice vaccinated with BCG. These data suggest that DIM synthesis is important for the pathogenicity of M. tuberculosis, and that inactivation of DIM synthesis can increase the immunogenicity of live vaccines, and increase their ability to protect against tuberculosis.
Population-based analysis of survival for hypoplastic left heart syndrome.
Hirsch, Jennifer C; Copeland, Glenn; Donohue, Janet E; Kirby, Russell S; Grigorescu, Violanda; Gurney, James G
2011-07-01
To analyze survival patterns among infants with hypoplastic left heart syndrome (HLHS) in the State of Michigan. Cases of HLHS prevalent at live birth were identified and confirmed within the Michigan Birth Defects Registry from 1992 to 2005 (n=406). Characteristics of infants with HLHS were compared with a 10:1 random control sample. Compared with 4060 control subjects, the 406 cases of HLHS were more frequently male (62.6% vs 51.4%), born prematurely (<37 weeks gestation; 15.3% vs 8.7%), and born at low birth weight (LBW) (<2.5 kg; 16.0% vs 6.6%). HLHS 1-year survival rate improved over the study period (P=.041). Chromosomal abnormalities, LBW, premature birth, and living in a high poverty neighborhood were significantly associated with death. Controlling for neighborhood poverty, term infants versus preterm with HLHS or LBW were 3.2 times (95% CI: 1.9-5.3; P<.001) more likely to survive at least 1 year. Controlling for age and weight, infants from low-poverty versus high-poverty areas were 1.8 times (95% CI: 1.1-2.8; P=.015) more likely to survive at least 1 year. Among infants with HLHS in Michigan, those who were premature, LBW, had chromosomal abnormalities, or lived in a high-poverty area were at increased risk for early death. Copyright © 2011 Mosby, Inc. All rights reserved.
Zhang, Lili; Fan, Zhaomin; Han, Yuechen; Xu, Lei; Liu, Wenwen; Bai, Xiaohui; Zhou, Meijuan; Li, Jianfeng; Wang, Haibo
2018-04-01
Valproic acid (VPA), a medication primarily used to treat epilepsy and bipolar disorder, has been applied to the repair of central and peripheral nervous system injury. The present study investigated the effect of VPA on functional recovery, survival of facial motor neurons (FMNs), and expression of proteins in rats after facial nerve trunk transection, using functional measurement, Nissl staining, TUNEL, immunofluorescence, and Western blot. Following facial nerve injury, rats in the VPA group showed better functional recovery, which was significant at the given time points, compared with the normal saline (NS) group. The Nissl staining results demonstrated that the number of surviving FMNs in the VPA group was higher than that in the NS group. TUNEL staining showed that axonal injury of the facial nerve could lead to apoptosis of FMNs. Treatment with VPA significantly reduced cell apoptosis by decreasing the expression of Bax protein and increased neuronal survival by upregulating brain-derived neurotrophic factor (BDNF) and growth-associated protein-43 (GAP-43) expression in injured FMNs compared with the NS group. Overall, our findings suggest that VPA may advance functional recovery, reduce lesion-induced apoptosis, and promote neuron survival after facial nerve transection in rats. This study provides experimental evidence for better understanding the mechanisms of injury and repair in peripheral facial paralysis.
Wolski, Michal J; Bhatnagar, Ajay; Flickinger, John C; Belani, Chandra P; Ramalingam, Suresh; Greenberger, Joel S
2005-09-01
Three-dimensional (3D) conformal radiation therapy (CRT) and chemotherapy have recently improved lung cancer management. We reviewed outcomes in 68 patients with unresectable stage I-III non-small-cell lung cancer. Treatment consisted of 3D CRT alone or with concurrent chemotherapy (CCR). Concurrent chemotherapy improved survival, to a median of 17 ± 4.9 months, compared with 8 ± 4.1 months for the radiation therapy (RT) alone group (P=0.0347). The 2- and 5-year survival rates were 40.3% ± 7.7% and 14.1% ± 6.4%, respectively, with CCR, compared with 19.6% ± 9.6% and 0, respectively, for RT alone. In a subgroup analysis for age > 65, patients who received CCR (n=20) had significantly improved survival and local control (P=0.005 and P=0.0286, respectively). Acute esophageal toxicity of Radiation Therapy Oncology Group grade ≥3 was significantly higher in the CCR group and correlated with the RT dose (19% in CCR vs. 0 in RT, P=0.0234; P=0.050). The overall incidences of esophageal and pulmonary toxicity grade ≥3 were 20.6% and 5.9%, respectively. Our study confirms that CCR is associated with improved survival over RT alone, with a tolerable increase in acute toxicity.
Lee, Abigail H; Eme, John; Mueller, Casey A; Manzon, Richard G; Somers, Christopher M; Boreham, Douglas R; Wilson, Joanna Y
2016-04-01
Increasing incubation temperatures, caused by global climate change or thermal effluent from industrial processes, may influence embryonic development of fish. This study investigates the cumulative effects of increased incubation temperature and repeated heat shocks on developing Lake Whitefish (Coregonus clupeaformis) embryos. We studied the effects of three constant incubation temperatures (2°C, 5°C or 8°C water) and weekly, 1-h heat shocks (+3°C) on hatching time, survival and morphology of embryos, as these endpoints may be particularly susceptible to temperature changes. The constant temperatures represent the predicted magnitude of elevated water temperatures from climate change and industrial thermal plumes. Time to the pre-hatch stage decreased as constant incubation temperature increased (148d at 2°C, 92d at 5°C, 50d at 8°C), but weekly heat shocks did not affect time to hatch. Mean survival rates and embryo morphometrics were compared at specific developmental time-points (blastopore, eyed, fin flutter and pre-hatch) across all treatments. Constant incubation temperatures or +3°C heat-shock exposures did not significantly alter cumulative survival percentage (~50% cumulative survival to pre-hatch stage). Constant warm incubation temperatures did result in differences in morphology in pre-hatch stage embryos. 8°C and 5°C embryos were significantly smaller and had larger yolks than 2°C embryos, but heat-shocked embryos did not differ from their respective constant temperature treatment groups. Elevated incubation temperatures may adversely alter Lake Whitefish embryo size at hatch, but weekly 1-h heat shocks did not affect size or survival at hatch. These results suggest that intermittent bouts of warm water effluent (e.g., variable industrial emissions) are less likely to negatively affect Lake Whitefish embryonic development than warmer constant incubation temperatures that may occur due to climate change. Copyright © 2016 Elsevier Ltd. All rights reserved.
Cepeda-Franco, C; Bernal-Bellido, C; Barrera-Pulido, L; Álamo-Martínez, J M; Ruiz-Matas, J H; Suárez-Artacho, G; Marín-Gómez, L M; Tinoco-González, J; Díaz-Aunión, C; Padillo-Ruiz, F J; Gómez-Bravo, M Á
2016-11-01
Recently, there has been a large discrepancy between the number of patients on the waiting list for a liver transplant and the availability of deceased donors, with an increase in annual wait list mortality rates. Elderly donor livers are thought to be marginal grafts; however, in recent years, their utilization has constantly increased. The aim of this study is to evaluate the utilization of elderly donors in Andalusia and post-transplant outcomes. This retrospective observational study of 2408 liver transplants, performed in Andalusia between 2000 and 2014, analyzes the outcomes from donors aged 70 plus (n = 423) in terms of survival rates of the graft and the recipient, the type of transplant, donor age, and D-MELD score (product of donor age and preoperative Model for End-stage Liver Disease score). The most frequent indications for transplant were alcoholic cirrhosis (49.2%), hepatitis C cirrhosis (13%), and hepatocellular carcinoma (12.5%). The overall survival at 5 years was 64%, with a significant fall in survival for recipients with a D-MELD greater than 1500 (57%; P = .045). In the 70-year-old-plus donor group, the overall patient survival was 58.4%. The retransplant rate increased proportionately with donor age. In the alcoholic cirrhosis recipient subgroup, the overall survival at 5 years was 67.6% (P < .05) compared with 33.5% in patients with hepatitis C. Use of elderly donors is a safe strategy to reduce the scarcity of donors, provided that a D-MELD score below 1500 is obtained. Retransplant rates increase progressively with donor age. It is necessary to carefully screen recipients of older organs, taking into account that the best results are obtained for alcoholic cirrhosis, negative viral load hepatitis C, and a D-MELD score below 1500. Copyright © 2016 Elsevier Inc. All rights reserved.
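The D-MELD score used in this study is, as stated, simply the product of donor age and the recipient's preoperative MELD score; a one-line illustration with assumed values:

```python
# D-MELD = donor age x preoperative MELD (assumed example values; study cut-off 1500)
donor_age = 72
recipient_meld = 19
d_meld = donor_age * recipient_meld
print(d_meld, "(below 1500)" if d_meld < 1500 else "(1500 or above)")
```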
Hachem, Laureen D; Mothe, Andrea J; Tator, Charles H
2016-08-15
Traumatic spinal cord injury (SCI) leads to a cascade of secondary chemical insults, including oxidative stress and glutamate excitotoxicity, which damage host neurons and glia. Transplantation of exogenous neural stem/progenitor cells (NSPCs) has shown promise in enhancing regeneration after SCI, although survival of transplanted cells remains poor. Understanding the response of NSPCs to the chemical mediators of secondary injury is essential in finding therapies to enhance survival. We examined the in vitro effects of glutamate and glutamate receptor agonists on adult rat spinal cord-derived NSPCs. NSPCs isolated from the periventricular region of the adult rat spinal cord were exposed to various concentrations of glutamate for 96 h. We found that glutamate treatment (500 μM) for 96 h significantly increased live cell numbers, reduced cell death, and increased proliferation, but did not significantly alter cell phenotype. Concurrent glutamate treatment (500 μM) in the setting of H2O2 exposure (500 μM) for 10 h increased NSPC survival compared to H2O2 exposure alone. The effects of glutamate on NSPCs were blocked by the α-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid (AMPA)/kainate receptor antagonist GYKI-52466, but not by the N-methyl-D-aspartic acid receptor antagonist MK-801 or DL-AP5, or the mGluR3 antagonist LY-341495. Furthermore, treatment of NSPCs with AMPA, kainic acid, or the kainate receptor-specific agonist (RS)-2-amino-3-(3-hydroxy-5-tert-butylisoxazol-4-yl)propanoic acid mimicked the responses seen with glutamate both alone and in the setting of oxidative stress. These findings offer important insights into potential mechanisms to enhance NSPC survival and implicate a potential role for glutamate in promoting NSPC survival and proliferation after traumatic SCI.
Zaytseva, Yekaterina Y; Harris, Jennifer W; Mitov, Mihail I; Kim, Ji Tae; Butterfield, D Allan; Lee, Eun Y; Weiss, Heidi L; Gao, Tianyan; Evers, B Mark
2015-08-07
Fatty acid synthase (FASN), a lipogenic enzyme, is upregulated in colorectal cancer (CRC). Increased de novo lipid synthesis is thought to be a metabolic adaptation of cancer cells that promotes survival and metastasis; however, the mechanisms for this phenomenon are not fully understood. We show that FASN plays a role in regulation of energy homeostasis by enhancing cellular respiration in CRC. We demonstrate that endogenously synthesized lipids fuel fatty acid oxidation, particularly during metabolic stress, and maintain energy homeostasis. Increased FASN expression is associated with a decrease in activation of energy-sensing pathways and accumulation of lipid droplets in CRC cells and orthotopic CRCs. Immunohistochemical evaluation demonstrated increased expression of FASN and p62, a marker of autophagy inhibition, in primary CRCs and liver metastases compared to matched normal colonic mucosa. Our findings indicate that overexpression of FASN plays a crucial role in maintaining energy homeostasis in CRC via increased oxidation of endogenously synthesized lipids. Importantly, activation of fatty acid oxidation and consequent downregulation of stress-response signaling pathways may be key adaptation mechanisms that mediate the effects of FASN on cancer cell survival and metastasis, providing a strong rationale for targeting this pathway in advanced CRC.
Pérez-Torrado, Roberto; Llopis, Silvia; Perrone, Benedetta; Gómez-Pastor, Rocío; Hube, Bernhard; Querol, Amparo
2015-01-01
In recent years, the number of human infection cases produced by the food related species Saccharomyces cerevisiae has increased. Whereas many strains of this species are considered safe, other 'opportunistic' strains show a high degree of potential virulence attributes and can cause infections in immunocompromised patients. Here we studied the genetic characteristics of selected opportunistic strains isolated from dietary supplements and also from patients by array comparative genomic hybridization. Our results show increased copy numbers of IMD genes in opportunistic strains, which are implicated in the de novo biosynthesis of the purine nucleotides pathway. The importance of this pathway for virulence of S. cerevisiae was confirmed by infections in immunodeficient murine models using a GUA1 mutant, a key gene of this pathway. We show that exogenous guanine, an end product of this pathway in its triphosphorylated form, increases the survival of yeast strains in ex vivo blood infections. Finally, we show the importance of the DNA damage response that activates dNTP biosynthesis in yeast cells during ex vivo blood infections. We conclude that opportunistic yeasts may use an enhanced de novo biosynthesis of the purine nucleotides pathway to increase survival and favor infections in the host.
Alout, Haoues; Dabiré, Roch K; Djogbénou, Luc S; Abate, Luc; Corbel, Vincent; Chandre, Fabrice; Cohuet, Anna
2016-07-19
Insecticide resistance raises concerns for the control of vector-borne diseases. However, its impact on parasite transmission could be diverse when considering the ecological interactions between vector and parasite. We therefore investigated the fitness cost associated with insecticide resistance and Plasmodium falciparum infection, as well as their interactive cost, on Anopheles gambiae survival and fecundity. In the absence of infection, we observed a cost on fecundity associated with insecticide resistance. However, survival was higher for mosquitoes bearing the kdr mutation and equal for those with the ace-1(R) mutation compared to their insecticide-susceptible counterparts. Interestingly, Plasmodium infection reduced survival only in the insecticide-resistant strains but not in the susceptible one, and infection was associated with an increase in fecundity independently of the strain considered. This study provides evidence for a survival cost associated with infection by Plasmodium parasites only in mosquitoes selected for insecticide resistance. This suggests that the selection of insecticide resistance mutations may have disturbed the interaction between parasites and vectors, resulting in an increased cost of infection. Considering this fitness cost, as well as other ecological aspects of this natural mosquito-parasite combination, is important to predict the epidemiological impact of insecticide resistance.
The Atypical Chemokine Receptor ACKR2 is Protective Against Sepsis.
Castanheira, Fernanda V E Silva; Borges, Vanessa; Sônego, Fabiane; Kanashiro, Alexandre; Donate, Paula B; Melo, Paulo H; Pallas, Kenneth; Russo, Remo C; Amaral, Flávio A; Teixeira, Mauro M; Ramalho, Fernando S; Cunha, Thiago M; Liew, Foo Y; Alves-Filho, José C; Graham, Gerard J; Cunha, Fernando Q
2018-06-01
Sepsis is a systemic inflammatory response resulting from uncontrolled infection. Neutrophils are the first cells to reach the primary sites of infection, and chemokines play a key role in recruiting neutrophils. However, in sepsis chemokines could also contribute to neutrophil infiltration of vital organs, leading to multiple organ failure. ACKR2 is an atypical chemokine receptor that can remove and degrade inflammatory CC chemokines. The role of ACKR2 in sepsis is unknown. Using a model of cecal ligation and puncture (CLP), we demonstrate here that ACKR2-deficient mice exhibited a significant reduction in the survival rate compared with similarly treated wild-type (WT) mice. However, neutrophil migration to the peritoneal cavity and bacterial load were similar between WT and ACKR2-deficient mice during CLP. In contrast, ACKR2-deficient mice showed increased neutrophil infiltration and elevated CC chemokine levels in the lung, kidney, and heart compared with the WT mice. In addition, ACKR2-deficient mice also showed more severe lesions in the lung and kidney than those in the WT mice. Consistent with these results, WT mice under nonsevere sepsis (90% survival) had higher expression of ACKR2 in these organs than mice under severe sepsis (no survival). Finally, the lungs from septic patients showed an increased number of ACKR2-expressing cells compared with those of nonseptic patients. Our data indicate that ACKR2 may have a protective role during sepsis, and that the absence of ACKR2 leads to exacerbated chemokine accumulation, neutrophil infiltration, and damage to vital organs.
Florescu, Diana F; Kalil, Andre C; Qiu, Fang; Grant, Wendy; Morris, Michael C; Schmidt, Cynthia M; Florescu, Marius C; Poole, Jill A
2014-11-01
Severe hypogammaglobulinemia (IgG < 400 mg/dL) has an adverse impact on mortality during the first year post-transplantation. The aim of the study was to determine whether increasing IgG levels to ≥400 mg/dL improved outcomes. Kaplan-Meier analyses were performed to estimate survival, the log-rank test to compare survival distributions between groups, and Fisher's exact test to determine the association between hypogammaglobulinemia and rejection or graft loss. Thirty-seven solid organ transplant (SOT) recipients were included. Hypogammaglobulinemia was diagnosed at a median of 5.6 months (range: 0-291.8 months) post-transplantation. Types of transplants: liver-small bowel (17); liver-small bowel-kidney (2); liver (5); small bowel (4); liver-kidney (1); kidney/kidney-pancreas (3); heart (3); heart-kidney (1); and heart-lung (1). The three-yr survival after the diagnosis of hypogammaglobulinemia was 49.5% (95% CI: 32.2-64.6%). Patients were dichotomized based upon IgG level at last follow-up: IgG ≥ 400 mg/dL (23 patients) and IgG < 400 mg/dL (14 patients). There was no evidence of a difference in survival (p = 0.44), rejection rate (p = 0.44), or graft loss censored for death (p = 0.99) at one yr between these two groups. There was no difference in survival between patients who did and did not receive immunoglobulin (p = 0.99) or cytomegalovirus hyperimmunoglobulin (p = 0.14). Severe hypogammaglobulinemia after SOT is associated with high mortality rates, but increasing IgG levels to ≥400 mg/dL did not seem to translate into better patient or graft survival in this cohort. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
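For the association test mentioned here (Fisher's exact test for IgG recovery versus rejection or graft loss), a minimal sketch with invented counts, not the cohort's data:

```python
# Invented 2x2 counts, not the cohort's data
from scipy.stats import fisher_exact

#                 rejection  no rejection
table = [[5, 18],          # IgG >= 400 mg/dL at last follow-up (23 patients)
         [4, 10]]          # IgG <  400 mg/dL at last follow-up (14 patients)
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2f}")
```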
Friedrich, Kilian; Smit, Mark; Wannhoff, Andreas; Rupp, Christian; Scholl, Sabine G; Antoni, Christoph; Dollinger, Matthias; Neumann-Haefelin, Christoph; Stremmel, Wolfgang; Weiss, Karl Heinz; Schemmer, Peter; Gotthardt, Daniel Nils
2016-08-01
Therapeutic options to treat progression of end-stage liver disease (ESLD) or improve long-term survival after liver transplantation remain scarce. We investigated the impact of coffee consumption under these conditions. We recorded the coffee consumption habits of 379 patients with ESLD awaiting liver transplantation and 260 patients after liver transplantation. Survival was analyzed based on coffee intake. One hundred ninety-five patients with ESLD consumed coffee on a daily basis, while 184 patients did not. Actuarial survival was impaired (P = 0.041) in non-coffee drinkers (40.4 ± 4.3 months, 95% confidence interval [CI]: 32.0-48.9) compared with coffee drinkers (54.9 ± 5.5 months, 95% CI: 44.0-65.7). In subgroup analysis, the survival of patients with alcoholic liver disease (ALD; P = 0.020) and primary sclerosing cholangitis (PSC; P = 0.017) was increased with coffee intake, while it was unaffected in patients with chronic viral hepatitis (P = 0.517) or other liver disease entities (P = 0.652). Multivariate analysis showed that coffee consumption in PSC and ALD patients was retained as an independent factor (odds ratio [OR]: 1.94; 95% CI: 1.15-3.28; P = 0.013) along with MELD score (OR: 1.13; 95% CI: 1.09-1.17; P < 0.001). Following liver transplantation, long-term survival was longer in coffee drinkers (61.8 ± 2.0 months, 95% CI: 57.9-65.8) than in non-drinkers (52.3 ± 3.5 months, 95% CI: 45.4-59.3; P = 0.001). Coffee consumption delayed disease progression in ALD and PSC patients with ESLD and increased long-term survival after liver transplantation. We conclude that regular coffee intake might be recommended for these patients. © 2016 Journal of Gastroenterology and Hepatology Foundation and John Wiley & Sons Australia, Ltd.
Liu, Mei-Mei; Liu, Li; Chen, Liang; Yin, Xiao-Jing; Liu, Hui; Zhang, Yan-Hua; Li, Pei-Ling; Wang, Shan; Li, Xiao-Xiao; Yu, Cai-Hong
2017-04-16
BACKGROUND The aim of this study was to investigate the effects of sleep duration and bedtime on sperm health, and the possible mechanism involved. MATERIAL AND METHODS We randomly divided 981 healthy Chinese men into groups according to research-set bedtimes (A=8-10 PM, B=after 10 PM, and C=after midnight) and sleep durations: group 1=<6.0 h (short), group 2=7.0-8.0 h (average), and group 3=>9.0 h (long). Sperm morphology, count, survival, and motility were examined according to sleep patterns. Antisperm antibody (ASA) production in semen was determined. RESULTS Sperm counts and their survival rates were lower in the short sleepers as compared to others within each group (all P<0.01). Lower counts and survival rates were also observed across bedtimes, with significant differences found between measurements of C1 vs. A1 and C2 vs. A2 or B2 (all P<0.05 or 0.01). Semen motility was lower in the short sleepers as compared to the average and long sleepers (all P<0.01). There were differences in the bedtime-related results between measurements of C1 vs. A1 or B1 (P<0.05 or 0.01). Additionally, the proportion of ASA-positive participants and the incidence of ASA expression increased markedly in the short sleepers as compared to others within each group (all P<0.05). CONCLUSIONS Short and long sleep durations and late bedtime were associated with impaired sperm health in the study cohort, partly through increasing ASA production in the semen.
Ophthalmology's future in the next decade: a historical and comparative perspective.
Day, S H
1999-01-01
To gain a historical and comparative perspective about the future of ophthalmology within the profession of medicine. A literature search is made of disciplines other than medicine (history, sociology, philosophy, economics, and ethics) in order to assess factors responsible for survival and healthiness of a profession. The "learned" professions (medicine, law, and theology) are assessed. Other "professional" careers valued by society (sports and classical music) are reviewed. From the perspective of other disciplines, the future of ophthalmology is seen as vulnerable and fragile. Survival of professions, be they classically or economically defined, is linked to societal needs, a profession's unique commitment and ability to provide services to society, and the profession's maintenance of knowledge as well as skill-based services. Historical evidence has shown erosion of a profession's power consequent to capitalist influences, government influences, access of skills by less trained individuals, and elitist posturing by a profession. Comparative evidence has shown societal acceptance of an escalation of salaries for designated superstars, increasing roles and influence of managerial personnel, and trivialization of values other than economic ones. Attention to historical and comparative trends by individual ophthalmologists as well as associations representing ophthalmologists is mandatory if ophthalmology as we know it is to survive within the profession of medicine.
Hua, L.; Cohen, T. S.; Shi, Y.; Datta, V.; Hilliard, J. J.; Tkaczyk, C.; Suzich, J.; Stover, C. K.
2015-01-01
Immunocompromised individuals are at increased risk of Staphylococcus aureus pneumonia. Neutralization of alpha-toxin (AT) with the monoclonal antibody (MAb) MEDI4893* protects normal mice from S. aureus pneumonia; however, the effects of the MAb in immunocompromised mice have not been reported. In this study, passive immunization with MEDI4893* increased survival rates and reduced bacterial numbers in the lungs in an immunocompromised murine S. aureus pneumonia model. Lungs from infected mice exhibited alveolar epithelial damage, protein leakage, and bacterial overgrowth, whereas lungs from mice passively immunized with MEDI4893* retained a healthy architecture, with an intact epithelial barrier. Adjunctive therapy or prophylaxis with a subtherapeutic MEDI4893* dose combined with subtherapeutic doses of vancomycin or linezolid improved survival rates, compared with the monotherapies. Furthermore, coadministration of MEDI4893* with vancomycin or linezolid extended the antibiotic treatment window. These data suggest that MAb-mediated neutralization of AT holds promise in strategies for prevention and adjunctive therapy among immunocompromised patients.
Effect of semolina-jaggery diet on survival and development of Drosophila melanogaster.
Chattopadhyay, Debarati; James, Joel; Roy, Debasish; Sen, Soumadeep; Chatterjee, Rishita; Thirumurugan, Kavitha
2015-01-01
Drosophila melanogaster is an ideal model organism for developmental studies. This study tests the potential of a semolina-jaggery (SJ) diet as a new formulation for bulk rearing of flies. Semolina and jaggery are organic products obtained from wheat endosperm and cane sugar, respectively. Semolina is a rich source of carbohydrates and protein. Jaggery has a high content of dietary sugars. Moreover, preparation of the semolina-jaggery diet is cost-effective and easy. Thus, the current study aimed to compare survival and developmental parameters of flies fed the SJ diet to flies fed the standard cornmeal-sugar-yeast (CSY) diet. The SJ diet enhanced survival of flies without affecting fecundity; male flies showed increased resistance to starvation. A higher number of flies emerged at the F2 and F3 generations when fed the SJ diet than when fed the control CSY diet. The SJ diet did not increase fly body weight or lipid percentage. Therefore, the SJ diet can be used for bulk rearing of healthy flies on par with the standard cornmeal-sugar-yeast diet.
Lethal effect of UV and gamma irradiation on some species of Dematiaceae (in Russian)
Zhdanova, N.N.; Gavryushina, A.I.; Bondar, A.I.
1972-01-01
A comparative study was conducted of the response of four species of Dematiaceae, and of a mutant with lowered melanin content, to gamma and UV rays. Under UV irradiation, the survival rate of all studied species was characterized by a complex exponential curve with a large, sharply pronounced resistant region. It is suggested that the sharp fall in survival rate during the first minutes of UV irradiation is conditioned by the specificity of the protective effect of the melanin pigment, which needs time for transition into the active state. Species resistant to gamma irradiation had sigmoid survival curves, and sensitive species had exponential ones. Increased resistance to gamma rays was accompanied by an increase in the concentration of paramagnetic particles, determined by electron paramagnetic resonance. Analysis of the data obtained suggests that the protective effect of fungal melanin differs under gamma and UV irradiation.
Ghosh, Debolina; LeVault, Kelsey R; Brewer, Gregory J
2014-01-01
To determine whether glutathione (GSH) loss or increased reactive oxygen species (ROS) are more important to neuron loss, aging, and Alzheimer's disease (AD), we stressed or boosted GSH levels in neurons isolated from aging 3xTg-AD mice compared with those from age-matched nontransgenic (non-Tg) mice. Here, by titrating with buthionine sulfoximine, an inhibitor of γ-glutamyl cysteine synthetase (GCL), we observed that GSH depletion increased neuronal death of cultured 3xTg-AD neurons at increasing rates across the age span, whereas non-Tg neurons were resistant to GSH depletion until old age. Remarkably, the rate of neuron loss with ROS did not increase in old age and was the same for both genotypes, which indicates that cognitive deficits in the AD model were not caused by ROS. Therefore, for neuroprotection we targeted activation of the redox-sensitive transcription factor nuclear erythroid-related factor 2 (Nrf2) with 18 alpha glycyrrhetinic acid to stimulate GSH synthesis through GCL. This balanced stimulation of a number of redox enzymes restored the lower levels of Nrf2 and GCL seen in 3xTg-AD neurons compared with those of non-Tg neurons and promoted translocation of Nrf2 to the nucleus. By combining the Nrf2 activator with the NADH precursor nicotinamide, we increased neuron survival against amyloid beta stress in an additive manner. These stress tests and neuroprotective treatments suggest that the redox environment is more important for neuron survival than ROS. The dual neuroprotective treatment with nicotinamide and an Nrf2 inducer indicates that these age-related and AD-related changes are reversible. Copyright © 2014 Elsevier Inc. All rights reserved.
Harris, Jeremy P.; Murphy, James D.; Hanlon, Alexandra L.
2014-03-15
Purpose: Concerns have been raised about the potential for worse treatment outcomes because of dosimetric inaccuracies related to tumor motion and increased toxicity caused by the spread of low-dose radiation to normal tissues in patients with locally advanced non-small cell lung cancer (NSCLC) treated with intensity modulated radiation therapy (IMRT). We therefore performed a population-based comparative effectiveness analysis of IMRT, conventional 3-dimensional conformal radiation therapy (3D-CRT), and 2-dimensional radiation therapy (2D-RT) in stage III NSCLC. Methods and Materials: We used the Surveillance, Epidemiology, and End Results (SEER)-Medicare database to identify a cohort of patients diagnosed with stage III NSCLC from 2002 to 2009 treated with IMRT, 3D-CRT, or 2D-RT. Using Cox regression and propensity score matching, we compared survival and toxicities of these treatments. Results: The proportion of patients treated with IMRT increased from 2% in 2002 to 25% in 2009, and the use of 2D-RT decreased from 32% to 3%. In univariate analysis, IMRT was associated with improved overall survival (OS) (hazard ratio [HR] 0.90, P=.02) and cancer-specific survival (CSS) (HR 0.89, P=.02). After controlling for confounders, IMRT was associated with similar OS (HR 0.94, P=.23) and CSS (HR 0.94, P=.28) compared with 3D-CRT. Both techniques had superior OS compared with 2D-RT. IMRT was associated with similar toxicity risks on multivariate analysis compared with 3D-CRT. Propensity score matched model results were similar to those from adjusted models. Conclusions: In this population-based analysis, IMRT for stage III NSCLC was associated with similar OS and CSS and maintained similar toxicity risks compared with 3D-CRT.
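The propensity-score matching used in comparative-effectiveness designs like this one can be sketched as: estimate each patient's probability of receiving IMRT from baseline covariates with logistic regression, then match IMRT to 3D-CRT patients on that score. A minimal illustration with simulated data and assumed variable names, not the SEER-Medicare cohort or the authors' exact procedure:

```python
# Minimal sketch: propensity score for receiving IMRT, then greedy 1:1 matching.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 500
df = pd.DataFrame({
    "imrt":        rng.integers(0, 2, n),      # 1 = IMRT, 0 = 3D-CRT (hypothetical label)
    "age":         rng.normal(72, 6, n),
    "stage_iiib":  rng.integers(0, 2, n),
    "comorbidity": rng.poisson(1.5, n),
})

covariates = ["age", "stage_iiib", "comorbidity"]
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["imrt"])
df["pscore"] = ps_model.predict_proba(df[covariates])[:, 1]

treated = df[df["imrt"] == 1]
controls = df[df["imrt"] == 0].copy()
pairs = []
for idx, row in treated.iterrows():            # greedy nearest-neighbour, no replacement
    if controls.empty:
        break
    j = (controls["pscore"] - row["pscore"]).abs().idxmin()
    pairs.append((idx, j))
    controls = controls.drop(index=j)
print(f"{len(pairs)} matched IMRT/3D-CRT pairs")
```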
Bemanian, Amin; Beyer, Kirsten M M
2017-04-01
Background: The Black-to-White disparity in breast cancer survival is increasing, and racial residential segregation is a potential driver for this trend. However, study findings have been mixed, and no study has comprehensively compared the effectiveness of different local-level segregation metrics in explaining cancer survival. Methods: We proposed a set of new local segregation metrics named local exposure and isolation (LEx/Is) and compared our new local isolation metric with two related metrics, the location quotient (LQ) and the index of concentration at extremes (ICE), across the 102 largest U.S. metropolitan areas. Then, using case data from the Milwaukee, WI, metropolitan area, we used proportional hazards models to explore associations between segregation and breast cancer survival. Results: Across the 102 metropolitan areas, the new local isolation metric was less skewed than the LQ or ICE. Across all races, Hispanic isolation was associated with poorer all-cause survival, and Hispanic LQ and Hispanic-White ICE were found to be associated with poorer survival for both breast cancer-specific and all-cause mortality. For Black patients, Black LQ was associated with lower all-cause mortality and Black local isolation was associated with reduced all-cause and breast cancer-specific mortality. ICE was found to suffer from high multicollinearity. Conclusions: Local segregation is associated with breast cancer survival, but associations varied based on patient race and metric employed. Impact: We highlight how selection of a segregation measure can alter study findings. These relationships need to be validated in other geographic areas. Cancer Epidemiol Biomarkers Prev; 26(4); 516-24. ©2017 American Association for Cancer Research.
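Two of the comparison measures named here have simple closed forms: the location quotient (a tract's share of a group relative to the metro-wide share) and the index of concentration at the extremes (the gap between two groups as a fraction of tract population). The authors' new LEx/Is metric is not reproduced here; the sketch below only illustrates the standard LQ and ICE, with invented tract counts:

```python
# Standard LQ and ICE formulas on invented tract counts (not the paper's LEx/Is metric)
import numpy as np

black = np.array([120,  40, 300,  15])   # Black residents per tract
white = np.array([400, 900, 150, 700])   # White residents per tract
total = np.array([600, 980, 480, 730])   # all residents per tract

# Location quotient: a tract's share of a group relative to the metro-wide share
lq_black = (black / total) / (black.sum() / total.sum())

# Index of concentration at the extremes (Black-White version), ranging -1 to +1
ice = (white - black) / total

print("Black LQ by tract:", np.round(lq_black, 2))
print("ICE by tract:     ", np.round(ice, 2))
```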
Barr, James Geoffrey; Grundy, Paul L
2012-12-01
The prognosis of high-grade glioma (HGG) is poor with a median survival of about 1 year for glioblastoma. In 2007, NICE published a technology appraisal (TA121) recommending the use of carmustine wafers (Gliadel) and systemic therapy with temozolomide for selected patients with HGG. Outcomes for HGG surgery in the United Kingdom with these combined treatments have not been published. Retrospective audit of consecutive patients in a single unit with carmustine wafer implantation. Fifty-nine patients had carmustine wafers implanted at primary surgery, between October 2005 and October 2010 at Wessex Neurological Centre, Southampton, UK. Patients were given chemotherapeutic treatments strictly according to NICE TA121. Survival was calculated using Kaplan-Meier method. Fifty-five patients had WHO grade IV tumours and four had grade III. Median age was 61 years. At follow-up, 39 patients had died. Median survival was 15.3 months. Eight patients (13.5%) experienced post-operative complications (including five infections) for which four had the carmustine wafers removed. Forty-seven (80%) patients were treated with radical radiotherapy (55-60 Gy) and six (10%) patients received palliative radiotherapy (30 Gy). Thirty-seven patients (63%) received concomitant temozolomide chemotherapy. In the subset of 37 patients receiving multimodal treatment with radical radiotherapy and concomitant temozolomide, median survival was 15.8 months compared with 7.4 months in those not receiving multimodal treatment. Carmustine wafers for primary HGG surgery in accordance with the NICE TA121 were associated with a median survival of 15.3 months; this is improved compared with previously reported randomised trials. Multimodal treatment with carmustine wafers, radical radiotherapy and concomitant temozolomide was associated with improved survival. Increased incidence of infections was observed in cases receiving carmustine wafers.
Toriihara, Akira; Ohtake, Makoto; Tateishi, Kensuke; Hino-Shishikura, Ayako; Yoneyama, Tomohiro; Kitazume, Yoshio; Inoue, Tomio; Kawahara, Nobutaka; Tateishi, Ukihide
2018-05-01
The potential of positron emission tomography/computed tomography using 62Cu-diacetyl-bis(N4-methylthiosemicarbazone) (62Cu-ATSM PET/CT), which was originally developed as a hypoxia tracer, to predict therapeutic resistance and prognosis has been reported in various cancers. Our purpose was to investigate the prognostic value of 62Cu-ATSM PET/CT in patients with glioma, compared to PET/CT using 2-deoxy-2-[18F]fluoro-D-glucose (18F-FDG). 56 patients with glioma of World Health Organization grade 2-4 were enrolled. All participants had undergone both 62Cu-ATSM PET/CT and 18F-FDG PET/CT within a mean of 33.5 days prior to treatment. The maximum standardized uptake value and tumor/background ratio were calculated within areas of increased radiotracer uptake. Prognostic significance for progression-free survival and overall survival was assessed by the log-rank test and Cox's proportional hazards model. Disease progression and death were confirmed in 37 and 27 patients during follow-up, respectively. In univariate analysis, there were significant differences in both progression-free survival and overall survival by age, tumor grade, history of chemoradiotherapy, and the maximum standardized uptake value and tumor/background ratio calculated using 62Cu-ATSM PET/CT. Multivariate analysis revealed that the maximum standardized uptake value calculated using 62Cu-ATSM PET/CT was an independent predictor of both progression-free survival and overall survival (p < 0.05). In a subgroup analysis including patients with grade 4 glioma, only the maximum standardized uptake value calculated using 62Cu-ATSM PET/CT showed a significant difference in progression-free survival (p < 0.05). 62Cu-ATSM PET/CT is a more promising imaging method to predict the prognosis of patients with glioma compared to 18F-FDG PET/CT.
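The two uptake metrics used here can be written compactly: the standardized uptake value (SUV) normalizes tissue radiotracer concentration to injected dose per unit body weight (assuming unit tissue density), and the tumor/background ratio divides the tumor's maximum SUV by a reference-region value. A small illustration with assumed numbers, not patient data:

```python
# Illustrative arithmetic only (assumed values, not patient data)
tumor_activity_kbq_per_ml = 25.0          # decay-corrected uptake in the lesion
injected_dose_kbq = 600.0 * 1000.0        # 600 MBq injected
body_weight_g = 70000.0                   # 70 kg, assuming 1 g/mL tissue density

suv_max = tumor_activity_kbq_per_ml / (injected_dose_kbq / body_weight_g)
background_suv = 1.4                      # assumed reference-region value
print(f"SUVmax = {suv_max:.2f}, tumor/background ratio = {suv_max / background_suv:.2f}")
```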
Martin, David E; De Almeida, Jose Flavio A; Henry, Michael A; Khaing, Zin Z; Schmidt, Christine E; Teixeira, Fabricio B; Diogenes, Anibal
2014-01-01
Intracanal disinfection is a crucial step in regenerative endodontic procedures. Most published cases suggest the use of sodium hypochlorite (NaOCl) as the primary irrigant. However, the effect of clinically used concentrations of NaOCl on the survival and differentiation of stem cells is largely unknown. In this study, we tested the effect of various concentrations of NaOCl on the survival of stem cells of the apical papilla (SCAPs) and on dentin sialophosphoprotein (DSPP) expression. Standardized root canals were created in extracted human teeth and irrigated with NaOCl (0.5%, 1.5%, 3%, or 6%) followed by 17% EDTA or sterile saline. SCAPs in a hyaluronic acid-based scaffold were seeded into the canals and cultured for 7 days. Next, viable cells were quantified using a luminescence assay, and DSPP expression was evaluated using quantitative real-time polymerase chain reaction. There was a significant reduction in survival and DSPP expression in the group treated with 6% NaOCl compared with the untreated control group. Comparable survival was observed in the groups treated with the lower concentrations of NaOCl, but greater DSPP expression was observed in the 1.5% NaOCl group. In addition, 17% EDTA resulted in increased survival and DSPP expression, partially reversing the deleterious effects of NaOCl. Collectively, the results suggest that dentin conditioning with high concentrations of NaOCl has a profound negative effect on the survival and differentiation of SCAPs. However, this effect can be prevented with the use of 1.5% NaOCl followed by 17% EDTA. The inclusion of this irrigation regimen might be beneficial in regenerative endodontic procedures. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
Defining Long-term Outcomes with Living Donor Liver Transplantation in North America
Olthoff, Kim M.; Smith, Abigail R.; Abecassis, Michael; Baker, Talia; Emond, Jean C.; Berg, Carl L.; Beil, Charlotte A.; Burton, James R.; Fisher, Robert A.; Freise, Christopher E.; Gillespie, Brenda W.; Grant, David R.; Humar, Abhi; Kam, Igal; Merion, Robert M.; Pomfret, Elizabeth A.; Samstein, Benjamin; Shaked, Abraham
2015-01-01
Objective: To compare long-term survival of living donor liver transplant (LDLT) at experienced transplant centers with outcomes of deceased donor liver transplant (DDLT) and identify key variables impacting patient and graft survival. Summary Background Data: The Adult-to-Adult Living Donor Liver Transplantation Cohort Study (A2ALL) is a prospective multicenter NIH study comparing outcomes of LDLT and DDLT and associated risks. Methods: Mortality and graft failure for 1427 liver recipients (963 LDLT) enrolled in A2ALL who were transplanted between 1/1/1998 and 1/31/2014 at 12 North American centers, with median follow-up 6.7 years, were analyzed using Kaplan-Meier and multivariable Cox models. Results: Survival probability at 10 years was 70% for LDLT and 64% for DDLT. Unadjusted survival was higher with LDLT (HR=0.76, p=0.02) but attenuated after adjustment (HR=0.98, p=0.90), as LDLT recipients had lower mean MELD (15.5 vs 20.4) and fewer were transplanted from the ICU, inpatient, on dialysis, ventilated, or with ascites. Post-transplant ICU days were fewer for LDLT. For all recipients, female gender and primary sclerosing cholangitis were associated with improved survival, while dialysis and older recipient/donor age were associated with worse survival. Higher MELD score was associated with increased graft failure. Era of transplantation and type of donated lobe did not impact survival in LDLT. Conclusions: LDLT provides significant long-term transplant benefit, resulting in transplantation at a lower MELD score, decreased death on the waitlist, and excellent post-transplant outcomes. Recipient diagnosis, disease severity, renal failure, and ages of recipient and donor should be considered in decision-making regarding timing of transplant and donor options. Clinical Trials ID NCT00096733.
Disparate outcomes in patients with colorectal cancer: effect of race on long-term survival.
Wudel, L James; Chapman, William C; Shyr, Yu; Davidson, Mark; Jeyakumar, Anita; Rogers, Selwyn O; Allos, Tara; Stain, Steven C
2002-05-01
Increasing evidence suggests significant disparity in colorectal cancer outcomes between black and white patients. Contributing factors may include advanced tumor stage at diagnosis, differences in treatment, more aggressive tumor biology, access to care, and patient comorbidity. Disparities in colorectal cancer outcomes exist despite similar objective measures of treatment. Ten-year retrospective review of all patients with colorectal cancer using tumor registries at a city hospital (n = 83) and a university medical center (n = 585) in the same city. We assessed stage at diagnosis; curative surgical resection; use of adjuvant treatment; overall, disease-free, and stage-specific survival; and socioeconomic status. Patients with nonwhite, nonblack ethnicity (4% overall) were excluded. Differences in stage and treatments were compared using the chi-square test, and median survival rates were compared using log-rank tests. Significantly more black patients were treated at the city hospital (53.0%) vs the university medical center (10.6%) (P<.001). No differences were identified in stage distribution or treatments received between hospitals or between black and white patients. Significantly worse survival was noted among patients treated at the city hospital (2.1 vs 5.3 years; P<.001) and among black patients treated at both institutions (city hospital: 1.4 vs 2.1 years, and university hospital: 3.2 vs 5.7 years; P<.001 for both). Disease-free survival rates showed similar significant reductions for black patients at both institutions. There was no association between survival and socioeconomic status at either institution. The marked reductions in overall and disease-free survival for black patients with colorectal cancer do not seem to be related to variation in treatment but may be due to biologic factors or non-cancer-related health conditions.
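For the chi-square comparisons of stage distribution and treatment described here, a minimal sketch with invented contingency-table counts, not the registry data:

```python
# Invented stage-at-diagnosis counts, not the registry data
from scipy.stats import chi2_contingency

#        stage I  II   III  IV
table = [[10,     18,  25,  30],    # patients treated at the city hospital
         [70,    120, 160, 180]]    # patients treated at the university center
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```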
Simpson, Daniel R; Martínez, María Elena; Gupta, Samir; Hattangadi-Gluth, Jona; Mell, Loren K; Heestand, Gregory; Fanta, Paul; Ramamoorthy, Sonia; Le, Quynh-Thu; Murphy, James D
2013-12-04
Black patients with metastatic colorectal cancer have inferior survival compared to white patients. The purpose of this study was to examine disparity in specialist consultation and multimodality treatment and the impact that treatment inequality has on survival. We identified 9935 non-Hispanic white and 1281 black patients with stage IV colorectal cancer aged 66 years and older from the Surveillance, Epidemiology, and End Results (SEER)-Medicare linked database. Logistic regression models identified race-based differences in consultation rates and subsequent treatment with surgery, chemotherapy, or radiation. Multivariable Cox regression models identified potential factors that explain race-based survival differences. All statistical tests were two-sided. Black patients had lower rates of consultation with surgery, medical oncology, and radiation oncology. Among patients seen in consultation, black patients received less surgery directed at the primary tumor, liver- or lung-directed surgery, chemotherapy, and radiotherapy. Unadjusted survival analysis found a 15% higher chance of dying for black patients compared with white patients (hazard ratio [HR] = 1.15; 95% confidence interval (CI) = 1.08 to 1.22; P < .001). Adjustment for patient, tumor, and demographic variables marginally reduced the risk of death (HR = 1.08; 95% CI = 1.01 to 1.15; P = .03). After adjustment for differences in treatment, the increased risk of death for black patients disappeared. Our study shows racial disparity in specialist consultation as well as subsequent treatment with multimodality therapy for metastatic colorectal cancer, and it suggests that inferior survival for black patients may stem from this treatment disparity. Further research into the underlying causes of this inequality will improve access to treatment and survival in metastatic colorectal cancer.
Mazimba, S; Holland, E; Nagarajan, V; Mihalek, AD; Kennedy, JLW; Bilchick, KC
2017-01-01
Background: The 'obesity paradox' refers to the observation that obese patients have better outcomes than normal-weight patients. This has been observed in multiple cardiovascular conditions, but evidence for an obesity paradox in pulmonary hypertension (PH) remains sparse. Methods: We categorized 267 patients from the National Institutes of Health PH registry into five groups based on body mass index (BMI): underweight, normal weight, overweight, obese, and morbidly obese. Mortality was compared across BMI groups using the chi-square statistic. Five-year probability of death using the PH connection (PHC) risk equation was calculated, and the model was compared with BMI groups using Cox proportional hazards regression and Kaplan-Meier (KM) survival curves. Results: Patients had a median age of 39 years (interquartile range 30–50 years), a median BMI of 23.4 kg m−2 (21.0–26.8 kg m−2), and an overall mortality at 5 years of 50.2%. We found a U-shaped relationship between BMI and 1-year mortality, with the best 1-year survival in overweight patients. KM curves showed the best survival in the overweight, followed by obese and morbidly obese patients, and the worst survival in normal weight and underweight patients (log-rank P = 0.0008). In a Cox proportional hazards analysis, increasing BMI was a highly significant predictor of improved survival even after adjustment for the PHC risk equation, with a hazard ratio for death of 0.921 per kg m−2 (95% confidence interval: 0.886–0.954) (P < 0.0001). Conclusion: We observed that the best survival was in the overweight patients, making this more of an 'overweight paradox' than an 'obesity paradox'. This has implications for risk stratification and prognosis in group 1 PH patients.
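A per-unit hazard ratio such as the 0.921 per kg m−2 reported here compounds multiplicatively across larger BMI differences; a short illustration of the arithmetic (values from the abstract, the loop is purely illustrative):

```python
# A hazard ratio of 0.921 per kg/m^2 compounds multiplicatively over larger BMI gaps
hr_per_unit = 0.921
for delta_bmi in (1, 5, 10):
    print(f"+{delta_bmi:2d} kg/m^2 -> HR {hr_per_unit ** delta_bmi:.2f}")
```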
Impact of transplant nephrectomy on peak PRA levels and outcome after kidney re-transplantation
Tittelbach-Helmrich, Dietlind; Pisarski, Przemyslaw; Offermann, Gerd; Geyer, Marcel; Thomusch, Oliver; Hopt, Ulrich Theodor; Drognitz, Oliver
2014-01-01
AIM: To determine the impact of transplant nephrectomy on peak panel reactive antibody (PRA) levels and on patient and graft survival in kidney re-transplants. METHODS: From 1969 to 2006, a total of 609 kidney re-transplantations were performed at the University of Freiburg and the Campus Benjamin Franklin of the University of Berlin. Patients with PRA levels above 5% before the first kidney transplantation were excluded from further analysis (n = 304). Patients with graft nephrectomy (n = 245, NE+) were retrospectively compared to 60 kidney re-transplants without prior graft nephrectomy (NE-). RESULTS: Peak PRA levels between the first and the second transplantation were higher in patients undergoing graft nephrectomy (P = 0.098), whereas the last PRA levels before the second kidney transplantation did not differ between the groups. Age-adjusted survival for the second kidney graft, censored for death with functioning graft, was comparable in both groups. Waiting time between the first and second transplantation did not significantly influence graft survival in the group that underwent nephrectomy. In contrast, patients without nephrectomy experienced better graft survival rates when re-transplantation was performed within one year after graft loss (P = 0.033). Age-adjusted patient survival rates at 1 and 5 years were 94.1% and 86.3% vs 83.1% and 75.4% in groups NE+ and NE-, respectively (P < 0.01). CONCLUSION: Transplant nephrectomy leads to a temporary increase in PRA levels that normalizes before kidney re-transplantation. In patients without nephrectomy of a non-viable kidney graft, the timing of re-transplantation significantly influences graft survival after a second transplantation. Most importantly, transplant nephrectomy is associated with significantly longer patient survival.
Yu, Yalian; Wang, Hongbo; Yan, Aihui; Wang, Hailong; Li, Xinyao; Liu, Jiangtao; Li, Wei
2018-04-04
Recent studies have reported a relationship between prognosis and the neutrophil-to-lymphocyte ratio (NLR) in patients with head and neck cancer (HNC). Because the results remain controversial, we conducted a meta-analysis of pretreatment peripheral-blood NLR and prognosis in HNC patients. We retrieved articles from PubMed, Medline, Cochrane Library, Embase and Web of Science. A comparative analysis was conducted for the effect of pretreatment peripheral-blood NLR on overall survival (OS), progression-free survival, disease-free survival (DFS), disease-specific survival, metastasis-free survival, and recurrence-free survival of HNC patients. The analysis applied the criteria for systematic reviews described in the Cochrane Handbook, used hazard ratios (HRs) to estimate effect size, and was performed with Stata/SE version 13.0. The meta-analysis included eligible cohort studies (5475 cases). The OS data indicated increased mortality risk in HNC patients with a high NLR (HR = 1.84, 95% confidence interval (CI): 1.53-2.23; P < 0.001; heterogeneity, I2 = 37.2%, P = 0.074). Analysis of subgroups stratified by NLR cutoff values revealed increased mortality risk and significantly shorter DFS in patients with high NLR compared to those with low NLR (HR = 2.18, 95% CI: 1.46-3.24; P < 0.001). Patients with high NLR had a higher probability of tumor recurrence after treatment than those with low NLR (HR = 1.63, 95% CI: 1.09-2.45; P = 0.017; heterogeneity, I2 = 68.7%; P = 0.022). The probability of distant metastasis following treatment was greater in patients with high than with low NLR (HR = 1.92, 95% CI: 1.36-2.72; P < 0.001; heterogeneity, I2 = 0.0%; P = 0.614). Sensitivity analysis showed the pooled results to be stable, and no publication bias was detected by funnel plots or the Egger test (P = 0.135). HNC patients with an elevated pretreatment peripheral-blood NLR have a poor prognosis and are prone to local invasion and distant metastasis. NLR values are easily obtained from routinely collected blood samples and could assist clinicians in determining the prognosis of HNC patients.
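The pooled hazard ratios above come from inverse-variance meta-analysis. As a minimal sketch of how such pooling works, assuming only per-study HRs with 95% CIs as input (the numbers below are made-up placeholders, not the studies in this meta-analysis), here is the standard DerSimonian-Laird random-effects estimator:

```python
# Sketch of inverse-variance pooling of hazard ratios with a
# DerSimonian-Laird random-effects model. Input HRs/CIs are placeholders.
import numpy as np

# (HR, lower 95% CI, upper 95% CI) per study -- illustrative values only.
studies = [(1.9, 1.3, 2.8), (1.5, 1.0, 2.2), (2.4, 1.6, 3.6), (1.2, 0.8, 1.8)]

y = np.log([hr for hr, lo, hi in studies])              # log hazard ratios
se = np.array([(np.log(hi) - np.log(lo)) / (2 * 1.96)   # SE recovered from CI width
               for hr, lo, hi in studies])
v = se ** 2
w = 1.0 / v                                             # fixed-effect weights

# Cochran's Q and the DerSimonian-Laird between-study variance tau^2.
y_fixed = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - y_fixed) ** 2)
k = len(studies)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

# Random-effects pooled estimate and 95% CI, reported on the HR scale.
w_re = 1.0 / (v + tau2)
y_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
print("Pooled HR = %.2f (95%% CI %.2f-%.2f), I2 = %.0f%%"
      % (np.exp(y_re), np.exp(y_re - 1.96 * se_re),
         np.exp(y_re + 1.96 * se_re), max(0.0, 100 * (Q - (k - 1)) / Q)))
```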
AJUBA increases the cisplatin resistance through hippo pathway in cervical cancer.
Bi, Lihong; Ma, Feng; Tian, Rui; Zhou, Yanli; Lan, Weiguang; Song, Quanmao; Cheng, Xiankui
2018-02-20
Although the LIM-domain protein AJUBA was identified as a putative oncogene, the function and underlying mechanisms of AJUBA in cervical cancer remain largely unknown. First, AJUBA expression was measured by real-time quantitative PCR in patient samples. HeLa and SiHa cells were then transfected with AJUBA-overexpressing plasmids and exposed to cisplatin, and apoptosis was measured by cytometry assay. In addition, the expression of YAP and TAZ was assessed by western blot. Our results revealed that AJUBA expression was significantly higher in cervical cancer patients resistant to cisplatin treatment than in those sensitive to cisplatin treatment. In addition, Kaplan-Meier analysis showed that overall survival was significantly shorter in cervical cancer patients with high AJUBA expression than in those with low AJUBA expression. HeLa and SiHa cells transfected with AJUBA-expressing plasmids and exposed to cisplatin had a higher survival rate than cells transfected with the empty vector control. Mechanistic studies revealed that AJUBA upregulated the downstream targets YAP and TAZ. These results suggest that a high AJUBA level enhances the resistance of cervical cancer cells to cisplatin and is associated with decreased patient survival. Copyright © 2017 Elsevier B.V. All rights reserved.
Survival and reproduction of some nematodes as affected by muck and organic acids.
Elmiligy, I A; Norton, D C
1973-01-01
Fulvic, humic, acetic, n-butyric, formic, lactic, and propionic acids were inhibitory to the survival or reproduction of Aphelenchus avenae, Aphelenchoides goodeyi, Helicotylenchus pseudorobustus, Meloidogyne hapla or Xiphinema americanum. Reproduction of H. pseudorobustus and M. hapla increased significantly with increasing amounts of muck added to sand, and with the initial amount of nematode inoculum. All acids except humic and fulvic were lethal, in vitro, to all nematode species tested. When A. goodeyi was treated with fulvic acid, reproduction was reduced significantly compared with sodium humate or water treatments. Treatment of H. pseudorobustus with fulvic acid (pH 3.5) resulted in a greater reduction in reproduction in soil than did treatment with humic acid (pH 3.5).
Genetic diversity affects colony survivorship in commercial honey bee colonies
NASA Astrophysics Data System (ADS)
Tarpy, David R.; vanEngelsdorp, Dennis; Pettis, Jeffrey S.
2013-08-01
Honey bee (Apis mellifera) queens mate with unusually high numbers of males (average of approximately 12 drones), although there is much variation among queens. One main consequence of such extreme polyandry is an increased diversity of worker genotypes within a colony, which has been shown empirically to confer significant adaptive advantages that result in higher colony productivity and survival. Moreover, honey bees are the primary insect pollinators used in modern commercial production agriculture, and their populations have been in decline worldwide. Here, we compare the mating frequencies of queens, and therefore, intracolony genetic diversity, in three commercial beekeeping operations to determine how they correlate with various measures of colony health and productivity, particularly the likelihood of queen supersedure and colony survival in functional, intensively managed beehives. We found the average effective paternity frequency (me) of this population of honey bee queens to be 13.6 ± 6.76, which was not significantly different between colonies that superseded their queen and those that did not. However, colonies that were less genetically diverse (headed by queens with me ≤ 7.0) were 2.86 times more likely to die by the end of the study when compared to colonies that were more genetically diverse (headed by queens with me > 7.0). The stark contrast in colony survival based on increased genetic diversity suggests that there are important tangible benefits of increased queen mating number in managed honey bees, although the exact mechanism(s) that govern these benefits have not been fully elucidated.
Xu, Xiao-Tao; Tao, Ze-Zhang; Song, Qi-Bin; Yao, Yi; Ruan, Peng
2012-09-01
To investigate the effects of RNA interference against decoy receptor 3 (DcR3) on the sensitivity of gastric cancer cells to 5-fluorouracil (5-FU) and the relevant mechanisms, siRNA against DcR3 was transfected into the gastric cancer cell line AGS. AGS cells were treated with different doses of 5-FU or for different time periods, and the sensitivity of AGS cells to 5-FU was determined. The cell survival rate was measured by MTT assay, the apoptotic rate was determined by DAPI staining, and the expression of related proteins was detected by western blot analysis. The results showed that the cell survival rate was significantly lower in the knockdown group than in the control group at each dose of 5-FU (P<0.01). Likewise, after different durations of treatment with 5-FU, the cell survival rate in the knockdown group was significantly lower than in the control group (P<0.01). The apoptotic rate of AGS cells in the knockdown group increased with increasing doses of siRNA. The siRNA against DcR3 enhanced the expression of Fas, FasL, caspase-3 and caspase-8. In conclusion, knockdown of DcR3 by RNA interference enhances apoptosis and inhibits the growth of gastric cancer cells. Downregulation of DcR3 enhances the sensitivity of gastric cancer cells to 5-FU and increases the expression of Fas, FasL and caspase-3/8.
Aranzana, Elisa Maria de Camargo; Coppini, Adriana Zuolo; Ribeiro, Maurício Alves; Massarollo, Paulo Celso Bosco; Szutan, Luiz Arnaldo; Ferreira, Fabio Gonçalves
2015-06-01
The number of liver transplantations has not kept pace with the number of patients requiring this treatment, increasing deaths among those on the waiting list. Models predicting post-transplantation survival have been created, including the Model for Liver Transplantation Survival and the Donor Risk Index. Our aim was to compare the performance of the Model for End-Stage Liver Disease, the Model for Liver Transplantation Survival and the Donor Risk Index as prognostic models for survival after liver transplantation. We retrospectively analyzed the data from 1,270 patients who received a liver transplant from a deceased donor in the state of São Paulo, Brazil, between July 2006 and July 2009. All data obtained from the Health Department of the State of São Paulo at the 15 registered transplant centers were analyzed. Patients younger than 13 years of age or with acute liver failure were excluded. The majority of the recipients had Child-Pugh class B or C cirrhosis (63.5%). Among the 1,006 patients included, 274 (27%) died. Univariate survival analysis using a Cox proportional hazards model showed hazard ratios of 1.02 and 1.43 for the Model for End-Stage Liver Disease and the Model for Liver Transplantation Survival, respectively (p<0.001). The areas under the ROC curve for the Donor Risk Index were always less than 0.5, whereas those for the Model for End-Stage Liver Disease and the Model for Liver Transplantation Survival were significantly greater than 0.5 (p<0.001). The cutoff values for the Model for End-Stage Liver Disease (≥29.5; sensitivity: 39.1%; specificity: 75.4%) and the Model for Liver Transplantation Survival (≥1.9; sensitivity: 63.9%; specificity: 54.5%), which were calculated using data available before liver transplantation, were good predictors of survival after liver transplantation (p<0.001). The Model for Liver Transplantation Survival displayed death-prediction performance similar to that of the Model for End-Stage Liver Disease. A simpler model involving fewer variables, such as the Model for End-Stage Liver Disease, is preferred over a complex model involving more variables, such as the Model for Liver Transplantation Survival. The Donor Risk Index had no significance for post-transplantation survival in our patients.
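Comparing prognostic scores by area under the ROC curve, as done here, can be sketched with scikit-learn. The following is a minimal illustration; the file and column names (`meld`, `molt_s`, `dri`, `died_3yr`) are hypothetical stand-ins for pre-transplant scores and a binary outcome, not the study's actual dataset.

```python
# Sketch: compare prognostic scores by ROC AUC and pick a cutoff.
# File and column names are illustrative placeholders.
import pandas as pd
from sklearn.metrics import roc_auc_score, roc_curve

df = pd.read_csv("transplant_cohort.csv")  # hypothetical: meld, molt_s, dri, died_3yr
y = df["died_3yr"]

for score in ["meld", "molt_s", "dri"]:
    auc = roc_auc_score(y, df[score])
    print(f"{score}: AUC = {auc:.3f}")  # AUC at or below 0.5 means no discrimination

# Choose a cutoff for one score by maximizing Youden's J = sensitivity + specificity - 1.
fpr, tpr, thresholds = roc_curve(y, df["meld"])
j = tpr - fpr
best = j.argmax()
print(f"MELD cutoff ~ {thresholds[best]:.1f}: "
      f"sensitivity {tpr[best]:.1%}, specificity {1 - fpr[best]:.1%}")
```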
Raspagliesi, Francesco; Maltese, Giuseppa; Bogani, Giorgio; Fucà, Giovanni; Lepori, Stefano; De Iaco, Pierandrea; Perrone, Myriam; Scambia, Giovanni; Cormio, Gennaro; Bogliolo, Stefano; Bergamini, Alice; Bifulco, Giuseppe; Casali, Paolo Giovanni; Lorusso, Domenica
2017-01-01
To investigate the impact of morcellation on survival outcomes of patients affected by undiagnosed uterine sarcoma. This is a retrospective study performed in 8 referral centers of the MITO group. Data from women undergoing morcellation for apparently benign uterine myomas who were ultimately diagnosed with stage I uterine sarcoma on final pathology were compared with data from women who did not undergo morcellation. Uterine sarcomas included leiomyosarcomas (LMS), smooth muscle tumors of uncertain malignant potential (STUMP), low-grade endometrial stromal sarcomas (LG-ESS) and undifferentiated uterine sarcomas (UUS). Two-year survival outcomes were evaluated using Kaplan-Meier and Cox models. Overall, 125 patients were identified: 31 (24.8%), 21 (16.8%) and 73 (58.4%) patients had power morcellation during laparoscopy, non-power morcellation during open surgery, and no morcellation during open procedures, respectively. Among patients affected by LMS, morcellation did not correlate with disease-free survival. However, patients undergoing either morcellation or power morcellation experienced a 3-fold increased risk of death compared with patients who did not undergo morcellation (p=0.02). A trend towards increased recurrence was observed for patients undergoing morcellation for STUMP (HR 7.7, p=0.09), whereas no differences in survival outcomes were observed for patients with LG-ESS and UUS. Our data suggest that morcellation increases the risk of death in patients affected by undiagnosed LMS. Further prospective studies are warranted to assess the risk-to-benefit ratio of power morcellator use in patients with apparently benign uterine myomas. Copyright © 2016 Elsevier Inc. All rights reserved.
Wu, Li-Rong; Zhu, Huan-Feng; Xu, Jianhua; Jiang, Xue-Song; Yin, Li; Jiang, Ning; Zong, Dan; Wang, Fei-Jiang; Huang, Sheng-Fu; Bian, Xiu-Hua; Wu, Jian-Feng; Song, Dan; Guo, Wen-Jie; Liu, Ju-Ying; He, Xia
2018-01-01
Background: This study aimed to compare concurrent chemoradiotherapy (CCRT) plus cetuximab (C) with CCRT alone in locoregionally advanced nasopharyngeal carcinoma (NPC). Methods: A total of 682 locoregionally advanced NPC patients who had undergone chemoradiotherapy with or without cetuximab were included. A propensity score-matching method was used to match patients. Progression-free survival (PFS), overall survival (OS), locoregional relapse-free survival (LRFS), and distant metastasis-free survival (DMFS) were compared between the two treatment arms. Results: After matching, 225 patients were identified for the analysis. Compared with CCRT alone, CCRT plus C was associated with significantly improved 3-year PFS (83.7% vs 71.9%, P = 0.036) and LRFS (98.6% vs 90.2%, P = 0.034), but not OS (91.4% vs 85.4%, P = 0.117). Among patients with T4 and/or N3 category, CCRT plus C significantly prolonged 3-year PFS (81.0% vs 61.4%, P = 0.022) and increased 3-year OS (88.0% vs 77.9%, P = 0.086). No significant differences were observed between the CCRT plus C and CCRT alone groups with regard to 3-year PFS, OS, LRFS and DMFS rates in stage III patients. Acute oral and oropharyngeal mucositis during radiotherapy was more common with CCRT plus C than with CCRT alone, but late toxicities were comparable. Conclusions: This study indicates that patients with locoregionally advanced NPC could benefit from the addition of cetuximab to CCRT, and that this therapeutic gain originated mainly from the T4 and/or N3 subgroup, albeit at the cost of more acute moderate-to-severe toxicities.
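Propensity score matching of the kind described above is commonly implemented as a logistic regression on baseline covariates followed by nearest-neighbor matching on the estimated scores. The sketch below illustrates one such approach; the covariate names (`age`, `t_stage`, `n_stage`), the `cetuximab` treatment indicator and the 0.2-SD caliper on the logit scale are illustrative conventions, not the method reported in this study.

```python
# Sketch: 1:1 nearest-neighbor propensity score matching with a caliper.
# Covariates, treatment column and the caliper rule are illustrative choices.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("npc_cohort.csv")  # hypothetical: age, t_stage, n_stage, cetuximab
covariates = ["age", "t_stage", "n_stage"]

# Estimate propensity scores (probability of receiving cetuximab).
model = LogisticRegression(max_iter=1000).fit(df[covariates], df["cetuximab"])
df["ps"] = model.predict_proba(df[covariates])[:, 1]
df["logit_ps"] = np.log(df["ps"] / (1 - df["ps"]))

treated = df[df["cetuximab"] == 1].copy()
control = df[df["cetuximab"] == 0].copy()
caliper = 0.2 * df["logit_ps"].std()  # common convention: 0.2 SD of the logit

# Greedy 1:1 matching without replacement within the caliper.
matches = []
available = control.index.to_list()
for i, row in treated.iterrows():
    if not available:
        break
    dist = (control.loc[available, "logit_ps"] - row["logit_ps"]).abs()
    j = dist.idxmin()
    if dist[j] <= caliper:
        matches.append((i, j))
        available.remove(j)

matched = pd.concat([treated.loc[[i for i, _ in matches]],
                     control.loc[[j for _, j in matches]]])
print(f"{len(matches)} matched pairs; matched cohort size = {len(matched)}")
```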
Chang, Elizabeth T; Parekh, Palak R; Yang, Qingyuan; Nguyen, Duc M; Carrier, France
2016-03-01
The heterogeneous ribonucleoprotein A18 (hnRNP A18) promotes tumor growth by coordinating the translation of selected transcripts associated with proliferation and survival. hnRNP A18 binds to and stabilizes the transcripts of pro-survival genes harboring its RNA signature motif in their 3'UTRs. hnRNP A18 binds to ATR, RPA, TRX, HIF-1α and several protein translation factor mRNAs on polysomes and increases de novo protein translation under cellular stress. Most importantly, downregulation of hnRNP A18 decreases proliferation, invasion and migration, in addition to significantly reducing tumor growth in two mouse xenograft models, melanoma and breast cancer. Moreover, tissue microarrays performed on human melanoma, prostate, breast and colon cancer indicate that hnRNP A18 is overexpressed in 40% to 60% of these malignant tissues compared with normal adjacent tissue. Immunohistochemistry data indicate that hnRNP A18 is overexpressed in the stroma and hypoxic areas of human tumors. These data thus indicate that hnRNP A18 can promote tumor growth in in vivo models by coordinating the translation of pro-survival transcripts to support the demands of proliferating cells and increase survival under cellular stress. hnRNP A18 therefore represents a new target for selectively inhibiting protein translation in tumor cells.
Survival of Asian Females With Advanced Lung Cancer in the Era of Tyrosine Kinase Inhibitor Therapy.
Becker, Daniel J; Wisnivesky, Juan P; Grossbard, Michael L; Chachoua, Abraham; Camidge, D Ross; Levy, Benjamin P
2017-01-01
We examined the effect of access to epidermal growth factor receptor (EGFR) tyrosine kinase inhibitor (TKI) therapy on survival for Asian female (AF) EGFR mutation-enriched patients with advanced lung adenocarcinoma. We used the Surveillance Epidemiology and End Results database to study patients with stage IV lung adenocarcinoma diagnosed from 1998 to 2012. We compared survival (lung cancer-specific survival [LCSS] and overall survival) between AFs and non-Asian males (NAMs), an EGFR mutation-enriched and EGFR mutation-unenriched population, respectively, with a diagnosis in the pre-EGFR TKI (1998-2004) and EGFR TKI (2005-2012) eras. We used Cox proportional hazards models to examine the interaction of access to TKI treatment and EGFR enrichment status. Among 3029 AF and 35,352 NAM patients, we found that LCSS was best for AFs with a diagnosis in the TKI era (median, 14 months), followed by AFs with a diagnosis in the pre-TKI era (median, 8 months), NAMs with a diagnosis in the TKI era (median, 5 months), and NAMs with a diagnosis in the pre-TKI era (median, 4 months; log-rank P < .0001). In a multivariable model, the effect of a diagnosis in the TKI era on survival was greater for AFs than for NAMs (LCSS, P = .0020; overall survival, P = .0007). A lung cancer diagnosis in the TKI era was associated with an overall mortality decrease of 26% for AFs (hazard ratio, 0.740; 95% confidence interval, 0.682-0.80) and 15.9% for NAMs (hazard ratio, 0.841; 95% confidence interval, 0.822-0.860). We found increased survival for lung adenocarcinoma diagnoses made after widespread access to EGFR TKIs, with the greatest increase among AF patients enriched for EGFR mutations. The present analysis eliminated the effect of crossover, which has complicated assessments of the survival advantage in EGFR TKI randomized trials. Published by Elsevier Inc.
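The key analytic step in the study above is a Cox model with an interaction between treatment era and the EGFR-enriched group, which tests whether the TKI-era survival gain differs between groups. Here is a minimal sketch with lifelines, using hypothetical column names (`months`, `died`, `tki_era`, `asian_female`) rather than the SEER variables:

```python
# Sketch: Cox model testing whether the TKI-era effect differs by group,
# via an era-by-group interaction term. Column names are placeholders.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("seer_adeno.csv")  # hypothetical: months, died, tki_era, asian_female

# 0/1 indicators: tki_era (diagnosis 2005-2012), asian_female (AF vs NAM).
df["era_x_af"] = df["tki_era"] * df["asian_female"]

cph = CoxPHFitter()
cph.fit(df[["months", "died", "tki_era", "asian_female", "era_x_af"]],
        duration_col="months", event_col="died")
cph.print_summary()

# exp(coef) for tki_era is the era-associated hazard reduction in the reference
# group; a significant era_x_af coefficient indicates the reduction differs
# for Asian females relative to non-Asian males.
```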
Ornamentation, age, and survival of female striped plateau lizards, Sceloporus virgatus
NASA Astrophysics Data System (ADS)
Weiss, Stacey L.
2016-04-01
Individuals with greater expression of secondary sexual traits are often older and have higher survivorship than individuals with lower expression; if so, assessment of such indicator traits may provide genetic and/or direct benefits to potential mates. I examined the relationship between ornament expression, age, and survival in the striped plateau lizard, Sceloporus virgatus, a species with female-specific ornamentation that honestly signals reproductive quality. I followed a group of females from 2008 to 2013, examined ornament color and size as females aged, and compared ornamentation of survivors versus non-survivors. In addition, I explored whether other (non-ornamental) phenotypic characters predicted survival. I found that peak ornament expression (both color and size) of individual females changed year to year but appeared to be a weak signal of age due to high among-female variation in ornament expression that occurred independent of age and a non-linear pattern of change for ornament color. However, both absolute and relative ornament size did increase significantly as an individual aged and therefore may provide some age-related information such as reproductive investment, which is expected to increase as residual reproductive value declines with age. Individual survival was unrelated to peak ornament expression and to other phenotypic variables measured, providing no support for the ornament as a viability indicator and suggesting that individual survival prospects are affected by stochastic and environmental factors.
Rising temperatures may drive fishing-induced selection of low-performance phenotypes
Clark, Timothy D.; Messmer, Vanessa; Tobin, Andrew J.; Hoey, Andrew S.; Pratchett, Morgan S.
2017-01-01
Climate warming is likely to interact with other stressors to challenge the physiological capacities and survival of phenotypes within populations. This may be especially true for the billions of fishes per year that undergo vigorous exercise prior to escaping or being intentionally released from fishing gear. Using adult coral grouper (Plectropomus leopardus), an important fisheries species throughout the Indo-Pacific, we show that population-level survival following vigorous exercise is increasingly compromised as temperatures increase from current-day levels (100–67% survival at 24–30 °C) to those projected for the end of the century (42% survival at 33 °C). Intriguingly, we demonstrate that high-performance individuals take longer to recover to a resting metabolic state and subsequently have lower survival in warm water compared with conspecifics that exercise less vigorously. Moreover, we show that post-exercise mortality of high-performance phenotypes manifests after 3–13 d at the current summer maximum (30 °C), while mortality at 33 °C occurs within 1.8–14.9 h. We propose that wild populations in a warming climate may become skewed towards low-performance phenotypes with ramifications for predator-prey interactions and community dynamics. Our findings highlight the susceptibility of phenotypic diversity to fishing activities and demonstrate a mechanism that may contribute to fishing-induced evolution in the face of ongoing climate change. PMID:28094310
Finch, Colton G.; Pine, William E.; Yackulic, Charles B.; Dodrill, Michael J.; Yard, Michael D.; Gerig, Brandon S.; Coggins, Lewis G.; Korman, Josh
2016-01-01
The Colorado River below Glen Canyon Dam, Arizona, is part of an adaptive management programme which optimizes dam operations to improve various resources in the downstream ecosystem within Grand Canyon. Understanding how populations of federally endangered humpback chub Gila cypha respond to these dam operations is a high priority. Here, we test hypotheses concerning temporal variation in juvenile humpback chub apparent survival rates and abundance by comparing estimates between hydropeaking and steady discharge regimes over a 3-year period (July 2009–July 2012). The most supported model ignored flow type (steady vs hydropeaking) and estimated a declining trend in daily apparent survival rate across years (99.90%, 99.79% and 99.67% for 2009, 2010 and 2011, respectively). Corresponding abundance of juvenile humpback chub increased temporally; open population model estimates ranged from 615 to 2802 individuals/km, and closed model estimates ranged from 94 to 1515 individuals/km. These changes in apparent survival and abundance may reflect broader trends, or simply represent inter-annual variation. Important findings include (i) juvenile humpback chub are currently surviving and recruiting in the mainstem Colorado River with increasing abundance; (ii) apparent survival does not benefit from steady fall discharges from Glen Canyon Dam; and (iii) direct assessment of demographic parameters for juvenile endangered fish are possible and can rapidly inform management actions in regulated rivers.
Stockton, D.; Davies, T.; Day, N.; McCann, J.
1997-01-01
OBJECTIVES: To investigate the recent fall in mortality from breast cancer in England and Wales, and to determine the relative contributions of improvements in treatment and earlier detection of tumours. DESIGN: Retrospective study of all women with breast cancer registered by the East Anglian cancer registry and diagnosed between 1982 and 1989. SUBJECTS: 3965 patients diagnosed 1982-5 compared with 4665 patients diagnosed 1986-9, in three age groups 0-49, 50-64, > or = 65 years, with information on stage at diagnosis and survival. MAIN OUTCOME MEASURES: Three year relative survival rates by time period, age group, and stage; relative hazard ratios for each time period and age group derived from Cox's proportional hazards model, adjusted for single year of age and stage. RESULTS: Survival improved in the later time period, although there was little stage specific improvement. The proportion of early stage tumours increased especially in the 50-64 year age group, and adjustment for stage accounted for over half of the improvement in survival in women aged under 65 years. CONCLUSION: Over half of the drop in mortality in women aged under 65 years seems to be attributable to earlier detection of tumours, which has been observed since the mid-1980s. This could have resulted from an increase in breast awareness predating the start of the breast screening programme. PMID:9056796
2014-01-01
Background: Coeliac disease is a common enteropathy characterized by increased mortality, mainly due to its complications. The natural history of complicated coeliac disease follows two different types of course: patients with a new diagnosis of coeliac disease who do not improve despite a strict gluten-free diet (type A cases), and previously diagnosed coeliac patients who initially improved on a gluten-free diet but then relapsed despite a strict diet (type B cases). Our aim was to study the prognosis and survival of A and B cases. Methods: Clinical and laboratory data from coeliac patients who later developed complications (A and B cases) and sex- and age-matched coeliac patients who responded normally to a gluten-free diet (controls) were collected among 11 Italian centres. Results: 87 cases and 136 controls were enrolled. Complications tended to occur rapidly after the diagnosis of coeliac disease, and cumulative survival dropped in the first months after diagnosis of complicated coeliac disease. Thirty-seven cases died (30/59 in group A, 7/28 in group B). Type B cases presented an increased survival rate compared with type A cases. Conclusions: Complicated coeliac disease is an extremely serious condition with high mortality and short survival. Survival depends on the type of natural history. PMID:25103857
Perl, Jeffrey; Dong, James; Rose, Caren; Jassal, Sarbjit Vanita; Gill, John S.
2013-01-01
♦ Background: Kidney transplant failure (TF) is among the leading causes of dialysis initiation. Whether survival is similar for patients treated with peritoneal dialysis (PD) and with hemodialysis (HD) after TF is unclear and may inform decisions concerning dialysis modality selection. ♦ Methods: Between 1995 and 2007, 16 113 adult dialysis patients identified from the US Renal Data System initiated dialysis after TF. A multivariable Cox proportional hazards model was used to evaluate the impact of initial dialysis modality (1 865 PD, 14 248 HD) on early (1-year) and overall mortality in an intention-to-treat approach. ♦ Results: Compared with HD patients, PD patients were younger (46.1 years vs 49.4 years, p < 0.0001) with fewer comorbidities such as diabetes mellitus (23.1% vs 25.7%, p < 0.0001). After adjustment, survival among PD patients was greater within the first year after dialysis initiation [adjusted hazard ratio (AHR): 0.85; 95% confidence interval (CI): 0.74 to 0.97], but lower after 2 years (AHR: 1.15; 95% CI: 1.02 to 1.29). During the entire period of observation, survival in both groups was similar (AHR for PD compared with HD: 1.09; 95% CI: 1.0 to 1.20). In a sensitivity analysis restricted to a cohort of 1865 propensity-matched pairs of HD and PD patients, results were similar (AHR: 1.03; 95% CI: 0.93 to 1.14). Subgroups of patients with a body mass index exceeding 30 kg/m2 [AHR: 1.26; 95% CI: 1.05 to 1.52) and with a baseline estimated glomerular filtration rate (eGFR) less than 5 mL/min/1.73 m2 (AHR: 1.45; 95% CI: 1.05 to 1.98) experienced inferior overall survival when treated with PD. ♦ Conclusions: Compared with HD, PD is associated with an early survival advantage, inferior late survival, and similar overall survival in patients initiating dialysis after TF. Those data suggest that increased initial use of PD among patients returning to dialysis after TF may be associated with improved outcomes, except among patients with a higher BMI and those who initiate dialysis at lower levels of eGFR. The reasons behind the inferior late survival seen in PD patients are unclear and require further study. PMID:24084843
Islet grafting and imaging in a bioengineered intramuscular space
Witkowski, Piotr; Sondermeijer, Hugo; Hardy, Mark A.; Woodland, David C.; Lee, Keagan; Bhagat, Govind; Witkowski, Kajetan; See, Fiona; Rana, Abbas; Maffei, Antonella; Itescu, Silviu; Harris, Paul E.
2011-01-01
Background: Since the hepatic portal system may not be the optimal site for islet transplantation, several extrahepatic sites have been studied. Here we examine an intramuscular transplantation site, bioengineered to better support islet neovascularization, engraftment, and survival, and demonstrate that at this novel site, grafted beta cell mass may be quantitated in a real-time, non-invasive manner by PET imaging. Methods: Rats with streptozotocin-induced diabetes were pretreated intramuscularly with a biocompatible angiogenic scaffold and received syngeneic islet transplants 2 weeks later. The recipients were monitored serially by blood glucose and glucose tolerance measurements and by PET imaging of the transplant site with [11C]dihydrotetrabenazine. Parallel histopathologic evaluation of the grafts was performed using insulin staining and evaluation of microvascularity. Results: Reversal of hyperglycemia by islet transplantation was most successful in recipients pretreated with bioscaffolds containing angiogenic factors, as compared with those who received no bioscaffolds or bioscaffolds not treated with angiogenic factors. PET imaging with [11C]dihydrotetrabenazine, insulin staining and microvascular density patterns were consistent with islet survival, increased levels of angiogenesis, and reversal of hyperglycemia. Conclusions: Induction of increased neovascularization at an intramuscular site significantly improves islet transplant engraftment and survival compared with controls. The use of a non-hepatic transplant site may avoid intrahepatic complications and permit the use of PET imaging to measure and follow transplanted beta-cell mass in real time. These findings have important implications for effective islet implantation outside of the liver and offer promising possibilities for improving islet survival, monitoring, and even prevention of islet loss. PMID:19898201