Surgical therapy of canine nasal tumors: A retrospective study (1982-1986)
Laing, Elizabeth J.; Binnington, Allen G.
1988-01-01
The results of surgical therapy in 15 dogs with histologically confirmed nasal tumors were analyzed retrospectively and compared to previous reports. Median survival time for all dogs was seven months. When adjusted for nontumor-related deaths, median survival increased to nine months. These values are two to three times longer than previous reports. To determine possible prognostic indicators, tumor stage, location, and histological type were compared to survival time. Dogs with unilateral nasal tumors had a median survival of 11 months, as compared to three months for dogs with bilateral tumors (p = 0.005). Tumor stage and histological type were not significant factors in comparing survival times. PMID:17423139
Conditional survival of all primary brain tumor patients by age, behavior, and histology.
Porter, Kimberly R; McCarthy, Bridget J; Berbaum, Michael L; Davis, Faith G
2011-01-01
Survival statistics commonly reflect survival from the time of diagnosis but do not take into account survival already achieved after a diagnosis. The objective of this study was to provide conditional survival estimates for brain tumor patients as a more accurate measure of survival for those who have already survived for a specified amount of time after diagnosis. Data on primary malignant and nonmalignant brain tumor cases diagnosed from 1985-2005 from selected SEER state cancer registries were obtained. Relative survival up to 15 years postdiagnosis and varying relative conditional survival rates were computed using the life-table method. The overall 1-year relative survival estimate derived from time of diagnosis was 67.8% compared to the 6-month relative conditional survival rate of 85.7% for 6-month survivors (the probability of surviving to 1 year given survival to 6 months). The 10-year overall relative survival rate was 49.5% from time of diagnosis compared to the 8-year relative conditional survival rate of 79.2% for 2-year survivors. Conditional survival estimates and standard survival estimates varied by histology, behavior, and age at diagnosis. The 5-year relative survival estimate derived from time of diagnosis for glioblastoma was 3.6% compared to the 3-year relative conditional survival rate of 36.4% for 2-year survivors. For most nonmalignant tumors, the difference between relative survival and the corresponding conditional survival estimates was minimal. Older age groups had greater numeric gains in survival but lower conditional survival estimates than other age groups. Similar findings were seen for other conditional survival intervals. Conditional survival is a useful disease surveillance measure for clinicians and brain tumor survivors to provide them with better 'real-time' estimates and hope. Copyright © 2011 S. Karger AG, Basel.
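The conditional survival quantity described above is simply a ratio of survival probabilities, S(t | s) = S(t)/S(s) for s ≤ t. A minimal sketch in Python; the 6-month survival of ~79.1% is back-derived here for illustration (chosen so the ratio reproduces the paper's reported 85.7% conditional estimate), not quoted from the paper:

```python
def conditional_survival(s_t, s_s):
    """P(survive to t | survived to s) = S(t) / S(s), for s <= t."""
    if not 0 < s_s <= 1 or not 0 <= s_t <= s_s:
        raise ValueError("need 0 < S(s) <= 1 and 0 <= S(t) <= S(s)")
    return s_t / s_s

# The paper reports 1-year relative survival of 67.8% and a 6-month
# conditional estimate of 85.7%; an assumed 6-month survival of 0.791
# reproduces that ratio.
print(round(conditional_survival(0.678, 0.791), 3))  # -> 0.857
```

The same ratio explains the larger gaps for aggressive histologies: when S(s) is small relative to S(t) at diagnosis, surviving to s removes much of the early hazard, so the conditional estimate rises sharply.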
Analyzing survival curves at a fixed point in time for paired and clustered right-censored data
Su, Pei-Fang; Chi, Yunchan; Lee, Chun-Yi; Shyr, Yu; Liao, Yi-De
2018-01-01
In clinical trials, information about certain time points may be of interest in making decisions about treatment effectiveness. Rather than comparing entire survival curves, researchers can focus on the comparison at fixed time points that may have a clinical utility for patients. For two independent samples of right-censored data, Klein et al. (2007) compared survival probabilities at a fixed time point by studying a number of tests based on some transformations of the Kaplan-Meier estimators of the survival function. However, to compare the survival probabilities at a fixed time point for paired right-censored data or clustered right-censored data, their approach would need to be modified. In this paper, we extend the statistics to accommodate the possible within-paired correlation and within-clustered correlation, respectively. We use simulation studies to present comparative results. Finally, we illustrate the implementation of these methods using two real data sets. PMID:29456280
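Fixed-time-point comparisons of the kind described rest on evaluating the Kaplan-Meier estimate of S(t) at the chosen time. A self-contained sketch of the estimator and its step-function evaluation, on invented data; the paired/clustered variance corrections the paper develops are not shown:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate: list of (event time, S(t)) steps.
    times: observed follow-up times; events: 1 = event, 0 = censored."""
    data = sorted(zip(times, events))
    s, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)   # events at time t
        n = sum(1 for tt, _ in data if tt >= t)   # number at risk at t
        if d:
            s *= 1 - d / n
            curve.append((t, s))
        while i < len(data) and data[i][0] == t:  # skip ties at t
            i += 1
    return curve

def survival_at(curve, t0):
    """Evaluate the right-continuous step function S at time t0."""
    s = 1.0
    for t, st in curve:
        if t <= t0:
            s = st
    return s

curve = kaplan_meier([1, 2, 3, 4, 5], [1, 0, 1, 1, 0])
print(survival_at(curve, 3))  # 0.8 * (2/3) = 0.5333...
```

A two-sample test at a fixed time then compares `survival_at` values (often after a log or complementary log-log transformation) against a variance estimate such as Greenwood's formula, which is where the within-pair and within-cluster correlation adjustments enter.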
Spada, Eva; Perego, Roberta; Sgamma, Elena Assunta; Proverbio, Daniela
2018-02-01
Feline immunodeficiency virus (FIV) and feline leukemia virus (FeLV) are among the most important feline infectious diseases worldwide. This retrospective study investigated survival times and effects of selected predictor factors on survival time in a population of owned pet cats in Northern Italy testing positive for the presence of FIV antibodies and FeLV antigen. One hundred and three retrovirus-seropositive cats (53 FIV-seropositive, 40 FeLV-seropositive, and 10 FIV+FeLV-seropositive) were included in the study. A population of 103 retrovirus-seronegative age- and sex-matched cats was selected. Survival time was calculated and compared between retrovirus-seronegative, FIV, FeLV and FIV+FeLV-seropositive cats using Kaplan-Meier survival analysis. Cox proportional-hazards regression analysis was used to study the effect of selected predictor factors (male gender, peripheral blood cytopenia as reduced red blood cell - RBC - count, leukopenia, neutropenia and lymphopenia, hypercreatininemia and reduced albumin to globulin ratio) on survival time in retrovirus-seropositive populations. Median survival times for seronegative, FIV, FeLV and FIV+FeLV-seropositive cats were 3960, 2040, 714 and 77 days, respectively. Compared to retrovirus-seronegative cats, median survival time was significantly lower (P<0.001) in FeLV and FIV+FeLV-seropositive cats. Median survival time in FeLV and FIV+FeLV-seropositive cats was also significantly lower (P<0.001) when compared to FIV-seropositive cats. The hazard ratio of death in FeLV and FIV+FeLV-seropositive cats was 3.4 and 7.4 times higher, respectively, in comparison to seronegative cats, and 2.3 and 4.8 times higher as compared to FIV-seropositive cats.
A Cox proportional-hazards regression analysis showed that FIV and FeLV-seropositive cats with reduced RBC counts at time of diagnosis of seropositivity had significantly shorter survival times when compared to FIV and FeLV-seropositive cats with normal RBC counts at diagnosis. In summary, FIV-seropositive status did not significantly affect longevity of cats in this study, unlike FeLV and FIV+FeLV-seropositivity. Reduced RBC counts at time of FIV and FeLV diagnosis could impact negatively on the longevity of seropositive cats and therefore blood counts should always be evaluated at diagnosis and follow-up of retrovirus-seropositive cats. Copyright © 2017 Elsevier B.V. All rights reserved.
Castillo, V.; Pessina, P.; Hall, P.; Blatter, M.F. Cabrera; Miceli, D.; Arias, E. Soler; Vidal, P.
2016-01-01
The objective of the present study was to compare the effects of isotretinoin 9-cis (RA9-cis) as a post-surgery treatment of thyroid carcinoma to a traditional treatment (doxorubicin) and no treatment. Dogs whose owners declined treatment were placed into the control group A (GA; n=10). The remaining dogs were randomly assigned either to group B (GB; n=12), which received doxorubicin at a dose of 30 mg/m2 every three weeks for six complete cycles, or to group C (GC; n=15), which was treated with RA9-cis at a dose of 2 mg/kg/day for 6 months. The time to recurrence was significantly shorter in GA and GB compared to GC (P=0.0007 and P=0.0015, respectively), while we did not detect differences between GA and GB. The hazard ratios of recurrence for GA and GB relative to GC were 7.25 and 5.60, respectively. We did not detect any differences between the other groups. The risk ratio of recurrence was 2.0 times higher in GA compared to GC and 2.1 times higher in GB compared to GC. The type of carcinoma had an effect on survival time, with follicular carcinomas having a longer mean survival time than follicular-compact carcinomas (P<0.0001), and follicular-compact carcinomas having a longer mean survival time than compact carcinomas. The interaction between treatment and type was significant, but survival time in follicular carcinomas did not differ between treatments. In follicular-compact carcinomas the survival time of GC was greater than that of GB (P<0.05), but we did not detect a difference between GA and GB. In conclusion, this study shows that the use of surgery in combination with RA9-cis treatment significantly increases survival rate and decreases the time to tumor recurrence when compared to doxorubicin-treated or untreated dogs. The histological type of carcinoma interacted with treatment for time to recurrence and survival time, with more undifferentiated carcinomas having a worse prognosis than differentiated carcinomas. PMID:26862515
Technique-associated outcomes in horses following large colon resection.
Pezzanite, Lynn M; Hackett, Eileen S
2017-11-01
To compare survival and complications in horses undergoing large colon resection with either sutured end-to-end or stapled functional end-to-end anastomoses. Retrospective cohort study. Twenty-six client-owned horses with gastrointestinal disease. Retrospective data were retrieved from the medical records of 26 horses undergoing colectomy, including 14 horses with sutured end-to-end and 12 horses with stapled functional end-to-end anastomoses, between 2003 and 2016. Records were evaluated for signalment, medical and surgical treatments, and survival to hospital discharge. Long-term follow-up was obtained through owner contact. Continuous variables were compared with Mann-Whitney tests. Fisher's exact testing was used to compare survival to hospital discharge. Survival time was compared by constructing Kaplan-Meier survival curves and performing log-rank curve comparison testing. Mean age of horses undergoing colectomy was 13 years. Reason for colectomy was prophylaxis (12) or salvage (14). Mean surgical time was 169 minutes. Mean hospitalization time was 9 days, which did not differ with anastomosis type (P = .62). Nine of 12 horses undergoing stapled functional end-to-end anastomosis and 12 of 14 horses undergoing sutured end-to-end anastomosis survived to hospital discharge (P = .63). Survival time did not differ with anastomosis technique (P = .35). Short- and long-term survival outcomes are not different between sutured end-to-end or stapled functional end-to-end anastomoses in horses undergoing colectomy. © 2017 The American College of Veterinary Surgeons.
Brenner, Hermann; Jansen, Lina
2016-02-01
Monitoring cancer survival is a key task of cancer registries, but timely disclosure of progress in long-term survival remains a challenge. We introduce and evaluate a novel method, denoted "boomerang method," for deriving more up-to-date estimates of long-term survival. We applied three established methods (cohort, complete, and period analysis) and the boomerang method to derive up-to-date 10-year relative survival of patients diagnosed with common solid cancers and hematological malignancies in the United States. Using the Surveillance, Epidemiology and End Results 9 database, we compared the most up-to-date age-specific estimates that might have been obtained with the database including patients diagnosed up to 2001 with 10-year survival later observed for patients diagnosed in 1997-2001. For cancers with little or no increase in survival over time, the various estimates of 10-year relative survival potentially available by the end of 2001 were generally rather similar. For malignancies with strongly increasing survival over time, including breast and prostate cancer and all hematological malignancies, the boomerang method provided estimates that were closest to later observed 10-year relative survival in 23 of the 34 groups assessed. The boomerang method can substantially improve up-to-dateness of long-term cancer survival estimates in times of ongoing improvement in prognosis. Copyright © 2016 Elsevier Inc. All rights reserved.
Mehra, Tarun; Grözinger, Gerd; Mann, Steven; Guenova, Emmanuella; Moos, Rudolf; Röcken, Martin; Claussen, Claus Detlef; Dummer, Reinhard; Clasen, Stephan; Naumann, Aline; Garbe, Claus
2014-01-01
Background Data on survival with mucosal melanoma and on prognostic factors are scarce. It is still unclear if the disease course allows for mucosal melanoma to be treated as primary cutaneous melanoma or if differences in overall survival patterns require adapted therapeutic approaches. Furthermore, this investigation is the first to present 10-year survival rates for mucosal melanomas of different anatomical localizations. Methodology 116 cases from Sep 10 1984 until Feb 15 2011 retrieved from the Comprehensive Cancer Center and the Central Register of the German Dermatologic Society databases in Tübingen were included in our analysis. We recorded anatomical location and tumor thickness, and estimated overall survival at 2, 5 and 10 years and the mean overall survival time. Survival times were analyzed with the Kaplan-Meier method. The log-rank test was used to compare survival times by localizations and by T-stages. Principal Findings We found a median overall survival time of 80.9 months, with an overall 2-year survival of 71.7%, 5-year survival of 55.8% and 10-year survival of 38.3%. The 10-year survival rates for patients with T1, T2, T3 or T4 stage tumors were 100.0%, 77.9%, 66.3% and 10.6% respectively. 10-year survival of patients with melanomas of the vulva was 64.5% in comparison to 22.3% of patients with non-vulva mucosal melanomas. Conclusion Survival times differed significantly between patients with melanomas of the vulva compared to the rest (p = 0.0006). Survival also depended on T-stage at the time of diagnosis (p<0.0001). PMID:25383553
Role of survivor bias in pancreatic cancer case-control studies.
Hu, Zhen-Huan; Connett, John E; Yuan, Jian-Min; Anderson, Kristin E
2016-01-01
The purpose of this study was to evaluate the impact of survivor bias on pancreatic cancer case-control studies. The authors constructed five case-loss scenarios based on the Iowa Women's Health Study cohort to reflect how case recruitment in population-based studies varies by case survival time. Risk factors for disease incidence included smoking, body mass index (BMI), waist circumference, diabetes, and alcohol consumption. Odds ratios (ORs) were estimated by conditional logistic regression and quantitatively compared by the interactions between risk factors and 3-month survival time. Additionally, Kaplan-Meier estimates for overall survival were compared within the subset cohort of pancreatic cancer cases. BMI and waist circumference showed a significant inverse relationship with survival time. Decreasing trends in ORs for BMI and waist circumference were observed with increasing case survival time. The interaction between BMI and survival time based on a cutpoint of 3 months was significant (P < .01) as was the interaction between waist circumference and survival time (P < .01). The findings suggested that case losses could result in survivor bias causing underestimated odds ratios for both BMI and waist circumference, whereas other risk factors were not significantly affected by case losses. Copyright © 2016 Elsevier Inc. All rights reserved.
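The survivor-bias mechanism evaluated in this study is easy to reproduce by simulation: if an exposure shortens case survival without changing incidence, recruiting only cases who survive past a delay makes the exposure look rarer among enrolled cases, deflating the odds ratio. A hypothetical sketch with all parameters invented for illustration (not the Iowa Women's Health Study data):

```python
import random

random.seed(0)

def simulate(n=100_000, recruit_after=3.0):
    """Hypothetical cohort of cases: the exposure halves mean survival
    (4 vs. 8 months) but has no effect on who becomes a case. Only cases
    surviving past `recruit_after` months are enrolled."""
    exposed_all = enrolled = exposed_enrolled = 0
    for _ in range(n):
        exposed = random.random() < 0.5                    # 50% exposure prevalence
        survival = random.expovariate(1 / 4 if exposed else 1 / 8)
        exposed_all += exposed
        if survival > recruit_after:                       # delayed recruitment
            enrolled += 1
            exposed_enrolled += exposed
    return exposed_all / n, exposed_enrolled / enrolled

p_all, p_enrolled = simulate()
# Exposure prevalence drops among enrolled cases (roughly 0.41 vs. 0.50),
# so a case-control comparison against healthy controls would understate
# the true odds ratio, mirroring the BMI/waist-circumference finding.
print(p_all, p_enrolled)
```

With exponential survival the expected enrolled-case prevalence is 0.5·e^(-3/4) / (0.5·e^(-3/4) + 0.5·e^(-3/8)) ≈ 0.41, matching the simulated value.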
Survival time with pacemaker implantation for dogs diagnosed with persistent atrial standstill.
Cervenec, R M; Stauthammer, C D; Fine, D M; Kellihan, H B; Scansen, B A
2017-06-01
To evaluate survival time in dogs with persistent atrial standstill after pacemaker implantation and to compare the survival times for cardiac-related vs. non-cardiac deaths. Secondary objectives were to evaluate the effects of breed and the presence of congestive heart failure (CHF) at the time of diagnosis on survival time. Twenty dogs with persistent atrial standstill and pacemaker implantation. Medical records were searched to identify dogs diagnosed with persistent atrial standstill based on electrocardiogram that underwent pacemaker implantation. Survival after pacemaker implantation was analyzed using the Kaplan-Meier method. The median survival time after pacemaker implantation for all-cause mortality was 866 days. There was no significant difference (p=0.573) in median survival time for cardiac (506 days) vs. non-cardiac deaths (400 days). The presence of CHF at the time of diagnosis did not affect the survival time (P=0.854). No difference in median survival time was noted between breeds (P=0.126). Dogs with persistent atrial standstill have a median survival time of 866 days with pacemaker implantation, though a wide range of survival times was observed. There was no difference in the median survival time for dogs with cardiac-related deaths and those without. Patient breed and the presence of CHF before pacemaker implantation did not affect median survival time. Copyright © 2017 Elsevier B.V. All rights reserved.
Decompression Sickness After Air Break in Prebreathe Described with a Survival Model
NASA Technical Reports Server (NTRS)
Conkin, J.; Pilmanis, A. A.
2010-01-01
Data from Brooks City-Base show the decompression sickness (DCS) and venous gas emboli (VGE) consequences of air breaks in a resting 100% O2 prebreathe (PB) prior to a hypobaric exposure. METHODS: DCS and VGE survival times from 95 controls for a 60 min PB prior to 2-hr or 4-hr exposures to 4.37 psia are statistically compared to 3 break-in-PB conditions: a 10 min (n=40), 20 min (n=40), or 60 min break (n=32) 30 min into the PB, followed by 30 min of PB. Ascent rate was 1,524 meters/min and all exposures included light exercise and 4 min of VGE monitoring of heart chambers at 16 min intervals. DCS survival times for combined control and air-break conditions were described with an accelerated log logistic model in which exponential N2 washin during the air break was described with a 10 min half-time and washout during PB with a 60 min half-time. RESULTS: There was no difference in VGE or DCS survival times among the 3 different air breaks, or when air breaks were compared to control VGE times. However, 10, 20, and 60 min air breaks had significantly earlier survival times compared to control DCS times, particularly early in the exposures. CONCLUSION: Air breaks of 10, 20, and 60 min after 30 min of a 60 min PB reduced DCS survival time. The survival model combined discrete comparisons into a global description mechanistically linked to asymmetrical N2 washin and washout kinetics based on inspired pN2. Our unvalidated regression is used to compute the additional PB time needed to compensate for an air break in PB within the range of tested conditions.
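The asymmetric washin/washout kinetics in the model follow the standard exponential approach of tissue N2 tension toward the inspired pN2 with a fixed half-time. A rough sketch of that kinetic component only; the sea-level pN2 of ~11.6 psi and the schedule are assumptions for illustration, and the paper's full accelerated log-logistic survival model is not reproduced here:

```python
import math

def n2_tension(p0, p_inspired, minutes, half_time):
    """Exponential approach of tissue N2 tension toward the inspired pN2:
    p(t) = p_i + (p0 - p_i) * exp(-ln(2) * t / half_time)."""
    return p_inspired + (p0 - p_inspired) * math.exp(-math.log(2) * minutes / half_time)

# Assumed schedule: 30 min of 100% O2 prebreathe (washout, 60 min half-time),
# a 10 min air break (washin, 10 min half-time), then 30 min more O2.
p = 11.6                          # assumed sea-level tissue pN2, psi
p = n2_tension(p, 0.0, 30, 60)    # first half of the prebreathe
p = n2_tension(p, 11.6, 10, 10)   # air break raises tissue N2 quickly
p = n2_tension(p, 0.0, 30, 60)    # prebreathe resumes
print(round(p, 2))                # residual tissue N2 after the interrupted PB
```

Because the assumed washin half-time (10 min) is much shorter than the washout half-time (60 min), even a brief air break undoes a disproportionate amount of denitrogenation, which is the mechanistic reason extra PB time is needed after a break.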
Majhail, Navneet S; Brazauskas, Ruta; Hassebroek, Anna; Bredeson, Christopher N; Hahn, Theresa; Hale, Gregory A; Horowitz, Mary M; Lazarus, Hillard M; Maziarz, Richard T; Wood, William A; Parsons, Susan K; Joffe, Steven; Rizzo, J Douglas; Lee, Stephanie J; Hayes-Lattin, Brandon M
2012-06-01
Adolescents and young adults (AYAs) with cancer have not experienced improvements in survival to the same extent as children and older adults. We compared outcomes among children (<15 years), AYAs (15-40 years) and older adults (>40 years) receiving allogeneic hematopoietic cell transplant (HCT) for acute myeloid leukemia (AML). Our cohort consisted of 900 children, 2,708 AYA, and 2,728 older adult recipients of HLA-identical sibling or unrelated donor (URD) transplantation using myeloablative or reduced-intensity/nonmyeloablative conditioning. Outcomes were assessed over three time periods (1980-1988, 1989-1997, 1998-2005) for siblings and two time periods (1989-1997, 1998-2005) for URD HCT. Analyses were stratified by donor type. Results showed overall survival for AYAs using either siblings or URD improved over time. Although children had better and older adults had worse survival compared with AYAs, improvements in survival for AYAs did not lag behind those for children and older adults. After sibling donor HCT, 5-year adjusted survival for the three time periods was 40%, 48%, and 53% for children, 35%, 41%, and 42% for AYAs, and 22%, 30%, and 34% for older adults. Among URD HCT recipients, 5-year adjusted survival for the two time periods was 38% and 37% for children, 24% and 28% for AYAs, and 19% and 23% for older adults. Improvements in survival occurred because of a reduction in risk of treatment-related mortality. The risk of relapse did not change over time. Improvements in survival among AYAs undergoing allogeneic HCT for AML have paralleled those among children and older adults. Copyright © 2012 American Society for Blood and Marrow Transplantation. Published by Elsevier Inc. All rights reserved.
Self-rated health as a predictor of survival among patients with advanced cancer.
Shadbolt, Bruce; Barresi, Jane; Craft, Paul
2002-05-15
Evidence is emerging about the strong predictive relationship between self-rated health (SRH) and survival, although there is little evidence on palliative populations where an accurate prediction of survival is valuable. Thus, the relative importance of SRH in predicting the survival of ambulatory patients with advanced cancer was examined. SRH was compared to clinical assessments of performance status, as well as to quality-of-life measures. By use of a prospective cohort design, 181 patients (76% response rate) with advanced cancer were recruited into the study, resurveyed at 18 weeks, and observed to record deaths. The average age of patients was 62 years (SD = 12). The median survival time was 10 months. SRH was the strongest predictor of survival from baseline. Also, a Cox regression comparing changes in SRH over time yielded hazard ratios suggesting the relative risk (RR) of dying was greater for fair ratings at 18 weeks (approximately 3 times) compared with consistent good or better ratings; the RR was even greater (4.2 and 6.2 times) for poor ratings, especially when ratings were poor at baseline and 18 weeks (31 times). Improvement in SRH over time yielded the lowest RR. SRH is valid, reliable, and responsive to change as a predictor of survival of advanced cancer. These qualities suggest that SRH should be considered as an additional tool by oncologists to assess patients. Similarly, health managers could use SRH as an indicator of disease severity in palliative care case mix. Finally, SRH could provide a key to help us understand the human side of disease and its relationship with medicine.
Brugière, Olivier; Thabut, Gabriel; Suberbielle, Caroline; Reynaud-Gaubert, Martine; Thomas, Pascal; Pison, Christophe; Saint Raymond, Christel; Mornex, Jean-François; Bertocchi, Michèle; Dromer, Claire; Velly, Jean-François; Stern, Marc; Philippe, Bruno; Dauriat, Gaëlle; Biondi, Giuseppina; Castier, Yves; Fournier, Michel
2008-06-01
Recent data strongly suggest that human leukocyte antigen (HLA) mismatching has a negative impact on development of bronchiolitis obliterans syndrome (BOS) and survival after lung transplantation (LTx). Because HLA matching is sometimes achieved by extending ischemic time in other solid-organ transplantation models and ischemic time is a risk factor per se for death after LTx, we sought to compare the theoretical benefit of HLA matching with the negative impact of lengthened ischemic time. In this collaborative study we compared the relative impact of HLA mismatching and ischemic time on BOS and survival in 182 LTx recipients. Using multivariate analyses, we observed a lower incidence of BOS (hazard ratio [HR] = 1.70, 95% confidence interval [CI]: 1.1 to 2.7, p = 0.03) and enhanced survival (HR = 1.91, 95% CI: 1.24 to 2.92, p = 0.01) in patients with zero or one HLA-A mismatch compared with those having two HLA-A mismatches. This beneficial effect on survival was equivalent to a reduction of ischemic time of 168 minutes. We observed a reduced incidence of BOS and a better survival rate in patients well-matched at the HLA-A locus, associated with an opposite effect of an enhanced ischemic time. This suggests that graft ischemic time should be taken into account in future studies of prospective HLA matching in LTx.
Yokoyama, Yoshihito; Shigeto, Tatsuhiko; Miura, Rie; Kobayashi, Asami; Mizunuma, Makito; Yamauchi, Aisa; Futagami, Masayuki; Mizunuma, Hideki
2016-01-01
The current study examined the effectiveness of concurrent therapy using photodynamic therapy (PDT) and clofibric acid (CA) to treat peritoneal carcinomatosis resulting from ovarian cancer. Nude rats were used to create a model of peritoneal carcinomatosis resulting from ovarian cancer and the effectiveness of PDT with 5-aminolevulinic acid methyl ester hydrochloride (methyl-ALA-PDT) was determined. The survival time of rats receiving that therapy was compared to the survival time of a control group. Rats with peritoneal carcinomatosis resulting from ovarian cancer were divided into 3 groups: a group that received debulking surgery (DS) alone, a group that received DS+methyl-ALA-PDT, and a group that received DS+methyl-ALA-PDT+CA. The survival time of the 3 groups was compared. Protoporphyrin, a metabolite of methyl-ALA, produces a photochemical action when activated by light. The level of protoporphyrin (the concentration) that reached organs in the abdomen was measured with HPLC. Rats receiving methyl-ALA-PDT had a significantly longer survival time compared to the controls. Rats with peritoneal carcinomatosis that received DS+methyl-ALA-PDT+CA had a significantly longer survival time compared to the rats that received DS alone. Some of the rats that received concurrent therapy survived for a prolonged period. Protoporphyrin was highly concentrated in peritoneal metastases, but only small amounts reached major organs in the abdomen. PDT was not found to result in necrosis in the intestines. The results indicated that concurrent therapy consisting of PDT with methyl-ALA and CA is effective at treating peritoneal carcinomatosis resulting from ovarian cancer without damaging organs.
Cancer survival in Eastern and Western Germany after the fall of the iron curtain.
Jansen, Lina; Gondos, Adam; Eberle, Andrea; Emrich, Katharina; Holleczek, Bernd; Katalinic, Alexander; Brenner, Hermann
2012-09-01
Prior to the German reunification, cancer survival was much lower in East than in West Germany. We compare cancer survival between Eastern and Western Germany in the early twenty-first century, i.e. the second decade after the German reunification. Using data from 11 population-based cancer registries covering a population of 33 million people, 5-year age-standardized relative survival for the time period 2002-2006 was estimated for the 25 most common cancers using model-based period analysis. In 2002-2006, 5-year relative survival was very similar for most cancers, with differences below 3% units for 20 of 25 cancer sites. Larger, statistically significant survival advantages were seen for oral cavity, oesophagus, and gallbladder cancer and skin melanoma in the West and for leukemia in the East. Our study shows that within two decades after the assimilation of political and health care systems, the former major survival gap of cancer patients in Eastern Germany has been essentially overcome. This result is encouraging as it suggests that, even though economic conditions have remained difficult in Eastern Germany, comparable health care provision may nevertheless enable comparable levels of cancer survival within a relatively short period of time.
Rudin-Bitterli, Tabitha S; Spicer, John I; Rundle, Simon D
2016-04-01
Physiological plasticity of early developmental stages is a key way by which organisms can survive and adapt to environmental change. We investigated developmental plasticity of aspects of the cardio-respiratory physiology of encapsulated embryos of a marine gastropod, Littorina obtusata, surviving exposure to moderate hypoxia (PO2 =8 kPa) and compared the development of these survivors with that of individuals that died before hatching. Individuals surviving hypoxia exhibited a slower rate of development and altered ontogeny of cardio-respiratory structure and function compared with normoxic controls (PO2 >20 kPa). The onset and development of the larval and adult hearts were delayed in chronological time in hypoxia, but both organs appeared earlier in developmental time and cardiac activity rates were greater. The velum, a transient, 'larval' organ thought to play a role in gas exchange, was larger in hypoxia but developed more slowly (in chronological time), and velar cilia-driven, rotational activity was lower. Despite these effects of hypoxia, 38% of individuals survived to hatching. Compared with those embryos that died during development, these surviving embryos had advanced expression of adult structures, i.e. a significantly earlier occurrence and greater activity of their adult heart and larger shells. In contrast, embryos that died retained larval cardio-respiratory features (the velum and larval heart) for longer in chronological time. Surviving embryos came from eggs with significantly higher albumen provisioning than those that died, suggesting an energetic component for advanced development of adult traits. © 2016. Published by The Company of Biologists Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacobs, R.M.; Boyce, J.T.; Kociba, G.J.
This study demonstrates the potential usefulness of a flow cytometric technique to measure platelet survival time in cats utilizing autologous platelets labeled in vitro with fluorescein isothiocyanate (FITC). When compared with a 51Cr method, no significant differences in estimated survival times were found. Both the 51Cr and FITC-labeling procedures induced similar changes in platelet shape and collagen-induced aggregation. Platelets labeled with FITC had significantly greater volumes compared with those of glutaraldehyde-fixed platelets. These changes were primarily related to the platelet centrifugation and washing procedures rather than the labels themselves. This novel technique potentially has wide applicability to cell circulation time studies as flow cytometry equipment becomes more readily available. Problems with the technique are discussed. In a preliminary study of the platelet survival time in feline leukemia virus (FeLV)-infected cats, two of three cats had significantly reduced survival times using both flow cytometric and radioisotopic methods. These data suggest increased platelet turnover in FeLV-infected cats.
Matsuo, Koji; Opper, Neisha R; Ciccone, Marcia A; Garcia, Jocelyn; Tierney, Katherine E; Baba, Tsukasa; Muderspach, Laila I; Roman, Lynda D
2015-02-01
To examine whether wait time between endometrial biopsy and surgical staging correlates with tumor characteristics and affects survival outcomes in patients with type I endometrial cancer. A retrospective study was conducted to examine patients with grade 1 and 2 endometrioid adenocarcinoma diagnosed by preoperative endometrial biopsy who subsequently underwent hysterectomy-based surgical staging between 2000 and 2013. Patients who received neoadjuvant chemotherapy or hormonal treatment were excluded. Time interval and grade change between endometrial biopsy and hysterectomy were correlated to demographics and survival outcomes. Median wait time was 57 days (range 1-177 days) among 435 patients. Upgrading of the tumor to grade 3 in the hysterectomy specimen was seen in 4.7% of 321 tumors classified as grade 1 and 18.4% of 114 tumors classified as grade 2 on the endometrial biopsy, respectively. Wait time was not associated with grade change (P>.05). Controlling for age, ethnicity, body habitus, medical comorbidities, CA 125 level, and stage, multivariable analysis revealed that wait time was not associated with survival outcomes (5-year overall survival rates, wait time 1-14, 15-42, 43-84, and 85 days or more; 62.5%, 93.6%, 95.2%, and 100%, respectively, P>.05); however, grade 1 to 3 on the hysterectomy specimen remained as an independent prognosticator associated with decreased survival (5-year overall survival rates, grade 1 to 3 compared with grade change 1 to 1, 82.1% compared with 98.5%, P=.01). Among grade 1 preoperative biopsies, grade 1 to 3 was significantly associated with nonobesity (P=.039) and advanced stage (P=.019). Wait time for surgical staging was not associated with decreased survival outcome in patients with type I endometrial cancer.
Prognosis after surgical excision of canine fibrous connective tissue sarcomas.
Bostock, D E; Dye, M T
1980-09-01
One hundred eighty-seven dogs from which fibrous connective tissue sarcomas had been excised were studied until death or for at least 3 years after surgery. Dogs with a skin fibrosarcoma had a median survival time of 80 weeks, compared with 140 weeks for animals with haemangiopericytoma in similar sites, this difference being statistically significant. However, the difference in survival time between the two histologic types disappeared when tumours with a similar mitotic index were compared. Dogs with a tumour of mitotic index 9 or more had a median survival time of 49 weeks, compared with 118 weeks for those with a tumour of mitotic index less than 9, regardless of tumour morphology. Tumour recurrence rates of 62% and 25% respectively for the two groups were also significantly different.
Towards optimizing the sequence of bevacizumab and nitrosoureas in recurrent malignant glioma.
Wiestler, Benedikt; Radbruch, Alexander; Osswald, Matthias; Combs, Stephanie E; Jungk, Christine; Winkler, Frank; Bendszus, Martin; Unterberg, Andreas; Platten, Michael; Wick, Wolfgang; Wick, Antje
2014-03-01
Studies on the monoclonal VEGF-A antibody bevacizumab gave rise to questions regarding the lack of an overall survival benefit, the optimal timing in the disease course and potential combination and salvage therapies. We retrospectively assessed survival, radiological progression type on bevacizumab and efficacy of salvage therapies in 42 patients with recurrent malignant gliomas who received bevacizumab and nitrosourea sequentially. 15 patients received bevacizumab followed by nitrosourea at progression and 27 patients vice versa. Time to treatment failure, defined as time from initiation of one to failure of the other treatment, was similar in both groups (9.6 vs. 9.2 months, log rank p = 0.19). Progression-free survival on nitrosoureas was comparable in both groups, while progression-free survival on bevacizumab was longer in the group receiving bevacizumab first (5.3 vs. 4.1 months, log rank p = 0.03). Survival times were similar for patients with grade III (n = 9) and grade IV (n = 33) tumors. Progression-free survival on bevacizumab for patients developing contrast-enhancing T1 progression was longer than for patients who displayed a non-enhancing T2 progression. However, post-progression survival times after bevacizumab failure were not different. Earlier treatment with bevacizumab was not associated with better outcome in this series. The fact that earlier as compared to later bevacizumab treatment does not result in a different time to treatment failure highlights the challenge for first-line or recurrence trials with bevacizumab to demonstrate an overall survival benefit if crossover of bevacizumab-naïve patients after progression occurs.
A comparative study of mixture cure models with covariate
NASA Astrophysics Data System (ADS)
Leng, Oh Yit; Khalid, Zarina Mohd
2017-05-01
In survival analysis, the survival time is assumed to follow a non-negative distribution, such as the exponential, Weibull, or log-normal distribution. In some cases, the survival time is influenced by observed factors, and omitting these factors may cause inaccurate estimation of the survival function. A survival model that incorporates the observed factors as covariates is therefore more appropriate in such cases. In addition, some datasets contain a group of individuals who are cured, that is, who never experience the event of interest. Ignoring this cure fraction may lead to overestimation of the survival function, so a mixture cure model is more suitable for modelling survival data in the presence of a cure fraction. In this study, three mixture cure survival models are used to analyse survival data with a covariate and a cure fraction: the first model includes the covariate in the parameterization of the survival function of susceptible individuals, the second model allows the cure fraction to depend on the covariate, and the third model incorporates the covariate in both the cure fraction and the survival function of susceptible individuals. This study aims to compare the performance of these models via a simulation approach: survival data with varying sample sizes and cure fractions are simulated, with the survival time assumed to follow the Weibull distribution, and the simulated data are then modelled using the three mixture cure survival models. The results show that the three mixture cure models are more appropriate for modelling survival data in the presence of a cure fraction and an observed factor.
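As a concrete illustration of the third model's structure, the population survival function can be sketched as a mixture of a logistic cure fraction and a Weibull survival for susceptible individuals. All coefficients below are illustrative placeholders, not estimates from the study:

```python
import math

def cure_fraction(x, g0=-0.5, g1=1.0):
    """Logistic model for the cured proportion pi(x); g0, g1 are illustrative."""
    return 1.0 / (1.0 + math.exp(-(g0 + g1 * x)))

def susceptible_survival(t, x, shape=1.5, scale=2.0, beta=0.8):
    """Weibull survival for susceptible individuals, covariate acting on the hazard."""
    return math.exp(-((t / scale) ** shape) * math.exp(beta * x))

def population_survival(t, x):
    """Mixture cure survival: S(t|x) = pi(x) + (1 - pi(x)) * Su(t|x)."""
    p = cure_fraction(x)
    return p + (1.0 - p) * susceptible_survival(t, x)
```

Note the defining feature of cure models: as t grows, the population survival does not decay to zero but plateaus at the cure fraction pi(x).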
Lloyd, Penn; Martin, Thomas E.
2016-01-01
Slow life histories are characterized by high adult survival and few offspring, which are thought to allow increased investment per offspring to increase juvenile survival. Consistent with this pattern, south temperate zone birds are commonly longer-lived and have fewer young than north temperate zone species. However, comparative analyses of juvenile survival, including during the first few weeks of the post-fledging period when most juvenile mortality occurs, are largely lacking. We combined our measurements of fledgling survival for eight passerines in South Africa with estimates from published studies of 57 north and south temperate zone songbird species to test three predictions: (1) fledgling survival increases with length of development time in the nest; (2) fledgling survival increases with adult survival and reduced brood size controlled for development time; and (3) south temperate zone species, with their higher adult survival and smaller brood sizes, exhibit higher fledgling survival than north temperate zone species controlled for development time. We found that fledgling survival was higher among south temperate zone species and generally increased with development time and adult survival within and between latitudinal regions. Clutch size did not explain additional variation, but was confounded with adult survival. Given the importance of age-specific mortality to life history evolution, understanding the causes of these geographical patterns of mortality is important.
Nasejje, Justine B; Mwambi, Henry; Dheda, Keertan; Lesosky, Maia
2017-07-28
Random survival forest (RSF) models have been identified as alternative methods to the Cox proportional hazards model in analysing time-to-event data. These methods, however, have been criticised for the bias that results from favouring covariates with many split-points and hence conditional inference forests for time-to-event data have been suggested. Conditional inference forests (CIF) are known to correct the bias in RSF models by separating the procedure for the best covariate to split on from that of the best split point search for the selected covariate. In this study, we compare the random survival forest model to the conditional inference forest model (CIF) using twenty-two simulated time-to-event datasets. We also analysed two real time-to-event datasets. The first dataset is based on the survival of children under-five years of age in Uganda and it consists of categorical covariates with most of them having more than two levels (many split-points). The second dataset is based on the survival of patients with extremely drug resistant tuberculosis (XDR TB) which consists of mainly categorical covariates with two levels (few split-points). The study findings indicate that the conditional inference forest model is superior to random survival forest models in analysing time-to-event data that consists of covariates with many split-points based on the values of the bootstrap cross-validated estimates for integrated Brier scores. However, conditional inference forests perform comparably to random survival forest models in analysing time-to-event data consisting of covariates with fewer split-points. Although survival forests are promising methods in analysing time-to-event data, it is important to identify the best forest model for analysis based on the nature of covariates of the dataset in question.
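The integrated Brier score used to rank the two forest models can be illustrated with a deliberately simplified sketch: no inverse-probability-of-censoring (IPCW) weights, censored-before-t subjects naively counted as alive, and a plain average over a time grid. The published comparisons use the full weighted version; this only shows the shape of the computation:

```python
def brier_score(times, events, pred_surv, t):
    """Mean squared error between predicted survival at t and observed status.

    Simplified: a subject counts as 'dead by t' only if an event was observed
    at or before t; censored subjects are treated as alive (no IPCW weights).
    """
    errs = []
    for T, e, S in zip(times, events, pred_surv):
        alive = 0.0 if (e and T <= t) else 1.0
        errs.append((alive - S(t)) ** 2)
    return sum(errs) / len(errs)

def integrated_brier(times, events, pred_surv, grid):
    """Plain average of Brier scores over a grid of evaluation times."""
    scores = [brier_score(times, events, pred_surv, t) for t in grid]
    return sum(scores) / len(scores)
```

A model predicting S(t) = 0.5 everywhere scores 0.25 at every time point, the benchmark for an uninformative predictor.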
Dickman, Mor M; Peeters, Jean Marie P W U; van den Biggelaar, Frank J H M; Ambergen, Ton A W; van Dongen, Martin C J M; Kruit, Pieter Jan; Nuijts, Rudy M M A
2016-10-01
To compare graft survival, best-corrected visual acuity (BCVA), endothelial cell density (ECD), and refraction following penetrating keratoplasty (PK) vs endothelial keratoplasty (EK) for Fuchs endothelial dystrophy (FED) and pseudophakic bullous keratopathy (PBK). Nonrandomized treatment comparison with national registry data. All consecutive patients undergoing first keratoplasty for FED and PBK between 1998 and 2014 were analyzed, with a maximum follow-up of 5 years (mean ± SD follow-up 39 ± 20 months, range 0-60 months). Graft survival was analyzed using Kaplan-Meier survival curves and Cox regression analysis. BCVA, ECD, and refractive error were compared using linear mixed models. Main outcome measures were graft survival, BCVA, refraction, and ECD. A total of 5115 keratoplasties (PK = 2390; EK = 2725) were identified. Two-year graft survival following EK was lower compared with PK (94.5% vs 96.3%, HR = 1.56, P = .001). Five-year survival was comparable for EK and PK (93.4% vs 89.7%, HR = 0.89, P = .261). EK graft survival improved significantly over time while remaining stable for PK. One-year BCVA was better following EK vs PK (0.34 vs 0.47 logMAR, P < .001). Astigmatism was lower 1 year after EK vs PK (-1.69 vs -3.52 D, P < .001). One-year ECD was lower after EK vs PK (1472 vs 1859 cells/mm², P < .001). At 3 years, ECD did not differ between EK and PK. Long-term graft survival after EK and PK is high and comparable despite lower short-term survival for EK. EK graft survival improved over time, suggesting a learning curve. EK results in better BCVA, lower astigmatism, and similar long-term ECD compared with PK for FED and PBK. Copyright © 2016 Elsevier Inc. All rights reserved.
Rottmann, Miriam; Burges, A; Mahner, S; Anthuber, C; Beck, T; Grab, D; Schnelzer, A; Kiechle, M; Mayr, D; Pölcher, M; Schubert-Fritschle, G; Engel, J
2017-09-01
The objective was to compare the prognostic factors and outcomes among primary ovarian cancer (OC), fallopian tube cancer (FC), and peritoneal cancer (PC) patients in a population-based setting. We analysed 5399 OC, 327 FC, and 416 PC patients diagnosed between 1998 and 2014 in the catchment area of the Munich Cancer Registry (currently 4.8 million inhabitants). Tumour site differences were examined by comparing prognostic factors, treatments, the time to progression, and survival. The effect of the tumour site was additionally analysed by a Cox regression model. The median age at diagnosis, histology, and FIGO stage significantly differed among the tumour sites (p < 0.001); PC patients were older, more often diagnosed with a serous subtype, and in FIGO stage III or IV. The time to progression and survival significantly differed among the tumour sites. When stratified by FIGO stage, the differences in time to progression disappeared, and the differences in survival considerably weakened. The multivariate survival analysis showed an almost identical outcome in PC patients (HR 1.07 [0.91-1.25]) and an improved survival of FC patients (HR 0.63 [0.49-0.81]) compared to that of OC patients. The comparison of OC, FC, and PC patients in this large-scale population-based study showed differences in the prognostic factors. These differences primarily account for the inferior outcome of PC patients, and for the improved survival of FC compared to OC patients.
Comparative trial of endocrine versus cytotoxic treatment in advanced breast cancer.
Priestman, T; Baum, M; Jones, V; Forbes, J
1977-01-01
Ninety-two women with advanced breast cancer were allocated at random to receive either cytotoxic or endocrine treatment. Out of 45 women included in the cytotoxic treatment group, 22 (49%) achieved complete or partial remission of their disease, whereas of the 47 included in the endocrine treatment group, only 10 (21%) achieved such remission. Significantly longer survival times in the cytotoxic treatment group were most apparent among premenopausal women, 75% of such patients responding to cytotoxic drugs (median survival 46 weeks) compared with only 11% benefiting from ovarian ablation (median survival 12 weeks). In postmenopausal women with predominantly soft-tissue disease, however, additive hormonal treatment with tamoxifen produced remission rates and survival times equivalent to those produced by cytotoxic drugs. PMID:324570
Lin, Pei-Jung; Concannon, Thomas W; Greenberg, Dan; Cohen, Joshua T; Rossi, Gregory; Hille, Jeffrey; Auerbach, Hannah R; Fang, Chi-Hui; Nadler, Eric S; Neumann, Peter J
2013-08-01
To investigate the relationship between the framing of survival gains and the perceived value of cancer care. Through a population-based survey of 2040 US adults, respondents were randomized to one of the two sets of hypothetical scenarios, each of which described the survival benefit for a new treatment as either an increase in median survival time (median survival), or an increase in the probability of survival for a given length of time (landmark survival). Each respondent was presented with two randomly selected scenarios with different prognosis and survival improvements, and asked about their willingness to pay (WTP) for the new treatments. Predicted WTP increased with survival benefits and respondents' income, regardless of how survival benefits were described. Framing therapeutic benefits as improvements in landmark rather than median time survival increased the proportion of the population willing to pay for that gain by 11-35%, and the mean WTP amount by 42-72% in the scenarios we compared. How survival benefits are described may influence the value people place on cancer care.
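The two framings in the survey are linked under a simple parametric assumption: if survival is exponential, a stated median survival implies a landmark survival probability at any horizon, so the same benefit can be expressed either way. The exponential assumption here is ours, for illustration only, not part of the survey design:

```python
import math

def landmark_survival(median_months, t_months):
    """Landmark survival P(T > t) implied by a median under an exponential model.

    Uses lambda = ln(2) / median, so S(t) = exp(-lambda * t).
    """
    lam = math.log(2) / median_months
    return math.exp(-lam * t_months)
```

For example, improving median survival from 12 to 18 months raises the implied 1-year landmark survival from 50% to about 63%, which shows why the same gain can "feel" different depending on how it is framed.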
NASA Astrophysics Data System (ADS)
Hapugoda, J. C.; Sooriyarachchi, M. R.
2017-09-01
Survival time of patients with a disease and the incidence of that particular disease (count) are frequently observed in medical studies with data of a clustered nature. In many cases, though, the survival times and the count can be correlated in such a way that diseases that occur rarely could have shorter survival times, or vice versa. Due to this fact, joint modelling of these two variables will provide interesting and certainly improved results over modelling them separately. The authors have previously proposed a methodology using Generalized Linear Mixed Models (GLMM), joining the Discrete Time Hazard model with the Poisson Regression model, to jointly model survival and count data. As the Artificial Neural Network (ANN) has become one of the most powerful computational tools for modelling complex non-linear systems, it was proposed to develop a new joint model of the survival and count of Dengue patients of Sri Lanka using that approach. Thus, the objective of this study is to develop a model using the ANN approach and compare the results with the previously developed GLMM model. As the response variables are continuous in nature, the Generalized Regression Neural Network (GRNN) approach was adopted to model the data. To compare the model fit, measures such as root mean square error (RMSE), absolute mean error (AME) and correlation coefficient (R) were used. The measures indicate that the GRNN model fits the data better than the GLMM model.
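The three fit measures used in the comparison are standard and can be computed directly; a minimal sketch of RMSE, absolute mean error and the Pearson correlation coefficient:

```python
import math

def rmse(obs, pred):
    """Root mean square error between observed and predicted values."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def ame(obs, pred):
    """Absolute mean error: mean of absolute deviations."""
    return sum(abs(o - p) for o, p in zip(obs, pred)) / len(obs)

def pearson_r(obs, pred):
    """Pearson correlation coefficient between observed and predicted values."""
    n = len(obs)
    mo, mp = sum(obs) / n, sum(pred) / n
    cov = sum((o - mo) * (p - mp) for o, p in zip(obs, pred))
    so = math.sqrt(sum((o - mo) ** 2 for o in obs))
    sp = math.sqrt(sum((p - mp) ** 2 for p in pred))
    return cov / (so * sp)
```

Lower RMSE and AME and a higher R indicate the better-fitting model, which is the criterion by which the GRNN was preferred over the GLMM here.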
Lefbom, Bonnie K; Peckens, Neal K
2016-07-01
OBJECTIVE To assess the effects of in-person collaborative care by primary care veterinarians (pcDVMs) and board-certified veterinary cardiologists (BCVCs) on survival time of dogs after onset of congestive heart failure (CHF) and on associated revenue for the attending pcDVMs. DESIGN Retrospective cohort study. ANIMALS 26 small-breed dogs treated for naturally occurring CHF secondary to myxomatous mitral valve disease at a multilocation primary care veterinary hospital between 2008 and 2013. PROCEDURES Electronic medical records were reviewed to identify dogs with confirmed CHF secondary to myxomatous mitral valve disease and collect information on patient care, survival time, and pcDVM revenue. Data were compared between dogs that received collaborative care from the pcDVM and a BCVC and dogs that received care from the pcDVM alone. RESULTS Dogs that received collaborative care had a longer median survival time (254 days) than did dogs that received care from the pcDVM alone (146 days). A significant positive correlation was identified between pcDVM revenue and survival time for dogs that received collaborative care (ie, the longer the dog survived, the greater the pcDVM revenue generated from caring for that patient). CONCLUSIONS AND CLINICAL RELEVANCE Findings suggested that collaborative care provided to small-breed dogs with CHF by a BCVC and pcDVM could result in survival benefits for affected dogs and increased revenue for pcDVMs, compared with care provided by a pcDVM alone.
Malignant melanoma in 63 dogs (2001-2011): the effect of carboplatin chemotherapy on survival.
Brockley, L K; Cooper, M A; Bennett, P F
2013-01-01
The aim of the study was to compare the effect of carboplatin chemotherapy on the survival of canine patients diagnosed with malignant melanoma after loco-regional control or as a sole therapy. A retrospective study of 63 dogs with oral, digital or cutaneous malignant melanoma treated with surgery and/or chemotherapy was undertaken. Dogs were grouped based on the anatomical site of melanoma development. For oral melanoma, dogs were subclassified into two groups: loco-regional control and gross disease. All patients in the digital and cutaneous groups had achieved loco-regional control with surgery. Comparisons between survival data for each group at each anatomical site were then made. Within the loco-regional control groups survival time was compared between those treated with and without chemotherapy post surgery. For the oral melanoma patients with gross disease survival was compared between those treated with chemotherapy and palliative therapy. The toxicity of carboplatin chemotherapy was evaluated overall. The overall median survival times for patients with oral, digital and cutaneous melanoma were 389 days, 1,350 days and not reached (with a median follow-up of 776 days), respectively. Median survival time was defined as "not reached" when fewer than 50% of the subjects had died of the disease at the end of the follow-up period, or at the time they were lost to follow-up. The addition of chemotherapy to surgery did not confer a survival benefit in the loco-regional control setting when assessing survival for each anatomical site. For oral melanoma patients with gross disease there was no difference between survival of patients treated with chemotherapy and palliative intent therapy. There was however an improvement in survival in the three dogs that responded to chemotherapy (978 days; p=0.039) compared to the eight non-responders (147 days).
On univariate and multivariate analysis, anatomic location was the only variable that was significantly related to survival (p=0.0002 and p=0.009, respectively). The addition of chemotherapy to local treatments for canine melanoma at oral, digital and cutaneous sites did not lead to a significant increase in survival times. Carboplatin was well tolerated and appeared to have activity against oral melanoma in a subset of patients with gross disease that responded to treatment. Carboplatin with piroxicam could be considered for patients with gross disease when more traditional therapies, such as surgery or radiation therapy, are declined or are not available. In the loco-regional control setting, prospective randomised blinded studies with matched control groups are required to determine if chemotherapy has a role in the treatment of these types of cancer.
Comparison of cancer survival in New Zealand and Australia, 2006-2010.
Aye, Phyu S; Elwood, J Mark; Stevanovic, Vladimir
2014-12-19
Previous studies have shown substantially higher mortality rates from cancer in New Zealand compared to Australia, but these studies have not included data on patient survival. This study compares the survival of cancer patients diagnosed in 2006-10 in the whole populations of New Zealand and Australia. Identical period survival methods were used to calculate relative survival ratios for all cancers combined, and for 18 cancers each accounting for more than 50 deaths per year in New Zealand, from 1 to 10 years from diagnosis. Cancer survival was lower in New Zealand, with 5-year relative survival being 4.2% lower in women, and 3.8% lower in men for all cancers combined. Of 18 cancers, 14 showed lower survival in New Zealand; the exceptions, with similar survival in each country, being melanoma, myeloma, mesothelioma, and cervical cancer. For most cancers, the differences in survival were maximum at 1 year after diagnosis, becoming smaller later; however, for breast cancer, the survival difference increased with time after diagnosis. The lower survival in New Zealand, and the higher mortality rates shown earlier, suggest that further improvements in recognition, diagnosis, and treatment of cancer in New Zealand should be possible. As the survival differences are seen soon after diagnosis, issues of early management in primary care and time intervals to diagnosis and treatment may be particularly important.
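Relative survival, the quantity compared between the two countries, is observed cumulative survival divided by the cumulative survival expected for a comparable group in the general population (from life tables). A minimal sketch using illustrative interval survival probabilities (the values below are made up, not New Zealand or Australian figures):

```python
def cumulative(interval_probs):
    """Cumulative survival: running product of interval survival probabilities."""
    out, c = [], 1.0
    for p in interval_probs:
        c *= p
        out.append(c)
    return out

def relative_survival(observed, expected):
    """Cumulative relative survival ratio at the end of each interval:
    observed cumulative survival / expected (population) cumulative survival."""
    return [o / e for o, e in zip(cumulative(observed), cumulative(expected))]
```

A ratio near 1.0 means patients die at roughly the population background rate; values below 1.0 reflect excess mortality attributable to the cancer.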
Dynamic patient counseling: a novel concept in idiopathic pulmonary fibrosis.
Brown, A Whitney; Shlobin, Oksana A; Weir, Nargues; Albano, Maria C; Ahmad, Shahzad; Smith, Mary; Leslie, Kevin; Nathan, Steven D
2012-10-01
The characteristics of long-term survivors with idiopathic pulmonary fibrosis (IPF) have never been fully elucidated. We sought to illustrate the attenuated mortality and describe the characteristics of patients with IPF who survived at least 5 years beyond their initial presentation. Patients with IPF evaluated between 1997 and 2006 were identified through the clinic database. Patients who survived beyond 5 years from the time of their evaluation were compared with those who died or underwent lung transplantation within 5 years. Survival analyses were performed from the time of initial evaluation and contingent on annualized survival thereafter. Eighty-seven patients who survived at least 5 years formed the comparator group to whom other patients were contrasted. These patients had a higher BMI, FVC % predicted, FEV1 % predicted, total lung capacity % predicted, and diffusing capacity of lung for carbon monoxide % predicted, but a lower FEV1/FVC ratio and lower mean pulmonary artery pressures. More than one-half of these patients had moderate or severe disease at the time of presentation. Our annualized contingent survival analyses revealed a progressively increasing median survival dependent on the duration of the disease. Although we were able to demonstrate differences in our 5-year survivors, rather than being a distinct group, these patients appear to exist within a continuum of improving survival dependent on prior disease duration. This progressively improving time-dependent prognosis mandates the serial reevaluation of an individual patient’s projected outcomes. The implementation of dynamic counseling is an important concept in more accurately predicting life expectancy for patients with IPF who are frequently haunted by the prospects of a dismal survival.
Clinical Evaluation of Dental Restorative Materials
1989-01-01
use of an Actuarial Life Table Survival Analysis procedure. The median survival time for anterior composites was 13.5 years, as compared to 12.1 years...dental materials. For the first time in clinical biomaterials research, we used a statistical approach of Survival Analysis which utilized the... analysis has been established to assure uniformity in usage. This scale is now in use by clinical investigators throughout the country. Its use at the
Wu, Xinhong; Luo, Bo; Wei, Shaozhong; Luo, Yan; Feng, Yaojun; Xu, Juan; Wei, Wei
2013-11-01
To investigate the treatment efficacy of whole brain irradiation combined with precise radiotherapy on triple-negative (TN) phenotype breast cancer patients with brain metastases and their survival times. A total of 112 metastatic breast cancer patients treated with whole brain irradiation and intensity modulated radiotherapy (IMRT) or 3D conformal radiotherapy (3DCRT) were analyzed. Thirty-seven patients were of TN phenotype. Objective response rates were compared. Survival times were estimated by using the Kaplan-Meier method. Log-rank test was used to compare the survival time difference between the TN and non-TN groups. Potential prognostic factors were determined by using a Cox proportional hazard regression model. The objective response rates to radiotherapy in the TN and non-TN groups were 96.2% and 97%, respectively. TN phenotype was associated with worse survival times than non-TN phenotype after radiotherapy (6.9 months vs. 17 months) (P < 0.01). On multivariate analysis, good prognosis was associated with non-TN status, lower graded prognosis assessment class, and nonexistence of active extracranial metastases. After whole brain irradiation followed by IMRT or 3DCRT treatment, TN phenotype breast cancer patients with intracranial metastasis had high objective response rates but shorter survival time. With respect to survival in breast cancer patients with intracranial metastasis, the TN phenotype represents a significant adverse prognostic factor.
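The log-rank test used to compare the TN and non-TN groups can be sketched for two samples. This simplified version treats every observation as an event (censoring handling is omitted for brevity; the study's analysis would use the full version):

```python
def logrank_chi2(times_a, times_b):
    """Two-sample log-rank chi-square statistic (1 df), all times assumed events."""
    event_times = sorted(set(times_a) | set(times_b))
    o_minus_e, var = 0.0, 0.0
    for t in event_times:
        na = sum(1 for x in times_a if x >= t)  # group A at risk just before t
        nb = sum(1 for x in times_b if x >= t)  # group B at risk just before t
        n = na + nb
        da = sum(1 for x in times_a if x == t)  # group A deaths at t
        d = da + sum(1 for x in times_b if x == t)  # total deaths at t
        o_minus_e += da - d * na / n            # observed minus expected in A
        if n > 1:                               # hypergeometric variance term
            var += d * (na / n) * (1 - na / n) * (n - d) / (n - 1)
    return o_minus_e ** 2 / var
```

Identical samples yield a statistic of 0; larger values indicate a bigger separation between the two survival curves, referred to a chi-square distribution with 1 degree of freedom.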
Boerner, T; Graichen, A; Jeiter, T; Zemann, F; Renner, P; März, L; Soeder, Y; Schlitt, H J; Piso, P; Dahlke, M H
2016-11-01
Peritoneal carcinomatosis (PC) is a dismal feature of gastric cancer that most often is treated by systemic palliative chemotherapy. In this retrospective matched-pairs analysis, we sought to establish whether specific patient subgroups alternatively should be offered a multimodal therapy concept, including cytoreductive surgery (CRS) and hyperthermic intraperitoneal chemotherapy (HIPEC). Clinical outcomes of 38 consecutive patients treated with gastrectomy, CRS and HIPEC for advanced gastric cancer with PC were compared to patients treated by palliative management (with and without gastrectomy) and to patients with advanced gastric cancer with no evidence of PC. Kaplan-Meier survival curves and multivariate Cox regression models were applied. Median survival time after gastrectomy was similar between patients receiving CRS-HIPEC and matched control patients operated for advanced gastric cancer without PC [18.1 months, confidence interval (CI) 10.1-26.0 vs. 21.8 months, CI 8.0-35.5 months], resulting in comparable 5-year survival (11.9 vs. 12.1 %). The median survival time after first diagnosis of PC for gastric cancer was 17.2 months (CI 10.1-24.2 months) in the CRS-HIPEC group compared with 11.0 months (CI 7.4-14.6 months) for those treated by gastrectomy and chemotherapy alone, resulting in a twofold increase of 2-year survival (35.8 vs. 16.9 %). We provide retrospective evidence that multimodal treatment with gastrectomy, CRS, and HIPEC is associated with improved survival for patients with PC of advanced gastric cancer compared with gastrectomy and palliative chemotherapy alone. We also show that patients treated with CRS-HIPEC have comparable survival to matched control patients without PC. However, regardless of treatment scheme, all patients subsequently recur and die of disease.
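The Kaplan-Meier curves underlying such comparisons come from a simple product-limit calculation; a minimal sketch (event indicator 1 = death, 0 = censored):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimates at each distinct event time.

    Returns a list of (time, survival) pairs; censored times reduce the
    risk set but contribute no factor to the product.
    """
    event_times = sorted(set(t for t, e in zip(times, events) if e))
    s, curve = 1.0, []
    for t in event_times:
        n = sum(1 for T in times if T >= t)                        # at risk at t
        d = sum(1 for T, e in zip(times, events) if e and T == t)  # deaths at t
        s *= 1.0 - d / n
        curve.append((t, s))
    return curve
```

The median survival times quoted in the abstract are read off such a curve as the first time the estimate drops to 0.5 or below.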
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zain, Zakiyah, E-mail: zac@uum.edu.my; Ahmad, Yuhaniz, E-mail: yuhaniz@uum.edu.my; Azwan, Zairul, E-mail: zairulazwan@gmail.com, E-mail: farhanaraduan@gmail.com, E-mail: drisagap@yahoo.com
Colorectal cancer is the third and the second most common cancer worldwide in men and women respectively, and the second in Malaysia for both genders. Surgery, chemotherapy and radiotherapy are among the options available for treatment of patients with colorectal cancer. In clinical trials, the main purpose is often to compare efficacy between experimental and control treatments. Treatment comparisons often involve several responses or endpoints, and this situation complicates the analysis. In the case of colorectal cancer, sets of responses concerned with survival times include: times from tumor removal until the first, the second and the third tumor recurrences, and time to death. For a patient, the time to recurrence is correlated to the overall survival. In this study, global score test methodology is used in combining the univariate score statistics for comparing treatments with respect to each survival endpoint into a single statistic. The data of tumor recurrence and overall survival of colorectal cancer patients are taken from a Malaysian hospital. The results are found to be similar to those computed using the established Wei, Lin and Weissfeld method. Key factors such as ethnic, gender, age and stage at diagnosis are also reported.
NASA Astrophysics Data System (ADS)
Zain, Zakiyah; Aziz, Nazrina; Ahmad, Yuhaniz; Azwan, Zairul; Raduan, Farhana; Sagap, Ismail
2014-12-01
Colorectal cancer is the third and the second most common cancer worldwide in men and women respectively, and the second in Malaysia for both genders. Surgery, chemotherapy and radiotherapy are among the options available for treatment of patients with colorectal cancer. In clinical trials, the main purpose is often to compare efficacy between experimental and control treatments. Treatment comparisons often involve several responses or endpoints, and this situation complicates the analysis. In the case of colorectal cancer, sets of responses concerned with survival times include: times from tumor removal until the first, the second and the third tumor recurrences, and time to death. For a patient, the time to recurrence is correlated to the overall survival. In this study, global score test methodology is used in combining the univariate score statistics for comparing treatments with respect to each survival endpoint into a single statistic. The data of tumor recurrence and overall survival of colorectal cancer patients are taken from a Malaysian hospital. The results are found to be similar to those computed using the established Wei, Lin and Weissfeld method. Key factors such as ethnic, gender, age and stage at diagnosis are also reported.
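A global score test of this kind combines the k univariate score statistics z into a single quadratic form Q = z' * inv(Sigma) * z, referred to a chi-square distribution with k degrees of freedom, where Sigma is the covariance matrix of the endpoint statistics. A minimal sketch with a hand-rolled linear solver (the covariance values in the test are illustrative, not from the paper):

```python
def solve_linear(A, b):
    """Solve A w = b by Gaussian elimination with partial pivoting (A is k x k, k small)."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]  # augmented matrix copy
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):
        w[r] = (M[r][n] - sum(M[r][c] * w[c] for c in range(r + 1, n))) / M[r][r]
    return w

def global_score(z, cov):
    """Global statistic Q = z' * inv(cov) * z for k correlated endpoint scores."""
    w = solve_linear(cov, z)  # w = inv(cov) @ z without explicit inversion
    return sum(zi * wi for zi, wi in zip(z, w))
```

With independent endpoints (identity covariance) Q reduces to the sum of squared z-scores; positive correlation between endpoints, as between recurrence and death here, shrinks Q relative to that naive sum.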
Timing of chemotherapy and survival in patients with resectable gastric adenocarcinoma
Arrington, Amanda K; Nelson, Rebecca; Patel, Supriya S; Luu, Carrie; Ko, Michelle; Garcia-Aguilar, Julio; Kim, Joseph
2013-01-01
AIM: To evaluate the timing of chemotherapy in gastric cancer by comparing survival outcomes in treatment groups. METHODS: Patients with surgically resected gastric adenocarcinoma from 1988 to 2006 were identified from the Los Angeles County Cancer Surveillance Program. To evaluate the population most likely to receive and/or benefit from adjunct chemotherapy, inclusion criteria consisted of Stage II or III gastric cancer patients > 18 years of age who underwent curative-intent surgical resection. Patients were categorized into three groups according to the receipt of chemotherapy: (1) no chemotherapy; (2) preoperative chemotherapy; or (3) postoperative chemotherapy. Clinical and pathologic characteristics were compared across the different treatment arms. RESULTS: Of 1518 patients with surgically resected gastric cancer, 327 (21.5%) received perioperative chemotherapy. The majority of these 327 patients were male (68%), with a mean age of 61.5 years, and were significantly younger than non-chemotherapy patients (mean age, 70.7; P < 0.001). Tumors were most frequently located in the distal stomach (34.5%). Preoperative chemotherapy was administered to 11.3% of patients (n = 37) and postoperative therapy to 88.7% of patients (n = 290). An overall survival benefit according to timing of chemotherapy was not observed on univariate or multivariate analysis. Similar results were observed with stage-specific survival analyses (5-year overall survival: Stage II, 25% vs 30%, respectively; Stage III, 14% vs 11%, respectively). Therefore, our results do not identify a survival advantage for specific timing of chemotherapy in locally advanced gastric cancer. CONCLUSION: This study supports the implementation of a randomized trial comparing the timing of perioperative therapy in patients with locally advanced gastric cancer. PMID:24392183
Beker, Mustafa Caglar; Caglayan, Berrak; Yalcin, Esra; Caglayan, Ahmet Burak; Turkseven, Seyma; Gurel, Busra; Kelestemur, Taha; Sertel, Elif; Sahin, Zafer; Kutlu, Selim; Kilic, Ulkan; Baykal, Ahmet Tarik; Kilic, Ertugrul
2018-03-01
Occurrence of stroke cases displays a time-of-day variation in humans. However, the mechanism linking circadian rhythm to the internal response mechanisms against pathophysiological events after ischemic stroke remains largely unknown. To this end, temporal changes in the susceptibility to ischemia/reperfusion (I/R) injury were investigated in mice in which ischemic stroke was induced at four different Zeitgeber time points with 6-h intervals (ZT0, ZT6, ZT12, and ZT18). Besides infarct volume and brain swelling, neuronal survival, apoptosis, ischemia, and circadian rhythm related proteins were examined using immunohistochemistry, Western blot, planar surface immunoassay, and liquid chromatography-mass spectrometry tools. Here, we present evidence that midnight (ZT18; 24:00) I/R injury in mice resulted in significantly reduced infarct volume, brain swelling, and neurological deficit score, improved neuronal survival, and decreased apoptotic cell death compared with ischemia induced at other time points, which were associated with increased expression of the circadian proteins Bmal1, Per1, and Clock and the survival kinases AKT and Erk-1/2. Moreover, ribosomal protein S6, mTOR, and Bad were also significantly increased, while the levels of PRAS40, a negative regulator of AKT and mTOR, and phosphorylated p53 were decreased at this time point compared to ZT0 (06:00). Furthermore, detailed proteomic analysis revealed significantly decreased CSKP, HBB-1/2, and HBA levels, and increased GNAZ, NEGR1, IMPCT, and PDE1B at midnight as compared with early morning. Our results indicate that nighttime I/R injury results in less severe neuronal damage, with increased neuronal survival, increased levels of survival kinases and circadian clock proteins, and altered circadian-related proteins.
Palermo, Valentina; Stafford Johnson, Michael J; Sala, Elisabetta; Brambilla, Paola G; Martin, Mike W S
2011-03-01
To retrospectively compare and contrast the clinical presentation, diagnostic findings and survival in Boxer dogs with cardiomyopathy, with or without left ventricular (LV) systolic failure. Medical records of Boxers referred between 1993 and 2008 in which a diagnosis of ventricular arrhythmias and/or cardiomyopathy was made were reviewed. Dogs were divided into two groups according to their LV systolic diameter: group A, normal (20 dogs), or group B, dilated (59 dogs). Dogs in group A had a better outcome than dogs in group B (median survival times of 124 and 17 weeks, respectively; p < 0.001). In group B, dogs with a history of collapse had a worse outcome (median survival time of 10 weeks) compared with dogs not showing collapse (median survival time 24 weeks) (p = 0.031). The majority of dogs in this UK study presented with the myocardial dysfunction form of the disease, with LV dilation and congestive heart failure signs. The prognosis was worse in dogs with LV dilation compared to dogs with a normal LV and ventricular arrhythmias. In the Boxers with LV dilation, dogs with collapse had a worse prognosis than those without. Copyright © 2011 Elsevier B.V. All rights reserved.
Benard, Vicki B; Watson, Meg; Saraiya, Mona; Harewood, Rhea; Townsend, Julie S; Stroup, Antoinette M; Weir, Hannah K; Allemani, Claudia
2017-12-15
Overall, cervical cancer survival in the United States has been reported to be among the highest in the world, despite slight decreases over the last decade. The objective of the current study was to describe cervical cancer survival trends among US women and examine differences by race and stage. This study used data from the CONCORD-2 study to compare survival among women (aged 15-99 years) diagnosed in 37 states covering 80% of the US population. Survival was adjusted for background mortality (net survival) with state- and race-specific life tables and was age-standardized with the International Cancer Survival Standard weights. Five-year survival was compared by race (all races, blacks, and whites). Two time periods, 2001-2003 and 2004-2009, were considered because of changes in how the staging variable was collected. From 2001 to 2009, 90,620 women were diagnosed with invasive cervical cancer. The proportion of cancers diagnosed at a regional or distant stage increased over time in most states. Overall, the 5-year survival was 63.5% in 2001-2003 and 62.8% in 2004-2009. Survival was lower for black women versus white women in both calendar periods and in most states; black women had a higher proportion of distant-stage cancers. The stability of the overall survival over time and the persistent differences in survival between white and black women in all US states suggest that there is a need for targeted interventions and improved access to screening, timely treatment, and follow-up care, especially among black women. Cancer 2017;123:5119-37. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.
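Age standardization with fixed weights, as described above, is just a weighted average of age-group-specific net survival. A minimal sketch with hypothetical weights and survival values (not the actual ICSS weights or CONCORD-2 estimates):

```python
# Age-standardized survival = sum over age groups of (standard weight
# x age-group-specific net survival). Values below are hypothetical.
weights  = [0.07, 0.12, 0.23, 0.29, 0.29]   # standard weights, sum to 1
survival = [0.82, 0.75, 0.68, 0.60, 0.45]   # 5-year net survival per age group

age_standardized = sum(w * s for w, s in zip(weights, survival))
print(round(age_standardized, 4))
```

Using a fixed set of weights makes estimates comparable across populations (e.g., states or races) whose actual age distributions differ.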
Xu, Zhiyuan; Marko, Nicholas F; Angelov, Lilyana; Barnett, Gene H; Chao, Samuel T; Vogelbaum, Michael A; Suh, John H; Weil, Robert J
2012-03-01
Breast cancer is the second most common source of brain metastasis. Stereotactic radiosurgery (SRS) can be an effective treatment for some patients with brain metastasis (BM). Necrosis is a common feature of many brain tumors, including BM; however, the influence of tumor necrosis on treatment efficacy of SRS in women with breast cancer metastatic to the brain is unknown. A cohort of 147 women with breast cancer and BM treated consecutively with SRS over 10 years was studied. Of these, 80 (54.4%) had necrosis identified on pretreatment magnetic resonance images and 67 (45.6%) did not. Survival times were computed using the Kaplan-Meier method. Log-rank tests were used to compare groups with respect to survival times, Cox proportional hazards regression models were used to perform univariate and multivariate analyses, and chi-square and Fisher exact tests were used to compare clinicopathologic covariates. Neurological survival (NS) and survival after SRS were decreased in BM patients with necrosis at the time of SRS compared with patients without necrosis by 32% and 27%, respectively (NS median survival, 25 vs 17 months [log-rank test, P = .006]; SRS median survival, 15 vs 11 months [log-rank test, P = .045]). On multivariate analysis, HER2 amplification status and necrosis influenced NS and SRS after adjusting for standard clinical features, including BM number, size, and volume as well as Karnofsky performance status. Neuroimaging evidence of necrosis at the time of SRS significantly diminished the efficacy of therapy and was a potent prognostic marker. Copyright © 2011 American Cancer Society.
Effects of hemoadsorption on cytokine removal and short-term survival in septic rats
Peng, Zhi-Yong; Carter, Melinda J.; Kellum, John A.
2012-01-01
Objective A broad-spectrum immune-regulating therapy could be beneficial in the treatment of sepsis. Our previous studies have shown that a hemoadsorption device (CytoSorb) removes both pro- and anti-inflammatory cytokines and improves survival in experimental endotoxemia. We sought to determine whether hemoadsorption can also be effective in the treatment of sepsis. Design Randomized controlled laboratory experiment. Setting University laboratory. Interventions Rats were subjected to cecal ligation and puncture (CLP) and 20 hrs later were randomized to receive either hemoadsorption or sham treatment using an arterial-venous circuit. Hemoadsorption was accomplished using a cartridge containing Cytosorb beads. Blood was drawn for cytokine measurements and mean arterial pressure (MAP) was continuously monitored. Cytokines were measured via multiplex bead immunoassays. Survival time was observed for 9 hours after the intervention and assessed by Kaplan–Meier statistics. The overall survival in each group was compared using Fisher’s exact test. Finally, we used a Cox proportional-hazards model to examine the effects of cytokine removal on survival time. Measurements and Main Results Baseline plasma cytokine concentrations and MAP were similar between hemoadsorption and sham-treated groups. However, the concentrations of tumor necrosis factor, interleukin (IL)-1β, IL-6, and IL-10 were significantly lower after hemoadsorption compared to the sham group. Six hours after treatment ended, IL-6 and IL-10 concentrations were still lower in hemoadsorption group. MAP was significantly better in hemoadsorption compared to sham-treated animals (p < .05). Finally, mean survival time was significantly longer (720 vs. 381 min, p < .05, Mann–Whitney test), and overall survival was significantly better (11/17 vs. 2/16, p < .01) with hemoadsorption compared to sham. 
Combined reduction in both IL-6 and IL-10 was associated with a significantly decreased risk of death (hazard ratio, .11, p = .005). Conclusion Hemoadsorption reduced circulating cytokines, improved MAP, and resulted in better short-term survival in CLP-induced septic rats. PMID:18434884
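The Kaplan-Meier statistics used above can be illustrated with a minimal product-limit estimator; the survival times and censoring flags below are illustrative, not the experimental data:

```python
def kaplan_meier(times, events):
    """Return [(t, S(t))] at each event time.
    events[i] = 1 means death observed at times[i], 0 means censored."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = at_t = 0
        while i < len(data) and data[i][0] == t:   # group tied times
            at_t += 1
            deaths += data[i][1]
            i += 1
        if deaths:
            surv *= 1.0 - deaths / n_at_risk       # product-limit step
            curve.append((t, surv))
        n_at_risk -= at_t                          # drop deaths and censored
    return curve

# Hypothetical survival minutes for five animals, one censored (flag 0):
print(kaplan_meier([300, 380, 380, 500, 720], [1, 1, 0, 1, 1]))
```

Censored subjects leave the risk set without triggering a drop in the curve, which is what distinguishes this estimator from a naive fraction-surviving calculation.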
Smith, Shaun; Meade, Joseph; Gibbons, James; McGill, Kevina; Bolton, Declan; Whyte, Paul
2016-01-01
Campylobacter jejuni is the leading bacterial food-borne pathogen within the European Union, and poultry meat is an important vehicle for its transmission to humans. However, there is limited knowledge about how this organism persists in broiler litter and faeces. The aim of this study was to assess the impact of a number of environmental parameters, such as temperature, humidity, and oxygen, on Campylobacter survival in both broiler litter and faeces. Used litter was collected from a Campylobacter-negative broiler house after final depopulation and fresh faeces were collected from transport crates. Samples were confirmed as Campylobacter negative according to modified ISO methods for veterinary samples. Both sample matrices were inoculated with 9 log10 CFU/ml C. jejuni and incubated under high (≥85%) and low (≤70%) relative humidity conditions at three different temperatures (20°C, 25°C, and 30°C) under both aerobic and microaerophilic atmospheres. Inoculated litter samples were then tested for Campylobacter concentrations at time zero and every 2 hours for 12 hours, while faecal samples were examined at time zero and every 24 hours for 120 hours. A two-tailed t-test assuming unequal variance was used to compare mean Campylobacter concentrations in samples under the various temperature, humidity, and atmospheric conditions. C. jejuni survived significantly longer (P≤0.01) in faeces, with a minimum survival time of 48 hours, compared with 4 hours in used broiler litter. C. jejuni survival was significantly enhanced at 20°C in all environmental conditions in both sample matrices tested compared with survival at 25°C and 30°C. In general, survival was greater in microaerophilic compared with aerobic conditions in both sample matrices. Humidity, at the levels examined, did not appear to significantly impact C. jejuni survival in any sample matrix. 
The persistence of Campylobacter in broiler litter and faeces under various environmental conditions has implications for farm litter management, hygiene, and disinfection practices.
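The two-tailed t-test assuming unequal variance used in the study above is Welch's t-test; a minimal sketch of the statistic and its approximate degrees of freedom, with hypothetical log10 CFU values rather than study data:

```python
from statistics import mean, variance

def welch_t(a, b):
    """Return (t, approximate degrees of freedom) for Welch's
    unequal-variance t-test comparing the means of samples a and b."""
    na, nb = len(a), len(b)
    va, vb = variance(a), variance(b)       # sample variances
    se2 = va / na + vb / nb                 # squared standard error of the difference
    t = (mean(a) - mean(b)) / se2 ** 0.5
    # Welch-Satterthwaite approximation for degrees of freedom:
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

faeces = [7.9, 8.1, 7.7, 8.0, 7.8]   # hypothetical log10 CFU/g counts
litter = [5.2, 4.8, 5.5, 5.0, 4.6]
t, df = welch_t(faeces, litter)
print(round(t, 2), round(df, 1))
```

The p-value would then come from the t-distribution with `df` degrees of freedom (e.g., via `scipy.stats`); only the statistic itself is sketched here to stay dependency-free.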
Takei, Yutaka; Kamikura, Takahisa; Nishi, Taiki; Maeda, Tetsuo; Sakagami, Satoru; Kubo, Minoru; Inaba, Hideo
2016-08-01
To compare the factors associated with survival after out-of-hospital cardiac arrests (OHCAs) among three time-distance areas (defined by the interquartile range of time for emergency medical services response to the patient's side). From nationwide, prospectively collected data on 716,608 OHCAs between 2007 and 2012, this study analyzed 193,914 bystander-witnessed OHCAs without pre-hospital physician involvement. Overall neurologically favourable 1-month survival rates were 7.4%, 4.1% and 1.7% for close, intermediate and remote areas, respectively. We classified bystander cardiopulmonary resuscitation (BCPR) by type (compression-only vs. conventional) and by dispatcher-assisted CPR (DA-CPR) (with vs. without); the effects on survival in each time-distance area were analyzed by BCPR classification. The association of each BCPR classification with survival was affected by time-distance area and arrest aetiology (p<0.05). The survival rates in the remote area were much higher with conventional BCPR than with compression-only BCPR (odds ratio, 1.26; 95% confidence interval, 1.05-1.51) and with BCPR without DA-CPR than with BCPR with DA-CPR (odds ratio, 1.54; 95% CI, 1.29-1.82). Accordingly, we classified BCPR into five groups (no BCPR, compression-only with DA-CPR, conventional with DA-CPR, compression-only without DA-CPR, and conventional without DA-CPR) and analyzed for associations with survival, both cardiac and non-cardiac related, in each time-distance area by multivariate logistic regression analysis. In the remote area, conventional BCPR without DA-CPR significantly improved survival after OHCAs of cardiac aetiology, compared with all the other BCPR groups. Other correctable factors associated with survival were short collapse-to-call and call-to-first CPR intervals. Every effort to recruit trained citizens initiating conventional BCPR should be made in remote time-distance areas. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
King, J N; Font, A; Rousselot, J-F; Ash, R A; Bonfanti, U; Brovida, C; Crowe, I D; Lanore, D; Pechereau, D; Seewald, W; Strehlau, G
2017-07-01
Chronic kidney disease (CKD) is an important cause of morbidity and mortality in dogs. To evaluate the efficacy in prolonging survival and safety of benazepril administration to dogs with CKD. Forty-nine client-owned dogs with CKD. Dogs were randomized to benazepril (0.25 to <0.5 mg/kg) or placebo once daily for up to 2 years in a prospective, multicenter, blinded clinical trial. The primary endpoint variable was the renal survival time, defined as the time from inclusion in the study to the treatment failure endpoint of death or euthanasia or need for administration of parenteral fluids related to renal failure. No benefit of benazepril versus placebo was detected for renal survival time in all dogs; median (95% confidence interval (CI)) survival times were 305 (53-575) days in the benazepril group and 287 (152-not available) in the placebo group (P = .53). Renal survival times were not significantly longer with benazepril compared to placebo for subgroups: hazard ratios (95% CI) were 0.50 (0.21-1.22) with P = .12 for initial urine protein-to-creatinine ratio (UPC) >0.5, and 0.38 (0.12-1.19) with P = .080 for initial UPC >0.5 plus plasma creatinine ≤440 μmol/L. Proteinuria, assessed from the UPC, was significantly (P = .0032) lower after treatment with benazepril compared to placebo. There were no significant differences between groups for clinical signs or frequencies of adverse events. Benazepril significantly reduced proteinuria in dogs with CKD. Insufficient numbers of dogs were recruited to allow conclusions on survival time. Copyright © 2017 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.
Survival and Neurodevelopmental Outcomes among Periviable Infants.
Younge, Noelle; Goldstein, Ricki F; Bann, Carla M; Hintz, Susan R; Patel, Ravi M; Smith, P Brian; Bell, Edward F; Rysavy, Matthew A; Duncan, Andrea F; Vohr, Betty R; Das, Abhik; Goldberg, Ronald N; Higgins, Rosemary D; Cotten, C Michael
2017-02-16
Data reported during the past 5 years indicate that rates of survival have increased among infants born at the borderline of viability, but less is known about how increased rates of survival among these infants relate to early childhood neurodevelopmental outcomes. We compared survival and neurodevelopmental outcomes among infants born at 22 to 24 weeks of gestation, as assessed at 18 to 22 months of corrected age, across three consecutive birth-year epochs (2000-2003 [epoch 1], 2004-2007 [epoch 2], and 2008-2011 [epoch 3]). The infants were born at 11 centers that participated in the National Institute of Child Health and Human Development Neonatal Research Network. The primary outcome measure was a three-level outcome - survival without neurodevelopmental impairment, survival with neurodevelopmental impairment, or death. After accounting for differences in infant characteristics, including birth center, we used multinomial generalized logit models to compare the relative risk of survival without neurodevelopmental impairment, survival with neurodevelopmental impairment, and death. Data on the primary outcome were available for 4274 of 4458 infants (96%) born at the 11 centers. The percentage of infants who survived increased from 30% (424 of 1391 infants) in epoch 1 to 36% (487 of 1348 infants) in epoch 3 (P<0.001). The percentage of infants who survived without neurodevelopmental impairment increased from 16% (217 of 1391) in epoch 1 to 20% (276 of 1348) in epoch 3 (P=0.001), whereas the percentage of infants who survived with neurodevelopmental impairment did not change significantly (15% [207 of 1391] in epoch 1 and 16% [211 of 1348] in epoch 3, P=0.29). 
After adjustment for changes in the baseline characteristics of the infants over time, both the rate of survival with neurodevelopmental impairment (as compared with death) and the rate of survival without neurodevelopmental impairment (as compared with death) increased over time (adjusted relative risks, 1.27 [95% confidence interval {CI}, 1.01 to 1.59] and 1.59 [95% CI, 1.28 to 1.99], respectively). The rate of survival without neurodevelopmental impairment increased between 2000 and 2011 in this large cohort of periviable infants. (Funded by the National Institutes of Health and others; ClinicalTrials.gov numbers, NCT00063063 and NCT00009633 .).
Survival of asbestos insulation workers with mesothelioma.
Ribak, J; Selikoff, I J
1992-01-01
Malignant mesothelioma is a lethal disease. It is rare in the general population; however, workers exposed to asbestos suffer significant burdens of the neoplasm. The survival time of 457 consecutive fatal cases of pleural and peritoneal mesothelioma that occurred among 17,800 asbestos insulation workers observed prospectively from 1 January 1967 to 1 January 1987 was studied. Mean survival time from initial presentation of the disease to death was 11.4 months for the pleural mesothelioma patients compared with 7.4 months for the peritoneal group. This difference was statistically significant. Mean survival time from diagnosis to death was shorter for both groups of patients: 8.4 months for pleural mesothelioma vs 5.8 months for the peritoneal cases. In conclusion, survival time in mesothelioma patients is short; most die within a year from the onset of the initial symptoms. No effective therapy is yet available. PMID:1419863
Probabilistic Survivability Versus Time Modeling
NASA Technical Reports Server (NTRS)
Joyner, James J., Sr.
2016-01-01
This presentation documents Kennedy Space Center's Independent Assessment work completed on three assessments for the Ground Systems Development and Operations (GSDO) Program to assist the Chief Safety and Mission Assurance Officer during key programmatic reviews and provided the GSDO Program with analyses of how egress time affects the likelihood of astronaut and ground worker survival during an emergency. For each assessment, a team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine if a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building.
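A probabilistic survivability-versus-time model of the kind described can be sketched under a simple constant-hazard assumption: the probability of surviving until reaching a safe location after t seconds is S(t) = exp(-h·t). The hazard rate, egress times, and system names below are hypothetical illustrations, not values from the GSDO assessments:

```python
import math

def survivability(egress_time_s, hazard_per_s):
    """Probability of surviving an emergency for the duration of egress,
    assuming a constant hazard rate (exponential survival model)."""
    return math.exp(-hazard_per_s * egress_time_s)

# Compare two hypothetical egress options by time to a safe location:
for name, t in [("system A (90 s egress)", 90.0), ("system B (150 s egress)", 150.0)]:
    print(name, round(survivability(t, hazard_per_s=0.002), 3))
```

In the assessments described above the hazard is not constant but drawn from scenario-specific probability distributions; the sketch only shows why shorter egress time translates directly into higher survivability.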
Arshad, Hafiz Muhammad Sharjeel; Kabir, Christopher; Tetangco, Eula; Shah, Natahsa; Raddawi, Hareth
2017-09-01
Recently published data indicate increasing incidence of colorectal adenocarcinoma (CRC) in young-onset (<50 years) patients. This study examines racial disparities in presentation and survival times among non-Hispanic Blacks (NHB) and Hispanics compared with non-Hispanic Whites (NHW). A retrospective single-center cohort study was conducted from 2004 through 2014 using 96 patient medical charts with a diagnosis of young-onset CRC. Age, gender, primary site, and histological stage at the time of diagnosis were assessed for survival probabilities by racial group over a minimum follow-up period of 5 years. Among subjects with a CRC diagnosis before 50 years of age, the majority were between 40 and 50 years, with CRC presentation occurring in this age group for 51 (79.7%) of NHW, 18 (81.8%) of NHB, and 5 (50.0%) of Hispanics. The majority of all patients presented with advanced stages of CRC (31.3% with stage III and 27.1% with stage IV). NHB exhibited statistically significantly worse survival compared to NHW (adjusted hazard ratio for death = 2.09; 95% confidence interval 1.14-3.84; P = 0.02). A possible trend of worse survival was identified for Hispanics compared to NHW, but the number of Hispanic patients was small and the results were not statistically significant. The disparities in overall survival identified between racial groups among young-onset CRC cases reflect growing concern over rising incidence and differentiated care management.
Survival times for canine intranasal sarcomas treated with radiation therapy: 86 cases (1996-2011).
Sones, Evan; Smith, Annette; Schleis, Stephanie; Brawner, William; Almond, Gregory; Taylor, Kathryn; Haney, Siobhan; Wypij, Jackie; Keyerleber, Michele; Arthur, Jennifer; Hamilton, Terrance; Lawrence, Jessica; Gieger, Tracy; Sellon, Rance; Wright, Zack
2013-01-01
Sarcomas comprise approximately one-third of canine intranasal tumors; however, few veterinary studies have described survival times of dogs with histologic subtypes of sarcomas separately from other intranasal tumors. One objective of this study was to describe median survival times for dogs treated with radiation therapy for intranasal sarcomas. A second objective was to compare survival times for dogs treated with three radiation therapy protocols: daily-fractionated radiation therapy; Monday, Wednesday, and Friday fractionated radiation therapy; and palliative radiation therapy. Medical records were retrospectively reviewed for dogs that had been treated with radiation therapy for confirmed intranasal sarcoma. A total of 86 dogs met inclusion criteria. Overall median survival time for included dogs was 444 days. Median survival time for dogs with chondrosarcoma (n = 42) was 463 days, fibrosarcoma (n = 12) 379 days, osteosarcoma (n = 6) 624 days, and undifferentiated sarcoma (n = 22) 344 days. Dogs treated with daily-fractionated radiation therapy protocols; Monday, Wednesday and Friday fractionated radiation therapy protocols; and palliative radiation therapy protocols had median survival times of 641, 347, and 305 days, respectively. A significant difference in survival time was found for dogs receiving curative intent radiation therapy vs. palliative radiation therapy (P = 0.032). A significant difference in survival time was also found for dogs receiving daily-fractionated radiation therapy vs. Monday, Wednesday and Friday fractionated radiation therapy (P = 0.0134). Findings from this study support the use of curative intent radiation therapy for dogs with intranasal sarcoma. Future prospective, randomized trials are needed for confirmation of treatment benefits. © 2012 Veterinary Radiology & Ultrasound.
Ou, Judy Y; Spraker-Perlman, Holly; Dietz, Andrew C; Smits-Seemann, Rochelle R; Kaul, Sapna; Kirchhoff, Anne C
2017-10-01
Survival estimates for soft tissue sarcomas (STS) and malignant bone tumors (BT) diagnosed in pediatric, adolescent, and young adult patients are not easily available. We present survival estimates based on a patient having survived a defined period of time (conditional survival). Conditional survival estimates for the short-term were calculated for patients from diagnosis to the first five years after diagnosis and for patients surviving in the long-term (up to 20 years after diagnosis). We identified 703 patients who were diagnosed with a STS or BT at age ≤25 years from January 1, 1986 to December 31, 2012 at a large pediatric oncology center in Salt Lake City, Utah, United States. We obtained cancer type, age at diagnosis, primary site, and demographic data from medical records, and vital status through the National Death Index. Cancer stage was available for a subset of the cohort through the Utah Cancer Registry. Cox proportional hazards models, adjusted for age and sex, calculated survival estimates for all analyses. Short-term survival improves over time for both sarcomas. Short-term survival for STS from diagnosis (Year 0) did not differ by sex, but short-term survival starting from 1-year post diagnosis was significantly worse for male patients (Survival probability 1-year post-diagnosis [SP1]:77% [95% CI:71-83]) than female patients (SP1:86% [81-92]). Survival for patients who were diagnosed at age ≤10 years (Survival probability at diagnosis [SP0]:85% [79-91]) compared to diagnosis at ages 16-25 years (SP0:67% [59-75]) was significantly better at all time-points from diagnosis to 5-years post-diagnosis. Survival for axial sites (SP0:69% [63-75]) compared to extremities (SP0:84% [79-90]) was significantly worse from diagnosis to 1-year post-diagnosis. Survival for axial BT (SP0:64% [54-74]) was significantly worse than BT in the extremities (SP0:73% [68-79]) from diagnosis to 3-years post diagnosis. 
Relapsed patients of both sarcoma types had significantly worse short-term survival than non-relapsed patients. Long-term survival for STS in this cohort is 65% at diagnosis, and improves to 86% 5-years post-diagnosis. BT survival improves from 51% at diagnosis to 78% at 5-years post-diagnosis. Conditional survival for short- and long-term STS and BT improve as time from diagnosis increases. Short-term survival was significantly affected by patients' sex, age at diagnosis, cancer site, and relapse status. Copyright © 2017 Elsevier Ltd. All rights reserved.
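Conditional survival as used above follows directly from the survival curve: the probability of surviving an additional t years given survival to year s is S(s+t)/S(s). A minimal sketch with hypothetical survival-curve values (not the study's estimates):

```python
def conditional_survival(surv, s, t):
    """Probability of surviving t more years given survival to year s.
    `surv` maps years-from-diagnosis -> survival probability S(year)."""
    return surv[s + t] / surv[s]

surv = {0: 1.00, 1: 0.80, 2: 0.72, 5: 0.65}   # hypothetical S(t)

# Probability of reaching 5 years given survival to 2 years:
print(round(conditional_survival(surv, 2, 3), 3))
```

Because deaths are concentrated early, the conditional estimate (here 0.65/0.72) exceeds the from-diagnosis 5-year estimate of 0.65, which is exactly the pattern the study reports.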
Sherrill, B; Wang, J; Kotapati, S; Chin, K
2013-07-09
Study CA184024 was a multinational, randomised, double-blind, phase 3 study comparing ipilimumab/dacarbazine (DTIC) vs placebo/DTIC in patients with untreated stage III/IV melanoma, which showed that ipilimumab significantly improves survival in patients with metastatic melanoma. The objective of this analysis was to compare the quality-adjusted survival experience among patients in this trial. Survival time was partitioned into health states: toxicity, time before progression without toxicity, and relapse until death or end of follow-up. Q-TWiST (quality-adjusted time without symptoms of disease or toxicity of treatment) was calculated as the utility-weighted sum of the mean health state durations. Analyses were repeated over extended follow-up periods. Based on a combination of trial-based and external utility scores, the Q-TWiST difference in this trial was 0.50 months (P=0.0326) favoring ipilimumab after 1 year. The Q-TWiST difference was 1.5 months with 2 years of follow-up (P=0.0091), 2.36 months at 3 years (P=0.005) and 3.28 months at 4 years (P=0.0074). During the first year of the study, there was little difference between groups in quality-adjusted survival. However, after 2, 3 and 4 years of follow-up for patients with extended survival, the benefits of ipilimumab/DTIC vs placebo/DTIC for advanced melanoma continued to accrue.
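The Q-TWiST calculation described above is a utility-weighted sum of mean health-state durations, with the TWiST state weighted 1 and the toxicity and relapse states down-weighted. The durations and utility weights below are hypothetical, not trial values:

```python
def q_twist(tox, twist, rel, u_tox, u_rel):
    """Quality-adjusted time without symptoms or toxicity (months):
    utility-weighted sum of mean months spent in each health state."""
    return u_tox * tox + 1.0 * twist + u_rel * rel

# Hypothetical mean months per health state for two treatment arms,
# with both toxicity and relapse assigned a utility of 0.5:
ipi_dtic = q_twist(tox=1.5, twist=6.0, rel=4.0, u_tox=0.5, u_rel=0.5)
dtic     = q_twist(tox=0.5, twist=5.5, rel=3.5, u_tox=0.5, u_rel=0.5)
print(round(ipi_dtic - dtic, 2))   # Q-TWiST difference in months
```

Note how extra toxicity time counts against an arm only at its utility weight, so a treatment can gain Q-TWiST overall despite causing more toxicity, as long as it buys enough progression-free time.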
Diethelm, A G; Blackstone, E H; Naftel, D C; Hudson, S L; Barber, W H; Deierhoi, M H; Barger, B O; Curtis, J J; Luke, R G
1988-01-01
Multiple risk factors contribute to the allograft survival of patients who have cadaveric renal transplantation. A retrospective review of 19 such factors in 426 patients identified race, DR match, B + DR match, number of transplants, and preservation time to have a significant influence. The parametric analysis confirmed the effect to be primarily in the early phase, i.e., first 6 months. All patients received cyclosporine with other methods of immunosuppression resulting in an overall 1-year graft survival rate of 66%. The overall 1-year graft survival rate in the white race was 73% and in the black race was 57% (p = 0.002). Allograft survival and DR match showed white recipients with a 1 DR match to have 75% survival at 1 year compared with 57% in the black patient (p = 0.009). If HLA B + DR match was considered, the white recipient allograft survival increased to 76%, 84%, and 88% for 1, 2, and 3 match kidneys by parametric analysis. Patients receiving first grafts had better graft survival (68%) than those undergoing retransplantation (58%) (p = 0.05). Organ preservation less than 12 hours influenced allograft survival with a 78% 1-year survival rate compared with 63% for kidneys with 12-18 hours of preservation. Despite the benefits of B + DR typing, short preservation time, and first transplants to the white recipient, the allograft survival in the black recipient remained uninfluenced by these parameters. PMID:3288138
Panasiti, V; Curzio, M; Roberti, V; Lieto, P; Devirgiliis, V; Gobbi, S; Naspi, A; Coppola, R; Lopez, T; di Meo, N; Gatti, A; Trevisan, G; Londei, P; Calvieri, S
2013-01-01
The last melanoma staging system of the 2009 American Joint Committee on Cancer takes into account, for stage IV disease, the serum levels of lactate dehydrogenase (LDH) and the site of distant metastases. Our aim was to compare the significance of metastatic volume, as evaluated at the time of stage IV melanoma diagnosis, with other clinical predictors of prognosis. We conducted a retrospective multicentric study. To establish which variables were statistically correlated both with death and survival time, contingency tables were evaluated. The overall survival curves were compared using the Kaplan-Meier method. Metastatic volume and number of affected organs were statistically related to death. In detail, patients with a metastatic volume >15 cm³ had a worse prognosis than those with a volume lower than this value (survival probability at 60 months: 6.8 vs. 40.9%, respectively). The Kaplan-Meier method confirmed that survival time was significantly related to the site(s) of metastases, to elevated LDH serum levels and to melanoma stage according to the latest system. Our results suggest that metastatic volume may be considered as a useful prognostic factor for survival among melanoma patients.
Clemente, M; De Andrés, P J; Peña, L; Pérez-Alenza, M D
2009-07-18
Seven of 30 female dogs diagnosed with inflammatory mammary cancer were given chemotherapy and palliative treatment, and the other 23 received only palliative treatment. The median survival time of the seven dogs given chemotherapy was 57 days, compared with 35 days for the 23 given only palliative treatment.
Otgaar, Henry; Smeets, Tom; van Bergen, Saskia
2010-01-01
Recent studies have shown that processing words according to a survival scenario leads to superior retention relative to control conditions. Here, we examined whether a survival recall advantage could be elicited by using pictures. Furthermore, in Experiment 1, we were interested in whether survival processing also results in improved memory for details. Undergraduates rated the relevance of pictures in a survival, moving, or pleasantness scenario and were subsequently given a surprise free recall test. We found that survival processing yielded superior retention. We also found that distortions occurred more often in the survival condition than in the pleasantness condition. In Experiment 2, we directly compared the survival recall effect between pictures and words. A comparable survival recall advantage was found for pictures and words. The present findings support the idea that memory is enhanced by processing information in terms of fitness value, yet at the same time, the present results suggest that this may increase the risk for memory distortions.
HISTOCOMPATIBILITY STUDIES IN A CLOSELY BRED COLONY OF DOGS
Rapaport, F. T.; Hanaoka, T.; Shimada, T.; Cannon, F. D.; Ferrebee, J. W.
1970-01-01
The establishment of a closely bred colony of beagles with known leukocyte group phenotypes has permitted an assessment of the role of leukocyte group antigens in conditioning the survival of renal allografts in the unmodified host. 22 kidney transplants obtained from leukocyte group-compatible donors were accorded a mean survival time of 25.5 days, as compared with 13.1 days for 27 transplants obtained from incompatible donors. Donor-recipient coefficients of correlation and Swisher erythrocyte group incompatibilities did not appear to affect the observed results. The mean survival time of 21 renal allografts performed in randomly selected mongrel dogs was 9.5 days. Availability of a carefully characterized and phenotyped canine population may be useful in further studies of the comparative immunogenicity of the major transplantable organs, and of methods designed to facilitate prolonged organ transplant survival in the mammalian host. PMID:4910142
Shah, Keyur B; Thanavaro, Kristin L; Tang, Daniel G; Quader, Mohammed A; Mankad, Anit K; Tchoukina, Inna; Thacker, Leroy R; Smallfield, Melissa C; Katlaps, Gundars; Hess, Michael L; Cooke, Richard H; Kasirajan, Vigneshwar
2016-11-01
Insufficient data delineate outcomes for Interagency Registry for Mechanically Assisted Circulatory Support (INTERMACS) profile 1 patients with the total artificial heart (TAH). We studied 66 consecutive patients implanted with the TAH at our institution from 2006 through 2012 and compared outcome by INTERMACS profile. INTERMACS profiles were adjudicated retrospectively by a reviewer blinded to clinical outcomes. Survival after TAH implantation at 6 and 12 months was 76% and 71%, respectively. INTERMACS profile 1 patients had decreased 6-month survival on the device compared with those in profiles 2-4 (74% vs 95%, log rank: P = .015). For the 50 patients surviving to heart transplantation, the 1-year posttransplant survival was 82%. There was no difference in 1-year survival when comparing patients in the INTERMACS 1 profile with less severe profiles (79% vs 84%; log rank test P = .7; hazard ratio [confidence interval] 1.3 [0.3-4.8]). Patients implanted with the TAH as INTERMACS profile 1 had reduced survival to transplantation compared with less sick profiles. INTERMACS profile at the time of TAH implantation did not affect 1-year survival after heart transplantation. Copyright © 2016 Elsevier Inc. All rights reserved.
Hestbeck, J.B.; Nichols, J.D.; Hines, J.E.
1992-01-01
Predictions of the time-allocation hypothesis were tested with several a posteriori analyses of banding data for the mallard (Anas platyrhynchos). The time-allocation hypothesis states that the critical difference between resident and migrant birds is their allocation of time to reproduction on the breeding grounds and survival on the nonbreeding grounds. Residents have higher reproduction and migrants have higher survival. Survival and recovery rates were estimated by standard band-recovery methods for banding reference areas in the central United States and central Canada. A production-rate index was computed for each reference area with data from the U.S. Fish and Wildlife Service May Breeding Population Survey and July Production Survey. An analysis of covariance was used to test for the effects of migration distance and time period (decade) on survival, recovery, and production rates. Differences in migration chronology were tested by comparing direct-recovery distributions for different populations during the fall migration. Differences in winter locations were tested by comparing distributions of direct recoveries reported during December and January. A strong positive relationship was found between survival rate and migration distance for 3 of the 4 age and sex classes. A weak negative relationship was found between recovery rate and migration distance. No relationship was found between production rate and migration distance. During the fall migration, birds from the northern breeding populations were located north of birds from the southern breeding populations. No pattern could be found in the relative locations of breeding and wintering areas. Although our finding that survival rate increased with migration distance was consistent with the time-allocation hypothesis, our results on migration chronology and location of wintering areas were not consistent with the mechanism underlying the time-allocation hypothesis.
Neither this analysis nor other recent studies of life-history characteristics of migratory and resident birds supported the time-allocation hypothesis.
Smits, Jacqueline M A; Vanhaecke, Johan; Haverich, Axel; de Vries, Erwin; Smith, Mike; Rutgrink, Ellis; Ramsoebhag, Annemarie; Hop, Alinde; Persijn, Guido; Laufer, Gunther
2003-01-01
The definition of proper patient selection criteria remains a prominent item in constant need of attention. While the process of gathering evidence to guide practice remains imperfect, it cannot be emphasized too much that these univariate results are only a first foray into analysing predictors of survival; all of the following results should be interpreted in that light. HEART TRANSPLANT SURVIVAL: The 3-year survival rate for heart transplant recipients under age 16 was 83% versus 72% for adult recipients. Acutely retransplanted adult heart recipients had a 3-year survival rate of 36% compared with 72% for recipients of a first heart allograft. Patients suffering from DCM had the best survival rates at 3 years (74%) compared with patients suffering from CAD (70%) or from another end-stage heart disease (67%). With advancing age of the adult recipient, the mortality risk increased. Patients aged 16-40 had a 3-year survival rate of 77%, compared with 74%, 70% and 61% for transplant recipients aged 41-55, 56-65 and over age 65, respectively. The 3-year survival rates for adult recipients transplanted with a heart allograft from a donor aged under 16 or between 16-44 were 78% and 74%, compared with 66% and 63% for donors aged 45-55 and over 55, respectively. The 3-year survival rates for recipients of hearts with cold ischemic times under 2 hours, 2-3, 3-4, 4-5, 5-6 and more than 6 hours were 74%, 75%, 70%, 65%, 54% and 40%, respectively. Transplanting a female donor heart into a male recipient was associated with the worst prognosis: the 3-year survival rates were 73%, 71%, 66% and 76% for the donor/recipient groups male/male, male/female, female/male and female/female, respectively. When the donor-to-recipient body weight ratio was below 0.8, the 3-year survival rate was 64%, compared to 72% for weight-matched pairs and 74% for patients who received a heart from an oversized donor (p=0.004).
Better survival rates were obtained for better HLA-matched transplants. The 3-year survival rates were 75%, 89%, 78%, 78%, 69%, 72%, and 71% for the HLA-A,-B,-DR zero, 1, 2, 3, 4, 5 and 6 mismatched groups, respectively (p=0.04). Survival was significantly associated with the CMV serologic status of the donor and recipient; the 3-year survival rates were: D+/R+, 71%; D+/R-, 69%; D-/R-, 76%; and D-/R+, 76% (p=0.04). Patients in an ICU had a 3-year survival rate of 62%, compared to 72% for patients in a general ward and 74% for outpatients (p<0.0001). Patients who were on a VAD and then transplanted had a 3-year survival rate of 65%, compared to 73% for patients without a VAD (p=0.004). Being on a ventilator was a major risk factor for death after transplantation; patients on ventilator support at the time of the transplant had a 3-year survival rate of 52% compared to 73% for the other patients (p<0.0001). LUNG TRANSPLANT SURVIVAL: The 3-year survival rate for children (73%) appeared to be better than the adult rate (61%; p=0.8). Adult lung transplant survival was significantly worse in the case of a repeat lung transplant; a 3-year retransplant survival rate of 42% was obtained compared with 61% for first transplants (p=0.049). With respect to the underlying end-stage lung disease, no statistically significant difference in long-term survival could be detected in this cohort. The 3-year survival rates were: 62% for COPD/Emphysema, 70% for CF, 58% for IPF, 64% for Alpha-1 ATD and 56% for PPH (p=0.2). Our data demonstrated no effect of the recipient's age on long-term lung transplant survival, except for 2 senior patients in this cohort. At 3 years the survival rates for recipients aged 16-40, 41-55 and 56-65 were 65%, 60% and 62%, respectively (p=0.05).
The 3-year survival rates for transplants performed with lungs from donors aged under 16, 16-44, 45-55 and over 55 were 57%, 64%, 55% and 62%, respectively (p=0.1). No association between the duration of cold ischemic time and 3-year survival was observed; under 3 hours, 3-4, 4-5, 5-6 and over 6 hours of ischemia resulted in 3-year survival rates of 53%, 59%, 64%, 68% and 57%, respectively (p=0.2). Early posttransplant outcome tended to be better for gender-matched transplants, while transplanting a female donor lung into a male recipient was associated with the worst prognosis. The 3-year survival rates were 65% for male/male, 63% for male/female, 48% for female/male and 61% for female/female (p=0.009). No effect of donor-to-recipient weight match was observed in this Eurotransplant cohort; when the donor-to-recipient weight ratio was below 0.8, the 3-year survival rate was 57%, compared with 59% for weight-matched pairs and 64% for patients who received a lung from an oversized donor (p=0.5). Long-term survival after lung transplantation was influenced by HLA matching. The 3-year survival rates were 100%, 68%, 70%, 65%, 54% and 55% for the HLA-A,-B,-DR 1, 2, 3, 4, 5 and 6 mismatched groups, respectively (p=0.06). A donor CMV+ and recipient CMV- match was a risk factor for long-term mortality, with 3-year survival rates of 56% for D+/R+, 55% for D+/R-, 71% for D-/R- and 62% for D-/R+ transplants (p=0.046). En-bloc transplantation of both lungs yielded worse early results, but the 3-year survival rates for patients who underwent single (60%), bilateral sequential double lung (63%) and en-bloc double lung transplantation (56%) were not different (p=0.2). Ventilator dependency was associated with a significantly reduced survival at 3 years. Patients on ventilator support at the time of the transplant had a 3-year survival rate of 48% compared with 63% for other patients (p=0.006).
Cost-effectiveness of indwelling pleural catheter compared with talc in malignant pleural effusion.
Olfert, Jordan A P; Penz, Erika D; Manns, Braden J; Mishra, Eleanor K; Davies, Helen E; Miller, Robert F; Luengo-Fernandez, Ramon; Gao, Song; Rahman, Najib M
2017-05-01
Malignant pleural effusion is associated with morbidity and mortality. A randomized controlled trial previously compared clinical outcomes and resource use with indwelling pleural catheter (IPC) and talc pleurodesis in this population. Using unpublished quality of life data, we estimate the cost-effectiveness of IPC compared with talc pleurodesis. Healthcare utilization and costs were captured during the trial. Utility weights produced by the EuroQol Group five-dimensional three-level questionnaire and survival were used to determine quality-adjusted life-years (QALYs) gained. The incremental cost-effectiveness ratio (ICER) was calculated over the 1-year trial period. Sensitivity analysis used patient survival data and modelled additional nursing time required per week for catheter drainage. Utility scores, cost and QALYs gained did not differ significantly between groups. The ICER for IPC compared with talc was favorable at $US10 870 per QALY gained. IPC was less costly with a probability exceeding 95% of being cost-effective when survival was <14 weeks, and was more costly when 2-h nursing time per week was assumed for catheter drainage. IPC is cost-effective when compared with talc, although substantial uncertainty exists around this estimate. IPC appears most cost-effective in patients with limited survival. If significant nursing time is required for catheter drainage, IPC becomes less likely to be cost-effective. Either therapy may be considered as a first-line option in treating malignant pleural effusion in patients without history of prior pleurodesis, with consideration for patient survival, support and preferences. © 2016 The Authors. Respirology published by John Wiley & Sons Australia, Ltd on behalf of Asian Pacific Society of Respirology.
Racial disparities in advanced-stage colorectal cancer survival.
Wallace, Kristin; Hill, Elizabeth G; Lewin, David N; Williamson, Grace; Oppenheimer, Stephanie; Ford, Marvella E; Wargovich, Michael J; Berger, Franklin G; Bolick, Susan W; Thomas, Melanie B; Alberg, Anthony J
2013-03-01
African-Americans (AA) have a higher incidence of and lower survival from colorectal cancer (CRC) compared with European Americans (EA). In the present study, statewide, population-based data from the South Carolina Central Cancer Registry are used to investigate the relationship between race and age on advanced-stage CRC survival. The study population comprised 3,865 advanced pathologically documented colon and rectal adenocarcinoma cases diagnosed between 01 January 1996 and 31 December 2006: 2,673 (69%) EA and 1,192 (31%) AA. Kaplan-Meier methods were used to generate median survival time and corresponding 95% confidence intervals (CI) by race, age, and gender. Factors associated with survival were evaluated by fitting Cox proportional hazards regression models to generate hazard ratios (HR) and 95% CI. We observed a significant interaction between race and age on CRC survival (p = 0.04). Among younger patients (<50 years), AA race was associated with a 1.34 times (95% CI 1.06-1.71) higher risk of death compared with EA. Among older patients, we observed a modest increase in risk of death among AA men compared with EA [HR 1.16 (95% CI 1.01-1.32)] but no difference by race among women [HR 0.94 (95% CI 0.82-1.08)]. Moreover, we observed that the disparity in survival has worsened over the past 15 years. Future studies that integrate clinical, molecular, and treatment-related data are needed for advancing understanding of the racial disparity in CRC survival, especially for those <50 years old.
Wang, S L; Lee, J J; Liao, A T
2015-07-01
Myelosuppression is one of the most common side effects of chemotherapy. The aim of this study was to determine whether chemotherapy-induced neutropenia is a positive prognostic indicator for remission and survival time in dogs with lymphoma. Fifty dogs with multicentric lymphoma received CHOP-based (C-cyclophosphamide; H-hydroxydaunorubicin; O-vincristine; P-prednisolone) chemotherapy using conventional dosages. Complete blood counts were recorded to determine the presence or absence of neutropenia after treatment. Toxicity, remission, and survival times were recorded and analysed. Thirteen dogs had chemotherapy-induced neutropenia and 37 had no neutropenia during the study period. No statistical difference was found between the groups for signalment or the presence of historical negative prognostic factors, except for bodyweight (P = 0.02). The median first remission times in the neutropenia and no neutropenia groups were 812 and 219 days, respectively (P <0.01). The median survival times of dogs in the neutropenia and no neutropenia groups were 952 and 282 days, respectively (P <0.01). Dogs with lymphoma that had chemotherapy-induced neutropenia exhibited significantly increased remission and survival times compared with dogs without neutropenia. Chemotherapeutic dosages may be adjusted individually to induce neutropenia without severe adverse effects in order to achieve longer remission and survival times. Copyright © 2015 Elsevier Ltd. All rights reserved.
Beckman, Sarah A; Sekiya, Naosumi; Chen, William C W; Mlakar, Logan; Tobita, Kimimassa; Huard, Johnny
2014-01-01
Since myoblasts have been limited by poor cell survival after cellular myoplasty, the major goal of the current study was to determine whether improving myoblast survival with an antioxidant could improve cardiac function after the transplantation of the myoblasts into an acute myocardial infarction. We previously demonstrated that early myogenic progenitors such as muscle-derived stem cells (MDSCs) exhibited superior cell survival and improved cardiac repair after transplantation into infarcted hearts compared to myoblasts, which we partially attributed to MDSC's higher antioxidant levels. To determine if antioxidant treatment could increase myoblast survival, subsequently improving cardiac function after myoblast transplantation into infarcted hearts. Myoblasts were pre-treated with the antioxidant N-acetylcysteine (NAC) or the glutathione depleter, diethyl maleate (DEM), and injected into infarcted murine hearts. Regenerative potential was monitored by cell survival and cardiac function. At early time points, hearts injected with NAC-treated myoblasts exhibited increased donor cell survival, greater cell proliferation, and decreased cellular apoptosis, compared to untreated myoblasts. NAC-treated myoblasts significantly improved cardiac contractility, reduced fibrosis, and increased vascular density compared to DEM-treated myoblasts, but compared to untreated myoblasts, no difference was noted. While early survival of myoblasts transplanted into infarcted hearts was augmented by NAC pre-treatment, cardiac function remained unchanged compared to non-treated myoblasts. Despite improving cell survival with NAC treated myoblast transplantation in a MI heart, cardiac function remained similar to untreated myoblasts. These results suggest that the reduced cardiac regenerative potential of myoblasts, when compared to MDSCs, is not only attributable to cell survival but is probably also related to the secretion of paracrine factors by the MDSCs.
Beckman, Sarah A.; Sekiya, Naosumi; Chen, William C.W.; Mlakar, Logan; Tobita, Kimimassa; Huard, Johnny
2017-01-01
Introduction Since myoblasts have been limited by poor cell survival after cellular myoplasty, the major goal of the current study was to determine whether improving myoblast survival with an antioxidant could improve cardiac function after the transplantation of the myoblasts into an acute myocardial infarction. Background We previously demonstrated that early myogenic progenitors such as muscle-derived stem cells (MDSCs) exhibited superior cell survival and improved cardiac repair after transplantation into infarcted hearts compared to myoblasts, which we partially attributed to MDSC’s higher antioxidant levels. Aim To determine if antioxidant treatment could increase myoblast survival, subsequently improving cardiac function after myoblast transplantation into infarcted hearts. Materials and Methods Myoblasts were pre-treated with the antioxidant N-acetylcysteine (NAC) or the glutathione depleter, diethyl maleate (DEM), and injected into infarcted murine hearts. Regenerative potential was monitored by cell survival and cardiac function. Results At early time points, hearts injected with NAC-treated myoblasts exhibited increased donor cell survival, greater cell proliferation, and decreased cellular apoptosis, compared to untreated myoblasts. NAC-treated myoblasts significantly improved cardiac contractility, reduced fibrosis, and increased vascular density compared to DEM-treated myoblasts, but compared to untreated myoblasts, no difference was noted. Discussion While early survival of myoblasts transplanted into infarcted hearts was augmented by NAC pre-treatment, cardiac function remained unchanged compared to non-treated myoblasts. Conclusion Despite improving cell survival with NAC treated myoblast transplantation in a MI heart, cardiac function remained similar to untreated myoblasts. 
These results suggest that the reduced cardiac regenerative potential of myoblasts, when compared to MDSCs, is not only attributable to cell survival but is probably also related to the secretion of paracrine factors by the MDSCs. PMID:28989945
Individual traits as determinants of time to death under extreme drought in Pinus sylvestris L.
Garcia-Forner, Núria; Sala, Anna; Biel, Carme; Savé, Robert; Martínez-Vilalta, Jordi
2016-10-01
Plants exhibit a variety of drought responses involving multiple interacting traits and processes, which makes predictions of drought survival challenging. Careful evaluation of responses within species, where individuals share broadly similar drought resistance strategies, can provide insight into the relative importance of different traits and processes. We subjected Pinus sylvestris L. saplings to extreme drought (no watering) leading to death in a greenhouse to (i) determine the relative effect of predisposing factors and responses to drought on survival time, (ii) identify and rank the importance of key predictors of time to death and (iii) compare individual characteristics of dead and surviving trees sampled concurrently. Time until death varied over 3 months among individual trees (from 29 to 147 days). Survival time was best predicted (higher explained variance and impact on the median survival time) by variables related to carbon uptake and carbon/water economy before and during drought. Trees with higher concentrations of monosaccharides before the beginning of the drought treatment and with higher assimilation rates prior to and during the treatment survived longer (median survival time increased 25-70 days), even at the expense of higher water loss. Dead trees exhibited less than half the amount of nonstructural carbohydrates (NSCs) in branches and stem relative to surviving trees sampled concurrently. Overall, our results indicate that the maintenance of carbon assimilation to prevent acute depletion of NSC content above some critical level appears to be the main factor explaining survival time of P. sylvestris trees under extreme drought. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Survival and Neurodevelopmental Outcomes among Periviable Infants
Younge, Noelle; Goldstein, Ricki F.; Bann, Carla M.; Hintz, Susan R.; Patel, Ravi M.; Smith, P. Brian; Bell, Edward F.; Rysavy, Matthew A.; Duncan, Andrea F.; Vohr, Betty R.; Das, Abhik; Goldberg, Ronald N.; Higgins, Rosemary D.; Cotten, C. Michael
2017-01-01
BACKGROUND Data reported during the past 5 years indicate that rates of survival have increased among infants born at the borderline of viability, but less is known about how increased rates of survival among these infants relate to early childhood neurodevelopmental outcomes. METHODS We compared survival and neurodevelopmental outcomes among infants born at 22 to 24 weeks of gestation, as assessed at 18 to 22 months of corrected age, across three consecutive birth-year epochs (2000–2003 [epoch 1], 2004–2007 [epoch 2], and 2008–2011 [epoch 3]). The infants were born at 11 centers that participated in the National Institute of Child Health and Human Development Neonatal Research Network. The primary outcome measure was a three-level outcome — survival without neurodevelopmental impairment, survival with neurodevelopmental impairment, or death. After accounting for differences in infant characteristics, including birth center, we used multinomial generalized logit models to compare the relative risk of survival without neurodevelopmental impairment, survival with neurodevelopmental impairment, and death. RESULTS Data on the primary outcome were available for 4274 of 4458 infants (96%) born at the 11 centers. The percentage of infants who survived increased from 30% (424 of 1391 infants) in epoch 1 to 36% (487 of 1348 infants) in epoch 3 (P<0.001). The percentage of infants who survived without neurodevelopmental impairment increased from 16% (217 of 1391) in epoch 1 to 20% (276 of 1348) in epoch 3 (P = 0.001), whereas the percentage of infants who survived with neurodevelopmental impairment did not change significantly (15% [207 of 1391] in epoch 1 and 16% [211 of 1348] in epoch 3, P = 0.29). 
After adjustment for changes in the baseline characteristics of the infants over time, both the rate of survival with neurodevelopmental impairment (as compared with death) and the rate of survival without neurodevelopmental impairment (as compared with death) increased over time (adjusted relative risks, 1.27 [95% confidence interval {CI}, 1.01 to 1.59] and 1.59 [95% CI, 1.28 to 1.99], respectively). CONCLUSIONS The rate of survival without neurodevelopmental impairment increased between 2000 and 2011 in this large cohort of periviable infants. (Funded by the National Institutes of Health and others; ClinicalTrials.gov numbers, NCT00063063 and NCT00009633.) PMID:28199816
Daters, A T; Mauldin, G E; Mauldin, G N; Brodsky, E M; Post, G S
2010-03-01
The purpose of this study was to evaluate the efficacy of adding mitoxantrone to a cyclophosphamide, doxorubicin, vincristine, L-asparaginase and prednisone-containing protocol. Sixty-five dogs with multicentric lymphoma were evaluated for overall remission and survival times. Remission and survival time versus stage, substage, pretreatment hypercalcaemia and pretreatment steroid administration were also evaluated. Overall median remission for dogs with multicentric lymphoma was 302 days and overall median survival was 622 days. Of the dogs with multicentric lymphoma, 23 (35%) received all scheduled mitoxantrone doses. Only median survival versus substage was found to be significant (substage a median survival was 679 days and substage b median survival was 302 days, P = 0.025). Increasing the total combined dose of doxorubicin and mitoxantrone may improve remission times when compared with historical controls, and further studies are needed to determine how best to utilize mitoxantrone in multidrug chemotherapy protocols for canine multicentric lymphoma.
Population-based survival-cure analysis of ER-negative breast cancer.
Huang, Lan; Johnson, Karen A; Mariotto, Angela B; Dignam, James J; Feuer, Eric J
2010-08-01
This study investigated the trends over time in age- and stage-specific population-based survival of estrogen receptor-negative (ER-) breast cancer patients by examining the fraction of cured patients and the median survival time for uncured patients. Cause-specific survival data from the Surveillance, Epidemiology, and End Results program for cases diagnosed during 1992-1998 were used in mixed survival cure models to evaluate the cure fraction and the extension in survival for uncured patients. Survival trends were compared with adjuvant chemotherapy data available from an overlapping patterns-of-care study. For stage II N+ disease, the largest increase in cure fraction was from 44% to 60% (P = 0.0257) for women aged ≥70, in contrast to a 7-8 percentage point increase for women aged <50 or 50-69 (P = 0.056 and 0.038, respectively). For women with stage III disease, the increases in the cure fraction were not statistically significant, although women aged 50-69 had a 10 percentage point increase (P = 0.103). Increases in cure fraction correspond with increases in the use of adjuvant chemotherapy, particularly for the oldest age group. In this article, for the first time, we estimate the cure fraction for ER- patients. We note that at age ≥70, the accelerated increase in cure fraction from 1992 to 1998 for women with stage II N+ compared with stage III suggests a selective benefit of chemotherapy in the lower stage group.
Rassnick, Kenneth M; Goldkamp, Carrie E; Erb, Hollis N; Scrivani, Peter V; Njaa, Bradley L; Gieger, Tracy L; Turek, Michelle M; McNiel, Elizabeth A; Proulx, David R; Chun, Ruthanne; Mauldin, Glenna E; Phillips, Brenda S; Kristal, Orna
2006-08-01
To evaluate factors associated with survival in dogs with nasal carcinomas that did not receive treatment or received only palliative treatment. Retrospective case series. 139 dogs with histologically confirmed nasal carcinomas. Medical records, computed tomography images, and biopsy specimens of nasal carcinomas were reviewed. Only dogs that were not treated with radiation, surgery, chemotherapy, or immunotherapy and that survived ≥7 days from the date of diagnosis were included. The Kaplan-Meier method was used to estimate survival time. Factors potentially associated with survival were compared by use of log-rank and Wilcoxon rank sum tests. Multivariable survival analysis was performed by use of the Cox proportional hazards regression model. Overall median survival time was 95 days (95% confidence interval [CI], 73 to 113 days; range, 7 to 1,114 days). In dogs with epistaxis, the hazard of dying was 2.3 times that of dogs that did not have epistaxis. Median survival time of 107 dogs with epistaxis was 88 days (95% CI, 65 to 106 days) and that of 32 dogs without epistaxis was 224 days (95% CI, 54 to 467 days). The prognosis of dogs with untreated nasal carcinomas is poor. Treatment strategies to improve outcome should be pursued.
Lee, Jeeyun; Au, Wing-Yan; Park, Min Jae; Suzumiya, Junji; Nakamura, Shigeo; Kameoka, Jun-Ichi; Sakai, Chikara; Oshimi, Kazuo; Kwong, Yok-Lam; Liang, Raymond; Yiu, Harry; Wong, Kam-Hung; Cheng, Hoi-Ching; Ryoo, Baek-Yeol; Suh, Cheolwon; Ko, Young Hyeh; Kim, Kihyun; Lee, Jae-Won; Kim, Won Seog; Suzuki, Ritsuro
2008-12-01
Extranodal natural killer (NK)/T cell lymphoma, nasal type, is a recently recognized distinct entity and the most common type of non-B cell extranodal lymphoma in Asia. This retrospective analysis studied the potential survival benefits of hematopoietic stem cell transplantation (HSCT) compared with a historical control group. A total of 47 patients from 3 previously published series of HSCT were matched according to NK/T cell lymphoma International Prognostic Index (NKIPI) risk groups and disease status at transplantation with 107 patients from a historical control group for analysis. After a median follow-up of 116.5 months, the median survival time was not determined for the HSCT group, but it was 43.5 months for the control group (95% confidence interval [CI] = 6.7 to 80.3 months; P = .127, log-rank test). In patients who were in complete remission (CR) at the time of HSCT or at surveillance after remission, disease-specific survival rates were significantly higher in the HSCT group compared with the control group (disease-specific 5-year survival rate, 87.3% for HSCT vs 67.8% for non-HSCT; P = .027). In contrast, in subgroup analysis on non-CR patients at the time of HSCT or non-HSCT treatment, disease-specific survival rates were not significantly prolonged in the HSCT group compared with the control group (1-year survival rate, 66.7% for HSCT vs 28.6% for non-HSCT; P = .141). The impact of HSCT on the survival of all patients was significantly retained at the multivariate level with a 2.1-fold (95% CI = 1.2- to 3.7-fold) reduced risk of death (P = .006). HSCT seems to confer a survival benefit in patients who attained CR on postremission consolidation therapy. These findings suggest that, in particular, patients in CR with high NKIPI risk scores at diagnosis should receive full consideration for HSCT.
Lanfear, David E; Levy, Wayne C; Stehlik, Josef; Estep, Jerry D; Rogers, Joseph G; Shah, Keyur B; Boyle, Andrew J; Chuang, Joyce; Farrar, David J; Starling, Randall C
2017-05-01
Timing of left ventricular assist device (LVAD) implantation in advanced heart failure patients not on inotropes is unclear. Relevant prediction models exist (SHFM [Seattle Heart Failure Model] and HMRS [HeartMate II Risk Score]), but use in this group is not established. ROADMAP (Risk Assessment and Comparative Effectiveness of Left Ventricular Assist Device and Medical Management in Ambulatory Heart Failure Patients) is a prospective, multicenter, nonrandomized study of 200 advanced heart failure patients not on inotropes who met indications for LVAD implantation, comparing the effectiveness of HeartMate II support versus optimal medical management. We compared SHFM-predicted versus observed survival (overall survival and LVAD-free survival) in the optimal medical management arm (n=103) and HMRS-predicted versus observed survival in all LVAD patients (n=111) using Cox modeling, receiver-operator characteristic (ROC) curves, and calibration plots. In the optimal medical management cohort, the SHFM was a significant predictor of survival (hazard ratio=2.98; P <0.001; ROC area under the curve=0.71; P <0.001) but not LVAD-free survival (hazard ratio=1.41; P =0.097; ROC area under the curve=0.56; P =0.314). SHFM showed adequate calibration for survival but overestimated LVAD-free survival. In the LVAD cohort, the HMRS had marginal discrimination at 3 (Cox P =0.23; ROC area under the curve=0.71; P =0.026) and 12 months (Cox P =0.036; ROC area under the curve=0.62; P =0.122), but calibration was poor, underestimating survival across time and risk subgroups. In non-inotrope-dependent advanced heart failure patients receiving optimal medical management, the SHFM was predictive of overall survival but underestimated the risk of clinical worsening and LVAD implantation. Among LVAD patients, the HMRS had marginal discrimination and underestimated survival post-LVAD implantation. URL: http://www.clinicaltrials.gov. Unique identifier: NCT01452802. 
© 2017 American Heart Association, Inc.
Survival in patients with metachronous second primary lung cancer.
Ha, Duc; Choi, Humberto; Chevalier, Cory; Zell, Katrina; Wang, Xiao-Feng; Mazzone, Peter J
2015-01-01
Four to 10% of patients with non-small cell lung cancer subsequently develop a metachronous second primary lung cancer. The decision to perform surveillance or screening imaging for patients with potentially cured lung cancer must take into account the outcomes expected when detecting metachronous second primaries. To assess potential survival differences between patients with metachronous second primary lung cancer compared to matched patients with first primary lung cancer. We retrospectively reviewed patients diagnosed with lung cancer at the Cleveland Clinic (2006-2010). Metachronous second primary lung cancer was defined as lung cancer diagnosed after a 4-year, disease-free interval from the first lung cancer, or if there were two different histologic subtypes diagnosed at different times. Patients with first primary lung cancer diagnosed in the same time period served as control subjects. Propensity score matching was performed using age, sex, smoking history, histologic subtype, and collaborative stage, with a 1:3 case-control ratio. Survival analyses were performed by Cox proportional hazards modeling and Kaplan-Meier estimates. Forty-four patients met criteria for having a metachronous second primary lung cancer. There were no statistically significant differences between case subjects and control subjects in prognostic variables. The median survival time and 2-year overall survival rate for the metachronous second primary group, compared with control subjects, were as follows: 11.8 versus 18.4 months (P = 0.18) and 31.0 versus 40.9% (P = 0.28). The survival difference was largest in those with stage I metachronous second primaries (median survival time, 26.8 vs. 60.4 mo, P = 0.09; 2-year overall survival, 56.3 vs. 71.2%, P = 0.28). Patients with stage I metachronous second primary lung cancer may have worse survival than those who present with a first primary lung cancer. 
This could influence the benefit-risk balance of screening the high-risk cohort with a previously treated lung cancer.
Understanding survival analysis: Kaplan-Meier estimate.
Goel, Manish Kumar; Khanna, Pardeep; Kishore, Jugal
2010-10-01
The Kaplan-Meier estimate is one of the best options for measuring the fraction of subjects living for a certain amount of time after treatment. In clinical trials or community trials, the effect of an intervention is assessed by measuring the number of subjects who survived, or were saved, by that intervention over a period of time. The time from a defined starting point to the occurrence of a given event (for example, death) is called the survival time, and the analysis of such grouped data is called survival analysis. The analysis can be complicated when subjects are uncooperative and refuse to remain in the study, when some subjects do not experience the event or death before the end of the study (although they would have if observation had continued), or when we lose touch with them midway through the study. We label these situations censored observations. The Kaplan-Meier estimate is the simplest way of computing survival over time despite these difficulties. The survival curve can be created under various assumed situations. It involves computing the probability of the event occurring at a certain point in time and multiplying these successive probabilities by the earlier computed probabilities to obtain the final estimate. This can be calculated for two groups of subjects, and the statistical difference between their survival curves can also be tested. The method can be used in Ayurveda research when comparing two drugs and examining the survival of subjects.
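The successive-probability product described in this abstract can be sketched directly. The following is a minimal illustration with hypothetical data, not a substitute for a vetted implementation from a standard statistical package:

```python
# Minimal Kaplan-Meier sketch. At each distinct event time t_i the
# estimate updates as S(t_i) = S(t_{i-1}) * (1 - d_i / n_i), where
# d_i is the number of deaths at t_i and n_i the number still at risk.

def kaplan_meier(times, events):
    """Return a list of (time, survival_probability) pairs at event times.

    times  -- follow-up time for each subject
    events -- 1 if the event (e.g., death) occurred, 0 if censored
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = 0
        removed = 0
        # Group all subjects sharing this time point.
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= removed   # deaths and censorings both leave the risk set
    return curve
```

Subjects censored at a given time simply leave the risk set without a step down in the curve, which is how censored observations contribute their partial information.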
Shauna M. Uselman; Keirith A. Snyder; Elizabeth A. Leger; Sara E. Duke
2015-01-01
Comparing emergence and survival probabilities, early seral natives generally outperformed late seral natives when growing with exotics and had earlier emergence timing, although results differed among functional groups and soil types. In contrast, survival probabilities did not differ between the early and late seral mixes when growing without exotics. Within each...
Survival of juvenile black brant during brood rearing
Flint, Paul L.; Sedinger, James S.; Pollock, Kenneth H.
1995-01-01
Survival of young is an important and poorly understood component of waterfowl productivity. We estimated survival of black brant (Branta bernicla nigricans) goslings during summers 1987-89 on the Yukon-Kuskokwim Delta, Alaska, to determine timing and magnitude of gosling mortality and to compare methods of estimating gosling survival. Eighty-two percent of radio-tagged adult females (n = 61) fledged ≥1 gosling (brood success). We estimated survival of goslings within broods by 3 methods: (1) changes in mean brood size through time, (2) observation of goslings associated with marked adults, and (3) age ratios of brant captured in banding drives. Estimates of gosling survival within successful broods averaged 81% and ranged from 66 to 92%. Combining brood success and gosling survival within successful broods yielded estimates of overall gosling survival that averaged 68%, ranging from 79% in 1987 to 56% in 1989. Eighty-two percent of gosling mortality occurred in the first 15 days. Estimates of survival on the basis of age ratios of birds captured in banding drives are biased low. Our estimates of average gosling survival are higher than reported for other species of geese.
Geographic disparities in colorectal cancer survival
Henry, Kevin A; Niu, Xiaoling; Boscoe, Francis P
2009-01-01
Background Examining geographic variation in cancer patient survival can help identify important prognostic factors that are linked by geography and generate hypotheses about the underlying causes of survival disparities. In this study, we apply a recently developed spatial scan statistic method, designed for time-to-event data, to determine whether colorectal cancer (CRC) patient survival varies by place of residence after adjusting survival times for several prognostic factors. Methods Using data from a population-based, statewide cancer registry, we examined a cohort of 25,040 men and women from New Jersey who were newly diagnosed with local or regional stage colorectal cancer from 1996 through 2003 and followed to the end of 2006. Survival times were adjusted for significant prognostic factors (sex, age, stage at diagnosis, race/ethnicity and census tract socioeconomic deprivation) and evaluated using a spatial scan statistic to identify places where CRC survival was significantly longer or shorter than the statewide experience. Results Age, sex and stage adjusted survival times revealed several areas in the northern part of the state where CRC survival was significantly different than expected. The shortest and longest survival areas had an adjusted 5-year survival rate of 73.1% (95% CI 71.5, 74.9) and 88.3% (95% CI 85.4, 91.3) respectively, compared with the state average of 80.0% (95% CI 79.4, 80.5). Analysis of survival times adjusted for age, sex and stage as well as race/ethnicity and area socioeconomic deprivation attenuated the risk of death from CRC in several areas, but survival disparities persisted. Conclusion The results suggest that in areas where additional adjustments for race/ethnicity and area socioeconomic deprivation changed the geographic survival patterns and reduced the risk of death from CRC, the adjustment factors may be contributing causes of the disparities. 
Further studies should focus on specific and modifiable individual and neighborhood factors in the high risk areas that may affect a person's chance of surviving cancer. PMID:19627576
Perez-Cruz, Pedro E.; dos Santos, Renata; Silva, Thiago Buosi; Crovador, Camila Souza; Nascimento, Maria Salete de Angelis; Hall, Stacy; Fajardo, Julieta; Bruera, Eduardo; Hui, David
2014-01-01
Context Survival prognostication is important at the end of life. The accuracy of clinician prediction of survival (CPS) over time has not been well characterized. Objectives To examine changes in prognostication accuracy during the last 14 days of life in a cohort of patients with advanced cancer admitted to two acute palliative care units and to compare the accuracy between the temporal and probabilistic approaches. Methods Physicians and nurses prognosticated survival daily for cancer patients in two hospitals until death/discharge using two prognostic approaches: temporal and probabilistic. We assessed accuracy for each method daily during the last 14 days of life, comparing accuracy at day −14 (baseline) with accuracy at each time point using a test of proportions. Results Physicians and nurses provided 6718 temporal and 6621 probabilistic estimations for 311 patients. Median (interquartile range) survival was 8 (4, 20) days. Temporal CPS had low accuracy (10–40%) and did not change over time. In contrast, probabilistic CPS was significantly more accurate (p<.05 at each time point) but decreased close to death. Conclusion Probabilistic CPS was consistently more accurate than temporal CPS over the last 14 days of life; however, its accuracy decreased as patients approached death. Our findings suggest that better tools to predict impending death are necessary. PMID:24746583
Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models.
Gelfand, Lois A; MacKinnon, David P; DeRubeis, Robert J; Baraldi, Amanda N
2016-01-01
Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome: underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results.
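As a rough illustration of the kind of data-generating process used in such simulation studies, the sketch below draws Weibull-distributed survival times from an AFT-form model with a treatment, a mediator, and administrative censoring. All coefficients and the censoring time are arbitrary placeholders, not values from the paper:

```python
import math
import random

random.seed(0)

# Hypothetical path coefficients (illustrative only)
a = 0.5          # treatment -> mediator path
b = -0.4         # mediator -> log survival time path (AFT scale)
c = -0.2         # direct treatment effect on log survival time
shape = 1.5      # Weibull shape parameter
censor_at = 5.0  # administrative censoring time

def simulate(n=1000):
    """Generate (treatment, mediator, observed_time, event) tuples.

    AFT form: log T = mu + c*x + b*m + (1/shape) * log E, E ~ Exp(1),
    which makes T Weibull-distributed given x and m. Subjects whose
    latent time exceeds censor_at are recorded as censored (event=0).
    """
    data = []
    for _ in range(n):
        x = random.randint(0, 1)            # randomized treatment
        m = a * x + random.gauss(0, 1)      # mediator
        log_t = 1.0 + c * x + b * m + (1 / shape) * math.log(
            random.weibullvariate(1, 1))    # weibullvariate(1, 1) is Exp(1)
        t = math.exp(log_t)
        event = 1 if t <= censor_at else 0
        data.append((x, m, min(t, censor_at), event))
    return data
```

In the mediation setting, the fitted AFT coefficients for x and m on the log-time scale play the roles of the direct and mediated paths; the abstract's point is that censoring biases these estimates differently under LIFEREG and PHREG.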
Xi, Mian; Xu, Cai; Liao, Zhongxing; Hofstetter, Wayne L; Blum Murphy, Mariela; Maru, Dipen M; Bhutani, Manoop S; Lee, Jeffrey H; Weston, Brian; Komaki, Ritsuko; Lin, Steven H
2017-08-01
To assess the impact of histology on recurrence patterns and survival outcomes in patients with esophageal cancer (EC) treated with definitive chemoradiotherapy (CRT). We analyzed 590 consecutive EC patients who received definitive CRT from 1998 to 2014, including 182 patients (30.8%) with squamous cell carcinoma (SCC) and 408 (69.2%) with adenocarcinoma. Recurrence pattern and timing, survival, and potential prognostic factors were compared. After a median follow-up time of 58.0 months, the SCC group demonstrated a comparable locoregional recurrence rate (42.9% vs. 38.0%, P=0.264) but a significantly lower distant failure rate (27.5% vs. 48.0%, P<0.001) than the adenocarcinoma group. No significant difference was found in overall survival or locoregional failure-free survival between groups, whereas the SCC group was associated with significantly more favorable recurrence-free survival (P=0.009) and distant metastasis-free survival (P<0.001). The adenocarcinoma group had higher hematogenous metastasis rates of bone, brain, and liver, whereas the SCC group had a marginally higher regional recurrence rate. Among patients who received salvage surgery after locoregional recurrence, no significant difference in survival was found between groups (P=0.12). The patterns and sites of recurrence, survival outcomes, and prognostic factors were significantly different between esophageal SCC and adenocarcinoma. Copyright © 2017 Elsevier B.V. All rights reserved.
Improved Survival After the Ross Procedure Compared With Mechanical Aortic Valve Replacement.
Buratto, Edward; Shi, William Y; Wynne, Rochelle; Poh, Chin L; Larobina, Marco; O'Keefe, Michael; Goldblatt, John; Tatoulis, James; Skillington, Peter D
2018-03-27
It is unclear whether the Ross procedure offers superior survival compared with mechanical aortic valve replacement (AVR). This study evaluated experience and compared long-term survival between the Ross procedure and mechanical AVR. Between 1992 and 2016, a total of 392 Ross procedures were performed. These were compared with 1,928 isolated mechanical AVRs performed during the same time period as identified using the University of Melbourne and Australia and New Zealand Society of Cardiac and Thoracic Surgeons' Cardiac Surgery Databases. Only patients between 18 and 65 years of age were included. Propensity-score matching was performed for risk adjustment. Ross procedure patients were younger, and had fewer cardiovascular risk factors. The Ross procedure was associated with longer cardiopulmonary bypass and aortic cross-clamp times. Thirty-day mortality was similar (Ross, 0.3%; mechanical, 0.8%; p = 0.5). Ross procedure patients experienced superior unadjusted long-term survival at 20 years (Ross, 95%; mechanical, 68%; p < 0.001). Multivariable analysis showed the Ross procedure to be associated with a reduced risk of late mortality (hazard ratio: 0.34; 95% confidence interval: 0.17 to 0.67; p < 0.001). Among 275 propensity-score matched pairs, Ross procedure patients had superior survival at 20 years (Ross, 94%; mechanical, 84%; p = 0.018). In this Australian, propensity-score matched study, the Ross procedure was associated with better long-term survival compared with mechanical AVR. In younger patients, with a long life expectancy, the Ross procedure should be considered in centers with sufficient expertise. Crown Copyright © 2018. Published by Elsevier Inc. All rights reserved.
Sherrill, B; Wang, J; Kotapati, S; Chin, K
2013-01-01
Background: Study CA184024 was a multinational, randomised, double-blind, phase 3 study comparing ipilimumab/dacarbazine (DTIC) vs placebo/DTIC in patients with untreated stage III/IV melanoma, which showed that ipilimumab significantly improves survival in patients with metastatic melanoma. The objective of this analysis was to compare the quality-adjusted survival experience among patients in this trial. Methods: Survival time was partitioned into health states: toxicity, time before progression without toxicity, and relapse until death or end of follow-up. Q-TWiST (quality-adjusted time without symptoms of disease or toxicity of treatment) was calculated as the utility-weighted sum of the mean health state durations. Analyses were repeated over extended follow-up periods. Results: Based on a combination of trial-based and external utility scores, the Q-TWiST difference in this trial was 0.50 months (P=0.0326) favoring ipilimumab after 1 year. The Q-TWiST difference was 1.5 months with 2 years of follow-up (P=0.0091), 2.36 months at 3 years (P=0.005) and 3.28 months at 4 years (P=0.0074). Conclusion: During the first year of study, there was little difference between groups in quality-adjusted survival. However, after 2, 3 and 4 years of follow-up for patients with extended survival, the benefits of ipilimumab plus DTIC vs placebo plus DTIC for advanced melanoma continue to accrue. PMID:23787916
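Once the mean health-state durations are in hand, Q-TWiST itself is simple arithmetic: the utility-weighted sum of time in toxicity, time without symptoms or toxicity (conventionally weighted 1), and time after relapse. A minimal sketch with placeholder utility weights (the study used trial-based and external utility scores, which are not reproduced here):

```python
def q_twist(tox_months, twist_months, rel_months, u_tox=0.5, u_rel=0.5):
    """Quality-adjusted time without symptoms or toxicity.

    Q-TWiST = u_tox * TOX + 1.0 * TWiST + u_rel * REL, where TOX, TWiST,
    and REL are the mean durations (e.g., in months) spent in each health
    state. The utility weights here are illustrative placeholders.
    """
    return u_tox * tox_months + 1.0 * twist_months + u_rel * rel_months
```

Sensitivity analyses typically vary u_tox and u_rel over [0, 1] to check whether the treatment comparison is robust to the choice of utilities.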
Dumitrascu, T; Dima, S; Brasoveanu, V; Stroescu, C; Herlea, V; Moldovan, S; Ionescu, M; Popescu, I
2014-12-01
The impact of venous resection (VR) in pancreaticoduodenectomy (PD) for pancreatic adenocarcinoma (PDAC) is controversial. The aim of the study was to comparatively assess the postoperative outcomes after PD with and without VR for PDAC and to identify predictors of morbidity and survival in the subgroup of PD with VR. The data of 51 PD with VR were compared with those of 183 PD without VR. Binary logistic regression and Cox survival analyses were performed. Both the operative time and estimated blood loss were significantly higher in the VR group (P<0.001). A trend towards an increased 90-day mortality (9.8% vs. 5.5%) and severe morbidity (20% vs. 13%) was observed when a VR was performed (P ≥0.264). The median overall survival time after the PD with and without VR was 13 months and 17 months, respectively (P=0.845). The absence of histological tumor invasion of the VR was found to be the only independent predictor of better survival (HR=0.359; 95% CI 0.161-0.803; P=0.013). A PD with VR can be safely incorporated into a pancreatic surgeon's armamentarium. However, a trend towards increased mortality and severe morbidity rates should be expected, along with higher operative time and blood loss, compared with PD without VR. Associated VR does not appear to significantly impair the prognosis after PD for PDAC; however, histological tumor invasion of the VR has a negative impact on survival.
Helicopter crashes into water: warning time, final position, and other factors affecting survival.
Brooks, Christopher J; MacDonald, Conor V; Baker, Susan P; Shanahan, Dennis F; Haaland, Wren L
2014-04-01
According to 40 yr of data, the fatality rate for a helicopter crash into water is approximately 25%. Do warning time and the final position of the helicopter in the water influence the survival rate? The National Transportation Safety Board (NTSB) database was queried to identify helicopter crashes into water between 1981 and 2011 in the Gulf of Mexico and Hawaii. Fatality rate, amount of warning time prior to the crash, and final position of the helicopter were identified. There were 133 helicopters that crashed into water with 456 crew and passengers. Of these, 119 occupants (26%) did not survive; of those who did survive, 38% were injured. Twelve died after making a successful escape from the helicopter. Crashes with < 15 s warning had a fatality rate of 22%, compared to 12% for 16-60 s warning and 5% for > 1 min. However, more than half of fatalities (57%) came from crashes for which the warning time could not be determined. Lack of warning time and how to survive in the water after the crash should be topics for study in all marine survival/aircraft ditching courses. Investigators should be trained to provide estimates of warning time when investigating helicopter crashes into water.
Francis, Jasmine H; Iyer, Saipriya; Gobin, Y Pierre; Brodie, Scott E; Abramson, David H
2017-10-01
To compare the efficacy and toxicity of treating class 3 retinoblastoma vitreous seeds with ophthalmic artery chemosurgery (OAC) alone versus OAC with intravitreous chemotherapy. Retrospective cohort study. Forty eyes containing clouds (class 3 vitreous seeds) of 40 retinoblastoma patients (19 treated with OAC alone and 21 treated with OAC plus intravitreous and periocular chemotherapy). Ocular survival, disease-free survival and time to regression of seeds were estimated with Kaplan-Meier estimates. Ocular toxicity was evaluated by clinical findings and electroretinography: 30-Hz flicker responses were compared at baseline and last follow-up visit. Continuous variables were compared with Student t test, and categorical variables were compared with the Fisher exact test. Ocular survival, disease-free survival, and time to regression of seeds. There were no disease- or treatment-related deaths and no patient demonstrated externalization of tumor or metastatic disease. There was no significant difference in the age, laterality, disease, or disease status (treatment naïve vs. previously treated) between the 2 groups. The time to regression of seeds was significantly shorter for eyes treated with OAC plus intravitreous chemotherapy (5.7 months) compared with eyes treated with OAC alone (14.6 months; P < 0.001). The 18-month Kaplan-Meier estimates of disease-free survival were significantly worse for the OAC alone group: 67.1% (95% confidence interval, 40.9%-83.6%) versus 94.1% (95% confidence interval, 65%-99.1%) for the OAC plus intravitreous chemotherapy group (P = 0.05). The 36-month Kaplan-Meier estimates of ocular survival were 83.3% (95% confidence interval, 56.7%-94.3%) for the OAC alone group and 100% for the OAC plus intravitreous chemotherapy group (P = 0.16). 
The mean change in electroretinography responses was not significantly different between groups, decreasing by 11 μV for the OAC alone group and 22 μV for the OAC plus intravitreous chemotherapy group (P = 0.4). Treating vitreous seed clouds with OAC and intravitreous and periocular chemotherapy, compared with OAC alone, resulted in a shorter time to regression and was associated with fewer recurrences requiring additional treatment and fewer enucleations. The toxicity to the retina does not seem to be significantly worse in the OAC plus intravitreous chemotherapy group. Copyright © 2017 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
Effects of electrical stimulation in the treatment of osteonecrosis of the femoral head.
Fornell, Salvador; Ribera, Juan; Mella, Mario; Carranza, Andrés; Serrano-Toledano, David; Domecq, Gabriel
2017-10-16
The aim of this study was to examine whether the use of an internal electrostimulator could improve the results obtained with core decompression alone in the treatment of osteonecrosis of the femoral head. We performed a retrospective study of 41 patients (55 hips) treated for osteonecrosis of the femoral head between 2005 and 2014. Mean follow-up time was 56 (12-108) months. We recorded 3 parameters: time to recurrence of pain, time to conversion to arthroplasty and time to radiographic failure. Survival was estimated using the Kaplan-Meier method. The equality of the survival distributions was assessed by the log-rank test. An implanted electrostimulator increased the survival of hips at preoperative Steinberg stage II or below, whereas survival remained unchanged at stage III or higher. The addition of an internal electrostimulator provides increased survival compared to core decompression alone at stages below III.
Equilino, Mirjam; Théodoloz, Vincent; Gorgas, Daniela; Doherr, Marcus G; Heilmann, Romy M; Suchodolski, Jan S; Steiner, Jörg M; Burgener Dvm, Iwan A
2015-01-01
To evaluate serum concentrations of biochemical markers and survival time in dogs with protein-losing enteropathy (PLE). Prospective study. 29 dogs with PLE and 18 dogs with food-responsive diarrhea (FRD). Data regarding serum concentrations of various biochemical markers at the initial evaluation were available for 18 of the 29 dogs with PLE and compared with findings for dogs with FRD. Correlations between biochemical marker concentrations and survival time (interval between time of initial evaluation and death or euthanasia) for dogs with PLE were evaluated. Serum C-reactive protein concentration was high in 13 of 18 dogs with PLE and in 2 of 18 dogs with FRD. Serum concentration of canine pancreatic lipase immunoreactivity was high in 3 dogs with PLE but within the reference interval in all dogs with FRD. Serum α1-proteinase inhibitor concentration was less than the lower reference limit in 9 dogs with PLE and 1 dog with FRD. Compared with findings in dogs with FRD, values of those 3 variables in dogs with PLE were significantly different. Serum calprotectin (measured by radioimmunoassay and ELISA) and S100A12 concentrations were high but did not differ significantly between groups. Seventeen of the 29 dogs with PLE were euthanized owing to this disease; median survival time was 67 days (range, 2 to 2,551 days). Serum C-reactive protein, canine pancreatic lipase immunoreactivity, and α1-proteinase inhibitor concentrations differed significantly between dogs with PLE and FRD. Most initial biomarker concentrations were not predictive of survival time in dogs with PLE.
Landmark Estimation of Survival and Treatment Effect in a Randomized Clinical Trial
Parast, Layla; Tian, Lu; Cai, Tianxi
2013-01-01
Summary In many studies with a survival outcome, it is often not feasible to fully observe the primary event of interest. This often leads to heavy censoring and thus, difficulty in efficiently estimating survival or comparing survival rates between two groups. In certain diseases, baseline covariates and the event time of non-fatal intermediate events may be associated with overall survival. In these settings, incorporating such additional information may lead to gains in efficiency in estimation of survival and testing for a difference in survival between two treatment groups. If gains in efficiency can be achieved, it may then be possible to decrease the sample size of patients required for a study to achieve a particular power level or decrease the duration of the study. Most existing methods for incorporating intermediate events and covariates to predict survival focus on estimation of relative risk parameters and/or the joint distribution of events under semiparametric models. However, in practice, these model assumptions may not hold and hence may lead to biased estimates of the marginal survival. In this paper, we propose a semi-nonparametric two-stage procedure to estimate and compare t-year survival rates by incorporating intermediate event information observed before some landmark time, which serves as a useful approach to overcome semi-competing risks issues. In a randomized clinical trial setting, we further improve efficiency through an additional calibration step. Simulation studies demonstrate substantial potential gains in efficiency in terms of estimation and power. We illustrate our proposed procedures using an AIDS Clinical Trial Protocol 175 dataset by estimating survival and examining the difference in survival between two treatment groups: zidovudine and zidovudine plus zalcitabine. PMID:24659838
Cole, Ashley L; Austin, Anna E; Hickson, Ryan P; Dixon, Matthew S; Barber, Emma L
2018-05-11
Randomized trials outside the U.S. have found non-inferior survival for neoadjuvant chemotherapy (NACT) versus primary debulking surgery (PDS) for advanced ovarian cancer (AOC). However, these trials reported lower overall survival and lower rates of optimal debulking than U.S. studies, leading to questions about generalizability to U.S. practice, where aggressive debulking is more common. Consequently, comparative effectiveness in the U.S. remains controversial. We reviewed U.S. comparative effectiveness studies of NACT versus PDS for AOC. Here we describe methodological challenges, compare results to trials outside the U.S., and make suggestions for future research. We identified U.S. studies published in 2010 or later that evaluated the comparative effectiveness of NACT versus PDS on survival in AOC through a PubMed search. Two independent reviewers abstracted data from eligible articles. Nine of 230 articles were eligible for review. Methodological challenges included unmeasured confounders, heterogeneous treatment effects, treatment variations over time, and inconsistent measurement of treatment and survival. Whereas some limitations were unavoidable, several limitations noted across studies were avoidable, including conditioning on mediating factors and immortal time introduced by measuring survival beginning from diagnosis. Without trials in the U.S., non-randomized studies are an important source of evidence for the ideal treatment for AOC. However, several methodological challenges exist when assessing the comparative effectiveness of NACT versus PDS in a non-randomized setting. Future observational studies must ensure that treatment is consistent throughout the study period and that treatment groups are comparable. Rapidly-evolving oncology data networks may allow for identification of treatment intent and other important confounders. Copyright © 2018 Elsevier Ltd. All rights reserved.
Survival from skin cancer and its associated factors in Kurdistan province of Iran.
Ahmadi, Galavizh; Asadi-Lari, Mohsen; Amani, Saeid; Solaymani-Dodaran, Masoud
2015-01-01
We explored survival from skin cancer and its determinants in Kurdistan province of Iran. In a retrospective cohort design, we identified all registered skin cancer patients in the Kurdistan Cancer Registry from 2000 to 2009. Information on time and cause of death was obtained from the Registrar's office, and information on tumour type, stage, and anatomic location was extracted from patients' hospital records. Additional demographic information was collected via telephone interview. We calculated 3- and 5-year survival. Survival experiences in different groups were compared using the log-rank test. A Cox proportional hazards model was built, and hazard ratios with their 95% confidence intervals were calculated. Of a total of 1353 patients, contact information was available for 667, all of whom were followed up; 472 telephone interviews were conducted. Mean follow-up time was 34 months. We identified 78 deaths in this group of patients, 44 of which were due to skin cancer. After controlling for confounding, tumour type, anatomical location, and disease stage remained significantly associated with survival. Compared with basal cell carcinoma, the hazard ratio for death was 74.5 (95% CI: 4.8-1146) for squamous cell carcinoma and 24.4 (95% CI: 1.3-485) for melanoma. Compared with stages 1 and 2, the hazard ratio was 16.7 (95% CI: 1.8-156.6) for stage 4 tumours and 16.8 (95% CI: 1.07-260) for stage 3. Tumour stage is independently associated with survival. Relatively low survival rates suggest delayed diagnosis. Increasing public awareness through the media about the warning signs of skin cancer could increase the chance of survival in these patients.
Survival from skin cancer and its associated factors in Kurdistan province of Iran
Ahmadi, Galavizh; Asadi-Lari, Mohsen; Amani, Saeid; Solaymani-Dodaran, Masoud
2015-01-01
Background: We explored survival from skin cancer and its determinants in Kurdistan province of Iran. Methods: In a retrospective cohort design, we identified all registered skin cancer patients in the Kurdistan Cancer Registry from 2000 to 2009. Information on time and cause of death was obtained from the Registrar's office, and information on tumour type, stage, and anatomic location was extracted from patients' hospital records. Additional demographic information was collected via telephone interview. We calculated 3- and 5-year survival. Survival experiences in different groups were compared using the log-rank test. A Cox proportional hazards model was built, and hazard ratios with their 95% confidence intervals were calculated. Results: Of a total of 1353 patients, contact information was available for 667, all of whom were followed up; 472 telephone interviews were conducted. Mean follow-up time was 34 months. We identified 78 deaths in this group of patients, 44 of which were due to skin cancer. After controlling for confounding, tumour type, anatomical location, and disease stage remained significantly associated with survival. Compared with basal cell carcinoma, the hazard ratio for death was 74.5 (95% CI: 4.8-1146) for squamous cell carcinoma and 24.4 (95% CI: 1.3-485) for melanoma. Compared with stages 1 and 2, the hazard ratio was 16.7 (95% CI: 1.8-156.6) for stage 4 tumours and 16.8 (95% CI: 1.07-260) for stage 3. Conclusion: Tumour stage is independently associated with survival. Relatively low survival rates suggest delayed diagnosis. Increasing public awareness through the media about the warning signs of skin cancer could increase the chance of survival in these patients. PMID:26793668
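The log-rank comparison of survival experiences used above can be illustrated with a bare-bones implementation of the two-group log-rank chi-square statistic. This is a sketch on invented data, not the study's analysis; a real analysis would use an established survival library.

```python
# Sketch of the two-group log-rank test statistic (illustrative only).

def logrank_statistic(times1, events1, times2, events2):
    """Chi-square statistic of the two-group log-rank test.
    times*: follow-up times; events*: 1 = death, 0 = censored."""
    all_event_times = sorted(
        set(t for t, e in zip(times1, events1) if e == 1)
        | set(t for t, e in zip(times2, events2) if e == 1)
    )
    o_minus_e = 0.0  # observed minus expected deaths in group 1
    var = 0.0        # variance of that quantity
    for u in all_event_times:
        n1 = sum(1 for t in times1 if t >= u)  # at risk, group 1
        n2 = sum(1 for t in times2 if t >= u)  # at risk, group 2
        d1 = sum(1 for t, e in zip(times1, events1) if t == u and e == 1)
        d2 = sum(1 for t, e in zip(times2, events2) if t == u and e == 1)
        n, d = n1 + n2, d1 + d2
        if n < 2:
            continue
        o_minus_e += d1 - d * n1 / n
        var += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)
    return o_minus_e ** 2 / var

# invented example: group 2 tends to die later than group 1
g1_t, g1_e = [1, 2, 3, 4, 5], [1, 1, 1, 0, 1]
g2_t, g2_e = [3, 5, 6, 8, 9], [1, 0, 1, 1, 0]
chi2 = logrank_statistic(g1_t, g1_e, g2_t, g2_e)  # compare to chi-square(1)
```

Under the null hypothesis of equal survival, the statistic follows a chi-square distribution with one degree of freedom, which is how the p-values above would be obtained.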
Probabilistic Survivability Versus Time Modeling
NASA Technical Reports Server (NTRS)
Joyner, James J., Sr.
2015-01-01
This technical paper documents the Kennedy Space Center Independent Assessment team's work on three assessments for the Ground Systems Development and Operations (GSDO) Program, completed to assist the Chief Safety and Mission Assurance Officer (CSO) and GSDO management during key programmatic reviews. The assessments provided the GSDO Program with an analysis of how egress time affects the likelihood of astronaut and worker survival during an emergency. For each assessment, the team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine whether a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building (VAB). Based on the composite survivability-versus-time graphs from the first two assessments, there was a soft knee in the figure-of-merit graphs at eight minutes (ten minutes after egress was ordered). Thus, the graphs illustrated to the decision makers that the final emergency egress design selected should be capable of transporting the flight crew from the top of LC-39B to a safe location in eight minutes or less. Results for the third assessment were dominated by hazards that were classified as instantaneous in nature (e.g., stacking mishaps) and therefore had no effect on survivability versus time to egress the VAB. VAB emergency scenarios that degraded over time (e.g., fire) produced survivability-versus-time graphs that were in line with aerospace industry norms.
Serrano, Pablo E; Cleary, Sean P; Dhani, Neesha; Kim, Peter T W; Greig, Paul D; Leung, Kenneth; Moulton, Carol-Anne; Gallinger, Steven; Wei, Alice C
2015-04-01
Despite reduced perioperative mortality and routine use of adjuvant therapy following pancreatectomy for pancreatic ductal adenocarcinoma (PDAC), improvement in long-term outcome has been difficult to ascertain. This study compares outcomes in patients undergoing resection for PDAC within a single, high-volume academic institution over two sequential time periods. Retrospective review of patients with resected PDAC, in two cohorts: period 1 (P1), 1991-2000; and period 2 (P2), 2001-2010. Univariate and multivariate analyses using the Cox proportional hazards model were performed to determine prognostic factors associated with long-term survival. Survival was evaluated using Kaplan-Meier analyses. A total of 179 pancreatectomies were performed during P1 and 310 during P2. Perioperative mortality was 6.7 % (12/179) in P1 and 1.6 % (5/310) in P2 (p = 0.003). P2 had a greater number of lymph nodes resected (17 [0-50] vs. 7 [0-31]; p < 0.001), and a higher lymph node positivity rate (69 % [215/310] vs. 58 % [104/179]; p = 0.021) compared with P1. The adjuvant therapy rate was 30 % (53/179) in P1 and 63 % (195/310) in P2 (p < 0.001). By multivariate analysis, node and margin status, tumor grade, adjuvant therapy, and time period of resection were independently associated with overall survival (OS) for both time periods. Median OS was 16 months (95 % confidence interval [CI] 14-20) in P1 and 27 months (95 % CI 24-30) in P2 (p < 0.001). Factors associated with improved long-term survival remain comparable over time. Short- and long-term survival for patients with resected PDAC has improved over time due to decreased perioperative mortality and increased use of adjuvant therapy, although the proportion of 5-year survivors remains small.
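The median overall survival figures reported above are read off Kaplan-Meier curves as the earliest time at which the estimated survival function drops to 0.5 or below. A minimal sketch on invented follow-up data (the study's actual curves and confidence intervals are not reproduced):

```python
# Reading median survival off a Kaplan-Meier curve (invented data).

def km_curve(times, events):
    """Kaplan-Meier steps [(t, S(t))]; event = 1 is death, 0 is censoring."""
    steps, surv = [], 1.0
    for u in sorted(set(t for t, e in zip(times, events) if e == 1)):
        d = sum(1 for t, e in zip(times, events) if t == u and e == 1)
        n = sum(1 for t in times if t >= u)  # at risk just before u
        surv *= 1 - d / n
        steps.append((u, surv))
    return steps

def median_survival(times, events):
    """Earliest time at which estimated survival drops to 0.5 or below."""
    for t, s in km_curve(times, events):
        if s <= 0.5:
            return t
    return None  # median not reached within follow-up

# invented follow-up times in months (1 = death, 0 = censored)
months = [4, 7, 9, 12, 15, 20, 26, 30]
died = [1, 1, 0, 1, 1, 0, 1, 0]
med = median_survival(months, died)  # 15 months on this toy data
```

Censored patients still contribute to the at-risk counts up to their censoring time, which is why the median cannot simply be taken from the observed death times.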
De la Cruz, Susan E W; Takekawa, John Y; Spragens, Kyle A; Yee, Julie; Golightly, Richard T; Massey, Greg; Henkel, Laird A; Scott Larsen, R; Ziccardi, Michael
2013-02-15
Birds are often the most numerous vertebrates damaged and rehabilitated in marine oil spills; however, the efficacy of avian rehabilitation is frequently debated and rarely examined experimentally. We compared survival of three radio-marked treatment groups, oiled, rehabilitated (ORHB), un-oiled, rehabilitated (RHB), and un-oiled, non-rehabilitated (CON), in an experimental approach to examine post-release survival of surf scoters (Melanitta perspicillata) following the 2007 M/V Cosco Busan spill in San Francisco Bay. Live encounter-dead recovery modeling indicated that survival differed among treatment groups and over time since release. The survival estimate (±SE) for ORHB was 0.143±0.107 compared to CON (0.498±0.168) and RHB groups (0.772±0.229), suggesting scoters tolerated the rehabilitation process itself well, but oiling resulted in markedly lower survival. Future efforts to understand the physiological effects of oil type and severity on scoters are needed to improve post-release survival of this species. Published by Elsevier Ltd.
Survival in Norwegian BRCA1 mutation carriers with breast cancer.
Hagen, Anne Irene; Tretli, Steinar; Maehle, Lovise; Apold, Jaran; Vedå, Nina; Møller, Pål
2009-04-14
Several studies of survival in women with BRCA1 mutations have shown either reduced survival or no difference compared to controls. Programmes for early detection and treatment of inherited breast cancer have failed to demonstrate a significant improvement in survival in BRCA1 mutation carriers. One hundred and sixty-seven women with disease-associated germline BRCA1 mutations and breast cancer from 1980 to 2001 were identified. Tumour characteristics, treatment given, and survival were recorded. A control group comprising three hundred and four women matched for age, time of diagnosis, and stage was used to compare survival. BRCA1 mutation carriers were found to have a poorer prognosis, which could be explained by neither the mode of surgical treatment nor the use of adjuvant chemotherapy. BRCA1 mutation carriers with node-negative breast cancer had worse overall survival than controls. Our findings confirm the serious prognosis of BRCA1-associated breast cancer even when diagnosed at an early stage, and that type of treatment does not influence prognosis.
Survival in Norwegian BRCA1 mutation carriers with breast cancer
Hagen, Anne Irene; Tretli, Steinar; Mæhle, Lovise; Apold, Jaran; Vedå, Nina; Møller, Pål
2009-01-01
Several studies of survival in women with BRCA1 mutations have shown either reduced survival or no difference compared to controls. Programmes for early detection and treatment of inherited breast cancer have failed to demonstrate a significant improvement in survival in BRCA1 mutation carriers. One hundred and sixty-seven women with disease-associated germline BRCA1 mutations and breast cancer from 1980 to 2001 were identified. Tumour characteristics, treatment given, and survival were recorded. A control group comprising three hundred and four women matched for age, time of diagnosis, and stage was used to compare survival. BRCA1 mutation carriers were found to have a poorer prognosis, which could be explained by neither the mode of surgical treatment nor the use of adjuvant chemotherapy. BRCA1 mutation carriers with node-negative breast cancer had worse overall survival than controls. Our findings confirm the serious prognosis of BRCA1-associated breast cancer even when diagnosed at an early stage, and that type of treatment does not influence prognosis. PMID:19366445
De La Cruz, Susan E. W.; Takekawa, John Y.; Spragens, Kyle A.; Yee, Julie; Golightly, Richard T.; Massey, Greg; Henkel, Laird A.; Larsen, Scott; Ziccardi, Michael
2013-01-01
Birds are often the most numerous vertebrates damaged and rehabilitated in marine oil spills; however, the efficacy of avian rehabilitation is frequently debated and rarely examined experimentally. We compared survival of three radio-marked treatment groups, oiled, rehabilitated (ORHB), un-oiled, rehabilitated (RHB), and un-oiled, non-rehabilitated (CON), in an experimental approach to examine post-release survival of surf scoters (Melanitta perspicillata) following the 2007 M/V Cosco Busan spill in San Francisco Bay. Live encounter-dead recovery modeling indicated that survival differed among treatment groups and over time since release. The survival estimate (±SE) for ORHB was 0.143 ± 0.107 compared to CON (0.498 ± 0.168) and RHB groups (0.772 ± 0.229), suggesting scoters tolerated the rehabilitation process itself well, but oiling resulted in markedly lower survival. Future efforts to understand the physiological effects of oil type and severity on scoters are needed to improve post-release survival of this species.
Wang, Rong; Cheng, Nan; Xiao, Cang-Song; Wu, Yang; Sai, Xiao-Yong; Gong, Zhi-Yun; Wang, Yao; Gao, Chang-Qing
2017-01-01
Background: The optimal timing of surgical revascularization for patients presenting with ST-segment elevation myocardial infarction (STEMI) and impaired left ventricular function is not well established. This study aimed to examine the timing of surgical revascularization after STEMI in patients with ischemic heart disease and left ventricular dysfunction (LVD) by comparing early and late results. Methods: From January 2003 to December 2013, 2276 patients underwent isolated coronary artery bypass grafting (CABG) in our institution. Two hundred and sixty-four patients (223 males, 41 females) with a history of STEMI and LVD were divided into early revascularization (ER, <3 weeks), mid-term revascularization (MR, 3 weeks to 3 months), and late revascularization (LR, >3 months) groups according to the time interval from STEMI to CABG. Mortality and complication rates were compared among the groups by Fisher's exact test. Cox regression analyses were performed to examine the effect of the timing of surgery on long-term survival. Results: No significant differences in 30-day mortality, long-term survival, freedom from all-cause death, or rehospitalization for heart failure existed among the groups (P > 0.05). More patients in the ER group (12.90%) had low cardiac output syndrome than in the MR (2.89%) and LR (3.05%) groups (P = 0.035). The mean follow-up times were 46.72 ± 30.65, 48.70 ± 32.74, and 43.75 ± 32.43 months, respectively (P = 0.716). Cox regression analyses showed that a severe preoperative condition (odds ratio = 7.13, 95% confidence interval 2.05–24.74, P = 0.002), rather than the time interval from myocardial infarction to CABG (P > 0.05), was a risk factor for long-term survival. Conclusions: Surgical revascularization for patients with STEMI and LVD can be performed at different times after STEMI with comparable operative mortality and long-term survival.
However, ER (<3 weeks) has a higher incidence of postoperative low cardiac output syndrome. A severe preoperative condition, rather than the time interval from STEMI to CABG, is a risk factor for long-term survival. PMID:28218210
Dialysis modality and survival: Done to death.
Trinh, Emilie; Chan, Christopher T; Perl, Jeffrey
2018-03-14
The debate surrounding whether peritoneal dialysis or hemodialysis is associated with differential survival continues as the numerous comparative studies over the past 3 decades have yielded conflicting results. Findings have also evolved over time in the setting of changing patient characteristics, advances in dialytic technologies, and the use of more robust statistical and epidemiologic approaches. Here, we will critically review the body of evidence, both historical and contemporary, comparing survival across dialysis modalities. Significant limitations of the observational nature of the current literature will be highlighted given that no adequately powered randomized controlled trials exist. Given the lack of consistency and limitations of current studies, coupled with the poor survival across both modalities, we can likely conclude that survival comparisons between both modalities do not appreciably differ. Hence, the choice of dialysis modality should not be dictated by survival comparisons, but rather be based on an individualized and informed decision making that places patient preference and lifestyle considerations at the forefront, while integrating medical factors and availability of resources and support. The emphasis of future research should move beyond survival outcomes when comparing dialysis modalities, and instead be redirected to patient-endorsed and patient-reported outcomes. © 2018 Wiley Periodicals, Inc.
Survival of Parents and Siblings of Supercentenarians
Perls, Thomas; Kohler, Iliana V.; Andersen, Stacy; Schoenhofen, Emily; Pennington, JaeMi; Young, Robert; Terry, Dellara; Elo, Irma T.
2011-01-01
Background Given previous evidence of familial predisposition for longevity, we hypothesized that siblings and parents of supercentenarians (age ≥ 110 years) were predisposed to survival to very old age and that, relative to their birth cohorts, their relative survival probabilities (RSPs) are even higher than what has been observed for the siblings of centenarians. Methods Mean age at death conditional upon survival to ages 20 and 50 and survival probabilities from ages 20 and 50 to higher ages were determined for 50 male and 56 female siblings and 54 parents of 29 supercentenarians. These estimates were contrasted with comparable estimates based on birth cohort-specific mortality experience for the United States and Sweden. Results Conditional on survival to age 20 years, mean age at death of supercentenarians’ siblings was ~81 years for men and women. Compared with respective Swedish and U.S. birth cohorts, these estimates were 17%–20% (12–14 years) higher for the brothers and 11%–14% (8–10 years) higher for the sisters. Sisters had a 2.9 times greater probability and brothers had a 4.3 times greater probability of survival from age 20 to age 90. Mothers of supercentenarians had a 5.8 times greater probability of surviving from age 50 to age 90. Fathers also experienced an increased survival probability from age 50 to age 90 of 2.7, but it failed to attain statistical significance. Conclusions The RSPs of siblings and mothers of supercentenarians revealed a substantial survival advantage and were most pronounced at the oldest ages. The RSP to age 90 for siblings of supercentenarians was approximately the same as that reported for siblings of centenarians. It is possible that greater RSPs are observed for reaching even higher ages such as 100 years, but a larger sample of supercentenarians and their siblings and parents is needed to investigate this possibility. PMID:17895443
Survival of parents and siblings of supercentenarians.
Perls, Thomas; Kohler, Iliana V; Andersen, Stacy; Schoenhofen, Emily; Pennington, JaeMi; Young, Robert; Terry, Dellara; Elo, Irma T
2007-09-01
Given previous evidence of familial predisposition for longevity, we hypothesized that siblings and parents of supercentenarians (age ≥ 110 years) were predisposed to survival to very old age and that, relative to their birth cohorts, their relative survival probabilities (RSPs) are even higher than what has been observed for the siblings of centenarians. Mean age at death conditional upon survival to ages 20 and 50 and survival probabilities from ages 20 and 50 to higher ages were determined for 50 male and 56 female siblings and 54 parents of 29 supercentenarians. These estimates were contrasted with comparable estimates based on birth cohort-specific mortality experience for the United States and Sweden. Conditional on survival to age 20 years, mean age at death of supercentenarians' siblings was approximately 81 years for men and women. Compared with respective Swedish and U.S. birth cohorts, these estimates were 17%-20% (12-14 years) higher for the brothers and 11%-14% (8-10 years) higher for the sisters. Sisters had a 2.9 times greater probability and brothers had a 4.3 times greater probability of survival from age 20 to age 90. Mothers of supercentenarians had a 5.8 times greater probability of surviving from age 50 to age 90. Fathers also experienced an increased survival probability from age 50 to age 90 of 2.7, but it failed to attain statistical significance. The RSPs of siblings and mothers of supercentenarians revealed a substantial survival advantage and were most pronounced at the oldest ages. The RSP to age 90 for siblings of supercentenarians was approximately the same as that reported for siblings of centenarians. It is possible that greater RSPs are observed for reaching even higher ages such as 100 years, but a larger sample of supercentenarians and their siblings and parents is needed to investigate this possibility.
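The relative survival probabilities (RSPs) reported above are, in essence, a ratio: the probability of surviving from age a to age b in the study group, divided by the corresponding probability in the matched birth cohort. A toy calculation follows; all counts and the cohort probability are hypothetical, chosen only to mimic the order of magnitude in the text, and are not the study's data.

```python
# Relative survival probability (RSP) sketch with invented numbers.

def relative_survival_probability(alive_at_a, alive_at_b, cohort_prob):
    """alive_at_a / alive_at_b: study-group counts alive at ages a and b;
    cohort_prob: birth-cohort probability of surviving from a to b."""
    return (alive_at_b / alive_at_a) / cohort_prob

# hypothetical: 56 sisters alive at age 20, 13 still alive at age 90,
# against a birth-cohort probability of 0.08 of surviving from 20 to 90
rsp = relative_survival_probability(56, 13, 0.08)  # about 2.9
```

An RSP above 1 indicates a survival advantage over the general birth cohort; the study's finding is that this ratio grows at the oldest attained ages.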
Geographical variation in cancer survival in England, 1991–2006: an analysis by Cancer Network
Quaresma, Manuela; Coleman, Michel P; Gordon, Emma; Forman, David; Rachet, Bernard
2011-01-01
Background Reducing geographical inequalities in cancer survival in England was a key aim of the Calman–Hine Report (1995) and the NHS Cancer Plan (2000). This study assesses whether geographical inequalities changed following these policy developments by analysing the trend in 1-year relative survival in the 28 cancer networks of England. Methods Population-based age-standardised relative survival at 1 year is estimated for 1.4 million patients diagnosed with cancer of the oesophagus, stomach, colon, lung, breast (women) or cervix in England during 1991–2006 and followed up to 2007. Regional and deprivation-specific life tables are built to adjust survival estimates for differences in background mortality. Analysis is divided into three calendar periods: 1991–5, 1996–2000 and 2001–6. Funnel plots are used to assess geographical variation in survival over time. Results One-year relative survival improved for all cancers except cervical cancer. There was a wide geographical variation in survival with generally lower estimates in northern England. This north–south divide became less marked over time, although the overall number of cancer networks that were lower outliers compared with the England value remained stable. Breast cancer was the only cancer for which there was a marked reduction in geographical inequality in survival over time. Conclusion Policy changes over the past two decades coincided with improved relative survival, without an increase in geographical variation. The north–south divide in relative survival became less pronounced over time but geographical inequalities persist. The reduction in geographical inequality in breast cancer survival may be followed by a similar trend for other cancers, provided government recommendations are implemented similarly. PMID:21321064
Contribution of surgical specialization to improved colorectal cancer survival.
Oliphant, R; Nicholson, G A; Horgan, P G; Molloy, R G; McMillan, D C; Morrison, D S
2013-09-01
Reorganization of colorectal cancer services has led to surgery being increasingly, but not exclusively, delivered by specialist surgeons. Outcomes from colorectal cancer surgery have improved, but the exact determinants remain unclear. This study explored the determinants of outcome after colorectal cancer surgery over time. Postoperative mortality (within 30 days of surgery) and 5-year relative survival rates for patients in the West of Scotland undergoing surgery for colorectal cancer between 1991 and 1994 were compared with rates for those having surgery between 2001 and 2004. The 1823 patients who had surgery in 2001-2004 were more likely to have had stage I or III tumours, and to have undergone surgery with curative intent than the 1715 patients operated on in 1991-1994. The proportion of patients presenting electively who received surgery by a specialist surgeon increased over time (from 14·9 to 72·8 per cent; P < 0·001). Postoperative mortality increased among patients treated by non-specialists over time (from 7·4 to 10·3 per cent; P = 0·026). Non-specialist surgery was associated with an increased risk of postoperative death (adjusted odds ratio 1·72, 95 per cent confidence interval (c.i.) 1·17 to 2·55; P = 0·006) compared with specialist surgery. The 5-year relative survival rate increased over time and was higher among those treated by specialist compared with non-specialist surgeons (62·1 versus 53·0 per cent; P < 0·001). Compared with the earlier period, the adjusted relative excess risk ratio for the later period was 0·69 (95 per cent c.i. 0·61 to 0·79; P < 0·001). Increased surgical specialization accounted for 18·9 per cent of the observed survival improvement. Increased surgical specialization contributed significantly to the observed improvement in longer-term survival following colorectal cancer surgery. © 2013 British Journal of Surgery Society Ltd. Published by John Wiley & Sons Ltd.
Balakrishnan, Nanthini; Teo, Soo-Hwang; Sinnadurai, Siamala; Bhoo Pathy, Nanthini Thevi; See, Mee-Hoong; Taib, Nur Aishah; Yip, Cheng-Har; Bhoo Pathy, Nirmala
2017-11-01
Reproductive factors are associated with risk of breast cancer, but the association with breast cancer survival is less well known. Previous studies have reported conflicting results on the association between time since last childbirth and breast cancer survival. We determined the association between time since last childbirth (LCB) and survival of women with premenopausal and postmenopausal breast cancers in Malaysia. A historical cohort of 986 premenopausal and 1123 postmenopausal parous breast cancer patients diagnosed from 2001 to 2012 at the University Malaya Medical Centre were included in the analyses. Time since LCB was categorized into quintiles. Multivariable Cox regression was used to determine whether time since LCB was associated with survival following breast cancer, adjusting for demographic, tumor, and treatment characteristics. Premenopausal breast cancer patients with the most recent childbirth (LCB quintile 1) were younger, more likely to present with unfavorable prognostic profiles, and had the lowest 5-year overall survival (OS) (66.9; 95% CI 60.2-73.6%), compared to women with a longer duration since LCB (quintiles 2 through 5). In univariable analysis, time since LCB was inversely associated with risk of mortality, and the hazard ratios for LCB quintiles 2, 3, 4, and 5 versus quintile 1 were 0.53 (95% CI 0.36-0.77), 0.49 (95% CI 0.33-0.75), 0.61 (95% CI 0.43-0.85), and 0.64 (95% CI 0.44-0.93), respectively; P trend = 0.016. However, this association was attenuated substantially following adjustment for age at diagnosis and other prognostic factors. Similarly, postmenopausal breast cancer patients with the most recent childbirth were also more likely to present with unfavorable disease profiles. Compared to postmenopausal breast cancer patients in LCB quintile 1, patients in quintile 5 had a higher risk of mortality. This association was not significant following multivariable adjustment.
Time since LCB is not independently associated with survival in premenopausal or postmenopausal breast cancers. The apparent increase in risks of mortality in premenopausal breast cancer patients with a recent childbirth, and postmenopausal patients with longer duration since LCB, appear to be largely explained by their age at diagnosis.
Survival of Salmonella enterica in poultry feed is strain dependent
Andino, Ana; Pendleton, Sean; Zhang, Nan; Chen, Wei; Critzer, Faith; Hanning, Irene
2014-01-01
Feed components have low water activity, making bacterial survival difficult. The mechanisms of Salmonella survival in feed and subsequent colonization of poultry are unknown. The purpose of this research was to compare the ability of Salmonella serovars and strains to survive in broiler feed and to evaluate molecular mechanisms associated with survival and colonization by measuring the expression of genes associated with colonization (hilA, invA) and survival via fatty acid synthesis (cfa, fabA, fabB, fabD). Feed was inoculated with 1 of 15 strains of Salmonella enterica consisting of 11 serovars (Typhimurium, Enteritidis, Kentucky, Senftenberg, Heidelberg, Mbandaka, Newport, Bareilly, Javiana, Montevideo, and Infantis). To inoculate feed, cultures were suspended in PBS and survival was evaluated by plating samples onto XLT4 agar plates at specific time points (0 h, 4 h, 8 h, 24 h, 4 d, and 7 d). To evaluate gene expression, RNA was extracted from the samples at the specific time points (0, 4, 8, and 24 h) and gene expression was measured with real-time PCR. The largest reductions in Salmonella occurred at the first and third sampling time points (4 h and 4 d), with average reductions of 1.9 and 1.6 log cfu per g, respectively. For the remaining time points (8 h, 24 h, and 7 d), the average reduction was less than 1 log cfu per g (0.6, 0.4, and 0.6, respectively). Most strains upregulated cfa (cyclopropane fatty acid synthesis) within 8 h, which would modify the fluidity of the cell wall to aid in survival. There was a weak negative correlation between survival and virulence gene expression, indicating downregulation to focus energy on other gene expression efforts such as survival-related genes. These data indicate that the ability of strains to survive over time in poultry feed was strain dependent and that upregulation of cyclopropane fatty acid synthesis and downregulation of virulence genes were associated with a response to desiccation stress. PMID:24570467
Survival of Salmonella enterica in poultry feed is strain dependent.
Andino, Ana; Pendleton, Sean; Zhang, Nan; Chen, Wei; Critzer, Faith; Hanning, Irene
2014-02-01
Feed components have low water activity, making bacterial survival difficult. The mechanisms of Salmonella survival in feed and subsequent colonization of poultry are unknown. The purpose of this research was to compare the ability of Salmonella serovars and strains to survive in broiler feed and to evaluate molecular mechanisms associated with survival and colonization by measuring the expression of genes associated with colonization (hilA, invA) and survival via fatty acid synthesis (cfa, fabA, fabB, fabD). Feed was inoculated with 1 of 15 strains of Salmonella enterica consisting of 11 serovars (Typhimurium, Enteritidis, Kentucky, Senftenberg, Heidelberg, Mbandaka, Newport, Bareilly, Javiana, Montevideo, and Infantis). To inoculate feed, cultures were suspended in PBS and survival was evaluated by plating samples onto XLT4 agar plates at specific time points (0 h, 4 h, 8 h, 24 h, 4 d, and 7 d). To evaluate gene expression, RNA was extracted from the samples at the specific time points (0, 4, 8, and 24 h) and gene expression was measured with real-time PCR. The largest reductions in Salmonella occurred at the first and third sampling time points (4 h and 4 d), with average reductions of 1.9 and 1.6 log cfu per g, respectively. For the remaining time points (8 h, 24 h, and 7 d), the average reduction was less than 1 log cfu per g (0.6, 0.4, and 0.6, respectively). Most strains upregulated cfa (cyclopropane fatty acid synthesis) within 8 h, which would modify the fluidity of the cell wall to aid in survival. There was a weak negative correlation between survival and virulence gene expression, indicating downregulation to focus energy on other gene expression efforts such as survival-related genes. These data indicate that the ability of strains to survive over time in poultry feed was strain dependent and that upregulation of cyclopropane fatty acid synthesis and downregulation of virulence genes were associated with a response to desiccation stress.
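The reported reductions in "log cfu per g" are simply differences of base-10 logarithms of the plate counts at two time points. A minimal sketch with invented counts (not the study's measurements):

```python
# Log reduction from plate counts: difference of base-10 logs of
# CFU per gram at two time points (counts below are invented).
import math

def log_reduction(cfu_initial, cfu_final):
    """Base-10 log reduction in colony-forming units per gram."""
    return math.log10(cfu_initial) - math.log10(cfu_final)

# hypothetical: an inoculum of 1e7 CFU/g falling to 10**5.1 CFU/g by 4 h
reduction_4h = log_reduction(1e7, 10 ** 5.1)  # 1.9 log cfu per g
```

Working on the log scale makes reductions additive across intervals, which is why microbial survival data are conventionally summarized this way.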
The effects of geography on survival in patients with oral cavity squamous cell carcinoma.
Zhang, Han; Dziegielewski, Peter T; Jean Nguyen, T T; Jeffery, Caroline C; O'Connell, Daniel A; Harris, Jeffrey R; Seikaly, Hadi
2015-06-01
To assess the survival outcomes of oral cavity squamous cell carcinoma (OCSCC) by differing geographical location. Demographic, pathologic, treatment, and survival data were obtained from OCSCC patients from 1998-2010 in Alberta, Canada. Of 660 OCSCC patients, 554 were included. Overall, disease-specific, and disease-free survivals were estimated with Kaplan-Meier and Cox regression analyses. Patients were grouped by geographic location. Patients from urban locations had improved overall, disease-specific, and disease-free survival compared to those from rural locations (p<0.05). Two- and five-year estimates of overall survival were significantly higher in the urban cohort at 84% and 78%, versus 48% and 44% in the rural cohort, respectively (p<0.05). Disease-specific and disease-free survival rates were also superior in the urban group (p<0.05). Diagnosis-to-treatment time did not differ significantly among the 3 geographical groups (p>0.05). This study shows that patients with OCSCC living in urban settings have improved survival compared to rural groups. Copyright © 2015 Elsevier Ltd. All rights reserved.
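The Kaplan-Meier product-limit estimator used in analyses like this one can be sketched in a few lines. This is a generic illustration with hypothetical data, not the study's analysis:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimates from (time, event) pairs;
    event = 1 marks a death, event = 0 a censored observation."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk, s, curve = len(times), 1.0, []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = leaving = 0
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]   # deaths at this distinct time
            leaving += 1                 # deaths + censorings leave the risk set
            i += 1
        if deaths:
            s *= 1 - deaths / at_risk    # product-limit step
            curve.append((t, s))
        at_risk -= leaving
    return curve

# hypothetical data: deaths at t = 1, 3, 4; one censoring at t = 2
print(kaplan_meier([1, 2, 3, 4], [1, 0, 1, 1]))  # [(1, 0.75), (3, 0.375), (4, 0.0)]
```

Censored subjects leave the risk set without contributing a factor to the product, which is what distinguishes this estimator from a naive survival proportion.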
Bowles, K; DeSandre-Robinson, D; Kubicek, L; Lurie, D; Milner, R; Boston, S E
2016-12-01
Local control is a major challenge in treating canine nasal tumours. Surgical cytoreduction prior to radiation therapy has not been shown to offer a survival advantage. Only one study has previously evaluated the outcome when surgery is performed after radiation, which demonstrated an improved survival time compared with radiation alone. The purpose of this study was to investigate the outcome of surgery after definitive radiation on survival times in dogs with sinonasal tumours. Medical records were retrospectively reviewed for dogs with nasal tumours that received definitive radiation followed by surgery. Information obtained from medical record review included signalment, diagnosis, treatment and outcome. The median survival time was 457 days. No long-term side effects were observed. These findings suggest that exenteration of the nasal cavity following definitive radiation for treatment of dogs with nasal tumours is well-tolerated and provides a similar survival duration to previous reports of radiation alone. © 2014 John Wiley & Sons Ltd.
Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models
Gelfand, Lois A.; MacKinnon, David P.; DeRubeis, Robert J.; Baraldi, Amanda N.
2016-01-01
Objective: Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. Method: We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. Results: AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome—underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. Conclusions: When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results. PMID:27065906
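In the Weibull AFT formulation the abstract favors (the log-linear form used by SAS LIFEREG), log survival time is linear in covariates with an extreme-value error, so a coefficient b1 acts multiplicatively on time: the ratio of median survival times between groups equals exp(b1). A small simulation sketch with hypothetical coefficients, not the paper's simulation design:

```python
import math
import random

random.seed(42)

def weibull_aft_sample(x, b0=2.0, b1=0.5, sigma=0.5):
    # Weibull AFT: log(T) = b0 + b1*x + sigma*W, where W is a standard
    # extreme-value (Gumbel minimum) variate, making T Weibull-distributed
    u = random.random()
    w = math.log(-math.log(1.0 - u))  # inverse-CDF draw of W
    return math.exp(b0 + b1 * x + sigma * w)

n = 20000
t0 = sorted(weibull_aft_sample(0) for _ in range(n))
t1 = sorted(weibull_aft_sample(1) for _ in range(n))
ratio = t1[n // 2] / t0[n // 2]  # ratio of sample medians ~ exp(b1) = 1.65
```

This direct "time ratio" reading of AFT coefficients is what integrates cleanly into mediation products of coefficients, in contrast to hazard ratios from PH models.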
Perez-Cruz, Pedro E; Dos Santos, Renata; Silva, Thiago Buosi; Crovador, Camila Souza; Nascimento, Maria Salete de Angelis; Hall, Stacy; Fajardo, Julieta; Bruera, Eduardo; Hui, David
2014-11-01
Survival prognostication is important during the end of life. The accuracy of clinician prediction of survival (CPS) over time has not been well characterized. The aims of the study were to examine changes in prognostication accuracy during the last 14 days of life in a cohort of patients with advanced cancer admitted to two acute palliative care units and to compare the accuracy between the temporal and probabilistic approaches. Physicians and nurses prognosticated survival daily for cancer patients in two hospitals until death/discharge using two prognostic approaches: temporal and probabilistic. We assessed accuracy for each method daily during the last 14 days of life, comparing accuracy at Day -14 (baseline) with accuracy at each time point using a test of proportions. Physicians and nurses provided a total of 6718 temporal and 6621 probabilistic estimations for 311 patients. Median (interquartile range) survival was 8 days (4-20 days). Temporal CPS had low accuracy (10%-40%) and did not change over time. In contrast, probabilistic CPS was significantly more accurate (P < .05 at each time point) but decreased close to death. Probabilistic CPS was consistently more accurate than temporal CPS over the last 14 days of life; however, its accuracy decreased as patients approached death. Our findings suggest that better tools to predict impending death are necessary. Copyright © 2014 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.
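The "test of proportions" comparing accuracy at each day with baseline is the standard two-sample z test on proportions; a minimal sketch (the counts below are hypothetical, not the study's):

```python
import math

def two_proportion_z(k1, n1, k2, n2):
    """Two-sample z statistic for H0: p1 == p2, using the pooled
    proportion to form the standard error."""
    p1, p2 = k1 / n1, k2 / n2
    pooled = (k1 + k2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# e.g. 85/100 accurate predictions at baseline vs 40/100 near death
z = two_proportion_z(85, 100, 40, 100)  # |z| > 1.96 => significant at the 5% level
```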
Gong, Qi; Schaubel, Douglas E
2017-03-01
Treatments are frequently evaluated in terms of their effect on patient survival. In settings where randomization of treatment is not feasible, observational data are employed, necessitating correction for covariate imbalances. Treatments are usually compared using a hazard ratio. Most existing methods which quantify the treatment effect through the survival function are applicable to treatments assigned at time 0. In the data structure of our interest, subjects typically begin follow-up untreated; time-until-treatment and the pretreatment death hazard are both heavily influenced by longitudinal covariates; and subjects may experience periods of treatment ineligibility. We propose semiparametric methods for estimating the average difference in restricted mean survival time attributable to a time-dependent treatment, i.e., the average effect of treatment among the treated under current treatment assignment patterns. The pre- and posttreatment models are partly conditional, in that they use the covariate history up to the time of treatment. The pretreatment model is estimated through recently developed landmark analysis methods. For each treated patient, fitted pre- and posttreatment survival curves are projected out, then averaged in a manner which accounts for the censoring of treatment times. Asymptotic properties are derived and evaluated through simulation. The proposed methods are applied to liver transplant data in order to estimate the effect of liver transplantation on survival among transplant recipients under current practice patterns. © 2016, The International Biometric Society.
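Restricted mean survival time, the estimand here, is the area under the survival curve up to a horizon tau; for a step survival function it reduces to a sum of rectangles. A generic sketch assuming a fitted curve is already available (not the authors' landmark/projection estimator):

```python
def rmst(times, surv, tau):
    """Area under a right-continuous step survival curve up to tau.
    `times` are sorted event times; surv[k] is S(t) just after
    times[k]; S(t) = 1 before the first event."""
    area, prev_t, prev_s = 0.0, 0.0, 1.0
    for t, s in zip(times, surv):
        if t >= tau:
            break
        area += prev_s * (t - prev_t)  # rectangle up to this event
        prev_t, prev_s = t, s
    area += prev_s * (tau - prev_t)    # final piece up to the horizon
    return area

# hypothetical curve: S drops to 0.8, 0.5, 0.2 at t = 1, 2, 3
print(rmst([1, 2, 3], [0.8, 0.5, 0.2], 4.0))  # 2.5
```

Comparing treated and untreated RMST values gives the "average difference in restricted mean survival time" the abstract describes, in time units rather than as a hazard ratio.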
Vukmir, Rade B
2004-03-01
There are many variables that can have an effect on survival in cardiopulmonary arrest. This study examined the effect of urban, suburban, or rural location on the outcome of prehospital cardiac arrest as a secondary end point in a study evaluating the effect of bicarbonate on survival. The proportion of survivors within each type of EMS provider system as well as response times were compared. This prospective, randomized, double-blind clinical intervention trial enrolled 874 prehospital cardiopulmonary arrest patients encountered by prehospital urban, suburban, and rural regional EMS areas. Population density (patients per square mile) calculation allowed classification into urban (>2000/mi2), suburban (>400/mi2), and rural (0-399/mi2) systems. This group underwent standard advanced cardiac life support (ACLS) intervention with or without early empiric administration of bicarbonate in a 1-mEq/kg dose. A group of demographic, diagnostic, and therapeutic variables were analyzed for their effect on survival. Times were measured from collapse until onset of medical intervention, and survival was measured as the presence of ED vital signs on arrival. Data analysis used chi-squared with Pearson correlation for survivorship and Student t test comparisons for response times. The overall survival rate was approximately 13.9% (110 of 793), ranging from 9% for rural to 14% for suburban and 23% for urban sites among 372 patients (P=.007). Survival differences were associated with classification of arrest locale in this sample: best for urban, followed by suburban and then rural sites. There was no difference in time to bystander cardiopulmonary resuscitation, but medical response time (basic life support) was decreased for suburban or urban sites, and intervention (ACLS) and transport times were decreased for suburban sites alone. Although response times were differentiated by location, they were not necessarily predictive of survival.
Factors other than response time such as patient population or resuscitation skill could influence survival from cardiac arrest occurring in diverse prehospital service areas.
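The density-based system classification above amounts to a simple thresholding rule; a sketch using the reported cutoffs (handling of a site at exactly 2000/mi2 is an assumption, since the abstract's ranges overlap at the boundary):

```python
def ems_system_class(patients_per_sq_mile):
    """Classify an EMS service area by population density using the
    study's reported cutoffs: urban >2000/mi2, suburban >400/mi2,
    rural 0-399/mi2."""
    if patients_per_sq_mile > 2000:
        return "urban"
    if patients_per_sq_mile >= 400:
        return "suburban"
    return "rural"
```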
Yu, Jie; Liang, Ping; Yu, Xiao-ling; Cheng, Zhi-gang; Han, Zhi-yu; Zhang, Xu; Dong, Jun; Mu, Meng-juan; Li, Xin; Wang, Xiao-hui
2014-03-01
To review intermediate-term clinical outcomes of microwave ablation (MWA) compared with open radical nephrectomy (ORN) in small renal cell carcinoma (RCC) patients and to identify prognostic factors associated with the two techniques. This retrospective study was institutional review board-approved. A total of 163 patients (127 men and 36 women) with small RCC (≤4 cm) were included from April 2006 to March 2012. Sixty-five patients underwent MWA and 98 patients underwent ORN. Survival, recurrence, and renal function changes were compared between the two groups. The effect of changes in key parameters (i.e., overall survival, RCC-related survival, and metastasis-free survival) was statistically analyzed with the log-rank test. Although overall survival after MWA was lower than that after ORN (P = .002), RCC-related survival was comparable to ORN (P = .78). Estimated 5-year overall survival rates were 67.3% after MWA and 97.8% after ORN; for RCC-related survival, estimated 5-year rates were 97.1% after MWA and 97.8% after ORN. There was one local tumor recurrence 32 months after MWA and none after ORN. Major complication rates were comparable (P = .81) between the two techniques (MWA, 2.5% vs ORN, 3.1%). The MWA group had less surgical time (P < .001), estimated blood loss (P < .001), and postoperative hospitalization (P < .001). Multivariate analysis showed that age (P = .014), tumor type (P = .003), postoperative urea nitrogen (P = .042), comorbid disease (P = .005), and treatment modality (P < .001) may be predictors of survival rate. In the intermediate term, ultrasonographically guided percutaneous MWA and ORN provide comparable oncologic outcomes. MWA appears to be a safe and effective technique for management of small RCC, with little loss of renal function. RSNA, 2013
He, Liru; Chapple, Andrew; Liao, Zhongxing; Komaki, Ritsuko; Thall, Peter F; Lin, Steven H
2016-10-01
To evaluate radiation modality effects on pericardial effusion (PCE), pleural effusion (PE) and survival in esophageal cancer (EC) patients. We analyzed data from 470 EC patients treated with definitive concurrent chemoradiotherapy (CRT). Bayesian semi-competing risks (SCR) regression models were fit to assess effects of radiation modality and prognostic covariates on the risks of PCE and PE, and death either with or without these preceding events. Bayesian piecewise exponential regression models were fit for overall survival, the time to PCE or death, and the time to PE or death. All models included propensity score as a covariate to correct for potential selection bias. Median times to onset of PCE and PE after RT were 7.1 and 6.1 months for IMRT, and 6.5 and 5.4 months for 3DCRT, respectively. Compared to 3DCRT, the IMRT group had significantly lower risks of PE, PCE, and death. The respective probabilities of a patient being alive without either PCE or PE at 3 years and 5 years were 0.29 and 0.21 for IMRT compared to 0.13 and 0.08 for 3DCRT. In the SCR regression analyses, IMRT was associated with significantly lower risks of PCE (HR = 0.26) and PE (HR = 0.49), and greater overall survival (probability of beneficial effect (pbe) > 0.99), after controlling for known clinical prognostic factors. IMRT reduces the incidence and postpones the onset of PCE and PE, and increases survival probability, compared to 3DCRT. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
van Benthem, B H; Veugelers, P J; Cornelisse, P G; Strathdee, S A; Kaldor, J M; Shafer, K A; Coutinho, R A; van Griensven, G J
1998-06-18
To investigate the significance of the time from seroconversion to AIDS (incubation time) and other covariates for survival from AIDS to death. In survival analysis, survival from AIDS to death was compared for different categories of incubation time, adjusted and unadjusted for other covariates, and significant predictors for survival from AIDS to death were investigated. Survival after AIDS was not affected by the incubation time in univariate or multivariate analyses. Predictive factors for progression from AIDS to death were age at seroconversion, type of AIDS diagnosis, and CD4 cell count at AIDS. The relative hazard for age at seroconversion increased 1.38-fold over 10 years. Men with a CD4 cell count at AIDS of <130 x 10(6)/l had a twofold higher risk of progression to death than men with higher CD4 cell counts. Persons diagnosed with lymphoma had a sixfold higher risk of progression to death than persons with Kaposi's sarcoma or opportunistic infections. The incubation time, as well as other factors before AIDS, did not affect survival after AIDS. Survival from AIDS to death can be predicted by data obtained at the time of AIDS diagnosis, such as type of diagnosis, age, and CD4 cell count. AIDS seems to be a significant point in progression to death, and not just a floating point between infection and death affected by prior factors, at least for persons who did not receive effective therapy and did not have long incubation times.
Maxwell, Jessica Hooton; Kumar, Bhavna; Feng, Felix Y.; Worden, Francis P.; Lee, Julia; Eisbruch, Avraham; Wolf, Gregory T.; Prince, Mark E.; Moyer, Jeffrey S.; Teknos, Theodoros N.; Chepeha, Douglas B.; McHugh, Jonathan B.; Urba, Susan; Stoerker, Jay; Walline, Heather; Kurnit, David; Cordell, Kitrina G.; Davis, Samantha J.; Ward, Preston D.; Bradford, Carol R.; Carey, Thomas E.
2009-01-01
Purpose The goal of this study was to examine the effect of tobacco use on disease recurrence (local/regional recurrence, distant metastasis, or second primary) among HPV-positive patients with squamous cell carcinoma of the oropharynx (SCCOP) following a complete response to chemoradiation therapy. Experimental Design Between 1999 and 2007, 124 patients with advanced SCCOP (86% with stage IV) and adequate tumor tissue for HPV analysis who were enrolled in one of two consecutive University of Michigan treatment protocols were prospectively included in this study. Patients were categorized as never, former, or current tobacco users. The primary endpoints were risk of disease recurrence and time to recurrence; secondary endpoints were disease-specific survival and overall survival. Results One hundred and two patients (82.3%) had HPV-positive tumors. Over two-thirds (68%) of patients with HPV-positive tumors were tobacco users. Among HPV-positive patients, current tobacco users were at significantly higher risk of disease recurrence than never-tobacco users (hazard ratio = 5.2; confidence interval [1.1-24.4]; p=0.038). Thirty-five percent of HPV-positive ever-tobacco users recurred compared to only 6% of HPV-positive never users and 50% of HPV-negative patients. All HPV-negative patients were tobacco users and had significantly shorter times to recurrence (p=0.002) and reduced disease-specific survival (p=0.004) and overall survival (p<0.001) compared to HPV-positive patients. Compared to HPV-positive never-tobacco users, those with a tobacco history showed a trend for reduced disease-specific survival (p=0.064) but not overall survival (p=0.221). Conclusion Current tobacco users with advanced, HPV-positive SCCOP are at higher risk of disease recurrence compared to never-tobacco users. PMID:20145161
Kargarian-Marvasti, Sadegh; Rimaz, Shahnaz; Abolghasemi, Jamileh; Heydari, Iraj
2017-01-01
Cox proportional hazards model is the most common method for analyzing the effects of several variables on survival time. However, under certain circumstances, parametric models give more precise estimates for survival data than Cox. The purpose of this study was to investigate the comparative performance of Cox and parametric models in a survival analysis of factors affecting the event time of neuropathy in patients with type 2 diabetes. This study included 371 patients with type 2 diabetes without neuropathy who were registered at Fereydunshahr diabetes clinic. Subjects were followed up for the development of neuropathy from 2006 to March 2016. To investigate the factors influencing the event time of neuropathy, significant variables in the univariate model (P < 0.20) were entered into the multivariate Cox and parametric models (P < 0.05). In addition, the Akaike information criterion (AIC) and area under ROC curves were used to evaluate the relative goodness of fit and the efficiency of each procedure, respectively. Statistical computing was performed using R software version 3.2.3 (UNIX platforms, Windows and MacOS). Using the Kaplan-Meier method, survival time to neuropathy was estimated at 76.6 ± 5 months after the initial diagnosis of diabetes. After multivariate analysis of Cox and parametric models, ethnicity, high-density lipoprotein and family history of diabetes were identified as predictors of the event time of neuropathy (P < 0.05). According to AIC, the log-normal model, with the lowest AIC, was the best-fitted model among the Cox and parametric models. According to the comparison of survival receiver operating characteristic curves, the log-normal model was the most efficient and best-fitted model.
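The AIC comparison between parametric families can be sketched for uncensored data: fit each family by maximum likelihood and compute AIC = 2k - 2 log L, preferring the lower value (the study additionally handles censoring, which this toy example omits). The data here are synthetic log-normal draws, so the log-normal family should win:

```python
import math
import random
import statistics

random.seed(1)
# hypothetical event times in months, generated log-normally
data = [math.exp(random.gauss(4.0, 0.6)) for _ in range(200)]

def aic_exponential(t):
    lam = 1.0 / statistics.mean(t)           # MLE of the exponential rate
    loglik = sum(math.log(lam) - lam * x for x in t)
    return 2 * 1 - 2 * loglik                # k = 1 parameter

def aic_lognormal(t):
    logs = [math.log(x) for x in t]
    mu = statistics.mean(logs)
    sigma = statistics.pstdev(logs)          # MLE variance uses 1/n
    loglik = sum(-math.log(x * sigma * math.sqrt(2 * math.pi))
                 - (math.log(x) - mu) ** 2 / (2 * sigma ** 2) for x in t)
    return 2 * 2 - 2 * loglik                # k = 2 parameters

# the family with the lower AIC is the better-fitting parametric model
```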
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barth, R.F.; Matalka, K.Z.; Bailey, M.Q.
The present study was carried out to determine the efficacy of Boron Neutron Capture Therapy (BNCT) for intracerebral melanoma using nude rats, the human melanoma cell line MRA 27, and boronophenylalanine (BPA) as the capture agent. MRA 27 cells (2 x 10(5)) were implanted intracerebrally, and 30 days later, 120 mg of 10B-L-BPA were injected intraperitoneally into nude rats. Thirty days following implantation, tumor-bearing rats were irradiated at the Brookhaven Medical Research Reactor. Six hours following administration of BPA, tumor, blood, and normal brain boron-10 levels were 23.7, 9.4, and 8.4 µg/g, respectively. Median survival time of untreated rats was 44 days, compared to 76 days and 93 days for those receiving physical doses of 2.73 Gy and 3.64 Gy, respectively. Rats that received both 10B-BPA and physical doses of 1.82, 2.73, or 3.64 Gy had median survival times of 170, 182, and 262 days, respectively. Forty percent of rats that had received the highest tumor dose (10.1 Gy) survived for >300 days, and in a replicate experiment 21% of the rats were long-term survivors (>220 days). Animals that received 12 Gy in a single dose or 18 Gy fractionated (2 Gy x 9) of gamma photons from a 137Cs source had median survival times of 86 and 79 days, respectively, compared to 47 days for untreated animals. Histopathologic examination of the brains of long-term surviving rats, euthanized at 8 or 16 months following BNCT, showed no residual tumor, but dense accumulations of melanin-laden macrophages and minimal gliosis were observed. Significant prolongations in median survival time were noted in nude rats with intracerebral human melanoma that had received BNCT, thereby suggesting therapeutic efficacy. Large animal studies should be carried out to further assess BNCT of intracerebral melanoma before any human trials are contemplated. 49 refs., 7 figs., 2 tabs.
Rafat, M; Bazalova, M; Palma, B
Purpose: To characterize the effect of very rapid dose delivery as compared to conventional therapeutic irradiation times on clonogenic cell survival. Methods: We used a Varian Trilogy linear accelerator to deliver doses up to 10 Gy using a 6 MV SRS photon beam. We irradiated four cancer cell lines in times ranging from 30 sec to 30 min. We also used a Varian TrueBeam linear accelerator to deliver 9 MeV electrons at 10 Gy in 10 s to 30 min to determine the effect of irradiation time on cell survival. We then evaluated the effect of using 60 and 120 MeV electrons on cell survival using the Next Linear Collider Test Accelerator (NLCTA) beam line at the SLAC National Accelerator Laboratory. During irradiation, adherent cells were maintained at 37 °C with 20% O2/5% CO2. Clonogenic assays were completed following irradiation to determine changes in cell survival due to dose delivery time and beam quality, and the survival data were fitted with the linear-quadratic model. Results: Cell lines varied in radiosensitivity, ranging from two to four logs of cell kill at 10 Gy for both conventional and very rapid irradiation. Delivering radiation in shorter times decreased survival in all cell lines. Log differences in cell kill ranged from 0.2 to 0.7 at 10 Gy for the short compared to the long irradiation time. Cell kill differences between short and long irradiations were more pronounced as doses increased for all cell lines. Conclusion: Our findings suggest that shortening delivery of therapeutic radiation doses to less than 1 minute may improve tumor cell kill. This study demonstrates the potential advantage of technologies under development to deliver stereotactic ablative radiation doses very rapidly. Bill Loo and Peter Maxim have received Honoraria from Varian and Research Support from Varian and RaySearch.
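The linear-quadratic fit mentioned above models the surviving fraction as S(D) = exp(-(alpha*D + beta*D^2)); a minimal sketch with hypothetical alpha and beta values chosen to give a few logs of cell kill at 10 Gy (the study's fitted parameters are not reported in the abstract):

```python
import math

def lq_surviving_fraction(dose_gy, alpha, beta):
    # linear-quadratic model: S(D) = exp(-(alpha*D + beta*D**2))
    return math.exp(-(alpha * dose_gy + beta * dose_gy ** 2))

# hypothetical parameters: alpha = 0.5 /Gy, beta = 0.04 /Gy^2
sf = lq_surviving_fraction(10.0, 0.5, 0.04)  # about 1.2e-4, i.e. ~3.9 logs of kill
```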
Finotello, R; Stefanello, D; Zini, E; Marconato, L
2017-03-01
Canine hemangiosarcoma (HSA) is a neoplasm of vascular endothelial origin that has an aggressive biological behaviour, with less than 10% of dogs alive at 12 months postdiagnosis. Treatment of choice consists of surgery followed by adjuvant doxorubicin-based chemotherapy. We prospectively compared adjuvant doxorubicin and dacarbazine (ADTIC) to a traditional doxorubicin and cyclophosphamide (AC) protocol, aiming to determine safety and to assess whether this regimen prolongs survival and time to metastasis (TTM). Twenty-seven dogs were enrolled; following staging work-up, 18 were treated with AC and 9 with ADTIC. Median TTM and survival time were longer for dogs treated with ADTIC compared with those receiving AC (>550 versus 112 days, P = 0.021, and >550 versus 142 days, P = 0.011, respectively). Both protocols were well tolerated, without need for dose reduction or increased interval between treatments. A protocol consisting of combined doxorubicin and dacarbazine is safe in dogs with HSA and prolongs TTM and survival time. © 2015 John Wiley & Sons Ltd.
Survival time of the susceptible-infected-susceptible infection process on a graph.
van de Bovenkamp, Ruud; Van Mieghem, Piet
2015-09-01
The survival time T is the longest time that a virus, a meme, or a failure can propagate in a network. Using the hitting time of the absorbing state in a uniformized embedded Markov chain of the continuous-time susceptible-infected-susceptible (SIS) Markov process, we derive an exact expression for the average survival time E[T] of a virus in the complete graph K_{N} and the star graph K_{1,N-1}. By using the survival time, instead of the average fraction of infected nodes, we propose a new method to approximate the SIS epidemic threshold τ_{c} that, at least for K_{N} and K_{1,N-1}, correctly scales with the number of nodes N and that is superior to the epidemic threshold τ_{c}^{(1)}=1/λ_{1} of the N-intertwined mean-field approximation, where λ_{1} is the spectral radius of the adjacency matrix of the graph G. Although this new approximation of the epidemic threshold offers a more intuitive understanding of the SIS process, it remains difficult to compare outbreaks in different graph types. For example, the survival time in an arbitrary graph seems upper-bounded by that in the complete graph and lower-bounded by that in the star graph as a function of the normalized effective infection rate τ/τ_{c}^{(1)}. However, when the average fraction of infected nodes is used as a basis for comparison, the virus will survive in the star graph longer than in any other graph, making the star graph the worst-case graph instead of the complete graph. Finally, in non-Markovian SIS, the distribution of the spreading attempts over the infectious period of a node influences the survival time, even if the expected number of spreading attempts during an infectious period (the non-Markovian equivalent of the effective infection rate) is kept constant. Both early and late infection attempts lead to shorter survival times. Interestingly, just as in Markovian SIS, the survival times appear to be exponentially distributed, regardless of the infection and curing time distributions.
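On the complete graph K_N the SIS state collapses to the number of infected nodes, so survival times can be simulated exactly with a Gillespie algorithm rather than the exact hitting-time expression derived in the paper. A sketch with curing rate delta = 1 and an illustrative subcritical effective infection rate tau:

```python
import random

random.seed(7)

def sis_survival_time_complete(N, tau, rng=random):
    """One realization of the SIS survival (extinction) time on K_N,
    starting from a single infected node; delta = 1, beta = tau."""
    infected, t = 1, 0.0
    while infected > 0:
        cure_rate = infected                            # each infected node cures at rate 1
        infect_rate = tau * infected * (N - infected)   # one event per S-I link
        total = cure_rate + infect_rate
        t += rng.expovariate(total)                     # time to the next event
        if rng.random() < cure_rate / total:
            infected -= 1
        else:
            infected += 1
    return t

# subcritical regime for K_10 (tau_c^(1) = 1/9), so extinction is fast
avg_T = sum(sis_survival_time_complete(10, 0.05) for _ in range(2000)) / 2000
```

Above the threshold the chain becomes metastable and E[T] grows dramatically with N, which is exactly the behavior the hitting-time analysis quantifies.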
Conditional relative survival of oral cavity cancer: Based on Korean Central Cancer Registry.
Min, Seung-Ki; Choi, Sung Weon; Ha, Johyun; Park, Joo Yong; Won, Young-Joo; Jung, Kyu-Won
2017-09-01
Conditional relative survival (CRS) describes the survival chance of patients who have already survived for a certain period of time after diagnosis and treatment of cancer. Thus, CRS can complement the conventional 5-year relative survival, which does not consider the time patients have survived after their diagnosis. This study aimed to assess the 5-year CRS among Korean patients with oral cancer and the related risk factors. We identified 15,329 oral cavity cancer cases with a diagnosis between 1993 and 2013 in the Korea Central Cancer Registry. The CRS rates were calculated according to sex, age, subsite, histology, and stage at diagnosis. The 5-year relative survival was 57.2%, and further analysis revealed that the 5-year CRS increased during the first 2 years and reached a plateau at 86.5% after 5 years of survival. Women had better 5-year CRS than men after 5 years of survival (90.0% vs. 83.3%), and ≤45-year-old patients had better 5-year CRS than older patient groups (93.3% vs. 86.4% or 86.7%). Subsite-specific differences in 5-year CRS were observed (tongue: 91% vs. mouth floor: 73.9%). Squamous cell carcinoma had a CRS of 87.3%, compared to 85.5% for other histological types. Localized disease had a CRS of 95.7%, compared to 87.3% for regional metastasis. Patients with oral cavity cancer exhibited increasing CRS rates, which varied according to sex, age, subsite, histology, and stage at diagnosis. Thus, CRS analysis provides a more detailed perspective regarding survival during the years after the initial diagnosis or treatment. Copyright © 2017 Elsevier Ltd. All rights reserved.
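Conditional survival is simply a ratio of cumulative survival values, CRS(y | x) = S(x + y) / S(x); a minimal sketch with a hypothetical annual survival curve (not the registry's figures):

```python
def conditional_survival(surv, survived_years, additional_years):
    """Probability of surviving `additional_years` more, given survival
    to `survived_years`; surv[k] is cumulative survival at k years,
    with surv[0] == 1.0 at diagnosis."""
    return surv[survived_years + additional_years] / surv[survived_years]

# hypothetical relative-survival curve over 8 years
curve = [1.0, 0.80, 0.70, 0.65, 0.62, 0.60, 0.59, 0.585, 0.58]
crs = conditional_survival(curve, 2, 5)  # 5-year CRS for 2-year survivors
```

Because the denominator shrinks as patients survive longer, the conditional estimate rises toward a plateau, which is the pattern the study reports.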
Liu, Yanhong; Shete, Sanjay; Etzel, Carol J.; Scheurer, Michael; Alexiou, George; Armstrong, Georgina; Tsavachidis, Spyros; Liang, Fu-Wen; Gilbert, Mark; Aldape, Ken; Armstrong, Terri; Houlston, Richard; Hosking, Fay; Robertson, Lindsay; Xiao, Yuanyuan; Wiencke, John; Wrensch, Margaret; Andersson, Ulrika; Melin, Beatrice S.; Bondy, Melissa
2010-01-01
Purpose Glioblastoma (GBM) is the most common and aggressive type of glioma and has the poorest survival. However, a small percentage of patients with GBM survive well beyond the established median. Therefore, identifying the genetic variants that influence this small number of unusually long-term survivors may provide important insight into tumor biology and treatment. Patients and Methods Among 590 patients with primary GBM, we evaluated associations of survival with the 100 top-ranking glioma susceptibility single nucleotide polymorphisms from our previous genome-wide association study using Cox regression models. We also compared differences in genetic variation between short-term survivors (STS; ≤ 12 months) and long-term survivors (LTS; ≥ 36 months), and explored classification and regression tree analysis for survival data. We tested results using two independent series totaling 543 GBMs. Results We identified LIG4 rs7325927 and BTBD2 rs11670188 as predictors of STS in GBM and CCDC26 rs10464870 and rs891835, HMGA2 rs1563834, and RTEL1 rs2297440 as predictors of LTS. Further survival tree analysis revealed that patients ≥ 50 years old with LIG4 rs7325927 (V) had the worst survival (median survival time, 1.2 years) and exhibited the highest risk of death (hazard ratio, 17.53; 95% CI, 4.27 to 71.97) compared with younger patients with combined RTEL1 rs2297440 (V) and HMGA2 rs1563834 (V) genotypes (median survival time, 7.8 years). Conclusion Polymorphisms in the LIG4, BTBD2, HMGA2, and RTEL1 genes, which are involved in the double-strand break repair pathway, are associated with GBM survival. PMID:20368557
A Decade of Experience With Renal Transplantation in African-Americans
Foster, Clarence E.; Philosophe, Benjamin; Schweitzer, Eugene J.; Colonna, John O.; Farney, Alan C.; Jarrell, Bruce; Anderson, Leslie; Bartlett, Stephen T.
2002-01-01
Objective To evaluate the strategies instituted by the authors’ center to decrease the time to transplantation and increase the rate of transplantation for African-Americans, consisting of a formal education program concerning the benefits of living organ donation that is oriented to minorities; a laparoscopic living donation program; use of hepatitis C-positive donors in documented positive recipients; and encouraging vaccination for hepatitis B, allowing the use of hepatitis B core Ab-positive donors. Summary Background Data The national shortage of suitable kidney donor organs has disproportionate and adverse effects on African-Americans for several reasons. Type II diabetes mellitus and hypertension, major etiologic factors for end-stage renal disease, are more prevalent in African-Americans than in the general population. Once kidney failure has developed, African-Americans are disadvantaged for the following reasons: this patient cohort has longer median waiting times on the renal transplant list; African-Americans have higher rates of acute rejection, which affects long-term allograft survival; and once they are transplanted, the long-term graft survival rates are lower in this population than in other groups. Methods From March 1990 to November 2001 the authors’ center performed 2,167 renal transplants; 944 were in African-Americans (663 primary cadaver renal transplants and 253 primary living donor renal transplants). The retransplants consisted of 83 cadaver transplants and 17 living donor transplants. Outcome measures of this retrospective analysis included median waiting time, graft and patient survival rates, and the rate of living donation in African-Americans and comparable non-African-Americans. Where applicable, data are compared to United Network for Organ Sharing national statistics. Statistical analysis employed appropriate SPSS applications. 
Results One- and 5-year patient survival rates for living donor kidneys were 97.1% and 91.3% for non-African-Americans and 96.8% and 90.4% for African-Americans. One- and 5-year graft survival rates were 95.1% and 89.1% for non-African-Americans and 93.1% and 82.9% for African-Americans. One- and 4-year patient survival rates for cadaver donor kidneys were 91.4% and 78.7% for non-African-Americans and 92.4% and 80.2% for African-Americans. One- and 5-year graft survival rates for cadaver kidneys were 84.6% and 73.7% for non-African-Americans and 84.6% and 68.9% for African-Americans. One- and 5-year graft and patient survival rates were identical for recipients of hepatitis C virus-positive and anti-HBc positive donors, with the exception of a trend to late graft loss in the African-American hepatitis C virus group due to higher rates of noncompliance, an effect that disappears with censoring of graft loss from that cause. The cadaveric renal transplant median waiting time for non-African-Americans was 391 days compared to 734 days nationally; the waiting time for African-Americans was 647 days compared to 1,335 days nationally. When looking at all patients, living and cadaver donor, the median waiting times are 220 days for non-African-Americans and 462 days for African-Americans. Conclusions Programs specifically oriented to improve volunteerism in African-Americans have led to a marked improvement in overall waiting time and in rates of living donation in this patient group. The median waiting times to cadaveric renal transplantation were also significantly shorter in the authors’ center, especially for African-American patients, by taking advantage of the higher rates of hepatitis C infection and encouraging hepatitis B vaccination. These policies can markedly improve end-stage renal disease care for African-Americans by halving the overall waiting time while still achieving comparable graft and patient survival rates. PMID:12454518
A decade of experience with renal transplantation in African-Americans.
Foster, Clarence E; Philosophe, Benjamin; Schweitzer, Eugene J; Colonna, John O; Farney, Alan C; Jarrell, Bruce; Anderson, Leslie; Bartlett, Stephen T
2002-12-01
OBJECTIVE To evaluate the strategies instituted by the authors' center to decrease the time to transplantation and increase the rate of transplantation for African-Americans, consisting of a formal education program concerning the benefits of living organ donation that is oriented to minorities; a laparoscopic living donation program; use of hepatitis C-positive donors in documented positive recipients; and encouraging vaccination for hepatitis B, allowing the use of hepatitis B core Ab-positive donors. SUMMARY BACKGROUND DATA The national shortage of suitable kidney donor organs has disproportionate and adverse effects on African-Americans for several reasons. Type II diabetes mellitus and hypertension, major etiologic factors for end-stage renal disease, are more prevalent in African-Americans than in the general population. Once kidney failure has developed, African-Americans are disadvantaged for the following reasons: this patient cohort has longer median waiting times on the renal transplant list; African-Americans have higher rates of acute rejection, which affects long-term allograft survival; and once they are transplanted, the long-term graft survival rates are lower in this population than in other groups. METHODS From March 1990 to November 2001 the authors' center performed 2,167 renal transplants; 944 were in African-Americans (663 primary cadaver renal transplants and 253 primary living donor renal transplants). The retransplants consisted of 83 cadaver transplants and 17 living donor transplants. Outcome measures of this retrospective analysis included median waiting time, graft and patient survival rates, and the rate of living donation in African-Americans and comparable non-African-Americans. Where applicable, data are compared to United Network for Organ Sharing national statistics. Statistical analysis employed appropriate SPSS applications.
RESULTS One- and 5-year patient survival rates for living donor kidneys were 97.1% and 91.3% for non-African-Americans and 96.8% and 90.4% for African-Americans. One- and 5-year graft survival rates were 95.1% and 89.1% for non-African-Americans and 93.1% and 82.9% for African-Americans. One- and 4-year patient survival rates for cadaver donor kidneys were 91.4% and 78.7% for non-African-Americans and 92.4% and 80.2% for African-Americans. One- and 5-year graft survival rates for cadaver kidneys were 84.6% and 73.7% for non-African-Americans and 84.6% and 68.9% for African-Americans. One- and 5-year graft and patient survival rates were identical for recipients of hepatitis C virus-positive and anti-HBc-positive donors, with the exception of a trend to late graft loss in the African-American hepatitis C virus group due to higher rates of noncompliance, an effect that disappears with censoring of graft loss from that cause. The cadaveric renal transplant median waiting time for non-African-Americans was 391 days compared to 734 days nationally; the waiting time for African-Americans was 647 days compared to 1,335 days nationally. When looking at all patients, living and cadaver donor, the median waiting times were 220 days for non-African-Americans and 462 days for African-Americans. CONCLUSIONS Programs specifically oriented to improve volunteerism in African-Americans have led to a marked improvement in overall waiting time and in rates of living donation in this patient group. The median waiting times to cadaveric renal transplantation were also significantly shorter in the authors' center, especially for African-American patients, by taking advantage of the higher rates of hepatitis C infection and encouraging hepatitis B vaccination. These policies can markedly improve end-stage renal disease care for African-Americans by halving the overall waiting time while still achieving comparable graft and patient survival rates. PMID:12454518
Wu, Kunpeng; Chen, Ying; Yan, Caihong; Huang, Zhijia; Wang, Deming; Gui, Peigen; Bao, Juan
2017-10-01
To assess the effect of percutaneous endoscopic gastrostomy on short- and long-term survival of patients in a persistent vegetative state after stroke and determine the relevant prognostic factors. Stroke may lead to a persistent vegetative state, and the effect of percutaneous endoscopic gastrostomy on survival of stroke patients in a persistent vegetative state remains unclear. Prospective study. A total of 97 stroke patients in a persistent vegetative state hospitalised from January 2009 to December 2011 at the Second Hospital, University of South China, were assessed in this study. Percutaneous endoscopic gastrostomy was performed in 55 patients, and mean follow-up time was 18 months. Survival rate and risk factors were analysed. Median survival in the 55 percutaneous endoscopic gastrostomy-treated patients was 17.6 months, compared with 8.2 months in the remaining 42 patients without percutaneous endoscopic gastrostomy treatment. Univariate analyses revealed that age, hospitalisation time, percutaneous endoscopic gastrostomy treatment status, family financial situation, family care, pulmonary infection and nutrition were significantly associated with survival. Multivariate analysis indicated that older age, no gastrostomy, poor family care, pulmonary infection and poor nutritional status were independent risk factors affecting survival. Indeed, percutaneous endoscopic gastrostomy significantly improved the nutritional status and decreased the pulmonary infection rate in patients in a persistent vegetative state after stroke. Interestingly, median survival time was 20.3 months in patients with no or one independent risk factor of poor prognosis (n = 38), longer than the 8.7 months found for patients with two or more independent risk factors (n = 59). 
Percutaneous endoscopic gastrostomy significantly improves long-term survival of stroke patients in a persistent vegetative state and is associated with improved nutritional status and decreased pulmonary infection. Percutaneous endoscopic gastrostomy is a promising option for the management of stroke patients in a persistent vegetative state. © 2016 John Wiley & Sons Ltd.
[Effect of Fuzheng Huayu capsules on survival rate of patients with liver cirrhosis].
Ge, X J; Zhao, C Q; Xu, L M
2017-11-20
Objective: To investigate the effect of Fuzheng Huayu capsules on the survival rate of patients with liver cirrhosis. Methods: A retrospective analysis was performed for the clinical data of the patients with various types of liver cirrhosis who were hospitalized in Shuguang Hospital Affiliated to Shanghai University of Traditional Chinese Medicine from January 1, 2006 to December 31, 2008. The data collected for these patients included their basic information, diagnosis and treatment, and results of laboratory examination. The Kaplan-Meier method was used to analyze the effect of Fuzheng Huayu capsules on the survival rate of patients with liver cirrhosis. The starting point of observation was the first day of the patient's admission, and the ending point of follow-up observation was the date of death or the end of follow-up (April 1, 2014). Patients who had not experienced the outcome event (death) by the end of follow-up were censored. Based on the outcome, the time when the outcome occurred, and the censoring status, the life-table method was used to calculate survival rates and survival curves were plotted. The Kaplan-Meier product-limit method was used to calculate the arithmetic mean of survival time and the median survival time, and the log-rank test was used to compare the survival data. Results: A total of 430 patients with liver cirrhosis were enrolled, among whom 191 died and 239 survived or were censored. The average constituent ratio of death was 55.6% and the average constituent ratio of survival was 44.4%. The life-table method showed that the half-, 1-, 2-, and 5-year survival rates were 70%, 64%, 58%, and 48%, respectively. The median survival time was 112.1 weeks for the patients who did not take Fuzheng Huayu capsules and 351.6 weeks for those who did, and there was a significant difference in survival rate between the two groups (P < 0.001). 
Among 313 patients who had an etiology of hepatitis B, 164 did not take Fuzheng Huayu capsules and had a median survival time of 195.9 weeks and a 5-year survival rate of 44%, and 149 took Fuzheng Huayu capsules and had a median survival time of 336.9 weeks and a 5-year survival rate of 59%; there was a significant difference in survival rate between the two groups (P = 0.038). Among 117 patients who did not have hepatitis B, 68 did not take Fuzheng Huayu capsules and had a median survival time of 78.1 weeks and a 5-year survival rate of 32%, and 49 took Fuzheng Huayu capsules and had a median survival time of 277.4 weeks and a 5-year survival rate of 53%; there was a significant difference in survival rate between the two groups (P = 0.013). Among 92 patients with compensated liver cirrhosis, 47 did not take Fuzheng Huayu capsules and had a 5-year survival rate of 65%, and 45 took Fuzheng Huayu capsules and had a 5-year survival rate of 82%; both groups of patients had a median survival of 440 weeks; there was a significant difference in survival rate between the two groups (P = 0.027). Among 338 patients with decompensated liver cirrhosis, 185 did not take Fuzheng Huayu capsules and had a median survival time of 60.3 weeks and a 5-year survival rate of 33%, and 153 took Fuzheng Huayu capsules and had a median survival time of 267.7 weeks and a 5-year survival rate of 51%; there was a significant difference in survival rate between the two groups (P = 0.001). Conclusion: Fuzheng Huayu capsules can improve the prognosis of patients with liver cirrhosis and increase their survival rates and have good long-term efficacy.
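The Kaplan-Meier product-limit method cited in this abstract can be sketched in a few lines of plain Python; the follow-up data below are invented for illustration and are not the study's patient records:

```python
def kaplan_meier(durations, events):
    """Product-limit estimator: S(t) = prod over event times t_i <= t of (1 - d_i / n_i),
    where d_i is the number of deaths at t_i and n_i is the number still at
    risk just before t_i (censored subjects leave the risk set afterwards)."""
    event_times = sorted({t for t, e in zip(durations, events) if e})
    s, curve = 1.0, []
    for t in event_times:
        n = sum(1 for d in durations if d >= t)                            # at risk at t
        d = sum(1 for dur, e in zip(durations, events) if e and dur == t)  # deaths at t
        s *= 1.0 - d / n
        curve.append((t, s))
    return curve

# Hypothetical follow-up in weeks; event=1 is death, event=0 is censored.
curve = kaplan_meier([3, 5, 5, 7, 9], [1, 1, 0, 1, 0])
```

On this toy data the curve steps from 0.8 to 0.6 to 0.3 at weeks 3, 5, and 7; the subject censored at week 5 still counts in the risk set at that time but contributes no death, which is how the method accommodates incomplete follow-up.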
Nemelc, R M; Stadhouder, A; van Royen, B J; Jiya, T U
2014-11-01
Purpose: To evaluate outcome and survival and to identify prognostic variables for patients surgically treated for spinal metastases. Methods: A retrospective study was performed on 86 patients surgically treated for spinal metastases. Preoperative analyses of the ASIA and spinal instability neoplastic scores (SINS) were performed. Survival curves for the different prognostic variables were constructed by Kaplan–Meier analysis, and the variables were entered into a Cox proportional hazards model to determine their significance for survival. The correlation between preoperative radiotherapy and postoperative wound infections was also evaluated. Results: Survival analysis was performed on 81 patients, 37 women and 44 men. Five patients were excluded due to missing data. Median overall survival was 38 weeks [95% confidence interval (CI) 27.5–48.5 weeks], with a 3-month survival rate of 81.5%. Breast tumors had the best median survival of 127 weeks and lung tumors the worst at 18 weeks. Univariate analysis showed that tumor type, preoperative ASIA score (p = 0.01) and visceral metastases (p = 0.18) were significant prognostic variables for survival. Patients with colon tumors had a hazard ratio of 5.53 compared to patients with breast tumors. Patients with an ASIA-C score had more than 13.03 times the hazard compared to patients with an ASIA-E score. Retrospective analysis of the SINS scores showed 34 patients with a score of 13–18 points, 44 patients with a score of 7–12 points, and 1 patient with a score of 6 points. Preoperative radiotherapy had no influence on the postoperative incidence of deep surgical wound infections (p = 0.37). Patients with spinal metastases had a postoperative median survival of 38 weeks. The primary tumor type and ASIA score were significant prognostic factors for survival. Preoperative radiotherapy neither influenced survival nor constituted a risk factor for postoperative surgical wound infections.
Kruse, M A; Holmes, E S; Balko, J A; Fernandez, S; Brown, D C; Goldschmidt, M H
2013-07-01
Osteosarcoma is the most common bone tumor in dogs. However, current literature focuses primarily on appendicular osteosarcoma. This study examined the prognostic value of histological and clinical factors in flat and irregular bone osteosarcomas and hypothesized that clinical factors would have a significant association with survival time while histological factors would not. All osteosarcoma biopsy samples of the vertebra, rib, sternum, scapula, or pelvis were reviewed, while survival information and clinical data were obtained from medical records, veterinarians, and owners. Forty-six dogs were included in the analysis of histopathological variables and 27 dogs with complete clinical data were included in the analysis of clinical variables. In the histopathologic Cox regression model, there was no significant association between any histologic feature of osteosarcoma, including grade, and survival time. In the clinical Cox regression model, there was a significant association between the location of the tumor and survival time, as well as between the percent elevation of alkaline phosphatase (ALP) above normal and survival time. Controlling for ALP elevation, dogs with osteosarcoma located in the scapula had a significantly greater hazard of death (hazard ratio 2.8) compared to dogs with tumors in other locations. Controlling for tumor location, every 100% increase in ALP from normal increased the hazard of death by a factor of 1.7. For canine osteosarcomas of the flat and irregular bones, histopathological features, including grade, do not appear to be rigorous predictors of survival. Clinical variables such as increased ALP levels and tumor location in the scapula were associated with decreased survival times.
Modeling time-to-event (survival) data using classification tree analysis.
Linden, Ariel; Yarnold, Paul R
2017-12-01
Time to the occurrence of an event is often studied in health research. Survival analysis differs from other designs in that follow-up times for individuals who do not experience the event by the end of the study (called censored) are accounted for in the analysis. Cox regression is the standard method for analysing censored data, but the assumptions required of these models are easily violated. In this paper, we introduce classification tree analysis (CTA) as a flexible alternative for modelling censored data. Classification tree analysis is a "decision-tree"-like classification model that provides parsimonious, transparent (ie, easy to visually display and interpret) decision rules that maximize predictive accuracy, derives exact P values via permutation tests, and evaluates model cross-generalizability. Using empirical data, we identify all statistically valid, reproducible, longitudinally consistent, and cross-generalizable CTA survival models and then compare their predictive accuracy to estimates derived via Cox regression and an unadjusted naïve model. Model performance is assessed using integrated Brier scores and a comparison between estimated survival curves. The Cox regression model best predicts average incidence of the outcome over time, whereas CTA survival models best predict either relatively high, or low, incidence of the outcome over time. Classification tree analysis survival models offer many advantages over Cox regression, such as explicit maximization of predictive accuracy, parsimony, statistical robustness, and transparency. Therefore, researchers interested in accurate prognoses and clear decision rules should consider developing models using the CTA-survival framework. © 2017 John Wiley & Sons, Ltd.
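The Brier score used in this comparison can be illustrated with a toy computation. The sketch below is a simplification that ignores censoring weights (the paper's integrated Brier score additionally integrates over time and reweights for censoring), and every number in it is invented:

```python
def brier_score(surv_times, pred_surv_at_t, t):
    """Mean squared error between the observed indicator 'still alive at t'
    and the model's predicted survival probability S_hat(t) for each subject.
    Assumes no subject is censored before t (a simplification)."""
    alive = [1.0 if T > t else 0.0 for T in surv_times]
    return sum((a - p) ** 2 for a, p in zip(alive, pred_surv_at_t)) / len(alive)

# Four subjects with observed survival times, and each model's predicted
# S(10) for them (invented numbers standing in for Cox or CTA output).
bs = brier_score([5, 8, 12, 20], [0.2, 0.3, 0.7, 0.8], t=10)  # ≈ 0.065
```

Lower is better: a model that assigns low survival probability to subjects who die early and high probability to long survivors gets a score near zero, which is the sense in which the paper compares Cox, CTA, and the naïve model.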
Hazel, A R; Heins, B J; Hansen, L B
2017-11-01
Montbéliarde (MO) × Holstein (HO) and Viking Red (VR) × HO crossbred cows were compared with pure HO cows in 8 large, high-performance dairy herds in Minnesota. All cows calved for the first time from December 2010 to April 2014. Fertility and survival traits were calculated from records of insemination, pregnancy diagnosis, calving, and disposal that were recorded via management software. Body condition score and conformation were subjectively scored once during early lactation by trained evaluators. The analysis of survival to 60 d in milk included 536 MO × HO, 560 VR × HO, and 1,033 HO cows during first lactation. Up to 13% fewer cows were available for the analysis of the other fertility, survival, and conformation traits. The first-service conception rate of the crossbred cows (both types combined) was 7% higher than that of the HO cows during first lactation, as was the conception rate across the first 5 inseminations. Furthermore, the combined crossbred cows required fewer inseminations (2.11 ± 0.05 vs. 2.30 ± 0.05) and had 10 fewer days open compared with their HO herdmates. Across the 8 herds, breed groups did not differ for survival to 60 d in milk; however, the superior fertility of the crossbred cows allowed an increased proportion of the combined crossbreds (71 ± 1.5%) to calve a second time within 14 mo compared with the HO cows (63 ± 1.5%). For survival to second calving, the combined crossbred cows had 4% superior survival compared with the HO cows. The MO × HO and VR × HO crossbred cows both had increased body condition score (+0.50 ± 0.02 and +0.25 ± 0.02, respectively) but shorter stature and less body depth than HO cows. The MO × HO cows had less set to the hock and a steeper foot angle than the HO cows, and the VR × HO cows had more set to the hock with a similar foot angle to the HO cows. 
The combined crossbred cows had less udder clearance from the hock than HO cows, more width between both front and rear teats, and longer teat length than the HO cows; however, the frequency of first-lactation cows culled for udder conformation was uniformly low (<1%) across the breed groups. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Survival Analysis of Patients with End Stage Renal Disease
NASA Astrophysics Data System (ADS)
Urrutia, J. D.; Gayo, W. S.; Bautista, L. A.; Baccay, E. B.
2015-06-01
This paper provides a survival analysis of End Stage Renal Disease (ESRD) using Kaplan-Meier estimates and the Weibull distribution. The data were obtained from the records of V. L. Makabali Memorial Hospital with respect to time t (patient's age), covariates such as developed secondary disease (pulmonary congestion and cardiovascular disease), gender, and the event of interest: the death of ESRD patients. Survival and hazard rates were estimated using NCSS for the Weibull distribution and SPSS for the Kaplan-Meier estimates. Both approaches lead to the same conclusion: the hazard rate increases and the survival rate decreases over time for ESRD patients diagnosed with pulmonary congestion, cardiovascular disease, or both. The analysis also shows that female patients have a greater risk of death than males. The probability of risk was given by the equation R = 1 − e^(−H(t)), where e^(−H(t)) is the survival function and H(t) is the cumulative hazard function, which was created using Cox regression.
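The risk equation quoted above, R = 1 − e^(−H(t)), can be checked numerically for a Weibull model; the shape and scale values below are illustrative choices, not the fitted NCSS estimates from the paper:

```python
import math

def weibull_fns(t, shape_k, scale_lam):
    """Weibull survival model:
    H(t) = (t/lam)^k              cumulative hazard
    S(t) = exp(-H(t))             survival function
    h(t) = (k/lam)*(t/lam)^(k-1)  hazard rate
    R(t) = 1 - exp(-H(t))         probability of death by time t."""
    H = (t / scale_lam) ** shape_k
    S = math.exp(-H)
    h = (shape_k / scale_lam) * (t / scale_lam) ** (shape_k - 1)
    R = 1.0 - S
    return H, S, h, R

# Illustrative parameters: shape k = 2 gives a hazard that rises with time,
# matching the abstract's qualitative finding; scale lam = 10 time units.
H, S, h, R = weibull_fns(10.0, 2.0, 10.0)
```

At t equal to the scale parameter, H(t) = 1, so S(t) = e^(−1) ≈ 0.37 and R(t) ≈ 0.63; any shape parameter k > 1 yields the increasing hazard the paper reports.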
Pedro, Brigite M; Alves, Joana V; Cripps, Peter J; Stafford Johnson, Mike J; Martin, Mike W S
2011-12-01
The purpose of this study was to investigate the prognostic value of QRS duration in dogs with dilated cardiomyopathy (DCM) by studying its relationship with survival time. The medical records of dogs diagnosed with DCM were retrospectively searched for good quality ECG tracings. The QRS duration was measured from the ECG tracing and two different models were used: binary variable (dogs were divided into 2 groups based on a QRS duration of <60 ms or ≥60 ms) and continuous variable. The survival times were analysed by the Kaplan-Meier method and Cox's proportional hazard model. 266 dogs met the inclusion and exclusion criteria. A QRS duration ≥60 ms was associated with a reduced survival time compared to a QRS duration <60 ms (hazard ratio of 1.34, 95% CI 1.05-1.71, P = 0.02). When considered as a continuous variable, the hazard ratio was 1.015 for each increase in QRS duration of 1 ms (95% CI 1.006-1.024, p = 0.001). Dogs with a QRS duration <60 ms had a median survival time (interquartile range) of 25 weeks (97-65) and dogs with a QRS duration ≥60 ms had a median survival time (interquartile range) of 13 weeks (3-34). The measurement of QRS duration is relatively simple to perform from a surface ECG recording. A duration ≥60 ms is associated with shorter survival times in dogs with DCM, which may provide practitioners with additional prognostic information. Copyright © 2011 Elsevier B.V. All rights reserved.
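The two hazard ratios reported here are consistent with each other because the Cox model is linear on the log-hazard scale, so a per-millisecond hazard ratio compounds multiplicatively over larger differences. A quick sketch (1.015 is the abstract's figure; the 10 ms step is an arbitrary illustration):

```python
import math

hr_per_ms = 1.015                  # hazard ratio per 1 ms increase in QRS duration
beta = math.log(hr_per_ms)         # the corresponding Cox coefficient (log scale)
hr_per_10ms = math.exp(10 * beta)  # identical to 1.015 ** 10, ≈ 1.16
```

So under this model a QRS complex 10 ms wider carries roughly a 16% higher hazard of death, which is how a small per-unit ratio translates into a clinically meaningful difference.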
AJUBA increases the cisplatin resistance through hippo pathway in cervical cancer.
Bi, Lihong; Ma, Feng; Tian, Rui; Zhou, Yanli; Lan, Weiguang; Song, Quanmao; Cheng, Xiankui
2018-02-20
Though the LIM-domain protein AJUBA was identified as a putative oncogene, the function and underlying mechanisms of AJUBA in cervical cancer remain largely unknown. First, AJUBA expression was detected via real-time quantitative PCR in patients' samples. Furthermore, HeLa and SiHa cells were transfected with AJUBA-overexpressing plasmids and then exposed to cisplatin, and apoptosis was measured by a cytometry assay. In addition, the expression of YAP and TAZ was assessed by western blot. Our results revealed that AJUBA expression was significantly higher in cervical cancer patients resistant to cisplatin treatment compared with cervical cancer patients sensitive to cisplatin treatment. In addition, by Kaplan-Meier analysis, overall survival time was significantly shorter in the cervical cancer patients with high AJUBA expression compared with those with low AJUBA expression. HeLa and SiHa cells transfected with AJUBA-expressing plasmids and exposed to cisplatin treatment had a higher survival rate compared with cells transfected with the empty vector control. Mechanistic studies revealed that AJUBA upregulated the downstream targets YAP and TAZ. These results suggest that a high AJUBA level enhances the resistance of cervical cancer cells to cisplatin and is associated with decreased patient survival times. Copyright © 2017 Elsevier B.V. All rights reserved.
[Non surgical treatment of pancreatic cancers].
Bleiberg, H; Gerard, B; Hendlisz, A; Jagodzinski, R
1997-09-01
Pancreatic cancer is a disease difficult to treat. Diagnosis is late, the cancer remaining clinically inapparent even when locally advanced or metastatic. Few patients can be submitted to curative surgery. Even if resection is possible, 5-year survival varies from 0% to 18% according to series. Some data suggest that chemotherapy with or without radiotherapy could influence disease-free survival, but a benefit on overall survival has not been demonstrated. For locally advanced disease, the results of a trial published in 1968 showed that a combination of radiotherapy and 5-fluorouracil (5FU) improved median survival compared with radiotherapy alone (10 versus 5.5 months). Since then, no progress has been achieved. At the present time, survival of patients with metastatic pancreatic cancer cannot be improved. Very recently, a new agent, gemcitabine, has been compared to 5FU. Criteria for activity were based on clinical improvement: analgesia consumption, performance status and weight gain. Twenty-four percent of the patients treated with gemcitabine had a clinical benefit as compared to 5% for those treated with 5FU. Other studies comparing chemotherapy to best supportive care show a significant decrease of depression and anxiety as well as an improvement in quality of life for patients being treated.
Mahmoud, K. Gh. M; Scholkamy, T. H; Darwish, S. F
2015-01-01
Cryopreservation and sexing of embryos are integrated into commercial embryo transfer technologies. To improve the effectiveness of vitrification of in vitro produced buffalo embryos, two experiments were conducted. The first evaluated the effect of exposure time (2 and 3 min) and developmental stage (morula and blastocyst) on the viability and development of vitrified buffalo embryos. Morphologically normal embryos and survival rates (re-expansion) significantly increased when vitrified morulae were exposed for 2 min compared to 3 min (P<0.001). On the other hand, morphologically normal and survival rates of blastocysts significantly increased when exposed for 3 min compared to 2 min (P<0.001). However, there were no significant differences between the two developmental stages (morulae and blastocysts) in the percentages of morphologically normal embryos and re-expansion rates after a 24 h culture. The second experiment aimed to evaluate the effect of viability on the sex ratio of buffalo embryos after vitrification and whether male and female embryos survived vitrification differently. A total of 61 blastocysts were vitrified for 3 min with the same cryoprotectant as in experiment 1. A higher percentage of males was recorded among live than among dead embryos; however, this difference was not significant. In conclusion, the post-thaw survival and development of in vitro produced morulae and blastocysts were found to be affected by exposure time rather than developmental stage. Survivability had no significant effect on the sex ratio of vitrified blastocysts; nevertheless, the number of surviving male embryos was higher than that of dead male embryos. PMID:27175197
Hurthle cell carcinoma: an update on survival over the last 35 years.
Nagar, Sapna; Aschebrook-Kilfoy, Briseis; Kaplan, Edwin L; Angelos, Peter; Grogan, Raymon H
2013-12-01
Hurthle cell carcinoma (HCC) of the thyroid is a variant of follicular cell carcinoma (FCC). A low incidence and lack of long-term follow-up data have caused controversy regarding the survival characteristics of HCC. We aimed to clarify this controversy by analyzing HCC survival over a 35-year period using the Surveillance, Epidemiology, and End Results (SEER) database. Cases of HCC and FCC were extracted from the SEER-9 database (1975-2009). Five- and 10-year survival rates were calculated. We compared changes in survival over time by grouping cases into 5-year intervals. We identified 1,416 cases of HCC and 4,973 cases of FCC. For cases diagnosed from 1975 to 1979, HCC showed worse survival than FCC (5-year, 75% [95% confidence interval (CI), 60.2-85] vs. 88.7% [95% CI, 86-90.8]; 10-year, 66.7% [95% CI, 51.5-78.1] vs. 79.7% [95% CI, 76.5-82.6]). For cases diagnosed from 2000 to 2004 we found no difference in 5-year survival between HCC and FCC (91.1% [95% CI, 87.6-93.7] vs. 89.1% [95% CI, 86.5-91.2]). For cases diagnosed from 1995 to 1999, there was no difference in 10-year survival between HCC and FCC (80.9% [95% CI, 75.6-85.2] vs. 83.9% [95% CI, 80.8-86.6]). HCC survival improved over the study period while FCC survival rates remained stable (increase in survival at 5 years, 21.7% vs. 0.4%; at 10 years, 21.3% vs. 5.2%). Improvement in HCC survival was observed for both genders, in patients aged ≥45 years, in local and regional disease, for tumors >4 cm, and in white patients. HCC survival has improved dramatically over time such that HCC and FCC survival rates are now the same. These findings explain how studies over the last 4 decades have shown conflicting results regarding HCC survival; however, our data do not explain why HCC survival has improved. Copyright © 2013 Mosby, Inc. All rights reserved.
Conic, Ruzica Z; Cabrera, Claudia I; Khorana, Alok A; Gastman, Brian R
2018-01-01
The ideal timing for melanoma treatment, predominantly surgery, remains undetermined. Patient concern for receiving immediate treatment often exceeds surgeon or hospital availability, requiring establishment of a safe window for melanoma surgery. To assess the impact of time to definitive melanoma surgery on overall survival. Patients with stage I to III cutaneous melanoma for whom time to definitive surgery and overall survival were available were identified by using the National Cancer Database (N = 153,218). The t test and chi-square test were used to compare variables. Cox regression was used for multivariate analysis. In a multivariate analysis of patients of all stages, those treated between 90 and 119 days after biopsy (hazard ratio [HR], 1.09; 95% confidence interval [CI], 1.01-1.18) or more than 119 days after biopsy (HR, 1.12; 95% CI, 1.02-1.22) had a higher risk for mortality compared with those treated within 30 days of biopsy. In a subgroup analysis of stage I, higher mortality risk was found in patients treated within 30 to 59 days (HR, 1.05; 95% CI, 1.01-1.1), 60 to 89 days (HR, 1.16; 95% CI, 1.07-1.25), 90 to 119 days (HR, 1.29; 95% CI, 1.12-1.48), and more than 119 days after biopsy (HR, 1.41; 95% CI, 1.21-1.65). Surgical timing did not affect survival in stages II and III. Melanoma-specific survival was not available. Expeditious treatment of stage I melanoma is associated with improved outcomes. Copyright © 2017 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.
Clinical outcomes following salvage Gamma Knife radiosurgery for recurrent glioblastoma
Larson, Erik W; Peterson, Halloran E; Lamoreaux, Wayne T; MacKay, Alexander R; Fairbanks, Robert K; Call, Jason A; Carlson, Jonathan D; Ling, Benjamin C; Demakas, John J; Cooke, Barton S; Lee, Christopher M
2014-01-01
Glioblastoma multiforme (GBM) is the most common malignant primary brain tumor, with a survival prognosis of 14-16 mo for the highest functioning patients. Despite aggressive, multimodal upfront therapies, the majority of GBMs will recur in approximately six months. Salvage therapy options for recurrent GBM (rGBM) are an area of intense research. This study compares recent survival and quality of life outcomes following Gamma Knife radiosurgery (GKRS) salvage therapy. Following a PubMed search for studies using GKRS as salvage therapy for malignant gliomas, nine articles from 2005 to July 2013 were identified that evaluated rGBM treatment. In this review, we compare overall survival following diagnosis, overall survival following salvage treatment, progression-free survival, time to recurrence, local tumor control, and adverse radiation effects. This report discusses results for rGBM patient populations alone, not for mixed populations with other tumor histology grades. All nine studies reported median overall survival rates (from diagnosis, range: 16.7-33.2 mo; from salvage, range: 9-17.9 mo). Three studies identified median progression-free survival (range: 4.6-14.9 mo). Two showed median time to recurrence of GBM. Two discussed local tumor control. Six studies reported adverse radiation effects (range: 0%-46% of patients). The greatest survival advantages were seen in patients who received GKRS salvage along with other treatments, such as resection or bevacizumab, suggesting that appropriately tailored multimodal therapy should be considered with each rGBM patient. However, there needs to be a randomized clinical trial to test GKRS for rGBM before the possibility of selection bias can be dismissed. PMID:24829861
Survival in cats with primary and secondary cardiomyopathies.
Spalla, Ilaria; Locatelli, Chiara; Riscazzi, Giulia; Santagostino, Sara; Cremaschi, Elena; Brambilla, Paola
2016-06-01
Feline cardiomyopathies (CMs) represent a heterogeneous group of myocardial diseases. The most common CM is hypertrophic cardiomyopathy (HCM), followed by restrictive cardiomyopathy (RCM). Studies comparing survival and outcome for different types of CM are scant. Furthermore, little is known about the cardiovascular consequences of systemic diseases on survival. The aim of this retrospective study was to compare survival and prognostic factors in cats affected by HCM, RCM or secondary CM referred to our institution over a 10-year period. The study included 94 cats with complete case records and echocardiographic examination. Fifty cats presented with HCM, 14 with RCM and 30 with secondary CM. A statistically significant difference in survival time was identified for cats with HCM (median survival time of 865 days), RCM (273 days) and secondary CM (<50% cardiac death rate). In the overall population and in the primary CM group (HCM + RCM), risk factors in the multivariate analysis, regardless of the CM considered, were the presence of clinical signs, an increased left atrial to aortic root (LA/Ao) ratio and a hypercoagulable state. Primary CMs in cats share some common features (i.e., LA dimension and hypercoagulable state) linked to feline cardiovascular physiology, which influence survival greatly in end-stage CM. The presence of clinical signs has to be regarded as a marker of disease severity, regardless of the underlying CM. Secondary CMs are more benign conditions, but if the primary disease is not properly managed, the prognosis might also be poor in this group of patients. © ISFM and AAFP 2015.
Wang, Shu-wen; Ren, Juan; Yan, Yan-li; Xue, Chao-fan; Tan, Li; Ma, Xiao-wei
2016-01-01
The objective of this study was to compare the effects of image-guided hypofractionated radiotherapy and conventional fractionated radiotherapy on non-small-cell lung cancer (NSCLC). Fifty stage- and age-matched cases with NSCLC were randomly divided into two groups (A and B). There were 23 cases in group A and 27 cases in group B. Image-guided radiotherapy (IGRT) and stereotactic radiotherapy were applied in combination to the patients in group A. Group A patients underwent hypofractionated radiotherapy (6–8 Gy per fraction) three times per week, with a total dose of 64–66 Gy; group B received conventional fractionated radiotherapy, with a total dose of 68–70 Gy delivered five times per week. In group A, the 1-year and 2-year local failure-free survival rates were significantly higher than in group B (P<0.05). The local failure rate (P<0.05) and distant metastasis rate (P>0.05) were lower in group A than in group B. The overall survival rate of group A was significantly higher than that of group B (P=0.03), and the 1-year survival rate was 87% vs 63% (P<0.05). The median survival time of group A was longer than that of group B. There was no significant difference in the incidence of complications between the two groups (P>0.05). Compared with conventional fractionated radiotherapy, image-guided hypofractionated stereotactic radiotherapy for NSCLC achieved better treatment efficacy and showed good tolerability. PMID:27574441
Brenner, Hermann; Castro, Felipe A; Eberle, Andrea; Emrich, Katharina; Holleczek, Bernd; Katalinic, Alexander; Jansen, Lina
2016-01-01
The proportion of cases notified by death certificate only (DCO) is a commonly used data quality indicator in studies comparing cancer survival across regions and over time. We aimed to assess the dependence of DCO proportions on the age structure of cancer patients. Using data from a national cancer survival study in Germany, we determined age-specific and overall (crude) DCO proportions for 24 common forms of cancer. We then derived the overall (crude) DCO proportions expected in the case of shifts of the age distribution of the cancer populations by 5 and 10 years, respectively, assuming age-specific DCO proportions to remain constant. Median DCO proportions across the 24 cancers were 2.4, 3.7, 5.5, 8.5 and 23.9% in age groups 15-44, 45-54, 55-64, 65-74, and 75+, respectively. A decrease of ages by 5 and 10 years resulted in decreases of cancer-specific crude DCO proportions ranging from 0.4 to 4.8 and from 0.7 to 8.6 percentage points, respectively. Conversely, an increase of ages by 5 and 10 years led to increases of cancer-specific crude DCO proportions ranging from 0.8 to 4.8 and from 1.8 to 9.6 percentage points, respectively. These changes were of similar magnitude (but in the opposite direction) as changes in crude 5-year relative survival resulting from the same shifts in age distribution. The age structure of cancer patient populations has a substantial impact on DCO proportions. DCO proportions should therefore be age adjusted in comparative studies on cancer survival across regions and over time. Copyright © 2015 Elsevier Ltd. All rights reserved.
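The age-shift argument above reduces to a weighted average: the crude DCO proportion is the case-count-weighted mean of the age-specific proportions, so shifting the age distribution moves the crude value even when the age-specific proportions stay fixed. A minimal Python sketch using the abstract's median age-specific proportions and hypothetical case counts:

```python
# Age-specific median DCO proportions (%) as reported in the abstract.
dco_by_age = {"15-44": 2.4, "45-54": 3.7, "55-64": 5.5, "65-74": 8.5, "75+": 23.9}

def crude_dco(case_counts):
    """Crude DCO proportion (%) for a given age distribution of cases:
    the case-count-weighted average of the age-specific proportions."""
    total = sum(case_counts.values())
    return sum(case_counts[g] * dco_by_age[g] for g in case_counts) / total

# Hypothetical age distributions of two patient populations.
younger = {"15-44": 300, "45-54": 300, "55-64": 200, "65-74": 150, "75+": 50}
older   = {"15-44": 50, "45-54": 150, "55-64": 200, "65-74": 300, "75+": 300}

print(round(crude_dco(younger), 2))  # → 5.4
print(round(crude_dco(older), 2))    # older population has a higher crude DCO proportion
```

The age-specific proportions are identical in both calls; only the weights differ, which is exactly why the authors recommend age adjustment before comparing crude DCO proportions.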
Ngwa, Julius S; Cabral, Howard J; Cheng, Debbie M; Pencina, Michael J; Gagnon, David R; LaValley, Michael P; Cupples, L Adrienne
2016-11-03
Typical survival studies follow individuals to an event and measure explanatory variables for that event, sometimes repeatedly over the course of follow-up. The Cox regression model has been used widely in analyses of time to diagnosis or death from disease. The associations between the survival outcome and time-dependent measures may be biased unless they are modeled appropriately. In this paper we explore the Time Dependent Cox Regression Model (TDCM), which quantifies the effect of repeated measures of covariates in the analysis of time-to-event data. This model is commonly used in biomedical research but sometimes does not explicitly adjust for the times at which time-dependent explanatory variables are measured. This approach can yield different estimates of association compared to a model that adjusts for these times. In order to address the question of how different these estimates are from a statistical perspective, we compare the TDCM to Pooled Logistic Regression (PLR) and Cross Sectional Pooling (CSP), considering models that adjust and do not adjust for time in PLR and CSP. In a series of simulations we found that time-adjusted CSP provided identical results to the TDCM, while the PLR showed larger parameter estimates compared to the time-adjusted CSP and the TDCM in scenarios with high event rates. We also observed upwardly biased estimates in the unadjusted CSP and unadjusted PLR methods. The time-adjusted PLR had a positive bias in the time-dependent Age effect, with reduced bias when the event rate was low. The PLR methods showed a negative bias in the Sex effect, a subject-level covariate, when compared to the other methods. The Cox models yielded reliable estimates for the Sex effect in all scenarios considered. We conclude that survival analyses that explicitly account in the statistical model for the times at which time-dependent covariates are measured provide more reliable estimates compared to unadjusted analyses.
We present results from the Framingham Heart Study, in which lipid measurements and myocardial infarction events were collected over a period of 26 years.
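The cross-sectional pooling step the paper compares against the time-dependent Cox model can be sketched in Python (hypothetical visit data; a simplified illustration, not the authors' code). Each subject's follow-up is cut into examination intervals, one pooled record per subject-interval, and the time-adjusted variant keeps the interval start time as a covariate:

```python
# subject id -> (list of (exam_time, covariate_value), event time or None if censored)
visits = {
    1: ([(0, 200), (2, 220), (4, 260)], 5.0),   # event at t=5
    2: ([(0, 180), (2, 185), (4, 190)], None),  # censored
}

def cross_sectional_pool(visits, interval=2.0):
    """Stack one record per subject-interval into a single pooled data set."""
    pooled = []
    for subj, (exams, event_time) in visits.items():
        for t, x in exams:
            if event_time is not None and t >= event_time:
                break  # no records after the event
            end = t + interval
            # interval-level outcome: did the event fall inside (t, end]?
            event = int(event_time is not None and t < event_time <= end)
            # time-adjusted CSP: the interval start time t stays in the record
            pooled.append({"id": subj, "time": t, "x": x, "event": event})
    return pooled

rows = cross_sectional_pool(visits)
print(len(rows))                       # → 6 pooled subject-interval records
print(sum(r["event"] for r in rows))   # → 1 interval ends in an event
```

A logistic regression fitted to `rows` with `time` and `x` as covariates would be the time-adjusted CSP analysis; dropping the `time` column gives the unadjusted variant the paper finds biased.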
Manook, Miriam; Koeser, Leonardo; Ahmed, Zubir; Robb, Matthew; Johnson, Rachel; Shaw, Olivia; Kessaris, Nicos; Dorling, Anthony; Mamode, Nizam
2017-02-18
More than 40% of patients awaiting a kidney transplant in the UK are sensitised with human leucocyte antigen (HLA) antibodies. Median time to transplantation for such patients is double that of unsensitised patients, at about 74 months. Removing antibody to perform an HLA-incompatible (HLAi) living donor transplantation is perceived to be high risk, although patient survival data are limited. We compared survival of patients opting for an HLAi kidney transplant with that of similarly sensitised patients awaiting a compatible organ. From the UK adult kidney transplant waiting list, we selected crossmatch-positive living donor HLAi kidney transplant recipients who received their transplant between Jan 1, 2007, and Dec 31, 2013, and were followed up to Dec 31, 2014 (end of study). These patients were matched in a 1:4 ratio with similarly sensitised patients listed for a deceased-donor transplant during that period. Data were censored both at the time of transplantation (listed only) and at the end of the study period (listed or transplant). We used Kaplan-Meier curves to compare patient survival between HLAi and the matched cohort. Of 25 518 patient listings, 213 (1%) underwent HLAi transplantation during the study period. 852 matched controls were identified, of whom 41% (95% CI 32-50) remained without a transplant at 58 months after matching. We noted no difference in survival between patients in the HLAi group and the listed-only group (log rank p=0·446), or the listed-or-transplant group (log rank p=0·984). Survival of sensitised patients undergoing HLAi in the UK is comparable with that of patients on dialysis awaiting a compatible organ, many of whom are unlikely to receive a transplant. Choosing a direct HLAi transplant has no detrimental effect on survival, but offers no survival benefit, by contrast with similar patients studied in a North American multicentre cohort.
UK National Health Service Blood & Transplant and Guy's & St Thomas' National Institute for Health Research Biomedical Research Centre. Copyright © 2017 Elsevier Ltd. All rights reserved.
Correia, Francisco; Gouveia, Sónia; Felino, António Campos; Costa, Ana Lemos; Almeida, Ricardo Faria
To evaluate the differences between the survival rates of implants placed in patients with no history of periodontal disease (NP) and in patients with a history of chronic periodontal disease (CP). A retrospective cohort study was conducted in which all consenting patients treated with dental implants in a private clinic in Oporto, Portugal, from November 2, 2002 through February 11, 2011 were included. All patients were treated consecutively by the same operator. This study aimed to analyze how the primary outcomes (presence of disease, time of placement, and time of loading) and the secondary outcomes (severity-generalized periodontitis, brand, implant length, prosthesis type, prosthesis metal-ceramic extension) influence the survival rate of dental implants. The survival analysis was performed with the Kaplan-Meier method, and the equality of survival distributions for all groups was tested with the log-rank test at a significance level of .05 for all comparisons. The sample consisted of 202 patients (47% NP and 53% CP) and 689 implants (31% NP and 69% CP). The survival rates in the NP and CP groups showed no statistically significant difference (95.8% versus 93.1%; P ≥ .05). Implants were lost before loading in 54.9% of the cases. The majority of the implants were lost in the first year, and losses stabilized after the second year. Survival rates in the NP and CP patients showed no statistically significant differences when comparing the following factors: subclassification of the disease, implant brand, implant length (short/standard), type of prosthesis, extension of the prosthesis metal-ceramic, and time of placement and loading (P ≥ .05). This work disclosed no statistically significant differences in survival rates between the CP group and the NP control group. Placing implants in patients with a history of periodontal disease appears to be viable and safe.
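The Kaplan-Meier/log-rank comparison used in studies like this one can be sketched in pure Python (hypothetical failure times, not the study's data; a real analysis would use a statistics package). The log-rank statistic sums observed-minus-expected events across the distinct event times and compares the squared, variance-scaled total against a chi-squared distribution with 1 df:

```python
def logrank(times1, events1, times2, events2):
    """Two-sample log-rank statistic (chi-squared, 1 df). events: 1=event, 0=censored."""
    pairs = list(zip(times1, events1)) + list(zip(times2, events2))
    event_times = sorted({t for t, e in pairs if e == 1})
    num, var = 0.0, 0.0
    for t in event_times:
        n1 = sum(1 for tt in times1 if tt >= t)   # at risk in group 1
        n2 = sum(1 for tt in times2 if tt >= t)   # at risk in group 2
        o1 = sum(1 for tt, e in zip(times1, events1) if tt == t and e == 1)
        o2 = sum(1 for tt, e in zip(times2, events2) if tt == t and e == 1)
        n, o = n1 + n2, o1 + o2
        num += o1 - o * n1 / n                    # observed minus expected in group 1
        if n > 1:
            var += o * (n1 / n) * (n2 / n) * (n - o) / (n - 1)
    return num * num / var

# Hypothetical implant failure times (months); 1 = implant lost, 0 = censored
g1 = ([12, 20, 30, 40, 50], [1, 0, 1, 0, 0])
g2 = ([5, 8, 14, 25, 60], [1, 1, 1, 0, 0])
chi2 = logrank(*g1, *g2)
print(round(chi2, 3))  # compare against 3.84, the chi-squared(1 df) critical value at alpha=.05
```

A statistic below 3.84 corresponds to P ≥ .05, i.e. no significant difference in survival distributions, which is the form of result this abstract reports.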
2013-01-01
Background Designs and analyses of clinical trials with a time-to-event outcome almost invariably rely on the hazard ratio to estimate the treatment effect and implicitly, therefore, on the proportional hazards assumption. However, the results of some recent trials indicate that there is no guarantee that the assumption will hold. Here, we describe the use of the restricted mean survival time as a possible alternative tool in the design and analysis of these trials. Methods The restricted mean is a measure of average survival from time 0 to a specified time point, and may be estimated as the area under the survival curve up to that point. We consider the design of such trials according to a wide range of possible survival distributions in the control and research arm(s). The distributions are conveniently defined as piecewise exponential distributions and can be specified through piecewise constant hazards and time-fixed or time-dependent hazard ratios. Such designs can embody proportional or non-proportional hazards of the treatment effect. Results We demonstrate the use of restricted mean survival time and a test of the difference in restricted means as an alternative measure of treatment effect. We support the approach through the results of simulation studies and real examples from several cancer trials. We illustrate the required sample size under proportional and non-proportional hazards, as well as the significance level and power of the proposed test. Values are compared with those from the standard approach, which utilizes the logrank test. Conclusions We conclude that the hazard ratio cannot be recommended as a general measure of the treatment effect in a randomized controlled trial, nor is it always appropriate when designing a trial. Restricted mean survival time may provide a practical way forward and deserves greater attention. PMID:24314264
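The restricted mean defined above, the area under the survival curve from 0 to a chosen time t*, can be computed directly from a Kaplan-Meier estimate. A self-contained Python sketch with hypothetical data (not the authors' implementation):

```python
def kaplan_meier(times, events):
    """Return [(time, S(time))] at each distinct event time (1=event, 0=censored)."""
    data = sorted(zip(times, events))
    n_at_risk, s, curve = len(data), 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(1 for tt, e in data if tt == t and e == 1)  # events at t
        m = sum(1 for tt, _ in data if tt == t)             # all leaving the risk set at t
        if d > 0:
            s *= 1.0 - d / n_at_risk                        # product-limit update
            curve.append((t, s))
        n_at_risk -= m
        i += m
    return curve

def rmst(curve, t_star):
    """Area under the KM step function from 0 to t_star (S=1 before the first event)."""
    area, prev_t, prev_s = 0.0, 0.0, 1.0
    for t, s in curve:
        if t >= t_star:
            break
        area += prev_s * (t - prev_t)
        prev_t, prev_s = t, s
    return area + prev_s * (t_star - prev_t)

times  = [2, 3, 3, 5, 8, 10, 12]   # months (hypothetical)
events = [1, 1, 0, 1, 1, 0, 1]     # 1 = event, 0 = censored
print(round(rmst(kaplan_meier(times, events), 12), 3))  # → 7.321
```

Because the restricted mean is just an area, the difference in RMST between two arms remains interpretable (months of survival gained up to t*) whether or not hazards are proportional, which is the paper's motivation for using it in design and analysis.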
Guyot, Patricia; Ades, A E; Ouwens, Mario J N M; Welton, Nicky J
2012-02-01
The results of Randomized Controlled Trials (RCTs) on time-to-event outcomes that are usually reported are the median time to event and the Cox hazard ratio. These do not constitute the sufficient statistics required for meta-analysis or cost-effectiveness analysis, and their use in secondary analyses requires strong assumptions that may not have been adequately tested. In order to enhance the quality of secondary data analyses, we propose a method which derives from the published Kaplan-Meier survival curves a close approximation to the original individual patient time-to-event data from which they were generated. We develop an algorithm that maps from digitised curves back to KM data by finding numerical solutions to the inverted KM equations, using, where available, information on numbers of events and numbers at risk. The reproducibility and accuracy of survival probabilities, median survival times and hazard ratios based on reconstructed KM data were assessed by comparing published statistics (survival probabilities, medians and hazard ratios) with statistics based on repeated reconstructions by multiple observers. The validation exercise established that there was no material systematic error and that there was a high degree of reproducibility for all statistics. Accuracy was excellent for survival probabilities and medians; for hazard ratios, reasonable accuracy can be obtained only if at least the numbers at risk or the total number of events are reported. The algorithm is a reliable tool for meta-analysis and cost-effectiveness analyses of RCTs reporting time-to-event data. It is recommended that all RCTs report information on numbers at risk and total number of events alongside KM curves.
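The core inversion step this abstract describes can be illustrated in Python. This is a simplified sketch with hypothetical, censoring-free inputs, not the published algorithm (which additionally allocates censoring within intervals): the KM product-limit relation S_i = S_{i-1}(1 − d_i/n_i) is solved for the event count d_i at each digitised curve point.

```python
def invert_km(surv_probs, n_at_risk):
    """Recover event counts d_i from digitised KM survival probabilities S(t_i)
    and numbers at risk n_i, by inverting S_i = S_{i-1} * (1 - d_i / n_i)."""
    events, s_prev = [], 1.0
    for s, n in zip(surv_probs, n_at_risk):
        d = round(n * (1.0 - s / s_prev))  # d_i = n_i * (1 - S_i / S_{i-1}), nearest integer
        events.append(d)
        s_prev = s
    return events

# Curve generated from 10 subjects with d = [2, 1, 2] events and no censoring:
# S = 1*(8/10) = 0.8, 0.8*(7/8) = 0.7, 0.7*(5/7) = 0.5, with n = [10, 8, 7]
print(invert_km([0.8, 0.7, 0.5], [10, 8, 7]))  # → [2, 1, 2]
```

Rounding to the nearest integer absorbs digitisation error in the curve coordinates; the full algorithm iterates this idea over intervals between reported numbers-at-risk to distribute censored observations as well.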
Serine/threonine protein phosphatase 6 modulates the radiation sensitivity of glioblastoma
Shen, Y; Wang, Y; Sheng, K; Fei, X; Guo, Q; Larner, J; Kong, X; Qiu, Y; Mi, J
2011-01-01
Increasing the sensitivity of glioblastoma cells to radiation is a promising approach to improve survival in patients with glioblastoma multiforme (GBM). This study aims to determine whether the serine/threonine phosphatase protein phosphatase 6 (PP6) is a molecular target for GBM radiosensitization treatment. The GBM orthotopic xenograft mouse model was used in this study. Our data demonstrated that the protein level of the PP6 catalytic subunit (PP6c) was upregulated in GBM tissue from about 50% of patients compared with the surrounding tissue or control tissue. Both the in vitro survival fraction of GBM cells and the patient survival time were highly correlated or inversely correlated with PP6c expression (R2=0.755 and −0.707, respectively). We also found that siRNA knockdown of PP6c reduced DNA-dependent protein kinase (DNA-PK) activity in three different GBM cell lines, increasing their sensitivity to radiation. In the orthotopic mouse model, overexpression of PP6c in GBM U87 cells attenuated the effect of radiation treatment and reduced the survival time of mice compared with control mice, whereas PP6c knockdown improved the effect of radiation treatment and increased the survival time of mice. These findings demonstrate that PP6 regulates the sensitivity of GBM cells to radiation, and suggest that small molecules disrupting or inhibiting the PP6 association with DNA-PK are potential radiosensitizers for GBM. PMID:22158480
Raedel, Michael; Fiedler, Cliff; Jacoby, Stephan; Boening, Klaus W
2015-07-01
Scientific data about the long-term survival of teeth treated with cast post and cores are scarce. Retrospective studies often use different target events for their analyses, which complicates comparison. Little evidence exists on the effect of associated tooth-, jaw-, and patient-related factors on survival. The purpose of this study was to extend the knowledge on the survival of teeth treated with cast post and cores over observation periods of more than 10 years, and to evaluate whether survival times decrease or increase with the presence or absence of associated parameters. A retrospective evaluation was conducted of all cast post and cores inserted in 1 university clinic between January 1992 and June 2011. A Kaplan-Meier survival analysis was carried out using extraction as the target event. The survival curves for different tooth types, the presence or absence of adjacent teeth, and the prosthetic restoration of the respective jaws were compared using the log-rank test (α=.05). A Cox regression model was calculated for multivariate analyses. A total of 717 cast post and cores for 343 patients were recorded. The mean survival time was 13.5 years. A statistically significant decrease in survival times was found for canines (11.9 years) and premolars (13.4 years) versus molars (14.1 years), for no adjacent teeth (10.6 years) versus at least 1 adjacent tooth (13.8 years), and for restoration with removable dental prostheses (12.5 years) versus fixed dental prostheses and single crowns (13.9 years). The largest reduction in survival time was found for teeth used as an abutment for a double crown-retained removable partial dental prosthesis (telescopic denture) (9.8 years). Tooth type and adjacent tooth status remained significant variables within the multivariate Cox regression model. Cast post and cores have an acceptable long-term survival time.
Because different factors may influence survival, considering these factors in treatment planning may increase the long-term success of these restorations. Copyright © 2015 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
Niu, Xiaoyu; Rajanbabu, Anupama; Delisle, Megan; Peng, Feng; Vijaykumar, Dehannathuparambil K; Pavithran, Keechilattu; Feng, Yukuan; Lau, Susie; Gotlieb, Walter H; Press, Joshua Z
2013-09-01
To explore the impact of treatment modality on survival in patients with brain metastases from epithelial ovarian cancer. We conducted a retrospective review of cases of ovarian cancer with brain metastases treated at institutions in three countries (Canada, China, and India) and conducted a search for studies regarding brain metastases in ovarian cancer reporting survival related to treatment modality. Survival was analyzed according to treatment regimens involving (1) some form of surgical excision or gamma-knife radiation with or without other modalities, (2) other modalities without surgery or gamma-knife radiation, or (3) palliation only. Twelve patients (mean age 56 years) with detailed treatment/outcome data were included; five were from China, four from Canada, and three from India. Median time from diagnosis of ovarian cancer to brain metastasis was 19 months (range 10 to 37 months), and overall median survival time from diagnosis of ovarian cancer was 38 months (13 to 82 months). Median survival time from diagnosis of brain metastasis was 17 months (1 to 45 months). Among patients who had multimodal treatment including gamma-knife radiotherapy or surgical excision, the median survival time after the identification of brain metastasis was 25.6 months, compared with 6.0 months in patients whose treatment did not include this type of focused localized modality (P = 0.006). Analysis of 20 studies also indicated that use of gamma-knife radiotherapy and excisional surgery in multi-modal treatment resulted in improved median survival interval (25 months vs. 6.0 months, P < 0.001). In the subset of patients with brain metastases from ovarian cancer, prolonged survival may result from use of multidisciplinary therapy, particularly if metastases are amenable to localized treatments such as gamma-knife radiotherapy and surgical excision.
Gaertner, Katharina; Müllner, Michael; Friehs, Helmut; Schuster, Ernst; Marosi, Christine; Muchitsch, Ilse; Frass, Michael; Kaye, Alan David
2014-04-01
Current literature suggests a positive influence of additive classical homeopathy on global health and well-being in cancer patients. Besides encouraging case reports, there is little if any research on long-term survival of patients who obtain homeopathic care during cancer treatment. Data from cancer patients who had undergone homeopathic treatment complementary to conventional anti-cancer treatment at the Outpatient Unit for Homeopathy in Malignant Diseases, Medical University Vienna, Department of Medicine I, Vienna, Austria, were collected and described, and a retrospective subgroup analysis with regard to survival time was performed. Patient inclusion criteria were at least three homeopathic consultations, fatal prognosis of disease, and quantitative and qualitative description of patient characteristics and survival time. Over four years, a total of 538 patients were recorded to have visited the Outpatient Unit for Homeopathy in Malignant Diseases, Medical University Vienna, Department of Medicine I, Vienna, Austria. 62.8% of them were women, and nearly 20% had breast cancer. Of the 53.7% (n=287) who had undergone at least three homeopathic consultations within four years, 18.7% (n=54) fulfilled the inclusion criteria for survival analysis. The surveyed neoplasms were glioblastoma, lung, cholangiocellular and pancreatic carcinomas, metastasized sarcoma, and renal cell carcinoma. Median overall survival was compared to expert expectations of survival outcomes by specific cancer type and was prolonged across observed cancer entities (p<0.001). Extended survival time in this sample of cancer patients with fatal prognosis but additive homeopathic treatment is interesting. However, findings are based on a small sample, and with only limited data available about patient and treatment characteristics.
The relationship between homeopathic treatment and survival time requires prospective investigation in larger samples possibly using matched-pair control analysis or randomized trials. Copyright © 2014 Elsevier Ltd. All rights reserved.
Outcomes after medical and surgical interventions in horses with temporohyoid osteoarthropathy.
Espinosa, P; Nieto, J E; Estell, K E; Kass, P H; Aleman, M
2017-11-01
Temporohyoid osteoarthropathy (THO) is a cause of neurological disease in horses that is characterised by facial and vestibulocochlear nerve deficits. Studies reporting and comparing survival following medical or surgical treatment of THO are lacking. To compare survival and prognosis in horses with THO treated medically or surgically, and to report surgical complications. Retrospective study. The medical records of horses diagnosed with THO were retrieved, and data on signalment, clinical signs and duration, corneal ulceration and bilateral occurrence were recorded. Neurological severity was graded according to clinical signs. Preoperative radiographic and endoscopic images were graded according to the severity of changes. Factors potentially affecting survival and treatment were compared using Cox proportional hazards regression. A total of 77 horses were identified as having THO during the period 1990-2014. Of these, 25 horses underwent ceratohyoid ostectomy (CHO) and eight underwent partial stylohyoid ostectomy (PSHO). Thirteen of 20, one of 25 and one of eight horses treated by medical therapy, CHO and PSHO, respectively, died or were subjected to euthanasia as a consequence of THO. Compared with CHO, medical therapy was significantly associated with nonsurvival, but there were no significant differences in survival between horses undergoing PSHO and medical therapy. The duration of clinical signs, and neurological, radiographic and endoscopic grades were not associated with survival of THO. However, the age of the horse was significantly associated with poorer survival. Survival time was significantly shorter in the medical therapy group compared with the two surgical groups combined, but did not differ significantly between the two surgical groups. No significant difference between groups was seen in the incidence of surgical complications (33.3% in the PSHO and 22.2% in the CHO group). 
This was a nonrandomised study of treatment effects on survival and included a low number of cases. The survival prognosis in horses with THO is good to excellent in those submitted to surgical intervention, and fair in those treated with medical therapy alone. © 2017 EVJ Ltd.
Wattmo, Carina; Londos, Elisabet; Minthon, Lennart
2016-08-31
The survival time in nursing homes (NHs) in Alzheimer's disease (AD) might be affected by sociodemographic/clinical characteristics, rate of disease progression, and use of specific medications and community-based services. Whether different aspects of cholinesterase inhibitor (ChEI) therapy modify time spent in NHs is unclear. Therefore, we examined the relationship between these potential predictors and survival time in NHs. This prospective, multicenter study of ChEI treatment in clinical practice included 220 deceased patients clinically diagnosed with mild-to-moderate AD who were admitted to NHs during the study. Cognitive and activities of daily living (ADL) performance, ChEI dose, and amount of services used per week were evaluated every 6 months over 3 years. Dates of nursing-home placement (NHP) and death were recorded. Variables that determined survival time in NHs were analyzed using general linear models. The mean survival time in NHs was 4.06 years (men, 2.78 years; women, 4.53 years; P < 0.001). The multivariate model showed that a shorter stay in NHs was associated with the interaction term 'male living with a family member', with use of antihypertensive/cardiac therapy or anxiolytics/sedatives/hypnotics, and with worse basic ADL at NHP, but not with age or cognitive and instrumental ADL capacities. Increased community-based care did not reduce the survival time in NHs among individuals with AD. Men living with family spent significantly less time in NHs compared with the corresponding women, which suggests that the situation of female spouses of AD patients may need attention and possibly support. There was no indication that different aspects of ChEI therapy, e.g., drug type, dose, or duration, alter survival time in NHs.
Coombe, Robyn; Lisy, Karolina; Campbell, Jared; Perry, Gajen; Prasannan, Subhita
2017-08-01
The objective of this systematic review is to assess the effectiveness of aggressive treatment of oligometastatic breast cancer (OMBC) on survival outcomes by conducting a meta-analysis of currently available evidence. More specifically, the objectives are to identify the effectiveness of intensified multidisciplinary treatment with aggressive locoregional therapies on survival time, five-year survival rates and disease-free survival. The population is adult women (18 years and over) with OMBC, defined as single or few (five or fewer) metastases limited to a single organ, and the comparator is conventional palliative treatment aimed at disease control. Secondary objectives to be assessed will be adverse outcomes associated with intensified treatment regimens.
Racial disparities in advanced stage colorectal cancer survival
Wallace, Kristin; Hill, Elizabeth G.; Lewin, David N.; Williamson, Grace; Oppenheimer, Stephanie; Ford, Marvella E.; Wargovich, Michael J.; Berger, Franklin G.; Bolick, Susan W.; Thomas, Melanie B.; Alberg, Anthony J.
2013-01-01
Purpose African Americans (AA) have a higher incidence of and lower survival from colorectal cancer (CRC) compared to European Americans (EA). In the present study, statewide, population-based data from the South Carolina Central Cancer Registry (SCCCR) are used to investigate the relationship between race and age on advanced stage CRC survival. Methods The study population comprised 3865 advanced pathologically documented colon and rectal adenocarcinoma cases diagnosed between 01 January 1996 and 31 December 2006: 2673 (69%) EA and 1192 (31%) AA. Kaplan-Meier methods were used to generate median survival times and corresponding 95% confidence intervals (CI) by race, age, and gender. Factors associated with survival were evaluated by fitting Cox proportional hazards (CPH) regression models to generate Hazard Ratios (HR) and 95% CI. Results We observed a significant interaction between race and age on CRC survival (p = 0.04). Among younger patients (< 50 years), AA race was associated with a 1.34-fold (95% CI 1.06-1.71) higher risk of death compared to EA. Among older patients, we observed a modestly increased risk of death among AA men compared to EA (HR 1.16; 95% CI 1.01-1.32) but no difference by race among women (HR 0.94; 95% CI 0.82-1.08). Moreover, we observed that the disparity in survival has worsened over the past 15 years. Conclusions Future studies that integrate clinical, molecular, and treatment-related data are needed to advance understanding of the racial disparity in CRC survival, especially for those < 50 years old. PMID:23296454
DeRouen, Mindy C; Mujahid, Mahasin; Srinivas, Sandy; Keegan, Theresa H.M.
2016-01-01
Purpose: Testicular cancer is the most common cancer among adolescent and young adult (AYA) men 15–39 years of age. This study aims to determine whether race/ethnicity and/or neighborhood socioeconomic status (SES) contribute independently to survival of AYAs with testicular cancer. Methods: Data on 14,249 eligible AYAs with testicular cancer diagnosed in California between 1988 and 2010 were obtained from the population-based California Cancer Registry. Multivariable Cox proportional hazards regression was used to examine overall and testicular cancer-specific survival and survival for the seminoma and nonseminoma histologic subtypes according to race/ethnicity, census-tract level neighborhood SES, and other patient and clinical characteristics. Results: Compared with White AYAs, Hispanic AYAs had worse overall and testicular cancer-specific survival (hazard ratio [HR], 1.21; 95% confidence interval [CI], 1.07–1.37) and Black AYAs had worse overall survival (HR, 1.41; 95% CI, 1.01–1.97), independent of neighborhood SES and other demographic and clinical factors. Racial/ethnic disparities in survival were more pronounced for nonseminoma than for seminoma. AYAs residing in middle and low SES neighborhoods experienced worse survival across both histologic subtypes independent of race/ethnicity and other factors, while improvements in survival over time were more pronounced for seminoma. Longer time to treatment was also associated with worse survival, particularly for AYAs with nonseminoma. Conclusion: Among AYAs, race/ethnicity, and neighborhood SES are independently associated with survival after testicular cancer. Variation in disparities by histologic type according to demographic factors, year of diagnosis, and time to treatment may reflect differences in prognosis and extent of treatment for the two histologies. PMID:26812451
Goldberg, Richard M; Sargent, Daniel J; Morton, Roscoe F; Fuchs, Charles S; Ramanathan, Ramesh K; Williamson, Stephen K; Findlay, Brian P; Pitot, Henry C; Alberts, Steven
2006-07-20
Previously, we reported results of Intergroup N9741, which compared standard bolus fluorouracil (FU), leucovorin, plus irinotecan (IFL) with infused FU, leucovorin, plus oxaliplatin (FOLFOX4) and irinotecan plus oxaliplatin in patients with untreated metastatic colorectal cancer. High rates of grade ≥3 toxicity on IFL (resulting in some deaths) led us to reduce the starting doses of both irinotecan and FU by 20% (rIFL). This article compares rIFL with FOLFOX4. The primary comparison was time to progression, with secondary end points of response rate (RR), overall survival, and toxicity. Three hundred five patients were randomly assigned. The North Central Cancer Treatment Group Data Safety Monitoring Committee interrupted enrollment at a planned interim analysis when outcomes crossed predetermined stopping boundaries. The results were significantly superior for FOLFOX4 compared with rIFL for time to progression (9.7 v 5.5 months, respectively; P < .0001), RR (48% v 32%, respectively; P = .006), and overall survival (19.0 v 16.3 months, respectively; P = .026). Toxicity profiles were not significantly different between regimens for nausea, vomiting, diarrhea, febrile neutropenia, dehydration, or 60-day all-cause mortality. Sensory neuropathy and neutropenia were significantly more common with FOLFOX4. Approximately 75% of patients in both arms received second-line therapy; 58% of rIFL patients received oxaliplatin-based second-line therapy, and 55% of FOLFOX4 patients received irinotecan-based regimens as second-line therapy. FOLFOX4 led to superior RR, time to progression, and overall survival compared with rIFL. The survival benefit for FOLFOX4 observed in the earlier stage of the study was preserved with equal use of either irinotecan or oxaliplatin as second-line therapy.
Bregar, Amy J; Alejandro Rauh-Hain, J; Spencer, Ryan; Clemmer, Joel T; Schorge, John O; Rice, Laurel W; Del Carmen, Marcela G
2017-04-01
To examine patterns of care and survival for Hispanic women compared to white and African American women with high-grade endometrial cancer. We utilized the National Cancer Data Base (NCDB) to identify women diagnosed with uterine grade 3 endometrioid adenocarcinoma, carcinosarcoma, clear cell carcinoma and papillary serous carcinoma between 2003 and 2011. The effect of treatment on survival was analyzed using the Kaplan-Meier method. Factors predictive of outcome were compared using the Cox proportional hazards model. 43,950 women were eligible. African American and Hispanic women had higher rates of stage III and IV disease compared to white women (36.5% vs. 36% vs. 33.5%, p<0.001). African American women were less likely to undergo surgical treatment for their cancer (85.2% vs. 89.8% vs. 87.5%, p<0.001) and were more likely to receive chemotherapy (36.8% vs. 32.4% vs. 32%, p<0.001) compared to white and Hispanic women. Over the entire study period, after adjusting for age, time period of diagnosis, region of the country, urban or rural setting, treating facility type, socioeconomic status, education, insurance, comorbidity index, pathologic stage, histology, lymphadenectomy and adjuvant treatment, African American women had lower overall survival compared to white women (Hazard Ratio 1.21, 95% CI 1.16-1.26). Conversely, Hispanic women had improved overall survival compared to white women after controlling for the aforementioned factors (HR 0.87, 95% CI 0.80-0.93). Among women with high-grade endometrial cancer, African American women have lower all-cause survival while Hispanic women have higher all-cause survival compared to white women after controlling for treatment, sociodemographic, comorbidity and histopathologic variables. Copyright © 2017 Elsevier Inc. All rights reserved.
Clinical outcome in 20 cases of lingual hemangiosarcoma in dogs: 1996-2011.
Burton, J H; Powers, B E; Biller, B J
2014-09-01
With the exception of solar-induced dermal hemangiosarcoma (HSA), the biologic behaviour of canine HSA is characterised by rapid tumour growth, a high metastatic rate and short survival times. Outcome of dogs with HSA of the tongue has not been previously reported. The purpose of this study was to assess outcome and prognostic factors in dogs with lingual HSA. Clinical data was collected retrospectively and histopathology was reviewed for 20 dogs. Median progression free survival was 524 days and the median overall survival time was 553 days. All dogs had low or intermediate grade tumours; most tumours were small and located on the ventral surface of the tongue. Prognostic factors significantly associated with increased survival included small tumour size and absence of clinical signs of an oral mass at the time of diagnosis. Dogs with HSA confined to the tongue may have a better prognosis compared with HSA in other organs. © 2012 John Wiley & Sons Ltd.
New insights about host response to smallpox using microarray data.
Esteves, Gustavo H; Simoes, Ana C Q; Souza, Estevao; Dias, Rodrigo A; Ospina, Raydonal; Venancio, Thiago M
2007-08-24
Smallpox is a lethal disease that was endemic in many parts of the world until eradicated by massive immunization. Due to its lethality, there are serious concerns about its use as a bioweapon. Here we analyze publicly available microarray data to further understand survival of smallpox-infected macaques, using systems biology approaches. Our goal is to improve knowledge about the progression of this disease. We used KEGG pathway annotations to define groups of genes (or modules), and subsequently compared them to macaque survival times. This approach provided additional insights into the host response to this disease, such as increased expression of cytokines and ECM receptors in individuals with longer survival times. These results may indicate that these gene groups contribute to an effective host response to smallpox. Macaques with longer survival times clearly express specific pathways that went unidentified with conventional gene-by-gene approaches. Our work also shows how third-party analysis of public datasets can support new hypotheses for relevant biological problems.
Díaz Pérez, Gilmer A; Pow-Sang Godoy, Mariela; Morante Deza, Carlos; Meza Montoya, Luis; Destefano Urrutia, Víctor
2009-02-01
Testicular lymphoma is a rare disease with distinctive characteristics and a poor prognosis. We retrospectively evaluated 32 patients, studying their epidemiologic characteristics, hematologic values, histologic type, sites of metastasis, treatment given, and survival, and compared our results with international reports; prospective studies are needed to draw firmer conclusions. The average age was 45 years, more than half of the patients had clinical stage IV disease at the time of diagnosis, and the histiocytic type was the most frequent. Mean survival time was 39.543 ± 14.451 months, and the time by which 50% of the patients had died was 15 ± 7.025 months, confirming a very poor prognosis.
Han, Seong Kyu; Lee, Dongyeop; Lee, Heetak; Kim, Donghyo; Son, Heehwa G; Yang, Jae-Seong; Lee, Seung-Jae V; Kim, Sanguk
2016-08-30
Online application for survival analysis (OASIS) has served as a popular and convenient platform for the statistical analysis of various survival data, particularly in the field of aging research. With the recent advances in the fields of aging research that deal with complex survival data, we noticed a need for updates to the current version of OASIS. Here, we report OASIS 2 (http://sbi.postech.ac.kr/oasis2), which provides extended statistical tools for survival data and an enhanced user interface. In particular, OASIS 2 enables the statistical comparison of maximal lifespans, which is potentially useful for determining key factors that limit the lifespan of a population. Furthermore, OASIS 2 provides statistical and graphical tools that compare values in different conditions and times. That feature is useful for comparing age-associated changes in physiological activities, which can be used as indicators of "healthspan." We believe that OASIS 2 will serve as a standard platform for survival analysis with advanced and user-friendly statistical tools for experimental biologists in the field of aging research.
Balog, K; Huang, A A; Sum, S O; Moore, G E; Thompson, C; Scott-Moncrieff, J C
2013-01-01
Dogs with immune-mediated thrombocytopenia (ITP) are at risk of hemorrhage when platelet count is <50,000/μL. Treatment with vincristine (VINC) or human intravenous immunoglobulin (hIVIG) decreases platelet recovery time compared with treatment with corticosteroids alone. To compare the effect of hIVIG versus VINC on platelet recovery in dogs with ITP. Prospective, randomized study. Twenty dogs with idiopathic ITP (platelet count <16,000/μL) were enrolled. All dogs were treated with corticosteroids. Dogs were randomly assigned to receive a single dose of hIVIG (0.5 g/kg) or VINC (0.02 mg/kg). Outcome measures were platelet recovery time, duration of hospitalization, and survival to discharge. There was no significant difference in age, sex, weight, or initial platelet count between dogs treated with hIVIG (n = 10) and dogs treated with VINC (n = 10). Median platelet recovery time for both groups was 2.5 days (P = .51). Median hospitalization time for all dogs that survived to discharge was 4 days and not different between groups (P = .29). Seven of 10 dogs in the hIVIG group and 10 of 10 in the VINC group survived to discharge. Survival analysis did not identify any significant difference between the groups at discharge, 6 months, and 1 year after entry into the study. No adverse effects were reported in either group. Vincristine should be the first-line adjunctive treatment for the acute management of canine ITP because of lower cost and ease of administration compared with human intravenous immunoglobulin (hIVIG). Copyright © 2013 by the American College of Veterinary Internal Medicine.
Relationship between microRNA-146a expression and plasma renalase levels in hemodialyzed patients
Koch, Wojciech; Kukula-Koch, Wirginia; Gaweł, Kinga; Bednarek-Skublewska, Anna; Małecka-Massalska, Teresa; Milanowski, Janusz; Petkowicz, Beata; Solski, Janusz
2017-01-01
Background microRNA (miRNA) belongs to the family of non-coding RNAs responsible for the regulation of gene expression. Renalase is a 342-amino-acid protein secreted by the kidneys that may play an important role in the regulation of sympathetic tone and blood pressure. The aim of the present study was to investigate plasma renalase concentration and explore the relationship between miRNA-146a-5p expression and plasma renalase levels in hemodialyzed patients. Methods The study population comprised 55 subjects who suffered various cardiac events, 27 women and 28 men, aged 65–70 years. Total RNA including the miRNA fraction was isolated using the Qiagen miRNeasy Serum/Plasma kit according to the manufacturer's protocol. The isolated miRNAs were analyzed using a quantitative polymerase chain reaction (qRT-PCR) technique. Plasma renalase levels were measured using a commercial ELISA kit. Results In the group of patients with high levels of renalase, higher miRNA-146a expression was found compared with those with low concentrations of renalase. Patients with simultaneously low miRNA-146a expression and high levels of renalase were found to have a significantly longer survival time compared with other patients. Conclusions miRNA-146a and plasma renalase levels were identified as independent prognostic factors for hemodialyzed patients' survival time. Patients with low miRNA-146a expression demonstrated a significantly longer survival time in contrast to patients with a high expression level of miRNA-146a. Moreover, a significantly longer survival time was found in patients with high renalase activity compared with patients with low activity of the enzyme. PMID:28614373
The Vitamin B12 Analog Cobinamide Is an Effective Antidote for Oral Cyanide Poisoning.
Lee, Jangwoen; Mahon, Sari B; Mukai, David; Burney, Tanya; Katebian, Behdod S; Chan, Adriano; Bebarta, Vikhyat S; Yoon, David; Boss, Gerry R; Brenner, Matthew
2016-12-01
Cyanide is a major chemical threat, and cyanide ingestion carries a higher risk of a supra-lethal dose exposure compared to inhalation, but it provides an opportunity for effective treatment due to a longer treatment window and a gastrointestinal cyanide reservoir that could be neutralized prior to systemic absorption. We hypothesized that orally administered cobinamide may function as a high-affinity cyanide scavenger and that gastric alkalinization would reduce cyanide absorption and concurrently increase cobinamide binding, further enhancing antidote effectiveness. Thirty New Zealand white rabbits were divided into five groups and given a lethal oral dose of cyanide (50 mg). Survival time was monitored after oral cyanide alone, after oral cyanide with gastric alkalinization using oral sodium bicarbonate buffer (500 mg), and after bicarbonate combined with either aquohydroxocobinamide or dinitrocobinamide (250 mM). Red blood cell cyanide, plasma cobinamide, and thiocyanate concentrations were measured from blood samples. In animals that ingested cyanide, oral sodium bicarbonate alone significantly prolonged survival time to 20.3 ± 8.6 min compared to 10.5 ± 4.3 min in saline-treated controls, but did not lead to overall survival. Aquohydroxocobinamide and dinitrocobinamide increased survival time to 64 ± 41 (p < 0.05) and 75 ± 16.4 min (p < 0.001), respectively. Compared to aquohydroxocobinamide, dinitrocobinamide showed greater systemic absorption and reduced blood pressure. Dinitrocobinamide also markedly increased the red blood cell cyanide concentration. Under all conditions, the plasma thiocyanate concentration gradually increased with time. This study demonstrates a promising new approach to treating high-dose cyanide ingestion, with gastric alkalinization alone and in combination with oral cobinamide, for a supra-lethal dose of orally administered cyanide in rabbits.
Rapid on-site defibrillation versus community program.
Fedoruk, J C; Paterson, D; Hlynka, M; Fung, K Y; Gobet, Michael; Currie, Wayne
2002-01-01
For patients who suffer out-of-hospital cardiac arrest, the time from collapse to initial defibrillation is the single most important factor affecting survival to hospital discharge. The purpose of this study was to compare the survival rates of cardiac arrest victims within an institution that has a rapid defibrillation program with those of its own urban community's tiered EMS system. A logistic regression analysis of a retrospective data series (n = 23) and a comparative analysis against a second retrospective data series (n = 724) were performed for the study period September 1994 to September 1999. The first data series included all persons at Casino Windsor who suffered a cardiac arrest. Data collected included: age, gender, death/survival (neurologically intact discharge), presenting rhythm (ventricular fibrillation (VF), ventricular tachycardia (VT), or other), time of collapse, time to arrival of security personnel, time to initiation of cardiopulmonary resuscitation (CPR) prior to defibrillation (when applicable), time to arrival of staff nurse, time to initial defibrillation, and time to return of spontaneous circulation (if any). Notably, all arrests within this series were witnessed by the surveillance camera systems, allowing the time of collapse to be accurately determined rather than estimated. These data were compared to those of similar events, times, and intervals for all patients in the greater Windsor area who suffered cardiac arrest. This second series was based upon the Ontario Prehospital Advanced Life Support (OPALS) Study database, as coordinated by the Clinical Epidemiology Unit of the Ottawa Hospital, University of Ottawa. Casino Windsor had 23 cases of cardiac arrest. Of these, 13 (56.5%) were male and 10 (43.5%) were female. All cases (100%) were witnessed. The average age was 61.1 years, the average time to initial defibrillation was 7.7 minutes, and the average time for EMS to reach the patient was 13.3 minutes. The presenting rhythm was VF/VT in 91% of cases. Fifteen patients were discharged alive from hospital, for a 65% survival rate. The greater Windsor study area included 668 cases of out-of-hospital cardiac arrest: of these, 410 (61.4%) were male and 258 (38.6%) were female; 365 (54.6%) were witnessed and 303 (45.4%) were not. The initial rhythm was VF/VT in 34.3%. Thirty-seven (5.5%) were discharged alive from the hospital. This study provides further evidence that public access defibrillation (PAD) programs may enhance cardiac arrest survival rates and should be considered for any venue with large numbers of adults as well as areas with difficult medical access.
Mechanical circulatory support as a bridge to cardiac retransplantation: a single center experience.
Clerkin, Kevin J; Thomas, Sunu S; Haythe, Jennifer; Schulze, P Christian; Farr, Maryjane; Takayama, Hiroo; Jorde, Ulrich P; Restaino, Susan W; Naka, Yoshifumi; Mancini, Donna M
2015-02-01
Cardiac retransplantation is increasing in frequency. Recent data have shown that retransplantation outcomes are now comparable with primary transplantation. The use of mechanical circulatory support (MCS) as a bridge to retransplantation has similar post-retransplant outcomes to those without MCS, but the success of bridging patients to retransplant with MCS has not been well studied. From January 2000 to February 2014 at Columbia University Medical Center, 84 patients were listed for retransplantation. Of this cohort, 48 patients underwent retransplantation, 15 were bridged with MCS, 24 died, and 6 clinically improved. A retrospective analysis was performed examining waiting list time, survival to retransplantation, and survival after retransplant. The effect of the United Network of Organ Sharing (UNOS) allocation policy change in 2006 on waiting list time and MCS use was also investigated. Of 48 patients who underwent retransplantation, 11 were bridged with MCS. Overall 1-year survival to retransplantation was 81.3%. There was no significant difference in waiting list survival (p = 0.71) in those with and without MCS. Death from cardiac arrest or multiorgan failure with infection was more frequent in the medically managed group (p = 0.002). After the UNOS 2006 allocation policy change, waiting list time (599 ± 936 days in Era 1 vs 526 ± 498 days in Era 2, p = 0.65) and waiting list survival (p = 0.22) between eras were comparable, but there was a trend toward greater use of MCS (p = 0.13). Survival after retransplant was acceptable. The use of MCS as a bridge to cardiac retransplantation is a reasonable strategy. Copyright © 2015 International Society for Heart and Lung Transplantation. Published by Elsevier Inc. All rights reserved.
Wang, Shengfei; Huang, Yangle; Xie, Juntao; Zhuge, Lingdun; Shao, Longlong; Xiang, Jiaqing; Zhang, Yawei; Sun, Yihua; Hu, Hong; Chen, Sufeng; Lerut, Toni; Luketich, James D; Zhang, Jie; Chen, Haiquan
2018-03-01
Although endoscopic resection (ER) may be sufficient treatment for early-stage esophageal cancer, additional treatment is recommended when there is a high risk of cancer recurrence. It is unclear whether delaying esophagectomy by performing and assessing the success of ER affects outcomes as compared with immediate esophagectomy without ER. Additionally, long-term survival after sequential ER and esophagectomy requires further investigation. Between 2011 and 2015, 48 patients with stage T1 esophageal cancer underwent esophagectomy after ER with curative intent at our institution. Two-to-one propensity score methods were used to identify 96 matched control patients treated with esophagectomy only, matched on baseline patient and tumor characteristics and surgical approach. Time from initial evaluation to esophagectomy, relapse-free survival, overall survival, and postoperative complications were compared between the propensity-matched groups. In the ER + esophagectomy group, the time from initial evaluation to esophagectomy was significantly longer than in the esophagectomy only group (114 vs. 8 days, p < 0.001). The incidence of dense adhesion (p = 0.347), operative time (p = 0.867), postoperative surgical complications (p = 0.966), and postoperative length of hospital stay (p = 0.125) were not significantly different between the groups. Moreover, recurrence-free survival and overall survival were also similar between the two groups (p = 0.411 and p = 0.817, respectively). Treatment of stage T1 esophageal cancer with ER prior to esophagectomy did not increase the difficulty of performing esophagectomy or the incidence of postoperative complications and did not affect survival after esophagectomy. These results suggest that ER can be recommended for patients with stage T1 cancer even if esophagectomy is eventually warranted.
Ackerman, Joshua T.; Herzog, Mark P.; Takekawa, John Y.; Hartman, Christopher A.
2014-01-01
Identifying differences in reproductive success rates of closely related and sympatrically breeding species can be useful for understanding limitations to population growth. We simultaneously examined the reproductive ecology of American avocets Recurvirostra americana and black-necked stilts Himantopus mexicanus using 1274 monitored nests and 240 radio-marked chicks in San Francisco Bay, California. Although there were 1.8 times more avocet nests than stilt nests, stilts nonetheless fledged 3.3 times more chicks. Greater production by stilts than avocets was the result of greater chick survival from hatching to fledging (avocet: 6%; stilt: 40%), and not because of differences in clutch size (avocet: 3.84; stilt: 3.77), nest survival (avocet: 44%; stilt: 35%), or egg hatching success (avocet: 90%; stilt: 92%). We reviewed the literature and confirmed that nest survival and hatching success are generally similar when avocets and stilts breed sympatrically. In addition to species, chick survival was strongly influenced by age, site, and year. In particular, daily survival rates increased rapidly with chick age, with 70% of mortalities occurring ≤ 1 week after hatch. California gulls Larus californicus caused 55% of avocet, but only 15% of stilt, chick deaths. Differential use of micro-habitats likely reduced stilt chicks' vulnerability to gull predation, particularly during the first week after hatch, because stilts nested in vegetation 2.7 times more often than avocets and vegetation height was 65% taller at stilt nests compared with avocet nests. Our results demonstrate that two co-occurring and closely related species with similar life history strategies can differ markedly in reproductive success, and simultaneous studies of such species can identify differences that limit productivity.
Zee, Jarcy; Xie, Sharon X.
2015-01-01
When a true survival endpoint cannot be assessed for some subjects, an alternative endpoint that measures the true endpoint with error may be collected; this often occurs when obtaining the true endpoint is too invasive or costly. We develop an estimated likelihood function for the situation where we have uncertain endpoints for all participants and true endpoints for only a subset of participants. We propose a nonparametric maximum estimated likelihood estimator of the discrete survival function of time to the true endpoint. We show that the proposed estimator is consistent and asymptotically normal. We demonstrate through extensive simulations that the proposed estimator has little bias compared to the naïve Kaplan-Meier survival function estimator, which uses only uncertain endpoints, and is more efficient with moderate missingness compared to the complete-case Kaplan-Meier survival function estimator, which uses only available true endpoints. Finally, we apply the proposed method to a dataset from the Alzheimer's Disease Neuroimaging Initiative to estimate the risk of developing Alzheimer's disease. PMID:25916510
Outcomes after liver transplantation of patients with Indo-Asian ethnicity.
Rocha, Chiara; Perera, M Thamara; Roberts, Keith; Bonney, Glenn; Gunson, Bridget; Nightingale, Peter; Bramhall, Simon R; Isaac, John; Muiesan, Paolo; Mirza, Darius F
2015-04-01
The impact of ethnicity on outcomes after orthotopic liver transplantation (OLT) is unclear. The British Indo-Asian population has a high incidence of liver disease but its contribution to the national deceased donor pool is small. We evaluated access to and outcomes of OLT in Indo-Asians. We compared 182 Indo-Asians with white patients undergoing OLT. Matching criteria were transplantation year, liver disease, age, and sex. Donor and recipient characteristics, postoperative outcomes (including patient and graft survival), and OLT era (early, 1987-2001; late, 2002-2011) were compared. Survival was also analyzed by underlying disease: acute liver failure (ALF) versus chronic liver failure. Indo-Asians had a higher incidence of diabetes. There were no differences in waiting time for transplantation, despite smaller body size and more uncommon blood groups (B, AB) among Indo-Asians. In the early era, patient survival for Indo-Asians with ALF was worse compared with whites. In the late era, graft and patient survival at 1, 2, and 5 years were similar between groups. This study demonstrates that Indo-Asian patients have equal access to OLT and comparable outcomes to whites in the United Kingdom. Survival has improved among Indo-Asian patients; this may be attributable to careful patient selection in cases of ALF, though improvements in patient management may have contributed.
Viglianti, BL; Lora-Michiels, M; Poulson, JM; Lan, Lan; Yu, D; Sanders, L; Craciunescu, O; Vujaskovic, Z; Thrall, DE; MacFall, J; Charles, HC; Wong, T; Dewhirst, MW
2009-01-01
Purpose This study tests whether DCE-MRI parameters obtained from canine patients with soft tissue sarcomas, treated with hyperthermia and radiotherapy, are predictive of therapeutic outcome. Experimental Design 37 dogs with soft tissue sarcomas had DCE-MRI performed prior to and following the first hyperthermia treatment. Signal enhancement for tumor and reference muscle were fitted empirically, yielding a washin/washout rate for the contrast agent and tumor AUC calculated from 0 to 60 s, 0 to 90 s, and to the time of maximal enhancement in the reference muscle. These parameters were then compared to local tumor control, metastasis-free survival, and overall survival. Results Pre-therapy rate of contrast agent washout was positively predictive of improved overall survival and metastasis-free survival, with hazard ratios of 0.67 (p = 0.015) and 0.68 (p = 0.012), respectively. After the first hyperthermia treatment, washin rate, AUC60, AUC90, and AUCt-max were predictive of improved overall survival and metastasis-free survival, with hazard ratios ranging from 0.46 to 0.53 (p < 0.002) and 0.44 to 0.55 (p < 0.004), respectively. DCE-MRI parameters were compared with extracellular pH and 31P-MR spectroscopy results (previously published) in the same patients, demonstrating a correlation. This suggested that an increase in perfusion after therapy was effective in eliminating excess acid from the tumor. Conclusions This study demonstrates that DCE-MRI has utility in predicting overall survival and metastasis-free survival in canine patients with soft tissue sarcomas. To our knowledge, this is the first time that DCE-MRI parameters have been shown to be predictive of clinical outcome for soft tissue sarcomas. PMID:19622579
Louie, Alexander V.; Rodrigues, George
Purpose: To compare the quality-adjusted life expectancy and overall survival in patients with Stage I non-small-cell lung cancer (NSCLC) treated with either stereotactic body radiation therapy (SBRT) or surgery. Methods and Materials: We constructed a Markov model to describe health states after either SBRT or lobectomy for Stage I NSCLC for a 5-year time frame. We report various treatment strategy survival outcomes stratified by age, sex, and pack-year history of smoking, and compared these with an external outcome prediction tool (Adjuvant! Online). Results: Overall survival, cancer-specific survival, and other causes of death as predicted by our model correlated closely with those predicted by the external prediction tool. Overall survival at 5 years as predicted by baseline analysis of our model is in favor of surgery, with a benefit ranging from 2.2% to 3.0% for all cohorts. Mean quality-adjusted life expectancy ranged from 3.28 to 3.78 years after surgery and from 3.35 to 3.87 years for SBRT. The utility threshold for preferring SBRT over surgery was 0.90. Outcomes were sensitive to quality of life, the proportion of local and regional recurrences treated with standard vs. palliative treatments, and the surgery- and SBRT-related mortalities. Conclusions: The role of SBRT in the medically operable patient is yet to be defined. Our model indicates that SBRT may offer comparable overall survival and quality-adjusted life expectancy as compared with surgical resection. Well-powered prospective studies comparing surgery vs. SBRT in early-stage lung cancer are warranted to further investigate the relative survival, quality of life, and cost characteristics of both treatment paradigms.
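A Markov model like the one described above propagates a cohort through health states with fixed per-cycle transition probabilities, accumulating utility-weighted time to yield quality-adjusted life expectancy. The following is a toy sketch only: the states, transition probabilities, and utilities are invented for illustration and are not the paper's actual parameters.

```python
STATES = ["alive_no_recurrence", "recurrence", "dead"]

# invented yearly transition probabilities (each row sums to 1)
P = {
    "alive_no_recurrence": {"alive_no_recurrence": 0.85, "recurrence": 0.10, "dead": 0.05},
    "recurrence":          {"alive_no_recurrence": 0.00, "recurrence": 0.60, "dead": 0.40},
    "dead":                {"alive_no_recurrence": 0.00, "recurrence": 0.00, "dead": 1.00},
}

# invented utility weight per year spent in each state (quality adjustment)
UTILITY = {"alive_no_recurrence": 0.9, "recurrence": 0.6, "dead": 0.0}

def run_markov(years):
    """Propagate a cohort and return (overall survival, quality-adjusted life years)."""
    cohort = {"alive_no_recurrence": 1.0, "recurrence": 0.0, "dead": 0.0}
    qaly = 0.0
    for _ in range(years):
        # credit utility for the year about to be lived in the current states
        qaly += sum(cohort[s] * UTILITY[s] for s in STATES)
        nxt = {s: 0.0 for s in STATES}
        for s in STATES:
            for s2 in STATES:
                nxt[s2] += cohort[s] * P[s][s2]
        cohort = nxt
    survival = 1.0 - cohort["dead"]
    return survival, qaly

print(run_markov(5))
```

Comparing two treatment strategies then amounts to running the same loop with each strategy's transition matrix and utilities, which is how sensitivity to the utility threshold (0.90 in the abstract) can be explored.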
Chronic consequences of acute injuries: worse survival after discharge.
Shafi, Shahid; Renfro, Lindsay A; Barnes, Sunni; Rayan, Nadine; Gentilello, Larry M; Fleming, Neil; Ballard, David
2012-09-01
The Trauma Quality Improvement Program uses inhospital mortality to measure quality of care, which assumes patients who survive injury are not likely to suffer higher mortality after discharge. We hypothesized that survival rates in trauma patients who survive to discharge remain stable afterward. Patients treated at an urban Level I trauma center (2006-2008) were linked with the Social Security Administration Death Master File. Survival rates were measured at 30, 90, and 180 days and 1 and 2 years from injury among two groups of trauma patients who survived to discharge: major trauma (Abbreviated Injury Scale score ≥ 3 injuries, n = 2,238) and minor trauma (Abbreviated Injury Scale score ≤ 2 injuries, n = 1,171). Control groups matched to each trauma group by age and sex were simulated from the US general population using annual survival probabilities from census data. Kaplan-Meier and log-rank analyses conditional upon survival to each time point were used to determine changes in risk of mortality after discharge. Cox proportional hazards models with left truncation at the time of discharge were used to determine independent predictors of mortality after discharge. The survival rate in trauma patients with major injuries was 92% at 30 days posttrauma and declined to 84% by 3 years (p > 0.05 compared with general population). Minor trauma patients experienced a survival rate similar to the general population. Age and injury severity were the only independent predictors of long-term mortality given survival to discharge. Log-rank tests conditional on survival to each time point showed that mortality risk in patients with major injuries remained significantly higher than the general population for up to 6 months after injury. The survival rate of trauma patients with major injuries remains significantly lower than survival for minor trauma patients and the general population for several months postdischarge. 
Surveillance for early identification and treatment of complications may be needed for trauma patients with major injuries. Prognostic study, level III.
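The conditional analyses above rest on the Kaplan-Meier estimator together with the identity that survival to time t, given survival to a landmark time s, equals S(t)/S(s). A minimal pure-Python sketch on invented follow-up data (not the study's):

```python
def kaplan_meier(times, events):
    """Return a step function S(t): the Kaplan-Meier survival estimate at t.

    times  : follow-up times
    events : 1 = death observed, 0 = censored at that time
    """
    data = sorted(zip(times, events))
    steps = []                      # (event time, survival just after it)
    s = 1.0
    at_risk = len(data)
    i = 0
    while i < len(data):
        t = data[i][0]
        tied = [e for tt, e in data if tt == t]   # all subjects at this time
        deaths = sum(tied)
        if deaths:                  # multiply in the factor (1 - d/n)
            s *= 1 - deaths / at_risk
            steps.append((t, s))
        at_risk -= len(tied)        # deaths and censorings leave the risk set
        i += len(tied)

    def S(t):
        out = 1.0
        for et, sv in steps:
            if et <= t:
                out = sv
        return out
    return S

times  = [2, 3, 3, 5, 8, 12, 12, 15, 18, 24]   # months; invented
events = [1, 1, 0, 1, 1, 0, 1, 1, 0, 0]
S = kaplan_meier(times, events)

def conditional_survival(S, t, s):
    """P(survive to t | already survived to s) = S(t) / S(s)."""
    return S(t) / S(s)

print(round(S(12), 3))                           # 0.457
print(round(conditional_survival(S, 12, 6), 3))  # 0.667: better odds given survival to 6
```

The conditional estimate is always at least as large as the unconditional one, which is exactly the pattern the trauma analysis exploits by re-evaluating risk at each landmark.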
Singh, Randhir; Kim, Jinkyung; Shepherd, Marion W; Luo, Feng; Jiang, Xiuping
2011-06-01
A three-strain mixture of Escherichia coli O157:H7 was inoculated into fresh dairy compost (ca. 10⁷ CFU/g) with 40 or 50% moisture and was placed in an environmental chamber (ca. 70% humidity) that was programmed to ramp from room temperature to selected composting temperatures in 2 and 5 days to simulate the early composting phase. The surviving E. coli O157:H7 population was analyzed by direct plating and enrichment. Optimal and suboptimal compost mixes, with carbon/nitrogen (C/N) ratios of 25:1 and 16:1, respectively, were compared in this study. In the optimal compost mix, E. coli O157:H7 survived for 72, 48, and 24 h in compost with 40% moisture and for 72, 24, and 24 h with 50% moisture at 50, 55, and 60°C, respectively, following 2 days of come-up time (the rate of heating up). However, in the suboptimal compost mix, the pathogen survived for 288, 72, and 48 h in compost with 40% moisture and for 240, 72, and 24 h in compost with 50% moisture at the same temperatures, respectively. Pathogen survival was longer with 5 days of come-up time than with 2 days. Overall, E. coli O157:H7 was inactivated faster in the compost with 50% moisture than in the compost with 40% moisture at 55 and 60°C. Both moisture and come-up time were significant factors affecting Weibull model parameters. Our results suggest that a slow come-up time at the beginning of composting can extend pathogen survival during composting. Additionally, both the C/N ratio and the initial moisture level in the compost mix affect the rate of pathogen inactivation.
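The Weibull survival model referenced above describes microbial inactivation as log10(Nt/N0) = -(t/delta)^p, where delta is the time to the first decimal reduction and p controls curve shape (p < 1 tailing, p > 1 shouldering). The parameter values in this sketch are hypothetical, not fitted to the compost data.

```python
def log10_reduction(t_hours, delta, p):
    """Predicted log10 reduction in pathogen count after t_hours: (t / delta) ** p."""
    return (t_hours / delta) ** p

def time_to_reduction(target_log10, delta, p):
    """Invert the model: hours needed to reach a target log10 reduction."""
    return delta * target_log10 ** (1.0 / p)

# Hypothetical parameters: ~12 h to the first decimal reduction, tailing curve (p < 1).
delta, p = 12.0, 0.8
print(round(log10_reduction(24, delta, p), 2))     # log10 reduction after 24 h
print(round(time_to_reduction(5.0, delta, p), 1))  # hours to a 5-log reduction
```

In a fit to real data, delta and p would be estimated per treatment condition; the study's finding is that moisture and come-up time shift exactly these parameters.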
Intensive Hemodialysis Associates with Improved Survival Compared with Conventional Hemodialysis
Lindsay, Robert M.; Cuerden, Meaghan S.; Garg, Amit X.; Port, Friedrich; Austin, Peter C.; Moist, Louise M.; Pierratos, Andreas; Chan, Christopher T.; Zimmerman, Deborah; Lockridge, Robert S.; Couchoud, Cécile; Chazot, Charles; Ofsthun, Norma; Levin, Adeera; Copland, Michael; Courtney, Mark; Steele, Andrew; McFarlane, Philip A.; Geary, Denis F.; Pauly, Robert P.; Komenda, Paul; Suri, Rita S.
2012-01-01
Patients undergoing conventional maintenance hemodialysis typically receive three sessions per week, each lasting 2.5–5.5 hours. Recently, the use of more intensive hemodialysis (>5.5 hours, three to seven times per week) has increased, but the effects of these regimens on survival are uncertain. We conducted a retrospective cohort study to examine whether intensive hemodialysis associates with better survival than conventional hemodialysis. We identified 420 patients in the International Quotidian Dialysis Registry who received intensive home hemodialysis in France, the United States, and Canada between January 2000 and August 2010. We matched 338 of these patients to 1388 patients in the Dialysis Outcomes and Practice Patterns Study who received in-center conventional hemodialysis during the same time period by country, ESRD duration, and propensity score. The intensive hemodialysis group received a mean (SD) 4.8 (1.1) sessions per week with a mean treatment time of 7.4 (0.87) hours per session; the conventional group received three sessions per week with a mean treatment time of 3.9 (0.32) hours per session. During 3008 patient-years of follow-up, 45 (13%) of 338 patients receiving intensive hemodialysis died compared with 293 (21%) of 1388 patients receiving conventional hemodialysis (6.1 versus 10.5 deaths per 100 person-years; hazard ratio, 0.55 [95% confidence interval, 0.34–0.87]). The strength and direction of the observed association between intensive hemodialysis and improved survival were consistent across all prespecified subgroups and sensitivity analyses. In conclusion, there is a strong association between intensive home hemodialysis and improved survival, but whether this relationship is causal remains unknown. PMID:22362910
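The quoted death rates per 100 person-years are crude rates: deaths divided by person-years at risk. The sketch below reproduces that arithmetic; the deaths are from the abstract, but the per-group person-year totals are hypothetical stand-ins (only the combined 3008 patient-years is reported), and the crude rate ratio is not the matched, adjusted hazard ratio of 0.55.

```python
def rate_per_100py(deaths, person_years):
    """Crude event rate per 100 person-years of follow-up."""
    return 100.0 * deaths / person_years

# 45 and 293 deaths are reported; the person-year denominators are assumed.
intensive_rate    = rate_per_100py(45, 740.0)
conventional_rate = rate_per_100py(293, 2790.0)
rate_ratio = intensive_rate / conventional_rate

print(round(intensive_rate, 1))     # 6.1 deaths per 100 person-years
print(round(conventional_rate, 1))  # 10.5
print(round(rate_ratio, 2))         # 0.58 crude ratio (vs. the adjusted HR of 0.55)
```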
Ten-Year Survival in Patients with Idiopathic Pulmonary Fibrosis After Lung Transplantation.
ten Klooster, Liesbeth; Nossent, George D; Kwakkel-van Erp, Johanna M; van Kessel, Diana A; Oudijk, Erik J; van de Graaf, Ed A; Luijk, Bart; Hoek, Rogier A; van den Blink, Bernt; van Hal, Peter Th; Verschuuren, Erik A; van der Bij, Wim; van Moorsel, Coline H; Grutters, Jan C
2015-12-01
Idiopathic pulmonary fibrosis (IPF) is a progressive and lethal fibrosing lung disease with a median survival of approximately 3 years after diagnosis. The only medical option to improve survival in IPF is lung transplantation (LTX). The purpose of this study was to evaluate trajectory data of IPF patients listed for LTX and to investigate survival after LTX. Data were retrospectively collected from September 1989 until July 2011 on all IPF patients registered for LTX in the Netherlands. Patients were included after revision of the diagnosis based on the criteria set by the ATS/ERS/JRS/ALAT. Trajectory data, clinical data at the time of screening, and donor data were collected. In total, 98 IPF patients were listed for LTX. During the waiting-list period, 30% of the patients died. Mean pulmonary artery pressure, 6-min walking distance, and the use of supplemental oxygen were significant predictors of mortality on the waiting list. Fifty-two patients received LTX, with a median overall survival after transplantation of 10 years. This study demonstrated a 10-year median survival after LTX in IPF. Furthermore, our study demonstrated significantly better survival after bilateral LTX in IPF compared to single LTX, although bilateral LTX patients were significantly younger.
Gilstrap, Lauren Gray; Niehaus, Emily; Malhotra, Rajeev; Ton, Van-Khue; Watts, James; Seldin, David C.; Madsen, Joren C.; Semigran, Marc J.
2013-01-01
Background Orthotopic heart transplant (OHT) followed by myeloablative chemotherapy and autologous stem cell transplant (ASCT) has been successful in the treatment of light chain (AL) cardiac amyloidosis. The purpose of this study was to identify predictors of survival to OHT in patients with end-stage heart failure due to AL amyloidosis, and to compare post-OHT survival of cardiac amyloid patients to that of other cardiomyopathy patients undergoing OHT. Methods From January 2000 to June 2011, 31 patients with end-stage heart failure secondary to AL amyloidosis were listed for OHT at Massachusetts General Hospital (MGH). Univariate and multivariate regression analyses identified predictors of survival to OHT. Kaplan-Meier analysis compared survival between MGH amyloidosis patients and the Scientific Registry of Transplant Recipients (SRTR) non-amyloid cardiomyopathy patients. Results Low body mass index (BMI) was the only predictor of survival to OHT in patients with end-stage heart failure due to cardiac amyloidosis. Survival of cardiac amyloid patients who died prior to receiving a donor heart was only 63 ± 45 days after listing. Patients who survived to OHT received a donor organ at 53 ± 48 days after listing. Survival of AL amyloidosis patients on the waitlist was shorter than that of patients waitlisted for all other non-amyloid diagnoses. The long-term survival of transplanted amyloid patients was no different from the survival of non-amyloid restrictive (p=0.34), non-amyloid dilated (p=0.34), or all non-amyloid cardiomyopathy patients (p=0.22) in the SRTR database. Conclusions Those who survive to OHT followed by ASCT have a survival rate similar to other cardiomyopathy patients undergoing OHT. However, more than one third of the patients died awaiting OHT. The only predictor of survival to OHT in AL amyloidosis patients was low BMI, which correlated with shorter waitlist time. To optimize the survival of these patients, access to donor organs must be improved.
In light chain (AL) amyloidosis, amyloid fibrils derived from clonal lambda or kappa immunoglobulin light chains deposit abnormally in organs. Cardiac involvement is apparent echocardiographically in 60% of AL amyloidosis patients at the time of diagnosis, with clinical evidence of heart failure in 69% of patients.1 The median survival of AL amyloidosis patients presenting with any heart failure symptom is 8.5 months2 and even less for patients with end-stage heart failure. PMID:24200511
Obi, Yoshitsugu; Streja, Elani; Mehrotra, Rajnish; Rivara, Matthew B; Rhee, Connie M; Soohoo, Melissa; Gillen, Daniel L; Lau, Wei-Ling; Kovesdy, Csaba P; Kalantar-Zadeh, Kamyar
2018-06-01
The prevalence of severe obesity, often considered a contraindication to peritoneal dialysis (PD), has increased over time. However, mortality has decreased more rapidly in the PD population than the hemodialysis (HD) population in the United States. The association between obesity and clinical outcomes among patients with end-stage kidney disease remains unclear in the current era. Historical cohort study. 15,573 incident PD patients from a large US dialysis organization (2007-2011). Body mass index (BMI). Modality longevity, residual renal creatinine clearance, peritonitis, and survival. Higher BMI was significantly associated with shorter time to transfer to HD therapy (P for trend < 0.001), longer time to kidney transplantation (P for trend < 0.001), and, with borderline significance, more frequent peritonitis-related hospitalization (P for trend = 0.05). Compared with lean patients, obese patients had faster declines in residual kidney function (P for trend < 0.001) and consistently achieved lower total Kt/V over time (P for trend < 0.001) despite greater increases in dialysis Kt/V (P for trend < 0.001). There was a U-shaped association between BMI and mortality, with the greatest survival associated with the BMI range of 30 to <35 kg/m² in the case-mix-adjusted model. Compared with matched HD patients, PD patients had lower mortality in the BMI categories of <25 and 25 to <35 kg/m² and had equivalent survival in the BMI category ≥35 kg/m² (P for interaction = 0.001 [vs. <25 kg/m²]). This attenuation in survival difference among patients with severe obesity was observed only in patients with diabetes, but not those without diabetes. Inability to evaluate causal associations. Potential indication bias. Whereas obese PD patients had higher risk for complications than nonobese PD patients, their survival was no worse than matched HD patients. Copyright © 2017 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.
Salamo, Oriana; Roghaee, Shiva; Schweitzer, Michael D; Mantero, Alejandro; Shafazand, Shirin; Campos, Michael; Mirsaeidi, Mehdi
2018-05-03
Sarcoidosis commonly affects the lung. Lung transplantation (LT) is required when involvement is severe and refractory. We compared post-transplant survival rates of sarcoidosis patients with those of patients with chronic obstructive pulmonary disease (COPD) and idiopathic pulmonary fibrosis (IPF). We also explored whether the race and age of the donor and double lung transplant have any effect on survival in the post-transplant setting. We analyzed 9,727 adult patients with sarcoidosis, COPD, and IPF who underwent LT worldwide between 2005 and 2015, based on the United Network for Organ Sharing (UNOS) database. Survival rates were compared with Kaplan-Meier analysis, and risk factors were investigated by Cox regression analysis. 469 (5%) were transplanted because of sarcoidosis, 3,688 (38%) for COPD, and 5,570 (57%) for IPF. Unadjusted survival analysis showed a better post-transplant survival rate for patients with sarcoidosis (p < 0.001, log-rank test). In Cox regression analysis, double lung transplant and white race of the lung donor conferred a significant survival advantage. Since double lung transplant recipients, younger patients, and those with a lower Lung Allocation Score (LAS) at the time of transplant have a survival advantage, we suggest double lung transplant as the procedure of choice, especially in younger sarcoidosis subjects with lower LAS scores.
Chen, Hanyi; Yang, Chen; Yan, Bei; Sun, Lianghong; Wu, Zheng; Li, Xiaopan; Zhang, Meiyu; Li, Xiaoli; Yang, Liming
2014-03-01
Different histologies of lung cancer vary in occurrence and prognosis. This study aims to analyze the incidence and occurrence trends of lung cancer and to investigate the survival rate and its influential factors among lung cancer patients with different histologies. Permanent residents were recruited between 2002 and 2009 in Pudong New Area (former Nanhui Area and former Pudong Area), Shanghai, China. Annual percent changes were estimated by a linear regression of the logarithm of the incidence rates over the eight years. Survival rates were calculated and compared by using life-table analysis and the log-rank test, respectively. The standardized incidence rates of lung cancer were 52.28 and 18.86 per 100,000 in males and females, respectively. The median survival time was 410.72 days for lung cancers with a specified histological classification. The incidence rate of adenocarcinoma ranked the highest and showed an upward tendency (P<0.05). Patients with small cell lung cancer showed the worst survival. Survival in males with squamous cell lung cancer living in former Nanhui Area was better than in those living in former Pudong Area. Lung cancers with different histologies demonstrated different occurrence trends and survival rates. Gender, age, and living area influence the survival rates of lung cancer with different histologies.
Facciorusso, Antonio; Del Prete, Valentina; Antonino, Matteo; Neve, Viviana; Amoruso, Annabianca; Crucinio, Nicola; Di Leo, Alfredo; Barone, Michele
2015-10-01
Survival estimates are commonly reported as survival from the first observation, but future survival probability changes based on the survival time already accumulated after therapy, otherwise known as conditional survival (CS). The aim of the study was to describe CS according to different prognostic variables in hepatocellular carcinoma (HCC) patients treated with radiofrequency ablation (RFA). Data on 125 very early/early HCC patients treated with RFA between 1999 and 2007 were analyzed. Actuarial survival estimates were computed by means of Kaplan-Meier method and compared by log-rank test. The 5-year CS was calculated with stratification by several predictors for patients who had already survived up to 5 years from diagnosis. Median overall survival (OS) was 72 months (95% confidence interval [CI], 58-86). Age, Child-Pugh (CP), α-fetoprotein (AFP), Cancer of the Liver Italian Program (CLIP) score and type of recurrence (early vs late) were significant predictors of OS. The 5-year CS rates of the entire study cohort assessed at 1, 2, 3 and 5 years from the treatment were 49%, 48%, 30% and 34%, respectively. Subgroup analysis confirmed age and CP as significant predictors of CS at all time points, while the CS of subgroups stratified by AFP and CLIP did not differ significantly from the 3rd year after RFA onward, as more advanced patients had probably escaped early recurrence. CS analysis showed that the impact of different variables influencing OS is not linear over time after RFA. Information derived from the study can improve the current management of HCC patients. © 2014 The Japan Society of Hepatology.
Sívori, Martín; Rodríguez, Gabriel E; Pascansky, Daniel; Sáenz, César; Sica, Roberto E P
2007-01-01
Sporadic amyotrophic lateral sclerosis (sALS) is a progressive degenerative motor neuron disorder lacking specific treatment. Riluzole is the only drug able to modestly slow the course of the disease. Respiratory insufficiency is the main cause of death; non-invasive ventilation (NIV) has been shown to improve survival. Our aim was to evaluate the effect of NIV and riluzole on survival. Ninety-seven patients with a diagnosis of sALS were assessed and followed up for 60 months. Twenty-nine patients received NIV and 68 did not (nNIV). Overall median survival in the NIV group was 15.41 +/- 7.78 months vs. 10.88 +/- 7.78 months in the nNIV group (p= 0.028). Median survival time was not different in patients receiving riluzole (n=44) compared with those who did not (n=53), although at months 4 and 5 riluzole-treated patients showed a modest benefit. In those who received only NIV (n=11) or only riluzole (n=26), survival time was 13.45 +/- 13.44 months and 11.19 +/- 7.79 months, respectively. Patients who received both NIV and riluzole (n=18) had a median survival time of 16.61 +/- 10.97 months vs. 10.69 +/- 7.86 months for those who received only supportive treatment (n=42) (p= 0.021). NIV improved survival in our series of patients. Riluzole did not show any significant impact on survival when employed as the only therapy. Patients receiving both treatments simultaneously had significantly longer survival.
Aspects of hatching success and chick survival in Gull-billed Terns in coastal Virginia
Eyler, T.B.; Erwin, R.M.; Stotts, D.B.; Hatfield, J.S.
1999-01-01
Because of a long-term population decline in Gull-billed Terns (Sterna nilotica) nesting along the coast of Virginia, we began a three year study in 1994 to monitor hatching success and survival of Gull-billed Tern chicks at several Virginia colony sites. Colonies were located on either small, storm-deposited shellpiles along marsh fringes or large, sandshell overwash fans of barrier islands. Nests were monitored one to three times a week for hatching success, and enclosures were installed around selected nests to monitor chick survival from hatching to about two weeks of age. Hatching success was lower in marsh colonies than island colonies, and was lower in 1995 than in 1994 and 1996, primarily because of flooding. The average brood size of nests where at least one chick hatched was 1.99 chicks. Survival rates of chicks to 14 days depended on hatch order and year but not brood size (one vs. two or more) or time of season. A-chicks had higher survival rates than B-chicks and third-hatched C-chicks (0.661 compared to 0.442 and 0.357, respectively). The year effect was significant only for A-chicks, with lower survival in 1994 (0.50) than in 1995 (0.765) or 1996 (0.758). Overall, productivity was low (0.53 chick per nest) compared to estimates for colonies in Denmark, and was attributable to nest flooding by spring and storm-driven high tides and chick predation, presumably mostly by Great Horned Owls (Bubo virginianus).
Association Between Statin Use and Endometrial Cancer Survival.
Nevadunsky, Nicole S; Van Arsdale, Anne; Strickler, Howard D; Spoozak, Lori A; Moadel, Alyson; Kaur, Gurpreet; Girda, Eugenia; Goldberg, Gary L; Einstein, Mark H
2015-07-01
To evaluate the association of 3-hydroxy-3-methylglutaryl-coenzyme A reductase inhibitor (statin) use and concordant polypharmacy with disease-specific survival from endometrial cancer. A retrospective cohort study was conducted of 985 endometrial cancer cases treated from January 1999 through December 2009 at a single institution. Disease-specific survival was estimated by Kaplan-Meier analyses. A Cox proportional hazards model was used to study factors associated with survival. All statistical tests were two-sided and performed using Stata. At the time of analysis, 230 patients (22% of evaluable patients) had died of disease, and median follow-up was 3.28 years. Disease-specific survival was greater (179/220 [81%]) for women with endometrial cancer taking statin therapy at the time of diagnosis and staging compared with women not using statins (423/570 [74%]) (log-rank test, P=.03). This association persisted for the subgroup of patients with nonendometrioid endometrial tumors who were statin users (59/87 [68%]) compared with nonusers (93/193 [43%]) (log-rank test, P=.02). The relationship remained significant (hazard ratio 0.63, 95% confidence interval [CI] 0.40-0.99) after adjusting for age, clinical stage, radiation, and other factors. Further evaluation of polypharmacy showed an association between concurrent statin and aspirin use and an especially low disease-specific mortality (hazard ratio 0.25, 95% CI 0.09-0.70) relative to those who used neither. Statin and aspirin use was associated with improved survival from nonendometrioid endometrial cancer.
Cortactin is a sensitive biomarker relative to the poor prognosis of human hepatocellular carcinoma
2013-01-01
Background Cortactin is an important regulator involved in invasion and migration of hepatocellular carcinoma (HCC). The aim of this study was to elucidate the prognostic role of cortactin in resectable HCCs. Methods We compared invasiveness and motility between a liver epithelial cell line and HCC cell lines by using Transwell and wound-healing assays. We further investigated CTTN mRNA expression by real-time PCR. Next, 91 HCC and 20 normal liver tissue samples were examined by IHC and real-time PCR. Finally, we analyzed the clinicopathologic features and survival times of the HCC cases. Results We identified that HepG2, LM3, and SK-Hep-1 had greater invasiveness and motility (P <0.05). Compared with the liver epithelial cell line, CTTN expression was higher in LM3, HepG2, and MHCC97-L (P <0.01) and lower in SK-Hep-1 (P <0.05). IHC examination showed that cortactin expression was closely related to TNM stage (AJCC/UICC), cancer embolus, and metastasis (P <0.01). Cortactin overexpression indicated a longer survival time of 52 ± 8.62 months, and low expression a shorter survival time of 20 ± 4.95 months (P <0.01). Cortactin examination has more predictive power in patients with Child-Pugh grade A and BCLC stage 0-B. Conclusions Overexpression of cortactin is closely associated with poor prognosis of human HCC caused by cancer embolus and metastasis. Cortactin and CTTN could be used to differentiate survival expectations for patients after HCC resection. PMID:23518204
Hockersmith, E.E.; Muir, W.D.; Smith, S.G.; Sandford, B.P.; Perry, R.W.; Adams, N.S.; Rondorf, D.W.
2003-01-01
A study was conducted to compare the travel times, detection probabilities, and survival of migrant hatchery-reared yearling chinook salmon Oncorhynchus tshawytscha tagged with either gastrically or surgically implanted sham radio tags (with an embedded passive integrated transponder [PIT] tag) with those of their cohorts tagged only with PIT tags in the Snake and Columbia rivers. Juvenile chinook salmon with gastrically implanted radio tags migrated significantly faster than either surgically radio-tagged or PIT-tagged fish, while migration rates were similar among surgically radio-tagged and PIT-tagged fish. The probabilities of PIT tag detection at downstream dams varied by less than 5% and were not significantly different among the three groups. Survival was similar among treatments for median travel times of less than approximately 6 d (migration distance of 106 km). However, for both gastrically and surgically radio-tagged fish, survival was significantly less than for PIT-tagged fish when median travel times exceeded approximately 10 d (migration distance of 225 km). The results of this study support the use of radio tags to estimate the survival of juvenile chinook salmon having a median fork length of approximately 150 mm (range, 127-285 mm) and a median travel time of migration of less than approximately 6 d.
Fitness consequences of larval traits persist across the metamorphic boundary.
Crean, Angela J; Monro, Keyne; Marshall, Dustin J
2011-11-01
Metamorphosis is thought to provide an adaptive decoupling between traits specialized for each life-history stage in species with complex life cycles. However, an increasing number of studies are finding that larval traits can carry over to influence postmetamorphic performance, suggesting that these life-history stages may not be free to evolve independently of each other. We used a phenotypic selection framework to compare the relative and interactive effects of larval size, time to hatching, and time to settlement on postmetamorphic survival and growth in a marine invertebrate, Styela plicata. Time to hatching was the only larval trait found to be under directional selection: individuals that took more time to hatch into larvae survived better after metamorphosis but grew more slowly. Nonlinear selection was found to act on multivariate trait combinations, once again acting in opposite directions for selection acting via survival and growth. Individuals with above-average values of larval traits were most likely to survive, but surviving individuals with intermediate larval traits grew to the largest size. These results demonstrate that larval traits can have multiple, complex fitness consequences that persist across the metamorphic boundary, and thus postmetamorphic selection pressures may constrain the evolution of larval traits. © 2011 The Author(s). Evolution © 2011 The Society for the Study of Evolution.
Plasma serotonin in horses undergoing surgery for small intestinal colic
Torfs, Sara C.; Maes, An A.; Delesalle, Catherine J.; Pardon, Bart; Croubels, Siska M.; Deprez, Piet
2015-01-01
This study compared serotonin concentrations in platelet poor plasma (PPP) from healthy horses and horses with surgical small intestinal (SI) colic, and evaluated their association with postoperative ileus, strangulation and non-survival. Plasma samples (with EDTA) from 33 horses with surgical SI colic were collected at several pre- and post-operative time points. Serotonin concentrations were determined using liquid-chromatography tandem mass spectrometry. Results were compared with those for 24 healthy control animals. The serotonin concentrations in PPP were significantly lower (P < 0.01) in pre- and post-operative samples from surgical SI colic horses compared to controls. However, no association with postoperative ileus or non-survival could be demonstrated at any time point. In this clinical study, plasma serotonin was not a suitable prognostic factor in horses with SI surgical colic. PMID:25694668
Quintero-Fong, L; Toledo, J; Ruiz, L; Rendón, P; Orozco-Dávila, D; Cruz, L; Liedo, P
2016-10-01
The sexual performance of Anastrepha ludens males of the Tapachula-7 genetic sexing strain, produced via selection based on mating success, was compared with that of males produced without selection in competition with wild males. Mating competition, development time, survival, mass-rearing quality parameters and pheromone production were compared. The results showed that selection based on mating competitiveness significantly improved the sexual performance of offspring. Development time, survival of larvae, pupae and adults, and weights of larvae and pupae increased with each selection cycle. Differences in the relative quantity of the pheromone compounds (Z)-3-nonenol and anastrephin were observed when comparing the parental males with the F4 and wild males. The implications of this colony management method on the sterile insect technique are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Du, Zhongli; Department of Etiology and Carcinogenesis; Zhang, Wencheng
Purpose: To investigate whether single nucleotide polymorphisms (SNPs) in the ataxia telangiectasia mutated (ATM) gene are associated with survival in patients with esophageal squamous cell carcinoma (ESCC) receiving radiation therapy or chemoradiation therapy, or surgery only. Methods and Materials: Four tagSNPs of ATM were genotyped in 412 individuals with clinical stage III or IV ESCC receiving radiation therapy or chemoradiation therapy, and in 388 individuals with stage I, II, or III ESCC treated with surgery only. Overall survival time of ESCC among different genotypes was estimated by Kaplan-Meier plot, and the significance was examined by log-rank test. The hazard ratios (HRs) and 95% confidence intervals (CIs) for death from ESCC among different genotypes were computed by a Cox proportional regression model. Results: We found 2 SNPs, rs664143 and rs664677, associated with survival time of ESCC patients receiving radiation therapy. Individuals with the rs664143A allele had poorer median survival time compared with the rs664143G allele (14.0 vs 20.0 months), with the HR for death being 1.45 (95% CI 1.12-1.89). Individuals with the rs664677C allele also had worse median survival time than those with the rs664677T allele (14.0 vs 23.5 months), with an HR of 1.57 (95% CI 1.18-2.08). Stratified analysis showed that these associations were present in both stage III and IV cancer and across different radiation therapy techniques. Significant associations were also found between the SNPs and locoregional progression or progression-free survival. No association between these SNPs and survival time was detected in ESCC patients treated with surgery only. Conclusion: These results suggest that ATM polymorphisms might serve as independent biomarkers for predicting prognosis in ESCC patients receiving radiation therapy.
Feng, Zhixin; Jones, Kelvyn; Wang, Wenfei Winnie
2015-01-01
This study undertakes a survival analysis of elderly persons in China using the Chinese Longitudinal Healthy Longevity Survey 2002–2008. Employing discrete-time multilevel models, we explored the effect of social support on the survival of elderly people in China. This study focuses on objective components of social support (living arrangements and received support) and subjective components (perceived support), finding that the effect of different forms of social support on the survival of elderly people varies according to the availability of different support resources. Specifically, living with a spouse, financial independence, and perceiving care support from any resource are associated with higher survival rates for elderly people. Separate analyses focusing on urban and rural elderly revealed broadly similar results. The difference in survival between those perceiving care support from family or social services and those not perceiving such support is larger in urban areas than in rural areas. Those who cannot pay medical expenses are the least likely to survive. The level of economic development of a province has no significant effect on the survival of elderly people in the whole-sample model or for elderly people in urban areas; however, it has a negative influence on the survival of rural elderly people. PMID:25703671
NASA Astrophysics Data System (ADS)
Kolosov, Mikhail S.; Shubina, Elena
2015-03-01
Photodynamic therapy is a promising treatment modality for brain cancers. It is important to have information about the relative survival rates of different cell types in nerve tissue during photodynamic treatment, particularly for developing a sparing strategy for the photodynamic therapy of brain tumors, which pursues both total elimination of malignant cells, which are usually of glial origin, and, at the same time, preservation of normal blood circulation and normal glial cells in the brain. The aim of this work was to carry out a comparative survival study of glial cells and cells composing the walls of blood vessels after photodynamic treatment, using a simple model object, the ventral nerve cord of a crustacean.
Use of automated external defibrillators for in-hospital cardiac arrest: Any time, any place?
Wutzler, A; Kloppe, C; Bilgard, A K; Mügge, A; Hanefeld, C
2017-11-07
Acute treatment of in-hospital cardiac arrest (IHCA) is challenging and overall survival rates are low. However, data on the use of public-access automated external defibrillators (AEDs) for IHCA remain controversial. The aim of our study was to evaluate characteristics of patients experiencing IHCA and feasibility of public-access AED use for resuscitation in a university hospital. IHCA events outside the intensive care unit were analysed over a period of 21 months. Patients' characteristics, AED performance, return of spontaneous circulation (ROSC) and 24 h survival were evaluated. Outcomes following adequate and inadequate AED use were compared. During the study period, 59 IHCAs occurred. An AED was used in 28 (47.5%) of the cases. However, the AED was adequately used in only 42.8% of total AED cases. AED use was not associated with an increased survival rate (12.9 vs. 10.7%, p = 0.8) compared to non-AED use. However, adequate AED use was associated with a higher survival rate (25 vs. 0%, p = 0.034) compared to inadequate AED use. A time from emergency call to application of the AED of >3 min was the factor most strongly associated with inadequate AED use. Adequate AED use was more often observed between 7:30 and 13:30 and in the internal medicine department. AEDs were applied in less than 50% of the IHCA events. Furthermore, AED use was inadequate in the majority of the cases. Since adequate AED use is associated with improved survival, AEDs should be available in hospital areas with patients at high risk of shockable rhythm.
Huang, Min; Lou, Yanyan; Pellissier, James; Burke, Thomas; Liu, Frank Xiaoqing; Xu, Ruifeng; Velcheti, Vamsidhar
2017-02-01
This analysis aimed to evaluate the cost-effectiveness of pembrolizumab compared with docetaxel in patients with previously treated advanced non-small cell lung cancer (NSCLC) with PD-L1-positive tumors (tumor proportion score [TPS] ≥50%). The analysis was conducted from a US third-party payer perspective. A partitioned-survival model was developed using data from patients from the KEYNOTE 010 clinical trial. The model used Kaplan-Meier (KM) estimates of progression-free survival (PFS) and overall survival (OS) from the trial for patients treated with either pembrolizumab 2 mg/kg or docetaxel 75 mg/m², with extrapolation based on fitted parametric functions and long-term registry data. Quality-adjusted life years (QALYs) were derived based on EQ-5D data from KEYNOTE 010 using a time-to-death approach. Costs of drug acquisition/administration, adverse event management, and clinical management of advanced NSCLC were included in the model. The base-case analysis used a time horizon of 20 years. Costs and health outcomes were discounted at a rate of 3% per year. A series of one-way and probabilistic sensitivity analyses were performed to test the robustness of the results. Base-case results project a mean survival of 2.25 years for PD-L1-positive (TPS ≥50%) patients treated with pembrolizumab. For docetaxel, a mean survival time of 1.07 years was estimated. Expected QALYs were 1.71 and 0.76 for pembrolizumab and docetaxel, respectively. The incremental cost per QALY gained with pembrolizumab vs docetaxel is $168,619/QALY, which is cost-effective in the US using a threshold of 3 times the GDP per capita. Sensitivity analyses showed the results to be robust over plausible values of the majority of inputs. Results were most sensitive to extrapolation of overall survival. Pembrolizumab improves survival, increases QALYs, and can be considered a cost-effective option compared to docetaxel in PD-L1-positive (TPS ≥50%) pre-treated advanced NSCLC patients in the US.
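A partitioned-survival model like the one described splits the cohort at each time point into three health states using the two curves directly: progression-free = PFS(t), progressed = OS(t) − PFS(t), dead = 1 − OS(t). A minimal sketch of the QALY calculation, assuming illustrative exponential curves and utility weights (not the KEYNOTE-010 fits or the published model inputs):

```python
import math

def partitioned_survival_qalys(os_rate, pfs_rate, u_pf, u_pd,
                               horizon_years, dt=1.0 / 52.0, disc=0.03):
    """Mean discounted QALYs from a partitioned-survival model.

    os_rate/pfs_rate: exponential hazards for OS and PFS (illustrative)
    u_pf/u_pd:        utility weights for progression-free / progressed states
    """
    qalys = 0.0
    t = 0.0
    while t < horizon_years:
        os_t = math.exp(-os_rate * t)
        pfs_t = min(math.exp(-pfs_rate * t), os_t)  # PFS can never exceed OS
        discount = (1 + disc) ** (-t)
        qalys += (pfs_t * u_pf + (os_t - pfs_t) * u_pd) * discount * dt
        t += dt
    return qalys
```

With undiscounted, all-utility-1 inputs the result converges on mean survival (the area under the OS curve), which is how the "mean survival of 2.25 years" figure relates to the extrapolated curve.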
Grossman, Douglas; Farnham, James M; Hyngstrom, John; Klapperich, Marki E; Secrest, Aaron M; Empey, Sarah; Bowen, Glen M; Wada, David; Andtbacka, Robert H I; Grossmann, Kenneth; Bowles, Tawnya L; Cannon-Albright, Lisa A
2018-03-01
Survival data are mixed comparing patients with multiple primary melanomas (MPM) to those with single primary melanomas (SPM). We compared MPM versus SPM patient survival using a matching method that avoids potential biases associated with other analytic approaches. Records of 14,138 individuals obtained from the Surveillance, Epidemiology, and End Results registry of all melanomas diagnosed or treated in Utah between 1973 and 2011 were reviewed. A single matched control patient was selected randomly from the SPM cohort for each MPM patient, with the restriction that they survived at least as long as the interval between the first and second diagnoses for the matched MPM patient. Survival curves (n = 887 for both MPM and SPM groups) without covariates showed a significant survival disadvantage for MPM patients (chi-squared 39.29, P < .001). However, a multivariate Cox proportional hazards model showed no significant survival difference (hazard ratio 1.07, P = .55). Restricting the multivariate analysis to invasive melanomas also showed no significant survival difference (hazard ratio 0.99, P = .96). Breslow depth, ulceration status, and specific cause of death were not available for all patients. Patients with MPM had survival times similar to those of patients with SPM. Copyright © 2018 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.
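The control-selection rule described in the abstract above can be sketched as follows. The field names, the greedy one-pass ordering, and sampling without replacement are illustrative assumptions, not the authors' implementation:

```python
import random

def select_matched_controls(mpm_patients, spm_pool, seed=0):
    """For each MPM patient, randomly draw one SPM control who survived at
    least as long as that patient's interval between first and second
    diagnoses (guarding against immortal-time bias in the comparison).

    mpm_patients: list of dicts with key "interval" (months between diagnoses)
    spm_pool:     list of dicts with key "survival" (months of follow-up)
    """
    rng = random.Random(seed)
    pool = list(spm_pool)
    matches = []
    for patient in mpm_patients:
        eligible = [c for c in pool if c["survival"] >= patient["interval"]]
        if not eligible:
            matches.append(None)  # no control satisfies the restriction
            continue
        control = rng.choice(eligible)
        pool.remove(control)      # sample without replacement
        matches.append(control)
    return matches
```

The survival restriction is the key design choice: without it, SPM controls who died before they could have developed a second primary would bias the comparison against the MPM group.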
Extending Human Hematopoietic Stem Cell Survival In Vitro with Adipocytes
Glettig, Dean Liang
2013-01-01
Human hematopoietic stem cells (hHSCs) cannot be maintained in vitro for extended time periods because they rapidly differentiate or die. To extend in vitro culture time, researchers have attempted to use human mesenchymal stem cells (hMSCs) to create feeder layers that mimic the stem cell niche. We conducted a series of experiments including adipocytes in these feeder layers to inhibit hHSC differentiation and thereby prolong stem cell survival in vitro. The number of CD34+ cells was quantified using flow cytometry. In a first experiment, feeder layers of undifferentiated hMSCs were compared with feeder layers differentiated toward osteoblasts or adipocytes using minimal medium, showing the highest survival rate where adipocytes were included. The same conclusion was drawn in a second experiment comparing hMSCs with adipogenic feeder cells, using a culture medium supplemented with a cocktail of hHSC growth factors. A third experiment showed that direct cell–cell contact is necessary for the supportive effect of the feeder layers. In a fourth and fifth experiment, the proportion of adipocytes in the feeder layers was varied, and in all experiments a higher proportion of adipocytes in the feeder layers showed a less rapid decay of CD34+ cells at later time points. We therefore concluded that adipocytes assist in suppressing hHSC differentiation and aid in prolonging their survival in vitro. PMID:23741628
Li, Ming Xin; Liu, Jun Feng; Lu, Jian Da; Zhu, Ying; Kuang, Ding Wei; Xiang, Jian Bing; Sun, Peng; Wang, Wei; Xue, Jun; Gu, Yong; Hao, Chuan Ming
2016-12-01
The objective of this study was to explore whether plasma diafiltration (PDF) is more effective in improving intestinal mucosal barrier function by removing more of the key large-molecular-weight inflammatory mediators, thereby prolonging survival time. In total, 24 porcine sepsis models induced by cecal ligation and puncture (CLP) were randomly divided into three groups: a PDF group, a high-volume hemofiltration (HVHF) group, and a control group, each receiving 8 h of treatment. The expression of ZO-1 and occludin in intestinal mucosal epithelial cells was detected by immunohistochemistry, and caspase-3-positive apoptotic lymphocytes in mesenteric lymph nodes were labeled by TUNEL staining. Hemodynamic parameters were measured by invasive monitoring. Tumor necrosis factor alpha (TNF-α) and high-mobility group protein 1 (HMGB1) were measured by ELISA. Survival curves for all-cause death were then compared among the three groups. PDF led to a superior reversal of sepsis-related hemodynamic impairment and serum biochemistry abnormalities and resulted in a longer survival time compared with HVHF and control (p < 0.01). Definitive protection from excessive TNF-α and HMGB1 responses was achieved only by PDF. A more regular distribution pattern of ZO-1 and occludin along the epithelium was found in PDF animals (p < 0.01). The presence of apoptotic lymphocytes was significantly reduced in the PDF animals (p < 0.01). PDF can effectively eliminate more of the pivotal inflammatory mediators TNF-α and HMGB1, reduce inflammatory damage to the intestinal mucosal barrier and lymphocyte apoptosis, and thereby improve circulatory function and prolong survival time.
Risks and Benefits of Multimodal Esophageal Cancer Treatments: A Meta-Analysis.
Sun, Lei; Zhao, Fen; Zeng, Yan; Yi, Cheng
2017-02-19
BACKGROUND Esophageal cancer has traditionally been associated with very poor outcomes. A number of therapies are available for the treatment and palliation of esophageal cancer, but little systematic evidence compares the efficacy of different treatment strategies. This meta-analysis aimed to investigate whether treatments in addition to radiotherapy could provide better efficacy and safety. MATERIAL AND METHODS We identified a total of 12 eligible studies with 18 study arms by searching PubMed, the Cochrane Library, EMBASE, and ClinicalTrials.gov without time or language restrictions. The final search was conducted on 17 August 2016. We calculated mean differences (MD) and risk ratios (RR) with 95% confidence intervals (CI) for continuous and dichotomous data, respectively. Heterogeneity was calculated and reported using Tau², Chi², and I² analyses. RESULTS Twelve studies with 18 study arms were included in the analysis. Addition of surgery to chemo-radiotherapy resulted in improved median survival time (p=0.009) compared with chemo-radiotherapy alone, but all other outcomes were unaffected. Strikingly, and in contrast with patients with squamous cell carcinomas, the subset of patients with adenocarcinoma who received therapies in addition to radiotherapy showed a significant improvement in median survival time (p<0.0001), disease-free survival (p=0.007), 2-year survival rates (p=0.002), and 3-year survival rates (p=0.01). The incidence of adverse effects increased substantially with additional therapies. CONCLUSIONS This meta-analysis reveals stark differences in outcomes in patients depending on the type of carcinoma. Patients with squamous cell carcinoma should be educated about the risks and benefits of undergoing multiple therapies.
Construction of a North American Cancer Survival Index to Measure Progress of Cancer Control Efforts
Weir, Hannah K; Mariotto, Angela; Wilson, Reda; Nishri, Diane
2017-01-01
Introduction Population-based cancer survival data provide insight into the effectiveness of health care delivery. Comparing survival for all cancer sites combined is challenging, because the primary cancer site and age distribution of patients may differ among areas or change over time. Cancer survival indices (CSIs) are summary measures of survival for cancers of all sites combined and are used in England and Europe to monitor temporal trends and examine geographic differences in survival. We describe the construction of the North American Cancer Survival Index and demonstrate how it can be used to compare survival by geographic area and by race. Methods We used data from 36 US cancer registries to estimate relative survival ratios for people diagnosed with cancer from 2006 through 2012 to create the CSI: the weighted sum of age-standardized, site-specific, relative survival ratios, with weights derived from the distribution of incident cases by sex and primary site from 2006 through 2008. The CSI was calculated for 32 registries for all races, 31 registries for whites, and 12 registries for blacks. Results The survival estimates standardized by age only versus age-, sex-, and site-standardized (CSI) were 64.1% (95% confidence interval [CI], 64.1%–64.2%) and 63.9% (95% CI, 63.8%–63.9%), respectively, for the United States for all races combined. The inter-registry ranges in unstandardized and CSI estimates decreased from 12.3% to 5.0% for whites, and from 5.4% to 3.9% for blacks. We found less inter-registry variation in CSI estimates than in unstandardized all-sites survival estimates, but disparities by race persisted. Conclusions CSIs calculated for different jurisdictions or periods are directly comparable, because they are standardized by age, sex, and primary site. A national CSI could be used to measure temporal progress in meeting public health objectives, such as Healthy People 2030. PMID:28910593
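The CSI described above is a weighted sum of (age-standardized) site-specific relative survival ratios, with weights taken from the distribution of incident cases by sex and primary site. A toy sketch with made-up strata and numbers (the real index uses many more sex-by-site strata and age standardization within each):

```python
def cancer_survival_index(site_survival, case_weights):
    """Weighted sum of site-specific relative survival ratios.

    site_survival: {(sex, site): relative survival ratio}
    case_weights:  {(sex, site): share of incident cases}; must sum to 1
    """
    if abs(sum(case_weights.values()) - 1.0) > 1e-9:
        raise ValueError("case weights must sum to 1")
    return sum(site_survival[k] * w for k, w in case_weights.items())

# Hypothetical strata for illustration only:
survival = {("F", "breast"): 0.90, ("M", "lung"): 0.20, ("M", "colon"): 0.65}
weights = {("F", "breast"): 0.5, ("M", "lung"): 0.3, ("M", "colon"): 0.2}
csi = cancer_survival_index(survival, weights)
```

Because the weights are fixed, two jurisdictions (or two time periods) computed with the same weight set differ only in their survival ratios, which is what makes the index directly comparable.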
Polanco, Patricio M; Ding, Ying; Knox, Jordan M; Ramalingam, Lekshmi; Jones, Heather; Hogg, Melissa E; Zureikat, Amer H; Holtzman, Matthew P; Pingpank, James; Ahrendt, Steven; Zeh, Herbert J; Bartlett, David L; Choudry, Haroon A
2016-02-01
High-grade (HG) mucinous appendiceal neoplasms (MAN) have a worse prognosis than low-grade histology. Our objective was to assess the safety and efficacy of cytoreductive surgery with hyperthermic intraperitoneal chemoperfusion (CRS/HIPEC) in patients with high-grade, high-volume (HG-HV) peritoneal metastases in whom the utility of this aggressive approach is controversial. Prospectively collected perioperative data were compared between patients with peritoneal metastases from HG-HV MAN, defined as simplified peritoneal cancer index (SPCI) ≥12, and those with high-grade, low-volume (HG-LV; SPCI <12) disease. Kaplan-Meier curves and multivariate Cox regression models identified prognostic factors affecting oncologic outcomes. Overall, 54 patients with HG-HV and 43 with HG-LV peritoneal metastases underwent CRS/HIPEC. The HG-HV group had longer operative time, increased blood loss/transfusion, and increased intensive care unit length of stay (p < 0.05). Incomplete macroscopic cytoreduction (CC-1/2/3) was higher in the HG-HV group compared with the HG-LV group (68.5 vs. 32.6 %; p = 0.005). Patients with HG-HV disease demonstrated worse survival than those with HG-LV disease (overall survival [OS] 17 vs. 42 m, p = 0.009; time to progression (TTP) 10 vs. 14 m, p = 0.024). However, when complete macroscopic resection (CC-0) was achieved, the OS and progression-free survival of patients with HG-HV disease were comparable with HG-LV disease (OS 56 vs. 52 m, p = 0.728; TTP 20 vs. 19 m, p = 0.393). In a multivariate Cox proportional hazard regression model, CC-0 resection was the only significant predictor of improved survival for patients with HG-HV disease. Although patients with HG-HV peritoneal metastases from MAN have worse prognosis compared with patients with HG-LV disease, their survival is comparable when complete macroscopic cytoreduction is achieved.
[Survival time of HIV/AIDS cases and related factors in Beijing, 1995-2015].
Li, Y; Wang, J; He, S F; Chen, J; Lu, H Y
2017-11-10
Objective: To analyze the survival time of HIV/AIDS cases and related factors in Beijing from 1995 to 2015. Methods: A retrospective cohort study was conducted to analyze the data of 12 874 HIV/AIDS cases. The data were collected from the Chinese HIV/AIDS Comprehensive Information Management System. The life-table method was applied to calculate the survival proportion, and Cox proportional hazards regression models were used to identify the factors related to survival time. Results: Among 12 874 HIV/AIDS cases, 303 (2.4%) died of AIDS-related diseases; 9 346 (72.6%) received antiretroviral therapy. The average survival time was 226.5 months (95%CI: 223.0-230.1), and the survival rates at 1, 5, 10, and 15 years were 98.2%, 96.4%, 93.2%, and 91.9%, respectively. A multivariate Cox proportional hazards regression model showed that AIDS phase (HR=1.439, 95%CI: 1.041-1.989), heterosexual transmission (HR=1.646, 95%CI: 1.184-2.289), being married (HR=2.186, 95%CI: 1.510-3.164), older age (≥60 years) at diagnosis (HR=6.608, 95%CI: 3.546-12.316), lower CD4(+) T cell counts at diagnosis (<350 cells/μl) (HR=8.711, 95%CI: 5.757-13.181), and receiving no antiretroviral therapy (ART) (HR=18.223, 95%CI: 13.317-24.937) were the high-risk factors influencing the survival of AIDS patients, compared with HIV phase, homosexual transmission, being unmarried, younger age (≤30 years), higher CD4(+) T cell count (≥350 cells/μl), and receiving ART. Conclusion: The average survival time of HIV/AIDS cases was 226.5 months after diagnosis. Receiving ART, higher CD4(+) T cell counts at the first test, HIV phase, younger age, being unmarried, and homosexual transmission were related to longer survival time of HIV/AIDS cases. Receiving no ART, lower CD4(+) T cell counts at the first test, AIDS phase, older age, being married, and heterosexual transmission indicated a higher risk of death due to AIDS.
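The life-table (actuarial) method used above estimates survival interval by interval, with the standard half-withdrawal adjustment for cases censored within an interval. A minimal sketch with hypothetical interval counts (not the study's data):

```python
def life_table_survival(interval_data):
    """Actuarial (life-table) cumulative survival per interval.

    interval_data: list of (n_entering, deaths, withdrawals) per interval.
    Effective number at risk = n - withdrawals/2, the usual actuarial
    assumption that censoring occurs uniformly within the interval.
    """
    s = 1.0
    survival = []
    for n, deaths, withdrawals in interval_data:
        n_eff = n - withdrawals / 2.0
        s *= 1.0 - deaths / n_eff
        survival.append(s)
    return survival
```

Unlike Kaplan-Meier, which recalculates at every event time, the life-table method groups follow-up into fixed intervals, which suits large registry cohorts like the 12 874 cases here.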
Population-based analysis of survival for hypoplastic left heart syndrome.
Hirsch, Jennifer C; Copeland, Glenn; Donohue, Janet E; Kirby, Russell S; Grigorescu, Violanda; Gurney, James G
2011-07-01
To analyze survival patterns among infants with hypoplastic left heart syndrome (HLHS) in the State of Michigan. Cases of HLHS prevalent at live birth were identified and confirmed within the Michigan Birth Defects Registry from 1992 to 2005 (n=406). Characteristics of infants with HLHS were compared with a 10:1 random control sample. Compared with 4060 control subjects, the 406 cases of HLHS were more frequently male (62.6% vs 51.4%), born prematurely (<37 weeks gestation; 15.3% vs 8.7%), and born at low birth weight (LBW) (<2.5 kg; 16.0% vs 6.6%). The HLHS 1-year survival rate improved over the study period (P=.041). Chromosomal abnormalities, LBW, premature birth, and living in a high-poverty neighborhood were significantly associated with death. Controlling for neighborhood poverty, infants with HLHS born at term were 3.2 times (95% CI: 1.9-5.3; P<.001) more likely than those born preterm or at LBW to survive at least 1 year. Controlling for age and weight, infants from low-poverty versus high-poverty areas were 1.8 times (95% CI: 1.1-2.8; P=.015) more likely to survive at least 1 year. Among infants with HLHS in Michigan, those who were premature, LBW, had chromosomal abnormalities, or lived in a high-poverty area were at increased risk for early death. Copyright © 2011 Mosby, Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brekhman, I.I.; Oskotskii, L.I.; Khakham, A.I.
1960-02-01
In ginseng therapy of irradiated mice (400 r), survival was 2.5 times higher, and with eleutherococcus treatment survival was almost 5 times higher, compared with the control group. Under the combined action of x-ray irradiation and overloading (rotation of mice for 15 seconds in a centrifuge at 400 to 500 revolutions per minute), the therapeutic effect of ginseng and eleutherococcus was more pronounced than when the experimental animals were exposed to x rays alone. (auth)
Ak, Guntulu; Metintas, Selma; Akarsu, Muhittin; Metintas, Muzaffer
2015-07-09
We aimed to evaluate the efficacy and safety of cis/carboplatin plus gemcitabine, which was previously used for mesothelioma but with no recorded proof of its efficacy, compared with cis/carboplatin plus pemetrexed, which is known to be effective in mesothelioma, in comparable historical groups of patients with malignant pleural mesothelioma. One hundred and sixteen patients received cis/carboplatin plus pemetrexed (group 1), while 30 patients received cis/carboplatin plus gemcitabine (group 2) between June 1999 and June 2012. The two groups were compared in terms of median survival and adverse events to chemotherapy. The mean ages of groups 1 and 2 were 60.7 and 60.8 years, respectively. Most of the patients (78.1%) had epithelial-type tumors, and 47% of the patients had stage IV disease. There was no difference between the two groups in terms of age, gender, asbestos exposure, histology, stage, Karnofsky performance status, presence of pleurodesis, prophylactic radiotherapy, second-line chemotherapy, or median hemoglobin and serum albumin levels. The median survival time from diagnosis to death or the last day of follow-up was 12.0 ± 0.95 months (95% CI: 10.15-13.85) for group 1 and 11.0 ± 1.09 months (95% CI: 8.85-13.15) for group 2 (log-rank: 0.142; p = 0.706). The median survival time from treatment to death or the last day of follow-up was 11.0 ± 0.99 months (95% CI: 9.06-12.94) for group 1 and 11.0 ± 1.52 months (95% CI: 8.02-13.97) for group 2 (log-rank: 0.584; p = 0.445). Stage and Karnofsky performance status were significant variables for median survival time by univariate analysis. After adjusting for stage and Karnofsky performance status, the chemotherapy schema had no significant effect on median survival time (OR: 0.837; 95% CI: 0.548-1.277; p = 0.409). Progression-free survival was 7.0 ± 0.61 months for group 1 and 6.0 ± 1.56 months for group 2 (log-rank: 0.522; p = 0.470). The treatment was generally well tolerated, and the side effects were similar in both groups. The study indicates that platinum plus gemcitabine is an effective and safe regimen in malignant pleural mesothelioma. Further research should include large randomized phase III trials comparing these agents.
Association of Low-Dose Aspirin and Survival of Women With Endometrial Cancer.
Matsuo, Koji; Cahoon, Sigita S; Yoshihara, Kosuke; Shida, Masako; Kakuda, Mamoru; Adachi, Sosuke; Moeini, Aida; Machida, Hiroko; Garcia-Sayre, Jocelyn; Ueda, Yutaka; Enomoto, Takayuki; Mikami, Mikio; Roman, Lynda D; Sood, Anil K
2016-07-01
To examine the survival outcomes in women with endometrial cancer who were taking low-dose aspirin (81-100 mg/d). A multicenter retrospective study was conducted examining patients with stage I-IV endometrial cancer who underwent hysterectomy-based surgical staging between January 2000 and December 2013 (N=1,687). Patient demographics, medical comorbidities, medication types, tumor characteristics, and treatment patterns were correlated to survival outcomes. A Cox proportional hazard regression model was used to estimate adjusted hazard ratio for disease-free and disease-specific overall survival. One hundred fifty-eight patients (9.4%, 95% confidence interval [CI] 8.8-11.9) were taking low-dose aspirin. Median follow-up time for the study cohort was 31.5 months. One hundred twenty-seven patients (7.5%) died of endometrial cancer. Low-dose aspirin use was significantly correlated with concurrent obesity, hypertension, diabetes mellitus, and hypercholesterolemia (all P<.001). Low-dose aspirin users were more likely to take other antihypertensive, antiglycemic, and anticholesterol agents (all P<.05). Low-dose aspirin use was not associated with histologic subtype, tumor grade, nodal metastasis, or cancer stage (all P>.05). On multivariable analysis, low-dose aspirin use remained an independent prognostic factor associated with an improved 5-year disease-free survival rate (90.6% compared with 80.9%, adjusted hazard ratio 0.46, 95% CI 0.25-0.86, P=.014) and disease-specific overall survival rate (96.4% compared with 87.3%, adjusted hazard ratio 0.23, 95% CI 0.08-0.64, P=.005). 
The increased survival effect noted with low-dose aspirin use was greatest in patients whose age was younger than 60 years (5-year disease-free survival rates, 93.9% compared with 84.0%, P=.013), body mass index was 30 or greater (92.2% compared with 81.4%, P=.027), who had type I cancer (96.5% compared with 88.6%, P=.029), and who received postoperative whole pelvic radiotherapy (88.2% compared with 61.5%, P=.014). These four factors remained significant for disease-specific overall survival (all P<.05). Our results suggest that low-dose aspirin use is associated with improved survival outcomes in women with endometrial cancer, especially in those who are young, obese, with low-grade disease, and who receive postoperative radiotherapy.
Halliday, Jane; Collins, Veronica; Riley, Merilyn; Youssef, Danielle; Muggli, Evelyne
2009-01-01
With this study we aimed to compare survival rates for children with Down syndrome in two time periods, one before prenatal screening (1988-1990) and one contemporaneous with screening (1998-2000), and to examine the frequency of comorbidities and their influence on survival rates. Record linkage was performed between the population-based Victorian Birth Defects Register and records of deaths in children up to 15 years of age collected under the auspices of the Consultative Council on Obstetric and Pediatric Mortality and Morbidity. Cases of Down syndrome were coded according to the presence or absence of comorbidities by using the International Classification of Diseases, Ninth Revision classification of birth defects. Kaplan-Meier survival functions and log-rank tests for equality of survival distributions were performed. Of infants liveborn with Down syndrome in 1998-2000, 90% survived to 5 years of age, compared with 86% in the earlier cohort. With fetal deaths excluded, the proportion of isolated Down syndrome cases in the earlier cohort was 48.7% compared with 46.1% in the most recent cohort. In 1988-1990 there was at least 1 cardiac defect in 41.1% of cases, and in 45.4% in 1998-2000. There was significant variation in survival rates for the different comorbidity groupings in the 1988-1990 cohort, but this was not so evident in the 1998-2000 cohort. Survival of children with Down syndrome continues to improve, and there is an overall survival figure of 90% to at least 5 years of age. It is clear from this study that prenatal screening technologies are not differentially ascertaining fetuses with Down syndrome and additional defects, because there has been no proportional increase in births of isolated cases with Down syndrome.
Trends in incidence and survival for anal cancer in New South Wales, Australia, 1972-2009.
Soeberg, Matthew J; Rogers, Kris; Currow, David C; Young, Jane M
2015-12-01
Little is known about the incidence and survival of anal cancer in New South Wales (NSW), Australia, as anal cancer cases are often grouped together with other colorectal cancers in descriptive epidemiological analyses. We studied patterns and trends in the incidence and survival of people diagnosed with anal cancer in NSW, Australia, 1972-2009 (n=2724). We also predicted anal cancer incidence in NSW during 2010-2032. Given the human papillomavirus-associated aetiology for most anal cancers, we quantified these changes over time in incidence and survival by histological subtype: anal squamous cell carcinoma (ASCC) and anal adenocarcinoma (AAC). There was a linear increase in incident anal cancer cases in NSW with an average annual percentage change (AAPC) of 1.6 (95% CI 1.1-2.0) such that, in combination with age-period-cohort modelling, we predict there will be 198 cases of anal cancer in the 2032 calendar year (95% CI 169-236). Almost all of these anal cancer cases are projected to be ASCC (94%). Survival improved over time regardless of histological subtype. However, five-year relative survival was substantially higher for people with ASCC (70% (95% CI 66-74%)) compared to AAC (51% (95% CI 43-59%)), a 37% relative difference. Survival was also greater for women (69% (95% CI 64-73%)) with ASCC compared to men (55% (95% CI 50-60%)). It was not possible to estimate survival by stage at diagnosis, particularly given that 8% of all cases were recorded as having distant stage and 22% had missing stage data. Aetiological explanations, namely exposure to oncogenic types of human papillomavirus, along with demographic changes most likely explain the actual and projected increase in ASCC case numbers. Survival differences by gender and histological subtype point to areas where further research is warranted to improve treatment and outcomes for all anal cancer patients. Copyright © 2015 Elsevier Ltd. All rights reserved.
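For a single trend segment, an average annual percentage change like the AAPC of 1.6 reported above is commonly derived from the slope of a log-linear least-squares fit of case counts on calendar year, AAPC = 100·(e^β − 1). A sketch under that assumption (the study's joinpoint/age-period-cohort machinery is not reproduced here):

```python
import math

def average_annual_percent_change(years, counts):
    """AAPC for a single log-linear trend segment.

    Fit ln(count) = a + b * year by ordinary least squares,
    then AAPC = 100 * (exp(b) - 1).
    """
    n = len(years)
    mean_x = sum(years) / n
    logs = [math.log(c) for c in counts]
    mean_y = sum(logs) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, logs)) / \
        sum((x - mean_x) ** 2 for x in years)
    return 100.0 * (math.exp(b) - 1.0)
```

A series growing by exactly 2% a year recovers an AAPC of 2.0, which is a quick sanity check on the formula.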
Kalderstam, Jonas; Edén, Patrik; Ohlsson, Mattias
2015-01-01
We investigate a new method to place patients into risk groups in censored survival data. Properties such as median survival time and end survival rate are implicitly improved by optimizing the area under the survival curve. Artificial neural networks (ANN) are trained to either maximize or minimize this area using a genetic algorithm, and are combined into an ensemble to predict a low, intermediate, or high risk group. Estimated patient risk can influence treatment choices and is important for study stratification. A common approach is to sort the patients according to a prognostic index and then group them along the quartile limits. The Cox proportional hazards model (Cox) is one example of this approach. Another approach to risk grouping is recursive partitioning (Rpart), which constructs a decision tree where each branch point maximizes the statistical separation between the groups. ANN, Cox, and Rpart are compared on five publicly available data sets with varying properties. Cross-validation, as well as separate test sets, are used to validate the models. Results on the test sets show comparable performance, except for the smallest data set, where Rpart's predicted risk groups turn out to be inverted, an example of crossing survival curves. Cross-validation shows that all three models exhibit crossing of some survival curves on this small data set, but that the ANN model manages the best separation of groups in terms of median survival time before such crossings. The conclusion is that optimizing the area under the survival curve is a viable approach to identify risk groups. Training ANNs to optimize this area combines two key strengths of prognostic indices and Rpart. First, a desired minimum group size can be specified, as for a prognostic index. Second, the method can utilize non-linear effects among the covariates, which Rpart is also able to do.
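Optimizing the area under the survival curve requires evaluating that area for a step-function curve such as a Kaplan-Meier estimate; up to a fixed horizon this quantity is the restricted mean survival time. A minimal sketch of the area computation only (the ANN and genetic-algorithm training loop is not shown):

```python
def area_under_step_curve(curve, t_max):
    """Area under a right-continuous survival step function up to t_max.

    curve: list of (event_time, survival_after_event) pairs, sorted by
           time; the curve implicitly starts at (0, 1.0).
    """
    area = 0.0
    prev_t, prev_s = 0.0, 1.0
    for t, s in curve:
        if t >= t_max:
            break
        area += prev_s * (t - prev_t)  # rectangle up to the next drop
        prev_t, prev_s = t, s
    area += prev_s * (t_max - prev_t)  # tail rectangle to the horizon
    return area
```

A training procedure like the one described would compute this area separately for each candidate risk group and reward assignments that push the high-risk group's area down and the low-risk group's area up.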
Modeling microbial survival in buildup biofilm for complex medical devices
2009-01-01
Background Flexible endoscopes undergo repeated rounds of patient-use and reprocessing. Some evidence indicates that there is an accumulation or build-up of organic material that occurs over time in endoscope channels. This "buildup biofilm" (BBF) develops as a result of cyclical exposure to wet and dry phases during usage and reprocessing. This study investigated whether the BBF matrix represents a greater challenge to disinfectant efficacy and microbial eradication than traditional biofilm (TBF), which forms when a surface is constantly bathed in fluid. Methods Using the MBEC (Minimum Biofilm Eradication Concentration) system, a unique modelling approach was developed to evaluate microbial survival in BBF formed by repetitive cycles of drying, disinfectant exposure and re-exposure to the test organism. This model mimics the cumulative effect of the reprocessing protocol on flexible endoscopes. Glutaraldehyde (GLUT) and accelerated hydrogen peroxide (AHP) were evaluated to assess the killing of microbes in TBF and BBF. Results The data showed that the combination of an organic matrix and aldehyde disinfection quickly produced a protective BBF that facilitated high levels of organism survival. In cross-linked BBF formed under high nutrient conditions the maximum colony forming units (CFU) reached ~6 Log10 CFU/peg. However, if an oxidizing agent was used for disinfection and if organic levels were kept low, organism survival did not occur. A key finding was that once established, the microbial load of BBF formed by GLUT exposure had a faster rate of accumulation than in TBF. The rate of biofilm survival post high-level disinfection (HLD), determined by the maximum Log10 CFU/initial Log10 CFU, was 10 for E. faecalis and 8.6 for P. aeruginosa in BBF, significantly different from a survival rate of ~2 for each organism in TBF. Data from indirect outgrowth testing demonstrated for the first time that there is organism survival in the matrix.
Both TBF and BBF had surviving organisms when GLUT was used. For AHP, survival was seen less frequently in BBF than in TBF. Conclusion This BBF model demonstrated for the first time that survival of a wide range of microorganisms does occur in BBF, with significantly more rapid outgrowth compared to TBF. This is most pronounced when GLUT is used compared to AHP. The data support the need for meticulous cleaning of reprocessed endoscopes, since the presence of organic material and microorganisms prevents effective disinfection when GLUT and AHP are used. However, cross-linking agents like GLUT are not as effective when there is BBF. The data from the MBEC model of BBF suggest that for flexible endoscopes that are repeatedly used and reprocessed, the assurance of effective high-level disinfection may decrease if BBF develops within the channels. PMID:19426471
Zhao, Ming; Wang, Jian-peng; Wu, Pei-hong; Zhang, Fu-jun; Huang, Zi-lin; Li, Wang; Zhang, Liang; Pan, Chang-chuan; Li, Chuan-xing; Jiang, Yong
2010-11-09
To evaluate the clinical efficacy and survival rate of transarterial chemoembolization (TACE) alone or plus radiofrequency ablation (RFA) in patients with intermediate or advanced stage primary hepatocellular carcinoma (HCC). In this retrospective study, 467 cases received RFA or TACE plus RFA. Among them, 167 cases with strict clinical procedure (TACE alone or plus RFA) and complete follow-up data were included. Eighty-seven cases received TACE and 80 cases had TACE plus RFA between January 2000 and December 2006. Hierarchical analyses were performed using log-rank tests and survival curves were estimated by the Kaplan-Meier method. A total of 167 patients received TACE alone or plus RFA for a follow-up period of 1 to 89 months. In the TACE alone group, the time-to-progression (TTP) was an average of 3.6 months. The median survival was 13 months, one-year survival rate 52.9%, three-year survival rate 11.5% and five-year survival rate 4.6%. In the TACE plus RFA group, the TTP was an average of 10.8 months. The median survival time was 30 months, one-year survival rate 85.0%, three-year survival rate 45.0% and five-year survival rate 11.3%. In the TACE alone group, the median survival of intermediate stage HCC was 14 months, one-year survival rate 62.2%, three-year survival rate 13.3% and five-year survival rate 4.4%; in the TACE plus RFA group, the median survival of intermediate stage HCC was 32 months, one-year survival rate 90.1%, three-year survival rate 52.9% and five-year survival rate 13.7%. All differences between the two groups were statistically significant (P < 0.05).
For the advanced stage HCC, the median survival time was 12 months, one-year survival rate 35%, three-year survival rate 7.1% and five-year survival rate 0 in the TACE alone group versus 28 months, 62.1%, 24.1% and 6.9% in the TACE plus RFA group (P = 0.00). The differences between the two groups were statistically significant in both intermediate and advanced stage HCC. Among them, 60/485 (12.4%) patients required therapy for post-TACE hepatic dysfunction versus 13/168 (7.7%) in the TACE plus RFA group (P = 0.004, ANOVA method). The regimen of TACE plus RFA offers advantages over TACE alone in tumor control, liver function protection and survival extension in the treatment of intermediate or advanced stage HCC.
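The Kaplan-Meier product-limit method used above can be sketched in a few lines (a minimal illustration in Python/NumPy; the follow-up times and event indicators are hypothetical, not the study data):

```python
import numpy as np

def kaplan_meier(times, events):
    """Product-limit survival estimate.
    times  : follow-up time per patient (e.g. months)
    events : 1 = death observed, 0 = censored
    Returns (distinct event times, S(t) just after each)."""
    times = np.asarray(times, float)
    events = np.asarray(events, int)
    t_uniq = np.unique(times[events == 1])
    surv, s = [], 1.0
    for t in t_uniq:
        at_risk = np.sum(times >= t)               # still under observation
        deaths = np.sum((times == t) & (events == 1))
        s *= 1.0 - deaths / at_risk                # product-limit step
        surv.append(s)
    return t_uniq, np.array(surv)

def median_survival(t_uniq, surv):
    """First event time at which S(t) drops to 0.5 or below."""
    below = np.where(surv <= 0.5)[0]
    return t_uniq[below[0]] if below.size else None

# Hypothetical follow-up (months) and event indicators for six patients
t, s = kaplan_meier([3, 6, 6, 10, 12, 15], [1, 1, 0, 1, 0, 1])
```

Censored patients contribute to the at-risk counts up to their last follow-up, which is what distinguishes this estimator from a naive survival fraction.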
Trastuzumab and survival of patients with metastatic breast cancer.
Kast, Karin; Schoffer, Olaf; Link, Theresa; Forberger, Almuth; Petzold, Andrea; Niedostatek, Antje; Werner, Carmen; Klug, Stefanie J; Werner, Andreas; Gatzweiler, Axel; Richter, Barbara; Baretton, Gustavo; Wimberger, Pauline
2017-08-01
Prognosis of Her2-positive breast cancer has changed since the introduction of trastuzumab for treatment in metastatic and early breast cancer. It has even been described as better than the prognosis of Her2-negative metastatic breast cancer. The purpose of this study was to evaluate the effect of trastuzumab in our cohort. In addition, the effect of adjuvant pretreatment with trastuzumab on survival of patients with metastatic Her2-positive breast cancer was analyzed. All patients with primary breast cancer of the Regional Breast Cancer Center Dresden diagnosed during the years 2001-2013 were analyzed for treatment with or without trastuzumab in the adjuvant and in the metastatic treatment setting using Kaplan-Meier survival estimation and Cox regression. Age and tumor stage at time of first diagnosis of breast cancer as well as hormone receptor status, grading, time, and site of metastasis at first diagnosis of distant metastatic disease were analyzed. Of 4,481 female patients with primary breast cancer, 643 presented with metastatic disease. Her2-positive status was documented in 465 patients, including 116 patients with primary or secondary metastases. Median survival of patients with Her2-positive primary metastatic disease was 3.0 years (95% CI 2.3-4.0). After adjustment for other factors, survival was better in patients with Her2-positive breast cancer treated with trastuzumab than in those with Her2-negative metastatic disease (HR 2.10; 95% CI 1.58-2.79). Kaplan-Meier analysis of adjuvant therapy with and without trastuzumab showed a trend toward better survival in patients not pretreated with trastuzumab. Median survival was highest in hormone receptor-positive Her2-positive (triple-positive) primary metastatic breast cancer patients, at 3.3 years (95% CI 2.3-4.6). Prognosis of patients with Her2-positive metastatic breast cancer after trastuzumab treatment is more favorable than for Her2-negative breast cancer.
The role of adjuvant chemotherapy with or without trastuzumab warrants further research. Survival is best in triple-positive metastatic breast cancer. This will affect counseling at the time of first diagnosis of metastatic breast cancer.
Du, Zhongli; Zhang, Wencheng; Zhou, Yuling; Yu, Dianke; Chen, Xiabin; Chang, Jiang; Qiao, Yan; Zhang, Meng; Huang, Ying; Wu, Chen; Xiao, Zefen; Tan, Wen; Lin, Dongxin
2015-09-01
To investigate whether single nucleotide polymorphisms (SNPs) in the ataxia telangiectasia mutated (ATM) gene are associated with survival in patients with esophageal squamous cell carcinoma (ESCC) receiving radiation therapy or chemoradiation therapy or surgery only. Four tagSNPs of ATM were genotyped in 412 individuals with clinical stage III or IV ESCC receiving radiation therapy or chemoradiation therapy, and in 388 individuals with stage I, II, or III ESCC treated with surgery only. Overall survival time of ESCC among different genotypes was estimated by Kaplan-Meier plot, and the significance was examined by log-rank test. The hazard ratios (HRs) and 95% confidence intervals (CIs) for death from ESCC among different genotypes were computed by a Cox proportional regression model. We found 2 SNPs, rs664143 and rs664677, associated with survival time of ESCC patients receiving radiation therapy. Individuals with the rs664143A allele had poorer median survival time than those with the rs664143G allele (14.0 vs 20.0 months), with the HR for death being 1.45 (95% CI 1.12-1.89). Individuals with the rs664677C allele also had worse median survival time than those with the rs664677T allele (14.0 vs 23.5 months), with the HR of 1.57 (95% CI 1.18-2.08). Stratified analysis showed that these associations were present in both stage III and IV cancer and different radiation therapy techniques. Significant associations were also found between the SNPs and locoregional progression or progression-free survival. No association between these SNPs and survival time was detected in ESCC patients treated with surgery only. These results suggest that the ATM polymorphisms might serve as independent biomarkers for predicting prognosis in ESCC patients receiving radiation therapy. Copyright © 2015 Elsevier Inc. All rights reserved.
Impact of transplant nephrectomy on peak PRA levels and outcome after kidney re-transplantation
Tittelbach-Helmrich, Dietlind; Pisarski, Przemyslaw; Offermann, Gerd; Geyer, Marcel; Thomusch, Oliver; Hopt, Ulrich Theodor; Drognitz, Oliver
2014-01-01
AIM: To determine the impact of transplant nephrectomy on peak panel reactive antibody (PRA) levels, patient and graft survival in kidney re-transplants. METHODS: From 1969 to 2006, a total of 609 kidney re-transplantations were performed at the University of Freiburg and the Campus Benjamin Franklin of the University of Berlin. Patients with PRA levels above 5% before first kidney transplantation were excluded from further analysis (n = 304). Patients with graft nephrectomy (n = 245, NE+) were retrospectively compared to 60 kidney re-transplants without prior graft nephrectomy (NE-). RESULTS: Peak PRA levels between the first and the second transplantation were higher in patients undergoing graft nephrectomy (P = 0.098), whereas the last PRA levels before the second kidney transplantation did not differ between the groups. Age adjusted survival for the second kidney graft, censored for death with functioning graft, was comparable in both groups. Waiting time between first and second transplantation did not influence graft survival significantly in the group that underwent nephrectomy. In contrast, patients without nephrectomy experienced better graft survival rates when re-transplantation was performed within one year after graft loss (P = 0.033). Age adjusted patient survival rates at 1 and 5 years were 94.1% and 86.3% vs 83.1% and 75.4% in groups NE+ and NE-, respectively (P < 0.01). CONCLUSION: Transplant nephrectomy leads to a temporary increase in PRA levels that normalize before kidney re-transplantation. In patients without nephrectomy of a non-viable kidney graft, timing of re-transplantation significantly influences graft survival after a second transplantation. Most importantly, transplant nephrectomy is associated with a significantly longer patient survival. PMID:25032103
Kemna, Mariska; Albers, Erin; Bradford, Miranda C; Law, Sabrina; Permut, Lester; McMullan, D Mike; Law, Yuk
2016-03-01
The effect of donor-recipient sex matching on long-term survival in pediatric heart transplantation is not well known. Adult data have shown worse survival when male recipients receive a sex-mismatched heart, with conflicting results in female recipients. We analyzed 5795 heart transplant recipients ≤ 18 yr in the Scientific Registry of Transplant Recipients (1990-2012). Recipients were stratified based on donor and recipient sex, creating four groups: MM (N = 1888), FM (N = 1384), FF (N = 1082), and MF (N = 1441). Males receiving sex-matched donor hearts had increased unadjusted allograft survival at five yr (73.2 vs. 71%, p = 0.01). However, this survival advantage disappeared with longer follow-up and when adjusted for additional risk factors by multivariable Cox regression analysis. In contrast, for females, receiving a sex-mismatched heart was associated with an 18% higher risk of allograft loss over time compared to receiving a sex-matched heart (HR 1.18, 95% CI: 1.00-1.38) and a 26% higher risk compared to sex-matched male recipients (HR 1.26, 95% CI: 1.10-1.45). Females who receive a heart from a male donor appear to have a distinct long-term survival disadvantage compared to all other groups. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Survival with Three-Times Weekly In-Center Nocturnal Versus Conventional Hemodialysis
Xu, Jianglin; Suri, Rita S.; Nesrallah, Gihad; Lindsay, Robert; Garg, Amit X.; Lester, Keith; Ofsthun, Norma; Lazarus, Michael; Hakim, Raymond M.
2012-01-01
Whether the duration of hemodialysis treatments improves outcomes remains controversial. Here, we evaluated survival and clinical changes associated with converting from conventional hemodialysis (mean=3.75 h/treatment) to in-center nocturnal hemodialysis (mean=7.85 h/treatment). All 959 consecutive patients who initiated nocturnal hemodialysis for the first time in 77 Fresenius Medical Care facilities during 2006 and 2007 were eligible. We used Cox models to compare risk for mortality during 2 years of follow-up in a 1:3 propensity score–matched cohort of 746 nocturnal and 2062 control patients on conventional hemodialysis. Two-year mortality was 19% among nocturnal hemodialysis patients compared with 27% among conventional patients. Nocturnal hemodialysis associated with a 25% reduction in the risk for death after adjustment for age, body mass index, and dialysis vintage (hazard ratio=0.75, 95% confidence interval=0.61–0.91, P=0.004). With respect to clinical features, interdialytic weight gain, albumin, hemoglobin, dialysis dose, and calcium increased on nocturnal therapy, whereas postdialysis weight, predialysis systolic blood pressure, ultrafiltration rate, phosphorus, and white blood cell count declined (all P<0.001). In summary, notwithstanding the possibility of residual selection bias, conversion to treatment with nocturnal hemodialysis associates with favorable clinical features, laboratory biomarkers, and improved survival compared with propensity score–matched controls. The potential impact of extended treatment time on clinical outcomes while maintaining a three times per week hemodialysis schedule requires evaluation in future clinical trials. PMID:22362905
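The 1:3 propensity score matching used above can be illustrated with a greedy nearest-neighbour sketch (a minimal illustration in Python; the scores are hypothetical, and a real analysis would typically match on the logit of the score and enforce a caliper):

```python
import numpy as np

def greedy_match(treated_ps, control_ps, ratio=3):
    """For each treated patient's propensity score, pick the `ratio`
    closest not-yet-used control scores (greedy nearest neighbour,
    without replacement). Returns {treated index: [control indices]}."""
    control_ps = np.asarray(control_ps, float)
    unused = set(range(len(control_ps)))
    matches = {}
    for i, p in enumerate(treated_ps):
        # rank remaining controls by distance to this treated score
        pool = sorted(unused, key=lambda j: abs(control_ps[j] - p))
        picked = pool[:ratio]
        matches[i] = picked
        unused -= set(picked)
    return matches

# Hypothetical scores: 2 treated (nocturnal) vs 8 control (conventional)
m = greedy_match([0.30, 0.70], [0.10, 0.28, 0.33, 0.31, 0.69, 0.72, 0.68, 0.90])
```

Matching without replacement, as here, means the match quality can depend on the order in which treated patients are processed; optimal (rather than greedy) matching avoids that at higher computational cost.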
Broët, Philippe; Tsodikov, Alexander; De Rycke, Yann; Moreau, Thierry
2004-06-01
This paper presents two-sample statistics suited for testing equality of survival functions against improper semi-parametric accelerated failure time alternatives. These tests are designed for comparing either the short- or the long-term effect of a prognostic factor, or both. These statistics are obtained as partial likelihood score statistics from a time-dependent Cox model. As a consequence, the proposed tests can be very easily implemented using widely available software. A breast cancer clinical trial is presented as an example to demonstrate the utility of the proposed tests.
A new approach to estimate time-to-cure from cancer registries data.
Boussari, Olayidé; Romain, Gaëlle; Remontet, Laurent; Bossard, Nadine; Mounier, Morgane; Bouvier, Anne-Marie; Binquet, Christine; Colonna, Marc; Jooste, Valérie
2018-04-01
Cure models have been adapted to the net survival context to provide important indicators from population-based cancer data, such as the cure fraction and the time-to-cure. However, existing methods for computing time-to-cure suffer from some limitations. Cure models in the net survival framework were briefly overviewed and a new definition of time-to-cure was introduced as the time TTC at which P(t), the estimated covariate-specific probability of being cured at a given time t after diagnosis, reaches 0.95. We applied flexible parametric cure models to data of four cancer sites provided by the French network of cancer registries (FRANCIM). Then estimates of the time-to-cure by TTC and by two existing methods were derived and compared. Cure fractions and probabilities P(t) were also computed. Depending on the age group, TTC ranged from 8 to 10 years for colorectal and pancreatic cancer and was nearly 12 years for breast cancer. In thyroid cancer patients under 55 years at diagnosis, TTC was strikingly 0: the probability of being cured was >0.95 just after diagnosis. This is an interesting result regarding the health insurance premiums of these patients. The estimated values of time-to-cure from the three approaches were close for colorectal cancer only. We propose a new approach, based on the estimated covariate-specific probability of being cured, to estimate time-to-cure. Compared to two existing methods, the new approach seems to be more intuitive and natural and less sensitive to the survival time distribution. Copyright © 2018 Elsevier Ltd. All rights reserved.
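In a mixture cure model, the probability of being cured given survival to time t is P(t) = π / (π + (1 − π)·Su(t)), where π is the cure fraction and Su(t) the net survival of the uncured; the TTC defined above is then the first t at which P(t) reaches 0.95. A minimal numeric sketch, assuming (hypothetically) exponential survival of the uncured; the parameters are illustrative, not FRANCIM estimates:

```python
import math

def p_cured(t, pi, median_uncured):
    """P(cured | alive at t) under a mixture cure model with
    exponential net survival of the uncured."""
    lam = math.log(2) / median_uncured      # exponential rate from median
    s_uncured = math.exp(-lam * t)
    return pi / (pi + (1 - pi) * s_uncured)

def time_to_cure(pi, median_uncured, threshold=0.95, step=0.01):
    """First time (years) at which P(t) reaches the threshold."""
    if pi >= threshold:                     # cured 'just after diagnosis'
        return 0.0
    t = 0.0
    while p_cured(t, pi, median_uncured) < threshold:
        t += step
    return round(t, 2)

# Hypothetical: 55% cure fraction, 1.5-year median survival of the uncured
ttc = time_to_cure(0.55, 1.5)
```

Note how the thyroid-cancer finding above falls out of the definition: if the cure fraction π already exceeds 0.95, then P(0) ≥ 0.95 and TTC is 0.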
Rubin H. Flocks and colloidal gold treatments for prostate cancer.
Rosevear, Henry M; Lightfoot, Andrew J; O'Donnell, Michael A; Platz, Charles E; Loening, Stefan A; Hawtrey, Charles E
2011-01-01
In the early 1950s, Rubin H. Flocks of the University of Iowa began to treat prostate cancer patients with colloidal gold (Au(198)) therapy, evolving his technique over nearly 25 years in 1515 patients. We reviewed the long-term outcomes of Flocks' prostate cancer patients as compared to those patients treated by other methods at the University of Iowa before Flocks' chairmanship. We reviewed archived patient records, Flocks' published data, and long-term survival data from the Iowa Tumor Registry to determine short- and long-term outcomes of Flocks' work with colloidal gold. We also reviewed the literature of Flocks' time to compare his outcomes against those of his contemporaries. The use of colloidal gold, either as primary or adjunctive therapy, provided short- and long-term survival benefit for the majority of Flocks' patients as compared to historical treatment options (p < 0.001). Flocks' use of colloidal gold for the treatment of locally advanced prostate cancer offered short- and long-term survival benefits compared to other contemporary treatments.
Büchele, Fabian; Döbrössy, Máté; Hackl, Christina; Jiang, Wei; Papazoglou, Anna; Nikkhah, Guido
2014-08-01
Following transplantation of foetal primary dopamine (DA)-rich tissue for neurorestorative treatment of Parkinson's disease (PD), only 5-10% of the functionally relevant DAergic cells survive both in experimental models and in clinical studies. The current work tested how a two-step grafting protocol could have a positive impact on graft survival. DAergic tissue is divided in two portions and grafted in two separate sessions into the same target area within a defined time interval. We hypothesized that the first graft creates a "DAergic" microenvironment or "nest", similar to the perinatal substantia nigra, that stimulates and protects the second graft. 6-OHDA-lesioned rats were sequentially transplanted with wild-type (GFP-, first graft) and transgenic (GFP+, second graft) DAergic cells at time intervals of 2, 5 or 9 days. Each group was further divided into two sub-groups receiving either 200k (low cell number groups: 2dL, 5dL, 9dL) or 400k cells (high cell number groups: 2dH, 5dH, 9dH) as first graft. During the second transplantation, all groups received the same amount of 200k GFP+ cells. Controls received either low or high cell numbers in one single session (standard protocol). Drug-induced rotations, at 2 and 6 weeks after grafting, showed significant improvement compared to the baseline lesion levels without significant differences between the groups. Rats were sacrificed 8 weeks after transplantation for post-mortem histological assessment. Both two-step groups with the time interval of 2 days (2dL and 2dH) showed a significantly higher survival of DAergic cells compared to their respective standard control group (2dL, +137%; 2dH, +47%). Interposing longer intervals of 5 or 9 days resulted in the loss of statistical significance, neutralising the beneficial two-step grafting effect. Furthermore, the transplants in the 2dL and 2dH groups had higher graft volume and DA-fibre-density values compared to all other two-step groups.
They also showed intense growth of GFP+ vessels, completely absent in control grafts, in regions where the two grafts overlap, indicating second-graft-derived angiogenesis. In summary, the study shows that two-step grafting with a 2-day time interval significantly increases DAergic cell survival compared to the standard protocol. Furthermore, our results demonstrate, for the first time, a donor-derived neoangiogenesis, leading to a new understanding of graft survival and development in the field of cell-replacement therapies for neurodegenerative diseases. Copyright © 2014 Elsevier Inc. All rights reserved.
Joseph, Djenaba A; Johnson, Chris J; White, Arica; Wu, Manxia; Coleman, Michel P
2017-12-15
In the first CONCORD study, 5-year survival for patients diagnosed with rectal cancer between 1990 and 1994 was <60%, with large racial disparities noted in the majority of participating states. We have updated these findings to 2009 by examining population-based survival by stage of disease at the time of diagnosis, race, and calendar period. Data from the CONCORD-2 study were used to compare survival among individuals aged 15 to 99 years who were diagnosed in 37 states encompassing up to 80% of the US population. We estimated net survival up to 5 years after diagnosis, correcting for background mortality with state-specific and race-specific life tables. Survival estimates were age-standardized with the International Cancer Survival Standard weights. We present survival estimates by race (all, black, and white) for 2001 through 2003 and 2004 through 2009 to account for changes in data collection for Surveillance, Epidemiology, and End Results Summary Stage 2000. There was a small increase in 1-year, 3-year, and 5-year net survival between 2001-2003 (84.6%, 70.7%, and 63.2%, respectively) and 2004-2009 (85.1%, 71.5%, and 64.1%, respectively). Black individuals were found to have lower 1-year, 3-year, and 5-year survival than white individuals in both periods; the absolute difference in survival between black and white individuals declined only for 5-year survival. Black patients had lower 5-year survival than whites at each stage at the time of diagnosis in both time periods. There was little improvement noted in net survival for patients with rectal cancer, with persistent disparities noted between black and white individuals. Additional investigation is needed to identify and implement effective interventions to ensure the consistent and equitable use of high-quality screening, diagnosis, and treatment to improve survival for patients with rectal cancer. Cancer 2017;123:5037-58. Published 2017. This article is a U.S.
Government work and is in the public domain in the USA.
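Age standardization with the International Cancer Survival Standard (ICSS) weights, as used above, is a weighted average of age-specific net survival estimates. A minimal sketch (the age-specific survival values are hypothetical; the weights shown are the commonly cited ICSS "weight set 1" for ages 15-44, 45-54, 55-64, 65-74 and 75+, included for illustration):

```python
def age_standardized_survival(age_specific_survival, weights):
    """Weighted mean of age-specific net survival (ICSS weighting)."""
    assert abs(sum(weights) - 1.0) < 1e-9   # weights must sum to 1
    return sum(s * w for s, w in zip(age_specific_survival, weights))

# ICSS weight set 1 (ages 15-44, 45-54, 55-64, 65-74, 75+)
ICSS1 = [0.07, 0.12, 0.23, 0.29, 0.29]

# Hypothetical age-specific 5-year net survival estimates
surv = [0.75, 0.72, 0.68, 0.62, 0.50]
std_surv = age_standardized_survival(surv, ICSS1)
```

Using a fixed standard age distribution makes survival comparable across populations and calendar periods whose real age structures differ.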
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beer, W. Nicholas; Iltis, Susannah; Anderson, James J.
2009-01-01
Columbia Basin Research uses the COMPASS model on a daily basis during the outmigration of Snake River Chinook and steelhead smolts to predict downstream passage and survival. Fish arrival predictions and observations from program RealTime along with predicted and observed environmental conditions are used to make in-season predictions of arrival and survival to various dams in the Columbia and Snake Rivers. For 2008, calibrations of travel and survival parameters for two stocks of fish, Snake River yearling PIT-tagged wild chinook salmon (chin1pit) and Snake River PIT-tagged steelhead (lgrStlhd), were used to model travel and survival of steelhead and chinook stocks from Lower Granite Dam (LWG) or McNary Dam (MCN) to Bonneville Dam (BON). This report summarizes the success of the COMPASS/RealTime process to model these migrations as they occur. We compared model results on timing and survival to data from two sources: stock specific counts at dams and end-of-season control survival estimates (Jim Faulkner, NOAA, pers. comm. Dec. 16, 2008). The difference between the predicted and observed day of median passage and the Mean Absolute Deviation (MAD) between predicted and observed arrival cumulative distributions are measures of timing accuracy. MAD is essentially the average percentage error over the season. The difference between the predicted and observed survivals is a measure of survival accuracy. Model results and timing data were in good agreement from LWG to John Day Dam (JDA). Predictions of median passage days for the chin1pit and lgrStlhd stocks were 0 and 2 days (respectively) later than observed. MAD for chin1pit and lgrStlhd stocks at JDA were 2.3% and 5.9% (respectively). Between JDA and BON, modeling and timing data were not as well matched. At BON, median passage predictions were 6 and 10 days later than observed and MAD values were 7.8% and 16.0% respectively. Model results and survival data were in good agreement from LWG to MCN.
COMPASS predicted survivals of 0.77 and 0.69 for chin1pit and lgrStlhd, while the data-control survivals were 0.79 and 0.68. The differences are 0.02 and 0.01 (respectively), nearly identical. However, from MCN to BON, COMPASS predicted survivals of 0.74 and 0.69 while the data-control survivals were 0.47 and 0.53 respectively, differences of 0.27 and 0.16. In summary: travel and survival of chin1pit and lgrStlhd stocks were well modeled in the upper reaches. Fish in the lower reaches down through BON suffered unmodeled mortality, and/or passed BON undetected. A drop in bypass fraction and unmodeled mortality during the run could produce such patterns by shifting the observed median passage day to appear artificially early.
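The two timing-accuracy measures described above, the difference in median passage day and the MAD between predicted and observed cumulative arrival distributions, can be computed directly (a minimal sketch; the daily passage fractions are hypothetical, not 2008 run data):

```python
import numpy as np

def median_passage_day(cdf):
    """First day on which the cumulative arrival fraction reaches 0.5."""
    return int(np.argmax(np.asarray(cdf) >= 0.5))

def mad(pred_cdf, obs_cdf):
    """Mean absolute deviation between two cumulative distributions,
    as a percentage (the average percentage error over the season)."""
    p, o = np.asarray(pred_cdf), np.asarray(obs_cdf)
    return 100.0 * np.mean(np.abs(p - o))

# Hypothetical daily arrival fractions over a 7-day window
obs = np.cumsum([0.0, 0.1, 0.2, 0.3, 0.2, 0.1, 0.1])     # observed CDF
pred = np.cumsum([0.0, 0.05, 0.15, 0.35, 0.25, 0.1, 0.1]) # predicted CDF

timing_offset = median_passage_day(pred) - median_passage_day(obs)
season_error = mad(pred, obs)
```

A positive `timing_offset` would correspond to the report's "predictions later than observed" pattern at BON.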
Goulart, Alessandra C; Fernandes, Tiotrefis G; Santos, Itamar S; Alencar, Airlane P; Bensenor, Isabela M; Lotufo, Paulo A
2013-05-24
Few studies have examined both ischemic and hemorrhagic stroke to identify prognostic factors associated with long-term stroke survival. We investigated long-term survival and predictors that could adversely influence ischemic and hemorrhagic first-ever stroke prognosis. We prospectively ascertained 665 consecutive first-ever ischemic and hemorrhagic stroke cases from "The Study of Stroke Mortality and Morbidity" (The EMMA Study) in a community hospital in São Paulo, Brazil. We evaluated cardiovascular risk factors and sociodemographic characteristics (age, gender, race and educational level). We found a lower survival rate among hemorrhagic cases compared to ischemic stroke cases at the end of 4 years of follow-up (52% vs. 44%, p = 0.04). The risk of death was two times higher among people with ischemic stroke without formal education. Also, we found a consistently higher risk of death for diabetics with ischemic stroke (HR = 1.45; 95% CI = 1.07-1.97) compared to nondiabetics. As expected, age equally influenced the risk of poor survival, regardless of stroke subtype. For ischemic stroke, the lack of formal education and diabetes were significant independent predictors of poor long-term survival.
Effects of immunization with the rNfa1 protein on experimental Naegleria fowleri-PAM mice.
Lee, Y J; Kim, J H; Sohn, H J; Lee, J; Jung, S Y; Chwae, Y J; Kim, K; Park, S; Shin, H J
2011-07-01
Free-living Naegleria fowleri causes primary amoebic meningoencephalitis (PAM) in humans and animals. To examine the effect of immunization with the Nfa1 protein on experimental murine PAM due to N. fowleri, BALB/c mice were intra-peritoneally or intra-nasally immunized with a recombinant Nfa1 protein. We analysed Nfa1-specific antibody and cytokine induction, and the mean survival time of infected mice. Mice immunized intra-peritoneally or intra-nasally with rNfa1 protein developed specific IgG, IgA and IgE antibodies; the IgG response was dominated by IgG1, followed by IgG2b, IgG2a and IgG3. High levels of the Th1 cytokine, IFN-γ, and the regulatory cytokine, IL-10, were also induced. The mean survival time of mice immunized intra-peritoneally with rNfa1 protein was prolonged compared with controls (25.0 and 15.5 days, respectively). Similarly, the mean survival time of mice immunized intra-nasally with rNfa1 protein was 24.7 days, compared with 15.0 days for controls. © 2011 Blackwell Publishing Ltd.
IRRADIATION OF SCORPIONS OF THE SPECIES CENTRUROIDES LIMPIDUS WITH GAMMA RAYS (in Spanish)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mazzotti, L.; Rhode, R.H.; Lopez, F.
Although radiation studies have been done on other Arthropoda, mainly with the purpose of their eradication, little is known about the effect of gamma rays on scorpions. In this study, four groups of 15 scorpions each were irradiated with gamma rays varying in dosage from 4500 to 11,500 r. After a dose of approximately 4500 r, apparently normal physical activity and food intake were maintained. Longest survival time in this group was 105 days as compared with more than 8 months for the controls. The 8000-r dose produced immediate immobilization with subsequent development of sluggish movement. No food was taken and the longest survival time was 33 days. After 10,000 r all movement stopped, all deaths occurring before 17 days. After 11,500 r there was no survival beyond 7 days. The survival time for a control group of 100 scorpions was over 6 months in spite of the high mortality figure for these arachnids in captivity. The results indicate a very high vulnerability in scorpions of this species to gamma radiation as compared with arthropods of the class Insecta, which appear to show a much greater resistance. (BBB)
Biofilm formation enhances Helicobacter pylori survivability in vegetables.
Ng, Chow Goon; Loke, Mun Fai; Goh, Khean Lee; Vadivelu, Jamuna; Ho, Bow
2017-04-01
To date, the exact route and mode of transmission of Helicobacter pylori remains elusive. The detection of H. pylori in food using molecular approaches has led us to postulate that the gastric pathogen may survive in the extragastric environment for an extended period. In this study, we show that H. pylori prolongs its survival by forming biofilm and micro-colonies on vegetables. The biofilm forming capability of H. pylori is both strain and vegetable dependent. H. pylori strains were classified into high and low biofilm formers based on their highest relative biofilm units (BU). High biofilm formers survived longer on vegetables compared to low biofilm formers. The bacteria survived better on cabbage compared to other vegetables tested. In addition, images captured on scanning electron and confocal laser scanning microscopes revealed that the bacteria were able to form biofilm and reside as micro-colonies on vegetable surfaces, strengthening the notion of possible survival of H. pylori on vegetables for an extended period of time. Taken together, the ability of H. pylori to form biofilm on vegetables (a common food source for human) potentially plays an important role in its survival, serving as a mode of transmission of H. pylori in the extragastric environment. Copyright © 2016 Elsevier Ltd. All rights reserved.
Sohn, Jae Ho; Duran, Rafael; Zhao, Yan; Fleckenstein, Florian; Chapiro, Julius; Sahu, Sonia P.; Schernthaner, Rüdiger E.; Qian, Tianchen; Lee, Howard; Zhao, Li; Hamilton, James; Frangakis, Constantine; Lin, MingDe; Salem, Riad; Geschwind, Jean-Francois
2018-01-01
Background & Aims There is debate over the best way to stage hepatocellular carcinoma (HCC). We attempted to validate the prognostic and clinical utility of the recently developed Hong Kong Liver Cancer (HKLC) staging system, a hepatitis B-based model, and compared data with that from the Barcelona Clinic Liver Cancer (BCLC) staging system in a North American population who underwent intra-arterial therapy (IAT). Methods We performed a retrospective analysis of data from 1009 patients with HCC who underwent intra-arterial therapy from 2000 through 2014. Most patients had hepatitis C or unresectable tumors; all patients underwent IAT, with or without resection, transplantation, and/or systemic chemotherapy. We calculated HCC stage for each patient using 5-stage HKLC (HKLC-5) and 9-stage HKLC (HKLC-9) system classifications, as well as the BCLC system. Survival information was collected up until end of 2014 at which point living or unconfirmed patients were censored. We compared performance of the BCLC, HKLC-5, and HKLC-9 systems in predicting patient outcomes using Kaplan-Meier estimates, calibration plots, c-statistic, Akaike information criterion, and the likelihood ratio test. Results Median overall survival time, calculated from first IAT until date of death or censorship, for the entire cohort (all stages) was 9.8 months. The BCLC and HKLC staging systems predicted patient survival times with significance (P<.001). HKLC-5 and HKLC-9 each demonstrated good calibration. The HKLC-5 system outperformed the BCLC system in predicting patient survival times (HKLC c=0.71, Akaike information criterion=6242; BCLC c=0.64, Akaike information criterion=6320), reducing error in predicting survival time (HKLC reduced error by 14%, BCLC reduced error by 12%), and homogeneity (HKLC χ2=201; P<.001; BCLC χ2=119; P<.001) and monotonicity (HKLC linear trend χ2=193; P<.001; BCLC linear trend χ2=111; P<.001). 
Small proportions of patients with HCC of stages IV or V, according to the HKLC system, survived for 6 months and 4 months, respectively. Conclusion In a retrospective analysis of patients who underwent IAT for unresectable HCC, we found the HKLC-5 staging system to have the best combination of performances in survival separation, calibration, and discrimination; it consistently outperformed the BCLC system in predicting survival times of patients. The HKLC system identified patients with HCC of stages IV and V who are unlikely to benefit from IAT. PMID:27847278
Sohn, Jae Ho; Duran, Rafael; Zhao, Yan; Fleckenstein, Florian; Chapiro, Julius; Sahu, Sonia; Schernthaner, Rüdiger E; Qian, Tianchen; Lee, Howard; Zhao, Li; Hamilton, James; Frangakis, Constantine; Lin, MingDe; Salem, Riad; Geschwind, Jean-Francois
2017-05-01
There is debate over the best way to stage hepatocellular carcinoma (HCC). We attempted to validate the prognostic and clinical utility of the recently developed Hong Kong Liver Cancer (HKLC) staging system, a hepatitis B-based model, and compared data with that from the Barcelona Clinic Liver Cancer (BCLC) staging system in a North American population that underwent intra-arterial therapy (IAT). We performed a retrospective analysis of data from 1009 patients with HCC who underwent IAT from 2000 through 2014. Most patients had hepatitis C or unresectable tumors; all patients underwent IAT, with or without resection, transplantation, and/or systemic chemotherapy. We calculated HCC stage for each patient using 5-stage HKLC (HKLC-5) and 9-stage HKLC (HKLC-9) system classifications, and the BCLC system. Survival information was collected up until the end of 2014 at which point living or unconfirmed patients were censored. We compared performance of the BCLC, HKLC-5, and HKLC-9 systems in predicting patient outcomes using Kaplan-Meier estimates, calibration plots, C statistic, Akaike information criterion, and the likelihood ratio test. Median overall survival time, calculated from first IAT until date of death or censorship, for the entire cohort (all stages) was 9.8 months. The BCLC and HKLC staging systems predicted patient survival times with significance (P < .001). HKLC-5 and HKLC-9 each demonstrated good calibration. The HKLC-5 system outperformed the BCLC system in predicting patient survival times (HKLC C = 0.71, Akaike information criterion = 6242; BCLC C = 0.64, Akaike information criterion = 6320), reducing error in predicting survival time (HKLC reduced error by 14%, BCLC reduced error by 12%), and homogeneity (HKLC chi-square = 201, P < .001; BCLC chi-square = 119, P < .001) and monotonicity (HKLC linear trend chi-square = 193, P < .001; BCLC linear trend chi-square = 111, P < .001). 
Small proportions of patients with HCC of stages IV or V, according to the HKLC system, survived for 6 months and 4 months, respectively. In a retrospective analysis of patients who underwent IAT for unresectable HCC, we found the HKLC-5 staging system to have the best combination of performances in survival separation, calibration, and discrimination; it consistently outperformed the BCLC system in predicting survival times of patients. The HKLC system identified patients with HCC of stages IV and V who are unlikely to benefit from IAT. Copyright © 2017 AGA Institute. Published by Elsevier Inc. All rights reserved.
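The discrimination comparison above hinges on the concordance (c) statistic. A small self-contained sketch shows how it is computed; the 1-5 stage scale, survival times, and all numbers below are synthetic illustrations, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
# Hypothetical cohort: stage on a 1-5 scale, with survival times that
# shorten as stage increases (invented for illustration only).
stage = rng.integers(1, 6, size=n)
time = rng.exponential(12.0 / stage)   # months; higher stage -> shorter survival
event = np.ones(n, dtype=bool)         # assume all deaths observed, for simplicity

def harrell_c(score, time, event):
    """Harrell's c: among ordered pairs where the first death is observed,
    the fraction in which the earlier death had the higher risk score
    (ties in score count one half)."""
    conc = ties = comparable = 0
    for i in range(len(time)):
        for j in range(len(time)):
            if event[i] and time[i] < time[j]:
                comparable += 1
                if score[i] > score[j]:
                    conc += 1
                elif score[i] == score[j]:
                    ties += 1
    return (conc + 0.5 * ties) / comparable

c = harrell_c(stage, time, event)
print(round(c, 2))   # roughly 0.6-0.7 with this synthetic signal
```

A c of 0.5 means the staging score is no better than chance at ordering deaths; values in the 0.6-0.7 range are in the same ballpark as the 0.64 and 0.71 reported above for BCLC and HKLC-5.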
New insights about host response to smallpox using microarray data
Esteves, Gustavo H; Simoes, Ana CQ; Souza, Estevao; Dias, Rodrigo A; Ospina, Raydonal; Venancio, Thiago M
2007-01-01
Background Smallpox is a lethal disease that was endemic in many parts of the world until eradicated by massive immunization. Due to its lethality, there are serious concerns about its use as a bioweapon. Here we analyze publicly available microarray data to further understand survival of smallpox-infected macaques, using systems biology approaches. Our goal is to improve the knowledge about the progression of this disease. Results We used KEGG pathway annotations to define groups of genes (or modules), and subsequently compared them to macaque survival times. This technique provided additional insights about the host response to this disease, such as increased expression of cytokines and ECM receptors in the individuals with higher survival times. These results suggest that these gene groups may contribute to an effective host response to smallpox. Conclusion Macaques with higher survival times clearly express some specific pathways previously unidentified using regular gene-by-gene approaches. Our work also shows how third-party analysis of public datasets can be important to support new hypotheses to relevant biological problems. PMID:17718913
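The module-based approach described above (pathway-defined gene groups compared against survival times) can be sketched as follows; the expression matrix, module names and memberships, and survival values are all invented for illustration and are not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical data: expression for 6 macaques x 8 genes, plus survival (days)
expr = rng.normal(size=(6, 8))
survival = np.array([3.0, 5.0, 7.0, 9.0, 12.0, 20.0])

# KEGG-style gene-to-module map (module memberships are made up)
modules = {"cytokines": [0, 1, 2], "ECM_receptors": [3, 4], "other": [5, 6, 7]}

cors = {}
for name, idx in modules.items():
    score = expr[:, idx].mean(axis=1)                 # per-animal module score
    cors[name] = np.corrcoef(score, survival)[0, 1]   # vs. survival time

for name, r in cors.items():
    print(f"{name}: r = {r:+.2f}")
```

The design choice is that the unit of analysis is the module score rather than the single gene, which is what lets pathway-level signals emerge that gene-by-gene tests miss.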
Sherwood, Mark Brian
2006-01-01
Purpose The purpose of this study was to evaluate the concept of targeting mediators of the scarring process at multiple points across the course of bleb failure, in order to prolong bleb survival. Methods There were three linked parts to the experiment. In the first part, a cannula glaucoma filtration surgery (GFS) was performed on 32 New Zealand White (NZW) rabbits, and bleb survival was assessed for six different regimens plus controls by grading bleb height and width. For the second part of the study, the same GFS surgery was performed on an additional 10 NZW rabbits. Two additional filtering blebs were treated with balanced saline solution (BSS), two received mitomycin-C (MMC) (0.4 mg/mL), and for the remaining six, a sequential regimen was given consisting of 200 mmol/L mannose-6-phosphate (M-6-P) solution at the time of surgery, followed by subconjunctival injections of antibody to connective tissue growth factor at days 2 and 4, and Ilomastat, a broad-spectrum matrix metalloproteinase inhibitor, at days 7, 12, and 20 postoperatively. Bleb survival was again assessed. In the final part of the experiment, blebs treated with either BSS, MMC, or the above sequential multitreatment regimen were examined histologically at 14 days postoperatively in three additional NZW rabbits. Results All six individual therapies selected resulted in some improvement of bleb survival compared to BSS control. Blebs treated with the new sequential, multitreatment protocol survived an average of 29 days (regression slope, P < .0001 compared to control), those receiving BSS an average of 17 days, and those treated with MMC (0.4 mg/mL) an average of 36 days. The sequential, multitreatment regimen was significantly superior to any of the six monotherapies for time to zero analysis (flattening) of the bleb (P < .002). 
Histologic examination of the bleb tissues showed markedly less epithelial thinning, subepithelial collagen thinning, and goblet cell loss in the multitreatment group, when compared with the MMC blebs. Conclusions In a rabbit model of GFS, a sequential, targeted, multitreatment approach prolonged bleb survival compared to BSS controls and decreased bleb tissue morphological changes when compared to those treated with MMC. It is not known whether these findings can be reproduced in humans, and further work is needed to determine an optimum regimen and timing of therapeutic delivery. PMID:17471357
Wang, Jingshu; Chmielowski, Bartosz; Pellissier, James; Xu, Ruifeng; Stevinson, Kendall; Liu, Frank Xiaoqing
2017-02-01
Recent clinical trials have shown that pembrolizumab significantly prolonged progression-free survival and overall survival compared with ipilimumab in ipilimumab-naïve patients with unresectable or metastatic melanoma. However, there has been no published evidence on the cost-effectiveness of pembrolizumab for this indication. To assess the long-term cost-effectiveness of pembrolizumab versus ipilimumab in ipilimumab-naïve patients with unresectable or meta-static melanoma from a U.S. integrated health system perspective. A partitioned-survival model was developed, which divided overall survival time into progression-free survival and postprogression survival. The model used Kaplan-Meier estimates of progression-free survival and overall survival from a recent randomized phase 3 study (KEYNOTE-006) that compared pembrolizumab and ipilimumab. Extrapolation of progression-free survival and overall survival beyond the clinical trial was based on parametric functions and literature data. The base-case time horizon was 20 years, and costs and health outcomes were discounted at a rate of 3% per year. Clinical data-including progression-free survival and overall survival data spanning a median follow-up time of 15 months, as well as quality of life and adverse event data from the ongoing KEYNOTE-006 trial-and cost data from public sources were used to populate the model. Costs included those of drug acquisition, treatment administration, adverse event management, and disease management of advanced melanoma. The incremental cost-effectiveness ratio (ICER) expressed as cost difference per quality-adjusted life-year (QALY) gained was the main outcome, and a series of sensitivity analyses were performed to test the robustness of the results. In the base case, pembrolizumab was projected to increase the life expectancy of U.S. patients with advanced melanoma by 1.14 years, corresponding to a gain of 0.79 discounted QALYs over ipilimumab. 
The model also projected an average increase of $63,680 in discounted per-patient costs of treatment with pembrolizumab versus ipilimumab. The corresponding ICER was $81,091 per QALY ($68,712 per life-year) over a 20-year time horizon. With $100,000 per QALY as the threshold, when input parameters were varied in deterministic one-way sensitivity analyses, the use of pembrolizumab was cost-effective relative to ipilimumab in most ranges. Further, in a comprehensive probabilistic sensitivity analysis, pembrolizumab was cost-effective in 83% of the simulations. Compared with ipilimumab, pembrolizumab had higher expected QALYs and was cost-effective for the treatment of patients with unresectable or metastatic melanoma from a U.S. integrated health system perspective. This study was supported by funding from Merck & Co., which reviewed and approved the manuscript before journal submission. Wang, Pellissier, Xu, Stevinson, and Liu are employees of, and own stock in, Merck & Co. Chmielowski has served as a paid consultant for Merck & Co. and received a consultant fee for clinical input in connection with this study. Chmielowski also reports receiving advisory board and speaker bureau fees from multiple major pharmaceutical companies. Wang led the modeling and writing of the manuscript. Chmielowski, Xu, Stevinson, and Pellissier contributed substantially to the modeling design and methodology. Liu led the data collection work and contributed substantially to writing the manuscript. In conducting the analysis and writing the manuscript, the authors followed Merck publication policies, the "cost-effectiveness analysis alongside clinical trials: good research practices" guidance, and the CHEERS reporting format as recommended by the International Society for Pharmacoeconomics and Outcomes Research.
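The headline ICER is simply the incremental discounted cost divided by the incremental discounted QALYs. A quick check with the abstract's own rounded figures nearly reproduces the published number (the small gap versus $81,091/QALY reflects rounding of the reported inputs):

```python
# Back-of-envelope ICER check using the per-patient figures quoted above.
delta_cost = 63_680   # incremental discounted cost, pembrolizumab vs ipilimumab (USD)
delta_qaly = 0.79     # incremental discounted QALYs gained

icer = delta_cost / delta_qaly
print(f"ICER = ${icer:,.0f} per QALY")  # ~ $80,608, close to the reported $81,091
```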
An accelerated technique for irradiation of malignant canine nasal and paranasal sinus tumors.
Adams, W M; Miller, P E; Vail, D M; Forrest, L J; MacEwen, E G
1998-01-01
Tumor and normal tissue response was assessed in 21 dogs with malignant nasal tumors given 42 Gy cobalt radiation in 9 or 10 fractions over 11 to 13 days. Local tumor/clinical relapse recurred in 68% of dogs, with a median relapse free interval (RFI) of 270 days. Median survival was 428 days. One year survival for all dogs was 60%. RFI and survival times are better than, or similar to, previous reports of dogs treated with radiotherapy only. Acute radiation effects were severe in one dog. Late effects were severe in six of 15 dogs (40%) with durable tumor control. Late effects included bilateral blindness (3), osteoradionecrosis (3), and seizures (1). These six dogs had a median survival of 705 days. Loss of vision occurred in at least one eye in nine dogs (47%). Tumor staging based on CT findings was predictive for survival duration. Tumor histology was not predictive of outcome. Labrador Retrievers were significantly over-represented. Despite comparable or improved tumor control and survival times provided by this accelerated protocol, relative to other radiotherapy reports, local failure remains the major cause of death, and late radiation effects can be severe in dogs with durable tumor control.
Population dynamics of Greater Scaup breeding on the Yukon-Kuskokwim Delta, Alaska
Flint, Paul L.; Grand, J. Barry; Fondell, Thomas F.; Morse, Julie A.
2006-01-01
Using a stochastic model, we estimated that, on average, breeding females produced 0.57 young females/nesting season. We combined this estimate of productivity with our annual estimates of adult survival and an assumed population growth rate of 1.0, then solved for an estimate of first-year survival (0.40). Under these conditions the predicted stable age distribution of breeding females (i.e., the nesting population) was 15.1% 1-year-old, 4.1% 2-year-old first-time breeders, and 80.8% 2-year-old and older, experienced breeders. We subjected this stochastic model to perturbation analyses to examine the relative effects of demographic parameters on λ. The relative effects of productivity and adult survival on the population growth rate were 0.26 and 0.72, respectively. Thus, compared to productivity, proportionally equivalent changes in annual survival would have 2.8 times the effect on λ. However, when we examined annual variation in predicted population size using standardized regression coefficients, productivity explained twice as much variation as annual survival. Thus, management actions focused on changes in survival or productivity have the ability to influence population size; however, substantially larger changes in productivity are required to influence population trends.
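The perturbation logic above can be approximated with a deterministic scalar model, λ = adult survival + (young females per breeder × first-year survival). This simplification is not the authors' stochastic stage-structured model (their elasticities were 0.26 and 0.72, ratio 2.8), but it reproduces the qualitative result that survival has roughly three times the leverage of productivity.

```python
# Scalar stand-in for the paper's model: lambda = s_a + P * s_1, using the
# abstract's numbers (P = 0.57 young females/breeder, s_1 = 0.40, lambda = 1.0).
P, s1, lam = 0.57, 0.40, 1.0

sa = lam - P * s1                 # implied adult survival
e_survival = sa / lam             # elasticity of lambda to adult survival
e_productivity = (P * s1) / lam   # elasticity of lambda to recruitment

print(round(sa, 3), round(e_survival, 2), round(e_productivity, 2))
print(round(e_survival / e_productivity, 1))  # survival has ~3x the leverage
```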
Habib, Shahid; Khan, Khalid; Hsu, Chiu-Hsieh; Meister, Edward; Rana, Abbas; Boyer, Thomas
2017-01-01
Background We evaluated the concept of whether liver failure patients with a superimposed kidney injury receiving a simultaneous liver and kidney transplant (SLKT) have similar outcomes compared to patients with liver failure without a kidney injury receiving a liver transplantation (LT) alone. Methods Using data from the United Network of Organ Sharing (UNOS) database, patients were divided into five groups based on pre-transplant model for end-stage liver disease (MELD) scores and categorized as not having (serum creatinine (sCr) ≤ 1.5 mg/dL) or having (sCr > 1.5 mg/dL) renal dysfunction. Of 30,958 patients undergoing LT, 14,679 (47.5%) had renal dysfunction, and 5,084 (16.4% of all patients) were on dialysis. Results Survival in those (liver failure with renal dysfunction) receiving SLKT was significantly worse (P < 0.001) as compared to those with sCr ≤ 1.5 mg/dL (liver failure only). The highest mortality rate observed was 21% in the 36+ MELD group with renal dysfunction with or without SLKT. In high MELD recipients (MELD > 30) with renal dysfunction, presence of renal dysfunction affects the outcome and SLKT does not improve survival. In low MELD recipients (16 - 20), presence of renal dysfunction at the time of transplantation does affect post-transplant survival, but survival is improved with SLKT. Conclusions SLKT improved 1-year survival only in low MELD (16 - 20) recipients but not in other groups. Performance of SLKT should be limited to patients where a benefit in survival and post-transplant outcomes can be demonstrated. PMID:28496531
Wang, Chien-Kai; Chen, Hsiao-Chien; Fang, Sheng-Uei; Ho, Chia-Wen; Tai, Cheng-Jeng; Yang, Chih-Ping; Liu, Yu-Chuan
2018-04-20
Many human diseases are inflammation-related, such as cancer and those associated with aging. Previous studies demonstrated that plasmon-induced activated (PIA) water with electron-doping character, created from hot electron transfer via decay of excited Au nanoparticles (NPs) under resonant illumination, possesses reduced hydrogen-bonded networks and physicochemically antioxidative properties. In this study, it is demonstrated that PIA water dramatically induced a major antioxidative Nrf2 gene in human gingival fibroblasts, which further confirms its cellular antioxidative and anti-inflammatory properties. Furthermore, mice implanted with mouse Lewis lung carcinoma (LLC-1) cells drinking PIA water alone or together with cisplatin treatment showed improved survival time compared to mice which consumed only deionized (DI) water. With the combination of PIA water and cisplatin administration, the survival time of LLC-1-implanted mice markedly increased to 8.01 ± 0.77 days compared to 6.38 ± 0.61 days for mice given cisplatin and normal drinking DI water. This survival time of 8.01 ± 0.77 days compared to 4.62 ± 0.71 days for mice just given normal drinking water is statistically significant (p = 0.009). Also, the gross observations and eosin staining results suggested that LLC-1-implanted mice drinking PIA water tended to exhibit less metastasis than mice given only DI water.
[Survival analysis with competing risks: estimating failure probability].
Llorca, Javier; Delgado-Rodríguez, Miguel
2004-01-01
To show the impact of competing risks of death on survival analysis. We provide an example of survival time without chronic rejection after heart transplantation, where death before rejection acts as a competing risk. Using a computer simulation, we compare the Kaplan-Meier estimator and the multiple decrement model. The Kaplan-Meier method overestimated the probability of rejection. Next, we illustrate the use of the multiple decrement model to analyze secondary end points (in our example: death after rejection). Finally, we discuss Kaplan-Meier assumptions and why they fail in the presence of competing risks. Survival analysis should be adjusted for competing risks of death to avoid overestimation of the risk of rejection produced with the Kaplan-Meier method.
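The Kaplan-Meier overestimation described above is easy to see analytically with two competing exponential hazards; the rates below are illustrative choices, not the transplant data.

```python
import math

# Two competing exponential hazards (illustrative rates, per month):
h_rej, h_death = 0.10, 0.15   # chronic rejection vs. death before rejection
t = 24.0                      # evaluation time, months

# Naive Kaplan-Meier treats deaths as censoring, so it estimates the
# rejection probability in a hypothetical world where no one dies first:
km_est = 1 - math.exp(-h_rej * t)

# The multiple decrement (cumulative incidence) approach accounts for the
# fact that dying first removes a patient from risk of rejection:
h_all = h_rej + h_death
cif = (h_rej / h_all) * (1 - math.exp(-h_all * t))

print(round(km_est, 3), round(cif, 3))   # the KM figure exceeds the true incidence
```

This is exactly the direction of bias reported above: 1 minus the Kaplan-Meier survivor function is always at least as large as the cumulative incidence when a competing risk is present.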
Gao, Hongfei; Yuan, Lijun; Han, Yimin
2016-06-24
The current study aims to evaluate and compare the efficacy of post-operative chemotherapy using paclitaxel plus carboplatin or nedaplatin in patients with ovarian cancer, as well as the effects of different combinational therapies on the survival times of patients. Ninety-four patients were recruited for the study. These ovarian cancer patients were admitted into the Cancer Hospital Affiliated with Harbin Medical University for surgery from January 2008 to October 2009. They were divided into different groups according to their post-operative chemotherapy schemes: paclitaxel plus carboplatin (CBP group, n = 48) and paclitaxel plus nedaplatin (NDP group, n = 46). Variance analysis was used to compare the effects of different chemotherapy schemes and pathological types of ovarian cancer on the level of CA125 in serum at different treatment time points. Univariate and multivariate analyses were employed to evaluate the survival times of patients in different groups, pathological types, and ages. No significant differences were observed regarding the effects of various chemotherapy schemes (P = 0.561) and pathological types (P = 0.903) on the level of CA125 in serum of patients with ovarian cancer. However, the duration of chemotherapy had a profound impact on the level of CA125 in serum (P < 0.001). The survival times of patients were not significantly affected by age (P = 0.101) or pathological type of ovarian cancer (P = 0.94). However, they were significantly affected by the chemotherapy scheme. Combined chemotherapy using carboplatin plus paclitaxel should be considered as the preferred treatment scheme for the initial treatment of ovarian cancer.
Ohmi, Aki; Ohno, Koichi; Uchida, Kazuyuki; Goto-Koshino, Yuko; Tomiyasu, Hirotaka; Kanemoto, Hideyuki; Fukushima, Kenjiro; Tsujimoto, Hajime
2017-09-29
Shiba dogs are predisposed to chronic enteropathy (CE) and have poorer prognosis than other dog breeds. The objective of this study was to investigate the significance of polymerase chain reaction for antigen receptor rearrangement (PARR) results on clinical findings and prognosis of Shiba dogs with CE. We retrospectively collected data on 22 Shiba dogs diagnosed as having CE. Fifty-nine percent of the dogs had clonality-positive results on PARR analysis. Furthermore, on histopathology, epitheliotropic behavior of small lymphocytes of the intestinal mucosa was observed significantly more frequently in dogs with clonal rearrangement of antigen receptor genes (P=0.027). The median overall survival time of clonality-positive dogs was 48 days (range, 4-239 days), compared to 271 days (range, 45-1,316+ days) in clonality-negative dogs. The median overall survival time of epitheliotropism-positive dogs was 76 days (range, 30-349 days) compared to 239 days (range, 4-1,316+ days) for epitheliotropism-negative dogs. Statistical analysis revealed that the clonality-positive result was associated with significantly shorter survival time (P=0.036). In contrast, presence or absence of epitheliotropism had no statistically significant effect on survival time (P=0.223). These cases might appropriately be diagnosed as small T-cell intestinal lymphoma; there are some common clinical and pathogenic features with human enteropathy-associated T-cell lymphoma type 2. The pathogenesis and poor prognosis for Shiba dogs with CE seem to be associated with this type of lymphoma, although further investigation is warranted.
Witjes, J Alfred; Dalbagni, Guido; Karnes, Robert J; Shariat, Shahrokh; Joniau, Steven; Palou, Joan; Serretta, Vincenzo; Larré, Stéphane; di Stasi, Savino; Colombo, Renzo; Babjuk, Marek; Malmström, Per-Uno; Malats, Nuria; Irani, Jacques; Baniel, Jack; Cai, Tommaso; Cha, Eugene; Ardelt, Peter; Varkarakis, John; Bartoletti, Riccardo; Spahn, Martin; Pisano, Francesca; Gontero, Paolo; Sylvester, Richard
2016-11-01
Potential differences in efficacy of different bacillus Calmette-Guérin (BCG) strains are of importance for daily practice, especially in the era of BCG shortage. To retrospectively compare the outcome with BCG Connaught and BCG TICE in a large study cohort of pT1 high-grade non-muscle-invasive bladder cancer patients. Individual patient data were collected for 2,451 patients with primary T1G3 tumors from 23 centers who were treated with BCG for the first time between 1990 and 2011. Using Cox multivariable regression and adjusting for the most important prognostic factors in this nonrandomized comparison, BCG Connaught and TICE were compared for time to recurrence, progression, and the duration of cancer specific survival and overall survival. Information on the BCG strain was available for 2,099 patients: 957 on Connaught and 1,142 on TICE. Overall, 765 (36%) patients received some form of maintenance BCG, 560 (59%) on Connaught and 205 (18%) on TICE. Without maintenance, Connaught was more effective than TICE only for the time to first recurrence (hazard ratio [HR] = 1.48; 95% CI: 1.20-1.82; P<0.001). With maintenance, TICE was more effective than Connaught for the time to first recurrence (HR = 0.66; 95% CI: 0.47-0.93; P = 0.019) with a trend for cancer specific survival (HR = 0.36; 95% CI: 0.14-0.92; P = 0.033). For time to progression and overall survival, Connaught and TICE had a similar efficacy. Compared to no maintenance therapy, maintenance BCG significantly reduced the risk of recurrence, progression and death, both overall, and disease specific, for TICE, but not for Connaught. We found that BCG Connaught results in a lower recurrence rate as compared with BCG TICE when no maintenance is used. However, the opposite is true when maintenance is given. As there is currently a BCG shortage, information on the efficacy of different BCG strains is important. 
In this nonrandomized retrospective comparison in over 2,000 patients, we found that BCG Connaught reduces the recurrence rate compared to BCG TICE when no maintenance is used, but the opposite is true when maintenance is given. Copyright © 2016 Elsevier Inc. All rights reserved.
Qualls, Whitney A; Xue, Rui De; Beier, John C; Müller, Günter C
2013-06-01
The international trade of lucky bamboo (Dracaena sanderiana [Asparagaceae]) is responsible for certain introductions of the exotic species Aedes albopictus (Skuse) in California and the Netherlands. Understanding the association of this species with lucky bamboo and other ornamental plants is important from a public health standpoint. The aim of this study was to investigate the importance of indoor ornamental plants as sugar sources for adult A. albopictus. If exposed to D. sanderiana, bromeliad (Guzmania spp. hybrid [Bromeliaceae]), Moses-in-the-cradle (Rhoeo spathacea [Commelinaceae]), 10 % sucrose solution, and a negative water control as the only nutrient source, adult female A. albopictus mean survival time was 12, 7, 6, 15, and 4 days, respectively. Mean survival times for adult males were not significantly different (P > 0.05) from the females and were 10, 7, 6, 14, and 3 days, respectively. Combined male and female survival times were not significantly different on lucky bamboo compared to survival times on a 10 % sucrose control. Based on our findings, A. albopictus can readily survive long enough to complete a gonotrophic cycle and potentially complete the extrinsic incubation period for many arboviruses when only provided access to lucky bamboo plants or possibly other common ornamentals. Vector control professionals should be aware of potential in-home infestations and public health concerns associated with mosquito breeding and plant tissue feeding on ornamental plants.
Survival of the first arteriovenous fistula in 96 patients on chronic hemodialysis.
Radoui, Aicha; Lyoussfi, Zineb; Haddiya, Intissar; Skalli, Zoubair; El Idrissi, Redouane; Rhou, Hakima; Ezzaitouni, Fatima; Ouzeddoun, Naima; El Mesnaoui, Abbes; Bayahia, Rabea; Benamar, Loubna
2011-07-01
Native arteriovenous fistula (AVF) represents the best vascular approach for chronic hemodialysis. The aim of this study was to determine the survival of the first AVF and to identify the factors responsible for poor AVF survival. A retrospective study was conducted on 96 chronic hemodialysis patients benefiting from the creation and cannulation of their first AVF at our center, with a minimum follow-up period of 1 year. We collected demographic, clinical, and biological data, as well as analyzed the following AVF characteristics: anatomic site, cannulation time, survival, and complications. To identify the predictive factors of poor AVF survival, we defined and compared two groups of patients on the basis of whether they lost their first AVF during the evolution. Patients' mean age was 42.1 ± 13 years, with a female predominance. Mean AVF cannulation time was 17.5 ± 24 days. AVF loss was mainly related to thrombosis in 29% of the cases and stenosis in 9.4%. AVF survival was 87%, 77%, 71%, 67%, and 64% after 1, 3, 5, 8, and 10 years of hemodialysis, respectively. In our study, the main factors associated with AVF loss were lengthy jugular venous catheter placement (p = 0.004), short AVF cannulation time after its creation (p = 0.03), and hypotension episodes during dialysis (p = 0.03). Long-term survival and quality of life in hemodialysis depend on appropriate dialysis carried out through a correct vascular approach. According to previously published data, survival of the first AVF can vary between 10% and 36% at 10 years. In our study, survival of the first native AVF was satisfactory because it reached 64% at 10 years. Early AVF creation and prevention and management of its complications remain the safest and most comfortable solution to ensure AVF survival and thus satisfactory survival and quality of life in chronic hemodialysis patients. Copyright © 2011 Annals of Vascular Surgery Inc. Published by Elsevier Inc. All rights reserved.
Lymph Nodes and Survival in Pancreatic Neuroendocrine Tumors (pNET)
Krampitz, Geoffrey W.; Norton, Jeffrey A.; Poultsides, George A.; Visser, Brendan C.; Sun, Lixian; Jensen, Robert T.
2012-01-01
Background The significance of lymph node metastases on survival of patients with pNET is controversial. Hypothesis Lymph node metastases decrease survival in patients with pNET. Design Prospective databases of the National Institutes of Health (NIH) and Stanford University Hospital (SUH) were queried. Main Outcome Measures Overall survival, disease-related survival, and time to development of liver metastases. Results 326 patients underwent surgical exploration for pNET at the NIH (n=216) and SUH (n=110). 40 (13%) and 305 (94%) underwent enucleation and resection, respectively. Of the patients who underwent resection, 117 (42%) had partial pancreatectomy and 30 (11%) had a Whipple procedure. 41 also had liver resections (21 wedge resections and 20 lobectomies). Average follow-up was 8 years (range 0.3–28.6 years). The 10-year overall survival for patients with no metastases or lymph node metastases only was similar at 80%. As expected, patients with liver metastases had a significantly decreased 10-year survival of 30% (p<0.001). The time to development of liver metastases was significantly reduced for patients with lymph node metastases alone compared to those with none (p<0.001). For the NIH cohort with longer follow-up, disease-related survival was significantly different for those patients with no metastases, lymph node metastases alone, and liver metastases (p<0.0001). Extent of lymph node involvement in this subgroup showed that disease-related survival decreased as a function of the number of lymph nodes involved (p=0.004). Conclusion As expected, liver metastases decrease survival of patients with pNET. Patients with lymph node metastases alone have a shorter time to development of liver metastases that is dependent on the number of lymph nodes involved. With sufficient long-term follow-up, lymph node metastases decrease disease-related survival. Careful evaluation of the number and extent of lymph node involvement is warranted in all surgical procedures for pNET. 
PMID:22987171
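The group-wise survival figures above (e.g. 10-year overall survival by metastasis status) come from product-limit estimation. A minimal Kaplan-Meier sketch, using hypothetical follow-up data rather than the study's:

```python
# Minimal Kaplan-Meier product-limit estimator (illustrative only;
# the times and censoring flags below are hypothetical, not study data).

def kaplan_meier(times, events):
    """Return [(t, S(t))] at each distinct observed event time.

    times  -- follow-up time for each subject
    events -- 1 if death observed at that time, 0 if censored
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)
        ties = sum(1 for tt, _ in data if tt == t)
        if deaths:
            s *= 1 - deaths / n_at_risk   # product-limit update
            curve.append((t, s))
        n_at_risk -= ties                  # deaths and censorings leave the risk set
        i += ties
    return curve

# Ten hypothetical subjects: (years of follow-up, death indicator)
times = [1, 2, 2, 3, 4, 5, 5, 6, 8, 10]
events = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
print(kaplan_meier(times, events))
```

The estimator drops the survival curve only at observed deaths; censored subjects simply leave the risk set, which is what distinguishes this from a naive proportion-surviving calculation.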
Brinker, T; Raymond, B; Bijma, P; Vereijken, A; Ellen, E D
2017-02-01
Mortality of laying hens due to cannibalism is a major problem in the egg-laying industry. Survival depends on two genetic effects: the direct genetic effect of the individual itself (DGE) and the indirect genetic effects of its group mates (IGE). For hens housed in sire-family groups, DGE and IGE cannot be estimated using pedigree information, but the combined effect of DGE and IGE is estimated in the total breeding value (TBV). Genomic information provides information on actual genetic relationships between individuals and might be a tool to improve TBV accuracy. We investigated whether genomic information of the sire increased TBV accuracy compared with pedigree information, and we estimated genetic parameters for survival time. A sire model with pedigree information (BLUP) and a sire model with genomic information (ssGBLUP) were used. We used survival time records of 7290 crossbred offspring with intact beaks from four crosses. Cross-validation was used to compare the models. Using ssGBLUP did not improve TBV accuracy compared with BLUP, which is probably due to the limited number of sires available per cross (~50). Genetic parameter estimates were similar for BLUP and ssGBLUP. For both BLUP and ssGBLUP, total heritable variance (T²), expressed as a proportion of phenotypic variance, ranged from 0.03 ± 0.04 to 0.25 ± 0.09. Further research is needed on breeding value estimation for socially affected traits measured on individuals kept in single-family groups. © 2016 The Authors. Journal of Animal Breeding and Genetics Published by Blackwell Verlag GmbH.
Park, Seong Yong; Suh, Jee Won; Narm, Kyoung Sik; Lee, Chang Young; Lee, Jin Gu; Paik, Hyo Chae; Chung, Kyoung Young; Kim, Dae Joon
2017-06-01
This study was performed to investigate the feasibility of four-arm robotic lobectomy (FARL) as a solo surgical technique in patients with non-small cell lung cancer (NSCLC). Early outcome and long-term survival of FARL were compared with those of video-assisted thoracoscopic lobectomy (VATL). Prospective enrollment of patients with clinical stage I NSCLC undergoing FARL or VATL (20 patients in each group) was planned. Interim analysis for early postoperative outcome was performed after the initial 10 cases in each group. The study was terminated early because of safety issues in the FARL group after enrollment of 12 FARL and 17 VATL patients from 2011 to 2012. There were no differences in clinical characteristics between groups. Lobectomy time and total operation time were significantly longer in the FARL group (P=0.003). There were three life-threatening events in the FARL group (2 bleedings, 1 bronchus tear) that necessitated thoracotomy conversion in 1 patient. There were no differences in other operative outcomes including pain score, complications, or length of hospital stay. Pathologic stage and number of dissected lymph nodes (LNs) were also comparable. During a follow-up of 48.9±9.5 months, recurrence was identified in 2 (16.7%) patients in the FARL group and 3 (23.5%) in the VATL group. Five-year overall survival (100% vs. 87.5%, P=0.386) and disease-free survival (82.5% vs. 75.6%, P=0.589) were comparable. FARL as solo surgery could not be recommended because of safety issues. It required a longer operation time and had no benefits over VATL in terms of early postoperative outcome or long-term survival.
Khan, Hafiz M R; Gabbidon, Kemesha; Saxena, Anshul; Abdool-Ghany, Faheema; Dodge, John M; Lenzmeier, Taylor
2016-10-01
Cervical cancer is the second most common cancer among women resulting in nearly 500,000 cases annually. Screening leads to better treatment and survival time. However, human papillomavirus (HPV) exposure, screening, and treatment vary among races and ethnicities in the United States. The purpose of this study is to examine disparities in characteristics of cervical cancer and survival of cases between White Hispanic (WH) and White non-Hispanic (WNH) women in the United States. We used a stratified random sampling method to select cervical cancer patient records from nine states; a simple random sampling method to extract the demographic and disease characteristics data within states from the Surveillance Epidemiology and End Results (SEER) database. We used statistical probability distribution methods for discrete and continuous data. The chi-square test and independent samples t-test were used to evaluate statistically significant differences. Furthermore, the Cox Proportional Regression and the Kaplan-Meier survival estimators were used to compare WH and WNH population survival times in the United States. The samples of WNH and WH women included 4,000 cervical cancer cases from 1973-2009. There were statistically significant differences between ethnicities: marital status (p < 0.001); primary site of cancer (p < 0.001); lymph node involvement (p < 0.001); grading and differentiation (p < 0.0001); and tumor behavior (p < 0.001). The mean age of diagnosis for both groups showed no statistical differences. However, the mean survival time for WNH was 221.7 (standard deviation [SD] = 118.1) months and for WH was 190.3 (SD = 120.3), which differed significantly (p < 0.001). Clear disparities exist in risk factors, cervical cancer characteristics, and survival time between WH and WNH women.
Methods for Performing Survival Curve Quality-of-Life Assessments.
Sumner, Walton; Ding, Eric; Fischer, Irene D; Hagen, Michael D
2014-08-01
Many medical decisions involve an implied choice between alternative survival curves, typically with differing quality of life. Common preference assessment methods neglect this structure, creating some risk of distortions. Survival curve quality-of-life assessments (SQLA) were developed from Gompertz survival curves fitting the general population's survival. An algorithm was developed to generate relative discount rate-utility (DRU) functions from a standard survival curve and health state and an equally attractive alternative curve and state. A least means squared distance algorithm was developed to describe how nearly 3 or more DRU functions intersect. These techniques were implemented in a program called X-Trade and tested. SQLA scenarios can portray realistic treatment choices. A side effect scenario portrays one prototypical choice, to extend life while experiencing some loss, such as an amputation. A risky treatment scenario portrays procedures with an initial mortality risk. A time trade scenario mimics conventional time tradeoffs. Each SQLA scenario yields DRU functions with distinctive shapes, such as sigmoid curves or vertical lines. One SQLA can imply a discount rate or utility if the other value is known and both values are temporally stable. Two SQLA exercises imply a unique discount rate and utility if the inferred DRU functions intersect. Three or more SQLA results can quantify uncertainty or inconsistency in discount rate and utility estimates. Pilot studies suggested that many subjects could learn to interpret survival curves and do SQLA. SQLA confuse some people. Compared with SQLA, standard gambles quantify very low utilities more easily, and time tradeoffs are simpler for high utilities. When discount rates approach zero, time tradeoffs are as informative and easier to do than SQLA. SQLA may complement conventional utility assessment methods. © The Author(s) 2014.
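The Gompertz survival curves from which SQLA were developed have the closed form S(t) = exp(-(b/c)(e^{ct} - 1)). A small sketch with illustrative parameter values (not the ones fitted to the general population's survival in the paper):

```python
import math

# Gompertz survival function S(t) = exp(-(b/c) * (exp(c*t) - 1)),
# the curve family SQLA uses to approximate population survival.
# The b and c defaults below are illustrative, not fitted values.

def gompertz_survival(t, b=0.0001, c=0.09):
    """Probability of surviving past time t (years)."""
    return math.exp(-(b / c) * (math.exp(c * t) - 1.0))

# Mortality hazard b*exp(c*t) grows exponentially, so survival
# declines slowly at first and then accelerates with age:
for age in (40, 60, 80):
    print(age, round(gompertz_survival(age), 3))
```

An SQLA scenario can then be posed as a choice between two such curves, each paired with a health state, from which the discount rate-utility function is inferred.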
Improved survival in HIV treatment programs in Asia
De La Mata, Nicole L; Kumarasamy, Nagalingeswaran; Khol, Vohith; Ng, Oon Tek; Van Nguyen, Kinh; Merati, Tuti Parwati; Pham, Thuy Thanh; Lee, Man Po; Durier, Nicolas; Law, Matthew
2016-01-01
Background: Antiretroviral treatment (ART) for HIV-positive patients has expanded rapidly in Asia over the last ten years. Our study aimed to describe the time trends and risk factors for overall survival in patients receiving first-line ART in Asia. Methods: We included HIV-positive adult patients who initiated ART between 2003 and 2013 (n=16 546), from seven sites across six Asia-Pacific countries. Patient follow-up was to May 2014. We compared survival for each country and overall by time period of ART initiation using Kaplan-Meier curves. Factors associated with mortality were assessed using Cox regression, stratified by site. We also summarized first-line ART regimens, CD4 count at ART initiation, and CD4 and HIV viral load testing frequencies. Results: There were 880 deaths observed over 54 532 person-years of follow-up, a crude rate of 1.61 (1.51, 1.72) per 100 person-years. Survival significantly improved in more recent years of ART initiation. The survival probability at 4 years of follow-up was 92.1% for those initiating ART in 2003–05, 94.3% for 2006–09 and 94.5% for 2010–13 (p<0.001). Factors associated with higher mortality risk included initiating ART in earlier time periods, older age, male sex, injecting drug use as HIV exposure and lower pre-ART CD4 count. Concurrent with improved survival was increased tenofovir use, ART initiation at higher CD4 counts, and greater monitoring of CD4 and HIV viral load. Conclusions: Our results suggest that HIV-positive patients from Asia have improved survival in more recent years of ART initiation. This is likely a consequence of improvements in treatment, patient management and monitoring over time. PMID:26961354
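The crude mortality rate reported above is simply deaths divided by person-years of follow-up, scaled to 100 person-years; the abstract's own counts reproduce the 1.61 figure:

```python
# Crude mortality rate per 100 person-years, using the counts
# reported in the abstract (880 deaths over 54 532 person-years).
deaths = 880
person_years = 54532
rate_per_100py = deaths / person_years * 100
print(round(rate_per_100py, 2))  # 1.61, matching the reported crude rate
```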
Modeling absolute differences in life expectancy with a censored skew-normal regression approach
Clough-Gorr, Kerri; Zwahlen, Marcel
2015-01-01
Parameter estimates from commonly used multivariable parametric survival regression models do not directly quantify differences in years of life expectancy. Gaussian linear regression models give results in terms of absolute mean differences, but are not appropriate in modeling life expectancy, because in many situations time to death has a negative skewed distribution. A regression approach using a skew-normal distribution would be an alternative to parametric survival models in the modeling of life expectancy, because parameter estimates can be interpreted in terms of survival time differences while allowing for skewness of the distribution. In this paper we show how to use the skew-normal regression so that censored and left-truncated observations are accounted for. With this we model differences in life expectancy using data from the Swiss National Cohort Study and from official life expectancy estimates and compare the results with those derived from commonly used survival regression models. We conclude that a censored skew-normal survival regression approach for left-truncated observations can be used to model differences in life expectancy across covariates of interest. PMID:26339544
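The skew-normal distribution underlying this regression approach extends the normal density with a shape parameter α, f(x) = 2φ(x)Φ(αx), where negative α yields the left skew typical of time-to-death data. A minimal sketch (standard location and scale; the α values are illustrative):

```python
import math

# Skew-normal density f(x) = 2 * phi(x) * Phi(alpha * x), where phi and Phi
# are the standard normal pdf and cdf. alpha < 0 gives negative (left) skew;
# alpha = 0 recovers the ordinary normal density.

def norm_pdf(x):
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def skew_normal_pdf(x, alpha):
    return 2.0 * norm_pdf(x) * norm_cdf(alpha * x)

# alpha = 0 reduces to the standard normal density at 0 (~0.3989)
print(skew_normal_pdf(0.0, 0.0))
# with alpha = -4 the density is left-skewed: more mass below 0 than above
print(skew_normal_pdf(-1.0, -4.0) > skew_normal_pdf(1.0, -4.0))
```

In the regression setting the location is modeled on covariates, so coefficient estimates stay on the survival-time scale while the α term absorbs the skewness that ordinary Gaussian regression cannot.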
Hirose, Takahiko; Kimura, Fumiharu; Tani, Hiroki; Ota, Shin; Tsukahara, Akihiro; Sano, Eri; Shigekiyo, Taro; Nakamura, Yoshitsugu; Kakiuchi, Kensuke; Motoki, Mikiko; Unoda, Kiichi; Ishida, Simon; Nakajima, Hideto; Arawaka, Shigeki
2018-04-20
Introduction: We evaluated survival after non-invasive ventilation (NIV) and factors for the transition to tracheostomy in amyotrophic lateral sclerosis (ALS). Methods: We analyzed 197 patients using a prospectively collected database, with 114 patients since 2000. Results: Of 114 patients, 59 underwent NIV, which prolonged the total median survival time to 43 months compared with 32 months without treatment. The best post-NIV survival was associated with a lack of bulbar symptoms, higher measured pulmonary function, and a slower rate of progression at diagnosis. The transition rate from NIV to tracheostomy gradually decreased over the years. Patients using NIV for more than 6 months were more likely to refuse tracheostomy and to be female. Discussion: This study confirmed a positive survival effect with NIV, which was less effective in patients with bulbar dysfunction. Further studies are necessary to determine the best timing for initiating NIV in ALS patients with bulbar dysfunction. This article is protected by copyright. All rights reserved. © 2018 Wiley Periodicals, Inc.
Halazun, Karim J; Quillin, Ralph C; Rosenblatt, Russel; Bongu, Advaith; Griesemer, Adam D; Kato, Tomoaki; Smith, Craig; Michelassi, Fabrizio; Guarrera, James V; Samstein, Benjamin; Brown, Robert S; Emond, Jean C
2017-09-01
Marginal livers (ML) have been used to expand the donor pool. National utilization of MLs is variable, and in some centers, they are never used. We examined the outcomes of MLs in the largest single center series of MLs used to date and compared outcomes to standard (SL) and living donor (LD) livers. Analysis of a prospectively maintained database of all liver transplants performed at our institution from 1998 to 2016. ML grafts were defined as livers from donors >70, livers discarded regionally and shared nationally, livers with cold ischemic time >12 hours, livers from hepatitis C virus positive donors, livers from donation after cardiac death donors, livers with >30% steatosis, and livers split between 2 recipients. A total of 2050 liver transplant recipients were studied, of these 960 (46.8%) received ML grafts. ML recipients were more likely to have lower MELDs and have hepatocellular carcinoma. Most MLs used were from organs turned down regionally and shared nationally (69%) or donors >70 (22%). Survival of patients receiving MLs did not significantly differ from patients receiving SL grafts (P = 0.08). ML and SL recipients had worse survival than LDs (P < 0.01). Despite nearly half of our recipients receiving MLs, overall survival was significantly better than national survival over the same time period (P = 0.04). Waitlist mortality was significantly lower in our series compared with national results (19% vs 24.0%, P < 0.0001). Outcomes of recipients of ML grafts are comparable to SL transplants. Despite liberal use of these grafts, we have been able to successfully reduce waitlist mortality while exceeding national post-transplant survival metrics.
Kurian, Allison W; Canchola, Alison J; Gomez, Scarlett L; Clarke, Christina A
2016-11-01
Nipple-sparing mastectomy, which may improve cosmesis, body image, and sexual function in comparison to non-nipple-sparing mastectomy, is increasingly used to treat early-stage breast cancer; however, long-term survival data are lacking. We evaluated survival after nipple-sparing mastectomy versus non-nipple-sparing mastectomy in a population-based cancer registry. We conducted an observational study using the California Cancer Registry, considering all stage 0-III breast cancers diagnosed in California from 1988 to 2013. We compared breast cancer-specific and overall survival time after nipple-sparing versus non-nipple-sparing mastectomy, using multivariable analysis. Among 157,592 stage 0-III female breast cancer patients treated with unilateral mastectomy from 1988-2013, 993 (0.6 %) were reported as having nipple-sparing and 156,599 (99.4 %) non-nipple-sparing mastectomies; median follow-up was 7.9 years. The proportion of mastectomies that were nipple-sparing increased over time (1988, 0.2 %; 2013, 5.1 %) and with neighborhood socioeconomic status, and decreased with age and stage. On multivariable analysis, nipple-sparing mastectomy was associated with a lower risk of breast cancer-specific mortality compared to non-nipple-sparing mastectomy [hazard ratio (HR) 0.71, 95 % confidence interval (CI) 0.51-0.98]. However, when restricting to diagnoses 1996 or later and adjusting for a larger set of covariates, risk was attenuated (HR 0.86, 95 % CI 0.52-1.42). Among California breast cancer patients diagnosed from 1988-2013, nipple-sparing mastectomy was not associated with worse survival than non-nipple-sparing mastectomy. These results may inform the decisions of patients and doctors deliberating between these surgical approaches for breast cancer treatment.
Robinson, Emily J; Power, Geraldine S; Nolan, Jerry; Soar, Jasmeet; Spearpoint, Ken; Gwinnutt, Carl; Rowan, Kathryn M
2016-01-01
Background Internationally, hospital survival is lower for patients admitted at weekends and at night. Data from the UK National Cardiac Arrest Audit (NCAA) indicate that crude hospital survival was worse after in-hospital cardiac arrest (IHCA) at night versus day, and at weekends versus weekdays, despite similar frequency of events. Objective To describe IHCA demographics during three day/time periods—weekday daytime (Monday to Friday, 08:00 to 19:59), weekend daytime (Saturday and Sunday, 08:00 to 19:59) and night-time (Monday to Sunday, 20:00 to 07:59)—and to compare the associated rates of return of spontaneous circulation (ROSC) for >20 min (ROSC>20 min) and survival to hospital discharge, adjusted for risk using previously developed NCAA risk models. To consider whether any observed difference could be attributed to differences in the case mix of patients resident in hospital and/or the administered care. Methods We performed a prospectively defined analysis of NCAA data from 27 700 patients aged ≥16 years receiving chest compressions and/or defibrillation and attended by a hospital-based resuscitation team in response to a resuscitation (2222) call in 146 UK acute hospitals. Results Risk-adjusted outcomes (OR (95% CI)) were worse (p<0.001) for both weekend daytime (ROSC>20 min 0.88 (0.81 to 0.95); hospital survival 0.72 (0.64 to 0.80)), and night-time (ROSC>20 min 0.72 (0.68 to 0.76); hospital survival 0.58 (0.54 to 0.63)) compared with weekday daytime. The effects were stronger for non-shockable than shockable rhythms, but there was no significant interaction between day/time of arrest and age, or day/time of arrest and arrest location. While many daytime IHCAs involved procedures, restricting the analyses to IHCAs in medical admissions with an arrest location of ward produced results that are broadly in line with the primary analyses. 
Conclusions IHCAs attended by the hospital-based resuscitation team during nights and weekends have substantially worse outcomes than during weekday daytimes. Organisational or care differences at night and weekends, rather than patient case mix, appear to be responsible. PMID:26658774
Outcomes Associated with Steroid Avoidance and Alemtuzumab among Kidney Transplant Recipients.
Serrano, Oscar K; Friedmann, Patricia; Ahsanuddin, Sayeeda; Millan, Carlos; Ben-Yaacov, Almog; Kayler, Liise K
2015-11-06
Alemtuzumab is a humanized anti-CD52 monoclonal antibody used as induction in kidney transplantation (KTX) since 2003. Few studies have evaluated long-term outcomes of this agent or changes in outcomes over time. A retrospective cohort study was performed examining United States registry data from 2003 to 2014 of primary KTX recipients receiving induction with alemtuzumab (AZ; n=5521) or antithymocyte globulin (ATG; n=8504) and maintenance immunosuppression with tacrolimus and mycophenolate mofetil and early withdrawal of steroids. The primary outcome was overall death-censored graft survival (DCGS), and secondary outcomes were overall patient survival and 1-year acute rejection. Multivariate models were fit with donor, recipient, and transplant covariates. Because poorer outcomes with AZ may occur from a learning curve impact with the use of a new medication, transplant year was categorized into three time periods to evaluate outcomes over time (2003-2005, 2006-2008, ≥2009), and an interaction term of induction type with transplant year category was included in all models to test for era impacts. On multivariate analysis of DCGS there was a significant interaction between AZ and era (P<0.001). AZ was significantly associated with inferior DCGS in the earliest 2003-2005 era (adjusted hazard ratio [aHR], 2.21; 95% confidence interval [95% CI], 1.72 to 2.84) but not in the middle 2006-2008 era (aHR, 1.14; 95% CI, 0.96 to 1.36) or the most recent 2009-2014 era (aHR, 1.08; 95% CI, 0.90 to 1.29) compared with ATG. 
Risk-adjusted patient survival (aHR, 1.32; 95% CI, 1.08 to 1.61; aHR, 1.26; 95% CI, 1.09 to 1.46; and aHR, 1.10; 95% CI, 0.93 to 1.29 by era, respectively) and acute rejection (adjusted odds ratio [aOR], 1.17; 95% CI, 0.96 to 1.42; aOR, 0.94; 95% CI, 0.82 to 1.07; aOR, 0.89; 95% CI, 0.81 to 0.98 by era, respectively) with AZ was comparable with ATG in the most recent era; however, there was no significant interaction with time (P=0.13 and P=0.06, respectively). Current alemtuzumab utilization is associated with comparable graft and patient survival and acute rejection compared with ATG. Graft survival with alemtuzumab has improved over time. Copyright © 2015 by the American Society of Nephrology.
White, Evan C; Khodayari, Behnood; Erickson, Kelly T; Lien, Winston W; Hwang-Graziano, Julie; Rao, Aroor R
2017-08-01
To compare the toxicity and treatment outcomes in human immunodeficiency virus (HIV)-positive versus HIV-negative patients with squamous cell carcinoma of the anal canal who underwent definitive concurrent chemoradiation at a single institution. Fifty-three consecutive HIV-positive patients treated between 1987 and 2013 were compared with 205 consecutive HIV-negative patients treated between 2003 and 2013. All patients received radiotherapy at a single regional facility. The median radiation dose was 54 Gy (range, 28 to 60 Gy). Concurrent chemotherapy consisted of 2 cycles of 5-FU with mitomycin-C given on day 1 (± day 29). After treatment, patients were closely followed with imaging studies, clinical examinations, and rigid proctoscopies. Outcomes assessed were toxicity rates, progression-free survival, colostomy-free survival, cancer-specific survival, and overall survival. Median follow-up was 34 months. Compared with HIV-negative patients, HIV-positive patients were younger (median age, 48 vs. 62 y) and predominantly male (98% of HIV-positive patients were male vs. 22% of HIV-negative patients). Of the HIV-positive patients, 37 (70%) were on highly active antiretroviral therapy, 26 (65%) had an undetectable viral load at the time of treatment, and 36 (72%) had a CD4 count >200 (mean CD4 count, 455). There were no significant differences in acute or late nonhematologic or hematologic toxicity rates between the 2 groups. At 3 years, there was no significant difference between HIV-positive and HIV-negative patients with regard to progression-free survival (75% vs. 76%), colostomy-free survival (85% vs. 85%), or cancer-specific survival (79% vs. 88%, P=0.36), respectively. On univariate analysis, there was a trend toward worse overall survival in HIV-positive patients (72% vs. 84% at 3 y, P=0.06). For the entire cohort, on multivariate analysis only male sex and stage were predictive of worse survival outcomes.
HIV status was not associated with worse outcomes in Cox models. In the highly active antiretroviral therapy era, HIV-positive patients with anal cancer treated with standard definitive chemoradiation have equivalent toxicity and cancer-specific survival compared with HIV-negative patients.
Hospital variation in time to defibrillation after in-hospital cardiac arrest.
Chan, Paul S; Nichol, Graham; Krumholz, Harlan M; Spertus, John A; Nallamothu, Brahmajee K
2009-07-27
Delays to defibrillation are associated with worse survival after in-hospital cardiac arrest, but the degree to which hospitals vary in defibrillation response times and hospital predictors of delays remain unknown. Using hierarchical models, we evaluated hospital variation in rates of delayed defibrillation (>2 minutes) and its impact on survival among 7479 adult inpatients with cardiac arrests at 200 hospitals within the National Registry of Cardiopulmonary Resuscitation. Adjusted rates of delayed defibrillation varied substantially among hospitals (range, 2.4%-50.9%), with hospital-level effects accounting for a significant amount of the total variation in defibrillation delays after adjusting for patient factors. We found a 46% greater odds of patients with identical covariates getting delayed defibrillation at one randomly selected hospital compared with another. Among traditional hospital factors evaluated, however, only bed volume (reference category: <200 beds; 200-499 beds: odds ratio [OR], 0.62 [95% confidence interval {CI}, 0.48-0.80]; ≥500 beds: OR, 0.74 [95% CI, 0.53-1.04]) and arrest location (reference category: intensive care unit; telemetry unit: OR, 1.92 [95% CI, 1.65-2.22]; nonmonitored unit: OR, 1.90 [95% CI, 1.61-2.24]) were associated with differences in rates of delayed defibrillation. Wide variation also existed in adjusted hospital rates of survival to discharge (range, 5.3%-49.6%), with higher survival among hospitals in the top-performing quartile for defibrillation time (compared with the bottom quartile: OR for top quartile, 1.41 [95% CI, 1.11-1.77]). Rates of delayed defibrillation vary widely among hospitals but are largely unexplained by traditional hospital factors. Given its association with improved survival, future research is needed to better understand best practices in the delivery of defibrillation at top-performing hospitals.
McLay, L K; Green, M P; Jones, T M
2017-07-01
The presence of artificial light at night (ALAN) is expanding in geographical range and increasing in intensity to such an extent that species living in urban environments may never experience natural darkness. The negative ecological consequences of artificial night lighting have been identified in several key life history traits across multiple taxa (albeit with a strong vertebrate focus); comparable data for invertebrates are lacking. In this study, we explored the effect of chronic exposure to different night-time lighting intensities on growth, reproduction and survival in Drosophila melanogaster. We reared three generations of flies under identical daytime light conditions (2600lx) and one of four ecologically relevant ALAN treatments (0, 1, 10 or 100lx), then explored variation in oviposition, number of eggs produced, juvenile growth and survival, and adult survival. We found that, in the presence of light at night (1, 10 and 100lx treatments), the probability of a female commencing oviposition and the number of eggs laid were significantly reduced. This did not translate into differences at the juvenile phase: juvenile development times and the probability of eclosing as an adult were comparable across all treatments. However, we demonstrate for the first time a direct link between chronic exposure to light at night (greater than 1lx) and adult survival. Our data highlight that ALAN has the capacity to cause dramatic shifts in multiple life history traits at both the individual and population level. Such shifts are likely to be species-specific; however, a more in-depth understanding of the broad-scale impact of ALAN and the relevant mechanisms driving biological change is urgently required as we move into an increasingly brightly lit future. Copyright © 2017 Elsevier Ltd. All rights reserved.
Hu, Xingsheng; Zhang, Li; Shi, Yuankai; Zhou, Caicun; Liu, Xiaoqing; Wang, Dong; Song, Yong; Li, Qiang; Feng, Jifeng; Qin, Shukui; Xv, Nong; Zhou, Jianying; Zhang, Li; Hu, Chunhong; Zhang, Shucai; Luo, Rongcheng; Wang, Jie; Tan, Fenlai; Wang, Yinxiang; Ding, Lieming; Sun, Yan
2015-01-01
Icotinib is a small molecule targeting the epidermal growth factor receptor tyrosine kinase that showed non-inferior efficacy and better safety compared to gefitinib in a previous phase III trial. The present study was designed to further evaluate the efficacy and safety of icotinib in patients with advanced non-small-cell lung cancer (NSCLC) previously treated with platinum-based chemotherapy. Patients with NSCLC progressing after one or two lines of chemotherapy were enrolled to receive oral icotinib (125 mg tablet, three times per day). The primary endpoint was progression-free survival. The secondary endpoints included overall survival, objective response rate, time to progression, quality of life and safety. From March 16, 2010 to October 9, 2011, 128 patients from 15 centers nationwide were enrolled, of whom 124 were evaluable for efficacy and 127 for safety. The median progression-free survival and time to progression were 5.0 months (95% CI 2.9-6.6 m) and 5.4 months (95% CI 3.1-7.9 m), respectively. The objective response rate and disease control rate were 25.8% and 67.7%, respectively. Median overall survival exceeded 17.6 months (95% CI 14.2 m-NA) according to censored data. Further follow-up of overall survival is ongoing. The most frequent treatment-related adverse events were rash (26%, 33/127), elevated transaminases (15.7%, 20/127) and diarrhea (12.6%, 16/127). In general, this study showed similar efficacy and numerically better safety compared with the ICOGEN trial, further confirming the efficacy and safety of icotinib in treating patients with advanced NSCLC previously treated with chemotherapy. ClinicalTrials.gov NCT02486354.
ERIC Educational Resources Information Center
Kim, Jinok; Chung, Gregory K. W. K.
2012-01-01
In this study, we compared the effects of two math game designs on math and game performance, using discrete-time survival analysis (DTSA) to model players' risk of not advancing to the next level in the game. A total of 137 students were randomly assigned to the two game conditions. The game covered the concept of a unit and the addition of like-sized fractional…
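Discrete-time survival analysis treats each game level as a time interval and models the conditional probability (the "hazard") that a player at that level fails to advance. As a minimal illustration with hypothetical counts (not the study's data), the level-specific hazards and the survival curve they imply can be computed as:

```python
# Minimal sketch of discrete-time survival analysis (DTSA) quantities:
# the hazard at level j is the fraction of players still at risk
# (i.e., who reached level j) who fail to advance; "survival" to level j
# is the product of (1 - hazard) over levels 1..j. Counts are hypothetical.

def discrete_time_hazards(at_risk, failures):
    """Level-specific hazard h_j = failures_j / at_risk_j."""
    return [f / n for f, n in zip(failures, at_risk)]

def survival_curve(hazards):
    """S_j = prod_{k<=j} (1 - h_k): probability of clearing levels 1..j."""
    surv, s = [], 1.0
    for h in hazards:
        s *= 1.0 - h
        surv.append(s)
    return surv

# Hypothetical data: players at risk and failures at each of four levels.
at_risk = [137, 120, 95, 60]
failures = [17, 25, 35, 30]

h = discrete_time_hazards(at_risk, failures)
S = survival_curve(h)
```

In practice DTSA is usually fit as a logistic regression on a person-period dataset so that covariates (here, game condition) can shift the hazard; the sketch above shows only the descriptive quantities being modeled.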
Singh, T. P.; Almond, C. S.; Gauvreau, K.
2014-01-01
We assessed whether the improvement in post-transplant survival in pediatric heart transplant (HT) recipients during the last 2 decades has benefited the major racial groups in the US equally. We analyzed all children <18 years of age who underwent their first HT in the US during 1987–2008. We compared trends in graft loss (death or re-transplant) in white, black and Hispanic children in 5 successive cohorts (1987–1992, 1993–1996, 1997–2000, 2001–2004, 2005–2008). The primary endpoint was early graft loss within 6 months post-transplant. Longer-term survival was assessed in recipients who survived the first 6 months. The improvement in early post-transplant survival was similar (hazard ratio [HR] for successive eras 0.80, 95% confidence interval [CI] 0.7, 0.9, P=0.24 for black-era interaction, P=0.22 for Hispanic-era interaction) in adjusted analysis. Longer-term survival was worse in black children (HR 2.2, CI 1.9, 2.5) and did not improve in any group with time (HR 1.0 for successive eras, CI 0.9, 1.1, P=0.57; P=0.19 for black-era interaction, P=0.21 for Hispanic-era interaction). Thus, the improvement in early post-HT survival during the last 2 decades has benefited white, black and Hispanic children equally. Disparities in longer-term survival have not narrowed with time; the survival remains worse in black recipients. PMID:21199352
2014-01-01
Background Although cardiac cancer of the remnant stomach and primary cardiac cancer occur at the same anatomic site, their clinical characteristics and outcomes have not been compared previously. This study was designed to evaluate the prognosis of cardiac cancer of the remnant stomach in comparison with primary cardiac cancer. Methods In this retrospective comparative study, clinical data and prognosis were compared between 48 patients with cardiac cancer of the remnant stomach and 96 patients with primary cardiac cancer who underwent radical resection from January 1995 to June 2007. Clinicopathologic characteristics, survival times, mortality, and complications were analyzed. Results The 5-year survival rate was significantly higher in patients with primary cardiac cancer than in those with cardiac cancer of the remnant stomach (28.4% vs. 16.7%, P = 0.035). Serosal invasion, lymph node metastasis and tumor location were independent prognostic factors for survival. Subgroup analysis, however, showed similar survival rates in patients with primary cardiac cancer and cardiac cancer of the remnant stomach without serosal invasion (25.0% vs. 43.8%, P = 0.214) and without lymph node metastasis (25.0% vs. 38.8%, P = 0.255), as well as similar complication rates (20.8% vs. 11.5%, P = 0.138). Conclusion Although survival after radical resection was poorer overall in patients with cardiac cancer of the remnant stomach than in those with primary cardiac cancer, survival rates were similar in patients without serosal invasion or lymph node metastasis. Early detection is therefore an important way to improve overall survival in cardiac cancer of the remnant stomach. PMID:24468299
Jansen, L; Buttmann-Schweiger, N; Listl, S; Ressing, M; Holleczek, B; Katalinic, A; Luttmann, S; Kraywinkel, K; Brenner, H
2018-01-01
The epidemiology of squamous cell oral cavity and pharyngeal cancers (OCPC) has changed rapidly in recent years, possibly due to an increase in human papillomavirus (HPV) positive tumors and successes in tobacco prevention. Here, we compare incidence and survival of OCPC by the HPV-relatedness of the site in Germany and the United States (US). Age-standardized and age-specific incidence and 5-year relative survival were estimated using data from population-based cancer registries in Germany and the US Surveillance Epidemiology and End Results (SEER) 13 database. Incidence was estimated for each year between 1999 and 2013. Relative survival for 2002-2005, 2006-2009, and 2010-2013 was estimated using period analysis. The datasets included 52,787 and 48,861 cases diagnosed with OCPC between 1997 and 2013 in Germany and the US, respectively. Incidence was much higher in Germany than in the US for HPV-unrelated OCPC and, more recently, also for HPV-related OCPC in women. Five-year relative survival differences between Germany and the US were small for HPV-unrelated OCPC. For HPV-related OCPC, men had higher survival in the US (62.1%) than in Germany (45.4%) in 2010-2013. These differences increased over time and were largest in younger patients and in stage IV disease without metastasis. In contrast, women had comparable survival for HPV-related OCPC in both countries. Strong survival differences between Germany and the US were observed for HPV-related OCPC in men, which might be explained by differences in HPV-attributable proportions. Close monitoring of the epidemiology of OCPC in each country is needed. Copyright © 2017 Elsevier Ltd. All rights reserved.
Stedman, Margaret R; Feuer, Eric J; Mariotto, Angela B
2014-11-01
The probability of cure is a long-term prognostic measure of cancer survival. Estimates of the cure fraction, the proportion of patients "cured" of the disease, are based on extrapolating survival models beyond the range of the data. The objective of this work is to evaluate the sensitivity of cure fraction estimates to model choice and study design. Data were obtained from the Surveillance, Epidemiology, and End Results (SEER)-9 registries to construct a cohort of breast and colorectal cancer patients diagnosed from 1975 to 1985. In a sensitivity analysis, cure fraction estimates from study designs with short- and long-term follow-up are compared. Methods tested include: cause-specific and relative survival, parametric mixture, and flexible models. In a separate analysis, estimates are projected for 2008 diagnoses using study designs including the full cohort (1975-2008 diagnoses) and a cohort restricted to recent diagnoses (1998-2008) with follow-up to 2009. We show that flexible models often provide higher estimates of the cure fraction than parametric mixture models, and that lognormal models generate lower estimates than Weibull models. In general, 12 years is enough follow-up time to estimate the cure fraction for regional and distant stage colorectal cancer but not for breast cancer. Projections for 2008 colorectal cancer diagnoses show a 15% increase in the cure fraction since 1985. Estimates of the cure fraction are model and study design dependent. It is best to compare results from multiple models and examine model fit to determine the reliability of the estimate. Early-stage cancers are sensitive to survival type and follow-up time because of their longer survival. More flexible models are susceptible to slight fluctuations in the shape of the survival curve, which can influence the stability of the estimate; however, stability may be improved by lengthening follow-up and restricting the cohort to reduce heterogeneity in the data.
Published by Oxford University Press 2014.
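The parametric mixture models discussed above have a simple closed form: the population survival function is S(t) = π + (1 − π)·S_u(t), where π is the cure fraction and S_u is the survival function of the uncured patients. A minimal sketch with a Weibull S_u and illustrative (not fitted) parameters:

```python
import math

# Parametric mixture cure model: a fraction pi of patients is "cured"
# (never experiences the event), and the remainder follow a Weibull
# survival function. Parameters are illustrative, not fitted to SEER data.

def weibull_survival(t, shape, scale):
    """S_u(t) = exp(-(t/scale)^shape) for the uncured group."""
    return math.exp(-((t / scale) ** shape))

def mixture_cure_survival(t, pi, shape, scale):
    """S(t) = pi + (1 - pi) * S_u(t); S(t) -> pi as t -> infinity."""
    return pi + (1.0 - pi) * weibull_survival(t, shape, scale)

pi = 0.55                  # hypothetical cure fraction
shape, scale = 1.3, 4.0    # hypothetical Weibull parameters (years)

# As follow-up lengthens, the survival curve plateaus at the cure fraction,
# which is why short follow-up forces the extrapolation the authors study.
print(round(mixture_cure_survival(1.0, pi, shape, scale), 3))
print(round(mixture_cure_survival(30.0, pi, shape, scale), 3))
```

The plateau behavior makes clear why the estimate is design dependent: with short follow-up, the tail (and hence π) is determined almost entirely by the chosen parametric family.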
Peterson, Damon; Trantham, Randi B.; Trantham, Tulley G.; Caldwell, Colleen A.
2018-01-01
One of the greatest limiting factors of studies designed to obtain growth, movement, and survival in small-bodied fishes is the selection of a viable tag. The tag must be relatively small with respect to body size so as to impart minimal sub-lethal effects on growth and mobility, and it must be retained throughout the life of the fish or the duration of the study. Thus, body size of the model species becomes a major limiting factor; yet few studies have obtained empirical evidence of the minimum fish size and related tagging effects. The probability of surviving a tagging event was quantified in White Sands pupfish (Cyprinodon tularosa) across a range of sizes (19–60 mm) to address the hypothesis that body size predicts tagging survival. We compared tagging-related mortality, individual taggers, growth, and tag retention over a 75 d period in White Sands pupfish implanted with 8-mm passive integrated transponder (PIT) tags, visual implant elastomer (VIE), or no tag (controls, handled similarly but without tag implantation). Initial body weight was a good predictor of the probability of survival in PIT- and VIE-tagged fish. For each 1-g increase in weight, fish were 4.73 times more likely to survive PIT-tag implantation relative to control fish, with an estimated minimum suitable tagging size of 1.1 g (TL: 39.29 ± 0.41 mm). Likewise, VIE-tagged animals were 2.27 times more likely to survive a tagging event than the control group for every additional 1 g, with an estimated minimum suitable tagging size of 0.9 g (TL: 36.9 ± 0.36 mm). Growth rates of PIT- and VIE-tagged White Sands pupfish were similar to those of the control groups. This research validated two popular tagging methodologies in the White Sands pupfish, providing a valuable tool for characterizing vital rates in other small-bodied fishes.
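The per-gram odds ratios reported above correspond to a logistic model of tagging survival on body weight, in which the coefficient is the natural log of the odds ratio per 1-g increase. A small sketch using the reported PIT-tag odds ratio but a purely hypothetical intercept:

```python
import math

# Logistic model for the probability of surviving a tagging event as a
# function of body weight: logit(p) = a + b*w, where exp(b) is the odds
# ratio per 1-g increase. The odds ratio is taken from the study; the
# intercept is a hypothetical value for illustration only.

def survival_probability(weight_g, intercept, beta):
    """Inverse-logit: p = 1 / (1 + exp(-(a + b*w)))."""
    return 1.0 / (1.0 + math.exp(-(intercept + beta * weight_g)))

odds_ratio_per_gram = 4.73            # reported for PIT-tagged fish
beta = math.log(odds_ratio_per_gram)  # logistic coefficient
intercept = -0.5                      # hypothetical

# Each additional gram multiplies the odds of survival by 4.73.
p1 = survival_probability(1.0, intercept, beta)
p2 = survival_probability(2.0, intercept, beta)
odds = lambda p: p / (1.0 - p)
print(round(odds(p2) / odds(p1), 2))  # → 4.73
```

The ratio of odds at 2 g versus 1 g recovers exp(β) exactly, which is the sense in which "4.73 times more likely" is a statement about odds, not raw probabilities.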
Changes in Risk Profile Over Time in the Population of a Pediatric Heart Transplant Program.
Reinhartz, Olaf; Maeda, Katsuhide; Reitz, Bruce A; Bernstein, Daniel; Luikart, Helen; Rosenthal, Daniel N; Hollander, Seth A
2015-09-01
Single-center data on pediatric heart transplantation spanning long time frames is sparse. We attempted to analyze how risk profile and pediatric heart transplant survival outcomes at a large center changed over time. We divided 320 pediatric heart transplants done at Stanford University between 1974 and 2014 into three groups by era: the first 20 years (95 transplants), the subsequent 10 years (87 transplants), and the most recent 10 years (138 transplants). Differences in age at transplant, indication, mechanical support, and survival were analyzed. Follow-up was 100% complete. Average age at time of transplantation was 10.4 years, 11.9 years, and 5.6 years in eras 1, 2, and 3, respectively. The percentage of infants who received transplants by era was 21%, 7%, and 18%, respectively. The indication of end-stage congenital heart disease vs cardiomyopathy was 24%, 22%, and 49%, respectively. Only 1 patient (1%) was on mechanical support at transplant in era 1 compared with 15% in era 2 and 30% in era 3. Overall survival was 72% at 5 years and 57% at 10 years. Long-term survival increased significantly with each subsequent era. Patients with cardiomyopathy generally had a survival advantage over those with congenital heart disease. The risk profile of pediatric transplant patients in our institution has increased over time. In the last 10 years, median age has decreased and ventricular assist device support has increased dramatically. Transplantation for end-stage congenital heart disease is increasingly common. Despite this, long-term survival has significantly and consistently improved. Copyright © 2015 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
Begg, Douglas J.; Dhand, Navneet K.; Watt, Bruce; Whittington, Richard J.
2014-01-01
The duration of survival of both the S and C strains of Mycobacterium avium subsp. paratuberculosis in feces was quantified in contrasting climatic zones of New South Wales, Australia, and detailed environmental temperature data were collected. Known concentrations of S and C strains in feces placed on soil in polystyrene boxes were exposed to the environment with or without the provision of shade (70%) at Bathurst, Armidale, Condobolin, and Broken Hill, and subsamples taken every 2 weeks were cultured for the presence of M. avium subsp. paratuberculosis. The duration of survival ranged from a minimum of 1 week to a maximum of 16 weeks, and the provision of 70% shade was the most important factor in extending the survival time. The hazard of death for exposed compared to shaded samples was 20 and 9 times higher for the S and C strains, respectively. Site did not affect the survival of the C strain, but for the S strain, the hazard of death was 2.3 times higher at the two arid zone sites (Broken Hill and Condobolin) than at the two temperate zone sites (Bathurst and Armidale). Temperature measurements revealed maximum temperatures exceeding 60°C and large daily temperature ranges at the soil surface, particularly in exposed boxes. PMID:24463974
Foy, Kevin Chu; Fisher, James L; Lustberg, Maryam B; Gray, Darrell M; DeGraffinreid, Cecilia R; Paskett, Electra D
2018-01-01
African American (AA) women have a 42% higher breast cancer death rate compared to white women despite recent advancements in management of the disease. We examined racial differences in clinical and tumor characteristics, treatment and survival in patients diagnosed with breast cancer between 2005 and 2014 at a single institution, the James Cancer Hospital, who were included in the Arthur G. James Cancer Hospital and Richard J. Solove Research Institute Cancer Registry in Columbus, OH. Statistical analyses included likelihood ratio chi-square tests for differences in proportions, as well as univariate and multivariate Cox proportional hazards regressions to examine associations between race and overall and progression-free survival probabilities. AA women made up 10.2% (469 of 4593) of the sample. Average time to onset of treatment after diagnosis was almost two times longer in AA women compared to white women (62.0 days vs 35.5 days, p < 0.0001). AA women were more likely to report past or current tobacco use, experience delays in treatment, and have triple-negative and late-stage breast cancer, and were less likely to receive surgery, especially mastectomy and reconstruction following mastectomy. After adjustment for confounding factors (age, grade, and surgery), overall survival probability was significantly associated with race (HR = 1.33; 95% CI 1.03-1.72). These findings highlight the need for efforts focused on screening and receipt of prompt treatment among AA women diagnosed with breast cancer.
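Cox proportional hazards regression, used above to estimate the adjusted hazard ratio for race, maximizes a partial likelihood built from the risk set at each event time. A self-contained sketch with one binary covariate and hypothetical data (Newton's method on the partial log-likelihood; this is an illustration of the technique, not the study's analysis):

```python
import math

# One-covariate Cox proportional hazards fit by maximizing the partial
# likelihood with Newton's method. Data are hypothetical (time, event, x)
# triples with x = 1 for the exposed group and event = 0 for censoring.

def cox_fit(data, iters=25):
    beta = 0.0
    for _ in range(iters):
        score, info = 0.0, 0.0           # first and second derivatives
        for t, event, x in data:
            if not event:
                continue
            # Risk set: everyone still under observation at time t.
            risk = [xj for tj, _, xj in data if tj >= t]
            w = [math.exp(beta * xj) for xj in risk]
            s0 = sum(w)
            s1 = sum(wi * xj for wi, xj in zip(w, risk))
            s2 = sum(wi * xj * xj for wi, xj in zip(w, risk))
            xbar = s1 / s0               # risk-set weighted mean of x
            score += x - xbar
            info += s2 / s0 - xbar * xbar
        beta += score / info             # Newton step
    return beta

data = [(2, 1, 1), (3, 1, 1), (5, 0, 1), (6, 1, 1),
        (4, 1, 0), (7, 1, 0), (9, 0, 0), (10, 1, 0)]
beta = cox_fit(data)
hazard_ratio = math.exp(beta)            # e.g. HR > 1: exposed at higher risk
```

Exponentiating the fitted coefficient gives the hazard ratio, the same quantity reported above (HR = 1.33 for race after adjustment).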
Garcia, Elena Linda; Caffrey-Villari, Sherry; Ramirez, Diomeda; Caron, Jean-Luc; Mannhart, Patrice; Reuter, Paul-Georges; Lapostolle, Frederic; Adnet, Frederic
2017-03-01
Out-of-hospital cardiac arrest (OHCA) is a major public health challenge. Use of automated external defibrillators (AEDs) by laypersons improves survival of victims of OHCA. The aim of our study was to compare onsite AED vs. dispatched AED management of cardiac arrest occurring in international airports. We conducted a retrospective, observational, comparative study on data collected from three international airports: Paris-Charles-de-Gaulle (CDG), Chicago and Madrid-Barajas. We included patients with OHCA occurring inside the airport between 2009 and 2013. The public-access group (PUB) comprised the airports where AEDs were available to laypersons, and the dispatched group (SEC) was represented by Paris-CDG, where the AED was brought by paramedic teams. The primary endpoint was successful resuscitation, defined as survival at the time of hospital admission. We included 150 consecutive victims of OHCA across the three airports. The time between collapse and AED placement was significantly shorter in the PUB vs. SEC group (4 ± 3 minutes vs. 11 ± 11, P = 0.0006). The total duration of resuscitation was shorter in the PUB group (10 ± 10 minutes vs. 36 ± 25 minutes, P < 0.0001). Survival at the time of hospital admission was higher in the PUB group (62% vs. 38%, P = 0.01). The availability of public-access AEDs in international airports appears to allow quicker defibrillation and an increased rate of successful resuscitation. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
Wan, Xiaomin; Peng, Liubao; Li, Yuanjian
2015-01-01
In general, the individual patient-level data (IPD) collected in clinical trials are not available to independent researchers to conduct economic evaluations; researchers only have access to published survival curves and summary statistics. Thus, methods that use published survival curves and summary statistics to reproduce statistics for economic evaluations are essential. Four methods have been identified: two traditional methods 1) least squares method, 2) graphical method; and two recently proposed methods by 3) Hoyle and Henley, 4) Guyot et al. The four methods were first individually reviewed and subsequently assessed regarding their abilities to estimate mean survival through a simulation study. A number of different scenarios were developed that comprised combinations of various sample sizes, censoring rates and parametric survival distributions. One thousand simulated survival datasets were generated for each scenario, and all methods were applied to actual IPD. The uncertainty in the estimate of mean survival time was also captured. All methods provided accurate estimates of the mean survival time when the sample size was 500 and a Weibull distribution was used. When the sample size was 100 and the Weibull distribution was used, the Guyot et al. method was almost as accurate as the Hoyle and Henley method; however, more biases were identified in the traditional methods. When a lognormal distribution was used, the Guyot et al. method generated noticeably less bias and a more accurate uncertainty compared with the Hoyle and Henley method. The traditional methods should not be preferred because of their remarkable overestimation. When the Weibull distribution was used for a fitted model, the Guyot et al. method was almost as accurate as the Hoyle and Henley method. However, if the lognormal distribution was used, the Guyot et al. method was less biased compared with the Hoyle and Henley method.
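The "least squares method" reviewed above fits a parametric model to points digitized from a published survival curve. For a Weibull model, the fit reduces to ordinary least squares after the transformation log(−log S(t)) = k·log t − k·log(scale). A sketch on synthetic "digitized" points generated from a known Weibull (so the fit can be checked against the truth):

```python
import math

# Least-squares reconstruction of a Weibull survival model from points
# (t, S(t)) read off a published curve. The Weibull identity
# log(-log S) = shape*log(t) - shape*log(scale) makes the fit linear.
# Mean survival is then scale * Gamma(1 + 1/shape).
# The "digitized" points below are synthetic, not from a real trial.

def fit_weibull(points):
    xs = [math.log(t) for t, s in points]
    ys = [math.log(-math.log(s)) for t, s in points]
    n = len(points)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
        / sum((x - xbar) ** 2 for x in xs)
    intercept = ybar - slope * xbar
    shape = slope
    scale = math.exp(-intercept / shape)
    return shape, scale

# Synthetic "published" curve: S(t) = exp(-(t/10)^1.5), read at five times.
points = [(t, math.exp(-((t / 10.0) ** 1.5))) for t in (2, 4, 6, 8, 12)]
shape, scale = fit_weibull(points)
mean_survival = scale * math.gamma(1.0 + 1.0 / shape)
```

On noiseless points the regression recovers the generating parameters exactly; with real digitized curves, the biases the authors document arise from censoring, digitization error, and misspecification of the parametric family (e.g. lognormal vs. Weibull).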
Weber, Arthur J; Viswanathan, Suresh; Ramanathan, Chidambaram; Harman, Christine D
2010-01-01
To determine whether application of BDNF to the eye and brain provides a greater level of neuroprotection after optic nerve injury than treatment of the eye alone. Retinal ganglion cell survival and pattern electroretinographic responses were compared in normal cat eyes and in eyes that received (1) a mild nerve crush and no treatment, (2) a single intravitreal injection of BDNF at the time of the nerve injury, or (3) intravitreal treatment combined with 1 to 2 weeks of continuous delivery of BDNF to the visual cortex, bilaterally. Relative to no treatment, administration of BDNF to the eye alone resulted in a significant increase in ganglion cell survival at both 1 and 2 weeks after nerve crush (1 week, 79% vs. 55%; 2 weeks, 60% vs. 31%). Combined treatment of the eye and visual cortex resulted in a modest additional increase (17%) in ganglion cell survival in the 1-week eyes, a further significant increase (55%) in the 2-week eyes, and ganglion cell survival levels for both that were comparable to normal (92%-93% survival). Pattern ERG responses for all the treated eyes were comparable to normal at 1 week after injury; however, at 2 weeks, only the responses of eyes receiving the combined BDNF treatment remained so. Although treatment of the eye alone with BDNF has a significant impact on ganglion cell survival after optic nerve injury, combined treatment of the eye and brain may represent an even more effective approach and should be considered in the development of future optic neuropathy-related neuroprotection strategies.
Kitada, Shuichi; Schulze, P Christian; Jin, Zhezhen; Clerkin, Kevin; Homma, Shunichi; Mancini, Donna M.
2016-01-01
Background Placement of left ventricular assist devices (LVAD) as a bridge-to-heart transplantation (HTx) has rapidly expanded due to organ donor shortage. However, the timing of LVAD implantation is variable and it remains unclear if earlier implantation improves survival. Methods We analyzed 14,187 adult candidates from the United Network of Organ Sharing database. Patients were classified by 3 treatment strategies including patients medically treated alone (MED, n=11,009), patients on LVAD support at listing (Early-LVAD, n=1588) and patients undergoing LVAD placement while awaiting HTx (Delayed-LVAD, n=1590). Likelihood of HTx and event-free survival were assessed in patients subcategorized by clinical strategies and UNOS status at listing. Results The device support strategy, despite the timing of placement, was not associated with increased likelihood of HTx compared to MED group. However, both LVAD implantation strategies showed better survival compared to MED group (Early-LVAD: HR 0.811 and 0.633, 95% CI 0.668-0.984 and 0.507-0.789, for 1A and 1B; p=0.034 and p<0.001, Delayed-LVAD: HR 0.553 and 0.696, 95% CI 0.415-0.736 and 0.571-0.847, for 1A and 1B; both p<0.001, respectively). Furthermore, there was no significant difference in survival between these LVAD implantation strategies in patients listed as 1B (p=0.500), although Early-LVAD implantation showed worse survival in patients listed as 1A (HR 1.467, 95% CI 1.076-2.000; p=0.015). Conclusion LVAD support strategies offer a safe bridge-to-HTx. Those candidates who receive urgent upfront LVAD implantation for HTx, and improve to 1B status, would achieve competitive survival with those who receive elective LVAD implantation. PMID:26618255
Culp, William T N; Olea-Popelka, Francisco; Sefton, Jennifer; Aldridge, Charles F; Withrow, Stephen J; Lafferty, Mary H; Rebhun, Robert B; Kent, Michael S; Ehrhart, Nicole
2014-11-15
To evaluate clinical characteristics, outcome, and prognostic variables in a cohort of dogs surviving > 1 year after an initial diagnosis of osteosarcoma. Retrospective case series. 90 client-owned dogs. Medical records for an 11-year period from 1997 through 2008 were reviewed, and patients with appendicular osteosarcoma that lived > 1 year after initial histopathologic diagnosis were studied. Variables including signalment, weight, serum alkaline phosphatase activity, tumor location, surgery, and adjuvant therapies were recorded. Median survival times were calculated by means of a Kaplan-Meier survival function. Univariate analysis was conducted to compare the survival function for categorical variables, and the Cox proportional hazard model was used to evaluate the likelihood of death > 1 year after diagnosis on the basis of the selected risk factors. 90 dogs met the inclusion criteria; clinical laboratory information was not available in all cases. Median age was 8.2 years (range, 2.7 to 13.3 years), and median weight was 38 kg (83.6 lb; range, 21 to 80 kg [46.2 to 176 lb]). Serum alkaline phosphatase activity was high in 29 of 60 (48%) dogs. The most common tumor location was the distal portion of the radius (54/90 [60%]). Eighty-nine of 90 (99%) dogs underwent surgery, and 78 (87%) received chemotherapy. Overall, 49 of 90 (54%) dogs developed metastatic disease. The median survival time beyond 1 year was 243 days (range, 1 to 1,899 days). Dogs that developed a surgical-site infection after limb-sparing surgery had a significantly improved prognosis > 1 year after osteosarcoma diagnosis, compared with dogs that did not develop infections. Results of the present study indicated that dogs with an initial diagnosis of osteosarcoma that lived > 1 year had a median survival time beyond the initial year of approximately 8 months. 
As reported previously, the development of a surgical-site infection in dogs undergoing a limb-sparing surgery significantly affected prognosis and warrants further study.
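The Kaplan-Meier survival function used to compute the median survival times above can be sketched in a few lines. The data here are hypothetical (time, event) pairs in days with distinct event times, where event = 0 marks a censored observation:

```python
# Minimal Kaplan-Meier estimator and median survival time, the quantities
# reported in the study. Data are hypothetical (time, event) pairs in days;
# event = 0 means the observation was censored. Assumes distinct event times.

def kaplan_meier(data):
    """Return [(time, S(time))] at each event time, in time order."""
    data = sorted(data)
    n = len(data)
    s, curve = 1.0, []
    for i, (t, event) in enumerate(data):
        at_risk = n - i              # subjects still under observation
        if event:
            s *= (at_risk - 1) / at_risk
            curve.append((t, s))
        # Censored observations drop out of the risk set without a step.
    return curve

def median_survival(curve):
    """First time at which the survival curve drops to 0.5 or below."""
    for t, s in curve:
        if s <= 0.5:
            return t
    return None  # median not reached

data = [(30, 1), (75, 1), (120, 0), (180, 1), (250, 1),
        (300, 0), (410, 1), (600, 1)]
curve = kaplan_meier(data)
```

Censoring is what distinguishes this from a simple empirical fraction: a censored dog leaves the risk set but contributes no downward step, so the curve is not biased by incomplete follow-up.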
NASA Technical Reports Server (NTRS)
Darlington, Daniel N.; Chew, Gordon; Ha, Taryn; Keil, Lanny C.; Dallman, Mary F.
1990-01-01
Fed adrenalectomized rats survive the stress of hemorrhage and hypovolemia, whereas fasted adrenalectomized rats become hypotensive and hypoglycemic after the first 90 min and die within 4 hours (h). We have studied the effects of glucose and corticosterone (B) infusions after hemorrhage as well as treatment with B at the time of adrenalectomy on the capacity of chronically prepared, conscious, fasted, adrenalectomized rats to survive hemorrhage. We have also measured the magnitudes of vasoactive hormone responses to hemorrhage. Maintenance of plasma glucose concentrations did not sustain life; however, treatment of rats at the time of adrenalectomy with B allowed 100 percent survival, and acute treatment of adrenalectomized rats at the time of hemorrhage allowed about 50 percent survival during the 5-h posthemorrhage observation period. Rats in the acute B infusion group that died exhibited significantly increased plasma B and significantly decreased plasma glucose concentrations by 2 h compared to the rats that lived. Plasma vasopressin, renin, and norepinephrine responses to hemorrhage were markedly augmented in the adrenalectomized rats not treated with B, and plasma vasopressin concentrations were significantly elevated at 1 and 2 h in all of the rats that subsequently died compared to values in those that lived. 
We conclude that: 1) death after hemorrhage in fasted adrenalectomized rats is not a result of lack of glucose; 2) chronic and, to an extent, acute treatment of fasted adrenalectomized rats with B enables survival; 3) fasted adrenalectomized rats exhibit strong evidence of hepatic insufficiency which is not apparent in either fed adrenalectomized rats or B-treated fasted adrenalectomized rats; 4) death after hemorrhage in fasted adrenalectomized rats may result from hepatic failure as a consequence of marked splanchnic vasoconstriction mediated by the actions of extraordinarily high levels of vasoactive hormones after hemorrhage; and 5) B appears to act to decrease the magnitude of response of vasoactive hormones after hemorrhage in fasted adrenalectomized rats.
Comparison of extended colectomy and limited resection in patients with Lynch syndrome.
Natarajan, Nagendra; Watson, Patrice; Silva-Lopez, Edibaldo; Lynch, Henry T
2010-01-01
The purpose of the study was to determine the advantages and disadvantages of prophylactic/extended colectomy (subtotal colectomy) in patients with Lynch syndrome who manifest colorectal cancer. A retrospective cohort drawn from Creighton University's hereditary cancer database was used to identify cases and controls. Cases were patients who underwent subtotal colectomy, either with no colorectal cancer diagnosis (prophylactic) or at diagnosis of first colorectal cancer; controls for these 2 types of cases were, respectively, patients who underwent no colon surgery and those having limited resection at time of diagnosis of first colorectal cancer. Kaplan-Meier and proportional hazards regression models from the Statistical Analysis Software (SAS) program were used to estimate differences in survival, time to subsequent colorectal cancer, and time to subsequent abdominal surgery between cases and controls. Because event-free survival in our study never fell to 50%, we used event-free survival at 5 years as the parameter for comparing the 2 groups. The 5-year event-free survival for subsequent colorectal cancer, subsequent abdominal surgery, and death was 94%, 84%, and 93%, respectively, for cases and 74%, 63%, and 88%, respectively, for controls. Times to subsequent colorectal cancer and subsequent abdominal surgery were significantly shorter in the control group (P < .006 and P < .04, respectively). No significant difference was identified with respect to survival time between the cases and controls. Even though no survival benefit was identified between the cases and controls, the increased incidence of metachronous colorectal cancer and the increased number of abdominal surgeries among controls warrant the recommendation of subtotal colectomy in patients with Lynch syndrome.
Sample size calculation for studies with grouped survival data.
Li, Zhiguo; Wang, Xiaofei; Wu, Yuan; Owzar, Kouros
2018-06-10
Grouped survival data arise often in studies where the disease status is assessed at regular visits to clinic. The time to the event of interest can only be determined to be between two adjacent visits or is right censored at one visit. In data analysis, replacing the survival time with the endpoint or midpoint of the grouping interval leads to biased estimators of the effect size in group comparisons. Prentice and Gloeckler developed a maximum likelihood estimator for the proportional hazards model with grouped survival data and the method has been widely applied. Previous work on sample size calculation for designing studies with grouped data is based on either the exponential distribution assumption or the approximation of variance under the alternative with variance under the null. Motivated by studies in HIV trials, cancer trials and in vitro experiments to study drug toxicity, we develop a sample size formula for studies with grouped survival endpoints that use the method of Prentice and Gloeckler for comparing two arms under the proportional hazards assumption. We do not impose any distributional assumptions, nor do we use any approximation of variance of the test statistic. The sample size formula only requires estimates of the hazard ratio and survival probabilities of the event time of interest and the censoring time at the endpoints of the grouping intervals for one of the two arms. The formula is shown to perform well in a simulation study and its application is illustrated in the three motivating examples. Copyright © 2018 John Wiley & Sons, Ltd.
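A key fact behind the Prentice-Gloeckler grouped-data estimator is that proportional hazards survives grouping: if one arm's conditional event probability in interval j is q0_j, the other arm's is 1 - (1 - q0_j)^HR, so the log hazard ratio is an exact shift on the complementary log-log scale. A small sketch (the interval probabilities and hazard ratio below are illustrative assumptions, not values from the paper) verifies the identity numerically:

```python
import math

def grouped_event_prob(q_control, hazard_ratio):
    """Conditional event probability in an interval for the comparison arm,
    given the control arm's probability, under proportional hazards."""
    return 1.0 - (1.0 - q_control) ** hazard_ratio

def cloglog(p):
    """Complementary log-log transform."""
    return math.log(-math.log(1.0 - p))

q0 = [0.05, 0.08, 0.10, 0.12]   # assumed control-arm interval probabilities
hr = 0.6                        # assumed hazard ratio
q1 = [grouped_event_prob(p, hr) for p in q0]

# the cloglog difference recovers log(HR) exactly in every interval
for p0, p1 in zip(q0, q1):
    assert abs((cloglog(p1) - cloglog(p0)) - math.log(hr)) < 1e-12
```

This is why the grouped-data likelihood can be fitted as a binary regression with a complementary log-log link, and why sample size can be planned directly from interval survival probabilities rather than from a parametric event-time distribution.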
Vidal, Gregory; Bursac, Zoran; Miranda-Carboni, Gustavo; White-Means, Shelley; Starlard-Davenport, Athena
2017-07-01
Racial disparities in survival among African American (AA) women in the United States have been well documented. Breast cancer mortality rates among AA women are higher in Memphis, Tennessee than in 49 of the largest US cities. In this study, we investigated the extent to which racial/ethnic disparities in survival outcomes among Memphis women are attributed to differences in breast tumor subtype and treatment outcomes. A total of 3527 patients diagnosed with stage I-IV breast cancer between January 2002 and April 2015 at Methodist Health hospitals and West Cancer Center in Memphis, TN were included in the analysis. Kaplan-Meier survival curves were generated and Cox proportional hazards regression models were used to compare survival outcomes among 1342 (38.0%) AA and 2185 (62.0%) non-Hispanic White breast cancer patients by race and breast tumor subtype. Over a mean follow-up time of 29.9 months, AA women displayed increased mortality risk [adjusted hazard ratio (HR), 1.65; 95% confidence interval (CI), 1.35-2.03] and were more likely to be diagnosed at advanced stages of disease. AA women with triple-negative breast cancer (TNBC) had the highest death rate at 26.7% compared to non-Hispanic White women at 16.5%. AA women with TNBC and luminal B/HER2- breast tumors had the highest risk of mortality. Regardless of race, patients who did not have surgery had over five times higher risk of dying compared to those who had surgery. These findings provide additional evidence of the breast cancer disparity gap between AA and non-Hispanic White women and highlight the need for targeted interventions and policies to eliminate breast cancer disparities in AA populations, particularly in Memphis, TN. © 2017 The Authors. Cancer Medicine published by John Wiley & Sons Ltd.
Effects of sinus surgery on lung transplantation outcomes in cystic fibrosis.
Leung, Man-Kit; Rachakonda, Leelanand; Weill, David; Hwang, Peter H
2008-01-01
In cystic fibrosis (CF) patients who are candidates for lung transplant, pretransplant sinus surgery has been advocated to avoid bacterial seeding of the transplanted lungs. This study reviews the 17-year experience of pretransplant sinus surgery among CF patients at a major transplant center. Retrospective chart review was performed in all CF patients who underwent heart-lung or lung transplantation at Stanford Medical Center between 1988 and 2005. Postoperative culture data from bronchoalveolar lavage (BAL) and sinus aspirates were evaluated, in addition to survival data. Eighty-seven CF transplant recipients underwent pretransplant sinus surgery; 87% (n=59/68) of patients showed recolonization of the lung grafts with Pseudomonas on BAL cultures. The median postoperative time to recolonization was 19 days. Bacterial flora cultured from the sinuses were similar in type and prevalence to the flora cultured from BAL. When compared with published series of comparable cohorts in which pretransplant sinus surgery was not performed, there was no statistically significant difference in the prevalence of Pseudomonas recolonization. Times to recolonization also were similar. Survival rates in our cohort were similar to national survival rates for CF lung transplant recipients. Despite pretransplant sinus surgery, recolonization of lung grafts occurs commonly and rapidly with a spectrum of flora that mimics the sinus flora. Survival rates of CF patients who undergo prophylactic sinus surgery are similar to those from centers where prophylactic sinus surgery is not performed routinely. Pretransplant sinus surgery does not appear to prevent lung graft recolonization and is not associated with overall survival benefit.
Patent ductus arteriosus ligation in premature infants in the United States.
Tashiro, Jun; Wang, Bo; Sola, Juan E; Hogan, Anthony R; Neville, Holly L; Perez, Eduardo A
2014-08-01
Patent ductus arteriosus (PDA) is a condition that commonly affects premature and low birth weight (BW) infants at times necessitating surgical intervention. We examined outcomes after surgical ligation (SL). We analyzed the Kids' Inpatient Database for premature infants diagnosed with PDA, admitted at <8 d of age. Patient demographics, disposition, morbidity, and mortality were analyzed. All cases were weighted appropriately to project nationally representative estimates. A total of 63,208 patients were identified with diagnosis of PDA. Of these, 6766 (10.7%) underwent SL. Lower gestational age (GA) and BW patients had higher incidence of PDA and rates of SL. Overall survival was 90.8% for the cohort. Survival for the SL group was 88.0% and 91.2% for the non-SL group; however, infants undergoing SL had higher survival rates up to 28 wk and 1250 g for GA and BW, respectively. GA did not affect post-SL survival adversely. Rather, lower BW was associated with extremely high mortality rates. Black infants and boys had lower survival compared with other races and girls, respectively. Larger hospitals had higher survival rates, but hospital location, teaching status, and type did not affect survival. Payer status and income quartile did not affect survival. PDA and SL are more common in lower BW and GA groups. Higher survival rates are found for infants with SL versus non-SL in the lowest BW and GA groups. Morbidity and mortality are not affected by SL timing. BW, rather than GA, determines survival of infants undergoing SL. Copyright © 2014 Elsevier Inc. All rights reserved.
Factors affecting survival of women diagnosed with breast cancer in El-Minia Governorate, Egypt.
Seedhom, Amany Edward; Kamal, Nashwa Nabil
2011-07-01
This study was conducted to determine breast cancer survival time and the association between breast cancer survival and socio-demographic and pathologic factors among women in El-Minia, Egypt. While there has been much research regarding prognostic factors for breast cancer, the majority of these studies were from developed countries. El-Minia has a population of approximately 4 million. To date, no research has been performed to determine breast cancer survival and the factors affecting it in El-Minia. This retrospective study used data obtained from the cancer registry in the National Institute of Oncology in El-Minia and included 1207 women diagnosed with first primary breast cancer between 1st January 2005 and 31st December 2009 and followed to 30th June 2010. The association between survival and sociodemographic and pathological factors, distant metastasis at diagnosis, and treatment options was investigated using unifactorial chi-square tests and multi-factorial (Cox regression) analyses. Kaplan-Meier analysis was used to compare survival time among different groups. Median survival time was 83.8 ± 3.2. Cox regression showed that high vs low educational level (hazard ratio (HR) = 0.35, 95% CI: 0.27-0.46), metastases to bone (HR = 3.22, 95% CI: 1.71-6.05), metastases to lung (HR = 2.314, 95% CI: 1.225-4.373), tumor size (≤ 2 cm vs ≥ 5 cm: HR = 1.4, 95% CI: 1.1-1.8), and number of involved nodes (1 vs > 10: HR = 5.21, 95% CI: 3.1-9.01) were significantly related to survival. The results showed the need to develop screening programs and standardized treatment regimens in a tax-funded health care system.
Broët, Philippe; Tsodikov, Alexander; De Rycke, Yann; Moreau, Thierry
2010-01-01
This paper presents two-sample statistics suited for testing equality of survival functions against improper semi-parametric accelerated failure time alternatives. These tests are designed for comparing either the short- or the long-term effect of a prognostic factor, or both. These statistics are obtained as partial likelihood score statistics from a time-dependent Cox model. As a consequence, the proposed tests can be very easily implemented using widely available software. A breast cancer clinical trial is presented as an example to demonstrate the utility of the proposed tests. PMID:15293627
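The score statistics described here generalize the classical two-sample log-rank test, which can be written in a few lines: at each event time, compare observed group-1 events with their hypergeometric expectation under equal survival. A minimal sketch on toy data (not the breast cancer trial's):

```python
def logrank_chi2(times, events, groups):
    """Two-sample log-rank chi-square statistic (1 df) for H0: equal survival.
    times: follow-up times; events: 1 = event, 0 = censored; groups: 0 or 1."""
    data = list(zip(times, events, groups))
    o_minus_e = 0.0   # observed minus expected group-1 events
    var = 0.0         # hypergeometric variance accumulated over event times
    for t in sorted({t for t, e, _ in data if e == 1}):
        n = sum(1 for tt, _, _ in data if tt >= t)              # total at risk
        n1 = sum(g for tt, _, g in data if tt >= t)             # group 1 at risk
        d = sum(1 for tt, e, _ in data if tt == t and e == 1)   # events at t
        d1 = sum(g for tt, e, g in data if tt == t and e == 1)  # group-1 events at t
        o_minus_e += d1 - d * n1 / n
        if n > 1:
            var += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    return o_minus_e ** 2 / var

# toy data: all four subjects have events; group 1 fails later
print(logrank_chi2([1, 2, 3, 4], [1, 1, 1, 1], [0, 0, 1, 1]))
```

Under H0 the statistic is approximately chi-square with one degree of freedom, so values above 3.84 reject equality at the 5% level; the paper's statistics extend this construction to short- and long-term effects via a time-dependent Cox model.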
Timing embryo biopsy for PGD - before or after cryopreservation?
Shinar, S; Kornecki, N; Schwartz, T; Mey-Raz, N; Amir, H; Almog, B; Shavit, T; Hasson, J
2016-09-01
Pre-implantation genetic diagnosis (PGD) is required in order to screen and diagnose embryos of patients at risk of having a genetically affected offspring. A biopsy to diagnose the genetic profile of the embryo may be performed either before or after cryopreservation. The aim of this study was to determine which biopsy timing yields higher embryo survival rates. This was a retrospective cohort study of all PGD patients in a public IVF unit between 2010 and 2013. Inclusion criteria were patients with good-quality embryos available for cryopreservation by the slow freezing method. Embryos were divided into two groups: biopsy before and biopsy after cryopreservation. The primary outcome was embryo survival rates post thawing. Sixty-five patients met inclusion criteria. A total of 145 embryos were biopsied before cryopreservation and 228 embryos were cryopreserved and biopsied after thawing. Embryo survival was significantly greater in the latter group (77% vs. 68%, p < 0.0001). Cryopreservation preceding biopsy results in better embryo survival compared to biopsy before cryopreservation.
Severe chronic bronchitis in advanced emphysema increases mortality and hospitalizations.
Kim, Victor; Sternberg, Alice L; Washko, George; Make, Barry J; Han, Meilan K; Martinez, Fernando; Criner, Gerard J
2013-12-01
Chronic bronchitis in COPD has been associated with an increased exacerbation rate, more hospitalizations, and an accelerated decline in lung function. The clinical characteristics of patients with advanced emphysema and chronic bronchitis have not been well described. Patients randomized to medical therapy in the National Emphysema Treatment Trial were grouped based on their reports of cough and phlegm on the St. George's Respiratory Questionnaire (SGRQ) at baseline: chronic bronchitis (CB+) and no chronic bronchitis (CB-). The patients were similarly categorized into severe chronic bronchitis (SCB+) or no severe chronic bronchitis (SCB-) based on the above definition plus report of chest trouble. Kaplan-Meier survival analysis was used to determine the relationships between chronic bronchitis and severe chronic bronchitis and survival and time to hospitalization. Lung function and SGRQ scores over time were compared between groups. The CB+ (N = 234; 38%) and CB- (N = 376; 62%) groups had similar survival (median 60.8 versus 65.7 months, p = 0.19) and time to hospitalization (median 26.9 versus 24.9 months, p = 0.84). The SCB+ group (N = 74; 12%) had worse survival (median 47.7 versus 65.7 months, p = 0.02) and shorter time to hospitalization (median 18.5 versus 26.7 months, p = 0.02) than the SCB- group (N = 536; 88%). Mortality and hospitalization rates were not increased when chest trouble was analyzed by itself. The CB+ and CB- groups had similar lung function and SGRQ scores over time. The SCB+ and SCB- groups had similar lung function over time, but the SCB+ group had significantly worse SGRQ scores. Severe chronic bronchitis is associated with worse survival, shorter time to hospitalization, and worse health-related quality of life.
Decaens, Thomas; Roudot-Thoraval, Françoise; Hadni-Bresson, Solange; Meyer, Carole; Gugenheim, Jean; Durand, Francois; Bernard, Pierre-Henri; Boillot, Olivier; Sulpice, Laurent; Calmus, Yvon; Hardwigsen, Jean; Ducerf, Christian; Pageaux, Georges-Philippe; Dharancy, Sebastien; Chazouilleres, Olivier; Cherqui, Daniel; Duvoux, Christophe
2006-12-01
Orthotopic liver transplantation (OLT) indication for hepatocellular carcinoma (HCC) is currently based on the Milan criteria. The University of California, San Francisco (UCSF) recently proposed an expansion of the selection criteria according to tumor characteristics on the explanted liver. This study: 1) assessed the validity of these criteria in an independent large series and 2) tested the usefulness of these criteria when applied to pre-OLT tumor evaluation. Between 1985 and 1998, 479 patients were listed for liver transplantation (LT) for HCC and 467 were transplanted. According to pre-OLT (imaging at date of listing) or post-OLT (explanted liver) tumor characteristics, patients were retrospectively classified according to both the Milan and UCSF criteria. The 5-yr survival statistics were assessed by the Kaplan-Meier method and compared by the log-rank test. Pre-OLT UCSF criteria were analyzed according to an intention-to-treat principle. Based on the pre-OLT evaluation, 279 patients were Milan+, 44 patients were UCSF+ but Milan- (subgroup of patients that might benefit from the expansion), and 145 patients were UCSF- and Milan-. With a short median waiting time of 4 months, 5-yr survival was 60.1 +/- 3.0%, 45.6 +/- 7.8%, and 34.7 +/- 4.0%, respectively (P < 0.001). The 5-yr survival was numerically lower in UCSF+ Milan- patients compared to Milan+ but this difference was not significant (P = 0.10). Based on pathological features of the explanted liver, 5-yr survival was 70.4 +/- 3.4%, 63.6 +/- 7.8%, and 34.1 +/- 3.1%, in Milan+ patients (n = 184), UCSF+ Milan- patients (n = 39), and UCSF- Milan- patients (n = 238), respectively (P < 0.001). However, the 5-yr survival did not differ between Milan+ and UCSF+ Milan- patients (P = 0.33). In conclusion, these results show that when applied to pre-OLT evaluation, the UCSF criteria are associated with a 5-yr survival below 50%.
Their applicability is therefore limited, despite similar survival rates compared to the Milan criteria, when the explanted liver is taken into account.
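For reference, both selection rules reduce to simple checks on nodule count and diameters. The sketch below uses the commonly cited thresholds (Milan: a single nodule ≤ 5 cm, or 2-3 nodules each ≤ 3 cm; UCSF: a single nodule ≤ 6.5 cm, or 2-3 nodules each ≤ 4.5 cm with total diameter ≤ 8 cm); confirm against the original publications before any reuse:

```python
def within_milan(diameters_cm):
    """Milan criteria (as commonly stated): one nodule <= 5 cm,
    or up to three nodules each <= 3 cm."""
    n = len(diameters_cm)
    if n == 1:
        return diameters_cm[0] <= 5.0
    return n <= 3 and max(diameters_cm) <= 3.0

def within_ucsf(diameters_cm):
    """UCSF expansion (as commonly stated): one nodule <= 6.5 cm, or up to
    three nodules each <= 4.5 cm with total diameter <= 8 cm."""
    n = len(diameters_cm)
    if n == 1:
        return diameters_cm[0] <= 6.5
    return n <= 3 and max(diameters_cm) <= 4.5 and sum(diameters_cm) <= 8.0

# a solitary 6-cm tumor falls in the "UCSF+ but Milan-" expansion subgroup
assert not within_milan([6.0]) and within_ucsf([6.0])
```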
Outcomes of male patients with Alport syndrome undergoing renal replacement therapy.
Temme, Johanna; Kramer, Anneke; Jager, Kitty J; Lange, Katharina; Peters, Frederick; Müller, Gerhard-Anton; Kramar, Reinhard; Heaf, James G; Finne, Patrik; Palsson, Runolfur; Reisæter, Anna V; Hoitsma, Andries J; Metcalfe, Wendy; Postorino, Maurizio; Zurriaga, Oscar; Santos, Julio P; Ravani, Pietro; Jarraya, Faical; Verrina, Enrico; Dekker, Friedo W; Gross, Oliver
2012-12-01
Patients with the hereditary disease Alport syndrome commonly require renal replacement therapy (RRT) in the second or third decade of life. This study compared age at onset of RRT, renal allograft, and patient survival in men with Alport syndrome receiving various forms of RRT (peritoneal dialysis, hemodialysis, or transplantation) with those of men with other renal diseases. Patients with Alport syndrome receiving RRT identified from 14 registries in Europe were matched to patients with other renal diseases. A linear spline model was used to detect changes in the age at start of RRT over time. Kaplan-Meier method and Cox regression analysis were used to examine patient and graft survival. Age at start of RRT among patients with Alport syndrome remained stable during the 1990s but increased by 6 years between 2000-2004 and 2005-2009. Survival of patients with Alport syndrome requiring dialysis or transplantation did not change between 1990 and 2009. However, patients with Alport syndrome had better renal graft and patient survival than matched controls. Numbers of living-donor transplantations were lower in patients with Alport syndrome than in matched controls. These data suggest that kidney failure in patients with Alport syndrome is now being delayed compared with previous decades. These patients appear to have superior patient survival while undergoing dialysis and superior patient and graft survival after deceased-donor kidney transplantation compared with patients receiving RRT because of other causes of kidney failure.
Angthong, Chayanin; Angthong, Wirana; Harnroongroj, Thos; Naito, Masatoshi; Harnroongroj, Thossart
2013-01-01
Survival rates are poorer after a second hip fracture than after a first hip fracture. Previous survival studies have included in-hospital mortality. Excluding in-hospital deaths from the analysis allows survival times to be evaluated in community-based patients. There is still a lack of data regarding the effects of subsequent fractures on survival times after hospital discharge following an initial hip fracture. This study compared the survival times of community-dwelling patients with hip fracture who had or did not have a subsequent major long-bone fracture. Hazard ratios and risk factors for subsequent fractures and mortality rates with and without subsequent fractures were calculated. Of 844 patients with hip fracture from 2000 through 2008, 71 had a subsequent major long-bone fracture and 773 did not. Patients who died of other causes, such as perioperative complications, during hospitalization were excluded. Such exclusion allowed us to determine the effect of subsequent fracture on the survival of community-dwelling individuals after hospital discharge or after the time of the fracture if they did not need hospitalization. Demographic data, causes of death, and mortality rates were recorded. Differences in mortality rates between the patient groups and hazard ratios were calculated. Mortality rates during the first year and from 1 to 5 years after the most recent fracture were 5.6% and 1.4%, respectively, in patients with subsequent fractures, and 4.7% and 1.4%, respectively, in patients without subsequent fractures. These rates did not differ significantly between the groups. Cox regression analysis and calculation of hazard ratios did not show significant differences between patients with subsequent fractures and those without. On univariate and multivariate analyses, age <75 years and male sex were risk factors for subsequent fracture. 
This study found that survival times did not differ significantly between patients with and without subsequent major long-bone fractures after hip fracture. Therefore, all patients with hip fracture, with or without subsequent fractures, need the same robust holistic care. The risks of subsequent fractures should be addressed in patients with hip fracture and should be reduced where possible by education regarding fracture prevention and regular rehabilitation programs. Efforts should be made to decrease the rates of major long-bone fractures and their burdens, even though such fractures have only a minor effect on survival in community-dwelling individuals.
Exposure-related effects of Pseudomonas fluorescens (Pf-CL145A) on juvenile unionid mussels
Weber, Kerry L.; Luoma, James A.; Mayer, Denise A.; Aloisi, Douglas B.; Eckert, Nathan L.
2015-01-01
Mean survival of three unionid mussel species exposed to FDP was not significantly different in the 50-, 100-, and 200-mg/L AI treatment groups and the 300-mg/L heat-deactivated treatment groups when compared to the control groups. Mean survival of O. olivaria and M. nervosa was significantly lower in the 300-mg/L AI treated groups (38.1 and 48.1 percent, respectively) compared to the control groups (71.9 and 88.1 percent, respectively). The results indicate that exposure to FDP-formulated P. fluorescens up to the maximum label concentration (100 mg/L AI) and up to three times the maximum label exposure duration (8 hours) is not likely to affect the survival of O. olivaria, A. ligamentina, and M. nervosa.
Corrêa, José Roberto Missel; Rocha, Fabrício Domingos; Peres, Alessandro Afonso; Gonçalves, Luiz Felipe; Manfro, Roberto Ceratti
2003-01-01
To evaluate the impact of HCV (hepatitis C virus) and HBV (hepatitis B virus) infection on long-term graft and patient survival in renal transplantation. One hundred and nine kidney allograft recipients were evaluated regarding the presence of antibodies against HCV and hepatitis B surface antigen. Patients were divided into four groups according to their serologic status and followed for ten years for survival analysis. Age, gender, renal failure etiology, and lengths of the previous dialysis and post-transplantation periods were evaluated. Time on dialysis was significantly longer in the anti-HCV positive group. There was also a higher number of patients with re-transplants in the HBV and HCV groups. There were no significant differences in 10-year patient survival in the anti-HCV positive group (71.0%; relative risk: 1.13; CI: 0.86-1.47) or in the HBV infected group (77.8%; relative risk: 1.03; CI: 0.7-1.5) compared to the non-infected group (80%). However, the group of patients infected with both viruses presented a significantly lower 10-year patient survival (37.5%; relative risk: 2.13; CI: 0.86-5.28) compared to the index group. There were no significant differences in graft survival among the groups. In the present study, renal transplant patients concomitantly infected with HBV and HCV presented significantly lower long-term patient survival.
Kwan, Sharon W; Harris, William P; Gold, Laura S; Hebert, Paul L
2018-06-01
The purpose of this study was to compare the clinical effectiveness of embolization with that of sorafenib in the management of hepatocellular carcinoma as practiced in real-world settings. This population-based observational study was conducted with the Surveillance, Epidemiology, and End Results-Medicare linked database. Patients 65 years old and older with a diagnosis of primary liver cancer between 2007 and 2011 who underwent embolization or sorafenib treatment were identified. Patients were excluded if they had insufficient claims records, a diagnosis of intrahepatic cholangiocarcinoma, or other primary cancer or had undergone liver transplant or combination therapy. The primary outcome of interest was overall survival. Inverse probability of treatment weighting models were used to control for selection bias. The inclusion and exclusion criteria were met by 1017 patients. Models showed good balance between treatment groups. Compared with those who underwent embolization, patients treated with sorafenib had significantly higher hazard of earlier death from time of treatment (hazard ratio, 1.87; 95% CI, 1.46-2.37; p < 0.0001) and from time of cancer diagnosis (hazard ratio, 1.87; 95% CI, 1.46-2.39; p < 0.0001). The survival advantage after embolization was seen in both intermediate- and advanced-stage disease. This comparative effectiveness study of Medicare patients with hepatocellular carcinoma showed significantly longer overall survival after treatment with embolization than with sorafenib. Because these findings conflict with expert opinion-based guidelines for treatment of advanced-stage disease, prospective randomized comparative trials in this subpopulation would be justified.
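Inverse probability of treatment weighting (IPTW), the selection-bias control used above, reweights each patient by the inverse probability of receiving the treatment they actually received. A minimal sketch with a single binary confounder (fabricated data, not the SEER-Medicare cohort), where the propensity score is just the proportion treated within each stratum:

```python
def iptw_weighted_mean(outcome, treated, stratum):
    """Per-arm outcome means after inverse probability of treatment weighting.
    With a single categorical confounder, the propensity score is simply the
    proportion treated within each stratum."""
    strata = set(stratum)
    prop = {s: sum(t for t, st in zip(treated, stratum) if st == s)
               / sum(1 for st in stratum if st == s)
            for s in strata}
    totals = {0: [0.0, 0.0], 1: [0.0, 0.0]}   # arm -> [sum(w * y), sum(w)]
    for y, t, s in zip(outcome, treated, stratum):
        # weight = 1 / P(received own treatment | stratum)
        w = 1.0 / prop[s] if t == 1 else 1.0 / (1.0 - prop[s])
        totals[t][0] += w * y
        totals[t][1] += w
    return {arm: totals[arm][0] / totals[arm][1] for arm in (0, 1)}

# confounded toy data: outcome equals the stratum, treatment is more likely
# in stratum 1, and there is no true treatment effect
means = iptw_weighted_mean(outcome=[0, 0, 0, 0, 1, 1, 1, 1],
                           treated=[1, 0, 0, 0, 1, 1, 1, 0],
                           stratum=[0, 0, 0, 0, 1, 1, 1, 1])
```

In this toy example the naive arm means (0.75 vs. 0.25) suggest a spurious treatment effect, while the weighted means are both 0.5: the weights give both arms the same covariate distribution, removing the confounding.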
ERIC Educational Resources Information Center
Suddendorf, T.; Busby, J.
2005-01-01
Mechanisms that produce behavior which increase future survival chances provide an adaptive advantage. The flexibility of human behavior is at least partly the result of one such mechanism, our ability to travel mentally in time and entertain potential future scenarios. We can study mental time travel in children using language. Current results…
Baudrot, Virgile; Preux, Sara; Ducrot, Virginie; Pave, Alain; Charles, Sandrine
2018-02-06
Toxicokinetic-toxicodynamic (TKTD) models, such as the General Unified Threshold model of Survival (GUTS), provide a consistent process-based framework compared to classical dose-response models for analyzing data sets that depend on both time and concentration. However, the extent to which GUTS models (Stochastic Death (SD) and Individual Tolerance (IT)) lead to a better fit than a classical dose-response model at a given target time (TT) has been poorly investigated. Our paper highlights that, for the studied data sets, GUTS estimates are generally more conservative and have reduced uncertainty, through smaller credible intervals, than classical TT approaches. Also, GUTS models enable estimating any x% lethal concentration at any time (LCx,t) and provide biological information on the internal processes occurring during the experiments. While both GUTS-SD and GUTS-IT models outcompete classical TT approaches, choosing one over the other is still challenging. Indeed, the estimates of survival rate over time and LCx,t are very close between both models, but our study also points out that the joint posterior distributions of SD model parameters are sometimes bimodal, while two parameters of the IT model seem strongly correlated. Therefore, the selection between these two models has to be supported by the experimental design and the biological objectives, and this paper provides some insights to drive this choice.
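For orientation, the GUTS-SD variant assumes scaled damage follows first-order kinetics and drives the hazard linearly above a threshold z. A sketch of the resulting survival curve under constant exposure (parameter values are arbitrary illustrations, not fitted estimates):

```python
import math

def guts_sd_survival(t, C, kd, z, b, hb, n_steps=10_000):
    """GUTS-SD survival at time t under constant exposure concentration C.
    Scaled damage: D(u) = C * (1 - exp(-kd * u)) (first-order kinetics).
    Hazard: h(u) = b * max(0, D(u) - z) + hb (threshold z, killing rate b,
    background hazard hb). Survival: S(t) = exp(-integral of h over [0, t])."""
    def hazard(u):
        damage = C * (1.0 - math.exp(-kd * u))
        return b * max(0.0, damage - z) + hb

    dt = t / n_steps
    integral = sum(0.5 * (hazard(i * dt) + hazard((i + 1) * dt)) * dt
                   for i in range(n_steps))   # trapezoidal rule
    return math.exp(-integral)

# below the threshold (C < z) only the background hazard acts, so survival
# reduces to exp(-hb * t); above it, survival drops faster
low = guts_sd_survival(t=5.0, C=1.0, kd=0.5, z=2.0, b=0.3, hb=0.02)
high = guts_sd_survival(t=5.0, C=10.0, kd=0.5, z=2.0, b=0.3, hb=0.02)
```

Inverting curves like this over a concentration grid is how any LCx,t can be read off at any time, one of the GUTS advantages the abstract highlights.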
Martin, Thomas E.; Riordan, Margaret M.; Repin, Rimi; Mouton, James C.; Blake, William M.
2017-01-01
Aim: Adult survival is central to theories explaining latitudinal gradients in life history strategies. Life history theory predicts higher adult survival in tropical than north temperate regions given lower fecundity and parental effort. Early studies were consistent with this prediction, but standard-effort netting studies in recent decades suggested that apparent survival rates in temperate and tropical regions strongly overlap. Such results do not fit with life history theory. Targeted marking and resighting of breeding adults yielded higher survival estimates in the tropics, but this approach is thought to overestimate survival because it does not sample social and age classes with lower survival. We compared the effect of field methods on tropical survival estimates and their relationships with life history traits. Location: Sabah, Malaysian Borneo. Time period: 2008–2016. Major taxon: Passeriformes. Methods: We used standard-effort netting and resighted individuals of all social and age classes of 18 tropical songbird species over 8 years. We compared apparent survival estimates between these two field methods with differing analytical approaches. Results: Estimated detection and apparent survival probabilities from standard-effort netting were similar to those from other tropical studies that used standard-effort netting. Resighting data verified that a high proportion of individuals that were never recaptured in standard-effort netting remained in the study area, and many were observed breeding. Across all analytical approaches, addition of resighting yielded substantially higher survival estimates than did standard-effort netting alone. These apparent survival estimates were higher than for temperate zone species, consistent with latitudinal differences in life histories. Moreover, apparent survival estimates from addition of resighting, but not from standard-effort netting alone, were correlated with parental effort as measured by egg temperature across species. Main conclusions: Inclusion of resighting showed that standard-effort netting alone can negatively bias apparent survival estimates and obscure life history relationships across latitudes and among tropical species.
Cancer survival among children of Turkish descent in Germany 1980–2005: a registry-based analysis
Spix, Claudia; Spallek, Jacob; Kaatsch, Peter; Razum, Oliver; Zeeb, Hajo
2008-01-01
Background: Little is known about the effect of migrant status on childhood cancer survival. We studied cancer survival among children of Turkish descent in the German Cancer Childhood Registry, one of the largest childhood cancer registries worldwide. Methods: We identified children of Turkish descent among cancer cases using a name-based approach. We compared 5-year survival probabilities of Turkish and other children in three time periods of diagnosis (1980–87, 1988–95, 1996–2005) using the Kaplan-Meier method and log-rank tests. Results: The 5-year survival probability for all cancers among 1774 cases of Turkish descent (4.76% of all 37,259 cases) was 76.9% compared to 77.6% in the comparison group (all other cases; p = 0.15). We found no age- or sex-specific survival differences (p-values between p = 0.18 and p = 0.90). For the period 1980–87, the 5-year survival probability among Turkish children with lymphoid leukaemia was significantly lower (62% versus 75.8%; p < 0.0001); this remains unexplained. For more recently diagnosed leukaemias, we saw no survival differences for Turkish and non-Turkish children. Conclusion: Our results suggest that nowadays Turkish migrant status has no bearing on the outcome of childhood cancer therapies in Germany. The inclusion of currently more than 95% of all childhood cancer cases in standardised treatment protocols is likely to contribute to this finding. PMID:19040749
Bridging locoregional therapy for hepatocellular carcinoma prior to liver transplantation.
Heckman, Jason T; Devera, Michael B; Marsh, J Wallis; Fontes, Paulo; Amesur, Nikhil B; Holloway, Shane E; Nalesnik, Michael; Geller, David A; Steel, Jennifer L; Gamblin, T Clark
2008-11-01
The impact of locoregional therapy prior to liver transplantation for hepatocellular carcinoma utilizing either transcatheter arterial chemoembolization (TACE), yttrium-90 ((90)Y), radiofrequency ablation (RFA), or resection prior to orthotopic liver transplantation (OLT) is largely unknown. We sought to examine locoregional therapies and their effect on survival compared with transplantation alone. A retrospective review of a prospectively collected database was performed; 123 patients were included. Patients were analyzed in two groups. Group I consisted of 50 patients that received therapy (20 TACE; 16 (90)Y; 13 RFA; 3 resections). Group II consisted of 73 patients transplanted without therapy. Median list time was 28 days (range 2-260 days) in group I, and 24 days (range 1-380 days) in group II. Median time from therapy to OLT was 3.8 months (range 9 days to 68 months). Twelve patients (24%) were successfully downstaged (8 TACE, 2 (90)Y, 2 RFA/resection). Overall 1-, 3-, and 5-year survival rates were 81%, 74%, and 74%, respectively. Survival was not statistically significantly different between the two groups (P = 0.53). The 12 patients downstaged did not have a significant difference in survival as compared with the patients who received therapy but did not respond or the patients who were transplanted without therapy (P = 0.76). Our report addresses locoregional therapy for hepatocellular carcinoma as a bridge to transplant. There was no statistical difference in overall survival between patients treated and those not treated prior to transplant. We provide further evidence that locoregional therapy is a safe tool for patients on the transplant list, does not impact survival, and can downstage selected patients to allow life-saving liver transplantation.
Lammert, Craig; Juran, Brian D.; Schlicht, Erik; Chan, Landon L.; Atkinson, Elizabeth J.; de Andrade, Mariza; Lazaridis, Konstantinos N.
2014-01-01
Background Biochemical response to Ursodeoxycholic Acid among patients with Primary Biliary Cirrhosis remains variable and there is no agreement of an ideal model. Novel assessment of response coupled to histologic progression was recently defined by the Toronto criteria. We retrospectively assessed transplant-free survival and clinical outcomes associated with Ursodeoxycholic Acid response to evaluate the Toronto criteria using a large North American cohort of PBC patients. Methods 398 PBC patients from the Mayo Clinic PBC Genetic Epidemiology (MCPGE) Registry were assessed for Ursodeoxycholic Acid treatment and biochemical response per the Toronto criteria. Responders were defined by reduction in alkaline phosphatase to less than or equal to 1.67 times the upper normal limit by 2 years of treatment, whereas non-responders had alkaline phosphatase values greater than 1.67 times the upper normal limit. Probability of survival was estimated using the Kaplan-Meier method. Results 302 (76%) patients were responders and 96 (24%) were non-responders. Significantly more non-responders developed adverse events related to chronic liver disease compared to responders (Hazard Ratio (HR): 2.77, P = 0.001). Biochemical responders and early-stage disease at treatment start was associated with improved overall transplant-free survival compared to non-responders (HR: 1.9) and patients with late stage disease (HR: 2.7) after age and sex adjustment. Conclusions The Toronto criteria are capable of identifying Ursodeoxycholic Acid-treated Primary Biliary Cirrhosis patients at risk of poor transplant-free survival and adverse clinical outcomes. Our data reveal that despite advanced disease at diagnosis, biochemical response per the Toronto criteria associates with improved overall transplant-free survival. PMID:24317935
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hristov, Boris; Reddy, Sushanth; Lin, Steven H.
Purpose: Surgery followed by chemotherapy and radiation (CRT) offers patients with pancreatic adenocarcinoma a chance for extended survival. In some patients, however, resection is difficult because of vascular involvement by the carcinoma, necessitating resection and grafting of the mesenterico-portal vessels. The purpose of this study was to compare outcomes between pancreaticoduodenectomy (PD) with and without mesenterico-portal vein resection (VR) in patients receiving adjuvant CRT for pancreatic adenocarcinoma. Methods and Materials: Between 1993 and 2005, 160 patients underwent PD with 5-FU-based adjuvant CRT followed by maintenance chemotherapy at the Johns Hopkins Hospital; 20 (12.5%) of the 160 underwent VR. Clinical outcomes, including median survival, overall survival, and complication rates, were assessed for both groups. Results: Patients who underwent VR had significantly longer operative times (p = 0.009), greater intraoperative blood loss (p = 0.01), and longer postoperative lengths of stay (p = 0.03). However, postoperative morbidity, median survival, and overall survival rates were similar between the two groups. Most patients (70%) from both groups were able to complete CRT, and a subgroup analysis demonstrated no appreciable differences in terms of complications. None of the VR patients who received adjuvant CRT developed veno-occlusive disease or graft failure/leakage. Conclusion: In a cohort of patients treated with adjuvant 5-FU-based CRT at the Johns Hopkins Hospital, having a VR at the time of PD resulted in similar complication rates and survival. These data support the feasibility and safety of adjuvant CRT in patients undergoing VR at the time of PD.
Alyahya, A; Khanum, A; Qudeimat, M
2018-02-01
To compare class II resin composite with preformed metal crowns (PMC) in the treatment of proximal dentinal caries in high caries-risk patients. The charts (270) of paediatric patients with proximal caries of their primary molars were reviewed. Success or failure of a procedure was assessed using the dental notes. Survival analysis was used to calculate the mean survival time (MST) for both procedures. The influence of variables on the mean survival time was investigated. A total of 593 class II resin composites and 243 PMCs were placed in patients ranging in age from 4 to 13 years. The failure percentage of class II resin composites was 22.6%, with the majority having been due to recurrent caries, while the failure percentage of PMCs was 15.2%, with the majority due to loss of the crown. There was no significant difference between the MST of class II resin composites and PMCs, 41.3 and 45.6 months respectively (p = 0.06). In class II resin composites, mesial restorations were associated with lower MST compared to distal restorations (p < 0.001). The MST of resin composites and PMCs were comparable when performed on high caries-risk patients.
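A mean survival time from a survival analysis is commonly the restricted mean: the area under the Kaplan-Meier step curve up to a follow-up horizon. A minimal sketch of that calculation; the curve format and the toy numbers are illustrative, not the study's software or data:

```python
def mean_survival_time(curve, max_time):
    """Restricted mean survival time: area under a Kaplan-Meier step curve.

    curve    -- [(event_time, survival_prob), ...] sorted by time, as
                produced by a Kaplan-Meier fit; survival is 1.0 before
                the first event time
    max_time -- horizon up to which the area is accumulated
    """
    area = 0.0
    prev_t, prev_s = 0.0, 1.0
    for t, s in curve:
        if t > max_time:
            break
        area += prev_s * (t - prev_t)  # curve is flat between event times
        prev_t, prev_s = t, s
    area += prev_s * (max_time - prev_t)  # tail up to the horizon
    return area


# toy curve: survival drops to 0.8 at t=1, 0.6 at t=2, 0.3 at t=4
rmst = mean_survival_time([(1, 0.8), (2, 0.6), (4, 0.3)], 5)
```

For the toy curve the restricted mean over [0, 5] is 1.0 + 0.8 + 1.2 + 0.3 = 3.3 time units.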
Henny, Charles J.; Hill, Elwood F.; Grove, Robert A.; Chelgren, Nathan; Haggerty, Patricia K.
2017-01-01
This telemetry study is an extension of our 1997–2006 research on historical mercury contamination on snowy egrets (Egretta thula) up to ~20 days of age. Findings from initial studies at the mercury-contaminated Carson River colony at Lahontan Reservoir (LR) and a similar-sized reference (REF) colony on the Humboldt River included mercury-related physiological, biochemical, histopathological and reproductive effects up to ~20 days of age; with poor water years (2000–04), i.e., reduced prey availability, exacerbating effects. Herein, we compare timing of dispersal and migration at LR vs. REF, but the primary question now addressed is whether survival of young mercury-exposed snowy egrets from LR would be further compromised beyond ~20 days of age. Based upon telemetry signals until 90–110 days of age (including dead bird counts and survival rate estimates), we conclude that mercury did not further compromise survival. Dead bird counts and survival rate estimates included time in the colony when fed by adults, plus the critical period when young dispersed from the colony to forage independently. The extended drought during this 3-year study was most critical in 2002, when production of ~20 d old egrets at LR was only 0.24 young/nest. In 2002, survival rates were low at both colonies and we documented the highest counts of dead egrets for both colonies. We suggest the losses in 2002 beyond 20 days of age were more a function of prey availability influenced by drought than exposure to mercury, especially at LR, because higher mercury concentrations, higher survival rates and fewer dead birds were documented at LR in 2003 when water conditions improved. Furthermore, total mercury (THg) in blood in 2003 was more than double that in 2002 (geometric mean, 3.39 vs. 1.47 µg/g wet weight (ww)).
This higher THg exposure at LR in 2003 was associated with a redistribution of parent and post-dispersal feeding activities upstream (where there was higher mercury from historic mining) related to slightly improved water levels. When comparing the 3-year telemetry findings based upon ~20 d old young at LR (blood THg, geo. means 1.47, 3.39 and 1.89 µg/g ww), we found no evidence that age at dispersal, Julian date at dispersal, timing of migration, or pre-migration survival (~20 to ~100 days post-hatch) were adversely affected by elevated mercury.
The two-sample problem with induced dependent censorship.
Huang, Y
1999-12-01
Induced dependent censorship is a general phenomenon in health service evaluation studies in which a measure such as quality-adjusted survival time or lifetime medical cost is of interest. We investigate the two-sample problem and propose two classes of nonparametric tests. Based on consistent estimation of the survival function for each sample, the two classes of test statistics examine the cumulative weighted difference in hazard functions and in survival functions. We derive a unified asymptotic null distribution theory and inference procedure. The tests are applied to trial V of the International Breast Cancer Study Group and show that long duration chemotherapy significantly improves time without symptoms of disease and toxicity of treatment as compared with the short duration treatment. Simulation studies demonstrate that the proposed tests, with a wide range of weight choices, perform well under moderate sample sizes.
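For context, the classical unweighted log-rank statistic, which the weighted hazard-difference tests above generalize, can be sketched as follows. This is the textbook two-sample construction, not the authors' estimator for induced dependent censorship; the toy data are illustrative:

```python
def log_rank(times_a, events_a, times_b, events_b):
    """Classical (unweighted) log-rank chi-square statistic, two samples.

    Accumulates observed-minus-expected events in group A across distinct
    event times, with the usual hypergeometric variance. Compare the
    result against a chi-square distribution with 1 degree of freedom.
    """
    pooled = [(t, e, "A") for t, e in zip(times_a, events_a)] + \
             [(t, e, "B") for t, e in zip(times_b, events_b)]
    event_times = sorted({t for t, e, _ in pooled if e == 1})
    o_minus_e = 0.0
    var = 0.0
    for t in event_times:
        n = sum(1 for tt, _, _ in pooled if tt >= t)       # at risk, pooled
        n_a = sum(1 for tt, _, g in pooled if tt >= t and g == "A")
        d = sum(1 for tt, e, _ in pooled if tt == t and e == 1)
        d_a = sum(1 for tt, e, g in pooled
                  if tt == t and e == 1 and g == "A")
        o_minus_e += d_a - d * n_a / n                     # observed - expected
        if n > 1:
            var += d * (n - d) / (n - 1) * n_a * (n - n_a) / n ** 2
    return o_minus_e ** 2 / var


# toy example: group A dies early (t=1, 2), group B late (t=3, 4)
stat = log_rank([1, 2], [1, 1], [3, 4], [1, 1])
```

Weighted variants multiply each per-time term by a weight function of t; the paper's contribution is making such comparisons valid when censoring is induced by the quality-adjustment or cost accumulation itself.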
Effects of fish cues on mosquito larvae development.
Silberbush, Alon; Abramsky, Zvika; Tsurim, Ido
2015-10-01
We investigated the effects of predator-released kairomones on life history traits of larval Culex pipiens (Linnaeus). We compared the development time and survival of sibling larvae, reared in either water conditioned by the presence of Gambusia affinis (Baird and Girard) or fishless control-water. Our results indicate that larvae developing in fish-conditioned water (FCW) pupated faster than larvae in fishless-control water. The effect of FCW on larval survival was evident only in females. Surprisingly, FCW increased female survival. In both development-time and survival, boiling the water eliminated the FCW effect, supporting our hypothesis that fish conditioning is based on kairomones. Accelerated metamorphosis in response to predator released kairomones, evident in our results, is a rarely described phenomenon. Intuitively, when exposed to predator associated signals, aquatic larvae should metamorphose earlier to escape the higher risk of predation. However, theoretical models predict this outcome only under specific conditions. Indeed, longer - rather than shorter - time to metamorphosis is usually observed in response to predation risk. We argue that the response of larval mosquitoes to predation risk is context-dependent. Shortening larval development time may not be an exceptional response, but rather represents a part of a response spectrum that depends on the level of predation risk and resource abundance. Copyright © 2015 Elsevier B.V. All rights reserved.
Survival of postfledging Forster's terns in relation to mercury exposure in San Francisco Bay
Ackerman, Joshua T.; Eagles-Smith, Collin A.; Takekawa, John Y.; Iverson, S.A.
2008-01-01
We examined factors influencing mercury concentrations in 90 fledgling Forster's terns (Sterna forsteri) and evaluated whether mercury influenced postfledging survival in San Francisco Bay, California. Mercury concentrations (±SE) in chicks 21-29 days old (just before fledging) were 0.33 ± 0.01 µg g-1 ww for blood and 6.44 ± 0.28 µg g-1 fw for breast feathers. Colony site had an overriding influence on fledgling contamination; however, hatching date and age also affected blood, but not feather, mercury concentrations. Blood mercury concentrations decreased by 28% during the 50-day hatching period and increased with chick age by 30% during the last week prior to fledging. Using radio-telemetry, we calculated that cumulative survival during the 35-day postfledging time period was 0.81 ± 0.09 (SE). Postfledging survival rates increased with size-adjusted mass, and cumulative survival probability was 61% lower for terns with the lowest, compared to the highest, observed masses. Conversely, survival was not influenced by blood mercury concentration, time since fledging, sex, or hatch date. Mercury concentrations in breast feathers of fledglings found dead at nesting colonies also were no different than those in live chicks. Our results indicate that colony site, hatching date, and age influenced mercury concentrations in fledgling Forster's terns, but that mercury did not influence postfledging survival. © 2008 Springer Science+Business Media, LLC.
Rauh-Hain, J Alejandro; Starbuck, Kristen D; Meyer, Larissa A; Clemmer, Joel; Schorge, John O; Lu, Karen H; Del Carmen, Marcela G
2015-10-01
Evaluate rates of chemotherapy and radiotherapy delivery in the treatment of uterine carcinosarcoma, and compare clinical outcomes of treated and untreated patients. The National Cancer Database was queried to identify patients diagnosed with uterine carcinosarcoma between 2003 and 2011. The impact of chemotherapy on survival was analyzed using the Kaplan-Meier method. Factors predictive of outcome were compared using the Cox proportional hazards model. A total of 10,609 patients met study eligibility criteria. Stages I, II, III, and IV disease accounted for 2997 (28.2%), 642 (6.1%), 2037 (19.2%), and 1316 (12.4%) of the study population, respectively. Most patients (91.0%) underwent definitive surgery, and lymphadenectomy was performed in 68.7% of the patients. Chemotherapy was administered to 2378 (22.4%) patients, radiotherapy to 2196 (20.7%), and adjuvant chemo-radiation to 1804 (17.0%); 4231 women (39.9%) did not receive adjuvant therapy. Utilization of chemotherapy became more frequent over time. Over the entire study period, after adjusting for race, period of diagnosis, facility location, facility type, insurance provider, stage, age, treatment modality, lymph node dissection, socioeconomic status, and comorbidity index, there was an association between treatment modality and survival. The lowest hazard ratio observed was in patients that received chemo-radiation. The strongest quantitative predictor of death was stage at the time of diagnosis. In addition, surgical treatment, lymph node dissection, most recent time-periods, lower comorbidity index, and higher socioeconomic status were associated with improved survival. The overall rates of chemotherapy use have increased over time. Adjuvant chemotherapy and chemo-radiation were associated with improved survival. Copyright © 2015 Elsevier Inc. All rights reserved.
Kim, Su Jin; Kim, Hyun Jung; Lee, Hee Young; Ahn, Hyeong Sik; Lee, Sung Woo
2016-06-01
The objective was to determine whether extracorporeal cardiopulmonary resuscitation (ECPR), when compared with conventional cardiopulmonary resuscitation (CCPR), improves outcomes in adult patients, and to determine appropriate conditions that can predict good survival outcome in ECPR patients through a meta-analysis. We searched the relevant literature of comparative studies between ECPR and CCPR in adults, from the MEDLINE, EMBASE, and Cochrane databases. The baseline information and outcome data (survival, good neurologic outcome at discharge, at 3-6 months, and at 1 year after arrest) were extracted. The beneficial effect of ECPR on outcome was analyzed according to time interval, location of arrest (out-of-hospital cardiac arrest (OHCA) and in-hospital cardiac arrest (IHCA)), and pre-defined population inclusion criteria (witnessed arrest, initial shockable rhythm, cardiac etiology of arrest and CPR duration) by using Review Manager 5.3. Cochran's Q test and I(2) were calculated. Ten of 1583 publications were included. Although survival to discharge did not show clear superiority in OHCA, ECPR showed statistically improved survival and good neurologic outcome as compared to CCPR, especially at 3-6 months after arrest. In the subgroup of patients with pre-defined inclusion criteria, the pooled meta-analysis found similar results in studies with pre-defined criteria. Survival and good neurologic outcome tended to be superior in the ECPR group at 3-6 months after arrest. The effect of ECPR on survival to discharge in OHCA was not clearly shown. As ECPR showed better outcomes than CCPR in studies with pre-defined criteria, strict indication criteria should be considered when implementing ECPR. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
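The heterogeneity measures cited above, Cochran's Q and I(2), are computed from per-study effect estimates and their variances. A minimal sketch under a fixed-effect inverse-variance model; the numbers below are toy values, not data from this meta-analysis:

```python
def cochran_q_i2(effects, variances):
    """Cochran's Q and I^2 for a set of study-level effect estimates.

    effects   -- per-study effect sizes (e.g., log odds ratios)
    variances -- corresponding sampling variances
    """
    weights = [1.0 / v for v in variances]
    # fixed-effect (inverse-variance) pooled estimate
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    # Q: weighted squared deviations of study effects from the pooled effect
    q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
    df = len(effects) - 1
    # I^2: proportion of variation beyond what chance (df) would explain
    i2 = max(0.0, (q - df) / q) if q > 0 else 0.0
    return q, i2


# two toy studies with effects 0.0 and 2.0, both with unit variance
q, i2 = cochran_q_i2([0.0, 2.0], [1.0, 1.0])
```

With identical study effects Q is 0 and I(2) is 0; as study effects diverge relative to their sampling error, I(2) climbs toward 1.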
Cantekin, Kenan; Delikan, Ebru; Cetin, Secil
2014-01-01
Objective: The purposes of this research were to (1) compare the shear-peel bond strength (SPBS) of a band of a fixed space maintainer (SM) cemented with five different adhesive cements; and (2) compare the survival time of bands of SM with each cement type after simulating mechanical fatigue stress. Materials and Methods: Seventy-five teeth were used to assess retentive strength and another 50 teeth were used to assess the fatigue survival time. SPBS was determined with a universal testing machine. Fatigue testing was conducted in a ball mill device. Results: The mean survival time of bands cemented with R & D series Nova Glass-LC (6.2 h), Transbond Plus (6.7 h), and R & D series Nova Resin (6.8 h) was significantly longer than for bands cemented with Ketac-Cem (5.4 h) and GC Equia (5.2 h) (P < 0.05). Conclusion: Although traditional glass ionomer cement (GIC) presented higher retentive strength than resin-based cements (resin, resin-modified GIC, and compomer cement), resin-based cements, especially dual cure resin cement (Nova Resin) and compomer (Transbond Plus), can be expected to have lower failure rates for band cementation than GIC (Ketac-Cem) in light of the results of the ball mill test. PMID:25202209
Integration of RNA-Seq and RPPA data for survival time prediction in cancer patients.
Isik, Zerrin; Ercan, Muserref Ece
2017-10-01
Integration of several types of patient data in a computational framework can accelerate the identification of more reliable biomarkers, especially for prognostic purposes. This study aims to identify biomarkers that can successfully predict the potential survival time of a cancer patient by integrating the transcriptomic (RNA-Seq), proteomic (RPPA), and protein-protein interaction (PPI) data. The proposed method, RPBioNet, employs a random walk-based algorithm that works on a PPI network to identify a limited number of protein biomarkers. Later, the method uses gene expression measurements of the selected biomarkers to train a classifier for the survival time prediction of patients. RPBioNet was applied to classify kidney renal clear cell carcinoma (KIRC), glioblastoma multiforme (GBM), and lung squamous cell carcinoma (LUSC) patients based on their survival time classes (long- or short-term). The RPBioNet method correctly identified the survival time classes of patients with between 66% and 78% average accuracy for three data sets. RPBioNet operates with only 20 to 50 biomarkers and can achieve on average 6% higher accuracy compared to the closest alternative method, which uses only RNA-Seq data in the biomarker selection. Further analysis of the most predictive biomarkers highlighted genes that are common for both cancer types, as they may be driver proteins responsible for cancer progression. The novelty of this study is the integration of a PPI network with mRNA and protein expression data to identify more accurate prognostic biomarkers that can be used for clinical purposes in the future. Copyright © 2017 Elsevier Ltd. All rights reserved.
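The random walk-based ranking step on a PPI network can be sketched as a generic random walk with restart. This is not the RPBioNet implementation; the toy network, seed set, and restart probability are all illustrative:

```python
def random_walk_with_restart(adj, seeds, restart=0.3, tol=1e-10):
    """Rank network nodes by the steady-state probability of a restarting walk.

    adj   -- adjacency lists {node: [neighbors]} of an undirected network
             (every node must have at least one neighbor)
    seeds -- nodes the walker restarts from (e.g., known disease proteins)
    """
    nodes = list(adj)
    p0 = {n: (1.0 / len(seeds) if n in seeds else 0.0) for n in nodes}
    p = dict(p0)
    while True:
        # restart mass goes back to the seeds each step
        nxt = {n: restart * p0[n] for n in nodes}
        # remaining mass spreads uniformly over each node's neighbors
        for n in nodes:
            share = (1.0 - restart) * p[n] / len(adj[n])
            for m in adj[n]:
                nxt[m] += share
        if sum(abs(nxt[n] - p[n]) for n in nodes) < tol:
            return nxt
        p = nxt


# toy triangle network seeded at node "a"
adj = {"a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b"]}
scores = random_walk_with_restart(adj, {"a"})
```

Nodes close to the seeds accumulate the most probability mass; in a biomarker setting, the top-ranked proteins would then be carried forward to the classifier.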
Ohara, M; Lu, H; Shiraki, K; Ishimura, Y; Uesaka, T; Katoh, O; Watanabe, H
2001-12-01
The radioprotective effect of miso, a fermentation product from soy bean, was investigated with reference to the survival time, crypt survival and jejunum crypt length in male B6C3F1 mice. Miso at three different fermentation stages (early-, medium- and long-term fermented miso) was mixed into MF diet biscuits at 10% and administered from 1 week before irradiation. Animal survival in the long-term fermented miso group was significantly prolonged as compared with the short-term fermented miso and MF cases after 8 Gy of 60Co-gamma-ray irradiation at a dose rate of 2 Gy min(-1). Delay in mortality was evident in all three miso groups, with significantly increased survival. At doses of 10 and 12 Gy X-irradiation at a dose rate of 4 Gy min(-1), the treatment with long-term fermented miso significantly increased crypt survival. Also the protective influence against irradiation in terms of crypt lengths in the long-term fermented miso group was significantly greater than in the short-term or medium-term fermented miso and MF diet groups. Thus, prolonged fermentation appears to be very important for protection against radiation effects.
Kuk, Deborah; Shoushtari, Alexander N; Barker, Christopher A; Panageas, Katherine S; Munhoz, Rodrigo R; Momtaz, Parisa; Ariyan, Charlotte E; Brady, Mary Sue; Coit, Daniel G; Bogatch, Kita; Callahan, Margaret K; Wolchok, Jedd D; Carvajal, Richard D; Postow, Michael A
2016-07-01
Subtypes of melanoma, such as mucosal, uveal, and acral, are believed to result in worse prognoses than nonacral cutaneous melanoma. After a diagnosis of distant metastatic disease, however, the overall survival of patients with mucosal, uveal, acral, nonacral cutaneous, and unknown primary melanoma has not been directly compared. We conducted a single-center, retrospective analysis of 3,454 patients with melanoma diagnosed with distant metastases from 2000 to 2013, identified from a prospectively maintained database. We examined melanoma subtype, date of diagnosis of distant metastases, age at diagnosis of metastasis, gender, and site of melanoma metastases. Of the 3,454 patients (237 with mucosal, 286 with uveal, 2,292 with nonacral cutaneous, 105 with acral cutaneous, and 534 with unknown primary melanoma), 2,594 died. The median follow-up was 46.1 months. The median overall survival for those with mucosal, uveal, acral, nonacral cutaneous, and unknown primary melanoma was 9.1, 13.4, 11.4, 11.7, and 10.4 months, respectively. Patients with uveal melanoma, cutaneous melanoma (acral and nonacral), and unknown primary melanoma had similar survival, but patients with mucosal melanoma had worse survival. Patients diagnosed with metastatic melanoma in 2006-2010 and 2011-2013 had better overall survival than patients diagnosed in 2000-2005. In a multivariate model, patients with mucosal melanoma had inferior overall survival compared with patients with the other four subtypes. Additional research and advocacy are needed for patients with mucosal melanoma because of their shorter overall survival in the metastatic setting. Despite distinct tumor biology, the survival was similar for those with metastatic uveal melanoma, acral, nonacral cutaneous, and unknown primary melanoma. Uveal, acral, and mucosal melanoma are assumed to result in a worse prognosis than nonacral cutaneous melanoma or unknown primary melanoma. 
No studies, however, have been conducted assessing the overall survival of patients with these melanoma subtypes starting at the time of distant metastatic disease. The present study found that patients with uveal, acral, nonacral cutaneous, and unknown primary melanoma have similar overall survival after distant metastases have been diagnosed. These findings provide information for oncologists to reconsider previously held assumptions and appropriately counsel patients. Patients with mucosal melanoma have worse overall survival and are thus a group in need of specific research and advocacy. ©AlphaMed Press.
Effects of surgically implanted transmitters on reproduction and survival in mallards
Sheppard, Jennifer; Arnold, Todd W.; Amundson, Courtney L.; Klee, David
2017-01-01
Abdominally implanted radiotransmitters have been widely used in studies of waterbird ecology; however, the longer handling times and invasiveness of surgical implantation raise important concerns about animal welfare and potential effects on data quality. Although it is difficult to assess effects of handling and marking wild animals by comparing them with unmarked controls, insights can often be obtained by evaluating variation in handling or marking techniques. Here, we used data from 243 female mallards (Anas platyrhynchos) and mallard–grey duck hybrids (A. platyrhynchos × A. superciliosa) equipped with fully encapsulated abdominally implanted radiotransmitters from 2 study sites in New Zealand during 2014–2015 to assess potential marking effects. We evaluated survival, dispersal, and reproductive effort (e.g., breeding propensity, nest initiation date, clutch size) in response to 3 different attributes of handling duration and procedures: 1) processing time, including presurgery banding, measurements, and blood sampling of unanaesthetized birds; 2) surgery time from initiation to cessation of anesthetic; and 3) total holding time from first capture until release. We found no evidence that female survival, dispersal probability, or reproductive effort were negatively affected by holding, processing, or surgery time and concluded that we collected reliable data without compromising animal welfare. Our results support previous research that techniques using fully encapsulated abdominal-implant radiotransmitters are suitable to enable researchers to obtain reliable estimates of reproductive performance and survival.
Shiovitz, Stacey; Bertagnolli, Monica M.; Renfro, Lindsay A.; Nam, Eunmi; Foster, Nathan R.; Dzieciatkowski, Slavomir; Luo, Yanxin; Lao, Victoria Valinluck; Monnat, Raymond J.; Emond, Mary J.; Maizels, Nancy; Niedzwiecki, Donna; Goldberg, Richard M.; Saltz, Leonard B.; Venook, Alan; Warren, Robert S.; Grady, William M.
2014-01-01
BACKGROUND & AIMS The CpG island methylator phenotype (CIMP), defined by a high frequency of aberrantly methylated genes, is a characteristic of a subclass of colon tumors with distinct clinical and molecular features. Cohort studies have produced conflicting results on responses of CIMP-positive tumors to chemotherapy. We assessed the association between tumor CIMP status and survival of patients receiving adjuvant fluorouracil and leucovorin alone or with irinotecan (IFL). METHODS We analyzed data from patients with stage 3 colon adenocarcinoma randomly assigned to groups given fluorouracil and leucovorin or IFL following surgery, from April 1999 through April 2001. The primary endpoint of the trial was overall survival and the secondary endpoint was disease-free survival. DNA isolated from available tumor samples (n=615) was used to determine CIMP status based on methylation patterns at the CACNA1G, IGF2, NEUROG1, RUNX3, and SOCS1 loci. The effects of CIMP on survival were modeled using Kaplan-Meier and Cox proportional hazards; interactions with treatment and BRAF, KRAS, and mismatch repair (MMR) status were also investigated. RESULTS Of the tumor samples characterized for CIMP status, 145 were CIMP positive (23%). Patients with CIMP-positive tumors had shorter overall survival times than patients with CIMP-negative tumors (hazard ratio [HR]=1.36; 95% confidence interval [CI], 1.01–1.84). Treatment with IFL showed a trend toward increased overall survival for patients with CIMP-positive tumors, compared to treatment with fluorouracil and leucovorin (HR=0.62; 95% CI, 0.37–1.05; P=.07), but not for patients with CIMP-negative tumors (HR=1.38; 95% CI, 1.00–1.89; P=.049). In a 3-way interaction analysis, patients with CIMP-positive, MMR-intact tumors benefited most from the addition of irinotecan to fluorouracil and leucovorin therapy (for the interaction, P=.01). CIMP was more strongly associated with response to IFL than MMR status. 
Results for disease-free survival times were comparable among all analyses. CONCLUSION Patients with stage 3, CIMP-positive, MMR-intact colon tumors have longer survival times when irinotecan is added to combination therapy with fluorouracil and leucovorin. PMID:24859205
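The survival comparisons in this record rest on the Kaplan-Meier estimator. As an illustrative sketch only (not the study's actual analysis, which also involved Cox proportional-hazards modeling), a minimal Kaplan-Meier implementation for right-censored data in Python:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate from right-censored data.

    times  -- follow-up time for each subject
    events -- 1 if the event (e.g. death) was observed, 0 if censored
    Returns a list of (time, survival_probability) at each event time.
    """
    # Sort subjects by follow-up time so the risk set shrinks correctly.
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        # Deaths and total subjects leaving the risk set at time t.
        deaths = sum(e for (u, e) in data if u == t)
        leaving = sum(1 for (u, _) in data if u == t)
        if deaths > 0:
            survival *= 1.0 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= leaving
        # Advance past all subjects sharing this time point.
        while i < len(data) and data[i][0] == t:
            i += 1
    return curve
```

Censored subjects reduce the risk set without producing a step in the curve, which is what distinguishes this estimator from a naive survival fraction.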
Powell, D.B.; Palm, R.C.; MacKenzie, A.P.; Winton, J.R.
2009-01-01
The effects of temperature, ionic strength, and new cryopreservatives derived from polar ice bacteria were investigated to help accelerate the development of economical, live attenuated vaccines for aquaculture. Extracts of the extremophile Gelidibacter algens functioned very well as part of a lyophilization cryoprotectant formulation in a 15-week storage trial. The bacterial extract and trehalose additives resulted in significantly higher colony counts of columnaris bacteria (Flavobacterium columnare) compared to nonfat milk or physiological saline at all time points measured. The bacterial extract combined with trehalose appeared to enhance the relative efficiency of recovery and growth potential of columnaris in flask culture compared to saline, nonfat milk, or trehalose-only controls. Pre-lyophilization temperature treatments significantly affected F. columnare survival following rehydration. A 30-min exposure at 0 °C resulted in a 10-fold increase in bacterial survival following rehydration compared to mid-range temperature treatments. The brief 30 and 35 °C pre-lyophilization exposures appeared to be detrimental to the rehydration survival of the bacteria. The survival of F. columnare through the lyophilization process was also strongly affected by changes in ionic strength of the bacterial suspension. Changes in rehydration constituents were also found to be important in promoting increased survival and growth. As the sodium chloride concentration increased, the viability of rehydrated F. columnare decreased. © 2009 Elsevier Inc.
Long survival in acute myelogenous leukaemia: an international collaborative study.
Whittaker, J A; Reizenstein, P; Callender, S T; Cornwell, G G; Delamore, I W; Gale, R P; Gobbi, M; Jacobs, P; Lantz, B; Maiolo, A T; Rees, J K; Van Slyck, E J; Van, H V
1981-01-01
A group of 82 adult patients with acute myelogenous leukaemia who had survived in continuous first remission for more than three years was studied. These long-surviving patients were being treated at 12 referral centres in Europe and the USA, and they were compared with other patients with acute myelogenous leukaemia from 10 of these centres. There was no clear difference in the amount of induction chemotherapy or the time taken to achieve remission. Immunotherapy was not found to improve chances of long-term survival. The 82 patients were also compared with a group of 115 patients; there was no appreciable difference in the number of blood or marrow myeloblasts between these two groups at presentation, but the long survivors had significantly higher initial platelet counts and were slightly younger. The long survivors also tended to have a lower total white cell count at presentation and lower granulocyte counts; there was no obvious explanation for these differences. Eight of the 82 patients relapsed from three to four years after remission and two (of 69 patients) after four to five years. Thereafter relapse was rare, and it seems likely that some of the 40 patients who have survived for five years or more are cured. PMID:6781618
NASA Astrophysics Data System (ADS)
McNeeley, Kathleen M.; Annapragada, Ananth; Bellamkonda, Ravi V.
2007-09-01
Liposomal and other nanocarrier-based drug delivery vehicles can localize to tumours through passive and/or active targeting. Passively targeted liposomal nanocarriers accumulate in tumours via 'leaky' vasculature through the enhanced permeability and retention (EPR) effect. Passive accumulation depends upon the circulation time and the degree of tumour vessel 'leakiness'. After extravasation, actively targeted liposomal nanocarriers efficiently deliver their payload by receptor-mediated uptake. However, incorporation of targeting moieties can compromise circulation time in the blood due to recognition and clearance by the reticuloendothelial system, decreasing passive accumulation. Here, we compare the efficacy of passively targeted doxorubicin-loaded PEGylated liposomal nanocarriers to that of actively targeted liposomal nanocarriers in a rat 9L brain tumour model. Although folate receptor (FR)-targeted liposomal nanocarriers had significantly reduced blood circulation time compared to PEGylated liposomal nanocarriers, intratumoural drug concentrations both at 20 and 50 h after administration were equal for both treatments. Both treatments significantly increased survival of tumour-inoculated animals by 60-80% compared to non-treated controls, but no difference in survival was observed between FR-targeted and passively targeted nanocarriers. Therefore, alternate approaches allowing for active targeting without compromising circulation time may be important for fully realizing the benefits of receptor-mediated active targeting of gliomas.
Tang, Z H; Geng, Z M; Chen, C; Si, S B; Cai, Z Q; Song, T Q; Gong, P; Jiang, L; Qiu, Y H; He, Y; Zhai, W L; Li, S P; Zhang, Y C; Yang, Y
2018-05-01
Objective: To investigate the clinical value of Bayesian networks in predicting the survival of patients with advanced gallbladder cancer (GBC) who underwent curative-intent surgery. Methods: The clinical data of patients with advanced GBC who underwent curative-intent surgery in 9 institutions from January 2010 to December 2015 were analyzed retrospectively. A median survival time model based on a tree-augmented naïve Bayes algorithm was established with BayesiaLab software. The survival time, number of metastatic lymph nodes (NMLN), T stage, pathological grade, margin, jaundice, liver invasion, age, sex, and tumor morphology were included in this model. A confusion matrix, the receiver operating characteristic curve, and the area under the curve were used to evaluate the accuracy of the model. A priori statistical analysis of these 10 variables and a posterior analysis (with survival time as the target variable and the remaining factors as attribute variables) were performed. The importance ranking of each variable was calculated with the polymorphic Birnbaum importance calculation based on the posterior analysis results. A survival probability forecast table was constructed based on the top 4 prognostic factors. Survival curves were drawn by the Kaplan-Meier method, and differences between survival curves were compared using the log-rank test. 
Results: A total of 316 patients were enrolled, including 109 males and 207 females (male-to-female ratio 1.0:1.9); the mean age was 62.0±10.8 years. There were 298 cases (94.3%) of R0 resection and 18 cases (5.7%) of R1 resection. By T stage, 287 cases (90.8%) were T3 and 29 cases (9.2%) were T4. The median survival time (MST) was 23.77 months, and the 1-, 3-, and 5-year survival rates were 67.4%, 40.8%, and 32.0%, respectively. The Bayesian model correctly predicted 121 cases (≤23.77 months) and 115 cases (>23.77 months), for an accuracy of 74.86%. The prior probability of survival time was 0.5032 (≤23.77 months) and 0.4968 (>23.77 months), and the importance ranking showed that NMLN (0.3666), margin (0.3501), T stage (0.3192), and pathological grade (0.2589) were the top 4 prognostic factors influencing the postoperative MST. These four factors were taken as observation variables to obtain the probability of patients surviving different periods. On this basis, a survival prediction score system including NMLN, margin, T stage, and pathological grade was designed; the median survival times (months) for total scores of 4 to 9 were 66.8, 42.4, 26.0, 9.0, 7.5, and 2.3, respectively, with a statistically significant difference between scores (P<0.01). Conclusions: The survival prediction model for GBC based on a Bayesian network has high accuracy. NMLN, margin, T stage, and pathological grade are the top 4 risk factors affecting the survival of patients with advanced GBC who underwent curative resection. The survival prediction score system based on these four factors could be used to predict survival and to guide decision making for patients with advanced GBC.
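The reported score-to-survival mapping can be captured in a small lookup. The median survival values come from the abstract above; the function name is a hypothetical convenience, and the abstract does not specify how individual points are assigned to each of the four factors:

```python
# Median survival time (months) by total prognostic score, as reported
# for the four-factor score system (NMLN, margin, T stage, grade).
MST_BY_SCORE = {4: 66.8, 5: 42.4, 6: 26.0, 7: 9.0, 8: 7.5, 9: 2.3}

def predicted_mst(score):
    """Look up the reported median survival time for a total score of 4-9."""
    if score not in MST_BY_SCORE:
        raise ValueError("score must be between 4 and 9")
    return MST_BY_SCORE[score]
```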
Fay, Michael P; Follmann, Dean A; Lynn, Freyja; Schiffer, Jarad M; Stark, Gregory V; Kohberger, Robert; Quinn, Conrad P; Nuzum, Edwin O
2012-09-12
Because clinical trials to assess the efficacy of vaccines against anthrax are not ethical or feasible, licensure for new anthrax vaccines will likely involve the Food and Drug Administration's "Animal Rule," a set of regulations that allow approval of products based on efficacy data only in animals combined with immunogenicity and safety data in animals and humans. U.S. government-sponsored animal studies have shown anthrax vaccine efficacy in a variety of settings. We examined data from 21 of those studies to determine whether an immunological bridge based on lethal toxin neutralization activity assay (TNA) can predict survival against an inhalation anthrax challenge within and across species and genera. The 21 studies were classified into 11 different settings, each of which had the same animal species, vaccine type and formulation, vaccination schedule, time of TNA measurement, and challenge time. Logistic regression models determined the contribution of vaccine dilution dose and TNA on prediction of survival. For most settings, logistic models using only TNA explained more than 75% of the survival effect of the models with dose additionally included. Cross-species survival predictions using TNA were compared to the actual survival and shown to have good agreement (Cohen's κ ranged from 0.55 to 0.78). In one study design, cynomolgus macaque data predicted 78.6% survival in rhesus macaques (actual survival, 83.0%) and 72.6% in rabbits (actual survival, 64.6%). These data add support for the use of TNA as an immunological bridge between species to extrapolate data in animals to predict anthrax vaccine effectiveness in humans.
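At prediction time, the immunological bridge described here reduces to evaluating a fitted logistic model at a given TNA titer. A minimal sketch, with purely hypothetical coefficients (the studies' fitted values are not reported in this abstract):

```python
import math

def predicted_survival(tna, intercept=-2.0, slope=1.5):
    """Predicted probability of surviving challenge from a TNA titer.

    The default intercept and slope are hypothetical placeholders, not
    fitted values from the studies; log10 transformation of the titer is
    a common modeling choice, assumed here for illustration.
    """
    logit = intercept + slope * math.log10(tna)
    return 1.0 / (1.0 + math.exp(-logit))
```

Cross-species prediction then amounts to evaluating one species' fitted curve at another species' observed titers.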
Karami, Azade; Bakhtiari, Mitra; Azadbakht, Mehri; Ghorbani, Rostam; Khazaei, Mozafar; Rezaei, Mansour
2017-06-01
Oocyte incubation time before freezing is one of the factors affecting oocyte vitrification. In assisted reproductive technology (ART) clinics, it is sometimes decided to perform oocyte vitrification after a long incubation time due to various conditions, such as inability to collect semen samples, unsuccessful urological interventions (PESA, TESE, etc.), or other unexpected circumstances. A time factor of up to 6 h has been studied in the available reports. Therefore, this study was designed to evaluate oocyte incubation times before freezing of 0, 6, 12, 18, and 24 h after retrieval. Metaphase II (MII) oocytes were obtained from NMRI female mice via hormonal stimulation and, following retrieval, were randomly divided into five groups (freezing at 0, 6, 12, 18, and 24 h) and entered into the vitrification-warming process. The thawed oocytes were evaluated according to the survival criteria and then inseminated with sperm from male mice for in vitro fertilization. The next day, the embryo formation rate and embryo quality were assessed. Our results demonstrated that even after 24 h of incubation, the survival rate of oocytes was 51.35%, with an embryo formation rate of 73.21%. However, the survival and embryo formation rates significantly decreased within 12, 18, and 24 h after retrieval compared to the group vitrified at 0 h. The embryo quality was significantly reduced by vitrification at 0 to 24 h after retrieval. According to our data, although a prolonged incubation time before freezing reduced the survival rate, there was still a chance for oocytes to stay alive with acceptable embryo formation and quality rates after vitrification warming.
Wang, Yu-Chen; Huang, Ying-Ying; Lo, Ping-Hang; Chang, Kuan-Cheng; Chen, Chu-Huang; Chen, Ming-Fong
2016-11-01
To investigate the age-dependent impact of the superfast door-to-balloon (D2B) times of ≤60 min, as recommended by the new ESC Guideline for patients with ST elevation myocardial infarction (STEMI) undergoing primary percutaneous coronary intervention (PPCI), on mid-term survival rates based on a single-center registry dataset. This study enrolled consecutive STEMI patients who underwent PPCI from Jan 1, 2009 through Sep 30, 2013. We compared demographics, clinical characteristics, and the D2B-survival relationships between patients aged ≥65 and <65. The younger group comprised 242 patients (68%) aged <65 and the older group consisted of 115 patients (32%) aged ≥65. In patients aged <65, the mortality rate decreased linearly with D2B time shortening (>90 min vs. 61-90 min vs. ≤60 min = 14.9% vs. 13.3% vs. 1.2%, P=0.001). In contrast, shortening of D2B time was not associated with a reduced mortality rate in patients aged ≥65 (>90 min vs. 61-90 min vs. ≤60 min = 23.5% vs. 19% vs. 18.9%, P=0.99). In younger patients but not the elderly, a D2B time of <60 min had sufficient power to predict mortality, with a sensitivity of 0.83, specificity of 0.74, and Youden index of 0.57. Our results show that the new ESC Guideline recommendation of D2B time ≤60 min is associated with better survival rates in younger STEMI patients undergoing PPCI. Our findings stress the importance of guideline adherence to minimize reperfusion delay to improve survival in these patients. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
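The Youden index reported above is simply sensitivity plus specificity minus one; a one-line function reproduces the reported value:

```python
def youden_index(sensitivity, specificity):
    """Youden's J statistic for a diagnostic or prognostic cutoff."""
    return sensitivity + specificity - 1.0

# Values reported for the D2B cutoff in patients aged <65:
# sensitivity 0.83, specificity 0.74 -> J = 0.57.
j = youden_index(0.83, 0.74)
```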
Drabik, Anna; Büscher, Guido; Thomas, Karsten; Graf, Christian; Müller, Dirk; Stock, Stephanie
2012-08-01
This study aimed to assess the impact of a nationwide German diabetes mellitus disease management program (DMP) on survival time and costs in comparison to routine care. The authors conducted a retrospective observational cohort study using routine administration data from Germany's largest sickness fund to identify insured suffering from diabetes in 2002. A total of 95,443 insured with type 2 diabetes mellitus who were born before January 1, 1962 met the defined inclusion criteria, resulting in 19,888 pairs of DMP participants and nonparticipants matched for socioeconomic and health status using propensity score matching methods. This is the first time propensity score matching has been used to evaluate a survival benefit of DMPs. In the time frame analyzed (3 years), mean survival time for the DMP group was 1045 days vs. 985 days for the routine care group (P<0.001). Mean daily hospital and total costs (including DMP administration and medical costs) were lower for the DMP group in the case of deceased insureds (92€ vs. 139€ and 122€ vs. 169€, respectively) as well as for censored observations (6€ vs. 7€ and 12.9€ vs. 13.4€, respectively). Mean daily drug costs were slightly lower for deceased insured in the DMP group (difference 0.6€), while no identifiable difference was found for censored observations. In this study, insured who were enrolled in a DMP for diabetes mellitus in the German Statutory Health Insurance showed a significant benefit in survival time. They also incurred lower costs compared to propensity score matched insured in routine care.
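Propensity score matching, as used in this study to pair DMP participants with nonparticipants, can be sketched as greedy 1:1 nearest-neighbour matching on precomputed scores. This is an illustrative simplification; the study's exact matching algorithm and caliper are not specified in the abstract, and the caliper value below is an assumption:

```python
def greedy_match(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on propensity scores.

    treated, controls -- lists of (id, propensity_score) pairs
    Returns matched (treated_id, control_id) pairs; a treated subject is
    left unmatched if no remaining control lies within the caliper.
    """
    available = dict(controls)  # control id -> score, still unmatched
    pairs = []
    # Process treated subjects in score order for determinism.
    for tid, ts in sorted(treated, key=lambda p: p[1]):
        if not available:
            break
        # Closest remaining control by absolute score distance.
        cid = min(available, key=lambda c: abs(available[c] - ts))
        if abs(available[cid] - ts) <= caliper:
            pairs.append((tid, cid))
            del available[cid]
    return pairs
```

In practice the propensity scores themselves would come from a logistic regression of treatment assignment on the socioeconomic and health-status covariates.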
Williams, Claire; Lewsey, James D.; Mackay, Daniel F.; Briggs, Andrew H.
2016-01-01
Modeling of clinical-effectiveness in a cost-effectiveness analysis typically involves some form of partitioned survival or Markov decision-analytic modeling. The health states progression-free, progression and death and the transitions between them are frequently of interest. With partitioned survival, progression is not modeled directly as a state; instead, time in that state is derived from the difference in area between the overall survival and the progression-free survival curves. With Markov decision-analytic modeling, a priori assumptions are often made with regard to the transitions rather than using the individual patient data directly to model them. This article compares a multi-state modeling survival regression approach to these two common methods. As a case study, we use a trial comparing rituximab in combination with fludarabine and cyclophosphamide v. fludarabine and cyclophosphamide alone for the first-line treatment of chronic lymphocytic leukemia. We calculated mean Life Years and QALYs that involved extrapolation of survival outcomes in the trial. We adapted an existing multi-state modeling approach to incorporate parametric distributions for transition hazards, to allow extrapolation. The comparison showed that, due to the different assumptions used in the different approaches, a discrepancy in results was evident. The partitioned survival and Markov decision-analytic modeling deemed the treatment cost-effective with ICERs of just over £16,000 and £13,000, respectively. However, the results with the multi-state modeling were less conclusive, with an ICER of just over £29,000. This work has illustrated that it is imperative to check whether assumptions are realistic, as different model choices can influence clinical and cost-effectiveness results. PMID:27698003
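The partitioned-survival idea described above (time in the progression state derived as the area between the overall-survival and progression-free curves) can be sketched numerically. Trapezoid integration is an assumption here, not necessarily the authors' exact method:

```python
def trapezoid_auc(times, probs):
    """Area under a survival curve by the trapezoid rule."""
    return sum((t1 - t0) * (p0 + p1) / 2.0
               for (t0, p0), (t1, p1) in zip(zip(times, probs),
                                             zip(times[1:], probs[1:])))

def time_in_progression(times, os_probs, pfs_probs):
    """Partitioned-survival estimate of mean time in the progression
    state: the area between the overall-survival and progression-free
    survival curves over the modeled horizon."""
    return trapezoid_auc(times, os_probs) - trapezoid_auc(times, pfs_probs)
```

The multi-state alternative the authors compare against would instead model the progression state directly via transition hazards, which is why the two approaches can disagree.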
Rahbar, Mohammad H; Choi, Sangbum; Hong, Chuan; Zhu, Liang; Jeon, Sangchoon; Gardiner, Joseph C
2018-01-01
We propose a nonparametric shrinkage estimator for the median survival times from several independent samples of right-censored data, which combines the samples and hypothesis information to improve efficiency. We compare the efficiency of the proposed shrinkage estimation procedure to the unrestricted estimator and the combined estimator through extensive simulation studies. Our results indicate that the performance of these estimators depends on the strength of homogeneity of the medians. When homogeneity holds, the combined estimator is the most efficient estimator. However, it becomes inconsistent when homogeneity fails. The proposed shrinkage estimator, on the other hand, remains efficient. Its efficiency decreases as the survival medians deviate from equality, but it is expected to remain at least as good as the unrestricted estimator. Our simulation studies also indicate that the proposed shrinkage estimator is robust to moderate levels of censoring. We demonstrate application of these methods to estimating the median time for trauma patients to receive red blood cells in the Prospective Observational Multi-center Major Trauma Transfusion (PROMMTT) study.
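The shrinkage idea can be illustrated with a toy linear combination of per-sample medians and the pooled median. This is not the authors' estimator, which derives the shrinkage weight from the data and the homogeneity hypothesis rather than taking it as an input:

```python
def shrinkage_medians(sample_medians, pooled_median, weight):
    """Shrink each sample's median survival toward the pooled median.

    weight in [0, 1]: 0 returns the unrestricted (per-sample) medians,
    1 returns the combined estimate for every sample. Intermediate
    values trade bias for variance, mirroring the shrinkage principle.
    """
    if not 0.0 <= weight <= 1.0:
        raise ValueError("weight must lie in [0, 1]")
    return [(1.0 - weight) * m + weight * pooled_median
            for m in sample_medians]
```

When the medians are truly homogeneous, a larger weight recovers the efficiency of the combined estimator; when they differ, a smaller weight protects against the combined estimator's inconsistency.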
Duffy, D; Selmic, L E; Kendall, A R; Powers, B E
2017-03-01
Extraskeletal osteosarcoma (EOS) is a rare, highly malignant mesenchymal neoplasm arising from viscera or soft tissues characterised by the formation of osteoid in the absence of bone involvement. Owing to the rarity of these neoplasms very little information exists on treatment outcomes. The purpose of this study was to describe the outcome following surgical treatment of non-mammary and non-thyroidal soft tissue and visceral EOS in dogs. Thirty-three dogs were identified; the most common primary tumour site was the spleen. Dogs that had wide or radical tumour excision had longer survival times compared with dogs that had only marginal tumour excision performed [median survival time of 90 days (range: 0-458 days) versus median survival time of 13 days (range: 0-20 days)]. The use of surgery should be considered in the management of dogs with non-mammary and non-thyroidal soft tissue and visceral EOS. © 2015 John Wiley & Sons Ltd.
Bonjour, Timothy J; Charny, Grigory; Thaxton, Robert E
2016-11-01
Rapid effective trauma resuscitations (TRs) decrease patient morbidity and mortality. Few studies have evaluated TR care times. Effective time goals and superior human patient simulator (HPS) training can improve patient survivability. The purpose of this study was to compare live TR to HPS resuscitation times to determine mean incremental resuscitation times and ascertain whether simulation was educationally equivalent. The study was conducted at San Antonio Military Medical Center, a Department of Defense Level I trauma center. This was a prospective observational study measuring incremental step times by trauma teams during trauma and simulation patient resuscitations. The trauma and simulation arms each included 60 patients to achieve statistical significance. Participants included Emergency Medicine residents and Physician Assistant residents as the trauma team leader. The trauma patient arm showed a mean evaluation time of 10:33 and the simulation arm 10:23. Comparable time characteristics were seen in the airway, intravenous access, blood sample collection, and blood pressure data subsets. TR mean times were similar to those in the HPS arm subsets, demonstrating simulation to be an effective educational tool. Effective stepwise approaches, incremental time goals, and superior HPS training can improve patient survivability and departmental productivity when using TR teams. Reprint & Copyright © 2016 Association of Military Surgeons of the U.S.
Hartmann-Johnsen, Olaf Johan; Kåresen, Rolf; Schlichting, Ellen; Nygård, Jan F
2015-11-01
Breast-conserving therapy (BCT) and mastectomy (MTX) have been considered to have similar long-term survival. However, better survival in women undergoing BCT compared with MTX was found in two recent register studies from the United States. The purpose of this study was to compare survival after BCT and MTX for women with early-stage breast cancer in Norway. Women with invasive, early-stage breast cancer (1998-2008) for whom BCT and MTX were considered equally beneficial treatments were included, for a total of 13,015 women. Surgery was divided into two main cohorts (primary BCT, primary MTX) and five subcohorts. Analyses were stratified into T1N0M0, T2N0M0, T1N1M0, T2N1M0, and age groups (<50, 50-69, ≥70). Overall survival and breast cancer-specific survival (BCSS) were calculated in life tables, hazard ratios by Cox regression, and sensitivity analyses. Five-year BCSS for women who underwent primary BCT or primary MTX was 97 and 88%, respectively. Women who underwent primary MTX had a hazard ratio of 1.64 (95% confidence interval 1.43-1.88) for breast cancer death compared with women who underwent primary BCT after adjusting for the year of diagnosis, age at diagnosis, stage, histology, and grade. Survival was better or equal after breast-conserving therapy than mastectomy in all early stages, surgical subcohorts, and age groups. This advantage cannot be attributed solely to differences in tumor biology.
Greenhalgh, Stephen N; Reeve, Jenny A; Johnstone, Thurid; Goodfellow, Mark R; Dunning, Mark D; O'Neill, Emma J; Hall, Ed J; Watson, Penny J; Jeffery, Nick D
2014-09-01
To compare long-term survival and quality of life data in dogs with clinical signs associated with a congenital portosystemic shunt (CPSS) that underwent medical or surgical treatment. Prospective cohort study. 124 client-owned dogs with CPSS. Dogs received medical or surgical treatment without regard to signalment, clinical signs, or clinicopathologic results. Survival data were analyzed with a Cox regression model. Quality of life information, obtained from owner questionnaires, included frequency of CPSS-associated clinical signs (from which a clinical score was derived), whether owners considered their dog normal, and (for surgically treated dogs) any ongoing medical treatment for CPSS. A Mann-Whitney U test was used to compare mean clinical score data between surgically and medically managed dogs during predetermined follow-up intervals. 97 dogs underwent surgical treatment; 27 were managed medically. Median follow-up time for all dogs was 1,936 days. Forty-five dogs (24 medically managed and 21 surgically managed) died or were euthanized during the follow-up period. Survival rate was significantly better in dogs that underwent surgical treatment than in those treated medically for CPSS (hazard ratio, 8.11; 95% CI, 4.20 to 15.66). Neither age at diagnosis nor shunt type affected survival rate. Frequency of clinical signs was lower in surgically versus medically managed dogs for all follow-up intervals, with a significant difference between groups at 4 to 7 years after study entry. Surgical treatment of CPSS in dogs resulted in a significantly better survival rate and lower frequency of ongoing clinical signs compared with medical management. Age at diagnosis did not affect survival rate and should not influence treatment choice.
Hack, Jason B; Deguzman, Jocelyn M; Brewer, Kori L; Meggs, William J; O'Rourke, Dorcas
2011-07-01
Pressure immobilization bandages have been shown to delay onset of systemic toxicity after Eastern coral snake (Micrurus fulvius) envenomation of the distal extremity. Our objective was to assess the efficacy of a novel compression device in delaying onset of systemic toxicity after truncal envenomation with Eastern coral snake venom in a porcine model. With university approval, nine juvenile pigs (11 kg to 22 kg) were sedated, anesthetized, and intubated but not paralyzed, to ensure continuous spontaneous respirations, in a university animal laboratory. Each animal was injected subcutaneously with 10 mg of M. fulvius venom in a pre-selected area of the trunk. After 1 min, a novel localizing circumferential compression (LoCC) device was applied to the bite site in six animals (treatment group); three animals received no treatment (control group). The device was composed of a rigid polymer clay form molded into a hollow fusiform shape with internal dimensions of 8 × 5 × 3 cm, with an elastic belt wrapped around the animal securing the device in place. Vital signs were recorded at 30-min intervals. End points were a respiratory rate below 3 breaths/min, oxygen saturation < 80%, or survival to 8 h. Survival to 8 h was analyzed using Fisher's exact test, with p < 0.05 indicating significance. Survival analysis was performed using the Mantel-Cox test to assess time to death, with outcomes represented in a Kaplan-Meier cumulative survival plot. Five of the six pigs in the treatment group survived to 8 h (293-480 min). None of the control pigs survived to 8 h (Fisher's exact p = 0.04); mean time to respiratory failure was 322 min (272-382 min). Survival analysis showed a significant delay in time to event in the treatment group compared to the control group (p = 0.04). The LoCC device used in this study delayed the onset of systemic toxicity and significantly increased survival time after artificial truncal envenomation with Eastern coral snake venom.
Copyright © 2011 Elsevier Inc. All rights reserved.
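The survival comparison above rests on Fisher's exact test for a 2 × 2 table (5/6 treated pigs vs. 0/3 controls surviving to 8 h). As an illustration of where such a p-value comes from, here is a minimal pure-Python sketch of the two-sided test via the hypergeometric distribution; the function name is ours, not from the study:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for a 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of all tables with the same
    margins whose probability does not exceed that of the observed table.
    """
    row1, row2 = a + b, c + d
    col1 = a + c
    n = row1 + row2

    def p_table(x):
        # probability of x "successes" in row 1 under fixed margins
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = p_table(a)
    lo = max(0, col1 - row2)      # smallest feasible count in cell a
    hi = min(row1, col1)          # largest feasible count in cell a
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs + 1e-12)

# Observed data from the study: 5/6 treated pigs vs. 0/3 controls survived.
p = fisher_exact_two_sided(5, 1, 0, 3)
```

For this table the sketch yields p ≈ 0.048; the paper reports p = 0.04, a small discrepancy presumably due to rounding or a one-sided convention.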
Shitara, Kohei; Matsuo, Keitaro; Oze, Isao; Mizota, Ayako; Kondo, Chihiro; Nomura, Motoo; Yokota, Tomoya; Takahari, Daisuke; Ura, Takashi; Muro, Kei
2011-08-01
We performed a systematic review and meta-analysis to determine the impact of neutropenia or leukopenia experienced during chemotherapy on survival. Eligible studies included prospective or retrospective analyses that evaluated neutropenia or leukopenia as a prognostic factor for overall survival or disease-free survival. Statistical analyses were conducted to calculate a summary hazard ratio and 95% confidence interval (CI) using random-effects or fixed-effects models, chosen on the basis of the heterogeneity of the included studies. Thirteen trials were selected for the meta-analysis, with a total of 9,528 patients. The hazard ratio of death was 0.69 (95% CI, 0.64-0.75) for patients with higher-grade neutropenia or leukopenia compared to patients with lower-grade cytopenia or none. Our analysis was also stratified by statistical method (any method to decrease lead-time bias; time-varying analysis or landmark analysis), but no differences were observed. Our results indicate that neutropenia or leukopenia experienced during chemotherapy is associated with improved survival in patients with advanced cancer or hematological malignancies undergoing chemotherapy. Future prospective analyses designed to investigate the potential impact of chemotherapy dose adjustment, coupled with monitoring of neutropenia or leukopenia, on survival are warranted.
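A summary hazard ratio of the kind reported here is typically obtained by inverse-variance pooling of the per-study log hazard ratios. A minimal fixed-effect sketch in pure Python (hypothetical helper name and illustrative data, not the authors' code; a random-effects model would additionally estimate between-study variance):

```python
import math

def pool_fixed_effect(hrs, cis):
    """Inverse-variance fixed-effect pooling of hazard ratios.

    hrs: per-study hazard ratios; cis: matching (lower, upper) 95% CIs.
    The SE of each log HR is recovered from the CI width:
    (ln(upper) - ln(lower)) / (2 * 1.96).
    Returns (pooled HR, pooled 95% CI).
    """
    log_hrs = [math.log(h) for h in hrs]
    ses = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for lo, hi in cis]
    weights = [1 / se ** 2 for se in ses]          # inverse-variance weights
    pooled = sum(w * lh for w, lh in zip(weights, log_hrs)) / sum(weights)
    se_pooled = math.sqrt(1 / sum(weights))
    ci = (math.exp(pooled - 1.96 * se_pooled),
          math.exp(pooled + 1.96 * se_pooled))
    return math.exp(pooled), ci
```

With two hypothetical studies, `pool_fixed_effect([0.6, 0.8], [(0.4, 0.9), (0.6, 1.05)])` returns a pooled HR between the two inputs, with a CI narrower than either study's.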
Nutritional status and survival of maintenance hemodialysis patients receiving lanthanum carbonate.
Komaba, Hirotaka; Kakuta, Takatoshi; Wada, Takehiko; Hida, Miho; Suga, Takao; Fukagawa, Masafumi
2018-04-16
Hyperphosphatemia and poor nutritional status are associated with increased mortality. Lanthanum carbonate is an effective, calcium-free phosphate binder, but little is known about the long-term impact on mineral metabolism, nutritional status and survival. We extended the follow-up period of a historical cohort of 2292 maintenance hemodialysis patients that was formed in late 2008. We examined 7-year all-cause mortality according to the serum phosphate levels and nutritional indicators in the entire cohort and then compared the mortality rate of the 562 patients who initiated lanthanum with that of the 562 propensity score-matched patients who were not treated with lanthanum. During a mean ± SD follow-up of 4.9 ± 2.3 years, 679 patients died in the entire cohort. Higher serum phosphorus levels and lower nutritional indicators (body mass index, albumin and creatinine) were each independently associated with an increased risk of death. In the propensity score-matched analysis, patients who initiated lanthanum had a 23% lower risk for mortality compared with the matched controls. During the follow-up period, the serum phosphorus levels tended to decrease comparably in both groups, but the lanthanum group maintained a better nutritional status than the control group. The survival benefit associated with lanthanum was unchanged after adjustment for time-varying phosphorus or other mineral metabolism parameters, but was attenuated by adjustments for time-varying indicators of nutritional status. Treatment with lanthanum is associated with improved survival in hemodialysis patients. This effect may be partially mediated by relaxation of dietary phosphate restriction and improved nutritional status.
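The propensity score-matched comparison pairs each lanthanum-treated patient with an untreated control having a similar predicted probability of treatment. A greedy 1:1 nearest-neighbor matcher without replacement is one simple way to build such pairs; this sketch is our own illustrative code (not the authors') and assumes scores have already been estimated, e.g. by logistic regression:

```python
def match_nearest(treated, controls, caliper=0.05):
    """Greedy 1:1 nearest-neighbor propensity-score matching, without
    replacement. treated/controls: dicts of {patient_id: score}. Returns a
    list of (treated_id, control_id) pairs whose score difference is
    within the caliper."""
    pairs = []
    available = dict(controls)
    # Match treated patients in score order for determinism.
    for t_id, t_score in sorted(treated.items(), key=lambda kv: kv[1]):
        if not available:
            break
        # Closest remaining control by absolute score distance.
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        if abs(available[c_id] - t_score) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]   # without replacement
    return pairs
```

Treated patients with no control inside the caliper are simply left unmatched, which is why matched analyses (562 pairs here) use fewer patients than the full cohort.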
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hashimoto, Kenji; Narita, Yoshitaka, E-mail: yonarita@ncc.go.jp; Miyakita, Yasuji
2011-11-15
Purpose: Data comparing the clinical outcomes of local brain radiotherapy (LBRT) and whole brain RT (WBRT) in patients with a single brain metastasis after tumor removal are limited. Patients and Methods: A retrospective analysis was performed to compare the patterns of treatment failure, cause of death, progression-free survival, median survival time, and Karnofsky performance status for long-term survivors among patients who underwent surgery followed by either LBRT or WBRT between 1990 and 2008 at the National Cancer Center Hospital. Results: A total of 130 consecutive patients were identified. The median progression-free survival period among the patients who received postoperative LBRT (n = 64) and WBRT (n = 66) was 9.7 and 11.5 months, respectively (p = .75). The local recurrence rates (LBRT, 9.4% vs. WBRT, 12.1%) and intracranial new metastasis rates (LBRT, 42.2% vs. WBRT, 33.3%) were similar in each arm. The incidence of leptomeningeal metastasis was also equivalent (LBRT, 9.4% vs. WBRT, 10.6%). The median survival time for the LBRT and WBRT patients was 13.9 and 16.7 months, respectively (p = .88). A neurologic cause of death was noted in 35.6% of the patients in the LBRT group and 36.7% of the WBRT group (p = .99). The Karnofsky performance status at 2 years was comparable between the two groups. Conclusions: The clinical outcomes of LBRT and WBRT were similar. A prospective evaluation is warranted.
Uemura, Shinya; Iwashita, Takuji; Iwata, Keisuke; Mukai, Tsuyoshi; Osada, Shinji; Sekino, Takafumi; Adachi, Takahito; Kawai, Masahiko; Yasuda, Ichiro; Shimizu, Masahito
2018-05-03
Malignant gastric outlet obstruction (GOO) often develops in patients with advanced pancreatic cancer (APC). It is not clear whether endoscopic duodenal stenting (DS) or surgical gastrojejunostomy (GJJ) is preferable as palliative treatment. We aimed to compare the efficacy and safety of GJJ and DS for GOO in APC. Ninety-nine consecutive patients who underwent DS or GJJ for GOO with APC were evaluated. We compared the technical and clinical success rates, the incidence of adverse events (AE), the times to starting chemotherapy and to discharge, and survival durations between DS and GJJ. Prognostic factors for overall survival (OS) were investigated on multivariate analysis. GOO was managed with GJJ in 35 patients and DS in 64. The technical and clinical success rates were comparable. DS was associated with a shorter time to resuming oral intake and with earlier chemotherapy initiation and discharge. No difference was seen in the early and late AE rates. Multivariate analysis of prognostic factors for OS showed performance status ≥2, administration of chemotherapy, and presence of obstructive jaundice to be significant factors. There were no significant differences in survival durations between the groups, regardless of performance status. In summary, there were no significant differences between DS and GJJ in technical and clinical success rates, AE rates, or survival duration in the management of GOO due to APC. DS may be a preferable option over GJJ given that it leads to an earlier return to oral intake, a shortened length of hospital stay, and an earlier referral for chemotherapy. Copyright © 2018 IAP and EPC. Published by Elsevier B.V. All rights reserved.
Moulki, Naeem; Kealhofer, Jessica V; Benditt, David G; Gravely, Amy; Vakil, Kairav; Garcia, Santiago; Adabag, Selcuk
2018-06-16
Bifascicular block and prolonged PR interval on the electrocardiogram (ECG) have been associated with complete heart block and sudden cardiac death. We sought to determine if cardiac implantable electronic devices (CIED) improve survival in these patients. We assessed survival in relation to CIED status among 636 consecutive patients with bifascicular block and prolonged PR interval on the ECG. In survival analyses, CIED was considered as a time-varying covariate. Average age was 76 ± 9 years, and 99% of the patients were men. A total of 167 (26%) underwent CIED (127 pacemaker only) implantation at baseline (n = 23) or during follow-up (n = 144). During 5.4 ± 3.8 years of follow-up, 83 (13%) patients developed complete or high-degree atrioventricular block and 375 (59%) died. Patients with a CIED had a longer survival compared to those without a CIED in the traditional, static analysis (log-rank p < 0.0001) but not when CIED was considered as a time-varying covariate (log-rank p = 0.76). In the multivariable model, patients with a CIED had a 34% lower risk of death (hazard ratio 0.66, 95% confidence interval 0.52-0.83; p = 0.001) than those without CIED in the traditional analysis but not in the time-varying covariate analysis (hazard ratio 1.05, 95% confidence interval 0.79-1.38; p = 0.76). Results did not change in the subgroup with a pacemaker only. Bifascicular block and prolonged PR interval on ECG are associated with a high incidence of complete atrioventricular block and mortality. However, CIED implantation does not have a significant influence on survival when time-varying nature of CIED implantation is considered.
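Treating CIED implantation as a time-varying covariate means a patient contributes device-free person-time before implantation and device person-time afterward, rather than being classified by final device status, which creates immortal-time bias and explains the discrepancy between the static and time-varying analyses above. In counting-process form, one patient's follow-up is split into (start, stop, covariate) rows; a minimal sketch with an illustrative helper name:

```python
def counting_process_rows(follow_up, implant_time=None):
    """Split one patient's follow-up into (start, stop, has_device) rows
    for a Cox model with a time-varying covariate. A device implanted
    during follow-up contributes device-free person-time before
    implantation and device person-time afterward."""
    if implant_time is None or implant_time >= follow_up:
        return [(0.0, follow_up, 0)]      # never implanted during follow-up
    if implant_time <= 0:
        return [(0.0, follow_up, 1)]      # implanted at baseline
    return [(0.0, implant_time, 0),       # at risk without device
            (implant_time, follow_up, 1)] # at risk with device
```

Rows like these are exactly the "long format" accepted by time-varying Cox implementations, and stacking them over all 636 patients reproduces the study's risk sets.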
Asher, Lucy; Harvey, Naomi D.; Green, Martin; England, Gary C. W.
2017-01-01
Epidemiology is the study of patterns of health-related states or events in populations. Statistical models developed for epidemiology can be usefully applied to behavioral states or events. The aim of this study is to present the application of epidemiological statistics to understanding animal behavior where discrete outcomes are of interest, using data from guide dogs to illustrate. Specifically, survival analysis and multistate modeling are applied to data on guide dogs, comparing dogs that completed training and qualified as a guide dog with those that were withdrawn from the training program. Survival analysis models the time to (or between) binary events and the probability of an event occurring at or beyond a specified time point. Survival analysis, using a Cox proportional hazards model, was used to examine the time taken to withdraw a dog from training. Sex, breed, and other factors affected time to withdrawal. Bitches were withdrawn faster than dogs; Labradors were withdrawn faster, and Labrador × Golden Retrievers slower, than Golden Retriever × Labradors; and dogs not bred by Guide Dogs were withdrawn faster than those bred by Guide Dogs. Multistate modeling (MSM) can be used as an extension of survival analysis to incorporate more than two discrete events or states. Multistate models were used to investigate transitions from training to qualification as a guide dog or behavioral withdrawal, and from qualification as a guide dog to behavioral withdrawal. Sex, breed (with purebred Labradors and Golden Retrievers differing from F1 crosses), and whether the dog was bred by Guide Dogs affected movements between states. We postulate that survival analysis and MSM could be applied to a wide range of behavioral data, and key examples are provided. PMID:28804710
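The time-to-withdrawal analysis described above starts from a survival curve. As a concrete illustration, here is a minimal pure-Python Kaplan-Meier estimator (our own sketch, not the authors' code), where dogs that qualify are treated as censored for the withdrawal outcome:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate. times: follow-up durations;
    events: 1 if the event (e.g. withdrawal from training) occurred,
    0 if censored (e.g. the dog qualified). Returns (time, survival)
    points at each event time."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv = 1.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        d = n_t = 0
        # Group ties: count events (d) and all exits (n_t) at time t.
        while i < len(order) and times[order[i]] == t:
            d += events[order[i]]
            n_t += 1
            i += 1
        if d:
            surv *= (1 - d / at_risk)   # multiply in the step at time t
            curve.append((t, surv))
        at_risk -= n_t                  # censored and failed both leave
    return curve
```

For example, four subjects with follow-up times [1, 2, 3, 4] and event indicators [1, 0, 1, 0] give survival 0.75 at t = 1 and 0.375 at t = 3; covariate effects on these curves are then what the Cox model estimates.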
Saenz, Daniel; Fucik, Erin M; Kwiatkowski, Matthew A
2013-01-01
Changes in climate and the introduction of invasive species are two major stressors to amphibians, although little is known about the interaction between these two factors with regard to impacts on amphibians. We focused our study on an invasive tree species, the Chinese tallow (Triadica sebifera), that annually sheds its leaves and produces leaf litter that is known to negatively impact aquatic amphibian survival. The purpose of our research was to determine whether the timing of leaf fall from Chinese tallow and the timing of amphibian breeding (determined by weather) influence survival of amphibian larvae. We simulated a range of winter weather scenarios, ranging from cold to warm, by altering the relative timing of when leaf litter and amphibian larvae were introduced into aquatic mesocosms. Our results indicate that amphibian larvae survival was greatly affected by the length of time Chinese tallow leaf litter decomposes in water prior to the introduction of the larvae. Larvae in treatments simulating warm winters (early amphibian breeding) were introduced to the mesocosms early in the aquatic decomposition process of the leaf litter and had significantly lower survival compared with cold winters (late amphibian breeding), likely due to significantly lower dissolved oxygen levels. Shifts to earlier breeding phenology, linked to warming climate, have already been observed in many amphibian taxa, and with most climate models predicting a significant warming trend over the next century, the trend toward earlier breeding should continue if not increase. Our results strongly suggest that a warming climate can interact with the effects of invasive plant species, in ways we have not previously considered, to reduce the survival of an already declining group of organisms. PMID:24363907
Clinical and molecular predictors of disease severity and survival in chronic lymphocytic leukemia.
Weinberg, J Brice; Volkheimer, Alicia D; Chen, Youwei; Beasley, Bethany E; Jiang, Ning; Lanasa, Mark C; Friedman, Daphne; Vaccaro, Gina; Rehder, Catherine W; Decastro, Carlos M; Rizzieri, David A; Diehl, Louis F; Gockerman, Jon P; Moore, Joseph O; Goodman, Barbara K; Levesque, Marc C
2007-12-01
Several parameters may predict disease severity and overall survival in chronic lymphocytic leukemia (CLL). The purpose of our study of 190 CLL patients was to compare immunoglobulin heavy chain variable region (IgV(H)) mutation status, cytogenetic abnormalities, and leukemia cell CD38 and Zap-70 to older, traditional parameters. We also wanted to construct a simple, inexpensive prognostic score that would significantly predict time-to-treatment (TTT) and survival at the time of diagnosis and help practicing clinicians. In univariate analyses, patients with higher clinical stage, higher leukocyte count at diagnosis, shorter leukocyte doubling time, elevated serum lactate dehydrogenase (LDH), unmutated IgV(H) genes, and higher CD38 had shorter overall survival and TTT. CLL cell Zap-70 expression was higher in patients with unmutated IgV(H), and those with higher Zap-70 tended to have shorter survival. IgV(H)4-34 and IgV(H)1-69 were the most common IgV(H) genes used (16 and 12%, respectively). Of patients with IgV(H)1-69, 86% had unmutated IgV(H) and had a significantly shorter TTT. A cytogenetic abnormality was noted in 71% of the patients tested. Patients with 11q22 deletion and 17p13 deletion or complex abnormalities were significantly more likely to have unmutated IgV(H). We found that a prognostic score constructed using modified Rai stage, cellular CD38, and serum LDH (parameters easily obtained clinically) significantly predicted TTT and survival at the time of diagnosis and performed as well as or better than models using the newer markers.
Influences on Early and Medium-Term Survival Following Surgical Repair of the Aortic Arch
Bashir, Mohamad; Field, Mark; Shaw, Matthew; Fok, Matthew; Harrington, Deborah; Kuduvalli, Manoj; Oo, Aung
2014-01-01
Objectives: It is now well established by many groups that surgery on the aortic arch may be achieved with consistently low morbidity and mortality, along with relatively good survival compared to the estimated natural history of a number of aortic arch pathologies. The objectives of this study were to: 1) report, compare, and analyze our morbidity and mortality outcomes for hemiarch and total aortic arch surgery; 2) examine the survival benefit of hemiarch and total aortic arch surgery compared to age- and sex-matched controls; and 3) define factors that influence survival in these two groups and, in particular, identify those that are modifiable and potentially actionable. Methods: Outcomes of patients undergoing surgical repair of the hemiarch and total aortic arch at the Liverpool Heart and Chest Hospital between June 1999 and December 2012 were examined in a retrospective analysis of data collected for The Society for Cardiothoracic Surgeons (UK). Results: Over the period studied, a total of 1240 patients underwent aortic surgery, of whom 287 were identified as having undergone hemiarch to total aortic arch surgery under deep or moderate hypothermic circulatory arrest. Twenty-three percent of surgeries were nonelective. The median age of patients undergoing elective hemiarch surgery was 64.3 years, and of those undergoing elective total arch surgery 65.3 years (P = 0.25); 40.1% of the entire group were female. A total of 140 patients underwent elective hemiarch replacement, while 81 underwent elective total arch replacement. Etiology of the aortic pathology was degenerative in 51.2% of the two groups, with 87.1% requiring aortic valve repair in the elective hemiarch group and 64.2% in the elective total arch group (P < 0.001).
Elective in-hospital mortality was 2.1% in the hemiarch group and 6.2% (P = 0.15) in the total arch group with corresponding rates of stroke (2.9% versus 4.9%, P = 0.47), renal failure (4.3% versus 6.2%, P = 0.54), reexploration for bleeding (4.3% versus 4.9%, P > 0.99), and prolonged ventilation (8.6% versus 16.1%, P = 0.09). Overall mortality was 20.9% at 5 years, while it was 15.7% in the elective hemiarch and 25.9% in the total arch group (P = 0.065). Process control charts demonstrated stability of annualized mortality outcomes over the study period. Survival curve was flat and parallel compared to age- and sex-matched controls beyond 2 years. Multivariate analysis demonstrated the following independent factors associated with survival: renal dysfunction [hazard ratio (HR) = 3.11; 95% confidence interval (CI) = 1.44-6.73], New York Heart Association (NYHA) class ≥ III (HR = 2.25; 95% CI = 1.38-3.67), circulatory arrest time > 100 minutes (HR = 2.92; 95% CI = 1.57-5.43), peripheral vascular disease (HR = 2.44; 95% CI = 1.25-4.74), and concomitant coronary artery bypass graft operation (HR = 2.14; 95% CI = 1.20-3.80). Conclusions: Morbidity, mortality, and medium-term survival were not statistically different for patients undergoing elective hemi-aortic arch and total aortic arch surgery. The survival curve in this group of patients is flat and parallel to sex- and age-matched controls beyond 2 years. Multivariate analysis identified independent influences on survival as renal dysfunction, NYHA class ≥ III, circulatory arrest time (> 100 min), peripheral vascular disease, and concomitant coronary artery bypass grafting. Focus on preoperative optimization of some of these variables may positively influence long-term survival. PMID:26798716
Gold, Heather Taffet; Sorbero, Melony E. S.; Griggs, Jennifer J.; Do, Huong T.; Dick, Andrew W.
2013-01-01
Analysis of observational cohort data is subject to bias from unobservable risk selection. We compared econometric models and treatment effectiveness estimates using the linked Surveillance, Epidemiology, and End Results (SEER)-Medicare claims data for women diagnosed with ductal carcinoma in situ. Treatment effectiveness estimates for mastectomy and breast-conserving surgery (BCS) with or without radiotherapy were compared using three different models: a simultaneous-equations model, a discrete-time survival model with unobserved heterogeneity (frailty), and a proportional hazards model. Overall trends in disease-free survival (DFS), or time to first subsequent breast event, by treatment are similar regardless of the model, with mastectomy yielding the highest DFS over 8 years of follow-up, followed by BCS with radiotherapy, and then BCS alone. Absolute rates and direction of bias varied substantially by treatment strategy. DFS was underestimated by the single-equation and frailty models compared to the simultaneous-equations model and randomized controlled trial (RCT) results for BCS with radiotherapy, and overestimated for BCS alone. PMID:21602195
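The discrete-time survival model mentioned above is usually fitted as a pooled logistic regression on person-period data: each subject contributes one row per interval at risk, with the outcome indicator equal to 1 only in the interval where the event occurs. A minimal sketch of that expansion (illustrative helper name; a frailty specification would add a subject-level random effect to the fitted model):

```python
def person_period(duration, event, max_periods):
    """Expand one subject into person-period rows for a discrete-time
    survival (pooled logistic) model. duration: last period observed;
    event: 1 if the event occurred then, 0 if censored. Each row is
    (period, event_indicator); the indicator is 1 only in the final
    period and only if the event occurred."""
    rows = []
    for t in range(1, min(duration, max_periods) + 1):
        rows.append((t, 1 if (event and t == duration) else 0))
    return rows
```

Stacking these rows over all subjects and regressing the indicator on period dummies plus treatment covariates recovers the discrete-time hazard by treatment group.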
Zhao, Yanmin; Wang, Jiasheng; Luo, Yi; Shi, Jimin; Zheng, Weiyan; Tan, Yamin; Cai, Zhen; Huang, He
2017-08-01
The relative merits of reduced-intensity hematopoietic stem cell transplantation (RIST) for chronic myeloid leukemia (CML) in the first chronic phase (CP) in the imatinib era have not been evaluated. The study was designed to compare the outcomes of combination therapy with RIST plus imatinib (RIST + IM) vs. imatinib (IM) alone for young patients with early CP (ECP) and late CP (LCP). Of the patients, 130 were non-randomly assigned to treatment with IM alone (n = 88) or RIST + IM (n = 42). The 10-year overall survival (OS) and event-free survival (EFS) were comparable between the RIST + IM and IM groups. LCP, high Sokal score, and no complete cytogenetic response at 3 months were adverse prognostic factors for survival, but only the time from diagnosis to IM was an independent predictor after multivariate analysis. For ECP, IM was similar to RIST + IM, with 10-year EFS rates of 77.2 vs. 81.6% (p = 0.876) and OS rates of 93.8 vs. 87.9% (p = 0.102), respectively. For LCP, both treatments resulted in similar survival, but more patients in the imatinib group experienced events (10-year EFS 40.8 vs. 66.7%, p = 0.047). Patients with higher EBMT risk scores had inferior survival compared with those with lower scores (69.2 vs. 92.9%, p = 0.04). We concluded that RIST + IM was comparable to IM in terms of OS and EFS. However, RIST + IM was more affordable than IM alone over a 10-year horizon. Thus, RIST + IM could be considered an alternative treatment option, especially when patients have low EBMT risk scores and demand a definite cure for CML.
Htut, Myo; D'Souza, Anita; Krishnan, Amrita; Bruno, Benedetto; Zhang, Mei-Jie; Fei, Mingwei; Diaz, Miguel Angel; Copelan, Edward; Ganguly, Siddhartha; Hamadani, Mehdi; Kharfan-Dabaja, Mohamed; Lazarus, Hillard; Lee, Cindy; Meehan, Kenneth; Nishihori, Taiga; Saad, Ayman; Seo, Sachiko; Ramanathan, Muthalagu; Usmani, Saad Z; Gasparetto, Christina; Mark, Tomer M; Nieto, Yago; Hari, Parameswaran
2018-03-01
We compared postrelapse overall survival (OS) after autologous/allogeneic (auto/allo) versus tandem autologous (auto/auto) hematopoietic cell transplantation (HCT) in patients with multiple myeloma (MM). Postrelapse survival of patients receiving an auto/auto or auto/allo HCT for MM and prospectively reported to the Center for International Blood and Marrow Transplant Research between 2000 and 2010 was analyzed. Relapse occurred in 404 patients (72.4%) in the auto/auto group and in 178 patients (67.4%) in the auto/allo group after a median follow-up of 8.5 years. Relapse occurred before 6 months after a second HCT in 46% of the auto/allo patients, compared with 26% of the auto/auto patients. The 6-year postrelapse survival was better in the auto/allo group than in the auto/auto group (44% versus 35%; P = .05). Mortality due to MM was 69% (n = 101) in the auto/allo group and 83% (n = 229) in the auto/auto group. In multivariate analysis, both cohorts had a similar risk of death in the first year after relapse (hazard ratio [HR], .72; P = .12); however, for time points beyond 12 months after relapse, overall survival was superior in the auto/allo cohort (HR for death in auto/auto = 1.55; P = .005). Other factors associated with superior survival were enrollment in a clinical trial for HCT, male sex, and use of novel agents at induction before HCT. Our findings show superior survival after relapse in auto/allo HCT recipients compared with auto/auto HCT recipients. This likely reflects a better response to salvage therapy, such as immunomodulatory drugs, potentiated by a donor-derived immunologic milieu. Further augmentation of the post-allo-HCT immune system with new immunotherapies, such as monoclonal antibodies, checkpoint inhibitors, and others, merits investigation. Copyright © 2017 The American Society for Blood and Marrow Transplantation. Published by Elsevier Inc. All rights reserved.
Hypoxic coma as a strategy to survive inundation in a salt-marsh inhabiting spider
Pétillon, Julien; Montaigne, William; Renault, David
2009-01-01
Spiders constitute a major arthropod group in regularly inundated habitats. Some species survive a flooding period under water. We compared survival during both submersion and a recovery period after submersion, in three stenotopic lycosids: two salt-marsh species Arctosa fulvolineata and Pardosa purbeckensis, and a forest spider Pardosa lugubris. Both activity and survival rates were determined under controlled laboratory conditions by individually surveying 120 females kept submerged in sea water. We found significant differences between the three species, with the two salt-marsh spiders exhibiting higher survival abilities. To our knowledge, this study reports for the first time the existence of a hypoxic coma caused by submersion, which is most pronounced in A. fulvolineata, the salt-marsh spider known to overcome tidal inundation under water. Its ability to fall into that coma can therefore be considered a physiological adaptation to its regularly inundated habitat. PMID:19411268
Survival probabilities of patients with childhood spinal muscle atrophy.
Mannaa, Mohannad M; Kalra, Maninder; Wong, Brenda; Cohen, Aliza P; Amin, Raouf S
2009-03-01
Medical and technological advances over the past 2 decades have resulted in improved care for children with spinal muscular atrophy (SMA). The objective of the present study was to describe changes in the life expectancy of pediatric patients with SMA over time and to compare these findings with previously reported survival patterns. Medical records of all patients diagnosed with SMA over a 16-year period (1989-2005) at Cincinnati Children's Hospital Medical Center were reviewed. Data pertaining to date of birth, type of SMA, medical and surgical interventions, pulmonary complications, and date of death were obtained. Kaplan-Meier survival analyses showed a significant improvement in survival probabilities in the most severe form of SMA. We found a positive trend in the survival of patients with severe SMA. Although we cannot attribute this trend to any single factor, it is likely that advances in pulmonary care and aggressive nutritional support have played a significant role.
Stockton, D.; Davies, T.; Day, N.; McCann, J.
1997-01-01
OBJECTIVES: To investigate the recent fall in mortality from breast cancer in England and Wales, and to determine the relative contributions of improvements in treatment and earlier detection of tumours. DESIGN: Retrospective study of all women with breast cancer registered by the East Anglian cancer registry and diagnosed between 1982 and 1989. SUBJECTS: 3965 patients diagnosed 1982-5 compared with 4665 patients diagnosed 1986-9, in three age groups 0-49, 50-64, > or = 65 years, with information on stage at diagnosis and survival. MAIN OUTCOME MEASURES: Three year relative survival rates by time period, age group, and stage; relative hazard ratios for each time period and age group derived from Cox's proportional hazards model, adjusted for single year of age and stage. RESULTS: Survival improved in the later time period, although there was little stage specific improvement. The proportion of early stage tumours increased especially in the 50-64 year age group, and adjustment for stage accounted for over half of the improvement in survival in women aged under 65 years. CONCLUSION: Over half of the drop in mortality in women aged under 65 years seems to be attributable to earlier detection of tumours, which has been observed since the mid-1980s. This could have resulted from an increase in breast awareness predating the start of the breast screening programme. PMID:9056796
Rhabdomyosarcoma Arising in a Previously Irradiated Field: An Analysis of 43 Patients
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dang, Nguyen D.; Teh, Bin S.; Paulino, Arnold C., E-mail: apaulino@tmhs.org
2013-03-01
Patients with soft tissue sarcomas that arise from previously irradiated fields have traditionally been reported to have a poor prognosis. In this report, we examined the characteristics and outcomes of patients who developed a rhabdomyosarcoma in a previously irradiated field (RMS-RIF); we hypothesize that these patients should have a better outcome compared to other postradiation soft tissue sarcomas as these tumors are chemosensitive and radiosensitive. A PubMed search of the literature from 1961-2010 yielded 33 studies with data for patients with RMS-RIF. The study included 43 patients with a median age of 6.5 years at the time of radiation therapy (RT) for the initial tumor. The median RT dose was 48 Gy. The median latency period, the time from RT to development of RMS-RIF, was 8 years. The 3-year overall survival for RMS-RIF was 42%. The 3-year overall survival was 66% for patients receiving chemotherapy and local treatment (surgery and/or RT) compared to 29% for those who had systemic treatment only or local treatment only (P=.049). Other factors associated with increased 3-year overall survival included retinoblastoma initial diagnosis (P<.001), age ≤18 years at diagnosis of RMS-RIF (P=.003), favorable site (P=.008), and stage 1 disease (P=.002). Age at time of RMS-RIF, retinoblastoma initial tumor, favorable site, stage 1 disease, and use of both systemic and local treatment were found to be favorable prognostic factors for 3-year overall survival.
Anal sac tumours of the dog and their response to cytoreductive surgery and chemotherapy.
Emms, S G
2005-06-01
A retrospective study of anal sac tumours without pulmonary metastases, from the author's clinical records for the period July 1989 to July 2002, was conducted to establish the response to treatment with surgery and melphalan chemotherapy. Of 21 dogs with tumours of the anal sacs, 19 had apocrine gland adenocarcinomas of anal sac origin, one had a benign papillary cystadenoma and another had a malignant melanoma. Two of the 19 dogs had bilateral anal sac adenocarcinomas. Ten of the 19 dogs with apocrine gland adenocarcinomas of anal sac origin had sublumbar lymphadenopathy. Five dogs were excluded by their owners from recommended treatment. Fourteen dogs with apocrine gland adenocarcinomas of anal sac origin were treated by surgical cytoreduction and chemotherapy with melphalan. Seven of the 14 dogs had regional lymph node metastases. Cytoreduction was by local excision of the anal sac in all 14 dogs and concurrent removal of the sublumbar retroperitoneal lymph nodes in the seven dogs with regional lymph node metastases. The median survival time of dogs with sublumbar nodal metastasis was 20 months, and for dogs with tumour localised to the anal sac the median survival time was 29.3 months; this difference was not statistically significant. This study suggests there is a role for melphalan in the treatment of dogs with anal sac adenocarcinoma when combined with cytoreductive surgery, with treatment survival times and the local recurrence rate of the primary tumour comparing favourably with previously published treatment regimes.
Retrospective analysis of survival rates and marginal bone loss on short implants in the mandible.
Draenert, Florian G; Sagheb, Keyvan; Baumgardt, Katharina; Kämmerer, Peer W
2012-09-01
Short implants have become an interesting alternative to bone augmentation in dental implantology. The design of shorter implants and longer surveillance times are current research topics. The goal of this study was to determine the survival rates of short implants below 9 mm in the partly edentulous mandibular premolar and molar regions with fixed prosthetics. Marginal vertical and two-dimensional bone loss was evaluated additionally, and different implant designs were assessed in an exploratory fashion. A total of 247 dental implants with fixed prosthetics (crowns and bridges) in the premolar and molar region of the mandible were evaluated; 47 implants were 9 mm or shorter. Patient data were evaluated to acquire implant survival rates, implant diameter, gender and age. Panoramic X-rays were analysed for marginal bone loss. Average surveillance time was 1327 days. The cumulative survival rate (CSR) of short implants was 98% (1 implant lost) compared to 94% in the longer-implant group; the difference was not statistically significant. Thirty-five of the short implants were Astratech (0 losses) and 12 were Camlog Screw Line Promote Plus (1 loss). Early vertical and two-dimensional marginal bone loss did not differ significantly between the short- and regular-length implant groups, with averages of 0.6 mm and 0.7 mm² respectively in short implants over the observation period. Within the limitations of this study, we conclude that short implants with a length of 9 mm or less have survival rates equal to those of longer implants over the observation period of 1-3 years. © 2011 John Wiley & Sons A/S.
Survival in commercially insured multiple sclerosis patients and comparator subjects in the U.S.
Kaufman, D W; Reshef, S; Golub, H L; Peucker, M; Corwin, M J; Goodin, D S; Knappertz, V; Pleimes, D; Cutter, G
2014-05-01
Compare survival in patients with multiple sclerosis (MS) from a U.S. commercial health insurance database with a matched cohort of non-MS subjects. 30,402 MS patients and 89,818 non-MS subjects (comparators) in the OptumInsight Research (OIR) database from 1996 to 2009 were included. An MS diagnosis required at least 3 consecutive months of database reporting, with two or more ICD-9 codes of 340 at least 30 days apart, or the combination of 1 ICD-9-340 code and at least 1 MS disease-modifying treatment (DMT) code. Comparators required the absence of ICD-9-340 and DMT codes throughout database reporting. Up to three comparators were matched to each patient for: age in the year of the first relevant code (index year - at least 3 months of reporting in that year were required); sex; region of residence in the index year. Deaths were ascertained from the National Death Index and the Social Security Administration Death Master File. Subjects not identified as deceased were assumed to be alive through the end of 2009. Annual mortality rates were 899/100,000 among MS patients and 446/100,000 among comparators. Standardized mortality ratios compared to the U.S. population were 1.70 and 0.80, respectively. Kaplan-Meier analysis yielded a median survival from birth that was 6 years lower among MS patients than among comparators. The results show, for the first time in a U.S. population, a survival disadvantage for contemporary MS patients compared to non-MS subjects from the same healthcare system. The 6-year decrement in lifespan parallels a recent report from British Columbia. Copyright © 2013 Elsevier B.V. All rights reserved.
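The crude mortality rates and standardized mortality ratios (SMRs) quoted above follow directly from observed deaths, accumulated person-time, and a reference population's death rate. A minimal sketch; all counts below are hypothetical, chosen only to echo the magnitudes reported in this abstract:

```python
# Sketch of a crude annual mortality rate and a standardized mortality
# ratio (SMR). All inputs are hypothetical illustration values.
deaths_observed = 899          # deaths seen in the cohort (assumed)
person_years = 100_000         # person-time accumulated by the cohort (assumed)
ref_rate = 529 / 100_000       # matched reference-population death rate (assumed)

# crude rate, expressed per 100,000 person-years
crude_rate = deaths_observed / person_years * 100_000

# SMR = observed deaths / deaths expected under the reference rate;
# a value above 1 indicates excess mortality in the cohort
expected_deaths = ref_rate * person_years
smr = deaths_observed / expected_deaths
```

With these illustration values the crude rate is 899 per 100,000 and the SMR is about 1.70, mirroring the magnitudes the study reports for MS patients.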
Lower kidney allograft survival in African-Americans compared to Hispanic-Americans with lupus.
Gonzalez-Suarez, M L; Contreras, G
2017-10-01
Background and objective: African-Americans and Hispanic-Americans with lupus are the two most common minority groups who receive kidney transplants in the USA. It is unknown whether African-Americans and Hispanic-Americans with lupus have similar outcomes after kidney transplantation. In this study, we assessed whether African-Americans have worse kidney allograft survival than Hispanic-Americans after risk factors for rejection and other prognostic factors were matched between the groups. Methods: Out of 1816 African-Americans and 901 Hispanic-Americans with lupus who received kidney transplants between 1987 and 2006 and had complete records in the UNOS program, 478 pairs were matched on 16 baseline predictors and follow-up time using a predicted probability of group membership. The primary outcome was kidney allograft survival. The main secondary outcomes were rejection, allograft failure attributed to rejection, and mortality. Results: Matched pairs were predominantly women (81%) with a mean age of 36 years. 96% were on dialysis before transplantation, 89% of recipients received kidneys from deceased donors and 15.5% from expanded criteria donors, and 12% of recipients had zero HLA mismatch. African-Americans had lower cumulative allograft survival than Hispanic-Americans during 12-year follow-up (p < 0.001), as well as higher rates of rejection (10.4 vs 6.73 events/100 patient-years; p = 0.0002) and of allograft failure attributed to rejection (6.31 vs 3.99; p = 0.0023). However, African-Americans and Hispanic-Americans had similar mortality rates (2.71 vs 2.31; p = 0.4269). Conclusions: African-Americans with lupus had lower kidney allograft survival than Hispanic-Americans when recognized risk factors for rejection were matched between groups.
Simpson, Siobhan; Dunning, Mark David; de Brot, Simone; Grau-Roma, Llorenç; Mongan, Nigel Patrick; Rutland, Catrin Sian
2017-10-24
Osteosarcoma (OSA) is a rare cancer in people, but OSA incidence rates in dogs are 27 times higher than in people. Prognosis in both species is relatively poor: 5-year OSA survival rates in people have not improved in decades, and 1-year survival rates in dogs are only around 45%. Improved and novel treatment regimens are urgently required to improve survival in both humans and dogs with OSA. Information from genetic studies could assist in both species; the higher incidence rate in dogs also makes the dog population a good model of the human disease. This review compares the clinical characteristics, gross morphology and histopathology, aetiology, epidemiology, and genetics of canine and human OSA. Finally, the current position of canine OSA genetic research is discussed and areas for additional work within the canine population are identified.
Chua, Terence C; Saxena, Akshat; Chu, Francis; Butler, S Patrick; Quinn, Richard J; Glenn, Derek; Morris, David L
2011-04-01
Resection of hepatocellular carcinoma (HCC) is potentially curative; however, recurrence is common. To date, few or no effective adjuvant therapies have been adequately investigated. This study evaluates the efficacy of adjuvant iodine-131-lipiodol after hepatic resection through the experience of a single-center hepatobiliary service of managing this disease. All patients who underwent hepatic resection for HCC and received adjuvant iodine-131-lipiodol between January 1991 and August 2009 were selected for inclusion into the experimental group. A group composed of patients treated during the same time period without adjuvant iodine-131-lipiodol was identified through the unit's HCC surgery database for comparison. The endpoints of this study were disease-free survival and overall survival. Forty-one patients who received adjuvant iodine-131-lipiodol after hepatic resection were compared with a matched group of 41 patients who underwent hepatic resection only. The median disease-free and overall survival were 24 versus 10 months (P = 0.032) and 104 versus 19 months (P = 0.001) in the experimental and control groups, respectively. Rates of intrahepatic-only recurrences (73 vs. 37%; P = 0.02) and surgical and nonsurgical treatments for recurrences (84 vs. 56%; P = 0.04) were higher in the experimental group compared to the control group. The finding of this study corroborates the current evidence from randomized and nonrandomized trials that adjuvant iodine-131-lipiodol improves disease-free and overall survival in patients with HCC after hepatic resection. The lengthened disease-free survival after adjuvant iodine-131-lipiodol allows for further disease-modifying treatments to improve the overall survival.
Outcomes of kidney transplantation in Alport syndrome compared with other forms of renal disease.
Kelly, Yvelynne P; Patil, Anish; Wallis, Luke; Murray, Susan; Kant, Saumitra; Kaballo, Mohammed A; Casserly, Liam; Doyle, Brendan; Dorman, Anthony; O'Kelly, Patrick; Conlon, Peter J
2017-11-01
Alport syndrome is an inherited renal disease characterized by hematuria, renal failure, hearing loss and a lamellated glomerular basement membrane. Patients with Alport syndrome who undergo renal transplantation have been shown to have patient and graft survival rates similar to or better than those of patients with other renal diseases. In this national case series, based in Beaumont Hospital Dublin, we studied the cohort of patients who underwent renal transplantation over the past 33 years, recorded prospectively in the Irish Renal Transplant Registry, and categorized them according to the presence or absence of Alport syndrome. The main outcomes assessed were patient and renal allograft survival. Fifty-one patients diagnosed with Alport syndrome in Beaumont Hospital received 62 transplants between 1982 and 2014. The comparison group of non-Alport patients comprised 3430 patients for 3865 transplants. Twenty-year Alport patient survival rate was 70.2%, compared to 44.8% for patients with other renal diseases (p = .01). Factors associated with patient survival included younger age at transplantation as well as differences in recipient sex, donor age, cold ischemia time, and episodes of acute rejection. Twenty-year graft survival was 46.8% for patients with Alport syndrome compared to 30.2% for those with non-Alport disease (p = .11). Adjusting for baseline differences between the groups, patients with end-stage kidney disease (ESKD) due to Alport syndrome have similar patient and graft survival to those with other causes of ESKD. This indicates that early diagnosis and management can lead to favorable outcomes for this patient cohort.
Faro, Albert; Shepherd, Ross; Huddleston, Charles B; Lowell, Jeffrey; Gandhi, Sanjiv; Nadler, Michelle; Sweet, Stuart C
2007-06-15
Simultaneous liver-lung transplantation is an infrequent but technically feasible procedure in patients with end-stage lung disease and advanced liver disease. We characterize the outcomes of pediatric patients who underwent this procedure at our institution. We performed a retrospective, case-control study and reviewed the medical records of all patients referred to our transplant program from its inception. Seven patients were listed for simultaneous liver-lung transplant. The five patients who survived to transplant were matched to 13 controls who underwent isolated bilateral sequential lung transplant for underlying diagnosis, age at time of transplant, gender, and era of transplant. Outcome measures included patient and graft survival, occurrence of bronchiolitis obliterans (BO), and episodes of rejection. Of the five study patients who underwent liver-lung transplant, one died of multiorgan failure 11 days after transplant compared with 9 of 13 controls who died. The median survival for the study patients was 89 months (range, 0-112 months) compared with the controls, who had a median survival of 34 months (range, 0-118 months). The remaining four patients had bronchiolitis obliterans syndrome scores of 0 compared with 5 of 13 control patients (P=0.02). The rate of acute rejection per 100 patient days was 0.012 for the study patients compared with 0.11 for the controls (P=0.025). Simultaneous liver-lung transplantation is a technically feasible procedure with excellent long-term outcomes. The surviving study subjects remain free from bronchiolitis obliterans syndrome. These results suggest that the transplanted liver may bestow immunologic privilege to the lung allograft.
Influence of drug therapy on the risk of recurrence of ocular toxoplasmosis.
Reich, Michael; Becker, Matthias D; Mackensen, Friederike
2016-02-01
Retrospective, observational case series with follow-up examination to analyse the influence of drug therapy on ocular toxoplasmosis (OT) in terms of recurrence risk. In this single-centre study an existing data set of 84 patients with active OT was used. Drug therapy for 255 active lesions was retrospectively reconstructed. Median recurrence-free survival time was calculated for the different treatment regimens using Kaplan-Meier estimates. Twenty different regimens were used to treat OT in the catchment area of the Interdisciplinary Uveitis Center, University of Heidelberg, Germany. Median recurrence-free survival time was significantly lower after systemic corticosteroid monotherapy (0.9 years; 95% CI 0.5 to 1.3 years) than after Toxoplasma gondii-specific antibiotic treatment (3.0 years; 95% CI 2.2 to 3.9 years; p<0.001) or no therapy (3.0 years; 95% CI 2.1 to 3.9 years; p=0.006). No difference in median recurrence-free survival time could be detected between T. gondii-specific antibiotics and no therapy (p=0.679). Although our study shows that drug therapy seems to influence the risk of recurrence of OT, there is no consensus regarding the choice of antiparasitic agents for treatment regimens in the catchment area of the Interdisciplinary Uveitis Center, University of Heidelberg. The results provide useful information for treating physicians and for clinical investigators interested in therapy. Published by the BMJ Publishing Group Limited.
Dumitraşcu, Traian; Stroescu, Cezar; Braşoveanu, Vladislav; Herlea, Vlad; Ionescu, Mihnea; Popescu, Irinel
2017-01-01
Introduction: The safety of portal vein resection (PVR) during surgery for perihilar cholangiocarcinoma (PHC) has been demonstrated in Asia, America, and Western Europe; however, no data on this topic have been reported from Eastern Europe. The aim of the present study is to comparatively assess the early and long-term outcomes after resection for PHC with and without PVR. The data of 21 patients with PVR were compared with those of 102 patients with a curative-intent surgery for PHC without PVR. Appropriate statistical tests were used to compare variables between the groups. Results: A PVR was performed in 17% of the patients. In the PVR group, significantly more right trisectionectomies (p=0.031) and caudate lobectomies (p=0.049) were performed and, as expected, both the operative time (p=0.015) and blood loss (p=0.002) were significantly higher. No differences between the groups were observed regarding severe postoperative morbidity and mortality rates, or completion of adjuvant therapy. However, in the PVR group the postoperative clinically relevant liver failure rate was significantly higher (p=0.001). No difference between the groups was observed for median overall survival (34 vs. 26 months, p=0.566). Histological proof of venous tumor invasion was observed in 52% of the patients with a PVR and was associated with significantly worse survival (p=0.027). A PVR can be safely performed during resection for PHC without significant added severe morbidity or mortality. However, clinically relevant liver failure rates are significantly higher when a PVR is performed, and increased operative times and blood loss should be expected. Histological tumor invasion of the portal vein is associated with significantly worse survival.
Raheel, Shafay; Shbeeb, Izzat; Crowson, Cynthia S; Matteson, Eric L
2017-08-01
To determine time trends in the incidence and survival of polymyalgia rheumatica (PMR) over a 15-year period in Olmsted County, Minnesota, and to examine trends in incidence of PMR in the population by comparing this time period to a previous incidence cohort from the same population base. All cases of incident PMR among Olmsted County, Minnesota residents in 2000-2014 were identified to extend the previous 1970-1999 cohort. Detailed review of all individual medical records was performed. Incidence rates were age- and sex-adjusted to the US white 2010 population. Survival rates were compared with the expected rates in the population of Minnesota. There were 377 incident cases of PMR during the 15-year study period. Of these, 64% were female and the mean age at incidence was 74.1 years. The overall age- and sex-adjusted annual incidence of PMR was 63.9 (95% confidence interval [95% CI] 57.4-70.4) per 100,000 population ages ≥50 years. Incidence rates increased with age in both sexes, but incidence fell after age 80 years. There was a slight increase in incidence of PMR in the recent time period compared to 1970-1999 (P = 0.063). Mortality among individuals with PMR was not significantly worse than that expected in the general population (standardized mortality ratio 0.70 [95% CI 0.57-0.85]). The incidence of PMR has increased slightly in the past 15 years compared to previous decades. Survivorship in patients with PMR is not worse than in the general population. © 2016, American College of Rheumatology.
Magaji, Bello Arkilla; Moy, Foong Ming; Roslani, April Camilla; Law, Chee Wei
2017-05-18
Colorectal cancer is the third most commonly diagnosed malignancy and the fourth leading cause of cancer-related death globally. It is the second most common cancer among both males and females in Malaysia. The economic burden of colorectal cancer is likely to increase over time owing to current trends and an aging population. Cancer survival analysis is an essential indicator for early detection and improvement in cancer treatment. However, there is a scarcity of studies concerning the survival of colorectal cancer patients and its predictors. Therefore, we aimed to determine the 1-, 3- and 5-year survival rates, compare survival rates among ethnic groups and determine the predictors of survival among colorectal cancer patients. This was an ambidirectional cohort study conducted at the University Malaya Medical Centre (UMMC) in Kuala Lumpur, Malaysia. All Malaysian citizens or permanent residents with histologically confirmed diagnosis of colorectal cancer seen at UMMC from 1 January 2001 to 31 December 2010 were included in the study. Demographic and clinical characteristics were extracted from the medical records. Patients were followed up until death or censored at the end of the study (31st December 2010). Censored patients' vital status (whether alive or dead) was cross-checked with the National Registration Department. Survival analyses at 1-, 3- and 5-year intervals were performed using the Kaplan-Meier method. The log-rank test was used to compare the survival rates, while Cox proportional hazards regression analysis was carried out to determine the predictors of 5-year colorectal cancer survival. Among 1212 patients, the median survival for colorectal, colon and rectal cancers was 42.0, 42.0 and 41.0 months respectively, while the 1-, 3-, and 5-year relative survival rates ranged from 73.8 to 76.0%, 52.1 to 53.7% and 40.4 to 45.4% respectively. The Chinese patients had the lowest 5-year survival compared to Malay and Indian patients.
Based on the 814 patients with data on their Duke's staging, independent predictors of poor colorectal cancer (5-year) survival were male sex (Hazard Ratio [HR]: 1.41; 95% CI: 1.12, 1.76), Chinese ethnicity (HR: 1.41; 95% CI: 1.07, 1.85), elevated (≥5.1 ng/ml) pre-operative carcino-embryonic antigen (CEA) level (HR: 2.13; 95% CI: 1.60, 2.83), Duke's stage C (HR: 1.68; 95% CI: 1.28, 2.21), Duke's stage D (HR: 4.61; 95% CI: 3.39, 6.28) and emergency surgery (HR: 1.52; 95% CI: 1.07, 2.15). The survival rates of colorectal cancer among our patients were comparable with those of some Asian countries but lower than those found in more developed countries. Males and patients from the Chinese ethnic group had lower survival rates compared to their counterparts. More advanced staging and late presentation were important predictors of colorectal cancer survival. Health education programs targeting high risk groups and emphasizing the importance of screening and early diagnosis, as well as the recognition of symptoms and risk factors, should be implemented. A nationwide colorectal cancer screening program should be designed and implemented to increase early detection and improve survival outcomes.
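Several of the studies collected here rest on the Kaplan-Meier (product-limit) method used above. A minimal pure-Python sketch of the estimator; the function name and the toy follow-up data are illustrative, not taken from any of these studies:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.

    times  -- follow-up times (e.g. months)
    events -- 1 if death observed at that time, 0 if censored
    Returns a list of (time, S(t)) steps at each event time.
    """
    pairs = sorted(zip(times, events))
    at_risk = len(pairs)
    s = 1.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        tied = [e for tt, e in pairs if tt == t]
        deaths = sum(tied)
        if deaths:
            s *= 1 - deaths / at_risk   # multiply by conditional survival at t
            curve.append((t, s))
        at_risk -= len(tied)            # deaths and censorings both leave the risk set
        i += len(tied)
    return curve

# toy data: deaths at 1, 2 and 3 months, one censoring at 2 months
print(kaplan_meier([1, 2, 2, 3], [1, 1, 0, 1]))
```

The median survival times reported in these abstracts are read off such a curve as the earliest time at which S(t) drops to 0.5 or below.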
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weiland, Mark A.; Ploskey, Gene R.; Hughes, James S.
The purpose of this study was to compare dam passage survival, at two spill treatment levels, of yearling Chinook salmon and steelhead smolts at John Day Dam during spring 2010. The two treatments were 30% and 40% spill out of total project discharge. Under the 2008 Federal Columbia River Power System (FCRPS) Biological Opinion (BiOp), dam passage survival should be greater than or equal to 0.96 and estimated with a standard error (SE) less than or equal to 0.015. The study also estimated forebay residence time, tailrace egress time, and spill passage efficiency (SPE), as required in the Columbia Basin Fish Accords. However, by agreement among the stakeholders, this study was not an official BiOp compliance test because the long-term passage measures at John Day Dam have yet to be finalized and another year of spill-treatment testing was desired.
Hospital Variation in Time to Epinephrine for Non-Shockable In-Hospital Cardiac Arrest
Khera, Rohan; Chan, Paul S.; Donnino, Michael; Girotra, Saket
2016-01-01
Background: For patients with in-hospital cardiac arrest due to non-shockable rhythms, delays in epinephrine administration beyond 5 minutes are associated with worse survival. However, the extent of hospital variation in delayed epinephrine administration and its impact on hospital-level outcomes is unknown. Methods: Within Get With the Guidelines-Resuscitation, we identified 103,932 adult patients (≥18 years) at 548 hospitals with an in-hospital cardiac arrest due to a non-shockable rhythm who received at least 1 dose of epinephrine between 2000 and 2014. We constructed two-level hierarchical regression models to quantify hospital variation in rates of delayed epinephrine administration (>5 minutes) and its association with hospital rates of survival to discharge and survival with functional recovery. Results: Overall, 13,213 (12.7%) patients had delays to epinephrine, and this rate varied markedly across hospitals (range: 0% to 53.8%). The odds of delay in epinephrine administration were 58% higher for a patient at one randomly selected hospital compared to a similar patient at another randomly selected hospital (median odds ratio [OR] 1.58; 95% CI 1.51-1.64). The median risk-standardized survival rate was 12.0% (range: 5.4% to 31.9%) and risk-standardized survival with functional recovery was 7.4% (range: 0.9% to 30.8%). There was an inverse correlation between a hospital's rate of delayed epinephrine administration and its risk-standardized rate of survival to discharge (ρ = −0.22, P<0.0001) and survival with functional recovery (ρ = −0.14, P=0.001). Compared to a median survival rate of 12.9% (interquartile range 11.1% to 15.4%) at hospitals in the lowest quartile of epinephrine delay, risk-standardized survival was 16% lower at hospitals in the quartile with the highest rate of epinephrine delays (10.8%, interquartile range: 9.7% to 12.7%). Conclusions: Delays in epinephrine administration following in-hospital cardiac arrest are common and vary across hospitals.
Hospitals with high rates of delayed epinephrine administration had lower rates of overall survival for in-hospital cardiac arrest due to non-shockable rhythms. Further studies are needed to determine whether improving hospital performance on time to epinephrine administration, especially at hospitals with poor performance on this metric, will lead to improved outcomes. PMID:27908910
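The median odds ratio quoted above summarizes the between-hospital variance of the random intercept in a two-level logistic model; the standard conversion (Larsen's formula) is MOR = exp(√(2σ²)·Φ⁻¹(0.75)). A sketch assuming that formula; the variance value used here is hypothetical, back-solved to land near the reported MOR of 1.58:

```python
import math
from statistics import NormalDist

def median_odds_ratio(var_between: float) -> float:
    """Larsen's median odds ratio from the cluster-level (here: hospital)
    random-intercept variance of a two-level logistic regression."""
    z75 = NormalDist().inv_cdf(0.75)  # 75th percentile of the standard normal, ~0.6745
    return math.exp(math.sqrt(2 * var_between) * z75)

# hypothetical between-hospital variance on the log-odds scale
print(round(median_odds_ratio(0.23), 2))
```

A variance of zero gives an MOR of exactly 1 (no between-hospital variation); larger variances inflate the median odds contrast between two randomly chosen hospitals.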
Thoma, D S; Zeltner, M; Hüsler, J; Hämmerle, C H F; Jung, R E
2015-09-01
To compare short implants in the posterior maxilla to longer implants placed after or simultaneously with sinus floor elevation procedures. The focused question was as follows: Are short implants superior to longer implants in the augmented sinus in terms of survival and complication rates of implants and reconstructions, patient-reported outcome measures (PROMs) and costs? A MEDLINE search (1990-2014) was performed for randomized controlled clinical studies comparing short implants (≤8 mm) to longer implants (>8 mm) in the augmented sinus. The search was complemented by an additional hand search of the selected papers and of reviews published between 2011 and 2014. Eligible studies were selected based on the inclusion criteria, and quality assessments were conducted. Descriptive statistics were applied for a number of outcome measures. Survival rates of dental implants were pooled when studies were comparable. Eight randomized controlled clinical trials (RCTs) comparing short implants versus longer implants in the augmented sinus, derived from an initial search count of 851 titles, were selected and data extracted. In general, all studies were well conducted with a low risk of bias for the majority of the analyzed parameters. Based on the pooled analyses of longer follow-ups (5 studies, 16-18 months), the survival rate of longer implants amounted to 99.5% (95% CI: 97.6-99.98%) and that of shorter implants to 99.0% (95% CI: 96.4-99.8%). For shorter follow-ups (3 studies, 8-9 months), the survival rate of longer implants was 100% (95% CI: 97.1-100%) and that of shorter implants 98.2% (95% CI: 93.9-99.7%). Complications were predominantly of biological origin, mainly occurred intraoperatively as membrane perforations, and were almost three times as frequent for longer implants in the augmented sinus as for shorter implants. PROMs, morbidity, surgical time and costs were generally in favor of shorter dental implants.
All studies were performed by surgeons in specialized clinical settings. The outcomes of the survival analyses demonstrated predictably high implant survival rates for short implants and for longer implants placed in the augmented sinus, and for their respective reconstructions. Given the higher number of biological complications, increased morbidity, costs and surgical time of longer dental implants in the augmented sinus, shorter dental implants may represent the preferred treatment alternative. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Radisavljević, Mirjana; Bjelaković, Goran; Jović, Jasna; Radovanović-Dinić, Biljana; Benedoto-Stojanov, Danijela; Brzački, Vesna; Marković-Živković, Bojana
2017-01-01
Bleeding from esophageal varices is a significant factor in the mortality of patients with terminal liver cirrhosis. This complication is a major health problem for recipients on the list for liver transplant. In that regard, studying predictors of a variceal bleeding episode is very important; it is also important to find the best survival predictor among prognostic scores. The aim of the study was to compare the validity of prognostic scores in assessing survival in hospital-treated patients after bleeding from esophageal varices, and to compare the validity of baseline Child-Turcotte-Pugh (CTP) and Model for End-stage Liver Disease (MELD) scores with the CTP creatinine-modified (CTP-crea) I and II scores in assessing survival in patients within a long-term follow-up period after the episode of bleeding from esophageal varices. The study included a total of 126 patients with terminal liver cirrhosis submitted to testing with the CTP score, CTP-crea scores I and II, MELD score, MELD-Na score, integrated MELD score, MELD sodium (MESO) index, United Kingdom Model for End-Stage Liver Disease (UKELD) score and updated MELD score. Patients with bleeding from esophageal varices most often had CTP score rank C (46.9%); 37.5% of patients had CTP score rank B, while the smallest percentage, 15.6%, had CTP rank A. Patients with a CTP score higher than 10.50 and bleeding from the esophagus have a 3.2 times higher chance of a fatal outcome than other patients. Patients with a CTP-crea I score higher than 10.50 and bleeding from the esophagus have a 3.1 times higher chance of a fatal outcome than other patients. Patients with a CTP-crea II score higher than 11.50 and bleeding from the esophagus have a 3.7 times higher chance of a fatal outcome than other patients. Survival of patients with bleeding from esophageal varices in the short-term follow-up can be predicted with the CTP score and the creatinine-modified CTP scores.
Patients with bleeding from esophageal varices who have a CTP score and CTP-crea I score higher than 10.5 and a CTP-crea II score higher than 11.5 have a statistically significantly higher risk of mortality within one-month follow-up compared to patients with bleeding from esophageal varices who have lower values of the CTP-group scores.
Roembke, Felicitas; Heinzow, Hauke Sebastian; Gosseling, Thomas; Heinecke, Achim; Domagk, Dirk; Domschke, Wolfram; Meister, Tobias
2014-01-01
Pneumocystis jirovecii pneumonia, also known as pneumocystis pneumonia (PCP), is an opportunistic respiratory infection in human immunodeficiency virus (HIV) patients that may also develop in non-HIV immunocompromised persons. The aim of our study was to evaluate mortality predictors of PCP patients in a tertiary referral centre. Fifty-one patients with symptomatic PCP were enrolled in the study. The patients had either HIV infection (n = 21) or other immunosuppressive conditions (n = 30). Baseline characteristics (e.g. age, sex and underlying disease) were retrieved. Kaplan-Meier analysis was employed to calculate survival. Comparisons were made by log-rank test. A multivariate analysis of factors influencing survival was carried out using the Cox regression model. Chi-squared and Wilcoxon-Mann-Whitney tests were applied as appropriate. The median survival time for the HIV group was >120 months compared with 3 months for the non-HIV group (P = 0.009). Three-month survival probability was also significantly greater in the HIV group compared with the non-HIV group (90% vs 41%, P = 0.002). In the univariate log-rank test, intensive care unit (ICU) necessity, HIV negativity, age >50 years, haemoglobin <10 g/dl, C-reactive protein >5 mg/dL and multiple comorbidities were significant negative predictors of survival. In the Cox regression model, ICU status and HIV status were independent prognostic factors of survival. PCP is a serious problem in non-HIV immunocompromised patients, in whom survival outcomes are worse than those in HIV patients. © 2013 John Wiley & Sons Ltd.
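The two-group log-rank comparison used in studies like the one above can be sketched in a few lines of pure Python. This is a minimal illustration with toy data of our own, not the study's data:

```python
def logrank_chi2(times1, events1, times2, events2):
    """Two-group log-rank chi-square statistic (1 df).

    times*: follow-up times; events*: 1 = event observed, 0 = censored.
    """
    data = [(t, e, 1) for t, e in zip(times1, events1)] + \
           [(t, e, 2) for t, e in zip(times2, events2)]
    event_times = sorted({t for t, e, _ in data if e == 1})
    obs1 = exp1 = var = 0.0
    for t in event_times:
        at_risk = [(tt, ee, g) for tt, ee, g in data if tt >= t]
        n = len(at_risk)                                    # total at risk
        n1 = sum(1 for tt, ee, g in at_risk if g == 1)      # group 1 at risk
        d = sum(1 for tt, ee, g in at_risk if tt == t and ee == 1)
        d1 = sum(1 for tt, ee, g in at_risk
                 if tt == t and ee == 1 and g == 1)
        obs1 += d1                      # observed events in group 1
        exp1 += d * n1 / n              # expected under the null
        if n > 1:                       # hypergeometric variance term
            var += d * (n1 / n) * (1 - n1 / n) * (n - d) / (n - 1)
    return (obs1 - exp1) ** 2 / var

# Toy example: group 1 fails early, group 2 late (all events observed).
chi2 = logrank_chi2([1, 2], [1, 1], [3, 4], [1, 1])
```

The statistic is compared against a chi-square distribution with one degree of freedom; in practice one would use an established implementation (e.g. `survdiff` in R's survival package) rather than this sketch.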
Prognosis of hepatocellular carcinoma: comparison of 7 staging systems in an American cohort.
Marrero, Jorge A; Fontana, Robert J; Barrat, Ashley; Askari, Frederick; Conjeevaram, Hari S; Su, Grace L; Lok, Anna S
2005-04-01
Currently there is no consensus as to which staging system is best in predicting the survival of patients with hepatocellular carcinoma (HCC). The aims of this study were to identify independent predictors of survival and to compare 7 available prognostic staging systems in patients with HCC. A total of 239 consecutive patients with cirrhosis and HCC seen between January 1, 2000, and December 31, 2003, were included. Demographic, laboratory, and tumor characteristics and performance status were determined at diagnosis and before therapy. Predictors of survival were identified using the Kaplan-Meier test and the Cox model. Sixty-two percent of patients had hepatitis C, 56% had more than 1 tumor nodule, 24% had portal vein thrombosis, and 29% did not receive any cancer treatment. At the time of censorship, 153 (63%) patients had died. The 1- and 3-year survival of the entire cohort was 58% and 29%, respectively. The independent predictors of survival were performance status (P < .0001), MELD score greater than 10 (P = .001), portal vein thrombosis (P = .0001), and tumor diameter greater than 4 cm (P = .001). Treatment of HCC was related to overall survival. The Barcelona Clinic Liver Cancer (BCLC) staging system had the best independent predictive power for survival when compared with the other 6 prognostic systems. In conclusion, performance status, tumor extent, liver function, and treatment were independent predictors of survival in patients with cirrhosis and HCC. The BCLC staging system includes aspects of all of these elements and provided the best prognostic stratification for our cohort of patients with HCC.
Frouws, M A; Rademaker, E; Bastiaannet, E; van Herk-Sukel, M P P; Lemmens, V E; Van de Velde, C J H; Portielje, J E A; Liefers, G J
2017-05-01
Several studies have suggested that the association between aspirin and improved cancer survival is mediated through the mechanism of aspirin as a thrombocyte aggregation inhibitor (TAI). The aim of this study was to provide epidemiological evidence for this mechanism by assessing the association between overall survival and the use of aspirin and non-aspirin TAI in patients with colorectal cancer. In this observational study, data from the Netherlands Comprehensive Cancer Organisation were linked to the PHARMO Database Network. Patients using aspirin or aspirin in combination with non-aspirin TAI (dual users) were selected and compared with non-users. The association between overall survival and the use of (non-)aspirin TAI was analysed using Cox regression models with the use of (non-)aspirin TAI as a time-varying covariate. In total, 9196 patients were identified with colorectal cancer and 1766 patients used TAI after diagnosis. Non-aspirin TAI were mostly clopidogrel and dipyridamole. Aspirin use was associated with significantly increased overall survival (hazard ratio [HR] 0.41; 95% confidence interval [CI] 0.37-0.47), whereas the use of non-aspirin TAI was not associated with survival (HR 0.92; 95% CI 0.70-1.22). Dual users did not have improved overall survival when compared with patients using solely aspirin. Aspirin use after diagnosis of colorectal cancer was associated with significantly lower mortality rates, and this effect remained significant after adjusting for potential confounders. No additional survival benefit was observed in patients using both aspirin and another TAI. Copyright © 2017 Elsevier Ltd. All rights reserved.
Knollmann, Friedrich D; Kumthekar, Rohan; Fetzer, David; Socinski, Mark A
2014-03-01
We set out to investigate whether volumetric tumor measurements allow for a prediction of treatment response, as measured by patient survival, in patients with advanced non-small-cell lung cancer (NSCLC). Patients with nonresectable NSCLC (stage III or IV, n = 100) who were repeatedly evaluated for treatment response by computed tomography (CT) were included in a Health Insurance Portability and Accountability Act (HIPAA)-compliant retrospective study. Tumor response was measured by comparing tumor volumes over time. Patient survival was compared with Response Evaluation Criteria in Solid Tumors (RECIST) using Kaplan-Meier survival statistics and Cox regression analysis. The median overall patient survival was 553 days (standard error, 146 days); for patients with stage III NSCLC, it was 822 days, and for patients with stage IV disease, 479 days. The survival differences were not statistically significant (P = .09). According to RECIST, 5 patients demonstrated complete response, 39 partial response, 44 stable disease, and 12 progressive disease. Patient survival was not significantly associated with RECIST class, the change in the sum of tumor diameters (P = .98), or the change in the sum of volumetric tumor dimensions (P = .17). In a group of 100 patients with advanced-stage NSCLC, neither volumetric CT measurements of changes in tumor size nor RECIST class significantly predicted patient survival. This observation suggests that size response may not be a sufficiently precise surrogate marker of success to steer treatment decisions in individual patients. Copyright © 2014 Elsevier Inc. All rights reserved.
Roué, Tristan; Labbé, Sylvain; Belliardo, Sophie; Plenet, Juliette; Douine, Maylis; Nacher, Mathieu
2016-08-01
The prognosis of patients with breast cancer in French Guiana is worse than in France, with 23 deaths per 100 incident cases versus 17 per 100 in metropolitan France. This study aimed to compare the relative survival of patients with invasive breast cancer (IBC) between women from French Guiana and metropolitan France and to determine risk factors influencing breast cancer survival in French Guiana. Data were collected from the Cancer Registry of French Guiana. We compared the relative survival of women with IBC between French Guiana and metropolitan France. We used the Cox proportional hazard regression to evaluate the effect of prognostic factors on cancer-specific mortality in French Guiana. We included all 269 cases of IBC in women diagnosed in French Guiana between 2003 and 2009. The overall 5-year relative survival rate of patients with IBC was 79% in French Guiana and 86% in metropolitan France. The place of birth (foreign country vs. French territory), the tumor stage at the time of diagnosis, the mode of diagnosis (symptoms vs. screening), the presence of hormone receptors in the tumor, and the histologic type were the variables associated with survival differences. None of the other study variables were significantly associated with prognosis. Access to care for migrants is challenging, which leads to health inequalities. Early detection through prevention programs is crucial to increase IBC survival, notably for foreign-born patients. Copyright © 2016 Elsevier Inc. All rights reserved.
Clark, Andrew L; Knosalla, Christoph; Birks, Emma; Loebe, Matthias; Davos, Constantinos H; Tsang, Sui; Negassa, Abdissa; Yacoub, Magdi; Hetzer, Roland; Coats, Andrew J S; Anker, Stefan D
2007-08-01
Heart transplantation is an important treatment for end-stage chronic heart failure. We studied the effect of body mass index (BMI), and the effect of subsequent weight change, on survival following transplantation in 1902 consecutive patients. Patients were recruited from: London (n=553), Berlin (N=971) and Boston (N=378). Patients suitable for transplantation due to symptoms, low left ventricular ejection fraction (
Takayama, Keiko; Hirayama, Keisuke; Hirao, Akihiko; Kondo, Keiko; Hayashi, Hideki; Kadota, Koichi; Asaba, Hiroyuki; Ishizu, Hideki; Nakata, Kenji; Kurisu, Kairi; Oshima, Etsuko; Yokota, Osamu; Yamada, Norihito; Terada, Seishi
2017-11-01
It is widely supposed that there has been no evidence of increased survival in patients with advanced dementia receiving enteral tube feeding. However, more than a few studies have reported no harmful outcome from tube feeding in dementia patients compared to patients without dementia. This was a retrospective study. Nine psychiatric hospitals in Okayama Prefecture participated in this survey. All inpatients fulfilling the entry criteria were evaluated. All subjects suffered from difficulty with oral intake. Attending physicians thought that the patients could not live without long-term artificial nutrition. The physicians decided whether to make use of long-term artificial nutrition between January 2012 and December 2014. We evaluated 185 patients. Their mean age was 76.6 ± 11.4 years. Of all subjects, patients with probable Alzheimer's disease (n = 78) formed the biggest group, schizophrenia patients (n = 44) the second, and those with vascular dementia (n = 30) the third. The median survival times were 711 days for patients with tube feeding and 61 days for patients without tube feeding. In a comparison of different types of tube feeding, median survival times were 611 days for patients with a nasogastric tube and more than 1000 days for those with a percutaneous endoscopic gastrostomy tube. Patients with tube feeding survived longer than those without tube feeding, even among dementia patients. This study suggests that enteral nutrition for patients with dementia prolongs survival. Additionally, percutaneous endoscopic gastrostomy tube feeding may be safer than nasogastric tube feeding among patients in psychiatric hospitals. © 2017 Japanese Psychogeriatric Society.
Sun, Xing; Li, Xiaoyun; Chen, Cong; Song, Yang
2013-01-01
The frequent occurrence of interval-censored time-to-event data in randomized clinical trials (e.g., progression-free survival [PFS] in oncology) challenges statistical researchers in the pharmaceutical industry in various ways. These challenges exist in both trial design and data analysis. Conventional statistical methods treating intervals as fixed points, which are generally practiced by the pharmaceutical industry, sometimes yield inferior, or in extreme cases even flawed, analysis results for interval-censored data. In this article, we examine the limitations of these standard methods under typical clinical trial settings and further review and compare several existing nonparametric likelihood-based methods for interval-censored data, methods that are more sophisticated but robust. Trial design issues involved with interval-censored data comprise another topic to be explored in this article. Unlike right-censored survival data, expected sample size or power for a trial with interval-censored data relies heavily on the parametric distribution of the baseline survival function as well as the frequency of assessments. There can be substantial power loss in trials with interval-censored data if the assessments are very infrequent. Such an additional dependency contravenes many fundamental assumptions and principles in conventional survival trial designs, especially the group sequential design (e.g., the concept of information fraction). In this article, we discuss these fundamental changes and available tools to work around their impacts. Although progression-free survival is often used as a discussion point in the article, the general conclusions are equally applicable to other interval-censored time-to-event endpoints.
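The assessment-frequency point above can be made concrete with a small simulation: when progression is only detected at scheduled scans, the event time is known only to lie between the last negative and first positive assessment, and treating the first positive scan as the exact event time coarsens and biases the estimate. All numbers here (median, visit schedules) are illustrative assumptions, not values from the article:

```python
import math
import random

random.seed(42)

TRUE_MEDIAN = 6.0                       # months, assumed exponential PFS
RATE = math.log(2) / TRUE_MEDIAN        # exponential rate for that median

def observed_interval(t, visit_gap):
    """Bounds (left, right) bracketing the true event time t when
    assessments occur every `visit_gap` months."""
    k = int(t // visit_gap)
    return k * visit_gap, (k + 1) * visit_gap

true_times = [random.expovariate(RATE) for _ in range(2000)]

def naive_median(times, visit_gap):
    """Median PFS when the first positive scan (the interval's right
    endpoint) is naively treated as the exact event time."""
    right = sorted(observed_interval(t, visit_gap)[1] for t in times)
    return right[len(right) // 2]
```

Because the right-endpoint convention can only shift events later, the naive median is biased upward, and the coarser the assessment schedule, the less information each observation carries; nonparametric likelihood methods for interval-censored data use both interval bounds instead.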
Austin, Peter C; Schuster, Tibor
2016-10-01
Observational studies are increasingly being used to estimate the effect of treatments, interventions and exposures on outcomes that can occur over time. Historically, the hazard ratio, which is a relative measure of effect, has been reported. However, medical decision making is best informed when both relative and absolute measures of effect are reported. When outcomes are time-to-event in nature, the effect of treatment can also be quantified as the change in mean or median survival time due to treatment and the absolute reduction in the probability of the occurrence of an event within a specified duration of follow-up. We describe how three different propensity score methods, propensity score matching, stratification on the propensity score and inverse probability of treatment weighting using the propensity score, can be used to estimate absolute measures of treatment effect on survival outcomes. These methods are all based on estimating marginal survival functions under treatment and lack of treatment. We then conducted an extensive series of Monte Carlo simulations to compare the relative performance of these methods for estimating the absolute effects of treatment on survival outcomes. We found that stratification on the propensity score resulted in the greatest bias. Caliper matching on the propensity score and a method based on earlier work by Cole and Hernán tended to have the best performance for estimating absolute effects of treatment on survival outcomes. When the prevalence of treatment was less extreme, inverse probability of treatment weighting-based methods tended to perform better than matching-based methods. © The Author(s) 2014.
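The inverse-probability-of-treatment-weighting idea described above can be sketched briefly: each subject is weighted by the inverse probability of the treatment actually received, and weighted Kaplan-Meier curves then estimate marginal survival under treatment and no treatment. The toy cohort and propensity scores below are our own illustrative assumptions (in practice the scores would come from, e.g., a logistic regression):

```python
def weighted_km(times, events, weights, t):
    """Weighted Kaplan-Meier survival probability at time t."""
    s = 1.0
    for et in sorted({tt for tt, ee in zip(times, events) if ee == 1}):
        if et > t:
            break
        at_risk = sum(w for tt, w in zip(times, weights) if tt >= et)
        d = sum(w for tt, ee, w in zip(times, events, weights)
                if tt == et and ee == 1)
        s *= 1 - d / at_risk
    return s

# Toy cohort rows: (time, event, treated, propensity score).
cohort = [(2, 1, 1, 0.8), (5, 1, 1, 0.6), (7, 0, 1, 0.5),
          (1, 1, 0, 0.4), (3, 1, 0, 0.3), (6, 0, 0, 0.2)]

def marginal_surv(treated_flag, t):
    """IPTW-weighted marginal survival at t for one treatment arm."""
    rows = [r for r in cohort if r[2] == treated_flag]
    # Treated: weight 1/e; untreated: weight 1/(1 - e).
    w = [1 / r[3] if treated_flag else 1 / (1 - r[3]) for r in rows]
    return weighted_km([r[0] for r in rows], [r[1] for r in rows], w, t)

# Absolute effect: difference in marginal survival probabilities at t = 4.
risk_diff = marginal_surv(1, 4) - marginal_surv(0, 4)
```

The difference `risk_diff` is exactly the kind of absolute measure the paper argues should accompany the hazard ratio.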
Sun, Xin Rong; Tang, Cheng Wu; Lu, Wen Ming; Xu, Yong Qiang; Feng, Wen Ming; Bao, Yin; Zheng, Yin Yuan
2014-05-01
This study aims to compare the clinical outcomes and costs between endoscopic biliary stenting (EBS) and percutaneous transhepatic biliary stenting (PTBS). We randomly assigned 112 patients with unresectable malignant biliary obstruction between 2006 and 2011 to receive EBS or PTBS with a self-expandable metal stent (SEMS) as palliative treatment. PTBS was successfully performed in 55 patients, who formed the PTBS group (it failed in 2 patients). EBS was successfully performed in 52 patients, who formed the EBS group (it failed in 3 patients). The effectiveness of biliary drainage, hospital stay, complications, costs, survival time and mortality were compared. Patients in the PTBS group had a shorter hospital stay and lower initial and overall expenses than the EBS group (P < 0.05). There was no significant difference between the two groups in the effectiveness of biliary drainage (P = 0.9357) or survival time (P = 0.6733). Early complications were significantly less frequent in the PTBS group than in the EBS group (3/55 vs 11/52, P = 0.0343). Late complications did not differ significantly between the PTBS and EBS groups (7/55 vs 9/52, P = 0.6922). The survival curves of the two groups showed no significant difference (P = 0.5294).
Fehring, Richard J; Schneider, Mary; Raviele, Kathleen; Rodriguez, Dana; Pruszynski, Jessica
2013-07-01
The aim was to compare the efficacy and acceptability of two Internet-supported fertility-awareness-based methods of family planning. Six hundred and sixty-seven women and their male partners were randomized into either an electronic hormonal fertility monitor (EHFM) group or a cervical mucus monitoring (CMM) group. Both groups utilized a Web site with instructions, charts and support. Acceptability was assessed online at 1, 3 and 6 months. Pregnancy rates were determined by survival analysis. The EHFM participants (N=197) had a total pregnancy rate of 7 per 100 users over 12 months of use compared with 18.5 for the CMM group (N=164). The log rank survival test showed a significant difference (p<.01) in survival functions. Mean acceptability for both groups increased significantly over time (p<.0001). Continuation rates at 12 months were 40.6% for the monitor group and 36.6% for the mucus group. In comparison with the CMM, the EHFM method of family planning was more effective. All users had an increase in acceptability over time. Results are tempered by the high dropout rate. Copyright © 2013 Elsevier Inc. All rights reserved.
Pötschger, Ulrike; Heinzl, Harald; Valsecchi, Maria Grazia; Mittlböck, Martina
2018-01-19
Investigating the impact of a time-dependent intervention on the probability of long-term survival is statistically challenging. A typical example is stem-cell transplantation performed after successful donor identification from registered donors. Here, a suggested simple analysis based on the exogenous donor availability status according to registered donors would allow the estimation and comparison of survival probabilities. As donor search is usually ceased after a patient's event, donor availability status is incompletely observed, so that this simple comparison is not possible and the waiting time to donor identification needs to be addressed in the analysis to avoid bias. It is methodologically unclear how to directly address cumulative long-term treatment effects without relying on proportional hazards while avoiding waiting time bias. The pseudo-value regression technique is able to handle the first two issues; a novel generalisation of this technique also avoids waiting time bias. Inverse-probability-of-censoring weighting is used to account for the partly unobserved exogenous covariate donor availability. Simulation studies demonstrate unbiasedness and satisfactory coverage probabilities of the new method. A real data example demonstrates that study results based on generalised pseudo-values have a clear medical interpretation that supports the clinical decision-making process. The proposed generalisation of the pseudo-value regression technique enables the comparison of survival probabilities between two independent groups where group membership becomes known over time and remains partly unknown. Hence, cumulative long-term treatment effects are directly addressed without relying on proportional hazards while avoiding waiting time bias.
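The pseudo-value technique that this paper generalises starts from jackknife pseudo-observations for the survival probability at a landmark time t: theta_i = n*S(t) - (n-1)*S_(-i)(t), where S is the Kaplan-Meier estimate and S_(-i) leaves subject i out. A minimal sketch with illustrative data (not the paper's generalisation, just the base construction):

```python
def km(times, events, t):
    """Kaplan-Meier survival estimate at time t (events: 1 = event)."""
    s = 1.0
    for et in sorted({tt for tt, ee in zip(times, events) if ee == 1}):
        if et > t:
            break
        n_risk = sum(1 for tt in times if tt >= et)
        d = sum(1 for tt, ee in zip(times, events) if tt == et and ee == 1)
        s *= 1 - d / n_risk
    return s

def pseudo_values(times, events, t):
    """Jackknife pseudo-observations for S(t), one per subject."""
    n = len(times)
    s_full = km(times, events, t)
    out = []
    for i in range(n):
        loo_t = times[:i] + times[i + 1:]   # leave subject i out
        loo_e = events[:i] + events[i + 1:]
        out.append(n * s_full - (n - 1) * km(loo_t, loo_e, t))
    return out

# Fully observed data: the pseudo-value reduces to the indicator 1{T_i > t}.
pv = pseudo_values([1, 2, 3, 4, 5], [1, 1, 1, 1, 1], t=2.5)
```

These per-subject values can then be used as outcomes in a generalized estimating equation regression; the paper's contribution is to combine this with inverse-probability-of-censoring weights so that the partly unobserved donor-availability status is handled without waiting time bias.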
High emergency organ allocation rule in lung transplantation: a simulation study.
Riou, Julien; Boëlle, Pierre-Yves; Christie, Jason D; Thabut, Gabriel
2017-10-01
The scarcity of suitable organ donors leads to protracted waiting times and mortality in patients awaiting lung transplantation. This study aims to assess the short- and long-term effects of a high emergency organ allocation policy on the outcome of lung transplantation. We developed a simulation model of lung transplantation waiting queues under two allocation strategies, based either on waiting time only or on additional criteria to prioritise the sickest patients. The model was informed by data from the United Network for Organ Sharing. We compared the impact of these strategies on waiting time, waiting list mortality and overall survival in various situations of organ scarcity. The impact of a high emergency allocation strategy depends largely on the organ supply. When organ supply is sufficient (>95 organs per 100 patients), it may prevent a small number of early deaths (1 year survival: 93.7% against 92.4% for waiting time only) without significant impact on waiting times or long-term survival. When the organ/recipient ratio is lower, the benefits in early mortality are larger but are counterbalanced by a dramatic increase of the size of the waiting list. Consequently, we observed a progressive increase of mortality on the waiting list (although still lower than with waiting time only), a deterioration of patients' condition at transplant and a decrease of post-transplant survival times. High emergency organ allocation is an effective strategy to reduce mortality on the waiting list, but causes a disruption of the list equilibrium that may have detrimental long-term effects in situations of significant organ scarcity.
Obesity does not affect survival outcomes in extremity soft tissue sarcoma.
Alamanda, Vignesh K; Moore, David C; Song, Yanna; Schwartz, Herbert S; Holt, Ginger E
2014-09-01
Obesity is a growing epidemic and has been associated with an increased frequency of complications after various surgical procedures. Studies also have shown adipose tissue to promote a microenvironment favorable for tumor growth. However, the relationship between obesity and the prognosis of soft tissue sarcomas has yet to be evaluated. We sought to assess whether (1) obesity affects survival outcomes (local recurrence, distant metastasis, and death attributable to disease) in patients with extremity soft tissue sarcomas; and (2) obesity affects wound healing and other surgical complications after treatment. A BMI of 30 kg/m(2) or greater was used to define obesity. Querying our prospective database between 2001 and 2008, we identified 397 patients for the study; 154 were obese and 243 were not obese. Mean followup was 4.5 years (SD, 3.1 years) in the obese group and 3.9 years (SD, 3.2 years) in the nonobese group; the group with a BMI of 30 kg/m(2) or greater had a higher proportion of patients with followups of at least 2 years compared with the group with a BMI less than 30 kg/m(2) (76% versus 62%). Outcomes, including local recurrence, distant metastasis, and overall survival, were analyzed after patients were stratified by BMI. Multivariable survival models were used to identify independent predictors of survival outcomes. The Wilcoxon rank sum test was used to compare continuous variables. Based on the accrual interval of 8 years, the additional followup of 5 years after data collection, and the median survival time of 3 years for the patients with a BMI less than 30 kg/m(2), we were able to detect true median survival times of 2.2 years or less in the patients with a BMI of 30 kg/m(2) or greater with 80% power and a type I error rate of 0.05. Patients who were obese had similar survival outcomes and wound complication rates when compared with their nonobese counterparts.
Patients who were obese were more likely to have lower-grade tumors (31% versus 20%; p = 0.021) and additional comorbidities including diabetes mellitus (26% versus 7%; p < 0.001), hypertension (63% versus 38%; p < 0.001), and smoking (49% versus 37%; p = 0.027). Regression analysis confirmed that even after accounting for certain tumor characteristics and comorbidities, obesity did not serve as an independent risk factor in affecting survival outcomes. Although the prevalence of obesity continues to increase and lead to many negative health consequences, it does not appear to adversely affect survival, local recurrence, or wound complication rates for patients with extremity soft tissue sarcomas. Level III, therapeutic study. See the Instructions for Authors for a complete description of levels of evidence.
Johansson, Inger; Andersson, Rune; Friman, Vanda; Selimovic, Nedim; Hanzen, Lars; Nasic, Salmir; Nyström, Ulla; Sigurdardottir, Vilborg
2015-12-24
Cytomegalovirus (CMV) is associated with an increased risk of cardiac allograft vasculopathy (CAV), the major limiting factor for long-term survival after heart transplantation (HTx). The purpose of this study was to evaluate the impact of CMV infection during long-term follow-up after HTx. A retrospective, single-centre study analyzed 226 HTx recipients (mean age 45 ± 13 years, 78% men) who underwent transplantation between January 1988 and December 2000. The incidence and risk factors for CMV infection during the first year after transplantation were studied. Risk factors for CAV were included in an analysis of CAV-free survival within 10 years post-transplant. The effect of CMV infection on the grade of CAV was analyzed. Survival to 10 years post-transplant was higher in patients with no CMV infection (69%) compared with patients with CMV disease (55%; p = 0.018) or asymptomatic CMV infection (54%; p = 0.053). CAV-free survival time was higher in patients with no CMV infection (6.7 years; 95% CI, 6.0-7.4) compared with CMV disease (4.2 years; CI, 3.2-5.2; p < 0.001) or asymptomatic CMV infection (5.4 years; CI, 4.3-6.4; p = 0.013). In univariate analysis, recipient age, donor age, coronary artery disease (CAD), asymptomatic CMV infection and CMV disease were significantly associated with CAV-free survival. In multivariate regression analysis, CMV disease, asymptomatic CMV infection, CAD and donor age remained independent predictors of CAV-free survival at 10 years post-transplant. CAV-free survival was significantly reduced in patients with CMV disease and asymptomatic CMV infection compared to patients without CMV infection. These findings highlight the importance of close monitoring of CMV viral load and appropriate therapeutic strategies for preventing asymptomatic CMV infection.
Survival of patients with colon and rectal cancer in central and northern Denmark, 1998–2009
Ostenfeld, Eva B; Erichsen, Rune; Iversen, Lene H; Gandrup, Per; Nørgaard, Mette; Jacobsen, Jacob
2011-01-01
Objective The prognosis for colon and rectal cancer has improved in Denmark over the past decades but is still poor compared with that in our neighboring countries. We conducted this population-based study to monitor recent trends in colon and rectal cancer survival in the central and northern regions of Denmark. Material and methods Using the Danish National Registry of Patients, we identified 9412 patients with an incident diagnosis of colon cancer and 5685 patients diagnosed with rectal cancer between 1998 and 2009. We determined survival, and used Cox proportional hazard regression analysis to compare mortality over time, adjusting for age and gender. Among surgically treated patients, we computed 30-day mortality and corresponding mortality rate ratios (MRRs). Results The annual numbers of colon and rectal cancer increased from 1998 through 2009. For colon cancer, 1-year survival improved from 65% to 70%, and 5-year survival improved from 37% to 43%. For rectal cancer, 1-year survival improved from 73% to 78%, and 5-year survival improved from 39% to 47%. Men aged 80+ showed most pronounced improvements. The 1- and 5-year adjusted MRRs decreased: for colon cancer 0.83 (95% confidence interval CI: 0.76–0.92) and 0.84 (95% CI: 0.78–0.90) respectively; for rectal cancer 0.79 (95% CI: 0.68–0.91) and 0.81 (95% CI: 0.73–0.89) respectively. The 30-day postoperative mortality after resection also declined over the study period. Compared with 1998–2000 the 30-day MRRs in 2007–2009 were 0.68 (95% CI: 0.53–0.87) for colon cancer and 0.59 (95% CI: 0.37–0.96) for rectal cancer. Conclusion The survival after colon and rectal cancer has improved in central and northern Denmark during the 1998–2009 period, as well as the 30-day postoperative mortality. PMID:21814467
Hussain, Jamilla A; Mooney, Andrew; Russon, Lynne
2013-10-01
There are limited data on the outcomes of elderly patients with chronic kidney disease undergoing renal replacement therapy or conservative management. We aimed to compare survival, hospital admissions, and palliative care access of patients aged over 70 years with chronic kidney disease stage 5 according to whether they chose renal replacement therapy or conservative management. This was a retrospective observational study of patients aged over 70 years attending a pre-dialysis clinic. In total, 172 patients chose conservative management and 269 chose renal replacement therapy. The renal replacement therapy group survived longer when survival was taken from the time the estimated glomerular filtration rate was <20 mL/min (p < 0.0001), <15 mL/min (p < 0.0001), and <12 mL/min (p = 0.002). When factors influencing survival were stratified for both groups independently, renal replacement therapy failed to show a survival advantage over conservative management in patients older than 80 years or with a World Health Organization performance score of 3 or more. There was also a significant reduction in the effect of renal replacement therapy on survival in patients with high Charlson comorbidity index scores. The relative risk of an acute hospital admission (renal replacement therapy vs conservative management) was 1.6 (p < 0.05; 95% confidence interval = 1.14-2.13). A total of 47% of conservative management patients died in hospital, compared to 69% of those undergoing renal replacement therapy (Renal Registry data). Seventy-six percent of the conservative management group accessed community palliative care services compared to 0% of renal replacement therapy patients. For patients aged over 80 years with a poor performance status or high comorbidity scores, the survival advantage of renal replacement therapy over conservative management was lost at all levels of disease severity. 
Those accessing a conservative management pathway had greater access to palliative care services and were less likely to be admitted to or die in hospital.
Kairiene, Igne; Pasauliene, Ramune; Lipunova, Nadezda; Vaitkeviciene, Goda; Rageliene, Lina; Rascon, Jelena
2017-10-01
The reported treatment outcomes of children treated for cancer in Eastern European countries are inferior to those in Northern/Western Europe. We hypothesized that recent survival rates could be comparable to the current standards and performed a population-based analysis of treatment outcome of childhood acute myeloid leukemia (AML) in Lithuania, a small Eastern European country. Children <18 years old who were treated for AML from 2000 to 2013 were included (n = 54). Estimates of 5-year event-free survival (EFS5y) and overall survival (OS5y) rates were analyzed. Comparing the periods 2000-2006 (n = 32) and 2007-2013 (n = 22), the EFS5y improved from 31 to 63% (p = 0.04), and the OS5y improved from 31 to 72% (p = 0.02) because of reductions in toxicity-related mortality (42 vs. 15%, p = 0.08) and relapse (43 vs. 25%, p = 0.08). The most significant improvement was demonstrated in high-risk patients (OS5y improved from 26 to 75%, p = 0.02) who benefited from hematopoietic stem cell transplantation: the post-transplant EFS5y increased from 13 to 86% (p = 0.01). The current survival rate of Lithuanian children treated for AML was comparable to the expected rate in other parts of Europe. What is Known: • In the last three decades, significant improvement has been achieved in treating childhood cancer, with an overall survival (OS) rate of >80% in high-income countries. The difference in survival rates between Northern/Western and Eastern European countries, as well as between high- and middle-/low-income countries, is as much as 20%. Recently, the 5-year event-free survival rate of acute myeloid leukemia (AML) has reached >60% in high-income countries. The survival rates for myeloproliferative diseases were the lowest in Eastern European countries. • The reported inferior survival rates were calculated based on outcome data of patients treated until 2007. The recent survival rates in Eastern European countries are unknown. 
What is New: • Being a small Eastern European country, Lithuania has experienced good economic growth during the last decade. We hypothesized that economic growth and gain of experience could result in better survival rates of children treated for cancer in our country in recent years. • A population-based analysis of treatment outcome of childhood AML treated in Lithuania in the recent years was performed for the first time. The survival rates of childhood AML in Lithuania are comparable to those of other high-income countries. Current survival rates of children treated for cancer in Eastern European countries could be comparable to the best current standards contributing to better European survival rates of childhood cancer in general.
Parametric Model Based On Imputations Techniques for Partly Interval Censored Data
NASA Astrophysics Data System (ADS)
Zyoud, Abdallah; Elfaki, F. A. M.; Hrairi, Meftah
2017-12-01
The term ‘survival analysis’ has been used in a broad sense to describe a collection of statistical procedures for data analysis in which the outcome variable of interest is the time until an event occurs, where the time to failure of a specific experimental unit may be censored: right, left, interval, or partly interval censored (PIC). In this paper, the model was analyzed based on a parametric Cox model applied to PIC data. Moreover, several imputation techniques were used: midpoint, left & right point, random, mean, and median. Maximum likelihood estimation was used to obtain the estimated survival function. These estimates were then compared with existing models, namely the Turnbull and Cox models, based on clinical trial data (breast cancer data), which showed the validity of the proposed model. Results on the data set indicated that the parametric Cox model was superior in terms of estimation of survival functions, likelihood ratio tests, and their p-values. Moreover, among the imputation techniques, the midpoint, random, mean, and median approaches showed better results with respect to the estimation of the survival function.
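The midpoint imputation step described above can be sketched in a few lines: each finite censoring interval is replaced by its midpoint, after which an ordinary right-censored likelihood can be maximized. The intervals below are hypothetical, and a simple exponential model stands in for the paper's parametric Cox model, so this is a sketch of the imputation idea rather than the authors' full method.

```python
import numpy as np

# Hypothetical interval-censored observations (L, R): the event is known
# only to lie in [L, R]; R = np.inf marks a right-censored observation.
intervals = [(0.0, 2.0), (1.0, 3.0), (2.0, 2.0), (4.0, np.inf), (0.5, 1.5)]

def midpoint_impute(intervals):
    """Replace each finite interval by its midpoint (event = 1);
    right-censored observations keep their left endpoint (event = 0)."""
    times, events = [], []
    for lo, hi in intervals:
        if np.isinf(hi):
            times.append(lo); events.append(0)
        else:
            times.append((lo + hi) / 2.0); events.append(1)
    return np.array(times), np.array(events)

def exp_mle(times, events):
    """MLE of the rate of an exponential survival model with
    right-censoring: lambda_hat = (# events) / (total follow-up time)."""
    return events.sum() / times.sum()

t, d = midpoint_impute(intervals)
lam = exp_mle(t, d)
surv = lambda x: np.exp(-lam * x)   # estimated survival function S(x)
```

The left-point, right-point, random, mean, and median variants differ only in how the imputed time inside each interval is chosen.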
2013-01-01
Background The objective of this study was to evaluate the effect on outcomes of intraoperative recombinant human interleukin-2 injection after surgical resection of peripheral nerve sheath tumours. In this double-blind trial, 40 patients due to undergo surgical excision (<5 mm margins) of presumed peripheral nerve sheath tumours were randomized to receive an intraoperative injection of interleukin-2 or placebo into the wound bed. Results There were no significant differences in any variable investigated or in median survival between the two groups. The median recurrence-free interval was 874 days (range 48–2141 days). The recurrence-free interval and overall survival time were significantly longer in dogs whose primary surgery was performed by a specialist-certified surgeon rather than a referring veterinarian, regardless of whether additional adjunct therapy was given. Conclusion Overall, marginal excision of peripheral nerve sheath tumours in dogs resulted in a long survival time, but adjuvant treatment with recombinant human interleukin-2 (rhIL-2) did not provide a survival advantage. PMID:23927575
Utilization of advanced-age donors in renal transplantation.
Olaverri, J G; Mora Christian, J; Elorrieta, P; Esnaola, K; Rodríguez, P; Marrón, I; Uriarte, I; Landa, M J; Zarraga, S; Gainza, F J; Aranzabal, J; Zabala, J A; Pertusa, C
2011-11-01
The shortage of organ availability in recent years has made it necessary to use grafts from advanced-age donors to maintain the rate of renal transplantation in our country. The objective of this study was to evaluate graft function and patient survival using kidneys from deceased donors over 65 years of age. From 2005 until 2010, we compared the outcomes of patients who received grafts from donors over 65 years old vs less than 65 years old. We observed no significant difference in sex, time on dialysis, or cold ischemia time between the groups. As expected, the recipient age was significantly different. For the analysis of survival, we used the Mantel-Haenszel test and the Kaplan-Meier survival estimator. Actuarial survival at 3 years after transplantation was 84.8% among patients transplanted with kidneys from donors over 65 years old versus 97.5% in the control group. Graft survival was 78.8% in the expanded criteria group versus 86.85% in the control group. When we analyzed graft survival using an "exitus-censored" analysis, we obtained graft survival of 89.1% in the expanded criteria kidney group versus 88.6% among the controls. We concluded that the use of kidneys from donors over 65 years of age allows us to increase the rate of renal transplantation by about 15 to 20 per million population, with good graft and patient survival, provided that the protocol for expanded criteria organs ensures proper macroscopic and microscopic evaluation of the organ for transplantation. Copyright © 2011. Published by Elsevier Inc.
Tanikawa, K; Hirai, K; Kawazoe, Y; Yamashita, K; Kumagai, M; Abe, M
1985-10-01
A total of 407 cases of unresectable hepatocellular carcinoma (HCC) occurring from 1970 to March 1985, including 107 cases receiving conservative therapy, 176 cases receiving one-shot therapy, and 124 cases receiving transcatheter arterial embolization (TAE) therapy, were studied, and the efficacy of chemotherapy was compared with that of TAE therapy. The results were as follows. The one-year survival rate was 2.8%, with a median survival time of 1.3 months, in conservative therapy. In the 176 cases of one-shot therapy, the one-year survival rate was 21.0%, two-year 6.8%, and three-year 2.3%, and the median survival time was 4.8 months. In 120 cases of one-shot therapy which were compatible with the criteria for one-shot injection of anticancer drugs via the hepatic artery for HCC, the one-year survival rate was 30%. However, the rate was 1.8% in 56 cases which were not compatible with the criteria. In 37 cases in which mitomycin C (MMC) and Adriamycin (ADR) were administered alternately, the one-year survival rate was 41.7%, two-year 16.1%, and three-year 4.3%. The highest survival rate was obtained by TAE therapy: one-year 66.9%, two-year 33.8%, and three-year 28.9%. A decrease in AFP after therapy was noted in 42.4% of cases given one-shot therapy and in 95.2% of cases given TAE therapy. The results suggest that alternate administration of anticancer agents produces good chemotherapeutic effects and that the best life-prolongation is obtained by TAE therapy.
Myeloid Blastic Transformation of Myeloproliferative Neoplasms – A Review of 112 Cases
Noor, Syed J; Tan, Wei; Wilding, Gregory E; Ford, Laurie A; Barcos, Maurice; Sait, Sheila N J; Block, AnneMarie W; Thompson, James E; Wang, Eunice S; Wetzler, Meir
2010-01-01
Blastic transformation of myeloproliferative neoplasms (MPN) is still poorly understood. We describe a cohort of 23 Roswell Park Cancer Institute (RPCI) patients and 89 additional cases from the English literature for whom biologic features were described. We initially compared our 23 patients to the 89 cases from the literature. Our population had significantly fewer patients with a prior history of polycythemia vera (PV), a shorter time from MPN diagnosis to blastic transformation, <3 prior therapies, more frequent use of hydroxyurea and erythropoietin, and less frequent use of alkylating agents. Interestingly, the overall survival of the two cohorts from the time of blastic transformation was similar. We therefore looked at the outcome of the entire cohort (n=112). Patients with a prior history of essential thrombocythemia survived longer than patients with a prior history of myelofibrosis or PV. Further, patients with <3 prior therapies, those who lacked a complex karyotype, and those <60 years old at MPN diagnosis had significantly longer survival. Among the RPCI population, 20/23 patients underwent induction treatment with cytarabine- and anthracycline-containing regimens; 12 achieved remission, and their overall survival was significantly longer than that of those who did not. Three patients underwent allogeneic transplantation, and their survival was significantly longer than that of those who did not. Patients with <3 prior therapies, those who lack a complex karyotype, and those <60 years old at MPN diagnosis have longer survival following blastic transformation. Finally, allogeneic transplantation represents the only chance for long-term survival in these patients. PMID:20727590
Kybartienė, Sondra; Skarupskienė, Inga; Ziginskienė, Edita; Kuzminskis, Vytautas
2010-01-01
There are no data about arteriovenous fistula (AVF) formation, survival, and complication rates in patients with end-stage renal failure in Lithuania. We analyzed the data of patients (N=272) with end-stage renal failure, dialyzed at the Hospital of Kaunas University of Medicine from January 1, 2000, until March 30, 2010, and identified 368 cases of AVF creation. The patients were divided into two groups: group 1 included the patients with an AVF that functioned for <15 months (n=138), and group 2 included patients with an AVF that functioned for ≥15 months (n=171). Less than half (47%) of the patients started planned hemodialysis, and 51% of the patients started hemodialysis urgently. The mean time of AVF functioning was 15.43±8.67 months. Age, gender, the underlying kidney disease, and time of AVF maturation had no influence on AVF functioning time. AVFs of the patients who started planned hemodialysis functioned longer compared with AVFs of the patients who started hemodialysis urgently (P<0.05). The hospitalization time of the patients who started hemodialysis urgently was longer than that of the patients who had a matured AVF (37.63±20.55 days vs. 16.54±9.43 days). The first vascular access had better survival than repeated access. AVF survival in patients with ischemic brain vascular disease was worse than in patients without this comorbidity.
Tai, Patricia; Yu, Edward; Cserni, Gábor; Vlastos, Georges; Royce, Melanie; Kunkler, Ian; Vinh-Hung, Vincent
2005-01-01
Background The commonly used five-year survival rates are not adequate to represent statistical cure. In the present study, we established the minimum number of years of follow-up required to estimate the statistical cure rate, using a lognormal distribution of the survival time of those who died of their cancer. We introduced the term 'threshold year': the follow-up time by which the survival data of patients dying from the specific cancer are almost entirely covered, leaving less than 2.25% uncovered. This is close enough to cure from that specific cancer. Methods Data from the Surveillance, Epidemiology and End Results (SEER) database were tested to determine whether the survival times of cancer patients who died of their disease followed the lognormal distribution, using a minimum chi-square method. Patients diagnosed from 1973–1992 in the registries of Connecticut and Detroit were chosen so that a maximum of 27 years was allowed for follow-up to 1999. A total of 49 specific organ sites were tested. The parameters of those lognormal distributions were found for each cancer site. The cancer-specific survival rates at the threshold years were compared with the longest available Kaplan-Meier survival estimates. Results The cancer-specific survival times of patients who died of their disease were verified to follow different lognormal distributions for 42 of the 49 cancer sites. The threshold years validated for statistical cure varied across cancer sites, from 2.6 years for pancreas cancer to 25.2 years for cancer of the salivary gland. At the threshold year, the statistical cure rates estimated for 40 cancer sites were found to match the actuarial long-term survival rates estimated by the Kaplan-Meier method within six percentage points. For two cancer sites, breast and thyroid, the threshold years were so long that the cancer-specific survival rates could not yet be obtained because the SEER data do not provide sufficiently long follow-up. 
Conclusion The present study suggests that a certain threshold year must be reached before the statistical cure rate can be estimated for each cancer site. For some cancers, such as breast and thyroid, the 5- or 10-year survival rates inadequately reflect statistical cure rates, which highlights the need for long-term follow-up of these patients. PMID:15904508
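Under the paper's lognormal assumption, the threshold year can be read off the fitted log-time distribution: leaving less than about 2.25% of cancer deaths uncovered corresponds roughly to the mu + 2*sigma point, since the upper 2.275% tail of a normal distribution lies beyond z = 2. The survival times below are invented for illustration, and equating the 2.25% criterion with z = 2 is an approximation, not the authors' exact fitting procedure (they used a minimum chi-square fit).

```python
import math
import statistics

# Hypothetical survival times (years) of patients who died of their
# cancer, assumed to follow a lognormal distribution.
deaths = [0.5, 0.8, 1.2, 1.5, 2.0, 2.4, 3.1, 4.0, 5.5, 8.0]

# Fit the lognormal by taking moments of log-survival-time.
logs = [math.log(t) for t in deaths]
mu = statistics.fmean(logs)          # mean of log times
sigma = statistics.pstdev(logs)      # population SD of log times

# Threshold year: follow-up covering all but ~2.25% of deaths,
# approximated by the mu + 2*sigma point in log-time.
threshold_year = math.exp(mu + 2 * sigma)
```

With real registry data, the threshold year computed this way would then be compared against the longest available Kaplan-Meier estimate, as the abstract describes.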
Comparison of 2 heterotopic heart transplant techniques in rats: cervical and abdominal heart.
Ma, Yi; Wang, Guodong
2011-04-01
Heterotopic heart transplant in rats has been accepted as the most commonly used animal model to investigate the mechanisms of transplant immunology. Many ingenious approaches to this model have been reported. We sought to improve this model and compare survival rates and histologic features of acute rejection in cervical and abdominal heart transplants. Rats were divided into cervical and abdominal groups. Microsurgical techniques were introduced for vascular anastomoses. In the abdominal heart transplant group, the donor's thoracic aorta was anastomosed end-to-side to the recipient's infrarenal abdominal aorta, and the donor's pulmonary artery was anastomosed to the recipient's inferior vena cava. In the cervical heart transplant group, the donor's thoracic aorta was anastomosed to the recipient's common carotid artery, and the donor's pulmonary artery was anastomosed to the recipient's external jugular vein. Survival time in the 2 models was followed, and pathology was examined. Histologic features of allogeneic rejection also were compared in the cervical and abdominal heart transplant groups. The mean time to recover the donor's heart was 7.4 ± 2.2 minutes in the cervical group and 7.2 ± 1.8 minutes in the abdominal group. The mean recipient operative time was 23.2 ± 2.6 minutes in the cervical group and 21.6 ± 2.8 minutes in the abdominal group. Graft survival was 98% and 100% in the cervical and abdominal heart transplant groups, respectively. There was no significant difference in graft survival between the 2 methods. Heart allografts were rejected at 5.7 and 6.2 days in the cervical and abdominal transplant groups, respectively. There was no difference in the histologic features of acute allogeneic rejection between cervical and abdominal heart transplants. Both cervical and abdominal heart transplants can achieve a high rate of success, and the histologic features of acute allogeneic rejection in the two models are comparable.
Cai, Yeyu; Chang, Qian; Xiao, Enhua; Shang, Quan-Liang; Chen, Zhu
2018-06-01
To compare the clinical efficacies and adverse reactions of transcatheter arterial chemoembolization (TACE), γ-ray 3-dimensional fractionated stereotactic conformal radiotherapy (FSCR), and TACE combined with FSCR for primary hepatocellular carcinoma. The study was approved by the Institutional Review Board, and informed consent was waived due to the retrospective study design. A total of 121 patients met the inclusion criteria and were included in this study, from March 2008 to January 2010, in the Second Xiangya Hospital. Forty-six patients underwent TACE alone, 36 patients underwent γ-knife alone, and 39 were treated by γ-knife combined with TACE. Short-term effects, overall survival rates, adverse reactions, and survival times were compared between the 3 treatment groups. Short-term responses were observed in 41.3% of the TACE group, 33.3% of the γ-knife group, and 64.1% of the TACE combined with γ-knife group (P = .020). Overall survival rates at 6, 12, 18, and 24 months were 50%, 34.8%, 28.3%, and 21.7% for the TACE group; 36.1%, 30.6%, 16.7%, and 11.1% for the γ-knife group; and 84.6%, 71.8%, 61.5%, and 30.8% for the TACE combined with γ-knife group, respectively. The differences in the overall survival rates at 6, 12, and 18 months between the 3 groups were statistically significant (P < .001), but the overall survival rates at 24 months in the 3 groups were not significantly different (P = .117). The median survival time was 7 months for the TACE group, 3 months for the γ-knife group, and 20 months for the TACE combined with γ-knife group (P < .001). There were statistically significant differences (P = .010) in leukopenia between the 3 groups, and no statistically significant differences (P > .05) in thrombocytopenia, anemia, nausea, vomiting, or liver function lesions. TACE combined with γ-knife for primary hepatocellular carcinoma is superior to TACE or γ-knife alone in short-term and long-term effects. 
This procedure is a mild, safe, and effective treatment for primary hepatocellular carcinoma.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cao, Jeffrey Q.; Truong, Pauline T.; Olivotto, Ivo A.
Purpose: Optimal local management for young women with early-stage breast cancer remains controversial. This study examined 15-year outcomes among women younger than 40 years treated with breast-conserving surgery plus whole-breast radiation therapy (BCT) compared with those treated with modified radical mastectomy (MRM). Methods and Materials: Women aged 20 to 39 years with early-stage breast cancer diagnosed between 1989 and 2003 were identified in a population-based database. Primary outcomes of breast cancer–specific survival (BCSS) and overall survival (OS), and secondary outcomes of local relapse–free survival (LRFS), locoregional relapse–free survival (LRRFS), and distant relapse–free survival (DRFS), were calculated using Kaplan-Meier methods and compared between the BCT and MRM cohorts using log-rank tests. A planned subgroup analysis was performed on patients considered “ideal” for BCT (ie, T1N0, negative margins, and no extensive ductal carcinoma in situ) and in whom local therapy may have the largest impact on survival because of low systemic risk. Results: 965 patients were identified; 616 had BCT and 349 had MRM. The median follow-up time was 14.4 years (range, 8.4-23.3 years). Overall, 15-year rates of BCSS (76.0% vs 74.1%, P=.62), OS (74.2% vs 73.0%, P=.75), LRFS (85.4% vs 86.5%, P=.95), LRRFS (82.2% vs 81.6%, P=.61), and DRFS (74.4% vs 71.6%, P=.40) were similar between the BCT and MRM cohorts. In the “ideal” for BCT subgroup, there were 219 BCT and 67 MRM patients with a median follow-up time of 15.5 years. The 15-year BCSS (86.1% vs 82.9%, P=.57), OS (82.6% vs 82.9%, P=.89), LRFS (86.2% vs 84.2%, P=.50), LRRFS (83.1% vs 78.3%, P=.24), and DRFS (84.8% vs 79.1%, P=.17) were similar in the BCT and MRM cohorts. Conclusions: This population-based analysis with long-term follow-up confirmed that women younger than 40 years treated with BCT had similar 15-year outcomes compared with MRM. Young age alone is not a contraindication to BCT.
Simple prognostic model for patients with advanced cancer based on performance status.
Jang, Raymond W; Caraiscos, Valerie B; Swami, Nadia; Banerjee, Subrata; Mak, Ernie; Kaya, Ebru; Rodin, Gary; Bryson, John; Ridley, Julia Z; Le, Lisa W; Zimmermann, Camilla
2014-09-01
Providing survival estimates is important for decision making in oncology care. The purpose of this study was to provide survival estimates for outpatients with advanced cancer, using the Eastern Cooperative Oncology Group (ECOG), Palliative Performance Scale (PPS), and Karnofsky Performance Status (KPS) scales, and to compare their ability to predict survival. ECOG, PPS, and KPS were completed by physicians for each new patient attending the Princess Margaret Cancer Centre outpatient Oncology Palliative Care Clinic (OPCC) from April 2007 to February 2010. Survival analysis was performed using the Kaplan-Meier method. The log-rank test for trend was employed to test for differences in survival curves for each level of performance status (PS), and the concordance index (C-statistic) was used to test the predictive discriminatory ability of each PS measure. Measures were completed for 1,655 patients. PS delineated survival well for all three scales according to the log-rank test for trend (P < .001). Survival was approximately halved for each worsening performance level. Median survival times, in days, for each ECOG level were: ECOG 0, 293; ECOG 1, 197; ECOG 2, 104; ECOG 3, 55; and ECOG 4, 25.5. Median survival times, in days, for PPS (and KPS) were: PPS/KPS 80 to 100, 221 (215); PPS/KPS 60 to 70, 115 (119); PPS/KPS 40 to 50, 51 (49); and PPS/KPS 10 to 30, 22 (29). The C-statistic was similar for all three scales and ranged from 0.63 to 0.64. We present a simple tool that uses PS alone to prognosticate in advanced cancer and has similar discriminatory ability to more complex models. Copyright © 2014 by American Society of Clinical Oncology.
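The C-statistic reported above measures how often, among comparable patient pairs, the patient with the better performance status also survives longer. A toy version of this idea on hypothetical (survival, performance status) pairs might look like the following; note that a real survival concordance index (e.g. Harrell's c) must also handle censored observations, which this sketch ignores by assuming all survival times are fully observed.

```python
import itertools

def concordance(data):
    """Toy concordance index for fully observed survival times.
    data: list of (survival_time, performance_status) pairs, where a
    LOWER performance status means a fitter patient. Counts the fraction
    of comparable pairs in which the fitter patient survived longer;
    pairs tied on either value are skipped for simplicity."""
    conc = total = 0
    for (t1, p1), (t2, p2) in itertools.combinations(data, 2):
        if t1 == t2 or p1 == p2:
            continue                      # not a comparable pair
        total += 1
        if (p1 < p2) == (t1 > t2):        # better PS and longer survival
            conc += 1
    return conc / total
```

A value of 0.5 would mean the scale discriminates no better than chance; the 0.63 to 0.64 reported above indicates modest but real discrimination.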
Castleberry, A W; Güller, U; Tarantino, I; Berry, M F; Brügger, L; Warschkow, R; Cerny, T; Mantyh, C R; Candinas, D; Worni, M
2014-06-01
Recently, multiple clinical trials have demonstrated improved outcomes in patients with metastatic colorectal cancer. This study investigated whether the improved survival is race dependent. Overall and cancer-specific survival of 77,490 White and Black patients with metastatic colorectal cancer from the 1988-2008 Surveillance Epidemiology and End Results registry were compared using unadjusted and multivariable adjusted Cox proportional hazard regression as well as competing risk analyses. Median age was 69 years, 47.4% were female, and 86.0% were White. Median survival was 11 months overall, with an overall increase from 8 to 14 months between 1988 and 2008. Overall survival increased from 8 to 14 months for White patients, and from 6 to 13 months for Black patients. After multivariable adjustment, the following parameters were associated with better survival: White, female, younger, better educated, and married patients; patients with higher income and living in urban areas; patients with rectosigmoid junction and rectal cancer; undergoing cancer-directed surgery; and having well/moderately differentiated and N0 tumors (p < 0.05 for all covariates). Discrepancies in overall survival based on race did not change significantly over time; however, there was a significant decrease in cancer-specific survival discrepancies over time between White and Black patients, with a hazard ratio of 0.995 (95% confidence interval 0.991-1.000) per year (p = 0.03). A clinically relevant overall survival increase was found from 1988 to 2008 in this population-based analysis for both White and Black patients with metastatic colorectal cancer. Although both White and Black patients benefitted from this improvement, a slight discrepancy between the two groups remained.
Li, Qi-Wen; Niu, Shao-Qing; Wang, Han-Yu; Wen, Ge; Li, Yi-Yang; Xia, Yun-Fei; Zhang, Yu-Jing
2015-01-01
A moderate dose of radiation is the recommended treatment for solitary plasmacytoma (SP), but there is controversy over the role of surgery. Our study aimed at comparing different treatment modalities in the management of SP. Data from 38 consecutive patients with solitary plasmacytoma, including 16 with bone plasmacytoma and 22 with extramedullary plasmacytoma, were retrospectively reviewed. Fifteen patients received radiotherapy alone, 11 received surgery alone, and 12 received both. The median radiation dose was 50 Gy. All operations were performed as radical resections. Local progression-free survival (LPFS), multiple myeloma-free survival (MMFS), progression-free survival (PFS), and overall survival (OS) were calculated, and outcomes of different therapies were compared. The median follow-up time was 55 months. The 5-year LPFS, MMFS, PFS, and OS were 87.0%, 80.9%, 69.8%, and 87.4%, respectively. Univariate analysis revealed that, compared with surgery alone, radiotherapy alone was associated with significantly higher 5-year LPFS (100% vs 69.3%, p=0.016), MMFS (100% vs 51.4%, p=0.006), PFS (100% vs 33.7%, p=0.0004), and OS (100% vs 70%, p=0.041). Radiotherapy alone can therefore be considered a more effective treatment for SP than surgery alone. Whether a combination of radiotherapy and surgery improves outcomes requires further study.
Adams, Timothy S; Blouin, Dawn; Johnson, Don
2016-01-01
Compare maximum concentration (Cmax), time to maximum concentration (Tmax), mean serum concentration of vasopressin, return of spontaneous circulation (ROSC), time to ROSC, and odds of survival relative to vasopressin administration by tibial intraosseous (TIO), humerus intraosseous (HIO), and intravenous (IV) routes in a hypovolemic cardiac arrest model. Prospective, between subjects, randomized experimental design. TriService Research Facility. Yorkshire-cross swine (n = 40). Swine were anesthetized, exsanguinated to a Class III hemorrhage, and placed into cardiac arrest. After 2 minutes, cardiopulmonary resuscitation was initiated. After an additional 2 minutes, a dose of 40 units of vasopressin was administered by TIO, HIO, or the IV routes. Blood samples were collected over 4 minutes and analyzed by high-performance liquid chromatography tandem mass spectrometry. ROSC, time to ROSC, Cmax, Tmax, mean concentrations over time, and odds ratio. There was no significant difference in rate of ROSC or time to ROSC between the TIO, HIO, and IV groups (p > 0.05). The Cmax was significantly higher in the IV group compared to the TIO group (p = 0.015), but no significant difference between the TIO versus HIO or HIO versus IV groups (p > 0.05). The Tmax was significantly shorter for the HIO compared to the TIO group (p = 0.034), but no significant differences between the IV group compared to the TIO or HIO groups (p > 0.05). The odds of survival were higher in the HIO group compared to all other groups. The TIO and HIO provide rapid and reliable access to administer life-saving medications during cardiac arrest.
Survival from colorectal cancer in Victoria: 10-year follow up of the 1987 management survey.
McLeish, John A; Thursfield, Vicky J; Giles, Graham G
2002-05-01
In 1987, the Victorian Cancer Registry identified a population-based sample of patients who underwent surgery for colorectal cancer for an audit of management following resection. Over 10 years have passed since this survey, and data on the survival of these patients (incorporating various prognostic indicators collected at the time of the survey) are discussed in the present report. Relative survival analysis was conducted for each prognostic indicator separately and then combined in a multivariate model. Relative survival at 5 years for patients undergoing curative resections was 76%, compared with 7% for those whose treatment was considered palliative. Survival at 10 years was little changed (73% and 7%, respectively). Survival did not differ significantly by sex or age irrespective of treatment intention. In the curative group, only stage was a significant predictor of survival. Multivariate analysis was performed only for the curative group. Adjusting for all variables simultaneously, stage remained the only significant predictor of survival. Patients with Dukes' stage C disease were at a significantly greater risk (OR 5.5 (1.7-17.6)) than those with Dukes' A. Neither tumour site, sex, age, surgeon activity level, nor adjuvant therapies made a significant contribution to the model.
Wanjugi, P; Fox, G A; Harwood, V J
2016-10-01
Nutrient levels, competition from autochthonous microorganisms, and protozoan predation may all influence survival of fecal microorganisms as they transition from the gastrointestinal tract to aquatic habitats. Although Escherichia coli is an important indicator of waterborne pathogens, the effects of environmental stressors on its survival in aquatic environments remain poorly understood. We manipulated organic nutrient, predation, and competition levels in outdoor microcosms containing natural river water, sediments, and microbial populations to determine their relative contribution to E. coli survival. The activities of predator (protozoa) and competitor (indigenous bacteria) populations were inhibited by adding cycloheximide or kanamycin. We developed a statistical model of E. coli density over time that fits with the data under all experimental conditions. Predation and competition had significant negative effects on E. coli survival, while higher nutrient levels increased survival. Among the main effects, predation accounted for the greatest variation (40 %) compared with nutrients (25 %) or competition (15 %). The highest nutrient level mitigated the effect of predation on E. coli survival. Thus, elevated organic nutrients may disproportionately enhance the survival of E. coli, and potentially that of other enteric bacteria, in aquatic habitats.
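The abstract above does not specify the form of the authors' statistical model of E. coli density over time. A common starting point for such die-off data is a first-order (log-linear) decay model, sketched here on made-up microcosm counts; the data, decay rate, and one-log reduction time below are illustrative assumptions, not the study's results.

```python
import math

# Hypothetical microcosm observations: E. coli counts (CFU/mL) by day.
days   = [0, 1, 2, 3, 4, 5]
counts = [1e6, 5.1e5, 2.4e5, 1.3e5, 6.0e4, 3.2e4]

# First-order decay: log10 N(t) = log10 N0 - k*t, with the decay rate k
# estimated by ordinary least squares on the log-transformed counts.
n = len(days)
y = [math.log10(c) for c in counts]
xbar = sum(days) / n
ybar = sum(y) / n
k = -sum((x - xbar) * (yi - ybar) for x, yi in zip(days, y)) / \
    sum((x - xbar) ** 2 for x in days)

t90 = 1 / k   # time (days) for a 1-log10, i.e. 90%, reduction
```

In a factorial experiment like the one described, a separate decay rate could be fitted per nutrient/predation/competition treatment and the rates compared.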
Schwietzer, A; Kessler, M; Kandel-Tschiederer, B
2012-10-17
Combination therapy of intranasal tumours in dogs with palliative cobalt-60 radiation and carboplatin chemotherapy. Twenty-five dogs with intranasal tumours were treated in the Hofheim Veterinary Hospital (Germany) from 2004 to 2006 with a total radiation dose of 24 Gy (3 fractions of 8 Gy on days 0, 7 and 21) and five doses of carboplatin (270-300 mg/m² BSA i.v. every 21-28 days). In 88% of patients, clinical symptoms subsided partially or completely, resulting in improved quality of life. Computed tomography revealed partial (5/25) or complete (5/25) tumour remissions. Chemotherapy was well tolerated. Radiation therapy caused no or minimal side effects except for 3 dogs (12%), which experienced serious ocular side effects resulting in loss of vision in the affected eye, and one dog with epileptic seizures. Survival times ranged from 10 to 639 days with a median of 156 days. There was no statistically significant correlation between the parameters breed, age, sex, brain invasion or tumour stage and survival time or progression-free interval. Survival time and progression-free interval were significantly correlated with the degree of tumour remission. It can be concluded from this study that palliative radiation therapy combined with chemotherapy results in excellent palliation of clinical symptoms and acceptable survival times. There was no advantage of combined therapy (radiation with carboplatin) when compared to literature data on results of radiation therapy alone.
Rajan, Shahzleen; Wissenberg, Mads; Folke, Fredrik; Hansen, Steen Møller; Gerds, Thomas A; Kragholm, Kristian; Hansen, Carolina Malta; Karlsson, Lena; Lippert, Freddy K; Køber, Lars; Gislason, Gunnar H; Torp-Pedersen, Christian
2016-12-20
Bystander-initiated cardiopulmonary resuscitation (CPR) increases patient survival after out-of-hospital cardiac arrest, but it is unknown to what degree bystander CPR remains positively associated with survival with increasing time to potential defibrillation. The main objective was to examine the association of bystander CPR with survival as time to advanced treatment increases. We studied 7623 out-of-hospital cardiac arrest patients between 2005 and 2011, identified through the nationwide Danish Cardiac Arrest Registry. Multiple logistic regression analysis was used to examine the association between time from 911 call to emergency medical service arrival (response time) and survival according to whether bystander CPR was provided (yes or no). Reported are 30-day survival chances with 95% bootstrap confidence intervals. With increasing response times, adjusted 30-day survival chances decreased for both patients with bystander CPR and those without. However, the contrast between the survival chances of patients with versus without bystander CPR increased over time: within 5 minutes, 30-day survival was 14.5% (95% confidence interval [CI]: 12.8-16.4) versus 6.3% (95% CI: 5.1-7.6), corresponding to 2.3 times higher chances of survival associated with bystander CPR; within 10 minutes, 30-day survival chances were 6.7% (95% CI: 5.4-8.1) versus 2.2% (95% CI: 1.5-3.1), corresponding to 3.0 times higher chances of 30-day survival associated with bystander CPR. The contrast in 30-day survival became statistically insignificant when response time was >13 minutes (bystander CPR vs no bystander CPR: 3.7% [95% CI: 2.2-5.4] vs 1.5% [95% CI: 0.6-2.7]), but 30-day survival was still 2.5 times higher associated with bystander CPR. 
Based on the model and Danish out-of-hospital cardiac arrest statistics, an additional 233 patients could potentially be saved annually if response time was reduced from 10 to 5 minutes and 119 patients if response time was reduced from 7 (the median response time in this study) to 5 minutes. The absolute survival associated with bystander CPR declined rapidly with time. Yet bystander CPR while waiting for an ambulance was associated with a more than doubling of 30-day survival even in case of long ambulance response time. Decreasing ambulance response time by even a few minutes could potentially lead to many additional lives saved every year. © 2016 American Heart Association, Inc.
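The contrast reported above (survival chances with vs without bystander CPR at a given response time) comes from predicted probabilities of a fitted logistic regression. A minimal sketch of how such a contrast is computed; the coefficients here are made up for illustration, since the study's fitted values are not reported in the abstract:

```python
import math

def survival_prob(cpr, minutes, b0=-2.0, b_cpr=0.9, b_time=-0.15):
    """Predicted 30-day survival probability from a logistic model.
    NOTE: b0, b_cpr, b_time are hypothetical coefficients, not the
    Danish registry study's estimates."""
    logit = b0 + b_cpr * cpr + b_time * minutes
    return 1.0 / (1.0 + math.exp(-logit))

# Contrast at a 5-minute response time: with vs without bystander CPR.
with_cpr = survival_prob(1, 5)
without_cpr = survival_prob(0, 5)
ratio_5 = with_cpr / without_cpr  # "x-times higher chance of survival"
```

The study's reported figures (e.g. 14.5% vs 6.3% at 5 minutes, a 2.3-fold contrast) are exactly this kind of ratio of adjusted predicted probabilities, with confidence intervals obtained by bootstrap.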
Survival of Salmonella Newport in oysters.
Morrison, Christopher M; Armstrong, Alexandra E; Evans, Sanford; Mild, Rita M; Langdon, Christopher J; Joens, Lynn A
2011-08-02
Salmonella enterica is the leading cause of laboratory-confirmed foodborne illness in the United States, and raw shellfish consumption is a commonly implicated source of gastrointestinal pathogens. A 2005 epidemiological study done in our laboratory by Brands et al. showed that oysters in the United States are contaminated with Salmonella, and in particular, a specific strain of the Newport serovar. This work sought to further investigate the host-microbe interactions between Salmonella Newport and oysters. A procedure was developed to reliably and repeatedly expose oysters to enteric bacteria and quantify the subsequent levels of bacterial survival. The results show that 10 days after an exposure to Salmonella Newport, an average concentration of 3.7 × 10³ CFU/g remains within the oyster meat, and even after 60 days there can still be more than 10² CFU/g remaining. However, the strain of Newport that predominated in the market survey done by Brands et al. does not survive within oysters or the estuarine environment better than any other strains of Salmonella we tested. Using this same methodology, we compared Salmonella Newport's ability to survive within oysters to a non-pathogenic strain of E. coli and found that after 10 days the concentration of Salmonella was 200 times greater than that of E. coli. We also compared those same strains of Salmonella and E. coli in a depuration process to determine if a constant 120 L/h flux of clean seawater could significantly reduce the concentration of bacteria within oysters and found that after 3 days the oysters retained over 10⁴ CFU/g of Salmonella while the oysters exposed to the non-pathogenic strain of E. coli contained 100 times fewer bacteria. Overall, the results of this study demonstrate that any of the clinically relevant serovars of Salmonella can survive within oysters for significant periods of time after just one exposure event. 
Based on the drastic differences in survivability between Salmonella and a non-pathogenic relative, the results of this study also suggest that unidentified virulence factors may play a role in Salmonella's interactions with oysters. Published by Elsevier B.V.
Culp, William T. N.; Olea-Popelka, Francisco; Sefton, Jennifer; Aldridge, Charles F.; Withrow, Stephen J.; Lafferty, Mary H.; Rebhun, Robert B.; Kent, Michael S.; Ehrhart, Nicole
2015-01-01
Objective To evaluate clinical characteristics, outcome, and prognostic variables in a cohort of dogs surviving > 1 year after an initial diagnosis of osteosarcoma. Design Retrospective case series. Animals 90 client-owned dogs. Procedures Medical records for an 11-year period from 1997 through 2008 were reviewed, and patients with appendicular osteosarcoma that lived > 1 year after initial histopathologic diagnosis were studied. Variables including signalment, weight, serum alkaline phosphatase activity, tumor location, surgery, and adjuvant therapies were recorded. Median survival times were calculated by means of a Kaplan-Meier survival function. Univariate analysis was conducted to compare the survival function for categorical variables, and the Cox proportional hazard model was used to evaluate the likelihood of death > 1 year after diagnosis on the basis of the selected risk factors. Results 90 dogs met the inclusion criteria; clinical laboratory information was not available in all cases. Median age was 8.2 years (range, 2.7 to 13.3 years), and median weight was 38 kg (83.6 lb; range, 21 to 80 kg [46.2 to 176 lb]). Serum alkaline phosphatase activity was high in 29 of 60 (48%) dogs. The most common tumor location was the distal portion of the radius (54/90 [60%]). Eighty-nine of 90 (99%) dogs underwent surgery, and 78 (87%) received chemotherapy. Overall, 49 of 90 (54%) dogs developed metastatic disease. The median survival time beyond 1 year was 243 days (range, 1 to 1,899 days). Dogs that developed a surgical-site infection after limb-sparing surgery had a significantly improved prognosis > 1 year after osteosarcoma diagnosis, compared with dogs that did not develop infections. Conclusions and Clinical Relevance Results of the present study indicated that dogs with an initial diagnosis of osteosarcoma that lived > 1 year had a median survival time beyond the initial year of approximately 8 months. 
As reported previously, the development of a surgical-site infection in dogs undergoing a limb-sparing surgery significantly affected prognosis and warrants further study. PMID:25356715
Elliott, Thomas B.; Bolduc, David L.; Ledney, G. David; Kiang, Juliann G.; Fatanmi, Oluseyi O.; Wise, Stephen Y.; Romaine, Patricia L. P.; Newman, Victoria L.; Singh, Vijay K.
2015-01-01
Purpose: A combination therapy for combined injury (CI) using a non-specific immunomodulator, synthetic trehalose dicorynomycolate and monophosphoryl lipid A (STDCM-MPL), was evaluated to augment oral antimicrobial agents, levofloxacin (LVX) and amoxicillin (AMX), to eliminate endogenous sepsis and modulate cytokine production. Materials and methods: Female B6D2F1/J mice received 9.75 Gy cobalt-60 gamma-radiation and wound trauma. Bacteria were isolated and identified in three tissues. Incidences of bacteria and cytokine levels were compared between treatment groups. Results: Results demonstrated that the lethal dose for 50% at 30 days (LD50/30) of B6D2F1/J mice was 9.42 Gy. Antimicrobial therapy increased survival in radiation-injured (RI) mice. Combination therapy increased survival after RI and extended survival time but did not increase survival after CI. Sepsis began five days earlier in CI mice than RI mice, with Gram-negative species predominating early and Gram-positive species increasing later. LVX plus AMX eliminated sepsis in CI and RI mice. STDCM-MPL eliminated Gram-positive bacteria in CI and most RI mice but not Gram-negative. Treatments significantly modulated 12 cytokines tested, which pertain to wound healing or elimination of infection. Conclusions: Combination therapy eliminates infection and prolongs survival time but does not assure CI mouse survival, suggesting that additional treatment for proliferative-cell recovery is required. PMID:25994812
Akilimali, P Z; Mutombo, P B; Kayembe, P K; Kaba, D K; Mapatano, M A
2014-06-01
The study aimed to identify factors associated with the survival of patients receiving antiretroviral therapy. A historic cohort of HIV patients from two major hospitals in Goma (Democratic Republic of Congo) was followed from 2004 to 2012. The Kaplan-Meier method was used to describe the probability of survival as a function of time since inclusion into the cohort. The log-rank test was used to compare survival curves based on determinants. The Cox regression model identified the determinants of survival since treatment initiation. The median follow-up time was 3.56 years (IQR=2.22-5.39). The mortality rate was 40 deaths per 1000 person-years. Male gender (RR: 2.56; 95% CI 1.66-4.83), advanced clinical stage (RR: 2.12; 95% CI 1.15-3.90), low CD4 count (CD4 < 50) (RR: 2.05; 95% CI 1.22-3.45), anemia (RR: 3.95; 95% CI 2.60-6.01), chemoprophylaxis with cotrimoxazole (RR: 4.29; 95% CI 2.69-6.86) and period of treatment initiation (2010-2011) (RR: 3.34; 95% CI 1.24-8.98) were statistically associated with short survival. Initiation of treatment at an early stage of the disease with use of less toxic molecules and an increased surveillance especially of male patients are recommended to reduce mortality. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
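The Kaplan-Meier method named above multiplies, over the ordered event times, the conditional probabilities of surviving each time, with censored subjects leaving the risk set without contributing a death. A minimal pure-Python sketch of the product-limit estimator on toy data (not the study's):

```python
def kaplan_meier(durations, events):
    """Kaplan-Meier product-limit estimator.
    durations: follow-up times; events: 1 = death observed, 0 = censored.
    Returns (time, survival probability) pairs at each observed death time."""
    at_risk = len(durations)
    data = sorted(zip(durations, events))
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = n_at_t = 0
        # Gather all subjects with this duration (deaths and censorings).
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            n_at_t += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk   # conditional survival at t
            curve.append((t, surv))
        at_risk -= n_at_t                    # drop this time's subjects
    return curve

# Toy example: deaths at t=1 and t=3, censorings at t=2 and t=4.
curve = kaplan_meier([1, 2, 3, 4], [1, 0, 1, 0])
# S(1) = 3/4; S(3) = 3/4 * 1/2 = 3/8
```

The log-rank test and Cox model then compare or regress on these survival experiences across determinant groups rather than on raw death counts.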
Han, Tianci; Shu, Tianci; Dong, Siyuan; Li, Peiwen; Li, Weinan; Liu, Dali; Qi, Ruiqun; Zhang, Shuguang; Zhang, Lin
2017-05-01
Decreased expression of human chemokine-like factor-like MARVEL transmembrane domain-containing 3 (CMTM3) has been identified in a number of human tumors and tumor cell lines, including gastric and testicular cancer, and PC3, CAL27 and Tca-83 cell lines. However, the association between CMTM3 expression and the clinicopathological features and prognosis of esophageal squamous cell carcinoma (ESCC) patients remains unclear. The aim of the present study was to investigate the correlation between CMTM3 expression and clinicopathological parameters and prognosis in ESCC. CMTM3 mRNA and protein expression was analyzed in ESCC and paired non-tumor tissues by quantitative real-time polymerase chain reaction, western blotting and immunohistochemical analysis. The Kaplan-Meier method was used to plot survival curves and the Cox proportional hazards regression model was also used for univariate and multivariate survival analysis. The results revealed that CMTM3 mRNA and protein expression levels were lower in 82.5% (30/40) and 75% (30/40) of ESCC tissues, respectively, when compared with matched non-tumor tissues. Statistical analysis demonstrated that CMTM3 expression was significantly correlated with lymph node metastasis (P=0.002) and clinical stage (P<0.001) in ESCC tissues. Furthermore, the survival time of ESCC patients exhibiting low CMTM3 expression was significantly shorter than that of ESCC patients exhibiting high CMTM3 expression (P=0.01). In addition, Kaplan-Meier survival analysis revealed that the overall survival time of patients exhibiting low CMTM3 expression was significantly decreased compared with patients exhibiting high CMTM3 expression (P=0.010). Cox multivariate analysis indicated that CMTM3 protein expression was an independent prognostic predictor for ESCC after resection. 
This study indicated that CMTM3 expression is significantly decreased in ESCC tissues and CMTM3 protein expression in resected tumors may present an effective prognostic biomarker.
Paula, Adriano R; Carolino, Aline T; Paula, Cátia O; Samuels, Richard I
2011-01-25
Dengue fever, transmitted by the mosquito Aedes aegypti, is one of the most rapidly spreading insect-borne diseases, stimulating the search for alternatives to current control measures. The dengue vector A. aegypti has received less attention than anopheline species, although more than 2.5 billion people are at risk of infection worldwide. Entomopathogenic fungi are emerging as potential candidates for the control of mosquitoes. Here we continue our studies on the pathogenicity of the entomopathogenic fungus Metarhizium anisopliae against adult A. aegypti females. With the aim of further reducing mean survival times of A. aegypti exposed to fungus-impregnated surfaces, a sub-lethal concentration of the neonicotinoid insecticide imidacloprid (IMI) was added to fungal suspensions. A sub-lethal concentration of IMI that did not significantly alter the daily survival rates or mean survival percentages of mosquitoes was identified to be 0.1 ppm. This sub-lethal concentration was combined with M. anisopliae conidia (1 × 10⁹ conidia mL⁻¹). Both the combined treatment and the conidia alone reduced the survival of A. aegypti compared with untreated or IMI-treated mosquitoes. Importantly, mosquito survival following exposure to the combined treatment for 6 and 12 h was significantly reduced when compared with mosquitoes exposed to conidia alone. This is the first time that a combination of an insecticide and an entomopathogenic fungus has been tested against A. aegypti. Firstly, the study showed the potential of IMI as an alternative to the currently employed pyrethroid adulticides. Secondly, as an alternative to applications of high concentrations of chemical insecticides, we suggest that adult A. aegypti could be controlled by surface application of entomopathogenic fungi and that the efficiency of these fungi could be increased by combining the fungi with ultra-low concentrations of insecticides, resulting in higher mortality following relatively short exposure times.
All-cause mortality and multimorbidity in older adults: The role of social support and loneliness.
Olaya, Beatriz; Domènech-Abella, Joan; Moneta, Maria Victoria; Lara, Elvira; Caballero, Francisco Félix; Rico-Uribe, Laura Alejandra; Haro, Josep Maria
2017-12-01
To determine whether the effect of multimorbidity on time to mortality is modified by level of social support and loneliness in a representative sample of 2113 participants aged 60+. Vital status was ascertained through national registers or by asking participants' relatives. Baseline variables included number of illnesses, self-perceived social support (Oslo social support scale) and loneliness (UCLA loneliness scale). Kaplan-Meier survival curves were used to estimate the time to death by multimorbidity, social support and loneliness. Adjusted Cox proportional hazards regression models were conducted to explore interactions between multimorbidity and social support and loneliness. Multimorbidity was associated with a lower probability of survival, whereas high loneliness and low social support were not related to time to death. Only the multimorbidity × social support interaction was significant. Participants with low social support and 2 chronic diseases, compared with none, presented a lower probability of survival (HR=2.43, 95% CI=1.14-5.18, p<0.05), whereas multimorbidity, in comparison with having no chronic conditions, did not affect mortality if participants had high social support. For participants with low social support, there were no differences between having one, two or more than two diseases. When there is high social support, the probability of death is significantly lower if one or two chronic diseases are present, compared with more than two. These findings indicate that having a supportive social environment increases the survival of people with physical illnesses, especially those with one or two illnesses. For those with more than two illnesses, survival remains unchanged regardless of the level of social support, and other protective factors should be explored in future research. Geriatric health professionals are encouraged to evaluate social relationships and stimulate support given by relatives, friends or neighbors. 
Copyright © 2017 Elsevier Inc. All rights reserved.
Shi, Yuankai; Zhou, Caicun; Liu, Xiaoqing; Wang, Dong; Song, Yong; Li, Qiang; Feng, Jifeng; Qin, Shukui; Xv, Nong; Zhou, Jianying; Zhang, Li; Hu, Chunhong; Zhang, Shucai; Luo, Rongcheng; Wang, Jie; Tan, Fenlai; Wang, Yinxiang; Ding, Lieming; Sun, Yan
2015-01-01
Background Icotinib is a small molecule targeting epidermal growth factor receptor tyrosine kinase, which showed non-inferior efficacy and better safety compared with gefitinib in a previous phase III trial. The present study was designed to further evaluate the efficacy and safety of icotinib in patients with advanced non-small-cell lung cancer (NSCLC) previously treated with platinum-based chemotherapy. Methods Patients with NSCLC progressing after one or two lines of chemotherapy were enrolled to receive oral icotinib (125 mg tablet, three times per day). The primary endpoint was progression-free survival. The secondary endpoints included overall survival, objective response rate, time to progression, quality of life and safety. Results From March 16, 2010 to October 9, 2011, 128 patients from 15 centers nationwide were enrolled, of whom 124 were evaluable for efficacy and 127 for safety. The median progression-free survival and time to progression were 5.0 months (95% CI 2.9–6.6 months) and 5.4 months (95% CI 3.1–7.9 months), respectively. The objective response rate and disease control rate were 25.8% and 67.7%, respectively. Median overall survival exceeded 17.6 months (95% CI 14.2 months–NA) according to censored data. Further follow-up of overall survival is ongoing. The most frequent treatment-related adverse events were rash (26%, 33/127), diarrhea (12.6%, 16/127) and elevation of transaminases (15.7%, 20/127). Conclusions In general, this study showed similar efficacy and numerically better safety when compared with the ICOGEN trial, further confirming the efficacy and safety of icotinib in treating patients with advanced NSCLC previously treated with chemotherapy. Trial Registration ClinicalTrials.gov NCT02486354 PMID:26599904
Evaluation of red cell distribution width in dogs with pulmonary hypertension.
Swann, James W; Sudunagunta, Siddharth; Covey, Heather L; English, Kate; Hendricks, Anke; Connolly, David J
2014-12-01
To compare red cell distribution width (RDW) between dogs with different causes of pulmonary hypertension (PH) and a control dog population to determine whether RDW was correlated with severity of PH as measured by echocardiography. A further aim was to determine the prognostic significance of increased RDW for dogs with PH. Forty-four client-owned dogs with PH and 79 control dogs presented to a single tertiary referral institution. Signalment, clinical pathological and echocardiographic data were obtained retrospectively from the medical records of dogs with PH, and RDW measured on a Cell-Dyn 3500 was compared between dogs with pre- and post-capillary PH and a control population. Referring veterinary surgeons were contacted for follow-up information and Kaplan-Meier analysis was conducted to investigate differences in survival time between affected dogs with different RDW values. The RDW was significantly greater in dogs with pre-capillary PH compared to control dogs. There was no difference in median survival times between dogs with PH divided according to RDW values. The RDW was positively correlated with mean corpuscular volume and haematocrit in dogs with PH, but did not correlate with echocardiographic variables. An association was found between dogs with PH and increased RDW; however there was considerable overlap in values between control dogs and dogs with PH. The RDW was not associated with survival in this study. Copyright © 2014 Elsevier B.V. All rights reserved.
López-Padilla, Daniel; Peghini Gavilanes, Esteban; Revilla Ostolaza, Teresa Yolanda; Trujillo, María Dolores; Martínez Serna, Iván; Arenas Valls, Nuria; Girón Matute, Walther Iván; Larrosa-Barrero, Roberto; Manrique Mutiozabal, Adriana; Pérez Gallán, Marta; Zevallos, Annette; Sayas Catalán, Javier
2016-10-01
To determine the prevalence of arterial stump thrombosis (AST) after pulmonary resection surgery for lung cancer and to describe subsequent radiological follow-up and treatment. Observational, descriptive study of AST detected by computerized tomography angiography (CT) using intravenous contrast. Clinical and radiological variables were compared, and a survival analysis using Kaplan-Meier curves was performed after dividing patients into 3 groups: patients with AST, patients with pulmonary embolism (PE), and patients without AST or PE. Nine cases of AST were detected after a total of 473 surgeries (1.9%), 6 of them in right-sided surgeries (67% of AST cases). Median time to detection after surgery was 11.3 months (interquartile range 2.7-42.2 months), with an overall span of 67.5 months (1.4-69.0 months). Statistically significant differences were found only in the number of CTs performed in AST patients compared to those without AST or PE, and in tumor recurrence in PE patients compared to the other 2 groups. No differences were found in baseline or oncological characteristics, nor in the survival analysis. In this series, AST prevalence was low and tended to occur in right-sided surgeries. Detection over time was variable, and unrelated to risk factors previous to surgery, histopathology, and tumor stage or recurrence. AST had no impact on patient survival. Copyright © 2016 SEPAR. Publicado por Elsevier España, S.L.U. All rights reserved.
Zhang, H-L; Li, L; Cheng, C-J; Sun, X-C
2018-02-01
The study aims to detect the association of miR-146a-5p with intracranial aneurysms (IAs). The expression of miR-146a-5p was compared in plasma samples from 72 patients with IAs and 40 healthy volunteers by quantitative real-time polymerase chain reaction (qRT-PCR). Statistical analysis was performed to analyze the relationship between miR-146a-5p expression and clinical data and overall survival (OS) time of IA patients. Univariate and multivariate Cox proportional hazards analyses were also performed. Notably, higher miR-146a-5p expression was found in plasma samples from the 72 patients with IAs compared with the 40 healthy controls. Higher miR-146a-5p expression was significantly associated with rupture and Hunt-Hess grade in IA patients. Kaplan-Meier survival analysis verified that higher miR-146a-5p expression predicted a shorter overall survival (OS) compared with lower miR-146a-5p expression in IA patients. Univariate and multivariate Cox proportional hazards analyses demonstrated that higher miR-146a-5p expression, rupture, and Hunt-Hess grade were independent risk factors for OS in patients with IAs. MiR-146a-5p expression may serve as a biomarker for predicting prognosis in patients with IAs.
Shi, Minghan; Fortin, David; Sanche, Léon; Paquette, Benoit
2015-01-01
The prognosis for patients with glioblastoma remains poor with current treatments. Although platinum-based drugs are sometimes offered at relapse, their efficacy in this setting is still disputed. In this study, we used convection-enhanced delivery (CED) to deliver platinum-based drugs (cisplatin, carboplatin, and Lipoplatin™, a liposomal formulation of cisplatin) directly into the tumor of F98 glioma-bearing rats that were subsequently treated with γ radiation (15 Gy). CED increased the concentration of these platinum-based drugs in the brain tumor by factors of 17 to 111 compared to intra-venous (i.v.) administration, and by 9- to 34-fold compared to intra-arterial (i.a.) administration. Furthermore, CED resulted in better systemic tolerance to platinum drugs compared to their i.a. injection. Among the drugs tested, carboplatin showed the highest maximum tolerated dose (MTD). Treatment with carboplatin resulted in the best median survival time (MeST) (38.5 days), which was further increased by the addition of radiotherapy (54.0 days). Although DNA-bound platinum adducts were higher at 4 h after CED than at 24 h in the carboplatin group, combination with radiotherapy led to a similar improvement in median survival time. However, less toxicity was observed in animals irradiated 24 h after CED-based chemotherapy. In conclusion, CED increased the accumulation of platinum drugs in the tumor, reduced toxicity, and resulted in a higher median survival time. The best treatment was obtained in animals treated with carboplatin and irradiated 24 h later. PMID:25784204
Harb, Afif; von Horn, Alexander; Gocalek, Kornelia; Schäck, Luisa Marilena; Clausen, Jan; Krettek, Christian; Noack, Sandra; Neunaber, Claudia
2017-07-01
Due to the rising interest in Europe in treating large cartilage defects with osteochondral allografts, research aims to find a suitable solution for long-term storage of osteochondral allografts. This is further encouraged by the fact that legal restrictions currently limit the use of ingredients from animal or human sources that are used in other regions of the world (e.g. in the USA). Therefore, the aim of this study was A) to analyze whether a Lactated Ringer (LR)-based solution is as efficient as Dulbecco's modified Eagle's minimal essential medium (DMEM) in maintaining chondrocyte viability and B) at which storage temperature (4°C vs. 37°C) chondrocyte survival of the osteochondral allograft is optimally sustained. 300 cartilage grafts were collected from the knees of ten one-year-old Black Head German Sheep. The grafts were stored in four different storage solutions (one of them DMEM-based, the other three based on Lactated Ringer solution) at two different temperatures (4°C and 37°C) for 14 and 56 days. At both points in time, chondrocyte survival and death rates, glycosaminoglycan (GAG) content, and hydroxyproline (HP) concentration were measured and compared between the grafts stored in the different solutions and at the different temperatures. Independent of the storage solution tested, chondrocyte survival rates were higher when stored at 4°C than at 37°C, both after short-term (14 days) and long-term (56 days) storage. At no point in time did the DMEM-based solution show superior chondrocyte survival compared to the Lactated Ringer-based solutions. GAG and HP content were comparable across all time points, temperatures, and solutions. LR-based solutions that contain only substances approved in Germany may be just as efficient for storing grafts as the DMEM-based gold standard used in the USA. Moreover, in the present experiment, storage of osteochondral allografts at 4°C was superior to storage at 37°C. Copyright © 2017 Elsevier Ltd. All rights reserved.
Fecal indicator bacteria persistence under natural conditions in an ice-covered river.
Davenport, C V; Sparrow, E B; Gordon, R C
1976-01-01
Total coliform (TC), fecal coliform (FC), and fecal streptococcus (FS) survival characteristics, under natural conditions at 0 degrees C in an ice-covered river, were examined during February and March 1975. The membrane filter (MF) technique was used throughout the study, and the multiple-tube (MPN) method was used in parallel on three preselected days for comparative recovery of these bacteria. Survival was studied at seven sample stations downstream from all domestic pollution sources in a 317-km reach of the river having 7.1 days mean flow time (range of 6.0 to 9.1 days). The mean indicator bacteria densities decreased continuously at successive stations in this reach and, after adjustment for dilution, the most rapid die-off was found to occur during the first 1.9 days, followed by a slower decrease. After 7.1 days, the relative survival was TC less than FC less than FS, with 8.4%, 15.7%, and 32.8% of the initial populations remaining viable, respectively. These rates are higher than previously reported and suggest that the highest survival rates for these bacteria in receiving streams can be expected at 0 degrees C under ice cover. Additionally, the FC-FS ratio was greater than 5 at all stations, indicating that this ratio may be usable for determining the source of fecal pollution in receiving streams for greater than 7 days flow time at low water temperatures. The MPN and MF methods gave comparable results for the TC and FS at all seven sample stations, with both the direct and verified MF counts within the 95% confidence limits of the respective MPNs in most samples, but generally lower than the MPN index. FC recovery on membrane filters, however, gave lower results at stations near the pollution source; the results became more comparable with increasing flow time. The results of this study indicate that heat shock is a major factor in suppression of the FC counts on membrane filters at 44.5 degrees C. Heat shock may be minimized by extended incubation at 35 degrees C before exposure to the higher temperature. PMID:825042
Si, Anfeng; Li, Jun; Xing, Xianglei; Lei, Zhengqing; Xia, Yong; Yan, Zhenlin; Wang, Kui; Shi, Lehua; Shen, Feng
2017-04-01
Tumor recurrence after liver resection for intrahepatic cholangiocarcinoma is common, and the effective treatment for recurrent intrahepatic cholangiocarcinoma remains to be established. This study evaluated the short- and long-term prognoses of patients after repeat hepatic resection for recurrent intrahepatic cholangiocarcinoma. Data for 72 patients who underwent R0 repeat hepatic resection for recurrent intrahepatic cholangiocarcinoma at the Eastern Hepatobiliary Surgery Hospital between 2005 and 2013 were analyzed. Tumor re-recurrence, recurrence-to-death survival, and overall survival were calculated and compared using the Kaplan-Meier method and the log-rank test. Independent risk factors were identified by Cox regression analysis. Operative morbidity and mortality rates were 18.1% and 1.4%, respectively. The 1-, 2-, and 3-year re-recurrence rates were 53.2%, 80.2%, and 92.6%, respectively, and the corresponding recurrence-to-death survival was 82.9%, 53.0%, and 35.3%, respectively. The 1-, 3-, and 5-year overall survival was 97.2%, 67.0%, and 41.9%, respectively. Patients with a time to recurrence of >1 year from the initial hepatectomy achieved higher 1-, 2-, and 3-year recurrence-to-death survival than patients with a time to recurrence of ≤1 year (92.5%, 61.7%, and 46.6% vs 70.4%, 42.2%, and 23.0%, P = .022). Multivariate analysis identified recurrent tumor >3 cm (hazard ratio: 2.346; 95% confidence interval: 1.288-4.274), multiple recurrent nodules (2.304; 1.049-5.059), cirrhosis (3.165; 1.543-6.491), and a time to recurrence of ≤1 year (1.872; 1.055-3.324) as independent risk factors for recurrence-to-death survival. Repeat hepatic resection for recurrent intrahepatic cholangiocarcinoma was safe and produced long-term survival in selected patients, stratified by the independent risk factors of recurrence-to-death survival. Copyright © 2016 Elsevier Inc. All rights reserved.
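Survival figures like the 1-, 3-, and 5-year rates above are produced by the Kaplan-Meier (product-limit) method named in the abstract. A minimal, self-contained sketch of the estimator, using toy follow-up data rather than the study's:

```python
from collections import Counter

def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.

    times  : follow-up time for each patient
    events : 1 if the event (recurrence/death) was observed, 0 if censored
    Returns [(event_time, survival_probability), ...].
    """
    deaths = Counter(t for t, e in zip(times, events) if e)
    at_risk = len(times)          # everyone is at risk at the first time point
    surv, curve = 1.0, []
    for t in sorted(set(times)):
        d = deaths.get(t, 0)
        if d:                      # multiply in (1 - d/n) at each event time
            surv *= 1.0 - d / at_risk
            curve.append((t, surv))
        at_risk -= sum(1 for x in times if x == t)
    return curve

# Toy cohort (months, event flags); illustrative only, not the study's data.
curve = kaplan_meier([1, 2, 3, 4, 5], [1, 1, 0, 1, 0])
print([(t, round(s, 3)) for t, s in curve])  # [(1, 0.8), (2, 0.6), (4, 0.3)]
```

Censored observations (event flag 0) reduce the risk set without forcing the curve downward, which is what distinguishes this estimator from a naive fraction-surviving calculation.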
Ozbek, C; Sever, K; Demirhan, O; Mansuroglu, D; Kurtoglu, N; Ugurlucan, M; Sevmis, S; Karakayali, H
2015-12-01
The aim of this study was to compare the mid- and long-term postoperative outcomes between hemodialysis-dependent patients awaiting kidney transplant who underwent open heart surgery in our department during the last five years and subsequently received a renal transplant, and those who did not, to determine the predictors of mortality, and to assess the possible contribution of kidney transplantation after heart surgery to survival. Between June 2008 and December 2012, 127 dialysis-dependent patients awaiting kidney transplant who underwent open heart surgery were separated into two groups: those who underwent transplantation after open heart surgery (Tp+, n=33) and those who did not (Tp-, n=94). Both groups were compared with respect to preoperative parameters including age, sex, diabetes mellitus (DM), hypertension (HT), hyperlipidemia (HL), obesity, smoking, chronic obstructive pulmonary disease (COPD), peripheral vascular disease (PVD), left ventricular ejection fraction (EF), and Euroscore; operative parameters including cross-clamp time, perfusion time, number of grafts, and use of the internal mammary artery (IMA); postoperative parameters including revision, blood transfusion, ventilation time, use of inotropic agents, and length of stay in the intensive care unit and hospital; and follow-up findings. Problems encountered during follow-up were recorded. Predictors of mortality were determined and survival was calculated. Among the preoperative parameters, the Tp+ group had significantly lower mean age, prevalence of DM, obesity, and PVD, and Euroscore levels, and higher EF values than the Tp- group.
Assessment of postoperative values showed that blood transfusion requirement and length of hospital stay were significantly lower in the Tp+ group than in the Tp- group, whereas the length of follow-up was significantly longer in the Tp+ group. The use of inotropic agents was significantly higher in the Tp- group. A logistic regression analysis was performed to determine the factors affecting mortality. Revision (p=0.013), blood transfusion (p=0.017), ventilation time (p=0.019), and length of stay in the intensive care unit (p=0.009) were found to be predictors of mortality. Survival rates at years 1, 2, and 3 were 86.1%, 81%, and 77.5% in the Tp- group, and 96.0%, 96.3%, and 90.4% in the Tp+ group. Median survival was 41.35±2.02 in the Tp- group and 49.64±1.59 in the Tp+ group, significantly higher than in the Tp- group (p=0.048). Chronic renal failure is among the perioperative risk factors for patients undergoing open heart surgery. Transplantation remains an important health issue due to the insufficiency of available donor organs. Patients with chronic renal failure are well known to have higher risks of coronary artery disease. Definitive treatment of cardiovascular problems prior to kidney transplantation appears to contribute significantly to post-transplant survival.
Antihypoxic activities of Eryngium caucasicum and Urtica dioica.
Khalili, M; Dehdar, T; Hamedi, F; Ebrahimzadeh, M A; Karami, M
2015-09-01
Urtica dioica and Eryngium spp. have been used in traditional medicine for many years. Despite many previous studies, nothing is known about their protective effect against hypoxia-induced lethality. Protective effects of U. dioica (UD) aerial parts and E. caucasicum (EC) inflorescence against hypoxia-induced lethality in mice were evaluated in three experimental models of hypoxia: asphyctic, haemic, and circulatory. Statistically significant protective activities were established at some doses of the extracts in all three models. Antihypoxic activity was especially pronounced for the polyphenol fractions in the asphyctic model. The EC polyphenol fraction at 400 mg/kg prolonged survival time (48.80 ± 4.86, p < 0.001), comparable with that of phenytoin (p > 0.05). It was also the most effective extract in the circulatory model, prolonging survival time significantly with respect to the control group (p < 0.001). UD extracts protected the mice, but the response was not dose-dependent. In the haemic model, extracts of EC significantly and dose-dependently prolonged survival time compared to the control group (p < 0.001). At 600 mg/kg, EC was the most effective, keeping the mice alive for 12.71 ± 0.75 min. Only the 300 mg/kg dose of UD was effective (p < 0.001). The extracts showed remarkable antihypoxic effects, which may be attributed to the presence of polyphenols in the extracts.
Pell, Jill P; Sirel, Jane M; Marsden, Andrew K; Ford, Ian; Walker, Nicola L; Cobbe, Stuart M
2002-01-01
Objective: To estimate the potential impact of public access defibrillators on overall survival after out of hospital cardiac arrest. Design: Retrospective cohort study using data from an electronic register. A statistical model was used to estimate the effect on survival of placing public access defibrillators at suitable or possibly suitable sites. Setting: Scottish Ambulance Service. Subjects: Records of all out of hospital cardiac arrests due to heart disease in Scotland in 1991-8. Main outcome measures: Observed and predicted survival to discharge from hospital. Results: Of 15 189 arrests, 12 004 (79.0%) occurred in sites not suitable for the location of public access defibrillators, 453 (3.0%) in possibly suitable sites, and 2732 (18.0%) in suitable sites. Defibrillation was given in 67.9% of arrests that occurred in possibly suitable sites and in 72.9% of arrests that occurred in suitable sites. Compared with an actual overall survival of 744 (5.0%), the predicted survival with public access defibrillators ranged from 942 (6.3%) to 959 (6.5%), depending on the assumptions made regarding defibrillator coverage. Conclusions: The predicted increase in survival from targeted provision of public access defibrillators is less than the increase achievable through expansion of first responder defibrillation to non-ambulance personnel, such as police or firefighters, or of bystander cardiopulmonary resuscitation. Additional resources for wide scale coverage of public access defibrillators are probably not justified by the marginal improvement in survival.
What is already known on this topic: Three quarters of all deaths from acute coronary events occur before the patient reaches a hospital. Defibrillation is an independent predictor of survival from out of hospital cardiac arrest. The probability of a rhythm being amenable to defibrillation declines with time. Interest in providing public access defibrillators to reduce the time to defibrillation has been growing, but their potential impact on overall survival is unknown.
What this study adds: Most arrests occur in sites unsuitable for locating public access defibrillators. Arrests that occur in sites suitable for locating defibrillators already have the best profile in terms of ambulance response time, use of defibrillation, and survival of the patient. Public access defibrillators are less likely to increase survival than expansion of first responder defibrillation or bystander cardiopulmonary resuscitation. PMID:12217989
Liu, Zun Chang; Chang, Thomas M.S.
2012-01-01
We implanted artificial cell bioencapsulated bone marrow mesenchymal stem cells (MSCs) into the spleens of 90% hepatectomized (PH) rats. The resulting 14-day survival rate was 91%, compared to survival rates of 21% in PH rats without transplantation and 25% in those receiving free MSCs transplanted the same way. Unlike free MSCs, the bioencapsulated MSCs are retained in the spleen, and their hepatotrophic factors can continue to drain directly into the liver without dilution, resulting in improved hepatic regeneration. In addition, with time, the transdifferentiation of MSCs into hepatocyte-like cells in the spleen renders the spleen an ectopic liver support. PMID:19132579
Ju, Yonghan; Sohn, So Young
2014-01-01
The main goal of this research is to identify variables related to the expected time to death due to road traffic accidents (RTAs). Such research is expected to be useful in improving safety laws and regulations and developing new safety systems. The resulting information is crucial not only for reducing accident fatalities but for assessing related insurance policies. In this article, we analyze factors that are potentially associated with variation in the expected survival time after a road traffic accident using Weibull regression. In particular, we consider the association with alcohol involvement, delta V, and restraint systems. Our empirical results, obtained from the NASS-CDS, indicate that the expected survival time for non-alcohol-impaired drivers is 3.23 times longer at a delta V of 50 km/h than that for alcohol-impaired drivers under the same conditions. In addition, it was observed that, even when occupants were alcohol-impaired, if they were protected by both air bags and seat belts, their expected survival time after an RTA increased 2.59-fold compared to alcohol-impaired drivers who used only seat belts. Our findings may be useful in improving road traffic safety and insurance policies by offering insights into the factors that reduce fatalities.
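Ratios like the 3.23-fold difference above fall out of a Weibull accelerated-failure-time (AFT) model, in which a covariate multiplies the time scale by exp(β). A hedged sketch: the shape parameter and the coefficient below are invented, with β chosen only to show how such a ratio arises (the gamma terms cancel, so the mean-survival ratio equals exp(β)):

```python
import math

def expected_survival(shape, scale):
    """Mean survival time of a Weibull(shape, scale) distribution."""
    return scale * math.gamma(1.0 + 1.0 / shape)

def accelerated_scale(base_scale, beta, x):
    """AFT model: covariate x multiplies the Weibull scale by exp(beta * x)."""
    return base_scale * math.exp(beta * x)

shape = 1.5                                    # assumed shape parameter
scale_impaired = 10.0                          # assumed baseline (reference) scale
scale_sober = accelerated_scale(10.0, beta=1.172, x=1)  # hypothetical coefficient

ratio = (expected_survival(shape, scale_sober)
         / expected_survival(shape, scale_impaired))
print(round(ratio, 2))  # 3.23
```

With shape fixed, the AFT interpretation is direct: every quantile of the survival distribution, not just the mean, is stretched by the same exp(β) factor.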
Sun, Fei; Ding, Wen; He, Jie-Hua; Wang, Xiao-Jing; Ma, Ze-Biao; Li, Yan-Fang
2015-10-20
Stomatin-like protein 2 (SLP-2, also known as STOML2) is a stomatin homologue of uncertain function. SLP-2 overexpression has been suggested to be associated with cancer progression, resulting in adverse clinical outcomes in patients. Our study aimed to investigate SLP-2 expression in epithelial ovarian cancer cells and its correlation with patient survival. SLP-2 mRNA and protein expression levels were analysed in five epithelial ovarian cancer cell lines and normal ovarian epithelial cells using real-time PCR and western blotting. SLP-2 expression was investigated in eight matched-pair samples of epithelial ovarian cancer and adjacent noncancerous tissues from the same patients. Using immunohistochemistry, we examined the protein expression of paraffin-embedded specimens from 140 patients with epithelial ovarian cancer, 20 cases with borderline ovarian tumours, 20 cases with benign ovarian tumours, and 20 cases with normal ovarian tissues. Statistical analyses were applied to evaluate the clinicopathological significance of SLP-2 expression. SLP-2 mRNA and protein expression levels were significantly up-regulated in epithelial ovarian cancer cell lines and cancer tissues compared with normal ovarian epithelial cells and adjacent noncancerous ovarian tissues. Immunohistochemistry analysis revealed relative overexpression of SLP-2 in 73.6% (103/140) of the epithelial ovarian cancer specimens, 45.0% (9/20) of the borderline ovarian specimens, 30.0% (6/20) of the benign ovarian specimens, and none of the normal ovarian specimens. SLP-2 protein expression in epithelial ovarian cancer was significantly correlated with tumour stage (P < 0.001). Epithelial ovarian cancer patients with higher SLP-2 protein expression levels had shorter progression-free survival and overall survival times compared to patients with lower SLP-2 protein expression levels. Multivariate analyses showed that SLP-2 expression level was an independent prognostic factor for survival in epithelial ovarian cancer patients. In summary, SLP-2 mRNA and protein were overexpressed in epithelial ovarian cancer tissues, and SLP-2 protein overexpression was associated with advanced-stage disease. Patients with higher SLP-2 protein expression had shorter progression-free survival and poorer overall survival times. Thus, SLP-2 protein expression is an independent prognostic factor for patients with epithelial ovarian cancer.
van den Beukel, Tessa O; Hommel, Kristine; Kamper, Anne-Lise; Heaf, James G; Siegert, Carl E H; Honig, Adriaan; Jager, Kitty J; Dekker, Friedo W; Norredam, Marie
2016-07-01
In Western countries, black and Asian dialysis patients experience better survival compared with white patients. The aim of this study is to compare the survival of native Danish dialysis patients with that of dialysis patients originating from other countries and to explore the association between the duration of residence in Denmark before the start of dialysis and the mortality on dialysis. We performed a population-wide national cohort study of incident chronic dialysis patients in Denmark (≥18 years old) who started dialysis between 1995 and 2010. In total, 8459 patients were native Danes, 344 originated from other Western countries, 79 from North Africa or West Asia, 173 from South or South-East Asia and 54 from sub-Saharan Africa. Native Danes were more likely to die on dialysis compared with the other groups (crude incidence rates for mortality: 234, 166, 96, 110 and 53 per 1000 person-years, respectively). Native Danes had greater hazard ratios (HRs) for mortality compared with the other groups {HRs for mortality adjusted for sociodemographic and clinical characteristics: 1.32 [95% confidence interval (CI) 1.14-1.54]; 2.22 [95% CI 1.51-3.23]; 1.79 [95% CI 1.41-2.27]; 2.00 [95% CI 1.10-3.57], respectively}. Compared with native Danes, adjusted HRs for mortality for Western immigrants living in Denmark for ≤10 years, >10 to ≤20 years and >20 years were 0.44 (95% CI 0.27-0.71), 0.56 (95% CI 0.39-0.82) and 0.86 (95% CI 0.70-1.04), respectively. For non-Western immigrants, these HRs were 0.42 (95% CI 0.27-0.67), 0.52 (95% CI 0.33-0.80) and 0.48 (95% CI 0.35-0.66), respectively. Incident chronic dialysis patients in Denmark originating from countries other than Denmark have a better survival compared with native Danes. For Western immigrants, this survival benefit declines among those who have lived in Denmark longer. For non-Western immigrants, the survival benefit largely remains over time. © The Author 2015. 
Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
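The crude mortality figures quoted above (e.g. 234 per 1000 person-years) are incidence rates: events divided by accumulated follow-up time. A minimal sketch with hypothetical counts (the 468 deaths and 2000 person-years below are invented, chosen only to reproduce a rate of the same order):

```python
def crude_incidence_rate(deaths, person_years, per=1000):
    """Crude incidence rate: deaths per `per` person-years of follow-up."""
    return deaths * per / person_years

# Hypothetical cohort: 468 deaths observed over 2000 person-years on dialysis.
print(crude_incidence_rate(468, 2000))  # 234.0 per 1000 person-years
```

Using person-time in the denominator, rather than the number of patients, is what lets cohorts with very different follow-up durations be compared on the same scale.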
Chang, Susan; Zhang, Peixin; Cairncross, J Gregory; Gilbert, Mark R; Bahary, Jean-Paul; Dolinskas, Carol A; Chakravarti, Arnab; Aldape, Kenneth D; Bell, Erica H; Schiff, David; Jaeckle, Kurt; Brown, Paul D; Barger, Geoffrey R; Werner-Wasik, Maria; Shih, Helen; Brachman, David; Penas-Prado, Marta; Robins, H Ian; Belanger, Karl; Schultz, Christopher; Hunter, Grant; Mehta, Minesh
2017-02-01
The primary objective of this study was to compare the overall survival (OS) of patients with anaplastic astrocytoma (AA) treated with radiotherapy (RT) and either temozolomide (TMZ) or a nitrosourea (NU). Secondary endpoints were time to tumor progression (TTP), toxicity, and the effect of IDH1 mutation status on clinical outcome. Eligible patients with centrally reviewed, histologically confirmed, newly diagnosed AA were randomized to receive either RT+TMZ (n = 97) or RT+NU (n = 99). The study closed early because the target accrual rate was not met. Median follow-up time for patients still alive was 10.1 years (1.9-12.6 y); 66% of the patients died. Median survival time was 3.9 years in the RT+TMZ arm (95% CI, 3.0-7.0) and 3.8 years in the RT+NU arm (95% CI, 2.2-7.0), corresponding to a hazard ratio (HR) of 0.94 (P = .36; 95% CI, 0.67-1.32). The differences in progression-free survival (PFS) and TTP between the 2 arms were not statistically significant. Patients in the RT+NU arm experienced more grade ≥3 toxicity (75.8% vs 47.9%, P < .001), mainly related to myelosuppression. Of the 196 patients, 111 were tested for IDH1-R132H status (60 RT+TMZ and 51 RT+NU). Fifty-four patients were IDH-negative and 49 were IDH-positive, with better OS in IDH-positive patients (median survival time 7.9 vs 2.8 y; P = .004, HR = 0.50; 95% CI, 0.31-0.81). RT+TMZ did not appear to significantly improve OS or TTP for AA compared with RT+NU. RT+TMZ was better tolerated. IDH1-R132H mutation was associated with longer survival. © The Author(s) 2016. Published by Oxford University Press on behalf of the Society for Neuro-Oncology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
Pulmonary Function and Survival in Idiopathic vs Secondary Usual Interstitial Pneumonia
Strand, Matthew J.; Sprunger, David; Cosgrove, Gregory P.; Fernandez-Perez, Evans R.; Frankel, Stephen K.; Huie, Tristan J.; Olson, Amy L.; Solomon, Joshua; Brown, Kevin K.
2014-01-01
BACKGROUND: The usual interstitial pneumonia (UIP) pattern of lung injury may occur in the setting of connective tissue disease (CTD), but it is most commonly found in the absence of a known cause, in the clinical context of idiopathic pulmonary fibrosis (IPF). Our objective was to observe and compare longitudinal changes in pulmonary function and survival between patients with biopsy-proven UIP found in the clinical context of either CTD or IPF. METHODS: We used longitudinal data analytic models to compare groups (IPF [n = 321] and CTD-UIP [n = 56]) on % predicted FVC (FVC %) or % predicted diffusing capacity of the lung for carbon monoxide (Dlco %), and we used both unadjusted and multivariable techniques to compare survival between these groups. RESULTS: There were no significant differences between groups in longitudinal changes in FVC % or Dlco % up to diagnosis, or from diagnosis to 10 years beyond (over which time, the mean decrease in FVC % per year [95% CI] was 4.1 [3.4, 4.9] for IPF and 3.5 [1.8, 5.1] for CTD-UIP, P = .49 for difference; and the mean decrease in Dlco % per year was 4.7 [4.0, 5.3] for IPF and 4.3 [3.0, 5.6] for CTD-UIP, P = .60 for difference). Despite the lack of differences in pulmonary function, subjects with IPF had worse survival in unadjusted (log-rank P = .003) and certain multivariable analyses. CONCLUSIONS: Despite no significant differences in changes in pulmonary function over time, patients with CTD-UIP (at least those with certain classifiable CTDs) live longer than patients with IPF—an observation that we suspect is due to an increased rate of mortal acute exacerbations in patients with IPF. PMID:24700149
Frangakis, Constantine; Geschwind, Jean-Francois; Kim, Daniel
Introduction: The drop-off risk for patients awaiting liver transplantation for hepatocellular carcinoma (HCC) is 22%. Transplant liver availability is expected to worsen, resulting in longer waiting times and increased drop-off rates. Our aim was to determine whether chemoembolization can decrease this risk. Patients and Methods: Eighty-seven consecutive HCC patients listed for liver transplant (Milan criteria) underwent statistical comparability adjustments using the propensity score (Wilcoxon, Fisher's, and chi-square tests). Forty-three nonchemoembolization patients and 22 chemoembolization patients were comparable for Child-Pugh and Model for End-Stage Liver Disease scores, tumor size and number, alpha fetoprotein (AFP) levels, and cause of cirrhosis. We calculated the risk of dropping off the transplant list by assigning a transplant time to those who dropped off (equal probability with patients who were on the list longer than the patient in question). The significance level was obtained by calculating the simulation distribution of the difference compared with the permutations of chemoembolization versus nonchemoembolization assignment of the patients. Kaplan-Meier estimators (log-rank test) were used to determine survival rates. Results: Median follow-up was 187 ± 110 weeks (range 38 to 435, date of diagnosis). The chemoembolization group had an 80% drop-off risk decrease (15% nonchemoembolization versus 3% chemoembolization, p = 0.04). Although survival was better for the chemoembolization group, it did not reach statistical significance. Two-year survival for the nonchemoembolization and chemoembolization groups was 57.3% ± 7.1% and 76.0% ± 7.9%, respectively (p = 0.078). Conclusions: Chemoembolization appears to result in a significant decrease in the risk of dropping off the liver transplant list for patients with HCC and results in a tendency toward longer survival.
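The significance level above comes from permuting the chemoembolization versus nonchemoembolization assignment and recomputing the difference each time. A generic sketch of such a permutation test; the 0/1 drop-off indicators below are invented, not the study's raw data:

```python
import random

def permutation_pvalue(group_a, group_b, n_perm=10_000, seed=1):
    """Two-sided permutation test for a difference in proportions.

    group_a / group_b : lists of 0/1 outcomes (e.g. 1 = dropped off the list).
    Returns the fraction of random reassignments whose absolute difference
    in proportions is at least as large as the observed one.
    """
    random.seed(seed)
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = group_a + group_b
    n_a, extreme = len(group_a), 0
    for _ in range(n_perm):
        random.shuffle(pooled)                 # re-deal treatment labels
        diff = abs(sum(pooled[:n_a]) / n_a
                   - sum(pooled[n_a:]) / (len(pooled) - n_a))
        if diff >= observed:
            extreme += 1
    return extreme / n_perm

# Hypothetical drop-off indicators: 6/43 vs 1/22 (illustrative counts only).
a = [1] * 6 + [0] * 37
b = [1] * 1 + [0] * 21
p = permutation_pvalue(a, b)
print(f"permutation p-value ≈ {p:.2f}")
```

Because the null distribution is built from the data themselves, the test needs no normality assumption, which suits the small group sizes typical of transplant-list cohorts.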
Zhou, Hong; Zhang, Long; Ye, Fang; Wang, Hai-Jun; Huntington, Dale; Huang, Yanjie; Wang, Anqi; Liu, Shuiqing; Wang, Yan
2016-01-01
To examine the effects of maternal death on the health of the index child, the health and educational attainment of the older children, and the mental health and quality of life of the surviving husband. A cohort study including 183 households that experienced a maternal death, matched to 346 households that experienced childbirth but not a maternal death, was conducted prospectively between June 2009 and October 2011 in rural China. Data on household sociodemographic characteristics and physical and mental health were collected using a quantitative questionnaire and medical examination at baseline and follow-up surveys. Multivariate linear regression, logistic regression models and difference-in-difference (DID) estimation were used to compare outcomes between the two groups. The index children who lost a mother had a significantly higher likelihood of death, abandonment and malnutrition at the follow-up survey compared to children whose mothers survived. The risk of not attending school on time and of dropping out of school among older children in the affected group was higher than in the control group during follow-up. Husbands whose wife died had significantly lower EQ-5D index and EQ-VAS scores both at baseline and at follow-up compared to those who did not experience a wife's death, suggesting an immediate and sustained poorer mental health quality of life among the surviving husbands. The prevalence of posttraumatic stress disorder (PTSD) among husbands whose wife died was 72.6% at baseline and 56.2% at follow-up. Maternal death has multifaceted and spillover effects on the physical and mental health of family members that are sustained over time. Programmes that reduce maternal mortality and mitigate its repercussions on surviving family members are critical and needed.
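The difference-in-difference (DID) estimator named above contrasts the change over time in the bereaved households with the change in the matched control households. A minimal sketch; the EQ-5D-style group means below are made up, not the study's data:

```python
# Difference-in-differences sketch with invented quality-of-life scores.
def did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """DID effect = (change in treated group) - (change in control group)."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Hypothetical means: bereaved husbands decline while controls hold steady.
effect = did_estimate(treat_pre=0.70, treat_post=0.65,
                      ctrl_pre=0.85, ctrl_post=0.86)
print(round(effect, 2))  # -0.06
```

Subtracting the control group's change nets out shared time trends (seasonality, survey effects), so the remaining difference is attributable to the exposure under the parallel-trends assumption.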
Ouwens, Mario; Hauch, Ole; Franzén, Stefan
2018-05-01
The rank-preserving structural failure time model (RPSFTM) is used for health technology assessment submissions to adjust for switching patients from reference to investigational treatment in cancer trials. It uses counterfactual survival (survival when only reference treatment would have been used) and assumes that, at randomization, the counterfactual survival distribution for the investigational and reference arms is identical. Previous validation reports have assumed that patients in the investigational treatment arm stay on therapy throughout the study period. To evaluate the validity of the RPSFTM at various levels of crossover in situations in which patients are taken off the investigational drug in the investigational arm. The RPSFTM was applied to simulated datasets differing in percentage of patients switching, time of switching, underlying acceleration factor, and number of patients, using exponential distributions for the time on investigational and reference treatment. There were multiple scenarios in which two solutions were found: one corresponding to identical counterfactual distributions, and the other to two different crossing counterfactual distributions. The same was found for the hazard ratio (HR). Unique solutions were observed only when switching patients were on investigational treatment for <40% of the time that patients in the investigational arm were on treatment. Distributions other than exponential could have been used for time on treatment. An HR equal to 1 is a necessary but not always sufficient condition to indicate acceleration factors associated with equal counterfactual survival. Further assessment to distinguish crossing counterfactual curves from equal counterfactual curves is especially needed when the time that switchers stay on investigational treatment is relatively long compared to the time direct starters stay on investigational treatment.
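The counterfactual survival times at the heart of the RPSFTM are obtained by rescaling the time a patient spent on the investigational drug by a common factor exp(ψ). A sketch under the usual Robins-Tsiatis convention, where ψ < 0 means treatment extends survival; all times and the value of ψ below are invented:

```python
import math

def counterfactual_time(t_off, t_on, psi):
    """RPSFTM counterfactual (untreated) survival time for one patient.

    t_off : time spent off the investigational treatment
    t_on  : time spent on the investigational treatment
    psi   : log acceleration factor (psi < 0 -> treatment prolongs survival)
    """
    return t_off + t_on * math.exp(psi)

# A control-arm patient who switched at 6 months and lived 10 more months on
# the investigational drug, shrunk back as if never treated (assumed psi=-0.5):
u = counterfactual_time(t_off=6.0, t_on=10.0, psi=-0.5)
print(round(u, 2))  # 12.07
```

Estimation then searches for the ψ at which the counterfactual distributions of the two randomized arms coincide; the abstract's point is that this search can return two solutions (identical versus crossing counterfactual curves) when switchers spend long periods on the investigational drug.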
Improved Survival in Male Melanoma Patients in the Era of Sentinel Node Biopsy.
Koskivuo, I; Vihinen, P; Mäki, M; Talve, L; Vahlberg, T; Suominen, E
2017-03-01
Sentinel node biopsy is a standard method for nodal staging in patients with clinically localized cutaneous melanoma, but its survival advantage remains unsolved. The aim of this case-control study was to investigate the survival benefit of sentinel node biopsy. A total of 305 prospective melanoma patients undergoing sentinel node biopsy were compared with 616 retrospective control patients with clinically localized melanoma who had not undergone sentinel node biopsy. Survival differences were calculated with a median follow-up time of 71 months in sentinel node biopsy patients and 74 months in control patients. Analyses were performed overall and separately in males and females. Overall, there were no differences in relapse-free survival or cancer-specific survival between sentinel node biopsy patients and control patients. Male sentinel node biopsy patients had significantly higher relapse-free survival (P = 0.021) and cancer-specific survival (P = 0.024) than control patients. In females, no differences were found. Cancer-specific survival rates at 5 years were 87.8% in sentinel node biopsy patients and 85.2% in controls overall, with 88.3% in male sentinel node biopsy patients versus 80.6% in male controls, and 87.3% in female sentinel node biopsy patients versus 89.8% in female controls. Sentinel node biopsy did not improve survival in melanoma patients overall. While females showed no differences in survival, males had significantly improved relapse-free survival and cancer-specific survival following sentinel node biopsy.
Adams, William M; Kleiter, Miriam M; Thrall, Donald E; Klauer, Julia M; Forrest, Lisa J; La Due, Tracy A; Havighurst, Thomas C
2009-01-01
Prognostic significance of tumor histology and four computed tomography (CT) staging methods was tested retrospectively in dogs from three treatment centers that underwent intent-to-cure radiotherapy for intranasal neoplasia. Disease-free and overall survival times were available for 94 dogs. A grouping of anaplastic, squamous cell, and undifferentiated carcinomas had a significantly shorter median disease-free survival (4.4 months) than a grouping of all sarcomas (10.6 months). Disease-free survivals were not significantly different when all carcinomas were compared with all sarcomas. The published original and modified WHO staging methods did not significantly relate to either survival endpoint. A modified human maxillary tumor staging system previously applied to canine nasal tumors was prognostically significant for both survival endpoints; a further modified version of that CT-based staging system resulted in improved significance for both survival endpoints. Dogs with unilateral intranasal involvement without bone destruction beyond the turbinates on CT had the longest median survival (23.4 months); CT evidence of cribriform plate involvement was associated with the shortest median survival (6.7 months). Combining CT and histology statistically improved prognostic significance for both survival endpoints over the proposed CT staging method alone. Significance was lost when CT stages were collapsed to fewer than four categories or histopathology groupings were collapsed to fewer than three categories.
Sylvestre, Gabriel; Gandini, Mariana; Maciel-de-Freitas, Rafael
2013-01-01
Background Aedes aegypti is the main vector of dengue, a disease that is increasing its geographical range as well as its incidence rates. Despite its public health importance, the effect of dengue virus (DENV) on some mosquito traits remains unknown. Here, we investigated the impact of DENV-2 infection on the feeding behavior, survival, oviposition success and fecundity of Ae. aegypti females. Methods/Principal Findings After orally challenging Ae. aegypti females with a DENV-2 strain using a membrane feeder, we monitored feeding behavior, survival, oviposition success and fecundity throughout the mosquito lifespan. We observed an age-dependent cost of DENV infection on mosquito feeding behavior and fecundity. Infected individuals took more time to ingest blood from anesthetized mice in the 2nd and 3rd weeks post-infection, and also had longer overall blood-feeding times in the 3rd week post-infection, when females were around 20 days old. Infected Ae. aegypti females often did not lay eggs, and when they did, they laid fewer eggs than uninfected controls. A reduction in the number of eggs laid per female was evident starting in the 3rd week post-infection. DENV-2 negatively affected mosquito lifespan, since overall the longevity of infected females was halved compared to that of the uninfected control group. Conclusions The DENV-2 strain tested significantly affected Ae. aegypti traits directly correlated with vectorial capacity or mosquito population density, such as feeding behavior, survival, fecundity and oviposition success. Infected mosquitoes spent more time ingesting blood, had reduced lifespans, laid eggs less frequently, and when they did lay eggs, the clutches were smaller than those of uninfected mosquitoes. PMID:23555838
Lukowsky, Lilia R.; Mehrotra, Rajnish; Kheifets, Leeka; Arah, Onyebuchi A.; Nissenson, Allen R.
2013-01-01
Summary Background and objectives There are conflicting research results about the survival differences between hemodialysis and peritoneal dialysis, especially during the first 2 years of dialysis treatment. Given the challenges of conducting randomized trials, differential rates of modality switch and transplantation, and time-varying confounding in cohort data during the first years of dialysis treatment, use of novel analytical techniques in observational cohorts can help examine the peritoneal dialysis versus hemodialysis survival discrepancy. Design, setting, participants, & measurements This study examined a cohort of incident dialysis patients who initiated dialysis in DaVita dialysis facilities between July of 2001 and June of 2004 and were followed for 24 months. This study used the causal modeling technique of marginal structural models to examine the survival differences between peritoneal dialysis and hemodialysis over the first 24 months, accounting for modality change, differential transplantation rates, and detailed time-varying laboratory measurements. Results On dialysis treatment day 90, there were 23,718 incident dialysis patients (22,360 hemodialysis and 1,358 peritoneal dialysis). Incident peritoneal dialysis patients were younger, had fewer comorbidities, and were nine times more likely to switch dialysis modality and three times more likely to receive a kidney transplant over the 2-year period, compared with hemodialysis patients. In marginal structural model analyses, peritoneal dialysis was associated with persistently greater survival independent of the known confounders, including dialysis modality switch and transplant censorship (i.e., death hazard ratio of 0.52 [95% confidence limit 0.34–0.80]). Conclusions Peritoneal dialysis seems to be associated with 48% lower mortality than hemodialysis over the first 2 years of dialysis therapy, independent of modality switches or differential transplantation rates. PMID:23307879
Lukowsky, Lilia R; Mehrotra, Rajnish; Kheifets, Leeka; Arah, Onyebuchi A; Nissenson, Allen R; Kalantar-Zadeh, Kamyar
2013-04-01
There are conflicting research results about the survival differences between hemodialysis and peritoneal dialysis, especially during the first 2 years of dialysis treatment. Given the challenges of conducting randomized trials, differential rates of modality switch and transplantation, and time-varying confounding in cohort data during the first years of dialysis treatment, use of novel analytical techniques in observational cohorts can help examine the peritoneal dialysis versus hemodialysis survival discrepancy. This study examined a cohort of incident dialysis patients who initiated dialysis in DaVita dialysis facilities between July of 2001 and June of 2004 and were followed for 24 months. This study used the causal modeling technique of marginal structural models to examine the survival differences between peritoneal dialysis and hemodialysis over the first 24 months, accounting for modality change, differential transplantation rates, and detailed time-varying laboratory measurements. On dialysis treatment day 90, there were 23,718 incident dialysis patients (22,360 hemodialysis and 1,358 peritoneal dialysis). Incident peritoneal dialysis patients were younger, had fewer comorbidities, and were nine times more likely to switch dialysis modality and three times more likely to receive a kidney transplant over the 2-year period, compared with hemodialysis patients. In marginal structural model analyses, peritoneal dialysis was associated with persistently greater survival independent of the known confounders, including dialysis modality switch and transplant censorship (i.e., death hazard ratio of 0.52 [95% confidence limit 0.34-0.80]). Peritoneal dialysis seems to be associated with 48% lower mortality than hemodialysis over the first 2 years of dialysis therapy, independent of modality switches or differential transplantation rates.
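The confounding-by-indication problem that marginal structural models address can be sketched in its simplest point-treatment form. The toy below is hypothetical (one binary confounder, binary treatment, invented probabilities) and shows only the inverse-probability-weighting ingredient, not the full time-varying marginal structural model with repeated measurements used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Confounder L (e.g., comorbidity burden) affects both modality choice and death
L = rng.binomial(1, 0.4, n)
# Treatment A = 1 (say, peritoneal dialysis) is chosen less often by sicker patients
A = rng.binomial(1, np.where(L == 1, 0.2, 0.6))
# Outcome: death indicator; the true causal effect of A is a 0.10 risk reduction
Y = rng.binomial(1, 0.30 + 0.20 * L - 0.10 * A)

# Naive comparison is confounded (treated patients are healthier to begin with)
naive = Y[A == 1].mean() - Y[A == 0].mean()

# Stabilized inverse-probability-of-treatment weights: P(A = a) / P(A = a | L)
pA_marg = A.mean()
pA_given_L = np.where(L == 1, A[L == 1].mean(), A[L == 0].mean())
sw = np.where(A == 1, pA_marg / pA_given_L, (1 - pA_marg) / (1 - pA_given_L))

# Weighted (pseudo-population) risk difference recovers the causal effect of -0.10
ipw = (np.average(Y[A == 1], weights=sw[A == 1])
       - np.average(Y[A == 0], weights=sw[A == 0]))
```

The naive contrast exaggerates the benefit of treatment because healthier patients select into it; weighting each subject by the inverse of their treatment probability given the confounder rebalances the pseudo-population. The time-varying version used in the study applies the same logic at every follow-up interval, multiplying the interval-specific weights.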
Gheshlaghi, Farzad; Lavasanijou, Mohamad Reza; Moghaddam, Noushin Afshar; Khazaei, Majid; Behjati, Mohaddeseh; Farajzadegan, Ziba; Sabzghabaee, Ali Mohammad
2015-01-01
Objectives: Intentional and accidental intoxication with aluminium phosphide (ALP) remains a clinical problem, especially in the Middle East region. Considering the high mortality rate and the lack of any recommended first-line drug for its treatment, this study aimed to compare the therapeutic effects of N-acetylcysteine (NAC), vitamin C (Vit C), and methylene blue, both alone and in combination, for the treatment of ALP intoxication in a rat model. Materials and Methods: In this experimental animal study, 80 male Wistar rats in eight groups were intoxicated with ALP (12.5 mg/kg) and treated with a single dose of NAC (100 mg/kg), Vit C (500–1,000 mg/kg), methylene blue (1 mg/kg/5 min, 0.1%), two of these agents, or all three of them (controls were not treated). Rats were monitored for parameters of drug efficacy, namely increased survival time and reduced morbidity and mortality, for 3 consecutive days to ensure toxin neutralization. Macroscopic changes were recorded and biopsy sections were taken from the brain, cerebellum, kidney, liver, and heart for microscopic evaluation of cellular hypoxia. Results: The mean survival time of rats exposed to ALP and treated with Vit C + NAC was 210.55±236.22 minutes. In the analysis of survival times, there was a significant difference between Group 5, which received Vit C + NAC, and the other groups (P < 0.01). Serum magnesium levels after death were higher than normal (P = 0.01). Conclusions: Despite the higher survival rate of antioxidant-treated rats compared with controls, this difference was not statistically significant. PMID:26862259
Gao, Qingzhen; Wang, Xiaoping; Zhang, Ruibin; Wang, Pu; Jing, Yongsheng; Ren, Wanjun; Zhu, Bin
2016-07-01
The study aimed to compare the impact of allogeneic bone marrow cell (BMC) infusion through the inferior vena cava (IVC) and the portal vein (PV), combined with rapamycin, on allogeneic islet grafts in diabetic rats. Recipient diabetic Wistar rats were infused with islets from Sprague-Dawley rats through the PV. PKH26-labeled BMCs of Sprague-Dawley rats were infused into recipients through the PV or IVC, followed by administration of rapamycin for 4 days. Blood glucose level was measured to evaluate the survival time of the islets. Lymphocytes separated from blood, BMCs, thymus, liver, spleen and lymph node were analyzed by flow cytometry. Peripheral blood smears, BMC smears and frozen tissue sections were observed under a fluorescence microscope. The survival time of the islets was significantly prolonged by BMC infusion combined with rapamycin. Rats receiving BMC infusion through the PV showed a significantly longer islet survival time and more mixed chimeras of allogeneic BMCs in the thymus, liver, spleen and lymph node than rats receiving BMC infusion through the IVC. The amount of mixed chimeras on day 14 was lower than that on day 7 after islet transplantation. Furthermore, PV transplantation produced significantly more mixed chimeras than IVC transplantation in all analyzed organs and tissues. BMC infusion combined with rapamycin prolongs islet survival and induces mixed chimeras of BMCs. PV infusion of BMCs might be a more effective strategy than IVC infusion. © 2015 The Authors. Journal of Diabetes Investigation published by Asian Association for the Study of Diabetes (AASD) and John Wiley & Sons Australia, Ltd.
Uno, Hajime; Tian, Lu; Claggett, Brian; Wei, L J
2015-12-10
With censored event time observations, the logrank test is the most popular tool for testing the equality of two underlying survival distributions. Although this test is asymptotically distribution free, it may not be powerful when the proportional hazards assumption is violated. Various other novel testing procedures have been proposed, which generally are derived by assuming a class of specific alternative hypotheses with respect to the hazard functions. The test considered by Pepe and Fleming (1989) is based on a linear combination of weighted differences of the two Kaplan-Meier curves over time and is a natural tool to assess the difference of two survival functions directly. In this article, we take a similar approach but choose weights that are proportional to the observed standardized difference of the estimated survival curves at each time point. The new proposal automatically makes weighting adjustments empirically. The new test statistic is aimed at a one-sided general alternative hypothesis and is distributed with a short right tail under the null hypothesis but with a heavy tail under the alternative. The results from extensive numerical studies demonstrate that the new procedure performs well under various general alternatives, with the caveat of a minor inflation of the type I error rate when the sample size or the number of observed events is small. The survival data from a recent cancer comparative study are utilized to illustrate the implementation of the procedure. Copyright © 2015 John Wiley & Sons, Ltd.
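The building blocks of such a test, Kaplan-Meier curves evaluated on a common grid and a weighted area between them, can be sketched as follows. This is a schematic illustration of the weighting idea only (no censoring, a simple binomial variance approximation in place of Greenwood's formula, and no resampling step for a p-value); it is not the published procedure.

```python
import numpy as np

def km(time, event, grid):
    """Kaplan-Meier survival estimate, evaluated on a common time grid."""
    order = np.argsort(time)
    t, e = time[order], event[order]
    surv, n_at_risk = 1.0, len(t)
    step_t, step_s = [0.0], [1.0]
    for ti, ei in zip(t, e):
        if ei:                            # event: multiply in a survival factor
            surv *= 1.0 - 1.0 / n_at_risk
            step_t.append(ti)
            step_s.append(surv)
        n_at_risk -= 1                    # event or censoring leaves the risk set
    idx = np.searchsorted(step_t, grid, side="right") - 1
    return np.asarray(step_s)[idx]

rng = np.random.default_rng(2)
n = 4000
t1 = rng.exponential(1.0, n)        # group 1
t2 = rng.exponential(1.0 / 1.5, n)  # group 2: 1.5-fold hazard, shorter survival
events = np.ones(n, dtype=bool)     # no censoring in this toy example

grid = np.linspace(0.0, 2.0, 101)
dx = grid[1] - grid[0]
S1, S2 = km(t1, events, grid), km(t2, events, grid)
diff = S1 - S2

# Pepe-Fleming-style statistic: (unweighted) area between the two curves
pf_stat = diff.sum() * dx
# Adaptive variant (schematic): weight each pointwise difference by its
# standardized size, so regions where the curves clearly separate count more
se = np.sqrt(S1 * (1 - S1) / n + S2 * (1 - S2) / n) + 1e-12
adaptive_stat = (diff * (diff / se)).sum() * dx
```

With no censoring, the `km` estimate reduces to the empirical survival function, which makes the sketch easy to check against the known exponential curves; the actual test calibrates its null distribution by perturbation-resampling, which is omitted here.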
Survivable architectures for time and wavelength division multiplexed passive optical networks
NASA Astrophysics Data System (ADS)
Wong, Elaine
2014-08-01
The increased network reach and customer base of next-generation time and wavelength division multiplexed PONs (TWDM-PONs) have necessitated rapid fault detection and subsequent restoration of services to users. However, direct application of existing solutions for conventional PONs to TWDM-PONs is unsuitable, as these schemes rely on the loss of signal (LOS) of upstream transmissions to trigger protection switching. Because TWDM-PONs are expected to use sleep/doze-mode optical network units (ONUs), the loss of upstream transmission from a sleeping or dozing ONU could erroneously trigger protection switching. Further, TWDM-PONs require their monitoring modules for fiber/device fault detection to be more sensitive than those typically deployed in conventional PONs. To address these issues, three survivable architectures that are compliant with TWDM-PON specifications are presented in this work. These architectures combine rapid detection with protection switching against multipoint failure and, most importantly, do not rely on upstream transmissions for LOS activation. Survivability analyses, as well as evaluations of the additional costs incurred to achieve survivability, are performed and compared to the unprotected TWDM-PON. Network parameters that impact the maximum achievable network reach, maximum split ratio, connection availability, fault impact, and the incremental reliability costs for each proposed survivable architecture are highlighted.
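Connection availability in survivability analyses of this kind is conventionally built from per-component steady-state availabilities, A = MTBF / (MTBF + MTTR): components in series multiply, and a 1:1-protected segment fails only when both copies are down. A minimal sketch with invented MTBF/MTTR figures (real analyses use measured reliability data for each component):

```python
def availability(mtbf_h: float, mttr_h: float) -> float:
    """Steady-state availability from mean time between failures and mean time to repair."""
    return mtbf_h / (mtbf_h + mttr_h)

# Hypothetical component figures (hours), for illustration only
a_feeder = availability(570_000.0, 12.0)      # feeder fiber
a_splitter = availability(1_500_000.0, 24.0)  # passive splitter
a_drop = availability(450_000.0, 12.0)        # drop fiber

# Unprotected path: every series component must be up
a_unprotected = a_feeder * a_splitter * a_drop

# 1:1 duplicated feeder fiber: the segment fails only if both fibers are down
a_protected_feeder = 1.0 - (1.0 - a_feeder) ** 2
a_protected = a_protected_feeder * a_splitter * a_drop
```

Comparing `a_unprotected` and `a_protected` (or their complements, the unavailabilities) quantifies the reliability gain bought by the extra fiber, which is the kind of trade-off the paper's incremental reliability cost analysis evaluates.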
Toward computer simulation of high-LET in vitro survival curves.
Heuskin, A-C; Michiels, C; Lucas, S
2013-09-21
We developed a Monte Carlo based computer program called MCSC (Monte Carlo Survival Curve) able to predict the survival fraction of cells irradiated in vitro with a broad beam of high linear energy transfer particles. Three types of cell response are studied: the usual high-dose response, the bystander effect and low-dose hypersensitivity (HRS). The program models the broad beam irradiation and double strand break distribution following Poisson statistics. The progression of cells through the cell cycle is taken into account while repair takes place. Input parameters were experimentally determined for A549 lung carcinoma cells irradiated with 10 and 20 keV µm(-1) protons, 115 keV µm(-1) alpha particles and for EAhy926 endothelial cells exposed to 115 keV µm(-1) alpha particles. Results of simulations are presented and compared with experimental survival curves obtained for A549 and EAhy926 cells. Results are in good agreement with experimental data for both cell lines and all irradiation protocols. The benefits of MCSC are several: the time saved by avoiding time-consuming clonogenic assays, the capacity to estimate the survival fraction of cell lines that do not form colonies and, possibly, the evaluation of radiosensitivity parameters of given individuals.
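The statistical core of such a simulation, Poisson-distributed lethal events per cell with survival meaning zero events, can be sketched in a few lines. This toy reproduces only the single-hit exponential limit SF(D) = exp(-αD); MCSC's cell-cycle progression, repair kinetics, bystander and HRS responses are not modeled here, and the α value is invented.

```python
import numpy as np

rng = np.random.default_rng(3)

def mc_survival_fraction(dose_gy, alpha, n_cells=100_000):
    """Monte Carlo surviving fraction: a cell survives if it receives zero lethal lesions."""
    lesions = rng.poisson(alpha * dose_gy, n_cells)
    return float((lesions == 0).mean())

alpha = 0.8  # hypothetical mean lethal lesions per Gy
doses = np.array([0.5, 1.0, 2.0, 4.0])
sf = np.array([mc_survival_fraction(d, alpha) for d in doses])
analytic = np.exp(-alpha * doses)  # single-hit closed form, for comparison
```

Because the zero-count probability of a Poisson variable with mean αD is exp(-αD), the simulated fractions converge on the analytic curve as the number of cells grows; richer models replace the fixed α with track-structure, repair and dose-rate dependent terms.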
Ravi, Yazhini; Lella, Srihari K; Copeland, Laurel A; Zolfaghari, Kiumars; Grady, Kathleen; Emani, Sitaramesh; Sai-Sudhakar, Chittoor B
2018-05-01
Recipient-related factors, such as education level and type of health insurance, are known to affect heart transplantation outcomes. Pre-operative employment status has shown an association with survival in abdominal organ transplant patients. We sought to evaluate the effect of work status of heart transplant (HTx) recipients at the time of listing and at the time of transplantation on short- and long-term survival. We evaluated the United Network for Organ Sharing (UNOS) registry for all adult HTx recipients from 2001 to 2014. Recipients were grouped based on their work status at listing and at heart transplantation. Kaplan-Meier estimates illustrated 30-day, 1-year, 5-year, and 10-year survival comparing working with non-working groups. The Cox proportional hazards regression model was applied to adjust for covariates that could potentially confound the post-transplantation survival analysis. Working at listing for HTx was not significantly associated with 30-day and 1-year survival. However, 5- and 10-year mortality were 14.5% working vs 19.8% not working (p < 0.0001) and 16% working vs 26% not working (p < 0.0001), respectively. Working at HTx appeared to be associated with a survival benefit at every time interval, with a trend toward improved survival at 30 days and 1 year and a significant association at 5 and 10 years. Kaplan-Meier analysis demonstrated a 5% and 10% decrease in 5- and 10-year mortality, respectively, for the working group compared with the group not working at transplantation. The Cox proportional hazards regression model showed that working at listing and working at transplantation were each associated with decreased mortality (hazard ratio [HR] = 0.8, 95% confidence interval [CI] 0.71 to 0.91; and HR = 0.76, 95% CI 0.65 to 0.89, respectively). 
This study is the first analysis of UNOS STAR data on recipient work status pre-HTx demonstrating: (1) an improvement in post-transplant survival for working HTx candidates; and (2) an association between working pre-HTx and longer post-HTx survival. Given that work status before HTx may be a modifiable risk factor for better outcomes after HTx, we strongly recommend that UNOS consider these important findings in moving this patient-centered research on work status forward. Working at listing and working at HTx are associated with long-term survival benefits. The association may be reciprocal, in that working both identifies less ill patients and improves well-being. Consideration should be given to assigning additional weight to work status during organ allocation. Work status may also be a modifiable factor associated with better post-HTx outcomes. Copyright © 2018 International Society for the Heart and Lung Transplantation. Published by Elsevier Inc. All rights reserved.
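Under the proportional hazards assumption behind such Cox models, a hazard ratio maps one survival curve onto another via S_treated(t) = S_ref(t)^HR, since the cumulative hazard scales by the HR. A back-of-the-envelope sketch using the reported HR of 0.80 for working at listing and a hypothetical 74% 10-year survival in the non-working group (an adjusted HR need not reproduce the crude rates exactly):

```python
hr = 0.80           # reported hazard ratio for working at listing
s_ref_10y = 0.74    # hypothetical 10-year survival in the non-working group

# Proportional hazards: H_work(t) = hr * H_ref(t), so S_work(t) = S_ref(t) ** hr
s_working_10y = s_ref_10y ** hr
abs_gain = s_working_10y - s_ref_10y   # implied absolute survival difference
```

The implied absolute gain of a few percentage points at 10 years is broadly consistent with the magnitude of the Kaplan-Meier differences reported in the abstract, illustrating how a constant relative effect translates into a growing absolute gap over follow-up.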
Sundaram, Vinay; Choi, Gina; Jeon, Christie Y; Ayoub, Walid S; Nissen, Nicholas N; Klein, Andrew S; Tran, Tram T
2015-05-01
Primary sclerosing cholangitis (PSC) patients suffer from comorbidities unaccounted for by the model for end-stage liver disease scoring system and may benefit from the increased donor organ pool provided by donation after cardiac death (DCD) liver transplantation. However, the impact of DCD transplantation on PSC graft outcomes is unknown. We studied 41,018 patients using the United Network for Organ Sharing database from 2002 through 2012. Kaplan-Meier analysis and Cox regression were used to evaluate graft survival and risk factors for graft failure, respectively. The PSC patients receiving DCD livers (n=75) showed greater overall graft failure (37.3% vs. 20.4%, P = 0.001), graft failure from biliary complications (47.4% vs. 13.9%, P = 0.002), and shorter graft survival time (P = 0.003), compared to PSC patients receiving donation after brain death organs (n=1592). Among DCD transplants (n=1943), PSC and non-PSC patients showed similar prevalence of graft failure and graft survival time, though a trend existed toward increased biliary-induced graft failure among PSC patients (47.4 vs. 26.4%, P = 0.063). Cox modeling demonstrated that PSC patients have a positive graft survival advantage compared to non-PSC patients (hazard ratio [HR]=0.72, P < 0.001), whereas DCD transplantation increased risk of graft failure (HR = 1.28, P < 0.001). Furthermore, the interaction between DCD transplant and PSC was significant (HR = 1.76, P = 0.015), indicating that use of DCD organs impacts graft survival more in PSC than non-PSC patients. Donation after cardiac death liver transplantation leads to significantly worse outcomes in PSC. We recommend cautious use of DCD transplantation in this population.
Zabell, Joseph R; Adejoro, Oluwakayode; Jarosek, Stephanie L; Elliott, Sean P; Konety, Badrinath R
2016-10-01
Prostate cancer remains a common disease that is frequently treated with multimodal therapy. The goal of this study was to assess the impact of treatment of the primary tumor on survival in men who go on to receive chemotherapy for prostate cancer. Using Surveillance, Epidemiology, and End Results (SEER)-Medicare data from 1992 to 2009, we identified a cohort of 1614 men who received chemotherapy for prostate cancer. Primary outcomes were prostate cancer-specific mortality (PCSM) and all-cause mortality (ACM). We compared survival among men who had previously undergone radical prostatectomy (RP), radiation therapy (RT), or neither of these therapies. Propensity score-adjusted Cox proportional hazards models and weighted Kaplan-Meier curves were used to assess survival. Compared to men who received no local treatment, PCSM was lower for men who received RP ± RT (HR 0.65, p < 0.01) and for those who received RT only (HR 0.79, p < 0.05). Patients receiving neither RP nor RT demonstrated higher PCSM and ACM than those receiving treatment in a weighted time-to-event analysis. Men who received RP + RT had a longer mean time from diagnosis to initiation of chemotherapy (100.7 ± 47.7 months) than men with no local treatment (48.8 ± 35.0 months, p < 0.05). In patients who go on to receive chemotherapy, treatment of the primary tumor for prostate cancer appears to confer a survival advantage over no primary treatment. These data suggest continued importance for local treatment of prostate cancer, even in patients at high risk of failing local therapy.
Impact of Pre-Dialysis Care on Clinical Outcomes in Peritoneal Dialysis Patients.
Spigolon, Dandara N; de Moraes, Thyago P; Figueiredo, Ana E; Modesto, Ana Paula; Barretti, Pasqual; Bastos, Marcus Gomes; Barreto, Daniela V; Pecoits-Filho, Roberto
2016-01-01
Structured pre-dialysis care is associated with an increase in peritoneal dialysis (PD) utilization, but its association with peritonitis risk, technique survival and patient survival is unclear. This study aimed to analyze the impact of pre-dialysis care on these outcomes. All incident patients starting PD between 2004 and 2011 in a Brazilian prospective cohort were included in this analysis. Patients were divided into 2 groups: early pre-dialysis care (at least 90 days of follow-up by a nephrology team) and late pre-dialysis care (absent or less than 90 days of follow-up). The socio-demographic, clinical and biochemical characteristics of the 2 groups were compared. Risk factors for the time to the first peritonitis episode, technique failure and mortality were assessed with Cox proportional hazards models. Four thousand one hundred seven patients were included. Patients with early pre-dialysis care differed in gender (female, 47.0 vs. 51.1%, p = 0.01), race (white, 63.8 vs. 71.7%, p < 0.01) and education (<4 years, 61.9 vs. 71.0%, p < 0.01) compared to those with late care. Patients with early pre-dialysis care presented a higher prevalence of comorbidities; lower levels of creatinine, phosphorus and glucose; and significantly better control of hemoglobin and potassium serum levels. There was no impact of pre-dialysis care on peritonitis rates (hazard ratio (HR) 0.88; 95% CI 0.77-1.01) or technique survival (HR 1.12; 95% CI 0.92-1.36). Patient survival (HR 1.20; 95% CI 1.03-1.41) was better in the early pre-dialysis care group. Earlier pre-dialysis care was associated with improved patient survival but did not influence time to the first peritonitis or technique survival in this national PD cohort. © 2016 S. Karger AG, Basel.
Zhang, Jie; Jiang, Yizhou; Wu, Chunxiao; Cai, Shuang; Wang, Rui; Zhen, Ying; Chen, Sufeng; Zhao, Kuaile; Huang, Yangle; Luketich, James; Chen, Haiquan
2015-10-01
Esophageal squamous cell carcinoma (ESCC) is the major histologic subtype of esophageal cancer, characterized by a high mortality rate and geographic differences in incidence. It is unknown whether "eastern" ESCC differs from "western" ESCC. This study attempted to examine this question by comparing ESCC between Chinese residents and Caucasians living in the US. The data sources for this study were the United States SEER limited-use database and the Shanghai Cancer Registries of the Shanghai Municipal Center for Disease Control (SMCDC). Consecutive, non-selected patients with pathologically diagnosed ESCC between January 1, 2002 and December 31, 2006 were included in this analysis. 1-year, 3-year and 5-year survival estimates were computed and compared between the two populations. A Cox proportional hazards model was used to determine factors affecting survival differences. A total of 1,718 Chinese and 1,624 Caucasian ESCC patients with individual American Joint Committee on Cancer (AJCC) staging information were included in this study. The Caucasian group had a significantly higher proportion of female patients than the Chinese group (38.24% vs. 18.68%, P<0.01). ESCC was diagnosed in Chinese patients at an earlier age and stage than in Caucasians. Overall, Chinese patients had survival rates similar to those of Caucasians by both univariate and multivariate analysis. Overall survival was significantly worse only in male Caucasians compared to male Chinese patients (median survival time, 12.4 vs. 14.5 months, P<0.01). ESCC from eastern and western countries might have some different features, and these differences need to be taken into account in the management of ESCC patients of different ethnic groups.
Agaku, Israel T; Adisa, Akinyele O
2014-04-01
Nativity status is a major determinant of health and healthcare access in the United States. This study compared oral squamous cell carcinoma (OSCC) survival between US-born and foreign-born patients. Data were obtained from the 1988-2008 Surveillance, Epidemiology and End Results database. A Cox proportional hazards multivariate model was used to assess the effect of birthplace on OSCC survival, adjusting for other sociodemographic and clinical covariates. US-born patients had a higher median survival time (19.3 years; 95% confidence interval [CI]: 18.6-19.7) compared to foreign-born patients (10.7 years; 95% CI: 10.1-11.3). After adjusting for other factors, being born in the US conferred a modest protective effect against OSCC mortality (hazard ratio [HR] = 0.93, 95% CI: 0.87-0.99). Other factors that conferred better survival included involvement of paired structures (HR = 0.65; 95% CI: 0.58-0.74), lip involvement rather than tongue lesions (HR = 0.76; 95% CI: 0.71-0.82), and receipt of either surgery (HR = 0.89; 95% CI: 0.84-0.94) or radiation therapy (HR = 0.92; 95% CI: 0.87-0.97). US-born patients had significantly better OSCC survival compared to their foreign-born counterparts. This underscores the need for enhanced and sustained efforts to improve access to healthcare among immigrant populations. In addition, oral health professionals such as general dentists, oral pathologists, and oral surgeons providing care to immigrant patients should ensure that reasonable efforts are made to communicate effectively with patients with language barriers, especially in high-stake conditions such as cancer. This may help increase such patients' awareness of treatment provided and the critical issues regarding cancer care, resulting in enhanced treatment outcomes.
López, Mercedes N; Pereda, Cristian; Segal, Gabriela; Muñoz, Leonel; Aguilera, Raquel; González, Fermín E; Escobar, Alejandro; Ginesta, Alexandra; Reyes, Diego; González, Rodrigo; Mendoza-Naranjo, Ariadna; Larrondo, Milton; Compán, Alvaro; Ferrada, Carlos; Salazar-Onfray, Flavio
2009-02-20
The aim of this work was to assess immunologic response, disease progression, and post-treatment survival of melanoma patients vaccinated with autologous dendritic cells (DCs) pulsed with a novel allogeneic cell lysate (TRIMEL) derived from three melanoma cell lines. Forty-three stage IV and seven stage III patients were vaccinated four times with TRIMEL/DC vaccine. Specific delayed type IV hypersensitivity (DTH) reaction, ex vivo cytokine production, and regulatory T-cell populations were determined. Overall survival and disease progression rates were analyzed using Kaplan-Meier curves and compared with historical records. The overall survival for stage IV patients was 15 months. More than 60% of patients showed DTH-positive reaction against the TRIMEL. Stage IV/DTH-positive patients displayed a median survival of 33 months compared with 11 months observed for DTH-negative patients (P = .0014). All stage III treated patients were DTH positive and remained alive and tumor free for a median follow-up period of 48 months (range, 33 to 64 months). DTH-positive patients showed a marked reduction in the proportion of CD4+ transforming growth factor (TGF) beta+ regulatory T cells compared to DTH-negative patients (1.54% v 5.78%; P < .0001). Our findings strongly suggest that TRIMEL-pulsed DCs provide a standardized and widely applicable source of melanoma antigens, very effective in evoking antimelanoma immune response. To our knowledge, this is the first report describing a correlation between vaccine-induced reduction of CD4+TGFbeta+ regulatory T cells and in vivo antimelanoma immune response associated to improved patient survival and disease stability.
Ong, Marcus Eng Hock; Tiah, Ling; Leong, Benjamin Sieu-Hon; Tan, Elaine Ching Ching; Ong, Victor Yeok Kein; Tan, Elizabeth Ai Theng; Poh, Bee Yen; Pek, Pin Pin; Chen, Yuming
2012-08-01
To compare vasopressin and adrenaline in the treatment of patients with cardiac arrest presenting to or in the Emergency Department (ED). A randomised, double-blind, multi-centre, parallel-design clinical trial in four adult hospitals. Eligible cardiac arrest patients (confirmed by the absence of pulse, unresponsiveness and apnea) aged >16 (aged >21 for one hospital) were randomly assigned to intravenous adrenaline (1 mg) or vasopressin (40 IU) in the ED. Patients with traumatic cardiac arrest or a contraindication for cardiopulmonary resuscitation (CPR) were excluded. Patients received additional open-label doses of adrenaline as per current guidelines. The primary outcome was survival to hospital discharge (defined as the participant being discharged alive or surviving to 30 days post-arrest). The study recruited 727 participants (adrenaline = 353; vasopressin = 374). Baseline characteristics of the two groups were comparable. Eight participants (2.3%) from the adrenaline group and 11 (2.9%) from the vasopressin group survived to hospital discharge, with no significant difference between groups (p = 0.27, RR = 1.72, 95% CI = 0.65-4.51). After adjustment for race, medical history, bystander CPR and prior adrenaline given, more participants survived to hospital admission with vasopressin (22.2%) than with adrenaline (16.7%) (p = 0.05, RR = 1.43, 95% CI = 1.02-2.04). Sub-group analysis suggested improved outcomes for vasopressin in participants with prolonged arrest times. The combination of vasopressin and adrenaline did not improve long-term survival but seemed to improve survival to admission in patients with prolonged cardiac arrest. Further studies on the effect of vasopressin combined with therapeutic hypothermia in patients with prolonged cardiac arrest are needed. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Drennan, Ian R.; Case, Erin; Verbeek, P. Richard; Reynolds, Joshua C.; Goldberger, Zachary D.; Jasti, Jamie; Charleston, Mark; Herren, Heather; Idris, Ahamed H.; Leslie, Paul R.; Austin, Michael A.; Xiong, Yan; Schmicker, Robert H.; Morrison, Laurie J.
2017-01-01
Introduction: The Universal Termination of Resuscitation (TOR) Guideline accurately identifies potential out-of-hospital cardiac arrest (OHCA) survivors. However, implementation is inconsistent, with some emergency medical service (EMS) agencies using absence of return of spontaneous circulation (ROSC) as the sole criterion for termination. Objective: To compare the performance of the Universal TOR Guideline with the single criterion of no prehospital ROSC; second, to determine factors associated with survival for patients transported without a ROSC; and lastly, to compare the impact of time to ROSC as a marker of futility against the Universal TOR Guideline. Design: Retrospective, observational cohort study. Participants: Non-traumatic, adult (≥18 years) OHCA patients of presumed cardiac etiology treated by EMS providers. Setting: ROC-PRIMED and ROC-Epistry post-ROC-PRIMED databases between 2007 and 2011. Outcomes: The primary outcome was survival to hospital discharge and the secondary outcome was functional survival. We used multivariable regression to evaluate factors associated with survival in patients transported without a ROSC. Results: 36,543 treated OHCAs occurred, of which 9,467 (26%) were transported to hospital without a ROSC. Patients transported without a ROSC who met the Universal TOR Guideline for transport had a survival of 3.0% (95% CI 2.5%–3.4%) compared to 0.7% (95% CI 0.4%–0.9%) in patients who met the Universal TOR Guideline for termination. The Universal TOR Guideline identified 99% of survivors requiring continued resuscitation and transportation to hospital, including early identification of survivors who sustained a ROSC after extended durations of CPR. Conclusion: Using absence of ROSC as a sole predictor of futility misses potential survivors. The Universal TOR Guideline remains a strong predictor of survival. PMID:27923115
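The guideline-versus-single-criterion comparison in the abstract above can be made concrete with a small sketch. The three termination criteria used here are the commonly published Universal TOR criteria (arrest not witnessed by EMS, no shock delivered, no prehospital ROSC); treat them as an assumption, since the abstract itself does not restate them:

```python
def universal_tor_recommends_termination(ems_witnessed: bool,
                                         shock_delivered: bool,
                                         rosc_achieved: bool) -> bool:
    """Termination is recommended only when ALL three criteria are met;
    any one favorable sign calls for continued resuscitation and transport."""
    return not ems_witnessed and not shock_delivered and not rosc_achieved


def rosc_only_recommends_termination(rosc_achieved: bool) -> bool:
    """The single-criterion rule the study argues against."""
    return not rosc_achieved
```

A patient without prehospital ROSC who nonetheless received a shock would be transported under the Universal TOR Guideline but terminated under the ROSC-only rule, which illustrates how the single criterion can miss survivors.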
Erythrocyte survival time in Greyhounds as assessed by use of in vivo biotinylation.
Garon, Catherine L; Cohn, Leah A; Scott, Michael A
2010-09-01
To determine erythrocyte survival time in Greyhounds. 6 Greyhounds used as blood donors and 3 privately owned non-Greyhound dogs. In vivo biotinylation of erythrocytes was performed by infusion of biotin-N-hydroxysuccinimide into each dog via a jugular vein catheter. Blood samples were collected 12 hours later and then at weekly intervals and were used to determine the percentage of biotin-labeled erythrocytes at each time point. Erythrocytes were washed, incubated with avidin-fluorescein isothiocyanate, and washed again before the percentage of biotinylated erythrocytes was measured by use of flow cytometry. Survival curves for the percentage of biotinylated erythrocytes were generated, and erythrocyte survival time was defined as the x-intercept of a least-squares best-fit line for the linear portion of each curve. The R2 for the survival curves ranged from 0.93 to 0.99 during the first 10 weeks after biotinylation of erythrocytes. Erythrocyte survival time for the 3 non-Greyhound dogs was 94, 98, and 116 days, respectively, which was consistent with previously reported values. Erythrocyte survival time for the 6 Greyhounds ranged from 83 to 110 days (mean, 93 days; median, 88 days). As determined by use of in vivo biotinylation, erythrocyte survival times in Greyhounds were similar to those determined for non-Greyhound dogs and did not differ significantly from erythrocyte survival times reported previously for non-Greyhound dogs. Erythrocyte survival time was similar in Greyhounds and non-Greyhound dogs. Greyhounds can be used as erythrocyte donors without concerns about inherently shorter erythrocyte survival time.
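The x-intercept definition of survival time used above can be sketched with ordinary least squares; the weekly percentages below are invented for illustration, not the study's data:

```python
# Hypothetical weekly measurements of the percentage of biotin-labeled
# erythrocytes still circulating (invented numbers, not the study's data).
weeks   = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
percent = [95, 88, 80, 73, 66, 58, 51, 44, 36, 29]

# Ordinary least-squares fit through the linear portion of the decay.
n = len(weeks)
mean_x, mean_y = sum(weeks) / n, sum(percent) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(weeks, percent))
         / sum((x - mean_x) ** 2 for x in weeks))
intercept = mean_y - slope * mean_x

# Survival time = x-intercept (where the fitted labeled fraction reaches 0%),
# converted from weeks to days.
survival_days = (-intercept / slope) * 7
```

With these made-up data the estimate lands near 98 days, inside the 83-116 day range the study reports for real dogs.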
Islet grafting and imaging in a bioengineered intramuscular space.
Witkowski, Piotr; Sondermeijer, Hugo; Hardy, Mark A; Woodland, David C; Lee, Keagan; Bhagat, Govind; Witkowski, Kajetan; See, Fiona; Rana, Abbas; Maffei, Antonella; Itescu, Silviu; Harris, Paul E
2009-11-15
Because the hepatic portal system may not be the optimal site for islet transplantation, several extrahepatic sites have been studied. Here, we examine an intramuscular transplantation site, bioengineered to better support islet neovascularization, engraftment, and survival, and we demonstrate that at this novel site, grafted beta cell mass may be quantitated in a real-time noninvasive manner by positron emission tomography (PET) imaging. Streptozotocin-induced diabetic rats pretreated intramuscularly with a biocompatible angiogenic scaffold received syngeneic islet transplants 2 weeks later. The recipients were monitored serially by blood glucose and glucose tolerance measurements and by PET imaging of the transplant site with [11C] dihydrotetrabenazine. Parallel histopathologic evaluation of the grafts was performed using insulin staining and evaluation of microvascularity. Reversal of hyperglycemia by islet transplantation was most successful in recipients pretreated with bioscaffolds containing angiogenic factors when compared with those who received no bioscaffolds or bioscaffolds not treated with angiogenic factors. PET imaging with [11C] dihydrotetrabenazine, insulin staining, and microvascular density patterns were consistent with islet survival, increased levels of angiogenesis, and with reversal of hyperglycemia. Induction of increased neovascularization at an intramuscular site significantly improves islet transplant engraftment and survival compared with controls. The use of a nonhepatic transplant site may avoid intrahepatic complications and permit the use of PET imaging to measure and follow transplanted beta cell mass in real time. These findings have important implications for effective islet implantation outside of the liver and offer promising possibilities for improving islet survival, monitoring, and even prevention of islet loss.
Svenson, Ulrika; Roos, Göran; Wikström, Pernilla
2017-02-01
Previous studies have suggested that leukocyte telomere length is associated with risk of developing prostate cancer. Investigations of leukocyte telomere length as a prognostic factor in prostate cancer are, however, lacking. In this study, leukocyte telomere length was investigated both as a risk marker, comparing control subjects and patient risk groups (based on serum levels of prostate-specific antigen, tumor differentiation, and tumor stage), and as a prognostic marker for metastasis-free and cancer-specific survival. Relative telomere length was measured by a well-established quantitative polymerase chain reaction method in 415 consecutively sampled individuals. Statistical evaluation included 162 control subjects without cancer development during follow-up and 110 untreated patients with newly diagnosed localized prostate cancer at the time of blood draw. Leukocyte telomere length did not differ significantly between control subjects and patients, or between patient risk groups. Interestingly, however, and in line with our previous results in breast and kidney cancer patients, relative telomere length at diagnosis was an independent prognostic factor. Patients with long leukocyte telomeres (⩾median) had a significantly worse prostate cancer-specific and metastasis-free survival compared to patients with short telomere length. In contrast, for patients who died of other causes than prostate cancer, long relative telomere length was not coupled to shorter survival time. To our knowledge, these results are novel and give further strength to our hypothesis that leukocyte telomere length might be used as a prognostic marker in malignancy.
Aggarwal, Rohit; McBurney, Christine; Schneider, Frank; Yousem, Samuel A; Gibson, Kevin F; Lindell, Kathleen; Fuhrman, Carl R; Oddis, Chester V
2017-03-01
To compare survival outcomes between myositis-associated usual interstitial pneumonia (MA-UIP) and idiopathic pulmonary fibrosis (IPF-UIP). Adult MA-UIP and IPF-UIP patients were identified using CTD and IPF registries. The MA-UIP cohort included myositis or anti-synthetase syndrome patients with interstitial lung disease manifesting UIP on high-resolution CT chest and/or a lung biopsy revealing UIP histology. IPF subjects met American Thoracic Society criteria and similarly had UIP histopathology. Kaplan-Meier survival curves compared cumulative and pulmonary event-free survival (event = transplant or death) between (i) all MA-UIP and IPF-UIP subjects, and (ii) MA-UIP subjects with biopsy-proven UIP (n = 25) vs IPF-UIP subjects matched for age, gender and baseline forced vital capacity (±10%). Cox proportional hazards models compared survival, controlling for covariates. Eighty-one IPF-UIP and 43 MA-UIP subjects were identified. The median cumulative and event-free survival times in IPF vs MA-UIP were 5.25/1.8 years vs 16.2/10.8 years, respectively. Cumulative and event-free survival was significantly worse in IPF-UIP vs MA-UIP [hazard ratios for IPF-UIP were 2.9 (95% CI: 1.5, 5.6) and 5.0 (95% CI: 2.8, 8.7), respectively (P < 0.001)]. IPF-UIP event-free (but not cumulative) survival remained significantly worse than MA-UIP, with a hazard ratio of 6.4 (95% CI: 3.0, 13.8), after controlling for age at interstitial lung disease diagnosis, gender, ethnicity and baseline forced vital capacity%. Respiratory failure was the most common cause of death in both groups. A sub-analysis of 25 biopsy-proven MA-UIP subjects showed similar results. MA-UIP patients demonstrated a significant survival advantage over a matched IPF cohort, suggesting that despite similar histological and radiographic findings at presentation, the prognosis of MA-UIP is superior to that of IPF-UIP. © The Author 2016. Published by Oxford University Press on behalf of the British Society for Rheumatology. 
All rights reserved. For Permissions, please email: journals.permissions@oup.com
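The Kaplan-Meier comparisons described above rest on the product-limit estimator, which can be sketched in a few lines; the follow-up times here are invented, not the cohort's:

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimates.
    times: follow-up in years; events: 1 = death/transplant, 0 = censored."""
    at_risk = len(times)
    surv, curve = 1.0, []
    # Sort by time, with events ordered before censorings at tied times.
    for t, d in sorted(zip(times, events), key=lambda p: (p[0], -p[1])):
        if d:                       # the curve drops only at event times
            surv *= 1 - 1 / at_risk
            curve.append((t, surv))
        at_risk -= 1                # censored subjects still leave the risk set
    return curve

# Hypothetical follow-up for five subjects (years)
curve = kaplan_meier([1.0, 2.5, 3.0, 4.0, 5.5], [1, 0, 1, 1, 0])
```

Censored subjects leave the risk set without lowering the curve, which is why event-free and cumulative survival can both be estimated from incompletely followed cohorts like these registries.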
Kovalenko, Olga A; Azzam, Edouard I; Ende, Norman
2013-11-01
The purpose of this study was to evaluate the window of time and dose of human umbilical-cord-blood (HUCB) mononucleated cells necessary for successful treatment of radiation injury in mice. Female A/J mice (27-30 weeks old) were exposed to an absorbed dose of 9-10 Gy of (137)Cs γ-rays delivered acutely to the whole body. They were treated either with 1 × 10(8) or 2 × 10(8) HUCB mononucleated cells at 24-52 h after the irradiation. The antibiotic Levaquin was administered 4 h postirradiation. The increased dose of cord-blood cells resulted in enhanced survival. The enhancement of survival in animals that received 2 × 10(8) HUCB mononucleated cells relative to irradiated but untreated animals was highly significant (P < 0.01). Compared with earlier studies, the increased dose of HUCB mononucleated cells, coupled with early use of an antibiotic, extended the window of time for effective treatment of severe radiation injury from 4 to 24-52 h after exposure.
Han, Hyuk-Soo; Kang, Seung-Baik
2013-05-01
The long-term survivorship of TKA in Asian countries is comparable to that in Western countries. High-flexion TKA designs were introduced to improve flexion after TKA. However, several studies suggest high-flexion designs are at greater risk of femoral component loosening compared with conventional TKA designs. We previously reported a revision rate of 21% at 11 to 45 months; this report is intended as a follow-up to that study. Do implant survival and function decrease with time, and do high-flexion activities increase the risk of premature failure? We prospectively followed 72 NexGen LPS-Flex fixed TKAs in 47 patients implanted by a single surgeon between March 2003 and September 2004. We determined the probability of survival using revision as an end point and compared survival between those who could and those who could not perform high-flexion activities. Minimum follow-up was 0.9 years (median, 6.5 years; range, 0.9-8.6 years). Twenty-five patients (33 knees) underwent revision for aseptic loosening of the femoral component at a mean of 4 years (range, 1-8 years). The probability of revision-free survival for aseptic loosening was 67% and 52% at 5 and 8 years, respectively. Eight-year cumulative survivorship was lower in patients capable of squatting, kneeling, or sitting cross-legged (31% compared with 78%). There were no differences in the pre- and postoperative mean Hospital for Special Surgery scores and maximum knee flexion degrees whether or not high-flexion activities could be achieved. Overall midterm high-flexion TKA survival in our Asian cohort was lower than that of conventional and other high-flexion designs. This unusually high rate of femoral component loosening was associated with postoperative high-flexion activities.
Deakin, Charles D; Fothergill, Rachael; Moore, Fionna; Watson, Lynne; Whitbread, Mark
2014-07-01
The relationship between the neurological status at the time of handover from the ambulance crew to a Heart Attack Centre (HAC) in patients who have achieved return of spontaneous circulation (ROSC) and subsequent outcome, in the context of current treatment standards, is unknown. A retrospective review of all patients treated by London Ambulance Service (LAS) from 1st April 2011 to 31st March 2013 admitted to a HAC in Greater London was undertaken. Neurological status (A - alert; V - responding to voice; P - responding to pain; U - unresponsive) recorded by the ambulance crew on handover was compared with length of hospital stay and survival to hospital discharge. A total of 475 sequential adult cardiac arrests of presumed cardiac origin, achieving ROSC on admission to a HAC were identified. Outcome data was available for 452 patients, of whom 253 (56.0%) survived to discharge. Level of consciousness on admission to the HAC was a predictor of duration of hospital stay (P<0.0001) and survival to hospital discharge (P<0.0001). Of those presenting with a shockable rhythm, 32.3% (120/371) were 'A' or 'V', compared with 9.1% (9/99) of those with non-shockable rhythms (P<0.001). Patients with shockable rhythms achieving ROSC are more likely to be conscious (A or V) compared with those with non-shockable rhythms. Most patients who are conscious on admission to the HAC will survive, compared with approximately half of those who are unconscious (P or U), suggesting that critical care is generally appropriate at all levels of consciousness if ROSC has been achieved. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Baade, Peter D; Dasgupta, Paramita; Dickman, Paul W; Cramb, Susanna; Williamson, John D; Condon, John R; Garvey, Gail
2016-08-01
The survival inequality faced by Indigenous Australians after a cancer diagnosis is well documented; what is less understood is whether this inequality has changed over time and what it means in terms of the impact a cancer diagnosis has on Indigenous people. Survival information for all patients identified as either Indigenous (n=3,168) or non-Indigenous (n=211,615) and diagnosed in Queensland between 1997 and 2012 was obtained from the Queensland Cancer Registry, with mortality followed up to 31st December 2013. Flexible parametric survival models were used to quantify changes in the cause-specific survival inequalities and the number of lives that might be saved if these inequalities were removed. Among Indigenous cancer patients, the 5-year cause-specific survival (adjusted by age, sex and broad cancer type) increased from 52.9% in 1997-2006 to 58.6% in 2007-2012, while it improved from 61.0% to 64.9% among non-Indigenous patients. This meant that the adjusted 5-year comparative survival ratio (Indigenous : non-Indigenous) increased from 0.87 [0.83-0.88] to 0.89 [0.87-0.93], with similar improvements in the 1-year comparative survival. Using a simulated cohort corresponding to the number and age distribution of Indigenous people diagnosed with cancer in Queensland each year (n=300), based on the 1997-2006 cohort mortality rates, 35 of the 170 deaths due to cancer (21%) expected within five years of diagnosis were due to the Indigenous : non-Indigenous survival inequality. This percentage was similar when applying 2007-2012 cohort mortality rates (19%; 27 out of 140 deaths). Indigenous people diagnosed with cancer still face a poorer survival outlook than their non-Indigenous counterparts, particularly in the first year after diagnosis. 
The improving survival outcomes among both Indigenous and non-Indigenous cancer patients, and the decreasing absolute impact of the Indigenous survival disadvantage, should provide increased motivation to continue and enhance current strategies to further reduce the impact of the survival inequalities faced by Indigenous people diagnosed with cancer. Copyright © 2016 Elsevier Ltd. All rights reserved.
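The headline figures in the abstract above can be checked with simple arithmetic; note that the published comparative survival ratios come from adjusted flexible parametric models, so dividing the crude 5-year estimates only approximates them:

```python
# Adjusted 5-year cause-specific survival reported in the abstract
indigenous_5yr, non_indigenous_5yr = 0.529, 0.610   # 1997-2006 cohort

# Comparative survival ratio (Indigenous : non-Indigenous); simple division
# approximates the model-based ratio of 0.87 reported for this period.
ratio_1997_2006 = indigenous_5yr / non_indigenous_5yr

# Share of expected cancer deaths attributable to the survival inequality
pct_inequality_early = 100 * 35 / 170   # under 1997-2006 mortality rates
pct_inequality_late = 100 * 27 / 140    # under 2007-2012 mortality rates
```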
Grisham, Blake A.; Boal, Clint W.
2015-01-01
Baseline survival and mortality data for lesser prairie-chickens (Tympanuchus pallidicinctus) are lacking for shinnery oak (Quercus havardii) prairies. An understanding of the causes and timing of mortalities and breeding season survival in this ecoregion is important because shinnery oak prairies have hotter and drier environmental conditions, as well as different predator communities compared with the northern distribution of the species. The need for this information has become more pressing given the recent listing of the species as threatened under the U.S. Endangered Species Act. We investigated causes of mortality and survival of lesser prairie-chickens during the 6-month breeding season (1 Mar–31 Aug) of 2008–2011 on the Texas Southern High Plains, USA. We recorded 42 deaths of radiotagged individuals, and our results indicated female mortalities were proportionate among avian and mammalian predation and other causes of mortality but survival was constant throughout the 6-month breeding season. Male mortalities were constant across avian and mammalian predation and other causes, but more mortalities occurred in June compared with other months. Male survival also varied by month, and survival probabilities were lower in June–August. We found predation on leks was rare, mortalities from fence collisions were rare, female survival did not decrease during incubation or brood-rearing, and survival was influenced by drought. Our study corroborated recent studies that suggested lesser prairie-chickens are living at the edge of their physiological tolerances to environmental conditions in shinnery oak prairies. As such, lesser prairie-chickens in our study experienced different patterns of mortality and survival that we attributed to hot, dry conditions during the breeding season. 
Specifically, and converse to other studies on lesser prairie-chicken survival and mortality, drought positively influenced female survival because females did not incubate eggs during drought conditions; the incubation period is when females are most vulnerable to predation. Male mortalities and survival were negatively influenced by drought later in the breeding season, which we attributed to rigorous lekking activities through late May combined with lack of food and cover as the breeding season progressed into summer.
de Pont, Anne-Cornélie J M; Bouman, Catherine S C; Bakhtiari, Kamran; Schaap, Marianne C L; Nieuwland, Rienk; Sturk, Augueste; Hutten, Barbara A; de Jonge, Evert; Vroom, Margreeth B; Meijers, Joost C M; Büller, Harry R
2006-01-01
During continuous venovenous hemofiltration, predilution can prolong circuit survival time, but the underlying mechanism has not been elucidated. The aim of the present study was to compare predilution with postdilution, with respect to circuit thrombogenesis. Eight critically ill patients were treated with both predilutional and postdilutional continuous venovenous hemofiltration in a crossover fashion. A filtration flow of 60 ml/min was used in both modes. We chose blood flows of 140 and 200 ml/min during predilution and postdilution, respectively, to keep the total flow through the hemofilter constant. Extracorporeal circuit pressures were measured hourly, and samples of blood and ultrafiltrate were collected at five different time points. Thrombin-antithrombin complexes and prothrombin fragments F1 + 2 were measured by ELISA, and platelet activation was assessed by flow cytometry. No signs of thrombin generation or platelet activation were found during either mode. During postdilution, baseline platelet count and maximal prefilter pressure had a linear relation, whereas both parameters were inversely related with circuit survival time. In summary, predilution and postdilution did not differ with respect to extracorporeal circuit thrombogenesis. During postdilution, baseline platelet count and maximal prefilter pressure were inversely related with circuit survival time.
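The blood flows chosen above keep the total flow through the hemofilter constant across modes; a tiny sketch of that arithmetic (assuming, as the fixed 60 ml/min implies, that the predilution replacement flow equals the filtration flow):

```python
filtration_flow = 60                # ml/min in both modes
pre_blood, post_blood = 140, 200    # blood flows chosen in the study, ml/min

# Predilution: replacement fluid (assumed equal to the filtration flow) is
# infused upstream of the filter, so the filter sees blood + replacement.
pre_filter_flow = pre_blood + filtration_flow
# Postdilution: replacement is infused downstream; the filter sees blood only.
post_filter_flow = post_blood
```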
NASA Astrophysics Data System (ADS)
Bessell-Browne, Pia; Stat, Michael; Thomson, Damian; Clode, Peta L.
2014-09-01
Colonies of Coscinaraea marshae corals from Rottnest Island, Western Australia have survived for more than 11 months in various bleached states following a severe heating event in the austral summer of 2011. These colonies are situated in a high-latitude, mesophotic environment, which has made their long-term survival of particular interest, as such environments typically suffer from minimal thermal pressures. We have investigated corals that remain unbleached, moderately bleached, or severely bleached to better understand potential survival mechanisms utilised in response to thermal stress. Specifically, Symbiodinium (algal symbiont) density and genotype, chlorophyll-a concentrations, and δ13C and δ15N levels were compared between colonies in the three bleaching categories. Severely bleached colonies housed significantly fewer Symbiodinium cells (p < 0.05) and significantly reduced chlorophyll-a concentrations (p < 0.05), compared with unbleached colonies. Novel Symbiodinium clade associations were observed for this coral in both severely and moderately bleached colonies, with clade C and a mixed clade population detected. In unbleached colonies, only clade B was observed. Levels of δ15N indicate that severely bleached colonies are utilising heterotrophic feeding mechanisms to aid survival whilst bleached. Collectively, these results suggest that these C. marshae colonies can survive with low symbiont and chlorophyll densities in response to prolonged thermal stress and extended bleaching, and increase heterotrophic feeding levels sufficiently to meet energy demands, thus enabling some colonies to survive and recover over long time frames. This is significant as it suggests that corals in mesophotic and high-latitude environments may possess considerable plasticity and an ability to tolerate and adapt to large environmental fluctuations, thereby improving their chances of survival as climate change impacts coral ecosystems worldwide.
The effects of exercise and stress on the survival and maturation of adult-generated granule cells
Snyder, Jason S.; Glover, Lucas R.; Sanzone, Kaitlin M.; Kamhi, J. Frances; Cameron, Heather A.
2009-01-01
Stress strongly inhibits proliferation of granule cell precursors in the dentate gyrus, while voluntary running has the opposite effect. Few studies, however, have examined the possible effects of these environmental manipulations on the maturation and survival of young granule cells. We examined the number of surviving granule cells and the proportion of young neurons that were functionally mature, as defined by seizure-induced immediate-early gene expression, in 14- and 21-day-old granule cells in mice that were given access to a running wheel, restrained daily for 2 hours, or given no treatment during this period. Importantly, treatments began two days after BrdU injection, to isolate effects on survival from those on cell proliferation. We found a large increase in granule cell survival in running mice compared with controls at both time points. In addition, running increased the proportion of granule cells expressing the immediate-early gene Arc in response to seizures, suggesting that it speeds incorporation into circuits, i.e., functional maturation. Stressed mice showed no change in Arc expression, compared to control animals, but, surprisingly, showed a transient increase in survival of 14-day-old granule cells, which was gone 7 days later. Examination of cell proliferation, using the endogenous mitotic marker proliferating cell nuclear antigen (PCNA), showed an increase in cell proliferation after 12 days of running but not after 19 days of running. The number of proliferating cells was unchanged 24 hours after the 12th or 19th episode of daily restraint stress. These findings demonstrate that running has strong effects on the survival and maturation of young granule cells as well as their birth, and that stress can have positive but short-lived effects on granule cell survival. PMID:19156854
Racial differences in cervical cancer survival in the Detroit metropolitan area.
Movva, Sujana; Noone, Anne-Michelle; Banerjee, Mousumi; Patel, Divya A; Schwartz, Kendra; Yee, Cecilia L; Simon, Michael S
2008-03-15
African-American (AA) women have lower survival rates from cervical cancer compared with white women. The objective of this study was to examine the influence of socioeconomic status (SES) and other variables on racial disparities in overall survival among women with invasive cervical cancer. One thousand thirty-six women (705 white women and 331 AA women) who were diagnosed with primary invasive cancer of the cervix between 1988 and 1992 were identified through the Metropolitan Detroit Cancer Surveillance System (MDCSS), a registry in the Surveillance, Epidemiology, and End Results (SEER) database. Pathology, treatment, and survival data were obtained through SEER. SES was categorized by using occupation, poverty, and educational status at the census tract level. Cox proportional hazards models were used to compare overall survival between AA women and white women adjusting for sociodemographics, clinical presentation, and treatment. AA women were more likely to present at an older age (P<.001), with later stage disease (P<.001), and with squamous histology (P=.01), and they were more likely to reside in a census tract categorized as Working Poor (WP) (P<.001). After multivariate adjustment, race no longer had a significant impact on survival. Women who resided in a WP census tract had a higher risk of death than women from a Professional census tract (P=.05). There was a significant interaction between disease stage and time with the effect of stage on survival attenuated after 6 years. In this study, factors that affected access to medical care appeared to have a more important influence than race on the long-term survival of women with invasive cervical cancer. Copyright (c) 2008 American Cancer Society.
Rural AIDS Diagnoses in Florida: Changing Demographics and Factors Associated With Survival
Trepka, Mary Jo; Niyonsenga, Theophile; Maddox, Lorene M.; Lieb, Spencer
2012-01-01
Purpose To compare demographic characteristics and predictors of survival of rural residents diagnosed with acquired immunodeficiency syndrome (AIDS) with those of urban residents. Methods Florida surveillance data for people diagnosed with AIDS during 1993–2007 were merged with 2000 Census data using ZIP code tabulation areas (ZCTA). Rural status was classified based on the ZCTA’s rural-urban commuting area classification. Survival rates were compared between rural and urban areas using survival curves and Cox proportional hazards models controlling for demographic, clinical, and area-level socioeconomic and health care access factors. Findings Of the 73,590 people diagnosed with AIDS, 1,991 (2.7%) resided in rural areas. People in the most recent rural cohorts were more likely than those in earlier cohorts to be female, non-Hispanic black, older, and have a reported transmission mode of heterosexual sex. There were no statistically significant differences in the 3-, 5-, or 10-year survival rates between rural and urban residents. Older age at the time of diagnosis, diagnosis during the 1993–1995 period, other/unknown transmission mode, and lower CD4 count/percent categories were associated with lower survival in both rural and urban areas. In urban areas only, being non-Hispanic black or Hispanic, being US born, more poverty, less community social support, and lower physician density were also associated with lower survival. Conclusions In rural Florida, the demographic characteristics of people diagnosed with AIDS have been changing, which may necessitate modifications in the delivery of AIDS-related services. Rural residents diagnosed with AIDS did not have a significant survival disadvantage relative to urban residents. PMID:23802929
He, Lingling; Liu, Xiaoli; Zhao, Yalin; Zhang, Shuan; Jiang, Yuyong; Wang, Xianbo; Yang, Zhiyun
2017-01-01
Aim: To determine whether nucleot(s)ide analog (NA) therapy has a survival benefit for patients with HBV-related HCC after unresectable treatment. Method: A systematic search was conducted through seven electronic databases, including PubMed, OVID, EMBASE, Cochrane Databases, Elsevier, Wiley Online Library, and BMJ Best Practice. All studies comparing NA therapy combined with unresectable treatment versus unresectable treatment alone were considered for inclusion. The primary outcome was overall survival (OS) after unresectable treatment for patients with HBV-related HCC. The secondary outcome was progression-free survival (PFS). Results were expressed as hazard ratios (HR) for survival with 95% confidence intervals. Results: We included six studies with 994 patients: 409 patients in the NA therapy group and 585 patients without antiviral therapy in the control group. There were significant improvements in overall survival (HR = 0.57; 95% CI = 0.47-0.70; p < 0.001) and progression-free survival (HR = 0.84; 95% CI = 0.71-0.99; p = 0.034) in the NA-treated group compared with the control group. A funnel plot showed no significant publication bias in these studies. The benefit in the NA-treated group was consistent across antiviral drugs and treatment modalities, and both overall mortality and mortality secondary to liver failure were markedly lower in the NA-treated group. Sensitivity analyses confirmed the robustness of the results. Conclusions: NA therapy after unresectable treatment has potential beneficial effects in terms of overall survival and progression-free survival. NA therapy should be considered in clinical practice.
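Pooled hazard ratios like those above are typically produced by inverse-variance weighting of study-level log hazard ratios. A minimal fixed-effect sketch follows; the per-study HRs and CIs are invented, and the paper's actual pooling model is not stated in the abstract:

```python
import math

def pool_fixed_effect(hrs, cis):
    """Inverse-variance fixed-effect pooling of log hazard ratios.
    cis: (lower, upper) 95% CIs, from which standard errors are back-calculated."""
    logs = [math.log(hr) for hr in hrs]
    ses = [(math.log(u) - math.log(l)) / (2 * 1.96) for l, u in cis]
    weights = [1 / se ** 2 for se in ses]
    pooled_log = sum(w * lg for w, lg in zip(weights, logs)) / sum(weights)
    se_pooled = 1 / math.sqrt(sum(weights))
    return (math.exp(pooled_log),
            math.exp(pooled_log - 1.96 * se_pooled),
            math.exp(pooled_log + 1.96 * se_pooled))

# Hypothetical per-study hazard ratios and 95% CIs
hr, lo, hi = pool_fixed_effect([0.50, 0.65], [(0.35, 0.72), (0.48, 0.88)])
```

Random-effects models add a between-study variance term to each weight; the fixed-effect version shown is the simplest correct starting point.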
Qiao, Baozhen; Schymura, Maria J; Kahn, Amy R
2016-10-01
Population-based cancer survival analyses have traditionally been based on the first primary cancer. Recent studies have brought this practice into question, arguing that varying registry reference dates affect the ability to identify earlier cancers, resulting in selection bias. We used a theoretical approach to evaluate the extent to which the length of registry operations affects the classification of first versus subsequent cancers and consequently survival estimates. Sequence number central was used to classify tumors from the New York State Cancer Registry, diagnosed 2001-2010, as either first primaries (value=0 or 1) or subsequent primaries (≥2). A set of three sequence numbers, each based on an assumed reference year (1976, 1986 or 1996), was assigned to each tumor. Percent of subsequent cancers was evaluated by reference year, cancer site and age. 5-year relative survival estimates were compared under four different selection scenarios. The percent of cancer cases classified as subsequent primaries was 15.3%, 14.3% and 11.2% for reference years 1976, 1986 and 1996, respectively; and varied by cancer site and age. When only the first primary was included, shorter registry operation time was associated with slightly lower 5-year survival estimates. When all primary cancers were included, survival estimates decreased, with the largest decreases seen for the earliest reference year. Registry operation length affected the identification of subsequent cancers, but the overall effect of this misclassification on survival estimates was small. Survival estimates based on all primary cancers were slightly lower, but might be more comparable across registries. Copyright © 2016 Elsevier Ltd. All rights reserved.
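The classification step described above can be sketched using the SEER sequence number central convention the abstract relies on (0 = only primary, 1 = first of multiple primaries, 2+ = subsequent); the tumor list is hypothetical:

```python
def is_first_primary(sequence_number_central: int) -> bool:
    """SEER sequence number central: 0 = only primary cancer,
    1 = first of multiple primaries, 2+ = second or later primary."""
    return sequence_number_central in (0, 1)

# Hypothetical sequence numbers for six tumors under one reference year
tumors = [0, 1, 2, 1, 3, 0]
pct_subsequent = 100 * sum(not is_first_primary(s) for s in tumors) / len(tumors)
```

An earlier reference year gives the registry more history in which to observe prior cancers, so more tumors receive sequence numbers of 2 or higher, exactly the 15.3% vs 11.2% gradient the study reports.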
Impact of breast cancer subtypes on 3-year survival among adolescent and young adult women
2013-01-01
Introduction Young women have poorer survival after breast cancer than do older women. It is unclear whether this survival difference relates to the unique distribution of hormone receptor (HR) and human epidermal growth factor receptor 2 (HER2)-defined molecular breast cancer subtypes among adolescent and young adult (AYA) women aged 15 to 39 years. The purpose of our study was to examine associations between breast cancer subtypes and short-term survival in AYA women, as well as to determine whether the distinct molecular subtype distribution among AYA women explains the unfavorable overall breast cancer survival statistics reported for AYA women compared with older women. Methods Data for 5,331 AYA breast cancers diagnosed between 2005 and 2009 were obtained from the California Cancer Registry. Survival by subtype (triple-negative; HR+/HER2-; HR+/HER2+; HR-/HER2+) and age-group (AYA versus 40- to 64-year-olds) was analyzed with Cox proportional hazards regression with follow-up through 2010. Results With up to 6 years of follow-up and a mean survival time of 3.1 years (SD = 1.5 years), AYA women diagnosed with HR-/HER2+ and triple-negative breast cancer experienced a 1.6-fold and 2.7-fold increased risk of death, respectively, from all causes (HR-/HER2+ hazard ratio: 1.55; 95% confidence interval (CI): 1.10 to 2.18; triple-negative hazard ratio: 2.75; 95% CI, 2.06 to 3.66) and from breast cancer (HR-/HER2+ hazard ratio: 1.63; 95% CI, 1.12 to 2.36; triple-negative hazard ratio: 2.71; 95% CI, 1.98 to 3.71) than AYA women with HR+/HER2- breast cancer. AYA women who resided in lower socioeconomic status neighborhoods, had public health insurance, or were of Black, compared with White, race/ethnicity experienced worse survival. This race/ethnicity association was attenuated somewhat after adjusting for breast cancer subtypes (hazard ratio, 1.33; 95% CI, 0.98 to 1.82).
AYA women had similar all-cause and breast cancer-specific short-term survival as older women for all breast cancer subtypes and across all stages of disease. Conclusions Among AYA women with breast cancer, short-term survival varied by breast cancer subtypes, with the distribution of breast cancer subtypes explaining some of the poorer survival observed among Black, compared with White, AYA women. Future studies should consider whether distribution of breast cancer subtypes and other factors, including differential receipt of treatment regimens, influences long-term survival in young compared with older women. PMID:24131591
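The subtype hazard ratios above come from Cox proportional hazards regression; for a single binary covariate the model can be fitted directly by maximizing the partial likelihood. A pure-Python sketch on toy data with distinct event times (this is not the registry analysis, and real fits use dedicated libraries with tie handling and multiple covariates):

```python
import math

def cox_hr(times, events, group):
    """Hazard ratio exp(beta) for a single binary covariate, fitted by
    Newton's method on the Cox partial likelihood (assumes distinct
    event times; events[i] is 1 for death, 0 for censoring)."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    beta = 0.0
    for _ in range(100):
        grad = info = 0.0
        for k, i in enumerate(order):
            if not events[i]:
                continue
            risk = order[k:]  # subjects still at risk at this event time
            s0 = sum(math.exp(beta * group[j]) for j in risk)
            mean = sum(group[j] * math.exp(beta * group[j]) for j in risk) / s0
            grad += group[i] - mean
            info += mean * (1.0 - mean)  # binary covariate: x**2 == x
        if info == 0.0:
            break
        step = max(-2.0, min(2.0, grad / info))  # damp large Newton steps
        beta += step
        if abs(step) < 1e-12:
            break
    return math.exp(beta)

# Toy data: deaths in group 1 interleave slightly earlier than group 0,
# so the fitted hazard ratio for group 1 should exceed 1.
hr = cox_hr([1, 2, 3, 4, 5, 6], [1, 1, 1, 1, 1, 1], [1, 0, 1, 0, 1, 0])
```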
Spíndola, A F; Silva-Torres, C S A; Rodrigues, A R S; Torres, J B
2013-08-01
The ladybird beetle, Eriopis connexa (Germar) (Coleoptera: Coccinellidae), is one of the commonest predators of aphids (Hemiptera: Aphididae) in the cotton agroecosystem and in many other row and fruit crops in Brazil, and has been introduced into other countries such as the USA for purposes of aphid control. The boll weevil, Anthonomus grandis Boheman (Coleoptera: Curculionidae), is the most serious cotton pest wherever it occurs, including Brazil. Controlling boll weevils and other pests such as cotton defoliators still tends to involve intense application of insecticides to secure cotton production. The pyrethroid insecticide lambda-cyhalothrin (LCT) is commonly used, but this compound is not effective against aphids; hence, a desirable strategy would be to maintain E. connexa populations in cotton fields where LCT is applied. Using populations of E. connexa resistant (Res) and susceptible (Sus) to LCT, we compared behavioural responses on treated cotton plants and under confinement on partially and fully treated surfaces, and assessed the insects' survival on treated plants compared with that of the boll weevil. The resistant E. connexa population caged on plants treated with 15 and 75 g a.i. ha-1 exhibited >82% survival at both insecticide concentrations, compared with <3% and <17% survival for the susceptible E. connexa population and boll weevils, respectively. The responses of the E. connexa Res and Sus populations when released, either on the soil or on the plant canopy, indicated avoidance of treated plants, as measured by the time elapsed before accessing the plant. Compared with susceptible individuals, resistant ones took longer to suffer insecticide knockdown, had a higher recovery rate after knockdown, and spent more time in the plant canopy. Based on the behavioural parameters evaluated in treated arenas, no ladybird beetles exhibited repellency.
However, irritability was evident, with the susceptible population exhibiting greater irritability than either the resistant population or a subgroup of resistant individuals that had recovered from knockdown. The outcomes for the E. connexa Res population indicate a promising strategy for its maintenance when LCT is used in integrated pest management schemes to control the boll weevil or other pests that are not prey of ladybird beetles in cotton fields.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zelefsky, Michael J., E-mail: zelefskm@mskcc.org; Gomez, Daniel R.; Polkinghorn, William R.
2013-07-01
Purpose: To determine whether the response to neoadjuvant androgen deprivation therapy (ADT), defined by a decline in prostate-specific antigen (PSA) to nadir values, is associated with improved survival outcomes after external beam radiation therapy (EBRT) for prostate cancer. Methods and Materials: One thousand forty-five patients with localized prostate cancer were treated with definitive EBRT in conjunction with neoadjuvant and concurrent ADT. A 6-month course of ADT was used (3 months during the neoadjuvant phase and 2 to 3 months concurrently with EBRT). The median EBRT prescription dose was 81 Gy using a conformal-based technique. The median follow-up time was 8.5 years. Results: The 10-year PSA relapse-free survival outcome among patients with pre-radiation therapy PSA nadirs of ≤0.3 ng/mL was 74.3%, compared with 57.7% for patients with higher PSA nadir values (P<.001). The 10-year distant metastases-free survival outcome among patients with pre-radiation therapy PSA nadirs of ≤0.3 ng/mL was 86.1%, compared with 78.6% for patients with higher PSA nadir values (P=.004). In a competing-risk analysis, prostate cancer-related deaths were also significantly reduced among patients with pre-radiation therapy PSA nadirs of ≤0.3 ng/mL compared with higher values (7.8% vs 13.7%; P=.009). Multivariable analysis demonstrated that the pre-EBRT PSA nadir value was a significant predictor of long-term biochemical tumor control, distant metastases-free survival, and cause-specific survival outcomes. Conclusions: Pre-radiation therapy nadir PSA values of ≤0.3 ng/mL after neoadjuvant ADT were associated with improved long-term biochemical tumor control, reduction in distant metastases, and reduced prostate cancer-related death. Patients with higher nadir values may require alternative adjuvant therapies to improve outcomes.
Chapman, William C; Vachharajani, Neeta; Collins, Kelly M; Garonzik-Wang, Jackie; Park, Yikyung; Wellen, Jason R; Lin, Yiing; Shenoy, Surendra; Lowell, Jeffrey A; Doyle, M B Majella
2015-07-01
The shortage of donor organs has led to increasing use of extended criteria donors, including older donors. The upper limit of donor age that produces acceptable outcomes continues to be explored. We hypothesized that in liver transplantation, with appropriate selection, graft survival and patient outcomes would be comparable regardless of donor age. We performed a retrospective analysis of 1,036 adult orthotopic liver transplantations (OLT), performed between January 1, 2000 and December 31, 2013, from a prospectively maintained database. The study focus group comprised liver transplantations performed using grafts from older (older than 60 years) deceased donors. Deceased donor liver transplantations done during the same period using grafts from younger donors (younger than 60 years) were analyzed for comparison. Both groups were further divided based on recipient age (less than 60 years, and 60 years or older). Donor age was the primary variable. Recipient variables included demographics, indication for transplantation, Model for End-Stage Liver Disease (MELD) score, graft survival, and patient survival. Operative details and postoperative complications were analyzed. Patient demographics and perioperative details were similar between groups. Patient and graft survival rates were similar across the 4 groups. Rates of rejection (p = 0.07), bile leak (p = 0.17), and hepatic artery thrombosis (p = 0.84) were comparable across all groups, as was hepatitis C virus recurrence (p = 0.10). Thirty-one young recipients (less than 60 years) received grafts from donors aged 70 or older; their survival and complication rates were comparable to those in the young donor to young recipient group. Comparable graft and patient survival was achieved using older donors (60 years or more), regardless of recipient age, without an increased rate of complications. Copyright © 2015 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
Probiotics improve survival of septic rats by suppressing conditioned pathogens in ascites
Liu, Da-Quan; Gao, Qiao-Ying; Liu, Hong-Bin; Li, Dong-Hua; Wu, Shang-Wei
2013-01-01
AIM: To investigate the benefits of probiotics treatment in septic rats. METHODS: Sepsis was induced in rats by cecal ligation and puncture. Animals in the control and septic model groups received vehicle, while the probiotics-treated group received mixed probiotics comprising Bifidobacterium longum, Lactobacillus bulgaricus and Streptococcus thermophilus. We observed the survival of septic rats given different amounts of the mixed probiotics. We also determined the bacterial populations in ascites and blood in experimental sepsis using cultivation and real-time polymerase chain reaction, and assessed the severity of mucosal inflammation in colonic tissues. RESULTS: Probiotics treatment improved survival of the rats significantly, and this effect was dose dependent. The survival rate was 30% in the vehicle-treated septic model group, whereas full-dose and quarter-dose probiotics treatment increased the survival rate significantly compared with the septic model group (80% and 55% vs 30%, P < 0.05). The total viable counts of bacteria in ascites decreased significantly in the probiotics-treated group compared with the septic model group (5.20 ± 0.57 vs 9.81 ± 0.67, P < 0.05). The total positive rate of hemoculture decreased significantly in the probiotics-treated group compared with the septic model group (33.3% vs 100.0%, P < 0.05). The populations of Escherichia coli and Staphylococcus aureus in ascites of the probiotics-treated group were decreased significantly compared with those of the septic model group (3.93 ± 0.73 vs 8.80 ± 0.83, P < 0.05; 2.80 ± 1.04 vs 5.39 ± 1.21, P < 0.05). With probiotics treatment, there was a decrease in the scores of inflammatory cell infiltration into the intestinal mucosa in septic animals (1.50 ± 0.25 vs 2.88 ± 0.14, P < 0.01). CONCLUSION: Escherichia coli and Staphylococcus aureus may be primary pathogens in septic rats. Probiotics improve survival of septic rats by suppressing these conditioned pathogens. PMID:23840152
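Survival-rate comparisons like the 80% vs 30% above are proportions from small groups; a two-sided Fisher exact test is one standard way to compare them (the abstract does not name the test used, and the per-group counts below are hypothetical):

```python
from math import comb

def fisher_exact(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table
    [[a, b], [c, d]] (e.g. survived/died by treatment group),
    summing all hypergeometric tables no more probable than
    the observed one."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    def prob(x):
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = prob(a)
    lo = max(0, row1 - (n - col1))
    hi = min(row1, col1)
    return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs + 1e-12)

# Hypothetical counts: 16/20 survivors with probiotics vs 6/20 with vehicle.
p = fisher_exact(16, 4, 6, 14)
```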
Møller, Sidsel G; Rajan, Shahzleen; Folke, Fredrik; Hansen, Carolina Malta; Hansen, Steen Møller; Kragholm, Kristian; Lippert, Freddy K; Karlsson, Lena; Køber, Lars; Torp-Pedersen, Christian; Gislason, Gunnar H; Wissenberg, Mads
2016-07-01
Survival after out-of-hospital cardiac arrest (OHCA) has tripled during the past decade in Denmark, likely as a result of improvements in cardiac arrest management. This study analyzed whether these improvements extended to patients with chronic obstructive pulmonary disease (COPD). Patients ≥18 years with OHCA of presumed cardiac cause were identified through the Danish Cardiac Arrest Registry, 2001-2011. Patients with a history of COPD up to ten years prior to arrest were identified from the Danish National Patient Register and compared to non-COPD patients. Of 21,480 included patients, 3056 (14.2%) had a history of COPD. Compared to non-COPD patients, COPD patients were older (75 vs. 71 years), less likely to be male (61.2% vs. 68.5%), had a higher prevalence of other comorbidities, and were less likely to have arrests outside private homes (17.7% vs. 28.3%), witnessed arrests (48.7% vs. 52.9%), bystander cardiopulmonary resuscitation (25.8% vs. 34.8%), and a shockable heart rhythm (15.6% vs. 29.9%), all p<0.001; there was no significant difference in the time interval from recognition of arrest to rhythm analysis by the ambulance crew (p=0.24). From 2001 to 2011, survival upon hospital arrival increased in both patient groups (from 6.8% to 17.1% in COPD patients and from 8.2% to 25.6% in non-COPD patients, p<0.001). However, no significant change was observed in 30-day survival in COPD patients (from 3.7% to 2.1%, p=0.27), in contrast to non-COPD patients (from 3.5% to 13.0%, p<0.001). Despite generally improved 30-day survival after OHCA over time, no improvement was observed in 30-day survival among COPD patients. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Selvarajah, Gayathri T; Kirpensteijn, Jolle; van Wolferen, Monique E; Rao, Nagesha AS; Fieten, Hille; Mol, Jan A
2009-01-01
Background Gene expression profiling of spontaneous tumors in the dog offers a unique translational opportunity to identify prognostic biomarkers and signaling pathways that are common to both canine and human disease. Osteosarcoma (OS) accounts for approximately 80% of all malignant bone tumors in the dog. Canine OS is highly comparable to its human counterpart with respect to histology, high metastatic rate and poor long-term survival. This study investigates the prognostic gene profile of thirty-two primary canine OS, using canine-specific cDNA microarrays representing 20,313 genes, to identify genes and cellular signaling pathways associated with survival. This, the first report of its kind in dogs with OS, also demonstrates the advantages of cross-species comparison with human OS. Results The 32 tumors were classified into two prognostic groups based on survival time (ST): short survivors (dogs with poor prognosis, surviving fewer than 6 months) and long survivors (dogs with better prognosis, surviving 6 months or longer). Fifty-one transcripts were found to be differentially expressed, with common upregulation of these genes in the short survivors. The genes overexpressed in short survivors have possible roles in proliferation, drug resistance or metastasis. Several deregulated pathways identified in the present study, including Wnt signaling, Integrin signaling and Chemokine/cytokine signaling, are comparable to those found in pathway analyses of human OS gene profiles, emphasizing the value of the dog as a model for humans. Conclusion A molecular-based method for discriminating outcome between short and long survivors is useful for future prognostic stratification at initial diagnosis, where genes and pathways associated with cell cycle/proliferation, drug resistance and metastasis could be potential targets for diagnosis and therapy.
The similarities between human and canine OS make the dog a suitable pre-clinical model for future 'novel' therapeutic approaches; the current research provides new insights into prognostic genes, molecular pathways and mechanisms involved in OS pathogenesis and disease progression. PMID:19735553
Chang, Yaojen; Gallon, Lorenzo; Jay, Colleen; Shetty, Kirti; Ho, Bing; Levitsky, Josh; Baker, Talia; Ladner, Daniela; Friedewald, John; Abecassis, Michael; Hazen, Gordon; Skaro, Anton I
2014-09-01
There are complex risk-benefit tradeoffs with different transplantation strategies for end-stage liver disease patients on renal support. Using a Markov discrete-time state transition model, we compared survival for this group with 3 strategies: simultaneous liver-kidney (SLK) transplantation, liver transplantation alone (LTA) followed by immediate kidney transplantation if renal function did not recover, and LTA followed by placement on the kidney transplant wait list. Patients were followed for 30 years from the age of 50 years. The probabilities of events were synthesized from population data and clinical trials according to Model for End-Stage Liver Disease (MELD) scores (21-30 and >30) to estimate input parameters. Sensitivity analyses tested the impact of uncertainty on survival. Overall, the highest survival rates were seen with SLK transplantation for both MELD score groups (82.8% for MELD scores of 21-30 and 82.5% for MELD scores > 30 at 1 year), albeit at the cost of using kidneys that might not be needed. Liver transplantation followed by kidney transplantation led to higher survival rates (77.3% and 76.4%, respectively, at 1 year) than placement on the kidney transplant wait list (75.1% and 74.3%, respectively, at 1 year). When uncertainty was considered, the results indicated that the waiting time and renal recovery affected conclusions about survival after SLK transplantation and liver transplantation, respectively. The subgroups with the longest durations of pretransplant renal replacement therapy and highest MELD scores had the largest absolute increases in survival with SLK transplantation versus sequential transplantation. In conclusion, the findings demonstrate the inherent tension in choices about the use of available kidneys and suggest that performing liver transplantation and using renal transplantation only for those who fail to recover their native renal function could free up available donor kidneys. 
These results could inform discussions about transplantation policy. © 2014 American Association for the Study of Liver Diseases.
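A Markov discrete-time state transition model of the kind described can be sketched as a cohort simulation over a monthly transition matrix; the states and probabilities below are illustrative assumptions, not the paper's calibrated inputs:

```python
# Discrete-time Markov cohort model: track the probability distribution
# over health states month by month and read off survival.
STATES = ["stable_graft", "renal_support", "dead"]

# T[i][j]: monthly probability of moving from state i to state j
# (illustrative values; death is absorbing).
T = [
    [0.96, 0.02, 0.02],
    [0.05, 0.90, 0.05],
    [0.00, 0.00, 1.00],
]

def survival_at(months, start=0):
    """Probability of being alive after `months` monthly cycles."""
    dist = [0.0] * len(STATES)
    dist[start] = 1.0
    for _ in range(months):
        dist = [sum(dist[i] * T[i][j] for i in range(len(STATES)))
                for j in range(len(STATES))]
    return 1.0 - dist[STATES.index("dead")]
```

Comparing transplantation strategies amounts to running such a model with strategy-specific states and transition probabilities and comparing the resulting survival curves, with sensitivity analyses varying the inputs.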
Long, Thorir E; Sigurdsson, Martin I; Sigurdsson, Gisli H; Indridason, Olafur S
2016-12-01
Acute kidney injury (AKI) is a common complication of medical and surgical interventions in hospitalized patients and is associated with high mortality. Our aim was to examine renal recovery, long-term survival and time trends in AKI survival. Changes in serum creatinine (SCr) were used to define AKI in patients at Landspitali University Hospital in Iceland from 1993 to 2013. Renal recovery was defined as SCr < 1.5× baseline. Out of 25,274 individuals who had their highest measured SCr during hospitalization and an available baseline SCr, 10,419 (41%) had AKI during hospitalization (H-AKI): 19%, 11% and 12% with Stage 1, 2 and 3, respectively. The incidence of H-AKI increased from 18.6 (95% CI, 14.7-22.5) to 29.9 (95% CI, 26.7-33.1) per 1000 admissions/year over the study period. Survival after H-AKI was 61% at 90 days and 51% at one year. Comparing H-AKI patients to propensity score matched individuals, the hazard ratio for death was 1.49 (1.36-1.62), 2.17 (1.95-2.41) and 2.95 (2.65-3.29) for Stage 1, 2 and 3, respectively. One-year survival of H-AKI patients improved from 47% in 1993-1997 to 57% in 2008-2013, and the adjusted hazard ratio for mortality improved relative to the first 5-year period: 0.85 (0.81-0.89), 0.67 (0.64-0.71), and 0.57 (0.53-0.60) for each subsequent 5-year interval. Recovery of renal function was achieved in 88%, 58% and 44% of patients in Stages 1, 2 and 3, respectively, improving with time. Acute kidney injury is an independent predictor of long-term mortality in hospitalized patients, but there has been a marked improvement in survival and renal recovery over the past two decades. © 2015 Asian Pacific Society of Nephrology.
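Staging AKI from serum creatinine changes, with recovery defined as SCr < 1.5× baseline as in the abstract, can be sketched as follows. The stage cutoffs use KDIGO-style ratio thresholds (1.5×, 2.0×, 3.0×), which the abstract does not spell out, and the full KDIGO definition also uses absolute SCr rises and urine output, omitted here:

```python
def aki_stage(peak_scr, baseline_scr):
    """Stage AKI from the peak/baseline serum creatinine ratio.
    Ratio cutoffs follow KDIGO-style staging (an assumption here;
    the full criteria also consider absolute SCr rises and urine
    output, which this sketch omits)."""
    ratio = peak_scr / baseline_scr
    if ratio >= 3.0:
        return 3
    if ratio >= 2.0:
        return 2
    if ratio >= 1.5:
        return 1
    return 0  # no AKI by the ratio criterion

def recovered(current_scr, baseline_scr):
    """Renal recovery as defined in the abstract: SCr < 1.5x baseline."""
    return current_scr < 1.5 * baseline_scr
```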
Chok, Kenneth S H; Fung, James Y Y; Chan, Albert C Y; Dai, Wing Chiu; Sharr, William W; Cheung, Tan To; Chan, See Ching; Lo, Chung Mau
2017-01-01
To evaluate whether living donor liver transplantation (LDLT) should be offered to patients with Model for End-stage Liver Disease (MELD) scores ≥35; no data were available to support LDLT in such patients. Data on 672 consecutive adult liver transplant recipients from 2005 to 2014 at our center were reviewed. Patients with MELD scores ≥35 were divided into a deceased donor liver transplantation (DDLT) group and an LDLT group and compared. Univariate analysis was performed to identify risk factors affecting survival. The LDLT group (n = 54) had younger (33 yrs vs 50 yrs, P < 0.001) and lighter (56 kg vs 65 kg, P = 0.004) donors, lighter grafts (627.5 g vs 1252.5 g, P < 0.001), lower graft-weight-to-recipient-standard-liver-volume ratios (51.28% vs 99.76%, P < 0.001), shorter cold ischemic time (106.5 min vs 389 min, P < 0.001), and longer operation time (681.5 min vs 534 min, P < 0.001). The groups were comparable in postoperative complications, hospital mortality, and graft and patient survival at one year (88.9% vs 92.5%; 88.9% vs 94.7%), three years (87.0% vs 86.9%; 87.0% vs 88.8%), and five years (84.8% vs 81.8%; 84.8% vs 83.3%). Univariate analysis did not show inferior survival in LDLT recipients. At centers with experience, the outcomes of LDLT can be comparable with those of DDLT even in patients with MELD scores ≥35. When donor risks and recipient benefits are fully considered and balanced, an MELD score ≥35 should not be a contraindication to LDLT. In Hong Kong, where most waitlisted patients have acute-on-chronic liver failure from hepatitis B, LDLT is a wise alternative to DDLT.
Survival of Bacillus pumilus spores for a prolonged period of time in real space conditions.
Vaishampayan, Parag A; Rabbow, Elke; Horneck, Gerda; Venkateswaran, Kasthuri J
2012-05-01
To prevent forward contamination and maintain the scientific integrity of future life-detection missions, it is important to characterize and attempt to eliminate terrestrial microorganisms associated with exploratory spacecraft and landing vehicles. Among the organisms isolated from spacecraft-associated surfaces, spores of Bacillus pumilus SAFR-032 exhibited unusually high resistance to decontamination techniques such as UV radiation and peroxide treatment. Subsequently, B. pumilus SAFR-032 was flown to the International Space Station (ISS) and exposed to a variety of space conditions via the European Technology Exposure Facility (EuTEF). After 18 months of exposure in the EXPOSE facility of the European Space Agency (ESA) on EuTEF under dark space conditions, SAFR-032 spores showed 10-40% survivability, whereas a survival rate of 85-100% was observed when these spores were kept aboard the ISS under dark simulated martian atmospheric conditions. In contrast, when UV (>110 nm) was applied on SAFR-032 spores for the same time period and under the same conditions used in EXPOSE, a ∼7-log reduction in viability was observed. A parallel experiment was conducted on Earth with identical samples under simulated space conditions. Spores exposed to ground simulations showed less of a reduction in viability when compared with the "real space" exposed spores (∼3-log reduction in viability for "UV-Mars," and ∼4-log reduction in viability for "UV-Space"). A comparative proteomics analysis indicated that proteins conferring resistant traits (superoxide dismutase) were present in higher concentration in space-exposed spores when compared to controls. Also, the first-generation cells and spores derived from space-exposed samples exhibited elevated UVC resistance when compared with their ground control counterparts. 
The data generated are important for calculating the probability and mechanisms of microbial survival in space conditions and assessing microbial contaminants as risks for forward contamination and in situ life detection.
Madonna, G.S.; Ledney, G.D.; Moore, M.M.
Compromise of antimicrobial defenses by irradiation can result in sepsis and death, and additional trauma can further predispose patients to infection and thus increase mortality. We recently showed that injection of synthetic trehalose dicorynomycolate (S-TDCM) significantly augments resistance to infection and increases survival of mice compromised either by whole-body irradiation with gamma radiation or by equal mixtures of fission-neutron and gamma radiation. In this study, C3H/HeN mice were given a lethal dose of gamma radiation (8.0 Gy) and, 1 hr later while anesthetized, an open wound (15% total body surface area, TBSA). Irradiated/wounded mice became more severely leukopenic and thrombocytopenic than mice exposed to radiation alone, and died from natural wound infection and sepsis within 7 days. S-TDCM given 1 hr postirradiation increased survival of mice exposed to radiation alone; however, this treatment did not increase survival of the irradiated/wounded mice. Systemic antibiotic therapy with gentamicin or ofloxacin for 10 days significantly increased survival time compared with untreated irradiated/wounded mice (p less than 0.01). Combination therapy with topical gentamicin cream and systemic oxacillin increased survival from 0% to 100%. Treatment with S-TDCM combined with the suboptimal treatment of topical and systemic gentamicin increased survival compared with antibiotic treatment alone. These studies demonstrate that post-trauma therapy with S-TDCM and antibiotics augments resistance to infection in immunocompromised mice. The data suggest that therapies which combine stimulation of nonspecific host defense mechanisms with antibiotics may increase survival of irradiated patients inflicted with accidental or surgical trauma.
Elshaikh, Mohamed A; Ruterbusch, Julie; Cote, Michele L; Cattaneo, Richard; Munkarah, Adnan R
2013-11-01
To study the prognostic impact of the baby boomer (BB) generation on survival end-points of patients with early-stage endometrial carcinoma (EC). Data were obtained from the SEER registry for 1988-2009. Inclusion criteria comprised women who underwent hysterectomy for stage I-II EC. Patients were divided into two birth cohorts: baby boomers (BB; born between 1946 and 1964) and pre-boomers (PB; born between 1926 and 1945). A total of 30,956 patients were analyzed. Because women in the PB group were older than those of the BB generation, the statistical analysis was limited to women 50-59 years of age at the time of diagnosis (n=11,473). Baby boomers had a significantly higher percentage of endometrioid histology (p<0.0001), a higher percentage of African American women (p<0.0001), lower tumor grade (p<0.0001), a higher number of dissected lymph nodes (LN) (p<0.0001), and less utilization of adjuvant radiation therapy (p=0.0003). Overall survival was improved for women in the BB generation compared to the PB generation (p=0.0003), with a trend toward improved uterine cancer-specific survival (p=0.0752). On multivariate analysis, birth cohort (BB vs. PB) was not a significant predictor of survival end-points. Factors predictive of survival included tumor grade, FIGO stage, African-American race, and increased number of dissected LN. Our study suggests that the survival of BB women 50-59 years of age is better than that of women in the PB generation. As more BB patients are diagnosed with EC, further research is warranted.
2011-01-01
Background Generalisability of longitudinal studies is threatened by issues such as choice of sampling frame, representativeness of the initial sample, and attrition. To determine representativeness, cohorts are often compared with the population of interest at baseline on demographic and health characteristics. This study illustrates the use of relative survival as a tool for assessing generalisability of results from a cohort of older people among whom death is a potential threat to generalisability. Methods The authors used data from the 1921-26 cohort (n = 12,416, aged 70-75 in 1996) of the Australian Longitudinal Study on Women's Health (ALSWH). Vital status was determined by linkage to the National Death Index, and expected deaths were derived using Australian life tables. Relative survival was estimated using observed survival in the cohort divided by expected survival among women of the same age and State or Territory. Results Overall, the ALSWH women showed relative survival 9.5% above the general population. Within States and Territories, the relative survival advantage varied from 6% to 23%. The interval-specific relative survival remained relatively constant over the 12 years (1996-2008) under review, indicating that the survival advantage of the cohort has not diminished over time. Conclusion This study demonstrates that relative survival can be a useful measure of generalisability in a longitudinal study of the health of the general population, particularly when participants are older. PMID:21294918
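Relative survival as used here is simply observed cohort survival divided by the expected survival of a matched general population derived from life tables; cumulated over follow-up intervals it can be sketched as follows (the interval probabilities below are made up, not ALSWH data):

```python
def relative_survival(observed, expected):
    """Cumulative relative survival: the product over follow-up
    intervals of observed / expected interval survival, where the
    expected values come from population life tables matched on
    age, sex, and region."""
    rs = 1.0
    for obs, exp in zip(observed, expected):
        rs *= obs / exp
    return rs

# Made-up interval survival probabilities for a cohort vs. the matched
# general population; a ratio above 1 is the kind of survival advantage
# the ALSWH cohort showed.
advantage = relative_survival([0.95, 0.94], [0.90, 0.90])
```

A roughly constant interval-specific ratio over time, as reported above, indicates that the cohort's advantage is not eroding with follow-up.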
2009-01-01
Background During the late 1990s the chance of surviving breast cancer increased. Changes in survival functions reflect a mixture of effects: both the introduction of adjuvant treatments and early screening with mammography played a role in the decline in mortality. Evaluating the contribution of these interventions using mathematical models requires survival functions before and after their introduction. Furthermore, the required survival functions may differ by age group and are related to disease stage at diagnosis. Sometimes detailed information is not available, as was the case for the region of Catalonia (Spain); one may then derive the functions using information from other geographical areas. This work presents the methodology used to estimate age- and stage-specific Catalan breast cancer survival functions from scarce Catalan survival data by adapting the age- and stage-specific US functions. Methods Cubic splines were used to smooth the data and obtain continuous hazard rate functions. We then fitted a Poisson model, with time as a covariate, to derive hazard ratios, which were applied to US survival functions detailed by age and stage to obtain the Catalan estimates. Results We first estimated the hazard ratios for Catalonia versus the USA before and after the introduction of screening. The hazard ratios were then multiplied by the age- and stage-specific breast cancer hazard rates from the USA to obtain the Catalan hazard rates. We also compared breast cancer survival in Catalonia and the USA in two time periods: before cancer control interventions (USA 1975–79, Catalonia 1980–89) and after (USA and Catalonia 1990–2001). Survival in Catalonia in the 1980–89 period was worse than in the USA during 1975–79, but the differences disappeared in 1990–2001. Conclusion Our results suggest that access to better treatments and quality of care contributed to large improvements in survival in Catalonia.
On the other hand, we obtained detailed breast cancer survival functions that will be used for modeling the effect of screening and adjuvant treatments in Catalonia. PMID:19331670
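The core step, multiplying reference hazard rates by an estimated hazard ratio to obtain target-population survival, can be sketched with piecewise-constant hazards (the rates and ratio below are illustrative, not the paper's estimates). Under proportional hazards this is equivalent to raising the reference survival function to the power of the hazard ratio:

```python
import math

def survival_from_hazards(hazards):
    """Survival curve from piecewise-constant annual hazard rates:
    S(t) = exp(-cumulative hazard up to t)."""
    s, curve = 1.0, []
    for h in hazards:
        s *= math.exp(-h)
        curve.append(s)
    return curve

def scale_hazards(ref_hazards, hazard_ratio):
    """Apply an estimated hazard ratio to reference hazard rates,
    as in deriving Catalan rates from the US ones."""
    return [h * hazard_ratio for h in ref_hazards]

# Illustrative (made-up) US annual hazards and a hazard ratio of 1.2
# for the target population vs. the USA.
us = [0.05, 0.04, 0.03]
catalan_curve = survival_from_hazards(scale_hazards(us, 1.2))
```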
Management of metastatic renal cell carcinoma in the era of targeted therapies.
Webber, K; Cooper, A; Kleiven, H; Yip, D; Goldstein, D
2011-08-01
Metastatic renal cell cancer is associated with poor prognosis and survival and is resistant to conventional chemotherapy. Therapeutic targeting of molecular pathways for tumour angiogenesis and other specific activation mechanisms offers improved tumour response and prolonged survival. To conduct a retrospective audit of metastatic renal cell carcinoma patients treated with targeted therapies. Data were extracted from clinical records of patients undergoing targeted treatment between 2005 and 2009 at two hospital sites. Data collected included pathology, systemic therapy class, toxicity and survival. Univariate and multivariate survival analyses were performed. Sixty-one patients were treated with 102 lines of therapy with a median overall survival (OS) of 23 months, median time to failure of first-line treatment (TTF1) of 10 months and median time to failure of second-line treatment (TTF2) of 5.2 months. Time from first diagnosis to treatment >12 months was significantly associated with improved OS, longer TTF1, TTF2 and response to first-line anti-vascular endothelial growth factor receptor tyrosine kinase inhibitors (anti-VEGF TKI) therapy. Variables associated with tumour biology, natural history and the systemic inflammatory response were associated with improved OS and TTF1. Development of hypertension was predictive of anti-VEGF TKI outcome. Toxicities were as expected for each drug class. Survival and toxicity outcomes from two Australian sites are comparable to published data. The adverse event profile differs to conventional chemotherapy. Clinicians caring for patients with metastatic renal cancer will need to become familiar with these toxicities and their management as these agents enter widespread use. © 2011 The Authors. Internal Medicine Journal © 2011 Royal Australasian College of Physicians.
Mao, Xiao-Nan; Lu, Zai-Ming; Wen, Feng; Liang, Hong-Yuan; Guo, Qi-Yong
2017-01-01
This study explored the effect of the implant position of stents across the Vater's ampulla on treatment outcomes in patients with lower bile duct obstruction. In this retrospective study, 41 patients with malignant obstruction of the lower bile duct and obstructive jaundice received percutaneous transhepatic biliary placement of bare-metal stents. Basic demographic data on patients, such as sex, age, and primary diseases, and follow-up data, including postoperative complications and jaundice-free survival, were recorded. The follow-up data on patients with an involved ampulla, patients with an uninvolved ampulla, patients with a stent across the ampulla, and patients with a stent at a site other than the ampulla were compared. Furthermore, prognostic factors for jaundice-free survival were investigated using Cox proportional hazards regression analysis. Among the 41 patients, jaundice subsided in 38, whereas stent patency was not achieved in 3. Whether or not the ampulla was involved did not influence the incidence rates of postoperative complications or the jaundice-free survival time. Notably, when stents were placed across the ampulla, the jaundice-free survival time was significantly longer than when stents were placed at other sites (P < .05). Furthermore, placement of the stent across the ampulla versus at other sites was an independent prognostic factor (hazard ratio = 0.154, 95% confidence interval 0.042–0.560, P = .005) for the jaundice-free survival of patients. The current study revealed that implanting a stent across the ampulla resulted in maintenance of stent patency and prolongation of the jaundice-free survival time. PMID:29137005
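Hazard ratios with confidence intervals, such as the one reported above, come from exponentiating a Cox model coefficient and its normal-approximation limits. The sketch below illustrates only that arithmetic; the coefficient and standard error are hypothetical values chosen to land near the reported HR of 0.154, not numbers from the paper.

```python
import math

def hazard_ratio_ci(beta: float, se: float, z: float = 1.96):
    """Turn a Cox log-hazard coefficient and its standard error
    into a hazard ratio with a (default 95%) confidence interval."""
    hr = math.exp(beta)
    lower = math.exp(beta - z * se)
    upper = math.exp(beta + z * se)
    return hr, lower, upper

# Hypothetical inputs: beta = -1.87, se = 0.66 give an HR near 0.15
# with a wide interval, similar in shape to the result above.
hr, lo, hi = hazard_ratio_ci(-1.87, 0.66)
print(f"HR = {hr:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```

A CI whose lower bound stays below 1 (or, as here, whose whole range stays below 1) is what drives the reported significance level.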
Jernigan, Amelia M; Mahdi, Haider; Rose, Peter G
2015-09-01
To estimate the frequency of hereditary breast and ovarian cancer (HBOC) in women with central nervous system (CNS) metastasis from epithelial ovarian cancer (EOC) and to evaluate a potential relationship between HBOC status and survival. A total of 1240 cases of EOC treated between 1995 and 2014 were reviewed to identify CNS metastasis. Demographics, treatment, family history, genetic testing, and survival outcomes were recorded. Women were then classified as HBOC+ or HBOC- based on histories and genetic testing results. Kaplan-Meier survival curves and univariable Cox proportional hazards models were used. Of 1240 cases, 32 cases of EOC with CNS metastasis were identified (2.58%). Median age was 52.13 (95% confidence interval [CI], 40.56-78.38) years, and 87.10% had stage III to IV disease. Among those with documented personal and family history, 66.7% (20/30) were suspicious for HBOC syndrome. Among those who underwent germline testing, 71.43% (5/7) had a pathogenic BRCA mutation. The median time from diagnosis to CNS metastasis was 29.17 (95% CI, 0-187.91) months. At a median survival of 5.97 (95% CI, 0.20-116.95) months from the time of CNS metastasis and 43.76 (95% CI, 1.54-188.44) months from the time of EOC diagnosis, 29 women died of disease. Univariable Cox proportional hazards models comparing HBOC- with HBOC+ women did not reveal a significant difference in survival outcomes. Confirmed BRCA mutations and histories concerning for HBOC syndrome are common in women with EOC metastatic to the CNS. We did not demonstrate a relationship between HBOC status and survival outcomes, but were not powered to do so.
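Kaplan-Meier curves like those used in the study above are built from the product-limit estimator: at each event time, survival is multiplied by (1 - deaths / number at risk), with censored subjects leaving the risk set without contributing a factor. This is a minimal pure-Python sketch on invented toy data, not data from any study in this list.

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.
    times: follow-up times; events: 1 = death observed, 0 = censored.
    Returns a list of (time, S(t)) steps at each event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)
        n_with_t = sum(1 for tt, _ in data if tt == t)
        if deaths:
            s *= 1 - deaths / n_at_risk
            curve.append((t, s))
        n_at_risk -= n_with_t  # deaths and censorings both leave the risk set
        i += n_with_t
    return curve

def median_survival(curve):
    """First time the survival estimate drops to 0.5 or below."""
    for t, s in curve:
        if s <= 0.5:
            return t
    return None  # median not reached

# Toy follow-up data in months (invented for illustration)
times  = [2, 4, 4, 6, 8, 10, 12, 12, 15, 20]
events = [1, 1, 0, 1, 1,  0,  1,  1,  0,  1]
curve = kaplan_meier(times, events)
print(median_survival(curve))  # → 12
```

A log-rank test or Cox model would then be layered on top of curves like this to compare groups, as the abstracts here describe.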
Fenske, Timothy S.; Zhang, Mei-Jie; Carreras, Jeanette; Ayala, Ernesto; Burns, Linda J.; Cashen, Amanda; Costa, Luciano J.; Freytes, César O.; Gale, Robert P.; Hamadani, Mehdi; Holmberg, Leona A.; Inwards, David J.; Lazarus, Hillard M.; Maziarz, Richard T.; Munker, Reinhold; Perales, Miguel-Angel; Rizzieri, David A.; Schouten, Harry C.; Smith, Sonali M.; Waller, Edmund K.; Wirk, Baldeep M.; Laport, Ginna G.; Maloney, David G.; Montoto, Silvia; Hari, Parameswaran N.
2014-01-01
Purpose To examine the outcomes of patients with chemotherapy-sensitive mantle-cell lymphoma (MCL) following a first hematopoietic stem-cell transplantation (HCT), comparing outcomes with autologous (auto) versus reduced-intensity conditioning allogeneic (RIC allo) HCT and with transplantation applied at different times in the disease course. Patients and Methods In all, 519 patients who received transplantations between 1996 and 2007 and were reported to the Center for International Blood and Marrow Transplant Research were analyzed. The early transplantation cohort was defined as those patients in first partial or complete remission with no more than two lines of chemotherapy. The late transplantation cohort was defined as all the remaining patients. Results Auto-HCT and RIC allo-HCT resulted in similar overall survival from transplantation for both the early (at 5 years: 61% auto-HCT v 62% RIC allo-HCT; P = .951) and late cohorts (at 5 years: 44% auto-HCT v 31% RIC allo-HCT; P = .202). In both early and late transplantation cohorts, progression/relapse was lower and nonrelapse mortality was higher in the allo-HCT group. Overall survival and progression-free survival were highest in patients who underwent auto-HCT in first complete response. Multivariate analysis of survival from diagnosis identified a survival benefit favoring early HCT for both auto-HCT and RIC allo-HCT. Conclusion For patients with chemotherapy-sensitive MCL, the optimal timing for HCT is early in the disease course. Outcomes are particularly favorable for patients undergoing auto-HCT in first complete remission. For those unable to achieve complete remission after two lines of chemotherapy or those with relapsed disease, either auto-HCT or RIC allo-HCT may be effective, although the chance for long-term remission and survival is lower. PMID:24344210
A second chance at life: people's lived experiences of surviving out-of-hospital cardiac arrest.
Forslund, Ann-Sofie; Jansson, Jan-Håkan; Lundblad, Dan; Söderberg, Siv
2017-12-01
There is more to illuminate about people's experiences of surviving out-of-hospital cardiac arrest (OHCA) and how such an event affects their lives over time. This study aimed to elucidate meanings of people's lived experiences and changes in everyday life during their first year after surviving OHCA. A qualitative, longitudinal design was used. Eleven people surviving OHCA from northern Sweden agreed to participate and were interviewed 6 and 12 months after the event. A phenomenological hermeneutic interpretation was used to analyse the transcribed texts. The structural analysis resulted in two themes: (i) striving to regain one's usual self and (ii) a second chance at life, with subthemes (ia) testing the body, (ib) pursuing the ordinary life, (ic) gratitude for help to survival, (iia) regaining a sense of security with one's body, (iib) getting to know a new self, and (iic) seeking meaning and establishing a future. To conclude, we suggest that people experienced surviving OHCA over time as striving to regain their usual self and getting a second chance at life. The event affected them in many ways, stirring many emotions and giving them much to reflect on. Participants' emotions shifted back and forth as they compared their present lives both to their lives before the cardiac arrest and to the lives they had planned for the future. During their first year, participants' daily lives were still influenced by 'being dead' and returning to life. As time passed, they wanted to resume their ordinary lives and hoped for continued lives filled with meaning and joyous activities. © 2017 Nordic College of Caring Science.
Chiang, Jui-Kun; Lin, Chih-Wen; Kao, Yee-Hsin
2017-01-01
Objective Liver cancer is a growing global public health problem. Ultrasonography is an imaging tool widely used for the early diagnosis of liver cancer. However, the effect of ultrasonography surveillance (US) on the survival of patients with liver cancer is unknown. Therefore, this study examined the association between survival and US frequency during the 2 years preceding patients’ liver cancer diagnosis. Methods This population-based longitudinal study was conducted in Taiwan, a region with high liver cancer incidence, by using the National Health Insurance Research Database. We compared survival between patients who received US three times or more (≥3 group) and less than three times (<3 group) during the 2 years preceding their liver cancer diagnosis, and identified the predictors for the ≥3 group. Results This study enrolled 4621 patients with liver cancer who had died between 1997 and 2010. The median survival time was longer in the ≥3 group (1.42 years) than in the <3 group (0.51 years). Five-year survival probability was also significantly higher in the ≥3 group (14.4%) than in the <3 group (7.7%). The multivariate logistic regression results showed that the three most common positive predictors for receiving three or more US sessions were indications of viral hepatitis, gallbladder diseases and kidney–urinary–bladder diseases; the most common negative predictors were male sex and indications of abdominal pain. Conclusion Patients with liver cancer who received US three times or more during the 2 years preceding their liver cancer diagnosis exhibited a higher 5-year survival probability. PMID:28645973
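Fixed-horizon survival probabilities, such as the 5-year figures above, are often computed with the actuarial life-table method, which multiplies per-interval survival factors and treats subjects withdrawn during an interval as at risk for half of it. The interval counts below are hypothetical, chosen only to show the mechanics.

```python
def life_table_survival(intervals):
    """Actuarial (life-table) cumulative survival.
    intervals: list of (n_entering, deaths, withdrawals) per interval.
    Withdrawals are assumed at risk for half the interval."""
    s = 1.0
    out = []
    for n, d, w in intervals:
        effective = n - w / 2.0       # effective number at risk
        s *= 1 - d / effective        # per-interval survival factor
        out.append(s)
    return out

# Hypothetical yearly intervals for a cohort of 100 patients
intervals = [
    (100, 30, 10),  # year 1: 30 deaths, 10 withdrawn alive
    (60, 15, 5),    # year 2
    (40, 8, 4),     # year 3
    (28, 5, 3),     # year 4
    (20, 3, 2),     # year 5
]
surv = life_table_survival(intervals)
print(f"5-year survival: {surv[-1]:.1%}")
```

Each interval's entering count equals the previous count minus its deaths and withdrawals, which is a quick consistency check when transcribing a published life table.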
Bunnapradist, Suphamai; Gritsch, H Albin; Peng, Alice; Jordan, Stanley C; Cho, Yong W
2003-04-01
The current organ shortage has led to the utilization of double kidney transplants from marginal adult donors, but outcomes data are limited. The United Network for Organ Sharing registry database was used to compare the outcomes of 403 dual adult kidney transplantations (DKT) and 11,033 single kidney transplantations (SKT) from 1997 to 2000. Graft and patient survival and the effect of multiple risk factors were evaluated. DKT patients were older, less sensitized, and received grafts from older, more mismatched donors with longer cold ischemia times. There was also a greater percentage of donors with a history of diabetes or hypertension, and of African-American recipients and donors, in the DKT group. Graft survival was inferior in the DKT group, with a 7% lower graft survival rate at 1 yr. There was a higher incidence of primary nonfunction in the DKT group, although the incidence of delayed graft function, early rejection treatment, and graft thrombosis did not differ. Multivariate analysis identified African-American recipient ethnicity and retransplantation as risk factors for graft loss. Overall, DKT resulted in inferior graft outcomes compared with SKT; however, when compared with SKT from donors over 55 yr of age, DKT resulted in similar graft outcomes. Kidneys from these marginal donors, which would otherwise be discarded, should be cautiously considered for transplantation.
Balamuthusamy, Saravanan; Paramesh, Anil; Zhang, Rubin; Florman, Sander; Shenava, Rajesh; Islam, Tareq; Wagner, Janis; Killackey, Mary; Alper, Brent; Simon, Eric E; Slakey, Douglas
2009-01-01
There are insufficient data on the impact of recipient body mass index (BMI) on the long-term graft survival of adult patients transplanted with single pediatric kidneys. We performed a retrospective analysis of adult patients transplanted with single pediatric kidneys at our center. The recipients were classified into 2 groups: group 1 (BMI ≥30) and group 2 (BMI <30). Donor/recipient demographics, postoperative outcomes and survival rates were compared between the 2 groups. There was no significant difference in donor/recipient demographics between the 2 groups. In group 1, the death-censored graft survival (DCGS) at 1, 3 and 5 years was 90% at all 3 time points, while in group 2 it was 86, 68 and 60%, respectively (p = 0.05). The mean glomerular filtration rate (standard deviation in parentheses) at 1, 3 and 5 years was, respectively, 55 (15), 59 (19) and 55 (28) ml/min for group 1, compared to 65 (28), 69 (23) and 67 (20) ml/min in group 2 (p = NS). Multivariate analysis revealed a hazard ratio of 5.12 (95% confidence interval 1.06-24.7; p = 0.04) for graft loss in nonobese patients when compared to obese patients. Obese patients had an increased risk of acute rejection within the first month of transplant (p = 0.02). Patients with a BMI ≥30 transplanted with single pediatric kidneys have better DCGS rates than nonobese patients. Copyright (c) 2008 S. Karger AG, Basel.
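Comparisons of survival curves between two groups, like the DCGS comparison above (p = 0.05), are conventionally made with the log-rank test: at each event time the observed deaths in one group are compared with the number expected under equal hazards, and the standardized squared difference is summed. Below is a minimal pure-Python sketch of the two-sample log-rank chi-square statistic on invented data, not the study's.

```python
def logrank_chi2(times1, events1, times2, events2):
    """Two-sample log-rank chi-square statistic (1 degree of freedom).
    times*: follow-up times; events*: 1 = event observed, 0 = censored."""
    pooled = [(t, e, 1) for t, e in zip(times1, events1)] + \
             [(t, e, 2) for t, e in zip(times2, events2)]
    event_times = sorted({t for t, e, _ in pooled if e == 1})
    O1 = E1 = V = 0.0
    for t in event_times:
        n1 = sum(1 for tt, _, g in pooled if g == 1 and tt >= t)  # at risk, group 1
        n2 = sum(1 for tt, _, g in pooled if g == 2 and tt >= t)  # at risk, group 2
        d1 = sum(1 for tt, e, g in pooled if g == 1 and tt == t and e == 1)
        d2 = sum(1 for tt, e, g in pooled if g == 2 and tt == t and e == 1)
        n, d = n1 + n2, d1 + d2
        O1 += d1                       # observed events in group 1
        E1 += d * n1 / n               # expected under equal hazards
        if n > 1:
            V += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)
    return (O1 - E1) ** 2 / V

# Hypothetical follow-up times (years) and event flags for two groups
chi2 = logrank_chi2([1, 2, 4, 5], [1, 1, 1, 0], [3, 5, 6, 8], [1, 0, 1, 1])
print(f"log-rank chi-square: {chi2:.2f}")
```

The statistic is compared against a chi-square distribution with 1 degree of freedom (values above about 3.84 correspond to p < 0.05).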