Sample records for death-censored graft failure

  1. Post-Transplant Hypophosphatemia and the Risk of Death-Censored Graft Failure and Mortality after Kidney Transplantation.

    PubMed

    van Londen, Marco; Aarts, Brigitte M; Deetman, Petronella E; van der Weijden, Jessica; Eisenga, Michele F; Navis, Gerjan; Bakker, Stephan J L; de Borst, Martin H

    2017-08-07

    Hypophosphatemia is common in the first year after kidney transplantation, but its clinical implications are unclear. We investigated the relationship between the severity of post-transplant hypophosphatemia and mortality or death-censored graft failure in a large cohort of renal transplant recipients with long-term follow-up. We performed a longitudinal cohort study in 957 renal transplant recipients who were transplanted between 1993 and 2008 at a single center. We used a large real-life dataset containing 28,178 phosphate measurements (median of 27; first to third quartiles, 23-34 serial measurements per patient) and selected the lowest intraindividual phosphate level during the first year after transplantation. The primary outcomes were all-cause mortality, cardiovascular mortality, and death-censored graft failure. The median (interquartile range) intraindividual lowest phosphate level was 1.58 (1.30-1.95) mg/dl, and it was reached at 33 (21-51) days post-transplant. eGFR was the main correlate of the lowest serum phosphate level (model R² = 0.32). During 9 (5-12) years of follow-up, 181 (19%) patients developed graft failure, and 295 (35%) patients died, of which 94 (32%) deaths were due to cardiovascular disease. In multivariable Cox regression analysis, more severe hypophosphatemia was associated with a lower risk of death-censored graft failure (fully adjusted hazard ratio, 0.61; 95% confidence interval, 0.43 to 0.88 per 1 mg/dl lower serum phosphate) and cardiovascular mortality (fully adjusted hazard ratio, 0.37; 95% confidence interval, 0.22 to 0.62) but not noncardiovascular mortality (fully adjusted hazard ratio, 1.33; 95% confidence interval, 0.9 to 1.96) or all-cause mortality (fully adjusted hazard ratio, 1.15; 95% confidence interval, 0.81 to 1.61). Post-transplant hypophosphatemia develops early after transplantation. These data connect post-transplant hypophosphatemia with favorable long-term graft and patient outcomes.
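
    Concretely, the death-censored outcome used in this record (and in several below) treats death with a functioning graft as a censoring event rather than a failure. A minimal, hypothetical sketch (field names are illustrative, not taken from the study):

```python
# Hypothetical sketch of how a death-censored graft-failure outcome is
# coded from raw follow-up data; names are illustrative, not the study's.

def death_censored_outcome(followup_years, graft_failed, died):
    """Return (time, event) for death-censored graft failure.

    Graft loss is the event of interest; a patient who dies with a
    functioning graft is censored at the time of death rather than
    counted as a failure (the `died` flag never creates an event).
    """
    if graft_failed:
        return followup_years, 1  # event: graft loss
    return followup_years, 0      # censored: alive, lost to follow-up, or died

# A patient who died at 6.2 years with a functioning graft contributes
# a censored observation (event = 0), not a graft failure.
print(death_censored_outcome(6.2, graft_failed=False, died=True))  # (6.2, 0)
```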

  2. Overall Graft Loss Versus Death-Censored Graft Loss: Unmasking the Magnitude of Racial Disparities in Outcomes Among US Kidney Transplant Recipients.

    PubMed

    Taber, David J; Gebregziabher, Mulugeta; Payne, Elizabeth H; Srinivas, Titte; Baliga, Prabhakar K; Egede, Leonard E

    2017-02-01

    Black kidney transplant recipients experience disproportionately high rates of graft loss. This disparity has persisted for 40 years, and improvements may be impeded based on the current public reporting of overall graft loss by US regulatory organizations for transplantation. Longitudinal cohort study of kidney transplant recipients using a data set created by linking Veterans Affairs and US Renal Data System information, including 4918 veterans transplanted between January 2001 and December 2007, with follow-up through December 2010. Multivariable analysis was conducted using 2-stage joint modeling of random and fixed effects of longitudinal data (linear mixed model) with time to event outcomes (Cox regression). Three thousand three hundred six non-Hispanic whites (67%) were compared with 1612 non-Hispanic black (33%) recipients with 6.0 ± 2.2 years of follow-up. In the unadjusted analysis, black recipients were significantly more likely to have overall graft loss (hazard ratio [HR], 1.19; 95% confidence interval [95% CI], 1.07-1.33), death-censored graft loss (HR, 1.67; 95% CI, 1.45-1.92), and lower mortality (HR, 0.83; 95% CI, 0.72-0.96). In fully adjusted models, only death-censored graft loss remained significant (HR, 1.38; 95% CI, 1.12-1.71; overall graft loss [HR, 1.08; 95% CI, 0.91-1.28]; mortality [HR, 0.84; 95% CI, 0.67-1.06]). A composite definition of graft loss reduced the magnitude of disparities in blacks by 22%. Non-Hispanic black kidney transplant recipients experience a substantial disparity in graft loss, but not mortality. This study of US data provides evidence to suggest that researchers should focus on using death-censored graft loss as the primary outcome of interest to facilitate a better understanding of racial disparities in kidney transplantation.
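
    The record's key contrast is between overall graft loss (a composite of graft failure and death with function) and death-censored graft loss (failure only). A toy sketch with synthetic counts, purely to illustrate how the two definitions diverge:

```python
# Illustrative sketch with made-up outcomes (not the study's data): the
# same cohort yields different event counts under the two definitions.

patients = [
    # (graft_failed, died_with_functioning_graft)
    (True, False),   # graft loss
    (False, True),   # death with a functioning graft
    (False, False),  # alive with a functioning graft
    (True, False),   # graft loss
]

# Composite definition: failure OR death with function counts as loss.
overall_loss = sum(gf or died for gf, died in patients)
# Death-censored definition: only graft failure counts.
death_censored_loss = sum(gf for gf, died in patients)

print(overall_loss, death_censored_loss)  # 3 2
```

Because deaths with function enter only the composite, a group with lower mortality but higher failure rates looks less disadvantaged under overall graft loss, which is the masking effect the authors describe.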

  3. High Risk of Graft Failure in Emerging Adult Heart Transplant Recipients.

    PubMed

    Foster, B J; Dahhou, M; Zhang, X; Dharnidharka, V; Ng, V; Conway, J

    2015-12-01

    Emerging adulthood (17-24 years) is a period of high risk for graft failure in kidney transplant. Whether a similar association exists in heart transplant recipients is unknown. We sought to estimate the relative hazards of graft failure at different current ages, compared with patients between 20 and 24 years old. We evaluated 11 473 patients recorded in the Scientific Registry of Transplant Recipients who received a first transplant at <40 years old (1988-2013) and had at least 6 months of graft function. Time-dependent Cox models were used to estimate the association between current age (time-dependent) and failure risk, adjusted for time since transplant and other potential confounders. Failure was defined as death following graft failure or retransplant; observation was censored at death with graft function. There were 2567 failures. Crude age-specific graft failure rates were highest in 21-24 year olds (4.2 per 100 person-years). Compared to individuals with the same time since transplant, 21-24 year olds had significantly higher failure rates than all other age periods except 17-20 years (HR 0.92 [95%CI 0.77, 1.09]) and 25-29 years (0.86 [0.73, 1.03]). Among young first heart transplant recipients, graft failure risks are highest in the period from 17 to 29 years of age. © Copyright 2015 The American Society of Transplantation and the American Society of Transplant Surgeons.
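
    The time-dependent Cox model described here treats current age as a covariate that changes over follow-up. One common way to implement this is episode splitting into a counting-process (start-stop) format; the sketch below is a hypothetical illustration with made-up age bands, not the authors' code:

```python
# Hypothetical episode-splitting sketch for a time-dependent "current age"
# covariate: follow-up is cut at each age-band boundary so every row has a
# single, constant age band (counting-process / start-stop format).

AGE_BANDS = [0, 17, 21, 25, 30, 40]  # band lower edges in years (illustrative)

def split_by_current_age(age_at_tx, followup_years):
    """Yield (start, stop, band_lower) rows in time-since-transplant units."""
    rows = []
    t = 0.0
    while t < followup_years:
        age = age_at_tx + t
        # upper edge of the band the patient is currently in
        upper = next((b for b in AGE_BANDS if b > age), float("inf"))
        stop = min(followup_years, upper - age_at_tx)
        lower = max(b for b in AGE_BANDS if b <= age)
        rows.append((t, stop, lower))
        t = stop
    return rows

# A recipient transplanted at age 19 and followed for 8 years crosses the
# 21- and 25-year boundaries, producing three start-stop rows:
print(split_by_current_age(19, 8))  # [(0.0, 2, 17), (2, 6, 21), (6, 8, 25)]
```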

  4. Early statin use is an independent predictor of long-term graft survival.

    PubMed

    Moreso, Francesc; Calvo, Natividad; Pascual, Julio; Anaya, Fernando; Jiménez, Carlos; Del Castillo, Domingo; Sánchez-Plumed, Jaime; Serón, Daniel

    2010-06-01

    Background. Statin use in renal transplantation has been associated with a lower risk of patient death but not with an improvement of graft functional survival. The aim of this study is to evaluate the effect of statin use in graft survival, death-censored graft survival and patient survival using the data recorded on the Spanish Late Allograft Dysfunction Study Group. Patients and methods. Patients receiving a renal allograft in Spain in 1990, 1994, 1998 and 2002 were considered. Since the mean follow-up in the 2002 cohort was 3 years, statin use was analysed considering its introduction during the first year or during the initial 2 years after transplantation. Univariate and multivariate Cox regression analyses with a propensity score for statin use were employed to analyse graft survival, death-censored graft survival and patient survival. Results. In the 4682 evaluated patients, the early statin use after transplantation significantly increased from 1990 to 2002 (12.7%, 27.9%, 47.7% and 53.0%, P < 0.001). Statin use during the first year was not associated with graft or patient survival. Statin use during the initial 2 years was associated with a lower risk of graft failure (relative risk [RR] = 0.741 and 95% confidence interval [CI] = 0.635-0.866, P < 0.001) and patient death (RR = 0.806 and 95% CI = 0.656-0.989, P = 0.039). Death-censored graft survival was not associated with statin use during the initial 2 years. Conclusion. The early introduction of statin treatment after transplantation is associated with a significant decrease in late graft failure due to a risk reduction in patient death.

  5. Early statin use is an independent predictor of long-term graft survival

    PubMed Central

    Moreso, Francesc; Calvo, Natividad; Pascual, Julio; Anaya, Fernando; Jiménez, Carlos; del Castillo, Domingo; Sánchez-Plumed, Jaime; Serón, Daniel

    2010-01-01

    Background. Statin use in renal transplantation has been associated with a lower risk of patient death but not with an improvement of graft functional survival. The aim of this study is to evaluate the effect of statin use in graft survival, death-censored graft survival and patient survival using the data recorded on the Spanish Late Allograft Dysfunction Study Group. Patients and methods. Patients receiving a renal allograft in Spain in 1990, 1994, 1998 and 2002 were considered. Since the mean follow-up in the 2002 cohort was 3 years, statin use was analysed considering its introduction during the first year or during the initial 2 years after transplantation. Univariate and multivariate Cox regression analyses with a propensity score for statin use were employed to analyse graft survival, death-censored graft survival and patient survival. Results. In the 4682 evaluated patients, the early statin use after transplantation significantly increased from 1990 to 2002 (12.7%, 27.9%, 47.7% and 53.0%, P < 0.001). Statin use during the first year was not associated with graft or patient survival. Statin use during the initial 2 years was associated with a lower risk of graft failure (relative risk [RR] = 0.741 and 95% confidence interval [CI] = 0.635–0.866, P < 0.001) and patient death (RR = 0.806 and 95% CI = 0.656–0.989, P = 0.039). Death-censored graft survival was not associated with statin use during the initial 2 years. Conclusion. The early introduction of statin treatment after transplantation is associated with a significant decrease in late graft failure due to a risk reduction in patient death. PMID:20508861
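
    Both versions of this record adjust for statin use with a propensity score, i.e. the estimated probability of receiving a statin given baseline covariates. The sketch below is purely illustrative: the logistic coefficients are invented, whereas in the study the model would be fit to the cohort and the score entered into the Cox analyses:

```python
import math

# Hypothetical propensity-score sketch for statin use. The coefficients
# are invented for illustration; in practice the logistic model is fit to
# the data and the score used as an adjustment covariate (or for weighting).

def propensity_statin(age, diabetes, year_of_tx):
    """P(statin use | covariates) under an assumed logistic model."""
    logit = -1.0 + 0.02 * (age - 50) + 0.6 * diabetes + 0.08 * (year_of_tx - 1990)
    return 1.0 / (1.0 + math.exp(-logit))

# Later transplant eras get higher scores, mirroring the rise in early
# statin use from 12.7% (1990) to 53.0% (2002) reported above.
print(propensity_statin(55, 1, 2002) > propensity_statin(55, 1, 1990))  # True
```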

  6. Regression analysis of case K interval-censored failure time data in the presence of informative censoring.

    PubMed

    Wang, Peijie; Zhao, Hui; Sun, Jianguo

    2016-12-01

    Interval-censored failure time data occur in many fields such as demography, economics, medical research, and reliability, and many inference procedures on them have been developed (Sun, 2006; Chen, Sun, and Peace, 2012). However, most of the existing approaches assume that the mechanism that yields interval censoring is independent of the failure time of interest, and it is clear that this may not be true in practice (Zhang et al., 2007; Ma, Hu, and Sun, 2015). In this article, we consider regression analysis of case K interval-censored failure time data when the censoring mechanism may be related to the failure time of interest. For the problem, an estimated sieve maximum-likelihood approach is proposed for data arising from the proportional hazards frailty model, and a two-step procedure is presented for estimation. In addition, the asymptotic properties of the proposed estimators of regression parameters are established, and an extensive simulation study suggests that the method works well. Finally, we apply the method to a set of real interval-censored data that motivated this study. © 2016, The International Biometric Society.
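
    Case K interval-censored data record, for each subject, only that the failure time lies between the last negative and the first positive examination. A minimal sketch of deriving that interval (the data layout is an assumption for illustration, not taken from the paper):

```python
# Sketch: turning a subject's examination history into a censoring
# interval (L, R] for case K interval-censored data. If every exam is
# negative, the observation is right-censored (R = infinity).

def censoring_interval(exam_times, exam_results):
    """exam_times sorted ascending; exam_results[i] True if the event
    had occurred by exam i. Returns (L, R) with the event in (L, R]."""
    left = 0.0
    for t, positive in zip(exam_times, exam_results):
        if positive:
            return (left, t)          # event occurred in (left, t]
        left = t
    return (left, float("inf"))       # right-censored after the last exam

# Event first detected at the third exam: it happened somewhere in (2.0, 3.5].
print(censoring_interval([1.0, 2.0, 3.5, 5.0], [False, False, True, True]))
# (2.0, 3.5)
```

Dependent censoring, the paper's focus, arises when the schedule of `exam_times` itself is related to the failure time, violating the independence assumed by standard methods.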

  7. Acute Rejection Increases Risk of Graft Failure and Death in Recent Liver Transplant Recipients.

    PubMed

    Levitsky, Josh; Goldberg, David; Smith, Abigail R; Mansfield, Sarah A; Gillespie, Brenda W; Merion, Robert M; Lok, Anna S F; Levy, Gary; Kulik, Laura; Abecassis, Michael; Shaked, Abraham

    2017-04-01

    Acute rejection is detrimental to most transplanted solid organs, but is considered to be less of a consequence for transplanted livers. We evaluated risk factors for and outcomes after biopsy-proven acute rejection (BPAR) based on an analysis of a more recent national sample of recipients of liver transplants from living and deceased donors. We analyzed data from the Adult-to-Adult Living Donor Liver Transplantation Cohort Study (A2ALL) from 2003 through 2014 as the exploratory cohort and the Scientific Registry of Transplant Recipients (SRTR) from 2005 through 2013 as the validation cohort. We examined factors associated with time to first BPAR using multivariable Cox regression or discrete-survival analysis. Competing risks methods were used to compare causes of death and graft failure between recipients of living and deceased donors. At least 1 BPAR episode occurred in 239 of 890 recipients in A2ALL (26.9%) and 7066 of 45,423 recipients in SRTR (15.6%). In each database, risk of rejection was significantly lower when livers came from biologically related living donors (A2ALL hazard ratio [HR], 0.57; 95% confidence interval [CI], 0.43-0.76; and SRTR HR, 0.78; 95% CI, 0.66-0.91) and higher in liver transplant recipients with primary biliary cirrhosis, of younger age, or with hepatitis C. In each database, BPAR was associated with significantly higher risks of graft failure and death. The risks were highest in the 12 month post-BPAR period in patients whose first episode occurred more than 1 year after liver transplantation: HRs for graft failure were 6.79 in A2ALL (95% CI, 2.64-17.45) and 4.41 in SRTR (95% CI, 3.71-5.23); HRs for death were 8.81 in A2ALL (95% CI, 3.37-23.04) and 3.94 in SRTR (95% CI, 3.22-4.83). In analyses of cause-specific mortality, associations were observed for liver-related (graft failure) causes of death but not for other causes. Contrary to previous data, acute rejection after liver transplant is associated with significantly increased risks of graft failure and death.

  8. Association of Pretransplant Skin Cancer With Posttransplant Malignancy, Graft Failure and Death in Kidney Transplant Recipients.

    PubMed

    Kang, Woosun; Sampaio, Marcelo Santos; Huang, Edmund; Bunnapradist, Suphamai

    2017-06-01

    Posttransplant malignancy (PTM) is one of the leading causes of late death in kidney recipients. Those with a cancer history may be more prone to develop a recurrent or a new cancer. We studied the association between pretransplant skin cancer, PTM, death, and graft failure. Primary adult kidney recipients transplanted between 2005 and 2013 were included. Malignancy information was obtained from Organ Procurement Kidney Transplant Network/United Network for Organ Sharing registration and follow-up forms. Posttransplant malignancy was classified into skin cancer, solid tumor, and posttransplant lymphoproliferative disorder (PTLD). Competing risk and survival analysis with adjustment for confounders were used to calculate risk for PTM, death and graft failure in recipients with pretransplant skin cancer compared with those without cancer. Risk was reported in hazard ratios (HR) with 95% confidence interval (CI). The cohort included 1671 recipients with and 102 961 without pretransplant skin malignancy. The 5-year cumulative incidence of PTM in patients with and without a pretransplant skin cancer history was 31.6% and 7.4%, respectively (P < 0.001). Recipients with pretransplant skin cancer had increased risk of PTM (sub-HR [SHR], 2.60; 95% CI, 2.27-2.98), and posttransplant skin cancer (SHR, 2.92; 95% CI, 2.52-3.39), PTLD (SHR, 1.93; 95% CI, 1.01-3.66), solid tumor (SHR, 1.44; 95% CI, 1.04-1.99), death (HR, 1.20; 95% CI, 1.07-1.34), and graft failure (HR, 1.17; 95% CI, 1.05-1.30) when compared with those without pretransplant malignancy. Pretransplant skin cancer was associated with an increased risk of posttransplant skin cancer, PTLD, solid organ cancer, death and graft failure.

  9. Association Between Delayed Graft Function and Graft Loss in Donation After Cardiac Death Kidney Transplants-A Paired Kidney Registry Analysis.

    PubMed

    Lim, Wai H; McDonald, Stephen P; Russ, Graeme R; Chapman, Jeremy R; Ma, Maggie Km; Pleass, Henry; Jaques, Bryon; Wong, Germaine

    2017-06-01

    Delayed graft function (DGF) is an established complication after donation after cardiac death (DCD) kidney transplants, but the impact of DGF on graft outcomes is uncertain. To minimize donor variability and bias, a paired donor kidney analysis was undertaken in which 1 kidney developed DGF and the other did not. Using paired DCD kidney data from the Australia and New Zealand Dialysis and Transplant Registry, we examined the association between DGF and graft and patient outcomes between 1994 and 2012 using adjusted Cox regression models. Of the 74 pairs of DCD kidneys followed for a median of 1.9 years (408 person-years), a greater proportion of recipients with DGF had experienced overall graft loss and death-censored graft loss at 3 years compared with those without DGF (14% vs 4%, P = 0.04 and 11% vs 0%, P < 0.01, respectively). Compared with recipients without DGF, the adjusted hazard ratio for overall graft loss at 3 years for recipients with DGF was 4.31 (95% confidence interval [95% CI], 1.13-16.44). The adjusted hazard ratios for acute rejection and all-cause mortality at 3 years in recipients who had experienced DGF were 0.98 (95% CI, 0.96-1.01) and 1.70 (95% CI, 0.36-7.93), respectively, compared with recipients without DGF. Recipients of DCD kidneys with DGF experienced a higher incidence of overall and death-censored graft loss compared with those without DGF. Strategies aimed at reducing the risk of DGF could potentially improve graft survival in DCD kidney transplants.

  10. Early renal function recovery and long-term graft survival in kidney transplantation.

    PubMed

    Wan, Susan S; Cantarovich, Marcelo; Mucsi, Istvan; Baran, Dana; Paraskevas, Steven; Tchervenkov, Jean

    2016-05-01

    Following kidney transplantation (KTx), renal function improves gradually until a baseline eGFR is achieved. Whether or not a recipient achieves the best-predicted eGFR after KTx may have important implications for immediate patient management, as well as for long-term graft survival. The aim of this cohort study was to calculate the renal function recovery (RFR) based on recipient and donor eGFR and to evaluate the association between RFR and long-term death-censored graft failure (DCGF). We studied 790 KTx recipients between January 1990 and August 2014. The last donor SCr prior to organ procurement was used to estimate donor GFR. Recipient eGFR was calculated using the average of the best three SCr values observed during the first 3 months post-KTx. RFR was defined as the ratio of recipient eGFR to half the donor eGFR. 53% of recipients had an RFR ≥1. There were 127 death-censored graft failures (16%). Recipients with an RFR ≥1 had less DCGF compared with those with an RFR <1 (HR 0.56; 95% CI 0.37-0.85; P = 0.006). Transplant era, acute rejection, ECD and DGF were also significant determinants of graft failure. Early recovery of predicted eGFR based on donor eGFR is associated with less DCGF after KTx. © 2016 Steunstichting ESOT.
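
    The renal function recovery (RFR) ratio defined in this record can be computed directly: recipient eGFR divided by half the donor eGFR, since the recipient receives one of the donor's two kidneys. A small sketch with illustrative values:

```python
# Sketch of the renal function recovery (RFR) ratio as defined in the
# abstract: recipient eGFR over half the donor eGFR. Values below are
# illustrative, not the study's data.

def rfr(recipient_egfr, donor_egfr):
    """RFR = recipient eGFR / (donor eGFR / 2); RFR >= 1 indicates full
    recovery of the predicted single-kidney function."""
    return recipient_egfr / (donor_egfr / 2.0)

# A recipient reaching eGFR 48 from a donor with eGFR 90 has RFR > 1,
# placing them in the lower-risk group for death-censored graft failure.
print(rfr(48, 90))  # ≈ 1.07
```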

  11. Outcome-Dependent Sampling with Interval-Censored Failure Time Data

    PubMed Central

    Zhou, Qingning; Cai, Jianwen; Zhou, Haibo

    2017-01-01

    Epidemiologic studies and disease prevention trials often seek to relate an exposure variable to a failure time that suffers from interval-censoring. When the failure rate is low and the time intervals are wide, a large cohort is often required so as to yield reliable precision on the exposure-failure-time relationship. However, large cohort studies with simple random sampling could be prohibitive for investigators with a limited budget, especially when the exposure variables are expensive to obtain. Alternative cost-effective sampling designs and inference procedures are therefore desirable. We propose an outcome-dependent sampling (ODS) design with interval-censored failure time data, where we enrich the observed sample by selectively including certain more informative failure subjects. We develop a novel sieve semiparametric maximum empirical likelihood approach for fitting the proportional hazards model to data from the proposed interval-censoring ODS design. This approach employs the empirical likelihood and sieve methods to deal with the infinite-dimensional nuisance parameters, which greatly reduces the dimensionality of the estimation problem and eases the computational difficulty. The consistency and asymptotic normality of the resulting regression parameter estimator are established. The results from our extensive simulation study show that the proposed design and method work well in practical situations and are more efficient than alternative designs and competing approaches. An example from the Atherosclerosis Risk in Communities (ARIC) study is provided for illustration. PMID:28771664
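
    The outcome-dependent sampling idea can be sketched simply: retain every informative (case) subject and only a random fraction of the remainder, so that expensive exposure measurements are concentrated where they carry the most information. The selection rule and sampling fraction below are hypothetical, not the paper's design:

```python
import random

# Hypothetical outcome-dependent sampling sketch: keep all failure
# subjects and a random fraction of the rest. The fraction and the
# cohort are illustrative; the paper's ODS design and estimator are
# considerably more elaborate.

def ods_sample(subjects, frac_noncase=0.2, seed=0):
    """subjects: list of (subject_id, is_case). Returns selected ids."""
    rng = random.Random(seed)  # seeded for reproducibility
    selected = []
    for sid, is_case in subjects:
        if is_case or rng.random() < frac_noncase:
            selected.append(sid)
    return selected

cohort = [(i, i % 10 == 0) for i in range(100)]  # 10% cases, synthetic
picked = ods_sample(cohort)
# All 10 cases are retained; only a fraction of the 90 non-cases are.
print(len(picked), sum(1 for i in picked if i % 10 == 0))
```

Valid inference then requires accounting for the biased sampling, which is what the sieve empirical-likelihood approach above provides.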

  12. Accelerated failure time models for semi-competing risks data in the presence of complex censoring.

    PubMed

    Lee, Kyu Ha; Rondeau, Virginie; Haneuse, Sebastien

    2017-12-01

    Statistical analyses that investigate risk factors for Alzheimer's disease (AD) are often subject to a number of challenges. Some of these challenges arise due to practical considerations regarding data collection such that the observation of AD events is subject to complex censoring including left-truncation and either interval or right-censoring. Additional challenges arise due to the fact that study participants under investigation are often subject to competing forces, most notably death, that may not be independent of AD. Towards resolving the latter, researchers may choose to embed the study of AD within the "semi-competing risks" framework for which the recent statistical literature has seen a number of advances including for the so-called illness-death model. To the best of our knowledge, however, the semi-competing risks literature has not fully considered analyses in contexts with complex censoring, as in studies of AD. This is particularly the case when interest lies with the accelerated failure time (AFT) model, an alternative to the traditional multiplicative Cox model that places emphasis away from the hazard function. In this article, we outline a new Bayesian framework for estimation/inference of an AFT illness-death model for semi-competing risks data subject to complex censoring. An efficient computational algorithm that gives researchers the flexibility to adopt either a fully parametric or a semi-parametric model specification is developed and implemented. The proposed methods are motivated by and illustrated with an analysis of data from the Adult Changes in Thought study, an on-going community-based prospective study of incident AD in western Washington State. © 2017, The International Biometric Society.
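
    Unlike the Cox model, the AFT model acts on the time scale: a coefficient β multiplies event times by exp(β), a "time ratio" rather than a hazard ratio. A short numeric sketch with an invented coefficient:

```python
import math

# Sketch of the accelerated failure time (AFT) interpretation: under
# log(T) = beta * x + error, a unit increase in x multiplies event times
# by exp(beta). The coefficient value here is invented for illustration.

beta = -0.5                 # hypothetical log time-ratio for a risk factor
time_ratio = math.exp(beta)

# Under this model, a median time-to-event of 12 years without the risk
# factor corresponds to 12 * exp(-0.5) ≈ 7.3 years with it.
print(round(12 * time_ratio, 1))  # 7.3
```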

  13. Sieve estimation in a Markov illness-death process under dual censoring.

    PubMed

    Boruvka, Audrey; Cook, Richard J

    2016-04-01

    Semiparametric methods are well established for the analysis of a progressive Markov illness-death process observed up to a noninformative right censoring time. However, often the intermediate and terminal events are censored in different ways, leading to a dual censoring scheme. In such settings, unbiased estimation of the cumulative transition intensity functions cannot be achieved without some degree of smoothing. To overcome this problem, we develop a sieve maximum likelihood approach for inference on the hazard ratio. A simulation study shows that the sieve estimator offers improved finite-sample performance over common imputation-based alternatives and is robust to some forms of dependent censoring. The proposed method is illustrated using data from cancer trials. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  14. Semiparametric regression analysis of failure time data with dependent interval censoring.

    PubMed

    Chen, Chyong-Mei; Shen, Pao-Sheng

    2017-09-20

    Interval-censored failure-time data arise when subjects are examined or observed periodically such that the failure time of interest is not observed exactly but only known to be bracketed between two adjacent observation times. The commonly used approaches assume that the examination times and the failure time are independent or conditionally independent given covariates. In many practical applications, patients who are already in poor health or have a weak immune system before treatment usually tend to visit physicians more often after treatment than those with better health or immune systems. In this situation, the visiting rate is positively correlated with the risk of failure due to the health status, which results in dependent interval-censored data. While some measurable factors affecting health status, such as age, gender, and physical symptoms, can be included in the covariates, some health-related latent variables cannot be observed or measured. To deal with dependent interval censoring involving an unobserved latent variable, we characterize the visiting/examination process as a recurrent event process and propose a joint frailty model to account for the association between the failure time and the visiting process. A shared gamma frailty is incorporated into the Cox model and the proportional intensity model for the failure time and visiting process, respectively, in a multiplicative way. We propose a semiparametric maximum likelihood approach for estimating model parameters and show the asymptotic properties, including consistency and weak convergence. Extensive simulation studies are conducted and a data set of bladder cancer is analyzed for illustrative purposes. Copyright © 2017 John Wiley & Sons, Ltd.

  15. Causes of corneal graft failure in India.

    PubMed

    Dandona, L; Naduvilath, T J; Janarthanan, M; Rao, G N

    1998-09-01

    The success of corneal grafting in visual rehabilitation of the corneal blind in India depends on survival of the grafts. Understanding the causes of graft failure may help reduce the risk of failure. We studied these causes in a series of 638 graft failures at our institution. Multivariate logistic regression analysis was used to evaluate the association of particular causes of graft failure with indications for grafting, socioeconomic status, age, sex, host corneal vascularization, donor corneal quality, and experience of surgeon. The major causes of graft failure were allograft rejection (29.2%), increased intraocular pressure (16.9%), infection excluding endophthalmitis (15.4%), and surface problems (12.7%). The odds of infection causing graft failure were significantly higher in patients of lower socioeconomic status (odds ratio 2.45, 95% CI 1.45-4.15). Surface problems as a cause of graft failure were significantly associated with grafts done for corneal scarring or for regrafts (odds ratio 3.36, 95% CI 1.80-6.30). Increased intraocular pressure as a cause of graft failure had a significant association with grafts done for aphakic or pseudophakic bullous keratopathy, congenital conditions or glaucoma, or regrafts (odds ratio 2.19, 95% CI 1.25-3.84). Corneal dystrophy was the indication for grafting in 12 of the 13 cases of graft failure due to recurrence of host disease. Surface problems, increased intraocular pressure, and infection are modifiable risk factors that are more likely to cause graft failure in certain categories of patients in India. Knowledge of these associations can be helpful in looking for and aggressively treating these modifiable risk factors in the at-risk categories of corneal graft patients. This can possibly reduce the chance of graft failure.
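
    The odds ratios reported here come from multivariate logistic regression, but the unadjusted version of such a ratio can be read directly off a 2x2 table. The counts below are invented purely for illustration:

```python
# Sketch: an unadjusted odds ratio from a 2x2 table. The counts are
# made up for illustration and are not the study's data; the reported
# ratios are adjusted estimates from logistic regression.

def odds_ratio(a, b, c, d):
    """OR for the 2x2 table [[a, b], [c, d]] = (a * d) / (b * c)."""
    return (a * d) / (b * c)

# e.g. 40/160 infection-related failures in one exposure group versus
# 20/200 in the other (hypothetical counts):
print(round(odds_ratio(40, 160, 20, 200), 2))  # 2.5
```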

  16. Effects of Dopamine Donor Pretreatment on Graft Survival after Kidney Transplantation: A Randomized Trial.

    PubMed

    Schnuelle, Peter; Schmitt, Wilhelm H; Weiss, Christel; Habicht, Antje; Renders, Lutz; Zeier, Martin; Drüschler, Felix; Heller, Katharina; Pisarski, Przemyslaw; Banas, Bernhard; Krämer, Bernhard K; Jung, Matthias; Lopau, Kai; Olbricht, Christoph J; Weihprecht, Horst; Schenker, Peter; De Fijter, Johan W; Yard, Benito A; Benck, Urs

    2017-03-07

    Donor dopamine improves initial graft function after kidney transplantation due to antioxidant properties. We investigated if a 4 µg/kg per minute continuous dopamine infusion administered after brain-death confirmation affects long-term graft survival and examined the exposure-response relationship with treatment duration. Five-year follow-up of 487 renal transplant patients from 60 European centers who had participated in the randomized, multicenter trial of dopamine donor pretreatment between 2004 and 2007 (ClinicalTrials.gov identifier: NCT00115115). Follow-up was complete in 99.2%. Graft survival was 72.6% versus 68.7% (P=0.34), and 83.3% versus 80.4% (P=0.42) after death-censoring in treatment and control arms according to trial assignment. Although infusion times varied substantially in the treatment arm (range 0-32.2 hours), duration of the dopamine infusion and all-cause graft failure exhibited an exposure-response relationship (hazard ratio, 0.96; 95% confidence interval [95% CI], 0.92 to 1.00, per hour). Cumulative frequency curves of graft survival and exposure time of the dopamine infusion indicated a maximum response rate at 7.10 hours (95% CI, 6.99 to 7.21), which almost coincided with the optimum infusion time for improvement of early graft function (7.05 hours; 95% CI, 6.92 to 7.18). Taking an infusion time of 7.1 hours as the threshold in subsequent graft survival analyses indicated a relevant benefit: overall, 81.5% versus 68.5%, P=0.03; and 90.3% versus 80.2%, P=0.04 after death-censoring. We failed to show a significant graft survival advantage on intention-to-treat. Dopamine infusion time was very short in a considerable number of donors assigned to treatment. Our finding of a significant, nonlinear exposure-response relationship disclosed a threshold value of the dopamine infusion time that may improve long-term kidney graft survival. Copyright © 2017 by the American Society of Nephrology.
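
    The reported per-hour hazard ratio compounds multiplicatively over infusion time, so the hazard ratio implied at the 7.1-hour threshold can be computed directly from the abstract's own figures:

```python
# Sketch: compounding the reported per-hour hazard ratio (HR 0.96 per
# hour of dopamine infusion) over the 7.1-hour threshold identified in
# the abstract. Point estimates only; the CI reaches 1.00, so this is
# illustrative arithmetic, not a definitive effect size.

hr_per_hour = 0.96
threshold_hours = 7.1

implied_hr = hr_per_hour ** threshold_hours
print(round(implied_hr, 2))  # 0.75
```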

  17. Influence of delayed graft function and acute rejection on outcomes after kidney transplantation from donors after cardiac death.

    PubMed

    Nagaraja, Pramod; Roberts, Gareth W; Stephens, Michael; Horvath, Szabolcs; Fialova, Jana; Chavez, Rafael; Asderakis, Argiris; Kaposztas, Zsolt

    2012-12-27

    Delayed graft function (DGF) and acute rejection (AR) exert an adverse impact on graft outcomes after kidney transplantation using organs from donation after brain-stem death (DBD) donors. Here, we examine the impact of DGF and AR on graft survival in kidney transplants using organs from donation after cardiac death (DCD) donors. We conducted a single-center retrospective study of DCD and DBD donor kidney transplants. We compared 1- and 4-year graft and patient survival rates, as well as death-censored graft survival (DCGS) rates, between the two groups using univariate analysis, and the impact of DGF and AR on graft function was compared using multivariate analysis. Eighty DCD and 206 DBD donor transplants were analyzed. Median follow-up was 4.5 years. The incidence of DGF was higher among DCD recipients (73% vs. 27%, P<0.001), and AR was higher among DBD recipients (23% vs. 9%, P<0.001). One-year and 4-year graft survival rates were similar (DCD 94% and 79% vs. DBD 90% and 82%). Among recipients with DGF, the 4-year DCGS rate was better for DCD recipients compared with DBD recipients (100% vs. 92%, P=0.04). Neither DGF nor AR affected the 1-year graft survival rate in DCD recipients, whereas in DBD recipients, the 1-year graft survival rate was worse in the presence of DGF (88% vs. 96%, P=0.04) and the 4-year DCGS rate was worse in the presence of AR (88% vs. 96%, P=0.04). Despite the high incidence of DGF, medium-term outcomes of DCD kidney transplants are comparable to those from DBD transplants. Short-term graft survival from DCD transplants is not adversely influenced by DGF and AR, unlike in DBD transplants.

  18. Graft failure after allogeneic hematopoietic stem cell transplantation.

    PubMed

    Ozdemir, Zehra Narli; Civriz Bozdağ, Sinem

    2018-04-18

    Graft failure is a serious complication of allogeneic hematopoietic stem cell transplantation (allo-HSCT), defined as either lack of initial engraftment of donor cells (primary graft failure) or loss of donor cells after initial engraftment (secondary graft failure). Successful transplantation depends on engraftment, in which donor cells are integrated into the recipient's cell population. In this paper, we distinguish two different entities, graft failure (GF) and poor graft function (PGF), and review the current understanding of the interactions between the immune and hematopoietic compartments in these conditions. Factors associated with graft failure include human leukocyte antigen (HLA)-mismatched grafts, underlying disease, type of conditioning regimen and stem cell source employed, low stem cell dose, ex vivo T-cell depletion, major ABO incompatibility, female donor grafts for male recipients, and disease status at transplantation. Although several approaches have been developed to prevent graft rejection, establish successful engraftment and treat graft failure, GF remains a major obstacle to the success of allo-HSCT. Allo-HSCT remains the curative treatment option for various non-malignant and malignant hematopoietic diseases, and its outcome primarily depends on engraftment of the graft. GF is a life-threatening complication that requires prompt therapeutic intervention. In this paper, we focus on the definitions of graft failure and poor graft function, and review the current understanding of the pathophysiology, risk factors and treatment approaches for these entities. Copyright © 2018. Published by Elsevier Ltd.

  19. Simple estimation procedures for regression analysis of interval-censored failure time data under the proportional hazards model.

    PubMed

    Sun, Jianguo; Feng, Yanqin; Zhao, Hui

    2015-01-01

    Interval-censored failure time data occur in many fields, including epidemiological and medical studies as well as financial and sociological studies, and many authors have investigated their analysis (Sun, The statistical analysis of interval-censored failure time data, 2006; Zhang, Stat Modeling 9:321-343, 2009). In particular, a number of procedures have been developed for regression analysis of interval-censored data arising from the proportional hazards model (Finkelstein, Biometrics 42:845-854, 1986; Huang, Ann Stat 24:540-568, 1996; Pan, Biometrics 56:199-203, 2000). One drawback of most of these procedures, however, is that they require estimation of both the regression parameters and the baseline cumulative hazard function. In this paper, we propose two simple estimation approaches that do not require estimation of the baseline cumulative hazard function. The asymptotic properties of the resulting estimates are given, and an extensive simulation study indicates that they work well in practical situations.
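
    To make the censoring scheme concrete (this is a toy parametric illustration, not the estimation procedures proposed by the authors): under an exponential model with rate λ, an event known only to lie in (L, R] contributes S(L) − S(R) = e^(−λL) − e^(−λR) to the likelihood, and a right-censored observation contributes S(L) = e^(−λL). A grid-search sketch with hypothetical data:

```python
import math

def interval_log_likelihood(lam, intervals):
    """Log-likelihood of exponential rate lam for interval-censored data.
    Each observation is (L, R): event in (L, R]; R = None means
    right-censored at L."""
    ll = 0.0
    for L, R in intervals:
        if R is None:
            ll += -lam * L                                   # log S(L)
        else:
            ll += math.log(math.exp(-lam * L) - math.exp(-lam * R))
    return ll

def fit_exponential(intervals, grid=None):
    """Maximize the log-likelihood over a simple grid of rates."""
    if grid is None:
        grid = [i / 1000 for i in range(1, 2000)]            # 0.001 .. 1.999
    return max(grid, key=lambda lam: interval_log_likelihood(lam, intervals))
```

    For a single observation censored into (1, 2], the maximizer can be computed by hand (λ = ln 2 ≈ 0.693), which the grid search recovers.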

  20. Inverse probability weighting to control confounding in an illness-death model for interval-censored data.

    PubMed

    Gillaizeau, Florence; Sénage, Thomas; Le Borgne, Florent; Le Tourneau, Thierry; Roussel, Jean-Christian; Leffondré, Karen; Porcher, Raphaël; Giraudeau, Bruno; Dantan, Etienne; Foucher, Yohann

    2018-04-15

    Multistate models for interval-censored data, such as the illness-death model, are still not used to any considerable extent in medical research, despite the significant literature demonstrating their advantages over the usual survival models. Possible explanations are their limited availability in classical statistical software or, when they are available, the limitations of multivariable modelling for taking confounding into consideration. In this paper, we propose a strategy based on propensity scores that allows population causal effects to be estimated: inverse probability weighting in the illness-death semi-Markov model with interval-censored data. Using simulated data, we validated the performance of the proposed approach. We also illustrated the usefulness of the method with an application evaluating the relationship between the inadequate size of an aortic bioprosthesis and its degeneration and/or patient death. We have updated the R package multistate to facilitate future use of this method. Copyright © 2017 John Wiley & Sons, Ltd.
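
    The weighting step itself is simple: each subject is weighted by the inverse of the probability of the exposure level actually received, given confounders. A minimal sketch assuming the propensity scores have already been estimated elsewhere (names and data are hypothetical; stabilized weights are a common variant that multiplies by the marginal exposure probability):

```python
def ipw_weights(treated, propensity, stabilized=False):
    """Inverse-probability-of-treatment weights.
    treated: list of 0/1 exposure indicators.
    propensity: list of P(treated = 1 | confounders) per subject."""
    p_treat = sum(treated) / len(treated)   # marginal treatment probability
    weights = []
    for a, ps in zip(treated, propensity):
        w = 1 / ps if a == 1 else 1 / (1 - ps)   # invert prob. of received exposure
        if stabilized:
            w *= p_treat if a == 1 else (1 - p_treat)
        weights.append(w)
    return weights
```

    The weighted pseudo-population balances measured confounders across exposure groups; stabilization keeps the weights, and hence the variance, smaller when propensity scores are extreme.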

  1. Risk prediction models for graft failure in kidney transplantation: a systematic review.

    PubMed

    Kaboré, Rémi; Haller, Maria C; Harambat, Jérôme; Heinze, Georg; Leffondré, Karen

    2017-04-01

    Risk prediction models are useful for identifying kidney recipients at high risk of graft failure, thus optimizing clinical care. Our objective was to systematically review the models that have been recently developed and validated to predict graft failure in kidney transplantation recipients. We used PubMed and Scopus to search for English, German and French language articles published in 2005-15. We selected studies that developed and validated a new risk prediction model for graft failure after kidney transplantation, or validated an existing model with or without updating the model. Data on recipient characteristics and predictors, as well as modelling and validation methods were extracted. In total, 39 articles met the inclusion criteria. Of these, 34 developed and validated a new risk prediction model and 5 validated an existing one with or without updating the model. The most frequently predicted outcome was graft failure, defined as dialysis, re-transplantation or death with functioning graft. Most studies used the Cox model. There was substantial variability in predictors used. In total, 25 studies used predictors measured at transplantation only, and 14 studies used predictors also measured after transplantation. Discrimination performance was reported in 87% of studies, while calibration was reported in 56%. Performance indicators were estimated using both internal and external validation in 13 studies, and using external validation only in 6 studies. Several prediction models for kidney graft failure in adults have been published. Our study highlights the need to better account for competing risks when applicable in such studies, and to adequately account for post-transplant measures of predictors in studies aiming at improving monitoring of kidney transplant recipients. © The Author 2017. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
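
    On the competing-risks point raised above: when death with a functioning graft competes with graft failure, 1 − Kaplan-Meier (censoring deaths) overestimates the cumulative incidence of failure, and the Aalen-Johansen cumulative incidence function is the standard alternative. A minimal sketch with illustrative data (not from any study in this list):

```python
def cumulative_incidence(times, causes, cause_of_interest):
    """Aalen-Johansen cumulative incidence for one event type.
    times: follow-up times; causes: per-subject event cause
    (0 = censored; 1, 2, ... = competing event types)."""
    data = sorted(zip(times, causes))
    surv = 1.0   # overall event-free survival just before t
    cif = 0.0
    curve = []
    for t in sorted(set(times)):
        at_risk = sum(1 for tt, _ in data if tt >= t)
        d_any = sum(1 for tt, c in data if tt == t and c != 0)
        d_int = sum(1 for tt, c in data if tt == t and c == cause_of_interest)
        if d_int > 0:
            cif += surv * d_int / at_risk    # S(t-) * cause-specific hazard
            curve.append((t, cif))
        if d_any > 0:
            surv *= 1 - d_any / at_risk      # any event depletes the risk set
    return curve
```

    On the toy data [1, 2, 3, 4] with causes [failure, death, failure, censored], the CIF of graft failure reaches 0.5, whereas 1 − KM with deaths censored would reach 0.625, illustrating the overestimation.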

  2. Recipient age as a determinant factor of patient and graft survival.

    PubMed

    Moreso, Francesc; Ortega, Francisco; Mendiluce, Alicia

    2004-06-01

    The age of renal transplant recipients has been related to death, alloimmune response and graft outcome. We reviewed the influence of patient age on transplant outcome in three cohorts of patients transplanted in Spain during the 1990s. Patient age was categorized into four groups (I, 18-40; II, 41-50; III, 51-60; and IV, >60 years). Risk factors for acute rejection were evaluated by logistic regression adjusting for transplant centre and transplantation year, while a Cox proportional hazards model was employed for analysing patient and graft survival. Older patients had a higher death rate (I, 3.5%; II, 7.7%; III, 13.2%; and IV, 16.9%; P<0.001), but a lower standardized mortality index (I, 7.6; II, 7.0; III, 5.8; and IV, 4.1; P=0.0019). Older patients had the lowest risk of acute rejection [odds ratio (OR) 0.79 and 95% confidence interval (CI) 0.66-0.97 for group II; OR 0.75 and 95% CI 0.62-0.91 for group III; OR 0.43 and 95% CI 0.33-0.56 for group IV]. Death-censored graft survival was poorer in patients older than 60 years (relative risk 1.40; 95% CI 1.09-1.80), but this result was not explained by any combination of patient age with donor age, delayed graft function or immunosuppression. Patient age is a main determinant of transplant outcome. Although the death rate was higher for older patients, standardized mortality was not; thus, efforts to reduce mortality should also be implemented in younger patients. Older patients have a low risk of acute rejection but poorer death-censored graft survival, a result not explained by any controlled variable in our study.
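
    A standardized mortality index (ratio) of the kind reported here compares observed deaths with the deaths expected if age-specific reference rates applied to the cohort's person-time, which is why older recipients can have a higher crude death rate yet a lower standardized index. A minimal sketch with hypothetical rates and counts (not the study's data):

```python
def standardized_mortality_ratio(cohort, reference_rates):
    """SMR = observed deaths / expected deaths.
    cohort: list of (age_group, person_years, observed_deaths).
    reference_rates: deaths per person-year by age group."""
    observed = sum(d for _, _, d in cohort)
    expected = sum(py * reference_rates[g] for g, py, _ in cohort)
    return observed / expected

# Hypothetical reference rates (deaths per person-year) and cohort counts
rates = {"18-40": 0.002, "41-60": 0.010, ">60": 0.040}
cohort = [("18-40", 1000, 7), ("41-60", 500, 10), (">60", 200, 12)]
smr = standardized_mortality_ratio(cohort, rates)   # 29 observed / 15 expected
```

    An SMR above 1 means more deaths than the reference population would predict for the same age structure and follow-up.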

  3. Predictors of early graft failure after coronary artery bypass grafting for chronic total occlusion.

    PubMed

    Oshima, Hideki; Tokuda, Yoshiyuki; Araki, Yoshimori; Ishii, Hideki; Murohara, Toyoaki; Ozaki, Yukio; Usui, Akihiko

    2016-07-01

    Little is known regarding the transit-time flow measurement (TTFM) variables in grafts anastomosed to chronically totally occluded vessels (CTOs). We aimed to establish the TTFM cut-off values for detecting graft failure in bypass grafts anastomosed to chronically totally occluded arteries and clarify the relationship between early graft failure and the grade of collateral circulation/regional wall motion of the CTO territory. Among 491 patients who underwent isolated coronary artery bypass grafting (CABG) from 2009 to 2015, 196 cases with CTOs underwent postoperative coronary angiography within 1 month after CABG. Two hundred and forty-one CTOs in all patients were examined. Thirty-two CTOs (13%) were not bypassed and 214 conduits were anastomosed to CTOs and underwent intraoperative TTFM. Arterial conduits and saphenous vein grafts (SVGs) were used in 102 and 112 cases, respectively. Among the arterial conduit procedures that were performed, 78 involved the left internal thoracic artery (LITA), 10 involved the right internal thoracic artery (RITA) and 14 involved the right gastroepiploic artery (rGEA). Any graft showing Fitzgibbon type B or O lesions on angiography was considered to be a failing graft. The insufficiency rates for LITA, RITA, rGEA and SVG procedures were 5.1, 10, 14.3 and 7.1%, respectively. The TTFM variables recorded in failing grafts had a significantly lower mean flow (Qmean) and higher pulsatility index (PI) compared with patent grafts. Furthermore, akinetic or dyskinetic wall motion in the territory of bypassed CTOs was observed at a significantly higher rate in failing grafts. A multivariable regression analysis and receiver operating characteristic analysis revealed good predictors of early graft failure as follows: a Qmean value of < 11.5 ml/min for arterial conduits, a PI value of >5.85 and akinetic/dyskinetic wall motion in the CTO territory for SVGs. The Rentrop collateral grade was not associated with early graft failure. The Qmean

  4. Prolonged warm ischemia time is associated with graft failure and mortality after kidney transplantation.

    PubMed

    Tennankore, Karthik K; Kim, S Joseph; Alwayn, Ian P J; Kiberd, Bryce A

    2016-03-01

    Warm ischemia time is a potentially modifiable insult to transplanted kidneys, but little is known about its effect on long-term outcomes. Here we conducted a study of United States kidney transplant recipients (years 2000-2013) to determine the association between warm ischemia time (the time from organ removal from cold storage to reperfusion with warm blood) and death/graft failure. Times under 10 minutes were potentially attributable to coding error; therefore, the 10-to-under-20-minute interval was chosen as the reference group. The primary outcome was mortality and graft failure (return to chronic dialysis or preemptive retransplantation) adjusted for recipient, donor, immunologic, and surgical factors. The study included 131,677 patients with 35,901 events. Relative to the reference group, times of 20 to under 30, 30 to under 40, 40 to under 50, 50 to under 60, and 60 or more minutes were associated with hazard ratios of 1.07 (95% confidence interval, 0.99-1.15), 1.13 (1.06-1.22), 1.17 (1.09-1.26), 1.20 (1.12-1.30), and 1.23 (1.15-1.33) for the composite event, respectively. The association between prolonged warm ischemia time and death/graft failure persisted after stratification by donor type (living vs. deceased donor) and delayed graft function status. Thus, warm ischemia time is associated with adverse long-term patient and graft survival after kidney transplantation. Identifying strategies to reduce warm ischemia time is an important consideration for future study. Copyright © 2015 International Society of Nephrology. Published by Elsevier Inc. All rights reserved.

  5. Comparison of the long-term outcomes of kidney transplantation: USA versus Spain

    PubMed Central

    Ojo, Akinlolu O.; Morales, José María; González-Molina, Miguel; Steffick, Diane E.; Luan, Fu L.; Merion, Robert M.; Ojo, Tammy; Moreso, Francesc; Arias, Manuel; Campistol, Josep María; Hernandez, Domingo; Serón, Daniel

    2013-01-01

    Background The long-term outcomes of kidney transplantation are suboptimal because many patients lose their allografts or experience premature death. Cross-country comparisons of long-term outcomes of kidney transplantation may provide insight into factors contributing to premature graft failure and death. We evaluated the rates of late graft failure and death among US and Spanish kidney recipients. Methods This is a cohort study of US (n = 9609) and Spanish (n = 3808) patients who received a deceased donor kidney transplant in 1990, 1994, 1998 or 2002 and had a functioning allograft 1 year after transplantation with follow-up through September 2006. Ten-year overall and death-censored graft survival and 10-year overall recipient survival and death with graft function (DWGF) were estimated with multivariate Cox models. Results Among recipients alive with graft function 1 year after transplant, the 10-year graft survival was 71.3% for Spanish and 53.4% for US recipients (P < 0.001). The 10-year, death-censored graft survival was 75.6 and 76.0% for Spanish and US recipients, respectively (P = 0.73). The 10-year recipient survival was 86.2% for Spanish and 67.4% for US recipients (P < 0.001). In recipients with diabetes as the cause of ESRD, the adjusted DWGF rates at 10 years were 23.9 and 53.8 per 1000 person-years for Spanish and US recipients, respectively (P < 0.001). Among recipients whose cause of ESRD was not diabetes mellitus, the adjusted 10-year DWGF rates were 11.0 and 25.4 per 1000 person-years for Spanish and US recipients, respectively. Conclusions US kidney transplant recipients had more than twice the long-term hazard of DWGF compared with Spanish kidney transplant recipients and similar levels of death-censored graft function. Pre-transplant medical care, comorbidities, such as cardiovascular disease, and their management in each country's health system are possible explanations for the differences between the two countries. PMID:22759384

  6. Comparison of the long-term outcomes of kidney transplantation: USA versus Spain.

    PubMed

    Ojo, Akinlolu O; Morales, José María; González-Molina, Miguel; Steffick, Diane E; Luan, Fu L; Merion, Robert M; Ojo, Tammy; Moreso, Francesc; Arias, Manuel; Campistol, Josep María; Hernandez, Domingo; Serón, Daniel

    2013-01-01

    The long-term outcomes of kidney transplantation are suboptimal because many patients lose their allografts or experience premature death. Cross-country comparisons of long-term outcomes of kidney transplantation may provide insight into factors contributing to premature graft failure and death. We evaluated the rates of late graft failure and death among US and Spanish kidney recipients. This is a cohort study of US (n = 9609) and Spanish (n = 3808) patients who received a deceased donor kidney transplant in 1990, 1994, 1998 or 2002 and had a functioning allograft 1 year after transplantation with follow-up through September 2006. Ten-year overall and death-censored graft survival and 10-year overall recipient survival and death with graft function (DWGF) were estimated with multivariate Cox models. Among recipients alive with graft function 1 year after transplant, the 10-year graft survival was 71.3% for Spanish and 53.4% for US recipients (P < 0.001). The 10-year, death-censored graft survival was 75.6 and 76.0% for Spanish and US recipients, respectively (P = 0.73). The 10-year recipient survival was 86.2% for Spanish and 67.4% for US recipients (P < 0.001). In recipients with diabetes as the cause of ESRD, the adjusted DWGF rates at 10 years were 23.9 and 53.8 per 1000 person-years for Spanish and US recipients, respectively (P < 0.001). Among recipients whose cause of ESRD was not diabetes mellitus, the adjusted 10-year DWGF rates were 11.0 and 25.4 per 1000 person-years for Spanish and US recipients, respectively. US kidney transplant recipients had more than twice the long-term hazard of DWGF compared with Spanish kidney transplant recipients and similar levels of death-censored graft function. Pre-transplant medical care, comorbidities, such as cardiovascular disease, and their management in each country's health system are possible explanations for the differences between the two countries.

  7. Death with functioning graft in living donor kidney transplantation: analysis of risk factors.

    PubMed

    El-Agroudy, Amgad E; Bakr, Mohamed A; Shehab El-Dein, Ahmed B; Ghoneim, Mohamed A

    2003-01-01

    Death with a functioning graft (DWF) has been reported as a major cause of graft loss after renal transplantation, occurring in 9%-30% of cases. From March 1976 to January 2002, a total of 1400 living donor renal transplants were performed in our center. Of 257 reported deaths among our patients, 131 recipients died with functioning grafts after a mean period of 53.4 ± 53.2 months. DWF patients account for 27% of all graft losses in our series. The mean age was 34.9 ± 10.6 years (range 8-62); 98 were male and 33 were female. The original kidney disease was GN in 9, PN in 24, PCK in 5 and nephrosclerosis in 8 patients. Acute rejection episodes were diagnosed in 84 patients (63.1%). The post-transplant complications encountered were hypertension in 78 patients (59.5%), diabetes mellitus in 30 patients (22.9%), medical infections in 68 (51.5%), hepatic complications in 30 (22.9%) and malignancy in 17 patients (13%). The main causes of death in these patients were infections in 46 (35.6%), cardiovascular disease in 23 (17.6%), liver cell failure in 15 patients (11.4%) and malignancy in 8 (6.1%). The mean serum creatinine was 2 ± 0.6 mg/dl at the last follow-up before death. We conclude that the relatively higher mortality in renal transplantation is, in part, due to comorbid medical illness, pre-transplant dialysis treatment, and factors uniquely related to transplantation, including immunosuppression and other drug effects. DWF must be taken into consideration when calculating graft survival. Copyright 2003 S. Karger AG, Basel

  8. A comparison of graft and patient outcomes following kidney transplantation in extended hour and conventional haemodialysis patients.

    PubMed

    See, Emily J; Hawley, Carmel M; Cho, Yeoungjee; Toussaint, Nigel D; Agar, John Wm; Pascoe, Elaine M; Lim, Wai H; Francis, Ross S; Collins, Michael G; Johnson, David W

    2018-01-08

    Differences in early graft function between kidney transplant recipients previously managed with either haemodialysis (HD) or peritoneal dialysis are well described. However, only two single-centre studies have compared graft and patient outcomes between extended hour and conventional HD patients, with conflicting results. This study compared the outcomes of all extended hour (≥24 hours/week) and conventional HD patients transplanted in Australia and New Zealand between 2000 and 2014. The primary outcome was delayed graft function (DGF), defined in an ordinal manner as either a spontaneous fall in serum creatinine of less than 10% within 24 hours, or the need for dialysis within 72 hours following transplantation. Secondary outcomes included the requirement for dialysis within 72 hours post-transplant, acute rejection, estimated glomerular filtration rate at 12 months, death-censored graft failure, all-cause and cardiovascular mortality, and a composite of graft failure and mortality. A total of 4,935 HD patients (378 extended hour HD, 4,557 conventional HD) received a kidney transplant during the study period. Extended hour HD was associated with an increased likelihood of DGF compared with conventional HD (adjusted proportional odds ratio 1.33; 95% confidence interval 1.06-1.67). There was no significant difference between extended hour and conventional HD in terms of any of the secondary outcomes. Compared to conventional HD, extended hour HD was associated with DGF, although long-term graft and patient outcomes were not different. This article is protected by copyright. All rights reserved.

  9. Recipient Risk Factors for Graft Failure in the Cornea Donor Study

    PubMed Central

    Sugar, Alan; Tanner, Jean Paul; Dontchev, Mariya; Tennant, Brad; Schultze, Robert L.; Dunn, Steven P.; Lindquist, Thomas D.; Gal, Robin L.; Beck, Roy W.; Kollman, Craig; Mannis, Mark J.; Holland, Edward J.

    2009-01-01

    Purpose: To identify recipient factors that may be related to the risk of corneal graft failure. Design: Multi-center, prospective, double-masked, controlled clinical trial. Participants: 1090 subjects undergoing corneal transplantation for a moderate-risk condition (principally Fuchs’ dystrophy or pseudophakic corneal edema). Methods: Donor corneas were assigned using a random approach without respect to recipient factors, and surgeons were masked to information about the donor cornea, including donor age. Surgery and post-operative care were performed according to the surgeons’ usual routines, and subjects were followed for five years. Baseline factors were evaluated for their association with graft failure. Main Outcome Measures: Graft failure, defined as a regraft or a cloudy cornea that was sufficiently opaque to compromise vision for a minimum of three consecutive months. Results: A preoperative diagnosis of pseudophakic/aphakic corneal edema increased graft failure risk approximately 4-fold compared with Fuchs’ dystrophy (27% vs. 7%). Prior glaucoma surgery with preoperative glaucoma medication use substantially increased the graft failure rate. Factors not strongly associated with graft failure included age, gender, diabetes, smoking, and graft size. Conclusion: The risk of graft failure is significantly increased in eyes with pseudophakic or aphakic corneal edema compared with Fuchs’ dystrophy, independent of lens status, and in eyes with a history of glaucoma. PMID:19395036

  10. Peripheral arterial disease preoperatively may predict graft failure and mortality in kidney transplant recipients.

    PubMed

    Patel, Salma I; Chakkera, Harini A; Wennberg, Paul W; Liedl, David A; Alrabadi, Fadi; Cha, Stephen S; Hooley, Darren D; Amer, Hatem; Wadei, Hani M; Shamoun, Fadi E

    2017-06-01

    Patients with end-stage renal disease undergoing kidney transplant often have diffuse atherosclerosis and high cardiovascular morbidity and mortality rates. We analyzed the correlation of peripheral arterial disease (PAD), here quantified by an abnormal ankle-brachial index (ABI) measured within the 5 years prior to kidney transplant, with graft failure and mortality rates (primary end points) after adjusting for known cardiovascular risk factors (age, sex, smoking history, hypertension, diabetes, stroke, known coronary artery disease or heart failure, years of dialysis). Of 1055 patients in our transplant population, 819 had arterial studies within the 5 years prior to transplant. Secondary end points included myocardial infarction; cerebrovascular accident; and limb ischemia, gangrene, or amputation. Low ABI was an independent and significant predictor of graft failure (OR, 2.77; 95% CI, 1.68-4.58; p<0.001) and death (HR, 1.84; 95% CI, 1.26-2.68; p=0.002), and showed a trend toward predicting the secondary end points (HR, 1.39; 95% CI, 0.97-1.99; p=0.076). PAD was common in this population: of 819 kidney transplant recipients, 46% had PAD. Low ABI was associated with a threefold greater risk of graft failure, a twofold greater risk of death after transplant, and a greater risk of secondary end points. Screening for PAD is important in this patient population because of the potential impact on long-term outcomes.

  11. Single-Center Experience Using Marginal Liver Grafts in Korea.

    PubMed

    Park, P-J; Yu, Y-D; Yoon, Y-I; Kim, S-R; Kim, D-S

    2018-05-01

    Liver transplantation (LT) is an established therapeutic modality for patients with end-stage liver disease. The use of marginal donors has become more common worldwide due to the sharp increase in recipients, with a consequent shortage of suitable organs. We analyzed our single-center experience over the last 8 years in LT to evaluate the outcomes of using so-called "marginal donors." We retrospectively analyzed the database of all LTs performed at our institution from 2009 to 2017. Only patients undergoing deceased-donor LTs were analyzed. Marginal grafts were defined as livers from donors >60 years of age, livers from donors with serum sodium levels >155 mEq, graft steatosis >30%, livers with cold ischemia time ≥12 hours, livers from donors who were hepatitis B or C virus positive, livers recovered from donation after cardiac death, and livers split between 2 recipients. Patients receiving marginal grafts (marginal group) were compared with patients receiving standard grafts (standard group). A total of 106 patients underwent deceased-donor LT. There were 55 patients in the standard group and 51 patients in the marginal group. There were no significant differences in terms of age, sex, Model for End-Stage Liver Disease score, underlying liver disease, presence of hepatocellular carcinoma, and hospital stay between the 2 groups. Although the incidence of acute cellular rejection, cytomegalovirus infection, and postoperative complications was similar between the 2 groups, the incidence of early allograft dysfunction was higher in the marginal group. With a median follow-up of 26 months, the 1-, 3-, and 5-year overall and graft (death-censored) survivals in the marginal group were 85.5%, 75%, and 69.2% and 85.9%, 83.6%, and 77.2%, respectively. Patient overall survival and graft survival (death-censored) were significantly lower in the marginal group (P = .023 and P = .048, respectively). On multivariate analysis, receiving a marginal graft (hazard ratio [HR

  12. Endothelial cell density to predict endothelial graft failure after penetrating keratoplasty.

    PubMed

    Lass, Jonathan H; Sugar, Alan; Benetz, Beth Ann; Beck, Roy W; Dontchev, Mariya; Gal, Robin L; Kollman, Craig; Gross, Robert; Heck, Ellen; Holland, Edward J; Mannis, Mark J; Raber, Irving; Stark, Walter; Stulting, R Doyle

    2010-01-01

    To determine whether preoperative and/or postoperative central endothelial cell density (ECD) and its rate of decline postoperatively are predictive of graft failure caused by endothelial decompensation following penetrating keratoplasty to treat a moderate-risk condition, principally Fuchs dystrophy or pseudophakic corneal edema. In a subset of Cornea Donor Study participants, a central reading center determined preoperative and postoperative ECD from available specular images for 17 grafts that failed because of endothelial decompensation and 483 grafts that did not fail. Preoperative ECD was not predictive of graft failure caused by endothelial decompensation (P = .91). However, the 6-month ECD was predictive of subsequent failure (P < .001). Among grafts that had not failed within the first 6 months, the 5-year cumulative incidence (±95% confidence interval) of failure was 13% (±12%) for the 33 participants with a 6-month ECD of less than 1700 cells/mm² vs 2% (±3%) for the 137 participants with a 6-month ECD of 2500 cells/mm² or higher. After 5 years' follow-up, 40 of 277 participants (14%) with a clear graft had an ECD below 500 cells/mm². Preoperative ECD is unrelated to graft failure from endothelial decompensation, whereas ECD at 6 months correlates strongly with graft failure from endothelial decompensation. A graft can remain clear after 5 years even when the ECD is below 500 cells/mm².

  13. Age-Dependent Risk of Graft Failure in Young Kidney Transplant Recipients.

    PubMed

    Kaboré, Rémi; Couchoud, Cécile; Macher, Marie-Alice; Salomon, Rémi; Ranchin, Bruno; Lahoche, Annie; Roussey-Kesler, Gwenaelle; Garaix, Florentine; Decramer, Stéphane; Pietrement, Christine; Lassalle, Mathilde; Baudouin, Véronique; Cochat, Pierre; Niaudet, Patrick; Joly, Pierre; Leffondré, Karen; Harambat, Jérôme

    2017-06-01

    The risk of graft failure in young kidney transplant recipients has been found to increase during adolescence and early adulthood. However, this question had not previously been addressed outside the United States. Our objective was to investigate whether the hazard of graft failure also increases during this age period in France, irrespective of age at transplantation. Data on all first kidney transplantations performed before 30 years of age between 1993 and 2012 were extracted from the French kidney transplant database. The hazard of graft failure was estimated at each current age using a two-stage modelling approach that accounted for both age at transplantation and time since transplantation. Hazard ratios comparing the risk of graft failure during adolescence or early adulthood to other periods were estimated from time-dependent Cox models. A total of 5983 renal transplant recipients were included. The risk of graft failure was found to increase from around the age of 13 years until the age of 21 years, and to decrease thereafter. Results from the Cox model indicated that the hazard of graft failure during the age period of 13 to 23 years was almost twice as high as during the age period of 0 to 12 years, and 25% higher than after 23 years. Among first kidney transplant recipients younger than 30 years in France, those currently in adolescence or early adulthood have the highest risk of graft failure.

  14. Brain death and marginal grafts in liver transplantation.

    PubMed

    Jiménez-Castro, M B; Gracia-Sancho, J; Peralta, C

    2015-06-04

    It is well known that most organs for transplantation are currently procured from brain-dead donors; however, the presence of brain death is an important risk factor in liver transplantation. In addition, one strategy to alleviate the shortage of liver grafts for transplant is the use of marginal livers, which may show a higher risk of primary non-function or initial poor function. To our knowledge, very few reviews have focused on the field of liver transplantation using brain-dead donors; moreover, reviews addressing both brain death and marginal grafts in liver transplantation, both key risk factors in clinical practice, have not been published elsewhere. The present review aims to describe the recent findings and state-of-the-art knowledge regarding the pathophysiological changes occurring during brain death, their effects on marginal liver grafts, and the more controversial topics in this field. We also review the therapeutic strategies designed to date to reduce the detrimental effects of brain death in both marginal and optimal livers, attempting to explain why such strategies have not solved the clinical problem of liver transplantation.

  15. PIRCHE-II Is Related to Graft Failure after Kidney Transplantation

    PubMed Central

    Geneugelijk, Kirsten; Niemann, Matthias; Drylewicz, Julia; van Zuilen, Arjan D.; Joosten, Irma; Allebes, Wil A.; van der Meer, Arnold; Hilbrands, Luuk B.; Baas, Marije C.; Hack, C. Erik; van Reekum, Franka E.; Verhaar, Marianne C.; Kamburova, Elena G.; Bots, Michiel L.; Seelen, Marc A. J.; Sanders, Jan Stephan; Hepkema, Bouke G.; Lambeck, Annechien J.; Bungener, Laura B.; Roozendaal, Caroline; Tilanus, Marcel G. J.; Vanderlocht, Joris; Voorter, Christien E.; Wieten, Lotte; van Duijnhoven, Elly M.; Gelens, Mariëlle; Christiaans, Maarten H. L.; van Ittersum, Frans J.; Nurmohamed, Azam; Lardy, Junior N. M.; Swelsen, Wendy; van der Pant, Karlijn A.; van der Weerd, Neelke C.; ten Berge, Ineke J. M.; Bemelman, Fréderike J.; Hoitsma, Andries; van der Boog, Paul J. M.; de Fijter, Johan W.; Betjes, Michiel G. H.; Heidt, Sebastiaan; Roelen, Dave L.; Claas, Frans H.; Otten, Henny G.; Spierings, Eric

    2018-01-01

    Individual HLA mismatches may differentially impact graft survival after kidney transplantation. Therefore, there is a need for a reliable tool to define permissible HLA mismatches in kidney transplantation. We previously demonstrated that donor-derived Predicted Indirectly ReCognizable HLA Epitopes presented by recipient HLA class II (PIRCHE-II) play a role in de novo donor-specific HLA antibody formation after kidney transplantation. In the present Dutch multi-center study, we evaluated the possible association between PIRCHE-II and kidney graft failure in 2,918 donor–recipient pairs transplanted between 1995 and 2005. For these donor–recipient pairs, PIRCHE-II numbers were related to graft survival in univariate and multivariable analyses. Adjusted for confounders, the natural logarithm of PIRCHE-II was associated with a higher risk for graft failure [hazard ratio (HR): 1.13, 95% CI: 1.04–1.23, p = 0.003]. When analyzing the subgroup of patients who received their first transplantation, the HR of graft failure for ln(PIRCHE-II) was higher than in the overall cohort (HR: 1.22, 95% CI: 1.10–1.34, p < 0.001). PIRCHE-II demonstrated both early and late effects on graft failure in this subgroup. These data suggest that PIRCHE-II may impact graft survival after kidney transplantation. Inclusion of PIRCHE-II in donor-selection criteria may eventually lead to improved kidney graft survival. PMID:29556227

  16. Pre-transplant course and risk of kidney transplant failure in IgA nephropathy patients.

    PubMed

    Bjørneklett, Rune; Vikse, Bjørn Egil; Smerud, Hilde Kloster; Bostad, Leif; Leivestad, Torbjørn; Hartmann, Anders; Iversen, Bjarne M

    2011-01-01

    There is a lack of knowledge regarding the degree to which the clinical/morphological presentation and course of IgA nephropathy (IgAN) prior to end-stage renal disease are risk factors for graft loss after kidney transplantation. Patients diagnosed with IgAN between 1988 and 2006 (registered in the Norwegian Kidney Biopsy Registry) who later received a kidney transplant (registered in the Norwegian Renal Registry) were included. The cohort was followed up regarding death-censored graft loss throughout 2008. Graft survival with a rapid progressive (RP) vs. a slow progressive (SP) course of pre-Tx IgAN (annual GFR decline > or < 30 mL/min/1.73 m²) was studied. Among the 106 included patients, there were 14 graft losses, giving a graft loss rate of 1.9/100 patient-years. Follow-up until the first kidney transplant was 6.9 ± 4.4 (range 0.1-19) yr. Patients with pre-Tx RP had a higher graft loss rate than SP patients (6.3 vs. 1.3/100 patient-years, p < 0.001). The graft loss rate with living-related donor (LRD) grafts was similar to that with unrelated donor (UD) grafts. Most RP patients had received LRD grafts, and in SP patients, graft survival with LRD grafts was better than with UD grafts (0.3 vs. 2.1/100 patient-years, p = 0.055). A rapid pre-transplant course is a strong risk factor for transplant failure in patients with IgAN. © 2011 John Wiley & Sons A/S.

  17. The recovery status from delayed graft function can predict long-term outcome after deceased donor kidney transplantation.

    PubMed

    Lee, Juhan; Song, Seung Hwan; Lee, Jee Youn; Kim, Deok Gie; Lee, Jae Geun; Kim, Beom Seok; Kim, Myoung Soo; Huh, Kyu Ha

    2017-10-20

    The effect of delayed graft function (DGF) recovery on long-term graft outcome is unclear. The aim of this study was to examine the association of DGF recovery status with long-term outcome. We analyzed 385 recipients who underwent single kidney transplantation from brain-dead donors between 2004 and 2015. Patients were grouped according to renal function at 1 month post-transplantation: control (without DGF); recovered DGF (glomerular filtration rate [GFR] ≥30 mL/min/1.73 m²); and incompletely recovered DGF (GFR <30 mL/min/1.73 m²). DGF occurred in 104 of 385 (27%) recipients; 70 recovered from DGF and 34 recovered incompletely. Death-censored graft survival rates for the control, recovered DGF, and incompletely recovered DGF groups were 95.3%, 94.7%, and 80.7%, respectively, at 5 years post-transplantation (P = 0.003). Incompletely recovered DGF was an independent risk factor for death-censored graft loss (HR = 3.410; 95% CI, 1.114-10.437). Mean GFRs at 5 years were 65.5 ± 20.8, 62.2 ± 27.0, and 45.8 ± 15.4 mL/min/1.73 m² for the control, recovered, and incompletely recovered DGF groups, respectively (P < 0.001). Control and recovered DGF patients had similar renal outcomes. However, DGF was associated with an increased risk for patient death regardless of DGF recovery status.

  18. Nonesterified fatty acids and development of graft failure in renal transplant recipients.

    PubMed

    Klooster, Astrid; Hofker, H Sijbrand; Navis, Gerjan; Homan van der Heide, Jaap J; Gans, Reinold O B; van Goor, Harry; Leuvenink, Henri G D; Bakker, Stephan J L

    2013-06-15

    Chronic transplant dysfunction is the most common cause of graft failure in the long term. Proteinuria is one of the cardinal clinical signs of chronic transplant dysfunction. Albumin-bound fatty acids (FA) have been hypothesized to be instrumental in the etiology of renal damage induced by proteinuria. We therefore questioned whether high circulating FA levels could be associated with an increased risk for future development of graft failure in renal transplant recipients (RTR). To this end, we prospectively investigated the association of fasting concentrations of circulating nonesterified FA (NEFA) with the development of graft failure in RTR. Baseline measurements were performed between 2001 and 2003 in outpatient RTR with a functioning graft of more than 1 year. Follow-up was recorded until May 19, 2009. Graft failure was defined as return to dialysis or retransplantation. We included 461 RTR at a median (interquartile range [IQR]) of 6.1 (3.3-11.3) years after transplantation. Median (IQR) fasting concentrations of NEFA were 373 (270-521) μmol/L. Median (IQR) follow-up for graft failure beyond baseline was 7.1 (6.1-7.5) years. Graft failure occurred in 23 (15%), 14 (9%), and 9 (6%) of RTR across increasing gender-specific tertiles of NEFA (P=0.04). In a gender-adjusted Cox regression analysis, log-transformed NEFA level was inversely associated with the development of graft failure (hazard ratio, 0.61; 95% confidence interval, 0.47-0.81; P<0.001). In this prospective cohort study in RTR, we found an inverse association between fasting NEFA concentrations and the risk of developing graft failure. This association suggests a renoprotective rather than a tubulotoxic effect of NEFA. Further studies on the role of different types of NEFA in the progression of renal disease are warranted.

  19. Assessment of Heart Transplant Waitlist Time and Pre- and Post-transplant Failure: A Mixed Methods Approach.

    PubMed

    Goldstein, Benjamin A; Thomas, Laine; Zaroff, Jonathan G; Nguyen, John; Menza, Rebecca; Khush, Kiran K

    2016-07-01

    Over the past two decades, waiting times for heart transplantation have grown increasingly long. We studied the relationship between heart transplant waiting time and transplant failure (removal from the waitlist, pretransplant death, or death or graft failure within 1 year) to determine the risk that conservative donor heart acceptance practices confer in terms of increasing the risk of failure among patients awaiting transplantation. We studied a cohort of 28,283 adults registered on the United Network for Organ Sharing heart transplant waiting list between 2000 and 2010. We used Kaplan-Meier methods with inverse probability censoring weights to examine the risk of transplant failure accumulated over time spent on the waiting list (pretransplant). In addition, we used transplant candidate blood type as an instrumental variable to assess the risk of transplant failure associated with increased wait time. Our results show that those who wait longer for a transplant have greater odds of transplant failure. While on the waitlist, the greatest risk of failure is during the first 60 days. Doubling the amount of time on the waiting list was associated with a 10% increase in the odds of failure within 1 year after transplantation (95% CI for the odds ratio, 1.01 to 1.20). Our findings suggest a relationship between time spent on the waiting list and transplant failure, thereby supporting research aimed at defining adequate donor heart quality and acceptance standards for heart transplantation.

  20. Weighing of risk factors for penetrating keratoplasty graft failure: application of Risk Score System.

    PubMed

    Tourkmani, Abdo Karim; Sánchez-Huerta, Valeria; De Wit, Guillermo; Martínez, Jaime D; Mingo, David; Mahillo-Fernández, Ignacio; Jiménez-Alfaro, Ignacio

    2017-01-01

    To analyze the relationship between the score obtained in the Risk Score System (RSS) proposed by Hicks et al and penetrating keratoplasty (PKP) graft failure at 1y postoperatively, and between each factor in the RSS and the risk of PKP graft failure, using univariate and multivariate analysis. The retrospective cohort study included 152 PKPs from 152 patients. Eighteen cases were excluded from our study due to primary failure (10 cases), incomplete medical notes (5 cases), and follow-up of less than 1y (3 cases). We included 134 PKPs from 134 patients stratified by preoperative risk score. The Spearman coefficient was calculated for the relationship between the score obtained and the risk of failure at 1y. Univariate and multivariate analyses were performed for the impact of every risk factor included in the RSS on graft failure at 1y. The Spearman coefficient showed a statistically significant correlation between the RSS score and graft failure (P<0.05). Multivariate logistic regression analysis showed no statistically significant relationship (P>0.05) between diagnosis or lens status and graft failure. The relationship between the other risk factors studied and graft failure was significant (P<0.05), although the results for previous grafts and graft failure were unreliable. None of our patients had previous blood transfusion; thus, it had no impact. After the application of multivariate analysis techniques, some risk factors do not show the expected impact on graft failure at 1y.

  2. Timing of Pregnancy After Kidney Transplantation and Risk of Allograft Failure.

    PubMed

    Rose, C; Gill, J; Zalunardo, N; Johnston, O; Mehrotra, A; Gill, J S

    2016-08-01

    The optimal timing of pregnancy after kidney transplantation remains uncertain. We determined the risk of allograft failure among women who became pregnant within the first 3 posttransplant years. Among 21,814 women aged 15-45 years who received a first kidney-only transplant between 1990 and 2010, captured in the United States Renal Data System, n = 729 pregnancies were identified using Medicare claims. The probability of allograft failure from any cause including death (ACGL) at 1, 3, and 5 years after pregnancy was 9.6%, 25.9%, and 36.6%, respectively. In multivariate analyses, pregnancy in the first posttransplant year was associated with an increased risk of ACGL (hazard ratio [HR]: 1.18; 95% confidence interval [CI] 1.00, 1.40) and of death-censored graft loss (DCGL) (HR: 1.25; 95% CI 1.04, 1.50), while pregnancy in the second posttransplant year was associated with an increased risk of DCGL (HR: 1.26; 95% CI 1.06, 1.50). Pregnancy in the third posttransplant year was not associated with an increased risk of ACGL or DCGL. These findings demonstrate a higher incidence of allograft failure after pregnancy than previously reported and show that the increased risk of allograft failure extends to pregnancies in the second posttransplant year. © Copyright 2016 The American Society of Transplantation and the American Society of Transplant Surgeons.

  3. Development and evaluation of a composite risk score to predict kidney transplant failure.

    PubMed

    Moore, Jason; He, Xiang; Shabir, Shazia; Hanvesakul, Rajesh; Benavente, David; Cockwell, Paul; Little, Mark A; Ball, Simon; Inston, Nicholas; Johnston, Atholl; Borrows, Richard

    2011-05-01

    Although risk factors for kidney transplant failure are well described, prognostic risk scores to estimate risk in prevalent transplant recipients are limited. Development and validation of risk-prediction instruments. The development data set included 2,763 prevalent patients more than 12 months posttransplant enrolled into the LOTESS (Long Term Efficacy and Safety Surveillance) Study. The validation data set included 731 patients who underwent transplant at a single UK center. Estimated glomerular filtration rate (eGFR) and other risk factors were evaluated using Cox regression. Scores for death-censored and overall transplant failure were based on the summed hazard ratios for baseline predictor variables. Predictive performance was assessed using calibration (Hosmer-Lemeshow statistic), discrimination (C statistic), and clinical reclassification (net reclassification improvement) compared with eGFR alone. In the development data set, 196 patients died and another 225 experienced transplant failure. eGFR, recipient age, race, serum urea and albumin levels, declining eGFR, and prior acute rejection predicted death-censored transplant failure. eGFR, recipient age, sex, serum urea and albumin levels, and declining eGFR predicted overall transplant failure. In the validation data set, 44 patients died and another 101 experienced transplant failure. The weighted scores comprising these variables showed adequate discrimination and calibration for death-censored (C statistic, 0.83; 95% CI, 0.75-0.91; Hosmer-Lemeshow χ² P = 0.8) and overall (C statistic, 0.70; 95% CI, 0.64-0.77; Hosmer-Lemeshow χ² P = 0.5) transplant failure. However, the scores failed to reclassify risk compared with eGFR alone (net reclassification improvements of 7.6% [95% CI, -0.2 to 13.4; P = 0.09] and 4.3% [95% CI, -2.7 to 11.8; P = 0.3] for death-censored and overall transplant failure, respectively).
Retrospective analysis of predominantly cyclosporine-treated patients; limited study size and
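    The discrimination measure used in this record, the C statistic, can be sketched in a few lines. This is an illustrative toy example, not the LOTESS-derived score itself; the follow-up times, event flags, and score values below are invented:

```python
import numpy as np

def harrells_c(time, event, score):
    """Harrell's concordance index for right-censored survival data.

    A pair (i, j) is comparable when the subject with the shorter
    follow-up actually had the event; the pair is concordant when that
    subject also carries the higher risk score."""
    concordant, comparable = 0.0, 0
    n = len(time)
    for i in range(n):
        if event[i] != 1:
            continue  # censored subjects cannot anchor a comparable pair
        for j in range(n):
            if time[i] < time[j]:
                comparable += 1
                if score[i] > score[j]:
                    concordant += 1.0
                elif score[i] == score[j]:
                    concordant += 0.5  # ties in score count half
    return concordant / comparable

# Invented toy cohort: 1 = transplant failure observed, 0 = censored.
time  = np.array([2.0, 5.0, 3.0, 8.0, 6.0, 9.0])
event = np.array([1,   1,   0,   1,   0,   0])
score = np.array([3.1, 0.9, 2.5, 0.8, 1.0, 0.5])
print(round(harrells_c(time, event, score), 3))  # → 0.889
```

    A C statistic of 0.5 indicates no discrimination and 1.0 a perfect ranking, so the 0.83 reported above for death-censored failure reflects good separation of failing from surviving grafts.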

  4. Nighttime kidney transplantation is associated with less pure technical graft failure.

    PubMed

    Brunschot, Denise M D Özdemir-van; Hoitsma, Andries J; van der Jagt, Michel F P; d'Ancona, Frank C; Donders, Rogier A R T; van Laarhoven, Cees J H M; Hilbrands, Luuk B; Warlé, Michiel C

    2016-07-01

    To minimize cold ischemia time, transplantations with kidneys from deceased donors are frequently performed during the night. However, sleep deprivation of those who perform the transplantation may have adverse effects on cognitive and psychomotor performance and may cause reduced cognitive flexibility. We hypothesized that renal transplantations performed during the night are associated with an increased incidence of pure technical graft failure. A retrospective analysis of data from the Dutch Organ Transplant Registry concerning all transplants from deceased donors between 2000 and 2013 was performed. Nighttime surgery was defined as the start of the procedure between 8 p.m. and 8 a.m. The primary outcome measure was technical graft failure, defined as graft loss within 10 days after surgery without signs of (hyper)acute rejection. Of 4,519 renal transplantations in adult recipients, 1,480 were performed during the night. The incidence of pure technical graft failure was 1.0% for procedures started during the night versus 2.6% for daytime surgery (p = .001). In a multivariable model correcting for relevant donor, recipient, and graft factors, daytime surgery was an independent predictor of pure technical graft failure (p < .001). The main limitation of this study is its retrospective design; the influence of some relevant variables, such as the surgeon's level of experience, could not be assessed. We conclude that nighttime surgery is associated with fewer pure technical graft failures. Further research is required to explore factors that may positively influence the performance of the surgical team during the night.

  5. Quantile Regression with Censored Data

    ERIC Educational Resources Information Center

    Lin, Guixian

    2009-01-01

    The Cox proportional hazards model and the accelerated failure time model are frequently used in survival data analysis. They are powerful, yet have limitations due to their model assumptions. Quantile regression offers a semiparametric approach to model data with possible heterogeneity. It is particularly powerful for censored responses, where the…
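    A minimal way to see how a quantile is estimated under right censoring — here the median survival time read off a Kaplan-Meier curve, a much simpler device than the semiparametric quantile regression this record describes — is sketched below with invented toy data:

```python
import numpy as np

def km_quantile(time, event, q=0.5):
    """Estimate the q-th survival-time quantile from right-censored data:
    the first time at which the Kaplan-Meier survival curve drops to or
    below 1 - q."""
    order = np.argsort(time)
    time, event = time[order], event[order]
    surv = 1.0
    for t in np.unique(time[event == 1]):
        at_risk = np.sum(time >= t)                    # still under observation
        deaths = np.sum((time == t) & (event == 1))    # events at time t
        surv *= 1.0 - deaths / at_risk                 # KM product-limit step
        if surv <= 1.0 - q:
            return float(t)
    return float("inf")  # the quantile is not reached within follow-up

# Invented toy data: 1 = event observed, 0 = censored.
time  = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
event = np.array([1,   0,   1,   1,   0,   1,   1,   0])
print(km_quantile(time, event, q=0.5))  # → 6.0
```

    Censored observations contribute to the risk set until they drop out, which is why the estimated median differs from the naive median of the observed times.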

  6. [Causes of death with a functioning graft among kidney allograft recipients].

    PubMed

    Vega, Jorge; Videla, Christian; Borja, Hernán; Goecke, Helmuth; Martínez, Felipe; Betancour, Pablo

    2012-03-01

    Death with a functioning graft (DWGF) is now one of the main causes of renal transplant (RTx) loss. To determine whether the causes of DWGF, the characteristics of donors and recipients, and the complications of RTx have changed over the last two decades. Cooperative study of a cohort of 418 kidney grafts performed between 1968 and 2010. Patients were divided into two groups according to whether their kidney transplants were performed between 1968 and 1992 (Group 1) or between 1993 and 2010 (Group 2). Sixty-eight patients experienced DWGF. Infections were the leading cause of DWGF in both groups (38% and 41%, respectively), followed by cardiovascular diseases (24% and 23%), gastrointestinal disorders (21% and 26%), and cancer (17% and 10%). There were no significant differences in causes of death between the two groups according to the time elapsed since renal transplantation. In Group 1, both the interval between the diagnosis of renal failure and the start of dialysis (HD) and the interval between the start of HD and kidney transplantation were significantly shorter than in Group 2. Group 1 patients also had more acute rejections in the first five years after kidney transplantation (p < 0.001). Group 2 patients more often received kidneys from deceased donors, had undergone previous kidney transplantation, had a higher rate of antibodies against a lymphocyte panel, and had an increased incidence of cardiovascular disorders after five years of RTx. The proportion of graft loss due to DWGF has increased over the last two decades, but its causes have not changed significantly. Infections are the most common cause of DWGF, followed by cardiovascular and digestive diseases.

  7. Outcomes Using Grafts from Donors after Cardiac Death.

    PubMed

    Doyle, M B Majella; Collins, Kelly; Vachharajani, Neeta; Lowell, Jeffrey A; Shenoy, Surendra; Nalbantoglu, Ilke; Byrnes, Kathleen; Garonzik-Wang, Jacqueline; Wellen, Jason; Lin, Yiing; Chapman, William C

    2015-07-01

    Previous reports suggest that donation after cardiac death (DCD) liver grafts have increased primary nonfunction (PNF) and cholangiopathy thought to be due to the graft warm ischemia before cold flushing. In this single-center, retrospective study, 866 adult liver transplantations were performed at our institution from January 2005 to August 2014. Forty-nine (5.7%) patients received DCD donor grafts. The 49 DCD graft recipients were compared with all recipients of donation after brain death donor (DBD) grafts and to a donor and recipient age- and size-matched cohort. The DCD donors were younger (age 28, range 8 to 60 years) than non-DCD (age 44.3, range 9 to 80 years) (p < 0.0001), with similar recipient age. The mean laboratory Model for End-Stage Liver Disease (MELD) was lower in DCD recipients (18.7 vs 22.2, p = 0.03). Mean cold and warm ischemia times were similar. Median ICU and hospital stay were 2 days and 7.5 days in both groups (p = 0.37). Median follow-ups were 4.0 and 3.4 years, respectively. Long-term outcomes were similar between groups, with similar 1-, 3- and 5-year patient and graft survivals (p = 0.59). Four (8.5%) recipients developed ischemic cholangiopathy (IC) at 2, 3, 6, and 8 months. Primary nonfunction and hepatic artery thrombosis did not occur in any patient in the DCD group. Acute kidney injury was more common with DCD grafts (16.3% of DCD recipients required dialysis vs 4.1% of DBD recipients, p = 0.01). An increased donor age (>40 years) was shown to increase the risk of IC (p = 0.006). Careful selection of DCD donors can provide suitable donors, with results of liver transplantation comparable to those with standard brain dead donors. Copyright © 2015 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  8. Monte Carlo Simulation of Sudden Death Bearing Testing

    NASA Technical Reports Server (NTRS)

    Vlcek, Brian L.; Hendricks, Robert C.; Zaretsky, Erwin V.

    2003-01-01

    Monte Carlo simulations combined with sudden death testing were used to compare resultant bearing lives to the calculated bearing life, and the cumulative test time and calendar time, relative to sequential and censored sequential testing. A total of 30,960 virtual 50-mm bore deep-groove ball bearings were evaluated in 33 different sudden death test configurations comprising 36, 72, and 144 bearings each. Variations in both life and Weibull slope were a function of the number of bearings failed, independent of the test method used, and not of the total number of bearings tested. Variation in L10 life as a function of the number of bearings failed was similar to variations in life obtained from sequentially failed real bearings and from Monte Carlo (virtual) testing of entire populations. Reductions of up to 40 percent in bearing test time and calendar time can be achieved by testing to failure or to the L50 life and terminating all testing when the last of the predetermined bearing failures has occurred. Sudden death testing is not a more efficient method to reduce bearing test time or calendar time when compared with censored sequential testing.
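    The core of the simulation described here — generating Weibull bearing lives and observing only each group's first failure — can be sketched as follows. The Weibull slope, characteristic life, and group sizes below are illustrative assumptions, not the values from the NASA study:

```python
import math
import numpy as np

rng = np.random.default_rng(0)
beta, eta = 1.5, 100.0     # assumed Weibull slope and characteristic life (arbitrary units)
k, n_groups = 8, 4000      # bearings per sudden-death group, number of virtual groups

# Sudden death testing: every bearing gets a virtual life, but each group of k
# bearings runs only until its first failure, so only the group minimum is observed.
lives = eta * rng.weibull(beta, size=(n_groups, k))
first_failures = lives.min(axis=1)

# Closure property: the minimum of k Weibull(beta, eta) lives is itself Weibull
# with the same slope and characteristic life eta * k**(-1/beta), which is why
# the population Weibull slope can be recovered from the group minima alone.
eta_group = eta * k ** (-1.0 / beta)
expected_mean = eta_group * math.gamma(1.0 + 1.0 / beta)
print(abs(first_failures.mean() - expected_mean) / expected_mean < 0.05)  # → True
```

    Because each group stops at its first failure, total test time drops sharply, at the cost of observing fewer failures per bearing tested; this is the trade-off the abstract quantifies against censored sequential testing.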

  9. The effect of delayed graft function on graft and patient survival in kidney transplantation: an approach using competing events analysis.

    PubMed

    Fonseca, Isabel; Teixeira, Laetitia; Malheiro, Jorge; Martins, La Salete; Dias, Leonídio; Castro Henriques, António; Mendonça, Denisa

    2015-06-01

    In kidney transplantation, the impact of delayed graft function (DGF) on long-term graft and patient survival is controversial. We examined the impact of DGF on graft and recipient survival by accounting for the possibility that death with graft function may act as a competing risk for allograft failure. We used data from 1281 adult primary deceased-donor kidney recipients whose allografts functioned at least 1 year. The probability of graft loss occurrence is overestimated using the complement of Kaplan-Meier estimates (1-KM). Both the cause-specific Cox proportional hazard regression model (standard Cox) and the subdistribution hazard regression model proposed by Fine and Gray showed that DGF was associated with shorter time to graft failure (csHR = 2.0, P = 0.002; sHR = 1.57, P = 0.009), independent of acute rejection (AR) and after adjusting for traditional factors associated with graft failure. Regarding patient survival, DGF was a predictor of patient death using the cause-specific Cox model (csHR = 1.57, P = 0.029) but not using the subdistribution model. The probability of graft loss from competing end points should not be reported with the 1-KM. Application of a regression model for subdistribution hazard showed that, independent of AR, DGF has a detrimental effect on long-term graft survival, but not on patient survival. © 2015 Steunstichting ESOT.
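    The overestimation by 1-KM described in this record is easy to demonstrate directly. The sketch below (invented toy data, not the study's cohort) computes both the complement of Kaplan-Meier, treating the competing event as censoring, and the Aalen-Johansen cumulative incidence on which subdistribution methods such as Fine and Gray's rest:

```python
import numpy as np

def one_minus_km_vs_cif(time, cause, horizon):
    """Compare 1 - KM (competing events treated as censoring) against the
    Aalen-Johansen cumulative incidence of cause 1.
    cause: 0 = censored, 1 = event of interest, 2 = competing event."""
    event_times = np.unique(time[cause > 0])
    km_surv, all_surv, cif = 1.0, 1.0, 0.0
    for t in event_times[event_times <= horizon]:
        n = np.sum(time >= t)                       # subjects still at risk
        d1 = np.sum((time == t) & (cause == 1))     # events of interest
        d_any = np.sum((time == t) & (cause > 0))   # events of any cause
        cif += all_surv * d1 / n     # Aalen-Johansen increment uses S(t-)
        all_surv *= 1.0 - d_any / n  # all-cause event-free survival
        km_surv *= 1.0 - d1 / n      # KM that censors the competing cause
    return 1.0 - km_surv, cif

# Toy data: 1 = graft failure, 2 = death with a functioning graft, 0 = censored.
time  = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
cause = np.array([1,   2,   1,   2,   0,   1,   2,   0])
one_km, cif = one_minus_km_vs_cif(time, cause, horizon=8.0)
print(round(one_km, 3), round(cif, 3))  # → 0.514 0.417
```

    1-KM exceeds the cumulative incidence whenever competing events occur, which is exactly the overestimation the authors warn against reporting.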

  10. Donor age and early graft failure after lung transplantation: a cohort study.

    PubMed

    Baldwin, M R; Peterson, E R; Easthausen, I; Quintanilla, I; Colago, E; Sonett, J R; D'Ovidio, F; Costa, J; Diamond, J M; Christie, J D; Arcasoy, S M; Lederer, D J

    2013-10-01

    Lungs from older adult organ donors are often unused because of concerns for increased mortality. We examined associations between donor age and transplant outcomes among 8860 adult lung transplant recipients using Organ Procurement and Transplantation Network and Lung Transplant Outcomes Group data. We used stratified Cox proportional hazard models and generalized linear mixed models to examine associations between donor age and both 1-year graft failure and primary graft dysfunction (PGD). The rate of 1-year graft failure was similar among recipients of lungs from donors age 18-64 years, but severely ill recipients (Lung Allocation Score [LAS] >47.7 or use of mechanical ventilation) of lungs from donors age 56-64 years had increased rates of 1-year graft failure (p-values for interaction = 0.04 and 0.02, respectively). Recipients of lungs from donors <18 and ≥65 years had increased rates of 1-year graft failure (adjusted hazard ratio [HR] 1.23, 95% CI 1.01-1.50 and adjusted HR 2.15, 95% CI 1.47-3.15, respectively). Donor age was not associated with the risk of PGD. In summary, the use of lungs from donors age 56 to 64 years may be safe for adult candidates without a high LAS and the use of lungs from pediatric donors is associated with a small increase in early graft failure. © Copyright 2013 The American Society of Transplantation and the American Society of Transplant Surgeons.

  11. Prediction of mode of death in heart failure: the Seattle Heart Failure Model.

    PubMed

    Mozaffarian, Dariush; Anker, Stefan D; Anand, Inder; Linker, David T; Sullivan, Mark D; Cleland, John G F; Carson, Peter E; Maggioni, Aldo P; Mann, Douglas L; Pitt, Bertram; Poole-Wilson, Philip A; Levy, Wayne C

    2007-07-24

    Prognosis and mode of death in heart failure patients are highly variable in that some patients die suddenly (often from ventricular arrhythmia) and others die of progressive failure of cardiac function (pump failure). Prediction of mode of death may facilitate decisions about specific medications or devices. We used the Seattle Heart Failure Model (SHFM), a validated prediction model for total mortality in heart failure, to assess the mode of death in 10,538 ambulatory patients with New York Heart Association class II to IV heart failure and predominantly systolic dysfunction enrolled in 6 randomized trials or registries. During 16,735 person-years of follow-up, 2014 deaths occurred, which included 1014 sudden deaths and 684 pump-failure deaths. Compared with a SHFM score of 0, patients with a score of 1 had a 50% higher risk of sudden death, patients with a score of 2 had a nearly 3-fold higher risk, and patients with a score of 3 or 4 had a nearly 7-fold higher risk (P<0.001 for all comparisons; 1-year area under the receiver operating curve, 0.68). Stratification of risk of pump-failure death was even more pronounced, with a 4-fold higher risk with a score of 1, a 15-fold higher risk with a score of 2, a 38-fold higher risk with a score of 3, and an 88-fold higher risk with a score of 4 (P<0.001 for all comparisons; 1-year area under the receiver operating curve, 0.85). The proportion of deaths caused by sudden death versus pump-failure death decreased from a ratio of 7:1 with a SHFM score of 0 to a ratio of 1:2 with a SHFM score of 4 (P trend <0.001). The SHFM score provides information about the likely mode of death among ambulatory heart failure patients. Investigation is warranted to determine whether such information might predict responses to or cost-effectiveness of specific medications or devices in heart failure patients.

  12. Chronicity of Anterior Cruciate Ligament Deficiency, Part 2: Radiographic Predictors of Early Graft Failure

    PubMed Central

    Tanaka, Yoshinari; Kita, Keisuke; Takao, Rikio; Amano, Hiroshi; Uchida, Ryohei; Shiozaki, Yoshiki; Yonetani, Yasukazu; Kinugasa, Kazutaka; Mae, Tatsuo; Horibe, Shuji

    2018-01-01

    Background: Accumulating evidence suggests that long-term anterior cruciate ligament (ACL) deficiency can give rise to an abnormal tibiofemoral relationship and subsequent intra-articular lesions. However, the effects of chronic ACL deficiency (ACLD) on early graft failure after anatomic reconstruction remain unclear. Hypothesis: We hypothesized that patients with long-term ACLD lasting more than 5 years would have a greater rate of early graft failure due to insufficient intraoperative reduction of the tibia and that the preoperative and immediately postoperative abnormal tibiofemoral relationship in the sagittal plane, such as anterior tibial subluxation (ATS), would correlate with the graft status on postoperative magnetic resonance imaging (MRI). Study Design: Cohort study; Level of evidence, 3. Methods: A total of 358 patients who had undergone anatomic ACL reconstruction with hamstring grafts were divided into 5 groups based on chronicity of ACLD: (1) 0 to 6 months, (2) 6 months to 1 year, (3) 1 to 2 years, (4) 2 to 5 years, and (5) longer than 5 years. Preoperatively and immediately postoperatively, lateral radiographs in full extension were taken in all patients to evaluate the tibiofemoral relationship, specifically with regard to ATS, space for the ACL (sACL), and extension angle. All patients underwent MRI at 6 months to reveal graft status. Groups with a high rate of graft failure were further analyzed to compare demographic and radiographic factors between the intact and failure subgroups, followed by multivariate logistic regression analysis to identify predisposing factors. Results: Graft failure without trauma was observed in 4 (1.8%), 0 (0%), 1 (3.7%), 3 (9.7%), and 8 patients (17.7%) in groups 1, 2, 3, 4, and 5, respectively. 
Of the 76 patients in groups 4 and 5, significant differences were noted between the failure and intact subgroups in preoperative ATS (4.9 vs 2.4 mm, respectively; P < .01), side-to-side differences in sACL (sACL-SSD) (4.7 vs

  13. Hepatitis C Virus Recurrence Occurs Earlier in Patients Receiving Donation After Circulatory Death Liver Transplant Grafts Compared With Those Receiving Donation After Brainstem Death Grafts.

    PubMed

    Townsend, S A; Monga, M A; Nightingale, P; Mutimer, D; Elsharkawy, A M; Holt, A

    2017-11-01

Hepatitis C virus (HCV)-related cirrhosis remains the commonest indication for liver transplantation worldwide, yet few studies have investigated the impact of donation after circulatory death (DCD) graft use on HCV recurrence and patient outcomes. DCD grafts have augmented the limited donor organ pool and reduced wait-list mortality, although concerns regarding graft longevity and patient outcome persist. This was a single-center study of all HCV+ adults who underwent DCD liver transplantation between 2004 and 2014. Forty-four HCV+ patients received DCD grafts and were matched with 44 HCV+ recipients of donation after brainstem death (DBD) grafts, and their outcomes were examined. The groups were matched for age, sex, and presence of hepatocellular carcinoma; no significant differences were found between the groups' donor or recipient characteristics. Paired and unpaired analyses demonstrated that HCV recurrence was more rapid in recipients of DCD organs than in recipients of DBD grafts (408 vs 657 days; P = .006). There were no significant differences in graft survival, patient survival, or rates of biliary complications between the cohorts, despite the DCD donors being on average 10 years older than those used in other published experience. In an era of highly effective direct-acting antiviral therapy, rapid HCV recrudescence in grafts from DCD donors should not compromise long-term morbidity or mortality. In the context of rising wait-list mortality, it is prudent to use all available sources to expand the pool of donor organs, and our data support the practice of using extended-criteria DCD grafts based on donor age. Notwithstanding that, clinicians should be aware that HCV recrudescence is more rapid in DCD recipients, and early post-transplant antiviral therapy is indicated to prevent graft injury. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.

  14. Donation after cardiac death liver transplantation in primary sclerosing cholangitis: proceed with caution.

    PubMed

    Sundaram, Vinay; Choi, Gina; Jeon, Christie Y; Ayoub, Walid S; Nissen, Nicholas N; Klein, Andrew S; Tran, Tram T

    2015-05-01

    Primary sclerosing cholangitis (PSC) patients suffer from comorbidities unaccounted for by the model for end-stage liver disease scoring system and may benefit from the increased donor organ pool provided by donation after cardiac death (DCD) liver transplantation. However, the impact of DCD transplantation on PSC graft outcomes is unknown. We studied 41,018 patients using the United Network for Organ Sharing database from 2002 through 2012. Kaplan-Meier analysis and Cox regression were used to evaluate graft survival and risk factors for graft failure, respectively. The PSC patients receiving DCD livers (n=75) showed greater overall graft failure (37.3% vs. 20.4%, P = 0.001), graft failure from biliary complications (47.4% vs. 13.9%, P = 0.002), and shorter graft survival time (P = 0.003), compared to PSC patients receiving donation after brain death organs (n=1592). Among DCD transplants (n=1943), PSC and non-PSC patients showed similar prevalence of graft failure and graft survival time, though a trend existed toward increased biliary-induced graft failure among PSC patients (47.4 vs. 26.4%, P = 0.063). Cox modeling demonstrated that PSC patients have a positive graft survival advantage compared to non-PSC patients (hazard ratio [HR]=0.72, P < 0.001), whereas DCD transplantation increased risk of graft failure (HR = 1.28, P < 0.001). Furthermore, the interaction between DCD transplant and PSC was significant (HR = 1.76, P = 0.015), indicating that use of DCD organs impacts graft survival more in PSC than non-PSC patients. Donation after cardiac death liver transplantation leads to significantly worse outcomes in PSC. We recommend cautious use of DCD transplantation in this population.

  15. The Association Between Renin-Angiotensin System Blockade and Long-term Outcomes in Renal Transplant Recipients: The Wisconsin Allograft Recipient Database (WisARD).

    PubMed

    Shin, Jung-Im; Palta, Mari; Djamali, Arjang; Kaufman, Dixon B; Astor, Brad C

    2016-07-01

Renin-angiotensin system (RAS) blockade reduces mortality in the general population and among non-dialysis-dependent patients with chronic kidney disease. RAS blockade also decreases proteinuria and protects renal function in non-transplant patients with chronic kidney disease. It remains controversial, however, whether this translates into improved patient or graft survival among transplant recipients. We analyzed 2684 primary kidney transplant recipients at the University of Wisconsin from 1994 to 2010 who had a functioning graft at 6 months after transplantation. We assessed the association of RAS blockade with patient and graft survival using time-dependent Cox and marginal structural models. Three hundred seventy-seven deaths and 329 graft failures before death (638 total graft losses) occurred during a median of 5.4 years of follow-up. RAS blockade was associated with an adjusted hazard ratio of 0.63 (95% confidence interval, 0.53-0.75) for total graft loss, 0.69 (0.55-0.86) for death, and 0.62 (0.49-0.78) for death-censored graft failure. The associations of RAS blockade with a lower risk of total graft loss and mortality were stronger with more severe proteinuria. RAS blockade was associated with a 2-fold higher risk of hyperkalemia. Our findings suggest that RAS blockade is associated with better patient and graft survival in renal transplant recipients.

  16. A proportional hazards regression model for the subdistribution with right-censored and left-truncated competing risks data

    PubMed Central

    Zhang, Xu; Zhang, Mei-Jie; Fine, Jason

    2012-01-01

With competing risks failure time data, one often needs to assess covariate effects on the cumulative incidence probabilities. Fine and Gray proposed a proportional hazards regression model to directly model the subdistribution of a competing risk. They developed an estimation procedure for right-censored competing risks data based on inverse probability of censoring weighting. Right-censored and left-truncated competing risks data sometimes occur in biomedical research. In this paper, we study the proportional hazards regression model for the subdistribution of a competing risk with right-censored and left-truncated data. We adopt a new weighting technique to estimate the parameters in this model and derive the large-sample properties of the proposed estimators. To illustrate the application of the new method, we analyze failure time data for children with acute leukemia; in this example, the failure times for children who had bone marrow transplants were left truncated. PMID:21557288
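The subdistribution modeled by Fine and Gray is the cumulative incidence function (CIF). As a point of reference for the quantity being regressed on, here is a minimal numpy sketch of its standard nonparametric (Aalen-Johansen) estimate for right-censored, non-truncated competing risks data; the data below are purely illustrative and not from the paper:

```python
import numpy as np

def cumulative_incidence(times, events, cause=1):
    """Aalen-Johansen estimate of the cumulative incidence function for one
    cause with right-censored competing risks data.
    `events`: 0 = censored, 1, 2, ... = competing causes of failure."""
    times = np.asarray(times, float)
    events = np.asarray(events)
    order = np.argsort(times)
    times, events = times[order], events[order]
    grid = np.unique(times)
    surv = 1.0                       # overall Kaplan-Meier survival just before t
    cif = np.zeros(len(grid))
    total = 0.0
    for k, t in enumerate(grid):
        at_risk = np.sum(times >= t)
        here = times == t
        d_any = np.sum(here & (events != 0))        # failures of any cause at t
        d_cause = np.sum(here & (events == cause))  # failures of the cause of interest
        total += surv * d_cause / at_risk           # mass assigned to this cause at t
        surv *= 1.0 - d_any / at_risk               # update overall survival
        cif[k] = total
    return grid, cif

# Toy data: subjects failing from cause 1, cause 2, or censored (0).
grid, cif = cumulative_incidence([1, 2, 3, 4, 5], [1, 0, 2, 1, 0], cause=1)
```

Unlike one minus a Kaplan-Meier curve that censors competing events, this estimate never exceeds the true probability of the cause of interest, which is why the subdistribution is the natural target for covariate modeling.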

  17. Techniques for estimating health care costs with censored data: an overview for the health services researcher

    PubMed Central

    Wijeysundera, Harindra C; Wang, Xuesong; Tomlinson, George; Ko, Dennis T; Krahn, Murray D

    2012-01-01

Objective The aim of this study was to review statistical techniques for estimating mean population cost using health care cost data that, because of the inability to achieve complete follow-up until death, are right censored. The target audience is health services researchers without an advanced statistical background. Methods Data were sourced from longitudinal heart failure costs from Ontario, Canada, using administrative databases to estimate costs. The dataset consisted of 43,888 patients, with follow-up periods ranging from 1 to 1538 days (mean 576 days). Mean health care costs over 1080 days of follow-up were calculated using naïve estimators (the full-sample and uncensored-case estimators), reweighted estimators (specifically, the inverse probability weighted estimator), and phase-based costing. Costs were adjusted to 2008 Canadian dollars using the Bank of Canada consumer price index (http://www.bankofcanada.ca/en/cpi.html). Results Over the restricted follow-up of 1080 days, 32% of patients were censored. The full-sample estimator underestimated mean cost ($30,420) compared with the reweighted estimators ($36,490). The phase-based costing estimate of $37,237 was similar to that of the simple reweighted estimator. Conclusion The authors recommend against the use of full-sample or uncensored-case estimators when censored data are present. In the presence of heavy censoring, phase-based costing is an attractive alternative approach. PMID:22719214
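The simple reweighted estimator the abstract refers to upweights each uncensored subject by the inverse of the Kaplan-Meier probability of remaining uncensored just before that subject's follow-up time. A minimal numpy sketch, on illustrative data, assuming independent censoring and (for brevity) no tied follow-up times:

```python
import numpy as np

def ipw_mean_cost(times, delta, costs):
    """Simple inverse-probability-weighted estimate of mean cost.
    delta == 1 for subjects observed to the event (complete cost),
    delta == 0 for censored subjects. Assumes no tied follow-up times."""
    order = np.argsort(times)
    delta = np.asarray(delta)[order]
    costs = np.asarray(costs, float)[order]
    n = len(costs)
    K = 1.0                          # estimated P(uncensored) just before current time
    total = 0.0
    for i in range(n):
        if delta[i] == 1:
            total += costs[i] / K    # complete case, reweighted
        else:
            K *= 1.0 - 1.0 / (n - i) # censoring "event" updates K (KM step)
    return total / n

# Toy data: subject 2 is censored, so later complete cases get weight 1/K > 1.
mean_cost = ipw_mean_cost([1, 2, 3, 4], [1, 0, 1, 1], [10.0, 0.0, 20.0, 30.0])
```

With no censoring, K stays 1 and the estimator reduces to the plain sample mean; dropping censored subjects without reweighting is exactly the uncensored-case estimator the authors recommend against.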

  18. The fracture load and failure types of veneered anterior zirconia crowns: an analysis of normal and Weibull distribution of complete and censored data.

    PubMed

    Stawarczyk, Bogna; Ozcan, Mutlu; Hämmerle, Christoph H F; Roos, Malgorzata

    2012-05-01

The aim of this study was to compare the fracture load of veneered anterior zirconia crowns using normal and Weibull distributions of complete and censored data. Standardized zirconia frameworks for maxillary canines were milled using a CAD/CAM system and randomly divided into 3 groups (N=90, n=30 per group). They were veneered with three veneering ceramics, namely GC Initial ZR, Vita VM9, and IPS e.max Ceram, using the layering technique. The crowns were cemented with glass ionomer cement on metal abutments. The specimens were then loaded to fracture (1 mm/min) in a universal testing machine. The data were analyzed using the classical method (normal data distribution (μ, σ); Levene test and one-way ANOVA) and according to Weibull statistics (s, m). In addition, fracture load results were analyzed depending on complete and censored failure types (only chipping vs. total fracture together with chipping). When computed with complete data, significantly higher mean fracture loads (N) were observed for GC Initial ZR (μ=978, σ=157; s=1043, m=7.2) and VITA VM9 (μ=1074, σ=179; s=1139, m=7.8) than for IPS e.max Ceram (μ=798, σ=174; s=859, m=5.8) (p<0.05) by classical and Weibull statistics, respectively. When the data were censored for only total fracture, IPS e.max Ceram presented the lowest fracture load for chipping with both the classical distribution (μ=790, σ=160) and Weibull statistics (s=836, m=6.5). When total fracture together with chipping (classical distribution) was considered as failure, IPS e.max Ceram did not show a significantly different fracture load for total fracture (μ=1054, σ=110) compared with the other groups (GC Initial ZR: μ=1039, σ=152; VITA VM9: μ=1170, σ=166). According to the Weibull-distributed data, VITA VM9 showed significantly higher fracture load (s=1228, m=9.4) than the other groups. Both the classical distribution and Weibull statistics for complete data yielded similar outcomes. Censored data analysis of all ceramic systems based on failure types is essential.
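The Weibull parameters reported above (characteristic strength s, the 63.2% quantile, and modulus m) are conventionally estimated either by maximum likelihood or by the classical probability-plot method. A minimal numpy sketch of the latter (median-rank regression); the check at the end feeds in exact Weibull quantiles built from the VITA VM9 values s=1139, m=7.8 purely as an illustration, not as a re-analysis of the study's data:

```python
import numpy as np

def weibull_probability_plot(loads):
    """Estimate Weibull modulus m and characteristic strength s by
    median-rank regression: regress ln(-ln(1 - F_i)) on ln(load_i) for
    the sorted loads, where F_i = (i - 0.5)/n is the median-rank CDF."""
    x = np.log(np.sort(np.asarray(loads, float)))
    n = len(x)
    F = (np.arange(1, n + 1) - 0.5) / n
    y = np.log(-np.log(1.0 - F))
    m, b = np.polyfit(x, y, 1)       # slope = modulus m, intercept = -m*ln(s)
    s = np.exp(-b / m)               # characteristic strength (63.2% quantile)
    return m, s

# Illustrative check: exact Weibull quantiles at the median ranks
# recover the generating parameters.
F = (np.arange(1, 31) - 0.5) / 30
loads = 1139.0 * (-np.log(1.0 - F)) ** (1.0 / 7.8)
m_hat, s_hat = weibull_probability_plot(loads)
```

A higher modulus m (e.g., 9.4 vs 5.8 above) means a tighter spread of fracture loads, which is why Weibull statistics can rank ceramics differently from a comparison of means alone.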

  19. [Nonparametric method of estimating survival functions containing right-censored and interval-censored data].

    PubMed

    Xu, Yonghong; Gao, Xiaohuan; Wang, Zhengxi

    2014-04-01

Missing data represent a general problem in many scientific fields, especially in medical survival analysis. Interpolation is one important approach to handling censored data. However, most interpolation methods replace censored observations with exact values, which distorts the real distribution of the censored data and reduces the probability that the true value falls within the imputed data. To solve this problem, we propose a nonparametric method of estimating the survival function of right-censored and interval-censored data and compare its performance to that of the self-consistent (SC) algorithm. Compared with mean interpolation and nearest-neighbor interpolation, the proposed method replaces right-censored data with interval-censored data, greatly improving the probability that the true value falls within the imputation interval. It then uses empirical distribution theory to estimate the survival function of right-censored and interval-censored data. The results of numerical examples and a real breast cancer dataset demonstrated that the proposed method had higher accuracy and better robustness across different proportions of censored data. This paper thus provides a useful method for comparing the performance of clinical treatments through estimation of patient survival data, and offers some help for medical survival data analysis.
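The self-consistent (SC) algorithm used as the comparator above is an EM iteration that redistributes probability mass among candidate support points until the distribution reproduces itself. A minimal numpy sketch, using the interval endpoints as the support set (a simplification of Turnbull's innermost-interval construction); an exact observation t is encoded as the degenerate interval [t, t]:

```python
import numpy as np

def turnbull_em(intervals, tol=1e-10, max_iter=1000):
    """Self-consistent (EM) estimate of an event-time distribution from
    interval-censored data. Each observation is a closed interval [l, r].
    Returns the support points and their estimated probability masses."""
    lo = np.array([l for l, r in intervals], float)
    hi = np.array([r for l, r in intervals], float)
    support = np.unique(np.concatenate([lo, hi]))
    # alpha[i, j] = 1 if support point j is compatible with interval i
    alpha = ((support >= lo[:, None]) & (support <= hi[:, None])).astype(float)
    p = np.full(len(support), 1.0 / len(support))
    for _ in range(max_iter):
        denom = alpha @ p                              # P(observed interval i)
        p_new = (alpha * p).T @ (1.0 / denom) / len(lo)  # expected mass per point
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new
    return support, p

# With exact (degenerate) intervals the EM reproduces the empirical frequencies.
support, p = turnbull_em([(1, 1), (2, 2), (2, 2), (3, 3)])
```

The survival function is then the complementary cumulative sum of `p` over `support`; the point is that mass is spread over whole compatible intervals rather than pinned to a single imputed value.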

  20. Associations of renal function at 1-year after kidney transplantation with subsequent return to dialysis, mortality, and healthcare costs.

    PubMed

    Schnitzler, Mark A; Johnston, Karissa; Axelrod, David; Gheorghian, Adrian; Lentine, Krista L

    2011-06-27

Improved early kidney transplant outcomes limit the contemporary utility of standard clinical endpoints. Quantifying the relationship of renal function at 1 year after transplant with subsequent clinical outcomes and healthcare costs may facilitate cost-benefit evaluations among transplant recipients. Data for Medicare-insured kidney-only transplant recipients (1995-2003) were drawn from the United States Renal Data System. Associations of estimated glomerular filtration rate (eGFR) level at the first transplant anniversary with subsequent death-censored graft failure and patient death in posttransplant years 1 to 3 and 4 to 7 were examined by parametric survival analysis. Associations of eGFR with total health care costs defined by Medicare payments were assessed with multivariate linear regression. Among 38,015 participants, first-anniversary eGFR level demonstrated graded associations with subsequent outcomes. Compared with patients with 12-month eGFR greater than or equal to 60 mL/min/1.73 m², the adjusted relative risk of death-censored graft failure in years 1 to 3 was 31% greater for eGFR 45 to 59 mL/min/1.73 m² (P<0.0001) and 622% greater for eGFR 15 to 30 mL/min/1.73 m² (P<0.0001). Associations of first-anniversary eGFR level with graft failure and mortality remained significant in years 4 to 7. The proportions of recipients expected to return to dialysis or die attributable to eGFR less than 60 mL/min/1.73 m² over 10 years were 23.1% and 9.4%, respectively, and were significantly higher than the proportions attributable to delayed graft function or acute rejection. Reduced eGFR was associated with graded and significant increases in health care spending during years 2 and 3 after transplant (P<0.0001). eGFR is strongly associated with clinical and economic outcomes after kidney transplantation.

  1. Revisiting double kidney transplantation: two kidneys provide better graft survival than one.

    PubMed

    Cruzado, J M; Fernandez, L; Riera, L; Bestard, O; Carrera, M; Torras, J; Gil Vernet, S; Melilli, E; Ngango, L; Grinyó, J M

    2011-01-01

Double kidney transplantation is an accepted strategy to increase the donor pool. For older donor kidneys, protocols for deciding whether to perform a dual or a single transplantation are mainly based on preimplantation biopsies. The aim of our study was to evaluate the long-term graft and patient survival of our "Dual Kidney Transplant" program. Patients who lost one of their grafts peritransplantation were used as controls. A total of 203 patients underwent kidney transplantation from December 1996 to January 2008 in our "old for old" renal transplantation program. We excluded 21 patients because of a nonfunctioning kidney, hyperacute rejection, or patient death with a functioning graft within the first month. Seventy-nine of the 182 kidney transplantations in the "old for old" program were dual kidney transplantations (DKT). Fifteen of the 79 patients lost one of their kidney grafts (the uninephrectomized [UNX] group). At 1 year, renal function was lower and proteinuria greater in the UNX group than in the DKT group. Patient survival was similar in both groups. However, death-censored graft survival was lower in UNX than in DKT patients. The 5-year graft survival rate was 70% in the UNX versus 93% in the DKT cohort (P = .04). In conclusion, given the kidney shortage, our results raise the question of whether the excellent transplant outcomes with DKT counterbalance the reduction in the donor pool, which could otherwise provide acceptable transplant outcomes for more patients through single kidney transplantation. Copyright © 2011 Elsevier Inc. All rights reserved.

  2. Mass Spectrometry Based Metabolomics Comparison of Liver Grafts from Donors after Circulatory Death (DCD) and Donors after Brain Death (DBD) Used in Human Orthotopic Liver Transplantation.

    PubMed

    Hrydziuszko, Olga; Perera, M Thamara P R; Laing, Richard; Kirwan, Jennifer; Silva, Michael A; Richards, Douglas A; Murphy, Nick; Mirza, Darius F; Viant, Mark R

    2016-01-01

    Use of marginal liver grafts, especially those from donors after circulatory death (DCD), has been considered as a solution to organ shortage. Inferior outcomes have been attributed to donor warm ischaemic damage in these DCD organs. Here we sought to profile the metabolic mechanisms underpinning donor warm ischaemia. Non-targeted Fourier transform ion cyclotron resonance (FT-ICR) mass spectrometry metabolomics was applied to biopsies of liver grafts from donors after brain death (DBD; n = 27) and DCD (n = 10), both during static cold storage (T1) as well as post-reperfusion (T2). Furthermore 6 biopsies from DBD donors prior to the organ donation (T0) were also profiled. Considering DBD and DCD together, significant metabolic differences were discovered between T1 and T2 (688 peaks) that were primarily related to amino acid metabolism, meanwhile T0 biopsies grouped together with T2, denoting the distinctively different metabolic activity of the perfused state. Major metabolic differences were discovered between DCD and DBD during cold-phase (T1) primarily related to glucose, tryptophan and kynurenine metabolism, and in the post-reperfusion phase (T2) related to amino acid and glutathione metabolism. We propose tryptophan/kynurenine and S-adenosylmethionine as possible biomarkers for the previously established higher graft failure of DCD livers, and conclude that the associated pathways should be targeted in more exhaustive and quantitative investigations.

  3. Mass Spectrometry Based Metabolomics Comparison of Liver Grafts from Donors after Circulatory Death (DCD) and Donors after Brain Death (DBD) Used in Human Orthotopic Liver Transplantation

    PubMed Central

    Laing, Richard; Kirwan, Jennifer; Silva, Michael A.; Richards, Douglas A.; Murphy, Nick; Mirza, Darius F.; Viant, Mark R.

    2016-01-01

    Use of marginal liver grafts, especially those from donors after circulatory death (DCD), has been considered as a solution to organ shortage. Inferior outcomes have been attributed to donor warm ischaemic damage in these DCD organs. Here we sought to profile the metabolic mechanisms underpinning donor warm ischaemia. Non-targeted Fourier transform ion cyclotron resonance (FT-ICR) mass spectrometry metabolomics was applied to biopsies of liver grafts from donors after brain death (DBD; n = 27) and DCD (n = 10), both during static cold storage (T1) as well as post-reperfusion (T2). Furthermore 6 biopsies from DBD donors prior to the organ donation (T0) were also profiled. Considering DBD and DCD together, significant metabolic differences were discovered between T1 and T2 (688 peaks) that were primarily related to amino acid metabolism, meanwhile T0 biopsies grouped together with T2, denoting the distinctively different metabolic activity of the perfused state. Major metabolic differences were discovered between DCD and DBD during cold-phase (T1) primarily related to glucose, tryptophan and kynurenine metabolism, and in the post-reperfusion phase (T2) related to amino acid and glutathione metabolism. We propose tryptophan/kynurenine and S-adenosylmethionine as possible biomarkers for the previously established higher graft failure of DCD livers, and conclude that the associated pathways should be targeted in more exhaustive and quantitative investigations. PMID:27835640

  4. The risk of allograft failure and the survival benefit of kidney transplantation are complicated by delayed graft function.

    PubMed

    Gill, Jagbir; Dong, Jianghu; Rose, Caren; Gill, John S

    2016-06-01

Concern about the long-term impact of delayed graft function (DGF) may limit the use of high-risk organs for kidney transplantation. To understand this better, we analyzed 29,598 transplants of mate kidneys from the same deceased donor in which only 1 of the 2 transplants developed DGF. The DGF-associated risk of graft failure was greatest in the first posttransplant year and in patients with concomitant acute rejection (hazard ratio: 8.22, 95% confidence interval: 4.76-14.21). In contrast, the DGF-associated risk of graft failure after the first posttransplant year in patients without acute rejection was far lower (hazard ratio: 1.15, 95% confidence interval: 1.02-1.29). In subsequent analysis, recipients of transplants complicated by DGF still derived a survival benefit compared with patients who received treatment with dialysis, irrespective of donor quality as measured by the Kidney Donor Profile Index (KDPI). The time required to derive a survival benefit was longer in transplants with DGF than in transplants without DGF, and this difference was greatest in recipients of lower-quality kidneys (difference: 250-279 days for KDPI 20%-60% vs. 809 days for KDPI over 80%). Thus, the association of DGF with graft failure is primarily limited to the first posttransplant year. Transplants complicated by DGF provide a survival benefit compared with treatment with dialysis, but the survival benefit is lower in kidney transplants with lower KDPI. This information may increase acceptance of kidneys at high risk for DGF and inform strategies to minimize the risk of death in the setting of DGF. Copyright © 2016 International Society of Nephrology. Published by Elsevier Inc. All rights reserved.

  5. Should we use standard survival models or the illness-death model for interval-censored data to investigate risk factors of chronic kidney disease progression?

    PubMed

    Boucquemont, Julie; Metzger, Marie; Combe, Christian; Stengel, Bénédicte; Leffondre, Karen

    2014-01-01

In studies investigating risk factors of chronic kidney disease (CKD) progression, one may be interested in estimating factor effects on both a fall of glomerular filtration rate (GFR) below a specific level (i.e., a CKD stage) and death. Such studies have to account for the fact that GFR is measured at intermittent visits only, which implies that progression to the stage of interest is unknown for patients who die before being observed at that stage. Our objective was to compare the results of an illness-death model that handles this uncertainty with those of frequently used survival models. This study included 1,519 patients from the NephroTest cohort with CKD stages 1-4 at baseline (69% males, 59±15 years, median protein/creatinine ratio [PCR] 27.4 mg/mmol) and subsequent annual measures of GFR (follow-up time 4.3±2.7 years). Each model was used to estimate the effects of sex, age, PCR, and GFR at baseline on the hazards of progression to CKD stage 5 (GFR<15 mL/min/1.73 m², n = 282 observed) and death (n = 168). For progression to stage 5, there were only minor differences between the results of the different models. The differences were larger for the hazard of death before or after progression. Our results also suggest that previous findings on the effect of age on end-stage renal disease are more likely due to a strong impact of age on death than to an effect on progression. The probabilities of progression were systematically under-estimated with the survival model compared with the illness-death model. This study illustrates the advantages of the illness-death model for accurately estimating the effects of risk factors on the hazards of progression and death, and on the probabilities of progression. It avoids the need to choose arbitrary time-to-event and time-to-censoring definitions, while accounting for both interval censoring and competition by death within a single analytical model.

  6. Donor-specific anti-HLA Abs and graft failure in matched unrelated donor hematopoietic stem cell transplantation

    PubMed Central

    Ciurea, Stefan O.; Thall, Peter F.; Wang, Xuemei; Wang, Sa A.; Hu, Ying; Cano, Pedro; Aung, Fleur; Rondon, Gabriela; Molldrem, Jeffrey J.; Korbling, Martin; Shpall, Elizabeth J.; de Lima, Marcos; Champlin, Richard E.

    2011-01-01

Anti-HLA donor-specific Abs (DSAs) have been reported to be associated with graft failure in mismatched hematopoietic stem cell transplantation; however, their role in the development of graft failure in matched unrelated donor (MUD) transplantation remains unclear. We hypothesized that DSAs against a mismatched HLA-DPB1 locus are associated with graft failure in this setting. The presence of anti-HLA Abs before transplantation was determined prospectively in 592 MUD transplantation recipients using mixed-screen beads in a solid-phase fluorescent assay. DSA identification was performed using single-Ag beads containing the corresponding donor's HLA-mismatched Ags. Anti-HLA Abs were detected in 116 patients (19.6%), including 20 patients (3.4%) with anti-DPB1 Abs. Overall, graft failure occurred in 19 of 592 patients (3.2%), including 16 of 584 (2.7%) patients without anti-HLA Abs compared with 3 of 8 (37.5%) patients with DSAs (P = .0014). In multivariate analysis, DSAs were the only factor highly associated with graft failure (P = .0001; odds ratio = 21.3). Anti-HLA allosensitization was higher overall in women than in men (30.8% vs 12.1%; P < .0001) and higher in women with 1 (P = .008) and 2 or more pregnancies (P = .0003) than in men. We conclude that the presence of anti-DPB1 DSAs is associated with graft failure in MUD hematopoietic stem cell transplantation. PMID:21967975

  7. Declining Risk of Sudden Death in Heart Failure.

    PubMed

    Shen, Li; Jhund, Pardeep S; Petrie, Mark C; Claggett, Brian L; Barlera, Simona; Cleland, John G F; Dargie, Henry J; Granger, Christopher B; Kjekshus, John; Køber, Lars; Latini, Roberto; Maggioni, Aldo P; Packer, Milton; Pitt, Bertram; Solomon, Scott D; Swedberg, Karl; Tavazzi, Luigi; Wikstrand, John; Zannad, Faiez; Zile, Michael R; McMurray, John J V

    2017-07-06

    The risk of sudden death has changed over time among patients with symptomatic heart failure and reduced ejection fraction with the sequential introduction of medications including angiotensin-converting-enzyme inhibitors, angiotensin-receptor blockers, beta-blockers, and mineralocorticoid-receptor antagonists. We sought to examine this trend in detail. We analyzed data from 40,195 patients who had heart failure with reduced ejection fraction and were enrolled in any of 12 clinical trials spanning the period from 1995 through 2014. Patients who had an implantable cardioverter-defibrillator at the time of trial enrollment were excluded. Weighted multivariable regression was used to examine trends in rates of sudden death over time. Adjusted hazard ratios for sudden death in each trial group were calculated with the use of Cox regression models. The cumulative incidence rates of sudden death were assessed at different time points after randomization and according to the length of time between the diagnosis of heart failure and randomization. Sudden death was reported in 3583 patients. Such patients were older and were more often male, with an ischemic cause of heart failure and worse cardiac function, than those in whom sudden death did not occur. There was a 44% decline in the rate of sudden death across the trials (P=0.03). The cumulative incidence of sudden death at 90 days after randomization was 2.4% in the earliest trial and 1.0% in the most recent trial. The rate of sudden death was not higher among patients with a recent diagnosis of heart failure than among those with a longer-standing diagnosis. Rates of sudden death declined substantially over time among ambulatory patients with heart failure with reduced ejection fraction who were enrolled in clinical trials, a finding that is consistent with a cumulative benefit of evidence-based medications on this cause of death. (Funded by the China Scholarship Council and the University of Glasgow.).

  8. Correcting for dependent censoring in routine outcome monitoring data by applying the inverse probability censoring weighted estimator.

    PubMed

    Willems, Sjw; Schat, A; van Noorden, M S; Fiocco, M

    2018-02-01

Censored data make survival analysis more complicated because exact event times are not observed. Statistical methodology developed to account for censored observations assumes that patients' withdrawal from a study is independent of the event of interest. In practice, however, some covariates might be associated with both the lifetime and the censoring mechanism, inducing dependent censoring. In this case, standard survival techniques, such as the Kaplan-Meier estimator, give biased results. The inverse probability censoring weighted estimator was developed to correct for bias due to dependent censoring. In this article, we explore the use of the inverse probability censoring weighting methodology and describe why it is effective in removing the bias. Since implementing this method is time-consuming and requires programming and mathematical skills, we propose a user-friendly algorithm in R. Applications to a toy example and to a medical dataset illustrate how the algorithm works. A simulation study was carried out to investigate the performance of the inverse probability censoring weighted estimators in situations where dependent censoring is present in the data. In the simulation process, different sample sizes, strengths of the censoring model, and percentages of censored individuals were chosen. Results show that in each scenario inverse probability censoring weighting reduces the bias induced by the traditional Kaplan-Meier approach in which dependent censoring is ignored.
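The IPCW correction described above can be viewed as a weighted Kaplan-Meier estimator in which each subject carries the inverse of their estimated probability of remaining uncensored, with those weights coming from a model of censoring on covariates. A minimal numpy sketch of the weighted curve itself (the censoring model that would produce the weights is not shown; with all-ones weights it reduces to the standard Kaplan-Meier):

```python
import numpy as np

def weighted_km(times, events, weights=None):
    """Kaplan-Meier survival curve with optional subject-level weights.
    With inverse-probability-of-censoring weights this is the
    IPCW-corrected estimator; with unit weights it is the standard KM."""
    times = np.asarray(times, float)
    events = np.asarray(events)
    w = np.ones(len(times)) if weights is None else np.asarray(weights, float)
    grid = np.unique(times)
    surv = np.empty(len(grid))
    S = 1.0
    for k, t in enumerate(grid):
        at_risk = np.sum(w[times >= t])              # weighted risk set at t
        d = np.sum(w[(times == t) & (events == 1)])  # weighted events at t
        S *= 1.0 - d / at_risk
        surv[k] = S
    return grid, surv

# Toy data: unit weights reproduce the ordinary Kaplan-Meier curve.
grid, surv = weighted_km([1, 2, 3, 4, 5], [1, 0, 1, 1, 0])
```

Since the estimator is invariant to rescaling all weights by a constant, only the relative weights (how much a covariate pattern predicts censoring) affect the corrected curve.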

  9. Effect of dipyridamole plus aspirin on hemodialysis graft patency.

    PubMed

    Dixon, Bradley S; Beck, Gerald J; Vazquez, Miguel A; Greenberg, Arthur; Delmez, James A; Allon, Michael; Dember, Laura M; Himmelfarb, Jonathan; Gassman, Jennifer J; Greene, Tom; Radeva, Milena K; Davidson, Ingemar J; Ikizler, T Alp; Braden, Gregory L; Fenves, Andrew Z; Kaufman, James S; Cotton, James R; Martin, Kevin J; McNeil, James W; Rahman, Asif; Lawson, Jeffery H; Whiting, James F; Hu, Bo; Meyers, Catherine M; Kusek, John W; Feldman, Harold I

    2009-05-21

    Arteriovenous graft stenosis leading to thrombosis is a major cause of complications in patients undergoing hemodialysis. Procedural interventions may restore patency but are costly. Although there is no proven pharmacologic therapy, dipyridamole may be promising because of its known vascular antiproliferative activity. We conducted a randomized, double-blind, placebo-controlled trial of extended-release dipyridamole, at a dose of 200 mg, and aspirin, at a dose of 25 mg, given twice daily after the placement of a new arteriovenous graft until the primary outcome, loss of primary unassisted patency (i.e., patency without thrombosis or requirement for intervention), was reached. Secondary outcomes were cumulative graft failure and death. Primary and secondary outcomes were analyzed with the use of a Cox proportional-hazards regression with adjustment for prespecified covariates. At 13 centers in the United States, 649 patients were randomly assigned to receive dipyridamole plus aspirin (321 patients) or placebo (328 patients) over a period of 4.5 years, with 6 additional months of follow-up. The incidence of primary unassisted patency at 1 year was 23% (95% confidence interval [CI], 18 to 28) in the placebo group and 28% (95% CI, 23 to 34) in the dipyridamole-aspirin group, an absolute difference of 5 percentage points. Treatment with dipyridamole plus aspirin significantly prolonged the duration of primary unassisted patency (hazard ratio, 0.82; 95% CI, 0.68 to 0.98; P=0.03) and inhibited stenosis. The incidences of cumulative graft failure, death, the composite of graft failure or death, and serious adverse events (including bleeding) did not differ significantly between study groups. Treatment with dipyridamole plus aspirin had a significant but modest effect in reducing the risk of stenosis and improving the duration of primary unassisted patency of newly created grafts. (ClinicalTrials.gov number, NCT00067119.) 2009 Massachusetts Medical Society

  10. Effect of Dipyridamole plus Aspirin on Hemodialysis Graft Patency

    PubMed Central

    Dixon, Bradley S.; Beck, Gerald J.; Vazquez, Miguel A.; Greenberg, Arthur; Delmez, James A.; Allon, Michael; Dember, Laura M.; Himmelfarb, Jonathan; Gassman, Jennifer J.; Greene, Tom; Radeva, Milena K.; Davidson, Ingemar J.; Ikizler, T. Alp; Braden, Gregory L.; Fenves, Andrew Z.; Kaufman, James S.; Cotton, James R.; Martin, Kevin J.; McNeil, James W.; Rahman, Asif; Lawson, Jeffery H.; Whiting, James F.; Hu, Bo; Meyers, Catherine M.; Kusek, John W.; Feldman, Harold I.

    2014-01-01

    BACKGROUND Arteriovenous graft stenosis leading to thrombosis is a major cause of complications in patients undergoing hemodialysis. Procedural interventions may restore patency but are costly. Although there is no proven pharmacologic therapy, dipyridamole may be promising because of its known vascular antiproliferative activity. METHODS We conducted a randomized, double-blind, placebo-controlled trial of extended-release dipyridamole, at a dose of 200 mg, and aspirin, at a dose of 25 mg, given twice daily after the placement of a new arteriovenous graft until the primary outcome, loss of primary unassisted patency (i.e., patency without thrombosis or requirement for intervention), was reached. Secondary outcomes were cumulative graft failure and death. Primary and secondary outcomes were analyzed with the use of a Cox proportional-hazards regression with adjustment for prespecified covariates. RESULTS At 13 centers in the United States, 649 patients were randomly assigned to receive dipyridamole plus aspirin (321 patients) or placebo (328 patients) over a period of 4.5 years, with 6 additional months of follow-up. The incidence of primary unassisted patency at 1 year was 23% (95% confidence interval [CI], 18 to 28) in the placebo group and 28% (95% CI, 23 to 34) in the dipyridamole–aspirin group, an absolute difference of 5 percentage points. Treatment with dipyridamole plus aspirin significantly prolonged the duration of primary unassisted patency (hazard ratio, 0.82; 95% CI, 0.68 to 0.98; P = 0.03) and inhibited stenosis. The incidences of cumulative graft failure, death, the composite of graft failure or death, and serious adverse events (including bleeding) did not differ significantly between study groups. CONCLUSIONS Treatment with dipyridamole plus aspirin had a significant but modest effect in reducing the risk of stenosis and improving the duration of primary unassisted patency of newly created grafts. (ClinicalTrials.gov number, NCT00067119.) PMID
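
    The patency analysis above rests on standard censored-survival machinery. As a minimal illustration (not the trial's data or code; the follow-up times below are hypothetical), a Kaplan-Meier estimator of primary unassisted patency can be written in a few lines:

```python
# Minimal Kaplan-Meier estimator (pure Python) for a patency-style
# endpoint: time in days to loss of primary unassisted patency, with
# censored follow-up. The data below are hypothetical, not the trial's.

def kaplan_meier(times, events):
    """Return [(t, S(t))] at each observed event time.
    times: follow-up in days; events: 1 = patency lost, 0 = censored."""
    pairs = sorted(zip(times, events))
    surv, out = 1.0, []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        d = sum(1 for tt, e in pairs if tt == t and e == 1)  # events at t
        n = sum(1 for tt, e in pairs if tt >= t)             # at risk at t
        if d:
            surv *= 1.0 - d / n
            out.append((t, surv))
        i += 1
        while i < len(pairs) and pairs[i][0] == t:  # skip ties at t
            i += 1
    return out

# Hypothetical graft follow-up (days, event indicator):
times  = [30, 90, 90, 180, 365, 365, 400, 500]
events = [1,  1,  0,  1,   0,   1,   0,   0]
curve = kaplan_meier(times, events)
```

Reading the estimated survival at day 365 off this curve is the analogue of the trial's "incidence of primary unassisted patency at 1 year."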

  11. Machine-Learning Algorithms Predict Graft Failure After Liver Transplantation.

    PubMed

    Lau, Lawrence; Kankanige, Yamuna; Rubinstein, Benjamin; Jones, Robert; Christophi, Christopher; Muralidharan, Vijayaragavan; Bailey, James

    2017-04-01

    The ability to predict graft failure or primary nonfunction at liver transplant decision time assists utilization of the scarce resource of donor livers while ensuring that patients who urgently require a liver transplant are prioritized. An index derived to predict graft failure from donor and recipient factors, based on local data sets, would be more applicable in the Australian context. Liver transplant data from the Austin Hospital, Melbourne, Australia, from 2010 to 2013 were included in the study. The top 15 donor, recipient, and transplant factors influencing the outcome of graft failure within 30 days were selected using a machine learning methodology. An algorithm predicting the outcome of interest was developed using those factors. The Donor Risk Index predicted the outcome with an area under the receiver operating characteristic curve (AUC-ROC) of 0.680 (95% confidence interval [CI], 0.669-0.690). The combination of the factors used in the Donor Risk Index with the model for end-stage liver disease score yielded an AUC-ROC of 0.764 (95% CI, 0.756-0.771), whereas the survival outcomes after liver transplantation score obtained an AUC-ROC of 0.638 (95% CI, 0.632-0.645). The top 15 donor and recipient characteristics used within random forests resulted in an AUC-ROC of 0.818 (95% CI, 0.812-0.824). Using donor, transplant, and recipient characteristics known at the decision time of a transplant, high accuracy in matching donors and recipients can be achieved, potentially providing assistance with clinical decision making.
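
    The AUC-ROC values reported above have a useful Mann-Whitney interpretation: the probability that a randomly chosen failed graft receives a higher predicted risk than a randomly chosen surviving graft, counting ties as one half. A minimal sketch with hypothetical scores (not the study's model or data):

```python
# AUC-ROC computed directly from its Mann-Whitney interpretation.
# Scores and labels below are hypothetical.

def auc_roc(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0    # positive ranked above negative
            elif p == n:
                wins += 0.5    # ties count half
    return wins / (len(pos) * len(neg))

scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]  # hypothetical risk of 30-day failure
labels = [1,   1,   0,   1,   0,   0,   1,   0]    # 1 = graft failed
auc = auc_roc(scores, labels)
```

An AUC of 0.5 corresponds to uninformative scores, 1.0 to perfect ranking; the study's 0.818 sits well toward the latter.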

  12. Graft failure: III. Glaucoma escalation after penetrating keratoplasty.

    PubMed

    Greenlee, Emily C; Kwon, Young H

    2008-06-01

    Glaucoma after penetrating keratoplasty is a frequently observed postoperative complication and is a risk factor for graft failure. Penetrating keratoplasty performed for aphakic and pseudophakic bullous keratopathy and inflammatory conditions is more likely to cause postoperative glaucoma than keratoplasty for keratoconus and Fuchs' endothelial dystrophy. The intraocular pressure (IOP) elevation may occur immediately after surgery or in the early to late postoperative period. Early postoperative causes of glaucoma include pre-existing glaucoma, retained viscoelastic, hyphema, inflammation, pupillary block, aqueous misdirection, or suprachoroidal hemorrhage. Late causes include pre-existing glaucoma, angle-closure glaucoma, ghost cell glaucoma, suprachoroidal hemorrhage, and steroid-induced glaucoma. Determining the cause of IOP elevation can help guide therapeutic intervention. Treatments for refractory glaucoma include topical anti-glaucoma medications such as beta-adrenergic blockers. Topical carbonic anhydrase inhibitors, miotic agents, adrenergic agonists, and prostaglandin analogs should be used with caution in the post-keratoplasty patient because of the possibility of corneal decompensation, cystoid macular edema, or persistent inflammation. Various glaucoma surgical treatments have reported success in post-keratoplasty glaucoma. Trabeculectomy with mitomycin C can be successful in controlling IOP without the corneal toxicity noted with 5-fluorouracil. Glaucoma drainage devices have successfully controlled intraocular pressure in post-keratoplasty glaucoma; this is, however, associated with an increased risk of graft failure. Placement of the tube through the pars plana may improve graft success compared with implantation within the anterior chamber. In addition, cyclophotocoagulation remains a useful procedure for eyes that have refractory glaucoma despite multiple surgical interventions.

  13. Donor brain death predisposes human kidney grafts to a proinflammatory reaction after transplantation.

    PubMed

    de Vries, D K; Lindeman, J H N; Ringers, J; Reinders, M E J; Rabelink, T J; Schaapherder, A F M

    2011-05-01

    Donor brain death has profound effects on post-transplantation graft function and survival. We hypothesized that changes initiated in the donor influence the graft's response to ischemia and reperfusion. In this study, human brain dead donor kidney grafts were compared to living and cardiac dead donor kidney grafts. Pretransplant biopsies of brain dead donor kidneys contained notably more infiltrating T lymphocytes and macrophages. To assess whether the different donor conditions result in a different response to reperfusion, local cytokine release from the reperfused kidney was studied by measurement of paired arterial and renal venous blood samples. Reperfusion of kidneys from brain dead donors was associated with the instantaneous release of inflammatory cytokines, such as G-CSF, IL-6, IL-9, IL-16 and MCP-1. In contrast, kidneys from living and cardiac dead donors showed a more modest cytokine response with release of IL-6 and small amounts of MCP-1. In conclusion, this study shows that donor brain death initiates an inflammatory state of the graft with T lymphocyte and macrophage infiltration and massive inflammatory cytokine release upon reperfusion. These observations suggest that brain dead donors require a novel approach for donor pretreatment aimed at preventing this inflammatory response to increase graft survival. © 2011 The Authors. Journal compilation © 2011 The American Society of Transplantation and the American Society of Transplant Surgeons.

  14. Maximum likelihood estimation for semiparametric transformation models with interval-censored data

    PubMed Central

    Mao, Lu; Lin, D. Y.

    2016-01-01

    Abstract Interval censoring arises frequently in clinical, epidemiological, financial and sociological studies, where the event or failure of interest is known only to occur within an interval induced by periodic monitoring. We formulate the effects of potentially time-dependent covariates on the interval-censored failure time through a broad class of semiparametric transformation models that encompasses proportional hazards and proportional odds models. We consider nonparametric maximum likelihood estimation for this class of models with an arbitrary number of monitoring times for each subject. We devise an EM-type algorithm that converges stably, even in the presence of time-dependent covariates, and show that the estimators for the regression parameters are consistent, asymptotically normal, and asymptotically efficient with an easily estimated covariance matrix. Finally, we demonstrate the performance of our procedures through simulation studies and application to an HIV/AIDS study conducted in Thailand. PMID:27279656
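
    The interval-censored likelihood described above can be illustrated in a toy special case. Under a simple exponential model (one member of the transformation family; the intervals and the rate search below are illustrative assumptions, not the authors' nonparametric EM procedure), a subject known to fail only within (L, R] contributes S(L) - S(R) to the likelihood, with S(t) = exp(-rate * t):

```python
import math

# Interval-censored log-likelihood under an exponential model.
# Each event is known only to fall in (L, R]; R = inf encodes
# right censoring. Intervals below are hypothetical (in years).

def log_lik(rate, intervals):
    ll = 0.0
    for L, R in intervals:
        sL = math.exp(-rate * L)
        sR = 0.0 if R == math.inf else math.exp(-rate * R)
        ll += math.log(sL - sR)   # P(event in (L, R])
    return ll

def mle_rate(intervals, lo=1e-4, hi=5.0, iters=100):
    # golden-section search for the rate maximizing log_lik
    g = (math.sqrt(5) - 1) / 2
    a, b = lo, hi
    for _ in range(iters):
        c, d = b - g * (b - a), a + g * (b - a)
        if log_lik(c, intervals) < log_lik(d, intervals):
            a = c
        else:
            b = d
    return (a + b) / 2

# monitoring visits bracket each failure between two clinic dates
intervals = [(0.5, 1.0), (1.0, 2.0), (0.0, 0.5), (2.0, math.inf)]
rate = mle_rate(intervals)
```

The paper's contribution is the much harder general case: an unspecified baseline transformation, time-dependent covariates, and an EM-type algorithm with proven asymptotics.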

  15. High risk of graft failure in patients with anti-HLA antibodies undergoing haploidentical stem-cell transplantation.

    PubMed

    Ciurea, Stefan O; de Lima, Marcos; Cano, Pedro; Korbling, Martin; Giralt, Sergio; Shpall, Elizabeth J; Wang, Xuemei; Thall, Peter F; Champlin, Richard E; Fernandez-Vina, Marcelo

    2009-10-27

    BACKGROUND: Although donor-specific anti-human leukocyte antigen (HLA) antibodies (DSA) have been implicated in graft rejection in solid organ transplantation, their role in hematopoietic stem-cell transplantation remains unclear. METHODS: To address the hypothesis that the presence of DSA contributes to the development of graft failure, we tested 24 consecutive patients for the presence of anti-HLA antibodies determined by a sensitive and specific solid-phase/single-antigen assay. The study included a total of 28 haploidentical transplants, each with 2 to 5 HLA allele mismatches, at a single institution, from September 2005 to August 2008. RESULTS: DSA were detected in five patients (21%). Three of four (75%) patients with DSA before the first transplant failed to engraft, compared with 1 of 20 (5%) without DSA (P=0.008). All four patients who experienced primary graft failure had second haploidentical transplants. One patient developed a second graft failure with persistent high DSA levels, whereas three engrafted, two of them in the absence of DSA. No other known factors that could negatively influence engraftment were associated with the development of graft failure in these patients. CONCLUSIONS: These results suggest that donor-specific anti-HLA antibodies are associated with a high rate of graft rejection in patients undergoing haploidentical stem-cell transplantation. Anti-HLA sensitization should be evaluated routinely in hematopoietic stem-cell transplantation with HLA-mismatched donors.
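
    The reported P=0.008 for engraftment failure (3 of 4 patients with DSA vs. 1 of 20 without) is consistent with a two-sided Fisher exact test on the 2x2 table. The test choice is an assumption on our part (the abstract does not name it), but the value can be reproduced from first principles:

```python
from math import comb

# Two-sided Fisher exact test for a 2x2 table: sum the hypergeometric
# probabilities of all tables no more likely than the observed one.
# Applied to the abstract's data: graft failure in 3/4 with DSA vs 1/20 without.

def fisher_exact_two_sided(a, b, c, d):
    n, r1, c1 = a + b + c + d, a + b, a + c
    denom = comb(n, c1)
    def prob(k):  # probability of a table with k in the top-left cell
        return comb(r1, k) * comb(n - r1, c1 - k) / denom
    p_obs = prob(a)
    lo, hi = max(0, c1 - (n - r1)), min(r1, c1)
    return sum(prob(k) for k in range(lo, hi + 1) if prob(k) <= p_obs + 1e-12)

# rows: DSA / no DSA; columns: failed / engrafted
p = fisher_exact_two_sided(3, 1, 1, 19)
```

Here `p` comes out to 81/10626, which is approximately 0.008, matching the reported P value.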

  16. A dynamical system that describes vein graft adaptation and failure.

    PubMed

    Garbey, Marc; Berceli, Scott A

    2013-11-07

    Adaptation of vein bypass grafts to the mechanical stresses imposed by the arterial circulation is thought to be the primary determinant for lesion development, yet an understanding of how the various forces dictate local wall remodeling is lacking. We develop a dynamical system that summarizes the complex interplay between the mechanical environment and cell/matrix kinetics, ultimately dictating changes in the vein graft architecture. Based on a systematic mapping of the parameter space, three general remodeling response patterns are observed: (1) shear stabilized intimal thickening, (2) tension induced wall thinning and lumen expansion, and (3) tension stabilized wall thickening. Notable is our observation that the integration of multiple feedback mechanisms leads to a variety of non-linear responses that would be unanticipated by an analysis of each system component independently. This dynamic analysis supports the clinical observation that the majority of vein grafts proceed along an adaptive trajectory, where grafts dilate and mildly thicken in response to the increased tension and shear, but a small portion of the grafts demonstrate a maladaptive phenotype, where progressive inward remodeling and accentuated wall thickening lead to graft failure. © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.

  17. Maximal blood flow acceleration analysis in the early diastolic phase for aortocoronary artery bypass grafts: a new transit-time flow measurement predictor of graft failure following coronary artery bypass grafting.

    PubMed

    Handa, Takemi; Orihashi, Kazumasa; Nishimori, Hideaki; Yamamoto, Masaki

    2016-11-01

    Maximal graft flow acceleration (max df/dt) determined using transit-time flowmetry (TTFM) in the diastolic phase was assessed as a potential predictor of graft failure for aortocoronary artery (AC) bypass grafts in coronary artery bypass patients. Max df/dt was retrospectively measured in 114 aortocoronary artery bypass grafts. TTFM data were fitted to a ninth-degree polynomial curve, and max df/dt was measured from the first-derivative curve (9-polynomial max df/dt). Abnormal TTFM was defined as a mean flow of <15 ml/min, a pulsatility index of >5 or a diastolic filling ratio of <50%. Postoperative assessments were routinely performed by coronary artery angiography (CAG) at 1 year after surgery. Using TTFM, 68 grafts were normal, 4 of which were failing on CAG, and 46 grafts were abnormal, 21 of which were failing on CAG. The 9-polynomial max df/dt was significantly lower in the abnormal-TTFM/failing-on-CAG group than in the abnormal-TTFM/patent-on-CAG group (1.08 ± 0.89 vs. 2.05 ± 1.51 ml/s², respectively; P < 0.01, Mann-Whitney U test, Holm adjustment). TTFM 9-polynomial max df/dt in the early diastolic phase may be a promising predictor of future graft failure for AC bypass grafts, particularly in abnormal-TTFM grafts.
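
    The max df/dt computation lends itself to a short sketch: fit a polynomial to the flow waveform, differentiate it analytically, and scan the early-diastolic window for the peak acceleration. The coefficients below are hypothetical (a toy cubic stands in for the study's ninth-degree fit):

```python
# Sketch of the max df/dt computation from a polynomial fitted to a
# transit-time flow waveform. Coefficients are hypothetical, listed
# lowest order first: f(t) = c0 + c1*t + c2*t^2 + ...

def poly_eval(coeffs, t):
    return sum(c * t**i for i, c in enumerate(coeffs))

def poly_derivative(coeffs):
    # d/dt of sum(c_i * t^i) has coefficients i * c_i, shifted down
    return [i * c for i, c in enumerate(coeffs)][1:]

def max_dfdt(coeffs, t_start, t_end, steps=1000):
    # scan the window densely for the maximum of the first derivative
    d = poly_derivative(coeffs)
    return max(poly_eval(d, t_start + (t_end - t_start) * k / steps)
               for k in range(steps + 1))

flow_poly = [5.0, 30.0, -40.0, 12.0]        # toy cubic, flow in ml/s
peak_accel = max_dfdt(flow_poly, 0.0, 0.3)  # ml/s^2 over early diastole
```

In practice the window would be restricted to the early diastolic phase identified on the waveform, as the abstract describes.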

  18. Liver transplantation in the critically ill: donation after cardiac death compared to donation after brain death grafts.

    PubMed

    Taner, C Burcin; Bulatao, Ilynn G; Arasi, Lisa C; Perry, Dana K; Willingham, Darrin L; Sibulesky, Lena; Rosser, Barry G; Canabal, Juan M; Nguyen, Justin H; Kramer, David J

    2012-01-01

    Patients with end-stage liver disease may become critically ill prior to liver transplantation (LT), requiring admission to the intensive care unit (ICU). These high-acuity patients may be thought too ill to transplant; however, LT is often the only therapeutic option. Choosing the correct liver allograft for these patients is often difficult, and it is imperative that the allograft work immediately. Donation after cardiac death (DCD) donors provide an important source of livers; however, DCD graft allocation remains a controversial topic in critically ill patients. Between January 2003 and December 2008, 1215 LTs were performed; 85 patients were in the ICU at the time of LT. Twelve patients received DCD grafts and 73 received donation after brain death (DBD) grafts. After retransplant cases and multiorgan transplants were excluded, 8 recipients of DCD grafts and 42 recipients of DBD grafts were included in this study. Post-transplant outcomes of DCD and DBD liver grafts were compared. While there were differences in graft and patient survival between the DCD and DBD groups at the 4-month and 1-year time points, the differences did not reach statistical significance. The graft and patient survival rates were similar among the groups at the 3-year time point. There is a need for other large liver transplant programs to report their outcomes using liver grafts from DCD and DBD donors. We believe that the experience of the surgical, medical and critical care team is important for successfully using DCD grafts for critically ill patients.

  19. The Need for New Donor Stratification to Predict Graft Survival in Deceased Donor Kidney Transplantation.

    PubMed

    Yang, Shin Seok; Yang, Jaeseok; Ahn, Curie; Min, Sang Il; Ha, Jongwon; Kim, Sung Joo; Park, Jae Berm

    2017-05-01

    The aim of this study was to determine whether stratification of deceased donors by the United Network for Organ Sharing (UNOS) criteria negatively impacts graft survival. We retrospectively reviewed deceased donor and recipient pretransplant variables of kidney transplantations that occurred between February 1995 and December 2009. We compared clinical outcomes between standard criteria donors (SCDs) and expanded criteria donors (ECDs). The deceased donors consisted of 369 patients. A total of 494 transplant recipients were enrolled in this study. Mean age was 41.7±11.4 years (range 18-69) and 273 patients (55.4%) were male. Mean duration of follow-up was 8.8±4.9 years. Sixty-three recipients (12.8%) received ECD kidneys. The overall mean cold ischemia time was 5.7±3.2 hours. Estimated glomerular filtration rates at 1, 2, and 3 years after transplantation were significantly lower in ECD transplants (1 year, 62.2±17.6 vs. 51.0±16.4, p<0.001; 2 years, 62.2±17.6 vs. 51.0±16.4, p=0.001; 3 years, 60.9±23.5 vs. 54.1±18.7, p=0.047). In multivariate analysis, donor age (≥40 years) was an independent risk factor for graft failure. In Kaplan-Meier analyses, there was no significant difference in death-censored graft survival (log-rank test, p>0.05), although patient survival was lower in ECDs than SCDs (log-rank test, p=0.011). Our data demonstrate that stratification by the UNOS criteria does not predict graft survival. In order to expand the donor pool, new criteria for standard/expanded donors need to be modified by regional differences. © Copyright: Yonsei University College of Medicine 2017

  20. Combined predictive value of the expanded donor criteria for long-term graft survival of kidneys from donors after cardiac death: A single-center experience over three decades.

    PubMed

    Kusaka, Mamoru; Kubota, Yusuke; Sasaki, Hitomi; Fukami, Naohiko; Fujita, Tamio; Hirose, Yuichi; Takahashi, Hiroshi; Kenmochi, Takashi; Shiroki, Ryoichi; Hoshinaga, Kiyotaka

    2016-04-01

    Kidneys procured from deceased donors hold great potential for expanding the donor pool. The aims of the present study were to investigate the post-transplant outcomes of renal allografts recovered from donors after cardiac death, to identify risk factors affecting the renal prognosis, and to compare long-term survival of grafts from donors after cardiac death according to the number of expanded criteria donor risk factors. A total of 443 grafts recovered using an in situ regional cooling technique from 1983 to 2011 were assessed. To assess the combined predictive value of the significant expanded criteria donor risk criteria, the patients were divided into three groups: those with no expanded criteria donor risk factors (no risk), one expanded criteria donor risk factor (single-risk) and two or more expanded criteria donor risk factors (multiple-risk). Among the donor factors, age ≥50 years, hypertension, maximum serum creatinine level ≥1.5 mg/dL and a warm ischemia time ≥30 min were identified as independent predictors of long-term graft failure on multivariate analysis. Regarding the expanded donor criteria for marginal donors, cerebrovascular disease, hypertension and maximum serum creatinine level ≥1.5 mg/dL were identified as significant predictors on univariate analysis. The single- and multiple-risk groups showed 2.01- and 2.40-fold higher risks of graft loss, respectively. Renal grafts recovered from donors after cardiac death have good renal function with excellent long-term graft survival. However, an increased number of expanded criteria donor risk factors increases the risk of graft loss. © 2016 The Japanese Urological Association.

  1. Establishing a Core Outcome Measure for Graft Health: a Standardized Outcomes in Nephrology - Kidney Transplantation (SONG-Tx) Consensus Workshop Report.

    PubMed

    Tong, Allison; Sautenet, Benedicte; Poggio, Emilio D; Lentine, Krista L; Oberbauer, Rainer; Mannon, Roslyn; Murphy, Barbara; Padilla, Benita; Chow, Kai Ming; Marson, Lorna; Chadban, Steve; Craig, Jonathan C; Ju, Angela; Manera, Karine E; Hanson, Camilla S; Josephson, Michelle A; Knoll, Greg

    2018-02-22

    Graft loss, a critically important outcome for transplant recipients, is variably defined and measured, and incompletely reported in trials. We convened a consensus workshop on establishing a core outcome measure for graft loss for all trials in kidney transplantation. Twenty-five kidney transplant recipients/caregivers and 33 health professionals from eight countries participated. Transcripts were analyzed thematically. Five themes were identified. "Graft loss as a continuum" conceptualizes graft loss as a process, but requiring an endpoint defined as a discrete event. In "defining an event with precision and accuracy," loss of graft function requiring chronic dialysis (minimum 90 days) provided an objective and practical definition; re-transplant would capture preemptive transplantation; relisting was readily measured but would overestimate graft loss; and allograft nephrectomy was redundant in being preceded by dialysis. However, the thresholds for renal replacement therapy varied. Conservative management was regarded as too ambiguous and complex to use routinely. "Distinguishing death-censored graft loss" would ensure clarity and meaningfulness in interpreting results. "Consistent reporting for decision-making" by specifying time points and metrics (i.e., time to event) was suggested. "Ease of ascertainment and data collection" of the outcome from registries could support use of registry data to efficiently extend follow-up of trial participants. A practical and meaningful core outcome measure for graft loss may be defined as chronic dialysis or re-transplant, and distinguished from loss due to death. Consistent reporting of graft loss using standardized metrics and time points may improve the contribution of trials to decision-making in kidney transplantation.

  2. Long-term results after transplantation of pediatric liver grafts from donation after circulatory death donors.

    PubMed

    van Rijn, Rianne; Hoogland, Pieter E R; Lehner, Frank; van Heurn, Ernest L W; Porte, Robert J

    2017-01-01

    Liver grafts from donation after circulatory death (DCD) donors are increasingly accepted as an extension of the organ pool for transplantation. There are limited data on the outcome of liver transplantation with DCD grafts from a pediatric donor. The objective of this study was to assess the outcome of liver transplantation with pediatric DCD grafts and to compare this with the outcome after transplantation of livers from pediatric donation after brain death (DBD) donors. All transplantations performed with a liver from a pediatric donor (≤16 years) in the Netherlands between 2002 and 2015 were included. Patient survival, graft survival, and complication rates were compared between DCD and DBD liver transplantation. In total, 74 liver transplantations with pediatric grafts were performed; twenty (27%) DCD and 54 (73%) DBD. The median donor warm ischemia time (DWIT) was 24 min (range 15-43 min). Patient survival rate at 10 years was 78% for recipients of DCD grafts and 89% for DBD grafts (p = 0.32). Graft survival rate at 10 years was 65% in recipients of DCD versus 76% in DBD grafts (p = 0.20). If donor livers in this study had been rejected for transplantation when the DWIT was ≥30 min (n = 4), the 10-year graft survival rate would have been 81% after DCD transplantation. The rate of non-anastomotic biliary strictures was 5% in DCD and 4% in DBD grafts (p = 1.00). Other complication rates were also similar between both groups. Transplantation of livers from pediatric DCD donors results in good long-term outcomes, especially when the DWIT is kept ≤30 min. Patient and graft survival rates are not significantly different between recipients of a pediatric DCD or DBD liver. Moreover, the incidence of non-anastomotic biliary strictures after transplantation of pediatric DCD livers is remarkably low.

  3. Long-term results after transplantation of pediatric liver grafts from donation after circulatory death donors

    PubMed Central

    Hoogland, Pieter E. R.; Lehner, Frank; van Heurn, Ernest L. W.; Porte, Robert J.

    2017-01-01

    Background Liver grafts from donation after circulatory death (DCD) donors are increasingly accepted as an extension of the organ pool for transplantation. There are limited data on the outcome of liver transplantation with DCD grafts from a pediatric donor. The objective of this study was to assess the outcome of liver transplantation with pediatric DCD grafts and to compare this with the outcome after transplantation of livers from pediatric donation after brain death (DBD) donors. Method All transplantations performed with a liver from a pediatric donor (≤16 years) in the Netherlands between 2002 and 2015 were included. Patient survival, graft survival, and complication rates were compared between DCD and DBD liver transplantation. Results In total, 74 liver transplantations with pediatric grafts were performed; twenty (27%) DCD and 54 (73%) DBD. The median donor warm ischemia time (DWIT) was 24 min (range 15–43 min). Patient survival rate at 10 years was 78% for recipients of DCD grafts and 89% for DBD grafts (p = 0.32). Graft survival rate at 10 years was 65% in recipients of DCD versus 76% in DBD grafts (p = 0.20). If donor livers in this study had been rejected for transplantation when the DWIT was ≥30 min (n = 4), the 10-year graft survival rate would have been 81% after DCD transplantation. The rate of non-anastomotic biliary strictures was 5% in DCD and 4% in DBD grafts (p = 1.00). Other complication rates were also similar between both groups. Conclusions Transplantation of livers from pediatric DCD donors results in good long-term outcomes, especially when the DWIT is kept ≤30 min. Patient and graft survival rates are not significantly different between recipients of a pediatric DCD or DBD liver. Moreover, the incidence of non-anastomotic biliary strictures after transplantation of pediatric DCD livers is remarkably low. PMID:28426684

  4. [Interleukin 8 concentrations in donor bronchoalveolar lavage: impact on primary graft failure in double lung transplant].

    PubMed

    Almenar, María; Cerón, José; Gómez, M A Dolores; Peñalver, Juan C; Jiménez, M A José; Padilla, José

    2009-01-01

    The purpose of this study was to determine concentrations of interleukin 8 (IL-8) in the bronchoalveolar lavage (BAL) fluid from donor lungs and assess the role of IL-8 levels in the development of primary graft failure. Twenty patients who received a double lung transplant were studied. A series of data, including BAL fluid concentrations of IL-8, were collected for the donors. Data collected for the recipients included arterial blood gases after 6, 24, and 48 hours, and intubation time. Patients with a ratio of PaO(2) to the fraction of inspired oxygen (FiO(2)) of less than 300 during the first 48 hours were diagnosed with primary graft failure. IL-8 levels were determined by enzyme-linked immunosorbent assay. Associations between the donor variables and IL-8 concentrations were evaluated using the Spearman rank correlation coefficient (rho) and the Mann-Whitney test for categorical and continuous variables, respectively. Logistic regression was used for multivariate analysis. Fifteen of the 20 donors were men. The cause of brain death was trauma in 9 donors, 7 were smokers, 13 required inotropic support, and pathogens were isolated in the BAL fluid of 18. The median age was 35 years (interquartile range [IQR], 23.5-51.25y), the median ventilation time was 1 day (IQR, 1-2d), the median PaO(2)/FiO(2) was 459.5 (IQR, 427-510.25), and the median IL-8 concentration in BAL fluid was 49.01 ng/mL (IQR, 7.86-94.05 ng/mL). Ten of the recipients were men and the median age was 48.43 years (IQR, 25.4-56.81y). The median ischemic time was 210 minutes (IQR, 176.25-228.75 min) for the first lung and 300 minutes (IQR, 273.75-333.73 min) for the second lung. The median PaO(2)/FiO(2) ratio for the implant at 6, 24, and 48 hours was 329 (IQR, 190.25-435), 363.5 (IQR, 249-434.75), and 370.5 (IQR, 243.25-418.25), respectively. The median intubation time was 39.5 hours (IQR, 19.25-68.5h) and the correlation with IL-8 values was positive: higher IL-8 concentrations in BAL fluid

  5. Limitation of Inverse Probability-of-Censoring Weights in Estimating Survival in the Presence of Strong Selection Bias

    PubMed Central

    Howe, Chanelle J.; Cole, Stephen R.; Chmiel, Joan S.; Muñoz, Alvaro

    2011-01-01

    In time-to-event analyses, artificial censoring with correction for induced selection bias using inverse probability-of-censoring weights can be used to 1) examine the natural history of a disease after effective interventions are widely available, 2) correct bias due to noncompliance with fixed or dynamic treatment regimens, and 3) estimate survival in the presence of competing risks. Artificial censoring entails censoring participants when they meet a predefined study criterion, such as exposure to an intervention, failure to comply, or the occurrence of a competing outcome. Inverse probability-of-censoring weights use measured common predictors of the artificial censoring mechanism and the outcome of interest to determine what the survival experience of the artificially censored participants would be had they never been exposed to the intervention, complied with their treatment regimen, or not developed the competing outcome. Even if all common predictors are appropriately measured and taken into account, in the context of small sample size and strong selection bias, inverse probability-of-censoring weights could fail because of violations in assumptions necessary to correct selection bias. The authors used an example from the Multicenter AIDS Cohort Study, 1984–2008, regarding estimation of long-term acquired immunodeficiency syndrome-free survival to demonstrate the impact of violations in necessary assumptions. Approaches to improve correction methods are discussed. PMID:21289029
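
    The core of inverse probability-of-censoring weighting can be shown in a deliberately simplified, single-time-point sketch (hypothetical numbers; the study's application involves time-varying weights and survival curves): each uncensored subject is up-weighted by the inverse of their modeled probability of remaining uncensored, so they also stand in for censored subjects with similar covariates.

```python
# Horvitz-Thompson-style IPCW mean, single time point (hypothetical data).
# p_uncensored would come from a model of the censoring mechanism given
# the measured common predictors the abstract describes.

def ipcw_mean(outcomes, uncensored, p_uncensored):
    """Weighted mean outcome among uncensored subjects.
    outcomes: observed outcome (None if censored)
    uncensored: 1 if outcome observed, 0 if (artificially) censored
    p_uncensored: modeled P(uncensored | covariates) for each subject"""
    num = sum(y / p for y, u, p in zip(outcomes, uncensored, p_uncensored) if u)
    den = sum(1 / p for y, u, p in zip(outcomes, uncensored, p_uncensored) if u)
    return num / den

outcomes     = [10.0, 12.0, None, 8.0, None, 9.0]
uncensored   = [1,    1,    0,    1,   0,    1]
p_uncensored = [0.9,  0.8,  0.5,  0.5, 0.3,  0.9]
est = ipcw_mean(outcomes, uncensored, p_uncensored)
```

The abstract's warning applies directly here: if some `p_uncensored` values are very small (strong selection) and the sample is small, a few subjects carry enormous weights and the estimator becomes unstable.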

  6. Estimation of indirect effect when the mediator is a censored variable.

    PubMed

    Wang, Jian; Shete, Sanjay

    2017-01-01

    A mediation model explores the direct and indirect effects of an initial variable (X) on an outcome variable (Y) by including a mediator (M). In many realistic scenarios, investigators observe censored data instead of the complete data. Current research in mediation analysis for censored data focuses mainly on censored outcomes, but not censored mediators. In this study, we proposed a strategy based on the accelerated failure time model and a multiple imputation approach. We adapted a measure of the indirect effect for the mediation model with a censored mediator, which can assess the indirect effect at both the group and individual levels. Based on simulation, we established the bias in the estimations of different paths (i.e. the effects of X on M [a], of M on Y [b] and of X on Y given mediator M [c']) and indirect effects when analyzing the data using the existing approaches, including a naïve approach implemented in software such as Mplus, complete-case analysis, and the Tobit mediation model. We conducted simulation studies to investigate the performance of the proposed strategy compared to that of the existing approaches. The proposed strategy accurately estimates the coefficients of different paths, indirect effects and percentages of the total effects mediated. We applied these mediation approaches to the study of SNPs, age at menopause and fasting glucose levels. Our results indicate that there is no indirect effect of association between SNPs and fasting glucose level that is mediated through the age at menopause.
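
    For a fully observed mediator, the indirect effect reduces to the familiar product of coefficients a * b; the censored-mediator strategy above generalizes this via an accelerated failure time model and multiple imputation. A pure-Python sketch on hypothetical complete data, using the Frisch-Waugh device to obtain the coefficient of M adjusted for X:

```python
# Product-of-coefficients indirect effect on hypothetical COMPLETE data.
# a: slope of M on X; b: slope of M in the regression of Y on X and M
# (obtained here by regressing X out of both M and Y); indirect = a * b.

def slope(x, y):  # least-squares slope of y on x
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
            / sum((xi - mx) ** 2 for xi in x))

def residuals(x, y):  # residuals of y after regressing on x
    b = slope(x, y)
    a0 = sum(y) / len(y) - b * sum(x) / len(x)
    return [yi - (a0 + b * xi) for xi, yi in zip(x, y)]

X = [0, 0, 1, 1, 0, 1, 0, 1]                      # e.g. SNP carrier status
M = [1.0, 1.2, 2.1, 1.9, 0.9, 2.0, 1.1, 2.2]      # mediator (fully observed here)
Y = [2.0, 2.3, 4.1, 3.8, 1.9, 4.0, 2.1, 4.3]      # outcome

a = slope(X, M)                               # path a: X -> M
b = slope(residuals(X, M), residuals(X, Y))   # path b: M -> Y given X
indirect = a * b
```

With a censored M, this naive fit is exactly what the abstract shows to be biased; the proposed approach imputes the censored mediator values from the accelerated failure time model before estimating a and b.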

  7. Engraftment kinetics and graft failure after single umbilical cord blood transplantation using a myeloablative conditioning regimen.

    PubMed

    Ruggeri, Annalisa; Labopin, Myriam; Sormani, Maria Pia; Sanz, Guillermo; Sanz, Jaime; Volt, Fernanda; Michel, Gerard; Locatelli, Franco; Diaz De Heredia, Cristina; O'Brien, Tracey; Arcese, William; Iori, Anna Paola; Querol, Sergi; Kogler, Gesine; Lecchi, Lucilla; Pouthier, Fabienne; Garnier, Federico; Navarrete, Cristina; Baudoux, Etienne; Fernandes, Juliana; Kenzey, Chantal; Eapen, Mary; Gluckman, Eliane; Rocha, Vanderson; Saccardi, Riccardo

    2014-09-01

Umbilical cord blood transplant recipients are exposed to an increased risk of graft failure, a complication leading to a higher rate of transplant-related mortality. The decision and timing to offer a second transplant after graft failure is challenging. With the aim of addressing this issue, we analyzed engraftment kinetics and outcomes of 1268 patients (73% children) with acute leukemia (64% acute lymphoblastic leukemia, 36% acute myeloid leukemia) in remission who underwent single-unit umbilical cord blood transplantation after a myeloablative conditioning regimen. The median follow-up was 31 months. The overall survival rate at 3 years was 47%; the 100-day cumulative incidence of transplant-related mortality was 16%. Longer time to engraftment was associated with increased transplant-related mortality and shorter overall survival. The cumulative incidence of neutrophil engraftment at day 60 was 86%, while the median time to achieve engraftment was 24 days. Probability density analysis showed that the likelihood of engraftment after umbilical cord blood transplantation increased after day 10, peaked on day 21 and slowly decreased to 21% by day 31. Beyond day 31, the probability of engraftment dropped rapidly, and the residual probability of engrafting after day 42 was 5%. Graft failure was reported in 166 patients, and 66 of them received a second graft (allogeneic, n=45). Rescue actions, such as the search for another graft, should be considered starting after day 21. A diagnosis of graft failure can be established in patients who have not achieved neutrophil recovery by day 42. Moreover, subsequent transplants should not be postponed after day 42. Copyright © Ferrata Storti Foundation.

  8. Warfarin improves the outcome of infrainguinal vein bypass grafting at high risk for failure.

    PubMed

    Sarac, T P; Huber, T S; Back, M R; Ozaki, C K; Carlton, L M; Flynn, T C; Seeger, J M

    1998-09-01

Patients with marginal venous conduit, poor arterial runoff, and prior failed bypass grafts are at high risk for infrainguinal graft occlusion and limb loss. We sought to evaluate the effects of anticoagulation therapy after autogenous vein infrainguinal revascularization on duration of patency, limb salvage rates, and complication rates in this subset of patients. This randomized prospective trial was performed in a university tertiary care hospital and in a Veterans Affairs Hospital. Fifty-six patients who were at high risk for graft failure were randomized to receive aspirin (24 patients, 27 bypass grafts) or aspirin and warfarin (WAR; 32 patients, 37 bypass grafts). All patients received 325 mg of aspirin each day, and the patients who were randomized to warfarin underwent anticoagulation therapy with heparin immediately after surgery and then were started on warfarin therapy to maintain an international normalized ratio between 2 and 3. Perioperative blood transfusions and complications were compared with the Student t test or with the chi-square test. Graft patency rates, limb salvage rates, and survival rates were compared with the Kaplan-Meier method and the log-rank test. Sixty-one of the 64 bypass grafts were performed for rest pain or tissue loss, and 3 were performed for short-distance claudication. There were no differences between the groups in ages, indications, bypass graft types, risk classifications (ie, conduit, runoff, or graft failure), or comorbid conditions (except diabetes mellitus). The cumulative 5-year survival rate was similar between the groups. The incidence rate of postoperative hematoma (32% vs 3.7%; P = .004) was greater in the WAR group, but no differences were seen between the WAR group and the aspirin group in the number of packed red blood cells transfused, in the incidence rate of overall nonhemorrhagic wound complications, or in the overall complication rate (62% vs 52%).
The immediate postoperative primary graft patency rates (97

  9. Donation after cardiac death as a strategy to increase deceased donor liver availability.

    PubMed

    Merion, Robert M; Pelletier, Shawn J; Goodrich, Nathan; Englesbe, Michael J; Delmonico, Francis L

    2006-10-01

    This study examines donation after cardiac death (DCD) practices and outcomes in liver transplantation. Livers procured from DCD donors have recently been used to increase the number of deceased donors and bridge the gap between limited organ supply and the pool of waiting list candidates. Comprehensive evaluation of this practice and its outcomes has not been previously reported. A national cohort of all DCD and donation after brain-death (DBD) liver transplants between January 1, 2000 and December 31, 2004 was identified in the Scientific Registry of Transplant Recipients. Time to graft failure (including death) was modeled by Cox regression, adjusted for relevant donor and recipient characteristics. DCD livers were used for 472 (2%) of 24,070 transplants. Annual DCD liver activity increased from 39 in 2000 to 176 in 2004. The adjusted relative risk of DCD graft failure was 85% higher than for DBD grafts (relative risk, 1.85; 95% confidence interval, 1.51-2.26; P < 0.001), corresponding to 3-month, 1-year, and 3-year graft survival rates of 83.0%, 70.1%, and 60.5%, respectively (vs. 89.2%, 83.0%, and 75.0% for DBD recipients). There was no significant association between transplant program DCD liver transplant volume and graft outcome. The annual number of DCD livers used for transplant has increased rapidly. However, DCD livers are associated with a significantly increased risk of graft failure unrelated to modifiable donor or recipient factors. Appropriate recipients for DCD livers have not been fully characterized and recipient informed consent should be obtained before use of these organs.

  10. Long-Term Outcomes of Renal Transplant in Recipients With Lower Urinary Tract Dysfunction.

    PubMed

    Wilson, Rebekah S; Courtney, Aisling E; Ko, Dicken S C; Maxwell, Alexander P; McDaid, James

    2018-01-02

Lower urinary tract dysfunction can lead to chronic kidney disease, which, despite surgical intervention, will progress to end-stage renal disease, requiring dialysis. Urologic pathology may damage a transplanted kidney, limiting patient and graft survival. Although smaller studies have suggested that urinary tract dysfunction does not affect graft or patient survival, this is not universally accepted. Northern Ireland has historically had the highest incidence of neural tube defects in Europe, giving rich local experience in caring for patients with lower urinary tract dysfunction. Here, we analyzed outcomes of renal transplant recipients with lower urinary tract dysfunction versus control recipients. We identified 3 groups of kidney transplant recipients treated between 2001 and 2010; those in group 1 had end-stage renal disease due to lower urinary tract dysfunction with prior intervention (urologic surgery, long-term catheter, or intermittent self-catheterization), group 2 had end-stage renal disease secondary to lower urinary tract dysfunction without intervention, and group 3 had end-stage renal disease due to polycystic kidney disease (chosen as a relatively healthy control cohort without comorbid burden of other causes of end-stage renal disease such as diabetes). The primary outcome measured, graft survival, was death-censored, with graft loss defined as requirement for renal replacement therapy or retransplant. Secondary outcomes included patient survival and graft function. In 150 study patients (16 patients in group 1, 64 in group 2, and 70 in group 3), 5-year death-censored graft survival was 93.75%, 90.6%, and 92.9%, respectively, with no significant differences in graft failure among groups (Cox proportional hazards model). Five-year patient survival was 100%, 100%, and 94.3%, respectively. Individuals with a history of lower urinary tract dysfunction had graft and patient survival rates similar to the control group.
When appropriately treated, lower

  11. Donation after cardiac death liver transplantation: Graft quality evaluation based on pretransplant liver biopsy.

    PubMed

    Xia, Weiliang; Ke, Qinghong; Wang, Ye; Feng, Xiaowen; Guo, Haijun; Wang, Weilin; Zhang, Min; Shen, Yan; Wu, Jian; Xu, Xiao; Yan, Sheng; Zheng, Shusen

    2015-06-01

    Donation after cardiac death (DCD) liver grafts are associated with inferior clinical outcomes and high discard rates because of poor graft quality. We investigated the predictive value of DCD liver biopsy for the pretransplant graft quality evaluation. DCD liver transplants that took place between October 2010 and April 2014 were included (n = 127). Histological features of graft biopsy samples were analyzed to assess risk factors for graft survival. Macrovesicular steatosis ≥ 20% [hazard ratio (HR) = 2.973; P = 0.045] and sinusoidal neutrophilic infiltrate (HR = 6.969; P = 0.005) were confirmed as independent risk factors for graft survival; hepatocellular swelling, vacuolation, and necrosis failed to show prognostic value. Additionally, a donor serum total bilirubin level ≥ 34.2 μmol/L was also associated with a lower probability of graft survival. Our analysis indicates that macrovesicular steatosis ≥ 20% and sinusoidal neutrophilic infiltrate are novel and useful histological markers for DCD liver grafts with unacceptable quality. This finding can be used by transplant surgeons to improve DCD liver acceptance protocols. © 2015 American Association for the Study of Liver Diseases.

  12. Infectious complications as the leading cause of death after kidney transplantation: analysis of more than 10,000 transplants from a single center.

    PubMed

    de Castro Rodrigues Ferreira, Flávio; Cristelli, Marina Pontello; Paula, Mayara Ivani; Proença, Henrique; Felipe, Claudia Rosso; Tedesco-Silva, Helio; Medina-Pestana, José Osmar

    2017-08-01

    To identify specific causes of graft failure in a large sample of kidney transplant patients from a middle-income, developing country. Retrospective cohort study analyzing all consecutive single kidney transplants (KTs) performed at a single center in Brazil between January 1st 1998 and December 31st 2013. The database closing date was December 31st 2014. Out of 10,400 KTs, there were 1191 (11.45%) deaths with a functioning graft, 40 cases (0.38%) of primary non-function (PNF) and 1417 cases (13.62%) of graft loss excluding death and PNF as the cause. Infectious complications (404 cases, 34% of all deaths) were the major cause of death. Most deaths due to infection occurred within the first year after transplantation (157 deaths, 38.86%). Immunologic mechanisms, comprising acute rejection and immune-mediated interstitial fibrosis/tubular atrophy (IF/TA), were responsible for 52% of all cases of graft failure not involving recipient death. Half of the losses by acute rejection occurred late after transplantation. Contrary to what is observed in developed countries, infectious complications are the main challenge with kidney transplantation in Brazil. Non-adherence to treatment also appears to contribute significantly to long-term kidney graft loss. Strategies for improvement should focus on better compliance and a greater safety profile of immunosuppressive treatment.

  13. Incomplete cellular depopulation may explain the high failure rate of bovine ureteric grafts.

    PubMed

    Spark, J I; Yeluri, S; Derham, C; Wong, Y T; Leitch, D

    2008-05-01

The aim was to assess the results of a decellularized bovine ureter graft (SynerGraft) for complex venous access. Bovine ureter conduits were implanted in patients with a failed fistula or access graft in whom native vessels were unsuitable as conduits. Graft histories were obtained from all patients who had undergone this procedure at one institution. Failed grafts were explanted and subjected to histological examination. A sample of fresh bovine ureter was immunostained for galactose-α(1,3)-galactose (α-Gal). Nine patients with a median age of 46 (range 25-70) years underwent complex venous access surgery between August 2004 and November 2006 using a SynerGraft. Graft types included loop superficial femoral artery to stump of long saphenous vein (four patients), loop brachial artery to vein (two), brachial artery to axillary vein (two) and left axillary artery to innominate vein (one). Three grafts developed aneurysmal dilatation and two thrombosed. Histological assessment of the explanted bovine ureters revealed acute and chronic transmural inflammation. Immunostaining of fresh bovine ureter suggested residual cells and the xenoantigen α-Gal. Graft failure with aneurysmal dilatation and thrombosis in complex arteriovenous conduits using bovine ureter may be due to residual xenoantigens. 2008 British Journal of Surgery Society Ltd. Published by John Wiley & Sons, Ltd.

  14. Some insight on censored cost estimators.

    PubMed

    Zhao, H; Cheng, Y; Bang, H

    2011-08-30

Censored survival data analysis has been studied for many years. Yet, the analysis of censored mark variables, such as medical cost, quality-adjusted lifetime, and repeated events, faces a unique challenge that makes standard survival analysis techniques invalid. Because of the 'informative' censorship embedded in censored mark variables, the use of the Kaplan-Meier estimator (Journal of the American Statistical Association 1958; 53:457-481), as an example, will produce biased estimates. Innovative estimators have been developed in the past decade in order to handle this issue. Even though consistent estimators have been proposed, the formulations and interpretations of some estimators are less intuitive to practitioners. On the other hand, more intuitive estimators have been proposed, but their mathematical properties have not been established. In this paper, we prove the analytic identity between some estimators (a statistically motivated estimator and an intuitive estimator) for censored cost data. Efron (1967) made a similar investigation for censored survival data (between the Kaplan-Meier estimator and the redistribute-to-the-right algorithm). Therefore, we view our study as an extension of Efron's work to informatively censored data so that our findings could be applied to other mark variables. Copyright © 2011 John Wiley & Sons, Ltd.
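The Efron (1967) identity this record builds on can be checked directly on a small hypothetical sample: the Kaplan-Meier survival drop at each event time equals the probability mass the redistribute-to-the-right algorithm ends up assigning there. A pure-Python sketch (no tied times assumed):

```python
# Identity between the Kaplan-Meier estimator and the redistribute-to-the-
# right algorithm, demonstrated on hypothetical data with no tied times.

def km_jumps(times, events):
    """Kaplan-Meier survival drop at each event time (events: 1=event, 0=censored)."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    n_risk, surv, jumps = len(times), 1.0, {}
    for i in order:
        if events[i]:
            nxt = surv * (n_risk - 1) / n_risk   # multiply by (n-1)/n at each event
            jumps[times[i]] = surv - nxt
            surv = nxt
        n_risk -= 1
    return jumps

def rttr_mass(times, events):
    """Redistribute-to-the-right: each subject starts with mass 1/n; a censored
    subject's mass is split equally over all strictly later times."""
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])
    mass = [1.0 / n] * n
    for k, i in enumerate(order):
        if not events[i]:
            later = order[k + 1:]
            if later:
                share = mass[i] / len(later)
                for j in later:
                    mass[j] += share
            mass[i] = 0.0
    return {times[i]: mass[i] for i in order if events[i]}

# Hypothetical sample: events at t=1, 3, 4; one censoring at t=2.
km = km_jumps([1.0, 2.0, 3.0, 4.0], [1, 0, 1, 1])
rt = rttr_mass([1.0, 2.0, 3.0, 4.0], [1, 0, 1, 1])
# Both give {1.0: 0.25, 3.0: 0.375, 4.0: 0.375}.
```

For censored cost data this identity fails, which is exactly the "informative censorship" problem the paper addresses: a subject censored at a low cost is not exchangeable with those accruing cost beyond it.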

  15. Kidney donation after circulatory death in a country with a high number of brain dead donors: 10-year experience in Belgium.

    PubMed

    Jochmans, Ina; Darius, Tom; Kuypers, Dirk; Monbaliu, Diethard; Goffin, Eric; Mourad, Michel; Ledinh, Hieu; Weekers, Laurent; Peeters, Patrick; Randon, Caren; Bosmans, Jean-Louis; Roeyen, Geert; Abramowicz, Daniel; Hoang, Anh-Dung; De Pauw, Luc; Rahmel, Axel; Squifflet, Jean-Paul; Pirenne, Jacques

    2012-08-01

Worldwide shortage of standard brain dead donors (DBD) has revived the use of kidneys donated after circulatory death (DCD). We reviewed the Belgian DCD kidney transplant (KT) experience since its reintroduction in 2000. Risk factors for delayed graft function (DGF) were identified using multivariate analysis. Five-year patient/graft survival was assessed using Kaplan-Meier curves. The evolution of the kidney donor type and the impact of DCDs on the total KT activity in Belgium were compared with the Netherlands. Between 2000 and 2009, 287 DCD KT were performed. Primary nonfunction occurred in 1% and DGF in 31%. Five-year patient and death-censored graft survival were 93% and 95%, respectively. In multivariate analysis, cold storage (versus machine perfusion), cold ischemic time, and histidine-tryptophan-ketoglutarate solution were independent risk factors for the development of DGF. Despite an increased number of DCD donations and transplantations, the total number of deceased KT did not increase significantly. This could suggest a shift from DBDs to DCDs. To increase KT activity, Belgium should further expand controlled DCD programs while simultaneously improving the identification of all potential DBDs and avoiding their referral for donation as DCDs before brain death occurs. Furthermore, living donation remains underused. © 2012 The Authors. Transplant International © 2012 European Society for Organ Transplantation.

  16. Obesity in pediatric kidney transplant recipients and the risks of acute rejection, graft loss and death.

    PubMed

    Ladhani, Maleeka; Lade, Samantha; Alexander, Stephen I; Baur, Louise A; Clayton, Philip A; McDonald, Stephen; Craig, Jonathan C; Wong, Germaine

    2017-08-01

Obesity is prevalent in children with chronic kidney disease (CKD), but the health consequences of this combination of comorbidities are uncertain. The aim of this study was to evaluate the impact of obesity on the outcomes of children following kidney transplantation. Using data from the ANZDATA Registry (1994-2013), we assessed the association between age-appropriate body mass index (BMI) at the time of transplantation and the subsequent development of acute rejection (within the first 6 months), graft loss and death using adjusted Cox proportional hazards models. Included in our analysis were 750 children ranging in age from 2 to 18 (median age 12) years with a total of 6597 person-years of follow-up (median follow-up 8.4 years). Overall, at transplantation 129 (17.2%) children were classified as being overweight and 61 (8.1%) as being obese. Of the 750 children, 102 (16.2%) experienced acute rejection within the first 6 months of transplantation, 235 (31.3%) lost their allograft and 53 (7.1%) died. Compared to children with normal BMI, the adjusted hazard ratios (HR) for graft loss in children who were underweight, overweight or diagnosed as obese were 1.05 [95% confidence interval (CI) 0.70-1.60], 1.03 (95% CI 0.71-1.49) and 1.61 (95% CI 1.05-2.47), respectively. There was no statistically significant association between BMI and acute rejection (underweight: HR 1.07, 95% CI 0.54-2.09; overweight: HR 1.42, 95% CI 0.86-2.34; obese: HR 1.83, 95% CI 0.95-3.51) or patient survival (underweight: HR 1.18, 95% CI 0.54-2.58; overweight: HR 0.85, 95% CI 0.38-1.92; obese: HR 0.80, 95% CI 0.25-2.61). Over 10 years of follow-up, pediatric transplant recipients diagnosed with obesity have a substantially increased risk of allograft failure but not acute rejection of the graft or death.

  17. A flexible model for multivariate interval-censored survival times with complex correlation structure.

    PubMed

    Falcaro, Milena; Pickles, Andrew

    2007-02-10

    We focus on the analysis of multivariate survival times with highly structured interdependency and subject to interval censoring. Such data are common in developmental genetics and genetic epidemiology. We propose a flexible mixed probit model that deals naturally with complex but uninformative censoring. The recorded ages of onset are treated as possibly censored ordinal outcomes with the interval censoring mechanism seen as arising from a coarsened measurement of a continuous variable observed as falling between subject-specific thresholds. This bypasses the requirement for the failure times to be observed as falling into non-overlapping intervals. The assumption of a normal age-of-onset distribution of the standard probit model is relaxed by embedding within it a multivariate Box-Cox transformation whose parameters are jointly estimated with the other parameters of the model. Complex decompositions of the underlying multivariate normal covariance matrix of the transformed ages of onset become possible. The new methodology is here applied to a multivariate study of the ages of first use of tobacco and first consumption of alcohol without parental permission in twins. The proposed model allows estimation of the genetic and environmental effects that are shared by both of these risk behaviours as well as those that are specific. 2006 John Wiley & Sons, Ltd.

  18. Diabetes mellitus increases risk of unsuccessful graft preparation in Descemet membrane endothelial keratoplasty: a multicenter study.

    PubMed

    Greiner, Mark A; Rixen, Jordan J; Wagoner, Michael D; Schmidt, Gregory A; Stoeger, Christopher G; Straiko, Michael D; Zimmerman, M Bridget; Kitzmann, Anna S; Goins, Kenneth M

    2014-11-01

    The aim of this study was to evaluate preparation outcomes of tissue prepared for Descemet membrane endothelial keratoplasty (DMEK) from diabetic and nondiabetic donors. In this nonrandomized, consecutive case series, DMEK grafts were prepared from diabetic and nondiabetic donors by experienced technicians in 2 eye banks using slightly different, modified submerged manual preparation techniques to achieve "prestripped" graft tissue. Graft preparation results were analyzed retrospectively. The main outcome measure was the rate of unsuccessful (failed) DMEK graft preparations, defined as tears through the graft area that prevent tissue use. A total of 359 corneas prepared from 290 donors (114 diabetic and 245 nondiabetic) were included in the statistical analysis of graft preparation failure. There were no significant differences between diabetic and nondiabetic donor tissue characteristics with respect to donor age, death to preservation time, death to preparation time, endothelial cell density, percent hexagonality, or coefficient of variation. DMEK tissue preparation was unsuccessful in 19 (5.3%) cases. There was a significant difference in the site-adjusted rate of DMEK preparation failure between diabetic [15.3%; 95% confidence interval (CI), 9.0-25.0] and nondiabetic donors (1.9%; 95% CI, 0.8-4.8), and the corresponding site-adjusted odds ratio of DMEK graft preparation failure in diabetic donor tissue versus nondiabetic donor tissue was 9.20 (95% CI, 2.89-29.32; P = 0.001). Diabetes may be a risk factor for unsuccessful preparation of donor tissue for DMEK. We recommend caution in the use of diabetic tissue for DMEK graft preparation. Further study is needed to identify what subset of diabetic donors is at risk for unsuccessful DMEK graft preparation.

  19. Coronary Artery Bypass Grafting in Diabetic Patients: Complete Arterial versus Internal Thoracic Artery and Sequential Vein Grafts-A Propensity-Score Matched Analysis.

    PubMed

    Kunihara, Takashi; Wendler, Olaf; Heinrich, Kerstin; Nomura, Ryota; Schäfers, Hans-Joachim

    2018-06-20

The optimal choice of conduit and configuration for coronary artery bypass grafting (CABG) in diabetic patients remains somewhat controversial, even though arterial grafts have been proposed as superior. We attempted to clarify the role of complete arterial revascularization using the left internal thoracic artery (LITA) and the radial artery (RA) alone in "T-Graft" configuration on long-term outcome. From 1994 to 2001, 104 diabetic patients with triple vessel disease underwent CABG using LITA/RA "T-Grafts" (Group-A). Using propensity-score matching, 104 patients with comparable preoperative characteristics who underwent CABG using LITA and one sequential vein graft were identified (Group-V). Freedom from all causes of death, cardiac death, major adverse cardiac event (MACE), major adverse cardiac (and cerebral) event (MACCE), and repeat revascularization at 10 years in Group-A was 60 ± 5%, 67 ± 5%, 48 ± 5%, 37 ± 5%, and 81 ± 4%, respectively, compared with 58 ± 5%, 70 ± 5%, 49 ± 5%, 39 ± 5%, and 93 ± 3% in Group-V. There were no significant differences in these end points between groups regardless of insulin-dependency. A multivariable Cox proportional hazards model identified age, left ventricular ejection fraction, renal failure, and hyperlipidemia as independent predictors for all-cause death; age and left ventricular ejection fraction for cardiac death; sinus rhythm for both MACE and MACCE; and prior percutaneous coronary intervention for repeat revascularization. In our experience, complete arterial revascularization using LITA/RA "T-Grafts" does not provide superior long-term clinical benefits for diabetic patients compared with a combination of LITA and sequential vein graft. Georg Thieme Verlag KG Stuttgart · New York.

  20. Fresh Osteochondral Allograft Transplantation: Is Graft Storage Time Associated With Clinical Outcomes and Graft Survivorship?

    PubMed

    Schmidt, Kenneth J; Tírico, Luís E; McCauley, Julie C; Bugbee, William D

    2017-08-01

Regulatory concerns and the popularity of fresh osteochondral allograft (OCA) transplantation have led to a need for prolonged viable storage of osteochondral grafts. Tissue culture media allow a longer storage time but lead to chondrocyte death within the tissue. The long-term clinical consequence of prolonged storage is unknown. Patients transplanted with OCAs with a shorter storage time would have lower failure rates and better clinical outcomes than those transplanted with OCAs with prolonged storage. Cohort study; Level of evidence, 3. A matched-pair study was performed of 75 patients who received early release grafts (mean storage, 6.3 days [range, 1-14 days]) between 1997 and 2002, matched 1:1 by age, diagnosis, and graft size, with 75 patients who received late release grafts (mean storage time, 20.0 days [range, 16-28 days]) from 2002 to 2008. The mean age was 33.5 years, and the median graft size was 6.3 cm². All patients had a minimum 2-year follow-up. Evaluations included pain, satisfaction, function, failures, and reoperations. Outcome measures included the modified Merle d'Aubigné-Postel (18-point) scale, International Knee Documentation Committee (IKDC) form, and Knee Society function (KS-F) scale. Clinical failure was defined as revision OCA transplantation or conversion to arthroplasty. Among patients with grafts remaining in situ, the mean follow-up was 11.9 years (range, 2.0-16.8 years) and 7.8 years (range, 2.3-11.1 years) for the early and late release groups, respectively. OCA failure occurred in 25.3% (19/75) of patients in the early release group and 12.0% (9/75) of patients in the late release group (P = .036). The median time to failure was 3.5 years (range, 1.7-13.8 years) and 2.7 years (range, 0.3-11.1 years) for the early and late release groups, respectively. The 5-year survivorship of OCAs was 85% for the early release group and 90% for the late release group (P = .321). No differences in postoperative pain and function were

  1. Stratification of the Risk of Sudden Death in Nonischemic Heart Failure

    PubMed Central

    Pimentel, Maurício; Zimerman, Leandro Ioschpe; Rohde, Luis Eduardo

    2014-01-01

Despite significant therapeutic advancements, heart failure remains a highly prevalent clinical condition associated with significant morbidity and mortality. In 30%-40% of patients, the etiology of heart failure is nonischemic. The implantable cardioverter-defibrillator (ICD) is capable of preventing sudden death and decreasing total mortality in patients with nonischemic heart failure. However, a significant number of patients receiving ICDs do not receive any kind of therapy during follow-up. Moreover, considering the situation in Brazil and several other countries, ICDs cannot be implanted in all patients with nonischemic heart failure. Therefore, there is an urgent need to identify patients at an increased risk of sudden death because these would benefit more than patients at a lower risk, despite the presence of heart failure in both risk groups. In this study, the authors review the primary available methods for the stratification of the risk of sudden death in patients with nonischemic heart failure. PMID:25352509

  2. Reoperation for composite valve graft failure: Operative results and midterm survival.

    PubMed

    Maroto, Luis C; Carnero, Manuel; Cobiella, Javier; García, Mónica; Vilacosta, Isidre; Reguillo, Fernando; Villagrán, Enrique; Olmos, Carmen

    2018-06-01

The replacement of a failed composite valve graft is technically more demanding and is associated with increased morbidity and mortality. We present our technique and outcomes for reoperations for composite graft failures. Between September 2011 and June 2017, 14 patients underwent a redo composite graft replacement. Twelve patients (85.7%) were male, and mean age was 58.4 ± 12 years (standard deviation [SD]). One patient had two previous root replacements. Indications for reoperation were endocarditis (8), aortic pseudoaneurysm (3), and aortic prosthesis thrombosis (3). Mean logistic EuroSCORE and EuroSCORE II were 30.8% and 14.7%, respectively. A mechanical composite graft was used in 12 patients and biological composite grafts were used in two patients. Hospital mortality was 14.3% (n = 2). One patient (7.1%) required reoperation for bleeding, one patient (7.1%) had mechanical ventilation >24 h, and four patients (28.6%) required implantation of a permanent pacemaker. Median intensive care unit and hospital stays were 3 days (interquartile range [IQR] 1-5) and 10 days (IQR 6.5-38.5), respectively. One patient experienced recurrent prosthetic valve endocarditis 14 months after operation. On follow-up, 11 of 12 survivors were in New York Heart Association class I or II. Survival at 3 years was 85.7% ± 9.4% SD. Composite valve graft replacement can be performed with acceptable morbidity and mortality with good mid-term survival. © 2018 Wiley Periodicals, Inc.

  3. Rethinking the advantage of zero-HLA mismatches in unrelated living donor kidney transplantation: implications on kidney paired donation.

    PubMed

    Casey, Michael Jin; Wen, Xuerong; Rehman, Shehzad; Santos, Alfonso H; Andreoni, Kenneth A

    2015-04-01

    The OPTN/UNOS Kidney Paired Donation (KPD) Pilot Program allocates priority to zero-HLA mismatches. However, in unrelated living donor kidney transplants (LDKT)-the same donor source in KPD-no study has shown whether zero-HLA mismatches provide any advantage over >0 HLA mismatches. We hypothesize that zero-HLA mismatches among unrelated LDKT do not benefit graft survival. This retrospective SRTR database study analyzed LDKT recipients from 1987 to 2012. Among unrelated LDKT, subjects with zero-HLA mismatches were compared to a 1:1-5 matched (by donor age ±1 year and year of transplantation) control cohort with >0 HLA mismatches. The primary endpoint was death-censored graft survival. Among 32,654 unrelated LDKT recipients, 83 had zero-HLA mismatches and were matched to 407 controls with >0 HLA mismatches. Kaplan-Meier analyses for death-censored graft and patient survival showed no difference between study and control cohorts. In multivariate marginal Cox models, zero-HLA mismatches saw no benefit with death-censored graft survival (HR = 1.46, 95% CI 0.78-2.73) or patient survival (HR = 1.43, 95% CI 0.68-3.01). Our data suggest that in unrelated LDKT, zero-HLA mismatches may not offer any survival advantage. Therefore, particular study of zero-HLA mismatching is needed to validate its place in the OPTN/UNOS KPD Pilot Program allocation algorithm. © 2014 Steunstichting ESOT.

  4. A flexible cure rate model with dependent censoring and a known cure threshold.

    PubMed

    Bernhardt, Paul W

    2016-11-10

We propose a flexible cure rate model that accommodates different censoring distributions for the cured and uncured groups and also allows for some individuals to be observed as cured when their survival time exceeds a known threshold. We model the survival times for the uncured group using an accelerated failure time model with errors distributed according to the seminonparametric distribution, potentially truncated at a known threshold. We suggest a straightforward extension of the usual expectation-maximization algorithm approach for obtaining estimates in cure rate models to accommodate the cure threshold and dependent censoring. We additionally suggest a likelihood ratio test for testing for the presence of dependent censoring in the proposed cure rate model. We show through numerical studies that our model has desirable properties and leads to approximately unbiased parameter estimates in a variety of scenarios. To demonstrate how our method performs in practice, we analyze data from a bone marrow transplantation study and a liver transplant study. Copyright © 2016 John Wiley & Sons, Ltd.
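The expectation-maximization approach this record extends can be illustrated by its E-step under assumptions far simpler than the paper's: an assumed exponential survival distribution for the uncured group and independent censoring (the seminonparametric errors and dependent-censoring machinery are omitted). The known cure threshold from the abstract is kept: anyone event-free past it is observed as cured.

```python
import math

# E-step posterior cure probability in a standard mixture cure model,
# sketched with a hypothetical exponential latency for the uncured group:
#   P(cured | event-free at censoring time t) = pi / (pi + (1 - pi) * S_u(t)),
# where S_u(t) = exp(-rate * t). Subjects past the known cure threshold are
# treated as observed cured, as the abstract's model allows.

def posterior_cure_prob(pi_cure, t, rate, cure_threshold):
    """pi_cure: current cure-fraction estimate; rate: hazard of the assumed
    exponential uncured-survival distribution."""
    if t >= cure_threshold:
        return 1.0                      # survival past the threshold: cured
    s_uncured = math.exp(-rate * t)     # S_u(t), survival if uncured
    return pi_cure / (pi_cure + (1.0 - pi_cure) * s_uncured)
```

The M-step would then re-estimate the cure fraction and the latency parameters from these posterior weights; at t = 0 the posterior equals the prior cure fraction, and it rises toward 1 as the censoring time grows.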

  5. The role of donor-recipient relationship in long-term outcomes of living donor renal transplantation.

    PubMed

    Miles, Clifford D; Schaubel, Douglas E; Liu, Dandan; Port, Friedrich K; Rao, Panduranga S

    2008-05-27

    Graft failure related to acute and chronic rejection remains an important problem in transplantation. An association has been reported between microchimerism and the development of tolerance. Since it has been established that cells of fetal origin can be found in maternal tissues long after parturition, and cells of maternal origin may persist for years in offspring, we hypothesized that this fetal-maternal microchimerism may confer tolerance and thus less graft loss for kidneys transplanted between mothers and their offspring. We used data from the Scientific Registry of Transplant Recipients to compare death-censored graft survival among recipients of living-related renal transplants sharing at least one human leukocyte antigen (HLA) haplotype with their donor. A total of 23,064 such transplants were reported from 1995 to 2004. A Cox proportional hazards model was constructed to compare death-censored graft survival among the following donor-recipient pairings: child-to-mother, child-to-father, mother-to-child, father-to-child, 1-haplotype matched siblings, and HLA-identical siblings. HLA-identical sibling recipients had the best survival, but results for the child-to-father group were not significantly worse (hazard ratio=1.07, P=0.47). Mother-to-child transplants had the poorest graft survival (hazard ratio=2.61, P<0.0001). We found no evidence of tolerance to kidneys transplanted between mothers and offspring. Our analysis of 1-haplotype matched living-related renal transplants argues against tolerance to organs based on fetal-maternal microchimerism. Mechanistic studies examining the relationship between chimerism and immune sensitization would be useful to explore our results, and may contribute to a better understanding of tolerance.

  6. Vein Graft Preservation Solutions, Patency, and Outcomes After Coronary Artery Bypass Graft Surgery

    PubMed Central

    Harskamp, Ralf E.; Alexander, John H.; Schulte, Phillip J.; Brophy, Colleen M.; Mack, Michael J.; Peterson, Eric D.; Williams, Judson B.; Gibson, C. Michael; Califf, Robert M.; Kouchoukos, Nicholas T.; Harrington, Robert A.; Ferguson, T. Bruce; Lopes, Renato D.

    2015-01-01

    IMPORTANCE In vitro and animal model data suggest that intraoperative preservation solutions may influence endothelial function and vein graft failure (VGF) after coronary artery bypass graft (CABG) surgery. Clinical studies to validate these findings are lacking. OBJECTIVE To evaluate the effect of vein graft preservation solutions on VGF and clinical outcomes in patients undergoing CABG surgery. DESIGN, SETTING, AND PARTICIPANTS Data from the Project of Ex-Vivo Vein Graft Engineering via Transfection IV (PREVENT IV) study, a phase 3, multicenter, randomized, double-blind, placebo-controlled trial that enrolled 3014 patients at 107 US sites from August 1, 2002, through October 22, 2003, were used. Eligibility criteria for the trial included CABG surgery for coronary artery disease with at least 2 planned vein grafts. INTERVENTIONS Preservation of vein grafts in saline, blood, or buffered saline solutions. MAIN OUTCOMES AND MEASURES One-year angiographic VGF and 5-year rates of death, myocardial infarction, and subsequent revascularization. RESULTS Most patients had grafts preserved in saline (1339 [44.4%]), followed by blood (971 [32.2%]) and buffered saline (507 [16.8%]). Baseline characteristics were similar among groups. One-year VGF rates were significantly lower in the buffered saline group than in the saline group (patient-level odds ratio [OR], 0.59 [95% CI, 0.45-0.78; P < .001]; graft-level OR, 0.63 [95% CI, 0.49-0.79; P < .001]) or the blood group (patient-level OR, 0.62 [95% CI, 0.46-0.83; P = .001]; graft-level OR, 0.63 [95% CI, 0.48-0.81; P < .001]). Use of buffered saline solution also tended to be associated with a lower 5-year risk for death, myocardial infarction, or subsequent revascularization compared with saline (hazard ratio, 0.81 [95% CI, 0.64-1.02; P = .08]) and blood (0.81 [0.63-1.03; P = .09]) solutions. 
CONCLUSIONS AND RELEVANCE Patients undergoing CABG whose vein grafts were preserved in a buffered saline solution had lower VGF rates and trends toward better long-term clinical outcomes compared with patients whose grafts were preserved in saline or blood solutions.

  7. Risk score for predicting long-term mortality after coronary artery bypass graft surgery.

    PubMed

    Wu, Chuntao; Camacho, Fabian T; Wechsler, Andrew S; Lahey, Stephen; Culliford, Alfred T; Jordan, Desmond; Gold, Jeffrey P; Higgins, Robert S D; Smith, Craig R; Hannan, Edward L

    2012-05-22

    No simplified bedside risk scores have been created to predict long-term mortality after coronary artery bypass graft (CABG) surgery. The New York State Cardiac Surgery Reporting System was used to identify 8597 patients who underwent isolated CABG surgery in July through December 2000. The National Death Index was used to ascertain patients' vital statuses through December 31, 2007. A Cox proportional hazards model was fit to predict death after CABG surgery using preprocedural risk factors. Points were then assigned to significant predictors of death on the basis of the values of their regression coefficients. For each possible point total, the predicted risks of death at years 1, 3, 5, and 7 were calculated. The 7-year mortality rate in the study population was 24.2%. Significant predictors of death included age, body mass index, ejection fraction, unstable hemodynamic state or shock, left main coronary artery disease, cerebrovascular disease, peripheral arterial disease, congestive heart failure, malignant ventricular arrhythmia, chronic obstructive pulmonary disease, diabetes mellitus, renal failure, and history of open heart surgery. The points assigned to these risk factors ranged from 1 to 7; possible point totals for each patient ranged from 0 to 28. The observed and predicted risks of death at years 1, 3, 5, and 7 across patient groups stratified by point totals were highly correlated. The simplified risk score accurately predicted the risk of mortality after CABG surgery and can be used for informed consent and as an aid in determining treatment choice.
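    The coefficients-to-points step can be sketched as follows. This is a generic illustration of the standard construction (dividing each Cox coefficient by a base unit and rounding to an integer), not the paper's actual coefficients or baseline survival; the variable names and numbers in the usage are hypothetical.

```python
import math

# Generic sketch of building a bedside risk score from Cox coefficients:
# points = round(beta / base_unit), and the predicted risk at time t is
# recovered from the baseline survival S0(t) via
#   risk(t) = 1 - S0(t) ** exp(base_unit * total_points)
def assign_points(coefficients, base_unit):
    """coefficients: dict of risk factor -> Cox regression coefficient."""
    return {name: round(beta / base_unit) for name, beta in coefficients.items()}

def predicted_risk(total_points, baseline_survival, base_unit):
    """Predicted probability of death by the time for which S0 is given."""
    return 1.0 - baseline_survival ** math.exp(base_unit * total_points)

# Hypothetical example: three factors, base unit 0.3
points = assign_points({"age_70_79": 0.9, "diabetes": 0.3, "copd": 0.6}, 0.3)
# -> {"age_70_79": 3, "diabetes": 1, "copd": 2}
```

With a table of `predicted_risk` values for each possible point total, a clinician can read off the 1-, 3-, 5-, and 7-year risk without evaluating the full Cox model.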

  8. Four decades of the kidney transplantation program at the Instituto Nacional de Ciencias Médicas y Nutrición Salvador Zubirán in Mexico City.

    PubMed

    Morales-Buenrostro, Luis E; Marino-Vázquez, Lluvia A; Alberú, Josefina

    2009-01-01

    This retrospective study spans four decades of the kidney transplant program at our Institute, with a total of 923 kidney transplants in 872 recipients. In this report, the effect of recipient, donor, and transplant variables on long-term graft survival was analyzed using the Kaplan-Meier method with the log-rank test for survival comparisons. Global graft survival at our center, analyzed by censoring for death with a functioning graft, was 93%, 83%, and 74% at 1, 5, and 10 years, respectively, with a median survival of 24.5 years. When analyzed for all-cause graft loss, 1-, 5-, and 10-year survival was 90%, 76%, and 61%, with a 12.8-year median survival. Variables associated with lower graft survival censored for death with a functioning graft were transplantation in an earlier decade, less histocompatibility, younger kidney transplant recipients, no induction therapy, and double-drug initial immunosuppression. After Cox multivariate regression analysis, the risk factors that remained associated with worse survival were younger recipient, earlier transplant decade, and deceased donor.

  9. Chronobiology of death in heart failure.

    PubMed

    Ribas, Nuria; Domingo, Maite; Gastelurrutia, Paloma; Ferrero-Gregori, Andreu; Rull, Pilar; Noguero, Mariana; Garcia, Carmen; Puig, Teresa; Cinca, Juan; Bayes-Genis, Antoni

    2014-05-01

    In the general population, heart events occur more often during early morning, on Mondays, and during winter. However, the chronobiology of death in heart failure has not been analyzed. The aim of this study was to determine the circadian, day-of-the-week, and seasonal variability of all-cause mortality in chronic heart failure. This was an analysis of all consecutive heart failure patients followed in a heart failure unit from January 2003 to December 2008. The circadian moment of death was analyzed at 6-h intervals and was determined by reviewing medical records and by information provided by the relatives. Of 1196 patients (mean [standard deviation] age, 69 [13] years; 62% male), 418 (34.9%) died during a mean (standard deviation) follow-up of 29 (21) months. Survivors were younger; had higher body mass index, left ventricular ejection fraction, glomerular filtration rate, and hemoglobin and sodium levels; and had lower Framingham risk scores, amino-terminal pro-B-type natriuretic peptide, troponin T, and urate values. They were more frequently treated with angiotensin receptor blockers, beta-blockers, mineralocorticoid receptor antagonists, digoxin, nitrates, hydralazine, statins, loop diuretics, and thiazides. The analysis of circadian and weekly variability did not reveal significant differences between the four 6-h intervals or the days of the week. Mortality occurred more frequently during the winter (30.6%) compared with the other seasons (P = .024). All-cause mortality does not follow a circadian pattern, but rather a seasonal rhythm, in patients with heart failure. This finding is in contrast to the circadian rhythmicity of cardiovascular events reported in the general population. Copyright © 2013 Sociedad Española de Cardiología. Published by Elsevier Espana. All rights reserved.

  10. Improving the Outcomes of Organs Obtained From Controlled Donation After Circulatory Death Donors Using Abdominal Normothermic Regional Perfusion.

    PubMed

    Miñambres, E; Suberviola, B; Dominguez-Gil, B; Rodrigo, E; Ruiz-San Millan, J C; Rodríguez-San Juan, J C; Ballesteros, M A

    2017-08-01

    The use of donation after circulatory death (DCD) has increased significantly during the past decade. However, warm ischemia results in a greater risk of graft injury, and controlled DCD (cDCD) has been associated with inferior outcomes compared with donation after brain death. The use of abdominal normothermic regional perfusion (nRP) to restore blood flow before organ recovery in cDCD has been proposed as superior to rapid recovery for reversing the effect of ischemia and improving recipients' outcomes. Here, the first Spanish series using abdominal nRP as an in situ conditioning method is reported. A specific methodology to avoid restoring circulation to the brain after death determination is described. Twenty-seven cDCD donors underwent abdominal nRP during at least 60 min. Thirty-seven kidneys, 11 livers, six bilateral lungs, and one pancreas were transplanted. The 1-year death-censored kidney survival was 91%, and the delayed graft function rate was 27%. The 1-year liver survival rate was 90.1%, with no cases of ischemic cholangiopathy. Transplanted lungs and pancreas exhibited primary function. The use of nRP may represent an advance to increase the number and quality of grafts in cDCD. Poor results in cDCD livers could be reversed with nRP. Concerns about restoring brain circulation after death are easily solved. © 2017 The American Society of Transplantation and the American Society of Transplant Surgeons.

  11. Graft survival after cardiac transplantation for alcohol cardiomyopathy.

    PubMed

    Brinkley, D Marshall; Novak, Eric; Topkara, Veli K; Geltman, Edward M

    2014-08-27

    Alcohol cardiomyopathy (ACM) accounts for up to 40% of cases of non-ischemic dilated cardiomyopathy. With continued alcohol exposure, transplant-free survival is worse for patients with ACM than for those with idiopathic dilated cardiomyopathy (IDCM). The prognosis for patients with ACM after cardiac transplantation is unknown. We evaluated adults who underwent single-organ cardiac transplantation from 1994 to 2009 with a diagnosis of ACM (n=134) or IDCM (n=10,243) in the Organ Procurement Transplantation Network registry. Kaplan-Meier curves were generated by cohort for time until graft failure, cardiac allograft vasculopathy, and hospitalization for rejection. A Cox proportional hazards model was created to determine factors associated with each outcome. Patients with ACM were more likely to be male (P<0.0001), of minority ethnicity (P<0.0001), and smokers (P=0.0310) compared with patients with IDCM. Overall graft survival was lower for the ACM cohort (P=0.0001). After multivariate analysis, ACM was not independently associated with graft survival (HR 1.341, 95% CI 0.944-1.906, P=0.1017). Creatinine, total bilirubin, minority ethnicity, graft under-sizing, life support, diabetes, and donor age were independent predictors of graft failure. There were no significant differences in primary cause of death, vasculopathy, or rejection. There was no association between ACM and graft survival in this large registry study, but poorer overall survival in the ACM cohort was associated with other recipient characteristics.

  12. Estimating and Testing Mediation Effects with Censored Data

    ERIC Educational Resources Information Center

    Wang, Lijuan; Zhang, Zhiyong

    2011-01-01

    This study investigated influences of censored data on mediation analysis. Mediation effect estimates can be biased and inefficient with censoring on any one of the input, mediation, and output variables. A Bayesian Tobit approach was introduced to estimate and test mediation effects with censored data. Simulation results showed that the Bayesian…

  13. Downstream anastomotic hyperplasia. A mechanism of failure in Dacron arterial grafts.

    PubMed Central

    LoGerfo, F W; Quist, W C; Nowak, M D; Crawshaw, H M; Haudenschild, C C

    1983-01-01

    The precise location and progression of anastomotic hyperplasia and its possible relationship to flow disturbances was investigated in femoro-femoral Dacron grafts in 28 dogs. In 13 grafts, the outflow from the end-to-side downstream anastomosis was bidirectional (BDO), and in 15 it was unidirectional (UDO) (distally). Grafts were electively removed at intervals of two to 196 days or at the time of thrombosis. Each anastomosis and adjacent artery was perfusion-fixed and sectioned sagittally. The mean sagittal section was projected onto a digitized pad, and the total area of hyperplasia internal to the arterial internal elastic lamina and within the adjacent graft was integrated by computer. The location of the hyperplasia was compared with previously established sites of flow separation and stagnation. Hyperplasia was significantly greater at the downstream, as compared with the upstream, anastomosis in both groups (BDO: P < 0.001; UDO: P < 0.001; analysis of variance for independent groups). Furthermore, this downstream hyperplasia was progressive with time (BDO: P < 0.01; UDO: P < 0.01; Spearman rank correlation). There was no significant increase in the extent of downstream hyperplasia where flow separation was known to be greater (BDO). Five grafts failed (three BDO, two UDO) as a result of complete occlusion of the downstream anastomosis by fibrous hyperplasia. Transmission electron microscopy showed the hyperplasia to consist of collagen-producing smooth muscle cells. Anastomotic hyperplasia is significantly greater at the downstream anastomosis, is progressive with time, and is the primary cause of failure of Dacron arterial grafts in this model. Quantitative analysis of downstream anastomotic hyperplasia may be a valuable measure of the biocompatibility of Dacron grafts. PMID:6219641

  14. The Censored Mean-Level Detector for Multiple Target Environments.

    DTIC Science & Technology

    1984-03-01

    This report analyzes a class of constant false alarm rate (CFAR) detectors known as censored mean-level detectors (CMLD) for multiple target environments. The CMLD, a special case of which is the mean-level detector (or cell-averaged CFAR detector), is a generalization of the traditional mean-level detector (MLD) in which the largest reference cells are censored before averaging.
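    The censoring idea can be sketched in a few lines. This is an illustrative implementation of the general CMLD principle (discard the k largest reference cells before averaging, so interfering targets in the reference window do not inflate the noise estimate), not the specific detector analyzed in the report; the threshold multiplier and cell values below are hypothetical.

```python
# Censored mean-level detector (CMLD) sketch: estimate the noise level from
# the reference cells after discarding the k largest, then compare the cell
# under test against a scaled version of that estimate. k_censor = 0 reduces
# to the ordinary mean-level (cell-averaged CFAR) detector.
def cmld_threshold(reference_cells, k_censor, scale):
    kept = sorted(reference_cells)[:len(reference_cells) - k_censor]
    noise_estimate = sum(kept) / len(kept)
    return scale * noise_estimate

def detect(cell_under_test, reference_cells, k_censor=2, scale=4.0):
    """True if the cell under test exceeds the CMLD threshold."""
    return cell_under_test > cmld_threshold(reference_cells, k_censor, scale)

# Two strong interferers (10.0) sit in the reference window: censoring them
# keeps the noise estimate at 1.0, so a target of 5.0 is still detected.
window = [1.0, 1.0, 1.0, 1.0, 10.0, 10.0]
```

Without censoring, the same interferers raise the estimated noise to 4.0 and the 5.0 target is masked, which is exactly the multiple-target failure mode the CMLD addresses.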

  15. Semi-Markov models for interval censored transient cognitive states with back transitions and a competing risk

    PubMed Central

    Wei, Shaoceng; Kryscio, Richard J.

    2015-01-01

    Continuous-time multi-state stochastic processes are useful for modeling the flow of subjects from intact cognition to dementia with mild cognitive impairment and global impairment as intervening transient, cognitive states and death as a competing risk (Figure 1). Each subject's cognition is assessed periodically resulting in interval censoring for the cognitive states while death without dementia is not interval censored. Since back transitions among the transient states are possible, Markov chains are often applied to this type of panel data. In this manuscript we apply a Semi-Markov process in which we assume that the waiting times are Weibull distributed except for transitions from the baseline state, which are exponentially distributed and in which we assume no additional changes in cognition occur between two assessments. We implement a quasi-Monte Carlo (QMC) method to calculate the higher order integration needed for likelihood estimation. We apply our model to a real dataset, the Nun Study, a cohort of 461 participants. PMID:24821001

  16. Semi-Markov models for interval censored transient cognitive states with back transitions and a competing risk.

    PubMed

    Wei, Shaoceng; Kryscio, Richard J

    2016-12-01

    Continuous-time multi-state stochastic processes are useful for modeling the flow of subjects from intact cognition to dementia with mild cognitive impairment and global impairment as intervening transient cognitive states and death as a competing risk. Each subject's cognition is assessed periodically resulting in interval censoring for the cognitive states while death without dementia is not interval censored. Since back transitions among the transient states are possible, Markov chains are often applied to this type of panel data. In this manuscript, we apply a semi-Markov process in which we assume that the waiting times are Weibull distributed except for transitions from the baseline state, which are exponentially distributed and in which we assume no additional changes in cognition occur between two assessments. We implement a quasi-Monte Carlo (QMC) method to calculate the higher order integration needed for likelihood estimation. We apply our model to a real dataset, the Nun Study, a cohort of 461 participants. © The Author(s) 2014.
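    The semi-Markov structure described above can be illustrated by simulating trajectories. This is a sketch of the process being modeled, not the paper's quasi-Monte Carlo likelihood estimation; the state space follows the abstract (intact cognition, MCI, global impairment, dementia, death), but all transition probabilities and Weibull parameters below are hypothetical.

```python
import random

# Semi-Markov simulation sketch: transitions follow a Markov chain (with back
# transitions among transient states), but waiting times are Weibull
# distributed; shape = 1.0 gives the exponential case used for the baseline
# (intact) state. Dementia and death are absorbing/competing endpoints.
STATES = ["intact", "MCI", "global_impairment", "dementia", "death"]
ABSORBING = {"dementia", "death"}
P = {  # hypothetical transition probabilities
    "intact": [("MCI", 0.7), ("death", 0.3)],
    "MCI": [("intact", 0.3), ("global_impairment", 0.4), ("dementia", 0.2), ("death", 0.1)],
    "global_impairment": [("MCI", 0.3), ("dementia", 0.5), ("death", 0.2)],
}
WAIT = {  # hypothetical (scale, shape) Weibull waiting-time parameters
    "intact": (5.0, 1.0), "MCI": (3.0, 1.5), "global_impairment": (2.0, 1.5),
}

def simulate(rng, start="intact"):
    """Simulate one trajectory; returns a list of (time, next_state) steps."""
    t, state, path = 0.0, start, []
    while state not in ABSORBING:
        scale, shape = WAIT[state]
        t += rng.weibullvariate(scale, shape)  # time spent in current state
        u, cum, nxt = rng.random(), 0.0, P[state][-1][0]
        for s, p in P[state]:
            cum += p
            if u < cum:
                nxt = s
                break
        path.append((t, nxt))
        state = nxt
    return path
```

In the paper, exact transition times like these are unobserved between periodic assessments (interval censoring for the cognitive states), which is what makes the likelihood require the higher-order integration handled by QMC.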

  17. Model Calibration with Censored Data

    DOE PAGES

    Cao, Fang; Ba, Shan; Brenneman, William A.; ...

    2017-06-28

    Here, the purpose of model calibration is to make the model predictions closer to reality. The classical Kennedy-O'Hagan approach is widely used for model calibration, which can account for the inadequacy of the computer model while simultaneously estimating the unknown calibration parameters. In many applications, the phenomenon of censoring occurs when the exact outcome of the physical experiment is not observed, but is only known to fall within a certain region. In such cases, the Kennedy-O'Hagan approach cannot be used directly, and we propose a method to incorporate the censoring information when performing model calibration. The method is applied to study the compression phenomenon of liquid inside a bottle. The results show significant improvement over the traditional calibration methods, especially when the number of censored observations is large.

  18. Skin grafts: a rural general surgical perspective.

    PubMed

    Henderson, Nigel J; Fancourt, Michael; Gilkison, William; Kyle, Stephen; Mosquera, Damien

    2009-05-01

    Skin grafts are a common method of closing skin defects. The literature comparing methods of graft application and subsequent outcomes is poor, but reports indicate a graft failure rate between 2 and 30%. The aim of this study was to audit our current skin graft practice. Data were collected prospectively on all skin grafts performed by the general surgical department between 1st December 2005 and 1st December 2006. A standardized proforma on each patient included data on age, gender, graft indication, application method, comorbidities, length of stay, and graft outcomes including graft take at 1, 2 and 6 weeks post-operatively. There were 85 grafts performed on 74 patients, median age 72 years (9-102 years), with 10 (12%) acute admissions. Prophylactic antibiotics were given to 50% (38 of 74) of patients. Successful grafts (>80% take) were performed in 68 (80%) patients. The overall graft complication rate was 24.7% (22 of 85 grafts). Infection occurred in 13 of 17 graft failures. No patients underwent re-operation for graft failure. Patients who received prophylactic antibiotics had a reduced risk of graft failure (Fisher's exact test, P = 0.016). Skin grafts were performed successfully in the majority of patients. Graft complication and failure rates compare well with the world literature. The use of prophylactic antibiotics was the only predictor of successful graft take.

  19. Subclinical rejection associated with chronic allograft nephropathy in protocol biopsies as a risk factor for late graft loss.

    PubMed

    Moreso, F; Ibernon, M; Gomà, M; Carrera, M; Fulladosa, X; Hueso, M; Gil-Vernet, S; Cruzado, J M; Torras, J; Grinyó, J M; Serón, D

    2006-04-01

    Chronic allograft nephropathy (CAN) in protocol biopsies is associated with graft loss while the association between subclinical rejection (SCR) and outcome has yielded contradictory results. We analyze the predictive value of SCR and/or CAN in protocol biopsies on death-censored graft survival. Since 1988, a protocol biopsy was done during the first 6 months in stable grafts with serum creatinine <300 micromol/L and proteinuria <1 g/day. Biopsies were evaluated according to Banff criteria. Borderline changes and acute rejection were grouped as SCR. CAN was defined as presence of interstitial fibrosis and tubular atrophy. Mean follow-up was 91 +/- 46 months. Sufficient tissue was obtained in 435 transplants. Biopsies were classified as normal (n = 186), SCR (n = 74), CAN (n = 110) and SCR with CAN (n = 65). Presence of SCR with CAN was associated with old donors, percentage of panel reactive antibodies and presence of acute rejection before protocol biopsy. Cox regression analysis showed that SCR with CAN (relative risk [RR]: 1.86, 95% confidence interval [CI]: 1.11-3.12; p = 0.02) and hepatitis C virus (RR: 2.27, 95% CI: 1.38-3.75; p = 0.01) were independent predictors of graft survival. In protocol biopsies, the detrimental effect of interstitial fibrosis/tubular atrophy on long-term graft survival is modulated by SCR.

  20. Analysis of censored data.

    PubMed

    Lucijanic, Marko; Petrovecki, Mladen

    2012-01-01

    Analyzing events over time is often complicated by incomplete, or censored, observations. Special non-parametric statistical methods were developed to overcome difficulties in summarizing and comparing censored data. The life-table (actuarial) method and the Kaplan-Meier method are described with an explanation of survival curves. For didactic purposes, the authors prepared a workbook based on the most widely used Kaplan-Meier method. It should help the reader understand how the Kaplan-Meier method is conceptualized and how it can be used to obtain the statistics and survival curves needed to completely describe a sample of patients. The log-rank test and hazard ratio are also discussed.
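    The Kaplan-Meier method referenced throughout these records can be sketched directly: the survival probability S(t) is the product, over distinct event times t_i up to t, of (1 - d_i/n_i), where d_i is the number of deaths at t_i and n_i the number still at risk just before t_i. Censored subjects leave the risk set without contributing an event. A minimal sketch:

```python
# Minimal Kaplan-Meier estimator (illustrative, no tie-breaking conventions
# beyond events and censorings sharing the same time being processed together).
def kaplan_meier(times, events):
    """times: follow-up times; events: 1 = death observed, 0 = censored.
    Returns a list of (event_time, survival_probability) steps."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = removed = 0
        while i < len(data) and data[i][0] == t:  # group all subjects at time t
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths > 0:  # censorings alone do not step the curve down
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= removed
    return curve

# e.g. deaths at t=1, 3, 5 and censorings at t=2, 4:
# kaplan_meier([1, 2, 3, 4, 5], [1, 0, 1, 0, 1])
```

This is the estimator whose curves the workbook computes; the log-rank test then compares two such curves by pooling the risk sets at each event time.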

  1. An identifiable model for informative censoring

    USGS Publications Warehouse

    Link, W.A.; Wegman, E.J.; Gantz, D.T.; Miller, J.J.

    1988-01-01

    The usual model for censored survival analysis requires the assumption that censoring of observations arises only due to causes unrelated to the lifetime under consideration. It is easy to envision situations in which this assumption is unwarranted, and in which use of the Kaplan-Meier estimator and associated techniques will lead to unreliable analyses.

  2. Threshold regression to accommodate a censored covariate.

    PubMed

    Qian, Jing; Chiou, Sy Han; Maye, Jacqueline E; Atem, Folefac; Johnson, Keith A; Betensky, Rebecca A

    2018-06-22

    In several common study designs, regression modeling is complicated by the presence of censored covariates. Examples of such covariates include maternal age of onset of dementia that may be right censored in an Alzheimer's amyloid imaging study of healthy subjects, metabolite measurements that are subject to limit of detection censoring in a case-control study of cardiovascular disease, and progressive biomarkers whose baseline values are of interest, but are measured post-baseline in longitudinal neuropsychological studies of Alzheimer's disease. We propose threshold regression approaches for linear regression models with a covariate that is subject to random censoring. Threshold regression methods allow for immediate testing of the significance of the effect of a censored covariate. In addition, they provide for unbiased estimation of the regression coefficient of the censored covariate. We derive the asymptotic properties of the resulting estimators under mild regularity conditions. Simulations demonstrate that the proposed estimators have good finite-sample performance, and often offer improved efficiency over existing methods. We also derive a principled method for selection of the threshold. We illustrate the approach in application to an Alzheimer's disease study that investigated brain amyloid levels in older individuals, as measured through positron emission tomography scans, as a function of maternal age of dementia onset, with adjustment for other covariates. We have developed an R package, censCov, for implementation of our method, available at CRAN. © 2018, The International Biometric Society.

  3. Donation after cardiac death liver transplantation: predictors of outcome.

    PubMed

    Mathur, A K; Heimbach, J; Steffick, D E; Sonnenday, C J; Goodrich, N P; Merion, R M

    2010-11-01

    We aimed to identify recipient, donor and transplant risk factors associated with graft failure and patient mortality following donation after cardiac death (DCD) liver transplantation. These estimates were derived from Scientific Registry of Transplant Recipients data from all US liver-only DCD recipients between September 1, 2001 and April 30, 2009 (n = 1567) and Cox regression techniques. Three years post-DCD liver transplant, 64.9% of recipients were alive with functioning grafts, 13.6% required retransplant and 21.6% died. Significant recipient factors predictive of graft failure included: age ≥ 55 years, male sex, African-American race, HCV positivity, metabolic liver disorder, transplant MELD ≥ 35, hospitalization at transplant and the need for life support at transplant (all, p ≤ 0.05). Donor characteristics included age ≥ 50 years and weight >100 kg (all, p ≤ 0.005). Each hour increase in cold ischemia time (CIT) was associated with 6% higher graft failure rate (HR 1.06, p < 0.001). Donor warm ischemia time ≥ 35 min significantly increased graft failure rates (HR 1.84, p = 0.002). Recipient predictors of mortality were age ≥ 55 years, hospitalization at transplant and retransplantation (all, p ≤ 0.006). Donor weight >100 kg and CIT also increased patient mortality (all, p ≤ 0.035). These findings are useful for transplant surgeons creating DCD liver acceptance protocols. ©2010 The Authors Journal compilation©2010 The American Society of Transplantation and the American Society of Transplant Surgeons.

  4. De novo noncutaneous malignancies after kidney transplantation are associated with an increased risk of graft failure: results from a time-dependent analysis on 672 patients.

    PubMed

    Cena, Tiziana; Musetti, Claudio; Quaglia, Marco; Magnani, Corrado; Stratta, Piero; Bagnardi, Vincenzo; Cantaluppi, Vincenzo

    2016-10-01

    The aim of this study was to evaluate the association between cancer occurrence and risk of graft failure in kidney transplant recipients. From November 1998 to November 2013, 672 adult patients received their first kidney transplant from a deceased donor and had a minimum follow-up of 6 months. During a median follow-up of 4.7 years (3523 patient-years), 47 patients developed a nonmelanoma skin cancer (NMSC) and 40 a noncutaneous malignancy (NCM). A total of 59 graft failures were observed. The failure rate was 6 per 100 patient-years (pt-yr) after NCM versus 1.5 per 100 pt-yr in patients without NCM. In a time-dependent multivariable model, the occurrence of NCM appeared to be associated with failure (HR = 3.27; 95% CI = 1.44-7.44). The effect of NCM on cause-specific graft failure differed (P = 0.002) between events due to chronic rejection (HR = 0.55) and those due to other causes (HR = 15.59). The reduction of immunosuppression after NCM was not associated with a greater risk of graft failure. In conclusion, our data suggest that post-transplant NCM may be a strong risk factor for graft failure, particularly for causes other than chronic rejection. © 2016 Steunstichting ESOT.

  5. A multivariate cure model for left-censored and right-censored data with application to colorectal cancer screening patterns.

    PubMed

    Hagar, Yolanda C; Harvey, Danielle J; Beckett, Laurel A

    2016-08-30

    We develop a multivariate cure survival model to estimate lifetime patterns of colorectal cancer screening. Screening data cover long periods of time, with sparse observations for each person. Some events may occur before the study begins or after the study ends, so the data are both left-censored and right-censored, and some individuals are never screened (the 'cured' population). We propose a multivariate parametric cure model that can be used with left-censored and right-censored data. Our model allows for the estimation of the time to screening as well as the average number of times individuals will be screened. We calculate likelihood functions based on the observations for each subject using a distribution that accounts for within-subject correlation and estimate parameters using Markov chain Monte Carlo methods. We apply our methods to the estimation of lifetime colorectal cancer screening behavior in the SEER-Medicare data set. Copyright © 2016 John Wiley & Sons, Ltd.

  6. BACKWARD ESTIMATION OF STOCHASTIC PROCESSES WITH FAILURE EVENTS AS TIME ORIGINS1

    PubMed Central

    Gary Chan, Kwun Chuen; Wang, Mei-Cheng

    2011-01-01

    Stochastic processes often exhibit sudden systematic changes in pattern a short time before certain failure events. Examples include increase in medical costs before death and decrease in CD4 counts before AIDS diagnosis. To study such terminal behavior of stochastic processes, a natural and direct way is to align the processes using failure events as time origins. This paper studies backward stochastic processes counting time backward from failure events, and proposes one-sample nonparametric estimation of the mean of backward processes when follow-up is subject to left truncation and right censoring. We will discuss benefits of including prevalent cohort data to enlarge the identifiable region and large sample properties of the proposed estimator with related extensions. A SEER–Medicare linked data set is used to illustrate the proposed methodologies. PMID:21359167

  7. Predicting long-term graft survival in adult kidney transplant recipients.

    PubMed

    Pinsky, Brett W; Lentine, Krista L; Ercole, Patrick R; Salvalaggio, Paolo R; Burroughs, Thomas E; Schnitzler, Mark A

    2012-07-01

    The ability to accurately predict a population's long-term survival has important implications for quantifying the benefits of transplantation. To identify a model that can accurately predict a kidney transplant population's long-term graft survival, we retrospectively studied the United Network for Organ Sharing data from 13,111 kidney-only transplants completed in 1988-1989. Nineteen-year death-censored graft survival (DCGS) projections were calculated and compared with the population's actual graft survival. The projection curves were created using a two-part estimation model that (1) fits a Kaplan-Meier survival curve immediately after transplant (Part A) and (2) uses truncated observational data to model a survival function for long-term projection (Part B). Projection curves were examined using varying amounts of time to fit both parts of the model. The accuracy of the projection curve was determined by examining whether predicted survival fell within the 95% confidence interval for the 19-year Kaplan-Meier survival, and the sample size needed to detect the difference in projected versus observed survival in a clinical trial. The 19-year DCGS was 40.7% (39.8-41.6%). Excellent predictability (41.3%) can be achieved when Part A is fit for three years and Part B is projected using two additional years of data. Using less than five total years of data tended to overestimate the population's long-term survival. Accurate prediction of long-term DCGS is possible but requires attention to the quantity of data used in the projection method.
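
The two-part projection idea (Kaplan-Meier early, parametric extrapolation later) can be sketched as below. The constant-hazard (exponential) form for Part B and the simulated cohort are assumptions of this sketch; the abstract does not specify the parametric survival function used.

```python
import numpy as np

def km_survival(t, event, horizon):
    """Kaplan-Meier estimate of S(horizon) from right-censored data."""
    order = np.argsort(t)
    t = np.asarray(t, float)[order]
    event = np.asarray(event, int)[order]
    at_risk = len(t)
    S = 1.0
    for ti, di in zip(t, event):
        if ti > horizon:
            break
        if di == 1:
            S *= 1.0 - 1.0 / at_risk
        at_risk -= 1
    return S

def project_survival(t, event, fit_years, target):
    """Part A: Kaplan-Meier up to fit_years. Part B: extrapolate with a
    constant hazard estimated from events / person-time in the fit window."""
    t = np.asarray(t, float)
    event = np.asarray(event, int)
    S_fit = km_survival(t, event, fit_years)
    exposure = np.minimum(t, fit_years).sum()          # person-years observed
    hazard = event[t <= fit_years].sum() / exposure    # events per person-year
    return S_fit * np.exp(-hazard * (target - fit_years))

# Toy cohort with a true constant hazard of 0.05/yr, censored at 19 years.
rng = np.random.default_rng(1)
true_rate = 0.05
times = rng.exponential(1 / true_rate, 5000)
obs = np.minimum(times, 19.0)
event = (times <= 19.0).astype(int)
proj = project_survival(obs, event, fit_years=5.0, target=19.0)
print(round(proj, 2), round(np.exp(-true_rate * 19.0), 2))
```

Because the toy data really do have a constant hazard, the five-year fit extrapolates well to 19 years; the abstract's finding is that real graft-failure hazards need enough fitting data for Part B to capture the long-term shape.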

  8. Saphenous vein graft aneurysm fistula formation causing right heart failure: an unusual presentation.

    PubMed

    Boon, K J; Arshad, M A; Singh, H; Lainchbury, J G; Blake, J W H

    2015-11-01

    Saphenous vein graft (SVG) aneurysm formation after coronary artery bypass grafting is a rare complication of the surgery. We present a case of a 68-year-old man with an unusual presentation of such an aneurysm. Thirty-four years after his initial bypass surgery, the patient presented with a fistula formation into his right atrium from a vein graft aneurysm. Late aneurysm formation is thought to occur secondary to atherosclerotic degeneration of the SVG, with background hypertension and dyslipidaemia accelerating the process. Diagnostic modalities used to investigate SVG aneurysms include computed tomography, transthoracic echocardiography, magnetic resonance imaging and cardiac catheterisation. Aneurysms with fistula formation historically require aggressive surgical intervention. Resection of the aneurysm with subsequent revascularisation, if required, is the surgical norm. SVG aneurysm with fistula formation into a cardiac chamber is a rare complication of coronary artery bypass grafting (CABG), which can occur with atypical presenting symptoms. Physicians should keep in mind the possibility of this occurring in post-CABG patients presenting with heart failure and a new murmur. © 2015 Royal Australasian College of Physicians.

  9. Association between bilirubin and mode of death in severe systolic heart failure.

    PubMed

    Wu, Audrey H; Levy, Wayne C; Welch, Kathleen B; Neuberg, Gerald W; O'Connor, Christopher M; Carson, Peter E; Miller, Alan B; Ghali, Jalal K

    2013-04-15

    The bilirubin level has been associated with worse outcomes, but it has not been studied as a predictor for the mode of death in patients with systolic heart failure. The Prospective Randomized Amlodipine Evaluation Study (PRAISE) cohort (including New York Heart Association class IIIB-IV patients with left ventricular ejection fraction <30%, n = 1,135) was analyzed, divided by bilirubin level: ≤0.6 mg/dl, group 1; >0.6 to 1.2 mg/dl, group 2; and >1.2 mg/dl, group 3. Multivariate Cox proportional hazards models were used to determine the association of bilirubin with the risk of sudden or pump failure death. Total bilirubin was entered as a base 2 log-transformed variable (log2 bilirubin), indicating doubling of the bilirubin level corresponding to each increase in variable value. The higher bilirubin groups had a lower ejection fraction (range 19% to 21%), sodium (range 138 to 139 mmol/L), and systolic blood pressure (range 111 to 120 mm Hg), a greater heart rate (range 79 to 81 beats/min), and greater diuretic dosages (range 86 to 110 furosemide-equivalent total daily dose in mg). The overall survival rates declined with increasing bilirubin (24.3, 31.3, and 44.3 deaths per 100 person-years, respectively, for groups 1, 2, and 3). Although a positive relation was seen between log2 bilirubin and both pump failure risk and sudden death risk, the relation in multivariate modeling was significant only for pump failure mortality (hazard ratio 1.47, 95% confidence interval 1.19 to 1.82, p = 0.0004), not for sudden death mortality (hazard ratio 1.21, 95% confidence interval 0.98 to 1.49, p = 0.08). In conclusion, an increasing bilirubin level was significantly associated with the risk of pump failure death but not for sudden death in patients with severe systolic heart failure. Copyright © 2013 Elsevier Inc. All rights reserved.
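
The base-2 log transform used above can be illustrated directly. `beta` below is back-derived from the reported pump-failure HR of 1.47; the only point is that a one-unit increase in log2(bilirubin) corresponds to exactly one doubling of the bilirubin level, whatever the starting value.

```python
import numpy as np

# Cox model with the covariate entered as log2(bilirubin):
#   h(t | x) = h0(t) * exp(beta * log2(x))
# Doubling x adds exactly 1 to log2(x), so the hazard is multiplied
# by exp(beta) -- the reported HR "per doubling".
beta = np.log(1.47)  # coefficient implied by the pump-failure HR

def relative_hazard(x, beta):
    """Hazard relative to a reference bilirubin level of 1 mg/dl."""
    return np.exp(beta * np.log2(x))

# One doubling (0.6 -> 1.2 mg/dl) multiplies the hazard by exp(beta):
ratio = relative_hazard(1.2, beta) / relative_hazard(0.6, beta)
print(round(ratio, 2))  # 1.47, independent of the starting level
```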

  10. Acute allograft failure in thoracic organ transplantation.

    PubMed

    Jahania, M S; Mullett, T W; Sanchez, J A; Narayan, P; Lasley, R D; Mentzer, R M

    2000-01-01

    Thoracic organ transplantation is an effective form of treatment for end-stage heart and lung disease. Despite major advances in the field, transplant patients remain at risk for acute allograft dysfunction, a major cause of early and late mortality. The most common causes of allograft failure include primary graft failure secondary to inadequate heart and lung preservation during cold storage, cellular rejection, and various donor-recipient-related factors. During cold storage and early reperfusion, heart and lung allografts are vulnerable to intracellular calcium overload, acidosis, cell swelling, injury mediated by reactive oxygen species, and the inflammatory response. Brain death itself is associated with a reduction in myocardial contractility, and recipient-related factors such as preexisting pulmonary hypertension can lead to acute right heart failure and the pulmonary reimplantation response. The development of new methods to prevent or treat these various causes of acute graft failure could lead to a marked improvement in short- and long-term survival of patients undergoing thoracic organ transplantation.

  11. A random-censoring Poisson model for underreported data.

    PubMed

    de Oliveira, Guilherme Lopes; Loschi, Rosangela Helena; Assunção, Renato Martins

    2017-12-30

    A major challenge when monitoring risks in socially deprived areas of underdeveloped countries is that economic, epidemiological, and social data are typically underreported. Thus, statistical models that do not take the data quality into account will produce biased estimates. To deal with this problem, counts in suspected regions are usually approached as censored information. The censored Poisson model can be considered, but all censored regions must be precisely known a priori, which is not a reasonable assumption in most practical situations. We introduce the random-censoring Poisson model (RCPM), which accounts for the uncertainty about both the count and the data reporting processes. Consequently, for each region, we will be able to estimate the relative risk for the event of interest as well as the censoring probability. To facilitate the posterior sampling process, we propose a Markov chain Monte Carlo scheme based on the data augmentation technique. We run a simulation study comparing the proposed RCPM with 2 competitive models under different scenarios. The RCPM and the censored Poisson model are applied to account for potential underreporting of early neonatal mortality counts in regions of Minas Gerais State, Brazil, where data quality is known to be poor. Copyright © 2017 John Wiley & Sons, Ltd.
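
The fixed censored-Poisson building block that the RCPM generalises can be sketched as a likelihood in which a flagged (underreported) region contributes P(Y >= y_obs) rather than the Poisson pmf. The RCPM itself treats the censoring indicator as random and samples it by MCMC, which this sketch omits.

```python
import math

def censored_poisson_loglik(counts, censored, mu):
    """Log-likelihood of the censored Poisson model: a region flagged as
    underreported contributes P(Y >= y_obs) (its count is a lower bound);
    an ordinary region contributes the Poisson pmf P(Y = y_obs)."""
    ll = 0.0
    for y, c in zip(counts, censored):
        if c:
            # P(Y >= y) = 1 - P(Y <= y - 1)
            cdf = sum(math.exp(-mu) * mu ** k / math.factorial(k) for k in range(y))
            ll += math.log(1.0 - cdf)
        else:
            # log of the Poisson pmf via lgamma for numerical stability
            ll += y * math.log(mu) - mu - math.lgamma(y + 1)
    return ll

# A censored count is less informative: P(Y >= y) >= P(Y = y), so the
# censored contribution is never smaller than the exact-count one.
print(censored_poisson_loglik([5], [True], 4.0)
      > censored_poisson_loglik([5], [False], 4.0))
```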

  12. Censored quantile regression with recursive partitioning-based weights

    PubMed Central

    Wey, Andrew; Wang, Lan; Rudser, Kyle

    2014-01-01

    Censored quantile regression provides a useful alternative to the Cox proportional hazards model for analyzing survival data. It directly models the conditional quantile of the survival time and hence is easy to interpret. Moreover, it relaxes the proportionality constraint on the hazard function associated with the popular Cox model and is natural for modeling heterogeneity of the data. Recently, Wang and Wang (2009. Locally weighted censored quantile regression. Journal of the American Statistical Association 103, 1117–1128) proposed a locally weighted censored quantile regression approach that allows for covariate-dependent censoring and is less restrictive than other censored quantile regression methods. However, their kernel smoothing-based weighting scheme requires all covariates to be continuous and encounters practical difficulty with even a moderate number of covariates. We propose a new weighting approach that uses recursive partitioning, e.g. survival trees, that offers greater flexibility in handling covariate-dependent censoring in moderately high dimensions and can incorporate both continuous and discrete covariates. We prove that this new weighting scheme leads to consistent estimation of the quantile regression coefficients and demonstrate its effectiveness via Monte Carlo simulations. We also illustrate the new method using a widely recognized data set from a clinical trial on primary biliary cirrhosis. PMID:23975800

  13. [Liver transplant with donated graft after controlled cardiac death. Current situation].

    PubMed

    Abradelo De Usera, Manuel; Jiménez Romero, Carlos; Loinaz Segurola, Carmelo; Moreno González, Enrique

    2013-11-01

    Increasing pressure on the liver transplant waiting list forces us to explore new sources in order to expand the donor pool. One of the most interesting, and one with promising potential, is donation after cardiac death (DCD). Initially, this activity developed in Spain by means of Maastricht type II donation in the uncontrolled setting. For different reasons, donation after controlled cardiac death has been reconsidered in our country. The most important issue in DCD donation is the potential ischemic stress, which can cause severe liver graft cell damage and adversely affect liver transplant results in terms of complications and outcomes. The complex and particular issues related to DCD donation are discussed in this review. Copyright © 2012 AEC. Published by Elsevier Espana. All rights reserved.

  14. Causes of graft failure in simultaneous pancreas-kidney transplantation by various time periods.

    PubMed

    Wakil, Kayo; Sugawara, Yasuhiko; Kokudo, Norihiro; Kadowaki, Takashi

    2013-01-01

    Data collected by the United Network for Organ Sharing from all approved United States transplant programs was analyzed. The data included 26,572 adult diabetic patients who received a primary pancreas transplant between January 1987 and December 2012. Simultaneous pancreas-kidney (SPK) transplantation was the major therapeutic option for diabetes patients. SPK had better graft survival than pancreas transplant alone (PTA) or pancreas-after-kidney (PAK) or pancreas-with-kidney (from a living donor, PWK). The 5-year pancreas graft survival rates for SPK, PWK, PAK, and PTA were 70.0%, 57.2%, 54.0%, and 48.2%, respectively. When long-term SPK pancreas graft survival was examined by various transplant time periods, it was found that survival has remained almost stable since 1996. Graft survival rates were high among the pancreas recipients transplanted in the periods 1996-2000, 2001-2005, and 2006-2012, and the rates were similar: the 5-year rates were 68.9%, 72.4%, and 73.8%, respectively. Technical failure was the leading cause of graft loss during the first year post-transplant, regardless of period: 61.3%, 68.6%, 64.2%, and 71.9% for 1987-1995, 1996-2000, 2001-2005, and 2006-2012, respectively. After one year, chronic rejection was the leading cause of graft loss in all periods: 51.8%, 53.2%, 44.3%, and 40.7% for 1987-1995, 1996-2000, 2001-2005, and 2006-2012, respectively. Chronic rejection accounted for around 50% (or more) of the grafts that survived over five years. Survival of long-term pancreas grafts as well as long-term causes of graft loss remained almost unchanged across the different transplant periods. Clearly, there is a need for a means to identify early markers of chronic rejection, and to control it to improve long-term survival.

  15. Three-year outcomes after percutaneous coronary intervention and coronary artery bypass grafting in patients with heart failure: from the CREDO-Kyoto percutaneous coronary intervention/coronary artery bypass graft registry cohort-2†.

    PubMed

    Marui, Akira; Kimura, Takeshi; Nishiwaki, Noboru; Komiya, Tatsuhiko; Hanyu, Michiya; Shiomi, Hiroki; Tanaka, Shiro; Sakata, Ryuzo

    2015-02-01

    Ischaemic heart disease is a major risk factor for heart failure. However, the long-term benefit of percutaneous coronary intervention (PCI) or coronary artery bypass grafting (CABG) in these patients has not been well elucidated. Of the 15 939 patients undergoing first myocardial revascularization enrolled in the CREDO-Kyoto PCI/CABG Registry Cohort-2, we identified 1064 patients with multivessel and/or left main disease with a history of heart failure (ACC/AHA Stage C or D). Of these, 672 patients underwent PCI and 392 underwent CABG. Preprocedural left ventricular ejection fraction was not different between PCI and CABG (46.6 ± 15.1 vs 46.6 ± 14.6%, P = 0.89), but the CABG group included more patients with triple-vessel and left main disease (P < 0.01 each). Three-year outcomes revealed that the risk of hospital readmission for heart failure was higher after PCI than after CABG (hazard ratio [95% confidence interval] 1.90 [1.18-3.05], P = 0.01). More importantly, adjusted mortality after PCI was significantly higher than after CABG (1.79 [1.13-2.82], P = 0.01). The risk of cardiac death after PCI was also higher than after CABG (1.98 [1.10-3.55], P = 0.02). Stratified analysis using the SYNTAX score demonstrated that risk of death was not different between PCI and CABG in patients with low (<23) and intermediate (23-32) SYNTAX scores (2.10 [0.57-7.68], P = 0.26 and 1.43 [0.63-3.21], P = 0.39, respectively), whereas in those with a high (≥33) SYNTAX score, the risk of death was far higher after PCI than after CABG (4.83 [1.46-16.0], P = 0.01). In patients with heart failure and advanced coronary artery disease, CABG was a better option than PCI because it was associated with a survival benefit, particularly in more complex coronary lesions as stratified by the SYNTAX score. © The Author 2014. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  16. Edifoligide and long-term outcomes after coronary artery bypass grafting: PRoject of Ex-vivo Vein graft ENgineering via Transfection IV (PREVENT IV) 5-year results.

    PubMed

    Lopes, Renato D; Williams, Judson B; Mehta, Rajendra H; Reyes, Eric M; Hafley, Gail E; Allen, Keith B; Mack, Michael J; Peterson, Eric D; Harrington, Robert A; Gibson, C Michael; Califf, Robert M; Kouchoukos, Nicholas T; Ferguson, T Bruce; Lorenz, Todd J; Alexander, John H

    2012-09-01

    Edifoligide, an E2F transcription factor decoy, does not prevent vein graft failure or adverse clinical outcomes at 1 year in patients undergoing coronary artery bypass grafting (CABG). We compared the 5-year clinical outcomes of patients in PREVENT IV treated with edifoligide and placebo to identify predictors of long-term clinical outcomes. A total of 3,014 patients undergoing CABG with at least 2 planned vein grafts were enrolled. Kaplan-Meier curves were generated to compare the long-term effects of edifoligide and placebo. A Cox proportional hazards model was constructed to identify factors associated with 5-year post-CABG outcomes. The main outcome measures were death, myocardial infarction (MI), repeat revascularization, and rehospitalization through 5 years. Five-year follow-up was complete in 2,865 patients (95.1%). At 5 years, patients randomized to edifoligide and placebo had similar rates of death (11.7% and 10.7%, respectively), MI (2.3% and 3.2%), revascularization (14.1% and 13.9%), and rehospitalization (61.6% and 62.5%). The composite outcome of death, MI, or revascularization occurred at similar frequency in patients assigned to edifoligide and placebo (26.3% and 25.5%, respectively; hazard ratio 1.03 [95% CI 0.89-1.18], P = .721). Factors associated with death, MI, or revascularization at 5 years included peripheral and/or cerebrovascular disease, time on cardiopulmonary bypass, lung disease, diabetes mellitus, and congestive heart failure. Up to a quarter of patients undergoing CABG will have a major cardiac event or repeat revascularization procedure within 5 years of surgery. Edifoligide does not affect outcomes after CABG; however, common identifiable baseline and procedural risk factors are associated with long-term outcomes after CABG. Copyright © 2012 Mosby, Inc. All rights reserved.

  17. Impact of the pretransplant dialysis modality on kidney transplantation outcomes: a nationwide cohort study.

    PubMed

    Lin, Huan-Tang; Liu, Fu-Chao; Lin, Jr-Rung; Pang, See-Tong; Yu, Huang-Ping

    2018-06-04

    Most patients with uraemia must undergo chronic dialysis while awaiting kidney transplantation; however, the role of the pretransplant dialysis modality on the outcomes of kidney transplantation remains obscure. The objective of this study was to clarify the associations between the pretransplant dialysis modality, namely haemodialysis (HD) or peritoneal dialysis (PD), and the development of post-transplant de novo diseases, allograft failure and all-cause mortality for kidney-transplant recipients. Retrospective nationwide cohort study. Data retrieved from the Taiwan National Health Insurance Research Database. The National Health Insurance database was explored for patients who received kidney transplantation in Taiwan during 1998-2011 and underwent dialysis >90 days before transplantation. The pretransplant characteristics, complications during kidney transplantation and post-transplant outcomes were statistically analysed and compared between the HD and PD groups. Cox regression analysis was used to evaluate the HR of the dialysis modality on graft failure and all-cause mortality. The primary outcomes were long-term post-transplant death-censored allograft failure and all-cause mortality started after 90 days of kidney transplantation until the end of follow-up. The secondary outcomes were events during kidney transplantation and post-transplant de novo diseases adjusted by propensity score in log-binomial model. There were 1812 patients included in our cohort, among which 1209 (66.7%) and 603 (33.3%) recipients received pretransplant HD and PD, respectively. Recipients with chronic HD were generally older and male, had higher risks of developing post-transplant de novo ischaemic heart disease, tuberculosis and hepatitis C after adjustment. Pretransplant HD contributed to higher graft failure in the multivariate analysis (HR 1.38, p<0.05) after adjustment for the recipient age, sex, duration of dialysis and pretransplant diseases. There was no significant

  18. Impact of censoring on learning Bayesian networks in survival modelling.

    PubMed

    Stajduhar, Ivan; Dalbelo-Basić, Bojana; Bogunović, Nikola

    2009-11-01

    Bayesian networks are commonly used for presenting uncertainty and covariate interactions in an easily interpretable way. Because of their efficient inference and ability to represent causal relationships, they are an excellent choice for medical decision support systems in diagnosis, treatment, and prognosis. Although good procedures for learning Bayesian networks from data have been defined, their performance in learning from censored survival data has not been widely studied. In this paper, we explore how to use these procedures to learn about possible interactions between prognostic factors and their influence on the variate of interest. We study how censoring affects the probability of learning correct Bayesian network structures. Additionally, we analyse the potential usefulness of the learnt models for predicting the time-independent probability of an event of interest. We analysed the influence of censoring with a simulation on synthetic data sampled from randomly generated Bayesian networks. We used two well-known methods for learning Bayesian networks from data: a constraint-based method and a score-based method. We compared the performance of each method under different levels of censoring to those of the naive Bayes classifier and the proportional hazards model. We did additional experiments on several datasets from real-world medical domains. The machine-learning methods treated censored cases in the data as event-free. We report and compare results for several commonly used model evaluation metrics. On average, the proportional hazards method outperformed other methods in most censoring setups. As part of the simulation study, we also analysed structural similarities of the learnt networks. Heavy censoring, as opposed to no censoring, produces up to a 5% surplus and up to 10% missing total arcs. It also produces up to 50% missing arcs that should originally be connected to the variate of interest. 

  19. Semiparametric regression analysis of interval-censored competing risks data.

    PubMed

    Mao, Lu; Lin, Dan-Yu; Zeng, Donglin

    2017-09-01

    Interval-censored competing risks data arise when each study subject may experience an event or failure from one of several causes and the failure time is not observed directly but rather is known to lie in an interval between two examinations. We formulate the effects of possibly time-varying (external) covariates on the cumulative incidence or sub-distribution function of competing risks (i.e., the marginal probability of failure from a specific cause) through a broad class of semiparametric regression models that captures both proportional and non-proportional hazards structures for the sub-distribution. We allow each subject to have an arbitrary number of examinations and accommodate missing information on the cause of failure. We consider nonparametric maximum likelihood estimation and devise a fast and stable EM-type algorithm for its computation. We then establish the consistency, asymptotic normality, and semiparametric efficiency of the resulting estimators for the regression parameters by appealing to modern empirical process theory. In addition, we show through extensive simulation studies that the proposed methods perform well in realistic situations. Finally, we provide an application to a study on HIV-1 infection with different viral subtypes. © 2017, The International Biometric Society.

  20. Successful transplantation of kidneys from elderly circulatory death donors by using microscopic and macroscopic characteristics to guide single or dual implantation.

    PubMed

    Mallon, D H; Riddiough, G E; Summers, D M; Butler, A J; Callaghan, C J; Bradbury, L L; Bardsley, V; Broecker, V; Saeb-Parsy, K; Torpey, N; Bradley, J A; Pettigrew, G J

    2015-11-01

    Most kidneys from potential elderly circulatory death (DCD) donors are declined. We report single center outcomes for kidneys transplanted from DCD donors over 70 years old, using preimplantation biopsy Remuzzi grading to inform implantation as single or dual transplants. Between 2009 and 2012, 43 single transplants and 12 dual transplants were performed from elderly DCD donors. Remuzzi scores were higher for dual than single implants (4.4 vs. 3.4, p < 0.001), indicating more severe baseline injury. Donor and recipient characteristics for both groups were otherwise similar. Early graft loss from renal vein thrombosis occurred in two singly implanted kidneys, and in one dual-implanted kidney; its pair continued to function satisfactorily. Death-censored graft survival at 3 years was comparable for the two groups (single 94%; dual 100%), as was 1 year eGFR. Delayed graft function occurred less frequently in the dual-implant group (25% vs. 65%, p = 0.010). Using this approach, we performed proportionally more kidney transplants from elderly DCD donors (23.4%) than the rest of the United Kingdom (7.3%, p < 0.001), with graft outcomes comparable to those achieved nationally for all deceased-donor kidney transplants. Preimplantation biopsy analysis is associated with acceptable transplant outcomes for elderly DCD kidneys and may increase transplant numbers from an underutilized donor pool. © Copyright 2015 The American Society of Transplantation and the American Society of Transplant Surgeons.

  1. Rewarming preservation by organ perfusion system for donation after cardiac death liver grafts in pigs.

    PubMed

    Matsuno, N; Obara, H; Watanabe, R; Iwata, S; Kono, S; Fujiyama, M; Hirano, T; Kanazawa, H; Enosawa, S

    2014-05-01

    Use of grafts from donors after cardiac death (DCD) would greatly contribute to the expansion of the donor organ pool. However, this requires the development of novel preservation methods to recover the organ from changes due to warm ischemia time (WIT). Porcine livers were perfused with a newly developed machine perfusion (MP) system. The livers were perfused with a modified University of Wisconsin (UW)-gluconate solution. All grafts were procured after acute hemorrhagic shock with the ventilator off. For group 1 (n = 6), grafts were procured after a WIT of 60 minutes and preserved by hypothermic MP (HMP) for 3 hours. For group 2 (n = 5), grafts were preserved with 2 hours of simple cold storage (SCS) and HMP for 2 hours. For group 3 (n = 6), grafts were preserved with 2 hours of SCS and rewarming up to 25°C by MP for 2 hours (RMP). The preserved liver grafts were transplanted orthotopically. The alanine aminotransferase level in the perfusate during RMP was maintained below that during HMP. The levels of aspartate aminotransferase and lactate dehydrogenase in the 2 hours after reperfusion were significantly lower in group 3. Histologically, the necrosis of hepatocytes was less severe in group 3. The survival rate in group 3 was 2/4, but 0/4 in the other groups. RMP is expected to facilitate the recovery of DCD liver grafts. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Urinary potassium excretion, renal ammoniagenesis, and risk of graft failure and mortality in renal transplant recipients.

    PubMed

    Eisenga, Michele F; Kieneker, Lyanne M; Soedamah-Muthu, Sabita S; van den Berg, Else; Deetman, Petronella E; Navis, Gerjan J; Gans, Reinold Ob; Gaillard, Carlo Ajm; Bakker, Stephan Jl; Joosten, Michel M

    2016-12-01

    Renal transplant recipients (RTRs) have commonly been urged to limit their potassium intake during renal insufficiency and may adhere to this principle after transplantation. Importantly, in experimental animal models, low dietary potassium intake induces kidney injury through stimulation of ammoniagenesis. In humans, low potassium intake is an established risk factor for high blood pressure. We hypothesized that low 24-h urinary potassium excretion [UKV; urinary potassium concentration × volume], the gold standard for assessment of dietary potassium intake, represents a risk factor for graft failure and mortality in RTRs. In secondary analyses, we aimed to investigate whether these associations could be explained by ammoniagenesis, plasma potassium, or blood pressure. In a prospective cohort of 705 RTRs, we assessed dietary potassium intake by a single 24-h UKV and food-frequency questionnaires. Cox regression analyses were used to investigate prospective associations with outcome. We included 705 stable RTRs (mean ± SD age: 53 ± 13 y; 57% men) at 5.4 y (IQR: 1.9-12.0 y) after transplantation and 253 kidney donors. Mean ± SD UKV was 73 ± 24 mmol/24 h in RTRs compared with 85 ± 25 mmol/24 h in kidney donors. During follow-up for 3.1 y (IQR: 2.7-3.9 y), 45 RTRs developed graft failure and 83 died. RTRs in the lowest sex-specific tertile of UKV (women, <55 mmol/24 h; men, <65 mmol/24 h) had an increased risk of graft failure (HR: 3.70; 95% CI: 1.64, 8.34) and risk of mortality (HR: 2.66; 95% CI: 1.53, 4.61), independent of potential confounders. In causal path analyses, 24-h urinary ammonia excretion, plasma potassium, and blood pressure did not affect these associations. Our results indicate that low UKV is associated with a higher risk of graft failure and mortality in RTRs. Specific attention to adequate potassium intake after transplantation seems warranted. This trial was registered at clinicaltrials.gov as NCT02811835.
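
The sex-specific tertile construction used in this analysis can be sketched on hypothetical data. The cohort below is simulated only to mimic the reported sex-specific mean UKV values; it is not the study data.

```python
import numpy as np

def sex_specific_tertiles(ukv, is_male):
    """Assign tertiles (0 = lowest) of 24-h urinary potassium excretion,
    with cut points computed separately within each sex."""
    ukv = np.asarray(ukv, float)
    is_male = np.asarray(is_male, bool)
    tert = np.empty(len(ukv), int)
    for sex in (True, False):
        m = is_male == sex
        cuts = np.quantile(ukv[m], [1 / 3, 2 / 3])   # within-sex cut points
        tert[m] = np.searchsorted(cuts, ukv[m], side="right")
    return tert

# Hypothetical cohort: men ~80 mmol/24 h, women ~65 mmol/24 h on average.
rng = np.random.default_rng(2)
is_male = rng.random(600) < 0.57
ukv = np.where(is_male, rng.normal(80, 24, 600), rng.normal(65, 24, 600))
tert = sex_specific_tertiles(ukv, is_male)
print(np.bincount(tert[is_male]))   # roughly equal thirds within men
print(np.bincount(tert[~is_male]))  # and within women
```

Computing cut points within sex is what makes the lowest tertile thresholds differ between women (<55 mmol/24 h) and men (<65 mmol/24 h) in the abstract.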

  3. Living Donation Versus Donation After Circulatory Death Liver Transplantation for Low MELD Recipients.

    PubMed

    Kling, Catherine E; Perkins, James D; Reyes, Jorge D; Montenovo, Martin I

    2018-04-10

    Background In this era of organ scarcity, living donor liver transplant (LDLT) is an alternative to using deceased donors and in Western countries is more often used in low model for end-stage liver disease (MELD) recipients. We sought to compare the patient survival and graft survival between recipients of liver transplantation from living donors and donation after circulatory death (DCD) donors in patients with low MELD scores. Methods Retrospective cohort analysis of adult liver transplant recipients with a laboratory MELD <= 20 who underwent transplantation between 01/01/2003 and 03/31/2016. Recipients were categorized by donor graft type (DCD or LDLT) and recipient and donor characteristics were compared. Ten-year patient and graft survival curves were calculated using Kaplan-Meier analyses and a mixed-effects model was performed to determine the contributions of recipient, donor and center variables on patient and graft survival. Results 36,705 liver transplants were performed - 2,166 (5.9%) were from DCD donors and 2,284 (6.2%) from living donors. In the mixed-effects model, DCD status was associated with a higher risk of graft failure (RR 1.27, 95% CI 1.16-1.38) but not worse patient survival (RR 1.27, 95% CI: 0.96-1.67). Lower DCD center experience was associated with a 1.21 higher risk of patient death (95% CI: 1.17-1.25) and 1.13 higher risk of graft failure (95% CI: 1.12-1.15). LDLT center experience was also predictive of patient survival (RR 1.03, 95% CI: 1.02-1.03) and graft failure (RR 1.05, 95% CI: 1.05-1.06). Conclusion For liver transplant recipients with low laboratory MELD, LDLT offers better graft survival and a tendency to better patient survival than DCD donors. This article is protected by copyright. All rights reserved. © 2018 by the American Association for the Study of Liver Diseases.

  4. The predictive value of the antioxidative function of HDL for cardiovascular disease and graft failure in renal transplant recipients.

    PubMed

    Leberkühne, Lynn J; Ebtehaj, Sanam; Dimova, Lidiya G; Dikkers, Arne; Dullaart, Robin P F; Bakker, Stephan J L; Tietge, Uwe J F

    2016-06-01

    Protection of low-density lipoproteins (LDL) against oxidative modification is a key anti-atherosclerotic property of high-density lipoproteins (HDL). This study evaluated the predictive value of the HDL antioxidative function for cardiovascular mortality, all-cause mortality and chronic graft failure in renal transplant recipients (RTR). The capacity of HDL to inhibit native LDL oxidation was determined in vitro in a prospective cohort of renal transplant recipients (RTR, n = 495, median follow-up 7.0 years). The HDL antioxidative functionality was significantly higher in patients experiencing graft failure (57.4 ± 9.7%) than in those without (54.2 ± 11.3%; P = 0.039), while there were no differences for cardiovascular and all-cause mortality. Specifically, glomerular filtration rate (P = 0.001) and C-reactive protein levels (P = 0.006) associated independently with antioxidative functionality in multivariate linear regression analyses. Cox regression analysis demonstrated a significant relationship between antioxidative functionality of HDL and graft failure in age-adjusted analyses, but significance was lost following adjustment for baseline kidney function and inflammatory load. No significant association was found between HDL antioxidative functionality and cardiovascular and all-cause mortality. This study demonstrates that the antioxidative function of HDL (i) does not predict cardiovascular or all-cause mortality in RTR, but (ii) conceivably contributes to the development of graft failure, although not independently of baseline kidney function and inflammatory load. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  5. Liver Transplantation Using Grafts From Donors After Circulatory Death: A Propensity Score-Matched Study From a Single Center.

    PubMed

    Laing, R W; Scalera, I; Isaac, J; Mergental, H; Mirza, D F; Hodson, J; Wilkin, R J W; Perera, M T P R; Muiesan, P

    2016-06-01

    The use of livers from donation after circulatory death (DCD) is increasing, but concerns exist regarding outcomes following use of grafts from "marginal" donors. To compare outcomes in transplants using DCD and donation after brain death (DBD), propensity score matching was performed for 973 patients with chronic liver disease and/or malignancy who underwent primary whole-liver transplant between 2004 and 2014 at University Hospitals Birmingham NHS Foundation Trust. Primary end points were overall graft and patient survival. Secondary end points included postoperative, biliary and vascular complications. Over 10 years, 234 transplants were carried out using DCD grafts. Of the 187 matched DCDs, 82.9% were classified as marginal per British Transplantation Society guidelines. Kaplan-Meier analysis of graft and patient survival found no significant differences for either outcome between the paired DCD and DBD patients (p = 0.162 and p = 0.519, respectively). Aspartate aminotransferase was significantly higher in DCD recipients until 48 h after transplant (p < 0.001). The incidences of acute kidney injury and ischemic cholangiopathy were greater in DCD recipients (32.6% vs. 15% [p < 0.001] and 9.1% vs. 1.1% [p < 0.001], respectively). With appropriate recipient selection, the use of DCDs, including those deemed marginal, can be safe and can produce outcomes comparable to those seen using DBD grafts in similar recipients. © Copyright 2016 The American Society of Transplantation and the American Society of Transplant Surgeons.

  6. Renal transplantation in systemic lupus erythematosus: outcome and prognostic factors in 50 cases from a single centre.

    PubMed

    Cairoli, Ernesto; Sanchez-Marcos, Carolina; Espinosa, Gerard; Glucksmann, Constanza; Ercilla, Guadalupe; Oppenheimer, Federico; Cervera, Ricard

    2014-01-01

    End-stage renal disease (ESRD) is an important cause of morbidity and mortality in patients with systemic lupus erythematosus (SLE). To analyze the outcome and prognostic factors of renal transplantation in patients with ESRD due to SLE from January 1986 to December 2013 in a single center. Fifty renal transplantations were performed in 40 SLE patients (32 female (80%), mean age at transplantation 36±10.4 years). The most frequent lupus nephropathy was type IV (72.2%). Graft failure occurred in a total of 15 (30%) transplantations, and the causes of graft failure were chronic allograft nephropathy (n=12), acute rejection (n=2), and chronic humoral rejection (n=1). The death-censored graft survival rates were 93.9% at 1 year, 81.5% at 5 years, and 67.6% at the end of the study. The presence of a deceased donor allograft (P=0.007) and positive anti-HCV antibodies (P=0.001) negatively influenced the survival of the renal transplant. The patient survival rate was 91.4% at the end of the study. Recurrence of lupus nephritis in the renal allograft was observed in one patient. Renal transplantation is a good alternative for renal replacement therapy in patients with SLE. In our cohort, the presence of anti-HCV antibodies and the type of donor source were related to the development of graft failure.

  7. Renal Transplantation in Systemic Lupus Erythematosus: Outcome and Prognostic Factors in 50 Cases from a Single Centre

    PubMed Central

    Cairoli, Ernesto; Sanchez-Marcos, Carolina; Espinosa, Gerard; Glucksmann, Constanza; Ercilla, Guadalupe; Oppenheimer, Federico; Cervera, Ricard

    2014-01-01

    Background. End-stage renal disease (ESRD) is an important cause of morbidity and mortality in patients with systemic lupus erythematosus (SLE). Objectives. To analyze the outcome and prognostic factors of renal transplantation in patients with ESRD due to SLE from January 1986 to December 2013 in a single center. Results. Fifty renal transplantations were performed in 40 SLE patients (32 female (80%), mean age at transplantation 36 ± 10.4 years). The most frequent lupus nephropathy was type IV (72.2%). Graft failure occurred in a total of 15 (30%) transplantations, and the causes of graft failure were chronic allograft nephropathy (n = 12), acute rejection (n = 2), and chronic humoral rejection (n = 1). The death-censored graft survival rates were 93.9% at 1 year, 81.5% at 5 years, and 67.6% at the end of the study. The presence of a deceased donor allograft (P = 0.007) and positive anti-HCV antibodies (P = 0.001) negatively influenced the survival of the renal transplant. The patient survival rate was 91.4% at the end of the study. Recurrence of lupus nephritis in the renal allograft was observed in one patient. Conclusion. Renal transplantation is a good alternative for renal replacement therapy in patients with SLE. In our cohort, the presence of anti-HCV antibodies and the type of donor source were related to the development of graft failure. PMID:25013800
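
    The death-censored graft survival rates reported above treat death with a functioning graft as a censoring event, so only graft losses count as events. A minimal Kaplan-Meier sketch under that convention (the function and the data are illustrative, not taken from the study):

```python
# Kaplan-Meier estimator for death-censored graft survival.
# Convention: event = 1 only for graft failure; death with a functioning
# graft (and loss to follow-up) is coded 0, i.e. censored.
# Illustrative data, not from the study.

def kaplan_meier(times, events):
    """Return (time, survival) pairs at each graft-failure time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        at_t = [e for tt, e in data if tt == t]   # subjects sharing time t
        failures = sum(at_t)
        if failures:
            surv *= 1 - failures / n_at_risk
            curve.append((t, surv))
        n_at_risk -= len(at_t)   # everyone at time t leaves the risk set
        i += len(at_t)
    return curve

# Years of follow-up and event indicators for five hypothetical grafts:
times = [1, 2, 3, 4, 5]
events = [1, 0, 1, 0, 1]   # 0 = censored (e.g. death with function)
curve = kaplan_meier(times, events)
```

The same routine yields all-cause graft survival if deaths with a functioning graft are coded as events instead of censorings.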

  8. Biomarkers of Myocardial Stress and Fibrosis as Predictors of Mode of Death in Patients with Chronic Heart Failure

    PubMed Central

    Ahmad, Tariq; Fiuzat, Mona; Neely, Ben; Neely, Megan; Pencina, Michael J.; Kraus, William E.; Zannad, Faiez; Whellan, David J.; Donahue, Mark; Piña, Ileana L.; Adams, Kirkwood; Kitzman, Dalane W.; O’Connor, Christopher M.; Felker, G. Michael

    2014-01-01

    Objective To determine whether biomarkers of myocardial stress and fibrosis improve prediction of mode of death in patients with chronic heart failure. Background The two most common modes of death in patients with chronic heart failure are pump failure and sudden cardiac death. Prediction of mode of death may facilitate treatment decisions. The relationship between NT-proBNP, galectin-3, and ST2, biomarkers that reflect different pathogenic pathways in heart failure (myocardial stress and fibrosis), and mode of death is unknown. Methods HF-ACTION was a randomized controlled trial of exercise training vs. usual care in patients with chronic heart failure due to left ventricular systolic dysfunction (LVEF<35%). An independent clinical events committee prospectively adjudicated mode of death. NT-proBNP, galectin-3, and ST2 levels were assessed at baseline in 813 subjects. Associations between biomarkers and mode of death were assessed using cause-specific Cox proportional hazards modeling, and interaction testing was used to measure differential associations between biomarkers and pump failure versus sudden cardiac death. Discrimination and risk reclassification metrics were used to assess the added value of galectin-3 and ST2 in predicting mode of death risk beyond a clinical model that included NT-proBNP. Results After a median follow-up of 2.5 years, there were 155 deaths: 49 from pump failure, 42 from sudden cardiac death, and 64 from other causes. Elevations in all biomarkers were associated with increased risk of both pump failure and sudden cardiac death in both adjusted and unadjusted analyses. In each case, increases in the biomarker had a stronger association with pump failure than sudden cardiac death, but this relationship was attenuated after adjustment for clinical risk factors. Clinical variables along with NT-proBNP levels were stronger predictors of pump failure (C statistic: 0.87) than sudden cardiac death (C statistic: 0.73). Addition of ST2 and

  9. Postreperfusion hyperkalemia in liver transplantation using donation after cardiac death grafts with pathological changes.

    PubMed

    Zhang, Wen-Jin; Xia, Wei-Liang; Pan, Hui-Yun; Zheng, Shu-Sen

    2016-10-01

    With the increasing use of donation after cardiac death (DCD), especially of graft livers with steatosis or other pathological changes, the frequency of postreperfusion hyperkalemia in liver transplantation has increased significantly. The present study aimed to determine the factors associated with developing postreperfusion hyperkalemia in liver transplantation from DCD. One hundred thirty-one consecutive adult patients who underwent orthotopic liver transplantation from DCD were retrospectively studied. Based on serum potassium within 5 minutes after reperfusion, recipients were divided into two groups: hyperkalemia and normokalemia. According to preoperative biopsy results, the DCD graft livers were classified into five categories. Univariate analysis was performed using the Chi-square test to identify variables that differed significantly between the two groups. Multivariate logistic regression was used to confirm the risk factors for developing hyperkalemia and postreperfusion syndrome. Correlation analysis was used to identify the relationship between the serum potassium concentration within 5 minutes after reperfusion and the difference in mean arterial pressure before and within 5 minutes after reperfusion. Twenty-two of 131 liver recipients had hyperkalemia episodes within 5 minutes after reperfusion. The rate of hyperkalemia was significantly higher in recipients of macrosteatotic DCD graft livers (78.6%, P<0.001) than in recipients of non-macrosteatotic DCD graft livers. The odds ratio for developing postreperfusion hyperkalemia in recipients of macrosteatotic DCD graft livers was 51.3 (P<0.001). Macrosteatosis in the DCD graft liver was an independent risk factor for developing hyperkalemia within 5 minutes after reperfusion. The highest rate of postreperfusion syndrome also occurred in recipients of macrosteatotic DCD graft livers (71.4%, P<0.001). A strong relationship existed between the serum potassium within 5 minutes after

  10. Midterm Results of Renal Transplantation From Controlled Cardiac Death Donors Are Similar to Those From Brain Death Donors.

    PubMed

    Lafuente, O; Sánchez-Sobrino, B; Pérez, M; López-Sánchez, P; Janeiro, D; Rubio, E; Huerta, A; Marques, M; Llópez-Carratala, M R; Rubio, J J; Portolés, J

    2016-11-01

    The systematic use of grafts from controlled donors after cardiac death (cDCD) started in our country in 2012 and expanded with the strategic support of the National Transplant Organization. We present our experience in kidney transplantation with organs from cDCD donors, with a mean follow-up of 3 years. This was an observational prospective study of all transplants performed in our center in 2012-2013 and followed to 2016. The immunosuppression protocols were triple therapy for low-risk recipients of a graft from a standard brain-death donor (DBD), with the addition of basiliximab or thymoglobulin induction for extended-criteria donors or high-risk recipients, respectively, and thymoglobulin induction plus triple therapy for all cDCD recipients. A total of 42 donors were included (84 grafts in total, but 1 discarded due to multiple cysts): 25 DBD and 17 cDCD, without differences in age or sex. The graft use rate was 98.9% for cDCD; 55 grafts were implanted in our hospital (26 DBD and 29 cDCD), and the remaining 28 grafts were transferred to other centers. There were no differences in primary failure (3.4% cDCD vs 7.4% DBD), but the cDCD organs had a higher incidence of delayed graft function (51.7% vs 25.9%). Despite that, graft and patient survival, as well as glomerular filtration rate (66.3 vs 59.6 mL/min), were similar in both groups. Only 1 patient died at home with a functioning graft in the cDCD group. Despite a higher rate of delayed graft function with cDCD, the midterm outcomes are at least similar to those with DBD. cDCD programs should be promoted to increase the chances of a transplant for our patients. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Biomarkers of myocardial stress and fibrosis as predictors of mode of death in patients with chronic heart failure.

    PubMed

    Ahmad, Tariq; Fiuzat, Mona; Neely, Benjamin; Neely, Megan L; Pencina, Michael J; Kraus, William E; Zannad, Faiez; Whellan, David J; Donahue, Mark P; Piña, Ileana L; Adams, Kirkwood F; Kitzman, Dalane W; O'Connor, Christopher M; Felker, G Michael

    2014-06-01

    The aim of this study was to determine whether biomarkers of myocardial stress and fibrosis improve prediction of the mode of death in patients with chronic heart failure. The 2 most common modes of death in patients with chronic heart failure are pump failure and sudden cardiac death. Prediction of the mode of death may facilitate treatment decisions. The relationship between amino-terminal pro-brain natriuretic peptide (NT-proBNP), galectin-3, and ST2, biomarkers that reflect different pathogenic pathways in heart failure (myocardial stress and fibrosis), and mode of death is unknown. HF-ACTION (Heart Failure: A Controlled Trial Investigating Outcomes of Exercise Training) was a randomized controlled trial of exercise training versus usual care in patients with chronic heart failure due to left ventricular systolic dysfunction (left ventricular ejection fraction ≤35%). An independent clinical events committee prospectively adjudicated mode of death. NT-proBNP, galectin-3, and ST2 levels were assessed at baseline in 813 subjects. Associations between biomarkers and mode of death were assessed using cause-specific Cox proportional hazards modeling, and interaction testing was used to measure differential associations between biomarkers and pump failure versus sudden cardiac death. Discrimination and risk reclassification metrics were used to assess the added value of galectin-3 and ST2 in predicting mode of death risk beyond a clinical model that included NT-proBNP. After a median follow-up period of 2.5 years, there were 155 deaths: 49 from pump failure, 42 from sudden cardiac death, and 64 from other causes. Elevations in all biomarkers were associated with increased risk for both pump failure and sudden cardiac death in both adjusted and unadjusted analyses. In each case, increases in the biomarker had a stronger association with pump failure than sudden cardiac death, but this relationship was attenuated after adjustment for clinical risk factors. Clinical

  12. An analysis of surgical and anaesthetic factors affecting skin graft viability in patients admitted to a Burns Intensive Care Unit.

    PubMed

    Isitt, Catherine E; McCloskey, Kayleigh A; Caballo, Alvaro; Sharma, Pranev; Williams, Andrew; Leon-Villapalos, Jorge; Vizcaychipi, Marcela P

    2016-01-01

    Skin graft failure is a recognised complication in the treatment of major burns. Little research to date has analysed the impact of the complex physiological management of burns patients on the success of skin grafting. We analysed surgical and anaesthetic variables to identify factors contributing to graft failure. Inclusion criteria were admission to our Burns Intensive Care Unit (BICU) between January 2009 and October 2013 with a major burn. After exclusion for death before hospital discharge or prior skin graft at a different hospital, 35 patients remained and were divided into those with successful autografts (n=16) and those with a failed autograft (n=19). For the purposes of this study, we defined poor autograft viability as requiring at least one additional skin graft to the same site. Logistic regression of variables was performed using SPSS (version 22.0, IBM). Age, sex, % total burn surface area, and Belgian Outcome Burns Injury score did not differ significantly between groups. No differences were found in any surgical factor on logistic regression (graft site, harvest site, infection, etc.). When all operations were analysed, the use of colloids was found to be significantly associated with graft failure (p=0.035), and this remained significant when only split thickness skin grafts (STSGs) and debridement operations were included (p=0.034). No differences were found in crystalloid use, intraoperative temperature, pre-operative haemoglobin, blood products, or vasopressor use. This analysis highlights an independent association between colloids and graft failure that has not been previously documented.

  13. Preimplant Histologic Acute Tubular Necrosis and Allograft Outcomes

    PubMed Central

    Hall, Isaac E.; Reese, Peter P.; Weng, Francis L.; Schröppel, Bernd; Doshi, Mona D.; Hasz, Rick D.; Reitsma, William; Goldstein, Michael J.; Hong, Kwangik

    2014-01-01

    Background and objectives The influence of deceased-donor AKI on post-transplant outcomes is poorly understood. The few published studies about deceased-donor preimplant biopsy have reported conflicting results regarding associations between AKI and recipient outcomes. Design, setting, participants, & measurements This multicenter study aimed to evaluate associations between deceased-donor biopsy reports of acute tubular necrosis (ATN) and delayed graft function (DGF), and secondarily for death-censored graft failure, first adjusting for the kidney donor risk index and then stratifying by donation after cardiac death (DCD) status. Results Between March 2010 and April 2012, 651 kidneys (369 donors, 4 organ procurement organizations) were biopsied and subsequently transplanted, with ATN reported in 110 (17%). There were 262 recipients (40%) who experienced DGF and 38 (6%) who experienced graft failure. DGF occurred in 45% of kidneys with reported ATN compared with 39% without ATN (P=0.31) resulting in a relative risk (RR) of 1.13 (95% confidence interval [95% CI], 0.9 to 1.43) and a kidney donor risk index–adjusted RR of 1.11 (95% CI, 0.88 to 1.41). There was no significant difference in graft failure for kidneys with versus without ATN (8% versus 5%). In stratified analyses, the adjusted RR for DGF with ATN was 0.97 (95% CI, 0.7 to 1.34) for non-DCD kidneys and 1.59 (95% CI, 1.23 to 2.06) for DCD kidneys (P=0.02 for the interaction between ATN and DCD on the development of DGF). Conclusions Despite a modest association with DGF for DCD kidneys, this study reveals no significant associations overall between preimplant biopsy-reported ATN and the outcomes of DGF or graft failure. The potential benefit of more rigorous ATN reporting is unclear, but these findings provide little evidence to suggest that current ATN reports are useful for predicting graft outcomes or deciding to accept or reject allograft offers. PMID:24558049
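
    The unadjusted RR quoted above is simply the ratio of DGF incidence in kidneys with versus without reported ATN. A generic risk-ratio helper, shown with hypothetical counts rather than the study's actual data:

```python
# Unadjusted relative risk from a 2x2 layout of exposure vs. outcome.
# Counts below are hypothetical, for illustration only.

def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Risk ratio: incidence in the exposed over incidence in the unexposed."""
    risk_exposed = events_exposed / n_exposed
    risk_unexposed = events_unexposed / n_unexposed
    return risk_exposed / risk_unexposed

# e.g. DGF in 45 of 100 ATN kidneys vs. 39 of 100 kidneys without ATN:
rr = relative_risk(45, 100, 39, 100)   # ratio of 45% to 39% incidence
```

Adjustment for the kidney donor risk index, as in the study, would require a regression model rather than this raw ratio.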

  14. Prediction models of donor arrest and graft utilization in liver transplantation from maastricht-3 donors after circulatory death.

    PubMed

    Davila, D; Ciria, R; Jassem, W; Briceño, J; Littlejohn, W; Vilca-Meléndez, H; Srinivasan, P; Prachalias, A; O'Grady, J; Rela, M; Heaton, N

    2012-12-01

    Shortage of organs for transplantation has led to the renewed interest in donation after circulatory-determination of death (DCDD). We conducted a retrospective analysis (2001-2009) and a subsequent prospective validation (2010) of liver Maastricht-Category-3-DCDD and donation-after-brain-death (DBD) offers to our program. Accepted and declined offers were compared. Accepted DCDD offers were divided into donors who went on to cardiac arrest and those who did not. Donors who arrested were divided into those producing grafts that were transplanted or remained unused. Descriptive comparisons and regression analyses were performed to assess predictor models of donor cardiac arrest and graft utilization. Variables from the multivariate analysis were prospectively validated. Of 1579 DCDD offers, 621 were accepted, and of these, 400 experienced cardiac arrest after withdrawal of support. Of these, 173 livers were transplanted. In the DCDD group, donor age < 40 years, use of inotropes and absence of gag/cough reflexes were predictors of cardiac arrest. Donor age >50 years, BMI >30, warm ischemia time >25 minutes, ITU stay >7 days and ALT ≥ 4× normal rates were risk factors for not using the graft. These variables had excellent sensitivity and specificity for the prediction of cardiac arrest (AUROC = 0.835) and graft use (AUROC = 0.748) in the 2010 prospective validation. These models can feasibly predict cardiac arrest in potential DCDDs and graft usability, helping to avoid unnecessary recoveries and healthcare expenditure. © Copyright 2012 The American Society of Transplantation and the American Society of Transplant Surgeons.

  15. Relation between lowered colloid osmotic pressure, respiratory failure, and death.

    PubMed

    Tonnesen, A S; Gabel, J C; McLeavey, C A

    1977-01-01

    Plasma colloid osmotic pressure was measured each day in 84 intensive care unit patients. Probit analysis demonstrated a direct relationship between colloid osmotic pressure (COP) and survival. The COP associated with a 50% survival rate was 15.0 torr. COP was higher in survivors than in nonsurvivors without respiratory failure and in patients who recovered from respiratory failure. We conclude that lowered COP is associated with an elevated mortality rate. However, the relationship to death is not explained by the relationship to respiratory failure.
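
    In a probit model the survival probability is Φ(a + b·COP), so the COP associated with 50% survival is the point where the linear predictor crosses zero, i.e. -a/b. A sketch with hypothetical coefficients chosen to reproduce a 15.0 torr crossing (the study's fitted values are not given):

```python
# Probit dose-response sketch: P(survival) = Phi(a + b * COP).
# Coefficients a and b are hypothetical, chosen so the 50% survival
# point lands at 15.0 torr as in the abstract.
from math import erf, sqrt

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

a, b = -3.0, 0.2        # hypothetical probit intercept and slope
cop50 = -a / b          # COP at which predicted survival is 50%
# phi(a + b * cop50) evaluates to 0.5, and cop50 is 15.0 here
```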

  16. Graft reconditioning with nitric oxide gas in rat liver transplantation from cardiac death donors.

    PubMed

    Kageyama, Shoichi; Yagi, Shintaro; Tanaka, Hirokazu; Saito, Shunichi; Nagai, Kazuyuki; Hata, Koichiro; Fujimoto, Yasuhiro; Ogura, Yasuhiro; Tolba, Rene; Shinji, Uemoto

    2014-03-27

    Liver transplant outcomes using grafts donated after cardiac death (DCD) remain poor. We investigated the effects of ex vivo reconditioning of DCD grafts with venous systemic oxygen persufflation using nitric oxide gas (VSOP-NO) in rat liver transplants. Orthotopic liver transplants were performed in Lewis rats, using DCD grafts prepared with static cold storage alone (group-control) or with reconditioning using VSOP-NO during cold storage (group-VSOP-NO). Experiment I: In a 30-min warm ischemia model, graft damage and hepatic expression of inflammatory cytokines, endothelial nitric oxide synthase (eNOS), inducible nitric oxide synthase (iNOS), and endothelin-1 (ET-1) were examined, and histologic analysis was performed 2, 6, 24, and 72 hr after transplantation. Experiment II: In a 60-min warm ischemia model, grafts were evaluated 2 hr after transplantation (6 rats/group), and survival was assessed (7 rats/group). Experiment I: Group-VSOP-NO had lower alanine aminotransferase (ALT) (P<0.001), hyaluronic acid (P<0.05), and malondialdehyde (MDA) (P<0.001), lower hepatic interleukin-6 (IL-6) expression (P<0.05), and lower hepatic tumor necrosis factor-alpha (TNF-α) expression (P<0.001). Hepatic eNOS expression (P<0.001) was upregulated, whereas hepatic iNOS (P<0.01) and ET-1 (P<0.001) expression was downregulated. Damage to hepatocytes and sinusoidal endothelial cells (SECs) was lower in group-VSOP-NO. Experiment II: VSOP-NO decreased ET-1 and 8-hydroxy-2'-deoxyguanosine (8-OHdG) expression and improved survival after transplantation to 71.4% (P<0.01). These results suggest that VSOP-NO effectively reconditions warm ischemia-damaged grafts, presumably by decreasing ET-1 upregulation and oxidative damage.

  17. Factors Associated with Mortality and Graft Failure in Liver Transplants: A Hierarchical Approach

    PubMed Central

    Andraus, Wellington; de Martino, Rodrigo Bronze; Ortega, Neli Regina de Siqueira; Abe, Jair Minoro; D’Albuquerque, Luiz Augusto Carneiro

    2015-01-01

    Background Liver transplantation has received increased attention in the medical field since the 1980s following the introduction of new immunosuppressants and improved surgical techniques. Currently, transplantation is the treatment of choice for patients with end-stage liver disease, and it has been expanded for other indications. Liver transplantation outcomes depend on donor factors, operating conditions, and the disease stage of the recipient. A retrospective cohort was studied to identify mortality and graft failure rates and their associated factors. All adult liver transplants performed in the state of São Paulo, Brazil, between 2006 and 2012 were studied. Methods and Findings A hierarchical Poisson multiple regression model was used to analyze factors related to mortality and graft failure in liver transplants. A total of 2,666 patients, 18 years or older, (1,482 males; 1,184 females) were investigated. Outcome variables included mortality and graft failure rates, which were grouped into a single binary variable called negative outcome rate. Additionally, donor clinical, laboratory, intensive care, and organ characteristics and recipient clinical data were analyzed. The mortality rate was 16.2 per 100 person-years (py) (95% CI: 15.1–17.3), and the graft failure rate was 1.8 per 100 py (95% CI: 1.5–2.2). Thus, the negative outcome rate was 18.0 per 100 py (95% CI: 16.9–19.2). The best risk model demonstrated that recipient creatinine ≥ 2.11 mg/dl [RR = 1.80 (95% CI: 1.56–2.08)], total bilirubin ≥ 2.11 mg/dl [RR = 1.48 (95% CI: 1.27–1.72)], Na+ ≥ 141.01 mg/dl [RR = 1.70 (95% CI: 1.47–1.97)], RNI ≥ 2.71 [RR = 1.64 (95% CI: 1.41–1.90)], body surface ≥ 1.98 [RR = 0.81 (95% CI: 0.68–0.97)] and donor age ≥ 54 years [RR = 1.28 (95% CI: 1.11–1.48)], male gender [RR = 1.19(95% CI: 1.03–1.37)], dobutamine use [RR = 0.54 (95% CI: 0.36–0.82)] and intubation ≥ 6 days [RR = 1.16 (95% CI: 1.10–1.34)] affected the negative outcome

  18. “Smooth” Semiparametric Regression Analysis for Arbitrarily Censored Time-to-Event Data

    PubMed Central

    Zhang, Min; Davidian, Marie

    2008-01-01

    Summary A general framework for regression analysis of time-to-event data subject to arbitrary patterns of censoring is proposed. The approach is relevant when the analyst is willing to assume that distributions governing model components that are ordinarily left unspecified in popular semiparametric regression models, such as the baseline hazard function in the proportional hazards model, have densities satisfying mild “smoothness” conditions. Densities are approximated by a truncated series expansion that, for fixed degree of truncation, results in a “parametric” representation, which makes likelihood-based inference coupled with adaptive choice of the degree of truncation, and hence flexibility of the model, computationally and conceptually straightforward with data subject to any pattern of censoring. The formulation allows popular models, such as the proportional hazards, proportional odds, and accelerated failure time models, to be placed in a common framework; provides a principled basis for choosing among them; and renders useful extensions of the models straightforward. The utility and performance of the methods are demonstrated via simulations and by application to data from time-to-event studies. PMID:17970813
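
    For reference, two of the models the abstract places in this common framework can be written in their standard forms, for a covariate vector $x$, coefficients $\beta$, and baseline hazard $\lambda_0(t)$ (notation here is generic, not the paper's):

```latex
% Proportional hazards: covariates rescale a baseline hazard
\lambda(t \mid x) = \lambda_0(t)\,\exp(\beta^\top x)

% Accelerated failure time: covariates shift the log event time T
\log T = \beta^\top x + \varepsilon
```

The paper's "smoothness" device approximates the unspecified component (e.g. the density underlying $\lambda_0$ or the error density of $\varepsilon$) by a truncated series, which is what makes likelihood-based inference tractable under arbitrary censoring.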

  19. Effect of censoring trace-level water-quality data on trend-detection capability

    USGS Publications Warehouse

    Gilliom, R.J.; Hirsch, R.M.; Gilroy, E.J.

    1984-01-01

    Monte Carlo experiments were used to evaluate whether trace-level water-quality data that are routinely censored (not reported) contain valuable information for trend detection. Measurements are commonly censored if they fall below a level associated with some minimum acceptable level of reliability (detection limit). Trace-level organic data were simulated with best- and worst-case estimates of measurement uncertainty, various concentrations and degrees of linear trend, and different censoring rules. The resulting classes of data were subjected to a nonparametric statistical test for trend. For all classes of data evaluated, trends were detected more effectively in uncensored data than in censored data, even when the censored data were highly unreliable. Thus, censoring data at any concentration level may eliminate valuable information. Whether valuable information for trend analysis is in fact eliminated by censoring of actual rather than simulated data depends on whether the analytical process is in statistical control and bias is predictable for a particular type of chemical analysis.
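
    The effect this experiment measures can be reproduced in miniature: censoring values below a detection limit collapses them into ties, which carry no trend information for a rank-based test. A toy sketch using a made-up series and the Mann-Kendall S statistic (the abstract does not name its exact nonparametric test, so this is a stand-in):

```python
# Toy version of the censoring experiment: an upward-trending series is
# censored at a detection limit (values below it replaced by the limit,
# as a substitution rule would do), and the Mann-Kendall S statistic is
# compared before and after. Series and limit are made up.

def mann_kendall_s(x):
    """S = sum over all pairs i < j of sign(x[j] - x[i]); ties add 0."""
    s = 0
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            s += (x[j] > x[i]) - (x[j] < x[i])
    return s

series = [0.2, 0.4, 0.3, 0.6, 0.8, 0.7, 1.0, 1.2]   # uncensored
dl = 0.5                                             # detection limit
censored = [v if v >= dl else dl for v in series]    # substitute at the DL

# Ties among the censored values weaken the trend signal, so the
# censored series yields a smaller S than the uncensored one.
```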

  20. Donor Preconditioning After the Onset of Brain Death With Dopamine Derivate n-Octanoyl Dopamine Improves Early Posttransplant Graft Function in the Rat.

    PubMed

    Li, S; Korkmaz-Icöz, S; Radovits, T; Ruppert, M; Spindler, R; Loganathan, S; Hegedűs, P; Brlecic, P; Theisinger, B; Theisinger, S; Höger, S; Brune, M; Lasitschka, F; Karck, M; Yard, B; Szabó, G

    2017-07-01

    Heart transplantation is the therapy of choice for end-stage heart failure. However, hemodynamic instability, which has been demonstrated in brain-dead donors (BDD), could also affect the posttransplant graft function. We tested the hypothesis that treatment of the BDD with the dopamine derivate n-octanoyl-dopamine (NOD) improves donor cardiac and graft function after transplantation. Donor rats were given a continuous intravenous infusion of either NOD (0.882 mg/kg/h, BDD+NOD, n = 6) or a physiological saline vehicle (BDD, n = 9) for 5 h after the induction of brain death by inflation of a subdural balloon catheter. Controls were sham-operated (n = 9). In BDD, decreased left-ventricular contractility (ejection fraction; maximum rate of rise of left-ventricular pressure; preload recruitable stroke work), relaxation (maximum rate of fall of left-ventricular pressure; Tau), and increased end-diastolic stiffness were significantly improved after the NOD treatment. Following the transplantation, the NOD-treatment of BDD improved impaired systolic function and ventricular relaxation. Additionally, after transplantation increased interleukin-6, tumor necrosis factor TNF-α, NF-kappaB-p65, and nuclear factor (NF)-kappaB-p105 gene expression, and increased caspase-3, TNF-α and NF-kappaB protein expression could be significantly downregulated by the NOD treatment compared to BDD. BDD postconditioning with NOD through downregulation of the pro-apoptotic factor caspase-3, pro-inflammatory cytokines, and NF-kappaB may protect the heart against the myocardial injuries associated with brain death and ischemia/reperfusion. © 2017 The American Society of Transplantation and the American Society of Transplant Surgeons.

  1. Ex Vivo Perfusion Characteristics of Donation After Cardiac Death Kidneys Predict Long-Term Graft Survival.

    PubMed

    Sevinc, M; Stamp, S; Ling, J; Carter, N; Talbot, D; Sheerin, N

    2016-12-01

    Ex vivo perfusion is used in our unit for kidneys donated after cardiac death (DCD). Perfusion flow index (PFI), resistance, and perfusate glutathione S-transferase (GST) can be measured to assess graft viability. We assessed whether measurements taken during perfusion could predict long-term outcome after transplantation. All DCD kidney transplants performed from 2002 to 2014 were included in this study. The exclusion criteria were incomplete data, kidneys not machine perfused, kidneys perfused in continuous mode, and dual transplantation. There were 155 kidney transplantations included in the final analysis. Demographic data, ischemia times, donor hypertension, graft function, survival, and machine perfusion parameters after 3 hours were analyzed. Each perfusion parameter was divided into three groups: high, medium, and low. Estimated glomerular filtration rate was calculated at 12 months and then yearly after transplantation. There was a significant association between graft survival and both PFI and GST (P = .020 and P = .022, respectively). PFI was the only independent parameter predicting graft survival. A low PFI during ex vivo hypothermic perfusion is associated with inferior graft survival after DCD kidney transplantation. We propose that PFI is a measure of the health of the graft vasculature and that a low PFI indicates vascular disease and therefore predicts a worse long-term outcome. Copyright © 2016 Elsevier Inc. All rights reserved.
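
    Splitting each perfusion parameter into high, medium, and low groups amounts to cutting at the tertiles of its distribution. A minimal sketch (the cut-point convention and the data are assumptions, not the authors' stated method):

```python
# Divide a perfusion parameter (e.g. PFI readings) into low / medium /
# high tertile groups. The convention used here (lower third < cut1,
# middle third < cut2, rest high) is one reasonable choice; the paper
# does not specify its exact rule.

def tertile_groups(values):
    ranked = sorted(values)
    n = len(ranked)
    cut1 = ranked[n // 3]          # boundary between low and medium
    cut2 = ranked[(2 * n) // 3]    # boundary between medium and high

    def label(v):
        if v < cut1:
            return "low"
        if v < cut2:
            return "medium"
        return "high"

    return [label(v) for v in values]

# e.g. nine hypothetical readings fall evenly into the three groups:
groups = tertile_groups([1, 2, 3, 4, 5, 6, 7, 8, 9])
```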

  2. Impact of graft implantation order on graft survival in simultaneous pancreas-kidney transplantation.

    PubMed

    Niclauss, Nadja; Bédat, Benoît; Morel, Philippe; Andres, Axel; Toso, Christian; Berney, Thierry

    2016-05-01

    The optimal order of revascularization for pancreas and kidney grafts in simultaneous pancreas-kidney (SPK) transplantation has not been established. In this study, we investigate the influence of graft implantation order on graft survival in SPK. 12,700 transplantations from the Scientific Registry of Transplant Recipients were analyzed retrospectively. Graft implantation order was determined based on the reported ischemia times of pancreas and kidney grafts. Pancreas and kidney graft survival were analyzed by graft implantation order at 3 months and 5 years using Kaplan-Meier plots. Significance was tested with the log-rank test and a Cox regression model. In 8454 transplantations, the pancreas was implanted first (PBK), and in 4246 transplantations, the kidney was implanted first (KBP). The proportion of lost pancreas grafts at 3 months was significantly lower in PBK (9.4% vs. 10.8%, P = 0.011). An increasing time lag (>2 h) between kidney and pancreas graft implantation in KBP accentuated the detrimental impact on pancreas graft survival (12.5% graft loss at 3 months, P = 0.001). Technical failure rates were reduced in PBK (5.6% vs. 6.9%, P = 0.005). Graft implantation order had no impact on kidney graft survival. In summary, although the observed differences are small, implanting the pancreas graft first increases short-term pancreas graft survival and reduces rates of technical failure. © 2016 Steunstichting ESOT.

  3. Impact of early graft function on 10-year graft survival in recipients of kidneys from standard- or expanded-criteria donors.

    PubMed

    Smail, Nassima; Tchervenkov, Jean; Paraskevas, Steven; Baran, Dana; Mucsi, Istvan; Hassanain, Mazen; Chaudhury, Prosanto; Cantarovich, Marcelo

    2013-07-27

    The use of kidneys from expanded-criteria donors (ECD) is regarded with caution. We compared 279 kidney transplant recipients (KTxR) from standard-criteria donors (SCD) and 237 from ECD, transplanted between January 1990 and December 2006. We evaluated the impact of immediate graft function (IGF), slow graft function (SGF), and delayed graft function (DGF) and of a drop in estimated glomerular filtration rate (ΔeGFR) ≤ 30% or > 30% during the first year after transplantation on long-term patient and death-censored graft survival (DCGS). Ten-year patient survival was similar in SCD- and ECD-KTxR (P = 0.38). DCGS was better in SCD-KTxR versus ECD-KTxR (77.3% vs. 67.3%; P = 0.01). DCGS did not differ between groups experiencing IGF (P = 0.17) or DGF (P = 0.12). However, DCGS was worse in ECD-KTxR experiencing SGF (84.9% vs. 73.7%; P = 0.04). Predictors of DCGS were 1-year serum creatinine (hazard ratio, 1.03; P < 0.0001) and ΔeGFR > 30% between 1 and 12 months (Δ1-12eGFR) after transplantation (hazard ratio, 2.2; P = 0.02). In ECD-KTxR with IGF and more than 1-year follow-up, 10-year DCGS was better in those with Δ1-12eGFR ≤ 30% versus those with Δ1-12eGFR > 30% (83.8% vs. 53.6%; P = 0.01). Recipients of SCD or ECD kidneys with IGF or DGF had similar 10-year patient survival and DCGS. SGF had a worse impact on DCGS in ECD-KTxR. In addition to 1-year serum creatinine, Δ1-12eGFR > 30% is a negative predictor of DCGS. Larger studies should confirm whether increasing the use of ECD kidneys, while avoiding factors that contribute to SGF or DGF and/or a decline in eGFR during the first year after transplantation, may expand the donor pool with acceptable long-term outcomes.
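
The Δ1-12eGFR > 30% criterion above is a simple relative-decline rule. A minimal sketch, assuming the drop is computed as the relative decrease from the 1-month value (the abstract does not state the exact formula):

```python
def delta_egfr_exceeds_30(egfr_1mo, egfr_12mo):
    """Return True if eGFR declined by more than 30% between month 1 and
    month 12 post-transplant (the abstract's negative predictor of DCGS).
    Assumption: the decline is relative to the 1-month value."""
    drop = (egfr_1mo - egfr_12mo) / egfr_1mo
    return drop > 0.30
```

For example, a fall from 60 to 40 mL/min/1.73 m² (a 33% decline) would flag the recipient, while a fall from 60 to 50 would not.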

  4. Parametric Model Based On Imputations Techniques for Partly Interval Censored Data

    NASA Astrophysics Data System (ADS)

    Zyoud, Abdallah; Elfaki, F. A. M.; Hrairi, Meftah

    2017-12-01

    Survival analysis describes a collection of statistical procedures for analyzing data in which the outcome variable of interest is the time until an event occurs; the time to failure of a specific experimental unit may be right, left, or interval censored, or partly interval censored (PIC). In this paper, a parametric Cox model was analyzed with PIC data. Several imputation techniques were used: midpoint, left and right point, random, mean, and median. Maximum likelihood estimation was used to obtain the estimated survival function. These estimates were then compared with existing models, namely the Turnbull and Cox models, on clinical trial data (breast cancer data), which demonstrated the validity of the proposed model. Results on this dataset indicated that the parametric Cox model was superior in terms of estimation of survival functions, likelihood ratio tests, and their P values. Moreover, among the imputation techniques, the midpoint, random, mean, and median imputations gave better results with respect to estimation of the survival function.
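
The imputation schemes named above can be sketched in a few lines. This is an illustrative reading of the abstract, not the authors' code; in particular, taking "mean" and "median" to refer to the exactly observed times in the sample is an assumption, since the paper's reference set is not spelled out here:

```python
import random
import statistics

def impute_pic(left, right, method, exact_times=(), rng=random):
    """Impute an event time known only to lie in [left, right].
    'mean'/'median' use the exactly observed times in the sample
    (an assumption; the abstract does not specify the reference set)."""
    if method == "midpoint":
        return (left + right) / 2
    if method == "left":
        return left
    if method == "right":
        return right
    if method == "random":
        return rng.uniform(left, right)  # uniform draw within the interval
    if method == "mean":
        return statistics.mean(exact_times)
    if method == "median":
        return statistics.median(exact_times)
    raise ValueError(f"unknown method: {method}")
```

Once every interval is replaced by a single imputed time, standard right-censored machinery (here, the parametric Cox model) can be fit to the completed data.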

  5. The influence of HLA mismatches and immunosuppression on kidney graft survival: an analysis of more than 1300 patients.

    PubMed

    Martins, L; Fonseca, I; Sousa, S; Matos, C; Santos, J; Dias, L; Henriques, A C; Sarmento, A M; Cabrita, A

    2007-10-01

    New immunosuppressive drugs used in kidney transplantation have decreased the incidence of acute rejection. It has been hypothesized that, given their potency, the importance of HLA matching has decreased. To evaluate the influence of HLA matching, immunosuppression, and other possible risk factors, we analyzed data from 1314 consecutive deceased donor kidney transplantations. We divided the patient population into 4 cohorts, according to the era of transplantation: era 1, before 1990, azathioprine (Aza) and non-microemulsion cyclosporine (CsA); era 2, between 1990 and 1995, CsA microemulsion; era 3, between 1996 and 2000, wide use of mycophenolate mofetil (MMF) and anti-thymocyte globulin (ATG); and era 4, after 2000, marked by sirolimus and tacrolimus (TAC) use. Multivariate analysis compared death-censored graft survival. Using as reference the results obtained with 0 HLA mismatches, we verified, during era 1 and era 2, an increased risk of graft loss for all of the subgroups with HLA mismatch >0. However, during era 3 and era 4, the number of HLA mismatches did not influence graft survival. Acute rejection and delayed graft function, which decreased in the later periods, remained prognostic factors for graft loss. Considering the immunosuppressive protocol with CsA+Aza+prednisone (Pred) as reference, protocols used after 1995 with Pred+CsA+ATG, with Pred+CsA+MMF, and with Pred+TAC+MMF presented better survival results. The results showed that the significance of HLA matching decreased while outcomes improved with the new immunosuppressant drugs. These observations support the hypothesis that the weakened importance of HLA matching may be a consequence of the increasing efficacy of immunosuppression.

  6. Cumulative incidence for wait-list death in relation to length of queue for coronary-artery bypass grafting: a cohort study.

    PubMed

    Sobolev, Boris G; Kuramoto, Lisa; Levy, Adrian R; Hayden, Robert

    2006-08-24

    In deciding where to undergo coronary-artery bypass grafting, the length of surgical wait lists is often the only information available to cardiologists and their patients. Our objective was to compare the cumulative incidence for death on the wait list according to the length of wait lists at the time of registration for the operation. The study cohort included 8966 patients who registered to undergo isolated coronary-artery bypass grafting (82.4% men; 71.9% semi-urgent; 22.4% non-urgent). The patients were categorized according to wait-list clearance time at registration: either "1 month or less" or "more than 1 month". Cumulative incidence for wait-list death was compared between the groups, and the significance of the difference was tested by means of regression models. Urgent patients never registered on a wait list with a clearance time of more than 1 month. Semi-urgent patients registered on shorter wait lists more often than non-urgent patients (79.1% vs. 44.7%). In semi-urgent and non-urgent patients, the observed proportion of wait-list deaths by 52 weeks was lower in category "1 month or less" than in category "more than 1 month" (0.8% [49 deaths] vs. 1.6% [39 deaths], P < 0.005). After adjustment, the odds of death before surgery were 64% higher in patients on longer lists, odds ratio [OR] = 1.64 (95% confidence interval [CI] 1.02-2.63). The observed death rate was higher in category "more than 1 month" than in category "1 month or less", 0.79 (95% CI 0.54-1.04) vs. 0.58 (95% CI 0.42-0.74) per 1000 patient-weeks, the adjusted OR = 1.60 (95% CI 1.01-2.53). Longer wait times (log-rank test = 266.4, P < 0.001) and higher death rates contributed to a higher cumulative incidence for death on wait lists with a clearance time of more than 1 month. Long wait lists for coronary-artery bypass grafting are associated with increased probability that a patient dies before surgery.
Physicians who advise patients where to undergo cardiac revascularization should consider the length of surgical wait lists.

  7. Longterm results of liver transplantation from donation after circulatory death.

    PubMed

    Blok, Joris J; Detry, Olivier; Putter, Hein; Rogiers, Xavier; Porte, Robert J; van Hoek, Bart; Pirenne, Jacques; Metselaar, Herold J; Lerut, Jan P; Ysebaert, Dirk K; Lucidi, Valerio; Troisi, Roberto I; Samuel, Undine; den Dulk, A Claire; Ringers, Jan; Braat, Andries E

    2016-08-01

    Donation after circulatory death (DCD) liver transplantation (LT) may imply a risk for decreased graft survival, caused by posttransplantation complications such as primary nonfunction or ischemic-type biliary lesions. However, similar survival rates for DCD and donation after brain death (DBD) LT have been reported. The objective of this study is to determine the longterm outcome of DCD LT in the Eurotransplant region corrected for the Eurotransplant donor risk index (ET-DRI). Transplants performed in Belgium and the Netherlands (January 1, 2003 to December 31, 2007) in adult recipients were included. Graft failure was defined as either the date of recipient death or retransplantation, whichever occurred first (death-uncensored graft survival). Mean follow-up was 7.2 years. In total, 126 DCD and 1264 DBD LTs were performed. Kaplan-Meier survival analyses showed different graft survival for DBD and DCD at 1 year (77.7% versus 74.8%, respectively; P = 0.71), 5 years (65.6% versus 54.4%, respectively; P = 0.02), and 10 years (47.3% versus 44.2%, respectively; P = 0.55; log-rank P = 0.038). Although there was an overall significant difference, the survival curves almost converge after 10 years, most likely because other risk factors are less prevalent among DCD livers. Patient survival was not significantly different (P = 0.59). Multivariate Cox regression analysis showed a hazard ratio of 1.7 (P < 0.001) for DCD (corrected for ET-DRI and recipient factors). A first warm ischemia time (WIT), the time from the end of circulation until aortic cold perfusion, over 25 minutes was associated with lower graft survival in univariate analysis of all DCD transplants (P = 0.002). In conclusion, DCD LT has an increased risk of diminished graft survival compared with DBD. There was no significant difference in patient survival. DCD allografts with a first WIT > 25 minutes have an increased risk of decreased graft survival. Liver Transplantation 22 1107
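
The graft-survival percentages above come from Kaplan-Meier estimation. A from-scratch sketch of the estimator (illustrative only, not the authors' code):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of graft survival.
    times: follow-up times; events: 1 = graft failure, 0 = censored.
    Returns [(t, S(t))] at each distinct failure time."""
    data = sorted(zip(times, events))
    n = len(data)
    at_risk, surv, curve, i = n, 1.0, [], 0
    while i < n:
        t = data[i][0]
        d = c = 0
        while i < n and data[i][0] == t:  # group ties at time t
            if data[i][1]:
                d += 1
            else:
                c += 1
            i += 1
        if d:
            surv *= 1 - d / at_risk  # multiply in the conditional survival
            curve.append((t, surv))
        at_risk -= d + c
    return curve
```

A log-rank comparison of DBD versus DCD would then contrast two such curves; the crossing behavior noted in the abstract is exactly what a single hazard-ratio summary can obscure.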

  8. Simplified technique for auxiliary orthotopic liver transplantation using a whole graft

    PubMed Central

    ROCHA-SANTOS, Vinicius; NACIF, Lucas Souto; PINHEIRO, Rafael Soares; DUCATTI, Liliana; ANDRAUS, Wellington; D'ALBURQUERQUE, Luiz Carneiro

    2015-01-01

    Background Acute liver failure is associated with a high mortality rate, and the main purposes of treatment are to prevent cerebral edema and infections, which often are responsible for patient death. Orthotopic liver transplantation is the gold standard treatment and improves 1-year survival. Aim To describe an alternative technique for auxiliary liver transplantation in acute liver failure. Method Whole-graft auxiliary liver transplantation was performed as an alternative to partial auxiliary liver transplantation: a whole liver graft from a child was implanted after the native right liver was removed by right hepatectomy. The patient met O'Grady's criteria, and the rationale for indicating auxiliary orthotopic liver transplantation was the acute presentation, without hemodynamic instability or renal failure, in a patient with deteriorating consciousness. Results The procedure improved liver function and decreased intracranial hypertension in the postoperative period. Conclusion This technique can overcome some postoperative complications that are associated with partial grafts. As far as is known, this is the first case of auxiliary orthotopic liver transplantation in Brazil. PMID:26176253

  9. Modeling absolute differences in life expectancy with a censored skew-normal regression approach

    PubMed Central

    Clough-Gorr, Kerri; Zwahlen, Marcel

    2015-01-01

    Parameter estimates from commonly used multivariable parametric survival regression models do not directly quantify differences in years of life expectancy. Gaussian linear regression models give results in terms of absolute mean differences, but are not appropriate for modeling life expectancy, because in many situations time to death has a negatively skewed distribution. A regression approach using a skew-normal distribution would be an alternative to parametric survival models in the modeling of life expectancy, because parameter estimates can be interpreted in terms of survival time differences while allowing for skewness of the distribution. In this paper we show how to use skew-normal regression so that censored and left-truncated observations are accounted for. We then model differences in life expectancy using data from the Swiss National Cohort Study and from official life expectancy estimates and compare the results with those derived from commonly used survival regression models. We conclude that a censored skew-normal survival regression approach for left-truncated observations can be used to model differences in life expectancy across covariates of interest. PMID:26339544
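
A minimal numerical sketch of the censored-likelihood idea, under assumptions: the direct (location/scale/shape) skew-normal parameterization, and censored observations contributing the survival probability computed by numerically integrating the density (the paper's actual estimation machinery is not reproduced here):

```python
import math

def skew_normal_pdf(x, xi, omega, alpha):
    """Skew-normal density with location xi, scale omega, shape alpha."""
    z = (x - xi) / omega
    phi = math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)   # N(0,1) density
    Phi = 0.5 * (1 + math.erf(alpha * z / math.sqrt(2)))    # N(0,1) CDF at alpha*z
    return 2.0 / omega * phi * Phi

def censored_loglik(obs, xi, omega, alpha, upper=None, steps=20000):
    """Log-likelihood for right-censored data: exact times contribute log f(t);
    censored times contribute log S(t), with S(t) obtained by midpoint-rule
    integration of the density from t to an effective upper limit."""
    if upper is None:
        upper = xi + 12 * omega  # density is negligible beyond this point
    ll = 0.0
    for t, delta in obs:  # delta = 1 for an observed death, 0 for censored
        if delta:
            ll += math.log(skew_normal_pdf(t, xi, omega, alpha))
        else:
            h = (upper - t) / steps
            s = sum(skew_normal_pdf(t + (k + 0.5) * h, xi, omega, alpha)
                    for k in range(steps)) * h
            ll += math.log(s)
    return ll
```

With alpha = 0 the model collapses to the Gaussian case, which gives a quick sanity check: a value censored at the mean contributes log(0.5).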

  10. Multiple Imputation of a Randomly Censored Covariate Improves Logistic Regression Analysis.

    PubMed

    Atem, Folefac D; Qian, Jing; Maye, Jacqueline E; Johnson, Keith A; Betensky, Rebecca A

    2016-01-01

    Randomly censored covariates arise frequently in epidemiologic studies. The most commonly used methods, including complete case analysis and single imputation or substitution, suffer from inefficiency and bias: they make strong parametric assumptions or consider only limit-of-detection censoring. We employ multiple imputation, in conjunction with semi-parametric modeling of the censored covariate, to overcome these shortcomings and to facilitate robust estimation. We develop a multiple imputation approach for randomly censored covariates within the framework of a logistic regression model. We use the non-parametric estimate of the covariate distribution or the semiparametric Cox model estimate in the presence of additional covariates in the model. We evaluate this procedure in simulations, and compare its operating characteristics to those from the complete case analysis and a survival regression approach. We apply the procedures to an Alzheimer's study of the association between amyloid positivity and maternal age of onset of dementia. Multiple imputation achieves lower standard errors and higher power than the complete case approach under heavy and moderate censoring and is comparable under light censoring. The survival regression approach achieves the highest power among all procedures, but does not produce interpretable estimates of association. Multiple imputation offers a favorable alternative to complete case analysis and ad hoc substitution methods in the presence of randomly censored covariates within the framework of logistic regression.
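
The imputation step can be caricatured nonparametrically: replace each censored covariate value with a draw from the observed values beyond it. This "risk set" draw is a crude stand-in for sampling from the nonparametric or Cox-based conditional distribution the paper uses; the subsequent step, fitting the logistic model on each completed dataset and pooling with Rubin's rules, is not shown:

```python
import random

def multiply_impute(cov, delta, m=5, seed=0):
    """cov[i]: covariate value or right-censoring point; delta[i]: 1 = observed.
    Returns m completed datasets. Simplified sketch, not the paper's
    semiparametric procedure: each censored value is replaced by a random
    observed value exceeding it (or kept as-is if none exceeds it)."""
    rng = random.Random(seed)
    observed = [x for x, d in zip(cov, delta) if d]
    datasets = []
    for _ in range(m):
        completed = []
        for x, d in zip(cov, delta):
            if d:
                completed.append(x)          # fully observed: keep
            else:
                pool = [o for o in observed if o > x]
                completed.append(rng.choice(pool) if pool else x)
        datasets.append(completed)
    return datasets
```

Drawing m > 1 completed datasets, rather than substituting a single value, is what lets the pooled analysis propagate the uncertainty about the censored covariate.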

  11. Clinical Courses of Graft Failure Caused by Chronic Allograft Dysfunction in Kidney Transplantation.

    PubMed

    Fujiwara, T; Teruta, S; Tsudaka, S; Ota, K; Matsuda, H

    Chronic allograft dysfunction (CAD) is a main cause of graft failure in kidney transplantation. We retrospectively analyzed 279 kidney transplant recipients who survived with a functioning graft for at least 2 years. CAD was defined as chronic graft deterioration, excluding other specific causes. We defined the patterns of decline in estimated glomerular filtration rate (eGFR) as follows: (1) a "plateau" was a decline in eGFR ≤2 mL/min/1.73 m²/year, with "long plateaus" lasting more than 5 years; (2) a "rapid decline" was a decrease in eGFR ≥20 mL/min/1.73 m²/year. Patients diagnosed with CAD were categorized according to the occurrence of rapid decline and/or long plateau as follows: group 1, neither rapid decline nor long plateau; group 2, rapid decline only; group 3, long plateau only; and group 4, both rapid decline and long plateau. Of a total of 81 graft losses, 51 (63%) failed because of CAD, with a median time to graft failure of 9.4 years. Sixteen patients belonged to group 1, 14 to group 2, 12 to group 3, and 9 to group 4. Mean graft survival times in the four groups were 7.7 ± 1.1, 6.1 ± 3.1, 16.2 ± 2.5, and 10.8 ± 3.6 years, respectively (P < .001). There were significant differences among groups in donor age, year of transplantation, mean eGFR at baseline, and acute rejection rate after transplantation. The results indicate that this cohort of kidney transplant recipients with CAD comprised subgroups with different clinical courses. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Quantifying and estimating the predictive accuracy for censored time-to-event data with competing risks.

    PubMed

    Wu, Cai; Li, Liang

    2018-05-15

    This paper focuses on quantifying and estimating the predictive accuracy of prognostic models for time-to-event outcomes with competing events. We consider time-dependent discrimination and calibration metrics, including the receiver operating characteristic curve and the Brier score, in the context of competing risks. To address censoring, we propose a unified nonparametric estimation framework for both discrimination and calibration measures, by weighting the censored subjects with the conditional probability of the event of interest given the observed data. The proposed method can be extended to time-dependent predictive accuracy metrics constructed from a general class of loss functions. We apply the methodology to a data set from the African American Study of Kidney Disease and Hypertension to evaluate the predictive accuracy of a prognostic risk score in predicting end-stage renal disease, accounting for the competing risk of pre-end-stage renal disease death, and evaluate its numerical performance in extensive simulation studies. Copyright © 2018 John Wiley & Sons, Ltd.
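
The censoring-weighting idea can be sketched for the single-event case; the competing-risks bookkeeping is omitted, and the weights come from a Kaplan-Meier estimate of the censoring distribution, an illustrative simplification of the paper's conditional-probability weighting:

```python
def km_step(times, flags):
    """Kaplan-Meier step function for the distribution whose events are flags==1."""
    data = sorted(zip(times, flags))
    n = len(data)
    at_risk, surv, curve, i = n, 1.0, [], 0
    while i < n:
        t = data[i][0]
        d = c = 0
        while i < n and data[i][0] == t:
            d += data[i][1]
            c += 1 - data[i][1]
            i += 1
        if d:
            surv *= 1 - d / at_risk
            curve.append((t, surv))
        at_risk -= d + c
    return curve

def step_eval(curve, t):
    """Evaluate a right-continuous step function stored as [(t, value)]."""
    s = 1.0
    for u, v in curve:
        if u <= t:
            s = v
        else:
            break
    return s

def ipcw_brier(times, events, preds, t0):
    """Brier score at horizon t0. events: 1 = event, 0 = censored.
    preds[i]: predicted P(T_i <= t0). Subjects censored before t0 get
    weight 0; the rest are reweighted by the censoring survival G."""
    G = km_step(times, [1 - e for e in events])  # KM of the censoring times
    total = 0.0
    for T, e, p in zip(times, events, preds):
        if T <= t0 and e:
            total += (1 - p) ** 2 / max(step_eval(G, T), 1e-12)
        elif T > t0:
            total += p ** 2 / max(step_eval(G, t0), 1e-12)
    return total / len(times)
```

With no censoring the weights are all 1 and the formula reduces to the ordinary Brier score; lower values indicate better calibrated predictions at the chosen horizon.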

  13. Revisiting Traditional Risk Factors for Rejection and Graft Loss after Kidney Transplantation

    PubMed Central

    Dunn, TB; Noreen, H; Gillingham, K; Maurer, D; Ozturk, O. Goruroglu; Pruett, TL; Bray, RA; Gebel, HM; Matas, AJ

    2011-01-01

    Single antigen bead (SAB) testing permits reassessment of immunologic risk for kidney transplantation. Traditionally, high panel reactive antibody (PRA), retransplant, and deceased donor (DD) grafts have been associated with increased risk. We hypothesized that this risk was likely mediated by (unrecognized) donor-specific antibody (DSA). We grouped 587 kidney transplants using clinical history and SAB testing of day-of-transplant serum as 1) unsensitized, PRA=0 (n=178); 2) third-party sensitized, no DSA (n=363); or 3) donor sensitized, with DSA (n=46), and studied rejection rates, death-censored graft survival (DCGS), and risk factors for rejection. Antibody-mediated rejection (AMR) rates were increased with DSA (p<0.0001), but not with PRA in the absence of DSA. Cell-mediated rejection (CMR) rates were increased with DSA (p<0.005), with a trend toward increased rates when PRA>0 in the absence of DSA (p=0.08). Multivariate analyses showed that risk factors for AMR were DSA, worse HLA matching, and female gender; for CMR: DSA, PRA>0, and worse HLA matching. AMR and CMR were associated with decreased DCGS. The presence of DSA is an important predictor of rejection risk, in contrast to traditional risk factors. Further development of immunosuppressive protocols will be facilitated by stratification of rejection risk by donor sensitization. PMID:21812918

  14. Corneal graft reversal: Histopathologic report of two cases

    PubMed Central

    Qahtani, Abdullah A.; Alkatan, Hind M.

    2014-01-01

    Graft reversal is a rare cause of failed penetrating keratoplasty (PKP). In this report we present two cases of graft failure in which the corneal grafts were unintentionally reversed; the onset of signs of graft failure, however, was variable. We include the clinical course and the histopathologic findings of the removed corneal grafts. A total of 6 cases, including ours, have been reported so far. The aim of this report is to draw corneal surgeons' attention to donor graft reversal as an additional rare cause of failed PKP. PMID:25473355

  16. A critical analysis of early death after adult liver transplants.

    PubMed

    Rana, Abbas; Kaplan, Bruce; Jie, Tun; Porubsky, Marian; Habib, Shahid; Rilo, Horacio; Gruessner, Angelika C; Gruessner, Rainer W G

    2013-01-01

    The 15% mortality rate of liver transplant recipients at one year may be viewed as a feat in comparison with waiting-list mortality, yet it nonetheless leaves room for much improvement. Our aim was to critically examine the mortality rates to identify high-risk periods and to incorporate cause of death into the analysis of post-transplant survival. We performed a retrospective analysis of United Network for Organ Sharing data for all adult recipients of liver transplants from January 1, 2002 to October 31, 2011. Our analysis included multivariate logistic regression, with patient death among the 49,288 recipients as the primary outcome measure. The highest mortality rate by day post-transplant was on day 0 (0.9%). The most significant risk factors were as follows: for one-day mortality from technical failure, intensive care unit admission (odds ratio [OR] 3.2); for one-day mortality from graft failure, warm ischemia >75 min (OR 5.6); for one-month mortality from infection, a previous transplant (OR 3.3); and for one-month mortality from graft failure, a previous transplant (OR 3.7). We found that the highest mortality rates after liver transplantation occur within the first day and the first month post-transplant. Those two high-risk periods have common, as well as different, risk factors for mortality. © 2013 John Wiley & Sons A/S.

  17. Failure to Rescue Rates After Coronary Artery Bypass Grafting: An Analysis From The Society of Thoracic Surgeons Adult Cardiac Surgery Database.

    PubMed

    Edwards, Fred H; Ferraris, Victor A; Kurlansky, Paul A; Lobdell, Kevin W; He, Xia; O'Brien, Sean M; Furnary, Anthony P; Rankin, J Scott; Vassileva, Christina M; Fazzalari, Frank L; Magee, Mitchell J; Badhwar, Vinay; Xian, Ying; Jacobs, Jeffrey P; Wyler von Ballmoos, Moritz C; Shahian, David M

    2016-08-01

    Failure to rescue (FTR) is increasingly recognized as an important quality indicator in surgery. The Society of Thoracic Surgeons National Database was used to develop FTR metrics and a predictive FTR model for coronary artery bypass grafting (CABG). The study included 604,154 patients undergoing isolated CABG at 1,105 centers from January 2010 to January 2014. FTR was defined as death after four complications: stroke, renal failure, reoperation, and prolonged ventilation. FTR was determined for each complication and a composite of the four complications. A statistical model to predict FTR was developed. FTR rates were 22.3% for renal failure, 16.4% for stroke, 12.4% for reoperation, 12.1% for prolonged ventilation, and 10.5% for the composite. Mortality increased with multiple complications and with specific combinations of complications. The multivariate risk model for prediction of FTR demonstrated a C index of 0.792 and was well calibrated, with a 1.0% average difference between observed/expected (O/E) FTR rates. With centers grouped into mortality terciles, complication rates increased modestly (11.4% to 15.7%), but FTR rates more than doubled (6.8% to 13.9%) from the lowest to highest terciles. Centers in the lowest complication rate tercile had an FTR O/E of 1.14, whereas centers in the highest complication rate tercile had an FTR O/E of 0.91. CABG mortality rates vary directly with FTR, but complication rates have little relation to death. FTR rates derived from The Society of Thoracic Surgeons data can serve as national benchmarks. Predicted FTR rates may facilitate patient counseling, and FTR O/E ratios have promise as valuable quality metrics. Copyright © 2016 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  18. Variable selection in a flexible parametric mixture cure model with interval-censored data.

    PubMed

    Scolas, Sylvie; El Ghouch, Anouar; Legrand, Catherine; Oulhaj, Abderrahim

    2016-03-30

    In standard survival analysis, it is generally assumed that every individual will someday experience the event of interest. However, this is not always the case, as some individuals may not be susceptible to this event. Also, in medical studies, it is frequent that patients come to scheduled interviews and that the time to the event is only known to occur between two visits. That is, the data are interval-censored with a cure fraction. Variable selection in such a setting is of particular interest. Covariates impacting survival are not necessarily the same as those impacting the probability of experiencing the event. The objective of this paper is to develop a parametric but flexible statistical model to analyze data that are interval-censored and include a fraction of cured individuals when the number of potential covariates may be large. We use the parametric mixture cure model with an accelerated failure time regression model for the survival, along with the extended generalized gamma distribution for the error term. To overcome the issue of non-stable and non-continuous variable selection procedures, we extend the adaptive LASSO to our model. By means of simulation studies, we show good performance of our method and discuss the behavior of the estimates with varying cure and censoring proportions. Lastly, our proposed method is illustrated with a real dataset studying the time until conversion to mild cognitive impairment, a possible precursor of Alzheimer's disease. © 2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
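
The population survival under a mixture cure model is S_pop(t) = π + (1 − π)·S_u(t), where π is the cured fraction. A minimal sketch, with a log-normal AFT survival standing in for the extended-generalized-gamma error distribution used in the paper (an illustrative simplification):

```python
import math

def mixture_cure_survival(t, pi, mu, sigma):
    """pi: cure fraction (never experiences the event); susceptibles follow
    a log-normal AFT model with location mu and scale sigma (an assumed
    special case standing in for the extended generalized gamma)."""
    z = (math.log(t) - mu) / sigma
    s_u = 0.5 * (1 - math.erf(z / math.sqrt(2)))  # P(T > t | susceptible)
    return pi + (1 - pi) * s_u
```

As t grows, S_pop(t) flattens at π rather than dropping to zero; that plateau is exactly what a cure model captures and a standard survival model cannot.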

  19. Linear Regression with a Randomly Censored Covariate: Application to an Alzheimer's Study.

    PubMed

    Atem, Folefac D; Qian, Jing; Maye, Jacqueline E; Johnson, Keith A; Betensky, Rebecca A

    2017-01-01

    The association between maternal age of onset of dementia and amyloid deposition (measured by in vivo positron emission tomography (PET) imaging) in cognitively normal older offspring is of interest. In a regression model for amyloid, special methods are required due to the random right censoring of the covariate of maternal age of onset of dementia. Prior literature has proposed methods to address the problem of censoring due to assay limit of detection, but not random censoring. We propose imputation methods and a survival regression method that do not require parametric assumptions about the distribution of the censored covariate. Existing imputation methods address missing covariates, but not right censored covariates. In simulation studies, we compare these methods to the simple, but inefficient complete case analysis, and to thresholding approaches. We apply the methods to the Alzheimer's study.

  20. Donor Hemodynamics as a Predictor of Outcomes After Kidney Transplantation From Donors After Cardiac Death.

    PubMed

    Allen, M B; Billig, E; Reese, P P; Shults, J; Hasz, R; West, S; Abt, P L

    2016-01-01

    Donation after cardiac death is an important source of transplantable organs, but evidence suggests donor warm ischemia contributes to inferior outcomes. Attempts to predict recipient outcome using donor hemodynamic measurements have not yielded statistically significant results. We evaluated novel measures of donor hemodynamics as predictors of delayed graft function and graft failure in a cohort of 1050 kidneys from 566 donors. Hemodynamics were described using regression line slopes, areas under the curve, and time beyond thresholds for systolic blood pressure, oxygen saturation, and shock index (heart rate divided by systolic blood pressure). A logistic generalized estimating equation (GEE) model showed that the area under the curve for systolic blood pressure was predictive of delayed graft function (above median: odds ratio 1.42, 95% confidence interval [CI] 1.06-1.90). Multivariable Cox regression demonstrated that the slope of oxygen saturation during the first 10 minutes after extubation was associated with graft failure (below median: hazard ratio 1.30, 95% CI 1.03-1.64), with 5-year graft survival of 70.0% (95% CI 64.5%-74.8%) for donors above the median versus 61.4% (95% CI 55.5%-66.7%) for those below the median. Among older donors, an increased shock index slope was associated with an increased hazard of graft failure. Validation of these findings is necessary to determine the utility of characterizing donor warm ischemia to predict recipient outcome. © Copyright 2015 The American Society of Transplantation and the American Society of Transplant Surgeons.
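
The three hemodynamic summaries named above (regression-line slope, area under the curve, time beyond a threshold) are easy to make concrete. A sketch over hypothetical vital-sign samples, with time-beyond-threshold counted over whole sampling intervals for simplicity (no interpolation at the crossing):

```python
def lsq_slope(ts, ys):
    """Least-squares slope of value on time, e.g. SpO2 decline after extubation."""
    n = len(ts)
    mt, my = sum(ts) / n, sum(ys) / n
    return (sum((t - mt) * (y - my) for t, y in zip(ts, ys))
            / sum((t - mt) ** 2 for t in ts))

def trapezoid_auc(ts, ys):
    """Trapezoidal area under the curve, e.g. for systolic blood pressure."""
    return sum((ys[i] + ys[i + 1]) / 2 * (ts[i + 1] - ts[i])
               for i in range(len(ts) - 1))

def time_below(ts, ys, threshold):
    """Total time in sampling intervals whose endpoints are both below the
    threshold (a simplification of 'time beyond threshold')."""
    return sum(ts[i + 1] - ts[i] for i in range(len(ts) - 1)
               if ys[i] < threshold and ys[i + 1] < threshold)
```

For a systolic pressure series sampled each minute at [100, 90, 80, 70], the slope is −10 per minute, the AUC is 255, and one whole minute is spent below a threshold of 85.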

  1. Cell Death and Heart Failure in Obesity: Role of Uncoupling Proteins

    PubMed Central

    Ruiz-Ramírez, Angélica; López-Acosta, Ocarol; Barrios-Maya, Miguel Angel

    2016-01-01

    Metabolic diseases such as obesity, metabolic syndrome, and type II diabetes are often characterized by increased reactive oxygen species (ROS) generation in mitochondrial respiratory complexes, associated with fat accumulation in cardiomyocytes, skeletal muscle, and hepatocytes. Several rodent studies showed that lipid accumulation in cardiac myocytes produces lipotoxicity that causes apoptosis and leads to heart failure, a dynamic pathological process. Meanwhile, several tissues including cardiac tissue develop an adaptive mechanism against oxidative stress and lipotoxicity by overexpressing uncoupling proteins (UCPs), specific mitochondrial membrane proteins. In hearts from rodents and humans with obesity, UCP2 and UCP3 may protect cardiomyocytes from death and from progression to heart failure by downregulating programmed cell death. UCP activation may affect cytochrome c and proapoptotic protein release from mitochondria by reducing ROS generation and apoptotic cell death. Therefore, the aim of this review is to discuss recent findings regarding the role that UCPs play in cardiomyocyte survival by protecting against ROS generation and maintaining bioenergetic metabolism homeostasis to promote heart protection. PMID:27642497

  2. The effect of donor diabetes history on graft failure and endothelial cell density 10 years after penetrating keratoplasty.

    PubMed

    Lass, Jonathan H; Riddlesworth, Tonya D; Gal, Robin L; Kollman, Craig; Benetz, Beth A; Price, Francis W; Sugar, Alan; Terry, Mark A; Soper, Mark; Beck, Roy W

    2015-03-01

    To examine the long-term effect of donor diabetes history on graft failure and endothelial cell density (ECD) after penetrating keratoplasty (PK) in the Cornea Donor Study. Multicenter, prospective, double-masked, controlled clinical trial. One thousand ninety subjects undergoing PK for a moderate risk condition, principally Fuchs' dystrophy or pseudophakic or aphakic corneal edema, were enrolled by 105 surgeons from 80 clinical sites in the United States. Corneas from donors 12 to 75 years of age were assigned by 43 eye banks to participants without respect to recipient factors. Donor and recipient diabetes status was determined from existing medical records. Images of the central endothelium were obtained before surgery (baseline) and at intervals for 10 years after surgery and were analyzed by a central image analysis reading center to determine ECD. Time to graft failure (regraft or cloudy cornea for 3 consecutive months) and ECD. There was no statistically significant association of donor diabetes history with 10-year graft failure, baseline ECD, 10-year ECD, or ECD values longitudinally over time in unadjusted analyses, nor after adjusting for donor age and other significant covariates. The 10-year graft failure rate was 23% in the 199 patients receiving a cornea from a donor with diabetes versus 26% in the 891 patients receiving a cornea from a donor without diabetes (95% confidence interval for the difference, -10% to 6%; unadjusted P=0.60). Baseline ECD (P=0.71), 10-year ECD (P>0.99), and changes in ECD over 10 years (P=0.86) were similar comparing donor groups with and without diabetes. The study results do not suggest an association between donor diabetes and PK outcome. However, the assessment of donor diabetes was imprecise and based on historical data only. The increasing frequency of diabetes in the aging population in the United States affects the donor pool. 
Thus, the impact of donor diabetes on long-term endothelial health after PK or endothelial

  3. The Effect of Donor Diabetes History on Graft Failure and Endothelial Cell Density Ten Years after Penetrating Keratoplasty

    PubMed Central

    Lass, Jonathan H.; Riddlesworth, Tonya D.; Gal, Robin L.; Kollman, Craig; Benetz, Beth A.; Price, Francis W.; Sugar, Alan; Terry, Mark A.; Soper, Mark; Beck, Roy W.

    2014-01-01

    Objective: To examine the long-term effect of donor diabetes history on graft failure and endothelial cell density (ECD) after penetrating keratoplasty (PKP) in the Cornea Donor Study. Design: Multi-center, prospective, double-masked, controlled clinical trial. Participants: 1090 subjects undergoing PKP for a moderate risk condition, principally Fuchs’ dystrophy or pseudophakic/aphakic corneal edema (PACE), were enrolled by 105 surgeons from 80 clinical sites in the United States. Methods: Corneas from donors 12 to 75 years old were assigned by 43 eye banks to participants without respect to recipient factors. Donor and recipient diabetes status was determined from existing medical records. Images of the central endothelium were obtained preoperatively (baseline) and at intervals for ten years postoperatively and analyzed by a central image analysis reading center to determine ECD. Main Outcome Measure(s): Time to graft failure (regraft or cloudy cornea for 3 consecutive months) and ECD. Results: There was no statistically significant association of donor diabetes history with 10-year graft failure, baseline ECD, 10-year ECD, or ECD values longitudinally over time in unadjusted analyses, nor after adjusting for donor age and other significant covariates. The 10-year graft failure rate was 23% in the 199 cases receiving a cornea from a donor with diabetes versus 26% in the 891 cases receiving a cornea from a donor without diabetes (95% confidence interval for the difference: −10% to +6%; unadjusted p = 0.60). Baseline ECD (p=0.71), 10-year ECD (p>0.99), and changes in ECD over 10 years (p=0.86) were similar comparing donor diabetes and no-diabetes groups. Conclusions and Relevance: The study results do not suggest an association between donor diabetes and PKP outcome. However, the assessment of donor diabetes was imprecise and based on historical data only. The increasing frequency of diabetes in the aging population in the United States affects the donor pool, thus the

  4. Censored Hurdle Negative Binomial Regression (Case Study: Neonatorum Tetanus Case in Indonesia)

    NASA Astrophysics Data System (ADS)

    Yuli Rusdiana, Riza; Zain, Ismaini; Wulan Purnami, Santi

    2017-06-01

    Hurdle negative binomial regression is a method that can be used for a discrete dependent variable with excess zeros and under- or overdispersion. It uses a two-part approach: the first part, the zero hurdle model, models the zero elements of the dependent variable, and the second part, the truncated negative binomial model, models the nonzero elements (positive integers). The discrete dependent variable in such cases is censored for some values. The type of censoring studied in this research is right censoring. This study aims to obtain the parameter estimator of hurdle negative binomial regression for a right-censored dependent variable. Parameters are estimated by maximum likelihood estimation (MLE). Hurdle negative binomial regression for a right-censored dependent variable is applied to the number of neonatorum tetanus cases in Indonesia. The data are count data containing zero values for some observations and a variety of positive values for others. This study also aims to obtain the parameter estimator and test statistic of the censored hurdle negative binomial model. Based on the regression results, the factors that influence neonatorum tetanus cases in Indonesia are the percentage of baby health care coverage and neonatal visits.
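
    As a rough illustration of the two-part likelihood (a minimal sketch with invented parameters, not the authors' estimation code): zeros are handled by the hurdle part, positive counts by a zero-truncated negative binomial, and a right-censored observation contributes an upper-tail probability.

```python
import math

# Hypothetical sketch of a hurdle negative binomial likelihood with right
# censoring at count c. pi0 is the probability of a zero, r and p are the
# negative binomial parameters; all values below are invented.
def nb_pmf(y, r, p):
    """Negative binomial pmf: y failures before r successes, success prob p."""
    return math.comb(y + r - 1, y) * (p ** r) * ((1 - p) ** y)

def hurdle_loglik(data, pi0, r, p, c):
    """data: list of (count, censored) pairs; censored observations sit at c."""
    ll = 0.0
    trunc = 1.0 - nb_pmf(0, r, p)          # P(Y > 0) under the NB part
    for y, censored in data:
        if y == 0:
            ll += math.log(pi0)            # hurdle not crossed
        elif censored:
            tail = 1.0 - sum(nb_pmf(k, r, p) for k in range(c))
            ll += math.log(1 - pi0) + math.log(tail / trunc)  # P(Y >= c | Y > 0)
        else:
            ll += math.log(1 - pi0) + math.log(nb_pmf(y, r, p) / trunc)
    return ll

# Hypothetical counts; (5, True) is right-censored at c = 5.
obs = [(0, False), (1, False), (2, False), (5, True)]
print(hurdle_loglik(obs, pi0=0.4, r=2, p=0.5, c=5))
```

    In a full MLE fit, this log-likelihood would be maximized over pi0, r, and p (with regression covariates entering through link functions).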

  5. Graft survival of diabetic versus nondiabetic donor tissue after initial keratoplasty.

    PubMed

    Vislisel, Jesse M; Liaboe, Chase A; Wagoner, Michael D; Goins, Kenneth M; Sutphin, John E; Schmidt, Gregory A; Zimmerman, M Bridget; Greiner, Mark A

    2015-04-01

    To compare corneal graft survival using tissue from diabetic and nondiabetic donors in patients undergoing initial Descemet stripping automated endothelial keratoplasty (DSAEK) or penetrating keratoplasty (PKP). A retrospective chart review of pseudophakic eyes that underwent DSAEK or PKP was performed. The primary outcome measure was graft failure. Cox proportional hazards regression and Kaplan-Meier survival analyses were used to compare diabetic versus nondiabetic donor tissue for all keratoplasty cases. A total of 183 eyes (136 DSAEK, 47 PKP) were included in the statistical analysis. Among 24 procedures performed using diabetic donor tissue, there were 4 cases (16.7%) of graft failure (3 DSAEK, 1 PKP), and among 159 procedures performed using nondiabetic donor tissue, there were 18 cases (11.3%) of graft failure (12 DSAEK, 6 PKP). The Cox proportional hazards ratio for graft failure in all cases, comparing diabetic with nondiabetic donor tissue, was 1.69, but this difference was not statistically significant (95% confidence interval, 0.56-5.06; P = 0.348). There were no significant differences in Kaplan-Meier curves comparing diabetic with nondiabetic donor tissue for all cases (P = 0.380). Statistical analysis of graft failure by donor diabetes status within each procedure type was not possible because of the small number of graft failure events involving diabetic tissue. We found similar rates of graft failure in all keratoplasty cases when comparing tissue from diabetic and nondiabetic donors, but further investigation is needed to determine whether diabetic donor tissue results in different graft failure rates after DSAEK compared with PKP.

  6. The effects of body mass index on graft survival in adult recipients transplanted with single pediatric kidneys.

    PubMed

    Balamuthusamy, Saravanan; Paramesh, Anil; Zhang, Rubin; Florman, Sander; Shenava, Rajesh; Islam, Tareq; Wagner, Janis; Killackey, Mary; Alper, Brent; Simon, Eric E; Slakey, Douglas

    2009-01-01

    There is insufficient data on the impact of recipient body mass index (BMI) on the long-term graft survival of adult patients transplanted with single pediatric kidneys. We performed a retrospective analysis of adult patients transplanted with single pediatric kidneys at our center. The recipients were classified into 2 groups: group 1 (BMI > or =30) and group 2 (BMI <30). Donor/recipient demographics, postoperative outcomes and survival rates were compared between the 2 groups. There was no significant difference in donor/recipient demographics between the 2 groups. In group 1, the death-censored graft survival (DCGS) at 1, 3 and 5 years was 90% at all 3 time points, and in group 2 it was 86, 68 and 60%, respectively (p = 0.05). The mean glomerular filtration rate (with standard deviation in parentheses) at 1, 3 and 5 years was, respectively, 55 (15), 59 (19) and 55 (28) ml/min for group 1, compared to 65 (28), 69 (23) and 67 (20) ml/min in group 2 (p = NS). Multivariate analysis revealed a hazard ratio of 5.12 (95% confidence interval 1.06-24.7; p = 0.04) for graft loss in nonobese patients when compared to obese patients. Obese patients had an increased risk for acute rejections within the first month of transplant (p = 0.02). Patients with a BMI > or =30 transplanted with single pediatric kidneys have better DCGS rates when compared to nonobese patients. Copyright (c) 2008 S. Karger AG, Basel.

  7. [Kidney transplantation: consecutive one thousand transplants at National Institute of Medical Sciences and Nutrition Salvador Zubirán in Mexico City].

    PubMed

    Marino-Vazquez, Lluvia Aurora; Sánchez-Ugarte, Regina; Morales-Buenrostro, Luis Eduardo

    2011-09-01

    The National Institute of Medical Sciences and Nutrition Salvador Zubirán (INCMNSZ) is a specialty hospital for adults and a teaching hospital, which performed its first kidney transplant in 1967; in 1971 it began a formal renal transplantation program. Recently, the 1000th kidney transplant was performed, so this article presents information on these thousand kidney transplants, with special emphasis on survival. This retrospective cohort study included 1000 consecutive transplants performed at the INCMNSZ between 1967 and June 2011. It describes the general characteristics of kidney transplant recipients, transplant-related variables, initial immunosuppression, and complications. Descriptive statistics were used. Survival analysis was performed using the Kaplan-Meier method, reporting patient survival, graft survival censored for death with a functioning graft, and total (uncensored) graft survival. Patient survival at 1, 3, 5, 10, 15, and 20 years was 94.9, 89.6, 86.8, 76.9, 66.1, and 62.2%, respectively. Graft survival censored for death with a functioning graft at 1, 3, 5, 10, 15, and 20 years was 93.1, 87.1, 83.5, 73.9, 62.7, and 52.5%, respectively. Risk factors associated with poorer graft survival were younger recipient age, transplantation during the first period (1967-1983), and HLA mismatch. Patient and graft survival have improved over time through the use of better immunosuppression and induction therapy. Identification of the risk factors affecting graft survival allows each center to set strategies to improve patient outcomes.
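
    Death-censored graft survival curves of this kind are Kaplan-Meier estimates in which death with a functioning graft is treated as censoring rather than as graft failure. A minimal sketch with invented follow-up times (not the INCMNSZ data):

```python
# Minimal Kaplan-Meier sketch for death-censored graft survival.
# All follow-up data below are hypothetical.
def kaplan_meier(times, events):
    """times: follow-up years; events: 1 = graft failure, 0 = censored
    (including death with a functioning graft). Returns (time, S(t)) steps."""
    s, curve = 1.0, []
    for t in sorted(set(times)):
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        n = sum(1 for ti in times if ti >= t)   # number still at risk at t
        if d > 0:
            s *= 1 - d / n                      # multiply survival by (1 - d/n)
            curve.append((t, s))
    return curve

# Hypothetical cohort: failures at 1 and 3 years, censoring at 2 and 5 years.
times = [1, 2, 3, 5, 5]
events = [1, 0, 1, 0, 0]
print(kaplan_meier(times, events))  # [(1, 0.8), (3, 0.533...)]
```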

  8. Thermal Cycling Life Prediction of Sn-3.0Ag-0.5Cu Solder Joint Using Type-I Censored Data

    PubMed Central

    Mi, Jinhua; Yang, Yuan-Jian; Huang, Hong-Zhong

    2014-01-01

    Because solder joint interconnections are the weak points of microelectronic packaging, their reliability has a great influence on the reliability of the entire packaging structure. Based on an accelerated life test, the reliability assessment and life prediction of lead-free solder joints using the Weibull distribution are investigated. The type-I interval-censored lifetime data were collected from a thermal cycling test implemented on microelectronic packaging with lead-free ball grid array (BGA) and fine-pitch ball grid array (FBGA) interconnection structures. The number of cycles to failure of lead-free solder joints is predicted by using a modified Engelmaier fatigue life model and a type-I censored data processing method. Then, the Pan model is employed to calculate the acceleration factor of this test. A comparison of life predictions between the proposed method and those calculated directly by Matlab and Minitab is conducted to demonstrate the practicability and effectiveness of the proposed method. Finally, failure analysis and microstructure evolution of lead-free solders are carried out to provide useful guidance for regular maintenance, replacement of substructures, and subsequent processing of electronic products. PMID:25121138
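
    To give a flavor of type-I censoring (a test terminated at a fixed number of cycles), here is a hypothetical Weibull likelihood sketch with invented cycle counts. It is not the paper's modified Engelmaier/Pan procedure, only the underlying censored-data likelihood idea: failed units contribute the density, surviving units contribute the survivor function at test end.

```python
import math

# Hypothetical type-I censored Weibull log-likelihood: units failing at cycle
# t contribute log f(t); units intact when the test stops at t_end contribute
# log S(t_end) = -(t_end/scale)^shape. All data below are invented.
def weibull_loglik(failures, n_survivors, t_end, shape, scale):
    ll = 0.0
    for t in failures:
        z = (t / scale) ** shape
        ll += math.log(shape / scale) + (shape - 1) * math.log(t / scale) - z
    ll += n_survivors * -((t_end / scale) ** shape)
    return ll

# Hypothetical thermal-cycling data: five failures, 7 joints still intact
# when the test ended at 3000 cycles. Crude grid search for the MLE.
failures = [1450, 1820, 2100, 2400, 2750]
best = max(((b, a) for b in range(10, 60) for a in range(1500, 4000, 25)),
           key=lambda p: weibull_loglik(failures, 7, 3000, p[0] / 10, p[1]))
print("shape =", best[0] / 10, "scale =", best[1])
```

    A production fit would use a proper optimizer rather than a grid, but the likelihood itself is the part the censoring changes.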

  9. Is Support of Censoring Controversial Media Content for the Good of Others? Sexual Strategies and Support of Censoring Pro-Alcohol Advertising.

    PubMed

    Zhang, Jinguang

    2017-01-01

    At least in the United States, there are widespread concerns about advertising that encourages alcohol consumption, and previous research explains those concerns as aiming to protect others from the harm of excessive alcohol use. Drawing on sexual strategies theory, we hypothesized that support of censoring pro-alcohol advertising is ultimately self-benefiting regardless of its altruistic effect at a proximate level. Excessive drinking positively correlates with having casual sex, and casual sex threatens monogamy, one of the major means by which people adopting a long-term sexual strategy increase their inclusive fitness. One way for long-term strategists to protect monogamy, and thus their reproductive interests, is to support censoring pro-alcohol advertising, thereby preventing others from becoming excessive drinkers (and consequently having casual sex) under media influence. Supporting this hypothesis, three studies consistently showed that restricted sociosexuality positively correlated with support of censoring pro-alcohol advertising before and after various value-, ideological-, and moral-foundation variables were controlled for. Also as predicted, Study 3 revealed a significant indirect effect of sociosexuality on censorship support through perceived media influence on others but not through perceived media influence on self. These findings further supported a self-interest analysis of issue opinions, extended third-person-effect research on support of censoring pro-alcohol advertising, and suggested a novel approach to analyzing media censorship support.

  10. SEMIPARAMETRIC EFFICIENT ESTIMATION FOR SHARED-FRAILTY MODELS WITH DOUBLY-CENSORED CLUSTERED DATA

    PubMed Central

    Wang, Jane-Ling

    2018-01-01

    In this paper, we investigate frailty models for clustered survival data that are subject to both left- and right-censoring, termed “doubly-censored data”. This model extends the current survival literature by broadening the application of frailty models from right-censoring to a more complicated situation with additional left censoring. Our approach is motivated by a recent Hepatitis B study where the sample consists of families. We adopt a likelihood approach that aims at the nonparametric maximum likelihood estimators (NPMLE). A new algorithm is proposed, which not only works well for clustered data but also improves on the existing algorithm for independent doubly-censored data, a special case in which the frailty variable is a constant equal to one. This special case is well known to be a computational challenge due to the left-censoring feature of the data. The new algorithm not only resolves this challenge but also accommodates the additional frailty variable effectively. Asymptotic properties of the NPMLE are established, along with semiparametric efficiency of the NPMLE for the finite-dimensional parameters. The consistency of bootstrap estimators for the standard errors of the NPMLE is also discussed. We conducted simulations to illustrate the numerical performance and robustness of the proposed algorithm, which is also applied to the Hepatitis B data. PMID:29527068

  11. Impact of donor obesity and donation after cardiac death on outcomes after kidney transplantation.

    PubMed

    Ortiz, Jorge; Gregg, Austin; Wen, Xuerong; Karipineni, Farah; Kayler, Liise K

    2012-01-01

    The effect of donor body mass index (BMI) and donor type on kidney transplant outcomes has not been well studied. Scientific Registry of Transplant Recipients data on recipients of deceased-donor kidneys between 1997 and 2010 were reviewed. Donors were categorized by DCD status (DCD, 6932; non-DCD, 90,158) and BMI groups at 5 kg/m^2 increments: 18.5-24.9, 25-29.9, 30-34.9, 35-39.9, 40-44.9, and ≥ 45 kg/m^2. The primary outcome, death-censored graft survival (DCGS), was adjusted for donor, recipient, and transplant characteristics. Among recipients of non-DCD kidneys, donor BMI was not associated with DCGS. Among DCD recipients, donor BMI was not associated with DCGS for donor BMI categories < 45 kg/m^2; however, donor BMI ≥ 45 kg/m^2 was independently associated with DCGS compared to BMI of 20-24.9 kg/m^2 (adjusted hazard ratio, 1.84; 95% CI, 1.23, 2.74). The adjusted odds of delayed graft function (DGF) were greater for each level of BMI above reference for both DCD and non-DCD groups. There was no association of donor BMI with one-year acute rejection for either type of donor. Although BMI is associated with DGF, long-term graft survival is not affected except in the combination of DCD with extreme donor BMI ≥ 45. © 2012 John Wiley & Sons A/S.

  12. Impact of donor age in liver transplantation from donation after circulatory death donors: A decade of experience at Cleveland Clinic.

    PubMed

    Firl, Daniel J; Hashimoto, Koji; O'Rourke, Colin; Diago-Uso, Teresa; Fujiki, Masato; Aucejo, Federico N; Quintini, Cristiano; Kelly, Dympna M; Miller, Charles M; Fung, John J; Eghtesad, Bijan

    2015-12-01

    The use of liver grafts from donation after circulatory death (DCD) donors remains controversial, particularly with donors of advanced age. This retrospective study investigated the impact of donor age in DCD liver transplantation. We examined 92 recipients who received DCD grafts and 92 matched recipients who received donation after brain death (DBD) grafts at Cleveland Clinic from January 2005 to June 2014. DCD grafts met stringent criteria to minimize risk factors in both donors and recipients. The 1-, 3-, and 5-year graft survival in DCD recipients was significantly inferior to that in DBD recipients (82%, 71%, 66% versus 92%, 87%, 85%, respectively; P = 0.03). Six DCD recipients (7%), but no DBD recipients, experienced ischemic-type biliary stricture (P = 0.01). However, the incidence of biliary stricture was not associated with donor age (P = 0.57). Interestingly, recipients receiving DCD grafts from donors who were <45 years of age (n = 55) showed similar graft survival rates compared to those receiving DCD grafts from donors who were ≥45 years of age (n = 37; 80%, 69%, 66% versus 83%, 72%, 66%, respectively; P = 0.67). Cox proportional hazards modeling in all study populations (n = 184) revealed advanced donor age (P = 0.05) and the use of a DCD graft (P = 0.03) as unfavorable factors for graft survival. Logistic regression analysis showed that the risk of DBD graft failure increased with increasing age, but the risk of DCD graft failure did not increase with increasing age (P = 0.13). In conclusion, these data suggest that stringent donor and recipient selection may ameliorate the negative impact of donor age in DCD liver transplantation. DCD grafts should not be discarded because of donor age, per se, and could help expand the donor pool for liver transplantation. © 2015 American Association for the Study of Liver Diseases.

  13. New Factors Predicting Delayed Graft Function: a Multi-Center Cohort Study of Kidney Donation After Brain Death Followed by Circulatory Death.

    PubMed

    Sun, Qipeng; Huang, Zhengyu; Zhou, Honglan; Lin, Minzhuan; Hua, Xuefeng; Hong, Liangqing; Na, Ning; Cai, Ruiming; Wang, Gang; Sun, Qiquan

    2018-05-30

    Delayed graft function (DGF) is a common complication following kidney transplantation, adversely affecting graft outcomes. Donation after brain death followed by circulatory death (DBCD), a novel donation pattern, is expected to correlate with a high incidence of DGF. However, little information is available about factors associated with DGF in DBCD. A total of 383 kidney transplants from DBCD donation in three institutions were enrolled. Associations of DGF with the clinical characteristics of recipients and donors were quantified. In this retrospective multi-center study, the incidence of DGF was 19.3%. A lower incidence of DGF was found in recipients for whom antithymocyte globulin was used for induction (p < 0.05), which was an independent protective factor against DGF (odds ratio [OR] = 0.48; 95% CI 0.27-0.86). Two novel explanatory variables were recognized as independent risk factors, namely use of vasoactive drugs (OR = 3.15; 95% CI 1.39-7.14) and cardiopulmonary resuscitation (OR = 2.51; 95% CI 1.05-6.00), which contributed significantly to increased risk of DGF (p < 0.05). Prolonged warm ischemia time (> 18 min; OR = 2.42; 95% CI 1.36-4.32) was also predictive of DGF in DBCD. A prediction model was developed and achieved an area under the curve of 0.89 in predicting DGF when combined with reported parameters. The novel factors, confirmed for the first time in our study, will help to improve risk prediction of DGF and to determine optimal interventions to prevent DGF in clinical practice. © 2018 The Author(s). Published by S. Karger AG, Basel.
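
    Odds ratios and confidence intervals of the kind reported above come from logistic-regression coefficients via exponentiation. A minimal sketch (the coefficient and standard error below are invented for illustration, not taken from the study):

```python
import math

# Hypothetical sketch: converting a logistic-regression coefficient b with
# standard error se into an odds ratio and its 95% confidence interval.
def odds_ratio(b, se):
    z = 1.959963984540054  # 97.5th percentile of the standard normal
    return math.exp(b), (math.exp(b - z * se), math.exp(b + z * se))

# Invented coefficient for a binary risk factor (e.g. warm ischemia > 18 min).
orr, (lo, hi) = odds_ratio(0.884, 0.295)
print(round(orr, 2), round(lo, 2), round(hi, 2))
```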

  14. Revisiting traditional risk factors for rejection and graft loss after kidney transplantation.

    PubMed

    Dunn, T B; Noreen, H; Gillingham, K; Maurer, D; Ozturk, O G; Pruett, T L; Bray, R A; Gebel, H M; Matas, A J

    2011-10-01

    Single-antigen bead (SAB) testing permits reassessment of immunologic risk for kidney transplantation. Traditionally, high panel reactive antibody (PRA), retransplant and deceased donor (DD) grafts have been associated with increased risk. We hypothesized that this risk was likely mediated by (unrecognized) donor-specific antibody (DSA). We grouped 587 kidney transplants using clinical history and single-antigen bead (SAB) testing of day of transplant serum as (1) unsensitized; PRA = 0 (n = 178), (2) third-party sensitized; no DSA (n = 363) or (3) donor sensitized; with DSA (n = 46), and studied rejection rates, death-censored graft survival (DCGS) and risk factors for rejection. Antibody-mediated rejection (AMR) rates were increased with DSA (p < 0.0001), but not with panel reactive antibody (PRA) in the absence of DSA. Cell-mediated rejection (CMR) rates were increased with DSA (p < 0.005); with a trend to increased rates when PRA>0 in the absence of DSA (p = 0.08). Multivariate analyses showed risk factors for AMR were DSA, worse HLA matching, and female gender; for CMR: DSA, PRA>0 and worse HLA matching. AMR and CMR were associated with decreased DCGS. The presence of DSA is an important predictor of rejection risk, in contrast to traditional risk factors. Further development of immunosuppressive protocols will be facilitated by stratification of rejection risk by donor sensitization. ©2011 The Authors Journal compilation©2011 The American Society of Transplantation and the American Society of Transplant Surgeons.

  15. Simulation of parametric model towards the fixed covariate of right censored lung cancer data

    NASA Astrophysics Data System (ADS)

    Afiqah Muhamad Jamil, Siti; Asrul Affendi Abdullah, M.; Kek, Sie Long; Ridwan Olaniran, Oyebayo; Enera Amran, Syahila

    2017-09-01

    In this study, a simulation procedure was applied to assess a fixed covariate in right-censored data using a parametric survival model. The scale and shape parameters were varied to differentiate the analysis of the parametric regression survival model. Bias, mean bias, and coverage probability were used as evaluation criteria in this analysis. Different sample sizes of 50, 100, 150, and 200 were employed to distinguish the impact of the parametric regression model on right-censored data. The R statistical software was used to implement the simulation with right-censored data. The final simulation model for right-censored data was then compared with right-censored lung cancer data from Malaysia. It was found that varying the shape and scale parameters across different sample sizes helps to improve the simulation strategy for right-censored data, and that the Weibull regression survival model provides a suitable fit for the survival data of lung cancer patients in Malaysia.
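
    A hypothetical sketch of this kind of simulation design (invented parameters; the study used R, Python is used here purely for illustration): draw Weibull survival times, impose administrative right censoring, and tabulate the censored fraction per sample size.

```python
import math
import random

# Hypothetical sketch: simulate right-censored Weibull survival times.
# shape, scale, and censor_time below are invented, not the study's settings.
def simulate(n, shape=1.5, scale=5.0, censor_time=6.0, seed=0):
    rng = random.Random(seed)
    data = []
    for _ in range(n):
        u = 1.0 - rng.random()                       # uniform in (0, 1]
        t = scale * (-math.log(u)) ** (1 / shape)    # inverse-CDF Weibull draw
        data.append((min(t, censor_time), t <= censor_time))  # (time, event)
    return data

# Censored count at each of the sample sizes used in the study.
for n in [50, 100, 150, 200]:
    data = simulate(n)
    censored = sum(1 for _, event in data if not event)
    print(n, "observations,", censored, "right-censored")
```

    In the full design, a parametric (e.g. Weibull) regression would be refitted to each simulated dataset and the bias and coverage of the covariate estimate tracked across replications.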

  16. The Cosmological Evolution of Radio Sources with CENSORS

    NASA Technical Reports Server (NTRS)

    Brookes, Mairi; Best, Philip; Peacock, John; Dunlop, James; Rottgering, Huub

    2006-01-01

    The CENSORS survey, selected from the NVSS, has been followed up using EIS, K-band imaging, and spectroscopic observations to produce a radio sample capable of probing the source density in the regime z > 2.5. With a current spectroscopic completeness of 62%, CENSORS has been used in direct modeling of RLF evolution and in V/V_max tests. There is evidence for a shallow decline in the number density of sources in the luminosity range 10^26-10^27 W Hz^-1 at 1.4 GHz.

  17. Censoring approach to the detection limits in X-ray fluorescence analysis

    NASA Astrophysics Data System (ADS)

    Pajek, M.; Kubala-Kukuś, A.

    2004-10-01

    We demonstrate that the effect of detection limits in X-ray fluorescence analysis (XRF), which limits the determination of very low concentrations of trace elements and results in the appearance of so-called "nondetects", can be accounted for using the statistical concept of censoring. More precisely, the results of such measurements can be viewed as left random censored data, which can further be analyzed using the Kaplan-Meier method, correcting the data for the presence of nondetects. Using this approach, the results of measured, detection-limit-censored concentrations can be interpreted in a nonparametric manner including the correction for the nondetects, i.e. the measurements in which the concentrations were found to be below the actual detection limits. Moreover, using the Monte Carlo simulation technique, we show that with the Kaplan-Meier approach the corrected mean concentrations for a population of samples can be estimated to within a few percent uncertainty with respect to the simulated, uncensored data. In practice, this means that the final uncertainties of the estimated mean values are limited by the number of studied samples and not by the correction procedure itself. The discussed random left-censoring approach was applied to analyze the XRF detection-limit-censored concentration measurements of trace elements in biomedical samples.
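
    The trick behind applying Kaplan-Meier to left-censored detection-limit data can be sketched as follows (hypothetical concentrations, not the paper's biomedical samples): reflecting values about a constant M converts nondetects into right-censored observations, so the standard estimator applies, and the restricted mean is flipped back afterwards.

```python
# Hypothetical sketch of the flipping approach for left-censored nondetects.
def km_mean_left_censored(values, detected, M=None):
    """values: measured concentration, or the detection limit for a nondetect;
    detected: True if the element was above the detection limit."""
    if M is None:
        M = max(values) + 1.0
    # Flip: nondetects (left-censored) become right-censored at M - limit.
    flipped = [(M - v, d) for v, d in zip(values, detected)]
    s, prev, mean = 1.0, 0.0, 0.0
    for t in sorted(set(ti for ti, _ in flipped)):
        n = sum(1 for ti, _ in flipped if ti >= t)           # at risk
        d = sum(1 for ti, di in flipped if ti == t and di)   # events
        mean += s * (t - prev)   # area under the flipped survival curve
        prev = t
        if d:
            s *= 1 - d / n
    return M - mean              # flip the restricted mean back

# Hypothetical trace-element data: two nondetects at detection limit 0.5.
conc = [0.5, 0.5, 1.2, 2.0, 3.1]
detected = [False, False, True, True, True]
print(round(km_mean_left_censored(conc, detected), 3))  # → 1.46
```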

  18. Utility in Treating Kidney Failure in End-Stage Liver Disease With Simultaneous Liver-Kidney Transplantation.

    PubMed

    Cheng, Xingxing S; Stedman, Margaret R; Chertow, Glenn M; Kim, W Ray; Tan, Jane C

    2017-05-01

    Simultaneous liver-kidney (SLK) transplantation plays an important role in treating kidney failure in patients with end-stage liver disease. It accounted for 5% of deceased donor kidneys transplanted in 2015. We evaluated the utility, defined as posttransplant kidney allograft lifespan, of this practice. Using data from the Scientific Registry of Transplant Recipients, we compared outcomes for all SLK transplants between January 1, 1995, and December 3, 2014, to their donor-matched kidneys used in kidney-alone (Ki) or simultaneous pancreas-kidney (SPK) transplants. The primary outcome was kidney allograft lifespan, defined as the time free from death or allograft failure. Secondary outcomes included death and death-censored allograft failure. We adjusted all analyses for donor, transplant, and recipient factors. The adjusted 10-year mean kidney allograft lifespan was higher in Ki/SPK compared with SLK transplants, by 0.99 years in the Model for End-stage Liver Disease era and 1.71 years in the pre-Model for End-stage Liver Disease era. Death was higher in SLK recipients relative to Ki/SPK recipients: 10-year cumulative incidences 0.36 (95% confidence interval 0.33-0.38) versus 0.19 (95% confidence interval 0.17-0.21). SLK transplantation exemplifies the trade-off between the principles of utility and medical urgency. With each SLK transplantation, about 1 year of allograft lifespan is traded so that sicker patients, that is, SLK transplant recipients, are afforded access to the organ. These data provide a basis against which benefits derived from urgency-based allocation can be measured.

  19. Utility in Treating Kidney Failure in End-Stage Liver Disease With Simultaneous Liver-Kidney Transplantation

    PubMed Central

    Cheng, Xingxing S.; Stedman, Margaret R.; Chertow, Glenn M.; Kim, W. Ray; Tan, Jane C.

    2017-01-01

    Background: Simultaneous liver-kidney (SLK) transplantation plays an important role in treating kidney failure in patients with end-stage liver disease. It accounted for 5% of deceased donor kidneys transplanted in 2015. We evaluated the utility, defined as posttransplant kidney allograft lifespan, of this practice. Methods: Using data from the Scientific Registry of Transplant Recipients, we compared outcomes for all SLK transplants between January 1, 1995, and December 3, 2014, to their donor-matched kidneys used in kidney-alone (Ki) or simultaneous pancreas-kidney (SPK) transplants. The primary outcome was kidney allograft lifespan, defined as the time free from death or allograft failure. Secondary outcomes included death and death-censored allograft failure. We adjusted all analyses for donor, transplant, and recipient factors. Results: The adjusted 10-year mean kidney allograft lifespan was higher in Ki/SPK compared with SLK transplants, by 0.99 years in the Model for End-stage Liver Disease era and 1.71 years in the pre-Model for End-stage Liver Disease era. Death was higher in SLK recipients relative to Ki/SPK recipients: 10-year cumulative incidences 0.36 (95% confidence interval 0.33-0.38) versus 0.19 (95% confidence interval 0.17-0.21). Conclusions: SLK transplantation exemplifies the trade-off between the principles of utility and medical urgency. With each SLK transplantation, about 1 year of allograft lifespan is traded so that sicker patients, that is, SLK transplant recipients, are afforded access to the organ. These data provide a basis against which benefits derived from urgency-based allocation can be measured. PMID:28437790

  20. Testing independence of bivariate interval-censored data using modified Kendall's tau statistic.

    PubMed

    Kim, Yuneung; Lim, Johan; Park, DoHwan

    2015-11-01

    In this paper, we study a nonparametric procedure to test independence of bivariate interval-censored data, covering both current status data (case 1 interval-censored data) and case 2 interval-censored data. To do so, we propose a score-based modification of Kendall's tau statistic for bivariate interval-censored data. Our modification defines the Kendall's tau statistic using expected numbers of concordant and discordant pairs of data. The performance of the modified approach is illustrated by simulation studies and an application to the AIDS study. We compare our method to alternative approaches, such as the two-stage estimation method by Sun et al. (Scandinavian Journal of Statistics, 2006) and the multiple imputation method by Betensky and Finkelstein (Statistics in Medicine, 1999b). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
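
    The spirit of the modification can be illustrated with a toy version. Note this is a deliberate simplification with invented intervals: the paper's score-based statistic uses expected concordance counts under estimated distributions, whereas this sketch only scores pairs whose observation intervals are unambiguously ordered and drops ambiguous pairs.

```python
# Simplified sketch of a Kendall's-tau-like statistic for interval-censored
# pairs: a pair is scored only when both coordinate intervals are ordered
# without overlap; overlapping (ambiguous) intervals contribute zero.
def interval_order(a, b):
    """+1 if interval a lies entirely after b, -1 if entirely before, else 0."""
    if a[0] > b[1]:
        return 1
    if a[1] < b[0]:
        return -1
    return 0

def modified_tau(x_ints, y_ints):
    n = len(x_ints)
    score, pairs = 0, 0
    for i in range(n):
        for j in range(i + 1, n):
            sx = interval_order(x_ints[i], x_ints[j])
            sy = interval_order(y_ints[i], y_ints[j])
            score += sx * sy   # +1 concordant, -1 discordant, 0 ambiguous
            pairs += 1
    return score / pairs

# Hypothetical interval-censored observation times (lo, hi) for two markers.
x = [(0, 2), (3, 5), (6, 8)]
y = [(0, 1), (2, 3), (5, 7)]
print(modified_tau(x, y))  # 1.0 here: every comparable pair is concordant
```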

  1. Incidence and predictors of in-hospital non-cardiac death in patients with acute heart failure.

    PubMed

    Wakabayashi, Kohei; Sato, Naoki; Kajimoto, Katsuya; Minami, Yuichiro; Mizuno, Masayuki; Keida, Takehiko; Asai, Kuniya; Munakata, Ryo; Murai, Koji; Sakata, Yasushi; Suzuki, Hiroshi; Takano, Teruo

    2017-08-01

    Patients with acute heart failure (AHF) commonly have multiple co-morbidities, and some of these patients die in the hospital from causes other than aggravated heart failure. However, limited information is available on the mode of death in patients with AHF. Therefore, the present study was performed to determine the incidence and predictors of in-hospital non-cardiac death in patients with AHF, using the Acute Decompensated Heart Failure Syndromes (ATTEND) registry. The registry included 4842 consecutive patients with AHF admitted between April 2007 and September 2011. The primary endpoint of the present study was in-hospital non-cardiac death. A stepwise regression model was used to identify the predictors of in-hospital non-cardiac death. The incidence of all-cause in-hospital mortality was 6.4% (n=312); the incidence was 1.9% (n=93) for non-cardiac and 4.5% (n=219) for cardiac causes. Old age was associated with in-hospital non-cardiac death, with a 42% increase in risk per decade (odds ratio 1.42, p=0.004). Additionally, co-morbidities including chronic obstructive pulmonary disease (odds ratio 1.98, p=0.034) and anaemia (odds ratio 1.17 per 1.0 g/dl decrease, p=0.006) were strongly associated with in-hospital non-cardiac death. Other predictors included low serum sodium levels (odds ratio 1.05 per 1.0 mEq/l decrease, p=0.045), high C-reactive protein levels (odds ratio 1.07, p<0.001) and no statin use (odds ratio 0.40, p=0.024). The incidence of in-hospital non-cardiac death was markedly high in patients with AHF, accounting for 30% of all in-hospital deaths in the ATTEND registry. Thus, the prevention and management of non-cardiac complications are vital to reducing acute-phase mortality in patients with AHF, especially those with predictors of in-hospital non-cardiac death.

  2. Liver Transplantation for Fulminant Hepatic Failure

    PubMed Central

    Farmer, Douglas G.; Anselmo, Dean M.; Ghobrial, R. Mark; Yersiz, Hasan; McDiarmid, Suzanne V.; Cao, Carlos; Weaver, Michael; Figueroa, Jesus; Khan, Khurram; Vargas, Jorge; Saab, Sammy; Han, Steven; Durazo, Francisco; Goldstein, Leonard; Holt, Curtis; Busuttil, Ronald W.

    2003-01-01

    Objective To analyze outcomes after liver transplantation (LT) in patients with fulminant hepatic failure (FHF) with emphasis on pretransplant variables that can potentially help predict posttransplant outcome. Summary Background Data FHF is a formidable clinical problem associated with a high mortality rate. While LT is the treatment of choice for irreversible FHF, few investigations have examined pretransplant variables that can potentially predict outcome after LT. Methods A retrospective review was undertaken of all patients undergoing LT for FHF at a single transplant center. The median follow-up was 41 months. Thirty-five variables were analyzed by univariate and multivariate analysis to determine their impact on patient and graft survival. Results Two hundred four patients (60% female, median age 20.2 years) required urgent LT for FHF. Before LT, the majority of patients were comatose (76%), on hemodialysis (16%), and ICU-bound. The 1- and 5-year survival rates were 73% and 67% (patient) and 63% and 57% (graft). The primary cause of patient death was sepsis, and the primary cause of graft failure was primary graft nonfunction. Univariate analysis of pre-LT variables revealed that 19 variables predicted survival. From these results, multivariate analysis determined that the serum creatinine was the single most important prognosticator of patient survival. Conclusions This study, representing one of the largest published series on LT for FHF, demonstrates a long-term survival of nearly 70% and develops a clinically applicable and readily measurable set of pretransplant factors that determine posttransplant outcome. PMID:12724633

  3. Minimally invasive direct coronary artery bypass grafting: a meta-analysis.

    PubMed

    Kettering, K

    2008-12-01

    Recently, minimally invasive direct coronary artery bypass (MIDCAB) grafting has become an interesting alternative to conventional coronary artery bypass grafting, especially in patients with a high-grade left anterior descending coronary artery (LAD) stenosis unsuitable for balloon angioplasty. Although MIDCAB offers several advantages, such as the avoidance of sternotomy and cardiopulmonary bypass, concerns have been raised about the technical accuracy of anastomoses performed on a beating heart. Therefore, clinical and angiographic outcomes after MIDCAB are the subject of current controversy. A literature search for all published outcome studies of MIDCAB grafting was performed for the period from January 1995 through October 2007. Seventeen articles were included in this meta-analysis. The data presented in the studies were analyzed with regard to clinical and angiographic results. Early and late (>30 days after MIDCAB) death rates were 1.3% (51/4081 patients) and 3.2% (130/4081 patients), respectively. The infarct rate was 0.8% (32/4081 patients; non-fatal myocardial infarction). Other minor or major complications (e.g. reoperation for management of bleeding, chest wound problems, arrhythmias, cerebrovascular accident, pericardial effusion, pulmonary complications) were reported in 781 cases. The conversion rate to sternotomy/cardiopulmonary bypass was 1.8% (74/4081 patients). A re-intervention due to graft failure was necessary in 134/4081 patients (3.3%). A total of 2556 grafts were studied angiographically immediately after surgery. One hundred and six grafts (4.2%) were occluded and 169 grafts (6.6%) had a significant stenosis (50-99%). At 6-month follow-up, 445 grafts were studied angiographically. Sixteen grafts (3.6%) were occluded and 32 grafts (7.2%) had a significant stenosis. Clinical outcomes and immediate graft patency after MIDCAB are acceptable. However, long-term follow-up results and further randomized prospective clinical trials are needed.

  4. A single-item self-report medication adherence question predicts hospitalisation and death in patients with heart failure.

    PubMed

    Wu, Jia-Rong; DeWalt, Darren A; Baker, David W; Schillinger, Dean; Ruo, Bernice; Bibbins-Domingo, Kristen; Macabasco-O'Connell, Aurelia; Holmes, George M; Broucksou, Kimberly A; Erman, Brian; Hawk, Victoria; Cene, Crystal W; Jones, Christine DeLong; Pignone, Michael

    2014-09-01

    To determine whether a single-item self-report medication adherence question predicts hospitalisation and death in patients with heart failure. Poor medication adherence is associated with increased morbidity and mortality. Having a simple means of identifying suboptimal medication adherence could help identify at-risk patients for interventions. We performed a prospective cohort study in 592 participants with heart failure within a four-site randomised trial. Self-reported medication adherence was assessed at baseline using a single-item question: 'Over the past seven days, how many times did you miss a dose of any of your heart medication?' Participants who reported no missed doses were defined as fully adherent, and those missing one or more doses were considered less than fully adherent. The primary outcome was combined all-cause hospitalisation or death over one year, and the secondary endpoint was heart failure hospitalisation. Outcomes were assessed with blinded chart reviews, and heart failure outcomes were determined by a blinded adjudication committee. We used negative binomial regression to examine the relationship between medication adherence and outcomes. Fifty-two percent of participants were male, mean age was 61 years, and 31% were New York Heart Association class III/IV at enrolment; 72% of participants reported full adherence to their heart medicine at baseline. Participants with full medication adherence had a lower rate of all-cause hospitalisation and death (0·71 events/year) compared with those with any nonadherence (0·86 events/year): the adjusted-for-site incidence rate ratio was 0·83 and the fully adjusted incidence rate ratio 0·68. Incidence rate ratios were similar for heart failure hospitalisations. A single medication adherence question at baseline predicts hospitalisation and death over one year in heart failure patients. Medication adherence is associated with all-cause and heart failure-related hospitalisation and death in heart failure patients.

  5. Corneal graft rejection in African Americans at Howard University Hospital

    PubMed Central

    Ferdinand, Larry; Ngakeng, Vanessa; Copeland, Robert A.

    2011-01-01

    Purpose There is a scarcity of data in the literature on the corneal graft rejection rate in patients exclusively of African ancestry. The purpose of this study was to evaluate the rejection rate of corneal transplant surgery performed at Howard University Hospital on such patients over a 15-year period. Design A retrospective evaluation was performed of the corneal graft rejection and corneal graft failure rates in 125 penetrating keratoplasties (PKPs) done by one corneal specialist at Howard University Hospital from January 1, 1990 to August 31, 2005. Methods Of the 125 patients, 62 were eliminated from the study because of re-grafted eyes, non-African descent, primary graft failure, follow-up of less than 1 month, or unavailability of charts. This study therefore recorded data from 63 penetrating keratoplasties of 63 eyes from 60 patients. Results Episodes of graft rejection were documented in 23 eyes (36.5% of cases). Nine of the 23 graft rejections progressed to secondary graft failure (39%). Overall, nine of the 63 PKPs (14.3%) resulted in secondary graft failure over the 15-year period. The major diagnostic categories were bullous keratopathy 24 (38%), keratoconus 10 (15.8%), Fuchs' dystrophy 4 (6.3%), and other 20 (31.7%). Of the cases with episodes of rejection and failure, 4.3% and none were attributable to keratoconus, 30.4% and 22.2% to bullous keratopathy, and 8.7% and 22.2% to Fuchs' dystrophy, respectively. Best visual acuity was also examined in patients with rejection episodes. None of the patients had a pre-op visual acuity of 20/40 or better; however, after PKP, 2 patients (8.7%) achieved 20/40 or better. Also, 4 patients (17.4%) had a pre-op visual acuity between 20/50 and 20/150, and after PKP the number with best visual acuity between 20/50 and 20/150 increased to 9 (39.1%). Conclusion At 36%, the prevalence of corneal graft rejection was one of the highest in the reported literature, but only 14% of those episodes resulted in graft failure.

  6. Corneal graft rejection in African Americans at Howard University Hospital.

    PubMed

    Ferdinand, Larry; Ngakeng, Vanessa; Copeland, Robert A

    2011-07-01

    There is a scarcity of data in the literature on the corneal graft rejection rate in patients exclusively of African ancestry. The purpose of this study was to evaluate the rejection rate of corneal transplant surgery performed at Howard University Hospital on such patients over a 15-year period. A retrospective evaluation was performed of the corneal graft rejection and corneal graft failure rates in 125 penetrating keratoplasties (PKPs) done by one corneal specialist at Howard University Hospital from January 1, 1990 to August 31, 2005. Of the 125 patients, 62 were eliminated from the study because of re-grafted eyes, non-African descent, primary graft failure, follow-up of less than 1 month, or unavailability of charts. This study therefore recorded data from 63 penetrating keratoplasties of 63 eyes from 60 patients. Episodes of graft rejection were documented in 23 eyes (36.5% of cases). Nine of the 23 graft rejections progressed to secondary graft failure (39%). Overall, nine of the 63 PKPs (14.3%) resulted in secondary graft failure over the 15-year period. The major diagnostic categories were bullous keratopathy 24 (38%), keratoconus 10 (15.8%), Fuchs' dystrophy 4 (6.3%), and other 20 (31.7%). Of the cases with episodes of rejection and failure, 4.3% and none were attributable to keratoconus, 30.4% and 22.2% to bullous keratopathy, and 8.7% and 22.2% to Fuchs' dystrophy, respectively. Best visual acuity was also examined in patients with rejection episodes. None of the patients had a pre-op visual acuity of 20/40 or better; however, after PKP, 2 patients (8.7%) achieved 20/40 or better. Also, 4 patients (17.4%) had a pre-op visual acuity between 20/50 and 20/150, and after PKP the number with best visual acuity between 20/50 and 20/150 increased to 9 (39.1%). At 36%, the prevalence of corneal graft rejection was one of the highest in the reported literature, but only 14% of those episodes resulted in graft failure, which is one of

  7. Sudden cardiac death and pump failure death prediction in chronic heart failure by combining ECG and clinical markers in an integrated risk model.

    PubMed

    Ramírez, Julia; Orini, Michele; Mincholé, Ana; Monasterio, Violeta; Cygankiewicz, Iwona; Bayés de Luna, Antonio; Martínez, Juan Pablo; Laguna, Pablo; Pueyo, Esther

    2017-01-01

    Sudden cardiac death (SCD) and pump failure death (PFD) are common endpoints in chronic heart failure (CHF) patients, but prevention strategies are different. Currently used tools to specifically predict these endpoints are limited. We developed risk models to specifically assess SCD and PFD risk in CHF by combining ECG markers and clinical variables. The relation of clinical and ECG markers with SCD and PFD risk was assessed in 597 patients enrolled in the MUSIC (MUerte Súbita en Insuficiencia Cardiaca) study. ECG indices included: turbulence slope (TS), reflecting autonomic dysfunction; T-wave alternans (TWA), reflecting ventricular repolarization instability; and T-peak-to-end restitution (ΔαTpe) and T-wave morphology restitution (TMR), both reflecting changes in dispersion of repolarization due to heart rate changes. Standard clinical indices were also included. The indices with the greatest SCD prognostic impact were gender, New York Heart Association (NYHA) class, left ventricular ejection fraction, TWA, ΔαTpe and TMR. For PFD, the indices were diabetes, NYHA class, ΔαTpe and TS. Using a model with only clinical variables, the hazard ratios (HRs) for SCD and PFD for patients in the high-risk group (fifth quintile of risk score) with respect to patients in the low-risk group (first and second quintiles of risk score) were both greater than 4. HRs for SCD and PFD increased to 9 and 11 when using a model including only ECG markers, and to 14 and 13, when combining clinical and ECG markers. The inclusion of ECG markers capturing complementary pro-arrhythmic and pump failure mechanisms into risk models based only on standard clinical variables substantially improves prediction of SCD and PFD in CHF patients.

  8. Nanofat grafting under a split-thickness skin graft for problematic wound management.

    PubMed

    Kemaloğlu, Cemal Alper

    2016-01-01

    Obesity and certain medical disorders make the reconstruction of skin defects challenging. Different kinds of procedures can be used for these defects; skin grafting is among the most common and simplest. Fat grafting and the stem cells located in adipose tissue have been widely used in plastic surgery for regeneration and rejuvenation purposes. To decrease the graft failure rate, we performed nanofat grafting under an autologous split-thickness skin graft in a patient with a problematic wound. The case of a 35-year-old female patient with a traumatic skin defect on her left anterior crural region is described herein. After subsequent flap reconstruction, the result was disappointing and the defect size had widened. The defect was treated with combined grafting (nanofat grafting under an autologous split-thickness skin graft). At the 6-month follow-up assessment after combined grafting, the integrity of the skin graft was good, with excellent pliability. Combined grafting for problematic wounds appears to be a useful technique for cases requiring reconstruction. The potential presence of stem cells may be responsible for the successful result in our patient.

  9. Effect of a single intraoperative high-dose ATG-Fresenius on delayed graft function in donation after cardiac-death donor renal allograft recipients: a randomized study.

    PubMed

    van den Hoogen, Martijn W F; Kho, Marcia M L; Abrahams, Alferso C; van Zuilen, Arjan D; Sanders, Jan-Stephan; van Dijk, Marja; Hilbrands, Luuk B; Weimar, Willem; Hoitsma, Andries J

    2013-04-01

    Reducing the incidence of delayed graft function after transplantation with donation-after-cardiac-death donor renal allografts would facilitate managing recipients during their first weeks after transplant. To reduce this incidence, most studies couple induction therapy with depleting anti-T-lymphocyte antibodies with a reduction in the dosage of the calcineurin inhibitor. The separate effect of anti-T-cell therapy on the incidence and duration of delayed graft function is therefore difficult to assess. We performed a randomized study to evaluate the effect of a single intraoperative high dose of anti-T-lymphocyte immunoglobulin (ATG)-Fresenius (9 mg/kg body weight) on the incidence of delayed graft function. Eligible adult recipients of a first donation-after-cardiac-death donor renal allograft were randomly assigned to ATG-Fresenius or no induction therapy. Maintenance immunosuppression consisted of tacrolimus, in an unadjusted dose, mycophenolate mofetil, and steroids. The study was prematurely terminated because of a lower-than-anticipated inclusion rate. Baseline characteristics were comparable in the ATG-Fresenius group (n=28) and the control group (n=24). Twenty-two patients in the ATG-Fresenius group (79%) had delayed graft function, compared with 13 in the control group (54%; P = .06). Allograft and patient survival were comparable in both groups. Serious adverse events occurred more frequently in the ATG-Fresenius group than in the control group (57% vs 29%; P < .05). Intraoperative administration of a single high dose of ATG-Fresenius in donation-after-cardiac-death donor renal allograft recipients, followed by triple immunosuppression with an unadjusted tacrolimus dose, appears ineffective in reducing the incidence of delayed graft function and was associated with a higher rate of serious adverse events (EudraCT number: 2007-000210-36).

  10. EEA stapler and omental graft in esophagogastrectomy: experience with 30 intrathoracic anastomoses for cancer.

    PubMed Central

    Fekete, F; Breil, P; Ronsse, H; Tossen, J C; Langonnet, F

    1981-01-01

    Experience with the EEA stapler device used in 30 esophagogastric resections for cancer with intrathoracic anastomosis is reported. The mortality rate was 13.3%. The anastomotic failure rate was 3.3% (1/30), with only one death; three asymptomatic blind fistulas were found on routine contrast examination of the anastomosis. Esophagogastric EEA-stapled anastomosis combined with an omental graft is felt to be a very safe technique. PMID:7247526

  11. Failure-free survival after second-line systemic treatment of chronic graft-versus-host disease

    PubMed Central

    Storer, Barry E.; Lee, Stephanie J.; Carpenter, Paul A.; Sandmaier, Brenda M.; Flowers, Mary E. D.; Martin, Paul J.

    2013-01-01

    This study attempted to characterize causes of treatment failure, identify associated prognostic factors, and develop shorter-term end points for trials testing investigational products or regimens for second-line systemic treatment of chronic graft-versus-host disease (GVHD). The study cohort (312 patients) received second-line systemic treatment of chronic GVHD. The primary end point was failure-free survival (FFS) defined by the absence of third-line treatment, nonrelapse mortality, and recurrent malignancy during second-line treatment. Treatment change was the major cause of treatment failure. FFS was 56% at 6 months after second-line treatment. Lower steroid doses at 6 months correlated with subsequent withdrawal of immunosuppressive treatment. Multivariate analysis showed that high-risk disease at transplantation, lower gastrointestinal involvement at second-line treatment, and severe NIH global score at second-line treatment were associated with increased risks of treatment failure. These three factors were used to define risk groups, and success rates at 6 months were calculated for each risk group either without or with various steroid dose limits at 6 months as an additional criterion of success. These success rates could be used as the basis for a clinically relevant and efficient shorter-term end point in clinical studies that evaluate agents for second-line systemic treatment of chronic GVHD. PMID:23321253

  12. Evaluation of methods for managing censored results when calculating the geometric mean.

    PubMed

    Mikkonen, Hannah G; Clarke, Bradley O; Dasika, Raghava; Wallis, Christian J; Reichman, Suzie M

    2018-01-01

    Currently, there are conflicting views on the best statistical methods for managing censored environmental data. The method commonly applied by environmental science researchers and professionals is to substitute half the limit of reporting when deriving summary statistics. This approach has been criticised by some researchers, raising questions around the interpretation of historical scientific data. This study evaluated four complete soil datasets, at three levels of simulated censorship, to test the accuracy of a range of censored-data management methods for calculating the geometric mean. The methods assessed included removal of censored results, substitution of a fixed value (near zero, half the limit of reporting, or the limit of reporting), substitution by nearest-neighbour imputation, maximum likelihood estimation, regression on order statistics, and Kaplan-Meier/survival analysis. This is the first time such a comprehensive range of censored-data management methods has been applied to assess the accuracy of geometric mean calculation. The results of this study show that, for describing the geometric mean, the simple method of substituting half the limit of reporting is comparable to or more accurate than alternative censored-data management methods, including nearest-neighbour imputation. Copyright © 2017 Elsevier Ltd. All rights reserved.
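    The fixed-value substitution approaches compared in this record are simple to state concretely. A sketch (with hypothetical concentrations and limit of reporting, not the study's data) of how the geometric mean shifts under the three substitution fractions:

```python
import math

# Sketch of fixed-value substitution for censored results ("<LOR"):
# each censored result is replaced by fraction * LOR before computing
# the geometric mean. Values below are hypothetical.

def geo_mean(values):
    """Geometric mean via the mean of logs (all values must be > 0)."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

def substitute_censored(observed, n_censored, lor, fraction):
    """Append fraction * LOR for each censored result."""
    return observed + [fraction * lor] * n_censored

observed = [2.0, 5.0, 8.0, 12.0]   # detected concentrations
lor = 1.0                          # limit of reporting
n_censored = 2                     # results reported as "<LOR"

for frac, label in [(0.001, "near zero"), (0.5, "half LOR"), (1.0, "LOR")]:
    gm = geo_mean(substitute_censored(observed, n_censored, lor, frac))
    print(f"{label:>10}: geometric mean = {gm:.3f}")
```

Because the geometric mean works on logs, the near-zero substitution drags the estimate down far more sharply than half-LOR substitution does, which is one reason the half-LOR convention behaves reasonably here.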

  13. A multilayered polyurethane foam technique for skin graft immobilization.

    PubMed

    Nakamura, Motoki; Ito, Erika; Kato, Hiroshi; Watanabe, Shoichi; Morita, Akimichi

    2012-02-01

    Several techniques are applicable for skin graft immobilization. Although the sponge dressing is a popular technique, pressure failure near the center of the graft is a weakness of the technique that can result in engraftment failure. To evaluate the efficacy of a new skin graft immobilization technique using multilayered polyurethane foam in vivo and in vitro. Twenty-six patients underwent a full-thickness skin graft. Multiple layers of a hydrocellular polyurethane foam dressing were used for skin graft immobilization. In addition, we created an in vitro skin graft model that allowed us to estimate immobilization pressure at the center and edges of skin grafts of various sizes. Overall mean graft survival was 88.9%. In the head and neck region (19 patients), mean graft survival was 93.6%. Based on the in vitro outcomes, this technique supplies effective pressure (<30 mmHg) to the center region of the skin graft. This multilayered polyurethane foam dressing is simple, safe, and effective for skin graft immobilization. © 2011 by the American Society for Dermatologic Surgery, Inc. Published by Wiley Periodicals, Inc.

  14. Cox model with interval-censored covariate in cohort studies.

    PubMed

    Ahn, Soohyun; Lim, Johan; Paik, Myunghee Cho; Sacco, Ralph L; Elkind, Mitchell S

    2018-05-18

    In cohort studies the outcome is often time to a particular event, and subjects are followed at regular intervals. Periodic visits may also monitor a secondary irreversible event influencing the event of primary interest, and a significant proportion of subjects develop the secondary event over the period of follow-up. The status of the secondary event serves as a time-varying covariate, but it is recorded only at the scheduled visits, generating incomplete time-varying covariates. While information on a typical time-varying covariate is missing for the entire follow-up period except at the visit times, the status of the secondary event is unavailable only within the between-visit interval in which the status changed; it is thus interval-censored. One may view the interval-censored covariate of the secondary event status as a missing time-varying covariate, yet the missingness is partial, since partial information is provided throughout the follow-up period. The current practice of using the latest observed status produces biased estimators, and existing missing-covariate techniques cannot accommodate the special feature of missingness due to interval censoring. To handle interval-censored covariates in the Cox proportional hazards model, we propose an available-data estimator and a doubly robust-type estimator, as well as the maximum likelihood estimator via the EM algorithm, and present their asymptotic properties. We also present practical approaches that are valid. We demonstrate the proposed methods using our motivating example from the Northern Manhattan Study. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
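    The bias this record describes is easy to see when building counting-process rows for a time-varying covariate. A sketch (our illustration, not the authors' estimator) contrasting latest-observed-status carry-forward with a simple midpoint imputation of the interval-censored change time:

```python
# The secondary event occurred somewhere in (last negative visit, first
# positive visit]. Carrying the latest observed status forward ("locf")
# assumes the change happened at the later visit, overstating the time
# spent with covariate status 0; midpoint imputation splits the interval.

def covariate_intervals(visits, statuses, end_of_followup, impute="midpoint"):
    """Return (start, stop, status) rows in counting-process format.
    visits: visit times; statuses: 0/1 secondary-event status at each visit."""
    change = None
    for (t0, s0), (t1, s1) in zip(zip(visits, statuses),
                                  zip(visits[1:], statuses[1:])):
        if s0 == 0 and s1 == 1:   # status changed within (t0, t1]
            change = (t0 + t1) / 2 if impute == "midpoint" else t1
            break
    if change is None:
        return [(0.0, end_of_followup, 0)]
    return [(0.0, change, 0), (change, end_of_followup, 1)]

visits, statuses = [0, 6, 12], [0, 0, 1]   # hypothetical semiannual visits
print(covariate_intervals(visits, statuses, 18.0, impute="midpoint"))
print(covariate_intervals(visits, statuses, 18.0, impute="locf"))
```

Under midpoint imputation the subject is exposed from month 9; under carry-forward, only from month 12, so systematic carry-forward shortens every exposed period, which is the source of the bias the authors address.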

  15. Marginal regression analysis of recurrent events with coarsened censoring times.

    PubMed

    Hu, X Joan; Rosychuk, Rhonda J

    2016-12-01

    Motivated by an ongoing pediatric mental health care (PMHC) study, this article presents weakly structured methods for analyzing doubly censored recurrent event data where only coarsened information on censoring is available. The study extracted administrative records of emergency department visits from provincial health administrative databases. The available information of each individual subject is limited to a subject-specific time window determined up to concealed data. To evaluate time-dependent effect of exposures, we adapt the local linear estimation with right censored survival times under the Cox regression model with time-varying coefficients (cf. Cai and Sun, Scandinavian Journal of Statistics 2003, 30, 93-111). We establish the pointwise consistency and asymptotic normality of the regression parameter estimator, and examine its performance by simulation. The PMHC study illustrates the proposed approach throughout the article. © 2016, The International Biometric Society.

  16. Intermediate-term graft loss after renal transplantation is associated with both donor-specific antibody and acute rejection.

    PubMed

    Devos, Jennifer M; Gaber, Ahmed Osama; Teeter, Larry D; Graviss, Edward A; Patel, Samir J; Land, Geoffrey A; Moore, Linda W; Knight, Richard J

    2014-03-15

    Renal transplant recipients with de novo DSA (dDSA) experience higher rates of rejection and worse graft survival than dDSA-free recipients. This study presents a single-center review of dDSA monitoring in a large, multi-ethnic cohort of renal transplant recipients. The authors performed a nested case-control study of adult kidney and kidney-pancreas recipients from July 2007 through July 2011. Cases were defined as dDSA-positive whereas controls were all DSA-negative transplant recipients. DSA were determined at 1, 3, 6, 9, and 12 months posttransplant, and every 6 months thereafter. Of 503 recipients in the analysis, 24% developed a dDSA, of whom 73% had dDSA against DQ antigen. Median time to dDSA was 6.1 months (range 0.2-44.6 months). After multivariate analysis, African American race, kidney-pancreas recipient, and increasing numbers of human leukocyte antigen mismatches were independent risk factors for dDSA. Recipients with dDSA were more likely to suffer an acute rejection (AR) (35% vs. 10%, P<0.001), an antibody-mediated AR (16% vs. 0.3%, P<0.001), an AR ascribed to noncompliance (8% vs. 2%, P=0.001), and a recurrent AR (6% vs. 1%, P=0.002) than dDSA-negative recipients. At a median follow-up of 31 months, the death-censored actuarial graft survival of dDSA recipients was worse than the DSA-free cohort (P=0.002). Yet, for AR-free recipients, there was no difference in graft survival between cohorts (P=0.66). Development of dDSA was associated with an increased incidence of graft loss, yet the detrimental effect of dDSA was limited in the intermediate term to recipients with AR.

  17. Bovine and PTFE vascular graft results in hemodialysis patients.

    PubMed

    Sert, S; Demirogullari, B; Ziya Anadol, A; Guvence, N; Dalgic, A

    2000-01-01

    Purpose. There are many reports of patency periods, failure rates, and thrombosis and infection episodes associated with vascular grafts. In this article, the results of polytetrafluoroethylene (PTFE) and bovine grafts over a forty-four month period are compared. Methods. 61 vascular grafts (29 PTFE, 32 bovine) were placed in 49 patients. The grafts were compared in terms of survival, complication rates, and placement area using survival analysis. Results. Mean survival time was 17 mo (SE +/- 2.8) for PTFE grafts and 11 mo (SE +/- 1.1) for bovine grafts. A failure rate due only to graft complications of 34% was found for PTFE and 25% for bovine grafts. All graft complications were seen in the first year. The difference in cumulative survival rates between the groups was not significant during the study period or the first year (p>0.05). Regardless of type, there was no significant difference between grafts placed in the forearm and grafts placed in the thigh (p>0.05). Conclusions. There is no survival difference between PTFE and bovine grafts. The first year after grafting is the critical period for the development of complications.

  18. Differential Event Rates and Independent Predictors of Long-Term Major Cardiovascular Events and Death in 5795 Patients With Unprotected Left Main Coronary Artery Disease Treated With Stents, Bypass Surgery, or Medication: Insights From a Large International Multicenter Registry.

    PubMed

    Kang, Se Hun; Ahn, Jung-Min; Lee, Cheol Hyun; Lee, Pil Hyung; Kang, Soo-Jin; Lee, Seung-Whan; Kim, Young-Hak; Lee, Cheol Whan; Park, Seong-Wook; Park, Duk-Woo; Park, Seung-Jung

    2017-07-01

    Identifying predictive factors for major cardiovascular events and death in patients with unprotected left main coronary artery disease is of great clinical value for risk stratification and possible guidance for tailored preventive strategies. The Interventional Research Incorporation Society-Left MAIN Revascularization registry included 5795 patients with unprotected left main coronary artery disease (percutaneous coronary intervention, n=2850; coronary-artery bypass grafting, n=2337; medication alone, n=608). We analyzed the incidence and independent predictors of major adverse cardiac and cerebrovascular events (MACCE; a composite of death, MI, stroke, or repeat revascularization) and all-cause mortality in each treatment stratum. During follow-up (median, 4.3 years), the rates of MACCE and death were substantially higher in the medical group than in the percutaneous coronary intervention and coronary-artery bypass grafting groups (P<0.001). In the percutaneous coronary intervention group, the 3 strongest predictors for MACCE were chronic renal failure, old age (≥65 years), and previous heart failure; those for all-cause mortality were chronic renal failure, old age, and low ejection fraction. In the coronary-artery bypass grafting group, old age, chronic renal failure, and low ejection fraction were the 3 strongest predictors of MACCE and death. In the medication group, old age, low ejection fraction, and diabetes mellitus were the 3 strongest predictors of MACCE and death. Among patients with unprotected left main coronary artery disease, the key clinical predictors for MACCE and death were generally similar regardless of index treatment. This study provides effect estimates for clinically relevant predictors of long-term clinical outcomes in real-world left main coronary artery patients, providing possible guidance for tailored preventive strategies. URL: https://clinicaltrials.gov. Unique identifier: NCT01341327. © 2017 American Heart Association, Inc.

  19. Survival of patients treated for end-stage renal disease by dialysis and transplantation.

    PubMed Central

    Higgins, M. R.; Grace, M.; Dossetor, J. B.

    1977-01-01

    The results of treatment in 213 patients with end-stage renal disease who underwent hemodialysis, peritoneal dialysis or transplantation, or a combination, between 1962 and 1975 were analysed. Comparison by censored survival analysis showed significantly better (P less than 0.01) patient survival with the integrated therapy of dialysis and transplantation than with either form of dialysis alone. There was no significant difference in survival between males and females, but survival at the extremes of age was poorer. Analysis of survival by major cause of renal failure indicated the best survival in patients with congenital renal disease. Graft and patient survival rates at 1 year after the first transplantation were 42% and 69%, respectively. The major cause of death in this series was vascular disease, but infection was responsible for 50% of deaths after transplantation. While integration of dialysis with transplantation produces the best patient survival, this course is possible only when sufficient cadaver kidneys are available. PMID:334354
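
    The censored survival comparison described above rests on the Kaplan-Meier product-limit estimator, which discounts survival only at observed event times while censored patients simply leave the risk set. A minimal sketch on hypothetical follow-up data (not the study's records):

```python
# Minimal Kaplan-Meier product-limit estimator for right-censored
# survival data. Illustrative only; times and events are hypothetical.

def kaplan_meier(times, events):
    """times: follow-up times; events: 1 = death observed, 0 = censored.
    Returns [(event time, survival probability)] at each death time."""
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    survival = 1.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = removed = 0
        # group all subjects sharing this follow-up time
        while i < len(pairs) and pairs[i][0] == t:
            deaths += pairs[i][1]
            removed += 1
            i += 1
        if deaths:  # survival steps down only at observed deaths
            survival *= 1.0 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= removed  # censored subjects leave the risk set here
    return curve

# Hypothetical follow-up in months; 0 marks a patient still alive
times  = [3, 5, 5, 8, 12, 12, 15, 20]
events = [1, 1, 0, 1,  1,  0,  0,  0]
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))  # steps: 0.875, 0.75, 0.6, 0.45
```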

  20. Comparison of outcomes of kidney transplantation from donation after brain death, donation after circulatory death, and donation after brain death followed by circulatory death donors.

    PubMed

    Chen, Guodong; Wang, Chang; Ko, Dicken Shiu-Chung; Qiu, Jiang; Yuan, Xiaopeng; Han, Ming; Wang, Changxi; He, Xiaoshun; Chen, Lizhong

    2017-11-01

    There are three categories of deceased donors for kidney transplantation in China: donation after brain death (DBD), donation after circulatory death (DCD), and donation after brain death followed by circulatory death (DBCD) donors. The aim of this study was to compare the outcomes of kidney transplantation from these three categories of deceased donors. We retrospectively reviewed 469 recipients who received deceased-donor kidney transplantation in our hospital from February 2007 to June 2015. The recipients were divided into three groups according to the source of their donor kidneys: DBD, DCD, or DBCD. The primary endpoints were delayed graft function (DGF), graft loss, and patient death. The warm ischemia time was much longer in the DCD group than in the DBCD group (18.4 minutes vs 12.9 minutes, P < .001). The DGF rate was higher in the DCD group than in the DBD and DBCD groups (22.5% vs 10.2% and 13.8%, respectively, P = .021). Urinary leakage was much more frequent in the DCD group (P = .049). Kaplan-Meier analysis showed that 1-, 2-, and 3-year patient survivals were all comparable among the three groups. DBCD kidney transplantation had lower incidences of DGF and urinary leakage than DCD kidney transplantation. However, overall patient and graft survival were comparable among DBD, DCD, and DBCD kidney transplantation. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  1. Evaluation of statistical treatments of left-censored environmental data using coincident uncensored data sets: I. Summary statistics

    USGS Publications Warehouse

    Antweiler, Ronald C.; Taylor, Howard E.

    2008-01-01

    The main classes of statistical treatment of below-detection limit (left-censored) environmental data for the determination of basic statistics that have been used in the literature are substitution methods, maximum likelihood, regression on order statistics (ROS), and nonparametric techniques. These treatments, along with using all instrument-generated data (even those below detection), were evaluated by examining data sets in which the true values of the censored data were known. It was found that for data sets with less than 70% censored data, the best technique overall for determination of summary statistics was the nonparametric Kaplan-Meier technique. ROS and the two substitution methods of assigning one-half the detection limit value to censored data or assigning a random number between zero and the detection limit to censored data were adequate alternatives. The use of these two substitution methods, however, requires a thorough understanding of how the laboratory censored the data. The technique of employing all instrument-generated data - including numbers below the detection limit - was found to be less adequate than the above techniques. At high degrees of censoring (greater than 70% censored data), no technique provided good estimates of summary statistics. Maximum likelihood techniques were found to be far inferior to all other treatments except substituting zero or the detection limit value to censored data.
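
    The substitution treatments compared above are simple to state in code. A minimal sketch, with `None` marking a below-detection-limit observation (data are hypothetical; the uniform-draw variant is seeded for reproducibility):

```python
# Two substitution treatments for left-censored data, as evaluated in
# the study: replace each censored value with DL/2, or with a random
# draw between zero and the detection limit. Data are hypothetical.
import random
from statistics import mean, median

def substitute_half_dl(values, dl):
    """Replace censored observations (None) with one-half the detection limit."""
    return [dl / 2 if v is None else v for v in values]

def substitute_uniform(values, dl, seed=0):
    """Replace censored observations with a uniform draw in [0, DL]."""
    rng = random.Random(seed)
    return [rng.uniform(0, dl) if v is None else v for v in values]

# Hypothetical concentrations with a single detection limit of 1.0
data = [None, 1.2, 3.4, None, 2.1, 5.0]
half = substitute_half_dl(data, 1.0)
print(round(mean(half), 3), round(median(half), 3))  # → 2.117 1.65
```

As the abstract cautions, both treatments presume you know exactly how the laboratory censored the data; they say nothing about values far below the detection limit.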

  2. Sudden cardiac death and pump failure death prediction in chronic heart failure by combining ECG and clinical markers in an integrated risk model

    PubMed Central

    Orini, Michele; Mincholé, Ana; Monasterio, Violeta; Cygankiewicz, Iwona; Bayés de Luna, Antonio; Martínez, Juan Pablo

    2017-01-01

    Background Sudden cardiac death (SCD) and pump failure death (PFD) are common endpoints in chronic heart failure (CHF) patients, but prevention strategies are different. Currently used tools to specifically predict these endpoints are limited. We developed risk models to specifically assess SCD and PFD risk in CHF by combining ECG markers and clinical variables. Methods The relation of clinical and ECG markers with SCD and PFD risk was assessed in 597 patients enrolled in the MUSIC (MUerte Súbita en Insuficiencia Cardiaca) study. ECG indices included: turbulence slope (TS), reflecting autonomic dysfunction; T-wave alternans (TWA), reflecting ventricular repolarization instability; and T-peak-to-end restitution (ΔαTpe) and T-wave morphology restitution (TMR), both reflecting changes in dispersion of repolarization due to heart rate changes. Standard clinical indices were also included. Results The indices with the greatest SCD prognostic impact were gender, New York Heart Association (NYHA) class, left ventricular ejection fraction, TWA, ΔαTpe and TMR. For PFD, the indices were diabetes, NYHA class, ΔαTpe and TS. Using a model with only clinical variables, the hazard ratios (HRs) for SCD and PFD for patients in the high-risk group (fifth quintile of risk score) with respect to patients in the low-risk group (first and second quintiles of risk score) were both greater than 4. HRs for SCD and PFD increased to 9 and 11 when using a model including only ECG markers, and to 14 and 13, when combining clinical and ECG markers. Conclusion The inclusion of ECG markers capturing complementary pro-arrhythmic and pump failure mechanisms into risk models based only on standard clinical variables substantially improves prediction of SCD and PFD in CHF patients. PMID:29020031

  3. Increased risk of severe recurrence of hepatitis C virus in liver transplant recipients of donation after cardiac death allografts.

    PubMed

    Hernandez-Alejandro, Roberto; Croome, Kris P; Quan, Douglas; Mawardi, Mohamed; Chandok, Natasha; Dale, Cheryl; McAlister, Vivian; Levstik, Mark A; Wall, William; Marotta, Paul

    2011-09-27

    In hepatitis C virus (HCV) recipients of donation after cardiac death (DCD) grafts, there is a suggestion of lower rates of graft survival, indicating that DCD grafts themselves may represent a significant risk factor for severe recurrence of HCV. We evaluated all DCD liver transplant recipients from August 2006 to February 2011 at our center. Recipients with HCV who received a DCD graft (group 1, HCV+ DCD, n=17) were compared with non-HCV recipients transplanted with a DCD graft (group 2, HCV- DCD, n=15), and with a matched group of HCV recipients transplanted with a donation after brain death (DBD) graft (group 3, HCV+ DBD, n=42). A trend toward poorer graft survival was seen in HCV+ patients who underwent a DCD transplant (group 1) compared with HCV- patients who underwent a DCD transplant (group 2) (P=0.14). Importantly, a statistically significant difference in graft survival was seen in HCV+ patients undergoing DCD transplant (group 1) (73%) as compared with DBD transplant (group 3) (93%) (P=0.01). There was a statistically significant increase in HCV recurrence at 3 months (76% vs. 16%) (P=0.005) and severe HCV recurrence within the first year (47% vs. 10%) in the DCD group (P=0.004). HCV recurrence is more severe and progresses more rapidly in HCV+ recipients who receive grafts from DCD compared with those who receive grafts from DBD. DCD liver transplantation in HCV+ recipients is associated with a higher rate of graft failure compared with those who receive grafts from DBD. Caution must be taken when using DCD grafts in HCV+ recipients.

  4. Prolonged immunosuppression preserves nonsensitization status after kidney transplant failure.

    PubMed

    Casey, Michael J; Wen, Xuerong; Kayler, Liise K; Aiyer, Ravi; Scornik, Juan C; Meier-Kriesche, Herwig-Ulf

    2014-08-15

    When kidney transplants fail, transplant medications are discontinued to reduce immunosuppression-related risks. However, retransplant candidates are at risk for allosensitization, which prolonged immunosuppression may minimize. We hypothesized that, for these patients, prolonged immunosuppression withdrawal after graft failure preserves nonsensitization status (PRA 0%) better than early immunosuppression withdrawal. We retrospectively examined subjects transplanted at a single center between July 1, 1999 and December 1, 2009 with a non-death-related graft loss. Subjects were stratified by timing of immunosuppression withdrawal after graft loss: early (≤3 months) or prolonged (>3 months). Retransplant candidates were eligible for the main study, where the primary outcome was nonsensitization at retransplant evaluation. Non-retransplant candidates were included in the safety analysis only. We found 102 subjects with non-death-related graft loss, of whom 49 were eligible for the main study. Nonsensitization rates at retransplant evaluation were 30% and 66% for the early and prolonged immunosuppression withdrawal groups, respectively (P=0.01). After adjusting for cofactors such as blood transfusion and allograft nephrectomy, prolonged immunosuppression withdrawal remained significantly associated with nonsensitization (adjusted odds ratio=5.78, 95% CI [1.37-24.44]). No adverse safety signals were seen in the prolonged immunosuppression withdrawal group compared to the early immunosuppression withdrawal group. These results suggest that prolonged immunosuppression may be a safe strategy to minimize sensitization in retransplant candidates and provide the basis for larger or prospective studies for further verification.

  5. Low estimated glomerular filtration rate and chronic kidney failure following liver transplant: a retrospective cohort study.

    PubMed

    Narciso, Roberto C; Ferraz, Leonardo R; Rodrigues, Cassio J O; Monte, Júlio C M; Mie, Sérgio; Dos Santos, Oscar F P; Paes, Ângela T; Cendoroglo, Miguel; Jaber, Bertrand L; Durão, Marcelino S; Batista, Marcelo C

    2013-07-01

    Patients undergoing orthotopic liver transplant (LTx) often present with chronic kidney disease (CKD). Identification of patients who will progress to end-stage renal disease (ESRD) might allow not only the implementation of kidney-protective measures but also simultaneous kidney transplant. Retrospective cohort study in adults who underwent LTx at a single center. ESRD, death, and the composite of ESRD or death were the studied outcomes. 331 patients who underwent LTx were followed for 2.6 ± 1.4 years; 31 (10%) developed ESRD, 6 (2%) underwent kidney transplant after LTx and 25 (8%) remained on chronic hemodialysis. Patients with preoperative eGFR less than 60 ml/min per 1.73 m2 had a 4-fold increased risk of developing ESRD after adjustment for sex, diabetes mellitus, APACHE II score, use of nephrotoxic drugs, and severe liver graft failure (HR = 3.95, 95% CI 1.73, 9.01; p = 0.001). Other independent risk factors for ESRD were preoperative diabetes mellitus and post-operative severe liver graft dysfunction. These findings emphasize low eGFR prior to LTx as a predictor for ESRD or death. Kidney-after-liver transplant should be considered as a treatment modality for those who develop chronic kidney failure after LTx.

  6. Estimation of Recurrence of Colorectal Adenomas with Dependent Censoring Using Weighted Logistic Regression

    PubMed Central

    Hsu, Chiu-Hsieh; Li, Yisheng; Long, Qi; Zhao, Qiuhong; Lance, Peter

    2011-01-01

    In colorectal polyp prevention trials, estimation of the rate of recurrence of adenomas at the end of the trial may be complicated by dependent censoring, that is, time to follow-up colonoscopy and dropout may be dependent on time to recurrence. Assuming that the auxiliary variables capture the dependence between recurrence and censoring times, we propose to fit two working models with the auxiliary variables as covariates to define risk groups and then extend an existing weighted logistic regression method for independent censoring to each risk group to accommodate potential dependent censoring. In a simulation study, we show that the proposed method results in both a gain in efficiency and reduction in bias for estimating the recurrence rate. We illustrate the methodology by analyzing a recurrent adenoma dataset from a colorectal polyp prevention trial. PMID:22065985

  7. Nonparametric autocovariance estimation from censored time series by Gaussian imputation.

    PubMed

    Park, Jung Wook; Genton, Marc G; Ghosh, Sujit K

    2009-02-01

    One of the most frequently used methods to model the autocovariance function of a second-order stationary time series is to use the parametric framework of autoregressive and moving average models developed by Box and Jenkins. However, such parametric models, though very flexible, may not always be adequate to model autocovariance functions with sharp changes. Furthermore, if the data do not follow the parametric model and are censored at a certain value, the estimation results may not be reliable. We develop a Gaussian imputation method to estimate an autocovariance structure via nonparametric estimation of the autocovariance function in order to address both censoring and incorrect model specification. We demonstrate the effectiveness of the technique in terms of bias and efficiency with simulations under various rates of censoring and underlying models. We describe its application to a time series of silicon concentrations in the Arctic.
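
    Once censored points have been imputed, the nonparametric step reduces to the standard biased sample autocovariance estimator. A minimal sketch of that estimator alone (the Gaussian imputation itself is omitted; the series is hypothetical, not the Arctic silicon data):

```python
# Biased sample autocovariance estimator gamma_hat(h), the usual
# nonparametric choice because it yields a positive semidefinite
# autocovariance sequence. Series values are hypothetical.

def autocovariance(x, max_lag):
    """Return [gamma_hat(0), ..., gamma_hat(max_lag)] for series x,
    dividing by n (not n - h) at every lag."""
    n = len(x)
    m = sum(x) / n  # sample mean
    return [sum((x[t] - m) * (x[t + h] - m) for t in range(n - h)) / n
            for h in range(max_lag + 1)]

series = [2.0, 1.0, 3.0, 2.0, 4.0, 3.0]
print([round(g, 3) for g in autocovariance(series, 2)])  # → [0.917, -0.042, 0.167]
```

Dividing by n rather than n - h introduces a small bias but keeps the estimated autocovariance sequence valid, which matters when the estimate feeds a Gaussian model as in the imputation scheme above.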

  8. Impact of recipient body mass index on short-term and long-term survival of pancreatic grafts.

    PubMed

    Bédat, Benoît; Niclauss, Nadja; Jannot, Anne-Sophie; Andres, Axel; Toso, Christian; Morel, Philippe; Berney, Thierry

    2015-01-01

    The impact of recipient body mass index on graft and patient survival after pancreas transplantation is not well known. We analyzed data from all pancreas transplant recipients reported in the Scientific Registry of Transplant Recipients between 1987 and 2011. Recipients were categorized into BMI classes, as defined by the World Health Organization. Short-term (90 days) and long-term (90 days to 5 years) patient and graft survivals were analyzed according to recipient BMI class using Kaplan-Meier estimates. Hazard ratios were estimated using Cox proportional hazard models. A total of 21,075 adult recipients were included in the analysis. Mean follow-up was 5 ± 1.1 years. Thirty-nine percent of subjects were overweight or obese. Increasing recipient BMI was an independent predictor of pancreatic graft loss and patient death in the short term (P<0.001), especially for obese class II patient survival (hazard ratio, 2.07; P=0.009). In the long term, obesity, but not overweight, was associated with a higher risk of graft failure (P=0.01). Underweight was associated with a higher risk of long-term death (P<0.001). These results question the safety of pancreas transplantation in obese patients and suggest that they may be directed to alternate therapies, such as behavioral modifications or bariatric surgery, before pancreas transplantation is considered.

  9. Association of Pretransplantion Opioid Use with Graft Loss or Death in Liver Transplantation Patients with Model of End-Stage Liver Disease Exceptions.

    PubMed

    Fleming, James N; Taber, David J; Pilch, Nicole A; Mardis, Caitlin R; Gilbert, Rachael E; Wilson, Lytani Z; Patel, Neha; Ball, Sarah; Mauldin, Patrick; Baliga, Prabhakar K

    2018-04-01

    Up to 77% of liver transplantation candidates experience pain, and the majority are prescribed opioids. Previous studies have shown increased readmissions and mortality in liver transplant recipients who were prescribed opioids before transplantation. Our aim was to identify specific populations that are at the highest risk for deleterious outcomes with opioid use before transplantation. This was a single-center retrospective cohort study of adults who received liver transplants between 2010 and 2016 to assess the impact of pretransplantation opioid use on mortality and graft loss after liver transplantation. A total of 446 liver transplant recipients were included in the study, of whom 148 (33%) were identified as pretransplantation opioid users. Opioid use increased significantly during the course of the study. There were no differences in the overall cohort between opioid users and non-opioid users with regard to graft or patient outcomes. However, the influence of opioid use on outcomes varied based on Model for End-Stage Liver Disease (MELD) and functional status. In patients with any MELD exception, opioid use was an independent predictor of time to graft loss or death (adjusted hazard ratio 2.36; 95% CI 1.05 to 5.28; p = 0.037). It also independently predicted time to graft loss or death in patients with low laboratory MELD scores (adjusted hazard ratio 2.38; 95% CI 1.10 to 5.13; p = 0.027). In our 6-year retrospective cohort, pretransplantation opioid use based on medication reconciliation was independently associated with time to graft loss or mortality in liver transplant recipients with MELD exceptions and laboratory MELD scores ≤15. Copyright © 2018 American College of Surgeons. Published by Elsevier Inc. All rights reserved.

  10. Relationships between Phytophthora ramorum canker (sudden oak death) and failure potential in coast live oak

    Treesearch

    Tedmund J. Swiecki; Elizabeth Bernhardt; Christiana Drake; Laurence R. Costello

    2006-01-01

    In autumn 2002, we conducted a retrospective study on coast live oak (Quercus agrifolia) failures in Marin County, California, woodlands affected by Phytophthora ramorum canker (sudden oak death). The objectives of this case-control study were to quantify levels of bole, large branch, and root failure in these woodlands and...

  11. Bilateral internal mammary artery grafting: in situ versus Y-graft. Similar 20-year outcome.

    PubMed

    Di Mauro, Michele; Iacò, Angela L; Allam, Ahmed; Awadi, Mohammed O; Osman, Ahmed A; Clemente, Daniela; Calafiore, Antonio M

    2016-10-01

    The aim of this study was to evaluate the 20-year clinical outcome of patients undergoing coronary artery bypass grafting with bilateral internal mammary arteries (BIMAs) using two different configurations, in situ versus Y-graft. From September 1991 to December 2002, 2150 patients with multivessel coronary artery disease underwent isolated myocardial revascularization with BIMA grafting. BIMA was used as an in situ or Y-configuration in 1332 and 818 cases, respectively. A propensity score model was applied to calculate a standardized difference of ≤10% between groups (BIMA in situ vs BIMA Y-graft), and a cohort of 1468 matched patients was identified (734 in each group). Death, non-fatal myocardial infarction and the need for repeat revascularization were defined as 'major adverse cardiac events'. Late mortality was 24.3% (n = 357) [BIMA in situ vs BIMA Y-graft: 26.9% (n = 197) vs 21.8% (n = 160)]; in 11.6% (n = 170) of cases death was due to cardiac causes [11.9% (n = 87) vs 11.3% (n = 83)]. The rate of major adverse cardiac events was 37.1% (n = 545) [40.8% (n = 299) vs 33.5% (n = 246)]. The 20-year survival was 59 ± 6% and the event-free survival was 45 ± 7%. The clinical outcome of BIMA grafting is independent of surgical configuration. Y-grafting increases the flexibility of BIMA grafting and should be taken into account when a surgical strategy for myocardial revascularization needs to be planned. © The Author 2016. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  12. Descemet Stripping Automated Endothelial Keratoplasty for Failed Penetrating Keratoplasty: Influence of the Graft-Host Junction on the Graft Survival Rate.

    PubMed

    Omoto, Takashi; Sakisaka, Toshihiro; Toyono, Tetsuya; Yoshida, Junko; Shirakawa, Rika; Miyai, Takashi; Yamagami, Satoru; Usui, Tomohiko

    2018-04-01

    To investigate the clinical results of Descemet stripping automated endothelial keratoplasty (DSAEK) for failed penetrating keratoplasty (PK) and the influence of the graft-host junction (GHJ) on the graft survival rate. Data were retrospectively collected on patient demographics, visual outcomes, complications, and graft survival rate for 17 eyes of 16 patients who underwent DSAEK for failed PK. The graft survival rate was compared between the eyes when divided into a bump group and a well-aligned group according to the shape of the GHJ detected on anterior segment optical coherence tomography. The most common indication for initial PK was bullous keratopathy after glaucoma surgery (35.3%). Seven eyes (41.2%) were classified into the bump group and 10 eyes (58.8%) into the well-aligned group. The mean best-ever documented visual acuity (BDVA) after DSAEK was 0.33 logMAR. Postoperatively, almost 70% of eyes achieved a BDVA that was within 0.2 logMAR of their preoperative BDVA. Graft detachment occurred in 29.4% of eyes and primary graft failure in 17.6%. All primary failures occurred in the bump group. The cumulative graft survival rate was 82.3% at 1 year, 73.2% at 2 years, and 58.6% at 3 years. Graft failure was more likely in eyes in the bump group than in those in the well-aligned group (P = 0.037, Wilcoxon test). DSAEK for failed PK had a favorable outcome in this study. However, the GHJ should be assessed carefully before performing the procedure.

  13. Maximum likelihood estimates, from censored data, for mixed-Weibull distributions

    NASA Astrophysics Data System (ADS)

    Jiang, Siyuan; Kececioglu, Dimitri

    1992-06-01

    A new algorithm for estimating the parameters of mixed-Weibull distributions from censored data is presented. The algorithm follows the principle of maximum likelihood estimation (MLE) via the expectation-maximization (EM) algorithm, and it is derived for both postmortem and nonpostmortem time-to-failure data. It is concluded that the concept of the EM algorithm is easy to understand and apply (only elementary statistics and calculus are required). The log-likelihood function cannot decrease after an EM sequence; this important feature was observed in all of the numerical calculations. The MLEs of the nonpostmortem data were obtained successfully for mixed-Weibull distributions with up to 14 parameters in a 5-subpopulation mixed-Weibull distribution. Numerical examples indicate that some of the log-likelihood functions of the mixed-Weibull distributions have multiple local maxima; therefore, the algorithm should start at several initial guesses of the parameter set.
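
    In the likelihood such algorithms maximize, an observed failure contributes its density and a censored unit contributes its survival probability. A sketch of that censored-data log-likelihood for a single Weibull component, with a crude grid search standing in for the EM iterations (times are hypothetical; a mixture would need the full EM machinery):

```python
# Right-censored Weibull log-likelihood: failures contribute log f(t),
# censored units contribute log S(t), where for shape b and scale e
# f(t) = (b/e)(t/e)^(b-1) exp(-(t/e)^b) and S(t) = exp(-(t/e)^b).
import math

def weibull_censored_loglik(times, events, shape, scale):
    """events: 1 = failure observed at that time, 0 = censored there."""
    ll = 0.0
    for t, d in zip(times, events):
        z = (t / scale) ** shape
        if d:  # observed failure: log density
            ll += math.log(shape / scale) + (shape - 1) * math.log(t / scale) - z
        else:  # censored: log survival probability
            ll += -z
    return ll

# Hypothetical time-to-failure data; the last two units are censored
times  = [20.0, 35.0, 50.0, 65.0, 80.0, 80.0, 80.0]
events = [1, 1, 1, 1, 1, 0, 0]

# Crude grid search for the single-component MLE (illustration only)
best = max(((weibull_censored_loglik(times, events, b, e), b, e)
            for b in (0.5, 1.0, 1.5, 2.0, 2.5, 3.0)
            for e in range(30, 121, 10)), key=lambda x: x[0])
print(best)
```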

  14. Role of the Renin–Angiotensin System in the Pathogenesis of Intimal Hyperplasia: Therapeutic Potential for Prevention of Vein Graft Failure?

    PubMed Central

    Osgood, Michael J.; Harrison, David G.; Sexton, Kevin W.; Hocking, Kyle M.; Voskresensky, Igor V.; Komalavilas, Padmini; Cheung-Flynn, Joyce; Guzman, Raul J.; Brophy, Colleen M.

    2014-01-01

    The saphenous vein remains the most widely used conduit for peripheral and coronary revascularization despite a high rate of vein graft failure. The most common cause of vein graft failure is intimal hyperplasia. No agents have been proven to be successful for the prevention of intimal hyperplasia in human subjects. The renin–angiotensin system is essential in the regulation of vascular tone and blood pressure in physiologic conditions. However, this system mediates cardiovascular remodeling in pathophysiologic states. Angiotensin II is becoming increasingly recognized as a potential mediator of intimal hyperplasia. Drugs modulating the renin–angiotensin system include angiotensin-converting enzyme inhibitors and angiotensin receptor blockers. These drugs are powerful inhibitors of atherosclerosis and cardiovascular remodeling, and they are first-line agents for management of several medical conditions based on class I evidence that they delay progression of cardiovascular disease and improve survival. Several experimental models have demonstrated that these agents are capable of inhibiting intimal hyperplasia. However, there are no data supporting their role in prevention of intimal hyperplasia in patients with vein grafts. This review summarizes the physiology of the renin–angiotensin system, the role of angiotensin II in the pathogenesis of cardiovascular remodeling, the medical indications for these agents, and the experimental data supporting an important role of the renin–angiotensin system in the pathogenesis of intimal hyperplasia. PMID:22445245

  15. Heart rate turbulence predicts all-cause mortality and sudden death in congestive heart failure patients.

    PubMed

    Cygankiewicz, Iwona; Zareba, Wojciech; Vazquez, Rafael; Vallverdu, Montserrat; Gonzalez-Juanatey, Jose R; Valdes, Mariano; Almendral, Jesus; Cinca, Juan; Caminal, Pere; de Luna, Antoni Bayes

    2008-08-01

    Abnormal heart rate turbulence (HRT) has been documented as a strong predictor of total mortality and sudden death in postinfarction patients, but data in patients with congestive heart failure (CHF) are limited. The aim of this study was to evaluate the prognostic significance of HRT for predicting mortality in CHF patients in New York Heart Association (NYHA) class II-III. In 651 CHF patients with sinus rhythm enrolled into the MUSIC (Muerte Subita en Insuficiencia Cardiaca) study, the standard HRT parameters turbulence onset (TO) and slope (TS), as well as HRT categories, were assessed for predicting total mortality and sudden death. HRT was analyzable in 607 patients, mean age 63 years (434 male), 50% of ischemic etiology. During a median follow up of 44 months, 129 patients died, 52 from sudden death. Abnormal TS and HRT category 2 (HRT2) were independently associated with increased all-cause mortality (HR: 2.10, CI: 1.41 to 3.12, P <.001 and HR: 2.52, CI: 1.56 to 4.05, P <.001; respectively), sudden death (HR: 2.25, CI: 1.13 to 4.46, P = .021 for HRT2), and death due to heart failure progression (HR: 4.11, CI: 1.84 to 9.19, P <.001 for HRT2) after adjustment for clinical covariates in multivariate analysis. The prognostic value of TS for predicting total mortality was similar in various groups dichotomized by age, gender, NYHA class, left ventricular ejection fraction, and CHF etiology. TS was found to be predictive for total mortality only in patients with QRS > 120 ms. HRT is a potent risk predictor for both heart failure and arrhythmic death in patients with class II and III CHF.

  16. 42 CFR 482.80 - Condition of participation: Data submission, clinical experience, and outcome requirements for...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... transplants. (1) CMS will compare each transplant center's observed number of patient deaths and graft failures 1-year post-transplant to the center's expected number of patient deaths and graft failures 1-year... a center's patient and graft survival rates to be acceptable if: (i) A center's observed patient...

  17. 42 CFR 482.80 - Condition of participation: Data submission, clinical experience, and outcome requirements for...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... transplants. (1) CMS will compare each transplant center's observed number of patient deaths and graft failures 1-year post-transplant to the center's expected number of patient deaths and graft failures 1-year... a center's patient and graft survival rates to be acceptable if: (i) A center's observed patient...

  18. 42 CFR 482.80 - Condition of participation: Data submission, clinical experience, and outcome requirements for...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... will compare each transplant center's observed number of patient deaths and graft failures 1-year post-transplant to the center's expected number of patient deaths and graft failures 1-year post-transplant using... graft survival rates to be acceptable if: (i) A center's observed patient survival rate or observed...

  19. 42 CFR 482.82 - Condition of participation: Data submission, clinical experience, and outcome requirements for re...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... transplant center's observed number of patient deaths and graft failures 1-year post-transplant to the center's expected number of patient deaths and graft failures 1-year post-transplant using data contained... graft survival rates to be acceptable if: (i) A center's observed patient survival rate or observed...

  20. Statistical analysis tables for truncated or censored samples

    NASA Technical Reports Server (NTRS)

    Cohen, A. C.; Cooley, C. G.

    1971-01-01

    Compilation describes characteristics of truncated and censored samples, and presents six illustrations of practical use of tables in computing mean and variance estimates for normal distribution using selected samples.

  1. Estimation of distributional parameters for censored trace-level water-quality data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilliom, R.J.; Helsel, D.R.

    1984-01-01

    A recurring difficulty encountered in investigations of many metals and organic contaminants in ambient waters is that a substantial portion of water-sample concentrations are below limits of detection established by analytical laboratories. Several methods were evaluated for estimating distributional parameters for such censored data sets using only uncensored observations. Their reliabilities were evaluated by a Monte Carlo experiment in which small samples were generated from a wide range of parent distributions and censored at varying levels. Eight methods were used to estimate the mean, standard deviation, median, and interquartile range. Criteria were developed, based on the distribution of uncensored observations, for determining the best-performing parameter estimation method for any particular data set. The most robust method for minimizing error in censored-sample estimates of the four distributional parameters over all simulation conditions was the log-probability regression method. With this method, censored observations are assumed to follow the zero-to-censoring level portion of a lognormal distribution obtained by a least-squares regression between logarithms of uncensored concentration observations and their z scores. When method performance was separately evaluated for each distributional parameter over all simulation conditions, the log-probability regression method still had the smallest errors for the mean and standard deviation, but the lognormal maximum likelihood method had the smallest errors for the median and interquartile range. When data sets were classified prior to parameter estimation into groups reflecting their probable parent distributions, the ranking of estimation methods was similar, but the accuracy of error estimates was markedly improved over those without classification. 6 figs., 6 tabs.
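
    The log-probability regression idea described above can be sketched compactly: regress the logarithms of the uncensored concentrations on their normal scores, then read imputed values for the censored ranks off the fitted line. This simplified version assumes a single detection limit with the censored values occupying the lowest ranks, and uses Hazen plotting positions (illustrative only, not the paper's implementation):

```python
# Simplified log-probability regression (ROS-style) imputation for
# left-censored data with one detection limit. Hypothetical data.
import math
from statistics import NormalDist, mean

def ros_impute(uncensored, n_censored):
    """Return imputed censored values followed by the sorted uncensored ones."""
    n = len(uncensored) + n_censored
    obs = sorted(uncensored)
    # normal scores from Hazen plotting positions (i + 0.5) / n
    z = [NormalDist().inv_cdf((i + 0.5) / n) for i in range(n)]
    zu = z[n_censored:]              # ranks held by uncensored observations
    y = [math.log(v) for v in obs]   # regress log concentration on z
    zbar, ybar = mean(zu), mean(y)
    b = (sum((zi - zbar) * (yi - ybar) for zi, yi in zip(zu, y))
         / sum((zi - zbar) ** 2 for zi in zu))
    a = ybar - b * zbar
    # impute censored values from their (lowest) normal scores
    imputed = [math.exp(a + b * zi) for zi in z[:n_censored]]
    return imputed + obs

# Five measured concentrations plus three values reported as "<DL"
filled = ros_impute([1.2, 1.8, 2.5, 4.0, 6.3], n_censored=3)
print([round(v, 3) for v in filled])
```

Summary statistics are then taken over `filled`; the imputed values matter only in aggregate, not individually.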

  2. Evaluation of statistical treatments of left-censored environmental data using coincident uncensored data sets. II. Group comparisons

    USGS Publications Warehouse

    Antweiler, Ronald C.

    2015-01-01

    The main classes of statistical treatments that have been used to determine if two groups of censored environmental data arise from the same distribution are substitution methods, maximum likelihood (MLE) techniques, and nonparametric methods. These treatments along with using all instrument-generated data (IN), even those less than the detection limit, were evaluated by examining 550 data sets in which the true values of the censored data were known, and therefore “true” probabilities could be calculated and used as a yardstick for comparison. It was found that technique “quality” was strongly dependent on the degree of censoring present in the groups. For low degrees of censoring (<25% in each group), the Generalized Wilcoxon (GW) technique and substitution of √2/2 times the detection limit gave overall the best results. For moderate degrees of censoring, MLE worked best, but only if the distribution could be estimated to be normal or log-normal prior to its application; otherwise, GW was a suitable alternative. For higher degrees of censoring (each group >40% censoring), no technique provided reliable estimates of the true probability. Group size did not appear to influence the quality of the result, and no technique appeared to become better or worse than other techniques relative to group size. Finally, IN appeared to do very well relative to the other techniques regardless of censoring or group size.
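    As a hedged illustration of the simplest treatment compared above, the substitution method replaces censored values with √2/2 times the detection limit before a rank-based group comparison (shown here with a Mann-Whitney/Wilcoxon rank-sum test; this is not the Generalized Wilcoxon procedure itself, and all names and data are illustrative).

```python
# Substitution of (sqrt(2)/2) * detection limit for censored values,
# followed by a rank-sum comparison of two groups. Illustrative sketch only.
import numpy as np
from scipy.stats import mannwhitneyu

def substitute(values, detection_limit):
    """Replace values below the detection limit with (sqrt(2)/2) * DL."""
    v = np.asarray(values, dtype=float)
    return np.where(v < detection_limit, np.sqrt(2) / 2 * detection_limit, v)

rng = np.random.default_rng(1)
dl = 1.0  # hypothetical detection limit
group_a = substitute(rng.lognormal(0.0, 1.0, 40), dl)
group_b = substitute(rng.lognormal(0.5, 1.0, 40), dl)
stat, p = mannwhitneyu(group_a, group_b, alternative="two-sided")
```

    Because substitution assigns every censored value the same number, ties accumulate as censoring grows, which is one reason the paper finds it unreliable beyond low censoring levels.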

  3. Outcomes after aortic graft-to-graft anastomosis with an automated circular stapler: A novel approach.

    PubMed

    Idrees, Jay J; Yazdchi, Farhang; Soltesz, Edward G; Vekstein, Andrew M; Rodriguez, Christopher; Roselli, Eric E

    2016-10-01

    Patients with complex aortic disease often require multistaged repairs with numerous anastomoses. Manual suturing can be time consuming. To reduce ischemic time, a circular stapling device has been used to facilitate prosthetic graft-to-graft anastomoses. Our objectives are to describe this technique and assess outcomes. From February 2009 to May 2014, 44 patients underwent complex aortic repair with a circular end-to-end anastomosis (EEA) stapler at Cleveland Clinic. All patients had extensive aneurysms: 17 after ascending dissection repair, 10 chronic type B dissections, and 17 degenerative aneurysms. The stapler was used during total arch repair as an end-to-side anastomosis (n = 36; including first-stage elephant trunk [ET] in 32, frozen ET in 3) and as an end-to-end anastomosis during redo thoracoabdominal repair (n = 11). Three patients had the stapler used during both stages of repair. Patients underwent early and annual follow-up with computed tomography analysis. There were no bleeds, ruptures, or leaks at the stapled site, but 2 patients died. Complications included 7 reoperations not related to the site of stapled anastomosis and 6 tracheostomies, but there was no paralysis or renal failure. Mean circulatory arrest time was 16 ± 5 minutes. Mean follow-up was 26 ± 17 months and consisted of imaging before discharge, at 3 to 6 months, and at 1 year. Planned reinterventions included 21 second-stage ET completions: endovascular (n = 18) and open (n = 3). There were 4 late deaths. Use of an end-to-end anastomotic automated circular stapler is safe, effective, and durable in performing graft-to-graft anastomoses during complex thoracic aortic surgery. Further evaluation and refinement of this technique are warranted. Copyright © 2016 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.

  4. 42 CFR 482.82 - Condition of participation: Data submission, clinical experience, and outcome requirements for re...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... observed number of patient deaths and graft failures 1-year post-transplant to the center's expected number of patient deaths and graft failures 1-year post-transplant using data contained in the most recent...'s patient and graft survival rates to be acceptable if: (i) A center's observed patient survival...

  5. 42 CFR 482.80 - Condition of participation: Data submission, clinical experience, and outcome requirements for...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... transplant center's observed number of patient deaths and graft failures 1-year post-transplant to the center's expected number of patient deaths and graft failures 1-year post-transplant using the data.... (2) CMS will not consider a center's patient and graft survival rates to be acceptable if: (i) A...

  6. 42 CFR 482.82 - Condition of participation: Data submission, clinical experience, and outcome requirements for re...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... observed number of patient deaths and graft failures 1-year post-transplant to the center's expected number of patient deaths and graft failures 1-year post-transplant using data contained in the most recent...'s patient and graft survival rates to be acceptable if: (i) A center's observed patient survival...

  7. 42 CFR 482.82 - Condition of participation: Data submission, clinical experience, and outcome requirements for re...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... observed number of patient deaths and graft failures 1-year post-transplant to the center's expected number of patient deaths and graft failures 1-year post-transplant using data contained in the most recent...'s patient and graft survival rates to be acceptable if: (i) A center's observed patient survival...

  8. Obesity paradox and risk of sudden death in heart failure results from the MUerte Subita en Insuficiencia cardiaca (MUSIC) study.

    PubMed

    Gastelurrutia, Paloma; Pascual-Figal, Domingo; Vazquez, Rafael; Cygankiewicz, Iwona; Shamagian, Lillian Grigorian; Puig, Teresa; Ferrero, Andreu; Cinca, Juan; de Luna, Antoni Bayes; Bayes-Genis, Antoni

    2011-01-01

    Among patients with heart failure (HF), body mass index (BMI) has been inversely associated with mortality, giving rise to the so-called obesity paradox. The aim of this study was to examine the relationship between BMI and two modes of cardiac death: pump failure death and sudden death. Nine hundred seventy-nine patients with mild to moderate chronic symptomatic HF from the MUSIC (MUerte Subita en Insuficiencia Cardiaca) Study, a prospective, multicenter, and longitudinal study designed to assess risk predictors of cardiac mortality, were followed up for a median of 44 months. Independent predictors of death were identified by a multivariable Cox proportional hazards model. Higher BMI emerged as an independent predictor of all-cause mortality (hazard ratio [HR] = 0.94, 95% confidence interval [CI] = 0.91-0.97, P = .0003) and pump failure death (HR = 0.93, 95% CI = 0.88-0.98, P = .004). Sudden death accounted for 45% of deaths in obese patients, 53% in overweight patients, and 37% in lean patients. No significant relationship between BMI and sudden death was observed (HR = 0.97, 95% CI = 0.92-1.02, P = .28). The only independent predictors of sudden death were prior history of myocardial infarction (HR = 1.89, 95% CI = 1.23-2.90, P = .004), hypertension (HR = 1.66, 95% CI = 1.05-2.63, P = .03), left ventricular ejection fraction (HR = 0.88, 95% CI = 0.79-0.96, P = .006), and N-terminal pro-B-type natriuretic peptide (HR = 1.01, 95% CI = 1.00-1.02, P = .048). The obesity paradox in HF affects all-cause mortality and pump failure death but not sudden death. The risk of dying suddenly was similar across BMI categories in this cohort of ambulatory patients with HF.
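    Hazard ratios like those quoted above come from Cox proportional hazards models fit to censored follow-up data. A self-contained sketch (not the MUSIC study's actual analysis) of fitting a one-covariate Cox model by Newton-Raphson on the partial likelihood, assuming no tied event times; all variable names and the simulated data are illustrative:

```python
# Pure-numpy one-covariate Cox proportional hazards fit via Newton-Raphson
# on the partial likelihood (no ties assumed). Illustrative sketch only.
import numpy as np

def cox_fit(time, event, x, iters=25):
    """Return the partial-likelihood estimate of the log hazard ratio."""
    order = np.argsort(time)
    t, d, x = time[order], event[order], x[order]
    beta = 0.0
    for _ in range(iters):
        r = np.exp(beta * x)
        # Risk-set sums: everyone with time >= t_i (cumulate from the end).
        s0 = np.cumsum(r[::-1])[::-1]
        s1 = np.cumsum((r * x)[::-1])[::-1]
        s2 = np.cumsum((r * x * x)[::-1])[::-1]
        grad = np.sum(d * (x - s1 / s0))
        hess = -np.sum(d * (s2 / s0 - (s1 / s0) ** 2))
        beta -= grad / hess
    return beta

rng = np.random.default_rng(5)
n = 1000
x = rng.normal(size=n)                 # e.g. a BMI-like covariate, standardized
true_beta = -0.06                      # corresponds to HR ~ 0.94 per unit
T = rng.exponential(np.exp(-true_beta * x))   # event times, hazard exp(beta*x)
C = rng.exponential(2.0, n)                   # independent censoring times
time, event = np.minimum(T, C), (T <= C).astype(float)
beta_hat = cox_fit(time, event, x)     # hazard ratio per unit = exp(beta_hat)
```

    Since the log partial likelihood is concave in a single coefficient, Newton-Raphson from zero converges reliably here; real analyses (multiple covariates, ties) would use a dedicated survival package.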

  9. Predictors of contemporary coronary artery bypass grafting outcomes.

    PubMed

    Weisel, Richard D; Nussmeier, Nancy; Newman, Mark F; Pearl, Ronald G; Wechsler, Andrew S; Ambrosio, Giuseppe; Pitt, Bertram; Clare, Robert M; Pieper, Karen S; Mongero, Linda; Reece, Tammy L; Yau, Terrence M; Fremes, Stephen; Menasché, Philippe; Lira, Armando; Harrington, Robert A; Ferguson, T Bruce

    2014-12-01

    The study objective was to identify the predictors of outcomes in a contemporary cohort of patients from the Reduction in cardiovascular Events by acaDesine in patients undergoing CABG (RED-CABG) trial. Despite the increasing risk profile of patients who undergo coronary artery bypass grafting, morbidity and mortality have remained low, and identification of the current predictors of adverse outcomes may permit new treatments to further improve outcomes. The RED-CABG trial was a multicenter, randomized, double-blind, placebo-controlled study that determined that acadesine did not reduce adverse events in moderately high-risk patients undergoing nonemergency coronary artery bypass grafting. The primary efficacy end point was a composite of all-cause death, nonfatal stroke, or the need for mechanical support for severe left ventricular dysfunction through postoperative day 28. Logistic regression modeling with stepwise variable selection identified which prespecified baseline characteristics were associated with the primary outcome. A second logistic model included intraoperative variables as potential covariates. The 4 independent preoperative risk factors predictive of the composite end point were (1) a history of heart failure (odds ratio, 2.9); (2) increasing age (odds ratio, 1.033 per decade); (3) a history of peripheral vascular disease (odds ratio, 1.6); and (4) receiving aspirin before coronary artery bypass grafting (odds ratio, 0.5), which was protective. The duration of the cardiopulmonary bypass (odds ratio, 1.8) was the only intraoperative variable that contributed to adverse outcomes. Patients who had heart failure and preserved systolic function had a similar high risk of adverse outcomes as those with low ejection fractions, and new approaches may mitigate this risk. Recognition of patients with excessive atherosclerotic burden may permit perioperative interventions to improve their outcomes. 

  10. Detection of imminent vein graft occlusion: what is the optimal surveillance program?

    PubMed

    Tinder, Chelsey N; Bandyk, Dennis F

    2009-12-01

    The prediction of infrainguinal vein bypass failure remains an inexact judgment. Patient demographics, technical factors, and vascular laboratory graft surveillance testing are helpful in identifying a high-risk graft cohort. The optimal surveillance program to detect the bypass at risk for imminent occlusion continues to be developed, but required elements are known and include clinical assessment for new or changed limb ischemia symptoms, measurement of ankle and/or toe systolic pressure, and duplex ultrasound imaging of the bypass graft. Duplex ultrasound assessment of bypass hemodynamics may be the most accurate method to detect imminent vein graft occlusion. The finding of low graft flow during intraoperative assessment or at a scheduled surveillance study predicts failure; and if associated with an occlusive lesion, a graft revision can prolong patency. The most common abnormality producing graft failure is conduit stenosis caused by myointimal hyperplasia, and the majority can be repaired by an endovascular intervention. Frequency of testing to detect the failing bypass should be individualized to the patient, the type of arterial bypass, and prior duplex ultrasound scan findings. The focus of surveillance is on identification of the low-flow arterial bypass and timely repair of detected critical stenosis defined by duplex velocity spectra criteria of a peak systolic velocity >300 cm/s and a peak systolic velocity ratio across the stenosis >3.5, correlating with a >70% diameter-reducing stenosis. When conducted appropriately, a graft surveillance program should result in an unexpected graft failure rate of <3% per year.

  11. Genetically Modified Porcine Skin Grafts for Treatment of Severe Burn Injuries

    DTIC Science & Technology

    2011-07-01

    limited the usefulness of living porcine grafts, since the lack of blood supply soon led to desiccation and avascular necrosis. Failure of vascularization of xenografts is largely...

  12. Analysis by early angiography of right internal thoracic artery grafting via the transverse sinus : predictors of graft failure.

    PubMed

    Ura, M; Sakata, R; Nakayama, Y; Arai, Y; Oshima, S; Noda, K

    2000-02-15

    There has been debate regarding whether technically demanding right internal thoracic artery (RITA) grafting via the transverse sinus can be extensively applied to patients in high-risk groups, such as patients with a small body size, elderly patients, and women with relatively smaller coronary artery and internal thoracic artery (ITA) diameters. Of the 1456 patients who underwent isolated coronary artery bypass grafting between January 1989 and December 1998 at Kumamoto Central Hospital, 393 patients (mean age, 62.4+/-9.0 years) with the RITA anastomosed to the major branches of the circumflex artery were studied. Left ITA grafting was performed in 384 patients, and in 369, the in situ left ITA was anastomosed to the left anterior descending coronary artery using standard methods. Early postoperative angiography was performed in 381 patients. The RITA was occluded in 4 patients, and string-like arteries and significant stenoses were present in 11 and 7 patients, respectively; RITA graft patency was thus 94.1%. Of the preoperative variables and angiographic data, simple and multiple logistic regression analyses identified decreased severity of native stenosis, diffuse sclerosis of native vessels, and residual side branches of the ITA as independent predictors of nonfunctional grafts. The method of ITA grafting did not influence the patency of the graft. The excellent patency rate demonstrated by this study, the largest angiographic study to date of RITA grafting via the transverse sinus, indicates that this technique can provide reliable revascularization of the left ventricle and that it has the potential to be applied to a wide variety of patients with diseased circumflex arteries.

  13. Alveolar graft in the cleft lip and palate patient: Review of 104 cases

    PubMed Central

    Tobella-Camps, María L.; Rivera-Baró, Alejandro

    2014-01-01

    Introduction: Alveolar bone grafting is a vital part of the rehabilitation of cleft patients. The factors that have been most frequently associated with the success of the graft are the age at grafting and the pre-grafting orthodontic treatment. Objectives: 1) Describe the cases of alveolar bone grafts performed at the Maxillofacial Unit of Hospital Sant Joan de Déu, Barcelona (HSJD); and 2) Analyze the success/failure of alveolar grafts and related variables. Material and Methods: Descriptive retrospective study using a sample of 104 patients who underwent a secondary alveolar graft at the Craniofacial Unit of HSJD between 1998 and 2012. The graft was done by the same surgeon in all patients using bone from the iliac crest. Results: 70% of the patients underwent the procedure before the age of 15 (median 14.45 years); 70% of the graft patients underwent pre-graft maxillary expansion. A total of 100 cases were recorded as successful (median age of 14.58 years, 68 underwent pre-graft expansion) and only 4 were recorded as failures (median age of 17.62 years, 3 underwent pre-graft expansion). We did not find statistically significant differences in age at the time of grafting or pre-surgical expansion when comparing the success and failure groups. We found the success rate of the graft to be 96.2%. Conclusions: The number of failures was too small to establish a statistically significant conclusion in our sample regarding the age at grafting and pre-grafting expansion. The use of alveolar bone grafting from the iliac crest has a very high success rate with a very low incidence of complications. Existing controversies regarding secondary bone grafting and the wide range of success rates found in the literature suggest that it is necessary to establish a specific treatment protocol that ensures the success of this procedure. Key words: Alveolar graft, cleft lip and palate, alveolar cleft, alveolar defect. PMID:24880440

  14. Serum Uromodulin: A Biomarker of Long-Term Kidney Allograft Failure.

    PubMed

    Bostom, Andrew; Steubl, Dominik; Garimella, Pranav S; Franceschini, Nora; Roberts, Mary B; Pasch, Andreas; Ix, Joachim H; Tuttle, Katherine R; Ivanova, Anastasia; Shireman, Theresa; Kim, S Joseph; Gohh, Reginald; Weiner, Daniel E; Levey, Andrew S; Hsu, Chi-Yuan; Kusek, John W; Eaton, Charles B

    2018-01-01

    Uromodulin is a kidney-derived glycoprotein and putative tubular function index. Lower serum uromodulin was recently associated with increased risk for kidney allograft failure in a preliminary, longitudinal single-center European study involving 91 kidney transplant recipients (KTRs). The Folic Acid for Vascular Outcome Reduction in Transplantation (FAVORIT) trial is a completed, large, multiethnic controlled clinical trial cohort, which studied chronic, stable KTRs. We conducted a case cohort analysis using a randomly selected subset of patients (random subcohort, n = 433), and all individuals who developed kidney allograft failure (cases, n = 226) during follow-up. Serum uromodulin was determined in this total of n = 613 FAVORIT trial participants at randomization. Death-censored kidney allograft failure was the study outcome. The 226 kidney allograft failures occurred during a median surveillance of 3.2 years. Unadjusted, weighted Cox proportional hazards modeling revealed that lower serum uromodulin, tertile 1 vs. tertile 3, was associated with a threefold greater risk for kidney allograft failure (hazard ratio [HR] 3.20, 95% CI 2.05-5.01). This association was attenuated but persisted at twofold greater risk for allograft failure, after adjustment for age, sex, smoking, allograft type and vintage, prevalent diabetes mellitus and cardiovascular disease (CVD), total/high-density lipoprotein cholesterol ratio, systolic blood pressure, estimated glomerular filtration rate, and natural log urinary albumin/creatinine: HR 2.00, 95% CI 1.06-3.77. Lower serum uromodulin, a possible indicator of less well-preserved renal tubular function, remained associated with greater risk for kidney allograft failure, after adjustment for major, established clinical kidney allograft failure and CVD risk factors, in a large, multiethnic cohort of long-term, stable KTRs. © 2018 S. Karger AG, Basel.

  15. Evaluation of pre-implantation kidney biopsies: comparison of Banff criteria to a morphometric approach.

    PubMed

    Lopes, José António; Moreso, Francesc; Riera, Luis; Carrera, Marta; Ibernon, Meritxell; Fulladosa, Xavier; Grinyó, Josep Maria; Serón, Daniel

    2005-04-01

    Donor glomerulosclerosis, interstitial fibrosis, and fibrous intimal thickening correlate with graft outcome. We evaluated chronic lesions in donor biopsies according to Banff criteria and with a morphometric technique to ascertain their predictive value for graft outcome. We evaluated 77 cadaveric donor biopsies according to Banff criteria. Glomerulosclerosis was expressed as the percentage of globally sclerotic glomeruli. The following morphometric parameters were obtained: cortical interstitial volume fraction (Vvint/c), cortical glomerular volume fraction (Vvglom/c), mean glomerular volume (Vg), mean and maximal intimal arterial volume fraction (Vvintima/art), and Vvintima/art of the largest artery. We evaluated the correlation of histologic lesions with delayed graft function, 3-month glomerular filtration rate (GFR), and death-censored graft survival. Multivariate logistic regression showed that delayed graft function was associated with cv score (relative risk [RR] 4.2; 95% CI, 1.1 to 16.0) and glomerulosclerosis (RR 1.06; 95% CI, 1.01 to 1.13). Stepwise regression showed that Vvint/c and glomerulosclerosis were independent predictors of 3-month GFR (R = 0.62, P = 0.0001). Repeated analysis not considering morphometric parameters showed that glomerulosclerosis, cv score, and ci score were independent predictors of 3-month GFR (R = 0.64, P = 0.0001). A donor chronic damage score was generated considering glomerulosclerosis, cv score, and ci score. This score, after adjusting for clinical variables, was associated with 3-month GFR (R = 0.71, P < 0.0001) and death-censored graft survival (RR 2.2; 95% CI, 1.3 to 3.7). Combined evaluation of donor glomerulosclerosis and chronic vascular and interstitial damage according to Banff criteria allows a precise prediction of graft outcome. Morphometric evaluation of donor biopsies does not improve the predictive value of semiquantitative grading.

  16. Censoring distances based on labeled cortical distance maps in cortical morphometry.

    PubMed

    Ceyhan, Elvan; Nishino, Tomoyuki; Alexopolous, Dimitrios; Todd, Richard D; Botteron, Kelly N; Miller, Michael I; Ratnanather, J Tilak

    2013-01-01

    It has been demonstrated that shape differences in cortical structures may be manifested in neuropsychiatric disorders. Such morphometric differences can be measured by labeled cortical distance mapping (LCDM) which characterizes the morphometry of the laminar cortical mantle of cortical structures. LCDM data consist of signed/labeled distances of gray matter (GM) voxels with respect to GM/white matter (WM) surface. Volumes and other summary measures for each subject and the pooled distances can help determine the morphometric differences between diagnostic groups, however they do not reveal all the morphometric information contained in LCDM distances. To extract more information from LCDM data, censoring of the pooled distances is introduced for each diagnostic group where the range of LCDM distances is partitioned at a fixed increment size; and at each censoring step, the distances not exceeding the censoring distance are kept. Censored LCDM distances inherit the advantages of the pooled distances but also provide information about the location of morphometric differences which cannot be obtained from the pooled distances. However, at each step, the censored distances aggregate, which might confound the results. The influence of data aggregation is investigated with an extensive Monte Carlo simulation analysis and it is demonstrated that this influence is negligible. As an illustrative example, GM of ventral medial prefrontal cortices (VMPFCs) of subjects with major depressive disorder (MDD), subjects at high risk (HR) of MDD, and healthy control (Ctrl) subjects are used. A significant reduction in laminar thickness of the VMPFC in MDD and HR subjects is observed compared to Ctrl subjects. Moreover, the GM LCDM distances (i.e., locations with respect to the GM/WM surface) for which these differences start to occur are determined. The methodology is also applicable to LCDM-based morphometric measures of other cortical structures affected by disease.
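    The stepwise censoring of pooled distances described above can be sketched as follows; the increment, the rank-sum test used for the group comparison, and the simulated distances are illustrative assumptions rather than the authors' exact methodology.

```python
# At each censoring distance, keep only pooled LCDM distances not exceeding
# it and compare the two groups. Illustrative sketch with simulated data.
import numpy as np
from scipy.stats import mannwhitneyu

def censored_comparisons(dist_a, dist_b, increment=0.5):
    """Partition the distance range at a fixed increment; at each step,
    compare the distances not exceeding the censoring distance."""
    lo = min(dist_a.min(), dist_b.min())
    hi = max(dist_a.max(), dist_b.max())
    results = []
    c = lo + increment
    while c <= hi + increment:
        a = dist_a[dist_a <= c]
        b = dist_b[dist_b <= c]
        if len(a) > 1 and len(b) > 1:
            stat, p = mannwhitneyu(a, b, alternative="two-sided")
            results.append((c, p))  # (censoring distance, p-value)
        c += increment
    return results

rng = np.random.default_rng(4)
grp_mdd = rng.normal(2.0, 1.0, 300)   # hypothetical pooled distances, group 1
grp_ctrl = rng.normal(2.4, 1.0, 300)  # hypothetical pooled distances, group 2
res = censored_comparisons(grp_mdd, grp_ctrl)
```

    Scanning the censoring distance at which group differences first become significant is what localizes the morphometric difference relative to the GM/WM surface.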

  17. Censored Glauber Dynamics for the Mean Field Ising Model

    NASA Astrophysics Data System (ADS)

    Ding, Jian; Lubetzky, Eyal; Peres, Yuval

    2009-11-01

    We study Glauber dynamics for the Ising model on the complete graph on n vertices, known as the Curie-Weiss model. It is well known that at high temperature (β < 1) the mixing time is Θ(n log n), whereas at low temperature (β > 1) it is exp(Θ(n)). Recently, Levin, Luczak and Peres considered a censored version of this dynamics, which is restricted to non-negative magnetization. They proved that for fixed β > 1, the mixing time of this model is Θ(n log n), analogous to the high-temperature regime of the original dynamics. Furthermore, they showed cutoff for the original dynamics for fixed β < 1. The question whether the censored dynamics also exhibits cutoff remained unsettled. In a companion paper, we extended the results of Levin et al. into a complete characterization of the mixing time for the Curie-Weiss model. Namely, we found a scaling window of order 1/√n around the critical temperature β_c = 1, beyond which there is cutoff at high temperature. However, determining the behavior of the censored dynamics outside this critical window seemed significantly more challenging. In this work we answer the above question in the affirmative, and establish the cutoff point and its window for the censored dynamics beyond the critical window, thus completing its analogy to the original dynamics at high temperature. Namely, if β = 1 + δ for some δ > 0 with δ²n → ∞, then the mixing time has order (n/δ) log(δ²n). The cutoff constant is 1/2 + [2(ζ²β/δ − 1)]⁻¹, where ζ is the unique positive root of g(x) = tanh(βx) − x, and the cutoff window has order n/δ.
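    A toy simulation of the censored dynamics studied here, under the assumption that censoring is implemented by reflecting the chain whenever the magnetization turns negative (as in the restriction to non-negative magnetization described above); the parameter values are illustrative.

```python
# Censored heat-bath Glauber dynamics for the Curie-Weiss model: a uniformly
# chosen spin is resampled from its conditional distribution given the mean
# field, and any move to negative magnetization is reflected back.
import numpy as np

def censored_glauber(n=200, beta=1.2, steps=20000, seed=2):
    rng = np.random.default_rng(seed)
    spins = np.ones(n)  # start from the all-plus configuration
    for _ in range(steps):
        i = rng.integers(n)
        # Mean field acting on site i (exclude the site itself).
        field = (spins.sum() - spins[i]) / n
        p_plus = 1.0 / (1.0 + np.exp(-2.0 * beta * field))
        spins[i] = 1.0 if rng.random() < p_plus else -1.0
        if spins.sum() < 0:       # censoring: reflect to keep the
            spins = -spins        # magnetization non-negative
    return spins.mean()           # final magnetization

m = censored_glauber()
```

    For β > 1 the magnetization should settle near the positive root of tanh(βm) = m rather than collapsing to zero, which is the low-temperature behavior the censoring is designed to probe.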

  18. Improving National Results in Liver Transplantation Using Grafts From Donation After Cardiac Death Donors.

    PubMed

    Croome, Kristopher P; Lee, David D; Keaveny, Andrew P; Taner, C Burcin

    2016-12-01

    Published reports describing the national experience with liver grafts from donation after cardiac death (DCD) donors have resulted in reservations with their widespread utilization. The present study aimed to investigate if temporal improvements in outcomes have been observed on a national level and to determine if donor and recipient selection have been modified in a fashion consistent with published data on DCD use in liver transplantation (LT). Patients undergoing DCD LT between 2003 and 2014 were obtained from the United Network of Organ Sharing Standard Transplant Analysis and Research file and divided into 3 equal eras based on the date of DCD LT: era 1 (2003-2006), era 2 (2007-2010), and era 3 (2011-2014). Improvement in graft survival was seen between era 1 and era 2 (P = 0.001) and between era 2 and era 3 (P < 0.001). Concurrently, an increase in the proportion of patients with hepatocellular carcinoma and a decrease in critically ill patients, retransplant recipients, donor age, warm ischemia time greater than 30 minutes and cold ischemic time also occurred over the same period. On multivariate analysis, significant predictors of graft survival included: recipient age, biologic MELD score, recipient on ventilator, recipient hepatitis C virus + serology, donor age and cold ischemic time. In addition, even after adjustment for all of the aforementioned variables, both era 2 (hazard ratio, 0.81; confidence interval, 0.69-0.94; P = 0.007), and era 3 (hazard ratio, 0.61; confidence interval, 0.5-0.73; P < 0.001) had a protective effect compared to era 1. The national outcomes for DCD LT have improved over the last 12 years. This change was associated with modifications in both recipient and donor selection. Furthermore, an era effect was observed, even after adjustment for all recipient and donor variables on multivariate analysis.

  19. Access to Heart Transplantation: A Proper Analysis of the Competing Risks of Death and Transplantation Is Required to Optimize Graft Allocation.

    PubMed

    Cantrelle, Christelle; Legeai, Camille; Latouche, Aurélien; Tuppin, Philippe; Jasseron, Carine; Sebbag, Laurent; Bastien, Olivier; Dorent, Richard

    2017-08-01

    Heart allocation systems are usually urgency-based, offering grafts to candidates at high risk of waitlist mortality. In the context of a revision of the heart allocation rules, we determined observed predictors of 1-year waitlist mortality in France, considering the competing risk of transplantation, to determine which candidate subgroups are favored or disadvantaged by the current allocation system. Patients registered on the French heart waitlist between 2010 and 2013 were included. Cox cause-specific hazards and Fine and Gray subdistribution hazards were used to determine candidate characteristics associated with waitlist mortality and access to transplantation. Of the 2053 candidates, 7 variables were associated with 1-year waitlist mortality by the Fine and Gray method including 4 candidate characteristics related to heart failure severity (hospitalization at listing, serum natriuretic peptide level, systolic pulmonary artery pressure, and glomerular filtration rate) and 3 characteristics not associated with heart failure severity but with lower access to transplantation (blood type, age, and body mass index). Observed waitlist mortality for candidates on mechanical circulatory support was like that of others. The heart allocation system strongly modifies the risk of pretransplant mortality related to heart failure severity. An in-depth competing risk analysis is therefore a more appropriate method to evaluate graft allocation systems. This knowledge should help to prioritize candidates in the context of a limited donor pool.

  20. Liver graft preservation using perfluorocarbon improves the outcomes of simulated donation after cardiac death liver transplantation in rats.

    PubMed

    Okumura, Shinya; Uemura, Tadahiro; Zhao, Xiangdong; Masano, Yuki; Tsuruyama, Tatsuaki; Fujimoto, Yasuhiro; Iida, Taku; Yagi, Shintaro; Bezinover, Dmitri; Spiess, Bruce; Kaido, Toshimi; Uemoto, Shinji

    2017-09-01

    The outcomes of liver transplantation (LT) from donation after cardiac death (DCD) donors remain poor due to severe warm ischemia injury. Perfluorocarbon (PFC) is a novel compound with high oxygen carrying capacity. In the present study, a rat model simulating DCD LT was used, and the impact of improved graft oxygenation provided by PFC addition on liver ischemia/reperfusion injury (IRI) and survival after DCD LT was investigated. Orthotopic liver transplants were performed in male Lewis rats, using DCD liver grafts preserved with cold University of Wisconsin (UW) solution in the control group and preserved with cold oxygenated UW solution with addition of 20% PFC in the PFC group. For experiment I, in a 30-minute donor warm ischemia model, postoperative graft injury was analyzed at 3 and 6 hours after transplantation. For experiment II, in a 50-minute donor warm ischemia model, the postoperative survival was assessed. For experiment I, the levels of serum aspartate aminotransferase, alanine aminotransferase, hyaluronic acid, malondialdehyde, and several inflammatory cytokines were significantly lower in the PFC group. The hepatic expression levels of tumor necrosis factor α and interleukin 6 were significantly lower, and the expression level of heme oxygenase 1 was significantly higher in the PFC group. Histological analysis showed significantly less necrosis and apoptosis in the PFC group. Sinusoidal endothelial cells and microvilli of the bile canaliculi were well preserved in the PFC group. For experiment II, the postoperative survival rate was significantly improved in the PFC group. In conclusion, graft preservation with PFC attenuated liver IRI and improved postoperative survival. This graft preservation protocol might be a new therapeutic option to improve the outcomes of DCD LT. Liver Transplantation 23 1171-1185 2017 AASLD. © 2017 by the American Association for the Study of Liver Diseases.

  1. Role of donor hemodynamic trajectory in determining graft survival in liver transplantation from donation after circulatory death donors.

    PubMed

    Firl, Daniel J; Hashimoto, Koji; O'Rourke, Colin; Diago-Uso, Teresa; Fujiki, Masato; Aucejo, Federico N; Quintini, Cristiano; Kelly, Dympna M; Miller, Charles M; Fung, John J; Eghtesad, Bijan

    2016-11-01

    Donation after circulatory death (DCD) donors show heterogeneous hemodynamic trajectories following withdrawal of life support. The impact of hemodynamics in DCD liver transplant is unclear, and objective measures of graft viability would ease transplant surgeons' decision making and inform safe expansion of the donor organ pool. This retrospective study tested whether hemodynamic trajectories were associated with transplant outcomes in DCD liver transplantation (n = 87). Using longitudinal clustering statistical techniques, we phenotyped DCD donors based on hemodynamic trajectory for both mean arterial pressure (MAP) and peripheral oxygen saturation (SpO2) following withdrawal of life support. Donors were categorized into 3 clusters: those who gradually decline after withdrawal of life support (cluster 1), those who maintain stable hemodynamics followed by rapid decline (cluster 2), and those who decline rapidly (cluster 3). Clustering outputs were used to compare characteristics and transplant outcomes. Cox proportional hazards modeling revealed that hepatocellular carcinoma (hazard ratio [HR] = 2.53; P = 0.047), cold ischemia time (HR = 1.50 per hour; P = 0.027), and MAP cluster 1 (HR = 3.13; P = 0.021) were associated with increased risk of graft loss, but not SpO2 cluster (P = 0.172) or donor warm ischemia time (DWIT; P = 0.154). Despite longer DWIT, MAP cluster 2 and SpO2 cluster 2 showed graft survival similar to MAP cluster 3 and SpO2 cluster 3, respectively. In conclusion, despite heterogeneity in hemodynamic trajectories, DCD donors can be categorized into 3 clinically meaningful subgroups that help predict graft prognosis. Further studies should confirm the utility of liver grafts from cluster 2. Liver Transplantation 22 1469-1481 2016 AASLD. © 2016 by the American Association for the Study of Liver Diseases.

  2. Influence assessment in censored mixed-effects models using the multivariate Student’s-t distribution

    PubMed Central

    Matos, Larissa A.; Bandyopadhyay, Dipankar; Castro, Luis M.; Lachos, Victor H.

    2015-01-01

    In biomedical studies on HIV RNA dynamics, viral loads generate repeated measures that are often subject to upper and lower detection limits, and hence these responses are either left- or right-censored. Linear and non-linear mixed-effects censored (LMEC/NLMEC) models are routinely used to analyse these longitudinal data, with normality assumptions for the random effects and residual errors. However, the derived inference may not be robust when these underlying normality assumptions are questionable, especially in the presence of outliers and thick tails. Motivated by this, Matos et al. (2013b) recently proposed an exact EM-type algorithm for LMEC/NLMEC models using a multivariate Student’s-t distribution, with closed-form expressions at the E-step. In this paper, we develop influence diagnostics for LMEC/NLMEC models using the multivariate Student’s-t density, based on the conditional expectation of the complete-data log-likelihood. This partially eliminates the complexity associated with the approach of Cook (1977, 1986) for censored mixed-effects models. The new methodology is illustrated via an application to a longitudinal HIV dataset. In addition, a simulation study explores the accuracy of the proposed measures in detecting possible influential observations for heavy-tailed censored data under different perturbation and censoring schemes. PMID:26190871

  3. Double inverse-weighted estimation of cumulative treatment effects under nonproportional hazards and dependent censoring.

    PubMed

    Schaubel, Douglas E; Wei, Guanghui

    2011-03-01

    In medical studies of time-to-event data, nonproportional hazards and dependent censoring are very common issues when estimating the treatment effect. A traditional method for dealing with time-dependent treatment effects is to model the time-dependence parametrically. Limitations of this approach include the difficulty of verifying the correctness of the specified functional form and the fact that, in the presence of a treatment effect that varies over time, investigators are usually interested in the cumulative rather than the instantaneous treatment effect. In many applications, censoring time is not independent of event time. Therefore, we propose methods for estimating the cumulative treatment effect in the presence of nonproportional hazards and dependent censoring. Three measures are proposed, including the ratio of cumulative hazards, relative risk, and difference in restricted mean lifetime. For each measure, we propose a double inverse-weighted estimator, constructed by first using inverse probability of treatment weighting (IPTW) to balance the treatment-specific covariate distributions, then using inverse probability of censoring weighting (IPCW) to overcome the dependent censoring. The proposed estimators are shown to be consistent and asymptotically normal. We study their finite-sample properties through simulation. The proposed methods are used to compare kidney wait-list mortality by race. © 2010, The International Biometric Society.
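
    The double inverse-weighting idea described above can be sketched in a few lines. This is a minimal illustration on simulated data with known (rather than estimated) weight models; all variable names and the data-generating mechanism are hypothetical, not from the article.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical data: one confounder x, binary treatment z, and latent
# event and censoring times that both depend on x (dependent censoring).
x = rng.normal(size=n)
p_treat = 1.0 / (1.0 + np.exp(-x))                 # true propensity score
z = rng.binomial(1, p_treat)
t_event = rng.exponential(scale=np.exp(0.5 * z - 0.3 * x))
t_cens = rng.exponential(scale=np.exp(-0.2 * x))
time = np.minimum(t_event, t_cens)
delta = (t_event <= t_cens).astype(int)            # 1 = event observed

# Step 1 (IPTW): weight by the inverse probability of the received
# treatment to balance the treatment-specific covariate distributions.
w_treat = np.where(z == 1, 1.0 / p_treat, 1.0 / (1.0 - p_treat))

# Step 2 (IPCW): weight observed events by the inverse probability of
# remaining uncensored; here the censoring model is known to be
# exponential with rate exp(0.2 * x), so G(t | x) = exp(-t * exp(0.2 * x)).
G = np.exp(-time * np.exp(0.2 * x))
w_cens = np.where(delta == 1, 1.0 / np.clip(G, 1e-3, None), 0.0)

# Combined weight applied to each observed event when estimating
# cumulative quantities such as treatment-specific cumulative hazards.
w = w_treat * w_cens
```

    In the article both models are estimated from the data (e.g., a logistic propensity model and a censoring hazard model); plugging in the known truths keeps the sketch short.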

  4. A review of statistical issues with progression-free survival as an interval-censored time-to-event endpoint.

    PubMed

    Sun, Xing; Li, Xiaoyun; Chen, Cong; Song, Yang

    2013-01-01

    The frequent rise of interval-censored time-to-event data in randomized clinical trials (e.g., progression-free survival [PFS] in oncology) challenges statistical researchers in the pharmaceutical industry in various ways. These challenges exist in both trial design and data analysis. Conventional statistical methods that treat intervals as fixed points, as generally practiced by the pharmaceutical industry, sometimes yield inferior, or in extreme cases even flawed, analysis results for interval-censored data. In this article, we examine the limitations of these standard methods under typical clinical trial settings and further review and compare several existing nonparametric likelihood-based methods for interval-censored data, methods that are more sophisticated but also more robust. Trial design issues involved with interval-censored data comprise another topic explored in this article. Unlike for right-censored survival data, the expected sample size or power for a trial with interval-censored data relies heavily on the parametric distribution of the baseline survival function as well as the frequency of assessments. There can be substantial power loss in trials with interval-censored data if the assessments are very infrequent. Such an additional dependency controverts many fundamental assumptions and principles in conventional survival trial designs, especially the group sequential design (e.g., the concept of information fraction). In this article, we discuss these fundamental changes and available tools to work around their impacts. Although progression-free survival is often used as a discussion point in the article, the general conclusions are equally applicable to other interval-censored time-to-event endpoints.
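
    The dependence on assessment frequency noted above is easy to demonstrate by simulation. The sketch below (hypothetical time scale and visit schedules, not from the article) records progression at the first scheduled visit after it occurs, mimicking the conventional "fixed point" convention, and shows how the recorded times inflate as assessments become less frequent.

```python
import numpy as np

rng = np.random.default_rng(1)
true_t = rng.exponential(scale=12.0, size=10_000)  # months to progression

def recorded_time(visit_gap):
    # Progression is only detected at the first scheduled visit after it
    # occurs, so the conventional analysis records the interval's right end.
    return np.ceil(true_t / visit_gap) * visit_gap

for gap in (1.0, 3.0, 6.0):
    bias = recorded_time(gap).mean() - true_t.mean()
    print(f"visits every {gap} months: mean recorded time inflated by {bias:.2f} months")
```

    The inflation grows with the visit gap; likelihood-based interval-censored methods avoid this by modeling the interval itself rather than a fixed point.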

  5. Cure rate model with interval censored data.

    PubMed

    Kim, Yang-Jin; Jhun, Myoungshic

    2008-01-15

    In cancer trials, a significant fraction of patients can be cured, that is, the disease is completely eliminated, so that it never recurs. In general, treatments are developed both to increase the patients' chances of being cured and to prolong the survival time of non-cured patients. A cure rate model represents a combination of a cure fraction and a survival model, and can be applied to many clinical studies over several types of cancer. In this article, the cure rate model is considered for interval-censored data composed of two time points that bracket the event time of interest. Interval-censored data commonly occur in studies of diseases that often progress without symptoms, requiring clinical evaluation for detection (Encyclopedia of Biostatistics. Wiley: New York, 1998; 2090-2095). In our study, an approximate likelihood approach suggested by Goetghebeur and Ryan (Biometrics 2000; 56:1139-1144) is used to derive the likelihood for interval-censored data. In addition, a frailty model is introduced to characterize the association between the cure fraction and the survival model. In particular, the positive association between the cure fraction and the survival time is incorporated by imposing a common normal frailty effect. The EM algorithm is used to estimate parameters, and a multiple imputation based on the profile likelihood is adopted for variance estimation. The approach is applied to a smoking cessation study in which the event of interest is smoking relapse, and several covariates, including an intensive care treatment, are found to be effective for both the occurrence of relapse and the non-smoking duration. Copyright (c) 2007 John Wiley & Sons, Ltd.

  6. Evaluation of synthetic vascular grafts in a mouse carotid grafting model.

    PubMed

    Chan, Alex H P; Tan, Richard P; Michael, Praveesuda L; Lee, Bob S L; Vanags, Laura Z; Ng, Martin K C; Bursill, Christina A; Wise, Steven G

    2017-01-01

    Current animal models for the evaluation of synthetic grafts lack many of the molecular tools and transgenic studies available to other branches of biology. A mouse model of vascular grafting would allow for the study of the molecular mechanisms of graft failure, including in the context of clinically relevant disease states. In this study, we comprehensively characterise a sutureless grafting model that facilitates the evaluation of synthetic grafts in the mouse carotid artery. Using conduits electrospun from polycaprolactone (PCL), we show the gradual development of a significant neointima within 28 days, found to be greatest at the anastomoses. Histological analysis showed temporal increases in smooth muscle cell and collagen content within the neointima, demonstrating its maturation. Endothelialisation of the PCL grafts, assessed by scanning electron microscopy (SEM) analysis and CD31 staining, was near complete within 28 days, together replicating two critical aspects of graft performance. To further demonstrate the potential of this mouse model, we used longitudinal non-invasive tracking of bone-marrow mononuclear cells from a transgenic mouse strain with a dual reporter construct encoding both luciferase and green fluorescent protein (GFP). This enabled characterisation of mononuclear cell homing and engraftment to PCL using bioluminescence imaging and histological staining over time (7, 14 and 28 days). We observed peak luminescence at 7 days post-graft implantation that persisted until sacrifice at 28 days. Collectively, we have established and characterised a high-throughput model of grafting that allows for the evaluation of key clinical drivers of graft performance.

  7. Has the survival of the graft improved after renal transplantation in the era of modern immunosuppression?

    PubMed

    Moreso, Francesc; Hernández, Domingo

    2013-01-18

    The introduction of new immunosuppressant drugs in recent years has allowed for a reduction in acute rejection rates along with highly significant improvements in short-term kidney transplantation results. Nonetheless, this improvement has not translated into equally significant changes in long-term results. Late graft failure thus continues to be a frequent cause of readmission onto dialysis programmes and re-entry onto the waiting list. Multiple entities of immunological and non-immunological origin act together and lead to chronic allograft dysfunction. The characteristics of the transplanted organ are a major determinant of graft survival, and although various algorithms have been designed to grade the risk of the donor organ and assign the most suitable recipient accordingly, they are applied in the clinical setting only under exceptional circumstances. Characterising, for each patient, the immune factors (clinical and subclinical rejection, reactivation of dormant viral infections, adherence to treatment) and non-immune factors (hypertension, diabetes, anaemia, dyslipidaemia) that contribute to chronic allograft dysfunction could allow us to intervene more effectively and delay the progress of such processes. Therefore, identifying the causes of graft failure and its risk factors, applying predictive models, and intervening in causal factors could constitute strategies for improving kidney transplantation results in terms of survival. This review analyses some of the evidence conditioning graft failure as well as related therapeutic and prognostic aspects: 1) the magnitude of the problem and causes of graft failure; 2) identification of graft failure risk factors; 3) therapeutic strategies for reducing graft failure; and 4) graft failure prediction.

  8. New therapeutic possibilities for vein graft disease in the post-edifoligide era.

    PubMed

    Cai, Xinjiang; Freedman, Neil J

    2006-07-01

    Vein graft neointimal hyperplasia involves proliferation and migration of vascular smooth muscle cells into the vessel intima, and ultimately engenders accelerated atherosclerosis and vein graft failure. Since a myriad of stimuli provoke smooth muscle cell proliferation, molecular therapies for vein graft disease have targeted mechanisms fundamental to all cell proliferation - the 'cell-cycle' machinery. Preclinically, the most successful of these therapies has been edifoligide (E2F decoy), a double-stranded oligodeoxynucleotide that binds to the transcription factor known as E2F. Recently, PRoject of Ex vivo vein GRaft Engineering via Transfection (PREVENT) III and IV demonstrated that edifoligide failed to benefit human vein grafts employed to treat lower-extremity ischemia and coronary heart disease, respectively. The clinical failure of edifoligide calls into question previous models of vein graft disease and lends credence to recent animal studies demonstrating that vein graft arterialization substantially involves the immigration into the vein graft of a variety of vascular progenitor cells. Future vein graft disease therapies will likely target not only proliferation of graft-intrinsic cells, but also immigration of graft-extrinsic cells.

  9. A semiparametric Bayesian proportional hazards model for interval censored data with frailty effects.

    PubMed

    Henschel, Volkmar; Engel, Jutta; Hölzel, Dieter; Mansmann, Ulrich

    2009-02-10

    Multivariate analysis of interval censored event data based on classical likelihood methods is notoriously cumbersome, and likelihood inference for models that additionally include random effects is not available at all. Existing algorithms pose problems for practical users, such as matrix inversion, slow convergence, and no assessment of statistical uncertainty. MCMC procedures combined with imputation are used to implement hierarchical models for interval censored data within a Bayesian framework. Two examples from clinical practice demonstrate the handling of clustered interval censored event times as well as multilayer random effects for inter-institutional quality assessment. The software developed is called survBayes and is freely available on CRAN. The proposed software supports the solution of complex analyses in many fields of clinical epidemiology as well as health services research.

  10. Linear regression analysis of survival data with missing censoring indicators.

    PubMed

    Wang, Qihua; Dinse, Gregg E

    2011-04-01

    Linear regression analysis has been studied extensively in a random censorship setting, but typically all of the censoring indicators are assumed to be observed. In this paper, we develop synthetic data methods for estimating regression parameters in a linear model when some censoring indicators are missing. We define estimators based on regression calibration, imputation, and inverse probability weighting techniques, and we prove all three estimators are asymptotically normal. The finite-sample performance of each estimator is evaluated via simulation. We illustrate our methods by assessing the effects of sex and age on the time to non-ambulatory progression for patients in a brain cancer clinical trial.
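
    A classical baseline for censored linear regression (when all censoring indicators are observed) is the Koul-Susarla-Van Ryzin synthetic-data transform, which replaces each response by δ·Y/Ĝ(Y), where Ĝ is a Kaplan-Meier estimate of the censoring survival function, and then applies ordinary least squares to the transformed responses. The sketch below illustrates that baseline device on simulated data; it does not implement the paper's missing-indicator estimators, and the names and data-generating setup are hypothetical.

```python
import numpy as np

def km_censoring_survival(time, delta):
    """Left-continuous Kaplan-Meier estimate of the censoring survival
    function G(t-), evaluated at each subject's own observed time.
    Censoring 'events' are the observations with delta == 0."""
    order = np.argsort(time)
    d = delta[order]
    n = len(time)
    at_risk = n - np.arange(n)
    factors = np.where(d == 0, (at_risk - 1) / at_risk, 1.0)
    G_before = np.concatenate(([1.0], np.cumprod(factors)[:-1]))
    out = np.empty(n)
    out[order] = G_before
    return np.clip(out, 1e-3, None)   # guard against division by ~0

rng = np.random.default_rng(2)
n = 400
x = rng.normal(size=n)
t_true = 1.0 + 0.5 * x + rng.normal(scale=0.5, size=n)  # true linear model
c = rng.normal(loc=2.0, scale=1.0, size=n)              # independent censoring
y = np.minimum(t_true, c)
delta = (t_true <= c).astype(int)

# Synthetic responses: censored observations contribute 0; observed events
# are inflated by 1/G-hat so the conditional mean is preserved.
y_syn = delta * y / km_censoring_survival(y, delta)
X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y_syn, rcond=None)[0]          # OLS on synthetic data
```

    The paper's contribution is the harder setting where some δ values are themselves missing, handled by regression calibration, imputation, or inverse probability weighting on top of this kind of construction.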

  11. Dark Endothelial Spots After Descemet Membrane Endothelial Keratoplasty May Appear as Recurrent Fuchs Dystrophy or Herald Graft Failure or Rejection.

    PubMed

    Zygoura, Vasiliki; Baydoun, Lamis; Monnereau, Claire; Satué, Maria; Oellerich, Silke; Melles, Gerrit R J

    2017-12-01

    To evaluate the clinical significance of dark spots in the donor endothelial cell layer, as observed with specular microscopy, in patients who underwent Descemet membrane endothelial keratoplasty (DMEK) for Fuchs endothelial dystrophy (FED). Specular microscopy images of 83 consecutive eyes up to 7 years after DMEK were retrospectively reviewed in a masked fashion for the presence of dark spots and morphologic changes in the endothelial cell layer, and processed for endothelial cell density (ECD) measurements. A normal endothelial cell layer was found in 52/83 eyes (62.7%) (group 0). In the remaining 31/83 eyes, various dark discolorations with or without altered endothelial cell morphology were categorized into 4 groups. Dark spots were classified as artifacts in 10/83 (12.0%) eyes (group I) and as "superimposed" dots in 10/83 (12.0%) eyes (group II), that is, optical irregularities slightly anterior to a healthy endothelial cell layer. In 11/83 (13.3%) eyes, endothelial stress was characterized by dark grayish discolorations and/or nuclear activation (group III). Most of the latter eyes also had a significant ECD decrease; 3 of these eyes later developed secondary graft failure, one of which was preceded by allograft rejection. None of the eyes showed the recurrent guttae typical of FED (group IV). Dark endothelial spots after DMEK for FED may represent not recurrent disease but tissue irregularities just anterior to the graft. However, if associated with changes in endothelial cell morphology, nuclear activation and/or an ECD decrease, dark discolorations may reflect "cellular stress" heralding secondary graft failure or (subclinical) allograft rejection.

  12. Post-processing of multi-model ensemble river discharge forecasts using censored EMOS

    NASA Astrophysics Data System (ADS)

    Hemri, Stephan; Lisniak, Dmytro; Klein, Bastian

    2014-05-01

    When forecasting water levels and river discharge, ensemble weather forecasts are used as meteorological input to hydrologic process models. As hydrologic models are imperfect and the input ensembles tend to be biased and underdispersed, the output ensemble forecasts for river runoff typically are biased and underdispersed, too. Thus, statistical post-processing is required in order to achieve calibrated and sharp predictions. Standard post-processing methods such as Ensemble Model Output Statistics (EMOS) that have their origins in meteorological forecasting are now increasingly being used in hydrologic applications. Here we consider two sub-catchments of River Rhine, for which the forecasting system of the Federal Institute of Hydrology (BfG) uses runoff data that are censored below predefined thresholds. To address this methodological challenge, we develop a censored EMOS method that is tailored to such data. The censored EMOS forecast distribution can be understood as a mixture of a point mass at the censoring threshold and a continuous part based on a truncated normal distribution. Parameter estimates of the censored EMOS model are obtained by minimizing the Continuous Ranked Probability Score (CRPS) over the training dataset. Model fitting on Box-Cox transformed data allows us to take account of the positive skewness of river discharge distributions. In order to achieve realistic forecast scenarios over an entire range of lead-times, there is a need for multivariate extensions. To this end, we smooth the marginal parameter estimates over lead-times. In order to obtain realistic scenarios of discharge evolution over time, the marginal distributions have to be linked with each other. To this end, the multivariate dependence structure can either be adopted from the raw ensemble like in Ensemble Copula Coupling (ECC), or be estimated from observations in a training period. The censored EMOS model has been applied to multi-model ensemble forecasts issued on a
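
    The mixture form of the censored EMOS predictive distribution described above, a point mass at the censoring threshold plus a continuous part based on a truncated normal, can be written down directly. Below is a small sketch of that predictive CDF for a left-censored normal; the parameter values are hypothetical, and the CRPS-based parameter fitting and Box-Cox transform from the abstract are not shown.

```python
import math
import numpy as np

# Vectorized standard normal CDF built from math.erf.
_phi = np.vectorize(lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))

def censored_emos_cdf(y, mu, sigma, c):
    """CDF of a censored-EMOS predictive distribution: all mass of the
    underlying Normal(mu, sigma^2) below the censoring threshold c is
    collapsed into a point mass at c, so the CDF jumps from 0 to
    Phi((c - mu) / sigma) at y = c and follows the normal CDF above it."""
    y = np.asarray(y, dtype=float)
    return np.where(y < c, 0.0, _phi((y - mu) / sigma))
```

    In the article, the location and scale are linear functions of the ensemble members, estimated by minimizing the mean CRPS over a training set; that optimization step is omitted here.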

  13. The Role of Programmed Cell Death Ligand-1 (PD-L1/CD274) in the Development of Graft versus Host Disease

    PubMed Central

    Al-Chaqmaqchi, Heevy; Sadeghi, Behnam; Abedi-Valugerdi, Manuchehr; Al-Hashmi, Sulaiman; Fares, Mona; Kuiper, Raoul; Lundahl, Joachim

    2013-01-01

    Programmed cell death ligand-1 (PD-L1/CD274) is an immunomodulatory molecule involved in cancer and complications of bone marrow transplantation, such as graft rejection and graft-versus-host disease. The present study was designed to assess the dynamic expression of this molecule after hematopoietic stem cell transplantation in relation to acute graft-versus-host disease. Female BALB/c mice were conditioned with busulfan and cyclophosphamide and transplanted with either syngeneic or allogeneic (male C57BL/6 mice) bone marrow and splenic cells. The expression of PD-L1 was evaluated at different time points employing qPCR, western blot and immunohistochemistry. Allogeneic- but not syngeneic-transplanted animals exhibited a marked up-regulation of PD-L1 expression in the muscle and kidney, but not the liver, at days 5 and 7 post transplantation. In mice transplanted with allogeneic bone marrow cells, the enhanced expression of PD-L1 was associated with high serum levels of IFNγ and TNFα at corresponding intervals. Our findings demonstrate that PD-L1 is differently induced and expressed after allogeneic transplantation than it is after syngeneic transplantation, and that it is in favor of target rather than non-target organs at the early stages of acute graft-versus-host disease. This is the first study to correlate the dynamics of PD-L1 at the gene-, protein- and activity levels with the early development of acute graft-versus-host disease. Our results suggest that the higher expression of PD-L1 in the muscle and kidney (non-target tissues) plays a protective role in skeletal muscle during acute graft-versus-host disease. PMID:23593203

  14. Annotation, submission and screening of repetitive elements in Repbase: RepbaseSubmitter and Censor.

    PubMed

    Kohany, Oleksiy; Gentles, Andrew J; Hankus, Lukasz; Jurka, Jerzy

    2006-10-25

    Repbase is a reference database of eukaryotic repetitive DNA, which includes prototypic sequences of repeats and basic information described in annotations. Updating and maintenance of the database requires specialized tools, which we have created and made available for use with Repbase, and which may be useful as a template for other curated databases. We describe the software tools RepbaseSubmitter and Censor, which are designed to facilitate updating and screening the content of Repbase. RepbaseSubmitter is a Java-based interface for formatting and annotating Repbase entries. It eliminates many common formatting errors, and automates actions such as calculation of sequence lengths and composition, thus facilitating curation of Repbase sequences. In addition, it has several features for predicting protein coding regions in sequences; searching and including PubMed references in Repbase entries; and searching the NCBI taxonomy database for correct inclusion of species information and taxonomic position. Censor is a tool to rapidly identify repetitive elements by comparison to known repeats. It uses WU-BLAST for speed and sensitivity, and can conduct DNA-DNA, DNA-protein, or translated DNA-translated DNA searches of genomic sequence. Defragmented output includes a map of repeats present in the query sequence, with the options to report masked query sequence(s), repeat sequences found in the query, and alignments. Censor and RepbaseSubmitter are available as both web-based services and downloadable versions. They can be found at http://www.girinst.org/repbase/submission.html (RepbaseSubmitter) and http://www.girinst.org/censor/index.php (Censor).

  15. Bypass grafting to the anterior tibial artery.

    PubMed

    Armour, R H

    1976-01-01

    Four patients with severe ischaemia of a leg due to atherosclerotic occlusion of the tibial and peroneal arteries had reversed long saphenous vein grafts to the patent lower part of the anterior tibial artery. Two of these grafts continue to function 19 and 24 months after operation respectively. One graft failed on the fifth postoperative day and another occluded 4 months after operation. The literature on femorotibial grafting has been reviewed. The early failure rate of distal grafting is higher than in the case of femoropopliteal bypass, but a number of otherwise doomed limbs can be salvaged. Contrary to widely held views, grafting to the anterior tibial artery appears to give results comparable to those obtained when the lower anastomosis is made to the posterior tibial artery.

  16. The Gumbel hypothesis test for left censored observations using regional earthquake records as an example

    NASA Astrophysics Data System (ADS)

    Thompson, E. M.; Hewlett, J. B.; Baise, L. G.; Vogel, R. M.

    2011-01-01

    Annual maximum (AM) time series are incomplete (i.e., censored) when no events are included above the assumed censoring threshold (i.e., magnitude of completeness). We introduce a distributional hypothesis test for left-censored Gumbel observations based on the probability plot correlation coefficient (PPCC). Critical values of the PPCC hypothesis test statistic are computed from Monte Carlo simulations and are a function of sample size, censoring level, and significance level. When applied to a global catalog of earthquake observations, the left-censored Gumbel PPCC tests are unable to reject the Gumbel hypothesis for 45 of 46 seismic regions. We apply four different field significance tests for combining individual tests into a collective hypothesis test. None of the field significance tests is able to reject the global hypothesis that AM earthquake magnitudes arise from a Gumbel distribution. Because the field significance levels are not conclusive, we also compute the likelihood that these field significance tests are unable to reject the Gumbel model when the samples arise from a more complex distributional alternative. A power study documents that the censored Gumbel PPCC test is unable to reject some important and viable Generalized Extreme Value (GEV) alternatives. Thus, we cannot rule out the possibility that the global AM earthquake time series could arise from a GEV distribution with a finite upper bound, also known as a reverse Weibull distribution. Our power study also indicates that the binomial and uniform field significance tests are substantially more powerful than the more commonly used Bonferroni and false discovery rate multiple comparison procedures.
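
    The PPCC statistic itself is simple to compute: correlate the uncensored order statistics with the Gumbel quantiles at their plotting positions. The sketch below uses Gringorten-type plotting positions and is an illustration only; the critical values needed for the actual hypothesis test come from Monte Carlo simulation as described in the abstract.

```python
import numpy as np

def gumbel_ppcc(sample, threshold=-np.inf):
    """Probability plot correlation coefficient for a (possibly
    left-censored) Gumbel sample. Plotting positions are assigned to the
    full sample of size n; values at or below the censoring threshold are
    then dropped, and the remaining order statistics are correlated with
    the standard Gumbel quantiles at their positions."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    p = (np.arange(1, n + 1) - 0.44) / (n + 0.12)   # Gringorten positions
    q = -np.log(-np.log(p))                          # standard Gumbel quantiles
    keep = x > threshold                             # censored values drop out
    xc = x[keep] - x[keep].mean()
    qc = q[keep] - q[keep].mean()
    return float((xc @ qc) / np.sqrt((xc @ xc) * (qc @ qc)))
```

    For a sample actually drawn from a Gumbel distribution the statistic is close to 1; the hypothesis is rejected when it falls below the simulated critical value for the given sample size, censoring level, and significance level.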

  17. The Spectrum of Renal Allograft Failure

    PubMed Central

    Chand, Sourabh; Atkinson, David; Collins, Clare; Briggs, David; Ball, Simon; Sharif, Adnan; Skordilis, Kassiani; Vydianath, Bindu; Neil, Desley; Borrows, Richard

    2016-01-01

    Background Causes of “true” late kidney allograft failure remain unclear as study selection bias and limited follow-up risk incomplete representation of the spectrum. Methods We evaluated all unselected graft failures from 2008–2014 (n = 171; 0–36 years post-transplantation) by contemporary classification of indication biopsies “proximate” to failure, DSA assessment, clinical and biochemical data. Results The spectrum of graft failure changed markedly depending on the timing of allograft failure. Failures within the first year were most commonly attributed to technical failure and acute rejection (with T-cell mediated rejection [TCMR] dominating antibody-mediated rejection [ABMR]). Failures beyond a year were increasingly dominated by ABMR and ‘interstitial fibrosis with tubular atrophy’ without rejection, infection or recurrent disease (“IFTA”). Cases of IFTA associated with inflammation in non-scarred areas (compared with no inflammation or inflammation solely within scarred regions) were more commonly associated with episodes of prior rejection, late rejection and nonadherence, pointing to an alloimmune aetiology. Nonadherence and late rejection were common in ABMR and TCMR, particularly Acute Active ABMR. Acute Active ABMR and nonadherence were associated with younger age, faster functional decline, and less hyalinosis on biopsy. Chronic and Chronic Active ABMR were more commonly associated with Class II DSA. C1q-binding DSA, detected in 33% of ABMR episodes, were associated with shorter time to graft failure. Most non-biopsied patients were DSA-negative (16/21; 76.1%). Finally, twelve losses to recurrent disease were seen (16%). Conclusion These data from an unselected population identify IFTA alongside ABMR as a very important cause of true late graft failure, with nonadherence-associated TCMR as a phenomenon in some patients. It highlights clinical and immunological characteristics of ABMR subgroups, and should inform clinical practice and

  18. A cigarette manufacturer and a managed care company collaborate to censor health information targeted at employees.

    PubMed

    Muggli, Monique E; Hurt, Richard D

    2004-08-01

    A review of internal tobacco company documents showed that the tobacco company Philip Morris and the insurance company CIGNA collaborated to censor accurate information on the harms of smoking and environmental tobacco smoke exposure from CIGNA health newsletters sent to employees of Philip Morris and its affiliates. From 1996 to 1998, 5 of the 8 CIGNA newsletters discussed in the internal tobacco documents were censored. We recommend that accrediting bodies mandate that health plans not censor employee-directed health information at the request of employers.

  19. Improvement of liver injury and survival by JNK2 and iNOS deficiency in liver transplants from cardiac death mice.

    PubMed

    Liu, Qinlong; Rehman, Hasibur; Krishnasamy, Yasodha; Schnellmann, Rick G; Lemasters, John J; Zhong, Zhi

    2015-07-01

    Inclusion of liver grafts from cardiac death donors (CDD) would increase the availability of donor livers but is hampered by a higher risk of primary non-function. Here, we seek to determine mechanisms that contribute to primary non-function of liver grafts from CDD with the goal to develop strategies for improved function and outcome, focusing on c-Jun-N-terminal kinase (JNK) activation and mitochondrial depolarization, two known mediators of graft failure. Livers explanted from wild-type, inducible nitric oxide synthase knockout (iNOS(-/-)), JNK1(-/-) or JNK2(-/-) mice after 45-min aorta clamping were implanted into wild-type recipients. Mitochondrial depolarization was detected by intravital confocal microscopy in living recipients. After transplantation of wild-type CDD livers, graft iNOS expression and 3-nitrotyrosine adducts increased, but hepatic endothelial NOS expression was unchanged. Graft injury and dysfunction were substantially higher in CDD grafts than in non-CDD grafts. iNOS deficiency and inhibition attenuated injury and improved function and survival of CDD grafts. JNK1/2 and apoptosis signal-regulating kinase-1 activation increased markedly in wild-type CDD grafts, which was blunted by iNOS deficiency. JNK inhibition and JNK2 deficiency, but not JNK1 deficiency, decreased injury and improved function and survival of CDD grafts. Mitochondrial depolarization and binding of phospho-JNK2 to Sab, a mitochondrial protein linked to the mitochondrial permeability transition, were higher in CDD than in non-CDD grafts. iNOS deficiency, JNK inhibition and JNK2 deficiency all decreased mitochondrial depolarization and blunted ATP depletion in CDD grafts. JNK inhibition and deficiency did not decrease 3-nitrotyrosine adducts in CDD grafts. The iNOS-JNK2-Sab pathway promotes CDD graft failure via increased mitochondrial depolarization, and is an attractive target to improve liver function and survival in CDD liver transplantation recipients. 

  20. First Comparison of Hypothermic Oxygenated PErfusion Versus Static Cold Storage of Human Donation After Cardiac Death Liver Transplants: An International-matched Case Analysis.

    PubMed

    Dutkowski, Philipp; Polak, Wojciech G; Muiesan, Paolo; Schlegel, Andrea; Verhoeven, Cornelia J; Scalera, Irene; DeOliveira, Michelle L; Kron, Philipp; Clavien, Pierre-Alain

    2015-11-01

    Exposure of donor liver grafts to prolonged periods of warm ischemia before procurement causes injuries, including intrahepatic cholangiopathy, which may lead to graft loss. Due to unavoidably prolonged ischemic times before procurement in donation after cardiac death (DCD) donation at 1 participating center, each liver graft from this center was pretreated with the new machine perfusion "Hypothermic Oxygenated PErfusion" (HOPE) in an attempt to improve graft quality before implantation. HOPE-treated DCD livers (n = 25) were matched and compared with conventionally preserved (static cold preservation) DCD liver grafts (n = 50) from 2 well-established European programs. Criteria for matching included duration of warm ischemia and key confounders summarized in the balance-of-risk score. In a second step, perfused and unperfused DCD livers were compared with liver grafts from standard brain-dead donors (n = 50), also matched by the balance-of-risk score, serving as baseline controls. HOPE treatment of DCD livers significantly decreased graft injury compared with matched cold-stored DCD livers with respect to peak alanine aminotransferase (1239 vs 2065 U/L, P = 0.02), intrahepatic cholangiopathy (0% vs 22%, P = 0.015), biliary complications (20% vs 46%, P = 0.042), and 1-year graft survival (90% vs 69%, P = 0.035). No graft failure due to intrahepatic cholangiopathy or nonfunction occurred in HOPE-treated livers, whereas 18% of unperfused DCD livers needed retransplantation. In addition, HOPE-perfused DCD livers achieved results similar to those of control donation after brain death livers in all investigated endpoints. HOPE appears to offer important benefits in preserving higher-risk DCD liver grafts.

  1. The concordance index C and the Mann-Whitney parameter Pr(X>Y) with randomly censored data.

    PubMed

    Koziol, James A; Jia, Zhenyu

    2009-06-01

    Harrell's c-index or concordance C has been widely used as a measure of separation of two survival distributions. In the absence of censored data, the c-index estimates the Mann-Whitney parameter Pr(X>Y), which has been repeatedly utilized in various statistical contexts. In the presence of randomly censored data, the c-index no longer estimates Pr(X>Y); rather, it estimates a parameter that involves the underlying censoring distributions. This is in contrast to Efron's maximum likelihood estimator of the Mann-Whitney parameter, which is recommended in the setting of random censorship.
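
    The censoring issue can be made concrete with a small sketch. Below is a minimal pure-Python illustration (hypothetical data, not from the paper) of how Harrell's C is computed under right censoring: only "usable" pairs, whose ordering is determined despite censoring, enter the calculation, which is why the censoring distribution leaks into what the index estimates.

```python
# Minimal sketch of Harrell's concordance C for right-censored data.
# Hypothetical toy data; not the paper's estimator comparison.

def concordance(times, events, scores):
    """Harrell's c-index: among usable pairs, the fraction where the
    higher risk score goes with the shorter observed survival.
    times: observed times; events: 1 = event, 0 = censored;
    scores: predicted risk (higher = worse prognosis)."""
    conc = usable = 0.0
    n = len(times)
    for i in range(n):
        for j in range(n):
            # a pair is usable only if subject i is observed to fail
            # strictly before subject j's observed time
            if events[i] == 1 and times[i] < times[j]:
                usable += 1
                if scores[i] > scores[j]:
                    conc += 1
                elif scores[i] == scores[j]:
                    conc += 0.5
    return conc / usable

# toy cohort: risk score perfectly tracks failure order among usable pairs
times  = [2, 4, 5, 7, 9]
events = [1, 1, 0, 1, 0]   # subjects 3 and 5 are censored
scores = [5, 4, 3, 2, 1]
print(concordance(times, events, scores))  # -> 1.0
```

Note that the censored subjects contribute only as the "later" member of usable pairs, which is exactly where the censoring distribution enters.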

  2. The influence of Australian eye banking practices on corneal graft survival.

    PubMed

    Keane, Miriam C; Lowe, Marie T; Coster, Douglas J; Pollock, Graeme A; Williams, Keryn A

    2013-08-19

    To identify eye banking practices that influence corneal graft survival. Prospective cohort study of records of 19,254 followed corneal grafts in 15,160 patients, submitted to the Australian Corneal Graft Registry between May 1985 and July 2012. Influence of corneal preservation method (organ culture, moist pot, Optisol, other); death-to-enucleation, death-to-preservation and enucleation-to-graft times; transportation by air; graft era; and indication for graft on probability of graft survival at most recent follow-up. In multivariate analysis, 919 penetrating grafts performed using corneas transported interstate by air exhibited worse survival than 14,684 grafts performed using corneas retrieved and used locally (hazard ratio [HR], 1.44; 95% CI, 1.21-1.73; P = 0.001). This was also the case for traditional lamellar grafts (64 corneas transported by air and 813 used locally; HR, 1.69; 95% CI, 1.03-2.78; P = 0.038). Indication for graft influenced survival of penetrating grafts (4611 keratoconus, 727 emergency or high-risk, 10,265 other indication; global P < 0.001) and traditional lamellar grafts (65 keratoconus, 212 emergency or high-risk, 600 other indication; global P < 0.001). The preservation medium in which corneas used for traditional lamellar grafts were stored exerted a marginal influence on graft survival (global P = 0.047). Donor corneas transported interstate exhibited poorer survival after transplantation than those retrieved and grafted locally. Higher proportions of emergency procedures involving transported corneas did not account for this difference. Where possible, efforts to avoid transportation of corneal tissue by air freight within Australia may be warranted.

  3. A test of grafting ponderosa pine.

    Treesearch

    Edwin L. Mowat; Roy R. Silen

    1957-01-01

    Cleft grafting ponderosa pine proved best of five methods of vegetative propagation tested for field use in the dry hot climate of eastern Oregon. Ponderosa pine has been successfully grafted under many conditions elsewhere, but first trials in this area during 1955 were a failure. This study was made in 1956 at the U.S. Forest Service nursery at Bend, Oreg., to find a...

  4. Second allogeneic hematopoietic cell transplantation for patients with Fanconi anemia and bone marrow failure

    PubMed Central

    Ayas, Mouhab; Eapen, Mary; Le-Rademacher, Jennifer; Carreras, Jeanette; Abdel-Azim, Hisham; Alter, Blanche P.; Anderlini, Paolo; Battiwalla, Minoo; Bierings, Marc; Buchbinder, David K.; Bonfim, Carmem; Camitta, Bruce M.; Fasth, Anders L.; Gale, Robert Peter; Lee, Michelle A.; Lund, Troy C.; Myers, Kasiani C.; Olsson, Richard F.; Page, Kristin M.; Prestidge, Tim D.; Radhi, Mohamed; Shah, Ami J.; Schultz, Kirk R.; Wirk, Baldeep; Wagner, John E.; Deeg, H. Joachim

    2015-01-01

    Second allogeneic hematopoietic cell transplantation (HCT) is the only salvage option for those who develop graft failure after their first HCT. Data on outcomes after second HCT in Fanconi anemia (FA) are scarce. We report outcomes after second allogeneic HCT for FA (n=81). The indication for second HCT was graft failure after the first HCT. Transplants occurred between 1990 and 2012. The timing of second transplantation predicted subsequent graft failure and survival. Graft failure was high when the second transplant occurred less than 3 months from the first. The 3-month probability of graft failure was 69% when the interval between first and second transplant was less than 3 months compared to 23% when the interval was longer (p<0.001). Consequently, survival rates were substantially lower when the interval between first and second transplant was less than 3 months: 23% at 1 year compared with 58% when the interval was longer (p=0.001). The corresponding 5-year probabilities of survival were 16% and 45%, respectively (p=0.006). Taken together, these data suggest that fewer than half of FA patients undergoing a second HCT for graft failure are long-term survivors. There is an urgent need to develop strategies to lower graft failure after first HCT. PMID:26116087

  5. Comparison of Methods for Analyzing Left-Censored Occupational Exposure Data

    PubMed Central

    Huynh, Tran; Ramachandran, Gurumurthy; Banerjee, Sudipto; Monteiro, Joao; Stenzel, Mark; Sandler, Dale P.; Engel, Lawrence S.; Kwok, Richard K.; Blair, Aaron; Stewart, Patricia A.

    2014-01-01

    The National Institute for Environmental Health Sciences (NIEHS) is conducting an epidemiologic study (GuLF STUDY) to investigate the health of the workers and volunteers who participated from April to December of 2010 in the response and cleanup of the oil release after the Deepwater Horizon explosion in the Gulf of Mexico. The exposure assessment component of the study involves analyzing thousands of personal monitoring measurements that were collected during this effort. A substantial portion of these data has values reported by the analytic laboratories to be below the limits of detection (LOD). A simulation study was conducted to evaluate three established methods for analyzing data with censored observations to estimate the arithmetic mean (AM), geometric mean (GM), geometric standard deviation (GSD), and the 95th percentile (X0.95) of the exposure distribution: the maximum likelihood (ML) estimation, the β-substitution, and the Kaplan–Meier (K-M) methods. Each method was challenged with computer-generated exposure datasets drawn from lognormal and mixed lognormal distributions with sample sizes (N) varying from 5 to 100, GSDs ranging from 2 to 5, and censoring levels ranging from 10 to 90%, with single and multiple LODs. Using relative bias and relative root mean squared error (rMSE) as the evaluation metrics, the β-substitution method generally performed as well or better than the ML and K-M methods in most simulated lognormal and mixed lognormal distribution conditions. The ML method was suitable for large sample sizes (N ≥ 30) up to 80% censoring for lognormal distributions with small variability (GSD = 2–3). The K-M method generally provided accurate estimates of the AM when the censoring was <50% for lognormal and mixed distributions. The accuracy and precision of all methods decreased under high variability (GSD = 4 and 5) and small to moderate sample sizes (N < 20) but the β-substitution was still the best of the three methods. When using the
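
    The core of the censored-likelihood idea the paper's ML method relies on can be sketched briefly. Below is a minimal, hypothetical illustration (simulated data, not the GuLF STUDY simulation design) comparing a naive LOD/sqrt(2) substitution, used here as a simple stand-in rather than the paper's β-substitution formula, with maximum likelihood estimation of the arithmetic mean from left-censored lognormal exposure data: detects contribute the log-density and non-detects contribute the CDF at the LOD.

```python
# Sketch: naive substitution vs censored-lognormal ML for the arithmetic
# mean (AM). Simulated exposures; the LOD and censoring level are
# illustrative assumptions.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)
true_gm, true_gsd = 1.0, 2.5
x = rng.lognormal(np.log(true_gm), np.log(true_gsd), size=200)
lod = np.quantile(x, 0.4)             # ~40% of values fall below the LOD
censored = x < lod
obs = np.where(censored, lod, x)      # non-detects known only to be < LOD

# naive substitution: replace non-detects with LOD/sqrt(2)
sub = np.where(censored, lod / np.sqrt(2), obs)
am_sub = sub.mean()

# censored-lognormal negative log-likelihood:
# detects contribute the normal log-density of log(x),
# non-detects contribute the normal log-CDF at log(LOD)
def nll(theta):
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)
    ll = stats.norm.logpdf(np.log(obs[~censored]), mu, sigma).sum()
    ll += censored.sum() * stats.norm.logcdf(np.log(lod), mu, sigma)
    return -ll

mu, log_sigma = optimize.minimize(nll, x0=[0.0, 0.0]).x
sigma = np.exp(log_sigma)
am_ml = np.exp(mu + sigma**2 / 2)     # lognormal arithmetic mean
print(am_sub, am_ml)
```

With 40% censoring the ML estimate of the AM stays close to the true value (about 1.52 here), while simple substitution distorts both the mean and the spread, which is the kind of bias the paper's simulation quantifies.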

  6. Compliance effects on small diameter polyurethane graft patency.

    PubMed

    Uchida, N; Kambic, H; Emoto, H; Chen, J F; Hsu, S; Murabayshi, S; Harasaki, H; Nosé, Y

    1993-10-01

    Microporous compliance matched and noncompliant grafts were compared in a dog carotid artery interposition model. We fabricated 4 mm diameter sponge type polyurethane (Biomer) tubes 5 cm in length with a 0.5 mm wall thickness. The luminal surface was covered with a 50 micron coating of cross-linked gelatin. Compliance was measured in vitro and in vivo by volume and vessel diameter changes. Over a mean arterial pressure range of 55-155 mm Hg, the diameter changes of grafts and stump arteries were measured in situ using an ultrasonic Hokanson device. Compliance matched grafts were found to have the same in vitro compliance values as the natural canine carotid at a mean arterial pressure of 100 mm Hg. Compliance matched and noncompliant grafts had values of 10.3 +/- 1.3 and 0.9 +/- 0.1 x 10(-2) %/mm Hg, respectively. End to end arterial anastomoses were constructed between the graft and the host arteries. The use of synthetic grafts with compliance matched to the adjacent natural vessels has been advocated as the ideal solution to circumvent the problems of graft failure. These studies indicate that compliance values for compliance matched grafts decreased immediately after implantation (from 10.3 to 6.5 x 10(-2) %/mm Hg) and within 6 weeks decreased to 3.6 x 10(-2) %/mm Hg. The compliance values for noncompliant grafts remained constant throughout the test period. At autopsy all grafts showed a tightly adhered tissue capsule. The thickness of the anastomotic hyperplasia at the distal sites of compliance matched grafts was significantly different (P < .05) from that of the adjacent artery. The patency for compliant and noncompliant grafts was 64% and 50%, respectively. Evidence for polyurethane graft degradation was obtained by Fourier transform infrared spectroscopy and gel permeation chromatography analysis of patent explants. Compliance mismatch alone does not contribute to graft failure; however, material degradation, suture technique, and/or capsule formation can

  7. Comparison of graft survival following penetrating keratoplasty and Descemet's stripping endothelial keratoplasty in eyes with a glaucoma drainage device.

    PubMed

    Iverson, Shawn M; Spierer, Oriel; Papachristou, George C; Feuer, William J; Shi, Wei; Greenfield, David S; O'Brien, Terrence P

    2018-02-01

    To compare corneal graft survival rates after penetrating keratoplasty (PK) and Descemet's stripping endothelial keratoplasty (DSEK) in patients with a glaucoma drainage device (GDD) or medically managed glaucoma. A retrospective chart review was conducted on consecutive patients who underwent primary PK or primary DSEK. Inclusion criteria consisted of eyes with a diagnosis of glaucoma prior to corneal transplantation and a minimum of 6 months of follow-up. Graft failure was defined as an edematous cornea with failure to maintain deturgescence lasting beyond a period of 1 month of intense steroid therapy or vascularization and scarring resulting in irreversible loss of central graft clarity. Corneal graft survival was calculated using Kaplan-Meier survival analysis. Patients were divided into four groups: GDD-PK, GDD-DSEK, medical-PK and medical-DSEK. Fifty-six eyes of 56 patients were identified as meeting inclusion criteria. Among eyes with a GDD, there was no difference in the proportion of failures between PK grafts (48%) and DSEK grafts (50%) (p = 0.90). Failure occurred earlier in DSEK recipients compared to PK recipients, 5.82 ± 6.77 months versus 14.40 ± 7.70 months, respectively (p = 0.04). A Kaplan-Meier analysis did not identify a difference between the four groups with respect to graft failure (p = 0.52). There is no significant difference in graft survival rates between medically and surgically treated glaucoma patients for either PK or DSEK grafts. In patients with GDD, graft failure occurs earlier in DSEK compared to PK.
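
    The graft-survival comparison above rests on the Kaplan-Meier product-limit estimate. A minimal pure-Python sketch (hypothetical follow-up data, not the study's records) shows the mechanics: at each observed failure time survival is multiplied by (1 - failures/at-risk), and censored grafts simply leave the risk set without contributing a failure.

```python
# Minimal Kaplan-Meier product-limit estimator. Toy data only.

def kaplan_meier(times, events):
    """Return [(t, S(t))] at each observed failure time.
    times: follow-up (e.g. months); events: 1 = graft failure, 0 = censored."""
    data = sorted(zip(times, events))        # ties end up adjacent
    n_at_risk = len(data)
    surv, out = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        failures = sum(1 for tt, e in data if tt == t and e == 1)
        removed = sum(1 for tt, e in data if tt == t)
        if failures:
            surv *= 1 - failures / n_at_risk
            out.append((t, surv))
        n_at_risk -= removed                 # failures and censorings leave
        i += removed
    return out

# toy cohort: failures at 3 and 9 months, censoring at 6 and 12
print(kaplan_meier([3, 6, 9, 12], [1, 0, 1, 0]))
# -> [(3, 0.75), (9, 0.375)]
```

The censored graft at 6 months lowers the denominator at the 9-month failure, which is how censoring is handled without being counted as failure.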

  8. Non-censored rib fracture data during frontal PMHS sled tests.

    PubMed

    Kemper, Andrew R; Beeman, Stephanie M; Porta, David J; Duma, Stefan M

    2016-09-01

    The purpose of this study was to obtain non-censored rib fracture data due to three-point belt loading during dynamic frontal post-mortem human surrogate (PMHS) sled tests. The PMHS responses were then compared to matched tests performed using the Hybrid-III 50th percentile male ATD. Matched dynamic frontal sled tests were performed on two male PMHSs, which were approximately 50th percentile height and weight, and the Hybrid-III 50th percentile male ATD. The sled pulse was designed to match the vehicle acceleration of a standard sedan during a FMVSS-208 40 kph test. Each subject was restrained with a 4 kN load limiting, driver-side, three-point seatbelt. A 59-channel chestband, aligned at the nipple line, was used to quantify the chest contour, anterior-posterior sternum deflection, and maximum anterior-posterior chest deflection for all test subjects. The internal sternum deflection of the ATD was quantified with the sternum potentiometer. For the PMHS tests, a total of 23 single-axis strain gages were attached to the bony structures of the thorax, including the ribs, sternum, and clavicle. In order to create a non-censored data set, the time history of each strain gage was analyzed to determine the timing of each rib fracture and corresponding timing of each AIS level (AIS = 1, 2, 3, etc.) with respect to chest deflection. Peak sternum deflections for PMHS 1 and PMHS 2 were 48.7 mm (19.0%) and 36.7 mm (12.2%), respectively. The peak sternum deflection for the ATD was 20.8 mm when measured by the chest potentiometer and 34.4 mm (12.0%) when measured by the chestband. Although the measured ATD sternum deflections were found to be well below the current thoracic injury criterion (63 mm) specified for the ATD in FMVSS-208, both PMHSs sustained AIS 3+ thoracic injuries. For all subjects, the maximum chest deflection measured by the chestband occurred to the right of the sternum and was found to be 83.0 mm (36.0%) for PMHS 1, 60.6 mm (23.9%) for PMHS 2


  9. A Mathematical Model to Predict Endothelial Cell Density Following Penetrating Keratoplasty With Selective Dropout From Graft Failure

    PubMed Central

    Riddlesworth, Tonya D.; Kollman, Craig; Lass, Jonathan H.; Patel, Sanjay V.; Stulting, R. Doyle; Benetz, Beth Ann; Gal, Robin L.; Beck, Roy W.

    2014-01-01

    Purpose. We constructed several mathematical models that predict endothelial cell density (ECD) for patients after penetrating keratoplasty (PK) for a moderate-risk condition (principally Fuchs' dystrophy or pseudophakic/aphakic corneal edema). Methods. In a subset (n = 591) of Cornea Donor Study participants, postoperative ECD was determined by a central reading center. Various statistical models were considered to estimate the ECD trend longitudinally over 10 years of follow-up. A biexponential model with and without a logarithm transformation was fit using the Gauss-Newton nonlinear least squares algorithm. To account for correlated data, a log-polynomial model was fit using the restricted maximum likelihood method. A sensitivity analysis for the potential bias due to selective dropout was performed using Bayesian analysis techniques. Results. The three models using a logarithm transformation yield similar trends, whereas the model without the transform predicts higher ECD values. The adjustment for selective dropout turns out to be negligible. However, this is possibly due to the relatively low rate of graft failure in this cohort (19% at 10 years). Fuchs' dystrophy and pseudophakic/aphakic corneal edema (PACE) patients had similar ECD decay curves, with the PACE group having slightly higher cell densities by 10 years. Conclusions. Endothelial cell loss after PK can be modeled via a log-polynomial model, which accounts for the correlated data from repeated measures on the same subject. This model is not significantly affected by the selective dropout due to graft failure. Our findings warrant further study on how this may extend to ECD following endothelial keratoplasty. PMID:25425307
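
    The biexponential ECD model described in the Methods can be sketched with standard nonlinear least squares. The illustration below uses simulated values, not the Cornea Donor Study data; scipy's `curve_fit` applies Levenberg-Marquardt, a damped variant of the Gauss-Newton algorithm the paper names.

```python
# Sketch: fitting a biexponential endothelial-cell-density decay
# ECD(t) = a*exp(-b*t) + c*exp(-d*t) by nonlinear least squares.
# The "true" parameters and noise level below are illustrative assumptions.
import numpy as np
from scipy.optimize import curve_fit

def biexp(t, a, b, c, d):
    # fast early cell loss (a, b) superimposed on slow late loss (c, d)
    return a * np.exp(-b * t) + c * np.exp(-d * t)

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 40)                   # years after keratoplasty
truth = (1500.0, 1.2, 1300.0, 0.08)
ecd = biexp(t, *truth) + rng.normal(0, 30, t.size)   # noisy "measurements"

params, _ = curve_fit(biexp, t, ecd, p0=(1000, 1.0, 1000, 0.1))
print(params)               # recovered (a, b, c, d)
print(biexp(10, *params))   # predicted ECD at 10 years
```

Fitting the same data after a logarithm transformation (as three of the paper's models do) changes the error weighting and hence the fitted trend, which is why the untransformed model predicts higher ECD values.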

  10. Xenon Treatment Protects Against Cold Ischemia Associated Delayed Graft Function and Prolongs Graft Survival in Rats

    PubMed Central

    Zhao, H; Watts, H R; Chong, M; Huang, H; Tralau-Stewart, C; Maxwell, P H; Maze, M; George, A J T; Ma, D

    2013-01-01

    Prolonged hypothermic storage causes ischemia-reperfusion injury (IRI) in the renal graft, which is considered to contribute to the occurrence of delayed graft function (DGF) and chronic graft failure. Strategies are required to protect the graft and to prolong renal graft survival. We demonstrated that xenon exposure of human proximal tubular cells (HK-2) led to activation of a range of protective proteins. Xenon treatment prior to or after hypothermia–hypoxia challenge stabilized the HK-2 cellular structure, diminished cytoplasmic translocation of high-mobility group box (HMGB) 1 and suppressed NF-κB activation. In the syngeneic Lewis-to-Lewis rat model of kidney transplantation, xenon exposure of donors before graft retrieval or of recipients after engraftment decreased caspase-3 expression, localized HMGB-1 within nuclei and prevented TLR-4/NF-κB activation in tubular cells; serum pro-inflammatory cytokines IL-1β, IL-6 and TNF-α were reduced and renal function was preserved. Xenon treatment of graft donors or of recipients prolonged renal graft survival following IRI in both Lewis-to-Lewis isografts and Fischer-to-Lewis allografts. Xenon-induced cell survival and graft functional recovery were abolished by HIF-1α siRNA. Our data suggest that xenon treatment attenuates DGF and enhances graft survival. This approach could be translated into clinical practice leading to a considerable improvement in long-term graft survival. PMID:23710625

  11. T-Wave Morphology Restitution Predicts Sudden Cardiac Death in Patients With Chronic Heart Failure.

    PubMed

    Ramírez, Julia; Orini, Michele; Mincholé, Ana; Monasterio, Violeta; Cygankiewicz, Iwona; Bayés de Luna, Antonio; Martínez, Juan Pablo; Pueyo, Esther; Laguna, Pablo

    2017-05-19

    Patients with chronic heart failure are at high risk of sudden cardiac death (SCD). Increased dispersion of repolarization restitution has been associated with SCD, and we hypothesize that this should be reflected in the morphology of the T-wave and its variations with heart rate. The aim of this study is to propose an electrocardiogram (ECG)-based index characterizing T-wave morphology restitution (TMR), and to assess its association with SCD risk in a population of chronic heart failure patients. Holter ECGs from 651 ambulatory patients with chronic heart failure from the MUSIC (MUerte Súbita en Insuficiencia Cardiaca) study were available for the analysis. TMR was quantified by measuring the morphological variation of the T-wave per RR increment using time-warping metrics, and its predictive power was compared to that of clinical variables such as the left ventricular ejection fraction and other ECG-derived indices, such as T-wave alternans and heart rate variability. TMR was significantly higher in SCD victims than in the rest of patients (median 0.046 versus 0.039, P <0.001). When TMR was dichotomized at TMR=0.040, the SCD rate was significantly higher in the TMR≥0.040 group ( P <0.001). Cox analysis revealed that TMR≥0.040 was strongly associated with SCD, with a hazard ratio of 3.27 ( P <0.001), independently of clinical and ECG-derived variables. No association was found between TMR and pump failure death. This study shows that TMR is specifically associated with SCD in a population of chronic heart failure patients, and it is a better predictor than clinical and ECG-derived variables. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.

  12. Roles of inflammation and apoptosis in experimental brain death-induced right ventricular failure.

    PubMed

    Belhaj, Asmae; Dewachter, Laurence; Rorive, Sandrine; Remmelink, Myriam; Weynand, Birgit; Melot, Christian; Galanti, Laurence; Hupkens, Emeline; Sprockeels, Thomas; Dewachter, Céline; Creteur, Jacques; McEntee, Kathleen; Naeije, Robert; Rondelet, Benoît

    2016-12-01

    Right ventricular (RV) dysfunction remains the leading cause of early death after cardiac transplantation. Methylprednisolone is used to improve graft quality; however, evidence for that remains empirical. We sought to determine whether methylprednisolone, acting on inflammation and apoptosis, might prevent brain death-induced RV dysfunction. After randomization to placebo (n = 11) or to methylprednisolone (n = 8; 15 mg/kg), 19 pigs were assigned to a brain-death procedure. The animals underwent hemodynamic evaluation at 1 and 5 hours after the Cushing reflex (i.e., hypertension and bradycardia). The animals were then euthanized, and myocardial tissue was sampled. This was repeated in a control group (n = 8). At 5 hours after the Cushing reflex, brain death resulted in increased pulmonary artery pressure (27 ± 2 vs 18 ± 1 mm Hg) and in a 30% decreased ratio of end-systolic to pulmonary arterial elastances (Ees/Ea). Cardiac output and right atrial pressure did not change. This was prevented by methylprednisolone. Brain death-induced RV dysfunction was associated with increased RV expression of heme oxygenase-1, interleukin (IL)-6, IL-10, IL-1β, tumor necrosis factor (TNF)-α, IL-1 receptor-like (ST)-2, signal transducer and activator of transcription-3, intercellular adhesion molecules-1 and -2, vascular cell adhesion molecule-1, and neutrophil infiltration, whereas IL-33 expression decreased. RV apoptosis was confirmed by terminal deoxynucleotide transferase-mediated deoxyuridine triphosphate nick-end labeling staining. Methylprednisolone pre-treatment prevented RV-arterial uncoupling and decreased RV expression of TNF-α, IL-1 receptor-like-2, intercellular adhesion molecule-1, vascular cell adhesion molecule-1, and neutrophil infiltration. RV Ees/Ea was inversely correlated with RV TNF-α and IL-6 expression. Brain death-induced RV dysfunction is associated with RV activation of inflammation and apoptosis and is partly limited by methylprednisolone.

  13. Estimation of distributional parameters for censored trace level water quality data: 1. Estimation techniques

    USGS Publications Warehouse

    Gilliom, Robert J.; Helsel, Dennis R.

    1986-01-01

    A recurring difficulty encountered in investigations of many metals and organic contaminants in ambient waters is that a substantial portion of water sample concentrations are below limits of detection established by analytical laboratories. Several methods were evaluated for estimating distributional parameters for such censored data sets using only uncensored observations. Their reliabilities were evaluated by a Monte Carlo experiment in which small samples were generated from a wide range of parent distributions and censored at varying levels. Eight methods were used to estimate the mean, standard deviation, median, and interquartile range. Criteria were developed, based on the distribution of uncensored observations, for determining the best performing parameter estimation method for any particular data set. The most robust method for minimizing error in censored-sample estimates of the four distributional parameters over all simulation conditions was the log-probability regression method. With this method, censored observations are assumed to follow the zero-to-censoring level portion of a lognormal distribution obtained by a least squares regression between logarithms of uncensored concentration observations and their z scores. When method performance was separately evaluated for each distributional parameter over all simulation conditions, the log-probability regression method still had the smallest errors for the mean and standard deviation, but the lognormal maximum likelihood method had the smallest errors for the median and interquartile range. When data sets were classified prior to parameter estimation into groups reflecting their probable parent distributions, the ranking of estimation methods was similar, but the accuracy of error estimates was markedly improved over those without classification.
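
    The log-probability regression method described above can be sketched in a few lines. The illustration below (hypothetical concentrations, not the paper's Monte Carlo setup) regresses the log concentrations of detected observations on their normal scores and then imputes values below the LOD from the fitted line, the "robust" imputation variant of the idea, before computing summary statistics.

```python
# Sketch: log-probability regression (regression on order statistics)
# for a left-censored sample. Data, LOD, and plotting positions are
# illustrative assumptions.
import numpy as np
from scipy import stats

conc = np.array([0.5, 0.5, 0.5, 0.8, 1.1, 1.6, 2.4, 3.9, 6.0, 9.5])  # sorted
lod = 0.6
detected = conc >= lod

# Weibull plotting positions i/(n+1) for the ordered sample -> z scores
n = conc.size
pp = np.arange(1, n + 1) / (n + 1)
z = stats.norm.ppf(pp)

# least-squares line through (z, log conc) using the detects only
slope, intercept, *_ = stats.linregress(z[detected], np.log(conc[detected]))

# impute the censored observations from the fitted lognormal line,
# i.e. the zero-to-censoring-level portion of the distribution
imputed = conc.copy()
imputed[~detected] = np.exp(intercept + slope * z[~detected])

print(imputed.mean(), imputed.std(ddof=1))
```

Because the censored values are reconstructed from the lognormal fit rather than replaced by a constant, the mean and standard deviation are far less biased than under simple substitution, which is the property the Monte Carlo experiment measures.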

  14. Estimation of distributional parameters for censored trace level water quality data. 1. Estimation Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilliom, R.J.; Helsel, D.R.

    1986-02-01

    A recurring difficulty encountered in investigations of many metals and organic contaminants in ambient waters is that a substantial portion of water sample concentrations are below limits of detection established by analytical laboratories. Several methods were evaluated for estimating distributional parameters for such censored data sets using only uncensored observations. Their reliabilities were evaluated by a Monte Carlo experiment in which small samples were generated from a wide range of parent distributions and censored at varying levels. Eight methods were used to estimate the mean, standard deviation, median, and interquartile range. Criteria were developed, based on the distribution of uncensored observations, for determining the best performing parameter estimation method for any particular data set. The most robust method for minimizing error in censored-sample estimates of the four distributional parameters over all simulation conditions was the log-probability regression method. With this method, censored observations are assumed to follow the zero-to-censoring level portion of a lognormal distribution obtained by a least squares regression between logarithms of uncensored concentration observations and their z scores. When method performance was separately evaluated for each distributional parameter over all simulation conditions, the log-probability regression method still had the smallest errors for the mean and standard deviation, but the lognormal maximum likelihood method had the smallest errors for the median and interquartile range. When data sets were classified prior to parameter estimation into groups reflecting their probable parent distributions, the ranking of estimation methods was similar, but the accuracy of error estimates was markedly improved over those without classification.

  15. MODELING LEFT-TRUNCATED AND RIGHT-CENSORED SURVIVAL DATA WITH LONGITUDINAL COVARIATES

    PubMed Central

    Su, Yu-Ru; Wang, Jane-Ling

    2018-01-01

    There is a surge in medical follow-up studies that include longitudinal covariates in the modeling of survival data. So far, the focus has been largely on right censored survival data. We consider survival data that are subject to both left truncation and right censoring. Left truncation is well known to produce a biased sample. The sampling bias issue has been resolved in the literature for the case in which baseline or time-varying covariates are observable. The problem remains open, however, for the important case where longitudinal covariates are present in survival models. A joint likelihood approach has been shown in the literature to provide an effective way to overcome those difficulties for right censored data, but this approach faces substantial additional challenges in the presence of left truncation. Here we thus propose an alternative likelihood to overcome these difficulties and show that the regression coefficient in the survival component can be estimated unbiasedly and efficiently. Issues about the bias for the longitudinal component are discussed. The new approach is illustrated numerically through simulations and data from a multi-center AIDS cohort study. PMID:29479122
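
    The effect of left truncation on a nonparametric survival estimate can be sketched with delayed entry. In the minimal illustration below (hypothetical entry/exit data, not the AIDS cohort), each subject joins the risk set only at its truncation time, so the product-limit estimate uses a risk set that grows and shrinks instead of starting with everyone at t = 0.

```python
# Sketch: product-limit (Kaplan-Meier) estimate with delayed entry,
# handling left truncation plus right censoring. Toy data only.

def km_left_truncated(entry, stop, event):
    """entry: left-truncation (delayed-entry) times;
    stop: observed event/censoring times; event: 1 = event, 0 = censored."""
    fail_times = sorted({t for t, e in zip(stop, event) if e == 1})
    surv, out = 1.0, []
    for t in fail_times:
        # a subject is at risk at t only after entering and before leaving
        at_risk = sum(1 for en, st in zip(entry, stop) if en < t <= st)
        deaths = sum(1 for st, e in zip(stop, event) if st == t and e == 1)
        surv *= 1 - deaths / at_risk
        out.append((t, surv))
    return out

entry = [0, 0, 2, 4]      # subjects 3 and 4 enter late (left truncated)
stop  = [5, 3, 8, 6]
event = [1, 0, 1, 1]
print(km_left_truncated(entry, stop, event))
```

Ignoring the entry times here would overstate the early risk sets and bias the estimate, which is the sampling-bias problem the paper's likelihood must also address.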

  16. Outcomes of Kidney Transplantation Abroad: A Single-Center Canadian Cohort Study.

    PubMed

    Quach, Kevin; Sultan, Heebah; Li, Yanhong; Famure, Olusegun; Kim, S Joseph

    2016-03-01

    An increasing demand for kidney transplantation has enticed some patients with end-stage renal disease (ESRD) to venture outside their country of residence, but their posttransplant outcomes may be suboptimal. We compared the risks and clinical outcomes among tourists, or patients who pursue a kidney transplant abroad, versus patients who received a transplant at the Toronto General Hospital (TGH). A single-center, 1:3 matched (based on age at transplant, time on dialysis, and year of transplant) cohort study was conducted. Forty-five tourists were matched with 135 domestic transplant recipients between January 1, 2000, and December 31, 2011. Multivariable Cox proportional hazards models were fitted to assess graft and patient outcomes. Among the 45 tourists, the majority (38 of 45) traveled to the Middle East or Far East Asia, and most received living donor kidney transplants (35 of 45). Multivariable Cox proportional hazards models showed that tourists had a higher risk for the composite outcome of acute rejection, death-censored graft failure, or death with graft function (DWGF; hazard ratio [HR] 2.08, 95% confidence interval [CI]: 1.06-4.07). Tourists also showed a higher risk for the individual end points of acute rejection, DWGF, and posttransplant hospitalizations. Patients going abroad for kidney transplantation may have inferior outcomes compared to domestic patients receiving kidney transplants. Patients who are contemplating an overseas transplant need to be aware of the increased risk of adverse posttransplant outcomes and should be appropriately counseled by transplant professionals during the pretransplant evaluation process. © 2016, NATCO.

  17. Midterm outcomes of the Zenith Renu AAA Ancillary Graft.

    PubMed

    Jim, Jeffrey; Rubin, Brian G; Geraghty, Patrick J; Money, Samuel R; Sanchez, Luis A

    2011-08-01

    The Zenith Renu abdominal aortic aneurysm (AAA) Ancillary Graft (Cook Medical Inc, Bloomington, Ind) provides active proximal fixation for treatment of pre-existing endografts with failed or failing proximal fixation or seal. The purpose of this study was to evaluate the midterm outcomes of treatment with this device. From September 2005 to November 2006, a prospective, nonrandomized, multicenter, postmarket registry was utilized to collect physician experiences from 151 cases (89 converters and 62 main body extensions) at 95 institutions. Preoperative indications and procedural and postimplantation outcomes were collected and analyzed. Technical success and clinical success were determined as defined by the Society for Vascular Surgery reporting standards. Patients were predominantly male (87%) with a mean age of 77 years. The interval from the original endograft implantation to Renu treatment was 43.4 ± 18.7 months. The indications for treatment were endoleak (n = 111), migration (n = 136), or both (n = 94). Technical success was 98.0%, with two cases of intraoperative conversion and one case of persistent type IA endoleak. The median follow-up for the cohort was 45.0 months (range, 0-56 months; interquartile range, 25.0 months). Overall, 32 cases had treatment failures that included at least one of the following: death (n = 5), type I/III endoleak (n = 18), graft infection (n = 1), thrombosis (n = 1), aneurysm enlargement >5 mm (n = 9), rupture (n = 4), conversion (n = 9, with 7 after 30 days), and migration (n = 1). Overall, the clinical success for the entire cohort during the follow-up period was 78.8% (119/151). The postmarket registry data confirm that the Zenith Renu AAA Ancillary Graft can be used to treat endovascular repairs that failed due to proximal attachment failures. Salvage treatment with the Renu device had a high technical success rate and resulted in clinical success in a majority of patients (78.8%).
While failed endovascular repairs can

  18. Assessment of independent predictors for long-term mortality between women and men after coronary artery bypass grafting: are women different from men?

    PubMed

    Toumpoulis, Ioannis K; Anagnostopoulos, Constantine E; Balaram, Sandhya K; Rokkas, Chris K; Swistel, Daniel G; Ashton, Robert C; DeRose, Joseph J

    2006-02-01

    The long-term mortality of coronary artery bypass grafting in women is not certain. The purpose of this study was to determine and compare risk factors for long-term mortality in women and men undergoing coronary artery bypass grafting. Between 1992 and 2002, 3760 consecutive patients (2598 men and 1162 women) underwent isolated coronary artery bypass grafting. Long-term survival data were obtained from the National Death Index (mean follow-up, 5.1 +/- 3.2 years). Multivariable Cox regression analysis was performed, including 64 preoperative, intraoperative, and postoperative factors separately in women and men. There were no differences in in-hospital mortality (2.7% in men vs 2.9% in women, P = .639) and 5-year survival (82.0% +/- 0.8% in men vs 81.1% +/- 1.3% in women, P = .293). After adjustment for all independent predictors of long-term mortality, female sex was an independent predictor of improved 5-year survival (hazard ratio, 0.82; 95% confidence interval, 0.71-0.96; P = .014). Twenty-one independent predictors of long-term mortality were determined in men, whereas only 12 were determined in women. There were 9 common risk factors (age, ejection fraction, diabetes mellitus, ≥2 arterial grafts, postoperative myocardial infarction, deep sternal wound infection, sepsis and/or endocarditis, gastrointestinal complications, and respiratory failure); however, their weights were different between women and men. Malignant ventricular arrhythmias, calcified aorta, and preoperative renal failure were independent predictors only in women. Emergency operation, previous cardiac operation, peripheral vascular disease, left ventricular hypertrophy, current and past congestive heart failure, chronic obstructive pulmonary disease, body mass index of greater than 29, preoperative dialysis, thrombolysis within 7 days before coronary artery bypass grafting, intraoperative stroke, and postoperative renal failure were independent predictors only in men. Despite equality

  19. Graft-versus-host disease causes failure of donor hematopoiesis and lymphopoiesis in interferon-gamma receptor-deficient hosts.

    PubMed

    Delisle, Jean-Sébastien; Gaboury, Louis; Bélanger, Marie-Pier; Tassé, Eliane; Yagita, Hideo; Perreault, Claude

    2008-09-01

    The immunopathologic condition known as graft-versus-host disease (GVHD) results from a type I T-cell process. However, a prototypical type I cytokine, interferon-gamma (IFN-gamma), can protect against several manifestations of GVHD in recipients of major histocompatibility complex (MHC)-mismatched hematopoietic cells. We transplanted hematopoietic cells from C3H.SW donors in wild-type (wt) and IFN-gamma-receptor-deficient (IFN-gammaRKO) MHC-matched C57BL/6 recipients. In IFN-gammaRKO recipients, host cells were unresponsive to IFN-gamma, whereas wt donor cells were exposed to exceptionally high levels of IFN-gamma. From an IFN-gamma perspective, we could therefore evaluate the impact of a loss-of-function on host cells and gain-of-function on donor cells. We found that lack of IFN-gammaR prevented up-regulation of MHC proteins on host cells but did not mitigate damage to most target organs. Two salient phenotypes in IFN-gammaRKO recipients involved donor cells: lymphoid hypoplasia and hematopoietic failure. Lymphopenia was due to FasL-induced apoptosis and decreased cell proliferation. Bone marrow aplasia resulted from a decreased proliferation of hematopoietic stem/progenitor cells that was associated with down-regulation of 2 genes negatively regulated by IFN-gamma: Ccnd1 and Myc. We conclude that IFN-gamma produced by alloreactive T cells may entail a severe graft-versus-graft reaction and could be responsible for cytopenias that are frequently observed in subjects with GVHD.

  20. Drug Concentration Thresholds Predictive of Therapy Failure and Death in Children With Tuberculosis: Bread Crumb Trails in Random Forests

    PubMed Central

    Swaminathan, Soumya; Pasipanodya, Jotam G.; Ramachandran, Geetha; Hemanth Kumar, A. K.; Srivastava, Shashikant; Deshpande, Devyani; Nuermberger, Eric; Gumbo, Tawanda

    2016-01-01

    Background. The role of drug concentrations in clinical outcomes in children with tuberculosis is unclear. Target concentrations for dose optimization are unknown. Methods. Plasma drug concentrations measured in Indian children with tuberculosis were modeled using compartmental pharmacokinetic analyses. The children were followed until end of therapy to ascertain therapy failure or death. An ensemble of artificial intelligence algorithms, including random forests, was used to identify predictors of clinical outcome from among 30 clinical, laboratory, and pharmacokinetic variables. Results. Among the 143 children with known outcomes, there was high between-child variability of isoniazid, rifampin, and pyrazinamide concentrations: 110 (77%) completed therapy, 24 (17%) failed therapy, and 9 (6%) died. The main predictors of therapy failure or death were a pyrazinamide peak concentration <38.10 mg/L and rifampin peak concentration <3.01 mg/L. The relative risk of these poor outcomes below these peak concentration thresholds was 3.64 (95% confidence interval [CI], 2.28–5.83). Isoniazid had concentration-dependent antagonism with rifampin and pyrazinamide, with an adjusted odds ratio for therapy failure of 3.00 (95% CI, 2.08–4.33) in the antagonism concentration range. In regard to death alone as an outcome, the same drug concentrations, plus z scores (indicators of malnutrition), and age <3 years, were highly ranked predictors. In children <3 years old, isoniazid 0- to 24-hour area under the concentration-time curve <11.95 mg/L × hour and/or rifampin peak <3.10 mg/L were the best predictors of therapy failure, with relative risk of 3.43 (95% CI, .99–11.82). Conclusions. We have identified new antibiotic target concentrations, which are potential biomarkers associated with treatment failure and death in children with tuberculosis. PMID:27742636

  1. Drug Concentration Thresholds Predictive of Therapy Failure and Death in Children With Tuberculosis: Bread Crumb Trails in Random Forests.

    PubMed

    Swaminathan, Soumya; Pasipanodya, Jotam G; Ramachandran, Geetha; Hemanth Kumar, A K; Srivastava, Shashikant; Deshpande, Devyani; Nuermberger, Eric; Gumbo, Tawanda

    2016-11-01

    Background. The role of drug concentrations in clinical outcomes in children with tuberculosis is unclear. Target concentrations for dose optimization are unknown. Methods. Plasma drug concentrations measured in Indian children with tuberculosis were modeled using compartmental pharmacokinetic analyses. The children were followed until end of therapy to ascertain therapy failure or death. An ensemble of artificial intelligence algorithms, including random forests, was used to identify predictors of clinical outcome from among 30 clinical, laboratory, and pharmacokinetic variables. Results. Among the 143 children with known outcomes, there was high between-child variability of isoniazid, rifampin, and pyrazinamide concentrations: 110 (77%) completed therapy, 24 (17%) failed therapy, and 9 (6%) died. The main predictors of therapy failure or death were a pyrazinamide peak concentration <38.10 mg/L and rifampin peak concentration <3.01 mg/L. The relative risk of these poor outcomes below these peak concentration thresholds was 3.64 (95% confidence interval [CI], 2.28-5.83). Isoniazid had concentration-dependent antagonism with rifampin and pyrazinamide, with an adjusted odds ratio for therapy failure of 3.00 (95% CI, 2.08-4.33) in the antagonism concentration range. In regard to death alone as an outcome, the same drug concentrations, plus z scores (indicators of malnutrition), and age <3 years, were highly ranked predictors. In children <3 years old, isoniazid 0- to 24-hour area under the concentration-time curve <11.95 mg/L × hour and/or rifampin peak <3.10 mg/L were the best predictors of therapy failure, with relative risk of 3.43 (95% CI, .99-11.82). Conclusions. We have identified new antibiotic target concentrations, which are potential biomarkers associated with treatment failure and death in children with tuberculosis. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America.

  2. Donor Indocyanine Green Clearance Test Predicts Graft Quality and Early Graft Prognosis After Liver Transplantation.

    PubMed

    Tang, Yunhua; Han, Ming; Chen, Maogen; Wang, Xiaoping; Ji, Fei; Zhao, Qiang; Zhang, Zhiheng; Ju, Weiqiang; Wang, Dongping; Guo, Zhiyong; He, Xiaoshun

    2017-11-01

    Transplantation centers have given much attention to donor availability. However, no reliable quantitative methods have been employed to accurately assess graft quality before transplantation. Here, we report that the indocyanine green (ICG) clearance test is a valuable index for liver grafts. We performed the ICG clearance test on 90 brain-dead donors within 6 h before organ procurement between March 2015 and November 2016. We also analyzed the relationship between graft liver function and early graft survival after liver transplantation (LT). Our results suggest that the ICG retention rate at 15 min (ICGR15) of donors before procurement was independently associated with 3-month graft survival after LT. The best donor ICGR15 cutoff value was 11.0%/min, and we observed a significant increase in 3-month graft failure among patients with a donor ICGR15 above this value. On the other hand, a donor ICGR15 value of ≤ 11.0%/min could be used as an early assessment index of graft quality because it provides additional information to the transplant surgeon or organ procurement organization members who must maintain or improve organ function in preparation for LT. An ICG clearance test before liver procurement might be an effective quantitative method to predict graft availability and improve early graft prognosis after LT.

  3. Inverse probability weighted least squares regression in the analysis of time-censored cost data: an evaluation of the approach using SEER-Medicare.

    PubMed

    Griffiths, Robert I; Gleeson, Michelle L; Danese, Mark D; O'Hagan, Anthony

    2012-01-01

    To assess the accuracy and precision of inverse probability weighted (IPW) least squares regression analysis for censored cost data. By using Surveillance, Epidemiology, and End Results-Medicare, we identified 1500 breast cancer patients who died and had complete cost information within the database. Patients were followed for up to 48 months (partitions) after diagnosis, and their actual total cost was calculated in each partition. We then simulated patterns of administrative and dropout censoring and also added censoring to patients receiving chemotherapy to simulate comparing a newer to older intervention. For each censoring simulation, we performed 1000 IPW regression analyses (bootstrap, sampling with replacement), calculated the average value of each coefficient in each partition, and summed the coefficients for each regression parameter to obtain the cumulative values from 1 to 48 months. The cumulative, 48-month, average cost was $67,796 (95% confidence interval [CI] $58,454-$78,291) with no censoring, $66,313 (95% CI $54,975-$80,074) with administrative censoring, and $66,765 (95% CI $54,510-$81,843) with administrative plus dropout censoring. In multivariate analysis, chemotherapy was associated with increased cost of $25,325 (95% CI $17,549-$32,827) compared with $28,937 (95% CI $20,510-$37,088) with administrative censoring and $29,593 ($20,564-$39,399) with administrative plus dropout censoring. Adding censoring to the chemotherapy group resulted in less accurate IPW estimates. This was ameliorated, however, by applying IPW within treatment groups. IPW is a consistent estimator of population mean costs if the weight is correctly specified. If the censoring distribution depends on some covariates, a model that accommodates this dependency must be correctly specified in IPW to obtain accurate estimates. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
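    The IPW idea evaluated in this record can be sketched in a few lines: complete (uncensored) subjects are up-weighted by the inverse of their estimated probability of remaining uncensored, obtained from a Kaplan-Meier curve of the *censoring* distribution. The following is a minimal, hypothetical illustration with invented data conventions, not the authors' SEER-Medicare code; the regression and bootstrap steps are omitted.

```python
# Hypothetical sketch: IPW estimate of mean cumulative cost under censoring.

def censoring_km(times, censored):
    """Kaplan-Meier curve K(t) of the *censoring* process: here the
    'event' is censoring, and deaths simply leave the risk set."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    n_at_risk, surv, curve = len(times), 1.0, []
    for i in order:
        if censored[i]:
            surv *= (n_at_risk - 1) / n_at_risk
        curve.append((times[i], surv))   # step points (time, K(t))
        n_at_risk -= 1
    return curve

def K_at(curve, t):
    """Left-continuous lookup K(t-): P(still uncensored just before t)."""
    k = 1.0
    for time, s in curve:
        if time < t:
            k = s
        else:
            break
    return k

def ipw_mean_cost(times, censored, costs):
    """Horvitz-Thompson style IPW estimate of the mean cumulative cost."""
    curve = censoring_km(times, censored)
    total = sum(cost / K_at(curve, t)
                for t, c, cost in zip(times, censored, costs) if not c)
    return total / len(times)
```

With no censoring all weights equal 1 and the estimator reduces to the ordinary sample mean; in the study the same weights enter a weighted least-squares regression rather than a simple mean.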

  4. A nonparametric method for assessment of interactions in a median regression model for analyzing right censored data.

    PubMed

    Lee, MinJae; Rahbar, Mohammad H; Talebi, Hooshang

    2018-01-01

    We propose a nonparametric test for interactions when we are concerned with investigation of the simultaneous effects of two or more factors in a median regression model with right censored survival data. Our approach is developed to detect interaction in special situations, when the covariates have a finite number of levels with a limited number of observations in each level, and it allows varying levels of variance and censorship at different levels of the covariates. Through simulation studies, we compare the power of detecting an interaction between the study group variable and a covariate using our proposed procedure with that of the Cox proportional hazards (PH) model and censored quantile regression model. We also assess the impact of censoring rate and type on the standard error of the estimators of parameters. Finally, we illustrate application of our proposed method to real-life data from the Prospective Observational Multicenter Major Trauma Transfusion (PROMMTT) study to test an interaction effect between type of injury and study sites using median time for a trauma patient to receive three units of red blood cells. The results from simulation studies indicate that our procedure performs better than both the Cox PH model and the censored quantile regression model based on statistical power for detecting the interaction, especially when the number of observations is small. It is also relatively less sensitive to censoring rates or even the presence of conditionally independent censoring that is conditional on the levels of covariates.

  5. Censored rainfall modelling for estimation of fine-scale extremes

    NASA Astrophysics Data System (ADS)

    Cross, David; Onof, Christian; Winter, Hugo; Bernardara, Pietro

    2018-01-01

    Reliable estimation of rainfall extremes is essential for drainage system design, flood mitigation, and risk quantification. However, traditional techniques lack physical realism and extrapolation can be highly uncertain. In this study, we improve the physical basis for short-duration extreme rainfall estimation by simulating the heavy portion of the rainfall record mechanistically using the Bartlett-Lewis rectangular pulse (BLRP) model. Mechanistic rainfall models have had a tendency to underestimate rainfall extremes at fine temporal scales. Despite this, the simple process representation of rectangular pulse models is appealing in the context of extreme rainfall estimation because it emulates the known phenomenology of rainfall generation. A censored approach to Bartlett-Lewis model calibration is proposed and performed for single-site rainfall from two gauges in the UK and Germany. Extreme rainfall estimation is performed for each gauge at the 5, 15, and 60 min resolutions, and considerations for censor selection are discussed.

  6. Effect of circulatory assistance on premature death and long-term prognosis.

    PubMed

    Sánchez Lázaro, I J; Almenar Bonet, L; Martínez-Dolz, L; Moro López, J; Rueda Soriano, J; Arnau Vives, M A; Buendía Fuentes, F; Ortiz Martínez, V; Cano Pérez, O; Sánchez Soriano, R; Salvador Sanz, A

    2008-11-01

    Patients undergoing urgent heart transplantation (HT) have a poorer prognosis and more long-term complications. The objective of this study was to compare the preoperative course in patients undergoing urgent HT according to the need for preoperative intra-aortic balloon counterpulsation (IABP). We studied 102 consecutive patients, including 23 patients with IABP, who underwent urgent HT between January 2000 and September 2006. We excluded patients who received combination transplants, those who underwent repeat HT, and pediatric patients who underwent HT. The statistical methods used were the t test for quantitative variables and the chi-square test for qualitative variables. A logistic regression model was constructed to assess the possible relationship between IABP and other variables on premature death within 30 days after HT. Mean (SD) patient age was 50 (10) years. No significant differences were observed in baseline characteristics between the IABP and the non-IABP groups. The IABP patient group had higher rates of acute graft failure (45.5% vs 35.4%; P = .46) and premature death (18.8% vs 14.8%; P = .67) and shorter long-term survival (40.6 [34.9] vs 54.5 [43.7] mo; P = .30). Multivariate analysis demonstrated no association between the need for IABP and increased frequency of premature death. Use of IABP is not associated with premature or late death. We recommend use of IABP in patients with acute decompensated heart failure to stabilize them before HT.

  7. Indications and results of omental pedicle grafts in oncology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petit, J.Y.; Lacour, J.; Margulis, A.

    1979-12-01

    Sixty omental grafts were performed in our department. Sixty-two percent concerned breast cancer patients. Other grafts were undertaken for other cancers: head and neck; gynecologic, urologic, and intestinal; and skin and soft tissue tumors. These grafts were indicated for radionecrosis or chemonecrosis in 33 cases and for cancer recurrence in 26 cases (among whom 24 were previously irradiated). Only one graft was performed for lymphoedema treatment. Overall, fifty-four patients (83.5%) had successful grafts, and six (16.5%) had graft failures. According to the treated lesion, we obtained an 82% success rate among patients treated for radionecrosis or chemonecrosis, and 92% for patients treated for recurrences.

  8. The biomechanical strength of a hardware-free femoral press-fit method for ACL bone-tendon-bone graft fixation.

    PubMed

    Arnold, M P; Burger, L D; Wirz, D; Goepfert, B; Hirschmann, M T

    2017-04-01

    The purpose was to investigate graft slippage and ultimate load to failure of a femoral press-fit fixation technique for anterior cruciate ligament (ACL) reconstruction. Nine fresh-frozen knees were used. Standardized harvesting of the B-PT-B graft was performed. The femora were cemented into steel rods, and a tunnel was drilled outside-in into the native ACL footprint and expanded using a manual mill bit. The femoral bone block was fixed press-fit. To pull the free end of the graft, it was fixed to a mechanical testing machine using a deep-freezing technique. A motion capture system was used to assess three-dimensional micro-motion. After preconditioning of the graft, 1000 cycles of tensile loading were applied. Finally, an ultimate load to failure test was performed. Graft slippage in mm and ultimate load to failure, as well as the type of failure, were noted. In six of the nine measured specimens, a typical pattern of graft slippage was observed during cyclic loading. For technical reasons, the results of three knees had to be discarded. 78.6% of total graft slippage occurred in the first 100 cycles. Once the block had settled, graft slippage converged to zero, highlighting the importance of initial preconditioning of the graft in the clinical setting. Graft slippage after 1000 cycles varied around 3.4 ± 3.2 mm (R = 1.3-9.8 mm) between the specimens. Ultimate loading (n = 9) revealed two characteristic patterns of failure. In four knees, the tendon ruptured, while in five knees the bone block was pulled out of the femoral tunnel. The median ultimate load to failure was 852 N (R = 448-1349 N). The implant-free femoral press-fit fixation provided adequate primary stability with ultimate load to failure pull forces at least equal to published results for interference screws; hence, its clinical application is shown to be safe.

  9. Angiotensin-converting enzyme inhibitors and angiotensin receptor blockers in renal transplantation between 1990 and 2002 in Spain.

    PubMed

    Hernández, Angel Alonso; Moreso, Francesc; Bayés, Beatriz; Lauzurica, Ricardo; Sánz-Guajardo, Dámaso; Gómez-Huertas, Ernesto; Pereira, Porfirio; Paul, Javier; Crespo, Josep; Amenábar, Juan J; Oliver, Juan; Serón, Daniel

    2010-06-01

    Background. Angiotensin-converting enzyme inhibitors (ACEI) and angiotensin II type 1 receptor blockers (ARB) decrease cardiovascular mortality and slow the progression of renal disease in non-transplant patients, but their impact on kidney transplant outcome has not been well established. Methods. Patients receiving a renal allograft in Spain in 1990, 1994, 1998 and 2002 were considered for the present study. Only adult (>/=18 years) recipients of a single kidney transplant functioning at the end of the first year were considered. A total of 4842 patients with clinical data about ACEI/ARB therapy were included. Results. During the initial 2 years after transplant, ACEI/ARB were less frequently used in the 1990 and 1994 cohorts than in 1998 and 2002 (15.1%, 24.6%, 33.5% and 45.1%, respectively; P < 0.001). During the first year, a total of 1063 patients (22.8%) received ACEI/ARB treatment, and graft survival (50.0% for treated patients and 51.4% for untreated, P = ns), death-censored graft survival (60.6% versus 63.5%, P = ns) and patient survival (68.8% versus 66.6%, P = ns) were not different. During the initial 2 years, 1472 patients (31.4%) received treatment with ACEI/ARB, and graft survival tended to be higher in treated patients (54.4% and 50.9%, P = 0.063). Since there was an interaction between ACEI/ARB treatment and year of transplant, graft survival was analysed in each cohort. Cox regression analysis including the propensity score for ACEI/ARB treatment showed an association between ACEI/ARB treatment and graft survival in the 2002 cohort (relative risk 0.36 and 95% confidence interval 0.17-0.75, P = 0.007). Death-censored graft survival (63.8% versus 63.1%, P = ns) and patient survival (68.1% and 66.5%, P = ns) were not significantly different. Conclusions. The use of ACEI/ARB during the initial 2 years after transplantation was associated with a better graft survival, but this effect was only observed in the 2002 cohort.

  10. Angiotensin-converting enzyme inhibitors and angiotensin receptor blockers in renal transplantation between 1990 and 2002 in Spain

    PubMed Central

    Hernández, Ángel Alonso; Moreso, Francesc; Bayés, Beatriz; Lauzurica, Ricardo; Sánz-Guajardo, Dámaso; Gómez-Huertas, Ernesto; Pereira, Porfirio; Paul, Javier; Crespo, Josep; Amenábar, Juan J.; Oliver, Juan; Serón, Daniel

    2010-01-01

    Background. Angiotensin-converting enzyme inhibitors (ACEI) and angiotensin II type 1 receptor blockers (ARB) decrease cardiovascular mortality and slow the progression of renal disease in non-transplant patients, but their impact on kidney transplant outcome has not been well established. Methods. Patients receiving a renal allograft in Spain in 1990, 1994, 1998 and 2002 were considered for the present study. Only adult (≥18 years) recipients of a single kidney transplant functioning at the end of the first year were considered. A total of 4842 patients with clinical data about ACEI/ARB therapy were included. Results. During the initial 2 years after transplant, ACEI/ARB were less frequently used in the 1990 and 1994 cohorts than in 1998 and 2002 (15.1%, 24.6%, 33.5% and 45.1%, respectively; P < 0.001). During the first year, a total of 1063 patients (22.8%) received ACEI/ARB treatment, and graft survival (50.0% for treated patients and 51.4% for untreated, P = ns), death-censored graft survival (60.6% versus 63.5%, P = ns) and patient survival (68.8% versus 66.6%, P = ns) were not different. During the initial 2 years, 1472 patients (31.4%) received treatment with ACEI/ARB, and graft survival tended to be higher in treated patients (54.4% and 50.9%, P = 0.063). Since there was an interaction between ACEI/ARB treatment and year of transplant, graft survival was analysed in each cohort. Cox regression analysis including the propensity score for ACEI/ARB treatment showed an association between ACEI/ARB treatment and graft survival in the 2002 cohort (relative risk 0.36 and 95% confidence interval 0.17–0.75, P = 0.007). Death-censored graft survival (63.8% versus 63.1%, P = ns) and patient survival (68.1% and 66.5%, P = ns) were not significantly different. Conclusions. The use of ACEI/ARB during the initial 2 years after transplantation was associated with a better graft survival, but this effect was only observed in the 2002 cohort. PMID:20508862

  11. Death after discharge: predictors of mortality in older brain-injured patients.

    PubMed

    Peck, Kimberly A; Calvo, Richard Y; Sise, C Beth; Johnson, Jeffrey; Yen, Jessica W; Sise, Michael J; Dunne, Casey E; Badiee, Jayraan; Shackford, Steven R; Lobatz, Michael A

    2014-12-01

    Older patients with traumatic brain injury (TBI) may be at high risk of death after hospitalization. The purpose of this study was to characterize long-term mortality of older TBI patients who survived to discharge. We hypothesized that predictors of postdischarge mortality differed from those of inpatient mortality. A retrospective cohort study was performed on TBI patients older than 55 years admitted to our Level I trauma center between July 1, 2006, and December 31, 2011. Postdischarge deaths were identified by matching patient data with local vital records up to December 31, 2011, when data collection was terminated (censoring). Patients were categorized by age, comorbidities, history of preinjury anticoagulant/prescription antiplatelet agent therapy, injury severity indices, initial TBI type, prehospital living status, discharge location, and discharge condition. The effect of risk factors on postdischarge mortality was evaluated by Cox proportional hazards modeling. Of 353 patients, 322 (91.2%) survived to discharge. Postdischarge mortality was 19.8% (n = 63) for the study period. Of the postdischarge deaths, 54.0% died within 6 months of discharge, and 68.3% died within 1 year. Median days to death after discharge or censoring were 149 and 410, respectively. Factors associated with death after discharge included age, preinjury anticoagulant use, higher number of Charlson comorbidities, discharge to a long-term care facility, and severe disability. Factors related to injury severity (i.e., Injury Severity Score [ISS], initial Glasgow Coma Scale [GCS] score) and preinjury prescription antiplatelet agent use, previously found to predict inpatient death, did not predict postdischarge mortality. Older TBI patients who survive to discharge have a significant risk of death within 1 year. Predictors of postdischarge mortality and inpatient death differ. Death after discharge is largely a function of overall health status. Monitoring health status and continued

  12. Xenon treatment protects against cold ischemia associated delayed graft function and prolongs graft survival in rats.

    PubMed

    Zhao, H; Watts, H R; Chong, M; Huang, H; Tralau-Stewart, C; Maxwell, P H; Maze, M; George, A J T; Ma, D

    2013-08-01

    Prolonged hypothermic storage causes ischemia-reperfusion injury (IRI) in the renal graft, which is considered to contribute to the occurrence of the delayed graft function (DGF) and chronic graft failure. Strategies are required to protect the graft and to prolong renal graft survival. We demonstrated that xenon exposure to human proximal tubular cells (HK-2) led to activation of a range of protective proteins. Xenon treatment prior to or after hypothermia-hypoxia challenge stabilized the HK-2 cellular structure, diminished cytoplasmic translocation of high-mobility group box (HMGB) 1 and suppressed NF-κB activation. In the syngeneic Lewis-to-Lewis rat model of kidney transplantation, xenon exposure to donors before graft retrieval or to recipients after engraftment decreased caspase-3 expression, localized HMGB-1 within nuclei and prevented TLR-4/NF-κB activation in tubular cells; serum pro-inflammatory cytokines IL-1β, IL-6 and TNF-α were reduced and renal function was preserved. Xenon treatment of graft donors or of recipients prolonged renal graft survival following IRI in both Lewis-to-Lewis isografts and Fischer-to-Lewis allografts. Xenon-induced cell survival or graft functional recovery was abolished by HIF-1α siRNA. Our data suggest that xenon treatment attenuates DGF and enhances graft survival. This approach could be translated into clinical practice leading to a considerable improvement in long-term graft survival. © Copyright 2013 The American Society of Transplantation and the American Society of Transplant Surgeons.

  13. Impact of perioperative myocardial infarction on angiographic and clinical outcomes following coronary artery bypass grafting (from PRoject of Ex-vivo Vein graft ENgineering via Transfection [PREVENT] IV).

    PubMed

    Yau, James M; Alexander, John H; Hafley, Gail; Mahaffey, Kenneth W; Mack, Michael J; Kouchoukos, Nicholas; Goyal, Abhinav; Peterson, Eric D; Gibson, C Michael; Califf, Robert M; Harrington, Robert A; Ferguson, T Bruce

    2008-09-01

    Myocardial infarction (MI) after coronary artery bypass grafting (CABG) is associated with significant morbidity and mortality. Frequency, management, mechanisms, and angiographic and clinical outcomes associated with perioperative MI remain poorly understood. PREVENT IV was a multicenter, randomized, placebo-controlled trial of edifoligide in 3,014 patients undergoing CABG. Angiographic and 2-year clinical follow-up were complete for 1,920 and 2,956 patients, respectively. Perioperative MI was defined as creatine kinase-MB increase ≥10 times the upper limit of normal or ≥5 times the upper limit of normal with new 30-ms Q waves within 24 hours of surgery. Baseline characteristics, in-hospital management, and angiographic and clinical outcomes of patients with and without perioperative MI were compared. Perioperative MI occurred in 294 patients (9.8%). Patients with perioperative MI had longer surgery (250 vs 230 minutes; p <0.001), more on-pump surgery (83% vs 78%; p = 0.048), and worse target-artery quality (p <0.001). Patients with perioperative MI more frequently underwent angiography within 30 days of enrollment (1.7% vs 0.6%; p = 0.021). One-year angiographic vein graft failure occurred in 62.4% of patients with and 43.8% of patients without perioperative MI (p <0.001). Two-year composite clinical outcome (death, MI, or revascularization) was worse in patients with perioperative MI before (19.4% vs 15.2%; p = 0.039) and after (hazard ratio 1.33, 95% confidence interval 1.00 to 1.76, p = 0.046) adjusting for differences in significant predictors. In conclusion, perioperative MI was relatively common, was associated with worse outcomes, and mechanisms other than vein graft failure accounted for a substantial proportion of these MIs. Further research is needed into the prevention and treatment of perioperative MI in patients undergoing CABG.

  14. Estimation of distributional parameters for censored trace level water quality data: 2. Verification and applications

    USGS Publications Warehouse

    Helsel, Dennis R.; Gilliom, Robert J.

    1986-01-01

    Estimates of distributional parameters (mean, standard deviation, median, interquartile range) are often desired for data sets containing censored observations. Eight methods for estimating these parameters have been evaluated by R. J. Gilliom and D. R. Helsel (this issue) using Monte Carlo simulations. To verify those findings, the same methods are now applied to actual water quality data. The best method (lowest root-mean-squared error (rmse)) over all parameters, sample sizes, and censoring levels is log probability regression (LR), the method found best in the Monte Carlo simulations. Best methods for estimating moment or percentile parameters separately are also identical to the simulations. Reliability of these estimates can be expressed as confidence intervals using rmse and bias values taken from the simulation results. Finally, a new simulation study shows that best methods for estimating uncensored sample statistics from censored data sets are identical to those for estimating population parameters. Thus this study and the companion study by Gilliom and Helsel form the basis for making the best possible estimates of either population parameters or sample statistics from censored water quality data, and for assessments of their reliability.
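    The log-probability-regression (LR) method this record evaluates can be sketched for the simplest case of a single detection limit: detected values are regressed, in log space, on the normal scores of their plotting positions, and the censored observations are imputed from the fitted line before computing summary statistics. The sketch below is a simplified, hypothetical illustration (function name `ros_mean` is invented); published implementations also handle multiple detection limits.

```python
# Simplified sketch of log-probability regression (ROS) for
# left-censored data with a single detection limit.
import math
from statistics import NormalDist, mean

def ros_mean(detects, n_censored):
    """Estimate the mean of a lognormal sample in which `n_censored`
    observations fell below one detection limit (all assumed smaller
    than every detected value)."""
    n = len(detects) + n_censored
    nd = NormalDist()
    xs, ys = [], []
    # detected values occupy ranks n_censored+1 .. n
    for rank, value in enumerate(sorted(detects), start=n_censored + 1):
        pp = rank / (n + 1)               # Weibull plotting position
        xs.append(nd.inv_cdf(pp))         # normal score
        ys.append(math.log(value))
    # ordinary least squares of log(value) on normal score
    mx, my = mean(xs), mean(ys)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    # impute each censored observation from its own plotting position
    imputed = [math.exp(intercept + slope * nd.inv_cdf(r / (n + 1)))
               for r in range(1, n_censored + 1)]
    return mean(imputed + detects)
```

With no censored observations the imputation step vanishes and the estimate is just the sample mean, which is one way to sanity-check the sketch.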

  15. Rank-based estimation in the ℓ1-regularized partly linear model for censored outcomes with application to integrated analyses of clinical predictors and gene expression data.

    PubMed

    Johnson, Brent A

    2009-10-01

    We consider estimation and variable selection in the partial linear model for censored data. The partial linear model for censored data is a direct extension of the accelerated failure time model, the latter of which is a very important alternative model to the proportional hazards model. We extend rank-based lasso-type estimators to a model that may contain nonlinear effects. Variable selection in such a partial linear model has direct application to high-dimensional survival analyses that attempt to adjust for clinical predictors. In the microarray setting, previous methods can adjust for other clinical predictors by assuming that clinical and gene expression data enter the model linearly in the same fashion. Here, we select important variables after adjusting for prognostic clinical variables, but the clinical effects are assumed nonlinear. Our estimator is based on stratification and can be extended naturally to account for multiple nonlinear effects. We illustrate the utility of our method through simulation studies and application to the Wisconsin prognostic breast cancer data set.
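    The "lasso-type" penalization behind such estimators is easiest to see in the ordinary least-squares setting; the paper replaces the squared-error loss with a rank-based loss, but the soft-thresholding machinery is the same. Below is a minimal coordinate-descent sketch on hypothetical data, not the authors' estimator.

```python
# Sketch of lasso-type estimation via coordinate descent with
# soft-thresholding (least-squares loss, plain Python).
import math

def soft_threshold(z, lam):
    """Lasso shrinkage operator: shrink z toward 0 by lam."""
    return math.copysign(max(abs(z) - lam, 0.0), z)

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent lasso for a small dense least-squares problem."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # partial residuals with feature j removed from the fit
            r = [y[i] - sum(X[i][k] * beta[k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n)) / n
            norm = sum(X[i][j] ** 2 for i in range(n)) / n
            beta[j] = soft_threshold(rho, lam) / norm
    return beta
```

As the penalty lam grows, coefficients are driven exactly to zero, which is what makes this family of estimators perform variable selection.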

  16. Identification of subgroups by risk of graft failure after paediatric renal transplantation: application of survival tree models on the ESPN/ERA-EDTA Registry.

    PubMed

    Lofaro, Danilo; Jager, Kitty J; Abu-Hanna, Ameen; Groothoff, Jaap W; Arikoski, Pekka; Hoecker, Britta; Roussey-Kesler, Gwenaelle; Spasojević, Brankica; Verrina, Enrico; Schaefer, Franz; van Stralen, Karlijn J

    2016-02-01

    Identification of patient groups by risk of renal graft loss might be helpful for accurate patient counselling and clinical decision-making. Survival tree models are an alternative statistical approach to identifying subgroups, offering cut-off points for covariates and an easy-to-interpret representation. Within the European Society of Pediatric Nephrology/European Renal Association-European Dialysis and Transplant Association (ESPN/ERA-EDTA) Registry data we identified paediatric patient groups with specific profiles for 5-year renal graft survival. Two analyses were performed, including (i) parameters known at time of transplantation and (ii) additional clinical measurements obtained early after transplantation. The identified subgroups were added as covariates in two survival models. The prognostic performance of the models was tested and compared with conventional Cox regression analyses. The first analysis included 5275 paediatric renal transplants. The best 5-year graft survival (90.4%) was found among patients who received a renal graft as a pre-emptive transplantation or after short-term dialysis (<45 days), whereas graft survival was poorest (51.7%) in adolescents transplanted after long-term dialysis (>2.2 years). The Cox model including both pre-transplant factors and tree subgroups had a significantly better predictive performance than conventional Cox regression (P < 0.001). In the analysis including clinical factors, graft survival ranged from 97.3% [younger patients with estimated glomerular filtration rate (eGFR) >30 mL/min/1.73 m(2) and dialysis <20 months] to 34.7% (adolescents with eGFR <60 mL/min/1.73 m(2) and dialysis >20 months). In this case too, combining tree findings and clinical factors improved the predictive performance compared with conventional Cox models (P < 0.0001). In conclusion, we demonstrated the tree model to be an accurate and attractive tool to predict graft failure for patients with specific characteristics.

  17. Kidney and liver organ transplantation in persons with human immunodeficiency virus: An Evidence-Based Analysis.

    PubMed

    2010-01-01

    CD4+ T-cell count and HIV viral load appear controlled post transplant, with an incidence of opportunistic infection of 20.5%. However, the quality of this evidence for these outcomes is very low, indicating uncertainty in these effects. Similarly, because of very low quality evidence there is uncertainty in the rate of acute graft rejection among both the HIV+ and HIV- groups. LIVER TRANSPLANTATION: HIV+/HCV+ VS. HCV+ Based on a combined HIV+/HCV+ cohort sample size of 156 from seven studies, the risk of death after liver transplantation is significantly greater (2.8-fold) in a co-infected cohort compared with an HCV+ mono-infected cohort (HR: 2.81; 95% CI: 1.47, 5.37). The quality of evidence supporting this outcome is very low. Death-censored graft survival evidence was not available. Regarding disease progression, based on a combined sample size of 71 persons in the co-infected cohort, the CD4+ T-cell count and HIV viral load appear controlled post transplant; however, again the quality of evidence supporting this outcome is very low. The rate of opportunistic infection in the co-infected cohort was 7.2%. The quality of evidence supporting this estimate is very low, indicating uncertainty in these estimates of effect. Based on a combined HIV+/HCV+ cohort (n=57), the rate of acute graft rejection does not differ from that of an HCV+ mono-infected cohort (OR: 0.88; 95% CI: 0.44, 1.76). Also based on a combined HIV+/HCV+ cohort (n=83), the rate of HCV+ recurrence does not differ from that of an HCV+ mono-infected cohort (OR: 0.66; 95% CI: 0.27, 1.59). In both cases, the quality of the supporting evidence was very low. Overall, because of very low quality evidence there is uncertainty in the effect of kidney or liver transplantation in HIV+ persons with end-stage organ failure compared with those not infected with HIV. 
Examining the economics of this issue, the cost of kidney and liver transplants in an HIV+ patient population are, on average, 56K and 147K per case, based on both Canadian

  18. Prevalence and Long-Term Survival After Coronary Artery Bypass Grafting in Women and Men With Heart Failure and Preserved Versus Reduced Ejection Fraction.

    PubMed

    Sun, Louise Y; Tu, Jack V; Bader Eddeen, Anan; Liu, Peter P

    2018-06-16

    Heart failure (HF) with reduced ejection fraction (rEF) is a widely regarded prognosticator after coronary artery bypass grafting. HF with preserved ejection fraction (pEF) accounts for up to half of all HF cases and is associated with considerable morbidity and mortality in hospitalized cohorts. However, HFpEF outcomes have not been elucidated in cardiac surgical patients. We investigated the prevalence and outcomes of HFpEF and HFrEF in women and men following coronary artery bypass grafting. We conducted a retrospective cohort study in Ontario, Canada, between October 1, 2008, and March 31, 2015, using Cardiac Care Network and Canadian Institute for Health Information data. HF is captured through a validated population-based database of all Ontarians with physician-diagnosed HF. We defined pEF as ejection fraction ≥50% and rEF as ejection fraction <50%. The primary outcome was all-cause mortality. Analyses were stratified by sex. Mortality rates were calculated using the Kaplan-Meier method. The relative hazard of death was assessed using multivariable Cox proportional hazard models. Of 40 083 patients (20.6% women), 55.5% had pEF without HF, 25.7% had rEF without HF, 6.9% had HFpEF, and 12.0% had HFrEF. Age-standardized HFpEF mortality rates at 4±2 years of follow-up were similar in women and men, whereas age-standardized HFrEF mortality rates were higher in women than in men. We found a higher prevalence and poorer prognosis of HFpEF in women. A history of HF was a more important prognosticator than ejection fraction. Preoperative screening and extended postoperative follow-up should be focused on women and men with HF rather than on rEF alone. © 2018 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.

  19. Transposed brachial-basilic arteriovenous fistulas versus prosthetic upper limb grafts: a meta-analysis.

    PubMed

    Lazarides, M K; Georgiadis, G S; Papasideris, C P; Trellopoulos, G; Tzilalis, V D

    2008-11-01

    Controversy exists regarding the best type of arteriovenous (AV) fistula to be formed in secondary and tertiary access procedures when primary fistulas have failed. This meta-analysis aimed to compare transposed brachial-basilic AV fistulas (BBAVFs) with upper limb AV prosthetic grafts. A literature search of the MEDLINE and SCOPUS databases was performed to identify comparative studies reporting outcomes for both BBAVFs and upper limb AV prosthetic grafts. Meta-analysis techniques were applied to identify differences in outcomes between the two groups regarding primary and secondary 1-year failure rates. Eleven relevant studies, involving 1509 patients, met the inclusion criteria and were incorporated in the final analysis; however, only one was a randomised controlled trial. The pooled odds ratio (OR) estimate for the primary and secondary failure rates at 1 year was 0.67 (CI 0.41-1.09) and 0.88 (CI 0.69-1.12), respectively, showing no difference in outcome between the two groups. The re-intervention rate was higher for prosthetic grafts (0.54 per BBAVF versus 1.32 per graft). In a small subgroup of two studies comparing BBAVFs with forearm grafts, the pooled estimate for the 1-year primary failure rate was in favour of the BBAVF group (OR 0.3, CI 0.15-0.58, p=0.0004), suggesting that forearm grafts were inferior, having a 3-fold risk of failure at 1 year. This analysis supports the use of BBAVF early in difficult access cases prior to the use of prosthetic grafts. However, the latter conclusion is debatable due to the heterogeneity, small size and non-randomised design of the included studies.

  20. Mechanism of failure of the Cabrol procedure: A computational fluid dynamic analysis.

    PubMed

    Poullis, M; Pullan, M

    2015-12-01

    Sudden failure of the Cabrol graft is common and frequently fatal. We utilised computational fluid dynamic (CFD) analysis to evaluate the mechanism of failure and potentially improve on the design of the Cabrol procedure. CFD analysis of the classic Cabrol procedure and a number of its variants was performed. Results from this analysis were utilised to generate further improved geometric options for the Cabrol procedure, which were also subjected to CFD analysis. All current variations of the Cabrol procedure are predicted by CFD analysis to be prone to graft thrombosis, secondary to stasis around the right coronary artery button. The right coronary artery flow characteristics were found to be the dominant reason for Cabrol graft failure. A simple modification of the Cabrol geometry is predicted to virtually eliminate any areas of blood stasis, and hence graft failure. Modification of the Cabrol graft geometry, informed by CFD analysis, may help reduce the incidence of graft thrombosis. A C-shaped Cabrol graft with the right coronary button anastomosed to its side along its course from the aorta to the left coronary button is predicted to have the least thrombotic tendency. Clinical correlation is needed. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Beware! Here There Be Beasties: Responding to Fundamentalist Censors.

    ERIC Educational Resources Information Center

    Traw, Rick

    1996-01-01

    Describes a heated censorship controversy experienced in 1990 in the Sioux Falls, South Dakota, school district brought by fundamentalist censors against the "Impressions" reading series. Explores specific categories of complaints, such as the supernatural, folktales, and myths. Notes the influence of religion and racism. Includes an addendum of…

  2. Censoring: a new approach for detection limits in total-reflection X-ray fluorescence

    NASA Astrophysics Data System (ADS)

    Pajek, M.; Kubala-Kukuś, A.; Braziewicz, J.

    2004-08-01

    It is shown that the detection limits in total-reflection X-ray fluorescence (TXRF), which restrict quantification of very low concentrations of trace elements in the samples, can be accounted for using the statistical concept of censoring. We demonstrate that the incomplete TXRF measurements containing the so-called "nondetects", i.e. the non-measured concentrations falling below the detection limits and represented by the estimated detection limit values, can be viewed as left random-censored data, which can be further analyzed using the Kaplan-Meier (KM) method correcting for nondetects. Within this approach, which uses the Kaplan-Meier product-limit estimator to obtain the cumulative distribution function corrected for the nondetects, the mean value and median of the detection-limit-censored concentrations can be estimated in a non-parametric way. The Monte Carlo simulations performed show that the Kaplan-Meier approach yields highly accurate estimates for the mean and median concentrations, being within a few percent of the simulated, uncensored data. This means that the uncertainties of the KM-estimated mean value and median are limited in fact only by the number of studied samples and not by the applied correction procedure for nondetects itself. On the other hand, it is observed that, in the case when the concentration of a given element is not measured in all the samples, simple approaches to estimating a mean concentration value from the data yield erroneous, systematically biased results. The discussed random left-censoring approach was applied to analyze the TXRF detection-limit-censored concentration measurements of trace elements in biomedical samples. We emphasize that the Kaplan-Meier approach allows one to estimate mean concentrations lying substantially below the mean level of detection limits. Consequently, this approach gives new access to lowering the effective detection limits of the TXRF method, which is of prime interest for
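
    The flipping trick behind this approach — turning left-censored nondetects into right-censored survival data so the standard Kaplan-Meier machinery applies — can be sketched as follows. A minimal sketch; the function name, the flipping constant, and the simple tie handling are ours, not from the paper:

```python
def km_mean_left_censored(values, censored, flip=None):
    """Kaplan-Meier estimate of the mean for left-censored data.
    Nondetects (value = detection limit, censored=True) are flipped to
    right-censored survival times, a standard KM curve is computed, and
    the restricted mean is transformed back."""
    if flip is None:
        flip = max(values) + 1.0           # any constant above the data
    # (flipped time, event flag); at ties, events are processed first
    obs = sorted(((flip - v, not c) for v, c in zip(values, censored)),
                 key=lambda p: (p[0], not p[1]))
    at_risk, surv, area, prev = len(obs), 1.0, 0.0, 0.0
    for t, event in obs:
        area += surv * (t - prev)          # area under the KM step curve
        prev = t
        if event:
            surv *= (at_risk - 1) / at_risk
        at_risk -= 1
    return flip - area                     # back-transform the flipped mean
```

    With no nondetects this reproduces the ordinary sample mean; with nondetects it uses the detected values below the detection limits to place the censored mass, rather than substituting the limits themselves.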

  3. Magnesium inference screw supports early graft incorporation with inhibition of graft degradation in anterior cruciate ligament reconstruction

    PubMed Central

    Cheng, Pengfei; Han, Pei; Zhao, Changli; Zhang, Shaoxiang; Zhang, Xiaonong; Chai, Yimin

    2016-01-01

    Patients after anterior cruciate ligament (ACL) reconstruction surgery commonly encounter graft failure in the initial phase of rehabilitation. The inhibition of graft degradation is crucial for the successful reconstruction of the ACL. Here, we used biodegradable high-purity magnesium (HP Mg) screws in the rabbit model of ACL reconstruction with titanium (Ti) screws as a control and analyzed the graft degradation and screw corrosion using direct pull-out tests, microCT scanning, and histological and immunohistochemical staining. The most noteworthy finding was that tendon grafts fixed by HP Mg screws exhibited biomechanical properties substantially superior to those fixed by Ti screws, and the relative area of collagen fiber at the tendon-bone interface was much larger in the Mg group at 3 weeks, when severe graft degradation was identified in the histological analysis. Semi-quantitative immunohistochemical results further elucidated that MMP-13 expression significantly decreased surrounding HP Mg screws, with relatively higher Collagen II expression. Moreover, HP Mg screws exhibited uniform corrosion behavior without displacement or loosening in the femoral tunnel. Therefore, our results demonstrated that Mg screws inhibited graft degradation and improved biomechanical properties of the tendon graft during the early phase of graft healing, and highlighted their potential in ACL reconstruction. PMID:27210585

  4. Magnesium inference screw supports early graft incorporation with inhibition of graft degradation in anterior cruciate ligament reconstruction

    NASA Astrophysics Data System (ADS)

    Cheng, Pengfei; Han, Pei; Zhao, Changli; Zhang, Shaoxiang; Zhang, Xiaonong; Chai, Yimin

    2016-05-01

    Patients after anterior cruciate ligament (ACL) reconstruction surgery commonly encounter graft failure in the initial phase of rehabilitation. The inhibition of graft degradation is crucial for the successful reconstruction of the ACL. Here, we used biodegradable high-purity magnesium (HP Mg) screws in the rabbit model of ACL reconstruction with titanium (Ti) screws as a control and analyzed the graft degradation and screw corrosion using direct pull-out tests, microCT scanning, and histological and immunohistochemical staining. The most noteworthy finding was that tendon grafts fixed by HP Mg screws exhibited biomechanical properties substantially superior to those fixed by Ti screws, and the relative area of collagen fiber at the tendon-bone interface was much larger in the Mg group at 3 weeks, when severe graft degradation was identified in the histological analysis. Semi-quantitative immunohistochemical results further elucidated that MMP-13 expression significantly decreased surrounding HP Mg screws, with relatively higher Collagen II expression. Moreover, HP Mg screws exhibited uniform corrosion behavior without displacement or loosening in the femoral tunnel. Therefore, our results demonstrated that Mg screws inhibited graft degradation and improved biomechanical properties of the tendon graft during the early phase of graft healing, and highlighted their potential in ACL reconstruction.

  5. Outcomes of Adult Liver Transplantation from Donation After Brain Death Followed by Circulatory Death in China.

    PubMed

    Zhang, Jiabin; Ren, Hui; Sun, Yanling; Li, Zhijie; Wang, Hongbo; Liu, Zhenwen; Zhou, Shaotang

    2018-05-01

    BACKGROUND Organ donation from a deceased donor, that is, donation after brain death followed by circulatory death (DBCD), is a unique transplantation practice in China. Pathological features of grafts help guide the utilization of grafts. MATERIAL AND METHODS We retrospectively reviewed our experience with 188 DBCD allografts from May 2014 to April 2017. We divided 183 transplanted allografts into 3 groups according to pretransplant histology: the good quality graft group (n=62), the preservation injury group (n=27), and the steatotic graft group (n=94). Univariate and multivariate analyses were performed to identify factors in the steatotic graft group predicting the prognoses. RESULTS The prevalence rates of allografts in the good quality, steatotic liver, and preservation injury groups were 33.0% (62/188), 50.0% (94/188), and 14.4% (27/188), respectively, and the discard rate was 2.7% (5/188). The 1- and 3-year overall survival rates were 92.1% and 88.1%, respectively. There were no differences in 1- and 3-year patient survival among the 3 groups (p=0.615). Complications included acute rejection in 7 cases, lung infection in 11 recipients, biliary stricture and bile leak in 9 patients, and portal thrombosis in 1 recipient; 17 recipients died of various causes. Cox multivariate analysis revealed that longer cold storage time was associated with worse outcome in the steatotic graft group. CONCLUSIONS Clinical outcomes of adult liver transplantation from deceased donation in China are acceptable.

  6. Genetic therapy for vein bypass graft disease: current perspectives.

    PubMed

    Simosa, Hector F; Conte, Michael S

    2004-01-01

    Although continued progress in endovascular technology holds promise for less invasive approaches to arterial diseases, surgical bypass grafting remains the mainstay of therapy for patients with advanced coronary and peripheral ischemia. In the United States, nearly 400,000 coronary and 100,000 lower extremity bypass procedures are performed annually. The autogenous vein, particularly the greater saphenous vein, has proven to be a durable and versatile arterial substitute, with secondary patency rates at 5 years of 70 to 80% in the extremity. However, vein graft failure is a common occurrence that incurs significant morbidity and mortality, and, to date, pharmacologic approaches to prolong vein graft patency have produced limited results. Dramatic advances in genetics, coupled with a rapidly expanding knowledge of the molecular basis of vascular diseases, have set the stage for genetic interventions. The attraction of a genetic approach to vein graft failure is based on the notion that the tissue at risk is readily accessible to the clinician prior to the onset of the pathologic process and the premise that genetic reprogramming of cells in the wall of the vein can lead to an improved healing response. Although the pathophysiology of vein graft failure is incompletely understood, numerous relevant molecular targets have been elucidated. Interventions designed to influence cell proliferation, thrombosis, inflammation, and matrix remodeling at the genetic level have been described, and many have been tested in animal models. Both gene delivery and gene blockade strategies have been investigated, with the latter reaching the stage of advanced clinical trials.

  7. Donations After Circulatory Death in Liver Transplant.

    PubMed

    Eren, Emre A; Latchana, Nicholas; Beal, Eliza; Hayes, Don; Whitson, Bryan; Black, Sylvester M

    2016-10-01

    The supply of liver grafts for treatment of end-stage liver disease continues to fall short of ongoing demands. Currently, most liver transplants originate from donations after brain death. Enhanced utilization of the present resources is prudent to address the needs of the population. Donation after circulatory or cardiac death is a mechanism whereby the availability of organs can be expanded. Donations after circulatory death pose unique challenges given their exposure to warm ischemia. Technical principles of donations after circulatory death procurement and pertinent studies investigating patient outcomes, graft outcomes, and complications are highlighted in this review. We also review associated risk factors to suggest potential avenues to achieve improved outcomes and reduced complications. Future considerations and alternative techniques of organ preservation are discussed, which may suggest novel strategies to enhance preservation and donor expansion through the use of marginal donors. Ultimately, without effective measures to bolster organ supply, donations after circulatory death should remain a consideration; however, an understanding of inherent risks and limitations is necessary.

  8. Adapting machine learning techniques to censored time-to-event health record data: A general-purpose approach using inverse probability of censoring weighting.

    PubMed

    Vock, David M; Wolfson, Julian; Bandyopadhyay, Sunayan; Adomavicius, Gediminas; Johnson, Paul E; Vazquez-Benitez, Gabriela; O'Connor, Patrick J

    2016-06-01

    Models for predicting the probability of experiencing various health outcomes or adverse events over a certain time frame (e.g., having a heart attack in the next 5 years) based on individual patient characteristics are important tools for managing patient care. Electronic health data (EHD) are appealing sources of training data because they provide access to large amounts of rich individual-level data from present-day patient populations. However, because EHD are derived by extracting information from administrative and clinical databases, some fraction of subjects will not be under observation for the entire time frame over which one wants to make predictions; this loss to follow-up is often due to disenrollment from the health system. For subjects without complete follow-up, whether or not they experienced the adverse event is unknown, and in statistical terms the event time is said to be right-censored. Most machine learning approaches to the problem have been relatively ad hoc; for example, common approaches for handling observations in which the event status is unknown include (1) discarding those observations, (2) treating them as non-events, (3) splitting those observations into two observations: one where the event occurs and one where the event does not. In this paper, we present a general-purpose approach to account for right-censored outcomes using inverse probability of censoring weighting (IPCW). We illustrate how IPCW can easily be incorporated into a number of existing machine learning algorithms used to mine big health care data including Bayesian networks, k-nearest neighbors, decision trees, and generalized additive models. We then show that our approach leads to better calibrated predictions than the three ad hoc approaches when applied to predicting the 5-year risk of experiencing a cardiovascular adverse event, using EHD from a large U.S. Midwestern healthcare system. Copyright © 2016 Elsevier Inc. All rights reserved.
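
    The IPCW idea can be sketched in a few lines for a binary "event by a horizon" label: subjects censored before the horizon get weight 0, and the rest are up-weighted by the inverse probability of remaining uncensored. A minimal sketch under our own simplifications (function name is ours; in the paper the censoring distribution may also depend on covariates):

```python
def ipcw_weights(times, events, horizon):
    """Inverse-probability-of-censoring weights for the binary outcome
    "event by `horizon`". `events[i]` is 1 for an observed event, 0 for
    censoring. Subjects censored early (unknown label) get weight 0."""
    # Kaplan-Meier curve for the *censoring* distribution:
    # here a "censoring event" (events == 0) is what drops the curve.
    data = sorted(zip(times, events))
    at_risk, surv, curve = len(data), 1.0, []
    for t, e in data:
        if e == 0:
            surv *= (at_risk - 1) / at_risk
        curve.append((t, surv))            # S_C just after time t
        at_risk -= 1

    def s_c(t):                            # left-continuous S_C(t-)
        s = 1.0
        for u, v in curve:
            if u < t:
                s = v
            else:
                break
        return s

    weights = []
    for t, e in zip(times, events):
        if e == 1 and t <= horizon:
            weights.append(1.0 / s_c(t))       # event observed by horizon
        elif t >= horizon:
            weights.append(1.0 / s_c(horizon)) # known event-free at horizon
        else:
            weights.append(0.0)                # censored early: no label
    return weights
```

    These weights can then be passed to any learner that accepts per-sample weights; on average they redistribute the mass of the dropped (early-censored) subjects onto comparable subjects who stayed under observation, so the weights sum to roughly the sample size.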

  9. How I treat acute graft-versus-host disease of the gastrointestinal tract and the liver.

    PubMed

    McDonald, George B

    2016-03-24

    Treatment of acute graft-versus-host disease (GVHD) has evolved from a one-size-fits-all approach to a more nuanced strategy based on predicted outcomes. Lower and time-limited doses of immune suppression for patients predicted to have low-risk GVHD are safe and effective. In more severe GVHD, prolonged exposure to immunosuppressive therapies, failure to achieve tolerance, and inadequate clinical responses are the proximate causes of GVHD-related deaths. This article presents acute GVHD-related scenarios representing, respectively, certainty of diagnosis, multiple causes of symptoms, jaundice, an initial therapy algorithm, secondary therapy, and defining futility of treatment. © 2016 by The American Society of Hematology.

  10. Fourteen-year outcomes of abdominal aortic endovascular repair with the Zenith stent graft.

    PubMed

    Verzini, Fabio; Romano, Lydia; Parlani, Gianbattista; Isernia, Giacomo; Simonte, Gioele; Loschi, Diletta; Lenti, Massimo; Cao, Piergiorgio

    2017-02-01

    Long-term results of abdominal aortic aneurysm (AAA) endovascular repair are affected by graft design renewals that tend to improve the performance of older generation prostheses but usually reset the follow-up times to zero. The present study investigated the long-term outcomes of endovascular AAA repair (EVAR) using the Zenith graft, still in use without major modification, in a single center experience. Between 2000 and 2011, 610 patients underwent elective EVAR using the Zenith endograft (Cook Inc, Bloomington, Ind) and represent the study group. Primary outcomes were overall survival, freedom from AAA rupture, and freedom from AAA-related death. Secondary outcomes included freedom from late (>30 days) reintervention, freedom from late (>30 days) conversion to open repair, freedom from aneurysm sac enlargement >5.0 mm and freedom from EVAR failure, defined as a composite of AAA-related death, AAA rupture, AAA growth >5 mm, and any reintervention. Mean age was 73.2 years. Mean aneurysm diameter was 55.3 mm. There were five perioperative deaths (0.8%) and three intraoperative conversions. At a mean follow-up of 99.2 (range, 0-175) months, seven AAA ruptures occurred, all fatal except one. Overall survival was 92.8% ± 1.1% at 1 year, 70.1% ± 1.9% at 5 years, 37.8% ± 2.9% at 10 years, and 24% ± 4% at 14 years. Freedom from AAA rupture was 99.8% ± 0.02 at 1 year (one case), 99.4% ± 0.04 at 5 years (three cases), and 98.1% ± 0.07 at 10 and 14 years. Freedom from late reintervention and conversion was 98% ± 0.6 at 1 year, 87.7% ± 1.5 at 5 years, 75.7% ± 3.2 at 10 years, and 69.9% ± 5.2 at 14 years. Freedom from aneurysm sac growth >5.0 mm was 99.8% at 1 year, 96.6% ± 0.7 at 5 years, 81.0% ± 3.4 at 10 years, and 74.1% ± 5.8% at 14 years. EVAR failure occurred in 132 (21.6%) patients at 14 years. At multivariate analysis, type I and III endoleak emerged as independent predictors of EVAR failure (hazard ratio [HR], 6.7; 95

  11. A Bayesian model for time-to-event data with informative censoring

    PubMed Central

    Kaciroti, Niko A.; Raghunathan, Trivellore E.; Taylor, Jeremy M. G.; Julius, Stevo

    2012-01-01

    Randomized trials with dropouts or censored data and discrete time-to-event type outcomes are frequently analyzed using the Kaplan–Meier or product limit (PL) estimation method. However, the PL method assumes that the censoring mechanism is noninformative and when this assumption is violated, the inferences may not be valid. We propose an expanded PL method using a Bayesian framework to incorporate informative censoring mechanism and perform sensitivity analysis on estimates of the cumulative incidence curves. The expanded method uses a model, which can be viewed as a pattern mixture model, where odds for having an event during the follow-up interval (t_{k-1}, t_k], conditional on being at risk at t_{k-1}, differ across the patterns of missing data. The sensitivity parameters relate the odds of an event, between subjects from a missing-data pattern with the observed subjects for each interval. The large number of the sensitivity parameters is reduced by considering them as random and assumed to follow a log-normal distribution with prespecified mean and variance. Then we vary the mean and variance to explore sensitivity of inferences. The missing at random (MAR) mechanism is a special case of the expanded model, thus allowing exploration of the sensitivity to inferences as departures from the inferences under the MAR assumption. The proposed approach is applied to data from the TRial Of Preventing HYpertension. PMID:22223746

  12. A comparative analysis of the outcomes of aortic cuffs and converters for endovascular graft migration.

    PubMed

    Thomas, Bradley G; Sanchez, Luis A; Geraghty, Patrick J; Rubin, Brian G; Money, Samuel R; Sicard, Gregorio A

    2010-06-01

    Proximal attachment failure, often leading to graft migration, is a severe complication of endovascular aneurysm repair (EVAR). Aortic cuffs have been used to treat proximal attachment failure with mixed results. The Zenith Renu AAA Ancillary Graft (Cook Inc, Bloomington, Ind) is available in two configurations: converter and main body extension. Both provide proximal extension with active fixation for the treatment of pre-existing endovascular grafts with failed or failing proximal fixation or seal in patients who are not surgical candidates. We prospectively compared the outcomes of patient treatment with these two device configurations. From September 2005 to May 2008, a prospective, nonrandomized, postmarket registry was conducted to collect data from 151 patients treated at 95 institutions for proximal aortic endovascular graft failure using the Renu graft. Treatment indications included inadequate proximal fixation or seal, for example, migration, and type I and III endoleak. A total of 136 patients (90%) had migration, 111 (74%) had endoleak, and 94 (62%) had endoleaks and graft migration. AneuRx grafts were present in 126 patients (83%), of which 89 (59%) were treated with a converter and 62 (41%) with a main body extension. Outcomes using converters vs main body extensions for endoleak rates, changes in aneurysm size, and ruptures were compared. Preprocedural demographics between the two groups did not differ significantly. Procedural success rates were 98% for the converter group and 100% for the main body extension group. At a mean follow-up of 12.8 +/- 7.5 months, no type III endoleaks (0%) were identified in the converter group, and five (8%) were identified in the main body extension group. There were no aneurysm ruptures in patients treated with converters (0%) and three ruptures (5%) in patients treated with main body extensions. Each patient with aneurysm rupture had been treated with a Renu main body extension, developed a type III endoleak, and

  13. Narrowing the gap: early and intermediate outcomes after percutaneous coronary intervention and coronary artery bypass graft procedures in California, 1997 to 2006.

    PubMed

    Carey, Joseph S; Danielsen, Beate; Milliken, Jeffrey; Li, Zhongmin; Stabile, Bruce E

    2009-11-01

    Percutaneous coronary intervention is increasingly used to treat multivessel coronary artery disease, and coronary artery bypass graft procedures have decreased as a result. The overall impact of this treatment shift is uncertain. We examined the in-hospital mortality and complication rates for these procedures in California using a combined risk model. The confidential dataset of the Office of Statewide Health Planning and Development patient discharge database was queried for 1997 to 2006. A risk model was developed using International Classification of Diseases, Ninth Revision, Clinical Modification procedures and diagnostic codes from the combined pool of isolated coronary artery bypass graft and percutaneous coronary intervention procedures performed during 2005 and 2006. In-hospital mortality was corrected for "same-day" transfers to another health care institution. Early failure rate was defined as in-hospital mortality rate plus reintervention for another percutaneous coronary intervention or cardiac surgery procedure within 90 days. Coronary artery bypass graft volume decreased from 28,495 (1997) to 15,520 (2006), whereas percutaneous coronary intervention volume increased from 38,098 to 53,703. Risk-adjusted mortality rate decreased from 4.7% to 2.1% for coronary artery bypass graft procedures and from 3.4% to 1.9% for percutaneous coronary intervention. Expected mortality rate increased for both procedures. Early failure rate decreased from 13.1% to 8.0% for percutaneous coronary intervention and from 6.5% to 5.4% for coronary artery bypass graft. For the years 2004 and 2005, the risk of recurrent myocardial infarction or need for coronary artery bypass graft during the first postoperative year was 12% for percutaneous coronary intervention and 6% for coronary artery bypass grafts. 
This study shows that as volume shifted from coronary artery bypass grafts to percutaneous coronary intervention, expected
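The risk-adjusted mortality rates in this abstract follow the usual report-card logic: scale the overall population rate by the ratio of observed to expected deaths. A minimal sketch of that indirect-standardization formula (the numbers below are hypothetical, not from the California data):

```python
# Illustrative sketch, not the authors' actual model: risk-adjusted
# mortality as (observed / expected) * overall rate, the standard
# indirect-standardization formula used in procedure report cards.

def risk_adjusted_rate(observed_deaths, expected_deaths, overall_rate):
    """Indirectly standardized mortality rate for one procedure group."""
    if expected_deaths <= 0:
        raise ValueError("expected deaths must be positive")
    return (observed_deaths / expected_deaths) * overall_rate

# Hypothetical numbers for illustration only.
obs, exp_, overall = 210, 300, 0.03
ramr = risk_adjusted_rate(obs, exp_, overall)
print(round(ramr, 4))  # 0.021, i.e. 2.1%
```

A group that does better than its expected (case-mix-adjusted) mortality gets a rate below the overall population rate, as here.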

  14. Statistical inferences with jointly type-II censored samples from two Pareto distributions

    NASA Astrophysics Data System (ADS)

    Abu-Zinadah, Hanaa H.

    2017-08-01

In several industries, products come from more than one production line, which calls for comparative life tests. Sampling from the different production lines then gives rise to a joint censoring scheme. In this article we consider the Pareto lifetime distribution under a jointly type-II censoring scheme. The maximum likelihood estimators (MLEs) and the corresponding approximate confidence intervals, as well as bootstrap confidence intervals, of the model parameters are obtained. Bayesian point estimates and credible intervals of the model parameters are also presented. A lifetime data set is analyzed for illustrative purposes, and Monte Carlo simulation results are presented to assess the performance of the proposed method.
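For a feel of type-II censored likelihoods in the Pareto setting, here is a sketch for the simpler one-sample case with known scale (the paper's joint two-sample scheme is more involved): observing only the r smallest of n lifetimes, the shape MLE has the closed form α̂ = r / (Σ_{i≤r} ln(x_i/σ) + (n−r) ln(x_r/σ)). All numbers are simulated, not from the paper.

```python
# One-sample type-II censored MLE for the Pareto shape (known scale),
# an illustrative special case of the censoring scheme in the abstract.
import math
import random

def pareto_mle_type2(order_stats, n, sigma=1.0):
    """order_stats: the r smallest observations, sorted ascending."""
    r = len(order_stats)
    s = sum(math.log(x / sigma) for x in order_stats)
    s += (n - r) * math.log(order_stats[-1] / sigma)  # censored tail term
    return r / s

random.seed(1)
alpha, n, r = 2.0, 2000, 1500
sample = sorted(random.paretovariate(alpha) for _ in range(n))[:r]
alpha_hat = pareto_mle_type2(sample, n)
print(round(alpha_hat, 2))  # should land near the true alpha = 2.0
```

Because the estimator is exact in closed form, a simulation like this is a quick sanity check before attempting the joint two-population likelihood.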

  15. Modeling stream fish distributions using interval-censored detection times.

    PubMed

    Ferreira, Mário; Filipe, Ana Filipa; Bardos, David C; Magalhães, Maria Filomena; Beja, Pedro

    2016-08-01

    Controlling for imperfect detection is important for developing species distribution models (SDMs). Occupancy-detection models based on the time needed to detect a species can be used to address this problem, but this is hindered when times to detection are not known precisely. Here, we extend the time-to-detection model to deal with detections recorded in time intervals and illustrate the method using a case study on stream fish distribution modeling. We collected electrofishing samples of six fish species across a Mediterranean watershed in Northeast Portugal. Based on a Bayesian hierarchical framework, we modeled the probability of water presence in stream channels, and the probability of species occupancy conditional on water presence, in relation to environmental and spatial variables. We also modeled time-to-first detection conditional on occupancy in relation to local factors, using modified interval-censored exponential survival models. Posterior distributions of occupancy probabilities derived from the models were used to produce species distribution maps. Simulations indicated that the modified time-to-detection model provided unbiased parameter estimates despite interval-censoring. There was a tendency for spatial variation in detection rates to be primarily influenced by depth and, to a lesser extent, stream width. Species occupancies were consistently affected by stream order, elevation, and annual precipitation. Bayesian P-values and AUCs indicated that all models had adequate fit and high discrimination ability, respectively. Mapping of predicted occupancy probabilities showed widespread distribution by most species, but uncertainty was generally higher in tributaries and upper reaches. The interval-censored time-to-detection model provides a practical solution to model occupancy-detection when detections are recorded in time intervals. This modeling framework is useful for developing SDMs while controlling for variation in detection rates, as it
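The core of the interval-censoring extension is the likelihood contribution for a detection known only to fall in an interval: under an exponential detection-time model with rate λ, detection in (a, b] contributes exp(−λa) − exp(−λb), and a site never detected by total effort T contributes exp(−λT). A minimal sketch with invented data (the paper embeds this in a Bayesian hierarchical occupancy model, not reproduced here):

```python
# Interval-censored exponential time-to-detection likelihood, fitted by
# a crude grid search.  Data are hypothetical, for illustration only.
import math

def neg_log_lik(lam, intervals, nondetect_T):
    ll = 0.0
    for a, b in intervals:            # detection occurred inside (a, b]
        ll += math.log(math.exp(-lam * a) - math.exp(-lam * b))
    for T in nondetect_T:             # censored: never detected by effort T
        ll += -lam * T
    return -ll

intervals = [(0, 2), (2, 4), (0, 2), (4, 6)]   # minutes of electrofishing
nondetect = [6.0, 6.0]
grid = [i / 1000 for i in range(1, 2000)]
lam_hat = min(grid, key=lambda l: neg_log_lik(l, intervals, nondetect))
print(round(lam_hat, 3))
```

The negative log-likelihood diverges as λ → 0 (the interval probability vanishes) and grows linearly in λ through the nondetections, so the grid search finds an interior minimum.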

  16. Automatic segmentation of the wire frame of stent grafts from CT data.

    PubMed

    Klein, Almar; van der Vliet, J Adam; Oostveen, Luuk J; Hoogeveen, Yvonne; Kool, Leo J Schultze; Renema, W Klaas Jan; Slump, Cornelis H

    2012-01-01

Endovascular aneurysm repair (EVAR) is an established technique that uses stent grafts to treat aortic aneurysms in patients at risk of aneurysm rupture. Late stent graft failure is a serious complication in endovascular repair of aortic aneurysms. Better understanding of the motion characteristics of stent grafts will be beneficial for designing future devices. In addition, analysis of stent graft movement in individual patients in vivo can be valuable for predicting stent graft failure in these patients. To gather information on stent graft motion in a quick and robust fashion, we propose an automatic method to segment stent grafts from CT data, consisting of three steps: detection of seed points, finding the connections between these points to produce a graph, and graph processing to obtain the final geometric model in the form of an undirected graph. Using annotated reference data, the method was optimized and its accuracy was evaluated. The experiments were performed using data containing the AneuRx and Zenith stent grafts. The algorithm is robust to noise and to small variations in the parameter values used, does not require much memory by modern standards, and is fast enough to be used in a clinical setting (65 and 30 s for the two stent types, respectively). Further, it is shown that the resulting graphs have a 95% (AneuRx) and 92% (Zenith) correspondence with the annotated data. The geometric model produced by the algorithm allows incorporation of high-level information and material properties. This enables us to study the in vivo motions and forces that act on the frame of the stent. We believe that such studies will provide new insights into the behavior of the stent graft in vivo, enable the detection and prediction of stent failure in individual patients, and help in designing better stent grafts in the future. Copyright © 2011 Elsevier B.V. All rights reserved.
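The seed-points-to-undirected-graph pipeline described above can be caricatured in a few lines. This toy sketch only shows the shape of the output (nodes connected by proximity); the actual method scores connections using CT intensity along the wire, which is not reproduced here, and the coordinates below are invented.

```python
# Toy sketch of "seed points -> connections -> undirected graph".
# Real wire-frame segmentation uses image evidence, not plain distance.
import math

def build_graph(seeds, max_dist):
    """Connect every pair of seed points closer than max_dist (undirected)."""
    graph = {i: set() for i in range(len(seeds))}
    for i in range(len(seeds)):
        for j in range(i + 1, len(seeds)):
            if math.dist(seeds[i], seeds[j]) < max_dist:
                graph[i].add(j)
                graph[j].add(i)
    return graph

# Hypothetical seed coordinates (mm): three close points and one outlier.
seeds = [(0, 0, 0), (1, 0, 0), (2, 0, 0), (10, 0, 0)]
g = build_graph(seeds, max_dist=1.5)
print(g[0])  # {1}: only the nearest seed is within range
```

Graph post-processing (pruning spurious edges, closing gaps) then turns such a raw adjacency structure into the final geometric model.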

  17. A joint frailty-copula model between tumour progression and death for meta-analysis.

    PubMed

    Emura, Takeshi; Nakatochi, Masahiro; Murotani, Kenta; Rondeau, Virginie

    2017-12-01

Dependent censoring often arises in biomedical studies when time to tumour progression (e.g., relapse of cancer) is censored by an informative terminal event (e.g., death). For meta-analysis combining existing studies, a joint survival model between tumour progression and death has been considered under semicompeting risks, which induces dependence through the study-specific frailty. Our paper utilizes copulas to generalize the joint frailty model by introducing an additional source of dependence arising from intra-subject association between tumour progression and death. The practical value of the new model is particularly evident for meta-analyses in which only a few covariates are consistently measured across studies and hence residual dependence exists. The covariate effects are formulated through the Cox proportional hazards model, and the baseline hazards are modeled nonparametrically using splines. The estimator is then obtained by maximizing a penalized log-likelihood function. We also show that the present methodologies are easily modified for competing risks or recurrent event data, and can be generalized to accommodate left-truncation. Simulations are performed to examine the performance of the proposed estimator. The method is applied to a meta-analysis assessing CXCL12, a recently suggested biomarker, for survival in ovarian cancer patients. We implement the proposed methods in the R joint.Cox package.
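The copula idea can be illustrated independently of the frailty machinery: a copula couples the two marginal survival functions into a joint survival function, with one parameter controlling intra-subject dependence. A minimal sketch using a Clayton copula and exponential margins (the specific copula family, margins, and parameter values here are illustrative assumptions, not choices made in the paper):

```python
# Clayton copula on marginal survival functions: one common way to
# induce positive dependence between progression and death times.
import math

def clayton(u, v, theta):
    """Clayton copula C(u, v); theta > 0 gives positive dependence."""
    return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

def joint_survival(t1, t2, lam1, lam2, theta):
    """P(progression > t1, death > t2) with exponential margins."""
    s1 = math.exp(-lam1 * t1)
    s2 = math.exp(-lam2 * t2)
    return clayton(s1, s2, theta)

theta = 2.0  # Kendall's tau = theta / (theta + 2) = 0.5
p = joint_survival(1.0, 1.0, lam1=0.5, lam2=0.2, theta=theta)
print(round(p, 4))
```

With positive dependence the joint survival probability exceeds the independence product of the two margins, which is the "residual dependence" the model is designed to absorb.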

  18. National trends in rates of death and hospital admissions related to acute myocardial infarction, heart failure and stroke, 1994–2004

    PubMed Central

    Tu, Jack V.; Nardi, Lorelei; Fang, Jiming; Liu, Juan; Khalid, Laila; Johansen, Helen

    2009-01-01

    Background Rates of death from cardiovascular and cerebrovascular diseases have been steadily declining over the past few decades. Whether such declines are occurring to a similar degree for common disorders such as acute myocardial infarction, heart failure and stroke is uncertain. We examined recent national trends in mortality and rates of hospital admission for these 3 conditions. Methods We analyzed mortality data from Statistics Canada’s Canadian Mortality Database and data on hospital admissions from the Canadian Institute for Health Information’s Hospital Morbidity Database for the period 1994–2004. We determined age- and sex-standardized rates of death and hospital admissions per 100 000 population aged 20 years and over as well as in-hospital case-fatality rates. Results The overall age- and sex-standardized rate of death from cardiovascular disease in Canada declined 30.0%, from 360.6 per 100 000 in 1994 to 252.5 per 100 000 in 2004. During the same period, the rate fell 38.1% for acute myocardial infarction, 23.5% for heart failure and 28.2% for stroke, with improvements observed across most age and sex groups. The age- and sex-standardized rate of hospital admissions decreased 27.6% for stroke and 27.2% for heart failure. The rate for acute myocardial infarction fell only 9.2%. In contrast, the relative decline in the in-hospital case-fatality rate was greatest for acute myocardial infarction (33.1%; p < 0.001). Much smaller relative improvements in case-fatality rates were noted for heart failure (8.1%) and stroke (8.9%). Interpretation The rates of death and hospital admissions for acute myocardial infarction, heart failure and stroke in Canada changed at different rates over the 10-year study period. Awareness of these trends may guide future efforts for health promotion and health care planning and help to determine priorities for research and treatment. PMID:19546444
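The age- and sex-standardized rates reported above come from direct standardization: each stratum's crude rate is weighted by that stratum's share of a fixed standard population, so rates from different years are comparable despite population aging. A sketch with made-up numbers:

```python
# Direct standardization: weight stratum-specific rates by a standard
# population's composition.  All figures below are invented.

def direct_standardized_rate(group_rates, standard_pop):
    """group_rates: deaths per 100,000 by age band; standard_pop: counts."""
    total = sum(standard_pop.values())
    return sum(group_rates[g] * standard_pop[g] / total for g in group_rates)

rates = {"20-49": 20.0, "50-69": 300.0, "70+": 1500.0}      # per 100,000
std_pop = {"20-49": 500_000, "50-69": 300_000, "70+": 200_000}
print(direct_standardized_rate(rates, std_pop))  # 400.0 per 100,000
```

Recomputing each year's rates against the same `std_pop` is what makes a 1994-to-2004 decline attributable to changing risk rather than a changing age structure.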

  19. Assessing the impact of censoring of costs and effects on health-care decision-making: an example using the Atrial Fibrillation Follow-up Investigation of Rhythm Management (AFFIRM) study.

    PubMed

    Fenwick, Elisabeth; Marshall, Deborah A; Blackhouse, Gordon; Vidaillet, Humberto; Slee, April; Shemanski, Lynn; Levy, Adrian R

    2008-01-01

Losses to follow-up and administrative censoring can cloud the interpretation of trial-based economic evaluations. A number of investigators have examined the impact of different levels of adjustment for censoring, including nonadjustment, adjustment of effects only, and adjustment for both costs and effects. Nevertheless, there is a lack of research on the impact of censoring on decision-making. The objective of this study was to estimate the impact of adjustment for censoring on the interpretation of cost-effectiveness results and expected value of perfect information (EVPI), using a trial-based analysis that compared rate- and rhythm-control treatments for persons with atrial fibrillation. Three different levels of adjustment for censoring were examined: no censoring of costs and effects, censoring of effects only, and censoring of both costs and effects. In each case, bootstrapping was used to estimate the uncertainty in costs and effects, and the EVPI was calculated to determine the potential worth of further research. Censoring did not impact the adoption decision. Nevertheless, this was not the case for the decision uncertainty or the EVPI. For a threshold of $50,000 per life-year, the EVPI ranged from $626,000 (partial censoring) to $117 million (full censoring) for the eligible US population. The level of adjustment for censoring in trial-based cost-effectiveness analyses can affect the decisions to fund a new technology and to devote resources to further research. Only when censoring is taken into account for both costs and effects are these decisions appropriately addressed.
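The EVPI quantity in this abstract has a simple definition over bootstrap replicates: it is the expected net benefit under perfect information minus the expected net benefit of the best decision made now. A compact per-person sketch (replicate values are invented; the study scales this to the eligible US population):

```python
# Per-person EVPI from paired bootstrap draws of net benefit
# (NB = threshold * effect - cost) for two strategies.

def evpi(nb_a, nb_b):
    """nb_a, nb_b: paired bootstrap net-benefit draws for two strategies."""
    n = len(nb_a)
    mean_best = max(sum(nb_a) / n, sum(nb_b) / n)               # decide now
    best_mean = sum(max(a, b) for a, b in zip(nb_a, nb_b)) / n  # perfect info
    return best_mean - mean_best

# Hypothetical bootstrap replicates of net benefit ($) per strategy.
nb_rate   = [1000.0, -500.0, 800.0, 200.0]
nb_rhythm = [900.0, 600.0, -300.0, 100.0]
print(evpi(nb_rate, nb_rhythm))  # 275.0
```

EVPI is zero only when the same strategy wins in every replicate; heavier censoring adjustment changes the spread of the replicates, which is why the paper's EVPI varies so dramatically while the adoption decision does not.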

  20. Skin grafting the contaminated wound bed: reassessing the role of the preoperative swab.

    PubMed

    Aerden, D; Bosmans, I; Vanmierlo, B; Spinnael, J; Keymeule, B; Van den Brande, P

    2013-02-01

To investigate use of the preoperative wound swab to predict graft failure compared with establishing the indication for skin grafting on clinical grounds alone. Patients requiring meshed split-thickness skin grafting were prospectively included; the indication for grafting was established on clinical grounds exclusively. A preoperative swab of the wound bed was taken, but its result was concealed to prevent it influencing clinical decision-making. Negative pressure wound therapy (NPWT) was used for both wound bed preparation and graft fixation. After 2 months, graft area take percentage was measured using digital image processing software and the results were validated against the result of the preoperative wound swab. Eighty-seven wounds were included in the study. Mean graft area take percentage was 88%, with five grafts considered complete failures (<25% take). A posteriori analysis of the wound cultures showed that 53% had been contaminated on grafting, but these did not fare any worse than near-sterile wounds. Qualitative analysis of cultures showed that wounds containing either Pseudomonas aeruginosa or Staphylococcus aureus did have inferior outcomes (mean take percentage 78.9% vs 91.3%; p=0.038). Diabetes was also a deteriorating factor (mean take percentage 83.0% vs 90.7%; p=0.004). Establishing the indication for skin grafting on clinical grounds exclusively does not yield grossly inferior results. In light of recent advances in skin grafting, including use of NPWT as adjuvant therapy, the requirement for routine preoperative wound swabs may be questioned.

  1. In vitro structural properties of braided tendon grafts.

    PubMed

    Nicklin, S; Waller, C; Walker, P; Chung, W K; Walsh, W R

    2000-01-01

    In an effort to increase strength in hamstring tendon grafts for anterior cruciate ligament reconstruction, braiding or weaving of the tendons has been suggested. The purpose of this study was to examine the biomechanical properties of two braiding techniques compared with a four-stranded tendon graft using a sheep model. Digital extensor tendons from 5 adult sheep were harvested in 28 matched pairs and randomly allocated to French plait or four-stranded weave. The grafts were tested in a hydraulic testing machine with the tendons secured in brass grips frozen with liquid carbon dioxide. The tendons were preconditioned to a distraction of 1 mm for 10 cycles followed by testing to failure at 50 mm/sec, with a data acquisition rate of 1,000 Hz. The stiffness, ultimate load to failure, and the mode of failure were recorded. All braided samples failed at the midsubstance, while the four-stranded controls failed at the grip interface. There was a significant reduction in strength and stiffness of the braided samples compared with the controls. This study demonstrated that braiding decreases the strength and stiffness of a four-stranded tendon graft by up to 54% and 85%, respectively. This finding is supported by the work of Hearle et al. (1969), who demonstrated that the decrease in strength of fiber bundles is equal to the square of the cosine of the twist angle. The twist angle in our samples was approximately 45 degrees, which equates to a decrease in strength of 50%.

  2. Fluorescence spectroscopy for assessment of liver transplantation grafts concerning graft viability and patient survival

    NASA Astrophysics Data System (ADS)

    Vollet Filho, José D.; da Silveira, Marina R.; Castro-e-Silva, Orlando; Bagnato, Vanderlei S.; Kurachi, Cristina

    2015-06-01

Evaluating transplantation grafts at harvest is essential for transplant success. Laser-induced fluorescence spectroscopy (LIFS) can help monitor changes in the metabolic and structural condition of tissue during transplantation. The aim of the present study is to correlate LIFS-obtained spectra of human hepatic grafts during liver transplantation with post-operative mortality rates and biochemical parameters, establishing a method to exclude nonviable grafts before implantation. Orthotopic liver transplantation with the piggyback technique was performed in 15 patients. LIFS was performed under 408 nm excitation. Spectra were collected immediately after opening the donor's abdominal cavity, after cold perfusion, at the end of the back-table period, and 5 min and 1 h after warm perfusion in the recipient. Fluorescence information was compared to lactate, creatinine, bilirubin and INR levels and to survival status. LIFS was sensitive to liver changes during transplantation stages. This study is in progress; initial results indicate a correlation between fluorescence and the life/death status of patients.

  3. Model Checking Techniques for Assessing Functional Form Specifications in Censored Linear Regression Models.

    PubMed

    León, Larry F; Cai, Tianxi

    2012-04-01

In this paper we develop model checking techniques for assessing functional form specifications of covariates in censored linear regression models. These procedures are based on a censored data analog to taking cumulative sums of "robust" residuals over the space of the covariate under investigation. These cumulative sums are formed by integrating certain Kaplan-Meier estimators and may be viewed as "robust" censored data analogs to the processes considered by Lin, Wei & Ying (2002). The null distributions of these stochastic processes can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be generated by computer simulation. Each observed process can then be graphically compared with a few realizations from the Gaussian process. We also develop formal test statistics for numerical comparison. Such comparisons enable one to assess objectively whether an apparent trend seen in a residual plot reflects model misspecification or natural variation. We illustrate the methods with a well known dataset. In addition, we examine the finite sample performance of the proposed test statistics in simulation experiments. In our simulation experiments, the proposed test statistics have good power of detecting misspecification while at the same time controlling the size of the test.
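The flavor of a cumulative-sum residual check can be shown in a stripped-down, uncensored form: order the residuals by the covariate, track the running sum, and compare its supremum against realizations generated under the null (here by random sign flips, a simpler stand-in for the paper's simulated Gaussian processes). Purely illustrative data.

```python
# Simplified residual-cusum check: supremum of the running residual sum,
# referenced against sign-flip realizations (no censoring handled here).
import random

def sup_cusum(residuals):
    s, peak = 0.0, 0.0
    for r in residuals:
        s += r
        peak = max(peak, abs(s))
    return peak

random.seed(7)
resid = [random.gauss(0, 1) for _ in range(50)]   # already ordered by covariate
observed = sup_cusum(resid)
null = [sup_cusum([r * random.choice((-1, 1)) for r in resid])
        for _ in range(500)]
p_value = sum(s >= observed for s in null) / len(null)
print(round(p_value, 2))
```

A systematic trend in the residuals (e.g., from a mis-specified functional form) inflates the observed supremum relative to the null realizations, driving the p-value down.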

  4. Inhibiting connexin channels protects against cryopreservation-induced cell death in human blood vessels.

    PubMed

    Bol, M; Van Geyt, C; Baert, S; Decrock, E; Wang, N; De Bock, M; Gadicherla, A K; Randon, C; Evans, W H; Beele, H; Cornelissen, R; Leybaert, L

    2013-04-01

    Cryopreserved blood vessels are being increasingly employed in vascular reconstruction procedures but freezing/thawing is associated with significant cell death that may lead to graft failure. Vascular cells express connexin proteins that form gap junction channels and hemichannels. Gap junction channels directly connect the cytoplasm of adjacent cells and may facilitate the passage of cell death messengers leading to bystander cell death. Two hemichannels form a gap junction channel but these channels are also present as free non-connected hemichannels. Hemichannels are normally closed but may open under stressful conditions and thereby promote cell death. We here investigated whether blocking gap junctions and hemichannels could prevent cell death after cryopreservation. Inclusion of Gap27, a connexin channel inhibitory peptide, during cryopreservation and thawing of human saphenous veins and femoral arteries was evaluated by terminal deoxynucleotidyl transferase dUTP nick end labelling (TUNEL) assays and histological examination. We report that Gap27 significantly reduces cell death in human femoral arteries and saphenous veins when present during cryopreservation/thawing. In particular, smooth muscle cell death was reduced by 73% in arteries and 71% in veins, while endothelial cell death was reduced by 32% in arteries and 51% in veins. We conclude that inhibiting connexin channels during cryopreservation strongly promotes vascular cell viability. Copyright © 2012 European Society for Vascular Surgery. Published by Elsevier Ltd. All rights reserved.

  5. Living Donor Liver Transplantation for Wilson’s Disease Associated with Fulminant Hepatic Failure: A Case Report

    PubMed Central

    Huang, Yu; Takatsuki, Mitsuhisa; Soyama, Akihiko; Hidaka, Masaaki; Ono, Shinichiro; Adachi, Tomohiko; Hara, Takanobu; Okada, Satomi; Hamada, Takashi; Eguchi, Susumu

    2018-01-01

Patient: Female, 17 Final Diagnosis: Fulminant Wilson’s disease Symptoms: General jaundice • malaise • abdominal pain Medication: — Clinical Procedure: ICU Specialty: Transplantology Objective: Rare disease Background: Liver transplantation is indicated for patients with Wilson’s disease (WD) who present either with acute liver failure or with end-stage liver disease and severe hepatic insufficiency as the first sign of disease. However, almost all reported cases have been treated with deceased donor liver transplantation. Here we report the case of a patient with WD associated with fulminant hepatic failure (WD-FHF) who underwent living donor liver transplantation (LDLT). Case Report: A 17-year-old female was diagnosed with WD-FHF based on high urinary copper (10 603 μg/day, normal <100 μg/day), low serum ceruloplasmin (15 mg/dL, normal >20 mg/dL), Kayser-Fleischer (K-F) corneal rings, acute liver failure (ALF), acute renal failure (ARF), and grade 2 hepatic encephalopathy (HE). The model for end-stage liver disease (MELD) score was 35. Due to her critical condition, the patient underwent LDLT utilizing a right liver graft from her 44-year-old mother. The right hepatic vein (RHV) and inferior right hepatic vein (iRHV) were reconstructed. She developed severe liver dysfunction due to a crooked hepatic vein caused by compression from the large graft. To straighten the bend, a reoperation was performed, during which we tried to relieve the compressed hepatic vein by adjusting the graft location, but the benefits were limited. We therefore performed stenting in both the RHV and iRHV on postoperative day 9. The patient gradually improved, exhibiting good liver and renal function, and was finally discharged on postoperative day 114. Conclusions: When WD-FHF deteriorates too rapidly for conservative management, LDLT is an effective therapeutic strategy. PMID:29549236

  6. Donations After Circulatory Death in Liver Transplant

    PubMed Central

    Eren, Emre A.; Latchana, Nicholas; Beal, Eliza; Hayes, Don; Whitson, Bryan; Black, Sylvester M.

    2017-01-01

    The supply of liver grafts for treatment of end-stage liver disease continues to fall short of ongoing demands. Currently, most liver transplants originate from donations after brain death. Enhanced utilization of the present resources is prudent to address the needs of the population. Donation after circulatory or cardiac death is a mechanism whereby the availability of organs can be expanded. Donations after circulatory death pose unique challenges given their exposure to warm ischemia. Technical principles of donations after circulatory death procurement and pertinent studies investigating patient outcomes, graft outcomes, and complications are highlighted in this review. We also review associated risk factors to suggest potential avenues to achieve improved outcomes and reduced complications. Future considerations and alternative techniques of organ preservation are discussed, which may suggest novel strategies to enhance preservation and donor expansion through the use of marginal donors. Ultimately, without effective measures to bolster organ supply, donations after circulatory death should remain a consideration; however, an understanding of inherent risks and limitations is necessary. PMID:27733105

  7. GSimp: A Gibbs sampler based left-censored missing value imputation approach for metabolomics studies

    PubMed Central

    Jia, Erik; Chen, Tianlu

    2018-01-01

Left-censored missing values commonly exist in targeted metabolomics datasets and can be considered as missing not at random (MNAR). Improper data processing procedures for missing values will cause adverse impacts on subsequent statistical analyses. However, few imputation methods have been developed and applied to the situation of MNAR in the field of metabolomics. Thus, a practical left-censored missing value imputation method is urgently needed. We developed an iterative Gibbs sampler based left-censored missing value imputation approach (GSimp). We compared GSimp with three other imputation methods on two real-world targeted metabolomics datasets and one simulation dataset using our imputation evaluation pipeline. The results show that GSimp outperforms the other imputation methods in terms of imputation accuracy, observation distribution, univariate and multivariate analyses, and statistical sensitivity. Additionally, a parallel version of GSimp was developed for dealing with large-scale metabolomics datasets. The R code for GSimp, evaluation pipeline, tutorial, real-world and simulated targeted metabolomics datasets are available at: https://github.com/WandeRum/GSimp. PMID:29385130
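The key constraint that distinguishes left-censored imputation from generic imputation is that a nondetect must be replaced with a value *below* the detection limit. A single-variable sketch of that idea (the real GSimp iterates a Gibbs sampler across correlated variables, which this deliberately omits; data are synthetic):

```python
# Single-variable left-censored imputation: replace values below the
# limit of detection (LOD) with draws from a normal fitted to the
# observed values, truncated above at the LOD (rejection sampling).
import random
import statistics

def impute_left_censored(values, lod, seed=0):
    rng = random.Random(seed)
    observed = [v for v in values if v is not None]
    mu, sd = statistics.mean(observed), statistics.stdev(observed)
    out = []
    for v in values:
        if v is not None:
            out.append(v)
        else:                  # draw N(mu, sd) truncated to (-inf, lod)
            draw = rng.gauss(mu, sd)
            while draw >= lod:
                draw = rng.gauss(mu, sd)
            out.append(draw)
    return out

data = [5.2, 6.1, None, 7.3, None, 5.8]   # None = below detection limit
imputed = impute_left_censored(data, lod=5.0)
print(all(v < 5.0 for v in imputed if v not in data))  # True
```

Substituting a fixed constant (e.g., LOD/2) for every nondetect instead would compress the lower tail; sampling from the truncated distribution preserves its spread.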

  8. A simple linear regression method for quantitative trait loci linkage analysis with censored observations.

    PubMed

    Anderson, Carl A; McRae, Allan F; Visscher, Peter M

    2006-07-01

    Standard quantitative trait loci (QTL) mapping techniques commonly assume that the trait is both fully observed and normally distributed. When considering survival or age-at-onset traits these assumptions are often incorrect. Methods have been developed to map QTL for survival traits; however, they are both computationally intensive and not available in standard genome analysis software packages. We propose a grouped linear regression method for the analysis of continuous survival data. Using simulation we compare this method to both the Cox and Weibull proportional hazards models and a standard linear regression method that ignores censoring. The grouped linear regression method is of equivalent power to both the Cox and Weibull proportional hazards methods and is significantly better than the standard linear regression method when censored observations are present. The method is also robust to the proportion of censored individuals and the underlying distribution of the trait. On the basis of linear regression methodology, the grouped linear regression model is computationally simple and fast and can be implemented readily in freely available statistical software.

  9. Omega-3 Polyunsaturated Fatty Acid Supplementation to Prevent Arteriovenous Fistula and Graft Failure: A Systematic Review and Meta-analysis of Randomized Controlled Trials.

    PubMed

    Viecelli, Andrea K; Irish, Ashley B; Polkinghorne, Kevan R; Hawley, Carmel M; Johnson, David W; Mori, Trevor A; Pascoe, Elaine M; Strippoli, Giovanni F M; Lok, Charmaine E; Palmer, Suetonia C

    2018-07-01

Arteriovenous access failure frequently occurs in people on hemodialysis and is associated with morbidity, mortality and large healthcare expenditures. Omega-3 polyunsaturated fatty acids (omega-3 PUFA) may improve access outcomes via pleiotropic effects on access maturation and function, but may cause bleeding complications. Systematic review with meta-analysis. Adults requiring hemodialysis via arteriovenous fistula or graft. Trials evaluating omega-3 PUFA for arteriovenous access outcomes identified by searches in CENTRAL, MEDLINE, and Embase to 24 January 2017. Omega-3 PUFA. Primary patency loss, dialysis suitability failure, access abandonment, interventions to maintain patency or assist maturation, bleeding, gastrointestinal side-effects, all-cause and cardiovascular mortality, hospitalization, and treatment adherence. Treatment effects were summarized as relative risks (RR) and 95% confidence intervals (CI). Evidence was assessed using GRADE. Five eligible trials (833 participants) with a median follow-up of 12 months compared peri-operative omega-3 PUFA supplementation with placebo. One trial (n=567) evaluated treatment for fistulae and four (n=266) for grafts. Omega-3 PUFA supplementation prevented primary patency loss with moderate certainty (761 participants, RR 0.81, CI 0.68-0.98). Low-quality evidence suggested that omega-3 PUFA may have had little or no effect on dialysis suitability failure (536 participants, RR 0.95, CI 0.73-1.23), access abandonment (732 participants, RR 0.78, CI 0.59-1.03), need for interventions (732 participants, RR 0.82, CI 0.64-1.04), or all-cause mortality (799 participants, RR 0.99, CI 0.51-1.92). Effects of treatment on bleeding risk (793 participants, RR 1.40, CI 0.78-2.49) and gastrointestinal side-effects (816 participants, RR 1.22, CI 0.64-2.34) were uncertain. There was no evidence of different treatment effects for grafts and fistulae. The main limitations were the small number of included trials and their methodological limitations. 
Omega-3 PUFA supplementation

  10. The Impact of Ischemia/Reperfusion Injury on Liver Allografts from Deceased after Cardiac Death versus Deceased after Brain Death Donors.

    PubMed

    Xu, Jin; Sayed, Blayne Amir; Casas-Ferreira, Ana Maria; Srinivasan, Parthi; Heaton, Nigel; Rela, Mohammed; Ma, Yun; Fuggle, Susan; Legido-Quigley, Cristina; Jassem, Wayel

    2016-01-01

The shortage of organs for transplantation has led to increased use of organs procured from donors after cardiac death (DCD). The effects of cardiac death on the liver remain poorly understood, however. Using livers obtained from DCD versus donors after brain death (DBD), we aimed to understand how ischemia/reperfusion (I/R) injury alters expression of pro-inflammatory ceramide markers and influences graft leukocyte infiltration. Hepatocyte inflammation, as assessed by ceramide expression, was evaluated in DCD (n = 13) and DBD (n = 10) livers. Allograft expression of inflammatory and cell death markers, and allograft leukocyte infiltration, were evaluated in a contemporaneous independent cohort of DCD (n = 22) and DBD (n = 13) livers. When examining differences between transplant stages in each group, C18, C20, and C24 ceramides showed significant differences in DBD livers (p<0.05), whereas differences in C22 ceramide (p<0.05) were more pronounced in DCD livers. C18 ceramide correlated with bilirubin, INR, and creatinine after transplant in DCD. Prior to transplantation, DCD livers had reduced leukocyte infiltration compared with DBD allografts. Following reperfusion, neutrophil infiltration and platelet deposition were less prevalent in DCD grafts, while cell death and recipient serum aspartate aminotransferase (AST) levels were significantly increased in DCD allografts. These data suggest that I/R injury generates necrosis in the absence of a strong inflammatory response in DCD livers, with an appreciable effect on early graft function. The long-term consequences of increased inflammation in DBD and increased cell death in DCD allografts are unknown and warrant further investigation.

  11. Analysis of elemental concentration censored distributions in breast malignant and breast benign neoplasm tissues

    NASA Astrophysics Data System (ADS)

    Kubala-Kukuś, A.; Banaś, D.; Braziewicz, J.; Góźdź, S.; Majewska, U.; Pajek, M.

    2007-07-01

The total reflection X-ray fluorescence method was applied to study the trace element concentrations in human breast malignant and breast benign neoplasm tissues taken from women who were patients of the Holycross Cancer Centre in Kielce (Poland). These investigations were mainly focused on the development of new possibilities for cancer diagnosis and therapy monitoring. This systematic comparative study was based on a relatively large population (~100 samples), namely 26 samples of breast malignant and 68 samples of breast benign neoplasm tissues. The concentrations, ranging from a few ppb to 0.1%, were determined for thirteen elements (from P to Pb). The results were carefully analysed to investigate the concentration distribution of trace elements in the studied samples. The measurement of trace element concentrations by total reflection X-ray fluorescence was limited, however, by the detection limit of the method. It was observed that for more than 50% of the elements determined, the concentrations were not measured in all samples. These incomplete measurements were treated within the statistical concept of left-random censoring, and for the estimation of the mean value and median of censored concentration distributions, the Kaplan-Meier estimator was used. For comparison of concentrations in two populations, the log-rank test was applied, which allows comparison of the censored total reflection X-ray fluorescence data. The statistically significant differences found are discussed in more detail. It is noted that the described data analysis procedures should be the standard tool for analyzing censored trace element concentrations measured by X-ray fluorescence methods.
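A standard trick makes the Kaplan-Meier estimator applicable to left-censored concentrations like these: flip the data as x → M − x (with M above the largest value), so nondetects become right-censored, run ordinary KM, and flip the resulting mean back. A self-contained sketch with invented concentrations (ties and variance estimation are ignored for brevity):

```python
# Kaplan-Meier mean for left-censored data via the flipping transform.
# Concentrations are hypothetical; "nondetect_limits" are values
# reported only as "< L" (below the detection limit L).

def km_mean_left_censored(detects, nondetect_limits):
    M = max(detects + nondetect_limits) + 1.0
    # Flipped data: (time, event) with event=1 for real detections.
    data = sorted([(M - x, 1) for x in detects] +
                  [(M - L, 0) for L in nondetect_limits])
    at_risk, surv, mean_flipped, prev_t = len(data), 1.0, 0.0, 0.0
    for t, event in data:
        mean_flipped += surv * (t - prev_t)   # area under the KM curve
        prev_t = t
        if event:
            surv *= (at_risk - 1) / at_risk
        at_risk -= 1
    return M - mean_flipped

conc = [2.0, 3.0, 5.0]   # measured concentrations
lods = [1.0]             # one sample reported only as "< 1.0"
print(round(km_mean_left_censored(conc, lods), 2))  # 2.75
```

Unlike crude substitution at zero or at the detection limit, this estimator uses the full shape of the observed distribution, which is why it is the recommended tool for censored concentration data.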

  12. Early graft failure of small-sized porcine valved conduits in reconstruction of the right ventricular outflow tract.

    PubMed

    Schreiber, Christian; Sassen, Stefanie; Kostolny, Martin; Hörer, Jürgen; Cleuziou, Julie; Wottke, Michael; Holper, Klaus; Fend, Falko; Eicken, Andreas; Lange, Rüdiger

    2006-07-01

    The quest for an alternative to homografts for reconstruction of the right ventricular outflow tract is ongoing. The Shelhigh No-React (NR-4000PA series) treated porcine pulmonic valve conduit (SPVC) was developed as a potential alternative. During a 12-month period from May 2004 to May 2005, the SPVC was implanted in 34 patients, of whom 62% were younger than 1 year. Median age at operation was 7 months (range, 5 days to 12 years). Thirteen size-10, 11 size-12, 8 size-14, and 2 size-16 SPVC conduits were initially implanted. Since May 2005, however, we have temporarily abandoned its implantation because of concern about a number of early failures. Until November 2005, 1 early and 1 late death had occurred; neither was conduit related. Fifteen conduits were replaced in 13 patients: 10 were size 10, 3 size 12, 2 size 14, and none size 16. Mean time to replacement of the SPVC was 313 +/- 116 days. Pseudointimal peel formation and chronic inflammation with foreign-body reaction were found in all explanted conduits at all levels. The inflammatory reaction was maximal at the valvular level around the porcine tissues, with shrinkage of the valve and hemodynamic compromise. At the valvular level, small punctate calcifications were observed in 2 cases. In 6 patients an acute inflammatory component was observed. At late follow-up (mean follow-up 366 +/- 102 days, 34 patient-years), echocardiography showed a mean graft gradient of 39.8 +/- 29.7 mm Hg, with mild to moderate insufficiency in 4 patients. Although the No-React treated valve largely resists calcification, pseudointimal peel formation was found in all explanted conduits and led to multilevel conduit stenoses. The small-sized SPVC cannot be regarded as an ideal conduit for right ventricular outflow tract reconstruction.

  13. Ibrutinib for chronic graft-versus-host disease after failure of prior therapy.

    PubMed

    Miklos, David; Cutler, Corey S; Arora, Mukta; Waller, Edmund K; Jagasia, Madan; Pusic, Iskra; Flowers, Mary E; Logan, Aaron C; Nakamura, Ryotaro; Blazar, Bruce R; Li, Yunfeng; Chang, Stephen; Lal, Indu; Dubovsky, Jason; James, Danelle F; Styles, Lori; Jaglowski, Samantha

    2017-11-23

    Chronic graft-versus-host disease (cGVHD) is a serious complication of allogeneic stem cell transplantation with few effective options available after failure of corticosteroids. B and T cells play a role in the pathophysiology of cGVHD. Ibrutinib inhibits Bruton tyrosine kinase in B cells and interleukin-2-inducible T-cell kinase in T cells. In preclinical models, ibrutinib reduced severity of cGVHD. This multicenter, open-label study evaluated the safety and efficacy of ibrutinib in patients with active cGVHD with inadequate response to corticosteroid-containing therapies. Forty-two patients who had failed 1 to 3 prior treatments received ibrutinib (420 mg) daily until cGVHD progression. The primary efficacy end point was cGVHD response based on 2005 National Institutes of Health criteria. At a median follow-up of 13.9 months, best overall response was 67%; 71% of responders showed a sustained response for ≥20 weeks. Responses were observed across involved organs evaluated. Most patients with multiple cGVHD organ involvement had a multiorgan response. Median corticosteroid dose in responders decreased from 0.29 mg/kg per day at baseline to 0.12 mg/kg per day at week 49; 5 responders discontinued corticosteroids. The most common adverse events were fatigue, diarrhea, muscle spasms, nausea, and bruising. Plasma levels of soluble factors associated with inflammation, fibrosis, and cGVHD significantly decreased over time with ibrutinib. Ibrutinib resulted in clinically meaningful responses with acceptable safety in patients with ≥1 prior treatments for cGVHD. Based on these results, ibrutinib was approved in the United States for treatment of adult patients with cGVHD after failure of 1 or more lines of systemic therapy. This trial was registered at www.clinicaltrials.gov as #NCT02195869. © 2017 by The American Society of Hematology.

  14. Editorial Commentary: Size Does Matter-Anterior Cruciate Ligament Graft Diameter Affects Biomechanical and Clinical Outcomes.

    PubMed

    Steiner, Mark

    2017-05-01

    Anterior cruciate ligament (ACL) graft strength is related to graft diameter and to how ACL grafts heal. All grafts appear to lose strength during healing. Clinical studies have documented that hamstring grafts less than 8 mm in diameter are more vulnerable to failure. Tripling the semitendinosus increases graft diameter and strength. A recent study documents a semitendinosus tripling technique with excellent clinical results. Copyright © 2017 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  15. Teachers Making Decisions When We Know the Censors Are Watching.

    ERIC Educational Resources Information Center

    Napier, Minta

    Attempts to suppress and even censor various texts used by English teachers often are led by members of fundamentalist Christian groups. These activists charge educators with depreciating Christian moral values and instigating a religion of "secular humanism" in the schools. Various examples of recent legal cases show how prominent the…

  16. Censored Quantile Instrumental Variable Estimates of the Price Elasticity of Expenditure on Medical Care.

    PubMed

    Kowalski, Amanda

    2016-01-02

    Efforts to control medical care costs depend critically on how individuals respond to prices. I estimate the price elasticity of expenditure on medical care using a censored quantile instrumental variable (CQIV) estimator. CQIV allows estimates to vary across the conditional expenditure distribution, relaxes traditional censored model assumptions, and addresses endogeneity with an instrumental variable. My instrumental variable strategy uses a family member's injury to induce variation in an individual's own price. Across the conditional deciles of the expenditure distribution, I find elasticities that vary from -0.76 to -1.49, which are an order of magnitude larger than previous estimates.

  17. Anastomotic fibrous ring as cause of stricture recurrence after bulbar onlay graft urethroplasty.

    PubMed

    Barbagli, Guido; Guazzoni, Giorgio; Palminteri, Enzo; Lazzeri, Massimo

    2006-08-01

    We retrospectively reviewed patterns of failure after bulbar substitution urethroplasty. In particular we investigated the prevalence and location of anastomotic fibrous ring strictures occurring at the apical anastomoses between the graft and urethral plate after 3 types of onlay graft techniques. We reviewed the records of 107 patients who underwent bulbar urethroplasty between 1994 and 2004. Mean patient age was 44 years. Patients with lichen sclerosus, failed hypospadias repair or urethroplasty, and panurethral strictures were excluded. A total of 45 patients underwent dorsal onlay skin graft urethroplasty, 50 underwent buccal mucosa onlay graft urethroplasty and 12 underwent augmented end-to-end urethroplasty. The clinical outcome was considered a failure when any postoperative procedure was needed, including dilation. Mean followup was 74 months (range 12 to 130). Of 107 cases 85 (80%) were successful and 22 (20%) failed. Failure in 12 patients (11%) involved the whole grafted area and in 10 (9%) it involved the anastomotic site, which was distal in 5 and proximal in 5. Urethrography, urethral ultrasound and urethroscopy were fundamental for determining the difference between full-length and focal extension of re-stricture. Failures were treated with multistage urethroplasty in 12 cases, urethrotomy in 7 and 1-stage urethroplasty in 3. Of the patients 16 had a satisfactory final outcome and 6 underwent definitive perineal urinary diversion. The prevalence and location of anastomotic ring strictures after bulbar urethroplasty were uniformly distributed across the 3 surgical techniques using skin or buccal mucosa. Further studies are necessary to clarify the etiology of these fibrous ring strictures.

  18. HEart trAnsplantation Registry of piTie-Salpetriere University Hospital

    ClinicalTrials.gov

    2018-01-08

    Cardiac Transplant Disorder; Cardiac Death; Heart Failure; Acute Cellular Graft Rejection; Antibody-Mediated Graft Rejection; Cardiac Allograft Vasculopathy; Heart Transplant Rejection; Immune Tolerance

  19. Identification of tissue-specific cell death using methylation patterns of circulating DNA

    PubMed Central

    Lehmann-Werman, Roni; Neiman, Daniel; Zemmour, Hai; Moss, Joshua; Magenheim, Judith; Vaknin-Dembinsky, Adi; Rubertsson, Sten; Nellgård, Bengt; Blennow, Kaj; Zetterberg, Henrik; Spalding, Kirsty; Haller, Michael J.; Wasserfall, Clive H.; Schatz, Desmond A.; Greenbaum, Carla J.; Dorrell, Craig; Grompe, Markus; Zick, Aviad; Hubert, Ayala; Maoz, Myriam; Fendrich, Volker; Bartsch, Detlef K.; Golan, Talia; Ben Sasson, Shmuel A.; Zamir, Gideon; Razin, Aharon; Cedar, Howard; Shapiro, A. M. James; Glaser, Benjamin; Shemer, Ruth; Dor, Yuval

    2016-01-01

    Minimally invasive detection of cell death could prove an invaluable resource in many physiologic and pathologic situations. Cell-free circulating DNA (cfDNA) released from dying cells is emerging as a diagnostic tool for monitoring cancer dynamics and graft failure. However, existing methods rely on differences in DNA sequences in source tissues, so that cell death cannot be identified in tissues with a normal genome. We developed a method of detecting tissue-specific cell death in humans based on tissue-specific methylation patterns in cfDNA. We interrogated tissue-specific methylome databases to identify cell type-specific DNA methylation signatures and developed a method to detect these signatures in mixed DNA samples. We isolated cfDNA from plasma or serum of donors, treated the cfDNA with bisulfite, PCR-amplified the cfDNA, and sequenced it to quantify cfDNA carrying the methylation markers of the cell type of interest. Pancreatic β-cell DNA was identified in the circulation of patients with recently diagnosed type-1 diabetes and islet-graft recipients; oligodendrocyte DNA was identified in patients with relapsing multiple sclerosis; neuronal/glial DNA was identified in patients after traumatic brain injury or cardiac arrest; and exocrine pancreas DNA was identified in patients with pancreatic cancer or pancreatitis. This proof-of-concept study demonstrates that the tissue origins of cfDNA and thus the rate of death of specific cell types can be determined in humans. The approach can be adapted to identify cfDNA derived from any cell type in the body, offering a minimally invasive window for diagnosing and monitoring a broad spectrum of human pathologies as well as providing a better understanding of normal tissue dynamics. PMID:26976580
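The core readout of such an assay is conceptually simple: after bisulfite conversion, PCR, and sequencing, count the reads over a marker CpG block whose methylation pattern matches the cell type's signature. The toy sketch below is an assumption-laden illustration (the marker patterns, the `'M'/'U'` encoding, and the mismatch rule are invented here; the study's actual pipeline scores multiple markers and read depths):

```python
def signature_fraction(reads, signature, max_mismatch=0):
    """Fraction of cfDNA reads matching a cell-type methylation signature.
    Each read is a string over {'M', 'U'} (methylated/unmethylated) at the
    marker CpGs; a read counts if it differs at <= max_mismatch positions."""
    def mismatches(read):
        return sum(a != b for a, b in zip(read, signature))
    hits = sum(mismatches(r) <= max_mismatch for r in reads
               if len(r) == len(signature))
    return hits / len(reads)

# Hypothetical beta-cell marker: fully unmethylated ('UUUU') in beta cells,
# methylated elsewhere. Most plasma cfDNA comes from blood, so matches are rare.
reads = ["MMMM", "MMMM", "UUUU", "MMUM", "MMMM", "UUUU", "MMMM", "MMMM"]
print(signature_fraction(reads, "UUUU"))  # fraction attributable to beta cells
```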

  20. Dose matters! Optimisation of guideline adherence is associated with lower mortality in stable patients with chronic heart failure.

    PubMed

    Poelzl, G; Altenberger, J; Pacher, R; Ebner, C H; Wieser, M; Winter, A; Fruhwald, F; Dornaus, C; Ehmsen, U; Reiter, S; Steinacher, R; Huelsmann, M; Eder, V; Boehmer, A; Pilgersdorfer, L; Ablasser, K; Keroe, D; Groebner, H; Auer, J; Jakl, G; Hallas, A; Ess, M; Ulmer, H

    2014-07-15

    Guidelines have been published for improving management of chronic heart failure (CHF). We examined the association between improved guideline adherence and risk for all-cause death in patients with stable systolic HF. Data on ambulatory patients (2006-2010) with CHF and reduced ejection fraction (HF-REF) from the Austrian Heart Failure Registry (HIR Austria) were analysed. One-year clinical data and long-term follow-up data until all-cause death or data censoring were available for 1014 patients (age 65 [55-73], male 75%, NYHA class I 14%, NYHA II 56%, NYHA III/IV 30%). A guideline adherence indicator (GAI [0-100%]) was calculated for each patient at baseline and after 12 ± 3 months that considered indications and contraindications for ACE-I/ARB, beta blockers, and MRA. Patients were considered ΔGAI-positive if GAI improved to or remained at high levels (≥ 80%). ΔGAI50+ positivity was ascribed to patients achieving a dose of ≥ 50% of the suggested target dose. Improvements in GAI and GAI50+ were associated with significant improvements in NYHA class and NT-proBNP (1728 [740-3636] to 970 [405-2348]) (p<0.001). Improvements in GAI50+, but not GAI, were independently predictive of lower mortality risk (HR 0.55 [95% CI 0.34-0.87]; p=0.01) after adjustment for a large variety of baseline parameters and hospitalisation for heart failure during follow-up. Improvement in guideline adherence, with particular emphasis on dose escalation, is associated with a decrease in long-term mortality in ambulatory HF-REF subjects surviving one year after registration. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
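As a rough illustration of how such an adherence indicator can be computed (the drug-class names and the exact scoring rules below are assumptions for the sketch, not the registry's published algorithm):

```python
# Sketch of a guideline adherence indicator (GAI): the share of indicated,
# non-contraindicated drug classes actually prescribed, expressed in percent.
CLASSES = ("ACE-I/ARB", "beta-blocker", "MRA")

def gai(indicated, contraindicated, prescribed):
    eligible = [d for d in indicated if d not in contraindicated]
    if not eligible:
        return 100.0  # nothing applicable counts as fully adherent
    return 100.0 * sum(d in prescribed for d in eligible) / len(eligible)

def gai50(eligible, target_dose, actual_dose):
    """GAI50+: a class only counts if the prescribed dose is >= 50% of target."""
    hits = sum(actual_dose.get(d, 0.0) >= 0.5 * target_dose[d] for d in eligible)
    return 100.0 * hits / len(eligible)

# Hypothetical patient: all three classes indicated, MRA contraindicated,
# both remaining classes prescribed.
print(gai(CLASSES, {"MRA"}, {"ACE-I/ARB", "beta-blocker"}))
```

The distinction the abstract draws is then just which of the two functions is tracked over time: GAI rewards prescription alone, GAI50+ additionally requires dose escalation toward target.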

  1. A consistent NPMLE of the joint distribution function with competing risks data under the dependent masking and right-censoring model.

    PubMed

    Li, Jiahui; Yu, Qiqing

    2016-01-01

    Dinse (Biometrics, 38:417-431, 1982) provides a special type of right-censored and masked competing risks data and proposes a non-parametric maximum likelihood estimator (NPMLE) and a pseudo MLE of the joint distribution function [Formula: see text] with such data. However, their asymptotic properties have not been studied so far. Under the extension of either the conditional masking probability (CMP) model or the random partition masking (RPM) model (Yu and Li, J Nonparametr Stat 24:753-764, 2012), we show that (1) Dinse's estimators are consistent if [Formula: see text] takes on finitely many values and each point in the support set of [Formula: see text] can be observed; (2) if the failure time is continuous, the NPMLE is not uniquely determined, and the standard approach (which puts weights only on one element in each observed set) leads to an inconsistent NPMLE; (3) in general, Dinse's estimators are not consistent even under the discrete assumption; (4) we construct a consistent NPMLE. The consistency is given under a new model called the dependent masking and right-censoring model. The CMP model and the RPM model are indeed special cases of the new model. We compare our estimator to Dinse's estimators through simulation and real data. The simulation study indicates that the consistent NPMLE is a good approximation to the underlying distribution for moderate sample sizes.

  2. Augmentation of Distal Biceps Repair With an Acellular Dermal Graft Restores Native Biomechanical Properties in a Tendon-Deficient Model.

    PubMed

    Conroy, Christine; Sethi, Paul; Macken, Craig; Wei, David; Kowalsky, Marc; Mirzayan, Raffy; Pauzenberger, Leo; Dyrna, Felix; Obopilwe, Elifho; Mazzocca, Augustus D

    2017-07-01

    The majority of distal biceps tendon injuries can be repaired in a single procedure. In contrast, complete chronic tears with severe tendon substance deficiency and retraction often require tendon graft augmentation. In cases with extensive partial tears of the distal biceps, a human dermal allograft may be used as an alternative to restore tendon thickness and biomechanical integrity. Dermal graft augmentation will improve load to failure compared with nonaugmented repair in a tendon-deficient model. Controlled laboratory study. Thirty-six matched specimens were organized into 1 of 4 groups: native tendon, native tendon with dermal graft augmentation, tendon with an attritional defect, and tendon with an attritional defect repaired with a graft. To mimic a chronic attritional biceps lesion, a defect was created by a complete tear, leaving 30% of the tendon's width intact. The repair technique in all groups consisted of cortical button and interference screw fixation. All specimens underwent cyclical loading for 3000 cycles and were then tested to failure; gap formation and peak load at failure were documented. The mean (±SD) load to failure (320.9 ± 49.1 N vs 348.8 ± 77.6 N, respectively; P = .38) and gap formation (displacement) (1.8 ± 1.4 mm vs 1.6 ± 1.1 mm, respectively; P = .38) did not differ between the native tendon groups with and without graft augmentation. In the tendon-deficient model, the mean load to failure was significantly improved with graft augmentation compared with no graft augmentation (282.1 ± 83.8 N vs 199.7 ± 45.5 N, respectively; P = .04), while the mean gap formation was significantly reduced (1.2 ± 1.0 mm vs 2.7 ± 1.4 mm, respectively; P = .04). The mean load to failure of the deficient tendon with graft augmentation (282.1 N) compared with the native tendon (348.8 N) was not significantly different ( P = .12). This indicates that the native tendon did not perform differently from the grafted deficient tendon. In a tendon

  3. Analyzing survival curves at a fixed point in time for paired and clustered right-censored data

    PubMed Central

    Su, Pei-Fang; Chi, Yunchan; Lee, Chun-Yi; Shyr, Yu; Liao, Yi-De

    2018-01-01

    In clinical trials, information about certain time points may be of interest in making decisions about treatment effectiveness. Rather than comparing entire survival curves, researchers can focus on the comparison at fixed time points that may have a clinical utility for patients. For two independent samples of right-censored data, Klein et al. (2007) compared survival probabilities at a fixed time point by studying a number of tests based on some transformations of the Kaplan-Meier estimators of the survival function. However, to compare the survival probabilities at a fixed time point for paired right-censored data or clustered right-censored data, their approach would need to be modified. In this paper, we extend the statistics to accommodate the possible within-paired correlation and within-clustered correlation, respectively. We use simulation studies to present comparative results. Finally, we illustrate the implementation of these methods using two real data sets. PMID:29456280
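A minimal sketch of the two-independent-sample version of this test may help fix ideas (the paired/clustered extension the paper develops adds a covariance term not shown here). Survival at t0 is estimated by Kaplan-Meier, its variance by Greenwood's formula, and the comparison is made on the complementary log-log scale, one of the transformations studied by Klein et al. (2007). The sketch assumes 0 < S(t0) < 1 in both groups.

```python
import math

def km_greenwood(times, events, t0):
    """Kaplan-Meier S(t0) and its Greenwood variance (right-censored data)."""
    pairs = sorted(zip(times, events))
    n = len(pairs)
    surv, gw, i = 1.0, 0.0, 0
    while i < len(pairs) and pairs[i][0] <= t0:
        t, d, c = pairs[i][0], 0, 0
        while i < len(pairs) and pairs[i][0] == t:
            d += pairs[i][1]
            c += 1
            i += 1
        if 0 < d < n:
            surv *= 1.0 - d / n
            gw += d / (n * (n - d))   # Greenwood sum: d_j / (n_j * (n_j - d_j))
        n -= c
    return surv, surv * surv * gw

def fixed_time_z(t1, e1, t2, e2, t0):
    """Z statistic for H0: S1(t0) = S2(t0), cloglog transformation."""
    s1, v1 = km_greenwood(t1, e1, t0)
    s2, v2 = km_greenwood(t2, e2, t0)
    g = lambda s: math.log(-math.log(s))
    vg = lambda s, v: v / (s * math.log(s)) ** 2   # delta method
    return (g(s1) - g(s2)) / math.sqrt(vg(s1, v1) + vg(s2, v2))

# Hypothetical data: (time, event) per subject, event=1 means failure observed.
g1t, g1e = [1, 2, 3, 4, 5], [1, 0, 1, 0, 0]
g2t, g2e = [1, 1, 2, 4, 6], [0, 1, 0, 1, 0]
print(fixed_time_z(g1t, g1e, g2t, g2e, t0=3))
```

Under H0 the statistic is compared with a standard normal; the transformation stabilises the comparison near the boundaries of [0, 1].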

  4. Long-term survival of donor bone marrow multipotent mesenchymal stromal cells implanted into the periosteum of patients with allogeneic graft failure.

    PubMed

    Kuzmina, L A; Petinati, N A; Sats, N V; Drize, N J; Risinskaya, N V; Sudarikov, A B; Vasilieva, V A; Drokov, M Y; Michalzova, E D; Parovichnikova, E N; Savchenko, V G

    2016-09-01

    The present study involved three patients with graft failure following allogeneic hematopoietic stem cell transplantation (allo-HSCT). We obtained multipotent mesenchymal stromal cells (MSCs) from the original hematopoietic cell donors and implanted these cells in the periosteum to treat long-term bone marrow aplasia. In all patients, endogenous blood formation recovered 2 weeks after MSC administration. Donor MSCs were found in recipient bone marrow 3 and 5 months following MSC implantation. Thus, our findings indicate that functional donor MSCs can persist in patient bone marrow.

  5. High Israeli mortality rates from diabetes and renal failure - Can international comparison of multiple causes of death reflect differences in choice of underlying cause?

    PubMed

    Goldberger, Nehama; Applbaum, Yael; Meron, Jill; Haklai, Ziona

    2015-01-01

    The age-adjusted mortality rate in Israel is low compared to most Western countries, although mortality rates from diabetes and renal failure in Israel are amongst the highest, while those from cardiovascular diseases (CVD) are amongst the lowest. This study aims to assess the validity of the choice of underlying causes (UC) in Israel by analyzing Israeli and international data on the prevalence of these diseases as multiple causes of death (MCOD) compared to UC, and data on comorbidity (MCOD based). Age-adjusted death rates were calculated for UC and MCOD, together with the corresponding ratio of multiple to underlying cause of death (SRMU), for available years between 1999 and 2012. Comorbidity was explored by calculating cause of death association indicators (CDAI) and the frequency of comorbid disease. These results were compared to data from the USA, France, Italy, Australia and the Czech Republic for 2009 or another available year. Mortality rates for all these diseases except renal failure decreased in Israel between 1999 and 2012, both as UC and as MCOD. In 2009, the SRMU for diabetes was 2.7, slightly lower than in other Western countries (3.0-3.5), showing more frequent choice as UC. Similar results were found for renal failure. In contrast, the SRMU for ischemic heart disease (IHD) and cerebrovascular disease were 2.0 and 2.6, respectively, higher than in other countries (1.4-1.6 and 1.7-1.9, respectively), showing less frequent choice as UC. CDAI data showed a strong association between heart and cerebrovascular disease and diabetes in all countries. In Israel, 40% of deaths with UC diabetes had IHD and 24% had cerebrovascular disease. Renal disease was less strongly associated with IHD. This international comparison suggests that diabetes and renal failure may be coded more frequently in Israel as UC, sometimes instead of heart and cerebrovascular disease. Even with some changes in coding, mortality rates would be high compared to other countries, similar to the comparatively high
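The SRMU used here is just the ratio of an age-adjusted multiple-cause rate to the corresponding underlying-cause rate. A minimal sketch with invented counts (the age strata, standard-population weights, and numbers below are illustrative, not the study's data):

```python
def age_adjusted_rate(deaths, person_years, std_weights, per=100_000):
    """Direct age standardisation: age-specific rates weighted by a
    standard population (weights sum to 1), scaled per 100,000."""
    assert abs(sum(std_weights) - 1.0) < 1e-9
    return per * sum(w * d / py for d, py, w in zip(deaths, person_years, std_weights))

def srmu(mcod_deaths, uc_deaths, person_years, std_weights):
    """Ratio of multiple-cause to underlying-cause age-adjusted mortality."""
    return (age_adjusted_rate(mcod_deaths, person_years, std_weights)
            / age_adjusted_rate(uc_deaths, person_years, std_weights))

# Hypothetical diabetes mortality in three age strata: deaths where diabetes
# appears anywhere on the certificate (MCOD) vs. selected as underlying cause.
mcod = [30, 240, 900]
uc   = [12,  90, 310]
py   = [500_000, 400_000, 150_000]
w    = [0.4, 0.4, 0.2]
print(round(srmu(mcod, uc, py, w), 2))
```

A low SRMU (close to 1) means the disease is usually promoted to underlying cause when it appears on a certificate; a high SRMU means it is usually recorded only as a contributing cause.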

  6. Oxygen Uptake Efficiency Plateau Best Predicts Early Death in Heart Failure

    PubMed Central

    Hansen, James E.; Stringer, William W.

    2012-01-01

    Background: The responses of oxygen uptake efficiency (ie, oxygen uptake/ventilation = V̇O2/V̇E) and its highest plateau (OUEP) during incremental cardiopulmonary exercise testing (CPET) in patients with chronic left heart failure (HF) have not been previously reported. We planned to test the hypothesis that OUEP during CPET is the best single predictor of early death in HF. Methods: We evaluated OUEP, the slope of V̇O2 versus log(V̇E) (oxygen uptake efficiency slope), oscillatory breathing, and all usual resting and CPET measurements in 508 patients with low-ejection-fraction (< 35%) HF. Each had further evaluations at other sites, including cardiac catheterization. Outcomes were 6-month all-cause mortality and morbidity (death or > 24 h cardiac hospitalization). Statistical analyses included area under the receiver operating characteristic curve, ORs, univariate and multivariate Cox regression, and Kaplan-Meier plots. Results: OUEP, which requires only moderate exercise, was often reduced in patients with HF. A low % predicted OUEP was the single best predictor of mortality (P < .0001), with an OR of 13.0 (P < .001). When combined with oscillatory breathing, the OR increased to 56.3, superior to all other resting or exercise parameters or combinations of parameters. Other statistical analyses and the morbidity analysis confirmed those findings. Conclusions: OUEP is often reduced in patients with HF. A low % predicted OUEP (< 65% predicted) is the single best predictor of early death, better than any other CPET or other cardiovascular measurement. Paired with oscillatory breathing, it is even more powerful. PMID:22030802
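The OUEP itself is a simple quantity to derive from averaged CPET samples. The sketch below is illustrative only: the 90-second moving-average window and the sample data are assumed choices for the example, not the paper's exact processing pipeline.

```python
def ouep(vo2, ve, window=9):
    """Oxygen uptake efficiency plateau: the highest moving average of the
    per-sample VO2/VE ratio. With 10-second averaged samples, window=9
    spans roughly 90 seconds."""
    ratios = [o / v for o, v in zip(vo2, ve)]
    if len(ratios) < window:
        raise ValueError("not enough samples for one window")
    return max(sum(ratios[i:i + window]) / window
               for i in range(len(ratios) - window + 1))

# Hypothetical incremental test: VO2 in mL/min, VE in L/min, 10-s samples.
ve  = [20, 22, 25, 28, 32, 37, 43, 50, 58, 67, 77, 90]
vo2 = [700, 780, 890, 1000, 1150, 1330, 1540, 1760, 2000, 2230, 2450, 2700]
print(round(ouep(vo2, ve), 1))  # mL of O2 per L of ventilation
```

Because the plateau occurs at moderate workloads, this index does not require a maximal effort, which is part of its appeal in an HF population.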

  7. A comparison of skin graft success in the head & neck with and without the use of a pressure dressing.

    PubMed

    Dhillon, M; Carter, C P; Morrison, J; Hislop, W S; Currie, W J R

    2015-06-01

    The success of skin grafting is dependent on the interplay between many factors including nutrient uptake and vascular in-growth. To allow this, it is important that the graft is immobile, and traditionally a 'pressure dressing' has been placed over the graft to improve outcome and graft 'take'. We present the findings of our comparative study of full-thickness skin grafts performed in the head, neck and face region over a period of 24 months. We felt that there was an unacceptably high infection and graft failure rate using pressure dressings. Data were collected retrospectively from the case notes of 70 patients who had undergone full-thickness skin grafting to the head, neck and face over a 2 year period. Thirty-five patients underwent grafting with a pressure dressing and 35 without. The pressure-dressing group all received the same specification of 'bolster'-type dressing; in the group without, the graft was 'quilted' in and chloramphenicol ointment applied topically. Success was determined by the percentage 'take' of the grafts and the absence of infection (i.e. purulence). The infection rate in those with a pressure dressing was 26%, in contrast to 9% in those without. Without a pressure dressing we observed no total graft failures, compared to 6% in those with a pressure dressing. The results confirmed the perception that there was a higher infection and graft failure rate where a pressure dressing was applied; however, this was not a statistically significant difference, and a randomised control trial with a larger sample size would be required to validate the results.

  8. Avoiding secondary skin graft donor site morbidity in the fibula free flap harvest.

    PubMed

    Kim, Paul D; Fleck, Terry; Heffelfinger, Ryan; Blackwell, Keith E

    2008-12-01

    To compare donor site morbidity in patients who have undergone fibula free flap reconstruction in which the skin graft was taken from the expected cutaneous paddle of the fibula with the known complications of the popular technique of obtaining a split-thickness skin graft (STSG) from a secondary donor site. Cohort study. The tertiary care centers at Loma Linda University Medical Center and University of California, Los Angeles, Medical Center. From September 1, 2006, to March 30, 2007, 30 patients underwent fibula free flap harvest by 2 surgeons at separate tertiary care centers. Twenty-one of those procedures took place at the University of California, Los Angeles, and 9 at Loma Linda University. Patients included 15 men (50%) and 15 women (50%), with a mean age of 58 (range, 19-88) years. All 30 patients underwent fibula free flap harvest with a split-thickness skin graft (graft thickness, 0.04 cm), obtained from osteocutaneous paddle using a 5.1-cm-wide dermatome, as well as oral cavity and oropharyngeal reconstruction with the de-epithelialized skin paddle. Measures of donor site morbidity, including graft failure and wound breakdown, and measures of recipient site morbidity, including flap failure, hardware complications, intraoral complications, and the need for additional surgery. Of the 30 patients who underwent this procedure, 4 had partial skin graft failures, for a complete skin graft survival of 87%. There were no complete skin graft losses. Regarding the fibula osteocutaneous free flap, there were no complete flap losses, 1 skin paddle necrosis that required debridement, 2 postoperative orocutaneous fistulas, 1 case of infected/extruded hardware, and 1 adhesion formation that required additional surgery for lysis of adhesion and placement of the split-thickness skin graft. The outlined novel technique has similar rates of free flap survival and skin graft take compared with previously described methods. Harvesting the skin graft over the expected

  9. [Kidney transplant experience at the Specialty Hospital Bernardo Sepulveda National Medical Center Century XXI, Mexican Institute of Social Security].

    PubMed

    Gracida-Juárez, Carmen; Espinoza-Pérez, Ramón; Cancino-López, Jorge David; Ibarra-Villanueva, Araceli; Cedillo-López, Urbano; Villegas-Anzo, Fernando; Martínez-Alvarez, Julio

    2011-09-01

    The first kidney transplant in Mexico was performed on October 22, 1963 at the General Hospital of the National Medical Center (CMN) of the Mexican Institute of Social Security. After the earthquake in 1985, transplantation activity continued at the Specialty Hospital of National Medical Center Century XXI. Our program has been continuously active for almost 48 years, with a total of 2019 kidney transplants from October 1963 to December 2010. We describe our experience over 20 years. This retrospective cohort study includes all kidney transplants performed from January 1991 to December 2010. Descriptive statistics were used. Survival analysis was performed using the Kaplan-Meier method. We report patient survival, graft survival censored for death with a functioning graft, and total (uncensored) graft survival. We analyzed a total of 1544 kidney transplants. The percentage of living donors was 82.9% vs. 17.1% deceased donors. Patient survival at 1, 5, 10, 15 and 20 years was 95.0, 91.8, 87.2, 81.1 and 70.1%, respectively; allograft survival censored for death with a functioning allograft at 1, 5, 10, 15 and 20 years was 93.0, 86.2, 76.2, 63.7 and 50.9%, respectively. Our transplant center also follows around 1300 living donors in the long term, monitoring for morbidities that are risk factors for the solitary kidney, such as metabolic syndrome, diabetes, and hypertension. In our program, the main source of renal allografts was living donors. Our transplant center has to increase organ procurement from deceased donors. An important contribution of our center has been the long follow-up of living donors according to international consensus.
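The distinction between death-censored and total graft survival comes down to the event definition fed into the Kaplan-Meier estimator. The self-contained sketch below uses invented follow-up records, not the registry's data:

```python
def km(times, events):
    """Kaplan-Meier product-limit estimator: returns [(t, S(t))] at event times."""
    pairs = sorted(zip(times, events))
    n, s, out, i = len(pairs), 1.0, [], 0
    while i < len(pairs):
        t, d, c = pairs[i][0], 0, 0
        while i < len(pairs) and pairs[i][0] == t:
            d += pairs[i][1]
            c += 1
            i += 1
        if d:
            s *= 1.0 - d / n
            out.append((t, s))
        n -= c
    return out

def surv_at(curve, t0):
    """Step-function lookup of S(t0) on a Kaplan-Meier curve."""
    s = 1.0
    for t, v in curve:
        if t <= t0:
            s = v
    return s

# (years of follow-up, outcome) per transplant; outcomes are hypothetical.
records = [(1, "graft_failure"), (2, "death_with_function"),
           (3, "alive_with_function"), (4, "graft_failure"),
           (6, "alive_with_function"), (7, "death_with_function")]
times = [t for t, _ in records]

# Death-censored graft survival: only graft failure is an event;
# death with a functioning graft is treated as censoring.
dcgs = km(times, [o == "graft_failure" for _, o in records])
# Total (uncensored) graft survival: graft failure OR death with function.
total = km(times, [o != "alive_with_function" for _, o in records])
print("DCGS at 5y:", surv_at(dcgs, 5),
      "| total graft survival at 5y:", surv_at(total, 5))
```

Total graft survival is always at or below the death-censored curve, since it counts every event the death-censored analysis censors.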

  10. Multivariate longitudinal data analysis with censored and intermittent missing responses.

    PubMed

    Lin, Tsung-I; Lachos, Victor H; Wang, Wan-Lun

    2018-05-08

    The multivariate linear mixed model (MLMM) has emerged as an important analytical tool for longitudinal data with multiple outcomes. However, the analysis of multivariate longitudinal data could be complicated by the presence of censored measurements because of a detection limit of the assay in combination with unavoidable missing values arising when subjects miss some of their scheduled visits intermittently. This paper presents a generalization of the MLMM approach, called the MLMM-CM, for a joint analysis of the multivariate longitudinal data with censored and intermittent missing responses. A computationally feasible expectation maximization-based procedure is developed to carry out maximum likelihood estimation within the MLMM-CM framework. Moreover, the asymptotic standard errors of fixed effects are explicitly obtained via the information-based method. We illustrate our methodology by using simulated data and a case study from an AIDS clinical trial. Experimental results reveal that the proposed method is able to provide more satisfactory performance as compared with the traditional MLMM approach. Copyright © 2018 John Wiley & Sons, Ltd.
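The censored-response machinery can be illustrated in the simplest possible setting: a univariate normal sample in which values below a detection limit are recorded as left-censored at that limit. This is a sketch of the EM idea only, under stated assumptions (the actual MLMM-CM handles multivariate responses, random effects, and intermittent missingness): the E-step replaces each censored value by its truncated-normal conditional moments, and the M-step refits the mean and variance.

```python
import math

def _phi(x):
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def _Phi(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def em_censored_normal(y, censored, iters=200):
    """ML estimates (mu, sigma^2) for a normal sample where censored[i]
    marks a left-censored observation and y[i] then stores the limit."""
    mu = sum(y) / len(y)
    s2 = sum((v - mu) ** 2 for v in y) / len(y) or 1.0
    for _ in range(iters):
        s = math.sqrt(s2)
        ex, ex2 = [], []
        for v, c in zip(y, censored):
            if not c:
                ex.append(v)
                ex2.append(v * v)
            else:                                 # E-step: moments of X | X < v
                b = (v - mu) / s
                h = _phi(b) / max(_Phi(b), 1e-12)
                m = mu - s * h                    # E[X | X < v]
                var = s2 * (1.0 - b * h - h * h)  # Var[X | X < v]
                ex.append(m)
                ex2.append(var + m * m)
        mu = sum(ex) / len(ex)                    # M-step
        s2 = sum(ex2) / len(ex2) - mu * mu
    return mu, s2
```

With no censored values this reduces to the ordinary sample mean and (1/n) variance; with censoring, it corrects the upward bias that plugging the detection limit in directly would produce.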

  11. The Relationship of the Severity and Category of Acute Rejection With Intimal Arteritis Defined in Banff Classification to Clinical Outcomes.

    PubMed

    Wu, Kaiyin; Budde, Klemens; Schmidt, Danilo; Neumayer, Hans-Helmut; Rudolph, Birgit

    2015-08-01

    It is unclear whether the category of acute rejection with intimal arteritis (ARV) is relevant to short- and long-term clinical outcomes, and whether graft outcomes are affected by the severity of intimal arteritis. One hundred forty-eight ARV episodes were reviewed and categorized according to the 2013 Banff criteria: T cell-mediated rejection with intimal arteritis (v) lesion (TCMRV; n = 78) and antibody-mediated rejection with v lesion (total AMRV), the latter further divided into suspicious AMRV (sAMRV; n = 37) and AMRV (n = 33). The Banff scores of intimal arteritis (v1, v2 and v3) represented low, moderate, and high ARV severity. The grafts with TCMRV, sAMRV, and AMRV showed similar responses to antirejection therapy, whereas the grafts with v2- or v3-ARV responded significantly more poorly than those with v1-ARV. The 8-year death-censored graft survival (DCGS) rate was 56.8% for TCMRV versus 34.1% for total AMRV (log rank, P = 0.03), but the 1- and 5-year DCGS rates were comparable between the 2 groups; moreover, the 1-, 5-, and 8-year DCGS rates of v1-ARV were evidently higher than those of v2- and v3-ARV (each pairwise comparison to v1-ARV yields P < 0.01); in contrast, the DCGS rates were similar between sAMRV and AMRV. Existing donor-specific antibodies, moderate microvascular inflammation, C4d-positive staining, and intensive tubulointerstitial inflammation played a less significant role in long-term graft survival. Compared with the category, ARV severity is more closely associated with the initial response to antirejection therapy and long-term graft failure. The sAMRV and AMRV might represent a spectrum of the same disorder.

  12. Fifteen-Year Trends in Pediatric Liver Transplants: Split, Whole Deceased, and Living Donor Grafts.

    PubMed

    Mogul, Douglas B; Luo, Xun; Bowring, Mary G; Chow, Eric K; Massie, Allan B; Schwarz, Kathleen B; Cameron, Andrew M; Bridges, John F P; Segev, Dorry L

    2018-05-01

    To evaluate changes in patient and graft survival for pediatric liver transplant recipients since 2002, and to determine if these outcomes vary by graft type (whole liver transplant, split liver transplant [SLT], and living donor liver transplant [LDLT]). We evaluated patient and graft survival among pediatric liver-only transplant recipients since the PELD/MELD system was implemented, using the Scientific Registry of Transplant Recipients. From 2002-2009 to 2010-2015, survival for SLT at 30 days improved (94% vs 98%; P < .001), and at 1 year improved for SLT (89% to 95%; P < .001) and LDLT (93% to 98%; P = .002). There was no change in survival for whole liver transplant at either 30 days (98% in both; P = .7) or 1 year (94% vs 95%; P = .2). The risk of early death with SLT was 2.14-fold higher in 2002-2009 (adjusted hazard ratio [aHR] vs whole liver transplant, 2.14; 95% CI, 1.47-3.12), but this risk disappeared in 2010-2015 (aHR, 1.13; 95% CI, 0.65-1.96), representing a significant improvement (P = .04). Risk of late death after SLT was similar in both time periods (aHR 2002-2009, 1.14 [95% CI, 0.87-1.48]; aHR 2010-2015, 0.88 [95% CI, 0.56-1.37]). LDLT had similar risk of early death (aHR 2002-2009, 1.03 [95% CI, 0.49-2.14]; aHR 2010-2015, 0.74 [95% CI, 0.26-2.10]) and late death (aHR 2002-2009, 0.83 [95% CI, 0.52-1.32]; aHR 2010-2015, 0.44 [95% CI, 0.17-1.11]). Graft loss was similar for SLT (aHR, 1.09; 95% CI, 0.93-1.28) and was actually lower for LDLT (aHR, 0.71; 95% CI, 0.53-0.95). In recent years, outcomes after the use of technical variant grafts are comparable with whole grafts, and may be superior for LDLT. Greater use of technical variant grafts might provide an opportunity to increase organ supply without compromising post-transplant outcomes. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Transcriptomic Analysis Provides Insights into Grafting Union Development in Pecan (Carya illinoinensis).

    PubMed

    Mo, Zhenghai; Feng, Gang; Su, Wenchuan; Liu, Zhuangzhuang; Peng, Fangren

    2018-02-05

    Pecan (Carya illinoinensis), a popular nut tree, has been widely planted in China in recent years. Grafting is an important technique for its cultivation. For successful grafting, graft union development generally involves the formation of callus and vascular bundles at the graft union. To explore the molecular mechanism of graft union development, we applied high-throughput RNA sequencing to investigate the transcriptomic profiles of the graft union at four time points (0, 8, 15, and 30 days) during the pecan grafting process. After de novo assembly, 83,693 unigenes were obtained, and 40,069 of them were annotated. A total of 12,180 differentially expressed genes were identified across the grafting time points. Genes involved in hormone signaling, cell proliferation, xylem differentiation, cell elongation, secondary cell wall deposition, programmed cell death, and reactive oxygen species (ROS) scavenging showed significant differential expression during graft union development. In addition, we found that auxin, cytokinin, and gibberellin accumulated at the graft unions during the grafting process. These results will aid our understanding of successful grafting in the future.

  14. The Impact of Ischemia/Reperfusion Injury on Liver Allografts from Deceased after Cardiac Death versus Deceased after Brain Death Donors

    PubMed Central

    Xu, Jin; Sayed, Blayne Amir; Casas-Ferreira, Ana Maria; Srinivasan, Parthi; Heaton, Nigel; Rela, Mohammed; Ma, Yun; Fuggle, Susan; Legido-Quigley, Cristina; Jassem, Wayel

    2016-01-01

    Background and aims The shortage of organs for transplantation has led to increased use of organs procured from donors after cardiac death (DCD). The effects of cardiac death on the liver remain poorly understood, however. Using livers obtained from DCD versus donors after brain death (DBD), we aimed to understand how ischemia/reperfusion (I/R) injury alters expression of the pro-inflammatory markers, ceramides, and influences graft leukocyte infiltration. Methods Hepatocyte inflammation, as assessed by ceramide expression, was evaluated in DCD (n = 13) and DBD (n = 10) livers. Allograft expression of inflammatory and cell death markers, and allograft leukocyte infiltration, were evaluated in a contemporaneous independent cohort of DCD (n = 22) and DBD (n = 13) livers. Results When examining the differences between transplant stages in each group, C18, C20, and C24 ceramides showed significant differences in DBD livers (p<0.05), whereas changes in C22 ceramide (p<0.05) were more pronounced in DCD livers. C18 ceramide correlated with bilirubin, INR, and creatinine after transplant in DCD. Prior to transplantation, DCD livers had reduced leukocyte infiltration compared with DBD allografts. Following reperfusion, neutrophil infiltration and platelet deposition were less prevalent in DCD grafts, while cell death and recipients' serum aspartate aminotransferase (AST) levels were significantly increased in DCD allografts. Conclusion These data suggest that I/R injury generates necrosis in the absence of a strong inflammatory response in DCD livers, with an appreciable effect on early graft function. The long-term consequences of increased inflammation in DBD and increased cell death in DCD allografts are unknown and warrant further investigation. PMID:26863224

  15. Censored Quantile Instrumental Variable Estimates of the Price Elasticity of Expenditure on Medical Care

    PubMed Central

    Kowalski, Amanda

    2015-01-01

    Efforts to control medical care costs depend critically on how individuals respond to prices. I estimate the price elasticity of expenditure on medical care using a censored quantile instrumental variable (CQIV) estimator. CQIV allows estimates to vary across the conditional expenditure distribution, relaxes traditional censored model assumptions, and addresses endogeneity with an instrumental variable. My instrumental variable strategy uses a family member’s injury to induce variation in an individual’s own price. Across the conditional deciles of the expenditure distribution, I find elasticities that vary from −0.76 to −1.49, which are an order of magnitude larger than previous estimates. PMID:26977117
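    The censoring-aware part of this estimator can be conveyed without the instrumental-variable step. The toy example below is a deliberate simplification in the spirit of Powell's censored quantile regression, not the CQIV estimator itself: it fits a median regression to simulated data censored from below at zero by letting the censoring point enter the fitted quantile:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Latent outcome censored from below at zero (e.g. zero expenditure)
n = 2000
x = rng.uniform(0, 2, n)
y_star = -1.0 + 2.0 * x + rng.normal(0, 0.5, n)
y = np.maximum(y_star, 0.0)

def check_loss(u, tau):
    """Koenker-Bassett check function."""
    return np.where(u >= 0, tau * u, (tau - 1.0) * u)

def objective(beta, tau):
    # The censoring point enters the fitted quantile itself: max(0, x'b)
    fitted = np.maximum(0.0, beta[0] + beta[1] * x)
    return check_loss(y - fitted, tau).sum()

res = minimize(objective, x0=[0.0, 1.0], args=(0.5,), method="Nelder-Mead")
b0_hat, b1_hat = res.x
print(b0_hat, b1_hat)   # latent median line, recovered despite censoring
```

    Because the conditional median of the censored outcome is max(0, x'b), minimizing the check loss of the censored fit recovers the latent coefficients; CQIV extends this logic across quantiles and adds an instrument for the endogenous price.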

  16. The inferior impact of antibody-mediated rejection on the clinical outcome of kidney allografts that develop de novo thrombotic microangiopathy.

    PubMed

    Wu, Kaiyin; Budde, Klemens; Schmidt, Danilo; Neumayer, Hans-Hellmut; Lehner, Lukas; Bamoulid, Jamal; Rudolph, Birgit

    2016-02-01

    Antibody-mediated rejection (AMR) can induce thrombotic microangiopathy (TMA) in renal allografts. Definitive AMR (dAMR) presents all three diagnostic features; suspicious AMR (sAMR) is designated when one of the three features is missing. Thirty-two TMA cases overlapping with AMR (AMR+ TMA) were studied, comprising 14 cases of sAMR+ TMA and 18 cases of dAMR+ TMA. Thirty TMA cases free of AMR features (AMR- TMA) were enrolled as the control group. The rate of complete response to treatment was similar between the AMR- TMA and AMR+ TMA groups (23.3% vs. 12.5%, p = 0.33), and between the sAMR+ TMA and dAMR+ TMA groups (14.3% vs. 11.1%, p = 0.79). At eight years post-transplantation, the death-censored graft survival (DCGS) rate of the AMR- TMA group was 62.8%, significantly higher than the 28.0% of the AMR+ TMA group (p = 0.01), but similar between sAMR+ TMA and dAMR+ TMA (30.0% vs. 26.7%, p = 0.92). Overall, intimal arteritis and broad HLA (human leukocyte antigen) mismatches were closely associated with renal allograft failure over time. AMR+ TMA carries inferior long-term graft survival, but grafts with sAMR+ TMA or dAMR+ TMA have similar characteristics and clinical courses. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  17. Effects of upper-extremity vascular access creation on cardiac events in patients undergoing coronary artery bypass grafting.

    PubMed

    Han, Youngjin; Choo, Suk Jung; Kwon, Hyunwook; Lee, Jae Won; Chung, Cheol Hyun; Kim, Hyangkyoung; Kwon, Tae-Won; Cho, Yong-Pil

    2017-01-01

    The present study was conducted to investigate whether upper-extremity vascular access (VA) creation increases the risk for major adverse cardiac events (MACE) and death in patients undergoing coronary artery bypass grafting (CABG) with an in situ left internal thoracic artery (ITA) graft. A total of 111 patients who underwent CABG with a left ITA graft followed by upper-extremity VA creation were analyzed retrospectively; 93 patients underwent left-sided VA creation (83.8%, ipsilateral group) and 18 underwent right-sided VA creation (16.2%, contralateral group). The primary outcome was the occurrence of MACE, and the secondary outcome was the composite of MACE or late death. There were no significant differences in the incidence of the primary (P = 0.30) or secondary (P = 0.09) outcomes between the two groups. Multivariate regression analysis indicated that prior cerebrovascular accidents (hazard ratio [HR] 3.30; 95% confidence interval [CI] 1.37-7.97; P = 0.01) and type of VA (HR 3.44; 95% CI 1.34-8.82; P = 0.01) were independently associated with MACE; prior peripheral arterial occlusive disease (HR 4.22; 95% CI 1.62-10.98; P<0.01) and type of VA (arteriovenous fistula vs. prosthetic arteriovenous graft) (HR 3.06; 95% CI 1.42-6.61; P<0.01) were associated with the composite of MACE or death. The side and location of the VA were not associated with MACE or death. Our study showed no definite evidence that ipsilateral VA creation affects the subsequent occurrence of MACE or late death from any cause. The type of VA (prosthetic arteriovenous graft) is a significant predictor of the subsequent occurrence of MACE or late death.

  18. Probabilistic PCA of censored data: accounting for uncertainties in the visualization of high-throughput single-cell qPCR data.

    PubMed

    Buettner, Florian; Moignard, Victoria; Göttgens, Berthold; Theis, Fabian J

    2014-07-01

    High-throughput single-cell quantitative real-time polymerase chain reaction (qPCR) is a promising technique allowing for new insights in complex cellular processes. However, the PCR reaction can be detected only up to a certain detection limit, whereas failed reactions could be due to low or absent expression, and the true expression level is unknown. Because this censoring can occur for high proportions of the data, it is one of the main challenges when dealing with single-cell qPCR data. Principal component analysis (PCA) is an important tool for visualizing the structure of high-dimensional data as well as for identifying subpopulations of cells. However, to date it is not clear how to perform a PCA of censored data. We present a probabilistic approach that accounts for the censoring and evaluate it for two typical datasets containing single-cell qPCR data. We use the Gaussian process latent variable model framework to account for censoring by introducing an appropriate noise model and allowing a different kernel for each dimension. We evaluate this new approach for two typical qPCR datasets (of mouse embryonic stem cells and blood stem/progenitor cells, respectively) by performing linear and non-linear probabilistic PCA. Taking the censoring into account results in a 2D representation of the data, which better reflects its known structure: in both datasets, our new approach results in a better separation of known cell types and is able to reveal subpopulations in one dataset that could not be resolved using standard PCA. The implementation was based on the existing Gaussian process latent variable model toolbox (https://github.com/SheffieldML/GPmat); extensions for noise models and kernels accounting for censoring are available at http://icb.helmholtz-muenchen.de/censgplvm. © The Author 2014. Published by Oxford University Press. All rights reserved.
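    A much simpler stand-in for the paper's GPLVM noise model conveys why censoring matters for PCA: entries recorded at the detection limit can be replaced by their expected value under a normal truncated above at the limit before projecting. Everything below (the simulated data, the per-gene plug-in moments) is a hypothetical sketch, not the authors' method:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Simulate log-expression for 200 cells x 10 genes driven by one latent factor
n_cells, n_genes = 200, 10
z = rng.normal(size=n_cells)                      # latent cell state
w = rng.normal(size=n_genes)                      # gene loadings
x_true = np.outer(z, w) + 0.3 * rng.normal(size=(n_cells, n_genes)) + 5.0
dl = 4.0                                          # detection limit of the assay
censored = x_true < dl
x_obs = np.where(censored, dl, x_true)            # naive: record at the limit

# Replace each censored entry with E[X | X < dl] under a per-gene normal
# fitted (crudely) to the uncensored entries -- a plug-in simplification.
x_filled = x_obs.copy()
for j in range(n_genes):
    vals = x_obs[~censored[:, j], j]
    mu, sd = vals.mean(), vals.std() + 1e-9
    a = (dl - mu) / sd
    e_trunc = mu - sd * norm.pdf(a) / max(norm.cdf(a), 1e-12)
    x_filled[censored[:, j], j] = e_trunc

def first_pc_scores(mat):
    centered = mat - mat.mean(axis=0)
    u, s, _ = np.linalg.svd(centered, full_matrices=False)
    return u[:, 0] * s[0]

# The censoring-aware fill recovers censored entries better than clipping,
# and the leading PC still tracks the latent cell state.
err_naive = np.abs(x_obs - x_true)[censored].mean()
err_adj = np.abs(x_filled - x_true)[censored].mean()
corr_adj = abs(np.corrcoef(z, first_pc_scores(x_filled))[0, 1])
print(err_naive, err_adj, corr_adj)
```

    The GPLVM approach in the paper goes further by treating censored entries probabilistically inside the model rather than filling them in once, but the direction of the correction is the same.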

  19. Probabilistic PCA of censored data: accounting for uncertainties in the visualization of high-throughput single-cell qPCR data

    PubMed Central

    Buettner, Florian; Moignard, Victoria; Göttgens, Berthold; Theis, Fabian J.

    2014-01-01

    Motivation: High-throughput single-cell quantitative real-time polymerase chain reaction (qPCR) is a promising technique allowing for new insights in complex cellular processes. However, the PCR reaction can be detected only up to a certain detection limit, whereas failed reactions could be due to low or absent expression, and the true expression level is unknown. Because this censoring can occur for high proportions of the data, it is one of the main challenges when dealing with single-cell qPCR data. Principal component analysis (PCA) is an important tool for visualizing the structure of high-dimensional data as well as for identifying subpopulations of cells. However, to date it is not clear how to perform a PCA of censored data. We present a probabilistic approach that accounts for the censoring and evaluate it for two typical datasets containing single-cell qPCR data. Results: We use the Gaussian process latent variable model framework to account for censoring by introducing an appropriate noise model and allowing a different kernel for each dimension. We evaluate this new approach for two typical qPCR datasets (of mouse embryonic stem cells and blood stem/progenitor cells, respectively) by performing linear and non-linear probabilistic PCA. Taking the censoring into account results in a 2D representation of the data, which better reflects its known structure: in both datasets, our new approach results in a better separation of known cell types and is able to reveal subpopulations in one dataset that could not be resolved using standard PCA. Availability and implementation: The implementation was based on the existing Gaussian process latent variable model toolbox (https://github.com/SheffieldML/GPmat); extensions for noise models and kernels accounting for censoring are available at http://icb.helmholtz-muenchen.de/censgplvm. Contact: fbuettner.phys@gmail.com Supplementary information: Supplementary data are available at Bioinformatics online. 
PMID:24618470

  20. Transient hyperglycemia during liver transplantation does not affect the early graft function.

    PubMed

    Blasi, Annabel; Beltran, Joan; Martin, Nuria; Martinez-Pallí, Graciela; Lozano, Juan J; Balust, Jaume; Torrents, Abigail; Taura, Pilar

    2015-01-01

    Background and rationale for the study. Hyperglycemia after graft reperfusion is a consistent finding in liver transplantation (LT) that remains poorly studied. We aimed to describe its appearance in LT recipients of different types of grafts and its relation to graft function. 436 LT recipients of grafts from donors after brain death (DBD), donors after cardiac death (DCD), and familial amyloidotic polyneuropathy (FAP) donors were reviewed. Serum glucose was measured at baseline, during the anhepatic phase, after graft reperfusion, and at the end of surgery. Early allograft dysfunction (EAD) was assessed by the Olthoff criteria. Caspase-3, IFN-γ, IL1β, and IL6 gene expression was measured in liver biopsies. The highest increase in glucose levels after reperfusion was observed in FAP LT recipients and the lowest in DCD LT recipients. Glucose level during the anhepatic phase was the only modifiable predictive variable of hyperglycemia after reperfusion. No relation was found between hyperglycemia after reperfusion and EAD. However, recipients with the highest glucose levels after reperfusion tended to achieve the best glucose control at the end of surgery, and those who were unable to control the glucose value after reperfusion showed EAD more frequently. The highest levels of caspase-3 were found in recipients with the lowest glucose values after reperfusion. In conclusion, glucose levels increased after graft reperfusion to an extent that differed by donor type. Contrary to general belief, transient hyperglycemia after reperfusion does not appear to impact negatively on liver graft function and could even be suggested as a marker of graft quality.

  1. Steroid Avoidance in Pediatric Heart Transplantation Results in Excellent Graft Survival

    PubMed Central

    Auerbach, Scott R.; Gralla, Jane; Campbell, David N.; Miyamoto, Shelley D.; Pietra, Biagio A.

    2018-01-01

    Background Maintenance steroid (MS) use in pediatric heart transplantation (HT) varies across centers. The purpose of this study was to evaluate the impact of steroid-free maintenance immunosuppression (SF) on graft outcomes in pediatric HT. Methods Patients younger than 18 years in the United States undergoing a first HT during 1990 to 2010 were analyzed for conditional 30-day graft loss (death or repeat HT) and death based on MS use by multivariable analysis. A propensity score was then given to each patient using a logistic model, and propensity matching was performed using pre-HT risk factors, induction therapy, and nonsteroid maintenance immunosuppression. Kaplan-Meier graft and patient survival probabilities by MS use were then calculated. Results Of 4894 patients, 3962 (81%) were taking MS and 932 (19%) SF. Of the 4530 alive at 30 days after HT, 3694 (82%) and 836 (18%) were in the MS and SF groups, respectively. Unmatched multivariable analysis showed no difference in 30-day conditional graft survival between MS and SF groups (hazard ratio=1.08, 95% confidence interval=0.93-1.24; P=0.33). Propensity matching resulted in 462 patients in each MS and SF group. Propensity-matched Kaplan-Meier survival analysis showed no difference in graft or patient survival between groups (P=0.3 and P=0.16, respectively). Conclusions We found no difference in graft survival between SF patients and those taking MS. An SF regimen in pediatric HT avoids potential complications of steroid use without compromising graft survival, even after accounting for pre-HT risk factors. PMID:24389908
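    The propensity-matching step described above can be sketched as follows. The covariates, caliper, and greedy 1:1 nearest-neighbour rule are illustrative assumptions, not the study's actual protocol:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000

# Hypothetical pre-transplant covariates that also drive treatment assignment
age = rng.normal(8, 4, n)
weight = rng.normal(30, 10, n)
X = np.column_stack([np.ones(n), age, weight])
logit = -2.0 + 0.15 * age + 0.03 * weight
treated = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))

def fit_logistic(X, y, iters=25):
    """Plain Newton-Raphson logistic regression (the propensity model)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ beta))
        hess = (X * (p * (1 - p))[:, None]).T @ X
        beta += np.linalg.solve(hess, X.T @ (y - p))
    return beta

ps = 1 / (1 + np.exp(-X @ fit_logistic(X, treated.astype(float))))

# Greedy 1:1 nearest-neighbour matching on the score, with a caliper
controls = list(np.where(~treated)[0])
pairs = []
for i in np.where(treated)[0]:
    j = min(controls, key=lambda k: abs(ps[k] - ps[i]))
    if abs(ps[j] - ps[i]) < 0.05:
        pairs.append((i, j))
        controls.remove(j)

m_t = np.array([i for i, _ in pairs])
m_c = np.array([j for _, j in pairs])
gap_before = abs(age[treated].mean() - age[~treated].mean())
gap_after = abs(age[m_t].mean() - age[m_c].mean())
print(len(pairs), gap_before, gap_after)   # covariate gap shrinks after matching
```

    After matching on the estimated score, the treated and control groups are balanced on the covariates that entered the propensity model, which is what licenses the Kaplan-Meier comparison of the matched groups in the study.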

  2. [Structural Damage to the Hamstring Graft due to Interaction with Fixation Material and its Effect on Biomechanical Properties of ACL Reconstruction].

    PubMed

    Kautzner, J; Držík, M; Handl, M; Povýšil, C; Kos, P; Trč, T; Havlas, V

    2017-01-01

    PURPOSE OF THE STUDY Hamstring grafts are commonly used for ACL reconstruction. The purpose of our study is to determine the effects of suspension fixation compared with graft cross-pinning transfixation, and the effect of structural damage during graft preparation on the biomechanical properties of the graft. MATERIAL AND METHODS The study is a cadaveric biomechanical laboratory study. 38 fresh-frozen human hamstring specimens from 19 cadaveric donors were used. The grafts were tested for their loading properties. One half of each specimen was suspended over a 3.3 mm pin; the other half was cross-pinned by a 3.3 mm pin to simulate the graft cross-pinning technique. Single impact testing was performed; the failure force, elongation, and acceleration/deceleration of each graft were recorded, and the loading force vs. elongation of the graft specimens was calculated. Results for suspended and cross-pinned grafts were analysed using ANOVA, comparing the grafts from each donor. RESULTS The ultimate strength of a double-strand gracilis graft was 1287 ± 134 N when suspended over a pin; the strength of a cross-pinned graft was 833 ± 111 N. For double-strand semitendinosus grafts the strengths were 1883 ± 198 and 997 ± 234 N, respectively. Thus, the failure load of the cross-pinned graft was only 64.7% (gracilis) or 52.9% (semitendinosus) of that of the suspended graft. DISCUSSION Structural damage to the graft significantly reduces the graft strength. Also, extensive suturing during preparation of the graft reduces its strength. CONCLUSIONS Fixation methods that do not interfere with the graft's structure should be used to reduce the risk of graft failure. Key words: ACL reconstruction, hamstring graft, biomechanical testing.

  3. Assessing assay agreement estimation for multiple left-censored data: a multiple imputation approach.

    PubMed

    Lapidus, Nathanael; Chevret, Sylvie; Resche-Rigon, Matthieu

    2014-12-30

    Agreement between two assays is usually based on the concordance correlation coefficient (CCC), estimated from the means, standard deviations, and correlation coefficient of these assays. However, such data will often suffer from left-censoring because of lower limits of detection of these assays. To handle such data, we propose to extend a multiple imputation approach by chained equations (MICE) developed in a close setting of one left-censored assay. The performance of this two-step approach is compared with that of a previously published maximum likelihood estimation through a simulation study. Results show close estimates of the CCC by both methods, although the coverage is improved by our MICE proposal. An application to cytomegalovirus quantification data is provided. Copyright © 2014 John Wiley & Sons, Ltd.
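    A stripped-down version of this idea can be sketched as follows: draw each left-censored value from a normal truncated above at the detection limit, compute Lin's CCC on each completed dataset, and average the estimates. Everything here is simulated, and the two assays are imputed independently, a deliberate simplification of the chained-equations (MICE) approach the authors extend:

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(4)
n = 300
true_val = rng.normal(5, 1, n)            # hypothetical "true" quantification
a1 = true_val + rng.normal(0, 0.3, n)     # assay 1
a2 = true_val + rng.normal(0, 0.3, n)     # assay 2
dl = 4.0                                   # shared lower limit of detection
x = np.where(a1 < dl, np.nan, a1)         # nan marks a left-censored value
y = np.where(a2 < dl, np.nan, a2)

def ccc(u, v):
    """Lin's concordance correlation coefficient."""
    mu_u, mu_v = u.mean(), v.mean()
    s_uv = ((u - mu_u) * (v - mu_v)).mean()
    return 2 * s_uv / (u.var() + v.var() + (mu_u - mu_v) ** 2)

def impute(col):
    """Draw censored entries from a normal truncated above at the limit."""
    obs = col[~np.isnan(col)]
    mu, sd = obs.mean(), obs.std()
    b = (dl - mu) / sd
    # lower bound -10 standardized units: effectively no lower truncation
    draws = truncnorm.rvs(-10.0, b, loc=mu, scale=sd,
                          size=int(np.isnan(col).sum()), random_state=rng)
    out = col.copy()
    out[np.isnan(out)] = draws
    return out

m = 20                                     # number of imputations
estimates = [ccc(impute(x), impute(y)) for _ in range(m)]
ccc_mi = np.mean(estimates)               # pooled point estimate
print(round(ccc_mi, 3))
```

    Because the censored entries of the two assays are drawn independently here, the pooled CCC understates the true concordance for censored pairs; conditioning each imputation on the other assay, as MICE does, is what removes that bias.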

  4. Arthroscopic anterior cruciate ligament distal graft rupture: a method of salvage.

    PubMed

    Larrain, Mario V; Mauas, David M; Collazo, Cristian C; Rivarola, Horacio F

    2004-09-01

    We describe a rare case of anterior cruciate ligament (ACL) distal graft rupture in a high-demand rugby player. Fifteen months before this episode, he underwent an ACL reconstruction (autologous patellar tendon graft surgery) plus posterolateral reconstruction with direct suture and fascia lata augmentation. Radiographs revealed correct positioning of tunnels and fixation screws. Magnetic resonance imaging showed that the graft rupture was close to the tibial bone block and presented a signal compatible with optimal graft incorporation. The surgical recording and clinical records were reviewed. No failures were found. After careful evaluation we concluded that the primary cause of failure was trauma. Based on these findings a salvage surgery technique was performed. Return to sport activities was allowed after four months, when sufficient strength and range of motion had returned. Recent follow-up (2 years 8 months postoperative) has shown an excellent result, with a Lysholm score of 100, an International Knee Documentation Committee (IKDC) score of 100, and a KT-1000 arthrometer reading of between 0 and 5 mm. The athlete has returned to his previous professional level. We believe this simple, specific, nonaggressive, and anatomic reconstructive technique may be used in cases of avulsion or distal detachment caused only by trauma and with a graft that is likely to heal.

  5. Cardiovascular Risk Reduction is Important for Improving Patient and Graft Survival After Ligation and Bypass Surgery for Popliteal Artery Aneurysm.

    PubMed

    Dattani, N; Ali, M; Aber, A; Kannan, R Yap; Choke, E C; Bown, M J; Sayers, R D; Davies, R S

    2017-07-01

    To report outcomes following ligation and bypass (LGB) surgery for popliteal artery aneurysm (PAA) and to study factors influencing patient and graft survival. A retrospective review of patients undergoing LGB surgery for PAA between September 1999 and August 2012 at a tertiary referral vascular unit was performed. Primary graft patency (PGP), primary-assisted graft patency (PAGP), and secondary graft patency (SGP) rates were calculated using survival analyses. Patient survival, graft aneurysm-free survival (GAFS), aneurysm reperfusion-free survival (ARFS), and amputation-free survival (AFS) rates were also calculated. Log-rank testing and Cox proportional hazards modeling were used to perform univariate and multivariate analyses of influencing factors, respectively. Eighty-four LGB repairs in 69 patients (mean age 71.3 years, 68 males) were available for study. The 5-year PGP, PAGP, SGP, and patient survival rates were 58.1%, 84.4%, 85.2%, and 81.1%, respectively. On multivariate analysis, the principal determinants of PGP were urgency of operation (P = .009) and smoking status (P = .019). The principal determinant of PAGP was hyperlipidemia status (P = .048), and those of SGP were hyperlipidemia (P = .042) and cerebrovascular disease (CVD) status (P = .045). The principal determinants of patient survival were previous myocardial infarction (P = .004) and CVD (P = .001). The 5-year GAFS, ARFS, and AFS rates were 87.9%, 91.6%, and 96.1%, respectively. This study has shown that traditional cardiovascular risk factors, such as smoking and ischemic heart disease, are the most important predictors of early graft failure and patient death following LGB surgery for PAA.

  6. Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models

    PubMed Central

    Gelfand, Lois A.; MacKinnon, David P.; DeRubeis, Robert J.; Baraldi, Amanda N.

    2016-01-01

    Objective: Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. Method: We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. Results: AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome—underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. Conclusions: When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results. PMID:27065906

  7. Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models.

    PubMed

    Gelfand, Lois A; MacKinnon, David P; DeRubeis, Robert J; Baraldi, Amanda N

    2016-01-01

    Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome-underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results.
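    The AFT side of this comparison can be sketched from first principles: under a Weibull model, log survival time is linear in covariates with scaled minimum-Gumbel errors, and right-censored subjects contribute the log survivor function. The example below uses simulated data and a hand-rolled likelihood, not the SAS LIFEREG procedure, and recovers the treatment coefficient by maximum likelihood:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
n = 1500

# Weibull AFT data: log T is linear in treatment with min-Gumbel errors
x = rng.integers(0, 2, n).astype(float)          # treatment indicator
b0, b1, scale = 2.0, 0.7, 0.8                    # true parameters
gumbel = np.log(-np.log(rng.uniform(size=n)))    # standard min-Gumbel draws
log_t = b0 + b1 * x + scale * gumbel
log_c = np.log(rng.exponential(np.exp(3.0), n))  # random censoring times
event = log_t <= log_c                           # True = event observed
obs = np.minimum(log_t, log_c)                   # what we actually see

def neg_loglik(theta):
    beta0, beta1, log_s = theta
    s = np.exp(log_s)
    z = (obs - beta0 - beta1 * x) / s
    # Events contribute the log-density of log T; censored subjects
    # contribute the log survivor function, log S(t) = -exp(z).
    ll = np.where(event, z - np.exp(z) - np.log(s), -np.exp(z))
    return -ll.sum()

res = minimize(neg_loglik, x0=[0.0, 0.0, 0.0], method="Nelder-Mead")
b1_hat = res.x[1]
print(b1_hat, np.exp(b1_hat))   # coefficient and implied time ratio
```

    The coefficient is directly interpretable as a log time ratio, which is part of why AFT models slot into mediation analyses more cleanly than PH models, whose coefficients are log hazard ratios.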

  8. Distraction osteogenesis of costochondral bone grafts in the mandible.

    PubMed

    Stelnicki, Eric J; Hollier, Larry; Lee, Catherine; Lin, Wen-Yuan; Grayson, Barry; McCarthy, Joseph G

    2002-03-01

    Graft distraction complications included pin tract infections in two patients, hardware failure with premature pin pullout in one patient, and evidence of fibrous nonunion in three young patients with single, diminutive rib grafts. In group 2, there were no distraction failures. Distraction osteogenesis can be successfully performed on costochondral rib grafts of the mandible; however, the complication rate is higher than in non-rib-graft patients. Performing the technique on older, more cooperative individuals seems to reduce this risk. In addition, placement of a double rib graft or an iliac bone graft of sufficient volume to create a neomandible with greater bone stock is an absolute requirement to decrease the risk of fibrous nonunion and to provide a bone base of sufficient size for retention of the distraction device and manipulation of the regenerate.

  9. Repeated blood transfusions: Identification of a novel culprit of early graft failure in children.

    PubMed

    Therrien, Judith; Guo, Kenneth; Guo, Liming; Liu, Aihua; Marelli, Ariane

    2018-03-01

    The attrition of right ventricle to pulmonary artery (RV-PA) grafts has been attributed in part to the body's immunologic response. We hypothesized that antibodies developed through blood transfusion, directed against the grafts, may result in accelerated degeneration and the need for re-intervention. This is a population-based study of the province of Quebec. We included children born between January 1, 1987 and December 31, 2006 who were diagnosed with a cono-truncal anomaly and had an RV-PA graft. The patients were followed for transfusion exposure and RV-PA graft re-intervention. Time to re-intervention in those exposed versus non-exposed was analyzed using Cox regression. Analysis was done in two time periods, before and after the calendar year 2000, given the change in blood preparation in the province of Quebec. There were 413 patients who met the inclusion criteria of a cono-truncal disorder. Of the whole study population, 69% received a blood transfusion. Cox regression analysis showed that among patients who had the initial graft performed before the year 2000 (n=181), having 2 or more blood transfusions was associated with a nearly tripled risk of re-intervention compared with no blood transfusion (hazard ratio 2.88; 95% confidence interval 1.05-7.91). In patients who had the initial graft performed after the year 2000 (n=232), the associated risk increase was 7-fold (hazard ratio 7.01; 95% confidence interval 3.06-16.02). Kaplan-Meier analyses confirmed the significant difference in re-intervention-free survival probabilities between those who received 2 or more blood product transfusions and those who did not, both prior to the year 2000 (67.9% vs. 88.0% at 5 years, p=0.0201) and after the year 2000 (39.7% vs. 82.8% at 5 years, p<0.0001). In this population-based analysis, repeated blood product transfusion was associated with a significantly increased risk of need for RV-PA graft re-intervention. These data strongly suggest that repeated blood product transfusions contribute to early RV-PA graft failure in children.
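    The re-intervention-free survival probabilities quoted above come from the Kaplan-Meier product-limit estimator, which can be sketched in a few lines (with toy follow-up data, not the study's):

```python
import numpy as np

def kaplan_meier(time, event):
    """Product-limit estimate of the survival function.

    time  : follow-up time for each patient
    event : 1 if the event (e.g. graft re-intervention) occurred, 0 if censored
    """
    time = np.asarray(time, dtype=float)
    event = np.asarray(event)
    s, times_out, surv = 1.0, [], []
    for t in np.unique(time):
        d = event[time == t].sum()      # events at time t
        n_t = (time >= t).sum()         # patients still at risk at t
        if d > 0:
            s *= 1.0 - d / n_t
            times_out.append(t)
            surv.append(s)
    return np.array(times_out), np.array(surv)

# Toy follow-up data in years (event = 0 means censored)
t = [1, 2, 2, 3, 4, 4, 5, 6, 7, 8]
e = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
times, surv = kaplan_meier(t, e)
print(times)                  # event times: 1, 2, 3, 4, 5, 7
print(np.round(surv, 3))      # survival steps down from 0.9 to about 0.214
```

    Censored patients leave the risk set without forcing a step down, which is exactly how the study's "re-intervention-free survival at 5 years" tolerates incomplete follow-up.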

  10. Factors associated with corneal graft survival in the cornea donor study.

    PubMed

    Sugar, Alan; Gal, Robin L; Kollman, Craig; Raghinaru, Dan; Dontchev, Mariya; Croasdale, Christopher R; Feder, Robert S; Holland, Edward J; Lass, Jonathan H; Macy, Jonathan I; Mannis, Mark J; Smith, Patricia W; Soukiasian, Sarkis H; Beck, Roy W

    2015-03-01

    The Cornea Donor Study (CDS) showed that donor age is not a factor in survival of most penetrating keratoplasties for endothelial disease. Secondary analyses confirm the importance of surgical indication and presence of glaucoma in outcomes at 10 years. To assess the relationship between donor and recipient factors and corneal graft survival in the CDS. Multicenter prospective, double-masked, controlled clinical trial conducted at 80 clinical sites. One hundred five surgeons enrolled 1090 participants undergoing corneal transplant for a moderate-risk condition, principally Fuchs dystrophy or pseudophakic or aphakic corneal edema (PACE). Forty-three eye banks provided corneas. Corneas from donors younger than 66 years and donors 66 years or older were assigned, masked to donor age. Surgery and postoperative care were performed according to the surgeons' usual routines. Participants were followed up for as long as 12 years. Graft failure, defined as a regrafting procedure or a cloudy cornea for 3 consecutive months. The 10-year cumulative probability of graft failure was higher in participants with PACE than in those with Fuchs dystrophy (37% vs 20%; hazard ratio [HR], 2.1 [99% CI, 1.4-3.0]; P < .001) and in participants with a history of glaucoma before penetrating keratoplasty, particularly with prior glaucoma surgery (58% with prior glaucoma surgery and use of medications to lower intraocular pressure at the time of surgery vs 22% with no history of glaucoma surgery or medication use; HR, 4.1 [99% CI, 2.2-7.5]; P < .001). We found trends toward increased graft failure in recipients who were 70 years or older compared with those younger than 60 years (29% vs 19%; HR, 1.2 [99% CI, 0.7-2.1]; P = .04) or were African American (HR, 1.5; P = .11) or who had a history of smoking (35% vs 24%; HR, 1.6 [99% CI, 0.9-2.8]; P = .02). Lower endothelial cell density (ECD) and higher corneal thickness (CT) at 6 months (6% vs 41% for ECD ≥2700 vs <1700 cells

  11. Urine Injury Biomarkers and Risk of Adverse Outcomes in Recipients of Prevalent Kidney Transplants: The Folic Acid for Vascular Outcome Reduction in Transplantation Trial

    PubMed Central

    Carpenter, Myra A.; Weiner, Daniel E.; Levey, Andrew S.; Pfeffer, Marc; Kusek, John W.; Cai, Jianwen; Hunsicker, Lawrence G.; Park, Meyeon; Bennett, Michael; Liu, Kathleen D.; Hsu, Chi-yuan

    2016-01-01

    Recipients of kidney transplants (KTR) are at increased risk for cardiovascular events, graft failure, and death. It is unknown whether urine kidney injury biomarkers are associated with poor outcomes among KTRs. We conducted a post hoc analysis of the Folic Acid for Vascular Outcome Reduction in Transplantation (FAVORIT) Trial using a case-cohort study design, selecting participants with adjudicated cardiovascular events, graft failure, or death. Urine neutrophil gelatinase-associated lipocalin (NGAL), kidney injury molecule-1 (KIM-1), IL-18, and liver-type fatty acid binding protein (L-FABP) were measured in spot urine samples and standardized to urine creatinine concentration. We adjusted for demographics, cardiovascular risk factors, eGFR, and urine albumin-to-creatinine ratio. There were 291 cardiovascular events, 257 graft failure events, and 359 deaths. Each log increase in urine NGAL/creatinine independently associated with a 24% greater risk of cardiovascular events (adjusted hazard ratio [aHR], 1.24; 95% confidence interval [95% CI], 1.06 to 1.45), a 40% greater risk of graft failure (aHR, 1.40; 95% CI, 1.16 to 1.68), and a 44% greater risk of death (aHR, 1.44; 95% CI, 1.26 to 1.65). Urine KIM-1/creatinine and IL-18/creatinine independently associated with greater risk of death (aHR, 1.29; 95% CI, 1.03 to 1.61 and aHR, 1.25; 95% CI, 1.04 to 1.49 per log increase, respectively) but not with risk of cardiovascular events or graft failure. Urine L-FABP did not associate with any study outcomes. In conclusion, among prevalent KTRs, higher urine NGAL, KIM-1, and IL-18 levels independently and differentially associated with greater risk of adverse outcomes. PMID:26538631
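The hazard ratios above are expressed per 1-unit increase in the natural log of the biomarker. As a worked example, such an HR can be re-expressed on a more intuitive scale (e.g. per doubling of the marker) by exponentiation; this is a general rescaling identity, not part of the FAVORIT analysis:

```python
import math

def rescale_hr(hr_per_log_unit, factor):
    """Convert an HR expressed per 1-unit increase in ln(biomarker) to the
    HR for multiplying the biomarker by `factor`: a factor-fold change is a
    ln(factor)-unit change on the log scale, so HR_new = HR ** ln(factor)."""
    return hr_per_log_unit ** math.log(factor)

# NGAL/creatinine HR of 1.24 per log increase (from the abstract),
# re-expressed per doubling of the marker:
print(round(rescale_hr(1.24, 2), 2))  # ≈ 1.16
```

A 1-unit change in ln(x) corresponds to an e-fold (≈2.72×) change in x, so "per log increase" HRs always look larger than "per doubling" HRs for the same association.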

  12. Effects of upper-extremity vascular access creation on cardiac events in patients undergoing coronary artery bypass grafting

    PubMed Central

    Han, Youngjin; Choo, Suk Jung; Kwon, Hyunwook; Lee, Jae Won; Chung, Cheol Hyun; Kim, Hyangkyoung; Kwon, Tae-Won

    2017-01-01

    The present study was conducted to investigate whether upper-extremity vascular access (VA) creation increases the risk for major adverse cardiac events (MACE) and death in patients undergoing coronary artery bypass grafting (CABG) with an in situ left internal thoracic artery (ITA) graft. A total of 111 patients with CABG with a left ITA graft who underwent upper-extremity VA creation were analyzed retrospectively; 93 patients received left VA creation (83.8%, ipsilateral group) and 18 patients received right VA creation (16.2%, contralateral group). The primary outcome was the occurrence of MACE, and the secondary outcome was the composite of MACE or late death. There were no significant differences in the incidence of primary (P = 0.30) or secondary (P = 0.09) outcomes between the two groups. Multivariate regression analysis indicated that prior cerebrovascular accidents (hazard ratio [HR] 3.30; 95% confidence interval [CI] 1.37–7.97; P = 0.01) and type of VA (HR 3.44; 95% CI 1.34–8.82; P = 0.01) were independently associated with MACE; prior peripheral arterial occlusive disease (HR 4.22; 95% CI 1.62–10.98; P<0.01) and type of VA (arteriovenous fistula vs. prosthetic arteriovenous grafting) (HR 3.06; 95% CI, 1.42–6.61; P<0.01) were associated with the composite of MACE or death. The side and location of VA were not associated with MACE or death. Our study showed no definite evidence that ipsilateral VA creation affects the subsequent occurrence of MACE or late death from any cause. The type of VA (a prosthetic arteriovenous grafting) is a significant predictor of the subsequent occurrence of MACE or late death. PMID:28873444

  13. Computationally Optimizing the Compliance of a Biopolymer Based Tissue Engineered Vascular Graft

    PubMed Central

    Harrison, Scott; Tamimi, Ehab; Uhlorn, Josh; Leach, Tim; Vande Geest, Jonathan P.

    2016-01-01

    Coronary heart disease is a leading cause of death among Americans for which coronary artery bypass graft (CABG) surgery is a standard surgical treatment. The success of CABG surgery is impaired by a compliance mismatch between vascular grafts and native vessels. Tissue engineered vascular grafts (TEVGs) have the potential to be compliance matched and thereby reduce the risk of graft failure. Glutaraldehyde (GLUT) vapor-crosslinked gelatin/fibrinogen constructs were fabricated and mechanically tested in a previous study by our research group at 2, 8, and 24 hrs of GLUT vapor exposure. The current study details a computational method that was developed to predict the material properties of our constructs for crosslinking times between 2 and 24 hrs by interpolating the 2, 8, and 24 hrs crosslinking time data. MATLAB and ABAQUS were used to determine the optimal combination of fabrication parameters to produce a compliance matched construct. The validity of the method was tested by creating a 16-hr crosslinked construct of 130 μm thickness and comparing its compliance to that predicted by the optimization algorithm. The predicted compliance of the 16-hr construct was 0.00059 mm Hg−1 while the experimentally determined compliance was 0.00065 mm Hg−1, a relative difference of 9.2%. Prior data in our laboratory have shown the compliance of the left anterior descending porcine coronary (LADC) artery to be 0.00071 ± 0.0003 mm Hg−1. Our optimization algorithm predicts that a 258-μm-thick construct that is GLUT vapor crosslinked for 8.1 hrs would match LADC compliance. This result is consistent with our previous work demonstrating that an 8-hr GLUT vapor crosslinked construct produces a compliance that is not significantly different from a porcine coronary LADC. PMID:26593773
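The abstract does not publish the measured compliance values at each crosslinking time, so the sketch below uses invented placeholder numbers; it only illustrates the core interpolation idea of predicting compliance at an intermediate crosslinking time from data at 2, 8, and 24 hrs:

```python
# Piecewise-linear interpolation of construct compliance versus GLUT
# crosslinking time. The (time, compliance) pairs are illustrative
# placeholders, NOT the study's measured data.

def interp(x, pts):
    """Linear interpolation over (x_i, y_i) pairs sorted by x."""
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    raise ValueError("x outside interpolation range")

# (crosslink hrs, compliance in mm Hg^-1): hypothetical values with the
# qualitative trend of stiffening (lower compliance) at longer exposure.
pts = [(2, 0.00090), (8, 0.00071), (24, 0.00050)]
print(interp(16.0, pts))  # predicted compliance at 16 hr crosslinking
```

The study's actual workflow couples this kind of material-property interpolation to a finite-element model (ABAQUS) to search over thickness and crosslinking time; the snippet shows only the interpolation step.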

  14. Transcriptomic Analysis Provides Insights into Grafting Union Development in Pecan (Carya illinoinensis)

    PubMed Central

    Mo, Zhenghai; Feng, Gang; Su, Wenchuan; Liu, Zhuangzhuang; Peng, Fangren

    2018-01-01

    Pecan (Carya illinoinensis), as a popular nut tree, has been widely planted in China in recent years. Grafting is an important technique for its cultivation. Successful grafting generally requires the formation of callus and vascular bundles at the graft union. To explore the molecular mechanism of graft union development, we applied high-throughput RNA sequencing to investigate the transcriptomic profiles of the graft union at four timepoints (0 days, 8 days, 15 days, and 30 days) during the pecan grafting process. After de novo assembly, 83,693 unigenes were obtained, and 40,069 of them were annotated. A total of 12,180 differentially expressed genes were identified across the four timepoints. Genes involved in hormone signaling, cell proliferation, xylem differentiation, cell elongation, secondary cell wall deposition, programmed cell death, and reactive oxygen species (ROS) scavenging showed significant differential expression during graft union development. In addition, we found that auxin, cytokinin, and gibberellin accumulated at the graft unions during the grafting process. These results will aid our understanding of successful grafting in the future. PMID:29401757

  15. The Endurant Stent Graft System: 15-month follow-up report in patients with challenging abdominal aortic anatomies.

    PubMed

    Hyhlik-Dürr, Alexander; Weber, Tim F; Kotelis, Drossos; Rengier, Fabian; Gahlen, Johannes; Böck, Stefanie; Köhler, Jürgen; Ratusinski, Christoph-M; Böckler, Dittmar

    2011-08-01

    The objective of this study is to report a 15-month follow-up with the Endurant Stent Graft System in patients with challenging aortic anatomies. At three German clinics, a consecutive series of 50 patients underwent endovascular abdominal aortic repair (EVAR) for challenging abdominal aortic aneurysms with the Endurant stent graft between November 2008 and May 2009. EVAR was elective in 48 cases and emergent in two. Patients had short (≤15 mm) aortic necks, severe suprarenal/infrarenal angulation, and/or small (<8 mm), calcified, severely angulated, or tortuous iliac or femoral access vessels. Additionally, a cohort of 40 patients without challenging anatomies was retrospectively analysed to clarify differences in technical success, mortality, and morbidity between these groups. The primary technical success rate was 92% (46/50). The 30-day mortality rate was 2% (1/50), with the single death due to multiorgan failure. Intraoperative angiograms revealed three type I endoleaks (2 proximal and 1 distal), one of which persisted at 30 days (30-day rate, 2%). Postoperative imaging discovered no further type I or type III endoleaks. The 30-day rate of type II endoleak was 6% (3/50). There were two cases of graft limb occlusion, both requiring reintervention within 30 days. Follow-up was available in all 50 patients (100%) over a median of 15 months (1-25). During this time, seven further patients died (overall mortality, 16%; 8/50); apart from the patient described above, all deaths were unrelated to the procedure. Compared with the 30-day results with the Endurant stent graft in non-challenging anatomies (no type I endoleak; no graft limb occlusion; all-cause mortality, 0%), procedure-related complications are more frequent in challenging anatomies. Nevertheless, early and 15-month results with the Endurant stent graft in patients with challenging aortic anatomies are encouraging.

  16. Aortic Replacement with Sutureless Intraluminal Grafts

    PubMed Central

    Lemole, Gerald M.

    1990-01-01

    To avoid the anastomotic complications and long cross-clamp times associated with standard suture repair of aortic lesions, we have implanted sutureless intraluminal grafts in 122 patients since 1976. Forty-nine patients had disorders of the ascending aorta, aortic arch, or both: their operative mortality was 14% (7 patients), and the group's 5-year actuarial survival rate has been 64%. There have been no instances of graft dislodgment, graft infection, aortic bleeding, or pseudoaneurysm formation. Forty-two patients had disorders of the descending aorta and thoracoabdominal aorta: their early mortality was 10% (4 patients), and the group's 5-year actuarial survival rate has been 56%. There was 1 early instance of graft dislodgment, but no pseudoaneurysm formation, graft erosion, aortic bleeding, intravascular hemolysis, or permanent deficits in neurologic, renal, or vascular function. Thirty-one patients had the sutureless intraluminal graft implanted in the abdominal aortic position: their early mortality was 6% (2 patients), and the 5-year actuarial survival rate for this group has been 79%. There were no instances of renal failure, ischemic complication, postoperative paraplegia, pseudoaneurysm, or anastomotic true aneurysm. Our recent efforts have been directed toward developing an adjustable spool that can adapt to the widest aorta or the narrowest aortic arch vessel; but in the meanwhile, the present sutureless graft yields shorter cross-clamp times, fewer intraoperative complications, and both early and late results as satisfactory as those afforded by traditional methods of aortic repair. (Texas Heart Institute Journal 1990; 17:302-9) Images PMID:15227522

  17. On prognostic models, artificial intelligence and censored observations.

    PubMed

    Anand, S S; Hamilton, P W; Hughes, J G; Bell, D A

    2001-03-01

    The development of prognostic models for assisting medical practitioners with decision making is not a trivial task. Models need to possess a number of desirable characteristics, and few, if any, current modelling approaches based on statistics or artificial intelligence can produce models that display all of them. The inability of modelling techniques to provide truly useful models has meant that interest in these models remains largely academic. This in turn has resulted in only a very small percentage of the models that have been developed being deployed in practice. On the other hand, new modelling paradigms are continuously being proposed within the machine learning and statistical communities, and claims, often based on inadequate evaluation, are made about their superiority over traditional modelling methods. We believe that for new modelling approaches to deliver true net benefits over traditional techniques, an evaluation-centric approach to their development is essential. In this paper we present such an evaluation-centric approach to developing extensions to the basic k-nearest neighbour (k-NN) paradigm. We use standard statistical techniques to enhance the distance metric used and a framework based on evidence theory to obtain a prediction for the target example from the outcomes of the retrieved exemplars. We refer to this new k-NN algorithm as Censored k-NN (Ck-NN). This reflects the enhancements made to k-NN that are aimed at providing a means for handling censored observations within k-NN.
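The abstract leaves the Ck-NN mechanics unspecified, so the sketch below shows one simple, generic way censored neighbours can contribute to a k-NN prediction at a fixed horizon: a neighbour censored after the horizon is a known survivor, while one censored before the horizon carries no usable information. This is an illustration of the general idea, not the authors' evidence-theory algorithm:

```python
# Hypothetical sketch of k-NN prediction with right-censored neighbours.
# Not the Ck-NN algorithm of Anand et al.; just one way to handle censoring.
import math

def knn_event_prob(query, neighbours, k, horizon):
    """neighbours: list of (features, time, event) with event=1 if observed.
    Returns the estimated P(event by `horizon`) among the k nearest
    neighbours whose outcome at the horizon is actually known."""
    ranked = sorted(neighbours, key=lambda n: math.dist(query, n[0]))
    votes = []
    for feats, time, event in ranked:
        if event and time <= horizon:
            votes.append(1)   # event observed within the horizon
        elif time > horizon:
            votes.append(0)   # known to be event-free at the horizon
        # censored before the horizon: outcome unknown, skip
        if len(votes) == k:
            break
    return sum(votes) / len(votes)

nb = [((0.0, 0.0), 2.0, 1), ((0.1, 0.0), 9.0, 0),
      ((0.2, 0.1), 1.0, 0), ((3.0, 3.0), 1.0, 1)]
print(knn_event_prob((0.0, 0.1), nb, k=2, horizon=5.0))  # → 0.5
```

Note the third neighbour (censored at t=1.0, before the horizon) is skipped rather than counted as a non-event, avoiding the bias that naive k-NN classification would introduce.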

  18. How the Mind of a Censor Works: The Psychology of Censorship.

    ERIC Educational Resources Information Center

    Fine, Sara

    1996-01-01

    Explores censorship and examines it as a human dynamic. Discusses the authoritarian personality, the need to control, traditionalism and the need to belong to a group, the influence of family, denial, and authoritarian women. Describes the importance of listening to "the Censor" in order to encourage dialogue and how to use effective…

  19. Recent progresses in outcome-dependent sampling with failure time data.

    PubMed

    Ding, Jieli; Lu, Tsui-Shan; Cai, Jianwen; Zhou, Haibo

    2017-01-01

    An outcome-dependent sampling (ODS) design is a retrospective sampling scheme in which one observes the primary exposure variables with a probability that depends on the observed value of the outcome variable. When the outcome of interest is failure time, the observed data are often censored. By allowing the selection of the supplemental samples to depend on whether the event of interest occurs, and by oversampling subjects from the most informative regions, an ODS design for time-to-event data can reduce the cost of a study and improve its efficiency. We review recent progress and advances in research on ODS designs with failure time data. This includes research on ODS-related designs such as the case-cohort design, generalized case-cohort design, stratified case-cohort design, general failure-time ODS design, length-biased sampling design, and interval sampling design.

  20. Recent progresses in outcome-dependent sampling with failure time data

    PubMed Central

    Ding, Jieli; Lu, Tsui-Shan; Cai, Jianwen; Zhou, Haibo

    2016-01-01

    An outcome-dependent sampling (ODS) design is a retrospective sampling scheme in which one observes the primary exposure variables with a probability that depends on the observed value of the outcome variable. When the outcome of interest is failure time, the observed data are often censored. By allowing the selection of the supplemental samples to depend on whether the event of interest occurs, and by oversampling subjects from the most informative regions, an ODS design for time-to-event data can reduce the cost of a study and improve its efficiency. We review recent progress and advances in research on ODS designs with failure time data. This includes research on ODS-related designs such as the case-cohort design, generalized case-cohort design, stratified case-cohort design, general failure-time ODS design, length-biased sampling design, and interval sampling design. PMID:26759313
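Of the designs listed, the case-cohort design is the simplest to sketch: keep every failure ("case") plus a random subcohort, and give sampled non-cases an inverse-probability (Horvitz-Thompson style) weight so the sample still represents the full cohort. Illustrative Python with invented data:

```python
# Case-cohort flavour of outcome-dependent sampling: all cases are kept
# with weight 1; non-cases enter with probability `subcohort_frac` and are
# up-weighted by 1/subcohort_frac. Illustrative sketch only.
import random

def case_cohort(population, subcohort_frac, seed=0):
    """population: list of (id, event). Returns [(id, event, weight)]."""
    rng = random.Random(seed)
    sample = []
    for pid, event in population:
        if event:
            sample.append((pid, event, 1.0))            # keep every case
        elif rng.random() < subcohort_frac:
            sample.append((pid, event, 1.0 / subcohort_frac))
    return sample

pop = [(i, 1 if i % 10 == 0 else 0) for i in range(100)]  # 10% failures
sub = case_cohort(pop, subcohort_frac=0.2)
cases = sum(1 for _, e, _ in sub if e)
print(cases)  # all 10 cases retained
```

Because every event is observed while only a fraction of non-events is phenotyped, expensive exposure measurement is concentrated where the information is, which is the efficiency argument the review makes for ODS designs generally.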

  1. Waitlist Outcomes for Patients Relisted Following Failed Donation After Cardiac Death Liver Transplant: Implications for Awarding Model for End-Stage Liver Disease Exception Scores.

    PubMed

    Croome, K P; Lee, D D; Nguyen, J H; Keaveny, A P; Taner, C B

    2017-09-01

    Understanding of outcomes for patients relisted for ischemic cholangiopathy following a donation after cardiac death (DCD) liver transplant (LT) will help standardization of a Model for End-Stage Liver Disease exception scheme for retransplantation. Early relisting (E-RL) for DCD graft failure caused by primary nonfunction (PNF) or hepatic artery thrombosis (HAT) was defined as relisting ≤14 days after DCD LT, and late relisting (L-RL) due to biliary complications was defined as relisting 14 days to 3 years after DCD LT. Of 3908 DCD LTs performed nationally between 2002 and 2016, 540 (13.8%) patients were relisted within 3 years of transplant (168 [4.3%] in the E-RL group, 372 [9.5%] in the L-RL group). The E-RL and L-RL groups had waitlist mortality rates of 15.4% and 10.5%, respectively, at 3 mo and 16.1% and 14.3%, respectively, at 1 year. Waitlist mortality in the L-RL group was higher than mortality and delisting rates for patients with exception points for both hepatocellular carcinoma (HCC) and hepatopulmonary syndrome (HPS) at 3- to 12-mo time points (p < 0.001). Waitlist outcomes differed in patients with early DCD graft failure caused by PNF or HAT compared with those with late DCD graft failure attributed to biliary complications. In L-RL, higher rates of waitlist mortality were noted compared with patients listed with exception points for HCC or HPS. © 2017 The American Society of Transplantation and the American Society of Transplant Surgeons.

  2. ACL graft constructs: In-vitro fatigue testing highlights the occurrence of irrecoverable lengthening and the need for adequate (pre)conditioning to avert the recurrence of knee instability.

    PubMed

    Blythe, A; Tasker, T; Zioupos, P

    2006-01-01

    The performance of ACL grafts in both the short and long term is only as good as the condition of the graft at the time of surgery. If the graft lengthens under load at the two fixation ends, incorporation will take longer to occur. Previous studies have shown that the various grafts currently used are strong enough. However, data on strength came primarily from quasistatic single-pull-to-failure tests with, in some cases, modest cycling to precondition the grafts. The present study examined the in-vitro biomechanical behaviour of model ACL grafts, which were fatigue cycled to failure over a wide range of loads in physiological ambient conditions. Load/deformation curves and the stretch of the grafts were continuously recorded until final rupture. The grafts demonstrated typical creep-rupture-like behaviour, with elongation (non-recoverable stretch) and loss of stiffness leading to gradual failure. Some of the graft designs were consistently shown to elongate by up to 20 mm within the first 2000 cycles at moderate physiological loads, and a further 10 mm of elongation occurred between the initial preconditioned state and just prior to complete rupture. Not enough attention has previously been paid to the likely long-term elongation patterns of ACL grafts post-surgery, even after the usual empirical preconditioning has been performed by the surgeon. Increased graft dimensions may result in recurrent knee instability and may also lead to failure of the graft to incorporate. Preconditioning in-vitro may still be a way to remove some slack and prepare the graft for its operational environment, in particular by stiffening the tissue/fixation interface for those grafts that use soft polymer fixation ends.

  3. Use of ATG-Fresenius as an Induction Agent in Deceased-Donor Kidney Transplantation.

    PubMed

    Yilmaz, M; Sezer, T Ö; Kir, O; Öztürk, A; Hoşcoşkun, C; Töz, H

    2017-04-01

    Anti-T-lymphocyte globulins (ATG) are most commonly used as induction agents in kidney transplantation (KT). In this study, we investigated the use of ATG as induction therapy in deceased-donor KT. Among 152 deceased-donor KT recipients transplanted between January 2009 and December 2013, 147 with exact data were enrolled in this study. Delayed graft function was defined as dialysis requirement after KT. Greater than 10% panel-reactive antibody (PRA) was considered positive. Total ATG (rATG-Fresenius) dosage and induction duration were evaluated. Mean age was 45 ± 10 years; 91 patients were male and 56 patients were female. Class I and class II PRA-positive patient numbers were 20 (13.6%) and 17 (11.5%), respectively. Pre-transplant dialysis vintage was 108 ± 63 months. Mean donor age was 42 ± 17 years, and cold ischemia time was 16 ± 5 hours. Eighty-nine patients (60%) had delayed graft function and needed at least one session of hemodialysis after transplantation. Cumulative ATG-F dosage was 676 ± 274 mg. The mean ATG-F cumulative dosage was 10.6 ± 3.8 mg/kg. At the end of the first year, mean creatinine and proteinuria levels were 1.4 ± 1.0 mg/dL and 0.3 ± 0.4 g/d, respectively. Mean follow-up time was 32 ± 20 months. During follow-up, there were 14 graft failures and 11 patients died. Patient survival at 1 and 2 years was 93% and 92.3%, respectively. Death-censored graft survival rates at 1 and 2 years were 94.8% and 90.8%, respectively. ATG-F induction provides acceptable graft and patient survival in deceased-donor KT. ATG-F infusion is well tolerated. Infection rates seem to be acceptable compared with all transplantation populations. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Fast genomic predictions via Bayesian G-BLUP and multilocus models of threshold traits including censored Gaussian data.

    PubMed

    Kärkkäinen, Hanni P; Sillanpää, Mikko J

    2013-09-04

    Because of the increased availability of genome-wide sets of molecular markers along with reduced cost of genotyping large samples of individuals, genomic estimated breeding values have become an essential resource in plant and animal breeding. Bayesian methods for breeding value estimation have proven to be accurate and efficient; however, the ever-increasing data sets are placing heavy demands on the parameter estimation algorithms. Although a commendable number of fast estimation algorithms are available for Bayesian models of continuous Gaussian traits, there is a shortage of corresponding models for discrete or censored phenotypes. In this work, we consider a threshold approach for binary, ordinal, and censored Gaussian observations in Bayesian multilocus association models and Bayesian genomic best linear unbiased prediction and present a high-speed generalized expectation maximization algorithm for parameter estimation under these models. We demonstrate our method with simulated and real data. Our example analyses suggest that the use of the extra information present in an ordered categorical or censored Gaussian data set, instead of dichotomizing the data into case-control observations, increases the accuracy of genomic breeding values predicted by Bayesian multilocus association models or by Bayesian genomic best linear unbiased prediction. Furthermore, the example analyses indicate that the correct threshold model is more accurate than the directly used Gaussian model with censored Gaussian data, while with binary or ordinal data the superiority of the threshold model could not be confirmed.

  5. Fast Genomic Predictions via Bayesian G-BLUP and Multilocus Models of Threshold Traits Including Censored Gaussian Data

    PubMed Central

    Kärkkäinen, Hanni P.; Sillanpää, Mikko J.

    2013-01-01

    Because of the increased availability of genome-wide sets of molecular markers along with reduced cost of genotyping large samples of individuals, genomic estimated breeding values have become an essential resource in plant and animal breeding. Bayesian methods for breeding value estimation have proven to be accurate and efficient; however, the ever-increasing data sets are placing heavy demands on the parameter estimation algorithms. Although a commendable number of fast estimation algorithms are available for Bayesian models of continuous Gaussian traits, there is a shortage of corresponding models for discrete or censored phenotypes. In this work, we consider a threshold approach for binary, ordinal, and censored Gaussian observations in Bayesian multilocus association models and Bayesian genomic best linear unbiased prediction and present a high-speed generalized expectation maximization algorithm for parameter estimation under these models. We demonstrate our method with simulated and real data. Our example analyses suggest that the use of the extra information present in an ordered categorical or censored Gaussian data set, instead of dichotomizing the data into case-control observations, increases the accuracy of genomic breeding values predicted by Bayesian multilocus association models or by Bayesian genomic best linear unbiased prediction. Furthermore, the example analyses indicate that the correct threshold model is more accurate than the directly used Gaussian model with censored Gaussian data, while with binary or ordinal data the superiority of the threshold model could not be confirmed. PMID:23821618
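The likelihood treatment of a censored Gaussian observation can be made concrete: an observed value contributes the normal density, while a right-censored value contributes the upper-tail probability. The sketch below illustrates only this tobit-style likelihood; it is not the paper's generalized expectation maximization algorithm:

```python
# Tobit-style log-likelihood for right-censored Gaussian data: an observed
# value y contributes log phi((y-mu)/s) - log s; a value known only to
# exceed c contributes log(1 - Phi((c-mu)/s)). Illustrative sketch only.
import math

def norm_logpdf(y, mu, s):
    return -0.5 * math.log(2 * math.pi * s * s) - (y - mu) ** 2 / (2 * s * s)

def norm_logsf(c, mu, s):
    """Log upper-tail probability of N(mu, s^2), via the complementary
    error function: 1 - Phi(z) = 0.5 * erfc(z / sqrt(2))."""
    return math.log(0.5 * math.erfc((c - mu) / (s * math.sqrt(2))))

def censored_gaussian_loglik(data, mu, s):
    """data: list of (value, censored); censored=True means y > value."""
    return sum(norm_logsf(y, mu, s) if cens else norm_logpdf(y, mu, s)
               for y, cens in data)

data = [(1.2, False), (0.7, False), (2.0, True)]  # last point censored at 2.0
print(censored_gaussian_loglik(data, mu=1.0, s=1.0))
```

This is the extra information the abstract refers to: the censored point is not discarded or imputed at its bound, but enters through the probability mass above the censoring threshold.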

  6. Preclinical testing for aortic endovascular grafts: results of a Food and Drug Administration workshop.

    PubMed

    Abel, Dorothy B; Beebe, Hugh G; Dedashtian, Mark M; Morton, Michael C; Moynahan, Megan; Smith, Louis J; Weinberg, Steven L

    2002-05-01

    Since their introduction into clinical trials in the United States, endovascular aortic grafts have shown various types of problems. Although details of design and construction vary between different endovascular grafts, and failure modes have had a variety of causes and clinical effects, the inability of preclinical testing to predict these failures remains common to all endovascular grafts. The need to improve preclinical testing in an attempt to reduce clinical device failures resulted in a Food and Drug Administration-sponsored workshop on endovascular graft preclinical testing held in Rockville, Md, from July 31 to August 1, 2001. FORMAT: The workshop was not designed as a consensus conference. Instead, it provided a forum for bringing stakeholders together to define problems and identify areas of agreement and disagreement. The workshop had 34 invited participants who represented device manufacturers, the medical community, the Food and Drug Administration, and testing facilities, and international attendance was more than 120 people. Discussion centered on (1) defining the physiologic, anatomic, and morphologic characteristics of abdominal aortic aneurysms before and after endovascular graft treatment; (2) identifying the types of failures that have been observed clinically; and (3) determining which characteristics should be considered during preclinical modeling to better predict clinical performance. Attendees agreed on the need to better define and address anatomic characteristics and changes in the aneurysm after endograft treatment to optimize preclinical testing. Much discussion and little agreement occurred on the importance of flow-related forces on graft performance or the need and ability to define and model physiologic compliance during durability testing. The discussion and conclusions are summarized in this paper and are provided in detail at: http://www.fda.gov/cdrh/meetings/073101workshop.html. The workshop raised awareness of significant

  7. Quantifying and comparing dynamic predictive accuracy of joint models for longitudinal marker and time-to-event in presence of censoring and competing risks.

    PubMed

    Blanche, Paul; Proust-Lima, Cécile; Loubère, Lucie; Berr, Claudine; Dartigues, Jean-François; Jacqmin-Gadda, Hélène

    2015-03-01

    Thanks to the growing interest in personalized medicine, joint modeling of longitudinal marker and time-to-event data has recently started to be used to derive dynamic individual risk predictions. Individual predictions are called dynamic because they are updated when information on the subject's health profile grows with time. We focus in this work on statistical methods for quantifying and comparing dynamic predictive accuracy of this kind of prognostic models, accounting for right censoring and possibly competing events. Dynamic area under the ROC curve (AUC) and Brier Score (BS) are used to quantify predictive accuracy. Nonparametric inverse probability of censoring weighting is used to estimate dynamic curves of AUC and BS as functions of the time at which predictions are made. Asymptotic results are established and both pointwise confidence intervals and simultaneous confidence bands are derived. Tests are also proposed to compare the dynamic prediction accuracy curves of two prognostic models. The finite sample behavior of the inference procedures is assessed via simulations. We apply the proposed methodology to compare various prediction models using repeated measures of two psychometric tests to predict dementia in the elderly, accounting for the competing risk of death. Models are estimated on the French Paquid cohort and predictive accuracies are evaluated and compared on the French Three-City cohort. © 2014, The International Biometric Society.
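The inverse probability of censoring weighting used for the Brier score can be sketched compactly: subjects censored before the prediction horizon drop out, and the remaining contributions are re-weighted by the censoring survival function G. A minimal illustration, with G supplied as a callable (in practice G is itself estimated, e.g. by a Kaplan-Meier fit to the censoring times):

```python
# IPCW Brier score at horizon t. Subjects with an event by t are weighted
# by 1/G(T); subjects still event-free at t by 1/G(t); subjects censored
# before t get weight zero. `risk[i]` is the model's predicted P(event<=t).

def ipcw_brier(times, events, risk, t, G):
    n = len(times)
    total = 0.0
    for T, d, p in zip(times, events, risk):
        if d and T <= t:        # event observed before the horizon
            total += (1.0 - p) ** 2 / G(T)
        elif T > t:             # known event-free at the horizon
            total += (0.0 - p) ** 2 / G(t)
        # censored before t: contributes nothing directly
    return total / n

# With no censoring, G ≡ 1 and the score reduces to ordinary MSE:
times = [1.0, 2.0, 5.0, 6.0]
events = [1, 1, 1, 1]
risk = [0.9, 0.8, 0.2, 0.1]
print(ipcw_brier(times, events, risk, t=3.0, G=lambda u: 1.0))
```

The weighting compensates for the dropped censored subjects, which is what makes the estimated accuracy curves unbiased under independent censoring, the setting the paper's asymptotic results address.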

  8. Dialysis Vintage and Outcomes after Kidney Transplantation: A Retrospective Cohort Study

    PubMed Central

    Haller, Maria C.; Kainz, Alexander; Baer, Heather

    2017-01-01

    Background and objectives Historically, length of pretransplant dialysis was associated with premature graft loss and mortality after kidney transplantation, but with recent advancements in RRT it is unclear whether this negative association still exists. Design, setting, participants, &measurements This is a retrospective cohort study evaluating 6979 first kidney allograft recipients from the Austrian Registry transplanted between 1990 and 2013. Duration of pretransplant dialysis treatment was used as categoric predictor classified by tertiles of the distribution of time on dialysis. A separate category for pre-emptive transplantation was added and defined as kidney transplantation without any dialysis preceding the transplant. Outcomes were death-censored graft loss, all-cause mortality, and the composite of both. Results Median duration of follow-up was 8.2 years, and 1866 graft losses and 2407 deaths occurred during the study period. Pre-emptive transplantation was associated with a lower risk of graft loss (hazard ratio, 0.76; 95% confidence interval, 0.59 to 0.98), but not in subgroup analyses excluding living transplants and transplants performed since 2000. The association between dialysis duration and graft loss did not depend on the year of transplantation (P=0.40) or donor source (P=0.92). Longer waiting time on dialysis was not associated with a higher rate of graft loss, but the rate of death was higher in patients on pretransplant dialysis for >1.5 years (hazard ratio, 1.62; 95% confidence interval, 1.43 to 1.83) compared with pretransplant dialysis for <1.5 years. Conclusions Our findings support the evidence that pre-emptive transplantation is associated with superior graft survival compared with pretransplant dialysis, although this association was weaker in transplants performed since 2000. 
However, our analysis shows that length of dialysis was no longer associated with a higher rate of graft loss, although longer waiting times on dialysis were associated with a higher rate of death.
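
The outcome used throughout these records, death-censored graft loss, treats death with a functioning graft as censoring rather than as a failure event. A minimal illustrative sketch of deriving the analysis variables (field names and times are hypothetical, not from this registry):

```python
# Sketch: deriving death-censored graft-failure outcomes from follow-up records.
# Field names are hypothetical; times are in years since transplantation.

def death_censored_outcome(record):
    """Return (time, event): event=1 only for graft failure;
    death with a functioning graft is treated as censoring."""
    if record["graft_failed"]:
        return record["time_to_graft_failure"], 1
    if record["died"]:
        return record["time_to_death"], 0      # censored at death
    return record["time_to_last_followup"], 0  # administratively censored

records = [
    {"graft_failed": True,  "time_to_graft_failure": 4.2, "died": False,
     "time_to_death": None, "time_to_last_followup": 4.2},
    {"graft_failed": False, "time_to_graft_failure": None, "died": True,
     "time_to_death": 6.5,  "time_to_last_followup": 6.5},
    {"graft_failed": False, "time_to_graft_failure": None, "died": False,
     "time_to_death": None, "time_to_last_followup": 8.0},
]

outcomes = [death_censored_outcome(r) for r in records]
print(outcomes)  # [(4.2, 1), (6.5, 0), (8.0, 0)]
```

The same records would be recoded differently for the all-cause-mortality and composite outcomes, which is why studies such as this one report the endpoints separately.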

  9. Prediction of Emergent Heart Failure Death by Semi-Quantitative Triage Risk Stratification

    PubMed Central

    Van Spall, Harriette G. C.; Atzema, Clare; Schull, Michael J.; Newton, Gary E.; Mak, Susanna; Chong, Alice; Tu, Jack V.; Stukel, Thérèse A.; Lee, Douglas S.

    2011-01-01

    Objectives Generic triage risk assessments are widely used in the emergency department (ED), but have not been validated for prediction of short-term risk among patients with acute heart failure (HF). Our objective was to evaluate the Canadian Triage Acuity Scale (CTAS) for prediction of early death among HF patients. Methods We included patients presenting with HF to an ED in Ontario from Apr 2003 to Mar 2007. We used the National Ambulatory Care Reporting System and vital statistics databases to examine care and outcomes. Results Among 68,380 patients (76±12 years, 49.4% men), early mortality was stratified with death rates of 9.9%, 1.9%, 0.9%, and 0.5% at 1-day, and 17.2%, 5.9%, 3.8%, and 2.5% at 7-days, for CTAS 1, 2, 3, and 4–5, respectively. Compared to lower acuity (CTAS 4–5) patients, adjusted odds ratios (aOR) for 1-day death were 1.32 (95%CI; 0.93–1.88; p = 0.12) for CTAS 3, 2.41 (95%CI; 1.71–3.40; p<0.001) for CTAS 2, and highest for CTAS 1: 9.06 (95%CI; 6.28–13.06; p<0.001). Predictors of triage-critical (CTAS 1) status included oxygen saturation <90% (aOR 5.92, 95%CI; 3.09–11.81; p<0.001), respiratory rate >24 breaths/minute (aOR 1.96, 95%CI; 1.05–3.67; p = 0.034), and arrival by paramedic (aOR 3.52, 95%CI; 1.70–8.02; p = 0.001). While age/sex-adjusted CTAS score provided good discrimination for ED (c-statistic = 0.817) and 1-day (c-statistic = 0.724) death, mortality prediction was improved further after accounting for cardiac and non-cardiac co-morbidities (c-statistics 0.882 and 0.810, respectively; both p<0.001). Conclusions A semi-quantitative triage acuity scale assigned at ED presentation and based largely on respiratory factors predicted emergent death among HF patients. PMID:21853068

  10. Endothelial dysfunction in patients with chronic heart failure is independently associated with increased incidence of hospitalization, cardiac transplantation, or death.

    PubMed

    Fischer, D; Rossa, S; Landmesser, U; Spiekermann, S; Engberding, N; Hornig, B; Drexler, H

    2005-01-01

Endothelial dysfunction of coronary and peripheral arteries has been demonstrated in patients with chronic heart failure (CHF) and appears to have functional implications. However, it is unknown whether endothelial dysfunction in CHF is independently associated with impaired outcome or progression of the disease. We assessed the follow-up of 67 consecutive patients with CHF [New York Heart Association (NYHA) functional class II-III] in whom flow-dependent, endothelium-mediated vasodilation (FDD) of the radial artery was assessed by high resolution ultrasound. The primary endpoint was defined by cardiac death, hospitalization due to worsening of heart failure (NYHA class IV, pulmonary oedema), or heart transplantation. Cox regression analysis was used to determine whether FDD was associated with these heart failure-related events. During a median follow-up of 45.7 months, 24 patients had an event: 18 patients were hospitalized for worsening heart failure or underwent heart transplantation, and six patients died of cardiac causes. Cox regression analysis demonstrated that FDD (P<0.01), diabetes mellitus (P<0.01), and ejection fraction (P<0.01) were independent predictive factors for the occurrence of the primary endpoint. The Kaplan-Meier survival curve revealed a significantly better clinical outcome in patients with FDD above the median (6.2%) compared with those with FDD below the median (P<0.013). These observations suggest that endothelium-mediated vasodilation represents an independent predictor of cardiac death and hospitalization in patients with CHF, consistent with the notion that endothelium-derived nitric oxide may play a protective role in heart failure.
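
Several of these records compare groups with Kaplan-Meier curves, as this one does for patients above versus below the median FDD. A minimal product-limit sketch in pure Python (invented follow-up data, distinct event times assumed) shows how the curve is built by multiplying conditional survival probabilities at each event time:

```python
# Minimal Kaplan-Meier (product-limit) estimator. Pure-Python sketch with
# invented data; assumes distinct times (no ties) for simplicity.

def kaplan_meier(times, events):
    """Return [(t, S(t))] at each event time; events[i]=1 if event, 0 if censored."""
    at_risk = len(times)
    surv, out = 1.0, []
    for t, e in sorted(zip(times, events)):
        if e == 1:
            surv *= (at_risk - 1) / at_risk  # conditional survival past t
            out.append((t, surv))
        at_risk -= 1                         # censored subjects leave the risk set
    return out

# Hypothetical follow-up times (months) and event indicators for one FDD group:
times = [3, 7, 9, 12, 15, 20, 24]
events = [1, 0, 1, 1, 0, 1, 0]
for t, s in kaplan_meier(times, events):
    print(f"t={t:>2}  S(t)={s:.3f}")
```

The log-rank test used to compare two such curves, and the Cox model used for the adjusted analysis, both operate on the same risk-set bookkeeping shown here.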

  11. Development of the mechanical properties of engineered skin substitutes after grafting to full-thickness wounds.

    PubMed

    Sander, Edward A; Lynch, Kaari A; Boyce, Steven T

    2014-05-01

Engineered skin substitutes (ESSs) have been reported to close full-thickness burn wounds but are subject to loss from mechanical shear due to their deficiencies in tensile strength and elasticity. Hypothetically, if the mechanical properties of ESS matched those of native skin, losses due to shear or fracture could be reduced. To consider modifications of the composition of ESS to improve homology with native skin, biomechanical analyses of the current composition of ESS were performed. ESSs consist of a degradable biopolymer scaffold of type I collagen and chondroitin-sulfate (CGS) that is populated sequentially with cultured human dermal fibroblasts (hF) and epidermal keratinocytes (hK). In the current study, the hydrated biopolymer scaffold (CGS), the scaffold populated with hF (the dermal skin substitute, DSS), and the complete ESS were evaluated mechanically for linear stiffness (N/mm), ultimate tensile load at failure (N), maximum extension at failure (mm), and energy absorbed up to the point of failure (N-mm). These biomechanical end points were also used to evaluate ESS at six weeks after grafting to full-thickness skin wounds in athymic mice and compared to murine autograft or excised murine skin. The data showed statistically significant differences (p <0.05) between ESS in vitro and after grafting for all four structural properties. Grafted ESS differed statistically from murine autograft with respect to maximum extension at failure, and from intact murine skin with respect to linear stiffness and maximum extension. These results demonstrate rapid changes in mechanical properties of ESS after grafting that are comparable to murine autograft. These values provide instruction for improvement of the biomechanical properties of ESS in vitro that may reduce clinical morbidity from graft loss.

  12. Octogenarian liver grafts: Is their use for transplant currently justified?

    PubMed

    Jiménez-Romero, Carlos; Cambra, Felix; Caso, Oscar; Manrique, Alejandro; Calvo, Jorge; Marcacuzco, Alejandro; Rioja, Paula; Lora, David; Justo, Iago

    2017-05-07

    To analyse the impact of octogenarian donors in liver transplantation. We present a retrospective single-center study, performed between November 1996 and March 2015, that comprises a sample of 153 liver transplants. Recipients were divided into two groups according to liver donor age: recipients of donors ≤ 65 years (group A; n = 102), and recipients of donors ≥ 80 years (group B; n = 51). A comparative analysis between the groups was performed. Quantitative variables were expressed as mean values and SD, and qualitative variables as percentages. Differences in properties between qualitative variables were assessed by χ2 test. Comparison of quantitative variables was made by t-test. Graft and patient survivals were estimated using the Kaplan-Meier method. One, 3 and 5-year overall patient survival was 87.3%, 84% and 75.2%, respectively, in recipients of younger grafts vs 88.2%, 84.1% and 66.4%, respectively, in recipients of octogenarian grafts (P = 0.748). One, 3 and 5-year overall graft survival was 84.3%, 83.1% and 74.2%, respectively, in recipients of younger grafts vs 84.3%, 79.4% and 64.2%, respectively, in recipients of octogenarian grafts (P = 0.524). After excluding the patients with hepatitis C virus cirrhosis (16 in group A and 10 in group B), the 1, 3 and 5-year patient (P = 0.657) and graft (P = 0.419) survivals were practically the same in both groups. Multivariate Cox regression analysis demonstrated that overall patient survival was adversely affected by cerebrovascular donor death, hepatocarcinoma, and recipient preoperative bilirubin, and overall graft survival was adversely influenced by cerebrovascular donor death, and recipient preoperative bilirubin. The standard criteria for utilization of octogenarian liver grafts are: normal gross appearance and consistency, normal or almost normal liver tests, hemodynamic stability with use of < 10 μg/kg per minute of vasopressors before procurement, intensive care unit stay < 3 d, CIT < 9 h

  13. Octogenarian liver grafts: Is their use for transplant currently justified?

    PubMed Central

    Jiménez-Romero, Carlos; Cambra, Felix; Caso, Oscar; Manrique, Alejandro; Calvo, Jorge; Marcacuzco, Alejandro; Rioja, Paula; Lora, David; Justo, Iago

    2017-01-01

    AIM To analyse the impact of octogenarian donors in liver transplantation. METHODS We present a retrospective single-center study, performed between November 1996 and March 2015, that comprises a sample of 153 liver transplants. Recipients were divided into two groups according to liver donor age: recipients of donors ≤ 65 years (group A; n = 102), and recipients of donors ≥ 80 years (group B; n = 51). A comparative analysis between the groups was performed. Quantitative variables were expressed as mean values and SD, and qualitative variables as percentages. Differences in properties between qualitative variables were assessed by χ2 test. Comparison of quantitative variables was made by t-test. Graft and patient survivals were estimated using the Kaplan-Meier method. RESULTS One, 3 and 5-year overall patient survival was 87.3%, 84% and 75.2%, respectively, in recipients of younger grafts vs 88.2%, 84.1% and 66.4%, respectively, in recipients of octogenarian grafts (P = 0.748). One, 3 and 5-year overall graft survival was 84.3%, 83.1% and 74.2%, respectively, in recipients of younger grafts vs 84.3%, 79.4% and 64.2%, respectively, in recipients of octogenarian grafts (P = 0.524). After excluding the patients with hepatitis C virus cirrhosis (16 in group A and 10 in group B), the 1, 3 and 5-year patient (P = 0.657) and graft (P = 0.419) survivals were practically the same in both groups. Multivariate Cox regression analysis demonstrated that overall patient survival was adversely affected by cerebrovascular donor death, hepatocarcinoma, and recipient preoperative bilirubin, and overall graft survival was adversely influenced by cerebrovascular donor death, and recipient preoperative bilirubin. CONCLUSION The standard criteria for utilization of octogenarian liver grafts are: normal gross appearance and consistency, normal or almost normal liver tests, hemodynamic stability with use of < 10 μg/kg per minute of vasopressors before procurement, intensive care

  14. Long-Term Effects of Pregnancy on Renal Graft Function in Women After Kidney Transplantation Compared With Matched Controls.

    PubMed

    Svetitsky, S; Baruch, R; Schwartz, I F; Schwartz, D; Nakache, R; Goykhman, Y; Katz, P; Grupper, A

    2018-06-01

An important benefit associated with kidney transplantation in women of child-bearing age is increased fertility. We retrospectively evaluated the maternal and fetal complications and evolution of graft function associated with 22 pregnancies post-kidney and kidney-pancreas transplantation, compared with controls without pregnancy post-transplantation, who were matched for gender, year of transplantation, type of donor, age at transplantation, number of transplants, type of transplant (kidney vs kidney-pancreas), and cause of native kidney failure, as well as for renal parameters including serum creatinine and urine protein excretion 1 year before delivery. The mean age at time of transplantation was 22.32 (range, 19.45-33.1) years. The mean interval between transplantation and delivery was 75.7 (range, 34-147.8) months. The main maternal complication was pre-eclampsia (27.3%). The main fetal complications included delayed intrauterine growth (18.2%), preterm deliveries (89.4%), and one death at 3 days postdelivery. The mean serum creatinine level pre-pregnancy was 1.17 (range, 0.7-3.1) mg/dL. Graft failure was higher in the pregnancy group (6 vs 3) but did not differ statistically from the control group, and was associated with creatinine pre-pregnancy (odds ratio [OR], 1.71; 95% confidence interval [CI], 1.15-3.45; P = .04), age at transplantation (1.13 [1.03-1.21]; P = .032), and time of follow-up (2.14 [1.27-2.98]; P = .026). Delta serum creatinine did not differ between the groups: 1.05 ± 0.51 versus 0.99 ± 0.92 mg/dL for the study versus control group, respectively (P = .17). Pregnancy after kidney transplantation is associated with serious maternal and fetal complications. We did not observe a significantly increased risk of graft loss or reduced graft function in comparison with recipients with similar clinical characteristics. Copyright © 2018 Elsevier Inc. All rights reserved.

  15. Kidney and liver organ transplantation in persons with human immunodeficiency virus

    PubMed Central

    2010-01-01

HIV− cohort was not reported. Because of sparse data, the quality of evidence supporting this outcome is very low, indicating that death-censored graft survival is uncertain. Both the CD4+ T-cell count and HIV viral load appear controlled post transplant, with an incidence of opportunistic infection of 20.5%. However, the quality of the evidence for these outcomes is very low, indicating uncertainty in these effects. Similarly, because of very low quality evidence there is uncertainty in the rate of acute graft rejection among both the HIV+ and HIV− groups. Liver Transplantation: HIV+/HCV+ vs. HCV+ Based on a combined HIV+/HCV+ cohort sample size of 156 from seven studies, the risk of death after liver transplantation is significantly greater (2.8 fold) in a co-infected cohort compared with an HCV+ mono-infected cohort (HR: 2.81; 95% CI: 1.47, 5.37). The quality of evidence supporting this outcome is very low. Death-censored graft survival evidence was not available. Regarding disease progression, based on a combined sample size of 71 persons in the co-infected cohort, the CD4+ T-cell count and HIV viral load appear controlled post transplant; however, again the quality of evidence supporting this outcome is very low. The rate of opportunistic infection in the co-infected cohort was 7.2%. The quality of evidence supporting this estimate is very low, indicating uncertainty in these estimates of effect. Based on a combined HIV+/HCV+ cohort (n=57), the rate of acute graft rejection does not differ from that of an HCV+ mono-infected cohort (OR: 0.88; 95% CI: 0.44, 1.76). Also based on a combined HIV+/HCV+ cohort (n=83), the rate of HCV+ recurrence does not differ from that of an HCV+ mono-infected cohort (OR: 0.66; 95% CI: 0.27, 1.59). In both cases, the quality of the supporting evidence was very low. Overall, because of very low quality evidence there is uncertainty in the effect of kidney or liver transplantation in HIV+ persons with end-stage organ failure compared with those

  16. Choosing a reliability inspection plan for interval censored data

    DOE PAGES

    Lu, Lu; Anderson-Cook, Christine Michaela

    2017-04-19

Reliability test plans are important for producing precise and accurate assessment of reliability characteristics. This paper explores different strategies for choosing between possible inspection plans for interval censored data given a fixed testing timeframe and budget. A new general cost structure is proposed for guiding precise quantification of total cost in an inspection test plan. Multiple summaries of reliability are considered and compared as the criteria for choosing the best plans using an easily adapted method. Different cost structures and representative true underlying reliability curves demonstrate how to assess different strategies given the logistical constraints and nature of the problem. Results show several general patterns exist across a wide variety of scenarios. Given the fixed total cost, plans that inspect more units with less frequency based on equally spaced time points are favored due to the ease of implementation and consistent good performance across a large number of case study scenarios. Plans with inspection times chosen based on equally spaced probabilities offer improved reliability estimates for the shape of the distribution, mean lifetime, and failure time for a small fraction of the population only for applications with high infant mortality rates. The paper uses a Monte Carlo simulation-based approach in addition to the common evaluation based on the asymptotic variance and offers comparison and recommendation for different applications with different objectives. Additionally, the paper outlines a variety of different reliability metrics to use as criteria for optimization, presents a general method for evaluating different alternatives, and provides case study results for different common scenarios.
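
As a toy illustration of the interval-censoring idea (not this paper's plan-selection method): inspection data record only that a failure occurred between two inspection times, yet a parametric lifetime model can still be fit by maximum likelihood. The sketch below fits an exponential rate to invented inspection data by a simple grid search:

```python
import math

# Toy example: maximum-likelihood fit of an exponential lifetime to
# interval-censored inspection data via a coarse grid search over the rate.
# (l, r): failure observed in the interval (l, r]; r = None means the unit
# survived past its last inspection at l (right-censored). Data are invented.
data = [(0, 2), (2, 4), (4, 6), (2, 4), (6, None), (6, None), (0, 2), (4, 6)]

def log_lik(rate):
    ll = 0.0
    for l, r in data:
        if r is None:                      # right-censored: S(l) = exp(-rate*l)
            ll += -rate * l
        else:                              # failed in (l, r]: F(r) - F(l)
            ll += math.log(math.exp(-rate * l) - math.exp(-rate * r))
    return ll

rates = [i / 1000 for i in range(1, 2000)]  # grid search over candidate rates
best = max(rates, key=log_lik)
print(f"MLE rate ~ {best:.3f}, mean lifetime ~ {1 / best:.2f}")
```

A real inspection-plan study would compare how the precision of such estimates changes with the number of units, the inspection times, and the cost per inspection, which is what this paper formalizes.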

  17. Choosing a reliability inspection plan for interval censored data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Lu; Anderson-Cook, Christine Michaela

Reliability test plans are important for producing precise and accurate assessment of reliability characteristics. This paper explores different strategies for choosing between possible inspection plans for interval censored data given a fixed testing timeframe and budget. A new general cost structure is proposed for guiding precise quantification of total cost in an inspection test plan. Multiple summaries of reliability are considered and compared as the criteria for choosing the best plans using an easily adapted method. Different cost structures and representative true underlying reliability curves demonstrate how to assess different strategies given the logistical constraints and nature of the problem. Results show several general patterns exist across a wide variety of scenarios. Given the fixed total cost, plans that inspect more units with less frequency based on equally spaced time points are favored due to the ease of implementation and consistent good performance across a large number of case study scenarios. Plans with inspection times chosen based on equally spaced probabilities offer improved reliability estimates for the shape of the distribution, mean lifetime, and failure time for a small fraction of the population only for applications with high infant mortality rates. The paper uses a Monte Carlo simulation-based approach in addition to the common evaluation based on the asymptotic variance and offers comparison and recommendation for different applications with different objectives. Additionally, the paper outlines a variety of different reliability metrics to use as criteria for optimization, presents a general method for evaluating different alternatives, and provides case study results for different common scenarios.

  18. MEDIAN-BASED INCREMENTAL COST-EFFECTIVENESS RATIOS WITH CENSORED DATA

    PubMed Central

    Bang, Heejung; Zhao, Hongwei

    2016-01-01

    Cost-effectiveness is an essential part of treatment evaluation, in addition to effectiveness. In the cost-effectiveness analysis, a measure called the incremental cost-effectiveness ratio (ICER) is widely utilized, and the mean cost and the mean (quality-adjusted) life years have served as norms to summarize cost and effectiveness for a study population. Recently, the median-based ICER was proposed for complementary or sensitivity analysis purposes. In this paper, we extend this method when some data are censored. PMID:26010599
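
The ICER itself is a simple ratio; the statistical difficulty the paper addresses is estimating its ingredients under censoring. A sketch of the median-based ratio with purely hypothetical medians (in practice these would come from censoring-adjusted estimates such as Kaplan-Meier medians):

```python
# The ICER compares two treatments as (difference in cost) / (difference in
# effectiveness). The median-based variant substitutes median cost and median
# (quality-adjusted) life years for the usual means. Numbers are invented.

def icer(cost_new, cost_std, eff_new, eff_std):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
    return (cost_new - cost_std) / (eff_new - eff_std)

# Hypothetical medians: cost in dollars, effectiveness in QALYs.
print(icer(cost_new=48_000, cost_std=30_000, eff_new=4.5, eff_std=3.0))
# → 12000.0 dollars per QALY gained
```

The censored-data extension proposed in the paper is needed because naive sample medians of cost and survival are biased when follow-up ends before costs and life years are fully observed.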

  19. Impact of Diabetes Mellitus on Hospitalization for Heart Failure, Cardiovascular Events, and Death: Outcomes at 4 Years From the Reduction of Atherothrombosis for Continued Health (REACH) Registry.

    PubMed

    Cavender, Matthew A; Steg, Ph Gabriel; Smith, Sidney C; Eagle, Kim; Ohman, E Magnus; Goto, Shinya; Kuder, Julia; Im, Kyungah; Wilson, Peter W F; Bhatt, Deepak L

    2015-09-08

    Despite the known association of diabetes mellitus with cardiovascular events, there are few contemporary data on the long-term outcomes from international cohorts of patients with diabetes mellitus. We sought to describe cardiovascular outcomes at 4 years and to identify predictors of these events in patients with diabetes mellitus. The Reduction of Atherothrombosis for Continued Health (REACH) registry is an international registry of patients at high risk of atherothrombosis or established atherothrombosis. Four-year event rates in patients with diabetes mellitus were determined with the corrected group prognosis method. Of the 45 227 patients in the REACH registry who had follow-up at 4 years, 43.6% (n=19 699) had diabetes mellitus at baseline. The overall risk and hazard ratio (HR) of cardiovascular death, nonfatal myocardial infarction, or nonfatal stroke were greater in patients with diabetes compared with patients without diabetes (16.5% versus 13.1%; adjusted HR, 1.27; 95% confidence interval [CI] 1.19-1.35). There was also an increase in both cardiovascular death (8.9% versus 6.0%; adjusted HR, 1.38; 95% CI, 1.26-1.52) and overall death (14.3% versus 9.9%; adjusted HR, 1.40; 95% CI, 1.30-1.51). Diabetes mellitus was associated with a 33% greater risk of hospitalization for heart failure (9.4% versus 5.9%; adjusted odds ratio, 1.33; 95% CI, 1.18-1.50). In patients with diabetes mellitus, heart failure at baseline was independently associated with cardiovascular death (adjusted HR, 2.45; 95% CI, 2.17-2.77; P<0.001) and hospitalization for heart failure (adjusted odds ratio, 4.72; 95% CI, 4.22-5.29; P<0.001). Diabetes mellitus substantially increases the risk of death, ischemic events, and heart failure. Patients with both diabetes mellitus and heart failure are at particularly elevated risk of cardiovascular death, highlighting the need for additional therapies in this high-risk population. © 2015 American Heart Association, Inc.

  20. Risk factors for urinary tract infection after renal transplantation and its impact on graft function in children and young adults.

    PubMed

    Silva, Andres; Rodig, Nancy; Passerotti, Carlo P; Recabal, Pedro; Borer, Joseph G; Retik, Alan B; Nguyen, Hiep T

    2010-10-01

Urinary tract infection will develop in 40% of children who undergo renal transplantation. Post-transplant urinary tract infection is associated with earlier graft loss in adults. However, the impact on graft function in the pediatric population is less well-known. Additionally the risk factors for post-transplant urinary tract infection in children have not been well elucidated. The purpose of this study was to assess the relationship between pre-transplant and post-transplant urinary tract infections on graft outcome, and the risk factors for post-transplant urinary tract infection. A total of 87 patients underwent renal transplantation between July 2001 and July 2006. Patient demographics, cause of renal failure, graft outcome, and presence of pre-transplant and post-transplant urinary tract infections were recorded. Graft outcome was based on last creatinine and nephrological assessment. Median follow-up was 3.12 years. Of the patients 15% had pre-transplant and 32% had post-transplant urinary tract infections. Good graft function was seen in 60% of the patients and 21% had failed function. Graft function did not correlate with a history of pre-transplant or post-transplant urinary tract infection (p >0.2). Of transplanted patients with urological causes of renal failure 57% had post-transplant urinary tract infection, compared to only 20% of those with a medical etiology of renal failure (p <0.001). In this study there was no correlation between a history of urinary tract infection (either before or after transplant) and decreased graft function. History of pre-transplant urinary tract infection was suggestive of urinary tract infection after transplant. Patients with urological causes of renal failure may be at increased risk for post-transplant urinary tract infection. Copyright © 2010 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  1. Risk stratification for death and all-cause hospitalization in heart failure clinic outpatients.

    PubMed

    Hummel, Scott L; Ghalib, Hussam H; Ratz, David; Koelling, Todd M

    2013-11-01

    Most heart failure (HF) risk stratification models were developed for inpatient use, and available outpatient models use a complex set of variables. We hypothesized that routinely collected clinical data could predict the 6-month risk of death and all-cause medical hospitalization in HF clinic outpatients. Using a quality improvement database and multivariable Cox modeling, we derived the Heart Failure Patient Severity Index (HFPSI) in the University of Michigan HF clinic (UM cohort, n = 1,536; 314 reached primary outcome). We externally validated the HFPSI in the Ann Arbor Veterans' Affairs HF clinic (VA cohort, n = 445; 106 outcomes) and explored "real-time" HFPSI use (VA-RT cohort, n = 486; 141 outcomes) by tracking VA patients for 6 months from their most recently calculated HFPSI, rather than using an arbitrary start date for the cohort. The HFPSI model included blood urea nitrogen, B-type natriuretic peptide, New York Heart Association class, diabetes status, history of atrial fibrillation/flutter, and all-cause hospitalization within the prior 1 and 2 to 6 months. The concordance c statistics in the UM/VA/VA-RT cohorts were 0.71/0.68/0.74. Kaplan-Meier curves and log-rank testing demonstrated excellent risk stratification, particularly between a large, low-risk group (40% of patients, 6-month event rates in the UM/VA/VA-RT cohorts 8%/12%/12%) and a small, high-risk group (10% of patients, 6-month event rates in the UM/VA/VA-RT cohorts 57%/58%/79%). The HFPSI uses readily available data to predict the 6-month risk of death and/or all-cause medical hospitalization in HF clinic outpatients and could potentially help allocate specialized HF resources within health systems. © 2013.

  2. Sudden Death in Heart Failure With Preserved Ejection Fraction: A Competing Risks Analysis From the TOPCAT Trial.

    PubMed

    Vaduganathan, Muthiah; Claggett, Brian L; Chatterjee, Neal A; Anand, Inder S; Sweitzer, Nancy K; Fang, James C; O'Meara, Eileen; Shah, Sanjiv J; Hegde, Sheila M; Desai, Akshay S; Lewis, Eldrin F; Rouleau, Jean; Pitt, Bertram; Pfeffer, Marc A; Solomon, Scott D

    2018-03-04

Sudden death (SD) may be an important mode of death in heart failure with preserved ejection fraction (HFpEF). This study investigated the rates and predictors of SD or aborted cardiac arrest (ACA) in HFpEF. We studied 1,767 patients with HFpEF (EF ≥45%) enrolled in the Americas region of the TOPCAT (Treatment of Preserved Cardiac Function Heart Failure with an Aldosterone Antagonist) trial. We identified independent predictors of composite SD/ACA with stepwise backward selection using competing risks regression analysis that accounted for nonsudden causes of death. During a median 3.0-year (25th to 75th percentile: 1.9 to 4.4 years) follow-up, 77 patients experienced SD/ACA, and 312 experienced non-SD/ACA. Corresponding incidence rates were 1.4 events/100 patient-years (25th to 75th percentile: 1.1 to 1.8 events/100 patient-years) and 5.8 events/100 patient-years (25th to 75th percentile: 5.1 to 6.4 events/100 patient-years). SD/ACA was numerically lower but not statistically reduced in those randomized to spironolactone: 1.2 events/100 patient-years (25th to 75th percentile: 0.9 to 1.7 events/100 patient-years) versus 1.6 events/100 patient-years (25th to 75th percentile: 1.2 to 2.2 events/100 patient-years); the subdistribution hazard ratio was 0.74 (95% confidence interval: 0.47 to 1.16; p = 0.19). After accounting for competing risks of non-SD/ACA, male sex and insulin-treated diabetes mellitus were independently predictive of composite SD/ACA (C-statistic = 0.65). Covariates, including eligibility criteria, age, ejection fraction, coronary artery disease, left bundle branch block, and baseline therapies, were not independently associated with SD/ACA. Sex and diabetes mellitus status remained independent predictors in sensitivity analyses excluding patients with implantable cardioverter-defibrillators and when predicting SD alone. SD accounted for ∼20% of deaths in HFpEF. Male sex and insulin-treated diabetes mellitus identified
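
The competing-risks approach used here can be illustrated with the nonparametric cumulative incidence function (CIF): unlike naively censoring non-sudden deaths in a Kaplan-Meier analysis, the CIF does not overstate sudden-death risk when subjects can die of other causes first. A pure-Python sketch with invented data (distinct event times assumed):

```python
# Cumulative incidence under competing risks (Aalen-Johansen style, one sample).
# Event codes: 0 = censored, 1 = event of interest (e.g., SD/ACA),
# 2 = competing event (e.g., non-sudden death). Data are invented.

def cumulative_incidence(times, codes, cause):
    """Return [(t, CIF(t))] at each time an event of `cause` occurs."""
    overall_surv, cif, at_risk = 1.0, 0.0, len(times)
    out = []
    for t, c in sorted(zip(times, codes)):
        if c == cause:
            # Probability of being event-free just before t, times hazard at t.
            cif += overall_surv * (1 / at_risk)
            out.append((t, cif))
        if c != 0:                                   # any event reduces survival
            overall_surv *= (at_risk - 1) / at_risk
        at_risk -= 1
    return out

times = [1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.4]
codes = [2,   1,   0,   2,   1,   0,   2,   0]
for t, f in cumulative_incidence(times, codes, cause=1):
    print(f"t={t}  CIF_sudden(t)={f:.3f}")
```

The regression analogue reported in the record (the subdistribution hazard ratio) models how covariates such as sex and diabetes shift this cumulative incidence curve.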

  3. Progressive liver failure post acute hepatitis A, over a three-month period, resulting in hepatorenal syndrome and death

    PubMed Central

    Al Saadi, Tareq; Sawaf, Bisher; Alkhatib, Mahmoud; Zakaria, Mhd Ismael; Daaboul, Bisher

    2017-01-01

    Abstract Hepatitis A is a common viral illness worldwide. It usually results in an acute, self-limiting disease and only rarely leads to fulminant hepatic failure or any other complications. During the period of conflict in Syria, and due to the damages to water infrastructure and poor sanitation, a dramatic increase in hepatitis A virus infection has been documented. Here we report a rare case of a 14-year-old male whose hepatitis A was complicated with hepatorenal syndrome and subacute liver failure. The war condition in Syria impeded transportation of the patient to a nearby country for liver transplantation, contributing to his unfortunate death. PMID:27247182

  4. Graft-versus-host disease management.

    PubMed

    Mistrik, M; Bojtarova, E; Sopko, L; Masakova, L; Roziakova, L; Martinka, J; Batorova, A

Graft-versus-host disease (GVHD) remains a major problem of allogeneic hematopoietic-stem cell transplantation (HSCT) and an obstacle for successful outcome. Clinically significant acute GVHD (grade II or higher) developed in 20 to 65 percent of the patients. Death due to this complication accounts for approximately 50 percent of the deaths that are not due to a relapse of the neoplasm. Up to 70% of patients who survive beyond day 100 develop chronic GVHD, and it is the leading cause of nonrelapse mortality more than 2 years after allogeneic HSCT. In addition, chronic GVHD is associated with decreased quality of life, impaired functional status, and ongoing need for immunosuppressive medications. The incidence of chronic GVHD is increasing because of expansion of the donor population beyond HLA-identical siblings, older recipient age, use of peripheral blood cells as the graft source, and infusion of donor lymphocytes for treatment of recurrent malignancy after HSCT. With the recent surge of new findings related to GVHD, its management has advanced significantly. Given these various new options and challenges, it is important to identify the minimal requirements for diagnosis and treatment of GVHD, as access to the most sophisticated advances may vary depending on local circumstances (Tab. 4, Fig. 1, Ref. 51).

  5. Recovery of Donor Hematopoiesis after Graft Failure and Second Hematopoietic Stem Cell Transplantation with Intraosseous Administration of Mesenchymal Stromal Cells

    PubMed Central

    Sats, Natalia; Risinskaya, Natalya; Sudarikov, Andrey; Dubniak, Daria; Kraizman, Alina

    2018-01-01

Multipotent mesenchymal stromal cells (MSCs) participate in the formation of bone marrow niches for hematopoietic stem cells. Donor MSCs can serve as a source of recovery for niches in patients with graft failure (GF) after allogeneic bone marrow (BM) transplantation. Since only a few MSCs reach the BM after intravenous injection, MSCs were implanted into the iliac spine. For 8 patients with GF after allo-BMT, another hematopoietic stem cell transplantation with simultaneous implantation of MSCs from their respective donors into cancellous bone was performed. BM was aspirated from the iliac crest of these patients at 1-2, 4-5, and 9 months after the intraosseous injection of donor MSCs. Patients' MSCs were cultivated, and chimerism was determined. In 6 out of 8 patients, donor hematopoiesis was restored. Donor cells (9.4 ± 3.3%) were detected among MSCs. Thus, implanted MSCs remain localized at the site of administration and do not lose the ability to proliferate. These results suggest that MSCs could participate in the restoration of niches for donor hematopoietic cells or have an immunomodulatory effect, preventing repeated rejection of the graft. Perhaps intraosseous implantation of MSCs contributes to the success of the second transplantation of hematopoietic stem cells and patient survival. PMID:29760731

  6. Recovery of Donor Hematopoiesis after Graft Failure and Second Hematopoietic Stem Cell Transplantation with Intraosseous Administration of Mesenchymal Stromal Cells.

    PubMed

    Petinati, Nataliya; Drize, Nina; Sats, Natalia; Risinskaya, Natalya; Sudarikov, Andrey; Drokov, Michail; Dubniak, Daria; Kraizman, Alina; Nareyko, Maria; Popova, Natalia; Firsova, Maya; Kuzmina, Larisa; Parovichnikova, Elena; Savchenko, Valeriy

    2018-01-01

    Multipotent mesenchymal stromal cells (MSCs) participate in the formation of bone marrow niches for hematopoietic stem cells. Donor MSCs can serve as a source of recovery for niches in patients with graft failure (GF) after allogeneic bone marrow (BM) transplantation. Since only a few MSCs reach the BM after intravenous injection, MSCs were implanted into the iliac spine. For 8 patients with GF after allo-BMT, a second hematopoietic stem cell transplantation with simultaneous implantation of MSCs from their respective donors into cancellous bone was performed. BM was aspirated from the iliac crest of these patients at 1-2, 4-5, and 9 months after the intraosseous injection of donor MSCs. Patients' MSCs were cultivated, and chimerism was determined. In 6 out of 8 patients, donor hematopoiesis was restored. Donor cells (9.4 ± 3.3%) were detected among the MSCs. Thus, implanted MSCs remain localized at the site of administration and do not lose the ability to proliferate. These results suggest that MSCs could participate in the restoration of niches for donor hematopoietic cells or have an immunomodulatory effect, preventing repeated rejection of the graft. Intraosseous implantation of MSCs may thus contribute to the success of a second transplantation of hematopoietic stem cells and to patient survival.

  7. Multiple imputation for cure rate quantile regression with censored data.

    PubMed

    Wu, Yuanshan; Yin, Guosheng

    2017-03-01

    The main challenge in the context of cure rate analysis is that one never knows whether censored subjects are cured or uncured, or whether they are susceptible or insusceptible to the event of interest. Considering the susceptible indicator as missing data, we propose a multiple imputation approach to cure rate quantile regression for censored data with a survival fraction. We develop an iterative algorithm to estimate the conditionally uncured probability for each subject. By utilizing this estimated probability and Bernoulli sample imputation, we can classify each subject as cured or uncured, and then employ the locally weighted method to estimate the quantile regression coefficients with only the uncured subjects. Repeating the imputation procedure multiple times and taking an average over the resultant estimators, we obtain consistent estimators for the quantile regression coefficients. Our approach relaxes the usual global linearity assumption, so that we can apply quantile regression to any particular quantile of interest. We establish asymptotic properties for the proposed estimators, including both consistency and asymptotic normality. We conduct simulation studies to assess the finite-sample performance of the proposed multiple imputation method and apply it to a lung cancer study as an illustration. © 2016, The International Biometric Society.
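
    The classify-then-average scheme described above can be sketched in miniature. The snippet below is a hypothetical illustration only: it imputes the cure status of each censored subject by a Bernoulli draw and averages a simple median over imputations, where the paper instead fits a locally weighted quantile regression to the uncured subjects.

    ```python
    import random
    import statistics

    def impute_and_estimate(times, events, uncured_prob, n_imputations=200, seed=7):
        """Multiple-imputation sketch: each censored subject is classified as
        cured or uncured by a Bernoulli draw from its estimated uncured
        probability, a statistic (here simply the median time) is computed on
        the uncured subjects, and the estimates are averaged over imputations."""
        rng = random.Random(seed)
        estimates = []
        for _ in range(n_imputations):
            uncured_times = []
            for t, d, p in zip(times, events, uncured_prob):
                if d == 1:
                    uncured_times.append(t)   # observed event: certainly uncured
                elif rng.random() < p:        # censored: impute uncured status
                    uncured_times.append(t)
                # else: imputed as cured and excluded from the uncured-only fit
            estimates.append(statistics.median(uncured_times))
        return statistics.mean(estimates)     # average over the imputations

    # Hypothetical cohort: (time, event indicator, estimated uncured probability).
    times  = [2.0, 3.5, 4.1, 5.0, 6.2, 7.8, 9.0, 12.0]
    events = [1,   1,   0,   1,   0,   1,   0,   0]
    probs  = [1.0, 1.0, 0.6, 1.0, 0.4, 1.0, 0.2, 0.1]
    est = impute_and_estimate(times, events, probs)
    ```

    Averaging over many imputations damps the noise that any single Bernoulli classification introduces, which is what makes the resulting estimator consistent.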

  8. Survival analysis for the missing censoring indicator model using kernel density estimation techniques

    PubMed Central

    Subramanian, Sundarraman

    2008-01-01

    This article concerns asymptotic theory for a new estimator of a survival function in the missing censoring indicator model of random censorship. Specifically, the large sample results for an inverse probability-of-non-missingness weighted estimator of the cumulative hazard function, so far not available, are derived, including an almost sure representation with rate for a remainder term, and uniform strong consistency with rate of convergence. The estimator is based on a kernel estimate for the conditional probability of non-missingness of the censoring indicator. Expressions for its bias and variance, in turn leading to an expression for the mean squared error as a function of the bandwidth, are also obtained. The corresponding estimator of the survival function, whose weak convergence is derived, is asymptotically efficient. A numerical study, comparing the performances of the proposed and two other currently existing efficient estimators, is presented. PMID:18953423

  9. Survival analysis for the missing censoring indicator model using kernel density estimation techniques.

    PubMed

    Subramanian, Sundarraman

    2006-01-01

    This article concerns asymptotic theory for a new estimator of a survival function in the missing censoring indicator model of random censorship. Specifically, the large sample results for an inverse probability-of-non-missingness weighted estimator of the cumulative hazard function, so far not available, are derived, including an almost sure representation with rate for a remainder term, and uniform strong consistency with rate of convergence. The estimator is based on a kernel estimate for the conditional probability of non-missingness of the censoring indicator. Expressions for its bias and variance, in turn leading to an expression for the mean squared error as a function of the bandwidth, are also obtained. The corresponding estimator of the survival function, whose weak convergence is derived, is asymptotically efficient. A numerical study, comparing the performances of the proposed and two other currently existing efficient estimators, is presented.
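
    The weighting idea can be sketched with a deliberately simplified estimator. Here a constant complete-case proportion stands in for the paper's kernel-smoothed, covariate-dependent estimate of the non-missingness probability, and the data are hypothetical:

    ```python
    def ipw_cumulative_hazard(times, deltas, horizon):
        """Inverse probability-of-non-missingness weighted Nelson-Aalen sketch.
        deltas[i] is the censoring indicator (1 = event, 0 = censored) or None
        when the indicator is missing.  Subjects with an observed indicator are
        up-weighted by 1/pi, the inverse of the estimated probability that the
        indicator is non-missing."""
        observed = [d is not None for d in deltas]
        # Crude constant estimate of P(indicator observed); the paper replaces
        # this with a kernel estimate conditional on the observed time.
        pi = sum(observed) / len(deltas)
        hazard = 0.0
        for t, d, obs in sorted(zip(times, deltas, observed), key=lambda r: r[0]):
            if t > horizon:
                break
            if obs and d == 1:
                at_risk = sum(1 for s in times if s >= t)
                hazard += (1.0 / pi) / at_risk   # weighted Nelson-Aalen increment
        return hazard
    ```

    With no missing indicators pi is 1 and the usual Nelson-Aalen estimator is recovered; the corresponding survival estimate is exp of the negative cumulative hazard.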

  10. Autogenous teeth used for bone grafting: A systematic review.

    PubMed

    Gual-Vaqués, P; Polis-Yanes, C; Estrugo-Devesa, A; Ayuso-Montero, R; Mari-Roig, A; López-López, J

    2018-01-01

    Recently, bone graft materials using permanent teeth have come to light, and clinical and histological outcomes of this material have been confirmed by some studies. The aim of this systematic review was to evaluate the reliability of autogenous tooth bone graft material applied in alveolar ridge augmentation procedures. A systematic review of the literature was conducted, analyzing articles published between 2007 and 2017. The following four outcome variables were defined: a) implant stability, b) post-operative complications, c) evaluation of implant survival and failure rates, and d) histological analysis. A total of 108 articles were identified; 6 were selected for review. Based on the PICO (problem, intervention, comparison, outcome) model, the chief question of this study was: Can patients with alveolar ridge deficiency be successfully treated with autogenous teeth used as bone graft? The mean primary stability of the placed implants was 67.3 ISQ and the mean secondary stability was 75.5 ISQ. Wound dehiscence was the most frequent complication, with a rate of 29.1%. Of the 182 analyzed implants, the survival rate was 97.7% and the failure rate was 2.3%. In the histological analysis, most studies reported bone formation. There is insufficient evidence regarding the effects of autogenous teeth used for bone grafting to support any definitive conclusions, although the material has been shown to be clinically safe, with good bone-forming capacity, and good results have been reported for implant stability.

  11. Is pre-transplant sensitization against angiotensin II type 1 receptor still a risk factor of graft and patient outcome in kidney transplantation in the anti-HLA Luminex era? A retrospective study.

    PubMed

    Deltombe, Clement; Gillaizeau, Florence; Anglicheau, Daniel; Morelon, Emmanuel; Trébern-Launay, Katy; Le Borgne, Florent; Rimbert, Marie; Guérif, Pierrick; Malard-Castagnet, Stéphanie; Foucher, Yohann; Giral, Magali

    2017-11-01

    We aimed to assess the association of anti-angiotensin II type 1 receptor antibodies (anti-AT1R-Abs) before transplantation with graft and patient outcomes in a multicentric cohort of kidney transplant recipients (2008-2012), under tacrolimus and mycophenolate mofetil (MMF), screened by Luminex technology for anti-HLA immunization. Anti-AT1R antibody levels were measured by ELISA in pretransplantation sera of 940 kidney recipients from three French centers of the DIVAT cohort. Multivariable Cox models estimated the association between pretransplant anti-AT1R antibodies and time to acute rejection episodes (ARE) or time to graft failure. Within our cohort, 387 patients (41.2%) had pretransplant AT1R-Abs higher than 10 U/ml and only 8% (72/970) greater than 17 U/ml. The cumulative probability of clinically relevant (cr)-ARE was 22.5% at 1 year post-transplantation [95% CI (19.9-25.4%)]. The cumulative probabilities of graft failure and patient death were 10.6% [95% CI (8.4-13.3%)] and 5.7% [95% CI (4.0-8.1%)] at 3 years post-transplantation, respectively. Multivariable Cox models indicated that pretransplant anti-AT1R antibody levels higher than 10 U/ml were not significantly independently associated with a higher risk of acute rejection episodes [HR = 1.04, 95% CI (0.80-1.35)] nor with a higher risk of graft failure [HR = 0.86, 95% CI (0.56-1.33)]. Our study did not confirm an association between pretransplant anti-AT1R antibody levels and kidney transplant outcomes. © 2017 Steunstichting ESOT.

  12. Responding Intelligently when Would-Be Censors Charge: "That Book Can Make Them...!"

    ERIC Educational Resources Information Center

    Martinson, David L.

    2007-01-01

    School administrators and teachers need to recognize that most persons--including would-be censors of school-related media communications--simply do not understand the complexities germane to measuring the impact of the mass media and the specific messages transmitted to broader audiences via a variety of media channels. In particular, what most…

  13. Air Pump-Assisted Graft Centration, Graft Edge Unfolding, and Graft Uncreasing in Young Donor Graft Pre-Descemet Endothelial Keratoplasty.

    PubMed

    Jacob, Soosan; Narasimhan, Smita; Agarwal, Amar; Agarwal, Athiya; A I, Saijimol

    2017-08-01

    To assess an air pump-assisted technique for graft centration, graft edge unfolding, and graft uncreasing while performing pre-Descemet endothelial keratoplasty (PDEK) using young donor grafts. Continuous pressurized air infusion was used for graft centration, graft edge unfolding, and graft unwrinkling. Ten eyes of 10 patients underwent PDEK with donors aged below 40 years. In all eyes, the donor scrolled into tight scrolls. In all cases, the air pump-assisted technique was effective in positioning and centering the graft accurately and in straightening infolded graft edges and smoothing out graft creases and wrinkles. Endothelial cell loss was 38.6%. Postoperative best-corrected visual acuity at 6 months was 0.66 ± 0.25 in decimal equivalent. Continuous pressurized air infusion acted as a third hand providing a continuous pressure head that supported the graft and prevented graft dislocation as well as anterior chamber collapse during intraocular maneuvering. Adequate maneuvering space was available in all cases, and bleeding, if any, was tamponaded successfully in all cases. Although very young donor grafts may be used for PDEK, they are difficult to center and unroll completely before floating against host stroma. An air pump-assisted technique using continuous pressurized air infusion allows successful final graft positioning even with very young donor corneas. It thus makes surgery easier as several key steps are made easier to handle. It additionally helps in tamponading hemorrhage during peripheral iridectomy, increasing surgical space, preventing fluctuations in the anterior chamber depth, and promoting graft adherence.

  14. Rapid Discontinuation of Prednisone in Kidney Transplant Recipients: 15-Year Outcomes From the University of Minnesota.

    PubMed

    Serrano, Oscar Kenneth; Kandaswamy, Raja; Gillingham, Kristen; Chinnakotla, Srinath; Dunn, Ty B; Finger, Erik; Payne, William; Ibrahim, Hassan; Kukla, Aleksandra; Spong, Richard; Issa, Naim; Pruett, Timothy L; Matas, Arthur

    2017-10-01

    Short- and intermediate-term results have been reported after rapid discontinuation of prednisone (RDP) in kidney transplant recipients, yet there has been residual concern about late graft failure in the absence of maintenance prednisone. From October 1, 1999, through June 1, 2015, we performed a total of 1553 adult first and second kidney transplants (1021 with a living donor, 532 with a deceased donor) under our RDP protocol. We analyzed the 15-year actuarial overall patient survival (PS), graft survival (GS), death-censored GS (DCGS), and acute rejection-free survival (ARFS) rates for RDP compared with historical controls on maintenance prednisone. For living donor recipients, the actuarial 15-year PS rates were similar between groups, but RDP was associated with increased GS (P = 0.02) and DCGS (P = 0.01). For deceased donor recipients, RDP was associated with significantly better PS (P < 0.01), GS (P < 0.01) and DCGS (P < 0.01). There was no difference between groups in the rate of acute or chronic rejection, or in the mean estimated glomerular filtration rate at 15 years. However, RDP-treated recipients had significantly lower rates of avascular necrosis, cytomegalovirus, cataracts, new-onset diabetes after transplant, and cardiac complications. Importantly, for recipients with GS longer than 5 years, there was no difference between groups in subsequent actuarial PS, GS, and DCGS. In summary, at 15 years post-kidney transplant, RDP did not lead to a decrease in PS or GS or an increase in graft dysfunction, but was associated with reduced complication rates.
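
    Death-censored graft survival (DCGS), used here and throughout these records, treats death with a functioning graft as a censoring event rather than as graft loss. A minimal Kaplan-Meier sketch on hypothetical follow-up records (not the study's data) shows the bookkeeping:

    ```python
    def death_censored_km(records):
        """Kaplan-Meier estimate of death-censored graft survival.  Each record
        is (time, outcome) with outcome 'graft_failure', 'death', or 'alive'.
        Only graft failure counts as an event; death with a functioning graft
        and administrative censoring are both treated as censored."""
        data = [(t, out == "graft_failure") for t, out in records]
        surv, curve = 1.0, []
        for t in sorted({t for t, e in data if e}):     # distinct failure times
            at_risk = sum(1 for s, _ in data if s >= t)
            d = sum(1 for s, e in data if e and s == t)
            surv *= 1.0 - d / at_risk                   # KM product-limit step
            curve.append((t, surv))
        return curve

    # Hypothetical follow-up records (years, outcome); not the study's data.
    cohort = [(2, "graft_failure"), (3, "death"), (5, "alive"),
              (5, "graft_failure"), (8, "death"), (10, "alive")]
    curve = death_censored_km(cohort)
    ```

    Counting deaths as graft failures instead would pull the curve down and conflate patient survival with graft durability, which is exactly what the death-censored definition avoids.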

  15. A failure analysis of invasive breast cancer: most deaths from disease occur in women not regularly screened.

    PubMed

    Webb, Matthew L; Cady, Blake; Michaelson, James S; Bush, Devon M; Calvillo, Katherina Zabicki; Kopans, Daniel B; Smith, Barbara L

    2014-09-15

    Mortality reduction from mammographic screening is controversial. Individual randomized trials and meta-analyses demonstrate statistically significant mortality reductions in all age groups invited to screening. In women actually screened, mortality reductions are greater. Individual trials and meta-analyses show varying rates of mortality reduction, leading to questions about screening's value and whether treatment advances have diminished the importance of early detection. This study hypothesized that breast cancer deaths predominantly occurred in unscreened women. Invasive breast cancers diagnosed between 1990 and 1999 were followed through 2007. Data included demographics, mammography use, surgical and pathology reports, and recurrence and death dates. Mammograms were categorized as screening or diagnostic based on absence or presence of breast signs or symptoms, and were substantiated by medical records. Breast cancer deaths were defined after documentation of prior distant metastases. Absence of recurrent cancer and lethal other diseases defined death from other causes. Invasive breast cancer failure analysis identified 7301 patients between 1990 and 1999, with 1705 documented deaths from breast cancer (n = 609) or other causes (n = 905). Among the 609 confirmed breast cancer deaths, 29% occurred in women who had been screened (19% screen-detected and 10% interval cancers), whereas 71% occurred in unscreened women, including those > 2 years since their last mammogram (6%) and those never screened (65%). Median age at diagnosis of fatal cancers was 49 years; in deaths not from breast cancer, median age at diagnosis was 72 years. Most deaths from breast cancer occur in unscreened women. To maximize mortality reduction and life-years gained, initiation of regular screening before age 50 years should be encouraged. Copyright © 2013 American Cancer Society.

  16. Renal transplantation outcome and social deprivation in the French healthcare system: a cohort study using the European Deprivation Index.

    PubMed

    Châtelet, Valérie; Bayat-Makoei, Sahar; Vigneau, Cécile; Launoy, Guy; Lobbedez, Thierry

    2018-04-02

    The study objective was to estimate the effect of social deprivation, estimated by the European Deprivation Index (EDI), on the risk of death and graft failure after renal transplantation in France. EDI was calculated for 8701 of 9205 patients receiving a first renal transplantation between 2010 and 2014. Patients were separated into EDI quintiles of the general population. A Cox model (cs-HR: cause-specific hazard ratio of death or graft failure) and a Fine and Gray model (sd-HR: subdistribution hazard ratio of death and graft failure) were used for the analysis. The 5th quintile group (most deprived) accounted for 32% of patients [2818 of 8701]. In the multivariate analysis, compared with quintile 1, the risk of death was higher for the 5th quintile group in the complete cohort [cs-HR: 1.31, 95% CI: (1.01-1.70), sd-HR: 1.29, 95% CI: (1.00-1.68)] and in the deceased donor group [cs-HR: 1.31, 95% CI: (1.00-1.71), sd-HR: 1.30, 95% CI: (1.00-1.70)], but not in living donor transplant patients. There was no association between the EDI groups and the risk of transplant failure. Social deprivation estimated by the EDI is associated with an increased risk of death after transplantation in France but not with the risk of allograft loss. © 2018 Steunstichting ESOT.
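
    The study's two models differ in how competing events are handled: the cause-specific Cox model censors them, while the Fine and Gray model keeps them in the risk set. The difference is easiest to see in the nonparametric cumulative incidence function; the sketch below is a minimal Aalen-Johansen-style estimate on hypothetical data, not the paper's regression analysis:

    ```python
    def cumulative_incidence(times, causes, cause, horizon):
        """Aalen-Johansen-style cumulative incidence sketch for competing risks
        (e.g. graft failure vs. death).  causes[i] is 0 for censored, otherwise
        the event type.  The competing event depletes the risk set through the
        overall survival term, which 1 - KM computed on one cause ignores."""
        data = sorted(zip(times, causes))
        surv, cif = 1.0, 0.0
        for t in sorted({t for t, c in data if c != 0 and t <= horizon}):
            at_risk = sum(1 for s, _ in data if s >= t)
            d_cause = sum(1 for s, c in data if s == t and c == cause)
            d_all = sum(1 for s, c in data if s == t and c != 0)
            cif += surv * d_cause / at_risk   # reach t event-free, then fail of this cause
            surv *= 1.0 - d_all / at_risk     # overall event-free survival
        return cif
    ```

    By construction the cause-specific cumulative incidences plus the event-free probability sum to one, which is the sanity check that 1 − KM per cause fails.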

  17. Variables associated with the risk of early death after liver transplantation at a liver transplant unit in a university hospital.

    PubMed

    Azevedo, L D; Stucchi, R S; de Ataíde, E C; Boin, I F S F

    2015-05-01

    Graft dysfunction after liver transplantation is a serious complication that can lead to graft loss and patient death. This study aimed to identify risk factors for early death (up to 30 days after transplantation). It was an observational, retrospective analysis at the Liver Transplantation Unit, Hospital de Clinicas, State University of Campinas, Brazil. From July 1994 to December 2012, 302 patients were included (>18 years old, piggyback technique). Of these, 26% died within 30 days. Student t tests and chi-square tests were used to analyze recipient-related (age, body mass index, serum sodium, graft dysfunction, Model for End-Stage Liver Disease score, renal function, and early graft dysfunction [EGD type 1, 2, or 3]), surgical (warm and cold ischemia time, surgical time, and units of packed erythrocytes [pRBC] transfused), and donor-related (age, hypotension, and cause of brain death) factors. Risk factors were identified by means of a logistic regression model adjusted by the Hosmer-Lemeshow test, with significance set at P < .05. We found that hyponatremic recipients had a 6.26-fold higher risk of early death, and the chance of death fell by 9% for each 1-unit increase in recipient serum sodium. The risk of early death was 18-fold higher with EGD3 than with EGD1, and the risk of death increased by 13% for each unit of pRBC transfused. Donor total bilirubin, hyponatremia, massive transfusion, and EGD3 should be monitored to improve results in the postoperative period. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. A threshold-free summary index of prediction accuracy for censored time to event data.

    PubMed

    Yuan, Yan; Zhou, Qian M; Li, Bingying; Cai, Hengrui; Chow, Eric J; Armstrong, Gregory T

    2018-05-10

    Prediction performance of a risk scoring system needs to be carefully assessed before its adoption in clinical practice. Clinical preventive care often uses risk scores to screen an asymptomatic population. The primary clinical interest is to predict the risk of having an event by a prespecified future time t0. Accuracy measures such as positive predictive values have been recommended for evaluating predictive performance. However, for commonly used continuous or ordinal risk score systems, these measures require a subjective cutoff threshold value that dichotomizes the risk scores, and the need for a cutoff value creates barriers for practitioners and researchers. In this paper, we propose a threshold-free summary index of positive predictive values that accommodates time-dependent event status and competing risks. We develop a nonparametric estimator and provide an inference procedure for comparing this summary measure between 2 risk scores for censored time-to-event data. We conduct a simulation study to examine the finite-sample performance of the proposed estimation and inference procedures. Lastly, we illustrate the use of this measure on a real data example, comparing 2 risk score systems for predicting heart failure in childhood cancer survivors. Copyright © 2018 John Wiley & Sons, Ltd.
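
    A threshold-free summary can be formed by averaging the positive predictive value over all achievable cutoffs. The sketch below does this for uncensored binary outcomes on hypothetical scores; the paper's estimator additionally accommodates censored event times and competing risks, which are omitted here:

    ```python
    def ppv_at_cutoff(scores, events, cutoff):
        """PPV when subjects with score >= cutoff are flagged as high risk."""
        flagged = [e for s, e in zip(scores, events) if s >= cutoff]
        return sum(flagged) / len(flagged) if flagged else None

    def threshold_free_ppv(scores, events):
        """Average the PPV over every achievable cutoff, removing the need to
        pick a single threshold.  Uncensored binary outcomes only; the paper's
        estimator also handles time-dependent event status and competing risks."""
        values = [ppv_at_cutoff(scores, events, c) for c in sorted(set(scores))]
        values = [v for v in values if v is not None]
        return sum(values) / len(values)
    ```

    Because every cutoff contributes, two risk scores can be compared on this summary without first agreeing on a clinical threshold.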

  19. Predicting treatment failure, death and drug resistance using a computed risk score among newly diagnosed TB patients in Tamaulipas, Mexico.

    PubMed

    Abdelbary, B E; Garcia-Viveros, M; Ramirez-Oropesa, H; Rahbar, M H; Restrepo, B I

    2017-10-01

    The purpose of this study was to develop a method for identifying newly diagnosed tuberculosis (TB) patients at risk for TB adverse events in Tamaulipas, Mexico. Surveillance data between 2006 and 2013 (8431 subjects) was used to develop risk scores based on predictive modelling. The final models revealed that TB patients failing their treatment regimen were more likely to have at most a primary school education, multi-drug resistance (MDR)-TB, and few to moderate bacilli on acid-fast bacilli smear. TB patients who died were more likely to be older males with MDR-TB, HIV, malnutrition, and reporting excessive alcohol use. Modified risk scores were developed with strong predictability for treatment failure and death (c-statistic 0.65 and 0.70, respectively), and moderate predictability for drug resistance (c-statistic 0.57). Among TB patients with diabetes, risk scores showed moderate predictability for death (c-statistic 0.68). Our findings suggest that in the clinical setting, the use of our risk scores for TB treatment failure or death will help identify these individuals for tailored management to prevent these adverse events. In contrast, the available variables in the TB surveillance dataset are not robust predictors of drug resistance, indicating the need for prompt testing at time of diagnosis.
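
    The c-statistics quoted above measure concordance: the probability that a randomly chosen patient who experienced the adverse event was assigned a higher risk score than a randomly chosen patient who did not. A minimal sketch on hypothetical scores:

    ```python
    def c_statistic(scores, outcomes):
        """Concordance (c-statistic) for a binary outcome: the fraction of
        case/non-case pairs in which the case received the higher risk score;
        tied scores count as half-concordant.  0.5 is chance, 1.0 is perfect."""
        pos = [s for s, y in zip(scores, outcomes) if y == 1]
        neg = [s for s, y in zip(scores, outcomes) if y == 0]
        concordant = sum(1.0 if p > n else 0.5 if p == n else 0.0
                         for p in pos for n in neg)
        return concordant / (len(pos) * len(neg))
    ```

    On this scale the study's values of 0.65-0.70 indicate moderate-to-strong discrimination, while 0.57 for drug resistance is only marginally better than chance.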

  20. Outcome of Kidney Transplantation From Donor After Cardiac Death: Reanalysis of the US Mycophenolic Renal Transplant Registry.

    PubMed

    Zhu, D; McCague, K; Lin, W; Rong, R; Xu, M; Chan, L; Zhu, T

    2018-06-01

    Kidney transplantation is limited by the shortage of donor kidneys. Donation after cardiac death (DCD) has been explored to alleviate this problem. To better understand the outcome of DCD kidney transplantation, we reanalyzed the Mycophenolic Renal Transplant (MORE) Registry. We compared delayed graft function (DGF), biopsy-proved acute rejection (BPAR), graft loss, and patient death between DCD and donation after brain death (DBD) kidney transplantations. Recipients were further stratified into depleting and nondepleting induction groups for exploratory analysis. There were 548 patients who received kidney transplants from deceased donor in the MORE Registry. Among them, 133 received grafts from DCD donors and 415 received from DBD donors. The incidence of DGF was 29.4% and 23.5% in the DCD group and the DBD group, respectively (P = .1812), and the incidence of BPAR at 12 months was 9.0% and 9.9% respectively (P = .7713). The 1-year graft loss rate in the DCD group was higher than that in the DBD group (7.5% vs 3.1%, P = .0283), and the 4-year graft loss rate and patient death rate were not significantly different between the 2 groups. The DCD kidney transplant group had acceptable short-term outcomes and good long-term outcomes as compared with the DBD kidney transplant group. Copyright © 2018 Elsevier Inc. All rights reserved.

  1. Anterior Cruciate Ligament-Derived Stem Cells Transduced With BMP2 Accelerate Graft-Bone Integration After ACL Reconstruction.

    PubMed

    Kawakami, Yohei; Takayama, Koji; Matsumoto, Tomoyuki; Tang, Ying; Wang, Bing; Mifune, Yutaka; Cummins, James H; Warth, Ryan J; Kuroda, Ryosuke; Kurosaka, Masahiro; Fu, Freddie H; Huard, Johnny

    2017-03-01

    Strong graft-bone integration is a prerequisite for successful graft remodeling after reconstruction of the anterior cruciate ligament (ACL) using soft tissue grafts. Novel strategies to accelerate soft tissue graft-bone integration are needed to reduce the need for bone-tendon-bone graft harvest, reduce patient convalescence, facilitate rehabilitation, and reduce total recovery time after ACL reconstruction. The application of ACL-derived stem cells with enhanced expression of bone morphogenetic protein 2 (BMP2) onto soft tissue grafts in the form of cell sheets will both accelerate and improve the quality of graft-bone integration after ACL reconstruction in a rat model. Controlled laboratory study. ACL-derived CD34+ cells were isolated from remnant human ACL tissues, virally transduced to express BMP2, and embedded within cell sheets. In a rat model of ACL injury, bilateral single-bundle ACL reconstructions were performed, in which cell sheets were wrapped around tendon autografts before reconstruction. Four groups containing a total of 48 rats (96 knees) were established (n = 12 rats; 24 knees per group): CD34+BMP2 (100%), CD34+BMP2 (25%), CD34+ (untransduced), and a control group containing no cells. Six rats from each group were euthanized 2 and 4 weeks after surgery, and each graft was harvested for immunohistochemical and histological analyses. The remaining 6 rats in each group were euthanized at 4 and 8 weeks to evaluate in situ tensile load to failure in each femur-graft-tibia complex. In vitro, BMP2 transduction promoted the osteogenic differentiation of ACL-derived CD34+ cells while retaining their intrinsic multipotent capabilities. Osteoblast densities were greatest in the BMP2 (100%) and BMP2 (25%) groups. Bone tunnels in the CD34+BMP2 (100%) and CD34+BMP2 (25%) groups had the smallest cross-sectional areas according to micro-computed tomography analyses. Graft-bone integration occurred most rapidly in the CD34+BMP2 (25%) group. Tensile load to

  2. Factors associated with treatment failure, dropout, and death in a cohort of tuberculosis patients in Recife, Pernambuco State, Brazil.

    PubMed

    de Albuquerque, Maria de Fátima Pessoa Militão; Ximenes, Ricardo Arraes de Alencar; Lucena-Silva, Norma; de Souza, Wayner Vieira; Dantas, Andréa Tavares; Dantas, Odimariles Maria Souza; Rodrigues, Laura Cunha

    2007-07-01

    A cohort of cases initiating tuberculosis treatment from May 2001 to July 2003 was followed in Recife, Pernambuco State, Brazil, to investigate biological, clinical, social, lifestyle, and healthcare access factors associated with three negative tuberculosis treatment outcomes (treatment failure, dropout, and death) separately and as a group. Treatment failure was associated with treatment delay, illiteracy, and alcohol consumption. Factors associated with dropout were age, prior TB treatment, and illiteracy. Death was associated with age, treatment delay, HIV co-infection, and head of family's income. Main factors associated with negative treatment outcomes as a whole were age, HIV co-infection, illiteracy, alcoholism, and prior TB treatment. We suggest the following strategies to increase cure rates: further training of the Family Health Program personnel in TB control, awareness-raising on the need to tailor their activities to special care for cases (e.g., literacy training); targeting use of directly observed therapy for higher risk groups; establishment of a flexible referral scheme to handle technical and psychosocial problems, including alcoholism; and increased collaboration with the HIV/AIDS program.

  3. The effect of timing and graft dysfunction on survival and cardiac allograft vasculopathy in antibody-mediated rejection.

    PubMed

    Clerkin, Kevin J; Restaino, Susan W; Zorn, Emmanuel; Vasilescu, Elena R; Marboe, Charles C; Mancini, Donna M

    2016-09-01

    Antibody-mediated rejection (AMR) has been associated with increased death and cardiac allograft vasculopathy (CAV). Early studies suggested that late AMR was rarely associated with graft dysfunction, whereas recent reports have demonstrated an association with increased mortality. We investigated the timing of AMR and its association with graft dysfunction, death, and CAV. This retrospective cohort study identified all adult orthotopic heart transplant (OHT) recipients (N = 689) at Columbia University Medical Center from 2004 to 2013. There were 68 primary cases of AMR, which were stratified by early (< 1 year post-OHT) or late (> 1 year post-OHT) AMR. Kaplan-Meier survival analysis and modeling was performed with multivariable logistic regression and Cox proportional hazards regression. From January 1, 2004, through October 1, 2015, early AMR (median 23 days post-OHT) occurred in 43 patients and late AMR (median 1,084 days post-OHT) occurred in 25. Graft dysfunction was less common with early compared with late AMR (25.6% vs 56%, p = 0.01). Patients with late AMR had decreased post-AMR survival compared with early AMR (1 year: 80% vs 93%, 5 years: 51% vs 73%, p < 0.05). When stratified by graft dysfunction, only those with late AMR and graft dysfunction had worse survival (30 days: 79%, 1 year: 64%, 5 years: 36%; p < 0.006). The association remained irrespective of age, sex, donor-specific antibodies, left ventricular assist device use, reason for OHT, and recovery of graft function. Similarly, those with late AMR and graft dysfunction had accelerated development of de novo CAV (50% at 1 year; hazard ratio, 5.42; p = 0.009), whereas all other groups were all similar to the general transplant population. Late AMR is frequently associated with graft dysfunction. When graft dysfunction is present in late AMR, there is an early and sustained increased risk of death and rapid development of de novo CAV despite aggressive treatment. Copyright © 2016 International

  4. Instrumentation Failure after Partial Corpectomy with Instrumentation of a Metastatic Spine.

    PubMed

    Park, Sung Bae; Kim, Ki Jeong; Han, Sanghyun; Oh, Sohee; Kim, Chi Heon; Chung, Chun Kee

    2018-05-01

    To identify the perioperative factors associated with instrumentation failure in patients undergoing partial corpectomy with instrumentation (PCI) for spinal metastasis. We assessed 124 patients who underwent PCI for a metastatic spine from 1987 to 2011. The outcome measure was risk factors related to instrumentation failure. The preoperative factors analyzed were age, sex, ambulation, American Spinal Injury Association grade, bone mineral density, use of steroids, primary tumor site, number of vertebrae with metastasis, extra-bone metastasis, preoperative adjuvant chemotherapy, and preoperative spinal radiotherapy. The intraoperative factors were the number of fixed vertebrae, fixation in osteolytic vertebrae, bone grafting, and type of surgical approach. The postoperative factors included postoperative adjuvant chemotherapy and spinal radiotherapy. This study was supported by a National Research Foundation grant funded by the government. There were no study-specific biases related to conflicts of interest. There were 15 instrumentation failures (15/124, 12.1%). Preoperative ambulatory status and primary tumor site were not significantly related to the development of implant failure. There were no significant associations between insertion of a bone graft into the partial corpectomy site and instrumentation failure. The preoperative and operative factors analyzed were not significantly related to instrumentation failure. In univariable and multivariable analyses, postoperative spinal radiotherapy was the only variable significantly related to instrumentation failure (p = 0.049 and p = 0.050, respectively). When performing PCI in patients with spinal metastasis followed by postoperative spinal radiotherapy, the surgeon may consider the possibility of instrumentation failure and consider augmentation strategies other than the use of a bone graft for fusion.

  5. Full thickness skin grafts in periocular reconstructions: long-term outcomes.

    PubMed

    Rathore, Deepa S; Chickadasarahilli, Swaroop; Crossman, Richard; Mehta, Purnima; Ahluwalia, Harpreet Singh

    2014-01-01

    To evaluate the outcomes of eyelid reconstruction in patients who underwent full thickness skin grafts. A retrospective, noncomparative intervention study of patients who underwent periocular reconstruction with full thickness skin grafts between 2005 and 2011. One hundred consecutive Caucasian patients were included in the study, 54 women and 46 men. Mean follow up was 32 months. Indications for full thickness skin grafts were excision of eyelid tumors (98%) and cicatricial ectropion (2%). Sites of the lid defects were the lower lid (60%), medial canthus (32%), upper lid (6%), and lateral canthus (2%). The skin graft donor sites were supraclavicular (44%), upper eyelid (24%), inner brachial (18%), and postauricular (14%). Early postoperative complications included lower eyelid graft contracture (1%) and partial failure (1%). Late sequelae included lower eyelid graft contracture (4%) and hypertrophic scarring (23%). Of the 23 patients with hypertrophic scars, 21 achieved good outcomes following massage with silicone gel and steroid ointment and 2 had persistent moderate lumpiness. No statistically significant association was found between graft hypertrophy and donor site or graft size. Overall, 95% of patients achieved good final eyelid position. Good color match was seen in 94% and graft hypopigmentation in 6%. A statistically significant association was found between hypopigmentation and the supraclavicular and inner brachial donor sites. Most patients (94%) achieved good eyelid position and color match. The majority (91%) of early postoperative cicatricial sequelae can be reversed by massage, steroid ointment, and silicone gel application. Full thickness skin grafts have excellent graft survival rates and minimal donor site morbidity.

  6. A joint model for longitudinal and time-to-event data to better assess the specific role of donor and recipient factors on long-term kidney transplantation outcomes.

    PubMed

    Fournier, Marie-Cécile; Foucher, Yohann; Blanche, Paul; Buron, Fanny; Giral, Magali; Dantan, Etienne

    2016-05-01

    In renal transplantation, serum creatinine (SCr) is the main biomarker routinely measured to assess a patient's health, with chronic increases being strongly associated with long-term graft failure risk (death with a functioning graft or return to dialysis). Joint modeling may be useful to identify the specific role of risk factors in the chronic evolution of kidney transplant recipients: some may act through the SCr evolution, finally leading to graft failure, whereas others may be associated with graft failure without any modification of SCr. Sample data for 2749 patients transplanted between 2000 and 2013 with a functioning kidney at 1-year post-transplantation were obtained from the DIVAT cohort. A shared random effect joint model for longitudinal SCr values and time to graft failure was fitted. We show that graft failure risk depended on both the current value and the slope of SCr. Patients with deceased-donor grafts seemed to have a steeper SCr increase, as did patients with a history of diabetes, although neither feature was significantly associated with graft failure risk. Patients with a second graft were at higher risk of graft failure, independent of changes in SCr values. Anti-HLA immunization was associated with both processes simultaneously. Joint models for repeated and time-to-event data bring new opportunities to improve the epidemiological knowledge of chronic diseases. For instance, in renal transplantation, several features should receive additional attention, as we demonstrated that their association with graft failure risk is independent of the SCr evolution.
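    The shared-random-effect joint model in this record ties a longitudinal SCr submodel to the graft-failure hazard. A much cruder two-stage approximation (shown here only to make the idea concrete; it is not the authors' method) fits each patient's SCr trajectory separately and would carry the fitted current value and slope forward as Cox covariates. The sketch below implements the first stage on hypothetical data.

```python
# Two-stage approximation to a joint model (illustrative, not the
# authors' shared-random-effect model): stage 1 fits each patient's
# serum-creatinine (SCr) trajectory by ordinary least squares, so the
# fitted value and slope can later enter a survival model as covariates.

def ols_line(times, values):
    """Least-squares intercept and slope for one patient's SCr series."""
    n = len(times)
    mt = sum(times) / n
    mv = sum(values) / n
    sxx = sum((t - mt) ** 2 for t in times)
    sxy = sum((t - mt) * (v - mv) for t, v in zip(times, values))
    slope = sxy / sxx
    intercept = mv - slope * mt
    return intercept, slope

# Hypothetical SCr measurements (mg/dl) at months post-transplant.
patients = {
    "stable":    ([3, 6, 9, 12], [1.2, 1.25, 1.2, 1.3]),
    "declining": ([3, 6, 9, 12], [1.4, 1.8, 2.3, 2.9]),
}

for pid, (t, v) in patients.items():
    b0, b1 = ols_line(t, v)
    print(f"{pid}: SCr at 12 mo = {b0 + 12 * b1:.2f}, slope {b1:.3f}/mo")
```

    In the full joint model these quantities are estimated jointly with the hazard rather than plugged in, which avoids the bias a two-stage plug-in introduces.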

  7. An Incident Cohort Study Comparing Survival on Home Hemodialysis and Peritoneal Dialysis (Australia and New Zealand Dialysis and Transplantation Registry)

    PubMed Central

    Nadeau-Fredette, Annie-Claire; Hawley, Carmel M.; Pascoe, Elaine M.; Chan, Christopher T.; Clayton, Philip A.; Polkinghorne, Kevan R.; Boudville, Neil; Leblanc, Martine

    2015-01-01

    Background and objectives Home dialysis is often recognized as a first-choice therapy for patients initiating dialysis. However, studies comparing clinical outcomes between peritoneal dialysis and home hemodialysis have been very limited. Design, setting, participants, & measurements This Australia and New Zealand Dialysis and Transplantation Registry study assessed all Australian and New Zealand adult patients receiving home dialysis on day 90 after initiation of RRT between 2000 and 2012. The primary outcome was overall survival. The secondary outcomes were on-treatment survival, patient and technique survival, and death-censored technique survival. All results were adjusted with three prespecified models: multivariable Cox proportional hazards model (main model), propensity score quintile–stratified model, and propensity score–matched model. Results The study included 10,710 patients on incident peritoneal dialysis and 706 patients on incident home hemodialysis. Treatment with home hemodialysis was associated with better patient survival than treatment with peritoneal dialysis (5-year survival: 85% versus 44%, respectively; log-rank P<0.001). Using multivariable Cox proportional hazards analysis, home hemodialysis was associated with superior patient survival (hazard ratio for overall death, 0.47; 95% confidence interval, 0.38 to 0.59) as well as better on-treatment survival (hazard ratio for on-treatment death, 0.34; 95% confidence interval, 0.26 to 0.45), composite patient and technique survival (hazard ratio for death or technique failure, 0.34; 95% confidence interval, 0.29 to 0.40), and death-censored technique survival (hazard ratio for technique failure, 0.34; 95% confidence interval, 0.28 to 0.41). Similar results were obtained with the propensity score models as well as sensitivity analyses using competing risks models and different definitions for technique failure and lag period after modality switch, during which events were attributed to the
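    The survival figures quoted above come from Kaplan-Meier curves; for "death-censored technique survival," deaths are treated as censoring events and only technique failures count as events. A minimal sketch of the estimator on hypothetical toy data (the registry analysis itself also used Cox and propensity-score models):

```python
# Minimal Kaplan-Meier estimator (illustrative). For death-censored
# technique survival, deaths are censored and only technique failures
# count as events.

def kaplan_meier(times, events):
    """Return [(time, survival)] stepping down at each event time.

    times  : follow-up time per patient
    events : 1 = event (e.g. technique failure), 0 = censored (e.g. death)
    """
    at_risk = len(times)
    surv = 1.0
    steps = []
    for t in sorted(set(times)):
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        if d:
            surv *= 1 - d / at_risk
            steps.append((t, surv))
        at_risk -= sum(1 for ti in times if ti == t)  # all leaving at t
    return steps

# Hypothetical follow-up times (years) for eight patients.
times  = [1, 2, 2, 3, 4, 5, 5, 6]
events = [1, 0, 1, 1, 0, 1, 0, 0]
print(kaplan_meier(times, events))
```

    The product-limit form makes clear why censoring a death removes the patient from the risk set without forcing a step down in the curve.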

  8. Short daily-, nocturnal- and conventional-home hemodialysis have similar patient and treatment survival.

    PubMed

    Tennankore, Karthik K; Na, Yingbo; Wald, Ron; Chan, Christopher T; Perl, Jeffrey

    2018-01-01

    Home hemodialysis (HHD) has many benefits, but less is known about relative outcomes when comparing different home-based hemodialysis modalities. Here, we compare patient and treatment survival for patients receiving short daily HHD (2-3 hours, 5 or more sessions per week), nocturnal HHD (6-8 hours, 5 or more sessions per week) and conventional HHD (3-6 hours, 2-4 sessions per week). A nationally representative cohort of Canadian HHD patients from 1996-2012 was studied. The primary outcome was death or treatment failure (defined as a permanent return to in-center hemodialysis or peritoneal dialysis) in an intention-to-treat analysis, with death-censored treatment failure as a secondary outcome. The cohort consisted of 600, 508 and 202 patients receiving conventional, nocturnal, and short daily HHD, respectively. Conventional-HHD patients were more likely to use dialysis catheter access (43%) versus nocturnal or short daily HHD (32% and 31%, respectively). Although point estimates were in favor of both therapies, after multivariable adjustment for patient and center factors, there was no statistically significant reduction in the relative hazard for the death/treatment failure composite comparing nocturnal to conventional HHD (hazard ratio 0.83 [95% confidence interval 0.66-1.03]) or short daily to conventional HHD (0.84, 0.63-1.12). Among those with information on vascular access, patients receiving nocturnal HHD had a relative improvement in death-censored treatment survival (0.75, 0.57-0.98). Thus, in this national cohort of HHD patients, those receiving short daily and nocturnal HHD had similar patient/treatment survival compared with patients receiving conventional HHD. Copyright © 2017 International Society of Nephrology. Published by Elsevier Inc. All rights reserved.

  9. Effects of braiding on tensile properties of four-strand human hamstring tendon grafts.

    PubMed

    Millett, Peter J; Miller, Bruce S; Close, Matthew; Sterett, William I; Walsh, William; Hawkins, Richard J

    2003-01-01

    Anterior cruciate ligament reconstruction is commonly performed with autogenous hamstring tendon grafts. To ascertain the effects of braiding on the ultimate tensile strength and stiffness of hamstring tendon grafts. Controlled laboratory study. Sixteen fresh-frozen semitendinosus and gracilis tendons were harvested from eight matched (right and left) human cadaveric specimens. Both sets of hamstring tendons from each matched pair were doubled, creating a four-strand graft. Grafts were then randomized so that one graft from each matched pair was braided and the other remained unbraided. The diameter of each graft construct was recorded. Grafts were tested to failure on a materials testing machine. There were no significant differences in cross-sectional area before or after braiding. Fifteen of 16 tendons failed midsubstance; 1 failed at the lower clamp. Braiding reduced the initial tensile strength and stiffness of human hamstring tendon grafts in this study by 35.0% and 45.8%, respectively. Braiding may place the collagen fibers in a suboptimal orientation for loading that results in a weaker graft. We do not recommend the use of braiding if the strongest, stiffest initial graft is desired.

  10. Cunninghamella bertholletiae Infection in a HLA-Haploidentical Hematopoietic Stem Cell Transplant Recipient with Graft Failure: Case Report and Review of the Literature.

    PubMed

    Luo, Chao; Wang, Jiasheng; Hu, Yongxian; Luo, Yi; Tan, Yamin; Jin, Aiyun; Wei, Bin; Hu, Huixian; Huang, He

    2016-10-01

    Cunninghamella bertholletiae, a rare cause of mucormycosis, has been described almost exclusively in immunosuppressed patients such as hematopoietic stem cell transplant (HSCT) recipients. The infection is associated with high mortality despite aggressive treatment. We describe a 40-year-old man who, after HLA-haploidentical HSCT, developed fungal pneumonitis caused by C. bertholletiae, complicated by graft failure and prolonged neutropenia. The patient died 102 days after HSCT despite early use of posaconazole and amphotericin B, which are believed to be the two most effective antifungal agents against C. bertholletiae. The case highlights the extremely unfavorable outcome of C. bertholletiae infection and identifies neutropenia as a major risk factor.

  11. Glenoid bone grafting in primary reverse total shoulder arthroplasty.

    PubMed

    Ernstbrunner, Lukas; Werthel, Jean-David; Wagner, Eric; Hatta, Taku; Sperling, John W; Cofield, Robert H

    2017-08-01

    Severe glenoid bone loss remains a challenge in patients requiring shoulder arthroplasty and may necessitate glenoid bone grafting. The purpose of this study was to determine results, complications, and rates of failure of glenoid bone grafting in primary reverse shoulder arthroplasty. Forty-one shoulders that underwent primary reverse arthroplasty between 2006 and 2013 with a minimum follow-up of 2 years (mean, 2.8 years; range, 2-6 years) were reviewed. Thirty-four (83%) received corticocancellous grafts and 7 (17%) structural grafts. Active range of motion and pain levels were significantly improved (P < .001), with mean American Shoulder and Elbow Surgeons score of 77, Simple Shoulder Test score of 9, and patient satisfaction of 93% at the most recent follow-up. Preoperative severe glenoid erosion and increasing body mass index were significantly associated with worse American Shoulder and Elbow Surgeons scores (P = .04). On radiographic evaluation, 7 patients (18%) had grade 1 or grade 2 glenoid lucency. Glenoid bone graft incorporation was observed in 31 patients (78%). Twelve patients (30%) suffered from grade 1 or grade 2 scapular notching. All of the patients with structural grafts showed graft incorporation and no signs of glenoid lucency. Although glenoid lucency, glenoid graft resorption, and scapular notching were present at short-term to midterm follow-up, none of the patients needed revision surgery. Primary reverse shoulder arthroplasty with glenoid reconstruction using bone graft relieved pain and restored shoulder function and stability. Copyright © 2017 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.

  12. Liver transplant using donors after cardiac death: a single-center approach providing outcomes comparable to donation after brain death.

    PubMed

    Vanatta, Jason M; Dean, Amanda G; Hathaway, Donna K; Nair, Satheesh; Modanlou, Kian A; Campos, Luis; Nezakatgoo, Nosratollah; Satapathy, Sanjaya K; Eason, James D

    2013-04-01

    Organ donation after cardiac death remains an available resource to meet the demand for transplant. However, concern persists that outcomes associated with donation after cardiac death liver allografts are not equivalent to those obtained with organ donation after brain death. The aim of this matched case control study was to determine whether outcomes of liver transplants with donation after cardiac death donors are equivalent to outcomes with donation after brain death donors by controlling for careful donor and recipient selection, surgical technique, and preservation solution. A retrospective, matched case control study of adult liver transplant recipients at the University of Tennessee/Methodist University Hospital Transplant Institute, Memphis, Tennessee was performed. Thirty-eight donation after cardiac death recipients were matched 1:2 with 76 donation after brain death recipients by recipient age, recipient laboratory Model for End Stage Liver Disease score, and donor age to form the 2 groups. A comprehensive approach that controlled for careful donor and recipient matching, surgical technique, and preservation solution was used to minimize warm ischemia time, cold ischemia time, and ischemia-reperfusion injury. Patient and graft survival rates were similar in both groups at 1 and 3 years (P = .444 and P = .295). There was no statistically significant difference in primary nonfunction, vascular complications, or biliary complications. In particular, there was no statistically significant difference in ischemic-type diffuse intrahepatic strictures (P = .107). These findings provide further evidence that the excellent patient and graft survival rates expected with liver transplants using organ donation after brain death donors can be achieved with organ donation after cardiac death donors without statistically higher rates of morbidity or mortality when a comprehensive approach that controls for careful donor and recipient matching, surgical technique, and

  13. Comparison of 5-Year Outcomes After Coronary Artery Bypass Grafting in Heart Failure Patients With Versus Without Preserved Left Ventricular Ejection Fraction (from the CREDO-Kyoto CABG Registry Cohort-2).

    PubMed

    Marui, Akira; Nishiwaki, Noboru; Komiya, Tatsuhiko; Hanyu, Michiya; Tanaka, Shiro; Kimura, Takeshi; Sakata, Ryuzo

    2015-08-15

    Heart failure (HF) with reduced left ventricular (LV) ejection fraction (HFrEF) is regarded as an independent risk factor for poor outcomes after coronary artery bypass grafting (CABG). However, the impact of HF with preserved EF (HFpEF) has remained unclear. Among the 15,939 patients who underwent first coronary revascularization enrolled in the CREDO-Kyoto (Coronary REvascularization Demonstrating Outcome Study in Kyoto) Registry Cohort-2, we identified 1,877 patients who received isolated CABG. Of them, 1,489 patients had normal LV function (LVEF >50% without a history of HF; Normal group), 236 had HFrEF (LVEF ≤50% with HF), and 152 had HFpEF (LVEF >50% with HF). Preoperative LVEF was the lowest in the HFrEF group (62 ± 12%, 36 ± 9%, and 61 ± 7% for the Normal, HFrEF, and HFpEF groups, respectively; p <0.001). The unadjusted 30-day mortality rate was highest in the HFrEF group (0.5%, 3.0%, and 0.7%; p = 0.003). However, the cumulative incidence of all-cause death at 5 years was highest in the HFpEF group (14%, 27%, and 32%, respectively; p <0.001). After adjusting for confounders, the risk of all-cause death in the HFpEF group was greater than in the Normal group (hazard ratio [HR] 1.42; 95% confidence interval [CI] 1.02 to 1.97; p = 0.04). The risk of all-cause death was not different between the HFpEF and HFrEF groups (HR 0.88; 95% CI 0.61 to 1.29; p = 0.52). In addition, the risks of cardiac death and sudden death in the HFpEF group were greater than in the Normal group (HR 2.14, 95% CI 1.32 to 3.49, p = 0.002; and HR 3.60, 95% CI 1.55 to 8.36, p = 0.003, respectively), and the risks of those end points were not different between the HFrEF and HFpEF groups. Despite the low 30-day mortality rate after CABG in patients with HFpEF, HFpEF was associated with high risks of long-term death and cardiovascular events. Patients with HFpEF, as well as those with HFrEF, should be operated on and followed up carefully. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Polyclonal and monoclonal antibodies for induction therapy in kidney transplant recipients.

    PubMed

    Hill, Penny; Cross, Nicholas B; Barnett, A Nicholas R; Palmer, Suetonia C; Webster, Angela C

    2017-01-11

    (CNI) treatment. ATG (with CNI therapy) had uncertain effects on death (3 to 6 months, 3 studies: RR 0.41, 0.13 to 1.22; 1 to 2 years, 5 studies: RR 0.75, 0.27 to 2.06; 5 years, 2 studies: RR 0.94, 0.11 to 7.81) and graft loss (3 to 6 months, 4 studies: RR 0.60, 0.34 to 1.05; 1 to 2 years, 3 studies: RR 0.65, 0.36 to 1.19). The effect of ATG on death-censored graft loss was uncertain at 1 to 2 years and 5 years. In non-CNI studies, ATG had uncertain effects on death but reduced death-censored graft loss (6 studies: RR 0.55, 0.38 to 0.78). When CNI and older non-CNI studies were combined, a benefit was seen with ATG at 1 to 2 years for both all-cause graft loss (7 studies: RR 0.71, 0.53 to 0.95) and death-censored graft loss (8 studies: RR 0.55, 0.39 to 0.77), but this was not sustained longer term. ATG increased cytomegalovirus (CMV) infection (6 studies: RR 1.55, 1.24 to 1.95), leucopenia (4 studies: RR 3.86, 2.79 to 5.34) and thrombocytopenia (4 studies: RR 2.41, 1.61 to 3.61) but had uncertain effects on delayed graft function, malignancy, post-transplant lymphoproliferative disorder (PTLD), and new onset diabetes after transplantation (NODAT). Alemtuzumab was compared to ATG in six studies (446 patients) with early steroid withdrawal (ESW) or steroid minimisation. Alemtuzumab plus steroid minimisation reduced acute rejection compared to ATG at one year (4 studies: RR 0.57, 0.35 to 0.93). In the two studies with ESW only in the alemtuzumab arm, the effect of alemtuzumab on acute rejection at 1 year was uncertain compared to ATG (RR 1.27, 0.50 to 3.19). Alemtuzumab had uncertain effects on death (1 year, 2 studies: RR 0.39, 0.06 to 2.42; 2 to 3 years, 3 studies: RR 0.67, 95% CI 0.15 to 2.95), graft loss (1 year, 2 studies: RR 0.39, 0.13 to 1.30; 2 to 3 years, 3 studies: RR 0.98, 95% CI 0.47 to 2.06), and death-censored graft loss (1 year, 2 studies: RR 0.38, 0.08 to 1.81; 2 to 3 years, 3 studies: RR 2.45, 95% CI 0.67 to 8.97) compared to ATG.
Creatinine clearance was lower with
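    The pooled risk ratios quoted in this review come from meta-analysis. As an illustration of the underlying arithmetic only (not the review's actual software or study-level data), a fixed-effect inverse-variance pool can be reconstructed from study-level RRs and their 95% CIs:

```python
import math

# Fixed-effect inverse-variance pooling of risk ratios, reconstructed
# from reported point estimates and 95% CIs (illustrative sketch).

def pooled_rr(studies):
    """studies: list of (rr, ci_low, ci_high); returns (RR, low, high)."""
    num = den = 0.0
    for rr, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE of ln RR
        w = 1 / se ** 2                                   # inverse-variance weight
        num += w * math.log(rr)
        den += w
    est = num / den
    half = 1.96 / math.sqrt(den)
    return tuple(math.exp(v) for v in (est, est - half, est + half))

# Hypothetical pair of study-level RRs for death-censored graft loss.
print(pooled_rr([(0.55, 0.38, 0.78), (0.71, 0.53, 0.95)]))
```

    The pooled log-RR is a weighted average, so the combined estimate always lies between the most extreme study estimates; a random-effects model would widen the interval when heterogeneity is present.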

  15. Alpha/Beta T-Cell Depleted Grafts as an Immunological Booster to Treat Graft Failure after Hematopoietic Stem Cell Transplantation with HLA-Matched Related and Unrelated Donors

    PubMed Central

    Rådestad, E.; Wikell, H.; Engström, M.; Watz, E.; Sundberg, B.; Thunberg, S.; Uzunel, M.; Mattsson, J.; Uhlin, M.

    2014-01-01

    Allogeneic hematopoietic stem cell transplantation is associated with several complications and risk factors, for example, graft versus host disease (GVHD), viral infections, relapse, and graft rejection. While high levels of CD3+ cells in grafts can contribute to GVHD, they also promote the graft versus leukemia (GVL) effect. Infusions of extra lymphocytes from the original stem cell donor can be used as a treatment after transplantation for relapse or poor immune reconstitution, but they also increase the risk of GVHD. In peripheral blood, 95% of T-cells express the αβ T-cell receptor and the remaining T-cells express the γδ T-cell receptor. As αβ T-cells are the primary mediators of GVHD, depleting them from the graft should reduce this risk. In this pilot study, five patients transplanted with HLA-matched related and unrelated donors were treated with αβ T-cell depleted stem cell boosts. The majority of γδ T-cells in the grafts expressed Vδ2 and/or Vγ9. Most patients receiving αβ-depleted stem cell boosts increased their levels of white blood cells, platelets, and/or granulocytes 30 days after infusion. No signs of GVHD or other side effects were detected. A larger pool of patients with longer follow-up time is needed to confirm the data in this study. PMID:25371909

  16. Surveillance Duplex Ultrasonography of Stent Grafts for Popliteal Aneurysms.

    PubMed

    Pineda, Danielle M; Troutman, Douglas A; Dougherty, Matthew J; Calligaro, Keith D

    2016-05-01

    Stent grafts, also known as covered stents, have become an increasingly acceptable treatment for popliteal artery aneurysms. However, endovascular exclusion confers lower primary patency compared to traditional open bypass and exclusion. The purpose of this study was to evaluate whether duplex ultrasonography (DU) can reliably diagnose failing stent grafts placed for popliteal artery aneurysms prior to occlusion. Between June 5, 2007, and March 11, 2014, 21 stent grafts (Viabahn; Gore, Flagstaff, Arizona) were placed in 19 patients for popliteal artery aneurysms. All patients had at least 1 follow-up duplex scan postoperatively. Mean follow-up was 28.9 months (9-93 months). Postoperative DU surveillance was performed in our Intersocietal Accreditation Commission noninvasive vascular laboratory at 1 week postprocedure and every 6 months thereafter. Duplex ultrasonography measured peak systolic velocities (PSVs) and ratio of adjacent PSVs (Vr) every 5 cm within the stent graft and adjacent arteries. We retrospectively classified the following factors as "abnormal DU findings": focal PSV > 300 cm/s, uniform PSVs < 50 cm/s throughout the graft, and Vr > 3.0. These DU criteria were derived from laboratory-specific data that we previously published on failing stent grafts placed for lower extremity occlusive disease. Four of the 21 stent grafts presented with symptomatic graft thrombosis within 6 months of a normal DU. Three of these 4 patients presented with rest pain and underwent thrombectomy (2) or vein bypass (1), and 1 elected for nonintervention for claudication. Our results suggest that surveillance DU using criteria established for grafts placed for occlusive disease may not be useful for predicting stent graft failure in popliteal artery aneurysms. © The Author(s) 2016.
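    The "abnormal DU finding" thresholds this study quotes (focal PSV > 300 cm/s, uniform PSVs < 50 cm/s, Vr > 3.0) are explicit enough to encode as a simple screen. The sketch below is only an encoding of those thresholds; the input shapes are assumptions, and the study's own conclusion is that these occlusive-disease criteria may not predict stent-graft failure in popliteal aneurysms.

```python
# Screening rule for a failing stent graft from duplex ultrasound (DU),
# encoding the thresholds quoted in the study (illustrative only).

def abnormal_du(psv_cm_s, max_vr):
    """Return the list of triggered criteria.

    psv_cm_s : peak systolic velocities sampled every 5 cm along the graft
    max_vr   : highest ratio of adjacent PSVs
    """
    findings = []
    if max(psv_cm_s) > 300:
        findings.append("focal PSV > 300 cm/s")
    if all(v < 50 for v in psv_cm_s):
        findings.append("uniform PSV < 50 cm/s")
    if max_vr > 3.0:
        findings.append("Vr > 3.0")
    return findings

print(abnormal_du([60, 320, 80], max_vr=3.5))  # two criteria triggered
print(abnormal_du([70, 90, 85], max_vr=1.2))   # -> []
```

    Four grafts in the series thrombosed within 6 months of a normal scan, which is exactly the failure mode such a threshold screen cannot anticipate.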

  17. Improved long-term survival after intra-operative single high-dose ATG-Fresenius induction in renal transplantation: a single centre experience.

    PubMed

    Kaden, Jürgen; May, Gottfried; Völp, Andreas; Wesslau, Claus

    2009-01-01

    In organ grafts, donor-specific sensitization is initiated immediately after revascularization. Therefore, in 1990 we introduced the intra-operative single high-dose ATG-Fresenius (ATG-F) induction in addition to standard triple drug therapy (TDT) consisting of steroids, azathioprine and cyclosporin. A total of 778 first renal transplantations from deceased donors, performed between 1987 and 1998, were included in this evaluation. This retrospective analysis of clinic records and electronic databases presents data of all recipients of first kidney grafts who received two different ATG-F inductions (1st group: 9 mg/kg body weight as single high-dose intra-operatively, n=484; 2nd group: 3 mg/kg body weight on 7 or 8 consecutive days as multiple-dose starting also intra-operatively, n=78) and standard TDT alone (3rd group: TDT alone, n=216). The 10-year patient survival rates were 72.6+/-2.6% (TDT + ATG-F single high-dose), 79.5+/-5.1% (TDT + ATG-F multiple-dose) and 67.2+/-3.7% (TDT alone; Kaplan-Meier estimates with standard errors; ATG-F vs TDT alone, p=0.001). The 10-year graft survival rates with censoring of patients who died with a functioning graft were 73.8+/-2.4%, 57.7+/-5.8% and 58.4+/-3.6% (Kaplan-Meier estimates with standard errors; 1st vs 2nd and 3rd groups, respectively, p<0.001) and the 10-year graft survival rates with patient death counted as graft failure were 58.3+/-2.7%, 55.7+/-5.8% and 48.2+/-3.5% (Kaplan-Meier estimates with standard errors; ATG-F single high-dose vs TDT, p=0.023). In pre-sensitized recipients there were also significant differences in favour of ATG-F, more notably in the single high-dose ATG-F induction. A total of 69% of the patients in the two cohorts receiving ATG-F did not experience any transplant rejections compared to 56% in patients undergoing TDT alone (p=0.018). The incidence of infectious complications was comparable across all groups.
According to evidence obtained from the routine documentation of 778

  18. From arterial stiffness to kidney graft microvasculature: Mortality and graft survival within a cohort of 220 kidney transplant recipients.

    PubMed

    Cheddani, Lynda; Radulescu, Camélia; Chaignon, Michel; Karras, Alexandre; Neuzillet, Yann; Duong, Jean-Paul; Tabibzadeh, Nahid; Letavernier, Emmanuel; Delahousse, Michel; Haymann, Jean-Philippe

    2018-01-01

    Aortic stiffness assessed by carotid-femoral pulse wave velocity (CF-PWV) is a predictor of mortality in several populations. However, little is known about its prognostic value in kidney transplant recipients. Our objectives were to evaluate the ability of CF-PWV measured 3 months following transplantation to predict mortality and graft loss, and its potential links to measured glomerular filtration rate (mGFR) or kidney graft microvasculature parameters. The study is based on a monocentric retrospective cohort including 220 adult kidney graft recipients evaluated three months after transplantation. CF-PWV measures and clinical, laboratory and histological data obtained at 3 (M3) and 12 months (M12) following transplantation were retrospectively collected. The two primary endpoints were all-cause mortality and occurrence of end stage renal disease (ESRD) defined by initiation of dialysis. After a median follow-up of 5.5 years [1.9; 8.8], death and graft loss occurred in 10 and 12 patients, respectively. M3 CF-PWV was an independent mortality risk factor (HR = 1.29 [1.03; 1.61]; p = 0.03), despite no aortic stiffness variation during the first year of transplantation. Of note, M3 CF-PWV was not associated with M12 mGFR or the ESRD outcome. Graft microcirculation assessed by the Banff vascular fibrous intimal thickening score (cv) worsened between M3 and M12 (p = 0.01), but no link was found with CF-PWV, mGFR or the ESRD outcome. Surprisingly, acute rejection by M3 was associated with mortality after adjustment (p = 0.03) but not with ESRD. Aortic stiffness measured 3 months after kidney transplantation is a strong predictor of mortality, with no obvious influence on kidney graft microvasculature or graft loss.

  19. Independent Suture Tape Reinforcement of Small and Standard Diameter Grafts for Anterior Cruciate Ligament Reconstruction: A Biomechanical Full Construct Model.

    PubMed

    Bachmaier, Samuel; Smith, Patrick A; Bley, Jordan; Wijdicks, Coen A

    2018-02-01

    To compare the dynamic elongation, stiffness behavior, and ultimate failure load of standard versus small diameter soft tissue grafts for anterior cruciate ligament (ACL) reconstruction with and without high-strength suture tape reinforcement. Both a tripled "small" diameter and a "standard" quadrupled tendon graft with and without suture tape reinforcement were tested using suspensory fixation (n = 8 each group). The suture tape was passed through the suspensory fixation button on the femur and tibia to ensure independent (safety belt) fixation from the graft in vitro. The testing of the constructs included position-controlled cyclic loading, force-controlled cyclic loading at 250 N and 400 N, as well as pull to failure (50 mm/min). Reinforcement of a small diameter graft significantly reduced dynamic elongation, by 38% (1.46 ± 0.28 mm vs 2.34 ± 0.44 mm, P < .001) and 50% (2.55 ± 0.44 mm vs 5.06 ± 0.67 mm, P < .001) after the 250 N and 400 N load protocols, respectively. Reinforcement of a standard diameter tendon graft decreased dynamic elongation by 15% (1.59 ± 0.34 mm vs 1.86 ± 0.17 mm, P = .066) and 26% (2.62 ± 0.44 mm vs 3.55 ± 0.44 mm, P < .001). No significant difference was found between the two reinforced models. The ultimate failure loads of small and standard diameter reinforced grafts were 1592 ± 105 N and 1585 ± 265 N, a 64% (P < .001) and 40% (P < .001) increase compared with their respective controls. Independent suture tape reinforcement of soft tissue grafts for ACL reconstruction leads to significantly reduced elongation and a higher ultimate failure load, while remaining consistent with in vivo native ACL function data and without stress-shielding the soft tissue graft. If in vitro results translate to human knees in vivo, the suture tape reinforcement technique for ACL reconstruction may decrease the risk of graft tears, particularly in the case of small diameter soft tissue grafts. Copyright © 2017 Arthroscopy Association of North America.

  20. The Automation System Censor Speech for the Indonesian Rude Swear Words Based on Support Vector Machine and Pitch Analysis

    NASA Astrophysics Data System (ADS)

    Endah, S. N.; Nugraheni, D. M. K.; Adhy, S.; Sutikno

    2017-04-01

    According to Law No. 32 of 2002 and Indonesian Broadcasting Commission Regulations No. 02/P/KPI/12/2009 and No. 03/P/KPI/12/2009, broadcast programs must not scold with harsh words or harass, insult, or demean minorities and marginalized groups. However, there are no suitable tools to censor such words automatically, so research into intelligent software that censors them automatically is needed. To perform censoring, the system must first recognize the words in question. This research proposes classifying speech into two classes with a Support Vector Machine (SVM): the first class is a set of rude swear words and the second is a set of acceptable words. Speech pitch values serve as the SVM input, and the approach is used to develop the system for Indonesian rude swear words. The experimental results show that the SVM performs well for this system.
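    The paper feeds pitch values to an SVM for two-class word classification. A minimal linear SVM trained with the Pegasos sub-gradient method is sketched below on synthetic features; the feature encoding, the data, and all names are illustrative assumptions, since the abstract does not give the exact pipeline or kernel.

```python
import random

# Minimal linear SVM trained with the Pegasos sub-gradient method
# (an illustrative stand-in for the paper's classifier). A constant
# 1.0 feature plays the role of the bias term.

def train_svm(xs, ys, lam=0.01, epochs=500, seed=0):
    """xs: feature vectors, ys: labels in {-1, +1}; returns weight vector."""
    rng = random.Random(seed)
    w = [0.0] * len(xs[0])
    t = 0
    for _ in range(epochs):
        for i in rng.sample(range(len(xs)), len(xs)):
            t += 1
            eta = 1.0 / (lam * t)
            margin = ys[i] * sum(wj * xj for wj, xj in zip(w, xs[i]))
            w = [(1 - eta * lam) * wj for wj in w]        # regularization shrink
            if margin < 1:                                 # hinge-loss sub-gradient
                w = [wj + eta * ys[i] * xj for wj, xj in zip(w, xs[i])]
    return w

def predict(w, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) >= 0 else -1

# Hypothetical per-utterance features: (mean pitch / 100 Hz, pitch
# variance, constant 1.0); +1 = rude swear word, -1 = acceptable word.
xs = [[2.8, 1.9, 1.0], [3.0, 2.2, 1.0], [2.9, 2.0, 1.0],
      [1.2, 0.4, 1.0], [1.0, 0.5, 1.0], [1.3, 0.6, 1.0]]
ys = [1, 1, 1, -1, -1, -1]
w = train_svm(xs, ys)
print([predict(w, x) for x in xs])
```

    A production system would extract pitch with a dedicated speech library and likely use a kernel SVM; this sketch only shows the max-margin training loop in miniature.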

  1. Impact of Cold Ischemia Time in Kidney Transplants From Donation After Circulatory Death Donors.

    PubMed

    Kayler, Liise; Yu, Xia; Cortes, Carlos; Lubetzky, Michelle; Friedmann, Patricia

    2017-07-01

    Deceased-donor kidneys are exposed to ischemic events from donor instability during the process of donation after circulatory death (DCD). Clinicians may be reluctant to transplant DCD kidneys with prolonged cold ischemia time (CIT) for fear of an additional deleterious effect. We performed a retrospective cohort study examining US registry data between 1998 and 2013 of adult first-time kidney-only recipients of paired kidneys (derived from the same donor transplanted into different recipients) from DCD donors. On multivariable analysis, death-censored graft survival (DCGS) was comparable between recipients of kidneys with higher CIT relative to paired donor recipients with lower CIT when the CIT difference was 1 hour or longer (adjusted hazard ratio [aHR], 1.02; 95% confidence interval [CI], 0.88-1.17; n = 6276), 5 hours or longer (aHR, 0.98; 95% CI, 0.80-1.19; n = 3130), 10 hours or longer (aHR, 1.15; 95% CI, 0.82-1.60; n = 1124), or 15 hours or longer (aHR, 1.15; 95% CI, 0.66-1.99; n = 498). There was a higher rate of primary nonfunction in the long CIT groups for a delta of 1 hour or longer (0.89% vs 1.63%; P = 0.006), 5 hours or longer (1.09% vs 1.67%; P = 0.13), 10 hours or longer (0.53% vs 1.78%; P = 0.03), and 15 hours or longer (0.40% vs 1.61%; P = 0.18), respectively. Between each of the 4 delta CIT levels of shorter and longer CIT, there was a significantly and incrementally higher rate of delayed graft function in the long CIT groups for a delta of 1 hour or longer (37.3% vs 41.7%; P < 0.001), 5 hours or longer (35.9% vs 42.7%; P < 0.001), 10 hours or longer (29.4% vs 44.2%; P < 0.001), and 15 hours or longer (29.6% vs 46.1%; P < 0.001), respectively. Overall patient survival was comparable with delta CITs of 1 hour or longer (aHR, 0.96; 95% CI, 0.84-1.08), 5 hours or longer (aHR, 1.01; 95% CI, 0.85-1.20), and 15 hours or longer (aHR, 1.27; 95% CI, 0.79-2.06) but not 10 hours or longer (aHR, 1.47; 95% CI, 1.09-1.98). These results suggest that in the setting of a prior ischemic donor event, prolonged CIT has limited bearing on long-term outcomes.
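    Adjusted hazard ratios like those above come from Cox proportional-hazards models. A univariate Newton-Raphson fit of the Cox partial likelihood is sketched below on hypothetical death-censored data; the study itself fit multivariable models to registry data, so this is only a sketch of the estimator.

```python
import math

# Univariate Cox proportional-hazards fit by Newton's method
# (illustrative). x is a binary exposure, e.g. longer vs shorter cold
# ischemia time; exp(beta) estimates the hazard ratio.

def cox_beta(times, events, x, iters=25):
    """Newton-Raphson on the partial log-likelihood (assumes no tied event times)."""
    beta = 0.0
    order = sorted(range(len(times)), key=lambda i: times[i])
    for _ in range(iters):
        score = info = 0.0
        for k, i in enumerate(order):
            if not events[i]:
                continue
            risk = order[k:]                      # subjects still at risk at times[i]
            ws = [math.exp(beta * x[j]) for j in risk]
            s0 = sum(ws)
            s1 = sum(w * x[j] for w, j in zip(ws, risk))
            s2 = sum(w * x[j] ** 2 for w, j in zip(ws, risk))
            score += x[i] - s1 / s0               # first derivative
            info += s2 / s0 - (s1 / s0) ** 2      # negative second derivative
        beta += score / info
    return beta

# Hypothetical death-censored data: event = graft failure, death = censored.
times  = [2, 3, 4, 5, 6, 7, 8, 9, 10, 12]
events = [1, 1, 0, 1, 1, 0, 0, 1, 0, 0]
x      = [1, 1, 1, 0, 1, 0, 1, 0, 0, 0]
beta = cox_beta(times, events, x)
print(f"hazard ratio = {math.exp(beta):.2f}")
```

    Death-censoring enters only through the `events` vector: a death removes the patient from subsequent risk sets without contributing an event term.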

  2. Impact of Cold Ischemia Time in Kidney Transplants From Donation After Circulatory Death Donors

    PubMed Central

    Kayler, Liise; Yu, Xia; Cortes, Carlos; Lubetzky, Michelle; Friedmann, Patricia

    2017-01-01

    Background Deceased-donor kidneys are exposed to ischemic events from donor instability during the process of donation after circulatory death (DCD). Clinicians may be reluctant to transplant DCD kidneys with prolonged cold ischemia time (CIT) for fear of an additional deleterious effect. Methods We performed a retrospective cohort study examining US registry data between 1998 and 2013 of adult first-time kidney-only recipients of paired kidneys (derived from the same donor transplanted into different recipients) from DCD donors. Results On multivariable analysis, death-censored graft survival (DCGS) was comparable between recipients of kidneys with higher CIT relative to paired donor recipients with lower CIT when the CIT difference was 1 hour or longer (adjusted hazard ratio [aHR], 1.02; 95% confidence interval [CI], 0.88-1.17; n = 6276), 5 hours or longer (aHR, 0.98; 95% CI, 0.80-1.19; n = 3130), 10 hours or longer (aHR, 1.15; 95% CI, 0.82-1.60; n = 1124), or 15 hours or longer (aHR, 1.15; 95% CI, 0.66-1.99; n = 498). There was a higher rate of primary nonfunction in the long CIT groups for delta 1 hour or longer (0.89% vs 1.63%; P = 0.006), 5 hours (1.09% vs 1.67%; P = 0.13), 10 hours (0.53% vs 1.78%; P = 0.03), and 15 hours (0.40% vs 1.61%; P = 0.18), respectively. Between each of the 4 delta CIT levels of shorter and longer CIT, there was a significantly and incrementally higher rate of delayed graft function in the long CIT groups for delta 1 hour or longer (37.3% vs 41.7%; P < 0.001), 5 hours (35.9% vs 42.7%; P < 0.001), 10 hours (29.4% vs 44.2%; P < 0.001), and 15 hours (29.6% vs 46.1%; P < 0.001), respectively. Overall patient survival was comparable with delta CITs of 1 hour or longer (aHR, 0.96; 95% CI, 0.84-1.08), 5 hours (aHR, 1.01; 95% CI, 0.85-1.20), and 15 hours (aHR, 1.27; 95% CI, 0.79-2.06) but not 10 hours (aHR, 1.47; 95% CI, 1.09-1.98). Conclusions These results suggest that in the setting of a prior ischemic donor event, prolonged CIT has limited bearing on long-term outcomes.

  3. Estimating length of avian incubation and nestling stages in afrotropical forest birds from interval-censored nest records

    USGS Publications Warehouse

    Stanley, T.R.; Newmark, W.D.

    2010-01-01

    In the East Usambara Mountains in northeast Tanzania, research on the effects of forest fragmentation and disturbance on nest survival in understory birds resulted in the accumulation of 1,002 nest records between 2003 and 2008 for 8 poorly studied species. Because information on the length of the incubation and nestling stages in these species is nonexistent or sparse, our objectives in this study were (1) to estimate the length of the incubation and nestling stages and (2) to compute nest survival using these estimates in combination with calculated daily survival probability. Because our data were interval censored, we developed and applied two new statistical methods to estimate stage length. In the 8 species studied, the incubation stage lasted 9.6-21.8 days and the nestling stage 13.9-21.2 days. Combining these results with estimates of daily survival probability, we found that nest survival ranged from 6.0% to 12.5%. We conclude that our methodology for estimating stage lengths from interval-censored nest records is a reasonable and practical approach in the presence of interval-censored data. © 2010 The American Ornithologists' Union.
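
    The interval-censoring problem described in this record can be illustrated with a toy midpoint estimator: each nest's transition (e.g., hatching) is only known to fall between two visit days, so a simple point estimate averages the interval midpoints and then combines the stage length with daily survival probability, as the abstract does. This is a hedged sketch, not the paper's likelihood-based methods; all visit days and survival values below are invented.

```python
def midpoint_stage_length(intervals):
    """Each interval (t_last, t_first) brackets an unseen transition:
    t_last = last visit in the old stage, t_first = first visit in the
    new stage. The midpoint is a simple interval-censored point estimate."""
    midpoints = [(lo + hi) / 2.0 for lo, hi in intervals]
    return sum(midpoints) / len(midpoints)

# Invented visit records bracketing hatching (days since clutch completion)
hatch_intervals = [(12, 16), (13, 15), (11, 17), (14, 16)]
incubation = midpoint_stage_length(hatch_intervals)   # 14.25 days

# Nest survival = daily survival probability raised to total exposure days
daily_survival = 0.96   # invented daily survival probability
nestling = 16.0         # assumed nestling-stage length, days
nest_survival = daily_survival ** (incubation + nestling)
print(round(incubation, 2), round(nest_survival, 3))
```

    The midpoint estimator is unbiased only when visits bracket the transition symmetrically; the paper's methods were developed precisely because real visit schedules are irregular.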

  4. A Review on Grafting of Biofibers for Biocomposites

    PubMed Central

    Wei, Liqing; McDonald, Armando G.

    2016-01-01

    A recent increase in the use of biofibers as low-cost and renewable reinforcement for the polymer biocomposites has been seen globally. Biofibers are classified into: lignocellulosic fibers (i.e., cellulose, wood and natural fibers), nanocellulose (i.e., cellulose nanocrystals and cellulose nanofibrils), and bacterial cellulose, while polymer matrix materials can be petroleum based or bio-based. Green biocomposites can be produced using both biobased fibers and polymers. Incompatibility between the hydrophilic biofibers and hydrophobic polymer matrix can cause performance failure of resulting biocomposites. Diverse efforts have focused on the modification of biofibers in order to improve the performances of biocomposites. “Grafting” copolymerization strategy can render the advantages of biofiber and impart polymer properties onto it and the performance of biocomposites can be tuned through changing grafting parameters. This review presents a short overview of various “grafting” methods which can be directly or potentially employed to enhance the interaction between biofibers and a polymer matrix for biocomposites. Major grafting techniques, including ring opening polymerization, grafting via coupling agent and free radical induced grafting, have been discussed. Improved properties such as mechanical, thermal, and water resistance have provided grafted biocomposites with new opportunities for applications in specific industries. PMID:28773429

  5. Nurses to Their Nurse Leaders: We Need Your Help After a Failure to Rescue Patient Death.

    PubMed

    Bacon, Cynthia Thornton

    The purpose of this study was to describe nurses' needs and how they are being met and not met after caring for surgical patients who died after a failure to rescue (FTR). A qualitative, phenomenologic approach was used for the interview and analysis framework. Methods to ensure rigor and trustworthiness were incorporated into the design. The investigator conducted semistructured 1:1 interviews with 14 nurses. Data were analyzed using Colaizzi's methods. Four themes were identified: (1) coping mechanisms are important; (2) immediate peer and supervisor feedback and support are needed for successful coping; (3) subsequent supervisor support is crucial to moving on; and (4) nurses desire both immediate support and subsequent follow-up from their nurse leaders after every FTR death. Nurses' needs after experiencing an FTR patient death across multiple practice areas and specialties were remarkably similar and clearly identified and articulated. Coping mechanisms vary and are not uniformly effective across different groups. Although most nurses in this study received support from their peers after the FTR event, many nurses did not receive the feedback and support that they needed from their nurse leaders. Immediate nurse leader support and follow-up debriefings should be mandatory after patient FTR deaths. Developing an understanding of nurses' needs after experiencing an FTR event can assist nurse leaders to better support nurses who experience FTR deaths. Insight into the environment surrounding FTR deaths also provides a foundation for future research aimed at improving patient safety and quality through an improved working environment for nurses.

  6. New-onset diabetes after kidney transplantation-changes and challenges.

    PubMed

    Yates, C J; Fourlanos, S; Hjelmesaeth, J; Colman, P G; Cohney, S J

    2012-04-01

    Despite substantial improvement in short-term results after kidney transplantation, increases in long-term graft survival have been modest. A significant impediment has been the morbidity and mortality attributable to cardiovascular disease (CVD). New-onset diabetes after transplantation (NODAT) is an independent predictor of cardiovascular events. This review examines recent literature surrounding diagnosis, outcomes and management of NODAT. Amongst otherwise heterogeneous studies, a common finding is the relative insensitivity of fasting blood glucose (FBG) as a screening test. Incorporating self-testing of afternoon capillary BG and glycohemoglobin (HbA(1c)) detects many cases that would otherwise remain undetected without the oral glucose tolerance test (OGTT). Assessing the impact of NODAT on patient and graft survival is complicated by changes to diagnostic criteria, evolution of immunosuppressive regimens and increasing attention to cardiovascular risk management. Although recent studies reinforce a link between NODAT and death with a functioning graft (DWFG), there seems to be little effect on death-censored graft loss. The significance of glycemic control and diabetes resolution for patient outcomes remains notably absent from the NODAT literature, and treatment is also a neglected area. This review examines new and old therapeutic options, emphasizing the need to assess β-cell pathology in customizing therapy. Finally, areas warranting further research are considered. © Copyright 2011 The American Society of Transplantation and the American Society of Transplant Surgeons.

  7. Failure-to-rescue after injury is associated with preventability: The results of mortality panel review of failure-to-rescue cases in trauma.

    PubMed

    Kuo, Lindsay E; Kaufman, Elinore; Hoffman, Rebecca L; Pascual, Jose L; Martin, Niels D; Kelz, Rachel R; Holena, Daniel N

    2017-03-01

    Failure-to-rescue is defined as the conditional probability of death after a complication, and the failure-to-rescue rate reflects a center's ability to successfully "rescue" patients after complications. The validity of the failure-to-rescue rate as a quality measure is dependent on the preventability of death and the appropriateness of this measure for use in the trauma population is untested. We sought to evaluate the relationship between preventability and failure-to-rescue in trauma. All adjudications from a mortality review panel at an academic level I trauma center from 2005-2015 were merged with registry data for the same time period. The preventability of each death was determined by panel consensus as part of peer review. Failure-to-rescue deaths were defined as those occurring after any registry-defined complication. Univariate and multivariate logistic regression models between failure-to-rescue status and preventability were constructed and time to death was examined using survival time analyses. Of 26,557 patients, 2,735 (10.5%) had a complication, of whom 359 died for a failure-to-rescue rate of 13.2%. Of failure-to-rescue deaths, 272 (75.6%) were judged to be non-preventable, 65 (18.1%) were judged potentially preventable, and 22 (6.1%) were judged to be preventable by peer review. After adjusting for other patient factors, there remained a strong association between failure-to-rescue status and potentially preventable (odds ratio 2.32, 95% confidence interval, 1.47-3.66) and preventable (odds ratio 14.84, 95% confidence interval, 3.30-66.71) judgment. Despite a strong association between failure-to-rescue status and preventability adjudication, only a minority of deaths meeting the definition of failure to rescue were judged to be preventable or potentially preventable. Revision of the failure-to-rescue metric before use in trauma care benchmarking is warranted. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Failure-to-rescue after injury is associated with preventability: The results of mortality panel review of failure-to-rescue cases in trauma

    PubMed Central

    Kuo, Lindsay E.; Kaufman, Elinore; Hoffman, Rebecca L.; Pascual, Jose L.; Martin, Niels D.; Kelz, Rachel R.; Holena, Daniel N.

    2018-01-01

    Background Failure-to-rescue is defined as the conditional probability of death after a complication, and the failure-to-rescue rate reflects a center’s ability to successfully “rescue” patients after complications. The validity of the failure-to-rescue rate as a quality measure is dependent on the preventability of death and the appropriateness of this measure for use in the trauma population is untested. We sought to evaluate the relationship between preventability and failure-to-rescue in trauma. Methods All adjudications from a mortality review panel at an academic level I trauma center from 2005–2015 were merged with registry data for the same time period. The preventability of each death was determined by panel consensus as part of peer review. Failure-to-rescue deaths were defined as those occurring after any registry-defined complication. Univariate and multivariate logistic regression models between failure-to-rescue status and preventability were constructed and time to death was examined using survival time analyses. Results Of 26,557 patients, 2,735 (10.5%) had a complication, of whom 359 died for a failure-to-rescue rate of 13.2%. Of failure-to-rescue deaths, 272 (75.6%) were judged to be non-preventable, 65 (18.1%) were judged potentially preventable, and 22 (6.1%) were judged to be preventable by peer review. After adjusting for other patient factors, there remained a strong association between failure-to-rescue status and potentially preventable (odds ratio 2.32, 95% confidence interval, 1.47–3.66) and preventable (odds ratio 14.84, 95% confidence interval, 3.30–66.71) judgment. Conclusion Despite a strong association between failure-to-rescue status and preventability adjudication, only a minority of deaths meeting the definition of failure to rescue were judged to be preventable or potentially preventable. Revision of the failure-to-rescue metric before use in trauma care benchmarking is warranted. PMID:27788924

  9. Aortic operation after previous coronary artery bypass grafting: management of patent grafts for myocardial protection.

    PubMed

    Nakajima, Masato; Tsuchiya, Koji; Fukuda, Shoji; Morimoto, Hironobu; Mitsumori, Yoshitaka; Kato, Kaori

    2006-04-01

    Aortic surgery for progressive aortic valve disease or aortic aneurysm after previous coronary artery bypass grafting (CABG) is a challenging procedure. We report the outcome of aortic reoperation after previous CABG and evaluate our management of patent grafts and our methods for obtaining myocardial protection. From February 2001 to July 2003, 6 patients with progressive aortic valve disease and aneurysm of the thoracic aorta were operated on. The group comprised 3 men and 3 women with a mean age of 67.6 years. There were 4 patients with an aneurysm of the aortic arch, 1 with chronic ascending aortic dissection, and 1 with progressive aortic valve stenosis. The interval between previous CABG and aortic surgery was 74.0 +/- 44.2 months. All reoperations were performed via median resternotomy. Myocardial protection was obtained by hypothermic perfusion of patent in-situ arterial grafts following cold-blood cardioplegia administration via the aortic root under aortic cross clamping. The operative procedure was aortic arch replacement in 4 patients, ascending aortic replacement with double CABG in 1, and aortic valve replacement in 1. All patients survived the reoperation. Postoperative maximum creatine kinase-MB was 49.2 +/- 29.8 and no new Q-waves occurred in the electrocardiogram nor were any new wall motion abnormalities recognized on echocardiography. There were no late deaths during a follow-up of 30.7 months. Reoperative aortic procedures after CABG can be performed safely with myocardial protection via hypothermic perfusion of a patent in-situ arterial graft.

  10. Retransplantation in Late Hepatic Artery Thrombosis: Graft Access and Transplant Outcome.

    PubMed

    Buchholz, Bettina M; Khan, Shakeeb; David, Miruna D; Gunson, Bridget K; Isaac, John R; Roberts, Keith J; Muiesan, Paolo; Mirza, Darius F; Tripathi, Dhiraj; Perera, M Thamara P R

    2017-08-01

    Definitive treatment for late hepatic artery thrombosis (L-HAT) is retransplantation (re-LT); however, the L-HAT-associated disease burden is poorly represented in allocation models. Graft access and transplant outcome of the re-LT experience between 2005 and 2016 was reviewed with specific focus on the L-HAT cohort in this single-center retrospective study. Ninety-nine (5.7%) of 1725 liver transplantations were re-LT with HAT as the main indication (n = 43; 43%) distributed into early (n = 25) and late (n = 18) episodes. Model for end-stage liver disease as well as United Kingdom model for end-stage liver disease did not accurately reflect high disease burden of graft failure associated infections such as hepatic abscesses and biliary sepsis in L-HAT. Hence, re-LT candidates with L-HAT received low prioritization and waited longest until the allocation of an acceptable graft (median, 103 days; interquartile range, 28-291 days), allowing for progression of biliary sepsis. Balance of risk score and 3-month mortality score prognosticated good transplant outcome in L-HAT but, contrary to the prediction, the factual 1-year patient survival after re-LT was significantly inferior in L-HAT compared to early HAT, early non-HAT and late non-HAT (65% vs 82%, 92% and 95%) which was mainly caused by sepsis and multiorgan failure driving 3-month mortality (28% vs 11%, 16% and 0%). Access to a second graft after a median waitlist time of 6 weeks achieved the best short- and long-term outcome in re-LT for L-HAT (3-month mortality, 13%; 1-year survival, 77%). Inequity in graft access and peritransplant sepsis are fundamental obstacles for successful re-LT in L-HAT. Offering a graft for those in need at the best window of opportunity could facilitate earlier engrafting with improved outcomes.

  11. Comparison of Sprint Fidelis and Riata defibrillator lead failure rates.

    PubMed

    Fazal, Iftikhar A; Shepherd, Ewen J; Tynan, Margaret; Plummer, Christopher J; McComb, Janet M

    2013-09-30

    Sprint Fidelis and Riata defibrillator leads are prone to early failure. Few data exist on the comparative failure rates and mortality related to lead failure. The aims of this study were to determine the failure rate of Sprint Fidelis and Riata leads, and to compare failure rates and mortality rates in both groups. Patients implanted with Sprint Fidelis leads and Riata leads at a single centre were identified and in July 2012, records were reviewed to ascertain lead failures, deaths, and relationship to device/lead problems. 113 patients had Sprint Fidelis leads implanted between June 2005 and September 2007; Riata leads were implanted in 106 patients between January 2003 and February 2008. During 53.0 ± 22.3 months of follow-up there were 13 Sprint Fidelis lead failures (11.5%, 2.60% per year) and 25 deaths. Mean time to failure was 45.1 ± 15.5 months. In the Riata lead cohort there were 32 deaths, and 13 lead failures (11.3%, 2.71% per year) over 54.8 ± 26.3 months follow-up with a mean time to failure of 53.5 ± 24.5 months. There were no significant differences in the lead failure-free Kaplan-Meier survival curve (p=0.77), deaths overall (p=0.17), or deaths categorised as sudden/cause unknown (p=0.54). Sprint Fidelis and Riata leads have a significant but comparable failure rate at 2.60% per year and 2.71% per year of follow-up respectively. The number of deaths in both groups is similar and no deaths have been identified as being related to lead failure in either cohort. Copyright © 2012. Published by Elsevier Ireland Ltd.
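
    The per-year failure rates quoted in this record (2.60% and 2.71%) are essentially failures per patient-year of follow-up. A minimal sketch of that arithmetic, using the cohort sizes and mean follow-up durations given in the abstract (small discrepancies from the published figures reflect rounding of the summary statistics):

```python
def failures_per_patient_year(n_failures, n_patients, mean_follow_up_months):
    """Crude annualized failure rate: events divided by total patient-years."""
    patient_years = n_patients * mean_follow_up_months / 12.0
    return n_failures / patient_years

# Cohort figures taken from the abstract above
fidelis_rate = failures_per_patient_year(13, 113, 53.0)
riata_rate = failures_per_patient_year(13, 106, 54.8)
print(f"Fidelis {fidelis_rate:.2%}/yr, Riata {riata_rate:.2%}/yr")
```

    Note that this crude rate assumes a roughly constant hazard over follow-up; the study's Kaplan-Meier comparison (p=0.77) is the more appropriate test of whether the two leads' failure-free survival curves differ.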

  12. Hazard Function Estimation with Cause-of-Death Data Missing at Random

    PubMed Central

    Wang, Qihua; Dinse, Gregg E.; Liu, Chunling

    2010-01-01

    Hazard function estimation is an important part of survival analysis. Interest often centers on estimating the hazard function associated with a particular cause of death. We propose three nonparametric kernel estimators for the hazard function, all of which are appropriate when death times are subject to random censorship and censoring indicators can be missing at random. Specifically, we present a regression surrogate estimator, an imputation estimator, and an inverse probability weighted estimator. All three estimators are uniformly strongly consistent and asymptotically normal. We derive asymptotic representations of the mean squared error and the mean integrated squared error for these estimators and we discuss a data-driven bandwidth selection method. A simulation study, conducted to assess finite sample behavior, demonstrates that the proposed hazard estimators perform relatively well. We illustrate our methods with an analysis of some vascular disease data. PMID:22267874

  13. Hazard Function Estimation with Cause-of-Death Data Missing at Random.

    PubMed

    Wang, Qihua; Dinse, Gregg E; Liu, Chunling

    2012-04-01

    Hazard function estimation is an important part of survival analysis. Interest often centers on estimating the hazard function associated with a particular cause of death. We propose three nonparametric kernel estimators for the hazard function, all of which are appropriate when death times are subject to random censorship and censoring indicators can be missing at random. Specifically, we present a regression surrogate estimator, an imputation estimator, and an inverse probability weighted estimator. All three estimators are uniformly strongly consistent and asymptotically normal. We derive asymptotic representations of the mean squared error and the mean integrated squared error for these estimators and we discuss a data-driven bandwidth selection method. A simulation study, conducted to assess finite sample behavior, demonstrates that the proposed hazard estimators perform relatively well. We illustrate our methods with an analysis of some vascular disease data.
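
    A minimal numerical sketch of the inverse-probability-weighting idea behind one of the three estimators described above: deaths whose cause indicator is observed are up-weighted by 1/π, where π is the probability that the indicator is observed, so that missing-at-random indicators do not bias the kernel hazard estimate. The simulation parameters, the Gaussian kernel, and the constant-π assumption are all invented for illustration and are not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
death_t = rng.exponential(1.0, n)            # event times, true hazard = 1
cens_t = rng.exponential(2.0, n)             # independent right censoring
t = np.minimum(death_t, cens_t)
delta = (death_t <= cens_t).astype(float)    # 1 if a death is observed
p_obs = 0.7                                  # assumed P(indicator observed | death)
obs = np.where(delta == 1, rng.random(n) < p_obs, True).astype(float)

# Risk-set size Y(T_i) evaluated at each subject's own observed time
at_risk = np.array([(t >= ti).sum() for ti in t])

def ipw_kernel_hazard(s, h=0.3):
    """Gaussian-kernel hazard estimate at time s, with inverse probability
    weighting for deaths whose cause indicator is missing at random."""
    k = np.exp(-0.5 * ((s - t) / h) ** 2) / (h * np.sqrt(2 * np.pi))
    w = delta * obs / p_obs                  # IPW weight per subject
    return float((k * w / at_risk).sum())

est = [ipw_kernel_hazard(s) for s in (0.5, 0.8, 1.1)]
print(est)  # should hover near the true constant hazard of 1
```

    Because the simulated event times are exponential with rate 1, the estimates should fluctuate around 1 in the interior of the time range; near the boundaries a kernel estimator like this one is biased downward.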

  14. Feature selection through validation and un-censoring of endovascular repair survival data for predicting the risk of re-intervention.

    PubMed

    Attallah, Omneya; Karthikesalingam, Alan; Holt, Peter J E; Thompson, Matthew M; Sayers, Rob; Bown, Matthew J; Choke, Eddie C; Ma, Xianghong

    2017-08-03

    The feature selection (FS) process is essential in the medical area as it reduces the effort and time needed for physicians to measure unnecessary features. Choosing useful variables is a difficult task in the presence of censoring, which is the unique characteristic of survival analysis. Most survival FS methods depend on Cox's proportional hazards model; machine learning techniques (MLT) are preferred but not commonly used due to censoring, and techniques that have been proposed to adapt MLT to perform FS with survival data cannot be used with a high level of censoring. The authors' previous publications proposed a technique to deal with this high level of censoring and used existing FS techniques to reduce the dataset dimension. In this paper, however, a new FS technique is proposed and combined with feature transformation and the proposed uncensoring approaches to select a reduced set of features and produce a stable predictive model. Specifically, an FS technique based on an artificial neural network (ANN) MLT is proposed to deal with highly censored Endovascular Aortic Repair (EVAR) survival data. EVAR datasets were collected from 2004 to 2010 from two vascular centers in order to produce a final stable model; they contain almost 91% censored patients. The proposed approach used a wrapper FS method with an ANN to select a reduced subset of features that predict the risk of EVAR re-intervention after 5 years for patients from two different centers located in the United Kingdom, so that it can potentially be applied to cross-center prediction. The proposed model is compared with two popular FS techniques, the Akaike and Bayesian information criteria (AIC, BIC), that are used with Cox's model. The final model outperforms the other methods in distinguishing the high- and low-risk groups, as it has a concordance index and estimated AUC better than the Cox's models based on the AIC, BIC, Lasso, and SCAD approaches. These models have p-values lower than 0
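
    The concordance index used in this record to compare models can be computed directly. Below is a hedged sketch of Harrell's c-index on invented toy data: among comparable pairs (one subject's observed event precedes the other subject's time), it counts the fraction where the higher risk score belongs to the earlier event; this simplified version ignores ties in event times.

```python
def concordance_index(times, events, scores):
    """Harrell's c-index: fraction of comparable pairs in which the
    higher risk score corresponds to the earlier observed event
    (tied scores count half)."""
    concordant = ties = comparable = 0
    for i in range(len(times)):
        if not events[i]:
            continue  # a censored subject cannot be the earlier event
        for j in range(len(times)):
            if times[i] < times[j]:
                comparable += 1
                if scores[i] > scores[j]:
                    concordant += 1
                elif scores[i] == scores[j]:
                    ties += 1
    return (concordant + 0.5 * ties) / comparable

# Invented toy data: times, event indicators (1=event, 0=censored), risk scores
c = concordance_index([2, 4, 6, 8], [1, 1, 0, 1], [0.9, 0.2, 0.3, 0.1])
print(c)  # 4 of 5 comparable pairs are concordant -> 0.8
```

    A c-index of 0.5 indicates no discrimination and 1.0 perfect ranking, which is why the paper reports it alongside AUC when comparing the ANN-based model to the Cox variants.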

  15. Instrumentation Failure after Partial Corpectomy with Instrumentation of a Metastatic Spine

    PubMed Central

    Park, Sung Bae; Kim, Ki Jeong; Han, Sanghyun; Oh, Sohee; Kim, Chi Heon; Chung, Chun Kee

    2018-01-01

    Objective To identify the perioperative factors associated with instrument failure in patients undergoing a partial corpectomy with instrumentation (PCI) for spinal metastasis. Methods We assessed 124 patients who underwent PCI for spinal metastasis from 1987 to 2011. The outcome measure was risk factors related to instrumentation failure. The preoperative factors analyzed were age, sex, ambulation, American Spinal Injury Association grade, bone mineral density, use of steroid, primary tumor site, number of vertebrae with metastasis, extra-bone metastasis, preoperative adjuvant chemotherapy, and preoperative spinal radiotherapy. The intraoperative factors were the number of fixed vertebrae, fixation in osteolytic vertebrae, bone grafting, and type of surgical approach. The postoperative factors included postoperative adjuvant chemotherapy and spinal radiotherapy. This study was supported by a National Research Foundation grant funded by the government; there were no study-specific biases related to conflicts of interest. Results There were 15 instrumentation failures (15/124, 12.1%). Preoperative ambulatory status and primary tumor site were not significantly related to the development of implant failure. There were no significant associations between insertion of a bone graft into the partial corpectomy site and instrumentation failure. The other preoperative and operative factors analyzed were not significantly related to instrumentation failure. In univariable and multivariable analyses, postoperative spinal radiotherapy was the only significant variable related to instrumentation failure (p=0.049 and 0.050, respectively). Conclusion When performing PCI in patients with spinal metastasis followed by postoperative spinal radiotherapy, the surgeon may consider the possibility of instrumentation failure and find other strategies for augmentation than the use of a bone graft for fusion. PMID:29631384

  16. Risk-adjusted monitoring of survival times

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sego, Landon H.; Reynolds, Marion R.; Woodall, William H.

    2009-02-26

    We consider the monitoring of clinical outcomes, where each patient has a different risk of death prior to undergoing a health care procedure. We propose a risk-adjusted survival time CUSUM chart (RAST CUSUM) for monitoring clinical outcomes where the primary endpoint is a continuous, time-to-event variable that may be right censored. Risk adjustment is accomplished using accelerated failure time regression models. We compare the average run length performance of the RAST CUSUM chart to the risk-adjusted Bernoulli CUSUM chart, using data from cardiac surgeries to motivate the details of the comparison. The comparisons show that the RAST CUSUM chart is more efficient at detecting a sudden decrease in the odds of death than the risk-adjusted Bernoulli CUSUM chart, especially when the fraction of censored observations is not too high. We also discuss the implementation of a prospective monitoring scheme using the RAST CUSUM chart.
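
    For context, the comparator chart mentioned in this record (the risk-adjusted Bernoulli CUSUM) has a compact form. The sketch below uses Steiner-style log-likelihood-ratio weights to test for a doubling of the odds of death, with all predicted risks and outcomes invented; the survival-time RAST CUSUM proposed in the paper additionally requires an accelerated failure time model and censoring handling, which are omitted here.

```python
import math

def bernoulli_cusum(outcomes, risks, odds_ratio=2.0):
    """Risk-adjusted Bernoulli CUSUM: accumulate log-likelihood-ratio
    weights per patient, resetting at zero (one-sided upper chart)."""
    s, path = 0.0, []
    for died, p in zip(outcomes, risks):
        denom = 1.0 - p + odds_ratio * p   # normalizer of the shifted risk
        w = math.log(odds_ratio / denom) if died else math.log(1.0 / denom)
        s = max(0.0, s + w)
        path.append(s)
    return path

# Invented stream of patient-specific predicted death risks and outcomes
risks = [0.05, 0.10, 0.20, 0.05, 0.30, 0.10]
outcomes = [0, 0, 1, 0, 1, 0]
path = bernoulli_cusum(outcomes, risks)
# In practice an alarm is raised when the path crosses a control limit h,
# chosen to give a target in-control average run length.
```

    Deaths among low-risk patients push the statistic up sharply, while survivals nudge it down, which is exactly the risk adjustment the chart provides.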

  17. Superior long term outcome associated with native vessel versus graft vessel PCI following secondary PCI in patients with prior CABG.

    PubMed

    Mavroudis, Chrysostomos A; Kotecha, Tushar; Chehab, Omar; Hudson, Jonathan; Rakhit, Roby D

    2017-02-01

    Secondary percutaneous coronary intervention (PCI) in patients with prior coronary artery bypass graft (CABG) surgery is increasingly common. Graft vessel PCI has higher rates of adverse events compared with native coronary vessel PCI. We investigated the clinical outcomes of patients with prior CABG who underwent secondary PCI of either a graft vessel (GV), a native coronary vessel (NV), or both graft and native (NG) vessels. 220 patients (84% male) who underwent PCI in our institution to either a GV (n=89), an NV (n=103), or both (NG group, n=28) were studied. The study population underwent 378 procedures (GV group, n=126; NV group, n=164; NG group, n=88). Median follow-up was 36 months [range, 2-75 months]. Target vessel revascularisation (TVR) occurred in 12.5% of the GV group and 3.6% of the NV group [p=0.0004], and was predominantly due to in-stent restenosis. Patients who had PCI due to TVR were more likely to suffer from diabetes and peripheral vascular disease. A history of chronic renal failure was associated with a higher risk of death (HR 2.21, p=0.005), whereas preserved left ventricular ejection fraction (LVEF) was associated with a lower risk (HR 0.17, p=0.0007). The median survival (interval between CABG and end of follow-up period) was lower in the GV than in the NV group (315 vs 372 months; p=0.005). This registry demonstrates inferior long-term outcome for patients undergoing secondary PCI of a GV versus an NV. Where possible, a strategy of NV rather than GV target PCI should be considered in patients with prior CABG. In summary, secondary PCI in patients with prior CABG surgery is increasingly common; graft vessel PCI has inferior outcomes, with high rates of restenosis and occlusion compared with native coronary vessel PCI. We studied the clinical outcomes of 220 patients with prior CABG who underwent secondary PCI to either a graft vessel, a native coronary vessel, or both. Target vessel revascularisation was 5 times higher in the GV group.

  18. Pseudoaneurysm of the saphenous vein graft related to artificial rubber pericardium--a case report.

    PubMed

    Harada, Tomohiro; Ida, Takao; Koyanagi, Toshiya; Kasahara, Katsuhiko; Naganuma, Fumio; Hosoda, Saichi

    2002-01-01

    Pseudoaneurysm is an unusual complication of coronary artery bypass grafting. Such aneurysms are caused by technical surgical failures, or inflammation of the sternum and mediastinum following sternotomy observed as an early or mid-term complication of cardiac surgery. This case was an 80-year-old man with a piece of artificial rubber pericardium used for complete closure of the pericardium. A large pseudoaneurysm developed in the body of the saphenous vein graft 15 years after surgery. The old rubber synthetic pericardium was severely degenerative, which induced inflammation and disrupted the saphenous vein graft.

  19. Impact of brain death on ischemia/reperfusion injury in liver transplantation.

    PubMed

    Dziodzio, Tomasz; Biebl, Matthias; Pratschke, Johann

    2014-04-01

    In liver transplantation, the ischemia/reperfusion injury (IRI) is influenced by factors related to graft quality, organ procurement and the transplant procedure itself. However, in brain-dead donors, the process of death itself also thoroughly affects organ damage through breakdown of the autonomic nervous system and subsequent massive cytokine release. This review summarizes current knowledge on these proinflammatory effects of brain death on IRI in liver transplantation. Brain death affects IRI through hemodynamic or molecular effects with proinflammatory activation. Immunological effects are mainly mediated through Kupffer cell activation, leading to TNF-α and TLR4 amplification. Proinflammatory cytokines such as interleukin (IL)-6, IL-10, TNF-β and MIP-1α are released, together with activation of the innate immune system via natural killer cells and natural killer T cells, which promote organ damage and activation of fibrosis. Preprocurement treatment regimens attempt to hamper the inflammatory response by the application of methylprednisolone or thymoglobulin to the donor. Selective P-selectin antagonism resulted in improved function in marginal liver grafts. Inhaled nitric oxide was found to reduce apoptosis in liver grafts. Other medications, like the immunosuppressant tacrolimus, produced conflicting results regarding organ protection. Furthermore, improved organ storage after procurement - such as machine perfusion - can diminish the effects of IRI in a clinical setting. Brain death plays a fundamental role in the regulation of molecular markers triggering inflammation and IRI-related tissue damage in liver transplants. Although several treatment options have reached clinical application, to date, the effects of brain death during donor conditioning and organ procurement remain relevant for organ function and survival.

  20. Uric Acid and the Risks of Kidney Failure and Death in Individuals With CKD.

    PubMed

    Srivastava, Anand; Kaze, Arnaud D; McMullan, Ciaran J; Isakova, Tamara; Waikar, Sushrut S

    2018-03-01

    Serum uric acid concentrations increase in chronic kidney disease (CKD) and may lead to tubular injury, endothelial dysfunction, oxidative stress, and intrarenal inflammation. Whether uric acid concentrations are associated with kidney failure and death in CKD is unknown. Prospective observational cohort study. 3,885 individuals with CKD stages 2 to 4 enrolled in the Chronic Renal Insufficiency Cohort (CRIC) between June 2003 and September 2008 and followed up through March 2013. Baseline uric acid concentrations. Kidney failure (initiation of dialysis therapy or transplantation) and all-cause mortality. During a median follow-up of 7.9 years, 885 participants progressed to kidney failure and 789 participants died. After adjustment for demographic, cardiovascular, and kidney-specific covariates, higher uric acid concentrations were independently associated with risk for kidney failure in participants with estimated glomerular filtration rates (eGFRs) ≥45 mL/min/1.73 m² (adjusted HR per 1-standard-deviation higher baseline uric acid, 1.40; 95% CI, 1.12-1.75), but not in those with eGFRs <30 mL/min/1.73 m². There was a nominally higher HR in participants with eGFRs of 30 to 44 mL/min/1.73 m² (adjusted HR, 1.13; 95% CI, 0.99-1.29), but this did not reach statistical significance. The relationship between uric acid concentration and all-cause mortality was J-shaped (P=0.007). Limitations include potential residual confounding from unmeasured confounders and the lack of follow-up measurements to adjust for changes in uric acid concentrations over time. Uric acid concentration is an independent risk factor for kidney failure in earlier stages of CKD and has a J-shaped relationship with all-cause mortality in CKD. Adequately powered randomized placebo-controlled trials in CKD are needed to test whether urate lowering may prove to be an effective approach to prevent complications and progression of CKD. Copyright © 2017 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.

  1. Regression analysis of clustered failure time data with informative cluster size under the additive transformation models.

    PubMed

    Chen, Ling; Feng, Yanqin; Sun, Jianguo

    2017-10-01

    This paper discusses regression analysis of clustered failure time data, which occur when the failure times of interest are collected from clusters. In particular, we consider the situation where the correlated failure times of interest may be related to cluster sizes. For inference, we present two estimation procedures, the weighted estimating equation-based method and the within-cluster resampling-based method, when the correlated failure times of interest arise from a class of additive transformation models. The former makes use of the inverse of cluster sizes as weights in the estimating equations, while the latter can be easily implemented by using existing software packages for right-censored failure time data. An extensive simulation study is conducted and indicates that the proposed approaches work well both with and without informative cluster size. They are applied to the dental study that motivated this work.
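    The inverse-cluster-size weighting described above can be illustrated with a toy location estimator (hypothetical data; a sketch of the weighting idea only, not the paper's additive transformation models): weighting each observation by the reciprocal of its cluster size makes every cluster contribute equally, which removes the distortion that arises when cluster size is informative about the failure times.

```python
# Toy illustration of inverse-cluster-size weighting (hypothetical data).
# Each observation is weighted by 1/|cluster|, so every cluster contributes
# one unit of total weight regardless of how many members it has.

def cluster_weighted_mean(clusters):
    """clusters: list of lists of failure times."""
    total = sum(t / len(c) for c in clusters for t in c)
    return total / len(clusters)   # total weight equals the number of clusters

# Informative cluster size: the larger cluster has systematically shorter times.
clusters = [[2.0], [1.0, 1.0, 1.0]]
naive = sum(t for c in clusters for t in c) / 4   # 1.25, dominated by the big cluster
weighted = cluster_weighted_mean(clusters)        # 1.5, each cluster counts once
```

    Under informative cluster size, the naive pooled mean is pulled toward the large cluster, while the weighted estimate treats both clusters symmetrically.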

  2. Septuagenarian and octogenarian donors provide excellent liver grafts for transplantation.

    PubMed

    Darius, T; Monbaliu, D; Jochmans, I; Meurisse, N; Desschans, B; Coosemans, W; Komuta, M; Roskams, T; Cassiman, D; van der Merwe, S; Van Steenbergen, W; Verslype, C; Laleman, W; Aerts, R; Nevens, F; Pirenne, J

    2012-11-01

    Wider utilization of liver grafts from donors ≥70 years old could substantially expand the organ pool, but their use remains limited by fear of poorer outcomes. We examined the results at our center of liver transplantation (OLT) using livers from donors ≥70 years old. From February 2003 to August 2010, we performed 450 OLT, including 58 (13%) using donors ≥70 years old, whose outcomes were compared with those using donors <70 years old. Cerebrovascular causes of death predominated among donors ≥70 (85% vs 47% in donors <70; P < .001). In contrast, traumatic causes of death predominated among donors <70 (36% vs 14% in donors ≥70; P = .002). Unlike grafts from donors <70 years old, grafts from older individuals had no additional risk factors (steatosis, high sodium, or hemodynamic instability). Both groups were comparable for cold and warm ischemia times. No difference was noted between groups in posttransplant peak transaminases, incidence of primary nonfunction, hepatic artery thrombosis, biliary strictures, or retransplantation rates. The 1- and 5-year patient survivals were 88% and 82% in recipients of livers from donors <70 years old versus 90% and 84% in recipients of livers from donors ≥70 years old (P = .705). Recipients of older grafts, who were 6 years older than recipients of younger grafts (P < .001), tended to have a lower laboratory Model for End-Stage Liver Disease score (P = .074). Short- and mid-term survival following OLT using donors ≥70 years old can be excellent provided that there is adequate donor and recipient selection. Septuagenarians and octogenarians with cerebrovascular ischemic and bleeding accidents represent a large pool of potential donors whose wider use could substantially reduce mortality on the OLT waiting list. Copyright © 2012 Elsevier Inc. All rights reserved.

  3. Reconstruction of mandibular defects with autogenous bone grafts: a review of 30 cases.

    PubMed

    Sajid, Malik Ali Hassan; Warraich, Riaz Ahmed; Abid, Hina; Ehsan-ul-Haq, Muhammad; Shah, Khurram Latif; Khan, Zafar

    2011-01-01

    Many options are available for reconstruction of functional and cosmetic defects of the mandible caused by various ailments. At present, autogenous bone grafting is the gold standard against which all other techniques of mandibular reconstruction are judged. The purpose of this study was to evaluate the outcome of different osseous reconstruction options using autogenous bone grafts for mandibular reconstruction. This interventional study was conducted at the Department of Oral and Maxillofacial Surgery, King Edward Medical University/Mayo Hospital, Lahore, from January 2008 to July 2009, including one year of follow-up. The study was carried out on thirty patients with bony mandibular defects. They were reconstructed with autogenous bone grafts from different donor sites. At postoperative visits they were evaluated for outcome variables. The success rate of autogenous bone grafts in this study was 90%. Only 10% of the cases showed poor results with respect to infection, resorption, and graft failure. Autogenous bone grafts, non-vascularised or vascularised, are a reliable treatment modality for the reconstruction of bony mandibular defects with predictable functional and aesthetic outcomes.

  4. Modeling conduction in host-graft interactions between stem cell grafts and cardiomyocytes.

    PubMed

    Chen, Michael Q; Yu, Jin; Whittington, R Hollis; Wu, Joseph C; Kovacs, Gregory T A; Giovangrandi, Laurent

    2009-01-01

    Cell therapy has recently made great strides towards the treatment of heart failure. However, while transplanted cells may electromechanically integrate into host tissue, a depolarization wave may not propagate uniformly across the heterogeneous tissue boundaries. A model using microelectrode array technology that maps the electrical interactions between host and graft tissues in co-culture is presented and sheds light on the effects of a mismatch of conduction properties at the boundary. Skeletal myoblasts co-cultured with cardiomyocytes demonstrated that conduction velocity significantly decreases at the boundary despite electromechanical coupling. In an attempt to improve the uniformity of conduction with host cells, differentiating human embryonic stem cells (hESC) were used in co-culture. Over the course of four to seven days, synchronous electrical activity was observed at the hESC boundary, implying differentiation and integration. Activity did not extend far past the boundary, and conduction velocity was significantly greater than that of the host tissue, implying the need for other external measures to properly match the conduction properties between host and graft tissue.

  5. Bivariate Left-Censored Bayesian Model for Predicting Exposure: Preliminary Analysis of Worker Exposure during the Deepwater Horizon Oil Spill.

    PubMed

    Groth, Caroline; Banerjee, Sudipto; Ramachandran, Gurumurthy; Stenzel, Mark R; Sandler, Dale P; Blair, Aaron; Engel, Lawrence S; Kwok, Richard K; Stewart, Patricia A

    2017-01-01

    In April 2010, the Deepwater Horizon oil rig caught fire and exploded, releasing almost 5 million barrels of oil into the Gulf of Mexico over the ensuing 3 months. Thousands of oil spill workers participated in the spill response and clean-up efforts. The GuLF STUDY being conducted by the National Institute of Environmental Health Sciences is an epidemiological study to investigate potential adverse health effects among these oil spill clean-up workers. Many volatile chemicals were released from the oil into the air, including total hydrocarbons (THC), which is a composite of the volatile components of oil including benzene, toluene, ethylbenzene, xylene, and hexane (BTEXH). Our goal is to estimate exposure levels to these toxic chemicals for groups of oil spill workers in the study (hereafter called exposure groups, EGs) with likely comparable exposure distributions. A large number of air measurements were collected, but many EGs are characterized by datasets with a large percentage of censored measurements (below the analytic methods' limits of detection) and/or a limited number of measurements. We use THC for which there was less censoring to develop predictive linear models for specific BTEXH air exposures with higher degrees of censoring. We present a novel Bayesian hierarchical linear model that allows us to predict, for different EGs simultaneously, exposure levels of a second chemical while accounting for censoring in both THC and the chemical of interest. We illustrate the methodology by estimating exposure levels for several EGs on the Development Driller III, a rig vessel charged with drilling one of the relief wells. The model provided credible estimates in this example for geometric means, arithmetic means, variances, correlations, and regression coefficients for each group. This approach should be considered when estimating exposures in situations when multiple chemicals are correlated and have varying degrees of censoring. © The Author 2017

  6. The Impact of Liver Graft Injury on Cancer Recurrence Posttransplantation.

    PubMed

    Li, Chang-Xian; Man, Kwan; Lo, Chung-Mau

    2017-11-01

    Liver transplantation is the most effective treatment for selected patients with hepatocellular carcinoma. However, posttransplantation cancer recurrence remains a critical issue that affects the long-term outcome of hepatocellular carcinoma recipients. In addition to tumor biology itself, increasing evidence demonstrates that acute-phase liver graft injury, a result of hepatic ischemia-reperfusion injury (an inevitable consequence of liver transplantation), may promote cancer recurrence in the late phase posttransplantation. Liver grafts from living donors, donors after cardiac death, and steatotic donors have been considered promising sources of organs for liver transplantation but are associated with a high incidence of liver graft injury. Acute-phase liver graft injury triggers a series of inflammatory cascades, which may not only activate the cell signaling pathways regulating tumor cell invasion and migration but also mobilize circulating progenitor and immune cells to facilitate tumor recurrence and metastasis. The injured liver graft may also provide a favorable microenvironment for tumor cell growth, migration, and invasion through disturbance of microcirculatory barrier function and induction of hypoxia and angiogenesis. This review aims to summarize the latest findings on the role and mechanisms of liver graft injury resulting from hepatic ischemia-reperfusion injury in tumor recurrence posttransplantation, in both clinical and animal cohorts.

  7. Management of recurrent anterior urethral strictures following buccal mucosal graft-urethroplasty: A single center experience.

    PubMed

    Javali, Tarun Dilip; Katti, Amit; Nagaraj, Harohalli K

    2016-01-01

    To describe the safety, feasibility and outcome of redo buccal mucosal graft (BMG) urethroplasty in patients presenting with recurrent anterior urethral stricture following previous failed BMG urethroplasty. This was a retrospective chart review of 21 patients with recurrent anterior urethral stricture after buccal mucosal graft urethroplasty who underwent redo urethroplasty at our institute between January 2008 and January 2014. All patients underwent preoperative evaluation in the form of uroflowmetry, RGU, sonourethrogram and urethroscopy. Among patients with isolated bulbar urethral stricture who had previously undergone ventral onlay, redo dorsal onlay BMG urethroplasty was done, and vice versa (9+8 patients). Three patients who had previously undergone Kulkarni-Barbagli urethroplasty underwent dorsal free graft urethroplasty by a ventral sagittal urethrotomy approach. One patient who had previously undergone urethroplasty by the ASOPA technique underwent 2-stage Bracka repair. Catheter removal was done on the 21st postoperative day. Follow-up consisted of uroflowmetry, PVR and AUA-SS. Failure was defined as the requirement of any postoperative procedure. Idiopathic urethral strictures constituted the predominant etiology. Eleven patients presented with stricture recurrence involving the entire grafted area, while the remaining 10 patients had fibrotic ring-like strictures at the proximal/distal graft-urethral anastomotic sites. The success rate of redo surgery was 85.7% at a mean follow-up of 41.8 months (range: 1-6 years). Among the 18 patients who required no intervention during the follow-up period, graft survival was longer compared with their initial time to failure. Redo buccal mucosal graft urethroplasty is safe and feasible with good intermediate-term outcomes.

  8. Applications of crude incidence curves.

    PubMed

    Korn, E L; Dorey, F J

    1992-04-01

    Crude incidence curves display the cumulative number of failures of interest as a function of time. With competing causes of failure, they are distinct from cause-specific incidence curves that treat secondary types of failures as censored observations. After briefly reviewing their definition and estimation, we present five applications of crude incidence curves to show their utility in a broad range of studies. In some of these applications it is helpful to model survival-time distributions with use of two different time metameters, for example, time from diagnosis and age of the patient. We describe how one can incorporate published vital statistics into the models when secondary types of failure correspond to common causes of death.
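    The distinction drawn above can be made concrete with a small estimator (hypothetical data; assumes distinct event times, one event per time point): competing failures deplete the risk set and the all-cause survival factor, but only failures of the cause of interest accumulate into the crude incidence curve.

```python
# Crude (cumulative) incidence with competing risks, nonparametric sketch.
# causes: 0 = censored, otherwise an integer cause code. Hypothetical data.

def crude_incidence(times, causes, cause, t):
    """Estimate of the crude incidence of `cause` at time t."""
    events = sorted(zip(times, causes))
    surv, cif, at_risk = 1.0, 0.0, len(events)
    for time, c in events:
        if time > t:
            break
        if c == cause:
            cif += surv / at_risk          # failures of interest accumulate
        if c != 0:
            surv *= 1.0 - 1.0 / at_risk    # any failure depletes all-cause survival
        at_risk -= 1
    return cif

# Two competing causes, no censoring: each curve plateaus at its crude share.
cif1 = crude_incidence([1, 2, 3, 4], [1, 2, 1, 2], 1, 10)   # 0.5
cif2 = crude_incidence([1, 2, 3, 4], [1, 2, 1, 2], 2, 10)   # 0.5
```

    Treating the competing failures as censored observations instead would overestimate each curve; with no censoring, the crude estimate reduces to the simple proportion failing from that cause.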

  9. A Kidney Graft Survival Calculator that Accounts for Mismatches in Age, Sex, HLA, and Body Size.

    PubMed

    Ashby, Valarie B; Leichtman, Alan B; Rees, Michael A; Song, Peter X-K; Bray, Mathieu; Wang, Wen; Kalbfleisch, John D

    2017-07-07

    Outcomes for transplants from living unrelated donors are of particular interest in kidney paired donation (KPD) programs, where exchanges can be arranged between incompatible donor-recipient pairs or chains created from nondirected/altruistic donors. Using Scientific Registry of Transplant Recipients data, we analyzed 232,705 recipients of kidney-alone transplants from 1998 to 2012. Graft failure rates were estimated using Cox models for recipients of kidney transplants from living unrelated, living related, and deceased donors. Models were adjusted for year of transplant and donor and recipient characteristics, with particular attention to mismatches in age, sex, human leukocyte antigens (HLA), body size, and weight. The dependence of graft failure on increasing donor age was less pronounced for living-donor than for deceased-donor transplants. Male donor-to-male recipient transplants had lower graft failure rates, notably lower than female-to-male transplants (5%-13% lower risk). HLA mismatch was important in all donor types. Obesity of both the recipient (8%-18% higher risk) and the donor (5%-11% higher risk) was associated with higher graft loss, as were donor-recipient weight ratios of <75% compared with transplants where both parties were of similar weight (9%-12% higher risk). These models are used to create a calculator of estimated graft survival for living donors. This calculator provides donors, candidates, and physicians with useful information on estimated outcomes and can potentially help candidates choose among several living donors. It may also help inform candidates with compatible donors on the advisability of joining a KPD program. Copyright © 2017 by the American Society of Nephrology.

  10. Association of Slow Graft Function with Long-Term Outcomes in Kidney Transplant Recipients.

    PubMed

    Wang, Connie J; Tuffaha, Ahmad; Phadnis, Milind A; Mahnken, Jonathan D; Wetmore, James B

    2018-04-03

    BACKGROUND Whether slow graft function (SGF) represents an intermediate phenotype between immediate graft function (IGF) and delayed graft function (DGF) in kidney transplant recipients is unknown. MATERIAL AND METHODS In a retrospective cohort analysis of 1,222 kidney transplant recipients, we classified patients as having IGF, SGF, or DGF using two different schemas. SGF was defined as serum creatinine (Cr) ≥3.0 mg/dL by postoperative day 5 in Schema 1; in Schema 2, SGF was defined as Cr >1.5 mg/dL plus a creatinine reduction ratio <20% between postoperative days 1 and 3. A complementary log-log model was used to examine the association of graft function with graft survival and patient survival. RESULTS Mean age of study patients was 51.5±13.3 years, 59.9% were male, and 66.7% were white. In Schema 1, SGF and DGF were associated with comparable increases in risk of graft failure compared to IGF (hazard ratio (HR) 1.46, 95% confidence interval (CI) 1.02-2.10 for SGF and HR 1.56, CI 1.11-2.22 for DGF); estimates were similar for Schema 2 (HR 1.52, CI 1.05-2.20 for SGF and HR 1.54, CI 1.10-2.17 for DGF). However, for mortality, outcomes for SGF were similar to those for IGF: both SGF and IGF were associated with lower risk relative to DGF (HR 0.54, CI 0.36-0.80 for SGF in Schema 1; HR 0.58, CI 0.39-0.85 for SGF in Schema 2). CONCLUSIONS These findings suggest that SGF may be a marker for graft failure but not for mortality, and SGF may therefore represent a phenotype separate from IGF and DGF.

  11. Graft-Sparing Strategy for Thoracic Prosthetic Graft Infection.

    PubMed

    Uchino, Gaku; Yoshida, Takeshi; Kakii, Bunpachi; Furui, Masato

    2018-04-01

    Thoracic prosthetic graft infection is a rare but serious complication with no standard management. We report our surgical experience with a graft-sparing strategy for thoracic prosthetic graft infection. This study included patients who underwent graft-sparing surgery for thoracic prosthetic graft infection at Matsubara Tokushukai Hospital in Japan from January 2000 to October 2017. Seventeen patients were included in the analyses, with a mean age at surgery of 71.0 ± 10.5 years; 11 were men. In-hospital mortality was observed in five patients (29.4%). Graft-sparing surgery for thoracic prosthetic graft infection is an alternative option, particularly for early graft infection after hemiarch replacement. Georg Thieme Verlag KG Stuttgart · New York.

  12. Revision Risk After Allograft Anterior Cruciate Ligament Reconstruction: Association With Graft Processing Techniques, Patient Characteristics, and Graft Type.

    PubMed

    Tejwani, Samir G; Chen, Jason; Funahashi, Tadashi T; Love, Rebecca; Maletis, Gregory B

    2015-11-01

    Allograft tissue is a common graft choice for anterior cruciate ligament reconstruction (ACLR). Allograft sterilization methods vary widely across numerous commercial tissue vendors. Multiple studies, despite being limited in sample size, have suggested a higher rate of clinical failure associated with the use of allograft tissue in ACLR compared with autograft. To examine the association of graft processing techniques, patient characteristics, and graft type with risk of revision surgery after allograft ACLR. Cohort study; Level of evidence, 3. A retrospective cohort study was conducted that used an integrated United States health care system's ACLR registry to identify primary unilateral cases in which allografts were used. Aseptic revision was the endpoint of the study. Allograft type, processing methods (irradiation dose, AlloWash, AlloTrue, BioCleanse), and graft donor age were assessed as potential risk factors for revision, with adjustment for patient age, sex, and body mass index (BMI) by use of survival analysis. Hazard ratios (HRs) and 95% confidence intervals (CIs) were calculated. A total of 5968 primary ACLR cases with allograft were included in the study, of which 3688 (61.8%) were male patients. The median age of the cohort at the time of surgery was 34.1 years (interquartile range, 24.1-42.9 years). The mean time to follow-up (±SD) was 2.1 ± 1.5 years. There were 3751 (62.9%) allograft ACLRs using soft tissue, 1188 (19.9%) using Achilles tendon, and 1029 (17.2%) using bone-patellar tendon-bone (BPTB). Graft processing groups included BioCleanse (n = 367), AlloTrue or AlloWash (n = 2278), irradiation greater than 1.8 Mrad (n = 1146), irradiation up to 1.8 Mrad (n = 3637), and no irradiation (n = 1185). There were 156 (2.6%) aseptic revisions. After adjustment for patient age, sex, and BMI, the use of BioCleanse (HR = 2.45; 95% CI, 1.36-4.40) and irradiation greater than 1.8 Mrad (HR = 1.64; 95% CI, 1.08-2.49) were associated with a higher risk of aseptic revision.

  13. A method for analyzing clustered interval-censored data based on Cox's model.

    PubMed

    Kor, Chew-Teng; Cheng, Kuang-Fu; Chen, Yi-Hau

    2013-02-28

    Methods for analyzing interval-censored data are well established. Unfortunately, these methods are inappropriate for studies with correlated data. In this paper, we focus on developing a method for analyzing clustered interval-censored data. Our method is based on Cox's proportional hazards model with a piecewise-constant baseline hazard function. The correlation structure of the data can be modeled by using Clayton's copula or an independence model with proper adjustment in the covariance estimation. We establish estimating equations for the regression parameters and baseline hazards (and a parameter in the copula) simultaneously. Simulation results confirm that the point estimators follow a multivariate normal distribution, and our proposed variance estimators are reliable. In particular, we found that the approach with the independence model worked well even when the true correlation model was derived from Clayton's copula. We applied our method to a family-based cohort study of pandemic H1N1 influenza in Taiwan during 2009-2010. Using the proposed method, we investigated the impact of vaccination and family contacts on the incidence of pH1N1 influenza. Copyright © 2012 John Wiley & Sons, Ltd.
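    A minimal numeric sketch of the baseline-hazard building block used above (hypothetical cut points and rates, not the paper's estimating equations): with a piecewise-constant hazard, survival is the exponential of the negative cumulative hazard accumulated interval by interval, and an interval-censored observation known to fail in (L, R] contributes S(L) − S(R) to the likelihood.

```python
import math

# Survival under a piecewise-constant hazard (hypothetical rates).
# rates[i] applies on [cuts[i-1], cuts[i]); the final piece extends to +inf.

def survival(t, cuts, rates):
    cum_hazard, prev = 0.0, 0.0
    for cut, rate in zip(cuts + [math.inf], rates):
        if t <= cut:
            cum_hazard += rate * (t - prev)
            break
        cum_hazard += rate * (cut - prev)
        prev = cut
    return math.exp(-cum_hazard)

cuts, rates = [1.0], [0.5, 1.0]    # hazard 0.5 on [0, 1), then 1.0 afterwards
s2 = survival(2.0, cuts, rates)    # exp(-(0.5*1 + 1.0*1)) = exp(-1.5)
# An observation known only to fail in (1.0, 2.0] contributes S(1) - S(2):
contrib = survival(1.0, cuts, rates) - s2
```

    In the paper's setting the piecewise rates are multiplied by the Cox covariate term and estimated jointly with the regression parameters; the sketch only shows how the baseline enters each interval-censored likelihood contribution.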

  14. Single center experience of aortic bypass graft for aortic arch obstruction in children.

    PubMed

    Shinkawa, Takeshi; Chipman, Carl; Holloway, Jessica; Tang, Xinyu; Gossett, Jeffrey M; Imamura, Michiaki

    2017-01-01

    The purpose of this study is to assess the outcomes of aortic bypass graft placement in children. This is a retrospective review of all children having aortic bypass graft placement for aortic arch obstruction for the first time between 1982 and 2013 at a single institution. The actuarial survival and the freedom from aortic arch reoperation were calculated and compared between groups. Seventy consecutive children underwent aortic bypass graft placement. The median age and body weight at operation were 14 days and 3.6 kg. There were 7 early deaths, 6 late deaths, and 7 heart transplants during a median follow-up of 10.8 years (0.0-31.5 years). The actuarial transplant-free survival was 64.7% at 20 years, and the freedom from aortic arch reoperation was 50.5% at 10 years. Between children younger than 1 year old and those older than 1 year old, there were significant differences in actuarial transplant-free survival (56.4 vs. 100% at 15 years, p = 0.0042) and in freedom from aortic arch reoperation (18.7 vs. 100% at 10 years, p < 0.001). The children who received an aortic bypass graft larger than 16 mm had no aortic arch reoperation at 15 years. Aortic bypass graft placement for aortic arch obstruction can be done with low mortality and morbidity in children who can receive a bypass graft larger than 16 mm. However, it should be avoided in neonates and infants except in selected situations.

  15. Protection of pulmonary graft from thrombosis in donation after cardiac death: effect of warm ischaemia versus cold ischaemia.

    PubMed

    Pierre, Leif; Lindstedt, Sandra; Ingemansson, Richard

    2016-11-01

    The use of donation after cardiac death (DCD) to overcome organ shortage is slowly moving into the clinic. In this study, we compare the protective effect of warm ischaemia versus cold ischaemia on thrombus formation in non-heparinized pulmonary grafts. Twelve Landrace pigs were randomized into two groups: warm ischaemia and cold ischaemia. Ventricular fibrillation without the administration of heparin was induced to mimic an uncontrolled DCD situation. The animals were then exposed to either 1 h of cold ischaemia (insertion of a drain and instillation of cold fluid into the pleurae) or warm ischaemia (body temperature). After 1 h, the pulmonary artery was opened and the pulmonary arterial branches were macroscopically examined for thrombotic material. After 60 min, the temperature was 36.6 ± 0.0°C in the warm ischaemic group and 14.6 ± 0.1°C in the cold ischaemic group (P < 0.001). In the warm ischaemic group, no thrombotic material was found in the pulmonary artery of the animals examined, whereas in the cold ischaemic group 6.8 ± 0.2 ml of thrombotic material was found in the pulmonary artery (P < 0.001). Likewise, in the warm ischaemic group no thrombotic material was found in the arterial branches of the pulmonary artery, whereas in the cold ischaemic group 2.3 ± 0.1 ml of thrombotic material was found in these branches (P < 0.001). Warm ischaemia rather than cold ischaemia seems to protect the pulmonary graft from thrombosis in uncontrolled non-heparinized DCD pigs. © The Author 2016. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  16. Atrazine concentrations in near-surface aquifers: A censored regression approach

    USGS Publications Warehouse

    Liu, S.; Yen, S.T.; Kolpin, D.W.

    1996-01-01

    In 1991, the U.S. Geological Survey (USGS) conducted a study to investigate the occurrence of atrazine (2-chloro-4-ethylamino-6-isopropylamino-s-triazine) and other agricultural chemicals in near-surface aquifers in the midcontinental USA. Because about 83% of the atrazine concentrations from the USGS study were censored, standard statistical estimation procedures could not be used. To determine factors that affect atrazine concentrations in groundwater while accommodating the high degree of data censoring, Tobit models were used (normal homoscedastic, normal heteroscedastic, lognormal homoscedastic, and lognormal heteroscedastic). Empirical results suggest that the lognormal heteroscedastic Tobit model is the model of choice for this type of study. This model identified the following factors as having the strongest effect on atrazine concentrations in groundwater: percent of pasture within 3.2 km, percent of forest within 3.2 km (2 mi), mean open interval of the well, primary water use of the well, aquifer class (unconsolidated or bedrock), aquifer type (unconfined or confined), existence of a stream within 30 m (100 ft), existence of a stream within 30 m to 0.4 km (0.25 mi), and existence of a stream within 0.4 to 3.2 km. Examining the elasticities of the continuous explanatory factors provides further insight into their effects on atrazine concentrations in groundwater. This study documents a viable statistical method that can accommodate the complicating presence of censored data, a feature that commonly occurs in environmental datasets.
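    The Tobit idea can be sketched with a toy left-censored normal likelihood (hypothetical numbers; homoscedastic with known sigma, whereas the study's preferred variant is lognormal and heteroscedastic): each nondetect contributes the probability mass below the detection limit, Phi((LOD − mu)/sigma), and the resulting MLE falls below the common substitute-the-LOD estimate.

```python
import math

# Left-censored (Tobit-type) likelihood on hypothetical data. Nondetects
# contribute P(X < LOD) = Phi((LOD - mu)/sigma); detects contribute the
# normal log-density. Sigma is treated as known to keep the sketch 1-D.

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def log_lik(mu, sigma, detects, n_censored, lod):
    ll = n_censored * math.log(norm_cdf((lod - mu) / sigma))
    for x in detects:
        z = (x - mu) / sigma
        ll += -0.5 * z * z - math.log(sigma * math.sqrt(2.0 * math.pi))
    return ll

detects, n_cens, lod, sigma = [1.2, 1.5, 2.0], 7, 1.0, 0.8
grid = [i / 100.0 for i in range(-200, 200)]
mu_hat = max(grid, key=lambda m: log_lik(m, sigma, detects, n_cens, lod))
# Substituting the detection limit for the 7 nondetects biases the mean upward:
naive = (sum(detects) + n_cens * lod) / (len(detects) + n_cens)   # 1.17
```

    With 7 of 10 values censored, the censored-likelihood estimate sits well below the substitution estimate; the study's lognormal variant applies the same machinery to log concentrations, with covariates entering through mu.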

  17. Telomere shortening in hematopoietic stem cell transplantation: a potential mechanism for late graft failure?

    PubMed

    Awaya, Norihiro; Baerlocher, Gabriela M; Manley, Thomas J; Sanders, Jean E; Mielcarek, Marco; Torok-Storb, Beverly; Lansdorp, Peter M

    2002-01-01

    Telomeres serve to maintain the structural integrity of chromosomes, yet each somatic cell division is associated with a decrease in telomere length. The cumulative decrease in telomere length can impose an upper limit for the number of cell divisions that can occur before a cell senesces. When studied in vitro with fibroblasts, this limit is referred to as the Hayflick limit and usually occurs after 40 to 80 cell doublings. In theory, a similar replicative potential in a hematopoietic stem cell could support hematopoiesis in a person for more than 100 years. However, stem cells differentiate, and the telomere length differs among chromosomes within a single cell, among cell types, and among age-matched individuals. This variation in telomere length raises the possibility that long-term hematopoiesis by transplanted stem cells could, depending on the telomere length of the engrafted stem cell and the proliferative demand to which it is subjected, reach a Hayflick limit during the life span of the patient. Although significant shortening of telomeres is reported to occur within the first year posttransplantation, as yet no evidence has indicated that this shortening is associated with marrow function. In this review, we summarize reports on telomere shortening in stem cell transplantation recipients and report 2 cases in which graft failure is associated with significant telomere shortening.

  18. A Case Report of Successful Kidney Donation After Brain Death Following Nicotine Intoxication.

    PubMed

    Räsänen, M; Helanterä, I; Kalliomäki, J; Savikko, J; Parry, M; Lempinen, M

    Nicotine intoxication is a rare cause of death and can lead to brain death after respiratory arrest and hypoxic-ischemic encephalopathy. To our knowledge, no previous reports regarding organ donation after nicotine intoxication have been described. We present a successful case of kidney donation after brain death caused by subcutaneous nicotine overdose from liquid nicotine from an e-cigarette cartridge in an attempted suicide. Both kidneys were transplanted successfully with immediate graft function, and both recipients were discharged at postoperative day 9 with normal plasma creatinine levels. Graft function has remained excellent in follow-up. This case suggests that kidneys from a donor with fatal nicotine intoxication may be successfully used for kidney transplantation in the absence of other contraindications for donation. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Nomogram to Predict Graft Thickness in Descemet Stripping Automated Endothelial Keratoplasty: An Eye Bank Study.

    PubMed

    Bae, Steven S; Menninga, Isaac; Hoshino, Richard; Humphreys, Christine; Chan, Clara C

    2018-06-01

    The purpose of this study was to develop a nomogram to predict postcut thickness of corneal grafts prepared at an eye bank for Descemet stripping automated endothelial keratoplasty (DSAEK). Retrospective chart review was performed of DSAEK graft preparations by 3 experienced technicians from April 2012 to May 2017 at the Eye Bank of Canada-Ontario Division. Variables collected included the following: donor demographics, death-to-preservation time, death-to-processing time, precut tissue thickness, postcut tissue thickness, microkeratome head size, endothelial cell count, cut technician, and rate of perforation. Linear regression models were generated for each microkeratome head size (300 and 350 μm). A total of 780 grafts were processed during the study period. Twelve preparation attempts resulted in perforation (1.5%) and were excluded. Mean precut tissue thickness was 510 ± 49 μm (range: 363-670 μm). Mean postcut tissue thickness was 114 ± 22 μm (range: 57-193 μm). Seventy-nine percent (608/768) of grafts were ≤130 μm. The linear regression models included precut thickness and donor age, which were able to predict the thickness to within 25 μm 80% of the time. We report a nomogram to predict thickness of DSAEK corneal grafts prepared in an eye bank setting, which was accurate to within 25 μm 80% of the time. Other eye banks could consider performing similar analyses.
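    The nomogram's functional form can be sketched as an ordinary least-squares fit of postcut thickness on precut thickness and donor age (all coefficients and data points below are made up for illustration; the published model's coefficients are not reported in the abstract):

```python
# OLS via the normal equations for postcut = b0 + b1*precut + b2*age.
# All numbers are hypothetical; this only illustrates the model's form.

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small linear system."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

def ols(rows, y):
    """rows: design matrix rows [1, precut, age]; returns [b0, b1, b2]."""
    k = len(rows[0])
    XtX = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    return solve(XtX, Xty)

true_b = [-20.0, 0.25, 0.10]    # hypothetical intercept, per-um, per-year terms
rows = [[1.0, 450.0, 60.0], [1.0, 500.0, 70.0],
        [1.0, 550.0, 50.0], [1.0, 520.0, 65.0]]
y = [sum(b * x for b, x in zip(true_b, r)) for r in rows]
b_hat = ols(rows, y)            # recovers true_b (noise-free toy data)
predicted = sum(b * x for b, x in zip(b_hat, [1.0, 510.0, 62.0]))
```

    On real eye-bank data the fit would carry residual noise, which is what the abstract's "within 25 μm 80% of the time" accuracy statement quantifies.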

  20. Periosteal BMP2 activity drives bone graft healing.

    PubMed

    Chappuis, Vivianne; Gamer, Laura; Cox, Karen; Lowery, Jonathan W; Bosshardt, Dieter D; Rosen, Vicki

    2012-10-01

    differentiation along the osteo-chondrogenic pathway. These results indicate that BMP2 will be among the signaling molecules whose presence will determine success or failure of new bone graft strategies. Copyright © 2012 Elsevier Inc. All rights reserved.