Clerkin, Kevin J.; Restaino, Susan W.; Zorn, Emmanuel; Vasilescu, Elena R.; Marboe, Charles C.; Mancini, Donna M.
2017-01-01
Background Antibody-mediated rejection (AMR) has been associated with increased mortality and cardiac allograft vasculopathy (CAV). Early studies suggested that late AMR was rarely associated with graft dysfunction, while recent reports have demonstrated an association with increased mortality. We sought to investigate the timing of AMR and its association with graft dysfunction, mortality, and CAV. Methods This retrospective cohort study identified all adult heart transplant recipients at Columbia University Medical Center from 2004–2013 (689 patients). There were 68 primary cases of AMR, which were stratified by early (<1 year post-OHT) or late (>1 year post-OHT) AMR. Kaplan-Meier survival analysis and modeling were performed with multivariable logistic regression and Cox proportional hazards regression. Results From January 1, 2004, through October 1, 2015, 43 patients had early AMR (median 23 days post-OHT) and 25 had late AMR (median 1084 days post-OHT). Graft dysfunction was less common with early compared with late AMR (25.6% vs. 56%, p=0.01). Patients with late AMR had decreased post-AMR survival compared with early AMR (1-year 80% vs. 93%, 5-year 51% vs. 73%, p<0.05). When stratified by graft dysfunction, only those with late AMR and graft dysfunction had worse survival (30-day 79%, 1-year 64%, and 5-year 36%, p<0.006). The association remained irrespective of age, sex, DSA, LVAD use, reason for OHT, and recovery of graft function. Similarly, those with late AMR and graft dysfunction had accelerated development of de-novo CAV (50% at 1 year, HR 5.42, p=0.009), while all other groups were similar to the general transplant population. Conclusion Late AMR is frequently associated with graft dysfunction. When graft dysfunction is present in late AMR, there is an early and sustained increased risk of mortality and rapid development of de-novo CAV despite aggressive treatment. PMID:27423693
Clerkin, Kevin J; Restaino, Susan W; Zorn, Emmanuel; Vasilescu, Elena R; Marboe, Charles C; Mancini, Donna M
2016-09-01
Antibody-mediated rejection (AMR) has been associated with increased death and cardiac allograft vasculopathy (CAV). Early studies suggested that late AMR was rarely associated with graft dysfunction, whereas recent reports have demonstrated an association with increased mortality. We investigated the timing of AMR and its association with graft dysfunction, death, and CAV. This retrospective cohort study identified all adult orthotopic heart transplant (OHT) recipients (N = 689) at Columbia University Medical Center from 2004 to 2013. There were 68 primary cases of AMR, which were stratified by early (< 1 year post-OHT) or late (> 1 year post-OHT) AMR. Kaplan-Meier survival analysis and modeling were performed with multivariable logistic regression and Cox proportional hazards regression. From January 1, 2004, through October 1, 2015, early AMR (median 23 days post-OHT) occurred in 43 patients and late AMR (median 1,084 days post-OHT) occurred in 25. Graft dysfunction was less common with early compared with late AMR (25.6% vs 56%, p = 0.01). Patients with late AMR had decreased post-AMR survival compared with early AMR (1 year: 80% vs 93%, 5 years: 51% vs 73%, p < 0.05). When stratified by graft dysfunction, only those with late AMR and graft dysfunction had worse survival (30 days: 79%, 1 year: 64%, 5 years: 36%; p < 0.006). The association remained irrespective of age, sex, donor-specific antibodies, left ventricular assist device use, reason for OHT, and recovery of graft function. Similarly, those with late AMR and graft dysfunction had accelerated development of de novo CAV (50% at 1 year; hazard ratio, 5.42; p = 0.009), whereas all other groups were similar to the general transplant population. Late AMR is frequently associated with graft dysfunction. When graft dysfunction is present in late AMR, there is an early and sustained increased risk of death and rapid development of de novo CAV despite aggressive treatment. Copyright © 2016 International Society for Heart and Lung Transplantation. Published by Elsevier Inc. All rights reserved.
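As an aside for readers who want to reproduce this kind of analysis on their own data, the sketch below illustrates the methods named above (Kaplan-Meier curves, a log-rank comparison, and a Cox proportional hazards model). It assumes the third-party lifelines and pandas packages and uses a small hypothetical DataFrame with invented column names; it is not the authors' code or data.

```python
# Minimal sketch of the survival methods named in the abstract, assuming the
# lifelines package; the DataFrame and its column names are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

df = pd.DataFrame({
    "years_post_amr": [0.3, 1.2, 2.5, 4.0, 5.1, 0.8, 1.9, 3.3, 2.2, 4.6, 0.5, 3.9],
    "death":          [1,   0,   1,   0,   0,   1,   1,   0,   1,   0,   1,   0  ],
    "late_amr":       [0,   0,   0,   0,   0,   0,   1,   1,   1,   1,   1,   1  ],
})

early, late = df[df.late_amr == 0], df[df.late_amr == 1]

# Kaplan-Meier survival curves for each stratum
kmf = KaplanMeierFitter()
kmf.fit(early.years_post_amr, early.death, label="early AMR")
print(kmf.survival_function_)
kmf.fit(late.years_post_amr, late.death, label="late AMR")
print(kmf.survival_function_)

# Log-rank test comparing the two strata
lr = logrank_test(early.years_post_amr, late.years_post_amr,
                  event_observed_A=early.death, event_observed_B=late.death)
print("log-rank p =", round(lr.p_value, 3))

# Cox proportional hazards model with late AMR as the covariate
cph = CoxPHFitter()
cph.fit(df, duration_col="years_post_amr", event_col="death")
cph.print_summary()
```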
Primary graft dysfunction of the liver: definitions, diagnostic criteria and risk factors.
Neves, Douglas Bastos; Rusi, Marcela Balbo; Diaz, Luiz Gustavo Guedes; Salvalaggio, Paolo
2016-01-01
Primary graft dysfunction is a multifactorial syndrome with great impact on liver transplantation outcomes. This review article was based on studies published between January 1980 and June 2015 and retrieved from PubMed database using the following search terms: "primary graft dysfunction", "early allograft dysfunction", "primary non-function" and "liver transplantation". Graft dysfunction describes different grades of graft ischemia-reperfusion injury and can manifest as early allograft dysfunction or primary graft non-function, its most severe form. Donor-, surgery- and recipient-related factors have been associated with this syndrome. Primary graft dysfunction definition, diagnostic criteria and risk factors differ between studies.
Azevedo, L D; Stucchi, R S; de Ataíde, E C; Boin, I F S F
2015-05-01
Graft dysfunction after liver transplantation is a serious complication that can lead to graft loss and patient death. This was a study to identify risk factors for early death (up to 30 days after transplantation). It was an observational and retrospective analysis at the Liver Transplantation Unit, Hospital de Clinicas, State University of Campinas, Brazil. From July 1994 to December 2012, 302 patients were included (>18 years old, piggyback technique). Of these cases, 26% died within 30 days. For analysis, Student t tests and chi-square were used to analyze recipient-related (age, body mass index, serum sodium, graft dysfunction, Model for End-Stage Liver Disease score, renal function, and early graft dysfunction [EGD type 1, 2, or 3]), surgery-related (hot and cold ischemia, surgical time, and units of packed erythrocytes [pRBC]), and donor-related (age, hypotension, and brain death cause) factors. Risk factors were identified by means of a logistic regression model, with goodness of fit assessed by the Hosmer-Lemeshow test and significance set at P < .05. We found that hyponatremic recipients had a 6.26-fold higher risk for early death. There was a 9% reduced chance of death when the recipient serum sodium increased 1 unit. The chance of early death with EGD3 was 18-fold higher than with EGD1, and there was a 13% increased risk for death for each unit of pRBC transfused. Donor total bilirubin, hyponatremia, massive transfusion, and EGD3 should be considered in graft allocation to achieve better results in the postoperative period. Copyright © 2015 Elsevier Inc. All rights reserved.
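For readers parsing the risk estimates above, the per-unit percentages map directly onto exp(coefficient) odds ratios from the logistic model; the tiny sketch below illustrates that arithmetic with hypothetical coefficient values chosen only to match the quoted 9% and 13% figures.

```python
import math

# Odds ratios from a logistic regression are exp(beta) per 1-unit change in the
# predictor. The beta values below are hypothetical, chosen for illustration.
beta_sodium = -0.094   # per 1 mEq/L increase in recipient serum sodium
beta_prbc = 0.122      # per unit of packed red blood cells transfused

print(f"OR per sodium unit: {math.exp(beta_sodium):.2f}")  # ~0.91 -> ~9% lower odds of early death
print(f"OR per pRBC unit:   {math.exp(beta_prbc):.2f}")    # ~1.13 -> ~13% higher odds of early death
```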
NASA Astrophysics Data System (ADS)
Chi, Jingmao; Chen, Hui; Tolias, Peter; Du, Henry
2014-06-01
We have explored the use of a fiber-optic probe with surface-enhanced Raman scattering (SERS) sensing modality for early, noninvasive, and rapid diagnosis of potential renal acute rejection (AR) and other renal graft dysfunction in kidney transplant patients. Multimode silica optical fiber immobilized with colloidal Ag nanoparticles at the distal end was used for SERS measurements of as-collected urine samples at 632.8 nm excitation wavelength. All patients with abnormal renal graft function (3 AR episodes and 2 graft failure episodes) who were clinically diagnosed independently showed common unique SERS spectral features in the urine collected just one day after transplant. The SERS-based fiber-optic probe has excellent potential to be a bedside tool for early diagnosis of kidney transplant patients for timely medical intervention of patients at high risk of transplant dysfunction.
Böhmig, G A; Regele, H; Säemann, M D; Exner, M; Druml, W; Kovarik, J; Hörl, W H; Zlabinger, G J; Watschinger, B
2000-04-01
Excellent graft outcome has been reported for spousal-donor kidney transplantation. In husband-to-wife transplantation, however, a tendency toward inferior graft survival has been described for recipients who were previously pregnant. In our series of spousal-kidney transplantations (nine transplantations; three female recipients), actual graft survival is 100% (median observation time, 339 days). Five patients experienced early allograft rejection. In four transplant recipients, rejection was easily reversible by conventional antirejection therapy. In a multiparous recipient, however, mild interstitial allograft rejection associated with early graft dysfunction was resistant to anticellular treatment (antilymphocyte antibody, tacrolimus rescue therapy). The particular finding of polymorphonuclear neutrophils in peritubular capillaries and the finding of diffuse capillary deposits of the complement split product, C4d, in a posttransplantation biopsy specimen suggested a role of antibody-mediated graft injury. Retrospective flow cytometry cross-matching showed the presence of preformed immunoglobulin G (IgG) antibodies to HLA class I antigens that were not detectable by pretransplantation lymphocytotoxic cross-match testing or screening for panel reactive antibodies. After transplantation, however, complement-fixing antibodies, also presumably triggered by reexposure to spousal-donor HLA antigens, could be detected in the patient's serum. These findings suggested antibody-mediated allograft rejection and led to the initiation of immunoadsorption therapy (14 sessions) with staphylococcal protein A. Selective removal of recipient IgG resulted in complete reversal of graft dysfunction. Our findings suggest that in husband-to-wife transplantation, donor-specific antibodies, presumably triggered by previous pregnancies, might occasionally induce sustained allograft dysfunction. Thus, in this particular setting, a detailed immunologic and histopathologic work-up regarding antibody-mediated allograft dysfunction is warranted because immunoadsorption may be a highly effective treatment modality.
Optical monitoring of kidney oxygenation and hemodynamics using a miniaturized near-infrared sensor
NASA Astrophysics Data System (ADS)
Shadgan, Babak; Macnab, Andrew; Nigro, Mark; Nguan, Christopher
2017-02-01
Background: Following human renal allograft transplantation, primary graft dysfunction can occur early in the postoperative period as a result of acute tubular necrosis, acute rejection, drug toxicity, and vascular complications. Successful treatment of graft dysfunction requires early detection and accurate diagnosis so that disease-specific medical and/or surgical intervention can be provided promptly. However, current diagnostic methods are not sensitive or specific enough, so that identifying the cause of graft dysfunction is problematic and often delayed. Near-infrared spectroscopy (NIRS) is an established optical method that monitors changes in tissue hemodynamics and oxygenation in real time. We report the feasibility of directly monitoring the kidney in an animal model using NIRS to detect renal ischemia and hypoxia. Methods: In an anesthetized pig, a customized continuous wave spatially resolved (SR) NIRS sensor was fixed directly to the surface of the surgically exposed kidney. Changes in the concentration of oxygenated (O2Hb), deoxygenated (HHb), and total hemoglobin (THb) were monitored before, during and after renal artery clamping and reperfusion, and the resulting fluctuations in chromophore concentration from baseline were used to measure variations in renal perfusion and oxygenation. Results: On clamping the renal artery, THb and O2Hb concentrations declined progressively while HHb rose. With reperfusion after release of the artery clamp, O2Hb and THb rose while HHb fell, with all parameters returning to their baseline values. This pattern was similar in all three trials. Conclusion: This pilot study indicates that a miniaturized NIRS sensor applied directly to the surface of a kidney in an animal model can detect the onset of renal ischemia and tissue hypoxia. With modification, our NIRS-based method may contribute to early detection of renal vascular complications and graft dysfunction following renal transplant.
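The chromophore quantities reported above obey a simple bookkeeping relation (total hemoglobin is the sum of the oxygenated and deoxygenated signals, and an oxygenation index is their ratio); the sketch below illustrates it with made-up concentrations and is not the authors' processing pipeline.

```python
# Illustrative NIRS arithmetic with hypothetical chromophore concentrations (uM):
# THb = O2Hb + HHb, and a tissue oxygenation index is O2Hb as a fraction of THb.
def tissue_oxygenation(o2hb: float, hhb: float) -> tuple:
    thb = o2hb + hhb                           # total hemoglobin
    toi = 100.0 * o2hb / thb if thb else 0.0   # oxygenation index, %
    return thb, toi

for label, o2hb, hhb in [("baseline", 42.0, 18.0), ("artery clamped", 20.0, 31.0)]:
    thb, toi = tissue_oxygenation(o2hb, hhb)
    print(f"{label}: THb = {thb:.0f} uM, oxygenation index = {toi:.0f}%")
```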
Metabolomics discloses donor liver biomarkers associated with early allograft dysfunction.
Cortes, Miriam; Pareja, Eugenia; García-Cañaveras, Juan C; Donato, M Teresa; Montero, Sandra; Mir, Jose; Castell, José V; Lahoz, Agustín
2014-09-01
Early allograft dysfunction (EAD) dramatically influences graft and patient outcome after orthotopic liver transplantation and its incidence is strongly determined by donor liver quality. Nevertheless, objective biomarkers, which can assess graft quality and anticipate organ function, are still lacking. This study aims to investigate whether there is a preoperative donor liver metabolomic biosignature associated with EAD. A comprehensive metabolomic profiling of 124 donor liver biopsies collected before transplantation was performed by mass spectrometry coupled to liquid chromatography. Donor liver grafts were classified into two groups: showing EAD and immediate graft function (IGF). Multivariate data analysis was used to search for the relationship between the metabolomic profiles present in donor livers before transplantation and their function in recipients. A set of liver graft dysfunction-associated biomarkers was identified. Key changes include significantly increased levels of bile acids, lysophospholipids, phospholipids, sphingomyelins and histidine metabolism products, all suggestive of disrupted lipid homeostasis and altered histidine pathway. Based on these biomarkers, a predictive EAD model was built and further evaluated by assessing 24 independent donor livers, yielding 91% sensitivity and 82% specificity. The model was also successfully challenged by evaluating donor livers showing primary non-function (n=4). A metabolomic biosignature that accurately differentiates donor livers, which later showed EAD or IGF, has been deciphered. The remarkable metabolomic differences between donor livers before transplant can relate to their different quality. The proposed metabolomic approach may become a clinical tool for donor liver quality assessment and for anticipating graft function before transplant. Copyright © 2014 European Association for the Study of the Liver. Published by Elsevier B.V. All rights reserved.
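The quoted 91% sensitivity and 82% specificity follow from the usual confusion-matrix definitions; the snippet below recomputes them from hypothetical validation counts chosen only to land near those values, not the study's actual 24-liver tally.

```python
# Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP). Counts are hypothetical.
tp, fn = 10, 1   # EAD livers classified correctly / missed
tn, fp = 9, 2    # IGF livers classified correctly / flagged in error

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")  # ~91%, ~82%
```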
Long-term outcomes and management of the heart transplant recipient.
McCartney, Sharon L; Patel, Chetan; Del Rio, J Mauricio
2017-06-01
Cardiac transplantation remains the gold standard in the treatment of advanced heart failure. With advances in immunosuppression, long-term outcomes continue to improve despite older and higher risk recipients. The median survival of the adult after heart transplantation is currently 10.7 years. While early graft failure and multiorgan system dysfunction are the most important causes of early mortality, malignancy, rejection, infection, and cardiac allograft vasculopathy contribute to late mortality. Chronic renal dysfunction is common after heart transplantation and occurs in up to 68% of patients by year 10, with 6.2% of patients requiring dialysis and 3.7% undergoing renal transplant. Functional outcomes after heart transplantation remain an area for improvement, with only 26% of patients working at 1-year post-transplantation, and are likely related to the high incidence of depression after cardiac transplantation. Areas of future research include understanding and managing primary graft dysfunction and reducing immunosuppression-related complications. Copyright © 2017 Elsevier Ltd. All rights reserved.
Jochmans, Ina; Lerut, Evelyne; van Pelt, Jos; Monbaliu, Diethard; Pirenne, Jacques
2011-11-01
To investigate circulating biomarkers of initial graft injury in a porcine kidney autotransplant model. Injury endured by kidney grafts early posttransplant determines their outcome. However, creatinine (clearance) is a poor surrogate of tissue injury and urinary biomarkers are limited by graft anuria or persistent native kidney diuresis. No validated circulating biomarkers quantifying initial graft injury exist. Minimally injured porcine kidney grafts (n = 6) were cold stored (18 hours) and autotransplanted. Moderately (n = 6) and severely injured grafts (n = 7) were exposed to 30 or 60 minutes warm ischemia before storage and autotransplantation. Four biomarkers [aspartate transaminase (AST), heart-type fatty acid-binding protein (H-FABP), neutrophil gelatinase-associated lipocalin (NGAL), and N-acetyl-β-glucosaminidase (NAG)] were measured posttransplant and compared with creatinine (clearance) and histology. Diuresis was delayed in moderately [2.5 days (2-3)] and severely [4 days (4-5)] versus minimally injured grafts (P < 0.001). Creatinine peaked later than AST, H-FABP, and NGAL [4 days (3-5) vs 3 hours (3-6), 6 hours (6-24), 2 days (1-3), respectively] and only differentiated minimally from severely injured grafts. Peak AST and H-FABP distinguished all injury grades. Neutrophil gelatinase-associated lipocalin discriminated initial graft injury 2 days posttransplant. Peak AST, H-FABP, and NGAL correlated with peak creatinine [Pearson coefficients: 0.70 (P = 0.001), 0.85 (P < 0.0001), 0.80 (P < 0.0001)]. N-acetyl-β-glucosaminidase was not different. Decreased clearance accounted for a small percentage of H-FABP and NGAL increase. Histology was not different among transplanted groups. Plasma AST, H-FABP, and NGAL reflect the severity of initial kidney graft injury and predict graft dysfunction earlier and more accurately than creatinine (clearance) and histology. They represent promising tools to improve patient care after kidney transplantation.
Short- and long-term outcomes of 1000 adult lung transplant recipients at a single center.
Kreisel, Daniel; Krupnick, Alexander S; Puri, Varun; Guthrie, Tracey J; Trulock, Elbert P; Meyers, Bryan F; Patterson, G Alexander
2011-01-01
Lung transplantation has become accepted therapy for end-stage pulmonary disease. The objective of this study was to review a single-institution experience of adult lung transplants. We reviewed 1000 adult lung transplants that were performed at Washington University between July 1988 and January 2009. Transplants were performed for emphysema (52%), cystic fibrosis (18.2%), pulmonary fibrosis (16.1%), and pulmonary vascular disease (7.2%). Overall recipient age was 48 ± 13 years with an increase from 43 ± 12 years (July 1988-November 1993) to 50 ± 14 years (June 2005-January 2009). Overall incidence of primary graft dysfunction was 22.1%. Hospital mortality was higher for patients who had primary graft dysfunction (primary graft dysfunction, 13.6%; no primary graft dysfunction, 4%; P < .001). Freedom from bronchiolitis obliterans syndrome was 84% at 1 year, 38.2% at 5 years, and 12.2% at 10 years. Survival at 1, 5, 10, and 15 years was 84%, 56.4%, 32.2%, and 17.8%, respectively. Five-year survival improved from 49.6% (July 1988-November 1993) to 62.1% (October 2001-June 2005). Primary graft dysfunction was associated with lower survival at 1, 5, and 10 years (primary graft dysfunction: 72.8%, 43.9%, and 18.7%, respectively; no primary graft dysfunction: 87.1%, 59.8%, and 35.7%, respectively, P < .001) and lower rates of freedom from bronchiolitis obliterans syndrome (primary graft dysfunction: 78%, 27.5%, and 8.5%, respectively; no primary graft dysfunction: 85.4%, 40.7%, and 13.1%, respectively, P = .007). Five-year survival has improved over the study period, but long-term outcomes are limited by bronchiolitis obliterans syndrome. Primary graft dysfunction is associated with higher rates of bronchiolitis obliterans syndrome and impaired short- and long-term survival. A better understanding of primary graft dysfunction and bronchiolitis obliterans syndrome is critical to improve outcomes. Copyright © 2011 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
Nemes, Balázs; Gámán, György; Polak, Wojciech G; Gelley, Fanni; Hara, Takanobu; Ono, Shinichiro; Baimakhanov, Zhassulan; Piros, Laszlo; Eguchi, Susumu
2016-07-01
Extended-criteria donors (ECDs) have an impact on early allograft dysfunction (EAD), biliary complications, relapse of hepatitis C virus (HCV), and survival. Early allograft dysfunction was frequently seen in grafts with moderate and severe steatosis. Donors after cardiac death (DCD) have been associated with higher rates of graft failure and biliary complications compared to donors after brain death. Extended warm ischemia, reperfusion injury, and endothelial activation trigger a cascade leading to microvascular thrombosis, resulting in biliary necrosis, cholangitis, and graft failure. The risk of HCV recurrence increased with donor age and was associated with the use of moderately and severely steatotic grafts. With the administration of protease inhibitors, sustained virological response was achieved in the majority of patients. The donor risk index and extended-criteria donor scores (DS) are reported to be useful for assessing outcome. The 1-year survival rates were 87% and 40%, respectively, for donors with a DS of 0 and 3. Graft survival was excellent up to a DS of 2; however, a DS >2 should be avoided in higher-risk recipients. The 1-, 3- and 5-year survival of DCD recipients was comparable to that of optimal donors; however, ECD recipients had lower survival rates of 85%, 78.6%, and 72.3%. The graft survival of split liver transplantation (SLT) was comparable to that of whole-liver orthotopic liver transplantation, and SLT is no longer regarded as an ECD factor in the MELD era. Full-right-full-left split liver transplantation has a significant advantage in extending the high-quality donor pool. Hypothermic oxygenated machine perfusion can be applied clinically in DCD liver grafts; its feasibility and safety have been confirmed, and reperfusion injury was rare in machine-perfused DCD livers.
Rentsch, Markus; Kienle, Klaus; Mueller, Thomas; Vogel, Mandy; Jauch, Karl Walter; Püllmann, Kerstin; Obed, Aiman; Schlitt, Hans J; Beham, Alexander
2005-11-27
Primary graft dysfunction due to ischemia and reperfusion injury represents a major problem in liver transplantation. The related cell stress may induce apoptosis, which can be suppressed by bcl-2. The purpose of the study was to investigate the effect of adenoviral bcl-2 gene transfer on early graft function and survival in rat liver transplantation. An adenoviral construct that transfers bcl-2 under the control of a tetracycline-inducible promoter was generated (advTetOn bcl-2) and used with a second adenovirus that transfers the repressor protein (advCMV Rep). Forty-eight hours before explantation, donor rats were treated with advTetOn bcl-2/advCMV Rep (n=7) and doxycycline, with the control adenoviral construct advCMV GFP (n=8), or with doxycycline alone (n=8). Liver transplantation was performed following 16 hours of cold storage (UW). Bcl-2 expression and intrahepatic apoptosis were assessed. Bile flow was monitored 90 min posttransplantation. The endpoint for survival was 7 days. Bcl-2 was expressed in hepatocytes and sinusoidal lining cells. This was associated with a significant reduction of apoptotic sinusoidal lining cells and hepatocytes after 24 hours and 7 days. Bile production was significantly higher following bcl-2 pretreatment. Furthermore, bcl-2 transfer resulted in significantly improved survival (100% vs. 50% in both control groups). Adenoviral bcl-2 transfer results in protein expression in hepatocytes and sinusoidal lining cells, resulting in enhanced early graft function and survival after prolonged ischemia and reperfusion injury. The inhibition of apoptosis in the context of liver transplantation might be a reasonable approach in the treatment of graft dysfunction.
Hoyer, Dieter P; Paul, Andreas; Gallinat, Anja; Molmenti, Ernesto P; Reinhardt, Renate; Minor, Thomas; Saner, Fuat H; Canbay, Ali; Treckmann, Jürgen W; Sotiropoulos, Georgios C; Mathé, Zoltan
2015-01-01
Poor initial graft function was recently defined as early allograft dysfunction (EAD) [Olthoff KM, Kulik L, Samstein B, et al. Validation of a current definition of early allograft dysfunction in liver transplant recipients and analysis of risk factors. Liver Transpl 2010; 16: 943]. The aim of this analysis was to evaluate predictive donor information for the development of EAD. Six hundred and seventy-eight consecutive adult patients (mean age 51.6 years; 60.3% men) who received a primary liver transplantation (LT) (09/2003-12/2011) were included. Standard donor data were correlated with EAD and outcome by univariable/multivariable logistic regression and Cox proportional hazards to identify prognostic donor factors after adjustment for recipient confounders. Estimates of relevant factors were utilized for construction of a new continuous risk index for the development of EAD. Overall, 38.7% of patients developed EAD. The 30-day survival of grafts with and without EAD was 59.8% and 89.7% (P < 0.0001). The 30-day survival of patients with and without EAD was 68.5% and 93.1% (P < 0.0001), respectively. Donor body mass index (P = 0.0112), gGT (P = 0.0471), macrosteatosis (P = 0.0006) and cold ischaemia time (CIT) (P = 0.0031) were predictors of EAD. Internal cross-validation showed a high predictive value (c-index = 0.622). Early allograft dysfunction correlates with early results of LT and can be predicted from donor data only. The newly introduced risk index potentially optimizes individual decisions to accept/decline high-risk organs. Outcome of these organs might be improved by shortening CIT. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
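A minimal sketch of the modeling strategy described above: fit a logistic regression for EAD on donor-only variables (body mass index, gGT, macrosteatosis, cold ischaemia time) and summarize discrimination with a c-index, which for a binary outcome equals the ROC AUC. It assumes numpy and scikit-learn, runs on synthetic data, and does not reproduce the published coefficients or the reported c-index of 0.622.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 678
# Synthetic donor variables: BMI, gGT (U/L), macrosteatosis (%), cold ischaemia time (h).
X = np.column_stack([
    rng.normal(26, 4, n),
    rng.lognormal(3.5, 0.6, n),
    rng.uniform(0, 40, n),
    rng.normal(9, 3, n),
])
# Synthetic EAD labels loosely driven by the same variables (illustration only).
logit = -4 + 0.05 * X[:, 0] + 0.004 * X[:, 1] + 0.03 * X[:, 2] + 0.15 * X[:, 3]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = LogisticRegression(max_iter=1000).fit(X, y)
risk_index = model.decision_function(X)      # continuous donor-based risk index
print("c-index:", round(roc_auc_score(y, risk_index), 3))
```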
Gastaca, M; Prieto, M; Valdivieso, A; Ruiz, P; Ventoso, A; Palomares, I; Matarranz, A; Martinez-Indart, L; Ortiz de Urbina, J
2016-09-01
The aim of this study was to determine whether a portal flow of <1,000 mL/min in orthotopic liver transplantation (OLT) is associated with a higher incidence of early graft dysfunction (EGD) and graft loss. A retrospective study was performed of 540 OLTs carried out consecutively from December 2004 to December 2013. Patients were divided into 2 groups: group A, portal flow <1,000 mL/min; and group B, portal flow >1,000 mL/min. We studied the incidence of EGD and graft survival. A subanalysis was performed to define the minimum acceptable portal flow/100 g of liver weight to reduce the development of EGD and graft loss. Group A included 29 patients and group B, 511 patients. Group A had significantly lower-weight donors and recipients, female recipients with cholestatic disease, lower MELD scores, and lower hepatic artery flow. EGD occurred in 7 patients in group A (24.1%) versus 101 patients in group B (19.8%; P = .43). No significant differences were found in 1- and 5-year graft survival. A portal flow of <80 mL/min/100 g of liver weight was related to a significantly higher risk of developing EGD (odds ratio, 4.35; 95% confidence interval [CI], 1.46-12.91; P = .008) and graft loss (hazard ratio, 4.05; 95% CI, 1.32-12.42; P = .014). Intraoperative portal flow of <1,000 mL/min in OLT was not related per se with a higher incidence of EGD or graft loss. A significantly higher risk of developing EGD and graft loss was associated with a portal flow of <80 mL/min/100 g of liver weight. Copyright © 2016 Elsevier Inc. All rights reserved.
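The weight-normalized threshold from the subanalysis is simple arithmetic; the helper below, with hypothetical names and example values, divides the measured portal flow by graft weight per 100 g and applies the 80 mL/min/100 g cutoff quoted above.

```python
# Normalize intraoperative portal flow to graft weight and flag the <80 mL/min/100 g
# threshold associated with EGD and graft loss in the abstract; example values are hypothetical.
def portal_flow_per_100g(portal_flow_ml_min: float, graft_weight_g: float) -> float:
    return portal_flow_ml_min / (graft_weight_g / 100.0)

flow = portal_flow_per_100g(portal_flow_ml_min=950, graft_weight_g=1400)
print(f"{flow:.0f} mL/min/100 g ->", "below the 80 mL/min/100 g cutoff" if flow < 80 else "above the cutoff")
```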
Influence of preformed donor-specific antibodies and C4d on early liver allograft function.
Perera, M T; Silva, M A; Murphy, N; Briggs, D; Mirza, D F; Neil, D A H
2013-12-01
INTRODUCTION. The impact of preformed donor-specific antibodies (DSA) is incompletely understood in liver transplantation. The incidence of preformed DSA and their impact on the early post-liver transplant course were assessed and correlated with the complement fragment C4d on allograft biopsy. METHODS. Pretransplant sera from 41 consecutive liver transplant recipients (donors after brain death, DBD = 27; donors after cardiac death, DCD = 14) were tested for class-specific anti-human leukocyte antigen (HLA) antibodies and compared against donor HLA types. Liver biopsies were taken during cold storage (t-1) and post-reperfusion (t0), stained with C4d, and graded for preservation-reperfusion injury (PRI). RESULTS. Of the 41 recipients, 8 (20%) had anti-HLA class I/II antibodies pretransplant, and 3 (7%) had confirmed preformed DSA: classes I and II (n=1) and class I only (n=2). No biopsies showed definite evidence of antibody-mediated rejection. Overall, graft biopsies showed only mild PRI, with an ischemic hepatocyte C4d pattern similar in both DSA-positive and DSA-negative patients. One DSA-positive patient (33%) compared with four DSA-negative patients (10%) had significant early graft dysfunction; severe PRI causing graft loss from primary nonfunction was seen only in the DSA-negative group. The allograft biopsy of the preformed DSA-positive patient demonstrated only minimal PRI; however, no identifiable cause other than preformed DSA could be attributed to the graft dysfunction. CONCLUSION. Preformed DSA are present in 5-10% of liver transplant recipients. There is no association between anti-HLA DSA and PRI or C4d, but preformed DSA may cause early morbidity. Larger studies on the impact of DSA with optimization of C4d techniques are required.
Benko, Tamas; Gallinat, Anja; Minor, Thomas; Saner, Fuat H; Sotiropoulos, Georgios C; Paul, Andreas; Hoyer, Dieter P
2017-06-01
Recently, the postoperative Model for End-Stage Liver Disease score (POPMELD) was suggested as a definition of postoperative graft dysfunction and a predictor of outcome after liver transplantation (LT). The aim of the present study was to validate this concept in the context of extended-criteria donor (ECD) organs. Single-center prospectively collected data (OPAL study/01/11-12/13) of 116 ECD LTs were utilized. For each recipient, the Model for End-Stage Liver Disease (MELD) score was calculated for 7 postoperative days (PODs). The ability of international normalized ratio, bilirubin, aspartate aminotransferase, Donor Risk Index, a recent definition of early allograft dysfunction, and the POPMELD to predict 90-day graft loss was compared. Predictive abilities were compared by receiver operating characteristic curves, sensitivity and specificity, and positive and negative predictive values. The median Donor Risk Index was 1.8. In all, 60.3% of recipients were men [median age of 54 (23-68) years]. The median POD1-7 peak aspartate aminotransferase value was 1052 (194-17,577) U/L. The rate of early allograft dysfunction was 22.4%. The 90-day graft survival was 89.7%. Of the possible predictors of 90-day graft loss, MELD on POD5 was the best predictor of outcome (area under the curve = 0.84). A MELD score of 16 or more on POD5 predicted 90-day graft loss with a specificity of 80.8%, a sensitivity of 81.8%, and positive and negative predictive values of 31% and 97.7%. A MELD score of 16 or more on POD5 is an excellent predictor of outcome in ECD donor LT. Routine evaluation of POPMELD scores might support clinical decision-making and should be reported routinely in clinical trials.
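The POPMELD concept amounts to recomputing the laboratory MELD score on consecutive postoperative days and applying a cutoff, here MELD ≥16 on POD5. The sketch below uses the commonly implemented laboratory MELD formula with its usual input bounds; it is an illustration rather than the study's code, and the example laboratory values are hypothetical.

```python
import math

def meld(bilirubin_mg_dl: float, inr: float, creatinine_mg_dl: float) -> int:
    """Laboratory MELD as commonly implemented: inputs below 1.0 are floored at
    1.0 and creatinine is capped at 4.0 mg/dL."""
    bili = max(bilirubin_mg_dl, 1.0)
    inr = max(inr, 1.0)
    crea = min(max(creatinine_mg_dl, 1.0), 4.0)
    return round(3.78 * math.log(bili) + 11.2 * math.log(inr) + 9.57 * math.log(crea) + 6.43)

# Hypothetical POD5 laboratory values for two recipients; >=16 is the POPMELD
# cutoff reported above for predicting 90-day graft loss.
for labs in [dict(bilirubin_mg_dl=6.0, inr=1.8, creatinine_mg_dl=1.4),
             dict(bilirubin_mg_dl=2.0, inr=1.2, creatinine_mg_dl=0.9)]:
    score = meld(**labs)
    print(labs, "-> POD5 MELD", score, "(high risk)" if score >= 16 else "(low risk)")
```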
Benedetto, Umberto; Pecchinenda, Gustavo Guida; Chivasso, Pierpaolo; Bruno, Vito Domenico; Rapetto, Filippo; Bryan, Alan; Angelini, Gianni Davide
2016-01-01
Coronary artery bypass grafting remains the standard treatment for patients with extensive coronary artery disease. Coronary surgery without use of cardiopulmonary bypass avoids the deleterious systemic inflammatory effects of the extracorporeal circuit. However, there is an ongoing debate surrounding the clinical outcomes after on-pump versus off-pump coronary artery bypass (ONCAB versus OPCAB) surgery. The current review is based on evidence from randomized controlled trials (RCTs) and meta-analyses of randomized studies. It focuses on operative mortality, mid- and long-term survival, graft patency, completeness of revascularisation, neurologic and neurophysiologic outcomes, perioperative complications and outcomes in the high-risk groups. Early and late survival rates for both OPCAB and ONCAB grafting are similar. Some studies suggest poorer early vein graft patency with off-pump compared with on-pump surgery, comparable midterm arterial conduit patency, and no difference in long-term venous and arterial graft patency. A recent pooled analysis of randomised trials shows a reduction in stroke rates with use of off-pump techniques. Furthermore, OPCAB grafting seems to reduce postoperative renal dysfunction, bleeding, transfusion requirement and respiratory complications, while perioperative myocardial infarction rates are similar to ONCAB grafting. The high-risk patient groups seem to benefit from off-pump coronary surgery. PMID:27942394
Zishiri, Edwin T; Williams, Sarah; Cronin, Edmond M; Blackstone, Eugene H; Ellis, Stephen G; Roselli, Eric E; Smedira, Nicholas G; Gillinov, A Marc; Glad, Jo Ann; Tchou, Patrick J; Szymkiewicz, Steven J; Chung, Mina K
2013-02-01
Implantation of an implantable cardioverter-defibrillator for prevention of sudden cardiac death is deferred for 90 days after coronary revascularization, but mortality may be highest early after cardiac procedures in patients with ventricular dysfunction. We determined mortality risk in postrevascularization patients with left ventricular ejection fraction ≤35% and compared survival with those discharged with a wearable cardioverter defibrillator (WCD). Hospital survivors after surgical (coronary artery bypass graft surgery) or percutaneous (percutaneous coronary intervention [PCI]) revascularization with left ventricular ejection fraction ≤35% were included from Cleveland Clinic and national WCD registries. Kaplan-Meier, Cox proportional hazards, propensity score-matched survival, and hazard function analyses were performed. Early mortality hazard was higher among 4149 patients discharged without a defibrillator compared with 809 with WCDs (90-day mortality post-coronary artery bypass graft surgery 7% versus 3%, P=0.03; post-PCI 10% versus 2%, P<0.0001). WCD use was associated with adjusted lower risks of long-term mortality in the total cohort (39%, P<0.0001) and both post-coronary artery bypass graft surgery (38%, P=0.048) and post-PCI (57%, P<0.0001) cohorts (mean follow-up, 3.2 years). In propensity-matched analyses, WCD use remained associated with lower mortality (58% post-coronary artery bypass graft surgery, P=0.002; 67% post-PCI, P<0.0001). Mortality differences were not attributable solely to therapies for ventricular arrhythmia. Only 1.3% of the WCD group had a documented appropriate therapy. Patients with left ventricular ejection fraction ≤35% have higher early than late mortality after coronary revascularization, particularly after PCI. As the early hazard seemed less marked in WCD users, prospective studies in this high-risk population are indicated to confirm whether WCD use as a bridge to left ventricular ejection fraction improvement or implantable cardioverter defibrillator implantation can improve outcomes after coronary revascularization.
A score model for the continuous grading of early allograft dysfunction severity.
Pareja, Eugenia; Cortes, Miriam; Hervás, David; Mir, José; Valdivieso, Andrés; Castell, José V; Lahoz, Agustín
2015-01-01
Early allograft dysfunction (EAD) dramatically influences graft and patient outcomes. A lack of consensus on an EAD definition hinders comparisons of liver transplant outcomes and management of recipients among and within centers. We sought to develop a model for the quantitative assessment of early allograft function [Model for Early Allograft Function Scoring (MEAF)] after transplantation. A retrospective study including 1026 consecutive liver transplants was performed for MEAF score development. Multivariate data analysis was used to select a small number of postoperative variables that adequately describe EAD. Then, the distribution of these variables was mathematically modeled to assign a score for each actual variable value. A model, based on easily obtainable clinical parameters (ie, alanine aminotransferase, international normalized ratio, and bilirubin) and scoring liver function from 0 to 10, was built. The MEAF score showed a significant association with patient and graft survival at 3-, 6- and 12-month follow-ups. Hepatic steatosis and age for donors; cold/warm ischemia times and postreperfusion syndrome for surgery; and intensive care unit and hospital stays, Model for End-Stage Liver Disease and Child-Pugh scores, body mass index, and fresh frozen plasma transfusions for recipients were factors associated significantly with EAD. The model was satisfactorily validated by its application to an independent set of 200 patients who underwent liver transplantation at a different center. In conclusion, a model for the quantitative assessment of EAD severity has been developed and validated for the first time. The MEAF provides a more accurate graft function assessment than current categorical classifications and may help clinicians to make early enough decisions on retransplantation benefits. Furthermore, the MEAF score is a predictor of recipient and graft survival. The standardization of the criteria used to define EAD may allow reliable comparisons of recipients' treatments and transplant outcomes among and within centers. © 2014 American Association for the Study of Liver Diseases.
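The underlying idea, mapping each early postoperative laboratory value onto a bounded subscore and summing to a continuous 0-10 scale, can be sketched generically as below. The normalization ranges are placeholders invented for illustration; they are not the published MEAF functions or coefficients.

```python
# Generic illustration of a continuous 0-10 graft-function severity score built
# from three labs (ALT, INR, bilirubin). The ranges below are hypothetical
# placeholders, NOT the published MEAF parameters.
def subscore(value: float, low: float, high: float) -> float:
    """Map a lab value linearly onto 0-10/3, saturating outside [low, high]."""
    frac = (value - low) / (high - low)
    return (10.0 / 3.0) * min(max(frac, 0.0), 1.0)

def severity_score(alt_iu_l: float, inr: float, bilirubin_mg_dl: float) -> float:
    return (subscore(alt_iu_l, 100, 3000)        # peak ALT over the first days
            + subscore(inr, 1.0, 3.0)            # INR on an early postoperative day
            + subscore(bilirubin_mg_dl, 1, 15))  # bilirubin on an early postoperative day

print(round(severity_score(alt_iu_l=1800, inr=2.1, bilirubin_mg_dl=7.0), 1))  # ~5 of 10
```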
Early statin use is an independent predictor of long-term graft survival.
Moreso, Francesc; Calvo, Natividad; Pascual, Julio; Anaya, Fernando; Jiménez, Carlos; Del Castillo, Domingo; Sánchez-Plumed, Jaime; Serón, Daniel
2010-06-01
Background. Statin use in renal transplantation has been associated with a lower risk of patient death but not with an improvement of graft functional survival. The aim of this study is to evaluate the effect of statin use in graft survival, death-censored graft survival and patient survival using the data recorded on the Spanish Late Allograft Dysfunction Study Group. Patients and methods. Patients receiving a renal allograft in Spain in 1990, 1994, 1998 and 2002 were considered. Since the mean follow-up in the 2002 cohort was 3 years, statin use was analysed considering its introduction during the first year or during the initial 2 years after transplantation. Univariate and multivariate Cox regression analyses with a propensity score for statin use were employed to analyse graft survival, death-censored graft survival and patient survival. Results. In the 4682 evaluated patients, the early statin use after transplantation significantly increased from 1990 to 2002 (12.7%, 27.9%, 47.7% and 53.0%, P < 0.001). Statin use during the first year was not associated with graft or patient survival. Statin use during the initial 2 years was associated with a lower risk of graft failure (relative risk [RR] = 0.741 and 95% confidence interval [CI] = 0.635-0.866, P < 0.001) and patient death (RR = 0.806 and 95% CI = 0.656-0.989, P = 0.039). Death-censored graft survival was not associated with statin use during the initial 2 years. Conclusion. The early introduction of statin treatment after transplantation is associated with a significant decrease in late graft failure due to a risk reduction in patient death.
Long-Term Outcomes of Renal Transplant in Recipients With Lower Urinary Tract Dysfunction.
Wilson, Rebekah S; Courtney, Aisling E; Ko, Dicken S C; Maxwell, Alexander P; McDaid, James
2018-01-02
Lower urinary tract dysfunction can lead to chronic kidney disease, which, despite surgical intervention, will progress to end-stage renal disease, requiring dialysis. Urologic pathology may damage a transplanted kidney, limiting patient and graft survival. Although smaller studies have suggested that urinary tract dysfunction does not affect graft or patient survival, this is not universally accepted. Northern Ireland has historically had the highest incidence of neural tube defects in Europe, giving rich local experience in caring for patients with lower urinary tract dysfunction. Here, we analyzed outcomes of renal transplant recipients with lower urinary tract dysfunction versus control recipients. We identified 3 groups of kidney transplant recipients treated between 2001 and 2010; those in group 1 had end-stage renal disease due to lower urinary tract dysfunction with prior intervention (urologic surgery, long-term catheter, or intermittent self-catheterization), group 2 had end-stage renal disease secondary to lower urinary tract dysfunction without intervention, and group 3 had end-stage renal disease due to polycystic kidney disease (chosen as a relatively healthy control cohort without comorbid burden of other causes of end-stage renal disease such as diabetes). The primary outcome measured, graft survival, was death censored, with graft loss defined as requirement for renal replacement therapy or retransplant. Secondary outcomes included patient survival and graft function. In 150 study patients (16 patients in group 1, 64 in group 2, and 70 in group 3), 5-year death-censored graft survival was 93.75%, 90.6%, and 92.9%, respectively, with no significant differences in graft failure among groups (Cox proportional hazards model). Five-year patient survival was 100%, 100%, and 94.3%, respectively. Individuals with a history of lower urinary tract dysfunction had graft and patient survival rates similar to the control group. When appropriately treated, lower urinary tract dysfunction is not a barrier to successful renal transplant.
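Death-censored graft survival, as used for the primary outcome above, counts only graft loss (return to renal replacement therapy or retransplant) as an event and censors death with a functioning graft; a tiny illustration of that event coding on hypothetical records follows.

```python
# Event coding for death-censored graft survival: graft loss is the event;
# death with a functioning graft is censored at the time of death. Hypothetical records.
records = [
    {"years": 4.2, "graft_loss": True,  "died_with_function": False},
    {"years": 5.0, "graft_loss": False, "died_with_function": False},  # administratively censored
    {"years": 3.1, "graft_loss": False, "died_with_function": True},   # censored at death
]

durations = [r["years"] for r in records]
events = [int(r["graft_loss"]) for r in records]
print(list(zip(durations, events)))  # [(4.2, 1), (5.0, 0), (3.1, 0)]
```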
Outcomes of cryptococcosis in renal transplant recipients in a less-resourced health care system.
Ponzio, Vinicius; Camargo, Luis F A; Medina-Pestana, José O; Perfect, John R; Colombo, Arnaldo L
2018-04-20
Cryptococcosis is the second most common cause of invasive fungal infections in renal transplant recipients in many countries, and data on graft outcome after treatment for this infection are lacking in less-resourced health care settings. Data from 47 renal transplant recipients were retrospectively collected at a single institution during a period of 13 years. Graft dysfunction, graft loss and mortality rates were evaluated. Predictors of mortality and graft loss were estimated. A total of 38 (97.4%) patients treated with amphotericin B deoxycholate (AMBd) showed graft dysfunction after antifungal initiation and 8 (18.2%) had kidney graft loss. Graft loss within 30 days after cryptococcosis onset was significantly associated with disseminated infection, greater baseline creatinine levels and graft dysfunction concomitant to AMBd therapy and an additional nephrotoxic condition. The 30-day mortality rate was 19.2% and it was significantly associated with disseminated and pulmonary infections, somnolence at admission, high CSF opening pressure, positive CSF India ink, creatinine levels greater than 2.0 mg/dL at admission, graft dysfunction in patients treated with AMBd and an additional nephrotoxic condition, and graft loss within 30 days. Graft dysfunction was common in renal transplant recipients with cryptococcosis treated with AMBd. The rate of graft loss was high, most frequently in patients with concomitant nephrotoxic conditions. Therefore, the clinical focus should be on the use of less nephrotoxic lipid formulations of amphotericin B in this specific population requiring a polyene induction regimen for treatment of severe cryptococcosis in all health care systems caring for transplantation recipients. This article is protected by copyright. All rights reserved.
Early graft dysfunction and mortality rate in marginal donor liver transplantation.
Sarkut, Pınar; Gülcü, Barış; Işçimen, Remzi; Kiyici, Murat; Türker, Gürkan; Topal, Naile Bolca; Ozen, Yilmaz; Kaya, Ekrem
2014-01-01
To determine the effect of marginal donor livers on mortality and graft survival in liver transplantation (LT) recipients. Donors with any 1 of the following were considered marginal donors: age ≥65 years, sodium level ≥165 mmol/L, or cold ischemia time ≥12 h. Donors were also classified according to the donor risk index (DRI) as <1.7 or ≥1.7. The transplant recipients' Model for End-Stage Liver Disease (MELD) scores were considered low if <20 and high if ≥20. Early graft dysfunction (EGD) and mortality rate were evaluated. During the study period, 47 patients underwent cadaveric LT. The mean age of the donors and recipients was 45 years (range: 5-72 years) and 46 years (range: 4-66 years), respectively. In all, there were 15 marginal donors and 18 donors with a DRI ≥1.7. In total, 4 LT patients that received livers from marginal donors and 5 that received livers from donors with a DRI ≥1.7 had EGD. Among the recipients of marginal livers, 5 died, versus 4 of the recipients of standard livers. There was no significant difference in EGD or mortality rate between the patients that received livers from marginal donors or those with a DRI ≥1.7 and patients that received standard donor livers. Marginal and DRI ≥1.7 donors negatively affected LT outcomes, but not significantly.
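The donor classification used above reduces to a few explicit rules; the helper below encodes them exactly as stated (marginal if age ≥65 years, sodium ≥165 mmol/L, or cold ischemia time ≥12 h; DRI grouped at 1.7), with the DRI taken as an input rather than computed.

```python
# Classify a donor per the criteria stated in the abstract; the DRI value is
# supplied by the caller, its formula is not reproduced here.
def classify_donor(age_years: float, sodium_mmol_l: float,
                   cold_ischemia_h: float, dri: float) -> dict:
    marginal = age_years >= 65 or sodium_mmol_l >= 165 or cold_ischemia_h >= 12
    return {"marginal": marginal, "high_dri": dri >= 1.7}

print(classify_donor(age_years=68, sodium_mmol_l=150, cold_ischemia_h=9, dri=1.9))
# -> {'marginal': True, 'high_dri': True}
```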
Transient hyperglycemia during liver transplantation does not affect the early graft function.
Blasi, Annabel; Beltran, Joan; Martin, Nuria; Martinez-Pallí, Graciela; Lozano, Juan J; Balust, Jaume; Torrents, Abigail; Taura, Pilar
2015-01-01
Background and rationale for the study. Hyperglycemia after graft reperfusion is a consistent finding in liver transplantation (LT) that remains poorly studied. We aim to describe its appearance in LT recipients of different types of grafts and its relation to graft function. 436 LT recipients of donors after brain death (DBD), donors after cardiac death (DCD), and familial amyloidotic polyneuropathy (FAP) donors were reviewed. Serum glucose was measured at baseline, during the anhepatic phase, after graft reperfusion, and at the end of surgery. Early graft dysfunction (EAD) was assessed by Olthoff criteria. Caspase-3, IFN-γ, IL1β, and IL6 gene expression were measured in liver biopsies. The highest increase in glucose levels after reperfusion was observed in FAP LT recipients and the lowest in DCD LT recipients. Glucose level during the anhepatic phase was the only modifiable predictive variable of hyperglycemia after reperfusion. No relation was found between hyperglycemia after reperfusion and EAD. However, recipients with the highest glucose levels after reperfusion tended to achieve the best glucose control at the end of surgery, and those who were unable to control the glucose value after reperfusion showed EAD more frequently. The highest levels of caspase-3 were found in recipients with the lowest glucose values after reperfusion. In conclusion, glucose levels increased after graft reperfusion to a different extent according to the donor type. Contrary to general belief, transient hyperglycemia after reperfusion does not appear to impact negatively on liver graft function and could even be suggested as a marker of graft quality.
Okamura, Yusuke; Yagi, Shintaro; Sato, Toshiya; Hata, Koichiro; Ogawa, Eri; Yoshizawa, Atsushi; Kamo, Naoko; Yamashiki, Noriyo; Okajima, Hideaki; Kaido, Toshimi; Uemoto, Shinji
2018-03-01
Early allograft dysfunction (EAD), defined by serum total bilirubin (TB) of 10 mg/dL or greater or prothrombin time-international normalized ratio (PT-INR) of 1.6 or greater on postoperative day 7 (POD 7), or aminotransferase greater than 2000 IU/L within the first week, is associated with early graft loss after deceased-donor liver transplantation. We aimed to determine the prognostic impact of the EAD definition in living-donor liver transplantation (LDLT). We analyzed the validity of the EAD definition and its impact on early graft survival in 260 adult recipients who underwent primary LDLT. Eighty-four (32.3%) patients met the EAD criteria; 59 (22.7%) and 46 (17.7%) patients had TB of 10 mg/dL or greater and PT-INR of 1.6 or greater on POD 7, respectively, and 22 (8.5%) patients satisfied both criteria. Graft survival differed significantly when stratified according to TB of 10 mg/dL or greater and PT-INR of 1.6 or greater (P < 0.0001). PT-INR of 1.6 or greater resulted in higher graft mortality (risk ratio [RR], 3.87; P < 0.0001 at 90 days; RR, 2.97; P < 0.0001 at 180 days), as did TB of 10 mg/dL or greater (RR, 1.89; P = 0.027 at 90 days; RR, 1.91; P = 0.006 at 180 days). Coexistence of TB of 10 mg/dL or greater and PT-INR of 1.6 or greater was strongly associated with early graft loss (59.1%; RR, 6.97 at 90 days; 68.2%; RR, 5.75 at 180 days). In Cox regression analysis, PT-INR of 1.6 or greater and TB of 10 mg/dL or greater on POD 7 were significant risk factors for early graft loss (hazard ratio, 4.10; 95% confidence interval, 2.35-7.18; P < 0.0001, and hazard ratio, 2.43; 95% confidence interval, 1.39-4.24; P = 0.0018, respectively). TB of 10 mg/dL or greater and/or PT-INR of 1.6 or greater on POD 7 predicted early graft loss after LDLT, and their coexistence worsened patient outcomes.
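The EAD definition quoted above is a set of laboratory cutoffs and can be written directly as a predicate; the sketch below encodes it as stated (TB ≥10 mg/dL or PT-INR ≥1.6 on POD 7, or any aminotransferase >2000 IU/L within the first week) and is not the authors' code.

```python
# Early allograft dysfunction (EAD) as defined in the abstract.
def is_ead(tb_pod7_mg_dl: float, inr_pod7: float,
           peak_aminotransferase_week1_iu_l: float) -> bool:
    return (tb_pod7_mg_dl >= 10.0
            or inr_pod7 >= 1.6
            or peak_aminotransferase_week1_iu_l > 2000.0)

# Hypothetical recipient: bilirubin of 12 mg/dL on POD 7 alone meets the definition.
print(is_ead(tb_pod7_mg_dl=12.0, inr_pod7=1.3, peak_aminotransferase_week1_iu_l=900.0))  # True
```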
Wu, Ming-Jui; Chen, Wei-Ling; Kan, Chung-Dann; Yu, Fan-Ming; Wang, Su-Chin; Lin, Hsiu-Hui; Lin, Chia-Hung
2015-12-01
In physical examinations, hemodialysis access stenosis leading to dysfunction occurs at the venous anastomosis site or the outflow vein. Information from the inflow site, such as increases in blood pressure, pressure drop, and flow resistance, allows dysfunction screening from the stage of early clots and thrombosis to the progression of outflow stenosis. Therefore, this study proposes a dysfunction screening model for experimental arteriovenous grafts (AVGs) using the fractional-order extractor (FOE) and the color relation analysis (CRA). A Sprott system was designed using an FOE to quantify the differences in transverse vibration pressures between the inflow and outflow sites of an AVG. Experimental analysis revealed that the degree of stenosis (DOS) correlated with an increase in fractional-order dynamic errors (FODEs). Exponential regression was used to fit a non-linear curve quantifying the relationship between the FODEs and DOS (R² = 0.8064). Specific DOS ranges (<50%, 50-80%, and >80%) were used to grade the degree of stenosis. A CRA-based screening method was derived from the hue angle-saturation-value color model, which describes perceptual color relationships for the DOS. It provides a flexible inference scheme with color visualization to represent the different degrees of stenosis, with an average accuracy >90%, superior to traditional methods. This in vitro experimental study demonstrated that the proposed model can be used for dysfunction screening in stenotic AVGs.
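The exponential regression relating FODE to DOS can be sketched with an ordinary non-linear least-squares fit of y = a·exp(b·x); the data points, parameter values, and function names below are illustrative assumptions (the study reports only the resulting R² of 0.8064), and the sketch assumes numpy and scipy.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic (DOS %, FODE) pairs standing in for the experimental measurements.
dos = np.array([10, 20, 30, 40, 50, 60, 70, 80, 90], dtype=float)
noise = 1 + 0.1 * np.random.default_rng(1).normal(size=dos.size)
fode = 0.05 * np.exp(0.035 * dos) * noise

def model(x, a, b):
    return a * np.exp(b * x)

params, _ = curve_fit(model, dos, fode, p0=(0.05, 0.03))
pred = model(dos, *params)
r2 = 1 - np.sum((fode - pred) ** 2) / np.sum((fode - fode.mean()) ** 2)
print(f"a = {params[0]:.3f}, b = {params[1]:.3f}, R^2 = {r2:.3f}")
```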
Large-for-size liver transplant: a single-center experience.
Akdur, Aydincan; Kirnap, Mahir; Ozcay, Figen; Sezgin, Atilla; Ayvazoglu Soy, Hatice Ebru; Karakayali Yarbug, Feza; Yildirim, Sedat; Moray, Gokhan; Arslan, Gulnaz; Haberal, Mehmet
2015-04-01
The ideal ratio between liver transplant graft mass and recipient body weight is unknown, but the graft probably must weigh 0.8% to 2.0% of recipient weight. When this ratio is >4%, there may be problems due to large-for-size transplant, especially in recipients <10 kg. This condition is caused by a discrepancy between the small abdominal cavity and the large graft and is characterized by decreased blood supply to the liver graft and graft dysfunction. We evaluated our experience with large-for-size grafts. We retrospectively evaluated 377 orthotopic liver transplants that were performed from 2001-2014 in our center. We included 188 pediatric transplants in our study. There were 58 patients <10 kg who had living-donor liver transplant with a graft-to-bodyweight ratio >4%. In 2 patients, the abdomen was closed with a Bogota bag. In 5 patients, reoperation was performed due to vascular problems and abdominal hypertension, and the abdomen was closed with a Bogota bag. All Bogota bags were closed within 2 weeks. After closing the fascia, 10 patients had vascular problems that were diagnosed in the operating room by Doppler ultrasonography, and only the skin was closed without fascia closure. No graft loss occurred due to large-for-size transplant. There were 8 patients who died early after transplant (sepsis, 6 patients; brain death, 2 patients). There was no major donor morbidity or donor mortality. Large-for-size graft may cause abdominal compartment syndrome due to the small size of the recipient abdominal cavity, size discrepancies in vascular caliber, insufficient portal circulation, and disturbance of tissue oxygenation. Abdominal closure with a Bogota bag in these patients is safe and effective to avoid abdominal compartment syndrome. Early diagnosis by ultrasonography in the operating room after fascia closure and repeated ultrasonography at the clinic may help avoid graft loss.
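The size matching discussed above rests on the graft-to-recipient body weight ratio; the helper below computes it and applies the thresholds given in the abstract (roughly 0.8% to 2.0% ideal, above 4% large-for-size), with hypothetical example values.

```python
# Graft-to-recipient weight ratio in percent: graft weight in grams divided by
# recipient body weight converted to grams. Example values are hypothetical.
def grwr_percent(graft_weight_g: float, recipient_weight_kg: float) -> float:
    return 100.0 * graft_weight_g / (recipient_weight_kg * 1000.0)

ratio = grwr_percent(graft_weight_g=450, recipient_weight_kg=9)  # small pediatric recipient
if ratio > 4.0:
    label = "large-for-size"
elif 0.8 <= ratio <= 2.0:
    label = "within the ideal range"
else:
    label = "outside the ideal range"
print(f"GRWR = {ratio:.1f}% ({label})")  # 450 g / 9 kg -> 5.0%, large-for-size
```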
Li, Cheukfai; Zhao, Qiang; Zhang, Wei; Chen, Maogen; Ju, Weiqiang; Wu, Linwei; Han, Ming; Ma, Yi; Zhu, Xiaofeng; Wang, Dongping; Guo, Zhiyong; He, Xiaoshun
2017-01-01
Background Poor transplant outcomes have been observed in donation after brain death followed by circulatory death (DBCD), since the donor organs suffer both the cytokine storm of brain death and warm ischemia injury. MicroRNAs (miRNAs) have emerged as promising disease biomarkers, so we sought to establish a miRNA signature of porcine DBCD and verify the findings in human liver transplantation. Material/Methods MiRNA expression was determined by miRNA sequencing in 3 types of the porcine model of organ donation: a donation after brain death (DBD) group, a donation after circulatory death (DCD) group, and a DBCD group. Bioinformatics analysis was performed to reveal the potential regulatory behavior of the target miRNA. Post-reperfusion human liver graft biopsy samples, examined by fluorescence in situ hybridization, were used to verify the expression of the target miRNA. Results We compared the miRNA expression profiles of the 3 donation types. Porcine liver graft miR-146b was significantly increased in the DBCD group versus the DBD and DCD groups and was selected for further analysis. The donor liver expression of human miR-146b-5p, which is homologous to porcine miR-146b, was further examined in 42 cases of human liver transplantation. High expression of miR-146b-5p successfully predicted post-transplant early allograft dysfunction (EAD) with an area under the ROC curve (AUC) of 0.759 (P=0.004). Conclusions Our results revealed the miRNA signature of DBCD liver grafts for the first time. MiR-146b-5p may have important clinical implications for monitoring liver graft function and predicting transplant outcomes. PMID:29227984
Donor age and early graft failure after lung transplantation: a cohort study.
Baldwin, M R; Peterson, E R; Easthausen, I; Quintanilla, I; Colago, E; Sonett, J R; D'Ovidio, F; Costa, J; Diamond, J M; Christie, J D; Arcasoy, S M; Lederer, D J
2013-10-01
Lungs from older adult organ donors are often unused because of concerns for increased mortality. We examined associations between donor age and transplant outcomes among 8860 adult lung transplant recipients using Organ Procurement and Transplantation Network and Lung Transplant Outcomes Group data. We used stratified Cox proportional hazard models and generalized linear mixed models to examine associations between donor age and both 1-year graft failure and primary graft dysfunction (PGD). The rate of 1-year graft failure was similar among recipients of lungs from donors age 18-64 years, but severely ill recipients (Lung Allocation Score [LAS] >47.7 or use of mechanical ventilation) of lungs from donors age 56-64 years had increased rates of 1-year graft failure (p-values for interaction = 0.04 and 0.02, respectively). Recipients of lungs from donors <18 and ≥65 years had increased rates of 1-year graft failure (adjusted hazard ratio [HR] 1.23, 95% CI 1.01-1.50 and adjusted HR 2.15, 95% CI 1.47-3.15, respectively). Donor age was not associated with the risk of PGD. In summary, the use of lungs from donors age 56 to 64 years may be safe for adult candidates without a high LAS, and the use of lungs from pediatric donors is associated with a small increase in early graft failure. © Copyright 2013 The American Society of Transplantation and the American Society of Transplant Surgeons.
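The interaction reported above (older donor age mattering mainly in severely ill recipients) corresponds to a donor-age-by-severity interaction term in a Cox model. The sketch below shows how such a term could be fit with the lifelines library on a hypothetical data frame; the column names, the 0/1 coding, and the simulated data are illustrative assumptions, not the registry analysis itself.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    # 1 if the donor was aged 56-64 years, 0 otherwise (hypothetical coding)
    "older_donor": rng.integers(0, 2, n),
    # 1 if the recipient was severely ill (e.g., LAS > 47.7 or ventilated), 0 otherwise
    "high_las": rng.integers(0, 2, n),
    "time_to_failure_days": np.maximum(rng.exponential(900, n).round(), 1.0),
    "graft_failure": rng.integers(0, 2, n),  # 1 = graft failure observed, 0 = censored
})

cph = CoxPHFitter()
# The formula interface adds both main effects plus the older_donor:high_las interaction
cph.fit(df, duration_col="time_to_failure_days", event_col="graft_failure",
        formula="older_donor * high_las")
cph.print_summary()  # the interaction row tests whether donor age matters more in sicker recipients
```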
Lee, David D; Singh, Amandeep; Burns, Justin M; Perry, Dana K; Nguyen, Justin H; Taner, C Burcin
2014-12-01
Donation after cardiac death (DCD) liver allografts have been associated with increased morbidity from primary nonfunction, biliary complications, early allograft failure, cost, and mortality. Early allograft dysfunction (EAD) after liver transplantation has been found to be associated with inferior patient and graft survival. In a cohort of 205 consecutive liver-only transplant patients with allografts from DCD donors at a single center, the incidence of EAD was found to be 39.5%. The patient survival rates for those with no EAD and those with EAD at 1, 3, and 5 years were 97% and 89%, 79% and 79%, and 61% and 54%, respectively (P = 0.009). Allograft survival rates for recipients with no EAD and those with EAD at 1, 3, and 5 years were 90% and 75%, 72% and 64%, and 53% and 43%, respectively (P = 0.003). A multivariate analysis demonstrated a significant association between the development of EAD and the cold ischemia time [odds ratio (OR) = 1.26, 95% confidence interval (CI) = 1.01-1.56, P = 0.037] and hepatocellular cancer as a secondary diagnosis in recipients (OR = 2.26, 95% CI = 1.11-4.58, P = 0.025). There was no correlation between EAD and the development of ischemic cholangiopathy. In conclusion, EAD results in inferior patient and graft survival in recipients of DCD liver allografts. Understanding the events that cause EAD and developing preventive or early therapeutic approaches should be the focus of future investigations. © 2014 American Association for the Study of Liver Diseases.
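The survival comparison described above (recipients with versus without EAD) is the classic Kaplan-Meier plus log-rank setup. Below is a minimal sketch using the lifelines library on simulated follow-up data; the EAD prevalence is borrowed from the abstract, but the follow-up times, event indicators, and variable names are placeholders, not the study cohort.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(2)
n = 205
# Hypothetical cohort: ~39.5% with early allograft dysfunction (EAD), as in the abstract
ead = rng.random(n) < 0.395
follow_up_years = np.where(ead, rng.exponential(6, n), rng.exponential(9, n)).clip(0.1, 10)
died = rng.random(n) < 0.5  # True = death observed, False = censored

kmf_no_ead, kmf_ead = KaplanMeierFitter(), KaplanMeierFitter()
kmf_no_ead.fit(follow_up_years[~ead], died[~ead], label="no EAD")
kmf_ead.fit(follow_up_years[ead], died[ead], label="EAD")

# Survival probability at 1, 3, and 5 years for each group
print(kmf_no_ead.survival_function_at_times([1, 3, 5]))
print(kmf_ead.survival_function_at_times([1, 3, 5]))

# Log-rank test for the difference between the two curves
result = logrank_test(follow_up_years[~ead], follow_up_years[ead],
                      event_observed_A=died[~ead], event_observed_B=died[ead])
print(f"log-rank p-value = {result.p_value:.3f}")
```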
Overextended Criteria Donors: Experience of an Italian Transplantation Center.
Nure, E; Lirosi, M C; Frongillo, F; Bianco, G; Silvestrini, N; Fiorillo, C; Sganga, G; Agnes, S
2015-09-01
The increasing gap between the number of patients who could benefit from liver transplantation and the number of available donors has fueled efforts to maximize the donor pool using marginal grafts that were usually discarded. This study included data of all patients who received deceased donor liver grafts between January 2004 and January 2013 (n = 218), with the use of a prospectively collected database. Patients with acute liver failure, retransplantation, pediatric transplantation, and split liver transplantation were excluded. Donors were classified as standard donor (SD), extended criteria donor (ECD), and overextended criteria donor (OECD). The primary endpoints of the study were early allograft primary dysfunction (PDF), primary nonfunction (PNF), and patient survival (PS), whereas the incidence of major postoperative complications was the secondary endpoint. In our series, OECD grafts had outcomes similar to those of ideal grafts after liver transplantation in terms of survival and incidence of complications. Copyright © 2015 Elsevier Inc. All rights reserved.
Donor age is a predictor of early low output after heart transplantation.
Fujino, Takeo; Kinugawa, Koichiro; Nitta, Daisuke; Imamura, Teruhiko; Maki, Hisataka; Amiya, Eisuke; Hatano, Masaru; Kimura, Mitsutoshi; Kinoshita, Osamu; Nawata, Kan; Komuro, Issei; Ono, Minoru
2016-05-01
Using hearts from marginal donors could be related to an increased risk of primary graft dysfunction and poor long-term survival. However, factors associated with delayed myocardial recovery after heart transplantation (HTx) remain unknown. We sought to clarify risk factors that predict early low output after HTx, and investigated whether early low output leads to mid-term graft dysfunction. We retrospectively analyzed patients who had undergone HTx at The University of Tokyo Hospital. We defined early low output patients as those whose cardiac index (CI) was <2.2 L/min/m² despite the use of intravenous inotropes at 1 week after HTx. We included 45 consecutive HTx recipients and classified 11 patients into the early low output group and the others into the early preserved output group. Univariable logistic analysis found that donor age was the only significant factor that predicted early low output (odds ratio 1.107, 95% confidence interval 1.034-1.210, p=0.002). The CI of early low output patients gradually increased and caught up with that of early preserved output patients at 2 weeks after HTx (2.4±0.6 L/min/m² in the early low output group vs 2.5±0.5 L/min/m² in the early preserved output group, p=0.684). Plasma B-type natriuretic peptide concentration was higher in early low output patients at 1 week (1118.5±1250.2 pg/ml vs 526.4±399.5 pg/ml; p=0.033), 2 weeks (703.6±518.4 pg/ml vs 464.6±509.0 pg/ml; p=0.033), and 4 weeks (387.7±231.9 pg/ml vs 249.4±209.5 pg/ml; p=0.010) after HTx, and came down to the level of early preserved output patients at 12 weeks after HTx. Donor age was a predictor of early low output after HTx, and careful management is needed after HTx from older donors. However, hemodynamic parameters of early low output patients gradually caught up with those of early preserved output patients. Copyright © 2015 Japanese College of Cardiology. Published by Elsevier Ltd. All rights reserved.
Dare, Anna J; Logan, Angela; Prime, Tracy A; Rogatti, Sebastian; Goddard, Martin; Bolton, Eleanor M; Bradley, J Andrew; Pettigrew, Gavin J; Murphy, Michael P; Saeb-Parsy, Kourosh
2015-11-01
Free radical production and mitochondrial dysfunction during cardiac graft reperfusion is a major factor in post-transplant ischemia-reperfusion (IR) injury, an important underlying cause of primary graft dysfunction. We therefore assessed the efficacy of the mitochondria-targeted anti-oxidant MitoQ in reducing IR injury in a murine heterotopic cardiac transplant model. Hearts from C57BL/6 donor mice were flushed with storage solution alone, solution containing the anti-oxidant MitoQ, or solution containing the non-anti-oxidant decyltriphenylphosphonium control and exposed to short (30 minutes) or prolonged (4 hour) cold preservation before transplantation. Grafts were transplanted into C57BL/6 recipients and analyzed for mitochondrial reactive oxygen species production, oxidative damage, serum troponin, beating score, and inflammatory markers 120 minutes or 24 hours post-transplant. MitoQ was taken up by the heart during cold storage. Prolonged cold preservation of donor hearts before IR increased IR injury (troponin I, beating score) and mitochondrial reactive oxygen species, mitochondrial DNA damage, protein carbonyls, and pro-inflammatory cytokine release 24 hours after transplant. Administration of MitoQ to the donor heart in the storage solution protected against this IR injury by blocking graft oxidative damage and dampening the early pro-inflammatory response in the recipient. IR after heart transplantation results in mitochondrial oxidative damage that is potentiated by cold ischemia. Supplementing donor graft perfusion with the anti-oxidant MitoQ before transplantation should be studied further to reduce IR-related free radical production, the innate immune response to IR injury, and subsequent donor cardiac injury. Copyright © 2015 International Society for Heart and Lung Transplantation. Published by Elsevier Inc. All rights reserved.
Dare, Anna J.; Logan, Angela; Prime, Tracy A.; Rogatti, Sebastian; Goddard, Martin; Bolton, Eleanor M.; Bradley, J. Andrew; Pettigrew, Gavin J.; Murphy, Michael P.; Saeb-Parsy, Kourosh
2015-01-01
Background Free radical production and mitochondrial dysfunction during cardiac graft reperfusion is a major factor in post-transplant ischemia-reperfusion (IR) injury, an important underlying cause of primary graft dysfunction. We therefore assessed the efficacy of the mitochondria-targeted anti-oxidant MitoQ in reducing IR injury in a murine heterotopic cardiac transplant model. Methods Hearts from C57BL/6 donor mice were flushed with storage solution alone, solution containing the anti-oxidant MitoQ, or solution containing the non–anti-oxidant decyltriphenylphosphonium control and exposed to short (30 minutes) or prolonged (4 hour) cold preservation before transplantation. Grafts were transplanted into C57BL/6 recipients and analyzed for mitochondrial reactive oxygen species production, oxidative damage, serum troponin, beating score, and inflammatory markers 120 minutes or 24 hours post-transplant. Results MitoQ was taken up by the heart during cold storage. Prolonged cold preservation of donor hearts before IR increased IR injury (troponin I, beating score) and mitochondrial reactive oxygen species, mitochondrial DNA damage, protein carbonyls, and pro-inflammatory cytokine release 24 hours after transplant. Administration of MitoQ to the donor heart in the storage solution protected against this IR injury by blocking graft oxidative damage and dampening the early pro-inflammatory response in the recipient. Conclusions IR after heart transplantation results in mitochondrial oxidative damage that is potentiated by cold ischemia. Supplementing donor graft perfusion with the anti-oxidant MitoQ before transplantation should be studied further to reduce IR-related free radical production, the innate immune response to IR injury, and subsequent donor cardiac injury. PMID:26140808
Use of octogenarian donors for liver transplantation: a survival analysis.
Ghinolfi, D; Marti, J; De Simone, P; Lai, Q; Pezzati, D; Coletti, L; Tartaglia, D; Catalano, G; Tincani, G; Carrai, P; Campani, D; Miccoli, M; Biancofiore, G; Filipponi, F
2014-09-01
Use of very old donors in liver transplantation (LT) is controversial because advanced donor age is associated with a higher risk for graft dysfunction and worse long-term results, especially for hepatitis C virus (HCV)-positive recipients. This was a retrospective, single-center review of primary, ABO-compatible LT performed between 2001 and 2010. Recipients were stratified into four groups based on donor age (<60 years; 60-69 years; 70-79 years and ≥80 years) and their outcomes were compared. A total of 842 patients were included: 348 (41.3%) with donors <60 years; 176 (20.9%) with donors 60-69 years; 233 (27.7%) with donors 70-79 years and 85 (10.1%) with donors ≥80 years. There was no difference across groups in terms of early (≤30 days) graft loss, and graft survival at 1 and 5 years was 90.5% and 78.6% for grafts <60 years; 88.6% and 81.3% for grafts 60-69 years; 87.6% and 75.1% for grafts 70-79 years and 84.7% and 77.1% for grafts ≥80 years (p = 0.065). In the group ≥80 years, the 5-year graft survival was lower for HCV-positive versus HCV-negative recipients (62.4% vs. 85.6%, p = 0.034). Based on our experience, grafts from donors ≥80 years may provide favorable results but require appropriate selection and allocation policies. © Copyright 2014 The American Society of Transplantation and the American Society of Transplant Surgeons.
Safety and Efficacy Endpoints for Mesenchymal Stromal Cell Therapy in Renal Transplant Recipients.
Bank, J R; Rabelink, T J; de Fijter, J W; Reinders, M E J
2015-01-01
Despite excellent short-term graft survival after renal transplantation, the long-term graft outcome remains compromised. It has become evident that a combination of sustained alloreactivity and calcineurin inhibitor (CNI)-related nephrotoxicity results in fibrosis and consequently dysfunction of the graft. New immunosuppressive regimens that can minimize or eliminate side effects, while maintaining efficacy, are required to improve long-term graft survival. In this perspective, mesenchymal stromal cells (MSCs) are an interesting candidate, since MSCs have immunosuppressive and regenerative properties. The first clinical trials with MSCs in renal transplantation showed safety and feasibility and displayed promising results. Recently, the first phase II studies have been started. One of the most difficult and challenging aspects of these early phase trials is to define accurate endpoints that can measure the safety and efficacy of MSC treatment. Since both graft losses and acute rejection rates have declined, alternative surrogate markers such as renal function, histological findings, and immunological markers are used to measure efficacy and to provide mechanistic insight. In this review, we will discuss the current status of MSCs in renal transplantation with a focus on the endpoints used in the different experimental and clinical studies.
Kim, Joo Dong; Choi, Dong Lak; Han, Young Seok
2014-05-01
Middle hepatic vein (MHV) reconstruction is often essential to avoid hepatic congestion and serious graft dysfunction in living donor liver transplantation (LDLT). The aim of this report was to introduce the evolution of our MHV reconstruction technique and the excellent outcomes of simplified one-orifice venoplasty. We compared clinical outcomes of the two reconstruction techniques through a retrospective review of 95 recipients who underwent LDLT using right lobe grafts at our institution from January 2008 to April 2012; group 1 underwent separate outflow reconstruction and group 2 underwent the new one-orifice technique. The early patency rates of the MHV in group 2 were higher than those in group 1: 98.4% vs. 88.2% on postoperative day 7 (p = 0.054) and 96.7% vs. 82.4% on postoperative day 14 (p = 0.023). Right hepatic vein (RHV) stenosis developed in three cases in group 1, whereas no RHV stenosis developed after we adopted the one-orifice technique (p = 0.043). The levels of aspartate aminotransferase (AST) and alanine aminotransferase (ALT) in group 2 were significantly lower than those in group 1 during the early post-transplant period. In conclusion, our simplified one-orifice venoplasty technique could secure venous outflow and improve graft function during right lobe LDLT. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Off-pump grafting does not reduce postoperative pulmonary dysfunction.
Izzat, Mohammad Bashar; Almohammad, Farouk; Raslan, Ahmad Fahed
2017-02-01
Objectives Pulmonary dysfunction is a recognized postoperative complication that may be linked to use of cardiopulmonary bypass. The off-pump technique of coronary artery bypass aims to avoid some of the complications that may be related to cardiopulmonary bypass. In this study, we compared the influence of on-pump or off-pump coronary artery bypass on pulmonary gas exchange following routine surgery. Methods Fifty patients (mean age 60.4 ± 8.4 years) with no preexisting lung disease and good left ventricular function undergoing primary coronary artery bypass grafting were prospectively randomized to undergo surgery with or without cardiopulmonary bypass. Alveolar/arterial oxygen pressure gradients were calculated prior to induction of anesthesia while the patients were breathing room air, and repeated postoperatively during mechanical ventilation and after extubation while inspiring 3 specific fractions of oxygen. Results Baseline preoperative arterial blood gases and alveolar/arterial oxygen pressure gradients were similar in both groups. At both postoperative stages, the partial pressure of arterial oxygen and alveolar/arterial oxygen pressure gradients increased with increasing fraction of inspired oxygen, but there were no statistically significant differences between patients who underwent surgery with or without cardiopulmonary bypass, either during ventilation or after extubation. Conclusions Off-pump surgery is not associated with superior pulmonary gas exchange in the early postoperative period following routine coronary artery bypass grafting in patients with good left ventricular function and no preexisting lung disease.
Orban, Jean-Christophe; Fontaine, Eric; Cassuto, Elisabeth; Baumstarck, Karine; Leone, Marc; Constantin, Jean-Michel; Ichai, Carole
2018-04-17
Renal transplantation represents the treatment of choice for end-stage kidney disease. Delayed graft function (DGF) remains the most frequent complication after this procedure, affecting more than 30% of recipients. Its prevention is essential because it worsens the early and long-term prognosis of transplantation. Numerous pharmacological interventions aiming to prevent ischemia-reperfusion injuries have failed to reduce the rate of DGF. We hypothesize that early donor preconditioning with cyclosporine would be associated with decreased DGF. The Cis-A-rein study is an investigator-initiated, prospective, multicenter, double-blind, randomized, controlled study performed to assess the effects of donor preconditioning with cyclosporine A on kidney graft function in transplanted patients. After randomization, a brain-dead donor will receive 2.5 mg/kg of cyclosporine A or the same volume of 5% glucose solution. The primary objective is to compare the rate of DGF, defined as the need for at least one dialysis session within the 7 days following transplantation, between both groups. The secondary objectives include the rate of slow graft function, mild and severe DGF, urine output and serum creatinine during the first week after transplantation, rate of primary graft dysfunction, and renal function and mortality at 1 year. The sample size (n = 648) was determined to obtain 80% power to detect a 10% difference in the rate of DGF at day 7 between the two groups (30% of the patients in the placebo group and 20% of the patients in the intervention group). Delayed graft function is a major issue after renal transplantation, worsening long-term prognosis. Cyclosporine A pretreatment in deceased donors could improve the outcome of patients after renal transplantation. ClinicalTrials.gov, ID: NCT02907554. Registered on 20 September 2016.
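The sample-size statement above (80% power to detect 30% vs. 20% DGF) can be checked with a standard two-proportion power calculation. The sketch below uses statsmodels; the two-sided 5% alpha and 1:1 allocation are my assumptions, and any gap between the computed per-group number and the published total of 648 would reflect design choices (for example a dropout allowance) not stated in the abstract.

```python
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

p_placebo, p_treated = 0.30, 0.20                      # DGF rates assumed in the trial design
effect = proportion_effectsize(p_placebo, p_treated)   # Cohen's h for two proportions

analysis = NormalIndPower()

# Per-group sample size needed for 80% power at a two-sided alpha of 0.05 (assumed)
n_per_group = analysis.solve_power(effect_size=effect, alpha=0.05, power=0.80,
                                   ratio=1.0, alternative="two-sided")
print(f"required per group: {n_per_group:.0f}")        # roughly 290-300 per group

# Power actually achieved if 648 patients (324 per group) are randomized
power_at_648 = analysis.power(effect_size=effect, nobs1=324, alpha=0.05,
                              ratio=1.0, alternative="two-sided")
print(f"power with 324 per group: {power_at_648:.2f}")
```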
Acute allograft failure in thoracic organ transplantation.
Jahania, M S; Mullett, T W; Sanchez, J A; Narayan, P; Lasley, R D; Mentzer, R M
2000-01-01
Thoracic organ transplantation is an effective form of treatment for end-stage heart and lung disease. Despite major advances in the field, transplant patients remain at risk for acute allograft dysfunction, a major cause of early and late mortality. The most common causes of allograft failure include primary graft failure secondary to inadequate heart and lung preservation during cold storage, cellular rejection, and various donor-recipient-related factors. During cold storage and early reperfusion, heart and lung allografts are vulnerable to intracellular calcium overload, acidosis, cell swelling, injury mediated by reactive oxygen species, and the inflammatory response. Brain death itself is associated with a reduction in myocardial contractility, and recipient-related factors such as preexisting pulmonary hypertension can lead to acute right heart failure and the pulmonary reimplantation response. The development of new methods to prevent or treat these various causes of acute graft failure could lead to a marked improvement in short- and long-term survival of patients undergoing thoracic organ transplantation.
Early outcomes of liver transplants in patients receiving organs from hypernatremic donors.
Khosravi, Mohammad Bagher; Firoozifar, Mohammad; Ghaffaripour, Sina; Sahmeddini, Mohammad Ali; Eghbal, Mohammad Hossien
2013-12-01
Uncorrected hypernatremia in organ donors has been associated with poor graft or patient survival in liver transplantation. However, recent studies have found no association between donor serum sodium and transplant outcome. This study sought to show the negative effect donor hypernatremia has on initial liver allograft function. This is the first study to investigate international normalized ratio and renal factors in patients receiving livers from normonatremic and hypernatremic donors. This study was conducted at the Shiraz Transplant Research Center in Shiraz, Iran, between May 2009 and July 2011. Four hundred seven consecutive adult orthotopic liver transplants were performed at the University of Shiraz Medical Center. There were 93 donors with terminal serum sodium of 155 mEq/L or greater (hypernatremia, group 1) and 314 with terminal serum sodium less than 155 mEq/L (group 2). Posttransplant data after 5 days showed that aspartate aminotransferase, alanine aminotransferase, international normalized ratio, and kidney function did not differ between the groups. Hypernatremia is the most important complication after brain death. Previous studies have suggested that donor hypernatremia results in a greater incidence of early postoperative graft dysfunction in liver transplants and is considered an extended donor criterion. However, in recent years this hypothesis has been questioned. Our study shows no difference in initial liver and kidney function between recipients of livers from normonatremic and hypernatremic donors. This is the first study to investigate international normalized ratio as a fundamental factor in defining early allograft dysfunction, together with renal factors, in patients receiving livers from normonatremic and hypernatremic donors.
Takada, Hideaki; Kobayashi, Takashi; Ogawa, Kohei; Miyata, Hitomi; Sawada, Atsuro; Akamatsu, Shusuke; Negoro, Hiromitsu; Saito, Ryoichi; Terada, Naoki; Yamasaki, Toshinari; Inoue, Takahiro; Teramoto, Yuki; Shibuya, Shinsuke; Haga, Hironori; Kaido, Toshimi; Uemoto, Shinji; Ogawa, Osamu
2017-08-01
We report a case of lethal hepatorenal insufficiency in a 52-year-old man who underwent successful simultaneous hepatorenal transplantation from a deceased donor. The patient had undergone living-donor liver transplantation for hepatitis C and liver cirrhosis 11 years earlier and subsequently developed graft liver dysfunction due to recurrent viral hepatitis and cirrhosis. At that time, he also developed end-stage renal dysfunction due to calcineurin inhibitor nephropathy and hepatorenal syndrome. Although he needed three open hemostasis procedures and extensive blood transfusion, he was withdrawn from continuous hemodiafiltration on the 55th postoperative day and discharged from the hospital on the 272nd postoperative day. Simultaneous hepatorenal transplantation has been reported to be associated with more favorable graft function and lower rejection rates, but higher perioperative complication rates, compared with liver transplantation alone in patients on hemodialysis. In particular, close attention should be paid to hemostasis because patients have a hemorrhagic tendency until graft liver function recovers.
Bcl-2 protects tubular epithelial cells from ischemia reperfusion injury by inhibiting apoptosis.
Suzuki, Chigure; Isaka, Yoshitaka; Shimizu, Shigeomi; Tsujimoto, Yoshihide; Takabatake, Yoshitsugu; Ito, Takahito; Takahara, Shiro; Imai, Enyu
2008-01-01
Ischemia followed by reperfusion leads to severe organ injury and dysfunction. Inflammation is considered to be the most important cause of graft dysfunction in kidney transplants subjected to ischemia. The mechanism that triggers inflammation and renal injury after ischemia remains to be elucidated; however, cellular stress may induce apoptosis during the first hours and days after transplantation, which might play a crucial role in early graft dysfunction. Bcl-2 is known to inhibit apoptosis induced by the etiological factors promoting ischemia and reperfusion injury. Accordingly, we hypothesized that augmentation of the antiapoptotic factor Bcl-2 may protect tubular epithelial cells by inhibiting apoptosis, thereby ameliorating the subsequent tubulointerstitial injury. We examined the effects of Bcl-2 overexpression on ischemia-reperfusion (I/R) injury using Bcl-2 transgenic mice (Bcl-2 TG) and their wild-type littermates (WT). To investigate the effects of I/R injury, the left renal artery and vein were clamped for 45 min, followed by reperfusion for 0-96 h. Bcl-2 TG exhibited decreased active caspase protein in the tubular cells, which led to a reduction in TUNEL-positive apoptotic cells. Consequently, interstitial fibrosis and phenotypic changes were ameliorated in Bcl-2 TG. In conclusion, Bcl-2 augmentation protected renal tubular epithelial cells from I/R injury and subsequent interstitial injury by inhibiting tubular apoptosis.
Lee Henry, Christopher; Ko, Jong Mi; Henry, Albert Carl; Matter, Gregory John
2011-01-01
Aortic valve replacement following an earlier coronary artery bypass grafting (CABG) procedure is fairly common. When this situation occurs, the type of valve dysfunction is usually stenosis (with or without regurgitation), and whether it was missed at the time of the earlier CABG or developed subsequently is usually unclear. We attempted to determine the survival in patients who had had aortic valve replacement after 2 previous CABG procedures. We describe 12 patients who had aortic valve replacement for aortic stenosis; rather than one previous CABG operation, all had had 2 previous CABG procedures. Only one patient died in the early postoperative period after aortic valve replacement, and the remaining 11 were improved substantially: all have lived at least 11 months, and one is still alive at over 101 months after aortic valve replacement. Aortic valve replacement remains beneficial for most patients even after 2 previous CABG procedures. PMID:21307968
Quintana, L F; Coll, E; Monteagudo, I; Collado, S; López-Pedret, J; Cases, A
2005-01-01
Vascular access-related complications are a frequent cause of morbidity in haemodialysis patients and generate high costs. We present the case of an adult patient with end-stage renal disease and recurrent vascular access thrombosis associated with the prothrombin mutation G20210A and renal graft intolerance. The clinical expression of this heterozygous gene mutation may have been favoured by an inflammatory state, which is frequent in dialysis patients. In this patient, the inflammatory response associated with renal graft intolerance would have favoured the development of recurrent vascular access thrombosis in an adult heterozygous for the prothrombin mutation G20210A. In cases of early dysfunction of haemodialysis vascular access, and after ruling out technical problems, it is advisable to screen for thrombophilia.
[Chronic rejection: Differences and similarities in various solid organ transplants].
Suhling, H; Gottlieb, J; Bara, C; Taubert, R; Jäckel, E; Schiffer, M; Bräsen, J H
2016-01-01
In this paper, chronic rejections after transplantation of the lungs, heart, liver, and kidney are described. Chronic allograft dysfunction (CAD) plays an important role in all of these transplantations and has a significant influence on patient survival. The pathophysiological reasons for CAD vary greatly among the various organs. Chronic lung allograft dysfunction (CLAD) is the most important determinant of survival and quality of life after lung transplantation. Diagnosis is based on lung function, especially decline in forced expiratory volume in 1 s (FEV1). Prevention, early detection, and rapid treatment are extremely important. Azithromycin and extracorporeal photopheresis are commonly used for treatment because they usually positively influence the progression of lung remodeling. The expression of chronic rejection in the heart is cardiac allograft vasculopathy (CAV). Immunological and nonimmunological factors are important for its development. Due to limited therapeutic options, prevention is of utmost importance (administration of mTOR inhibitors and minimizing cardiovascular risk factors). The mid- and long-term survival rates after liver transplantation have hardly changed in recent decades, which is an indication of the difficulty in diagnosing chronic graft dysfunction. Chronic ductopenic rejection accounts for a small proportion of late graft dysfunction. Idiopathic posttransplant hepatitis and de novo autoimmune hepatitis are important in addition to recurrence of the underlying disease that led to transplantation. Chronic allograft nephropathy is the result of severe rejection, which culminates in increasing fibrosis with remodeling. The earliest possible diagnosis and therapy is currently the only option. Diagnosis is based on evidence of donor-specific antibodies and histological findings.
Zhang, Jin; Qiu, Jiang; Chen, Guo-Dong; Wang, Chang-Xi; Wang, Chang; Yu, Shuang-Jin; Chen, Li-Zhong
2018-11-01
The aim of this study was to investigate the clinical features of graft dysfunction following living kidney transplantation and to assess its causes. We retrospectively analyzed a series of 366 living kidney transplant indication biopsies with a clear etiology and diagnosis from July 2003 to June 2016 at our center. Classification and diagnosis were based on clinical and pathological characteristics, and all biopsies were evaluated according to the Banff 2007 schema. Acute rejection (AR) occurred in 85 cases (22.0%), chronic rejection (CR) in 62 cases (16.1%), borderline rejection (BR) in 12 cases (3.1%), calcineurin inhibitor (CNI) toxicity damage in 41 cases (10.6%), BK virus-associated nephropathy (BKVAN) in 43 cases (11.1%), de novo or recurrent renal diseases in 134 cases (34.7%), and other causes in nine cases (2.3%); additionally, 20 cases had two simultaneous causes. IgA nephropathy (IgAN), with 80 cases, accounted for the largest share (59.7%) of the de novo or recurrent renal diseases. After a mean ± SD follow-up of 3.7 ± 2.3 years, the 5-year graft cumulative survival rates for AR, CR, CNI toxicity, BKVAN, and de novo or recurrent renal diseases were 60.1%, 31.2%, 66.6%, 66.9%, and 67.1%, respectively. Biopsy is helpful for the diagnosis of graft dysfunction. De novo or recurrent renal disease, represented by IgAN, is a major cause of graft dysfunction following living kidney transplantation.
Gür, Demet Özkaramanlı; Gür, Özcan; Gürkan, Selami; Cömez, Selcem; Gönültaş, Aylin; Yılmaz, Murat
2016-01-01
Objective: Diabetes-associated endothelial dysfunction, which determines both long- and short-term graft patency, is not uniform across coronary artery bypass grafting (CABG) grafts. In this study, we aimed to investigate the degree of endothelial dysfunction in radial artery (RA), internal mammary artery (IMA), and saphenous vein (SV) grafts from diabetic patients in an in vitro tissue bath system. Methods: This was a prospective experimental study. Fifteen diabetic and 15 non-diabetic patients were included in the study. A total of 96 graft samples were collected: 16 samples of each graft type from diabetic and from non-diabetic patients. Arterial grafts were harvested with pedicles and SV grafts were harvested by the ‘no touch’ technique. The vasodilatation response of vascular rings to carbachol, which induces nitric oxide (NO)-mediated vasodilatation, was designated as the measure of endothelial function. Results: The IMA grafts had the most prominent NO-mediated vasodilatation in both diabetic and non-diabetic patients, indicating better preserved endothelial function than SV and RA grafts. The ‘no-touch’ SV and RA grafts had similar vasodilatation responses in non-diabetic patients. In diabetic patients, on the other hand, RA grafts exhibited the weakest vasodilatation response (i.e., the worst endothelial function), even less than ‘no touch’ SV grafts (p<0.0001). Conclusion: The deteriorated function of RA grafts in diabetic patients shown by this study, even worse than that of SV grafts, encourages the use of the ‘no touch’ technique for SV harvesting and more meticulous imaging of the RA before its use as a graft in diabetic patients. PMID:26301347
Gerstenkorn, C; Robertson, H; Mohamed, M A; O'Donnell, M; Ali, S; Talbot, D
2000-11-01
Chronic rejection accounts for the greatest loss of renal allografts. HLA mismatching has been minimised by organ allocation and new immunosuppressive drugs have been employed, but average cadaveric graft survival still does not exceed 12 years. Although the aetiology is multifactorial, one contributory factor for this condition is cytomegalovirus (CMV). Detection of CMV in kidney biopsies and sera can diagnose and monitor this inflammatory event and define its role in chronic nephropathy. Twenty-five biopsies taken at the time of transplantation, 10 biopsies taken for graft dysfunction, and tissue blocks from 20 explanted kidney grafts were collected and investigated for CMV antigens by immunohistochemistry. Tissue samples were snap frozen and cryostat sections were incubated with monoclonal antibodies against CMV antigens, followed by immunoperoxidase staining. CMV antigens were found in 12 of 20 transplant nephrectomies. Only two of these patients had clinical CMV disease. Time 0 biopsies from CMV-seronegative donors (n = 11) and CMV-seropositive donors (n = 14) were negative for CMV antigens. The prevalence of CMV antigens in grafts lost due to chronic rejection was 60%. These antigens were not found in the time 0 biopsies, but were detected in 30% of biopsies taken at the time of clinical graft dysfunction. CMV appears to contribute to chronic rejection even in the absence of clinical disease.
Sayah, David M; Mallavia, Beñat; Liu, Fengchun; Ortiz-Muñoz, Guadalupe; Caudrillier, Axelle; DerHovanessian, Ariss; Ross, David J; Lynch, Joseph P; Saggar, Rajan; Ardehali, Abbas; Ware, Lorraine B; Christie, Jason D; Belperio, John A; Looney, Mark R
2015-02-15
Primary graft dysfunction (PGD) causes early mortality after lung transplantation and may contribute to late graft failure. No effective treatments exist. The pathogenesis of PGD is unclear, although both neutrophils and activated platelets have been implicated. We hypothesized that neutrophil extracellular traps (NETs) contribute to lung injury in PGD in a platelet-dependent manner. To study NETs in experimental models of PGD and in lung transplant patients. Two experimental murine PGD models were studied: hilar clamp and orthotopic lung transplantation after prolonged cold ischemia (OLT-PCI). NETs were assessed by immunofluorescence microscopy and ELISA. Platelet activation was inhibited with aspirin, and NETs were disrupted with DNaseI. NETs were also measured in bronchoalveolar lavage fluid and plasma from lung transplant patients with and without PGD. NETs were increased after either hilar clamp or OLT-PCI compared with surgical control subjects. Activation and intrapulmonary accumulation of platelets were increased in OLT-PCI, and platelet inhibition reduced NETs and lung injury, and improved oxygenation. Disruption of NETs by intrabronchial administration of DNaseI also reduced lung injury and improved oxygenation. In bronchoalveolar lavage fluid from human lung transplant recipients, NETs were more abundant in patients with PGD. NETs accumulate in the lung in both experimental and clinical PGD. In experimental PGD, NET formation is platelet-dependent, and disruption of NETs with DNaseI reduces lung injury. These data are the first description of a pathogenic role for NETs in solid organ transplantation and suggest that NETs are a promising therapeutic target in PGD.
Use of marginal grafts in deceased donor liver transplant: assessment of early outcomes.
Godara, Rajesh; Naidu, C Sudeep; Rao, Pankaj P; Sharma, Sanjay; Banerjee, Jayant K; Saha, Anupam; Vijay, Kapileshwer
2014-03-01
Orthotopic liver transplantation has become a routinely applied therapy for an expanding group of patients with end-stage liver disease. The shortage of organs has led centers to expand their criteria for the acceptance of marginal donors. There is current debate about the regulation and results of liver transplantation using marginal grafts. The study included data of all patients who received deceased donor liver grafts from March 2007 to December 2011. Patients with acute liver failure, living donor transplantation, split liver transplantation, and retransplantation were excluded. Early allograft dysfunction, primary nonfunction, patient survival, and the incidence of surgical complications were measured. A total of 33 patients were enrolled in this study; 20 received marginal and 13 received nonmarginal grafts. The two groups were well matched regarding age, sex, indication for liver transplantation, model for end-stage liver disease score, technique of transplant, requirement for vascular reconstruction, warm ischemia time, blood loss, and mean operative time. Posttransplant peak levels of liver enzymes, international normalized ratio, and bilirubin did not differ significantly between the marginal and nonmarginal groups. Wound infection occurred in 10% of marginal compared with 7.7% of nonmarginal graft recipients (p > 0.05). The incidences of vascular complications in the marginal group, hepatic artery thrombosis (four cases) and portal vein thrombosis (one case), were not significantly different from those in the nonmarginal group. Acute rejection was observed in a total of seven patients (21.2%): five (25%) in the marginal group and two (15.4%) in the nonmarginal group. Primary nonfunction occurred in three patients (9.1%): two (10%) in the marginal and one (7.7%) in the nonmarginal group. Average patient survival for the whole group was 91% at 1 week, 87.8% at 3 months, and 84.8% at 6 months. Because organ scarcity persists, additional pressure will build to use a greater proportion of the existing donor pool. The study, although small, clearly indicates that marginal livers can provide normal early functional recovery after transplantation.
Current Review of Iron Overload and Related Complications in Hematopoietic Stem Cell Transplantation
Atilla, Erden; Toprak, Selami K.; Demirer, Taner
2017-01-01
Iron overload is an adverse prognostic factor for patients undergoing hematopoietic stem cell transplantation (HSCT). In the HSCT setting, pretransplant and early posttransplant ferritin and transferrin saturation were found to be highly elevated due to high transfusion requirements. In addition, post-HSCT iron overload has been shown to be related to infections, hepatic sinusoidal obstruction syndrome, mucositis, liver dysfunction, and acute graft-versus-host disease. Hyperferritinemia causes decreased survival rates in both pre- and posttransplant settings. Serum ferritin levels, magnetic resonance imaging, and liver biopsy are diagnostic tools for iron overload. Organ dysfunction due to iron overload may cause high mortality rates, and sufficient iron chelation therapy is therefore recommended in this setting. In this review, the management of iron overload in adult HSCT is discussed. PMID:27956374
Reischig, Tomas; Kacer, Martin; Hruba, Petra; Jindra, Pavel; Hes, Ondrej; Lysak, Daniel; Bouda, Mirko; Viklicky, Ondrej
2017-01-01
Asymptomatic cytomegalovirus (CMV) infection is associated with graft dysfunction and failure. However, no study has assessed CMV viral load in terms of the risk for graft failure. In a prospective cohort of kidney transplant recipients, we assessed the impact of CMV DNAemia on overall graft survival and on the incidence of moderate-to-severe interstitial fibrosis and tubular atrophy (IF/TA) in protocol biopsies at 36 months. CMV DNAemia was stratified by viral load in whole blood. A total of 180 patients transplanted from October 2003 through January 2011 were included and followed for 4 years; 87 (48%) patients received 3-month prophylaxis with valacyclovir and 45 (25%) with valganciclovir; 48 (27%) were managed by pre-emptive therapy. Within 12 months of transplantation, CMV DNAemia developed in 102 (57%) patients, with 36 (20%) having a viral load of ≥2,000 copies/ml. Multivariate Cox analysis identified CMV DNAemia as an independent risk factor for graft loss (hazard ratio 3.42; P=0.020); however, after stratification by viral load, only CMV DNAemia ≥2,000 copies/ml (hazard ratio 7.62; P<0.001) remained significant. Both early-onset (<3 months; P=0.048) and late-onset (>3 months; P<0.001) CMV DNAemia ≥2,000 copies/ml were risk factors for graft loss. The incidence of moderate-to-severe IF/TA was not significantly influenced by CMV DNAemia. Kidney transplant recipients who develop CMV DNAemia with a higher viral load, irrespective of the time of onset, are at increased risk for graft loss.
Lan, C; Song, J L; Yan, L N; Yang, J Y; Wen, T F; Li, B; Xu, M Q
The impact of using liver allografts from donors younger than 14 years in donation after cardiac death (DCD) liver transplantation, in terms of early allograft dysfunction (EAD) and graft survival, is undefined. The aim was to determine whether adults undergoing DCD liver transplantation who receive a graft from a donor aged 13 years or younger have outcomes similar to those of recipients of organs from donors older than 18 years. Records of adult patients undergoing DCD liver transplantation between March 2012 and December 2015 who received whole grafts from donors after cardiac death were reviewed. Patients with donors aged 13 years or younger (group 1) and older than 18 years (group 2) were compared for EAD rates, hepatic artery thrombosis (HAT), and graft survival. Records of 60 DCD liver transplantation patients were analyzed. The 90-day and 1-year graft survival rates of groups 1 and 2 were 90% versus 96% (P = .427) and 80% versus 84% (P = .668), respectively. The EAD rates of groups 1 and 2 were 30% versus 34% (P = .806). The incidence of HAT was 20% in group 1 compared with 12% in group 2 (P = .610). A graft-to-recipient weight ratio (GRWR) between 0.7% and 0.8% was also usable for pediatric donors to adult recipients. Whole liver grafts from donors aged 13 years or younger can potentially be used in selected size-matched (GRWR >0.7%) DCD adult recipients. Copyright © 2017 Elsevier Inc. All rights reserved.
Autoimmune diabetes recurrence should be routinely monitored after pancreas transplantation.
Martins, La Salete
2014-09-24
Autoimmune type 1 diabetes recurrence in pancreas grafts was first described 30 years ago, but it is not yet completely understood. In fact, the reported number of transplants affected and possibly lost due to this disease may be falsely low. There may be insufficient awareness of this entity among clinicians, leading to underdiagnosis. Some authors estimate that half of the immunological losses in pancreas transplantation are due to autoimmunity. Pancreas biopsy is the gold standard for definitive diagnosis; however, as an invasive procedure, it is not the ideal approach to screen for the disease. Pancreatic autoantibodies, which when searched for may be detected early, before graft dysfunction, are probably the best initial tool to establish the diagnosis. The purpose of this review is to revisit the autoimmune aspects of type 1 diabetes and to analyse the data on the identified autoantibodies as serological markers of the disease. Therapeutic strategies used to control the disease, though with unsatisfactory results so far, are also addressed. In addition, the author's own experience with prospective monitoring of pancreatic autoantibodies after transplantation and its correlation with graft outcome will be discussed.
Predictive factors of short term outcome after liver transplantation: A review
Bolondi, Giuliano; Mocchegiani, Federico; Montalti, Roberto; Nicolini, Daniele; Vivarelli, Marco; De Pietri, Lesley
2016-01-01
Liver transplantation represents a fundamental therapeutic solution to end-stage liver disease. The need for liver allografts has extended the set of criteria for organ acceptability, increasing the risk of adverse outcomes. Little is known about the early postoperative parameters that can be used as valid predictive indices for early graft function, retransplantation or surgical reintervention, secondary complications, long intensive care unit stay or death. In this review, we present state-of-the-art knowledge regarding the early post-transplantation tests and scores that can be applied during the first postoperative week to predict liver allograft function and patient outcome, thereby guiding the therapeutic and surgical decisions of the medical staff. Post-transplant clinical and biochemical assessment of patients through laboratory tests (platelet count, transaminase and bilirubin levels, INR, factor V, lactates, and insulin-like growth factor 1) and scores (model for end-stage liver disease, acute physiology and chronic health evaluation, sequential organ failure assessment and model of early allograft function) have been reported to have good performance, but they only allow late evaluation of patient status and graft function, requiring days to be quantified. The indocyanine green plasma disappearance rate has long been used as a liver function assessment technique and has produced interesting, although not univocal, results when performed between the 1st and the 5th day after transplantation. The liver maximal function capacity test is a promising method of metabolic liver activity assessment, but its use is limited by economic cost and extrahepatic factors. To date, a consensual definition of early allograft dysfunction and the integration and validation of the above-mentioned techniques, through the development of numerically consistent multicentric prospective randomised trials, are necessary. The medical and surgical management of transplanted patients could be greatly improved by using clinically reliable tools to predict early graft function. PMID:27468188
Predictive factors of short term outcome after liver transplantation: A review.
Bolondi, Giuliano; Mocchegiani, Federico; Montalti, Roberto; Nicolini, Daniele; Vivarelli, Marco; De Pietri, Lesley
2016-07-14
Liver transplantation represents a fundamental therapeutic solution to end-stage liver disease. The need for liver allografts has extended the set of criteria for organ acceptability, increasing the risk of adverse outcomes. Little is known about the early postoperative parameters that can be used as valid predictive indices for early graft function, retransplantation or surgical reintervention, secondary complications, long intensive care unit stay or death. In this review, we present state-of-the-art knowledge regarding the early post-transplantation tests and scores that can be applied during the first postoperative week to predict liver allograft function and patient outcome, thereby guiding the therapeutic and surgical decisions of the medical staff. Post-transplant clinical and biochemical assessment of patients through laboratory tests (platelet count, transaminase and bilirubin levels, INR, factor V, lactates, and insulin-like growth factor 1) and scores (model for end-stage liver disease, acute physiology and chronic health evaluation, sequential organ failure assessment and model of early allograft function) have been reported to have good performance, but they only allow late evaluation of patient status and graft function, requiring days to be quantified. The indocyanine green plasma disappearance rate has long been used as a liver function assessment technique and has produced interesting, although not univocal, results when performed between the 1st and the 5th day after transplantation. The liver maximal function capacity test is a promising method of metabolic liver activity assessment, but its use is limited by economic cost and extrahepatic factors. To date, a consensual definition of early allograft dysfunction and the integration and validation of the above-mentioned techniques, through the development of numerically consistent multicentric prospective randomised trials, are necessary. The medical and surgical management of transplanted patients could be greatly improved by using clinically reliable tools to predict early graft function.
Preservation of myocardium during coronary artery bypass surgery.
Kinoshita, Takeshi; Asai, Tohru
2012-08-01
Myocardial protection aims to prevent reversible post-ischemic cardiac dysfunction (myocardial stunning) and irreversible myocardial cell death (myocardial infarction) that occur as a consequence of myocardial ischemia and/or ischemic-reperfusion injury. Although the mortality rate for isolated coronary artery bypass grafting has been markedly reduced during the past decade, myocardial death, as evidenced by elevation in creatine kinase-myocardial band and/or cardiac troponin, is common. This is ascribed to suboptimal myocardial protection during cardiopulmonary bypass or with off-pump technique, early graft failure, distal embolization, and regional or global myocardial ischemia during surgery. An unmet need in contemporary coronary bypass surgery is to find more effective cardioprotective strategies that have the potential for decreasing the morbidity and mortality associated with suboptimal cardioprotection. In the present review article on myocardial protection in contemporary coronary artery bypass surgery, we attempt to elucidate the clinical problems, summarize the outcomes of selected phase III trials, and introduce new perspectives.
Sayah, David M.; Mallavia, Beñat; Liu, Fengchun; Ortiz-Muñoz, Guadalupe; Caudrillier, Axelle; DerHovanessian, Ariss; Ross, David J.; Lynch III, Joseph P.; Saggar, Rajan; Ardehali, Abbas; Ware, Lorraine B.; Christie, Jason D.; Belperio, John A.; Looney, Mark R.
2015-01-01
Rationale: Primary graft dysfunction (PGD) causes early mortality after lung transplantation and may contribute to late graft failure. No effective treatments exist. The pathogenesis of PGD is unclear, although both neutrophils and activated platelets have been implicated. We hypothesized that neutrophil extracellular traps (NETs) contribute to lung injury in PGD in a platelet-dependent manner. Objectives: To study NETs in experimental models of PGD and in lung transplant patients. Methods: Two experimental murine PGD models were studied: hilar clamp and orthotopic lung transplantation after prolonged cold ischemia (OLT-PCI). NETs were assessed by immunofluorescence microscopy and ELISA. Platelet activation was inhibited with aspirin, and NETs were disrupted with DNaseI. NETs were also measured in bronchoalveolar lavage fluid and plasma from lung transplant patients with and without PGD. Measurements and Main Results: NETs were increased after either hilar clamp or OLT-PCI compared with surgical control subjects. Activation and intrapulmonary accumulation of platelets were increased in OLT-PCI, and platelet inhibition reduced NETs and lung injury, and improved oxygenation. Disruption of NETs by intrabronchial administration of DNaseI also reduced lung injury and improved oxygenation. In bronchoalveolar lavage fluid from human lung transplant recipients, NETs were more abundant in patients with PGD. Conclusions: NETs accumulate in the lung in both experimental and clinical PGD. In experimental PGD, NET formation is platelet-dependent, and disruption of NETs with DNaseI reduces lung injury. These data are the first description of a pathogenic role for NETs in solid organ transplantation and suggest that NETs are a promising therapeutic target in PGD. PMID:25485813
Long-term outcomes and predictors in pediatric liver retransplantation.
Dreyzin, Alexandra; Lunz, John; Venkat, Veena; Martin, Lillian; Bond, Geoffrey J; Soltys, Kyle A; Sindhi, Rakesh; Mazariegos, George V
2015-12-01
Historically, 9-29% of pediatric liver transplant recipients have required retransplantation. Although outcomes have improved over the last decade, currently published patient and graft survival remain lower after retransplant than after primary transplant. Data from liver retransplantation recipients at our institution between 1991 and 2013 were retrospectively reviewed. Kaplan-Meier estimates were used to depict patient and graft survival. Predictors of survival were analyzed using a series of Cox proportional hazards models. Predictors were analyzed separately for patients who had "early" (≤ 30 days after primary transplant) and "late" retransplants. Eighty-four patients underwent retransplant at a median time of 241 days. Sixty percent had late retransplants. At one, five, and 10 yr, actuarial patient and graft survival were 73%/71%, 66%/63%, and 58%/53%, respectively. Since 2002, patient and graft survival improved to 86%/86% at one yr and 93%/87% at five yr. While operative complications were a common cause of death after earlier retransplants, since 2002, infection has been the only cause of death. Significant morbidities at five-yr follow-up include renal dysfunction (15%), diabetes (13%), hypertension (26%), chronic rejection (7%), and PTLD (2%). Current survival after pediatric liver retransplantation has improved significantly, but long-term immunosuppressant morbidity remains an opportunity for improvement. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Pathophysiology and classification of primary graft dysfunction after lung transplantation
Morrison, Morvern Isabel; Pither, Thomas Leonard
2017-01-01
The term primary graft dysfunction (PGD) incorporates a continuum of disease severity from moderate to severe acute lung injury (ALI) within 72 h of lung transplantation. It represents the most significant obstacle to achieving good early post-transplant outcomes, but is also associated with an increased incidence of bronchiolitis obliterans syndrome (BOS) subsequently. PGD is characterised histologically by diffuse alveolar damage, but is graded on clinical grounds with a combination of PaO2/FiO2 (P/F) and the presence of radiographic infiltrates, with 0 being absence of disease and 3 being severe PGD. The aetiology is multifactorial but commonly results from severe ischaemia-reperfusion injury (IRI), with tissue-resident macrophages largely responsible for stimulating a secondary ‘wave’ of neutrophils and lymphocytes that produce severe and widespread tissue damage. Donor history, recipient health and operative factors may all potentially contribute to the likelihood of PGD development. Work that aims to minimise the incidence of PGD is ongoing, with techniques such as ex vivo perfusion of donor lungs showing promise both in research and in clinical studies. This review will summarise the current clinical status of PGD before going on to discuss its pathophysiology, current therapies available and future directions for clinical management of PGD. PMID:29268419
Excellent outcomes of liver transplantation using severely steatotic grafts from brain-dead donors.
Wong, Tiffany C L; Fung, James Y Y; Chok, Kenneth S H; Cheung, Tan To; Chan, Albert C Y; Sharr, William W; Dai, Wing Chiu; Chan, See Ching; Lo, Chung Mau
2016-02-01
Liver grafts with macrovesicular steatosis of > 60% are considered unsuitable for deceased donor liver transplantation (DDLT) because of the unacceptably high risk of primary nonfunction (PNF) and graft loss. This study reports our experience in using such grafts from brain-dead donors. Prospectively collected data of DDLT recipient outcomes from 1991 to 2013 were retrospectively analyzed. Macrovesicular steatosis > 60% at postperfusion graft biopsy was defined as severe steatosis. In total, 373 patients underwent DDLT. Nineteen patients received severely steatotic grafts (ie, macrovesicular steatosis > 60%), and 354 patients had grafts with ≤ 60% steatosis (control group). Baseline demographics were comparable except that recipients were older in the severe steatosis group (51 versus 55 years; P = 0.03). Median Model for End-Stage Liver Disease (MELD) score was 20 in the severe steatosis group and 22 in the control group. Cold ischemia time (CIT) was 384 minutes in the severe steatosis group and 397.5 minutes in the control group (P = 0.66). The 2 groups were similar in duration of stay in the hospital and in the intensive care unit. The risks of early allograft dysfunction (0/19 [0%] versus 1/354 [0.3%]; P>0.99) and 30-day mortality (0/19 [0%] versus 11/354 [3.1%]; P = 0.93) were also similar between groups. No patient developed PNF. The 1-year and 3-year overall survival rates in the severe steatosis group were both 94.7%. The corresponding rates in the control group were 91.8% and 85.8% (P = 0.55). The use of severely steatotic liver grafts from low-risk donors was safe, and excellent outcomes were achieved; however, these grafts should be used with caution, especially in patients with a high MELD score. Keeping a short CIT was crucial for the successful use of such grafts in liver transplantation. © 2015 American Association for the Study of Liver Diseases.
Wadei, H M; Lee, D D; Croome, K P; Mai, M L; Golan, E; Brotman, R; Keaveny, A P; Taner, C B
2016-03-01
Early allograft dysfunction (EAD) after liver transplantation (LT) is related to ischemia-reperfusion injury and may lead to a systemic inflammatory response and extrahepatic organ dysfunction. We evaluated the effect of EAD on new-onset acute kidney injury (AKI) requiring renal replacement therapy within the first month and end-stage renal disease (ESRD) within the first year post-LT in 1325 primary LT recipients. EAD developed in 358 (27%) of recipients. Seventy-one (5.6%) recipients developed AKI and 38 (2.9%) developed ESRD. Compared with those without EAD, recipients with EAD had a higher risk of AKI and ESRD (4% vs. 9% and 2% vs. 6%, respectively, p < 0.001 for both). Multivariate logistic regression analysis showed an independent relationship between EAD and AKI as well as ESRD (odds ratio 3.5, 95% confidence interval 1.9-6.4, and odds ratio 3.1, 95% confidence interval 11.9-91.2, respectively). Patients who experienced both EAD and AKI had inferior 1-, 3-, 5-, and 10-year patient and graft survival compared with those with either EAD or AKI alone, while those who had neither AKI nor EAD had the best outcomes (p < 0.001). Post-LT EAD is a risk factor for both AKI and ESRD and should be considered a target for future intervention to reduce post-LT short- and long-term renal dysfunction. © Copyright 2015 The American Society of Transplantation and the American Society of Transplant Surgeons.
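For readers who want to reproduce this kind of association measure, the sketch below computes a crude odds ratio with a Woolf-type 95% confidence interval from a 2x2 table. The counts are hypothetical reconstructions from the percentages quoted above, and a crude odds ratio will not match the adjusted estimates reported by the study's multivariate model.

```python
import math


def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Odds ratio and Woolf (log-normal) 95% CI for a 2x2 table:

        exposed:    a (event)   b (no event)
        unexposed:  c (event)   d (no event)
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, (lower, upper)


# Hypothetical counts only (approximated from the quoted 9% vs 4% AKI rates,
# not the study's patient-level data): 32/358 EAD vs 39/967 non-EAD recipients.
print(odds_ratio_ci(32, 358 - 32, 39, 967 - 39))
```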
Anesthetic and Perioperative Management of Nontransplant Surgery in Patients After Liver Transplant.
Ersoy, Zeynep; Ayhan, Asude; Ozdemirkan, Aycan; Polat, Gulsi Gulsah; Zeyneloglu, Pinar; Arslan, Gulnaz; Haberal, Mehmet
2017-02-01
We aimed to document the anesthetic management and metabolic, hemodynamic, and clinical outcomes of liver-graft recipients who subsequently undergo nontransplant surgical procedures. We retrospectively analyzed the data of 96 liver-graft recipients who underwent 144 nontransplant surgeries between October 1998 and April 2016 at Başkent University Hospital. The median patient age at the time of nontransplant surgery was 32 years, and 35% were female (n = 33). The median time between transplant and nontransplant surgery was 1231 days. The most frequent types of nontransplant surgery were abdominal (22%), orthopedic (16%), and urologic (13%). Seventy patients had an American Society of Anesthesiologists status of 2 (49%); the status was 3 in 71 patients (49%) and 4 in 3 patients (2%). Of the 144 procedures, 23 were emergent (16%) and 48% were abdominal. General anesthesia was used in 69%, regional anesthesia in 19%, and sedoanalgesia in 11%. Twenty-five patients required intraoperative blood-product transfusion (17%). Intraoperative hemodynamic instability developed in 17% of patients, and hypoxemia developed in 2%. Eleven patients remained intubated at the end of surgery (8%). Of the 144 procedures, 19 (13%) required transfer to the intensive care unit, 108 (75%) transferred to the ward, and the remaining 17 (12%) were discharged on the same day. Eight patients developed respiratory failure (6%), 7 had renal dysfunction (5%), 4 had coagulation abnormalities (3%), and 10 had infectious complications (7%) in the early postoperative period. The median hospital stay was 4 days, and 5 patients (4%) developed rejection during hospitalization. Five patients died of respiratory or infectious complications (4%). Most liver-graft recipients who undergo nontransplant surgery are given general anesthesia, transferred to the ward after the procedure, and discharged without major complications. We suggest that orthotopic liver transplant recipients may undergo nontransplant surgery without any postoperative graft dysfunction.
Imaging follow-up after liver transplantation
Rossi, Massimo; Mennini, Gianluca; Melandro, Fabio; Anzidei, Michele; De Vizio, Silvia; Koryukova, Kameliya; Catalano, Carlo
2016-01-01
Liver transplantation (LT) represents the best treatment for end-stage chronic liver disease, acute liver failure and early stages of hepatocellular carcinoma. Radiologists should be aware of surgical techniques to distinguish a normal appearance from pathological findings. Imaging modalities, such as ultrasound, CT and MR, provide for rapid and reliable detection of vascular and biliary complications after LT. The role of imaging in the evaluation of rejection and primary graft dysfunction is less defined. This article illustrates the main surgical anastomoses during LT, the normal appearance and complications of the liver parenchyma and vascular and biliary structures. PMID:27188846
Hassani, Soghra; Alipour, Abbas; Darvishi Khezri, Hadi; Firouzian, Abolfazl; Emami Zeydi, Amir; Gholipour Baradari, Afshin; Ghafari, Rahman; Habibi, Wali-Allah; Tahmasebi, Homeyra; Alipour, Fatemeh; Ebrahim Zadeh, Pooneh
2015-03-01
We hypothesized that valerian root might prevent cognitive dysfunction in coronary artery bypass graft (CABG) surgery patients through stimulating serotonin receptors and anti-inflammatory activity. The aim of this study was to evaluate the effect of Valeriana officinalis root extract on prevention of early postoperative cognitive dysfunction after on-pump CABG surgery. In a randomized, double-blind, placebo-controlled trial, 61 patients, aged between 30 and 70 years, scheduled for elective CABG surgery using cardiopulmonary bypass (CPB), were recruited into the study. Patients were randomly divided into two groups that received either one valerian capsule containing 530 mg of valerian root extract (1,060 mg daily) or a placebo capsule every 12 h for 8 weeks. For all patients, cognitive brain function was evaluated before the surgery and at 10-day and 2-month follow-up with the Mini Mental State Examination (MMSE). Mean MMSE score decreased from 27.03 ± 2.02 in the preoperative period to 26.52 ± 1.82 at the 10th day and then increased to 27.45 ± 1.36 at the 60th day in the valerian group. In the placebo group, by contrast, the mean MMSE score decreased significantly, from 27.37 ± 1.87 at baseline to 24.00 ± 1.91 at the 10th day, and then increased only slightly to 24.83 ± 1.66 at the 60th day. Valerian prophylaxis reduced odds of cognitive dysfunction compared with the placebo group (OR = 0.108, 95% CI 0.022-0.545). We concluded that, based on this study, the cognitive state of patients in the valerian group was better than that in the placebo group after CABG; therefore, it seems that the use of V. officinalis root extract may prevent early postoperative cognitive dysfunction after on-pump CABG surgery.
Clerkin, Kevin J.; Farr, Maryjane A.; Restaino, Susan W.; Zorn, Emmanuel; Latif, Farhana; Vasilescu, Elena R.; Marboe, Charles C.; Colombo, Paolo C.; Mancini, Donna M.
2017-01-01
Introduction Donor specific anti-HLA antibodies (DSA) are common following heart transplantation and are associated with rejection, cardiac allograft vasculopathy (CAV), and mortality. Currently a non-invasive diagnostic test for pathologic AMR (pAMR) does not exist. Methods 221 consecutive adult patients underwent heart transplantation from January 1st, 2010 through August 31st, 2013 and were followed through October 1st, 2015. The primary objective was to determine whether the presence of DSA could detect AMR at the time of pathologic diagnosis. Secondary analyses included the association of DSA (stratified by MHC Class and de-novo status) during AMR with new graft dysfunction, graft loss (mortality or retransplantation), and development of CAV. Results During the study period 69 individual patients (31.2%) had DSA (24% had de-novo DSA) and there were 74 episodes of pAMR in 38 unique patients. The sensitivity of DSA at any MFI to detect concurrent pAMR was only 54.3%. The presence of any DSA during pAMR increased the odds of graft dysfunction (OR 5.37, 95% CI 1.34–21.47, p=0.018), adjusting for age, gender, and timing of AMR. Circulating Class II DSA after transplantation increased the risk of future pAMR (HR 2.97, 95% CI 1.31–6.73, p=0.009). Patients who developed de-novo Class II DSA had a 151% increase in risk of graft loss (contingent on 30-day survival) compared with those who did not have DSA (95% CI 1.11–5.69, p=0.027). Conclusions DSA were inadequate to diagnose pAMR, but Class II DSA provided prognostic information regarding future pAMR, graft dysfunction with pAMR, and graft loss. PMID:27916323
Clerkin, Kevin J; Farr, Maryjane A; Restaino, Susan W; Zorn, Emmanuel; Latif, Farhana; Vasilescu, Elena R; Marboe, Charles C; Colombo, Paolo C; Mancini, Donna M
2017-05-01
Donor-specific anti-HLA antibodies (DSA) are common after heart transplantation and are associated with rejection, cardiac allograft vasculopathy, and mortality. A noninvasive diagnostic test for pathologic antibody-mediated rejection (pAMR) does not exist. From January 1, 2010, through August 31, 2013, 221 consecutive adult patients underwent heart transplantation and were followed through October 1, 2015. The primary objective was to determine whether the presence of DSA could detect AMR at the time of pathologic diagnosis. Secondary analyses included association of DSA (stratified by major histocompatibility complex class and de novo status) during AMR with new graft dysfunction, graft loss (mortality or retransplantation), and development of cardiac allograft vasculopathy. During the study period, 69 patients (31.2%) had DSA (24% had de novo DSA), and there were 74 episodes of pAMR in 38 patients. Sensitivity of DSA at any mean fluorescence intensity to detect concurrent pAMR was only 54.3%. The presence of any DSA during pAMR increased the odds of graft dysfunction (odds ratio = 5.37; 95% confidence interval [CI], 1.34-21.47; p = 0.018), adjusting for age, sex, and timing of AMR. Circulating class II DSA after transplantation increased risk of future pAMR (hazard ratio = 2.97; 95% CI, 1.31-6.73; p = 0.009). Patients who developed de novo class II DSA had 151% increased risk of graft loss (contingent on 30-day survival) compared with patients who did not have DSA (95% CI, 1.11-5.69; p = 0.027). DSA were inadequate to diagnose pAMR. Class II DSA provided prognostic information regarding future pAMR, graft dysfunction with pAMR, and graft loss. Copyright © 2017 International Society for Heart and Lung Transplantation. Published by Elsevier Inc. All rights reserved.
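Both versions of this abstract hinge on test characteristics (a 54.3% sensitivity of DSA for concurrent pAMR), so a minimal sketch of how such figures are derived from a 2x2 confusion matrix may be helpful. The cell counts below are hypothetical and chosen only so that the sensitivity lands near the reported value.

```python
def test_characteristics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity, specificity and predictive values from a 2x2 confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }


# Illustrative numbers only: DSA present during 40 of 74 pAMR episodes would
# give a sensitivity close to the 54.3% reported above.
print(test_characteristics(tp=40, fp=30, fn=34, tn=500))
```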
Zhou, Jing; Qin, Lingfeng; Yi, Tai; Ali, Rahmat; Li, Qingle; Jiao, Yang; Li, Guangxin; Tobiasova, Zuzana; Huang, Yan; Zhang, Jiasheng; Yun, James J.; Sadeghi, Mehran M.; Giordano, Frank J.; Pober, Jordan S.; Tellides, George
2015-01-01
Rationale Transplantation, the most effective therapy for end-stage organ failure, is markedly limited by early-onset cardiovascular disease (CVD) and premature death of the host. The mechanistic basis of this increased CVD is not fully explained by known risk factors. Objective To investigate the role of alloimmune responses in promoting CVD of organ transplant recipients. Methods and Results We established an animal model of graft-exacerbated host CVD by combining murine models of atherosclerosis (apolipoprotein E-deficient recipients on standard diet) and of intra-abdominal graft rejection (heterotopic cardiac transplantation without immunosuppression). CVD was absent in normolipidemic hosts receiving allogeneic grafts and varied in severity among hyperlipidemic grafted hosts according to recipient-donor genetic disparities, most strikingly across an isolated major histocompatibility complex class II antigen barrier. Host disease manifested as increased atherosclerosis of the aorta that also involved the native coronary arteries and new findings of decreased cardiac contractility, ventricular dilatation, and diminished aortic compliance. Exacerbated CVD was accompanied by greater levels of circulating cytokines, especially interferon-γ and other Th1-type cytokines, and showed both systemic and intra-lesional activation of leukocytes, particularly T helper cells. Serologic neutralization of interferon-γ after allotransplantation prevented graft-related atherosclerosis, cardiomyopathy, and aortic stiffening in the host. Conclusions Our study reveals that sustained activation of the immune system due to chronic allorecognition exacerbates the atherogenic diathesis of hyperlipidemia and results in de novo cardiovascular dysfunction in organ transplant recipients. PMID:26399469
Domenico, T.D.; Joelsons, G.; Montenegro, R.M.; Manfro, R.C.
2017-01-01
We analyzed microRNA (miR)-142-3p expression in leucocytes of the peripheral blood and urinary sediment cell samples obtained from kidney transplant recipients who developed graft dysfunction. Forty-one kidney transplant recipients with kidney graft dysfunction and 8 stable patients were included in the study. The groups were divided according to histological analysis into acute rejection group (n=23), acute tubular necrosis group (n=18) and stable patients group used as a control for gene expression (n=8). Percutaneous biopsies were performed and peripheral blood samples and urine samples were obtained. miR-142-3p was analyzed by real-time polymerase chain reaction. The group of patients with acute tubular necrosis presented significantly higher expression in peripheral blood (P<0.05) and urine (P<0.001) compared to the stable patients group. Also, in the peripheral blood, miR-142-3p expression was significantly higher in the acute tubular necrosis group compared to the acute rejection group (P<0.05). Urine samples of the acute rejection group presented higher expression compared to the stable patients group (P<0.001) but the difference between acute tubular necrosis and acute rejection groups was not significant in the urinary analyses (P=0.079). miR-142-3p expression has a distinct pattern of expression in the setting of post-operative acute tubular necrosis after kidney transplantation and may potentially be used as a non-invasive biomarker for renal graft dysfunction. PMID:28380212
Domenico, T D; Joelsons, G; Montenegro, R M; Manfro, R C
2017-04-03
We analyzed microRNA (miR)-142-3p expression in leucocytes of the peripheral blood and urinary sediment cell samples obtained from kidney transplant recipients who developed graft dysfunction. Forty-one kidney transplant recipients with kidney graft dysfunction and 8 stable patients were included in the study. The groups were divided according to histological analysis into acute rejection group (n=23), acute tubular necrosis group (n=18) and stable patients group used as a control for gene expression (n=8). Percutaneous biopsies were performed and peripheral blood samples and urine samples were obtained. miR-142-3p was analyzed by real-time polymerase chain reaction. The group of patients with acute tubular necrosis presented significantly higher expression in peripheral blood (P<0.05) and urine (P<0.001) compared to the stable patients group. Also, in the peripheral blood, miR-142-3p expression was significantly higher in the acute tubular necrosis group compared to the acute rejection group (P<0.05). Urine samples of the acute rejection group presented higher expression compared to the stable patients group (P<0.001) but the difference between acute tubular necrosis and acute rejection groups was not significant in the urinary analyses (P=0.079). miR-142-3p expression has a distinct pattern of expression in the setting of post-operative acute tubular necrosis after kidney transplantation and may potentially be used as a non-invasive biomarker for renal graft dysfunction.
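Neither version of this abstract states how the real-time PCR signal was converted into relative expression. A common choice is the comparative 2^-ΔΔCt method, sketched below under that assumption with hypothetical Ct values and an unspecified small-RNA reference gene.

```python
def relative_expression(ct_target: float, ct_reference: float,
                        ct_target_ctrl: float, ct_reference_ctrl: float) -> float:
    """Relative expression by the 2^-delta-delta-Ct method.

    ct_target / ct_reference          : Ct values for the sample of interest
    ct_target_ctrl / ct_reference_ctrl: Ct values for the calibrator
                                        (e.g. the stable-patient group)
    """
    delta_ct_sample = ct_target - ct_reference
    delta_ct_ctrl = ct_target_ctrl - ct_reference_ctrl
    return 2 ** -(delta_ct_sample - delta_ct_ctrl)


# Hypothetical Ct values: miR-142-3p vs. a small-RNA reference gene in an
# acute-tubular-necrosis sample, calibrated against the stable-patient group.
print(relative_expression(24.0, 20.0, 27.5, 20.5))  # fold change = 8.0
```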
López-Cantero, M; Grisolía, A L; Vicente, R; Moreno, I; Ramos, F; Porta, J; Torregrosa, S
2014-04-01
Primary graft dysfunction is a leading cause of morbidity and mortality in the immediate postoperative period of patients undergoing lung transplantation. Among the treatment options are: lung protective ventilatory strategies, nitric oxide, lung surfactant therapy, and supportive treatment with extracorporeal membrane oxygenation (ECMO) as a bridge to recovery of lung function or re-transplant. We report the case of a 9-year-old girl affected by cystic fibrosis who underwent double-lung transplantation complicated by severe primary graft dysfunction in the immediate postoperative period, refractory to standard therapies. Due to development of multiple organ failure, it was decided to insert arteriovenous ECMO catheters (pulmonary artery-right atrium). The postoperative course was satisfactory, allowing withdrawal of ECMO on the 5th post-surgical day. Currently the patient survives free of rejection and with an excellent quality of life after 600 days of follow-up. Copyright © 2012 Sociedad Española de Anestesiología, Reanimación y Terapéutica del Dolor. Published by Elsevier España. All rights reserved.
Moreso, Francesc; Hernández, Domingo
2013-01-18
The introduction of new immunosuppressant drugs in recent years has allowed for a reduction in acute rejection rates along with highly significant improvements in short-term kidney transplantation results. Nonetheless, this improvement has not translated into such significant changes in long-term results. As a result, late graft failure continues to be a frequent cause of readmission onto dialysis programmes and re-entry onto the waiting list. Multiple entities of immunological and non-immunological origin act together and lead to chronic allograft dysfunction. The characteristics of the transplanted organ are a major determinant of graft survival, and although various algorithms have been designed to characterise the risk of the donor organ and assign the most suitable recipient accordingly, they are applied in the clinical setting only under exceptional circumstances. Characterising, for each patient, the immune factors (clinical and subclinical rejection, reactivation of dormant viral infections, adherence to treatment) and non-immune factors (hypertension, diabetes, anaemia, dyslipidaemia) that contribute to chronic allograft dysfunction could allow us to intervene more effectively as a way of delaying the progress of such processes. Therefore, identifying the causes of graft failure and its risk factors, applying predictive models, and intervening in causal factors could constitute strategies for improving kidney transplantation results in terms of survival. This review analyses some of the evidence on the determinants of graft failure as well as related therapeutic and prognostic aspects: 1) magnitude of the problem and causes of graft failure; 2) identification of graft failure risk factors; 3) therapeutic strategies for reducing graft failure; and 4) graft failure prediction.
Ramírez, J.; Bostock, I. C.; Martin-Onraët, A.; Calleja, S.; Sánchez-Cedillo, A.; Navarro-Vargas, L. A.; Noriega-Salas, A. L.; Martínez-Mijangos, O.; Uribe-Uribe, N. O.; Vilatoba, M.; Gabilondo, B.; Morales-Buenrostro, L. E.; Alberú, J.
2013-01-01
We report two cases of adenoviral infection in kidney transplant recipients that presented with different clinical characteristics under similar demographic and posttransplant conditions. The first case presented with fever, gross haematuria, and acute graft dysfunction 15 days following renal transplantation. A graft biopsy, analyzed with immunohistochemistry, yielded negative results. However, the diagnosis was confirmed with blood and urine real-time PCR for adenovirus 3 days after the initial clinical manifestations. The immunosuppression dose was reduced, and ribavirin treatment was started, for which the patient quickly developed toxicity. Antiviral treatment allowed for transient response; however, a relapse occurred. The viral real-time PCR became negative upon immunosuppression reduction and administration of IVIG; graft function normalized. In the second case, the patient presented with fever and dysuria 1 month after transplantation. The initial imaging studies revealed graft enlargement and areas of hypoperfusion. In this case, the diagnosis was also confirmed with blood and urine real-time PCR for adenovirus 3 days after the initial clinical manifestations. Adenoviral nephritis was confirmed through a graft biopsy analyzed with light microscopy, immunohistochemistry, and PCR in frozen tissue. The immunosuppression dose was reduced, and IVIG was administered obtaining excellent clinical results along with a negative real-time PCR. PMID:24558620
2013-01-01
Vascular access dysfunction is a major cause of morbidity and mortality in hemodialysis patients. The most common cause of vascular access dysfunction is venous stenosis from neointimal hyperplasia within the perianastomotic region of an arteriovenous fistula and at the graft-vein anastomosis of an arteriovenous graft. There have been few, if any, effective treatments for vascular access dysfunction because of the limited understanding of the pathophysiology of venous neointimal hyperplasia formation. This review will (1) describe the histopathologic features of hemodialysis access stenosis; (2) discuss novel concepts in the pathogenesis of neointimal hyperplasia development, focusing on downstream vascular biology; (3) highlight future novel therapies for treating downstream biology; and (4) discuss future research areas to improve our understanding of downstream biology and neointimal hyperplasia development. PMID:23990166
Nonesterified fatty acids and development of graft failure in renal transplant recipients.
Klooster, Astrid; Hofker, H Sijbrand; Navis, Gerjan; Homan van der Heide, Jaap J; Gans, Reinold O B; van Goor, Harry; Leuvenink, Henri G D; Bakker, Stephan J L
2013-06-15
Chronic transplant dysfunction is the most common cause of graft failure in the long term. Proteinuria is one of the cardinal clinical signs of chronic transplant dysfunction. Albumin-bound fatty acids (FA) have been hypothesized to be instrumental in the etiology of renal damage induced by proteinuria. We therefore questioned whether high circulating FA could be associated with an increased risk for future development of graft failure in renal transplant recipients (RTR). To this end, we prospectively investigated the association of fasting concentrations of circulating nonesterified FA (NEFA) with the development of graft failure in RTR. Baseline measurements were performed between 2001 and 2003 in outpatient RTR with a functioning graft of more than 1 year. Follow-up was recorded until May 19, 2009. Graft failure was defined as return to dialysis or retransplantation. We included 461 RTR at a median (interquartile range [IQR]) of 6.1 (3.3-11.3) years after transplantation. Median (IQR) fasting concentrations of NEFA were 373 (270-521) μmol/L. Median (IQR) follow-up for graft failure beyond baseline was 7.1 (6.1-7.5) years. Graft failure occurred in 23 (15%), 14 (9%), and 9 (6%) of RTR across increasing gender-specific tertiles of NEFA (P=0.04). In a gender-adjusted Cox-regression analysis, log-transformed NEFA level was inversely associated with the development of graft failure (hazard ratio, 0.61; 95% confidence interval, 0.47-0.81; P<0.001). In this prospective cohort study in RTR, we found an inverse association between fasting NEFA concentrations and risk for development of graft failure. This association suggests a renoprotective rather than a tubulotoxic effect of NEFA. Further studies on the role of different types of NEFA in the progression of renal disease are warranted.
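The analysis described here (gender-specific NEFA tertiles plus a gender-adjusted Cox model on log-transformed NEFA) can be sketched with pandas and the lifelines package. The data frame below is simulated so the snippet runs end-to-end; the column names, distributions, and the choice of lifelines are assumptions, not details taken from the study.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(42)

# Simulated cohort table; column names are illustrative, not the study's.
df = pd.DataFrame({
    "nefa_umol_l": rng.lognormal(mean=5.9, sigma=0.4, size=461),
    "female": rng.binomial(1, 0.4, size=461),
    "followup_years": rng.uniform(0.5, 7.5, size=461),
    "graft_failure": rng.binomial(1, 0.1, size=461),
})

# Gender-specific tertiles, as described in the abstract.
df["nefa_tertile"] = (
    df.groupby("female")["nefa_umol_l"]
      .transform(lambda s: pd.qcut(s, 3, labels=[1, 2, 3]).astype(int))
)
print(df.groupby("nefa_tertile")["graft_failure"].agg(["sum", "count"]))

# Gender-adjusted Cox model on log-transformed NEFA.
df["log_nefa"] = np.log(df["nefa_umol_l"])
cph = CoxPHFitter()
cph.fit(df[["log_nefa", "female", "followup_years", "graft_failure"]],
        duration_col="followup_years", event_col="graft_failure")
cph.print_summary()
```

On simulated data the fitted hazard ratio is of course meaningless; the snippet only illustrates the mechanics of the tertile construction and the adjusted model.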
Left ventricular assist device malfunction: a systematic approach to diagnosis.
Horton, Steven C; Khodaverdian, Reza; Powers, Amanda; Revenaugh, James; Renlund, Dale G; Moore, Stephanie A; Rasmusson, Brad; Nelson, Karl E; Long, James W
2004-05-05
A protocol was designed to diagnose the common malfunctions of a left ventricular assist device (LVAD). Mechanical circulatory support, primarily with an LVAD, is increasingly used for treatment of advanced heart failure (HF). Left ventricular assist device dysfunction is a recognized complication; but heretofore, a systematic method to accurately diagnose LVAD dysfunction has not been thoroughly described. We developed a catheter-based protocol designed to characterize a normally functioning LVAD and diagnose multiple types of dysfunction. A total of 15 studies of 10 patients supported with an LVAD were reviewed. All patients had been evaluated due to concerns regarding LVAD dysfunction. Of 15 examinations performed, 11 documented severe LVAD inflow valve regurgitation. One of these cases proved to have coexistent severe mitral valve regurgitation. One case was diagnosed with distortion of the LVAD outflow graft. One case of suspected embolization from the pumping chamber excluded the outflow graft as the source of emboli. One study had aortic insufficiency. As LVAD use for treatment of end-stage HF becomes widespread and durations of support are extended, dysfunction will be increasingly prevalent. This catheter-based protocol provided a practical method to diagnose multiple causes of LVAD dysfunction.
DerHovanessian, Ariss; Weigt, S. Samuel; Palchevskiy, Vyacheslav; Shino, Michael Y.; Sayah, David M.; Gregson, Aric L.; Noble, Paul W.; Palmer, Scott M.; Fishbein, Michael C.; Kubak, Bernard M.; Ardehali, Abbas; Ross, David J.; Saggar, Rajan; Lynch, Joseph P.; Elashoff, Robert M.; Belperio, John A.
2016-01-01
Primary graft dysfunction (PGD) is a possible risk factor for bronchiolitis obliterans syndrome (BOS) following lung transplantation; however, the mechanism for any such association is poorly understood. Based on TGF-β's association with acute and chronic inflammatory disorders, we hypothesized that it may play a role in the continuum between PGD and BOS. Thus, the association between PGD and BOS was assessed in a single-center cohort of lung transplant recipients. Bronchoalveolar lavage fluid concentrations of TGF-β and procollagen collected within 24 hours of transplantation were compared across the spectrum of PGD, and incorporated into Cox models of BOS. Immunohistochemistry localized expression of TGF-β and its receptor in early lung biopsies post-transplant. We found an association between PGD and BOS in both bilateral and single lung recipients with a hazard ratio of 3.07 (95% CI 1.76-5.38) for the most severe form of PGD. TGF-β and procollagen concentrations were elevated during PGD (p<0.01), and associated with increased rates of BOS. Expression of TGF-β and its receptor localized to allograft infiltrating mononuclear and stromal cells, and the airway epithelium. These findings validate the association between PGD and the subsequent development of BOS, and suggest that this association may be mediated by receptor/TGF-β biology. PMID:26461171
[Immunological Markers in Organ Transplantation].
Beckmann, J H; Heits, N; Braun, F; Becker, T
2017-04-01
Immunological monitoring in organ transplantation is based mainly on the determination of laboratory parameters as surrogate markers of organ dysfunction. Structural damage, caused by alloreactivity, can only be detected by invasive biopsy of the graft, which is why rejection episodes are inevitably diagnosed at a relatively advanced stage. New non-invasive specific markers that enable transplant clinicians to identify rejection episodes at an earlier stage, on the molecular level, are needed. The accurate identification of rejection episodes and the establishment of operational tolerance permit early treatment or, respectively, a controlled cessation of immunosuppression. In addition, new prognostic biological markers are expected to allow pre-transplant risk stratification, thus having an impact on organ allocation and immunosuppressive regimen. New high-throughput screening methods allow simultaneous examination of hundreds of characteristics and the generation of specific biological signatures, which might give concrete information about acute rejection, chronic dysfunction as well as operational tolerance. Even though multiple studies and a variety of publications report important advances in this field, almost no new biological marker has been implemented in clinical practice as yet. Nevertheless, new technologies, in particular analysis of the genome, transcriptome, proteome and metabolome, will make personalised transplantation medicine possible and will further improve the long-term results and graft survival rates. This article gives a survey of the limitations and possibilities of new immunological markers in organ transplantation. Georg Thieme Verlag KG Stuttgart · New York.
Bloch, Konstantin; Gil-Ad, Irit; Tarasenko, Igor; Vanichkin, Alexey; Taler, Michal; Hornfeld, Shay Henry; Vardi, Pnina; Weizman, Abraham
2015-06-01
Treatment of rodents with the non-competitive N-methyl-D-aspartate (NMDA) receptor antagonist MK-801 (dizocilpine) induces symptoms of psychosis, deficits in spatial memory and impairment of synaptic plasticity. Recent studies have suggested that insulin administration might attenuate the cognitive dysfunctions through a modulatory effect on the expression of NMDA receptors and on brain insulin signaling. Intrahepatic pancreatic islet transplantation is known as an efficient tool for correcting impaired insulin signaling. We examined the capacity of syngeneic islets grafted into the cranial subarachnoid cavity to attenuate behavioral dysfunctions in rats exposed to MK-801. Animals were examined in the open field (OF) and the Morris Water Maze (MWM) tests following acute or subchronic administration of MK-801. We found well-vascularized grafted islets expressing insulin, glucagon and somatostatin onto the olfactory bulb and prefrontal cortex. Significantly higher levels of insulin were detected in the hippocampus and prefrontal cortex of transplanted animals compared to the non-transplanted rats. All animals maintained normal peripheral glucose homeostasis for two months after transplantation. OF tests revealed that rats exposed to MK-801 treatment showed hyper-responsiveness in motility parameters and augmented center field exploration compared to intact controls, and these effects were attenuated by the grafted islets. Moreover, in the MWM, the rats treated with MK-801 showed impairment of spatial memory that was partially corrected by the grafted islets. In conclusion, intracranial islet transplantation leads to the expression of islet hormones in the brain and attenuates behavioral and cognitive dysfunctions in rats exposed to MK-801 administration without altering the peripheral glucose homeostasis. Copyright © 2015 Elsevier Inc. All rights reserved.
Koizumi, Noriko; Okumura, Naoki; Ueno, Morio; Kinoshita, Shigeru
2014-11-01
Corneal endothelial dysfunction accompanied by visual disturbance is a primary indication for corneal endothelial transplantation. However, despite the value and potential of endothelial graft surgery, a strictly pharmacological approach for treating corneal endothelial dysfunction remains an attractive proposition. Previously, we reported that the selective Rho-associated kinase (ROCK) inhibitor Y-27632 promotes cell adhesion and proliferation, and inhibits the apoptosis of primate corneal endothelial cells in culture. These findings have led us to develop a novel medical treatment for the early phase of corneal endothelial disease using ROCK inhibitor eye drops. In rabbit and monkey models of partial endothelial dysfunction, we showed that corneal endothelial wound healing was accelerated via the topical application of ROCK inhibitor to the ocular surface, resulting in the regeneration of a corneal endothelial monolayer with a high endothelial cell density. Based on these animal studies, we are now attempting to advance the clinical application of ROCK inhibitor eye drops for patients with corneal endothelial dysfunction. A pilot clinical study was performed at the Kyoto Prefectural University of Medicine, and the effects of Y-27632 eye drops after transcorneal freezing were evaluated in 8 patients with corneal endothelial dysfunction. We observed a positive effect of ROCK inhibitor eye drops in treating patients with central edema caused by Fuchs corneal endothelial dystrophy. We believe that our new findings will contribute to the establishment of a new approach for the treatment of corneal endothelial dysfunction.
Eisenbach, Christoph; Longerich, Thomas; Fickenscher, Helmut; Schalasta, Gunnar; Stremmel, Wolfgang; Encke, Jens
2006-01-01
We report hepatitis A virus (HAV) infection of a liver allograft following transplantation for fulminant liver failure due to HAV infection. This rare condition has been described in only three patients to date. After liver transplantation allograft function was good, but starting 80 days after transplantation, episodes of acute graft dysfunction were observed. To elucidate the reason for acute hepatic dysfunction a large number of differential diagnoses were tested. HAV RNA was undetectable for more than 80 days after transplantation. Detection of genomic HAV RNA by RT-PCR in serum and stool at the time of graft dysfunction led to the diagnosis of recurrent HAV infection. We suggest that the risk of HAV reinfection after liver transplantation may be far higher than expected as results may be misinterpreted as rejection episodes.
Méndez, A B; Ordonez-Llanos, J; Mirabet, S; Galan, J; Maestre, M L; Brossa, V; Rivilla, M T; López, L; Koller, T; Sionis, A; Roig, E
2016-11-01
Primary graft dysfunction after heart transplantation (HTx) has a very high mortality rate, especially if the left ventricle (PGD-LV) is involved. Early diagnosis is important to select the appropriate therapy to improve prognosis. The value of high-sensitivity troponin T (HS-TNT) measurement obtained at patient arrival at the intensive care unit was analyzed in 71 HTx patients. Mild or moderate PGD-LV was defined by hemodynamic compromise with one of the following criteria: left ventricular ejection fraction <40%, hemodynamic compromise with right atrial pressure >15 mm Hg, pulmonary capillary wedge pressure >20 mm Hg, cardiac index <2.0 L/min/m2, hypotension (mean arterial pressure <70 mm Hg), and need for high-dose inotropes (inotrope score >10) or newly placed intra-aortic balloon pump. The mean recipient age was 54 ± 12 years (73% men), and donor age was 47 ± 11 years. Ischemic time was 200 ± 51 minutes, and coronary bypass time was 122 ± 31 minutes. Nine (13%) HTx patients were diagnosed with PGD-LV post-HTx, 8 with biventricular dysfunction. Four patients died, 2 with PGD-LV (22%) and 2 without PGD (4%). Mean HS-TNT before HTx was 158 ± 565 ng/L, and post-HTx was 1621 ± 1269 ng/L. The area under the curve (receiver-operator characteristic) of HS-TNT to detect patients at risk of PGD-LV was 0.860 (P < .003). A cutoff value of HS-TNT >2000 ng/L had a sensitivity of 75% and specificity of 87% to identify patients at risk of PGD-LV. Multivariate analysis identified HS-TNT >2000 ng/L (P < .02) and coronary bypass time (P < .01) as independent predictors of PGD-LV. HS-TNT >2000 ng/L at intensive care admission after HTx and prolonged coronary bypass time were the most powerful predictors of PGD-LV. HS-TNT may be helpful for early detection of HTx patients at risk of PGD-LV. Copyright © 2016 Elsevier Inc. All rights reserved.
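The receiver-operator characteristic analysis reported here is straightforward to reproduce on one's own data. The sketch below uses scikit-learn on simulated HS-TNT values (the group sizes mirror the abstract, but the values themselves are made up) and then evaluates the fixed 2000 ng/L cutoff.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Simulated HS-TNT values (ng/L): drawn higher on average when PGD-LV develops.
pgd_lv = np.r_[np.ones(9), np.zeros(62)].astype(int)
hs_tnt = np.where(pgd_lv == 1,
                  rng.normal(3000, 900, size=71),
                  rng.normal(1400, 700, size=71)).clip(min=50)

print("AUC:", roc_auc_score(pgd_lv, hs_tnt))

# Test characteristics of the fixed 2000 ng/L cutoff.
predicted = (hs_tnt > 2000).astype(int)
sensitivity = (predicted[pgd_lv == 1] == 1).mean()
specificity = (predicted[pgd_lv == 0] == 0).mean()
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
```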
Alvarez, Antonio; Moreno, Paula; Illana, Jennifer; Espinosa, Dionisio; Baamonde, Carlos; Arango, Elisabet; Algar, Francisco Javier; Salvatierra, Angel
2013-01-01
OBJECTIVES In current practice, donors and recipients are not matched for gender in lung transplantation. However, some data have suggested a possible effect of gender combinations on lung transplant outcomes. We examined whether donor–recipient (D/R) gender mismatch is related to adverse outcomes after lung transplantation in terms of early and long-term graft function and survival. METHODS We reviewed 256 donors and lung transplant recipients over a 14-year period. Patients were distributed into four groups: Group A (D/R: female/female), Group B (D/R: male/male), Group C (D/R: female/male), Group D (D/R: male/female). Donor and recipient variables were compared among groups, including early graft function, 30-day mortality, freedom from bronchiolitis obliterans syndrome (BOS), and long-term survival. RESULTS Group A: 57 (22%), Group B: 99 (39%), Group C: 62 (24%), Group D: 38 (15%) transplants (P = 0.001). Donor age was 29 ± 14, 27 ± 12, 33 ± 13 and 23 ± 12 years for Groups A, B, C and D, respectively (P = 0.004). Recipient age was 31 ± 15, 44 ± 17, 42 ± 16 and 30 ± 16 years for Groups A, B, C and D, respectively (P < 0.001). PaO2/FiO2 (mmHg) 24 h post-transplant was: Group A: 276 ± 144, Group B: 297 ± 131, Group C: 344 ± 133 and Group D: 238 ± 138 (P = 0.015). Primary graft dysfunction developed in 23, 14, 17 and 21% of recipients from Groups A, B, C and D, respectively (P = 0.45). Operative mortality was 4.4, 6.5, 5.2 and 2%, for recipients from Groups A, B, C and D, respectively (P = 0.66). Freedom from BOS was 73, 59 and 36% for gender-matched transplants vs 76, 67 and 40% for gender-mismatched transplants at 3, 5 and 10 years, respectively (P = 0.618), without differences among groups. A non-significant survival benefit was observed for female recipients, irrespective of the donor gender. CONCLUSIONS Donor–recipient gender mismatch does not have a negative impact on early graft function and mortality following lung transplantation. There is a trend towards a survival benefit for female recipients, irrespective of the donor gender. PMID:23322094
Szylińska, Aleksandra; Listewnik, Mariusz J; Rotter, Iwona; Rył, Aleksandra; Biskupski, Andrzej; Brykczyński, Mirosław
2017-01-01
Background Preoperative spirometry provides measurable information about the occurrence of respiratory disorders. The aim of this study was to assess the association between preoperative spirometry abnormalities and the intensification of early inflammatory responses in patients following coronary artery bypass graft in extracorporeal circulation. Material and methods The study involved 810 patients (625 men and 185 women) aged 65.4±7.9 years who were awaiting isolated coronary artery bypass surgery. On the basis of spirometry performed on the day of admittance to the hospital, the patients were divided into three groups. Patients without respiratory problems constituted 78.8% of the entire group. Restricted breathing was revealed by spirometry in 14.9% and obstructive breathing in 6.3% of patients. Results Inter-group analysis showed statistically significant differences in C-reactive protein (CRP) between patients with restrictive spirometry abnormalities and patients without any pulmonary dysfunction. CRP concentrations differed before surgery (P=0.006) and on the second (P<0.001), fourth (P=0.005) and sixth days after surgery (P=0.029). There was a negative correlation between CRP levels and FEV1. Conclusion In our study, the most common pulmonary disorders in the coronary artery bypass graft patients were restrictive. Patients with abnormal spirometry results from restrictive respiratory disorders have an elevated level of generalized inflammatory response both before and after the isolated coronary artery bypass surgery. Therefore, this group of patients should be given special postoperative monitoring and, in particular, intensive respiratory rehabilitation immediately after reconstitution. PMID:28769557
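As a rough illustration of how the spirometric grouping and the CRP-FEV1 correlation in this study could be computed, the sketch below uses conventional cut-offs (FEV1/FVC < 0.70 for an obstructive pattern, FVC < 80% of predicted for a restrictive pattern) and scipy's Pearson correlation. The cut-offs and the paired CRP/FEV1 values are illustrative assumptions; the abstract does not report the exact criteria used.

```python
from scipy.stats import pearsonr


def classify_spirometry(fev1_fvc_ratio: float, fvc_percent_predicted: float) -> str:
    """Crude spirometric pattern classification.

    Thresholds are conventional assumptions (FEV1/FVC < 0.70 for obstruction,
    FVC < 80% predicted for restriction), not the study's stated criteria.
    """
    if fev1_fvc_ratio < 0.70:
        return "obstructive"
    if fvc_percent_predicted < 80:
        return "restrictive"
    return "normal"


# Hypothetical paired CRP (mg/L) and FEV1 (L) measurements to illustrate the
# negative correlation reported in the abstract.
crp = [12.0, 3.5, 8.1, 20.4, 5.0, 15.2]
fev1 = [1.8, 2.9, 2.3, 1.5, 2.7, 1.9]
r, p = pearsonr(crp, fev1)
print(classify_spirometry(0.75, 72), f"r={r:.2f}, p={p:.3f}")
```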
Cephalic Arch Stenosis in Autogenous Haemodialysis Fistulas: Treatment With the Viabahn Stent-Graft
Shawyer, Andrew; Fotiadis, Nicos I.; Namagondlu, Girish
2013-02-15
Cephalic arch stenosis (CAS) is an important and common cause of dysfunction in autogenous haemodialysis fistulas that requires multiple reinterventions and aggressive surveillance. We evaluated the safety and efficacy of the Viabahn stent-graft for the management of CAS. Between April 2005 and October 2011, 11 consecutive patients [four men and seven women (mean age 56.7 years)] with CAS and dysfunctional fistulas were treated with insertion of 11 Viabahn stent-grafts. Six stent-grafts were inserted due to residual stenosis after angioplasty and five for fistuloplasty-induced rupture. No patient was lost to follow-up. The technical and clinical success rate was 100%. Primary access patency rates were 81.8% [95% confidence interval (CI) 0.482-0.977] at 6 months and 72.7% (95% CI 0.390-0.939) at 12 months. Secondary access patency rates were 90.9% at 6 months (95% CI 0.587-0.997). There were no procedure-related complications. Mean follow-up was 543.8 days (range 156-2,282). The use of the Viabahn stent-graft in the management of CAS is technically feasible and, in this small series, showed patency rates that compare favorably with historical data of angioplasty and bare stents.
Immune Dysfunctions and Abrogation of the Inflammatory Response by Environmental Chemicals.
1981-08-01
...vitro effects in cats, mice, and humans. Materials and Methods: Animals: Mice: Young adult Swiss outbred mice and C57BL/6 (for skin graft donors) were ... was used to determine significant differences between control (untreated) and MNU-treated cell cultures or animal groups. Results: Skin grafts: The ... MNU-treated mice showed a dose-related increase in skin graft retention time, which was significant at 25 and 50 mg/kg MNU (p<.001). This is compared ...
Grenzi, Patricia C; Campos, Érika F; Silva, Hélio T; Felipe, Claudia R; Franco, Marcelo F; Soares, Maria F; Medina-Pestana, José O; Gerbase-DeLima, Maria
2015-03-01
Several studies have shown association of high pre- or post-transplant levels of soluble CD30 (sCD30) with acute rejection and poor late kidney transplant outcome. Our goal was to investigate whether sCD30 levels at month-3 post-transplant are associated with subclinical rejection, presence of CD30(+) cells within the graft, and expression of immune response genes in peripheral blood mononuclear cells. The study comprised 118 adult first kidney graft recipients, transplanted at a single center, receiving tacrolimus in low concentration. All were submitted to a protocol biopsy at month-3. Subclinical rejection was identified in 10 biopsies and sCD30 levels ≥ 61.88 ng/mL (P = 0.004), younger recipient age (P = 0.030) and non-Caucasian ethnicity (P = 0.011) were independently associated with this outcome. Rare CD30(+) cells were present in only two biopsies. There was a correlation between sCD30 levels and CD30 gene expression in peripheral blood mononuclear cells (r = 0.385, P = 0.043). These results show that high sCD30 levels are independent predictors of graft dysfunction and may contribute to patient selection protocols by indicating those who could benefit from a more thorough evaluation. Copyright © 2015 Elsevier B.V. All rights reserved.
Moreno, Paula; Alvarez, Antonio; Illana, Jennifer; Espinosa, Dionisio; Baamonde, Carlos; Cerezo, Francisco; Algar, Francisco Javier; Salvatierra, Angel
2013-06-01
To determine whether lung retrieval from traumatic donors performed within 24 h of brain death has a negative impact on early graft function and survival after lung transplantation (LT), when compared with those retrieved after 24 h. Review of lung transplants performed from traumatic donors over a 17-year period. Recipients were distributed into two groups: transplants from traumatic donor lungs retrieved within 24 h of brain death (Group A), and transplants from traumatic donor lungs retrieved after 24 h of brain death (Group B). Demographic data of donors and recipients, early graft function, perioperative complications and mortality were compared between both groups. Among 356 lung transplants performed at our institution, 132 were from traumatic donors (70% male, 30% female). Group A: 73 (55%); Group B: 59 (45%). There were 53 single, 77 double, and 2 combined LT. Indications were emphysema in 41 (31%), pulmonary fibrosis in 31 (23%), cystic fibrosis in 38 (29%), bronchiectasis in 9 (7%) and other indications in 13 patients (10%). Donor and recipient demographic data, need for cardiopulmonary bypass, postoperative complications and Intensive Care Unit and hospital stay did not differ between groups. Primary graft dysfunction (A vs B): 9 (16%) vs 13 (26%) (P = 0.17). PaO2/FiO2 24 h post-transplant (A vs B): 303 mmHg vs 288 mmHg (P = 0.57). Number of acute rejection episodes (A vs B): 0.93 vs 1.49 (P = 0.01). Postoperative intubation time (A vs B): 99 vs 100 h (P = 0.99). 30-day mortality (A vs B): 7 (10%) vs 2 (3.5%) (P = 0.13). Freedom from bronchiolitis obliterans syndrome (A vs B): 82, 72, 37, 22 vs 78, 68, 42, 15%, at 3, 5, 10 and 15 years, respectively (P = 0.889). Survival (A vs B): 65, 54, 46, 42 and 27 vs 60, 50, 45, 43 and 29% at 3, 5, 7, 10 and 15 years, respectively (P = 0.937). In our experience, early lung retrieval after brain death from traumatic donors does not adversely affect early and long-term outcomes after LT.
De Carlis, Riccardo; Di Sandro, Stefano; Lauterio, Andrea; Ferla, Fabio; Dell'Acqua, Antonio; Zanierato, Marinella; De Carlis, Luciano
2017-02-01
The role of donation after cardiac death (DCD) in expanding the donor pool is mainly limited by the incidence of primary nonfunction (PNF) and ischemia-related complications. Even greater concern exists toward uncontrolled DCD, which represents the largest potential pool of DCD donors. We recently started the first Italian series of DCD liver transplantation, using normothermic regional perfusion (NRP) in 6 uncontrolled donors and in 1 controlled case to deal with the legally required no-touch period of 20 minutes. We examined our first 7 cases for the incidence of PNF, early graft dysfunction, and biliary complications. Acceptance of the graft was based on the trend of serum transaminase and lactate during NRP, the macroscopic appearance, and the liver biopsy. Hypothermic machine perfusion (HMP) was associated in selected cases to improve cold storage. Most notably, no cases of PNF were observed. Median posttransplant transaminase peak was 1014 IU/L (range, 393-3268 IU/L). Patient and graft survival were both 100% after a mean follow-up of 6.1 months (range, 3-9 months). No cases of ischemic cholangiopathy occurred during the follow-up. Only 1 anastomotic stricture completely resolved with endoscopic stenting. In conclusion, DCD liver transplantation is feasible in Italy despite the protracted no-touch period. The use of NRP and HMP seems to earn good graft function and proves safe in these organs. Liver Transplantation 23 166-173 2017 AASLD. © 2016 by the American Association for the Study of Liver Diseases.
Single-Center Experience Using Marginal Liver Grafts in Korea.
Park, P-J; Yu, Y-D; Yoon, Y-I; Kim, S-R; Kim, D-S
2018-05-01
Liver transplantation (LT) is an established therapeutic modality for patients with end-stage liver disease. The use of marginal donors has become more common worldwide due to the sharp increase in recipients, with a consequent shortage of suitable organs. We analyzed our single-center experience over the last 8 years in LT to evaluate the outcomes of using so-called "marginal donors." We retrospectively analyzed the database of all LTs performed at our institution from 2009 to 2017. Only patients undergoing deceased-donor LTs were analyzed. Marginal grafts were defined as livers from donors >60 years of age, livers from donors with serum sodium levels >155 mEq, graft steatosis >30%, livers with cold ischemia time ≥12 hours, livers from donors who were hepatitis B or C virus positive, livers recovered from donation after cardiac death, and livers split between 2 recipients. Patients receiving marginal grafts (marginal group) were compared with patients receiving standard grafts (standard group). A total of 106 patients underwent deceased-donor LT. There were 55 patients in the standard group and 51 patients in the marginal group. There were no significant differences in terms of age, sex, Model for End-Stage Liver Disease score, underlying liver disease, presence of hepatocellular carcinoma, and hospital stay between the 2 groups. Although the incidence of acute cellular rejection, cytomegalovirus infection, and postoperative complications was similar between the 2 groups, the incidence of early allograft dysfunction was higher in the marginal group. With a median follow-up of 26 months, the 1-, 3-, and 5-year overall and graft (death-censored) survivals in the marginal group were 85.5%, 75%, and 69.2% and 85.9%, 83.6%, and 77.2%, respectively. Patient overall survival and graft survival (death-censored) were significantly lower in the marginal group (P = .023 and P = .048, respectively). On multivariate analysis, receiving a marginal graft (hazard ratio [HR], 4.862 [95% confidence interval (CI), 1.233-19.171]; P = .024) and occurrence of postoperative complications (HR, 4.547 [95% CI, 1.279-16.168]; P = .019) were significantly associated with worse patient overall survival. Also, when factors associated with marginal graft were analyzed separately, graft steatosis >30% was independently associated with survival (HR, 5.947 [95% CI, 1.481-23.886]; P = .012). Patients receiving marginal grafts showed lower but acceptable overall survival and graft survival. However, because graft steatosis >30% was independently associated with worse survival, caution must be exercised when using this type of marginal graft by weighing the risk and benefits. Copyright © 2018 Elsevier Inc. All rights reserved.
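Survival comparisons of this kind (marginal versus standard grafts) are typically made with Kaplan-Meier curves and a log-rank test. The sketch below uses the lifelines package on simulated follow-up data with the group sizes from the abstract; the event rates and follow-up times are invented so the example runs, and lifelines itself is an assumed tooling choice rather than the study's software.

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(1)

# Simulated follow-up (months) and death indicators; illustrative only.
marginal = pd.DataFrame({"time": rng.exponential(90, 51).clip(max=96),
                         "event": rng.binomial(1, 0.30, 51)})
standard = pd.DataFrame({"time": rng.exponential(140, 55).clip(max=96),
                         "event": rng.binomial(1, 0.15, 55)})

kmf = KaplanMeierFitter()
for label, grp in [("marginal", marginal), ("standard", standard)]:
    kmf.fit(grp["time"], grp["event"], label=label)
    print(label, "36-month survival estimate:", float(kmf.predict(36)))

result = logrank_test(marginal["time"], standard["time"],
                      event_observed_A=marginal["event"],
                      event_observed_B=standard["event"])
print("log-rank p =", result.p_value)
```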
Solgi, Ghasem; Furst, Daniel; Mytilineos, Joannis; Pourmand, Gholamreza; Amirzargar, Ali Akbar
2012-03-01
This retrospective study aims to determine the prognostic values of HLA and MICA antibodies, serum levels of sCD30 and soluble form of MHC class I related chain A (sMICA) in kidney allograft recipients. Serum samples of 40 living unrelated donor kidney recipients were tested by ELISA and Flow beads techniques for the presence of anti-HLA and MICA antibodies and the contents of sCD30 and sMICA. HLA and MICA antibody specification was performed by LABScreen single antigen beads to determine whether the antibodies were directed against donor mismatches. Within the first year postoperatively, 9 of 40 patients (22.5%) had acute rejection episodes (ARE), and four of them lost their grafts, compared with the 31 patients with functioning transplants (P=0.001). The presence of HLA antibodies before and after transplantation was significantly associated with ARE (P=0.01 and P=0.02 respectively). Sensitization to HLA class II antigens pre-transplant was strongly associated with a higher incidence of ARE (P=0.004). A significant correlation was found between ARE and appearance of non-donor specific antibodies (P=0.02). HLA antibody positive patients either before or after transplantation showed lower graft survival rates than those without antibodies during three years of follow-up (P=0.04 and P=0.02). Anti-MICA antibodies were observed in 8/40 (20%) and 5/40 (12.5%) of patients pre- and post-transplant, respectively. Coexistence of HLA and MICA antibodies was shown in 2 of 4 cases with graft loss. A significantly increased level of sCD30 at day 14 (P=0.001) and non-significantly decreased levels of sMICA pre- and postoperatively were detected in rejecting transplants compared with the functioning-graft group. Our findings support the view that monitoring of HLA and MICA antibodies as well as sCD30 levels early after transplant has predictive value for early and late allograft dysfunction, and that the presence of these factors is detrimental to graft function and survival. Copyright © 2012 Elsevier B.V. All rights reserved.
Metabolic syndrome, insulin resistance, and chronic allograft dysfunction.
Porrini, Esteban; Delgado, Patricia; Torres, Armando
2010-12-01
Metabolic syndrome (MS) is a cluster of cardiovascular (CV) risk factors (hypertension, dyslipidemia, obesity, and glucose homeostasis alterations), and insulin resistance (IR) is suggested to be a common pathogenic background. In the general population, MS and IR have been proven to be risk factors for diabetes, CV disease, and chronic kidney disease. In the renal transplant setting, few studies have analyzed the relevance of MS and IR. According to the few data available, the prevalence of MS in renal transplant patients has been described as 22.6% at 12 months, 37.7% at 36 months, and 64% at 6 years after transplantation. Importantly, MS has been shown to be an independent risk factor for chronic allograft dysfunction (CAD), graft failure, new-onset diabetes, and CV disease. Also, persistent hyperinsulinemia during the first posttransplant year has been related to an increase in glomerular filtration rate, probably reflecting glomerular hyperfiltration as observed in prediabetes and early type 2 diabetes. Importantly, prediabetes (impaired fasting glucose and impaired glucose tolerance), a state hallmarked by IR, proved to be highly frequent among stable renal transplant recipients (30%), which is nearly three times its incidence in the general population. Posttransplant IR has been associated with subclinical atheromatosis as assessed by carotid intima-media thickness, and with chronic subclinical inflammation. In conclusion, MS and IR are important modifiable risk factors in renal transplant recipients, and prompt interventions to avoid its deleterious effects at the metabolic, CV, and graft function levels are needed.
Lung donor treatment protocol in brain dead-donors: A multicenter study.
Miñambres, Eduardo; Pérez-Villares, Jose Miguel; Chico-Fernández, Mario; Zabalegui, Arturo; Dueñas-Jurado, Jose María; Misis, Maite; Mosteiro, Fernando; Rodriguez-Caravaca, Gil; Coll, Elisabeth
2015-06-01
The shortage of lung donors for transplantation is the main limitation among patients awaiting this type of surgery. We previously demonstrated that an intensive lung donor-treatment protocol succeeded in increasing the lung procurement rate. We aimed to validate our protocol for centers with or without lung transplant programs. A quasi-experimental study was performed to compare lung donor rate before (historical group, 2010 to 2012) and after (prospective group, 2013) the application of a lung management protocol for donors after brain death (DBDs) in six Spanish hospitals. Lung donor selection criteria remained unchanged in both periods. Outcome measures for lung recipients were early survival and primary graft dysfunction (PGD) rates. A total of 618 DBDs were included: 453 in the control period and 165 in the protocol period. Donor baseline characteristics were similar in both periods. Lung donation rate in the prospective group was 27.3%, more than twice that of the historical group (13%; p < 0.001). The number of lungs retrieved, grafts transplanted, and transplants performed more than doubled over the study period. No differences in early recipients' survival between groups were observed (87.6% vs. 84.5%; p = 0.733) nor in the rate of PGD. Implementing our intensive lung donor-treatment protocol increases lung procurement rates. This allows more lung transplants to be performed without detriment to either early survival or PGD rate. Copyright © 2015 International Society for Heart and Lung Transplantation. Published by Elsevier Inc. All rights reserved.
Ius, Fabio; Verboom, Murielle; Sommer, Wiebke; Poyanmehr, Reza; Knoefel, Ann-Kathrin; Salman, Jawad; Kuehn, Christian; Avsar, Murat; Siemeni, Thierry; Erdfelder, Caroline; Hallensleben, Michael; Boethig, Dietmar; Schwerk, Nicolaus; Mueller, Carsten; Welte, Tobias; Falk, Christine; Haverich, Axel; Tudorache, Igor; Warnecke, Gregor
2018-05-02
This retrospective study presents our 4-year experience of preemptive treatment of early anti-HLA donor specific antibodies with IgA- and IgM-enriched immunoglobulins. We compared outcomes between patients with antibodies and treatment (case patients) and patients without antibodies (control patients). Records of patients transplanted at our institution between 03/2013 and 11/2017 were reviewed. The treatment protocol included one single 2g/kg immunoglobulin infusion followed by successive 0.5g/kg infusions for a maximum of 6 months, usually combined with a single dose of anti-CD20 antibody and, in case of clinical rejection or positive crossmatch, with plasmapheresis or immunoabsorption. Among the 598 transplanted patients, 128 (21%) patients formed the case group and 452 (76%) the control group. In 116 (91%) patients who completed treatment, 106 (91%) showed no antibodies at treatment end. Fourteen (13%) patients showed antibody recurrence thereafter. In case vs. control patients and at 4-year follow-up, respectively, graft survival (%) was 79 vs. 81 (p=0.59), freedom (%) from biopsy-confirmed rejection 57 vs. 53 (p=0.34) and from chronic lung allograft dysfunction 82 vs. 78 (p=0.83). After lung transplantation, patients with early donor specific antibodies and treated with IgA- and IgM-enriched immunoglobulins had 4-year graft survival similar to patients without antibodies and showed high antibody clearance. This article is protected by copyright. All rights reserved.
Tang, Yunhua; Han, Ming; Chen, Maogen; Wang, Xiaoping; Ji, Fei; Zhao, Qiang; Zhang, Zhiheng; Ju, Weiqiang; Wang, Dongping; Guo, Zhiyong; He, Xiaoshun
2017-11-01
Transplantation centers have given much attention to donor availability. However, no reliable quantitative methods have been employed to accurately assess graft quality before transplantation. Here, we report that the indocyanine green (ICG) clearance test is a valuable index of liver graft quality. We performed the ICG clearance test on 90 brain-dead donors within 6 h before organ procurement between March 2015 and November 2016, and we analyzed the relationship between graft liver function and early graft survival after liver transplantation (LT). Our results suggest that the donor ICG retention rate at 15 min (ICGR15) before procurement was independently associated with 3-month graft survival after LT. The best donor ICGR15 cutoff value was 11.0%/min, and we observed a significant increase in 3-month graft failure among patients with a donor ICGR15 above this value. Conversely, a donor ICGR15 of ≤ 11.0%/min could be used as an early index of graft quality, because it provides additional information to the transplant surgeon or organ procurement organization members who must maintain or improve organ function before LT. An ICG clearance test before liver procurement might therefore be an effective quantitative method to predict graft availability and improve early graft prognosis after LT.
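As a rough illustration of how the reported cutoff could be applied, the sketch below flags donor livers by ICGR15. Only the 11.0 %/min threshold comes from the abstract; the function and variable names are hypothetical and not part of the study.

```python
# Illustrative sketch (not from the paper): applying the reported donor ICGR15
# cutoff of 11.0 %/min to flag grafts at higher risk of 3-month failure.
# The threshold comes from the abstract; function and field names are hypothetical.

ICGR15_CUTOFF = 11.0  # %/min, best cutoff reported for 3-month graft survival

def classify_donor_liver(icgr15_percent_per_min: float) -> str:
    """Return a coarse risk label based on the donor ICGR15 value."""
    if icgr15_percent_per_min <= ICGR15_CUTOFF:
        return "acceptable ICG clearance (lower predicted 3-month failure risk)"
    return "impaired ICG clearance (higher predicted 3-month failure risk)"

print(classify_donor_liver(8.5))   # below cutoff
print(classify_donor_liver(14.2))  # above cutoff
```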
Afonso, José Eduardo; Werebe, Eduardo de Campos; Carraro, Rafael Medeiros; Teixeira, Ricardo Henrique de Oliveira Braga; Fernandes, Lucas Matos; Abdalla, Luis Gustavo; Samano, Marcos Naoyuki; Pêgo-Fernandes, Paulo Manuel
2015-01-01
Lung transplantation is a globally accepted treatment for some advanced lung diseases, giving recipients longer survival and better quality of life. Since the first successful transplant in 1983, more than 40 thousand transplants have been performed worldwide. Of these, about seven hundred were in Brazil. However, post-transplant survival remains lower than desired, with a high mortality rate related to primary graft dysfunction, infection, and chronic graft dysfunction, particularly in the form of bronchiolitis obliterans syndrome. New technologies have been developed to improve the various stages of lung transplantation. To increase the supply of lungs, ex vivo lung reconditioning has been used in some countries, including Brazil. For advanced life support in the perioperative period, extracorporeal membrane oxygenation and hemodynamic support equipment have been used as a bridge to transplant in critically ill patients on the waiting list, and to keep patients alive until resolution of primary graft dysfunction after transplantation. Some patients in Brazil who require lung transplantation are never referred to a transplant center, because only seven such centers are active in the country. It is urgent to create new centers capable of performing lung transplantation to give patients with some advanced forms of lung disease a chance to live longer and with better quality of life. PMID:26154550
Ersoy, Zeynep; Kaplan, Serife; Ozdemirkan, Aycan; Torgay, Adnan; Arslan, Gulnaz; Pirat, Arash; Haberal, Mehmet
2017-02-01
To analyze how graft-weight-to-bodyweight ratio in pediatric liver transplant affects intraoperative and early postoperative hemodynamic and metabolic parameters. We reviewed data from 130 children who underwent liver transplant between 2005 and 2015. Recipients were divided into 2 groups: those with a graft weight to body weight ratio > 4% (large for size) and those with a ratio ≤ 4% (normal for size). Data included demographics, preoperative laboratory findings, intraoperative metabolic and hemodynamic parameters, and intensive care follow-up parameters. Patients in the large-graft-for-size group (>4%) received more colloid solution (57.7 ± 20.1 mL/kg vs 45.1 ± 21.9 mL/kg; P = .08) and higher doses of furosemide (0.7 ± 0.6 mg/kg vs 0.4 ± 0.7 mg/kg; P = .018). They had lower mean pH (7.1 ± 0.1 vs 7.2 ± 0.1; P = .004) and PO2 (115.4 ± 44.6 mm Hg vs 147.6 ± 49.3 mm Hg; P = .004) values, higher blood glucose values (352.8 ± 96.9 mg/dL vs 262.8 ± 88.2 mg/dL; P < .001), and lower mean body temperature (34.8 ± 0.7°C vs 35.2 ± 0.6°C; P = .016) during the neohepatic phase. They received more blood transfusions during both the anhepatic (30.3 ± 24.3 mL/kg vs 18.8 ± 21.8 mL/kg; P = .013) and neohepatic (17.7 ± 20.4 mL/kg vs 10.3 ± 15.5 mL/kg; P = .031) phases and more fresh frozen plasma (13.6 ± 17.6 mL/kg vs 6.2 ± 10.2 mL/kg; P = .012) during the neohepatic phase. They also were more likely to be hypotensive (P < .05) and to receive norepinephrine infusion more often (44% vs 22%; P < .05) intraoperatively. More patients in this group were mechanically ventilated in the intensive care unit (56% vs 31%; P = .035). There were no significant differences between the groups in postoperative acute renal dysfunction, graft rejection or loss, infections, length of intensive care stay, and mortality (P > .05). High graft weight-to-body-weight ratio is associated with adverse metabolic and hemodynamic changes during the intraoperative and early postoperative periods. These results emphasize the importance of using an appropriately sized graft in liver transplant.
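For illustration only: a minimal sketch of the size grouping used above, assuming the conventional definition of the graft-weight-to-body-weight ratio (graft weight in grams divided by recipient body weight in grams, expressed as a percentage); the abstract itself states only the 4% threshold, and all names below are hypothetical.

```python
# Minimal sketch, assuming the conventional graft-weight-to-body-weight ratio:
# graft weight (g) divided by recipient body weight (g), expressed as a percent.
# The 4% "large for size" threshold comes from the abstract; names are illustrative.

def graft_weight_ratio(graft_weight_g: float, body_weight_kg: float) -> float:
    """Graft-weight-to-body-weight ratio in percent."""
    return graft_weight_g / (body_weight_kg * 1000.0) * 100.0

def size_group(graft_weight_g: float, body_weight_kg: float) -> str:
    ratio = graft_weight_ratio(graft_weight_g, body_weight_kg)
    return "large for size (>4%)" if ratio > 4.0 else "normal for size (<=4%)"

# Example: a 450 g left lateral segment in a 10 kg child gives a ratio of 4.5%.
print(round(graft_weight_ratio(450, 10), 2), size_group(450, 10))
```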
Implantation of Right Kidneys: Is the Risk of Technical Graft Loss Real?
Khan, Taqi T; Ahmad, Nadeem; Siddique, Kashif; Fourtounas, Konstantinos
2018-05-01
Transplant surgeons prefer the left kidney (LK) because its vein is always of good length and has a thick wall that enables safe suturing. The right renal vein, by contrast, is generally shorter and thinner walled and is well known for the technical difficulty it poses during venous anastomosis, which can result in graft loss. We examined our living donor (LD) and deceased donor (DD) recipient data and compared the incidence of technical graft loss and early graft function in right and left kidneys. A cohort of 58 adult and pediatric recipients received an LD or DD kidney between January 2015 and December 2016. The donor and recipient data were retrieved and retrospectively analyzed. Technical graft loss was defined as graft thrombosis within 7 days after transplant. Right kidneys (RKs) were not a risk factor for technical graft loss, and no graft was lost for technical reasons in either LD or DD transplants. Early graft function in LKs and RKs was also comparable in the LD cohort, and there were no LKs in the DD cohort. Based on our data, the use of RKs was not a risk factor for technical graft loss, and early graft function was comparable to that of LKs.
Predictors of early graft failure after coronary artery bypass grafting for chronic total occlusion.
Oshima, Hideki; Tokuda, Yoshiyuki; Araki, Yoshimori; Ishii, Hideki; Murohara, Toyoaki; Ozaki, Yukio; Usui, Akihiko
2016-07-01
Little is known regarding the transit-time flow measurement (TTFM) variables in grafts anastomosed to chronically totally occluded vessels (CTOs). We aimed to establish TTFM cut-off values for detecting graft failure in bypass grafts anastomosed to chronically totally occluded arteries and to clarify the relationship between early graft failure and the grade of collateral circulation/regional wall motion of the CTO territory. Among 491 patients who underwent isolated coronary artery bypass grafting (CABG) from 2009 to 2015, 196 cases with CTOs underwent postoperative coronary angiography within 1 month after CABG. A total of 241 CTOs in these patients were examined. Thirty-two CTOs (13%) were not bypassed, and 214 conduits were anastomosed to CTOs and underwent intraoperative TTFM. Arterial conduits and saphenous vein grafts (SVGs) were used in 102 and 112 cases, respectively. Among the arterial conduit procedures, 78 involved the left internal thoracic artery (LITA), 10 the right internal thoracic artery (RITA) and 14 the right gastroepiploic artery (rGEA). Any graft showing Fitzgibbon type B or O lesions on angiography was considered a failing graft. The insufficiency rates for LITA, RITA, rGEA and SVG procedures were 5.1, 10, 14.3 and 7.1%, respectively. On TTFM, failing grafts had a significantly lower mean flow (Qmean) and a higher pulsatility index (PI) than patent grafts. Furthermore, akinetic or dyskinetic wall motion in the territory of bypassed CTOs was observed at a significantly higher rate in failing grafts. Multivariable regression and receiver operating characteristic analyses identified the following predictors of early graft failure: a Qmean of < 11.5 ml/min for arterial conduits, and a PI of > 5.85 and akinetic/dyskinetic wall motion in the CTO territory for SVGs. The Rentrop collateral grade was not associated with early graft failure. The Qmean and PI values from TTFM are useful for detecting early graft failure in conduits anastomosed to CTOs. The collateral grade is not associated with graft failure; however, bypass grafting to CTOs with akinetic/dyskinetic wall motion should be considered carefully. © The Author 2016. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
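The sketch below is a hypothetical restatement of the reported predictors as a simple screening rule; the thresholds (Qmean < 11.5 mL/min for arterial conduits, PI > 5.85 or an akinetic/dyskinetic territory for SVGs) are taken from the abstract, while the function and argument names are illustrative.

```python
# Illustrative decision rule (hypothetical code, thresholds taken from the abstract):
# flag a conduit to a CTO as high risk for early failure when
#   - arterial conduit: mean flow (Qmean) < 11.5 mL/min, or
#   - saphenous vein graft: pulsatility index (PI) > 5.85 or akinetic/dyskinetic
#     wall motion in the CTO territory.

def high_risk_graft(conduit: str, qmean_ml_min: float, pi: float,
                    akinetic_or_dyskinetic_territory: bool) -> bool:
    if conduit == "arterial":
        return qmean_ml_min < 11.5
    if conduit == "svg":
        return pi > 5.85 or akinetic_or_dyskinetic_territory
    raise ValueError("conduit must be 'arterial' or 'svg'")

print(high_risk_graft("arterial", qmean_ml_min=9.0, pi=3.2,
                      akinetic_or_dyskinetic_territory=False))  # True
print(high_risk_graft("svg", qmean_ml_min=25.0, pi=4.0,
                      akinetic_or_dyskinetic_territory=False))  # False
```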
[Post-transplantation diabetes mellitus in kidney recipients].
Dubois-Laforgue, Danièle
2017-04-01
Post-transplantation diabetes mellitus is defined as diabetes diagnosed after transplantation. It affects 20 to 30% of kidney transplant recipients, with a high incidence in the first year. Increasing age at transplantation and the rising incidence of obesity may increase its prevalence in the coming years. Post-transplantation diabetes mellitus is associated with poor outcomes, such as mortality, cardiovascular events, and graft dysfunction. Its occurrence is mainly related to immunosuppressive agents, which affect both insulin secretion and insulin sensitivity. Immunosuppressants may act as a direct iatrogenic cause, inducing early and transient diabetes. They may also trigger permanent diabetes early after transplantation, acting as a promoting factor in patients prone to developing type 2 diabetes. Lastly, long after transplantation, they may act as an additional risk factor for type 2 diabetes. The screening, management, and prognosis of these different subtypes of post-transplantation diabetes mellitus differ. Copyright © 2017 Société francophone de néphrologie, dialyse et transplantation. Published by Elsevier Masson SAS. All rights reserved.
Complications of transplantation. Part 1: renal transplants.
Khaja, Minhaj S; Matsumoto, Alan H; Saad, Wael E
2014-10-01
Vascular complications after solid-organ transplantation are not uncommon and may lead to graft dysfunction and ultimately graft loss. A thorough understanding of the surgical anatomy, etiologies, and types of vascular complications, their presentation, and the options for management are important for managing these complex patients. This article reviews the basic surgical anatomy, vascular complications, and endovascular management options of vascular complications in patients with renal transplants.
Influence of Diabetes on Long-Term Coronary Artery Bypass Graft Patency.
Raza, Sajjad; Blackstone, Eugene H; Houghtaling, Penny L; Rajeswaran, Jeevanantham; Riaz, Haris; Bakaeen, Faisal G; Lincoff, A Michael; Sabik, Joseph F
2017-08-01
Nearly 50% of patients undergoing coronary artery bypass grafting have diabetes. However, little is known about the influence of diabetes on long-term patency of bypass grafts. Because patients with diabetes have more severe coronary artery stenosis, we hypothesized that graft patency is worse in patients with than without diabetes. This study sought to examine the influence of diabetes on long-term patency of bypass grafts. From 1972 to 2011, 57,961 patients underwent primary isolated coronary artery bypass grafting. Of these, 1,372 pharmacologically treated patients with diabetes and 10,147 patients without diabetes had 15,887 postoperative angiograms; stenosis was quantified for 7,903 internal thoracic artery (ITA) grafts and 20,066 saphenous vein grafts. Status of graft patency across time was analyzed by longitudinal nonlinear mixed-effects modeling. ITA graft patency was stable over time and similar in patients with and without diabetes: at 1, 5, 10, and 20 years, 97%, 97%, 96%, and 96% in patients with diabetes, and 96%, 96%, 95%, and 93% in patients without diabetes, respectively (early p = 0.20; late p = 0.30). In contrast, saphenous vein graft patency declined over time and similarly in patients with and without diabetes: at 1, 5, 10, and 20 years, 78%, 70%, 57%, and 42% in patients with diabetes, and 82%, 72%, 58%, and 41% in patients without diabetes, respectively (early p < 0.002; late p = 0.60). After adjusting for patient characteristics, diabetes was associated with higher early patency of ITA grafts (odds ratio: 0.63; 95% confidence limits: 0.43 to 0.91; p = 0.013), but late patency of ITA grafts was similar in patients with and without diabetes (p = 0.80). Early and late patency of saphenous vein grafts were similar in patients with and without diabetes (early p = 0.90; late p = 0.80). Contrary to our hypothesis, diabetes did not influence long-term patency of bypass grafts. Use of ITA grafts should be maximized in patients undergoing coronary artery bypass grafting because they have excellent patency in patients with and without diabetes even after 20 years. Copyright © 2017 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
Andrade, Wagner de Castro; Velhote, Manoel Carlos Prieto; Ayoub, Ali Ahman; Silva, Marcos Marques; Gibelli, Nelson Elias M; Tannuri, Ana Cristina A; Santos, Maria Merces; Pinho-Apezzato, Maria Lucia; de Barros, Fabio; Moreira, Daniel Rangel; Miyatani, Helena T; Pereira, Raimundo Renato; Tannuri, Uenis
2014-04-01
Living donor liver transplantation has become a cornerstone for the treatment of children with end-stage hepatic dysfunction, especially within populations or countries with low rates of organ utilization from deceased donors. Our objective is to report our experience with 185 living donors operated on by a team of pediatric surgeons in a tertiary center for pediatric liver transplantation. We retrospectively analyzed the medical records of hepatic graft donors who underwent surgery between June 1998 and March 2013. Over the last 14 years, 185 liver transplants were performed in pediatric recipients of grafts from living donors. Among the donors, 166 left lateral segments (89.7%), 18 left lobes without the caudate lobe (9.7%) and 1 right lobe (0.5%) were harvested. Donor age ranged from 16 to 53 years, and weight ranged from 47 to 106 kg. In 10 donors, an additional graft of the donor inferior mesenteric vein was harvested to substitute for a hypoplastic recipient portal vein. Transfusion of blood products was required in 15 donors (8.1%). The mean hospital stay was 5 days. No deaths occurred, but complications were identified in 23 patients (12.4%): 9 patients experienced abdominal pain and severe gastrointestinal symptoms, and 3 patients required reoperations. Eight donors presented with minor bile leaks that were treated conservatively, and 3 patients developed extra-peritoneal infections (1 wound collection, 1 phlebitis and 1 pneumonia). Eight grafts (4.3%) showed primary dysfunction resulting in recipient death (3 cases of fulminant hepatitis, 1 patient with metabolic disease, 1 patient with Alagille syndrome and 3 cases of biliary atresia in infants under 1 year old). There was no relation between donor complications and primary graft dysfunction (P=0.6). Living donor transplantation is safe for the donor and carries low morbidity. The donor surgery may be performed by a team of trained pediatric surgeons. © 2014.
Zhang, Tianhua; Jiang, Weiliang; Lu, Haitao; Liu, Jianfeng
2016-04-01
The present study retrospectively reviewed and evaluated the effectiveness of thoracic endovascular aortic repair (TEVAR) combined with adjunctive techniques and devices for the treatment of acute complicated Stanford type B aortic dissections involving the aortic arch. Fifty-six patients with acute complicated Stanford type B aortic dissection involving the aortic arch were treated with TEVAR combined with a hybrid procedure, the chimney-graft technique, or branched stent grafts from January 2009 to March 2014. Seventeen patients underwent TEVAR combined with the hybrid technique; technical success was achieved in 94.1% of these cases, with an early mortality of 5.8%. A neurological complication occurred in one patient, who developed paraplegia and recovered completely after lumbar drainage. No cardiocirculatory or pulmonary complications, bypass dysfunction, or severe endoleaks were observed. Thirty patients underwent TEVAR combined with the chimney technique, with a 100% technical success rate. Chimney-stent compression was observed in 1 patient, and another bare stent was deployed inside the first one. Three patients (10%) died during the study period. Immediate postoperative type I endoleak was detected in 4 cases (13.3%). TEVAR assisted by Castor branched aortic stent grafts was successful in all 9 patients in whom it was used. No deaths occurred during the perioperative period or within 30 days after TEVAR, and no serious complications such as stroke, acute myocardial infarction, or arm ischemia occurred. The results indicate that TEVAR combined with the hybrid technique, the chimney technique, or branched stent grafts is a technically feasible and effective treatment for acute complicated Stanford type B aortic dissection involving the aortic arch in this small cohort. Copyright © 2016 Elsevier Inc. All rights reserved.
Schwab, Kristin; Saggar, Rajeev; Duffy, Erin; Elashoff, David; Tseng, Chi-Hong; Weigt, Sam; Charan, Deepshikha; Abtin, Fereidoun; Johannes, Jimmy; Derhovanessian, Ariss; Conklin, Jeffrey; Ghassemi, Kevin; Khanna, Dinesh; Siddiqui, Osama; Ardehali, Abbas; Hunter, Curtis; Kwon, Murray; Biniwale, Reshma; Lo, Michelle; Volkmann, Elizabeth; Torres Barba, David; Belperio, John A.; Mahrer, Thomas; Furst, Daniel E.; Kafaja, Suzanne; Clements, Philip; Shino, Michael; Gregson, Aric; Kubak, Bernard; Lynch, Joseph P.; Ross, David
2016-01-01
Rationale: Consideration of lung transplantation in patients with systemic sclerosis (SSc) remains guarded, often due to the concern for esophageal dysfunction and the associated potential for allograft injury and suboptimal post–lung transplantation outcomes. Objectives: The purpose of this study was to systematically report our single-center experience regarding lung transplantation in the setting of SSc, with a particular focus on esophageal dysfunction. Methods: We retrospectively reviewed all lung transplants at our center from January 1, 2000 through August 31, 2012 (n = 562), comparing the SSc group (n = 35) to the following lung transplant diagnostic subsets: all non-SSc (n = 527), non-SSc diffuse fibrotic lung disease (n = 264), and a non-SSc matched group (n = 109). We evaluated post–lung transplant outcomes, including survival, primary graft dysfunction, acute rejection, bronchiolitis obliterans syndrome, and microbiology of respiratory isolates. In addition, we defined severe esophageal dysfunction using esophageal manometry and esophageal morphometry criteria on the basis of chest computed tomography images. For patients with SSc referred for lung transplant but subsequently denied (n = 36), we queried the reason(s) for denial with respect to the concern for esophageal dysfunction. Measurements and Main Results: The 1-, 3-, and 5-year post–lung transplant survival for SSc was 94, 77, and 70%, respectively, and similar to the other groups. The remaining post–lung transplant outcomes evaluated were also similar between SSc and the other groups. Approximately 60% of the SSc group had severe esophageal dysfunction. Pre–lung transplant chest computed tomography imaging demonstrated significantly abnormal esophageal morphometry for SSc when compared with the matched group. Importantly, esophageal dysfunction was the sole reason for lung transplant denial in a single case. Conclusions: Relative to other lung transplant indications, our SSc group experienced comparable survival, primary graft dysfunction, acute rejection, bronchiolitis obliterans syndrome, and microbiology of respiratory isolates, despite the high prevalence of severe esophageal dysfunction. Esophageal dysfunction rarely precluded active listing for lung transplantation. PMID:27078625
Tokat, Yaman
2016-01-01
In living donor liver transplantation (LDLT), an adequate hepatic venous outflow constitutes one of the basic principles of a technically successful procedure. The issue of whether the anterior sector (AS) of the right lobe (RL) graft should or should not be routinely drained has been controversial. The aim of this 10-year, single-center, retrospective cohort study was to review the evolution of our hepatic venous outflow reconstruction technique in RL grafts and evaluate the impact of routine AS drainage strategy on the outcome. The study group consisted of 582 primary RL LDLT performed between July 2004 and December 2014. The cases were divided into 3 consecutive periods with different AS venous outflow reconstruction techniques, which included middle hepatic vein (MHV) drainage in Era 1 (n=119), a more selective AS drainage with cryopreserved homologous grafts in Era 2 (n=391), and routine segment 5 and/or 8 oriented AS drainage with synthetic grafts in Era 3 (n=72). Intraoperative portal flow measurement with routine splenic artery ligation (SAL) technique (in RL grafts with a portal flow of ≥ 250 mL/min/100 g liver tissue) was added later in Era 3. These 3 groups were compared in terms of recipient and donor demographics, surgical characteristics and short-term outcome. The rate of AS venous drainage varied from 58.8% in Era 1 and 35.0% in Era 2 to 73.6% in Era 3 (P<0.001). Perioperative mortality rate of recipients significantly decreased over the years (15.1% in Era 1 and 8.7% in Era 2 vs. 2.8% in Era 3, P=0.01). After the addition of SAL technique in the 45 cases, there was only 1 graft loss and no perioperative mortality. One-year recipient survival rate was also significantly higher in Era 3 (79.6% in Era 1 and 86.1% in Era 2 vs. 92.1% in Era 3, P=0.002). Routine AS drainage via segment 5 and/or 8 veins using synthetic grafts is a technique to fit all RL grafts in LDLT. Addition of SAL effectively prevents early graft dysfunction and significantly improves the outcome. PMID:27115010
Reeb, Jeremie; Cypel, Marcelo
2016-03-01
Lung transplantation is an established life-saving therapy for patients with end-stage lung disease. Unfortunately, greater success in lung transplantation is hindered by a shortage of lung donors and the relatively poor early-, mid-, and long-term outcomes associated with severe primary graft dysfunction. Ex vivo lung perfusion has emerged as a modern preservation technique that allows for a more accurate lung assessment and improvement in lung quality. This review outlines the: (i) rationale behind the method; (ii) techniques and protocols; (iii) Toronto ex vivo lung perfusion method; (iv) devices available; and (v) clinical experience worldwide. We also highlight the potential of ex vivo lung perfusion in leading a new era of lung preservation. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Saaiq, M.; Zaib, S.; Ahmad, S.
2012-01-01
This is a study of 120 patients of either sex and all ages who had sustained deep burns of up to 40% of the total body surface area. Half the patients underwent early excision and skin autografting (i.e., within 4-7 days of sustaining burn injury) while the rest underwent delayed excision and skin autografting (i.e., within 1-4 weeks post-burn). Significant differences were found in favour of the early excision and grafting group with regard to the various burn management outcome parameters taken into consideration, i.e. culture positivity of wounds, graft take, duration of post-graft hospitalization, and mortality. PMID:23467391
Zurbano, L; Zurbano, F
2017-10-01
Lung transplantation is a therapeutic procedure indicated for terminal, irreversible lung diseases (except lung cancer) that persist despite the best current medical treatment, and it is an emerging procedure in medical care. In this review, we analyze the most frequent complications of lung transplantation related to the graft (rejection and chronic graft dysfunction), to immunosuppression (infections, arterial hypertension, renal dysfunction, and diabetes), and others such as gastrointestinal complications and osteoporosis. The most advisable therapeutic options are also included. Specific mention is made of the reviews and follow-up visits used to monitor the graft and the patient, as well as the lifestyle recommended to improve prognosis and quality of life. We also analyze outcomes in the Spanish and international registries, their historical evolution, and the most frequent causes of death, in order to objectively assess the usefulness of transplantation. Copyright © 2016 Sociedad Española de Médicos de Atención Primaria (SEMERGEN). Published by Elsevier España, S.L.U. All rights reserved.
Nordgaard, Håvard; Swillens, Abigail; Nordhaug, Dag; Kirkeby-Garstad, Idar; Van Loo, Denis; Vitale, Nicola; Segers, Patrick; Haaverstad, Rune; Lovstakken, Lasse
2010-12-01
Competitive flow from native coronary vessels is considered a major factor in the failure of coronary bypass grafts. However, the pathophysiological effects are not fully understood. Low and oscillatory wall shear stress (WSS) is known to induce endothelial dysfunction and vascular disease, such as atherosclerosis and intimal hyperplasia. The aim was to investigate the impact of competitive flow on WSS in mammary artery bypass grafts. Using computational fluid dynamics, WSS was calculated in a left internal mammary artery (LIMA) graft to the left anterior descending artery in a three-dimensional in vivo porcine coronary artery bypass graft model. The following conditions were investigated: high competitive flow (non-significant coronary lesion), partial competitive flow (significant coronary lesion), and no competitive flow (totally occluded coronary vessel). Time-averaged WSS of the LIMA at high, partial, and no competitive flow was 0.3-0.6, 0.6-3.0, and 0.9-3.0 Pa, respectively. Further, oscillatory WSS, quantified as the oscillatory shear index (OSI, where the maximum OSI of 0.5 corresponds to zero net WSS), was 0.15-0.35, <0.05, and <0.05, respectively. Thus, high competitive flow resulted in substantial oscillatory and low WSS. Partial competitive flow resulted in WSS and OSI similar to the no-competitive-flow condition. Graft flow is highly dependent on the degree of competitive flow. High competitive flow was found to produce unfavourable WSS consistent with endothelial dysfunction and subsequent graft narrowing and failure. Partial competitive flow, however, may be better tolerated, as it was found to be similar to the ideal condition of no competitive flow.
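The abstract uses the oscillatory shear index without stating its formula; the block below gives the commonly used definitions of time-averaged WSS and OSI, which are assumed here and may differ in detail from the authors' implementation.

```latex
% Commonly used definitions (not stated explicitly in the abstract):
% time-averaged wall shear stress (TAWSS) and oscillatory shear index (OSI),
% where \vec{\tau}_w(t) is the instantaneous WSS vector over one cardiac cycle of period T.
\[
\mathrm{TAWSS} = \frac{1}{T}\int_0^T \left|\vec{\tau}_w(t)\right|\,dt,
\qquad
\mathrm{OSI} = \frac{1}{2}\left(1 -
  \frac{\left|\frac{1}{T}\int_0^T \vec{\tau}_w(t)\,dt\right|}
       {\frac{1}{T}\int_0^T \left|\vec{\tau}_w(t)\right|\,dt}\right),
\]
% so OSI ranges from 0 (purely unidirectional WSS) to 0.5 (zero net WSS over the cycle),
% consistent with the abstract's note that a maximum OSI of 0.5 equals zero net WSS.
```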
Clinical Courses of Graft Failure Caused by Chronic Allograft Dysfunction in Kidney Transplantation.
Fujiwara, T; Teruta, S; Tsudaka, S; Ota, K; Matsuda, H
Chronic allograft dysfunction (CAD) is a main cause of graft failure in kidney transplantation. We retrospectively analyzed 279 kidney transplant recipients who survived with a functioning graft for at least 2 years. CAD was defined as chronic graft deterioration, excluding other specific causes. We defined the pattern of decline in estimated glomerular filtration rate (eGFR) as follows: (1) "plateau" was a decline in eGFR ≤2 mL/min/1.73 m²/year, and "long plateaus" were those lasting more than 5 years; (2) "rapid decline" was a decrease in eGFR ≥20 mL/min/1.73 m²/year. Patients diagnosed with CAD were categorized according to the occurrence of rapid decline and/or long plateau as follows: group 1, neither rapid decline nor long plateau; group 2, rapid decline only; group 3, long plateau only; and group 4, both rapid decline and long plateau. Of a total of 81 graft losses, 51 (63%) failed because of CAD, at a median of 9.4 years. Sixteen patients belonged to group 1, 14 to group 2, 12 to group 3, and nine to group 4. Mean graft survival times in the four groups were 7.7 ± 1.1, 6.1 ± 3.1, 16.2 ± 2.5, and 10.8 ± 3.6 years, respectively (P < .001). There were significant differences among groups in donor age, year of transplantation, mean eGFR at baseline, and acute rejection rate after transplantation. The results indicate that this cohort of kidney transplant recipients with CAD comprised subgroups with different clinical courses. Copyright © 2016 Elsevier Inc. All rights reserved.
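A minimal sketch of the grouping logic described above, assuming only the rapid-decline and long-plateau flags defined in the abstract; the function and argument names are hypothetical.

```python
# Sketch of the grouping described in the abstract (names are hypothetical):
# "rapid decline" = eGFR loss >= 20 mL/min/1.73 m^2/year at some point,
# "plateau" = decline <= 2 mL/min/1.73 m^2/year, "long plateau" = plateau lasting > 5 years.

def cad_course_group(has_rapid_decline: bool, has_long_plateau: bool) -> int:
    """Return the group number (1-4) used to categorize CAD clinical courses."""
    if has_rapid_decline and has_long_plateau:
        return 4
    if has_long_plateau:
        return 3
    if has_rapid_decline:
        return 2
    return 1

print(cad_course_group(False, False))  # 1: neither pattern
print(cad_course_group(True, True))    # 4: both rapid decline and long plateau
```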
Fox, Amanda A.; Collard, Charles D.; Shernan, Stanton K.; Seidman, Christine E.; Seidman, Jonathan G.; Liu, Kuang-Yu; Muehlschlegel, Jochen D.; Perry, Tjorvi E.; Aranki, Sary F.; Lange, Christoph; Herman, Daniel S.; Meitinger, Thomas; Lichtner, Peter; Body, Simon C.
2009-01-01
Background Ventricular dysfunction (VnD) after primary coronary artery bypass grafting is associated with increased hospital stay and mortality. Natriuretic peptides have compensatory vasodilatory, natriuretic and paracrine influences on myocardial failure and ischemia. We hypothesized that natriuretic peptide system gene variants independently predict risk of VnD after primary coronary artery bypass grafting. Methods 1164 patients undergoing primary coronary artery bypass grafting with cardiopulmonary bypass at two institutions were prospectively enrolled. After prospectively defined exclusions, 697 Caucasian patients (76 with VnD) were analyzed. VnD was defined as need for ≥ 2 new inotropes and/or new mechanical ventricular support after coronary artery bypass grafting. 139 haplotype-tagging SNPs within 7 genes (NPPA; NPPB; NPPC; NPR1; NPR2; NPR3; CORIN) were genotyped. SNPs univariately associated with VnD were entered into logistic regression models adjusting for clinical covariates predictive of VnD. To control for multiple comparisons, permutation analyses were conducted for all SNP associations. Results After adjusting for clinical covariates and multiple comparisons within each gene, seven NPPA/NPPB SNPs (rs632793, rs6668352, rs549596, rs198388, rs198389, rs6676300, rs1009592) were associated with decreased risk of postoperative VnD (additive model; odds ratios 0.44–0.55; P = 0.010–0.036), and four NPR3 SNPs (rs700923, rs16890196, rs765199, rs700926) were associated with increased risk of postoperative VnD (recessive model; odds ratios 3.89–4.28; P = 0.007–0.034). Conclusions Genetic variation within the NPPA/NPPB and NPR3 genes is associated with risk of VnD after primary coronary artery bypass grafting. Knowledge of such genotypic predictors may result in better understanding of the molecular mechanisms underlying postoperative VnD. PMID:19326473
Tran, Kenneth; Ullery, Brant W; Itoga, Nathan; Lee, Jason T
2018-04-01
The objective of this study was to describe the polar orientation of renal chimney grafts within the proximal seal zone and to determine whether graft orientation is associated with early type IA endoleak or renal graft compression after chimney endovascular aneurysm repair (ch-EVAR). Patients who underwent ch-EVAR with at least one renal chimney graft from 2009 to 2015 were included in this analysis. Centerline three-dimensional reconstructions were used to analyze postoperative computed tomography scans. The 12-o'clock polar position was set at the takeoff of the superior mesenteric artery. Relative polar positions of chimney grafts were recorded at the level of the renal artery ostium, at the mid-seal zone, and at the proximal edge of the graft fabric. Early type IA endoleaks were defined as evidence of a perigraft flow channel within the proximal seal zone. There were 62 consecutive patients who underwent ch-EVAR (35 double renal, 27 single renal) for juxtarenal abdominal aortic aneurysms with a mean follow-up of 31.2 months; 18 (29%) early type IA "gutter" endoleaks were identified. During follow-up, the majority of these (n = 13; 72%) resolved without intervention, whereas two patients required reintervention (3.3%). Estimated renal graft patency was 88.9% at 60 months. Left renal chimney grafts were most commonly at the 3-o'clock position (51.1%) at the ostium, traversing posteriorly to the 5- to 7-o'clock positions (55.5%) at the fabric edge. Right renal chimney grafts started most commonly at the 9-o'clock position (n = 17; 33.3%) and tended to traverse both anteriorly (11 to 1 o'clock; 39.2%) and posteriorly (5 to 7 o'clock; 29.4%) at the fabric edge. In the polar plane, the majority of renal chimney grafts (n = 83; 85.6%) traversed <90 degrees before reaching the proximal fabric edge. Grafts that traversed >90 degrees were independently associated with early type IA endoleaks (odds ratio, 11.5; 95% confidence interval, 2.1-64.8) even after controlling for other device and anatomic variables. Polar orientation of the chimney grafts was not associated with graft kinking or compression (P = .38) or occlusion (P = .10). Takeoff angle of the renal arteries was the most significant predictor of chimney graft orientation. Caudally directed arteries (takeoff angle >30 degrees) were less likely to have implanted chimney grafts that traversed >90 degrees in polar angle (odds ratio, 0.09; 95% confidence interval, 0.01-0.55). Renal chimney grafts vary considerably in both starting position and their polar trajectory within the proximal seal zone. Grafts that traverse >90 degrees in polar angle within the seal zone may be at increased risk of early type IA endoleaks and require more frequent imaging surveillance. Caudally directed renal arteries result in a more favorable polar geometry (eg, cranial-caudal orientation) with respect to endoleak risk and thus are more ideal candidates for parallel graft strategies. Copyright © 2017 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
Adil, Eelam; Poe, Dennis
2014-02-01
To present the current medical and surgical treatment options for Eustachian tube dysfunction. Balloon dilation and microdebrider Eustachian tuboplasty are feasible treatment options for patients with refractory dilatory dysfunction as an alternative to tympanostomy tube placement. There is increasing evidence that repair of patulous Eustachian tubes by insertion of a shim or by fat graft reconstruction within the lumen of the Eustachian tube orifice may be effective. In patients with Eustachian tube dysfunction that is refractory to medical management, newer surgical techniques may provide symptomatic relief of reasonable duration. Continued basic science research into the cause of dysfunction and the mechanisms of benefit from intervention, together with long-term clinical outcome data, is necessary.
Hwang, Ho Young; Kim, Jun Sung; Oh, Se Jin; Kim, Ki-Bong
2012-11-01
The Saphenous Vein Versus Right Internal Thoracic Artery as a Y-Composite Graft trial was designed to evaluate the saphenous vein compared with the right internal thoracic artery as a Y-composite graft anastomosed to the side of the left internal thoracic artery. In this early analysis, we compared early angiographic patency rates and clinical outcomes. From September 2008 to October 2011, 224 patients with multivessel coronary artery disease were randomized prospectively to undergo off-pump revascularization using the saphenous vein group (n = 112) or the right internal thoracic artery group (n = 112) as Y-composite grafts. Early postoperative (1.4 ± 1.1 days) angiographic patency and clinical outcomes were compared. There was 1 operative death in the right internal thoracic artery group. No statistically significant differences in postoperative morbidities, including atrial fibrillation and acute renal failure, were observed between the groups. The number of distal anastomoses using the side-arm Y-composite graft (saphenous vein vs right internal thoracic artery) were 2.3 ± 0.8 and 1.9 ± 0.7 in the saphenous vein and right internal thoracic artery groups, respectively (P < .001). A third conduit was used in 44 patients (saphenous vein group vs right internal thoracic artery group, 4/109 vs 40/110; P < .001) to extend the side-arm Y-composite graft for complete revascularization. Early angiography demonstrated an overall patency rate of 99.4% (771 of 776 distal anastomoses). Patency rates of the side-arm Y-composite graft (saphenous vein vs right internal thoracic artery) were 98.8% (245 of 248) and 99.5% (207 of 208) in the saphenous vein and right internal thoracic artery groups, respectively (P = .629). A third conduit was needed to extend the right internal thoracic artery composite graft and reach the target vessels in 36.4% (40/110) of the patients. The saphenous vein composite graft was comparable with the right internal thoracic artery composite graft in terms of early angiographic patency and clinical outcomes. Copyright © 2012 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
Vogrin, M; Rupreht, M; Dinevski, D; Hašpl, M; Kuhta, M; Jevsek, M; Knežević, M; Rožman, P
2010-01-01
Slow graft healing in bone tunnels and a slow graft ligamentization process after anterior cruciate ligament (ACL) reconstruction are some of the reasons for prolonged rehabilitation. The purpose of this study was to determine if the use of platelet gel (PG) accelerates early graft revascularization after ACL reconstruction. PG was produced from autologous platelet-rich plasma and applied locally. We quantitatively evaluated the revascularization process in the osteoligamentous interface zone in the bone tunnels and in the intra-articular part of the graft by means of contrast-enhanced magnetic resonance imaging (MRI). After 4-6 weeks, the PG-treated group demonstrated a significantly higher level of vascularization in the osteoligamentous interface (0.33 ± 0.09) than the control group (0.16 ± 0.09, p < 0.001). In the intra-articular part of the graft, we found no evidence of revascularization in either group. Locally applied PG enhanced early revascularization of the graft in the osteoligamentous interface zone after ACL reconstruction. Copyright © 2010 S. Karger AG, Basel.
Blood Vessels in Allotransplantation.
Abrahimi, P; Liu, R; Pober, J S
2015-07-01
Human vascularized allografts are perfused through blood vessels composed of cells (endothelium, pericytes, and smooth muscle cells) that remain largely of graft origin and are thus subject to host alloimmune responses. Graft vessels must be healthy to maintain homeostatic functions including control of perfusion, maintenance of permselectivity, prevention of thrombosis, and participation in immune surveillance. Vascular cell injury can cause dysfunction that interferes with these processes. Graft vascular cells can be activated by mediators of innate and adaptive immunity to participate in graft inflammation contributing to both ischemia/reperfusion injury and allograft rejection. Different forms of rejection may affect graft vessels in different ways, ranging from thrombosis and neutrophilic inflammation in hyperacute rejection, to endothelialitis/intimal arteritis and fibrinoid necrosis in acute cell-mediated or antibody-mediated rejection, respectively, and to diffuse luminal stenosis in chronic rejection. While some current therapies targeting the host immune system do affect graft vascular cells, direct targeting of the graft vasculature may create new opportunities for preventing allograft injury and loss. © Copyright 2015 The American Society of Transplantation and the American Society of Transplant Surgeons.
Evaluating Renal Transplant Status Using Viscoelastic Response (VisR) Ultrasound.
Hossain, Md Murad; Selzo, Mallory R; Hinson, Robert M; Baggesen, Leslie M; Detwiler, Randal K; Chong, Wui K; Burke, Lauren M; Caughey, Melissa C; Fisher, Melrose W; Whitehead, Sonya B; Gallippi, Caterina M
2018-05-10
Chronic kidney disease is most desirably and cost-effectively treated by renal transplantation, but graft survival is a major challenge. Although irreversible graft damage can be averted by timely treatment, intervention is delayed when early graft dysfunction goes undetected by standard clinical metrics. A more sensitive and specific parameter for delineating graft health could be the viscoelastic properties of the renal parenchyma, which are interrogated non-invasively by Viscoelastic Response (VisR) ultrasound, a new acoustic radiation force (ARF)-based imaging method. Assessing the performance of VisR imaging in delineating histologically confirmed renal transplant pathologies in vivo is the purpose of the study described here. VisR imaging was performed in patients with (n = 19) and without (n = 25) clinical indication for renal allograft biopsy. The median values of VisR outcome metrics (τ, relative elasticity [RE] and relative viscosity [RV]) were calculated in five regions of interest that were manually delineated in the parenchyma (outer, center and inner) and in the pelvis (outer and inner). The ratios of a given VisR metric for all possible region-of-interest combinations were calculated, and the corresponding ratios were statistically compared between biopsied patients subdivided by diagnostic categories versus non-biopsied, control allografts using the two-sample Wilcoxon test (p <0.05). Although τ ratios non-specifically differentiated allografts with vascular disease, tubular/interstitial scarring, chronic allograft nephropathy and glomerulonephritis from non-biopsied control allografts, RE distinguished only allografts with vascular disease and tubular/interstitial scarring, and RV distinguished only vascular disease. These results suggest that allografts with scarring and vascular disease can be identified using non-invasive VisR RE and RV metrics. Copyright © 2018 World Federation for Ultrasound in Medicine and Biology. Published by Elsevier Inc. All rights reserved.
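As a hypothetical illustration of the ratio construction described above, the sketch below forms ratios of one VisR metric across region-of-interest pairs; the region names and values are invented, and the study may have used unordered rather than ordered region pairs.

```python
# Hypothetical sketch of the ratio construction described in the abstract: for one VisR
# metric (e.g., relative elasticity), take the median value in each of the five regions
# of interest and form the ratio for every ordered pair of distinct regions.
from itertools import permutations

regions = {
    "parenchyma_outer": 1.10, "parenchyma_center": 1.05, "parenchyma_inner": 0.98,
    "pelvis_outer": 0.80, "pelvis_inner": 0.75,  # illustrative median RE values only
}

ratios = {
    f"{a}/{b}": regions[a] / regions[b]
    for a, b in permutations(regions, 2)  # all region-of-interest combinations
}

print(len(ratios))                                    # 20 ratios from 5 regions
print(round(ratios["parenchyma_outer/pelvis_inner"], 3))
```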
Oral Disease Profiles in Chronic Graft versus Host Disease
Fassil, H.; Mays, J.W.; Edwards, D.; Baird, K.; Steinberg, S.M.; Cowen, E.W.; Naik, H.; Datiles, M.; Stratton, P.; Gress, R.E.; Pavletic, S.Z.
2015-01-01
At least half of patients with chronic graft-versus-host-disease (cGVHD), the leading cause of morbidity and non-relapse mortality after allogeneic stem cell transplantation, have oral manifestations: mucosal lesions, salivary dysfunction, and limited mouth-opening. cGVHD may manifest in a single organ or affect multiple organ systems, including the mouth, eyes, and the skin. The interrelationship of the 3 oral manifestations of cGVHD with each other and with the specific manifestations of extraoral cGVHD has not been studied. In this analysis, we explored, in a large group of patients with cGVHD, the potential associations between: (1) oral mucosal disease and erythematous skin disease, (2) salivary gland dysfunction and lacrimal gland dysfunction, and (3) limited mouth-opening and sclerotic skin cGVHD. Study participants, enrolled in a cGVHD Natural History Protocol (NCT00331968, n = 212), underwent an oral examination evaluating: (1) mucosal cGVHD [NIH Oral Mucosal Score (OMS)], (2) salivary dysfunction (saliva flow and xerostomia), and (3) maximum mouth-opening measurement. Parameters for dysfunction (OMS > 2, saliva flow ≤ 1 mL/5 min, mouth-opening ≤ 35 mm) were analyzed for association with skin cGVHD involvement (erythema and sclerosis, skin symptoms), lacrimal dysfunction (Schirmer’s tear test, xerophthalmia), Lee cGVHD Symptom Scores, and NIH organ scores. Oral mucosal disease (31% prevalence) was associated with skin erythema (P < 0.001); salivary dysfunction (11% prevalence) was associated with lacrimal dysfunction (P = 0.010) and xerostomia with xerophthalmia (r = 0.32, P = 0.001); and limited mouth-opening (17% prevalence) was associated with skin sclerosis (P = 0.008) and skin symptoms (P = 0.001). There was no association found among these 3 oral cGVHD manifestations. This analysis supports the understanding of oral cGVHD as 3 distinct diseases: mucosal lesions, salivary gland dysfunction, and mouth sclerosis. Clear classification of oral cGVHD as 3 separate manifestations will improve clinical diagnosis, observational research data collection, and the definitions of outcome measures in clinical trials. PMID:25740857
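For illustration, a minimal sketch applying the three dysfunction thresholds listed above (OMS > 2, saliva flow ≤ 1 mL/5 min, mouth-opening ≤ 35 mm); the function and argument names are hypothetical and not from the study.

```python
# Minimal sketch (hypothetical names) applying the dysfunction parameters in the abstract:
# mucosal disease if NIH Oral Mucosal Score (OMS) > 2, salivary dysfunction if saliva
# flow <= 1 mL per 5 minutes, limited mouth-opening if maximum opening <= 35 mm.

def oral_cgvhd_flags(oms_score: float, saliva_ml_per_5min: float,
                     mouth_opening_mm: float) -> dict:
    return {
        "mucosal_disease": oms_score > 2,
        "salivary_dysfunction": saliva_ml_per_5min <= 1.0,
        "limited_mouth_opening": mouth_opening_mm <= 35.0,
    }

print(oral_cgvhd_flags(oms_score=4, saliva_ml_per_5min=0.6, mouth_opening_mm=38))
```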
2013-01-01
Introduction Intraspinal grafting of human neural stem cells represents a promising approach to promote recovery of function after spinal trauma. Such a treatment may serve to: I) provide trophic support to improve survival of host neurons; II) improve the structural integrity of the spinal parenchyma by reducing syringomyelia and scarring in trauma-injured regions; and III) provide neuronal populations to potentially form relays with host axons, segmental interneurons, and/or α-motoneurons. Here we characterized the effect of intraspinal grafting of clinical grade human fetal spinal cord-derived neural stem cells (HSSC) on the recovery of neurological function in a rat model of acute lumbar (L3) compression injury. Methods Three-month-old female Sprague–Dawley rats received L3 spinal compression injury. Three days post-injury, animals were randomized and received intraspinal injections of either HSSC, media-only, or no injections. All animals were immunosuppressed with tacrolimus, mycophenolate mofetil, and methylprednisolone acetate from the day of cell grafting and survived for eight weeks. Motor and sensory dysfunction were periodically assessed using open field locomotion scoring, thermal/tactile pain/escape thresholds and myogenic motor evoked potentials. The presence of spasticity was measured by gastrocnemius muscle resistance and electromyography response during computer-controlled ankle rotation. At the end-point, gait (CatWalk), ladder climbing, and single frame analyses were also assessed. Syrinx size, spinal cord dimensions, and extent of scarring were measured by magnetic resonance imaging. Differentiation and integration of grafted cells in the host tissue were validated with immunofluorescence staining using human-specific antibodies. Results Intraspinal grafting of HSSC led to a progressive and significant improvement in lower extremity paw placement, amelioration of spasticity, and normalization in thermal and tactile pain/escape thresholds at eight weeks post-grafting. No significant differences were detected in other CatWalk parameters, motor evoked potentials, open field locomotor (Basso, Beattie, and Bresnahan locomotion score (BBB)) score or ladder climbing test. Magnetic resonance imaging volume reconstruction and immunofluorescence analysis of grafted cell survival showed near complete injury-cavity-filling by grafted cells and development of putative GABA-ergic synapses between grafted and host neurons. Conclusions Peri-acute intraspinal grafting of HSSC can represent an effective therapy which ameliorates motor and sensory deficits after traumatic spinal cord injury. PMID:23710605
Dengue fever in renal transplant patients: a systematic review of literature.
Weerakkody, Ranga Migara; Patrick, Jean Ansbel; Sheriff, Mohammed Hussain Rezvi
2017-01-13
Dengue fever in renal transplant patients has not been studied well, and we reviewed all the literature on episodes of dengue fever in renal transplant recipients. The aim was to describe clinicopathological characteristics, immunosuppressive protocols, renal outcomes, and mortality. PubMed, LILACS, Google Scholar and ResearchGate were searched for "Dengue" and "Renal/Kidney Transplantation" with no date limits. Hits were analyzed separately by two researchers. Fever, myalgia, arthralgia and headache were significantly less frequent than in the general population, whereas pleural effusions and ascites were observed more often. The incidence of severe dengue was significantly higher among transplant patients in this review, and they also had significantly higher mortality (8.9% vs 3.7%, p = 0.031). Age, time since transplantation, and immunosuppressive profile had no effect on disease severity, mortality, or graft outcome. The presence of new bleeding complications and ascites was associated with more severe disease (p < 0.001 and p = 0.005), death (p = 0.033), or graft loss (p = 0.035). Use of tacrolimus was associated with new bleeding complications (p = 0.027) and with ascites (p = 0.021), but not with thrombocytopenia. Twenty-five percent of patients with primary disease failed to mount an IgG response by 15 weeks of the illness, and 58.9% had graft dysfunction during the illness. Recently transplanted patients were at risk of severe disease and unfavorable outcomes. The physical and laboratory findings of dengue fever in renal transplant patients differ from those in the general population. Some degree of graft dysfunction is common during the illness, but only a minority develop graft failure.
Rodriguez, E. R.; Skojec, Diane V.; Tan, Carmela D.; Zachary, Andrea A.; Kasper, Edward K.; Conte, John V.; Baldwin, William M.
2005-01-01
Antibody-mediated rejection (AMR) in human heart transplantation is an immunopathologic process in which injury to the graft results in part from activation of complement, and it responds poorly to conventional therapy. We evaluated 665 consecutive endomyocardial biopsies from 165 patients by immunofluorescence (IF) for deposits of immunoglobulins and complement. Diffuse IF deposits in a linear capillary pattern greater than 2+ were considered significant. Clinical evidence of graft dysfunction was correlated with complement deposits. IF of 2+ or higher was positive for IgG, 66%; IgM, 12%; IgA, 0.6%; C1q, 1.8%; C4d, 9%; and C3d, 10%. In 3% of patients, concomitant C4d and C3d deposits correlated with graft dysfunction or heart failure. In these 5 patients, AMR occurred 56–163 months after transplantation, and they responded well to therapy for AMR but not to treatment with steroids. Systematic evaluation of endomyocardial biopsies is not improved by the use of antibodies against immunoglobulins or C1q. Concomitant use of C4d and C3d is very useful for diagnosing AMR when correlated with clinical parameters of graft function. AMR in heart transplant patients can occur many months or years after transplant. PMID:16212640
Grafting influences on early acorn production in swamp white oak (Quercus bicolor Willd.)
Mark V. Coggeshall; J.W. Van Sambeek; H.E. Garrett
2008-01-01
Early fruiting of swamp white oak planting stock has been observed. The potential to exploit this trait for wildlife enhancement purposes was evaluated in a grafting study. Scions from both precocious and non-precocious ortets were grafted onto a series of related seedling rootstock sources. Acorn production was recorded through age 4 years. Acorn productivity of the...
Aortic Replacement with Sutureless Intraluminal Grafts
Lemole, Gerald M.
1990-01-01
To avoid the anastomotic complications and long cross-clamp times associated with standard suture repair of aortic lesions, we have implanted sutureless intraluminal grafts in 122 patients since 1976. Forty-nine patients had disorders of the ascending aorta, aortic arch, or both: their operative mortality was 14% (7 patients), and the group's 5-year actuarial survival rate has been 64%. There have been no instances of graft dislodgment, graft infection, aortic bleeding, or pseudoaneurysm formation. Forty-two patients had disorders of the descending aorta and thoracoabdominal aorta: their early mortality was 10% (4 patients), and the group's 5-year actuarial survival rate has been 56%. There was 1 early instance of graft dislodgment, but no pseudoaneurysm formation, graft erosion, aortic bleeding, intravascular hemolysis, or permanent deficits in neurologic, renal, or vascular function. Thirty-one patients had the sutureless intraluminal graft implanted in the abdominal aortic position: their early mortality was 6% (2 patients), and the 5-year actuarial survival rate for this group has been 79%. There were no instances of renal failure, ischemic complication, postoperative paraplegia, pseudoaneurysm, or anastomotic true aneurysm. Our recent efforts have been directed toward developing an adjustable spool that can adapt to the widest aorta or the narrowest aortic arch vessel; but in the meanwhile, the present sutureless graft yields shorter cross-clamp times, fewer intraoperative complications, and both early and late results as satisfactory as those afforded by traditional methods of aortic repair. (Texas Heart Institute Journal 1990; 17:302-9) PMID:15227522
Ding, Ming; Henriksen, Susan S; Martinetti, Roberta; Overgaard, Søren
2017-11-01
Early fixation of total joint arthroplasties is crucial for ensuring implant survival. An alternative bone graft material in revision surgery is needed to replace the current gold standard, allograft, since the latter is associated with several disadvantages. Incubation of such a construct in a perfusion bioreactor has been shown to produce viable bone graft materials. This study aimed at producing larger amounts of viable bone graft material (hydroxyapatite 70% and β-tricalcium-phosphate 30%) in a novel perfusion bioreactor. The ability of the bioreactor-activated graft material to induce early implant fixation was tested in a bilateral implant defect model in sheep, with allograft as the control group. Defects were created bilaterally in the distal femurs of the animals, and titanium implants were inserted. The concentric gaps around the implants were randomly filled with either allograft, granules, granules with bone marrow aspirate, or bioreactor-activated graft material. Following an observation time of 6 weeks, early implant fixation and bone formation were assessed by micro-CT scanning, mechanical testing, and histomorphometry. Bone formation was seen in all groups, while no significant differences between groups were found regarding early implant fixation. The microarchitecture of the bone formed by the synthetic graft materials resembled that of allograft. Histomorphometry revealed that allograft induced significantly more bone and less fibrous tissue (p < 0.05). In conclusion, bone formation was observed in all groups, but the bioreactor-activated graft material did not provide additional benefit for early implant fixation compared with allograft in this model. © 2016 Wiley Periodicals, Inc. J Biomed Mater Res Part B: Appl Biomater, 105B: 2465-2476, 2017.
Golshayan, Déla; Wójtowicz, Agnieszka; Bibert, Stéphanie; Pyndiah, Nitisha; Manuel, Oriol; Binet, Isabelle; Buhler, Leo H; Huynh-Do, Uyen; Mueller, Thomas; Steiger, Jürg; Pascual, Manuel; Meylan, Pascal; Bochud, Pierre-Yves
2016-04-01
There are conflicting data on the role of the lectin pathway of complement activation and its recognition molecules in acute rejection and outcome after transplantation. To help resolve this we analyzed polymorphisms and serum levels of lectin pathway components in 710 consecutive kidney transplant recipients enrolled in the nationwide Swiss Transplant Cohort Study, together with all biopsy-proven rejection episodes and 1-year graft and patient survival. Functional mannose-binding lectin (MBL) levels were determined in serum samples, and previously described MBL2, ficolin 2, and MBL-associated serine protease 2 polymorphisms were genotyped. Low MBL serum levels and deficient MBL2 diplotypes were associated with a higher incidence of acute cellular rejection during the first year, in particular in recipients of deceased-donor kidneys. This association remained significant (hazard ratio 1.75, 95% confidence interval 1.18-2.60) in a Cox regression model after adjustment for relevant covariates. In contrast, there was no significant association with rates of antibody-mediated rejection, patient death, early graft dysfunction or loss. Thus, results in a prospective multicenter contemporary cohort suggest that MBL2 polymorphisms result in low MBL serum levels and are associated with acute cellular rejection after kidney transplantation. Since MBL deficiency is a relatively frequent trait in the normal population, our findings may lead to individual risk stratification and customized immunosuppression. Copyright © 2016 International Society of Nephrology. Published by Elsevier Inc. All rights reserved.
Waterman, Brian R; Arroyo, William; Cotter, Eric J; Zacchilli, Michael A; Garcia, E'Stephan J; Owens, Brett D
2018-03-01
There remains a debate over whether to retain the index anterior cruciate ligament (ACL) graft in the setting of septic arthritis. To evaluate and compare clinical outcomes for the treatment of septic arthritis after ACL reconstruction (ACLR) in those with and without early graft retention. Case series; Level of evidence, 4. The Military Health System was queried for all ACLR procedures performed between 2007 and 2013. Inclusion criteria required active military status, primary ACLR with secondary septic arthritis, and minimum 24-month surveillance. Demographic, clinical, and surgical variables were evaluated using descriptive statistics and regression analysis for factors influencing selected outcomes. Of 9511 ACLR procedures, 31 (0.32%) were identified as having secondary septic arthritis requiring urgent arthroscopic irrigation and debridement and intravenous antibiotics (mean, 6.3 weeks). The majority (62%) were treated in the subacute (2 weeks to 2 months) setting. Index ACLR was performed with a hamstring autograft (n = 17, 55%), soft tissue allograft (n = 11, 35%), and patellar tendon autograft (n = 3, 10%). The graft was retained in 71% (n = 22) of patients, while 29% (n = 9) underwent early graft debridement. At a mean 26.9-month follow-up, 48% of patients (n = 15) had returned to the military. Graft removal was not predictive of return to active duty ( P = .29). The presence of postoperative complications, including symptomatic postinfection arthritis (22.6%) and arthrofibrosis (9.7%), was the only variable predictive of inability to return to duty (odds ratio, 27.5 [95% CI, 3.24-233.47]; P = .002). Seven of 9 patients who underwent graft debridement underwent revision ACLR, and all 7 had stable knees at final follow-up compared with 68% (15/22) in the graft retention group. Arthroscopic debridement with early graft removal and staged revision ACLR remains a viable option for restoring knee stability (100%), although the rate of return to active duty was low in the graft resection group (33%). The risk of knee laxity did not differ based on early graft retention. Time to presentation with graft retention was not associated with a decreased rate of graft laxity.
Waterman, Brian R.; Arroyo, William; Cotter, Eric J.; Zacchilli, Michael A.; Garcia, E’Stephan J.; Owens, Brett D.
2018-01-01
Background: There remains a debate over whether to retain the index anterior cruciate ligament (ACL) graft in the setting of septic arthritis. Purpose: To evaluate and compare clinical outcomes for the treatment of septic arthritis after ACL reconstruction (ACLR) in those with and without early graft retention. Study Design: Case series; Level of evidence, 4. Methods: The Military Health System was queried for all ACLR procedures performed between 2007 and 2013. Inclusion criteria required active military status, primary ACLR with secondary septic arthritis, and minimum 24-month surveillance. Demographic, clinical, and surgical variables were evaluated using descriptive statistics and regression analysis for factors influencing selected outcomes. Results: Of 9511 ACLR procedures, 31 (0.32%) were identified as having secondary septic arthritis requiring urgent arthroscopic irrigation and debridement and intravenous antibiotics (mean, 6.3 weeks). The majority (62%) were treated in the subacute (2 weeks to 2 months) setting. Index ACLR was performed with a hamstring autograft (n = 17, 55%), soft tissue allograft (n = 11, 35%), and patellar tendon autograft (n = 3, 10%). The graft was retained in 71% (n = 22) of patients, while 29% (n = 9) underwent early graft debridement. At a mean 26.9-month follow-up, 48% of patients (n = 15) had returned to the military. Graft removal was not predictive of return to active duty (P = .29). The presence of postoperative complications, including symptomatic postinfection arthritis (22.6%) and arthrofibrosis (9.7%), was the only variable predictive of inability to return to duty (odds ratio, 27.5 [95% CI, 3.24-233.47]; P = .002). Seven of 9 patients who underwent graft debridement underwent revision ACLR, and all 7 had stable knees at final follow-up compared with 68% (15/22) in the graft retention group. Conclusion: Arthroscopic debridement with early graft removal and staged revision ACLR remains a viable option for restoring knee stability (100%), although the rate of return to active duty was low in the graft resection group (33%). The risk of knee laxity did not differ based on early graft retention. Time to presentation with graft retention was not associated with a decreased rate of graft laxity. PMID:29552571
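An odds ratio with a 95% CI of the kind reported for postoperative complications (OR 27.5) is typically obtained from a logistic regression. Below is a minimal, hedged sketch of that computation; the data file, the choice of predictors, and the column names (complication, graft_removed, returned_to_duty) are hypothetical stand-ins rather than the study's actual variables.

```python
# Sketch: odds ratios and 95% CIs from a logistic regression on a hypothetical cohort.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("aclr_septic_arthritis.csv")   # hypothetical cohort file
X = sm.add_constant(df[["complication", "graft_removed", "age"]])
y = 1 - df["returned_to_duty"]                  # model inability to return to duty

fit = sm.Logit(y, X).fit()
odds_ratios = np.exp(fit.params)                # OR for each predictor
conf_int = np.exp(fit.conf_int())               # 95% CI on the OR scale
print(pd.concat([odds_ratios, conf_int], axis=1))
```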
Marginal donors: can older donor hearts tolerate prolonged cold ischemic storage?
Korkmaz, Sevil; Bährle-Szabó, Susanne; Loganathan, Sivakkanan; Li, Shiliang; Karck, Matthias; Szabó, Gábor
2013-10-01
Both advanced donor age and prolonged ischemic time are significant risk factors for 1-year mortality. However, the functional consequences of this combination have not been fully evaluated in the early phase after transplantation, even though early graft dysfunction is the main determinant of long-term outcome following transplantation. We evaluated in vivo left-ventricular (LV) cardiac and coronary vascular function of old-donor grafts after short and prolonged cold ischemic times in rats 1 h after heart transplantation. The hearts were excised from young donor (3-month-old) or old donor (18-month-old) rats, stored in cold preservation solution for either 1 or 8 h, and heterotopically transplanted. After a 1-h ischemic period, LV pressure, maximum pressure development (dP/dt max), the time constant of LV pressure decay (τ), LV end-diastolic pressure and coronary blood flow in the old-donor group did not differ from those in young donors. However, endothelium-dependent vasodilatation to acetylcholine resulted in a significantly lower response of coronary blood flow in the old-donor group (33 ± 4 vs. 51 ± 15 %, p < 0.05). After 8 h of preservation, two of the old-donor hearts showed no mechanical activity upon reperfusion. LV pressure (55 ± 6 vs. 72 ± 5 mmHg, p < 0.05), dP/dt max (899 ± 221 vs. 1530 ± 217 mmHg/s, p < 0.05), coronary blood flow and the response to acetylcholine were significantly reduced, and τ was increased, in the old-donor group in comparison to young controls. During the early phase after transplantation, the ischemic tolerance of older-donor hearts is reduced after a prolonged preservation time, and the endothelium is more vulnerable to ischemia/reperfusion.
Codes, Liana; de Souza, Ygor Gomes; D'Oliveira, Ricardo Azevedo Cruz; Bastos, Jorge Luiz Andrade; Bittencourt, Paulo Lisboa
2018-04-24
To analyze whether fluid overload is an independent risk factor for adverse outcomes after liver transplantation (LT). One hundred and twenty-one patients who underwent LT were retrospectively evaluated. Data regarding perioperative and postoperative variables previously associated with adverse outcomes after LT were reviewed. Cumulative fluid balance (FB) in the first 12 h and in the first 4 d after surgery was compared with major adverse outcomes after LT. Most patients were managed with a liberal approach to fluid administration, with a mean cumulative FB over 5 L and 10 L, respectively, in the first 12 h and 4 d after LT. Cumulative FB over 4 d was independently associated with the occurrence of both AKI and the requirement for renal replacement therapy (RRT) (OR = 2.3; 95%CI: 1.37-3.86, P = 0.02 and OR = 2.89; 95%CI: 1.52-5.49, P = 0.001, respectively). Other variables associated with AKI and RRT on multivariate analysis were, respectively, male sex and Acute Physiology and Chronic Health Evaluation II (APACHE II) levels and sepsis or septic shock. Mortality was independently related to AST and APACHE II levels (OR = 2.35; 95%CI: 1.1-5.05, P = 0.02 and 2.63; 95%CI: 1.0-6.87, P = 0.04, respectively), probably reflecting the degree of graft dysfunction and the severity of the early postoperative course of LT. No effect of FB on mortality after LT was observed. Cumulative positive FB over 4 d after LT is independently associated with the development of AKI and the requirement for RRT. Survival was not independently related to FB, but to surrogate markers of graft dysfunction and the severity of the postoperative course of LT.
Stevens, R B; Skorupa, J Y; Rigley, T H; Yannam, G R; Nielsen, K J; Schriner, M E; Skorupa, A J; Murante, A; Holdaway, E; Wrenshall, L E
2009-05-01
Histidine-Tryptophan-Ketoglutarate (HTK) solution is increasingly used to flush and preserve organ donor kidneys, with efficacy claimed equivalent to University of Wisconsin (UW) solution. We observed and reported increased graft pancreatitis in pancreata flushed with HTK solution, which prompted this review of transplanting HTK-flushed kidneys. We analyzed outcomes of deceased-donor kidneys flushed with HTK and UW solutions with a minimum of 12 months follow-up, excluding pediatric and multi-organ recipients. We evaluated patient and graft survival and rejection rates, variables that might constitute hazards to graft survival and renal function. Two-year patient survival, rejection, renal function and graft survival were not different, but early graft loss (<6 months) was worse in HTK-flushed kidneys (p < 0.03). A Cox analysis of donor grade, cold ischemic time, panel reactive antibodies (PRA), donor race, first vs. repeat transplant, rejection and flush solution showed that only HTK use predicted early graft loss (p < 0.04; relative risk = 3.24), almost exclusively attributable to primary non-function (HTK, n = 5 (6.30%); UW, n = 1 (0.65%); p = 0.02). Delayed graft function and early graft loss with HTK occurred only in lesser grade kidneys, suggesting it should be used with caution in marginal donors.
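For the primary non-function comparison quoted above (HTK 5 cases, 6.30% vs. UW 1 case, 0.65%, p = 0.02), a two-by-two exact test is the natural check. The sketch below uses Fisher's exact test; the event counts are taken from the abstract, but the group denominators are back-calculated from the reported percentages and should be treated as approximate.

```python
# Sketch: two-sided Fisher's exact test for primary non-function (PNF) by flush solution.
# Event counts (5 vs 1) are from the abstract; denominators are back-calculated from
# the reported percentages (6.30% and 0.65%) and are therefore only approximate.
from scipy.stats import fisher_exact

htk_pnf, htk_total = 5, 79      # ~5/79 = 6.3%
uw_pnf, uw_total = 1, 154       # ~1/154 = 0.65%

table = [[htk_pnf, htk_total - htk_pnf],
         [uw_pnf, uw_total - uw_pnf]]
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.2f}, p = {p_value:.3f}")
```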
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmelter, Christopher, E-mail: christopher.schmelter@klinikum-ingolstadt.de; Raab, Udo, E-mail: udo.raab@klinikum-ingolstadt.de; Lazarus, Friedrich, E-mail: friedrich.lazarus@klinikum-ingolstadt.de
Purpose: The study was designed to assess outcomes of arteriovenous (AV) accesses after interventional stent-graft deployment in haemodialysis patients. Materials and Methods: 63 haemodialysis patients with 66 AV fistulas and AV grafts were treated by interventional stent-graft deployment from 2006 to 2012 at our hospital. Data of these patients were retrospectively analysed for location of deployed stent-grafts, occurrence and location of (re-)stenosis and (re-)thrombosis. Complex stenosis was the most frequent indication for stent-graft deployment (45.5 %), followed by complications of angioplasty with vessel rupture or dissection (31.8 %). Results: A high rate of procedural success was achieved (98.5 %). The most frequent location of the deployed stent-graft was the draining vein (66.7 %). Stent-graft deployment was more frequent in AV grafts than in AV fistulas. Primary patency was 45.5 % at 6 months, 31.3 % at 12 months and 19.2 % at 24 months. Primary patency was significantly better for AV fistulas than for AV grafts with deployed stent-grafts. Patency of the deployed stent-graft was much better than overall AV access primary patency with a deployed stent-graft. Re-stenosis with thrombosis was the most frequent indication for re-intervention. The most frequent location of re-stenosis was the draining vein (37.1 %), followed by stenosis at the AV access (29.5 %) and the deployed stent-graft (23.5 %). Conclusion: Re-stenosis and re-thrombosis remain frequent in AV fistulas and AV grafts in haemodialysis patients despite stent-graft deployment. Re-stenosis of the deployed stent-graft is, only in a minority of cases, responsible for AV access dysfunction.
Caputti, Guido Marco; Palma, José Honório; Gaia, Diego Felipe; Buffolo, Enio
2011-01-01
OBJECTIVES: Patients with coronary artery disease and left ventricular dysfunction have high mortality when maintained on medical treatment alone. Coronary artery bypass grafting can improve survival and the quality of life. Recently, revascularization without cardiopulmonary bypass has been presented as a viable alternative. The aim of this study is to compare patients with left ventricular ejection fractions of less than 20% who underwent coronary artery bypass grafting with or without cardiopulmonary bypass. METHODS: From January 2001 to December 2005, 217 nonrandomized, consecutive, and nonselected patients with an ejection fraction less than or equal to 20% underwent coronary artery bypass graft surgery with (112) or without (off-pump) (105) the use of cardiopulmonary bypass. We studied demographic, operative, and postoperative data. RESULTS: There were no demographic differences between groups. The outcome variables showed similar graft numbers in both groups. Mortality was 12.5% in the cardiopulmonary bypass group and 3.8% in the off-pump group. Postoperative complications were statistically different (cardiopulmonary bypass versus off-pump): total length of hospital stay, 11.3 vs. 7.2 days; length of ICU stay, 3.7 vs. 2.1 days; pulmonary complications, 10.7% vs. 2.8%; intubation time, 22 vs. 10 hours; postoperative bleeding, 654 vs. 440 mL; acute renal failure, 8.9% vs. 1.9%; and left-ventricular ejection fraction before discharge, 22% vs. 29%. CONCLUSION: Coronary artery bypass grafting without cardiopulmonary bypass in selected patients with severe left ventricular dysfunction is valid and safe and is associated with lower mortality and morbidity than conventional operations. PMID:22189729
Tanaka, Yoshinari; Kita, Keisuke; Takao, Rikio; Amano, Hiroshi; Uchida, Ryohei; Shiozaki, Yoshiki; Yonetani, Yasukazu; Kinugasa, Kazutaka; Mae, Tatsuo; Horibe, Shuji
2018-01-01
Background: Accumulating evidence suggests that long-term anterior cruciate ligament (ACL) deficiency can give rise to an abnormal tibiofemoral relationship and subsequent intra-articular lesions. However, the effects of chronic ACL deficiency (ACLD) on early graft failure after anatomic reconstruction remain unclear. Hypothesis: We hypothesized that patients with long-term ACLD lasting more than 5 years would have a greater rate of early graft failure due to insufficient intraoperative reduction of the tibia and that the preoperative and immediately postoperative abnormal tibiofemoral relationship in the sagittal plane, such as anterior tibial subluxation (ATS), would correlate with the graft status on postoperative magnetic resonance imaging (MRI). Study Design: Cohort study; Level of evidence, 3. Methods: A total of 358 patients who had undergone anatomic ACL reconstruction with hamstring grafts were divided into 5 groups based on chronicity of ACLD: (1) 0 to 6 months, (2) 6 months to 1 year, (3) 1 to 2 years, (4) 2 to 5 years, and (5) longer than 5 years. Preoperatively and immediately postoperatively, lateral radiographs in full extension were taken in all patients to evaluate the tibiofemoral relationship, specifically with regard to ATS, space for the ACL (sACL), and extension angle. All patients underwent MRI at 6 months to reveal graft status. Groups with a high rate of graft failure were further analyzed to compare demographic and radiographic factors between the intact and failure subgroups, followed by multivariate logistic regression analysis to identify predisposing factors. Results: Graft failure without trauma was observed in 4 (1.8%), 0 (0%), 1 (3.7%), 3 (9.7%), and 8 patients (17.7%) in groups 1, 2, 3, 4, and 5, respectively. Of the 76 patients in groups 4 and 5, significant differences were noted between the failure and intact subgroups in preoperative ATS (4.9 vs 2.4 mm, respectively; P < .01), side-to-side differences in sACL (sACL-SSD) (4.7 vs 1.9 mm, respectively; P < .01), extension deficit (4.4° vs 1.3°, respectively; P < .01), and chondral lesions (P = .02), while postoperative ATS and sACL-SSD showed no differences. Multivariate logistic regression analysis revealed that of these factors, preoperative sACL-SSD could be a risk factor for early graft failure (odds ratio, 3.2; 95% CI, 1.37-7.46). Conclusion: Early graft failure at 6 months increased in patients with ACLD longer than 2 years. In this population, preoperative sACL-SSD was the most significant risk factor for early graft failure on MRI. However, immediately postoperative radiographic measurements had no effect on graft failure rates. PMID:29479543
Yang, Feng; Wang, Jinhong; Hou, Dengbang; Xing, Jialin; Liu, Feng; Xing, Zhi chen; Jiang, Chunjing; Hao, Xing; Du, Zhongtao; Yang, Xiaofang; Zhao, Yanyan; Miao, Na; Jiang, Yu; Dong, Ran; Gu, Chengxiong; Sun, Lizhong; Wang, Hong; Hou, Xiaotong
2016-01-01
Patients with severe left ventricular (LV) dysfunction undergoing off-pump coronary artery bypass grafting (OPCAB) often have higher mortality. The efficacy and safety of preoperative prophylactic intra-aortic balloon pump (IABP) insertion are not well established. A total of 416 consecutive patients with severe LV dysfunction (ejection fraction ≤35%) undergoing isolated OPCAB were enrolled in a retrospective observational study. Of these, 191 patients were in the IABP group and the remaining 225 patients were in the control group. A total of 129 pairs of patients were propensity-score matched. No significant differences in demographic and preoperative risk factors were found between the two groups. Postoperative 30-day mortality occurred more frequently in the control group than in the IABP group (8.5% vs. 1.6%, p = 0.02). There was a significant reduction of low cardiac output syndrome in the IABP group compared with the control group (control 14% vs. IABP 6.2%, p = 0.04). Prolonged mechanical ventilation (≥48 h) occurred more frequently in the control group (34.9% vs. 20.9%, p = 0.02). IABP also decreased the postoperative length of stay. Preoperative IABP was associated with a lower 30-day mortality, suggesting that it is effective in patients with severe LV dysfunction undergoing OPCAB. PMID:27279591
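The propensity-score matching described above can be approximated with a simple score model followed by nearest-neighbour matching. The sketch below is one deliberately simplified version (matching with replacement and without a caliper); the file and column names (iabp, death_30d, euroscore, and so on) are hypothetical placeholders rather than the study's actual variables.

```python
# Sketch: 1:1 nearest-neighbour propensity-score matching on a hypothetical cohort.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

df = pd.read_csv("opcab_cohort.csv")          # hypothetical file
covariates = ["age", "ejection_fraction", "diabetes", "euroscore"]

# 1. Propensity score: probability of receiving prophylactic IABP given covariates
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["iabp"])
df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]

# 2. Match each IABP patient to the nearest control on the propensity score
#    (with replacement, no caliper -- a simplification of the study's matching)
treated = df[df["iabp"] == 1]
control = df[df["iabp"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched = pd.concat([treated, control.iloc[idx.ravel()]])

# 3. Compare 30-day mortality in the matched sample
print(matched.groupby("iabp")["death_30d"].mean())
```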
Amaral Gonçalves Fusatto, Helena; Castilho de Figueiredo, Luciana; Ragonete Dos Anjos Agostini, Ana Paula; Sibinelli, Melissa; Dragosavac, Desanka
2018-01-01
The aim of this study was to identify pulmonary dysfunction and factors associated with prolonged mechanical ventilation, hospital stay, weaning failure and mortality in patients undergoing coronary artery bypass grafting with use of intra-aortic balloon pump (IABP). This observational study analyzed respiratory, surgical, clinical and demographic variables and related them to outcomes. We analyzed 39 patients with a mean age of 61.2 years. Pulmonary dysfunction, characterized by mildly impaired gas exchange, was present from the immediate postoperative period to the third postoperative day. Mechanical ventilation time was influenced by the use of IABP and PaO2/FiO2, female gender and smoking. Intensive care unit (ICU) stay was influenced by APACHE II score and use of IABP. Mortality was strongly influenced by APACHE II score, followed by weaning failure. Pulmonary dysfunction was present from the first to the third postoperative day. Mechanical ventilation time was influenced by female gender, smoking, duration of IABP use and PaO2/FiO2 on the first postoperative day. ICU stay was influenced by APACHE II score and duration of IABP. Mortality was influenced by APACHE II score, followed by weaning failure. Copyright © 2017 Sociedade Portuguesa de Cardiologia. Publicado por Elsevier España, S.L.U. All rights reserved.
Kim, Il-Kyu; Cho, Hyun-Young; Pae, Sang-Pill; Jung, Bum-Sang; Cho, Hyun-Woo; Seo, Ji-Hoon
2013-12-01
Oral and maxillofacial defects often require bone grafts to restore missing tissues. Well-recognized donor sites include the anterior and posterior iliac crest, rib, and intercalvarial diploic bone. The proximal tibia has also been explored as an alternative donor site. The use of the tibia for bone grafting has many benefits, such as procedural ease, adequate volume of cancellous and cortical bone, and minimal complications. Although patients rarely complain of pain, swelling, discomfort, or dysfunction, such as gait disturbance, both patients and surgeons should pay close attention to such aftereffects because of the possibility of tibial fracture. The purpose of this study is to analyze tibial fractures occurring after osteotomy for a medioproximal tibial graft. The analysis included patients who underwent medioproximal tibial grafting between March 2004 and December 2011 at Inha University Hospital. A total of 105 subjects, 30 females and 75 males, ranged in age from 17 to 78 years. We investigated age, weight, circumstance, and graft timing in relation to tibial fracture. Tibial fractures occurred in four of 105 patients. There were no significant differences in graft region, shape, or scale between the fractured and non-fractured patients. Patients who undergo tibial grafts must be careful to avoid excessive external force after the operation.
2017-05-29
SR aGvHD; Acute-graft-versus-host Disease; Steroid Refractory Acute Graft Versus Host Disease; Graft-versus-host-disease; Graft Vs Host Disease; Alpha 1-Antitrypsin Deficiency; Alpha-1 Proteinase Inhibitor; Alpha-1 Protease Inhibitor Deficiency; Acute Graft-Versus-Host Reaction Following Bone Marrow Transplant
Vγ4 γδ T Cells Provide an Early Source of IL-17A and Accelerate Skin Graft Rejection.
Li, Yashu; Huang, Zhenggen; Yan, Rongshuai; Liu, Meixi; Bai, Yang; Liang, Guangping; Zhang, Xiaorong; Hu, Xiaohong; Chen, Jian; Huang, Chibing; Liu, Baoyi; Luo, Gaoxing; Wu, Jun; He, Weifeng
2017-12-01
Activated γδ T cells have been shown to accelerate allograft rejection. However, the precise roles of skin-resident γδ T cells and their subsets, Vγ5 (epidermis) and Vγ1 and Vγ4 (dermis), in skin graft rejection have not been identified. Here, using a male-to-female skin transplantation model, we demonstrated that Vγ4 T cells, rather than Vγ1 or Vγ5 T cells, accelerated skin graft rejection and that IL-17A was essential for Vγ4 T-cell-mediated skin graft rejection. Moreover, we found that Vγ4 T cells were required for early IL-17A production in the transplanted area, both in skin grafts and in the host epidermis around grafts. Additionally, the chemokine (C-C motif) ligand 20-chemokine receptor 6 pathway was essential for recruitment of Vγ4 T cells to the transplantation area, whereas both IL-1β and IL-23 induced IL-17A production from infiltrating cells. Lastly, Vγ4 T-cell-derived IL-17A promoted the accumulation of mature dendritic cells in draining lymph nodes to subsequently regulate αβ T-cell function after skin graft transplantation. Taken together, our data reveal that Vγ4 T cells accelerate skin graft rejection by providing an early source of IL-17A. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Age-Dependent Risk of Graft Failure in Young Kidney Transplant Recipients.
Kaboré, Rémi; Couchoud, Cécile; Macher, Marie-Alice; Salomon, Rémi; Ranchin, Bruno; Lahoche, Annie; Roussey-Kesler, Gwenaelle; Garaix, Florentine; Decramer, Stéphane; Pietrement, Christine; Lassalle, Mathilde; Baudouin, Véronique; Cochat, Pierre; Niaudet, Patrick; Joly, Pierre; Leffondré, Karen; Harambat, Jérôme
2017-06-01
The risk of graft failure in young kidney transplant recipients has been found to increase during adolescence and early adulthood. However, this question has not been addressed outside the United States so far. Our objective was to investigate whether the hazard of graft failure also increases during this age period in France, irrespective of age at transplantation. Data on all first kidney transplantations performed before 30 years of age between 1993 and 2012 were extracted from the French kidney transplant database. The hazard of graft failure was estimated at each current age using a 2-stage modelling approach that accounted for both age at transplantation and time since transplantation. Hazard ratios comparing the risk of graft failure during adolescence or early adulthood to other periods were estimated from time-dependent Cox models. A total of 5983 renal transplant recipients were included. The risk of graft failure was found to increase from around the age of 13 years until the age of 21 years, and to decrease thereafter. Results from the Cox model indicated that the hazard of graft failure during the age period 13 to 23 years was almost twice as high as during the age period 0 to 12 years, and 25% higher than after 23 years. Among first kidney transplant recipients younger than 30 years in France, those currently in adolescence or early adulthood have the highest risk of graft failure.
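One way to approximate the age-dependent analysis described above is to put follow-up on the current-age time scale, split each recipient's follow-up at ages 13 and 23 years, and fit a Cox model with a time-dependent indicator for the 13-23-year band. The sketch below does that with lifelines; it is a simplified stand-in for the study's two-stage model, and the data file and column names are hypothetical.

```python
# Sketch: hazard ratio for the 13-23-year age band with current age as the time scale.
# Episode splitting below is a simplified stand-in for the two-stage model described
# in the abstract; all column names are hypothetical.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

df = pd.read_csv("graft_survival.csv")   # one row per recipient (hypothetical)
# expected columns: patient_id, age_at_tx, age_at_event (failure or censoring), failed

rows = []
for _, r in df.iterrows():
    cuts = [r["age_at_tx"], 13.0, 23.0, r["age_at_event"]]
    cuts = sorted(set(c for c in cuts if r["age_at_tx"] <= c <= r["age_at_event"]))
    for start, stop in zip(cuts[:-1], cuts[1:]):
        rows.append({
            "patient_id": r["patient_id"],
            "start": start,
            "stop": stop,
            "age_13_23": int(13.0 <= start < 23.0),   # time-dependent age-band flag
            "age_at_tx": r["age_at_tx"],
            "failed": int(r["failed"] and stop == r["age_at_event"]),
        })
long_df = pd.DataFrame(rows)

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="patient_id", event_col="failed",
        start_col="start", stop_col="stop")
ctv.print_summary()   # exp(coef) for age_13_23 approximates the reported hazard ratio
```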
Al-Ruzzeh, Sharif; George, Shane; Bustami, Mahmoud; Nakamura, Koki; Ilsley, Charles; Amrani, Mohamed
2002-05-01
The left internal thoracic artery (LITA) graft to the left anterior descending (LAD) artery became the gold standard graft in coronary surgery. Subsequently, the right internal thoracic artery (RITA) graft was increasingly used. However, there is still some debate about the optimal way of using this conduit. The aim of the present study was to assess our experience in grafting the pedicled RITA graft to LAD in 212 consecutive patients. The records of 212 consecutive patients who underwent isolated coronary artery bypass grafting with the pedicled RITA graft to the LAD artery at Harefield Hospital between January 1998 and May 2001 were retrospectively reviewed. We approached the last 35 consecutive patients to obtain an angiographic control group. All 35 patients (16.5%) consented and, before discharge, underwent angiography to look at the quality of anastomoses and the patency of grafts. Successful catheterization and engagement of the RITA grafts was performed in 32 patients. Angiography showed that 32/32 (100%) of the RITA grafts were widely patent with excellent flow. The distal anastomoses of these RITA grafts were also satisfactory. There were no deaths among the study patients. Our results show that the use of the pedicled RITA graft to the LAD artery provides a good early clinical and angiographic outcome, and suggests that the pedicled RITA graft to the LAD artery is a good alternative to the pedicled LITA graft to the LAD artery.
Associated Clinical and Laboratory Markers of Donor on Allograft Function After Heart Transplant.
Braulio, Renato; Sanches, Marcelo Dias; Teixeira Junior, Antonio Lúcio; Costa, Paulo Henrique Nogueira; Moreira, Maria da Consolação Vieira; Rocha, Monaliza Angela; Andrade, Silvio Amadeu de; Gelape, Cláudio Léo
2016-04-01
Primary graft dysfunction is a major cause of mortality after heart transplantation. The aim was to evaluate correlations between donor-related clinical/biochemical markers and the occurrence of primary graft dysfunction and clinical outcomes of recipients within 30 days of transplant. The prospective study involved 43 donor/recipient pairs. Data collected from donors included demographic and echocardiographic information, noradrenaline administration rates and concentrations of soluble tumor necrosis factor receptors (sTNFR1 and sTNFR2), interleukins (IL-6 and IL-10), monocyte chemoattractant protein-1, C-reactive protein and cardiac troponin I. Data collected from recipients included operating, cardiopulmonary bypass (CPB), intensive care unit and hospitalization times, inotrope administration and left/right ventricular function assessed through echocardiography. Recipients who developed moderate/severe left ventricular dysfunction had received organs from significantly older donors (P = 0.020). Recipients from donors who required moderate/high doses of noradrenaline (>0.23 µg/kg/min) around harvesting time exhibited lower post-transplant ventricular ejection fractions (P = 0.002) and required longer CPB times (P = 0.039). Significantly higher concentrations of sTNFR1 (P = 0.014) and sTNFR2 (P = 0.030) in donors were associated with reduced intensive care unit times (≤5 days) in recipients, while higher donor IL-6 (P = 0.029) and IL-10 (P = 0.037) levels were correlated with reduced hospitalization times (≤25 days) in recipients. Recipients who required moderate/high levels of noradrenaline for weaning off cardiopulmonary bypass had received organs from donors with lower concentrations of sTNFR2 (P = 0.028) and IL-6 (P = 0.001). High levels of sTNFR1, sTNFR2, IL-6 and IL-10 in donors were associated with a more favorable early clinical course in recipients. Allografts from older donors, or from those treated with noradrenaline doses >0.23 µg/kg/min, were more frequently affected by primary graft dysfunction within 30 days of surgery.
Levin, Ricardo; Degrange, Marcela; Del Mazo, Carlos; Tanus, Eduardo; Porcile, Rafael
2012-01-01
BACKGROUND: The calcium sensitizer levosimendan has been used in cardiac surgery for the treatment of postoperative low cardiac output syndrome (LCOS) and difficult weaning from cardiopulmonary bypass (CPB). OBJECTIVES: To evaluate the effects of preoperative treatment with levosimendan on 30-day mortality, the risk of developing LCOS and the requirement for inotropes, vasopressors and intra-aortic balloon pumps in patients with severe left ventricular dysfunction. METHODS: Patients with severe left ventricular dysfunction and an ejection fraction <25% undergoing coronary artery bypass grafting with CPB were admitted 24 h before surgery and were randomly assigned to receive levosimendan (loading dose of 10 μg/kg followed by a 23-h continuous infusion of 0.1 μg/kg/min) or a placebo. RESULTS: From December 1, 2002 to June 1, 2008, a total of 252 patients were enrolled (127 in the levosimendan group and 125 in the control group). Individuals treated with levosimendan exhibited a lower incidence of complicated weaning from CPB (2.4% versus 9.6%; P<0.05), decreased mortality (3.9% versus 12.8%; P<0.05) and a lower incidence of LCOS (7.1% versus 20.8%; P<0.05) compared with the control group. The levosimendan group also had a lower requirement for inotropes (7.9% versus 58.4%; P<0.05), vasopressors (14.2% versus 45.6%; P<0.05) and intra-aortic balloon pumps (6.3% versus 30.4%; P<0.05). CONCLUSION: Patients with severe left ventricular dysfunction (ejection fraction <25%) undergoing coronary artery bypass grafting with CPB who were pretreated with levosimendan exhibited lower mortality, a decreased risk of developing LCOS and a reduced requirement for inotropes, vasopressors and intra-aortic balloon pumps. Studies with a larger number of patients are required to confirm whether these findings represent a new strategy to reduce the operative risk in this high-risk patient population. PMID:23620700
Host-Pathogen Interactions and Chronic Lung Allograft Dysfunction.
Belperio, John; Palmer, Scott M; Weigt, S Sam
2017-09-01
Lung transplantation is now considered to be a therapeutic option for patients with advanced-stage lung diseases. Unfortunately, due to post-transplant complications, both infectious and noninfectious, it is only a treatment and not a cure. Infections (e.g., bacterial, viral, and fungal) in the immunosuppressed lung transplant recipient are a common cause of mortality post transplant. Infections have more recently been explored as factors contributing to the risk of chronic lung allograft dysfunction (CLAD). Each major class of infection, (1) bacterial (Staphylococcus aureus and Pseudomonas aeruginosa), (2) viral (cytomegalovirus and community-acquired respiratory viruses), and (3) fungal (Aspergillus), has been associated with the development of CLAD. Mechanistically, the microbe seems to be interacting with the allograft cells, stimulating the induction of chemokines, which recruit recipient leukocytes to the graft. The recipient leukocyte interactions with the microbe further up-regulate chemokines, amplifying the influx of allograft-infiltrating mononuclear cells. These events can promote recipient leukocytes to interact with the allograft, triggering an alloresponse and graft dysfunction. Overall, interactions among the microbe, the allograft and the host immune system alter chemokine production, which, in part, plays a role in the pathobiology of CLAD and mortality due to CLAD.
Coronary artery surgery: indications and recent experience.
Robinson, P. S.; Coltart, D. J.; Jenkins, B. S.; Webb-Peploe, M. M.; Braimbridge, M. V.; Williams, B. T.
1978-01-01
The comprehensive experience of coronary artery surgery in a Cardiothoracic Unit over a 31-month period is reviewed. Hospital mortality for elective bypass grafting was 3.9% overall and 2.5% in those with good pre-operative left ventricular function. Major influences on hospital mortality were pre-operative left ventricular function, extent of coronary artery disease and extent of the surgical procedure undertaken in terms of number of aortocoronary grafts inserted, coronary endarterectomy and particularly concomitant valve surgery or aneurysm resection. Follow-up experience shows 74% of grafted patients to be symptom-free and 85% symptomatically improved one year after surgery with 70% symptom-free and 80% improved at two years. Early post-operative deaths appear related to early graft closure and recurrence of symptoms postoperatively to late graft closure or progression of coronary disease in the native circulation. The study provides a guide to the relative risks of coronary artery surgery for symptomatic coronary artery disease and expected symptomatic results in the early follow-up period. PMID:310999
Mangus, Richard S; Fridell, Jonathan A; Kubal, Chandrashekhar A; Davis, Jason P; Tector, A Joseph
2015-02-01
Serum alanine aminotransferase (ALT) levels are frequently elevated with liver injury and such elevations are common in deceased organ donors. The impact of this injury on early liver allograft function has not been well described. This study analyses the immediate function and 1-year graft and patient survival for liver allografts stratified by peak serum ALT levels in the deceased donor. The on-site organ procurement records for 1348 consecutive deceased liver donors were reviewed (2001–2011). Serum ALT was categorized into three study groups: normal/mild elevation, 0–499 U/L; moderate elevation, 500–999 U/L (>10× upper limit of normal) and severe elevation, ≥1000 U/L (>20× upper limit of normal). Outcomes included early graft function and graft loss, and 1-year graft and patient survival. Distribution of subjects included: normal/mild, 1259 (93%); moderate, 34 (3%) and severe, 55 (4%). Risk of 30-day graft loss for the three study groups was: 72 (6%), 3 (9%) and 3 (6%) (P = 0.74). Graft and patient survival at 1 year for the three groups was: normal/mild, 1031 (87%), 1048 (88%); moderate, 31 (91%), 31 (91%) and severe, 43 (88%), 44 (90%) (P = 0.71, 0.79). Cox proportional hazards modelling of survival while controlling for donor age and recipient model for end-stage liver disease score (MELD) demonstrates no statistically significant difference among the three study groups. This study demonstrates clinical equivalence in early graft function and 1-year graft and patient survival for donor livers with varying peak levels of serum ALT. These donor allografts may, therefore, be utilized successfully.
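A stratified survival comparison like the one summarized above can be sketched with Kaplan-Meier estimates and a log-rank test across the three donor-ALT categories. The cut-points below follow the abstract; the data file and column names (donor_peak_alt, days_to_graft_loss, graft_lost) are hypothetical placeholders.

```python
# Sketch: 1-year graft survival by donor peak ALT category plus a log-rank comparison.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

df = pd.read_csv("donor_alt_grafts.csv")   # hypothetical file
bins = [0, 500, 1000, float("inf")]
labels = ["normal/mild", "moderate", "severe"]
df["alt_group"] = pd.cut(df["donor_peak_alt"], bins=bins, labels=labels, right=False)

kmf = KaplanMeierFitter()
for name, grp in df.groupby("alt_group", observed=True):
    kmf.fit(grp["days_to_graft_loss"], grp["graft_lost"], label=str(name))
    print(name, "1-year graft survival:", float(kmf.predict(365)))

res = multivariate_logrank_test(df["days_to_graft_loss"], df["alt_group"], df["graft_lost"])
print("log-rank p =", res.p_value)
```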
Mangini, Sandrigo; Alves, Bárbara Rubim; Silvestre, Odílson Marcos; Pires, Philippe Vieira; Pires, Lucas José Tachotti; Curiati, Milena Novaes Cardoso; Bacal, Fernando
2015-01-01
ABSTRACT Heart transplantation is currently the definitive gold standard surgical approach in the treatment of refractory heart failure. However, the shortage of donors limits the achievement of a greater number of heart transplants, in which the use of mechanical circulatory support devices is increasing. With well-established indications and contraindications, as well as diagnosis and treatment of rejection through defined protocols of immunosuppression, the outcomes of heart transplantation are very favorable. Among early complications that can impact survival are primary graft failure, right ventricular dysfunction, rejection, and infections, whereas late complications include cardiac allograft vasculopathy and neoplasms. Despite the difficulties for heart transplantation, in particular, the shortage of donors and high mortality while on the waiting list, in Brazil, there is a great potential for both increasing effective donors and using circulatory assist devices, which can positively impact the number and outcomes of heart transplants. PMID:26154552
Oral disease profiles in chronic graft versus host disease.
Bassim, C W; Fassil, H; Mays, J W; Edwards, D; Baird, K; Steinberg, S M; Cowen, E W; Naik, H; Datiles, M; Stratton, P; Gress, R E; Pavletic, S Z
2015-04-01
At least half of patients with chronic graft-versus-host-disease (cGVHD), the leading cause of morbidity and non-relapse mortality after allogeneic stem cell transplantation, have oral manifestations: mucosal lesions, salivary dysfunction, and limited mouth-opening. cGVHD may manifest in a single organ or affect multiple organ systems, including the mouth, eyes, and the skin. The interrelationship of the 3 oral manifestations of cGVHD with each other and with the specific manifestations of extraoral cGVHD has not been studied. In this analysis, we explored, in a large group of patients with cGVHD, the potential associations between: (1) oral mucosal disease and erythematous skin disease, (2) salivary gland dysfunction and lacrimal gland dysfunction, and (3) limited mouth-opening and sclerotic skin cGVHD. Study participants, enrolled in a cGVHD Natural History Protocol (NCT00331968, n = 212), underwent an oral examination evaluating: (1) mucosal cGVHD [NIH Oral Mucosal Score (OMS)], (2) salivary dysfunction (saliva flow and xerostomia), and (3) maximum mouth-opening measurement. Parameters for dysfunction (OMS > 2, saliva flow ≤ 1 mL/5 min, mouth-opening ≤ 35 mm) were analyzed for association with skin cGVHD involvement (erythema and sclerosis, skin symptoms), lacrimal dysfunction (Schirmer's tear test, xerophthalmia), Lee cGVHD Symptom Scores, and NIH organ scores. Oral mucosal disease (31% prevalence) was associated with skin erythema (P < 0.001); salivary dysfunction (11% prevalence) was associated with lacrimal dysfunction (P = 0.010) and xerostomia with xerophthalmia (r = 0.32, P = 0.001); and limited mouth-opening (17% prevalence) was associated with skin sclerosis (P = 0.008) and skin symptoms (P = 0.001). There was no association found among these 3 oral cGVHD manifestations. This analysis supports the understanding of oral cGVHD as 3 distinct diseases: mucosal lesions, salivary gland dysfunction, and mouth sclerosis. Clear classification of oral cGVHD as 3 separate manifestations will improve clinical diagnosis, observational research data collection, and the definitions of outcome measures in clinical trials. © International & American Associations for Dental Research 2015.
Upper limb grafts for hemodialysis access.
Shemesh, David; Goldin, Ilya; Verstandig, Anthony; Berelowitz, Daniel; Zaghal, Ibrahim; Olsha, Oded
2015-01-01
Arteriovenous (AV) grafts are required for hemodialysis access when options for native fistulas have been fully exhausted, where they continue to play an important role in hemodialysis patients, offering a better alternative to central vein catheters. When planning autogenous accesses using Doppler ultrasound, adequate arterial inflow and venous outflow must be consciously preserved for future access creation with grafts. Efforts to improve graft patency include changing graft configuration, graft biology and hemodynamics. Industry offers early cannulation grafts to reduce central catheter use and a bioengineered graft is undergoing clinical studies. Although the outcome of AV grafts is inferior to fistulas, grafts can provide long-term hemodialysis access that is a better alternative to central venous catheters. AV grafts have significant drawbacks, mainly poor patency, infection and cost but also have some advantages: early maturation, ease of creation and needling and widespread availability. The outcome of AV graft surgery is variable from center to center. The primary patency rate for AV grafts is 58% at 6 months and the secondary patency rate is 76% at 6 months and 55% at 18 months. There are centers of excellence that report a 1 year secondary patency rate of up to 91%. In this review of the use of AV grafts for hemodialysis access in the upper extremities, technical issues involved in planning the access and performing the surgery in its different configurations are discussed and the role of surveillance and maintenance with their attendant surgical and radiological interventions is described.
Schmidt, Kenneth J; Tírico, Luís E; McCauley, Julie C; Bugbee, William D
2017-08-01
Regulatory concerns and the popularity of fresh osteochondral allograft (OCA) transplantation have led to a need for prolonged viable storage of osteochondral grafts. Tissue culture media allow a longer storage time but lead to chondrocyte death within the tissue. The long-term clinical consequence of prolonged storage is unknown. Patients transplanted with OCAs with a shorter storage time would have lower failure rates and better clinical outcomes than those transplanted with OCAs with prolonged storage. Cohort study; Level of evidence, 3. A matched-pair study was performed of 75 patients who received early release grafts (mean storage, 6.3 days [range, 1-14 days]) between 1997 and 2002, matched 1:1 by age, diagnosis, and graft size, with 75 patients who received late release grafts (mean storage time, 20.0 days [range, 16-28 days]) from 2002 to 2008. The mean age was 33.5 years, and the median graft size was 6.3 cm². All patients had a minimum 2-year follow-up. Evaluations included pain, satisfaction, function, failures, and reoperations. Outcome measures included the modified Merle d'Aubigné-Postel (18-point) scale, International Knee Documentation Committee (IKDC) form, and Knee Society function (KS-F) scale. Clinical failure was defined as revision OCA transplantation or conversion to arthroplasty. Among patients with grafts remaining in situ, the mean follow-up was 11.9 years (range, 2.0-16.8 years) and 7.8 years (range, 2.3-11.1 years) for the early and late release groups, respectively. OCA failure occurred in 25.3% (19/75) of patients in the early release group and 12.0% (9/75) of patients in the late release group (P = .036). The median time to failure was 3.5 years (range, 1.7-13.8 years) and 2.7 years (range, 0.3-11.1 years) for the early and late release groups, respectively. The 5-year survivorship of OCAs was 85% for the early release group and 90% for the late release group (P = .321). No differences in postoperative pain and function were noted between the groups. Ninety-one percent of the early release group and 93% of the late release group reported satisfaction with OCA results. The transplantation of OCA tissue with prolonged storage is safe and effective for large osteochondral lesions of the knee and has similar clinical outcomes and satisfaction to the transplantation of early release grafts.
Use of autologous grafts in the treatment of acquired penile curvature: An experience of 33 cases.
Khawaja, Abdul Rouf; Dar, Tanveer Iqbal; Zahur, Suhael; Tariq, Sheikh; Hamid, Arf; Wani, M S; Wazir, B S; Iqbal, Arsheed
2016-01-01
The objective was to compare the use of autologous dermal and temporalis fascia grafts in the treatment of acquired penile curvatures. It was a prospective observational study of 33 cases, conducted at the Sher-i-Kashmir Institute of Medical Sciences, Srinagar, from March 2007 to September 2013. All the patients had stable Peyronie's disease (PD). Dorsal, dorsolateral and ventral curvatures with good preoperative erections were included. The PD index with visual analog scales for curvature was used preoperatively. Written informed consent was obtained from all patients, with particular emphasis on erectile dysfunction. After an average follow-up of 2 years, complete straightening of the penis was observed in all patients, with satisfactory sexual intercourse in 30 patients (90%). Three patients (10%) required frequent use of type 5 phosphodiesterase inhibitors for adequate erections. Overall, 91% of patients and partners were satisfied with the procedure, and the donor site was cosmetically better with the temporalis fascia graft. No rejection of any graft was noted, and glans hypoesthesia was noticed in 4 patients (12%). None of the patients required a penile prosthesis. Total operative time for harvesting and application of the graft was longer for dermal grafts (>3 h) than for temporalis fascia grafts (2 h). Tunical lengthening procedures using autologous free grafts represent a safe and reproducible technique. Good preoperative erectile function is required for a tunical lengthening procedure. The temporalis fascia graft is a thin, tough membrane and an effective graft for PD, with good cosmetic and functional results.
Liver Transplantation and Donor Body Mass Index >30: Use or Refuse?
Andert, Anne; Becker, Niklas; Ulmer, Florian; Schöning, Wenzel; Hein, Marc; Rimek, Alexandra; Neumann, Ulf; Schmeding, Maximilian
2016-03-31
Organ shortage is a major problem in liver transplantation. The use of extended criteria donors has become the most important strategy for increasing the donor pool. However, the role of donor body mass index has not yet been thoroughly investigated. The aim of our study was to compare outcomes after liver transplantation in patients who received a liver from a donor with a BMI <30, 30-39, and ≥40, with special regard to the incidence of early allograft dysfunction (EAD) and primary non-function (PNF). One hundred and sixty-three patients who underwent liver transplantation at the University Hospital Aachen between June 2010 and January 2014 were included in this analysis. The outcome of liver transplantation was evaluated by the 30-day and 1-year patient and graft survival rates and the incidences of post-reperfusion syndrome (PRS), EAD, and PNF. The BMI 30-39 group had a higher incidence of EAD than the BMI <30 and BMI ≥40 groups. We observed 5 cases of PNF in the BMI <30 group. The incidence of acute renal failure was significantly higher in the BMI 30-39 and BMI ≥40 groups than in the BMI <30 group. Patient and graft survival did not differ significantly among the 3 groups. Based on the findings of this study, grafts from obese donors with a BMI >30 can be safely transplanted. Therefore, the donor pool can be enlarged to include such obese donors without a negative impact on the long-term patient outcome after liver transplantation.
Alcohol abuse in deceased liver donors: impact on post-transplant outcomes.
Mangus, Richard S; Kubal, Chandrashekhar A; Fridell, Jonathan A; Pena, Jose M; Frost, Evan M; Joseph Tector, A
2015-01-01
Many deceased liver donors with a history of alcohol abuse are excluded based upon medical history alone. This paper summarizes the transplant outcomes for a large number of deceased liver donors with a documented history of alcohol abuse. The records for 1478 consecutive deceased liver donors were reviewed (2001-2012). As per the United Network for Organ Sharing criteria, heavy alcohol use by an organ donor is defined as chronic intake of two or more drinks per day. Donors with a documented history of alcohol abuse were divided into three groups according to duration of abuse (<10 years, 10-24 years and 25 + years). Reperfusion biopsies are reported. Outcomes include biopsy appearance, early graft function and early and late graft survival. There were 161 donors with alcohol abuse: <10 years (29%); 10-24 years (42%); and ≥25 years (29%). Risk of 90-day graft loss for these three groups was: 0%, 3% and 2%, compared to 3% for all other donors (P = 0.62). Graft survival at 1 year for donor grafts with and without alcohol abuse was 89% and 87% (P = 0.52). There was no difference in early graft function. Cox proportional hazards modelling for graft survival demonstrates no statistically significant difference in survival up to 10 years post-transplant. This study demonstrates successful transplantation of a large number of deceased donor liver grafts from donors with a documented history of alcohol abuse (n = 161; 11% of all grafts). These extended criteria donor allografts may, therefore, be utilized successfully with similar outcomes. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
van der Garde, Mark; van Hensbergen, Yvette; Brand, Anneke; Slot, Manon C; de Graaf-Dijkstra, Alice; Mulder, Arend; Watt, Suzanne M; Zwaginga, Jaap Jan
2015-01-01
Human cord blood (CB) hematopoietic stem cell (HSC) transplants demonstrate delayed early neutrophil and platelet recovery and delayed longer term immune reconstitution compared to bone marrow and mobilized peripheral blood transplants. Despite advances in enhancing early neutrophil engraftment, platelet recovery after CB transplantation is not significantly altered when compared to contemporaneous controls. Recent studies have identified a platelet-biased murine HSC subset, maintained by thrombopoietin (TPO), which has enhanced capacity for short- and long-term platelet reconstitution, can self-renew, and can give rise to myeloid- and lymphoid-biased HSCs. In previous studies, we have shown that transplantation of human CB CD34(+) cells precultured in TPO as a single graft accelerates early platelet recovery as well as yielding long-term repopulation in immune-deficient mice. In this study, using a double CB murine transplant model, we investigated whether TPO cultured human CB CD34(+) cells have a competitive advantage or disadvantage over untreated human CB CD34(+) cells in terms of (1) short-term and longer term platelet recovery and (2) longer term hematological recovery. Our studies demonstrate that the TPO treated graft shows accelerated early platelet recovery without impairing the platelet engraftment of untreated CD34(+) cells. Notably, this was followed by a dominant contribution to platelet production through the untreated CD34(+) cell graft over the intermediate to longer term. Furthermore, although the contribution of the TPO treated graft to long-term hematological engraftment was reduced, the TPO treated and untreated grafts both contributed significantly to long-term chimerism in vivo.
Cheng, Pengfei; Han, Pei; Zhao, Changli; Zhang, Shaoxiang; Zhang, Xiaonong; Chai, Yimin
2016-01-01
Patients who have undergone anterior cruciate ligament (ACL) reconstruction surgery commonly encounter graft failure in the initial phase of rehabilitation. The inhibition of graft degradation is crucial for the successful reconstruction of the ACL. Here, we used biodegradable high-purity magnesium (HP Mg) screws in a rabbit model of ACL reconstruction, with titanium (Ti) screws as a control, and analyzed graft degradation and screw corrosion using direct pull-out tests, microCT scanning, and histological and immunohistochemical staining. The most noteworthy finding was that tendon grafts fixed by HP Mg screws exhibited biomechanical properties substantially superior to those fixed by Ti screws, and the relative area of collagen fiber at the tendon-bone interface was much larger in the Mg group, when severe graft degradation was identified in the histological analysis at 3 weeks. Semi-quantitative immunohistochemical results further elucidated that MMP-13 expression significantly decreased surrounding HP Mg screws, with relatively higher collagen II expression. HP Mg screws also exhibited uniform corrosion behavior without displacement or loosening in the femoral tunnel. Therefore, our results demonstrated that Mg screws inhibited graft degradation and improved the biomechanical properties of the tendon graft during the early phase of graft healing, highlighting their potential in ACL reconstruction. PMID:27210585
NASA Astrophysics Data System (ADS)
Cheng, Pengfei; Han, Pei; Zhao, Changli; Zhang, Shaoxiang; Zhang, Xiaonong; Chai, Yimin
2016-05-01
Patients who have undergone anterior cruciate ligament (ACL) reconstruction surgery commonly encounter graft failure in the initial phase of rehabilitation. The inhibition of graft degradation is crucial for the successful reconstruction of the ACL. Here, we used biodegradable high-purity magnesium (HP Mg) screws in a rabbit model of ACL reconstruction, with titanium (Ti) screws as a control, and analyzed graft degradation and screw corrosion using direct pull-out tests, microCT scanning, and histological and immunohistochemical staining. The most noteworthy finding was that tendon grafts fixed by HP Mg screws exhibited biomechanical properties substantially superior to those fixed by Ti screws, and the relative area of collagen fiber at the tendon-bone interface was much larger in the Mg group, when severe graft degradation was identified in the histological analysis at 3 weeks. Semi-quantitative immunohistochemical results further elucidated that MMP-13 expression significantly decreased surrounding HP Mg screws, with relatively higher collagen II expression. HP Mg screws also exhibited uniform corrosion behavior without displacement or loosening in the femoral tunnel. Therefore, our results demonstrated that Mg screws inhibited graft degradation and improved the biomechanical properties of the tendon graft during the early phase of graft healing, highlighting their potential in ACL reconstruction.
Extracorporeal life support in preoperative and postoperative heart transplant management.
Bermudez, Christian A; McMullan, D Michael
2017-10-01
Increased experience with extracorporeal life support (ECLS) as a mode of cardiac support has expanded its use to diverse patient populations including patients requiring a bridge to heart transplantation and patients requiring posttransplant support for primary graft dysfunction (PGD). The use of ECLS is associated with acceptable outcomes in well-selected patients. While outcomes with the use of extracorporeal membrane oxygenation (ECMO) as a bridge to heart transplant have been variable, several series have confirmed the safe use of ECLS to stabilize patients prior to left ventricular assist device (LVAD) implantation. These patients are then considered later, when in stable condition, for heart transplant. When ECLS is used prior to heart transplant, mortality is greatest during the first 6 months posttransplant. Patients who are alive 6 months after transplant appear to have similar survival rates as patients who were not supported with ECLS prior to transplant. ECLS support is a reliable therapeutic option for severe PGD and early graft failure after heart transplantation. In patients who require support for severe PGD, venoarterial-ECMO appears to result in better clinical outcomes than LVAD support. ECLS use for PGD after heart transplant continues to be the first line of support. Further studies are necessary to understand the optimal role of ECLS in heart transplantation.
Trailin, Andriy V; Ostapenko, Tetyana I; Nykonenko, Tamara N; Nesterenko, Svitlana N; Nykonenko, Olexandr S
2017-01-01
We aimed to determine whether serum soluble CD30 (sCD30) could identify recipients at high risk for unfavorable early and late kidney transplant outcomes. Serum sCD30 was measured on the day of kidney transplantation and on the 4th day posttransplant. We assessed the value of these measurements in predicting delayed graft function, slow graft function (SGF), acute rejection (AR), pyelonephritis, decline of allograft function after 6 months, and graft and patient survival during 5 years of follow-up in 45 recipients. We found an association between low pretransplant serum levels of sCD30 and SGF. The absence of a significant decrease in sCD30 on the 4th day posttransplant was characteristic of SGF, early AR (day 8 to 6 months), late AR (>6 months), and early pyelonephritis (day 8 to 2 months). Lower pretransplant and posttransplant sCD30 predicted worse allograft function at 6 months and 2 years, respectively. Higher pretransplant sCD30 was associated with a higher frequency of early AR and worse patient survival, but only in recipients of deceased-donor grafts. Pretransplant sCD30 also allowed differentiation between patients with early pyelonephritis and those with early AR. Peritransplant sCD30 is useful in identifying patients at risk for unfavorable early and late transplant outcomes.
Feasibility of using marginal liver grafts in living donor liver transplantation
Lan, Xiang; Zhang, Hua; Li, Hong-Yu; Chen, Ke-Fei; Liu, Fei; Wei, Yong-Gang; Li, Bo
2018-01-01
Liver transplantation (LT) is one of the most effective treatments for end-stage liver disease caused by related risk factors when liver resection is contraindicated. Additionally, despite the decrease in the prevalence of hepatitis B virus (HBV) over the past two decades, the absolute number of HBsAg-positive people has increased, leading to an increase in HBV-related liver cirrhosis and hepatocellular carcinoma. Consequently, a large demand exists for LT. While the wait time for patients on the donor list is, to some degree, shorter due to the development of living donor liver transplantation (LDLT), there is still a shortage of liver grafts. Furthermore, recipients often suffer from emergent conditions, such as liver dysfunction or even hepatic encephalopathy, which can lead to a limited choice in grafts. To expand the pool of available liver grafts, one option is the use of organs that were previously considered “unusable” by many, which are often labeled “marginal” organs. Many previous studies have reported on the possibilities of using marginal grafts in orthotopic LT; however, there is still a lack of discussion on this topic, especially regarding the feasibility of using marginal grafts in LDLT. Therefore, the present review aimed to summarize the feasibility of using marginal liver grafts for LDLT and discuss the possibility of expanding the application of these grafts. PMID:29930466
Biscotti, Mauer; Yang, Jonathan; Sonett, Joshua; Bacchetta, Matthew
2014-11-01
This study compared differences in patient outcomes and operative parameters for extracorporeal membrane oxygenation (ECMO) versus cardiopulmonary bypass (CPB) in patients undergoing lung transplants. Between January 1, 2008, and July 13, 2013, 316 patients underwent lung transplants at our institution, 102 requiring intraoperative mechanical cardiopulmonary support (CPB, n=55; ECMO, n=47). We evaluated survival, blood product transfusions, bleeding complications, graft dysfunction, and rejection. Intraoperatively, the CPB group required more cell saver volume (1123±701 vs 814±826 mL; P=.043), fresh-frozen plasma (3.64±5.0 vs 1.51±3.2 units; P=.014), platelets (1.38±1.6 vs 0.43±1.25 units; P=.001), and cryoprecipitate (4.89±6.3 vs 0.85±2.8 units; P<.001) than the ECMO group. Postoperatively, the CPB group received more platelets (1.09±2.6 vs 0.13±0.39 units; P=.013) and was more likely to have bleeding (15 [27.3%] vs 3 [6.4%]; P=.006) and reoperation (21 [38.2%] vs 7 [14.9%]; P=.009]. The CPB group had higher rates of primary graft dysfunction at 24 and 72 hours (41 [74.5%] vs 23 [48.9%]; P=.008; and 42 [76.4%] vs 26 [56.5%]; P=.034; respectively). There were no differences in 30-day and 1-year survivals. Relative to CPB, the ECMO group required fewer transfusions and had less bleeding, fewer reoperations, and less primary graft dysfunction. There were no statistically significant survival differences at 30 days or 1 year. Copyright © 2014 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
Kim, Bong-Wan; Xu, Weiguang; Wang, Hee-Jung; Park, Yong-Keun; Lee, Kwangil; Kim, Myung-Wook
2011-09-01
To determine the feasibility of volumetric criteria without anatomic exclusion for the selection of right posterior sector (RPS) grafts for adult-to-adult living donor liver transplantation (LDLT), we reviewed and compared our transplant data for RPS grafts and right lobe (RL) grafts. Between January 2008 and September 2010, adult-to-adult LDLT was performed 65 times at our institute; 13 of the procedures (20%) were performed with RPS grafts [the posterior sector (PS) group], and 39 (60%) were performed with RL grafts (the RL group). The volumetry of the 13 RPS donor livers showed that the RPS volume was 39.8% ± 7.6% of the total liver volume. Ten of the 13 donors had to donate RPS grafts because the left liver volume was inadequate. All donor procedures were performed successfully, and all donors recovered from hepatectomy. However, longer operative times were required for the procurement of RPS grafts versus RL grafts (418 ± 40 versus 345 ± 48 minutes, P < 0.001). The postoperative recovery of liver function was smoother for the donors of the PS group versus the donors of the RL group. The RPS grafts had significantly smaller hepatic artery and bile duct openings than the RL grafts. All recipients with RPS grafts survived LDLT. No recipients experienced vascular graft complications or small-for-size graft dysfunction. There were no significant differences in the incidence of posttransplant complications between the donors and recipients of the PS and RL groups. The 3-year graft survival rates were favorable in both groups (100% in the PS group versus 91% in the RL group). In conclusion, the selection of RPS grafts by volume criteria is a feasible strategy for an adult-to-adult LDLT program. Copyright © 2011 American Association for the Study of Liver Diseases.
Handa, Takemi; Orihashi, Kazumasa; Nishimori, Hideaki; Yamamoto, Masaki
2016-11-01
Maximal graft flow acceleration (max df/dt) determined using transit-time flowmetry (TTFM) in the diastolic phase was assessed as a potential predictor of graft failure for aortocoronary artery (AC) bypass grafts in coronary artery bypass patients. Max df/dt was retrospectively measured in 114 aortocoronary artery bypass grafts. TTFM data were fitted to a 9th-order polynomial curve, and the first-derivative curve obtained from it was used to measure max df/dt (9-polynomial max df/dt). Abnormal TTFM was defined as a mean flow of <15 ml/min, a pulsatility index of >5, or a diastolic filling ratio of <50%. Postoperative assessments were routinely performed by coronary artery angiography (CAG) at 1 year after surgery. Using TTFM, 68 grafts were normal, 4 of which were failing on CAG, and 46 grafts were abnormal, 21 of which were failing on CAG. The 9-polynomial max df/dt was significantly lower in the abnormal-TTFM/failing-on-CAG group than in the abnormal-TTFM/patent-on-CAG group (1.08 ± 0.89 vs. 2.05 ± 1.51 ml/s², respectively; P < 0.01, Mann-Whitney U test, Holm adjustment). TTFM 9-polynomial max df/dt in the early diastolic phase may be a promising predictor of future graft failure for AC bypass grafts, particularly in grafts with abnormal TTFM.
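The calculation described in this abstract (a polynomial fit of the TTFM waveform followed by evaluation of its first derivative in early diastole) can be illustrated with a short sketch. This is not the authors' code; the sampled waveform, the diastolic window, and the units are hypothetical.

```python
# Illustrative sketch only: estimating maximal graft flow acceleration (max df/dt)
# from a transit-time flowmetry (TTFM) trace via a 9th-order polynomial fit.
import numpy as np

t = np.linspace(0.0, 0.8, 200)                  # one cardiac cycle, seconds (assumed)
flow = 20 + 15 * np.sin(2 * np.pi * t / 0.8)    # synthetic TTFM flow trace, ml/min

coeffs = np.polyfit(t, flow, deg=9)             # 9th-order polynomial fit of the waveform
dflow = np.polyder(np.poly1d(coeffs))           # first-derivative curve (ml/min per s)

# restrict the search to an assumed early-diastolic window (0.35-0.60 s)
diastole = (t >= 0.35) & (t <= 0.60)
max_dfdt = dflow(t[diastole]).max()
print(f"max df/dt in early diastole: {max_dfdt:.2f} ml/min/s")
```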
Müller-Horvat, C; Schick, F; Claussen, C D; Grönewäller, E
2004-12-01
To evaluate the suitability of different MR sequences for monitoring the stage of maturation of hyaline cartilage grafts in the knee joint and the early detection of complications such as hypertrophy. In addition, it was analyzed whether indirect MR arthrography can indicate debonding of the graft. MRI examinations were performed in 19 patients, aged 17 - 48 years, with autologous transplantation of a hyaline cartilage tissue graft after knee trauma. Examinations were performed prior to transplantation to localize the defect, and 6 weeks, 3, 6 and 12 months after transplantation to monitor the morphology and maturation of the autologous graft. Standard T2- and proton-density-weighted turbo spin echo (TSE) sequences and T1-weighted spin echo (SE) sequences were used, as well as gradient echo (GRE) sequences with and without magnetization transfer (MT) prepulses. In some cases, indirect MR arthrography was performed. The cartilage defect and the hyaline cartilage graft could be detected in all 19 patients. Hypertrophy of the graft was found early in 3 patients and debonding in 1 patient. For depicting the graft a short time after surgery, T2-weighted TSE sequences showed the best results. Six and 12 months after transplantation, spoiled 3D GRE sequences such as FLASH3D (fast low angle shot) showed reduced artifacts due to magnetic residues from the surgery. Difference images from GRE sequences with and without MT pulse provided high contrast between cartilage and surrounding tissue. The quantification of the MT effect showed an assimilation of the graft to the original cartilage within 12 months. Indirect MR arthrography showed subchondral contrast medium even 12 months after transplantation in 3 patients. MRI allows a reliable depiction of the hyaline graft and provides very early detection of complications such as hypertrophy. The MT effect seems to be correlated with maturation of the graft and allows selective depiction of normal cartilage and engrafted cartilage.
Czigany, Zoltan; Schöning, Wenzel; Ulmer, Tom Florian; Bednarsch, Jan; Amygdalos, Iakovos; Cramer, Thorsten; Rogiers, Xavier; Popescu, Irinel; Botea, Florin; Froněk, Jiří; Kroy, Daniela; Koch, Alexander; Tacke, Frank; Trautwein, Christian; Tolba, Rene H; Hein, Marc; Koek, Ger H; Dejong, Cornelis H C; Neumann, Ulf Peter; Lurje, Georg
2017-10-10
Orthotopic liver transplantation (OLT) has emerged as the mainstay of treatment for end-stage liver disease. In an attempt to improve the availability of donor allografts and reduce waiting list mortality, graft acceptance criteria were extended increasingly over the decades. The use of extended criteria donor (ECD) allografts is associated with a higher incidence of primary graft non-function and/or delayed graft function. As such, several strategies have been developed aiming at reconditioning poor quality ECD liver allografts. Hypothermic oxygenated machine perfusion (HOPE) has been successfully tested in preclinical experiments and in a few clinical series of donation after cardiac death OLT. HOPE ECD-DBD is an investigator-initiated, open-label, phase-II, prospective multicentre randomised controlled trial on the effects of HOPE on ECD allografts in donation after brain death (DBD) OLT. Human whole organ liver grafts will be submitted to 1-2 hours of HOPE (n=23) via the portal vein before implantation and will be compared with a control group (n=23) of patients transplanted after conventional cold storage. Primary (peak and Δ peak alanine aminotransferase within 7 days) and secondary (aspartate aminotransferase, bilirubin and international normalised ratio, postoperative complications, early allograft dysfunction, duration of hospital and intensive care unit stay, 1-year patient and graft survival) endpoints will be analysed within a 12-month follow-up. The extent of ischaemia-reperfusion (I/R) injury will be assessed using liver tissue, perfusate, bile and serum samples taken during the perioperative phase of OLT. The study was approved by the institutional review board of the RWTH Aachen University, Aachen, Germany (EK 049/17). The current paper represents the pre-results phase. First results are expected in 2018. NCT03124641. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Costochondral grafts replacing the mandibular condyle.
Ross, R B
1999-07-01
The purpose of this study was to determine the success rate of costochondral bone grafts used to replace absent or nonfunctioning temporomandibular joints and the subsequent growth of these grafts when placed in young children. This is a retrospective study of all cases with adequate follow-up records that were treated at the Craniofacial Centre at Toronto's The Hospital for Sick Children from 1974-1986. A total of 55 patients were evaluated, of whom 13 were growing children. The findings suggest that there was increased success when surgery was performed at an early age. Poorer results were achieved when previous surgery had been performed or when pathology was present. Growth of the graft did not always equal the growth of the "normal" side, but in most cases a satisfactory symmetry was achieved. Several cases exhibited excessive overgrowth. Surgery at 4 to 5 years of age will alleviate the impact of a severe facial deformity on the child during the early school years, when self-esteem is fragile and patterns of social interactions are developing. Development of the dentition is better if the jaw relationship is close to normal at an early age. It would appear that early temporomandibular joint (TMJ) construction by costochondral grafting is, at present, the method of choice for severe hemifacial microsomia.
Abdelaziz, Omar; Attia, Hussein
2016-01-01
Living-donor liver transplantation has provided a solution to the severe lack of cadaver grafts for the replacement of livers afflicted with end-stage cirrhosis, fulminant disease, or inborn errors of metabolism. Vascular complications remain the most serious complications and a common cause of graft failure after hepatic transplantation. Doppler ultrasound remains the primary radiological imaging modality for the diagnosis of such complications. This article presents a brief review of intra- and post-operative living donor liver transplantation anatomy and a synopsis of the role of ultrasonography and color Doppler in evaluating graft vascular haemodynamics both during surgery and post-operatively, in order to define early vascular complications accurately. Intra-operative ultrasonography of the liver graft provides the surgeon with useful real-time diagnostic and staging information that may result in an alteration in the planned surgical approach and the correction of surgical complications during the procedure of vascular anastomosis. The relevant intra-operative anatomy and the spectrum of normal and abnormal findings are described. Ultrasonography and color Doppler also alert clinicians and surgeons to potential complications that may develop in the early post-operative period during the hospital stay. Early detection, and thus early problem solving, can make the difference between graft survival and failure. PMID:27468207
Graft-Sparing Strategy for Thoracic Prosthetic Graft Infection.
Uchino, Gaku; Yoshida, Takeshi; Kakii, Bunpachi; Furui, Masato
2018-04-01
Thoracic prosthetic graft infection is a rare but serious complication with no standard management. We report our surgical experience with a graft-sparing strategy for thoracic prosthetic graft infection. This study included patients who underwent graft-sparing surgery for thoracic prosthetic graft infection at Matsubara Tokushukai Hospital in Japan from January 2000 to October 2017. There were 17 patients included in the analyses, with a mean age at surgery of 71.0 ± 10.5 years; 11 were men. In-hospital mortality was observed in five patients (29.4%). Graft-sparing surgery for thoracic prosthetic graft infection is an alternative option, particularly for early graft infection after hemiarch replacement. Georg Thieme Verlag KG Stuttgart · New York.
Mok, Hsiaopei; Feng, Jingwei; Hu, Wansheng; Wang, Jing; Cai, Junrong; Lu, Feng
2018-06-18
Fat grafting is a commonly used procedure; however, the mechanisms that regulate graft outcomes are not clear. Estrogen has been associated with vascularization, inflammation and fat metabolism, yet its role in fat grafting is unclear. Mice were implanted with 17β-estradiol pellets (high estrogen, HE), underwent ovariectomy (low estrogen level, OVX) or sham surgery (normal estrogen level, CON). Forty-five days later, inguinal fat of the mice was autografted subcutaneously. At 1, 2, 4, and 12 weeks post-transplantation, grafts were dissected, weighed, and assessed for histology, angiogenesis and inflammation level. Serum estrogen level corresponded to the estrogen manipulation. Twelve weeks after autografting, the retention rate was significantly higher in the OVX (79% ± 30%) than in the HE (16% ± 8%) and CON (35% ± 13%) groups. OVX grafts had the least necrosis and the most hypertrophic fat. OVX recruited the most pro-inflammatory macrophages and demonstrated a faster dead tissue removal process; however, a higher fibrogenic tendency was found in this group. HE grafts had the most Sca1+ local stem cells and CD31+ capillary content; however, with a low level of acute inflammation and insufficient adipokine PPAR-γ expression, their retention rate was impaired. Elevated serum estrogen increased stem cell density and early vascularization; however, by inhibiting the early inflammation, it resulted in delayed necrotic tissue removal and finally led to impaired adipose restoration. A low estrogen level induced a favorable inflammation status and adipocyte hypertrophy to improve fat graft retention, but a continuing decrease in estrogen level led to fat graft fibrosis. Copyright © 2018 Elsevier Inc. All rights reserved.
Lung Allocation Score: A Single-Center Simulation.
Rosso, L; Palleschi, A; Tosi, D; Mendogni, P; Righi, I; Carrinola, R; Montoli, M; Damarco, F; Rossetti, V; Morlacchi, L C; Nosotti, M
2016-03-01
The lung allocation score (LAS) was introduced in the United States in May 2005 with the main goal of reducing the waiting list mortality of patients with end-stage lung diseases, but also to enhance the lung transplant benefit and improve the management of urgent candidates. Several papers have reported that the LAS resulted in a reduction of waiting list mortality, but no significant survival benefit was noted. We evaluated the usefulness of the LAS as a predictor of lung transplantation outcome in 123 patients listed for lung transplantation in an Italian center. Primary endpoints were waiting list mortality and posttransplant mortality at 1 year; secondary endpoints included perioperative circulatory support, cardiopulmonary bypass, primary graft dysfunction, and long-term survival after transplantation. We observed no correlation between the LAS and waiting list mortality. The LAS did not affect long-term survival in our population. A high LAS was predictive of grade 3 primary graft dysfunction in the first 72 hours after transplantation. Copyright © 2016 Elsevier Inc. All rights reserved.
Sareyyupoglu, Basar; Bhama, Jay; Bonde, Pramod; Thacker, Jnanesh; Bermudez, Christian; Gries, Cynthia; Crespo, Maria; Johnson, Bruce; Pilewski, Joseph; Toyoda, Yoshiya
2011-01-01
Background: Concomitant tricuspid valve repair (TVR) and double lung transplantation (DLTx) has been a surgical option at our institution since 2004 in an attempt to improve the outcome of DLTx for end-stage pulmonary hypertension, severe tricuspid regurgitation, and right ventricle (RV) dysfunction. This study is a review of that single institutional experience. Methods: Consecutive cases of concomitant TVR and DLTx performed between 2004 and 2009 (TVR group, n = 20) were retrospectively compared with cases of DLTx alone for severe pulmonary hypertension without TVR (non-TVR group, n = 58). Results: There was one in-hospital death in the TVR group. The 90-day and 1- and 3-year survival rates for the TVR group were 90%, 75%, and 65%, respectively, which were not significantly different from those for the non-TVR group. The TVR group required less inotropic support and less prolonged mechanical ventilation in the ICU. Follow-up echocardiography demonstrated immediate elimination of both volume and pressure overload in the RV and tricuspid regurgitation in the TVR group. Notably, there was a significantly lower incidence of primary graft dysfunction following transplantation in the TVR group (P < .05). Pulmonary functional improvement shown by an FEV1 increase after 6 months was also significantly better in the TVR group (40% vs 20%, P < .05). Conclusions: Combined TVR and DLTx procedures were successfully performed without an increase in morbidity or mortality and contributed to decreased primary graft dysfunction. In our experience, this combined operative approach achieves clinical outcomes equal or superior to the outcomes seen in DLTx patients without RV dysfunction and severe tricuspid regurgitation. PMID:21700686
Ishida; Wu; Shi; Fujita; Sauvage; Hammond; Wijelath
2000-03-01
Previous studies of neointima formation on Dacron vascular grafts mainly focused on the late stages, using immunohistochemistry staining for von Willebrand factor (vWF) and smooth muscle (SM) alpha-actin. However, it is impossible to use immunohistochemistry to study the early events of neointima formation, because graft samples lack sufficient cellular material. Therefore, we used reverse transcriptase-polymerase chain reaction (RT-PCR) to demonstrate dynamic changes of SM and endothelial markers during the early stages of neointima formation. Preclotted Dacron grafts were implanted in the descending thoracic aorta of 14 mongrel dogs. Specimens were retrieved at 1-4 weeks. Total RNA was extracted from the mid-portion of the graft flow surfaces, and RT-PCR for vWF, SM myosin heavy chain (MHC), and SM alpha-actin was performed, with expression quantified as a ratio to the ribosomal S17 signal. SM MHC and vWF mRNA expression was low at 1-2 weeks but elevated at 3-4 weeks (P < 0.05). However, SM alpha-actin mRNA levels were expressed consistently throughout the study period. At 3-4 weeks, vWF mRNA expression was inversely correlated with thrombus formation on the graft flow surface. Increased expression of SM MHC and vWF mRNA corresponded to the formation of neointima and an endothelial layer at the later stages. However, SM alpha-actin mRNA expression did not vary during the healing process. The application of RT-PCR should permit further studies of gene regulation in the early vascular graft healing process in vivo. This model can also be used to study the molecular events that are involved in SM cell differentiation.
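The normalisation step used above (each marker expressed as a ratio to the S17 signal) amounts to a simple division; a minimal sketch with hypothetical densitometry values is shown below for clarity.

```python
# Minimal sketch (assumed values, not the study's data): expressing RT-PCR signal
# for each marker as a ratio to the ribosomal S17 signal.
markers = {"vWF": 412.0, "SM_MHC": 238.0, "SM_alpha_actin": 655.0}  # hypothetical signal units
s17 = 520.0                                                          # hypothetical S17 signal

ratios = {name: value / s17 for name, value in markers.items()}
for name, ratio in ratios.items():
    print(f"{name}/S17 ratio: {ratio:.2f}")
```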
Use of autologous grafts in the treatment of acquired penile curvature: An experience of 33 cases
Khawaja, Abdul Rouf; Dar, Tanveer Iqbal; Zahur, Suhael; Tariq, Sheikh; Hamid, Arf; Wani, M. S.; Wazir, B. S.; Iqbal, Arsheed
2016-01-01
Aim: The objective was to compare the use of autologous dermal and temporalis fascia grafts in the treatment of acquired penile curvature. Materials and Methods: This was a prospective observational study of 33 cases, conducted at the Sher-i-Kashmir Institute of Medical Sciences, Srinagar, from March 2007 to September 2013. All the patients had stable Peyronie's disease (PD). Dorsal, dorsolateral and ventral curvatures with good preoperative erections were included. The PD index with visual analog scales for curvature was used preoperatively. Informed written consent was obtained from all the patients, with main emphasis on erectile dysfunction. Results: After an average follow-up of 2 years, complete straightening of the penis was observed in all patients, with satisfactory sexual intercourse in 30 patients (90%). Three patients (10%) required frequent use of type 5 phosphodiesterase inhibitors for adequate erections. Overall, 91% of patients and partners were satisfied with the procedure, and cosmetically the donor site was better with the temporalis fascia graft. No rejection of any graft was noted, and glans hypoesthesia was noticed in 4 patients (12%). None of the patients required a penile prosthesis. Total operative time for harvesting and application of the graft was longer for dermal grafts (>3 hrs) than for temporalis fascia grafts (2 hrs). Conclusion: Tunical lengthening procedures using autologous free grafts represent a safe and reproducible technique. Good preoperative erectile function is required for a tunical lengthening procedure. The temporalis fascia graft is a thin, tough membrane and an effective graft for PD, with good cosmetic and functional results. PMID:27141196
Expanding the pool of kidney donors: use of kidneys with acute renal dysfunction
de Matos, Ana Cristina Carvalho; Requião-Moura, Lúcio Roberto; Clarizia, Gabriela; Durão, Marcelino de Souza; Tonato, Eduardo José; Chinen, Rogério; de Arruda, Érika Ferraz; Filiponi, Thiago Corsi; Pires, Luciana Mello de Mello Barros; Bertocchi, Ana Paula Fernandes; Pacheco-Silva, Alvaro
2015-01-01
Given the shortage of organs for transplantation, some strategies have been adopted by the transplant community to increase the supply of organs. One strategy is the use of expanded criteria donors, that is, donors aged >60 years, or aged 50 to 59 years and meeting two or more of the following criteria: history of hypertension, terminal serum creatinine >1.5 mg/dL, and stroke as the donor's cause of death. In this review, emphasis was placed on the use of donors with acute renal failure, a condition considered by many as a contraindication for organ acceptance and therefore one of the main causes of kidney discard. Since these are well-selected donors without chronic diseases, such as hypertension, renal disease, or diabetes, many studies have shown that the use of donors with acute renal failure should be encouraged, because, in general, acute renal dysfunction is reversible. Although most studies demonstrated that these grafts have more delayed function, graft and patient survival after transplant are very similar to those obtained with standard donors. Clinical and morphological findings of donors, the use of machine perfusion, and analysis of its parameters, especially intrarenal resistance, are important tools to support decision-making when considering the use of organs with renal dysfunction. PMID:26154553
[Estimation of soluble serum CD30 in the diagnosis of early renal allograft dysfunction].
Trailin, A V
2009-10-01
We aimed to reveal factors influencing the serum soluble CD30 level in recipients of kidney allografts and to estimate its pathogenetic significance. We tested the sCD30 level in serum before and on the 4th day after the operation by ELISA. It was established that sCD30 levels before transplantation were virtually the same in patients who experienced rejection and in non-rejecting patients. However, there was a significant decrease in the level of sCD30 after transplantation in non-rejecting patients, in contrast to rejecting patients. A significant decrease of the sCD30 level was detected on the 4th day after transplantation independently of dialysis requirement. The decrease of sCD30 on the 4th day after the operation in patients with delayed graft function, and its stability in patients with acute rejection, may be used to distinguish between these complications.
Ostapenko, Tetyana I.; Nykonenko, Tamara N.; Nesterenko, Svitlana N.; Nykonenko, Olexandr S.
2017-01-01
Background We aimed to determine whether serum soluble CD30 (sCD30) could identify recipients at high risk for unfavorable early and late kidney transplant outcomes. Methods Serum sCD30 was measured on the day of kidney transplantation and on the 4th day posttransplant. We assessed the value of these measurements in predicting delayed graft function, slow graft function (SGF), acute rejection (AR), pyelonephritis, decline of allograft function after 6 months, and graft and patient survival during 5 years of follow-up in 45 recipients. Results We found an association between low pretransplant serum levels of sCD30 and SGF. The absence of a significant decrease in sCD30 on the 4th day posttransplant was characteristic of SGF, early AR (the 8th day–6 months), late AR (>6 months), and early pyelonephritis (the 8th day–2 months). Lower pretransplant and posttransplant sCD30 predicted worse allograft function at 6 months and 2 years, respectively. Higher pretransplant sCD30 was associated with a higher frequency of early AR and worse patient survival, but only in recipients of deceased-donor grafts. Pretransplant sCD30 also allowed differentiation between patients with early pyelonephritis and early AR. Conclusions Peritransplant sCD30 is useful in identifying patients at risk for unfavorable early and late transplant outcomes. PMID:28694560
Donor lung assessment using selective pulmonary vein gases.
Costa, Joseph; Sreekanth, Sowmyashree; Kossar, Alex; Raza, Kashif; Lederer, David J; Robbins, Hilary; Shah, Lori; Sonett, Joshua R; Arcasoy, Selim; D'Ovidio, Frank
2016-11-01
Standard donor lung assessment relies on imaging, challenge gases and subjective interpretation of bronchoscopic findings, palpation and visual assessment. Central gases may not accurately represent true quality of the lungs. We report our experience using selective pulmonary vein gases to corroborate the subjective judgement. Starting, January 2012, donor lungs have been assessed by intraoperative bronchoscopy, palpation and visual judgement of lung collapse upon temporary disconnection from ventilator, central gases from the aorta and selective pulmonary vein gases. Partial pressure of oxygen (pO 2 ) <300 mmHg on FiO 2 of 1.0 was considered low. The results of the chest X-ray and last pO 2 in the intensive care unit were also collected. Post-transplant primary graft dysfunction and survival were monitored. To date, 259 consecutive brain-dead donors have been assessed and 157 transplants performed. Last pO 2 in the intensive care unit was poorly correlated with intraoperative central pO 2 (Spearman's rank correlation r s = 0.29). Right inferior pulmonary vein pO 2 was associated (Mann-Whitney, P < 0.001) with findings at bronchoscopy [clean: median pO 2 443 mmHg (25th-75th percentile range 349-512) and purulent: 264 mmHg (178-408)]; palpation [good: 463 mmHg (401-517) and poor: 264 mmHg (158-434)] and visual assessment of lung collapse [good lung collapse: 429 mmHg (320-501) and poor lung collapse: 205 mmHg (118-348)]. Left inferior pulmonary pO 2 was associated (P < 0.001) with findings at bronchoscopy [clean: 419 mmHg (371-504) and purulent: 254 mmHg (206-367)]; palpation [good: 444 mmHg (400-517) and poor 282 mmHg (211-419)] and visual assessment of lung collapse [good: 420 mmHg (349-496) and poor: 246 mmHg (129-330)]. At 72 h, pulmonary graft dysfunction 2 was in 21/157 (13%) and pulmonary graft dysfunction 3 in 17/157 (11%). Ninety-day and 1-year mortalities were 6/157 (4%) and 13/157 (8%), respectively. Selective pulmonary vein gases provide corroborative objective support to the findings at bronchoscopy, palpation and visual assessment. Central gases do not always reflect true function of the lungs, having high false-positive rate towards the individual lower lobe gas exchange. Objective measures of donor lung function may optimize donor surgeon assessment, allowing for low pulmonary graft dysfunction rates and low 90-day and 1-year mortality. © The Author 2016. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
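The two statistics quoted above (a Spearman rank correlation and a Mann-Whitney comparison) are standard tests; the sketch below shows how they can be computed with SciPy on hypothetical data, purely for illustration.

```python
# Illustrative sketch only (synthetic data, not the donor cohort): Spearman rank
# correlation between ICU and intraoperative central pO2, and a Mann-Whitney test
# comparing pulmonary-vein pO2 by bronchoscopy findings.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
icu_po2 = rng.normal(400, 80, 60)                        # last ICU pO2, mmHg (assumed)
central_po2 = 0.3 * icu_po2 + rng.normal(300, 90, 60)    # intraoperative central pO2 (assumed)
rho, p_rho = stats.spearmanr(icu_po2, central_po2)

clean = rng.normal(440, 90, 40)      # pulmonary-vein pO2, clean bronchoscopy (assumed)
purulent = rng.normal(270, 100, 25)  # purulent bronchoscopy (assumed)
u, p_u = stats.mannwhitneyu(clean, purulent, alternative="two-sided")

print(f"Spearman rho = {rho:.2f} (p = {p_rho:.3f})")
print(f"Mann-Whitney U = {u:.0f} (p = {p_u:.4f})")
```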
Kim, Jin Sug; Kim, Weon; Park, Ji Yoon; Woo, Jong Shin; Lee, Tae Won; Ihm, Chun Gyoo; Kim, Yang Gyun; Moon, Ju-Young; Lee, Sang Ho; Jeong, Myung Ho; Jeong, Kyung Hwan
2017-01-01
Lipid lowering therapy is widely used for the prevention of cardiovascular complications after acute myocardial infarction (AMI). However, some studies show that this benefit is uncertain in patients with renal dysfunction, and that the role of statins depends on the severity of renal dysfunction. In this study, we investigated the impact of statin therapy on major adverse cardiac events (MACEs) and all-cause mortality in patients with advanced renal dysfunction undergoing percutaneous coronary intervention (PCI) after AMI. This study was based on the Korea Acute Myocardial Infarction Registry database. We included 861 patients with advanced renal dysfunction from among 33,205 patients who underwent PCI after AMI between November 2005 and July 2012. Patients were divided into two groups: a statin group (n = 537) and a no-statin group (n = 324). We investigated the 12-month MACEs (cardiac death, myocardial infarction, repeated PCI or coronary artery bypass grafting) and all-cause mortality of each group. Subsequently, a propensity score-matched analysis was performed. In the total population studied, no significant differences were observed between the two groups with respect to the rate of recurrent MI, repeated PCI, coronary artery bypass grafting (CABG), or all-cause mortality. However, the cardiac death rate was significantly lower in the statin group (p = 0.009). Propensity score-matched analysis yielded 274 pairs, demonstrating results similar to those obtained from the total population. However, there was no significant difference in the cardiac death rate in the propensity score-matched population (p = 0.103). Cox regression analysis revealed only left ventricular ejection fraction to be an independent predictor of 12-month MACEs (hazard ratio [HR] 0.979, 95% confidence interval [CI] 0.962-0.996, p = 0.018). Statin therapy was not significantly associated with a reduction in the 12-month MACEs or all-cause mortality in patients with advanced renal dysfunction undergoing PCI after AMI.
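The general analysis pattern described above (propensity-score matching of statin versus no-statin patients followed by a Cox model) can be sketched as follows. This is a hedged illustration, not the registry analysis code; the data frame, column names, and caliper are assumptions.

```python
# Hedged sketch: 1:1 nearest-neighbour propensity-score matching, then a Cox model
# for time to a 12-month MACE. The data frame `df` and its columns are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

def ps_match(df, treat_col, covariates, caliper=0.05):
    """Match each treated patient to the nearest untreated patient on the propensity score."""
    model = LogisticRegression(max_iter=1000).fit(df[covariates], df[treat_col])
    df = df.assign(ps=model.predict_proba(df[covariates])[:, 1])
    treated = df[df[treat_col] == 1]
    control = df[df[treat_col] == 0].copy()
    pairs = []
    for _, row in treated.iterrows():
        dist = (control["ps"] - row["ps"]).abs()
        if len(dist) and dist.min() <= caliper:
            j = dist.idxmin()
            pairs.extend([row, control.loc[j]])
            control = control.drop(index=j)
    return pd.DataFrame(pairs)

# Assumed columns: statin (0/1), age, lvef, dm, followup_months, mace (0/1)
# matched = ps_match(df, "statin", ["age", "lvef", "dm"])
# cph = CoxPHFitter().fit(matched[["statin", "lvef", "followup_months", "mace"]],
#                         duration_col="followup_months", event_col="mace")
# cph.print_summary()   # hazard ratios with 95% CIs, analogous to those reported above
```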
Liu, Qinlong; Rehman, Hasibur; Krishnasamy, Yasodha; Schnellmann, Rick G; Lemasters, John J; Zhong, Zhi
2015-07-01
Inclusion of liver grafts from cardiac death donors (CDD) would increase the availability of donor livers but is hampered by a higher risk of primary non-function. Here, we seek to determine mechanisms that contribute to primary non-function of liver grafts from CDD with the goal to develop strategies for improved function and outcome, focusing on c-Jun-N-terminal kinase (JNK) activation and mitochondrial depolarization, two known mediators of graft failure. Livers explanted from wild-type, inducible nitric oxide synthase knockout (iNOS(-/-)), JNK1(-/-) or JNK2(-/-) mice after 45-min aorta clamping were implanted into wild-type recipients. Mitochondrial depolarization was detected by intravital confocal microscopy in living recipients. After transplantation of wild-type CDD livers, graft iNOS expression and 3-nitrotyrosine adducts increased, but hepatic endothelial NOS expression was unchanged. Graft injury and dysfunction were substantially higher in CDD grafts than in non-CDD grafts. iNOS deficiency and inhibition attenuated injury and improved function and survival of CDD grafts. JNK1/2 and apoptosis signal-regulating kinase-1 activation increased markedly in wild-type CDD grafts, which was blunted by iNOS deficiency. JNK inhibition and JNK2 deficiency, but not JNK1 deficiency, decreased injury and improved function and survival of CDD grafts. Mitochondrial depolarization and binding of phospho-JNK2 to Sab, a mitochondrial protein linked to the mitochondrial permeability transition, were higher in CDD than in non-CDD grafts. iNOS deficiency, JNK inhibition and JNK2 deficiency all decreased mitochondrial depolarization and blunted ATP depletion in CDD grafts. JNK inhibition and deficiency did not decrease 3-nitrotyrosine adducts in CDD grafts. The iNOS-JNK2-Sab pathway promotes CDD graft failure via increased mitochondrial depolarization, and is an attractive target to improve liver function and survival in CDD liver transplantation recipients. Copyright © 2015 European Association for the Study of the Liver. Published by Elsevier B.V. All rights reserved.
Early Subretinal Allograft Rejection Is Characterized by Innate Immune Activity.
Kennelly, Kevin P; Holmes, Toby M; Wallace, Deborah M; O'Farrelly, Cliona; Keegan, David J
2017-06-09
Successful subretinal transplantation is limited by considerable early graft loss despite pharmacological suppression of adaptive immunity. We postulated that early innate immune activity is a dominant factor in determining graft survival and chose a nonimmunosuppressed mouse model of retinal pigment epithelial (RPE) cell transplantation to explore this. Expression of almost all measured cytokines by DH01 RPE cells increased significantly following graft preparation, and the neutrophil chemoattractant KC/GRO/CINC was most significantly increased. Subretinal allografts of DH01 cells (C57BL/10 origin) into healthy, nonimmunosuppressed C57BL/6 murine eyes were harvested and fixed at 1, 3, 7, and 28 days postoperatively and subsequently cryosectioned and stained. Graft cells were detected using SV40 large T antigen (SV40T) immunolabeling and apoptosis/necrosis by terminal deoxynucleotidyl transferase dUTP nick-end labeling (TUNEL). Sections were also immunolabeled for macrophage (CD11b and F4/80), neutrophil (Gr1 Ly-6G), and T-lymphocyte (CD3-ɛ) infiltration. Images captured with an Olympus FV1000 confocal microscope were analyzed using the Imaris software. The proportion of the subretinal bolus comprising graft cells (SV40T+) was significantly (p < 0.001) reduced between postoperative day (POD) 3 (90 ± 4%) and POD 7 (20 ± 7%). CD11b+, F4/80+, and Gr1 Ly-6G+ cells increased significantly (p < 0.05) from POD 1 and predominated over SV40T+ cells by POD 7. Colabeling confocal microscopic analysis demonstrated graft engulfment by neutrophils and macrophages at POD 7, and reconstruction of z-stacked confocal images confirmed SV40T inside Gr1 Ly-6G+ cells. Expression of CD3-ɛ was low and did not differ significantly between time points. By POD 28, no graft cells were detectable and few inflammatory cells remained. These studies reveal, for the first time, a critical role for innate immune mechanisms early in subretinal graft rejection. The future success of subretinal transplantation will require more emphasis on techniques to limit innate immune-mediated graft loss, rather than focusing exclusively on suppression of the adaptive immune response.
Full thickness skin grafts in periocular reconstructions: long-term outcomes.
Rathore, Deepa S; Chickadasarahilli, Swaroop; Crossman, Richard; Mehta, Purnima; Ahluwalia, Harpreet Singh
2014-01-01
To evaluate the outcomes of eyelid reconstruction in patients who underwent full thickness skin grafts. A retrospective, noncomparative intervention study of patients who underwent periocular reconstruction with full thickness skin grafts between 2005 and 2011. One hundred consecutive Caucasian patients were included in the study, 54 women and 46 men. Mean follow up was 32 months. Indications for full thickness skin grafts were excision of eyelid tumors (98%) and cicatricial ectropion (2%). Sites of the lid defects were the lower lid (60%), medial canthus (32%), upper lid (6%), and lateral canthus (2%). The skin graft donor sites were supraclavicular (44%), upper eyelid (24%), inner brachial (18%), and postauricular (14%). Early postoperative complications included lower eyelid graft contracture (1%) and partial failure (1%). Late sequelae included lower eyelid graft contracture (4%) and hypertrophic scarring (23%). Of the 23 patients with hypertrophic scar, 21 achieved good outcomes following massage with silicone gel and steroid ointment and 2 had persistent moderate lumpiness. No statistically significant association was found between graft hypertrophy and donor site or graft size. Overall, 95% of patients achieved a good final eyelid position. A good color match was seen in 94% and graft hypopigmentation in 6%. An association between hypopigmentation and the supraclavicular and inner brachial donor sites was found to be statistically significant. Most patients (94%) achieved a good eyelid position and color match. The majority (91%) of the early postoperative cicatricial sequelae can be reversed by massage, steroid ointment, and silicone gel application. Full thickness skin grafts have excellent graft survival rates and minimal donor site morbidity.
Rescue from dwarfism by thyroid function compensation in rdw rats.
Furudate, Sen-ichi; Ono, Masao; Shibayama, Keiko; Ohyama, Yoshihide; Kuwada, Masahiro; Kimura, Toshimi; Kameya, Toru
2005-10-01
The rdw rat was initially reported as having hereditary dwarfism caused by pituitary dysfunction. Subsequent studies on the rdw rat, however, have demonstrated that the primary cause of rdw dwarfism lies in the thyroid gland and not in the pituitary gland. The primary cause of the rdw rat disorders is a missense mutation of the thyroglobulin (Tg) gene arising from a one-point mutation. In the present study, we attempted to rescue the dwarfism of rdw rats using a diet supplemented with thyroid powder (T-powder) and a thyroid graft (T-graft). Infant rdw rats were successfully raised to mature-stage body weight, accompanied by elevation of serum growth hormone (GH) and prolactin (PRL), by the T-powder. Furthermore, the T-graft successfully increased body weight and restored fertility. The serum GH and PRL levels in T-graft rdw rats increased significantly. The serum thyroid-stimulating hormone (TSH) levels in T-graft rdw rats decreased significantly but remained significantly higher than those in control rats. The GH and PRL mRNA expression in rdw rats with the T-graft was virtually the same as that of controls, but the TSH beta mRNA differed from that of control rats. Thus, the dwarfism in the rdw rat is rescued by thyroid function compensation, such as that afforded by T-powder and T-graft.
Heart retransplantation: a 23-year single-center clinical experience.
Schnetzler, B; Pavie, A; Dorent, R; Camproux, A C; Leger, P; Delcourt, A; Gandjbakhch, I
1998-04-01
The main causes of allograft failure after cardiac transplantation are primary graft dysfunction, intractable acute rejection, and coronary graft disease. Despite the important progress in the last several years in graft preservation, surgical techniques, immunosuppression, and treatment of coronary graft disease, retransplantation in selected cases is the only way to achieve long-term recipient survival. We compare here, in a case-control study, 24 retransplantations with 47 first transplants in patients matched for date of transplantation. Between 1973 and 1996, 1,063 patients underwent cardiac transplantation in our institution. In this cohort, 22 patients had a total of 24 retransplantations (2 second-time retransplantations). The causes of retransplantation were primary graft failure (n=4), acute rejection (n=7), coronary graft disease (n=11), and miscellaneous (n=2). Survival at 1 and 5 years was 45.5% and 31.2% for patients with retransplantation and 59.4% and 38.8% for control patients (p=0.07). An interval between first transplantation and retransplantation shorter (n=11) or longer (n=13) than 1 year was associated with a 1-year survival of 27.3% and 61.5% and a 4-year survival of 27.3% and 46%, respectively (not significant). Intervals shorter than 1 year between first transplantation and retransplantation were exclusively secondary to primary graft failure or intractable acute rejection. In the face of the shortage of donor grafts, these and other data indicate that retransplantation should be considered cautiously, especially when the interval between the first transplantation and retransplantation is short.
Inverse cutting of posterior lamellar corneal grafts by a femtosecond laser.
Hjortdal, Jesper; Nielsen, Esben; Vestergaard, Anders; Søndergaard, Anders
2012-01-01
Posterior lamellar grafting of the cornea has become the preferred technique for treatment of corneal endothelial dysfunction. Posterior lamellar grafts are usually cut by a micro-keratome or a femto-second laser after the epithelial side of the donor cornea has been applanated. This approach often results in variable central graft thickness in different grafts and an increase in graft thickness towards the periphery in every graft. The purpose of this study was to evaluate if posterior lamellar grafts can be prepared from the endothelial side by a femto-second laser, resulting in reproducible, thin grafts of even thickness. A CZM 500 kHz Visumax femto-second laser was used. Organ cultured donor grafts were mounted in an artificial anterior chamber with the endothelial side up and out. Posterior grafts of 7.8 mm diameter and 130 micron thickness were prepared by femto-second laser cutting. A standard DSAEK procedure was performed in 10 patients with Fuchs endothelial dystrophy. Patients were followed up regularly and evaluated by measurement of complications, visual acuity, corneal thickness (Pentacam HR), and endothelial cell density. Femto-laser cutting of grafts and surgery was uncomplicated. Rebubbling was necessary in 5 of 10 cases (normally only in 1 of 20 cases). All grafts were attached and cleared up during the first few weeks. After six months, the average visual acuity was 0.30 (range: 0.16 to 0.50), corneal thickness was 0.58 mm (range 0.51 to 0.63), and endothelial cell density was 1,570 cells per sq. mm (range: 1,400 to 2,000 cells per sq. mm). The grafts were of uniform thickness, but substantial interface haze was present in most grafts. Posterior lamellar corneal grafts can be prepared from the endothelial side using a femto-second laser. All grafts were clear after 6 months with satisfying endothelial cell counts. Poor visual acuity caused by interface scatter was observed in most patients. Femto-second laser cutting parameters need to be optimised to enable smooth cutting in the posterior stroma.
Inverse Cutting of Posterior Lamellar Corneal Grafts by a Femtosecond Laser
Hjortdal, Jesper; Nielsen, Esben; Vestergaard, Anders; Søndergaard, Anders
2012-01-01
Purpose Posterior lamellar grafting of the cornea has become the preferred technique for treatment of corneal endothelial dysfunction. Posterior lamellar grafts are usually cut by a micro-keratome or a femto-second laser after the epithelial side of the donor cornea has been applanated. This approach often results in variable central graft thickness in different grafts and an increase in graft thickness towards the periphery in every graft. The purpose of this study was to evaluate if posterior lamellar grafts can be prepared from the endothelial side by a femto-second laser, resulting in reproducible, thin grafts of even thickness. Methods A CZM 500 kHz Visumax femto-second laser was used. Organ cultured donor grafts were mounted in an artificial anterior chamber with the endothelial side up and out. Posterior grafts of 7.8 mm diameter and 130 micron thickness were prepared by femto-second laser cutting. A standard DSAEK procedure was performed in 10 patients with Fuchs endothelial dystrophy. Patients were followed up regularly and evaluated by measurement of complications, visual acuity, corneal thickness (Pentacam HR), and endothelial cell density. Results Femto-laser cutting of grafts and surgery was uncomplicated. Rebubbling was necessary in 5 of 10 cases (normally only in 1 of 20 cases). All grafts were attached and cleared up during the first few weeks. After six months, the average visual acuity was 0.30 (range: 0.16 to 0.50), corneal thickness was 0.58 mm (range 0.51 to 0.63), and endothelial cell density was 1,570 cells per sq. mm (range: 1,400 to 2,000 cells per sq. mm). The grafts were of uniform thickness, but substantial interface haze was present in most grafts. Conclusions Posterior lamellar corneal grafts can be prepared from the endothelial side using a femto-second laser. All grafts were clear after 6 months with satisfying endothelial cell counts. Poor visual acuity caused by interface scatter was observed in most patients. Femto-second laser cutting parameters need to be optimised to enable smooth cutting in the posterior stroma. PMID:22582107
Song, Myung Gyu; Seo, Tae-Seok; Lee, Chang Hee; Kim, Kyeong Ah; Kim, Jun Suk; Oh, Sang Cheul; Lee, Jae-Kwan
2015-06-01
This study was designed to evaluate the impact of diameter and the existence of multiple side holes along the straight portion of double-J ureteral stents (DJUS) on early dysfunction of stents placed for malignant ureteral strictures. Between April 2007 and December 2011, 141 DJUSs were placed via a percutaneous nephrostomy (PCN) tract in 110 consecutive patients with malignant ureteral strictures. 7F DJUSs with multiple side holes in the straight portion were placed in 58 ureters of 43 patients (Group 1). 8F DJUSs with three side holes in the proximal 2 cm of the straight portion were placed in 83 ureters of 67 patients (Group 2). The incidence of early DJUS dysfunction was compared between the two groups, and nephrostographic findings were evaluated in the cases of early dysfunction. Early dysfunction of the DJUS was noted in 14 of the 58 stents (24.1%) in Group 1, which was significantly higher (p = 0.001) than in Group 2, in which only 1 of the 83 stents (1.2%) showed early dysfunction. Nephrostographic findings of early dysfunction included dilatation of the pelvicalyceal system, filling defects in the ureteral stent, and no passage of contrast media into the urinary bladder. In malignant ureteral strictures, multiple side holes in the straight portion of the 7F DJUS seem to cause early dysfunction. The 8F DJUSs with three side holes in the proximal 2 cm of the straight portion may be superior at preventing early dysfunction.
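The group comparison above (14/58 vs 1/83 early dysfunctions) is the kind of 2x2 contingency test that can be reproduced in a few lines; the sketch below uses Fisher's exact test as one reasonable choice, without implying it is the test the authors used.

```python
# Quick sketch: comparing early stent dysfunction between the two groups (14/58 vs 1/83)
# with Fisher's exact test, shown for illustration only.
from scipy.stats import fisher_exact

table = [[14, 58 - 14],   # Group 1: dysfunction, no dysfunction
         [1, 83 - 1]]     # Group 2: dysfunction, no dysfunction
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.1f}, p = {p_value:.4f}")
```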
Ninety-day mortality and major complications are not affected by use of lung allocation score.
McCue, Jonathan D; Mooney, Josh; Quail, Jacob; Arrington, Amanda; Herrington, Cynthia; Dahlberg, Peter S
2008-02-01
In May 2005 the Organ Procurement Transplant Network (OPTN) and United Network for Organ Sharing (UNOS) implemented the donor lung allocation score (LAS) system to prioritize organ allocation among prospective transplant recipients. The purpose of our study was to determine the impact of LAS implementation on 90-day survival, early complications and incidence of severe primary graft dysfunction (PGD) after the transplant procedure. Early outcomes among 78 patients receiving transplants after the initiation of the scoring system were compared with those of the 78 previous patients. Survival rates at 90 days and 1 year were the primary end-points of the study. Arterial blood-gas measurements were collected for all patients at the time of ICU arrival and at 12, 24 and 48 hours after surgery to determine the distribution of International Society of Heart and Lung Transplant (ISHLT) PGD grade. Major complications within 30 days post-transplant were recorded. We found a small but significant 1-year survival advantage among post-LAS implementation patients, which was largely due to decreased early mortality in comparison to the control cohort. The incidence of ISHLT Grade 3 PGD measured within the first 24 hours after transplant did not differ between groups, nor was there an increase in the rate of major post-operative complications. Implementation of the LAS system has not been associated with an increase in early mortality, immediate PGD or major complications.
Arruda, Thiago; Sukekava, Flávia; de Souza, André B; Rasmusson, Lars; Araújo, Maurício G
2013-07-01
The aim of the present study was to evaluate the effect of the placement of titanium granules in fresh extraction sockets on early bone formation. The mesial roots of the third maxillary premolars of five adult beagle dogs were removed. On one side of the maxilla (Test group) the fresh extraction socket was grafted with titanium granules, while the contra-lateral socket was left non-grafted (Control group). After 1 month of healing, the dogs were euthanized and biopsies were obtained. The healing tissues were described, and histometric measurements were performed to obtain the percentage area occupied by connective tissue, new mineralized bone, bone marrow, and biomaterial particles. After 1 month of healing the findings from the histological examination revealed the titanium graft to be well incorporated into the provisional connective tissue or newly formed woven bone. The histometric measurements showed, however, that less mineralized bone was formed in the Test group than in the Control group. The present study suggests that the use of titanium granules in fresh extraction sockets was conducive to new bone formation. The graft of titanium granules seems, however, to delay the early phase of the healing process. Copyright © 2012 Wiley Periodicals, Inc.
Inverted Lobes Have Satisfactory Functions Compared With Noninverted Lobes in Lung Transplantation.
Kayawake, Hidenao; Chen-Yoshikawa, Toyofumi F; Motoyama, Hideki; Hamaji, Masatsugu; Hijiya, Kyoko; Aoyama, Akihiro; Goda, Yasufumi; Oda, Hiromi; Ueda, Satoshi; Date, Hiroshi
2018-04-01
To overcome the problem of small-for-size grafts in standard living-donor lobar lung transplantation (LDLLT), we developed inverted LDLLT, in which a right lower lobe from 1 donor is implanted as a right graft and another right lower lobe from another donor is implanted as a left graft. We retrospectively analyzed the functions of inverted grafts vs noninverted grafts. Between 2008 and 2015, 64 LDLLTs were performed. Included were 35 LDLLTs whose recipients were adults and were monitored for more than 6 months without developing chronic lung allograft dysfunction. Among them, 65 implanted lobes were eligible for this analysis. There were 31 right lower lobes implanted as right grafts (right-to-right group), 7 right lower lobes implanted as inverted left grafts (right-to-left group), and 27 left lower lobes implanted as left grafts (left-to-left group). We evaluated the graft forced vital capacity (G-FVC) and graft volume of the 65 lobes before and 6 months after LDLLT and compared them among the three groups. Preoperatively, G-FVC in the right-to-left group (1,050 mL) was comparable to that in the right-to-right group (1,177 mL) and better than that in the left-to-left group (791 mL, p < 0.01). Six months after LDLLT, G-FVC in the right-to-left group (1,015 mL) remained comparable to that in the right-to-right group (1,001 mL) and better than that in the left-to-left group (713 mL, p = 0.047). The ratio of graft volume 6 months after LDLLT to the preoperative value was comparable among the groups. The functions of inverted grafts in inverted LDLLTs were satisfactory compared with those of noninverted grafts. Copyright © 2018 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
Amirzargar, Mohammad Ali; Amirzargar, Aliakbar; Basiri, Abbas; Hajilooi, Mehrdad; Roshanaei, Ghodratollah; Rajabi, Gholamreza; Mohammadiazar, Sina; Solgi, Ghasem
2014-01-01
This study aimed to investigate the predictive power of anti-HLA antibodies, sCD30 levels and IgA-anti-Fab autoantibody before and early after transplantation in relation to long-term kidney allograft survival. Pre- and post-transplant sera samples of 59 living-unrelated donor kidney recipients were tested for the above risk factors by enzyme-linked immunosorbent assay. Fifteen of the 59 cases experienced rejection episodes (failure group). Pre- and post-transplant high sCD30 levels were significantly associated with graft failure (P=0.02 and P=0.004) and decreased 4-year graft survival (P=0.009 and P=0.001). A higher frequency of post-transplant HLA class-II antibody in the absence of class-I antibody was observed in the failure group (P=0.007). Patients with post-transplant HLA class-I and class-II antibodies, either alone or in combination, showed significantly lower 4-year graft survival. Recipients with high sCD30 levels in the presence of HLA class-I or class-II antibodies within 2 weeks post-transplant had poor graft survival (P=0.004 and P=0.002, respectively). High levels of post-transplant IgA-anti-Fab antibody were more frequent in functioning-graft patients (P=0.00001), correlated with decreased serum creatinine levels (P=0.01) and were associated with improved graft survival (P=0.008). Our findings indicate the deleterious, mutually dependent effect of early post-transplant HLA antibodies and increased sCD30 levels, and the protective effect of IgA-anti-Fab antibodies, on long-term renal graft outcomes. Copyright © 2013 American Society for Histocompatibility and Immunogenetics. Published by Elsevier Inc. All rights reserved.
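The survival comparisons summarised above (4-year graft survival by sCD30 level) are typically drawn with Kaplan-Meier curves and compared with a log-rank test; a minimal sketch on synthetic data is shown below, purely as an illustration of that analysis style, not of the study data.

```python
# Illustrative sketch (hypothetical follow-up times): Kaplan-Meier estimate and
# log-rank comparison of graft survival in high vs low pre-transplant sCD30 groups.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(1)
t_high = rng.exponential(30, 20).clip(max=48)   # months to failure/censoring, high sCD30 (assumed)
t_low = rng.exponential(80, 39).clip(max=48)    # low sCD30 (assumed)
e_high = (t_high < 48).astype(int)              # 1 = graft failure observed, 0 = censored at 48 months
e_low = (t_low < 48).astype(int)

kmf = KaplanMeierFitter()
kmf.fit(t_high, e_high, label="high sCD30")
print(kmf.survival_function_.tail(1))           # estimated survival at the end of follow-up

result = logrank_test(t_high, t_low, event_observed_A=e_high, event_observed_B=e_low)
print(f"log-rank p = {result.p_value:.3f}")
```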
Ren, Zhigang; Jiang, Jianwen; Lu, Haifeng; Chen, Xinhua; He, Yong; Zhang, Hua; Xie, Haiyang; Wang, Weilin; Zheng, Shusen; Zhou, Lin
2014-10-27
Acute rejection (AR) remains a life-threatening complication after orthotopic liver transplantation (OLT) and there are few available diagnostic biomarkers clinically for AR. This study aims to identify intestinal microbial profile and explore potential application of microbial profile as a biomarker for AR after OLT. The OLT models in rats were established. Hepatic graft histology, ultrastructure, function, and intestinal barrier function were tested. Ileocecal contents were collected for intestinal microbial analysis. Hepatic graft suffered from the ischemia-reperfusion (I/R) injury on day 1, initial AR on day 3, and severe AR on day 7 after OLT. Real-time quantitative polymerase chain reaction results showed that genus Faecalibacterium prausnitzii and Lactobacillus were decreased, whereas Clostridium bolteae was increased during AR. Notably, cluster analysis of denaturing gradient gel electrophoresis (DGGE) profiles showed the 7AR and 3AR groups clustered together with 73.4% similarity, suggesting that intestinal microbiota was more sensitive than hepatic function in responding to AR. Microbial diversity and species richness were decreased during AR. Phylogenetic tree analysis showed that most of the decreased key bacteria belonged to phylum Firmicutes, whereas increased key bacteria belonged to phylum Bacteroidetes. Moreover, intestinal microvilli loss and tight junction damage were noted, and intestinal barrier dysfunction during AR presented a decrease of fecal secretory immunoglobulin A (sIgA) and increase of blood bacteremia, endotoxin, and tumor necrosis factor-α. We dynamically detail intestinal microbial characterization and find a high sensitivity of microbial change during AR after OLT, suggesting that intestinal microbial variation may predict AR in early phase and become an assistant therapeutic target to improve rejection after OLT.
Current outcomes of off-pump coronary artery bypass grafting: evidence from real world practice
2016-01-01
Coronary artery bypass grafting (CABG) can be performed conventionally using cardiopulmonary bypass (CPB) and aortic clamping or on a beating heart (BH) without the use of CPB, the so-called off-pump CABG. Some surgeons, who are proponents of off-pump CABG, preferentially use this technique for the majority of operations, whereas others use it only in certain situations that warrant avoidance of CPB. Ever since the conception of off-pump CABG, the never-ending debate about which technique of CABG is safe and efficacious continues to date. Several randomized controlled trials (RCTs) have been conducted that have either favored on-pump CABG or have failed to show a significant difference in outcomes between the two techniques. However, these RCTs have been fraught with claims that they do not represent the majority of patients undergoing CABG in real world practice. Therefore, assessment of the benefits and drawbacks of each technique through observational and registry studies would be more representative of patients encountered in daily practice. The present review examines various retrospective studies and meta-analyses of observational studies that compare the early and long-term outcomes of off- and on-pump CABG, assessing their safety and efficacy. Additionally, outcomes in older patients, females, and those with diabetes mellitus, renal dysfunction, ascending aortic disease, and/or acute coronary syndrome (ACS) are also discussed separately. The general consensus is that early results of off-pump CABG are comparable to, or in some cases better than, those of on-pump CABG. However, on-pump CABG provides a survival benefit in the long term according to a majority of publications in the literature. PMID:27942395
Intestinal Microbial Variation May Predict Early Acute Rejection after Liver Transplantation in Rats
Ren, Zhigang; Jiang, Jianwen; Lu, Haifeng; Chen, Xinhua; He, Yong; Zhang, Hua; Xie, Haiyang; Wang, Weilin; Zheng, Shusen; Zhou, Lin
2014-01-01
Background Acute rejection (AR) remains a life-threatening complication after orthotopic liver transplantation (OLT) and there are few available diagnostic biomarkers clinically for AR. This study aims to identify intestinal microbial profile and explore potential application of microbial profile as a biomarker for AR after OLT. Methods The OLT models in rats were established. Hepatic graft histology, ultrastructure, function, and intestinal barrier function were tested. Ileocecal contents were collected for intestinal microbial analysis. Results Hepatic graft suffered from the ischemia-reperfusion (I/R) injury on day 1, initial AR on day 3, and severe AR on day 7 after OLT. Real-time quantitative polymerase chain reaction results showed that genus Faecalibacterium prausnitzii and Lactobacillus were decreased, whereas Clostridium bolteae was increased during AR. Notably, cluster analysis of denaturing gradient gel electrophoresis (DGGE) profiles showed the 7AR and 3AR groups clustered together with 73.4% similarity, suggesting that intestinal microbiota was more sensitive than hepatic function in responding to AR. Microbial diversity and species richness were decreased during AR. Phylogenetic tree analysis showed that most of the decreased key bacteria belonged to phylum Firmicutes, whereas increased key bacteria belonged to phylum Bacteroidetes. Moreover, intestinal microvilli loss and tight junction damage were noted, and intestinal barrier dysfunction during AR presented a decrease of fecal secretory immunoglobulin A (sIgA) and increase of blood bacteremia, endotoxin, and tumor necrosis factor-α. Conclusion We dynamically detail intestinal microbial characterization and find a high sensitivity of microbial change during AR after OLT, suggesting that intestinal microbial variation may predict AR in early phase and become an assistant therapeutic target to improve rejection after OLT. PMID:25321166
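Two of the analyses summarised in this record (similarity-based clustering of DGGE fingerprints and microbial diversity estimation) can be illustrated with a short sketch on synthetic band-intensity profiles; the distance metric and linkage method below are generic choices, not necessarily those used by the authors.

```python
# Hedged sketch (synthetic profiles, not the study data): hierarchical clustering of
# DGGE fingerprints and Shannon diversity per sample.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(2)
# rows = samples (e.g. 7AR, 3AR, I/R, control), columns = DGGE band intensities (assumed)
profiles = rng.random((4, 25))

def shannon(p):
    p = p / p.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum()

diversity = [shannon(row) for row in profiles]

# correlation distance between fingerprints, then average-linkage (UPGMA) clustering
dist = pdist(profiles, metric="correlation")
tree = linkage(dist, method="average")
clusters = fcluster(tree, t=2, criterion="maxclust")
print("Shannon diversity per sample:", np.round(diversity, 2))
print("cluster assignment:", clusters)
```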
Yousif, A; Addison, D; Lakkis, N; Rosengart, T; Virani, S S; Birnbaum, Y; Alam, M
2018-05-01
Data from randomized trials evaluating the efficacy of on- versus off-pump coronary artery bypass grafting remain inconclusive, particularly in high-risk populations. The aim of this study was to compare the outcomes associated with on- versus off-pump coronary artery bypass grafting among high-risk patients. We performed a meta-analysis of randomized controlled trials comparing on- versus off-pump coronary artery bypass grafting, focusing on high-risk populations. Studies focusing on the following "high-risk" features were included: European System for Cardiac Operative Risk Evaluation (EuroSCORE) ≥ 5, age > 70 years, preexisting renal insufficiency, history of stroke, and the presence of left ventricular dysfunction. MEDLINE, Scopus, and Embase were searched for all publications between January 1, 2000 and August 1, 2016, using the following terms: on-pump, off-pump, coronary artery bypass, high-risk, left ventricular dysfunction, elderly, aged, and renal insufficiency. Endpoints included cardiovascular and all-cause mortality, non-fatal myocardial infarction, stroke, need for revascularization, renal failure, and length of hospital stay. Nine studies incorporating 11,374 patients with a mean age of 70 years were selected. There was no statistical difference in cardiovascular mortality, all-cause mortality, non-fatal myocardial infarction, or renal failure between the two groups. There was a decrease in further revascularization at 1 year with on-pump surgery (OR 0.67 (0.50-0.89)). However, there was an increase in length of hospital stay by 2.24 days (p = 0.03) in the on-pump group, with no difference in stroke (OR 1.34 (1.00-1.80)). On-pump surgery is associated with a decreased risk of additional revascularization by 1 year; however, this appears to come at the cost of longer hospitalization.
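The pooled odds ratios reported above (e.g., OR 0.67, 95% CI 0.50-0.89) come from combining study-level effect estimates. As a minimal sketch, the code below pools hypothetical 2x2 counts with an inverse-variance fixed-effect model; the abstract does not report per-study data or whether a fixed- or random-effects model was actually used.

```python
# Minimal sketch of an inverse-variance fixed-effect pooled odds ratio.
import math

# (events_on_pump, n_on_pump, events_off_pump, n_off_pump) per study -- illustrative only
studies = [
    (12, 400, 20, 410),
    (30, 900, 41, 880),
    (8, 250, 14, 260),
]

weights, log_ors = [], []
for a, n1, c, n2 in studies:
    b, d = n1 - a, n2 - c                   # non-events in each arm
    log_or = math.log((a * d) / (b * c))
    var = 1 / a + 1 / b + 1 / c + 1 / d     # Woolf variance of log(OR)
    weights.append(1 / var)
    log_ors.append(log_or)

pooled = sum(w * l for w, l in zip(weights, log_ors)) / sum(weights)
se = math.sqrt(1 / sum(weights))
or_, lo, hi = (math.exp(x) for x in (pooled, pooled - 1.96 * se, pooled + 1.96 * se))
print(f"pooled OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```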
Cardiovascular disease in live related renal transplantation.
Kaul, A; Sharm, R K; Gupta, A; Sinha, N; Singh, U
2011-11-01
Cardiovascular disease has become the leading cause of morbidity and mortality in renal transplant recipients, although its pathogenesis and treatment are poorly understood. Modifiable cardiovascular risk factors and graft dysfunction both play an important role in the development of post-transplant cardiovascular events. The prevalence of cardiovascular disease was studied in stable kidney transplant patients on cyclosporine-based triple immunosuppression in relation to various risk factors and post-transplant cardiovascular events. We analyzed 562 post-transplant patients with stable graft function for 6 months; the patients were evaluated for cardiovascular events in the post-transplant period. Pre- and post-transplant risk factors were analyzed using the Cox proportional hazards model. 174 patients had undergone pre-transplant coronary angiography, and 15 of these patients underwent coronary revascularization (angioplasty in 12, CABG in 3). The prevalence of CAD was 7.2% in transplant recipients. Of 42 patients with CAD, 31 (73.8%) had a cardiovascular event in the post-transplant period. Age ≥ 40 years, male sex, graft dysfunction, diabetes as primary renal disease, pre-transplant cardiovascular event, and chronic rejection showed significant correlation in univariate analysis, and on multivariate analysis there was a significant association between post-transplant cardiovascular events and age ≥ 40 years (OR = 2.16, 95% CI 0.977-4.78), serum creatinine ≥ 1.4 mg% (OR = 2.40, 95% CI 1.20-4.82), diabetes as primary disease (OR = 3.67, 95% CI 3.2-14.82), PTDM (OR = 3.67, 95% CI 1.45-9.40), and pre-transplant cardiovascular disease (OR = 4.14, 95% CI .38-13.15). There was poor patient and graft survival among those who suffered a post-transplant cardiovascular event. The incidence of cardiovascular disease continues to be high after renal transplantation, and modifiable risk factors should be identified to prevent the occurrence of events in the post-transplant period.
Expandable external support device to improve Saphenous Vein Graft Patency after CABG
2013-01-01
Objectives Low patency rates of saphenous vein grafts remain a major predicament in surgical revascularization. We examined a novel expandable external support device designed to mitigate causative factors for early and late graft failure. Methods Fourteen adult sheep underwent cardiac revascularization using two vein grafts each: one to the LAD and the other to the obtuse marginal artery. One graft was supported with the device, while the other served as a control. The target vessel was alternated between consecutive cases. The animals underwent immediate and late angiography and were then sacrificed for histopathologic evaluation. Results Of the fourteen animals studied, three died peri-operatively (unrelated to the implanted device) and ten survived the follow-up period. Among surviving animals, three grafts were thrombosed and one was occluded, all in the control group (p = 0.043). Quantitative angiographic evaluation revealed no difference between groups in the immediate level of graft uniformity, with a coefficient-of-variance (CV%) of 7.39 in the control versus 5.07 in the supported grafts, p = 0.082. At 12 weeks, there was significant non-uniformity in the control grafts versus the supported grafts (CV = 22.12 versus 3.01, p < 0.002). On histopathologic evaluation, the mean intimal area of the supported grafts was significantly lower than in the control grafts (11.2 mm^2 versus 23.1 mm^2, p < 0.02). Conclusions The expandable SVG external support system was found to be efficacious in reducing SVG non-uniform dilatation and neointimal formation in an animal model early after CABG. This novel technology may have the potential to improve SVG patency rates after surgical myocardial revascularization. PMID:23641948
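The coefficient-of-variance (CV%) used above to quantify graft (non-)uniformity is simply 100 times the standard deviation of the lumen diameters divided by their mean. A minimal sketch follows; the diameter values are hypothetical, since the measurement points are not given in the abstract.

```python
# Minimal sketch: CV% of lumen diameters measured along a vein graft.
from statistics import mean, stdev

def cv_percent(diameters_mm):
    """Coefficient of variation (%) of a series of lumen diameters."""
    return 100.0 * stdev(diameters_mm) / mean(diameters_mm)

supported_graft = [3.1, 3.0, 3.2, 3.1, 3.0]   # fairly uniform -> low CV%
control_graft   = [3.6, 2.4, 4.1, 2.9, 3.8]   # non-uniform dilatation -> high CV%

print(f"supported CV% = {cv_percent(supported_graft):.2f}")
print(f"control   CV% = {cv_percent(control_graft):.2f}")
```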
Ischemia and reperfusion injury in renal transplantation: hemodynamic and immunological paradigms
Requião-Moura, Lúcio Roberto; Durão, Marcelino de Souza; de Matos, Ana Cristina Carvalho; Pacheco-Silva, Alvaro
2015-01-01
Ischemia and reperfusion injury is an inevitable event in renal transplantation. Its most important consequences are delayed graft function, longer length of stay, higher hospital costs, a higher risk of acute rejection, and a negative impact on long-term outcomes. Many factors are involved in its pathophysiology and can be classified, for educational purposes, into two different paradigms: hemodynamic and immune. The hemodynamic paradigm is described as the reduction of oxygen delivery due to blood flow interruption, involving many hormone systems and oxygen free radicals produced after reperfusion. The immune paradigm has been described more recently and involves immune system cells, especially T cells, which have a central role in this injury. Based on these concepts, new strategies to prevent ischemia and reperfusion injury have been studied, particularly more physiological forms of storing the kidney, such as machine perfusion, and the use of antilymphocyte antibody therapy before reperfusion. Machine perfusion reduces the prevalence of delayed graft function and the hospital length of stay, and increases long-term graft survival. The use of antilymphocyte antibody therapy before reperfusion, such as Thymoglobulin™, can reduce the prevalence of delayed graft function and chronic graft dysfunction. PMID:25993079
DOE Office of Scientific and Technical Information (OSTI.GOV)
Song, Myung Gyu, E-mail: acube808@naver.com; Seo, Tae-Seok, E-mail: g1q1papa@korea.ac.kr; Lee, Chang Hee, E-mail: chlee86@korea.ac.kr
Purpose This study was designed to evaluate the impact of diameter and the existence of multiple side holes along the straight portion of double-J ureteral stents (DJUS) on early dysfunction of stents placed for malignant ureteral strictures. Methods Between April 2007 and December 2011, 141 DJUSs were placed via a percutaneous nephrostomy (PCN) tract in 110 consecutive patients with malignant ureteral strictures. 7-F DJUSs with multiple side holes in the straight portion were placed in 58 ureters of 43 patients (Group 1). 8-F DJUSs with three side holes in the proximal 2 cm of the straight portion were placed in 83 ureters of 67 patients (Group 2). The incidence of early DJUS dysfunction was compared between the two groups, and nephrostographic findings were evaluated in the cases of early dysfunction. Results Early dysfunction of the DJUS was noted in 14 of 58 stents (24.1%) in Group 1, which was significantly higher (p = 0.001) than in Group 2, in which only 1 of 83 stents (1.2%) showed early dysfunction. Nephrostographic findings of early dysfunction included dilatation of the pelvicalyceal system, filling defects in the ureteral stent, and no passage of contrast media into the urinary bladder. Conclusions In malignant ureteral strictures, multiple side holes in the straight portion of the 7-F DJUS seem to cause early dysfunction. The 8-F DJUSs with three side holes in the proximal 2 cm of the straight portion may be superior at preventing early dysfunction.
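The group comparison above (early dysfunction in 14/58 vs. 1/83, p = 0.001) can be reproduced with a standard test for two proportions. The sketch below uses Fisher's exact test purely for illustration; the abstract does not state which test the authors applied.

```python
# Minimal sketch: Fisher's exact test on the reported early-dysfunction counts.
from scipy.stats import fisher_exact

group1 = (14, 58 - 14)   # Group 1 stents: dysfunction, no dysfunction
group2 = (1, 83 - 1)     # Group 2 stents: dysfunction, no dysfunction

odds_ratio, p_value = fisher_exact([list(group1), list(group2)])
print(f"odds ratio {odds_ratio:.1f}, p = {p_value:.4f}")
```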
Hadadzadeh, Mehdi; Hosseini, Seyed Habib; Mostafavi Pour Manshadi, Seyed Mohammad Yousof; Naderi, Nafiseh; Emami Meybodi, Mahmood
2013-01-01
Myocardial dysfunction is a major complication in cardiac surgery that requires inotropic support. This study evaluates the effect of milrinone in patients with low ventricular ejection fraction undergoing off-pump coronary artery bypass grafting (OPCAB). The present study was designed to evaluate the effect of milrinone on myocardial dysfunction. Eighty patients with low ventricular ejection fraction (<35%), candidates for elective OPCAB, were enrolled in this study. They were randomly assigned to two groups: one group received milrinone (50 μg/kg) intravenously and the other received saline as placebo, followed by a 24-hour infusion of each agent (0.5 μg/kg/min). Short-term outcomes, such as hemodynamic parameters and left ventricular ejection fraction, were evaluated. Serum levels of creatine phosphokinase and the MB isoenzyme of creatine kinase, the occurrence of arrhythmias, and the mean duration of mechanical ventilation were significantly lower in the milrinone group (P<0.05). The mean postoperative left ventricular ejection fraction was significantly higher in the milrinone group (P=0.031). There were no statistically significant differences between the two groups in terms of intra-aortic balloon pump use, inotropic support requirement, myocardial ischemia, myocardial infarction, duration of inotropic support, duration of intensive care unit stay, mortality, and morbidity. Administration of milrinone in patients with low ventricular ejection fraction undergoing OPCAB is useful and effective.
Tsilimparis, Nikolaos; Debus, E Sebastian; Oderich, Gustavo S; Haulon, Stephan; Terp, Kim Allan; Roeder, Blayne; Detter, Christian; Kölbel, Tilo
2016-06-01
The objective of this study was to evaluate the safety and feasibility of a novel stent graft specifically designed for treatment of the ascending aorta. This was a multicenter, retrospective analysis of all consecutive patients treated with the dedicated Zenith Ascend TAA Endovascular Graft (William Cook Europe, Bjaeverskov, Denmark) for pathologic processes requiring stent grafting of the ascending aorta. The graft is short (6.5 cm), with a delivery system designed for transfemoral placement in the ascending aorta. In 10 patients (five men; age, 67 years; range, 26-90 years), the Zenith Ascend graft was implanted for the following indications: dissection (n = 5) and aneurysm (n = 4) of the ascending aorta and fixation of an intraprocedural dislocated aortic valve (n = 1). All patients were judged to be at high risk for open surgery (nine patients were classified as American Society of Anesthesiologists class 3 or class 4). A transfemoral approach was selected in eight cases and a transapical approach in two. All endografts were successfully deployed without intraoperative adverse events at the targeted landing zone. Clinical success in coverage of the lesions was achieved in all cases with the exception of an attempted treatment of an intraprocedural aortic valve implantation dissection that resulted in early mortality. The 30-day survival was 90%. Early neurologic events included one patient with stroke and paraplegia and one patient with a transient ischemic attack. One patient underwent early evacuation of a hemopericardium. There were two late reinterventions for persisting endoleaks. At a mean follow-up of 10 months (range, 1-36 months), three late deaths occurred, with one treatment related, as a result of graft infection. Despite the fact that in this first published series the graft was frequently used as a "rescue tool" outside its intended indication, treatment with the Zenith Ascend graft in this early experience appears to be safe and feasible for repair of ascending aorta pathologic processes in high-risk patients unsuitable for open repair. Copyright © 2016 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
Heidt, Sebastiaan; San Segundo, David; Shankar, Sushma; Mittal, Shruti; Muthusamy, Anand S R; Friend, Peter J; Fuggle, Susan V; Wood, Kathryn J
2011-07-15
Currently, acute allograft rejection can only be detected reliably by deterioration of graft function confirmed by allograft biopsy. A huge drawback of this method of diagnosis is that substantial organ damage has already taken place at the time that rejection is diagnosed. Discovering and validating noninvasive biomarkers that predict acute rejection, and chronic allograft dysfunction, is of great importance. Many studies have investigated changes in the peripheral blood in an attempt to find biomarkers that reflect changes in the graft directly or indirectly. Herein, we will review the promises and limitations of the peripheral blood biomarkers that have been described in the literature so far.
Antibody-Mediated Rejection of Human Orthotopic Liver Allografts
Demetris, A. Jake; Jaffe, Ron; Tzakis, A.; Ramsey, Glenn; Todo, S.; Belle, Steven; Esquivel, Carlos; Shapiro, Ron; Markus, Bernd; Mroczek, Elizabeth; Van Thiel, D. H.; Sysyn, Greg; Gordon, Robert; Makowka, Leonard; Starzl, Tom
1988-01-01
A clinicopathologic analysis of liver transplantation across major ABO blood group barriers was carried out 1) to determine if antibody-mediated (humoral) rejection was a cause of graft failure and if humoral rejection can be identified, 2) to propose criteria for establishing the diagnosis, and 3) to describe the clinical and pathologic features of humoral rejection. A total of 51 (24 primary) ABO-incompatible (ABO-I) liver grafts were transplanted into 49 recipients. There was a 46% graft failure rate during the first 30 days for primary ABO-I grafts compared with an 11% graft failure rate for primary ABO compatible (ABO-C), crossmatch negative, age, sex and priority-matched control patients (P < 0.02). A similarly high early graft failure rate (60%) was seen for nonprimary ABO-I grafts during the first 30 days. Clinically, the patients experienced a relentless rise in serum transaminases, hepatic failure, and coagulopathy during the first weeks after transplant. Pathologic examination of ABO-I grafts that failed early demonstrated widespread areas of geographic hemorrhagic necrosis with diffuse intraorgan coagulation. Prominent arterial deposition of antibody and complement components was demonstrated by immunofluorescent staining. Elution studies confirmed the presence of tissue-bound, donor-specific isoagglutinins within the grafts. No such deposition was seen in control cases. These studies confirm that antibody-mediated rejection of the liver occurs and allow for the development of criteria for establishing the diagnosis. PMID:3046369
Extracorporeal albumin dialysis in patients with Amanita phalloides poisoning.
Faybik, Peter; Hetz, Hubert; Baker, Amir; Bittermann, Clemens; Berlakovich, Gabriela; Werba, Alois; Krenn, Claus-Georg; Steltzer, Heinz
2003-01-01
Ingestion of Amanita phalloides is the most common cause of lethal mushroom poisoning. The relatively late onset of symptoms is a distinct diagnostic feature of Amanita intoxication and is also the main reason why extracorporeal removal of Amanita-specific toxins from the gut and circulation fails. Extracorporeal albumin dialysis (ECAD) was used in six consecutive patients admitted with acute liver failure (ALF) after A. phalloides poisoning. The six patients, with a mean age of 46 years (range, 9-70 years), underwent one to three ECAD treatments. The mean time from mushroom ingestion to the first ECAD treatment was 76 h. Two patients recovered spontaneously under ECAD treatment and orthotopic liver transplantation (OLT) could be avoided. Two patients were successfully bridged to OLT, and one patient died of cerebral herniation. One patient was treated with ECAD immediately after OLT because of graft dysfunction and survived without re-transplantation. ECAD appears to be a promising treatment for supporting liver regeneration, for bridging to OLT, and for treating graft dysfunction after OLT in patients with A. phalloides poisoning.
Results of completion arteriography after minimally invasive off-pump coronary artery bypass.
Hoff, Steven J; Ball, Stephen K; Leacche, Marzia; Solenkova, Natalia; Umakanthan, Ramanan; Petracek, Michael R; Ahmad, Rashid; Greelish, James P; Walker, Kristie; Byrne, John G
2011-01-01
The benefits of a minimally invasive approach to off-pump coronary artery bypass remain controversial. The value of completion arteriography in validating this technique has not been investigated. From April 2007 to October 2009, fifty-six patients underwent isolated minimally invasive coronary artery bypass grafting through a left thoracotomy without cardiopulmonary bypass. Forty-three of these patients underwent completion arteriography. Sixty-five grafts were performed in these 56 patients (average, 1.2 grafts per patient; range, 1 to 3). Forty-eight grafts were studied in the 43 patients undergoing completion arteriography. There were 4 findings on arteriography leading to further immediate intervention (8.3%). These included 3 grafts with anastomotic stenoses or spasm requiring stent placement, and 1 patient who had a limited dissection in the left internal mammary artery graft and underwent placement of an additional vein graft. These findings were independent of electrocardiographic changes or hemodynamic instability. The remainder of the studies showed no significant abnormalities. There were no deaths. One patient who did not have a completion arteriogram suffered a postoperative myocardial infarction requiring stent placement for anastomotic stenosis. Patients were discharged home an average of 6.8 days postoperatively. There were no instances of postoperative renal dysfunction attributable to catheterization. Minimally invasive coronary artery bypass is safe and effective. Completion arteriography occasionally reveals previously under-recognized findings that, if corrected in a timely fashion, could potentially impact graft patency and clinical outcomes. Our experience validates this minimally invasive technique. Copyright © 2011 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
Bonatti, Johannes; Rehman, Atiq; Schwartz, Kimberly; Deshpande, Seema; Kon, Zachary; Lehr, Eric; Zimrin, David; Griffith, Bartley
2010-12-01
Robotic technology enables "port only" totally endoscopic coronary artery bypass grafting (TECAB). During early procedure development only single bypass grafts were feasible. Because current referral practice for coronary bypass surgery mostly includes multivessel disease, performance of multiple endoscopic bypass grafts is desirable. We report a case in which a patient received a right internal mammary artery bypass graft to the left anterior descending artery and a left internal mammary artery jump graft to 2 obtuse marginal branches. The procedure was performed through 5 ports on the arrested heart using the daVinci S robotic surgical system. This is the first reported triple bypass grafting procedure using an arrested heart approach.
Bypass grafting to the anterior tibial artery.
Armour, R H
1976-01-01
Four patients with severe ischaemia of a leg due to atherosclerotic occlusion of the tibial and peroneal arteries had reversed long saphenous vein grafts to the patent lower part of the anterior tibial artery. Two of these grafts continue to function 19 and 24 months after operation respectively. One graft failed on the fifth postoperative day and another occluded 4 months after operation. The literature on femorotibial grafting has been reviewed. The early failure rate of distal grafting is higher than in the case of femoropopliteal bypass, but a number of otherwise doomed limbs can be salvaged. Contrary to widely held views, grafting to the anterior tibial artery appears to give results comparable to those obtained when the lower anastomosis is made to the posterior tibial artery.
Urinary tract infections in children after renal transplantation.
John, Ulrike; Kemper, Markus J
2009-06-01
Urinary tract infections (UTI) after pediatric kidney transplantation (KTX) are an important clinical problem and occur in 15-33% of patients. Febrile UTI, whether occurring in the transplanted kidney or the native kidney, should be differentiated from afebrile UTI. The former may cause significant morbidity and is usually associated with acute graft dysfunction. Risk factors for (febrile) UTI include anatomical, functional, and demographic factors as well as baseline immunosuppression and foreign material, such as catheters and stents. Meticulous surveillance, diagnosis, and treatment of UTI are important to minimize acute morbidity and compromise of long-term graft function. In febrile UTI, parenteral antibiotics are usually indicated, although controlled data are not available. As most data concerning UTI have been accumulated retrospectively, future prospective studies have to be performed to clarify pathogenetic mechanisms and risk factors, improve prophylaxis and treatment, and ultimately optimize long-term renal graft survival.
Yunhua, Tang; Weiqiang, Ju; Maogen, Chen; Sai, Yang; Zhiheng, Zhang; Dongping, Wang; Zhiyong, Guo; Xiaoshun, He
2018-06-01
Early allograft dysfunction (EAD) and early postoperative complications are two important clinical endpoints when evaluating clinical outcomes of liver transplantation (LT). We developed and validated two ICGR15-MELD models in 87 liver transplant recipients for predicting EAD and early postoperative complications after LT by incorporating a quantitative liver function test (ICGR15) into the MELD score. Eighty-seven consecutive patients who underwent LT were included and divided into a training cohort (n = 61) and an internal validation cohort (n = 26). For predicting EAD after LT, the area under the curve (AUC) for the ICGR15-MELD score was 0.876, with a sensitivity of 92.0% and a specificity of 75.0%, which is better than the MELD score or ICGR15 alone. Recipients with an ICGR15-MELD score ≥0.243 had a higher incidence of EAD than those with an ICGR15-MELD score <0.243 (P < 0.001). For predicting early postoperative complications, the AUC of the ICGR15-MELD score was 0.832, with a sensitivity of 90.9% and a specificity of 71.0%. Recipients with an ICGR15-MELD score ≥0.098 had a higher incidence of early postoperative complications than those with an ICGR15-MELD score <0.098 (P < 0.001). Finally, application of the two ICGR15-MELD models in the validation cohort still gave good accuracy (AUC, 0.835 and 0.826, respectively) in predicting EAD and early postoperative complications after LT. The combination of a quantitative liver function test (ICGR15) and the preoperative MELD score is a reliable and effective predictor of EAD and early postoperative complications after LT, better than the MELD score or ICGR15 alone.
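A composite score of the kind described above can be built by entering ICGR15 and MELD into a logistic regression and then summarizing discrimination with an ROC AUC and an operating cutoff. The sketch below is a minimal illustration on simulated data: the coefficients, patient values, and event rates are hypothetical, since the abstract reports only the performance figures (AUC 0.876, cutoff 0.243), not the fitted model.

```python
# Minimal sketch: a logistic "ICGR15-MELD"-style composite score on simulated data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 200
icgr15 = rng.normal(20, 8, n)            # indocyanine green retention at 15 min (%)
meld = rng.normal(18, 6, n)              # MELD score
# simulate EAD risk increasing with both predictors (illustrative only)
logit = -6 + 0.12 * icgr15 + 0.10 * meld
ead = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([icgr15, meld])
model = LogisticRegression(max_iter=1000).fit(X, ead)
score = model.predict_proba(X)[:, 1]     # composite-score analogue

print("AUC:", round(roc_auc_score(ead, score), 3))
cutoff = 0.243                            # illustrative operating point
pred = score >= cutoff
sens = (pred & (ead == 1)).sum() / (ead == 1).sum()
spec = (~pred & (ead == 0)).sum() / (ead == 0).sum()
print(f"sensitivity {sens:.2f}, specificity {spec:.2f} at cutoff {cutoff}")
```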
Goodwin, Jodi; Tinckam, Kathryn; denHollander, Neal; Haroon, Ayesha; Keshavjee, Shaf; Cserti-Gazdewich, Christine M
2010-09-01
The extent to which transfusion-related acute lung injury (TRALI) contributes to primary graft dysfunction (PGD), the leading cause of death after lung transplantation, is unknown. In this case of suspected transfusion-associated acute bilateral graft injury in a 61-year-old idiopathic pulmonary fibrosis patient, recipient sera from before and after transplantation/transfusion, as well as the sera of 22 of the 24 implicated blood donors, were individually screened by Luminex bead assay for the presence of human leukocyte antigen (HLA) antibodies, with recipient and lung donor HLA typing to explore for cognate relationships. A red-cell-unit donor-source anti-Cw6 antibody, cognate with the HLA type of the recipient, was identified. This is the second reported case of TRALI in the setting of lung transplantation, and the first to show an associated interaction between donor antibodies (in a low-plasma-volume product) and recipient leukocytes (rather than graft antigens); therefore, TRALI should be considered in the differential diagnosis of PGD. Copyright 2010 International Society for Heart and Lung Transplantation. Published by Elsevier Inc. All rights reserved.
Hepatic Hemodynamics and Portal Flow Modulation: The A2ALL Experience.
Emond, Jean C; Goodrich, Nathan P; Pomposelli, James J; Baker, Talia B; Humar, Abhinav; Grant, David R; Abt, Peter; Friese, Chris E; Fisher, Robert A; Kam, Igal; Sherker, Averell H; Gillespie, Brenda W; Merion, Robert M
2017-10-01
A principal aim of the Adult-to-Adult Living Donor Liver Transplantation Cohort Study was to study hepatic blood flow and the effect of portal flow modulation on graft outcomes in the setting of increasing use of smaller and left lobe grafts. Recipients of 274 living donor liver transplants were enrolled in the Adult-to-Adult Living Donor Liver Transplantation Cohort Study, including 233 (85.0%) right lobes, 40 (14.6%) left lobes, and 1 (0.5%) left lateral section. Hepatic hemodynamics were recorded after reperfusion. A total of 57 portal flow modulations were performed on 52 subjects. Modulation lowered portal pressure in 68% of subjects, with inconsistent effects on hepatic arterial and portal flow. A higher rate of graft dysfunction was observed in modulated vs. unmodulated subjects (31% vs. 18%; P = 0.03); however, graft survival in modulated subjects was not different from that in unmodulated subjects at 3 years. These results suggest the need for a study using a prespecified portal flow modulation protocol with defined indications to better define the effects of these interventions.
O'Grady, Kathleen M; Power, Hollie A; Olson, Jaret L; Morhart, Michael J; Harrop, A Robertson; Watt, M Joe; Chan, K Ming
2017-10-01
Upper trunk obstetric brachial plexus injury can cause profound shoulder and elbow dysfunction. Although neuroma excision with interpositional sural nerve grafting is the current gold standard, distal nerve transfers have a number of potential advantages. The goal of this study was to compare the clinical outcomes and health care costs between nerve grafting and distal nerve transfers in children with upper trunk obstetric brachial plexus injury. In this prospective cohort study, children who underwent triple nerve transfers were followed with the Active Movement Scale for 2 years. Their outcomes were compared to those of children who underwent nerve graft reconstruction. To assess health care use, a cost analysis was also performed. Twelve patients who underwent nerve grafting were compared to 14 patients who underwent triple nerve transfers. Both groups had similar baseline characteristics and showed improved shoulder and elbow function following surgery. However, the nerve transfer group displayed significantly greater improvement in shoulder external rotation and forearm supination 2 years after surgery (p < 0.05). The operative time and length of hospital stay were significantly lower (p < 0.05), and the overall cost was approximately 50 percent less in the nerve transfer group. Triple nerve transfer for upper trunk obstetric brachial plexus injury is a feasible option, with better functional shoulder external rotation and forearm supination, faster recovery, and lower cost compared with traditional nerve graft reconstruction. Therapeutic, II.
Carey, Joseph S; Danielsen, Beate; Milliken, Jeffrey; Li, Zhongmin; Stabile, Bruce E
2009-11-01
Percutaneous coronary intervention is increasingly used to treat multivessel coronary artery disease, and coronary artery bypass graft procedures have decreased as a result. The overall impact of this treatment shift is uncertain. We examined the in-hospital mortality and complication rates for these procedures in California using a combined risk model. The confidential dataset of the Office of Statewide Health Planning and Development patient discharge database was queried for 1997 to 2006. A risk model was developed using International Classification of Diseases, Ninth Revision, Clinical Modification procedure and diagnostic codes from the combined pool of isolated coronary artery bypass graft and percutaneous coronary intervention procedures performed during 2005 and 2006. In-hospital mortality was corrected for "same-day" transfers to another health care institution. The early failure rate was defined as the in-hospital mortality rate plus reintervention for another percutaneous coronary intervention or cardiac surgery procedure within 90 days. Coronary artery bypass graft volume decreased from 28,495 (1997) to 15,520 (2006), whereas percutaneous coronary intervention volume increased from 38,098 to 53,703. The risk-adjusted mortality rate decreased from 4.7% to 2.1% for coronary artery bypass graft procedures and from 3.4% to 1.9% for percutaneous coronary intervention. The expected mortality rate increased for both procedures. The early failure rate decreased from 13.1% to 8.0% for percutaneous coronary intervention and from 6.5% to 5.4% for coronary artery bypass grafting. For the years 2004 and 2005, the risk of recurrent myocardial infarction or need for coronary artery bypass grafting during the first postoperative year was 12% for percutaneous coronary intervention and 6% for coronary artery bypass grafts. This study shows that as volume shifted from coronary artery bypass grafting to percutaneous coronary intervention, expected mortality increased for both procedures. The risk-adjusted mortality rate decreased for both procedures, more so for coronary artery bypass grafting, so that corrected in-hospital mortality rates essentially equalized at approximately 2.0% in 2006. The post-procedural risk of reintervention, death, or myocardial infarction within the first year was twice as high for percutaneous coronary intervention as for coronary artery bypass grafting.
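Risk-adjusted mortality of the kind reported above is typically obtained by fitting a risk model on the pooled data to get each patient's expected probability of death, then scaling a group's observed-to-expected (O/E) ratio by the overall crude rate. The sketch below is a generic illustration on simulated data; the actual California combined risk model and its predictors are not described in the abstract.

```python
# Minimal sketch: observed/expected risk adjustment with a logistic risk model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5000
age = rng.normal(66, 10, n)
shock = rng.binomial(1, 0.05, n)
died = rng.binomial(1, 1 / (1 + np.exp(-(-7 + 0.06 * age + 1.5 * shock))))

X = np.column_stack([age, shock])
expected = LogisticRegression(max_iter=1000).fit(X, died).predict_proba(X)[:, 1]

group = rng.binomial(1, 0.5, n).astype(bool)      # e.g., one procedure type
o = died[group].sum()                             # observed deaths in the group
e = expected[group].sum()                         # expected deaths from the model
adjusted_rate = (o / e) * died.mean()
print(f"observed {o}, expected {e:.1f}, risk-adjusted mortality {100 * adjusted_rate:.2f}%")
```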
Drescher, Robert; Gühne, Falk; Freesmeyer, Martin
2017-06-01
To propose a positron emission tomography (PET)/computed tomography (CT) protocol including early-dynamic and late-phase acquisitions to evaluate graft patency and aneurysm diameter, detect endoleaks, and rule out graft or vessel wall inflammation after endovascular aneurysm repair (EVAR) in one examination without intravenous contrast medium. Early-dynamic PET/CT of the endovascular prosthesis is performed for 180 seconds immediately after intravenous injection of F-18-fluorodeoxyglucose. Data are reconstructed in variable time frames (time periods after tracer injection) to visualize the arterial anatomy and are displayed as PET angiography or fused with CT images. Images are evaluated in view of vascular abnormalities, graft configuration, and tracer accumulation in the aneurysm sac. Whole-body PET/CT is performed 90 to 120 minutes after tracer injection. This protocol for early-dynamic PET/CT and PET angiography has the potential to evaluate vascular diseases, including the diagnosis of complications after endovascular procedures.
Naffouje, Samer A; Tzvetanov, Ivo; Bui, James T; Gaba, Ron; Bernardo, Karrel; Jeon, Hoonbae
2016-10-01
Hemodialysis Reliable Outflow (HeRO) catheters were introduced in 2008 and have since provided a reliable alternative for hemodialysis patients who are deemed "access challenged." However, their outcomes have not been extensively investigated because the device is relatively new. Here, we report our 6-year single-institution experience and demonstrate the significant impact of obesity on HeRO graft outcomes, an aspect not previously studied in the literature. Patients who underwent HeRO graft placement at the University of Illinois Hospital between April 2009 and August 2015 were included retrospectively. Data were collected from patients' electronic medical records and analyzed using SPSS software. Thirty-three patients who underwent 34 HeRO catheter placements were included. Mean age was 47 ± 12 years, and mean body mass index (BMI) was 30.75 ± 10.22. Median follow-up was 635 days. Overall catheter-related complications were thrombosis (70.59%), infection (20.59%), arterial steal (8.82%), and pseudoaneurysms requiring intervention (8.82%). Overall primary and secondary patency rates after 6 and 12 months were 31.25%, 25%, 78.13%, and 71.86%, respectively. The primary nonfunction rate was 14.7%. Obese patients had a significantly higher rate of primary nonfunction (38.46% vs. 0%, P = 0.0046), with a relative risk of 3.62 (95% confidence interval [CI] 2.01-6.52). They also had a significantly decreased rate of graft patency after 12 months (10.53% vs. 53.85%, P = 0.0227), corresponding to a relative risk of early graft loss within 1 year of 5.12 (95% CI 1.26-20.83). Overall median graft patency in obese patients was significantly shorter than that of nonobese patients (311 vs. 1295 days, P = 0.014). BMI, as a continuous variable, was a significant predictor of primary nonfunction (P = 0.046) and early graft loss (P = 0.020) when tested against age, sex, race, and diabetes in a multivariate logistic regression analysis. HeRO catheters offer a reliable, and possibly the last, alternative for hemodialysis access-challenged patients. In our population, obesity was a significant risk factor for primary nonfunction, early graft loss, and shorter overall graft patency. BMI, as a continuous variable, can serve as a predictor of primary nonfunction and early graft loss after adjustment for age, race, sex, and diabetes. Obesity's effect on HeRO catheters has not been amply addressed; therefore, further prospective studies are warranted. Copyright © 2016 Elsevier Inc. All rights reserved.
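Relative risks with 95% confidence intervals, such as the RR 5.12 (1.26-20.83) for early graft loss above, are computed from a 2x2 table using the log(RR) standard error. The sketch below uses hypothetical counts, since the abstract reports only the rates and the resulting RRs, not the underlying cell counts.

```python
# Minimal sketch: relative risk and Wald 95% CI from a 2x2 table.
import math

def relative_risk(a, n1, c, n2):
    """RR and 95% CI from events a/n1 (exposed) vs. c/n2 (unexposed)."""
    rr = (a / n1) / (c / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)   # SE of log(RR)
    lo, hi = (math.exp(math.log(rr) + z * se) for z in (-1.96, 1.96))
    return rr, lo, hi

# hypothetical counts: early graft loss within 1 year, obese vs. nonobese
rr, lo, hi = relative_risk(a=9, n1=15, c=2, n2=17)
print(f"RR {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```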
Griffin, Andrew S; Gage, Shawn M; Lawson, Jeffrey H; Kim, Charles Y
2017-01-01
This study evaluated whether the use of a staged Hemodialysis Reliable Outflow (HeRO; Merit Medical, South Jordan, Utah) implantation strategy incurs increased early infection risk compared with conventional primary HeRO implantation. A retrospective review was performed of 192 hemodialysis patients who underwent HeRO graft implantation: 105 patients underwent primary HeRO implantation in the operating room, and 87 underwent a staged implantation in which a previously inserted tunneled central venous catheter was used for guidewire access for the venous outflow component. Within the staged implantation group, 32 were performed via an existing tunneled hemodialysis catheter (incidentally staged), and 55 were performed via a tunneled catheter inserted across a central venous occlusion in an interventional radiology suite specifically for HeRO implantation (intentionally staged). Early infection was defined as an episode of bacteremia or HeRO infection requiring resection within 30 days of HeRO implantation. For staged HeRO implantations, the median interval between tunneled catheter insertion and conversion to a HeRO graft was 42 days. The overall HeRO-related infection rate within 30 days of implantation was 8.6% for primary HeRO implantation and 2.3% for staged implantations (P = .12). The rates of early bacteremia and HeRO infection requiring surgical resection were not significantly different between groups (P = .19 and P = .065, respectively), nor were age, gender, laterality, anastomosis to an existing arteriovenous access, human immunodeficiency virus status, diabetes, steroids, chemotherapy, body mass index, or graft location. None of the patient variables, techniques, or graft-related variables correlated significantly with the early infection rate. The staged HeRO implantation strategy did not result in an increased early infection risk compared with conventional primary implantation and is thus a reasonable strategy for HeRO insertion in hemodialysis patients with complex central venous disease. Copyright © 2016 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
Ura, M; Sakata, R; Nakayama, Y; Arai, Y; Oshima, S; Noda, K
2000-02-15
There has been debate regarding whether technically demanding right internal thoracic artery (RITA) grafting via the transverse sinus can be extensively applied to patients in high-risk groups, such as patients with a small body size, elderly patients, and women with relatively smaller coronary artery and internal thoracic artery (ITA) diameters. Of the 1456 patients who underwent isolated coronary artery bypass grafting between January 1989 and December 1998 at Kumamoto Central Hospital, 393 patients (mean age, 62.4+/-9.0 years) with the RITA anastomosed to the major branches of the circumflex artery were studied. Left ITA grafting was performed in 384 patients, and in 369, the in situ left ITA was anastomosed to the left anterior descending coronary artery using standard methods. Early postoperative angiography was performed in 381 patients. The RITA was occluded in 4 patients, and a string-like artery and significant stenosis were present in 11 and 7 patients, respectively; RITA graft patency was thus 94.1%. Of the preoperative variables and angiographic data, simple and multiple logistic regression analyses identified decreased severity of native stenosis, diffuse sclerosis of native vessels, and residual side branches of the ITA as independent predictors of nonfunctional grafts. The method of ITA grafting did not influence the patency of the graft. The excellent patency rate demonstrated by this study, the largest angiographic study to date of RITA grafting via the transverse sinus, indicates that this technique can provide reliable revascularization of the left ventricle and that it has the potential to be applied to a wide variety of patients with diseased circumflex arteries.
NASA Astrophysics Data System (ADS)
Yamazaki, Mutsuo; Sato, Shunichi; Saito, Daizo; Okada, Yoshiaki; Ashida, Hiroshi; Obara, Minoru
2004-07-01
Adhesion monitoring of grafted skins is very important in the successful treatment of severe burns and traumas. However, skin graft adhesion is currently assessed by visual observation, which is not reliable and gives no quantitative information. When the grafted skin adheres well, neovascularities will be generated in the grafted skin tissue, and therefore adhesion may be monitored by detecting the neovascularities. In this study, we attempted to measure photoacoustic signals originating from the neovascularities by irradiating the grafted skins with 532-nm nanosecond light pulses in rat autograft and allograft models. The measurement showed that immediately after skin grafting, the photoacoustic signal originating from the blood in the dermis was negligibly small, while 6 - 24 hours after skin grafting, a signal was observed from the dermis in the graft. We did not observe a significant difference between the signals from the autograft and the allograft models. These results indicate that neovascularization would take place within 6 hours after skin grafting, and that the rejection reaction would have little effect on adhesion within the early hours after grafting.
Effects of a Fibrin Sealant on Skin Graft Tissue Adhesion in a Rodent Model.
Balceniuk, Mark D; Wingate, Nicholas A; Krein, Howard; Curry, Joseph; Cognetti, David; Heffelfiner, Ryan; Luginbuhl, Adam
2016-07-01
To establish a rodent model for skin grafting with fibrin glue and examine the effects of fibrin glue on the adhesive strength of skin grafts without bolsters. Animal cohort. Academic hospital laboratory. Three skin grafts were created using a pneumatic microtome on the dorsum of 12 rats. Rats were evenly divided into experimental (n = 6) and control (n = 6) groups. The experimental group received a thin layer of fibrin glue between the graft and wound bed, and the control group was secured with standard bolsters. Adherence strength of the skin graft was tested by measuring the force required to shear the graft from the recipient wound. Adhesion strength measurements were taken on postoperative days (PODs) 1, 2, and 3. The experimental group required an average force of 719 g on POD1, 895 g on POD2, and 676 g on POD3, while the average force in the control group was 161 g on POD1, 257 g on POD2, and 267 g on POD3. On each of the 3 PODs, there was a significant difference in adherence strength between the experimental and control groups (P = .036, P = .029, P = .024). There is a significant difference in the adhesion strength of skin grafts to the wound bed in the early postoperative period between the 2 groups. In areas of high mobility, using the fibrin sealant can keep the graft immobile during the critical phases of early healing. © American Academy of Otolaryngology—Head and Neck Surgery Foundation 2016.
Barbier, Louise; Cesaretti, Manuela; Dondero, Federica; Cauchy, François; Khoy-Ear, Linda; Aoyagi, Takeshi; Weiss, Emmanuel; Roux, Olivier; Dokmak, Safi; Francoz, Claire; Paugam-Burtz, Catherine; Sepulveda, Ailton; Belghiti, Jacques; Durand, François; Soubrane, Olivier
2016-11-01
Older liver grafts have been considered in the past decade because of organ shortage. The aim was to compare outcomes after liver transplantation with either younger or older donors. Patients transplanted in our center between 2004 and 2014 with younger donors (younger than 60 years; n = 253) were compared with those transplanted with older donors (older than 75 years; n = 157). Multiorgan transplantations, split grafts, and non-heart-beating donors were not included. Donors in the older group were mostly women who had died from stroke, and only 3 had experienced cardiac arrest. Liver tests were significantly better in the older group than in the younger group. There was no difference regarding cold ischemia time, model for end-stage liver disease score, or steatosis. There was no significant difference regarding primary nonfunction and dysfunction, hepatic artery and biliary complications, or retransplantation rates. Graft survival was not different (65% and 64% in the older and younger groups, P = 0.692). Within the older group, hepatitis C infection, retransplantation, and emergency transplantation were associated with poor graft survival. Provided liver tests are normal and there is no history of cardiac arrest in the donor, older liver grafts (>75 years) may be safely allocated to non-hepatitis C-infected recipients in the setting of a first and nonurgent transplantation.
Jansson, L; Eizirik, D L; Pipeleers, D G; Borg, L A; Hellerström, C; Andersson, A
1995-08-01
Hyperglycemia-induced beta-cell dysfunction may be an important component in the pathogenesis of non-insulin-dependent diabetes mellitus. However, most available data in this field were obtained from rodent islets. To investigate the relevance of this hypothesis for human beta-cells in vivo, human pancreatic islets were transplanted under the renal capsule of nude mice. Experimental groups were chosen so that grafted islets were exposed to either hyper- or normoglycemia or combinations of these for 4 or 6 wk. Grafts of normoglycemic recipients responded with an increased insulin release to a glucose stimulus during perfusion, whereas grafts of hyperglycemic recipients failed to respond to glucose. The insulin content of the grafts in the latter groups was only 10% of those observed in controls. Recipients initially hyperglycemic (4 wk), followed by 2 wk of normoglycemia regained a normal graft insulin content, but a decreased insulin response to glucose remained. No ultrastructural signs of beta-cell damage were observed, with the exception of increased glycogen deposits in animals hyperglycemic at the time of killing. It is concluded that prolonged exposure to a diabetic environment induces a long-term secretory defect in human beta-cells, which is not dependent on the size of the islet insulin stores.
Flores, Stefan; Choi, Judy; Alex, Byron; Mulhall, John P
2011-07-01
Plaque incision and grafting (PIG) surgery for Peyronie's disease (PD) is a recognized management strategy. One of the recognized complications of PIG surgery is the development of postoperative erectile dysfunction (ED). To determine the incidence of ED after PIG surgery and attempt to define predictors of ED development. All patients underwent preoperative cavernosometry. Grafting was performed with either cadaveric pericardium (Tutoplast) or intestinal submucosa (Surgisis). Prior to 2006, the procedure used an H-type incision, whereas after this date, the Egydio approach has been used. Men undergoing PIG completed preoperative and 6-month postoperative International Index of Erectile Function (IIEF) questionnaires. 56 patients were analyzed. Mean patient and partner ages were 57 ± 22 and 54 ± 18 years, respectively. Mean duration of PD at the time of PIG was 22 ± 9 months. Seventy-five percent had curvature alone, 11% had hourglass/indentation deformity, and the remainder had combined curvature/indentation. Mean preoperative curvature was 52 ± 23°. Fifty-two had grafting with Tutoplast, while four had grafting with Surgisis. All men at baseline were capable of generating a penetration rigidity erection. Preoperatively, 50% of men had cavernosal insufficiency and 21% had venous leak (baseline and postoperative erectile function [EF] domain scores were 23 ± 4 and 17 ± 9, respectively [P < 0.01]). Forty-six percent of men experienced a ≥6-point decrease in EF domain score after PIG. The predictors of a ≥6-point reduction in IIEF-EF domain score on multivariable analysis were degree of preoperative curvature, type of plaque incision, patient age, and baseline venous leak. Conclusions. Almost one-half of men had significant reduction in their erectile rigidity after PIG. Reduction was predicted by larger baseline curvature, the Egydio plaque incision technique, older patient age, and the presence of venous leak at baseline. Based on these data, we discourage older men, those with venous leak, and those with profound curvature from considering PIG surgery. © 2011 International Society for Sexual Medicine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kundu, Sanjoy, E-mail: sanjoy_kundu40@hotmail.com; Modabber, Milad; You, John M.
2011-10-15
Purpose: To assess the safety and effectiveness of a polytetrafluoroethylene (PTFE)-encapsulated nitinol stent (Bard Peripheral Vascular, Tempe, AZ) for treatment of hemodialysis-related central venous occlusions. Materials and Methods: The study design was a single-center nonrandomized retrospective cohort of patients from May 2004 to August 2009, for a total of 64 months. There were 14 patients (mean age 60 years, range 50-83 years; 13 male, 1 female). All patients had autogenous fistulas. All 14 patients had central venous occlusions and presented with the following clinical symptoms: extremity swelling (14%, 2 of 14), extremity and face swelling (72%, 10 of 14), and face swelling/edema (14%, 2 of 14). There was evidence of access dysfunction with decreased access flow in 36% (5 of 14) of patients. There were prior interventions or previous line placement at the site of the central venous lesion in all 14 patients. Results were assessed by recurrence of clinical symptoms and function of the access circuit (National Kidney Foundation recommended criteria). Results: Sixteen consecutive straight stent grafts were implanted in 14 patients. Average treated lesion length was 5.0 cm (range, 0.9-7 cm). All 14 patients had complete central venous occlusion (100% stenosis). The central venous occlusions were located as follows: right subclavian and brachiocephalic vein (21%, 3 of 14), right brachiocephalic vein (36%, 5 of 14), left brachiocephalic vein (36%, 5 of 14), and bilateral brachiocephalic vein (7%, 1 of 14). A total of 16 PTFE stent grafts were placed, with diameters of 10 or 12 mm. The average stent length was 6.1 cm (range, 4-8 cm). Technical (deployment), anatomic (<30% residual stenosis), clinical (resolution of symptoms), and hemodynamic (resolution of access dysfunction) success rates were all 100%. At 3, 6, and 9 months, primary patency of the treated area and access circuit was 100% (14 of 14). Conclusions: This PTFE-encapsulated stent graft demonstrates encouraging intermediate-term patency results for central vein occlusions. Further prospective studies with long-term assessment and larger patient populations will be required.
Roles of inflammation and apoptosis in experimental brain death-induced right ventricular failure.
Belhaj, Asmae; Dewachter, Laurence; Rorive, Sandrine; Remmelink, Myriam; Weynand, Birgit; Melot, Christian; Galanti, Laurence; Hupkens, Emeline; Sprockeels, Thomas; Dewachter, Céline; Creteur, Jacques; McEntee, Kathleen; Naeije, Robert; Rondelet, Benoît
2016-12-01
Right ventricular (RV) dysfunction remains the leading cause of early death after cardiac transplantation. Methylprednisolone is used to improve graft quality; however, evidence for that remains empirical. We sought to determine whether methylprednisolone, acting on inflammation and apoptosis, might prevent brain death-induced RV dysfunction. After randomization to placebo (n = 11) or to methylprednisolone (n = 8; 15 mg/kg), 19 pigs were assigned to a brain-death procedure. The animals underwent hemodynamic evaluation at 1 and 5 hours after the Cushing reflex (i.e., hypertension and bradycardia), were then euthanized, and myocardial tissue was sampled. This was repeated in a control group (n = 8). At 5 hours after the Cushing reflex, brain death resulted in increased pulmonary artery pressure (27 ± 2 vs 18 ± 1 mm Hg) and a 30% decrease in the ratio of end-systolic to pulmonary arterial elastances (Ees/Ea). Cardiac output and right atrial pressure did not change. This was prevented by methylprednisolone. Brain death-induced RV dysfunction was associated with increased RV expression of heme oxygenase-1, interleukin (IL)-6, IL-10, IL-1β, tumor necrosis factor (TNF)-α, IL-1 receptor-like (ST)-2, signal transducer and activator of transcription-3, intercellular adhesion molecules-1 and -2, vascular cell adhesion molecule-1, and neutrophil infiltration, whereas IL-33 expression decreased. RV apoptosis was confirmed by terminal deoxynucleotidyl transferase-mediated deoxyuridine triphosphate nick-end labeling (TUNEL) staining. Methylprednisolone pretreatment prevented RV-arterial uncoupling and decreased RV expression of TNF-α, IL-1 receptor-like-2, intercellular adhesion molecule-1, vascular cell adhesion molecule-1, and neutrophil infiltration. RV Ees/Ea was inversely correlated with RV TNF-α and IL-6 expression. Brain death-induced RV dysfunction is associated with RV activation of inflammation and apoptosis and is partly limited by methylprednisolone. Copyright © 2016 International Society for Heart and Lung Transplantation. Published by Elsevier Inc. All rights reserved.
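The inverse correlation noted above between RV-arterial coupling (Ees/Ea) and myocardial TNF-α expression is the kind of relationship usually summarized by a correlation coefficient. The sketch below uses a Pearson correlation on hypothetical paired values; the abstract does not report the raw data or state whether a Pearson or Spearman coefficient was used.

```python
# Minimal sketch: correlation between Ees/Ea and relative TNF-α expression.
from scipy.stats import pearsonr

ees_ea = [1.9, 1.6, 1.4, 1.3, 1.1, 0.9, 0.8, 0.7]        # RV-arterial coupling, illustrative
tnf_alpha = [1.0, 1.3, 1.5, 1.8, 2.1, 2.6, 2.9, 3.4]     # relative TNF-α expression, illustrative

r, p = pearsonr(ees_ea, tnf_alpha)
print(f"r = {r:.2f}, p = {p:.4f}")
```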
Weiler, Andreas; Peine, Ricarda; Pashmineh-Azar, Alireza; Abel, Clemens; Südkamp, Norbert P; Hoffmann, Reinhard F G
2002-02-01
Interference fit fixation of soft-tissue grafts has recently raised strong interest because it allows for anatomic graft fixation that may increase knee stability and graft isometry. Although clinical data show promising results, no data exist on how tendon healing progresses using this fixation. The purpose of the present study was to investigate anterior cruciate ligament (ACL) reconstruction biomechanically using direct tendon-to-bone interference fit fixation with biodegradable interference screws in a sheep model. Animal study. Thirty-five mature sheep underwent ACL reconstruction with an autologous Achilles tendon split graft. Grafts were directly fixed with poly-(D,L-lactide) interference screws. Animals were euthanized after 6, 9, 12, 24, and 52 weeks and standard biomechanical evaluations were performed. All grafts at time zero failed by pullout from the bone tunnel, whereas grafts at 6 and 9 weeks failed intraligamentously at the screw insertion site. At 24 and 52 weeks, grafts failed by osteocartilaginous avulsion. At 24 weeks, interference screws were macroscopically degraded. At 6 and 9 weeks tensile stress was only 6.8% and 9.6%, respectively, of the graft tissue at time zero. At 52 weeks, tensile stress of the reconstruction equaled 63.8% and 47.3% of the Achilles tendon graft at time zero and the native ACL, respectively. A complete restitution of anterior-posterior drawer displacement was found at 52 weeks compared with the time-zero reconstruction. It was found that over the whole healing period the graft fixation proved not to be the weak link of the reconstruction and that direct interference fit fixation withstands loads without motion restriction in the present animal model. The weak link during the early healing stage was the graft at its tunnel entrance site, leading to a critical decrease in mechanical properties. This finding indicates that interference fit fixation of a soft-tissue graft may additionally alter the mechanical properties of the graft in the early remodeling stage because of a possible tissue compromise at the screw insertion site. Although mechanical properties of the graft tissue had not returned to normal at 1 year compared with those at time zero, knee stability had returned to normal at that time. There was no graft pullout after 24 weeks, indicating that screw degradation does not compromise graft fixation.
Hybrid endovascular stent-grafting technique for patent ductus arteriosus in an adult.
Kainuma, S; Kuratani, T; Sawa, Y
2011-09-01
A 51-year-old man was referred to our institution for patent ductus arteriosus (PDA) complicated by left ventricular dysfunction and pulmonary hypertension. Surgical closure of a PDA is usually carried out via a small posterior thoracotomy. However, thoracoscopic procedures are probably not appropriate in adults because of the frequency of calcification and the greater risk of rupture while ligating the ductus. To minimize surgical trauma, we used hybrid endovascular stent grafting combined with revascularization of the left subclavian artery, which enabled us to eliminate shunt flow to the pulmonary artery. At 11-month follow-up, the patient was asymptomatic and showed no complications. © Georg Thieme Verlag KG Stuttgart · New York.
Early Focal Segmental Glomerulosclerosis as a Cause of Renal Allograft Primary Nonfunction
Griffin, Emma J.; Thomson, Peter C.; Kipgen, David; Clancy, Marc; Daly, Conal
2013-01-01
Background. Primary focal segmental glomerulosclerosis (FSGS) is one of the commonest causes of glomerular disease and if left untreated will often progress to established renal failure. In many cases the best treatment option is renal transplantation; however primary FSGS may rapidly recur in renal allografts and may contribute to delayed graft function. We present a case of primary nonfunction in a renal allograft due to biopsy-proven FSGS. Case Report. A 32-year-old man presented with serum albumin of 22 g/L, proteinuria quantified at 12 g/L, and marked peripheral oedema. Renal biopsy demonstrated tip-variant FSGS. Despite treatment, the patient developed progressive renal dysfunction and was commenced on haemodialysis. Cadaveric renal transplantation was undertaken; however this was complicated by primary nonfunction. Renal biopsies failed to demonstrate evidence of acute rejection but did demonstrate clear evidence of FSGS. The patient was treated to no avail. Discussion. Primary renal allograft nonfunction following transplantation is often due to acute kidney injury or acute rejection. Recurrent FSGS is recognised as a phenomenon that drives allograft dysfunction but is not traditionally associated with primary nonfunction. This case highlights FSGS as a potentially aggressive process that, once active in the allograft, may prove refractory to targeted treatment. Preemptive therapies in patients deemed to be at high risk of recurrent disease may be appropriate and should be considered. PMID:23781382
Niu, Mengliang; Huang, Yuan; Sun, Shitao; Sun, Jingyu; Cao, Haishun; Shabala, Sergey
2018-01-01
Abstract Plant salt tolerance can be improved by grafting onto salt-tolerant rootstocks. However, the underlying signaling mechanisms behind this phenomenon remain largely unknown. To address this issue, we used a range of physiological and molecular techniques to study responses of self-grafted and pumpkin-grafted cucumber plants exposed to 75 mM NaCl stress. Pumpkin grafting significantly increased the salt tolerance of cucumber plants, as revealed by higher plant dry weight, chlorophyll content and photochemical efficiency (Fv/Fm), and lower leaf Na+ content. Salinity stress resulted in a sharp increase in H2O2 production, reaching a peak 3 h after salt treatment in the pumpkin-grafted cucumber. This enhancement was accompanied by elevated relative expression of respiratory burst oxidase homologue (RBOH) genes RbohD and RbohF and a higher NADPH oxidase activity. However, this increase was much delayed in the self-grafted plants, and the difference between the two grafting combinations disappeared after 24 h. The decreased leaf Na+ content of pumpkin-grafted plants was achieved by higher Na+ exclusion in roots, which was driven by the Na+/H+ antiporter energized by the plasma membrane H+-ATPase, as evidenced by the higher plasma membrane H+-ATPase activity and higher transcript levels for PMA and SOS1. In addition, early stomatal closure was also observed in the pumpkin-grafted cucumber plants, reducing water loss and maintaining the plant’s hydration status. When pumpkin-grafted plants were pretreated with an NADPH oxidase inhibitor, diphenylene iodonium (DPI), the H2O2 level decreased significantly, to the level found in self-grafted plants, resulting in the loss of the salt tolerance. Inhibition of the NADPH oxidase-mediated H2O2 signaling in the root also abolished a rapid stomatal closure in the pumpkin-grafted plants. We concluded that the pumpkin-grafted cucumber plants increase their salt tolerance via a mechanism involving the root-sourced respiratory burst oxidase homologue-dependent H2O2 production, which enhances Na+ exclusion from the root and promotes an early stomatal closure. PMID:29145593
Is Bilateral Internal Mammary Arterial Grafting Beneficial for Patients Aged 75 Years or Older?
Itoh, Satoshi; Kimura, Naoyuki; Adachi, Hideo; Yamaguchi, Atsushi
2016-07-25
Although bilateral internal mammary artery (BIMA) grafting is performed with increasing regularity in elderly patients, whether it is truly beneficial, and therefore indicated, in these patients remains uncertain. We retrospectively investigated early and late outcomes of BIMA grafting in patients aged ≥75 years. We identified 460 patients aged ≥75 years from among 2,618 patients who underwent either single internal mammary artery (SIMA) grafting (n=293) or BIMA grafting (n=107). Early outcomes did not differ between the SIMA and BIMA patients (30-day mortality: 1.7% vs. 0%, P=0.39; sternal wound infection: 1.0% vs. 4.7%; P=0.057). Late outcomes, 10-year survival in particular, were improved in the BIMA group (36.6% vs. 48.1%, P=0.033). In the analysis of the results in propensity score-matched groups (196 patients in the SIMA group, 98 patients in the BIMA group), improved 10-year survival was documented in the BIMA group (34.8% vs. 47.6%, P=0.030). Cox proportional regression analysis showed SIMA usage (non-use of BIMA) to be a predictor for late mortality (hazard ratio: 0.65, 95% confidence interval: 0.43-0.98, P=0.042). We further compared outcomes between the total non-elderly patients (n=2,158) and total elderly patients (n=460). BIMA usage was similar, as was 30-day mortality (1.0% vs. 1.3%, respectively). A survival advantage, with no increase in early mortality, can be expected from BIMA grafting in patients aged ≥75 years. (Circ J 2016; 80: 1756-1763).
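The analysis above pairs propensity-score matching with Cox proportional-hazards regression. As a rough sketch of that workflow (not the study's actual code), the Python below matches each BIMA patient to two SIMA controls on an estimated propensity score and then fits a Cox model for late mortality; the column names (bima, age, lvef, diabetes, time_years, died) and the choice of covariates are illustrative assumptions.

```python
# Hedged sketch: propensity-score matching followed by Cox regression.
# All column names and covariates are hypothetical placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors
from lifelines import CoxPHFitter

def propensity_match(df, treatment="bima", covariates=("age", "lvef", "diabetes")):
    """1:2 nearest-neighbour matching on the estimated propensity score."""
    X = df[list(covariates)].values
    ps = LogisticRegression(max_iter=1000).fit(X, df[treatment]).predict_proba(X)[:, 1]
    df = df.assign(ps=ps)
    treated, control = df[df[treatment] == 1], df[df[treatment] == 0]
    nn = NearestNeighbors(n_neighbors=2).fit(control[["ps"]])
    _, idx = nn.kneighbors(treated[["ps"]])
    return pd.concat([treated, control.iloc[idx.ravel()]]).reset_index(drop=True)

def late_mortality_model(matched):
    """Cox proportional-hazards model for late mortality on the matched cohort."""
    cph = CoxPHFitter()
    cph.fit(matched[["time_years", "died", "bima", "age", "lvef", "diabetes"]],
            duration_col="time_years", event_col="died")
    return cph  # cph.summary contains hazard ratios and 95% CIs
```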
Hueso, Miguel; Navarro, Estanis; Moreso, Francesc; O'Valle, Francisco; Pérez-Riba, Mercè; Del Moral, Raimundo García; Grinyó, Josep M; Serón, Daniel
2010-04-01
Grafts with subclinical rejection associated with interstitial fibrosis and tubular atrophy (SCR+IF/TA) show poorer survival than grafts with subclinical rejection without IF/TA (SCR). Aiming to detect differences among SCR+IF/TA and SCR, we immunophenotyped the inflammatory infiltrate (CD45, CD3, CD20, CD68) and used a low-density array to determine levels of T(H)1 (interleukin IL-2, IL-3, gamma-interferon, tumor necrosis factor-alpha, lymphotoxin-alpha, lymphotoxin-beta, granulocyte-macrophage colony-stimulating factor) and T(H)2 (IL-4, IL-5, IL-6, IL-10, and IL-13) transcripts as well as of IL-2R (as marker for T-cell activation) in 31 protocol biopsies of renal allografts. Here we show that grafts with early IF/TA and SCR can be distinguished from grafts with SCR on the basis of the activation of IL-10 gene expression and of an increased infiltration by B-lymphocytes in a cellular context in which the degree of T-cell activation is similar in both groups of biopsies, as demonstrated by equivalent levels of IL-2R mRNA. These results suggest that the up-regulation of the IL-10 gene expression, as well as an increased proportion of B-lymphocytes in the inflammatory infiltrates, might be useful as markers of early chronic lesions in grafts with SCR.
Hueso, Miguel; Navarro, Estanis; Moreso, Francesc; O'Valle, Francisco; Pérez-Riba, Mercè; del Moral, Raimundo García; Grinyó, Josep M.; Serón, Daniel
2010-01-01
Grafts with subclinical rejection associated with interstitial fibrosis and tubular atrophy (SCR+IF/TA) show poorer survival than grafts with subclinical rejection without IF/TA (SCR). Aiming to detect differences among SCR+IF/TA and SCR, we immunophenotyped the inflammatory infiltrate (CD45, CD3, CD20, CD68) and used a low-density array to determine levels of TH1 (interleukin IL-2, IL-3, γ-interferon, tumor necrosis factor-α, lymphotoxin-α, lymphotoxin-β, granulocyte-macrophage colony-stimulating factor) and TH2 (IL-4, IL-5, IL-6, IL-10, and IL-13) transcripts as well as of IL-2R (as marker for T-cell activation) in 31 protocol biopsies of renal allografts. Here we show that grafts with early IF/TA and SCR can be distinguished from grafts with SCR on the basis of the activation of IL-10 gene expression and of an increased infiltration by B-lymphocytes in a cellular context in which the degree of T-cell activation is similar in both groups of biopsies, as demonstrated by equivalent levels of IL-2R mRNA. These results suggest that the up-regulation of the IL-10 gene expression, as well as an increased proportion of B-lymphocytes in the inflammatory infiltrates, might be useful as markers of early chronic lesions in grafts with SCR. PMID:20150436
Patel, Rakesh P; Katsargyris, Athanasios; Verhoeven, Eric L G; Adam, Donald J; Hardman, John A
2013-12-01
The chimney technique in endovascular aortic aneurysm repair (Ch-EVAR) involves placement of a stent or stent-graft parallel to the main aortic stent-graft to extend the proximal or distal sealing zone while maintaining side branch patency. Ch-EVAR can facilitate endovascular repair of juxtarenal and aortic arch pathology using available standard aortic stent-grafts, therefore, eliminating the manufacturing delays required for customised fenestrated and branched stent-grafts. Several case series have demonstrated the feasibility of Ch-EVAR both in acute and elective cases with good early results. This review discusses indications, technique, and the current available clinical data on Ch-EVAR.
Kharod-Dholakia, Bhairavi; Randleman, J Bradley; Bromley, Jennifer G; Stulting, R Doyle
2015-06-01
To analyze current practice patterns in the prevention and treatment of corneal graft rejection for both penetrating keratoplasty (PK) and endothelial keratoplasty (EK) and to compare these patterns with previously reported practices. In 2011, an electronic survey was sent to 670 members of the Cornea Society worldwide addressing the routine postoperative management of corneal transplants at different time points, treatment of various manifestations of corneal graft rejection, and preferred surgical techniques. A total of 204 of 670 surveys (30%) were returned and evaluated. All respondents used topical corticosteroids for routine postoperative management and treatment of endothelial graft rejection. Prednisolone was the topical steroid of choice in all clinical scenarios, similar to previous surveys from 1989 to 2004. Use of subconjunctival and systemic steroids increased for many scenarios of probable and definite graft rejection. Routine use of prednisolone decreased by approximately 10% from previous surveys, whereas difluprednate was used in 13% of high-risk eyes during the first 6 months. Dexamethasone, fluorometholone, and loteprednol use remained stable. Adjunctive topical cyclosporine use increased significantly for PK and EK. EK was the preferred technique for endothelial dysfunction, whereas PK and deep anterior lamellar keratoplasty were both used for keratoconus and anterior scars. Most respondents (75%) felt that graft rejection occurs more frequently after PK than after EK. Prednisolone remains the treatment of choice for management and treatment of graft rejection; however, its use has declined slightly since the introduction of difluprednate. Despite perceived differences in rejection rates, there were no differences in prophylactic steroid treatment for PK and EK.
Overview of adult congenital heart transplants
Morales, David
2018-01-01
Transplantation for adult patients with congenital heart disease (ACHD) is a growing clinical endeavor in the transplant community. Understanding the results and defining potential high-risk patient subsets will allow optimization of patient outcomes. This review summarizes the scope of ACHD transplantation, the mechanisms of late ventricular dysfunction, the ACHD population at risk of developing heart failure, the indications and potential contraindications for transplant, surgical considerations, and post-transplant outcomes. The findings reveal that 3.3% of adult heart transplants occur in ACHD patients. The potential mechanisms for the development of late ventricular dysfunction include a morphologic systemic right ventricle, altered coronary perfusion, and ventricular noncompaction. The indications for transplant in ACHD patients include systemic ventricular failure refractory to medical therapy, Fontan patients failing from chronic passive pulmonary circulation, and progressive cyanosis leading to functional decline. Transplantation in ACHD patients can be quite complex and may require extensive reconstruction of the branch pulmonary arteries, systemic veins, or the aorta. Vasoplegia, bleeding, and graft right ventricular dysfunction can complicate the immediate post-transplant period. The post-transplant operative mortality ranges between 14% and 39%. The majority of early mortality occurs in ACHD patients with univentricular congenital heart disease. However, there has been improvement in operative survival in more contemporary studies. In conclusion, the experience with cardiac transplantation for ACHD patients with end-stage heart failure is growing, and high-risk patient subsets have been defined. Significant strides have been made in developing evidence-based guidelines of indications for transplant, and the intraoperative management of complex reconstruction has evolved. With proper patient selection, more aggressive use of mechanical circulatory support, and earlier referral of patients with failing Fontan physiology, outcomes should continue to improve. PMID:29492392
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patel, Rakesh P., E-mail: rpatel9@nhs.net; Katsargyris, Athanasios, E-mail: kthanassos@yahoo.com; Verhoeven, Eric L. G., E-mail: Eric.Verhoeven@klinikum-nuernberg.de
The chimney technique in endovascular aortic aneurysm repair (Ch-EVAR) involves placement of a stent or stent-graft parallel to the main aortic stent-graft to extend the proximal or distal sealing zone while maintaining side branch patency. Ch-EVAR can facilitate endovascular repair of juxtarenal and aortic arch pathology using available standard aortic stent-grafts, therefore, eliminating the manufacturing delays required for customised fenestrated and branched stent-grafts. Several case series have demonstrated the feasibility of Ch-EVAR both in acute and elective cases with good early results. This review discusses indications, technique, and the current available clinical data on Ch-EVAR.
2013-01-01
Zhao, Hailin; Yoshida, Akira; Xiao, Wei; Ologunde, Rele; O'Dea, Kieran P; Takata, Masao; Tralau-Stewart, Catherine; George, Andrew J T; Ma, Daqing
2013-10-01
Prolonged hypothermic storage elicits severe ischemia-reperfusion injury (IRI) to renal grafts, contributing to delayed graft function (DGF) and episodes of acute immune rejection and shortened graft survival. Organoprotective strategies are therefore needed for improving long-term transplant outcome. The aim of this study is to investigate the renoprotective effect of xenon on early allograft injury associated with prolonged hypothermic storage. Xenon exposure enhanced the expression of heat-shock protein 70 (HSP-70) and heme oxygenase 1 (HO-1) and promoted cell survival after hypothermia-hypoxia insult in human proximal tubular (HK-2) cells, which was abolished by HSP-70 or HO-1 siRNA. In the brown Norway to Lewis rat renal transplantation, xenon administered to donor or recipient decreased the renal tubular cell death, inflammation, and MHC II expression, while delayed graft function (DGF) was therefore reduced. Pathological changes associated with acute rejection, including T-cell, macrophage, and fibroblast infiltration, were also decreased with xenon treatment. Donors or recipients treated with xenon in combination with cyclosporin A had prolonged renal allograft survival. Xenon protects allografts against delayed graft function, attenuates acute immune rejection, and enhances graft survival after prolonged hypothermic storage. Furthermore, xenon works additively with cyclosporin A to preserve post-transplant renal function.
Impact of the Learning Curve for Endoscopic Vein Harvest on Conduit Quality and Early Graft Patency
Desai, Pranjal; Kiani, Soroosh; Thiruvanthan, Nannan; Henkin, Stanislav; Kurian, Dinesh; Ziu, Pluen; Brown, Alex; Patel, Nisarg; Poston, Robert
2014-01-01
Background Recent studies have suggested that endoscopic vein harvest (EVH) compromises graft patency. To test whether the learning curve for EVH alters conduit integrity owing to increased trauma compared with an open harvest, we analyzed the quality and early patency of conduits procured by technicians with varying EVH experience. Methods During coronary artery bypass grafting, veins were harvested open (n = 10) or by EVH (n = 85) performed by experienced (>900 cases, >30/month) versus novice (<100 cases, <3/month) technicians. Harvested conduits were imaged intraoperatively using optical coherence tomography and on day 5 to assess graft patency using computed tomographic angiography. Results Conduits from experienced (n = 55) versus novice (n = 30) harvesters had similar lengths (33 versus 34 cm) and harvest times (32.4 versus 31.8 minutes). Conduit injury was noted in both EVH groups with similar distribution among disruption of the adventitia (62%), intimal tears at branch points (23%), and intimal or medial dissections (15%), but the incidence of these injuries was less with experienced harvesters and rare in veins procured with an open technique. Overall, the rate of graft attrition was similar between the two EVH groups (6.45% versus 4.34% of grafts; p = 0.552). However, vein grafts with at least 4 intimal or medial dissections showed significantly worse patency (67% versus 96% patency; p = 0.05). Conclusions High-resolution imaging confirmed that technicians inexperienced with EVH are more likely to cause intimal and deep vessel injury to the saphenous vein graft, which increases graft failure risk. Endoscopic vein harvest remains the most common technique for conduit harvest, making efforts to better monitor the learning curve an important public health issue. PMID:21524447
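The attrition comparison reported above (6.45% vs 4.34% of grafts, p = 0.552) is the kind of 2x2 count comparison commonly handled with Fisher's exact test. A minimal sketch, using illustrative placeholder counts rather than the study's raw data:

```python
# Hedged sketch: Fisher's exact test on a hypothetical 2x2 table of
# failed vs patent grafts by harvester experience (not the study's data).
from scipy.stats import fisher_exact

table = [[4, 58],   # experienced: failed, patent (placeholder counts)
         [2, 44]]   # novice: failed, patent (placeholder counts)
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```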
[Organ transplantation and blood transfusion].
Matignon, M; Bonnefoy, F; Lang, P; Grimbert, P
2011-04-01
Pretransplant blood transfusion remains a controversial subject and its history can summarize the last 40 years of transplantation. Until 1971, transfusions were widely used in patients awaiting transplantation, especially due to the anemia induced by chronic renal dysfunction. Then, a noxious effect of preformed anti-HLA antibodies on renal graft survival was reported and pretransplant transfusions were stopped. Between 1972 and 1977, improvement of renal graft survival in patients who received pretransplant transfusions was noted. Therefore, from 1978 on, a systematic policy of pretransplant transfusions was adopted by almost all centres of transplantation. During the eighties, it was again abandoned for several reasons: absence of graft survival improvement in patients treated with cyclosporine, HLA immunization leading to an increased incidence of acute graft rejection, risk of viral disease transmission and human recombinant erythropoietin development. The lack of improvement in graft survival over the past ten years has led the transplant community to look for antigen-specific immunosuppressive strategies to achieve transplantation tolerance. Donor-specific transfusion may have clinical benefits, such as improved long-term graft survival, through modulation of the recipient's cellular immune system, and has recently been reconsidered, especially before living donor transplantation. The immunological mechanisms inducing a tolerance-gaining effect of transfusions remain poorly understood, but the recent discovery of immunomodulatory effects of the apoptotic cells present in cellular products could improve our understanding of the benefits of pretransplant transfusions and could help to develop specific tolerance induction strategies in solid organ transplantation. Copyright © 2011 Elsevier Masson SAS. All rights reserved.
Laky, D; Parascan, Liliana
2007-01-01
Hibernating myocardium represents a prolonged but potentially reversible myocardial contractile dysfunction, an incomplete adaptation caused by chronic myocardial ischemia and persisting at least until blood flow is restored. The purpose of this study was to investigate the morphological changes and whether relations exist among function, metabolism and structure in left ventricular hibernating myocardium. Material and methods. The experimental study was performed on 12 dogs with incomplete coronary obstruction over six weeks, for morphologic study of the ischemic zones. In 48 patients with coronary stenosis, myocardial biopsies were performed during aortocoronary bypass grafting. In 60 patients with valvular disease associated with segmental atherosclerotic coronary obstructions, repeated biopsies were taken from ischemic zones during the surgical interventions. Dyskinetic ischemic areas were identified by angiography, scintigraphy, and low-dose dobutamine echocardiography to assess cell viability. On the myocardial biopsies, various histological, histoenzymological, immunohistochemical and ultrastructural methods were performed. The morphological cardiomyocytic changes can be summarized as: loss of myofilaments, accumulation of glycogen, small mitochondria with reversible lesions, decrease of smooth reticulum, absence of T tubules, depression of titin in a punctate pattern, loss of cardiotin, disorganization of the cytoskeleton, dispersed nuclear heterochromatin, embryofetal dedifferentiation, and persistence of viability. The extracellular matrix is enlarged, with early matrix proteins such as fibronectin and tenascin, and fibroblasts. In the experimental material the morphological changes show similarities with the human biopsies, but are intermixed with postinfarction scar tissue. Redifferentiation of hibernating cells and remodeling of the extracellular matrix are possible after timely revascularization through aortocoronary bypass grafts.
Prolonged immunosuppression preserves nonsensitization status after kidney transplant failure.
Casey, Michael J; Wen, Xuerong; Kayler, Liise K; Aiyer, Ravi; Scornik, Juan C; Meier-Kriesche, Herwig-Ulf
2014-08-15
When kidney transplants fail, transplant medications are discontinued to reduce immunosuppression-related risks. However, retransplant candidates are at risk for allosensitization which prolonging immunosuppression may minimize. We hypothesized that for these patients, a prolonged immunosuppression withdrawal after graft failure preserves nonsensitization status (PRA 0%) better than early immunosuppression withdrawal. We retrospectively examined subjects transplanted at a single center between July 1, 1999 and December 1, 2009 with a non-death-related graft loss. Subjects were stratified by timing of immunosuppression withdrawal after graft loss: early (≤3 months) or prolonged (>3 months). Retransplant candidates were eligible for the main study where the primary outcome was nonsensitization at retransplant evaluation. Non-retransplant candidates were included in the safety analysis only. We found 102 subjects with non-death-related graft loss of which 49 were eligible for the main study. Nonsensitization rates at retransplant evaluation were 30% and 66% for the early and prolonged immunosuppression withdrawal groups, respectively (P=0.01). After adjusting for cofactors such as blood transfusion and allograft nephrectomy, prolonged immunosuppression withdrawal remained significantly associated with nonsensitization (adjusted odds ratio=5.78, 95% CI [1.37-24.44]). No adverse safety signals were seen in the prolonged immunosuppression withdrawal group compared to the early immunosuppression withdrawal group. These results suggest that prolonged immunosuppression may be a safe strategy to minimize sensitization in retransplant candidates and provide the basis for larger or prospective studies for further verification.
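The adjusted odds ratio quoted above (5.78, 95% CI 1.37-24.44) comes from a multivariable logistic model. A minimal sketch of that kind of analysis, assuming hypothetical column names (nonsensitized, prolonged_withdrawal, transfusion, nephrectomy) rather than the study's dataset:

```python
# Hedged sketch: multivariable logistic regression for nonsensitization,
# reporting odds ratios with 95% confidence intervals. Column names are
# placeholders, not the study's variables.
import numpy as np
import statsmodels.formula.api as smf

def adjusted_odds_ratios(df):
    model = smf.logit(
        "nonsensitized ~ prolonged_withdrawal + transfusion + nephrectomy",
        data=df,
    ).fit(disp=False)
    table = np.exp(model.conf_int())   # exponentiate CI bounds onto the OR scale
    table["OR"] = np.exp(model.params)
    return table
```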
Waltz, Robert A; Solomon, Daniel J; Provencher, Matthew T
2014-07-01
Magnetic resonance imaging (MRI) showing an "intact" anterior cruciate ligament (ACL) graft may not correlate well with examination findings. Reasons for an ACL graft dysfunction may be from malpositioned tunnels, deficiency of secondary stabilizers, repeat injuries, or a combination of factors. To evaluate the concordance/discordance of an ACL graft assessment between an arthroscopic evaluation, physical examination, and MRI and secondarily to evaluate the contributing variables to discordance. Case series; Level of evidence, 4. A total of 50 ACL revisions in 48 patients were retrospectively reviewed. The ACL graft status was recorded separately based on Lachman and pivot-shift test data, arthroscopic findings from operative reports, and MRI evaluation and was categorized into 3 groups: intact, partial tear, or complete tear. Two independent evaluators reviewed all of the preoperative radiographs and MRI scans, and interrater and intrarater reliability were evaluated. Concordance and discordance between a physical examination, arthroscopic evaluation, and MRI evaluation of the ACL graft were calculated. Graft position and type, mechanical axis, collateral ligament injuries, chondral and meniscal injuries, and mechanism of injury were evaluated as possible contributing factors using univariate and multivariate analyses. Sensitivity and specificity of MRI to detect a torn ACL graft and meniscal and chondral injuries on arthroscopic evaluation were calculated. The interobserver and intraobserver reliability for the MRI evaluation of the ACL graft were moderate, with combined κ values of .41 and .49, respectively. The femoral tunnel position was vertical in 88% and anterior in 46%. On MRI, the ACL graft was read as intact in 24%; however, no graft was intact on arthroscopic evaluation or physical examination. The greatest discordance was between the physical examination and MRI, with a rate of 52%. An insidious-onset mechanism of injury was significantly associated with discordance between MRI and arthroscopic evaluation of the ACL (P = .0003) and specifically with an intact ACL graft on MRI (P = .0014). The sensitivity and specificity of MRI to detect an ACL graft tear were 60% and 87%, respectively. Caution should be used when evaluating a failed ACL graft with MRI, especially in the absence of an acute mechanism of injury, as it may be unreliable and inconsistent. © 2014 The Author(s).
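The agreement statistics above (sensitivity 60%, specificity 87%, κ of 0.41-0.49) can be computed from 2x2 comparisons of MRI readings against the arthroscopic reference. A minimal sketch, assuming binary vectors coded 1 = torn graft and 0 = not torn:

```python
# Hedged sketch: sensitivity/specificity of MRI vs arthroscopy and
# inter-rater kappa between two MRI readers. Inputs are illustrative
# 0/1 arrays, not the study's data.
from sklearn.metrics import cohen_kappa_score, confusion_matrix

def mri_agreement(arthroscopy, mri_reader1, mri_reader2):
    tn, fp, fn, tp = confusion_matrix(arthroscopy, mri_reader1, labels=[0, 1]).ravel()
    sensitivity = tp / (tp + fn)    # reported as 60% in the abstract
    specificity = tn / (tn + fp)    # reported as 87% in the abstract
    kappa = cohen_kappa_score(mri_reader1, mri_reader2)  # inter-observer agreement
    return sensitivity, specificity, kappa
```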
Mittal, Anubhav; Hickey, Anthony JR; Chai, Chau C; Loveday, Benjamin PT; Thompson, Nichola; Dare, Anna; Delahunt, Brett; Cooper, Garth JS; Windsor, John A; Phillips, Anthony RJ
2011-01-01
Introduction Multiple organ dysfunction is the main cause of death in severe acute pancreatitis. Primary mitochondrial dysfunction plays a central role in the development and progression of organ failure in critical illness. The present study investigated mitochondrial function in seven tissues during early experimental acute pancreatitis. Methods Twenty-eight male Wistar rats (463 ± 2 g; mean ± SEM) were studied. Group 1 (n = 8), saline control; Group 2 (n = 6), caerulein-induced mild acute pancreatitis; Group 3 (n = 7), sham surgical controls; and Group 4 (n = 7), taurocholate-induced severe acute pancreatitis. Animals were euthanased at 6 h from the induction of acute pancreatitis and mitochondrial function was assessed in the heart, lung, liver, kidney, pancreas, duodenum and jejunum by mitochondrial respirometry. Results Significant early mitochondrial dysfunction was present in the pancreas, lung and jejunum in both models of acute pancreatitis; however, the heart, liver, kidney and duodenal mitochondria were unaffected. Conclusions The present study provides the first description of early organ-selective mitochondrial dysfunction in the lung and jejunum during acute pancreatitis. Research is now needed to identify the underlying pathophysiology behind the organ-selective mitochondrial dysfunction, and the potential benefits of early mitochondrial-specific therapies in acute pancreatitis. PMID:21492333
Wahab, Mohamed Abdel; Shehta, Ahmed; Hamed, Hosam; Elshobary, Mohamed; Salah, Tarek; Sultan, Ahmed Mohamed; Fathy, Omar; Elghawalby, Ahmed; Yassen, Amr; Shiha, Usama
2015-01-01
Introduction Early hepatic venous outflow obstruction (HVOO) is a rare but serious complication after liver transplantation, which may result in graft loss. We report a case of early HVOO after living donor liver transplantation, which was managed by ectopic placement of a Foley catheter. Presentation A 51-year-old male patient with end-stage liver disease received a right hemi-liver graft. On the first postoperative day the patient developed impairment of liver function. Doppler ultrasound (US) showed absence of blood flow in the right hepatic vein without thrombosis. The decision was made to re-explore the patient; exploration showed torsion of the graft upward and to the right side, causing HVOO. This was managed by ectopic placement of a Foley catheter between the graft and the diaphragm and chest wall. The catheter was gradually deflated under Doppler US guidance and the patient was discharged without complications. Discussion Mechanical HVOO results from kinking or twisting of the venous anastomosis due to anatomical mismatch between the graft and the recipient abdomen. It should be managed surgically by repositioning the graft or redoing the venous anastomosis. Several ideas have been suggested for repositioning and fixation of the graft, including the use of Sengstaken–Blakemore tubes, tissue expanders, and a surgical glove expander. Conclusion We report the use of a Foley catheter to temporarily fix the graft and correct the HVOO. It is a simple and safe technique that can be easily monitored and removed under Doppler US without any complications. PMID:25805611
[Deep alkali burns: Evaluation of a two-step surgical strategy].
Devinck, F; Deveaux, C; Bennis, Y; Deken-Delannoy, V; Jeanne, M; Martinot-Duquennoy, V; Guerreschi, P; Pasquesoone, L
2018-04-10
Chemical burns are rare but often lead to deep cutaneous lesions. Alkali agents have a deep and long-lasting penetrating power, causing burns that evolve over several days. The local treatment for these patients is excision of the wound and split-thickness skin grafting. Early excision and immediate skin grafting of alkali burns are more likely to be complicated by graft failure and delayed wound healing. We propose a two-step method that delays skin grafting until two to three days after burn wound excision. Our population included 25 controls and 16 cases. Men were predominant, with a mean age of 41.9 years. In 78% of cases, burns were located on the lower limbs. The mean delay between the burn and excision was 16.5 days. In cases, the skin graft was performed at a mean of 11.3 days after the initial excision. We did not find any significant difference between the two groups for the total skin surface affected, topography of the burns and the causal agent. Wound healing was significantly shorter in cases vs controls (37.5 days vs 50.3 days; P<0.025). Furthermore, we observed a decreased number of graft failures in cases vs controls (13.3% vs 46.7%; P=0.059). Our study shows the relevance of a two-step surgical strategy in patients with alkali chemical burns. Early excision followed by interval skin grafting is associated with quicker wound healing and a decreased rate of graft failure. Copyright © 2018 Elsevier Masson SAS. All rights reserved.
Aitken, Emma L; Jackson, Andrew J; Kingsmore, David B
2014-01-01
Early cannulation arteriovenous grafts (ecAVGs), such as the GORE Acuseal, have "low bleed" properties permitting cannulation within 24 hours of insertion. They may provide an alternative to tunneled central venous catheters (and associated line complications) in patients requiring urgent vascular access. We present our early experience of 37 patients treated with the GORE Acuseal ecAVG. A total of 11 upper limb, 24 lower limb and 2 complex graft procedures were performed. Indications for ecAVG were as follows: bridge to transplantation (21.6%); bridge to arteriovenous fistula (AVF) maturation (8.1%); AVF salvage (8.1%); no native options (67.6%, including 17 patients with bilateral central vein stenosis); 36 AVGs (97.3%) were successfully cannulated. Mean time to first cannulation: 30.4±23.4 hours (range: 2-192). Primary and secondary patency rates at 3, 6 and 12 months were 64.9%, 48.6%, 32.4% and 70.2%, 59.4%, 40.5% respectively. The systemic bacteremia rate was 0.2 per 1,000 access days. There was one perioperative death. Other complications included hematoma at cannulation sites (n=9), pseudoaneurysm (n=3) and local infection at graft site (n=6). A total of 26 of 37 patients (70.6%) achieved a "personal vascular access solution": bridge to transplantation (n=8), bridge to functioning AVF/interposition AVG (n=5), maintenance hemodialysis via ecAVG (n=13); death with functioning AVG (n=1). Early experience with the GORE Acuseal is encouraging. Patency and bacteremia rates are at least comparable to standard polytetrafluoroethylene grafts. ecAVGs have permitted cannulation within 24 hours of insertion and line avoidance in the majority of patients. Nearly three-quarters of patients achieved a definitive "personal vascular access solution" from their ecAVG.
Hurley-Sanders, Jennifer L; Sladky, Kurt K; Nolan, Elizabeth C; Loomis, Michael R
2015-09-01
A 2-yr-old female red wolf (Canis rufus gregoryi) sustained a degloving injury to the left thoracic limb while in a display habitat. Initial attempts to resolve the extensive wound by using conservative measures were unsuccessful. Subsequent treatment using a free skin graft consisted first of establishment of an adequate granulation bed via cortical bone fenestration. After establishment of a healthy granulation bed was achieved, free skin graft was harvested and transposed over the bed. To monitor viability and incorporation of the graft, serial thermographic imaging was performed. Thermography noninvasively detects radiant heat patterns and can be used to assess vascularization of tissue, potentially allowing early detection of graft failure. In this case, thermography documented successful graft attachment.
Comparison of patients’ age receiving therapeutic services in a cleft care team in Isfahan
Soheilipour, Saeed; Soheilipour, Fatemeh; Derakhshandeh, Fatemeh; Hashemi, Hedieh; Memarzadeh, Mehrdad; Salehiniya, Hamid; Soheilipour, Fahimeh
2016-01-01
Background: Due to numerous difficulties in patients suffering from varieties of cleft lip and palate, their therapeutic management involves interdisciplinary teamwork. This study was conducted to compare the age of commencing treatments such as speech therapy, secondary palate and alveolar bone grafting and orthodontics between those who sought treatment early and late. Materials and Methods: In this retrospective study, 260 files of patients with cleft lip and palate were divided into two groups based on their age at the time of admission to a cleft care team: the early admission group and the late admission group. Both groups were compared based on four variables including the mean age of beginning speech therapy, secondary palatal surgery, alveolar bone grafting, and receiving orthodontics, using the t-test. Results: Based on the results, among 134 patients admitted for speech therapy, the mean age of initiating speech therapy in early clients was 3.3 years, and in the late ones was 9 years. Among 47 patients with secondary surgery, the mean age in early clients was 3.88 years, and in the late clients was 15.7 years. Among 17 patients with alveolar bone grafting, the mean age in the first group was 9 years, and in the other was 16.69 years. Among 24 patients receiving orthodontic services, the mean age in early clients was 7.66 years, and in the second group was 17.05 years. Conclusion: There was a significant difference in the age of performing secondary surgery and alveolar bone grafting, and in the age of beginning speech therapy and receiving orthodontic services, between early and late referrals to the team. PMID:27274350
Abdominal Wall Transplantation: Skin as a Sentinel Marker for Rejection.
Gerlach, U A; Vrakas, G; Sawitzki, B; Macedo, R; Reddy, S; Friend, P J; Giele, H; Vaidya, A
2016-06-01
Abdominal wall transplantation (AWTX) has revolutionized difficult abdominal closure after intestinal transplantation (ITX). More important, the skin of the transplanted abdominal wall (AW) may serve as an immunological tool for differential diagnosis of bowel dysfunction after transplant. Between August 2008 and October 2014, 29 small bowel transplantations were performed in 28 patients (16 male, 12 female; aged 41 ± 13 years). Two groups were identified: the solid organ transplant (SOT) group (n = 15; 12 ITX and 3 modified multivisceral transplantation [MMVTX]) and the SOT-AWTX group (n = 14; 12 ITX and 2 MMVTX), with the latter including one ITX-AWTX retransplantation. Two doses of alemtuzumab were used for induction (30 mg, 6 and 24 h after reperfusion), and tacrolimus (trough levels 8-12 ng/mL) was used for maintenance immunosuppression. Patient survival was similar in both groups (67% vs. 61%); however, the SOT-AWTX group showed faster posttransplant recovery, better intestinal graft survival (79% vs. 60%), a lower intestinal rejection rate (7% vs. 27%) and a lower rate of misdiagnoses in which viral infection was mistaken and treated as rejection (14% vs. 33%). The skin component of the AW may serve as an immune modulator and sentinel marker for immunological activity in the host. This can be a vital tool for timely prevention of intestinal graft rejection and, more important, avoidance of overimmunosuppression in cases of bowel dysfunction not related to graft rejection. © Copyright 2015 The American Society of Transplantation and the American Society of Transplant Surgeons.
Connective tissue changes in a mouse model of vein graft disease.
Schachner, T; Heiss, S; Mayr, T; Steger, C; Zipponi, D; Reisinger, P; Bonaros, N; Laufer, G; Bonatti, J
2008-04-01
The extracellular matrix plays an important physiological role in the architecture of the vascular wall. In arterialized vein grafts severe early changes, such as thrombosis and neointimal hyperplasia, occur. Paclitaxel is in clinical use as an antiproliferative coating of coronary stents. We aimed to investigate the early connective tissue changes in arterialized vein grafts and the influence of perivascular paclitaxel treatment in an in vivo model. C57 black mice underwent interposition of the vena cava into the carotid artery. Neointimal hyperplasia, thrombosis, acid mucopolysaccharides (Alcian), collagen fibers (trichrome Masson), elastic fibers, and apoptosis rate (TUNEL) were quantified in paclitaxel-treated veins and controls. In both controls and paclitaxel-treated vein grafts, acid mucopolysaccharides and elastic fibers were found predominantly in the neointima, whereas collagen fibers were found mainly in the media and adventitia. At 4 weeks postoperatively the neointimal thickness in controls was 52 (13-130) microm, whereas in 0.6 mg/mL paclitaxel-treated veins it was 103 (43-318) microm (P=0.094). At 8 weeks postoperatively paclitaxel-treated veins showed a significantly increased neointimal thickness of 136 (87-199) microm compared with 79 (62-146) microm in controls (P=0.032). There was no difference in apoptosis rate between the two groups (P=NS). Even with the lowest concentration of 0.008 mg/mL paclitaxel, veins showed a neointimal thickness of 67 (46-205) microm at 4 weeks postoperatively (P=NS vs controls). Early vein graft disease is characterised by an accumulation of acid mucopolysaccharides and elastic fibers in the thickened neointima. Paclitaxel treatment increases neointimal hyperplasia in mouse vein grafts in vivo.
Nature or Artifice? Grafting in Early Modern Surgery and Agronomy.
Savoia, Paolo
2017-01-01
In 1597, Gaspare Tagliacozzi published a famous two-volume book on “plastic surgery.” The reconstructive technique he described was based on grafting skin taken from the arm onto the mutilated parts of the patient's damaged face – especially noses. This paper focuses on techniques of grafting, the “culture of grafting,” and the relationships between surgery and the plant sciences in the sixteenth century. By describing the fascination with grafting in surgery, natural history, gardening, and agronomy, the paper argues that grafting techniques raised delicate questions: to what extent was it morally acceptable to deceive the eye with artificial entities, and what was the status of the product of a surgical procedure that challenged the traditional natural/artificial distinction? Finally, this paper shows how in the seventeenth century grafting survived the crisis of Galenism, by discussing the role it played in teratology and in controversies on the uses of the new mechanistic anatomy.
Ultrasonography parameters and histopathology findings in transplanted kidney.
Rigler, A A; Vizjak, A; Ferluga, D; Kandus, A; Buturović-Ponikvar, J
2013-05-01
Until now studies have shown conflicting results about morphologic and hemodynamic parameters in predicting histopathology results in renal graft malfunction. We sought to analyze whether parenchymal thickness relative to graft length and resistive index (RI) measured by ultrasonography can predict histopathology findings on renal biopsy. We retrospectively analyzed 72 deceased donor renal allograft biopsies and respective allograft ultrasounds, performed on 68 patients (57% men) with mean age of 50 years (range, 21-73), with kidney graft dysfunction in 2010 and 2011. Parenchymal thickness relative to graft length and RI were compared with different histopathology diagnoses: Acute rejection, chronic rejection, chronic kidney changes, acute tubular necrosis (ATN), and other diagnoses. The mean value of the RI and of the parenchymal thickness/graft length ratio (parenchyma size index [PSI]) was 0.81 ± 0.10 (SD) and 1.48 ± 0.27 (SD), respectively. Enlarged PSI was significantly higher in ATN (mean 1.72 ± 0.26) compared with no ATN (mean 1.39 ± 0.23; P < .001), and lower when chronic changes were present (mean 1.40 ± 0.25 for chronic changes vs mean 1.62 ± 0.28 for no chronic changes; P = .004). In the group without ATN, PSI was enlarged in acute graft rejection compared with no graft rejection (mean 1.50 ± 0.24 vs 1.24 ± 0.13, respectively; P < .001), whereas in the whole group, including ATN, PSI showed no differentiating power for acute rejection (P = .526). RI was significantly higher in ATN than without it (mean 0.91 ± 0.10 vs 0.79 ± 0.08, respectively; P < .001), whereas the RI was not increased (but was actually lower) in acute graft rejection compared with no graft rejection, neither in the whole group (mean 0.81 ± 0.09 vs 0.82 ± 0.12, respectively; P = .611). Enlarged parenchymal thickness/graft length ratio on ultrasonography was observed in ATN and acute allograft rejection. The RI was increased in ATN, but not in acute allograft rejection. Decreased parenchymal thickness/graft length ratio was observed in chronic kidney changes. Copyright © 2013 Elsevier Inc. All rights reserved.
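The parenchyma size index used above is simply parenchymal thickness divided by graft length, after which group means are compared. A minimal sketch of that calculation; the Welch t-test is an assumption, since the abstract does not name the test used, and the inputs are illustrative arrays rather than study data.

```python
# Hedged sketch: computing the parenchyma size index (PSI) and comparing
# ATN vs non-ATN groups with Welch's t-test (an assumed choice of test).
import numpy as np
from scipy.stats import ttest_ind

def psi(parenchymal_thickness, graft_length):
    return np.asarray(parenchymal_thickness) / np.asarray(graft_length)

def compare_groups(psi_atn, psi_no_atn):
    stat, p = ttest_ind(psi_atn, psi_no_atn, equal_var=False)  # Welch's t-test
    return stat, p  # the abstract reports higher PSI with ATN (1.72 vs 1.39, P < .001)
```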
Retransplantation in Late Hepatic Artery Thrombosis: Graft Access and Transplant Outcome.
Buchholz, Bettina M; Khan, Shakeeb; David, Miruna D; Gunson, Bridget K; Isaac, John R; Roberts, Keith J; Muiesan, Paolo; Mirza, Darius F; Tripathi, Dhiraj; Perera, M Thamara P R
2017-08-01
Definitive treatment for late hepatic artery thrombosis (L-HAT) is retransplantation (re-LT); however, the L-HAT-associated disease burden is poorly represented in allocation models. Graft access and transplant outcome of the re-LT experience between 2005 and 2016 was reviewed with specific focus on the L-HAT cohort in this single-center retrospective study. Ninety-nine (5.7%) of 1725 liver transplantations were re-LT with HAT as the main indication (n = 43; 43%) distributed into early (n = 25) and late (n = 18) episodes. Model for end-stage liver disease as well as United Kingdom model for end-stage liver disease did not accurately reflect high disease burden of graft failure associated infections such as hepatic abscesses and biliary sepsis in L-HAT. Hence, re-LT candidates with L-HAT received low prioritization and waited longest until the allocation of an acceptable graft (median, 103 days; interquartile range, 28-291 days), allowing for progression of biliary sepsis. Balance of risk score and 3-month mortality score prognosticated good transplant outcome in L-HAT but, contrary to the prediction, the factual 1-year patient survival after re-LT was significantly inferior in L-HAT compared to early HAT, early non-HAT and late non-HAT (65% vs 82%, 92% and 95%) which was mainly caused by sepsis and multiorgan failure driving 3-month mortality (28% vs 11%, 16% and 0%). Access to a second graft after a median waitlist time of 6 weeks achieved the best short- and long-term outcome in re-LT for L-HAT (3-month mortality, 13%; 1-year survival, 77%). Inequity in graft access and peritransplant sepsis are fundamental obstacles for successful re-LT in L-HAT. Offering a graft for those in need at the best window of opportunity could facilitate earlier engrafting with improved outcomes.
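One common way to compare the post-retransplant survival figures reported above is Kaplan-Meier estimation with a log-rank test; the abstract does not state its exact method, so the sketch below is an assumption, and the column names (time_years, died, late_hat) are placeholders.

```python
# Hedged sketch: Kaplan-Meier curves and log-rank test, early vs late HAT.
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

def compare_survival(df):
    early, late = df[df["late_hat"] == 0], df[df["late_hat"] == 1]
    km_early = KaplanMeierFitter().fit(early["time_years"], early["died"], label="early HAT")
    km_late = KaplanMeierFitter().fit(late["time_years"], late["died"], label="late HAT")
    test = logrank_test(early["time_years"], late["time_years"],
                        event_observed_A=early["died"], event_observed_B=late["died"])
    return km_early, km_late, test.p_value
```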
Early Immune Function and Duration of Organ Dysfunction in Critically Ill Septic Children.
Muszynski, Jennifer A; Nofziger, Ryan; Moore-Clingenpeel, Melissa; Greathouse, Kristin; Anglim, Larissa; Steele, Lisa; Hensley, Josey; Hanson-Huber, Lisa; Nateri, Jyotsna; Ramilo, Octavio; Hall, Mark W
2018-02-22
Late immune suppression is associated with nosocomial infection and mortality in septic adults and children. Relationships between early immune suppression and outcomes in septic children remain unclear. Prospective observational study to test the hypothesis that early innate and adaptive immune suppression are associated with longer duration of organ dysfunction in children with severe sepsis/septic shock. Methods, Measurements and Main Results: Children aged < 18 years meeting consensus criteria for severe sepsis or septic shock were sampled within 48 hours of sepsis onset. Healthy controls were sampled once. Innate immune function was quantified by whole blood ex vivo lipopolysaccharide-induced TNFα production capacity. Adaptive immune function was quantified by ex vivo phytohemagglutinin-induced IFNγ production capacity. 102 septic children and 35 healthy children were enrolled. Compared to healthy children, septic children demonstrated lower LPS-induced TNFα production (p < 0.0001) and lower PHA-induced IFNγ production (p<0.0001). Among septic children, early innate and adaptive immune suppression were associated with greater number of days with multiple organ dysfunction (MODS) and greater number of days with any organ dysfunction. On multivariable analyses, early innate immune suppression remained independently associated with increased MODS days [aRR 1.2 (1.03, 1.5)] and organ dysfunction days [aRR 1.2 (1.1, 1.3)]. Critically ill children with severe sepsis or septic shock demonstrate early innate and adaptive immune suppression. Early suppression of both innate and adaptive immunity are associated with longer duration of organ dysfunction and may be useful markers to guide investigations of immunomodulatory therapies in septic children.
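The adjusted relative risks for days of organ dysfunction quoted above come from a multivariable count-outcome model; the abstract does not specify which, so the Poisson regression below is only one plausible choice, and every column name is a placeholder.

```python
# Hedged sketch: Poisson GLM with robust standard errors for
# days of multiple organ dysfunction. Column names are hypothetical.
import numpy as np
import statsmodels.api as sm
import statsmodels.formula.api as smf

def adjusted_rate_ratios(df):
    model = smf.glm(
        "mods_days ~ innate_suppression + age + illness_severity",
        data=df,
        family=sm.families.Poisson(),
    ).fit(cov_type="HC0")              # robust (sandwich) standard errors
    return np.exp(model.params), np.exp(model.conf_int())  # rate ratios, 95% CIs
```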
Design and development of multilayer vascular graft
NASA Astrophysics Data System (ADS)
Madhavan, Krishna
2011-07-01
Vascular graft is a widely-used medical device for the treatment of vascular diseases such as atherosclerosis and aneurysm as well as for the use of vascular access and pediatric shunt, which are major causes of mortality and morbidity in this world. Dysfunction of vascular grafts often occurs, particularly for grafts with diameter less than 6mm, and is associated with the design of graft materials. Mechanical strength, compliance, permeability, endothelialization and availability are issues of most concern for vascular graft materials. To address these issues, we have designed a biodegradable, compliant graft made of hybrid multilayer by combining an intimal equivalent, electrospun heparin-impregnated poly-epsilon-caprolactone nanofibers, with a medial equivalent, a crosslinked collagen-chitosan-based gel scaffold. The intimal equivalent is designed to build mechanical strength and stability suitable for in vivo grafting and to prevent thrombosis. The medial equivalent is designed to serve as a scaffold for the activity of the smooth muscle cells important for vascular healing and regeneration. Our results have shown that genipin is a biocompatible crosslinker to enhance the mechanical properties of collagen-chitosan based scaffolds, and the degradation time and the activity of smooth muscle cells in the scaffold can be modulated by the crosslinking degree. For vascular grafting and regeneration in vivo, an important design parameter of the hybrid multilayer is the interface adhesion between the intimal and medial equivalents. With diametrically opposite affinities to water, delamination of the two layers occurs. Physical or chemical modification techniques were thus used to enhance the adhesion. Microscopic examination and graft-relevant functional characterizations have been performed to evaluate these techniques. Results from characterization of microstructure and functional properties, including burst strength, compliance, water permeability and suture strength, showed that the multilayer graft possessed properties mimicking those of native vessels. Achieving these FDA-required functional properties is essential because they play critical roles in graft performances in vivo such as thrombus formation, occlusion, healing, and bleeding. In addition, cell studies and animal studies have been performed on the multilayer graft. Our results show that the multilayer graft support mimetic vascular culture of cells and the acellular graft serves as an artery equivalent in vivo to sustain the physiological conditions and promote appropriate cellular activity. In conclusion, the newly-developed hybrid multilayer graft provides a proper balance of biomechanical and biochemical properties and demonstrates the potential for the use of vascular tissue engineering and regeneration.
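Among the functional properties listed above, compliance has a simple closed form: the fractional diameter change between two test pressures, normalized per 100 mmHg. The thesis abstract does not give its exact formula, so the sketch below uses one common formulation and illustrative numbers.

```python
# Hedged sketch: percent radial compliance of a graft per 100 mmHg,
# from diameters measured at two test pressures (one common formulation).
def percent_compliance(d_low, d_high, p_low_mmhg, p_high_mmhg):
    return ((d_high - d_low) / d_low) / (p_high_mmhg - p_low_mmhg) * 1e4

# Example: 6.00 mm at 80 mmHg, 6.15 mm at 120 mmHg
# ((6.15 - 6.00) / 6.00) / (120 - 80) * 1e4 = 6.25 % per 100 mmHg
```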
Can zero-hour cortical biopsy predict early graft outcomes after living donor renal transplantation?
Rathore, Ranjeet Singh; Mehta, Nisarg; Mehta, Sony Bhaskar; Babu, Manas; Bansal, Devesh; Pillai, Biju S; Sam, Mohan P; Krishnamoorthy, Hariharan
2017-11-01
The aim of this study was to identify the relevance of subclinical pathological findings in the kidneys of living donors and correlate these with early graft renal function. This was a prospective study on 84 living donor kidney transplant recipients over a period of two years. In all the donors, a cortical wedge biopsy was taken and sent for assessment of glomerular, mesangial, and tubular status. The graft function of patients with normal histology was compared with that of patients with abnormal histological findings at one, three, and six months, and one year post-surgery. Most abnormal histological findings were of mild degree. Glomerulosclerosis (GS, 25%), interstitial fibrosis (IF, 13%), acute tubular necrosis (ATN, 5%), and focal tubular atrophy (FTA, 5%) were the commonly observed pathological findings in zero-hour biopsies. Only those donors who had histological changes of IF and ATN showed progressive deterioration of renal function at one month, three months, six months, and one year post-transplantation. In donors with other histological changes, no significant effect on graft function was observed. Zero-hour cortical biopsy gave us an idea of the general status of the donor kidney and the presence or absence of subclinical pathological lesions. A mild degree of subclinical and pathological findings on zero-hour biopsy did not affect early graft renal function in living donor kidney transplantation. Zero-hour cortical biopsy could also help in discriminating donor-derived lesions from de novo alterations in the kidney that could happen subsequently.
Is there any place for spontaneous healing in deep palmar burn of the child?
Chateau, J; Guillot, M; Zevounou, L; Braye, F; Foyatier, J-L; Comparin, J-P; Voulliaume, D
2017-06-01
Child palm burns arise by contact and are often deep. The particular difficulty of such injuries comes from the child's ongoing growth and from the potential occurrence of retractile scars. In order to avoid sequelae, the current gold standard is early excision of the burn, followed by a skin graft. The aim of this study was to evaluate the results of spontaneous healing combined with rehabilitation versus early skin grafting and rehabilitation with respect to the occurrence of sequelae. We performed a retrospective study in two burn centers and one rehabilitation hospital between 1995 and 2010. Eighty-seven hands were included in two groups: one group for spontaneous healing and the other group for excision and skin grafting. Every child benefited from a specific rehabilitation protocol. The two main evaluation criteria were the duration of permanent splint wearing and the number of reconstructive procedures for each child. The median follow-up duration was about four years. The two groups were comparable. In the early skin grafting group, the splint wearing duration was one third longer than in the spontaneous healing group. Concerning reconstructive surgery, half of the grafted hands needed at least one procedure versus one fifth of the spontaneously healed hands. Our results show the value of spontaneous healing for palmar burns in children, provided that a specific and intensive rehabilitation protocol is followed. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
Pseudoaneurysm of the saphenous vein graft related to artificial rubber pericardium--a case report.
Harada, Tomohiro; Ida, Takao; Koyanagi, Toshiya; Kasahara, Katsuhiko; Naganuma, Fumio; Hosoda, Saichi
2002-01-01
Pseudoaneurysm is an unusual complication of coronary artery bypass grafting. Such aneurysms are caused by technical surgical failures or by inflammation of the sternum and mediastinum following sternotomy, and are observed as early or mid-term complications of cardiac surgery. The present case was an 80-year-old man in whom a piece of artificial rubber pericardium had been used for complete closure of the pericardium. A large pseudoaneurysm developed in the body of the saphenous vein graft 15 years after surgery. The old synthetic rubber pericardium was severely degenerated, which induced inflammation and disrupted the saphenous vein graft.
[Features of skin graft in pediatric plastic surgery].
Depoortère, C; François, C; Belkhou, A; Duquennoy-Martinot, V; Guerreschi, P
2016-10-01
Skin graft is a skin tissue fragment transferred from a donor site to a recipient site with spontaneous revascularization. A basic procedure of plastic surgery, skin grafting in children has its own specific indications, caveats and refinements. It finds its indication in many pediatric situations: integumental diseases (naevus, hamartoma), acute burns and burn scars, traumatic or surgically induced loss of substance, congenital malformations of the hands and feet, etc. Specific aspects of skin graft techniques in children are described: donor sites, harvesting technique and procedure, early postoperative care. Especially in children, the scalp is an excellent donor site for split-thickness skin grafts, and the technique is described in detail. Refinements and special cases are discussed: use of dermal matrices, allografts, xenografts, negative pressure therapy, and prior skin expansion of the donor site. The results of skin grafting in children are presented: graft take, growth and shrinkage, and pigmentation. Skin grafting sometimes makes it possible to defer a more complex procedure and to obtain the best final benefit, permanent or at least temporary, in a growing child. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
Periorbital varicella gangrenosa: A rare complication of chicken pox.
Jain, Jagriti; Thatte, Shreya; Singhai, Prakhar
2015-01-01
A previously healthy six-year-old boy presented to the pediatric ICU in a state of shock with a history of fever and rash, and was later diagnosed with chickenpox. He developed right-sided periorbital varicella gangrenosa, a form of necrotizing fasciitis secondary to skin infection. The patient was treated with intravenous acyclovir, antibiotics, amphotericin B, extensive debridement and later reconstruction of the upper eyelid with skin grafting. Aggressive treatment helped prevent eyeball and orbital involvement, which would have necessitated orbital exenteration. However, the delayed presentation resulted in necrosis of the orbicularis oculi and underlying tissue, leading to graft retraction and lid dysfunction. Clinicians should be aware of this rare but fulminating condition in order to minimise the sight- and life-threatening complications associated with it.
Neurologic Complications of Transplantation.
Dhar, Rajat
2018-02-01
Neurologic disturbances including encephalopathy, seizures, and focal deficits complicate the course of 10-30% of patients undergoing organ or stem cell transplantation. While much of this morbidity is multifactorial and often associated with extra-cerebral dysfunction (e.g., graft dysfunction, metabolic derangements), immunosuppressive drugs also contribute significantly. This can occur either through direct toxicity (e.g., posterior reversible encephalopathy syndrome from calcineurin inhibitors such as tacrolimus in the acute postoperative period) or by facilitating opportunistic infections in the months after transplantation. Other neurologic syndromes such as akinetic mutism and osmotic demyelination may also occur. While much of this neurologic dysfunction may be reversible if related to metabolic factors or drug toxicity (and the etiology is recognized and reversed), cases of multifocal cerebral infarction, hemorrhage, or infection may have poor outcomes. As transplant patients survive longer, delayed infections (such as progressive multifocal leukoencephalopathy) and post-transplant malignancies are increasingly reported.
Walczak, Andrzej; Ostrowski, Stanisław; Wrona, Ewa; Bartczak, Karol; Jaszewski, Ryszard
2014-01-01
Introduction Coronary artery bypass grafting (CABG) is increasingly performed in patients of advanced age. Aim of the study To analyze the influence of age and concurrent risk factors on the complications and early mortality after CABG. Material and methods Medical records of 2194 patients were analyzed retrospectively. A group of 1303 patients who had undergone isolated CABG was selected; 106 (4.8%) patients were excluded due to missing data in their medical records. The remaining 1197 patients were divided into two subgroups by age: 1st group < 65 years (n = 662; 55.3%); 2nd group ≥ 65 years (n = 535; 44.7%). Results The total 30-day mortality was 3.93% and was six times higher in the older group (1.21% vs. 7.29%; p < 0.001). Complications were observed in 176 (14.70%) patients, more often in the older group (10.42% vs. 20.0%; p < 0.001). In this group all kinds of complications were noted more often, in particular: postoperative myocardial infarction (1.96% vs. 5.42%; p = 0.001), respiratory dysfunction (1.36% vs. 4.11%; p = 0.005), neurological complications (1.81% vs. 3.74%; p = 0.04) and multi-organ dysfunction syndrome (0.30% vs. 1.68%; p = 0.03). The older patients required longer mechanical ventilation (24.0 ± 27.9 vs. 37.0 ± 74.1 hours; p = 0.004) and stayed longer in the intensive care unit (2.5 ± 3.0 vs. 4.1 ± 7.84 days; p < 0.001). Independent predictors of death were: female sex [OR (95% CI) = 2.4 (1.2-4.5)], age ≥ 65 years [OR = 4.9 (2.1-11.1)], eGFR < 60 mL/min/1.73 m2 [OR = 2.2 (1.0-4.7)], time on extracorporeal circulation > 72 minutes [OR = 5.5 (2.7-10.9)] and left main stem stenosis (> 50%) [OR = 2.4 (1.3-4.6)]. Conclusions Age still significantly influences postoperative complications and mortality after isolated CABG. PMID:26336419
Miśkowiec, Dawid; Walczak, Andrzej; Ostrowski, Stanisław; Wrona, Ewa; Bartczak, Karol; Jaszewski, Ryszard
2014-06-01
Coronary artery bypass grafting (CABG) is increasingly performed in patients of advanced age. To analyze the influence of age and concurrent risk factors on the complications and early mortality after CABG. Medical records of 2194 patients were analyzed retrospectively. A group of 1303 patients who had undergone isolated CABG was selected; 106 (4.8%) patients were excluded due to missing data in their medical records. The remaining 1197 patients were divided into two subgroups by age: 1st group < 65 years (n = 662; 55.3%); 2nd group ≥ 65 years (n = 535; 44.7%). The total 30-day mortality was 3.93% and was six times higher in the older group (1.21% vs. 7.29%; p < 0.001). Complications were observed in 176 (14.70%) patients, more often in the older group (10.42% vs. 20.0%; p < 0.001). In this group all kinds of complications were noted more often, in particular: postoperative myocardial infarction (1.96% vs. 5.42%; p = 0.001), respiratory dysfunction (1.36% vs. 4.11%; p = 0.005), neurological complications (1.81% vs. 3.74%; p = 0.04) and multi-organ dysfunction syndrome (0.30% vs. 1.68%; p = 0.03). The older patients required longer mechanical ventilation (24.0 ± 27.9 vs. 37.0 ± 74.1 hours; p = 0.004) and stayed longer in the intensive care unit (2.5 ± 3.0 vs. 4.1 ± 7.84 days; p < 0.001). Independent predictors of death were: female sex [OR (95% CI) = 2.4 (1.2-4.5)], age ≥ 65 years [OR = 4.9 (2.1-11.1)], eGFR < 60 mL/min/1.73 m2 [OR = 2.2 (1.0-4.7)], time on extracorporeal circulation > 72 minutes [OR = 5.5 (2.7-10.9)] and left main stem stenosis (> 50%) [OR = 2.4 (1.3-4.6)]. Age still significantly influences postoperative complications and mortality after isolated CABG.
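The mortality predictors above are reported as odds ratios from multivariable logistic regression. As a minimal, illustrative sketch of that kind of analysis (not the authors' code; the DataFrame and column names are hypothetical), odds ratios with 95% confidence intervals can be derived from the fitted coefficients like this:

```python
# Hedged sketch: multivariable logistic regression with odds ratios (ORs) and
# 95% confidence intervals, mirroring the kind of model reported above.
# The data frame and its column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def odds_ratios(df: pd.DataFrame, outcome: str, predictors: list) -> pd.DataFrame:
    X = sm.add_constant(df[predictors])            # design matrix with intercept
    fit = sm.Logit(df[outcome], X).fit(disp=0)     # maximum-likelihood fit
    or_series = np.exp(fit.params)                 # exp(coefficient) -> odds ratio
    ci = np.exp(fit.conf_int())                    # 95% CI on the OR scale
    table = pd.DataFrame({"OR": or_series, "2.5%": ci[0], "97.5%": ci[1]})
    return table.drop(index="const")               # drop the intercept row

# Hypothetical usage:
# predictors = ["female_sex", "age_ge_65", "egfr_lt_60", "ecc_over_72min", "lms_stenosis"]
# print(odds_ratios(records, "death_30d", predictors))
```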
Mizota, Toshiyuki; Miyao, Mariko; Yamada, Tetsu; Sato, Masaaki; Aoyama, Akihiro; Chen, Fengshi; Date, Hiroshi; Fukuda, Kazuhiko
2016-01-01
OBJECTIVES Primary graft dysfunction (PGD) is a major cause of early morbidity and mortality after cadaveric lung transplantation (CLT). This study examined the incidence, time course and predictive value of PGD after living-donor lobar lung transplantation (LDLLT). METHODS We retrospectively investigated 75 patients (42 with LDLLT and 33 with CLT) who underwent lung transplantation from January 2008 to December 2013. Patients were assigned PGD grades at six time points, as defined by the International Society for Heart and Lung Transplantation: immediately after final reperfusion, upon arrival at the intensive care unit (ICU), and 12, 24, 48 and 72 h after ICU admission. RESULTS The incidence of severe (Grade 3) PGD at 48 or 72 h after ICU admission was similar for LDLLT and CLT patients (16.7 vs 12.1%; P = 0.581). The majority of the LDLLT patients having severe PGD first developed PGD immediately after reperfusion, whereas more than half of the CLT patients first developed severe PGD upon ICU arrival or later. In LDLLT patients, severe PGD immediately after reperfusion was significantly associated with fewer ventilator-free days during the first 28 postoperative days [median (interquartile range) of 0 (0–10) vs 21 (13–25) days, P = 0.001], prolonged postoperative ICU stay [median (interquartile range) of 20 (16–27) vs 12 (8–14) days, P = 0.005] and increased hospital mortality (27.3 vs 3.2%, P = 0.02). Severe PGD immediately after reperfusion was not associated with ventilator-free days during the first 28 postoperative days, time to discharge from ICU or hospital, or hospital mortality in CLT patients. CONCLUSIONS Postoperative incidence of severe PGD was not significantly different between LDLLT and CLT patients. In LDLLT patients, the onset of severe PGD tended to be earlier than that in CLT patients. Severe PGD immediately after reperfusion was a significant predictor of postoperative morbidity and mortality in LDLLT patients but not in CLT patients. PMID:26705301
Mizota, Toshiyuki; Miyao, Mariko; Yamada, Tetsu; Sato, Masaaki; Aoyama, Akihiro; Chen, Fengshi; Date, Hiroshi; Fukuda, Kazuhiko
2016-03-01
Primary graft dysfunction (PGD) is a major cause of early morbidity and mortality after cadaveric lung transplantation (CLT). This study examined the incidence, time course and predictive value of PGD after living-donor lobar lung transplantation (LDLLT). We retrospectively investigated 75 patients (42 with LDLLT and 33 with CLT) who underwent lung transplantation from January 2008 to December 2013. Patients were assigned PGD grades at six time points, as defined by the International Society for Heart and Lung Transplantation: immediately after final reperfusion, upon arrival at the intensive care unit (ICU), and 12, 24, 48 and 72 h after ICU admission. The incidence of severe (Grade 3) PGD at 48 or 72 h after ICU admission was similar for LDLLT and CLT patients (16.7 vs 12.1%; P = 0.581). The majority of the LDLLT patients having severe PGD first developed PGD immediately after reperfusion, whereas more than half of the CLT patients first developed severe PGD upon ICU arrival or later. In LDLLT patients, severe PGD immediately after reperfusion was significantly associated with fewer ventilator-free days during the first 28 postoperative days [median (interquartile range) of 0 (0-10) vs 21 (13-25) days, P = 0.001], prolonged postoperative ICU stay [median (interquartile range) of 20 (16-27) vs 12 (8-14) days, P = 0.005] and increased hospital mortality (27.3 vs 3.2%, P = 0.02). Severe PGD immediately after reperfusion was not associated with ventilator-free days during the first 28 postoperative days, time to discharge from ICU or hospital, or hospital mortality in CLT patients. Postoperative incidence of severe PGD was not significantly different between LDLLT and CLT patients. In LDLLT patients, the onset of severe PGD tended to be earlier than that in CLT patients. Severe PGD immediately after reperfusion was a significant predictor of postoperative morbidity and mortality in LDLLT patients but not in CLT patients. © The Author 2015. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
First experience of liver transplantation with type 2 donation after cardiac death in France.
Savier, Eric; Dondero, Federica; Vibert, Eric; Eyraud, Daniel; Brisson, Hélène; Riou, Bruno; Fieux, Fabienne; Naili-Kortaia, Salima; Castaing, Denis; Rouby, Jean-Jacques; Langeron, Olivier; Dokmak, Safi; Hannoun, Laurent; Vaillant, Jean-Christophe
2015-05-01
Organ donation after unexpected cardiac death [type 2 donation after cardiac death (DCD)] is currently authorized in France and has been since 2006. Following the Spanish experience, a national protocol was established to perform liver transplantation (LT) with type 2 DCD donors. After the declaration of death, abdominal normothermic oxygenated recirculation was used to perfuse and oxygenate the abdominal organs until harvesting and cold storage. Such grafts were proposed to consenting patients < 65 years old with liver cancer and without any hepatic insufficiency. Between 2010 and 2013, 13 LTs were performed in 3 French centers. Six patients had a rapid and uneventful postoperative recovery. However, primary nonfunction occurred in 3 patients, with each requiring urgent retransplantation, and 4 early allograft dysfunctions were observed. One patient developed a nonanastomotic biliary stricture after 3 months, whereas 8 patients showed no sign of ischemic cholangiopathy at their 1-year follow-up. In comparison with a control group of patients receiving grafts from brain-dead donors (n = 41), donor age and cold ischemia time were significantly lower in the type 2 DCD group. Time spent on the national organ wait list tended to be shorter in the type 2 DCD group: 7.5 months [interquartile range (IQR), 4.0-11.0 months] versus 12.0 months (IQR, 6.8-16.7 months; P = 0.08). The 1-year patient survival rates were similar (85% in the type 2 DCD group versus 93% in the control group), but the 1-year graft survival rate was significantly lower in the type 2 DCD group (69% versus 93%; P = 0.03). In conclusion, to treat borderline hepatocellular carcinoma, LT with type 2 DCD donors is possible as long as strict donor selection is observed. © 2015 American Association for the Study of Liver Diseases.
Angiotensin receptors as sensitive markers of acute bronchiole injury after lung transplantation.
Nataatmadja, Maria; Passmore, Margaret; Russell, Fraser D; Prabowo, Sulistiana; Corley, Amanda; Fraser, John F
2014-08-01
Although lung transplantation is the only means of survival for patients with end-stage pulmonary disease, outcomes from this intervention are inferior to other solid organ transplants. The reason for the poor outcomes may be linked to an early reaction, such as primary graft dysfunction, and associated with marked inflammatory response, bronchiole injury, and later fibrotic responses. Mediators regulating these effects include angiotensin II and matrix metalloproteinases (MMPs). We investigated changes to these mediators over the course of cardiopulmonary bypass (CPB) and up to 72 h after lung transplantation, using immunohistochemistry, Western blot, and ELISA techniques. We found 4- and 16-fold increases in plasma angiotensin II and MMP-9, respectively, from pre-CPB to post-CPB. MMP-9 levels remained elevated 1 h after transplantation. MMP-2 levels were elevated 6-24 h after lung transplantation. Type 2 angiotensin II receptor (ATR2) expression was 3.5-fold higher in bronchoalveolar cells 1-6 h after transplantation than in controls. The study suggests that the combination of cardiopulmonary bypass and lung transplantation is associated with early changes in the angiotensin II receptor system and in MMPs, and that altered expression of these mediators may be a useful marker to examine pathological changes that occur in lungs during transplant surgery.
Montero, Rosa M.; Olsburgh, Jonathon
2015-01-01
Polyuria after kidney transplantation causes graft dysfunction and increased thrombotic risk. We present a case of a polyuric adult with Dent's disease who underwent staged bilateral native nephrectomies, the first operation before transplant and the second four months after transplant. This led to improved allograft function maintained during four years of follow-up. The retroperitoneal laparoscopic approach was well tolerated and allowed continuation of peritoneal dialysis before transplantation. A staged approach helps regulate fluid balance perioperatively and may be tailored to individual need according to posttransplant urine output. This novel approach should be considered for polyuric patients with tubular dysfunction including Dent's disease. PMID:25649339
Chorio-Allantoic Membrane Grafting of Chick Limb Buds as a Class Practical.
ERIC Educational Resources Information Center
McLachlan, John C.
1981-01-01
A new method of carrying out grafts of early embryonic chick limb buds to the chick chorio-allantoic membrane and a processing schedule which renders cartilage elements visible in whole mount are discussed, including implications for the procedures and their results. (Author/DC)
Hyler, Stefan; Pischke, Søren E; Halvorsen, Per Steinar; Espinoza, Andreas; Bergsland, Jacob; Tønnessen, Tor Inge; Fosse, Erik; Skulstad, Helge
2015-04-01
Sensitive methods for the early detection of myocardial dysfunction are still needed, as ischemia is a leading cause of decreased ventricular function during and after heart surgery. The aim of this study was to test the hypothesis that low-grade ischemia could be detected quantitatively by a miniaturized epicardial ultrasound transducer (Ø = 3 mm), allowing continuous monitoring. In 10 pigs, transducers were positioned in the left anterior descending and circumflex coronary artery areas. Left ventricular pressure was obtained by a micromanometer. The left internal mammary artery was grafted to the left anterior descending coronary artery, which was occluded proximal to the anastomosis. Left internal mammary artery flow was stepwise reduced by 25%, 50%, and 75% for 18 min each. From the transducers, M-mode traces were obtained, allowing continuous tissue velocity traces and displacement measurements. Regional work was assessed as left ventricular pressure-displacement loop area. Tissue lactate measured from intramyocardial microdialysis was used as the reference method to detect ischemia. All steps of coronary flow reduction demonstrated reduced peak systolic velocity (P < .05) and regional work (P < .01). The decreases in peak systolic velocity and regional work were closely related to the degree of ischemia, demonstrated by their correlations with lactate (R = -0.74, P < .01, and R = -0.64, P < .01, respectively). The circumflex coronary artery area was not affected by any of the interventions. The epicardially attached miniaturized ultrasound transducer allowed the precise detection of different levels of coronary flow reduction. The results also showed a quantitative and linear relationship among coronary flow, ischemia, and myocardial function. Thus, the ultrasound transducer has the potential to improve the monitoring of myocardial ischemia and to detect graft failure during and after heart surgery. Copyright © 2015 American Society of Echocardiography. Published by Elsevier Inc. All rights reserved.
Chemokines, their receptors, and transplant outcome.
Colvin, Bridget L; Thomson, Angus W
2002-07-27
Organ transplant rejection is mediated largely by circulating peripheral leukocytes induced to infiltrate the graft by various inflammatory stimuli. Of these, chemotactic cytokines called chemokines, expressed by inflamed graft tissues, as well as by early innate-responding leukocytes that infiltrate the graft, are responsible for the recruitment of alloreactive leukocytes. This report discusses the impact of these leukocyte-directing proteins on transplant outcome and novel therapeutic approaches for antirejection therapy based on targeting of chemokines and/or their receptors.
Cole, Robert Townsend; Gandhi, Jonathan; Bray, Robert A; Gebel, Howard M; Yin, Michael; Shekiladze, Nikolaz; Young, An; Grant, Aubrey; Mahoney, Ian; Laskar, S Raja; Gupta, Divya; Bhatt, Kunal; Book, Wendy; Smith, Andrew; Nguyen, Duc; Vega, J David; Morris, Alanna A
2018-04-01
Despite improvements in outcomes after heart transplantation, black recipients have worse survival compared with non-black recipients. The source of such disparate outcomes remains largely unknown. We hypothesize that a propensity to generate de-novo donor-specific antibodies (dnDSA) and subsequent antibody-mediated rejection (AMR) may account for racial differences in sub-optimal outcomes after heart transplant. In this study we aimed to determine the role of dnDSA and AMR in racial disparities in post-transplant outcomes. This study was a single-center, retrospective analysis of 137 heart transplant recipients (81% male, 48% black) discharged from Emory University Hospital. Patients were classified as black vs non-black for the purpose of our analysis. Kaplan-Meier and Cox regression analyses were used to evaluate the association between race and selected outcomes. The primary outcome was the development of dnDSA. Secondary outcomes included treated AMR and a composite of all-cause graft dysfunction or death. After 3.7 years of follow-up, 39 (28.5%) patients developed dnDSA and 19 (13.8%) were treated for AMR. In multivariable models, black race was associated with a higher risk of developing dnDSA (hazard ratio [HR] 3.65, 95% confidence interval [CI] 1.54 to 8.65, p = 0.003) and a higher risk of treated AMR (HR 4.86, 95% CI 1.26 to 18.72, p = 0.021) compared with non-black race. Black race was also associated with a higher risk of all-cause graft dysfunction or death in univariate analyses (HR 2.10, 95% CI 1.02 to 4.30, p = 0.044). However, in a multivariable model incorporating dnDSA, black race was no longer a significant risk factor. Only dnDSA development was significantly associated with all-cause graft dysfunction or death (HR 4.85, 95% CI 1.89 to 12.44, p = 0.001). Black transplant recipients are at higher risk for the development of dnDSA and treated AMR, which may account for racial disparities in outcomes after heart transplantation. Copyright © 2018 International Society for the Heart and Lung Transplantation. Published by Elsevier Inc. All rights reserved.
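The study above pairs Kaplan-Meier curves with multivariable Cox proportional hazards models to compare black and non-black recipients. A minimal sketch of that general workflow, assuming the lifelines library and a hypothetical DataFrame with follow-up time, an event indicator, and covariates (none of which are the authors' actual variables), could look like this:

```python
# Hedged sketch of a Kaplan-Meier / Cox workflow of the kind described above,
# using the lifelines library; the DataFrame and column names are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

def km_by_race(df: pd.DataFrame) -> None:
    """Plot Kaplan-Meier curves for dnDSA-free follow-up, stratified by race."""
    kmf = KaplanMeierFitter()
    for label, grp in df.groupby("black_race"):
        kmf.fit(grp["years_followup"], event_observed=grp["dndsa"], label=str(label))
        kmf.plot_survival_function()

def cox_model(df: pd.DataFrame) -> CoxPHFitter:
    """Multivariable Cox model; exp(coef) in the summary gives the hazard ratios."""
    cph = CoxPHFitter()
    cph.fit(df[["years_followup", "dndsa", "black_race", "age", "lvad"]],
            duration_col="years_followup", event_col="dndsa")
    return cph

# Hypothetical usage:
# cox_model(cohort).print_summary()
```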
Angiographic flow grading and graft arrangement of arterial conduits.
Nakajima, Hiroyuki; Kobayashi, Junjiro; Tagusari, Osamu; Niwaya, Kazuo; Funatsu, Toshihiro; Kawamura, Atsushi; Yagihara, Toshikatsu; Kitamura, Soichiro
2006-11-01
We sought to delineate the effects of competitive and reverse flow on the intermediate-term patency of arterial conduits and examined graft arrangements for maximizing antegrade bypass flow. The angiograms of 2083 bypass grafts in 570 patients who underwent off-pump total arterial revascularization without aortic manipulation since December 2000 were reviewed. The blood flow in the bypass grafts was graded A (antegrade), B (competitive), C (reverse), or O (occlusion). The mean number of distal anastomoses was 3.65 ± 0.94 per patient. In the early angiography 91.3% (1901/2083) of the bypasses were grade A. Thirty (1.4%) bypasses were grade O, whereas 2.9% (61/2083) were grade B, and 4.4% (91/2083) were grade C. In the multivariate analysis the end-to-side anastomosis (P < .0001), 4 or more distal anastomoses of the conduit (P = .01), native coronary stenosis of less than 75% (P < .0001), and target branch location in the right coronary artery territory (P < .0001) and left circumflex artery territory (P = .02) significantly correlated with grade non-A. The patency rate in the late angiography of the bypasses graded B or C in the early angiography was 7 (28.0%) of 25, whereas that of the bypasses graded A was 164 (89.1%) of 184 (P < .0001). The actuarial graft patency rate of the bypasses graded A was 72.3% at 3 years and was significantly higher than that of the bypasses graded B or C (28.6% at 3 years after surgical intervention, P < .0001). Sufficient antegrade bypass flow had a favorable effect on the graft patency of arterial conduits. The graft arrangement should be adjusted for each patient so as to maximize the antegrade bypass flow and to confirm the advantage of arterial grafts.
Fukayama, Toshiharu; Ozai, Yusuke; Shimokawadoko, Haruka; Aytemiz, Derya; Tanaka, Ryou; Machida, Noboru; Asakura, Tetsuo
2015-01-01
Vascular grafts 5 mm or less in diameter have not been successfully developed because of problems such as early thrombus formation and neointimal hyperplasia. Bombyx mori silk fibroin (SF), which is biodegradable and permits tissue infiltration, has attracted attention as a tube and coating material for vascular grafts. Coating is an important factor in maintaining the strength of the anastomotic region of vascular grafts and in preventing blood leakage from the graft after implantation. In this study, we therefore focused on the SF concentration of the coating solution and compared tissue infiltration and remodeling across concentrations. Silk fibroin-poly(ethylene glycol) diglycidyl ether (PGDE) coatings with SF concentrations of 1.0%, 2.5%, 5.0%, and 7.5% were applied to double-raschel knitted small-caliber grafts 1.5 mm in diameter and 1 cm in length. The grafts were implanted in the rat abdominal aorta and removed after 3 weeks or 3 months. Graft patency was monitored by ultrasound, and morphological evaluation was performed by histopathological examination. SF concentration had no significant effect on the patency rate. However, tissue infiltration was significantly higher with 2.5% SF at 3 weeks and with 1.0% and 2.5% SF at 3 months. In addition, along the length of the graft lumen, stenosis was not found at 3 weeks but was found with 5.0% and 7.5% SF at 3 months. These results indicate that 2.5% SF is the most suitable coating concentration, on the basis of less stenosis, early tissue infiltration, and less neointimal hyperplasia. PMID:26496652
Genovese, Elizabeth A; Avgerinos, Efthymios D; Baril, Donald T; Makaroun, Michel S; Chaer, Rabih A
2016-12-01
There is limited investigation into the use of bio-absorbable antibiotic beads for the treatment of prosthetic vascular graft infections. Our goal was to investigate the rates of infection eradication, graft preservation, and limb salvage in patients who are not candidates for graft explant or extensive reconstruction. A retrospective review of patients implanted with antibiotic-impregnated bio-absorbable calcium sulfate beads at a major university center was conducted. Six patients with prosthetic graft infections were treated with bio-absorbable antibiotic beads from 2012-2014. Grafts included an aortobifemoral, an aorto-hepatic/superior mesenteric artery, and four extra-anatomic bypasses. Pathogens included Gram-positive and Gram-negative bacteria. Half of the patients underwent graft explant with reconstruction and half underwent debridement of the original graft, all with antibiotic bead placement around the graft. Mean follow-up was 7.3 ± 8.3 months; all patients had infection resolution, healed wounds, and 100% graft patency, limb salvage, and survival. This report details the successful use of bio-absorbable antibiotic beads for the treatment of prosthetic vascular graft infections in patients at high risk for graft explant or major vascular reconstruction. At early follow-up, we demonstrate successful infection suppression, graft preservation, and limb salvage with the use of these beads in a subset of vascular patients. © The Author(s) 2016.
Genovese, Elizabeth A; Avgerinos, Efthymios D; Baril, Donald T; Makaroun, Michel S; Chaer, Rabih A
2017-01-01
Objective There is limited investigation into the use of bio-absorbable antibiotic beads for the treatment of prosthetic vascular graft infections. Our goal was to investigate the rates of infection eradication, graft preservation, and limb salvage in patients who are not candidates for graft explant or extensive reconstruction. Methods A retrospective review of patients implanted with antibiotic-impregnated bio-absorbable calcium sulfate beads at a major university center was conducted. Results Six patients with prosthetic graft infections were treated with bio-absorbable antibiotic beads from 2012–2014. Grafts included an aortobifemoral, an aorto-hepatic/superior mesenteric artery, and four extra-anatomic bypasses. Pathogens included Gram-positive and Gram-negative bacteria. Half of the patients underwent graft explant with reconstruction and half underwent debridement of the original graft, all with antibiotic bead placement around the graft. Mean follow-up was 7.3 ± 8.3 months; all patients had infection resolution, healed wounds, and 100% graft patency, limb salvage, and survival. Conclusion This report details the successful use of bio-absorbable antibiotic beads for the treatment of prosthetic vascular graft infections in patients at high risk for graft explant or major vascular reconstruction. At early follow-up, we demonstrate successful infection suppression, graft preservation, and limb salvage with the use of these beads in a subset of vascular patients. PMID:26896286
Hu, Yun-Tao; Pan, Xu-Dong; Zheng, Jun; Ma, Wei-Guo; Sun, Li-Zhong
2017-08-01
To date, clinically available expanded polytetrafluoroethylene (ePTFE) vascular grafts are suboptimal for reconstructing small-caliber (D < 6 mm) arteries, owing to thrombosis in the early stage and restenosis in the late stage. Our aim in this preliminary study was to fabricate a nanofibrous vascular graft biofunctionalized with VEGF165 and heparin. Short-term performance was evaluated both in vitro and in vivo. Grafts of 4 mm caliber were prepared by coaxial electrospinning and consisted of poly(l-lactide-co-caprolactone) [P(LLA-CL)], collagen, and elastin. Heparin and vascular endothelial growth factor 165 (VEGF165) were encapsulated in the core of the fibers. Controlled release of heparin and VEGF165 was evaluated for 28 days. Endothelial cells were cultured on the electrospun grafts or on ePTFE grafts as controls, and cellular adhesion, proliferation, and morphology were examined. Electrospun or ePTFE grafts were randomly implanted into a rabbit infrarenal aortic replacement model (n = 30) for 28 days without any antiplatelet therapy. At termination, all grafts were examined by Doppler ultrasound and then evaluated with histology and scanning electron microscopy. The cumulative release of heparin (6.93 ± 1.03 mg) and VEGF165 (22.17 ± 5.56 μg) over 28 days was measured. Endothelial cells cultured on electrospun grafts showed significantly higher attachment efficiency and proliferation compared with the ePTFE grafts (P < 0.001). At 2 h, more ECs had attached to the P(LLA-CL)/collagen/elastin grafts (83.26 ± 8.02%) than to P(LLA-CL) (67.07 ± 4.16%) or ePTFE (46.87 ± 8.85%). ECs proliferated faster on VEGF-loaded grafts (OD = 2.9 ± 1.2, n = 12) than on ePTFE (OD = 1.7 ± 1.0, n = 12). Patency was significantly higher for electrospun grafts (86.6%) than for ePTFE grafts (40.0%) (P = 0.021). Correspondingly, microscopy of the electrospun implants showed little thrombus compared with the ePTFE implants. The biofunctionalized electrospun grafts showed satisfactory surgical properties, hemocompatibility, and higher short-term patency compared with the ePTFE grafts. Despite good early performance, further studies are needed for long-term evaluation. Copyright © 2017 IJS Publishing Group Ltd. Published by Elsevier Ltd. All rights reserved.
Tashiro, Yasutaka; Gale, Tom; Sundaram, Vani; Nagai, Kanto; Irrgang, James J; Anderst, William; Nakashima, Yasuharu; Tashman, Scott; Fu, Freddie H
2017-07-01
A high graft bending angle (GBA) after anterior cruciate ligament (ACL) reconstruction has been suggested to cause stress on the graft. Nevertheless, evidence about its effect on graft healing in vivo is limited. We hypothesized that the signal intensity on magnetic resonance imaging (MRI) would be higher in the proximal region of the ACL graft and that higher signals would correlate with a higher GBA. Descriptive laboratory study. Anatomic single-bundle ACL reconstruction was performed on 24 patients (mean age, 20 ± 4 years) using the transportal technique. A quadriceps tendon autograft with a bone plug was harvested. To evaluate graft healing, the signal/noise quotient (SNQ) was measured in 3 regions of interest (ROIs) of the proximal, midsubstance, and distal ACL graft using high-resolution MRI (0.45 × 0.45 × 0.70 mm), with decreased signals suggesting improved healing. Dynamic knee motion was examined during treadmill walking and running to assess the in vivo GBA. The GBA was calculated from the 3-dimensional angle between the graft and femoral tunnel vectors at each motion frame, based on tibiofemoral kinematics determined from dynamic stereo X-ray analysis. Graft healing and GBAs were assessed at 6 and 24 months postoperatively. Repeated-measures analysis of variance was used to compare the SNQ in the 3 ROIs at the 2 time points. Pearson correlations were used to analyze the relationship between the SNQ and mean GBA during 0% to 15% of the gait cycle. The SNQ of the ACL graft in the proximal region was significantly higher than in the midsubstance (P = .022) and distal regions (P < .001) at 6 months. The SNQ in the proximal region was highly correlated with the GBA during standing (R = 0.64, P < .001), walking (R = 0.65, P = .002), and running (R = 0.54, P = .015) but not in the other regions. At 24 months, signals in the proximal and midsubstance regions decreased significantly compared with 6 months (P < .001 and P = .008, respectively), with no difference across the graft area. The signal intensity was highest in the proximal region and lowest in the distal region of the reconstructed graft at 6 months postoperatively. A steep GBA was significantly correlated with high signal intensities of the proximal graft in this early period. A steep GBA may negatively affect proximal graft healing after ACL reconstruction.
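The GBA above is defined as the 3-dimensional angle between the intra-articular graft vector and the femoral tunnel vector at each motion frame. As a minimal, illustrative sketch of that geometric step only (the vector coordinates are made up, not study data), the angle can be computed as follows:

```python
# Hedged sketch of the graft bending angle (GBA) computation described above:
# the 3-D angle between a graft direction vector and a femoral tunnel direction
# vector at one motion frame. The example coordinates are hypothetical.
import numpy as np

def bending_angle_deg(graft_vec: np.ndarray, tunnel_vec: np.ndarray) -> float:
    """Angle in degrees between two 3-D direction vectors."""
    cos_theta = np.dot(graft_vec, tunnel_vec) / (
        np.linalg.norm(graft_vec) * np.linalg.norm(tunnel_vec)
    )
    # Clip guards against rounding slightly outside [-1, 1].
    return float(np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0))))

# Hypothetical example: tibial insertion -> femoral aperture, and
# femoral aperture -> tunnel exit, in the same coordinate frame.
graft = np.array([0.0, 25.0, 30.0])
tunnel = np.array([10.0, 5.0, 28.0])
print(round(bending_angle_deg(graft, tunnel), 1))
```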
Nakamura, Tsukasa; Ushigome, Hidetaka; Watabe, Kiyoko; Imanishi, Yui; Masuda, Koji; Matsuyama, Takehisa; Harada, Shumpei; Koshino, Katsuhiro; Iida, Taku; Nobori, Shuji; Yoshimura, Norio
2017-04-01
Immunocomplex capture fluorescence analysis (ICFA) is an attractive method for detecting complexes of donor-specific anti-HLA antibodies (DSA) and HLA antigens. Currently, antibody-mediated rejection (AMR) due to DSA is usually diagnosed by C4d deposition and serological DSA detection; however, these findings are frequently discordant. Our graft ICFA technique may therefore help establish the diagnosis of AMR. Graft samples were obtained by percutaneous needle biopsy, and the specimens were dissolved in PBS with lysis buffer. HLA antigens were then captured by anti-HLA beads, and DSA-HLA complexes, in which the DSA had already reacted with the allograft in vivo, were detected by PE-conjugated anti-human IgG antibodies and analyzed on a Luminex system. A ratio (sample MFI/blank-bead MFI) was calculated, with ≥ 1.0 regarded as positive. Using graft ICFA, DSA-HLA complexes in the graft were successfully detected in a patient with chronic active AMR, with MFI ratios ranging from a borderline-positive 1.03 to 79.27. Positive graft ICFA also predicted the early phase of AMR (MFI ratio: 1.38) even in patients with no serum DSA. Finally, appropriate therapies for AMR cleared DSA deposition from allografts (MFI ratios of 0.3 to 0.7). This novel application could detect early-phase or pathologically incomplete cases of AMR, leading to a correct diagnosis and initiation of appropriate therapies. Moreover, graft ICFA might address a variety of long-standing questions regarding DSA. AMR: Antibody-mediated rejection; DSA: Donor-specific antibodies; ICFA: Immunocomplex capture fluorescence analysis.
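The graft ICFA read-out above reduces to a simple ratio: sample MFI divided by blank-bead MFI, with values of 1.0 or greater called positive. A minimal sketch of that calculation follows; the cutoff is taken from the abstract, while the function and example values are illustrative assumptions:

```python
# Hedged sketch of the MFI-ratio read-out described above:
# ratio = sample MFI / blank-bead MFI, with >= 1.0 called positive.
def icfa_ratio(sample_mfi: float, blank_mfi: float, cutoff: float = 1.0):
    """Return (MFI ratio, positive?) for one HLA bead."""
    ratio = sample_mfi / blank_mfi
    return ratio, ratio >= cutoff

# Hypothetical beads: one borderline-positive, one clearly negative.
print(icfa_ratio(515.0, 500.0))   # -> (1.03, True)
print(icfa_ratio(350.0, 500.0))   # -> (0.7, False)
```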
The diced cartilage glue graft for nasal augmentation. Morphometric evidence of longevity.
Tasman, Abel-Jan; Diener, Pierre-André; Litschel, Ralph
2013-03-01
A grafting technique that uses diced cartilage without fascia, which improves formability while maintaining long-term stability, would be a welcome addition to the rhinoplasty armamentarium. A diced cartilage glue graft was recently introduced as the Tasman technique. The technique has been used by one of us (A.-J.T.) in 28 patients who were monitored clinically for 4 to 26 months. Sonographic morphometry of the graft was used in 10 patients with a maximum follow-up of 15 months, and 2 biopsies were obtained for histologic examination. Fashioning the diced cartilage glue graft reduced operating time compared with the diced cartilage fascia graft and allowed for a wide variety of transplant shapes and sizes, depending on the mold used. All grafts were used for augmentation of the nasal dorsum or radix and healed uneventfully. Sonographic cross-section measures of the grafts changed between 6% and –29% (median, –5%) in the early postoperative phase and between 8% and –7% (median, –2%) between 3 and 15 months after insertion. Histologic examination of the graft biopsies revealed viable cartilage with signs of regeneration. The diced cartilage glue graft may become an attractive alternative to accepted methods for dorsal augmentation, the diced cartilage fascia graft in particular.
Nagao, Ryan J; Lundy, Scott; Khaing, Zin Z; Schmidt, Christine E
2011-07-01
Acellular grafts are a viable option for use in nerve reconstruction surgeries. Recently, our lab created a novel optimized decellularization procedure that removes immunological material while leaving the majority of the extracellular matrix structure intact. The optimized acellular (OA) graft has been shown to elicit an immune response equal to or less than that elicited by the isograft, the analog of the autograft in the rat model. We investigated the performance of the OA graft to provide functional recovery in a long-term study. We performed a long-term functional regeneration evaluation study using the sciatic functional index to quantify recovery of Lewis rats at regular time intervals for up to 52 weeks after graft implantation following 1 cm sciatic nerve resection. OA grafts were compared against other decellularized methods (Sondell treatment and thermal decellularization), as well as the isograft and primary neurorrhaphy. The OA graft supported comparable functional recovery to the isograft and superior regeneration to thermal and Sondell decellularization methods. Furthermore, the OA graft promoted early recovery to a greater degree compared to acellular grafts obtained using either the thermal or the Sondell methods. Equivalent functional recovery to the isograft suggests that the OA nerve graft may be a future clinical alternative to the current autologous tissue graft.
Early calcification of the aortic Mitroflow pericardial bioprosthesis in the elderly.
Alvarez, Jose Rubio; Sierra, Juan; Vega, Marino; Adrio, Belen; Martinez-Comendador, Jose; Gude, Francisco; Martinez-Cereijo, Jose; Garcia, Javier
2009-11-01
We report our experience in the elderly with aortic valve replacement using the Mitroflow A12 pericardial bioprosthesis. From January 1993 to January 2006, 491 patients over the age of 70 years received an aortic Mitroflow A12 bioprosthesis implantation. Concomitant procedures included coronary artery bypass grafting in 20% of patients. All patients had routine postoperative Echo-Doppler studies at discharge, one month and a mean of 11.1 months after surgery and annually thereafter. Twenty (4%) patients underwent a second aortic valve replacement due to bioprosthetic valve dysfunction (Group 2). Calcified stenosis was the most common finding at reoperation (98%). Median time to valve reoperation was 76 months. Of patients requiring reoperation, median age at first and second implantation was 73 (70-78) and 79 (76-83) years, respectively. For all patients, freedom from structural valve dysfunction (SVD) was 95 ± 3% at 5 years and 55.8 ± 2% at 10 years. Bioprosthetic valve deterioration was identified in 27 patients (Group 1). Median age of these patients at first operation and at diagnosis of deterioration by echo was 75 (70-84) and 77 (70-82) years, respectively. The median interval between operation and detection of bioprosthesis valve deterioration was 46 months. Among the total patient population, freedom from bioprosthetic deterioration was 85.7 ± 2% at 5 years and 33.5 ± 4% at 10 years. The Mitroflow A12 pericardial bioprosthesis provides less than optimal performance in elderly patients.
Bozbas, Huseyin; Pirat, Bahar; Demirtas, Saadet; Simşek, Vahide; Yildirir, Aylin; Sade, Elif; Sayin, Burak; Sezer, Siren; Karakayali, Hamdi; Muderrisoglu, Haldun
2009-02-01
Approximately half of all deaths in patients with end-stage renal disease (ESRD) are due to cardiovascular diseases. Although renal transplant improves survival and quality of life in these patients, cardiovascular events significantly affect survival. We sought to evaluate coronary flow reserve (CFR), an indicator of coronary microvascular function, in patients with ESRD and in patients with a functioning kidney graft. Eighty-six patients (30 with ESRD, 30 with a functioning renal allograft, and 26 controls) free of coronary artery disease or diabetes mellitus were included. Transthoracic Doppler echocardiography was used to measure coronary peak flow velocities at baseline and after dipyridamole infusion. CFR was calculated as the ratio of hyperemic to baseline diastolic peak flow velocities and was compared among the groups. The mean age of the study population was 36.1 ± 7.3 years. No between-group differences were found regarding age, sex, or prevalences of traditional coronary risk factors other than hypertension. Compared with the renal transplant and control groups, the ESRD group had significantly lower mean CFR values. On multivariate regression analysis, serum creatinine level, age, and diastolic dysfunction were independent predictors of CFR. CFR is impaired in patients with ESRD, suggesting that coronary microvascular dysfunction, an early finding of atherosclerosis, is evident in these patients. Although associated with a decreased CFR compared with controls, renal transplant, on the other hand, seems to have a favorable effect on coronary microvascular function.
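CFR in the study above is defined as the ratio of hyperemic (post-dipyridamole) to baseline diastolic peak flow velocity on transthoracic Doppler. A minimal worked sketch of that ratio is shown below; the velocity values are illustrative only, not study data:

```python
# Hedged sketch of the CFR calculation described above:
# CFR = hyperemic diastolic peak flow velocity / baseline diastolic peak flow velocity.
def coronary_flow_reserve(hyperemic_dpv_cm_s: float, baseline_dpv_cm_s: float) -> float:
    """Return the coronary flow reserve as a dimensionless ratio."""
    return hyperemic_dpv_cm_s / baseline_dpv_cm_s

# Hypothetical example: baseline 25 cm/s, post-dipyridamole 60 cm/s -> CFR = 2.4
print(coronary_flow_reserve(60.0, 25.0))
```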
Myers, S R; Grady, J; Soranzo, C; Sanders, R; Green, C; Leigh, I M; Navsaria, H A
1997-01-01
The clinical take rates of cultured keratinocyte autografts are poor on a full-thickness wound unless a dermal bed is provided. Even under these circumstances two important problems are the time delay in growing autografts and the fragility of the grafts. A laser-perforated hyaluronic acid membrane delivery system allows grafting at early confluence without requiring dispase digestion to release grafts from their culture dishes. We designed this study to investigate the influence of this membrane on clinical take rates in an established porcine kerato-dermal grafting model. The study demonstrated a significant reduction in take as a result of halving the keratinocyte seeding density onto the membrane. The take rates, however, of grafts grown on the membrane at half or full conventional seeding density and transplanted to a dermal wound bed were comparable, if not better, than those of keratinocyte sheet grafts.
Succi, José Ernesto; Gerola, Luis Roberto; Succi, Guilherme de Menezes; Kim, Hyong Chun; Paredes, Jorge Edwin Morocho; Bufollo, Enio
2012-01-01
To evaluate intraoperative graft patency and identify grafts at risk of early occlusion. Fifty-four patients underwent coronary artery bypass surgery, and graft flow was assessed with a flowmeter (Medtronic Medistim), which uses the transit-time flow measurement (TTFM) method. Three patients had left main disease and 48 had normal or mildly reduced left ventricular function. In-hospital mortality was 3.7% (two patients), one from mesenteric thrombosis and one from cardiogenic shock. Seventeen patients (34%) underwent off-pump CABG. Arterial graft flow ranged from 8 to 106 ml/min (average 31.14 ml/min), and venous graft flow ranged from 9 to 149 ml/min (average 50.42 ml/min). Flowmeter use provides greater safety for both patients and surgeons. Even from a legal standpoint, the documentation provided by the device can help avoid future disputes.
Personal experience with the procurement of 132 liver allografts
Yanaga, K.; Tzakis, A.G.; Starzl, T.E.
2010-01-01
A single donor surgeon's experience procuring the livers from 132 donors is described. Thirty-seven grafts (28.9%) had hepatic arterial anomalies, 19 (14.4%) of which required arterial reconstruction prior to transplantation. Of the 121 grafts evaluated for early function, 103 grafts (85.2%) functioned well, whereas 14 grafts (11.6%) functioned poorly and 4 grafts (3.3%) failed to function at all. The variables associated with less than optimal function of the graft consisted of donor age (P < 0.05), duration of donor's stay in the intensive care unit (P < 0.005), abnormal graft appearance (P < 0.05), and such recipient problems as vascular thromboses during or immediately following transplantation (P < 0.005). A new preservation fluid, University of Wisconsin solution, allowed safe and longer cold storage of the liver allograft than did Euro-Collins' solution (P < 0.0001). A parameter of liver allograft viability, which is simple and predictive of allograft function prior to the actual transplant procedure, is urgently needed. PMID:2803485
Graft union formation in artichoke grafting onto wild and cultivated cardoon: an anatomical study.
Trinchera, Alessandra; Pandozy, Gianmarco; Rinaldi, Simona; Crinò, Paola; Temperini, Olindo; Rea, Elvira
2013-12-15
In order to develop a non-chemical method such as grafting that is effective against well-known artichoke soil-borne diseases, an anatomical study of union formation in artichoke grafted onto selected wild and cultivated cardoon rootstocks, both resistant to Verticillium wilt, was performed. The cardoon accessions Belgio (cultivated cardoon) and Sardo (wild cardoon) were selected as rootstocks for grafting combinations with the artichoke cv. Romolo. Grafting experiments were carried out in the autumn and spring. The anatomical investigation of graft union formation was conducted by scanning electron microscopy (SEM) on the grafted portions on the 3rd, 6th, 10th, and 12th day after grafting. For the autumn experiment only, SEM analysis was also performed at 30 d after grafting. A high affinity between artichoke scion and cardoon rootstocks was observed, with some genotype differences in healing time between the two bionts. SEM images of scion/rootstock longitudinal sections revealed the appearance of many interconnecting structures between the two grafting components just 3 d after grafting, followed by a vascular rearrangement and callus development during graft union formation. De novo formation of many plasmodesmata between scion and rootstock confirmed their high compatibility, particularly in the globe artichoke/wild cardoon combination. Moreover, the duration of the early-stage grafting process could be influenced not only by the scion/rootstock compatibility, but also by the seasonal conditions, being favored by lower temperatures and a reduced light/dark photoperiod. Copyright © 2013 Elsevier GmbH. All rights reserved.
Epidemiologic Study and Genotyping of BK Virus in Renal Transplant Recipients.
Cobos, M; Aquilia, L; Garay, E; Ochiuzzi, S; Alvarez, S; Flores, D; Raimondi, C
2018-03-01
BK virus (BKV) infection occurs during childhood, and the virus remains latent in the urinary tract. The virus is reactivated in immunosuppressed patients, particularly those with deficient cellular immunity, allowing its detection in urine and blood. Nephropathy caused by the virus in renal transplant recipients may lead to graft failure. The purpose of this study was to determine the prevalence of BKV variants in renal transplant recipients and to evaluate their clinical evolution using in-house molecular methods. Urine and peripheral blood samples from 66 renal transplant recipients from the province of Buenos Aires, Argentina, were analyzed systematically every 3 months, as well as whenever there was graft dysfunction. Renal biopsies, which were included in the BKV detection study, were performed in those patients with graft dysfunction. Genotyping of 24 BKV isolates was performed, with the following distribution: 21 (87.5%) belonged to subtype I and 3 (12.5%) to subtype II; no BKV belonging to subtype III or IV was found. Among the subtype I subgroups, the following were identified: 1 (4.76%) Ia, 10 (47.61%) Ib1, and 10 (47.61%) Ib2. Subgroup Ic was not detected. Viremia was present in 33.33% of cases, of which 75% corresponded to subgroup Ib1. Genotype Ib1 predominates in Southeast Asia, whereas Ib2 predominates in Europe. Although an important proportion of the inhabitants of the province of Buenos Aires are of European descent, the prevailing genotype is Ib1, the Asian type. The BKV genotype might be related to the evolution of the disease in the recipient. Copyright © 2017 Elsevier Inc. All rights reserved.
Hynes, Conor F; Colo, Sanchez; Amdur, Richard L; Chawla, Lakhmir S; Greenberg, Michael D; Trachiotis, Gregory D
2016-01-01
This study aimed to evaluate the short- and long-term effects of conventional on-pump coronary bypass grafting (cCABG) compared with off-pump coronary artery bypass (OPCAB) on renal function. A retrospective review of patients undergoing coronary bypass grafting from 2004 through 2013 at a single center was conducted. Preoperative renal function, perioperative acute kidney injury, and long-term glomerular filtration were evaluated. Multivariable analyses were used to determine factors contributing to short- and long-term renal impairment. A total of 234 patients underwent cCABG, and 582 underwent OPCAB. Patients undergoing OPCAB were significantly older, had greater preoperative renal dysfunction, had greater functional dependence, and took more hypertension medications. Multivariable analyses found that 30-day acute kidney injury was an independent risk factor for a 10% decline in glomerular filtration rate at 1 and 5 years (P < 0.0001 and 0.002, respectively). However, the use of cardiopulmonary bypass was not found to influence long-term renal function (P = 0.78 at 1 year, P = 0.76 at 5 years). The percentages of patients experiencing a 10% drop in renal function from baseline at 1 year (33% OPCAB, 35% cCABG; P = 0.73) and 5 years (16% OPCAB, 16% cCABG; P = 0.93) were not significantly different. Independent predictors of acute kidney injury included baseline kidney function (P = 0.04) and age (P < 0.0001), whereas cardiopulmonary bypass did not affect the incidence (P = 0.17). A propensity-matched analysis confirmed these findings. Acute kidney injury is a risk factor for long-term renal dysfunction after either bypass method and was not greater after cCABG compared with OPCAB. Patients undergoing OPCAB did not experience a greater decrease in long-term kidney function despite having worse baseline kidney function.
Allograft dendritic cell p40 homodimers activate donor-reactive memory CD8+ T cells
Tsuda, Hidetoshi; Su, Charles A.; Tanaka, Toshiaki; Ayasoufi, Katayoun; Min, Booki; Valujskikh, Anna; Fairchild, Robert L.
2018-01-01
Recipient endogenous memory T cells with donor reactivity pose an important barrier to successful transplantation and costimulatory blockade–induced graft tolerance. Longer ischemic storage times prior to organ transplantation increase early posttransplant inflammation and negatively impact early graft function and long-term graft outcome. Little is known about the mechanisms enhancing endogenous memory T cell activation to mediate tissue injury within the increased inflammatory environment of allografts subjected to prolonged cold ischemic storage (CIS). Endogenous memory CD4+ and CD8+ T cell activation is markedly increased within complete MHC-mismatched cardiac allografts subjected to prolonged versus minimal CIS, and the memory CD8+ T cells directly mediate CTLA-4Ig–resistant allograft rejection. Memory CD8+ T cell activation within allografts subjected to prolonged CIS requires memory CD4+ T cell stimulation of graft DCs to produce p40 homodimers, but not IL-12 p40/p35 heterodimers. Targeting p40 abrogates memory CD8+ T cell proliferation within the allografts and their ability to mediate CTLA-4Ig–resistant allograft rejection. These findings indicate a critical role for memory CD4+ T cell–graft DC interactions to increase the intensity of endogenous memory CD8+ T cell activation needed to mediate rejection of higher-risk allografts subjected to increased CIS. PMID:29467328
Second-degree burns with six etiologies treated with autologous noncultured cell-spray grafting.
Esteban-Vives, Roger; Choi, Myung S; Young, Matthew T; Over, Patrick; Ziembicki, Jenny; Corcos, Alain; Gerlach, Jörg C
2016-11-01
Partial- and deep partial-thickness burn wounds present a difficult diagnosis and prognosis, which makes planning for conservative treatment versus mesh grafting problematic. A non-invasive treatment strategy avoiding mesh grafting is often chosen by practitioners based on their clinical and empirical evidence. However, delayed re-epithelialization after conservative treatment may extend the patient's hospitalization period, increase the risk of infection, and lead to poor functional and aesthetic outcomes. Early spray grafting using non-cultured autologous cells is under discussion for partial- and deep partial-thickness wounds to accelerate the re-epithelialization process, reduce healing time in the hospital, and minimize complications. To support planning of future clinical studies on this technology, suitable indications need to be identified. We present case information on severe second-degree injuries after gas, chemical, electrical, gasoline, hot water, and tar scalding burns, with one patient per etiology. The treatment results with autologous non-cultured cells support rapid, uncomplicated re-epithelialization with aesthetically and functionally satisfying outcomes. Hospital stays averaged 7.6 ± 1.6 days. Early autologous cell-spray grafting does not preclude or prevent simultaneous or subsequent traditional mesh autografting when indicated on defined areas of full-thickness injury. Copyright © 2016 Elsevier Ltd and ISBI. All rights reserved.
Chiang, Chao-Ching; Su, Chen-Yao; Huang, Ching-Kuei; Chen, Wei-Ming; Chen, Tain-Hsiung; Tzeng, Yun-Hsuan
2007-09-01
Refractory nonunions of the tibia or femur are physically and mentally devastating conditions for the patients, and the treatment is challenging for orthopedic surgeons. The goal of this study was to assess the feasibility and outcome of surgical treatment in recalcitrant nonunions of a lower extremity with bone graft enriched with autologous platelet gel (APG). Twelve patients with four femoral and eight tibial atrophic nonunions after multiple prior procedures were included. All of them were treated with the bone grafting procedures with autograft complex enriched with APG. They were evaluated with radiographs, bone mineral density for bony healing process, and the Short-Form 36 Health Survey for functional outcome. Of the 12 patients, 11 healed at an average of 19.7 weeks after the first attempt and 1 healed after the second attempt at 21 weeks. The bone mineral density continued to increase steadily from early healing to the remodeling phase. Functional status was greatly improved at an average follow-up of 32.4 months. The results of this preliminary study implied the possible potential of bone graft enriched with APG in the treatment of recalcitrant nonunions of the lower extremity. More research is necessary to clarify its role in augmentation of bone graft to enhance healing of nonunion.
Kamohara, Keiji; Furukawa, Kojiro; Itoh, Manabu; Morokuma, Hiroyuki; Tanaka, Hideya; Hayashi, Nagi; Morita, Shigeki
2015-01-01
In thoracoabdominal aortic aneurysm (TAAA) repair, our technical modification of visceral reconstruction using longer-cut pre-sewn side branches has provided good surgical outcomes. Here, we assessed the long-term durability and patency of the revascularized branches using computed tomography (CT) to confirm the validity of our approach. Early and late CT evaluations were performed in 11 TAAA patients (males: 5; mean age: 60.6 years) treated with the Coselli graft to evaluate the position of the main graft and the diverging pattern and patency of the side branches. Seven of the 11 grafts were sutured in an extra-anatomical fashion using longer-cut side branches. In the Anatomical (n = 4) and Extra-anatomical (n = 7) groups, the early patency of the side branches was not significantly different. Although the late patency of the right renal artery (RA) branch was 100% in both groups, that of the left RA branch was 60% in the Extra-anatomical group versus 100% in the Anatomical group. Furthermore, the main graft in the Extra-anatomical group lay significantly posterior and leftward relative to the spine, with the left RA side branch diverging at an acute angle. When a pre-sewn branched graft designed for TAAA is used, the graft should be sutured in a fashion similar to the normal patient anatomy to minimize the possibility of kinking of the RA side branch and to preserve its patency.
Ultrasound findings in dual kidney transplantation.
Damasio, M B; Cittadini, G; Rolla, D; Massarino, F; Stagnaro, N; Gherzi, M; Paoletti, E; Derchi, L E
2013-02-01
This study was done to analyse colour Doppler ultrasound (CDUS) findings in patients with dual kidney transplantation (DKT) and to compare renal volume and resistive index (RI) values between DKT and single kidney transplantation (SKT). We reviewed the clinical and imaging findings [30 CDUS, five magnetic resonance (MR) and one computed tomography (CT) examination] in 30 patients with DKT (23 men and seven women; median age 65 years; range 55-82). Three patients had clinical signs of graft malfunction. Renal volumes and RI were compared with those of 14 SKT patients and comparable levels of renal function. Three patients had graft dysfunction: one had chronic rejection and two had pathologies involving one kidney only (one encrusted pyeloureteritis of a left graft and one occluded main artery of a left graft). Asymptomatic unilateral pathologies were seen in six cases. In asymptomatic DKT patients, no significant differences in length, volume, cortical echogenicity and RI between the two kidneys were observed; DKTs were smaller (median volume 116.7 vs. 171.6 cc) and had higher RIs (0.76 vs. 0.68) (p<0.01) than SKTs. CDUS provides useful information in patients with DKT, allowing detection of clinically unsuspected unilateral diseases. At comparable levels of renal function, DKT patients had higher RI and lower volumes than SKT patients.
Does a meso-caval shunt have positive effects in a pig large-for-size liver transplantation model?
Tannuri, Ana Cristina Aoun; de Albuquerque Rangel Moreira, Daniel; Belon, Alessandro; Coelho, Maria Cecília Mendonça; Gonçalves, Josiane Oliveira; Serafini, Suellen; Tannuri, Uenis
2017-08-01
In pediatric liver transplantations with large-for-size (LFS) grafts, higher incidences of graft dysfunction probably occur due to ischemia-reperfusion injury (IRI). It was postulated that increasing the blood supply to the graft by means of a meso-caval shunt could ameliorate the IRI. Eleven pigs underwent liver transplantation and were divided into two groups: an LFS group and an LFS+SHUNT group. A series of flowmetric, metabolic, histologic, and molecular studies were performed. No significant metabolic differences were observed between the groups. One hour after reperfusion, portal flow was significantly lower in the recipients than in the donors, showing that the graft was maintained under low portal blood flow, although the shunt could promote a transient increase in portal blood flow and a decrease in arterial flow. Finally, it was verified that the shunt promoted a decrease in inflammation and steatosis scores, a decrease in the expression of the eNOS gene (responsible for the generation of nitric oxide in the vascular endothelium), and an increase in the expression of the proapoptotic gene BAX. The meso-caval shunt was responsible for some positive effects, although other deleterious flowmetric and molecular alterations also occurred. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Early Spontaneous Graft Intra- and Perihepatic Hematoma after Liver Transplantation.
Lupaşcu, Cristian; Apopei, Oana; Vlad, Nutu; Vasiluta, Ciprian; Trofin, Ana-Maria; Zabara, Mihai; Vornicu, Alexandra; Lupaşcu-Ursulescu, Corina; Nitu, Mioara; Crumpei, Felicia; Braşoveanu, Vladislav; Popescu, Irinel
2017-01-01
Hematoma of the graft is a life-threatening complication of liver transplantation (LT), and there is no clear consensus in the literature on optimal management beyond a few scattered case reports. It may be either intrahepatic or subcapsular, and it may develop spontaneously or follow parenchymal injuries or transhepatic percutaneous invasive maneuvers. In this report we describe a rare case of a large spontaneous graft intra- and perihepatic hematoma. A 62-year-old man underwent whole-graft orthotopic liver transplantation (OLT) for decompensated chronic liver disease due to alcoholic cirrhosis. The surgical procedure was uneventful. During the early postoperative course, routine Doppler ultrasound examination and CT scan revealed an extrahepatic paracaval hematoma 7 days after transplantation, which was stable and managed conservatively until the 18th postoperative day, when a rapidly expanding intraparenchymal hematoma involving the right hemiliver, several other perihepatic hematomas, a significant right pleural effusion and hemorrhagic ascites were described. The patient was successfully treated conservatively (nonsurgically), with slow recovery of the liver allograft, and was discharged one month later in good general status.
Donation after cardio-circulatory death liver transplantation
Le Dinh, Hieu; de Roover, Arnaud; Kaba, Abdour; Lauwick, Séverine; Joris, Jean; Delwaide, Jean; Honoré, Pierre; Meurisse, Michel; Detry, Olivier
2012-01-01
The renewed interest in donation after cardio-circulatory death (DCD) started in the 1990s, following the limited success of the transplant community in expanding the donation after brain death (DBD) organ supply and the requests of families of potential DCD donors. Since then, DCD organ procurement and transplantation activities have expanded rapidly, particularly for non-vital organs such as kidneys. In liver transplantation (LT), DCD donors are a valuable organ source that helps to decrease waiting-list mortality and to increase the availability of organs for transplantation. This is despite a higher risk of early graft dysfunction, more frequent vascular and ischemia-type biliary lesions, and higher rates of re-listing and re-transplantation with lower graft survival, largely attributable to the unavoidable warm ischemia that occurs during the declaration of death and the organ retrieval process. Experimental strategies intervening in both donors and recipients at different phases of the transplantation process have focused on attenuating ischemia-reperfusion injury and have already yielded encouraging results, and some have found their way from pre-clinical success into clinical reality. The future of DCD-LT is promising. Concerted efforts should concentrate on the identification of suitable donors (probably Maastricht category III DCD donors), better donor and recipient matching (high-risk donors to low-risk recipients), use of advanced organ preservation techniques (oxygenated hypothermic machine perfusion, normothermic machine perfusion, venous systemic oxygen persufflation), and pharmacological modulation (probably a multi-factorial biologic modulation strategy) so that DCD liver allografts can be used safely and attain results equivalent to those of DBD-LT. PMID:22969222
Fukui, Toshihiro; Tabata, Minoru; Morita, Satoshi; Takanashi, Shuichiro
2013-06-01
The aim of the present study was to determine the early and long-term outcomes of coronary artery bypass grafting in patients with acute coronary syndrome and stable angina pectoris. From September 2004 to September 2011, 382 patients with acute coronary syndrome (unstable angina pectoris and non-ST-segment elevation myocardial infarction) and 851 patients with stable angina pectoris underwent first-time isolated coronary artery bypass grafting at our institute. The early and long-term outcomes were compared between the 2 groups. Patients with acute coronary syndrome were older, were more likely to be women, had a smaller body surface area, and were more likely to have left main coronary artery disease. In both groups, bilateral internal thoracic artery grafts were used in approximately 89% of the patients, and off-pump techniques in approximately 97% of the patients. The acute coronary syndrome group had a greater operative death rate (2.6% vs 0.1%) and a greater incidence of low output syndrome (3.1% vs 1.2%) and hemodialysis requirement (2.9% vs 1.1%). Multivariate regression analysis demonstrated that age, acute coronary syndrome, lower ejection fraction, and higher creatinine level before surgery were independent predictors of operative death. However, among the hospital survivors, no differences were seen in freedom from all-cause death (85.4% ± 2.5% vs 87.7% ± 2.0%), cardiac death (97.4% ± 0.9% vs 96.5% ± 0.9%), or major adverse cardiac and cerebrovascular events (78.0% ± 2.9% vs 78.1% ± 2.3%) at 7 years between the patients with acute coronary syndrome and stable angina pectoris. Although acute coronary syndrome is an independent predictor of early mortality in patients undergoing coronary artery bypass grafting, the long-term outcomes after surgery were similar between patients with acute coronary syndrome and stable angina pectoris who survived the early postoperative period. Copyright © 2013 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
Adamek, Martina; Opelz, Gerhard; Klein, Katrin; Morath, Christian; Tran, Thuong Hien
2016-07-01
Timely detection of graft rejection is an important issue in the follow-up care after solid organ transplantation. Until now, biopsy has been considered the "gold standard" in the diagnosis of graft rejection. However, non-invasive tests such as monitoring the levels of cell-free DNA (cfDNA) as a sensitive biomarker for graft integrity have attracted increasing interest. The rationale of this approach is that a rejected organ will lead to a significant release of donor-derived cfDNA, which can be detected in the serum of the transplant recipient. We have developed a novel quantitative real-time PCR (qPCR) approach for detecting an increase of donor-derived cfDNA in the recipient's serum. Common insertion/deletion (InDel) genetic polymorphisms, which differ between donor and recipient, are targeted in our qPCR assay. In contrast to some other strategies, no specific donor/recipient constellations such as certain gender combinations or human leukocyte antigen (HLA) discrepancies are required for the application of our test. The method was first validated with serial dilutions of serum mixtures obtained from healthy blood donors and then used to determine donor-derived cfDNA levels in patients' sera within the first 3 days after their kidney transplantation had been performed. Our method represents a universally applicable, simple and cost-effective tool which can potentially be used to detect graft dysfunction in transplant recipients.
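The assay above quantifies donor-derived cfDNA by qPCR of donor-discriminating InDel markers, but the abstract does not give the quantification scheme. The sketch below is only a hedged illustration of how a donor fraction might be estimated from such measurements; the function, the heterozygosity correction and the copy numbers are assumptions for illustration, not the authors' method.

```python
# Illustrative sketch only (not the published assay): estimate the fraction
# of cell-free DNA that is donor-derived from qPCR copy numbers of a
# donor-specific InDel marker and of a total-cfDNA reference target.
def donor_cfdna_fraction(donor_marker_copies: float,
                         total_cfdna_copies: float,
                         marker_heterozygous_in_donor: bool = True) -> float:
    """Return the estimated donor-derived cfDNA fraction.

    If the donor is heterozygous for the discriminating InDel, only half of
    the donor genomes carry it, so the measured copies are doubled
    (assumption for illustration; the real assay may handle this differently).
    """
    if total_cfdna_copies <= 0:
        raise ValueError("Total cfDNA copies must be positive")
    factor = 2.0 if marker_heterozygous_in_donor else 1.0
    return factor * donor_marker_copies / total_cfdna_copies

# Hypothetical numbers: 150 marker copies against 10,000 total copies
# from a heterozygous donor correspond to an estimated 3% donor fraction.
print(f"{donor_cfdna_fraction(150, 10_000):.1%}")
```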
Fernández-Rodríguez, O M; Ríos, A; Navarro, J L; Pons, J A; Palenciano, C G; Mota, R; Berenguer, J J; Mulero, F; Contreras, J; Conesa, C; Ramírez, P; Fuente, T; Parrilla, P
2006-04-01
Our aim was to evaluate liver graft integrity and function using scintigraphy and ultrasonography in a porcine model of auxiliary heterotopic liver transplantation with portal vein arterialization (AHLT-PVA). Using Doppler ultrasonography we evaluated eight AHLT-PVA by parenchymal echogenicity, portal and arterial anatomy, and portal and biliary system flow. Two types of scintigraphy were performed: microaggregated human albumin colloid scintigraphy and diisopropyl iminodiacetic acid (DISIDA) scintigraphy, both labeled with 99mTc. The animals were distributed into two groups. The first group consisted of three animals with clinical suspicion of graft dysfunction, in which the ultrasonographic study revealed areas of parenchymal disruption. In the scintigraphic study, heterogeneous uptake was observed; there was no uptake in one animal. Necropsy of these three animals revealed areas of graft necrosis. The second group consisted of five animals with good clinical courses, in which the ultrasonographic study showed portal dilation, portal flow with arterial spiculations, and homogeneous echogenicity of the hepatic parenchyma. The scintigraphic study revealed homogeneous uptake by the graft and an elimination speed of the hepatobiliary agent similar to that of the native liver. A heterogeneous echostructure of the graft was a sign of poor prognosis indicating necrosis, in the same way as heterogeneous uptake or nonuptake of radioisotope on scintigraphy. Scintigraphy is a good method to evaluate biliary function and bile elimination. In AHLT-PVA, the main ultrasound findings derived from arterialization were dilation of the portal system and portal flow with arterial spiculations.
Overview of Clinical Lung Transplantation
Yeung, Jonathan C.; Keshavjee, Shaf
2014-01-01
Since the first successful lung transplant 30 years ago, lung transplantation has rapidly become an established standard of care to treat end-stage lung disease in selected patients. Advances in lung preservation, surgical technique, and immunosuppression regimens have resulted in the routine performance of lung transplantation around the world for an increasing number of patients, with wider indications. Despite this, donor shortages and chronic lung allograft dysfunction continue to prevent lung transplantation from reaching its full potential. With research into the underlying mechanisms of acute and chronic lung graft dysfunction and advances in personalized diagnostic and therapeutic approaches to both the donor lung and the lung transplant recipient, there is increasing confidence that we will improve short- and long-term outcomes in the near future. PMID:24384816
Biological Effects of Orthodontic Tooth Movement Into the Grafted Alveolar Cleft.
Sun, Jian; Zhang, Xiaoyue; Li, Renmei; Chen, Zhengxi; Huang, Yuanliang; Chen, Zhenqi
2018-03-01
Functional stimulus during orthodontic tooth movement into the grafted bone can lead to better alveolar bone grafting outcomes. The aim of this study was to analyze the biological effects of orthodontic tooth movement into the grafted alveolar cleft area with histologic staining, fluorescence staining, and real-time polymerase chain reaction (PCR). An animal model of orthodontic tooth movement into the grafted alveolar cleft area was established in 8-week-old Sprague-Dawley rats. The animals were divided into the experimental group and the control group. Four checkpoints were observed: before orthodontic stimuli, day 1 after orthodontic stimuli, day 3 after orthodontic stimuli, and day 5 after orthodontic stimuli. The cleft bone formation conditions, including the collagen fibers and the activities of the osteoclasts and osteoblasts, were evaluated by histologic staining. The expression of tartrate-resistant acid phosphatase (TRAP), receptor activator of nuclear factor κB ligand (RANKL), and Runt-related transcription factor 2 (RUNX2) was detected by real-time PCR in both groups. Hematoxylin-eosin staining showed that the remodeling process of iliac autografts was completed when the orthodontic stress was applied, whereas the bone tissues first showed osteoclastogenesis and then osteogenesis. On the basis of TRAP staining, the osteoclasts increased to the maximal amount on day 3 and decreased thereafter. Evidence from tetracycline fluorescence staining indicated that no obvious changes in osteoblast activity were detected at the early stage; however, it gradually increased, especially in the region close to the root surface. According to real-time PCR, the expression of TRAP increased in both the early and middle stages, that of RANKL increased in the early stage, and that of RUNX2 increased in the late stage. Moreover, the results showed significant differences between the experimental and control groups. Orthodontic tooth movement into the alveolar cleft bone graft area promoted bone remodeling of the embedded bone, thus inducing bone resorption and subsequent deposition. Copyright © 2017. Published by Elsevier Inc.
Monstrey, S; Beele, H; Kettler, M; Van Landuyt, K; Blondeel, P; Matton, G; Naeyaert, J M
1999-09-01
Improved shock therapy has extended the limits of survival in patients with massive burns, and nowadays skin coverage has become the major problem in burn management. The use of mesh skin grafts is still the simplest technique to expand the amount of available donor skin. However, very wide-mesh skin grafts take a very long time to heal, often resulting in unaesthetic scar formation. On the other hand, allogeneic cultured keratinocytes have been reported as a natural source of growth factors and thus could be useful to improve wound healing of these wide-mesh grafts. A clinical study was performed to compare the use of cryopreserved allogeneic cultured keratinocytes vs. the traditional cadaveric skin as a double layer over widely expanded autogenous skin grafts. This procedure was performed in 18 pairs of full-thickness burn wounds (with similar depth and location) in 11 severely burned patients. Early clinical evaluation was made at 2, 3, and 4 to 5 weeks. Parameters such as epithelialization, granulation tissue formation, infection, and scar formation were evaluated. Biopsies were taken to compare the histological characteristics of the epidermis, the epidermal-dermal junction, and the dermis. Late evaluations were performed at 6 and 12 months regarding color, softness, thickness, and subjective feeling of the scar tissue. Aside from faster (p < 0.05) epithelialization in the keratinocyte group at 2 weeks, there were no statistically significant differences in any of the early evaluation parameters, either clinically or histologically. At long-term follow-up, clinical results and scar characteristics were not significantly different in the two compared groups. It is concluded from the results of this study that, during the early phase, epithelialization was faster with allogeneic cultured keratinocytes compared with cadaveric skin. However, taking into account the substantial difference in costs, the described use of cryopreserved allogeneic cultured keratinocytes as a double layer on meshed autogenous split-thickness skin grafts can hardly be advocated.
Young, Raymond K; Dale, Bethany; Russell, Stuart D; Zachary, Andrea A; Tedford, Ryan J
2015-08-01
In cardiac transplant recipients, the development of antibodies to the endothelial intermediate filament protein vimentin (antivimentin antibodies, AVA) has been associated with rejection and poor outcomes. However, the incidence of these antibodies prior to transplantation and their association with early rejection has not been investigated. Pre-transplant serum was analyzed from 50 patients who underwent de novo cardiac transplant at Johns Hopkins Hospital from 2004 to 2012. Demographic, one-yr rejection, and survival data were obtained from the transplant database. The incidence of pre-transplant AVA was 34%. AVA-positive patients were younger (p = 0.03), and there was a trend toward a higher incidence in females (p = 0.08). Demographic data were similar among both groups. AVA positivity did not predict rejection in the first year post-transplant. There was no difference in rejection-free graft survival (53 vs. 52%, p = 0.85) at one yr. Similarly, there was no difference in graft survival at one yr (82 vs. 88%, p = 0.56) or graft survival at a median follow-up of 23 and 26 months, respectively (76 vs. 85%, p = 0.41). AVA is common in the cardiac pre-transplant population with a higher incidence in the young. The presence of detectable AVA did not correlate with early post-transplant rejection or graft survival. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Young, Raymond K.; Dale, Bethany; Russell, Stuart D.; Zachary, Andrea A.; Tedford, Ryan J.
2015-01-01
Background In cardiac transplant recipients, the development of antibodies to the endothelial intermediate filament protein vimentin (anti-vimentin antibodies, AVA) has been associated with rejection and poor outcomes. However, the incidence of these antibodies prior to transplantation and their association with early rejection has not been investigated. Methods Pre-transplant serum was analyzed from 50 patients who underwent de novo cardiac transplant at Johns Hopkins Hospital from 2004-2012. Demographic, one year rejection, and survival data were obtained from the transplant database. Results The incidence of pre-transplant AVA was 34%. AVA positive patients were younger (p=0.03) and there was an increased incidence in females (p=0.08). Demographic data were similar among both groups. AVA positivity did not predict rejection in the 1st year post-transplant. There was no difference in rejection-free graft survival (53 vs. 52%, p=0.85) at 1 year. Similarly there was no difference in graft survival at 1 year (82 vs. 88%, p=0.56) or graft survival at a median follow up of 23 and 26 months, respectively (76 vs. 85%, p=0.41). Conclusions AVA is common in the cardiac pre-transplant population with a higher incidence in the young. The presence of detectable AVA did not correlate with early post-transplant rejection or graft survival. PMID:25982351
Graft versus host disease: what should the oculoplastic surgeon know?
Tung, Cynthia I
2017-09-01
To provide a concise review of the oculoplastic manifestations of ocular graft versus host disease (GVHD), and to discuss their management. Ocular GVHD occurs as a common immune-mediated complication of hematopoietic stem cell transplantation that presents as a Stevens-Johnson-like syndrome in the acute phase or a Sjögren-like syndrome in the chronic phase. Cicatricial conjunctivitis may be underreported in ocular GVHD. The spectrum of oculoplastic manifestations includes GVHD of the skin, cicatricial entropion, nasolacrimal duct obstruction, and lacrimal gland dysfunction. Surgical treatment is indicated for patients with significant corneal complications from entropion. Surgical approach to repair of nasolacrimal duct obstruction is presented in this review, including modified approaches for treating patients at risk for keratitis sicca. Management of the ocular graft versus host patient may require a multidisciplinary approach involving collaboration from the oculoplastic surgeon, the corneal specialist, and the stem cell transplant physician. Oculoplastic manifestations of ocular GVHD typically present as cicatricial changes in the eyelid and lacrimal system. Careful oculoplastic and corneal evaluation are necessary when considering surgical management for the ocular GVHD patient.
Long-term effects of steroid withdrawal in kidney transplantation.
Offermann, G; Schwarz, A; Krause, P H
1993-01-01
The long-term graft function after withdrawal of steroids from maintenance immunosuppression was analyzed in 98 kidney recipients (59 on cyclosporin monotherapy, 39 on cyclosporin plus azathioprine) who had not developed an early rejection episode when prednisolone was discontinued. Seven years after steroid withdrawal the probability of an increase in serum creatinine (> 20% of baseline levels) was 51%. The increase in creatinine was associated with sclerosing arteriopathy as a marker of chronic rejection in 29 of 43 graft biopsies. The addition of azathioprine had no effect on the stability of long-term graft function and did not influence the 7-year graft survival rate in this highly selected group of patients.
Yousif, Mohamed Elamin Awad; Bridson, Julie M; Halawa, Ahmed
2016-06-01
There is a misconception among transplant clinicians that contraception after a successful renal transplant is challenging. This is partly due to the complex nature of transplant patients, where immunosuppression and graft dysfunction create major concerns. In addition, good evidence regarding contraception and transplant is scarce, with most of the evidence extrapolated from observational and case-controlled studies, thus adding to the dilemma of treating these patients. In this review, we closely analyzed the different methods of contraception and critically evaluated the efficacy of the different options for contraception after kidney transplant. We conclude that contraception after renal transplant is successful with acceptable risk. A multidisciplinary team approach involving obstetricians and transplant clinicians to decide the appropriate timing for conception is recommended. Early counseling on contraception is important to reduce the risk of unplanned pregnancies, improve pregnancy outcomes, and reduce maternal complications in patients after kidney transplant. Contraceptive advice should be individualized according to each patient's risks and expectations.
Donor-acquired fat embolism syndrome after lung transplantation.
Jacob, Samuel; Courtwright, Andrew; El-Chemaly, Souheil; Racila, Emilian; Divo, Miguel; Burkett, Patrick; Fuhlbrigge, Anne; Goldberg, Hilary J; Rosas, Ivan O; Camp, Phillip
2016-05-01
Fat embolism is a known complication of severe trauma and closed chest cardiac resuscitation, both of which are more common in the lung transplant donor population and can lead to donor-acquired fat embolism syndrome (DAFES). The objective was to review the diagnosis and management of DAFES in the lung transplantation literature and at our centre. We performed a literature review on DAFES using the Medline database. We then reviewed the transplant records of Brigham and Women's Hospital, a large academic hospital with an active lung transplant programme, for cases of DAFES. We identified 2 cases of DAFES in our centre, one of which required extracorporeal membrane oxygenation (ECMO) for successful management. In contrast to the broader literature on DAFES, which emphasizes unsuccessfully treated cases, both patients survived. DAFES is a rare but likely underappreciated early complication of lung transplant as it can mimic primary graft dysfunction. Aggressive interventions, including ECMO, may be necessary to achieve a good clinical outcome following DAFES. © The Author 2015. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
Evaluation of serum sCD30 in renal transplantation patients with and without acute rejection.
Cervelli, C; Fontecchio, G; Scimitarra, M; Azzarone, R; Famulari, A; Pisani, F; Battistoni, C; Di Iulio, B; Fracassi, D; Scarnecchia, M A; Papola, F
2009-05-01
Despite new immunosuppressive approaches, acute rejection episodes (ARE) are still a major cause of early kidney dysfunction with a negative impact on long-term allograft survival. Noninvasive markers able to identify renal ARE earlier than creatinine measurement include sCD30. We sought to establish whether circulating levels of sCD30 in pretransplantation and posttransplantation periods were of clinical relevance to avoid graft damage. Quantitative detection of serum sCD30 was performed using an enzyme-linked immunosorbent assay. Our results demonstrated that the mean concentrations of sCD30 were significantly higher in the sera of renal transplant recipients with ARE (30.04 U/mL) and in uremic patients on the waiting list (37.7 U/mL) compared with healthy controls (HC; 9.44 U/mL), but not nonrejecting patients (12.01 U/mL). Statistical analysis revealed a strong association between high sCD30 levels in posttransplantation sera and ARE risk. This study suggested that sCD30 levels were a reliable predictor of ARE among deceased-donor kidney recipients.
Al-Chaqmaqchi, Heevy; Sadeghi, Behnam; Abedi-Valugerdi, Manuchehr; Al-Hashmi, Sulaiman; Fares, Mona; Kuiper, Raoul; Lundahl, Joachim
2013-01-01
Programmed cell death ligand-1 (PD-L1/CD274) is an immunomodulatory molecule involved in cancer and complications of bone marrow transplantation, such as graft rejection and graft-versus-host disease. The present study was designed to assess the dynamic expression of this molecule after hematopoietic stem cell transplantation in relation to acute graft-versus-host disease. Female BALB/c mice were conditioned with busulfan and cyclophosphamide and transplanted with either syngeneic or allogeneic (male C57BL/6 mice) bone marrow and splenic cells. The expression of PD-L1 was evaluated at different time points employing qPCR, western blot and immunohistochemistry. Allogeneic- but not syngeneic-transplanted animals exhibited a marked up-regulation of PD-L1 expression in the muscle and kidney, but not the liver, at days 5 and 7 post transplantation. In mice transplanted with allogeneic bone marrow cells, the enhanced expression of PD-L1 was associated with high serum levels of IFNγ and TNFα at corresponding intervals. Our findings demonstrate that PD-L1 is differently induced and expressed after allogeneic transplantation than it is after syngeneic transplantation, and that it is in favor of target rather than non-target organs at the early stages of acute graft-versus-host disease. This is the first study to correlate the dynamics of PD-L1 at the gene-, protein- and activity levels with the early development of acute graft-versus-host disease. Our results suggest that the higher expression of PD-L1 in the muscle and kidney (non-target tissues) plays a protective role in skeletal muscle during acute graft-versus-host disease. PMID:23593203
Harder, Jeffrey M; Braine, Catherine E; Williams, Pete A; Zhu, Xianjun; MacNicoll, Katharine H; Sousa, Gregory L; Buchanan, Rebecca A; Smith, Richard S; Libby, Richard T; Howell, Gareth R; John, Simon W M
2017-05-09
Various immune response pathways are altered during early, predegenerative stages of glaucoma; however, whether the early immune responses occur secondarily to or independently of neuronal dysfunction is unclear. To investigate this relationship, we used the WldS allele, which protects from axon dysfunction. We demonstrate that DBA/2J.WldS mice develop high intraocular pressure (IOP) but are protected from retinal ganglion cell (RGC) dysfunction and neuroglial changes that otherwise occur early in DBA/2J glaucoma. Despite this, immune pathways are still altered in DBA/2J.WldS mice. This suggests that immune changes are not secondary to RGC dysfunction or altered neuroglial interactions, but may be directly induced by the increased strain imposed by high IOP. One early immune response following IOP elevation is up-regulation of complement C3 in astrocytes of DBA/2J and DBA/2J.WldS mice. Unexpectedly, because the disruption of other complement components, such as C1Q, is protective in glaucoma, C3 deficiency significantly increased the number of DBA/2J eyes with nerve damage and RGC loss at an early time point after IOP elevation. Transcriptional profiling of C3-deficient cultured astrocytes implicated EGFR signaling as a hub in C3-dependent responses. Treatment with AG1478, an EGFR inhibitor, also significantly increased the number of DBA/2J eyes with glaucoma at the same early time point. These findings suggest that C3 protects from early glaucomatous damage, a process that may involve EGFR signaling and other immune responses in the optic nerve head. Therefore, therapies that target specific components of the complement cascade, rather than global inhibition, may be more applicable for treating human glaucoma.
Harder, Jeffrey M.; Braine, Catherine E.; Williams, Pete A.; Zhu, Xianjun; MacNicoll, Katharine H.; Sousa, Gregory L.; Buchanan, Rebecca A.; Smith, Richard S.; Howell, Gareth R.; John, Simon W. M.
2017-01-01
Various immune response pathways are altered during early, predegenerative stages of glaucoma; however, whether the early immune responses occur secondarily to or independently of neuronal dysfunction is unclear. To investigate this relationship, we used the Wlds allele, which protects from axon dysfunction. We demonstrate that DBA/2J.Wlds mice develop high intraocular pressure (IOP) but are protected from retinal ganglion cell (RGC) dysfunction and neuroglial changes that otherwise occur early in DBA/2J glaucoma. Despite this, immune pathways are still altered in DBA/2J.Wlds mice. This suggests that immune changes are not secondary to RGC dysfunction or altered neuroglial interactions, but may be directly induced by the increased strain imposed by high IOP. One early immune response following IOP elevation is up-regulation of complement C3 in astrocytes of DBA/2J and DBA/2J.Wlds mice. Unexpectedly, because the disruption of other complement components, such as C1Q, is protective in glaucoma, C3 deficiency significantly increased the number of DBA/2J eyes with nerve damage and RGC loss at an early time point after IOP elevation. Transcriptional profiling of C3-deficient cultured astrocytes implicated EGFR signaling as a hub in C3-dependent responses. Treatment with AG1478, an EGFR inhibitor, also significantly increased the number of DBA/2J eyes with glaucoma at the same early time point. These findings suggest that C3 protects from early glaucomatous damage, a process that may involve EGFR signaling and other immune responses in the optic nerve head. Therefore, therapies that target specific components of the complement cascade, rather than global inhibition, may be more applicable for treating human glaucoma. PMID:28446616
Souza Trindade, José Carlos; Viterbo, Fausto; Petean Trindade, André; Fávaro, Wagner José; Trindade-Filho, José Carlos Souza
2017-06-01
To study a novel penile reinnervation technique using four sural nerve grafts and end-to-side neurorrhaphies connecting, bilaterally, the femoral nerve to the corpus cavernosum and the femoral nerve to the dorsal penile nerves. Ten patients (mean [± sd; range] age 60.3 [± 4.8; 54-68] years), who had undergone radical prostatectomy (RP) at least 2 years previously, underwent penile reinnervation in the present study. Four patients had undergone radiotherapy after RP. All patients reported satisfactory sexual activity prior to RP. The surgery involved bridging of the femoral nerve to the dorsal nerve of the penis and the inner part of the corpus cavernosum with sural nerve grafts and end-to-side neurorrhaphies. Patients were evaluated using the International Index of Erectile Function (IIEF) questionnaire and pharmaco-penile Doppler ultrasonography (PPDU) preoperatively and at 6, 12 and 18 months postoperatively, and using a Clinical Evolution of Erectile Function (CEEF) questionnaire, administered after 36 months. The IIEF scores showed improvements with regard to erectile dysfunction (ED), satisfaction with intercourse and general satisfaction. Evaluation of PPDU velocities did not reveal any difference between the right and left sides or among the different time points. The introduction of nerve grafts neither caused fibrosis of the corpus cavernosum, nor reduced penile vascular flow. CEEF results showed that sexual intercourse began after a mean of 13.7 months, with the frequency of sexual intercourse varying from once daily to once monthly. Acute complications were minimal. The study was limited by the small number of cases. A total of 60% of patients were able to achieve full penetration, on average, 13 months after reinnervation surgery. Patients previously submitted to radiotherapy had a slower return of erectile function. We conclude that penile reinnervation surgery is a viable technique, with effective results, and could offer a new treatment method for ED after RP. © 2017 The Authors BJU International © 2017 BJU International Published by John Wiley & Sons Ltd.
Bohorquez, H; Seal, J B; Cohen, A J; Kressel, A; Bugeaud, E; Bruce, D S; Carmody, I C; Reichman, T W; Battula, N; Alsaggaf, M; Therapondos, G; Bzowej, N; Tyson, G; Joshi, S; Nicolau-Raducu, R; Girgrah, N; Loss, G E
2017-08-01
Donation after circulatory death (DCD) liver transplantation (LT) reportedly yields inferior survival and increased complication rates compared with donation after brain death (DBD). We compare 100 consecutive DCD LT using a protocol that includes thrombolytic therapy (late DCD group) to an historical DCD group (early DCD group n = 38) and a cohort of DBD LT recipients (DBD group n = 435). Late DCD LT recipients had better 1- and 3-year graft survival rates than early DCD LT recipients (92% vs. 76.3%, p = 0.03 and 91.4% vs. 73.7%, p = 0.01). Late DCD graft survival rates were comparable to those of the DBD group (92% vs. 93.3%, p = 0.24 and 91.4% vs. 88.2%, p = 0.62). Re-transplantation occurred in 18.4% versus 1% for the early and late DCD groups, respectively (p = 0.001). Patient survival was similar in all three groups. Ischemic-type biliary lesions (ITBL) occurred in 5%, 3%, and 0.2% for early DCD, late DCD, and DBD groups, respectively, but unlike in the early DCD group, in the late DCD group ITBL was endoscopically managed and resolved in each case. Using a protocol that includes a thrombolytic therapy, DCD LT yielded patient and graft survival rates comparable to DBD LT. © 2017 The American Society of Transplantation and the American Society of Transplant Surgeons.
Denk, Stephanie; Wiegner, Rebecca; Hönes, Felix M.; Messerer, David A. C.; Radermacher, Peter; Kalbitz, Miriam; Braumüller, Sonja; McCook, Oscar; Gebhard, Florian; Weckbach, Sebastian; Huber-Lang, Markus
2015-01-01
Severe tissue trauma-induced systemic inflammation is often accompanied by evident or occult blood-organ barrier dysfunctions, frequently leading to multiple organ dysfunction. However, it is unknown whether specific barrier molecules are shed into the circulation early after trauma as potential indicators of an initial barrier dysfunction. The release of the barrier molecule junctional adhesion molecule-1 (JAM-1) was investigated in plasma of C57BL/6 mice 2 h after experimental mono- and polytrauma as well as in polytrauma patients (ISS ≥ 18) during a 10-day period. Correlation analyses were performed to indicate a linkage between JAM-1 plasma concentrations and organ failure. JAM-1 was systemically detected after experimental trauma in mice with blunt chest trauma as a driving force. Accordingly, JAM-1 was reduced in lung tissue after pulmonary contusion and JAM-1 plasma levels significantly correlated with increased protein levels in the bronchoalveolar lavage as a sign for alveolocapillary barrier dysfunction. Furthermore, JAM-1 was markedly released into the plasma of polytrauma patients as early as 4 h after the trauma insult and significantly correlated with severity of disease and organ dysfunction (APACHE II and SOFA score). The data support an early injury- and time-dependent appearance of the barrier molecule JAM-1 in the circulation indicative of a commencing trauma-induced barrier dysfunction. PMID:26556956
Cardiovascular and systemic effects of gastric dilatation and volvulus in dogs.
Sharp, Claire R; Rozanski, Elizabeth A
2014-09-01
Gastric dilatation and volvulus (GDV) is a common emergency condition in large and giant breed dogs that is associated with high morbidity and mortality. Dogs with GDV classically fulfill the criteria for the systemic inflammatory response syndrome (SIRS) and can go on to develop multiple organ dysfunction syndrome (MODS). Previously reported organ dysfunctions in dogs with GDV include cardiovascular, respiratory, gastrointestinal, coagulation and renal dysfunction. Cardiovascular manifestations of GDV include shock, cardiac arrhythmias and myocardial dysfunction. Respiratory dysfunction is also multifactorial, with contributory factors including decreased respiratory excursion due to gastric dilatation, decreased pulmonary perfusion and aspiration pneumonia. Gastrointestinal dysfunction includes gastric necrosis and post-operative gastrointestinal upset such as regurgitation, vomiting, and ileus. Coagulation dysfunction is another common feature of MODS in dogs with GDV. Disseminated intravascular coagulation can occur, putting them at risk of complications associated with thrombosis in the early hypercoagulable state and hemorrhage in the subsequent hypocoagulable state. Acute kidney injury, acid-base and electrolyte disturbances are also reported in dogs with GDV. Understanding the potential for systemic effects of GDV allows the clinician to monitor patients astutely and detect such complications early, facilitating early intervention to maximize the chance of successful management. Copyright © 2014 Elsevier Inc. All rights reserved.
Wan, Yue-Meng; Li, Yu-Hua; Xu, Ying; Wu, Hua-Mei; Li, Ying-Chun; Wu, Xi-Nan; Yang, Jin-Hui
2018-01-16
Transjugular intrahepatic portosystemic shunt (TIPS) is an established method for treating portal hypertension. This study aimed to investigate the long-term safety, technical success, and patency of TIPS, and to determine the risk factors for and clinical impact of shunt dysfunction. A total of 154 consecutive patients undergoing embolotherapy of the gastric coronary vein and/or short gastric vein and TIPS creation were prospectively studied. Follow-up data included technical success, patency and revision of TIPS, and overall survival of patients. During the study, the primary and secondary technical success rates were 98.7% and 100%, respectively. Sixty-three patients developed shunt dysfunction, 30 with shunt stenosis and 33 with shunt occlusion. The cumulative 60-month primary, primary assisted, and secondary patency rates were 19.6%, 43.0%, and 93.4%, respectively. The cumulative 60-month overall survival rates were similar between the TIPS dysfunction group and the TIPS non-dysfunction group (68.6% vs. 58.6%, P = .096). Baseline portal vein thrombosis (P < .001), use of bare stents (P = .018), and portal pressure gradient (PPG) (P = .020) were independent predictors of shunt dysfunction, while hepatocellular carcinoma (P < .001) and ascites (P = .003) were independent predictors of overall survival. The predictive accuracy of PPG for shunt dysfunction was statistically significant (P < .001), and a cutoff value of 8.5 had 77.8% sensitivity and 64.8% specificity. The long-term safety, technical success, and patency of TIPS were good; baseline portal vein thrombosis, use of bare stents, and PPG were significantly associated with shunt dysfunction; shunt dysfunction had little impact on patients' long-term survival because of high secondary patency rates. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
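The 8.5 PPG cutoff above is characterized by its sensitivity and specificity for shunt dysfunction; the sketch below shows, on made-up data, how such operating characteristics are computed for a candidate threshold. The toy values, the threshold handling (PPG at or above the cutoff counted as test-positive) and the function name are assumptions for illustration, not the study's analysis.

```python
# Illustrative sketch: sensitivity and specificity of a PPG cutoff
# for predicting shunt dysfunction, computed on hypothetical data.
from typing import Sequence, Tuple

def sens_spec_at_cutoff(ppg_values: Sequence[float],
                        dysfunction: Sequence[bool],
                        cutoff: float) -> Tuple[float, float]:
    """Treat PPG >= cutoff as test-positive and tally against outcomes."""
    tp = sum(p >= cutoff and d for p, d in zip(ppg_values, dysfunction))
    fn = sum(p < cutoff and d for p, d in zip(ppg_values, dysfunction))
    tn = sum(p < cutoff and not d for p, d in zip(ppg_values, dysfunction))
    fp = sum(p >= cutoff and not d for p, d in zip(ppg_values, dysfunction))
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    return sensitivity, specificity

# Toy measurements (not the study cohort):
ppg = [6.0, 7.5, 9.0, 10.2, 8.8, 5.5, 9.5, 7.0]
dysfunction = [False, False, True, True, True, False, False, False]
print(sens_spec_at_cutoff(ppg, dysfunction, 8.5))  # -> (1.0, 0.8)
```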
Treatment of Linear Scleroderma (en Coup de Sabre) With Dermal Fat Grafting.
Barin, Ensar Zafer; Cinal, Hakan; Cakmak, Mehmet Akif; Tan, Onder
2016-05-01
Linear scleroderma, also known as "en coup de sabre," is a subtype of localized scleroderma that warrants aesthetic correction because it appears on the forehead region in children. To report dermal fat grafting as a novel and effective surgical treatment option in linear scleroderma. Under local anesthesia, a dermal fat graft was successfully placed into a subcutaneous pocket that was prepared underneath the depressed scar. The donor site was closed primarily. No early or late complications developed postoperatively. After 1-year follow-up, the dermal fat graft was viable, the depressed scar was adequately augmented, and a good aesthetic result and patient satisfaction were obtained. We believe that dermal fat grafting is a cost-effective option and provides a long-lasting aesthetic outcome in the management of linear scleroderma. © The Author(s) 2015.
Formica, Francesco; Broccolo, Francesco; Martino, Antonello; Sciucchetti, Jennifer; Giordano, Vincenzo; Avalli, Leonello; Radaelli, Gianluigi; Ferro, Orazio; Corti, Fabrizio; Cocuzza, Clementina; Paolini, Giovanni
2009-05-01
This prospective randomized study sought to verify the systemic inflammatory response, inflammatory myocardial damage, and early clinical outcome in coronary surgery with the miniaturized extracorporeal circulation system or on the beating heart. Sixty consecutive patients were randomized to miniaturized extracorporeal circulation (n = 30) or off-pump coronary revascularization (off-pump coronary artery bypass grafting, n = 30). Intraoperative and postoperative data were recorded. Plasma levels of interleukin-6 and tumor necrosis factor-alpha were measured from systemic blood intraoperatively, at the end of operation, and 24 and 48 hours thereafter. Levels of the same markers and blood lactate were measured from coronary sinus blood intraoperatively to evaluate myocardial inflammation. Markers of myocardial damage were also analyzed. One patient died in the off-pump coronary artery bypass grafting group. There was no statistical difference in early clinical outcome between the two groups. Release of interleukin-6 was higher in the off-pump coronary artery bypass grafting group 24 hours after the operation (P = .03), whereas levels of tumor necrosis factor-alpha did not differ between the groups. Cardiac release of interleukin-6, tumor necrosis factor-alpha, and blood lactate did not differ between the groups. Release of troponin T was not significantly different between the groups. Levels of creatine kinase mass were statistically higher in the miniaturized extracorporeal circulation group than in the off-pump coronary artery bypass grafting group, but only at the end of the operation (P < .0001). Hemoglobin levels were significantly higher in the miniaturized extracorporeal circulation group than in the off-pump coronary artery bypass grafting group after 24 hours (P = .01). Miniaturized extracorporeal circulation can be considered similar to off-pump surgery in terms of systemic inflammatory response, myocardial inflammation and damage, and early outcome.
Holly, Thomas A.; Bonow, Robert O.; Arnold, J. Malcolm O.; Oh, Jae K.; Varadarajan, Padmini; Pohost, Gerald M.; Haddad, Haissam; Jones, Robert H.; Velazquez, Eric J.; Birkenfeld, Bozena; Asch, Federico M.; Malinowski, Marcin; Barretto, Rodrigo; Kalil, Renato A.K.; Berman, Daniel S.; Sun, Jie-Lena; Lee, Kerry L.; Panza, Julio A.
2014-01-01
Objective In the Surgical Treatment for Ischemic Heart Failure (STICH) trial, surgical ventricular reconstruction plus coronary artery bypass surgery was not associated with a reduction in the rate of death or cardiac hospitalization compared to bypass alone. We hypothesized that the absence of viable myocardium identifies patients with coronary artery disease and left ventricular dysfunction who have a greater benefit with coronary artery bypass graft surgery and surgical ventricular reconstruction compared to bypass alone. Methods Myocardial viability was assessed by single-photon emission computed tomography in 267 of the 1,000 patients randomized to bypass or bypass plus surgical ventricular reconstruction in STICH. Myocardial viability was assessed on a per patient basis as well as regionally based on pre-specified criteria. Results At 3 years, there was no difference in mortality or the combined outcome of death or cardiac hospitalization between those with and those without viability, and there was no significant interaction between the type of surgery and global viability status with respect to mortality or death plus cardiac hospitalization. Furthermore, there was no difference in mortality or death plus cardiac hospitalization between those with and without anterior wall or apical scar, and no significant interaction between the presence of scar in these regions and the type of surgery with respect to mortality. Conclusion In patients with coronary artery disease and severe regional left ventricular dysfunction, assessment of myocardial viability does not identify patients who will derive a mortality benefit from adding surgical ventricular reconstruction to coronary artery bypass graft surgery. PMID:25152476
Maeshima, Hitoshi; Baba, Hajime; Nakano, Yoshiyuki; Satomura, Emi; Namekawa, Yuki; Takebayashi, Naoko; Nomoto, Hiroshi; Suzuki, Toshihito; Mimura, Masaru; Arai, Heii
2013-10-01
Previous studies have demonstrated that patients with depression also have memory dysfunctions during depressive episodes. These dysfunctions partially remain immediately after remission from a depressive state; however, it is unclear whether these residual memory dysfunctions may disappear through long-term remission from depression. The present study compared patients during early-life (age<60) and late-life (age ≥ 60) depression while in their remitted stage with healthy controls to elucidate the impact of a long-term course on memory. Logical memory from the Wechsler Memory Scale-Revised was administered to 67 patients with major depressive disorder (MDD) (47 with early-life depression and the remaining 20 with late-life depression) and 50 healthy controls. MDD patients received memory assessments at the time of their initial remission and at a follow-up three years after remission. At the time of initial remission, scores for logical memory were significantly lower in both patient groups compared to matched controls. At follow-up, memory dysfunction for early-life MDD patients disappeared, whereas scores in the late-life MDD group remained significantly lower than those of matched controls. All patients in the present study were on antidepressant medications. Our findings suggested that the progress of memory performance in late-life MDD patients may be different from early-life MDD patients. © 2013 Elsevier B.V. All rights reserved.
Levine, Deborah J; Glanville, Allan R; Aboyoun, Christina; Belperio, John; Benden, Christian; Berry, Gerald J; Hachem, Ramsey; Hayes, Don; Neil, Desley; Reinsmoen, Nancy L; Snyder, Laurie D; Sweet, Stuart; Tyan, Dolly; Verleden, Geert; Westall, Glen; Yusen, Roger D; Zamora, Martin; Zeevi, Adriana
2016-04-01
Antibody-mediated rejection (AMR) is a recognized cause of allograft dysfunction in lung transplant recipients. Unlike AMR in other solid-organ transplant recipients, there are no standardized diagnostic criteria or an agreed-upon definition. Hence, a working group was created by the International Society for Heart and Lung Transplantation with the aim of determining criteria for pulmonary AMR and establishing a definition. Diagnostic criteria and a working consensus definition were established. Key diagnostic criteria include the presence of antibodies directed toward donor human leukocyte antigens and characteristic lung histology with or without evidence of complement 4d within the graft. Exclusion of other causes of allograft dysfunction increases confidence in the diagnosis but is not essential. Pulmonary AMR may be clinical (allograft dysfunction which can be asymptomatic) or sub-clinical (normal allograft function). This consensus definition will have clinical, therapeutic and research implications. Copyright © 2016 International Society for Heart and Lung Transplantation. Published by Elsevier Inc. All rights reserved.
Lee, Dae-Hee; Lee, Chang-Rack; Jeon, Jin-Ho; Kim, Kyung-Ah; Bin, Seong-Il
2015-01-01
Graft extrusion after meniscus allograft transplantation (MAT) may be affected by horn fixation, which differs between medial and lateral MAT. Few studies have compared graft extrusion, especially sagittal extrusion, after medial and lateral MAT. In patients undergoing medial and lateral MAT, graft extrusion is likely similar and not correlated with postoperative Lysholm scores. Cohort study; Level of evidence, 2. Meniscus graft extrusion in the coronal and sagittal planes was compared in 51 knees undergoing medial MAT and 84 undergoing lateral MAT. Distances from the anterior and posterior articular cartilage margins to the anterior (anterior cartilage meniscus distance [ACMD]) and posterior (posterior cartilage meniscus distance [PCMD]) horns, respectively, were assessed on immediate postoperative magnetic resonance imaging and compared in patients undergoing medial and lateral MAT. Correlations between coronal and sagittal graft extrusion and between extrusion and the Lysholm score were compared in the 2 groups. In the coronal plane, mean absolute (4.3 vs 2.7 mm, respectively; P<.001) and relative (39% vs 21%, respectively; P<.001) graft extrusions were significantly greater for medial than lateral MAT. In the sagittal plane, mean absolute and relative ACMD and PCMD values were significantly greater for medial than lateral MAT (P<.001 each). For both medial and lateral MAT, mean absolute and relative ACMDs were significantly larger than PCMDs (P<.001 each). Graft extrusion>3 mm in the coronal plane was significantly more frequent in the medial (78%) than in the lateral (35%) MAT group. In the sagittal plane, the frequencies of ACMDs (72% vs 39%, respectively) and PCMDs (23% vs 4%, respectively) >3 mm were also significantly greater in the medial than in the lateral MAT group. Coronal and sagittal extrusions were not correlated with postoperative Lysholm scores for both medial and lateral MAT. The amount and incidence of graft extrusion were greater after medial than lateral MAT in both the coronal and sagittal planes. In the sagittal plane, graft extrusion was greater and more frequent on the anterior than the posterior horn in both medial and lateral MAT. However, graft extrusion was not correlated with early clinical outcomes after both medial and lateral MAT. © 2014 The Author(s).
Photoacoustic detection of neovascularities in skin graft
NASA Astrophysics Data System (ADS)
Yamazaki, Mutsuo; Sato, Shunichi; Saitoh, Daizo; Ishihara, Miya; Okada, Yoshiaki; Ashida, Hiroshi; Obara, Minoru
2005-04-01
We previously proposed a new method for monitoring adhesion of skin grafts by measuring the photoacoustic (PA) signal originating from neovascularities. In this study, immunohistochemical staining (IHC) with CD31 antibody was performed on grafted skin tissue to observe neovascularity, and the results were compared with PA signals. We also used laser Doppler imaging (LDI) to observe blood flow in the grafted skin, and the sensitivity of the PA measurement was compared with that of LDI. In rat autograft models, PA signals were measured for the grafted skin at postgrafting times of 0-48 h. At 6 h postgrafting, a PA signal was observed in the skin depth region of 500-600 μm, while the results of IHC showed that angiogenesis occurred at a depth of about 600 μm. The depths at which the PA signal and angiogenesis were observed decreased with postgrafting time. This indicates that the PA signal observed at 6 h postgrafting originated from the neovascularities in the skin graft. LDI showed no blood-originated signal before 48 h postgrafting. These findings suggest that PA measurement is effective in monitoring the adhesion of skin grafts at an early stage after transplantation.
Long-Term Pancreas Allograft Survival in Simultaneous Pancreas-Kidney Transplantation by Era
Waki, Kayo; Terasaki, Paul I.; Kadowaki, Takashi
2010-01-01
OBJECTIVE To determine whether short-term improvement in pancreas graft survival with simultaneous pancreas-kidney (SPK) transplants translated into improved long-term survival, then to examine the implications of that determination. RESEARCH DESIGN AND METHODS We analyzed data for 14,311 diabetic patients who received a first SPK transplant between October 1987 and November 2007, using Kaplan-Meier analysis for graft survival rates and Cox regression analysis for year-of-transplant effect. RESULTS Overall, from 1995 to 2004, 5-year pancreas graft survival stayed about the same (70–71%). Limiting analysis to grafts that survived more than 1 year, 5-year survival from 1987 to 2004 ranged from 80 to 84%. With 1987–1989 as reference, the adjusted hazard ratio for graft failure by year of transplant increased to 1.49 (95% CI 0.97–2.30) in 2000–2004. CONCLUSIONS Long-term pancreas graft survival has remained unchanged despite the dramatic decreases in technical failures and early acute rejection rates that have contributed to prolonged SPK graft survival. PMID:20460444
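The registry analysis above relies on Kaplan-Meier estimates of graft survival and Cox regression for the year-of-transplant effect. As a hedged sketch of that general workflow (not the authors' code or data), the example below uses the open-source lifelines package on a tiny invented dataset in which follow-up is in months and transplant era is an ordinal covariate.

```python
# Hedged sketch with the `lifelines` package on invented data: Kaplan-Meier
# graft survival plus a Cox model for a year-of-transplant (era) effect.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.DataFrame({
    "months": [12, 60, 34, 80, 5, 60, 47, 60, 22, 60],   # follow-up time
    "failed": [1,  0,  1,  0,  1, 0,  1,  0,  1,  0],    # graft failure event
    "era":    [0,  0,  1,  1,  2, 2,  3,  3,  4,  4],    # transplant era
})

kmf = KaplanMeierFitter()
kmf.fit(df["months"], event_observed=df["failed"])
print("Kaplan-Meier survival at 60 months:", float(kmf.predict(60)))

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="failed")
print(cph.hazard_ratios_)  # hazard ratio associated with 'era'
```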
Rejection of isolated pancreatic allografts in patients with diabetes.
Groth, C G; Lundgren, G; Arner, P; Collste, H; Hårdstedt, C; Lewander, R; Ostman, J
1976-12-01
Four patients with diabetes mellitus of juvenile onset but without uremia have been treated with segmental transplantation of the body and tail of the pancreas. The indications were hyperlabile diabetes or progressive loss of vision. The grafts were procured from cadaveric donors four to 16 minutes after circulatory arrest and were subsequently stored in the cold for approximately four hours. In one patient, the pancreatic duct was ligated, while in the other three, drainage was attained by suturing the transected end of the pancreas into a jejunal Roux-en-Y loop. Three of the grafts failed within six weeks as a result of irreversible rejection, and one graft failed because of the early onset of venous thrombosis. The first sign of graft rejection was an increase in the postprandial blood sugar level, an increase in the fasting blood sugar level occurring several days later. Neither hyperamylasemia nor fever was observed. Radioisotope scans and angiograms were of great value in establishing the diagnosis of graft rejection. All of the patients survived after graft removal.
Vesicoureteral Reflux in Kidney Transplantation.
Molenaar, Nina M; Minnee, Robert C; Bemelman, Frederike J; Idu, Mirza M
2017-06-01
Vesicoureteral reflux (VUR) is frequently found after transplantation, but its impact on graft function, urinary tract infection, and graft loss remains uncertain. Therefore our objective was to evaluate the effects of VUR on the outcome of renal transplantation. We included 1008 adult renal transplant recipients for whom a 1-week posttransplant voiding cystourethrogram was available. Study end points included occurrence of bacteriuria, renal function, and graft survival. In total, 106 (10.5%) of 1008 graft recipients had a diagnosis of VUR on voiding cystography. The incidence of bacteriuria was comparable in the reflux and nonreflux groups (17% vs 17.4%, P = .91). There was no significant difference in renal function at 3 months and 1 year in patients with and without VUR. One- and 5-year graft survival in patients with VUR was 85.8% and 82.1%, compared to 87.3% and 83.0% in patients without VUR (P = .68 and P = .80). Posttransplant VUR showed no correlation with early bacteriuria, renal function, or graft survival.
Schaffer, Joseph Christopher; Adib, Farshad; Cui, Quanjun
2014-06-01
Osteonecrosis (ON) of the femoral head, without timely intervention, often progresses to debilitating hip arthritis. Core decompression (CD) with bone grafting was used to treat patients with early-stage ON. In 3 cases, intraoperative fluctuations in oxygen saturation, end-tidal carbon dioxide, and/or vital signs were observed during insertion of the graft, a mixture of bone marrow and demineralized bone matrix. In 1 case, continued postoperative pulmonary symptoms required admission to intensive care. In this article, we describe these cases and provide supporting evidence that they were caused by fat emboli secondary to forceful insertion of bone graft. We review the literature and present complications data. Although no cases of fat emboli were reported as complications of any CD series with or without bone grafting, CD augmented with bone graft may carry risks not seen before in CD alone. Care should be taken to avoid these complications, possibly through technique modification.
Early renal function recovery and long-term graft survival in kidney transplantation.
Wan, Susan S; Cantarovich, Marcelo; Mucsi, Istvan; Baran, Dana; Paraskevas, Steven; Tchervenkov, Jean
2016-05-01
Following kidney transplantation (KTx), renal function improves gradually until a baseline eGFR is achieved. Whether or not a recipient achieves the best-predicted eGFR after KTx may have important implications for immediate patient management, as well as for long-term graft survival. The aim of this cohort study was to calculate the renal function recovery (RFR) based on recipient and donor eGFR and to evaluate the association between RFR and long-term death-censored graft failure (DCGF). We studied 790 KTx recipients between January 1990 and August 2014. The last donor SCr prior to organ procurement was used to estimate donor GFR. Recipient eGFR was calculated using the average of the best three SCr values observed during the first 3 months post-KTx. RFR was defined as the ratio of recipient eGFR to half the donor eGFR. 53% of recipients had an RFR ≥1. There were 127 death-censored graft failures (16%). Recipients with an RFR ≥1 had less DCGF compared with those with an RFR <1 (HR 0.56; 95% CI 0.37-0.85; P = 0.006). Transplant era, acute rejection, ECD and DGF were also significant determinants of graft failure. Early recovery of predicted eGFR based on donor eGFR is associated with less DCGF after KTx. © 2016 Steunstichting ESOT.
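The renal function recovery (RFR) ratio used above is defined directly in the abstract as recipient eGFR divided by half of the donor eGFR, with RFR ≥ 1 taken as achieving the predicted function. The following is a minimal sketch of that calculation; the function name and the example eGFR values are illustrative only.

```python
# Minimal sketch of the renal function recovery (RFR) ratio as defined in
# the abstract: recipient eGFR relative to half of the donor eGFR.
def renal_function_recovery(recipient_egfr: float, donor_egfr: float) -> float:
    """RFR = recipient eGFR / (0.5 * donor eGFR)."""
    if donor_egfr <= 0:
        raise ValueError("Donor eGFR must be positive")
    return recipient_egfr / (0.5 * donor_egfr)

# Hypothetical values (mL/min/1.73 m^2): a recipient reaching an eGFR of 55
# from a donor with an eGFR of 100 has RFR = 1.1, i.e. falls in the RFR >= 1 group.
rfr = renal_function_recovery(recipient_egfr=55, donor_egfr=100)
print(rfr, "- achieved predicted eGFR" if rfr >= 1 else "- below predicted eGFR")
```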
Finnerty, Celeste C; Capek, Karel D; Voigt, Charles; Hundeshagen, Gabriel; Cambiaso-Daniel, Janos; Porter, Craig; Sousse, Linda E; El Ayadi, Amina; Zapata-Sirvent, Ramon; Guillory, Ashley N; Suman, Oscar E; Herndon, David N
2017-09-01
Since the inception of the P50 Research Center in Injury and Peri-operative Sciences (RCIPS) funding mechanism, the National Institute of General Medical Sciences has supported a team approach to science. Many advances in critical care, particularly burns, have been driven by RCIPS teams. In fact, burns that were fatal in the early 1970s, prior to the inception of the P50 RCIPS program, are now routinely survived as a result of the P50-funded research. The advances in clinical care that led to the reduction in postburn death were made by optimizing resuscitation, incorporating early excision and grafting, bolstering acute care including support for inhalation injury, modulating the hypermetabolic response, augmenting the immune response, incorporating aerobic exercise, and developing antiscarring strategies. The work of the Burn RCIPS programs advanced our understanding of the pathophysiologic response to burn injury. As a result, the effects of a large burn on all organ systems have been studied, leading to the discovery of persistent dysfunction, elucidation of the underlying molecular mechanisms, and identification of potential therapeutic targets. Survival and subsequent patient satisfaction with quality of life have increased. In this review article, we describe the contributions of the Galveston P50 RCIPS that have changed postburn care and have considerably reduced postburn mortality.
Early matrix change of a nanostructured bone grafting substitute in the rat.
Xu, Weiguo; Holzhüter, Gerd; Sorg, Heiko; Wolter, Daniel; Lenz, Solvig; Gerber, Thomas; Vollmar, Brigitte
2009-11-01
A nanocrystalline bone substitute embedded in a highly porous silica gel matrix (NanoBone) has previously been shown to bridge bone defects by an organic matrix. As the initial host response to the bone graft substitute might be a determinant for subsequent bone formation, our present purpose was to characterize the early tissue reaction to this biomaterial. After implantation of 80 mg of NanoBone into the adipose neck tissue of a total of 35 rats, grafts were harvested for subsequent analysis at days 3, 6, 9, 12, and 21. The biomaterial was found encapsulated by granulation tissue, which partly penetrated the implant at day 3 and completely pervaded the graft by day 12 after implantation. Histology revealed tartrate-resistant acid phosphatase (TRAP)-positive giant cells covering the biomaterial. ED1 (CD68) immunopositivity of these cells further indicated their osteoclast-like phenotype. Scanning electron microscopy revealed organic tissue components within the periphery of the graft as early as day 9, whereas the central hematoma region still presented the silica surface of the biomaterial. Energy dispersive X-ray spectroscopy further demonstrated that the silica gel was degraded faster in the peripheral granulation tissue than in the central hematoma and was replaced by organic host components by day 12. In conclusion, the silica gel matrix is rapidly replaced by carbohydrate macromolecules. This might represent a key step in the process of graft degradation on its way toward induction of bone formation. The unique composition and structure of this nanoscaled biomaterial seem to support its degradation by host osteoclast-like giant cells.
Is the age at menopause a cause of sexual dysfunction? A Brazilian population-based study.
Lett, Caio; Valadares, Ana L R; Baccaro, Luiz F; Pedro, Adriana O; Filho, Jeffrey L; Lima, Marcelo; Costa-Paiva, Lucia
2018-01-01
The aim of the study was to evaluate the association between age at menopause and sexual dysfunction, as well as the components of sexual function, in postmenopausal women. In this cross-sectional population-based study, data from 540 women aged 45 to 60 years on their age at menopause and its association with sexual dysfunction (evaluated using the Short Personal Experiences Questionnaire) were obtained through interviews. We assessed the data for associations between age at menopause and sexual dysfunction and demographic, behavioral, and clinical characteristics. Age at menopause was not associated with sexual dysfunction. Arousal dysfunction was the only component of sexual function that was associated with premature ovarian insufficiency (POI) and early menopause (P = 0.01). It was reported by 64.2% of women with POI (menopause at <40 y), compared with rates of 50% and 45.6% in women with menopause at 40 to 45 and >45 years, respectively (P = 0.04). In women with POI or early menopause, Poisson regression analysis showed that having a partner with sexual problems (prevalence ratio [PR] = 6.6; 95% CI: 3.3-13.2; P < 0.001) and dyspareunia (PR = 3.9; 95% CI: 1.8-8.2; P = 0.0005) were factors associated with arousal dysfunction. Satisfaction with the partner as a lover (PR = 0.4; 95% CI: 0.2-0.7; P = 0.002) was protective against arousal dysfunction. Arousal dysfunction was associated with early menopause and POI. The major factors affecting this association were having a partner with sexual problems, dyspareunia, and lack of satisfaction with the partner as a lover. These findings highlight the importance of evaluating partner problems and improving lubrication in these groups of women.
A Rare Cause of Hypothalamic Obesity, Rohhad Syndrome: 2 Cases.
Şiraz, Ülkü Gül; Okdemir, Deniz; Direk, Gül; Akın, Leyla; Hatipoğlu, Nihal; Kendırcı, Mustafa; Kurtoğlu, Selim
2018-03-19
Rapid-onset obesity with hypoventilation, hypothalamic dysfunction, and autonomic dysregulation (ROHHAD) syndrome is a rare disease that is difficult to diagnose and to distinguish from genetic obesity syndromes. The underlying causes of the disease have not been fully explained. Hypothalamic dysfunction causes endocrine problems, respiratory dysfunction, and autonomic alterations. Only around 80 patients have been reported, owing to a lack of recognition. We present two female patients suspected of ROHHAD because of weight gain since early childhood. The presenting symptoms, including respiratory and circulatory dysfunction, hypothalamic hypernatremia, and hypothalamic-pituitary hormonal disorders such as central hypothyroidism, hyperprolactinemia, and central precocious puberty, completely matched the criteria of ROHHAD syndrome. ROHHAD syndrome should be considered in the differential diagnosis, since it is difficult to distinguish from causes of monogenic obesity. Early identification of the disease reduces the morbidity of the syndrome, and patients require regular follow-up with a multidisciplinary approach.
A Scale of Socioemotional Dysfunction in Frontotemporal Dementia
Barsuglia, Joseph P.; Kaiser, Natalie C.; Wilkins, Stacy Schantz; Joshi, Aditi; Barrows, Robin J.; Paholpak, Pongsatorn; Panchal, Hemali Vijay; Jimenez, Elvira E.; Mather, Michelle J.; Mendez, Mario F.
2014-01-01
Early social dysfunction is a hallmark symptom of behavioral variant frontotemporal dementia (bvFTD); however, validated measures for assessing social deficits in dementia are needed. The purpose of the current study was to examine the utility of a novel informant-based measure of social impairment, the Socioemotional Dysfunction Scale (SDS), in early-onset dementia. Sixteen bvFTD and 18 early-onset Alzheimer’s disease (EOAD) participants received standard clinical neuropsychological measures and neuroimaging. Caregiver informants were administered the SDS. Individuals with bvFTD exhibited greater social dysfunction on the SDS compared with the EOAD group; t(32) = 6.32, p < .001. The scale demonstrated preliminary evidence for discriminating these frequently misdiagnosed groups (area under the curve = 0.920, p < .001) and internal consistency (α = 0.977). The SDS demonstrated initial evidence as an effective measure for detecting abnormal social behavior and discriminating bvFTD from EOAD. Future validation is recommended in larger and more diverse patient groups. PMID:25331776
Vignos, Michael F; Kaiser, Jarred M; Baer, Geoffrey S; Kijowski, Richard; Thelen, Darryl G
2018-05-10
Abnormal knee mechanics may contribute to early cartilage degeneration following anterior cruciate ligament reconstruction. Anterior cruciate ligament graft geometry has previously been linked to abnormal tibiofemoral kinematics, suggesting this parameter may be important in restoring normative cartilage loading. However, the relationship between graft geometry and cartilage contact is unknown. Static MR images were collected and segmented for eighteen subjects to obtain bone, cartilage, and anterior cruciate ligament geometries for their reconstructed and contralateral knees. The footprint locations and orientation of the anterior cruciate ligament were calculated. Volumetric, dynamic MR imaging was also performed to measure tibiofemoral kinematics, cartilage contact location, and contact sliding velocity while subjects performed loaded knee flexion-extension. Multiple linear regression was used to determine the relationship between non-anatomic graft geometry and asymmetric knee mechanics. Non-anatomic graft geometry was related to asymmetric knee mechanics, with the sagittal plane graft angle being the best predictor of asymmetry. A more vertical sagittal graft angle was associated with greater anterior tibial translation (β = 0.11 mm/deg, P = 0.049, R² = 0.22), internal tibial rotation (β = 0.27 deg/deg, P = 0.042, R² = 0.23), and adduction angle (β = 0.15 deg/deg, P = 0.013, R² = 0.44) at peak knee flexion. A non-anatomic sagittal graft orientation was also linked to asymmetries in tibial contact location and sliding velocity on the medial (β = -4.2 mm/s per deg, P = 0.002, R² = 0.58) and lateral tibial plateaus (β = 5.7 mm/s per deg, P = 0.006, R² = 0.54). This study provides evidence that non-anatomic graft geometry is linked to asymmetric knee mechanics, suggesting that restoring native anterior cruciate ligament geometry may be important to mitigate the risk of early cartilage degeneration in these patients. Copyright © 2018 Elsevier Ltd. All rights reserved.
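The β coefficients and R² values quoted above summarize regressions of kinematic asymmetry on graft geometry. As a hedged illustration only (the study used multiple linear regression on dynamic MR data; the single-predictor fit and the data points below are hypothetical), a slope in mm/deg and the coefficient of determination can be obtained as follows:

```python
# Illustrative single-predictor least-squares fit: sagittal graft-angle asymmetry (deg)
# vs. anterior tibial translation asymmetry (mm). Data are hypothetical.
import numpy as np

graft_angle_deg = np.array([2.0, 5.0, 8.0, 11.0, 15.0, 20.0])
ant_translation_mm = np.array([0.3, 0.6, 0.9, 1.4, 1.5, 2.3])

beta, intercept = np.polyfit(graft_angle_deg, ant_translation_mm, deg=1)  # slope, offset
predicted = beta * graft_angle_deg + intercept
ss_res = float(np.sum((ant_translation_mm - predicted) ** 2))        # residual sum of squares
ss_tot = float(np.sum((ant_translation_mm - ant_translation_mm.mean()) ** 2))
r_squared = 1.0 - ss_res / ss_tot

print(f"beta = {beta:.2f} mm/deg, R^2 = {r_squared:.2f}")
```

A β of 0.11 mm/deg, for instance, is read as roughly 0.11 mm of additional anterior tibial translation asymmetry per degree of graft-angle deviation.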
Adam, Frank; Pape, Dietrich; Schiel, Karin; Steimer, Oliver; Kohn, Dieter; Rupp, Stefan
2004-01-01
Reliable fixation of the soft hamstring grafts in ACL reconstruction has been reported as problematic. The biomechanical properties of patellar tendon (PT) grafts fixed with biodegradable screws (PTBS) are superior compared to quadrupled hamstring grafts fixed with BioScrew (HBS) or Suture-Disc fixation (HSD). Controlled laboratory study with roentgen stereometric analysis (RSA). Ten porcine specimens were prepared for each group. In the PT group, the bone plugs were fixed with a 7 x 25 mm BioScrew. In the hamstring group, four-stranded tendon grafts were anchored within a tibial tunnel of 8 mm diameter either with a 7 x 25 mm BioScrew or eight polyester sutures knotted over a Suture-Disc. The grafts were loaded stepwise, and micromotion of the graft inside the tibial tunnel was measured with RSA. Hamstring grafts failed at lower loads (HBS: 536 N, HSD: 445 N) than the PTBS grafts (658 N). Stiffness in the PTBS group was much greater compared to the hamstring groups (3500 N/mm versus HBS = 517 N/mm and HSD = 111 N/mm). Irreversible graft motion after graft loading with 200 N was measured at 0.03 mm (PTBS), 0.38 mm (HBS), and 1.85 mm (HSD). Elasticity for the HSD fixation was measured at 0.67 mm at 100 N and 1.32 mm at 200 N load. Hamstring graft fixation with BioScrew and Suture-Disc displayed less stiffness and early graft motion compared to PTBS fixation. Screw fixation of tendon grafts is superior to Suture-Disc fixation with linkage material since it offers greater stiffness and less graft motion inside the tibial tunnel. Our results revealed graft motion for hamstring fixation with screw or linkage material at loads that occur during rehabilitation. This, in turn, may lead to graft laxity.
[Lateral column lengthening osteotomy of calcaneus].
Hintermann, B
2015-08-01
Objective: Lengthening of the lateral column for adduction of the forefoot and restoration of the medial arch; stabilization of the ankle joint complex. Indications: Supple flatfoot deformity (posterior tibial tendon dysfunction stage II); instability of the medial ankle joint complex (superficial deltoid and spring ligament); posttraumatic valgus and pronation deformity of the foot. Contraindications: Rigid flatfoot deformity (posterior tibial tendon dysfunction stage III and IV); talocalcaneal and naviculocalcaneal coalition; osteoarthritis of the calcaneocuboid joint. Surgical technique: Exposure of the calcaneus at the sinus tarsi; osteotomy through the sinus tarsi and widening until the desired correction of the foot is achieved; insertion of bone graft; screw fixation. Postoperative management: Immobilization in a cast for 6 weeks; weight-bearing as tolerated from the beginning. Results: In the majority of cases, part of a hindfoot reconstruction; reliable and stable correction; a safe procedure with few complications.
Corporoplasty using buccal mucosa graft in Peyronie disease: is it a first choice?
Zucchi, Alessandro; Silvani, Mauro; Pastore, Antonio Luigi; Fioretti, Fabrizio; Fabiani, Andrea; Villirillo, Tommaso; Costantini, Elisabetta
2015-03-01
To assess the surgical and functional efficacy of corporoplasty with buccal mucosa graft, patient and partner satisfaction, and the low cost of this operation. Biocompatible tissues are frequently used during corporoplasty, but they are expensive and often do not match the thickness and elasticity of the tunica albuginea, leading to fibrosis and scar retraction. Buccal mucosa graft is not usually emphasized in review articles, and clinical studies are limited. Thirty-two patients with stable disease and normal erections were included in this retrospective study. All patients underwent corporoplasty with plaque incision and buccal mucosa graft. Preoperative International Index of Erectile Function (IIEF) questionnaires and penile duplex ultrasonography with measurement of curvature were performed. At 6 and 12 months postoperatively, patients answered the IIEF and the Patient Global Impression of Improvement questionnaires. Patient and partner satisfaction were recorded at all subsequent visits. Thirty-two patients underwent corporoplasty between 2006 and 2013, and no major complications developed in any patient. After 1 year, curvature relapse was present in 1 patient (3.5%), and 1 patient had slight erectile dysfunction. IIEF values had significantly improved 1 year after surgery (P = .031). Patient satisfaction was 85% on the Patient Global Impression of Improvement questionnaire. Twenty-five of 28 partners were satisfied (90%). Data analysis confirmed the stability of the IIEF score in 16 patients after 2 years (mean IIEF score, 21.3). Corporoplasty with buccal mucosa graft is easy to perform and represents a good treatment choice for most forms of Peyronie disease with curvature preventing penetration and sexual intercourse. Copyright © 2015 Elsevier Inc. All rights reserved.
Soliman, A T; Adel, A; Soliman, N A; Elalaily, R; De Sanctis, V
2015-01-01
AIMS OF REVIEW: The intent of the current manuscript is to critically review the studies on pituitary gland dysfunction in early childhood following traumatic brain injury (TBI), in comparison with those in adults. Search of the literature: The MEDLINE database was accessed through PubMed in April 2015. Results were restricted to articles published in English during the past 15 years. Both transient and permanent hypopituitarism are not uncommon after TBI. Pituitary dysfunction early after TBI differs from that occurring weeks to months later. Growth hormone deficiency (GHD) and alterations in puberty are the most common. One or more years after TBI, pituitary dysfunction tends to improve in some patients but may deteriorate in others. GH deficiency, hypogonadism, and thyroid dysfunction are the most common permanent lesions. Many of the symptoms of these endocrine defects can pass unnoticed because of the psychomotor defects associated with TBI, such as depression and apathy. Unfortunately, pituitary dysfunction appears to negatively affect psycho-neuro-motor recovery as well as the growth and pubertal development of children and adolescents after TBI. Therefore, the current review highlights the importance of closely following patients, especially children and adolescents, for growth and for other symptoms and signs suggestive of endocrine dysfunction. In addition, all patients should be screened serially for possible endocrine disturbances early after TBI as well as a few months to a year after the injury. Risk factors for pituitary dysfunction after TBI include relatively severe TBI (Glasgow Coma Scale score < 10 and MRI showing damage to the hypothalamic-pituitary area), diffuse brain swelling, and the occurrence of hypotensive and/or hypoxic episodes. There is a considerable risk of developing pituitary dysfunction after TBI in children and adolescents. These patients should be clinically followed and screened for these abnormalities according to an agreed protocol of investigations. Further multicenter and multidisciplinary prospective studies are required to explore in detail the occurrence of permanent pituitary dysfunction after TBI in larger numbers of children with TBI. This requires considerable organisation and communication between many disciplines, such as neurosurgery, neurology, endocrinology, rehabilitation, and developmental paediatrics.
The Origin of New-Onset Diabetes After Liver Transplantation: Liver, Islets, or Gut?
Ling, Qi; Xu, Xiao; Wang, Baohong; Li, Lanjuan; Zheng, Shusen
2016-04-01
New-onset diabetes is a frequent complication after solid organ transplantation. Although a number of common factors are associated with the disease, including recipient age, body mass index, hepatitis C infection, and use of immunosuppressive drugs, new-onset diabetes after liver transplantation (NODALT) has the following unique aspects and thus needs to be considered its own entity. First, a liver graft becomes the patient's primary metabolic regulator after liver transplantation, but this would not be the case for kidney or other grafts. The metabolic states, as well as the genetics of the graft, play crucial roles in the development of NODALT. Second, dysfunction of the islets of Langerhans is common in cirrhotic patients and would be exacerbated by immunosuppressive agents, particularly calcineurin inhibitors. On the other hand, minimized immunosuppressive protocols have been widely advocated in liver transplantation because of liver tolerance (immune privilege). Third and last, through the "gut-liver axis," graft function is closely linked to gut microbiota, which is now considered an important metabolic organ and known to independently influence the host's metabolic homeostasis. Liver transplant recipients present with specific gut microbiota that may be prone to trigger metabolic disorders. In this review, we proposed 3 possible sites for the origin of NODALT, which are liver, islets, and gut, to help elucidate the underlying mechanism of NODALT.
Estévez, Ana; Jauregui, Paula; Ozerinjauregi, Nagore; Herrero-Fernández, David
2017-01-01
Child abuse affects people's ways of thinking, feeling, and observing the world, resulting in dysfunctional beliefs and maladaptive schemas. Thus, consequences of child abuse may persist during adulthood. Therefore, the aim of this study was to analyze the psychological consequences (anxiety, phobic anxiety, depression, and hopelessness) of different types of maltreatment (physical, sexual, and emotional abuse and physical and emotional neglect) and to study the role of early maladaptive schemas in the onset of symptomatology in adult female victims of child abuse. The sample consisted of 75 women referred by associations for treatment of abuse and maltreatment in childhood. Sexual abuse was the type of maltreatment that was most strongly related to most dysfunctional symptomatology, followed by emotional abuse and physical abuse, whereas physical neglect was the least related. Also, early maladaptive schemas were found to correlate with child abuse and dysfunctional symptomatology. Finally, early maladaptive schemas mediated the relationship between sexual abuse and dysfunctional symptomatology when the effect of other types of abuse was controlled. These results may provide important guidance for clinical intervention.
Loneliness and Sexual Dysfunctions.
ERIC Educational Resources Information Center
Mijuskovic, Ben
1987-01-01
Argues that sexual dysfunctions result from early childhood experiences which were originally nonsexual in nature. Contends that psychological difficulties centered around problems of loneliness tend to generate certain sexual dysfunctions. Extends and explores suggestion that genesis of sexual conflicts is in nonsexual infant separation anxiety…
Campistol, Josep M; Cockwell, Paul; Diekmann, Fritz; Donati, Donato; Guirado, Luis; Herlenius, Gustaf; Mousa, Dujanah; Pratschke, Johann; San Millán, Juan Carlos Ruiz
2009-07-01
m-TOR inhibitors (e.g. sirolimus) are well-tolerated immunosuppressants used in renal transplantation for prophylaxis of organ rejection, and are associated with long-term graft survival. Early use of sirolimus is often advocated by clinicians, but this may be associated with a number of side-effects including impaired wound-healing, lymphoceles and delayed graft function. As transplant clinicians with experience in the use of sirolimus, we believe such side-effects can be limited by tailored clinical management. We present recommendations based on published literature and our clinical experience. Furthermore, guidance is provided on sirolimus use during surgery, both at transplantation and for subsequent operations.
Katsargyris, Athanasios; Marques de Marino, Pablo; Mufty, Hozan; Pedro, Luis Mendes; Fernandes, Ruy; Verhoeven, Eric L G
2018-05-01
Visceral arteries in fenestrated and branched endovascular repair (F/BEVAR) have been addressed by fenestrations or directional side branches. Inner branches, as used in the arch branched device, could provide an extra option for visceral arteries "unsuitable" for fenestrations or directional side branches. Early experience with the use of inner branches for visceral arteries in F/BEVAR is described. All consecutive patients treated by F/BEVAR for complex abdominal aortic aneurysm (AAA) or thoraco-abdominal aneurysm (TAAA) using stent grafts with inner branches were included. Data were collected prospectively. Thirty-two patients (28 male, mean age 71.6 ± 8.3 years) were included. Seven (21.9%) patients had a complex AAA and 25 (78.1%) had a TAAA. A stent graft with inner branches only was used in four (12.5%) patients. The remaining 28 (87.5%) patients received a stent graft with fenestrations and inner branches. In total 52 vessels were targeted with inner branches. Technical success was achieved in all 32 (100%) patients. All 38 inner branch target vessels in grafts including fenestrations and inner branches were instantly catheterised (<1 minute), whereas catheterisation of target vessels in "inner branch only" grafts proved more difficult (<1 minute, n = 3; 1-3 min, n = 4; and >3 min, n = 7). The 30 day operative mortality was 3.1% (1/32). Estimated survival at 1 year was 80.0% ± 8.3%. During follow-up, four renal inner branches occluded in three patients. The estimated inner branch target vessel stent patency at 1 year was 91.9 ± 4.5%. The estimated freedom from re-intervention at 1 year was 78.4% ± 8.9%. Early data suggest that visceral inner branches might represent a feasible third option to address selected target vessels in F/BEVAR. Stent grafts with inner branch(es) in combination with fenestrations seem to be a better configuration than stent grafts with inner branches alone. Durability of the inner branch design needs further investigation. Copyright © 2018 European Society for Vascular Surgery. Published by Elsevier B.V. All rights reserved.
P Waiker, Veena; Shivalingappa, Shanthakumar
2015-01-01
Platelet-rich plasma is known for its hemostatic, adhesive, and healing properties, owing to the multiple growth factors released from the platelets at the site of the wound. The primary objective of this study was to use autologous platelet-rich plasma (PRP) on wound beds for anchorage of skin grafts instead of conventional methods such as sutures, staples, or glue. In a single-center, randomized, controlled, prospective study of nine months' duration, 200 patients with wounds were divided into two equal groups. Autologous PRP was applied to the wound beds in the PRP group, whereas conventional methods such as staples or sutures were used to anchor the skin grafts in the control group. Instant graft adherence to the wound bed was significantly more frequent in the PRP group. Time to first post-graft inspection was delayed, and hematoma, graft edema, discharge from the graft site, frequency of dressings, and duration of stay in the plastic surgery unit were significantly lower in the PRP group. Autologous PRP ensured instant skin graft adherence to the wound bed in comparison with conventional methods of anchorage. Hence, we recommend the routine use of autologous PRP on wounds prior to resurfacing to ensure the benefits of early healing.
PIRCHE-II Is Related to Graft Failure after Kidney Transplantation
Geneugelijk, Kirsten; Niemann, Matthias; Drylewicz, Julia; van Zuilen, Arjan D.; Joosten, Irma; Allebes, Wil A.; van der Meer, Arnold; Hilbrands, Luuk B.; Baas, Marije C.; Hack, C. Erik; van Reekum, Franka E.; Verhaar, Marianne C.; Kamburova, Elena G.; Bots, Michiel L.; Seelen, Marc A. J.; Sanders, Jan Stephan; Hepkema, Bouke G.; Lambeck, Annechien J.; Bungener, Laura B.; Roozendaal, Caroline; Tilanus, Marcel G. J.; Vanderlocht, Joris; Voorter, Christien E.; Wieten, Lotte; van Duijnhoven, Elly M.; Gelens, Mariëlle; Christiaans, Maarten H. L.; van Ittersum, Frans J.; Nurmohamed, Azam; Lardy, Junior N. M.; Swelsen, Wendy; van der Pant, Karlijn A.; van der Weerd, Neelke C.; ten Berge, Ineke J. M.; Bemelman, Fréderike J.; Hoitsma, Andries; van der Boog, Paul J. M.; de Fijter, Johan W.; Betjes, Michiel G. H.; Heidt, Sebastiaan; Roelen, Dave L.; Claas, Frans H.; Otten, Henny G.; Spierings, Eric
2018-01-01
Individual HLA mismatches may differentially impact graft survival after kidney transplantation. Therefore, there is a need for a reliable tool to define permissible HLA mismatches in kidney transplantation. We previously demonstrated that donor-derived Predicted Indirectly ReCognizable HLA Epitopes presented by recipient HLA class II (PIRCHE-II) play a role in de novo donor-specific HLA antibody formation after kidney transplantation. In the present Dutch multi-center study, we evaluated the possible association between PIRCHE-II and kidney graft failure in 2,918 donor–recipient pairs transplanted between 1995 and 2005. For these donor–recipient pairs, PIRCHE-II numbers were related to graft survival in univariate and multivariable analyses. Adjusted for confounders, the natural logarithm of PIRCHE-II was associated with a higher risk for graft failure [hazard ratio (HR): 1.13, 95% CI: 1.04–1.23, p = 0.003]. When analyzing a subgroup of patients who had their first transplantation, the HR of graft failure for ln(PIRCHE-II) was higher compared with the overall cohort (HR: 1.22, 95% CI: 1.10–1.34, p < 0.001). PIRCHE-II demonstrated both early and late effects on graft failure in this subgroup. These data suggest that PIRCHE-II may impact graft survival after kidney transplantation. Inclusion of PIRCHE-II in donor-selection criteria may eventually lead to improved kidney graft survival. PMID:29556227
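The hazard ratio above is reported per unit increase in ln(PIRCHE-II), so translating it to the raw PIRCHE-II scale takes one extra step: a k-fold increase in PIRCHE-II raises ln(PIRCHE-II) by ln(k), giving HR_kfold = 1.13^ln(k). The short Python sketch below is a worked-arithmetic illustration only, not part of the study's analysis:

```python
# Convert a hazard ratio expressed per unit of ln(PIRCHE-II) into hazard ratios
# per k-fold increase of the raw PIRCHE-II score (illustrative arithmetic only).
import math

hr_per_unit_log = 1.13  # overall-cohort estimate reported in the abstract

for k in (2, 10):
    hr_kfold = hr_per_unit_log ** math.log(k)  # a k-fold increase adds ln(k) units on the log scale
    print(f"{k}-fold increase in PIRCHE-II -> HR ~ {hr_kfold:.2f}")
```

With the overall-cohort estimate of 1.13, a doubling of PIRCHE-II corresponds to an HR of about 1.09 and a 10-fold increase to about 1.33.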
Sigdel, Tara K.; Salomonis, Nathan; Nicora, Carrie D.; Ryu, Soyoung; He, Jintang; Dinh, Van; Orton, Daniel J.; Moore, Ronald J.; Hsieh, Szu-Chuan; Dai, Hong; Thien-Vu, Minh; Xiao, Wenzhong; Smith, Richard D.; Qian, Wei-Jun; Camp, David G.; Sarwal, Minnie M.
2014-01-01
Early transplant dysfunction and failure because of immunological and nonimmunological factors still presents a significant clinical problem for transplant recipients. A critical unmet need is the noninvasive detection and prediction of immune injury such that acute injury can be reversed by proactive immunosuppression titration. In this study, we used iTRAQ -based proteomic discovery and targeted ELISA validation to discover and validate candidate urine protein biomarkers from 262 renal allograft recipients with biopsy-confirmed allograft injury. Urine samples were randomly split into a training set of 108 patients and an independent validation set of 154 patients, which comprised the clinical biopsy-confirmed phenotypes of acute rejection (AR) (n = 74), stable graft (STA) (n = 74), chronic allograft injury (CAI) (n = 58), BK virus nephritis (BKVN) (n = 38), nephrotic syndrome (NS) (n = 8), and healthy, normal control (HC) (n = 10). A total of 389 proteins were measured that displayed differential abundances across urine specimens of the injury types (p < 0.05) with a significant finding that SUMO2 (small ubiquitin-related modifier 2) was identified as a “hub” protein for graft injury irrespective of causation. Sixty-nine urine proteins had differences in abundance (p < 0.01) in AR compared with stable graft, of which 12 proteins were up-regulated in AR with a mean fold increase of 2.8. Nine urine proteins were highly specific for AR because of their significant differences (p < 0.01; fold increase >1.5) from all other transplant categories (HLA class II protein HLA-DRB1, KRT14, HIST1H4B, FGG, ACTB, FGB, FGA, KRT7, DPP4). Increased levels of three of these proteins, fibrinogen beta (FGB; p = 0.04), fibrinogen gamma (FGG; p = 0.03), and HLA DRB1 (p = 0.003) were validated by ELISA in AR using an independent sample set. The fibrinogen proteins further segregated AR from BK virus nephritis (FGB p = 0.03, FGG p = 0.02), a finding that supports the utility of monitoring these urinary proteins for the specific and sensitive noninvasive diagnosis of acute renal allograft rejection. PMID:24335474
Puskas, J D; Winton, T L; Miller, J D; Scavuzzo, M; Patterson, G A
1992-05-01
Single lung transplantation remains limited by a severe shortage of suitable donor lungs. Potential lung donors are often deemed unsuitable because accepted criteria (both lungs clear on the chest roentgenogram, arterial oxygen tension greater than 300 mm Hg with an inspired oxygen fraction of 1.0, a positive end-expiratory pressure of 5 cm H2O, and no purulent secretions) do not distinguish between unilateral and bilateral pulmonary disease. Many adequate single lung grafts may be discarded as a result of contralateral aspiration or pulmonary trauma. We have recently used intraoperative unilateral ventilation and perfusion to assess single lung function in potential donors with contralateral lung disease. In the 11-month period ending October 1, 1990, we performed 18 single lung transplants. In four of these cases (22%), the donor chest roentgenogram or bronchoscopic examination demonstrated significant unilateral lung injury. Donor arterial oxygen tension, (inspired oxygen fraction 1.0; positive end-expiratory pressure 5 cm H2O) was below the accepted level in each case (246 +/- 47 mm Hg, mean +/- standard deviation). Through the sternotomy used for multiple organ harvest, the pulmonary artery to the injured lung was clamped. A double-lumen endotracheal tube or endobronchial balloon occlusion catheter was used to permit ventilation of the uninjured lung alone. A second measurement of arterial oxygen tension (inspired oxygen fraction 1.0; positive end-expiratory pressure 5 cm H2O) revealed excellent unilateral lung function in all four cases (499.5 +/- 43 mm Hg; p less than 0.0004). These single lung grafts (three right, one left) were transplanted uneventfully into four recipients (three with pulmonary fibrosis and one with primary pulmonary hypertension). Lung function early after transplantation was adequate in all patients. Two patients were extubated within 24 hours. There were two late deaths, one caused by rejection and Aspergillus infection and the other caused by cytomegalovirus 6 months after transplantation. Two patients are alive and doing well. We conclude that assessment of unilateral lung function in potential lung donors is indicated in selected cases, may be quickly and easily performed, and may significantly increase the availability of single lung grafts.
Bunnapradist, Suphamai; Gritsch, H Albin; Peng, Alice; Jordan, Stanley C; Cho, Yong W
2003-04-01
The current organ shortage has led to the utilization of double kidney transplants from marginal adult donors, but outcomes data are limited. The United Network for Organ Sharing registry database was used to compare the outcomes of 403 dual adult kidney transplantations (DKT) and 11,033 single kidney transplantations (SKT) from 1997 to 2000. Graft and patient survival and the effect of multiple risk factors were evaluated. It was found that DKT patients were older, less sensitized, and received grafts from older, more mismatched donors with longer cold ischemia times. There was also a greater percentage of donors with a history of diabetes or hypertension and African-American recipients and donors in the DKT group. Graft survival was inferior in the DKT group, with a 7% lower graft survival rate at 1 yr. There was a higher incidence of primary nonfunction in the DKT group, although the incidence of delayed graft function, early rejection treatment, and graft thrombosis did not differ. Multivariate analysis was used to identify African-American recipient ethnicity and retransplant as risk factors for graft loss. Graft survival was comparable in DKT and SKT with donors over 55 yr of age. DKT resulted in inferior graft outcomes compared with SKT. When compared with SKT with donors over 55 yr of age, DKT resulted in similar graft outcomes. These otherwise discarded kidneys should be cautiously considered as a source of marginal donors.
Transplantation after ex vivo lung perfusion: A midterm follow-up.
Wallinder, Andreas; Riise, Gerdt C; Ricksten, Sven-Erik; Silverborn, Martin; Dellgren, Göran
2016-11-01
A large proportion of donor lungs are discarded due to known or presumed organ dysfunction. Ex vivo lung perfusion (EVLP) has proven its value as a tool for discrimination between reversible and irreversible donor lung pathology. However, the long-term outcome after transplantation of lungs after EVLP is essentially unknown. We report short-term and midterm outcomes of recipients who received transplants of EVLP-evaluated lungs. Single-center results of recipients of lungs with prior EVLP were compared with consecutive recipients of non-EVLP lungs (controls) during the same period. Short-term follow-up included time to extubation, time in the intensive care unit, and the presence of primary graft dysfunction at 72 hours postoperatively. Mortality and incidence of chronic lung allograft dysfunction were monitored for up to 4 years after discharge. During a 4-year period, 32 pairs of initially rejected donor lungs underwent EVLP. After EVLP, 22 double lungs and 5 single lungs were subsequently transplanted. During this period, 145 patients received transplants of conventional donor lungs that did not have EVLP and constituted the control group. Median time to extubation was 7 hours in the EVLP group and 6 hours in the non-EVLP control group (p = 0.45). Median intensive care unit stay was 4 days vs. 3 days, respectively (p = 0.15). Primary graft dysfunction grade > 1 was present in 14% in the EVLP group and in 12% in the non-EVLP group at 72 hours after transplant. Survival at 1 year was 92% in the EVLP group and 79% in the non-EVLP group. Cumulative survival and freedom from retransplantation or chronic rejection were also comparable between the 2 groups (p = 0.43) when monitored up to 4 years. Selected donor lungs rejected for transplantation can be used after EVLP. This technique is effective for selection of transplantable donor lungs. Patients who received lungs evaluated under EVLP have short-term and midterm outcomes comparable to recipients of non-EVLP donor lungs. Copyright © 2016 International Society for Heart and Lung Transplantation. Published by Elsevier Inc. All rights reserved.
[Localized purpura revealing vascular prosthetic graft infection].
Boureau, A S; Lescalie, F; Cassagnau, E; Clairand, R; Connault, J
2013-07-01
Prosthetic graft infection after vascular reconstruction is a rare but serious complication. We report a case of infection occurring late after implantation of an iliofemoral prosthetic vascular graft. The Staphylococcus aureus infection was revealed by vascular purpura localized on the right leg 7 years after implantation of a vascular prosthesis. This case illustrates an uncommonly late clinical manifestation presenting as an acute infection 7 years after the primary operation. In this situation, the presentation differs from early infection, which generally occurs within the first four postoperative months. Diagnosis and treatment remain a difficult challenge because prosthetic graft infection is a potentially life-threatening complication. Morbidity and mortality rates are high. Here we detail specific aspects of the clinical and radiological presentation. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
[Polish vascular prosthesis sealed by albumin].
Raczyński, K; Dyczka, A; Gawlikowska, Z
1992-01-01
TRICOMED, Medical Articles, has been conducting research on textile vascular prostheses for over 35 years. In the seventies, a collagen-coated graft was designed and fabricated which, despite positive evaluation, eventually ceased to be used in clinics. In the eighties, the double-velour DALLON® prosthesis was introduced, which is now marketed by our company. The DALLON® graft has a low mass, a developed surface area, and moderate porosity. In the eighties, foreign manufacturers launched various types of coated grafts. Responding to that market trend, we resumed our early research on preclotted prostheses and started studies on the use of albumin and chitosan as coating agents. Grafts impregnated with albumin have achieved satisfactory results in biological and experimental testing. These results are confirmed by current clinical examination.
[Considerations on family dynamics and the malnutrition syndrome in Mexican children].
Vásquez-Garibay, Edgar Manuel; González-Rico, José Luis; Romero-Velarde, Enrique; Sánchez-Talamantes, Eva; Navarro-Lozano, María Eugenia; Nápoles-Rodríguez, Francisco
2015-01-01
Since the early 1990s we have noted that family dysfunction was more common in children with severe primary malnutrition than in children admitted to the hospital without malnutrition. Defects in feeding habits during the first year of life, especially early weaning and inadequate complementary feeding, were more common in dysfunctional families. We also observed that chronic malnutrition in preschool children, and overweight and obesity in schoolchildren, were more common in children from dysfunctional families. Once the association between dysfunctional family dynamics and obesity in schoolchildren was demonstrated, it was observed that low education of fathers and mothers roughly doubled the likelihood of family dysfunction: OR 2.06 (95% CI: 1.37-3.10) and OR 2.47 (95% CI: 1.57-3.89), respectively. In addition, low income and lower purchasing power for food were associated with family dysfunction (p<0.05). A remaining task is to explore how to assess family dysfunction in blended, extended, and single-parent families in which other persons vulnerable to the different forms of the malnutrition syndrome depend on adults for their care, food, and nutrition.
Farina, Roberto; Simonelli, Anna; Rizzi, Alessandro; Pramstraller, Mattia; Cucchi, Alessandro; Trombelli, Leonardo
2013-07-01
This study aimed to evaluate the early postoperative healing of papillary incision wounds and its association with (1) patient/site-related factors and technical (surgical) aspects, as well as with (2) 6-month clinical outcomes, following a buccal single flap approach (SFA) in the treatment of intraosseous periodontal defects. Forty-three intraosseous defects in 35 patients were accessed with a buccal SFA alone or in combination with a reconstructive technology (graft, enamel matrix derivative (EMD), graft + EMD, or graft + membrane). Postoperative healing was evaluated at 2 weeks using the Early Wound-Healing Index (EHI). EHI ranged from score 1 (i.e., complete flap closure and optimal healing) to score 4 (i.e., loss of primary closure and partial tissue necrosis). SFA resulted in complete wound closure at 2 weeks in the great majority of sites. A significantly more frequent presence of an interdental contact point and an interdental soft tissue crater, as well as a narrower base of the interdental papilla, was observed at sites with either EHI > 1 or EHI = 4 compared with sites with EHI = 1. No association between EHI and the 6-month clinical outcomes was observed. At 2 weeks, buccal SFA may result in highly predictable complete flap closure. Site-specific characteristics may influence the early postoperative healing of the papillary incision following the SFA procedure. Two-week soft tissue healing, however, was not associated with the 6-month clinical outcomes.
ERIC Educational Resources Information Center
Kinzl, Johann F.; And Others
1995-01-01
This study evaluated 202 female university students for early familial experience and childhood sexual abuse (CSA) in relation to adult sexual disorders: (1) victims of multiple CSA more frequently reported sexual desire disorders; and (2) single-incident victims and nonvictims reported no significantly different rates of sexual dysfunction.…
Wu, Kaiyin; Budde, Klemens; Lu, Huber; Schmidt, Danilo; Liefeldt, Lutz; Glander, Petra; Neumayer, Hans Helmut; Rudolph, Birgit
2014-06-15
It is unclear whether the severity or the timing of acute cellular rejection (ACR), as defined by the Banff 2009 classification, is associated with graft survival. Borderline changes, TCMR I (interstitial rejection), and TCMR II/III (vascular rejection) were defined as low, moderate, and high ACR severity, respectively. Approximately 270 patients who had at least one episode of ACR were enrolled; 270 biopsies were chosen, each showing the highest ACR severity for that patient and negative for donor-specific antibodies (DSA), C4d, and microcirculation changes (MC). Six months was used as the cutoff to define early and late ACR; 370 patients without a biopsy posttransplantation were recruited as the control group. Up to 8 years posttransplantation, death-censored graft survival (DCGS) rates of the control, borderline, TCMR I, and TCMR II/III groups were 97.6%, 93.3%, 79.6%, and 73.6% (log-rank test, P<0.001); the control group had a significantly higher DCGS rate than the three ACR groups (each pairwise comparison yields P<0.05). The DCGS rate of late ACR was significantly lower than that of early ACR (63.6% vs. 87.4%, P<0.001). Intimal arteritis (Banff v-lesion) was an independent histologic risk factor for long-term graft loss regardless of the timing of ACR. The v-lesions with minimal or high-grade tubulitis displayed similar graft survival (72.7% vs. 72.9%, P=0.96). All types of ACR affect long-term graft survival. Vascular or late ACR predicts poorer graft survival; the extent of tubulointerstitial inflammation (TI) is of no prognostic significance for vascular rejection.
Ben Ahmed, Sabrina; Louvancourt, Adrien; Daniel, Guillaume; Combe, Pierre; Duprey, Ambroise; Albertini, Jean-Noël; Favre, Jean-Pierre; Rosset, Eugenio
2018-02-01
The objective of this study was to evaluate the early and long-term outcome of cryopreserved arterial allografts (CAAs) used for in situ reconstruction of abdominal aortic native or secondary graft infection and to identify predictors of mortality. We retrospectively included 71 patients (mean age, 65.2 years [range, 41-84 years]; men, 91.5%) treated for abdominal aortic native or secondary graft infection (65 prosthetic graft infections; 16 of them had secondary aortoenteric fistula, 2 venous graft infections, and 4 mycotic aneurysms) by in situ reconstruction with CAA in the university hospitals of Clermont-Ferrand and Saint-Etienne from 2000 to 2016. The cryopreservation protocol was identical in both centers (-140°C). Early (<30 days) and late (>30 days) mortality and morbidity, reinfection, and CAA patency were assessed. Computed tomography was performed in all survivors. Survival was analyzed with the Kaplan-Meier method. Univariate analyses were performed with the log-rank test and multivariate analysis with the Cox regression model. Mean follow-up was 45 months (0-196 months). Early postoperative mortality rate was 16.9% (11/71). Early postoperative CAA-related mortality rate was 2.8% (2/71); both patients died of proximal anastomotic rupture on postoperative days 4 and 15. Early CAA-related reintervention rate was 5.6% (4/71); all had an anastomotic rupture, and two were lethal. Early postoperative reintervention rate was 15.5% (11/71). Intraoperative bacteriologic samples were positive in 56.3%, and 31% had a sole microorganism. Escherichia coli was more frequently identified in the secondary aortoenteric fistula and Staphylococcus epidermidis in the infected prosthesis. Late CAA-related mortality rate was 2.8%: septic shock at 2 months in one patient and proximal anastomosis rupture at 1 year in one patient. Survival at 1 year, 3 years, and 5 years was 75%, 64%, and 54%, respectively. Multivariate analysis identified type 1 diabetes (hazard ratio, 2.49; 95% confidence interval, 1.05-5.88; P = .04) and American Society of Anesthesiologists class 4 (hazard ratio, 2.65; 95% confidence interval, 1.07-6.53; P = .035) as predictors of mortality after in situ CAA reconstruction. Reinfection rate was 4% (3/71). Late CAA-related reintervention rate was 12.7% (9/71): proximal anastomotic rupture in one, CAA branch stenosis/thrombosis in five, ureteral-CAA branch fistula in one, and distal anastomosis false aneurysm in two. Primary patency at 1 year, 3 years, and 5 years was 100%, 93%, and 93%, respectively. Assisted primary patency at 1 year, 3 years, and 5 years was 100%, 96%, and 96%, respectively. No aneurysm or dilation was observed. The prognosis of native or secondary aortic graft infections is poor. Aortic in situ reconstruction with CAA offers acceptable early and late results. Patients with type 1 diabetes and American Society of Anesthesiologists class 4 are at higher risk of mortality. Copyright © 2017 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
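Survival figures such as the 1-, 3-, and 5-year rates reported above are conventionally obtained with the Kaplan-Meier product-limit estimator (with log-rank tests for group comparisons and Cox models for adjusted effects). The following self-contained Python sketch is illustrative only: the follow-up data are hypothetical and tied event times are not handled.

```python
# Minimal Kaplan-Meier product-limit estimator: S(t) is the product over event times
# t_i <= t of (1 - d_i / n_i), where d_i is the number of deaths and n_i the number at risk.
def kaplan_meier(times, events):
    """times: follow-up in months; events: 1 = death, 0 = censored. Assumes no tied times."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv, curve = 1.0, []
    for t, event in data:
        if event == 1:
            surv *= 1.0 - 1.0 / at_risk  # conditional survival at this death time
        at_risk -= 1                     # censored and deceased subjects both leave the risk set
        curve.append((t, surv))
    return curve

# Hypothetical follow-up of 8 patients: (months, death indicator)
for t, s in kaplan_meier([2, 5, 9, 12, 14, 20, 30, 36], [1, 1, 0, 1, 0, 1, 0, 0]):
    print(f"t = {t:>2} months  S(t) = {s:.2f}")
```

Reading such a curve at 12, 36, and 60 months yields the kind of 1-, 3-, and 5-year survival percentages quoted in the abstract.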
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tanaka, Toshihiro, E-mail: toshihir@bf6.so-net.ne.jp; Guenther, Rolf W., E-mail: guenther@rad.rwth-aachen.de; Isfort, Peter, E-mail: isfort@hia.rwth-aachen.de
Transjugular intrahepatic portosystemic shunt (TIPS) dysfunction is an important problem after creation of shunts. Most commonly, TIPS recanalization is performed via the jugular vein approach. Occasionally it is difficult to cross the occlusion. We describe a hybrid technique for TIPS revision via direct transhepatic access combined with a transjugular approach. In two cases, bare metal stents or polytetrafluoroethylene (PTFE)-covered stent grafts had previously been placed in the TIPS tract, and they were completely obstructed. The tracts were inaccessible via the jugular vein route alone. In each case, after fluoroscopy- or computed tomography-guided transhepatic puncture of the stented segment of the TIPS, a wire was threaded through the shunt and snared into the right jugular vein. The TIPS was revised by balloon angioplasty and additional in-stent placement of PTFE-covered stent grafts. The patients were discharged without any complications. Doppler sonography 6 weeks after TIPS revision confirmed patency of the TIPS tract and the disappearance of ascites. We conclude that this technique is feasible and useful, even in patients with previous PTFE-covered stent graft placement.
[Parenchymal complications of the transplanted kidney: the role of color-Doppler imaging].
Granata, Antonio; Clementi, Silvia; Clementi, Anna; Di Pietro, Fabio; Scarfia, Viviana R; Insalaco, Monica; Aucella, Filippo; Prencipe, Michele; Fiorini, Fulvio; Sicurezza, Elvia
2012-01-01
Kidney transplantation is the treatment of choice for end-stage renal disease, given the better quality of life of transplanted patients compared with patients on maintenance dialysis. In spite of surgical improvements and new immunosuppressive regimens, a proportion of transplanted grafts still develops chronic dysfunction. Ultrasonography, both in B-mode and with Doppler ultrasound, is an important diagnostic tool in clinical conditions that might impair kidney function. Even though ultrasonography is considered fundamental in the diagnosis of vascular and surgical complications of the transplanted kidney, its role is not fully established for parenchymal complications of the graft. The specificity of Doppler ultrasound is low both for acute complications such as acute tubular necrosis, drug toxicity, and acute rejection, and for chronic conditions such as chronic allograft nephropathy. Single determinations of resistance indices have low diagnostic accuracy, which improves when successive measurements are performed during follow-up of the graft. Modern techniques, including the tissue pulsatility index, maximal fractional area, and contrast-enhanced ultrasound, increase the diagnostic power of ultrasonography for parenchymal complications of the transplanted kidney.
Advances in urethral stricture management
Gallegos, Maxx A.; Santucci, Richard A.
2016-01-01
Urethral stricture/stenosis is a narrowing of the urethral lumen. These conditions greatly impact the health and quality of life of patients. Management of urethral strictures/stenosis is complex and requires careful evaluation. The treatment options for urethral stricture vary in their success rates. Urethral dilation and internal urethrotomy are the most commonly performed procedures but carry the lowest chance for long-term success (0–9%). Urethroplasty has a much higher chance of success (85–90%) and is considered the gold-standard treatment. The most common urethroplasty techniques are excision and primary anastomosis and graft onlay urethroplasty. Anastomotic urethroplasty and graft urethroplasty have similar long-term success rates, although long-term data have yet to confirm equal efficacy. Anastomotic urethroplasty may have higher rates of sexual dysfunction. Posterior urethral stenosis is typically caused by previous urologic surgery. It is treated endoscopically with radial incisions. The use of mitomycin C may decrease recurrence. An exciting area of research is tissue engineering and scar modulation to augment stricture treatment. These include the use of acellular matrices or tissue-engineered buccal mucosa to produce grafting material for urethroplasty. Other experimental strategies aim to prevent scar formation altogether. PMID:28105329
Iida, Shoichi; Tsuda, Hidetoshi; Tanaka, Toshiaki; Kish, Danielle D.; Abe, Toyofumi; Su, Charles A.; Abe, Ryo; Tanabe, Kazunari; Valujskikh, Anna; Baldwin, William M.; Fairchild, Robert L.
2016-01-01
Reperfusion of organ allografts induces a potent inflammatory response that directs rapid memory T cell, neutrophil and macrophage graft infiltration and their activation to express functions mediating graft tissue injury. The role of cardiac allograft IL-1 receptor signaling in this early inflammation and the downstream primary alloimmune response was investigated. When compared to complete MHC-mismatched wild type cardiac allografts, IL-1R−/− allografts had marked decreases in endogenous memory CD8 T cell and neutrophil infiltration and expression of proinflammatory mediators at early times after transplant whereas endogenous memory CD4 T cell and macrophage infiltration was not decreased. IL-1R−/− allograft recipients also had marked decreases in de novo donor-reactive CD8, but not CD4, T cell development to IFN-γ-producing cells. CD8 T cell-mediated rejection of IL-1R−/− cardiac allografts took 3 weeks longer than wild type allografts. Cardiac allografts from reciprocal bone marrow reconstituted IL-1R−/−/wild type chimeric donors indicated that IL-1R signaling on graft non-hematopoietic-derived, but not bone marrow-derived, cells is required for the potent donor-reactive memory and primary CD8 T cell alloimmune responses observed in response to wild type allografts. These studies implicate IL-1R-mediated signals by allograft parenchymal cells in generating the stimuli provoking development and elicitation of optimal alloimmune responses to the grafts. PMID:26856697
The role of endothelial cells on islet function and revascularization after islet transplantation.
Del Toro-Arreola, Alicia; Robles-Murillo, Ana Karina; Daneri-Navarro, Adrian; Rivas-Carrillo, Jorge David
2016-01-02
Islet transplantation has become a widely accepted therapeutic option for selected patients with type 1 diabetes mellitus. However, in order to achieve insulin independence, a great number of islets often has to be pooled from 2 to 4 donor pancreata, mostly because of the massive loss of islets immediately after transplantation. The endothelium plays a key role in the function of native islets and during the revascularization process after islet transplantation. If revascularization is delayed, however, even the remaining islets will undergo cell death, leading to late graft dysfunction. Therefore, it is essential to understand how signals released from endothelial cells might regulate both the differentiation of pancreatic progenitors and, thereby, the maintenance of graft function. New strategies to facilitate islet engraftment and prompt revascularization could be designed to intervene in this process and might improve future results of islet transplantation.
Costa, André Nathan; Mendes, Daniel Melo; Toufen, Carlos; Arrunátegui, Gino; Caruso, Pedro; de Carvalho, Carlos Roberto Ribeiro
2008-08-01
Fat embolism is defined as mechanical blockage of the vascular lumen by circulating fat globules. Although it primarily affects the lungs, it can also affect the central nervous system, retina, and skin. Fat embolism syndrome is a dysfunction of these organs caused by fat emboli. The most common causes of fat embolism and fat embolism syndrome are long bone fractures, although there are reports of its occurrence after cosmetic procedures. The diagnosis is made clinically, and treatment is still restricted to support measures. We report the case of a female patient who developed adult respiratory distress syndrome due to fat embolism in the postoperative period following liposuction and fat grafting. In this case, the patient responded well to alveolar recruitment maneuvers and protective mechanical ventilation. In addition, we present an epidemiological and pathophysiological analysis of fat embolism syndrome after cosmetic procedures.
Brasoveanu, Vladislav; Ionescu, Mihnea Ioan; Grigorie, Razvan; Mihaila, Mariana; Bacalbasa, Nicolae; Dumitru, Radu; Herlea, Vlad; Iorgescu, Andreea; Tomescu, Dana; Popescu, Irinel
2015-09-19
Abernethy malformation (AM), or congenital absence of portal vein (CAPV), is a very rare disease which tends to be associated with the development of benign or malignant tumors, usually in children or young adults. We report the case of a 21-year-old woman diagnosed with type Ib AM (portal vein draining directly into the inferior vena cava) and unresectable liver adenomatosis. The patient presented mild liver dysfunction and was largely asymptomatic. Living donor liver transplantation was performed using a left hemiliver graft from her mother. Postoperatively, the patient attained optimal liver function and at 9-month follow-up has returned to normal life. We consider that living donor liver transplantation is the best therapeutic solution for AM associated with unresectable liver adenomatosis, especially because compared to receiving a whole liver graft, the waiting time on the liver transplantation list is much shorter.
Yamamoto, Sumiharu; Yamane, Masaomi; Yoshida, Osamu; Waki, Naohisa; Okazaki, Mikio; Matsukawa, Akihiro; Oto, Takahiro; Miyoshi, Shinichiro
2015-11-01
Early growth response-1 (Egr-1) has been shown to be a trigger-switch transcription factor involved in lung ischemia-reperfusion injury (IRI). Mouse lung transplants were performed in wild-type (WT) C57BL/6 and Egr1-knockout (KO) mice in the following donor → recipient combinations: WT → WT, KO → WT, WT → KO, and KO → KO to determine whether the presence of Egr-1 in the donor or the recipient is the most critical factor for IRI. Pulmonary grafts were retrieved after 18 hours of ischemia followed by 4 hours of reperfusion. We analyzed graft function by arterial blood gas analysis and histology in each combination and assessed the effects of Egr1 depletion on inflammatory cytokines that are regulated by Egr-1 as well as on polymorphonuclear neutrophil (PMN) infiltration. Deletion of Egr1 improved pulmonary graft function in the following order of donor → recipient combinations: WT → WT < WT → KO < KO → WT < KO → KO. Polymerase chain reaction assays for Il1B, Il6, Mcp1, Mip2, Icam1, and Cox2 showed significantly lower expression levels in the KO → KO group than in the other groups. Immunohistochemistry demonstrated clear Egr-1 expression in the nuclei of pulmonary artery endothelial cells and in the PMN cytoplasm in the WT grafts. Flow cytometry analysis showed that Egr1 deletion reduced PMN infiltration and that the extent of reduction correlated with graft function. Both graft and recipient Egr-1 played a role in lung IRI, but the graft side contributed more to this phenomenon through regulation of PMN infiltration. Donor Egr-1 expression in pulmonary artery endothelial cells may play an important role in PMN infiltration, which results in IRI after lung transplantation.
Auditory system dysfunction in Alzheimer disease and its prodromal states: A review.
Swords, Gabriel M; Nguyen, Lydia T; Mudar, Raksha A; Llano, Daniel A
2018-07-01
Recent findings suggest that both peripheral and central auditory system dysfunction occur in the prodromal stages of Alzheimer Disease (AD), and therefore may represent early indicators of the disease. In addition, loss of auditory function itself leads to communication difficulties, social isolation and poor quality of life for both patients with AD and their caregivers. Developing a greater understanding of auditory dysfunction in early AD may shed light on the mechanisms of disease progression and carry diagnostic and therapeutic importance. Herein, we review the literature on hearing abilities in AD and its prodromal stages investigated through methods such as pure-tone audiometry, dichotic listening tasks, and evoked response potentials. We propose that screening for peripheral and central auditory dysfunction in at-risk populations is a low-cost and effective means to identify early AD pathology and provides an entry point for therapeutic interventions that enhance the quality of life of AD patients. Copyright © 2018 Elsevier B.V. All rights reserved.
Moreira, Henrique T; Volpe, Gustavo J; Marin-Neto, José A; Ambale-Venkatesh, Bharath; Nwabuo, Chike C; Trad, Henrique S; Romano, Minna M D; Pazin-Filho, Antonio; Maciel, Benedito C; Lima, João A C; Schmidt, André
2017-03-01
Right ventricular (RV) impairment is postulated to be responsible for prominent systemic congestion in Chagas disease. However, occurrence of primary RV dysfunction in Chagas disease remains controversial. We aimed to study RV systolic function in patients with Chagas disease using cardiac magnetic resonance. This cross-sectional study included 158 individuals with chronic Chagas disease who underwent cardiac magnetic resonance. RV systolic dysfunction was defined as reduced RV ejection fraction based on predefined cutoffs accounting for age and sex. Multivariable logistic regression was used to verify the relationship of RV systolic dysfunction with age, sex, functional class, use of medications for heart failure, atrial fibrillation, and left ventricular systolic dysfunction. Mean age was 54±13 years, and 51.2% were men. RV systolic dysfunction was identified in 58 (37%) individuals. Although usually associated with reduced left ventricular ejection fraction, isolated RV systolic dysfunction was found in 7 (4.4%) patients, 2 of them in early stages of Chagas disease. Presence of RV dysfunction was not significantly different in patients with the indeterminate/digestive form of Chagas disease (35.7%) compared with those with Chagas cardiomyopathy (36.8%) (P = 1.000). In chronic Chagas disease, RV systolic dysfunction is more commonly associated with left ventricular systolic dysfunction, although isolated and early RV dysfunction can also be identified. © 2017 American Heart Association, Inc.
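The multivariable logistic regression described above can be sketched briefly. The sketch below is illustrative only: the data are simulated and the column names (for example, lv_dysfunction, atrial_fibrillation) are hypothetical stand-ins for the study's covariates, not its actual dataset.

```python
# Minimal sketch of a multivariable logistic regression for RV systolic
# dysfunction; data and column names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 158
df = pd.DataFrame({
    "rv_dysfunction": rng.integers(0, 2, n),       # outcome: 1 = reduced RVEF
    "age": rng.normal(54, 13, n),
    "male": rng.integers(0, 2, n),
    "lv_dysfunction": rng.integers(0, 2, n),
    "atrial_fibrillation": rng.integers(0, 2, n),
})

X = sm.add_constant(df[["age", "male", "lv_dysfunction", "atrial_fibrillation"]])
model = sm.Logit(df["rv_dysfunction"], X).fit(disp=False)

# Odds ratios with 95% confidence intervals for each covariate
odds_ratios = np.exp(model.params).rename("OR")
conf_int = np.exp(model.conf_int())
print(pd.concat([odds_ratios, conf_int], axis=1))
```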
Fifteen-Year Trends in Pediatric Liver Transplants: Split, Whole Deceased, and Living Donor Grafts.
Mogul, Douglas B; Luo, Xun; Bowring, Mary G; Chow, Eric K; Massie, Allan B; Schwarz, Kathleen B; Cameron, Andrew M; Bridges, John F P; Segev, Dorry L
2018-05-01
To evaluate changes in patient and graft survival for pediatric liver transplant recipients since 2002, and to determine if these outcomes vary by graft type (whole liver transplant, split liver transplant [SLT], and living donor liver transplant [LDLT]). We evaluated patient and graft survival among pediatric liver-only transplant recipients since the PELD/MELD system was implemented, using the Scientific Registry of Transplant Recipients. From 2002-2009 to 2010-2015, survival for SLT at 30 days improved (94% vs 98%; P < .001), and at 1 year improved for SLT (89% to 95%; P < .001) and LDLT (93% to 98%; P = .002). There was no change in survival for whole liver transplant at either 30 days (98% in both; P = .7) or 1 year (94% vs 95%; P = .2). The risk of early death with SLT was 2.14-fold higher in 2002-2009 (adjusted hazard ratio [aHR] vs whole liver transplant, 2.14; 95% CI, 1.47-3.12), but this risk disappeared in 2010-2015 (aHR, 1.13; 95% CI, 0.65-1.96), representing a significant improvement (P = .04). Risk of late death after SLT was similar in both time periods (aHR 2002-2009, 1.14; 95% CI, 0.87-1.48; aHR 2010-2015, 0.88; 95% CI, 0.56-1.37). LDLT had a similar risk of early death (aHR 2002-2009, 1.03; 95% CI, 0.49-2.14; aHR 2010-2015, 0.74; 95% CI, 0.26-2.10) and late death (aHR 2002-2009, 0.83; 95% CI, 0.52-1.32; aHR 2010-2015, 0.44; 95% CI, 0.17-1.11). Graft loss was similar for SLT (aHR, 1.09; 95% CI, 0.93-1.28) and was actually lower for LDLT (aHR, 0.71; 95% CI, 0.53-0.95). In recent years, outcomes after the use of technical variant grafts are comparable with those of whole grafts, and may be superior for LDLT. Greater use of technical variant grafts might provide an opportunity to increase organ supply without compromising post-transplant outcomes. Copyright © 2017 Elsevier Inc. All rights reserved.
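The adjusted hazard ratios above pair a point estimate with its 95% confidence limits. A minimal sketch of how such estimates might be obtained from a Cox proportional hazards model is shown below; the variables and simulated data are hypothetical placeholders, not the Scientific Registry of Transplant Recipients data.

```python
# Sketch: adjusted hazard ratios (point estimate with 95% CI) for graft type
# from a Cox proportional hazards model; data and columns are hypothetical.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "time_to_death_days": rng.exponential(2000, n),
    "died": rng.integers(0, 2, n),
    "split_graft": rng.integers(0, 2, n),   # 1 = SLT, 0 = whole liver
    "recipient_age": rng.uniform(0, 18, n),
    "era_2010_2015": rng.integers(0, 2, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_to_death_days", event_col="died")

# exp(coef) is the hazard ratio; its lower/upper 95% bounds give the
# "2.14 (95% CI, 1.47-3.12)" style of estimate quoted in the abstract.
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```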
Rybak, James; Larsen, Stephen; Yu, Michelle; Levine, Laurence A
2014-04-01
Management of adult acquired buried penis is a troublesome situation for both patient and surgeon. The buried penis has been associated with significant erectile and voiding dysfunction, depression, and overall poor quality of life (QOL). The aim was to identify outcomes following reconstructive surgery with release of buried penis, escutcheonectomy, and circumcision with or without skin grafting. We retrospectively identified 11 patients (ages 44-69 years) treated by a single surgeon between 2007 and 2011; complete data review was available for all 11. Validated European Organisation for Research and Treatment of Cancer 15 QOL, Center for Epidemiologic Studies Depression Scale (CES-D), and International Index of Erectile Function (IIEF) surveys assessed patient QOL, depression, and erectile function pre- and postoperatively. Mean body mass index (BMI) was 48.8 (42.4-64.6). Mean operative time was 191 minutes (139-272). Mean length of stay was 2.1 days. Ten of 11 patients required phallic skin grafting. There was one perioperative complication resulting in respiratory failure and an overnight stay in the intensive care unit. Wound complications were seen in 2/11 patients, and 1 needed surgical debridement for superficial wound infection. Skin graft take was seen in 100% of the patients. Ninety-one percent of patients noted significant improvement in voiding postoperatively. Ninety-one percent of patients reported significant erectile dysfunction preoperatively; IIEF scores subsequently improved after surgery by an average of 7.7 points. Clinical depression was noted to be present in 7/11 patients preoperatively and 2/11 postoperatively based on CES-D surveys. QOL improved significantly in 10/11 compared with preoperative baseline; however, many patients noted significant difficulties related to their weight and other comorbidities. Management of adult acquired buried penis is a challenging, yet correctable problem. In our series it appears that by using established surgical techniques we were able to achieve significant improvements in erectile function, QOL, and measures of depression. © 2014 International Society for Sexual Medicine.
De Pergola, Giovanni; Nardecchia, Adele; Giagulli, Vito Angelo; Triggiani, Vincenzo; Guastamacchia, Edoardo; Minischetti, Manuela Castiglione; Silvestris, Franco
2013-03-01
Epidemiological studies have recently shown that obesity, and abdominal obesity in particular, is an independent risk factor for the development of heart failure (HF). Higher cardiac oxidative stress is the early stage of heart dysfunction due to obesity, and it is the result of insulin resistance, altered fatty acid and glucose metabolism, and impaired mitochondrial biogenesis. Extense myocyte hypertrophy and myocardial fibrosis are early microscopic changes in patients with HF, whereas circumferential strain during the left ventricular (LV) systole, LV increase in both chamber size and wall thickness (LV hypertrophy), and LV dilatation are the early macroscopic and functional alterations in obese developing heart failure. LV hypertrophy leads to diastolic dysfunction and subendocardial ischemia in obesity, and pericardial fat has been shown to be significantly associated with LV diastolic dysfunction. Evolving abnormalities of diastolic dysfunction may include progressive hypertrophy and systolic dysfunction, and various degrees of eccentric and/or concentric LV hypertrophy may be present with time. Once HF is established, overweight and obese have a better prognosis than do their lean counterparts with the same level of cardiovascular disease, and this phenomenon is called "obesity paradox". It is mainly due to lower muscle protein degradation, brain natriuretic peptide circulating levels and cardio-respiratory fitness than normal weight patients with HF.
Wang, Minghui; Liu, Shanying; Ouyang, Nengtai; Song, Erwei; Lutz, Jens; Heemann, Uwe
2004-09-01
Lymphocytic infiltration is obvious throughout early and late stages of chronic allograft nephropathy. Early infiltrating lymphocytes are involved in initial insults to kidney allografts, but the contribution of late infiltration to long-term allograft attrition is still controversial. Early application of FTY720 reduced the number of graft-infiltrating lymphocytes and inhibited acute rejection. The present study investigated the potential of FTY720 to reduce the number of infiltrating lymphocytes even at a late stage and, thus, slow the pace of chronic allograft nephropathy. Fisher (F344) rat kidneys were orthotopically transplanted into Lewis recipients with an initial 10-day course of cyclosporine A (1.5 mg/kg/day). FTY720, at a dose of 0.5 mg/kg/day, or vehicle was administered to recipients either from weeks 12 to 24 or from weeks 20 to 24 after transplantation. Animals were harvested 24 weeks after transplantation for histologic, immunohistologic, and molecular analysis. FTY720, whether initiated at 12 or 20 weeks after transplantation, reduced urinary protein excretion and significantly ameliorated glomerulosclerosis, interstitial fibrosis, tubular atrophy, and intimal proliferation of graft arteries at 24 weeks after transplantation. Furthermore, FTY720 markedly suppressed lymphocyte infiltration and decreased mRNA levels of interleukin-10 (IL-10), transforming growth factor-beta (TGF-beta), and platelet-derived growth factor-B (PDGF-B) but enhanced the number of apoptotic cells in grafts. FTY720 ameliorated chronic allograft nephropathy even at advanced stages. Furthermore, our data suggest that this effect was achieved by a reduction of graft-infiltrating lymphocytes.
Kiani, Soroosh; Desai, Pranjal H.; Thirumvalavan, Nannan; Kurian, Dinesh John; Flynn, Mary Margaret; Zhao, XiaoQing
2011-01-01
BACKGROUND Endoscopic vein harvest (EVH) is the US standard of care for CABG, but recent comparisons to open harvest suggest that conduit quality and outcomes may be compromised. To test the hypothesis that problems with EVH may relate to its learning curve and conduit quality, we analyzed the quality and early function of conduits procured by technicians with varying EVH experience. METHODS EVH was performed during CABG by “experienced” (>900 cases, n=55 patients) vs. “novice” (<100 cases, n=30 patients) technicians. Afterwards, conduits were examined for vascular injury using optical coherence tomography (OCT); segments identified as injured were further examined for gene expression using a tissue injury array. Conduit diameter was measured intra- and postoperatively (at day 5 and 6 months) using OCT and computed tomography angiography. RESULTS EVH performed by novice harvesters resulted in an increased number of discrete graft injuries and higher expression of tissue injury genes. Regression analysis revealed an association between shear stress and early dilation (positive remodeling) (R² = 0.48, p < 0.01). Injured veins showed blunted positive remodeling at 5 days and a greater degree of late lumen loss at 6 months. CONCLUSION Under normal conditions, intraluminal shear stress leads vein grafts to develop positive remodeling over the first postoperative week. Injury to conduits, a frequent sequela of the learning curve for EVH, was a predictor of early graft failure, blunted positive remodeling, and greater negative remodeling. Given the ongoing annual volume of EVH cases, rigorous monitoring of the learning curve represents an important and unrecognized public health issue. PMID:21996436
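The regression result quoted above (shear stress vs. early dilation, R² = 0.48) can be illustrated with an ordinary least-squares fit. The sketch below uses made-up example values and assumed units, not the study's measurements.

```python
# Sketch: regress early vein-graft dilation on intraluminal shear stress and
# report R^2 and p-value; the values and units are made-up examples.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
shear_stress = rng.uniform(5, 40, 55)                           # hypothetical dyn/cm^2
early_dilation = 0.02 * shear_stress + rng.normal(0, 0.2, 55)   # hypothetical mm change

fit = stats.linregress(shear_stress, early_dilation)
print(f"R^2 = {fit.rvalue**2:.2f}, slope = {fit.slope:.3f}, p = {fit.pvalue:.3g}")
```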
Götz, Werner; Gerber, Thomas; Michel, Barbara; Lossdörfer, Stefan; Henkel, Kai-Olaf; Heinemann, Friedhelm
2008-10-01
Bone substitute biomaterials may be osteogenic, osteoconductive or osteoinductive. To test for these probable characteristics in a new nanoporous grafting material consisting of nanocrystalline hydroxyapatite embedded in a porous silica gel matrix (NanoBone(s)), applied in humans, we studied biopsies from 12 patients before dental implantation following various orofacial augmentation techniques with healing times of between 3.5 and 12 months. Sections from decalcified specimens were investigated using histology, histochemistry [periodic acid Schiff, alcian blue staining and tartrate-resistant acid phosphatase (TRAP)] and immunohistochemistry, with markers for osteogenesis, bone remodelling, resorption and vessel walls (alkaline phosphatase, bone morphogenetic protein-2, collagen type I, ED1, osteocalcin, osteopontin, runx2 and Von-Willebrand factor). Histologically, four specific stages of graft transformation into lamellar bone could be characterized. During early stages of healing, bone matrix proteins were absorbed by NanoBone(s) granules, forming a proteinaceous matrix, which was invaded by small vessels and cells. We assume that the deposition of these molecules promotes early osteogenesis in and around NanoBone(s) and supports the concomitant degradation probably by osteoclast-like cells. TRAP-positive osteoclast-like cells were localized directly on the granular surfaces. Runx2-immunoreactive pre-osteoblasts, which are probably involved in direct osteogenesis forming woven bone that is later transformed into lamellar bone, were attracted. Graft resorption and bone apposition around the graft granules appear concomitantly. We postulate that NanoBone(s) has osteoconductive and biomimetic properties and is integrated into the host's physiological bone turnover at a very early stage.
Kiss, Marc-Olivier; Levasseur, Annie; Petit, Yvan; Lavigne, Patrick
2012-05-01
Osteochondral autografts in mosaicplasty are inserted in a press-fit fashion, and hence, patients are kept nonweightbearing for up to 2 months after surgery to allow bone healing and prevent complications. Very little has been published regarding alternative fixation techniques of those grafts. Osteochondral autografts stabilized with a resorbable osteoconductive bone cement would have a greater load-bearing capacity than standard press-fit grafts. Controlled laboratory study. Biomechanical testing was conducted on 8 pairs of cadaveric bovine distal femurs. For the first 4 pairs, 6 single osteochondral autografts were inserted in a press-fit fashion on one femur. On the contralateral femur, 6 grafts were stabilized with a calcium triglyceride osteoconductive bone cement. For the 4 remaining pairs of femurs, 4 groups of 3 adjacent press-fit grafts were inserted on one femur, whereas on the contralateral femur, grafts were cemented. After a maturation period of 48 hours, axial loading was applied on all single grafts and on the middle graft of each 3-in-a-row series. For the single-graft configuration, median loads required to sink the press-fit and cemented grafts by 2 and 3 mm were 281.87 N versus 345.56 N (P = .015) and 336.29 N versus 454.08 N (P = .018), respectively. For the 3-in-a-row configuration, median loads required to sink the press-fit and cemented grafts by 2 and 3 mm were 260.31 N versus 353.47 N (P = .035) and 384.83 N versus 455.68 N (P = .029), respectively. Fixation of osteochondral grafts using bone cement appears to improve immediate stability over the original mosaicplasty technique for both single- and multiple-graft configurations. Achieving greater primary stability of osteochondral grafts could potentially accelerate postoperative recovery, allowing early weightbearing and physical therapy.
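A paired, nonparametric comparison is one way the press-fit versus cemented load differences above could be evaluated. The sketch below applies a Wilcoxon signed-rank test to hypothetical per-specimen loads; both the numbers and the choice of test are assumptions, not the authors' actual data or analysis.

```python
# Sketch: paired nonparametric comparison of loads (N) required to sink
# press-fit vs cemented osteochondral grafts by 2 mm; values are hypothetical.
import numpy as np
from scipy import stats

press_fit_2mm = np.array([265.0, 278.5, 290.1, 301.4, 255.9, 310.2])
cemented_2mm = np.array([330.2, 342.7, 355.0, 348.9, 325.4, 370.8])

stat, p_value = stats.wilcoxon(press_fit_2mm, cemented_2mm)
print(f"median press-fit = {np.median(press_fit_2mm):.1f} N, "
      f"median cemented = {np.median(cemented_2mm):.1f} N, p = {p_value:.3f}")
```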
Improvement of tomato local varieties by grafting in organic farming
NASA Astrophysics Data System (ADS)
Moreno, Marta M.; Villena, Jaime; Moreno, Carmen; García, Arántzazu M.; Mancebo, Ignacio; Meco, Ramón
2015-04-01
Grafting is the union of two or more pieces of living plant tissue that grow as a single plant. The early use of grafted vegetables was associated with protected cultivation, which involves successive cropping (Lee et al., 2010). For this reason, in the past, grafting was used with vegetable crops to limit the effects of soil-borne diseases. However, the reasons for grafting as well as the kinds of vegetables grafted have increased considerably over the years. In tomato (Solanum lycopersicum L.), one of the most important horticultural crops in the world, the effect of grafting has also been widely studied. These effects on commercial tomato varieties can be summarized as increasing plant vigor and crop yield or inducing tolerance to abiotic stresses, although the effects on tomato fruit quality or on sensory properties are less evident (Davis et al., 2008). However, only a few studies have examined the effect of grafting on local tomato varieties, which are especially recommended for organic production despite their often lower yields. In this work we evaluated the effect of grafting on local tomato varieties under organic management using vigorous commercial rootstocks, and analyzed aspects related to vigor, yield, and tomato fruit composition. In general terms, grafting increased plant vigor, crop yield, and fruit antioxidant content, although no modification of morphological fruit attributes was observed. Keywords: grafting, Solanum lycopersicum L., local varieties, organic farming. References: Davis A.R., Perkins-Veazie P., Hassell R., Levi A., King S.R., Zhang X. 2008. Grafting effects on vegetable quality. HortScience 43(6): 1670-1671. Lee J.M., Kubota C., Tsao S.J., Bie Z., Hoyos-Echevarría P., Morra L., Oda M. 2010. Current status of vegetable grafting: Diffusion, grafting techniques, automation. Scientia Horticulturae 127: 93-105.
Radtke, A; Sotiropoulos, G C; Molmenti, E P; Sgourakis, G; Schroeder, T; Beckebaum, S; Peitgen, H-O; Cicinnati, V R; Broelsch, C E; Broering, D C; Malagó, M
2012-03-01
The passage through the hilar plate during right graft live donor liver transplantation (LDLT) can have dangerous consequences for both donors and recipients. The purpose of our study was to delineate hilar transection and biliary reconstruction strategies in right graft LDLT, with special consideration of central and peripheral hilar anatomical variants. A total of 71 consecutive donors underwent preoperative three-dimensional (3D) CT reconstructions and virtual 3D hepatectomies. A three-modal hilar passage strategy was applied, and its impact on operative strategy analyzed. In 68.4% of cases, type I and II anatomical configurations allowed for an en bloc hilar transection with simple anastomotic reconstructions. In 23.6% of cases, donors had "difficult" type II and types III/IV hilar bile duct anatomy that required stepwise hilar transections and complex graft biliary reconstructions. Morbidity rates for our early (A) and recent (B) experience periods were 67% and 39%, respectively. (1) Our two-level classification and 3D imaging technique allowed for donor-individualized transhilar passage. (2) A stepwise transhilar passage was favored in types III and IV inside the right-sided hilar corridor. (3) Reconstruction techniques showed no ameliorating effect on early/late biliary morbidity rates. © 2012 The American Society of Transplantation and the American Society of Transplant Surgeons.
Newton, Chad A; Kozlitina, Julia; Lines, Jefferson R; Kaza, Vaidehi; Torres, Fernando; Garcia, Christine Kim
2017-08-01
Prior studies have shown that patients with pulmonary fibrosis with mutations in the telomerase genes have a high rate of certain complications after lung transplantation. However, few studies have investigated clinical outcomes based on leukocyte telomere length. We conducted an observational cohort study of all patients with pulmonary fibrosis who underwent lung transplantation at a single center between January 1, 2007, and December 31, 2014. Leukocyte telomere length was measured from a blood sample collected before lung transplantation, and subjects were stratified into 2 groups (telomere length <10th percentile vs ≥10th percentile). Primary outcome was post-lung transplant survival. Secondary outcomes included incidence of allograft dysfunction, non-pulmonary organ dysfunction, and infection. Approximately 32% of subjects had a telomere length <10th percentile. Telomere length <10th percentile was independently associated with worse survival (hazard ratio 10.9, 95% confidence interval 2.7-44.8, p = 0.001). Telomere length <10th percentile was also independently associated with a shorter time to onset of chronic lung allograft dysfunction (hazard ratio 6.3, 95% confidence interval 2.0-20.0, p = 0.002). Grade 3 primary graft dysfunction occurred more frequently in the <10th percentile group compared with the ≥10th percentile group (28% vs 7%; p = 0.034). There was no difference between the 2 groups in incidence of acute cellular rejection, cytopenias, infection, or renal dysfunction. Telomere length <10th percentile was associated with worse survival and shorter time to onset of chronic lung allograft dysfunction and thus represents a biomarker that may aid in risk stratification of patients with pulmonary fibrosis before lung transplantation. Copyright © 2017 International Society for the Heart and Lung Transplantation. Published by Elsevier Inc. All rights reserved.
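The stratification and survival comparison described above can be sketched as follows. The telomere-length values and follow-up times below are simulated placeholders, and the Kaplan-Meier/log-rank comparison is an illustrative choice rather than the paper's exact multivariable model.

```python
# Sketch: stratify recipients by leukocyte telomere length (<10th percentile
# vs >=10th percentile) and compare survival; all values are simulated.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(3)
n = 120
df = pd.DataFrame({
    "telomere_length": rng.normal(7.0, 1.0, n),   # arbitrary units
    "followup_years": rng.exponential(4.0, n),
    "died": rng.integers(0, 2, n),
})

cutoff = np.percentile(df["telomere_length"], 10)
short = df["telomere_length"] < cutoff

kmf_short = KaplanMeierFitter().fit(
    df.loc[short, "followup_years"], df.loc[short, "died"], label="<10th percentile")
kmf_long = KaplanMeierFitter().fit(
    df.loc[~short, "followup_years"], df.loc[~short, "died"], label=">=10th percentile")

result = logrank_test(
    df.loc[short, "followup_years"], df.loc[~short, "followup_years"],
    event_observed_A=df.loc[short, "died"], event_observed_B=df.loc[~short, "died"])
print(f"median survival (short telomeres): {kmf_short.median_survival_time_:.1f} years")
print(f"log-rank p = {result.p_value:.3f}")
```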
Ibrahimi, Omar A; Campbell, Tracy; Youker, Summer; Eisen, Daniel B
2012-01-01
Defects of the distal nose, particularly the nasal ala, pose a reconstructive challenge due to the lack of loose adjacent tissue and proximity to a free margin. We report our experience using nonanatomic free cartilage batten grafts in combination with second intention healing for nasal ala defects. A retrospective study of distal nose defects repaired using nonanatomic free cartilage batten grafting with second intention healing was performed. Detailed data on the quality of the scar, post-operative complications, free margin distortion, functional impairments, and patient satisfaction were recorded. Digital images were also shown to an experienced fellowship-trained Mohs surgeon to assess the overall aesthetic outcome using a 5-point score ranging from poor to excellent. Sixteen subjects were included in the study. Complications were common, but minor. Five (~31%) subjects had subtle contour depressions, three (~18%) subjects had excessive granulation tissue, two (~12%) subjects had post-operative ear pain at the donor site lasting up to 10 days, and one (~6%) subject had a hypertrophic scar at the recipient site. There were two occurrences (~12%) of mild alar notching but no occurrences of significant alar margin distortion or nasal valve dysfunction. In terms of aesthetic outcome, seven (~43%) were assessed by an independent fellowship-trained Mohs surgeon as having excellent aesthetic outcomes, six (~38%) were very good, and three (~19%) were good. All sixteen subjects reported satisfaction on follow-up evaluation. Nonanatomic free cartilage grafting with second intention healing allows for facile, single-step repair of nasal ala defects with high patient satisfaction and aesthetically pleasing results. This provides an attractive alternative to other flap techniques, skin grafting, and healing via secondary intention.
High-level JCPyV viruria after kidney transplantation-Clinical and histopathological findings.
Helanterä, Ilkka; Hirsch, Hans H; Auvinen, Eeva; Mannonen, Laura; Nummi, Maaret; Wernli, Marion; Ortiz, Fernanda; Räisänen-Sokolowski, Anne; Lempinen, Marko; Lautenschlager, Irmeli
2016-12-01
The significance of JC polyomavirus (JCPyV) after kidney transplantation ranges from irrelevant to full-blown nephropathy or progressive multifocal leukoencephalopathy (PML). We aimed to investigate the clinical significance of high-level JCPyV viruria and JCPyV primary infections after kidney transplantation. JCPyV viruria was detected in routine screening by quantitative real-time PCR in 40/238 kidney transplant recipients and was high-level (>10⁷ copies/ml) in 17 patients. A protocol biopsy at the time of JCPyV viruria was available from 10 patients. Peak urine viral loads were 1.0×10⁷-2.5×10⁹ copies/ml in the 17 high-level viruria patients. Six of 15 (40%) patients with high-level JCPyV viruria who had pretransplant sera available were JCPyV IgG negative, suggesting that JCPyV viruria resulted from the donor graft in most cases. No acute graft dysfunction was associated with JCPyV viruria. No positive SV40 staining was detected in protocol biopsies, and no specific histopathology was associated with high-level viruria; JCPyV nephropathy was not found. No differences were seen in histopathology or graft function at 3 years in patients with high-level viruria compared with non-JCPyV viruric patients transplanted during the same time period, and outcome was similar in patients with presumably primary and reactivated JCPyV. The mean estimated GFR at last follow-up was 44 ml/min (range 12-60 ml/min). One graft with high-level viruria was lost 9 years posttransplant due to recurrent IgA nephropathy. High-level JCPyV viruria seems to be associated with primary JCPyV infection, reflecting the average seroprevalence of 60%, but is not stringently associated with inferior graft function or survival, or with histopathological changes. Copyright © 2016 Elsevier B.V. All rights reserved.
Piccoli, G B; Motta, D; Gai, M; Mezza, E; Maddalena, E; Bravin, M; Tattoli, F; Consiglio, V; Burdese, M; Bilucaglia, D; Ferrari, A; Segoloni, G P
2004-11-01
Restarting dialysis after kidney transplantation is a critical step with psychological and clinical implications. Maintenance of residual renal function, a known factor affecting survival in chronic kidney disease, has so far not been investigated after kidney transplantation. A 54-year-old woman who started dialysis in 1974 (first graft, 1975-1999) received a second "marginal" kidney graft in February 2001 (donor age, 65 years). Her chronic therapy was tacrolimus and steroids. She had a clinical history as follows: nadir creatinine level of 1.5 mg/dL, moderate-severe hypertension, progressive graft dysfunction, nonresponsiveness to addition of mycophenolate, tapering FK levels, and a rescue switch from tacrolimus to rapamycin. From October to December 2003, the creatinine level increased from 2-2.8 to 7 mg/dL. A biopsy specimen showed malignant and "benign" nephrosclerosis, posttransplantation glomerulopathy, and tacrolimus toxicity. Chronic dialysis was started (GFR <3 mL/min). Rapamycin was discontinued. Dialysis was tailored to reach an equivalent renal clearance of >15 mL/min (2 sessions/wk). Blood pressure control improved, nephrotoxic drugs were avoided, and fluid loss was minimized (maximum 500 mL/hr). With this policy, renal function progressively increased to a GFR >10 mL/min in May 2004, allowing a once or twice weekly dialysis schedule, with good clinical balance and obvious advantages for quality of life. This long-term patient, who restarted dialysis with severely reduced renal function, regained sufficient renal function to allow once weekly dialysis. Thus, careful tailoring of dialysis sessions at the restart of dialysis may allow preservation of residual kidney function, at least in individuals for whom a subsequent graft is unlikely.
Remodeling of ACL Allografts is Inhibited by Peracetic Acid Sterilization
Gonnermann, Johannes; Kamp, Julia; Przybilla, Dorothea; Pruss, Axel
2008-01-01
Sterilization of allografts for anterior cruciate ligament (ACL) reconstruction has become an important prerequisite to prevent disease transmission. However, current sterilization techniques impair the biological or mechanical properties of such treated grafts. Peracetic acid (PAA) has been successfully used to sterilize bone allografts without these disadvantages and does not impair the mechanical properties of soft tissue grafts in vitro. We asked whether PAA sterilization would influence recellularization, restoration of crimp length and pattern, and revascularization of ACL grafts during early healing. We used an in vivo sheep model for open ACL reconstruction. We also correlated the histologic findings with the restoration of anteroposterior stability and structural properties during load-to-failure testing. PAA slowed remodeling activity at 6 and 12 weeks compared to nonsterilized allografts and autografts. The mechanical properties of PAA grafts were also reduced compared to these control groups at both time points. We conclude PAA sterilization currently should not be used to sterilize soft tissue grafts typically used in ACL reconstruction. PMID:18491201
Active range of motion outcomes after reconstruction of burned wrist and hand deformities.
Afifi, Ahmed M; Mahboub, Tarek A; Ibrahim Fouad, Amr; Azari, Kodi; Khalil, Haitham H; McCarthy, James E
2016-06-01
The aim of this work is to evaluate the efficacy of skin grafts and flaps in the reconstruction of post-burn hand and wrist deformities. A prospective study of 57 burn contractures of the wrist and dorsum of the hand was performed. Flaps were used only if there was a non-vascularized structure after contracture release; otherwise, a skin graft was used. Active range of motion (ROM) was used to assess hand function. The extension deformity cohort uniformly underwent skin grafting following contracture release, with a mean improvement of 71 degrees (p<0.0001). The flexion deformity cohort was treated with either skin grafts (8 patients) or flaps (9 patients), with a mean improvement of 44 degrees (p<0.0001). Skin grafts suffice for dorsal hand contractures to restore functional wrist ROM. For flexion contractures, flaps were more likely to be needed for contractures present for more than 6 months. Early release of burn contracture is advisable to avoid deep structure contracture. Copyright © 2016 Elsevier Ltd and ISBI. All rights reserved.
Badar, Athar A; Brunton, Alan P T; Mahmood, Ammad H; Dobbin, Stephen; Pozzi, Andrea; McMinn, Jenna F; Sinclair, Andrew J E; Gardner, Roy S; Petrie, Mark C; Curry, Phil A; Al-Attar, Nawwar H K; Pettit, Stephen J
2015-01-01
A systematic search of the Medline, EMBASE and CINAHL electronic databases was performed. Original research articles reporting all-cause mortality following surgery in patients with aortic regurgitation and severe left ventricular systolic dysfunction (LVSD) were identified. Nine of the 10 eligible studies were observational, single-center, retrospective analyses. Survival ranged from 86 to 100% at 30 days, 81 to 100% at 1 year, and 68 to 84% at 5 years. Three studies described an improvement in mean left ventricular ejection fraction (LVEF) of 5-14% following aortic valve replacement (AVR); a fourth study reported an increase in mean LVEF of 9% in patients undergoing isolated AVR but not when AVR was combined with coronary artery bypass grafting and/or mitral valve surgery. Three studies demonstrated improvements in functional New York Heart Association (NYHA) class following AVR. Additional studies are needed to clarify the benefits of AVR in patients with more extreme degrees of LVSD and the potential roles of cardiac transplantation and transaortic valve implantation.
Takahashi, Shuichiro; Tsumanuma, Riko; Aizawa, Keiko; Osakabe, Mitsumasa; Maeda, Kunihiko; Omoto, Ejiro
2016-01-01
The prognosis for myelodysplastic syndrome with bone marrow fibrosis (MDS-F) is worse than that of MDS without fibrosis. Hematopoietic stem cell transplantation (HSCT) is the only curative therapy; however, the indications and the procedures involved in HSCT remain unclear. We herein describe a 69-year-old Japanese man with MDS-F who received haploidentical HSCT with post-transplantation cyclophosphamide (PTCy). Although the first HSCT resulted in secondary graft failure, the second HSCT using PTCy led to successful engraftment after early improvement in fibrosis. Since the incidence of graft failure is high in patients with myelofibrosis, a second HSCT using PTCy may be a successful option. PMID:27853082
BK-virus nephropathy and simultaneous C4d positive staining in renal allografts.
Honsová, E; Lodererová, A; Viklický, O; Boucek, P
2005-10-01
The role of antibodies in rejection of transplanted kidneys was the subject of debate at the last two Banff meetings and in medical journals. Diffuse C4d-positive staining of peritubular capillaries (PTCs) was recognized as a marker of antibody-mediated rejection, and this morphological feature was included in the updated Banff schema. At the same time, polyomavirus infection of renal allografts has been reported more frequently and is emerging as an important cause of renal allograft dysfunction and graft loss. At present, BK-virus nephropathy (BKN) represents the most common viral disease affecting renal allografts. BKN was identified in 6 patients (12 biopsies and 2 graft nephrectomy specimens) among 1115 biopsies obtained between September 2000 and December 2003. Definite virus identification was done by immunohistochemistry. The graft nephrectomies were performed for graft failure due to BKN in a kidney-pancreas transplant recipient with good pancreas graft function and the necessity of continuing immunosuppression. Detection of C4d deposits was performed by immunofluorescence or by immunohistochemistry. In graftectomy samples, C4d detection was performed by immunohistochemistry, and this was done retrospectively in all cases of BKN. Focal C4d-positive PTCs and BKN were found simultaneously in 9 of 12 needle biopsies and in both graft nephrectomy samples. Detection of C4d by immunohistochemistry disclosed focal C4d-positive staining in kidney tissue but diffuse staining at the sites where BK-virus inclusions in tubular epithelial cells were found. The complement system is part of the host defense response and is crucial to our natural ability to ward off infection. In cases of BKN, virus likely gains access to the bloodstream through injured tubular walls and via PTCs. Vascular endothelium in the PTCs represents a potential target antigen for alloresponse, and may simultaneously represent an imprint of complement activation or complement production at the sites of BK-virus infection.
Melchiorri, Anthony J; Bracaglia, Laura G; Kimerer, Lucas K; Hibino, Narutoshi; Fisher, John P
2016-07-01
A critical challenge to the success of biodegradable vascular grafts is the establishment of a healthy endothelium. To establish this monolayer of endothelial cells (ECs), a variety of techniques have been developed, including cell seeding. Vascular grafts may be seeded with relevant cell types and allowed to mature before implantation. Due to the low proliferative ability of adult ECs and issues with donor site morbidity, there has been increasing interest in using endothelial progenitor cells (EPCs) for vascular healing procedures. In this work, we combined the proliferative and differentiation capabilities of a commercial cell line of early EPCs with an established bioreactor system to support the maturation of cell-seeded vascular grafts. All components of the vascular graft and bioreactor setup are commercially available and allow for complete customization of the scaffold and culturing system. This bioreactor setup enables the control of flow through the graft, imparting fluid shear stress on EPCs and affecting cellular proliferation and differentiation. Grafts cultured with EPCs in the bioreactor system demonstrated greatly increased cell populations and neotissue formation compared with grafts seeded and cultured in a static system. Increased expression of markers for mature endothelial tissues were also observed in bioreactor-cultured EPC-seeded grafts. These findings suggest the distinct advantages of a customizable bioreactor setup for the proliferation and maturation of EPCs. Such a strategy may be beneficial for utilizing EPCs in vascular tissue engineering applications.
Management of hemodialysis access infections.
Ryan, Sean V; Calligaro, Keith D; Dougherty, Matthew J
2004-03-01
Management of hemodialysis (HD) access infection is one of the most challenging and most common problems faced by surgeons, interventional radiologists, and nephrologists. The goal to eradicate infection is often at odds with the need to maintain access. Patients on HD are immunocompromised and typically have significant comorbid conditions placing them at high risk for the occurrence of access infection. Infection is most common with central-vein catheter access, followed by prosthetic arteriovenous grafts (AVG) and is rare with autogenous fistulas. The diagnosis is usually evident on physical exam, but it is not uncommon for these patients to present with atypical symptoms and lack of clinical findings. Although Staphylococcal species are the most common organism to cause infection, early empiric antimicrobial therapy should also include coverage for Gram-negative organisms. Management of central-vein catheter infection includes removal and delayed replacement or, in patients with mild clinical symptoms, catheter exchange over a guide wire. Our management of AVG infection includes total graft excision when patients present with sepsis or the entire graft is bathed in pus, subtotal graft excision when all of the graft is removed except a small oversewn cuff of prosthetic material on an underlying patent artery, and partial graft excision when only a limited infected portion of the graft is removed and a new graft is rerouted in adjacent sterile tissue to maintain patency of the original graft. This strategy has proven to be highly successful in the management of these complicated cases.
Tricarico, Rosamaria; He, Yong; Laquian, Liza; Scali, Salvatore T; Tran-Son-Tay, Roger; Beck, Adam W; Berceli, Scott A
2017-12-01
To identify anatomic and hemodynamic changes associated with impending visceral chimney stent-graft occlusion after endovascular aneurysm repair (EVAR) with the chimney technique (chEVAR). A retrospective evaluation was performed of computed tomography scans from 41 patients who underwent juxtarenal chEVAR from 2008 to 2012 to identify stent-grafts demonstrating conformational changes following initial placement. Six subjects (mean age 74 years; 3 men) were selected for detailed reconstruction and computational hemodynamic analysis; 4 had at least 1 occluded chimney stent-graft. This subset of repairs was systematically analyzed to define the anatomic and hemodynamic impact of these changes and identify signature patterns associated with impending renovisceral stent-graft occlusion. Spatial and temporal analyses of cross-sectional area, centerline angle, intraluminal pressure, and wall shear stress (WSS) were performed within the superior mesenteric and renal artery chimney grafts used for repair. Conformational changes in the chimney stent-grafts and associated perturbations, in both local WSS and pressure, were responsible for the 5 occlusions in the 13 stented branches. Anatomic and hemodynamic signatures leading to occlusion were identified within 1 month postoperatively, with a lumen area <14 mm² (p=0.04), systolic pressure gradient >25 Pa/mm (p=0.03), and systolic WSS >45 Pa (p=0.03) associated with future chimney stent-graft occlusion. Chimney stent-grafts at increased risk for occlusion demonstrated anatomic and hemodynamic signatures within 1 month of juxtarenal chEVAR. Analysis of these parameters in the early postoperative period may be useful for identifying and remediating these high-risk stent-grafts.
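The early postoperative signatures reported above (lumen area <14 mm², systolic pressure gradient >25 Pa/mm, systolic WSS >45 Pa) can be read as a simple screening rule. The sketch below flags hypothetical chimney stent-grafts against those cutoffs; it is an illustration of the thresholds only, not a validated risk model.

```python
# Flag chimney stent-grafts whose 1-month measurements cross the
# occlusion-associated thresholds from the abstract; example data are hypothetical.
import pandas as pd

grafts = pd.DataFrame({
    "graft_id": ["SMA", "right_renal", "left_renal"],
    "lumen_area_mm2": [22.0, 12.5, 16.3],
    "systolic_gradient_pa_per_mm": [14.0, 31.0, 19.5],
    "systolic_wss_pa": [30.0, 52.0, 41.0],
})

grafts["at_risk"] = (
    (grafts["lumen_area_mm2"] < 14.0)
    | (grafts["systolic_gradient_pa_per_mm"] > 25.0)
    | (grafts["systolic_wss_pa"] > 45.0)
)
print(grafts[["graft_id", "at_risk"]])
```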
Blood-pool SPECT in addition to bone SPECT in the viability assessment in mandibular reconstruction.
Aydogan, F; Akbay, E; Cevik, C; Kalender, E
2014-01-01
The assessment of the postoperative viability of vascularized and non-vascularized grafts used in the reconstruction of mandibular defects due to trauma and surgical reasons is a major problem in maxillofacial surgery. In the present study, we evaluated the feasibility and image quality of blood-pool SPECT, which is used for the first time in the literature here in the assessment of mandibular reconstruction, in addition to non-invasive bone scintigraphy and bone SPECT. We also evaluated whether it would be useful in clinical prediction. Micro-vascularized and non-vascularized bone grafts were used in 12 Syrian men with maxillofacial trauma. Between days 5-7 after surgery, three-phase bone scintigraphy, blood-pool SPECT and delayed bone SPECT scans were performed. After month 6, the patients were assessed by control CT scans. Of the non-vascularized grafts, one graft was reported as non-viable at week one. At month 6, graft resorption was demonstrated on the CT images. The remaining non-vascularized grafts and all of the micro-vascularized grafts were considered to be viable according to delayed bone SPECT and blood-pool SPECT images. However, only the anterior and posterior ends could be clearly assessed on delayed SPECT images, while blood-pool SPECT images allowed the clear assessment of the entire graft. The combined use of blood-pool and delayed SPECT scans could allow for better assessment of graft viability in the early period, and can provide more detailed information to clinicians about prognosis in the follow-up of patients undergoing mandibular graft reconstruction.
Ziu, Mateo; Jimenez, David F
2013-11-01
Presented herein is a review of the history of fat graft use in preventing iatrogenic cerebrospinal fluid (CSF) rhinorrhea after transsphenoidal surgery. Since the first transsphenoidal surgeries were described in the early 1900s, the techniques of sellar packing to prevent CSF leak have evolved. Kanavel, Halstead, and Cushing used bismuth- or iodine-soaked gauze. Under Dandy's influence, fascia lata was the first autologous material to be used for the repair and prevention of CSF rhinorrhea. The use of autologous fat graft for this purpose has only been reported in recent decades. Montgomery was the first to use abdominal fat to obliterate the middle ear cavity in 1964, and Collins reported the first transsphenoidal application of fat graft in 1973. Other reports by Kirchner, Tindall, and Wilson followed. Copyright © 2013. Published by Elsevier Inc.
[Current insights about recurrence of glomerular diseases after renal transplantation].
Kofman, Tomek; Oniszczuk, Julie; Lang, Philippe; Grimbert, Philippe; Audard, Vincent
2018-05-01
Recurrence of glomerular disease after renal transplantation is a frequent cause of graft loss. The incidence, risk factors, and outcome of recurrence vary widely with the underlying glomerular disease. Graft biopsy analysis is required to confirm the definitive diagnosis of recurrence and to start appropriate therapy, which in some cases remains challenging, in order to prevent graft failure. Increased use of protocol biopsies and recent advances in our understanding of the pathogenesis of some glomerular diseases, with the identification of relevant biomarkers, provide a unique opportunity to initiate kidney-protective therapy at early stages of recurrence on the graft. This review summarizes our current knowledge on the management of many recurrent primary and secondary glomerulonephritides after kidney transplantation. Copyright © 2018 Société francophone de néphrologie, dialyse et transplantation. Published by Elsevier Masson SAS. All rights reserved.
Transplantation outcomes in primary hyperoxaluria.
Bergstralh, E J; Monico, C G; Lieske, J C; Herges, R M; Langman, C B; Hoppe, B; Milliner, D S
2010-11-01
Optimal transplantation strategies are uncertain in primary hyperoxaluria (PH) due to the potential for recurrent oxalosis. Outcomes of different transplantation approaches were compared using life-table methods to determine kidney graft survival among 203 patients in the International Primary Hyperoxaluria Registry. From 1976 to 2009, 84 kidney-alone (K) and combined kidney and liver (K + L) transplants were performed in 58 patients. Among 58 first kidney transplants (32 K, 26 K + L), 1-, 3- and 5-year kidney graft survival was 82%, 68% and 49%. Renal graft loss occurred in 26 first transplants, due to oxalosis in ten, chronic allograft nephropathy in six, rejection in five, and other causes in five. Delay in PH diagnosis until after transplant favored early graft loss (p = 0.07). K + L had better kidney graft outcomes than K, with death-censored graft survival of 95% versus 56% at 3 years (p = 0.011). Among the 29 first transplants performed from 2000 to 2009 (24 K + L), 84% were functioning at 3 years compared with 55% of earlier transplants (p = 0.05). At 6.8 years after transplantation, 46 of 58 patients are living (43 with functioning grafts). Outcomes of transplantation in PH have improved over time, with recent K + L transplantation highly successful. Recurrent oxalosis accounted for a minority of kidney graft losses. © 2010 The Authors. Journal compilation © 2010 The American Society of Transplantation and the American Society of Transplant Surgeons.
Nasofacial defect following fibrosarcoma excision and radiotherapy.
Burget, G L; Panje, W R; Krause, C J
1988-01-01
For initial reconstruction, Dr. Burget suggests that he would have advanced the cheek flap medially toward the nasal septum and, subsequently, reconstructed the missing right half of the nose with a forehead flap and cartilage grafts. Dr. Panje suggested early prosthetic rehabilitation, while Dr. Krause's concepts were similar to Dr. Burget's, with forehead flap nasal reconstruction, after cheek reconstruction to the nasofacial and nasolabial lines with a medially advanced cheek flap. Dr. Panje recommended an immediate maxillary denture prosthesis, as did Dr. Krause (who supplemented this with foam rubber). Dr. Burget placed the prosthesis 3 weeks after tumor ablation. For skin grafts, Drs. Panje and Burget suggested split thickness grafts to all new surfaces to decrease wound contracture, while Dr. Krause used dermis grafts for the same purpose. Other reconstructive methods mentioned were the (1) cervical tubed flap, (2) free scapular flap, (3) Washio flap, (4) tissue expansion, and (5) nasolabial flap. Suggestions for isolated defects included: Lower eyelid--increase internal support by building up the prosthesis; release lower lid from deltopectoral flap and V-Y advancement; support graft or irradiated cartilage (1-2 mm sheet) under orbicularis oculi. Nasal ala--bring present ala down and insert cartilage graft; turn internal skin down and fill the resulting defect with a composite graft. Upper lip--multiple Z-plasty. Retrodisplacement of cheek due to maxillectomy--release buccal scar; skin graft the raw internal surface and build up prosthesis.
Heise, Carlos O; Siqueira, Mario G; Martins, Roberto S; Foroni, Luciano H; Sterman-Neto, Hugo
2017-09-01
Ulnar and median nerve transfers to arm muscles have been used to recover elbow flexion in infants with neonatal brachial plexus palsy, but there is no direct outcome comparison with the classical supraclavicular nerve grafting approach. We retrospectively analyzed patients with C5-C7 neonatal brachial plexus palsy who underwent nerve surgery and recorded elbow flexion recovery using the active movement scale (0-7) at 12 and 24 months after surgery. We compared 13 patients who underwent supraclavicular nerve grafting with 21 patients who underwent distal ulnar or median nerve transfer to the biceps motor branch. We considered elbow flexion scores of 6 or 7 as good results. The mean elbow flexion score and the proportion of good results were better with distal nerve transfers than with supraclavicular grafting at 12 months (p < 0.01), but not at 24 months. Two patients with failed supraclavicular nerve grafting at 12 months showed good elbow flexion recovery after ulnar nerve transfers. Distal nerve transfers provided faster elbow flexion recovery than supraclavicular nerve grafting, but there was no significant difference in outcome 24 months after surgery. Patients with failed supraclavicular grafting who were operated on early can still benefit from late distal nerve transfers. Supraclavicular nerve grafting should remain the first-line surgical treatment for children with neonatal brachial plexus palsy.
Marteyn, Antoine; Sarrazin, Nadège; Yan, Jun; Bachelin, Corinne; Deboux, Cyrille; Santin, Mathieu D; Gressens, Pierre; Zujovic, Violetta; Baron-Van Evercooren, Anne
2016-04-01
Pelizaeus-Merzbacher disease (PMD) results from an X-linked misexpression of proteolipid protein 1 (PLP1). This leukodystrophy causes severe hypomyelination with progressive inflammation, leading to neurological dysfunctions and shortened life expectancy. While no cure exists for PMD, experimental cell-based therapy in the dysmyelinated shiverer model suggested that human oligodendrocyte progenitor cells (hOPCs) or human neural precursor cells (hNPCs) are promising candidates to treat myelinopathies. However, the fate and restorative advantages of human NPCs/OPCs in a relevant model of PMD has not yet been addressed. Using a model of Plp1 overexpression, resulting in demyelination with progressive inflammation, we compared side-by-side the therapeutic benefits of intracerebrally grafted hNPCs and hOPCs. Our findings reveal equal integration of the donor cells within presumptive white matter tracks. While the onset of exogenous remyelination was earlier in hOPCs-grafted mice than in hNPC-grafted mice, extended lifespan occurred only in hNPCs-grafted animals. This improved survival was correlated with reduced neuroinflammation (microglial and astrocytosis loads) and microglia polarization toward M2-like phenotype followed by remyelination. Thus modulation of neuroinflammation combined with myelin restoration is crucial to prevent PMD pathology progression and ensure successful rescue of PMD mice. These findings should help to design novel therapeutic strategies combining immunomodulation and stem/progenitor cell-based therapy for disorders associating hypomyelination with inflammation as observed in PMD. © 2015 AlphaMed Press.
Bonthuis, Marjolein; Busutti, Marco; Jager, Kitty J.; Baiko, Sergey; Bakkaloğlu, Sevcan; Battelino, Nina; Gaydarova, Maria; Gianoglio, Bruno; Parvex, Paloma; Gomes, Clara; Heaf, James G.; Podracka, Ludmila; Kuzmanovska, Dafina; Molchanova, Maria S.; Pankratenko, Tatiana E.; Papachristou, Fotios; Reusz, György; Sanahuja, Maria José; Shroff, Rukshana; Groothoff, Jaap W.; Schaefer, Franz; Verrina, Enrico
2015-01-01
Background and objectives Data on mineral metabolism in pediatric renal transplant recipients largely arise from small single-center studies. In adult patients, abnormal mineral levels are related to a higher risk of graft failure. This study used data from the European Society for Paediatric Nephrology/European Renal Association–European Dialysis and Transplant Association Registry to study the prevalence and potential determinants of mineral abnormalities, as well as the predictive value of a disturbed mineral level on graft survival in a large cohort of European pediatric renal transplant recipients. Design, setting, participants, & measurements This study included 1237 children (0–17 years) from 10 European countries, who had serum calcium, phosphorus, and parathyroid hormone measurements from 2000 onward. Abnormalities of mineral metabolism were defined according to European guidelines on prevention and treatment of renal osteodystrophy in children on chronic renal failure. Results Abnormal serum phosphorus levels were observed in 25% (14% hypophosphatemia and 11% hyperphosphatemia), altered serum calcium in 30% (19% hypocalcemia, 11% hypercalcemia), and hyperparathyroidism in 41% of the patients. A longer time since transplantation was associated with a lower risk of having mineral levels above target range. Serum phosphorus levels were inversely associated with eGFR, and levels above the recommended targets were associated with a higher risk of graft failure independently of eGFR. Conclusions Abnormalities in mineral metabolism are common after pediatric renal transplantation in Europe and are associated with graft dysfunction. PMID:25710805
A comparison of lamellar and penetrating keratoplasty outcomes: a registry study.
Coster, Douglas J; Lowe, Marie T; Keane, Miriam C; Williams, Keryn A
2014-05-01
To investigate changing patterns of practice of keratoplasty in Australia, graft survival, visual outcomes, the influence of experience, and the surgeon learning curve for endothelial keratoplasty. Observational, prospective cohort study. From a long-standing national corneal transplantation register, 13 920 penetrating keratoplasties, 858 deep anterior lamellar keratoplasties (DALKs), and 2287 endokeratoplasties performed between January 1996 and February 2013 were identified. Kaplan-Meier functions were used to assess graft survival and surgeon experience, the Pearson chi-square test was used to compare visual acuities, and linear regression was used to examine learning curves. Graft survival. The total number of corneal grafts performed annually is increasing steadily. More DALKs but fewer penetrating grafts are being performed for keratoconus, and more endokeratoplasties but fewer penetrating grafts are being performed for Fuchs' dystrophy and pseudophakic bullous keratopathy. In 2012, 1482 grafts were performed, compared with 955 in 2002, translating to a requirement for 264 extra corneal donors across the country in 2012. Comparing penetrating grafts and DALKs performed for keratoconus over the same era, both graft survival (P <0.001) and visual outcomes (P <0.001) were significantly better for penetrating grafts. Survival of endokeratoplasties performed for Fuchs' dystrophy or pseudophakic bullous keratopathy was poorer than survival of penetrating grafts for the same indications over the same era (P <0.001). Visual outcomes were significantly better for penetrating grafts than for endokeratoplasties performed for Fuchs' dystrophy (P <0.001), but endokeratoplasties achieved better visual outcomes than penetrating grafts for pseudophakic bullous keratopathy (P <0.001). Experienced surgeons (>100 registered keratoplasties) achieved significantly better survival of endokeratoplasties (P <0.001) than surgeons who had performed fewer grafts (<100 registered keratoplasties). In the hands of experienced, high-volume surgeons, endokeratoplasty failures occurred even after 100 grafts had been performed. More corneal transplants, especially DALKs and endokeratoplasties, are being performed in Australia than ever before. Survival of DALKs and endokeratoplasties is worse than the survival of penetrating grafts performed for the same indications over the same timeframe. Many endokeratoplasties fail early, but the evidence for a surgeon learning curve is unconvincing. Copyright © 2014 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
Raza, Sajjad; Blackstone, Eugene H; Houghtaling, Penny L; Koprivanac, Marijan; Ravichandren, Kirthi; Javadikasgari, Hoda; Bakaeen, Faisal G; Svensson, Lars G; Sabik, Joseph F
2017-12-01
The purpose of this study was to determine in patients with diabetes mellitus whether single internal thoracic artery (SITA) plus radial artery (RA) grafting yields outcomes similar to those of bilateral internal thoracic artery (BITA) grafting. From January 1994 to January 2011, 1,325 diabetic patients underwent primary isolated coronary artery bypass graft surgery with either (1) SITA plus RA with or without saphenous vein (SV) grafts (n = 965) or (2) BITA with or without SV grafts (n = 360); an internal thoracic artery was used in all patients to graft the left anterior descending coronary artery. Endpoints were in-hospital outcomes and time-related mortality. Median follow-up was 7.4 years, with a total follow-up of 9,162 patient-years. Propensity score matching was performed to identify 282 well-matched pairs for adjusted comparisons. Unadjusted in-hospital mortality was 0.52% for SITA plus RA with or without SV grafts and 0.28% for BITA with or without SV grafts, and prevalence of deep sternal wound infection was 3.2% and 1.7%, respectively. Unadjusted survival at 1, 5, 10, and 14 years was 97%, 88%, 68%, and 51% for SITA plus RA with or without SV grafts, and 97%, 95%, 80%, and 66% for BITA with or without SV grafts, respectively. Among propensity-matched patients, in-hospital mortality (0.35% versus 0.35%) and prevalence of deep sternal wound infection (1.4% versus 1.4%) were similar (p > 0.9) in the two groups, as was 1-, 5-, 10-, and 14-year survival: 97%, 90%, 70%, and 58% for SITA plus RA with or without SV grafting versus 97%, 93%, 79%, and 64% for BITA with or without SV grafting, respectively (early p = 0.8, late p = 0.2). For diabetic patients, SITA plus RA with or without SV grafting and BITA with or without SV grafting yield similar in-hospital outcomes and long-term survival after coronary artery bypass graft surgery. Therefore, both SITA plus RA and BITA plus SV grafting should be considered for these patients. Copyright © 2017 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
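Propensity score matching of the kind described above can be sketched briefly: estimate each patient's probability of receiving BITA from baseline covariates, then pair each BITA patient with the nearest SITA plus RA patient on that score. The covariates, data, and greedy 1:1 matching (with replacement) below are hypothetical simplifications, not the study's actual matching protocol.

```python
# Minimal sketch of 1:1 nearest-neighbor propensity score matching between
# BITA and SITA+RA groups; covariates and data are hypothetical placeholders.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(4)
n = 1325
df = pd.DataFrame({
    "bita": rng.integers(0, 2, n),              # 1 = BITA, 0 = SITA + RA
    "age": rng.normal(62, 9, n),
    "ejection_fraction": rng.normal(50, 10, n),
    "prior_mi": rng.integers(0, 2, n),
})

covariates = ["age", "ejection_fraction", "prior_mi"]
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates], df["bita"])
df["propensity"] = ps_model.predict_proba(df[covariates])[:, 1]

treated = df[df["bita"] == 1]
control = df[df["bita"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["propensity"]])
_, idx = nn.kneighbors(treated[["propensity"]])
matched_controls = control.iloc[idx.ravel()]
print(f"matched pairs formed: {len(treated)}")
```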
Standard operating procedures for Peyronie's disease.
Levine, Laurence A; Burnett, Arthur L
2013-01-01
Peyronie's disease (PD) refers to a penile deformity that is associated with sexual dysfunction. To provide recommendations and Standard Operating Procedures (SOPs) based on best evidence for diagnosis and treatment of PD. Medical literature was reviewed and combined with expert opinion of the authors. Recommendations and SOPs based on grading of evidence-based medical literature. PD is a fibrotic wound-healing disorder involving the tunica albuginea of the corpora cavernosa. The resulting scar is responsible for a variety of deformities, including curvature, shortening, and narrowing with hinge effect, and is frequently associated in the early phase with pain. Patients frequently experience diminished quality erections. All of these conditions can compromise sexual function for the affected male. The etiopathophysiology of PD has yet to be clarified and, as a result, effective, reliable, mechanistically directed non-surgical therapy is lacking. The management of PD consists of proper diagnosis and treatment, ranging from non-surgical to surgical interventions. The mainstay of treatment for PD rests at this time on surgical correction, which should be based on clear indications, involve surgical consent, and follow a surgical algorithm that includes tunica plication, plaque incision/partial excision and grafting, and penile prosthesis implantation. © 2012 International Society for Sexual Medicine.
Long-term outcomes and management of lung transplant recipients.
Costa, Joseph; Benvenuto, Luke J; Sonett, Joshua R
2017-06-01
Lung transplantation is an established treatment for patients with end-stage lung disease. Improvements in immunosuppression and therapeutic management of infections have resulted in improved long-term survival and a decline in allograft rejection. Allograft rejection continues to be a serious complication following lung transplantation, thereby leading to acute graft failure and, subsequently, chronic lung allograft dysfunction (CLAD). Bronchiolitis obliterans syndrome (BOS), the most common phenotype of CLAD, is the leading cause of late mortality and morbidity in lung recipients, with 50% having developed BOS within 5 years of lung transplantation. Infections in lung transplant recipients are also a significant complication and represent the most common cause of death within the first year. The success of lung transplantation depends on careful management of immunosuppressive regimens to reduce the rate of rejection, while monitoring recipients for infections and complications to help identify problems early. The long-term outcomes and management of lung transplant recipients are critically based on modulating natural immune response of the recipient to prevent acute and chronic rejection. Understanding the immune mechanisms and temporal correlation of acute and chronic rejection is thus critical in the long-term management of lung recipients. Copyright © 2017 Elsevier Ltd. All rights reserved.
5-Fluorouracil cardiotoxicity: reversible left ventricular systolic dysfunction with early detection
Iskandar, Muhammad Zaid; Quasem, Wahid; El-Omar, Magdi
2015-01-01
A 33-year-old man presented to hospital with acute shortness of breath and evolving ST segment changes on ECG 3 days following a cycle of 5-fluorouracil (5-FU) for colon cancer. Despite no cardiac history, subsequent echocardiogram showed severe left ventricular systolic dysfunction. The patient was initially treated with heart failure medications and his coronary angiogram was normal. Chemotherapy was stopped and he was started on nitrates and calcium channel blockers. A repeat echocardiogram and cardiac MRI a week later showed complete resolution of his left ventricular dysfunction and he was discharged home. This case report summarises 5-FU cardiotoxicity, and emphasises the importance of early recognition and correct treatment, as left ventricular systolic dysfunction in this context is potentially reversible. PMID:25935919
Borges, V T M; Zanati, S G; Peraçoli, M T S; Poiati, J R; Romão-Veiga, M; Peraçoli, J C; Thilaganathan, B
2018-04-01
Pre-eclampsia (PE) is associated with maternal cardiac remodeling and diastolic dysfunction. The aim of this study was to assess and compare maternal left ventricular structure and diastolic function and levels of brain natriuretic peptide (BNP) in women with early-onset (< 34 weeks' gestation) vs those with late-onset (≥ 34 weeks' gestation) PE. This was a prospective, cross-sectional, observational study of 30 women with early-onset PE, 32 with late-onset PE and 23 normotensive controls. Maternal cardiac structure and diastolic function were assessed by echocardiography and plasma levels of BNP were measured by enzyme immunoassay. Early- and late-onset PE were associated with increased left ventricular mass index and relative wall thickness compared with normotensive controls. In women with early-onset PE, the prevalence of concentric hypertrophy (40%) and diastolic dysfunction (23%) was also significantly higher (both P < 0.05) compared with women with late-onset PE (16% for both). Maternal serum BNP levels were significantly higher (P < 0.05) in women with early-onset PE and correlated with relative wall thickness and left ventricular mass index. Early-onset PE is associated with more severe cardiac impairment than is late-onset PE, as evidenced by an increased prevalence of concentric hypertrophy, diastolic dysfunction and higher levels of BNP. These findings suggest that early-onset PE causes greater myocardial damage, increasing the risk of both peripartum and postpartum cardiovascular morbidity. Although these cardiovascular effects are easily identified by echocardiographic parameters and measuring BNP, further studies are needed to assess their clinical utility. Copyright © 2017 ISUOG. Published by John Wiley & Sons Ltd.
Double- and single-lung transplantation: an analysis of twenty years of OPTN/UNOS registry data.
Cai, Junchao
2007-01-01
1. Within the past 2 decades, the annual number of lung transplants, especially double-lung transplants, has steadily increased every year and exceeded 1,400 in the last 2 years. 2. Overall 1-, 5-, and 10-year graft survival rates for double-lung transplant recipients were 79.5%, 50.6%, and 30.4%, respectively; those for left-lung transplant recipients were 76.0%, 41.8%, and 17.1%; and those for right-lung transplant recipients were 78.3%, 44.8%, and 19.2%. 3. The improvement in long-term graft survival in the most recent transplant era was mainly due to improved one-year survival, more precisely, to improved early outcomes within the first 2-3 months after transplantation. 4. A negative association between HLA mismatch and graft survival is statistically significant in both double- and left-lung transplants. 5. Female COPD and ATD single-lung recipients had high long-term graft survival when they received right-lung transplants. Among male single-lung recipients, CF patients had better graft survival when they received left-lung transplants, whereas PPH patients had higher graft survival when receiving right-lung transplants. This association between recipient gender and/or different original diseases and graft survival requires further investigation.
Urinary MicroRNA as Biomarker in Renal Transplantation.
van de Vrie, M; Deegens, J K; Eikmans, M; van der Vlag, J; Hilbrands, L B
2017-05-01
Urine represents a noninvasive source in which proteins and nucleic acids can be assessed. Such analytes may function as biomarkers to monitor kidney graft pathology at every desired frequency, thereby providing a time window to prevent graft damage by therapeutic intervention. Recently, several proteins have been measured in urine as markers of graft injury. However, the specificity is limited, and measuring urinary proteins generally lacks the potential to predict early kidney graft damage. Currently, urinary mRNA and microRNA are being investigated to evaluate the prognostic value of changes in gene expression during the initial stages of graft damage. At such a time point, a change in treatment regimen and dosage is expected to have maximum potency to minimize future decline in graft function. Both mRNA and microRNAs have shown promising results in both detection and prediction of graft injury. An advantage of microRNAs compared to mRNA molecules is their stability, a characteristic that is beneficial when working with urine samples. In this review, we summarize the current state of urinary biomarkers in renal transplantation, with a focus on urinary microRNA. In addition, we discuss the methods used to study urinary microRNA expression. © 2016 The Authors. American Journal of Transplantation published by Wiley Periodicals, Inc. on behalf of American Society of Transplant Surgeons.
Angiogenesis in healing autogenous flexor-tendon grafts.
Gelberman, R H; Chu, C R; Williams, C S; Seiler, J G; Amiel, D
1992-09-01
On the basis of recent evidence that flexor tendon grafts may heal without the ingrowth of vascular adhesions, eighteen autogenous donor tendons of intrasynovial and extrasynovial origin were transferred to the synovial sheaths in the forepaws of nine dogs, and controlled passive mobilization was instituted early in the postoperative period. The angiogenic responses of the tendon grafts were determined with perfusion studies with India ink followed by clearing of the tissues with the Spalteholz technique at two, four, and six weeks. A consistent pattern of neovascularization was noted in the donor tendons of extrasynovial origin. Vascular adhesions arising from the flexor digitorum superficialis and the tendon sheath enveloped the tendon grafts by two weeks. By six weeks, the vascularity of the tendon grafts of extrasynovial origin appeared completely integrated with that of the surrounding tissues. Examination of cross sections revealed that the segments of tendon had been completely vascularized by obliquely oriented intratendinous vessels. In contrast, the flexor tendon grafts of intrasynovial origin healed without ingrowth of vascular adhesions. Primary intrinsic neovascularization took place from the proximal and, to a lesser extent, distal sites of the sutures. Examination of cross sections revealed vessels extending through the surface layer of the tendon graft, with small vessels penetrating the interior of the tendons at regular intervals.
Lee, Wei-Chia; Wu, Han-Ching; Huang, Kuo-How; Wu, Huey-Peir; Yu, Hong-Jeng; Wu, Chia-Ching
2014-01-01
Purpose To investigate the relationship between distal symmetric peripheral neuropathy and early stages of autonomic bladder dysfunction in type 2 diabetic women. Materials and Methods A total of 137 diabetic women with minimal coexisting confounders of voiding dysfunction followed at a diabetes clinic were subject to the following evaluations: current perception threshold (CPT) tests on myelinated and unmyelinated nerves at the big toe for the peroneal nerve and the middle finger for the median nerve, uroflowmetry, post-void residual urine volume, and the overactive bladder (OAB) symptom score questionnaire. Patients presenting with voiding difficulty also underwent urodynamic studies and intravesical CPT tests. Results Based on the OAB symptom score and urodynamic studies, 19% of diabetic women had the OAB syndrome while 24.8% had unrecognized urodynamic bladder dysfunction (UBD). The OAB group had a significantly greater mean 5 Hz CPT test value at the big toe by comparison to those without OAB. When compared to diabetic women without UBD, those with UBD showed greater mean 5 Hz CPT test values at the middle finger and big toe. The diabetic women categorized as having C-fiber hyposensitivity at the middle finger or big toe by the CPT test also had higher odds ratios of UBD. Among diabetic women with UBD, the 5 Hz CPT test values at the big toe and middle finger were significantly associated with intravesical 5 Hz CPT test values. Conclusions Using electrophysiological evidence, our study revealed that hyposensitivity of unmyelinated C-fiber afferents at the distal extremities is an indicator of early-stage diabetic bladder dysfunction in type 2 diabetic women. The C-fiber dysfunction at the distal extremities seems concurrent with vesical C-fiber neuropathy and may be a sentinel for developing early diabetic bladder dysfunction among female patients. PMID:24466107
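The "higher odds ratios of UBD" reported above can be illustrated with a small 2x2 calculation; the counts below are invented placeholders, not the study's data.

```python
# Odds ratio for urodynamic bladder dysfunction (UBD) given C-fiber
# hyposensitivity, with a Wald-type 95% confidence interval.
import numpy as np
from scipy.stats import fisher_exact

#                  UBD+  UBD-
table = np.array([[12,   20],   # hyposensitivity present (hypothetical counts)
                  [ 8,   60]])  # hyposensitivity absent  (hypothetical counts)

odds_ratio, p_value = fisher_exact(table)
a, b, c, d = table.ravel()
se = np.sqrt(1/a + 1/b + 1/c + 1/d)                       # SE of log(OR)
lo, hi = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se)
print(f"OR = {odds_ratio:.2f}, 95% CI {lo:.2f}-{hi:.2f}, p = {p_value:.3f}")
```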
Phase II Clinical Trial of Intraoral Grafting of Human Tissue Engineered Oral Mucosa
2017-10-01
… experimental arm subject in the small defect study. A protocol amendment in early 2017 revised the study inclusionary criteria to include all non … group phase II study to assess the safety and efficacy for use of human EVPOME for soft tissue intraoral grafting procedures compared to the “gold …
[Renal failure in patients with liver transplant: incidence and predisposing factors].
Gerona, S; Laudano, O; Macías, S; San Román, E; Galdame, O; Torres, O; Sorkin, E; Ciardullo, M; de Santibañes, E; Mastai, R
1997-01-01
Renal failure is a common finding in patients undergoing orthotopic liver transplantation. The aim of the present study was to evaluate the incidence, the prognostic value of pre-, intra- and postoperative factors, and the severity of renal dysfunction in patients who undergo liver transplantation. Therefore, the records of 38 consecutive adult patients were reviewed. Renal failure was defined arbitrarily as an increase in creatinine (> 1.5 mg/dl) and/or blood urea (> 80 mg/dl). Three patients were excluded from the final analysis (1 acute liver failure and 2 with a survival of less than 72 h). Twenty-one of the 35 patients had renal failure after orthotopic liver transplantation. Six of these episodes developed early, occurring within the first 6 days. Late renal impairment occurred in 15 patients during the hospitalization (40 +/- 10 days) (mean +/- SD). In the overall series, worse liver function, evaluated by the Child-Pugh classification, higher blood product requirements, and higher cyclosporine levels were observed in those who experienced renal failure compared with those who did not (p < 0.05). Early renal failure was related to preoperative (liver function) and intraoperative (blood requirements) factors, and several causes other than cyclosporine (nephrotoxic drugs and graft failure) were present in patients who developed late renal impairment. No mortality was associated with renal failure. We conclude that renal failure a) is a common finding after liver transplantation, b) has a multifactorial pathogenesis, and c) is not related to a poor outcome.
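The working definition used above (creatinine > 1.5 mg/dl and/or blood urea > 80 mg/dl) is simple enough to state as a one-line rule; the snippet below is only an illustration of that threshold logic.

```python
# Renal failure flag as defined in the abstract above.
def renal_failure(creatinine_mg_dl: float, urea_mg_dl: float) -> bool:
    """True if creatinine > 1.5 mg/dl and/or blood urea > 80 mg/dl."""
    return creatinine_mg_dl > 1.5 or urea_mg_dl > 80

print(renal_failure(1.8, 60))   # True  (creatinine criterion met)
print(renal_failure(1.2, 70))   # False (neither criterion met)
```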
Pulmonary preservation studies: effects on endothelial function and pulmonary adenine nucleotides.
Paik, Hyo Chae; Hoffmann, Steven C; Egan, Thomas M
2003-02-27
Lung transplantation is an effective therapy plagued by a high incidence of early graft dysfunction, in part because of reperfusion injury. The optimal preservation solution for lung transplantation is unknown. We performed experiments using an isolated perfused rat lung model to test the effect of lung preservation with three solutions commonly used in clinical practice. Lungs were retrieved from Sprague-Dawley rats and flushed with one of three solutions: modified Euro-Collins (MEC), University of Wisconsin (UW), or low potassium dextran and glucose (LPDG), then stored cold for varying periods before reperfusion with Earle's balanced salt solution using the isolated perfused rat lung model. Outcome measures were capillary filtration coefficient (Kfc), wet-to-dry weight ratio, and lung tissue levels of adenine nucleotides and cyclic AMP. All lungs functioned well after 4 hr of storage. By 6 hr, UW-flushed lungs had a lower Kfc than LPDG-flushed lungs. After 8 hr of storage, only UW-flushed lungs had a measurable Kfc. Adenine nucleotide levels were higher in UW-flushed lungs after prolonged storage. Cyclic AMP levels correlated with Kfc in all groups. Early changes in endothelial permeability seemed to be better attenuated in lungs flushed with UW compared with LPDG or MEC; this was associated with higher amounts of adenine nucleotides. MEC-flushed lungs failed earlier than LPDG-flushed or UW-flushed lungs. The content of the solution may be more important for lung preservation than whether the ionic composition is intracellular or extracellular.
Iida, Shoichi; Tsuda, Hidetoshi; Tanaka, Toshiaki; Kish, Danielle D; Abe, Toyofumi; Su, Charles A; Abe, Ryo; Tanabe, Kazunari; Valujskikh, Anna; Baldwin, William M; Fairchild, Robert L
2016-03-15
Reperfusion of organ allografts induces a potent inflammatory response that directs rapid memory T cell, neutrophil, and macrophage graft infiltration and their activation to express functions mediating graft tissue injury. The role of cardiac allograft IL-1 receptor (IL-1R) signaling in this early inflammation and the downstream primary alloimmune response was investigated. When compared with complete MHC-mismatched wild-type cardiac allografts, IL-1R(-/-) allografts had marked decreases in endogenous memory CD8 T cell and neutrophil infiltration and expression of proinflammatory mediators at early times after transplant, whereas endogenous memory CD4 T cell and macrophage infiltration was not decreased. IL-1R(-/-) allograft recipients also had marked decreases in de novo donor-reactive CD8, but not CD4, T cell development to IFN-γ-producing cells. CD8 T cell-mediated rejection of IL-1R(-/-) cardiac allografts took 3 wk longer than wild-type allografts. Cardiac allografts from reciprocal bone marrow reconstituted IL-1R(-/-)/wild-type chimeric donors indicated that IL-1R signaling on graft nonhematopoietic-derived, but not bone marrow-derived, cells is required for the potent donor-reactive memory and primary CD8 T cell alloimmune responses observed in response to wild-type allografts. These studies implicate IL-1R-mediated signals by allograft parenchymal cells in generating the stimuli-provoking development and elicitation of optimal alloimmune responses to the grafts. Copyright © 2016 by The American Association of Immunologists, Inc.
Bradley, S P; Pahari, M; Uknis, M E; Rastellini, C; Cicalese, L
2006-01-01
The cellular and histological events that occur during the regeneration process in invertebrates have been studied in the field of visceral regeneration. We would like to explore the molecular aspects of the regeneration process in the small intestine. The aim of this study was to characterize the gene expression profiles of the intestinal graft to identify which genes may have a role in regeneration of graft tissue posttransplant. In a patient undergoing living related small bowel transplantation (LRSBTx) in our institution, mucosal biopsies were obtained from the recipient intestine and donor graft at the time of transplant and at weeks 1, 2, 3, and 6 posttransplant. Total RNA was isolated from sample biopsies followed by gene expression profiles determined from the replicate samples (n = 3) for each biopsy using the Affymetrix U133 Plus 2.0 Human GeneChip set. Two profiles were obtained from the data. One profile showed a rapid increase of 45 genes by week 1 after transplant, with significant changes (P < .05) greater than threefold, including the chemokine CXC9 and the glutathione-related stress factors GPX2 and GSTA4. The second profile identified 133 genes that were significantly decreased threefold or greater by week 1 after transplant, including UCC1, the human homolog of the Ependymin gene. We have identified two gene expression profiles representing early graft responses to small bowel transplantation. These profiles will serve to identify and study those genes whose products may play a role in accelerating tissue regeneration following segmental LRSBTx.
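The two expression profiles above come from a fold-change-plus-significance screen. A minimal sketch of that kind of filter is shown below; the expression file, column names, and replicate layout are hypothetical, and a real microarray analysis would normally add normalization and multiple-testing correction.

```python
# Flag genes whose mean expression changes >= 3-fold between paired time
# points with p < 0.05 (unadjusted), mirroring the screen described above.
import pandas as pd
from scipy.stats import ttest_ind

expr = pd.read_csv("expression_matrix.csv", index_col=0)  # genes x samples (hypothetical)
pre_cols = ["pre_1", "pre_2", "pre_3"]    # replicates at time of transplant
wk1_cols = ["wk1_1", "wk1_2", "wk1_3"]    # replicates at week 1 post-transplant

fold = expr[wk1_cols].mean(axis=1) / expr[pre_cols].mean(axis=1)
pvals = ttest_ind(expr[wk1_cols], expr[pre_cols], axis=1).pvalue

up = expr.index[(fold >= 3) & (pvals < 0.05)]        # analogous to the 45-gene profile
down = expr.index[(fold <= 1 / 3) & (pvals < 0.05)]  # analogous to the 133-gene profile
print(len(up), "genes up,", len(down), "genes down")
```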
Donor hypernatremia influences outcomes following pediatric liver transplantation.
Kaseje, Neema; Lüthold, Samuel; Mentha, Gilles; Toso, Christian; Belli, Dominique; McLin, Valérie; Wildhaber, Barbara
2013-02-01
With the rising demand for liver transplantations (LTs), and the shortage of organs, extended criteria including donor hypernatremia have been adopted to increase the donor pool. Currently, there is conflicting evidence on the effect of donor hypernatremia on outcomes following LT. Our aim was to investigate differences in outcome in patients receiving grafts from hypernatremic donors compared with patients receiving grafts from normonatremic donors in the pediatric population. We retrospectively reviewed 94 pediatric patients with LTs from 1994 to 2011. We divided the patients into two groups: patients receiving organs from donors with sodium levels < 150 mmol/L, n = 67 (group 1), and patients receiving organs from donors with sodium levels ≥ 150 mmol/L, n = 27 (group 2). Using proportions and means, we analyzed patient age, sex, weight, model for end-stage liver disease (MELD) score, primary diagnosis, emergency of procedure, intraoperative transfusion volume, cold ischemia time, donor age, graft type, and postoperative graft function. Rates of mortality, rejection, early biliary, infectious, and vascular complications were calculated. Mean age was 3.9 years in group 1 and 3.7 years in group 2 (p = 0.69). Mean weight and MELD scores were similar in the two study groups (16.0 vs. 15.9 and 21.2 vs. 22.0, respectively). There were no significant differences in mean cold ischemia times 6.4 versus 6.9 hours (p = 0.29), and mean intraoperative transfusion volumes 1,068.5 mL versus 1,068.8 mL (p = 0.89). There were no statistically significant differences in mortality rates (7.3 vs. 11.1%, p = 0.68). Prothrombin time (PT) at day 10 post-LT was significantly lower in group 2 (79 vs 64, p = 0.017), and there was a higher relative risk (RR) for early thrombotic vascular complications in group 2 (RR = 2.48); however, this was not significant (p = 0.26). No significant differences in RR for rejection (0.97, p = 0.86), viral infections (1.24, p = 0.31), bacterial infections (0.86, p = 0.62), or early biliary complications (1.03, p = 1.00) were observed. In pediatric LT patients receiving grafts from hypernatremic donors, there are no significant increases in rates of mortality, rejection, early biliary, and infectious complications. However, there is a statistically significant lower PT at postoperative day 10 following transplantation, and a more than double RR for early thrombotic vascular complications, although this was not statistically significant. Georg Thieme Verlag KG Stuttgart · New York.
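The relative-risk figure quoted above (RR = 2.48 for early thrombotic vascular complications) is a ratio of two event rates. The helper below shows the arithmetic with a Wald confidence interval; the example counts are illustrative only (4/27 vs 4/67 happens to give an RR close to the reported 2.48, but the actual event counts are not stated in the abstract).

```python
# Relative risk of an event in an exposed vs an unexposed group.
import numpy as np

def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Relative risk with an approximate 95% Wald confidence interval."""
    r1 = events_exposed / n_exposed
    r0 = events_unexposed / n_unexposed
    rr = r1 / r0
    se = np.sqrt((1 - r1) / events_exposed + (1 - r0) / events_unexposed)  # SE of log(RR)
    lo, hi = np.exp(np.log(rr) + np.array([-1.96, 1.96]) * se)
    return rr, (lo, hi)

# Illustrative example: 4/27 events with donor Na >= 150 mmol/L vs 4/67 with Na < 150
rr, (lo, hi) = relative_risk(4, 27, 4, 67)
print(f"RR = {rr:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```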
Satyanarayana, G.; Marty, F.M.; Tan, C.S.
2014-01-01
BK virus (BKV), a ubiquitous human polyomavirus, usually does not cause disease in healthy individuals. BKV reactivation and disease can occur in immunosuppressed individuals, such as those who have undergone renal transplantation or hematopoietic cell transplantation (HCT). Clinical manifestations of BKV disease include graft dysfunction and failure in renal transplant recipients; HCT recipients frequently experience hematuria, cystitis, hemorrhagic cystitis (HC), and renal dysfunction. Studies of HCT patients have identified several risk factors for the development of BKV disease including myeloablative conditioning, acute graft-versus-host disease, and undergoing an umbilical cord blood (uCB) HCT. Although these risk factors indicate that alterations in the immune system are necessary for BKV pathogenesis in HCT patients, few studies have examined the interactions between host immune responses and viral reactivation in BKV disease. Specifically, having BKV immunoglobulin-G before HCT does not protect against BKV infection and disease after HCT. A limited number of studies have demonstrated BKV-specific cytotoxic T-cells in healthy adults as well as in post-HCT patients who had experienced HC. New areas of research are required for a better understanding of this emerging infectious disease post-HCT, including prospective studies examining BK viruria, viremia, and their relationship to clinical disease, a detailed analysis of urothelial histopathology, and laboratory evaluation of systemic and local cellular and humoral immune responses to BKV in patients receiving HCT from different sources, including uCB and haploidentical donors. PMID:24834968
Melgar-Lesmes, Pedro; Balcells, Mercedes; Edelman, Elazer R.
2017-01-01
Objective Liver transplantation is limited by ischemic injury, which promotes endothelial cell and hepatocyte dysfunction and eventually organ failure. We sought to understand how endothelial state determines liver recovery after hepatectomy and engraftment. Design Matrix-embedded endothelial cells (MEECs) with retained healthy phenotype or control acellular matrices were implanted in direct contact with the remaining median lobe of donor mice undergoing partial hepatectomy (70%), or in the interface between the remaining median lobe and an autograft or isograft from the left lobe in hepatectomized recipient mice. Hepatic vascular architecture, DNA fragmentation and apoptosis in the median lobe and grafts, serum markers of liver damage, and phenotype of macrophage and lymphocyte subsets in the liver after engraftment were analyzed 7 days post-op. Results Healthy MEECs create a functional vascular splice in donor and recipient liver after 70% hepatectomy in mice, protecting these livers from ischemic injury, hepatic congestion and inflammation. Macrophages recruited adjacent to the vascular nodes into the implants switched to an anti-inflammatory and regenerative M2 profile. MEECs improved liver function and the rate of liver regeneration and prevented apoptosis in donor liver lobes, autologous grafts, and allogeneic engraftment. Conclusions Implants with healthy endothelial cells rescue liver donor and recipient endothelium and parenchyma from ischemic injury after major hepatectomy and engraftment. This study highlights endothelial-hepatocyte crosstalk in hepatic repair and provides a promising new approach to improve regenerative medicine outcomes and liver transplantation. PMID:26851165
Ji, Qiang; Xia, Li Min; Shi, Yun Qing; Ma, Run Hua; Shen, Jin Qiang; Ding, Wen Jun; Wang, Chun Sheng
2017-10-10
Few studies have focused on evaluating the impact of preoperative severe left ventricular dysfunction on clinical outcomes of patients undergoing off-pump coronary artery bypass grafting surgery (OPCAB). This single-center retrospective study aimed to evaluate the impact of severe left ventricular dysfunction on in-hospital and mid-term clinical outcomes of Chinese patients undergoing first, scheduled, and isolated OPCAB surgery. From January 2010 to December 2014, 2032 eligible patients were included in this study and were divided into 3 groups: a severe group (patients with preoperative left ventricular ejection fraction (LVEF) of ≤35%, n = 128), an impaired group (patients with preoperative LVEF of 36-50%, n = 680), and a normal group (patients with preoperative LVEF of >50%, n = 1224). In-hospital and follow-up clinical outcomes were investigated and compared. Patients in the severe group compared to the other 2 groups had higher in-hospital mortality and higher incidences of low cardiac output and prolonged ventilation. Kaplan-Meier curves showed a similar cumulative follow-up survival between the severe group and the impaired group (χ2 = 1.980, log-rank p = 0.159) and between the severe group and the normal group (χ2 = 2.701, log-rank p = 0.102). Multivariate Cox regression indicated that grouping was not a significant variable related to mid-term all-cause mortality. No significant difference was found in the rate of repeat revascularization between the severe group (2.4%) and the other 2 groups. Preoperative LVEF of ≤35%, compared with LVEF of >35%, was associated with an increased risk of in-hospital death and higher incidences of postoperative low cardiac output and prolonged ventilation, but with similar mid-term all-cause mortality and repeat revascularization after OPCAB surgery.
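The abstract above combines Kaplan-Meier curves with a multivariable Cox model. A minimal sketch of the Cox step is given below; the follow-up file and covariate names (`years`, `died`, `lvef_le_35`, `age`, `diabetes`) are hypothetical placeholders for whatever adjustment set the study actually used.

```python
# Sketch of a multivariable Cox proportional-hazards model for mid-term
# all-cause mortality after OPCAB, with LVEF group as a covariate.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("opcab_followup.csv")   # hypothetical follow-up table
cols = ["years", "died", "lvef_le_35", "age", "diabetes"]  # placeholder columns

cph = CoxPHFitter()
cph.fit(df[cols], duration_col="years", event_col="died")
cph.print_summary()   # hazard ratios (exp(coef)) with 95% CIs and p-values
```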
Han, Chuanlai; Fu, Rong; Lei, Weifu
2018-07-01
According to clinical investigations, early postoperative cognitive dysfunction is the most common adverse event in pediatric patients after tonsillectomy. A previous study has indicated that dexmedetomidine (DEX) is an efficient drug for the treatment of postoperative cognitive dysfunction. However, the efficacy of DEX in alleviating early postoperative cognitive dysfunction in pediatric patients following tonsillectomy has remained elusive, which was therefore assessed in the present study. A total of 186 children presenting with cognitive dysfunction subsequent to tonsillectomy were recruited to analyze the efficacy of DEX. Patients were randomly divided into two groups and received intravenous treatment with DEX (n=112) or placebo (n=74). Duration of treatment, dose-limiting toxicities (DLT) and maximum tolerated dose (MTD) of DEX were evaluated in a preliminary experiment. The improvement of postoperative cognitive function in children with tonsillectomy was analyzed with a Mini-Mental State Examination (MMSE) following treatment with DEX. A 40-item quality of life (MONEX-40) questionnaire was used to assess the efficacy of DEX. The plasma levels of interleukin (IL)-6, IL-1, tumor necrosis factor (TNF)-α, superoxide dismutase (SOD), neuron-specific enolase (NSE), C-reactive protein (CRP), cortisol and melatonin were also analyzed. The preliminary experiment determined that the DLT was 10 mg/kg and the MTD was 15 mg/kg. In the major clinical trial, it was revealed that MMSE scores in the DEX treatment group were markedly improved, indicating that DEX had a beneficial effect in pediatric patients with early postoperative cognitive dysfunction after tonsillectomy. In addition, IL-1 and TNF-α were downregulated, while IL-6 and SOD were upregulated in patients with cognitive dysfunction after treatment with DEX compared with those in the placebo group. Furthermore, DEX treatment markedly decreased the serum levels of CRP, NSE, cortisol and melatonin, which are associated with the occurrence of postoperative cognitive dysfunction in pediatric patients following tonsillectomy. In conclusion, intravenous administration of DEX at a dose of 10 mg/kg improves postoperative cognitive function in pediatric patients with tonsillectomy by decreasing the serum levels of inflammatory factors and stress-associated signaling molecules. Trial registration no. QLSDHOS0200810102C (Qilu Hospital of Shandong University, Jinan, China).
Impact of post-kidney transplant parathyroidectomy on allograft function
Parikh, Samir; Nagaraja, Haikady; Agarwal, Anil; Samavedi, Srinivas; Von Visger, Jon; Nori, Uday; Andreoni, Kenneth; Pesavento, Todd; Singh, Neeraj
2013-01-01
Background The impact of parathyroidectomy on allograft function in kidney transplant patients is unclear. Methods We conducted a retrospective, observational study of all kidney transplant recipients from 1988 to 2008 who underwent parathyroidectomy for uncontrolled hyperparathyroidism (n = 32). Post-parathyroidectomy, changes in estimated glomerular filtration rate (eGFR) and graft loss were recorded. Cross-sectional associations at baseline between eGFR and serum calcium, phosphate, and parathyroid hormone (PTH), and associations between their changes within subjects during the first two months post-parathyroidectomy were assessed. Results Post-parathyroidectomy, the mean eGFR declined from 51.19 mL/min/1.73 m2 at parathyroidectomy to 44.78 mL/min/1.73 m2 at two months (p < 0.0001). Subsequently, graft function improved, and by 12 months, mean eGFR recovered to 49.76 mL/min/1.73 m2 (p = 0.035). Decrease in serum PTH was accompanied by a decrease in eGFR (p = 0.0127) in the first two months post-parathyroidectomy. Patients whose eGFR declined by ≥ 20% (group 1) in the first two months post-parathyroidectomy were distinguished from the patients whose eGFR declined by <20% (group 2). The two groups were similar except that group 1 had a higher baseline mean serum PTH compared with group 2, although not significant (1046.7 ± 1034.2 vs. 476.6 ± 444.9, p = 0.14). In group 1, eGFR declined at an average rate of 32% (p < 0.0001) during the first month post-parathyroidectomy compared with 7% (p = 0.1399) in group 2, and the difference between these two groups was significant (p = 0.0003). The graft function recovered in both groups by one yr. During median follow-up of 66.00 ± 49.45 months, 6 (18%) patients lost their graft with a mean time to graft loss from parathyroidectomy of 37.2 ± 21.6 months. The causes of graft loss were rejection (n = 2), pyelonephritis (n = 1) and chronic allograft nephropathy (n = 3). No graft loss occurred during the first year post-surgery. Conclusion Parathyroidectomy may lead to transient kidney allograft dysfunction with eventual recovery of graft function by 12 months post-parathyroidectomy. Higher level of serum PTH pre-parathyroidectomy is associated with a more profound decrease in eGFR post-parathyroidectomy. PMID:23448282
Spear, Rafaelle; Sobocinski, Jonathan; Hertault, Adrien; Delloye, Matthieu; Azzauiu, Richard; Fabre, Dominique; Haulon, Stéphan
2018-04-01
To evaluate the outcomes of the second generation BeGraft balloon expandable covered stent Graft System (Bentley InnoMed, Hechingen, Germany) implanted as bridging stent grafts during fenestrated endovascular aortic repair (FEVAR) of complex aneurysms. This was a single centre prospective study including all consecutive patients treated by FEVAR performed with second generation BeGraft stent grafts as bridging stents. Demographics of patients, diameter and length of the bridging stent grafts, technical success, re-interventions, occlusions, post-operative events, and imaging (Cone Beam CT and/or CT scan, and contrast enhanced ultrasound) were prospectively collected in an electronic database. Duplex ultrasound was performed before discharge and at 6 month follow-up. At 1 year, patients were evaluated clinically and by imaging (CT and ultrasound). Between November 2015 and September 2016, 39 consecutive patients (one woman) were treated with custom made fenestrated endografts (2-5 fenestrations) for complex aneurysms or type 1 endoleak after EVAR, using a variety of bridging stents including the BeGraft. All 101 BeGraft stent grafts were successfully delivered and deployed. There was no in hospital mortality. Early fenestration patency rate was 99% (96/97); the sole target vessel post-operative occlusion was secondary to a dissection of the renal artery distal to the stent. Complementary stenting was unsuccessful in recovering renal artery patency; bilateral renal stent occlusion was observed in the same patient on a CT scan performed 2 months after the procedure. He required post-operative dialysis. No additional renal impairment was observed. During follow-up (median 13 months [11-15]), all fenestrations stented with BeGraft stent grafts remained patent (95/97, 98%). One type 1b endoleak was detected and treated (2.6%). BeGraft stent grafts used as bridging stents during FEVAR are associated with favourable outcomes at 1 year follow-up. Long-term follow-up is required to confirm these promising results. Copyright © 2018 European Society for Vascular Surgery. Published by Elsevier B.V. All rights reserved.
[Buccal mucosa graft for the treatment of long ureteral stenosis: Bibliographic review.
Del Pozo Jiménez, Gema; Castillón-Vela, Ignacio; Carballido Rodríguez, Joaquín
2017-05-01
To perform a literature review on the use of buccal mucosa graft (BMG) in the treatment of extensive ureteral stenosis, according to the criteria of Evidence Based Medicine. PubMed search of published studies with the following keywords: "ureteral stricture treatment", "buccal mucosa graft ureteral treatment" and "buccal mucosa graft ureteroplasty", without time limits, in English and Spanish; 12 articles were identified with a total of 48 cases (46 patients) of BMG use in ureteral repair. The main etiologies of ureteral stenosis where BMG has been applied have been iatrogenic and inflammatory strictures. This graft has been used mainly in proximal or middle ureteral stenosis, as a patch according to the onlay technique or as a tubularized graft. Early and late complications of the procedure have been reported in 16.7% and 10.4%, respectively, with a restenosis rate of 6.25%. A 91.6% success rate was observed with this technique, with an average follow-up time of 22 (3-85) months. The findings of the present review do not justify the universal use of BMG in all ureteral strictures, particularly in the absence of long-term follow-up, but still provide evidence that BMG can be effectively used in extensive ureteral strictures.
Tordoir, J H; Hofstra, L; Leunissen, K M; Kitslaar, P J
1995-04-01
The purpose of this study was to evaluate the results and complications of standard ePTFE versus stretch ePTFE AV fistulas. Prospective randomised trial. University Hospital. During a 2-year period 37 patients received 17 stretch and 20 standard ePTFE graft AV fistulas. Patients were evaluated for the occurrence of complications and graft patency. Regular Duplex scans were performed to detect stenoses in the fistula circuit. Thrombotic events occurred in 40% of the standard ePTFE grafts, compared to 12% of the stretch ePTFE prostheses (p < 0.001). The incidence of puncture complications was similar in both groups. The cumulative primary patency rate in the stretch ePTFE group was significantly higher compared to the standard ePTFE group (1-year patency rates of 59% and 29%, respectively; p < 0.01). No differences in the duration of puncture site bleeding were observed. Duplex scanning showed a significantly greater number of stenoses in the standard ePTFE grafts. The new stretch ePTFE prosthesis has better primary patency rates and less stenoses due to intimal hyperplasia as compared to standard ePTFE grafts.
Liu, Chao; Han, Jian-ge
2015-02-01
The high incidence of postoperative cognitive dysfunction (POCD) after extracorporeal circulation seriously affects prognosis and quality of life. Its mechanism may involve the inflammatory response and oxidative stress, the excessive phosphorylation of tau protein, and decreased blood volume and oxygen in the cerebral cortex. Appropriate early warning indicators of POCD after extracorporeal circulation should be chosen to facilitate cross-validation of the results obtained by different technical approaches and thus promote the early diagnosis and treatment of POCD.
Weiss, Scott L; Balamuth, Fran; Hensley, Josey; Fitzgerald, Julie C; Bush, Jenny; Nadkarni, Vinay M; Thomas, Neal J; Hall, Mark; Muszynski, Jennifer
2017-09-01
The epidemiology of in-hospital death after pediatric sepsis has not been well characterized. We investigated the timing, cause, mode, and attribution of death in children with severe sepsis, hypothesizing that refractory shock leading to early death is rare in the current era. Retrospective observational study. Emergency departments and ICUs at two academic children's hospitals. Seventy-nine patients less than 18 years old treated for severe sepsis/septic shock in 2012-2013 who died prior to hospital discharge. None. Time to death from sepsis recognition, cause and mode of death, and attribution of death to sepsis were determined from medical records. Organ dysfunction was assessed via daily Pediatric Logistic Organ Dysfunction-2 scores for 7 days preceding death with an increase greater than or equal to 5 defined as worsening organ dysfunction. The median time to death was 8 days (interquartile range, 1-12 d) with 25%, 35%, and 49% of cumulative deaths within 1, 3, and 7 days of sepsis recognition, respectively. The most common cause of death was refractory shock (34%), then multiple organ dysfunction syndrome after shock recovery (27%), neurologic injury (19%), single-organ respiratory failure (9%), and nonseptic comorbidity (6%). Early deaths (≤ 3 d) were mostly due to refractory shock in young, previously healthy patients while multiple organ dysfunction syndrome predominated after 3 days. Mode of death was withdrawal in 72%, unsuccessful cardiopulmonary resuscitation in 22%, and irreversible loss of neurologic function in 6%. Ninety percent of deaths were attributable to acute or chronic manifestations of sepsis. Only 23% had a rise in Pediatric Logistic Organ Dysfunction-2 that indicated worsening organ dysfunction. Refractory shock remains a common cause of death in pediatric sepsis, especially for early deaths. Later deaths were mostly attributable to multiple organ dysfunction syndrome, neurologic, and respiratory failure after life-sustaining therapies were limited. A pattern of persistent, rather than worsening, organ dysfunction preceded most deaths.
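The "worsening organ dysfunction" rule above (a rise in PELOD-2 of at least 5 over the 7 days preceding death) can be expressed as a small check. One reading of the rule, comparing each later score with the first score of the window, is sketched below; that choice of comparison point is an assumption, and the scores are invented.

```python
# Flag worsening organ dysfunction: any PELOD-2 score in the window that
# exceeds the window's first score by >= 5 points (one possible reading).
def worsening_organ_dysfunction(daily_pelod2, threshold=5):
    baseline = daily_pelod2[0]
    return any(score - baseline >= threshold for score in daily_pelod2[1:])

print(worsening_organ_dysfunction([6, 7, 7, 8, 9, 12, 13]))    # True  (rise of 7)
print(worsening_organ_dysfunction([10, 9, 9, 8, 10, 11, 10]))  # False (rise of 1)
```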
Marui, Akira; Kimura, Takeshi; Nishiwaki, Noboru; Mitsudo, Kazuaki; Komiya, Tatsuhiko; Hanyu, Michiya; Shiomi, Hiroki; Tanaka, Shiro; Sakata, Ryuzo
2014-10-01
Coronary heart disease is a major risk factor for left ventricular (LV) systolic dysfunction. However, limited data are available regarding long-term benefits of percutaneous coronary intervention (PCI) in the era of drug-eluting stent or coronary artery bypass grafting (CABG) in patients with LV systolic dysfunction with severe coronary artery disease. We identified 3,584 patients with 3-vessel and/or left main disease of 15,939 patients undergoing first myocardial revascularization enrolled in the CREDO-Kyoto PCI/CABG Registry Cohort-2. Of them, 2,676 patients had preserved LV systolic function, defined as an LV ejection fraction (LVEF) of >50% and 908 had impaired LV systolic function (LVEF≤50%). In patients with preserved LV function, 5-year outcomes were not different between PCI and CABG regarding propensity score-adjusted risk of all-cause and cardiac deaths. In contrast, in patients with impaired LV systolic function, the risks of all-cause and cardiac deaths after PCI were significantly greater than those after CABG (hazard ratio 1.49, 95% confidence interval 1.04 to 2.14, p=0.03 and hazard ratio 2.39, 95% confidence interval 1.43 to 3.98, p<0.01). In both patients with moderate (35%
Pedraza, Eileen; Coronel, Maria M.; Fraker, Christopher A.; Ricordi, Camillo; Stabler, Cherie L.
2012-01-01
A major hindrance in engineering tissues containing highly metabolically active cells is the insufficient oxygenation of these implants, which results in dying or dysfunctional cells in portions of the graft. The development of methods to increase oxygen availability within tissue-engineered implants, particularly during the early engraftment period, would serve to allay hypoxia-induced cell death. Herein, we designed and developed a hydrolytically activated oxygen-generating biomaterial in the form of polydimethylsiloxane (PDMS)-encapsulated solid calcium peroxide, PDMS-CaO2. Encapsulation of solid peroxide within hydrophobic PDMS resulted in sustained oxygen generation, whereby a single disk generated oxygen for more than 6 wk at an average rate of 0.026 mM per day. The ability of this oxygen-generating material to support cell survival was evaluated using a β cell line and pancreatic rat islets. The presence of a single PDMS-CaO2 disk eliminated hypoxia-induced cell dysfunction and death for both cell types, resulting in metabolic function and glucose-dependent insulin secretion comparable to that in normoxic controls. A single PDMS-CaO2 disk also sustained enhanced β cell proliferation for more than 3 wk under hypoxic culture conditions. Incorporation of these materials within 3D constructs illustrated the benefits of these materials to prevent the development of detrimental oxygen gradients within large implants. Mathematical simulations permitted accurate prediction of oxygen gradients within 3D constructs and highlighted conditions under which supplementation of oxygen tension would serve to benefit cellular viability. Given the generality of this platform, the translation of these materials to other cell-based implants, as well as ischemic tissues in general, is envisioned. PMID:22371586
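The quoted average rate (0.026 mM per day for more than 6 weeks) can be turned into a rough cumulative figure. The arithmetic below is only a back-of-the-envelope illustration; the 10 mL medium volume is an assumed value, not one from the study, and it ignores any oxygen consumed by cells.

```python
# Rough cumulative oxygen release from a single PDMS-CaO2 disk.
rate_mM_per_day = 0.026      # average generation rate quoted in the abstract
days = 6 * 7                 # roughly 6 weeks of sustained release
volume_L = 0.010             # assumed 10 mL of culture medium (illustrative)

cumulative_mM = rate_mM_per_day * days               # concentration if nothing is consumed
total_umol = cumulative_mM / 1000 * volume_L * 1e6   # mmol/L -> mol, then to umol
print(f"~{cumulative_mM:.2f} mM cumulative, ~{total_umol:.1f} umol O2 in 10 mL")
```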
[Ultrasonic methods and semiotics in patients with vasculogenic erectile dysfunction].
Zhukov, O B; Zubarev, A R
2001-01-01
The authors have developed criteria for ultrasonic assessment of the cavernous bodies and of arterial and venous circulation in normal penile vessels and in erectile dysfunction in 125 patients; describe modern ultrasound modalities for the differential diagnosis of various forms of vasculogenic erectile dysfunction based on experience with 92 patients; and validate the hydrodynamic role of the tunica albuginea in the pathogenesis of venocorporal dysfunction and pathological venous drainage. Early ischemic signs of arterial insufficiency were revealed.
Nenezić, Dragoslav; Pandaitan, Simon; Ilijevski, Nenad; Matić, Predrag; Gajin, Predag; Radak, Dorde
2005-01-01
Although the incidence of prosthetic infection is low (1%-6%), the consequences (limb loss or death) are dramatic for the patient, with a high mortality rate (25%-75%) and limb loss in 40%-75% of cases. In case of Szilagyi's grade III infection, the standard procedure consists of excision of the prosthesis and wound debridement. An alternative method is medical treatment. This is a case report of a patient with prosthetic infection of a Silver-ring graft, used for femoropopliteal reconstruction, in whom extreme skin necrosis developed in the early postoperative period. This complication was successfully treated medically. After repeated debridement and wound-packing, the wound was covered using a Thiersch skin graft.
Bartley, Patricia; Angelakis, Emmanouil; Raoult, Didier; Sampath, Rangarajan; Bonomo, Robert A.
2016-01-01
Identifying the pathogen responsible for culture-negative valve endocarditis often depends on molecular studies performed on surgical specimens. A patient with Ehlers-Danlos syndrome who had an aortic graft, a mechanical aortic valve, and a mitral annuloplasty ring presented with culture-negative prosthetic valve endocarditis and aortic graft infection. Research-based polymerase chain reaction (PCR)/electrospray ionization mass spectrometry on peripheral blood samples identified Bartonella henselae. Quantitative PCR targeting the 16S-23S ribonucleic acid intergenic region and Western immunoblotting confirmed this result. This, in turn, permitted early initiation of pathogen-directed therapy and subsequent successful medical management of B henselae prosthetic valve endocarditis and aortic graft infection. PMID:27844027
Management Options in Avascular Necrosis of Talus
Dhillon, Mandeep S; Rana, Balvinder; Panda, Inayat; Patel, Sandeep; Kumar, Prasoon
2018-01-01
Avascular necrosis (AVN) of the talus can be a cause of significant disability and is a difficult problem to treat. The most common cause is a fracture of the talus. We have done a systematic review of the literature with the following aims: (1) identify and summarize the available evidence in literature for the treatment of talar AVN, (2) define the usefulness of radiological Hawkins sign and magnetic resonance imaging in early diagnosis, and (3) provide patient management guidelines. We searched MEDLINE and PUBMED using keywords and MESH terminology. The articles' abstracts were read by two of the authors. Forty-one studies met the inclusion criteria of the 335 abstracts screened. The interventions of interest included hindfoot fusion, conservative measures, bone grafting, vascularized bone graft, core decompression, and talar replacement. All studies were of Level IV evidence. We looked to identify the study quality, imprecise and sparse data, reporting bias, and the quality of evidence. Based on the analysis of available literature, we make certain recommendations for managing patients of AVN talus depending on identified disease factors such as early or late presentation, extent of bone involvement, bone collapse, and presence or absence of arthritis. Early talar AVN seems best treated with protected weight bearing and possibly in combination with extracorporeal shock wave therapy. If that fails, core decompression can be considered. Arthrodesis should be saved as a salvage procedure in late cases with arthritis and collapse, and a tibiotalocalcaneal fusion with bone grafting may be needed in cases of significant bone loss. Role of vascularized bone grafting is still not defined clearly and needs further investigation. Future prospective, randomized studies are necessary to guide the conservative and surgical management of talar AVN. PMID:29887631
Experience with the first 50 ex vivo lung perfusions in clinical transplantation.
Cypel, Marcelo; Yeung, Jonathan C; Machuca, Tiago; Chen, Manyin; Singer, Lianne G; Yasufuku, Kazuhiro; de Perrot, Marc; Pierre, Andrew; Waddell, Thomas K; Keshavjee, Shaf
2012-11-01
Normothermic ex vivo lung perfusion is a novel method to evaluate and improve the function of injured donor lungs. We reviewed our experience with 50 consecutive transplants after ex vivo lung perfusion. A retrospective study using prospectively collected data was performed. High-risk brain death donor lungs (defined as Pao(2)/Fio(2) <300 mm Hg or lungs with radiographic or clinical findings of pulmonary edema) and lungs from cardiac death donors were subjected to 4 to 6 hours of ex vivo lung perfusion. Lungs that achieved stable airway and vascular pressures and Pao(2)/Fio(2) greater than 400 mm Hg during ex vivo lung perfusion were transplanted. The primary end point was the incidence of primary graft dysfunction grade 3 at 72 hours after transplantation. End points were compared with lung transplants not treated with ex vivo lung perfusion (controls). A total of 317 lung transplants were performed during the study period (39 months). Fifty-eight ex vivo lung perfusion procedures were performed, resulting in 50 transplants (86% use). Of these, 22 were from cardiac death donors and 28 were from brain death donors. The mean donor Pao(2)/Fio(2) was 334 mm Hg in the ex vivo lung perfusion group and 452 mm Hg in the control group (P = .0001). The incidence of primary graft dysfunction grade 3 at 72 hours was 2% in the ex vivo lung perfusion group and 8.5% in the control group (P = .14). One patient (2%) in the ex vivo lung perfusion group and 7 patients (2.7%) in the control group required extracorporeal lung support for primary graft dysfunction (P = 1.00). The median time to extubation, intensive care unit stay, and hospital length of stay were 2, 4, and 20 days, respectively, in the ex vivo lung perfusion group and 2, 4, and 23 days, respectively, in the control group (P > .05). Thirty-day mortality (4% in the ex vivo lung perfusion group and 3.5% in the control group, P = 1.00) and 1-year survival (87% in the ex vivo lung perfusion group and 86% in the control group, P = 1.00) were similar in both groups. Transplantation of high-risk donor lungs after 4 to 6 hours of ex vivo lung perfusion is safe, and outcomes are similar to those of conventional transplants. Ex vivo lung perfusion improved our center use of donor lungs, accounting for 20% of our current lung transplant activity. Copyright © 2012 The American Association for Thoracic Surgery. Published by Mosby, Inc. All rights reserved.
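The headline comparison above (primary graft dysfunction grade 3 in 2% vs 8.5%) is a two-proportion comparison. The sketch below shows one way to test it with Fisher's exact test; the control-group counts are back-calculated from the quoted percentages and the cohort size, so they are approximate rather than taken from the paper.

```python
# Fisher's exact test for PGD grade 3 at 72 h: EVLP vs conventional transplants.
from scipy.stats import fisher_exact

#            PGD3  no PGD3
table = [[ 1,  49],    # EVLP group: ~2% of 50 (approximate reconstruction)
         [23, 244]]    # conventional group: ~8.5% of 267 (approximate reconstruction)

_, p = fisher_exact(table)
print(f"Fisher exact p = {p:.3f}")
```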
Surgical management of retraction pockets of the pars tensa with cartilage and perichondrial grafts.
Spielmann, P; Mills, R
2006-09-01
Stable, self-cleansing retraction pockets of the pars tensa are common incidental findings and require no treatment. In other cases, recurrent discharge occurs and there may also be associated conductive hearing loss. In a minority of cases, cholesteatoma may develop. This paper presents the results of surgery using a graft composed of cartilage and perichondrium for retraction pockets involving the posterior half of the tympanic membrane, as well as early results using a larger graft designed to manage retraction of the entire tympanic membrane. Data on 51 patients with posterior retraction pockets are presented. Forty-two (82 per cent) patients had no aural discharge one year following surgery and the tympanic membrane was not retracted in 43 (84 per cent). The larger 'Mercedes-Benz' graft was used in four patients and the results obtained suggested that it may prove a successful technique for extensive retraction pockets.
Delgado, Julio C; Pavlov, Igor Y; Shihab, Fuad S
2009-12-01
Levels of sCD30 represent a biomarker for early outcome in kidney transplantation. The purpose of this study was to determine the role of sCD30 levels for prediction of graft loss in the late post-transplant period. Sera were collected immediately pre-transplant and yearly thereafter for up to 5-year post-transplant in 37 primary renal transplant recipients. Levels of serum sCD30 were tested using a fluorescent microsphere assay. Levels of sCD30 significantly decreased after transplantation and remained normal in 34 patients without graft loss up to 5-year post-transplant. Elevated levels of serum sCD30 preceded the increase of serum creatinine in patients with subsequent graft loss. Elevated levels of serum sCD30 post-transplant might be a marker for predicting subsequent graft loss in the post-transplant period.
Fonseca, Ana; Monteiro, Fabiana; Canavarro, Maria Cristina
2018-06-06
This study aimed to examine the relationship between dysfunctional motherhood-related beliefs and postpartum anxiety and depression symptoms, and whether experiential avoidance may be a potential mechanism in explaining these relationships. A sample of 262 postpartum women participated in a cross-sectional online survey. The model presented a good fit (CFI = 0.96, RMSEA = 0.077), suggesting that more dysfunctional motherhood-related beliefs related to maternal responsibility and to others' judgments were associated with higher postpartum anxiety and depressive symptoms. Indirect effects through experiential avoidance were also found. Dysfunctional motherhood-related beliefs are cognitive vulnerabilities for postpartum psychological disorders and should be assessed to identify women who may be candidates for early interventions. Moreover, dysfunctional beliefs seem to affect psychopathological symptoms by activating experiential avoidance strategies (e.g., rumination), which may accentuate the frequency of women's negative thoughts and emotions. Early interventions should target the promotion of acceptance of private negative experiences (psychological flexibility). © 2018 Wiley Periodicals, Inc.
Barbosa, José Augusto A; Rodrigues, Alexandre B; Mota, Cleonice Carvalho C; Barbosa, Márcia M; Simões e Silva, Ana C
2011-01-01
Obesity is a major public health problem affecting adults and children in both developed and developing countries. This condition often leads to metabolic syndrome, which increases the risk of cardiovascular disease. A large number of studies have been carried out to understand the pathogenesis of cardiovascular dysfunction in obese patients. Endothelial dysfunction plays a key role in the progression of atherosclerosis and the development of coronary artery disease, hypertension and congestive heart failure. Noninvasive methods in the field of cardiovascular imaging, such as measurement of intima-media thickness, flow-mediated dilatation, tissue Doppler, strain, and strain rate, constitute new tools for the early detection of cardiac and vascular dysfunction. These techniques will certainly enable a better evaluation of initial cardiovascular injury and allow the correct, timely management of obese patients. The present review summarizes the main aspects of cardiovascular dysfunction in obesity and discusses the application of recent noninvasive imaging methods for the early detection of cardiovascular alterations.
Hyaluronic acid enhancement of expanded polytetrafluoroethylene for small diameter vascular grafts
NASA Astrophysics Data System (ADS)
Lewis, Nicole R.
Cardiovascular disease is the leading cause of mortality and morbidity in the United States and other developed countries. In the United States alone, 8 million people are diagnosed with peripheral arterial disease per year and over 250,000 patients have coronary bypass surgery each year. Autologous blood vessels are the standard graft used in small diameter (<6 mm) arterial bypass procedures. Synthetic small diameter grafts have had limited success. While polyethylene terephthalate (Dacron) and expanded polytetrafluoroethylene (ePTFE) are the most commonly used small diameter synthetic vascular graft materials, there are significant limitations that make these materials unfavorable for use in the low blood flow conditions of the small diameter arteries. Specifically, Dacron and ePTFE grafts display failure due to early thrombosis or late intimal hyperplasia. With the shortage of tissue donors and the limited supply of autologous blood vessels available, there is a need for a small diameter synthetic vascular graft alternative. The aim of this research is to create and characterize ePTFE grafts prepared with hyaluronic acid (HA), evaluate the thrombogenic potential of ePTFE-HA grafts, and evaluate graft mechanical properties and coating durability. The results in this work indicate the successful production of ePTFE-HA materials using a solvent infiltration technique. Surface interactions with blood show increased platelet adhesion on HA-modified surfaces, though evidence may suggest less platelet activation and erythrocyte lysis. Significant changes in mechanical properties of HA-modified ePTFE materials were observed. Further investigation into solvent selection, uniformity of HA, endothelialization, and dynamic flow testing would be beneficial in the evaluation of these materials for use in small diameter vascular graft bypass procedures.
Kattimani, Vivekanand S; Chakravarthi, Srinivas P; Neelima Devi, K Naga; Sridhar, Meka S; Prasad, L Krishna
2014-01-01
Bone grafts are frequently used in the treatment of bone defects. Bone harvesting can cause postoperative complications and sometimes does not provide a sufficient quantity of bone. Therefore, synthetic biomaterials have been investigated as an alternative to autogenous bone grafts. The aim of this study was to evaluate and compare bovine-derived hydroxyapatite (BHA) and synthetic hydroxyapatite (SHA) graft material as bone graft substitutes in maxillary cystic bony defects. Patients were analyzed by computerized densitometric study and digital radiography. In this study, 12 patients in each group were included randomly after clinical and radiological evaluation. The integration of hydroxyapatite was assessed from mean bone density, surgical site margin, and radiological bone formation characteristics of the successful graft cases using computer densitometry and radiovisiography. Statistical analysis was carried out using the Mann-Whitney U-test, Wilcoxon matched-pairs test, and paired t-test. By the end of the 24th week, the grafted defects radiologically and statistically showed similar volumes of bone formation. However, significant changes were observed in bone formation and in merging of the material with the surgical site margin from the 1st week to the 1st month. The results were significant and correlated across all parameters, showing the value of grafting for early bone formation. However, the bone formation pattern differed between the BHA and SHA groups at the 3rd-month interval, with a significant P value. Both BHA and SHA graft materials are biocompatible for filling bone defects, showing less resorption and enhanced bone formation with similar efficacy. Our study showed maximum bone healing within 12 weeks of grafting of defects. BHA is economical; however, the price difference between the two is very nominal.
Honda, Kentaro; Okamura, Yoshitaka; Nishimura, Yoshiharu; Uchita, Shunji; Yuzaki, Mitsuru; Kaneko, Masahiro; Yamamoto, Nobuko; Kubo, Takashi; Akasaka, Takashi
2015-06-01
To evaluate the relationship between preoperative severity of coronary stenosis, assessed with fractional flow reserve (FFR), and the intraoperative bypass graft flow pattern. In all, 72 patients were enrolled in this retrospective study. The FFR value of the left anterior descending artery was evaluated, and data on "in situ" bypass grafting from the internal thoracic artery to the left anterior descending artery were assessed. Patients were divided into 3 groups according to preoperative FFR values (group S: FFR < 0.70; group M: 0.70 ≤ FFR < 0.75; and group N: FFR ≥ 0.75). In groups S, M, and N, respectively, mean graft flow was 24.7 ± 10.6 mL/minute, 19.2 ± 14.0 mL/minute, and 16.0 ± 9.7 mL/minute; pulsatility index was 2.35 ± 0.6, 3.02 ± 1.1, and 5.51 ± 8.20; and the number of patients with systolic reverse flow was 3 (6.8%), 5 (35.7%), and 4 (28.6%). Significant differences were observed in graft flow (P = .009), pulsatility index (P = .038), and proportion of systolic reverse flow (P = .023) among the 3 groups. In all patients, graft patency was confirmed with intraoperative fluorescence imaging; postoperative graft patency was confirmed with multislice computed tomography or coronary angiography in 69 patients (follow-up interval: 213 days). Early graft failure occurred in 1 patient. As coronary stenosis severity increased, graft flow increased, pulsatility index decreased, and the proportion of patients with systolic reverse flow decreased. In mild coronary artery stenosis, the chance of flow competition between the native coronary artery and the bypass graft increased. Copyright © 2015 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
Causes of graft failure in simultaneous pancreas-kidney transplantation by various time periods.
Wakil, Kayo; Sugawara, Yasuhiko; Kokudo, Norihiro; Kadowaki, Takashi
2013-01-01
Data collected by the United Network for Organ Sharing from all approved United States transplant programs were analyzed. The data included 26,572 adult diabetic patients who received a primary pancreas transplant between January 1987 and December 2012. Simultaneous pancreas-kidney (SPK) transplantation was the major therapeutic option for diabetes patients. SPK had better graft survival than pancreas transplant alone (PTA), pancreas-after-kidney (PAK), or pancreas-with-kidney from a living donor (PWK). The 5-year pancreas graft survival rates for SPK, PWK, PAK, and PTA were 70.0%, 57.2%, 54.0%, and 48.2%, respectively. When long-term SPK pancreas graft survival was examined by various transplant time periods, it was found that survival has remained almost stable since 1996. Graft survival rates were high among the pancreas recipients transplanted in the periods 1996-2000, 2001-2005, and 2006-2012, and the rates were similar: the 5-year rates were 68.9%, 72.4%, and 73.8%, respectively. Technical failure was the leading cause of graft loss during the first year post-transplant, regardless of period: 61.3%, 68.6%, 64.2%, and 71.9% for 1987-1995, 1996-2000, 2001-2005, and 2006-2012, respectively. After one year, chronic rejection was the leading cause of graft loss in all periods: 51.8%, 53.2%, 44.3%, and 40.7% for 1987-1995, 1996-2000, 2001-2005, and 2006-2012, respectively. Chronic rejection accounted for around 50% (or more) of losses among grafts that had survived over five years. Survival of long-term pancreas grafts as well as long-term causes of graft loss remained almost unchanged across the different transplant periods. Clearly, there is a need for a means to identify early markers of chronic rejection, and to control it to improve long-term survival.
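As a rough illustration of how era-stratified graft survival figures such as those above can be reproduced, the following minimal Python sketch fits Kaplan-Meier curves by transplant era with the lifelines library; the toy data frame and its column names (years_to_graft_loss, graft_lost, era) are hypothetical and not drawn from the registry.

```python
# Minimal sketch (not the registry analysis itself): Kaplan-Meier graft
# survival estimated separately for each transplant era.
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.DataFrame({
    "years_to_graft_loss": [1.2, 5.0, 3.4, 5.0, 0.3, 4.1, 5.0, 2.2],
    "graft_lost":          [1,   0,   1,   0,   1,   1,   0,   1],
    "era": ["1996-2000", "1996-2000", "2001-2005", "2001-2005",
            "2006-2012", "2006-2012", "1987-1995", "1987-1995"],
})

kmf = KaplanMeierFitter()
for era, grp in df.groupby("era"):
    kmf.fit(grp["years_to_graft_loss"], event_observed=grp["graft_lost"], label=era)
    # estimated 5-year graft survival for this era
    print(era, float(kmf.predict(5.0)))
```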
Nevins, Myron; Nevins, Marc L; Schupbach, Peter; Kim, Soo-Woo; Lin, Zhao; Kim, David M
2013-04-01
Many patients and clinicians would prefer a synthetic particulate bone replacement graft, but most available alloplastic biomaterials have limited osteogenic potential. An alloplast with increased regenerative capacity would be advantageous for the treatment of localized alveolar ridge defects. This prospective, randomized controlled preclinical trial utilized 6 female foxhounds to analyze the osteogenic impact of different formulations of biphasic calcium phosphate (BCP) in combination with an hydroxyapatite-collagen membrane and their ability to reconstruct deficient alveolar ridges for future implant placement. The grafted sites were allowed to heal 3 months, and then trephine biopsies were obtained to perform light microscopic and histomorphometric analyses. All treated sites healed well with no early membrane exposure or adverse soft tissue responses during the healing period. The grafted sites exhibited greater radiopacity than the surrounding native bone with BCP particles seen as radiopaque granules. The graft particles appeared to be well-integrated and no areas of loose particles were observed. Histologic evaluation demonstrated BCP particles embedded in woven bone with dense connective tissue/marrow space. New bone growth was observed around the graft particles as well as within the structure of the graft particulate. There was intimate contact between the graft particles and newly formed bone, and graft particles were bridged by the newly formed bone in all biopsies from the tested groups. The present study results support the potential of these BCP graft particulates to stimulate new bone formation. Clinical studies are recommended to confirm these preclinical findings.
Shichinohe, Ryuji; Yamamoto, Yuhei; Kawashima, Kunihiro; Kimura, Chu; Ono, Kentaro; Horiuchi, Katsumi; Yoshida, Tetsunori; Murao, Naoki; Hayashi, Toshihiko; Funayama, Emi; Oyama, Akihiko; Furukawa, Hiroshi
Early excision and skin grafting is the principal treatment for a burned hand, although there are occasions when it cannot be done, such as a severe general condition, delayed consultation, or the lack of a definitive assessment of burn depth. This study analyzes the factors that affected function after a delayed excision and skin graft for hands with a deep dermal burn. This study retrospectively evaluated 43 burned hands that required a delayed excision and split-thickness skin graft on the dorsal side. Cases were required to only have split-thickness skin grafting from the dorsum of the hand and fingers distally to at least the proximal interphalangeal joint at least 8 days after the injury. The hands were divided into two functional categories: functional category A, normal or nearly normal joint movements, and functional category B, abnormal joint movements. Demographic data were assessed statistically by a univariate analysis followed by a multiple regression analysis with stepwise selection. A significant difference was observed between the groups in the number of days from grafting to complete wound healing of the graft site and in whether an escharotomy had been performed. These parameters were statistically significant predictors of functional category B. The functional outcome of a burned hand after a delayed excision and split-thickness skin graft on the dorsal side deteriorated as the number of days from grafting to complete wound healing increased. Cases that underwent an escharotomy also showed deterioration in function.
Impact of Procedure-Related Complications on Long-term Islet Transplantation Outcome.
Caiazzo, Robert; Vantyghem, Marie-Christine; Raverdi, Violeta; Bonner, Caroline; Gmyr, Valery; Defrance, Frederique; Leroy, Clara; Sergent, Geraldine; Hubert, Thomas; Ernst, Oliver; Noel, Christian; Kerr-Conte, Julie; Pattou, François
2015-05-01
Pancreatic islet transplantation offers a promising biotherapy for the treatment of type 1 diabetes, but this procedure has met significant challenges over the years. One such challenge is to address why primary graft function still remains inconsistent after islet transplantation. Several variables have been shown to affect graft function, but the impact of procedure-related complications on primary and long-term graft function has not yet been explored. Twenty-six patients with established type 1 diabetes were included in this study. Each patient had two to three intraportal islet infusions to obtain 10,000 islet equivalents (IEQ)/kg body weight, equaling a total of 68 islet infusions. Islet transplantation consisted of three sequential fresh islet infusions within 3 months. Islet infusions were performed surgically or under ultrasound guidance, depending on patient morphology, availability of the radiology suite, and patient medical history. Prospective assessment of adverse events was recorded and graded using the "Common Terminology Criteria for adverse events in Trials of Adult Pancreatic Islet Transplantation." There were no deaths or patient dropouts. Early complications occurred in nine of 68 procedures. The β score 1 month after the last graft and the rate of optimal graft function (β score ≥7) were significantly lower in cases of procedure-related complications (P = 0.02 and P = 0.03, respectively). Procedure-related complications negatively impacted graft function (P = 0.009) and were an independent predictive factor of long-term graft survival (P = 0.033) in multivariate analysis. Complications occurring during radiologic or surgical intraportal islet transplantation significantly impair primary graft function and graft survival regardless of their severity.
The consequence of biologic graft processing on blood interface biocompatibility and mechanics.
Van de Walle, Aurore B; Uzarski, Joseph S; McFetridge, Peter S
2015-09-01
Processing ex vivo derived tissues to reduce immunogenicity is an effective approach to create biologically complex materials for vascular reconstruction. Due to the sensitivity of small diameter vascular grafts to occlusive events, the effect of graft processing on critical parameters for graft patency, such as peripheral cell adhesion and wall mechanics, requires detailed analysis. Isolated human umbilical vein sections were used as model allogenic vascular scaffolds that were processed with either: 1. sodium dodecyl sulfate (SDS), 2. ethanol/acetone (EtAc), or 3. glutaraldehyde (Glu). Changes in material mechanics were assessed via uniaxial tensile testing. Peripheral cell adhesion to the opaque grafting material was evaluated using an innovative flow chamber that allows direct observation of the blood-graft interface under physiological shear conditions. All treatments modified the grafts' tensile strain and stiffness properties, with physiological modulus values decreasing from Glu 240±12 kPa to SDS 210±6 kPa and EtAc 140±3 kPa, P<.001. Relative to glutaraldehyde treatments, neutrophil adhesion to the decellularized grafts increased, with no statistical difference observed between SDS and EtAc treatments. Early platelet adhesion (% surface coverage) showed no statistical difference between the three treatments; however, quantification of platelet aggregates was significantly higher on SDS scaffolds compared to EtAc or Glu. Tissue processing strategies applied to the umbilical vein scaffold were shown to modify structural mechanics and cell adhesion properties, with the EtAc treatment reducing thrombotic events relative to SDS-treated samples. This approach allows time- and cost-effective prescreening of clinically relevant grafting materials to assess initial cell reactivity.
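For readers unfamiliar with how a physiological modulus is typically extracted from uniaxial tensile data, the short Python sketch below fits a line to a chosen strain window of a stress-strain curve; the sample data and the 5-15% strain window are illustrative assumptions, not the authors' protocol.

```python
# Minimal sketch (assumed, not the authors' method): estimate a tangent
# modulus from uniaxial stress-strain data by a linear fit over a strain window.
import numpy as np

strain = np.array([0.00, 0.02, 0.05, 0.08, 0.10, 0.12, 0.15, 0.20])   # dimensionless
stress_kpa = np.array([0.0, 3.0, 9.0, 16.0, 21.0, 26.0, 33.0, 46.0])  # kPa

window = (strain >= 0.05) & (strain <= 0.15)        # hypothetical "physiological" range
slope, intercept = np.polyfit(strain[window], stress_kpa[window], 1)
print(f"Estimated modulus over 5-15% strain: {slope:.0f} kPa")
```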
Townsend, S A; Monga, M A; Nightingale, P; Mutimer, D; Elsharkawy, A M; Holt, A
2017-11-01
Hepatitis C virus (HCV)-related cirrhosis remains the commonest indication for liver transplantation worldwide, yet few studies have investigated the impact of donation after circulatory death (DCD) graft use on HCV recurrence and patient outcomes. DCD grafts have augmented the limited donor organ pool and reduced wait-list mortality, although concerns regarding graft longevity and patient outcome persist. This was a single-center study of all HCV+ adults who underwent DCD liver transplantation between 2004 and 2014. In total, 44 HCV+ patients received DCD grafts and were matched with 44 HCV+ recipients of donation after brainstem death (DBD) grafts, and their outcomes were examined. The groups were matched for age, sex, and presence of hepatocellular carcinoma; no significant differences were found between the groups' donor or recipient characteristics. Paired and unpaired analyses demonstrated that HCV recurrence was more rapid in recipients of DCD organs compared with DBD grafts (408 vs 657 days; P = .006). There were no significant differences in graft survival, patient survival, or rates of biliary complications between the cohorts despite DCD donors being 10 years older on average than those used in other published experience. In an era of highly effective direct-acting antiviral therapy, rapid HCV recrudescence in grafts from DCD donors should not compromise long-term morbidity or mortality. In the context of rising wait-list mortality, it is prudent to use all available sources to expand the pool of donor organs, and our data support the practice of using extended-criteria DCD grafts based on donor age. Notwithstanding that, clinicians should be aware that HCV recrudescence is more rapid in DCD recipients, and early post-transplant antiviral therapy is indicated to prevent graft injury. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.
Minimally invasive direct coronary artery bypass grafting: a meta-analysis.
Kettering, K
2008-12-01
Recently, minimally invasive direct coronary artery bypass (MIDCAB) grafting has become an interesting alternative to conventional coronary artery bypass grafting, especially in patients with a high-grade left anterior descending coronary artery (LAD) stenosis unsuitable for balloon angioplasty. Although MIDCAB offers several advantages, such as the avoidance of sternotomy and cardiopulmonary bypass, concerns have been raised about the technical accuracy of the anastomoses that can be performed on a beating heart. Therefore, clinical and angiographic outcomes after MIDCAB are the subject of current controversy. A literature search for all published outcome studies of MIDCAB grafting was performed for the period from January 1995 through October 2007. Seventeen articles were included in this meta-analysis. The data presented in the studies were analyzed with regard to clinical and angiographic results. Early and late (>30 days after MIDCAB) death rates were 1.3% (51/4081 patients) and 3.2% (130/4081 patients), respectively. The infarct rate was 0.8% (32/4081 patients; non-fatal myocardial infarction). Other minor or major complications (e.g. reoperation for management of bleeding, chest wound problems, arrhythmias, cerebrovascular accident, pericardial effusion, pulmonary complications) were reported in 781 cases. The conversion rate to sternotomy/cardiopulmonary bypass was 1.8% (74/4081 patients). A re-intervention due to graft failure was necessary in 134/4081 patients (3.3%). A total of 2556 grafts were studied angiographically immediately after surgery. One hundred and six grafts (4.2%) were occluded and 169 grafts (6.6%) had a significant stenosis (50-99%). At 6-month follow-up, 445 grafts were studied angiographically. Sixteen grafts (3.6%) were occluded and 32 grafts (7.2%) had a significant stenosis. Clinical outcomes and immediate graft patency after MIDCAB are acceptable. However, long-term follow-up results and further randomized prospective clinical trials comparing this new technique with standard revascularization procedures in large patient cohorts are needed.
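The pooled event rates quoted above can be approximated by simple aggregation across studies; the Python sketch below shows a naive fixed-pooling calculation with a normal-approximation confidence interval, using invented per-study counts rather than the reviewed data, and it ignores the weighting and heterogeneity adjustments a formal meta-analysis would apply.

```python
# Minimal sketch (assumed, not the paper's method): pool an event rate across
# studies and attach a crude normal-approximation 95% confidence interval.
import math

studies = [  # (events, patients) per hypothetical study
    (3, 250), (1, 180), (5, 420), (2, 300),
]

events = sum(e for e, n in studies)
patients = sum(n for e, n in studies)
rate = events / patients
se = math.sqrt(rate * (1 - rate) / patients)   # crude pooled standard error
lo, hi = rate - 1.96 * se, rate + 1.96 * se
print(f"Pooled rate: {rate:.3%} (95% CI {lo:.3%} to {hi:.3%})")
```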
Mother and daughter with adolescent-onset severe frontal lobe dysfunction and epilepsy
dos Passos, Giordani Rodrigues; Fernández, Alonso Cuadrado; Vasques, Adriana Machado; Martins, William Alves; Palmini, André
2016-01-01
Familial cases of early-onset prominent frontal lobe dysfunction associated with epilepsy have not been reported to date. We report a mother and her only daughter with incapacitating behavioral manifestations of frontal lobe dysfunction and epilepsy of variable severity. The possibility of a hitherto undescribed genetic condition is discussed. PMID:29213461
Wang, Rong; Cheng, Nan; Xiao, Cang-Song; Wu, Yang; Sai, Xiao-Yong; Gong, Zhi-Yun; Wang, Yao; Gao, Chang-Qing
2017-01-01
Background: The optimal timing of surgical revascularization for patients presenting with ST-segment elevation myocardial infarction (STEMI) and impaired left ventricular function is not well established. This study aimed to examine the timing of surgical revascularization after STEMI in patients with ischemic heart disease and left ventricular dysfunction (LVD) by comparing early and late results. Methods: From January 2003 to December 2013, there were 2276 patients undergoing isolated coronary artery bypass grafting (CABG) in our institution. Two hundred and sixty-four (223 male, 41 female) patients with a history of STEMI and LVD were divided into early revascularization (ER, <3 weeks), mid-term revascularization (MR, 3 weeks to 3 months), and late revascularization (LR, >3 months) groups according to the time interval from STEMI to CABG. Mortality and complication rates were compared among the groups by Fisher's exact test. Cox regression analyses were performed to examine the effect of the time interval of surgery on long-term survival. Results: No significant differences in 30-day mortality, long-term survival, freedom from all-cause death, and rehospitalization for heart failure existed among the groups (P > 0.05). More patients in the ER group (12.90%) had low cardiac output syndrome than those in the MR (2.89%) and LR (3.05%) groups (P = 0.035). The mean follow-up times were 46.72 ± 30.65, 48.70 ± 32.74, and 43.75 ± 32.43 months, respectively (P = 0.716). Cox regression analyses showed that a severe preoperative condition (odds ratio = 7.13, 95% confidence interval 2.05–24.74, P = 0.002), rather than the time interval from myocardial infarction to CABG (P > 0.05), was a risk factor for long-term survival. Conclusions: Surgical revascularization for patients with STEMI and LVD can be performed at different times after STEMI with comparable operative mortality and long-term survival. However, ER (<3 weeks) has a higher incidence of postoperative low cardiac output syndrome. A severe preoperative condition rather than the time interval from STEMI to CABG is a risk factor for long-term survival. PMID:28218210
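As an illustration of the kind of Cox proportional hazards analysis described above, the following minimal Python sketch uses the lifelines library with a hypothetical data frame (columns months_followup, died, severe_preop, weeks_to_cabg); it is not the study's code or data.

```python
# Minimal sketch (assumed): Cox proportional hazards model of long-term
# survival with a severe-preoperative-condition flag and the STEMI-to-CABG
# interval as covariates.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "months_followup": [46, 12, 60, 33, 55, 8, 72, 40],
    "died":            [0,  1,  0,  1,  0,  1, 0,  0],
    "severe_preop":    [0,  1,  0,  0,  1,  1, 0,  0],
    "weeks_to_cabg":   [2,  1,  16, 3,  20, 2, 30, 10],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months_followup", event_col="died")
cph.print_summary()   # hazard ratios with confidence intervals
```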
Mayr, Hermann O; Dietrich, Markwart; Fraedrich, Franz; Hube, Robert; Nerlich, Andreas; von Eisenhart-Rothe, Rüdiger; Hein, Werner; Bernstein, Anke
2009-09-01
A sheep study was conducted to test a press-fit technique using microporous pure beta-tricalcium phosphate (beta-TCP) dowels for fixation of the anterior cruciate ligament (ACL) graft. Microporous (5 μm) cylindrical plugs of beta-TCP (diameter, 7 mm; length, 25 mm) with interconnecting pores were used. The material featured a novel configuration of structure and surface geometry. Implants were tested by use of press-fit fixation of ACL grafts with and without bone blocks in 42 sheep over a period of 24 weeks. Biomechanical, radiologic, histologic, and immunohistochemical evaluations were performed. In load-to-failure tests at 6, 12, and 24 weeks after surgery, the intra-articular graft always failed, not the fixation. Grafts showed bony fixation in the tunnel at 6 weeks and primary healing at the junction of the tunnel and joint after 24 weeks. Tricalcium phosphate was resorbed and simultaneously replaced by bone. Remodeling was still incomplete at 24 weeks. In the sheep model microporous beta-TCP implants used with press-fit fixation of ACL grafts permit early functional rehabilitation. After 6 weeks, the graft is fixed by woven bone or bony integration. Implanted microporous tricalcium phosphate is resorbed and replaced by bone. In a sheep model we showed that primary healing of ACL grafts with resorption and bony replacement of the fixating implant can be achieved by means of press-fit fixation with pure beta-TCP.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferro, Carlo; Rossi, Umberto G., E-mail: urossi76@hotmail.com; Seitun, Sara
The purpose of this report is to describe deployment of the Relay NBS Thoracic Stent Graft with the Plus Delivery System (Bolton Medical, Sunrise, FL) in a flexible resin arch model with a 15-mm radius curve as well as our preliminary clinical results. The Relay NBS graft with the Plus Delivery System was evaluated by way of bench testing, which was performed with stent grafts with diameters ranging from 24 to 46 mm and lengths ranging from 100 to 250 mm in flexible resin arch models with a 15-mm arch radius of curvature. The deployment sequence was analyzed. The Relay NBS graft with the Plus Delivery System was deployed in two patients, who had, respectively, a 6.5-cm penetrating aortic ulcer of the proximal third of the descending thoracic aorta and a DeBakey type-I aortic dissection with chronic false lumen dilatation after surgery due to an entry site at the distal thoracic aorta. Bench tests showed proper conformation and apposition of the Relay NBS graft with the Plus Delivery System in the flexible resin model. This stent graft was deployed successfully into the two patients with a correct orientation of the first stent and without early or late complications. The Relay NBS graft with the Plus Delivery System ensures an optimal conformation and apposition of the first stent in the aortic arch with a small radius of curvature.
Risk factors for and management of graft pancreatitis.
Nadalin, Silvio; Girotti, Paolo; Königsrainer, Alfred
2013-02-01
We present a systematic and detailed analysis of the risk factors, pathophysiology, clinical manifestations, diagnosis, and management of graft pancreatitis in its different forms, that is, acute and chronic graft pancreatitis (A-GP and C-GP), with A-GP further distinguished into physiological (P-AGP), early (E-AGP), and late (L-AGP) forms. Graft pancreatitis is the second most-frequent complication following pancreas transplantation. P-AGP is an unavoidable entity related to ischemic reperfusion injury. It is usually clinically silent and is a self-limited process in both time course and prognosis. E-AGP occurs within 3 months after pancreas transplantation (PTx) in 35% of cases and is associated with high rates of graft loss (78-91%). Clinical signs are pain, systemic inflammatory response syndrome (SIRS) and haematuria. Therapy can be medical, interventional and surgical. L-AGP occurs more than 3 months after PTx in 14-25% of cases and represents an uncommon cause of graft loss. Typical clinical signs are pain, abdominal tenderness and fever. Typical laboratory signs are hyperamylasaemia, hyperglycaemia and hypercreatininaemia. Therapy is usually conservative. C-GP is difficult to distinguish from chronic rejection and is associated with graft loss in 4-10% of cases. Recurrent A-GPs and infections are the main risk factors. Specific symptoms are chronic abdominal malaise, constipation and recurrence of DM. Isolated hyperglycaemia is typical of C-GP. The therapy is usually conservative. This systematic analysis of different manifestations of graft pancreatitis provides the basis for a clinical approach to tackling this complex entity.
Salmonella Typhimurium gastroenteritis leading to chronic prosthetic vascular graft infection.
Cullinan, Milo; Clarke, Michael; Dallman, Tim; Peart, Steven; Wilson, Deborah; Weiand, Daniel
2017-08-01
Introduction. It is estimated that up to 6% of prosthetic vascular grafts become infected. Staphylococcus aureus is predominant in early infection and coagulase-negative staphylococci are predominant in late infections. Enterobacteriaceae cause 14-40% of prosthetic vascular graft infections. This is, to our knowledge, the first reported case of Salmonella gastroenteritis causing chronic prosthetic vascular graft infection (PVGI). Case presentation. A 57-year-old woman presented with signs and symptoms of prosthetic vascular graft infection. Three years earlier, she had undergone a prosthetic axillo-femoral bypass graft for critical limb ischaemia. The infected prosthetic vascular graft was removed and Salmonella Typhimurium was isolated on culture. In the intervening period, Salmonella Typhimurium had been isolated from a faecal specimen collected during an episode of acute gastroenteritis. Whole-genome sequencing (WGS) showed that the respective Salmonella Typhimurium isolates differed by only a single nucleotide polymorphism (SNP). Salmonella Typhimurium was not isolated on culture of a faecal specimen collected five days following cessation of antimicrobial therapy. Six months after removal of the prosthetic graft, the patient remains under follow-up for her peripheral vascular disease, which currently requires no further surgical intervention. Conclusion. This case has clear implications for the management of chronic PVGI. It is vital to collect high-quality surgical specimens for microbiological analysis, and empirical choices of antibiotics are unlikely to cover all potential pathogens. It may also be prudent to enquire about a history of acute gastroenteritis when assessing patients presenting with chronic PVGI.
Jolissaint, Joshua S; Langman, Linda W; DeBolt, Claire L; Tatum, Jacob A; Martin, Allison N; Wang, Andrew Y; Strand, Daniel S; Zaydfudim, Victor M; Adams, Reid B; Brayman, Kenneth L
2016-11-01
The purpose of this study was to determine whether bacterial contamination of islets affects graft success after total pancreatectomy with islet autotransplantation (TPIAT). Factors associated with insulin independence after TPIAT are inconclusive. Although bacterial contamination does not preclude transplantation, the impact of bacterial contamination on graft success is unknown. Patients who received TPIAT at the University of Virginia between January 2007 and January 2016 were reviewed. Patient charts were reviewed for bacterial contamination, and patients were prospectively contacted to assess rates of insulin independence. There was no significant difference in demographic or perioperative data between patients who achieved insulin independence and those who did not. However, six of 27 patients analyzed (22.2%) grew bacterial contaminants from culture of the final islet preparations. These patients had significantly lower islet yield and C-peptide at most recent follow-up (P<.05), and none of these patients achieved insulin independence. Islet transplant solutions are often culture positive, likely secondary to preprocurement pancreatic manipulation and introduction of enteric flora. Although autotransplantation of culture-positive islets is safe, it is associated with higher rates of graft failure and poor islet yield. Consideration should be given to identifying patients who may develop refractory chronic pancreatitis and offering early operative management to prevent bacterial colonization. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Cassis, Paola; Gallon, Lorenzo; Benigni, Ariela; Mister, Marilena; Pezzotta, Anna; Solini, Samantha; Gagliardini, Elena; Cugini, Daniela; Abbate, Mauro; Aiello, Sistiana; Rocchetta, Federica; Scudeletti, Pierangela; Perico, Norberto; Noris, Marina; Remuzzi, Giuseppe
2012-05-01
Anemia can contribute to chronic allograft injury by limiting oxygen delivery to tissues, particularly in the tubulointerstitium. To determine mechanisms by which erythropoietin (EPO) prevents chronic allograft injury we utilized a rat model of full MHC-mismatched kidney transplantation (Wistar Furth donor and Lewis recipients) with removal of the native kidneys. EPO treatment entirely corrected post-transplant anemia. Control rats developed progressive proteinuria and graft dysfunction, tubulointerstitial damage, inflammatory cell infiltration, and glomerulosclerosis, all prevented by EPO. Normalization of post-transplant hemoglobin levels by blood transfusions, however, had no impact on chronic allograft injury, indicating that EPO-mediated graft protection went beyond the correction of anemia. Compared to syngeneic grafts, control allografts had loss of peritubular capillaries, higher tubular apoptosis, tubular and glomerular oxidative injury, and reduced expression of podocyte nephrin; all prevented by EPO treatment. The effects of EPO were associated with preservation of intragraft expression of angiogenic factors, upregulation of the anti-apoptotic factor p-Akt in tubuli, and increased expression of Bcl-2. Inhibition of p-Akt by Wortmannin partially antagonized the effect of EPO on allograft injury and tubular apoptosis, and prevented EPO-induced Bcl-2 upregulation. Thus non-erythropoietic derivatives of EPO may be useful to prevent chronic renal allograft injury.
Long-term cognitive effects of human stem cell transplantation in the irradiated brain.
Acharya, Munjal M; Martirosian, Vahan; Christie, Lori-Ann; Limoli, Charles L
2014-09-01
Radiotherapy remains a primary treatment modality for the majority of central nervous system tumors, but frequently leads to debilitating cognitive dysfunction. Given the absence of satisfactory solutions to this serious problem, we have used human stem cell therapies to ameliorate radiation-induced cognitive impairment. Here, past studies have been extended to determine whether engrafted cells provide even longer-term benefits to cognition. Athymic nude rats were cranially irradiated (10 Gy) and subjected to intrahippocampal transplantation surgery 2 days later. Human embryonic stem cells (hESC) or human neural stem cells (hNSC) were transplanted, and animals were subjected to cognitive testing on a novel place recognition task 8 months later. Grafting of hNSC was found to provide long lasting cognitive benefits over an 8-month post-irradiation interval. At this protracted time, hNSC grafting improved behavioral performance on a novel place recognition task compared to irradiated animals not receiving stem cells. Engrafted hESC previously shown to be beneficial following a similar task, 1 and 4 months after irradiation, were not found to provide cognitive benefits at 8 months. Our findings suggest that hNSC transplantation promotes the long-term recovery of the irradiated brain, where intrahippocampal stem cell grafting helps to preserve cognitive function.
Matsumoto, Cal S; Zasloff, Michael A; Fishbein, Thomas M
2014-06-01
The purpose of this review is to highlight the similarities between inflammatory bowel disease and the state of the intestine allograft after transplantation. The mutant nucleotide-binding oligomerization protein 2 (NOD2) gene, which encodes for an intracellular protein that serves as an innate immune system microbial sensor in macrophages, dendritic cells, and certain intestinal epithelial cells, has been recognized as a risk factor in Crohn's disease. Similarly, recent studies have also highlighted the contribution the NOD2 mutation may have on intestinal failure itself. More specifically, in intestinal transplant recipients with the NOD2 mutation, the discovery of the reduced ability to prevent bacterial clearance, increased enterocyte stress response, and failure of key downstream expression of important cytokines and growth factors have been implicated as major factors in intestinal transplant outcomes, namely graft loss and septic death. Treatment strategies with anti tumor necrosis factor (TNF) α, similar to inflammatory bowel disease, have been employed in intestinal transplantation with promising results. In intestinal transplantation, there is evidence that the classical alloimmunity pathways that lead toward graft dysfunction and eventual graft loss may, in fact, be working in concert with a disordered innate immune system to produce a state of chronic inflammation not unlike that seen in inflammatory bowel disease.
Smith, R G
1997-01-01
Intimal proliferation, or neointimal hyperplasia (NIH), is a vascular lesion that often arises in arteries after balloon angioplasty or other vessel wall injuries. FIH is a vascular lesion that develops in autologous saphenous vein grafts (SVG) after transplantation into the aorto-coronary circulation or the peripheral vascular circulation. FIH shares elements of smooth muscle migration, proliferation and fibrous tissue deposition in common with neointimal proliferation (NIH). Either NIH of a coronary artery or FIH of an SVG obstructs the vascular lumen and results in myocardial dysfunction. Local radiotherapy has been used for several decades to reduce the post-operative recurrence of the fibrovascular proliferations of pterygia and keloids. Similarly, in animal and human experiments, endovascular radiotherapy has been shown to reduce arterial smooth muscle proliferation. Consideration of the similarities of vascular smooth muscle cell proliferation in NIH and FIH leads one to suggest that endovascular beta irradiation can reduce FIH as well as it reduces NIH. The goal of such treatment is to achieve a clinically significant decrease in the morbidity and mortality resulting from SVG occlusions. The potential for a large reduction in the consequences of SVG occlusion, the very large number of patients at risk, and the simplicity of the proposed intervention encourage prompt scientific evaluation of this technique.
Escobar-Morreale, Héctor F
2017-01-01
Polycystic ovary syndrome (PCOS) is characterized by the association of androgen excess with chronic oligoovulation and/or polycystic ovarian morphology, yet metabolic disorders and classic and nonclassic cardiovascular risk factors cluster in these women from very early in life. This chapter focuses on the mechanisms underlying the association of PCOS with metabolic dysfunction, focusing on the role of androgen excess on the development of visceral adiposity and adipose tissue dysfunction.
Bone Marrow Graft in Man after Conditioning by Antilymphocytic Serum*
Mathé, G.; Amiel, J. L.; Schwarzenberg, L.; Choay, J.; Trolard, P.; Schneider, M.; Hayat, M.; Schlumberger, J. R.; Jasmin, Cl.
1970-01-01
Allogeneic bone marrow grafts carried out after previous administration of antilymphocytic serum alone were attempted in 16 patients. Of these, six had acute myeloblastic leukaemia, four acute lymphoblastic leukaemia, and one a blast cell crisis in polycythaemia vera. Ten of these patients were in an overt phase of the disease and resistant to chemotherapy, while nine had complete agranulocytosis. In five of these patients erythrocyte and leucocyte antigenic markers demonstrated the establishment of the graft. One patient had thalassaemia major, and four others had aplasia of the bone marrow, in one case due to chloramphenicol poisoning and in another to virus hepatitis. The grafts were successful in the last two patients and transformed their clinical condition. No signs of early acute secondary disease were noted in any of the patients, either when the donor had been given antilymphocytic serum or when he was untreated. The grafts had no adoptive immunotherapeutic effect on the acute leukaemia. These observations have clearly shown that antilymphocytic serum has an immunosuppressive effect in man when it is used alone. PMID:4909449
Chung, C-S; Lee, T-H; Chiu, C-T; Chen, Y
2017-12-01
Intestinal failure, characterized by inadequate maintenance of nutrition via normal intestinal function, comprises a group of disorders with many different causes. If parenteral nutrition dependency develops, which is associated with higher mortality and complication rates, intestinal transplantation is considered. However, the graft failure rate is not low, and acute cellular rejection is one of the most important reasons for graft failure. As a result, early identification of rejection and timely modification of anti-rejection medications have been considered to be associated with better graft and patient survival rates. The diagnostic gold standard for rejection is mainly based on histology, but delays of hours may occur while awaiting the pathology result. Some researchers have investigated the association of endoscopic images with graft rejection to provide timely diagnosis. In this study, we present the first case report with characteristic features under magnifying endoscopy with a narrow-band imaging system to predict epithelial regeneration and improvement of graft rejection in a patient with small-bowel transplantation. Copyright © 2017 Elsevier Inc. All rights reserved.
Wu, Wei-Chun; Ma, Hong; Xie, Rong-Ai; Gao, Li-Jian; Tang, Yue; Wang, Hao
2016-04-01
This study evaluated the role of two-dimensional speckle tracking echocardiography (2DSTE) for predicting left ventricular (LV) diastolic dysfunction in pacing-induced canine heart failure. Pacing systems were implanted in 8 adult mongrel dogs, and continuous rapid right ventricular pacing (RVP, 240 beats/min) was maintained for 2 weeks. The obtained measurements from 2DSTE included global strain rate during early diastole (SRe) and during late diastole (SRa) in the longitudinal (L-SRe, L-SRa), circumferential (C-SRe, C-SRa), and radial directions (R-SRe, R-SRa). Changes in heart morphology were observed by light microscopy and transmission electron microscopy at 2 weeks. The onset of LV diastolic dysfunction with early systolic dysfunction occurred 3 days after RVP initiation. Most of the strain rate imaging indices were altered at 1 or 3 days after RVP onset and continued to worsen until heart failure developed. Light and transmission electron microscopy showed myocardial vacuolar degeneration and mitochondrial swelling in the left ventricle at 2 weeks after RVP onset. Pearson's correlation analysis revealed that parameters of conventional echocardiography and 2DSTE showed moderate correlation with LV pressure parameters, including E/Esep' (r = 0.58, P < 0.01), L-SRe (r = -0.58, P < 0.01), E/L-SRe (r = 0.65, P < 0.01), and R-SRe (r = 0.53, P < 0.01). ROC curve analysis showed that these indices of conventional echocardiography and strain rate imaging could effectively predict LV diastolic dysfunction (area under the curve: E/Esep' 0.78; L-SRe 0.84; E/L-SRe 0.80; R-SRe 0.80). 2DSTE was a sensitive and accurate technique that could be used for predicting LV diastolic dysfunction in a canine heart failure model. © 2015, Wiley Periodicals, Inc.
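For context on the reported areas under the curve, the short Python sketch below computes an ROC AUC for a single strain-rate index as a predictor of LV diastolic dysfunction with scikit-learn; the labels and index values are invented for illustration and do not reproduce the study data.

```python
# Minimal sketch (assumed, not the study's analysis): ROC AUC for one
# hypothetical strain-rate-derived index versus a dysfunction label.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

diastolic_dysfunction = np.array([0, 0, 0, 1, 1, 1, 0, 1, 1, 0])  # 1 = dysfunction present
e_over_l_sre          = np.array([0.8, 1.0, 1.1, 1.9, 2.2, 1.7, 1.2, 2.5, 2.0, 0.9])

auc = roc_auc_score(diastolic_dysfunction, e_over_l_sre)
fpr, tpr, thresholds = roc_curve(diastolic_dysfunction, e_over_l_sre)
print(f"AUC = {auc:.2f}")  # the paper reports AUCs of 0.78-0.84 for its indices
```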
Ebert, Jay R; Smith, Anne; Fallon, Michael; Butler, Rodney; Nairn, Robert; Breidahl, William; Wood, David J
2015-09-01
Graft hypertrophy is a common occurrence after periosteal, collagen-covered and matrix-induced autologous chondrocyte implantation (MACI). The purpose of this study was to investigate the incidence, development, and degree of graft hypertrophy at 24 months after MACI. The hypothesis was that graft hypertrophy would not be associated with clinical outcome at 24 months. Case series; Level of evidence, 4. This study was undertaken in 180 consecutive patients (113 male, 67 female) after MACI in the knee. All patients were assessed clinically using the Knee injury and Osteoarthritis Outcome Score (KOOS) and underwent magnetic resonance imaging (MRI) at 3, 12, and 24 months after surgery. The incidence of hypertrophy relative to anatomic graft site was investigated, as was the progressive change in hypertrophic grafts across the postoperative time points. The degree of tissue overgrowth in hypertrophic cases was investigated, as was its association with patient clinical outcome at 24 months after surgery. Of the 180 patients, 50 demonstrated a hypertrophic graft at 1 or more postoperative time points. This included 9 grafts (5.0%) at 3 months and 32 grafts (18.7%) at 12 months. At 24 months, 47 grafts (26.1%) were hypertrophic: 43 (32.1%) tibiofemoral and 4 (8.7%) patellofemoral. Patients with hypertrophic grafts at 24 months (n = 47) were younger (P = .051), they had a lower body mass index (BMI; P = .069), and significantly fewer of them had patellofemoral grafts (P = .007) compared with patients who had grafts with full (100%) tissue infill (n = 61). There were no significant differences in any of the KOOS subscales between patients with graft hypertrophy or full (100%) tissue infill at 24 months after surgery, and the severity of graft hypertrophy was not associated with KOOS subscales at 24 months. Hypertrophic grafts after MACI were common and continued to develop through to 24 months after surgery. Hypertrophic growth was associated with being younger and having a lower BMI, was more common on the femoral condyles, and overall was not associated with clinical outcome at 24 months after surgery. However, further research with longer term follow-up is required to evaluate the effect of persistent hypertrophy on graft stability and to assess the use of early surgical intervention to prevent such failure. © 2015 The Author(s).
Surgical Repair of Axillary Artery Aneurysm in a 2-Year-Old Child: A Case Report.
Beshish, Asaad G; Arutyunyan, Tsovinar
2017-05-01
Peripheral aneurysm and pseudoaneurysm of an artery is a well-recognized but rare phenomenon in children. We report a case of an axillary artery aneurysm in a 2-year-old boy with methicillin-resistant Staphylococcus aureus septic shock, acute respiratory distress syndrome, and multiorgan dysfunction syndrome. Definitive surgical treatment with left axillary artery aneurysm exclusion and bypass with greater saphenous vein graft were performed. To our knowledge, this is the only axillary artery aneurysm ever reported in a child. Copyright © 2017 Elsevier Inc. All rights reserved.
Brugière, Olivier; Pessione, Fabienne; Thabut, Gabriel; Mal, Hervé; Jebrak, Gilles; Lesèche, Guy; Fournier, Michel
2002-06-01
Among risk factors for the progression of bronchiolitis obliterans syndrome (BOS) after lung transplantation (LT), the influence of time to BOS onset is not known. The aim of the study was to assess whether BOS occurring earlier after LT is associated with worse functional prognosis and worse graft survival. We retrospectively compared functional outcome and survival of all single-LT (SLT) recipients who had BOS develop during follow-up in our center according to time to onset of BOS (<3 years or ≥3 years after transplantation). Among the 29 SLT recipients with BOS identified during the study period, 20 patients had early-onset BOS and 9 patients had late-onset BOS. The mean decline of FEV(1) over time during the first 9 months in patients with early-onset BOS was significantly greater than in patients with late-onset BOS (p = 0.04). At last follow-up, patients with early-onset BOS had a lower mean FEV(1) value (25% vs 39% of predicted, p = 0.004), a lower mean PaO(2) value (54 mm Hg vs 73 mm Hg, p = 0.0005), a lower 6-min walk test distance (241 m vs 414 m, p = 0.001), a higher Medical Research Council index value (3.6 vs 1.6, p = 0.0001), and a higher percentage of oxygen dependency (90% vs 11%, p = 0.001) compared with patients with late-onset BOS. In addition, graft survival of patients with early-onset BOS was significantly lower than that of patients with late-onset BOS (log-rank test, p = 0.04). There were 18 of 20 graft failures (90%) in the early-onset BOS group, directly attributable to BOS in all cases (deaths [n = 10] or retransplantation [n = 8]). In the late-onset BOS group, graft failure occurred in four of nine patients due to death from extrapulmonary causes in three of four cases. The median duration of follow-up after occurrence of BOS was not statistically different between patients with early-onset BOS and patients with late-onset BOS (31 +/- 28 months and 37 +/- 26 months, respectively; p = not significant). The subgroup of patients who had BOS develop ≥3 years after SLT is less likely to have worrisome functional impairment develop in long-term follow-up. Considering the balance between the advantages and risks, enhancement of immunosuppression should be regarded with more caution in this subgroup than in patients with early-onset BOS.
Nonhuman primate models of polycystic ovary syndrome
Abbott, David H; Nicol, Lindsey E; Levine, Jon E; Xu, Ning; Goodarzi, Mark O; Dumesic, Daniel A
2013-01-01
With close genomic and phenotypic similarity to humans, nonhuman primate models provide comprehensive epigenetic mimics of polycystic ovary syndrome (PCOS), suggesting early life targeting for prevention. Fetal exposure to testosterone (T), of all nonhuman primate emulations, provides the closest PCOS-like phenotypes, with early-to-mid gestation T-exposed female rhesus monkeys exhibiting adult reproductive, endocrinological and metabolic dysfunctional traits that are co-pathologies of PCOS. Late gestational T exposure, while inducing adult ovarian hyperandrogenism and menstrual abnormalities, has less dysfunctional metabolic accompaniment. Fetal exposures to dihydrotestosterone (DHT) or diethylstilbestrol (DES) suggest androgenic and estrogenic aspects of fetal programming. Neonatal exposure to T produces no PCOS-like outcome, while continuous T treatment of juvenile females causes precocious weight gain and early menarche (high T), or high LH and weight gain (moderate T). Acute T exposure of adult females generates polyfollicular ovaries, while chronic T exposure induces subtle menstrual irregularities without metabolic dysfunction. PMID:23370180
Rapid identification system of frontal dysfunction in subclinical hepatic encephalopathy.
Moretti, Rita; Gazzin, Silvia; Crocè, Lory Saveria; Baso, Beatrice; Masutti, Flora; Bedogni, Giorgio; Tiribelli, Claudio
2016-01-01
Introduction and aim. Liver disease is associated with cognitive dysfunction even at early stages, and minimal hepatic encephalopathy, affecting 20-70% of patients, is frequently under-recognized. The main purpose of this work was to demonstrate that a substantial number of patients enrolled because of an acute confusional state, in the absence of a diagnosis of liver disease, suffer from hepatic encephalopathy. A total of 410 patients with an acute confusional state were enrolled in this study before a diagnosis of well-compensated liver disease had been made. Even in the presence of minimal alterations of hepatic function, the psychometric tests applied demonstrated early signs of cerebral frontal alteration. The alteration was associated with the severity of liver disease, paralleling the progression of the patient to minimal hepatic failure or chronic liver disease. These psychometric tests are essential to detect early and subclinical frontal failure. Assessment of frontal dysfunction may be a useful tool in the follow-up of these patients.
Cleper, Roxana; Ben Meir, David; Krause, Irit; Livne, Pinchas; Mor, Eitan; Davidovits, Miriam; Dagan, Amit
2018-06-01
Guidelines for bladder augmentation (BA) in kidney transplantation (KT) recipients are not well-defined. In our center, simultaneous BA with KT (BA-KT) is performed. We assessed transplantation outcomes of this unique extensive procedure in a single-center, retrospective case-control study. Transplantation outcomes were compared with those of KT recipients who did not need BA. Compared with 22 patients who underwent KT only, the 9 patients who underwent BA-KT had similar rates of surgical complications and need for revision in the early posttransplantation period; early graft function was better: estimated glomerular filtration rate, 96.5 ± 17.1 versus 79.4 ± 16.6 mL/min at 0 to 6 months (P = 0.02); posttransplantation clean intermittent catheterization was more often needed: by 78% (7/9) versus 13% (3/22); and asymptomatic bacteriuria was more common: 100% versus 9% during the first 6 months (P < 0.001), 55% versus 9% (P = 0.02) and 66.6% versus 9% during the first and second years, respectively (P = 0.004). Urinary tract infection (UTI) incidence was also higher: 100% versus 23% during the first 6 months and 44% versus 9% during the second year posttransplantation. Graft function deteriorated significantly in the BA-KT group by the fifth posttransplantation year: estimated glomerular filtration rate was 47.7 ± 39.7 mL/min versus 69 ± 21.3 mL/min, with only 6 (66%) of 9 functioning grafts versus 100% in the KT-only group. Causes of graft loss were noncompliance with drug therapy in 2 patients and recurrent UTIs in 2 patients. The excellent short-term outcome of simultaneous BA-KT is threatened by graft loss due to a high prevalence of UTIs and patient noncompliance with the demanding complex posttransplantation therapy.
Day, Judy D.; Metes, Diana M.; Vodovotz, Yoram
2015-01-01
A mathematical model of the early inflammatory response in transplantation is formulated with ordinary differential equations. We first consider the inflammatory events associated only with the initial surgical procedure and the subsequent ischemia/reperfusion (I/R) events that cause tissue damage to the host as well as the donor graft. These events release damage-associated molecular pattern molecules (DAMPs), thereby initiating an acute inflammatory response. In simulations of this model, resolution of inflammation depends on the severity of the tissue damage caused by these events and the patient’s (co)-morbidities. We augment a portion of a previously published mathematical model of acute inflammation with the inflammatory effects of T cells in the absence of antigenic allograft mismatch (but with DAMP release proportional to the degree of graft damage prior to transplant). Finally, we include the antigenic mismatch of the graft, which leads to the stimulation of potent memory T cell responses, leading to further DAMP release from the graft and concomitant increase in allograft damage. Regulatory mechanisms are also included at the final stage. Our simulations suggest that surgical injury and I/R-induced graft damage can be well-tolerated by the recipient when each is present alone, but that their combination (along with antigenic mismatch) may lead to acute rejection, as seen clinically in a subset of patients. An emergent phenomenon from our simulations is that low-level DAMP release can tolerize the recipient to a mismatched allograft, whereas different restimulation regimens resulted in an exaggerated rejection response, in agreement with published studies. We suggest that mechanistic mathematical models might serve as an adjunct for patient- or sub-group-specific predictions, simulated clinical studies, and rational design of immunosuppression. PMID:26441988
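To make the structure of such an ODE formulation concrete, the following toy Python sketch integrates a three-variable system in which graft damage releases DAMPs, DAMPs drive inflammation, and inflammation feeds back on damage; it is far simpler than the published model and its parameters and rate laws are illustrative assumptions only.

```python
# Toy sketch (assumed, far simpler than the published model): damage releases
# DAMPs, DAMPs activate inflammation, inflammation injures tissue; resolution
# and healing terms let the response settle or escalate depending on rates.
import numpy as np
from scipy.integrate import solve_ivp

def transplant_inflammation(t, y, k_release=1.0, k_clear=0.8,
                            k_act=1.5, k_res=0.6, k_harm=0.4, k_heal=0.2):
    damage, damps, inflam = y
    d_damage = k_harm * inflam - k_heal * damage               # inflammation injures, tissue heals
    d_damps  = k_release * damage - k_clear * damps            # damaged tissue releases DAMPs
    d_inflam = k_act * damps / (1.0 + damps) - k_res * inflam  # saturating DAMP drive, resolution
    return [d_damage, d_damps, d_inflam]

# initial graft damage from surgery plus ischemia/reperfusion; no DAMPs or inflammation yet
sol = solve_ivp(transplant_inflammation, (0, 30), [1.0, 0.0, 0.0], dense_output=True)
t = np.linspace(0, 30, 7)
print(sol.sol(t)[2])  # inflammatory mediator level at selected times
```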
Kang, Woong Chol; Shin, Eak Kyun; Park, Chul-Hyun; Kang, Jin Mo; Ko, Young-Guk; Choi, Donghoon; Youn, Young Nam; Shim, Won-Heum
2013-08-01
To evaluate the outcomes of hybrid endovascular repair for aortic arch pathology. This study was a retrospective analysis involving patients who underwent hybrid endovascular repair for aortic arch pathologies. Twenty-one patients (16 men; mean age, 64.7 ± 16.2 years) with aortic arch pathologies were treated by hybrid endovascular repair. The indications for treatment included increased aneurysm size in 16 cases (71.4%), rupture or impending aneurysmal rupture in 5 cases (23.8%), and rapid growth of aortic dissection (≥ 10 mm/y) in 1 case (4.8%). Supra-aortic vessel transposition and stent-graft implantation were achieved in all cases. Two types of stent-graft were used: the Seal thoracic stent-graft in 14 patients (66.7%) and the Valiant stent-graft in 7 patients (33.3%). Perioperative complications affected 5 patients (23.8%), as follows: bleeding (n = 4, 19.0%); stroke (n = 3, 14.3%); renal failure (n = 2, 9.5%); vascular injury (n = 1, 4.8%); and respiratory failure (n = 1, 4.8%). Two patients died within 30 days (9.5%). Technical success was achieved in 15 patients (71.5%). Early endoleaks were noted in 4 patients (19.0%). One patient died during follow-up (mean, 21.3 ± 11.6 months) due to a de novo intramural hematoma. Persistent early endoleaks were noted in 4 patients (19.0%); 2 of the 4 patients were successfully managed with implantation of additional stent-grafts. No late-onset endoleaks were noted. The death-free survival and reintervention-free survival rates during follow-up were 85.7% and 90.5%, respectively. Hybrid treatment with supra-aortic vessel transposition and endovascular repair may be an option in frail patients in whom open procedures are too risky. © 2013 Wiley Periodicals, Inc.
Chen, Lei; Xing, Qi; Zhai, Qiyi; Tahtinen, Mitchell; Zhou, Fei; Chen, Lili; Xu, Yingbin; Qi, Shaohai; Zhao, Feng
2017-01-01
Split thickness skin graft (STSG) implantation is one of the standard therapies for full thickness wound repair when full thickness autologous skin grafts (FTG) or skin flap transplants are inapplicable. Combined transplantation of STSG with dermal substitute could enhance its therapeutic effects, but the results remain unsatisfactory due to insufficient blood supply at early stages, which causes graft necrosis and fibrosis. Human mesenchymal stem cell (hMSC) sheets are capable of accelerating the wound healing process. We hypothesized that pre-vascularized hMSC sheets would further improve regeneration by providing more versatile angiogenic factors and pre-formed microvessels. In this work, in vitro cultured hMSC cell sheets (HCS) and pre-vascularized hMSC cell sheets (PHCS) were implanted in a rat full thickness skin wound model covered with an autologous STSG. Results demonstrated that the HCS and the PHCS implantations significantly reduced skin contraction and improved cosmetic appearance relative to the STSG control group. The PHCS group experienced the least hemorrhage and necrosis and the lowest inflammatory cell infiltration. It also induced the highest neovascularization in early stages, which established a robust blood micro-circulation to support graft survival and tissue regeneration. Moreover, the PHCS grafts preserved the largest amount of skin appendages, including hair follicles and sebaceous glands, and developed the smallest epidermal thickness. The superior therapeutic effects seen in the PHCS groups were attributed to the elevated presence of growth factors and cytokines in the pre-vascularized cell sheet, which exerted beneficial paracrine signaling during wound repair. Hence, the strategy of combining STSG with PHCS implantation appears to be a promising approach in regenerative treatment of full thickness skin wounds.
Svensson, Lars G; Pillai, Saila T; Rajeswaran, Jeevanantham; Desai, Milind Y; Griffin, Brian; Grimm, Richard; Hammer, Donald F; Thamilarasan, Maran; Roselli, Eric E; Pettersson, Gösta B; Gillinov, A Marc; Navia, Jose L; Smedira, Nicholas G; Sabik, Joseph F; Lytle, Bruce W; Blackstone, Eugene H
2016-03-01
To evaluate long-term results of aortic root procedures combined with ascending aorta replacement for aneurysms, using 4 surgical strategies. From January 1995 to January 2011, 957 patients underwent 1 of 4 aortic root procedures: valve preservation (remodeling or modified reimplantation, n = 261); composite biologic graft (n = 297); composite mechanical graft (n = 156); or allograft root (n = 243). Seven deaths (0.73%) occurred, none after valve-preserving procedures, and there were 13 strokes (1.4%). Composite grafts exhibited higher gradients than allografts or valve preservation, but the latter 2 exhibited more aortic regurgitation (2.7% biologic and 0% mechanical composite grafts vs 24% valve-preserving and 19% allografts at 10 years). Within 2 to 5 years, valve preservation exhibited the least left ventricular hypertrophy and allograft replacement the greatest; however, valve preservation had the highest early risk of reoperation and allograft replacement the lowest. Patients receiving allografts had the highest risk of late reoperation (P < .05), and those receiving composite mechanical grafts and valve preservation had the lowest. Composite bioprosthesis patients had the highest risk of late death (57% at 15 years vs 14%-26% for the remaining procedures, P < .0001), because they were substantially older and had more comorbidities (P < .0001). These 4 aortic root procedures, combined with ascending aorta replacement, provide excellent survival and good durability. Valve-preserving and allograft procedures have the lowest gradients and best ventricular remodeling; they have more late regurgitation but likely less risk of valve-related complications, such as bleeding, hemorrhage, and endocarditis. Despite the early risk of reoperation, we recommend valve-preserving procedures for young patients when possible. Composite bioprostheses are preferable for the elderly. Copyright © 2016 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
Wang, Shaomei; Lu, Bin; Girman, Sergej; Holmes, Toby; Bischoff, Nicolas; Lund, Raymond D
2008-01-01
It is well documented that grafting of cells into the subretinal space of Royal College of Surgeons (RCS) rats limits deterioration of vision and loss of photoreceptors if performed early in postnatal life. What is unclear is whether cells introduced later, when photoreceptor degeneration is already advanced, can still be effective. This possibility was examined in the present study using the human retinal pigment epithelial cell line ARPE-19. Dystrophic RCS rats (postnatal day [P] 60) received subretinal injection of ARPE-19 cells (2 x 10(5)/3 microL/eye). Spatial frequency was measured by recording optomotor responses at P100 and P150, and luminance threshold responses were recorded from the superior colliculus at P150. Retinas were stained with cresyl violet, retinal cell-specific markers, and a human nuclear marker. Control animals were injected with medium alone. Animals comparably treated with grafts at P21 were available for comparison. All animals were treated with immunosuppression. Later grafts preserved both spatial frequency and threshold responses relative to controls and delayed photoreceptor degeneration. There were two to three layers of rescued photoreceptors even at P150, compared with a scattered single layer in sham and untreated control retinas. Retinal cell marker staining showed an orderly array of inner retinal lamination. The morphology of the second-order neurons was better preserved around the grafted area than in regions distant from the graft. Sham injection had little effect in rescuing the photoreceptors. RPE cell line transplants delivered later in the course of degeneration can preserve not only the photoreceptors and inner retinal lamination but also visual function in RCS rats. However, early intervention can achieve better rescue.
[Hepatitis B infection transmission by anti-HBc-positive grafts].
Bárcena, Rafael
2014-07-01
In Spain, the rate of anti-HBc-positive, HBsAg-negative carriers is approximately 10% of adults between the ages of 26 and 65 years. It is therefore impossible to exclude these donors without increasing the mortality of recipients on waiting lists. The incidence of de novo hepatitis B infection in HBsAg-negative recipients of anti-HBc-positive donors is high without prophylaxis and is related to the serological state of the recipient against HBV. Anti-HBc- and anti-HBs-positive recipients have low risk, with or without prophylaxis. This patient group therefore does not require prophylaxis but rather periodic posttransplantation checkups. For the other recipient groups (naïve, isolated anti-HBc, and isolated anti-HBs), prophylaxis with hepatitis B immunoglobulin, lamivudine, or combined therapy decreases the incidence of infection. These patients should be treated with prophylaxis immediately after transplantation. Depending on the risk, cost and benefit, patients should currently be treated with lamivudine 100 mg/d indefinitely or for longer periods (>10 years). Periodic checkups of HBsAg should be conducted, and if there is graft dysfunction then HBV DNA should be checked. If HBV DNA is detected in the donor, either in serum or in the biopsy, the prophylaxis should be an analogue with a high barrier to resistance from the start. Grafts from anti-HBc-positive donors are not considered at-risk grafts and are used according to donor severity, without being determined by the recipient's serological profile. Copyright © 2014 Elsevier España, S.L. All rights reserved.
Margreiter, Christian; Aigner, Felix; Resch, Thomas; Berenji, Anna-Katharina; Oberhuber, Rupert; Sucher, Robert; Profanter, Christoph; Veits, Lothar; Öllinger, Robert; Margreiter, Raimund; Pratschke, Johann; Mark, Walter
2012-01-27
Although percutaneous biopsies are considered to be the gold standard in diagnosing pancreas graft rejection, they are not performed routinely because of their association with severe complications. On the other hand, correct diagnosis of rejection is essential but may be difficult in cases of enteric drainage, particularly in patients with a pancreas transplant alone or a pancreas after kidney transplant. Pancreas recipients who underwent enteroscopy between May 2005 and September 2009 were included in this retrospective analysis. Biopsies were graded 0 to 4 for interstitial and vascular changes. During the study period a total of 65 simultaneous pancreas-kidney transplants, 13 pancreas after kidney transplants and 4 pancreas transplants alone were performed. Sixty-three patients underwent a single enteroscopy, 10 had two, and 6 had three or more. Indications were protocol graft monitoring (n=73), graft dysfunction (n=17), enteric hemorrhage (n=9), or other (n=3). The duodenal segment was accessed in 76 instances (75%) with abnormal findings in 23. A total of 69 biopsies were obtained and revealed normal mucosa in 49 cases (71%). Histology showed signs of acute rejection in 11 cases. The upper gastrointestinal tract was also assessed, and, in 13 cases, additional pathologies were identified including gastroduodenitis (n=10), gastric/duodenal ulcer (n=2), and hemorrhagic esophagitis (n=1). No procedure-related complication occurred. This series of enteroscopies demonstrates that the duodenal segment of a pancreatic graft is accessible using our implant technique, thus permitting biopsies to be obtained and endoscopic interventions to be performed.
[Endothelial keratoplasty: Descemet stripping (DSEK) using TAN EndoGlide™ device: case series].
Pazos, Henrique Santiago Baltar; Pazos, Paula Fernanda Morais Ramalho Baltar; Nogueira Filho, Pedro Antônio; Grisolia, Ana Beatriz Diniz; Silva, André Berger Emiliano; Gomes, José Álvaro Pereira
2011-01-01
To report the results of Descemet stripping endothelial keratoplasty (DSEK) using the TAN EndoGlide™ device to facilitate the insertion of the endothelial membrane. Prospective clinical study that included nine patients presenting corneal edema secondary to endothelial dysfunction. Best corrected visual acuity, refraction, central corneal thickness, endothelial cell density and complications were analyzed after a six-month follow-up. There was a significant improvement in the corneal edema and visual acuity in 7 patients (77.78%). The best corrected visual acuity ranged between 20/40 and 20/200. The average density of endothelial cells at six months varied between 1,305 cells/mm² and 2,346 cells/mm², with an average cell loss of 33.14%. Detachment of part of the graft was observed in one eye (11.11%) and primary failure of the endothelial transplantation occurred in 2 eyes (22.22%). The TAN EndoGlide™ device facilitates the introduction of the graft in Descemet stripping endothelial keratoplasty.
Dellepiane, Sergio; Medica, Davide; Quercia, Alessandro Domenico; Cantaluppi, Vincenzo
2017-06-01
Acute kidney injury (AKI) is characterized by an increasing incidence and poor outcomes in both developed and undeveloped countries. AKI is also acquiring importance in the setting of kidney transplantation (KT): besides all the classical forms of AKI that KT patients may undergo, several transplant-specific injuries can also lead to the loss of graft function. The mechanisms of tissue damage in native and grafted kidneys share several common pathogenic elements. Since appropriate therapeutic treatments are still lacking, probably due to the disease complexity, clinicians are forced to provide only supportive care. In this composite scenario, cell therapies represent an evolving frontier for AKI treatment in native and transplanted kidneys: ex vivo manipulated stem or immune cells are able to counteract renal dysfunction by a wide range of biological mechanisms. In this review, we will discuss the potential applications of cell therapies in AKI and KT by analyzing the available clinical data and the most promising experimental prospects from a "bench to bedside" perspective.
[Ocular graft-versus-host disease: An often misdiagnosed etiology of dry eye syndrome].
Moyal, L; Adam, R; Akesbi, J; Rodallec, F T; Nordmann, J-P
2017-02-01
To report a case of severe ocular graft-versus-host disease (GVHD) after cataract surgery. Observational case report. We describe the case of a 59-year-old man with a postoperative corneal ulcer in his only functional eye. His past history included allogenic bone marrow transplant. His visual acuity (VA) was limited to hand motions. Slit lamp examination revealed diffuse conjunctival hyperemia, severe blepharitis, Meibomian dysfunction, and total corneal opacification with epithelial and stromal keratitis and neovascular invasion. Because of the severe dry eye symptoms and history of allogenic hematological stem cell transplantation, ocular GVHD was diagnosed. Functional and anatomical improvement occurred rapidly with topical cyclosporine 2%, with improved VA after treatment. With any severe dry eye syndrome in the context of allogenic bone marrow transplant, ocular GVHD must be considered. For planned ocular surgery, we recommend adding cyclosporine 0.1% treatment before and after surgery to prevent severe ocular GVHD. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
Grover, Dustin M; Howell, Stephen M; Hull, Maury L
2005-02-01
The tensile force applied to an anterior cruciate ligament graft determines the maximal anterior translation; however, it is unknown whether the tensile force is transferred to the intra-articular portion of the graft and whether the intra-articular tension and maximal anterior translation are maintained shortly after ligament reconstruction. Ten cadaveric knees were reconstructed with a double-looped tendon graft. The graft was looped through a femoral fixation transducer that measured the resultant force on the proximal end of the graft. A pneumatic cylinder applied a tensile force of 110 N to the graft exiting the tibial tunnel with the knee in full extension. The graft was fixed sequentially with four tibial fixation devices (a spiked metal washer, double staples, a bioabsorbable interference screw, and a WasherLoc). Three cyclic loading treatments designed to conservatively load the graft and its fixation were applied. The combined loss in intra-articular graft tension from friction, insertion of the tibial fixation device, and three cyclic loading treatments was 50% for the spiked washer (p = 0.0004), 100% for the double staples (p < 0.0001), 64% for the interference screw (p = 0.0001), and 56% for the WasherLoc (p < 0.0001). The tension loss caused an increase in the maximal anterior translation from that of the intact knee of 2.0 mm for the spiked washer (p = 0.005), 7.8 mm for the double staples (p < 0.0001), 2.7 mm for the interference screw (p = 0.001), and 2.1 mm for the WasherLoc (p < 0.0001). The tensile force applied to a soft-tissue anterior cruciate ligament graft is not transferred intra-articularly and is not maintained during graft fixation. The loss in tension is caused by friction in the tibial tunnel and wrapping the graft around the shank of the screw of the spiked washer, insertion of the tibial fixation device, and cyclical loading of the knee. The amount of tension loss is sufficient to increase the maximal anterior translation.
Zibari, Gazi B; Fallahzadeh, Mohammad Kazem; Hamidian Jahromi, Alireza; Zakhary, Joseph; Dies, David; Wellman, Greg; Singh, Neeraj; Shokouh-Amiri, Hosein
2014-01-01
The aim of this study is to report our six-year experience with portal-endocrine and gastric-exocrine drainage technique of pancreatic transplantation, which was first developed and implemented at our center in 2007. In this study, the outcomes of all patients at our center who had pancreas transplantation with portal-endocrine and gastric-exocrine drainage technique were evaluated. From October 2007 to November 2013, 38 patients had pancreas transplantation with this technique - 31 simultaneous kidney pancreas and seven pancreas alone. Median duration of follow-up was 3.8 years. One-, three-, and five-year patient and graft survival rates were 94%, 87%, 70% and 83%, 65%, 49%, respectively. For pancreas allograft dysfunction evaluation, 51 upper endoscopies were performed in 14 patients; donor duodenal biopsies were successfully obtained in 45 (88%). We detected nine episodes of acute rejection (eight patients) and seven episodes of cytomegalovirus (CMV) duodenitis (six patients). No patient developed any complication due to upper endoscopy. Portal-endocrine and gastric-exocrine drainage technique of pancreas transplantation provides lifelong easy access to the transplanted duodenum for evaluation of pancreatic allograft dysfunction.
Fernandez, Isis E; Heinzelmann, Katharina; Verleden, Stijn; Eickelberg, Oliver
2015-03-01
Tissue fibrosis, a major cause of death worldwide, leads to significant organ dysfunction in any organ of the human body. In the lung, fibrosis critically impairs gas exchange, tissue oxygenation, and immune function. Idiopathic pulmonary fibrosis (IPF) is the most detrimental and lethal fibrotic disease of the lung, with an estimated survival of 50% at 3-5 years. Lung transplantation currently remains the only therapeutic alternative for IPF and other end-stage pulmonary disorders. Posttransplant lung function, however, is compromised by short- and long-term complications, most importantly chronic lung allograft dysfunction (CLAD). CLAD affects up to 50% of all transplanted lungs after 5 years and is characterized by small airway obstruction with pronounced epithelial injury, aberrant wound healing, and subepithelial and interstitial fibrosis. Intriguingly, the mechanisms leading to the fibrotic processes in the engrafted lung exhibit striking similarities to those in IPF; therefore, antifibrotic therapies may contribute to increased graft function and survival in CLAD. In this review, we focus on these common fibrosis-related mechanisms in IPF and CLAD, comparing and contrasting clinical phenotypes, the mechanisms of fibrogenesis, and biomarkers to monitor, predict, or prognosticate disease status.
Zambernardi, Agustina; Gondolesi, Gabriel; Cabanne, Ana; Martinez, María I; Solar, Héctor; Rumbo, Martín; Rumbo, Carolina
2013-01-01
Exfoliative rejection is a severe complication after intestinal transplant. The assessment of mucosal histology is restricted to the area reached by endoscopy. We aimed to evaluate the serum albumin (SA) value as a parameter of graft damage and clinical prognosis in intestinal exfoliative rejection (ExR). The present study is a retrospective analysis of 11 episodes of ExR that occurred in a cohort of 26 patients. SA levels were measured 24 h after diagnosis and twice a week thereafter and then correlated with parameters of clinical and graft histological recovery (HR). During ExR, all patients had very low SA levels, reaching a minimum average of 1.9 ± 0.3 g/dL. The patients were grouped according to their albumin levels at ExR diagnosis, and a correlation with their clinical evolution was found. Six ExR episodes presented with severe hypoalbuminemia (<2.2 g/dL; p < 0.05), which correlated with worse patient and graft outcome, ranging from graft loss and need for re-transplantation to delayed clinical and histological recovery. SA at ExR diagnosis may be an indicator of the severity of the ExR process, and it could also be used as an early predictor of patient and graft outcome. © 2013 John Wiley & Sons A/S.
Attia, Tamer; Koch, Colleen G; Houghtaling, Penny L; Blackstone, Eugene H; Sabik, Ellen Mayer; Sabik, Joseph F
2017-03-01
To (1) identify sex-related differences in risk factors and revascularization strategies for patients undergoing coronary artery bypass grafting (CABG), (2) assess whether these differences influenced early and late survival, and (3) determine whether clinical effectiveness of the same revascularization strategy was influenced by sex. From January 1972 to January 2011, 57,943 adults-11,009 (19%) women-underwent primary isolated CABG. Separate models for long-term mortality were developed for men and women, followed by assessing sex-related differences in strength of risk factors (interaction terms). Incomplete revascularization was more common in men than women (26% vs 22%, P < .0001), but women received fewer bilateral internal thoracic artery (ITA) grafts (4.8% vs 12%; P < .0001) and fewer arterial grafts (68% vs 70%; P < .0001). Overall, women had lower survival than men after CABG (65% and 31% at 10 and 20 years, respectively, vs 74% and 41%; P ≤ .0001), even after risk adjustment. Incomplete revascularization was associated equally (P > .9) with lower survival in both sexes. Single ITA grafting was associated with equally (P = .3) better survival in women and men. Although bilateral ITA grafting was associated with better survival than single ITA grafting, it was less effective in women-11% lower late mortality (hazard ratio, 0.89 [0.77-1.022]) versus 27% lower in men (hazard ratio, 0.73 [0.69-0.77]; P = .01). Women on average have longer life expectancies than men but not after CABG. Every attempt should be made to use arterial grafting and complete revascularization, but for unexplained reasons, sex-related differences in effectiveness of bilateral arterial grafting were identified. Copyright © 2016. Published by Elsevier Inc.
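As a rough illustration of the interaction-term approach described above (testing whether the strength of a risk factor or revascularization strategy differs by sex), the sketch below uses the Python lifelines package on a hypothetical data set; the file name, column names, and model specification are assumptions, not the study's actual analysis.

# Sketch of a sex-by-strategy interaction test in a Cox proportional hazards model
# (hypothetical data and column names; not the study's code).
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("cabg_cohort.csv")               # assumed columns: time_years, died, female, bita, age
df["female_x_bita"] = df["female"] * df["bita"]   # interaction: does the BITA effect differ by sex?

cph = CoxPHFitter()
cph.fit(df[["time_years", "died", "female", "bita", "age", "female_x_bita"]],
        duration_col="time_years", event_col="died")
cph.print_summary()   # a significant interaction coefficient indicates sex-dependent effectiveness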
Recipient Selection for Optimal Utilization of Discarded Grafts in Liver Transplantation.
Giretti, Giovanni; Barbier, Louise; Bucur, Petru; Marques, Frédéric; Perarnau, Jean-Marc; Ferrandière, Martine; Tellier, Anne-Charlotte; Kerouredan, Vincent; Altieri, Mario; Causse, Xavier; Debette-Gratien, Maryline; Silvain, Christine; Salamé, Ephrem
2018-05-01
In France, liver grafts that have been refused by at least 5 teams are considered for rescue allocation (RA), with the choice of the recipient being at the team's discretion. Although this system permits the use of otherwise discarded grafts in a context of organ shortage, outcomes and potential benefits need to be assessed. Between 2011 and 2015, outcomes of RA grafts (n = 33) were compared with standard-allocation (SA) grafts (n = 321) at a single French center. Liver grafts in the RA group were older (63 ± 17 years vs 54 ± 18 years, P = 0.007) and had a higher donor risk index (DRI) (1.86 ± 0.45 vs 1.61 ± 0.47, P = 0.010). Recipients in this group had a lower Model for End-Stage Liver Disease score (14 ± 5 vs 22 ± 10, P < 0.001) and mostly had hepatocellular carcinoma (67.0% vs 40.4%, P = 0.010). The balance of risk score was significantly lower in the RA group (5.5 ± 2.9 vs 9.2 ± 5.5, P < 0.001). There were higher rates of early and delayed hepatic artery thrombosis (15.2% vs 3.1%, P = 0.001) and retransplantation (18.2% vs 4.7%, P = 0.002) in the RA group. Patient survival was not different between groups, but graft survival was impaired (95% vs 82% at 1 year and 94% vs 74% at 3 years, P = 0.001). Our results show that discarded liver grafts can be used provided that there is a strict recipient selection process, although hepatic artery thrombosis and retransplantation are more frequent. This strategy enables utilization of otherwise discarded grafts in the context of organ shortage.
Cedidi, C Can; Wilkens, L; Berger, A; Ingianni, G
2007-11-05
In patients after extensive burn injury, the lack of split thickness skin graft donor sites and the consecutive delay in wound closure are critical factors of morbidity and mortality. In addition, the limited functional and aesthetic results after transplantation of split thickness skin grafts present a socioeconomic problem. To improve wound closure, the aim of this study was the development of a one-stage technique for the establishment of a multilayer composite graft consisting of a two-layer synthetic dermal equivalent (DE; a collagen-GAG matrix with a silicone layer) with integrated fibroblasts and keratinocytes. In 64 athymic nude mice, the potential of the multilayer skin graft to re-establish a human epidermis and a high-quality dermal structure was evaluated. In addition to clinical investigations, we measured wound contraction and analyzed histomorphologic, immunohistologic, in situ hybridization, and electron microscopic data. Our results show that seeding the DE with human fibroblasts and keratinocytes as a composite skin graft reproducibly enabled wound healing with an organized human dermis and epidermis within 10-15 days. The histological studies of the grafted composite skin grafts in this model showed morphologically a characteristic dermal-epidermal skin structure with a cornifying epithelium of human origin (confirmed by in situ hybridization). Through the co-cultivation of fibroblasts and keratinocytes in the DE, the generation and structural morphology of collagen fibres and the inflammatory reaction in the neodermis are positively influenced, and as a consequence wound contraction is significantly reduced. In view of the early preparation of composite grafts and the minimal requirements for donor sites, with dependably stable reconstruction of the integument, this technique may present a step forward in the treatment of patients with extensive burns.
[Effects of grafting and nitrogen fertilization on melon yield and nitrogen uptake and utilization].
Xue, Liang; Ma, Zhong Ming; DU, Shao Ping
2017-06-18
A split-plot design experiment was carried out using two main methods of cultivation (grafting and self-rooted cultivation) and subplots with different nitrogen application levels (0, 120, 240, and 360 kg N·hm⁻²) to investigate the effects of cultivation method and nitrogen application level on the yield and quality of melons, nitrogen transfer, nitrogen distribution, and nitrogen utilization rate. The results showed that melons produced by grafting cultivation had a 7.3% increase in yield and a 0.16%-3.28% decrease in soluble solid content, compared to those produced by self-rooted cultivation. The amount of nitrogen accumulated in grafted melon plants in the early growth phase was lower than that in self-rooted melons, and higher after fruiting. During harvest, the nitrogen accumulation in grafted melon plants was 5.2% higher than that in self-rooted plants, and the nitrogen accumulation in fruits was 10.3% higher. Grafting cultivation increased the amount of nitrogen transferred from plants to fruits by 20.9% compared to self-rooted cultivation. Nitrogen distribution in fruits was >80% in grafted melons, whereas that in self-rooted melons was <80%. Under the same level of nitrogen fertilization, melons cultivated by grafting showed a 1.3%-4.2% increase in nitrogen absorption and utilization rate, a 2.73-5.56 kg·kg⁻¹ increase in nitrogen agronomic efficiency, and a 7.39-16.18 kg·kg⁻¹ increase in nitrogen physiological efficiency, compared to self-rooted cultivation. From the combined perspective of commercial melon yield and nitrogen absorption and utilization rate, an applied nitrogen amount of 240 kg·hm⁻² is most suitable for grafting cultivation in this region.
Temporary diabetes insipidus in 2 men after on-pump coronary artery bypass grafting.
Uyar, Ihsan Sami; Sahin, Veysel; Akpinar, Besir; Yurtman, Volkan; Abacilar, Feyzi; Okur, Faik Fevzi; Ates, Mehmet
2013-01-01
Many complications have been reported after cardiopulmonary bypass. A common physiologic change during the early postoperative period after cardiopulmonary bypass is increased diuresis. In patients whose urine output is increased, postoperative diabetes insipidus can develop, although reports of this are rare. We present the cases of 2 patients who underwent on-pump coronary artery bypass grafting (with cardiopulmonary bypass). Each was diagnosed with diabetes insipidus postoperatively: a 54-year-old man on the 3rd day, and a 66-year-old man on the 4th day. Each patient recovered from the condition after 6 hours of intranasal therapy with synthetic vasopressin (antidiuretic hormone). The diagnosis of diabetes insipidus should be considered in patients who produce excessive urine early after cardiac surgery in which cardiopulmonary bypass has been used.
Zanchet, Marcos Vinícius; Silva, Larissa Luvison Gomes da; Matias, Jorge Eduardo Fouto; Coelho, Júlio Cezar Uili
2016-01-01
The outcome of patients after liver transplant is complex, and characterizing the risk for complications is not always easy. In this context, the hepatic post-reperfusion biopsy is capable of portraying alterations of prognostic importance. To compare the results of liver transplantation, correlating the different histologic features of the hepatic post-reperfusion biopsy with graft dysfunction, primary non-function and patient survival in the first year after transplantation. From the 377 transplants performed from 1996 to 2008, 164 patients were selected. Medical records were reviewed and the following clinical outcomes were registered: mortality at 1, 3, 6 and 12 months, graft dysfunction of varying degrees and primary graft non-function. The post-reperfusion biopsies were examined by a pathologist blinded to the outcomes. The following histological variables were evaluated: ischemic alterations, congestion, steatosis, neutrophilic exudate, monomorphonuclear infiltrate and necrosis. The variables associated with increased mortality were: steatosis (p=0.02209), monomorphonuclear infiltrate (p=0.03935) and necrosis (p<0.00001). The neutrophilic exudate reduced mortality in this study (p=0.00659). Primary non-function showed a significant association (p<0.05) with necrosis, steatosis and the monomorphonuclear infiltrate. The post-reperfusion biopsy is a useful tool to foresee complications after liver transplant.
Urine Metabonomics Reveals Early Biomarkers in Diabetic Cognitive Dysfunction.
Song, Lili; Zhuang, Pengwei; Lin, Mengya; Kang, Mingqin; Liu, Hongyue; Zhang, Yuping; Yang, Zhen; Chen, Yunlong; Zhang, Yanjun
2017-09-01
Recently, increasing attention has been paid to diabetic encephalopathy, which is a frequent diabetic complication and affects nearly 30% of diabetics. Because cognitive dysfunction from diabetic encephalopathy might develop into irreversible dementia, early diagnosis and detection of this disease is of great significance for its prevention and treatment. This study investigated early specific metabolite biomarkers in urine prior to the onset of diabetic cognitive dysfunction (DCD) by using metabolomics technology. An ultra-high-performance liquid chromatography-quadrupole time-of-flight mass spectrometry (UPLC-Q/TOF-MS) platform was used to analyze urine samples from diabetic mice with mild cognitive impairment (MCI) and from diabetic mice without MCI (prior to the onset of DCD). We then screened and validated the early biomarkers using an OPLS-DA model and a support vector machine (SVM) method. Following multivariate statistical and integration analysis, we found that seven metabolites could be accepted as early biomarkers of DCD, and the SVM results showed a prediction accuracy as high as 91.66%. The identities of four biomarkers were determined by mass spectrometry. The identified biomarkers were largely involved in nicotinate and nicotinamide metabolism, glutathione metabolism, tryptophan metabolism, and sphingolipid metabolism. The present study is the first to reveal reliable biomarkers for early diagnosis of DCD. It provides new insight and strategies for the early diagnosis and treatment of DCD.
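To make the classification step concrete, the sketch below shows a cross-validated support vector machine of the kind the abstract reports, applied to a metabolite-intensity matrix; the file name, column names, and the assumption that OPLS-DA screening has already reduced the data to the candidate biomarkers are illustrative, not the study's actual pipeline.

# Sketch of the SVM step on a table of candidate urinary biomarkers
# (hypothetical file/column names; feature screening assumed done beforehand).
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

data = pd.read_csv("urine_biomarkers.csv")         # rows: mice; columns: biomarker intensities + label
X = data.drop(columns=["mci_label"]).values
y = data["mci_label"].values                       # 1 = MCI-associated, 0 = not

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(model, X, y, cv=5)
print("cross-validated accuracy:", scores.mean())  # the abstract reports about 91.7% on its own data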
Parkes, Alison; Sweeting, Helen; Wight, Daniel
2016-10-01
Research on predictors of young children's psychosocial well-being currently relies on adult-reported outcomes. This study investigated whether early family circumstances and parenting predict 7-year-olds' subjective well-being. Information on supportive friendships, liking school and life satisfaction was obtained from 7-year-olds in one Growing Up in Scotland birth cohort in 2012-2013 (N = 2869). Mothers provided information on early childhood factors from 10 to 34 months, parenting (dysfunctional parenting, home learning and protectiveness) from 46 to 70 months, and 7-year-olds' adjustment. Multivariable path models explored associations between early childhood factors, parenting and 7-year-olds' subjective well-being. Supplementary analyses compared findings with those for mother-reported adjustment. In a model of early childhood factors, maternal distress predicted less supportive friendships and lower life satisfaction (coefficients -0.12), poverty predicted less supportive friendships (-0.09) and remote location predicted all outcomes (-0.20 to -0.27). In a model with parenting added, dysfunctional parenting predicted all outcomes (-0.10 to -0.16), home learning predicted liking school (0.11) and life satisfaction (0.08), and protectiveness predicted life satisfaction (0.08). Effects of maternal distress were fully mediated, largely via dysfunctional parenting, while home learning mediated negative effects of low maternal education. Direct effects of poverty and remote location remained. Findings for mother-reported child adjustment were broadly similar. These unique prospective data show that parenting and early childhood circumstances impact 7-year-olds' subjective well-being. They underline the benefits for children of targeting parental mental health and dysfunctional parenting, and of helping parents develop skills to support children at home and school.
Early-Stage Visual Processing and Cortical Amplification Deficits in Schizophrenia
Butler, Pamela D.; Zemon, Vance; Schechter, Isaac; Saperstein, Alice M.; Hoptman, Matthew J.; Lim, Kelvin O.; Revheim, Nadine; Silipo, Gail; Javitt, Daniel C.
2005-01-01
Background Patients with schizophrenia show deficits in early-stage visual processing, potentially reflecting dysfunction of the magnocellular visual pathway. The magnocellular system operates normally in a nonlinear amplification mode mediated by glutamatergic (N-methyl-d-aspartate) receptors. Investigating magnocellular dysfunction in schizophrenia therefore permits evaluation of underlying etiologic hypotheses. Objectives To evaluate magnocellular dysfunction in schizophrenia, relative to known neurochemical and neuroanatomical substrates, and to examine relationships between electrophysiological and behavioral measures of visual pathway dysfunction and relationships with higher cognitive deficits. Design, Setting, and Participants Between-group study at an inpatient state psychiatric hospital and out-patient county psychiatric facilities. Thirty-three patients met DSM-IV criteria for schizophrenia or schizoaffective disorder, and 21 nonpsychiatric volunteers of similar ages composed the control group. Main Outcome Measures (1) Magnocellular and parvocellular evoked potentials, analyzed using nonlinear (Michaelis-Menten) and linear contrast gain approaches; (2) behavioral contrast sensitivity measures; (3) white matter integrity; (4) visual and nonvisual neuropsychological measures, and (5) clinical symptom and community functioning measures. Results Patients generated evoked potentials that were significantly reduced in response to magnocellular-biased, but not parvocellular-biased, stimuli (P=.001). Michaelis-Menten analyses demonstrated reduced contrast gain of the magnocellular system (P=.001). Patients showed decreased contrast sensitivity to magnocellular-biased stimuli (P<.001). Evoked potential deficits were significantly related to decreased white matter integrity in the optic radiations (P<.03). Evoked potential deficits predicted impaired contrast sensitivity (P=.002), which was in turn related to deficits in complex visual processing (P≤.04). Both evoked potential (P≤.04) and contrast sensitivity (P=.01) measures significantly predicted community functioning. Conclusions These findings confirm the existence of early-stage visual processing dysfunction in schizophrenia and provide the first evidence that such deficits are due to decreased nonlinear signal amplification, consistent with glutamatergic theories. Neuroimaging studies support the hypothesis of dysfunction within low-level visual pathways involving thalamocortical radiations. Deficits in early-stage visual processing significantly predict higher cognitive deficits. PMID:15867102
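The nonlinear contrast-gain analysis referred to above fits a Michaelis-Menten (hyperbolic-ratio) function, R(c) = Rmax * c / (c + c50), to evoked-potential amplitude as a function of stimulus contrast; reduced amplification shows up as a lower Rmax and/or a higher semisaturation contrast. The sketch below fits such a curve with SciPy on invented numbers; the data and parameter names are illustrative only, not the study's analysis.

# Fitting a Michaelis-Menten contrast-response function to (hypothetical) evoked-potential amplitudes.
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(contrast, r_max, c50):
    return r_max * contrast / (contrast + c50)

contrast = np.array([1, 2, 4, 8, 16, 32, 64])              # stimulus contrast (%)
amplitude = np.array([0.8, 1.5, 2.6, 3.9, 4.8, 5.4, 5.7])  # invented amplitudes (microvolts)

params, _ = curve_fit(michaelis_menten, contrast, amplitude, p0=[6.0, 10.0])
print("Rmax = %.2f, c50 = %.2f" % tuple(params))           # group differences in these fits index contrast gain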
2010-01-01
Background Lactate clearance, a surrogate for the magnitude and duration of global tissue hypoxia, is used diagnostically, therapeutically and prognostically. This study examined the association of early lactate clearance with selected inflammatory, coagulation, and apoptosis response biomarkers and organ dysfunction scores in severe sepsis and septic shock. Methods Measurements of serum arterial lactate, biomarkers (interleukin-1 receptor antagonist, interleukin-6, interleukin-8, interleukin-10, tumor necrosis factor-alpha, intercellular adhesion molecule-1, high mobility group box-1, D-dimer and caspase-3), and organ dysfunction scores (Acute Physiology and Chronic Health Evaluation II, Simplified Acute Physiology Score II, Multiple Organ Dysfunction Score, and Sequential Organ Failure Assessment) were obtained in conjunction with a prospective, randomized study examining early goal-directed therapy in severe sepsis and septic shock patients presenting to the emergency department (ED). Lactate clearance was defined as the percent change in lactate levels after six hours from a baseline measurement in the ED. Results Two hundred twenty patients, age 65.0 +/- 17.1 years, were examined, with an overall lactate clearance of 35.5 +/- 43.1% and an in-hospital mortality rate of 35.0%. Patients were divided into four quartiles of lactate clearance, -24.3 +/- 42.3, 30.1 +/- 7.5, 53.4 +/- 6.6, and 75.1 +/- 7.1%, respectively (p < 0.01). The mean levels of all biomarkers and organ dysfunction scores over 72 hours were significantly lower in the higher lactate clearance quartiles (p < 0.01). In-hospital, 28-day, and 60-day mortality were significantly lower in the higher lactate clearance quartiles (p < 0.01). Conclusions Early lactate clearance, as a surrogate for the resolution of global tissue hypoxia, is significantly associated with decreased levels of biomarkers, improvement in organ dysfunction and better outcome in severe sepsis and septic shock. PMID:20181046
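The lactate clearance definition used above is a simple percent change from the emergency-department baseline to the 6-hour value; a negative result means lactate rose rather than cleared. A minimal illustration:

# Lactate clearance as defined in the abstract: percent change from ED baseline at 6 hours.
def lactate_clearance(lactate_baseline_mmol_l, lactate_6h_mmol_l):
    return (lactate_baseline_mmol_l - lactate_6h_mmol_l) / lactate_baseline_mmol_l * 100.0

print(lactate_clearance(4.0, 2.5))   # 4.0 -> 2.5 mmol/L gives 37.5% clearance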
Panch, Sandhya R; Szymanski, James; Savani, Bipin N; Stroncek, David F
2017-08-01
Bone marrow (BM) aspirates, mobilized peripheral blood, and umbilical cord blood (UCB) have developed as graft sources for hematopoietic stem and progenitor cells (HSPCs) for stem cell transplantation and other cellular therapeutics. Individualized techniques are necessary to enhance graft HSPC yields and cell quality from each graft source. BM aspirates yield adequate CD34+ cells but can result in relative delays in engraftment. Granulocyte colony-stimulating factor (G-CSF)-primed BM HSPCs may facilitate faster engraftment while minimizing graft-versus-host disease in certain patient subsets. The levels of circulating HSPCs are enhanced using mobilizing agents, such as G-CSF and/or plerixafor, which act via the stromal cell-derived factor 1/C-X-C chemokine receptor type 4 axis. Alternate niche pathway mediators, including very late antigen-4/vascular cell adhesion molecule-1, heparan sulfate proteoglycans, parathyroid hormone, and coagulation cascade intermediates, may offer promising alternatives for graft enhancement. UCB grafts have been expanded ex vivo with cytokines, Notch ligand, or mesenchymal stromal cells, and most studies demonstrated greater quantities of CD34+ cells ex vivo and improved short-term engraftment. No significant changes were observed in long-term repopulating potential or in patient survival. Early phase clinical trials using nicotinamide and StemRegenin 1 may offer improved short- and long-term repopulating ability. Breakthroughs in genome editing and stem cell reprogramming technologies may hasten the generation of pooled, third-party HSPC grafts. This review elucidates past, present, and potential future approaches to HSPC graft optimization. Published by Elsevier Inc.
Greenwood, John; Amjadi, Mahyar; Dearman, Bronwyn; Mackie, Ian
2009-01-01
Objectives: During the first 48 hours after placement, an autograft “drinks” nutrients and dissolved oxygen from fluid exuding from the underlying recipient bed (“plasmatic imbibition”). The theory of inosculation (that skin grafts subsequently obtain nourishment via blood vessel “anastomosis” between new vessels invading from the wound bed and existing graft vessels) was hotly debated from the late 19th to mid-20th century. This study aimed to noninvasively observe blood flow in split skin grafts and Integra™ dermal regeneration matrix to provide further proof of inosculation and to contrast the structure of vascularization in both materials, reflecting mechanism. Methods: Observations were made both clinically and using confocal microscopy on normal skin, split skin graft, and Integra™. The VivaScope™ allows noninvasive, real-time, in vivo images of tissue to be obtained. Results: Observations of blood flow and tissue architecture in autologous skin graft and Integra™ suggest that 2 very different processes are occurring in the establishment of circulation in each case. Inosculation provides rapid circulatory return to skin grafts whereas slower neovascularization creates an unusual initial Integra™ circulation. Conclusions: The advent of confocal laser microscopy like the VivaScope 1500™, together with “virtual” journals such as ePlasty, enables us to provide exciting images and distribute them widely to a “reading” audience. The development of the early Integra™ vasculature by neovascularization results in a large-vessel, high-volume, rapid flow circulation contrasting markedly from the inosculatory process in skin grafts and the capillary circulation in normal skin and merits further (planned) investigation. PMID:19787028
[Late results following surgical correction of syndactyly and symbrachydactyly].
Deutinger, M; Mandl, H; Frey, M; Holle, J; Freilinger, G
1989-02-01
Growth and the type of surgical treatment of the hand play an important role in the results of surgery in children. 29 patients who had been operated on for syndactyly and symbrachydactyly were examined at follow-up. The following parameters were assessed: type of incision and skin graft, functional results, x-ray examination of the skeleton and the depth of the commissure, colour of the skin graft and use of the hand. After operation for syndactyly, all patients were able to use their hands normally, although the full extent of flexion and extension was achieved in only 20 of 22 hands. In 5 divided pairs of fingers there was recurrence of syndactyly. In all cases except one, a split thickness skin graft had been used. After operative treatment of symbrachydactyly and complex syndactyly, the full extent of flexion was achieved in 13 of 19 hands; in 6 hands the range of flexion was incomplete because of skeletal abnormalities. Recurrence occurred in 9 divided pairs of fingers; in 7 cases, a split thickness skin graft had been used. Despite this, all patients were able to use their hands normally. The use of split thickness skin grafts resulted in a 60% recurrence rate, whereas the use of full thickness skin grafts led to a recurrence rate of merely 7.5%. Our results show the advantage of the full thickness skin graft. As a consequence, full thickness skin grafts should be used in all cases. Furthermore, the operation should be performed at an early age if fingers of unequal length have to be separated. Zig-zag incisions should be used in all cases.
Foreskin-isolated keratinocytes provide successful extemporaneous autologous paediatric skin grafts.
Mcheik, Jiad N; Barrault, Christine; Pedretti, Nathalie; Garnier, Julien; Juchaux, Franck; Levard, Guillaume; Morel, Franck; Lecron, Jean-Claude; Bernard, François-Xavier
2016-03-01
Severe burns in children are conventionally treated with split-thickness skin autografts or epidermal sheets. However, neither early complete healing nor quality of epithelialization is satisfactory. An alternative approach is to graft isolated keratinocytes. We evaluated paediatric foreskin and auricular skin as donor sources for autologous keratinocyte transplantation, and compared the graft efficiency to the in vitro capacities of isolated keratinocytes to divide and reconstitute epidermal tissue. Keratinocytes were isolated from surgical samples by enzymatic digestion. Living cell recovery, in vitro proliferation and epidermal reconstruction capacities were evaluated. Differentiation status was analysed, using qRT-PCR and immunolabelling. Eleven children were grafted with foreskin-derived (boys) or auricular (girls) keratinocyte suspensions dripped onto deep severe burns. The aesthetic and functional quality of epithelialization was monitored in a standardized way. Foreskin keratinocyte grafting in male children provides for the re-epithelialization of partial deep severe burns and accelerates wound healing, thus allowing successful wound closure, and improves the quality of scars. Accordingly, in vitro studies have revealed a high yield of living keratinocyte recovery from foreskin and their potential in terms of regeneration and differentiation. We report a successful method for grafting paediatric males presenting with large severe burns through direct spreading of autologous foreskin keratinocytes. This alternative method is easy to implement, improves the quality of skin and minimizes associated donor site morbidity. In vitro studies have highlighted the potential of foreskin tissue for graft applications and could help in tissue selection with the prospect of grafting burns in girls. Copyright © 2013 John Wiley & Sons, Ltd.
Peters, D C; Noble, S
1999-02-01
Cardiopulmonary bypass (CPB) is associated with defective haemostasis which results in bleeding and the requirement for allogenic blood product transfusions in many patients undergoing open heart surgery (OHS) and/or coronary artery bypass graft surgery (CABG) with CPB. Conservation of blood has become a priority during surgery because of shortages of donor blood, the risks associated with the use of allogenic blood products and the costs of these products. Aprotinin is a serine protease inhibitor isolated from bovine lung tissue which acts in a number of interrelated ways to provide an antifibrinolytic effect, inhibit contact activation, reduce platelet dysfunction and attenuate the inflammatory response to CPB. It is used to reduce blood loss and transfusion requirements in patients with a risk of haemorrhage and has clear advantages over placebo or no treatment. High dose aprotinin significantly reduces postoperative blood loss compared with aminocaproic acid and desmopressin, and decreases transfusion requirements compared with desmopressin. Results are less consistent with tranexamic acid: high dose aprotinin either reduces blood loss significantly more than, or to an equivalent level to, tranexamic acid. A variety of other lower aprotinin dosage regimens consistently result in similar reductions in blood loss to aminocaproic acid or tranexamic acid. Data from clinical trials indicate that aprotinin is generally well tolerated, and the adverse events seen are those expected in patients undergoing OHS and/or CABG with CPB. Hypersensitivity reactions occur in <0.1 to 0.6% of patients receiving aprotinin for the first time. The results of original reports indicating that aprotinin therapy may increase myocardial infarction rates or mortality have not been supported by more recent studies specifically designed to investigate this outcome. However, a tendency to early vein graft occlusion with aprotinin has been shown and care with anticoagulation and vessel grafts is required. No comparative tolerability data between aprotinin and the lysine analogues, aminocaproic acid and tranexamic acid, are available. Comparative tolerability and cost-effectiveness data for aprotinin and the lysine analogues are required to more fully assess their individual roles in reducing blood loss and transfusion requirements in patients undergoing CPB during OHS and/or CABG. However, clinical evidence to date supports the use of aprotinin over its competitors in patients at high risk of haemorrhage, in those for whom transfusion is unavailable or in patients who refuse allogenic transfusions.
Roland, Bartholomew P.; Zeccola, Alison M.; Larsen, Samantha B.; ...
2016-03-31
Triosephosphate isomerase (TPI) deficiency is a poorly understood disease characterized by hemolytic anemia, cardiomyopathy, neurologic dysfunction, and early death. TPI deficiency is one of a group of diseases known as glycolytic enzymopathies, but is unique for its severe patient neuropathology and early mortality. The disease is caused by missense mutations and dysfunction in the glycolytic enzyme, TPI. Previous studies have detailed structural and catalytic changes elicited by disease-associated TPI substitutions, and samples of patient erythrocytes have yielded insight into patient hemolytic anemia; however, the neuropathophysiology of this disease remains a mystery. This study combines structural, biochemical, and genetic approaches to demonstrate that perturbations of the TPI dimer interface are sufficient to elicit TPI deficiency neuropathogenesis. Also, the present study demonstrates that neurologic dysfunction resulting from TPI deficiency is characterized by synaptic vesicle dysfunction, and can be attenuated with catalytically inactive TPI. Collectively, our findings are the first to identify, to our knowledge, a functional synaptic defect in TPI deficiency derived from molecular changes in the TPI dimer interface.
Left ventricular function abnormalities as a manifestation of silent myocardial ischemia.
Lambert, C R; Conti, C R; Pepine, C J
1986-11-01
A large body of evidence exists indicating that left ventricular dysfunction is a common occurrence in patients with severe coronary artery disease and represents silent or asymptomatic myocardial ischemia. Such dysfunction probably occurs early in the time course of every ischemic episode in patients with coronary artery disease whether symptoms are eventually manifested or not. The pathophysiology of silent versus symptomatic left ventricular dysfunction due to ischemia appears to be identical. Silent ischemia-related left ventricular dysfunction can be documented during spontaneous or stress-induced perturbations in the myocardial oxygen supply/demand ratio. It also may be detected by nitroglycerin-induced improvement in ventricular function or by salutary changes in wall motion following revascularization. Silent left ventricular dysfunction is a very early occurrence during ischemia and precedes electrocardiographic abnormalities. In this light, its existence should always be kept in mind when dealing with patients with ischemic heart disease. It can be hypothesized that because silent ischemia appears to be identical to ischemia with symptoms in a pathophysiologic sense, prognosis and treatment in both cases should be the same.