Objectives: We investigated the influence of the interaction between donor age and cold ischemia time on allograft survival in the absence of delayed graft function, early acute rejection, or the combination of both.
Materials and Methods: We conducted a retrospective analysis of a cohort of patients who received first transplants of living-related and deceased-donor allografts between 2001 and 2016. Predictors included cold ischemia time, donor and recipient age and sex, body mass index, renal replacement therapy duration, cause of end-stage renal disease, HLA class I and II mismatches, panel reactive antibody score, donor creatinine concentration, development of delayed graft function, and biopsy-proven acute rejection. The response variable was time until return to renal replacement therapy. Patients who died with functioning allografts were censored at the time of death. Analyses included multivariable Cox proportional hazards regression.
Results: The study included 498 patients followed for a median of 4.1 years, with a median cold ischemia time of 17.0 hours. On multivariable analysis, allograft survival was negatively affected by the cold ischemia time-donor age interaction (P = .026), acute rejection (P = .043), delayed graft function (P = .001), and acute rejection combined with delayed graft function (P = .002). Restricted mean allograft survival times in patients who developed neither delayed graft function nor acute rejection decreased from 13.6 to 8.6 years when cold ischemia time increased from 12 to 36 hours and donor age increased from 30 to 60 years.
Conclusions: Allograft survival was negatively affected by the donor age-cold ischemia time interaction independently of the development of delayed graft function, acute rejection, or their combination.
Key words: Kidney transplant, Multivariable survival models, Renal transplant
Introduction
Kidney transplant remains the best option for the treatment of end-stage renal disease, considering both patient survival and quality of life.1,2 The worldwide recognition of this fact has led to the rise in the number of renal transplants, despite the ever-pressing limitation in organ availability.3,4 Currently, Brazil ranks second in the absolute number of renal transplants,5 tallying 5648 operations in 2015.
This large number of renal transplants contrasts with the paucity of mid- and long-term allograft survival data from Brazil,5,6 notwithstanding recent estimates from the 2016 Annual Brazilian Report,5 which places 1- and 5-year allograft survival rates at 93% and 86% for living-related donor transplants and at 84% and 73% for deceased-donor transplants.
Allograft survival can be influenced by pretransplant factors such as cold ischemia time (CIT), age of donor or recipient, organ recovery and storage (simple cold storage or pulsatile), cardiovascular comorbidities, and immunologic sensitization.3,7 Reports of successful kidney transplants with prolonged CIT in the 1980s8-10 led many transplant centers to reorganize their CIT protocols and to accept organs previously unused for kidney transplant. Some studies have shown that prolonged CIT does not alter renal perfusion11 and is not a good predictor of allograft survival.12 However, other reports have shown that longer CITs negatively affect early allograft function and can be decisive for long-term allograft survival.13-18 Cold ischemia time is the key factor in ischemia-reperfusion injury, which underlies the development of delayed graft function (DGF) through acute tubular necrosis (ATN); ischemia-reperfusion injury is also pathophysiologically connected to acute cellular rejection (AR), which may in turn aggravate DGF.13,19-22 Moreover, there is evidence that donor age amplifies the detrimental effects of CIT on allograft survival.18,23 The influence of CIT on mid- and long-term allograft survival holds special interest not only because of its prognostic implications but also because CIT has the potential to be shortened through optimization of organ recovery protocols, organ transport, and patients’ preparation for surgery.
In this study, we investigated the extent to which the combined influence of donor age and CIT on allograft survival depends on the development of DGF, AR, or the combination of both; most importantly, we investigated the persistence of the effects of the donor age-CIT interaction in the absence of these complications. Our approach to these questions involved the use of multivariable regression models, including an interaction term between CIT and donor age. Because interaction terms between continuous predictors pose additional hurdles in understanding regression models’ implications by inspection of the respective coefficients, we chose to present our results mainly with graphs and prediction tables in an effort to improve clarity for readers.
Materials and Methods
Data acquisition
This retrospective study was approved by our Institutional Review Board. We
identified from our database all consecutive patients who received first renal
transplant, from either living-related donors or deceased donors, between
January 2001 and June 2016. Patients younger than 18 years, as well as those who
lacked follow-up information, were excluded from the analysis. Follow-up was
recorded until the last clinical visit, at which it was ascertained whether the
patient had returned to renal replacement therapy. Patients who died with
functioning allografts were censored at the time of death.
Predictors
Continuous predictors included patient and donor age (in years), duration of
renal replacement therapy before transplant (RRT; either hemodialysis or
peritoneal dialysis, in years), and CIT, defined as the number of hours between
aortic (in deceased donors) or renal artery (in living-related donors) clamping
and recipient vascular anastomoses. Patient body mass index (BMI; in kg/m2),
panel reactive antibody (PRA; as a percentage, determined at most 6 months
before transplant), and donor plasma creatinine concentration (in mg%)
immediately before organ retrieval were also included as continuous predictors.
Categorical predictors were patient and donor sex; organ origin (that is, whether allografts came from living-related or deceased donors, the latter further classified as deceased from vascular or nonvascular causes); HLA class I mismatches, categorized as 4, 3-1, and 0 mismatches in the A and B loci; HLA class II mismatches, categorized as 2, 1, and 0 mismatches in the DR locus; and cause of end-stage renal disease, categorized as systemic, renal, urologic, autosomal dominant polycystic kidney disease, and indeterminate causes.

Development of DGF (defined as the need for at least 1 dialysis session during the first posttransplant week) was further classified, in accordance with histopathologic analysis of posttransplant biopsies (performed during the posttransplant hospitalization), into cases with or without evidence of AR (coded, respectively, as DGF+AR+ and DGF+AR-). In a similar fashion, histopathologic evidence of AR in the absence of DGF was coded as DGF-AR+, and development of neither DGF nor AR was coded as DGF-AR- (see the coding sketch below).

Initial immunosuppressive therapy always included steroids (usually prednisone) and varied according to the following combinations: cyclosporine and azathioprine, cyclosporine and mycophenolate, tacrolimus and azathioprine, tacrolimus and mycophenolate, and no calcineurin inhibitor use, with or without induction with thymoglobulin.
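As a concrete illustration, the four-level DGF/AR status could be coded as in the following R sketch; the data frame transplants and the logical columns dgf and ar are hypothetical placeholders, not names from our database.

    ## Combine two logical indicators (delayed graft function and
    ## biopsy-proven acute rejection) into the four-level status factor.
    transplants$dgf_ar <- factor(
      paste0("DGF", ifelse(transplants$dgf, "+", "-"),
             "AR",  ifelse(transplants$ar,  "+", "-")),
      levels = c("DGF-AR-", "DGF-AR+", "DGF+AR-", "DGF+AR+"))

Listing DGF-AR- as the first level makes the uncomplicated group the reference category in subsequent regression models.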
Because all allografts were perfused with Euro-Collins solution and kept in nonpulsatile cold storage, organ preservation methods were not included in the predictor set. The primary outcome was allograft survival. Allografts were considered functional if, at the last visit, the patient had not returned to RRT.
Univariate and bivariate analyses
Continuous variables are shown as median values and interquartile ranges (IQR),
and categorical variables are shown as frequencies. Differences between
continuous predictors were assessed with Wilcoxon and Kruskal-Wallis tests, the
latter followed by the Dunn post hoc test as appropriate. Differences between
categorical predictors were evaluated with the Pearson chi-square test. We used
the log-rank test for bivariate analyses of graft survival, and we used logistic
models to investigate whether CIT, donor age, and the interaction of both were
associated with DGF, AR, or the combination of both DGF and AR.
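A minimal R sketch of these bivariate analyses follows; all names (the data frame transplants, the columns surv_years, graft_loss, cit_hours, and donor_age, and the factor dgf_ar from the sketch above) are hypothetical placeholders.

    library(survival)  # Surv(), survdiff()

    ## Log-rank test of graft survival across the four DGF/AR groups.
    survdiff(Surv(surv_years, graft_loss) ~ dgf_ar, data = transplants)

    ## 0/1 indicator for DGF without AR, derived from the factor above.
    transplants$dgf_no_ar <- as.integer(transplants$dgf_ar == "DGF+AR-")

    ## Logistic model for DGF without AR as a function of CIT, donor age,
    ## and their interaction.
    fit_dgf <- glm(dgf_no_ar ~ cit_hours * donor_age,
                   family = binomial, data = transplants)
    summary(fit_dgf)               # Wald tests, including the interaction
    exp(cbind(OR = coef(fit_dgf),  # odds ratios with 95% confidence intervals
              confint.default(fit_dgf)))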
Cox proportional hazards regression
Multivariable Cox proportional hazards regression was undertaken after variable imputation with predictive mean matching.24 Potential nonlinear effects of the predictors were assessed by comparing Spearman ρ with the corresponding ρ² statistic. All predictors enumerated above, plus an interaction term between donor age and CIT, were included in the regression equation.25 We examined the proportional hazards assumption by computing and plotting scaled Schoenfeld residuals.
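This workflow can be sketched in R with the Hmisc/rms tool chain used in this study; the variable names below (surv_years, graft_loss, cit_hours, donor_age, donor_creat, pra, bmi) are hypothetical placeholders, and the predictor list is abbreviated for brevity.

    library(survival)  # Surv(), cox.zph()
    library(rms)       # cph(), rcs(); attaches Hmisc (aregImpute, spearman2)

    ## Screen for potential nonlinearity: generalized Spearman rho^2
    ## allowing a quadratic fit in the ranks of each predictor.
    spearman2(surv_years ~ cit_hours + donor_age + donor_creat,
              data = transplants, p = 2)

    ## Multiple imputation by predictive mean matching (50 data sets) ...
    imp <- aregImpute(~ surv_years + graft_loss + cit_hours + donor_age +
                        donor_creat + pra + bmi,
                      data = transplants, n.impute = 50)

    ## ... then a Cox model with a continuous CIT x donor age interaction,
    ## fitted to each completed data set and combined.
    fit <- fit.mult.impute(
      Surv(surv_years, graft_loss) ~ cit_hours * donor_age +
        rcs(donor_creat, 4) + pra + bmi,
      cph, imp, data = transplants, x = TRUE, y = TRUE)

    ## Proportional hazards assumption via scaled Schoenfeld residuals,
    ## illustrated here on a complete-case fit.
    cc <- cph(Surv(surv_years, graft_loss) ~ cit_hours * donor_age +
                rcs(donor_creat, 4) + pra + bmi,
              data = transplants, x = TRUE, y = TRUE, surv = TRUE)
    plot(cox.zph(cc))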
All analyses took place within the R statistical language environment (version 3.4.4),26 using mostly the rms24 and survival27 libraries. Because direct interpretation of regression coefficients and corresponding hazard ratios (HR) is complicated by the interaction term between 2 continuous variables,28 these results are presented mostly graphically and with prediction tables containing restricted mean survival times (RMST) computed from the regression model. Confidence intervals were calculated at the 95% level, and P values were computed where appropriate, with statistical significance set at P < .05.
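Under the same hypothetical names, such a prediction table can be derived by evaluating the fitted model (cc, from the sketch above) over a grid of CIT and donor age values and summarizing each predicted survival curve by its restricted mean; the truncation point of 15 years is an arbitrary illustration.

    ## Predicted survival curves for a 3 x 3 grid of CIT / donor age
    ## profiles, with the other covariates held at illustrative values.
    grid <- expand.grid(cit_hours = c(12, 24, 36),
                        donor_age = c(30, 45, 60),
                        donor_creat = 1.0, pra = 0, bmi = 24)
    sf <- survfit(cc, newdata = grid)

    ## RMST (truncated at 15 years), one row per covariate profile.
    summary(sf, rmean = 15)$table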
Results
Univariate and bivariate analyses
We identified 527 patients who underwent kidney transplant during the study
period, and 29 cases were excluded: 15 patients were younger than 18 years, 9
patients had undergone a second transplant, and 5 patients lacked follow-up
data. Most of the remaining 498 patients were male (301, 60.4%)
and received organs from deceased donors (335, 67.3%). All 163 living donors
were related to the recipient, with 113 brothers (69.3%), 34 parents (20.9%), 13
sons (8%), and 3 uncles (1.8%). Table 1 displays the distribution of the
predictors in our study population.
Median follow-up was 4.1 years (IQR, 1.8-7.9 y), and 96 patients (19.3%) lost allograft function (Figure 1), chiefly because of chronic allograft nephropathy (42, 43.8%) and chronic rejection (18, 18.8%). There were 58 patients (11.6%) who died during the study period, with most (51, 87.9%) maintaining a functioning allograft. Systemic arterial hypertension was the leading single cause of end-stage renal disease (119, 23.9%), and one-third of the patients had no identifiable cause for end-stage renal disease (Table 2).
As expected, cold ischemia time was longer in deceased-donor transplants than in living-related donor transplants (median 22.0 vs 1.9 h; P < .001, Wilcoxon test; Figure 2). Median ages were similar between male and female living-related donors (39.0 y for both; P = .49, Wilcoxon test), but deceased male donors were younger than deceased female donors (median 33.0 vs 39.0 y; P < .001, Wilcoxon test).
Patients receiving kidneys from deceased donors had longer periods of RRT compared with those receiving organs from living-related donors (median of 4.0 vs 2.1 y; P < .001, Wilcoxon test). Living-related female donors slightly outnumbered living-related male donors (72/150, 52.0%), but deceased donors were mostly male (198/324, 61.1%; P = .01, Pearson chi-square test).
Bivariate logistic models revealed that the interaction between CIT and donor age was associated with the development of DGF without AR (odds ratio 1.00; 95% confidence interval, 1.00-1.00; P = .02). Cold ischemia time correlated weakly with donor age among patients who received allografts from deceased donors (ρ = 0.105; P = .07).
Overall probabilities of allograft survival at 1, 5, and 10 years were, respectively, 90.5%, 79.4%, and 71.2%. The development of AR alone, DGF alone, or DGF combined with AR shortened allograft RMST (10.5, 9.3, and 9.1 years, respectively, compared with 12.6 years in patients with neither complication; P = .005, P < .001, and P < .001, respectively, log-rank test; Figure 2).
Allograft survival did not differ significantly with regard to patient and donor sex and age, duration of RRT, HLA class II mismatches, initial immunosuppressive therapy, donor creatinine concentration, PRA scores, or cause of end-stage renal disease. Allograft survival was longer in patients with no HLA class I mismatches than in those with 3-1 and 4 HLA class I mismatches (RMST 13.2 y compared with 11.0 and 10.6 y; P = .001, log-rank test), as well as in patients who received allografts from donors deceased from nonvascular rather than vascular causes (RMST 11.2 vs 8.9 y; P = .02, log-rank test).
Multivariable Cox proportional hazards regression
Multivariable Cox proportional hazards regression was undertaken with 50 imputed data sets. Donor cause of death was removed from the predictor set because its values could be computed from the other predictors (R2 = 0.934). Although there was evidence of nonlinear effects for donor creatinine (modeled with 4-knot restricted cubic splines), models with and without nonlinear terms for this predictor did not differ significantly (P = .34, likelihood ratio test). Examination of Schoenfeld residuals revealed violation of the proportional hazards assumption for donor creatinine and HLA class II mismatches, which was addressed by allowing these coefficients to vary across time intervals. Table 3 displays hazard ratios extracted from the regression model.
The interaction between CIT and donor age had an independent synergistic adverse effect on allograft survival; that is, donor age amplified the effect of CIT even in the absence of AR, DGF, or the combination of both (Table 4). For allografts from 30-, 45-, and 60-year-old donors exposed to 12 hours of CIT, in patients presenting neither DGF nor AR, restricted mean survival time was computed at 13.6 years. Increasing CIT to 36 hours reduced RMST to 13.5, 11.8, and 8.6 years, respectively. Allograft survival curves extracted from the regression model are presented in Figure 3.
Discussion
This study, including 498 adult patients who underwent first kidney transplant with a median follow-up of 4.1 years, indicated that donor age magnified the detrimental effect of CIT on allograft survival. The adverse effect of the donor age-CIT interaction was independent of the development of AR, DGF, or the combination of both, although the occurrence of these complications further shortened allograft survival.
Although some previous studies reported that CIT and donor age were risk factors for ATN, those studies did not find an association between ATN and an increase in allograft loss17,29,30; there is growing evidence that donor age does indeed compound the effect of CIT on allograft survival. With the use of multivariable survival analysis of a large sample (6317 patients) from the Australia and New Zealand Dialysis and Transplant Registry,23 Lim and collaborators observed decreased death-censored allograft survival for extended criteria donors with CIT less than 12 hours (HR 1.46) or greater than 12 hours (HR 1.91) relative to standard criteria donors with CIT less than 12 hours. Avoiding the loss of power inherent to variable dichotomization,25,28 Debout and colleagues18 found not only that CIT exerts its deleterious effects on graft survival in a continuous manner (adjusted HR 1.013 for each additional hour) but also that allografts from donors older than 61 years survived less often than allografts from donors aged 50 years or younger.
These findings are compatible with ours and can be interpreted in light of the kidney senescence model advanced by Halloran and colleagues.19,31,32 In that model, allograft survival, primarily governed by the development of chronic allograft nephropathy, is heavily influenced by the functional reserve of the transplanted kidney, which in turn is mainly a function of donor age. In support of this concept, Veroux and colleagues from Catania, Italy,33 found that the greatest impact on allograft survival was among patients older than 65 years receiving kidneys from similarly aged (> 65 years) donors, although donor age negatively affected survival for recipients of any age. American investigators Noppakun and colleagues34 also noted a progressive reduction in living-related allograft survival for each decade of donor age above 40 years. It is not by accident that age is the main feature in the definition of expanded criteria donors.7
Increased allograft immunogenicity may also constitute a pathophysiologic underpinning of the continuous manner in which the donor age-CIT interaction affected long-term allograft survival in our cohort. Although longer CITs reduce the functional lifespan of older allografts through direct ischemia-reperfusion injury,13-15,35 such reduction is likely also immunologically mediated, because ischemia-reperfusion tissue injury leads to antigen exposure and promotion of innate and adaptive immune responses.20,22 In fact, there is evidence that allograft immunogenicity is increased by prolonged CIT, as shown by Bryan and associates,36 who reported that a longer CIT (> 15 h) independently increased the risk of HLA class I antibody production in patients who subsequently rejected their allografts.
Ischemia-reperfusion injury can be minimized with the use of preservation solutions, which facilitate the use of marginal grafts and help reduce the deleterious effects of prolonged CIT. In an experimental animal study by Haberal and colleagues,37 a new preservation solution, Baskent University Preservation Solution, resulted in better histologic outcomes in transplanted rat and pig kidneys than the University of Wisconsin (UW) and histidine-tryptophan-ketoglutarate (HTK) solutions. These findings suggest that the development of new preservation solutions may reduce the negative effects of prolonged CIT, thereby lessening DGF and improving allograft survival.
Although we found only a weak correlation between deceased-donor age and CIT (ρ = 0.105, Figure 4), it remains plausible that donor age influences CIT. Cold ischemia times for older organs can lengthen when transplant teams assess these allografts on the basis of biopsy results, because this process requires additional time for processing and interpretation. In addition, the higher prevalence of glomerulosclerosis in older kidneys inflates refusal rates and may complicate organ allocation and transportation, especially in countries with large territorial extensions, such as Brazil. A positive correlation between donor age and CIT may represent a worrisome aspect of the transplant of older allografts and, in our estimation, merits a closer look with regard to possible repercussions for organ allocation policy.
We acknowledge that this study has many limitations. Selection bias affects all retrospective studies, and ours is surely no exception. One manifestation of selection bias in our study is the overwhelming predominance of low PRA scores among our patients; we attribute this bias (toward less immunologically sensitized patients) to our limited and unreliable supply of thymoglobulin during the study period. We also suspect, from the rather low values of donor plasma creatinine, that this restricted availability of thymoglobulin drove deceased-donor allograft selection toward donors in better clinical condition. This peculiarity of our data set restricts the generalizability of our results, which may represent an optimistic scenario with respect to organ survival.
Our database also lacked comorbidity data for deceased donors, and we are cognizant that this predictor might have added valuable information regarding allograft functional reserve. Additionally, because our intention was to analyze how early posttransplant predictors affect allograft survival, we deliberately did not include variables such as subsequent modifications of immunosuppressive therapy and AR episodes occurring after patient discharge.
As a final note, we wish to address our decision to include data from both living-donor and deceased-donor transplant patients. We acknowledge that the former group infrequently develops DGF; however, although we sought to evaluate the widest possible range of CITs, we also sought to ascertain how the donor age-CIT interaction influences allograft survival in patients who develop neither DGF nor AR. We further acknowledge that CIT alone does not fully capture the plethora of hormonal, autonomic, and cardiovascular imbalances that follow brain death, which are also likely to influence allograft functional recovery.
In conclusion, our analyses indicated that donor age amplifies the deleterious effect of CIT on mid- and long-term kidney allograft survival. This detrimental synergistic effect of the donor age-CIT interaction was observed in patients who developed neither DGF nor AR. Given the growing need to use older allografts, these results substantiate the need to further optimize organ retrieval and distribution procedures in order to maximally reduce CITs.
References:
Volume: 18
Issue: 4
Pages: 436-443
DOI: 10.6002/ect.2020.0066
From the 1Department of Urology and Kidney Transplantation, Base Hospital of the
Federal District, Brasília-DF, Brazil; the 2Division of Urology and Kidney
Transplantation, University Hospital of Brasília, Brasília-DF, Brazil; the
3Division of Urology, Faculty of Medical Sciences, State University of Campinas,
São Paulo-SP, Brazil; and the 4Department of Nephrology and Kidney
Transplantation, Base Hospital of the Federal District, Brasília-DF, Brazil
Acknowledgements: The authors have no sources of funding for this study and have
no conflicts of interest to declare.
Corresponding author: Pedro Rincon C. da Cruz, Department of Urology of the Base
Hospital of the Federal District (HBDF), SMHS Área Especial, Quadra 101, 7º
Andar, Brasília-DF, Brazil 70.335-900
E-mail: pc-cruz@uol.com.br
Table 1. Distribution of Continuous and Categorical Predictors
Table 2. Specific and Aggregated Causes of End-Stage Renal Disease
Table 3. Hazard Ratios with 95% Confidence Intervals Extracted From Cox Proportional Hazards Regression Model
Table 4. Restricted Mean Survival Times for Selected Values of Cold Ischemia Time and Donor Age*
Figure 1. Kaplan-Meier Overall Allograft Survival Curve
Figure 2. Kaplan-Meier Allograft Survival Curves
Figure 3. Allograft Survival Curves Estimated From the Regression Model
Figure 4. Correlation Between Donor Age and Cold Ischemia Time