Immunology and Cell Biology (2015) 93, A1–A35 & 2015 Australasian Society for Immunology Inc. All rights reserved 0818-9641/15 www.nature.com/icb
ABSTRACTS
Organ donation and ethics Immunology and Cell Biology (2015) 93, A1–A35; doi:10.1038/icb.2015.75
KIDNEY DONOR RISK INDEX VERSUS ESTIMATED POST-TRANSPLANT SURVIVAL AMONG AUSTRALIAN DONOR-RECIPIENT PAIRS White Sarah1, Clayton Philip2,3, McDonald Stephen2,4, Chadban Steven1,5 1School of Medicine, University of Sydney, 2ANZDATA, 3Department of Nephrology, Prince of Wales Hospital, Sydney, 4Department of Nephrology, Royal Adelaide Hospital, 5Transplantation Services, Royal Prince Alfred Hospital, Sydney
Aims: A critique of Australian deceased donor kidney allocation policies is that they do not match donor quality with recipient life expectancy. We aimed to evaluate the extent to which, under existing allocation policies, donor quality is matched with estimated recipient survival. Methods: Data on deceased donor, kidney-only transplants from 2010–2012 were obtained from ANZDATA and ANZOD. Kidney donor risk index (KDRI) and estimated post-transplant survival (EPTS) scores were calculated using the United States Organ Procurement and Transplantation Network formulae. Scores were categorised into percentiles: KDRI relative to the distribution of donor KDRI in the previous year, and EPTS relative to waitlisted patients at the end of the previous year. Results: There was a small correlation between KDRI and EPTS percentile (rho = 0.156, P < 0.001). Of kidneys in the top 20th percentile of KDRI (highest quality), 57% were transplanted into recipients with an EPTS score above the median. Of recipients in the top 20th percentile of EPTS (longest estimated post-transplant survival), 24% received a kidney in the top 20th percentile of KDRI. Of recipients aged <18, 18–64, and 65+ years, the proportions transplanted with a kidney in the top 20th percentile for KDRI were 44%, 16% and 8%, respectively. Conclusions: There was a small correlation between KDRI and EPTS among donor-recipient pairs transplanted in Australia from 2010–2012. Although these results indicate some matching occurs due to clinician acceptance practices, there is clearly substantial scope to increase the degree of matching of kidney quality and estimated recipient survival in Australia.
Figure: Kidney donor risk index percentile versus estimated post-transplant survival percentile for all deceased donor-recipient pairs transplanted in Australia between 2010 and 2012. Percentiles are calculated relative to other donors and recipients transplanted in the previous calendar year. Data source: ANZDATA and ANZOD.
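For readers interested in the mechanics of the percentile convention described in the Methods, the sketch below ranks values against a previous-year reference distribution and computes a Spearman correlation between donor and recipient percentiles. All numbers are invented stand-ins, not ANZDATA/ANZOD values, and ties are handled naively.

```python
from bisect import bisect_right

def percentile_against(reference, value):
    """Percentile (0-100) of `value` relative to a reference distribution."""
    reference = sorted(reference)
    return 100.0 * bisect_right(reference, value) / len(reference)

def spearman_rho(xs, ys):
    """Spearman rank correlation, computed as the Pearson correlation of ranks.
    Ties are broken arbitrarily, which is adequate for this illustration."""
    def ranks(vs):
        order = sorted(range(len(vs)), key=lambda i: vs[i])
        r = [0.0] * len(vs)
        for rank, i in enumerate(order):
            r[i] = float(rank)
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Invented previous-year reference distributions and this-year donor-recipient pairs.
prev_year_kdri = [0.6, 0.8, 0.9, 1.1, 1.3, 1.5, 1.8, 2.1]
prev_year_epts = [5, 12, 20, 35, 50, 60, 75, 90]
pairs = [(0.7, 15), (1.2, 55), (1.9, 80), (1.0, 30)]  # (donor KDRI, recipient EPTS)

kdri_pct = [percentile_against(prev_year_kdri, k) for k, _ in pairs]
epts_pct = [percentile_against(prev_year_epts, e) for _, e in pairs]
rho = spearman_rho(kdri_pct, epts_pct)
```

In the real analysis the reference distributions would be the prior year's donors and the prior year-end waitlist, and rho would be tested against zero.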
ACCESS, COMPETITION, AND THE LIVER WAITING LIST Gilroy Richard1, Voss Jordan2, Kumer Sean3, Goldberg David4, Schmitt Timothy3 1Gastroenterology and Hepatology, University of Kansas Medical Center, Kansas City, KS, USA, 2School of Medicine, University of Kansas Medical Center, Kansas City, KS, USA, 3Department of Surgery, University of Kansas Medical Center, Kansas City, KS, USA, 4Gastroenterology and Hepatology, University of Pennsylvania, Philadelphia, PA, USA
Aims: In contrast to the Australian system, transplant markets in the US compete around economic variables. Here we assess competitive behaviours and their impact on listing practices and MELD at transplant. Methods: 2013 liver waiting list and transplant variables, US Census socioeconomic variables, CDC end-stage liver disease (ESLD) mortality data, and healthcare access data (Dartmouth Atlas) were pooled to the donor service area (DSA) level and analysed for bivariate correlations; multivariate analysis of variance/covariance (MANOVA/MANCOVA) models were used to assess categories of model for end-stage liver disease (MELD) score at time of listing. Results: Waiting list mortality rate (WL-MR) and transplant rate are positively correlated (Pearson's r = 0.732, P < 0.001), while WL-MR and transplant rate are negatively correlated with the percentage of low-MELD (6–10) waitlist additions (r = 0.415, P = 0.002 and r = 0.531, P = 0.001, respectively). Waitlist additions per capita vary nearly 8-fold among DSAs, and DSAs with 2 or more programs list more candidates per capita than those with a single transplant program (P = 0.015). Similarly, DSAs with 1 transplant program list a significantly lower percentage of patients at low MELD score than those with 2 or more programs (P = 0.001). ESLD mortality rate varies >2-fold among DSAs, but was not a significant predictor of MELD score at listing (P = 0.487) or waitlist additions per capita (P = 0.948). Uninsured rate (P < 0.001), hospital beds per capita (P = 0.037), and college education (P = 0.002) are independent predictors of ESLD mortality rate. Conclusions: Competitive market principles influence listing practices and MELD at transplant in the USA. Should competition enter the Australian system, rapid MELD escalation should be anticipated.
OUTCOMES OF ADULT LIVER TRANSPLANTS FROM INTERSTATE DONORS West Claire, Aggarwala Shivani, Dilworth Pamela, Dissanayake Ruwan, Pulitano Carlo, Verran Deborah, Crawford Michael Australian National Liver Transplantation Unit, Royal Prince Alfred Hospital, Sydney
Introduction: Due to the large number of patients on the waiting list, a relatively low deceased donor rate, and a waitlist mortality of around 20%, NSW has been a net importer of donor livers from interstate centres that have declined to use them for various reasons. Aims: We aimed to examine the clinical outcomes of liver transplants from interstate donors where the 'home' state had declined them. Method: From 2004–2013 a total of 665 livers were transplanted, of which 98 donor livers were imported from interstate. After excluding urgent, multi-organ, re-transplant, DCD and paediatric recipients, there were 55 transplants (interstate group) available for comparison with 365 recipients (local group) selected by the same criteria. Results: The interstate group had a longer cold ischaemic time than the local group (hours, 9.12 ± 0.18 vs 7.22 ± 0.18; P < 0.0001). The overall quality of donor grafts in the interstate group was significantly worse than in the local group (DRI, 1.97 ± 0.3 vs 1.57 ± 0.4; P < 0.0001). No significant difference in graft survival was recorded between the two groups (months, 93 ± 6.5 vs 96 ± 2.8; NS). Three interstate recipients (5.5%) were retransplanted compared to 10 (2.7%) in the local group. The reasons for decline by other states were explored. Conclusion: Livers that have been declined by other centres can often be transplanted with comparable graft and patient survival.
ATTITUDES AND BELIEFS ABOUT DECEASED ORGAN DONATION IN ARABIC-SPEAKING COMMUNITIES IN AUSTRALIA: A FOCUS GROUP STUDY
Ralph AF1,2, Alyami A3,4, Allen RDM3,4, Howard K5, Craig JC1,2, Chadban SJ3,4, Irving M1, Tong A1,2 1School of Public Health, University of Sydney, 2Centre for Kidney Research, The Children’s Hospital at Westmead, Sydney, 3Central Clinical School, University of Sydney, 4Transplantation Services, Royal Prince Alfred Hospital, Sydney, 5The Choice Institute, University of South Australia
Background: The demand for transplantable organs outweighs supply across the globe, especially in the provision of deceased donor organs from ethnic minority populations. We aimed to describe attitudes and beliefs about organ donation for transplantation within the Arabic-speaking community. Methods: Arabic-speaking participants aged 19–77 years were purposively recruited in Australia to participate in 6 focus groups (n = 53). Transcripts were analyzed thematically. Results: Six themes were identified: protecting family and community cohesiveness (respecting parental authority, intense emotionality, avoiding taboo, fearing judgment); religious conviction (clarifying ambiguity, adhering to religious requirements); invisibility of organ donation (proximity and direct relevance, lack of conceptual familiarity, apathy for registration); medical suspicion (visceral fear of organ removal, wariness about less effort to save donors, losing body dignity, transferring historical skepticism, questioning differential allocation); owning the decision (saving lives, gaining independence, anticipating family resistance, honoring donor wishes); and reciprocal benefit. Conclusion: Organ donation is considered a generous 'gift' which could save lives. However, members of the Arabic-speaking community are unfamiliar with, unnerved by, and skeptical about the donation process. Making positive decisions about organ donation would require resolving tensions between respecting family, community, and religious values and exercising autonomy. Providing education about the need for and process of organ donation through schools, religious leaders, media, and other social groups within the Arabic community may enhance factual knowledge and clarify ambiguities about the religious stance towards organ donation. Such strategies may reduce taboos and suspicion towards organ donation and thereby increase deceased donation rates within the Arabic-speaking community.
PERSPECTIVES OF DONORS AND RECIPIENTS ON THE IMPACT OF LIVING KIDNEY DONATION ON DONOR-RECIPIENT RELATIONSHIPS: THEMATIC SYNTHESIS OF QUALITATIVE STUDIES Ralph AF1,2, Butow P1,3, Craig JC2,4, Luxton G5, Rhodes P1, Hanson CS2,4, Tong A2,4 1School of Psychology, University of Sydney, 2Centre for Kidney Research, The Children’s Hospital at Westmead, Sydney, 3Other, University of Sydney, 4School of Public Health, University of Sydney, 5Prince of Wales Clinical School, University of New South Wales, Sydney
Background: Living kidney donation requires donors and recipients to renegotiate their relationships. However, tension, neglect, guilt and proprietorial concern over the recipient can lead to relationship deterioration. We aimed to describe donor and recipient expectations and experiences of the impact of living kidney donation on donor-recipient relationships. Methods: Electronic databases were searched to November 2014. Studies were synthesized thematically and reported using the ENTREQ framework. Results: From 37 studies (n = 1107) spanning 12 countries, we identified seven themes. 'Burden of obligation' described the recipient's perpetual sense of duty to demonstrate gratitude to the donor. 'Expressing appreciation' potentially caused discomfort among donors who did not want recipients to feel obliged to be continually grateful. 'Developing a unique connection' reflected the inexplicably special bond that donor-recipient dyads developed post donation. 'Earning acceptance' was the expectation that donation would restore relationships. 'Desiring attention' was expressed by donors who were jealous of the attention the recipient received and who felt they could gain attention by being a donor. 'Restoring relational participation' encompassed relieving both the caregiver and caretaker from the confinement of dialysis, and involved increased participation in family life by the recipient. 'Retaining kidney ownership' reflected the donor's inclination to ensure that their recipient protected 'their' kidney. Conclusion: Living kidney donation can strengthen donor-recipient relationships but can also trigger or exacerbate unresolved angst, tension, jealousy, and resentment. Facilitating access to psychological services, both pre- and post-transplant, that address potential relationship changes may help donors and recipients better adjust to changes in relationship dynamics.
VALIDATION OF THE KIDNEY DONOR RISK INDEX (KDRI) IN THE AUSTRALIAN AND NEW ZEALAND KIDNEY TRANSPLANT POPULATION Clayton Philip1,2, McDonald Stephen1,3, White Sarah4, Chadban Steven4,5 1ANZDATA, 2Department of Nephrology, Prince of Wales Hospital, Sydney, 3Discipline of Medicine, University of Adelaide, 4Sydney Medical School, University of Sydney, 5Transplantation Services, Royal Prince Alfred Hospital, Sydney
Aims: The Australian deceased donor kidney allocation system does not attempt to match kidney quality with recipient prognosis. The kidney donor risk index (KDRI), developed by the United States Scientific Registry of Transplant Recipients (SRTR), was designed to quantify donor kidney quality. We aimed to validate the KDRI in the Australian and New Zealand kidney transplant population. Methods: Using data from the Australia and New Zealand Organ Donor (ANZOD) and Australia and New Zealand Dialysis and Transplant (ANZDATA) Registries, we included all adult deceased donor kidney-only transplants over 2002–2013. The KDRI was calculated using the SRTR formula (including donor age, hypertension, diabetes, terminal creatinine, cause of death, height, weight, donation after circulatory death (DCD) status and hepatitis C status). We constructed three Cox models: (1) KDRI only; (2) model 1 plus transplant characteristics (HLA mismatch, ischaemic time, peak PRA and era); (3) model 2 plus recipient characteristics (age, sex, race, primary kidney disease, graft number, dialysis time, diabetes, coronary artery disease, peripheral vascular disease, cerebrovascular disease, chronic lung disease). The primary outcome was death-censored graft survival. Models were compared using Harrell's C statistic. Results: KDRI was strongly associated with death-censored graft survival (P < 0.0001 in all models; Figure). All models demonstrated moderately good discrimination, with Harrell's C statistics (95% CI) of 0.63 (0.60, 0.65), 0.67 (0.65, 0.70) and 0.69 (0.67, 0.72), respectively. Conclusion: The KDRI performs moderately well in discriminating deceased donor kidney quality, and could feasibly be used in Australia to allocate high-quality kidneys to recipients with favourable prognoses.
Figure: Death-censored graft survival by KDRI quintile (quintiles 1–5), 0–12 years post transplant.
IMPACT OF FAMILY DONATION CONVERSATION TRAINING ON DONATION IN NSW: A CROSS-SECTIONAL STUDY
Cretikos Michelle1, Cavazzoni Elena2, Webster Angela3,4, Wyburn Kate5,6, O'Leary Michael2,5,7 1NSW Ministry of Health, 2NSW Organ and Tissue Donation Service, 3School of Public Health, University of Sydney, 4Centre for Transplant and Renal Research, Westmead Hospital, Sydney, 5Sydney Medical School, University of Sydney, 6Department of Nephrology, Royal Prince Alfred Hospital, Sydney, 7Intensive Care Services, Royal Prince Alfred Hospital, Sydney
Aims: Efforts to increase deceased donor rates have included attempts to identify all potential donors, and to expand donation after circulatory death (DCD). We aimed to audit NSW activity, in particular the impact of requestor training of those initiating donation conversations with family members.
Methods: We analysed cases entered into the NSW DonateLife deaths audit from 20 sites between January 2013 and September 2014. We used descriptive statistics to identify factors associated with donation outcomes. Results: 3937 patients died on or within 24 h of discharge from ICU/ED, or were considered for organ donation. After major exclusions (active cancer, cardiac arrest or not intubated), there were 899 potential deceased donors: 360 potential donation after brain death (DBD), 278 potential DCD, and 262 outside DCD criteria. Family donation conversations occurred in fewer potential DCD donors than potential DBD donors (Table 1). Donation conversation requestor training status was known in all but 2 cases. 49% of conversations were conducted by requestors who had undertaken Family Donation Conversation Workshop training; these conversations resulted in 68% of next-of-kin consenting to donation. In the remaining 51% of conversations, conducted by requestors who had not undertaken training, the consent rate was 44% (P < 0.01, chi-squared test). There were 184 family refusals in total. Table 1: Death audit summary of potential donors, requests and consent in NSW
Potential donors | Number | Request conversations (% of total) | Consent granted (% of conversations)
DBD | 360 | 314 (87) | 163 (52)
DCD | 278 | 97 (35) | 64 (66)
Total | 638 | 411 (64) | 227 (55)
Conclusion: Efforts to increase organ donation in NSW could include better mechanisms for identifying patients suitable for DCD, and the use of requestors trained in the Family Donation Conversation.
PUBLIC ATTITUDES AND BELIEFS ABOUT PAYING LIVING KIDNEY DONORS: A FOCUS GROUP STUDY Tong Allison1,2, Ralph Angelique1,2, Chapman Jeremy3, Wong Germaine1,2, Gill John4, Josephson Michelle5, Craig Jonathan1,2 1School of Public Health, University of Sydney, 2Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 3Centre for Transplant and Renal Research, Westmead Hospital, Sydney, 4Division of Nephrology, University of British Columbia, 5Department of Medicine, The University of Chicago
Aims: The unmet demand for kidney transplantation has generated intense controversy about introducing incentives for living kidney donors to increase donation rates. Such debates may impact public perception and acceptance of living kidney donation. We aimed to describe public opinion on financial reimbursement, compensation and incentives for living kidney donors. Methods: Twelve focus groups were conducted with 113 participants purposively recruited from the general public in three Australian states. Transcripts were analysed thematically. Results: Five themes were identified: creating ethical impasses (commodification of the body, quandary of kidney valuation, pushing moral boundaries); corrupting motivations (exposing the vulnerable, inevitable abuse, supplanting altruism); determining justifiable risk (compromising kidney quality, undue harm, accepting a confined risk, trusting protective mechanisms, right to autonomy); driving access (urgency of organ shortage, minimising disadvantage, guaranteeing cost-efficiency, providing impetus, counteracting black markets); and honouring donor deservingness (fairness and reason, reassurance and rewards, merited recompense). Reimbursement and justifiable recompense are considered by the Australian public as legitimate ways of supporting donors and reducing disadvantage. Financial payment beyond reimbursement is regarded as morally reprehensible, with the potential for exploitative commercialism. Some contend that regulated compensation could be a defensible strategy to increase donation rates, provided that mechanisms are in place to protect donors. Conclusions: The perceived threat to community values of human dignity, goodwill, and fairness suggests that there could be strong public resistance to any form of financial inducement for living kidney donors. Policy priorities addressing the removal of disincentives may be more acceptable to the public.
AUSTRALIA'S ROLE IN SUPPORTING THE INTRODUCTION OF DECEASED ORGAN DONATION AND KIDNEY TRANSPLANTATION IN NEW CALEDONIA Allen Richard1,2, Quirin Nicolas3, Biche Veronique3, Touzain Frederic3, Le Coq Saint Giles Herve3, Sandroussi Charbel1,4, West Claire1, Newman Allyson1, Garry Lorraine1, Sukkar Louisa1, Eris Josette5 1Transplantation Services, Royal Prince Alfred Hospital, Sydney, 2Discipline of Surgery, Sydney Medical School, University of Sydney, 3Centre Hospitalier Territorial, Noumea, New Caledonia, 4School of Medicine, University of Sydney, 5Department of Renal Medicine, Royal Prince Alfred Hospital, Sydney
Access to kidney transplantation (KT) in Pacific island communities is challenging because of limited access to healthcare. However, New Caledonia (NC), a French territory with a population of 260,000, is provided with the same quality and scope of healthcare as mainland France. Living donor (LD) KT for NC has been offered in Sydney since 1979. The aim of NC clinicians, with assistance from the Agence de la Biomedecine (ABM), was to develop a deceased donor program to improve access to KT in NC. Methods: ABM supported the development of local tissue typing and infection screening expertise in NC and dictated standards of organ retrieval and organ allocation. The donor retrieval team and air transport were sourced from Sydney. Two matched recipients, together with the donor retrieval team, accompanied every two deceased donor kidneys back to Sydney. Recipients returned to NC after 8 weeks. Results: Community HLA typing demonstrated restricted patterns that differed from those of NC French expatriates. Quality assurance blood specimens matched results with Australia. Sydney clinicians evaluated 32 dialysis-dependent patients in NC and agreed to wait-list 30. The first DDKT was performed in April 2013, with a total of 16 in the subsequent 21 months (9.2/yr). The mean LD rate in the preceding 24 years was 1.6/yr. Mean cold ischaemia time was 15:05 hours with 88% initial graft function. No grafts have been lost. NC surgeons now perform the retrieval procedure. Conclusion: This unique model of DDKT, driven by NC clinicians and supported logistically by Australia, has markedly improved access to KT for a Pacific island community. It could be a model for others.
MOTIVATIONS, CHALLENGES AND BARRIERS TO SELF-MANAGEMENT IN KIDNEY TRANSPLANT RECIPIENTS: A SYSTEMATIC REVIEW OF QUALITATIVE STUDIES Jamieson Nathan1,2, Hanson Camilla S1,2, Josephson Michelle3, Gordon Elisa J4,5, Craig Jonathon C1,2, Halleck Fabian6, Budde Klemens6, Tong Allison1,2 1Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 2School of Public Health, University of Sydney, 3Department of Medicine, University of Chicago, 4Center for Healthcare Studies, Northwestern University Feinberg School of Medicine, 5Comprehensive Transplant Center, Northwestern University Feinberg School of Medicine, 6Department of Nephrology, Charité Universitätsmedizin Berlin
Context: Kidney transplantation offers superior life expectancy and quality of life compared to other modalities of renal replacement therapy. However, the complex and ongoing medication and self-management regimens impose a treatment burden on patients, and non-adherence remains a leading cause of graft loss. Aims: We aimed to describe motivations, challenges and barriers to self-management in kidney transplant recipients. Methods: MEDLINE, Embase, PsycINFO, and CINAHL were searched from database inception to October 2014. We used thematic synthesis to analyse the findings. Results: Fifty studies involving 1238 participants aged from 18 to 82 years across 19 countries were included. We identified five themes: empowerment through autonomy (achieving mastery, tracking against tangible targets, developing bodily intuition, routinising and problem-solving, adaptive coping), prevailing fear of consequences (inescapable rejection anxiety, aversion to dialysis, minimising future morbidity, trivialisation and denial, defining acceptable risks), burdensome treatment and responsibilities (frustrating ambiguities, inadvertent forgetfulness, intrusive side-effects, reversing ingrained behaviours, financial hardship), over-medicalising life (dominating focus, evading patienthood, succumbing to burnout), and social accountability and motivation (demonstrating gratitude towards the medical team, indebtedness to the donor, peer learning, adaptive coping). Conclusions: Self-efficacy and relational responsibility encourage self-management; however, these tasks can be mentally and physically taxing. Transplant recipients trade off their treatment burden against the risks of side-effects and complications. Multi-component interventions including education, psychosocial support, decision aids and self-monitoring tools may foster self-management capacity and improve transplant outcomes.
CHARACTERISING POTENTIAL, INTENDED, AND ACTUAL ORGAN DONORS IN NSW: A COHORT STUDY 2010–2014 Hirsch Daniel F1, Clayton Philip A1, Wyburn Kate1,2, O'Leary Michael1,3, Webster Angela4,5 1School of Medicine, University of Sydney, 2Renal Unit, Royal Prince Alfred Hospital, Sydney, 3NSW Organ and Tissue Donation Service, 4School of Public Health, University of Sydney, 5Centre for Transplant and Renal Research, Westmead Hospital, Sydney
Aims: The Australia and New Zealand Organ Donor (ANZOD) Registry reports actual and intended donors, but the number of potential donors in Australia has not previously been studied. The NSW Organ and Tissue Donation Service (OTDS) collects information on all potential NSW organ donors. Using these data we explored changes in referral trends over time in NSW, where actual organ donor numbers are consistently below the national average. Methods: We used referral logs from the OTDS to collect information on demographics, co-morbidities and donation outcomes for all potential donors over 2010–2014. Analysis separated referrals by donation outcome. Results: There were 2326 total referrals. The absolute number of referrals increased steadily from a monthly mean of 32 in 2010 to 45 in 2014 (Figure). By comparison, actual donor numbers remained constant, with a monthly mean of 7 in both 2010 and 2014. Therefore, the proportion of total referrals converting to donation is declining. Over 2013–2014, 77% of referrals did not proceed to donation, mainly due to medical unsuitability (72%), which was principally due to a lack of suitable organs (15.6%) and donor age (15.2%). Failed supportive treatment and active cancer accounted for 27% of medical unsuitability. Conclusions: Despite a promising increase in referrals in NSW over 2010–2014, the number of actual donors did not change. This is the first study to characterise potential donors not proceeding to donation in Australia. Further analysis of this population may identify areas for improvement within the donation process.
Figure: Referred and actual solid organ NSW donors, 2010–2014 (monthly counts of referred versus donated persons).
REASONS FOR PANCREAS NON-RETRIEVAL: A POTENTIALLY WASTED RESOURCE? Zhou Kiane1, Lam Vincent2, Mulley William3, Kanellis John3, Hurst Kylie4, Hawthorne Wayne2, Yuen Lawrence2, Ryan Brendan2, Pleass Henry2 1School of Medicine, University of Sydney, 2Department of Surgery, Westmead Hospital, Sydney, 3Department of Nephrology, Monash Medical Centre, Melbourne, 4Australian and New Zealand Organ Donation Registry, Royal Adelaide Hospital
Aims: There is a significant shortfall in the availability of donor pancreases for transplantation in Australia, making it essential that all suitable donor pancreases are considered for solid organ or islet transplantation. Methods: We examined the Australian and New Zealand Organ Donation (ANZOD) registry to determine whether potentially usable donor pancreases were not being retrieved, and the reasons for non-retrieval. Combined kidney and liver donors from 2011 to 2013 were identified in ANZOD, and the reasons for concomitant non-retrieval of the pancreas were extracted and compared across states. Results: 255 (63.5%) of the 401 liver or kidney donors deemed potential candidates for pancreas donation (≤45 years of age) did not undergo pancreas retrieval. Aside from medically unsuitable candidates, the major factors influencing non-retrieval were surgical reasons (11.3%) [e.g., split liver, prioritisation of liver transplant, inability to retrieve the pancreas, surgically unsuitable patients] or logistical issues (25.1%) [e.g., no suitable recipient, time constraints, surgeon or transplant team unavailable, exceeded retrieval quota, multiple donors on the same day, no time for crossmatching]. Conclusions: Many of the reasons for pancreas non-retrieval in this cohort appear to be potentially reversible. By addressing both surgical and logistical obstacles it may be possible to increase donor pancreas numbers by over 36%. This study indicates the need to collect more comprehensive data on the reasons for unrealised pancreas donations, to increase donor rates and improve opportunities for treating patients with Type 1 diabetes.
THE IMPACT OF SOCIOECONOMIC STATUS AND GEOGRAPHIC REMOTENESS ON PRE-EMPTIVE TRANSPLANT ACCESSIBILITY AND TRANSPLANT OUTCOMES IN CHILDREN WITH END STAGE KIDNEY DISEASE Francis A, Didsbury M, Grace B, Kim S, Lim WH, McDonald S, Craig JC, Wong G Centre for Kidney Research, The Children's Hospital at Westmead, Sydney
Background and aims: Lower socioeconomic status (SES) is associated with poorer access to pre-emptive transplantation and inferior transplant outcomes in adults with end-stage kidney disease (ESKD), but the impact of these factors on children with ESKD remains unknown. We aimed to determine whether accessibility to pre-emptive transplantation and transplant outcomes differ in children according to SES and geographic remoteness.
Methods: Using data from the Australian and New Zealand Dialysis and Transplantation (ANZDATA) Registry (1993–2013), we compared access to pre-emptive transplantation and the risk of allograft failure across SES quintiles and between levels of geographic remoteness amongst children (aged ≤18 years) using logistic regression and Cox proportional hazards models. Results: A total of 768 children received renal replacement therapy between 1993 and 2013 in Australia and New Zealand, of whom 633 (82.4%) received their first transplant. Children with ESKD residing in regional Australia were less likely to receive pre-emptive kidney transplants than children living in urban centres [adjusted odds ratio (OR): 0.69 (95% CI: 0.46–1.0)]. There were no significant differences in the likelihood of pre-emptive kidney transplantation between children of the highest SES quintile compared to the lowest [adjusted OR: 0.79 (95% CI: 0.35–1.41)]. Similarly, no significant associations between SES [adjusted OR: 1.17 (95% CI: 0.74–1.84)] or geographic remoteness [adjusted OR: 1.12 (95% CI: 0.48–2.6)] and graft loss were observed. Conclusions: In this nationally representative sample of paediatric transplant recipients, geographic remoteness, but not low SES, was associated with lower odds of pre-emptive transplantation.
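The odds ratios above are adjusted estimates from multivariable logistic regression. As a simplified illustration of the underlying quantity, the sketch below computes an unadjusted odds ratio with a Wald 95% confidence interval from a 2 × 2 table; the counts are invented for illustration, not ANZDATA figures.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Invented counts: regional vs urban children, pre-emptive transplant yes/no.
or_, lo, hi = odds_ratio_ci(18, 82, 115, 385)
```

A confidence interval spanning 1 (as this invented example produces) would indicate no significant association; the adjusted models in the abstract additionally control for confounders, which a 2 × 2 table cannot.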
HYALURONAN – THE FIRST NEW BIOMARKER OF DONOR ORGAN QUALITY SINCE PO2? Sladden Timothy1,2, Samson Luke1,2, Hopkins Peter1,2, Yerkovich Stephanie1,2, Chambers Daniel1,2 1Lung Transplant Service, Prince Charles Hospital, Brisbane, 2School of Medicine, University of Queensland, Brisbane
Aims: Acute lung injury (ALI) is common in multi-organ donors and places the transplant recipient at risk of primary graft dysfunction and death. Shedding of the endothelial glycocalyx (EG), a proteoglycan-rich layer, is fundamental to ALI pathogenesis. We hypothesized that EG shedding may underlie ALI in organ donors and predict the ability to transplant. Aim: To measure shed EG products (hyaluronan and syndecan) in peripheral blood of organ donors and relate these to donor data. Methods: Hyaluronan and syndecan were measured (ELISA) in plasma from consecutive (2009–2014) Queensland multi-organ donors. Covariates were analysed by logistic regression. Results: 178 multi-organ donors were studied (female 50.3%; aged 48 (IQR 31–57); >20 pack-year smoking 33.3%; diabetic 7.3%; trauma 28.8%). Lungs were deemed potentially transplantable in 98 (55%), with 67 (37%) actually proceeding to transplant. Both hyaluronan (79.5 vs 51.1 ng/ml, P = 0.01) and syndecan (5494.1 vs 3654.4 pg/ml, P = 0.015) were higher in donors with trauma and metabolic acidosis (r for HCO3: 0.36, P < 0.001 and 0.26, P < 0.01 for hyaluronan and syndecan respectively). Of all donor factors, only PaO2 > 300 mmHg (OR 3.76, P = 0.001) and lower hyaluronan predicted the ability to transplant, with every 10 ng/ml decrement in donor peripheral blood hyaluronan increasing the lung utilization rate by 4.8% (0.3–6.5%, P = 0.019). Conclusion: Low donor hyaluronan levels at organ offer increase the chance that lungs will prove acceptable. Current studies are assessing the utility of hyaluronan as a biomarker of lung and other organ suitability, and whether redressing EG shedding could improve organ quality, utilisation and transplant outcomes.
Abstracts A7
PANCREAS DONOR CHARACTERISTICS IN AUSTRALIA AND NEW ZEALAND: A COHORT STUDY 1984–2014 Peng Xi (Alex)1, Kelly Patrick1, Webster Angela1,2, On Behalf Of Contributors3 1School of Public Health, University of Sydney, 2Centre for Transplant and Renal Research, Westmead Hospital, Sydney, 3Australia and New Zealand Islet and Pancreas Registry
Aims: We aimed to describe the characteristics of pancreas donors over time in Australia and New Zealand. Methods: We used data from the Australia & New Zealand Islet and Pancreas Transplant Registry (ANZIPTR), 1984–2014. We investigated the donor characteristics sex, age, BMI, hypertension and cause of death, and any change over time. Categorical and continuous characteristics were summarized as proportions and means respectively, by year group. Pearson chi-square tests (or Fisher's exact test where required) or ANOVA were used to test differences across year groups. P<0.05 was considered statistically significant. Results: There were 627 pancreas transplantations reported from 1984–2014. Male donors predominated throughout, although the proportion of male donors decreased slightly, from 64% in 1984–1994 to 55% in 2010–2014 (P=0.16, not statistically significant). Donor BMI and age both increased from 1984–94 to 2010–14 (P=0.01 for each). The proportion of donors with hypertension decreased over time (P<0.01). Cause of donor death has also changed (P<0.01), with an increase in cerebral hypoxia/ischaemia and a reduction in intracranial haemorrhage; however, traumatic brain injury remains the most common donor cause of death (50–70% of deaths). Conclusions: Donor characteristics have changed over time. At donation, donors are now older and heavier, but less often hypertensive, and cause of death has changed.
SUSPENDED IN A PARADOX – PATIENT ATTITUDES TO WAIT-LISTING FOR KIDNEY TRANSPLANTATION: SYSTEMATIC REVIEW OF QUALITATIVE STUDIES Hanson Camilla1,2, Chapman Jeremy3, Halleck Fabian4, Josephson Michelle5, Craig Jonathan1,2, Tong Allison1,2 1Centre
for Kidney Research, The Children's Hospital at Westmead, Sydney, 2School of Public Health, University of Sydney, 3Centre for Transplant and Renal Research, Westmead Hospital, Sydney, 4Department of Nephrology, Charité, Universitätsmedizin Berlin, 5Department of Medicine, The University of Chicago Aims: Patients on waiting lists for kidney transplantation have higher mortality rates and hold specific anxieties about eligibility, and about the process and outcomes of wait-listing. We aimed to describe patient experiences of and attitudes to wait-listing for kidney transplantation. Methods: Electronic databases including MEDLINE, Embase, PsycINFO and CINAHL were searched to September 2014. Thematic synthesis was used to analyse the findings. Results: Twenty-two studies conducted in eight countries, involving 795 patients with any stage of CKD, were included. We identified six themes (see Figure 1): accepting the only option (chance to regain normality, avoiding guilt, impulsive decision-making); maintaining hope (determined optimism, appreciating a fortuitous gift, enduring for optimal outcomes, trust in clinical judgment); burden of testing (strenuous commitment, losing the battle, medical mistrust); permeating vulnerability (eligibility enigma, being threatened, angst of timing uncertainty, desperate urgency, living in limbo, spiralling doubt and disappointment, residual ambivalence); deprived of opportunity (unfairly dismissed, unexpected disqualification, self-resignation and acceptance, jealousy, suspicious of inequity); and moral guilt (awaiting someone's death, questioning deservingness).
Table 1 Summary of donor characteristics over time from 1984 to 2014

Donor characteristic            1984–94   1995–99   2000–04   2005–09   2010–14     P*
Male (%)                          64.2      64.2      66.9      56.4      54.6    0.16
Age (years, mean)                 26.6      23.9      28.6      28.6      28.4    0.01
BMI (kg/m2, mean)                 22.8      21.9      24.7      24.5      24.1    0.01
Hypertension (%)                  15.5      19.4       2.6       1.7       0.7   <0.01
Cause of death (%)                                                               <0.01
  Cerebral hypoxia                 1.9       3.0      11.3       8.7      16.2
  Cerebral infarct                 0.0       3.0       2.7       3.7       3.7
  Intracranial haemorrhage        38.9      26.9      20.0      13.7      12.5
  Traumatic brain injury          53.7      61.2      64.0      69.6      60.3
  Non-neurological                 3.7       4.5       1.3       2.5       4.4
  Other neurological condition     1.9       1.5       0.7       1.9       2.9
Total donors                        73        80       154       178       142

*P-value for testing change over time.
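The P* values in Table 1 come from Pearson chi-square tests of change across year groups. Purely as an illustrative sketch, the block below computes the chi-square statistic for such a contingency table in plain Python; the male/female counts are reconstructed approximately from the table's percentages and totals, so both the numbers and the helper name are assumptions for illustration, not the registry's analysis.

```python
def chi_square_stat(table):
    """Pearson chi-square statistic for a contingency table.

    table: list of rows, each a list of observed counts.
    """
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]  # column totals via transpose
    n = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / n  # expected count under independence
            stat += (obs - exp) ** 2 / exp
    return stat

# Male donor counts per year group, reconstructed approximately from the
# percentages and totals in Table 1 (illustrative only).
males = [47, 51, 103, 100, 78]
totals = [73, 80, 154, 178, 142]
table = [males, [t - m for t, m in zip(totals, males)]]
stat = chi_square_stat(table)  # compare with the chi-square distribution, df = 4
```

The resulting statistic would then be referred to the chi-square distribution with (rows − 1) × (columns − 1) degrees of freedom to obtain the P-value.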
Immunology and Cell Biology
Figure 1 Thematic schema of patients' attitudes and experiences of wait-listing for kidney transplantation. [The schema groups the subthemes listed in the Results under six themes: accepting the only option; maintaining hope; burden of testing; permeating vulnerability; deprived of opportunity; and moral guilt, linked by disillusionment and unmet expectations.]
Conclusions: The waiting list offered hope of restored normality. However, the demands of work-up, uncertainty about eligibility, and waiting times that exceeded expectations left patients disillusioned, despairing and suspicious of inequity. Managing patient expectations and ensuring transparency of wait-listing and allocation decisions may allay patient disappointment and scepticism, and thereby improve patient satisfaction and treatment outcomes.
Immunosuppression and trials CD83 EXPRESSION ON HUMAN IMMUNE CELLS AS A TARGET FOR IMMUNOSUPPRESSION Hart DNJ1,2, Elgundi Z1,2, Ju Xinsheng1, Verma ND1,2, Silveira PA1,2, Fromm PD1,2, Alingcastre R1, Munster DJ3,4, Seldon TA3,4, Sheng Y3,4, Jones ML5, Munro TP5, Mahler S5, Barnard RT6, Vu PA1, Lo K1, Shahin K1,7, Larsen S8, Bradstock K1,9, Clark GJ1,2 1Dendritic
Cell Biology and Therapeutics Group, ANZAC Research Institute, 2Sydney Medical School, University of Sydney, 3Mater Institute, Brisbane, 4Cooperative Research Centre for Biomarker Translation, Melbourne, 5Australian Institute for Bioengineering and Nanotechnology, University of Queensland, Brisbane, 6School of Chemistry and Molecular Biosciences, University of Queensland, Brisbane, 7Flow Cytometry Unit, Institute of Clinical Pathology and Medical Research, Westmead Hospital, Sydney, 8Institute of Haematology, Royal Prince Alfred Hospital, Sydney, 9Blood and Marrow Transplant Service, Westmead Hospital, Sydney Aim: We defined CD83 as a therapeutic target on activated dendritic cells (DC) and showed that monoclonal antibodies (mAbs) to CD83 are effective immunosuppressive agents. We therefore analysed CD83 expression on human immune cells and examined the effect of anti-human CD83 mAb in human peripheral blood mononuclear cell (PBMC) transplanted SCID mice. Methods: We generated mouse and human mAbs to CD83, including the potential therapeutic, 3C12C. Flow cytometry analysis of resting
and activated human PBMC was performed, CD83 transcripts for isoforms tested and the CD83 glycosylation pattern examined. The depletion of DC and/or T cells was examined in allogeneic (allo) or xenogeneic (xeno) mixed lymphocyte cultures (MLC) and human PBMC SCID mouse transplants. Results: Distinct CD83 glycosylation patterns and distinct CD83 splice variants were present. CD83 expression was rapidly induced and remained elevated on activated DC. Brief, very low-level CD83 expression was seen on CD4 and CD8 T cells in the allo MLC and on PHA or CD3/CD28 stimulation of purified T cells. Treatment with 3C12C depleted CD83+ DC and inhibited allo and xeno MLC responses, but did not affect Treg suppressor function or virus-specific CD8+ T cells. Administration of 3C12C depleted activated DC and reduced early and late T-cell activation but maintained Tregs in the PBMC-transplanted SCID mice. Conclusion: These findings suggest that CD83 expression following transplantation is a valid target for immunosuppression. The preclinical data suggest that 3C12C depletes activated DC, prevents allo T-cell activation and preserves Treg function and T-cell viral immunity.
DECLINE IN ESTIMATED GLOMERULAR FILTRATION RATE AND SUBSEQUENT RISK OF GRAFT FAILURE AND MORTALITY AMONG KIDNEY TRANSPLANT RECIPIENTS Clayton Philip1,2, Lim Wai1,3, Chadban Steven1,4,5 1ANZDATA, 2Department of Nephrology, Prince of Wales Hospital, Sydney, 3Department of Renal Medicine, Sir Charles Gairdner Hospital, Perth, 4Transplantation Services, Royal Prince Alfred Hospital, Sydney, 5Sydney Medical School, University of Sydney
Aims: Graft loss and death occur progressively after kidney transplantation, but trials designed to assess the impact of interventions on these outcomes are currently unfeasible. We aimed to determine the relationship between percentage decline in estimated glomerular filtration rate (eGFR) and subsequent graft loss or death after kidney transplantation, and thereby examine the utility of eGFR decline as a surrogate end-point. Methods: We obtained de-identified data from ANZDATA and studied 7,949 transplants performed 1995–2009, with 71,845 patient-years of follow-up, 1,121 graft losses and 1,192 deaths. We used Cox proportional hazards models to examine the relationship between
eGFR decline and outcomes, adjusted for potential confounders and baseline eGFR. Percentage change in eGFR was modelled as a restricted cubic spline. Outcomes were all-cause graft loss, death-censored graft failure and patient death. Results: A 30% or greater decline in eGFR between years 1 and 3 post-transplant was seen in 10% of patients. Compared to those with stable eGFR, a 30% decline in eGFR was strongly associated with all-cause graft loss (hazard ratio (HR) 2.35, 95% CI 2.05–2.69) (figure), death-censored graft failure (HR 3.17, 95% CI 2.63–3.83) and death (HR 1.7, 95% CI 1.45–2.06). Greater rates of decline in eGFR were associated with progressively higher HRs for all outcomes. Conclusion: A 30% decline in eGFR between years 1 and 3 following kidney transplantation is common and is strongly associated with risks of graft loss and death. These results closely mirror recent findings in CKD. Percentage decline in eGFR should be considered for use as a surrogate outcome in kidney transplant trials.
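The candidate surrogate endpoint above is a simple percentage change between two time points. As a minimal sketch (the function names are illustrative, not from the study's code), the calculation is:

```python
def pct_egfr_change(egfr_year1, egfr_year3):
    """Percentage change in eGFR between years 1 and 3 post-transplant.

    Negative values indicate a decline.
    """
    return (egfr_year3 - egfr_year1) / egfr_year1 * 100.0

def reached_30pct_decline(egfr_year1, egfr_year3):
    """Endpoint described in the abstract: a 30% or greater decline."""
    return pct_egfr_change(egfr_year1, egfr_year3) <= -30.0

# e.g. a fall from 60 to 40 ml/min/1.73 m2 is a decline of one third,
# so it meets the 30%-decline endpoint.
```

In the study itself, the continuous percentage change (not just the 30% threshold) was entered into the Cox model as a restricted cubic spline.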
OBLIGATORY TACROLIMUS FORMULATION SUBSTITUTION IN HEART AND LUNG TRANSPLANT RECIPIENTS: A NATIONAL BIOEQUIVALENCE AUDIT Fitzsimons Sarah1, Gibbs Helen1, Wasywich Cara2, McWilliams Tanya3, Ruygrok Peter1 1Department of Cardiology, Auckland City Hospital, 2Department of Cardiology, 3Lung Transplant Service, Auckland City Hospital
Aims: In New Zealand (NZ) the national pharmaceutical procurement agency (PHARMAC) mandated a tacrolimus formulation substitution from tacrolimus Prograf (Janssen-Cilag) to a generic tacrolimus (Sandoz, Novartis). We sought to assess whether substitution achieved bioequivalence in our transplant recipients. Methods: All recipients of a heart or lung transplant in NZ taking tacrolimus switched from tacrolimus Prograf to tacrolimus Sandoz between May–October 2014. Trough levels were taken on days one and ten following substitution. Results: 112 patients were on tacrolimus Prograf prior to 01/10/2014: median age 49 years (range 8–73 years), 51% female, median time post-transplant 5 years (range 0.4–25 years), 19% diabetic, and median creatinine at the time of switch 101 μmol/L (range 27–265 μmol/L). 18/59 (30%) lung transplant recipients had cystic fibrosis. Twenty patients were excluded. Assuming a normal distribution, the mean change in trough level post conversion for the remaining 91 patients was −0.47 μg/L (SD 2.57, P=0.08). Both assays were performed at the same laboratory for 57 patients, with a mean change in trough level of −0.38 μg/L (SD 2.57, P=0.26; mean pre-switch trough 8.37 μg/L (SD 3.2), mean post 8.16 μg/L (SD 2.99)). 33 patients had levels measured at different laboratories where comparable assays are used (Roche/Abbott, correlation r=0.992), with a mean change in trough level of −0.63 μg/L (SD 2.6, P=0.17). Four patients had dose adjustments following conversion. All patients tolerated the switch, with no significant side effects reported. Conclusion: Tacrolimus Sandoz was bioequivalent to tacrolimus Prograf in our cohort of heart and lung transplant recipients. The switch between preparations was well tolerated with no significant complications.
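The bioequivalence comparison above rests on the paired change in trough level between formulations. A minimal sketch of that summary, using hypothetical trough values rather than the audit's data (the function name is also illustrative), might look like:

```python
import statistics

def trough_change_summary(pre, post):
    """Mean and SD of paired changes in trough level (post - pre),
    plus a one-sample t statistic against zero change."""
    diffs = [b - a for a, b in zip(pre, post)]
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)          # sample SD of the differences
    t = mean_d / (sd_d / len(diffs) ** 0.5)  # paired t statistic
    return mean_d, sd_d, t

# Hypothetical tacrolimus trough levels (μg/L) before and after substitution.
pre = [8.1, 9.0, 7.5, 8.8, 8.4]
post = [7.9, 8.6, 7.8, 8.1, 8.0]
mean_d, sd_d, t = trough_change_summary(pre, post)
```

The P-values reported in the abstract would come from referring such a t statistic to the t distribution with n − 1 degrees of freedom.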
EVEROLIMUS PLUS REDUCED-EXPOSURE CYCLOSPORIN VERSUS MYCOPHENOLIC ACID PLUS CYCLOSPORIN: SEVEN YEAR FOLLOW-UP OF ANZ PATIENTS FROM A RANDOMISED CONTROLLED TRIAL Chadban Steve1, Pilmore Helen2, Russ Graeme3, Kanellis John4, Campbell Scott5, O'Connell Philip6, Lim Wai7, Lutherborrow Mark8, Kurstjens Nicol8, Walker Rowan9 1Department of Renal Medicine, Royal Prince Alfred Hospital, Sydney, 2Auckland Renal Transplant Group, Auckland City Hospital, 3Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital, 4Department of Renal Medicine, Monash Medical Centre, Melbourne, 5Department of Medicine, Princess Alexandra Hospital, Brisbane, 6Department of Renal Medicine, Westmead Hospital, Sydney, 7WA Liver & Kidney Transplant Service, Sir Charles Gairdner Hospital, Perth, 8Novartis Pharmaceuticals Australia, 9Department of Renal Medicine, Alfred Hospital, Melbourne
Study Purpose: The 2-year, phase III A2309 trial randomized 833 kidney transplant recipients to receive 1.5 mg everolimus plus low-exposure cyclosporine (EVR1.5), 3.0 mg everolimus plus low-exposure cyclosporine (EVR3.0) or 1.44 g mycophenolic acid plus standard cyclosporine (control), all with basiliximab induction and maintenance steroids. Description of Methods: Follow-up of up to 7 years will be performed by linkage to the ANZDATA Registry, assessing patient survival, graft survival, eGFR and incidence of cancer by ITT analysis. Changes to immunosuppression subsequent to study completion will be examined and sensitivity analyses undertaken. Summary of Results: At two years' follow-up there was no difference between arms in the primary endpoint (composite of tBPAR, graft loss, loss to follow-up or death). Australia and New Zealand contributed 95 patients (11%). Analyses restricted to this subgroup revealed superior MDRD eGFR for EVR1.5 versus control (60.76 vs 47.44 ml/min/1.73 m2; difference 13.3, 95% CI 4.1–22.5). Safety and adverse events were similar, with comparable numbers of patients discontinuing in all arms of the study. Total tumour burden was lower for EVR3.0 versus control (3 vs 17 tumours respectively). Conclusions: Medium-term follow-up of the ANZ cohort of participants in this multinational RCT via registry linkage may provide useful data on clinically relevant (eGFR, cancer incidence, retention on therapy) and hard (patient and graft survival) endpoints, with potential to enhance power to differentiate therapies.
CASTANOSPERMINE, A NOVEL IMMUNOSUPPRESSANT, INHIBITS HEPARANASE WITHIN INTRAGRAFT LYMPHOID CELLS WHILE CONSERVING HEPARAN SULPHATE WITHIN RAT RENAL TRANSPLANTS Hibberd Adrian1,2, Clark David1,2, Cong Ma2, Trevillian Paul1,2 1Newcastle Transplant Unit, John Hunter Hospital, Newcastle, 2Hunter Transplant Research Foundation, Hunter Medical Research Institute
Aim: Castanospermine, an oligosaccharide-processing inhibitor, is synergistic with Cyclosporin A (CsA) in prolonging allograft survival, but only part of its mechanism of action is understood. We aimed to find an explanation for the perivenular clustering of intragraft lymphoid cells observed on histopathology of treated allografts. Methods: Rat renal transplantation; immunochemistry of sections; flow cytometry (BD FACSCanto II) using a MoAb to heparanase (mean fluorescence intensity ratio, MFIR); ELISA assay for heparan sulphate proteoglycan content of renal grafts (pg/ml) using a MoAb to heparan sulphate. Grafts for analysis from the CAST-treated group and the untreated group were procured at days 2, 4 and 6 after transplantation. Comparison between groups was made using the unpaired Student's t-test. Results: The MFIRs for heparanase within intragraft lymphoid cells were: CAST-treated group mean 2.77, SD 1.32, n=3 versus untreated group mean 6.2, SD 1.31, n=5; P=0.020. The heparan sulphate contents of the renal allografts were: CAST-treated group mean 5.44, SD 0.85, n=3 versus untreated group mean 2.87, SD 1.13, n=9; P=0.021. Histopathology of sections from the CAST-treated group confirmed the predominant clustering of lymphoid cells about venules when compared with the untreated group. Conclusions: Castanospermine appears to inhibit intragraft migration of alloreactive lymphoid cells by paralyzing intracellular heparanase, rendering the cells less able to digest graft extracellular matrix, which is mainly composed of heparan sulphate. This action may explain its synergism with CsA.
Surgical techniques MODIFIED LICH-GREGOIR TECHNIQUE PREVENTS UROLOGICAL COMPLICATIONS POST KIDNEY TRANSPLANT Ng Zi Qin, He Bulang WA Liver & Kidney Transplant Service, Sir Charles Gairdner Hospital, Perth Aim: This study aims to examine urological complications when using the modified Lich-Gregoir technique for ureteroneocystostomy after kidney transplant. Method: From 26th January 2010 to 30th June 2014, 209 kidney transplants were performed in our institute. Mean age was 49.94 years (range 3–81 years). Seventy were from live donors, while 139 were from deceased donors. Four patients received their third and one received their fourth kidney transplant. Nine patients underwent dual-kidney transplant. All transplants except one were performed by conventional
open surgery. Ureter-bladder anastomosis was conducted using the modified Lich-Gregoir technique, with an additional stitch placed from the proximal part of the bladder muscular incision to the ureter on each side. Urological complications were defined as ureteral stricture or urine leakage. The patients were followed up from 3 to 54 months. Ultrasound (US) was performed on day 1 post-operation and then repeated whenever kidney graft function deteriorated. Results: There was no urine leakage observed in this cohort. One case of dual-kidney transplant developed ureteral stricture secondary to a lymphocele. This was managed by percutaneous nephrostomy, antegrade balloon dilatation and reinsertion of the ureteric stents. Twelve patients had mild to moderate hydronephrosis identified on US: four were due to lymphoceles and three were secondary to urinary stones. Five patients did not have any further intervention owing to satisfactory kidney graft function. Conclusion: With the modified Lich-Gregoir technique, urological complications can be prevented post kidney transplant. The one ureteric stricture that developed is thought to be due to a lymphocele rather than the technique of ureteroneocystostomy.
LAPAROSCOPIC KIDNEY TRANSPLANT BY EXTRA PERITONEAL APPROACH: ONE YEAR FOLLOW UP He Bulang1,2, Mou Lingjun1, Swaminathan Suda3, Hamdorf Jeffrey2, Delriviere Luc1,4 1WA
Liver & Kidney Transplant Service, Sir Charles Gairdner Hospital, Perth, 2School of Surgery, The University of Western Australia, 3Renal & Transplantation Unit, Fiona Stanley Hospital, 4School of Surgery Aims: Laparoscopic surgery has been widely adopted in clinical practice due to its multiple benefits. The aim of this paper is to describe a surgical technique innovation, laparoscopic kidney transplantation by an extra-peritoneal approach, and its outcomes after one year of follow-up. Methods: This study was established to explore the safety of laparoscopic kidney transplant, progressing from animal model to human clinical use. The research was first conducted on animal cadavers, then live animals and human cadavers, before application to human kidney transplant. The study patient was a 49-year-old male who received a kidney transplanted by the laparoscopic technique via the extra-peritoneal approach. The control patient received the contralateral kidney, transplanted by open surgery. Results: Both patients were hemodynamically stable during surgery with minimal blood loss. The study patient was admitted to ICU for a few hours of hemofiltration due to hyperkalemia. Both kidneys experienced delayed graft function but started functioning on day 6 post-transplant. Analgesia consumption was significantly less in the study patient. Over one year of follow-up, kidney graft function is comparable and there have been no surgical complications. The surgical scar is much less visible after laparoscopic transplant. Conclusions: This study has established a novel laparoscopic technique for kidney transplant by an extra-peritoneal approach. The approach retains the advantages of open kidney transplant, with the kidney graft located in the iliac fossa in the extra-peritoneal space. Kidney graft function is comparable to open kidney transplant over a year of follow-up.
SUPPLEMENTED CELSIOR SOLUTION PROVIDES SUPERIOR PROTECTION OF RAT HEARTS DURING EXTENDED COLD STORAGE COMPARED WITH AL SOLUTION Chew Hong Chee1,2,3, Gao Ling1, Doyle Aoife1, Hicks Mark1, Jabbour Andrew1,4, Macdonald Peter1,4 1Transplantation
Laboratory, Victor Chang Cardiac Research Institute, Sydney, 2School of Medicine, University of New South Wales, Sydney, 3Department of Surgery, St Vincent's Hospital, Sydney, 4Department of Cardiology, St Vincent's Hospital, Sydney Objective: Cold static storage remains the standard method for preserving donor hearts. The aim of this study was to compare heart preservation obtained with supplemented Celsior (C) solution to that obtained with Adenosine Lignocaine (AL) solution. Methods: Wistar rats (350–450 g) were used to compare the following four preservation solutions (n=6 per group): AL, C, supplemented AL (sAL) and supplemented C (sC). The supplementation used was erythropoietin (5000 IU/L), glyceryl trinitrate (100 mg/L) and zoniporide (1 mM). Isolated hearts were perfused in working mode and baseline hemodynamic measurements obtained: aortic flow (AF), coronary flow (CF), cardiac output (CO) and heart rate (HR). Hearts were then flushed with cold preservation solution and stored for 6 hours at 4 °C. Hearts were reperfused and AF, CF, CO and HR recovery expressed as a percentage of baseline. Results: Hearts stored in sC recovered 41±30% AF, 73±19% CF and 83±28% HR at 30 min reperfusion: significantly higher than all other groups: C (AF 17±28%; CF 32±38%; HR 30±48%); AL (AF 2±4%; CF 44±26%; HR 17±43%) and sAL (AF 4±10%; CF 28±24%; HR 38±42%). An additional group of hearts was stored in AL and reperfused in warm AL: recovery in this group was also poor (AF 1±1%; CF 17±15%; HR 14±31%). Conclusion: sC provided superior heart preservation during prolonged cold storage compared with C, AL and sAL solutions.
[Figure: Recovery of baseline measurements at reperfusion in all groups (n=30): percentage recovery from baseline of AF, CF, CO and HR for the AL, C, sAL, sC and AL + AL groups.]

DUAL RENAL TRANSPLANT TECHNIQUES: A SYSTEMATIC REVIEW Cocco Annelise1, Shahrestani Sara2, Cocco Nicholas3, Yuen Lawrence1, Ryan Brendan1, Hawthorne Wayne1, Lam Vincent1,2, Pleass Henry1,2 1Department of Surgery, Westmead Hospital, Sydney, 2School of Medicine, University of Sydney, 3Department of Surgery, Royal Prince Alfred Hospital, Sydney

Aims: Dual renal transplant is a practice which is increasing in frequency as the use of kidneys from extended criteria donors widens. The aim of this systematic review was to describe the techniques of dual renal transplant and to ascertain whether one technique was superior to the others. Methods: A review of the Cochrane, Medline, PubMed and Embase databases identified studies describing the technique employed for dual renal transplant. The primary endpoint was patient mortality; the secondary endpoints were graft survival and operating time. Results: Three techniques of dual renal transplant are described: bilateral placement; unilateral placement of two kidneys using separate anastomoses; and unilateral placement en bloc. Ten studies of bilateral placement, four of unilateral placement with separate anastomoses, and two of unilateral placement en bloc were identified. In the bilateral group, patient survival at one year varied from 70% to 100%, graft survival was between 89% and 100%, and mean operating time was between 275 and 371 min. In the separate anastomoses studies, patient survival at one year was 100%, graft survival was between 66% and 94%, and mean surgical time was 192 to 260 min. In the en bloc studies, patient survival at one year was 100%, graft survival was 95%, and mean surgical time was 160 min. Conclusions: All three techniques appear to be acceptable. A multicentre prospective study with a larger patient population and longer follow-up would give more confidence in these outcomes.

ANTE-MORTEM HEPARIN IMPROVES OUTCOMES IN DONATION AFTER CARDIAC DEATH (DCD) PANCREAS TRANSPLANTATION: A SYSTEMATIC REVIEW AND META-ANALYSIS Shahrestani Sara1, Lam Vincent2, Yuen Lawrence2, Ryan Brendan2, Pleass Henry2, Hawthorne Wayne3 1School of Medicine, University of Sydney, 2Department of Surgery, Westmead Hospital, Sydney, 3Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney

Aim: To identify practices that may influence the outcome of pancreas transplantation from DCD donors, with particular emphasis on ante-mortem procedures, to support increasing use of this valuable resource. Methods: A systematic review and three separate meta-analyses were performed comparing patient survival, allograft survival and thrombosis of pancreas transplants from DCD and donation after brain death (DBD) donors. The role of other factors such as immunosuppression, surgical techniques, donor selection and retrieval processes in transplant outcomes was considered via a systematic review of the literature. Results: Twenty-three articles were identified describing the outcomes of pancreas transplantation from DCD donors, comprising 736 DCD pancreas transplant recipients across 8 case reports, 5 retrospective cohort and 10 prospective cohort studies. Eight of the 23 studies provided a DBD comparison group comprising 27,344 transplant recipients. Importantly, there was no significant difference in allograft survival (HR=0.99, P=0.91) or patient survival (HR=0.82, P=0.49) between DCD and DBD pancreas transplants. Qualitative synthesis of the evidence showed that ante-mortem femoral cannulation reduces warm ischaemic time from above 25 minutes to 20 minutes or less. The odds of thrombosis in DCD transplants were double those of DBD transplants (OR=1.97, P=0.002); however, when the donor was given ante-mortem heparin this difference was non-significant. Conclusions: DCD pancreas transplantation is a viable alternative to DBD transplantation, with ante-mortem interventions including heparinisation being an effective way of reducing transplant complications such as thrombosis. This potential benefit of DCD pancreas transplants warrants further study, with perhaps some liberalisation of donor pancreas criteria along with careful prospective audit.
EVOLUTION IN MANAGEMENT OF LYMPHOCELES POST KIDNEY TRANSPLANT Damodaran Prabha Ramesh, He Bulang WA Liver & Kidney Transplant Service, Sir Charles Gairdner Hospital, Perth Purpose: This study reviews the evolution of techniques for management of lymphoceles post kidney transplant in our institute. Material & Methods: From March 2006 to December 2014, seventeen patients developed symptomatic lymphocele post kidney transplant (average age 58.2 years). Fifteen patients were single kidney transplants, while two patients were dual kidney transplants. The incidence of lymphocele was about 4.3% in this cohort. The essential management is percutaneous drainage by an indwelling catheter, followed by laparoscopic fenestration, with open surgery reserved for recurrence. Technique evolution included: instillation of methylene blue into the lymphocele for identification during the procedure; intraoperative ultrasound; and placement and fixation of a drainage tube into the sac for a prolonged period, allowing formation of an internal drainage channel. Results: Eleven cases resolved successfully after initial management. Three patients experienced recurrence after laparoscopic fenestration: two required open surgery, whereas one underwent laparoscopic fenestration again with a drainage tube fixed into the sac. Three patients presented with an infected lymphocele: two underwent open surgical evacuation and drainage, while one had percutaneous drainage with antibiotics and subsequent laparoscopic fenestration. Four patients had ureteric strictures. All cases recovered with satisfactory kidney graft function. Conclusion: Together with the literature, a protocol can be drawn up for management of lymphocele comprising percutaneous indwelling catheter drainage followed by laparoscopic fenestration. Technique modification with fixation of a drain tube into the sac may reduce lymphocele recurrence. Open surgery is reserved for complicated cases.
LYMPHOCOELES: REVISITING A FAMILIAR PROBLEM IN RENAL TRANSPLANT Osei Tutu Lovelace, Olakkengil Santosh A, Russell Christine Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital Aim: Retrospective evaluation of various treatment modalities for large lymphocoeles in renal transplant patients over a 20-year period.
Methods: Using the hospital renal transplant database and electronic and handwritten records, surgical outcomes of 1862 patients were analysed, of which 28 had large symptomatic lymphocoeles (>100 ml). The main outcome indicators were type of donor, duration to presentation, complications of the lymphocoele, creatinine pre- and post-drainage, and the drainage technique utilized. Results: 23 patients were recipients of DBD renal transplants and 5 were live donor recipients; there were no DCD recipients. The average time prior to presentation was 102.2 days. 2 patients had large haematomas needing evacuation prior to development of a lymphocoele. 3 had compression of the ipsilateral external iliac vein, with 1 developing deep vein thrombosis. 2 other patients had compression of the transplant ureter, with a further 3 developing compression of the kidney. There was no significant change in pre- and post-drainage creatinine. 22 patients had laparoscopic lymphocoele fenestrations and 4 had open drainage; guidewire localization and PCN with contrast localization were utilized in 2 patients. Where possible, omentum was packed into the rent in the cavity. There were no complications from the operative modalities employed. Conclusion: Various modalities have been described in the literature for treatment/drainage of large lymphocoeles. Based on our series, laparoscopic drainage with omental packing, where feasible, seems to be the most favourable and safest option.
ADRENAL LIPOMA PRESENTING AS A SURGICAL DILEMMA DURING MULTI-ORGAN RETRIEVAL Gostlow Hannah1, Sladden Nicole2, Olakkengil Santosh A1, Russell Christine1 1Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital, 2Department of Pathology, Royal Adelaide Hospital
Aims: Adrenal tumours presenting to our centre, in particular benign adrenal lipomatous lesions, were reviewed after a deceased donor presented with an adrenal lipoma. Methods: A 27-year-old female with Moyamoya vasculopathy became a multi-organ donor. During back-table dissection a smooth, round tumour, 1.5 cm in diameter, was discovered in the left adrenal gland. The abdominal CT scan showed the left adrenal gland to contain a homogeneous radiolucent lesion with a density of −40 to −50 Hounsfield units, suggestive of adipose tissue. The adrenal gland with the tumour and the Gerota's fascia with its pad of fat were sent for histopathology and the kidney transplant went ahead. Results: From January 2001 to December 2014 our surgical pathology department encountered only a single case of a benign adrenal lipoma, which was resected as part of a radical nephrectomy for renal cell carcinoma. There were 7 cases of myelolipoma of the adrenal gland (age range 41–88 years); six were surgical excisions and one was seen at autopsy. Microscopy showed a circumscribed lesion within the adrenal parenchyma composed of mature adipocytes, with minor variation in size but no nuclear atypia. There were no lipoblasts, spindle cell proliferations, thick-walled blood vessels or clusters of myeloid cells. Immunohistochemical testing showed no adipocyte staining with HMB-45, Melan-A or SMA. These combined morphological and immunohistochemical features confirmed the diagnosis of a benign adrenal lipoma.
Conclusions: Adrenal lipomas are extremely rare benign adrenal lesions, and this is the first reported case of an adrenal lipoma potentially complicating organ donation.
RANDOMIZED CLINICAL STUDY ON TECHNIQUE OF REPERFUSION IN LIVER TRANSPLANTATION: INSIGHT INTO MICROCIRCULATION, GENE RESPONSE, AND OUTCOME Pulitano Carlo, Joseph David, Sandroussi Charbel, Deborah Verran, Pleass Henry, Debiasio Ashe, Phong Ho, Adriano Luongo, Allen Richard, Mccaughan Geoffrey, Shackel Nicholas, Crawford Michael Australian National Liver Transplantation Unit, Royal Prince Alfred Hospital, Sydney
Introduction: It remains controversial which liver reperfusion technique is best in terms of graft injury and clinical outcome in liver transplantation (LT). Aims: To compare sequential and simultaneous reperfusion during LT in terms of liver function, graft survival, and complication rate. Methods: LT was performed in 60 adult patients randomized into 2 groups: simultaneous (SIM), in which the hepatic artery and portal vein are reperfused simultaneously, and sequential (SEQ), with reperfusion of the portal vein followed by the hepatic artery. Graft function was assessed by serum markers, microcirculation, histopathology, gene expression, serum levels of inflammatory cytokines, and clinical outcome. Microcirculation changes were evaluated using sidestream dark field imaging. Expression of 23 specific genes was assessed using RT-PCR. Mortality and biliary complications were evaluated. Results: Postoperative serum levels of bilirubin were significantly lower in the SIM than in the SEQ group (P=0.001). Microcirculatory dysfunction was significantly more common in SEQ than in SIM. Microcirculation perfusion was 37% lower in SEQ compared to SIM (P=0.006). All remaining microcirculation parameters differed significantly (P=0.001) in favor of the SIM group. Microcirculation was correlated with ALT, bilirubin, and endothelin-1 levels. SIM had improved histology and gene expression (P=0.001). Biliary stricture occurred in 4 patients in the SIM and in 7 patients in the SEQ group (P=0.316).
There was no significant difference in overall complication rate or graft survival. Conclusions: These results, based on the largest randomized trial of reperfusion technique conducted to date, demonstrate that simultaneous reperfusion causes less microcirculatory disturbance and gives superior primary function.
CHARACTERISTICS OF EXPANDED CRITERIA DECEASED DONORS AND GRAFT AND PATIENT OUTCOMES Lim Wai1, Chadban Steven2, Ferrari Paolo2, Pilmore Helen3, Hughes Peter4, Chakera Aron1, Russ Graeme5, Wong Germaine6
1Department of Renal Medicine, Sir Charles Gairdner Hospital, Perth, 2Department of Renal Medicine, Royal Prince Alfred Hospital, Sydney, 3Auckland Renal Transplant Group, Auckland City Hospital, 4Department of Renal Medicine, Royal Melbourne Hospital, 5Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital, 6Department of Renal Medicine, Westmead Hospital, Sydney
Background and Aims: The use of expanded criteria deceased donor (ECD) kidneys for transplantation has more than doubled over the last decade. Although ECD kidneys are associated with up to a twofold greater risk of graft loss compared to non-ECD kidneys, the association between the different donor characteristics that define ECD kidneys and outcomes remains unclear. The aim of this study is to examine the association between ECD characteristics and graft loss using the Australia and New Zealand Dialysis and Transplant (ANZDATA) registry, 1990–2012. Methods: We compared the risk of graft failure and death with functioning graft (DFG) across ECD characteristics in adjusted Cox regression analyses. Results: Of the 1717 ECD kidneys transplanted, 1304 (75.9%) were from donors aged ≥60 years (ECD≥60), and 299 (17.4%), 21 (1.3%) and 93 (5.4%) were from donors aged 50–59 years with hypertension/death attributed to cerebrovascular accident (ECD50-59 HTN/CVA), hypertension/terminal creatinine >133 µmol/L (ECD50-59 HTN/Cr>133) and CVA/Cr>133 (ECD50-59 CVA/Cr>133), respectively. Compared to ECD≥60, ECD50-59 HTN/CVA was associated with a significant reduction in overall and death-censored graft failure (DCGF), but ECD50-59 CVA/Cr>133 was associated with almost a 2-fold increased risk of DFG. When restricted to ECD≥60, those with HTN/Cr>133 were associated with a higher risk of overall graft failure, DCGF and DFG compared to ECD≥60 with ≤1 characteristic of HTN, CVA or Cr>133. Conclusions: The current definition of ECD kidneys does not take into account the differential impact on graft and patient outcomes of dissimilar donor characteristics.

Table: Adjusted hazard ratios (95% CI) for outcomes by ECD characteristics

                          Overall graft failure   DCGF                 DFG
All ECD
  ECD≥60                  1.00                    1.00                 1.00
  ECD50-59 HTN/CVA        0.73 (0.58, 0.90)*      0.63 (0.43, 0.91)*   1.10 (0.79, 1.53)
  ECD50-59 HTN/Cr>133     0.57 (0.23, 1.38)       —                    1.54 (0.57, 4.17)
  ECD50-59 CVA/Cr>133     1.15 (0.85, 1.56)       0.97 (0.56, 1.67)    1.89 (1.21, 2.95)*
Only ECD≥60
  ≤1 criterion            1.00                    1.00                 1.00
  HTN and CVA             0.86 (0.70, 1.06)       0.76 (0.53, 1.08)    1.07 (0.76, 1.50)
  HTN and Cr>133          2.23 (1.17, 4.23)*      3.24 (1.18, 8.91)*   2.70 (1.01, 7.42)*
  CVA and Cr>133          1.39 (0.92, 2.10)       1.12 (0.54, 2.31)    1.50 (0.75, 3.00)
*P<0.05.
TRANSBRONCHIAL BRUSH (TBBR) RELIABLY QUANTIFIES LYMPHOCYTIC BRONCHIOLITIS AND PREDICTS SUBSEQUENT CHRONIC LUNG ALLOGRAFT DYSFUNCTION Yerkovich Stephanie1,2, Samson Luke1,2, Sinclair Kenneth1,2, Tan Maxine1,2, Gallagher Harry1, Fiene Andreas1, Hopkins Peter1,2, Chambers Daniel1,2 1Lung Transplant Service, Prince Charles Hospital, Brisbane, 2School of Medicine, University of Queensland, Brisbane
Background: Lymphocytic bronchiolitis is one of the main risk factors for chronic lung allograft dysfunction (CLAD), but the operating characteristics of the current test (transbronchial biopsy) are poor. We hypothesised that combining transbronchial brush (TBBr) with flow cytometric evaluation of epithelial lymphocytes would better assess the allograft. Aim: To enumerate lymphocyte subsets within healthy and diseased lung allografts and to assess their association with CLAD. Methods: 230 TBBr and concurrent biopsies were obtained in 107 patients (39 CF, 34 COPD, 19 IPF, 92% bilateral, median age 49.8 years, 26% CLAD at census) undergoing post-transplant surveillance and diagnostic bronchoscopy, with a median follow-up of 31 (20–56) months. Cells were stained with cytokeratin, CD3, CD8, CD103 (intraepithelial T cell marker) and granzyme B (GrB) before flow cytometric analysis was performed. Results: Even in healthy lung transplant recipients (CLAD-free at census), the bronchiolar, but not bronchial, region was progressively infiltrated by both activated (CD3+CD103+GrB+, β=0.022 (0.004–0.039), P=0.015) and non-activated (total CD3+CD103+, β=0.016 (0.004–0.029), P=0.011) intraepithelial T cells. Acute allograft dysfunction was associated with a longer time post-transplant (OR 1.04, 1.02–1.06, P<0.001), biopsy grade (OR 1.30, 1.04–1.63, P=0.020), BAL neutrophilia (OR 1.03, 0.99–1.05, P=0.072) and bronchiolar infiltration of CD3+GrB+ cells (OR 1.97, 1.13–3.41, P=0.016). Using a 1% cut-off (by ROC analysis), for each TBBr procedure where CD3+GrB+ >1%, there was a 1.34-fold (1.17–1.54, P<0.001) increased risk of subsequent CLAD. Conclusion: The TBBr test provides important prognostic information in lung transplant recipients, with bronchiolar infiltration by activated CD3+GrB+ T cells a strong risk factor for subsequent CLAD development.
IMPACT OF DONOR BRAIN DEATH INTERVAL AND GRAFT ISCHAEMIC TIME ON LUNG TRANSPLANT SURVIVAL Sugianto Nara1, Lo Phillip1, Dhital Kumud1,2, Glanville Allan1,3, Havryk Adrian1,3, Malouf Monique1,3, Plit Marshall1,3, Granger Emily1,2, Jansz Paul2, Spratt Phillip1,2
1Faculty of Medicine, University of New South Wales, Sydney, 2Department of Cardiothoracic Surgery, St Vincent's Hospital, Sydney, 3Department of Thoracic Medicine, St Vincent's Hospital, Sydney
Background: Acceptable graft ischaemic times (GIT) for lung transplantation (LTX) remain controversial. The influence of brain death interval (BDI; the period from brain death to donor cross-clamp and pneumoplegia) on LTX outcomes has not yet been investigated. We hypothesized that longer GIT, as well as increasing BDI, would be associated with reduced survival.
Method: Retrospective single center study of 725 adult primary LTX recipients (single:bilateral = 150:575), October 1990–May 2014. Results: Both BDI and GIT have increased significantly over time. Median BDI increased from 9.3 to 18.6 hours (P<0.001), largely due to increases in offer to cross-clamp time (1.0 to 7.9 hours, P=0.005), and correlated with greater distances travelled for organ retrieval (P=0.003, r=0.111). Median GIT increased from 3.7 to 5.7 hours (P=0.002). Median (interquartile range) BDI, brain death to offer interval, offer to cross-clamp interval and GIT were 14.4 (11.8–18.4) hours, 5.5 (3.3–8.3) hours, 9.0 (7.5–10.8) hours and 5.0 (4.0–6.0) hours, respectively. Cox multivariate analysis confirmed that GIT is a significant independent predictor of survival (P=0.02, hazard ratio [HR]: 0.914, 95% CI: 0.848–0.986). Kaplan-Meier analysis showed that bilateral LTX patients aged >50 years had reduced survival when GIT >6 hours (P<0.002, log-rank). BDI was not a significant independent predictor of survival on multivariate analysis (P=0.319, HR: 0.991, 95% CI: 0.975–1.008). Conclusion: GIT and BDI have increased significantly over the study period. Although BDI is not a predictor of survival, prolonged GIT (>6 hours) had a negative impact on survival amongst patients >50 years of age receiving bilateral lung transplants.
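The Kaplan-Meier comparisons used here can be illustrated with a minimal product-limit estimator. This is an illustrative sketch only: the follow-up times and event flags below are invented toy data, not the St Vincent's cohort.

```python
# Minimal Kaplan-Meier (product-limit) estimator.
# Each subject contributes (time, event): event=1 for death, 0 for censoring.
def kaplan_meier(data):
    """Return [(event_time, survival_probability)] at each distinct event time."""
    data = sorted(data)
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for time, event in data if time == t and event == 1)
        leaving = sum(1 for time, _ in data if time == t)  # deaths + censored at t
        if deaths:
            survival *= 1.0 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= leaving
        i += leaving
    return curve

# Toy cohort of 6 recipients; subjects with event=0 are censored.
cohort = [(1, 1), (2, 0), (3, 1), (4, 1), (5, 0), (6, 1)]
print(kaplan_meier(cohort))
```

Censored subjects contribute to the risk set until they leave, which is what distinguishes the product-limit estimate from a naive fraction surviving; a log-rank test (as in the abstract) would then compare two such curves.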
PROGNOSTIC VALUE OF MULTISLICE COMPUTED TOMOGRAPHY CORONARY ANGIOGRAPHY IN CARDIAC TRANSPLANT RECIPIENTS Jabbour Andrew1, Yu Chung-Yao1, Macdonald Peter1, Keogh Anne M1, Hayward Chris S1, Kotlyar Eugene1, Otton James M1, Boshell David2, Milner Brad2, Mccrohon Jane1, Sammel Neville1, Feneley Michael1 1Department of Cardiology, St Vincent’s Hospital, Sydney, 2Radiology Department, St Vincent’s Hospital, Sydney
Purpose: CTCA demonstrates good agreement with invasive coronary angiography (ICA); the prognostic value of CTCA in cardiac transplant recipients, however, is not known. Method: Screening CTCAs performed on cardiac transplant recipients from September 2009 to July 2014 were retrospectively analyzed. The primary end point was the composite of ischaemia-related death and non-fatal coronary events. Secondary end points included non-ischaemia-related death and image quality. Outcome was assessed by reviewing the patient record within St Vincent's Hospital, Sydney. Results: Mean age at time of scanning was 53.7±15.1 years; mean time after transplantation was 9.3±8.6 years. Median follow up was
384 days. Of 145 scans, 86 (59%) had grade 0 cardiac allograft vasculopathy (CAV), 44 (30%) had grade 1, 3 (2%) had grade 2 and 2 (1%) had grade 3 CAV. A further seven scans revealed 8 patent coronary stents. 2043 segments (97%) were of diagnostic quality and 130 (6.4%) had evidence of disease. One major coronary event occurred (suspected coronary dissection). No ischaemia-related deaths were reported. One non-ischaemic death occurred, due to restrictive cardiomyopathy. Four patients underwent subsequent ICA, with two proceeding to coronary stenting. Conclusion: The majority of CTCA studies are of diagnostic quality. The prevalence of significant CAV detected is low but appreciable. Early detection led to appropriate therapy and was associated with no ischaemic events. The majority of patients had no visible coronary disease and no subsequent ischaemic events. CTCA is an effective and predictive imaging strategy in cardiac transplant recipients.
PREGNANCY AFTER KIDNEY TRANSPLANTATION IN A SOUTH AUSTRALIAN COHORT OVER 39 YEARS Mohammadi Fadak1,2, Borg Matthew1,2, Faull Randall1,2, Carroll Robert1,2, Russ Graeme1,2, Coates Patrick Toby1,2, Mcdonald Stephen1,2, Jesudason Shilpa1,2
1Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital, 2Centre for Clinical and Experimental Transplantation, Royal Adelaide Hospital
Background and Methods: Renal transplantation restores fertility in women with ESKD, but these remain complex pregnancies with higher rates of adverse maternal-fetal outcomes than in the general population. We investigated outcomes for pregnancies post-kidney transplant at a large transplanting unit in South Australia. Cases were identified via ANZDATA and by clinicians, and retrospectively reviewed. Results: We identified 43 pregnancies among 24 women (mean age 29.9±4.2 years) from 1976–2015, including one twin pregnancy. Nearly 50% of pregnancies were unplanned. Fetal outcomes: The overall live birth rate (LBR) was 79.5% (n=35), with 11.3% spontaneous losses and 9.1% elective terminations. Pregnancies reaching >20 weeks (n=36) had an LBR of 97.2%, with one intrauterine death. Intrauterine growth restriction was reported in 19.4%. Serial growth scans from 12 pregnancies demonstrated a mean gestational-age-adjusted growth percentile progressively falling to 30.5±17.7. Mean birth weight was 2556.3±679.7 grams, with a mean gestational age of 35.4±3.6 weeks; 47.2% of births were preterm (<37 weeks). Only one neonatal death occurred. Maternal outcomes: Mean pre-pregnancy creatinine (SCr) was 104.4±36.9 µmol/L. SCr >110 µmol/L was associated with lower birthweight (Table 1). Hypertensive disorders were common: chronic hypertension (39.5%), gestational hypertension (13.9%) and pre-eclampsia (18.1%). The caesarean section (CS) rate was 63.9%; half of these were emergency CS for hypertensive disorders, worsening renal function, fetal position or failure of labour. Conclusions: Although live birth rates were excellent, post-transplant pregnancies had high rates of obstetric complications, particularly hypertensive disorders and prematurity. Pre-pregnancy counselling is crucial to avoid unplanned pregnancy. Better understanding of obstetric and foetal outcomes is essential to improve clinical management.
Table 1. Outcomes by pre-pregnancy creatinine range (µmol/L)

                                                 SCr <110 (n=31)   SCr ≥110 (n=12)
Live birth rate (elective terminations excl.)    85.2%             83.3%
Mean 3rd trimester creatinine (µmol/L)           86.9 ± 49.9       146.2 ± 53.2
Mean postpartum creatinine (µmol/L)              88.4 ± 84.2       194.6 ± 100.0
Hypertension                                     22.6%             33.3%
Pre-eclampsia*                                   17.4%             36.4%
Mean gestational age at birth (weeks)*           36.2 ± 3.2        34.3 ± 4.2
Mean birth weight (grams)*                       2734.1 ± 555.8    2224.1 ± 823.4#
*Live births only. #P<0.05.
BARIATRIC SURGERY IN MORBIDLY OBESE PATIENTS WITH CHRONIC KIDNEY DISEASE (CKD) Phan Kevin1, Wong Germaine2,3,4, Allen Richard5,6, Joseph David5, An Vincent Vinh Gia1, Chapman Jeremy R4, Pleass Henry6,7, Ryan Brendan7
1Western Clinical School, Westmead Hospital, Sydney, 2Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 3School of Public Health, University of Sydney, 4Centre for Transplant and Renal Research, University of Sydney, 5Department of Renal Medicine, Royal Prince Alfred Hospital, Sydney, 6Department of Surgery, Sydney Medical School, University of Sydney, 7Department of Surgery, Westmead Hospital, Sydney
Background: It remains unclear whether bariatric surgery may improve transplantation access for morbidly obese patients with advanced stage kidney disease who are otherwise deemed unsuitable for transplantation. We aimed to determine the potential benefits and harms of bariatric surgery in morbidly obese patients with end-stage kidney disease (ESKD). Methods: This is an observational, single-centre series of 10 consecutive patients with ESKD who underwent bariatric surgery prior to being considered as eligible transplant candidates. Pre- and postoperative outcomes were compared using paired t-tests at baseline and 3 months after the operation. Results: The mean age was 45.5±13.9 years, with an average preoperative weight of 122.2±20.3 kg. Over a follow-up period of 3 months, there was a significant reduction in weight (122.2±20.3 vs. 95.2±22.1 kg, P<0.001) and BMI (42.6±9.4 vs 29.8±6.1 kg/m2, P=0.004). However, there were no significant changes in other biochemical parameters such as urea, serum creatinine and eGFR. A total of 4 patients (40%) received either a deceased or living donor kidney transplant, 5 (50%) are currently being considered as potential transplant candidates, and one (10%) was deemed unsuitable because of gastric band slippage leading to inadequate weight loss. Conclusions: Our findings suggest that bariatric surgery may be effective in achieving short-term weight loss in morbidly obese patients with ESKD requiring transplantation. Future well-powered randomized trials comparing the long-term efficacy and harms of bariatric surgery with standard therapy are required to confirm the trends observed.
OBESITY AND THE RISK OF ALL-CAUSE AND CARDIOVASCULAR MORTALITY ACROSS THE SPECTRUM OF CKD PATIENTS: A SYSTEMATIC REVIEW AND META-ANALYSIS Ladhani Maleeka1, Craig JC1,2, Irving M1, Clayton Philip1,3, Wong Germaine1,2,3
1School of Public Health, University of Sydney, 2Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 3Department of Renal Medicine, Prince of Wales Hospital, Sydney
Aims: To determine the association between obesity and the risk of death, death from cardiovascular disease, and non-fatal cardiovascular events in people with chronic kidney disease. Methods: We searched MEDLINE and Embase to January 2015 for all cohort studies that included an estimate of the association between obesity, measured by body mass index (BMI), waist circumference (WC) or waist-hip ratio (WHR), and these outcomes. Data were extracted and summarised using random effects models where possible. Results: Of the 4,063 citations identified in the search, 114 cohorts of 852,162 patients (164 reports) were analysed. A linear relationship between BMI and mortality was not always observed among CKD stage III–V, stage V-D and transplant patients. Among the studies (n=37) that described a linear relationship between obesity and mortality, there was a 3% and 4% reduction for haemodialysis patients in all-cause and cardiovascular mortality, respectively (HR 0.97; 95%CI 0.96–0.98 and HR 0.96; 95%CI 0.92–1.00). In stage III–V, for every one kg/m2 increase in BMI there was a 2% reduction in all-cause mortality (HR 0.98; 95%CI 0.97–1.00). There was no association between BMI and all-cause mortality in the peritoneal dialysis or transplanted populations (HR 1.05; 95%CI 0.95–1.17 and HR 1.00; 95%CI 0.96–1.04). Obtaining precise estimates of the association of WC and WHR with mortality outcomes was not feasible because of the limited data available. Conclusions: Being obese may be protective against cardiovascular and all-cause mortality in pre-dialysis and haemodialysis patients but not in transplanted patients.
HEALTH AND WEALTH IN CHILDREN AND ADOLESCENTS WITH KIDNEY TRANSPLANTS (K-CAD STUDY) Didsbury Madeleine1, Medway Meredith1, Chen Kerry1, Tong Allison2, Turner Robin3, Mackie Fiona4, Mctaggart Steven5, Kara Tonya6, Walker Amanda7, White Sarah8, Howard Kirsten2, Kim Siah1, Craig Jonathan1,2, Wong Germaine1,2,9
1Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 2School of Public Health, University of Sydney, 3Department of Biostatistics, University of New South Wales, Sydney, 4Department of Nephrology, Sydney Children's Hospital, 5Department of Renal Medicine, Lady Cilento Children's Hospital, Brisbane, 6Department of Renal Medicine, Starship Children's Hospital, Auckland, 7Department of Renal Medicine, Royal Children's Hospital, Melbourne, 8School of Medicine, University of Sydney, 9Centre for Transplant and Renal Research, Westmead Hospital, Sydney
Background: Poverty and social inequality are major barriers to achieving optimal health outcomes in children, but their impact on outcomes after kidney transplantation remains unclear. The K-CAD study is an Australian multicentre study that aims to describe the prevalence of economic hardship among caregivers and to determine the relationship between the socioeconomic status (SES) of caregivers and the self-rated health of children with CKD. Method: One hundred and eighty-four children aged 6–18 years with CKD [stage 1–5 (n=93), dialysis (n=20), transplant (n=71)] were recruited from four tertiary children's hospitals across Australia. Comparisons by quintile of SES for nominal self-rated health outcomes among children with CKD were analysed using adjusted multinomial logistic regression. Results: The mean ages of the caregivers and children were 41.3 years (SD: 9.6) and 12.3 years (SD: 4), respectively. More than 60% of all households earned less than AUD$1250 per week. Only 20% (n=37) of caregivers engaged in full-time employment and less than 30% (n=54) had received tertiary education. Compared to children from the highest household income group, children with CKD from families in the lowest household income bracket (AUD$0–599/week) reported significantly poorer self-rated health (OR 3.62, 95%CI 1.57–8.37, P=0.006). Poor self-rated parental health was also predictive of poor self-rated health in children with CKD (OR 42.7, 95%CI 5–300, P=0.006). Conclusion: Caregiver SES appears to have a profound impact on self-rated health in children with CKD. Longitudinal follow-up will help delineate the causes of socioeconomic disadvantage in these children and the long-term effects on disease progression and wellbeing outcomes.
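Odds ratios with confidence intervals, as reported throughout these abstracts, are typically Wald estimates that are symmetric on the log scale, so the standard error and z statistic can be recovered from a published 95% CI. The sketch below uses the OR 3.62 (95%CI 1.57–8.37) result purely as a worked example, and assumes a standard Wald interval, which may not exactly match the authors' model output.

```python
import math

def wald_from_ci(or_point, ci_low, ci_high):
    """Recover SE, z and a two-sided P-value from an odds ratio and 95% CI,
    assuming the CI was built as exp(log(OR) +/- 1.96*SE) (Wald interval)."""
    log_or = math.log(or_point)
    se = (math.log(ci_high) - math.log(ci_low)) / (2 * 1.96)
    z = log_or / se
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail probability
    return se, z, p

se, z, p = wald_from_ci(3.62, 1.57, 8.37)
print(f"SE={se:.3f}, z={z:.2f}, P={p:.4f}")
```

The recovered P (about 0.003) is of the same order as the reported P=0.006; small discrepancies are expected from rounding of the published OR and CI limits.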
THE ASSOCIATION BETWEEN HLA EPLET MISMATCHES, BROAD ANTIGEN MISMATCHES AND CLINICAL OUTCOMES IN KIDNEY TRANSPLANTATION Do Nguyen Hung1, Lim Wai2, Wong Germaine3,4,5 1School of Medicine & Pharmacology, University of Western Australia, Perth, 2Department of Renal Medicine, Sir Charles Gairdner Hospital, Perth, 3Renal & Transplantation Unit, Westmead Hospital, Sydney, 4Centre for Kidney Research, The Children’s Hospital at Westmead, Sydney, 5School of Public Health, University of Sydney
Background: Mismatched human leukocyte antigens (HLA) between donor and recipient are associated with a greater risk of rejection and graft loss, but it is unclear whether mismatched eplet load may identify those at risk of adverse events after kidney transplantation. Aim: To determine the predictive value of eplet and broad antigen mismatches (MM) for acute rejection and graft failure after kidney transplantation. Methods: Using linked data from ANZDATA and NOMS, we calculated HLA-ABDR broad antigen and eplet MM for all primary kidney transplants (2006–2011). Three stepwise logistic regression models
were developed to determine and compare the predictive values of eplet, broad antigen, and eplet + broad antigen MM for acute rejection and graft failure after kidney transplantation. Results: We included 3449 primary grafts. Over a (mean±SD) follow-up time of 3.6±1.8 years, 647 patients (19%) experienced rejection and 401 (12%) lost their allografts. A linear association was observed between broad antigen MM and acute rejection (P=0.803), but not for eplet MM (P<0.001). For every one broad antigen MM, there was at least a 12% increase in the risk of rejection (adjusted OR: 1.18, 95%CI 1.11–1.24, P<0.001) and graft loss (adjusted OR: 1.12, 95%CI 1.03–1.18, P=0.003). Recipients with >11 eplet MM experienced at least a 25% increased risk of acute rejection (adjusted OR 1.51, 95%CI 1.19–1.92, P=0.001) and graft loss (adjusted OR 1.29, 95%CI 0.96–1.71, P=0.88) compared to those with ≤11 eplet MM. There were no significant differences in the area under the curve (AUC) between the combined broad antigen and eplet MM, the eplet MM alone, and the broad antigen MM models (c-statistics: 0.583 [0.559–0.607], 0.554 [0.530–0.578] and 0.583 [0.559–0.607], respectively, for rejection). Conclusions: Eplet and broad antigen MM are associated with acute rejection and graft failure after kidney transplantation, but adding eplet matching to traditional broad antigen matching does not improve risk stratification for adverse events such as graft loss and acute rejection.
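The c-statistic compared across these models is equivalent to the probability that a randomly chosen patient who had the event is assigned a higher predicted risk than a randomly chosen patient who did not. A minimal rank-based computation (with invented illustrative data, not the ANZDATA cohort) is:

```python
def c_statistic(scores, outcomes):
    """AUC / c-statistic: the fraction of (event, non-event) pairs in which
    the event has the higher predicted risk; tied scores count as half."""
    events = [s for s, y in zip(scores, outcomes) if y == 1]
    nonevents = [s for s, y in zip(scores, outcomes) if y == 0]
    wins = 0.0
    for e in events:
        for n in nonevents:
            if e > n:
                wins += 1.0
            elif e == n:
                wins += 0.5
    return wins / (len(events) * len(nonevents))

# Hypothetical predicted rejection risks and observed rejection (1) / none (0)
risks = [0.9, 0.7, 0.6, 0.4, 0.3, 0.1]
observed = [1, 1, 0, 1, 0, 0]
print(c_statistic(risks, observed))
```

A c-statistic of 0.5 is no better than chance, which puts the reported values around 0.55–0.58 in context: both matching schemes discriminate only weakly at the individual-patient level.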
THE IMPACT OF COLD ISCHAEMIC TIME ON PATIENT AND ALLOGRAFT SURVIVAL FOLLOWING DONATION AFTER CARDIAC DEATH AND USING STANDARD CRITERIA Wong Germaine1,2, Craig Jonathan3, Teixeira-Pinto Armando4, Chapman Jeremy2, Macdonald Stephen5, Lim Wai6
1Centre for Transplant and Renal Research, University of Sydney, 2Centre for Transplant and Renal Research, Westmead Hospital, Sydney, 3Centre for Kidney Research, University of Sydney, 4School of Public Health, University of Sydney, 5Central Northern Adelaide Renal and Transplantation Service, University of Adelaide, 6Department of Renal Medicine, Sir Charles Gairdner Hospital, Perth
Background: As the disparity between the demand for donor kidneys and their availability continues to widen, the threshold for acceptance lowers. Prolonged cold ischaemic time (CIT) is a known risk factor for adverse graft outcomes and may be a reason for disuse of potentially suitable donor organs. We aimed to evaluate the interaction between donor type (donation after cardiac death (DCD) compared with standard criteria donation (SCD)) and the duration of CIT on patient and allograft survival. Methods: Using data from the ANZDATA registry, propensity score matched analyses were conducted to determine the impact of CIT and donor type on patient and graft survival among recipients who received their first kidney transplant between 1963 and 2013. Cox proportional hazards modelling was used to determine the association of CIT and donor type with graft and patient outcomes. Results: A total of 4065 matched pairs were obtained from 12,665 recipients. Over a median follow-up time of 7.3 years (interquartile range: 9.2 years), a total of 4557 (56%) recipients lost their allograft and 2932 (36%) died. The hazard ratios (HRs) for graft loss and mortality for those with CIT ≥12 hours, relative to recipients with CIT <12 hours, were 1.4 (95%CI: 1.3–1.5, P<0.001) and 1.8 (95%CI: 1.7–2.0, P=0.02), respectively. There was a significant interaction between donor type and CIT on patient and graft outcomes (P=0.03). In DCD recipients, compared with SCD recipients, there was a substantially increased risk of graft loss (HR 1.8 (95%CI: 1.4–2.3)) and death (HR 1.9 (95%CI: 1.4–2.8)). Conclusion: Increasing CIT and donor quality have a multiplicative effect on adverse graft and patient outcomes. Interventions to reduce CIT, particularly among DCD donor grafts, may improve overall graft and patient survival.
AUSTRALIAN KIDNEY EXCHANGE (AKX) DONOR KIDNEYS TRAVEL IN STYLE Allen Richard1,2, Woodruffe Claudia3,4, Pleass Henry5,6, Ferrari Paolo4
1Discipline of Surgery, Sydney Medical School, University of Sydney, 2Transplantation Services, Royal Prince Alfred Hospital, Sydney, 3Australian Organ and Tissue Authority, 4Department of Renal Medicine, Prince of Wales Hospital, Sydney, 5University of Sydney, 6Department of Surgery, Westmead Hospital, Sydney
Background: Two unique features of the AKX program are the requirement for simultaneous anaesthetic induction times (AIT) and organ transport to the recipient transplant centre. Implicit in this decision was equivalence in the quality of donor surgery and a negligible impact of longer cold ischaemia time (CIT). Aim: To evaluate whether transport of live donor kidneys meets these assumed expectations. Methods: The initial 100 AKX transplants were evaluated prospectively for compliance with AIT, time of knife-to-skin (KTS) and cross-clamp, and total CIT. Results: In a 48-month period, 17 donor surgeons at 12 centres were involved in fifteen 2-way, twenty 3-way, one 4-way and one 6-way exchanges. No donor withdrew consent on the day of surgery. 69% of AITs were within 5 min, with a mean±SD of 8±18 min (range 0–105). Mean AIT to knife-to-skin (KTS) time was 35±15 min (range 14–100) and mean individual surgeon KTS to cross-clamp time was 115±44 min (range 42–213). Donor kidneys were 84% left-sided and 18% had >1 artery to anastomose. Interstate air transport was
required for 48 kidneys, with east coast-to-east coast (n=28) CIT of 408±63 min and east-to-west (n=20) CIT of 630±101 min. Same-state road transport was used for 37 kidneys, with CIT of 237±67 min. No kidneys were lost in transport. There was 1 case of delayed graft function requiring dialysis (CIT 400 min) and 1 case of early graft loss. Conclusion: Despite a wide range of donor surgery times and prolonged CIT for interstate exchanges, the AKX decision to ship donor kidneys rather than the donor has proved to be safe. Greater flexibility in arrangements for AIT should be considered.
INCREASING DONOR AND RECIPIENT AGES IMPACT ON SURVIVAL AFTER HEART TRANSPLANTATION Lo Phillip1, Sugianto Nara1, Granger Emily2, Jansz Paul2, Spratt Phillip2, Hayward Christopher2, Jabbour Andrew2, Keogh Anne2, Kotlyar Eugene2, Macdonald Peter2, Dhital Kumud2
1Faculty of Medicine, University of New South Wales, Sydney, 2Heart and Lung Transplant Unit, St Vincent's Hospital, Sydney
Aims: Modest improvements in centre-based outcomes have encouraged many transplant units to increase donor pools by liberalising acceptance criteria. In this study, we analysed heart transplantation data focusing on temporal changes in donor and recipient characteristics. Methods: This retrospective single-centre study included 847 first time, single heart transplantations performed between February 1984 and April 2014. Three decade-like eras were defined. Patient characteristics were compared with survival in years using Kaplan-Meier statistics. Cox regression analysis was performed. Results: In our cohort of 847 patients over 30 years, the 30-day, 1-, 5- and 10-year survival rates were 96%, 87%, 76% and 60%, respectively. Long-term survival has not differed between the three eras (log-rank P=0.166). From 1984–1993 to 2004–2014, median [interquartile range (IQR)] donor age increased from 27.2 (20.3–37) to 39 (26–49) years (P<0.001). Similarly, median (IQR) recipient age increased from 47 (37–53) to 51 (39.3–58) years (P<0.001). Increasing donor (P=0.024) and recipient (P<0.001) ages predicted greater recipient mortality in univariate analysis. Donors aged 55+ years (n=50) conferred poorer recipient survival outcomes compared to donors aged <55 years (n=787) (log-rank P=0.006). Regardless of donor age, recipients aged 50+ years (n=420) have the worst
Figure: Kaplan-Meier survival of recipients of heart transplants in New South Wales, Australia, stratified by donor age (log-rank P<0.05).
survival compared to recipients aged <50 years (n=426) (log-rank P<0.001). Conclusion: Donor age 55+ years and recipient age 50+ years are predictors of increased mortality. The observed and continuing increase in donor and recipient marginality may adversely influence long-term survival, which currently remains relatively unchanged for the entire cohort across all three eras of heart transplantation.
INCREASING OBESITY RATES WILL INDEPENDENTLY LEAD TO A REDUCTION IN AVAILABLE LIVER DONORS Voss Jordan1, Raglow Zoe1, Schmitt Timothy2, Kumer Sean2, Gilroy Richard3
1School of Medicine, University of Kansas Medical Center, Kansas City, KS, USA, 2Department of Surgery, University of Kansas Medical Center, Kansas City, KS, USA, 3Gastroenterology and Hepatology, University of Kansas Medical Center, Kansas City, KS, USA
Aims: Obesity rates in Australia and the U.S. exceed 25% and 34%, respectively. Here we analyze the impact of obesity trends on deceased donor liver utilization and graft outcomes. Methods: In a single U.S. organ procurement organization, all donors (age ≥15) in 3 comparative 5-year eras were assessed: 1988–1992 (n=357), 1998–2002 (n=504) and 2008–2012 (n=885). Allograft outcomes were obtained from the Scientific Registry of Transplant Recipients. Results: Recovery rates remained relatively constant (71.4% in era 1, 71.6% in era 2 and 72.6% in era 3). Mean donor BMI increased sharply (23.6 in era 1 vs. 28.6 in era 3; P<0.001). The average BMI of transplanted livers was significantly lower than that of non-utilized livers (P<0.001 in all eras). BMI was independently associated with recovery in all eras (P=0.001). The following logistic model predicts that recovery rates will drop from a current rate of 83.8% in non-DCD donors to 78.3% in 2022 as a consequence of projected obesity: percent recovery = exp(3.44 − 0.0615 × BMI) / [1 + exp(3.44 − 0.0615 × BMI)]. One- and five-year graft survival for all eras combined was 90.4% and 71.7%, respectively. In logistic regression, neither DRI nor BMI was a significant predictor of graft outcome at 1 year (P=0.630 and P=0.410) or 5 years (P=0.912 and P=0.855). A Mann-Whitney U test indicated no significant differences in DRI or BMI between functioning and non-functioning grafts at 1 and 5 years post-transplant. Conclusions: BMI predicts liver utilization but not graft outcomes, suggesting some potential donors are not being utilized.
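The authors' logistic recovery model can be evaluated directly for any mean donor BMI. The BMI inputs below are illustrative: the two era means quoted above, plus 35.1, a hypothetical mean BMI at which the model returns roughly the projected 78.3% recovery rate.

```python
import math

def predicted_recovery(bmi):
    """Authors' logistic model:
    percent recovery = exp(3.44 - 0.0615*BMI) / (1 + exp(3.44 - 0.0615*BMI))."""
    logit = 3.44 - 0.0615 * bmi
    return math.exp(logit) / (1.0 + math.exp(logit))

# Era-1 mean, era-3 mean, and a hypothetical projected mean BMI
for bmi in (23.6, 28.6, 35.1):
    print(f"BMI {bmi:.1f}: predicted recovery {predicted_recovery(bmi):.1%}")
```

The negative BMI coefficient (−0.0615 on the logit scale) is what drives the projected decline: each unit of mean BMI reduces the odds of recovery by about 6%.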
NUTRITION INTERVENTION AND SURVIVAL TO LIVER TRANSPLANTATION: A POSITIVE IMPACT ON WAIT LIST OUTCOMES Shackel Nicholas1, Lin Amelia2, Vidot Helen3, Potter Alison4
1AW Morrow Gastroenterology & Liver Centre, Centenary Institute of Cancer Medicine and Cell Biology, Sydney, 2School of Medicine, University of Sydney, 3Dept Nutrition & Dietetics, Royal Prince Alfred Hospital, Sydney, 4Discipline of Medicine, University of Sydney
Background: Malnutrition and sarcopenia have been demonstrated to be strong prognostic indicators of survival prior to liver transplantation. Methods: We conducted a retrospective analysis of 280 adult patients assessed for liver transplantation between 2012 and 2014. Parameters investigated included severity of liver disease, serum albumin (a marker of hepatic synthetic function), subjective global assessment (liver) [SGA (liver)] and survival. The patients were divided into two groups: well-nourished patients, and malnourished patients receiving nutritional supplementation. Results: Patients were identified as requiring nutritional supplementation based on the SGA score. The group receiving nutritional supplementation had an SGA score consistent with moderate or severe malnutrition in 69% of cases, in contrast to only 8% of the group that did not (P<0.001). Serum albumin was shown to be a poor indicator of malnutrition in patients with cirrhosis. Importantly, there was no statistically significant difference in serum albumin or MELD score between the two groups at the time of activation and at the time of transplant or delisting. Both groups progressed to liver transplantation, or were activated for transplantation, at equivalent rates. Further, there was no significant difference in time spent on the waiting list or in survival to transplantation between the two groups. Conclusion: Among patients assessed for liver transplantation, those with significant malnutrition had an outcome equivalent to that of well-nourished patients following nutritional supplementation. Historically this malnourished group is known to have a significantly worse outcome.
Therefore, nutritional assessment of patients on the transplant waiting list is required to identify individuals who require supplementation.
INFLUENCE OF HEART FAILURE DIAGNOSIS ON SURVIVAL BENEFIT AFTER HEART TRANSPLANTATION Karas PL1, Granger E2, Jansz P2, Spratt P2, Hayward C2, Jabbour A2, Keogh A2, Kotlyer E2, Macdonald P2, Dhital K1,2
1School of Medicine, University of New South Wales, Sydney, 2Cardiothoracic Department, St Vincent's Hospital, Sydney
Optimisation of heart allocation is essential given the current shortage of donor hearts. Improved heart allocation should consider recipient-specific survival benefit, which is yet to be established for all patient groups. In this study, the survival benefit conferred by transplantation was assessed for different end-stage heart failure diagnosis groups. All adult heart transplant candidates registered for HT at St. Vincent's Hospital between 1995 and 2012 were included (n=585). Patients were allocated to one of five diagnosis groups: Congenital Heart Disease (CHD), Cardiomyopathy (CM), Ischaemic Heart Disease (IHD), Idiopathic Dilated Cardiomyopathy (IDCM) and Other. CM patients were diagnosed with non-idiopathic dilated CM or non-dilated CM. Using Cox regression with transplantation as a time-varying covariate, the survival benefit of transplantation was assessed by estimating the time at which the initially high risk of mortality post-transplantation fell below the pre-transplant risk (crossover point). This model included pre-transplant deaths. Transplantation conferred a significant benefit for all disease groups except patients with CHD (P=0.308), likely due to sample size limitations. The crossover point, indicating a survival advantage, was achieved in all groups except patients with CHD (Figure). Patients with CM attained a survival benefit at 220 days after transplantation, followed by IDCM and IHD patients at 677 and 730 days, respectively. All diagnosis groups achieved a significant survival benefit, except patients with CHD. Survival benefit should be assessed when considering patients for transplantation, as it may maximise the utility of donor hearts.
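Treating transplantation as a time-varying covariate, as the abstract describes, requires restructuring each patient's follow-up into pre- and post-transplant intervals (the counting-process data format). A minimal sketch of that restructuring, using invented example patients rather than the study data, might look like:

```python
from dataclasses import dataclass

@dataclass
class Interval:
    patient_id: int
    start: float        # days since listing
    stop: float         # end of interval (days since listing)
    transplanted: int   # time-varying covariate: 0 before, 1 after transplant
    event: int          # 1 if death occurred at `stop`, else 0 (censored)

def split_followup(patient_id, followup_end, transplant_day, died):
    """Split one patient's listing-to-death/censoring follow-up at the
    transplant date, so `transplanted` can enter a Cox model as a
    time-varying covariate."""
    if transplant_day is None or transplant_day >= followup_end:
        # Never transplanted during follow-up: one interval, covariate 0.
        return [Interval(patient_id, 0.0, followup_end, 0, int(died))]
    return [
        Interval(patient_id, 0.0, transplant_day, 0, 0),                # pre-transplant
        Interval(patient_id, transplant_day, followup_end, 1, int(died)),  # post-transplant
    ]

# Hypothetical patients: one transplanted at day 90 who died at day 400,
# and one who died on the waiting list at day 150.
rows = split_followup(1, 400.0, 90.0, died=True) + split_followup(2, 150.0, None, died=True)
```

In this long format, the fitted hazard ratio for `transplanted` compares post-transplant with waitlist mortality, and the crossover point is where the estimated cumulative risks intersect.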
Cells (including islets) and tissues/ischaemia reperfusion injury
ISLETS EXPRESSING PD-L2 REGULATE IMMUNE RESPONSES BY SUPPRESSING T-CELL PROLIFERATION Stead Sebastian O1, Penko Daniella2, Johnston Julie2, Drogemuller Chris2, Coates Patrick T2, Rojas-Canales Darling2
1Discipline of Medicine, University of Adelaide, 2Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital
Aim: T cell depleting antibody is associated with an increased risk of cancer after kidney transplantation, but whether greater exposure is associated with a higher incidence of cancer remains unclear. We aimed to determine the association between the cumulative dose of T cell depleting antibody and the risk of cancer after kidney transplantation. Methods: Using data from the Australian and New Zealand Dialysis and Transplant Registry (ANZDATA) between 1997–2012, we assessed the risk of incident cancer among those who had received T cell depleting antibody for induction and/or rejection, stratified by T cell depleting antibody dose in tertiles, using adjusted Cox proportional hazards models. Results: Of 503 kidney transplant recipients, 276 (55%), 209 (41%) and 18 (4%) patients received T cell depleting antibody for induction, rejection, or induction and rejection, respectively. The overall cancer incidence rate was 1,118 cancers per 100,000 patient-years, with 975, 1,093 and 1,377 cancers per 100,000 patient-years among those who had received 1–5 doses, 6–10 doses and >10 doses, respectively. There was no association between T cell depleting antibody dose and risk of incident cancer (1–5: referent; 6–10: adjusted hazard ratio [HR] 1.19, 95% CI 0.48–2.95; >10: adjusted HR 1.42, 95% CI 0.50–4.02; p-value 0.801).
Conclusion: Although the incidence of cancer after transplantation is greater in recipients who have received a higher number of T cell depleting antibody doses, a dose-dependent relationship between dose of T cell depleting antibodies and cancer risk was not observed, likely reflecting a small number of incident cancers in our cohort.
IMMUNE-MODIFYING MICROPARTICLE THERAPY REDUCES KIDNEY ISCHEMIA REPERFUSION INJURY Wu Huiling1,2, Getts Daniel3, Ma Jin2, Chen Xiaochen2, Van Vreden Caryn3, King Nicholas3, Chadban Steven1,2
1Department of Renal Medicine, Royal Prince Alfred Hospital, Sydney, 2Kidney Node Laboratory, The Charles Perkins Centre, University of Sydney, 3The Discipline of Pathology, School of Medical Sciences, University of Sydney
Aim: Inflammatory monocyte-derived effector cells play a significant role in the pathogenesis of numerous inflammatory diseases including kidney ischemia-reperfusion injury (IRI). However, specific therapeutic agents to target these cells are not available. Recently, we reported the capacity of immune-modifying microparticles (IMPs) to bind to circulating inflammatory monocytes/macrophages (Mφ) via the specific scavenger receptor MARCO, thereby causing removal of inflammatory Mφ in the spleen with subsequent protection in models of infection, autoimmunity and ischemic injury. Here we investigated the therapeutic potential of IMPs to target Mφ in kidney IRI. Methods: Kidney ischemia was induced for 22 minutes followed by reperfusion. C57BL/6 mice were randomized to receive 300 µl (1.46 × 10^10 particles/ml, 500 nm in diameter) of negatively charged IMPs or neutral microparticles as a control by tail vein injection 2 hours after ischemia, continued daily for 3 days. Samples were collected at days 1 and 5 after reperfusion. Results: Mice treated with IMPs were protected against kidney IRI, with lower serum creatinine and less tubular damage versus control mice at days 1 and 5 (P<0.05–0.01). Tubulo-interstitial accumulation of neutrophils (day 1, P<0.05) and CD68+ macrophages (day 5, P<0.05) was markedly less in mice treated with negatively charged IMPs versus controls. IFN-γ expression in IRI kidneys at day 1 was significantly reduced, while TGF-β expression was increased at day 5, by negatively charged IMP treatment (P<0.05). Conclusion: IMP infusion affords significant protection from kidney IRI in mice, which may be associated with inhibition of inflammatory Mφ migration and function.
CCR2+ MONOCYTE DERIVED DENDRITIC CELLS CONTRIBUTE TO EARLY GRAFT DYSFUNCTION OF MHC MISMATCHED ISLET TRANSPLANTS Chow Ke Vin1,2, Sutherland Robyn1, Zhan Yifan1, Lew Andrew1
1Department of Immunology, Walter and Eliza Hall Institute of Medical Research, Melbourne, 2Department of Nephrology, Royal Melbourne Hospital
Background: Islet transplantation can cure type 1 diabetes, but is limited by lack of donor organs and early allograft dysfunction, such that most patients require at least 2 islet transplants. CCR2+
monocytes differentiate into monocyte-derived dendritic cells (moDCs) during inflammation, and may impair islet engraftment and primary function. Methods: To determine the role of moDCs, we transplanted 400 MHC-mismatched islets from BALB/c mice (H-2d) into C57BL/6 hosts (H-2b) modified to express the primate diphtheria toxin receptor under control of the CCR2 promoter (CCR2.DTR), allowing diphtheria toxin-induced conditional depletion of CCR2+ cells. Host mice received pre-transplant streptozotocin to induce diabetes. Diphtheria toxin 20 ng/g (DT+) (n=10) or normal saline (control) (n=9) was administered every second day from day −4. Graft function was determined by measuring blood glucose (BG). Results: DT+ mice showed absence of CCR2+ monocytes in peripheral blood and a significant reduction in moDCs at the graft site. Pre-transplant BGs in DT+ and control mice were equivalent (27.0±1.3 mmol/L vs 29.6±1.1 mmol/L, P=0.159). DT+ mice achieved significantly lower BGs than control mice at day 1 (11.0±1.8 mmol/L vs 19.1±1.4 mmol/L, P=0.004) and day 3 (7.1±0.8 mmol/L vs 15.4±2.0 mmol/L, P=0.003). Conclusions: CCR2+ moDCs are involved in an innate immune response that mediates early dysfunction of transplanted islet allografts. Further studies are required to determine the mechanism of this effect.
MONOCYTE DERIVED DENDRITIC CELLS POORLY STIMULATE CD4+ T CELL PROLIFERATION BY DIRECT AND INDIRECT ANTIGEN PRESENTATION BUT POTENTLY INDUCE TH POLARISATION Chow Ke Vin1,2, Zhan Yifan1, Sutherland Robyn1, Lew Andrew1 1Department of Immunology, Walter and Eliza Hall Institute of Medical Research, Melbourne, 2Department of Nephrology, Royal Melbourne Hospital
Background: Transplant antigens can be recognised by direct and indirect presentation. How heterogeneous dendritic cell (DC) subsets participate in and coordinate the presentation of transplant antigens remains poorly understood. Method: We compared direct and indirect presentation by two DC subsets, conventional DCs (cDCs) and monocyte-derived DCs (moDCs), in vivo and ex vivo, with particular regard to their ability to prime T cell proliferation and differentiation. Results: moDCs are rare in the steady state but increase significantly in response to allogeneic stimuli. They are significantly less efficient than cDCs at inducing CD4+ T cell proliferation through direct and indirect antigen presentation. This difference was not due to differential MHC II expression or cell survival. Despite their poor ability to drive T cell proliferative responses, moDCs were potent at directing Th1 and Th17 differentiation, whilst inhibiting Th2 differentiation. moDCs also potently reduced the ability of cDCs to stimulate T cell proliferation in vitro and in vivo. Such inhibition was, at least partly, dependent on nitric oxide production, but independent of antigen presentation by moDCs. Conclusion: These results highlight the complexity of interaction between DC networks and T cells, particularly under conditions where moDCs become abundant.
P2X7 RECEPTOR BLOCKADE REDUCES THE CONCENTRATION OF CIRCULATING INTERFERON GAMMA IN A HUMANISED MOUSE MODEL OF GRAFT-VERSUS-HOST DISEASE Geraghty Nicholas1,2, Belfiore Lisa1,2, Mullany Phillip1,2, Alexander Stephen3, Sluyter Ronald1,2, Watson Debbie1,2
1School of Biological Sciences, University of Wollongong, 2Illawarra Health and Medical Research Institute, University of Wollongong, 3Paediatrics and Child Health, Children's Hospital, Westmead
Activation of the P2X7 receptor channel by extracellular ATP has been implicated in allogeneic mouse models of graft-versus-host disease (GVHD). Aim: To investigate P2X7 blockade as a therapeutic strategy to prevent GVHD in humanised mice. Method: ATP-induced cation uptake into human and murine leukocyte lines was measured by flow cytometry. NOD-SCID-IL2Rγnull (NSG) mice were injected intra-peritoneally (i.p.) with 10 × 10^6 human peripheral blood mononuclear cells (hPBMC) (day 0). Humanised mice were subsequently injected i.p. every second day (days 0–8) with the P2X7 antagonist Brilliant Blue G (BBG) (50 mg/kg) or saline. Mice were scored for incidence and severity of GVHD. Engraftment of hPBMCs was assessed by flow cytometry and circulating interferon gamma (IFN-γ) was assessed by ELISA. Results: BBG prevented ATP-induced cation uptake into both human and murine leukocyte lines in a concentration-dependent manner. BBG did not affect hPBMC engraftment in the blood at three weeks post-injection or in the spleens at the time of euthanasia. At both time points the majority of hPBMCs were T cells. There was no difference in clinical scores or survival between humanised mice injected with BBG or saline. However, mice injected with BBG demonstrated a significant reduction in circulating IFN-γ. Conclusions: The regime used in the current study to block P2X7 in humanised mice does not prevent the clinical manifestations of GVHD. However, the reduction in circulating IFN-γ warrants further investigation. Further studies will investigate BBG administration over the course of the model to examine the impact on GVHD.
THE WESTMEAD ISLET TRANSPLANT PROGRAM 12-YEAR OUTCOMES Chew YV1, Williams LJ1, Davies SM1, Liuwantara D2, Burns H2, Hawkes J1, Patel AT1, Jimenez-Vera E2, O'Connell PJ1, Hawthorne WJ1
1Centre for Transplant and Renal Research, Westmead Hospital, Sydney, 2Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney
Aims: To establish donor factors that influence islet isolation outcomes in the Westmead Islet Transplant Program, and to identify donor characteristics that predict isolation outcomes reaching release criteria for transplantation. Methods: Islets were isolated from pancreases of heart-beating deceased donors using collagenase and NP (SERVA). Islet preparations were divided into transplanted and non-transplanted groups, and the following factors and outcomes were compared: total islet equivalents (IEQ), IEQ/gram (IEQ/g) pancreas, donor age, body mass index (BMI), pancreas weight and cold ischaemic time (CIT). The transplanted group was further stratified by donor age into 10-year cohorts to determine the effect of donor age on total IEQ and IEQ/g pancreas obtained. Results: 180 islet isolations between 2003 and 2015 were evaluated. Transplanted preparations (n=44) had significantly higher total IEQ and IEQ/g pancreas compared to their non-transplanted counterparts (n=136). Donor characteristics predictive of a suitable islet preparation were shorter CIT (360±21 vs 423±12 min, P=0.009), larger BMI (30.3±0.9 vs 27.6±0.4 kg/m2, P=0.002) and greater pancreas weight (101±3 vs 87±2 g, P=0.0005). When stratified by donor age, preparations from the 40–49 yr cohort achieved a significantly greater number of IEQ (788,810±90,863, P=0.03) and IEQ/g pancreas (7,683±821, P=0.01) than those from >50 yr old donors, which yielded lower numbers of IEQ and IEQ/g pancreas. Conclusions: Successful islet isolation outcomes correlated with decreased CIT, increased donor BMI and larger pancreas size. Islet yields were highest from the 40–49 yr donor age cohort compared with 50–59 yr old donors.
DEFINING THE INTRACELLULAR MECHANISMS TO PRESERVE ISLET CELLS FROM THE INSTANT BLOOD MEDIATED INFLAMMATORY REACTION Liuwantara David1, Chew Yi Vee2, Burns Heather1, Hawkes Joanne2, Williams Lindy2, Davies Sussan2, O'Connell Philip J1, Hawthorne Wayne J1
1Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney, 2Centre for Transplant and Renal Research, Westmead Hospital, Sydney
Introduction: Human islet transplants must survive the immediate effect of the instant blood mediated inflammatory reaction (IBMIR) as they come into contact with the circulation after infusion into the portal circulation. The aim of this study was to determine the early activation pathways involved in the initiation of the IBMIR response. Methods: An in vitro assay mimicking IBMIR was developed using platelet poor plasma (PPP), so that islet destruction was reduced but the cellular mechanisms of IBMIR were activated. Islet viability following PPP-induced IBMIR was evaluated using an extracellular flux analyser. Results: Baseline oxygen consumption rate (OCR) of human islets incubated in PPP was reduced by ~54% (untreated islets (CTL): 370±78 vs. PPP-treated: 169±55 nMoles/min.mg DNA, P<0.005). Similarly, oxidative respiration was reduced by ~55% (CTL: 180±40 vs. PPP-treated: 80±33 nMoles/min.mg DNA, P<0.005). PPP reduced islet total mitochondrial capacity by ~59% (CTL: 425.4±94 vs. PPP-treated: 241±53 nMoles/min.mg DNA, P<0.005); reserve capacity was not significantly reduced. In contrast, the glucose-stimulated OCR ratio (glucose:baseline) of the PPP-treated group was higher than control (CTL: 1.2-fold±0.16 vs. PPP-treated: 1.74-fold±0.49, P<0.005). Conclusion: Incubation of human islets with PPP leads to mitochondria-associated damage prior to any interactions with inflammatory leukocytes such as neutrophils. Hence, pre-treatment of human islets with therapeutic agents capable of reducing mitochondrial damage, or enhancing mitochondrial function, may prove beneficial for the protection of human islets after intra-portal transplantation.
THE HUMAN IMMUNE RESPONSE IN HUMANISED MICE WITH AND WITHOUT GRAFT-VERSUS-HOST DISEASE (GVHD) Watson Debbie1,2, Geraghty Nicholas1, Belfiore Lisa1, Mullany Phillip1, Alexander Stephen2, Sluyter Ronald1
1Illawarra Health and Medical Research Institute (IHMRI), The University of Wollongong, 2Centre for Kidney Research, The Children's Hospital at Westmead, Sydney
Graft-versus-host disease (GVHD) is a major complication following bone marrow transplantation, with a high mortality rate. Aims: To further understand the human immune response in the development of GVHD using a preclinical model. Methods: NOD-SCID-IL2Rγnull (NSG) mice were injected intraperitoneally with 10 × 10^6 human peripheral blood mononuclear cells (hPBMCs). Flow cytometry was used to check for human cell engraftment with human CD45 at 3 weeks post-injection. Spleens were assessed with the human leukocyte marker CD45 and the human T cell markers CD3, CD4 and CD8. Mice were assessed for GVHD symptoms, including weight loss and increased clinical scores, at 4–8 weeks. Results: NSG mice injected with hPBMCs showed similar levels of engraftment of human CD45+ T cells at 3 weeks post-injection. Engrafted human cells were predominantly human CD3+ T cells and included both CD4+ and CD8+ T cells. The majority of engrafted humanised mice developed GVHD. Despite similar levels of human cell engraftment, some mice did not develop GVHD, potentially providing important insight into GVHD development. Mice with GVHD showed a higher splenic CD4:CD8 ratio compared to engrafted mice without GVHD. Increased human IFN-γ was observed in the serum of mice with GVHD compared to mice without GVHD. Similar levels of human cytokines (IL-6, TNFα, IL-2, IFN-γ) were observed in the spleens of humanised mice with or without GVHD. However, significantly increased IL-17A expression was observed in the spleens of humanised mice with GVHD. Conclusions: This study indicates a potential role for human IFN-γ and IL-17A in the development of GVHD in humanised mice.
CHARACTERISTIC TRENDS OF THE WITHDRAWAL PERIOD IN DCD DONORS – IMPLICATIONS FOR CLINICAL DCD CARDIAC TRANSPLANTATION Chew Hong Chee1,2,3, Iyer Arjun1, Gao Ling1, Doyle Aoife1, Villanueva Jeanette1, Hicks Mark1, Jabbour Andrew1,4, Dhital Kumud1,3, Macdonald Peter1,4
1Transplantation Laboratory, Victor Chang Cardiac Research Institute, Sydney, 2School of Medicine, University of New South Wales, Sydney, 3Department of Surgery, St Vincent's Hospital, Sydney, 4Department of Cardiology, St Vincent's Hospital, Sydney
Aim: Normothermic ex-vivo perfusion (NEVP) has facilitated the advent of human DCD cardiac transplantation. The sensitivity of the heart to warm ischaemia (WI) and the use of donor blood in NEVP provide important reasons to characterise the haemodynamic and metabolic derangements during withdrawal of life support. Methods: In a porcine asphyxia model, we characterised the haemodynamic, metabolic (pH, oxygen, lactate, troponin-T), biochemical (K) and catecholamine (noradrenaline (NA) and adrenaline (Ad)) changes during withdrawal periods ranging from 20–40 minutes.
Results: In the absence of spontaneous respiration, rapid desaturation occurred, with PaO2 falling to 13.6±0.6 mmHg by 5 min post-withdrawal. This was associated with a significant rise in central venous pressure (CVP 5.8±2.0 to 14.3±2.8 mmHg) and mean pulmonary artery pressure (MPAP 16.6±2.1 to 27.2±3.2 mmHg) and a fall in left atrial pressure (LAP 8.0±1.0 to 3.5±2.5 mmHg). Thereafter central pressures gradually equalised (Figure). With increasing WI, progressive derangements in metabolic and biochemical parameters were evident. By 40 min post-withdrawal, pH dropped to 7.17, lactate peaked at 12.8 mmol/L, troponin-T at 202.4 mg/L and potassium at 12 mmol/L. A surge of systemic NA and Ad was noted 4 min post-withdrawal, with continued cardiac release of both catecholamines up to 20 min post-withdrawal. Conclusion: Dramatic haemodynamic and metabolic changes occur during the withdrawal period, notably acute pulmonary vasoconstriction, profound acidosis, myocardial ischaemia and hyperkalaemia. Although these changes cannot be avoided, there is scope for postmortem interventions to limit these derangements and to optimise the perfusate for NEVP.
[Figure: pressure changes (PAP, CVP and LAP, in mmHg) versus time (s) during withdrawal in all porcine experiments (n=12).]
Immunobiology: DC, NK cells and other
ALLO-HLA REACTIVITY BY HIV-SPECIFIC T CELLS: IMPORTANT IMPLICATIONS FOR SOLID ORGAN TRANSPLANTATION IN HIV SEROPOSITIVE RECIPIENTS Almeida Coral-Ann1,2,3, Van Miert Paula4, Zoet Yvonne4, Witt Campbell1,2, Claas Frans4, John Mina2,3, D'Orsogna Lloyd1,2
1School of Pathology and Laboratory Medicine, University of Western Australia, Perth, 2Department of Clinical Immunology, Fiona Stanley Hospital, Perth, 3Institute for Immunology and Infectious Diseases, Murdoch University, Perth, 4Department of Immunohematology and Blood Transfusion, Leiden University Medical Centre, Leiden
Introduction: Solid organ transplantation is increasingly being performed in patients with chronic HIV infection, with variable outcome. In particular, HIV seropositive recipients still experience acute rejection episodes despite the presence of CD4 T cell immunodeficiency. We have recently reported that allo-HLA crossreactivity by EBV, CMV, VZV and influenza virus-specific T cells is common, and we therefore hypothesized that HIV-specific T cells themselves can trigger direct T cell mediated allorecognition.
Methods: Multiple HIV-1-specific CD8 T cell clones were generated using single cell sorting based on HIV peptide/HLA tetrameric complex staining. The generated T cell clones were assayed for alloreactivity against a panel of single HLA expressing cell lines, using a cytokine assay, CD137 upregulation and cytotoxicity as readouts. Results: HIV-specific CD8 T cells crossreacted with allogeneic HLA molecules. A Gag RK9/HLA-A3 specific T cell clone with TCR Vbeta 23 recognised allogeneic HLA-A*69:01. Two different Gag GL9/HLA-B7 restricted T cell clones, with Vbeta 22 and an unknown Vbeta, recognized allogeneic HLA-A*33:03. A KK10/HLA-B27 restricted T cell clone with Vbeta 5.1 recognized allogeneic HLA-A*33:03 and HLA-B*57:01. Allo-HLA reactivity by HIV-specific T cells was specific to the HIV target peptide/HLA restriction and Vbeta usage of the T cells. Overall, 4/25 HIV-specific T cell clones tested recognized at least one allogeneic HLA molecule. Conclusion: HIV-specific T cells crossreacted against allogeneic HLA molecules, which may have important clinical implications in the transplant setting. HIV-specific CD8 memory T cells may augment acute cellular rejection (direct allorecognition) despite the presence of relative CD4 T-cell deficiency.
HIV antigen | HLA restriction | Viral peptide      | Vbeta usage | Allo-HLA crossreactivity
Gag         | A3              | RLRPGGKKK (RK9)    | 23          | A*69:01
Gag         | B7              | GPGHKARVL (GL9)    | 22          | A*33:03
Gag         | B7              | GPGHKARVL (GL9)    | Unknown     | A*33:03
Gag         | B27             | KRWIILGLNK (KK10)  | 5.1         | A*33:03 and B*57:01
DRUG INDUCED ALLOREACTIVITY: A NEW PARADIGM FOR ALLO-RECOGNITION Almeida Coral-Ann1,2,3, Van Miert Paula4, Zoet Yvonne4, Witt Campbell1,2, Claas Frans4, John Mina2,3, D'Orsogna Lloyd1,2
1School of Pathology and Laboratory Medicine, University of Western Australia, Perth, 2Department of Clinical Immunology, Fiona Stanley Hospital, Perth, 3Institute for Immunology and Infectious Diseases, Murdoch University, Perth, 4Department of Immunohematology and Blood Transfusion, Leiden University Medical Centre, Leiden
Introduction: Abacavir administration is associated with drug-induced hypersensitivity reactions in HIV patients expressing HLA-B*57:01. However, the immunological effect of abacavir administration in an HLA-B57 mismatched transplantation setting has not been studied. We hypothesized that abacavir exposure would induce de novo HLA-B57 specific allorecognition. Methods: Multiple HIV-specific CD8 T cell clones were generated from HIV infected patients negative for the HLA-B57 antigen, using single cell sorting based on HIV peptide/HLA tetrameric complex staining. The generated T cell clones were assayed for alloreactivity against a panel of single HLA expressing cell lines (SALs), in the presence or absence of abacavir. Cytokine production, CD137 upregulation and cytotoxicity were used as readouts. Results: Abacavir exposure induced de novo HLA-B57 allorecognition by HIV-specific T cells. A Gag RK9/HLA-A3 specific T cell clone, from an HLA-B57 negative HIV patient, recognized allogeneic HLA-B57 in the presence of abacavir. Abacavir did not induce recognition of any other allogeneic HLA molecules. Another clone from the same patient
with the same specificity, but with different TCR Vbeta usage, did not recognize allogeneic HLA-B57 in the presence of abacavir, suggesting that TCR Vbeta specificity is important in allorecognition of the drug. Conclusion: The results presented here provide the first evidence that administration of a drug can induce specific allorecognition of mismatched HLA molecules in the transplant setting. Furthermore, HIV-specific memory T cells themselves may participate in abacavir-induced alloreactivity. We suggest that HIV-positive recipients of an HLA-B57 mismatched graft should not receive abacavir until further studies are completed.
MODIFICATION OF THE "ONE STUDY" PANELS FOR WHOLE BLOOD IMMUNOPHENOTYPIC MONITORING FOR CLINICAL TRIALS AT WESTMEAD Chen Hsiao-Ting1,2, Dervish Suat3, Wang Xin Maggie3, Keung Karen4,5, Liuwantara David4,5, Jimenez-Vera Elvira4, Yi Shounan4,5, Hawthorne Wayne4,5, Alexander Stephen5,6, O'Connell Philip4,5, Hu Min1,2
1Centre for Transplant and Renal Research, University of Sydney, 2Westmead Millennium Institute, Westmead Hospital, Sydney, 3Flow Cytometry Core Facility, Westmead Research Hub, Westmead, 4Centre for Transplant and Renal Research, Westmead Millennium Institute, Westmead Hospital, Sydney, 5University of Sydney, 6Centre for Kidney Research, The Children's Hospital at Westmead, Sydney
Aims: 1) To establish whole blood immunophenotyping for patients undergoing cellular immunotherapy for regulation of their immune system. 2) To determine whether there are age-related differences in the cellular phenotype of healthy donors. 3) To determine whether there are differences between fresh and frozen PBMC. Methods: Each panel was designed to mirror an equivalent One Study panel. Staining of whole blood was performed within 4 hours of blood sample collection. The samples were screened on an LSRFortessa (BD). BD FACSDiva and FlowJo were used for data analysis. Application settings and BD Cytometer Setup and Tracking beads were used to monitor cytometer performance to ensure consistency of results over time. Titration of antibodies was based on the signal/noise ratio [the Mean Fluorescence Intensity (MFI) of the positive peak to the MFI of the negative peak] for each single antibody. Results: Seven leukocyte profiling panels, containing 8 to 10 marker antigens for monitoring the major leukocyte subsets as well as characteristics of T cells including Tregs, B cells, dendritic cells (DC) and monocyte subsets, were modified. Comparable data were obtained with the One Study panels and the newly derived panels. Additionally, these panels could be used on frozen PBMC samples. Conclusions: Immunophenotyping of whole blood using the One Study protocol can be transferred to other platforms such as the LSRFortessa, suggesting that similar data may be able to be generated across One Study and Westmead platforms.
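The titration rule described above (choose the dilution that maximises the positive-to-negative MFI ratio) can be sketched as follows; the dilutions and MFI values are invented for illustration, not data from the study.

```python
def best_titre(titration):
    """Pick the antibody dilution with the highest signal/noise ratio,
    where signal/noise = MFI of the positive peak / MFI of the negative peak."""
    return max(titration, key=lambda d: titration[d][0] / titration[d][1])

# dilution -> (MFI_positive, MFI_negative); hypothetical titration series
mfi = {
    "1:50":  (52000, 900),   # brightest stain, but high background
    "1:100": (48000, 400),   # best separation of positive from negative
    "1:200": (30000, 350),   # signal starting to fall off
}
print(best_titre(mfi))
```

Note that the brightest stain is not necessarily the best: the 1:50 dilution here has the highest positive MFI, but its elevated background gives it a worse signal/noise ratio than 1:100.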
DENDRITIC CELL TARGETING WITH POROUS SILICON NANOPARTICLES AS A POTENTIAL TOLERANCE INDUCTION STRATEGY Rose Peter1,2, McInnes Steven3, Kireta Svjetlana1, Carroll Robert1,4, Voelcker Nicolas3, Coates Patrick1,4, Jesudason Shilpanjali1,4
1Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital, 2University of Adelaide, 3Mawson Institute, University of South Australia, 4School of Medicine, University of Adelaide
Aims: Dendritic cells (DCs) are major antigen-presenting cells and important inducers of transplant tolerance. The DC-SIGN receptor is uniquely expressed on DCs, and is therefore a potential target for DC-specific therapy. Porous silicon nanoparticles (pSi NPs) are a vehicle for cell-specific drug therapy, as monoclonal antibodies to target-cell receptors can be conjugated to their surface. This study utilises a novel pSi NP specifically constructed to target the DC-SIGN receptor on DCs. Methods: DCs derived from human peripheral blood monocytes were cultured with pSi NPs bearing fluorescein isothiocyanate (FITC)-labelled anti-DC-SIGN monoclonal antibodies (mAbs) conjugated to their surface. Uptake of these pSi NPs at various time-points was compared to that of pSi NPs not labelled with any mAbs and pSi NPs labelled with isotype-control mAbs. Cell uptake of FITC-labelled NPs was assessed by flow cytometry and transmission electron microscopy (Figure 1). Results: NPs labelled with anti-DC-SIGN mAbs were preferentially taken up by DCs compared to unlabelled NPs and NPs labelled with isotype-control mAbs. This preferential uptake was time- and dose-dependent. The NPs themselves were not cytotoxic to DCs and did not induce changes in phenotype typical of DC maturation. DC-SIGN-negative antigen-presenting cells (monocytes) showed non-specific uptake of all NP constructs. Conclusions: This work demonstrates that this novel pSi NP construct is a feasible vehicle for targeting DCs in vitro. Further studies will explore delivery of immunosuppressive drugs to DCs via pSi NPs, as well as testing in in-vivo models, including non-human primates.
Figure 1. Transmission electron micrograph of a human monocyte-derived DC treated with pSi NPs labelled with anti-DC-SIGN monoclonal antibodies for 2 hours. Red arrows show the pSi NPs both inside the lysosome and on the surface of the DC.

Transplant complications

VIRAL SEROLOGICAL STATUS AND GRAFT FAILURE, DEATH AND CANCER RISK AFTER KIDNEY TRANSPLANTATION Lim Wai1, Chadban Steven2, Ferrari Paolo3, Pilmore Helen4, Hughes Peter5, Chakera Aron1, Russ Graeme6, Clayton Phil3, Wong Germaine7 1Department of Renal Medicine, Sir Charles Gairdner Hospital, Perth, 2Department of Renal Medicine, Royal Prince Alfred Hospital, Sydney, 3Department of Renal Medicine, Prince of Wales Hospital, Sydney, 4Department of Renal Medicine, Auckland City Hospital, 5Department of Renal Medicine, Royal Melbourne Hospital, 6Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital, 7Department of Renal Medicine, Westmead Hospital, Sydney

Background and Aims: Cytomegalovirus (CMV) and Epstein-Barr virus (EBV)-related disease are infrequent but associated with significant morbidity and mortality after kidney transplantation. Prior to the introduction of antiviral prophylaxis, seropositive recipients were shown to have a higher risk of CMV disease, graft loss and death compared to CMV seronegative recipients, but this association remains inconsistent. The aim of this study was to examine the association between donor (D)/recipient (R) CMV and EBV serological status and outcomes in kidney transplant recipients. Methods: We compared the risk of death-censored graft failure (DCGF), death with functioning graft (DFG) and incident cancer across donor/recipient CMV and EBV serological status using the Australia and New Zealand Dialysis and Transplant (ANZDATA) registry between 1990–2012, using adjusted Cox regression analysis. Results: Of 3553 primary kidney transplant recipients with available viral serological status, 1513 (42.6%) and 635 (17.9%) were CMV D+/R+ and D+/R− respectively, while 2786 (78.4%) and 328 (9.2%) were EBV D+/R+ and D+/R− respectively. Compared to CMV D−/R+, CMV D+/R− (adjusted hazard ratio [HR] 1.42, 95% CI 1.01, 1.00, P=0.043) and CMV D+/R+ recipients (adjusted HR 1.42, 95% CI 1.06, 1.88, P=0.017) had a significantly higher risk of DCGF but not DFG, independent of age, era and rejection. There was no association between EBV status and DCGF or DFG. EBV D+/R− recipients had a higher risk of incident cancer compared to EBV D+/R+ (adjusted HR 1.91, 95% CI 1.29, 2.84, P=0.001), particularly in those aged <50 years (adjusted HR 4.28, 95% CI 2.54, 7.21, P<0.001). Conclusions: CMV seropositive donors, regardless of recipient CMV serological status, are associated with an over 1.4-fold increased risk of DCGF, while EBV-naïve recipients who have received EBV seropositive donor kidneys have a higher risk of incident cancer. Clinicians should continue to be vigilant of the adverse impact of CMV seropositive donors on graft outcomes, even in the era of antiviral prophylaxis.

NON-MELANOMA SKIN CANCER MORTALITY IN KIDNEY TRANSPLANT RECIPIENTS Burke Michael1, Nadeau-Fredette Annie-Claire1,2, Badve Sunil1, Johnson David1, Pascoe Elaine3, Mcdonald Stephen4,5, Green Adele6,7, Carroll Robert4,8, Hawley Carmel1, Isbel Nicole1 1Department of Renal Medicine, University of Queensland, Princess Alexandra Hospital, 2Universite de Montreal, Montreal, Canada, 3School of Medicine, University of Queensland, Brisbane, 4Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital, 5ANZDATA, 6QIMR Berghofer Medical Research Institute, 7Cancer Research UK Manchester Institute, UK, 8Centre for Experimental Transplantation, University of Adelaide
Aims: To determine the frequency and patient characteristics of fatal Non-Melanoma Skin Cancer (NMSC) in Kidney Transplant Recipients (KTRs). Methods: This observational cohort study included all adult and paediatric KTRs transplanted in Australia and New Zealand between 1980 and 2013, using data from the Australia and New Zealand
Dialysis and Transplant (ANZDATA) Registry. Patient and NMSC characteristics in KTRs who died from NMSC were evaluated. Results: During the study period, 21875 transplant episodes occurred in 19344 patients. Of the 6780 patients who subsequently died, 229 (3.4%) died from NMSC. The NMSC deaths were due to squamous cell carcinoma (n=181, 79%), basal cell carcinoma (n=18, 8%), Merkel cell carcinoma (n=15, 7%), spindle cell carcinoma (n=1, 0.4%), fibrous histiocytoma (n=1, 0.4%) and unspecified NMSC (n=13, 6%). Of the 229 patients who died from NMSC, 170 (74%) were male, 224 (98%) were Caucasian, 217 (95%) were first-graft recipients and 217 (95%) died with a functioning graft. Among patients transplanted with a first graft who died from NMSC, the mean age at transplantation was 48.8±13 years and the mean age at death was 61.8±11 years. Conclusions: NMSC is an important contributor to mortality in kidney transplant recipients, particularly Caucasian males. In descending order of frequency, squamous cell, basal cell and Merkel cell carcinomas represent the most common types of NMSC causing death.
PANCREATIC CANCER AFTER LUNG TRANSPLANTATION FOR CYSTIC FIBROSIS Benzimra Mark, Malouf Monique, Plit Marshall, Havryk Adrian, Rigby Amy, Glanville Allan Lung Transplant Unit, St Vincent's Hospital, Sydney

Introduction: Cystic fibrosis (CF) is a lethal genetic disease which is usually associated with severe airflow limitation in addition to hepatobiliary, pancreatic and gastrointestinal dysfunction. Bilateral sequential lung transplantation (BLTx) is a recognised treatment option, and CF is one of its most common indications. As a result, selected patients with CF achieve a significant survival advantage. Methods: Single-centre retrospective analysis of 885 patients undergoing lung transplantation (LTx) from August 1989 to January 2015, to determine the incidence of pancreatic malignancy in patients transplanted for CF. Results: 212/885 (24%) transplants were performed for CF. BLTx:HLTx = 206:6, M:F = 92:120. Median age was 26 years (range 22–58 years), with median survival of 1666 days (range 2–8317 days). Four patients developed pancreatic adenocarcinoma: median age 46.5 years (range 44–59 years), median postoperative day 3551 (range 2501–5398 days). Median survival after diagnosis was 192 days (range 52–394 days). The first patient presented with right upper quadrant (RUQ) pain and obstructive jaundice requiring percutaneous drainage, complicated by sepsis; the second presented with RUQ pain and received palliative chemotherapy; the third presented with recurrent biliary strictures and obstructive jaundice requiring ERCP stenting/dilatation; the fourth presented with abnormal LFTs and RUQ pain and was treated with chemotherapy. Conclusion: Patients with CF who have extended survival after LTx appear to be at risk of malignancies rarely seen in the younger CF population. Careful monitoring and follow-up is therefore warranted.

THE EFFECT OF KIDNEY TRANSPLANTATION ON CARDIAC BIOMARKERS Pilmore Helen1, Pilmore Andrew, Stewart Ralph2, Sidhu Karishma2 1Auckland Renal Transplant Group, Auckland City Hospital, 2Cardiology Dept, Auckland City Hospital

Cardiac disease is common in patients with kidney failure and is associated with elevations in cardiac biomarkers. Aims: We examined changes in levels of the new 5th-generation high-sensitivity troponin T (hsTnT) occurring in patients after kidney transplantation. Methods: 52 consecutive renal transplant recipients underwent BNP and hsTnT testing on the day of kidney transplantation and at 1 week and 1 month post-transplantation. Results: There was a significant decline in hsTnT levels at 1 week after transplantation that persisted at one month (Table 1). Nevertheless, most patients had levels above the upper limit of normal at all measured time points. BNP levels decreased more slowly but remained elevated compared to the normal population. hsTnT levels correlated only with renal function.

Table 1. Cardiac biomarkers and renal function before and after kidney transplantation

                      Median    Mean     SD       P (vs baseline)
Troponin baseline     34        41.49    25       –
Troponin week 1       16        21.04    17.31    <0.001
Troponin month 1      24        30.47    21.64    0.006
Creatinine baseline   722       744.48   303.99   –
Creatinine week 1     142       244.9    226.22   <0.001
Creatinine month 1    123       150.86   115.34   <0.001
CKD-EPI baseline      <10 ml    –        –        –
CKD-EPI week 1        46.4      42.12    25.14    <0.001
CKD-EPI month 1       56.1      68.54    104.62   <0.001
BNP baseline          175       466.68   736.52   –
BNP week 1            199       525.78   850.58   0.762
BNP month 1           46.5      134.81   188.23   0.002

Conclusions: Despite a reduction in levels of the cardiac biomarkers hsTnT and BNP after kidney transplantation, most patients at one month continued to have elevated levels compared to the upper limit of normal for the general population.
MANAGEMENT OF CORONARY ARTERY DISEASE IN DIALYSIS PATIENTS Srikumar Gajan1, Webster Mark2, Pilmore Helen1 1Auckland Renal Transplant Group, Auckland City Hospital, 2Cardiology Dept, Auckland City Hospital
Coronary artery disease is common in renal transplant recipients; however, there is little evidence that revascularisation improves outcomes compared to medical management. Aims: To determine survival and cardiovascular outcomes in patients with ESRF after revascularisation compared with medical management. Methods: Survival and cardiovascular outcomes were examined in patients with ESRF who underwent coronary angiography between 2003 and 2012. These outcomes were compared between patients who underwent revascularisation [percutaneous intervention (PCI) +/− stenting and coronary artery bypass grafting (CABG)] and those who received medical treatment.
Results: 288 patients with ESRF had a diagnostic coronary angiogram, with a total of 382 angiograms undertaken. 91 (31.6%) patients underwent revascularisation (61 PCI, 30 CABG), 151 (52.4%) were treated medically and 46 (16.0%) required no cardiac treatment. The median survival was 3.30 (IQR 2.10–5.30) years in patients undergoing CABG, 2.92 (IQR 1.49–5.41) years in patients treated with PCI and 2.95 (IQR 1.27–5.47) years in patients managed medically. There was no significant difference in survival between the treatment modalities for the entire cohort (see figure), nor for patients with triple-vessel disease. Similarly, there was no difference in the incidence of Major Adverse Cardiac Events when comparing medical management with revascularisation. Conclusion: Overall there was no significant difference in survival between patients undergoing revascularisation procedures and patients managed medically. Survival was low for all management options, reflecting the poor survival of patients with ESRF on dialysis.

THE SPECTRUM OF COMPLICATIONS OF MTORI (MAMMALIAN TARGET OF RAPAMYCIN INHIBITORS) IN RENAL TRANSPLANT RECIPIENTS (RTR) – THE ROYAL MELBOURNE HOSPITAL (RMH) EXPERIENCE Das Gayatri1, Nicholls Kathy1,2, Hughes Peter1,2 1Department of Nephrology, Royal Melbourne Hospital, 2School of Medicine, Faculty of Health Sciences, University of Melbourne
Aims: To evaluate the incidence, severity and reversibility of mTORi side effects in our unit. Methods: Retrospective study of all RTR treated with mTORi at any time post-transplant and followed at RMH. Parameters recorded were the indication for mTORi; the presence, type and timing of complications during follow-up; and trough drug levels at 3 months post mTORi initiation and at latest follow-up. Results: Ninety-one patients met entry criteria (sirolimus n=54, everolimus n=37). Indications for mTORi were: clinical trial (n=39, including 26 from day 0), cancer (skin = 20, other = 14), calcineurin inhibitor toxicity (n=17) and patient request (n=1). Median duration of mTORi therapy was 8.4 years (range 7 days–15.8 years). Complications related to mTORi (Table 1) were observed in 65 of the 91 patients (71%), but only 8 (9%) required drug cessation. 164 episodes were observed: 120 in 41/54 sirolimus patients and 44 in 24/37 everolimus patients. Median time between initiating mTORi and the first complication was 1.6 years (range 4 days–13.6 years). After complications were recognised, 57 patients continued mTORi at reduced dose, including 5 patients who required lymphocoele aspiration. Eight patients ceased mTORi due to severe lymphoedema (n=2), pulmonary fibrosis/pneumonitis (n=3), nephrotic-range proteinuria (n=2) and bone marrow suppression (n=1). All side effects of mTORi reversed after dose alteration or cessation. Drug level at 3 months was higher in the 65 patients with complications than in the 26 without problems (Mann-Whitney Z=2.4572, P=0.01), but current levels were similar. Conclusions: In mTORi-treated RTR, side effects are common, though usually treatable or self-limited.
Table 1: Side effects in sirolimus- and everolimus-treated renal transplant recipients.

COMPLICATIONS                    SIROLIMUS (n=54)  EVEROLIMUS (n=37)  TOTAL EPISODES
EDEMA                            32                13                 45
HYPERGLYCAEMIA                   13                3                  16
ANAEMIA                          24                11                 35
LYMPHOEDEMA                      4                 2                  6
LYMPHOCOELE                      21                4                  25
CELLULITIS                       0                 1                  1
WOUND INFECTION                  2                 1                  3
FOLLICULITIS                     2                 0                  2
MOUTH ULCERS                     7                 7                  14
ECZEMA                           0                 1                  1
HYPERLIPIDAEMIA                  1                 0                  1
PROTEINURIA                      10                0                  10
BONE MARROW SUPPRESSION          1                 1                  2
PNEUMONITIS/PULMONARY FIBROSIS   3                 0                  3
TOTAL                            120               44                 164

Figure: Kaplan-Meier survival curves comparing medical management, percutaneous intervention and coronary artery bypass grafting for patients with end-stage renal failure.
EARLY IMPACT OF OROPHARYNGEAL AND/OR LARYNGEAL DYSFUNCTION AFTER LUNG AND HEART TRANSPLANTATION Black Rebecca1,2, Glanville Allan3, Bogaardt Hans2, Macdonald Peter4, Mccabe Patricia2, Nair Priya5, Madill Catherine2 1Speech Pathology Department, St Vincent's Hospital, Sydney, 2Faculty of Health Sciences, University of Sydney, 3Lung Transplant Unit, St Vincent's Hospital, Sydney, 4Cardiac Transplant Unit, St Vincent's Hospital, Sydney, 5Intensive Care Medicine, St Vincent's Hospital, Sydney
Purpose: There are minimal data regarding oropharyngeal and/or laryngeal dysfunction, defined as dysphagia and/or dysphonia, after lung and/or heart transplantation. A risk of aspiration and increased length of stay have been reported. Hence, we examined the association between patient demographics, medical risk factors and post-operative outcomes and referral to Speech Pathology (SP) for swallowing and/or voice assessment and management. Methods: Single-centre retrospective database analysis of demographic data, patient risk factors, post-operative course and complications. Variables were analysed to investigate any association with referral to SP and morbid outcomes. Results: 69/284 (24%) patients transplanted 2010–2013 were referred to SP. Male:female = 56%:44%, mean age 47 years. 62% bilateral lung, 36% heart, 1% single lung and 1% heart-lung transplant. Total intubation time was greater for patients referred to SP (248±430 vs 56±130 hours; P<0.0001), as were the number of intubations required (1.07±0.72 vs 1.03±0.25; P=0.013), total days spent in ICU (29.7±30.6 vs 7.3±12.3 days; P<0.0001) and number of ICU admissions (1.57±1.07 vs 1.49±0.726; P=0.026). Mortality was increased with increased length of ICU stay (11.5±33.5 vs 12.4±19.0 days; P=0.036). Conclusion: Our study demonstrates that patients with post-operative dysphonia and dysphagia have increased length of ICU stay, increased intubation time, frequency of intubation and number of admissions to ICU. Mortality was greater for patients with increased length of ICU stay. Our study has implications for informed consent and patient counselling, and highlights the need for specialised SP management in this patient population.

PAGE KIDNEY PHENOMENON FOLLOWING KIDNEY GRAFT BIOPSY Mak Jackie, He Bulang WA Liver & Kidney Transplant Service, Sir Charles Gairdner Hospital, Perth

Introduction: The Page kidney phenomenon occurs as a result of compression of renal parenchyma leading to renal failure and hypertension. Although it is a rare complication following percutaneous renal allograft biopsy, if not diagnosed and treated promptly, graft loss is inevitable. Methods: From Jan 2011 to Dec 2014, two cases of Page kidney phenomenon after renal graft biopsy were identified. One biopsy was indicated for deteriorating graft function, whereas the other was a protocol biopsy. Both patients were female, aged 62 and 58 years respectively. The biopsies were performed under Doppler ultrasound guidance. Results: The first patient was discharged on the day of the procedure, as her observations after the procedure were normal. She presented 10 days later with severe pain over the graft. The second patient was admitted after the procedure due to severe pain over the graft, haematuria and reduced urine output. A subcapsular haematoma was identified on Doppler ultrasound, with absent or reversed diastolic blood flow. Urgent surgical exploration and evacuation of the haematoma was performed in both cases. The first kidney graft remained underperfused after surgery; the renal parenchyma was non-viable on day 2 post-surgery and the graft was removed. The second kidney graft demonstrated normal vascular waveforms on Doppler ultrasound on day 1 after surgery, and graft function recovered satisfactorily over time. Conclusion: In the setting of renal graft biopsy, Page kidney should be strongly suspected when patients develop acute pain over the graft, reduced urine output, declining renal function and hypertension. The allograft can be salvaged by prompt surgical evacuation of the haematoma.

RECURRENT GLOMERULOPATHY IN A RENAL ALLOGRAFT DUE TO LECITHIN CHOLESTEROL ACYLTRANSFERASE DEFICIENCY Liew Hui1, Mulley Bill1,2, Simpson Ian3 1Nephrology and Renal Transplant, Monash Medical Centre, Melbourne, 2Nephrology and Renal Transplant, Monash University, Melbourne, 3Department of Anatomical Pathology, Monash Medical Centre, Melbourne
Lecithin cholesterol acyltransferase (LCAT) is an enzyme that converts free cholesterol into cholesteryl ester. LCAT deficiency is a rare autosomal recessive disease leading to accumulation of phospholipids in various tissues, with multisystem manifestations including corneal opacification, anaemia and renal impairment. Renal involvement is a major cause of morbidity and mortality in these patients and can progress to end-stage renal disease requiring renal replacement therapy or transplantation. Disease recurrence in the allograft after transplantation can occur because of persistent metabolic abnormalities. This case report describes a woman with LCAT deficiency who developed renal impairment at 33 years of age in 1987. Her kidney biopsy demonstrated features of mesangiocapillary glomerulonephritis type III; however, it was not until several years later, after she developed bilateral corneal opacities, that LCAT deficiency was suspected. Her kidney function deteriorated and she started haemodialysis in 2003 before receiving a deceased donor kidney transplant in 2005. A transplant biopsy in 2014, performed to investigate a rising creatinine, revealed features consistent with recurrence of LCAT deficiency. Whilst genetic testing was never undertaken, she displayed all the manifestations of familial LCAT deficiency (FLD). To our knowledge this is the first case report of renal transplantation in FLD and its recurrence in the allograft in Australia, and the first to describe the histology of recurrent disease after 8.5 years of follow-up.
IMPACT OF PRE-OPERATIVE TREATMENT FOR POLYCYSTIC KIDNEY IN RENAL TRANSPLANT RECIPIENTS WITH AUTOSOMAL DOMINANT POLYCYSTIC KIDNEY DISEASE Marui Yuhji1, Tanaka Kiho1, Ishii Yasuo1, Ubara Yoshifumi2, Tomikawa Shinji1 1Department of Renal Transplantation Surgery, Toranomon Hospital Kajigaya, Japan, 2Department of Nephrology, Toranomon Hospital Kajigaya, Japan
Table: Reduction rate of PCK after RTx

Pre-Op Treatment                           TAE + Nephrectomy (n=6)  TAE (n=4)           Nephrectomy (n=7)   None (n=2)
Reduction rate at 1 year: average (range)  26.4% (15.3–45%)         15.4% (6.5–26.1%)   37.4% (9.5–72.6%)   28.5% (11.9–45%)
Latest reduction rate: average (range)     29.4% (2.8–45.2%)        31.8% (21.4–46.5%)  55.8% (15.1–82.2%)  36.7% (28.3–45%)
Duration of observation: average (range)   2.3 years (1–5)          3.5 years (2–5)     4.5 years (1–8)     1.5 years (1–2)
Most renal transplant (RTx) recipients with autosomal dominant polycystic kidney disease (ADPKD) require pre-operative treatment for complications related to the enlarged polycystic kidneys (PCK), including abdominal distension, gross haematuria and cyst infection. In particular, at RTx sufficient room for the kidney graft is needed in the abdominal cavity. Aims: To assess the impact of pre-operative treatment (POT) for enlarged PCK, which included nephrectomy and transluminal renal arterial embolization (TAE), on the subsequent size of the native PCK and on complications following RTx. Methods: RTx patients with ADPKD transplanted from 2001 to 2014 were identified. The reduction rate in size of the native PCK for each POT was calculated using follow-up CT, and complications following RTx were investigated. Results: 38 patients were identified, and reduction rates of native PCK were compared in 19 patients. POT and reduction rates are shown in the table. There was a tendency towards a difference between POTs. Regarding complications, cyst infection and bleeding each occurred only once. There were 2 intestinal perforations, 1 subarachnoid haemorrhage and 3 septic cases. Conclusion: POT for ADPKD recipients appears meaningful not only for RTx itself but also for prevention of recurrent PCK-related complications.
THE HINT STUDY: A CROSS-SECTIONAL SURVEY OF TRANSPLANT CLINICIANS ON HEPATITIS TRANSMISSION RISK IN SOLID ORGAN TRANSPLANTATION Waller Karen1, Wyburn Kate1,2, Shackel Nicholas1,3,4, O'Leary Michael1,5,6, Webster Angela7,8 1School of Medicine, University of Sydney, 2Department of Nephrology, Royal Prince Alfred Hospital, Sydney, 3Centenary Institute of Cancer Medicine and Cell Biology, Sydney, 4Department of Gastroenterology, Royal Prince Alfred Hospital, Sydney, 5Intensive Care Service, Royal Prince Alfred Hospital, Sydney, 6NSW Organ and Tissue Donation Service, 7School of Public Health, University of Sydney, 8Department of Nephrology, Westmead Hospital, Sydney

Aims: Understanding donor and recipient hepatitis serology and transmission risk as tests and treatments evolve can be challenging. We aimed to survey understanding among the medical transplant workforce. Methods: A cross-sectional, self-completed, anonymous survey was distributed via mailing lists targeting nephrologists, hepatologists, intensive care specialists (ICU) and transplant surgeons in Australia and New Zealand. Participants answered 8 scenarios with Hepatitis B (HBV; 5) and Hepatitis C (HCV; 3) serology, assessing donor and recipient hepatitis status and suitability for transplant. Analysis was by simple proportions of answers consistent with TSANZ guidelines. Results: 74 responses were analysed. Respondents were 59% male, 82% Australian and 70% nephrologists (12% ICU, 10% transplant surgeons, 7% hepatologists), with mean age 40–49. Table 1 summarises the results. Naïve and vaccinated donors and recipients were well identified. Only 26% identified a HCV antibody-positive but NAT-negative donor as posing an infection risk. Transplant decisions were variable. Only one scenario had >90% transplanting in concordance with guidelines; often stronger consensus was given to another option. For instance, Case 1 was deemed suitable with specific informed consent and prophylaxis by 51%; 59% reported Case 8 suitable with specific informed consent. Conclusions: Although interpretations of donor and recipient hepatitis status were broadly consistent, transplant suitability decisions showed considerable variability among clinicians and often discordance with guidelines. This may be due to best-practice management outpacing guidelines, guidelines that are at times ambiguous, or regional or personal preferences driving practice. This suggests a need for new, pragmatic guidance to aid doctors making high-impact decisions in a fast-paced environment.

Table 1. Proportion of respondents who correctly attributed risk and made transplant decisions in concordance with TSANZ guidelines

                                                                         Proportion correct (%)
Case  Donor status        Recipient status    Transplant suitability     Donor  Recipient  Transplant match
Hepatitis B
1     Exposed; low risk   Naïve               Suitable; consent          64     100        22
2     Exposed; high risk  Vaccinated          Unsuitable                 86     95         43
3     Exposed; high risk  Exposed; no virus   Unsuitable                 97     76         58
4     Exposed; high risk  Naïve               Unsuitable                 85     100        80
5     Vaccinated          Naïve               Suitable                   93     100        97
Hepatitis C
6     Exposed; risk       Naïve               Unsuitable                 27     96         41
7     Exposed; risk       Exposed; no virus   Unsuitable                 100    95         62
8     Exposed; risk       Exposed; virus      Unsuitable                 92     8          26
PERIPHERAL BLOOD NATURAL KILLER CELL FUNCTION AND ALLO-STIMULATED T-CELL INTERFERON-GAMMA RELEASE IN KIDNEY TRANSPLANT RECIPIENTS ASSOCIATES WITH CANCER AND DEFINES RISK OF IMMUNOSUPPRESSION RELATED COMPLICATIONS Hope Christopher1,2, Fuss Alexander1,2, Hanf William1, Jesudason Shilpanjali1,2, Coates Patrick T1,2, Heeger Peter3, Carroll Robert1,2 1Centre
for Clinical and Experimental Transplantation, Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital, 2Department of Medicine, University of Adelaide, 3Translational Transplant Research Center and Department of Medicine, Icahn School of Medicine at Mount Sinai
Reducing immunosuppression has been proposed as a means of preventing cancer in Kidney Transplant Recipients (KTR) but can precipitate graft rejection. Aims: To measure anti-tumour Natural Killer (NK) cell function and allo-responses in KTR to define cancer and rejection risks for use in immunosuppression reduction therapy. Methods: Two cohorts of KTR with and without a history of malignancy were recruited (n=76). NK cell function was measured by Lactate Dehydrogenase (LDH) release assay, and allo-stimulated interferon-gamma (IFN-γ) release was quantified via Panel of Reactive T cell (PRT) Enzyme-Linked ImmunoSpot (ELISPOT) from KTR Peripheral Blood Mononuclear Cells (PBMC). KTR were followed prospectively between 2010 and 2014. Results: We found that both peripheral blood NK cell function and IFN-γ release are diminished in KTR with malignancy compared to KTR with no history of malignancy. With prospective follow-up, KTR with poor NK cell function (<3.9% lysis) had a Hazard Ratio (HR) [95% Confidence Interval] of 6.6 [1.7–13] for recurrent cancer or cancer death (P=0.003). Furthermore, KTR with a PRT ELISPOT <280 spots/3×10^5 PBMC had HR = 3.0 [1.2–7.4] for recurrent cancer (P=0.019). In addition, KTR with cancer and a PRT ELISPOT value <93 spots/3×10^5 PBMC had HR = 8.8 [1.6–46] for septic or cancer death (P=0.014). Conclusions: High PRT ELISPOT values have been confirmed to indicate cell-mediated rejection risk in early-transplant KTR. This study shows that low PRT ELISPOT values and low NK cell function are associated with increased cancer and immunosuppression-related deaths in long-term (>10 yrs) KTR.
Immunobiology: tolerance and Tregs/xenotransplantation

IL-17A MODULATED HUMAN MESENCHYMAL STEM CELLS AS A NOVEL CELL THERAPY TO ENGENDER REGULATORY T CELLS (TREGS) Sivanathan Kisha Nandini1,2, Rojas-Canales Darling1,2, Hope Christopher M1,2, Krishnan Ravi1,3, Carroll Robert P2,4, Gronthos Stan1, Grey Shane5, Coates Patrick T1,2,4 1Department of Medicine, University of Adelaide, 2Centre for Clinical and Experimental Transplantation, Royal Adelaide Hospital, 3Centre for Clinical and Experimental Transplantation, 4Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital, 5Transplant Immunology Group, Garvan Institute of Medical Research, Sydney

Aim: Tregs offer a potential cell-therapy-based approach to promote tissue repair and allow transplantation tolerance. Generation of sufficient Tregs represents a major limitation to the success of this approach. Here we show a novel strategy to engender Tregs using cytokine-modulated mesenchymal stem cells (MSC). Methods: Human MSC were generated from bone marrow and assessed for surface phenotype, gene expression, cytokine profile, T cell suppressive function, and the ability to engender human Tregs. The effect of different cytokines on MSC function was evaluated. Results: We found that IL-17A was a superior modulator of human MSC. These "MSC-17" cells showed low expression of MHC class I, MHC class II and CD40, but expressed key MSC markers including CD73, CD90 and CD105. MSC-17 potently suppressed PHA-induced human T cell proliferation (3H-thymidine) and activation (reduced surface CD25 expression, and decreased IFN-γ, TNF-α and IL-2 production). T cell suppression by MSC-17 correlated with high levels of the known immunomodulatory factors TGF-β1 and PGE2. In T cell co-cultures, MSC-17 consistently engendered increased CD4+CD25highCD127lowFoxP3+ Treg. Experiments with purified human CD4+CD25− T cells showed MSC-17 cells could induce Tregs (iTregs). The iTreg increase required MSC-T cell cell-cell contact. Functionally, FACS-sorted MSC-17-induced iTregs could suppress human T cell activation (CD154 suppression assay). MSC-17-induced iTregs expressed the functional markers CD39, CD73, CD69, OX40, CTLA-4 and GITR. Conclusion: MSC-17 can engender Tregs that potently suppress T cell activation. These MSC-17 represent a potential cell therapy to modulate Tregs for clinical application. Additional studies to establish MSC-17 functional capacity in vivo are currently under way.

RECENTLY ALLOACTIVATED CD4+CD8-CD25+ TREG EXPRESS CD8 AND ARE THE ANTIGEN-SPECIFIC TREG Hall BM, Robinson CM, Tran GT, Verma ND, Boyd RA, Hodgkinson SJ Transplantation Immunology Laboratory, University of New South Wales, Sydney

Aim: Transplant tolerance induction requires CD4+CD25+FOXP3+ Treg, known as tTreg. Expanding non-antigen-specific tTreg for therapy requires impossibly large numbers of cells to induce tolerance. Naïve tTreg with TCR specific for donor antigen can be induced and expanded in vitro into more potent antigen-specific Treg that could effect tolerance with smaller numbers of cells. After culture of naïve tTreg with alloantigen and IL-2, there is induction of the IFN-gamma receptor (IFNGR); we call these Ts1 cells. Methods: Naïve CD4+CD25+ Treg from DA rats were cultured with PVG stimulator cells and IL-2 for 3–4 days to induce Ts1 cells. 10–30% of cultured cells expressed CD8, whereas the starting population had <1% CD8+ cells. Results: In a Ts1 preparation, CD8+ cells had increased IFNGR compared to no induction in naïve CD4+CD8-CD25+ T cells. CD8+ cells had increased suppression in MLC and in adoptive transfer assays of allograft rejection. We also showed that in DA rats rejecting PVG grafts or treated to induce tolerance, 12–18% of CD4+CD25+ T cells expressed CD8, whereas in naïve rats <5% express CD8. These CD25+ T cells from both rejecting and tolerance-induced rats suppressed MLC to PVG at 1:256–1:512, and to Lewis at 1:16–1:32, showing specific suppression. The CD8+ cells, when enriched, suppressed MLC to PVG at 1:1012, and to Lewis at 1:32. The CD4+CD8-FOXP3+ T cells had no enhanced suppression to PVG. Conclusion: Expression of CD8 on CD4+CD8-CD25+FOXP3+ T cells is a marker of Treg that have been activated by specific antigen. This marker early after tTreg activation may be used to identify and further expand antigen-specific Treg.
CHARACTERISATION OF THE IMMUNOMODULATORY ACTION OF NOVEL ANTI-CXCR3 MONOCLONAL ANTIBODY EXPLOITING A CONSERVED HUMAN AND MURINE EPITOPE Pilgrim Suzanna1,2, Zammit Nathan1, Walters Stacey1, Mitsuru Saito3, Zang Geoff3, Robert Remy2, Alexander Stephen3, Mackay Charles4,5, Grey Shane1 1Immunology Division, Garvan Institute of Medical Research, Sydney, 2Immunology, Monash University, Melbourne, 3Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 4Charles Perkins Centre, University of Sydney, 5University of Sydney
Introduction: Recruitment of effector T cells to inflammatory sites is a prerequisite for transplant rejection. Here we characterise a novel mouse anti-CXCR3 mAb (aCXCR3) in multiple models of inflammation. Methods: C57BL/6 mice received allogeneic islet, skin or heart transplants and were treated i.v. with multiple doses of aCXCR3, with or without subtherapeutic doses of rapamycin. We also investigated the efficacy of multi-dose aCXCR3 in autoimmune NOD mice. Results: aCXCR3 impairs CXCL9-, CXCL10- and CXCL11-dependent chemotaxis in vitro, and in vivo administration reduces CXCR3 receptor expression, particularly on NK and T cell subsets. aCXCR3 prolonged graft survival for skin (3/5 mice surviving >100 days; 10 mg/kg, 4 doses, n=5), cardiac (MST 11 vs 7 days, P=0.0012; 10 mg/kg, 4 doses, n≥7) and islet allografts (MST 34 vs 20 days, 15% surviving >100 days, P<0.05; 2 mg/kg, 9 doses, n≥4). When combined with a subtherapeutic dose of rapamycin (MST 35 vs 22 days, n≥4), aCXCR3 resulted in 50% of islet allografts surviving long term (>100 days, P<0.05, n=6). Surviving grafts showed restricted regions of mononuclear cell infiltrate, preserved islet architecture, strong insulin staining and increased regulatory T cell frequency, suggesting a collaborative role for aCXCR3 and rapamycin in establishing tolerance. In contrast, aCXCR3 monotherapy did not alter disease outcomes, insulitis or immune cell populations in the NOD model. Conclusions: CXCR3 is among the markers best able to distinguish effector/activated T cells from Treg cells. aCXCR3 dramatically prolonged islet allograft survival and improved survival of heart and skin allografts. The aCXCR3 mAb shows potential therapeutic application, particularly in the context of drug-tapering strategies for islet transplants.
IL-2/IL-2 AB COMPLEX INDUCED MURINE SKIN ALLOGRAFT TOLERANCE REQUIRING DST IS DEPENDENT ON TIMING OF THE COMPLEX Zhang Geoff1, Wang Yuan Min1, Hu Min1, Sawyer Andrew1, Zhou Jimmy1, Grey Shane2, Alexander Stephen1
1Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 2Immunology, Garvan Institute of Medical Research, Sydney
Background: In vivo Treg expansion using IL-2 complexed with anti-IL-2 antibody has been demonstrated to be effective in inducing long-term acceptance of islet allografts. In this study, we investigated the effects of administration of IL-2 complex in stringent skin allograft models with MHC class I disparity.
Method: Five groups of B6 mice received a bm1 skin graft; bm1 is an MHC class I mutant strain of B6. Group 1 received IL-2 complex 3 days before skin grafting; Group 2 was a skin-graft-only control; Group 3 received IL-2 complex at day −3, donor-specific transfusion (DST) at day 0 and skin grafting at day 7; Group 4 received DST at day 0, IL-2 complex at day 4 and skin grafting at day 7; Group 5 received DST at day 0 and skin grafting at day 7. Where given, IL-2 complex was injected i.p. for 3 consecutive days. Graft survival, histology, immunophenotyping of CD4 and CD8 T cells, mixed lymphocyte reaction (MLR) and IFN-γ ELISPOT were assessed. Results: Injection of IL-2 complex induced a 7.5-fold increase in Foxp3+ Tregs in peripheral blood at day 4 after injection. Despite this substantial expansion of Tregs, B6 mice acutely rejected bm1 skin grafts at a similar rate to the no-IL-2-complex control group (MST 22 vs 14 days). DST using bm1 splenocytes prolonged survival of subsequent skin grafts but failed to induce long-term graft acceptance (MST 50 days). Administration of IL-2 complex after DST moderately accelerated graft rejection (MST 33 days). In contrast, administration of IL-2 complex before DST rendered long-term graft acceptance (MST 100 days). In MLR, compared with the IL-2-complex-after-DST and DST-only groups, cells from mice that received IL-2 complex before DST showed significantly reduced responses to bm1 stimulators while maintaining equivalent responses to allogeneic third-party stimulators. IFN-γ ELISPOT showed a similar pattern of response to the MLR. Compared with the DST control, administration of IL-2 complex after DST increased the expression of CD25 on Foxp3− CD4+ cells (from 5.5%±0.51 to 17%±2.3) and on CD8+ cells (from 0.74%±0.32 to 2.73%±0.92), and CD69 expression on Foxp3− CD4+ cells (from 3.8%±1.4 to 11.8%±3.2).
Conclusion: IL-2 complex combined with DST leads to long-term tolerance only when the complex is given before the DST; both the timing of the IL-2 complex and the DST itself are essential for the development of tolerance, as shown by the control groups receiving late IL-2 complex or IL-2 complex without DST. These results are consistent with the complex enhancing donor-specific Tregs induced by the DST; in the absence of DST, these are not found. Similarly, late IL-2 complex appears to activate effector cells, limiting tolerance.
AUTOPHAGY IS CRITICAL FOR REGULATORY T CELL MAINTENANCE AND TOLERANCE Le Texier Laetitia1, Leveque Lucie1, Lineburg Katie E1, Alexander Kylie A1, Teal Bianca1, Martinez Michelle1, Melino Michelle1, Kuns Rachel D1, Lane Steven1,2, Blake Stephen1, Teng Michele1, Clouston Andrew D3, Hill Geoffrey R1,2, MacDonald Kelli PA1
1Department of Immunology, Queensland Institute of Medical Research, Brisbane, 2Department of Bone Marrow Transplantation, Royal Brisbane Hospital, 3Department of Pathology, Envoi Pathology, Brisbane
Regulatory T cells (Treg) play a crucial role in the maintenance of peripheral tolerance. Quantitative and/or qualitative defects in Treg result in disease states such as autoimmunity, allergy, malignancy and graft-versus-host disease (GVHD), a serious complication of allogeneic bone marrow transplantation (BMT). We recently reported increased expression of autophagy-related genes (Atg) in association with enhanced survival of Treg after BMT1. Autophagy is a self-degradative process for cytosolic components which promotes cell homeostasis and survival. Using qRT-PCR, Western blotting, and
imaging flow cytometry techniques to assess autophagic activity and autophagosome formation, we demonstrate that autophagy is a constitutively active process within Treg and is dramatically enhanced within this population following BMT. Examination of the Treg compartment of Atg5−/− chimeric mice, in which autophagy was globally ablated, revealed a significant reduction in Treg in the spleen, lymph nodes and bone marrow. This effect was cell intrinsic, as Treg-specific deletion of autophagy (Atg7×FoxP3cre mice) resulted in a failure of peripheral Treg homeostasis and autoimmune colitis, phenocopying the disease seen in FoxP3-deficient mice. Using the well-described B6 into B6D2F1 model, recipients of grafts from Atg7×FoxP3cre mice exhibited exacerbated GVHD, confirming the requirement for autophagy within Treg for GVHD control. Intriguingly, the mTOR inhibitor rapamycin induces high levels of autophagy in Treg, which is essential for their enhanced survival in the presence of this agent after transplantation. Thus, autophagy is critical for the maintenance of tolerance, and its induction is a promising novel therapeutic strategy in diseases characterised by immune-mediated tissue injury.
1MacDonald KP, et al. Modification of T cell responses by stem cell mobilization requires direct signaling of the T cell by G-CSF and IL-10. J Immunol 2014.
INTERLEUKIN-5 THERAPY PREVENTS CHRONIC ALLOGRAFT REJECTION BY INDUCTION OF T REGULATORY CELLS Hodgkinson SJ1, Hall BM1, Hall RM1, Tran GT1, Robinson CM1, Wang C2, Sharland A2
1Transplantation Immunology Laboratory, University of New South Wales, Sydney, 2Collaborative Transplant Group, University of Sydney
Aim: Naïve CD4+CD25+Foxp3+ Treg activated by specific alloantigen and IL-4, not IL-2, express the IL-5 receptor (IL-5Rα); we call these Ts2 cells. Ts2 cells are more potent alloantigen-specific Treg than nTreg. Here we examined whether IL-5 treatment (IL-5Rx) activated alloantigen-specific Ts2 cells to prevent chronic rejection. Methods: F344 recipients of Lewis heart grafts received 5000 units rIL-5 i.p. daily for 10 days from day 7 after grafting. Rejection was scored on a semi-quantitative scale. Results: Sham-treated rats developed rejection at 18 days and all rejected by 28 days (n=5). IL-5Rx prevented rejection until cessation of treatment (P<0.01 vs sham), and all grafts survived for 60 days, although there was a rejection episode after stopping IL-5. Another group with continued IL-5Rx had less rejection. Pretreatment with anti-CD25 or anti-IL-4 abolished the benefit of IL-5Rx, consistent with host CD25+ T cells being activated to Ts2 cells that were then expanded by IL-5Rx. IL-5Rx increased CD4+CD25+ T cells to 6–8%, vs 3–4% in controls. After 10 days of IL-5Rx, CD4+CD25+ T cells responded to Lewis but not to F344 or third-party PVG, showing activation of alloantigen-specific Treg by IL-5Rx. CD4+CD25+ T cells from rats treated for 50 days with IL-5 had their response to Lewis enhanced by IL-5, consistent with a Ts2 cell phenotype. RT-PCR of host CD4+CD25+ T cells showed that IL-5Rx increased IL-5Rα expression while the cells remained FOXP3+. Conclusion: IL-5 prevented rejection through CD25+ Treg that required host IL-4 to activate them, consistent with induction of Ts2 cells. This study shows that promotion of Th2 cytokine-induced Treg by IL-5 therapy may have potential to prevent chronic allograft rejection.
INDUCTION OF POTENT ANTIGEN SPECIFIC TREG CELLS FROM NAÏVE TTREG BY ALLOANTIGEN AND TH2 CYTOKINES Robinson CM, Tran GT, Hall RM, Wilcox P, Hodgkinson SJ, Hall BM
Transplantation Immunology Laboratory, University of New South Wales, Sydney
Aim: Transplant tolerance induction requires CD4+CD25+FOXP3+ Treg known as tTreg. There is interest in expanding tTreg for therapy, but they are not antigen specific and impossibly large numbers are required to induce tolerance. We found that IL-4, in the absence of IL-2, can induce changes in naïve tTreg bearing TCR for the stimulating donor antigen, inducing the receptor for IL-5 (IL-5Rα). We examined whether IL-5 could promote further activation to more potent antigen-specific Treg. Methods: Naïve CD4+CD25+ Treg from DA rats were cultured with PVG stimulator cells and IL-4 for 3–4 days to induce Ts2 cells. These were re-cultured with PVG and IL-4, IL-5, or IL-4 and IL-5, and their phenotype was assessed by FACS and RT-PCR. Capacity to suppress in MLC was also examined. Results: In all cultures, cells remained CD4+CD25+, and 60–80% expressed FOXP3. In MLC, Ts2 cells suppressed the response to PVG at 1:32, as did Ts2 cells re-cultured with IL-4 or with IL-4 and IL-5, whereas Ts2 cells re-cultured with IL-5 alone suppressed at 1:1024. Ts2 cells re-cultured with IL-5 did not suppress responses to third-party Lewis. RT-PCR showed that Ts2 cells re-cultured with IL-4 retained the Ts2 phenotype of FOXP3, IL-5Rα and IFN-gamma expression. Ts2 cells cultured with IL-2 expressed FOXP3 and IL-5Rα but not IL-2, IL-4 or IFN-gamma. We are further characterising this phenotype. Conclusion: IL-5 can induce highly potent antigen-specific Treg from Ts2 cells. This IL-4 and IL-5 pathway of activation of naïve tTreg may provide a method for inducing antigen-specific Treg for therapy.
Sensitisation, antibodies and ABO incompatible transplantation
PRECONDITIONING THERAPY IN ABO-INCOMPATIBLE LIVING KIDNEY TRANSPLANTATION – A SYSTEMATIC REVIEW AND META-ANALYSIS Sharma Ankit1, Lo Phillip2,3, Craig Jonathan C2,4, Wyburn Kate5, Lim Wai6, Chapman Jeremy R1, Palmer Suetonia C7, Strippoli Giovanni FM3,8,9,10, Wong Germaine1,3,4
1Centre for Transplant and Renal Research, Westmead Hospital, Sydney, 2Faculty of Medicine, University of New South Wales, Sydney, 3School of Public Health, University of Sydney, 4Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 5Department of Renal Medicine, Royal Prince Alfred Hospital, Sydney, 6Department of Renal Medicine, Sir Charles Gairdner Hospital, Perth, 7Department of Medicine, University of Otago Christchurch, New Zealand,
8Cochrane Renal Group, Cochrane Collaboration, 9Department of Emergency and Organ Transplantation, University of Bari, Italy, 10Diaverum Medical Scientific Office and Diaverum Academy, Lund, Sweden
Aims: ABO incompatible (ABOi) kidney transplantation is an established form of renal replacement therapy for patients with end-stage kidney disease, but the efficacy and safety of different pre-conditioning therapies remain unclear. We aimed to synthesise published evidence of the effects of preconditioning therapies in living donor ABOi kidney transplantation on graft and patient outcomes. Methods: We searched MEDLINE, Embase and Clinicaltrials.gov databases (inception through February 2014) to identify studies describing the outcomes of adult living donor ABOi kidney transplantations using pre-conditioning therapies. Two independent reviewers identified studies, extracted data, and assessed risk of bias. Data were summarised using the random effects model, and heterogeneity was explored using subgroup analyses. Results: Eighty-one studies (53 case reports and case series, 24 cohort, 2 case-control and 2 registry studies) with 4,853 ABOi transplant recipients were identified. Overall, the quality of the evidence was low (based on the Grading of Recommendations Assessment, Development, and Evaluation [GRADE] framework). Over the mean follow-up time of 44 (standard deviation (SD)±13.2) months, overall graft survival for recipients who received immunoadsorption and apheresis was 94.1% (95%CI: 88.2%–99.1%) and 88.6% (95%CI: 83.0%–92.6%), respectively. For those who received rituximab and those who underwent splenectomy, overall graft survival was 93.5% (95%CI: 89.5%–96.0%) and 78.9% (95%CI: 70.9%–85.2%), respectively. Data on longer-term outcomes, such as the incidence of malignancy, were sparse. Conclusions: Rituximab and immunoadsorption preconditioning were associated with the highest graft survival, with lower survival observed with splenectomy and apheresis. However, the overall quality of evidence is low and confidence in relative treatment effects is limited.
Future randomised trials comparing various forms, doses and frequencies of pre-conditioning therapies are likely to have a major impact on improving our confidence in the treatment effectiveness of these techniques.
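The random-effects pooling described in the Methods can be sketched computationally. The following is a minimal illustration only: the abstract does not name the specific estimator, so the DerSimonian–Laird approach is an assumption on our part, and the input numbers are invented for illustration rather than taken from the review's data.

```python
import math

def dersimonian_laird(effects, variances):
    """Pool study-level effect estimates under a random-effects model
    (DerSimonian-Laird between-study variance estimator)."""
    k = len(effects)
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    # Cochran's Q statistic quantifies heterogeneity around the fixed-effect mean
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                     # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]         # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2

# Invented example: three hypothetical studies with effect 0.5
pooled, se, tau2 = dersimonian_laird([0.5, 0.5, 0.5], [0.1, 0.2, 0.1])
```

With identical study effects, Q is zero, so the between-study variance collapses to zero and the random-effects estimate reduces to the fixed-effect estimate, which is the expected behaviour of this estimator.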
Figure: Probability of graft survival among recipients who received the various forms of preconditioning therapy: A, apheresis; B, immunoadsorption; C, splenectomy; D, rituximab.
THE IMPACT OF TREATMENT ON LEUKOCYTE SUBSETS IN ACUTE AND CHRONIC ANTIBODY MEDIATED REJECTION OF THE RENAL ALLOGRAFT Ramessur Chandran Sharmila1,2, Nikolic-Paterson David2,3, Longano Anthony2,3, Han Yingjie2,3, Ma Frank2,3, Kanellis John2,3, Mulley William2,3
1Monash Medical Centre, Melbourne, 2Department of Medicine, Monash University, Melbourne, 3Department of Nephrology, Monash Medical Centre, Melbourne
Aims: To compare intragraft leukocytic infiltrates in acute (aAMR) and chronic (cAMR) antibody-mediated rejection and the impact of specific therapies on leukocytes. Methods: 21 aAMR, 19 cAMR and 16 protocol biopsies were immunostained for macrophages, neutrophils, T cells and B cells. Serial biopsies were assessed in 8 aAMR patients (1. at diagnosis, 2. post-plasma exchange (PEX), 3. post-rituximab) and 8 cAMR patients (1. at diagnosis, 2. post-high-dose IVIg, 3. post-rituximab). Results: Leukocyte numbers were increased in aAMR and cAMR relative to protocol biopsies. There were significantly more glomerular neutrophils and interstitial macrophages but fewer interstitial T and B cells in aAMR than in cAMR at diagnosis. In aAMR, PEX was associated with a 3-fold increase in interstitial T cells (P=0.036) and an 86% increase in macrophages (P=0.05). Rituximab eliminated B cells, while interstitial and glomerular neutrophils decreased by 90% compared with diagnosis (P<0.02), and interstitial macrophages fell compared with post-PEX (P=0.05). In cAMR, interstitial macrophages doubled post-IVIg (P=0.025). Rituximab eliminated B cells, while interstitial macrophages remained 50% more numerous than at diagnosis (P=0.05) (table). Graft loss (5 years post-diagnosis) in cAMR (n=9 of 19) was associated with increased glomerular macrophages (P=0.004) and T cells (P=0.011) at diagnosis. Three aAMR grafts were lost. Conclusions: Greater neutrophil and macrophage numbers in aAMR are consistent with acute inflammation. The post-treatment decline in neutrophils may reflect response to therapy or a shift to chronic inflammation over time. The surprising increase in macrophages with treatment in both aAMR and cAMR may indicate resistance to therapy or, alternatively, an influx of reparative macrophages. Active glomerular inflammation in cAMR appears to be a negative prognostic sign. Ongoing studies are examining effects of therapy on intragraft leukocyte phenotypes and signalling pathways.
RELATIVE BENEFITS AND COSTS OF AN ACCEPTABLE HLA MISMATCH PROGRAM IN AUSTRALIA Do Nguyen Hung1, Wong Germaine2,3,4, Chapman Jeremy2, Craig Jonathan3, Howard Kirsten5, D'orsogna Lloyd6,7, Mcdonald Stephen8,9, Russ Graeme10,11, Lim Wai12
1School of Medicine & Pharmacology, University of Western Australia, Perth, 2Renal Transplant Unit, Westmead Hospital, Sydney, 3Centre for Kidney Research, The Children's Hospital at Westmead, Sydney, 4School of Public Health, University of Sydney, 5Institute for Choice, University of South Australia, 6Department of Immunology, Fiona Stanley Hospital, 7School of Pathology and Laboratory Medicine, University of Western Australia, Perth, 8School of Medicine, University of Adelaide, 9Department of Renal Medicine, Royal Adelaide Hospital, 10Renal & Transplantation Unit, Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital, 11Faculty of Health Sciences, University of Adelaide, 12Renal Unit, Sir Charles Gairdner Hospital, Perth
Background: The Eurotransplant acceptable mismatch (AM) program has reduced waiting time for highly-sensitised kidney transplant candidates, but the benefits and costs of implementing a similar program in Australia are unknown. Aims: To determine the benefits and costs of implementing an AM program in Australia. Methods: All highly-sensitised (PRA≥80) deceased-donor kidney transplant recipients (2006–2011) were included. Allocation scores were recalculated for every historical match considering AM when present (defined as a mismatched HLA with 0–2 eplets mismatched by HLAMatchmaker) and potential transition from state to national allocation. The increase in waiting time for ≥3 reallocated non-highly-sensitised recipients per highly-sensitised recipient was also modelled. Decision analytical models were developed to compare the benefits and costs of including AM in our current allocation.
Results: We included 120 of 187 highly-sensitised recipients; 67 (36%) were excluded as they had ≤1 historical match. Including AM reduced the waiting time for 18 (15%) recipients by a mean±SD of 12.8±10.3 months (P<0.001), resulting in a gain of 8.1 quality-adjusted life days and $3850 savings per recipient. Of these 18, 8 (44%) were offered earlier kidneys through transition from state to national allocation and the remaining 10 (56%) through the state or national allocation. There was an increase in the mean±SD waiting time for 44 reallocated non-highly-sensitised recipients of 4.5±10.2 months (P=0.006), resulting in a reduction of 1.0 quality-adjusted life days and $655 additional cost per recipient. The most influential variables in our model were waiting-time reduction and graft loss probability. The overall modelled benefit of including AM for a cohort of 500 candidates (5% highly-sensitised) was 5.25 graft-years gained over 80 years, or 1 fewer graft lost at 5 years, compared with the current allocation. Conclusions: Modelling AM integrated into the current allocation in Australia reduced the waiting time for a proportion of highly-sensitised recipients but was associated with a small reduction in overall health benefits and additional costs for non-highly-sensitised candidates on the waitlist.
Table. Modelled outcomes under the current allocation versus allocation including acceptable mismatches (AM). Values in parentheses are SD; bracketed values convert QALY increments to quality-adjusted life days.
Highly-sensitised recipients (n=120)
- Recipients with acceptable mismatches: 63 (53%)
- Recipients with improvement in score/rank: 35 (29%), comprising state to state allocation 19 (54%), national to national allocation 7 (20%), and state to national allocation 9 (26%)
- Recipients with improved transplant potential: 18 (15%), comprising state to state allocation 8 (44%), national to national allocation 2 (12%), and state to national allocation 8 (44%)
Highly-sensitised recipients with improved transplant potential (n=18)
- Waiting time, months: current 81.2 (37.8) vs AM 68.4 (33.1); potential reduction in waiting time 12.8 (10.3) months, P<0.001
- Total health benefit, QALY: current 12.629 vs AM 12.633; incremental health benefit +0.004 QALY [1.2 days] for the population and +0.022 QALY [8.1 days] for the individual
- Total healthcare cost: current $459,857 vs AM $459,280; incremental healthcare cost −$577 for the population and −$3850 for the individual
Reallocated recipients (n=44)
- Waiting time, months: current 66.9 (42.7) vs AM 72.4 (39.8); increase in waiting time 4.5 (10.2) months, P=0.006
- Total health benefit, QALY: current 13.518 vs AM 13.518; incremental health benefit <0.001 QALY [0.03 days] for the population and −0.003 QALY [1.0 days] for the individual
- Total healthcare cost: current $311,547 vs AM $311,596; incremental healthcare cost +$49 for the population and +$655 for the individual
PRE-TRANSPLANT ASSESSMENT OF ANGIOTENSIN II TYPE-1 RECEPTOR ANTIBODIES (ATR-AB) PREDICTS RISK OF HUMORAL REJECTION IN THE ABSENCE OF DONOR SPECIFIC HLA ANTIBODIES Carroll Robert1,2, Riceman Michael1,2, Hope Christopher1,2, Daeyton Sue3, Bennett Greg3, Coates Patrick T1,2
1Centre for Clinical and Experimental Transplantation, Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital, 2Department of Medicine, University of Adelaide, 3Tissue Typing Laboratory, Australian Red Cross Blood Service, Adelaide
Pre-transplant angiotensin II type 1 receptor antibodies (ATRab) have been identified as possible mediators of AMR in the absence of HLA donor-specific antibodies (HLA-DSA). Aims: To assess the risk of cell-mediated (CMR) and antibody-mediated rejection (AMR) associated with the presence of pre-transplant ATRab in kidney transplant recipients (KTR). Methods: ATRab levels were measured in stored pre-transplant sera from KTR transplanted in our centre over 2012–2014 (n=145). Results: All patients had a negative T and B cell CDC crossmatch, and 28/145 (19%) had a low-level DSA at the time of transplantation. All KTR underwent a protocol biopsy at creatinine nadir or for cause, and the histology was graded using Banff criteria 2009 for CMR and 2013 for AMR. 44/145 (30%) KTR experienced a rejection episode, consisting of 14/145 (9.7%) AMR and 30/145 (20%) CMR. Only 2/28 (7%) of KTR with a low-level DSA experienced AMR, and both had detectable ATRab. The hazard ratio (HR) for AMR was 3.7 [95% CI 2–26] (P=0.009) for ATRab levels >18 U/ml. 6/11 (54%) of KTR with levels >25 U/ml experienced AMR despite being HLA-DSA negative. 8/14 (57%) of all AMR occurred in KTR with ATRab levels of 5–16 U/ml. Conclusions: Pre-transplant assessment of ATRab status can be used to predict significant risk of AMR post-transplant in the absence of HLA-DSA, but half of all AMR episodes occurred in KTR with intermediate ATRab levels. Further work is needed to determine the risk of AMR in KTR with intermediate levels of ATRab.
CHANGE IN PANEL REACTIVE ANTIBODY AND KIDNEY TRANSPLANT OUTCOMES Lim Wai1, Chapman Jeremy2, Wong Germaine2
1Department of Renal Medicine, Sir Charles Gairdner Hospital, Perth, 2Department of Renal Medicine, Westmead Hospital, Sydney
Background and Aims: Sensitised kidney transplant recipients with high peak panel reactive antibody (PRA) levels have significantly higher risks of rejection, graft failure and death. However, it remains unclear whether current PRA, or the change between peak and current PRA, is equally important in predicting graft and patient outcomes. The aim of this study was to examine the associations between peak PRA, the change between peak and current PRA, current PRA, and graft outcomes. Methods: Using the Australian and New Zealand Dialysis and Transplant Registry (ANZDATA), we assessed the risk of acute rejection (AR), death-censored graft failure (DCGF) and death with functioning graft (DFG) among primary deceased donor kidney transplant recipients with different thresholds of peak and current PRA levels, using adjusted logistic and Cox proportional hazards models. Results: Of 8108 primary live and deceased donor kidney transplant recipients between 1990 and 2012, 2982 (36.8%) and 903 (11.2%) had peak PRA of 0% and >50% respectively, while 5780 (71.3%) and 234 (2.9%) had current PRA of 0% and >50% respectively. Compared with peak PRA of 0%, there was an incremental risk of acute rejection, DCGF and DFG among recipients with increasing peak PRA, particularly those with peak PRA >75% (AR: odds ratio [OR] 2.28, 95%CI 1.63, 3.19, P<0.001; DCGF: hazard ratio [HR] 1.60, 95%CI 1.22, 2.09, P=0.001; DFG: HR 1.33, 95%CI 1.03, 1.73, P=0.032). In recipients with peak PRA >50% (n=580), a lesser reduction in PRA from peak to current sera was associated with a higher risk of AR (>30% reduction: referent; 1–10% reduction: OR 3.58, 95%CI 1.16, 11.04; 11–20% reduction: OR 1.71, 95%CI 0.58, 5.04; 21–30% reduction: OR 3.55, 95%CI 1.26, 9.95; P for trend 0.037). There was no association between current PRA, or the change between peak and current PRA, and DCGF or DFG.
Conclusions: Peak PRA remains the most important marker of sensitisation status predicting adverse outcomes after transplantation, whereas a greater reduction from peak to current PRA in sensitised recipients was associated with a lower risk of AR.
Figure 1. Pre-transplant anti-angiotensin II type-1 receptor antibody (ATRAb) levels in kidney transplant recipients (KTR). Pre-transplant ATRAb levels were quantitated in 145 KTR: 44 KTR with rejection episodes (14 with antibody-mediated rejection (AMR), 4 of whom also had concurrent CMR, and 30 with cell-mediated rejection (CMR)), 7 KTR with subclinical borderline rejection, and 94 KTR with no rejection episodes. The 14 KTR with AMR had higher ATRab levels (U/ml) than the 94 KTR with no rejection (P<0.01, Kruskal-Wallis analysis).
CELL-MEDIATED AND HUMORAL ACUTE VASCULAR REJECTION AND GRAFT LOSS – A REGISTRY STUDY Lim Wai1, Teo Rachel2, Chadban Steven3, Wong Germaine4, Clayton Phil2, Mcdonald Stephen5, Russ Graeme5
1Department of Renal Medicine, Sir Charles Gairdner Hospital, Perth, 2Department of Renal Medicine, Prince of Wales Hospital, Sydney, 3Department of Renal Medicine, Royal Prince Alfred Hospital, Sydney, 4Department of Renal Medicine, Westmead Hospital, Sydney, 5Central Northern Adelaide Renal and Transplantation Service, Royal Adelaide Hospital
Background and Aims: Rejection of renal allografts following transplantation continues to be a major impediment to long-term graft survival. Although acute vascular rejection (AVR) is associated with a high risk of graft loss, it remains unclear whether AVR accompanied by cellular or by humoral rejection has dissimilar outcomes. The aim of this study was to examine the association between types of AVR and graft outcomes. Methods: Using the Australia and New Zealand Dialysis and Transplant (ANZDATA) registry, primary kidney transplant recipients between 2005 and 2012 whose first rejection episode was AVR were included. AVR was categorised into AVR without acute cellular, glomerular or humoral rejection (AVR-none), AVR with features of cellular and/or glomerular rejection (AVR-CG), and AVR with features of acute humoral rejection (AVR-AHR). The association between AVR groups and graft loss was examined using adjusted logistic and Cox regression models. Results: Of the 274 recipients, 61 (22.3%) experienced AVR-none, 79 (28.8%) AVR-AHR and 134 (48.9%) AVR-CG. Compared with recipients who experienced AVR-none or AVR-CG, recipients who experienced AVR-AHR had the highest incidence of overall graft loss at 3 months (12%, 10% and 27% respectively, χ² 11.88, P=0.003) and death-censored graft loss (10%, 14% and 32% respectively, χ² 12.41, P=0.002). Recipients who had experienced AVR-AHR were at over a 2-fold increased risk of overall (hazard ratio [HR] 2.27, 95%CI 1.09, 4.72, P=0.02) and death-censored graft loss (HR 3.18, 95%CI 1.02, 8.22, P=0.03) compared with recipients who had experienced AVR-none. Sensitivity analysis showed that AVR-AHR with concurrent features of cellular and glomerular rejection had the poorest outcome. Conclusions: Of all AVR, AVR-AHR is associated with the poorest outcome, with over 25% of grafts lost within 3 months of transplantation.
Future studies evaluating factors that predict graft loss in those who have experienced AVR-AHR may help determine prognosis and inform treatment practices.
INCIDENCE OF PRE-FORMED DONOR SPECIFIC ANTIBODIES AT TIME OF LIVER TRANSPLANT: A RETROSPECTIVE ANALYSIS Newman Allyson1,2, Crawford Michael1, Allen Richard1, Strasser Simone3, Mccaughan Geoff3, Shackel Nick3, West Claire1
1Transplantation Services, Royal Prince Alfred Hospital, Sydney, 2Institute of Academic Surgery, Royal Prince Alfred Hospital, Sydney, 3AW Morrow Gastroenterology & Liver Centre, Royal Prince Alfred Hospital, Sydney
Background: The presence of donor-specific antibodies (DSAs) is routinely examined in non-liver transplants. There is emerging research examining pre-formed DSAs and associations with liver allograft rejection and outcomes; however, the association remains controversial. Aim: To determine the incidence and significance of pre-formed DSAs at the time of liver transplant (LT). Method: A retrospective analysis of Luminex DSA measurements in LT recipients from a single centre was undertaken between January and November 2014, with a minimum 3-month follow-up. Blood group incompatible, paediatric, re-transplant and multi-organ transplant recipients were excluded from the analysis. Results: 52 LTs were suitable for analysis, of which 22 patients had pre-formed DSAs (42%). The majority of patients (59%) had a single DSA (range 1–10). Five patients had only class I DSAs, eleven had only class II, and six had a combination of class I and II DSAs. There was a wide range of MFIs (501–23181); however, the majority of DSAs with titres >8000 were class II (n=9/12 or 75%). Interestingly, patients with rejection had a peak class I DSA titre of 6,046, whilst rejection associated with class II had a titre of 2,168. A high class I DSA titre (>4000) was much more common in individuals with rejection post-transplant. Conclusions: This study revealed a high incidence of pre-formed DSAs at the time of LT. High DSA, especially class I, has an apparent association with LT rejection episodes; this requires further investigation. The standard of care should include routine postoperative Luminex studies to monitor DSA.