Publications by authors named "Luckmini Liyanage"

7 Publications


Long-term kidney function and survival in recipients of allografts from living kidney donors with hypertension: a national cohort study.

Transpl Int 2021 Jun 15. Epub 2021 Jun 15.

Department of Surgery, Johns Hopkins University School of Medicine, Baltimore, MD, USA.

Allografts from living kidney donors with hypertension may carry subclinical kidney disease from the donor to the recipient and, thus, lead to adverse recipient outcomes. We examined eGFR trajectories and all-cause allograft failure in recipients from donors with versus without hypertension, using mixed-linear and Cox regression models stratified by donor age. We studied a US cohort from 1/1/2005 to 6/30/2017: 49 990 recipients of allografts from younger (<50 years old) donors, including 597 with donor hypertension, and 21 130 recipients of allografts from older (≥50 years old) donors, including 1441 with donor hypertension. Donor hypertension was defined as documented predonation use of antihypertensive therapy. Among recipients from younger donors with versus without hypertension, the annual eGFR decline was -1.03 versus -0.53 ml/min/1.73 m² (P = 0.002); 13-year allograft survival was 49.7% vs. 59.0% (adjusted allograft failure hazard ratio [aHR] 1.23; 95% CI 1.05-1.43; P = 0.009). Among recipients from older donors with versus without hypertension, the annual eGFR decline was -0.67 versus -0.66 ml/min/1.73 m² (P = 0.9); 13-year allograft survival was 48.6% versus 52.6% (aHR 1.05; 95% CI 0.94-1.17; P = 0.4). In secondary analyses, our inferences remained similar for the risk of death-censored allograft failure and mortality. Hypertension in younger, but not older, living kidney donors is associated with worse recipient outcomes.
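The study estimates annual eGFR decline with mixed-linear models on repeated recipient measurements. A minimal sketch of the underlying idea, fitting an ordinary least-squares slope to one hypothetical recipient's eGFR measurements (all values below are invented for illustration, not taken from the study):

```python
def slope(years, egfr):
    """Least-squares slope: eGFR change (ml/min/1.73 m^2) per year."""
    n = len(years)
    mx = sum(years) / n
    my = sum(egfr) / n
    num = sum((x - mx) * (y - my) for x, y in zip(years, egfr))
    den = sum((x - mx) ** 2 for x in years)
    return num / den

# A hypothetical recipient losing roughly 1 ml/min/1.73 m^2 per year:
years = [0, 1, 2, 3, 4]
egfr = [60.0, 59.0, 57.9, 57.1, 55.9]
print(round(slope(years, egfr), 2))
```

A mixed model additionally pools slopes across recipients with random effects; this per-recipient fit only illustrates what the reported annual decline measures.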
June 2021

Examination of Racial and Ethnic Differences in Deceased Organ Donation Ratio Over Time in the US.

JAMA Surg 2021 Apr 14;156(4):e207083. Epub 2021 Apr 14.

Department of Surgery, Johns Hopkins University School of Medicine, Baltimore, Maryland.

Importance: Historically, deceased organ donation was lower among Black compared with White populations, motivating efforts to reduce racial disparities. The overarching effect of these efforts in Black and other racial/ethnic groups remains unclear.

Objective: To examine changes in deceased organ donation over time.

Design, Setting, And Participants: This population-based cohort study used data from January 1, 1999, through December 31, 2017, from the Scientific Registry of Transplant Recipients to quantify the number of actual deceased organ donors, and from the Centers for Disease Control and Prevention Wide-ranging Online Data for Epidemiologic Research Detailed Mortality File to quantify the number of potential donors (individuals who died under conditions consistent with organ donation). Data were analyzed from December 2, 2019, to May 14, 2020.

Exposures: Race and ethnicity of deceased and potential donors.

Main Outcomes And Measures: For each racial/ethnic group and year, a donation ratio was calculated as the number of actual deceased donors divided by the number of potential donors. Direct age and sex standardization was used to allow for group comparisons, and Poisson regression was used to quantify changes in donation ratio over time.
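The donation ratio above is actual deceased donors divided by potential donors, directly standardized by age and sex. A hedged sketch of direct standardization, where the strata, counts, and standard-population weights are invented for illustration only:

```python
def standardized_ratio(strata, weights):
    """Directly standardized donation ratio.

    strata:  {stratum: (actual_donors, potential_donors)} for one group
    weights: {stratum: share of the standard population}, summing to 1
    """
    return sum(
        weights[s] * actual / potential
        for s, (actual, potential) in strata.items()
    )

# Two hypothetical age-sex strata for one racial/ethnic group:
strata = {"young_male": (30, 1000), "older_female": (10, 500)}
weights = {"young_male": 0.6, "older_female": 0.4}
print(round(standardized_ratio(strata, weights), 3))
```

Weighting every group by the same standard population makes the ratios comparable across racial/ethnic groups with different age and sex compositions, which is why the study can compare fold-changes between groups.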

Results: A total of 141 534 deceased donors and 5 268 200 potential donors were included in the analysis. Among Black individuals, the donation ratio increased 2.58-fold from 1999 to 2017 (yearly change in adjusted incidence rate ratio [aIRR], 1.05; 95% CI, 1.05-1.05; P < .001). This increase was significantly greater than the 1.60-fold increase seen in White individuals. Nevertheless, substantial racial differences remained, with Black individuals still donating at only 69% of the rate of White individuals in 2017 (P < .001). Among other racial minority populations, changes were less pronounced. Deceased organ donation increased 1.80-fold among American Indian/Alaska Native and 1.40-fold among Asian or Pacific Islander populations, with substantial racial differences remaining in 2017 (American Indian/Alaska Native donation at 28% and Asian/Pacific Islander donation at 85% of the rate of the White population). The difference in deceased organ donation between Hispanic/Latino and non-Hispanic/Latino populations increased over time (Hispanic/Latino donation was 4% lower in 2017).

Conclusions And Relevance: The findings of this cohort study suggest that differences in deceased organ donation between White and some racial minority populations have attenuated over time. The greatest gains were observed among Black individuals, who have been the primary targets of study and intervention. Despite improvements, substantial differences remain, suggesting that novel approaches are needed to understand and address relatively lower rates of deceased organ donation among all racial minorities.
April 2021

Outcomes After Declining a Steatotic Donor Liver for Liver Transplant Candidates in the United States.

Transplantation 2020 Aug;104(8):1612-1618

Department of Surgery, Johns Hopkins University School of Medicine, Baltimore, MD.

Background: Steatotic donor livers (SDLs, ≥30% macrosteatosis on biopsy) are often declined, as they are associated with a higher risk of graft loss, even though candidates may wait an indefinite time for a subsequent organ offer. We sought to quantify outcomes for transplant candidates who declined or accepted an SDL offer.

Methods: We used Scientific Registry of Transplant Recipients offer data from 2009 to 2015 to compare outcomes of 759 candidates who accepted an SDL to 13 362 matched controls who declined and followed candidates from the date of decision (decline or accept) until death or end of study period. We used a competing risk framework to understand the natural history of candidates who declined and Cox regression to compare postdecision survival after declining versus accepting (ie, what could have happened if candidates who declined had instead accepted).

Results: Among those who declined an SDL, only 53.1% of candidates were subsequently transplanted, 23.8% died, and 19.4% were removed from the waitlist. Candidates who accepted had a brief perioperative risk period within the first month posttransplant (adjusted hazard ratio [aHR] 3.49; 95% CI, 2.49-4.89; P < 0.001), but a 62% lower mortality risk beyond this (aHR 0.38; 95% CI, 0.31-0.46; P < 0.001). Although the long-term survival benefit of acceptance did not vary by candidate model for end-stage liver disease (MELD) score, the short-term risk period did. MELD 6-21 candidates who accepted an SDL had a 7.88-fold higher mortality risk in the first month posttransplant (aHR 7.88; 95% CI, 4.80-12.93; P < 0.001), whereas MELD 35-40 candidates had a 68% lower mortality risk (aHR 0.32; 95% CI, 0.11-0.90; P = 0.03).

Conclusions: Appropriately selected SDLs can decrease wait time and provide substantial long-term survival benefit for liver transplant candidates.
August 2020

Living kidney donation in individuals with hepatitis C and HIV infection: rationale and emerging evidence.

Curr Transplant Rep 2019 Jun 30;6(2):167-176. Epub 2019 Apr 30.

Department of Medicine, Johns Hopkins University School of Medicine, Baltimore, MD.

Purpose Of Review: HIV-infected (HIV+) and hepatitis C virus-infected (HCV+) individuals with end-stage renal disease (ESRD) have decreased access to kidney transplantation. With new opportunities provided by the HIV Organ Policy Equity (HOPE) Act and direct-acting antivirals (DAAs) for HCV, we explore the potential risks and benefits of living donor kidney transplantation from HIV+ or HCV+ donors, from the perspective of both donor health and recipient outcomes.

Recent Findings: The HOPE Act permits organ donation from both deceased and living HIV+ persons to HIV+ recipients; however, there is only clinical experience with HIV+ deceased donors to date. Empirical evidence demonstrates a low but acceptable risk of ESRD in potential HIV+ living donors without comorbidities who have well-controlled infection in the absence of donation. With the availability of potent DAAs for eradication of HCV infection, growing evidence shows good outcomes with HCV seropositive and/or viremic deceased kidney donors, providing rationale to consider HCV+ living donors.

Summary: HIV+ and HCV+ living donor kidney transplantation may improve access to transplant for vulnerable ESRD populations. Careful evaluation and monitoring are warranted to mitigate potential risks to donors and recipients.
June 2019

The true risk of living kidney donation.

Curr Opin Organ Transplant 2019 Aug;24(4):424-428

Department of Surgery, Division of Transplantation, Johns Hopkins School of Medicine.

Purpose Of Review: The safety of living donor nephrectomy is essential to the continued success, growth, and sustainability of the clinical practice of living donor kidney transplantation. This review summarizes recent advances in our understanding of the perioperative and long-term risks faced by living kidney donors.

Recent Findings: Although serious perioperative complications are extremely rare, donors, particularly those who are male, Black, or obese, frequently experience minor complications that delay their return to normal duties at home and work. Similarly, although long-term complications such as end-stage renal disease (ESRD) are rare, recent studies suggest a relative increase in ESRD risk that is attributable to donation. Several risk calculators have been developed to help donors and their care providers quantify baseline and postdonation ESRD risk based on demographic and health characteristics. Thresholds of risk may help define an acceptable level of risk to the donor and the transplant center.

Summary: Individualized risk calculators now allow care providers and potential donors to objectively and transparently participate in shared decision-making about the safety of living kidney donation.
August 2019

Optimal duration of postoperative helmet therapy following endoscopic strip craniectomy for sagittal craniosynostosis.

J Neurosurg Pediatr 2018 Dec;22(6):610-615

Division of Pediatric Neurosurgery, Johns Hopkins University School of Medicine, Baltimore, Maryland.

Objective: Many infants with sagittal craniosynostosis undergo effective surgical correction with endoscopic strip craniectomy (ESC) and postoperative helmet therapy (PHT). While PHT is essential to achieving optimal cosmesis following ESC, there has been little comprehensive analysis of the ideal PHT duration needed to attain this goal.

Methods: The authors retrospectively reviewed the charts of infants undergoing ESC and PHT for sagittal synostosis at their institution between 2008 and 2015. Data collected included age at surgery, follow-up duration, and PHT duration. Cephalic index (CI) was evaluated preoperatively (CIpre), at its peak level (CImax), at termination of helmet therapy (CIoff), and at last follow-up (CIfinal). A multivariate regression analysis was performed to determine factors influencing CIfinal.

Results: Thirty-one patients (27 male, 4 female) were treated in the study period. The median age at surgery was 2.7 months (range 1.6-3.2) and the median duration of PHT was 10.4 months (range 8.4-14.4). The mean CImax was 0.83 (SD 0.01), attained an average of 8.4 months (SD 1.2) after PHT initiation. At last follow-up, CI had regressed among all patients to a mean CIfinal of 0.78 (SD 0.01). Longer helmet duration after achieving CImax did not correlate with higher CIfinal values. While CImax was a significant predictor of CIfinal, neither age at surgery nor CIpre was predictive of the final outcome.

Conclusions: Patients undergoing ESC and PHT for sagittal synostosis reach a peak CI around 7 to 9 months after surgery. PHT beyond CImax does not improve final anthropometric outcomes. CIfinal depends significantly on CImax, but not on age or CIpre. These results imply that helmet removal at CImax may be appropriate for ESC patients, as helmeting beyond the peak does not change the final outcome.
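The cephalic index used above is a head width-to-length ratio, reported in this abstract as a fraction (e.g. CImax 0.83). A trivial sketch with invented measurements, just to make the metric concrete:

```python
def cephalic_index(width_mm, length_mm):
    """Cephalic index as a fraction: maximal head width / head length."""
    return width_mm / length_mm

# Invented measurements for a scaphocephalic (long, narrow) head before
# therapy and a rounder head afterward:
print(round(cephalic_index(105, 150), 2))  # narrow head
print(round(cephalic_index(120, 145), 2))  # after correction
```

A rising CI toward ~0.83 reflects the head rounding out; the study's question is how long helmeting must continue once that peak is reached.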
December 2018

The impact of gender on cardiovascular system calcification in very elderly patients with severe aortic stenosis.

Int J Cardiovasc Imaging 2016 Jan 29;32(1):173-9. Epub 2015 Aug 29.

Cardiovascular Division, Department of Medicine, Perelman School of Medicine of the University of Pennsylvania, 3400 Spruce Street, Philadelphia, PA, 19104, USA.

There is an established sex difference in cardiovascular disease between pre-menopausal women and age-matched men, with men having greater susceptibility to cardiovascular and coronary artery disease. Cardiovascular calcification may be linked to the atherosclerotic process and resulting disease, but the sex difference in coronary artery disease susceptibility and calcification is incompletely understood. We sought to measure calcium volume in different chest vascular beds in very elderly men and women with severe aortic stenosis (AS). Computed tomography scans of 94 patients with severe AS were scored for calcium volume on Aquarius iNtuition Terarecon (Terarecon Inc., CA, USA) workstations. The coronary beds, aortic valve, mitral valve apparatus, and thoracic aorta were examined. A significant sex difference in the mean total calcium volume of the coronary arteries was found in these elderly patients (p = 0.001), with men having greater levels of calcification. There was also a significant sex difference in the amount of aortic valve calcium (p = 0.003). Furthermore, aortic and coronary calcification was independently correlated with sex. This study demonstrates a significant effect of sex on calcification in the coronary beds and aortic valve in elderly patients with severe AS.
January 2016