Publications by authors named "Scott A Rega"

8 Publications

Sex and Gender Disparities in Pretransplant Characteristics and Relationships with Postoperative Outcomes in Liver Transplant Recipients with Alcoholic Liver Disease.

Exp Clin Transplant 2020 Nov 16;18(6):701-706. Epub 2020 Jun 16.

From the Division of Hepatobiliary Surgery and Liver Transplantation, Section of Surgical Sciences, Vanderbilt University Medical Center, Nashville, Tennessee, USA.

Objectives: Previous studies of liver transplant recipients have reported sex- and gender-based differences but have focused on pretransplant outcomes. Female candidates are less likely to receive a liver transplant and more likely to die or be delisted than their male counterparts. Here, we examined differences between men and women with alcoholic liver disease before liver transplant and the effects of these differences on posttransplant survival.

Materials And Methods: We analyzed the Scientific Registry of Transplant Recipients records of adult, deceased-donor, whole liver transplant recipients with decompensated alcoholic liver disease from 2002 to 2017 to evaluate the effects of gender on survival in 2 alcoholic liver disease cohorts: (a) including and (b) excluding recipients with additional diagnoses. Pretransplant characteristics were compared using chi-square or t tests. Kaplan-Meier and multivariable proportional hazards regression models were used to evaluate the main and covariable-adjusted effects of gender on survival.
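The analysis above names two standard techniques, Kaplan-Meier estimation and multivariable Cox proportional hazards regression. As a rough, generic illustration of that workflow (not the authors' actual SRTR analysis), a minimal sketch in Python using the lifelines package might look like the following; all column names are hypothetical placeholders.

```python
# Generic Kaplan-Meier + Cox proportional hazards sketch (lifelines).
# Column names (months_to_event, graft_loss, gender, meld, dialysis) are
# hypothetical placeholders, not SRTR field names.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.read_csv("ald_recipients.csv")           # hypothetical cohort extract
df["female"] = (df["gender"] == "F").astype(int)

# Unadjusted survival curves by gender
km = KaplanMeierFitter()
for label, grp in df.groupby("gender"):
    km.fit(grp["months_to_event"], event_observed=grp["graft_loss"], label=label)
    print(label, "median survival:", km.median_survival_time_)

# Covariable-adjusted effect of gender on graft survival
cph = CoxPHFitter()
cph.fit(df[["months_to_event", "graft_loss", "female", "meld", "dialysis"]],
        duration_col="months_to_event", event_col="graft_loss")
cph.print_summary()                              # hazard ratios with 95% CIs
```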

Results: Of 13781 transplant recipients with decompensated end-stage liver disease, defined by a Model for End-Stage Liver Disease score ≥ 15, 10924 (79%) were men and 2857 (21%) were women. Women had higher Model for End-Stage Liver Disease scores and higher rates of stage 4 and 5 chronic kidney disease and were more likely to be on dialysis or ventilator support at the time of transplant (all P < .05). Among all recipients, and after adjusting for risk factors, men were approximately 9% more likely than women to experience long-term graft loss (hazard ratio = 1.093; 95% confidence interval, 1.00-1.19; P = .043). However, sex was not associated with risk of graft loss among those without additional diagnoses (hazard ratio = 1.09; 95% confidence interval, 0.99-1.21; P = .095).

Conclusions: Although women with alcoholic liver disease who undergo liver transplant have higher severity of illness than their male counterparts, long-term outcomes are comparable.
Source: http://dx.doi.org/10.6002/ect.2020.0063
November 2020

Comparison of Wait-List Mortality Between Cholangiocarcinoma and Hepatocellular Carcinoma Liver Transplant Candidates.

Liver Transpl 2020 Sep 21;26(9):1112-1120. Epub 2020 Jul 21.

Department of Surgery, Division of Hepatobiliary Surgery and Liver Transplantation, Vanderbilt University Medical Center, Nashville, TN.

Despite the divergent disease biology of cholangiocarcinoma (CCA) and hepatocellular carcinoma (HCC), wait-list prioritization is identical for both diagnoses. We compared wait-list and posttransplant outcomes between CCA and HCC liver transplantation patients with Model for End-Stage Liver Disease exceptions using Scientific Registry of Transplant Recipients data. The 408 CCA candidates listed between 2003 and mid-2017 were matched to 2 HCC cohorts by listing date (±2 months, n = 816) and by Organ Procurement and Transplantation Network (OPTN) region and date (±6 months, n = 408). Cumulative incidence competing risk regression examined the effects of diagnosis, OPTN region, and center-level CCA listing volume on wait-list removal due to death/being too ill (dropout). Cox models evaluated the effects of diagnosis, OPTN region, center-level CCA volume, and waiting time on graft failure among deceased donor liver transplantation (DDLT) recipients. After adjusting for OPTN region and CCA listing volume (all P ≥ 0.07), both HCC cohorts had a reduced likelihood of wait-list dropout compared with CCA candidates (HCC with period matching only: subdistribution hazard ratio [SHR] = 0.63; 95% CI, 0.43-0.93; P = 0.02 and HCC with OPTN region and period matching: SHR = 0.60; 95% CI, 0.41-0.87; P = 0.007). The cumulative incidence rates of wait-list dropout at 6 and 12 months were 13.2% (95% CI, 10.0%-17.0%) and 23.9% (95% CI, 20.0%-29.0%) for CCA candidates, 7.3% (95% CI, 5.0%-10.0%) and 12.7% (95% CI, 10.0%-17.0%) for HCC candidates with region and listing date matching, and 7.1% (95% CI, 5.0%-9.0%) and 12.6% (95% CI, 10.0%-15.0%) for HCC candidates with listing date matching only. Additionally, HCC DDLT recipients had a 57% reduced risk of graft failure compared with CCA recipients (P < 0.001). Waiting time was unrelated to graft failure (P = 0.57), and there was no waiting time by diagnosis cohort interaction effect (P = 0.47). When identically prioritized, LT candidates with CCA have increased wait-list dropout compared with those with HCC. More granular data are necessary to discern ways to mitigate this wait-list disadvantage and improve survival for patients with CCA.
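The wait-list analysis above rests on cumulative incidence estimation in the presence of a competing risk (transplantation competes with dropout). A minimal, generic sketch of that idea is shown below using the lifelines Aalen-Johansen estimator; the subdistribution-hazard (Fine-Gray) regression that produces the SHRs quoted above is more commonly fit in R (for example with cmprsk::crr) and is not reproduced here. All column names are hypothetical.

```python
# Cumulative incidence of wait-list dropout with transplantation as a
# competing event (Aalen-Johansen estimator from lifelines).
# Column names are hypothetical placeholders, not SRTR field names.
import pandas as pd
from lifelines import AalenJohansenFitter

wl = pd.read_csv("waitlist_candidates.csv")
# event coding: 0 = censored, 1 = death/too-ill dropout, 2 = transplanted
ajf = AalenJohansenFitter()
for dx, grp in wl.groupby("diagnosis"):          # e.g., "CCA" vs "HCC"
    ajf.fit(grp["months_on_list"], grp["event"], event_of_interest=1)
    # cumulative incidence of dropout over time for this diagnosis group
    print(dx)
    print(ajf.cumulative_density_.tail())
```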
Source: http://dx.doi.org/10.1002/lt.25807
September 2020

Geographic Disparities in Access to Simultaneous Pancreas and Kidney Transplant in the Pre- and Post-Pancreas Allocation System Eras.

Transplantation 2020 Mar;104(3):623-631.

Division of Kidney and Pancreas Transplantation, Department of Surgery, Vanderbilt University Medical Center, Nashville, TN.

Background: The 2014 pancreas allocation system (PAS) was intended to decrease geographic variability in listing practices for simultaneous pancreas and kidney (SPK) transplant and to define eligibility criteria for candidates with type 2 diabetes mellitus (T2DM). Our primary aims were to evaluate geographic disparities in access to SPK and to assess T2DM SPK listings in the pre- and post-PAS eras.

Methods: Adult listings for SPK and kidney transplant (pre-PAS, January 2010 to October 29, 2014; post-PAS, October 30, 2014, to June 2, 2017) were identified in the Scientific Registry of Transplant Recipients. Multivariable logistic regression models tested associations of geography and/or diabetes mellitus type on the likelihood of SPK versus kidney transplant listing pre- and post-PAS. Competing risk models tested the likelihood of SPK transplantation within 2 years of listing for SPK.
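As a generic illustration of the kind of multivariable logistic model described above (likelihood of SPK versus kidney-alone listing, with an era-by-region interaction), the sketch below uses the statsmodels formula interface; the variable names are hypothetical placeholders rather than SRTR field names, and the actual covariate set in the paper may differ.

```python
# Logistic regression for SPK vs kidney-alone listing with an era-by-region
# interaction term (statsmodels). Variable names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

listings = pd.read_csv("spk_kidney_listings.csv")
# spk_listing: 1 = listed for SPK, 0 = kidney-alone listing
model = smf.logit(
    "spk_listing ~ C(era) * C(region) + C(diabetes_type) + age",
    data=listings,
).fit()
print(model.summary())   # era-by-region terms capture the geographic disparity
```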

Results: Among 41 205 listings (27 393 pre-PAS; 24 439 T2DM), univariate analysis showed reduced percentages for SPK post-PAS (22.1%-20.8%; P = 0.003). After adjusting for patient and center characteristics, geographic disparities declined slightly but persisted post-PAS (era by region interaction P < 0.001). The era by type of diabetes mellitus interaction effect was statistically significant (P = 0.039), reflecting that the proportions of SPK listings for T2DM increased in the post-PAS era (3.4%-3.9%; univariate P = 0.038), while those for type 1 diabetes mellitus remained statistically stable (47.9%-48.4%; univariate P = 0.571). Among people listed for SPK, geographic disparities in the cumulative incidence of transplantation within 2 years declined and the overall likelihood of transplantation increased in the post-PAS era (both P < 0.001).

Conclusions: Geographic disparities in access to SPK declined slightly but persisted post-PAS. With new allocation change proposals and elimination of listing criteria for T2DM, further monitoring is warranted.
Source: http://dx.doi.org/10.1097/TP.0000000000002850
March 2020

The Changing Face of Liver Transplantation in the United States: The Effect of HCV Antiviral Eras on Transplantation Trends and Outcomes.

Transplant Direct 2019 Mar 20;5(3):e427. Epub 2019 Feb 20.

Division of Hepatobiliary Surgery and Liver Transplantation, Section of Surgical Sciences, Vanderbilt University Medical Center, Nashville, TN.

Background: Hepatitis C virus (HCV) cirrhosis is the leading indication for liver transplantation in the United States, although nonalcoholic steatohepatitis (NASH) is on the rise. Increasingly effective HCV antivirals are available, but their association with diagnosis-specific liver transplantation rates and early graft survival is not known.

Methods: The Scientific Registry of Transplant Recipients database records were retrospectively stratified by HCV antiviral era: interferon (2003-2010), protease inhibitors (2011-2013), and direct-acting antivirals (2014 to present). Kaplan-Meier, χ², and multivariable Cox proportional hazards regression models evaluated the effects of antiviral era and etiology of liver disease on transplantation rates and graft survival over 3 years.

Results: Liver transplants for HCV decreased (35.3% to 23.6%), whereas those for NASH and alcoholic liver disease increased (5.8% to 16.5% and 15.6% to 24.0%, respectively) with each advancing era (all P < 0.05). Early graft survival improved with each advancing era for HCV but not for hepatitis B virus, NASH, or alcoholic liver disease (multivariable model era by diagnosis interaction P < 0.001). Era-specific multivariable models demonstrated that the risk of early graft loss for NASH was 22% lower than for HCV in the interferon era (hazard ratio, 0.78; 95% confidence interval, 0.64-0.96; P = 0.02), but risks associated with these diagnoses did not differ significantly in the protease inhibitor (P = 0.06) or direct-acting antiviral (P = 0.08) eras.

Conclusions: The increasing effectiveness of HCV antivirals corresponds with decreased rates of liver transplantation for HCV and improved early graft survival. As rates of liver transplant for NASH continue to increase, focus will be needed on prevention and effective therapies for this disease.
Source: http://dx.doi.org/10.1097/TXD.0000000000000866
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6411219
March 2019

A2 to B Kidney Transplantation in the Post-Kidney Allocation System Era: A 3-year Experience with Anti-A Titers, Outcomes, and Cost.

J Am Coll Surg 2019 Apr 30;228(4):635-641. Epub 2019 Jan 30.

Division of Kidney and Pancreas Transplantation, Department of Surgery, Vanderbilt University Medical Center, Nashville, TN.

Background: The new kidney allocation system (KAS) instituted in December 2014 permitted A2 to B deceased donor kidney transplantation (DDKTx) to improve access and reduce disparities in wait time for minorities. A recent United Network for Organ Sharing (UNOS) analysis, however, indicated that only 4.5% of B candidates were registered for A2 kidneys. Cited barriers to A2 to B DDKTx include titer thresholds, patient eligibility, and increased costs. Few data have been published on post-transplantation anti-A titers or outcomes of A2 to B DDKTx since this allocation change.

Study Design: We conducted a retrospective, single-center cohort analysis of 29 consecutive A2 to B and 50 B to B DDKTx from December 2014 to December 2017. Pre- and postoperative anti-A titers were monitored prospectively. Outcomes included post-transplant anti-A titers, patient and graft survival, renal function, and hospital costs.

Results: African Americans comprised 72% of the A2 to B group and 60% of the B to B group. There was no difference in mean wait time (58.8 vs 70.8 months). Paired tests indicated that anti-A IgG titers in A2 to B DDKTx were increased at discharge (p = 0.001) and at 4 weeks (p = 0.037). There were no significant differences in patient or graft survival, serum creatinine (SCr), or estimated glomerular filtration rate (eGFR), but the trajectories of SCr and eGFR differed between groups over the follow-up period. The A2 to B group had significantly higher mean total transplant hospital costs ($114,638 vs $91,697, p < 0.001) and hospital costs net of organ acquisition costs ($42,356 vs $20,983, p < 0.001).

Conclusions: Initial experience under KAS shows comparable outcomes for A2 to B vs B to B DDKTx. Anti-A titers increased significantly post-transplantation, but did not adversely affect outcomes. Hospital costs were significantly higher with A2 to B DDKTx. Transplant programs, regulators, and payors will need to weigh improved access for minorities with increased costs.
Source: http://dx.doi.org/10.1016/j.jamcollsurg.2018.12.023
April 2019

Long-Term Physical HRQOL Decreases After Single Lung as Compared With Double Lung Transplantation.

Ann Thorac Surg 2018 Dec 16;106(6):1633-1639. Epub 2018 Aug 16.

Department of Thoracic Surgery, Vanderbilt University Medical Center, Nashville, Tennessee; Transplant Center, Vanderbilt University Medical Center, Nashville, Tennessee.

Background: Single lung transplantation (SLT) and double lung transplantation (DLT) are associated with differences in morbidity and mortality, although the effects of transplant type on patient-reported outcomes are not widely reported and conclusions have differed. Previous studies compared mean health-related quality of life (HRQOL) scores but did not evaluate potentially different temporal trajectories in the context of longitudinal follow-up. To address this uncertainty, this study was designed to evaluate longitudinal HRQOL after SLT and DLT with the hypothesis that temporal trajectories differ between SLT and DLT.

Methods: Patients transplanted at a single institution were eligible to be surveyed at 1 month, 3 months, 6 months, and then annually after transplant using the Short Form 36 Health Survey, with longitudinal physical component summary (PCS) and mental component summary (MCS) scores as the primary outcomes. Multivariable mixed-effects models were used to evaluate the effects of transplant type and time posttransplant on longitudinal PCS and MCS after adjusting for age, diagnosis, rejection, Lung Allocation Score (LAS) quartile, and intubation duration. Time by transplant type interaction effects were used to test whether the temporal trajectories of HRQOL differ between SLT and DLT recipients. HRQOL scores were referenced to general population norms (range, 40 to 60; mean, 50 ± 10) using accepted standards for a minimally important difference (½ SD, 5 points).
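For readers unfamiliar with the modeling approach, a minimal sketch of a linear mixed-effects model with a time-by-transplant-type interaction, in the spirit of the Methods above, is given below using statsmodels; the column names and covariate coding are hypothetical and this is not the authors' exact specification.

```python
# Linear mixed-effects model for longitudinal PCS scores with a random
# intercept per patient and a time-by-transplant-type interaction (statsmodels).
# Column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

surveys = pd.read_csv("sf36_followup.csv")       # one row per completed survey
m = smf.mixedlm(
    "pcs ~ months_post_tx * C(transplant_type) + age + C(diagnosis) + C(rejection)",
    data=surveys,
    groups=surveys["patient_id"],                # random intercept per patient
).fit()
print(m.summary())   # the interaction term tests whether trajectories differ
```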

Results: Postoperative surveys (n = 345) were analyzed for 136 patients (52% male, 23% SLT, age 52 ± 13 years, LAS 42 ± 12, follow-up 37 ± 29 months [range, 0.6 to 133]) who underwent lung transplantation between 2005 and 2016. After adjusting for model covariates, overall posttransplant PCS scores showed a significant downward trajectory (p = 0.015), whereas MCS scores remained stable (p = 0.593), with both averaging within general population norms. The time by transplant type interaction effect (p = 0.002), however, indicates that the posttransplant PCS scores of SLT recipients decline by 2.4 points per year relative to those of DLT recipients over the observation period. At approximately 60 months, the PCS scores of SLT recipients, but not DLT recipients, fall below general population norms.

Conclusions: The trajectory of physical HRQOL in patients receiving SLT declines over time compared with DLT, indicating that, in the longer term, SLT recipients are more likely to have physical HRQOL scores that fall substantively below general population norms. Physical HRQOL after 5 years may be a consideration for lung allocation and patient counseling regarding expectations when recommending SLT or DLT.
Source: http://dx.doi.org/10.1016/j.athoracsur.2018.06.072
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6240480
December 2018

Directed solutions to address differences in access to liver transplantation.

Am J Transplant 2018 Nov 24;18(11):2670-2678. Epub 2018 May 24.

Department of Surgery and the Transplant Center, Vanderbilt University Medical Center, Nashville, TN, USA.

The United Network for Organ Sharing recently altered liver allocation with the goal of decreasing Model for End-Stage Liver Disease (MELD) variance at transplant. Concerns over these and further planned revisions to policy include a predicted decrease in total transplants, increased flying and logistical complexity, adverse impact on areas with poor-quality health care, and minimal effect on high-MELD donor service areas. To address these issues, we describe general approaches to equalize critical transplant metrics among regions and determine how they alter MELD variance at transplant and organ supply to underserved communities. We show that an allocation system that increases the minimum MELD for local allocation, or that preferentially directs organs into areas of need, decreases MELD variance. Both models have minimal adverse effects on flying and total transplants and do not disproportionately disadvantage already underserved communities. When combined, these approaches decrease MELD variance by 28%, more than the recently adopted proposal. These models can be adapted for any measure of variance, can be combined with other proposals, and can be configured to adjust automatically to changes in disease incidence, as is occurring with hepatitis C and nonalcoholic fatty liver disease.
Source: http://dx.doi.org/10.1111/ajt.14889
November 2018

Increasing kidney donor profile index sequence does not adversely affect medium-term health-related quality of life after kidney transplantation.

Clin Transplant 2018 Apr 30;32(4):e13212. Epub 2018 Mar 30.

Department of Surgery, Division of Kidney and Pancreas Transplantation, Vanderbilt University Medical Center, Nashville, TN, USA.

Background: The United Network for Organ Sharing system allocates deceased donor kidneys based on the kidney donor profile index (KDPI), stratified as sequences (A: ≤ 20%; B: > 20% to < 35%; C: ≥ 35% to ≤ 85%; D: > 85%), with increasing KDPI associated with decreased graft survival. While health-related quality of life (HRQOL) may improve after transplantation, the effect of donor kidney quality, reflected by KDPI sequence, on post-transplant HRQOL has not been reported.

Methods: Health-related quality of life was measured using the eight scales and the physical and mental component summaries (PCS, MCS) of the SF-36 Health Survey. Multivariable mixed-effects models, adjusted for age, gender, rejection, and previous transplant, and analysis of variance methods tested the effects of time and KDPI sequence on post-transplant HRQOL.

Results: A total of 141 waitlisted adults and 505 recipients (>1700 observations) were included. Pretransplant PCS and MCS averaged, respectively, slightly below and within general population norms (GPN; 40-60). At 31 ± 26 months post-transplant, average PCS (41 ± 11) and MCS (51 ± 11), overall and within each KDPI sequence, were within GPN. KDPI sequence was not related to post-transplant HRQOL (P > .134) or its trajectory (interaction P > .163).

Conclusion: Increasing KDPI does not adversely affect the medium-term values and trajectories of HRQOL after kidney transplantation. This may reassure patients and centers when considering using high KDPI kidneys.
Source: http://dx.doi.org/10.1111/ctr.13212
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5933873
April 2018