Publications by authors named "Linda Jennings"

48 Publications

Non-HLA Autoantibodies at 1 Year Negatively Affect 5-Year Native Renal Function in Liver Transplant Recipients.

Transplant Proc 2021 Apr 10;53(3):1019-1024. Epub 2021 Feb 10.

Charité-Universitätsmedizin Berlin, corporate member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Nephrology and Intensive Care Medicine, Campus Virchow Klinikum, Berlin, Germany; Berlin Institute of Health (BIH), Berlin, Germany.

Background: Angiotensin II type-1 receptor (AT1R) and endothelin-1 type A receptor (ETAR) autoantibodies can, beyond contributing to allograft injury, bind native endothelial cells and cause vasoconstriction and fibrosis progression in nontransplanted organs. Therefore, we investigated long-term native renal function in liver transplant (LT) recipients with and without anti-AT1R-Abs and/or anti-ETAR-Abs present in serum.

Methods: Primary LT recipients at our single center from January 2000 to April 2009 had their prospectively collected pre-LT (1269 patients) and year 1 post-LT (795 patients) serum tested retrospectively for anti-AT1R-Abs and/or anti-ETAR-Abs. Testing was performed with a standardized solid-phase assay in which >10 U was considered positive.

Results: Pretransplant anti-AT1R-Abs and/or anti-ETAR-Abs did not change the median delta creatinine from pretransplant to 1 year post-transplant. In multivariable analysis controlling for diabetes (DM) and calcineurin inhibitor (CNI) use, anti-AT1R-Abs and/or anti-ETAR-Abs at 1 year remained significantly associated with a decline in GFR (by the Modification of Diet in Renal Disease-6 equation) from years 1-5 post-LT (P = .04). In diabetic patients, the decline in renal function was more pronounced with (-9.29 mL/min) than without (-2.28 mL/min) anti-AT1R-Abs and/or anti-ETAR-Abs at year 1 (P = .004).

Conclusion: At 1 year post-LT, anti-AT1R-Abs and/or anti-ETAR-Abs are associated in multivariable analysis with an increased risk of native renal function decline, especially in diabetic patients.
Source
http://dx.doi.org/10.1016/j.transproceed.2021.01.013

Hypomagnesemia and risk of post-transplant lymphoproliferative disorder in liver transplant recipients.

Transpl Int 2020 Dec 9;33(12):1835-1836. Epub 2020 Nov 9.

Infections and Immunoepidemiology Branch, Division of Cancer Epidemiology and Genetics, National Cancer Institute, Rockville, MD, USA.

Source
http://dx.doi.org/10.1111/tri.13735
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7902346

MELD-GRAIL-Na: Glomerular Filtration Rate and Mortality on Liver-Transplant Waiting List.

Hepatology 2020 May 29;71(5):1766-1774. Epub 2020 Jan 29.

Baylor University Medical Center, Dallas, TX.

Background And Aims: Among patients with cirrhosis awaiting liver transplantation, wait-list (WL) mortality is predicted by the Model for End-Stage Liver Disease-Sodium (MELD-Na) score. Replacing serum creatinine (SCr) with estimated glomerular filtration rate (eGFR) in the MELD-Na score may improve prediction of WL mortality, especially for women and patients with the highest disease severity.

Approach And Results: We developed (2014) and validated (2015) a model incorporating eGFR, using national data (n = 17,095), to predict WL mortality. Glomerular filtration rate (GFR) was estimated using the GFR assessment in liver disease (GRAIL) model developed among patients with cirrhosis. Multivariate Cox proportional hazards models were used to compare the predicted 90-day WL mortality of MELD-GRAIL-Na (re-estimated bilirubin, international normalized ratio [INR], sodium, and GRAIL) with that of MELD-Na. Within 3 months, 27.8% of patients were transplanted, 4.3% died on the WL, and 4.7% were delisted for other reasons. GFR as estimated by GRAIL (hazard ratio [HR] 0.382, 95% confidence interval [CI] 0.344-0.424) and the re-estimated model MELD-GRAIL-Na (HR 1.212, 95% CI 1.199-1.224) were significant predictors of mortality or delisting on the WL within 3 months. MELD-GRAIL-Na was a better predictor of observed mortality at the highest deciles of disease severity (scores 27-40). For a score of 32 or higher (observed mortality 0.68), predicted mortality was 0.67 with MELD-GRAIL-Na and 0.51 with MELD-Na. For women with a score of 32 or higher (observed mortality 0.67), predicted mortality was 0.69 with MELD-GRAIL-Na and 0.55 with MELD-Na. In 2015, use of MELD-GRAIL-Na as compared with MELD-Na resulted in reclassification of 16.7% (n = 672) of patients on the WL.

Conclusion: Incorporation of eGFR likely captures true GFR better than SCr, especially among women. Use of MELD-GRAIL-Na instead of MELD-Na may affect outcomes for the 12%-17% of patients awaiting transplant and alter organ allocation.
Source
http://dx.doi.org/10.1002/hep.30932

Discovery and Validation of a Biomarker Model (PRESERVE) Predictive of Renal Outcomes After Liver Transplantation.

Hepatology 2020 May 28;71(5):1775-1786. Epub 2020 Jan 28.

Northwestern University Feinberg School of Medicine, Chicago, IL.

Background And Aims: A high proportion of patients develop chronic kidney disease (CKD) after liver transplantation (LT). We aimed to develop clinical/protein models to predict future glomerular filtration rate (GFR) deterioration in this population.

Approach And Results: In independent multicenter discovery (CTOT14) and single-center validation (BUMC) cohorts, we analyzed kidney injury proteins in serum/plasma samples at month 3 after LT in recipients with preserved GFR who demonstrated subsequent GFR deterioration versus preservation by year 1 and year 5 in the BUMC cohort. In CTOT14, we also examined correlations between serial protein levels and GFR over the first year. A month 3 predictive model was constructed from clinical and protein level variables using the CTOT14 cohort (n = 60). Levels of β-2 microglobulin and CD40 antigen and presence of hepatitis C virus (HCV) infection predicted early (year 1) GFR deterioration (area under the curve [AUC], 0.814). We observed excellent validation of this model (AUC, 0.801) in the BUMC cohort (n = 50) who had both early and late (year 5) GFR deterioration. At an optimal threshold, the model had the following performance characteristics in CTOT14 and BUMC, respectively: accuracy (0.75, 0.8), sensitivity (0.71, 0.67), specificity (0.78, 0.88), positive predictive value (0.74, 0.75), and negative predictive value (0.76, 0.82). In the serial CTOT14 analysis, several proteins, including β-2 microglobulin and CD40, correlated with GFR changes over the first year.
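The performance characteristics quoted above (accuracy, sensitivity, specificity, positive and negative predictive value) follow the standard confusion-matrix definitions. A minimal sketch of those definitions, with hypothetical counts rather than the study's actual data:

```python
def diagnostics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Standard performance metrics of a binary predictive model,
    computed from confusion-matrix counts."""
    return {
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Hypothetical counts for illustration only:
m = diagnostics(tp=15, fp=5, tn=30, fn=5)
print(m["sensitivity"], m["specificity"])
```

These definitions explain why a model can have a higher specificity than sensitivity (as in the BUMC cohort) while still reaching similar overall accuracy.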

Conclusions: We have validated a clinical/protein model (PRESERVE) that early after LT can predict future renal deterioration versus preservation with high accuracy. This model may help select recipients at higher risk for subsequent CKD for early, proactive renal sparing strategies.
Source
http://dx.doi.org/10.1002/hep.30939
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7883482

External Validation of a Pretransplant Biomarker Model (REVERSE) Predictive of Renal Recovery After Liver Transplantation.

Hepatology 2019 Oct 28;70(4):1349-1359. Epub 2019 May 28.

Baylor University Medical Center, Dallas, TX.

In patients with end-stage liver disease, the ability to predict recovery of renal function following liver transplantation (LT) remains elusive. However, several important clinical decisions depend on whether renal dysfunction is recoverable after LT. We used a cohort of patients undergoing LT to independently validate a published pre-LT model predictive of post-transplant renal recovery (Renal Recovery Assessment at Liver Transplant [REVERSE]: high osteopontin [OPN] and tissue inhibitor of metalloproteinases-1 [TIMP-1] levels, age < 57, no diabetes). Serum samples pre-LT and 4-12 weeks post-LT (n = 117) were analyzed for kidney injury proteins from three groups of recipients: (1) estimated glomerular filtration rate (eGFR) < 30 mL/minute/1.73 m² prior to and after LT (irreversible acute kidney injury [AKI]), (2) eGFR < 30 mL/minute/1.73 m² prior to LT and > 50 mL/minute/1.73 m² after LT (reversible AKI [rAKI]), and (3) eGFR > 50 mL/minute/1.73 m² prior to and after LT (no AKI). In patients with elevated pre-LT serum levels of OPN and TIMP-1, recovery of renal function correlated with decreases in the levels of both proteins. At 4 weeks post-LT (n = 77 subset), the largest decline in OPN and TIMP-1 was seen in the rAKI group. Validation of the REVERSE model in this independent data set yielded a high area under the curve (0.78) in predicting full post-LT renal recovery (sensitivity 0.86, specificity 0.6, positive predictive value 0.81, negative predictive value 0.69). Our eGFR findings were confirmed using measured GFR. Conclusion: The REVERSE model, derived from an initial training set combining plasma biomarkers and clinical characteristics, demonstrated excellent external validation performance in an independent patient cohort using serum samples. Among patients with kidney injury pre-LT, the predictive ability of this model may prove beneficial in clinical decision-making both prior to and following transplantation.
Source
http://dx.doi.org/10.1002/hep.30667

A Model for Glomerular Filtration Rate Assessment in Liver Disease (GRAIL) in the Presence of Renal Dysfunction.

Hepatology 2019 Mar 20;69(3):1219-1230. Epub 2019 Feb 20.

Baylor University Medical Center, Dallas, TX.

Estimation of glomerular filtration rate (eGFR) in patients with liver disease is suboptimal in the presence of renal dysfunction. We developed a model for GFR assessment in liver disease (GRAIL) before and after liver transplantation (LT). GRAIL was derived using objective variables (creatinine, blood urea nitrogen, age, gender, race, and albumin) to estimate GFR based on the timing of measurement relative to LT and the degree of renal dysfunction (www.bswh.md/grail). The measured GFR (mGFR) by iothalamate clearance (n = 12,122, 1985-2015) at protocol time points before/after LT was used as reference. GRAIL was compared with the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) and Modification of Diet in Renal Disease (MDRD-4, MDRD-6) equations for mGFR < 30 mL/min/1.73 m². Prediction of development of chronic kidney disease (mGFR < 20 mL/min/1.73 m², initiation of chronic dialysis) and listing for or receipt of kidney transplantation within 5 years was examined in an internal cohort (n = 785) and an external validation cohort (n = 68,217, 2001-2015). GRAIL had less bias and was more accurate and precise than CKD-EPI, MDRD-4, and MDRD-6 at time points before/after LT for low GFR. For mGFR < 30 mL/min/1.73 m², the median difference (eGFR-mGFR) was 5.24 (9.65) mL/min/1.73 m² for GRAIL, compared with 8.70 (18.24) for CKD-EPI, 8.82 (17.38) for MDRD-4, and 6.53 (14.42) for MDRD-6. Before LT, GRAIL correctly classified 75% as having mGFR < 30 mL/min/1.73 m² versus 36.1% (CKD-EPI), 36.1% (MDRD-4), and 52.8% (MDRD-6) (P < 0.01). An eGFR < 30 mL/min/1.73 m² by GRAIL predicted development of CKD within 5 years after LT (26.9% versus 4.6% CKD-EPI, 5.9% MDRD-4, and 10.5% MDRD-6) in center data and need for kidney transplantation (48.3% versus 22.0% CKD-EPI, 23.1% MDRD-4, and 48.3% MDRD-6; P < 0.01) in national data. Conclusion: GRAIL may serve as an alternative model to estimate GFR among patients with liver disease before and after LT at low GFR.
Source
http://dx.doi.org/10.1002/hep.30321

Donor-specific Antibodies, Immunoglobulin-free Light Chains, and BAFF Levels in Relation to Risk of Late-onset PTLD in Liver Recipients.

Transplant Direct 2018 Jun 15;4(6):e353. Epub 2018 May 15.

Baylor Simmons Transplant Institute, Baylor University Medical Center, Dallas, TX.

Background: Posttransplant lymphoproliferative disorder (PTLD) is a neoplastic complication of transplantation, with early cases largely due to immunosuppression and primary Epstein-Barr virus infection. Etiology may differ for later-onset cases, but the contributions of immunosuppression, immune reactivity to the donor organ, and chronic B cell activation are uncertain.

Methods: We conducted a case-control study of late-onset PTLD (diagnosed >1 year posttransplant) in a cohort of liver recipients. We assessed serum samples (obtained >6 months before diagnosis in cases) from N = 60 cases and N = 166 matched controls for donor-specific antibodies (DSAs, evaluable for N = 221 subjects), immunoglobulin kappa and lambda free light chains (FLCs, N = 137), and B cell activating factor (BAFF, N = 226). Conditional or unconditional logistic regression was used to calculate adjusted odds ratios (aORs).

Results: Circulating DSAs were less common in PTLD cases than controls (18% vs 30%), although this difference was borderline significant (aOR, 0.51; 95% confidence interval [CI], 0.24-1.10; P = 0.09). Donor-specific antibodies against class II HLA antigens predominated and likewise showed a borderline inverse association with PTLD (aOR, 0.58; 95% CI, 0.27-1.24). The FLC levels were less frequently abnormal in cases than controls, but measurements were available for only a subset and confidence intervals were wide (elevated kappa: aOR, 0.57; 95% CI, 0.15-2.12; P = 0.40; elevated lambda: aOR, 0.68; 95% CI, 0.30-1.50; P = 0.34). B cell-activating factor levels were not associated with PTLD.

Conclusions: Our results suggest that circulating DSAs are associated with decreased risk of late-onset PTLD. Because DSAs may develop in the setting of underimmunosuppression, the inverse association with DSAs supports a role for immunosuppression in the etiology of late-onset PTLD.
Source
http://dx.doi.org/10.1097/TXD.0000000000000792
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6089512

Using herbarium-derived DNAs to assemble a large-scale DNA barcode library for the vascular plants of Canada.

Appl Plant Sci 2017 Dec 22;5(12). Epub 2017 Dec 22.

Centre for Biodiversity Genomics, University of Guelph, 50 Stone Road East, Guelph, Ontario N1G 2W1, Canada.

Premise Of The Study: Constructing complete, accurate plant DNA barcode reference libraries can be logistically challenging for large-scale floras. Here we demonstrate the promise and challenges of using herbarium collections for building a DNA barcode reference library for the vascular plant flora of Canada.

Methods: Our study examined 20,816 specimens representing 5076 of the 5190 vascular plant species in Canada (98%). For 98% of the specimens, at least one DNA barcode region was recovered from the plastid loci or the nuclear ITS2 region. We used beta regression to quantify the effects of age, type of preservation, and taxonomic affiliation (family) on DNA sequence recovery.

Results: Specimen age and method of preservation had significant effects on sequence recovery for all markers, but influenced some families more (e.g., Boraginaceae) than others (e.g., Asteraceae).

Discussion: Our DNA barcode library represents an unparalleled resource for metagenomic and ecological genetics research on temperate and arctic biomes. The observed decline in sequence recovery with specimen age may be associated with poor primer matches, intragenomic variation (for ITS2), or inhibitory secondary compounds in some taxa.
Source
http://dx.doi.org/10.3732/apps.1700079
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5749818

Non-HLA Antibodies Impact on C4d Staining, Stellate Cell Activation and Fibrosis in Liver Allografts.

Transplantation 2017 Oct;101(10):2399-2409

1 Annette C. & Harold C. Simmons Transplant Institute, Baylor University Medical Center, Dallas, TX. 2 Department of Pathology, University of Pittsburgh, Pittsburgh, PA. 3 Department of Nephrology and Critical Care Medicine, Charité, Berlin, Germany. 4 Terasaki Foundation Laboratory, Los Angeles, CA. 5 Celltrend, Luckenwalde, Germany.

Background: Recent data have shown an increased risk for rejection, fibrosis progression, and death in liver transplantation (LT) recipients with preformed or de novo HLA donor-specific alloantibodies (DSA). However, the role of non-HLA autoantibodies and the interaction between HLA DSA and non-HLA autoantibodies remains uncharacterized.

Methods: We analyzed 1269 primary LT recipients from January 2000 to April 2009 with known HLA DSA status for angiotensin II type-1 receptor and endothelin-1 type A receptor autoantibodies pre-LT and at year 1 post-LT.

Results: Preformed non-HLA autoantibodies alone did not impact outcomes. In multivariable modeling, the combination of preformed non-HLA autoantibodies and HLA DSA was associated with an increased risk for death (hazard ratio [HR], 1.66; P = 0.02), especially if the HLA DSA was of the IgG3 subclass (HR, 2.28; P = 0.01). A single de novo non-HLA autoantibody was associated with an increased risk for T cell-mediated rejection or antibody-mediated rejection (68% vs 41%, P = 0.01) and fibrosis progression (HR, 1.84; P = 0.02). Biopsies with de novo non-HLA autoantibodies revealed a new sinusoidal C4d staining pattern when compared with HLA DSA (71% vs 3%; P < 0.001). Liver sinusoidal endothelial cell activation and stellate cell activation were increased in patients with non-HLA autoantibodies in the location of C4d positivity.

Conclusions: A non-HLA autoantibody combined with a preformed HLA DSA is associated with an increased mortality risk. Isolated de novo anti-angiotensin II type-1 receptor and anti-endothelin-1 type A receptor autoantibodies are associated with an increased risk of rejection and fibrosis progression. The novel location of C4d staining in proximity to liver sinusoidal endothelial cell capillarization and stellate cell activation demonstrates allograft injury in proximity to non-HLA autoantibody binding.
Source
http://dx.doi.org/10.1097/TP.0000000000001853

Chronic AMR in Liver Transplant: Validation of the 1-Year cAMR Score's Ability to Determine Long-term Outcome.

Transplantation 2017 Sep;101(9):2062-2070

1 Annette C. & Harold C. Simmons Transplant Institute, Baylor University Medical Center, Dallas, TX. 2 Terasaki Foundation Laboratory, Los Angeles, CA. 3 Department of Pathology, University of Pittsburgh, Pittsburgh, PA.

Background: A proposed chronic antibody-mediated rejection (AMR) score has recently been shown to predict 50% 10-year death-censored allograft loss in patients with donor-specific alloantibody (DSA) mean fluorescence intensity (MFI) greater than 10 000, and it requires confirmation in patients with lower MFI (1000-10 000).

Methods: All patients who underwent liver transplantation from January 2000 to April 2009, had DSA (MFI ≥1000) in serum 10 to 14 months postliver transplantation, and had a protocolized liver biopsy were evaluated (n = 230). The previously proposed chronic AMR (cAMR) score was used to risk-stratify putative chronic AMR in DSA+ patients with MFI from 1000 to 10 000.

Results: The MFI distribution of DSA+ recipients was as follows: 66% had MFI 1000 to 4999, 14% had MFI 5000 to 10 000, and 20% had MFI greater than 10 000. The cAMR score distribution on 1-year protocol liver biopsy was as follows: 41% had a score less than 13, 27% a score of 13 to 27.5, and 32% a score greater than 27.5. MFI correlated with the 1-year cAMR category (score <13: 46% vs 21%; score >27.5: 29% vs 42% for MFI 1000-10 000 vs MFI >10 000; P = 0.047). In patients with a cAMR score less than 13, 10-year death-censored allograft survival was 96% to 100% regardless of MFI (P = NS). The risk of allograft loss increased in patients with a cAMR score greater than 13 (P = 0.004) in DSA+ patients with MFI 1000 to 10 000. DSA with MFI greater than 10 000 at 1 year was also more likely than DSA with MFI 1000 to 10 000 to persist at 5 years (95% vs 68%; P < 0.0001).

Conclusions: Validation of the previously proposed cAMR score in a separate cohort predicts death-censored long-term allograft failure in DSA+ patients regardless of MFI, and higher MFI at 1 year predicts DSA persistence at 5 years.
Source
http://dx.doi.org/10.1097/TP.0000000000001802

Response of irritable bowel syndrome with constipation patients administered a combined quebracho/conker tree/M. balsamea Willd extract.

World J Gastrointest Pharmacol Ther 2016 Aug;7(3):463-8

Kenneth Brown, Brandi Scott-Hoy, KBS Research LLC, Dallas, TX 75248, United States.

The aim of this case series was to retrospectively examine the symptom response of irritable bowel syndrome with constipation (IBS-C) patients administered an herbal extract in a real-world setting. Twenty-four IBS-C patients in a community office practice were provided a combination over-the-counter dietary supplement composed of quebracho (150 mg), conker tree (470 mg), and M. balsamea Willd (0.2 mL) extracts (Atrantil™) and chose to take the formulation for a minimum of 2 wk in an attempt to manage their symptoms. Patient responses to the supplement were assessed by visual analogue scale (VAS) for abdominal pain, constipation, and bloating at baseline and at 2 wk as part of standard-of-care. Patient scores from VAS assessments recorded in medical chart data were retrospectively compiled and assessed for the effects of the combined extract on symptoms. Sign tests were used to compare changes from baseline to 2 wk of taking the extract. Significance was defined as P < 0.05. Twenty-one of 24 patients (88%) responded to the dietary supplement, as measured by individual improvements in VAS scores for abdominal pain, bloating, and constipation symptoms comparing scores prior to administration of the extract against those reported after 2 wk. There were also significant improvements in individual as well as mean VAS scores after 2 wk of administration of the combined extract compared to baseline for abdominal pain [8.0 (6.5, 9.0) vs 2.0 (1.0, 3.0), P < 0.001], bloating [8.0 (7.0, 9.0) vs 1.0 (1.0, 2.0), P < 0.001], and constipation [6.0 (3.0, 8.0) vs 2.0 (1.0, 3.0), P < 0.001], respectively. In addition, 21 of 24 patients expressed improved quality of life while taking the formulation. There were no reported side effects from administration of the dietary supplement in this practice population, suggesting excellent tolerance of the formulation. This pilot retrospective analysis of symptom scores from patients before and after consuming a quebracho/conker tree/M. balsamea Willd extract may support the formulation's use in IBS-C.
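The sign test used above can be reproduced exactly with a short script. The 21-of-24 improvement count comes from the abstract; the assumption that the remaining 3 patients worsened (no tied scores) is hypothetical:

```python
from math import comb

def sign_test_p(n_pos: int, n_neg: int) -> float:
    """Two-sided exact sign test: under the null hypothesis,
    improvements and worsenings are equally likely (ties excluded)."""
    n = n_pos + n_neg
    k = min(n_pos, n_neg)
    tail = sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(1.0, 2 * tail)

# 21 of 24 patients improved; assume the other 3 worsened (no ties).
p = sign_test_p(21, 3)
print(f"P = {p:.5f}")  # well below the 0.05 significance threshold
```

With a perfectly balanced split (e.g., 12 improved, 12 worsened) the same function returns P = 1.0, which is why the reported lopsided response counts are strongly significant.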
Source
http://dx.doi.org/10.4292/wjgpt.v7.i3.463
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4986399

Predicting renal recovery after liver transplant with severe pretransplant subacute kidney injury: The impact of warm ischemia time.

Liver Transpl 2016 Aug 12;22(8):1085-91. Epub 2016 Jul 12.

Division of Gastroenterology and Hepatology, University of Colorado Anschutz Medical Campus, Aurora, CO.

Identifying which liver transplantation (LT) candidates with severe kidney injury will have a full recovery of renal function after liver transplantation alone (LTA) is difficult. Avoiding unnecessary simultaneous liver-kidney transplantation (SLKT) can optimize the use of scarce kidney grafts. Incorrect predictions of spontaneous renal recovery after LTA can lead to increased morbidity and mortality. We retrospectively analyzed all LTA patients at our institution from February 2002 to February 2013 (n = 583) and identified a cohort with severe subacute renal injury (n = 40; creatinine <2 mg/dL in the 14-89 days prior to LTA and not yet on renal replacement therapy [RRT], then creatinine ≥2 mg/dL and/or RRT within 14 days of LTA). Of 40 LTA recipients, 26 (65%) had renal recovery and 14 (35%) did not. The median (interquartile range) warm ischemia time (WIT) in recipients with and without renal recovery after LTA was 31 minutes (24-46 minutes) and 39 minutes (34-49 minutes), respectively (P = 0.02). Adjusting for the severity of the subacute kidney injury with either the Acute Kidney Injury Network or the Risk, Injury, Failure, Loss, and End-Stage Kidney Disease criteria, increasing WIT was associated with lack of renal recovery (recovery defined as serum creatinine <2 mg/dL after LTA without RRT), with odds ratios (ORs) of 1.08 (1.01-1.16; P = 0.03) and 1.09 (1.01-1.17; P = 0.02), respectively. For each additional minute of WIT, there was an 8%-9% increase in the odds of lack of renal recovery after LTA. In a separate cohort of 98 LTA recipients with subacute kidney injury, we confirmed the association between WIT and lack of renal recovery (OR, 1.04; P = 0.04). In LT candidates with severe subacute renal injury, operative measures to minimize WIT may improve renal recovery, potentially avoiding RRT and the need for subsequent kidney transplant.
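A per-minute odds ratio compounds multiplicatively, so the reported 8%-9% increase per minute accumulates quickly over a longer warm ischemia time. A minimal sketch of this interpretation (the 1.08 odds ratio is taken from the abstract; the 10-minute scenario is illustrative):

```python
def odds_multiplier(or_per_minute: float, extra_minutes: float) -> float:
    """Multiplicative effect on the odds of non-recovery for each
    additional minute of warm ischemia time (WIT)."""
    return or_per_minute ** extra_minutes

# OR 1.08 per minute of WIT (Acute Kidney Injury Network adjustment):
print(odds_multiplier(1.08, 1))   # 8% higher odds per extra minute
print(odds_multiplier(1.08, 10))  # roughly doubled odds after 10 extra minutes
```

Note that this multiplies odds, not probabilities; the absolute risk increase depends on the baseline rate of non-recovery.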
Source
http://dx.doi.org/10.1002/lt.24488

Circulating TGF-β1 and VEGF and risk of cancer among liver transplant recipients.

Cancer Med 2015 Aug 27;4(8):1252-7. Epub 2015 Apr 27.

Baylor Simmons Transplant Institute, Baylor University Medical Center, Dallas, Texas.

Transplant recipients have elevated cancer risk, perhaps partly due to direct carcinogenic effects of immunosuppressive medications. Experimental evidence indicates that calcineurin inhibitors given to transplant recipients increase cellular expression of transforming growth factor β1 (TGF-β1) and vascular endothelial growth factor (VEGF), which could promote cancer. To assess the potential role of these pathways in the transplantation setting, we conducted a case-control study nested in a cohort of liver recipients. Cases had nonmelanoma skin cancer (N = 84), cancer of the lung (N = 29), kidney (N = 20), or colorectum (N = 17), or melanoma (N = 3). We selected N = 463 recipients without cancer as controls. TGF-β1 and VEGF levels were measured in sera obtained, on average, approximately 3 years before case diagnosis/control selection. We also measured platelet factor 4 (PF4), a marker of ex vivo platelet degranulation, because TGF-β1 and VEGF can be released from platelets, and we developed a statistical model to isolate the platelet-derived fraction from the remaining circulating component. Compared with controls, lung cancer cases had higher levels of TGF-β1 (median 22.8 vs. 19.4 ng/mL, P = 0.02) and VEGF (277 vs. 186 pg/mL, P = 0.02). However, lung cancer cases also had higher platelet counts (P = 0.08) and PF4 levels (P = 0.02), while residual serum levels of TGF-β1 and VEGF, after accounting for PF4, were unassociated with lung cancer (P = 0.40 and P = 0.15, respectively). Associations were not seen for other cancers. In conclusion, TGF-β1 and VEGF levels were increased in association with lung cancer among transplant recipients, which may be explained by increased platelet counts and platelet degranulation in lung cancer cases.
Source
http://dx.doi.org/10.1002/cam4.455
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4559036

Acute liver allograft antibody-mediated rejection: an inter-institutional study of significant histopathological features.

Liver Transpl 2014 Oct;20(10):1244-55

Annette C. and Harold C. Simmons Transplant Institute, Baylor University Medical Center, Dallas, TX.

Acute antibody-mediated rejection (AMR) occurs in a small minority of sensitized liver transplant recipients. Although histopathological characteristics have been described, specific features are needed that could be used (1) to build a generalizable scoring system and (2) to trigger a more in-depth analysis to screen for this rare but important finding. Toward this goal, we created training and validation cohorts of putative acute AMR and control cases from 3 high-volume liver transplant programs; these cases were evaluated blindly by 4 independent transplant pathologists. Evaluations of hematoxylin and eosin (H&E) sections were performed without knowledge of either serum donor-specific human leukocyte antigen alloantibody (DSA) results or complement component 4d (C4d) stains. Routine histopathological features that strongly correlated with severe acute AMR included portal eosinophilia, portal vein endothelial cell hypertrophy, eosinophilic central venulitis, central venulitis severity, and cholestasis. Acute AMR inversely correlated with lymphocytic venulitis and lymphocytic portal inflammation. These and other characteristics were incorporated into models created from the training cohort alone. The final acute antibody-mediated rejection score (aAMR score), calculated as the sum of portal vein endothelial cell hypertrophy, portal eosinophilia, and eosinophilic venulitis divided by the sum of lymphocytic portal inflammation and lymphocytic venulitis, exhibited a strong correlation with severe acute AMR in the training cohort [odds ratio (OR) = 2.86, P < 0.001] and the validation cohort (OR = 2.49, P < 0.001). SPSS tree classification was used to select 2 cutoffs: one that optimized specificity at a score > 1.75 (sensitivity = 34%, specificity = 86%) and another that optimized sensitivity at a score > 1.0 (sensitivity = 81%, specificity = 71%).
In conclusion, the routine histopathological features of the aAMR score can be used to screen patients for acute AMR via routine H&E staining of indication liver transplant biopsy samples; however, a definitive diagnosis requires substantiation by DSA testing, diffuse C4d staining, and the exclusion of other insults.
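The aAMR score described above is a simple ratio of summed component grades. A sketch of the calculation, assuming semiquantitative integer grades for each feature (the grading scale and example biopsy values are hypothetical; the formula and cutoffs come from the abstract):

```python
def aamr_score(pv_endothelial_hypertrophy: int,
               portal_eosinophilia: int,
               eosinophilic_venulitis: int,
               lymphocytic_portal_inflammation: int,
               lymphocytic_venulitis: int) -> float:
    """aAMR score = (features favoring AMR) / (features against AMR)."""
    numerator = (pv_endothelial_hypertrophy + portal_eosinophilia
                 + eosinophilic_venulitis)
    denominator = lymphocytic_portal_inflammation + lymphocytic_venulitis
    if denominator == 0:
        raise ValueError("denominator grades are all zero; score undefined")
    return numerator / denominator

score = aamr_score(2, 2, 3, 1, 1)  # hypothetical biopsy grades
print(score > 1.75)  # specificity-optimized cutoff from the abstract
print(score > 1.0)   # sensitivity-optimized cutoff
```

As the abstract stresses, exceeding either cutoff is only a screening signal; a definitive diagnosis still requires DSA testing and C4d staining.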
Source
http://dx.doi.org/10.1002/lt.23948
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4412307

Ratio of hepatic arterial flow to recipient body weight predicts biliary complications after deceased donor liver transplantation.

HPB (Oxford) 2014 Dec 18;16(12):1083-7. Epub 2014 Jul 18.

Simmons Transplant Institute, Baylor University Medical Center, Dallas, TX, USA.

Objectives: Adequate hepatic arterial (HA) flow to the bile duct is essential in liver transplantation. This study was conducted to determine if the ratio of directly measured HA flow to weight is related to the occurrence of biliary complications after deceased donor liver transplantation.

Methods: A retrospective review of 2684 liver transplants carried out over a 25-year period was performed using data sourced from a prospectively maintained database. Rates of biliary complications (biliary leaks, anastomotic and non-anastomotic strictures) were compared between patients with an HA flow to body weight ratio of <5 ml/min/kg (n = 884) and those with a ratio of ≥5 ml/min/kg (n = 1800).

Results: Patients with a lower ratio of HA flow to weight had higher body weight (92 kg versus 76 kg; P < 0.001) and lower HA flow (350 ml/min versus 550 ml/min; P < 0.001). A lower ratio of HA flow to weight was associated with higher rates of biliary complications at 2 months, 6 months and 12 months (19.8%, 28.2% and 31.9% versus 14.8%, 22.4% and 25.8%, respectively; P < 0.001).

Conclusions: A ratio of HA flow to weight of <5 ml/min/kg is associated with higher rates of biliary complications. This ratio may be a useful parameter for the prevention and early detection of biliary complications.
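The screening ratio in the conclusion is straightforward to compute at the time of flow measurement. A minimal sketch using the median flow and weight values reported in the Results (the function name is illustrative):

```python
def ha_flow_to_weight(flow_ml_min: float, weight_kg: float) -> float:
    """Hepatic arterial flow normalized to recipient body weight."""
    return flow_ml_min / weight_kg

# Median values from the two groups in the abstract:
print(ha_flow_to_weight(350, 92))  # ~3.8 ml/min/kg, below the 5 threshold
print(ha_flow_to_weight(550, 76))  # ~7.2 ml/min/kg, above the 5 threshold
```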
DOI: http://dx.doi.org/10.1111/hpb.12318
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4253331
December 2014

Donor-specific alloantibodies are associated with fibrosis progression after liver transplantation in hepatitis C virus-infected patients.

Liver Transpl 2014 Jun;20(6):655-63

Annette C. and Harold C. Simmons Transplant Institute, Baylor University Medical Center, Dallas, TX.

Hepatitis C virus (HCV) fibrosis progression after liver transplantation (LT) is accelerated in comparison with fibrosis progression before transplantation. The vast majority of the risk factors for fibrosis progression after LT are not modifiable. With the goal of identifying modifiable risk factors for fibrosis progression, we evaluated the impact of preformed and de novo donor-specific human leukocyte antigen alloantibodies (DSAs) on fibrosis progression after LT in HCV-viremic patients. After blinding, we analyzed all 507 HCV-viremic patients who underwent primary LT from January 2000 to May 2009 and had pretransplant and posttransplant samples available for analysis (86% of the total) for preformed and de novo class I and class II DSAs with a mean fluorescence intensity ≥ 5000 with single-antigen bead technology. Fibrosis was assessed on the basis of indication and protocol liver biopsies; compliance with protocol liver biopsies at 1, 2, and 5 years was ≥80%. Preformed class I DSAs [hazard ratio (HR) = 1.44, P = 0.04] and class II DSAs (HR = 1.86, P < 0.001) were independent predictors of progression to stage 2-4 fibrosis, and de novo DSAs (HR = 1.41, P = 0.07) had borderline significance. In addition, preformed class I DSAs (HR = 1.63, P = 0.03) and class II DSAs (HR = 1.72, P = 0.03) were statistically significantly associated with an increased risk of death. In conclusion, after we controlled for donor and recipient characteristics in multivariate modeling, DSAs were independently associated with fibrosis progression and death after LT in HCV-viremic patients.
DOI: http://dx.doi.org/10.1002/lt.23854
June 2014

Role of magnetic resonance elastography in compensated and decompensated liver disease.

J Hepatol 2014 May 19;60(5):934-9. Epub 2013 Dec 19.

Department of Radiology, Mayo Clinic College of Medicine, Rochester, MN, United States.

Background & Aims: Non-invasive predictors identifying subjects with compensated liver disease at highest risk for transitioning to a decompensated state are lacking. We hypothesized that liver shear stiffness as measured by magnetic resonance elastography is an important non-invasive predictor of hepatic decompensation.

Methods: Among patients with advanced fibrosis undergoing magnetic resonance elastography (2007-2011), a baseline cohort and a follow-up cohort (compensated liver disease) were established. Cause-specific Cox proportional hazards analysis adjusting for competing risks was used to determine the association between elevated liver shear stiffness and the development of decompensation (hepatic encephalopathy, ascites, variceal bleeding).

Results: In the baseline cohort (n = 430), subjects with decompensated liver disease had a significantly higher mean liver shear stiffness (6.8 kPa, IQR 4.9-8.5) than subjects with compensated liver disease (5.2 kPa, IQR 4.1-6.8). After adjustment for Model for End-Stage Liver Disease score, hepatitis C, age, gender, albumin, and platelet count, the mean liver shear stiffness (OR = 1.13, 95% CI 1.03-1.27) was independently associated with decompensated cirrhosis at baseline. Over a median follow-up of 27 months (n = 167), 7.2% of subjects with compensated disease experienced hepatic decompensation. In the follow-up cohort, the hazard of hepatic decompensation was 1.42 (95% CI 1.16-1.75) per unit increase in liver shear stiffness over time. The hazard of hepatic decompensation was 4.96 (95% CI 1.4-17.0, P = 0.019) for a subject with compensated disease and a mean LSS value ≥5.8 kPa as compared to an individual with compensated disease and lower mean LSS values.

Conclusion: Baseline liver shear stiffness assessed by magnetic resonance elastography is independently associated with decompensated liver disease.
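Because the reported hazard ratio is per unit (kPa) increase in stiffness, it compounds multiplicatively under the proportional hazards assumption. A minimal arithmetic sketch (our illustration, not the study's code):

```python
HR_PER_KPA = 1.42  # reported hazard of decompensation per 1 kPa increase

def hazard_multiplier(delta_kpa, hr_per_unit=HR_PER_KPA):
    """Relative hazard implied by a delta_kpa change in liver shear
    stiffness under a proportional hazards model."""
    return hr_per_unit ** delta_kpa

print(round(hazard_multiplier(2), 2))  # a 2 kPa rise roughly doubles the hazard: 2.02
```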
DOI: http://dx.doi.org/10.1016/j.jhep.2013.12.016
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3995839
May 2014

Incidence and severity of respiratory insufficiency detected by transcutaneous carbon dioxide monitoring after cardiac surgery and intensive care unit discharge.

Proc (Bayl Univ Med Cent) 2013 Oct;26(4):373-5

Baylor University Medical Center at Dallas (Lagow, Leeper, Ramsay), and the Annette C. and Harold C. Simmons Transplant Institute, Dallas, Texas (Jennings).

Patients undergoing coronary artery bypass surgery and/or heart valve surgery using a median sternotomy approach coupled with the use of cardiopulmonary bypass often experience pulmonary complications in the postoperative period. These patients are initially monitored in an intensive care unit (ICU) but after discharge from this unit to the ward they may still have compromised pulmonary function. This dysfunction may progress to significant respiratory failure that will cause the patient to return to the ICU. To investigate the severity and incidence of respiratory insufficiency once the patient has been discharged from the ICU to the ward, this study used transcutaneous carbon dioxide monitoring to determine the incidence of unrecognized inadequate ventilation in 39 patients undergoing the current standard of care. The incidence and severity of hypercarbia, hypoxia, and tachycardia in post-cardiac surgery patients during the first 24 hours after ICU discharge were found to be high, with severe episodes of each found in 38%, 79%, and 44% of patients, respectively.
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3777090
DOI: http://dx.doi.org/10.1080/08998280.2013.11929009
October 2013

Preformed class II donor-specific antibodies are associated with an increased risk of early rejection after liver transplantation.

Liver Transpl 2013 Sep 26;19(9):973-80. Epub 2013 Jul 26.

Annette C. and Harold C. Simmons Transplant Institute, Baylor University Medical Center, Dallas, TX.

Preformed donor-specific human leukocyte antigen antibodies (DSAs) are considered a contraindication to the transplantation of most solid organs other than the liver. Conflicting data currently exist on the importance of preformed DSAs in rejection and patient survival after liver transplantation (LT). To evaluate preformed DSAs in LT, we retrospectively analyzed prospectively collected samples from all adult recipients of primary LT without another organ from January 1, 2000 to May 31, 2009 with a pre-LT sample available (95.8% of the patients). Fourteen percent of the patients had preformed class I and/or II DSAs with a mean fluorescence intensity (MFI) ≥ 5000. Preformed class I DSAs with an MFI ≥ 5000 remained persistent in only 5% of patients and were not associated with rejection. Preformed class II DSAs with an MFI of 5000 to 10,000 remained persistent in 23% of patients, and this rate increased to 33% for patients whose MFI was ≥10,000 (P < 0.001). Preformed class II DSAs in multivariable Cox proportional hazards modeling were associated with an increased risk of early rejection [hazard ratio (HR) = 1.58; P = 0.004]. In addition, multivariate modeling showed that in comparison with no DSAs (MFI < 1000), preformed class I and/or II DSAs with an MFI ≥ 5000 were independently correlated with the risk of death (HR = 1.51; P = 0.02).
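The MFI thresholds used in this study (<1000 = no DSA, ≥5000 = positive, with the 5000-10,000 and ≥10,000 strata analyzed separately) can be expressed as a simple bucketing rule. The helper below is our illustration only, not the authors' analysis code, and the label strings are ours:

```python
def classify_dsa(mfi):
    """Bucket a single-antigen-bead MFI value by the study's thresholds."""
    if mfi < 1000:
        return "negative (<1000)"
    if mfi < 5000:
        return "intermediate (1000-4999)"
    if mfi < 10000:
        return "positive (5000-9999)"
    return "strongly positive (>=10,000)"

print(classify_dsa(7500))   # positive (5000-9999)
print(classify_dsa(12000))  # strongly positive (>=10,000)
```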
DOI: http://dx.doi.org/10.1002/lt.23687
September 2013

Basic statistical concepts in nutrition research.

Nutr Clin Pract 2013 Apr 28;28(2):182-93. Epub 2013 Feb 28.

Annette C. and Harold C. Simmons Transplant Institute, Baylor University Medical Center, Dallas, TX 75246, USA.

Statistical principles are used in nutrition research to plan and conduct research studies and to answer research questions. This article describes general statistical concepts and provides some guidelines to assist in the interpretation of research literature. Prospective and retrospective study designs used in nutrition research are presented as well as the advantages and disadvantages of each of the study designs. Descriptive statistics used to summarize data and graphical tools used to display the shape of the distribution of a set of data guide nutrition support professionals to select appropriate statistical tests. Fundamental topics of statistics, including power analysis and sample size, confidence intervals and hypothesis testing, and analysis of variance and regression, are also reviewed. The article emphasizes the importance of effective collaboration with statisticians at an early stage of the research study to avoid potential pitfalls associated with improper utilization of statistical methods.
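As one concrete instance of the power-analysis topic mentioned above, a standard normal-approximation formula gives the per-group sample size needed to detect a difference delta between two means with common standard deviation sigma. A minimal sketch in Python (our example, not taken from the article):

```python
import math
from statistics import NormalDist

def sample_size_two_means(delta, sigma, alpha=0.05, power=0.80):
    """Approximate per-group n for a two-sample comparison of means:
    n = 2 * ((z_{1-alpha/2} + z_{power}) * sigma / delta) ** 2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # e.g. 1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)            # e.g. 0.84 for 80% power
    n = 2 * ((z_alpha + z_beta) * sigma / delta) ** 2
    return math.ceil(n)

# Detecting a 5-unit difference when sigma = 10, at 5% alpha and 80% power:
print(sample_size_two_means(5, 10))  # 63 per group
```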
DOI: http://dx.doi.org/10.1177/0884533613478636
April 2013

Tumor biology and pre-transplant locoregional treatments determine outcomes in patients with T3 hepatocellular carcinoma undergoing liver transplantation.

Clin Transplant 2013 Mar-Apr;27(2):311-8. Epub 2013 Jan 28.

Simmons Transplant Institute, Baylor University Medical Center, Dallas, TX 75246, USA.

Liver transplantation is the optimal treatment for patients with hepatocellular carcinoma (HCC) and cirrhosis. This study was conducted to determine the impact of pre-transplant locoregional therapy (LRT) on HCC and our institution's experience with expansion to United Network of Organ Sharing Region 4 T3 (R4T3) criteria. Two hundred and twenty-five patients with HCC (176 meeting Milan and 49 meeting R4T3 criteria) underwent liver transplantation from 2002 to 2008. Compared with the Milan criteria, HCCs in R4T3 criteria displayed less favorable biological features such as higher median alpha-fetoprotein level (21.9 vs. 8.5 ng/mL, p = 0.01), larger tumor size, larger tumor number, and higher incidence of microvascular invasion (22% vs. 5%, p = 0.002). As a result, patients meeting Milan criteria had better five-yr survival (79% vs. 69%, p = 0.03) and a trend toward lower HCC recurrence rates (5% vs. 13%, p = 0.05). Pre-transplant LRT did not affect post-transplant outcomes in patients meeting Milan criteria but did result in lower three-yr HCC recurrence (7% vs. 75%, p < 0.001) and better three-yr survival (p = 0.02) in patients meeting R4T3 criteria. Tumor biology and pre-transplant LRT are important factors that determine the post-transplant outcomes in patients with HCC who meet R4T3 criteria.
DOI: http://dx.doi.org/10.1111/ctr.12089
October 2013

Results of live donor liver transplantation in patients with hepatitis C virus infection: the HCV 3 trial experience.

Clin Transplant 2012 May-Jun;26(3):502-9. Epub 2011 Dec 12.

University of Southern California, Los Angeles, CA, USA.

Chronic hepatitis C virus (HCV) is the most common disease indication for liver transplantation (LT). Outcomes are compromised by near universal recurrence of HCV. A prospective multi-center randomized study to evaluate immunosuppressive strategies in HCV+ transplant recipients provided the opportunity to assess impact of live donor (LD) LT. Two hundred and ninety-five patients undergoing LT for HCV (260 deceased donor [DD] recipients/35 LD recipients), randomized to three regimens, were followed for two yr for patient and graft survival and rate and severity of recurrent HCV. Biopsies were performed at baseline, 3, 12, and 24 months. One- and two-yr patient survival for LD recipients was 88.1% and 81.1% vs. 90.5% and 84.6% for DD recipients (p = 0.5665). One- and two-yr graft survival for LD recipients was 82.9% and 76.2% vs. 87.9% and 81.7% for DD recipients (p = 0.3921). Recurrent HCV did not account for more deaths or graft losses in the LD recipients. In this prospective study, controlled for immunosuppression, use of LD organs did not increase the rate or severity of HCV recurrence. The more elective nature of LDLT affords an opportunity to manipulate donor and recipient factors that can impact upon outcomes.
DOI: http://dx.doi.org/10.1111/j.1399-0012.2011.01561.x
December 2012

Renal-sparing immunosuppressive protocol using OKT3 after liver transplantation: a 19-year single-institution experience.

Proc (Bayl Univ Med Cent) 2011 Oct;24(4):287-94

Annette C. and Harold C. Simmons Transplant Institute, Baylor University Medical Center at Dallas. Dr. Kim is now a hepatobiliary fellow in Canada.

Different renal-sparing immunosuppressive protocols have been used in liver transplantation. At our institution, muromonab-CD3 (OKT3) is used in patients with acute renal failure (ARF), along with a delay in starting a calcineurin inhibitor. This study was conducted to compare outcomes in liver transplant patients with ARF who received OKT3 and those who did not. From 1988 to 2007, ARF was present in 1685 of 2587 patients (65%). OKT3 was used in 109 patients (OKT3 group). The control group (1416 patients) received a low-dose calcineurin inhibitor. The OKT3 group was more critically ill. In spite of this, the OKT3 group patients who were on renal replacement therapy (RRT) achieved long-term survival similar to that of the control group on RRT. Among the patients who were not on RRT, the OKT3 group had a higher complete recovery rate, but this did not translate into improved long-term survival. Bacterial and fungal infections were more common in the OKT3 group; however, there was no increased risk of malignancy or death from hepatitis C recurrence. The use of OKT3 in patients with ARF allowed more critically ill patients on RRT to achieve survival rates similar to those of patients who did not receive OKT3.
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3205150
DOI: http://dx.doi.org/10.1080/08998280.2011.11928740
October 2011

A randomized, multicenter study comparing steroid-free immunosuppression and standard immunosuppression for liver transplant recipients with chronic hepatitis C.

Liver Transpl 2011 Dec;17(12):1394-403

Baylor University Medical Center at Dallas, Dallas, TX 75246, USA.

This randomized, prospective, multicenter trial compared the safety and efficacy of steroid-free immunosuppression (IS) to the safety and efficacy of 2 standard IS regimens in patients undergoing transplantation for hepatitis C virus (HCV) infection. The outcome measures were acute cellular rejection (ACR), severe HCV recurrence, and survival. The patients were randomized (1:1:2) to tacrolimus (TAC) and corticosteroids (arm 1; n = 77), mycophenolate mofetil (MMF), TAC, and corticosteroids (arm 2; n = 72), or MMF, TAC, and daclizumab induction with no corticosteroids (arm 3; n = 146). In all, 295 HCV RNA-positive subjects were enrolled. At 2 years, there were no differences in ACR, HCV recurrence (biochemical evidence), patient survival, or graft survival rates. The side effects of IS did not differ, although there was a trend toward less diabetes in the steroid-free group. Liver biopsy samples revealed no significant differences in the proportions of patients in arms 1, 2, and 3 with advanced HCV recurrence (ie, an inflammation grade ≥ 3 and/or a fibrosis stage ≥ 2) in years 1 (48.2%, 50.4%, and 43.0%, respectively) and 2 (69.5%, 75.9%, and 68.1%, respectively). Although we have found that steroid-free IS is safe and effective for liver transplant recipients with chronic HCV, steroid sparing has no clear advantage in comparison with traditional IS.
DOI: http://dx.doi.org/10.1002/lt.22417
December 2011

Effect of tacrolimus on survival in hepatitis C-infected patients after liver transplantation.

Proc (Bayl Univ Med Cent) 2011 Jul;24(3):187-91

Baylor Simmons Transplant Institute, Baylor University Medical Center at Dallas.

The observation that cyclosporine inhibits HCV replication in vitro has led some programs to use cyclosporine as the calcineurin inhibitor (CNI) of choice after orthotopic liver transplantation (OLT). Previous studies comparing outcomes with different CNIs used small HCV cohorts or had short-term follow-up. We examined patient survival and fibrosis progression in all HCV-infected adult primary OLT recipients from 1995 to 2004 at the Annette C. and Harold C. Simmons Transplant Institute (n = 516). Patients were categorized by their CNI on day 7 post-OLT, and they were excluded if they died before day 14. Patient and donor age, sex, race, and prevalence of cytomegalovirus infection post-OLT were similar in the tacrolimus and cyclosporine patients. As expected, acute cellular rejection and steroid-resistant rejection were less common in tacrolimus-treated patients. Although no difference in 1-year survival was seen, tacrolimus patients (n = 268) had superior 5-year survival compared to cyclosporine patients (n = 248) (75% vs. 67%; P = 0.02). Fibrosis progression was no different between the groups. In our retrospective analysis of 516 post-OLT patients, tacrolimus improved long-term survival compared to cyclosporine in HCV-infected patients, although it did not impact HCV fibrosis progression.
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3124900
DOI: http://dx.doi.org/10.1080/08998280.2011.11928712
July 2011

Patients with NASH and cryptogenic cirrhosis are less likely than those with hepatitis C to receive liver transplants.

Clin Gastroenterol Hepatol 2011 Aug 15;9(8):700-704.e1. Epub 2011 Apr 15.

Annette C. and Harold C. Simmons Transplant Institute, Baylor University Medical Center, Dallas, Texas, USA.

Background & Aims: Many patients with cryptogenic cirrhosis (CC) have other conditions associated with nonalcoholic steatohepatitis (NASH) that put them at risk for complications that preclude orthotopic liver transplantation (OLT).

Methods: We followed all patients with NASH and CC who were evaluated for OLT (n = 218) at Baylor Simmons Transplant Institute between March 2002 and May 2008. Data were compared with those from patients evaluated for OLT because of hepatitis C virus (HCV)-associated cirrhosis (n = 646).

Results: Patients with NASH and CC were older, more likely to be female, and had a higher body mass index and a greater prevalence of diabetes and hypertension than patients with HCV-associated cirrhosis, but the 2 groups had similar Model for End-Stage Liver Disease (MELD) scores. Patients with NASH and CC and MELD scores ≤15 were less likely to progress: they were less likely to receive OLT and more likely to die or be removed from the wait list because they were too sick, compared with patients with HCV-associated cirrhosis. The median progression rate among patients with NASH and CC was 1.3 MELD points per year versus 3.2 MELD points per year for the HCV group (P = .003). Among patients with MELD scores >15, there were no differences between groups in the percentage that received transplants or the rate of MELD score progression. Hepatocellular carcinoma occurred in 2.7% of patients with NASH and CC per year, compared with 4.7% per year among those with HCV-associated cirrhosis.

Conclusions: Patients with NASH and CC and low MELD scores have slower disease progression than patients with HCV-associated cirrhosis and are less likely to receive OLT.
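The MELD progression rates reported above are simple slopes (points gained per year), which also let one project time to a given score. A hedged sketch (our illustration only; the 1.3 and 3.2 figures come from the abstract):

```python
def meld_progression_rate(meld_start, meld_end, years):
    """MELD points gained per year over an observation window."""
    return (meld_end - meld_start) / years

def years_to_reach(target, current, rate_per_year):
    """Projected years for a MELD score to rise from current to target
    at a constant progression rate."""
    return (target - current) / rate_per_year

# Median rates reported: 1.3 points/yr (NASH/CC) vs 3.2 points/yr (HCV)
print(round(years_to_reach(20, 15, 1.3), 1))  # ~3.8 years
print(round(years_to_reach(20, 15, 3.2), 1))  # ~1.6 years
```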
DOI: http://dx.doi.org/10.1016/j.cgh.2011.04.007
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4408909
August 2011

Nonalcoholic fatty liver disease after liver transplantation for cryptogenic cirrhosis or nonalcoholic fatty liver disease.

Liver Transpl 2010 Apr;16(4):431-9

Departments of Medicine, Baylor University Medical Center, Dallas, TX, USA.

Nonalcoholic steatohepatitis (NASH) may account for many cases of cryptogenic cirrhosis. If so, then steatosis might recur after liver transplantation. Two thousand fifty-two patients underwent primary liver transplantation for chronic liver disease between 1986 and 2004. Serial liver biopsy samples were assessed for steatosis and fibrosis. Two hundred fifty-seven patients (12%) had a pretransplant diagnosis of cryptogenic cirrhosis (239) or NASH (18). Fatty liver developed in 31% and was more common when the pretransplant diagnosis was NASH (45% at 5 years versus 23% for cryptogenic cirrhosis, P = 0.007). NASH developed in only 4% and occurred exclusively when steatosis had already occurred. Steatosis after liver transplantation was associated with the baseline body weight and body mass index by univariate analyses, but no pretransplant or posttransplant characteristic independently predicted steatosis after liver transplantation because obesity was so common in all groups. Five percent and 10% developed bridging fibrosis or cirrhosis after 5 and 10 years, respectively, and this was more common after NASH (31%) than in those who developed steatosis alone (6%) or had no fat (3%, P = 0.002). One-, 5-, and 10-year survival was the same in patients who underwent transplantation for cryptogenic cirrhosis or NASH (86%, 71%, and 56%) and in patients who underwent transplantation for other indications (86%, 71%, and 53%; not significant), but death was more often due to cardiovascular disease and less likely from recurrent liver disease. In conclusion, fatty liver is common after liver transplantation for cryptogenic cirrhosis or NASH but is twice as common in the latter group; this suggests that some cryptogenic cirrhosis, but perhaps not all, is caused by NASH. Posttransplant NASH is unusual, and steatosis appears to be a prerequisite. Advanced fibrosis is uncommon, and survival is the same as that of patients who undergo transplantation for other causes.
DOI: http://dx.doi.org/10.1002/lt.22004
April 2010

Indications for combined liver and kidney transplantation: propositions after a 23-yr experience.

Clin Transplant 2010 Nov-Dec;24(6):807-11

Department of Transplant Surgery, Baylor Regional Transplant Institute, Dallas, TX, USA.

Combined liver and kidney transplants (CLKT) continue to be performed frequently despite the pronounced scarcity of organs. In this review, we sought to identify factors that would reduce the use of these limited resources. Seventy-five adult CLKT were performed over a 23-yr period at our center, 29 (39%) of which occurred during the Model for End-stage Liver Disease (MELD) era. Overall, patient survival rates were 82%, 73%, and 62% at one, three, and five yr, respectively. There was no difference in patient survival based either on pre-transplant hemodialysis status or on glomerular filtration rate (GFR) at the time of transplant. Patients undergoing a second CLKT or a liver retransplantation at the time of CLKT had a survival rate of 30% at three months. In the MELD era, patient survival was unchanged (P = NS) despite an older recipient population (P = 0.0029) and a greater number of hepatitis C patients (P = 0.0428). In summary, patients requiring liver retransplantation with concomitant renal failure should be denied CLKT. Renal allografts may also be spared by implementing strict criteria for renal organ allocation (GFR < 30 mL/min at the time of evaluation) and by considering the elimination of preemptive kidney transplantation in CLKT.
DOI: http://dx.doi.org/10.1111/j.1399-0012.2009.01180.x
March 2011

Aging of hepatitis C virus (HCV)-infected persons in the United States: a multiple cohort model of HCV prevalence and disease progression.

Gastroenterology 2010 Feb 25;138(2):513-21, 521.e1-6. Epub 2009 Oct 25.

Division of Hepatology, Baylor University Medical Center and Baylor Regional Transplant Institute, Dallas, Texas, USA.

Background & Aims: The prevalence of chronic hepatitis C (CH-C) remains high and the complications of infection are common. Our goal was to project the future prevalence of CH-C and its complications.

Methods: We developed a multicohort natural history model to overcome limitations of previous models for predicting disease outcomes and benefits of therapy.

Results: Prevalence of CH-C peaked in 2001 at 3.6 million. Fibrosis progression was inversely related to age at infection, so cirrhosis and its complications were most common after the age of 60 years, regardless of when infection occurred. The proportion of CH-C patients with cirrhosis is projected to reach 25% in 2010 and 45% in 2030, although the total number with cirrhosis will peak at 1.0 million (30.5% higher than the current level) in 2020 and then decline. Hepatic decompensation and liver cancer will continue to increase for another 10 to 13 years. Treatment of all infected patients in 2010 could reduce the risk of cirrhosis, decompensation, cancer, and liver-related deaths by 16%, 42%, 31%, and 36%, respectively, by 2020, given current response rates to antiviral therapy.

Conclusions: Prevalence of hepatitis C cirrhosis and its complications will continue to increase through the next decade and will mostly affect those older than 60 years of age. Current treatment patterns will have little effect on these complications, but wider application of antiviral treatment and better responses with new agents could significantly reduce the impact of this disease in coming years.
DOI: http://dx.doi.org/10.1053/j.gastro.2009.09.067
February 2010

Improved results of transplantation for hepatocellular carcinoma: a report from the International Registry of Hepatic Tumors in Liver Transplantation.

Liver Transpl 2009 Jun;15(6):574-80

Baylor Regional Transplant Institute, Dallas/Fort Worth, TX 75246, USA.

Improved outcome after liver transplantation (LTX) for hepatocellular carcinoma (HCC) made LTX a legitimate treatment of the disease. We analyzed trends of LTX for HCC with tumors known before transplantation in 902 patients in a large international registry across 3 periods: 1983-1990, 1991-1996, and 1997-2005. Patient survival improved gradually across eras, with 5-year survival rates of 25.3%, 44.4%, and 67.8%, respectively (P < 0.0001), and the 5-year tumor recurrence rate declined from 59% to 41.3% and 15%, respectively (P < 0.0001). The number of HCC nodules and tumor size decreased over time, and there were fewer moderately or poorly differentiated tumors. Tumors > 5 cm decreased from 54.5% to 31.7% and 11.7%, respectively (P < 0.0001), and LTX with ≥4 nodules decreased from 38.9% to 23.5% and 15.1%, respectively (P = 0.0044). Poorly differentiated tumors decreased from 37.2% to 31.8% and 20.3%, respectively (P = 0.0005). Tumor microvascular invasion remained at 21.2% to 23.8% despite changes in patient selection over time (P = 0.7124). Stepwise Cox regression analysis (n = 502) showed significant risk for tumor recurrence and patient survival for transplants before 1997 [hazard ratio (HR), 1.82 and 1.88, respectively], tumor size > 6 cm (HR, 2.09 and 1.76), microvascular invasion (HR, 1.75 and 1.69, respectively), and alpha-fetoprotein > 200 (HR, 2.45 and 2.32, respectively). In conclusion, outcome after LTX for HCC has improved continuously over the past 20 years. Improved perioperative care and better patient selection may partially explain the improved outcome after LTX for HCC.
DOI: http://dx.doi.org/10.1002/lt.21738
June 2009