Publications by authors named "Hye-Mee Kwon"

33 Publications

Programmed Intermittent Epidural Bolus versus Continuous Epidural Infusion in Major Upper Abdominal Surgery: A Retrospective Comparative Study.

J Clin Med 2021 Nov 18;10(22). Epub 2021 Nov 18.

Department of Anesthesiology and Pain Medicine, Asan Medical Center, University of Ulsan College of Medicine, 88 Olympic-ro 43-gil, Songpa-gu, Seoul 05505, Korea.

Although recent evidence shows that the programmed intermittent epidural bolus can provide improved analgesia compared to continuous epidural infusion during labor, its usefulness in major upper abdominal surgery remains unclear. We evaluated the effect of programmed intermittent epidural bolus versus continuous epidural infusion on the consumption of postoperative rescue opioids, pain intensity, and consumption of local anesthetic by retrospective analysis of data of patients who underwent major upper abdominal surgery under ultrasound-assisted thoracic epidural analgesia between July 2018 and October 2020. The primary outcome was total opioid consumption up to 72 h after surgery. The data of postoperative pain scores, epidural local anesthetic consumption, and adverse events from 193 patients were analyzed (continuous epidural infusion: n = 124, programmed intermittent epidural bolus: n = 69). There was no significant difference in the rescue opioid consumption in the 72 h postoperative period between the groups (33.3 mg [20.0-43.3] vs. 28.3 mg [18.3-43.3], p = 0.375). There were also no significant differences in the pain scores, epidural local anesthetic consumption, and incidence of adverse events. Our findings suggest that the quality of postoperative analgesia and safety following major upper abdominal surgery were comparable between the groups. However, the use of programmed intermittent epidural bolus requires further evaluation.

Source
http://dx.doi.org/10.3390/jcm10225382
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8619973
November 2021

Association of skeletal muscle index with postoperative acute kidney injury in living donor hepatectomy: A retrospective single-centre cohort study.

Liver Int 2021 Nov 24. Epub 2021 Nov 24.

Department of Anesthesiology and Pain Medicine, Asan Medical Center, University of Ulsan College of Medicine, Seoul, Republic of Korea.

Background: Although living donor liver transplantation (LDLT) is the standard treatment option for patients with end-stage liver disease, it always entails ethical concerns about the risk of living donors. Recent studies have reported a correlation between sarcopenia and surgical prognosis in recipients. However, there are few studies of donor sarcopenia and the surgical prognosis of donors. This study investigated the association between sarcopenia and postoperative acute kidney injury in liver donors.

Methods: This retrospective study analysed 2892 donors who underwent donor hepatectomy for LDLT between January 2008 and January 2018. Sarcopenia was classified as pre-sarcopenia or severe sarcopenia, defined as -1 standard deviation (SD) and -2 SD from the mean baseline skeletal muscle index, respectively. Multivariate regression analysis was performed to evaluate the association between donor sarcopenia and postoperative AKI. Additionally, we assessed the association between donor sarcopenia and delayed recovery of hepatic function (DRHF).

Results: In the multivariate analysis, donor sarcopenia was significantly associated with a higher incidence of postoperative AKI (adjusted odds ratio [OR]: 2.65, 95% confidence interval [CI]: 1.15-6.11, P = .022 in pre-sarcopenia; OR: 5.59, 95% CI: 1.11-28.15, P = .037 in severe sarcopenia). Additionally, hypertension and synthetic colloid use were significantly associated with postoperative AKI. In the multivariate analysis, risk factors for DRHF were male gender, indocyanine green retention rate at 15 minutes, and graft type; however, donor sarcopenia was not a risk factor.

Conclusions: Donor sarcopenia is associated with postoperative AKI following donor hepatectomy.

Source
http://dx.doi.org/10.1111/liv.15109
November 2021

Low Preoperative Antithrombin III Level Is Associated with Postoperative Acute Kidney Injury after Liver Transplantation.

J Pers Med 2021 Jul 26;11(8). Epub 2021 Jul 26.

Department of Anesthesiology and Pain Medicine, Laboratory for Cardiovascular Dynamics, Asan Medical Center, University of Ulsan College of Medicine, Seoul 05505, Korea.

We aimed to determine the association between the preoperative antithrombin III (ATIII) level and postoperative acute kidney injury (AKI) after LT (post-LT AKI). We retrospectively evaluated 2395 LT recipients between 2010 and 2018 whose perioperative ATIII levels were available. Patients were divided into two groups based on the preoperative level of ATIII (ATIII < 50% vs. ATIII ≥ 50%). Multivariable regression analysis was performed to assess the risk factors for post-LT AKI. The mean preoperative ATIII levels were 30.2 ± 11.8% in the ATIII < 50% group and 67.2 ± 13.2% in the ATIII ≥ 50% group. The incidence of post-LT AKI was significantly lower in the ATIII ≥ 50% group than in the ATIII < 50% group (54.7% vs. 75.5%, p < 0.001); odds ratio (OR, per 10% increase in ATIII level) 0.86, 95% confidence interval (CI) 0.81-0.92; p < 0.001. In a backward stepwise regression model, female sex, high body mass index, low albumin, deceased donor LT, longer duration of surgery, and high red blood cell transfusion remained significantly associated with post-LT AKI. A low preoperative ATIII level is associated with post-LT AKI, suggesting that preoperative ATIII might be a prognostic factor for predicting post-LT AKI.

Source
http://dx.doi.org/10.3390/jpm11080716
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8401622
July 2021

Estimation of Stroke Volume Variance from Arterial Blood Pressure: Using a 1-D Convolutional Neural Network.

Sensors (Basel) 2021 Jul 29;21(15). Epub 2021 Jul 29.

Asan Medical Center, Department of Anesthesiology and Pain Medicine, University of Ulsan College of Medicine, 88, Olympic-ro 43-gil, Seoul 05505, Korea.

Background: We aimed to create a novel model using a deep learning method to estimate stroke volume variation (SVV), a widely used predictor of fluid responsiveness, from arterial blood pressure waveform (ABPW).

Methods: In total, 557 patients and 8,512,564 SVV data points were collected and divided into three groups: training, validation, and test. The data were composed of 10 s segments of ABPW and the corresponding SVV recorded every 2 s. We built a convolutional neural network (CNN) model to estimate SVV from the ABPW, with a pre-existing commercial model (EV1000) as the reference. We applied pre-processing, multichannel inputs, and dimension reduction to improve the CNN model with diversified inputs.

Results: Our CNN model showed an acceptable performance with sample data (r = 0.91, MSE = 6.92). Diversification of inputs, such as normalization, frequency, and slope of ABPW significantly improved the model correlation (r = 0.95), lowered mean squared error (MSE = 2.13), and resulted in a high concordance rate (96.26%) with the SVV from the commercialized model.

Conclusions: We developed a new CNN deep-learning model to estimate SVV. Our CNN model seems to be a viable alternative when the necessary medical device is not available, thereby allowing a wider range of application and resulting in optimal patient management.
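For readers who want a concrete picture of the modelling approach summarized above, the following is a minimal sketch of a 1-D convolutional regressor that maps a 10 s arterial-pressure window to a single SVV value. It is illustrative only: the sampling rate (100 Hz), layer sizes, and training details are assumptions and do not come from the published model.

```python
# Illustrative sketch (not the published model): a small 1-D CNN that regresses
# SVV from a 10 s ABP window. Assumes 100 Hz sampling (1000 samples, 1 channel).
import torch
import torch.nn as nn

class SVVNet(nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, stride=2, padding=3),  # coarse pulse morphology
            nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(16),                               # fixed-length summary
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16, 64),
            nn.ReLU(),
            nn.Linear(64, 1),                                       # scalar SVV estimate (%)
        )

    def forward(self, abp: torch.Tensor) -> torch.Tensor:
        # abp shape: (batch, 1, 1000) = one 10 s pressure window per sample
        return self.head(self.features(abp))

model = SVVNet()
dummy_windows = torch.randn(8, 1, 1000)      # 8 synthetic windows
print(model(dummy_windows).shape)             # torch.Size([8, 1])
```

Training such a network against reference SVV values (e.g., from the EV1000, as the abstract describes) with a mean-squared-error loss would then be a standard supervised regression setup.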

Source
http://dx.doi.org/10.3390/s21155130
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8347322
July 2021

Real-time ultrasound-guided low thoracic epidural catheter placement: technical consideration and fluoroscopic evaluation.

Reg Anesth Pain Med 2021 06 23;46(6):512-517. Epub 2021 Apr 23.

Department of Anesthesiology and Pain Medicine, Asan Medical Center, University of Ulsan College of Medicine, Seoul, Korea (the Republic of).

Background And Objective: Thoracic epidural analgesia can significantly reduce acute postoperative pain. However, thoracic epidural catheter placement is challenging. Although real-time ultrasound (US)-guided thoracic epidural catheter placement has been recently introduced, data regarding the accuracy and technical description are limited. Therefore, this prospective observational study aimed to assess the success rate and describe the technical considerations of real-time US-guided low thoracic epidural catheter placement.

Methods: 38 patients in the prone position were prospectively studied. After the target interlaminar space between T9 and T12 was identified, the needle was advanced under real-time US guidance and was stopped just short of the posterior complex. Further advancement of the needle was accomplished without US guidance using loss-of-resistance techniques to normal saline until the epidural space was accessed. Procedure-related variables such as time to mark space, needling time, number of needle passes, number of skin punctures, and the first-pass success rate were measured. The primary outcome was the success rate of real-time US-guided thoracic epidural catheter placement, which was evaluated using fluoroscopy. In addition, the position of the catheter, contrast dispersion, and complications were evaluated.

Results: This study included 38 patients. The T10-T11 interlaminar space was the most common location for epidural access. The mean time for marking the overlying skin was 49.5±13.8 s, and the median needling time was 49 s. The median number of needle passes was 1.0 (1.0-1.0). All patients underwent one skin puncture for the procedure. The first-pass and second-pass success rates were 76.3% and 18.4%, respectively. Fluoroscopic evaluation revealed that the catheter tips were all positioned in the epidural space and were usually located between T9 and T10 (84.2%). The cranial and caudal contrast dispersion extended up to 5.4±1.6 and 2.6±1.0 vertebral body levels, respectively. No procedure-related complications occurred.

Conclusion: Real-time US guidance appears to be a feasible option for facilitating thoracic epidural insertion. Whether or not this technique improves the procedural success and quality compared with landmark-based techniques will require additional study.

Trial Registration Number: NCT03890640.

Source
http://dx.doi.org/10.1136/rapm-2021-102578
June 2021

Rupture Risk of Intracranial Aneurysm and Prediction of Hemorrhagic Stroke after Liver Transplant.

Brain Sci 2021 Mar 31;11(4). Epub 2021 Mar 31.

Department of Anesthesiology and Pain Medicine, Laboratory for Cardiovascular Dynamics, Asan Medical Center, University of Ulsan College of Medicine, Seoul 05505, Korea.

Postoperative hemorrhagic stroke (HS) is a rare yet devastating complication after liver transplantation (LT). Unruptured intracranial aneurysm (UIA) may contribute to HS; however, related data are limited. We investigated UIA prevalence and aneurysmal subarachnoid hemorrhage (SAH) and HS incidence post-LT. We identified risk factors for 1-year HS and constructed a prediction model. This study included 3544 patients who underwent LT from January 2008 to February 2019. Primary outcomes were incidence of SAH, HS, and mortality within 1-year post-LT. Propensity score matching (PSM) analysis and Cox proportional hazard analysis were performed. The prevalence of UIAs was 4.63% (n = 164; 95% confidence interval (CI), 3.95-5.39%). The 1-year SAH incidence was 0.68% (95% CI, 0.02-3.79%) in patients with UIA. SAH and HS incidence and mortality were not different between those with and without UIA before and after PSM. Cirrhosis severity, thrombocytopenia, inflammation, and history of SAH were identified as risk factors for 1-year HS. UIA presence was not a risk factor for SAH, HS, or mortality in cirrhotic patients post-LT. Given the fatal impact of HS, a simple scoring system was constructed to predict 1-year HS risk. These results enable clinical risk stratification of LT recipients with UIA and help assess perioperative HS risk before LT.

Source
http://dx.doi.org/10.3390/brainsci11040445
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8066281
March 2021

REPLY.

Hepatology 2021 Jul 15;74(1):536-537. Epub 2021 Jun 15.

Department of Anesthesiology and Pain Medicine, Laboratory for Cardiovascular Dynamics, Asan Medical Center, University of Ulsan College of Medicine, Seoul, Republic of Korea.

Source
http://dx.doi.org/10.1002/hep.31722
July 2021

Systolic anterior motion of mitral chordae tendineae: prevalence and clinical implications in liver transplantation.

Anesth Pain Med (Seoul) 2020 Apr 11;15(2):187-192. Epub 2020 Mar 11.

Department of Anesthesiology and Pain Medicine, Laboratory for Cardiovascular Dynamics, Asan Medical Center, University of Ulsan College of Medicine, Seoul, Korea.

Background: Although systolic anterior motion (SAM) of the mitral valve anterior leaflet is well known to cause hemodynamic perturbation in many anesthetic situations, the prevalence and clinical implications of SAM of the mitral chordae tendineae (chordal SAM) in liver transplantation (LT) have not been evaluated. We aimed to assess the impact of chordal SAM on intraoperative postreperfusion syndrome and short- and long-term all-cause mortality.

Methods: We retrospectively evaluated 1751 LT recipients from January 2011 to June 2017 who had preoperative echocardiography. Echocardiography-derived parameters and the prevalence of post-reperfusion syndrome between those with chordal SAM and without chordal SAM were compared. The cumulative mortality rate according to the presence of chordal SAM was evaluated by the Kaplan-Meier survival curve.

Results: Of the enrolled recipients, 21 (1.2%) had chordal SAM in preoperative echocardiography. Compared to those without chordal SAM, patients with chordal SAM had a smaller end-systolic volume index (median 18 ml/m² vs. 22 ml/m², P = 0.015) and end-diastolic volume index (median 52 ml/m² vs. 63 ml/m², P = 0.011). However, there was no difference in systolic and diastolic function in echocardiography. The prevalence of intraoperative post-reperfusion syndrome did not show any difference (42.9% vs. 45.3%, P = 1.000). Over the mean 4.8-year follow-up, cumulative 90-day and overall mortality also did not show a difference (log-rank P > 0.05 for both).

Conclusions: Preoperative screening echocardiography in LT recipients detected chordal SAM in 1.2% of patients. Chordal SAM was found with smaller left ventricular volumes but was not related to intraoperative post-reperfusion syndrome or short- and long-term postoperative all-cause mortality in LT.

Source
http://dx.doi.org/10.17085/apm.2020.15.2.187
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7713819
April 2020

Prognostic Value of B-Type Natriuretic Peptide in Liver Transplant Patients: Implication in Posttransplant Mortality.

Hepatology 2021 Jul 15;74(1):336-350. Epub 2021 Jun 15.

Department of Anesthesiology and Pain Medicine, Laboratory for Cardiovascular Dynamics, Asan Medical Center, University of Ulsan College of Medicine, Republic of Korea.

Background And Aims: Despite frequent cirrhotic cardiomyopathy or subclinical heart failure (HF), the prognostic value of peri-liver transplant (LT) B-type natriuretic peptide (BNP) has been poorly studied in advanced liver disease. We examined the association between BNP and mortality in a large cohort of LT patients and identified risk factors for peri-LT BNP increase.

Approach And Results: Using prospectively collected data from the Asan LT Registry between 2008 and 2019, 3,811 patients with serial pretransplant BNP (preBNP) and peak BNP levels within the first 3 posttransplant days (postBNP) were analyzed. Thirty-day all-cause mortality predicted by adding preBNP and/or postBNP to the traditional Revised Cardiac Risk Index (RCRI) was evaluated. PreBNP > 400 pg/mL (known cutoff of acute HF) was found in 298 (7.8%); however, postBNP > 400 pg/mL was identified in 961 (25.2%) patients, specifically in 40.4% (531/1,315) of those with a Model for End-Stage Liver Disease score (MELDs) > 20. Strong predictors of postBNP > 400 pg/mL were preBNP, hyponatremia, and MELDs, whereas those of preBNP > 400 pg/mL were MELDs, kidney failure, and respiratory failure. Among 100 (2.6%) post-LT patients who died within 30 days, patients with postBNP ≤ 150 pg/mL (43.1%, reference group), 150-400 pg/mL (31.7%), 400-1,000 pg/mL (18.5%), 1,000-2,000 pg/mL (4.7%), and >2,000 pg/mL (2.0%) had 30-day mortalities of 0.9%, 2.2%, 4.0%, 7.7%, and 22.4%, respectively. Adding preBNP, postBNP, and both BNPs to the RCRI improved the net reclassification index for 30-day mortality to 22.5%, 29.5%, and 33.1%, respectively.

Conclusions: PostBNP > 400 pg/mL after LT was markedly prevalent in advanced liver disease and mainly linked to elevated preBNP. Routine monitoring of peri-LT BNP provides incremental prognostic information; therefore, it could help risk stratification for mortality as a practical and useful biomarker in LT.

Source
http://dx.doi.org/10.1002/hep.31661
July 2021

von Willebrand factor to protein C ratio-related thrombogenicity with systemic inflammation is predictive of graft dysfunction after liver transplantation: Retrospective cohort study.

Int J Surg 2020 Dec 1;84:109-116. Epub 2020 Nov 1.

Department of Anesthesiology and Pain Medicine, Laboratory for Cardiovascular Dynamics, Asan Medical Center, University of Ulsan College of Medicine, Seoul, Republic of Korea. Electronic address:

Introduction: Early allograft dysfunction (EAD) is known to be a prototype of graft failure and ultimately influences long-term graft failure or death. We hypothesized that pretransplant thrombogenicity, evaluated by procoagulants and an anticoagulant, namely von Willebrand factor (vWF), factor VIII (FVIII), and protein C (PC), and their imbalance ratios of vWF-to-PC (vWFPCR) and FVIII-to-PC (FVIIIPCR), is associated with EAD and 90-day graft failure after living-related liver transplantation (LDLT) and contributes to further exacerbation of graft dysfunction when it coexists with systemic inflammation.

Material And Methods: Of 1199 prospectively registered LDLT patients, 698 with measurements of all thrombogenicity parameters were analyzed. Risk factors for EAD development were identified, and the corresponding best cut-offs were calculated using receiver operating characteristic curve analysis. To compare outcomes, multivariable regression analysis and inverse probability of treatment weighting (IPTW) based on the propensity score were performed.
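Because the weighting step above is only named, here is a minimal sketch of how inverse probability of treatment weights are typically derived from a propensity model; the column names are hypothetical placeholders, and this is not the authors' analysis code.

```python
# Minimal IPTW sketch: fit a propensity model for the "exposure" and weight each
# patient by the inverse probability of the exposure actually received.
# Column names are hypothetical, not the study's variables.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def iptw_weights(df: pd.DataFrame, exposure: str, covariates: list) -> np.ndarray:
    model = LogisticRegression(max_iter=1000)
    model.fit(df[covariates], df[exposure])
    ps = model.predict_proba(df[covariates])[:, 1]        # P(exposed | covariates)
    exposed = df[exposure].to_numpy().astype(bool)
    return np.where(exposed, 1.0 / ps, 1.0 / (1.0 - ps))  # weight stabilization omitted

# Toy example with simulated data
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "high_vwfpcr": rng.binomial(1, 0.3, 500),   # hypothetical exposure indicator
    "meld": rng.normal(20, 8, 500),
    "age": rng.normal(52, 10, 500),
})
weights = iptw_weights(df, "high_vwfpcr", ["meld", "age"])
```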

Results: The prevalence of EAD was 10.7% (n = 75/698) after LDLT. Of the parameters, vWFPCR had the highest predictive potential for EAD, with a best cut-off of 8.06. vWFPCR ≥ 8.06 was significantly associated with EAD development (OR [95% CI], 2.55 [1.28-5.09], P = 0.008) and 90-day graft failure (HR [95% CI], 2.24 [1-4.98], P = 0.043) after IPTW adjustment. Furthermore, the risk of EAD increased proportionally with increasing C-reactive protein as a continuous metric of systemic inflammation, and more steeply in those with higher thrombogenicity (i.e., higher vWFPCR). Adding vWFPCR to the MELD score improved EAD risk prediction by 21.9%.

Conclusions: Pretransplant thrombogenicity, assessed by the imbalance of pro- and anticoagulants, was significantly associated with EAD and 90-day graft failure after LDLT, and this association was aggravated by systemic inflammation.

Source
http://dx.doi.org/10.1016/j.ijsu.2020.10.030
December 2020

Non-contact thermography-based respiratory rate monitoring in a post-anesthetic care unit.

J Clin Monit Comput 2021 Dec 25;35(6):1291-1297. Epub 2020 Sep 25.

Department of Anesthesiology and Pain Medicine, University of Virginia Health System, Charlottesville, VA, USA.

In patients at high risk of respiratory complications, pulse oximetry may not adequately detect hypoventilation events. Previous studies have proposed using thermography, which relies on infrared imaging, to measure respiratory rate (RR). However, these systems lack the real-world feasibility testing needed for widespread acceptance. This study enrolled 101 spontaneously ventilating patients in a post-anesthesia recovery unit. Patients were placed in a 45° reclined position while undergoing pulse oximetry and bioimpedance-based RR monitoring. A thermography camera was placed approximately 1 m from the patient and pointed at the patient's face, recording continuously at 30 frames per second for 2 min. Simultaneously, RR was manually recorded. Offline imaging analysis identified the nares as a region of interest and then quantified nasal temperature changes frame by frame to estimate RR. The manually calculated RR was compared with both the bioimpedance and thermographic estimates. The Pearson correlation coefficient between direct measurement and bioimpedance was 0.69 (R² = 0.48), and that between direct measurement and thermography was 0.95 (R² = 0.90). Limits of agreement analysis revealed a bias of 1.3 and limits of agreement of 10.8 (95% confidence interval 9.07 to 12.5) and -8.13 (-6.41 to -9.84) between direct measurements and bioimpedance, and a bias of -0.139 and limits of agreement of 2.65 (2.14 to 3.15) and -2.92 (-2.41 to -3.42) between direct measurements and thermography. Thermography allowed tracking of the manually measured RR in the post-anesthesia recovery unit without requiring patient contact. Additional work is required for image acquisition automation and nostril identification.
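As a rough illustration of the frame-by-frame analysis described above, the sketch below turns a per-frame mean temperature trace from a nostril region of interest into a respiratory-rate estimate by finding the dominant frequency in a plausible respiratory band. The 30 fps frame rate matches the recording described; the ROI extraction and the 4-60 breaths/min band are assumptions, not details from the paper.

```python
# Illustrative sketch: estimate respiratory rate (breaths/min) from the mean
# nostril-ROI temperature per frame, sampled at 30 frames per second.
import numpy as np

def respiratory_rate_from_temps(roi_temps: np.ndarray, fps: float = 30.0) -> float:
    x = roi_temps - roi_temps.mean()                   # remove baseline temperature
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)       # Hz
    band = (freqs >= 4 / 60) & (freqs <= 60 / 60)      # assume 4-60 breaths/min
    dominant_hz = freqs[band][np.argmax(spectrum[band])]
    return dominant_hz * 60.0                          # breaths per minute

# Toy example: a 2 min trace oscillating at ~15 breaths/min plus noise
t = np.arange(0, 120, 1 / 30.0)
fake_trace = 34.0 + 0.3 * np.sin(2 * np.pi * (15 / 60) * t) + 0.05 * np.random.randn(t.size)
print(round(respiratory_rate_from_temps(fake_trace), 1))   # approximately 15.0
```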

Source
http://dx.doi.org/10.1007/s10877-020-00595-8
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7516248
December 2021

Preoperative high-sensitivity troponin I and B-type natriuretic peptide, alone and in combination, for risk stratification of mortality after liver transplantation.

Korean J Anesthesiol 2021 06 26;74(3):242-253. Epub 2020 Aug 26.

Department of Anesthesiology and Pain Medicine, Laboratory for Cardiovascular Dynamics, Asan Medical Center, University of Ulsan College of Medicine, Seoul, Korea.

Background: Given the severe shortage of donor liver grafts, coupled with growing proportion of cardiovascular death after liver transplantation (LT), precise cardiovascular risk assessment is pivotal for selecting recipients who gain the greatest survival benefit from LT surgery. We aimed to determine the prognostic value of pre-LT combined measurement of B-type natriuretic peptide (BNP) and high-sensitivity troponin I (hsTnI) in predicting early post-LT mortality.

Methods: We retrospectively evaluated 2,490 consecutive adult LT patients between 2010 and 2018. Cut-off values of BNP and hsTnI for predicting post-LT 90-day mortality were calculated. According to the derived cut-off values of two cardiac biomarkers, alone and in combination, adjusted hazard ratios (aHR) of post-LT 90-day mortality were determined using multivariate Cox regression analysis.

Results: The 90-day mortality rate was 2.9% (72/2,490). Rounded cut-off values for post-LT 90-day mortality were 400 pg/ml for BNP (aHR 2.02 [1.15, 3.52], P = 0.014) and 60 ng/L for hsTnI (aHR 2.65 [1.48, 4.74], P = 0.001). Among the 273 patients with BNP ≥ 400 pg/ml, 50.9% also had hsTnI ≥ 60 ng/L. Combined use of the pre-LT cardiac biomarkers stratified the post-LT 90-day mortality rate: both non-elevated, 1.0% (21/2,084); either one elevated, 9.0% (24/267); and both elevated, 19.4% (27/139; log-rank P < 0.001; aHR vs non-elevated 4.23 [1.98, 9.03], P < 0.001).

Conclusions: Concomitant elevation of both cardiac biomarkers posed a significantly higher risk of 90-day mortality after LT. Pre-LT assessment of cardiac strain and myocardial injury, represented by BNP and hsTnI values, would contribute to prioritization of LT candidates and help administer targeted therapies that could modify early mortality.

Source
http://dx.doi.org/10.4097/kja.20296
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8175877
June 2021

Markedly prolonged QTc interval in end-stage liver disease and risk of 30-day cardiovascular event after liver transplant.

J Gastroenterol Hepatol 2021 Mar 30;36(3):758-766. Epub 2020 Jul 30.

Department of Anesthesiology and Pain Medicine, Laboratory for Cardiovascular Dynamics, Asan Medical Center, University of Ulsan College of Medicine, Seoul, Korea.

Background And Aim: The proportional increase of corrected QT interval (QTc) along end-stage liver disease (ESLD) severity may lead to inconsistent outcome reporting if based on conventional threshold of prolonged QTc. We investigated the comprehensive QTc distribution among ESLD patients and assessed the association between QTc > 500 ms, a criterion for diagnosing severe long-QT syndrome, and the 30-day major adverse cardiovascular event (MACE) after liver transplantation (LT) and identified the risk factors for developing QTc > 500 ms.

Methods: Data were collected prospectively from the Asan LT Registry between 2011 and 2018, and outcomes were retrospectively reviewed. Multivariable analysis and propensity score-weighted adjusted odds ratios (ORs) were calculated. Thirty-day MACEs were defined as the composite of cardiovascular mortality, arrhythmias, myocardial infarction, pulmonary thromboembolism, and/or stroke.

Results: Of 2579 patients, 194 (7.5%) had QTc > 500 ms (QTc500_Group), and 1105 (42.8%) had prolonged QTc (QTcP_Group), defined as QTc > 470 ms for women and >450 ms for men. The 30-day MACE occurred in 336 (13%) patients. QTc500_Group showed higher 30-day MACE than did those without (20.1% vs 12.5%, P = 0.003), with corresponding adjusted OR of 1.24 (95% CI: 1.06-1.46, P = 0.007). However, QTcP_Group showed comparable 30-day MACE (13.3% vs 12.8% without prolonged QTc, P = 0.764). Significant risk factors for QTc > 500 ms development were advanced liver disease, female sex, hypokalemia, hypocalcemia, high left ventricular end-diastolic volume, and tachycardia.

Conclusion: Our results revealed that, among ESLD patients, the novel threshold of QTc > 500 ms was associated with post-LT 30-day MACE, whereas the conventional threshold was not, indicating that a longer QTc threshold should be considered for this unique patient population.

Source
http://dx.doi.org/10.1111/jgh.15179
March 2021

Does Erector Spinae Plane Block Have a Visceral Analgesic Effect?: A Randomized Controlled Trial.

Sci Rep 2020 05 21;10(1):8389. Epub 2020 May 21.

Department of Anesthesiology and Pain Medicine, Asan Medical Center, University of Ulsan, College of Medicine, Seoul, 05505, Korea.

The visceral analgesic efficacy of erector spinae plane block (ESPB) is still a matter of debate. This study attempted to investigate the visceral analgesic efficacy of ESPB in a clinical setting. After randomization, we performed ultrasound-guided bilateral rectus sheath block (RSB), aimed at preventing postoperative somatic pain, in all patients who underwent laparoscopic cholecystectomy (LC). Ultrasound-guided bilateral ESPB at the T7 level was performed only in the intervention group to provide the visceral analgesic block. The intraoperative requirement for remifentanil (P = 0.021) and the cumulative fentanyl consumption at 24 hours postoperatively (206.5 ± 82.8 μg vs. 283.7 ± 102.4 μg; P = 0.004) were significantly lower in the ESPB group than in the non-ESPB group. The ESPB group consistently showed lower accumulated analgesic consumption than the non-ESPB group at all observed time points (all P < 0.05) after 2 hours, and the degree of reduction in accumulated analgesic consumption was greater (P = 0.04) during the 24-hour postoperative period. Pain severity was lower in the ESPB group at 6 hours postoperatively. The significantly reduced opioid consumption in the ESPB group implies that, while preliminary and in need of confirmation, ESPB has a potential visceral analgesic effect. Therefore, performing ESPB alone may be feasible for inducing both somatic and visceral analgesia.

Source
http://dx.doi.org/10.1038/s41598-020-65172-0
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7249264
May 2020

Reply.

Hepatology 2020 08;72(2):784

Department of Anesthesiology and Pain Medicine, Laboratory for Cardiovascular Dynamics, Asan Medical Center, University of Ulsan College of Medicine, Seoul, Republic of Korea.

Source
http://dx.doi.org/10.1002/hep.31162
August 2020

Early postoperative weight gain is associated with increased risk of graft failure in living donor liver transplant recipients.

Sci Rep 2019 12 27;9(1):20096. Epub 2019 Dec 27.

Department of Anaesthesiology and Pain Medicine, Laboratory for Cardiovascular Dynamics, Asan Medical Centre, University of Ulsan College of Medicine, Seoul, Korea.

Fluid overload (FO) has been shown to adversely affect multiple organs and survival in critically ill patients. Liver transplantation (LT) carries the risk of massive transfusion, which frequently results in FO. We investigated the association of postoperative weight gain with graft failure, early allograft dysfunction (EAD), and overall mortality in LT. 1833 living donor LT (LDLT) recipients were retrospectively analysed. Patients were divided into 2 groups according to postoperative weight gain (<3% group [n = 1391] and ≥3% group [n = 442]) by using maximally selected log-rank statistics for graft failure. Multivariate Cox and logistic regression analyses were performed. The ≥3% group was associated with graft failure (adjusted HR [aHR], 1.763; 95% CI, 1.248-2.490; P = 0.001). When postoperative weight change was used as a continuous variable, the aHR for each 1% increase in postoperative weight was 1.045 (95% CI, 1.009-1.082; P = 0.015). In addition, the ≥3% group was associated with EAD (adjusted OR [aOR], 1.553; 95% CI, 1.024-2.356; P = 0.038) and overall mortality (aHR, 1.731; 95% CI, 1.182-2.535; P = 0.005). In conclusion, postoperative weight gain may be independently associated with increased risk of graft failure, EAD, and mortality in LDLT recipients.

Source
http://dx.doi.org/10.1038/s41598-019-56543-3
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6934543
December 2019

Comparison of recovery profiles in patients with Parkinson's disease for 2 types of neuromuscular blockade reversal agent following deep brain stimulator implantation.

Medicine (Baltimore) 2019 12;98(52):e18406

Department of Anesthesia and Pain Medicine, Asan Medical Center, University of Ulsan College of Medicine, Seoul.

There are concerns that cholinesterase inhibitors, when used as anesthetic reversal agents, may worsen Parkinson's disease (PD)-related symptoms. Sugammadex, a relatively new reversal agent, does not inhibit acetylcholinesterase and does not require co-administration of an antimuscarinic agent. The present study compared the recovery profiles of 2 agents initially administered for reversal of neuromuscular blockade in patients with advanced PD who underwent deep brain stimulator implantation. A total of 121 patients with PD who underwent deep brain stimulator implantation were retrospectively analyzed. Patients were divided into 1 of 2 groups according to the type of neuromuscular blockade reversal agent (pyridostigmine vs sugammadex) initially administered. Recovery profiles reflecting time to extubation, reversal failure at first attempt, and hemodynamic stability, including incidence of hypertension or tachycardia during the emergence period, were compared. Time to extubation in the sugammadex group was significantly shorter (P < .001). In the sugammadex group, reversal failure at first attempt did not occur in any patient, while it occurred in seven (9.7%) patients in the pyridostigmine group (P = .064), necessitating an additional dose of pyridostigmine (n = 3) or sugammadex (n = 4). The incidence of hemodynamic instability during anesthetic emergence was significantly lower in the sugammadex group than in the pyridostigmine group (P = .019). Sugammadex yielded a recovery profile superior to that of pyridostigmine during the anesthesia emergence period in advanced PD patients. Sugammadex is also likely to be associated with fewer adverse effects than traditional reversal agents, which in turn would also improve overall postoperative management in this patient population.

Source
http://dx.doi.org/10.1097/MD.0000000000018406
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6946526
December 2019

Preoperative echocardiographic evaluation of cardiac systolic and diastolic function in liver transplant recipients with diabetes mellitus: a propensity-score matched analysis.

Anesth Pain Med (Seoul) 2019 Oct;14(4):465-473

Department of Anesthesiology and Pain Medicine, Laboratory for Cardiovascular Dynamics, Asan Medical Center, University of Ulsan College of Medicine, Seoul, Korea.

Background: Diabetes mellitus (DM) increases the risk of heart failure. Diabetes has been shown to lead to DM cardiomyopathy, characterized by systolic and diastolic dysfunction. Pre-transplant diastolic dysfunction has been associated with poor graft outcomes and mortality. We assessed the hypothesis that end-stage liver disease (ESLD) patients with diabetes (DM-ESLD) have more advanced cardiac systolic and diastolic dysfunction compared to ESLD patients without diabetes (Non DM-ESLD).

Methods: We retrospectively evaluated the preoperative echocardiography of 1,319 consecutive liver transplant recipients (1,007 Non DM-ESLD vs. 312 DM-ESLD [23.7%]) from January 2012 to May 2016. Systolic and diastolic indices, such as left ventricular ejection fraction, transmitral E/A ratio, tissue Doppler s' and e' velocities, and E/e' ratio (an index of left ventricular end-diastolic pressure), were compared using 1:2 propensity-score matching.

Results: DM-ESLD patients showed no differences in the systolic indices of left ventricular ejection fraction and s' velocity, whereas the diastolic indices of E/A ratio ≤ 1 (49.0% vs. 40.2%, P = 0.014), e' velocity (median 7.0 vs. 7.4 cm/s, P < 0.001), and E/e' ratio (10.9 ± 3.2 vs. 10.1 ± 3.0, P < 0.001) indicated worse diastolic function compared with Non DM-ESLD patients.

Conclusions: DM-ESLD patients have a higher degree of diastolic dysfunction than Non DM-ESLD patients. Accordingly, careful preoperative screening for diastolic dysfunction in DM-ESLD patients is encouraged, because poor transplant outcomes have been noted in patients with preoperative diastolic dysfunction.

Source
http://dx.doi.org/10.17085/apm.2019.14.4.465
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7713801
October 2019

Appraisal of Cardiac Ejection Fraction With Liver Disease Severity: Implication in Post-Liver Transplantation Mortality.

Hepatology 2020 04 6;71(4):1364-1380. Epub 2020 Mar 6.

Department of Anesthesiology and Pain Medicine, Laboratory for Cardiovascular Dynamics, Asan Medical Center, University of Ulsan College of Medicine, Seoul, Republic of Korea.

Background And Aims: Enhanced sympathetic nervous activation and peripheral vasodilation in end-stage liver disease (ESLD) may limit the importance of left ventricular ejection fraction (LVEF) as an influential prognosticator. We sought to understand the LVEF and cardiac dimensions in ESLD patients in order to define the LVEF threshold to predict all-cause mortality after liver transplantation (LT).

Approach And Results: Data were collected prospectively from the Asan LT Registry between 2008 and 2016, and outcomes were retrospectively reviewed. LVEF, end-diastolic volume index (EDVI), and end-diastolic elastance (Eed) were measured by preoperative echocardiography. Of 2,799 patients, 452 (16.2%) had LVEF ≤ 60%, with 29 (1.0%) having LVEF < 55%, and 269 (9.6%) had LVEF ≥ 70%. Over a median 5.4-year follow-up, 329 (11.8%) patients died: 104 (3.7%) died within 90 days. LVEF (range, 30%-81%) was directly proportional to Model for End-stage Liver Disease (MELD) scores, an index of liver disease severity, in survivors but showed a fixed flat-line pattern in nonsurvivors (interaction P = 0.004 between groups), with lower EDVI (P = 0.013) and higher Eed (P = 0.001) in the MELD ≥ 20 group. In the MELD ≥ 20 group, patients with LVEF ≤ 60% had higher 90-day (13% vs. 7.4%; log-rank P = 0.03) and median 5.4-year (26.7% vs. 16.2%; log-rank P = 0.003) mortality rates than those with LVEF > 60%. Specifically, in the MELD > 35 group, the median 5.4-year mortality rate was 53.3% in patients with LVEF ≤ 60% versus 24% in those with LVEF > 60% (log-rank P < 0.001). By contrast, mortality rates of LVEF ≤ 60% and > 60% were similar in the MELD < 20 group (log-rank P = 0.817).

Conclusions: LVEF ≤ 60% is strongly associated with higher post-LT mortality rates in the MELD ≥ 20 group, indicating the need to appraise both LVEF and liver disease severity simultaneously. Enhanced diastolic elastance with low EDVI provides insights into pathogenesis of low LVEF in nonsurvivors with MELD ≥ 20.

Source
http://dx.doi.org/10.1002/hep.30913
April 2020

Effect of Remote Ischemic Preconditioning Conducted in Living Liver Donors on Postoperative Liver Function in Donors and Recipients Following Liver Transplantation: A Randomized Clinical Trial.

Ann Surg 2020 04;271(4):646-653

Department of Anesthesiology and Pain Medicine, Laboratory for Cardiovascular Dynamics, Asan Medical Center, University of Ulsan College of Medicine, Seoul, Korea.

Objective: This study aimed to assess the effects of remote ischemic preconditioning (RIPC) on liver function in donors and recipients after living donor liver transplantation (LDLT).

Background: Ischemia reperfusion injury (IRI) is known to be associated with graft dysfunction after liver transplantation. RIPC is used to lessen the harmful effects of IRI.

Methods: A total of 148 donors were randomly assigned to RIPC (n = 75) and control (n = 73) groups. RIPC involves 3 cycles of 5-minute inflation of a blood pressure cuff to 200 mm Hg to the upper arm, followed by 5-minute reperfusion with cuff deflation. The primary aim was to assess postoperative liver function in donors and recipients and the incidence of early allograft dysfunction and graft failure in recipients.

Results: RIPC was not associated with any differences in postoperative aspartate aminotransferase (AST) and alanine aminotransferase levels after living donor hepatectomy, and it did not decrease the incidence of delayed graft hepatic function (6.7% vs 0.0%, P = 0.074) in donors. AST level on postoperative day 1 [217.0 (158.0, 288.0) vs 259.5 (182.0, 340.0), P = 0.033] and maximal AST level within 7 postoperative days [244.0 (167.0, 334.0) vs 296.0 (206.0, 395.5), P = 0.029] were significantly lower in recipients who received a preconditioned graft. No differences were found in the incidence of early allograft dysfunction (4.1% vs 5.6%, P = 0.955) or graft failure (1.4% vs 5.6%, P = 0.346) among recipients.

Conclusions: RIPC did not improve liver function in living donor hepatectomy. However, RIPC performed in liver donors may be beneficial for postoperative liver function in recipients after living donor liver transplantation.

Source
http://dx.doi.org/10.1097/SLA.0000000000003498
April 2020

Augmentation of Electrocardiographic QRS R-Amplitude Precedes Radiocontrast-Induced Hypotension during Mobile Computed Tomography Scanning.

J Clin Med 2019 04 12;8(4). Epub 2019 Apr 12.

Departments of Anesthesiology and Biomedical Engineering, University of Virginia School of Medicine, Pinn Hall 1232, Charlottesville, VA 22908, USA.

Although intravenous administration of contrast media may trigger a variety of adverse reactions, sedated patients undergoing computed tomography (CT) scanning usually are not able to report their symptoms, which may delay detection of adverse reactions. Furthermore, changes in vital signs cannot be typically measured during mobile CT scanning, which worsens the situation. We aimed to characterize contrast-related hemodynamic changes that occur during mobile CT scanning and predict sudden hypotension based on subtle but robust changes in the electrocardiogram (ECG). We analyzed the digitized hemodynamic data of 20 consecutive patients who underwent clipping of a cerebral artery aneurysm and contrast-enhanced CT scanning following the surgical procedure. Hemodynamic variables, including ECG findings, invasive blood pressure (BP), pulse oximetry results, capnography findings, cardiac output, and systemic vascular resistance, were monitored simultaneously. We measured morphological changes in ECG-derived parameters, including the R-R interval, ST height, and QRS R-amplitude, on a beat-to-beat basis, and evaluated the correlation between those parameters and hemodynamic changes. After the radiocontrast injection, systolic BP decreased by a median 53 mmHg from baseline and spontaneously recovered after 63 ± 19 s. An increase in QRS R-amplitude (median 0.43 mV) occurred 25 ± 10 s before hypotension developed. The receiver operating characteristic curve showed that a 16% increase in QRS R-amplitude can predict a decrease in systolic BP of >25% (area under the curve 0.852). Increased cardiac output (median delta 2.7 L/min from baseline) and decreased systemic vascular resistance (median delta 857 dyn·s/cm⁵ from baseline) were also observed during hypotension. During mobile CT scanning, profound but transient hypotension can be observed, associated with decreased vascular resistance. Augmentation of QRS R-amplitude from an ECG represents a sensitive surrogate for onset of a hypotensive episode after contrast injection, thereby serving as a simple and continuous noninvasive hemodynamic monitoring tool.
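To make the reported threshold concrete, the sketch below flags beats whose R-amplitude rises at least 16% above a pre-injection baseline, the cut-off identified by the ROC analysis above. How per-beat R-amplitudes are obtained and how the baseline window is chosen are assumptions; this is not the authors' implementation.

```python
# Illustrative sketch: flag beats whose QRS R-amplitude exceeds a pre-injection
# baseline by >= 16%, as an early warning of contrast-induced hypotension.
# Baseline = median of the first n_baseline_beats (an assumption).
import numpy as np

def hypotension_warning(r_amplitudes_mv: np.ndarray,
                        n_baseline_beats: int = 20,
                        threshold: float = 0.16) -> np.ndarray:
    baseline = np.median(r_amplitudes_mv[:n_baseline_beats])
    relative_rise = (r_amplitudes_mv - baseline) / baseline
    return relative_rise >= threshold                  # boolean flag per beat

# Toy example: stable beats followed by a rise similar to the reported 0.43 mV change
beats = np.concatenate([np.full(30, 1.0), np.full(10, 1.43)])
print(np.where(hypotension_warning(beats))[0][:3])     # indices of the first flagged beats
```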

Source
http://dx.doi.org/10.3390/jcm8040505
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6518100
April 2019

Neutrophil-to-lymphocyte ratio is a predictor of early graft dysfunction following living donor liver transplantation.

Liver Int 2019 08 8;39(8):1545-1556. Epub 2019 Apr 8.

Department of Anesthesiology and Pain Medicine, Laboratory for Cardiovascular Dynamics, Asan Medical Center, University of Ulsan College of Medicine, Songpa-gu, Republic of Korea.

Background & Aims: Early allograft dysfunction (EAD) is predictive of poor graft and patient survival following living donor liver transplantation (LDLT). Considering the impact of the inflammatory response on graft injury extent following LDLT, we investigated the association between neutrophil-to-lymphocyte ratio (NLR) and EAD, 1-year graft failure, and mortality following LDLT, and compared it to C-reactive protein (CRP), procalcitonin, platelet-to-lymphocyte ratio and the Glasgow prognostic score.

Methods: A total of 1960 consecutive adult LDLT recipients (1531/429 as development/validation cohort) were retrospectively evaluated. Cut-offs were derived using the area under the receiver operating characteristic curve (AUROC), and multivariable regression and Cox proportional hazard analyses were performed.
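Since the cut-off derivation above is only named, the following sketch shows one standard way to obtain such a cut-off, by maximizing the Youden index along the ROC curve; the variable names and simulated data are placeholders, not the authors' exact procedure.

```python
# Illustrative sketch: derive a cut-off for a continuous marker (e.g., preoperative
# NLR) against a binary outcome (e.g., EAD) by maximizing the Youden index.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

def best_cutoff(marker: np.ndarray, outcome: np.ndarray):
    fpr, tpr, thresholds = roc_curve(outcome, marker)
    youden = tpr - fpr                                  # sensitivity + specificity - 1
    return thresholds[np.argmax(youden)], roc_auc_score(outcome, marker)

# Toy example with simulated marker values and outcomes
rng = np.random.default_rng(1)
outcome = rng.binomial(1, 0.15, 2000)
marker = rng.normal(2.5 + 0.8 * outcome, 1.0)           # marker shifted upward in events
cutoff, auroc = best_cutoff(marker, outcome)
print(round(float(cutoff), 2), round(float(auroc), 2))
```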

Results: The risk of EAD increased proportionally with increasing NLR, and the NLR AUROC was 0.73, similar to CRP and procalcitonin and higher than the rest. NLR ≥ 2.85 (best cut-off) showed a significantly higher EAD occurrence (20.5% vs 5.8%, P < 0.001), higher 1-year graft failure (8.2% vs 4.9%, log-rank P = 0.009) and higher 1-year mortality (7% vs 4.5%, log-rank P = 0.039). NLR ≥ 2.85 was an independent predictor of EAD (odds ratio, 1.89 [1.26-2.84], P = 0.002) after multivariable adjustment, whereas CRP and procalcitonin were not. Increasing NLR was independently associated with higher 1-year graft failure and mortality (both P < 0.001). Consistent results in the validation cohort strengthened the prognostic value of NLR.

Conclusions: Preoperative NLR ≥ 2.85 predicted higher risk of EAD, 1-year graft failure and 1-year mortality following LDLT, and NLR was superior to other parameters, suggesting that preoperative NLR may be a practical index for predicting graft function following LDLT.

Source
http://dx.doi.org/10.1111/liv.14103
August 2019

Early postoperative hypoalbuminaemia is associated with pleural effusion after donor hepatectomy: A propensity score analysis of 2316 donors.

Sci Rep 2019 02 26;9(1):2790. Epub 2019 Feb 26.

Department of Anaesthesiology and Pain Medicine, Laboratory for Cardiovascular Dynamics, Asan Medical Center, University of Ulsan College of Medicine, Seoul, Korea.

Pleural effusion and hypoalbuminaemia frequently occur after hepatectomy. Despite the emphasis on the safety of donors, little is known about the impact of postoperative albumin level on pleural effusion in liver donors. We retrospectively assessed 2316 consecutive liver donors from 2004 to 2014. The analysis of donors from 2004 to 2012 showed that postoperative pleural effusion occurred in 47.4% (970/2046), and serum albumin levels decreased until postoperative day 2 (POD2) and increased thereafter. In multivariable analysis, the lowest albumin level within POD2 (POD2ALB) was inversely associated with pleural effusion (OR 0.28, 95% CI 0.20-0.38; P < 0.001). POD2ALB ≤3.0 g/dL, the cutoff value at the 75th percentile, was associated with increased incidence of pleural effusion after propensity score (PS) matching (431 pairs; OR 1.69, 95% CI 1.30-2.21; P < 0.001). When we further analysed data from 2010 to 2014, intraoperative albumin infusion was associated with higher POD2ALB (P < 0.001) and lower incidence of pleural effusion (P = 0.024), compared with synthetic colloid infusion after PS matching (193 pairs). In conclusion, our data showed that POD2ALB is inversely associated with pleural effusion, and that intraoperative albumin infusion is associated with a lower incidence of pleural effusion when compared to synthetic colloid infusion in liver donors.

Source
http://dx.doi.org/10.1038/s41598-019-39126-0
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6391412
February 2019

Risk stratification of myocardial injury after liver transplantation in patients with computed tomographic coronary angiography-diagnosed coronary artery disease.

Am J Transplant 2019 07 19;19(7):2053-2066. Epub 2019 Feb 19.

Department of Anesthesiology and Pain Medicine, Laboratory for Cardiovascular Dynamics, Asan Medical Center, University of Ulsan College of Medicine, Seoul, Korea.

We aimed to determine if the severity of computed tomographic coronary angiography (CTCA)-diagnosed coronary artery disease (CAD) is associated with postliver transplantation (LT) myocardial infarction (MI) within 30 days and early mortality. We retrospectively evaluated 2118 consecutive patients who underwent CAD screening using CTCA. Post-LT type-2 MI, elicited by oxygen supply-and-demand mismatch within a month after LT, was assessed according to the severity of CTCA-diagnosed CAD. Obstructive CAD (>50% narrowing, 9.2% prevalence) was identified in 21.7% of patients with 3 or more known CAD risk factors of the American Heart Association. Post-LT MI occurred in 60 (2.8%) of total patients in whom 90-day mortality rate was 16.7%. Rates of post-LT MI were 2.1%, 3.1%, 3.4%, 4.3%, and 21.4% for normal, nonobstructive CAD, and 1-, 2-, and 3-vessel obstructive CAD, respectively. Two-vessel or 3-vessel obstructive CAD showed a 4.9-fold higher post-LT MI risk compared to normal coronary vessels. The sensitivity and negative predictive value of obstructive CAD in detecting post-LT MI were, respectively, 20% and 97.5%. In conclusion, negative CTCA finding in suspected patients can successfully exclude post-LT MI, whereas proceeding with invasive angiography is needed to further risk-stratify in patients with significant CTCA-diagnosed CAD. Prognostic role of CTCA in predicting post-LT MI needs further research.

Source
http://dx.doi.org/10.1111/ajt.15263
July 2019

Recurrent ST segment elevations in a patient with asymptomatic early repolarization during head and neck surgery: implications of vasospastic angina.

J Dent Anesth Pain Med 2018 Jun 29;18(3):189-193. Epub 2018 Jun 29.

Department of Anesthesiology and Pain Medicine, University of Ulsan College of Medicine, Seoul, Korea.

A 57-year-old woman scheduled for cochlear implant removal exhibited preoperative electrocardiographic findings of early repolarization (ER). Four episodes of transient ST segment elevations during surgery raised suspicion for vasospastic angina (VA). In the post-anesthetic care unit, the patient complained of chest discomfort and received sublingual nitroglycerin with uncertain effect. The patient refused to proceed with postoperative invasive coronary angiography, resulting in inconclusive diagnosis. Intraoperative circumstances limit the diagnosis of VA, which emphasizes the need for further testing to confirm the diagnosis. When VA is suspected in patients with underlying ER, it is reasonable to consider invasive examination to establish the diagnosis and prevent recurrence of VA. If ST changes are observed during surgery in patients with preoperative ER, careful monitoring is recommended. Due to general anesthesia, the absence of patient symptoms limits the definitive diagnosis of those with suspected VA. Therefore, additional postoperative surveillance is recommended.

Source
http://dx.doi.org/10.17245/jdapm.2018.18.3.189
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6031974
June 2018

Prevalent metabolic derangement and severe thrombocytopenia in ABO-incompatible liver recipients with pre-transplant plasma exchange.

Sci Rep 2018 04 27;8(1):6679. Epub 2018 Apr 27.

Department of Anesthesiology and Pain Medicine, Laboratory for Cardiovascular Dynamics, Asan Medical Center, University of Ulsan College of Medicine, 88 Olympic-ro 43-gil, Songpa-gu, Seoul, 05505, Republic of Korea.

Desensitisation with therapeutic plasma exchange (TPE) is essential for ABO-incompatible (ABO-I) liver transplants (LTs). However, excessive citrate load and coagulation disturbances after TPE have been poorly studied, in particular in cirrhotic patients with hypocapnic alkalosis, metabolic compensation and electrolyte imbalances. We retrospectively evaluated 1123 consecutive LT recipients (923 ABO-compatible [ABO-C], 200 ABO-I) from November 2008 to May 2015. TPE was generally performed a day before LT and blood sampling was performed before anaesthesia induction. We performed propensity score matching (PSM) and inverse probability treatment weighting (IPTW) analyses. In 199 PSM pairs, metabolic alkalosis was more prevalent in ABO-I LT recipients (as expected, owing to citrate conversion), with higher pH ≥ 7.50 (IPTW-adjusted odds ratio [aOR] = 2.23) than in ABO-C LT recipients. With increasing cirrhosis severity, the arterial pH and bicarbonate levels showed dose-dependent relationships, whereas mild hypoxaemia was more prevalent in ABO-I LT recipients. ABO-I LT recipients exhibited worsened hypokalaemia ≤3.0 mmol/l (17.6%, aOR = 1.44), hypomagnesaemia ≤1.7 mg/dl (27.6%, aOR = 3.43) and thrombocytopenia <30,000/µl (19.1%, aOR = 2.26) confirmed by lower maximal clot firmness (P = 0.001) in rotational thromboelastometry (EXTEM), which necessitated platelet transfusions. Preoperative identification of these changes may prevent worsening of severe electrolyte disturbances and thrombocytopenia for optimal LT anaesthesia.

Source
http://dx.doi.org/10.1038/s41598-018-24887-x
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5923210
April 2018

Low Mean Arterial Blood Pressure is Independently Associated with Postoperative Acute Kidney Injury After Living Donor Liver Transplantation: A Propensity Score Weighing Analysis.

Ann Transplant 2018 Apr 10;23:236-245. Epub 2018 Apr 10.

Department of Anesthesiology and Pain Medicine, Laboratory for Cardiovascular Dynamics, Asan Medical Center, University of Ulsan College of Medicine, Seoul, South Korea.

BACKGROUND As end-stage liver disease progresses, renal blood flow linearly correlates with mean arterial blood pressure (MBP) due to impaired autoregulation. We investigated whether a lower postoperative MBP would predict the occurrence of postoperative acute kidney injury (AKI) after liver transplantation. MATERIAL AND METHODS This retrospective study enrolled 1,136 recipients with normal preoperative kidney function. Patients were categorized into two groups according to the averaged postoperative MBP: <90 mmHg (MBPbelow90) and ≥90 mmHg (MBPover90). The primary endpoint was occurrence of postoperative AKI, defined by the creatinine criteria of the Kidney Disease Improving Global Outcomes. The logistic regression model with inverse probability treatment weighting (IPTW) of propensity score was used to compare the risk of postoperative AKI between two groups. RESULTS MBPbelow90 group (83.0±5.1 mmHg) showed higher prevalence and risk of postoperative AKI (74.2% versus 62.6%, p<0.001; IPTW-OR 1.34 [1.12-1.61], p=0.001) compared with MBPover90 group (97.3±5.2 mmHg). When stratified by quartiles of baseline cystatin C glomerular filtration rate (GFR), the association between MBPbelow90 and postoperative AKI remained significant only with the lowest quartile (cystatin C GFR ≤85 mL/min/1.73 m²; IPTW-OR 2.24 [1.53-3.28], p<0.001), but not with 2nd-4th quartiles. CONCLUSIONS Our results suggest that maintaining supranormal MBP over 90 mmHg may be beneficial to reduce the risk of post-LT AKI, especially for liver transplant recipients with cystatin C GFR ≤85 mL/min/1.73 m².

Source
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6248026
April 2018

Propofol for implantable cardioverter defibrillator implantation in patients with Brugada syndrome.

Pacing Clin Electrophysiol 2018 06 4;41(6):656-660. Epub 2018 May 4.

Department of Internal Medicine (Cardiology), Asan Medical Center, University of Ulsan College of Medicine, Seoul, Korea.

Aims: Avoiding propofol in patients with Brugada syndrome has been suggested because of the theoretical risk of provoking ventricular arrhythmias, although propofol may be selected for conscious sedation during electrophysiological procedures in catheterization laboratories. This study aimed to document periprocedural electrocardiographic changes and adverse events in patients with Brugada syndrome undergoing implantable cardioverter defibrillator (ICD) implantation using propofol sedation.

Methods: We reviewed the clinical data of 53 consecutive patients who underwent ICD implantation during 1998-2011. Sedation was achieved by combining propofol with either midazolam or fentanyl, and a bolus propofol dose (0.5-1 mg/kg) was administered to induce deep sedation. Periprocedural events, including arrhythmias, defibrillations, and hyperthermia episodes, were evaluated, and electrocardiogram (ECG) variables were measured. The need for emergency anesthetic support/intubation and incidence of perioperative complications or mortality were analyzed.

Results: Procedure time and cumulative propofol dose for each patient were 125.2 (42.8) min and 204.6 (212.7) mg, respectively. During deep sedation, blood pressure, heart rate, and oxygen saturation were significantly decreased (P < 0.001) such that eight (15.1%) patients required manual ventilation and one (1.9%) needed atropine injection. No significant ECG changes were observed. Only two (3.7%) patients showed newly developed ST elevation in the anterior precordial lead, whereas three (5.6%) had isolated premature ventricular contractions.

Conclusion: ICD implantation without significant ECG changes or adverse outcomes is feasible under propofol sedation in patients with Brugada syndrome. However, because of significant hemodynamic changes and respiratory compromise, close monitoring and meticulous propofol dose titration are warranted.

Source
http://dx.doi.org/10.1111/pace.13342
June 2018

Cardiovascular dysfunction and liver transplantation.

Korean J Anesthesiol 2018 Apr 2;71(2):85-91. Epub 2018 Apr 2.

Department of Anesthesiology and Pain Medicine, Laboratory for Cardiovascular Dynamics, Asan Medical Center, University of Ulsan College of Medicine, Seoul, Korea.

Cardiovascular complications have emerged as the leading cause of death after liver transplantation, particularly among those with advanced liver cirrhosis. Therefore, a thorough and accurate cardiovascular evaluation with clear comprehension of cirrhotic cardiomyopathy is recommended for optimal anesthetic management. However, cirrhotic patients manifest cardiac dysfunction concomitant with pronounced systemic hemodynamic changes, characterized by hyperdynamic circulation such as increased cardiac output, high heart rate, and decreased systemic vascular resistance. These unique features mask significant manifestations of cardiac dysfunction at rest, which makes it difficult to accurately evaluate cardiovascular status. In this review, we have summarized the current knowledge of heart and liver interactions, focusing on the usefulness and limitations of cardiac evaluation tools for identifying high-risk patients.

Source
http://dx.doi.org/10.4097/kjae.2018.71.2.85
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5903113
April 2018

The Impact of Postreperfusion Syndrome on Acute Kidney Injury in Living Donor Liver Transplantation: A Propensity Score Analysis.

Anesth Analg 2018 08;127(2):369-378

From the Department of Anesthesiology and Pain Medicine, Asan Medical Center, University of Ulsan College of Medicine, Seoul, Korea.

Background: Postreperfusion syndrome (PRS) has been shown to be related to postoperative morbidity and graft failure in orthotopic liver transplantation. To date, little is known about the impact of PRS on the prevalence of postoperative acute kidney injury (AKI) and the postoperative outcomes after living donor liver transplantation (LDLT). The purpose of our study was to determine the impact of PRS on AKI and postoperative outcomes after LDLT surgery.

Methods: Between January 2008 and October 2015, we retrospectively collected and evaluated the records of 1865 patients who underwent LDLT surgery. We divided the patients into 2 groups according to the development of PRS: PRS group (n = 715) versus no PRS group (n = 1150). Risk factors for AKI and mortality were investigated by multivariable logistic and Cox proportional hazards regression model analysis. Propensity score (PS) analysis (PS matching and inverse probability of treatment weighting analysis) was designed to compare the outcomes between the 2 groups.

Results: The prevalence of PRS and the mortality rate were 38% and 7%, respectively. In unadjusted analyses, the PRS group showed more frequent development of AKI (P < .001), longer hospital stay (P = .010), and higher incidence of intensive care unit stay over 7 days (P < .001) than the no PRS group. After PS matching and inverse probability of treatment weighting analysis, the PRS group showed a higher prevalence of postoperative AKI (P = .023 and P = .017, respectively) and renal dysfunction 3 months after LDLT (P = .036 and P = .006, respectively), and a higher incidence of intensive care unit stay over 7 days (P = .014 and P = .032, respectively).

Conclusions: We demonstrated that the magnitude and duration of hypotension caused by PRS are factors contributing to the development of AKI and residual renal dysfunction 3 months after LDLT.

Source
http://dx.doi.org/10.1213/ANE.0000000000003370
August 2018