Publications by authors named "Steven Zweibel"

18 Publications


Impact of wearable cardioverter-defibrillator compliance on outcomes in the VEST trial: As-treated and per-protocol analyses.

J Cardiovasc Electrophysiol 2020 05 3;31(5):1009-1018. Epub 2020 Mar 3.

Department of Epidemiology and Biostatistics, University of California San Francisco, San Francisco, California.

Background: The Vest Prevention of Early Sudden Death Trial (VEST) did not demonstrate a significant reduction in arrhythmic death with the wearable cardioverter-defibrillator (WCD), but compliance with the device may have substantially affected the results. The influence of WCD compliance on outcomes has not yet been fully evaluated.

Methods: Using linear and pooled logistic models, we performed as-treated analyses omitting person-time in the hospital and adjusted for correlates of WCD compliance. To assess the impact of early stopping of WCD, we performed a per-protocol Kaplan-Meier analysis, censoring after the last day the WCD was worn. Interactions of potential effect modifiers with treatment assignment and WCD compliance on outcomes were investigated. Finally, we used linear models to identify predictors of WCD compliance.
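
The per-protocol censoring step lends itself to a short illustration. The sketch below is a minimal, hypothetical reconstruction rather than the trial's analysis code; it assumes the lifelines library and illustrative column names (arm, time, died, last_wear_day).

```python
# Minimal sketch of a per-protocol Kaplan-Meier analysis: device-group
# follow-up is censored after the last day the WCD was worn.
# Column names and the CSV file are illustrative assumptions.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

def per_protocol_frame(df: pd.DataFrame) -> pd.DataFrame:
    """Censor device-group follow-up at the last day the WCD was worn."""
    pp = df.copy()
    device = pp["arm"] == "device"
    # Deaths occurring after the last wear day are counted as censored.
    pp.loc[device, "time"] = pp.loc[device, ["time", "last_wear_day"]].min(axis=1)
    pp.loc[device & (df["time"] > df["last_wear_day"]), "died"] = 0
    return pp

df = pd.read_csv("vest_like_data.csv")          # hypothetical dataset
pp = per_protocol_frame(df)

kmf = KaplanMeierFitter()
for arm, grp in pp.groupby("arm"):
    kmf.fit(grp["time"], event_observed=grp["died"], label=arm)
    print(arm, kmf.survival_function_.iloc[-1])

dev, ctl = pp[pp["arm"] == "device"], pp[pp["arm"] == "control"]
print(logrank_test(dev["time"], ctl["time"],
                   event_observed_A=dev["died"],
                   event_observed_B=ctl["died"]).p_value)
```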

Results: A per-protocol analysis demonstrated a significant reduction in total (P < .001) and arrhythmic (P = .001) mortality. Better WCD compliance was independently predicted by cardiac arrest during the index myocardial infarction (MI), higher creatinine, diabetes, prior heart failure, ejection fraction ≤25%, enrollment at a Polish center, and the number of WCD alarms, while worse compliance was predicted by being divorced, Asian race, higher body mass index, prior percutaneous coronary intervention, or any WCD shock. Neither excluding in-hospital time from the as-treated analysis nor adjusting for factors affecting WCD compliance materially changed the results. No variable demonstrated a significant interaction in either the intention-to-treat or as-treated analysis.

Conclusion: Robust sensitivity analyses of as-treated and per-protocol analyses suggest that the WCD is protective in compliant patients with ejection fraction less than or equal to 35% during the first 3 months post-MI.
Source: http://dx.doi.org/10.1111/jce.14404
May 2020

Stroke Risk as a Function of Atrial Fibrillation Duration and CHA₂DS₂-VASc Score.

Circulation 2019 11 30;140(20):1639-1646. Epub 2019 Sep 30.

Division of Cardiology, Department of Medicine, Feinberg School of Medicine, Northwestern University, Chicago, IL (R.M.K., R.S.P.).

Background: Studies of patients with cardiovascular implantable electronic devices show a relationship between atrial fibrillation (AF) duration and stroke risk, although the interaction with the CHA₂DS₂-VASc score is poorly defined. The objective of this study is to evaluate rates of stroke and systemic embolism (SSE) in patients with cardiovascular implantable electronic devices as a function of both CHA₂DS₂-VASc score and AF duration.

Methods: Data from the Optum electronic health record deidentified database (2007-2017) were linked to the Medtronic CareLink database of cardiovascular implantable electronic devices capable of continuous AF monitoring. An index date was assigned as the later of either 6 months after device implantation or 1 year after electronic health record data availability. The CHA₂DS₂-VASc score was assessed using electronic health record data before the index date. Maximum daily AF burden (no AF, 6 minutes-23.5 hours, and >23.5 hours) was assessed over the 6 months before the index date. SSE rates were computed after the index date.
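
As a concrete illustration of the two exposures being crossed here, the following sketch (a hypothetical helper, not the study's code) computes a CHA₂DS₂-VASc score and assigns the maximum-daily-AF-burden categories used in the abstract.

```python
# Minimal sketch: CHA2DS2-VASc scoring plus the abstract's AF-burden buckets
# (no AF, 6 minutes-23.5 hours, >23.5 hours). Field names are assumptions.
def cha2ds2_vasc(age, female, chf, htn, diabetes, stroke_tia, vascular):
    score = 0
    score += 1 if chf else 0            # congestive heart failure
    score += 1 if htn else 0            # hypertension
    score += 2 if age >= 75 else (1 if age >= 65 else 0)
    score += 1 if diabetes else 0
    score += 2 if stroke_tia else 0     # prior stroke/TIA/thromboembolism
    score += 1 if vascular else 0       # vascular disease
    score += 1 if female else 0
    return score

def af_burden_category(max_daily_af_hours):
    if max_daily_af_hours * 60 < 6:     # under 6 minutes counts as "no AF"
        return "no AF"
    return "6 min-23.5 h" if max_daily_af_hours <= 23.5 else ">23.5 h"

print(cha2ds2_vasc(age=70, female=True, chf=False, htn=True,
                   diabetes=False, stroke_tia=False, vascular=False))  # -> 3
print(af_burden_category(5.0))  # -> '6 min-23.5 h'
```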

Results: Among 21 768 nonanticoagulated patients with cardiovascular implantable electronic devices (age, 68.6±12.7 years; 63% male), both increasing AF duration (P<0.001) and increasing CHA₂DS₂-VASc score (P<0.001) were significantly associated with annualized risk of SSE. SSE rates were low in patients with a CHA₂DS₂-VASc score of 0 to 1 regardless of device-detected AF duration. However, stroke risk crossed an actionable threshold, defined as >1%/y, in patients with a CHA₂DS₂-VASc score of 2 with >23.5 hours of AF, in those with a CHA₂DS₂-VASc score of 3 to 4 with >6 minutes of AF, and in patients with a CHA₂DS₂-VASc score ≥5 even with no AF.

Conclusions: There is an interaction between AF duration and CHA₂DS₂-VASc score that can further risk-stratify patients with AF for SSE and may be useful in guiding anticoagulation therapy.
Source: http://dx.doi.org/10.1161/CIRCULATIONAHA.119.041303
November 2019

His bundle pacing, learning curve, procedure characteristics, safety, and feasibility: Insights from a large international observational study.

J Cardiovasc Electrophysiol 2019 10 2;30(10):1984-1993. Epub 2019 Aug 2.

National Heart and Lung Institute, Imperial College London, London.

Background: His-bundle pacing (HBP) provides physiological ventricular activation. Observational studies have demonstrated the technique's feasibility; however, data have come from a limited number of centers.

Objectives: We set out to explore the contemporary global practice in HBP focusing on the learning curve, procedural characteristics, and outcomes.

Methods: This is a retrospective, multicenter observational study of patients undergoing attempted HBP at seven centers. Pacing indication, fluoroscopy time, HBP thresholds, and lead reintervention and deactivation rates were recorded. Where centers had systematically recorded implant success rates from the outset, these were collated.

Results: A total of 529 patients underwent attempted HBP during the study period (2014-19) with a mean follow-up of 217 ± 303 days. Most implants were for bradycardia indications. In the three centers with the systematic collation of all attempts, the overall implant success rate was 81%, which improved to 87% after completion of 40 cases. All seven centers reported data on successful implants. The mean fluoroscopy time was 11.7 ± 12.0 minutes, the His-bundle capture threshold at implant was 1.4 ± 0.9 V at 0.8 ± 0.3 ms, and it was 1.3 ± 1.2 V at 0.9 ± 0.2 ms at last device check. HBP lead reintervention or deactivation (for lead displacement or rise in threshold) occurred in 7.5% of successful implants. There was evidence of a learning curve: fluoroscopy time and HBP capture threshold reduced with greater experience, plateauing after approximately 30-50 cases.

Conclusion: We found that it is feasible to establish a successful HBP program, using the currently available implantation tools. For physicians who are experienced at pacemaker implantation, the steepest part of the learning curve appears to be over the first 30-50 cases.
Source: http://dx.doi.org/10.1111/jce.14064
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7038224
October 2019

Reactive atrial-based antitachycardia pacing therapy reduces atrial tachyarrhythmias.

Pacing Clin Electrophysiol 2019 07 29;42(7):970-979. Epub 2019 Apr 29.

Cardiology Division, Department of Biomedical, Metabolic and Neural Sciences, University of Modena and Reggio Emilia, Modena University Hospital, Modena, Italy.

Background: Reactive atrial-based antitachycardia pacing (rATP) aims to terminate atrial tachyarrhythmia/atrial fibrillation (AT/AF) episodes when they spontaneously organize into atrial flutter or atrial tachycardia; however, its real-world effectiveness has not been studied. We used a large device database (Medtronic CareLink, Medtronic, Minneapolis, MN, USA) to evaluate the effect of rATP in reducing AT/AF.

Methods: Pacemaker, defibrillator, and resynchronization device transmission data were analyzed. Eligible patients had device-detected AT/AF during a baseline period but were not in persistent AT/AF immediately preceding the first transmission. One-to-one individual matching between groups was conducted using age, sex, device type, pacing mode, baseline AT/AF, and percent ventricular pacing. Risks of AT/AF events were compared between patients with rATP enabled and control patients in whom rATP was disabled or not available in the device. For matched patients, AT/AF event rates at 2 years were estimated by the Kaplan-Meier method, and hazard ratios (HRs) were calculated with Cox proportional hazards models.
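
A minimal sketch of this kind of matched time-to-event comparison, assuming the lifelines library and illustrative column names rather than the CareLink dataset:

```python
# Sketch only: Kaplan-Meier event rates at 2 years and a Cox hazard ratio
# for rATP-on versus matched control. Column names are assumptions.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.read_csv("matched_pairs.csv")   # hypothetical matched cohort
# columns: time_days, event_ge_1day (0/1), ratp_on (0/1)

for label, grp in df.groupby("ratp_on"):
    kmf = KaplanMeierFitter().fit(grp["time_days"], grp["event_ge_1day"])
    # 1 - S(730 days) = cumulative AT/AF event rate at 2 years
    print("rATP on" if label else "control", 1 - kmf.predict(730))

cph = CoxPHFitter()
cph.fit(df[["time_days", "event_ge_1day", "ratp_on"]],
        duration_col="time_days", event_col="event_ge_1day")
print(cph.hazard_ratios_)   # HR < 1 favours rATP
```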

Results: Of 43,440 qualifying patients, 4,203 had rATP on. Matching resulted in 4,016 pairs, totaling 8,032 patients for analysis. The rATP group experienced significantly lower risks of AT/AF events lasting ≥1 day (HR 0.81), ≥7 days (HR 0.64), and ≥30 days (HR 0.56) compared to control (P < 0.0001 for all). In subgroup analysis, rATP was associated with reduced risks of AT/AF events across age, sex, device type, baseline AT/AF, and preventive atrial pacing.

Conclusions: Among real-world patients from a large device database, rATP therapy was significantly associated with a reduced risk of AT/AF. This association was independent of whether the patient had a pacemaker, defibrillator, or resynchronization device.
Source: http://dx.doi.org/10.1111/pace.13696
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6850031
July 2019

Estimating the incidence of atrial fibrillation in single-chamber implantable cardioverter defibrillator patients.

Pacing Clin Electrophysiol 2019 02 13;42(2):132-138. Epub 2018 Dec 13.

Medtronic plc, Mounds View, Minnesota.

Background: Atrial arrhythmias are associated with major adverse cardiovascular events. Recent reports among implantable cardioverter defibrillator (ICD) patients have demonstrated a high prevalence of atrial fibrillation (AF), predominantly in dual-chamber recipients. AF incidence among patients with single-chamber systems (approximately 50% of all ICDs) is currently unknown. The objective was to estimate the incidence of new-onset AF among single-chamber ICD patients by observing rates of new atrial tachycardia (AT)/AF in a propensity score-matched cohort of dual-chamber ICD patients from the PainFree SmartShock Technology study, to better inform screening initiatives.

Methods: Of the 2770 patients enrolled, 1862 single-chamber, dual-chamber, and cardiac resynchronization therapy subjects with no prior history of atrial tachyarrhythmias were included. Daily AT/AF burden in single-chamber recipients was estimated using a propensity score-weighted model based on data from dual-chamber ICDs.
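
The general idea of propensity score weighting can be sketched as follows; this is an illustrative reconstruction under assumed covariates and column names, not the study's model.

```python
# Sketch: fit a logistic model for the probability of having a dual-chamber
# device, then weight dual-chamber AT/AF observations so that they resemble
# the single-chamber population. Covariate names are illustrative assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("icd_cohort.csv")          # hypothetical data
covariates = ["age", "male", "crt", "primary_prevention"]
ps_model = LogisticRegression(max_iter=1000).fit(df[covariates],
                                                 df["dual_chamber"])
df["ps"] = ps_model.predict_proba(df[covariates])[:, 1]

dual = df[df["dual_chamber"] == 1].copy()
# Odds weights re-weight dual-chamber patients toward the single-chamber mix.
dual["w"] = (1 - dual["ps"]) / dual["ps"]
est_incidence = (dual["w"] * dual["ataf_ge_6min"]).sum() / dual["w"].sum()
print(f"estimated AT/AF >= 6 min incidence: {est_incidence:.1%}")
```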

Results: Over 22 ± 9 months of follow-up, the estimated incidence of AT/AF lasting at least 6 minutes, 6 hours, and 24 hours per day in the single-chamber cohort was 22.0%, 9.8%, and 6.3%, respectively, whereas among dual-chamber patients the corresponding rates were 26.6%, 13.1%, and 7.1%. Initiation of oral anticoagulation was estimated to occur in 9.8% of the propensity-matched single-chamber cohort, which was higher than the actual observed rate of 6.0%. Stroke and transient ischemic attack occurred at low rates in all device subgroups.

Conclusions: Atrial arrhythmias occur frequently in single-chamber ICD recipients, and the findings suggest significant underutilization of anticoagulation. Routine screening for AF should be considered in single-chamber ICD recipients.
Source: http://dx.doi.org/10.1111/pace.13555
February 2019

Wearable Cardioverter-Defibrillator after Myocardial Infarction.

N Engl J Med 2018 Sep;379(13):1205-1215

From the Division of Cardiology, Department of Medicine, the UCSF Center for the Prevention of Sudden Death (J.E.O., C.M., B.K.L.) and the Department of Epidemiology and Biostatistics (M.J.P., E.V., T.F.H., F.L., J.A.S., S.H.), University of California, San Francisco, San Francisco; the Department of Electrocardiology, Medical University of Lodz, Lodz, Poland (J.W.); McLeod Regional Medical Center, Florence, SC (R.M.); Ochsner Medical Center and Ochsner Clinical School, University of Queensland School of Medicine, New Orleans (D.P.M.); Hartford Healthcare Heart and Vascular Institute and University of Connecticut School of Medicine, Hartford (S.Z.); Beth Israel Deaconess Medical Center, Harvard Medical School, Boston (A.E.B.); Gill Heart Institute, University of Kentucky, and Veterans Affairs Medical Center, Lexington (C.S.E.); the Department of Internal Medicine, University of Michigan, Michigan Medicine, Ann Arbor (E.H.C.); Stony Brook Medicine, Stony Brook, NY (E.R.); and First Department of Medicine-Cardiology, University Medical Center Mannheim, Mannheim, and DZHK (German Center for Cardiovascular Research), Heidelberg - both in Germany (M.B.).

Background: Despite the high rate of sudden death after myocardial infarction among patients with a low ejection fraction, implantable cardioverter-defibrillators are contraindicated until 40 to 90 days after myocardial infarction. Whether a wearable cardioverter-defibrillator would reduce the incidence of sudden death during this high-risk period is unclear.

Methods: We randomly assigned (in a 2:1 ratio) patients with acute myocardial infarction and an ejection fraction of 35% or less to receive a wearable cardioverter-defibrillator plus guideline-directed therapy (the device group) or to receive only guideline-directed therapy (the control group). The primary outcome was the composite of sudden death or death from ventricular tachyarrhythmia at 90 days (arrhythmic death). Secondary outcomes included death from any cause and nonarrhythmic death.

Results: Of 2302 participants, 1524 were randomly assigned to the device group and 778 to the control group. Participants in the device group wore the device for a median of 18.0 hours per day (interquartile range, 3.8 to 22.7). Arrhythmic death occurred in 1.6% of the participants in the device group and in 2.4% of those in the control group (relative risk, 0.67; 95% confidence interval [CI], 0.37 to 1.21; P=0.18). Death from any cause occurred in 3.1% of the participants in the device group and in 4.9% of those in the control group (relative risk, 0.64; 95% CI, 0.43 to 0.98; uncorrected P=0.04), and nonarrhythmic death in 1.4% and 2.2%, respectively (relative risk, 0.63; 95% CI, 0.33 to 1.19; uncorrected P=0.15). Of the 48 participants in the device group who died, 12 were wearing the device at the time of death. A total of 20 participants in the device group (1.3%) received an appropriate shock, and 9 (0.6%) received an inappropriate shock.
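
The headline relative risks can be reproduced from the event counts with the standard log-RR interval. A worked check follows; the device-group death count (48) is stated in the abstract, while the control-group count (38) is inferred from 4.9% of 778 and is an assumption.

```python
# Worked check, not trial source code: relative risk of death from any cause
# with a 95% CI from counts.
from math import exp, log, sqrt

def relative_risk(a, n1, b, n2, z=1.96):
    rr = (a / n1) / (b / n2)
    se = sqrt(1/a - 1/n1 + 1/b - 1/n2)          # standard error of log(RR)
    lo, hi = exp(log(rr) - z * se), exp(log(rr) + z * se)
    return rr, lo, hi

# ~ (0.64, 0.42, 0.98), close to the reported 0.64 (95% CI, 0.43 to 0.98)
print(relative_risk(48, 1524, 38, 778))
```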

Conclusions: Among patients with a recent myocardial infarction and an ejection fraction of 35% or less, the wearable cardioverter-defibrillator did not lead to a significantly lower rate of the primary outcome of arrhythmic death than control. (Funded by the National Institutes of Health and Zoll Medical; VEST ClinicalTrials.gov number, NCT01446965.)
Source: http://dx.doi.org/10.1056/NEJMoa1800781
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6276371
September 2018

Total laser cycles-a measure of transvenous lead extraction difficulty.

J Interv Card Electrophysiol 2018 Dec 16;53(3):383-389. Epub 2018 Aug 16.

Hartford HealthCare Heart and Vascular Institute, Hartford Hospital, Hartford, CT, USA.

Background: Several variables have been identified as predictors of difficult or complicated transvenous lead extraction (TLE), including the age and number of implanted leads as well as the patient's age; however, a standard measure of TLE difficulty has not been described.

Objective: The total number of laser cycles (TLC) delivered during laser-assisted TLE is an objective variable that could reflect the difficulty of TLE. This study investigated whether TLC is correlated with known predictors of difficult TLE.

Methods: In a retrospective study of TLE procedures using the laser sheath, we analyzed TLC delivered and compared it to established predictors of procedural failure and complications.

Results: Of 166 patients undergoing TLE, the laser sheath (SLS II or GlideLight, Spectranetics Inc.) was used as the primary extraction sheath in 130 patients, and 100 patients had complete TLC data available. The mean age of the oldest lead (AOL) was 7.1 ± 3.2 years, with a median of 6.91 (interquartile range [IQR] 0.48-16.69) years, and 1.6 ± 0.7 leads (range, 1-4) were extracted per procedure. Two thirds of procedures involved ICD leads. The clinical success rate was 99%, with one patient (1%) experiencing a major complication. Median TLC delivered was 1165 (IQR, 567-2062; range, 49-9522). TLC was positively correlated with AOL (r = 0.227, p = 0.023) and with the combined age of the extracted leads (r = 0.307, p = 0.002). TLC was also positively correlated with the number of leads extracted per procedure (ρ = 0.227, p = 0.024). There was a nonsignificant negative trend toward correlation between TLC and patient's age (r = -0.112, p = 0.268).
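
The correlation analysis itself is straightforward to reproduce; a minimal sketch with illustrative column names (not the study dataset):

```python
# Sketch: Pearson r for TLC versus lead age, Spearman rho for TLC versus
# number of leads extracted. File and column names are assumptions.
import pandas as pd
from scipy import stats

df = pd.read_csv("tle_cases.csv")    # hypothetical extraction log

r, p = stats.pearsonr(df["total_laser_cycles"], df["age_oldest_lead_years"])
print(f"TLC vs oldest-lead age: r = {r:.3f}, p = {p:.3f}")

rho, p = stats.spearmanr(df["total_laser_cycles"], df["n_leads_extracted"])
print(f"TLC vs leads extracted: rho = {rho:.3f}, p = {p:.3f}")
```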

Conclusion: TLC showed significant correlation with known predictors of difficulty during TLE using the laser sheath. TLC is an objective measure of TLE difficulty and could usefully be reported in future series of laser lead extractions.
Source: http://dx.doi.org/10.1007/s10840-018-0422-3
December 2018

Effect of an Educational Intervention on the Accuracy of Data Submitted to a National Quality Registry.

Conn Med 2017 Apr;81(4):197-202

Background: We hypothesize that data-entry errors within the National Cardiovascular Data Registry® (NCDR) ICD Registry™ may be an important reason that many cases are labeled as nonevidence-based.

Objective: To describe the frequency of data-entry errors in implantable cardioverter-defibrillator (ICD) implant data from our institution and to develop a plan for quality improvement using the Deming cycle.

Methods And Results: We assessed data from patient report forms submitted from 2007 to 2010 and compared them with forms submitted from 2011 to 2012, after implementation of a continuous, multicomponent staff education and training program. Of 211 ICD implants between 2007 and 2010, 36 (17%) were labeled nonevidence-based: 24 (11.4%) resulted from misclassification due to data-entry errors and 12 (5.7%) were actually nonevidence-based. Postintervention, review of 97 submitted patients' data revealed one (1%) data-entry error and three (3.1%) actually nonevidence-based implants.
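
A worked check of the reported proportions, with a Fisher exact comparison added purely as an illustration (the abstract does not report a formal test):

```python
# Pre/post data-entry error rates and an illustrative Fisher exact test.
from scipy.stats import fisher_exact

pre_errors, pre_total = 24, 211      # data-entry misclassifications, 2007-2010
post_errors, post_total = 1, 97      # after the educational intervention
print(f"pre:  {pre_errors/pre_total:.1%}")    # ~11.4%
print(f"post: {post_errors/post_total:.1%}")  # ~1.0%

odds_ratio, p = fisher_exact([[pre_errors, pre_total - pre_errors],
                              [post_errors, post_total - post_errors]])
print(f"Fisher exact p = {p:.4f}")
```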

Conclusions: A multicomponent educational intervention was effective in reducing errors in data submitted to the NCDR ICD Registry.
April 2017

Healthcare Utilization and Expenditures Associated With Appropriate and Inappropriate Implantable Defibrillator Shocks.

Circ Cardiovasc Qual Outcomes 2017 02;10(2)

From the Center for Digital Health, Stanford University School of Medicine, Stanford, CA (M.P.T.); the Division of Cardiac Electrophysiology, Hartford Healthcare Heart and Vascular Institute, Hartford, CT (S.Z.); the Department of Economics, Reimbursement, and Evidence, Medtronic plc, Mounds View, MN (A.L.S., S.A.M.); and Baim Institute for Clinical Research, Boston, MA (M.R.R.).

Background: In patients with implantable cardioverter-defibrillators, healthcare utilization (HCU) and expenditures related to shocks have not been quantified.

Methods And Results: We performed a retrospective cohort study of patients with implantable cardioverter-defibrillators identified from commercial and Medicare supplemental claims databases linked to adjudicated shock events from remote monitoring data. A shock event was defined as ≥1 spontaneous shocks delivered by an implanted device. Shock-related HCU was ascertained from inpatient and outpatient claims within 7 days following a shock event. Shock events were adjudicated and classified as inappropriate or appropriate, and HCU and expenditures, stratified by shock type, were quantified. Of 10 266 linked patients, 963 (9.4%) patients (61.3±13.6 years; 81% male) had 1885 shock events (56% appropriate, 38% inappropriate, and 6% indeterminate). Of these events, 867 (46%) had shock-related HCU (14% inpatient and 32% outpatient). After shocks, inpatient cardiovascular procedures were common, including echocardiography (59%), electrophysiology study or ablation (34%), stress testing (16%), and lead revision (11%). Cardiac catheterization was common after both appropriate and inappropriate shocks (71% and 51%, respectively), but rates of percutaneous coronary intervention were low (6.5% and 5.0%). Expenditures related to appropriate and inappropriate shocks were not significantly different.

Conclusions: After implantable cardioverter-defibrillator shock, related HCU was common, with 1 in 3 shock events followed by outpatient HCU and 1 in 7 followed by hospitalization. Use of invasive cardiovascular procedures was substantial, even after inappropriate shocks, which comprised 38% of all shocks. Implantable cardioverter-defibrillator shocks seem to trigger a cascade of health care. Strategies to reduce shocks could result in cost savings.
Source: http://dx.doi.org/10.1161/CIRCOUTCOMES.115.002210
February 2017

Predicting Persistent Left Ventricular Dysfunction Following Myocardial Infarction: The PREDICTS Study.

J Am Coll Cardiol 2016 Mar;67(10):1186-1196

Department of Medicine, Division of Cardiology, University of California San Francisco, San Francisco, California.

Background: Persistent severe left ventricular (LV) systolic dysfunction after myocardial infarction (MI) is associated with increased mortality and is a class I indication for implantation of a cardioverter-defibrillator.

Objectives: This study developed models and assessed independent predictors of LV ejection fraction recovery to >35% and to ≥50% at 90-day follow-up in patients presenting with acute MI and severe LV dysfunction.

Methods: Our multicenter prospective observational study enrolled participants with ejection fraction (EF) of ≤35% at the time of MI (n = 231). Predictors for EF recovery to >35% and ≥50% were identified after multivariate modeling and validated in a separate cohort (n = 236).

Results: In the PREDICTS (PREDiction of ICd Treatment Study) study, 43% of patients had persistent EF ≤35%, 31% had an EF of 36% to 49%, and 26% had an EF ≥50%. The model that best predicted recovery of EF to >35% included EF at presentation, length of stay, prior MI, lateral wall motion abnormality at presentation, and peak troponin. The model that best predicted recovery of EF to ≥50% included EF at presentation, peak troponin, prior MI, and presentation with ventricular fibrillation or cardiac arrest. After predictors were transformed into point scores, the lowest point scores predicted a 9% and 4% probability of EF recovery to >35% and ≥50%, respectively, whereas profiles with the highest point scores predicted an 87% and 49% probability of EF recovery to >35% and ≥50%, respectively.
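
The point-score construction follows the usual recipe of scaling regression coefficients to integers and mapping the total back through the logistic function. The sketch below is a generic, hypothetical illustration; the coefficients are invented and are not the published PREDICTS model.

```python
# Generic point-score illustration (hypothetical coefficients, not PREDICTS).
from math import exp

intercept = -2.0
coefs = {"ef_at_presentation_per_5pct": 0.40,
         "prior_mi": -0.90,
         "lateral_wall_motion_abnl": -0.70,
         "long_length_of_stay": -0.60,
         "high_peak_troponin": -0.80}

unit = min(abs(c) for c in coefs.values())          # smallest effect = 1 point
points = {k: round(c / unit) for k, c in coefs.items()}

def predicted_probability(patient):
    """Map a patient's total points back to a probability on the logit scale."""
    score = sum(points[k] * patient[k] for k in points)
    return 1 / (1 + exp(-(intercept + unit * score)))

patient = {"ef_at_presentation_per_5pct": 6, "prior_mi": 0,
           "lateral_wall_motion_abnl": 0, "long_length_of_stay": 0,
           "high_peak_troponin": 0}
print(points, round(predicted_probability(patient), 2))
```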

Conclusions: In patients with severe systolic dysfunction following acute MI with an EF ≤35%, 57% had EF recovery to >35%. A model using clinical variables present at the time of MI can help predict EF recovery.
Source: http://dx.doi.org/10.1016/j.jacc.2015.12.042
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4854198
March 2016

Transatrial Pericardial Insufflation of Carbon Dioxide to Facilitate Percutaneous Pericardial Access for Ablation of Ventricular Tachycardia.

J Cardiovasc Electrophysiol 2016 05 14;27(5):615. Epub 2015 Dec 14.

Interventional Electrophysiology, Hartford Hospital, Hartford, Connecticut, USA.

Source: http://dx.doi.org/10.1111/jce.12861
May 2016

Single-Coil Implantable Cardioverter Defibrillator Leads Remained the Preferred Option.

Am J Cardiol 2015 Aug 5;116(3):490-1. Epub 2015 May 5.

Hartford, Connecticut.

Source: http://dx.doi.org/10.1016/j.amjcard.2015.04.037
August 2015

Low inappropriate shock rates in patients with single- and dual/triple-chamber implantable cardioverter-defibrillators using a novel suite of detection algorithms: PainFree SST trial primary results.

Heart Rhythm 2015 May 28;12(5):926-36. Epub 2015 Jan 28.

Royal Jubilee Hospital, Victoria, Canada.

Background: The benefits of implantable cardioverter-defibrillators (ICDs) have been well demonstrated in many clinical trials, and ICD shocks for ventricular tachyarrhythmias save lives. However, inappropriate and unnecessary shock delivery remains a significant clinical issue with considerable consequences for patients and the healthcare system.

Objective: The purpose of the PainFree SmartShock Technology (SST) study was to investigate whether new-generation ICDs with novel discrimination algorithms, combined with modern programming strategies, reduce inappropriate and unnecessary shocks.

Methods: This prospective, multicenter clinical trial enrolled 2790 patients with approved indication for ICD implantation (79% male, mean age 65 years; 69% primary prevention indication, 27% single-chamber ICD, 33% replacement or upgrade). Patients were followed for a minimum of 12 months, and mean follow-up was 22 months. The primary end-point of the study was the percentage of patients remaining free of inappropriate shocks at 1 year postimplant, analyzed separately for dual/triple-chamber ICDs (N = 2019) and single-chamber ICDs (N = 751).

Results: The inappropriate shock rate at 1 year was 1.5% for patients with dual/triple-chamber ICDs and 2.5% for patients with single-chamber devices. Two years postimplant, the inappropriate shock rate was 2.8% for patients with dual/triple-chamber ICDs and 3.7% for those with single-chamber ICDs. The most common cause of an inappropriate shock in both groups was atrial fibrillation or flutter.

Conclusion: In a large patient cohort receiving ICDs for primary or secondary prevention, the adoption of novel enhanced detection algorithms in conjunction with routine implementation of modern programming strategies led to a very low inappropriate shock rate.
Source: http://dx.doi.org/10.1016/j.hrthm.2015.01.017
May 2015

Predictors of pacemaker dependence and pacemaker dependence as a predictor of mortality in patients with implantable cardioverter defibrillator.

Pacing Clin Electrophysiol 2013 Aug 13;36(8):945-51. Epub 2013 May 13.

Division of Cardiology, Lahey Clinic, Burlington, Massachusetts, USA.

Background: The prevalence and predictors of pacemaker dependence (PD) in patients implanted with an implantable cardioverter defibrillator (ICD), and its effect on survival, are unknown.

Methods: This was a retrospective analysis of 1,550 consecutive patients who underwent ICD implantation at a single center from 1996 to 2008, with a mean follow-up of 4.2 ± 3.4 years. Patients with an intrinsic heart rate below 40 beats/min at implant (n = 48) and patients with cardiac resynchronization therapy (n = 444) were excluded, leaving 1,058 patients in this study. PD was defined as an intrinsic rhythm <40 beats/min after inhibiting the pacemaker, <50 beats/min with transient symptoms of dizziness relieved by resumption of pacing, or right ventricular pacing despite algorithms to promote intrinsic conduction, assessed at the 3-monthly follow-up ICD clinic visits. Multivariate regression and Cox proportional hazards models were used for analysis.
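
A minimal sketch of the adjusted-odds analysis, assuming statsmodels and illustrative variable names rather than the study's actual dataset:

```python
# Sketch: logistic regression for mortality with pacemaker dependence and
# candidate predictors as covariates; odds ratios from exponentiated
# coefficients. File and column names are assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("icd_cohort.csv")     # hypothetical single-center cohort
X = sm.add_constant(df[["pacemaker_dependent", "age", "afib_history",
                        "amiodarone", "secondary_prevention"]])
fit = sm.Logit(df["died"], X).fit(disp=0)
odds_ratios = np.exp(fit.params)
conf_int = np.exp(fit.conf_int())
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```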

Results: The mean age was 64 ± 13 years, 79% were male, and the ICD indication was primary prevention in 57%. PD occurred in 142 (13.4%) patients, with a mean time to PD of 2.6 ± 1.9 years. PD was associated with 48% increased odds of mortality versus non-PD ICD patients during the mean follow-up of 4.2 ± 3.4 years (adjusted odds ratio = 1.48 [95% confidence interval 1.080-2.042]; P = 0.015). Older age, a history of atrial fibrillation, amiodarone use, and a secondary prevention indication were the strongest predictors of the development of PD.

Conclusions: In this single-center ICD cohort, the development of PD was not uncommon and was associated with decreased survival.
Source: http://dx.doi.org/10.1111/pace.12164
August 2013

CRT-D Therapy in Patients with Decompensated NYHA Class-Four CHF.

Cardiol Res Pract 2012 30;2012:319205. Epub 2012 Jul 30.

Hartford Hospital, University of Connecticut, Hartford, CT 06102, USA.

Background: ACC-HRS guidelines for cardiac resynchronization therapy ICD implantation (CRT-D) do not include patients with advanced, nonambulatory NYHA class-four CHF because of an expectation of limited survival. There is little data available from the large multicenter randomized studies to support or refute this claim.

Purpose: We evaluated the outcomes of patients with advanced, nonambulatory NYHA class-four CHF who received CRT-D devices in an attempt to improve their clinical status and permit hospital discharge.

Methods: Sixteen of our six hundred and seventy CRT-D patients were classified as advanced, nonambulatory NYHA class-four, inotrope/vasodilator/diuretic-dependent patients. These patients were analyzed retrospectively for successful weaning to oral medications, hospital discharge, hemodynamic stability, and survival over eighteen months.

Results: Thirteen of the sixteen patients were discharged home within two weeks of implantation. Survival to hospital discharge and at six, twelve, and eighteen months was ninety-four, seventy-five, sixty-nine, and sixty-nine percent, respectively. The group showed significant improvements in systolic blood pressure, renal function, left ventricular ejection fraction, and CHF class.

Conclusion: CRT-D in advanced, nonambulatory NYHA class-four patients proved feasible and beneficial. These findings suggest that the strategy merits further study.
Source: http://dx.doi.org/10.1155/2012/319205
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3413993
August 2012

ICD implantation and evidence-based patient selection.

JAMA 2011 Apr;305(15):1538-9; author reply 1539-40

Source: http://dx.doi.org/10.1001/jama.2011.458
April 2011

Coronary sinus shocking lead as salvage in patients with advanced CHF and high defibrillation thresholds.

Pacing Clin Electrophysiol 2010 Aug 8;33(8):967-72. Epub 2010 Mar 8.

Division of Cardiology, Hartford Hospital, Hartford, Connecticut, USA.

Background: We report a series of three patients whose implantable cardioverter-defibrillator (ICD) implants were unsuccessful because acceptable defibrillation thresholds (DFTs) could not be achieved at the maximum available energy despite standard modification and enhancement procedures. All patients had advanced cardiomyopathy.

Methods: Use of the coronary sinus (CS) for left ventricular (LV) shocking electrode placement resulted in acceptable DFTs in each patient. The shocking coil was positioned posteriorly in all three patients and, in two patients, alongside an LV CS pacing lead. The best shocking configuration tested in each patient was LV (CS) + can (anode) to RV (cathode). The short- and long-term outcomes of these patients are presented and discussed.

Conclusion: This approach is suggested as a salvage option for those problematic patients who have unacceptable DFT results at implantation of an endovascular ICD system.
Source: http://dx.doi.org/10.1111/j.1540-8159.2010.02726.x
August 2010

"Tuned" defibrillation waveforms outperform 50/50% tilt defibrillation waveforms: a randomized multi-center study.

Pacing Clin Electrophysiol 2007 Jan;30 Suppl 1:S139-42

Rochester General Hospital, Rochester, New York, USA.

Introduction: Superior performance of a tuned waveform, whose pulse durations are based on an assumed cardiac membrane time constant of 3.5 ms, and of a 50/50% tilt waveform over a standard 65/65% tilt waveform has been documented previously. However, the tuned and 50/50% tilt waveforms have not been compared directly.

Methods: In 34 patients, defibrillation thresholds (DFTs) for the tuned and 50/50% tilt waveforms were measured in random order using the optimized binary search method. High-voltage lead impedance was measured and used to select the pulse widths for the tuned and 50/50% tilt defibrillation waveforms.
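
For readers unfamiliar with tilt-based programming, the relationship between tilt, impedance, and pulse width follows from simple capacitor-discharge physics. The sketch below illustrates that relationship only; the 140 µF output capacitance is an assumed typical value, not a figure taken from the study.

```python
# Physics sketch, not the device's actual programming: for a capacitive
# discharge, tilt = 1 - exp(-d / (R*C)), so the duration giving a chosen
# tilt is d = -R*C*ln(1 - tilt).
from math import log

def pulse_width_for_tilt(tilt, impedance_ohm, capacitance_f=140e-6):
    """Phase duration (ms) that discharges the capacitor to the given tilt."""
    return -impedance_ohm * capacitance_f * log(1 - tilt) * 1000

for r in (40, 50, 60):                       # typical high-voltage impedances
    print(r, "ohm ->", round(pulse_width_for_tilt(0.5, r), 1), "ms per phase")
# The "tuned" waveform instead derives its durations from the measured
# impedance and an assumed 3.5 ms cardiac membrane time constant.
```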

Results: Delivered energy (7.3 ± 4.6 J vs 8.7 ± 5.3 J, P = 0.01), stored energy (8.2 ± 5.1 J vs 9.7 ± 5.6 J, P = 0.01), and delivered voltage (405.9 ± 121.7 V vs 445.0 ± 122.6 V, P = 0.008) were significantly lower for the tuned than for the 50/50% tilt waveform. In four patients with DFT ≥15 J, the tuned waveform lowered the mean energy DFT by 2.8 J and mean voltage DFT by 45 V. For all patients, the mean peak delivered energy DFT was reduced from 29 J to 22 J (24% decrease). Multiple regression analysis showed that a left ventricular ejection fraction <20% is a significant predictor of this advantage.

Conclusion: Energy and voltage DFTs are lowered with an implantable cardioverter defibrillator that uses a tuned waveform compared to a standard 50% tilt biphasic waveform.
Source: http://dx.doi.org/10.1111/j.1540-8159.2007.00624.x
January 2007