Publications by authors named "James Freeman"

198 Publications

Does seeing it make a difference? The self-reported deterrent impact of random breath testing.

J Safety Res 2021 Feb 3;76:1-8. Epub 2020 Nov 3.

Road Safety Research Collaboration, University of the Sunshine Coast, Sippy Downs, 4556 Queensland, Australia.

Introduction: Random Breath Testing (RBT) remains a primary method to both deter and apprehend drink drivers, yet a large proportion of road fatalities continue to be caused by the offense. Outstanding questions remain regarding how much exposure to RBT operations is needed to influence deterrence-based perceptions and subsequent offending.

Method: Given this, licensed motorists (N = 961) in Queensland were recruited to complete a questionnaire either in the community (N = 741) or on the side of the road after just being breath tested (N = 243). Survey items measured different types of exposure to RBT operations (e.g., "seen" vs. "being tested") and subsequent perceptions of apprehension as well as self-reported drink driving behaviors.

Results: The key findings that emerged were: motorists were regularly exposed to RBT operations (both viewing and being tested); such exposure was not significantly correlated with perceptions of apprehension certainty; and a sizable proportion (approximately 25%) reported engaging in drink driving behaviors, although roadside participants naturally reported a lower percentage of offending behaviors. Importantly, current levels of "observing" RBT operations were reported as sufficient, but not current levels of active testing (which would need to be doubled). Nevertheless, higher levels of exposure to RBT operations were found to be predictive of a lack of intention to drink and drive again in the future.

Conclusions: This paper suggests that mere exposure to enforcement may not create the intended rule compliance, and that the frequency of exposure to roadside operations is also essential.
Source
http://dx.doi.org/10.1016/j.jsr.2020.09.013
February 2021

Relation of Cardiovascular Risk Factors to Mortality and Cardiovascular Events in Hospitalized Patients With Coronavirus Disease 2019 (from the Yale COVID-19 Cardiovascular Registry).

Am J Cardiol 2021 Feb 1. Epub 2021 Feb 1.

Department of Cardiology, Yale New Haven Hospital, Yale University School of Medicine, New Haven, Connecticut.

Individuals with established cardiovascular disease or a high burden of cardiovascular risk factors may be particularly vulnerable to developing complications from coronavirus disease 2019 (COVID-19). We conducted a prospective cohort study at a tertiary care center to identify risk factors for in-hospital mortality and major adverse cardiovascular events (MACE; a composite of myocardial infarction, stroke, new acute decompensated heart failure, venous thromboembolism, ventricular or atrial arrhythmia, pericardial effusion, or aborted cardiac arrest) among consecutively hospitalized adults with COVID-19, using multivariable binary logistic regression analysis. The study population comprised 586 COVID-19 positive patients. Median age was 67 (IQR: 55 to 80) years, 47.4% were female, and 36.7% had cardiovascular disease. Considering risk factors, 60.2% had hypertension, 39.8% diabetes, and 38.6% hyperlipidemia. Eighty-two individuals (14.0%) died in-hospital, and 135 (23.0%) experienced MACE. In a model adjusted for demographic characteristics, clinical presentation, and laboratory findings, age (odds ratio [OR], 1.28 per 5 years; 95% confidence interval [CI], 1.13 to 1.45), previous ventricular arrhythmia (OR, 18.97; 95% CI, 3.68 to 97.88), use of P2Y12 inhibitors (OR, 7.91; 95% CI, 1.64 to 38.17), higher C-reactive protein (OR, 1.81; 95% CI, 1.18 to 2.78), lower albumin (OR, 0.64; 95% CI, 0.47 to 0.86), and higher troponin T (OR, 1.84; 95% CI, 1.39 to 2.46) were associated with mortality (p <0.05). After adjustment for demographics, presentation, and laboratory findings, predictors of MACE were higher respiratory rates, altered mental status, and laboratory abnormalities, including higher troponin T (p <0.05). In conclusion, poor prognostic markers among hospitalized patients with COVID-19 included older age, pre-existing cardiovascular disease, respiratory failure, altered mental status, and higher troponin T concentrations.
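
For readers interpreting the age effect, the reported odds ratio of 1.28 per 5 years of age can be rescaled to other age differences on the log-odds scale. A minimal Python sketch (illustrative only, not code from the study):

    import math

    # The study reports OR 1.28 per 5-year increase in age. On the log-odds
    # scale this effect is additive, so it can be rescaled to any age gap.
    OR_PER_5_YEARS = 1.28

    def odds_ratio_for_age_gap(years: float) -> float:
        """Odds ratio implied for an arbitrary age difference, in years."""
        return math.exp(math.log(OR_PER_5_YEARS) * (years / 5))

    print(odds_ratio_for_age_gap(5))   # 1.28 by construction
    print(odds_ratio_for_age_gap(20))  # ~2.68, i.e. 1.28**4 for a 20-year gap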
Source
http://dx.doi.org/10.1016/j.amjcard.2021.01.029
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7849530
February 2021

Are perceptions of penalties stable across time? The problem of causal ordering in deterrence applied to road safety.

Accid Anal Prev 2020 Oct 9;146:105746. Epub 2020 Sep 9.

Road Safety Research Collaboration, University of the Sunshine Coast, 90 Sippy Downs Dr, Sippy Downs, Queensland, 4556, Australia.

This study addressed the causal ordering problem in deterrence research by examining the perceptual stability of deterrence variables over time and comparing the results via cross-sectional and longitudinal surveys. This research extends upon scant previous research by including three key classical deterrence variables (i.e., the perceived certainty of apprehension and the perceived severity and swiftness of punishment), as well as Homel's (1988) extra-legal deterrence-related variables of the fear of physical loss, material loss and social sanctions. A longitudinal survey design was utilised over a three-month time period (N = 200, mean age = 20.38 years, 71 males) that examined the stability of the deterrence-related variables for three road rule violations, consisting of 1) exceeding the speed limit by more than 10 km/h, 2) reading a message on a phone while driving and 3) using the social media platform of Snapchat while driving. Overall, fluctuations were found in all the deterrence-related variables (both legal and extra-legal sanctions), with the largest difference being for the perceptions of the certainty of apprehension. Consequently, it can be suggested that: (a) longitudinal surveys are more reliable when measuring the impact of deterrence perceptions on engagement in offending behaviour and (b) the problem of causal ordering regarding utilising cross-sectional surveys is further illuminated. The results suggest that deterrence is a dynamic process, constantly changing based upon individual experiences, which reinforces the need for continued enforcement efforts (both legal and non-legal) within the road safety arena to maximise rule compliance.
Source
http://dx.doi.org/10.1016/j.aap.2020.105746
October 2020

Risk of COVID-19 infection after cardiac electrophysiology procedures.

Heart Rhythm O2 2020 Oct 28;1(4):239-242. Epub 2020 Aug 28.

Yale School of Medicine, New Haven, Connecticut.

Background: During the COVID-19 pandemic, attempts to conserve resources and limit virus spread have resulted in delay of nonemergent procedures across all medical specialties, including cardiac electrophysiology (EP). Many patients have delayed care and continue to express concerns about potential nosocomial spread of coronavirus.

Objective: To quantify risk of development of COVID-19 owing to in-hospital transmission related to an EP procedure, in the setting of preventive measures instituted in our laboratory areas.

Methods: We contacted by telephone patients who underwent emergent procedures in the electrophysiology lab during the COVID-19 surge at our hospital (March 16, 2020, to May 15, 2020, when the daily census reached 450 COVID-19 patients) ≥2 weeks after the procedure, to assess for symptoms of and/or testing for COVID-19, and assessed outcomes from medical record review.

Results: Of the 124 patients undergoing EP procedures in this period, none had developed documented or suspected coronavirus infection. Seven patients described symptoms of chest pain, dyspnea, or fever; 3 were tested for coronavirus and found to be negative. Of the remaining 4, 2 had a more plausible alternative explanation for the symptoms, and 2 had transient symptoms not meeting published criteria for probable COVID-19 infection.

Conclusion: Despite a high hospital census of COVID-19 patients during the period of hospital stay for an EP procedure, there were no likely COVID-19 infections occurring in follow-up of at least 2 weeks. With proper use of preventive measures as recommended by published guidelines, the risk of nosocomial spread of COVID-19 to patients in the EP lab is low.
Source
http://dx.doi.org/10.1016/j.hroo.2020.08.006
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7455246
October 2020

Survival Following Implantable Cardioverter-Defibrillator Implantation in Patients With Amyloid Cardiomyopathy.

J Am Heart Assoc 2020 09 1;9(18):e016038. Epub 2020 Sep 1.

Section of Cardiovascular Medicine Department of Internal Medicine Yale University School of Medicine New Haven CT.

Background Outcomes data in patients with cardiac amyloidosis after implantable cardioverter-defibrillator (ICD) implantation are limited. We compared outcomes of patients with ICDs implanted for cardiac amyloidosis versus nonischemic cardiomyopathies (NICMs) and evaluated factors associated with mortality among patients with cardiac amyloidosis. Methods and Results Using National Cardiovascular Data Registry's ICD Registry data between April 1, 2010 and December 31, 2015, we created a 1:5 propensity-matched cohort of patients implanted with ICDs with cardiac amyloidosis and NICM. We compared mortality between those with cardiac amyloidosis and matched patients with NICM using Kaplan-Meier survival curves and Cox proportional hazards models. We also evaluated risk factors associated with 1-year mortality in patients with cardiac amyloidosis using multivariable Cox proportional hazards regression models. Among 472 patients with cardiac amyloidosis and 2360 patients with propensity-matched NICMs, 1-year mortality was significantly higher in patients with cardiac amyloidosis compared with patients with NICMs (26.9% versus 11.3%, P<0.001). After adjustment for covariates, cardiac amyloidosis was associated with a significantly higher risk of all-cause mortality (hazard ratio [HR], 1.80; 95% CI, 1.56-2.08). In a multivariable analysis of patients with cardiac amyloidosis, several factors were significantly associated with mortality: syncope (HR, 1.78; 95% CI, 1.22-2.59), ventricular tachycardia (HR, 1.65; 95% CI, 1.15-2.38), cerebrovascular disease (HR, 2.03; 95% CI, 1.28-3.23), diabetes mellitus (HR, 1.55; 95% CI, 1.05-2.27), creatinine 1.6 to 2.5 mg/dL (HR, 1.99; 95% CI, 1.32-3.02), and creatinine >2.5 mg/dL (HR, 4.34; 95% CI, 2.72-6.93). Conclusions Mortality after ICD implantation is significantly higher in patients with cardiac amyloidosis than in patients with propensity-matched NICMs. Factors associated with death among patients with cardiac amyloidosis include prior syncope, ventricular tachycardia, cerebrovascular disease, diabetes mellitus, and impaired renal function.
Source
http://dx.doi.org/10.1161/JAHA.120.016038
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7726970
September 2020

Comparison of Mortality and Readmission in Non-Ischemic Versus Ischemic Cardiomyopathy After Implantable Cardioverter-Defibrillator Implantation.

Am J Cardiol 2020 10 24;133:116-125. Epub 2020 Jul 24.

Section of Cardiovascular Medicine, Department of Internal Medicine, Yale University School of Medicine, New Haven, Connecticut; Center for Outcomes Research and Evaluation, Yale New Haven Health Services Corporation, New Haven, Connecticut.

Data are lacking on the contemporary risk of death and readmission following implantable cardioverter-defibrillator (ICD) implantation in patients with non-ischemic cardiomyopathies (NICM) compared with ischemic cardiomyopathies (ICM) in a large nationally representative cohort. We performed a retrospective cohort study using the National Cardiovascular Data Registry ICD Registry linked with Medicare claims from April 1, 2010 to December 31, 2013. We established a cohort of NICM and ICM patients with a left ventricular ejection fraction ≤35% who received a de novo, primary prevention ICD. We compared mortality and readmission using Kaplan-Meier curves and Cox proportional hazards regression models. We also evaluated temporal trends in mortality. In 31,044 NICM and 68,458 ICM patients with a median follow-up of 2.4 years, 1-year mortality was significantly higher in ICM patients (12.3%) compared with NICM (7.9%, p < 0.001). The higher mortality in ICM patients remained significant after adjustment for covariates (hazard ratio [HR] 1.40; 95% confidence interval [CI] 1.36 to 1.45), and was consistent in subgroup analyses. These findings were consistent across the duration of the study. ICM patients were also significantly more likely to be readmitted for all causes (adjusted HR 1.15, CI 1.12 to 1.18) and for heart failure (adjusted HR 1.25, CI 1.21 to 1.31). In conclusion, the risks of mortality and hospital readmission after primary prevention ICD implantation were significantly higher in patients with ICM compared with NICM, which was consistent across all patient subgroups tested and over the duration of the study.
Source
http://dx.doi.org/10.1016/j.amjcard.2020.07.035
October 2020

Erratum for Kweon et al., "Substrate Specificity and Structural Characteristics of the Novel Rieske Nonheme Iron Aromatic Ring-Hydroxylating Oxygenases NidAB and NidA3B3 from Mycobacterium vanbaalenii PYR-1".

mBio 2020 08 25;11(4). Epub 2020 Aug 25.

Division of Microbiology, National Center for Toxicological Research, U.S. Food and Drug Administration, Jefferson, Arkansas, USA

Source
http://dx.doi.org/10.1128/mBio.01872-20
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7448282
August 2020

Effect of Temporary Interruption of Warfarin Due to an Intervention on Downstream Time in Therapeutic Range in Patients With Atrial Fibrillation (from ORBIT AF).

Am J Cardiol 2020 10 12;132:66-71. Epub 2020 Jul 12.

Department of Cardiovascular Medicine, Mayo Clinic, 200 1st St SW, Rochester, Minnesota 55905.

The aim of this study was to quantify time in therapeutic range (TTR) before and after a temporary interruption of warfarin due to an intervention in the Outcomes Registry for Better Informed Treatment of Atrial Fibrillation (AF). AF patients on warfarin who had a temporary interruption followed by resumption were identified. A nonparametric method for estimating survival functions for interval-censored data was used to examine the first therapeutic International Normalized Ratio (INR) after interruption. TTR was compared using the Wilcoxon signed rank test. A Cox proportional hazards model was used to investigate the association between TTR in the first 3 months after interruption and subsequent outcomes at 3 to 9 months. Of 9,749 AF patients, 71% were on warfarin. Over a median (IQR) follow-up of 2.6 (1.8 to 3.1) years, 33% of patients had a total of 3,022 temporary interruptions. The first therapeutic INR was recorded within 1 week in 35.0% (95% confidence interval 32.6% to 37.4%), 2 weeks in 54.6% (52.2% to 57.0%), 30 days in 70.0% (67.9% to 72.1%) and 90 days in 91.3% (90.0% to 92.5%) of patients. Compared with pre-interruption, TTR 3 months after interruption was significantly lower (61.1% [36.6% to 85.0%] vs 67.6% [50.0% to 81.3%], p <0.0001). A 10-unit increment in the TTR in the first 3 months after interruption was associated with a lower risk of major bleeding [hazard ratio 0.91 (0.85 to 0.97), p = 0.005]. This association was noted in patients who received bridging anticoagulation, but not in those who did not. In conclusion, temporary interruption of warfarin is common, and nearly half of these patients had subtherapeutic INR after 2 weeks. Lower TTR in the first 3 months after interruption was associated with a higher incidence of major bleeding in patients who received bridging anticoagulation.
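
The abstract does not state how TTR was computed; a common approach is Rosendaal-style linear interpolation between successive INR measurements. The Python sketch below is an assumption for illustration (day-level approximation with hypothetical values), not the registry's method:

    # Approximate Rosendaal-style time in therapeutic range (TTR):
    # linearly interpolate INR between measurement days and count the
    # fraction of days whose interpolated INR lies in [low, high].
    def ttr(days, inrs, low=2.0, high=3.0):
        in_range = 0
        total = 0
        for (d0, i0), (d1, i1) in zip(zip(days, inrs), zip(days[1:], inrs[1:])):
            span = d1 - d0
            for step in range(span):
                inr = i0 + (i1 - i0) * step / span
                if low <= inr <= high:
                    in_range += 1
            total += span
        return in_range / total if total else float("nan")

    # Hypothetical INR series: 15 of 20 interpolated days in range -> TTR 0.75
    print(ttr(days=[0, 10, 20], inrs=[1.8, 2.5, 3.2]))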
Source
http://dx.doi.org/10.1016/j.amjcard.2020.07.006
October 2020

The impact of sofosbuvir/daclatasvir or ribavirin in patients with severe COVID-19.

J Antimicrob Chemother 2020 11;75(11):3366-3372

Abadan Faculty of Medical Sciences, Abadan, Iran.

Objectives: Sofosbuvir and daclatasvir are direct-acting antivirals highly effective against hepatitis C virus. There is some in silico and in vitro evidence that suggests these agents may also be effective against SARS-CoV-2. This trial evaluated the effectiveness of sofosbuvir in combination with daclatasvir in treating patients with COVID-19.

Methods: Patients with a positive nasopharyngeal swab for SARS-CoV-2 on RT-PCR or bilateral multi-lobar ground-glass opacity on their chest CT and signs of severe COVID-19 were included. Subjects were divided into two arms with one arm receiving ribavirin and the other receiving sofosbuvir/daclatasvir. All participants also received the recommended national standard treatment which, at that time, was lopinavir/ritonavir and single-dose hydroxychloroquine. The primary endpoint was time from starting the medication until discharge from hospital with secondary endpoints of duration of ICU stay and mortality.

Results: Sixty-two subjects met the inclusion criteria, with 35 enrolled in the sofosbuvir/daclatasvir arm and 27 in the ribavirin arm. The median duration of stay was 5 days for the sofosbuvir/daclatasvir group and 9 days for the ribavirin group. The mortality in the sofosbuvir/daclatasvir group was 2/35 (6%) and 9/27 (33%) for the ribavirin group. The relative risk of death for patients treated with sofosbuvir/daclatasvir was 0.17 (95% CI 0.04-0.73, P = 0.02) and the number needed to treat for benefit was 3.6 (95% CI 2.1-12.1, P < 0.01).
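
As a back-of-the-envelope check, the reported relative risk and number needed to treat follow directly from the raw counts above; a minimal Python sketch (not study code):

    # Reproducing the reported effect sizes from the raw counts:
    # 2/35 deaths with sofosbuvir/daclatasvir vs 9/27 with ribavirin.
    deaths_sof, n_sof = 2, 35
    deaths_rib, n_rib = 9, 27

    risk_sof = deaths_sof / n_sof          # ~0.057
    risk_rib = deaths_rib / n_rib          # ~0.333

    relative_risk = risk_sof / risk_rib    # ~0.17, matching the reported RR
    nnt = 1 / (risk_rib - risk_sof)        # ~3.6, matching the reported NNT

    print(f"RR = {relative_risk:.2f}, NNT = {nnt:.1f}")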

Conclusions: Given these encouraging initial results, and the current lack of treatments proven to decrease mortality in COVID-19, further investigation in larger-scale trials seems warranted.
Source
http://dx.doi.org/10.1093/jac/dkaa331
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7529105
November 2020

Effectiveness of vehicle impoundment for high-range speeding offences in Victoria, Australia.

Accid Anal Prev 2020 Sep 22;145:105690. Epub 2020 Jul 22.

University of the Sunshine Coast Road Safety Research Collaboration, Australia.

Speeding behaviour has been shown to account for a large number of deaths and serious injuries on Australian roads. Vehicle impoundment is one countermeasure which has been implemented to discourage drivers from engaging in high-range speeding. Despite this countermeasure being used as a sanction in all Australian jurisdictions to combat high-range speeding offences, limited research has examined the effectiveness of vehicle impoundments in Australia. The purpose of this research was to examine the effectiveness of vehicle impoundment for high-range speeding offences on subsequent offence and crash rates. Data were collected from drivers with an eligible excessive speeding offence in Victoria, Australia between 1 July 2006 and 31 December 2014. During this time, there were 17,440 impoundment eligible offences, 6,883 (41.8 %) of which resulted in vehicle impoundment. The analysis revealed that drivers who had a vehicle impounded were more likely to be male, younger, hold a probationary licence, and to have a court offence. In terms of the effectiveness of vehicle impoundment, among high-range offenders, re-offence rates for those who had their vehicle impounded were statistically significantly lower for all licence periods compared with offenders who did not have their vehicle impounded. There was evidence of an effect of impoundment on reducing speeding re-offence rates during the impoundment period as well as some evidence that the impact of licence suspension was greater for those who experienced impoundment. Given that vehicle impoundment is a sanction which aims to discourage and/or incapacitate drivers from engaging in on-road risk taking behaviour, in this case high-range speeding behaviour, the longer-term positive effects of this sanction may assist with the on-going effort to reduce on-road risk taking behaviours.
Source
http://dx.doi.org/10.1016/j.aap.2020.105690
September 2020

Patterns of oral anticoagulation use with cardioversion in clinical practice.

Heart 2020 Jun 26. Epub 2020 Jun 26.

Duke Clinical Research Institute, Durham, North Carolina, USA.

Background: Cardioversion is common among patients with atrial fibrillation (AF). We hypothesised that novel oral anticoagulants (NOAC) used in clinical practice resulted in similar rates of stroke compared with vitamin K antagonists (VKA) for cardioversion.

Methods: Using the Outcomes Registry for Better Informed Treatment of Atrial Fibrillation II, patients with AF who had a cardioversion, follow-up data and an AF diagnosis within 6 months of enrolment were identified retrospectively. Clinical outcomes were compared for patients receiving a NOAC or VKA for 1 year following cardioversion.

Results: Among 13 004 patients with AF, 2260 (17%) underwent cardioversion. 1613 met the inclusion criteria for this analysis. At the time of cardioversion, 283 (17.5%) were receiving a VKA and 1330 (82.5%) a NOAC. A transoesophageal echocardiogram (TOE) was performed in 403 (25%) cardioversions. The incidence of stroke/transient ischaemic attack (TIA) at 30 days was the same for patients having (3.04 per 100 patient-years) or not having (3.04 per 100 patient-years) a TOE (p=0.99). There were no differences in the incidence of death (HR 1.19, 95% CI 0.62 to 2.28, p=0.61), cardiovascular hospitalisation (HR 1.02, 95% CI 0.76 to 1.35, p=0.91), stroke/TIA (HR 1.18, 95% CI 0.30 to 4.74, p=0.81) or bleeding-related hospitalisation (HR 1.29, 95% CI 0.66 to 2.52, p=0.45) at 1 year for patients treated with either a NOAC or VKA.
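
For readers unfamiliar with the unit, a rate "per 100 patient-years" is simply the event count divided by total follow-up time, scaled by 100. A minimal Python sketch with hypothetical numbers (not registry data):

    # Hypothetical example of an event rate per 100 patient-years:
    # sum follow-up across patients, then scale the event count.
    events = 4
    follow_up_years = [0.3, 1.0, 0.5, 2.2, 1.5]   # per-patient follow-up (hypothetical)
    patient_years = sum(follow_up_years)           # 5.5 patient-years in total
    rate = 100 * events / patient_years
    print(f"{rate:.1f} events per 100 patient-years")   # 72.7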

Conclusions: Cardioversion was a low-risk procedure for patients treated with NOAC, with statistically similar rates of stroke/TIA 30 days after cardioversion compared with patients treated with VKA. There were no statistically significant differences in death, stroke/TIA or major bleeding at 1 year among patients treated with NOAC compared with VKA after cardioversion.
Source
http://dx.doi.org/10.1136/heartjnl-2019-316315
June 2020

Characterization of the Inflammatory Response to Severe COVID-19 Illness.

Am J Respir Crit Care Med 2020 09;202(6):812-821

Department of Medicine.

Coronavirus disease (COVID-19) is a global threat to health. Its inflammatory characteristics are incompletely understood. To define the cytokine profile of COVID-19 and to identify evidence of immunometabolic alterations in those with severe illness. Levels of IL-1β, IL-6, IL-8, IL-10, and sTNFR1 (soluble tumor necrosis factor receptor 1) were assessed in plasma from healthy volunteers, hospitalized but stable patients with COVID-19 (stable COVID-19 patients), patients with COVID-19 requiring ICU admission (ICU COVID-19 patients), and patients with severe community-acquired pneumonia requiring ICU support (CAP patients). Immunometabolic markers were measured in circulating neutrophils from patients with severe COVID-19. The acute phase response of AAT (alpha-1 antitrypsin) to COVID-19 was also evaluated. IL-1β, IL-6, IL-8, and sTNFR1 were all increased in patients with COVID-19. ICU COVID-19 patients could be clearly differentiated from stable COVID-19 patients, and demonstrated higher levels of IL-1β, IL-6, and sTNFR1 but lower IL-10 than CAP patients. COVID-19 neutrophils displayed altered immunometabolism, with increased cytosolic PKM2 (pyruvate kinase M2), phosphorylated PKM2, HIF-1α (hypoxia-inducible factor-1α), and lactate. The production and sialylation of AAT increased in COVID-19, but this antiinflammatory response was overwhelmed in severe illness, with the IL-6:AAT ratio markedly higher in patients requiring ICU admission (P < 0.0001). In critically unwell patients with COVID-19, increases in IL-6:AAT predicted prolonged ICU stay and mortality, whereas improvement in IL-6:AAT was associated with clinical resolution (P < 0.0001). The COVID-19 cytokinemia is distinct from that of other types of pneumonia, leading to organ failure and ICU need. Neutrophils undergo immunometabolic reprogramming in severe COVID-19 illness. Cytokine ratios may predict outcomes in this population.
Source
http://dx.doi.org/10.1164/rccm.202005-1583OC
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7491404
September 2020

Predictors of Cardiac Perforation With Catheter Ablation of Atrial Fibrillation.

JACC Clin Electrophysiol 2020 06 25;6(6):636-645. Epub 2020 Mar 25.

Duke Clinical Research Institute, Durham, North Carolina, USA; Electrophysiology Section, Duke University Hospital, Durham, North Carolina, USA.

Objectives: This study identified factors associated with risk for cardiac perforation in the setting of atrial fibrillation (AF) ablation in contemporary clinical practice.

Background: Cardiac perforation is an uncommon but potentially fatal complication of AF ablation. An improved understanding of factors associated with cardiac perforation could facilitate improvements in procedural safety.

Methods: Logistic regression models were used to assess predictors of cardiac perforation among Medicare beneficiaries who underwent AF ablation between July 1, 2013, and December 31, 2017. Cardiac perforation was defined as a diagnosis of hemopericardium, cardiac tamponade, or pericardiocentesis within 30 days of AF ablation.

Results: Of 102,398 patients who underwent AF ablation, 0.61% (n = 623) experienced cardiac perforation as a procedural complication. Rates of cardiac perforation decreased over time. In adjusted analyses of the overall population, female sex (odds ratio [OR]: 1.34; 95% confidence interval [CI]: 1.14 to 1.58; p = 0.0004), obesity (OR: 1.35; 95% CI: 1.09 to 1.68; p = 0.0050), and absence of intracardiac echocardiography (ICE) (OR: 4.85; 95% CI: 4.11 to 5.71; p < 0.0001) were associated with increased risk for cardiac perforation, whereas previous cardiac surgery (OR: 0.14; 95% CI: 0.07 to 0.26; p < 0.0001) was associated with a lower risk for perforation. Patient risk factors for cardiac perforation were identical in the subset of patients in whom ICE was used (n = 76,134). A risk score was generated with the following point assignments: female sex (1 point); obesity (1 point); nonuse of ICE (5 points); and previous cardiac surgery (-6 points).
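
The point assignments above can be expressed as a small scoring function. The Python sketch below illustrates the published weights only; the abstract does not map scores to absolute perforation risk:

    # Sketch of the point-based perforation risk score described above.
    # Weights come from the abstract; the score-to-risk mapping is not given.
    def perforation_risk_score(female: bool, obese: bool,
                               ice_used: bool, prior_cardiac_surgery: bool) -> int:
        score = 0
        if female:
            score += 1
        if obese:
            score += 1
        if not ice_used:                 # nonuse of ICE carries the largest weight
            score += 5
        if prior_cardiac_surgery:        # prior surgery is protective
            score -= 6
        return score

    # Example: obese female, no ICE, no prior cardiac surgery -> 1 + 1 + 5 = 7
    print(perforation_risk_score(female=True, obese=True,
                                 ice_used=False, prior_cardiac_surgery=False))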

Conclusions: Cardiac perforation is a rare complication of AF ablation, and its incidence has decreased over time. One of the strongest predictors of cardiac perforation in the contemporary era is a modifiable factor: nonuse of intraprocedural ICE.
Source
http://dx.doi.org/10.1016/j.jacep.2020.01.011
June 2020

Alcohol and illicit substances associated with fatal crashes in Queensland: An examination of the 2011 to 2015 Coroner's findings.

Forensic Sci Int 2020 Jul 12;312:110190. Epub 2020 Feb 12.

Road Safety Research Collaboration, University of the Sunshine Coast, Sippy Downs, Queensland, 4556 QLD, Australia.

Problem: The problem of impaired driving is well documented in the literature but is heavily dependent upon self-report studies and/or databases that do not include in-depth information about the contributing origins of fatalities.

Aim: This study aimed to conduct an in-depth analysis of Coroner's findings for all fatally injured drivers in the state of Queensland in order to explore the prevalence of alcohol and different types of illicit substances (including drug combinations) in fatal crash reports.

Method: A total of 701 Coroner's reports related to drivers or controllers of vehicles involved in traffic-related fatalities for the period of 2011-2015 were analysed, revealing that 306 controllers (43.6%) were detected with either alcohol or illegal drugs (e.g., methylamphetamine, Δ9-tetrahydrocannabinol, cocaine or MDMA).

Results: Alcohol was the most commonly detected substance, identified in 223 cases (72.9% of the drug and alcohol sample). Illicit drug detections totalled 147 cases (48% of the drug and alcohol sample), with Δ9-tetrahydrocannabinol the most commonly detected illicit substance (109 cases; 35.6% of the drug and alcohol sample) followed by methylamphetamine (total of 63 cases; 20.6% of the drug and alcohol sample). An important theme to emerge was the prevalence of polysubstance use among fatally injured drivers, not just alcohol and one drug type, but also multiple drug combinations. Fatality trends revealed a decrease in both non-substance and alcohol-related fatalities across the study period. However, road fatalities where an illicit substance was detected increased by approximately 57%. Males were overrepresented as a proportion of total fatalities (82.4%) and there were no significant sex or age differences regarding illicit substance-related deaths. Drivers of passenger vehicles were most commonly identified in the data (66.2%), but motorcycle operators were disproportionately represented (28.1% of the total controller sample compared to 4% of vehicle registrations in Queensland).

Conclusion: This case study analysis of fatal crashes not only confirms the ongoing problem of alcohol and driving, but also illuminates the emerging (and escalating) issue of illicit substances detected in fatally injured drivers.
Source
http://dx.doi.org/10.1016/j.forsciint.2020.110190
July 2020

Outcomes of Cardiac Catheterization in Patients With Atrial Fibrillation on Anticoagulation in Contemporary Practice: An Analysis of the ORBIT II Registry.

Circ Cardiovasc Interv 2020 05 15;13(5):e008274. Epub 2020 May 15.

Duke Clinical Research Institute, Durham, NC (J.P.P., D.N.H., R.B., E.D.P., S.V.R.).

Background: Patients with atrial fibrillation on oral anticoagulation (OAC) undergoing cardiac catheterization face risks for embolic and bleeding events, yet information on strategies to mitigate these risks in contemporary practice is lacking.

Methods: We aimed to describe the clinical/procedural characteristics of a contemporary cohort of patients with atrial fibrillation on OAC who underwent cardiac catheterization. Use of bleeding avoidance strategies and bridging therapy were described and outcomes including death, stroke, and major bleeding at 30 days and 1 year were compared by OAC type.

Results: Of 13 404 patients in the Outcomes Registry for Better Informed Treatment of Atrial Fibrillation II Registry from 2013 to 2016, 741 underwent cardiac catheterization (139 with percutaneous coronary intervention) in the setting of OAC. The patients' median age was 71; 61.8% were male and 87.2% were white, with hypertension (83.7%), hyperlipidemia (72.1%), diabetes mellitus (31.6%), and chronic kidney disease (28.2%); 20.2% received warfarin while 79.8% received a direct acting oral anticoagulant. One third of patients underwent radial artery access, and bivalirudin was used in 4.6%. Bridging therapy was used more often in patients on warfarin versus direct acting oral anticoagulant (16.7% versus 10.0%). OAC was interrupted in 93.8% of patients. Patients on warfarin versus direct acting oral anticoagulant were equally likely to restart OAC (58.0% versus 60.7%), had similar use of antiplatelet therapy (44.0% versus 41.3%) after catheterization, and had similar rates of myocardial infarction and death at 1 year, but higher rates of major bleeding (43.3 versus 12.9 events/100 patient years) and stroke (4.9 versus 1.9 events/100 patient years).

Conclusions: In a real-world registry of patients with atrial fibrillation undergoing cardiac catheterization, most cases are elective, performed by femoral access, with interruption of OAC. Bleeding avoidance strategies such as radial artery access and bivalirudin were used infrequently and use of bridging therapy was uncommon. Nearly 40% of patients did not restart OAC postprocedure, exposing patients to risk for stroke. Further research is necessary to optimize the management of patients with atrial fibrillation undergoing cardiac catheterization.
Source
http://dx.doi.org/10.1161/CIRCINTERVENTIONS.119.008274
May 2020

Shared decision-making in atrial fibrillation: patient-reported involvement in treatment decisions.

Eur Heart J Qual Care Clin Outcomes 2020 10;6(4):263-272

Department of Cardiology, Duke Clinical Research Institute, Durham, NC 27701, USA.

Aims: To determine the extent of shared decision-making (SDM), during selection of oral anticoagulant (OAC) and rhythm control treatments, in patients with newly diagnosed atrial fibrillation (AF).

Methods And Results: We evaluated survey data from 1006 patients with new-onset AF enrolled at 56 US sites participating in the SATELLITE substudy of the Outcomes Registry for Better Informed Treatment of Atrial Fibrillation (ORBIT II). Patients completed surveys at enrolment and at 6-month follow-up. Patients were asked who made their AF treatment decisions. Shared decision-making was classified as present when the patient reported the decision as either their own autonomous decision or a decision shared with their healthcare provider (HCP). Approximately half of patients reported that their OAC treatment decisions were made entirely by their HCP. Compared with those reporting no SDM, patients reporting SDM for OAC were more often female (47.2% vs. 38.4%), while patients reporting SDM for rhythm control were more often male (62.2% vs. 57.6%). The most important factors cited by patients during decision-making for OAC were reducing stroke and bleeding risk, and their HCP's recommendations. After adjustment, patients with self-reported understanding of OAC and rhythm control options had higher odds of having participated in SDM [odds ratio (OR) 2.54, confidence interval (CI): 1.75-3.68 and OR 2.36, CI: 1.50-3.71, respectively; both P ≤ 0.001].

Conclusion: Shared decision-making is not widely implemented in contemporary AF practice. Patient understanding of the available therapeutic options is associated with a more than two-fold higher likelihood of SDM, and may be a potential target for future interventions.
Source
http://dx.doi.org/10.1093/ehjqcco/qcaa040
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7778449
October 2020

Enhanced electrocardiographic monitoring of patients with Coronavirus Disease 2019.

Heart Rhythm 2020 09 6;17(9):1417-1422. Epub 2020 May 6.

Section of Cardiovascular Medicine, Yale University School of Medicine, New Haven, Connecticut.

Background: Many of the drugs being used in the treatment of the ongoing pandemic coronavirus disease 2019 (COVID-19) are associated with QT prolongation. Expert guidance supports electrocardiographic (ECG) monitoring to optimize patient safety.

Objective: The purpose of this study was to establish an enhanced process for ECG monitoring of patients being treated for COVID-19.

Methods: We created a Situation Background Assessment Recommendation tool identifying the indication for ECGs in patients with COVID-19 and tagged these ECGs to ensure prompt overreading and identification of those with QT prolongation (corrected QT interval > 470 ms for QRS duration ≤ 120 ms; corrected QT interval > 500 ms for QRS duration > 120 ms). This triggered a phone call from the electrophysiology service to the primary team to provide management guidance and a formal consultation if requested.
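
The QT-prolongation criteria in the Methods reduce to a simple threshold check on the corrected QT interval; a minimal Python sketch (variable names are illustrative, not from the study's tooling):

    # Flag QT prolongation using the thresholds stated above:
    # QTc > 470 ms when QRS <= 120 ms, QTc > 500 ms when QRS > 120 ms.
    def qt_prolonged(qtc_ms: float, qrs_ms: float) -> bool:
        threshold = 470 if qrs_ms <= 120 else 500
        return qtc_ms > threshold

    print(qt_prolonged(qtc_ms=480, qrs_ms=100))  # True: narrow QRS, 480 > 470
    print(qt_prolonged(qtc_ms=480, qrs_ms=140))  # False: wide QRS, threshold is 500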

Results: During a 2-week period, we reviewed 2006 ECGs, corresponding to 524 unique patients, of whom 103 (19.7%) met the Situation Background Assessment Recommendation tool-defined criteria for QT prolongation. Compared with those without QT prolongation, these patients were more often in the intensive care unit (60 [58.3%] vs 149 [35.4%]) and more likely to be intubated (32 [31.1%] vs 76 [18.1%]). Fifty patients with QT prolongation (48.5%) had electrolyte abnormalities, 98 (95.1%) were on COVID-19-related QT-prolonging medications, and 62 (60.2%) were on 1-4 additional non-COVID-19-related QT-prolonging drugs. Electrophysiology recommendations were given to limit modifiable risk factors. No patient developed torsades de pointes.

Conclusion: This process functioned efficiently, identified a high percentage of patients with QT prolongation, and led to relevant interventions. Arrhythmias were rare. No patient developed torsades de pointes.
Source
http://dx.doi.org/10.1016/j.hrthm.2020.04.047
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7200355
September 2020

Switching warfarin to direct oral anticoagulants in atrial fibrillation: Insights from the NCDR PINNACLE registry.

Clin Cardiol 2020 Jul 6;43(7):743-751. Epub 2020 May 6.

Yale Center for Outcomes Research and Evaluation (CORE), New Haven, Connecticut, USA.

Background: Previous studies examining the use of direct oral anticoagulants (DOACs) in atrial fibrillation (AF) have largely focused on patients newly initiating therapy. Little is known about the prevalence/patterns of switching to DOACs among AF patients initially treated with warfarin.

Hypothesis: To examine patterns of anticoagulation among patients chronically managed with warfarin upon the availability of DOACs and identify patient/practice-level factors associated with switching from chronic warfarin therapy to a DOAC.

Methods: Prospective cohort study of AF patients in the NCDR PINNACLE registry prescribed warfarin between May 1, 2008 and May 1, 2015. Patients were followed at least 1 year (median length of follow-up 375 days, IQR 154-375) through May 1, 2016 and stratified as follows: continued warfarin, switched to DOAC, or discontinued anticoagulation. To identify significant predictors of switching, a three-level multivariable hierarchical regression was developed.

Results: Among 383 008 AF patients initially prescribed warfarin, 16.3% (n = 62 620) switched to DOACs, 68.8% (n = 263 609) continued warfarin, and 14.8% (n = 56 779) discontinued anticoagulation. Among those switched, 37.6% received dabigatran, 37.0% rivaroxaban, 24.4% apixaban, and 1.0% edoxaban. Switched patients were more likely to be younger, women, white, and have private insurance (all P < .001). Switching was less likely with increased stroke risk (OR, 0.92; 95% CI, 0.91-0.93 per 1-point increase in CHA2DS2-VASc), but more likely with increased bleeding risk (OR, 1.12; 95% CI, 1.10-1.13 per 1-point increase in HAS-BLED). There was substantial variation at the practice level (MOR, 2.33; 95% CI, 2.12-2.58) and among providers within the same practice (MOR, 1.46; 95% CI, 1.43-1.49).

Conclusions: Among AF patients treated with warfarin between October 1, 2010 and May 1, 2016, one in six were switched to DOACs, with differences across sociodemographic/clinical characteristics and substantial practice-level variation. In the context of current guidelines which favor DOACs over warfarin, these findings help benchmark performance and identify areas of improvement.
Source
http://dx.doi.org/10.1002/clc.23376
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7368350
July 2020

Factors Associated With Large Improvements in Health-Related Quality of Life in Patients With Atrial Fibrillation: Results From ORBIT-AF.

Circ Arrhythm Electrophysiol 2020 05 16;13(5):e007775. Epub 2020 Apr 16.

Duke Clinical Research Institute, Durham, NC (D.N.H., K.P., E.D.P., J.P.P.).

Background: Atrial fibrillation (AF) adversely impacts health-related quality of life (hrQoL). While some patients demonstrate improvements in hrQoL, the factors associated with large improvements in hrQoL are not well described.

Methods: We assessed factors associated with a 1-year increase in the Atrial Fibrillation Effect on Quality-of-Life score of 1 SD (≥18 points; 3× clinically important difference), among outpatients in the Outcomes Registry for Better Informed Treatment of Atrial Fibrillation I registry.

Results: Overall, 28% (181/636) of patients had such a hrQoL improvement. Compared with patients not showing large hrQoL improvement, they were of similar age (median 73 versus 74, P=0.3), equally likely to be female (44% versus 48%, P=0.3), but more likely to have newly diagnosed AF at baseline (18% versus 8%; P=0.0004), prior antiarrhythmic drug use (52% versus 40%, P=0.005), baseline antiarrhythmic drug use (34.8% versus 26.8%, P=0.045), and more likely to undergo AF-related procedures during follow-up (AF ablation: 6.6% versus 2.0%, P=0.003; cardioversion: 12.2% versus 5.9%, P=0.008). In multivariable analysis, a history of alcohol abuse (adjusted OR, 2.41; P=0.01) and increased baseline diastolic blood pressure (adjusted OR, 1.23 per 10-point increase and >65 mm Hg; P=0.04) were associated with large improvements in hrQoL at 1 year, whereas patients with prior stroke/transient ischemic attack, chronic obstructive pulmonary disease, and peripheral arterial disease were less likely to improve (P<0.05 for each).

Conclusions: In this national registry of patients with AF, potentially treatable AF risk factors are associated with large hrQoL improvement, whereas less reversible conditions appeared negatively associated with hrQoL improvement. Understanding which patients are most likely to have large hrQoL improvement may facilitate targeting interventions for high-value care that optimizes patient-reported outcomes in AF. Registration: URL: http://www.clinicaltrials.gov. Unique identifier: NCT01165710.
Source
http://dx.doi.org/10.1161/CIRCEP.119.007775
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7247417
May 2020

The NCDR Left Atrial Appendage Occlusion Registry.

J Am Coll Cardiol 2020 04;75(13):1503-1518

Division of Cardiology, University of Colorado School of Medicine, Denver, Colorado.

Background: Left atrial appendage occlusion (LAAO) to prevent stroke in patients with atrial fibrillation has been evaluated in 2 randomized trials; post-approval clinical data are limited.

Objectives: The purpose of this study was to describe the National Cardiovascular Data Registry (NCDR) LAAO Registry and present patient, hospital, and physician characteristics and in-hospital adverse event rates for Watchman procedures in the United States during its first 3 years.

Methods: The authors describe the LAAO Registry structure and governance, the outcome adjudication processes, and the data quality and collection processes. They characterize the patient population, performing hospitals, and in-hospital adverse event rates.

Results: A total of 38,158 procedures from 495 hospitals performed by 1,318 physicians in the United States were included between January 2016 and December 2018. The mean patient age was 76.1 ± 8.1 years, the mean CHA2DS2-VASc (congestive heart failure, hypertension, 75 years of age and older, diabetes mellitus, previous stroke or transient ischemic attack, vascular disease, 65 to 74 years of age, female) score was 4.6 ± 1.5, and the mean HAS-BLED (hypertension, abnormal renal/liver function, stroke, bleeding history or predisposition, labile international normalized ratio, elderly, drugs/alcohol concomitantly) score was 3.0 ± 1.1. The median annual number of LAAO procedures performed was 30 per hospital (interquartile range: 18 to 44) and 12 per physician (interquartile range: 8 to 20). Procedures were canceled or aborted in 7% of cases; among cases in which a device was deployed, 98.1% were implanted with a <5-mm leak. Major in-hospital adverse events occurred in 2.16% of patients; the most common complications were pericardial effusion requiring intervention (1.39%) and major bleeding (1.25%), whereas stroke (0.17%) and death (0.19%) were rare.
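
The components listed above correspond to the standard CHA2DS2-VASc weighting; the Python sketch below is a reference calculation using the standard published point values, not registry code:

    # Standard CHA2DS2-VASc score from the components spelled out above.
    def cha2ds2_vasc(age: int, female: bool, chf: bool, hypertension: bool,
                     diabetes: bool, prior_stroke_tia: bool,
                     vascular_disease: bool) -> int:
        score = 0
        score += 1 if chf else 0                              # C: congestive heart failure
        score += 1 if hypertension else 0                     # H: hypertension
        score += 2 if age >= 75 else 1 if age >= 65 else 0    # A2 / A: age bands
        score += 1 if diabetes else 0                         # D: diabetes mellitus
        score += 2 if prior_stroke_tia else 0                 # S2: prior stroke/TIA
        score += 1 if vascular_disease else 0                 # V: vascular disease
        score += 1 if female else 0                           # Sc: sex category (female)
        return score

    # Example: a 76-year-old woman with hypertension and diabetes scores 5,
    # close to the registry mean of 4.6 reported above.
    print(cha2ds2_vasc(age=76, female=True, chf=False, hypertension=True,
                       diabetes=True, prior_stroke_tia=False, vascular_disease=False))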

Conclusions: The LAAO Registry has enrolled >38,000 patients implanted with the device. Patients were generally older with more comorbidities than those enrolled in the pivotal trials; however, major in-hospital adverse event rates were lower than reported in those trials.
Source
http://dx.doi.org/10.1016/j.jacc.2019.12.040
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7205034
April 2020

PD-L1 Expression on Circulating Tumor Cells May Be Predictive of Response to Pembrolizumab in Advanced Melanoma: Results from a Pilot Study.

Oncologist 2020 03 5;25(3):e520-e527. Epub 2019 Dec 5.

School of Medical and Health Sciences, Edith Cowan University, Perth, Australia.

Background: PD-1 inhibitors are routinely used for the treatment of advanced melanoma. This study sought to determine whether PD-L1 expression on circulating tumor cells (CTCs) can serve as a predictive biomarker of clinical benefit and response to treatment with the PD-1 inhibitor pembrolizumab.

Methods: Blood samples were collected from patients with metastatic melanoma receiving pembrolizumab, prior to treatment and 6-12 weeks after initiation of therapy. Multiparametric flow cytometry was used to identify CTCs and evaluate the expression of PD-L1.

Results: CTCs were detected in 25 of 40 patients (63%). Patients with detectable PD-L1(+) CTCs (14/25, 64%) had significantly longer progression-free survival (PFS) compared with patients with PD-L1(-) CTCs (26.6 months vs. 5.5 months; p = .018). The 12-month PFS rates were 76% versus 22% in the PD-L1(+) versus PD-L1(-) CTC groups (p = .012), respectively. A multivariate regression analysis confirmed that PD-L1(+) CTC status is an independent predictive biomarker of PFS (hazard ratio, 0.229; 95% confidence interval, 0.052-1.012; p = .026).

Conclusion: Our results reveal the potential of CTCs as a noninvasive real-time biopsy to evaluate PD-L1 expression in patients with melanoma. PD-L1 expression on CTCs may be predictive of response to pembrolizumab and longer PFS.

Implications For Practice: The present data suggest that PD-L1 expression on circulating tumor cells may predict response to pembrolizumab in advanced melanoma. This needs further validation in a larger trial and, if proven, might be a useful liquid biopsy tool that could be used to stratify patients into groups more likely to respond to immunotherapy, hence leading to health cost savings.
Source
http://dx.doi.org/10.1634/theoncologist.2019-0557
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7066715
March 2020

Detection and prognostic role of heterogeneous populations of melanoma circulating tumour cells.

Br J Cancer 2020 03 10;122(7):1059-1067. Epub 2020 Feb 10.

School of Medical and Health Sciences, Edith Cowan University, Perth, WA, Australia.

Background: Circulating tumour cells (CTCs) can be assessed through a minimally invasive blood sample with potential utility as a predictive, prognostic and pharmacodynamic biomarker. The large heterogeneity of melanoma CTCs has hindered their detection and clinical application.

Methods: Here we compared two microfluidic devices for the recovery of circulating melanoma cells. The presence of CTCs in 43 blood samples from patients with metastatic melanoma was evaluated using a combination of immunocytochemistry and transcript analyses of five genes by RT-PCR and 19 genes by droplet digital PCR (ddPCR), whereby a CTC score was calculated. Circulating tumour DNA (ctDNA) from the same patient blood sample, was assessed by ddPCR targeting tumour-specific mutations.

Results: Our analysis revealed an extraordinary heterogeneity amongst melanoma CTCs, with multiple non-overlapping subpopulations. CTC detection using our multimarker approach was associated with shorter overall and progression-free survival. Finally, we found that CTC scores correlated with plasma ctDNA concentrations and had similar pharmacodynamic changes upon treatment initiation.

Conclusions: Despite the high phenotypic and molecular heterogeneity of melanoma CTCs, multimarker derived CTC scores could serve as viable tools for prognostication and treatment response monitoring in patients with metastatic melanoma.
Source
http://dx.doi.org/10.1038/s41416-020-0750-9
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7109152
March 2020

Past behaviours and future intentions: An examination of perceptual deterrence and alcohol consumption upon a range of drink driving events.

Accid Anal Prev 2020 Mar 28;137:105428. Epub 2020 Jan 28.

Road Safety Research Collaboration, University of the Sunshine Coast, Sippy Downs, Queensland, 4556, Australia.

Introduction: The threat of application of legal sanctions remains the prominent approach to reduce the prevalence of drink driving in a vast array of motoring jurisdictions. However, ongoing questions remain regarding: (a) the extent that such mechanisms impact upon offending behaviours, (b) the deleterious effect alcohol consumption has on decisions to drink and drive and (c) how best to operationalise (and measure) the concept of drink driving to enhance the accurate measurement of the dependent variable.

Method: This paper reports on an examination of the perceptions of 773 Queensland motorists (across nine local government areas) regarding both legal and non-legal drink driving sanctions (as well as alcohol consumption) in order to gauge the deterrent impact upon a range of measures of drink driving: the driver thinking they are over the limit, the driver knowing they are over the limit, attempts to evade random breath testing, and intentions to re-offend. The sample completed an online or paper version of the questionnaire.

Results: The majority of participants reported "never" engaging in "possible" (74.5 %) or "acknowledged" (83.4 %) drink driving events, although a considerable proportion of the sample reported engaging in "possible" (25.5 %) or "acknowledged" (16.6 %) drink driving and attempting to evade RBT (18 %) events, as well as possible intentions to drink and drive in the future (22 %). Males were more likely to report such events. Perceptions of both legal sanctions (certainty, severity and swiftness) as well as non-legal sanctions (fear of social, internal or physical harm) were relatively high and consistent with previous research. Interestingly, non-legal sanctions were reported as stronger deterrents than legal sanctions. However, multivariate analysis revealed that legal deterrents had limited utility in predicting offending behaviours; rather, demographic characteristics (e.g., younger motorists, males) as well as risky drinking behaviour were better predictors. Regarding intentions to offend, a past conviction for drink driving was also a predictor of re-offending.

Practical Applications: These results highlight the ongoing challenges of addressing the problem of drink driving and that some motorists: (a) have entrenched behaviour and/or (b) make the decision to drink and drive before they are under the influence of alcohol.
Source
http://dx.doi.org/10.1016/j.aap.2019.105428
March 2020

Post-Emergency Department Atrial Fibrillation Clinics: A Shifting Paradigm in Care?

JACC Clin Electrophysiol 2020 01;6(1):53-55

Section of Cardiovascular Medicine, Department of Medicine, Yale University School of Medicine, New Haven, Connecticut, USA.

Source
http://dx.doi.org/10.1016/j.jacep.2019.10.006
January 2020

Decline in renal function and oral anticoagulation dose reduction among patients with atrial fibrillation.

Heart 2020 03 7;106(5):358-364. Epub 2020 Jan 7.

Duke Clinical Research Institute, Duke University Medical Center, Durham, North Carolina, USA.

Objective: Non-vitamin K oral anticoagulants (NOACs) require dose adjustment for renal function. We sought to investigate change in renal function over time in patients with atrial fibrillation (AF) and whether those on NOACs have appropriate dose adjustments according to its decline.

Methods: We included patients with AF enrolled in the Outcomes Registry for Better Informed Treatment of Atrial Fibrillation II registry treated with oral anticoagulation. Worsening renal function (WRF) was defined as a decrease of >20% in creatinine clearance (CrCl) from baseline. The US Food and Drug Administration (FDA)-approved package inserts were used to define the dose-reduction criteria for NOACs.
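
The worsening-renal-function definition above is a simple relative-decline check; a minimal Python sketch (illustrative variable names):

    # Worsening renal function (WRF) as defined above: a > 20% decline in
    # creatinine clearance (CrCl) from baseline.
    def worsening_renal_function(crcl_baseline: float, crcl_current: float) -> bool:
        decline_fraction = (crcl_baseline - crcl_current) / crcl_baseline
        return decline_fraction > 0.20

    # Example: baseline 80 mL/min falling to 60 mL/min is a 25% decline -> WRF
    print(worsening_renal_function(80.0, 60.0))  # True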

Results: Among 6682 patients with AF from 220 sites (median age (25th, 75th percentiles): 72.0 years (65.0, 79.0); 57.1% male; median CrCl at baseline: 80.1 mL/min (57.4, 108.5)), 1543 patients (23.1%) experienced WRF, with a mean decline in CrCl during 2-year follow-up of -6.63 mL/min for NOACs and -6.16 mL/min for warfarin. Among 4120 patients on NOACs, 154 (3.7%) patients had a CrCl decline sufficient to warrant FDA-recommended dose reductions. Of these, NOAC dosing was appropriately reduced in only 31 (20.1%) patients. Compared with patients with appropriately reduced NOACs, those without were more likely to experience bleeding complications (major bleeding: 1.7% vs 0%; bleeding hospitalisation: 2.6% vs 0%) at 1 year.

Conclusions: In US practice, about one-fourth of patients with AF had a >20% decline in CrCl during 2-year follow-up. As a result, about 3.7% of those treated with NOACs met guideline criteria for dose reduction, but of these, only 20.1% actually had a reduction.
Source
http://dx.doi.org/10.1136/heartjnl-2019-315792
March 2020

Comparison of Long-Term Adverse Outcomes in Patients With Atrial Fibrillation Having Ablation Versus Antiarrhythmic Medications.

Am J Cardiol 2020 02 19;125(4):553-561. Epub 2019 Nov 19.

Division of Research, Kaiser Permanente Northern California, Oakland, California; Department of Medicine, Health Research and Policy, Stanford University School of Medicine, Stanford, California; Departments of Epidemiology, Biostatistics, and Medicine, University of California, San Francisco, San Francisco, California.

The impact of atrial fibrillation (AF) catheter ablation versus chronic antiarrhythmic therapy alone on clinical outcomes such as death and stroke remains unclear. We compared adverse outcomes for AF ablation versus chronic antiarrhythmic therapy in 1,070 adults with AF treated between 2010 and 2014 in the Kaiser Permanente Northern California and Southern California healthcare delivery systems. Patients who underwent AF catheter ablation were matched to patients treated with only antiarrhythmic medications, based on age, gender, history of heart failure, history of coronary heart disease, history of hypertension, history of diabetes, and high-dimensional propensity score. We compared crude and adjusted rates of death, ischemic stroke or transient ischemic attack, intracranial hemorrhage, and hospitalization. The matched cohort of 535 patients treated with AF ablation and 535 treated with antiarrhythmic therapy had a median follow-up of 2.0 (interquartile range 1.1 to 3.5) years. There was no significant difference in adjusted rates of death (adjusted hazard ratio [HR] 0.24, 95% confidence interval [CI] 0.03 to 1.95), intracranial hemorrhage (adjusted HR 0.17, CI 0.02 to 1.71), ischemic stroke or transient ischemic attack (adjusted HR 0.53, CI 0.18 to 1.60), and heart failure hospitalization (adjusted HR 0.85, CI 0.34 to 2.12), although there was a trend toward improvement in these outcomes with ablation. However, there was a significantly increased risk of all-cause hospitalization following ablation (adjusted HR 1.60, CI 1.25 to 2.05). In a contemporary, multicenter, propensity-matched observational cohort, AF ablation was not significantly associated with death, intracranial hemorrhage, ischemic stroke or transient ischemic attack, or heart failure hospitalization, but was associated with a higher rate of all-cause hospitalization.

Source
http://dx.doi.org/10.1016/j.amjcard.2019.11.004
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6987016
February 2020

Outcomes and Anticoagulation Use After Catheter Ablation for Atrial Fibrillation.

Circ Arrhythm Electrophysiol 2019 12 13;12(12):e007612. Epub 2019 Dec 13.

Duke Clinical Research Institute, Durham, NC (P.S., K.S.P., E.D.P., J.P.P.).

Background: Studies evaluating the effects of atrial fibrillation (AF) catheter ablation versus antiarrhythmic therapy on outcomes have shown mixed results. In addition, guidelines recommend continuing oral anticoagulation (OAC) after ablation for those at risk of stroke, but real-world data are lacking.

Methods: We evaluated outcomes including death, myocardial infarction, stroke or systemic embolism, intracranial bleeding, major bleeding, and hospitalization in patients undergoing AF ablation compared with a propensity score-matched cohort of patients treated with antiarrhythmic medications only in the Outcomes Registry for Better Informed Treatment of Atrial Fibrillation (ORBIT-AF) registries. Cox proportional hazards regression was performed to evaluate the association between AF ablation and outcomes. We then evaluated patterns of treatment with OAC among AF ablation patients.

Results: Among 21 595 patients, 1190 (6%) underwent de novo AF ablation. Our propensity score-matched cohort included 1087 patients who underwent AF ablation matched 1:1 with 1087 patients treated with antiarrhythmic medications only. There were no significant differences in the risk of all-cause or cardiovascular death, or of most other major adverse cardiovascular and neurological events. AF catheter ablation was associated with an increased risk of all-cause hospitalization during follow-up (hazard ratio, 1.24 [95% CI, 1.05-1.46]), particularly in the first 3 months (the standard blanking period) after the procedure. Among those who underwent AF ablation with a CHA2DS2-VASc score ≥2 for men or ≥3 for women, 23% had OAC discontinued after ablation. Among those who discontinued OAC, the median time to discontinuation was 6.2 months.
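The OAC criterion quoted above (score ≥2 for men, ≥3 for women) refers to the CHA2DS2-VASc stroke-risk score. A small sketch of the standard scoring follows; the function and argument names are hypothetical.

```python
def cha2ds2_vasc(age, female, chf, hypertension, diabetes,
                 prior_stroke_tia_te, vascular_disease):
    """Compute the CHA2DS2-VASc stroke-risk score from its standard components."""
    score = 0
    score += 1 if chf else 0                               # Congestive heart failure
    score += 1 if hypertension else 0                      # Hypertension
    score += 2 if age >= 75 else (1 if age >= 65 else 0)   # Age >=75 (2) / 65-74 (1)
    score += 1 if diabetes else 0                          # Diabetes mellitus
    score += 2 if prior_stroke_tia_te else 0               # Prior stroke/TIA/thromboembolism
    score += 1 if vascular_disease else 0                  # Vascular disease
    score += 1 if female else 0                            # Sex category (female)
    return score

def oac_indicated(score, female):
    """Threshold used in the abstract above: score >=2 for men, >=3 for women."""
    return score >= (3 if female else 2)
```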

Conclusions: In this large US national registry, we found no difference in adjusted rates of cardiovascular or all-cause death between patients treated with AF catheter ablation and antiarrhythmic medications only. Notably, discontinuation of OAC after ablation remains relatively common despite guideline recommendations for continued stroke prevention therapy in patients at risk of stroke.

Source
http://dx.doi.org/10.1161/CIRCEP.119.007612
December 2019

Patterns of amiodarone use and outcomes in clinical practice for atrial fibrillation.

Am Heart J 2020 02 23;220:145-154. Epub 2019 Oct 23.

Columbia University Medical Center, New York, NY.

Background: Amiodarone is the most effective antiarrhythmic drug (AAD) for atrial fibrillation (AF), but it has a high incidence of adverse effects.

Methods: Using the ORBIT-AF registry, patients with AF who were on amiodarone at enrollment, were prescribed amiodarone during follow-up, or were never on amiodarone were analyzed for the proportion treated with a guideline-based indication for amiodarone, the variability in amiodarone use across sites, and outcomes (mortality, hospitalization, and stroke) among patients treated with amiodarone. Hierarchical logistic regression modeling with site-specific random intercepts compared rates of amiodarone use across 170 sites. A logistic regression model for the propensity to receive amiodarone was used to create a propensity-matched cohort. Cox proportional hazards modeling, stratified by matched pairs, evaluated the association between amiodarone use and outcomes.
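As a purely descriptive stand-in for the hierarchical random-intercept model described above, the sketch below computes each site's amiodarone-use rate and splits sites into tertiles of use, mirroring the tertile comparison reported in the results; the data frame, column names, and values are hypothetical.

```python
import pandas as pd

# Hypothetical patient-level data: one row per patient with their enrolling site
# and an indicator for amiodarone treatment.
patients = pd.DataFrame({
    "site_id":    [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6],
    "amiodarone": [0, 0, 1, 0, 0, 1, 1, 1, 0, 0, 0, 1],
})

# Per-site amiodarone-use rate.
site_rates = patients.groupby("site_id")["amiodarone"].mean().rename("use_rate")

# Split sites into tertiles of use and report the mean rate within each tertile.
tertile = pd.qcut(site_rates, 3, labels=["low", "middle", "high"])
print(site_rates.groupby(tertile).mean())
```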

Results: Among 6,987 AF patients, 867 (12%) were on amiodarone at baseline and 451 (6%) began incident amiodarone therapy during the 3-year follow-up. Use of amiodarone varied across sites, from 3% in the lowest tertile to 21% in the highest (p < 0.0001). Among those treated, 32% had documented contraindications to other AADs or had failed another AAD in the past. Mortality, cardiovascular hospitalization, and stroke were similar between matched patients on and not on amiodarone at baseline, whereas incident amiodarone use in matched patients was associated with higher all-cause mortality (adjusted HR 2.06, 95% CI 1.35-3.16).

Conclusions: Use of amiodarone among AF patients in community practice is highly variable. More than 2 out of 3 patients treated with amiodarone appeared to be eligible for a different AAD.

Source
http://dx.doi.org/10.1016/j.ahj.2019.09.017
February 2020

PD-L1 Expression on Circulating Tumor Cells May Be Predictive of Response to Pembrolizumab in Advanced Melanoma: Results from a Pilot Study.

Oncologist 2019 Dec 5. Epub 2019 Dec 5.

School of Medical and Health Sciences, Edith Cowan University, Perth, Australia

Background: PD-1 inhibitors are routinely used for the treatment of advanced melanoma. This study sought to determine whether PD-L1 expression on circulating tumor cells (CTCs) can serve as a predictive biomarker of clinical benefit and response to treatment with the PD-1 inhibitor pembrolizumab.

Methods: Blood samples were collected from patients with metastatic melanoma receiving pembrolizumab, prior to treatment and 6-12 weeks after initiation of therapy. Multiparametric flow cytometry was used to identify CTCs and evaluate the expression of PD-L1.

Results: CTCs were detected in 25 of 40 patients (63%). Patients with PD-L1-positive CTCs (14/25, 64%) had significantly longer progression-free survival (PFS) than patients with PD-L1-negative CTCs (26.6 months vs. 5.5 months; p = .018). The 12-month PFS rates were 76% versus 22% in the PD-L1-positive versus PD-L1-negative CTC groups, respectively (p = .012). A multivariable Cox regression analysis confirmed that PD-L1-positive CTC status is an independent predictive biomarker of PFS (hazard ratio, 0.229; 95% confidence interval, 0.052-1.012; p = .026).
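The PFS comparison reported above can be illustrated with a Kaplan-Meier/log-rank sketch using the lifelines package (a stand-in for whatever software the authors used); the durations and event indicators below are hypothetical and merely stand in for the per-patient PFS data.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical PFS durations (months) and event indicators (1 = progression observed).
pfs_pos = np.array([26.6, 24.0, 30.1, 12.4, 28.0])   # PD-L1-positive CTC group
evt_pos = np.array([0, 1, 0, 1, 0])
pfs_neg = np.array([5.5, 3.2, 7.8, 4.1, 6.0])        # PD-L1-negative CTC group
evt_neg = np.array([1, 1, 1, 1, 0])

# Kaplan-Meier estimate for one group (median PFS).
km = KaplanMeierFitter()
km.fit(pfs_pos, event_observed=evt_pos, label="PD-L1-positive CTCs")
print(km.median_survival_time_)

# Log-rank test comparing the two groups.
result = logrank_test(pfs_pos, pfs_neg, event_observed_A=evt_pos, event_observed_B=evt_neg)
print(result.p_value)
```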

Conclusion: Our results reveal the potential of CTCs as a noninvasive real-time biopsy to evaluate PD-L1 expression in patients with melanoma. PD-L1 expression on CTCs may be predictive of response to pembrolizumab and longer PFS.

Implications For Practice: The present data suggest that PD-L1 expression on circulating tumor cells may predict response to pembrolizumab in advanced melanoma. This needs further validation in a larger trial and, if proven, might be a useful liquid biopsy tool that could be used to stratify patients into groups more likely to respond to immunotherapy, hence leading to health cost savings.

Source
http://dx.doi.org/10.1634/theoncologist.2019-0557
December 2019

Guideline-directed therapies for comorbidities and clinical outcomes among individuals with atrial fibrillation.

Am Heart J 2020 01 8;219:21-30. Epub 2019 Nov 8.

Division of Cardiology, Duke University Medical Center, Durham, NC; Duke Clinical Research Institute, Durham, NC.

Background: Comorbidities are common in patients with atrial fibrillation (AF) and affect prognosis, yet are often undertreated. However, contemporary rates of use of guideline-directed therapies (GDT) for non-AF comorbidities and their association with outcomes are not well described.

Methods: We used the Outcomes Registry for Better Informed Treatment of AF (ORBIT-AF) to test the association between GDT for non-AF comorbidities and major adverse cardiac or neurovascular events (MACNE; cardiovascular death, myocardial infarction, stroke/thromboembolism, or new-onset heart failure), all-cause mortality, new-onset heart failure, and AF progression. Adjustment was performed using Cox proportional hazards models and logistic regression.
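A minimal sketch of the kind of adjusted Cox model described above, using the lifelines CoxPHFitter; the analysis table, covariate set, and values are hypothetical and far sparser than the study's actual adjustment.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical analysis table: follow-up time, MACNE indicator, exposure, covariate.
df = pd.DataFrame({
    "time_years": [2.1, 1.4, 3.0, 0.8, 2.6, 1.9, 2.3, 1.1],
    "macne":      [0,   1,   0,   1,   0,   1,   1,   0],
    "all_gdt":    [1,   0,   1,   0,   1,   1,   0,   0],   # received all indicated GDT
    "age":        [71,  78,  80,  72,  69,  74,  77,  76],
})

# Cox proportional hazards model for time to MACNE, adjusted for age.
cph = CoxPHFitter()
cph.fit(df, duration_col="time_years", event_col="macne")
print(cph.hazard_ratios_["all_gdt"])   # adjusted HR for receiving all indicated GDT
```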

Results: Only 6,782 (33%) of the 20,434 patients eligible for 1 or more GDT for non-AF comorbidities received all indicated therapies. Use of all comorbidity-specific GDT was highest for patients with hyperlipidemia (75.6%) and lowest for those with diabetes mellitus (43.1%). Use of "all eligible" GDT was associated with a nonsignificant trend toward lower rates of MACNE (HR 0.90 [0.79-1.02]) and all-cause mortality (HR 0.90 [0.80-1.01]). Use of GDT for heart failure was associated with a lower risk of all-cause mortality (HR 0.77 [0.67-0.89]), and treatment of obstructive sleep apnea was associated with a lower risk of AF progression (OR 0.75 [0.62-0.90]).

Conclusions: In AF patients, there is underuse of GDT for non-AF comorbidities. The association between GDT use and outcomes was strongest in patients with heart failure and obstructive sleep apnea, in whom use of GDT was associated with lower mortality and less AF progression, respectively.

Source
http://dx.doi.org/10.1016/j.ahj.2019.10.008
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7285625
January 2020