Publications by authors named "Kevin Zhang"

102 Publications

Racial and socioeconomic disparities in ocular surface squamous neoplasia: a National Cancer Database analysis.

Ophthalmic Epidemiol 2021 May 12:1-9. Epub 2021 May 12.

Department of Medicine, Henry Lynch Cancer Centre, Creighton University Medical Centre, Omaha, Nebraska, USA.

Purpose: A retrospective population-based study to investigate racial and socioeconomic disparities in patients diagnosed with ocular surface squamous neoplasia (OSSN). Methods: To explore racial disparity, we selected OSSN patients with known age, insurance, gender, and zip code-level income and education from the National Cancer Database (NCDB). Comparisons of clinical and socioeconomic variables stratified by race were made with the chi-square or Mann-Whitney tests. Survival outcome was examined with a Cox regression model. Results: Of the 2,402 identified patients from 2004 to 2015, 117 were black. Unadjusted differences were found between groups in regard to age, histology, insurance, income, and education. Black patients, in comparison to white patients, were younger (mean age: 62 years vs. 70 years; p < .001), were more likely to be insured through Medicaid (10.3% vs. 3.2%; p < .001) or uninsured (10.3% vs. 2.7%; p < .001), and were more likely to reside in areas of low educational attainment (32.5% vs. 16.1% of whites; p < .001). Multivariate analysis found significantly higher risk of death in patients who were male (HR: 1.66, 95% CI 1.37-2.01) or black (HR: 1.57, 95% CI 1.03-2.38). Conclusion: Disparities in socioeconomic factors were observed in black patients with OSSN. OSSN occurred earlier in blacks, who were also socioeconomically disadvantaged and faced higher risk of death.
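
The survival analysis described above is a standard Cox proportional hazards model. As a hedged illustration only (not the authors' code, with hypothetical column names for an NCDB-style extract), it could be sketched with the lifelines package roughly as follows:

```python
# Illustrative sketch: Cox proportional hazards model for overall survival
# with sex, race, age, and insurance as covariates. Column names are assumed.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("ossn_ncdb_cohort.csv")  # hypothetical extract

# One-hot encode categorical covariates; keep follow-up time (months)
# and vital status (1 = death, 0 = censored).
model_df = pd.get_dummies(
    df[["followup_months", "death", "sex", "race", "age", "insurance"]],
    columns=["sex", "race", "insurance"],
    drop_first=True,
    dtype=float,
)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="followup_months", event_col="death")
cph.print_summary()  # exp(coef) column gives hazard ratios with 95% CIs
```
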
Source
http://dx.doi.org/10.1080/09286586.2021.1925307
May 2021

Computational modeling of gene-specific transcriptional repression, activation and chromatin interactions in leukemogenesis by LASSO-regularized logistic regression.

IEEE/ACM Trans Comput Biol Bioinform 2021 May 7;PP. Epub 2021 May 7.

Many physiological and pathological pathways are dependent on gene-specific on/off regulation of transcription. Some genes are repressed, while others are activated. Although many previous studies have analyzed the mechanisms of gene-specific repression and activation, these studies are mainly based on the use of candidate genes, which are either repressed or activated, without simultaneously comparing and contrasting both groups of genes. There is also insufficient consideration of gene locations. Here we describe an integrated machine learning approach, using LASSO-regularized logistic regression, to model gene-specific repression and activation and the underlying contribution of chromatin interactions. LASSO-regularized logistic regression accurately predicted gene-specific transcriptional events and robustly detected the rate-limiting factors that underlie the differences between gene activation and repression. An example was provided by the leukemogenic transcription factor AML1-ETO, which is responsible for 10-15% of all acute myeloid leukemia cases. The analysis of AML1-ETO has also revealed novel networks of chromatin interactions and uncovered an unexpected role for E-proteins in AML1-ETO-p300 interactions and a role for the pre-existing gene state in governing the transcriptional response. Our results show that logistic regression-based probabilistic modeling is a promising tool to decipher mechanisms that integrate gene regulation and chromatin interactions in regulated transcription.
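
As a rough, hypothetical sketch of the core technique named above, L1 (LASSO)-regularized logistic regression for a binary repressed-versus-activated label, with cross-validated selection of the regularization strength, can be written with scikit-learn as below. The features are random placeholder data standing in for gene-level chromatin-interaction and binding features; this is not the authors' pipeline.

```python
# Minimal sketch of LASSO (L1) logistic regression with cross-validated C.
import numpy as np
from sklearn.linear_model import LogisticRegressionCV
from sklearn.model_selection import StratifiedKFold
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 40))          # placeholder features per gene
y = (X[:, 0] - 0.8 * X[:, 3] + rng.normal(size=500) > 0).astype(int)

model = make_pipeline(
    StandardScaler(),
    LogisticRegressionCV(
        Cs=20, penalty="l1", solver="liblinear",
        cv=StratifiedKFold(5, shuffle=True, random_state=0),
        scoring="roc_auc", max_iter=5000,
    ),
)
model.fit(X, y)

# Non-zero coefficients play the role of "rate-limiting" features.
coefs = model.named_steps["logisticregressioncv"].coef_.ravel()
print("Selected features:", np.flatnonzero(coefs != 0))
```
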
Source
http://dx.doi.org/10.1109/TCBB.2021.3078128
May 2021

Weight Loss Outcomes following Roux-en-Y Gastric Bypass and Sleeve Gastrectomy in an Ethnically Diverse Bariatric Population: Which Is More Effective?

Minim Invasive Surg 2021 16;2021:9702976. Epub 2021 Apr 16.

Department of Surgery, Wyckoff Heights Medical Center, Brooklyn, NY, USA.

Background: Laparoscopic Roux-en-Y gastric bypass (LRYGB) and laparoscopic sleeve gastrectomy (LSG) have comparable weight loss outcomes in a general bariatric population.

Objectives: This study aimed to investigate whether similar outcomes can be observed in Hispanic and African American populations. Setting: Community Hospital in New York, New York, United States.

Methods: The 5-year prospective data of patients who underwent LRYGB and LSG at a single center were retrospectively reviewed. The long-term weight loss outcomes between patients who had LRYGB and LSG were compared after adjusting for age, sex, race, diabetes mellitus, and hypertension with the linear mixed-effects or logistic regression model.
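
A minimal, hypothetical sketch of the adjusted comparison described in the Methods (a linear mixed-effects model for repeated %TWL measurements with a random intercept per patient) could look like the following with statsmodels; the data file and variable names are assumed, not taken from the study.

```python
# Illustrative mixed-effects comparison of %TWL between procedures,
# adjusted for covariates, with a random intercept per patient.
import pandas as pd
import statsmodels.formula.api as smf

data = pd.read_csv("bariatric_followup.csv")  # hypothetical long-format data

# procedure: "LRYGB" or "LSG"; years_postop: follow-up time in years
md = smf.mixedlm(
    "pct_twl ~ procedure + years_postop + age + sex + race + diabetes + hypertension",
    data=data,
    groups=data["patient_id"],
)
fit = md.fit()
print(fit.summary())  # coefficient on procedure contrasts LRYGB vs. LSG
```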

Results: Most patients were Hispanic (59.2%) and African American (22.7%). The mean % total weight loss (%TWL) values of patients with BMI <45 kg/m² who underwent LRYGB and LSG were 73% and 62% after 1 year, 69% and 56% after 2 years, and 71% and 54% after 5 years, respectively. In patients with a BMI of 45-50 kg/m² who underwent LRYGB and LSG, the mean %TWL values were 69% and 56% after 1 year, 75% and 58% after 2 years, and 57% and 45% after 5 years, respectively. Meanwhile, the %TWL values of patients with BMI >50 kg/m² who had LRYGB and LSG were 53% and 42% after 1 year, 53% and 45% after 2 years, and 49% and 36% after 5 years, respectively. All results were statistically significant (p < 0.0001) and remained valid after adjusting for cofactors.

Conclusion: LRYGB produced greater and more sustained long-term weight loss than LSG in a predominantly ethnically diverse patient population across BMI categories. Our study had several limitations: it was retrospective in nature, and some patients were lost to follow-up during the study period.
Source
http://dx.doi.org/10.1155/2021/9702976
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8064797
April 2021

Simulated Identification of Silent COVID-19 Infections Among Children and Estimated Future Infection Rates With Vaccination.

JAMA Netw Open 2021 04 1;4(4):e217097. Epub 2021 Apr 1.

Center for Infectious Disease Modeling and Analysis, Yale School of Public Health, New Haven, Connecticut.

Importance: A significant proportion of COVID-19 transmission occurs silently during the presymptomatic and asymptomatic stages of infection. Children, although important drivers of silent transmission, are not included in the current COVID-19 vaccination campaigns.

Objective: To estimate the benefits of identifying silent infections among children as a proxy for their vaccination.

Design, Setting, And Participants: This study used an age-structured disease transmission model, parameterized with census data and estimates from published literature, to simulate the estimated synergistic effect of interventions in reducing attack rates during the course of 1 year among a synthetic population representative of the US demographic composition. The population included 6 age groups of 0 to 4, 5 to 10, 11 to 18, 19 to 49, 50 to 64, and 65 years or older based on US census data. Data were analyzed from December 12, 2020, to February 26, 2021.

Exposures: In addition to the isolation of symptomatic cases within 24 hours of symptom onset, vaccination of adults was implemented to reach a 40% to 60% coverage during 1 year with an efficacy of 95% against symptomatic and severe COVID-19.

Main Outcomes And Measures: The combinations of proportion and speed for detecting silent infections among children that would suppress future attack rates to less than 5%.

Results: In the base-case scenarios with an effective reproduction number Re = 1.2, a targeted approach that identifies 11% of silent infections among children within 2 days and 14% within 3 days after infection would bring attack rates to less than 5% with 40% vaccination coverage of adults. If silent infections among children remained undetected, achieving the same attack rates would require an unrealistically high vaccination coverage (≥81%) of this age group, in addition to 40% vaccination coverage of adults. The estimated effect of identifying silent infections was robust in sensitivity analyses with respect to vaccine efficacy against infection and reduced susceptibility of children to infection.

Conclusions And Relevance: In this simulation modeling study of a synthetic US population, in the absence of vaccine availability for children, a targeted approach to rapidly identify silent COVID-19 infections in this age group was estimated to significantly mitigate disease burden. These findings suggest that without measures to interrupt transmission chains from silent infections, vaccination of adults is unlikely to contain the outbreaks in the near term.
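
As a deliberately simplified illustration of the age-structured transmission modeling described in the Design section, the sketch below tracks SEIR compartments for six age groups with a toy contact matrix and treats detection of silent infections among children as an extra removal rate. All parameter values, the contact matrix, and intervention details are placeholders; the published model is far more detailed (vaccination of adults, isolation of symptomatic cases, calibrated parameters).

```python
# Toy age-structured SEIR with isolation of a fraction of silent child cases.
import numpy as np

groups = ["0-4", "5-10", "11-18", "19-49", "50-64", "65+"]
N = np.array([19.6, 24.6, 33.0, 130.0, 63.0, 54.0]) * 1e6  # rough sizes
C = np.full((6, 6), 1.0) + 3.0 * np.eye(6)                  # toy contact matrix

beta = 0.025          # transmission probability per contact (placeholder)
sigma = 1 / 3.0       # latent -> infectious rate
gamma = 1 / 5.0       # removal rate without detection
child = np.array([1, 1, 1, 0, 0, 0], dtype=float)
detect_frac, detect_delay = 0.15, 2.0                        # children only
extra_removal = child * detect_frac / detect_delay           # isolation rate

S, E, I, R = N - 100, np.zeros(6), np.full(6, 100.0), np.zeros(6)
dt, days = 0.1, 365
for _ in range(int(days / dt)):
    foi = beta * C.dot(I / N)                 # force of infection by age
    new_E = foi * S * dt
    new_I = sigma * E * dt
    removed = (gamma + extra_removal) * I * dt
    S, E, I, R = S - new_E, E + new_E - new_I, I + new_I - removed, R + removed

print(f"Toy attack rate over one year: {R.sum() / N.sum():.1%}")
```
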
Source
http://dx.doi.org/10.1001/jamanetworkopen.2021.7097
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8065378
April 2021

Evaluation of COVID-19 vaccination strategies with a delayed second dose.

PLoS Biol 2021 04 21;19(4):e3001211. Epub 2021 Apr 21.

Center for Infectious Disease Modeling and Analysis, Yale School of Public Health, New Haven, Connecticut, United States of America.

Two of the Coronavirus Disease 2019 (COVID-19) vaccines currently approved in the United States require 2 doses, administered 3 to 4 weeks apart. Constraints in vaccine supply and distribution capacity, together with a deadly wave of COVID-19 from November 2020 to January 2021 and the emergence of highly contagious Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2) variants, sparked a policy debate on whether to vaccinate more individuals with the first dose of available vaccines and delay the second dose or to continue with the recommended 2-dose series as tested in clinical trials. We developed an agent-based model of COVID-19 transmission to compare the impact of these 2 vaccination strategies, while varying the temporal waning of vaccine efficacy following the first dose and the level of preexisting immunity in the population. Our results show that for Moderna vaccines, a delay of at least 9 weeks could maximize vaccination program effectiveness and avert at least an additional 17.3 (95% credible interval [CrI]: 7.8-29.7) infections, 0.69 (95% CrI: 0.52-0.97) hospitalizations, and 0.34 (95% CrI: 0.25-0.44) deaths per 10,000 population compared to the recommended 4-week interval between the 2 doses. Pfizer-BioNTech vaccines also averted an additional 0.60 (95% CrI: 0.37-0.89) hospitalizations and 0.32 (95% CrI: 0.23-0.45) deaths per 10,000 population in a 9-week delayed second dose (DSD) strategy compared to the 3-week recommended schedule between doses. However, there was no clear advantage of delaying the second dose with Pfizer-BioNTech vaccines in reducing infections, unless the efficacy of the first dose did not wane over time. Our findings underscore the importance of quantifying the characteristics and durability of vaccine-induced protection after the first dose in order to determine the optimal time interval between the 2 doses.
Source
http://dx.doi.org/10.1371/journal.pbio.3001211
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8092656
April 2021

Multifaceted strategies for the control of COVID-19 outbreaks in long-term care facilities in Ontario, Canada.

Prev Med 2021 Apr 18;148:106564. Epub 2021 Apr 18.

Agent-Based Modelling Laboratory, York University, Toronto, Ontario M3J 1P3, Canada.

The novel coronavirus disease 2019 (COVID-19) has caused severe outbreaks in Canadian long-term care facilities (LTCFs). In Canada, over 80% of COVID-19 deaths during the first pandemic wave occurred in LTCFs. We sought to evaluate the effect of mitigation measures in LTCFs including frequent testing of staff, and vaccination of staff and residents. We developed an agent-based transmission model and parameterized it with disease-specific estimates, temporal sensitivity of nasopharyngeal and saliva testing, results of vaccine efficacy trials, and data from initial COVID-19 outbreaks in LTCFs in Ontario, Canada. Characteristics of staff and residents, including contact patterns, were integrated into the model with age-dependent risk of hospitalization and death. Estimates of infection and outcomes were obtained and 95% credible intervals were generated using a bias-corrected and accelerated bootstrap method. Weekly routine testing of staff with 2-day turnaround time reduced infections among residents by at least 25.9% (95% CrI: 23.3%-28.3%), compared to baseline measures of mask-wearing, symptom screening, and staff cohorting alone. A similar reduction of hospitalizations and deaths was achieved in residents. Vaccination averted 2-4 times more infections in both staff and residents as compared to routine testing, and markedly reduced hospitalizations and deaths among residents by 95.9% (95% CrI: 95.4%-96.3%) and 95.8% (95% CrI: 95.5%-96.1%), respectively, over 200 days from the start of vaccination. Vaccination could have a substantial impact on mitigating disease burden among residents, but may not eliminate the need for other measures before population-level control of COVID-19 is achieved.
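
One methodological detail above, the bias-corrected and accelerated (BCa) bootstrap used to generate the reported 95% intervals from simulation realizations, can be sketched with scipy as follows. The per-run outcome values are invented placeholders, not outputs of the published model.

```python
# BCa bootstrap interval for a simulation outcome (placeholder data).
import numpy as np
from scipy.stats import bootstrap

rng = np.random.default_rng(1)
infections_per_run = rng.poisson(lam=32, size=500)   # hypothetical model output

res = bootstrap(
    (infections_per_run,),
    np.mean,
    confidence_level=0.95,
    method="BCa",
)
print(f"Mean infections per run: {infections_per_run.mean():.1f}")
print(f"95% BCa interval: ({res.confidence_interval.low:.1f}, "
      f"{res.confidence_interval.high:.1f})")
```
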
Source
http://dx.doi.org/10.1016/j.ypmed.2021.106564
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8053216
April 2021

A vicious cycle of bisretinoid formation and oxidation relevant to recessive Stargardt disease.

J Biol Chem 2021 Jan 7;296:100259. Epub 2021 Jan 7.

Department of Ophthalmology, Columbia University Medical Center, New York, New York, USA; Department of Pathology and Cell Biology, Columbia University Medical Center, New York, New York, USA.

The ability of iron to transfer electrons enables the contribution of this metal to a variety of cellular activities even as the redox properties of iron are also responsible for the generation of hydroxyl radicals (•OH), the most destructive of the reactive oxygen species. We previously showed that iron can promote the oxidation of bisretinoid by generating highly reactive hydroxyl radical (•OH). Now we report that preservation of iron regulation in the retina is not sufficient to prevent iron-induced bisretinoid oxidative degradation when blood iron levels are elevated in liver-specific hepcidin knockout mice. We obtained evidence for the perpetuation of Fenton reactions in the presence of the bisretinoid A2E and visible light. On the other hand, iron chelation by deferiprone was not associated with changes in postbleaching recovery of 11-cis-retinal or dark-adapted ERG b-wave amplitudes, indicating that the activity of Rpe65, a rate-determining visual cycle protein that carries an iron-binding domain, is not affected. Notably, iron levels were elevated in the neural retina and retinal pigment epithelial (RPE) cells of Abca4 mice. Consistent with higher iron content, ferritin-L immunostaining was elevated in RPE of a patient diagnosed with ABCA4-associated disease and in RPE and photoreceptor cells of Abca4 mice. In neural retina of the mutant mice, reduced Tfrc mRNA was also an indicator of retinal iron overload. Thus, iron chelation may defend the retina when bisretinoid toxicity is implicated in disease processes.
Source
http://dx.doi.org/10.1016/j.jbc.2021.100259
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7948646
January 2021

Projecting the impact of a two-dose COVID-19 vaccination campaign in Ontario, Canada.

Vaccine 2021 04 20;39(17):2360-2365. Epub 2021 Mar 20.

Agent-Based Modelling Laboratory, York University, Toronto, Ontario M3J 1P3 Canada.

Background: A number of highly effective COVID-19 vaccines have been developed and approved for mass vaccination. We evaluated the impact of vaccination on COVID-19 outbreak and disease outcomes in Ontario, Canada.

Methods: We used an agent-based transmission model and parameterized it with COVID-19 characteristics, demographics of Ontario, and age-specific clinical outcomes. We implemented a two-dose vaccination program according to tested schedules in clinical trials for Pfizer-BioNTech and Moderna vaccines, prioritizing healthcare workers, individuals with comorbidities, and those aged 65 and older. Daily vaccination rate was parameterized based on vaccine administration data. Using estimates of vaccine efficacy, we projected the impact of vaccination on the overall attack rate, hospitalizations, and deaths. We further investigated the effect of increased daily contacts at different stages during vaccination campaigns on outbreak control.

Results: Maintaining non-pharmaceutical interventions (NPIs) with an average of 74% reduction in daily contacts, vaccination with Pfizer-BioNTech and Moderna vaccines was projected to reduce hospitalizations by 27.3% (95% CrI: 22.3% - 32.4%) and 27.0% (95% CrI: 21.9% - 32.6%), respectively, over a one-year time horizon. The largest benefits of vaccination were observed in preventing deaths with reductions of 31.5% (95% CrI: 22.5% - 39.7%) and 31.9% (95% CrI: 22.0% - 41.4%) for Pfizer-BioNTech and Moderna vaccines, respectively, compared to no vaccination. We found that an increase of only 10% in daily contacts at the end of lockdown, when vaccination coverage with only one dose was 6%, would trigger a surge in the outbreak. Early relaxation of population-wide measures could lead to a substantial increase in the number of infections, potentially reaching levels observed during the peak of the second wave in Ontario.

Conclusions: Vaccination can substantially mitigate ongoing COVID-19 outbreaks. Sustaining population-wide NPIs, to allow for a sufficient increase in population-level immunity through vaccination, is essential to prevent future outbreaks.
Source
http://dx.doi.org/10.1016/j.vaccine.2021.03.058
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7980181
April 2021

Microfluidic guillotine reveals multiple timescales and mechanical modes of wound response in Stentor coeruleus.

BMC Biol 2021 Apr 2;19(1):63. Epub 2021 Apr 2.

Department of Mechanical Engineering, Stanford University, Stanford, CA, 94305, USA.

Background: Wound healing is one of the defining features of life and is seen not only in tissues but also within individual cells. Understanding wound response at the single-cell level is critical for determining fundamental cellular functions needed for cell repair and survival. This understanding could also enable the engineering of single-cell wound repair strategies in emerging synthetic cell research. One approach is to examine and adapt self-repair mechanisms from a living system that already demonstrates robust capacity to heal from large wounds. Towards this end, Stentor coeruleus, a single-celled free-living ciliate protozoan, is a unique model because of its robust wound healing capacity. This capacity allows one to perturb the wounding conditions and measure their effect on the repair process without immediately causing cell death, thereby providing a robust platform for probing the self-repair mechanism.

Results: Here we used a microfluidic guillotine and a fluorescence-based assay to probe the timescales of wound repair and of mechanical modes of wound response in Stentor. We found that Stentor requires ~ 100-1000 s to close bisection wounds, depending on the severity of the wound. This corresponds to a healing rate of ~ 8-80 μm/s, faster than most other single cells reported in the literature. Further, we characterized three distinct mechanical modes of wound response in Stentor: contraction, cytoplasm retrieval, and twisting/pulling. Using chemical perturbations, active cilia were found to be important for only the twisting/pulling mode. Contraction of myonemes, a major contractile fiber in Stentor, was surprisingly not important for the contraction mode and was of low importance for the others.

Conclusions: While events local to the wound site have been the focus of many single-cell wound repair studies, our results suggest that large-scale mechanical behaviors may be of greater importance to single-cell wound repair than previously thought. The work here advances our understanding of the wound response in Stentor and will lay the foundation for further investigations into the underlying components and molecular mechanisms involved.
Source
http://dx.doi.org/10.1186/s12915-021-00970-0
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8017755
April 2021

The internal limiting membrane: Roles in retinal development and implications for emerging ocular therapies.

Exp Eye Res 2021 May 20;206:108545. Epub 2021 Mar 20.

Glaucoma Center of Excellence, Wilmer Eye Institute, Johns Hopkins University School of Medicine, 600 North Wolfe Street, Maumenee B-110, Baltimore, MD, 21287, USA.

Basement membranes help to establish, maintain, and separate their associated tissues. They also provide growth and signaling substrates for nearby resident cells. The internal limiting membrane (ILM) is the basement membrane at the ocular vitreoretinal interface. While the ILM is essential for normal retinal development, it is dispensable in adulthood. Moreover, the ILM may constitute a significant barrier to emerging ocular therapeutics, such as viral gene therapy or stem cell transplantation. Here we take a neurodevelopmental perspective in examining how retinal neurons, glia, and vasculature interact with individual extracellular matrix constituents at the ILM. In addition, we review evidence that the ILM may impede novel ocular therapies and discuss approaches for achieving retinal parenchymal targeting of gene vectors and cell transplants delivered into the vitreous cavity by manipulating interactions with the ILM.
Source
http://dx.doi.org/10.1016/j.exer.2021.108545
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8087649
May 2021

Fomite Transmission, Physicochemical Origin of Virus-Surface Interactions, and Disinfection Strategies for Enveloped Viruses with Applications to SARS-CoV-2.

ACS Omega 2021 Mar 5;6(10):6509-6527. Epub 2021 Mar 5.

Department of Mechanical Engineering, Stanford University, Stanford, California 94305, United States.

Inanimate objects or surfaces contaminated with infectious agents, referred to as fomites, play an important role in the spread of viruses, including SARS-CoV-2, the virus responsible for the COVID-19 pandemic. The long persistence of viruses (hours to days) on surfaces calls for an urgent need for effective surface disinfection strategies to intercept virus transmission and the spread of diseases. Elucidating the physicochemical processes and surface science underlying the adsorption and transfer of virus between surfaces, as well as their inactivation, is important for understanding how diseases are transmitted and for developing effective intervention strategies. This review summarizes the current knowledge and underlying physicochemical processes of virus transmission, in particular via fomites, and common disinfection approaches. Gaps in knowledge and the areas in need of further research are also identified. The review focuses on SARS-CoV-2, but discussion of related viruses is included to provide a more comprehensive review given that much remains unknown about SARS-CoV-2. Our aim is that this review will provide a broad survey of the issues involved in fomite transmission and intervention to a wide range of readers to better enable them to take on the open research challenges.
Source
http://dx.doi.org/10.1021/acsomega.0c06335
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7944398
March 2021

Electrophysiological Mechanisms Underlying T-Wave Alternans and Their Role in Arrhythmogenesis.

Front Physiol 2021 4;12:614946. Epub 2021 Mar 4.

Key Lab of Medical Electrophysiology, Ministry of Education, Institute of Cardiovascular Research, Southwest Medical University, Luzhou, China.

T-wave alternans (TWA) reflects every-other-beat alterations in the morphology of the electrocardiogram ST segment or T wave in the setting of a constant heart rate, hence in the absence of heart rate variability. It is believed to be associated with the dispersion of repolarization and, as numerous studies have shown, has been used as a non-invasive marker for predicting the risk of malignant cardiac arrhythmias and sudden cardiac death. This review aims to provide an up-to-date account of both experimental and simulation studies elucidating possible mechanisms underlying the genesis of TWA at the cellular level, as well as the genesis of spatially concordant/discordant alternans at the tissue level, and their transition to cardiac arrhythmia. Recent progress and future perspectives in antiarrhythmic therapies associated with TWA are also discussed.
Source
http://dx.doi.org/10.3389/fphys.2021.614946
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7969788
March 2021

Tenofovir and emtricitabine concentrations in hair are comparable between individuals on tenofovir disoproxil fumarate versus tenofovir alafenamide-based ART.

Drug Test Anal 2021 Mar 20. Epub 2021 Mar 20.

Division of HIV, Infection Diseases, and Global Medicine, Department of Medicine, University of California San Francisco, San Francisco, California, USA.

Tenofovir disoproxil fumarate (TDF) in combination with emtricitabine (FTC) is the backbone for both human immunodeficiency virus (HIV) treatment and pre-exposure prophylaxis (PrEP) worldwide. Tenofovir alafenamide (TAF) with FTC is increasingly used in HIV treatment and was recently approved for PrEP among men who have sex with men. TDF and TAF are both metabolized into tenofovir (TFV). Antiretrovirals in plasma are taken up into hair over time, with hair levels providing a long-term measure of adherence. Here, we report a simple, robust, highly sensitive, and validated high-performance liquid chromatography coupled with tandem mass spectrometry (LC/MS/MS)-based analytical method for analyzing TFV and FTC from individuals on either TDF/FTC or TAF/FTC in small hair samples. TFV/FTC are extracted from ~5 mg hair and separated on a column using gradient elution. The lower quantification limits are 0.00200 (TFV) and 0.0200 (FTC) ng/mg hair; the assay is linear up to 0.400 (TFV) and 4.00 (FTC) ng/mg hair. The intra-day and inter-day coefficients of variation (CVs) are 5.39-12.6% and 6.40-13.5% for TFV and 0.571-2.45% and 2.45-5.16% for FTC. TFV concentrations from participants on TDF/FTC-based regimens with undetectable plasma HIV RNA were 0.0525 ± 0.0295 ng/mg, whereas those from individuals on TAF/FTC-based regimens were 0.0426 ± 0.0246 ng/mg. Despite the dose of TFV in TDF being 10 times that of TAF, hair concentrations of TFV were not significantly different for those on TDF versus TAF regimens. Pharmacological enhancers (ritonavir and cobicistat) did not boost TFV concentrations in hair. In summary, we developed and validated a sensitive analytical method to analyze TFV and FTC in hair and found that hair concentrations of TFV were essentially equivalent among those on TDF and TAF.
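
For readers unfamiliar with the precision metrics quoted above, intra-day and inter-day coefficients of variation can be computed from replicate quality-control runs roughly as in this sketch; the replicate values are invented, and the study's validation procedure may define the inter-day CV differently.

```python
# CV% from hypothetical replicate QC runs (not the study's data).
import numpy as np

# rows = runs on three different days, columns = replicates within a day
tfv_qc = np.array([
    [0.102, 0.098, 0.105, 0.101],
    [0.096, 0.099, 0.103, 0.100],
    [0.107, 0.104, 0.098, 0.102],
])  # ng/mg hair, hypothetical mid-level QC

intra_day_cv = (tfv_qc.std(axis=1, ddof=1) / tfv_qc.mean(axis=1)).mean() * 100
inter_day_cv = tfv_qc.mean(axis=1).std(ddof=1) / tfv_qc.mean() * 100
print(f"Intra-day CV: {intra_day_cv:.1f}%  Inter-day CV: {inter_day_cv:.1f}%")
```
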
Source
http://dx.doi.org/10.1002/dta.3033
March 2021

Validation of the STOP-Bang questionnaire as a screening tool for obstructive sleep apnoea in patients with cardiovascular risk factors: a systematic review and meta-analysis.

BMJ Open Respir Res 2021 Mar;8(1)

Department of Anaesthesia and Pain Medicine, Toronto Western Hospital, Toronto, Ontario, Canada

Introduction: Obstructive sleep apnoea (OSA) is highly prevalent in patients with cardiovascular risk factors and is associated with increased morbidity and mortality. This review presents the predictive parameters of the STOP-Bang questionnaire as a screening tool for OSA in this population.

Methods: A search of databases was performed. The inclusion criteria were: (1) use of the STOP-Bang questionnaire to screen for OSA in adults (>18 years) with cardiovascular risk factors; (2) polysomnography or home sleep apnoea testing performed as a reference standard; (3) OSA defined by either Apnoea-Hypopnoea Index (AHI) or Respiratory Disturbance Index; and (4) data on predictive parameters of the STOP-Bang questionnaire. A random-effects model was used to obtain pooled predictive parameters of the STOP-Bang questionnaire.
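
The pooling step described above can be sketched, under simplifying assumptions, as a DerSimonian-Laird random-effects meta-analysis of logit-transformed sensitivities. The counts below are invented placeholders rather than the nine included studies, and the review may have used a different (e.g., bivariate) pooling model.

```python
# DerSimonian-Laird random-effects pooling of sensitivities (toy data).
import numpy as np
from scipy.special import expit, logit

tp = np.array([180, 95, 210, 60, 130])   # true positives (hypothetical)
fn = np.array([20, 12, 25, 9, 14])       # false negatives (hypothetical)

y = logit(tp / (tp + fn))                # logit sensitivity per study
v = 1 / tp + 1 / fn                      # approximate variance of logit(p)

w_fixed = 1 / v
y_fixed = np.sum(w_fixed * y) / np.sum(w_fixed)
Q = np.sum(w_fixed * (y - y_fixed) ** 2)
k = len(y)
tau2 = max(0.0, (Q - (k - 1)) /
           (np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)))

w_rand = 1 / (v + tau2)                  # random-effects weights
y_rand = np.sum(w_rand * y) / np.sum(w_rand)
se = np.sqrt(1 / np.sum(w_rand))
lo, hi = expit(y_rand - 1.96 * se), expit(y_rand + 1.96 * se)
print(f"Pooled sensitivity: {expit(y_rand):.1%} (95% CI {lo:.1%}-{hi:.1%})")
```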

Results: The literature search resulted in 3888 articles, of which 9 papers met the inclusion criteria, involving 1894 patients. The average age of the included patients was 58±13 years with a body mass index (BMI) of 30±6 kg/m², and 64% were male. The STOP-Bang questionnaire had a sensitivity of 89.1%, 90.7% and 93.9% to screen for all (AHI ≥5), moderate-to-severe (AHI ≥15) and severe (AHI ≥30) OSA, respectively. The specificity was 32.3%, 22.5% and 18.3% and the area under the curve (AUC) was 0.86, 0.65 and 0.52 for all, moderate-to-severe and severe OSA, respectively.

Conclusion: The STOP-Bang questionnaire is an effective tool to screen for OSA (AHI ≥5) in patients with cardiovascular risk factors, with an AUC of 0.86.
Source
http://dx.doi.org/10.1136/bmjresp-2020-000848
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7934717
March 2021

How Well Can Multivariate and Univariate GWAS Distinguish Between True and Spurious Pleiotropy?

Front Genet 2020 8;11:602526. Epub 2021 Jan 8.

Department of Crop Science, University of Illinois at Urbana-Champaign, Urbana, IL, United States.

Quantification of the simultaneous contributions of loci to multiple traits, a phenomenon called pleiotropy, is facilitated by the increased availability of high-throughput genotypic and phenotypic data. To understand the prevalence and nature of pleiotropy, the ability of multivariate and univariate genome-wide association study (GWAS) models to distinguish between pleiotropic and non-pleiotropic loci in linkage disequilibrium (LD) first needs to be evaluated. Therefore, we used publicly available maize and soybean genotypic data to simulate multiple pairs of traits that were either (i) controlled by quantitative trait nucleotides (QTNs) on separate chromosomes, (ii) controlled by QTNs in various degrees of LD with each other, or (iii) controlled by a single pleiotropic QTN. We showed that multivariate GWAS could not distinguish between QTNs in LD and a single pleiotropic QTN. In contrast, a unique QTN detection rate pattern was observed for univariate GWAS whenever the simulated QTNs were in high LD or pleiotropic. Collectively, these results suggest that multivariate and univariate GWAS should both be used to infer whether or not causal mutations underlying peak GWAS associations are pleiotropic. Therefore, we recommend that future studies use a combination of multivariate and univariate GWAS models, as both models could be useful for identifying and narrowing down candidate loci with potential pleiotropic effects for downstream biological experiments.
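
A toy version of the simulation idea, two traits generated either by one pleiotropic QTN or by two QTNs in linkage disequilibrium, followed by simple univariate per-SNP tests, is sketched below. Genotypes, effect sizes, and tests are placeholders; the study used real maize and soybean genotypes and dedicated GWAS models.

```python
# Toy simulation: pleiotropic QTN vs. two QTNs in LD, tested univariately.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n = 800

# Two biallelic SNPs in LD: copy SNP1 genotypes with 90% probability
snp1 = rng.binomial(2, 0.3, size=n)
snp2 = np.where(rng.random(n) < 0.9, snp1, rng.binomial(2, 0.3, size=n))

def simulate_traits(pleiotropic: bool) -> np.ndarray:
    noise = rng.normal(size=(n, 2))
    if pleiotropic:                       # one QTN affects both traits
        return np.column_stack([0.5 * snp1, 0.5 * snp1]) + noise
    return np.column_stack([0.5 * snp1, 0.5 * snp2]) + noise  # linked QTNs

for label, traits in [("pleiotropic QTN", simulate_traits(True)),
                      ("QTNs in LD", simulate_traits(False))]:
    for name, geno in [("SNP1", snp1), ("SNP2", snp2)]:
        pvals = [stats.linregress(geno, traits[:, t]).pvalue for t in range(2)]
        print(f"{label:15s} {name}: -log10 p = "
              + ", ".join(f"{-np.log10(p):.1f}" for p in pvals))
```
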
Source
http://dx.doi.org/10.3389/fgene.2020.602526
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7873880
January 2021

HIV incidence after pre-exposure prophylaxis initiation among women and men at elevated HIV risk: A population-based study in rural Kenya and Uganda.

PLoS Med 2021 Feb 9;18(2):e1003492. Epub 2021 Feb 9.

Department of Biostatistics and Epidemiology, University of Massachusetts, Amherst, Amherst, Massachusetts, United States of America.

Background: Oral pre-exposure prophylaxis (PrEP) is highly effective for HIV prevention, but data are limited on HIV incidence among PrEP users in generalized epidemic settings, particularly outside of selected risk groups. We performed a population-based PrEP study in rural Kenya and Uganda and sought to evaluate both changes in HIV incidence and clinical and virologic outcomes following seroconversion on PrEP.

Methods And Findings: During population-level HIV testing of individuals ≥15 years in 16 communities in the Sustainable East Africa Research in Community Health (SEARCH) study (NCT01864603), we offered universal access to PrEP with enhanced counseling for persons at elevated HIV risk (based on serodifferent partnership, machine learning-based risk score, or self-identified HIV risk). We offered rapid or same-day PrEP initiation and flexible service delivery with follow-up visits at facilities or community-based sites at 4, 12, and every 12 weeks up to week 144. Among participants with incident HIV infection after PrEP initiation, we offered same-day antiretroviral therapy (ART) initiation and analyzed HIV RNA, tenofovir hair concentrations, drug resistance, and viral suppression (<1,000 c/ml based on available assays) after ART start. Using Poisson regression with cluster-robust standard errors, we compared HIV incidence among PrEP initiators to incidence among propensity score-matched recent historical controls (from the year before PrEP availability) in 8 of the 16 communities, adjusted for risk group. Among 74,541 individuals who tested negative for HIV, 15,632/74,541 (21%) were assessed to be at elevated HIV risk; 5,447/15,632 (35%) initiated PrEP (49% female; 29% 15-24 years; 19% in serodifferent partnerships), of whom 79% engaged in ≥1 follow-up visit and 61% self-reported PrEP adherence at ≥1 visit. Over 7,150 person-years of follow-up, HIV incidence was 0.35 per 100 person-years (95% confidence interval [CI] 0.22-0.49) among PrEP initiators. Among matched controls, HIV incidence was 0.92 per 100 person-years (95% CI 0.49-1.41), corresponding to 74% lower incidence among PrEP initiators compared to matched controls (adjusted incidence rate ratio [aIRR] 0.26, 95% CI 0.09-0.75; p = 0.013). Among women, HIV incidence was 76% lower among PrEP initiators versus matched controls (aIRR 0.24, 95% CI 0.07-0.79; p = 0.019); among men, HIV incidence was 40% lower, but not significantly so (aIRR 0.60, 95% CI 0.12-3.05; p = 0.54). Of 25 participants with incident HIV infection (68% women), 7/25 (28%) reported taking PrEP ≤30 days before HIV diagnosis, and 24/25 (96%) started ART. Of those with repeat HIV RNA after ART start, 18/19 (95%) had <1,000 c/ml. One participant with viral non-suppression was found to have transmitted viral resistance, as well as emtricitabine resistance possibly related to PrEP use. Limitations include the lack of contemporaneous controls to assess HIV incidence without PrEP and that plasma samples were not archived to assess for baseline acute infection.
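
The incidence comparison described above is, in outline, a Poisson regression with a person-time offset and community-level cluster-robust standard errors. A hedged sketch with statsmodels, using hypothetical variable names rather than the SEARCH analysis code, might look like this:

```python
# Poisson regression for HIV incidence with cluster-robust SEs by community.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("incidence_by_person.csv")  # hypothetical person-level data
# columns: hiv_incident (0/1), prep_initiator (0/1), risk_group,
#          person_years, community

model = smf.glm(
    "hiv_incident ~ prep_initiator + C(risk_group)",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["person_years"]),
)
fit = model.fit(cov_type="cluster", cov_kwds={"groups": df["community"]})
print(fit.summary())
print("Adjusted IRR for PrEP:", np.exp(fit.params["prep_initiator"]))
```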

Conclusions: Population-level offer of PrEP with rapid start and flexible service delivery was associated with 74% lower HIV incidence among PrEP initiators compared to matched recent controls prior to PrEP availability. HIV infections were significantly lower among women who started PrEP. Universal HIV testing with linkage to treatment and prevention, including PrEP, is a promising approach to accelerate reductions in new infections in generalized epidemic settings.

Trial Registration: ClinicalTrials.gov NCT01864603.
Source
http://dx.doi.org/10.1371/journal.pmed.1003492
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7872279
February 2021

Evaluation of COVID-19 vaccination strategies with a delayed second dose.

medRxiv 2021 Jan 29. Epub 2021 Jan 29.

Center for Infectious Disease Modeling and Analysis (CIDMA), Yale School of Public Health, New Haven, Connecticut, USA.

COVID-19 vaccines currently approved in the United States require two doses, administered three to four weeks apart. Constraints in vaccine supply and distribution capacity, together with the rise of COVID-19 cases and hospitalizations, have sparked a policy debate on whether to vaccinate more individuals with the first dose of available vaccines and delay the second dose, or to continue with the recommended two-dose series as tested in clinical trials. We developed an agent-based model of COVID-19 transmission to compare the impact of these two vaccination strategies, while varying the temporal waning of vaccine efficacy against disease following the first dose, vaccine efficacy against infection, and the level of pre-existing immunity in the population. Our results show that for Moderna vaccines with 80% efficacy following the first dose, a delay of 9-12 weeks could enhance the program effectiveness and prevent additional infections, hospitalizations, and deaths, compared to a 4-week interval between the doses. However, for Pfizer-BioNTech vaccines with demonstrated efficacy of 52% after the first dose, there was no clear advantage for delaying the second dose beyond the 3-week tested schedule, unless the efficacy of the first dose did not wane over time. Our findings underscore the importance of quantifying the durability of vaccine-induced protection after the first dose as well as vaccine efficacy against infection in order to determine the optimal time interval between the two doses.
Source
http://dx.doi.org/10.1101/2021.01.27.21250619
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7852256
January 2021

The impact of vaccination on COVID-19 outbreaks in the United States.

Clin Infect Dis 2021 Jan 30. Epub 2021 Jan 30.

Center for Infectious Disease Modeling and Analysis (CIDMA), Yale School of Public Health, New Haven, Connecticut, USA.

Background: Global vaccine development efforts have been accelerated in response to the devastating COVID-19 pandemic. We evaluated the impact of a 2-dose COVID-19 vaccination campaign on reducing incidence, hospitalizations, and deaths in the United States (US).

Methods: We developed an agent-based model of SARS-CoV-2 transmission and parameterized it with US demographics and age-specific COVID-19 outcomes. Healthcare workers and high-risk individuals were prioritized for vaccination, while children under 18 years of age were not vaccinated. We considered a vaccine efficacy of 95% against disease following 2 doses administered 21 days apart achieving 40% vaccine coverage of the overall population within 284 days. We varied vaccine efficacy against infection, and specified 10% pre-existing population immunity for the base-case scenario. The model was calibrated to an effective reproduction number of 1.2, accounting for current non-pharmaceutical interventions in the US.
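
A highly simplified, hypothetical sketch of the agent-based approach described in these Methods is given below: individual agents with ages, random mixing, and a vaccine that, for simplicity, confers all-or-nothing protection against infection in adults only. The published model instead uses a two-dose schedule, efficacy against disease, structured contacts, and calibration to an effective reproduction number of 1.2; every number below is a placeholder.

```python
# Toy agent-based SEIR with adult-only vaccination (placeholder parameters).
import numpy as np

rng = np.random.default_rng(7)
N, DAYS = 20_000, 300
age = rng.choice([10, 30, 50, 70], size=N, p=[0.22, 0.40, 0.25, 0.13])

S, E, I, R = 0, 1, 2, 3
state = np.full(N, S)
state[rng.choice(N, 20, replace=False)] = I
days_in_state = np.zeros(N)
vaccinated = np.zeros(N, dtype=bool)

beta_per_contact, contacts_per_day = 0.04, 8
incubation, infectious_period = 4, 5
vax_efficacy, daily_vax = 0.95, int(0.4 * N / DAYS)   # ~40% coverage in 300 d

for _ in range(DAYS):
    # vaccinate adults only (no one under 18), oldest first as a crude priority
    eligible = np.flatnonzero(~vaccinated & (age >= 18) & (state == S))
    todays = eligible[np.argsort(-age[eligible])][:daily_vax]
    vaccinated[todays] = True

    # transmission via random contacts from currently infectious agents
    infectious = np.flatnonzero(state == I)
    contacts = rng.integers(0, N, size=infectious.size * contacts_per_day)
    hit = contacts[rng.random(contacts.size) < beta_per_contact]
    protected = vaccinated[hit] & (rng.random(hit.size) < vax_efficacy)
    new_exposed = hit[(state[hit] == S) & ~protected]
    state[new_exposed], days_in_state[new_exposed] = E, 0

    # progression E -> I -> R
    days_in_state[state != S] += 1
    state[(state == E) & (days_in_state >= incubation)] = I
    to_R = (state == I) & (days_in_state >= incubation + infectious_period)
    state[to_R] = R

print(f"Toy attack rate after {DAYS} days: {np.mean(state == R):.1%}")
```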

Results: Vaccination reduced the overall attack rate to 4.6% (95% CrI: 4.3% - 5.0%) from 9.0% (95% CrI: 8.4% - 9.4%) without vaccination, over 300 days. The highest relative reduction (54-62%) was observed among individuals aged 65 and older. Vaccination markedly reduced adverse outcomes, with non-ICU hospitalizations, ICU hospitalizations, and deaths decreasing by 63.5% (95% CrI: 60.3% - 66.7%), 65.6% (95% CrI: 62.2% - 68.6%), and 69.3% (95% CrI: 65.5% - 73.1%), respectively, across the same period.

Conclusions: Our results indicate that vaccination can have a substantial impact on mitigating COVID-19 outbreaks, even with limited protection against infection. However, continued compliance with non-pharmaceutical interventions is essential to achieve this impact.
Source
http://dx.doi.org/10.1093/cid/ciab079
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7929033
January 2021

Identifying silent COVID-19 infections among children is critical for controlling the pandemic.

medRxiv 2021 Jan 8. Epub 2021 Jan 8.

Center for Infectious Disease Modeling and Analysis (CIDMA), Yale School of Public Health, New Haven, Connecticut, USA.

Importance: A significant proportion of COVID-19 transmission occurs silently during the pre-symptomatic and asymptomatic stages of infection. Children, while being important drivers of silent transmission, are not included in COVID-19 vaccination campaigns given their exclusion from clinical trials thus far.

Objective: To investigate the impact of a targeted approach to identifying silent infections among children as a proxy for their vaccination.

Design: This study used an age-structured disease transmission model to simulate the synergistic impact of interventions in reducing attack rates over the course of one year.

Setting: A synthetic population representative of the demographics of the United States (US).

Participants: Six age groups of 0-4, 5-10, 11-18, 19-49, 50-64, 65+ years old, stratified for their population size based on US census data.

Exposures: Vaccination of adults, self-isolation of all symptomatic cases within 24 hours of symptom onset, and detection of silent infections.

Main Outcomes And Measures: Vaccination of adults was implemented to reach a 40% coverage over the course of one year with a vaccine efficacy of 95% against symptomatic and severe COVID-19. Without vaccination of children, we determined the proportion and speed that would be required for identifying silent infections among this age group to suppress future attack rates below 5%.

Results: A targeted approach that identifies 20.6% and 28.6% of silent infections among children within 2 or 3 days post-infection, respectively, would be required to bring attack rates under 5% with vaccination of adults. If silent infections among children remained undetected, achieving the same attack rates would require an unrealistically high vaccination coverage (at least 82%) of this age group, in addition to the base-case 40% vaccination coverage of adults. The results were robust in sensitivity analyses with respect to vaccine efficacy against infection and reduced susceptibility of children to infection.

Conclusions And Relevance: In the absence of vaccine availability for children, a targeted approach to rapid identification of silent COVID-19 infections in this age group can significantly mitigate disease burden. Without measures to interrupt transmission chains from silent infections, vaccination of adults is unlikely to contain the outbreaks in the near term.

Key Points: Question: What is the impact of a targeted strategy for identification of silent COVID-19 infections among children in the absence of their vaccination? Findings: In this modelling study, we found that identifying 20-30% of silent infections among children within three days post-infection would bring attack rates below 5% if only adults were vaccinated. If silent infections among children remained undetected, achieving the same attack rate would require an unrealistically high vaccination coverage (at least 82%) of this age group, in addition to vaccination of adults. Meaning: Rapid identification of silent infections among children can replicate the effects of their vaccination.
Source
http://dx.doi.org/10.1101/2021.01.06.21249349
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7805462
January 2021

Routine saliva testing for the identification of silent coronavirus disease 2019 (COVID-19) in healthcare workers.

Infect Control Hosp Epidemiol 2021 Jan 11:1-5. Epub 2021 Jan 11.

Agent-Based Modelling Laboratory, York University, Toronto, Ontario, Canada.

Objective: Current COVID-19 guidelines recommend symptom-based screening and regular nasopharyngeal (NP) testing for healthcare personnel in high-risk settings. We sought to estimate case detection percentages with various routine NP and saliva testing frequencies.

Design: Simulation modeling study.

Methods: We constructed a sensitivity function based on the average infectiousness profile of symptomatic coronavirus disease 2019 (COVID-19) cases to determine the probability of being identified at the time of testing. This function was fitted to reported data on the percent positivity of symptomatic COVID-19 patients using NP testing. We then simulated a routine testing program with different NP and saliva testing frequencies to determine case detection percentages during the infectious period, as well as the presymptomatic stage.
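
The core calculation in these Methods, the chance that a routine testing schedule catches a case at least once given a test-sensitivity curve over time since infection, can be illustrated with the following sketch. The sensitivity values and schedules are placeholders, not the fitted function or results from the study.

```python
# Probability of detecting a case under a routine testing schedule,
# given an assumed test-sensitivity curve by day since infection.
import numpy as np

rng = np.random.default_rng(11)

# assumed sensitivity for days 0..13 since infection (placeholder values)
sens = np.array([0.0, 0.1, 0.4, 0.7, 0.8, 0.8, 0.75, 0.7,
                 0.6, 0.5, 0.4, 0.3, 0.2, 0.1])

def detection_probability(test_every: int, n_cases: int = 100_000) -> float:
    # each case's infection time falls uniformly within the testing cycle
    offset = rng.integers(0, test_every, size=n_cases)
    test_days = np.arange(0, len(sens) + test_every, test_every)
    detected = np.zeros(n_cases, dtype=bool)
    for t in test_days:
        day = t - offset          # day since infection at this scheduled test
        valid = (day >= 0) & (day < len(sens))
        p = np.where(valid, sens[np.clip(day, 0, len(sens) - 1)], 0.0)
        detected |= rng.random(n_cases) < p
    return detected.mean()

for every in (14, 7, 5):
    print(f"Testing every {every} days: ~{detection_probability(every):.0%} "
          "of cases detected at least once (toy numbers)")
```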

Results: Routine biweekly NP testing, once every 2 weeks, identified an average of 90.7% (SD, 0.18) of cases during the infectious period and 19.7% (SD, 0.98) during the presymptomatic stage. With a weekly NP testing frequency, the corresponding case detection percentages were 95.9% (SD, 0.18) and 32.9% (SD, 1.23), respectively. A 5-day saliva testing schedule had a similar case detection percentage as weekly NP testing during the infectious period, but identified ~10% more cases (mean, 42.5%; SD, 1.10) during the presymptomatic stage.

Conclusion: Our findings highlight the utility of routine noninvasive saliva testing for frontline healthcare workers to protect vulnerable patient populations. A 5-day saliva testing schedule should be considered to help identify silent infections and prevent outbreaks in nursing homes and healthcare facilities.
Source
http://dx.doi.org/10.1017/ice.2020.1413
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7870913
January 2021

A vicious cycle of bisretinoid formation and oxidation relevant to recessive Stargardt disease.

J Biol Chem 2021 Jan 5. Epub 2021 Jan 5.

Ophthalmology, Columbia University, United States.

The ability of iron to transfer electrons enables the contribution of this metal to a variety of cellular activities even as the redox properties of iron are also responsible for the generation of hydroxyl radicals (•OH), the most destructive of the reactive oxygen species. We previously showed that iron can promote the oxidation of bisretinoid by generating highly reactive hydroxyl radical (•OH). Now we report that preservation of iron regulation in the retina is not sufficient to prevent iron-induced bisretinoid oxidative degradation when blood iron levels are elevated in liver-specific hepcidin knock-out mice. We obtained evidence for the perpetuation of Fenton reactions in the presence of the bisretinoid A2E and visible light. On the other hand, iron chelation by deferiprone was not associated with changes in post-bleaching recovery of 11-cis-retinal or dark-adapted ERG b-wave amplitudes, indicating that the activity of Rpe65, a rate-determining visual cycle protein that carries an iron-binding domain, is not affected. Notably, iron levels were elevated in the neural retina and RPE of Abca4 mice. Consistent with higher iron content, ferritin-L immunostaining was elevated in RPE of a patient diagnosed with ABCA4-associated disease and in RPE and photoreceptor cells of Abca4 mice. In neural retina of the mutant mice, reduced Tfrc mRNA was also an indicator of retinal iron overload. Thus, iron chelation may defend the retina when bisretinoid toxicity is implicated in disease processes.
Source
http://dx.doi.org/10.1074/jbc.RA120.015890
January 2021

Role of the Internal Limiting Membrane in Structural Engraftment and Topographic Spacing of Transplanted Human Stem Cell-Derived Retinal Ganglion Cells.

Stem Cell Reports 2021 Jan 30;16(1):149-167. Epub 2020 Dec 30.

Glaucoma Center of Excellence, Wilmer Eye Institute, Johns Hopkins University School of Medicine, 600 North Wolfe Street, Maumenee B-110, Baltimore, MD 21287, USA.

Retinal ganglion cell (RGC) replacement holds potential for restoring vision lost to optic neuropathy. Transplanted RGCs must undergo neuroretinal integration to receive afferent visual signals for processing and efferent transmission. To date, retinal integration following RGC transplantation has been limited. We sought to overcome key barriers to transplanted human stem cell-derived RGC integration. Following co-culture ex vivo on organotypic mouse retinal explants, human RGCs cluster and extend bundled neurites that remain superficial to the neuroretina, hindering afferent synaptogenesis. To enhance integration, we increased the cellular permeability of the internal limiting membrane (ILM). Extracellular matrix digestion using proteolytic enzymes achieved ILM disruption while minimizing retinal toxicity and preserving glial reactivity. ILM disruption is associated with dispersion rather than clustering of co-cultured RGC bodies and neurites, and increased parenchymal neurite ingrowth. The ILM represents a significant obstacle to transplanted RGC connectivity and its circumvention may be necessary for functional RGC replacement.
Source
http://dx.doi.org/10.1016/j.stemcr.2020.12.001
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7897583
January 2021

Pragmatic randomized trial of a pre-visit intervention to improve the quality of telemedicine visits for vulnerable patients living with HIV.

J Telemed Telecare 2020 Dec 20:1357633X20976036. Epub 2020 Dec 20.

Division of HIV, ID and Global Medicine, University of California, USA.

Introduction: The COVID-19 pandemic has required a shift of many routine primary care visits to telemedicine, potentially widening disparities in care access among vulnerable populations. In a publicly-funded HIV clinic, we aimed to evaluate a pre-visit phone-based planning intervention to address anticipated barriers to telemedicine.

Methods: We conducted a pragmatic randomized controlled trial of patients scheduled for a phone-based HIV primary care visit at the Ward 86 HIV clinic in San Francisco from 15 April to 15 May 2020. Once reached by phone, patients were randomized either to a structured pre-visit planning intervention to address barriers to the upcoming telemedicine visit or to a standard reminder call. The primary outcome was telemedicine visit attendance.

Results: Of 476 scheduled telemedicine visits, 280 patients were reached by a pre-visit call to offer enrollment. Patients were less likely to be reached if they were virally unsuppressed (odds ratio (OR) 0.11, 95% confidence interval (CI) 0.03-0.48), had a CD4 count <200 cells/μL (OR 0.24, 95% CI 0.07-0.85), or were experiencing homelessness (OR 0.24, 95% CI 0.07-0.87). There was no difference between intervention and control in scheduled visit attendance (83% vs. 78%, OR 1.38, 95% CI 0.67-2.81).

Conclusions: A structured phone-based planning call to address barriers to telemedicine in a public HIV clinic was less likely to reach patients with poorly-controlled HIV and patients experiencing homelessness, suggesting additional interventions may be needed in this population to ensure access to telemedicine-based care. Among patients reachable by phone, telemedicine visit attendance was high and not improved with a structured pre-visit intervention, suggesting that standard reminders may be adequate in this population.
Source
http://dx.doi.org/10.1177/1357633X20976036
December 2020

Multifaceted strategies for the control of COVID-19 outbreaks in long-term care facilities in Ontario, Canada.

medRxiv 2020 Dec 7. Epub 2020 Dec 7.

Agent-Based Modelling Laboratory, York University, Toronto, Ontario, M3J 1P3 Canada.

Background: COVID-19 has caused severe outbreaks in Canadian long-term care facilities (LTCFs).

Objective: To evaluate the effect of mitigation measures in LTCFs including routine testing of staff and vaccination of staff and residents.

Design: Agent-based transmission model parameterized with disease-specific estimates, temporal sensitivity of nasopharyngeal (NP) and saliva testing, preliminary results of vaccine efficacy trials, and data from initial COVID-19 outbreaks in LTCFs in Ontario, Canada.

Setting: Characteristics of staff and residents were included in the model with age-dependent risk of hospitalization and deaths, calibrated to the cumulative incidence of COVID-19 reported in these settings.

Participants: Synthetic staff and resident populations.

Interventions: Routine NP and saliva testing of staff; vaccination of residents and staff.

Measurements: Daily incidence and attack rates in the LTCF using large-scale model simulations; estimates of hospitalizations and deaths and their 95% credible intervals.

Results: Weekly routine testing of staff with 2-day turnaround time reduced infections among residents by at least 20.3% (95% CrI: 18.7-21.8%), compared to baseline measures of mask-wearing, symptom screening, and staff cohorting alone. A similar reduction of hospitalizations and deaths was achieved in residents. Vaccination averted 2-4 times more infections in both staff and residents as compared to routine testing, and markedly reduced hospitalizations and deaths among residents by 81.4% (95% CrI: 80.6-82.2%), and 82.1% (95% CrI: 81.5-82.7%), respectively.

Limitations: Timelines of vaccine distribution and compliance rates with routine testing are key parameters affecting strategy outcomes.

Conclusion: Routine testing of staff reduces silent transmission in LTCFs. Vaccination could have a substantial impact on mitigating disease burden among residents, but may not eliminate the need for other measures before population-level control of COVID-19 is achieved.
Source
http://dx.doi.org/10.1101/2020.12.04.20244194
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7743093
December 2020

Routine saliva testing for the identification of silent COVID-19 infections in healthcare workers.

medRxiv 2020 Nov 30. Epub 2020 Nov 30.

Agent-Based Modelling Laboratory, York University, Toronto, Ontario, M3J 1P3 Canada.

Current COVID-19 guidelines recommend symptom-based screening and regular nasopharyngeal (NP) testing for healthcare personnel in high-risk settings. We sought to estimate case detection percentages with various routine NP and saliva testing frequencies. Simulation modelling study. We constructed a sensitivity function based on the average infectiousness profile of symptomatic COVID-19 cases to determine the probability of being identified at the time of testing. This function was fitted to reported data on the percent positivity of symptomatic COVID-19 patients using NP testing. We then simulated a routine testing program with different NP and saliva testing frequencies to determine case detection percentages during the infectious period, as well as the pre-symptomatic stage. Routine bi-weekly NP testing, once every two weeks, identified an average of 90.7% (SD: 0.18) of cases during the infectious period and 19.7% (SD: 0.98) during the pre-symptomatic stage. With a weekly NP testing frequency, the corresponding case detection percentages were 95.9% (SD: 0.18) and 32.9% (SD: 1.23), respectively. A 5-day saliva testing schedule had a similar case detection percentage as weekly NP testing during the infectious period, but identified about 10% more cases (mean: 42.5%; SD: 1.10) during the pre-symptomatic stage. Our findings highlight the utility of routine non-invasive saliva testing for frontline healthcare workers to protect vulnerable patient populations. A 5-day saliva testing schedule should be considered to help identify silent infections and prevent outbreaks in nursing homes and healthcare facilities.
Source
http://dx.doi.org/10.1101/2020.11.27.20240044
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7709182
November 2020

The impact of vaccination on COVID-19 outbreaks in the United States.

medRxiv 2020 Nov 30. Epub 2020 Nov 30.

Center for Infectious Disease Modeling and Analysis (CIDMA), Yale School of Public Health, New Haven, Connecticut, USA.

Global vaccine development efforts have been accelerated in response to the devastating COVID-19 pandemic. We evaluated the impact of a 2-dose COVID-19 vaccination campaign on reducing incidence, hospitalizations, and deaths in the United States (US). We developed an agent-based model of SARS-CoV-2 transmission and parameterized it with US demographics and age-specific COVID-19 outcomes. Healthcare workers and high-risk individuals were prioritized for vaccination, while children under 18 years of age were not vaccinated. We considered a vaccine efficacy of 90% against infection following 2 doses administered 28 days apart, achieving 40% vaccine coverage of the overall population. We specified 10% pre-existing population immunity for the base-case scenario and calibrated to an effective reproduction number of 1.5, accounting for current COVID-19 interventions in the US. Vaccination reduced the overall attack rate to 1.6% (95% CI: 1.3% - 1.8%) from 7.1% (95% CI: 6.3% - 7.9%) across the same period without vaccination. The highest relative reduction (83-90%) was observed among individuals aged 65 and older. Vaccination markedly reduced adverse outcomes, with non-ICU hospitalizations, ICU hospitalizations, and deaths decreasing by 85.2% (95% CI: 82.3% - 87.6%), 85.3% (95% CI: 82.3% - 87.8%), and 87.8% (95% CI: 85.1% - 90.1%), respectively. Our results indicate that vaccination can have a substantial impact on reducing disease transmission and adverse clinical outcomes. However, with uptake of 40% or less in the population, vaccination is unlikely to completely eliminate the need for non-pharmaceutical interventions.

Source
http://dx.doi.org/10.1101/2020.11.27.20240051
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7709178
November 2020

Low-Dose Total Skin Electron Beam Therapy as Part of a Multimodality Regimen for Treatment of Sézary Syndrome: Clinical, Immunologic, and Molecular Analysis.

JAMA Dermatol 2021 01;157(1):90-95

Department of Dermatology, Perelman School of Medicine, University of Pennsylvania, Philadelphia.

Importance: Sézary syndrome (SS) is an advanced form of cutaneous T-cell lymphoma with few long-term remissions observed.

Objective: To profile 3 patients with SS who have experienced long-term remission following the addition of low-dose total skin electron beam therapy (TSEBT) to systemic regimens of extracorporeal photopheresis, bexarotene, and interferon-γ.

Design, Setting, And Participants: This is a retrospective case series with additional investigations of patient-donated samples to assess therapeutic response. The study was conducted at the University of Pennsylvania Cutaneous Lymphoma Clinic and follows 3 patients with stage IVA1 CD4+ SS who presented to the clinic between November 1, 2009, and November 1, 2017, and who had a history of SS that was refractory to multimodality systemic therapy prior to receiving low-dose TSEBT.

Interventions: Patients were treated in a multimodality fashion with combined extracorporeal photopheresis, bexarotene, interferon-γ, and low-dose TSEBT.

Main Outcomes And Measures: To characterize treatment responses in these patients, the extent of skin disease was measured with the modified severity weighted assessment tool. Blood disease was measured with flow cytometric assessments of Sézary cell count and CD4:CD8 ratio, and with high-throughput sequencing of T-cell receptors. To assess for restoration of immune function, we measured markers of immune exhaustion, including PD-1 (programmed cell death 1), TIGIT (T-cell immunoreceptor with immunoglobulin and ITIM domains), CTLA4 (cytotoxic T-lymphocyte-associated protein 4), TOX (thymocyte selection-associated high mobility group box protein), and Foxp3 (forkhead box P3), on circulating CD4 and CD8 T cells, along with the production capacity of interferon-γ by lymphocytes following activation stimuli.

Results: Following administration of low-dose TSEBT and maintenance of the other therapies, remissions ranged from 24 to 30 months, with complete responses ongoing in 2 patients. Markers of immune exhaustion, including PD-1, TIGIT, CTLA4, TOX, and Foxp3, were significantly reduced from baseline following TSEBT, and the production capacity of interferon-γ by lymphocytes after activation stimuli was enhanced. High-throughput sequencing demonstrated near-complete eradication of the circulating clone in 2 of 3 patients, with stable levels in the third.

Conclusions And Relevance: We describe 3 patients who achieved long-term clinical and molecular remissions following low-dose TSEBT as part of a multimodality regimen for treatment of SS. As long-term remissions in SS are uncommon, this approach demonstrates promise, and clinical trials should be considered.

Source
http://dx.doi.org/10.1001/jamadermatol.2020.3958
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7593882
January 2021

Detectable HIV RNA in late pregnancy associated with low tenofovir hair levels at time of delivery among women living with HIV in the United States.

AIDS 2021 02;35(2):267-274

Department of Medicine, University of California, San Francisco, San Francisco, California, USA.

Objective: We evaluated peripartum tenofovir (TFV) exposure via hair measures among women living with HIV in the United States.

Design: Observational cohort study.

Methods: Hair samples were collected at or shortly after childbirth among mothers enrolled in the Surveillance Monitoring for Antiretroviral Therapy Toxicities Study of the Pediatric HIV/AIDS Cohort Study between 6/2014 and 7/2016. Among mothers receiving TFV disoproxil fumarate (TDF)-based regimens during pregnancy, TFV hair concentrations were analyzed using liquid chromatography/tandem mass spectrometry. Weight-normalized TFV concentrations were log10 transformed. Multivariable linear regression assessed correlates of TFV concentrations.

Results: Overall, 121 mothers on TDF-based antiretroviral therapy during pregnancy had hair specimens tested for TFV concentrations and were included in the analysis. Median age at delivery was 31 years [interquartile range (IQR) 26-36]; 71% self-identified as non-Hispanic black, and 10% had unsuppressed viral loads in late pregnancy (HIV RNA ≥ 400 copies/ml). Median time from birth to hair collection was 3 days (IQR 1-14), and the median TFV hair concentration was 0.02 ng/mg (IQR 0.01-0.04). In multivariable models, an unsuppressed viral load in late pregnancy was associated with an 80% lower adjusted mean peripartum TFV concentration compared with pregnancies with viral suppression (95% confidence interval: -90% to -59%, P < 0.001). Use of TDF only in the first trimester and completion of high school were also associated with lower TFV hair concentrations.
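
For readers unfamiliar with regression on log-transformed outcomes, the short sketch below shows how a coefficient estimated on log10-transformed concentrations back-transforms into the percent differences reported here; the simulated data are purely illustrative and are not the study's data.

import numpy as np

# Simulated illustration (not study data): log10 TFV hair concentration modelled
# as a linear function of viral suppression status plus noise.
rng = np.random.default_rng(0)
n = 121
unsuppressed = rng.random(n) < 0.10                     # ~10% with HIV RNA >= 400 copies/ml
log10_tfv = -1.7 - 0.70 * unsuppressed + rng.normal(0, 0.3, n)

# Fit a simple linear model on the log10 scale with ordinary least squares.
X = np.column_stack([np.ones(n), unsuppressed.astype(float)])
beta, *_ = np.linalg.lstsq(X, log10_tfv, rcond=None)

# Back-transform: a coefficient b on the log10 scale corresponds to a
# (10**b - 1) * 100% difference in geometric-mean concentration.
pct_diff = (10 ** beta[1] - 1) * 100
print(f"estimated difference for unsuppressed viral load: {pct_diff:.0f}%")
# A coefficient near -0.70 corresponds to roughly 80% lower concentrations,
# matching the magnitude reported in the abstract.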

Conclusion: Unsuppressed viral load during late pregnancy was strongly associated with lower maternal TFV hair concentrations at birth, though viremia was rare. Efforts to improve maternal virological outcomes and eliminate vertical HIV transmission could incorporate drug exposure monitoring using hair or other metrics.

Source
http://dx.doi.org/10.1097/QAD.0000000000002730
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7775322
February 2021

The impact of mask-wearing and shelter-in-place on COVID-19 outbreaks in the United States.

Int J Infect Dis 2020 Dec 9;101:334-341. Epub 2020 Oct 9.

Agent-Based Modelling Laboratory, York University, Toronto, Ontario, M3J 1P3 Canada.

Objectives: A hasty reopening has led to a resurgence of the novel coronavirus disease 2019 (COVID-19) in the United States (US). We aimed to quantify the impact of several public health measures including non-medical mask-wearing, shelter-in-place, and detection of silent infections to help inform COVID-19 mitigation strategies.

Methods: We extended a previously established agent-based disease transmission model and parameterized it with estimates of COVID-19 characteristics and US population demographics. We implemented non-medical mask-wearing, shelter-in-place, and case isolation as control measures, and quantified their impact on reducing the attack rate and adverse clinical outcomes.
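
As a simplified illustration of how non-medical mask-wearing can enter such a transmission model, the sketch below scales the per-contact transmission probability by separate outward (source-control) and inward (wearer-protection) efficacies. The coverage level and efficacy values are assumptions chosen for illustration, not the parameters used in the study.

import numpy as np

rng = np.random.default_rng(3)

# Illustrative assumptions (not the study's calibrated parameters)
BASE_P = 0.05            # per-contact transmission probability without masks
EFF_OUTWARD = 0.5        # reduction in emission when the infector wears a mask
EFF_INWARD = 0.3         # reduction in exposure when the susceptible wears a mask
COVERAGE = 0.75          # fraction of the population wearing non-medical masks

def contact_transmission_prob(infector_masked, susceptible_masked):
    # Per-contact probability after applying source control and wearer protection.
    p = BASE_P
    if infector_masked:
        p *= 1.0 - EFF_OUTWARD
    if susceptible_masked:
        p *= 1.0 - EFF_INWARD
    return p

def mean_transmission_prob(coverage, n=100_000):
    # Average per-contact probability when mask use is assigned at random.
    infector_masked = rng.random(n) < coverage
    susceptible_masked = rng.random(n) < coverage
    probs = [contact_transmission_prob(i, s)
             for i, s in zip(infector_masked, susceptible_masked)]
    return float(np.mean(probs))

baseline = mean_transmission_prob(0.0)
with_masks = mean_transmission_prob(COVERAGE)
print(f"relative reduction in per-contact transmission: {1 - with_masks / baseline:.1%}")

With these assumed values, 75% coverage cuts the average per-contact transmission probability roughly in half, which is the kind of effect the full agent-based model propagates into reductions in infections, hospitalizations, and deaths.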

Results: We found that non-medical mask-wearing by 75% of the population reduced infections, hospitalizations, and deaths by 37.7% (interquartile range (IQR): 36.1-39.4%), 44.2% (IQR: 42.9-45.8%), and 47.2% (IQR: 45.5-48.7%), respectively, in the absence of a shelter-in-place strategy. Sheltering individuals aged 50 to 64 years was the most efficient strategy, decreasing the attack rate, hospitalizations, and deaths by over 82% when combined with mask-wearing. Outbreak control was achieved in the simulated scenarios, with the attack rate reduced to below 1%, when at least 33% of silent pre-symptomatic and asymptomatic infections were identified and isolated.

Conclusions: Mask-wearing, even with the use of non-medical masks, has a substantial impact on outbreak control. A judicious implementation of shelter-in-place strategies remains an important public health intervention amid ongoing outbreaks.

Source
http://dx.doi.org/10.1016/j.ijid.2020.10.002
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7544634
December 2020

Dose-dependent effects of testosterone on spatial learning strategies and brain-derived neurotrophic factor in male rats.

Psychoneuroendocrinology 2020 11 23;121:104850. Epub 2020 Aug 23.

Department of Biology, Middlebury College, Middlebury, VT, 05753, USA; Program in Neuroscience, Middlebury College, Middlebury, VT, 05753, USA.

Studies suggest that males outperform females on some spatial tasks. This may be due to the effects of sex steroids on spatial strategy preferences. Past experiments with male rats have demonstrated that low doses of testosterone bias them toward a response strategy, whereas high doses of testosterone bias them toward a place strategy. We investigated the effect of different testosterone doses on the ability of male rats to employ these two spatial learning strategies effectively. Furthermore, we quantified concentrations of brain-derived neurotrophic factor (pro-, mature-, and total BDNF) in the prefrontal cortex, hippocampus, and striatum. All rats were bilaterally castrated and assigned to one of three daily injection doses of testosterone propionate (0.125, 0.250, or 0.500 mg/rat) or a control injection of the drug vehicle. Using a plus-maze protocol, we found that a lower testosterone dose (0.125 mg) significantly improved rats' performance on a response task, whereas a higher testosterone dose (0.500 mg) significantly improved rats' performance on a place task. In addition, we found that a low dose of testosterone (0.125 mg) increased total BDNF in the striatum, while a high dose (0.500 mg) increased total BDNF in the hippocampus. Taken together, these results suggest that high and low levels of testosterone enhance performance on place and response spatial tasks, respectively, and that this effect is associated with changes in BDNF levels within relevant brain regions.

Source
http://dx.doi.org/10.1016/j.psyneuen.2020.104850
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7572628
November 2020