Publications by authors named "Hugh Montgomery"

201 Publications

Digital and technological innovation in vector-borne disease surveillance to predict, detect, and control climate-driven outbreaks.

Lancet Planet Health 2021 Oct;5(10):e739-e745

Oxford University Clinical Research Unit, Ho Chi Minh City, Vietnam; Centre for Tropical Medicine and Global Health, University of Oxford, Oxford, UK.

Vector-borne diseases are particularly sensitive to changes in weather and climate. Timely warnings from surveillance systems can help to detect and control outbreaks of infectious disease, facilitate effective management of finite resources, and contribute to knowledge generation, response planning, and resource prioritisation in the long term, which can mitigate future outbreaks. Technological and digital innovations have enabled the incorporation of climatic data into surveillance systems, enhancing their capacity to predict trends in outbreak prevalence and location. Advance notice of the risk of an outbreak empowers decision makers and communities to scale up prevention and preparedness interventions and redirect resources for outbreak responses. In this Viewpoint, we outline important considerations in the advent of new technologies in disease surveillance, including the sustainability of innovation in the long term and the fundamental obligation to ensure that the communities that are affected by the disease are involved in the design of the technology and directly benefit from its application.
Source
http://dx.doi.org/10.1016/S2542-5196(21)00141-8
October 2021

Should we treat COVID-19 lung injury like ARDS? Exploring the paradigm.

Exp Physiol 2021 Sep 17. Epub 2021 Sep 17.

Centre for Human Health and Performance, University College London, London, UK.

Source
http://dx.doi.org/10.1113/EP090010
September 2021

Selection pressure at altitude for genes related to alcohol metabolism: A role for endogenous enteric ethanol synthesis?

Exp Physiol 2021 Sep 6. Epub 2021 Sep 6.

Institute for Human Health and Performance, Department of Medicine, University College London, London, UK.

New Findings: What is the topic of this review? Highland natives have undergone natural selection for genetic variants advantageous in adaptation to the hypobaric hypoxia experienced at high altitude. Why genes related to alcohol metabolism appear consistently selected for has not been greatly considered. We hypothesize that altitude-related changes in the gut microbiome offer one possible explanation. What advances does it highlight? Low intestinal oxygen tension might favour the production of ethanol through anaerobic fermentation by the gut microbiome. Subsequent increases in endogenous ethanol absorption could therefore provide a selection pressure for gene variants favouring its increased degradation, or perhaps reduced degradation if endogenously synthesized ethanol acts as a metabolic signalling molecule.

Abstract: Reduced tissue availability of oxygen results from ascent to high altitude, where atmospheric pressure, and thus the partial pressure of inspired oxygen, fall (hypobaric hypoxia). In humans, adaptation to such hypoxia is necessary for survival. These functional changes remain incompletely characterized, although metabolic adaptation (rather than simple increases in convective oxygen delivery) appears to play a fundamental role. Those populations that have remained native to high altitude have undergone natural selection for genetic variants associated with advantageous phenotypic traits. Interestingly, a consistent genetic signal has implicated alcohol metabolism in the human adaptive response to hypobaric hypoxia. The reasons for this remain unclear. One possibility is that increased alcohol synthesis occurs through fermentation by gut bacteria in response to enteric hypoxia. There is growing evidence that anaerobes capable of producing ethanol become increasingly prevalent with high-altitude exposure. We hypothesize that: (1) ascent to high altitude renders the gut luminal environment increasingly hypoxic, favouring (2) an increase in the population of enteric fermenting anaerobes, hence (3) the synthesis of alcohol which, through systemic absorption, leads to (4) selection pressure on genes relating to alcohol metabolism. In theory, alcohol could be viewed as a toxic product, leading to selection of gene variants favouring its metabolism. Alternatively, alcohol might serve as a beneficial metabolic substrate. This mechanism could also account for some of the interindividual differences among lowlanders in acclimatization to altitude. Future research should aim to determine whether the gut microbiome shifts to favour ethanol-producing anaerobes after ascent to altitude.
Source
http://dx.doi.org/10.1113/EP089628
September 2021

Effect of intermittent or continuous feeding and amino acid concentration on urea-to-creatinine ratio in critical illness.

JPEN J Parenter Enteral Nutr 2021 Aug 31. Epub 2021 Aug 31.

William Harvey Research Institute, Queen Mary University of London, London, UK.

Background: We sought to determine whether peaks in essential amino acid (EAA) concentration associated with intermittent feeding may provide anabolic advantages when compared with continuous feeding regimens in critical care.

Methods: We performed a secondary analysis of data from a multicenter trial of UK intensive care patients randomly assigned to intermittent or continuous feeding. A linear mixed-effects model was developed to assess differences in urea-to-creatinine ratio (raised values of which can be a marker of muscle wasting) between arms. To investigate metabolic phenotypes, we performed k-means urea-to-creatinine ratio trajectory clustering. Amino acid concentrations were also modeled against urea-to-creatinine ratio from day 1 to day 7. The main outcome measure was serum urea-to-creatinine ratio (millimole per millimole) from day 0 to the end of the 10-day study period.
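A minimal sketch of the two analyses named above (a linear mixed-effects model for urea-to-creatinine ratio trajectories, and k-means clustering of those trajectories), using simulated data and assumed column names rather than the trial's own code:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Simulated illustration: daily urea-to-creatinine ratio (UCR) for 40 patients
rows = []
for pid in range(40):
    arm = "intermittent" if pid % 2 else "continuous"
    slope = 2.0 if arm == "continuous" else 1.0          # assumed steeper rise with continuous feed
    for day in range(11):                                 # days 0..10
        rows.append({"patient": pid, "arm": arm, "day": day,
                     "ucr": 80 + slope * day + rng.normal(0, 5)})
df = pd.DataFrame(rows)

# Linear mixed-effects model: fixed arm-by-time interaction, random intercept per patient
model = smf.mixedlm("ucr ~ day * arm", df, groups=df["patient"]).fit()
print(model.summary())

# k-means clustering of per-patient UCR trajectories (one row per patient, one column per day)
traj = df.pivot(index="patient", columns="day", values="ucr")
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(traj.values)
print(pd.Series(labels, index=traj.index, name="cluster").value_counts())
```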

Results: Urea-to-creatinine ratio trajectory differed between feeding regimens (coefficient -.245; P = .002). Patients receiving intermittent feeding demonstrated a flatter urea-to-creatinine ratio trajectory. With k-means analysis, the cluster with the largest proportion of continuously fed patients demonstrated the steepest rise in urea-to-creatinine ratio. Neither protein intake per se nor serum EAA concentrations were correlated with urea-to-creatinine ratio (coefficient = .088 [P = .506] and coefficient <.001 [P = .122], respectively).

Conclusion: Intermittent feeding can mitigate the rise in urea-to-creatinine ratio otherwise seen in those continuously fed, suggesting that catabolism may have been, to some degree, prevented.
Source
http://dx.doi.org/10.1002/jpen.2258
August 2021

COVID-19: UK frontline intensivists' emerging learning.

J Intensive Care Soc 2021 Aug 12;22(3):211-213. Epub 2020 Jun 12.

UCL Partners & Queen Mary University of London, London, UK.

The Intensive Care Society held a webinar on 3 April 2020 at which representatives from 11 of the most COVID-19 experienced hospital trusts in England and Wales shared learning around five specific topic areas in an open forum. This paper summarises the emerging learning and practice shared by those frontline clinicians.
Source
http://dx.doi.org/10.1177/1751143720931731
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8373285
August 2021

Maintenance of Serum Potassium Levels ≥3.6 mEq/L Versus ≥4.5 mEq/L After Isolated Elective Coronary Artery Bypass Grafting and the Incidence of New-Onset Atrial Fibrillation: Pilot and Feasibility Study Results.

J Cardiothorac Vasc Anesth 2021 Jun 24. Epub 2021 Jun 24.

St. Bartholomew's Hospital, Barts Health NHS Trust, West Smithfield, London, United Kingdom; German Heart Center, Department of Cardiac Anesthesiology and Intensive Care Medicine, Berlin, Germany; Department of Cardiac Anesthesiology and Intensive Care Medicine, Charité Berlin, Berlin, Germany; Outcomes Research Consortium, Cleveland, OH.

Objective: Serum potassium levels are frequently maintained at high levels (≥4.5 mEq/L) to prevent atrial fibrillation after cardiac surgery (AFACS), with limited evidence. Before undertaking a randomized controlled trial to investigate the noninferiority of maintaining levels ≥3.6 mEq/L compared with this strategy, the authors wanted to assess the feasibility, acceptability, and safety of recruiting for such a trial.

Design: Pilot and feasibility study of full trial protocol.

Setting: Two university tertiary-care hospitals.

Participants: A total of 160 individuals undergoing first-time elective isolated coronary artery bypass grafting.

Interventions: Randomization (1:1) to protocols aiming to maintain serum potassium at either ≥3.6 mEq/L or ≥4.5 mEq/L after arrival in the postoperative care facility and for 120 hours or until discharge from the hospital or AFACS occurred, whichever happened first.

Measurements And Main Results: Primary outcomes: (1) whether it was possible to recruit and randomize 160 patients within six months (an estimated 20% of those eligible); (2) maintaining a supplementation protocol violation rate of ≤10% (defined as potassium supplementation being inappropriately administered or withheld according to treatment allocation after a serum potassium measurement); and (3) retaining a 28-day postoperative follow-up rate of ≥90%. Between August 2017 and April 2018, 723 patients were screened and 160 (22%) were recruited. Potassium protocol violation rate = 9.8%. Follow-up rate at 28 days = 94.3%. Data on planned outcomes for the full trial were also collected.

Conclusions: It is feasible to recruit and randomize patients to a study assessing the impact of maintaining serum potassium concentrations at either ≥3.6 mEq/L or ≥4.5 mEq/L on the incidence of AFACS.
Source
http://dx.doi.org/10.1053/j.jvca.2021.06.021
June 2021

Clinically Applicable Segmentation of Head and Neck Anatomy for Radiotherapy: Deep Learning Algorithm Development and Validation Study.

J Med Internet Res 2021 07 12;23(7):e26151. Epub 2021 Jul 12.

DeepMind, London, United Kingdom.

Background: Over half a million individuals are diagnosed with head and neck cancer each year globally. Radiotherapy is an important curative treatment for this disease, but it requires time-consuming manual delineation of radiosensitive organs at risk. This planning process can delay treatment while also introducing interoperator variability, resulting in downstream radiation dose differences. Although auto-segmentation algorithms offer a potentially time-saving solution, the challenges in defining, quantifying, and achieving expert performance remain.

Objective: Adopting a deep learning approach, we aim to demonstrate a 3D U-Net architecture that achieves expert-level performance in delineating 21 distinct head and neck organs at risk commonly segmented in clinical practice.

Methods: The model was trained on a data set of 663 deidentified computed tomography scans acquired in routine clinical practice and with both segmentations taken from clinical practice and segmentations created by experienced radiographers as part of this research, all in accordance with consensus organ at risk definitions.

Results: We demonstrated the model's clinical applicability by assessing its performance on a test set of 21 computed tomography scans from clinical practice, each with 21 organs at risk segmented by 2 independent experts. We also introduced the surface Dice similarity coefficient, a new metric for comparing organ delineations that quantifies the deviation between organ-at-risk surface contours rather than volumes, better reflecting the clinical task of correcting errors in automated organ segmentations. The model's generalizability was then demonstrated on 2 distinct open-source data sets, reflecting centers and countries different from those used for model training.
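The surface Dice idea described above can be illustrated with a simplified, voxel-based sketch. This is not the authors' implementation; the tolerance, voxel spacing, and toy masks are assumptions for illustration only:

```python
import numpy as np
from scipy import ndimage

def surface_dice(mask_a: np.ndarray, mask_b: np.ndarray,
                 tolerance_mm: float, spacing_mm=(1.0, 1.0, 1.0)) -> float:
    """Fraction of the two segmentation surfaces lying within tolerance_mm of each other
    (a simplified, voxel-based reading of the surface Dice similarity coefficient)."""
    # Surface voxels = mask minus its erosion
    surf_a = mask_a & ~ndimage.binary_erosion(mask_a)
    surf_b = mask_b & ~ndimage.binary_erosion(mask_b)
    # Distance (in mm) from every voxel to the nearest surface voxel of the other mask
    dist_to_b = ndimage.distance_transform_edt(~surf_b, sampling=spacing_mm)
    dist_to_a = ndimage.distance_transform_edt(~surf_a, sampling=spacing_mm)
    overlap_a = (dist_to_b[surf_a] <= tolerance_mm).sum()
    overlap_b = (dist_to_a[surf_b] <= tolerance_mm).sum()
    return (overlap_a + overlap_b) / (surf_a.sum() + surf_b.sum())

# Toy example: two slightly offset spheres standing in for expert and model contours
zz, yy, xx = np.mgrid[:40, :40, :40]
sphere = lambda c: (zz - c[0])**2 + (yy - c[1])**2 + (xx - c[2])**2 < 10**2
print(round(surface_dice(sphere((20, 20, 20)), sphere((20, 20, 22)), tolerance_mm=2.0), 3))
```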

Conclusions: Deep learning is an effective and clinically applicable technique for the segmentation of the head and neck anatomy for radiotherapy. With appropriate validation studies and regulatory approvals, this system could improve the efficiency, consistency, and safety of radiotherapy pathways.
Source
http://dx.doi.org/10.2196/26151
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8314151
July 2021

Multitask prediction of organ dysfunction in the intensive care unit using sequential subnetwork routing.

J Am Med Inform Assoc 2021 Aug;28(9):1936-1946

Google Health, London, United Kingdom.

Objective: Multitask learning (MTL) using electronic health records allows concurrent prediction of multiple endpoints. MTL has shown promise in improving model performance and training efficiency; however, it often suffers from negative transfer (impaired learning if tasks are not appropriately selected). We introduce a sequential subnetwork routing (SeqSNR) architecture that uses soft parameter sharing to find related tasks and encourage cross-learning between them.
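A conceptual sketch of soft parameter sharing with per-task routing over a shared pool of subnetworks, the general idea behind SeqSNR as described above. This is illustrative only: a forward pass on random weights, with the training loop omitted; in practice the routing weights are learned jointly with the rest of the network.

```python
import numpy as np

rng = np.random.default_rng(0)
n_features, n_modules, hidden, n_tasks = 32, 4, 16, 6

# A shared pool of small sub-networks ("modules") available to every task
modules = [{"W": rng.normal(0, 0.1, (n_features, hidden)), "b": np.zeros(hidden)}
           for _ in range(n_modules)]

# Per-task routing logits: their softmax decides how strongly each task uses each module,
# so related tasks can converge on similar mixtures (soft sharing) while unrelated tasks diverge
routing_logits = rng.normal(0, 0.1, (n_tasks, n_modules))
task_heads = [{"W": rng.normal(0, 0.1, (hidden,)), "b": 0.0} for _ in range(n_tasks)]

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def forward(x, task):
    """One task's forward pass: softly mix module outputs, then apply that task's head."""
    weights = softmax(routing_logits[task])
    mixed = sum(w * np.tanh(x @ m["W"] + m["b"]) for w, m in zip(weights, modules))
    logit = mixed @ task_heads[task]["W"] + task_heads[task]["b"]
    return 1.0 / (1.0 + np.exp(-logit))   # probability of that task's endpoint

x = rng.normal(size=n_features)
print([round(forward(x, t), 3) for t in range(n_tasks)])
```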

Materials And Methods: Using the MIMIC-III (Medical Information Mart for Intensive Care-III) dataset, we train deep neural network models to predict the onset of 6 endpoints including specific organ dysfunctions and general clinical outcomes: acute kidney injury, continuous renal replacement therapy, mechanical ventilation, vasoactive medications, mortality, and length of stay. We compare single-task (ST) models with naive multitask and SeqSNR in terms of discriminative performance and label efficiency.

Results: SeqSNR showed a modest yet statistically significant performance boost across 4 of 6 tasks compared with ST and naive multitasking. When the size of the training dataset was reduced for a given task (label efficiency), SeqSNR outperformed ST for all cases showing an average area under the precision-recall curve boost of 2.1%, 2.9%, and 2.1% for tasks using 1%, 5%, and 10% of labels, respectively.

Conclusions: The SeqSNR architecture shows superior label efficiency compared with ST and naive multitasking, suggesting utility in scenarios in which endpoint labels are difficult to ascertain.
Source
http://dx.doi.org/10.1093/jamia/ocab101
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8363803
August 2021

Exergy intensity and environmental consequences of the medical face masks curtailing the COVID-19 pandemic: Malign bodyguard?

J Clean Prod 2021 Sep 10;313:127880. Epub 2021 Jun 10.

Henan Province Forest Resources Sustainable Development and High-value Utilization Engineering Research Center, School of Forestry, Henan Agricultural University, Zhengzhou 450002, China.

On January 30, 2020, the World Health Organization identified SARS-CoV-2 as a public health emergency of global concern. Accordingly, the demand for personal protective equipment (PPE), including medical face masks, has sharply risen compared with 2019. The new situation has led to a sharp increase in energy demand and the environmental impacts associated with these product systems. Hence, the pandemic's effects on the environmental consequences of various PPE types, such as medical face masks, should be assessed. In light of that, the current study aimed to identify the environmental hot-spots of medical face mask production and consumption by using life cycle assessment (LCA) and tried to provide solutions to mitigate the adverse impacts. Based on the results obtained, in 2020, medical face mask production using fossil-based plastics causes the loss of 2.03 × 10 disability-adjusted life years (DALYs); 1.63 × 10 PDF·m²·yr damage to ecosystem quality; the climate-damaging release of 2.13 × 10 kg CO2; and 5.65 × 10 MJ damage to resources. In addition, annual medical face mask production results in 5.88 × 10 TJ demand for exergy. On the other hand, if used masks are not appropriately handled, they can lead to 4.99 × 10 Pt/yr additional damage to the environment in 2020 as determined by the EDIP 2003 method. Replacement of fossil-based plastics with bio-based plastics, at rates ranging from 10 to 100%, could mitigate the product's total yearly environmental damage by 4-43%, respectively. Our study calls attention to the environmental sustainability of PPE used to prevent virus transmission in the current and future pandemics.
Source
http://dx.doi.org/10.1016/j.jclepro.2021.127880
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8192063
September 2021

High-altitude population neonatal and maternal phenotypes associated with birthweight protection.

Pediatr Res 2021 Jun 8. Epub 2021 Jun 8.

University College London Institute for Women's Health, London, UK.

Background: States that reduce foetal oxygen delivery are associated with impaired intrauterine growth. Hypoxia results when barometric pressure falls with ascent to altitude, and with it the partial pressure of inspired oxygen ('hypobaric hypoxia'). Birthweight is reduced when native lowlanders gestate at such high altitude (HA), an effect mitigated in populations that have been native to HA for millennia. Studying HA populations offers a route to explore the mechanisms by which hypoxia impacts foetal growth.

Methods: Between February 2017 and January 2019, we prospectively studied 316 pregnant women, in Leh, Ladakh (altitude 3524 m, where oxygen partial pressure is reduced by 1/3) and 101 pregnant women living in Delhi (low altitude, 216 m above sea level).

Results: Of Ladakhi HA newborns, 14% were small for gestational age (<10th birthweight centile) vs 19% of newborns at low altitude. At HA, increased maternal body mass index, age, and uterine artery (UtA) diameter were positively associated with growth >10th weight centile.

Conclusions: This study showed that Ladakhi offspring birthweight is relatively spared from the expected adverse HA effects. Furthermore, maternal body composition and greater UtA size may be physiological HA adaptations and warrant further study, as they offer potential mechanisms to overcome hypoxia-related growth issues.

Impact: Reduced foetal oxygen delivery seen in native lowlanders who gestate at HA causes foetal growth restriction, an effect thought to be mitigated in native HA populations. We found that greater maternal body mass and UtA diameter were associated with increased offspring birthweight in a (Ladakh) HA population. This supports a role for them as physiological mediators of adaptation and provides insights into potential mechanisms that may treat hypoxia-related growth issues.
Source
http://dx.doi.org/10.1038/s41390-021-01593-5
June 2021

Cannulation of the subclavian vein using real-time ultrasound guidance.

J Intensive Care Soc 2020 Nov 23;21(4):349-354. Epub 2020 Jan 23.

Department of Intensive Care, University College Hospital, London, UK.

Cannulation of the subclavian vein has many advantages when compared to other anatomical sites for central venous access. Difficulty in its ultrasonic visualisation, and the perceived consequent 'higher' complication rate, mean that this approach has fallen out of favour. This barrier, however, may now have disappeared. In this article, we discuss the indications, contraindications and complications associated with subclavian vein cannulation, and present an ultrasound-guided approach to infraclavicular subclavian cannulation.
Source
http://dx.doi.org/10.1177/1751143720901403
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8142095
November 2020

Use of deep learning to develop continuous-risk models for adverse event prediction from electronic health records.

Nat Protoc 2021 06 5;16(6):2765-2787. Epub 2021 May 5.

DeepMind, London, UK.

Early prediction of patient outcomes is important for targeting preventive care. This protocol describes a practical workflow for developing deep-learning risk models that can predict various clinical and operational outcomes from structured electronic health record (EHR) data. The protocol comprises five main stages: formal problem definition, data pre-processing, architecture selection, calibration and uncertainty, and generalizability evaluation. We have applied the workflow to four endpoints (acute kidney injury, mortality, length of stay and 30-day hospital readmission). The workflow can enable continuous (e.g., triggered every 6 h) and static (e.g., triggered at 24 h after admission) predictions. We also provide an open-source codebase that illustrates some key principles in EHR modeling. This protocol can be used by interdisciplinary teams with programming and clinical expertise to build deep-learning prediction models with alternate data sources and prediction tasks.
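A minimal sketch of the continuous versus static prediction triggers mentioned above, with a placeholder risk model and assumed feature names; this is not the protocol's open-source codebase, only an illustration of the triggering pattern:

```python
from datetime import datetime, timedelta
from typing import Callable, Dict, List, Tuple

def predict_risk(features: Dict[str, float]) -> float:
    """Placeholder for a trained risk model (assumed; any calibrated classifier could sit here)."""
    return min(1.0, 0.01 * features.get("creatinine_umol_l", 80) / 80)

def continuous_predictions(admit: datetime, horizon_h: int, every_h: int,
                           features_at: Callable[[datetime], Dict[str, float]]) -> List[Tuple]:
    """Trigger a prediction at a fixed cadence (e.g. every 6 h) through the stay."""
    out, t = [], admit
    while t <= admit + timedelta(hours=horizon_h):
        out.append((t, predict_risk(features_at(t))))
        t += timedelta(hours=every_h)
    return out

def static_prediction(admit: datetime,
                      features_at: Callable[[datetime], Dict[str, float]]) -> Tuple:
    """Trigger a single prediction at 24 h after admission."""
    t = admit + timedelta(hours=24)
    return (t, predict_risk(features_at(t)))

# Toy feature lookup: creatinine drifting upwards over the stay
admit = datetime(2021, 1, 1, 8, 0)
features_at = lambda t: {"creatinine_umol_l": 80 + 2 * (t - admit).total_seconds() / 3600}

print(continuous_predictions(admit, horizon_h=48, every_h=6, features_at=features_at)[:3])
print(static_prediction(admit, features_at))
```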
Source
http://dx.doi.org/10.1038/s41596-021-00513-5
June 2021

Malnutrition risk in hospitalised COVID-19 patients receiving CPAP.

Lancet 2021 04 17;397(10281):1261. Epub 2021 Mar 17.

Department of Gastroenterology, University Hospital Southampton NHS Foundation Trust, Southampton, UK.

Source
http://dx.doi.org/10.1016/S0140-6736(21)00447-5
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7969140
April 2021

Dysnatremia is a Predictor for Morbidity and Mortality in Hospitalized Patients with COVID-19.

J Clin Endocrinol Metab 2021 05;106(6):1637-1648

Department of Diabetes & Endocrinology, University College London Hospital NHS Foundation Trust, London, UK.

Context: Dysnatremia is an independent predictor of mortality in patients with bacterial pneumonia. There is paucity of data about the incidence and prognostic impact of abnormal sodium concentration in patients with coronavirus disease 2019 (COVID-19).

Objective: This work aimed to examine the association of serum sodium during hospitalization with key clinical outcomes, including mortality, need for advanced respiratory support and acute kidney injury (AKI), and to explore the role of serum sodium as a marker of inflammatory response in COVID-19.

Methods: This retrospective longitudinal cohort study, including all adult patients who presented with COVID-19 to 2 hospitals in London over an 8-week period, evaluated the association of dysnatremia (serum sodium < 135 or > 145 mmol/L, hyponatremia, and hypernatremia, respectively) at several time points with inpatient mortality, need for advanced ventilatory support, and AKI.

Results: The study included 488 patients (median age, 68 years). At presentation, 24.6% of patients were hyponatremic, mainly due to hypovolemia, and 5.3% hypernatremic. Hypernatremia 2 days after admission and exposure to hypernatremia at any time point during hospitalization were associated with a 2.34-fold (95% CI, 1.08-5.05; P = .0014) and 3.05-fold (95% CI, 1.69-5.49; P < .0001) increased risk of death, respectively, compared to normonatremia. Hyponatremia at admission was linked with a 2.18-fold increase in the likelihood of needing ventilatory support (95% CI, 1.34-3.45, P = .0011). Hyponatremia was not a risk factor for in-hospital mortality, except for the subgroup of patients with hypovolemic hyponatremia. Sodium values were not associated with the risk for AKI and length of hospital stay.
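Associations of this kind are typically reported as odds ratios from a fitted regression model. A minimal sketch on simulated data, assuming a simple logistic model with a single exposure flag; this is not the study's analysis, and the simulated risks are arbitrary:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Simulated illustration only: exposure to hypernatremia and in-hospital death
n = 488
hypernatremia = rng.binomial(1, 0.15, n)
p_death = np.where(hypernatremia == 1, 0.35, 0.15)     # assumed risks, for illustration
died = rng.binomial(1, p_death)
df = pd.DataFrame({"hypernatremia": hypernatremia, "died": died})

# Exponentiating the logistic coefficient gives the fold-change in odds, with its 95% CI
fit = smf.logit("died ~ hypernatremia", df).fit(disp=False)
or_point = np.exp(fit.params["hypernatremia"])
or_ci = np.exp(fit.conf_int().loc["hypernatremia"])
print(f"odds ratio {or_point:.2f} (95% CI {or_ci[0]:.2f}-{or_ci[1]:.2f})")
```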

Conclusion: Abnormal sodium levels during hospitalization are risk factors for poor prognosis, with hypernatremia and hyponatremia being associated with a greater risk of death and respiratory failure, respectively. Serum sodium values could be used for risk stratification in patients with COVID-19.
Source
http://dx.doi.org/10.1210/clinem/dgab107
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7928894
May 2021

Healthcare Workers Bioresource: Study outline and baseline characteristics of a prospective healthcare worker cohort to study immune protection and pathogenesis in COVID-19.

Wellcome Open Res 2020 12;5:179. Epub 2020 Oct 12.

Barts Heart Centre, St Bartholomew's Hospital, Barts Health NHS Trust, London, UK.

Background: Most biomedical research has focused on sampling COVID-19 patients presenting to hospital with advanced disease, with less focus on the asymptomatic or paucisymptomatic. We established a bioresource with serial sampling of health care workers (HCWs) designed to obtain samples before and during mainly mild disease, with follow-up sampling to evaluate the quality and duration of immune memory. Methods: We conducted a prospective study on HCWs from three hospital sites in London, initially at a single centre (recruited just prior to first peak community transmission in London), but then extended to multiple sites 3 weeks later (recruitment still ongoing, target n=1,000). Asymptomatic participants attending work complete a health questionnaire, and provide a nasal swab (for SARS-CoV-2 RNA by RT-PCR tests) and blood samples (mononuclear cells, serum, plasma, RNA and DNA are biobanked) at 16 weekly study visits, and at 6 and 12 months. Results: Preliminary baseline results for the first 731 HCWs (400 single-centre, 331 multicentre extension) are presented. Mean age was 38±11 years; 67% are female, 31% nurses, 20% doctors, and 19% work in intensive care units. COVID-19-associated risk factors were: 37% black, Asian or minority ethnicities; 18% smokers; 13% obesity; 11% asthma; 7% hypertension and 2% diabetes mellitus. At baseline, 41% reported symptoms in the preceding 2 weeks. Preliminary test results from the initial cohort (n=400) are available: PCR at baseline for SARS-CoV-2 was positive in 28 of 396 (7.1%, 95% CI 4.9-10.0%) and 15 of 385 (3.9%, 2.4-6.3%) had circulating IgG antibodies. Conclusions: This COVID-19 bioresource established just before the peak of infections in the UK will provide longitudinal assessments of incident infection and immune responses in HCWs through the natural time course of disease and convalescence. The samples and data from this bioresource are available to academic collaborators by application at https://covid-consortium.com/application-for-samples/.
Source
http://dx.doi.org/10.12688/wellcomeopenres.16051.2
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7836029
October 2020

A carbon monoxide 'single breath' method to measure total haemoglobin mass: a feasibility study.

Exp Physiol 2021 Feb 30;106(2):567-575. Epub 2020 Dec 30.

Department of Sports Medicine/Sports Physiology, University of Bayreuth, Bayreuth, 95440, Germany.

New Findings: What is the central question of this study? Is it possible to modify the CO-rebreathing method to acquire reliable measurements of haemoglobin mass in ventilated patients? What is the main finding and its importance? A 'single breath' of CO with a subsequent 30 s breath hold provides almost as accurate a measure of haemoglobin mass as the established optimized CO-rebreathing method when applied to healthy subjects. The modified method must now be checked in ventilated patients before it can be used to quantify the contributions of blood loss and of dilution to the severity of anaemia.

Abstract: Anaemia is defined by the concentration of haemoglobin (Hb). However, this value is dependent upon both the total circulating haemoglobin mass (tHb-mass) and the plasma volume (PV), neither of which is routinely measured. Carbon monoxide (CO)-rebreathing methods have been successfully used to determine both PV and tHb-mass in various populations. However, these methods are not yet suitable for ventilated patients. This study aimed to modify the CO-rebreathing procedure such that a single inhalation of a CO bolus would enable its use in ventilated patients. Eleven healthy volunteers performed four CO-rebreathing tests in a randomized order, inhaling an identical CO volume. In two tests, CO was rebreathed for 2 min (optimized CO rebreathing; oCOR), and in the other two tests, a single inhalation of a CO bolus was conducted with a subsequent breath hold of 15 s (Proc 15s) or 30 s (Proc 30s). Subsequently, the CO volume in the exhaled air was continuously determined for 20 min. The amount of CO exhaled after 7 and 20 min was respectively 3.1 ± 0.3 and 5.9 ± 1.1 ml for oCOR, 8.7 ± 3.6 and 12.0 ± 4.4 ml for Proc 15s and 5.1 ± 2.0 and 8.4 ± 2.6 ml for Proc 30s. tHb-mass was 843 ± 293 g determined by oCOR, 821 ± 288 g determined by Proc 15s (difference: P < 0.05) and 849 ± 311 g determined by Proc 30s. Bland-Altman plots demonstrated slightly lower tHb-mass values for Proc 15s compared with oCOR (-21.8 ± 15.3 g) and similar values for Proc 30s. In healthy volunteers, a single inhalation of a CO bolus, preferably followed by a 30 s breath hold, can be used to determine tHb-mass. These results must now be validated for ventilated patients.
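The underlying calculation is a CO mass balance: the CO retained in the circulation, divided by the rise in COHb fraction and by haemoglobin's CO-binding capacity (Hüfner's constant, commonly taken as about 1.39 ml CO per g Hb), gives tHb-mass. A minimal sketch with assumed illustrative numbers, omitting the published method's additional corrections (e.g., for CO remaining in the lung):

```python
HUEFNER_ML_CO_PER_G_HB = 1.39   # ml CO bound per g haemoglobin (commonly used value)

def thb_mass_g(co_administered_ml: float, co_lost_ml: float, delta_cohb_percent: float) -> float:
    """Simple CO mass balance: retained CO / (rise in COHb fraction * CO-binding capacity)."""
    co_retained_ml = co_administered_ml - co_lost_ml
    return co_retained_ml / (delta_cohb_percent / 100.0) / HUEFNER_ML_CO_PER_G_HB

# Illustrative numbers (assumed): 60 ml CO given, 6 ml lost via exhalation, COHb rises by 5.5%
print(round(thb_mass_g(60.0, 6.0, 5.5), 1), "g")
```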
Source
http://dx.doi.org/10.1113/EP089076
February 2021

Genetic mechanisms of critical illness in COVID-19.

Nature 2021 03 11;591(7848):92-98. Epub 2020 Dec 11.

Intensive Care Unit, Royal Infirmary of Edinburgh, Edinburgh, UK.

Host-mediated lung inflammation is present, and drives mortality, in the critical illness caused by coronavirus disease 2019 (COVID-19). Host genetic variants associated with critical illness may identify mechanistic targets for therapeutic development. Here we report the results of the GenOMICC (Genetics Of Mortality In Critical Care) genome-wide association study in 2,244 critically ill patients with COVID-19 from 208 UK intensive care units. We have identified and replicated the following new genome-wide significant associations: on chromosome 12q24.13 (rs10735079, P = 1.65 × 10) in a gene cluster that encodes antiviral restriction enzyme activators (OAS1, OAS2 and OAS3); on chromosome 19p13.2 (rs74956615, P = 2.3 × 10) near the gene that encodes tyrosine kinase 2 (TYK2); on chromosome 19p13.3 (rs2109069, P = 3.98 ×  10) within the gene that encodes dipeptidyl peptidase 9 (DPP9); and on chromosome 21q22.1 (rs2236757, P = 4.99 × 10) in the interferon receptor gene IFNAR2. We identified potential targets for repurposing of licensed medications: using Mendelian randomization, we found evidence that low expression of IFNAR2, or high expression of TYK2, are associated with life-threatening disease; and transcriptome-wide association in lung tissue revealed that high expression of the monocyte-macrophage chemotactic receptor CCR2 is associated with severe COVID-19. Our results identify robust genetic signals relating to key host antiviral defence mechanisms and mediators of inflammatory organ damage in COVID-19. Both mechanisms may be amenable to targeted treatment with existing drugs. However, large-scale randomized clinical trials will be essential before any change to clinical practice.
Source
http://dx.doi.org/10.1038/s41586-020-03065-y
March 2021

Response.

Chest 2020 12;158(6):2708-2711

William Harvey Research Institute, Barts and the London School of Medicine and Dentistry, Queen Mary University of London, London, England; Adult Critical Care Unit, Royal London Hospital, London, England.

Source
http://dx.doi.org/10.1016/j.chest.2020.08.001
December 2020

Diarrhoea in critical care is rarely infective in origin, associated with increased length of stay and higher mortality.

J Intensive Care Soc 2020 Feb 7;21(1):72-78. Epub 2019 May 7.

Department of Microbiology & Virology, University College London Hospitals, London, UK.

Diarrhoea, defined as > 3 loose or liquid stools per day, affects 9.7-41% of intensive care unit patients, negatively impacting on patient dignity, intensifying nursing workload and increasing morbidity. Its pathogenesis is poorly understood, but infective agents, intensive care unit therapies (such as enteral feed) and critical illness changes in the gut microbiome are thought to play a role. We analysed a consecutive cohort of 3737 patients admitted to a mixed general intensive care unit. Diarrhoea prevalence was lower than previously reported (5.3%), rarely infective in origin (6.5%) and associated with increased length of stay (median (inter-quartile range) 2.3 (1.0-5.0) days vs. 10 days (5.0-22.0), p < 0.001, sub-distribution hazard ratio 0.55 (95% CI 0.48-0.63), p < 0.001) and mortality (9.5% vs. 18.1%, p = 0.005, sub-distribution hazard ratio 1.20 (95% CI 0.79-1.81), p = 0.40), compared to patients without diarrhoea. In addition, 17.1% of patients received laxatives <24 h prior to diarrhoea onset. Further research on diarrhoea's pathogenesis in critical care is required; robust treatment protocols, investigation rationalisation and improved laxative prescribing may reduce its incidence and improve related outcomes.
Source
http://dx.doi.org/10.1177/1751143719843423
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7137165
February 2020

Effect of Intermittent or Continuous Feed on Muscle Wasting in Critical Illness: A Phase 2 Clinical Trial.

Chest 2020 07 2;158(1):183-194. Epub 2020 Apr 2.

Adult Critical Care Unit, Royal London Hospital, London, United Kingdom; William Harvey Research Institute, Barts and the London School of Medicine and Dentistry, Queen Mary University of London, London, United Kingdom.

Background: Acute skeletal muscle wasting in critical illness is associated with excess morbidity and mortality. Continuous feeding may suppress muscle protein synthesis as a result of the muscle-full effect, unlike intermittent feeding, which may ameliorate it.

Research Question: Does intermittent enteral feed decrease muscle wasting compared with continuous feed in critically ill patients?

Study Design And Methods: In a phase 2 interventional single-blinded randomized controlled trial, 121 mechanically ventilated adult patients with multiorgan failure were recruited following prospective informed consultee assent. They were randomized to the intervention group (intermittent enteral feeding of six 4-hourly feeds per 24 h, n = 62) or control group (standard continuous enteral feeding, n = 59). The primary outcome was 10-day loss of rectus femoris muscle cross-sectional area determined by ultrasound. Secondary outcomes included nutritional target achievements, plasma amino acid concentrations, glycemic control, and physical function milestones.

Results: Muscle loss was similar between arms (-1.1% [95% CI, -6.1% to 4.0%]; P = .676). More intermittently fed patients received 80% or more of target protein (OR, 1.52 [1.16-1.99]; P < .001) and energy (OR, 1.59 [1.21-2.08]; P = .001). Plasma branched-chain amino acid concentrations before and after feeds were similar between arms on trial day 1 (71 μM [44-98 μM]; P = .547) and trial day 10 (239 μM [33-444 μM]; P = .178). During the 10-day intervention period, the coefficient of variation for glucose concentrations was higher with intermittent feed (17.84 [18.6-20.4]) vs continuous feed (12.98 [14.0-15.7]; P < .001). However, days with reported hypoglycemia and insulin usage were similar in both groups. Safety profiles, gastric intolerance, physical function milestones, and discharge destinations did not differ between groups.

Interpretation: Intermittent feeding in early critical illness was not shown to preserve muscle mass in this trial, despite achieving nutritional targets more often than continuous feeding. However, it is feasible and safe.

Trial Registry: ClinicalTrials.gov; No.: NCT02358512; URL: www.clinicaltrials.gov.
Source
http://dx.doi.org/10.1016/j.chest.2020.03.045
July 2020

Low serum 25-hydroxyvitamin D status in the pathogenesis of stress fractures in military personnel: An evidenced link to support injury risk management.

PLoS One 2020 24;15(3):e0229638. Epub 2020 Mar 24.

Institute of Naval Medicine, Alverstoke, Hampshire, United Kingdom.

Stress fractures are common amongst healthy military recruits and athletes. Reduced vitamin D availability, measured by serum 25-hydroxyvitamin D (25OHD) status, has been associated with stress fracture risk during the 32-week Royal Marines (RM) training programme. A gene-environment interaction study was undertaken to explore this relationship to inform specific injury risk mitigation strategies. Fifty-one males who developed a stress fracture during RM training (n = 9 in weeks 1-15; n = 42 in weeks 16-32) and 141 uninjured controls were genotyped for the vitamin D receptor (VDR) FokI polymorphism. Serum 25OHD was measured at the start, middle and end (weeks 1, 15 and 32) of training. Serum 25OHD concentration increased in controls between weeks 1-15 (61.8±29.1 to 72.6±28.8 nmol/L, p = 0.01). Recruits who fractured did not show this rise and had lower week-15 25OHD concentration (p = 0.01). Higher week-15 25OHD concentration was associated with reduced stress fracture risk (adjusted OR 0.55[0.32-0.96] per 1SD increase, p = 0.04): the greater the increase in 25OHD, the greater the protective effect (p = 0.01). The f-allele was over-represented in fracture cases compared with controls (p<0.05). Baseline 25OHD status interacted with VDR genotype: a higher level was associated with reduced fracture risk in f-allele carriers (adjusted OR 0.39[0.17-0.91], p = 0.01). Improved 25OHD status between weeks 1-15 had a greater protective effect in FF genotype individuals (adjusted OR 0.31[0.12-0.81] vs. 1.78[0.90-3.49], p<0.01). Stress fracture risk in RM recruits is impacted by the interaction of VDR genotype with vitamin D status. This further supports the role of low serum vitamin D concentrations in causing stress fractures, and hence prophylactic vitamin D supplementation as an injury risk mitigation strategy.
Source
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0229638
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7092979
June 2020

Application of the optimized carbon monoxide rebreathing method for the measurement of total haemoglobin mass in chronic liver disease.

Physiol Rep 2020 03;8(6):e14402

Centre for Human Health and Performance/ Institute of Sport, Exercise and Health, University College London, and NIHR University College London Hospitals Biomedical Research Centre, London, UK.

Background: Anemia is common in liver cirrhosis. This generally implies a fall in total hemoglobin mass (tHb-mass). However, hemoglobin concentration ([Hb]) may fall due to an expansion in plasma volume (PV). The "optimized carbon monoxide rebreathing method" (oCOR) measures tHb-mass directly and PV (indirectly using hematocrit). It relies upon carboxyhemoglobin (COHb) distribution throughout the entire circulation. In healthy subjects, such distribution is complete within 6-8 min. Given the altered circulatory dynamics in cirrhosis, we sought in this pilot study to assess whether this also holds in cirrhosis. The primary aim was to ascertain if the standard timings for the oCOR were applicable to patients with chronic liver disease and cirrhosis. The secondary aim was to explore the applicability of standard CO dosing methodologies to this patient population.

Methods: Sixteen patients with chronic liver parenchymal disease were studied. tHb-mass was determined using the standard oCOR technique before elective paracentesis. Three subjects had an inadequate COHb% rise. In the remaining 13 (11 male), mean ± standard deviation (SD) age was 52 ± 13.8 years, body mass 79.1 ± 11.4 kg, height 175 ± 6.8 cm. The mean ± SD dose of carbon monoxide (CO) gas administered was 0.73 ± 0.13 ml/kg. COHb values at baseline, 6 and 8 min (and the "7-min value") were compared to those at 10, 12, 15 and 20 min after CO rebreathing.

Results: The "7-min value" for median COHb% (IQR) of 6.30% (6.21%-7.47%) did not differ significantly from those at subsequent time points (8 min: 6.30% (6.21%-7.47%), 10 min: 6.33% (6.00%-7.50%), 12 min: 6.33% (5.90%-7.40%), 15 min: 6.37% (5.80%-7.33%), 20 min: 6.27% (5.70%-7.20%)). Mean difference in calculated tHb-mass between minute 7 and minute 20 was only 4.1 g, or 0.6%, p = .68. No subjects reported any adverse effects.

Conclusions: The oCOR method can be safely used to measure tHb-mass in patients with chronic liver disease and ascites, without adjustment of blood sample timings. Further work might refine and validate appropriate dosing regimens.
Source
http://dx.doi.org/10.14814/phy2.14402
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7090373
March 2020

Thirst-guided participant-controlled intravenous fluid rehydration: a single blind, randomised crossover study.

Br J Anaesth 2020 Jan 31. Epub 2020 Jan 31.

Institute of Sport Exercise & Health, University College London, London, UK.

Background: Dehydration is common in hospitals and is associated with increased mortality and morbidity. Clinical assessment and diagnostic measures of dehydration are unreliable. We sought to investigate the novel concept that individuals might control their own intravenous rehydration, guided by thirst.

Methods: We performed a single-blind, counterbalanced, randomised cross-over trial. Ten healthy male volunteers of mean age 26 (standard deviation [sd] 10.5) yr were dehydrated by 3-5% of their baseline body mass via exercising in the heat (35°C, 60% humidity). This was followed by a 4 h participant-controlled intravenous rehydration: individuals triggered up to six fluid boluses (4% dextrose in 0.18% sodium chloride) per hour in response to thirst. Participants undertook two blinded rehydration protocols which differed only by bolus volume: 50 ml (low volume [LV]) or 200 ml (high volume [HV]). Each hour during the rehydration phase, plasma osmolality (pOsm) was measured and thirst score recorded. Nude body mass was measured at baseline, after dehydration, and after the rehydration phase.

Results: In both conditions, the mean dehydration-related body mass loss was 3.9%. Thirst score was strongly associated with pOsm (within-subject r=0.74) and demand for fluid decreased as pOsm corrected. In the HV condition, participants rapidly rehydrated themselves (mean fluid delivered 3060 ml vs 981 ml in the LV condition), restoring body mass and pOsm to values no different from their euhydrated state.
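The within-subject correlation quoted above can be illustrated by centring both variables within each participant before correlating them, one common way of computing a repeated-measures correlation. The data below are simulated, so the printed value will not reproduce the study's r = 0.74:

```python
import numpy as np

rng = np.random.default_rng(2)

def within_subject_r(subject_ids, x, y):
    """Correlate x and y after centring both within each subject (repeated-measures style)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x.copy(), y.copy()
    for s in np.unique(subject_ids):
        m = subject_ids == s
        xc[m] -= x[m].mean()
        yc[m] -= y[m].mean()
    return np.corrcoef(xc, yc)[0, 1]

# Simulated: 10 participants, hourly thirst scores tracking plasma osmolality during rehydration
subjects = np.repeat(np.arange(10), 5)
posm = 295 - 2 * np.tile(np.arange(5), 10) + rng.normal(0, 1, 50)   # falls as fluid is delivered
thirst = 0.8 * (posm - 285) + rng.normal(0, 1.5, 50)                # tracks osmolality
print(round(within_subject_r(subjects, posm, thirst), 2))
```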

Conclusion: Healthy individuals appear able to rely on thirst to manage intravenous fluid intake. Future work must now focus on whether patient-controlled intravenous fluids could represent a paradigm shift in the management of hydration in the clinical setting.

Clinical Trial Registration: NCT03932890.
Source
http://dx.doi.org/10.1016/j.bja.2019.12.008
January 2020

Matters of life and death: Change beyond planetary homeostasis.

Exp Physiol 2019 12 4;104(12):1749-1750. Epub 2019 Oct 4.

Extreme Environments Laboratory, School of Sport, Health and Exercise Science, University of Portsmouth, Portsmouth, UK.

Source
http://dx.doi.org/10.1113/EP088178
December 2019

Evaluation of a digitally-enabled care pathway for acute kidney injury management in hospital emergency admissions.

NPJ Digit Med 2019 31;2:67. Epub 2019 Jul 31.

Department of Applied Health Research, University College London, 1-19 Torrington Place, London, WC1E 7HB, UK.

We developed a digitally enabled care pathway for acute kidney injury (AKI) management incorporating a mobile detection application, specialist clinical response team and care protocol. Clinical outcome data were collected from adults with AKI on emergency admission before (May 2016 to January 2017) and after (May to September 2017) deployment at the intervention site and another not receiving the intervention. Changes in primary outcome (serum creatinine recovery to ≤120% baseline at hospital discharge) and secondary outcomes (30-day survival, renal replacement therapy, renal or intensive care unit (ICU) admission, worsening AKI stage and length of stay) were measured using interrupted time-series regression. Processes of care data (time to AKI recognition, time to treatment) were extracted from casenotes, and compared over two 9-month periods before and after implementation (January to September 2016 and 2017, respectively) using pre-post analysis. There was no step change in renal recovery or any of the secondary outcomes. Trends for creatinine recovery rates (estimated odds ratio (OR) = 1.04, 95% confidence interval (95% CI): 1.00-1.08, P = 0.038) and renal or ICU admission (OR = 0.95, 95% CI: 0.90-1.00, P = 0.044) improved significantly at the intervention site. However, difference-in-difference analyses between sites for creatinine recovery (estimated OR = 0.95, 95% CI: 0.90-1.00, P = 0.053) and renal or ICU admission (OR = 1.06, 95% CI: 0.98-1.16, P = 0.140) were not significant. Among process measures, time to AKI recognition and treatment of nephrotoxicity improved significantly (P < 0.001 and P = 0.047, respectively).
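A minimal sketch of an interrupted time-series (segmented) regression of the kind described above, separating a step change at implementation from a change in trend. The monthly data are simulated and the variable names are assumptions, not the study's dataset:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)

# Simulated monthly renal-recovery rates, with the intervention deployed after month 9
months = np.arange(18)
post = (months >= 9).astype(int)
rate = (0.60 + 0.002 * months                 # baseline trend
        + 0.03 * post                         # step change at implementation
        + 0.004 * post * (months - 9)         # change in trend after implementation
        + rng.normal(0, 0.01, 18))
df = pd.DataFrame({"month": months, "post": post,
                   "months_since": np.clip(months - 9, 0, None), "rate": rate})

# Segmented regression: coefficients on 'post' and 'months_since' estimate the step and slope change
its = smf.ols("rate ~ month + post + months_since", df).fit()
print(its.params)
```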
Source
http://dx.doi.org/10.1038/s41746-019-0100-6
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6669220
July 2019

Implementation of a Digitally Enabled Care Pathway (Part 1): Impact on Clinical Outcomes and Associated Health Care Costs.

J Med Internet Res 2019 07 15;21(7):e13147. Epub 2019 Jul 15.

Royal Free London NHS Foundation Trust, London, United Kingdom.

Background: The development of acute kidney injury (AKI) in hospitalized patients is associated with adverse outcomes and increased health care costs. Simple automated e-alerts indicating its presence do not appear to improve outcomes, perhaps because of a lack of explicitly defined integration with a clinical response.

Objective: We sought to test this hypothesis by evaluating the impact of a digitally enabled intervention on clinical outcomes and health care costs associated with AKI in hospitalized patients.

Methods: We developed a care pathway comprising automated AKI detection, mobile clinician notification, in-app triage, and a protocolized specialist clinical response. We evaluated its impact by comparing data from pre- and postimplementation phases (May 2016 to January 2017 and May to September 2017, respectively) at the intervention site and another site not receiving the intervention. Clinical outcomes were analyzed using segmented regression analysis. The primary outcome was recovery of renal function to ≤120% of baseline by hospital discharge. Secondary clinical outcomes were mortality within 30 days of alert, progression of AKI stage, transfer to renal/intensive care units, hospital re-admission within 30 days of discharge, dependence on renal replacement therapy 30 days after discharge, and hospital-wide cardiac arrest rate. Time taken for specialist review of AKI alerts was measured. Impact on health care costs as defined by Patient-Level Information and Costing System data was evaluated using difference-in-differences (DID) analysis.

Results: The median time to AKI alert review by a specialist was 14.0 min (interquartile range 1.0-60.0 min). There was no impact on the primary outcome (estimated odds ratio [OR] 1.00, 95% CI 0.58-1.71; P=.99). Although the hospital-wide cardiac arrest rate fell significantly at the intervention site (OR 0.55, 95% CI 0.38-0.76; P<.001), DID analysis with the comparator site was not significant (OR 1.13, 95% CI 0.63-1.99; P=.69). There was no impact on other secondary clinical outcomes. Mean health care costs per patient were reduced by £2123 (95% CI -£4024 to -£222; P=.03), not including costs of providing the technology.
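The difference-in-differences estimate reported above is, in regression form, the coefficient on a site-by-period interaction term. A minimal sketch on simulated per-patient costs with assumed column names; this is not the PLICS analysis itself:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)

# Simulated per-patient costs at an intervention site and a comparator site, pre and post deployment
n = 400
site = rng.integers(0, 2, n)          # 1 = intervention site
post = rng.integers(0, 2, n)          # 1 = post-implementation period
cost = 9000 - 2000 * site * post + rng.normal(0, 1500, n)   # assumed saving only for site*post
df = pd.DataFrame({"site": site, "post": post, "cost": cost})

# The site:post interaction coefficient is the difference-in-differences estimate
did = smf.ols("cost ~ site * post", df).fit()
print(did.params["site:post"], did.conf_int().loc["site:post"].values)
```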

Conclusions: The digitally enabled clinical intervention to detect and treat AKI in hospitalized patients reduced health care costs and possibly reduced cardiac arrest rates. Its impact on other clinical outcomes and identification of the active components of the pathway requires clarification through evaluation across multiple sites.
Source
http://dx.doi.org/10.2196/13147
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6693300
July 2019

Implementation of a Digitally Enabled Care Pathway (Part 2): Qualitative Analysis of Experiences of Health Care Professionals.

J Med Internet Res 2019 07 15;21(7):e13143. Epub 2019 Jul 15.

Department of Applied Health Research, University College London, London, United Kingdom.

Background: One reason for the introduction of digital technologies into health care has been to try to improve safety and patient outcomes by providing real-time access to patient data and enhancing communication among health care professionals. However, the adoption of such technologies into clinical pathways has been less examined, and the impacts on users and the broader health system are poorly understood. We sought to address this by studying the impacts of introducing a digitally enabled care pathway for patients with acute kidney injury (AKI) at a tertiary referral hospital in the United Kingdom. A dedicated clinical response team, comprising the existing nephrology and patient-at-risk and resuscitation teams, received AKI alerts in real time via Streams, a mobile app. Here, we present a qualitative evaluation of the experiences of users and other health care professionals whose work was affected by the implementation of the care pathway.

Objective: The aim of this study was to qualitatively evaluate the impact of mobile results viewing and automated alerting as part of a digitally enabled care pathway on the working practices of users and their interprofessional relationships.

Methods: A total of 19 semistructured interviews were conducted with members of the AKI response team and clinicians with whom they interacted across the hospital. Interviews were analyzed using inductive and deductive thematic analysis.

Results: The digitally enabled care pathway improved access to patient information and expedited early specialist care. Opportunities were identified for more constructive planning of end-of-life care due to the earlier detection and alerting of deterioration. However, the shift toward early detection also highlighted resource constraints and some clinical uncertainty about the value of intervening at this stage. The real-time availability of information altered communication flows within and between clinical teams and across professional groups.

Conclusions: Digital technologies allow early detection of adverse events and of patients at risk of deterioration, with the potential to improve outcomes. They may also increase the efficiency of health care professionals' working practices. However, when planning and implementing digital information innovations in health care, the following factors should also be considered: the provision of clinical training to effectively manage early detection, resources to cope with additional workload, support to manage perceived information overload, and the optimization of algorithms to minimize unnecessary alerts.
Source
http://dx.doi.org/10.2196/13143
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6693304
July 2019