Publications by authors named "Samuel Jones"

247 Publications

Feasibility and acceptability of SARS-CoV-2 testing and surveillance in primary school children in England: Prospective, cross-sectional study.

PLoS One 2021 27;16(8):e0255517. Epub 2021 Aug 27.

Public Health England, London, United Kingdom.

Background: The reopening of schools during the COVID-19 pandemic has raised concerns about widespread infection and transmission of SARS-CoV-2 in educational settings. In June 2020, Public Health England (PHE) initiated prospective national surveillance of SARS-CoV-2 in primary schools across England (sKIDs). We used this opportunity to assess the feasibility and acceptability of large-scale surveillance and testing for SARS-CoV-2 infections in schools among staff, parents and students.

Methods: Staff and students in 131 primary schools were asked to complete a questionnaire at recruitment and provide weekly nasal swabs for SARS-CoV-2 RT-PCR testing (n = 86) or swabs with blood samples for antibody testing (n = 45) at the beginning and end of the summer half-term. In six blood sampling schools, students were asked to complete a pictorial questionnaire before and after their investigations.

Results: In total, 135 children aged 4-7 years (n = 40) or 8-11 years (n = 95) completed the pictorial questionnaire fully or partially. Prior to sampling, oral fluid sampling was the most acceptable test (107/132, 81%) followed by throat swabs (80/134, 59%), nose swabs (77/132, 58%), and blood tests (48/130, 37%). Younger students were more nervous about all tests than older students but, after completing their tests, most children reported a "better than expected" experience with all the investigations. Students were more likely to agree to additional testing for nose swabs (93/113, 82%) and oral fluid (93/114, 82%), followed by throat swabs (85/113, 75%) and blood tests (72/108, 67%). Parents (n = 3,994) and staff (n = 2,580) selected a preference for weekly testing with nose swabs, throat swabs or oral fluid sampling, although staff were more flexible about testing frequency.

Conclusions: Primary school staff and parents were supportive of regular tests for SARS-CoV-2 and selected a preference for weekly testing. Children preferred nose swabs and oral fluids over throat swabs or blood sampling.

Source
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0255517
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8396768
September 2021

Understanding Town Centre Performance in Wales: Using GIS to Develop a Tool for Benchmarking.

Appl Spat Anal Policy 2021 Aug 20:1-28. Epub 2021 Aug 20.

Wales Institute of Social and Economic Research and Data (WISERD), School of Geography and Planning, Cardiff University, Glamorgan Building, King Edward VII Avenue, Cardiff, CF10 3WA UK.

Welsh Government policy establishes town centres as central places of community activity and local prosperity, recognising the positive impact towns have on the local economy and on the well-being and cohesion of local communities. In light of this, recent declines in the usage of town centres are a major cause for concern. These declines have not been experienced uniformly across all towns, with some towns out-performing others. This paper applies principles outlined in the Welsh Government's Planning Policy Wales to develop a tool which classifies a sample of 71 towns and cities in Wales based on their centre and catchment characteristics. Catchment areas have been delineated using a Spatial Interaction Model to account for complex consumer behaviours and competition between centres. The tool identifies six distinct types of towns alongside key socio-economic catchment area characteristics. We then demonstrate the tool's application by exploring variations in town centre performance between and within each town type. Case study examples show how policymakers may use this tool to benchmark between towns, evaluating the suitability of a town's retail offering based on its performance relative to the benchmark and guiding decisions about the types of businesses and uses a town should pursue to improve its appeal to its catchment community. The paper concludes with several recommendations for policymakers.
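The catchment delineation step above rests on a spatial interaction model. As a rough, illustrative sketch of the general idea (a simple Huff-style model with invented attractiveness scores, distances and decay parameter, not the authors' calibrated model), the Python below assigns each residential zone a probability of patronising each town centre and allocates it to the most likely catchment:

```python
import numpy as np

# Illustrative Huff-style spatial interaction model (assumed inputs and
# parameters; the paper's calibrated model is not reproduced here).
attractiveness = np.array([120.0, 80.0, 45.0])          # hypothetical centre "mass", e.g. retail floorspace
distance_km = np.array([[2.0, 6.0, 9.0],                 # hypothetical zone-to-centre distances
                        [5.0, 3.0, 7.0],
                        [8.0, 4.0, 2.5]])
beta = 2.0                                               # assumed distance-decay exponent

# Probability that a zone patronises a centre: attractiveness weighted by
# distance decay, normalised across all competing centres.
utility = attractiveness * distance_km ** -beta
probability = utility / utility.sum(axis=1, keepdims=True)

# Allocate each zone to the catchment of its most likely centre.
catchment = probability.argmax(axis=1)
print(np.round(probability, 3))
print("zone -> centre:", catchment)
```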

Source
http://dx.doi.org/10.1007/s12061-021-09417-z
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8378115
August 2021

SARS-CoV-2 infection, antibody positivity and seroconversion rates in staff and students following full reopening of secondary schools in England: A prospective cohort study, September-December 2020.

EClinicalMedicine 2021 Jul 9;37:100948. Epub 2021 Jun 9.

National Infection Service, Public Health England, 61 Colindale Avenue, London NW9 5EQ, UK.

Background: Older children have higher SARS-CoV-2 infection rates than younger children. We investigated SARS-CoV-2 infection, seroprevalence and seroconversion rates in staff and students following the full reopening of all secondary schools in England.

Methods: Public Health England (PHE) invited secondary schools in six regions (East and West London, Hertfordshire, Derbyshire, Manchester and Birmingham) to participate in SARS-CoV-2 surveillance during the 2020/21 academic year. Participants had nasal swabs for RT-PCR and blood samples for SARS-CoV-2 antibodies at the beginning (September 2020) and end (December 2020) of the autumn term. Multivariable logistic regression was used to assess independent risk factors for seropositivity and seroconversion.

Findings: Eighteen schools in six regions enrolled 2,209 participants, including 1,189 (53.8%) students and 1,020 (46.2%) staff. SARS-CoV-2 infection rates were not significantly different between students and staff in round one (5/948 vs. 2/876; p = 0.46) or round two (10/948 vs. 7/886; p = 0.63), and were similar to national prevalence. None of the four strains sequenced in round 1, and 7/15 (47%) of those sequenced in round 2, were the highly transmissible SARS-CoV-2 B.1.1.7 variant. In round 1, antibody seropositivity was higher in students than staff (114/893 [12.8%] vs. 79/861 [9.2%]; p = 0.016), but similar in round 2 (117/893 [13.1%] vs. 117/872 [13.3%]; p = 0.85), comparable to local community seroprevalence. Between the two rounds, 8.7% (57/652) of staff and 6.6% (36/549) of students seroconverted (p = 0.16).

Interpretation: In secondary schools, SARS-CoV-2 infection, seropositivity and seroconversion rates were similar in staff and students, and comparable to local community rates. Ongoing surveillance will be important for monitoring the impact of new variants in educational settings.

Source
http://dx.doi.org/10.1016/j.eclinm.2021.100948
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8343251
July 2021

Evaluating real-world national and regional trends in definitive closure in US burn care: A survey of US Burn Centers.

J Burn Care Res 2021 Jul 30. Epub 2021 Jul 30.

Avita Medical, Valencia, CA.

To better understand trends in burn treatment patterns related to definitive closure, this study sought to benchmark real-world survey data against national data contained within the National Burn Repository version 8.0 (NBR v8.0) across key burn center practice patterns, resource utilization, and clinical outcomes. A survey, administered to a representative sample of US burn surgeons, collected information across several domains: burn center characteristics; patient characteristics, including number of patients and burn size and depth; aggregate number of procedures; resource use, such as autograft procedure time and dressing changes; and costs. Survey findings were aggregated by key outcomes (number of procedures, costs) nationally and regionally. Aggregated burn center data were also compared to the NBR to identify trends relative to current treatment patterns. Benchmarking survey results against NBR v8.0 demonstrated shifts in burn center patient mix, with more severe cases being seen in the inpatient setting and less severe burns moving to the outpatient setting. An overall reduction in the number of autograft procedures was observed compared to NBR v8.0, and time efficiency improved, with intervention time per TBSA decreasing as TBSA increased. Both nationally and regionally, an increase in costs was observed. The results suggest that resource use estimates from NBR v8.0 may be higher than current practices, highlighting the importance of improved and timely NBR reporting and further research on burn center standard of care practices. This study demonstrates significant variations in burn center characteristics, practice patterns, and resource utilization, thus increasing our understanding of burn center operations and behavior.

Source
http://dx.doi.org/10.1093/jbcr/irab151
July 2021

COVID-19 outbreaks following full reopening of primary and secondary schools in England: Cross-sectional national surveillance, November 2020.

Lancet Reg Health Eur 2021 Jul 19;6:100120. Epub 2021 May 19.

Immunisation and Countermeasures Division, Public Health England, London, United Kingdom.

Background: The full reopening of schools in September 2020 was associated with an increase in COVID-19 cases and outbreaks in educational settings across England.

Methods: Primary and secondary schools reporting an outbreak (≥2 laboratory-confirmed cases within 14 days) to Public Health England (PHE) between 31 August and 18 October 2020 were contacted in November 2020 to complete an online questionnaire.

Findings: There were 969 school outbreaks reported to PHE, comprising 2% (n = 450) of primary schools and 10% (n = 519) of secondary schools in England. Of the 369 geographically representative schools contacted, 179 completed the questionnaire (100 primary schools, 79 secondary schools) and 2,314 cases were reported. Outbreaks were larger and spanned more year groups in secondary schools than in primary schools. Teaching staff were more likely to be the index case in primary (48/100, 48%) than secondary (25/79, 32%) school outbreaks (p = 0.027). When an outbreak occurred, attack rates were higher in staff (881/17,362; 5.07%; 95% CI, 4.75-5.41%) than students, especially primary school teaching staff (378/3,852; 9.81%; 95% CI, 8.90-10.82%) compared to secondary school teaching staff (284/7,146; 3.97%; 95% CI, 3.79-5.69%). Secondary school students (1,105/91,919; 1.20%; 95% CI, 1.13-1.28%) had higher attack rates than primary school students (328/39,027; 0.84%; 95% CI, 0.75-0.94%).

Conclusions: A higher proportion of secondary schools than primary schools reported a COVID-19 outbreak and experienced larger outbreaks across multiple school year groups. The higher attack rate among teaching staff during an outbreak, especially in primary schools, suggests that additional protective measures may be needed.

Funding: PHE.
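The attack rates quoted in the findings are simple proportions with 95% confidence intervals. A minimal sketch of that calculation is below, using a Wilson score interval as one plausible method (the paper does not state which interval was used); the counts come straight from the abstract.

```python
from math import sqrt

def attack_rate_ci(cases: int, exposed: int, z: float = 1.96):
    """Attack rate (%) with an approximate 95% Wilson score interval."""
    p = cases / exposed
    denom = 1 + z**2 / exposed
    centre = (p + z**2 / (2 * exposed)) / denom
    half = z * sqrt(p * (1 - p) / exposed + z**2 / (4 * exposed**2)) / denom
    return 100 * p, 100 * (centre - half), 100 * (centre + half)

# Counts reported above: all staff, and secondary-school students.
for label, cases, exposed in [("staff", 881, 17_362), ("secondary students", 1_105, 91_919)]:
    rate, lo, hi = attack_rate_ci(cases, exposed)
    print(f"{label}: {rate:.2f}% (95% CI {lo:.2f}-{hi:.2f}%)")
```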

Source
http://dx.doi.org/10.1016/j.lanepe.2021.100120
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8276523
July 2021

Origins of Dissociations in the English Past Tense: A Synthetic Brain Imaging Model.

Front Psychol 2021 2;12:688908. Epub 2021 Jul 2.

Department of Psychology, Lancaster University, Lancaster, United Kingdom.

Brain imaging studies of English past tense inflection have found dissociations between regular and irregular verbs, but no coherent picture has emerged to explain how these dissociations arise. Here we use synthetic brain imaging on a neural network model to provide a mechanistic account of the origins of such dissociations. The model suggests that dissociations between regional activation patterns in verb inflection emerge in an adult processing system that has been shaped through experience-dependent structural brain development. Although these dissociations appear to be between regular and irregular verbs, they arise in the model from a combination of statistical properties including frequency, relationships to other verbs, and phonological complexity, without a causal role for regularity or semantics. These results are consistent with the notion that all inflections are produced in a single associative mechanism. The model generates predictions about the patterning of active brain regions for different verbs that can be tested in future imaging studies.

Source
http://dx.doi.org/10.3389/fpsyg.2021.688908
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8283012
July 2021

Use of high-content imaging to quantify transduction of AAV-PHP viruses in the brain following systemic delivery.

Brain Commun 2021 17;3(2):fcab105. Epub 2021 May 17.

Huntington's Disease Centre, Department of Neurodegenerative Disease, UK Dementia Research Institute at UCL, Queen Square Institute of Neurology, University College London, London WC1N 3BG, UK.

The engineering of the AAV-PHP capsids was an important development for CNS research and the modulation of gene expression in the brain. They cross the blood-brain barrier and transduce brain cells after intravenous systemic delivery, a property dependent on the genotype of the AAV-PHP capsid receptor. It is important to determine the transduction efficiency of a given viral preparation, as well as the comparative tropism for different brain cells; however, manual estimation of adeno-associated viral transduction efficiencies can be biased and time consuming. Therefore, we have used the Opera Phenix high-content screening system, equipped with the Harmony processing and analysis software, to reduce bias and develop an automated approach to determining transduction efficiency in the mouse brain. We used R Studio and 'gatepoints' to segment the data captured from coronal brain sections into brain regions of interest. C57BL/6J and CBA/Ca mice were injected with an AAV-PHP.B virus containing a green fluorescent protein reporter with a nuclear localization signal. Coronal sections at 600 μm intervals throughout the entire brain were stained with Hoechst dye, combined with immunofluorescence for NeuN and green fluorescent protein to identify all cell nuclei, neurons and transduced cells, respectively. Automated data analysis was applied to give an estimate of neuronal percentages and transduction efficiencies throughout the entire brain as well as for the cortex, striatum and hippocampus. The data from each coronal section from a given mouse were highly comparable. The percentage of neurons in the C57BL/6J and CBA/Ca brains was approximately 40%, and this was higher in the cortex than in the striatum and hippocampus. The systemic injection of AAV-PHP.B resulted in similar transduction rates across the entire brain for C57BL/6J mice. Approximately 10-15% of all cells were transduced, with neuronal transduction efficiencies ranging from 5% to 15%; these estimates were similar across brain regions and contrast with the much more localized transduction achieved through intracerebral injection. We confirmed that delivery of the AAV-PHP.B viruses to the brain from the vasculature resulted in widespread transduction. Our methodology allows the rapid comparison of transduction rates between brain regions, producing comparable data to more time-consuming approaches. The methodology developed here can be applied to the automated quantification of any parameter of interest that can be captured as a fluorescent signal.
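Stripped of the imaging specifics, the quantification reduces to counting labelled nuclei per region: transduction efficiency is the share of Hoechst-positive nuclei that are GFP-positive, and neuronal transduction the share of NeuN-positive nuclei that are GFP-positive. A minimal Python sketch of that bookkeeping (the study's actual pipeline used Opera Phenix/Harmony and R with 'gatepoints'; the per-cell table here is invented):

```python
import pandas as pd

# Hypothetical per-nucleus table as might be exported from a high-content
# imaging pipeline: one row per detected nucleus with region and marker flags.
cells = pd.DataFrame({
    "region":    ["cortex"] * 5 + ["striatum"] * 4 + ["hippocampus"] * 3,
    "is_neuron": [1, 1, 0, 1, 0, 1, 0, 1, 0, 1, 1, 0],   # NeuN-positive
    "is_gfp":    [1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 0, 0],   # GFP-positive (transduced)
})

grouped = cells.groupby("region")
summary = pd.DataFrame({
    "n_nuclei":               grouped.size(),
    "pct_neurons":            100 * grouped["is_neuron"].mean(),
    "pct_transduced":         100 * grouped["is_gfp"].mean(),
    "pct_neurons_transduced": 100 * cells[cells["is_neuron"] == 1].groupby("region")["is_gfp"].mean(),
})
print(summary.round(1))
```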

Source
http://dx.doi.org/10.1093/braincomms/fcab105
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8200048
May 2021

Using Mendelian Randomisation methods to understand whether diurnal preference is causally related to mental health.

Mol Psychiatry 2021 Jun 8. Epub 2021 Jun 8.

Genetics of Complex Traits, The College of Medicine and Health, University of Exeter, The RILD Building, RD&E Hospital, Exeter, UK.

Late diurnal preference has been linked to poorer mental health outcomes, but understanding of the causal role of diurnal preference in mental health and wellbeing is currently limited. Late diurnal preference is often associated with circadian misalignment (a mismatch between the timing of the endogenous circadian system and behavioural rhythms), such that evening people more frequently live against their internal clock. This study aims to quantify the causal contribution of diurnal preference to mental health outcomes, including anxiety, depression and general wellbeing, and to test the hypothesis that more misaligned individuals have poorer mental health and wellbeing, using an actigraphy-based measure of circadian misalignment. Multiple Mendelian Randomisation (MR) approaches were used to test causal pathways between diurnal preference and seven well-validated mental health and wellbeing outcomes in up to 451,025 individuals. In addition, observational analyses tested the association between a novel, objective measure of behavioural misalignment (Composite Phase Deviation, CPD) and seven mental health and wellbeing outcomes. Using genetic instruments identified in the largest GWAS for diurnal preference, we provide robust evidence that early diurnal preference is protective for depression and improves wellbeing. For example, using one-sample MR, a twofold higher genetic liability of morningness was associated with lower odds of depressive symptoms (OR: 0.92, 95% CI: 0.88, 0.97). It is possible that behavioural factors, including circadian misalignment, contribute to the chronotype-depression relationship, but further work is needed to confirm these findings.
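Mendelian randomisation of this kind combines per-variant SNP-exposure and SNP-outcome effect estimates into a single causal estimate. The sketch below shows the basic inverse-variance-weighted calculation with made-up summary statistics; the study itself used several MR approaches (including one-sample MR), whose specifics are not reproduced here.

```python
import numpy as np

# Hypothetical per-SNP summary statistics (not the study's real instruments).
beta_exposure = np.array([0.10, 0.07, 0.12, 0.05])            # SNP -> diurnal preference
beta_outcome  = np.array([-0.020, -0.011, -0.030, -0.008])    # SNP -> depression (log-odds)
se_outcome    = np.array([0.008, 0.006, 0.010, 0.005])

# Inverse-variance-weighted MR: a weighted average of per-SNP Wald ratios.
wald    = beta_outcome / beta_exposure
weights = (beta_exposure / se_outcome) ** 2
beta_ivw = np.sum(wald * weights) / np.sum(weights)
se_ivw   = np.sqrt(1.0 / np.sum(weights))

print(f"IVW estimate (log-odds per unit of exposure): {beta_ivw:.3f} (SE {se_ivw:.3f})")
print(f"Equivalent odds ratio: {np.exp(beta_ivw):.3f}")
```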

Source
http://dx.doi.org/10.1038/s41380-021-01157-3
June 2021

Seroprevalence of SARS-CoV-2 antibodies in university students: Cross-sectional study, December 2020, England.

J Infect 2021 07 29;83(1):104-111. Epub 2021 Apr 29.

Immunisation and Countermeasures Division, Public Health England, 61 Colindale Avenue, London NW9 5EQ, UK.

Background: In England, the reopening of universities in September 2020 coincided with a rapid increase in SARS-CoV-2 infection rates in university-aged young adults. This study aimed to estimate SARS-CoV-2 antibody prevalence in students attending universities that had experienced a COVID-19 outbreak after reopening for the autumn term in September 2020.

Methods: A cross-sectional serosurvey was conducted during 02-11 December 2020 in students aged ≤ 25 years across five universities in England. Blood samples for SARS-CoV-2 antibody testing were obtained using a self-sampling kit and analysed using the Abbott SARS-CoV-2 N antibody and/or an in-house receptor binding domain (RBD) assay.

Findings: SARS-CoV-2 seroprevalence in 2,905 university students was 17.8% (95% CI, 16.5-19.3), ranging between 7.6% and 29.7% across the five universities. Seropositivity was associated with being younger, likely representing first-year undergraduates (aOR 3.2, 95% CI 2.0-4.9), living in halls of residence (aOR 2.1, 95% CI 1.7-2.7) and sharing a kitchen with an increasing number of students (shared with 4-7 individuals, aOR 1.43, 95% CI 1.12-1.82; shared with 8 or more individuals, aOR 1.53, 95% CI 1.04-2.24). Seropositivity was 49% in students living in halls of residence that reported high SARS-CoV-2 infection rates (>8%) during the autumn term.

Interpretation: Despite large numbers of cases and outbreaks in universities, less than one in five students (17.8%) overall had SARS-CoV-2 antibodies at the end of the autumn term in England. In university halls of residence affected by a COVID-19 outbreak, however, nearly half the resident students became infected and developed SARS-CoV-2 antibodies.
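The adjusted odds ratios above come from multivariable logistic regression of seropositivity on student characteristics. A minimal sketch of that kind of model on synthetic data (hypothetical variable names and effect sizes, not the study's dataset):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000

# Synthetic cohort loosely mimicking the predictors discussed in the abstract.
df = pd.DataFrame({
    "first_year":    rng.integers(0, 2, n),   # proxy for being younger
    "halls":         rng.integers(0, 2, n),   # lives in halls of residence
    "kitchen_share": rng.integers(1, 10, n),  # number sharing a kitchen
})
logit_p = -2.0 + 1.0 * df["first_year"] + 0.7 * df["halls"] + 0.05 * df["kitchen_share"]
df["seropositive"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

model = smf.logit("seropositive ~ first_year + halls + kitchen_share", data=df).fit(disp=False)
print(np.exp(model.params).round(2))      # adjusted odds ratios
print(np.exp(model.conf_int()).round(2))  # 95% confidence intervals
```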

Source
http://dx.doi.org/10.1016/j.jinf.2021.04.028
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8081745
July 2021

Peritraumatic plasma Omega-3 fatty acid concentration predicts chronic pain severity following thermal burn injury.

J Burn Care Res 2021 Apr 25. Epub 2021 Apr 25.

Institute for Trauma Recovery.

Chronic pain is a significant co-morbidity of burn injury affecting up to 60% of survivors. Currently, no treatments are available to prevent chronic pain after burn injury. Accumulating evidence suggests that omega-3 fatty acids (O3FA) improve symptoms across a range of painful conditions. In this study, we evaluated whether low peritraumatic levels of O3FA predict greater pain severity during the year after burn injury. Burn survivors undergoing skin autograft were recruited from three participating burn centers. Plasma O3FA levels (n=77) were assessed in the early aftermath of burn injury using liquid chromatography/mass spectrometry, and pain severity was assessed via the 0-10 numeric rating scale for 1 year following burn injury. Repeated-measures linear regression analyses were used to evaluate the association between peritraumatic O3FA concentrations and pain severity during the year following burn injury. Peritraumatic O3FA concentration and chronic pain severity were inversely related; lower levels of peritraumatic O3FA predicted worse pain outcomes (β=-.002, p=.020). Future studies are needed to evaluate biological mechanisms mediating this association and to assess the ability of O3FA to prevent chronic pain following burn injury.

Source
http://dx.doi.org/10.1093/jbcr/irab071
April 2021

Financial Considerations Associated With a Fourth Year of Residency Training in Family Medicine: Findings From the Length of Training Pilot Study.

Fam Med 2021 Apr;53(4):256-266

Oregon Health & Science University, Portland, OR.

Background And Objectives: The feasibility of funding an additional year of residency training is unknown, as are perspectives of residents regarding related financial considerations. We examined these issues in the Family Medicine Length of Training Pilot.

Methods: Between 2013 and 2019, we collected data on matched 3-year and 4-year programs using annual surveys, focus groups, and in-person and telephone interviews. We analyzed survey quantitative data using descriptive statistics, independent samples t test, Fisher's Exact Test and χ2. Qualitative analyses involved identifying emergent themes, defining them and presenting exemplars.

Results: Postgraduate year (PGY)-4 residents in 4-year programs were more likely to moonlight to supplement their resident salaries than PGY-3 residents in three-year programs (41.6% vs 23.0%; P=.002), though their student debt load was similar. We found no differences in enrollment in loan repayment programs or pretax income. Programs' descriptions of financing a fourth year, as reported by program directors, were limited, and budget numbers could not be obtained. However, programs that required a fourth year typically reported extensive planning to determine how to fund the additional year. Programs with an optional fourth year were budget neutral because few residents chose to undertake an additional year of training. Resources needed for a required fourth year included resident salaries for the fourth year, one additional faculty member, and one staff member to assist with more complex scheduling. Residents' concerns about financial issues varied widely.

Conclusions: Adding a fourth year of training was financially feasible, but the details were local and programs could not be compared directly. Programs with a required rather than an optional fourth year needed considerably more financial planning.

Source
http://dx.doi.org/10.22454/FamMed.2021.406778
April 2021

Can Machines Find the Bilingual Advantage? Machine Learning Algorithms Find No Evidence to Differentiate Between Lifelong Bilingual and Monolingual Cognitive Profiles.

Front Hum Neurosci 2021 22;15:621772. Epub 2021 Mar 22.

Department of Psychology, Swansea University, Swansea, United Kingdom.

Bilingualism has been identified as a potential cognitive factor linked to delayed onset of dementia as well as boosting executive functions in healthy individuals. However, more recently, this claim has been called into question following several failed replications. It remains unclear whether these contradictory findings reflect how bilingualism is defined between studies, or methodological limitations when measuring the bilingual effect. One key issue is that despite the claims that bilingualism yields general protection to cognitive processes (i.e., the cognitive reserve hypothesis), studies reporting putative bilingual differences are often focused on domain-specific experimental paradigms. This study chose a broader approach, by considering the consequences of bilingualism on a wide range of cognitive functions within individuals. We utilised 19 measures of different cognitive functions commonly associated with bilingual effects to form a "cognitive profile" for 215 non-clinical participants. We recruited Welsh speakers, who as a group of bilinguals were highly homogeneous, as a means of isolating the bilingualism criterion. We sought to determine if such analyses would independently classify bilingual/monolingual participant groups based on emergent patterns driven by the collected cognitive profiles, such that population differences would emerge. Multiple predictive models were trained to independently recognise the cognitive profiles of bilinguals, of older adults (60-90 years of age) and of participants with higher educational attainment. Despite successfully classifying cognitive profiles based on age and education, the models failed to differentiate between bilingual and monolingual cognitive ability at a rate greater than chance. Repeated modelling using alternative definitions of bilingualism, and using just the older adults, yielded similar results. In all cases, then, using our "bottom-up" analytical approach, there was no evidence that bilingualism as a variable indicated differential cognitive performance; as a consequence, we conclude that bilinguals are not cognitively different from their monolingual counterparts, even in older demographics. We suggest that studies that have reported a bilingual advantage (typically recruiting immigrant populations) could well have confounded other key variables that may be driving reported advantages. We recommend that future research refine the machine learning methods used in this study to further investigate the complex relationship between bilingualism and cognition.
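The central analysis described above asks whether a classifier trained on multi-measure cognitive profiles can recover bilingual versus monolingual group membership at better than chance. A minimal cross-validated sketch of that logic (synthetic data, an arbitrary classifier; not the study's features, models or preprocessing):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import permutation_test_score

rng = np.random.default_rng(42)
n_participants, n_measures = 215, 19                   # sizes echo the abstract; data are synthetic
X = rng.normal(size=(n_participants, n_measures))      # simulated "cognitive profiles"
y = rng.integers(0, 2, n_participants)                 # bilingual (1) vs monolingual (0) labels

clf = RandomForestClassifier(n_estimators=200, random_state=0)

# Permutation test: is cross-validated accuracy better than chance relabelling?
score, perm_scores, p_value = permutation_test_score(
    clf, X, y, cv=5, n_permutations=100, random_state=0
)
print(f"CV accuracy: {score:.2f} (chance ~= {perm_scores.mean():.2f}), permutation p = {p_value:.3f}")
```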

Source
http://dx.doi.org/10.3389/fnhum.2021.621772
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8019743
March 2021

SARS-CoV-2 infection and transmission in primary schools in England in June-December, 2020 (sKIDs): an active, prospective surveillance study.

Lancet Child Adolesc Health 2021 06 17;5(6):417-427. Epub 2021 Mar 17.

National Infection Service, Public Health England, London, UK.

Background: Little is known about the risk of SARS-CoV-2 infection and transmission in educational settings. Public Health England initiated a study, COVID-19 Surveillance in School KIDs (sKIDs), in primary schools when they partially reopened from June 1, 2020, after the first national lockdown in England to estimate the incidence of symptomatic and asymptomatic SARS-CoV-2 infection, seroprevalence, and seroconversion in staff and students.

Methods: sKIDs, an active, prospective, surveillance study, included two groups: the weekly swabbing group and the blood sampling group. The swabbing group underwent weekly nasal swabs for at least 4 weeks after partial school reopening during the summer half-term (June to mid-July, 2020). The blood sampling group additionally underwent blood sampling for serum SARS-CoV-2 antibodies to measure previous infection at the beginning (June 1-19, 2020) and end (July 3-23, 2020) of the summer half-term, and, after full reopening in September, 2020, and at the end of the autumn term (Nov 23-Dec 18, 2020). We tested for predictors of SARS-CoV-2 antibody positivity using logistic regression. We calculated antibody seroconversion rates for participants who were seronegative in the first round and were tested in at least two rounds.

Findings: During the summer half-term, 11 966 participants (6727 students, 4628 staff, and 611 with unknown staff or student status) in 131 schools had 40 501 swabs taken. Weekly SARS-CoV-2 infection rates were 4·1 (one of 24 463; 95% CI 0·1-21·8) per 100 000 students and 12·5 (two of 16 038; 1·5-45·0) per 100 000 staff. At recruitment, in 45 schools, 91 (11·2%; 95% CI 7·9-15·1) of 816 students and 209 (15·1%; 11·9-18·9) of 1381 staff members were positive for SARS-CoV-2 antibodies, similar to local community seroprevalence. Seropositivity was not associated with school attendance during lockdown (p=0·13 for students and p=0·20 for staff) or staff contact with students (p=0·37). At the end of the summer half-term, 603 (73·9%) of 816 students and 1015 (73·5%) of 1381 staff members were still participating in the surveillance, and five (four students, one staff member) seroconverted. By December, 2020, 55 (5·1%; 95% CI 3·8-6·5) of 1085 participants who were seronegative at recruitment (in June, 2020) had seroconverted, including 19 (5·6%; 3·4-8·6) of 340 students and 36 (4·8%; 3·4-6·6) of 745 staff members (p=0·60).

Interpretation: In England, SARS-CoV-2 infection rates were low in primary schools following their partial and full reopening in June and September, 2020.

Funding: UK Department of Health and Social Care.

Source
http://dx.doi.org/10.1016/S2352-4642(21)00061-4
June 2021

Ageing and multisensory integration: A review of the evidence, and a computational perspective.

Cortex 2021 05 11;138:1-23. Epub 2021 Feb 11.

Donders Institute for Brain, Cognition & Behaviour, Radboud University, Nijmegen, the Netherlands.

The processing of multisensory signals is crucial for effective interaction with the environment, but our ability to perform this vital function changes as we age. In the first part of this review, we summarise existing research into the effects of healthy ageing on multisensory integration. We note that age differences vary substantially with the paradigms and stimuli used: older adults often receive at least as much benefit (to both accuracy and response times) as younger controls from congruent multisensory stimuli, but are also consistently more negatively impacted by the presence of intersensory conflict. In the second part, we outline a normative Bayesian framework that provides a principled and computationally informed perspective on the key ingredients involved in multisensory perception, and how these are affected by ageing. Applying this framework to the existing literature, we conclude that changes to sensory reliability, prior expectations (together with attentional control), and decisional strategies all contribute to the age differences observed. However, we find no compelling evidence of any age-related changes to the basic inference mechanisms involved in multisensory perception.

Source
http://dx.doi.org/10.1016/j.cortex.2021.02.001
May 2021

Single-center Experience with Venous Thromboembolism Prophylaxis for Obese Burn Patients.

J Burn Care Res 2021 05;42(3):365-368

North Carolina Jaycee Burn Center, Chapel Hill, North Carolina.

Burn-injured patients are at high risk of thromboembolic complications. Morbid obesity further increases this risk. Our objective was to evaluate the efficacy of enoxaparin dosed 40 mg twice daily in achieving prophylactic plasma anti-Xa levels in obese burn patients. A retrospective chart review from November 2018 until September 2019 identified patients who were either ≥100 kg or had a body mass index ≥30 kg/m² and were initiated on enoxaparin 40 mg twice daily for venous thromboembolism prophylaxis. Patients were ≥18 yr of age and received ≥3 sequential doses of enoxaparin with appropriately timed peak plasma anti-Xa levels to monitor efficacy. One hundred forty-eight patients were screened, with 43 patients included for analysis. Forty-two percent of the patients did not reach target peak plasma anti-Xa levels (0.2-0.5 IU/ml) on enoxaparin 40 mg twice daily. Patients who did not meet prophylactic target levels were more likely to be male (P < 0.05) and have an increased mean body weight (129 ± 24 kg vs 110 ± 16 kg, P < 0.05). Thirteen out of 18 patients received dosage adjustments with subsequent anti-Xa levels available for follow-up assessment, of which an additional six patients required further dosage adjustment to meet prophylactic goals. Current utilization of a fixed 40 mg twice daily regimen of enoxaparin for venous thromboembolism (VTE) prophylaxis is inadequate to meet target prophylactic peak plasma anti-Xa levels in the obese burn patient population. Dose adjusting enoxaparin to target anti-Xa levels to reduce VTE rates in obese burn patients should be further evaluated.

Source
http://dx.doi.org/10.1093/jbcr/irab039
May 2021

Genetic determinants of daytime napping and effects on cardiometabolic health.

Nat Commun 2021 02 10;12(1):900. Epub 2021 Feb 10.

Center for Genomic Medicine, Massachusetts General Hospital and Harvard Medical School, Boston, MA, USA.

Daytime napping is a common, heritable behavior, but its genetic basis and causal relationship with cardiometabolic health remain unclear. Here, we perform a genome-wide association study of self-reported daytime napping in the UK Biobank (n = 452,633) and identify 123 loci of which 61 replicate in the 23andMe research cohort (n = 541,333). Findings include missense variants in established drug targets for sleep disorders (HCRTR1, HCRTR2), genes with roles in arousal (TRPC6, PNOC), and genes suggesting an obesity-hypersomnolence pathway (PNOC, PATJ). Association signals are concordant with accelerometer-measured daytime inactivity duration and 33 loci colocalize with loci for other sleep phenotypes. Cluster analysis identifies three distinct clusters of nap-promoting mechanisms with heterogeneous associations with cardiometabolic outcomes. Mendelian randomization shows potential causal links between more frequent daytime napping and higher blood pressure and waist circumference.

Source
http://dx.doi.org/10.1038/s41467-020-20585-3
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7876146
February 2021

Peritraumatic Vitamin D levels predict chronic pain severity and contribute to racial differences in pain outcomes following Major Thermal Burn Injury.

J Burn Care Res 2021 Feb 10. Epub 2021 Feb 10.

Institute for Trauma Recovery.

Major thermal burn injuries result in approximately 40,000 hospitalizations in the United States each year. Chronic pain affects up to 60% of burn survivors, and Black Americans have worse chronic pain outcomes than White Americans. Mechanisms of chronic pain pathogenesis after burn injury, and the reasons for these racial differences, remain poorly understood. Due to socioeconomic disadvantage and differences in skin absorption, Black Americans have an increased prevalence of Vitamin D deficiency. We hypothesized that peritraumatic Vitamin D levels predict chronic pain outcomes after burn injury and contribute to racial differences in pain outcomes. Among burn survivors (n=77, 52% White, 48% Black, 77% male), peritraumatic Vitamin D levels were more likely to be deficient in Black than in White participants (27/37 (73%) vs. 14/40 (35%), p<.001). Peritraumatic Vitamin D levels were inversely associated with chronic post-burn pain outcomes across all burn injury survivors, including those who were and were not Vitamin D deficient, and accounted for approximately one third of racial differences in post-burn pain outcomes. Future studies are needed to evaluate potential mechanisms mediating the effect of Vitamin D on post-burn pain outcomes and the potential efficacy of Vitamin D in improving pain outcomes and reducing racial differences.

Source
http://dx.doi.org/10.1093/jbcr/irab031
February 2021

The Bloom syndrome complex senses RPA-coated single-stranded DNA to restart stalled replication forks.

Nat Commun 2021 01 26;12(1):585. Epub 2021 Jan 26.

MRC Weatherall Institute of Molecular Medicine, University of Oxford, John Radcliffe Hospital, Oxford, OX3 9DS, UK.

The Bloom syndrome helicase BLM interacts with topoisomerase IIIα (TOP3A), RMI1 and RMI2 to form the BTR complex, which dissolves double Holliday junctions to produce non-crossover homologous recombination (HR) products. BLM also promotes DNA-end resection, restart of stalled replication forks, and processing of ultra-fine DNA bridges in mitosis. How these activities of the BTR complex are regulated in cells is still unclear. Here, we identify multiple conserved motifs within the BTR complex that interact cooperatively with the single-stranded DNA (ssDNA)-binding protein RPA. Furthermore, we demonstrate that RPA-binding is required for stable BLM recruitment to sites of DNA replication stress and for fork restart, but not for its roles in HR or mitosis. Our findings suggest a model in which the BTR complex contains the intrinsic ability to sense levels of RPA-ssDNA at replication forks, which controls BLM recruitment and activation in response to replication stress.

Source
http://dx.doi.org/10.1038/s41467-020-20818-5
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7838300
January 2021

Predictive Processing and Developmental Language Disorder.

J Speech Lang Hear Res 2021 01 29;64(1):181-185. Epub 2020 Dec 29.

Department of Psychology, Lancaster University, United Kingdom.

Purpose: Research in the cognitive and neural sciences has situated predictive processing (the anticipation of upcoming percepts) as a dominant function of the brain. The purpose of this article is to argue that prediction should feature more prominently in explanatory accounts of sentence processing and comprehension deficits in developmental language disorder (DLD).

Method: We evaluate behavioral and neurophysiological data relevant to the theme of prediction in early typical and atypical language acquisition and processing.

Results: Poor syntactic awareness, attributable in part to an underlying statistical learning deficit, is likely to impede syntax-based predictive processing in children with DLD, conferring deficits in spoken sentence comprehension. Furthermore, there may be a feedback cycle in which poor syntactic awareness impedes children's ability to anticipate upcoming percepts, and this, in turn, makes children unable to improve their syntactic awareness on the basis of prediction error signals.

Conclusion: This article offers a refocusing of theory on sentence processing and comprehension deficits in DLD, from a difficulty in processing and integrating perceived syntactic features to a difficulty in anticipating what is coming next.

Source
http://dx.doi.org/10.1044/2020_JSLHR-20-00409
January 2021

Sleep characteristics across the lifespan in 1.1 million people from the Netherlands, United Kingdom and United States: a systematic review and meta-analysis.

Nat Hum Behav 2021 01 16;5(1):113-122. Epub 2020 Nov 16.

Research Institute of Child Development and Education, University of Amsterdam, Amsterdam, the Netherlands.

We aimed to obtain reliable reference charts for sleep duration, estimate the prevalence of sleep complaints across the lifespan and identify risk indicators of poor sleep. Studies were identified through a systematic literature search in Embase, Medline and Web of Science (9 August 2019) and through personal contacts. Eligible studies had to be published between 2000 and 2017 with data on sleep assessed with questionnaires including ≥100 participants from the general population. We assembled individual participant data from 200,358 people (aged 1-100 years, 55% female) from 36 studies from the Netherlands, 471,759 people (40-69 years, 55.5% female) from the United Kingdom and 409,617 people (≥18 years, 55.8% female) from the United States. One in four people slept less than age-specific recommendations, but only 5.8% slept outside of the 'acceptable' sleep duration. Among teenagers, 51.5% reported total sleep times (TST) of less than the recommended 8-10 h and 18% reported daytime sleepiness. In adults (≥18 years), poor sleep quality (13.3%) and insomnia symptoms (9.6-19.4%) were more prevalent than short sleep duration (6.5% with TST < 6 h). Insomnia symptoms were most frequent in people spending ≥9 h in bed, whereas poor sleep quality was more frequent in those spending <6 h in bed. TST was similar across countries, but insomnia symptoms were 1.5-2.9 times higher in the United States. Women (≥41 years) reported sleeping shorter times or slightly less efficiently than men, whereas with actigraphy they were estimated to sleep longer and more efficiently than men. This study provides age- and sex-specific population reference charts for sleep duration and efficiency, which can help guide personalized advice on sleep length and preventive practices.

Source
http://dx.doi.org/10.1038/s41562-020-00965-x
January 2021

Is disrupted sleep a risk factor for Alzheimer's disease? Evidence from a two-sample Mendelian randomization analysis.

Int J Epidemiol 2021 07;50(3):817-828

MRC Integrative Epidemiology Unit, at the University of Bristol, Bristol, UK.

Background: It is established that Alzheimer's disease (AD) patients experience sleep disruption. However, it remains unknown whether disruption in the quantity, quality or timing of sleep is a risk factor for the onset of AD.

Methods: We used the largest published genome-wide association studies of self-reported and accelerometer-measured sleep traits (chronotype, duration, fragmentation, insomnia, daytime napping and daytime sleepiness), and AD. Mendelian randomization (MR) was used to estimate the causal effect of self-reported and accelerometer-measured sleep parameters on AD risk.

Results: Overall, there was little evidence to support a causal effect of sleep traits on AD risk. There was some suggestive evidence that self-reported daytime napping was associated with lower AD risk [odds ratio (OR): 0.70, 95% confidence interval (CI): 0.50-0.99]. Some other sleep traits (accelerometer-measured 'eveningness' and sleep duration, and self-reported daytime sleepiness) had ORs of a similar magnitude to daytime napping, but were less precisely estimated.

Conclusions: Overall, we found very limited evidence to support a causal effect of sleep traits on AD risk. Our findings provide tentative evidence that daytime napping may reduce AD risk. Given that this is the first MR study of multiple self-report and objective sleep traits on AD risk, findings should be replicated using independent samples when such data become available.

Source
http://dx.doi.org/10.1093/ije/dyaa183
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8271193
July 2021

Global Health in Preconception, Pregnancy and Postpartum Alliance: development of an international consumer and community involvement framework.

Res Involv Engagem 2020 10;6:47. Epub 2020 Aug 10.

Monash Centre for Health Research and Implementation, School of Public Health and Preventive Medicine, Monash University, Clayton, Victoria Australia.

Background: The goal of the Global Health in Preconception, Pregnancy and Postpartum (HiPPP) Alliance, comprising consumers and leading international multidisciplinary academics and clinicians, is to generate research and translation priorities and build international collaboration around healthy lifestyle and obesity prevention among women across the reproductive years. In doing so, we actively seek to involve consumers in research, implementation and translation initiatives. There are limited frameworks specifically designed to involve women across the key obesity prevention windows before (preconception), during and after pregnancy (postpartum). The aim of this paper is to outline our strategy for the development of the HiPPP Consumer and Community Involvement (CCI) Framework, with consumers as central to co-designed, co-implemented and co-disseminated research and translation.

Method: The development of the framework involved three phases: In Phase 1, 21 Global HiPPP Alliance members participated in a CCI workshop to propose and discuss values and approaches for framework development; Phase 2 comprised a search of peer-reviewed and grey literature for existing CCI frameworks and resources; and Phase 3 entailed collaboration with consumers (i.e., members of the public with lived experience of weight/lifestyle issues in preconception, pregnancy and postpartum) and international CCI experts to workshop and refine the HiPPP CCI Framework (guided by Phases 1 and 2).

Results: The HiPPP CCI Framework's values and approaches identified in Phases 1-2 and further refined in Phase 3 were summarized under the following five key principles: 1. Inclusive, 2. Flexible, 3. Transparent, 4. Equitable, and 5. Adaptable. The HiPPP Framework describes values and approaches for involving consumers in research initiatives from design to translation that focus on improving healthy lifestyles and preventing obesity specifically before, during and after pregnancy; importantly it takes into consideration common barriers to partnering in obesity research during perinatal life stages, such as limited availability associated with family caregiving responsibilities.

Conclusion: The HiPPP CCI Framework aims to describe approaches for implementing meaningful CCI initiatives with women in preconception, pregnancy and postpartum periods. Evaluation of the framework is now needed to understand how effective it is in facilitating meaningful involvement for consumers, researchers and clinicians, and its impact on research to improve healthy lifestyle outcomes.

Source
http://dx.doi.org/10.1186/s40900-020-00218-1
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7419190
August 2020

A single nucleotide polymorphism genetic risk score to aid diagnosis of coeliac disease: a pilot study in clinical care.

Aliment Pharmacol Ther 2020 10 13;52(7):1165-1173. Epub 2020 Aug 13.

Institute of Biomedical and Clinical Science, University of Exeter Medical School, Exeter, UK.

Background: Single nucleotide polymorphism-based genetic risk scores (GRS) model genetic risk as a continuum and can discriminate coeliac disease but have not been validated in clinic. Human leukocyte antigen (HLA) DQ gene testing is available in clinic but does not include non-HLA attributed risk and is limited by discrete risk stratification.

Aims: To accurately characterise both HLA and non-HLA coeliac disease genetic risk as a single nucleotide polymorphism-based GRS and evaluate diagnostic utility.

Methods: We developed a 42 single nucleotide polymorphism coeliac disease GRS from a European case-control study (12 041 cases vs 12 228 controls) using HLA-DQ imputation and published genome-wide association studies. We validated the GRS in UK Biobank (1237 cases) and developed direct genotyping assays. We tested the coeliac disease GRS in a pilot clinical cohort of 128 children presenting with suspected coeliac disease.

Results: The GRS was more discriminative of coeliac disease than HLA-DQ stratification in UK Biobank (receiver operating characteristic area under the curve [ROC-AUC] = 0.88 [95% CIs: 0.87-0.89] vs 0.82 [95% CIs: 0.80-0.83]). We demonstrated similar discrimination in the pilot clinical cohort (114 cases vs 40 controls, ROC-AUC = 0.84 [95% CIs: 0.76-0.91]). As a rule-out test, no children with coeliac disease in the clinical cohort had a GRS below the 38th population centile.

Conclusions: A single nucleotide polymorphism-based GRS may offer more effective and cost-efficient testing of coeliac disease genetic risk in comparison to HLA-DQ stratification. As a comparatively inexpensive test it could facilitate non-invasive coeliac disease diagnosis but needs detailed assessment in the context of other diagnostic tests and against current diagnostic algorithms.
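A single nucleotide polymorphism GRS of this kind is, in essence, a weighted sum of risk-allele dosages whose discrimination is then assessed with a ROC curve, and a centile of the population score distribution can serve as a rule-out threshold. A minimal sketch under those assumptions (random genotypes and weights; not the published 42-SNP score or its validated weights):

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n_people, n_snps = 1000, 42                      # 42 SNPs echoes the abstract; data are synthetic

dosages = rng.integers(0, 3, size=(n_people, n_snps)).astype(float)  # 0/1/2 risk-allele counts
weights = rng.normal(0.2, 0.1, n_snps)           # hypothetical per-SNP log-odds weights

grs = dosages @ weights                          # weighted allele-dose sum = genetic risk score

# Simulate case/control status so higher GRS means higher risk, then measure discrimination.
prob  = 1 / (1 + np.exp(-(grs - grs.mean())))
cases = rng.random(n_people) < prob
print(f"ROC-AUC of the synthetic GRS: {roc_auc_score(cases, grs):.2f}")

# A rule-out threshold expressed as a population centile of the score.
print(f"38th-centile GRS threshold: {np.percentile(grs, 38):.2f}")
```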

Source
http://dx.doi.org/10.1111/apt.15826
October 2020

Structure-Based Virtual Screening, Synthesis and Biological Evaluation of Potential FAK-FAT Domain Inhibitors for Treatment of Metastatic Cancer.

Molecules 2020 Jul 31;25(15). Epub 2020 Jul 31.

School of Pharmacy and Pharmaceutical Sciences, Cardiff University, Wales CF10 3NB, UK.

Focal adhesion kinase (FAK) is a tyrosine kinase that is overexpressed and activated in several advanced-stage solid cancers. In cancer cells, FAK promotes the progression and metastasis of tumours. In this study, we used structure-based virtual screening to filter a library of more than 210K compounds against the FAK focal adhesion targeting (FAT) domain to identify 25 virtual hit compounds, which were screened in an invasive breast cancer cell line (MDA-MB-231). Most notably, one hit compound showed low micromolar antiproliferative activity, as well as antimigratory activity. Moreover, examination in a model of triple negative breast cancer (TNBC) revealed that, despite not affecting FAK phosphorylation, this compound significantly impairs proliferation whilst impairing focal adhesion growth and turnover, leading to reduced migration. Further optimisation and synthesis of analogues of the lead compound using a four-step synthetic procedure were performed, and the analogues were assessed for their antiproliferative activity against three breast cancer (MDA-MB-231, T47D, BT474) cell lines and one pancreatic cancer (MIAPaCa2) cell line. One analogue was identified as a promising lead compound, with IC50 values in the range of 4.59-5.28 μM in MDA-MB-231, T47D, BT474, and MIAPaCa2 cells. Molecular modelling and pharmacokinetic studies provided more insight into the therapeutic features of this new series.

Source
http://dx.doi.org/10.3390/molecules25153488
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7435868
July 2020

Redesigning Primary Care to Address the COVID-19 Pandemic in the Midst of the Pandemic.

Ann Fam Med 2020 07;18(4):349-354

Inova Health System, Fairfax, Virginia.

During a pandemic, primary care is the first line of defense. It is able to reinforce public health messages, help patients manage at home, and identify those in need of hospital care. In response to the COVID-19 pandemic, primary care scrambled to rapidly transform itself and protect clinicians, staff, and patients while remaining connected to patients. Using the established public health framework for addressing a pandemic, we describe the actions primary care needs to take in a pandemic. Recommended actions are based on the observed experiences of the authors' primary care practices and networks. Early in the COVID-19 pandemic, tasks focused on promoting physical distancing and encouraging patients with suspected illness or exposure to self-quarantine. Testing was not available and contact tracing was not possible. As the pandemic spread, in-person care was converted to virtual care using telehealth. Practices remained connected to patients using registries to reach out to those at risk for infection, with uncontrolled chronic conditions, or who were socially vulnerable. Practices managed most patients with suspected COVID-19 at home. As the pandemic decelerates, practices are now preparing to address the direct and indirect consequences: complications from COVID-19 infections, missed treatment for acute problems, inadequate prevention, uncontrolled chronic disease, mental illness, and greater social needs. Throughout, practices bore a tremendous financial burden, laying off staff or even closing at a time when they were most needed. Primary care must learn from this experience and be ready for the next pandemic. Policymakers and payers cannot fail primary care during its next time of need.

Source
http://dx.doi.org/10.1370/afm.2557
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7358035
July 2020

Work-Related Burn Injuries in a Tertiary Care Burn Center, 2013 to 2018.

J Burn Care Res 2020 09;41(5):1009-1014

Department of Surgery, University of North Carolina at Chapel Hill, North Carolina, USA.

The features of work-related burn (WRB) injuries are not well defined in the literature, and they vary depending on geographical location. We wanted to describe these characteristics among patients treated in the UNC Burn Center to evaluate the potential impact of commonly accepted prevention efforts. Adults of working age, admitted between January 1, 2013, and December 31, 2018, were identified using our Burn Center Registry. Demographic data, characteristics of injury, course of treatment, and patient outcomes were described. Differences between work-related and non-work-related injuries were evaluated using the Chi-square test and Student t-test where appropriate. Three thousand five hundred and forty-five patients were included. WRB cases constituted 18% of the study population, and this proportion remained relatively stable during the study timeframe. Young white males made up the majority of this group. When compared with non-WRB patients, they were characterized by fewer co-morbidities, smaller TBSA burns, decreased risk of inhalation injury, shorter intensive care treatment, shorter lengths of hospital stay, and lower treatment costs. In contrast to non-WRB patients, among whom flame injuries were the main reason for admission, work-related patients most often suffered scald burns. They also had a dramatically increased proportion of chemical and electrical burns, with the latter being the most common cause of death in that group. WRB injuries are characterized by a distinct patient profile, burn etiologies, and outcomes. Understanding the specific patterns in this group may help optimize work safety regulations and medical interventions.

Source
http://dx.doi.org/10.1093/jbcr/iraa105
September 2020

Osteomyelitis Increases the Rate of Amputation in Patients With Type 2 Diabetes and Lower Extremity Burns.

J Burn Care Res 2020 09;41(5):981-985

Department of Surgery, University of North Carolina School of Medicine, Chapel Hill.

In patients with diabetes mellitus (DM), amputation rates exceed 30% when lower extremity osteomyelitis is present. We sought to determine the rate of osteomyelitis and any subsequent amputation in our patients with DM and lower extremity burns. We performed a single-site, retrospective review at our burn center using the institutional burn center registry, linked to clinical and administrative data. Adults (≥18 years old) with DM admitted from January 1, 2014 to December 31, 2018 for isolated lower extremity burns were eligible for inclusion. We evaluated demographics, burn characteristics, comorbidities, presence of radiologically confirmed osteomyelitis, length of stay (LOS), inpatient hospitalization costs, and amputation rate at 3 months and 12 months after injury. We identified 103 patients with DM and isolated lower extremity burns. Of these, 88 patients did not have osteomyelitis, while 15 patients had radiologically confirmed osteomyelitis within 3 months of the burn injury. Compared to patients without osteomyelitis, patients with osteomyelitis had significantly increased LOS (average LOS 22.7 days vs 12.1 days, P = .0042), inpatient hospitalization costs (average $135,345 vs $62,237, P = .0008), amputation rate within 3 months (66.7% vs 5.70%, P < .00001), and amputation rate within 12 months (66.7% vs 9.1%, P < .0001). The two groups were otherwise similar in demographics, burn injury characteristics, access to healthcare, and preexisting comorbidities. Patients with DM and lower extremity burns incurred increased LOS, higher inpatient hospitalization costs, and increased amputation rates if radiologically confirmed osteomyelitis was present within 3 months of the burn injury.

Source
http://dx.doi.org/10.1093/jbcr/iraa106
September 2020

Genetic evidence that higher central adiposity causes gastro-oesophageal reflux disease: a Mendelian randomization study.

Int J Epidemiol 2020 08;49(4):1270-1281

Genetics of Complex Traits, University of Exeter Medical School, Exeter, UK.

Background: Gastro-oesophageal reflux disease (GORD) is associated with multiple risk factors but determining causality is difficult. We used a genetic approach [Mendelian randomization (MR)] to identify potential causal modifiable risk factors for GORD.

Methods: We used data from 451 097 European participants in the UK Biobank and defined GORD using hospital-defined ICD10 and OPCS4 codes and self-report data (N = 41 024 GORD cases). We tested observational and MR-based associations between GORD and four adiposity measures [body mass index (BMI), waist-hip ratio (WHR), a metabolically favourable higher body-fat percentage and waist circumference], smoking status, smoking frequency and caffeine consumption.

Results: Observationally, all adiposity measures were associated with higher odds of GORD. Ever and current smoking were associated with higher odds of GORD. Coffee consumption was associated with lower odds of GORD but, among coffee drinkers, more caffeinated-coffee consumption was associated with higher odds of GORD. Using MR, we provide strong evidence that higher WHR and higher WHR adjusted for BMI lead to GORD. There was weak evidence that higher BMI, body-fat percentage, coffee drinking or smoking caused GORD, but only the observational effects for BMI and body-fat percentage could be excluded. This MR estimated effect for WHR equates to a 1.23-fold higher odds of GORD per 5-cm increase in waist circumference.

Conclusions: These results provide strong evidence that a higher waist-hip ratio leads to GORD. Our study suggests that central fat distribution is crucial in causing GORD rather than overall weight.

Source
http://dx.doi.org/10.1093/ije/dyaa082
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7750946
August 2020

Ecological drivers of global gradients in avian dispersal inferred from wing morphology.

Nat Commun 2020 05 18;11(1):2463. Epub 2020 May 18.

Department of Zoology, University of Oxford, South Parks Road, Oxford, OX1 3PS, UK.

An organism's ability to disperse influences many fundamental processes, from speciation and geographical range expansion to community assembly. However, the patterns and underlying drivers of variation in dispersal across species remain unclear, partly because standardised estimates of dispersal ability are rarely available. Here we present a global dataset of avian hand-wing index (HWI), an estimate of wing shape widely adopted as a proxy for dispersal ability in birds. We show that HWI is correlated with geography and ecology across 10,338 (>99%) species, increasing at higher latitudes and with migration, and decreasing with territoriality. After controlling for these effects, the strongest predictor of HWI is temperature variability (seasonality), with secondary effects of diet and habitat type. Finally, we also show that HWI is a strong predictor of geographical range size. Our analyses reveal a prominent latitudinal gradient in HWI shaped by a combination of environmental and behavioural factors, and also provide a global index of avian dispersal ability for use in community ecology, macroecology, and macroevolution.
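The hand-wing index itself is a simple ratio of two wing measurements. One common formulation, assumed here since the abstract does not spell it out, expresses Kipp's distance as a percentage of wing length:

```python
def hand_wing_index(wing_length_mm: float, kipps_distance_mm: float) -> float:
    """Hand-wing index: Kipp's distance (tip of the first secondary to the wing
    tip on the folded wing) as a percentage of wing length. Assumed formulation;
    the measurements below are illustrative, not from the published dataset."""
    return 100.0 * kipps_distance_mm / wing_length_mm

print(hand_wing_index(wing_length_mm=70.0, kipps_distance_mm=21.0))  # 30.0
```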

Source
http://dx.doi.org/10.1038/s41467-020-16313-6
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7235233
May 2020