Publications by authors named "Samantha Wilkinson"

23 Publications


Ixodes ricinus and Borrelia burgdorferi sensu lato in the Royal Parks of London, UK.

Exp Appl Acarol 2021 Jul 14;84(3):593-606. Epub 2021 Jun 14.

Medical Entomology & Zoonoses Ecology, Emergency Response Department Science & Technology, Public Health England, Porton Down, UK.

Assessing the risk of tick-borne disease in areas with high visitor numbers is important from a public health perspective. Evidence suggests that tick presence, density, infection prevalence and the density of infected ticks can vary between habitats within urban green space, suggesting that the risk of Lyme borreliosis transmission can also vary. This study assessed nymph density, Borrelia prevalence and the density of infected nymphs across a range of habitat types in nine London parks that receive millions of visitors each year. Ixodes ricinus were found in only two of the nine locations sampled, where they occurred in all habitat types surveyed. Established I. ricinus populations were identified in the two largest parks, both of which had resident free-roaming deer populations. The highest densities of nymphs (15.68 per 100 m²) and infected nymphs (1.22 per 100 m²) were associated with woodland and under-canopy habitats in Richmond Park, but ticks infected with Borrelia were found across all habitat types surveyed. Nymphs infected with Borrelia (7.9%) were reported only from Richmond Park, where Borrelia burgdorferi sensu stricto and Borrelia afzelii were identified as the dominant genospecies. Areas with short grass appeared to be less suitable for ticks, and maintaining short grass in high-footfall areas could be a good strategy for reducing the risk of Lyme borreliosis transmission to humans in such settings. In areas where this would conflict with existing practices aimed at improving and/or meeting historic landscape, biodiversity and public access goals, promoting public awareness of tick-borne disease risks could be used instead.
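
The headline figures quoted above are linked by a simple relationship: the density of infected nymphs is approximately the nymph density multiplied by the infection prevalence. Below is a minimal, illustrative Python check of that arithmetic using the abstract's woodland figures; it is not the paper's analysis code.

```python
# Illustrative check of how density of infected nymphs (DIN) relates to
# nymph density and Borrelia infection prevalence, using the figures quoted
# in the abstract (woodland/under-canopy habitat, Richmond Park).
# The calculation is a standard approximation, not the paper's exact method.

nymph_density = 15.68   # nymphs per 100 m^2
prevalence = 0.079      # proportion of nymphs infected with Borrelia

din = nymph_density * prevalence
print(f"Estimated density of infected nymphs: {din:.2f} per 100 m^2")
# ~1.24 per 100 m^2, close to the 1.22 per 100 m^2 reported in the abstract
```
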
http://dx.doi.org/10.1007/s10493-021-00633-3
July 2021

Performing care: emotion work and 'dignity work' - a joint autoethnography of caring for our mum at the end of life.

Sociol Health Illn 2020 11 18;42(8):1888-1901. Epub 2020 Sep 18.

Liverpool John Moores University, Liverpool, UK.

In this paper we, twin sisters, present a joint autoethnographic account of providing end of life care for our mum who had terminal cancer. Using the theoretical framing of performance from Goffman's theory of Dramaturgy, we present the findings from a joint autoethnography, focusing on two key themes: performing emotion work and performing what we conceptualise as 'dignity work'. This paper's contributions are twofold. First, conceptually, this paper offers an important contribution to literature concerned with the sociology of illness, by critically engaging with Goffman's notion of frontstage and backstage performance, applied to the context of home care provided by family carers. The second contribution of this paper is methodological; we promote the under-utilised approach of a joint autoethnography and argue for its usefulness in the context of end of life care. We contend that the process of writing this paper was emotionally challenging, yet arriving at the final paper, which serves as a legacy of our mum, was cathartic. We argue for the usefulness of written diaries as a backstage arena through which other informal carers can think through, and come to terms with, experiences of death and dying.
http://dx.doi.org/10.1111/1467-9566.13174
November 2020

Comparative effects of sulphonylureas, dipeptidyl peptidase-4 inhibitors and sodium-glucose co-transporter-2 inhibitors added to metformin monotherapy: a propensity-score matched cohort study in UK primary care.

Diabetes Obes Metab 2020 05 13;22(5):847-856. Epub 2020 Feb 13.

Department of Non-Communicable Disease Epidemiology, London School of Hygiene and Tropical Medicine, London, UK.

Aim: To assess the comparative effects of sodium-glucose co-transporter-2 (SGLT2) inhibitors, sulphonylureas (SUs) and dipeptidyl peptidase-4 (DPP-4) inhibitors on cardiometabolic risk factors in routine care.

Materials And Methods: Using primary care data on 10 631 new users of SUs, SGLT2 inhibitors or DPP-4 inhibitors added to metformin, obtained from the UK Clinical Practice Research Datalink, we created propensity-score matched cohorts and used linear mixed models to describe changes in glycated haemoglobin (HbA1c), estimated glomerular filtration rate (eGFR), systolic blood pressure (BP) and body mass index (BMI) over 96 weeks.

Results: HbA1c levels fell substantially after treatment intensification for all drugs: mean change at week 12: SGLT2 inhibitors: -15.2 mmol/mol (95% confidence interval [CI] -16.9, -13.5); SUs: -14.3 mmol/mol (95% CI -15.5, -13.2); and DPP-4 inhibitors: -11.9 mmol/mol (95% CI -13.1, -10.6). Systolic BP fell for SGLT2 inhibitor users throughout follow-up, but not for DPP-4 inhibitor or SU users: mean change at week 12: SGLT2 inhibitors: -2.3 mmHg (95% CI -3.8, -0.8); SUs: -0.8 mmHg (95% CI -1.9, +0.4); and DPP-4 inhibitors: -0.9 mmHg (95% CI -2.1, +0.2). BMI decreased for SGLT2 inhibitor and DPP-4 inhibitor users, but not SU users: mean change at week 12: SGLT2 inhibitors: -0.7 kg/m² (95% CI -0.9, -0.5); SUs: 0.0 kg/m² (95% CI -0.3, +0.2); and DPP-4 inhibitors: -0.3 kg/m² (95% CI -0.5, -0.1). eGFR fell at 12 weeks for SGLT2 inhibitor and DPP-4 inhibitor users. At 60 weeks, the fall in eGFR from baseline was similar for each drug class.

Conclusions: In routine care, SGLT2 inhibitors had greater effects on cardiometabolic risk factors than SUs. Routine care data closely replicated the effects of diabetes drugs on physiological variables measured in clinical trials.
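
The Methods describe propensity-score matching of new users followed by linear mixed models for repeated physiological measurements. The sketch below is an illustrative reconstruction of that pattern on simulated data, not the authors' code; all variable names, the matching details (greedy, with replacement) and the toy data are assumptions.

```python
# Illustrative sketch, not the study code: 1:1 nearest-neighbour matching on a
# propensity score (with replacement, for brevity), then a linear mixed model
# for repeated HbA1c measurements. Data and variable names are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
baseline = pd.DataFrame({
    "id": np.arange(n),
    "age": rng.normal(60, 10, n),
    "bmi": rng.normal(31, 5, n),
    "sglt2i": rng.integers(0, 2, n),   # 1 = SGLT2 inhibitor, 0 = sulphonylurea
})

# 1) Propensity score: modelled probability of receiving an SGLT2 inhibitor
ps_model = smf.logit("sglt2i ~ age + bmi", data=baseline).fit(disp=False)
baseline["ps"] = ps_model.predict(baseline)

# 2) Greedy nearest-neighbour matching on the propensity score
treated = baseline[baseline.sglt2i == 1].sort_values("ps")
control = baseline[baseline.sglt2i == 0].sort_values("ps")
pairs = pd.merge_asof(treated, control, on="ps", direction="nearest",
                      suffixes=("_t", "_c"))
cohort = baseline[baseline.id.isin(np.r_[pairs.id_t, pairs.id_c])]

# 3) Simulated follow-up HbA1c and a mixed model with a random intercept per
#    patient; the week-by-drug interaction captures the between-class difference
long = cohort.loc[cohort.index.repeat(4)].copy()
long["week"] = np.tile([0, 12, 48, 96], len(cohort))
long["hba1c"] = (75 - 0.12 * long.week - 0.03 * long.week * long.sglt2i
                 + rng.normal(0, 5, len(long)))
mixed = smf.mixedlm("hba1c ~ week * sglt2i", data=long, groups=long["id"]).fit()
print(mixed.summary())
```
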
http://dx.doi.org/10.1111/dom.13970
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7187358
May 2020

Young men's alcohol consumption experiences and performances of masculinity.

Int J Drug Policy 2020 07 12;81:102550. Epub 2019 Sep 12.

IM Marsh Campus, Liverpool John Moores University, Barkhill Road, Liverpool L17 6BD, United Kingdom.

Background: By creating a dichotomy between those who are 'out-of-control' 'binge drinkers' and those for whom alcohol contributes to friendship fun, the academic and alcohol policy literature often fails to acknowledge the nuances of men's diverse drinking practices.

Methods: This paper engages with findings from a multiple qualitative method research project (comprising individual and friendship group interviews; diaries; and participant observation), conducted with 16 young men aged 15-24: eight living in the middle-class area of Chorlton and eight living in the working-class area of Wythenshawe, Manchester, United Kingdom.

Results: This paper provides fine-grained insights into the doings, complexities and contradictions of masculinity in the context of drinking. Young men are shown to tap into different co-existing versions of masculinity, one of which is based on the exclusion of femininity (i.e. they act as tough guys), while another version is more inclusive (i.e. it allows for displays of care).

Conclusion: This paper shows a much more complex image of young men's drinking practices than has hitherto been conceptualised in the existing literature, and brings to the fore doings of alternative masculinities. This has important implications for alcohol policy interventions targeting men, in that the complexities and contradictions of masculinity in relation to drinking must be taken seriously.
http://dx.doi.org/10.1016/j.drugpo.2019.08.007
July 2020

The subjective world of home care workers in dementia: an "order of worth" analysis.

Home Health Care Serv Q 2019 Apr-Jun;38(2):96-109. Epub 2019 Feb 22.

School of Sociology and Social Policy, University of Nottingham, Nottingham, UK.

Understanding the perspective of domiciliary care workers is needed to recruit a high-quality workforce and meet growing demand. An English ethnographic study yielded extensive insights. To structure analysis of the study data, we apply a method developed by political theorists Boltanski and Thévenot that identifies key variables in different value systems. This "orders of worth" framework is used to map out the distinctive features of the subjective world of home carers. The results can be drawn on to formulate recruitment and retention policies, to design reward strategies or to ensure that training and education opportunities engage effectively with the workforce.
http://dx.doi.org/10.1080/01621424.2019.1578715
May 2020

'Going the extra mile' for older people with dementia: Exploring the voluntary labour of homecare workers.

Dementia (London) 2020 Oct 12;19(7):2220-2233. Epub 2018 Dec 12.

School of Science and the Environment, Manchester Metropolitan University, UK.

Homecare workers provide essential physical, social and emotional support to growing numbers of older people with dementia in the UK. Although it is acknowledged that the work can sometimes be demanding, some homecare workers regularly 'go the extra mile' for service users, working above and beyond the usual remit of the job. This form of voluntarism has been interpreted as an expression of an essentially caring nature, but also as the product of a work environment structured to tacitly endorse the provision of unpaid labour. This paper draws on a qualitative study of what constitutes 'good' homecare for older people with dementia. Using homecare workers' reflexive diaries (n = 11) and interviews with homecare workers (n = 14) and managers (n = 6), we explore manifestations of, and motivations for, homecare workers going the extra mile in their everyday work. We describe three modes of voluntary labour based on these accounts. Our study highlights the complex relationships between job satisfaction, social benefit and commercial gain in the homecare work sector. Further research is needed to define the full range of affective and technical skills necessary to deliver good homecare, and to ensure that homecare work is appropriately credited.
http://dx.doi.org/10.1177/1471301218817616
October 2020

Factors associated with choice of intensification treatment for type 2 diabetes after metformin monotherapy: a cohort study in UK primary care.

Clin Epidemiol 2018 8;10:1639-1648. Epub 2018 Nov 8.

Department of Non-Communicable Disease Epidemiology, London School of Hygiene and Tropical Medicine, London, UK.

Purpose: To understand the patient characteristics associated with treatment choice at the first treatment intensification for type 2 diabetes.

Patients And Methods: This is a noninterventional study, using UK electronic primary care records from the Clinical Practice Research Datalink. We included adults treated with metformin monotherapy between January 2000 and July 2017. The outcome of interest was the drug prescribed at first intensification between 2014 and 2017. We used multinomial logistic regression to calculate the ORs for associations between the drugs and patient characteristics.

Results: In total, 14,146 people started treatment with an intensification drug. Younger people were substantially more likely to be prescribed sodium-glucose co-transporter-2 inhibitors (SGLT2is) than sulfonylureas (SUs): the OR for SGLT2i prescription for those aged <30 years was 2.47 (95% CI 1.39-4.39) compared with those aged 60-70 years. Both overweight and obesity were associated with greater odds of being prescribed a dipeptidyl peptidase-4 inhibitor (DPP4i) or SGLT2i. People of non-white ethnicity were less likely to be prescribed SGLT2is or DPP4is: compared with white patients, the OR of being prescribed an SGLT2i was 0.60 (95% CI 0.42-0.85) for South Asian people and 0.54 (95% CI 0.30-0.97) for black people. Lower socioeconomic status was also independently associated with reduced odds of being prescribed SGLT2is.

Conclusion: Both clinical and demographic factors are associated with prescribing at the first stage of treatment intensification, with older and non-white people less likely to receive new antidiabetic treatments. Our results suggest that the selection of treatment options used at the first stage of treatment intensification for type 2 diabetes is not driven by clinical need alone.
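
The Methods describe a multinomial logistic regression with the prescribed drug class as the outcome and odds ratios reported against a sulfonylurea reference. The following is a minimal, hedged sketch of that kind of model on simulated data; it is not the study's CPRD extract or code, and the variable names are assumptions.

```python
# Illustrative multinomial logistic regression for choice of intensification
# drug, reported as odds ratios relative to a sulfonylurea reference.
# All data and covariate names are simulated stand-ins.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "age": rng.normal(62, 12, n),
    "bmi": rng.normal(31, 5, n),
    "south_asian": rng.integers(0, 2, n),
})
# Outcome coded 0 = sulfonylurea (reference), 1 = DPP4i, 2 = SGLT2i
df["drug"] = rng.integers(0, 3, n)

X = sm.add_constant(df[["age", "bmi", "south_asian"]])
fit = sm.MNLogit(df["drug"], X).fit(disp=False)

# Odds ratios for each non-reference outcome versus the sulfonylurea reference
params = pd.DataFrame(np.asarray(fit.params), index=X.columns,
                      columns=["DPP4i vs SU", "SGLT2i vs SU"])
print(np.exp(params))
```
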
http://dx.doi.org/10.2147/CLEP.S176142
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6233860
November 2018

National hospital mortality surveillance system: a descriptive analysis.

BMJ Qual Saf 2018 12 8;27(12):974-981. Epub 2018 Oct 8.

Primary Care and Public Health, Imperial College London, London, UK.

Objective: To provide a description of the Imperial College Mortality Surveillance System and subsequent investigations by the Care Quality Commission (CQC) in National Health Service (NHS) hospitals receiving mortality alerts.

Background: The mortality surveillance system has generated monthly mortality alerts since 2007, on 122 individual diagnosis and surgical procedure groups, using routinely collected hospital administrative data for all English acute NHS hospital trusts. The CQC, the English national regulator, is notified of each alert. This study describes the findings of CQC investigations of alerting trusts.

Methods: We carried out (1) a descriptive analysis of alerts (2007-2016) and (2) an audit of CQC investigations in a subset of alerts (2011-2013).

Results: Between April 2007 and October 2016, 860 alerts were generated and 76% (654 alerts) were sent to trusts. Alert volumes varied over time (range: 40-101). Septicaemia (except in labour) was the most commonly alerting group (11.5% of alerts sent). We reviewed CQC communications in a subset of 204 alerts from 96 trusts. The CQC investigated 75% (154/204) of alerts. In 90% (140/154) of these pursued alerts, trusts returned evidence of local case note reviews. These reviews found areas of care that could be improved in 69% (106/154) of alerts. In 25% (38/154), trusts considered that the identified failings in care could have affected patient outcomes. The CQC investigations resulted in full trust action plans in 77% (118/154) of all pursued alerts.

Conclusion: The mortality surveillance system has generated a large number of alerts since 2007. Quality of care problems were found in 69% of the alerts investigated by the CQC, and in one in four investigated alerts trusts reported that failings in care may have had an impact on patient outcomes. Identifying whether mortality alerts are the most efficient means of highlighting areas of substandard care will require further investigation.
http://dx.doi.org/10.1136/bmjqs-2018-008364
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6288692
December 2018

A systematic review comparing the evidence for kidney function outcomes between oral antidiabetic drugs for type 2 diabetes.

Wellcome Open Res 2018 19;3:74. Epub 2018 Jun 19.

Department of Non-Communicable Disease Epidemiology, London School of Hygiene and Tropical Medicine, London, WC1E 7HT, UK.

Background: The development of kidney disease is a serious complication among people with type 2 diabetes mellitus, associated with substantially increased morbidity and mortality. We aimed to summarise the current evidence for the relationship between treatments for type 2 diabetes and long-term kidney outcomes by conducting a systematic search and review of relevant studies.

Methods: We searched Medline, Embase and Web of Science between 1st January 1980 and 15th May 2018 for published clinical trials and observational studies comparing two or more classes of oral therapy for type 2 diabetes. We included people receiving oral antidiabetic drugs. Studies were eligible if they (i) compared two or more classes of oral therapy for type 2 diabetes; (ii) reported kidney outcomes as primary or secondary outcomes; (iii) included more than 100 participants; and (iv) followed up participants for 48 weeks or more. Kidney-related outcome measures included incidence of chronic kidney disease, reduced eGFR, increased creatinine, and 'micro' and 'macro' albuminuria.

Results: We identified 15 eligible studies, seven of which were randomised controlled trials and eight were observational studies. Reporting of specific renal outcomes varied widely, and due to the variability of comparisons and outcomes, meta-analysis was not possible. The majority of comparisons between metformin and sulfonylurea treatment indicated that metformin was associated with better renal outcomes. Little evidence was available for recently introduced treatments or commonly prescribed combination therapies.

Conclusions: Comparative evidence for the effect of treatments for type 2 diabetes on renal outcomes, either as monotherapy or in combination, is sparse.
http://dx.doi.org/10.12688/wellcomeopenres.14660.1
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6107985
June 2018

Changing use of antidiabetic drugs in the UK: trends in prescribing 2000-2017.

BMJ Open 2018 07 28;8(7):e022768. Epub 2018 Jul 28.

Department of Non-Communicable Disease Epidemiology, London School of Hygiene and Tropical Medicine, London, UK.

Objectives: Guidelines for the use of drugs for type 2 diabetes mellitus (T2DM) have changed since 2000, and new classes of drug have been introduced. Our aim was to describe how drug choice at initiation and first stage of intensification have changed over this period, and to what extent prescribing was in accord with clinical guidelines, including adherence to recommendations regarding kidney function.

Design: Repeated cross-sectional study.

Setting: UK electronic primary care health records from the Clinical Practice Research Datalink.

Participants: Adults initiating treatment with a drug for T2DM between January 2000 and July 2017.

Primary And Secondary Outcome Measures: The primary outcomes were the proportion of each class of T2DM drug prescribed for initiation and first-stage intensification in each year. We also examined drug prescribing by kidney function and country within the UK.

Results: Of 280 241 people initiating treatment with T2DM drugs from 2000 to 2017, 73% (204 238/280 241) initiated metformin, 15% (42 288/280 241) a sulfonylurea, 5% (12 956/280 241) metformin and sulfonylurea dual therapy and 7% (20 759/280 241) other options. Clinicians have increasingly prescribed metformin at initiation: by 2017 this was 89% (2475/2778) of drug initiations. Among people with an estimated glomerular filtration rate of ≤30 mL/min/1.73 m², the most common drug at initiation was a sulfonylurea, 58% (659/1135). In 2000, sulfonylureas were the predominant drug at the first stage of drug intensification (87%, 534/615), but by 2017 this fell to 30% (355/1183) as the use of newer drug classes increased. In 2017, new prescriptions for dipeptidyl peptidase-4 inhibitors (DPP4i) and sodium/glucose cotransporter-2 inhibitors (SGLT2i) accounted for 42% (502/1183) and 22% (256/1183) of intensification drugs, respectively. Uptake of new classes differs by country, with DPP4is and SGLT2is prescribed more in Northern Ireland and Wales than in England or Scotland.

Conclusions: Our findings show markedly changing prescribing patterns for T2DM between 2000 and 2017, largely consistent with clinical guidelines.
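
The primary outcome here is a set of annual proportions: for each calendar year, the share of initiations (or first intensifications) accounted for by each drug class. Below is a minimal, illustrative pandas sketch of that repeated cross-sectional summary; the toy records and column names are assumptions, not the CPRD data.

```python
# Illustrative summary (not the study code): proportion of treatment
# initiations accounted for by each drug class, per calendar year.
import pandas as pd

initiations = pd.DataFrame({
    "year": [2000, 2000, 2017, 2017, 2017],
    "drug_class": ["metformin", "sulfonylurea", "metformin", "SGLT2i", "metformin"],
})

# Row-normalised cross-tabulation: one row per year, one column per class,
# values are within-year proportions
shares = pd.crosstab(initiations["year"], initiations["drug_class"],
                     normalize="index")
print(shares.round(2))
```
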
http://dx.doi.org/10.1136/bmjopen-2018-022768
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6067400
July 2018

Investigating the association of alerts from a national mortality surveillance system with subsequent hospital mortality in England: an interrupted time series analysis.

BMJ Qual Saf 2018 12 4;27(12):965-973. Epub 2018 May 4.

Primary Care and Public Health, Imperial College London, London, UK.

Objective: To investigate the association between alerts from a national hospital mortality surveillance system and subsequent trends in relative risk of mortality.

Background: There is increasing interest in performance monitoring in the NHS. Since 2007, Imperial College London has generated monthly mortality alerts, based on statistical process control charts and using routinely collected hospital administrative data, for all English acute NHS hospital trusts. The impact of this system has not yet been studied.

Methods: We investigated alerts sent to Acute National Health Service hospital trusts in England in 2011-2013. We examined risk-adjusted mortality (relative risk) for all monitored diagnosis and procedure groups at a hospital trust level for 12 months prior to an alert and 23 months post alert. We used an interrupted time series design with a 9-month lag to estimate a trend prior to a mortality alert and the change in trend after, using generalised estimating equations.

Results: On average there was a 5% monthly increase in the relative risk of mortality during the 12 months prior to an alert (95% CI 4% to 5%). Mortality risk fell, on average, by 61% (95% CI 56% to 65%) during the 9-month period immediately following an alert, then levelled to a slow decline, reaching on average the level of expected mortality within 18 months of the alert.

Conclusions: Our results suggest an association between an alert notification and a reduction in the risk of mortality, although with less lag time than expected. It is difficult to determine any causal association. A proportion of alerts may be triggered by random variation alone and subsequent falls could simply reflect regression to the mean. Findings could also indicate that some hospitals are monitoring their own mortality statistics or other performance information, taking action prior to alert notification.
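
The following is a hedged sketch of the segmented (interrupted time-series) regression described in the Methods: a pre-alert slope plus post-alert change terms, fitted with generalised estimating equations to account for clustering within trusts. The simulated data, variable names and model form are assumptions, not the authors' specification.

```python
# Illustrative interrupted time-series model fitted with GEE:
# pre-alert trend, post-alert level change, and post-alert change in slope,
# clustered by hospital trust. Toy data only.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
rows = []
for trust in range(30):
    for month in range(-12, 24):            # 12 months pre-alert, 24 post
        post = int(month >= 0)
        log_rr = (0.04 * month * (1 - post)   # rising risk before the alert
                  - 0.05 * month * post       # declining risk afterwards
                  + rng.normal(0, 0.05))
        rows.append({"trust": trust, "month": month, "post": post,
                     "months_since_alert": max(month, 0), "log_rr": log_rr})
its = pd.DataFrame(rows)

model = smf.gee(
    "log_rr ~ month + post + months_since_alert",
    groups="trust",
    data=its,
    cov_struct=sm.cov_struct.Exchangeable(),
    family=sm.families.Gaussian(),
)
result = model.fit()
# exp(coefficient) gives the multiplicative monthly change in relative risk
print(np.exp(result.params))
```
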
http://dx.doi.org/10.1136/bmjqs-2017-007495
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6288695
December 2018

Validation of asthma recording in electronic health records: a systematic review.

Clin Epidemiol 2017 1;9:643-656. Epub 2017 Dec 1.

Department of Non-Communicable Disease Epidemiology, London School of Hygiene and Tropical Medicine, London, UK.

Objective: To describe the methods used to validate asthma diagnoses in electronic health records and summarize the results of the validation studies.

Background: Electronic health records are increasingly being used for research on asthma to inform health services and health policy. Validation of the recording of asthma diagnoses in electronic health records is essential to use these databases for credible epidemiological asthma research.

Methods: We searched EMBASE and MEDLINE databases for studies that validated asthma diagnoses detected in electronic health records up to October 2016. Two reviewers independently assessed the full text against the predetermined inclusion criteria. Key data including author, year, data source, case definitions, reference standard, and validation statistics (including sensitivity, specificity, positive predictive value [PPV], and negative predictive value [NPV]) were summarized in two tables.

Results: Thirteen studies met the inclusion criteria. Most studies demonstrated a high validity using at least one case definition (PPV >80%). Ten studies used a manual validation as the reference standard; each had at least one case definition with a PPV of at least 63%, up to 100%. We also found two studies using a second independent database to validate asthma diagnoses. The PPVs of the best performing case definitions ranged from 46% to 58%. We found one study which used a questionnaire as the reference standard to validate a database case definition; the PPV of the case definition algorithm in this study was 89%.

Conclusion: Attaining high PPVs (>80%) is possible using each of the discussed validation methods. Identifying asthma cases in electronic health records is possible with high sensitivity, specificity or PPV, by combining multiple data sources, or by focusing on specific test measures. Studies testing a range of case definitions show wide variation in the validity of each definition, suggesting this may be important for obtaining asthma definitions with optimal validity.
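
The validation statistics summarised in this review reduce to simple functions of a 2x2 table comparing a record-based case definition against the reference standard. A small worked example follows, with hypothetical counts rather than figures from any included study.

```python
# Validation metrics from a 2x2 table: an EHR asthma case definition
# compared against a reference standard (e.g. manual record review).
def validation_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),   # true cases flagged by the algorithm
        "specificity": tn / (tn + fp),   # non-cases correctly not flagged
        "ppv": tp / (tp + fp),           # flagged records that are true cases
        "npv": tn / (tn + fn),           # unflagged records that are true non-cases
    }

# Hypothetical counts: 90 true positives, 10 false positives,
# 20 false negatives, 880 true negatives
print(validation_metrics(tp=90, fp=10, fn=20, tn=880))
```
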
http://dx.doi.org/10.2147/CLEP.S143718
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5716672
December 2017

Validation of asthma recording in electronic health records: protocol for a systematic review.

BMJ Open 2017 05 29;7(5):e014694. Epub 2017 May 29.

Department of Non-Communicable Disease Epidemiology, London School of Hygiene and Tropical Medicine, London, UK.

Background: Asthma is a common, heterogeneous disease with significant morbidity and mortality worldwide. It can be difficult to define in epidemiological studies using electronic health records as the diagnosis is based on non-specific respiratory symptoms and spirometry, neither of which are routinely registered. Electronic health records can nonetheless be valuable to study the epidemiology, management, healthcare use and control of asthma. For health databases to be useful sources of information, asthma diagnoses should ideally be validated. The primary objectives are to provide an overview of the methods used to validate asthma diagnoses in electronic health records and summarise the results of the validation studies.

Methods: EMBASE and MEDLINE will be systematically searched using appropriate search terms. The searches will cover all studies in these databases up to October 2016, with no start date, and will yield studies that have validated algorithms or codes for the diagnosis of asthma in electronic health records. At least one test validation measure (sensitivity, specificity, positive predictive value, negative predictive value or other) is necessary for inclusion. In addition, we require the validated algorithms to be compared with an external gold standard, such as a manual review, a questionnaire or an independent second database. We will summarise key data including author, year of publication, country, time period, date, data source, population, case characteristics, clinical events, algorithms, gold standard and validation statistics in a uniform table.

Ethics And Dissemination: This study is a synthesis of previously published studies and, therefore, no ethical approval is required. The results will be submitted to a peer-reviewed journal for publication. Results from this systematic review can be used to study outcome research on asthma and can be used to identify case definitions for asthma.

Prospero Registration Number: CRD42016041798.
http://dx.doi.org/10.1136/bmjopen-2016-014694
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5729974
May 2017

Barriers to Implementation of Rapid and Point-of-Care Tests for Human Immunodeficiency Virus Infection: Findings From a Systematic Review (1996-2014).

Point Care 2015 Sep;14(3):81-87

Department of Medicine, McGill University; †Division of Clinical Epidemiology, Department of Medicine, McGill University Health Centre; ‡INSPQ, Montreal, Quebec, Canada; §Department of Health, Ethics & Society, Research School for Public Health and Primary Care, Maastricht University, Maastricht, The Netherlands; and ∥Dalla Lana School of Public Health, University of Toronto, Toronto, Ontario, Canada.

Implementation of human immunodeficiency virus rapid and point-of-care tests (RDT/POCT) is understood to be impeded by many different factors operating at 4 main levels (test devices, patients, providers, and health systems), yet a knowledge gap exists in how these factors act and interact to impede implementation. To fill this gap, and with a view to improving the quality of implementation, we conducted a systematic review.

Methods: Five databases were searched, 16,672 citations were retrieved, and data were abstracted on 132 studies by 2 reviewers.

Findings: Across 3 levels (ie, patients, providers, and health systems), a majority (59%, 112/190) of the 190 barriers were related to the integration of RDT/POCT, followed by test-device-related concern (ie, accuracy) at 41% (78/190). At the patient level, a lack of awareness about tests (15/54, 28%) and time taken to test (12/54, 22%) dominated. At the provider and health system levels, integration of RDT/POCT in clinical workflows (7/24, 29%) and within hospitals (21/34, 62%) prevailed. Accuracy (57/78, 73%) was dominant only at the device level.

Interpretation: Integration barriers dominated the findings, followed by test accuracy. Although accuracy has improved over the years, ideal implementation could be achieved by improving the integration of RDT/POCT within clinics, hospitals, and health systems, with clear protocols, training on quality assurance and control, clear communication, and linkage plans to improve the health outcomes of patients. This finding is pertinent for a future envisioned implementation and global scale-up of RDT/POCT-based initiatives.
http://dx.doi.org/10.1097/POC.0000000000000056
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4549862
September 2015

On the adaptive function of children's and adults' false memories.

Memory 2016 09 31;24(8):1062-77. Epub 2015 Jul 31.

School of Psychology, University of Central Lancashire, Preston, UK.

Recent research has shown that memory illusions can successfully prime both children's and adults' performance on complex, insight-based problems (compound remote associates tasks, or CRATs). The current research aimed to clarify the locus of these priming effects. As in previous work, Deese-Roediger-McDermott (DRM) lists were selected to prime subsequent CRATs such that the critical lures were also the solution words to a subset of the CRATs participants attempted to solve. Unique to the present research, recognition memory tests were used and participants were primed either during the list study phase, during the memory test phase, or both. Across two experiments, primed problems were solved more frequently and significantly faster than unprimed problems. Moreover, when participants were primed during the list study phase, subsequent solution times and rates were considerably superior to those of participants who were primed only at test. Together, these are the first results to show that false-memory priming during encoding facilitates problem-solving in both children and adults.
http://dx.doi.org/10.1080/09658211.2015.1068335
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4960504
September 2016

Signatures of diversifying selection in European pig breeds.

PLoS Genet 2013 Apr 25;9(4):e1003453. Epub 2013 Apr 25.

The Roslin Institute and Royal (Dick) School of Veterinary Studies, University of Edinburgh, Edinburgh, United Kingdom.

Following domestication, livestock breeds have experienced intense selection pressures for the development of desirable traits. This has resulted in a large diversity of breeds that display variation in many phenotypic traits, such as coat colour, muscle composition, early maturity, growth rate, body size, reproduction, and behaviour. To better understand the relationship between genomic composition and phenotypic diversity arising from breed development, the genomes of 13 traditional and commercial European pig breeds were scanned for signatures of diversifying selection using the Porcine60K SNP chip, applying a between-population (differentiation) approach. Signatures of diversifying selection between breeds were found in genomic regions associated with traits related to breed standard criteria, such as coat colour and ear morphology. Amino acid differences in the EDNRB gene appear to be associated with one of these signatures, and variation in the KITLG gene may be associated with another. Other selection signals were found in genomic regions including QTLs and genes associated with production traits such as reproduction, growth, and fat deposition. Some selection signatures were associated with regions showing evidence of introgression from Asian breeds. When the European breeds were compared with wild boar, genomic regions with high levels of differentiation harboured genes related to bone formation, growth, and fat deposition.
http://dx.doi.org/10.1371/journal.pgen.1003453
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3636142
April 2013

Development of a genetic tool for product regulation in the diverse British pig breed market.

BMC Genomics 2012 Nov 15;13:580. Epub 2012 Nov 15.

The Roslin Institute and Royal (Dick) School of Veterinary Studies, University of Edinburgh, Easter Bush, Midlothian, EH25 9RG, UK.

Background: The application of DNA markers for the identification of biological samples from both human and non-human species is widespread and includes use in food authentication. In the food industry, the financial incentive to substitute the true name of a food product with a higher-value alternative drives food fraud. This applies to British pork products, where products derived from traditional pig breeds are of premium value. The objective of this study was to develop a genetic assay for regulatory authentication of traditional pig breed-labelled products in the porcine food industry in the United Kingdom.

Results: The dataset comprised comprehensive coverage of the breed types present in Britain: 460 individuals from 7 traditional breeds, 5 commercial purebreds, 1 imported European breed and 1 imported Asian breed were genotyped using the PorcineSNP60 beadchip. Following breed-informative SNP selection, assignment power was calculated for increasing SNP panel sizes. A 96-plex assay created using the most informative SNPs revealed remarkably high genetic differentiation between the British pig breeds, with an average FST of 0.54, and Bayesian clustering analysis also indicated that they were distinct homogeneous populations. The posterior probability that an individual of presumed origin actually originated from that breed, rather than from an alternative breed, was >99.5% in 174 out of 182 contrasts at a test value of log(LR) > 0. Validation of the 96-plex assay using independent test samples of known origin was successful; a subsequent survey of market samples revealed a high level of breed label conformity.

Conclusion: The newly created 96-plex assay using selected markers from the PorcineSNP60 beadchip enables powerful assignment of samples to traditional breed origin and can effectively identify mislabelling, providing a highly effective tool for DNA analysis in food forensics.
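
To illustrate the general idea of likelihood-based breed assignment from a SNP panel, the sketch below sums per-SNP genotype log-likelihoods under each breed's allele frequencies (assuming Hardy-Weinberg equilibrium and independent loci) and compares the two best-supported breeds with log(LR). It is an assumption-laden reconstruction on toy frequencies, not the published assay pipeline.

```python
# Illustrative likelihood-based breed assignment from breed allele frequencies.
# Genotypes are coded 0/1/2 copies of the reference allele per SNP.
import numpy as np

def genotype_loglik(genotypes, freqs):
    """genotypes: array of 0/1/2 per SNP; freqs: reference-allele freq per SNP."""
    p = np.clip(freqs, 1e-6, 1 - 1e-6)          # avoid log(0) for fixed alleles
    probs = np.where(genotypes == 2, p**2,
             np.where(genotypes == 1, 2 * p * (1 - p), (1 - p)**2))
    return np.log(probs).sum()

# Hypothetical allele frequencies for 5 SNPs in three breeds (toy values)
breed_freqs = {
    "Tamworth":    np.array([0.90, 0.10, 0.80, 0.70, 0.95]),
    "Berkshire":   np.array([0.20, 0.85, 0.30, 0.40, 0.10]),
    "Large White": np.array([0.50, 0.50, 0.55, 0.45, 0.60]),
}
sample = np.array([2, 0, 2, 1, 2])               # genotype of the test sample

logliks = {b: genotype_loglik(sample, f) for b, f in breed_freqs.items()}
ranked = sorted(logliks.items(), key=lambda kv: kv[1], reverse=True)
log_lr = ranked[0][1] - ranked[1][1]
print(f"Assigned breed: {ranked[0][0]} (log(LR) vs next best = {log_lr:.2f})")
```
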
http://dx.doi.org/10.1186/1471-2164-13-580
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3499217
November 2012

Deciphering the genetic basis of animal domestication.

Proc Biol Sci 2011 Nov 1;278(1722):3161-70. Epub 2011 Sep 1.

The Roslin Institute and Royal (Dick) School of Veterinary Studies, University of Edinburgh, Easter Bush, Midlothian EH25 9RG, UK.

Genomic technologies for livestock and companion animal species have revolutionized the study of animal domestication, allowing an increasingly detailed description of the genetic changes accompanying domestication and breed development. This review describes important recent results derived from the application of population and quantitative genetic approaches to the study of genetic changes in the major domesticated species. These include findings of regions of the genome that show between-breed differentiation, evidence of selective sweeps within individual genomes and signatures of demographic events. Particular attention is focused on the study of the genetics of behavioural traits and the implications for domestication. Despite the operation of severe bottlenecks, high levels of inbreeding and intensive selection during the history of domestication, most domestic animal species are genetically diverse. Possible explanations for this phenomenon are discussed. The major insights from the surveyed studies are highlighted and directions for future study are suggested.
http://dx.doi.org/10.1098/rspb.2011.1376
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3169034
November 2011

Evaluation of approaches for identifying population informative markers from high density SNP chips.

BMC Genet 2011 May 13;12:45. Epub 2011 May 13.

The Roslin Institute and Royal (Dick) School of Veterinary Studies, University of Edinburgh, Easter Bush, Midlothian EH25 9RG, Scotland, UK.

Background: Genetic markers can be used to identify and verify the origin of individuals. Motivation for the inference of ancestry ranges from conservation genetics to forensic analysis. High density assays featuring Single Nucleotide Polymorphism (SNP) markers can be exploited to create a reduced panel containing the most informative markers for these purposes. The objectives of this study were to evaluate methods of marker selection and determine the minimum number of markers from the BovineSNP50 BeadChip required to verify the origin of individuals in European cattle breeds. Delta, Wright's FST, Weir & Cockerham's FST and PCA methods for population differentiation were compared. The level of informativeness of each SNP was estimated from the breed specific allele frequencies. Individual assignment analysis was performed using the ranked informative markers. Stringency levels were applied by log-likelihood ratio to assess the confidence of the assignment test.

Results: A 95% assignment success rate for the 384 individually genotyped animals was achieved with <80, <100, <140 and <200 SNP markers (with increasing stringency threshold levels) across all the examined methods for marker selection. No further gain in power of assignment was achieved by sampling in excess of 200 SNP markers. The marker selection method that required the lowest number of SNP markers to verify an animal's breed origin was Wright's FST (60 to 140 SNPs, depending on the chosen degree of confidence). Certain breeds required fewer markers (<100) to achieve 100% assignment success, whereas closely related breeds required more markers (~200) to achieve >95% assignment success. The power of assignment, and therefore the number of SNP markers required, depends on the level of genetic heterogeneity and the pool of samples considered.

Conclusions: While all SNP selection methods produced marker panels capable of breed identification, the power of assignment varied markedly among analysis methods. Thus, with effective exploration of available high density genetic markers, a diagnostic panel of highly informative markers can be produced.
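
As a rough illustration of the FST-based marker selection compared in this study, the sketch below ranks SNPs by a simple Wright-style FST computed from per-population allele frequencies and keeps the top-ranked markers as a reduced panel. It is an approximation on toy data, not the paper's Delta, Weir & Cockerham or PCA implementations.

```python
# Illustrative FST-based ranking of SNPs for a reduced assignment panel.
# freqs has one row per SNP and one column per population (reference-allele freq).
import numpy as np

def wright_fst(freqs_per_pop):
    """Simple FST = (HT - HS) / HT per SNP across populations."""
    p_bar = freqs_per_pop.mean(axis=1)
    h_t = 2 * p_bar * (1 - p_bar)                                 # expected total heterozygosity
    h_s = (2 * freqs_per_pop * (1 - freqs_per_pop)).mean(axis=1)  # mean within-population
    return np.where(h_t > 0, (h_t - h_s) / h_t, 0.0)

rng = np.random.default_rng(3)
freqs = rng.uniform(0.05, 0.95, size=(50_000, 4))   # 50k SNPs, 4 populations (toy data)
fst = wright_fst(freqs)
top_panel = np.argsort(fst)[::-1][:200]             # the 200 most informative SNPs
print(f"Median FST of selected panel: {np.median(fst[top_panel]):.3f}")
```
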
http://dx.doi.org/10.1186/1471-2156-12-45
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3118130
May 2011

The temporal structure of feeding behavior.

Am J Physiol Regul Integr Comp Physiol 2011 Aug 27;301(2):R378-93. Epub 2011 Apr 27.

Animal Health Group, SAC, Edinburgh, United Kingdom.

Meals have long been considered relevant units of feeding behavior. Large data sets of feeding behavior of cattle, pigs, chickens, ducks, turkeys, dolphins, and rats were analyzed with the aims of 1) describing the temporal structure of feeding behavior and 2) developing appropriate methods for estimating meal criteria. Longer (between-meal) intervals were never distributed as the negative exponential assumed by traditional methods, such as log-survivorship analysis, but as a skewed Gaussian, which can be (almost) normalized by log-transformation of interval lengths. Log-transformation can also normalize frequency distributions of within-meal intervals. Meal criteria, i.e., the longest interval considered to occur within meals, can be estimated after fitting models consisting of Gaussian functions alone or of one Weibull and one or more Gaussian functions to the distribution of log-transformed interval lengths. Nonuniform data sets may require disaggregation before this can be achieved. Observations from all species were in conflict with assumptions of random behavior that underlie traditional methods for criteria estimation. Instead, the observed structure of feeding behavior is consistent with 1) a decrease in satiety associated with an increase in the probability of animals starting a meal with time since the last meal and 2) an increase in satiation associated with an increase in the probability of animals ending a meal with the amount of food already consumed. The novel methodology proposed here will avoid biased conclusions from analyses of feeding behavior associated with previous methods and, as demonstrated, can be applied across a range of species to address questions relevant to the control of food intake.
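
The meal-criterion idea described above can be illustrated with a simplified stand-in: fit a two-component mixture to the log-transformed interval lengths and take the criterion where the weighted component densities cross. The hedged sketch below uses a plain Gaussian mixture on simulated intervals rather than the paper's Weibull-plus-Gaussian models.

```python
# Simplified illustration of meal-criterion estimation: fit a two-component
# Gaussian mixture to log-transformed inter-feeding intervals and take the
# criterion as the interval length where the weighted densities intersect.
import numpy as np
from scipy.stats import norm
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)
# Toy data: short within-meal intervals (~20 s) and long between-meal intervals (~2 h)
intervals = np.concatenate([rng.lognormal(np.log(20), 0.6, 3000),
                            rng.lognormal(np.log(7200), 0.8, 1000)])
log_iv = np.log(intervals).reshape(-1, 1)

gmm = GaussianMixture(n_components=2, random_state=0).fit(log_iv)
means = gmm.means_.ravel()
sds = np.sqrt(gmm.covariances_.ravel())
weights = gmm.weights_

# Find where the weighted component densities cross, between the two means
grid = np.linspace(means.min(), means.max(), 10_000)
d0 = weights[0] * norm.pdf(grid, means[0], sds[0])
d1 = weights[1] * norm.pdf(grid, means[1], sds[1])
criterion = np.exp(grid[np.argmin(np.abs(d0 - d1))])
print(f"Estimated meal criterion: {criterion:.0f} s")
```
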
http://dx.doi.org/10.1152/ajpregu.00661.2010
August 2011

Using story contexts to bias children's true and false memories.

J Exp Child Psychol 2011 Jan 1;108(1):77-95. Epub 2010 Aug 1.

Department of Psychology, Lancaster University, Lancaster LA1 4YF, UK.

We investigated, with 7- and 11-year-olds, the effects on both true and false recognition of embedding standard Deese/Roediger-McDermott (DRM) lists into stories whose context biased interpretation either toward or away from the lists' overall themes. These biased story contexts were compared with the same children's susceptibility to false memory illusions under the standard DRM list presentation paradigm. The results showed the usual age effects for true and false memories in the standard DRM list paradigm, with 11-year-olds exhibiting higher rates of both true and false recognition than 7-year-olds. Importantly, when DRM lists were embedded in stories, these age effects disappeared for true recognition. For false recognition, although developmental differences were attenuated, older children were still more susceptible to false memory illusions than younger children. These findings are discussed in terms of current theories of children's false memories, as well as the role of themes and elaboration in children's memory development.
http://dx.doi.org/10.1016/j.jecp.2010.06.009
January 2011

The use of the bacteriocin, nisin, as a preservative in pasteurized liquid whole egg.

Lett Appl Microbiol 1992 Oct;15(4):133-136

Aplin and Barrett Ltd, 15 North Street, Beaminster, Dorset DT8 3DZ.

Nisin used at a level of 5 mg/l resulted in a significant increase in the refrigerated shelf life of pasteurized liquid whole egg, from 6-11 d to 17-20 d. In the first of two trials, nisin also protected the liquid egg from growth of Bacillus cereus; B. cereus was not present in the egg in the second trial. Effective residual levels of nisin were detected in the liquid egg post-pasteurization.
http://dx.doi.org/10.1111/j.1472-765X.1992.tb00746.x
October 1992