Publications by authors named "Michael G Buhnerkempe"

21 Publications


Adverse Health Outcomes Associated With Refractory and Treatment-Resistant Hypertension in the Chronic Renal Insufficiency Cohort.

Hypertension 2021 01 9;77(1):72-81. Epub 2020 Nov 9.

Center for Clinical Research (M.G.B., A.B.), Southern Illinois University School of Medicine, Springfield, IL.

Refractory hypertension (RfH) is a severe phenotype of antihypertensive treatment failure. Treatment-resistant hypertension (TRH), a less severe form of difficult-to-treat hypertension, has been associated with significantly worse health outcomes. However, no studies currently show how health outcomes may worsen upon progression to RfH. RfH and TRH were studied in 3147 hypertensive participants in the CRIC (Chronic Renal Insufficiency Cohort) study. The hypertensive phenotype (ie, no TRH or RfH, TRH, or RfH) was identified at the baseline visit, and health outcomes were monitored at subsequent visits. Outcome risk was compared using Cox proportional hazards models with time-varying covariates. A total of 136 (4.3%) individuals were identified with RfH at baseline. After adjusting for participant characteristics, individuals with RfH had increased risk for the composite renal outcome across all study years (50% decline in estimated glomerular filtration rate or end-stage renal disease; hazard ratio for study years 0-10=1.73 [95% CI, 1.42-2.11]) and for the composite cardiovascular disease outcome during later study years (stroke, myocardial infarction, or congestive heart failure; hazard ratio for study years 0-3=1.25 [0.91-1.73], for study years 3-6=1.50 [0.97-2.32], and for study years 6-10=2.72 [1.47-5.01]) when compared with individuals with TRH. There was no significant difference in all-cause mortality between those with RfH versus TRH. We provide the first evidence that RfH is associated with worse long-term health outcomes compared with TRH.
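The time-varying Cox analysis described above can be illustrated with a minimal sketch. This is not the authors' code: it assumes a hypothetical long-format dataset (one row per participant per follow-up interval, with made-up columns such as `refractory` and `age`) and uses the lifelines library.

```python
# Minimal sketch, not the CRIC analysis: a Cox proportional hazards model with
# time-varying covariates, fit to simulated long-format follow-up data.
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "id": np.arange(n),
    "start": 0.0,                           # interval start (years)
    "stop": rng.uniform(1, 10, n),          # interval stop (years)
    "refractory": rng.integers(0, 2, n),    # hypothetical RfH-vs-TRH indicator
    "age": rng.normal(60, 10, n),
})
# Simulated composite outcome, more likely in the refractory group.
df["event"] = (rng.random(n) < 0.2 + 0.15 * df["refractory"]).astype(int)

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()   # hazard ratios are exp(coef)
```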
Source
http://dx.doi.org/10.1161/HYPERTENSIONAHA.120.15064
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7725845
January 2021

Estimating prevalence and test accuracy in disease ecology: How Bayesian latent class analysis can boost or bias imperfect test results.

Ecol Evol 2020 Jul 15;10(14):7221-7232. Epub 2020 Jun 15.

Department of Ecology and Evolutionary Biology University of California, Los Angeles Los Angeles CA USA.

Obtaining accurate estimates of disease prevalence is crucial for the monitoring and management of wildlife populations but can be difficult if different diagnostic tests yield conflicting results and if the accuracy of each diagnostic test is unknown. Bayesian latent class analysis (BLCA) modeling offers a potential solution, providing estimates of prevalence levels and diagnostic test accuracy under the realistic assumption that no diagnostic test is perfect. In typical applications of this approach, the specificity of one test is fixed at or close to 100%, allowing the model to simultaneously estimate the sensitivity and specificity of all other tests, in addition to infection prevalence. In wildlife systems, a test with near-perfect specificity is not always available, so we simulated data to investigate how decreasing this fixed specificity value affects the accuracy of model estimates. We used simulations to explore how the trade-off between diagnostic test specificity and sensitivity impacts prevalence estimates and found that directional biases depend on pathogen prevalence. Both the precision and accuracy of results depend on the sample size, the diagnostic tests used, and the true infection prevalence, so these factors should be considered when applying BLCA to estimate disease prevalence and diagnostic test accuracy in wildlife systems. A wildlife disease case study, focusing on leptospirosis in California sea lions, demonstrated the potential for Bayesian latent class methods to provide reliable estimates under real-world conditions. We delineate conditions under which BLCA improves upon the results from a single diagnostic across a range of prevalence levels and sample sizes, demonstrating when this method is preferable for disease ecologists working in a wide variety of pathogen systems.
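As a toy illustration of the problem this paper addresses (not the BLCA model itself), the sketch below simulates a single imperfect test with assumed sensitivity and specificity and shows how apparent prevalence deviates from true prevalence; the Rogan-Gladen correction recovers the truth only when sensitivity and specificity are known, which is exactly what the latent class approach tries to estimate.

```python
# Toy sketch (assumed sensitivity/specificity, not the paper's BLCA model):
# how an imperfect test distorts apparent prevalence, and how the distortion
# is corrected only if sensitivity and specificity are actually known.
import numpy as np

rng = np.random.default_rng(1)
true_prev, n = 0.15, 5000
se, sp = 0.85, 0.95                      # assumed test sensitivity / specificity

infected = rng.random(n) < true_prev
test_pos = np.where(infected, rng.random(n) < se, rng.random(n) > sp)

apparent = test_pos.mean()                          # biased by test error
corrected = (apparent + sp - 1) / (se + sp - 1)     # Rogan-Gladen correction
print(f"true={true_prev:.3f}  apparent={apparent:.3f}  corrected={corrected:.3f}")
```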
Source
http://dx.doi.org/10.1002/ece3.6448
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7391344
July 2020

Linking longitudinal and cross-sectional biomarker data to understand host-pathogen dynamics: Leptospira in California sea lions (Zalophus californianus) as a case study.

PLoS Negl Trop Dis 2020 06 29;14(6):e0008407. Epub 2020 Jun 29.

Department of Ecology and Evolutionary Biology, University of California, Los Angeles, California, United States of America.

Confronted with the challenge of understanding population-level processes, disease ecologists and epidemiologists often simplify quantitative data into distinct physiological states (e.g. susceptible, exposed, infected, recovered). However, data defining these states often fall along a spectrum rather than into clear categories. Hence, the host-pathogen relationship is more accurately defined using quantitative data, often integrating multiple diagnostic measures, just as clinicians do to assess their patients. We use quantitative data on a major neglected tropical disease (Leptospira interrogans) in California sea lions (Zalophus californianus) to improve individual-level and population-level understanding of this Leptospira reservoir system. We create a "host-pathogen space" by mapping multiple biomarkers of infection (e.g. serum antibodies, pathogen DNA) and disease state (e.g. serum chemistry values) from 13 longitudinally sampled, severely ill individuals to characterize changes in these values through time. Data from these individuals describe a clear, unidirectional trajectory of disease and recovery within this host-pathogen space. Remarkably, this trajectory also captures the broad patterns in larger cross-sectional datasets of 1456 wild sea lions in all states of health but sampled only once. Our framework enables us to determine an individual's location in their time-course since initial infection, and to visualize the full range of clinical states and antibody responses induced by pathogen exposure. We identify predictive relationships between biomarkers and outcomes such as survival and pathogen shedding, and use these to impute values for missing data, thus increasing the size of the useable dataset. Mapping the host-pathogen space using quantitative biomarker data enables more nuanced understanding of an individual's time course of infection, duration of immunity, and probability of being infectious. Such maps also make efficient use of limited data for rare or poorly understood diseases, by providing a means to rapidly assess the range and extent of potential clinical and immunological profiles. These approaches yield benefits for clinicians needing to triage patients, prevent transmission, and assess immunity, and for disease ecologists or epidemiologists working to develop appropriate risk management strategies to reduce transmission risk on a population scale (e.g. model parameterization using more accurate estimates of duration of immunity and infectiousness) and to assess health impacts on a population scale.
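A highly simplified sketch of the "host-pathogen space" idea, using hypothetical biomarkers and kinetics rather than the paper's data: place a once-sampled individual on a reference disease-course trajectory by finding its nearest point in a two-biomarker space, yielding an approximate time since infection.

```python
# Hedged sketch with hypothetical biomarkers: a reference trajectory
# (days since infection -> antibody level, serum chemistry value) lets a
# once-sampled animal be assigned an approximate time since infection.
import numpy as np

days = np.arange(0, 120)
antibody = 1 - np.exp(-days / 10.0)                       # rising, then plateauing
chemistry = 1 + 3 * np.exp(-((days - 15) / 10.0) ** 2)    # transient renal-injury marker
trajectory = np.column_stack([antibody, chemistry])

def estimate_days_since_infection(sample):
    """Return the reference day whose biomarker pair is closest to the sample."""
    dist = np.linalg.norm(trajectory - np.asarray(sample), axis=1)
    return int(days[np.argmin(dist)])

print(estimate_days_since_infection([0.95, 1.1]))   # e.g. a recovered, late-stage animal
```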
Source
http://dx.doi.org/10.1371/journal.pntd.0008407
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7351238
June 2020

Serious Adverse Events Cluster in Participants Experiencing the Primary Composite Cardiovascular Endpoint: A Post Hoc Analysis of the SPRINT Trial.

Am J Hypertens 2020 05;33(6):528-533

Division of General Internal Medicine, Hypertension Section, Department of Internal Medicine, Southern Illinois University School of Medicine, Springfield, Illinois, USA.

Background: Intensively treated participants in the SPRINT study experienced fewer primary cardiovascular composite study endpoints (CVD events) and lower mortality, although 38% of participants experienced a serious adverse event (SAE). The relationship of SAEs with CVD events is unknown.

Methods: CVD events were defined as either myocardial infarction, acute coronary syndrome, decompensated heart failure, stroke, or death from cardiovascular causes. Cox models were utilized to understand the occurrence of SAEs with CVD events according to baseline atherosclerotic cardiovascular disease (ASCVD) risk.

Results: SAEs occurred in 96% of those experiencing a CVD event but in only 34% (P < 0.001) of those not experiencing a CVD event. Occurrence of SAEs increased monotonically across the range of baseline ASCVD risk, being approximately twice as great in the highest compared with the lowest risk category. SAE occurrence was strongly associated with ASCVD risk but was similar within risk groups across treatment arms. In adjusted Cox models, experiencing a CVD event was the strongest predictor of SAEs in all risk groups. By the end of year 1, the hazard ratios for the low, middle, and high ASCVD risk tertiles and the baseline clinical CVD group were 2.56 (95% CI = 1.39-4.71), 2.52 (1.63-3.89), 3.61 (2.79-4.68), and 1.86 (1.37-2.54), respectively, a trend observed in subsequent years until study end. Intensive treatment independently predicted SAEs only in the second ASCVD risk tertile.

Conclusions: The occurrence of SAEs is multifactorial and mostly related to prerandomization patient characteristics, most prominently ASCVD risk, which, in turn, relates to in-study CVD events.
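The kind of tabulation underlying these results can be sketched as follows. The data are simulated and the rates made up; it only illustrates grouping SAE occurrence by baseline ASCVD risk tertile and in-study CVD event status.

```python
# Illustrative tabulation on simulated data (not SPRINT data): SAE occurrence by
# baseline ASCVD risk tertile and by whether a CVD event occurred in-study.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n = 5000
df = pd.DataFrame({
    "ascvd_risk": rng.uniform(0, 40, n),          # hypothetical 10-year risk, %
    "cvd_event": rng.random(n) < 0.06,
})
df["sae"] = rng.random(n) < 0.25 + 0.30 * df["cvd_event"] + 0.005 * df["ascvd_risk"]
df["risk_tertile"] = pd.qcut(df["ascvd_risk"], 3, labels=["low", "middle", "high"])

print(df.groupby(["risk_tertile", "cvd_event"], observed=True)["sae"].mean().round(2))
```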
Source
http://dx.doi.org/10.1093/ajh/hpaa010
May 2020

Prevalence of refractory hypertension in the United States from 1999 to 2014.

J Hypertens 2019 09;37(9):1797-1804

Department of Internal Medicine, Division of General Internal Medicine, Hypertension Section, Southern Illinois University School of Medicine, Springfield, Illinois.

Objectives: Refractory hypertension has been defined as uncontrolled blood pressure (at or above 140/90 mmHg) when on five or more classes of antihypertensive medication, inclusive of a diuretic. Because unbiased estimates of the prevalence of refractory hypertension in the United States are lacking, we aim to provide such estimates using data from the National Health and Nutrition Examination Surveys (NHANES).

Methods: Refractory hypertension was assessed across multiple NHANES cycles using the aforementioned definition. Eight cycles of NHANES surveys (1999-2014) representing 41 552 patients are the subject of this study. Prevalence of refractory hypertension across these surveys was estimated in the drug-treated hypertensive population after adjusting for the complex survey design and standardizing for age.

Results: Across all surveys, refractory hypertension prevalence was 0.6% [95% confidence interval (CI) (0.5, 0.7)] amongst drug-treated hypertensive adults; 6.2% [95% CI (5.1, 7.6)] of individuals with treatment-resistant hypertension actually had refractory hypertension. Although the prevalence of refractory hypertension ranged from 0.3% [95% CI (0.1, 1.0)] to 0.9% [95% CI (0.6, 1.2)] over the eight cycles considered, there was no significant trend in prevalence over time. Refractory hypertension prevalence amongst those prescribed five or more drugs was 34.5% [95% CI (27.9, 41.9)]. Refractory hypertension was associated with advancing age, lower household income, black race, and also chronic kidney disease, albuminuria, diabetes, prior stroke, and coronary heart disease.

Conclusions: We provided the first nationally representative estimate of refractory hypertension prevalence in US adults.
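A minimal sketch of a survey-weighted prevalence estimate, assuming hypothetical examination weights; the actual NHANES analysis additionally accounts for the complex survey design (strata and primary sampling units) and age-standardizes the estimates.

```python
# Minimal sketch with hypothetical weights, not the NHANES analysis itself:
# a survey-weighted prevalence estimate among drug-treated hypertensive adults.
import numpy as np

rng = np.random.default_rng(3)
n = 10_000
weights = rng.gamma(2.0, 1.5, n)            # stand-in for NHANES examination weights
refractory = rng.random(n) < 0.006          # meets the refractory-hypertension definition

weighted_prev = np.average(refractory, weights=weights)
print(f"weighted prevalence: {weighted_prev:.2%}")
```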
Source
http://dx.doi.org/10.1097/HJH.0000000000002103
September 2019

Fixed-rate insulin for adult diabetic ketoacidosis is associated with more frequent hypoglycaemia than rate-reduction method: a retrospective cohort study.

Int J Pharm Pract 2019 Aug 8;27(4):380-385. Epub 2019 Mar 8.

HSHS St. John's Hospital, Springfield, IL, USA.

Objective: To assess whether hypoglycaemia incidence during management of adult diabetic ketoacidosis (DKA) differed following transition from a fixed-rate insulin protocol to a protocol using an empiric insulin rate reduction after normoglycaemia.

Methods: We retrospectively reviewed charts from adult patients managed with a DKA order set before and after order set revision. In cohort 1 (n = 77), insulin rate was 0.1 unit/kg/h with no adjustments and dextrose was infused at 12.5 g/h after glucose reached 250 mg/dl. In cohort 2 (n = 78), insulin was reduced to 0.05 unit/kg/h concurrent with dextrose initiation at 12.5 g/h after glucose reached 200 mg/dl. The primary outcome was hypoglycaemia (glucose < 70 mg/dl) within 24 h of the first order for insulin.

Key Findings: The 24-h incidence of hypoglycaemia was 19.2% in cohort 2 versus 32.5% in cohort 1; the adjusted odds ratio was 0.46 (95% confidence interval (CI) [0.21, 0.98]; P = 0.047). The 24-h use of dextrose 50% in water (D50W) was also reduced in cohort 2. No differences were seen in anion gap or bicarbonate normalization, rebound hyperglycaemia or ICU length of stay. In most patients who became hypoglycaemic, the preceding glucose value was below 100 mg/dl.

Conclusions: The insulin rate-reduction protocol was associated with less hypoglycaemia and no obvious disadvantage. Robust intervention for low-normal glucose values could plausibly achieve low hypoglycaemia rates with either approach.
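The adjusted odds ratio reported above can be illustrated with a short sketch on simulated data with a hypothetical covariate (body weight); it is not the study's analysis.

```python
# Sketch only (simulated data, hypothetical covariate): an adjusted odds ratio
# for 24-h hypoglycaemia, comparing the rate-reduction and fixed-rate cohorts.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 155
df = pd.DataFrame({
    "cohort2": rng.integers(0, 2, n),                # 1 = rate-reduction protocol
    "weight_kg": rng.normal(85, 20, n),
})
logit_p = -0.7 - 0.8 * df["cohort2"] + 0.005 * (df["weight_kg"] - 85)
df["hypo"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

fit = smf.logit("hypo ~ cohort2 + weight_kg", data=df).fit(disp=0)
print("adjusted OR:", np.exp(fit.params["cohort2"]).round(2))
print(np.exp(fit.conf_int().loc["cohort2"]).round(2))   # 95% CI for the OR
```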
Source
http://dx.doi.org/10.1111/ijpp.12525
August 2019

Predicting the risk of apparent treatment-resistant hypertension: a longitudinal, cohort study in an urban hypertension referral clinic.

J Am Soc Hypertens 2018 11 20;12(11):809-817. Epub 2018 Sep 20.

Division of General Internal Medicine, Department of Internal Medicine, Hypertension Section, Southern Illinois University School of Medicine, Springfield, IL, USA. Electronic address:

Apparent treatment-resistant hypertension (aTRH) is associated with a higher prevalence of secondary hypertension and a greater risk for adverse pressure-related clinical outcomes, and it influences diagnostic and therapeutic decision-making. We previously showed that cross-sectional prevalence estimates of aTRH are lower than its true prevalence, as patients with uncontrolled hypertension undergoing intensification/optimization of therapy will, over time, increasingly satisfy diagnostic criteria for aTRH. aTRH was assessed in an urban referral hypertension clinic using a 140/90 mm Hg goal blood pressure target in 745 patients with uncontrolled blood pressure, who were predominantly African-American (86%) and female (65%). Analyses were stratified according to existing prescription of a diuretic at the initial visit. Risk for aTRH was estimated using logistic regression with patient characteristics at the index visit as predictors. Among those prescribed diuretics, 84/363 developed aTRH; the risk score discriminated well (area under the receiver operating characteristic curve = 0.77, bootstrapped 95% CI [0.71, 0.81]). In patients not prescribed a diuretic, 44/382 developed aTRH, and the risk score showed significantly better discriminative ability (area under the receiver operating characteristic curve = 0.82 [0.76, 0.87]; P < .001). In the diuretic and nondiuretic cohorts, 145/363 and 290/382 patients, respectively, had an estimated risk of developing aTRH below 15%. Of these low-risk patients, 139/145 and 278/290 did not develop aTRH (negative predictive value: 0.94 [0.91, 0.98] with diuretics and 0.95 [0.93, 0.97] without). We created a novel clinical score that discriminates well between those who will and will not develop aTRH, especially among those without existing diuretic prescriptions. Irrespective of baseline diuretic treatment status, a low-risk score had a very high negative predictive value.
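A hedged sketch of the risk-score workflow on simulated data (hypothetical predictors, not the clinic cohort): fit a logistic regression, quantify discrimination with the area under the ROC curve, and compute the negative predictive value at a 15% predicted-risk threshold.

```python
# Sketch on simulated data: logistic-regression risk score, discrimination (AUC),
# and negative predictive value among patients with predicted risk below 15%.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
n = 745
X = rng.normal(size=(n, 4))                                 # e.g. SBP, BMI, eGFR, CKD flag
lin = -2.2 + X @ np.array([0.8, 0.4, -0.3, 0.5])
y = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

model = LogisticRegression().fit(X, y)
risk = model.predict_proba(X)[:, 1]
print("AUC:", round(roc_auc_score(y, risk), 2))

low_risk = risk < 0.15
print("NPV:", round((y[low_risk] == 0).mean(), 2))          # low-risk patients free of aTRH
```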
Source
http://dx.doi.org/10.1016/j.jash.2018.09.006
November 2018

Clay content and pH: soil characteristic associations with the persistent presence of chronic wasting disease in northern Illinois.

Sci Rep 2017 12 22;7(1):18062. Epub 2017 Dec 22.

Illinois Natural History Survey - Prairie Research Institute, University of Illinois Urbana-Champaign, 1816 S Oak Street, Champaign, IL, 61820, USA.

Environmental reservoirs are important to infectious disease transmission and persistence, but empirical analyses are relatively few. The natural environment is a reservoir for prions that cause chronic wasting disease (CWD) and influences the risk of transmission to susceptible cervids. Soil is one environmental component demonstrated to affect prion infectivity and persistence. Here we provide the first landscape predictive model for CWD based solely on soil characteristics. We built a boosted regression tree model to predict the probability of the persistent presence of CWD in a region of northern Illinois using CWD surveillance in deer and soils data. We evaluated the outcome for possible pathways by which soil characteristics may increase the probability of CWD transmission via environmental contamination. Soil clay content and pH were the most important predictive soil characteristics of the persistent presence of CWD. The results suggest that exposure to prions in the environment is greater where percent clay is less than 18% and soil pH is greater than 6.6. These characteristics could alter availability of prions immobilized in soil and contribute to the environmental risk factors involved in the epidemiological complexity of CWD infection in natural populations of white-tailed deer.
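A minimal boosted-tree sketch on synthetic data, echoing (but not reproducing) the paper's approach: clay content and pH as predictors of persistent CWD presence, with scikit-learn's gradient boosting standing in for the boosted regression trees used in the study.

```python
# Sketch on synthetic data (assumed soil rule, not the surveillance data):
# a gradient-boosted tree classifier predicting persistent CWD presence
# from clay content and pH.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(6)
n = 1000
clay_pct = rng.uniform(5, 40, n)
soil_ph = rng.uniform(5.0, 8.0, n)
# Assumed relationship echoing the findings: higher risk at low clay and high pH.
p = 1 / (1 + np.exp(-(-1.0 - 0.15 * (clay_pct - 18) + 1.2 * (soil_ph - 6.6))))
y = (rng.random(n) < p).astype(int)

X = np.column_stack([clay_pct, soil_ph])
brt = GradientBoostingClassifier(n_estimators=200, learning_rate=0.05, max_depth=2)
brt.fit(X, y)
print(dict(zip(["clay_pct", "soil_ph"], brt.feature_importances_.round(2))))
```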
Source
http://dx.doi.org/10.1038/s41598-017-18321-x
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5741720
December 2017

Detecting signals of chronic shedding to explain pathogen persistence: Leptospira interrogans in California sea lions.

J Anim Ecol 2017 May 3;86(3):460-472. Epub 2017 Apr 3.

Department of Ecology and Evolutionary Biology, University of California - Los Angeles, Los Angeles, CA, USA.

Identifying mechanisms driving pathogen persistence is a vital component of wildlife disease ecology and control. Asymptomatic, chronically infected individuals are an oft-cited potential reservoir of infection, but demonstrations of the importance of chronic shedding to pathogen persistence at the population level remain scarce. Studying chronic shedding using commonly collected disease data is hampered by numerous challenges, including short-term surveillance that focuses on single epidemics and acutely ill individuals, the subtle dynamical influence of chronic shedding relative to more obvious epidemic drivers, and poor ability to differentiate between the effects of population prevalence of chronic shedding vs. intensity and duration of chronic shedding in individuals. We use chronic shedding of Leptospira interrogans serovar Pomona in California sea lions (Zalophus californianus) as a case study to illustrate how these challenges can be addressed. Using leptospirosis-induced strandings as a measure of disease incidence, we fit models with and without chronic shedding, and with different seasonal drivers, to determine the time-scale over which chronic shedding is detectable and the interactions between chronic shedding and seasonal drivers needed to explain persistence and outbreak patterns. Chronic shedding can enable persistence of L. interrogans within the sea lion population. However, the importance of chronic shedding was only apparent when surveillance data included at least two outbreaks and the intervening inter-epidemic trough during which fadeout of transmission was most likely. Seasonal transmission, as opposed to seasonal recruitment of susceptibles, was the dominant driver of seasonality in this system, and both seasonal factors had limited impact on long-term pathogen persistence. We show that the temporal extent of surveillance data can have a dramatic impact on inferences about population processes, where the failure to identify both short- and long-term ecological drivers can have cascading impacts on understanding higher order ecological phenomena, such as pathogen persistence.
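The role of chronic shedding can be sketched with a toy compartmental model; the structure and parameter values below are hypothetical, not the fitted model from the paper. A fraction of recovering animals enter a chronically shedding class that contributes, more weakly, to the force of infection.

```python
# Toy compartmental sketch (hypothetical parameters): S-I-C-R dynamics in which
# a fraction rho of recovering animals become chronic shedders (C) that
# contribute weakly to the force of infection.
import numpy as np
from scipy.integrate import odeint

def deriv(y, t, beta, beta_c, gamma, rho, delta, mu, N):
    S, I, C, R = y
    foi = beta * I / N + beta_c * C / N        # acute + chronic contributions
    dS = mu * N - foi * S - mu * S
    dI = foi * S - gamma * I - mu * I
    dC = rho * gamma * I - delta * C - mu * C
    dR = (1 - rho) * gamma * I + delta * C - mu * R
    return dS, dI, dC, dR

N = 50_000
y0 = (N - 10.0, 10.0, 0.0, 0.0)
t = np.linspace(0, 5 * 365, 2000)              # five years
params = (0.3, 0.02, 1 / 10, 0.2, 1 / 60, 1 / (8 * 365), N)
S, I, C, R = odeint(deriv, y0, t, args=params).T
print(f"chronic shedders at end of run: {C[-1]:.0f}")
```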
Source
http://dx.doi.org/10.1111/1365-2656.12656
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7166352
May 2017

Inferring infection hazard in wildlife populations by linking data across individual and population scales.

Ecol Lett 2017 03 16;20(3):275-292. Epub 2017 Jan 16.

Department of Ecology & Evolutionary Biology, UCLA, Los Angeles, CA, 90095, USA.

Our ability to infer unobservable disease-dynamic processes such as force of infection (FOI; the infection hazard experienced by susceptible hosts) has transformed our understanding of disease transmission mechanisms and capacity to predict disease dynamics. Conventional methods for inferring FOI estimate a time-averaged value and are based on population-level processes. Because many pathogens exhibit epidemic cycling and FOI is the result of processes acting across the scales of individuals and populations, a flexible framework that extends to epidemic dynamics and links within-host processes to FOI is needed. Specifically, within-host antibody kinetics in wildlife hosts can be short-lived and produce patterns that are repeatable across individuals, suggesting individual-level antibody concentrations could be used to infer time since infection and hence FOI. Using simulations and case studies (influenza A in lesser snow geese and Yersinia pestis in coyotes), we argue that with careful experimental and surveillance design, the population-level FOI signal can be recovered from individual-level antibody kinetics, despite substantial individual-level variation. In addition to improving inference, the cross-scale quantitative antibody approach we describe can reveal insights into drivers of individual-based variation in disease response, and the role of poorly understood processes such as secondary infections, in population-level dynamics of disease.
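The core idea can be sketched with an assumed exponential titre decay: if peak titre and decay rate are roughly repeatable across individuals, an observed titre can be inverted to an approximate time since infection, and those times can then be aggregated into an FOI estimate. The values below are hypothetical.

```python
# Hedged sketch of the core idea (hypothetical kinetics): invert an observed
# titre to an approximate time since infection under assumed decay parameters.
import numpy as np

half_life_days = 90.0                   # assumed antibody half-life
k = np.log(2) / half_life_days
titre0 = 1024.0                         # assumed peak titre shortly after infection

def days_since_infection(titre):
    """Invert titre(t) = titre0 * exp(-k * t) for t."""
    return np.log(titre0 / titre) / k

sampled_titres = np.array([900.0, 512.0, 128.0, 64.0])   # cross-sectional sample
print(days_since_infection(sampled_titres).round(0))
```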
Source
http://dx.doi.org/10.1111/ele.12732
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7163542
March 2017

Epidemiological models to control the spread of information in marine mammals.

Proc Biol Sci 2016 12;283(1844)

Department of Ecology and Evolutionary Biology, University of California Los Angeles, CA 90095-1606, USA.

Socially transmitted wildlife behaviours that create human-wildlife conflict are an emerging problem for conservation efforts, but also provide a unique opportunity to apply principles of infectious disease control to wildlife management. As an example, California sea lions (Zalophus californianus) have learned to exploit concentrations of migratory adult salmonids below the fish ladders at Bonneville Dam, impeding endangered salmonid recovery. Proliferation of this foraging behaviour in the sea lion population has resulted in a controversial culling programme of individual sea lions at the dam, but the impact of such culling remains unclear. To evaluate the effectiveness of current and alternative culling strategies, we used network-based diffusion analysis on a long-term dataset to demonstrate that social transmission is implicated in the increase in dam-foraging behaviour and then studied different culling strategies within an epidemiological model of the behavioural transmission data. We show that current levels of lethal control have substantially reduced the rate of social transmission, but failed to effectively reduce overall sea lion recruitment. Earlier implementation of culling could have substantially reduced the extent of behavioural transmission and, ultimately, resulted in fewer animals being culled. Epidemiological analyses offer a promising tool to understand and control socially transmissible behaviours.
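A toy simulation of the underlying logic, with hypothetical rates rather than the fitted network-based diffusion model: treat acquisition of the dam-foraging behaviour as a transmission process and culling as removal of behaviourally "infectious" animals, then compare outcomes with and without culling.

```python
# Toy simulation with hypothetical rates: social transmission of a foraging
# behaviour, with culling treated as removal of informed individuals.
import numpy as np

def simulate(beta, cull_frac, years=15, n0=1000, i0=2, recruits=30, seed=0):
    rng = np.random.default_rng(seed)
    naive, informed = n0 - i0, i0
    for _ in range(years):
        n = naive + informed
        new_learners = rng.binomial(naive, 1 - np.exp(-beta * informed / n))
        culled = rng.binomial(informed + new_learners, cull_frac)
        informed = informed + new_learners - culled
        naive = naive - new_learners + recruits
    return informed

print("informed, no culling:  ", simulate(beta=0.8, cull_frac=0.0))
print("informed, with culling:", simulate(beta=0.8, cull_frac=0.3))
```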
Source
http://dx.doi.org/10.1098/rspb.2016.2037
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5204154
December 2016

Mapping U.S. cattle shipment networks: Spatial and temporal patterns of trade communities from 2009 to 2011.

Prev Vet Med 2016 Nov 3;134:82-91. Epub 2016 Oct 3.

Department of Biology, Colorado State University, Fort Collins, CO, USA.

The application of network analysis to cattle shipments broadens our understanding of shipment patterns beyond pairwise interactions to the network as a whole. Such a quantitative description of cattle shipments in the U.S. can identify trade communities, describe temporal shipment patterns, and inform the design of disease surveillance and control strategies. Here, we analyze a longitudinal dataset of beef and dairy cattle shipments from 2009 to 2011 in the United States to characterize communities within the broader cattle shipment network, which are groups of counties that ship mostly to each other. Because shipments occur over time, we aggregate the data at various temporal scales to examine the consistency of network and community structure over time. Our results identified nine large (>50 counties) communities based on shipments of beef cattle in 2009 aggregated into an annual network and nine large communities based on shipments of dairy cattle. The size and connectance of the shipment network was highly dynamic; monthly networks were smaller than yearly networks and revealed seasonal shipment patterns consistent across years. Comparison of the shipment network over time showed largely consistent shipping patterns, such that communities identified on annual networks of beef and dairy shipments from 2009 still represented 41-95% of shipments in monthly networks from 2009 and 41-66% of shipments from networks in 2010 and 2011. The temporal aspects of cattle shipments suggest that future applications of the U.S. cattle shipment network should consider seasonal shipment patterns. However, the consistent within-community shipping patterns indicate that yearly communities could provide a reasonable way to group regions for management.
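A small sketch of the community-detection step on a toy shipment network; the county names and shipment counts are made up, and modularity-based detection here stands in for whichever community-detection method the authors applied.

```python
# Toy sketch of the community-detection step (made-up counties and counts):
# modularity-based communities on a weighted county-to-county shipment network.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

edges = [  # (origin county, destination county, shipments)
    ("county_A", "county_B", 40), ("county_B", "county_A", 35),
    ("county_A", "county_C", 22), ("county_C", "county_B", 15),
    ("county_D", "county_E", 50), ("county_E", "county_D", 48),
    ("county_E", "county_F", 30), ("county_F", "county_D", 12),
]
G = nx.DiGraph()
G.add_weighted_edges_from(edges)

communities = greedy_modularity_communities(G.to_undirected(), weight="weight")
for i, members in enumerate(communities):
    print(f"community {i}: {sorted(members)}")
```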
Source
http://dx.doi.org/10.1016/j.prevetmed.2016.09.023
November 2016

Identification of migratory bird flyways in North America using community detection on biological networks.

Ecol Appl 2016 Apr;26(3):740-51

Migratory behavior of waterfowl populations in North America has traditionally been broadly characterized by four north-south flyways, and these flyways have been central to the management of waterfowl populations for more than 80 yr. However, previous flyway characterizations are not easily updated with current bird movement data and fail to provide assessments of the importance of specific geographical regions to the identification of flyways. Here, we developed a network model of migratory movement for four waterfowl species, Mallard (Anas platyrhynchos), Northern Pintail (A. acuta), American Green-winged Teal (A. carolinensis), and Canada Goose (Branta canadensis), in North America, using bird band and recovery data. We then identified migratory flyways using a community detection algorithm and characterized the importance of smaller geographic regions in identifying flyways using a novel metric, the consolidation factor. We identified four main flyways for Mallards, Northern Pintails, and American Green-winged Teal, with the flyway identification in Canada Geese exhibiting higher complexity. For Mallards, flyways were relatively consistent through time. However, consolidation factors revealed that for Mallards and Green-winged Teal, the presumptive Mississippi flyway was potentially a zone of high mixing between other flyways. Our results demonstrate that the network approach provides a robust method for flyway identification that is widely applicable given the relatively minimal data requirements and is easily updated with future movement data to reflect changes in flyway definitions and management goals.
Source
http://dx.doi.org/10.1890/15-0934
April 2016

Mapping influenza transmission in the ferret model to transmission in humans.

Elife 2015 Sep 2;4. Epub 2015 Sep 2.

Department of Ecology and Evolutionary Biology, University of California, Los Angeles, Los Angeles, United States.

The controversy surrounding 'gain-of-function' experiments on high-consequence avian influenza viruses has highlighted the role of ferret transmission experiments in studying the transmission potential of novel influenza strains. However, the mapping between influenza transmission in ferrets and in humans is unsubstantiated. We address this gap by compiling and analyzing 240 estimates of influenza transmission in ferrets and humans. We demonstrate that estimates of ferret secondary attack rate (SAR) explain 66% of the variation in human SAR estimates at the subtype level. Further analysis shows that ferret transmission experiments have potential to identify influenza viruses of concern for epidemic spread in humans, though small sample sizes and biological uncertainties prevent definitive classification of human transmissibility. Thus, ferret transmission experiments provide valid predictions of pandemic potential of novel influenza strains, though results should continue to be corroborated by targeted virological and epidemiological research.
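The subtype-level relationship can be sketched as a simple linear regression of human on ferret secondary attack rates; the numbers below are illustrative, not the compiled estimates.

```python
# Minimal sketch with made-up numbers (not the compiled dataset): subtype-level
# regression of human secondary attack rate on ferret secondary attack rate.
import numpy as np
from scipy.stats import linregress

ferret_sar = np.array([0.05, 0.20, 0.45, 0.60, 0.80])
human_sar = np.array([0.02, 0.10, 0.30, 0.35, 0.55])

fit = linregress(ferret_sar, human_sar)
print(f"slope = {fit.slope:.2f}, R^2 = {fit.rvalue ** 2:.2f}")   # variance explained
```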
Source
http://dx.doi.org/10.7554/eLife.07969
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4586390
September 2015

Utility of mosquito surveillance data for spatial prioritization of vector control against dengue viruses in three Brazilian cities.

Parasit Vectors 2015 Feb 15;8:98. Epub 2015 Feb 15.

Fogarty International Center, National Institutes of Health, Bethesda, Maryland, 20892, USA.

Background: Vector control remains the primary defense against dengue fever. Its success relies on the assumption that vector density is related to disease transmission. Two operational issues include the amount by which mosquito density should be reduced to minimize transmission and the spatio-temporal allotment of resources needed to reduce mosquito density in a cost-effective manner. Recently, a novel technology, MI-Dengue, was implemented city-wide in several Brazilian cities to provide real-time mosquito surveillance data for spatial prioritization of vector control resources. We sought to understand the role of city-wide mosquito density data in predicting disease incidence in order to provide guidance for prioritization of vector control work.

Methods: We used hierarchical Bayesian regression modeling to examine the role of city-wide vector surveillance data in predicting human cases of dengue fever in space and time. We used four years of weekly surveillance data from Vitoria city, Brazil, to identify the best model structure. We tested effects of vector density, lagged case data and spatial connectivity. We investigated the generality of the best model using an additional year of data from Vitoria and two years of data from other Brazilian cities: Governador Valadares and Sete Lagoas.

Results: We found that city-wide, neighborhood-level averages of household vector density were a poor predictor of dengue-fever cases in the absence of accounting for interactions with human cases. Effects of city-wide spatial patterns were stronger than within-neighborhood or nearest-neighborhood effects. Readily available proxies of spatial relationships between human cases, such as economic status, population density or between-neighborhood roadway distance, did not explain spatial patterns in cases better than unweighted global effects.

Conclusions: For spatial prioritization of vector controls, city-wide spatial effects should be given more weight than within-neighborhood or nearest-neighborhood connections, in order to minimize city-wide cases of dengue fever. More research is needed to determine which data could best inform city-wide connectivity. Once these data become available, MI-Dengue may be even more effective if vector control is spatially prioritized by considering city-wide connectivity between cases together with information on the location of mosquito density and infected mosquitoes.
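A much-simplified, non-hierarchical sketch of the modelling idea on synthetic weekly data: regress weekly case counts on lagged cases and a mosquito-density index with a Poisson GLM. The paper itself uses hierarchical Bayesian regression with spatial connectivity terms, which this does not reproduce.

```python
# Simplified sketch (synthetic weekly data, hypothetical effect sizes):
# Poisson GLM of weekly dengue cases on lagged cases and a vector-density index.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
weeks = 200
vector_index = rng.gamma(2.0, 1.0, weeks)     # hypothetical trap-based density index
cases = np.zeros(weeks, dtype=int)
for t in range(1, weeks):
    mu = np.exp(0.2 + 0.6 * np.log1p(cases[t - 1]) + 0.1 * vector_index[t])
    cases[t] = rng.poisson(mu)

df = pd.DataFrame({
    "cases": cases,
    "lag_log_cases": np.log1p(np.roll(cases, 1)),
    "vector_index": vector_index,
}).iloc[1:]
X = sm.add_constant(df[["lag_log_cases", "vector_index"]])
fit = sm.GLM(df["cases"], X, family=sm.families.Poisson()).fit()
print(fit.params.round(2))
```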
Source
http://dx.doi.org/10.1186/s13071-015-0659-y
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4335543
February 2015

Eight challenges in modelling disease ecology in multi-host, multi-agent systems.

Epidemics 2015 Mar 9;10:26-30. Epub 2014 Dec 9.

Department of Ecology and Evolutionary Biology, University of California-Los Angeles, Los Angeles, CA, USA; Fogarty International Center, National Institutes of Health, Bethesda, MD, USA.

Many disease systems exhibit complexities not captured by current theoretical and empirical work. In particular, systems with multiple host species and multiple infectious agents (i.e., multi-host, multi-agent systems) require novel methods to extend the wealth of knowledge acquired studying primarily single-host, single-agent systems. We outline eight challenges in multi-host, multi-agent systems that could substantively increase our knowledge of the drivers and broader ecosystem effects of infectious disease dynamics.
Source
http://dx.doi.org/10.1016/j.epidem.2014.10.001
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4437722
March 2015

Antibiotic Efficacy in Eliminating Leptospiruria in California Sea Lions (Zalophus californianus) Stranding with Leptospirosis.

Aquat Mamm 2015 26;41(2):203-212. Epub 2015 May 26.

Department of Ecology and Evolutionary Biology, University of California, Los Angeles, CA 90095, USA,

Stranded California sea lions (Zalophus californianus) along the California coast have been diagnosed with leptospirosis every year since at least the 1980s. Between September 2010 and November 2011, we followed 14 stranded California sea lions that survived to release and evaluated antibiotic efficacy in eliminating leptospiruria (urinary shedding of leptospires). Leptospiruria was assessed by real-time PCR of urine and urine culture, with persistence assessed using longitudinally collected samples. Serum chemistry was used to assess recovery of normal renal function. Microscopic agglutination testing (MAT) was performed to assess serum anti-Leptospira antibody titers, and the MAT reactivity patterns were consistent with the serovar Pomona infection frequently observed in this population. Animals were initially treated for 6 to 16 d (median = 10.5; mean = 10.8) with antibiotics from the penicillin family, with some receiving additional antibiotics to treat other medical conditions. All urine cultures were negative; therefore, the presence of leptospiruria was assessed using PCR. Leptospiruria continued beyond the initial course of penicillin family antibiotics in 13 of the 14 sea lions, beyond the last antibiotic dose in 11 of the 14 sea lions, beyond recovery of renal function in 13 of the 14 sea lions, and persisted for at least 8 to 86 d (median = 45; mean = 46.8). Five animals were released with no negative urine PCR results detected; thus, their total shedding duration may have been longer. Cessation of leptospiruria was more likely in animals that received antibiotics for a greater duration, especially if coverage was uninterrupted. Real-time PCR results indicate that an antibiotic protocol commonly used to treat leptospirosis in rehabilitating California sea lions does not eliminate leptospiruria. It is possible that antibiotic protocols given for a longer duration and/or including other antibiotics may be effective in eliminating leptospiruria. These results may have important human and animal health implications, especially in rehabilitation facilities, as transmission may occur through contact with animals with persistent leptospiruria.
Source
http://dx.doi.org/10.1578/AM.41.2.2015.203
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6379896
May 2015

The impact of movements and animal density on continental scale cattle disease outbreaks in the United States.

PLoS One 2014 26;9(3):e91724. Epub 2014 Mar 26.

Department of Biology, Colorado State University, Fort Collins, Colorado, United States of America.

Globalization has increased the potential for the introduction and spread of novel pathogens over large spatial scales, necessitating continental-scale disease models to guide emergency preparedness. Livestock disease spread models, such as those for the 2001 foot-and-mouth disease (FMD) epidemic in the United Kingdom, represent some of the best case studies of large-scale disease spread. However, generalization of these models to explore disease outcomes in other systems, such as the United States' cattle industry, has been hampered by differences in system size and complexity and the absence of suitable livestock movement data. Here, a unique database of US cattle shipments allows estimation of synthetic movement networks that inform a near-continental-scale disease model of a potential FMD-like (i.e., rapidly spreading) epidemic in US cattle. The largest epidemics may affect over one-third of the US and 120,000 cattle premises, but cattle movement restrictions from infected counties, as opposed to national movement moratoriums, are found to effectively contain outbreaks. Slow detection or weak compliance may necessitate more severe state-level bans for similar control. Such results highlight the role of large-scale disease models in emergency preparedness, particularly for systems lacking comprehensive movement and outbreak data, and the need to rapidly implement multi-scale contingency plans during a potential US outbreak.
Source
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0091724
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3966763
December 2015

A national-scale picture of U.S. cattle movements obtained from Interstate Certificate of Veterinary Inspection data.

Prev Vet Med 2013 Nov 16;112(3-4):318-29. Epub 2013 Aug 16.

Department of Biology, Colorado State University, Fort Collins, CO, USA.

We present the first comprehensive description of how shipments of cattle connect the geographic extent and production diversity of the United States cattle industry. We built a network of cattle movement from a state-stratified 10% systematic sample of calendar year 2009 Interstate Certificates of Veterinary Inspection (ICVI) data. ICVIs are required to certify the apparent health of cattle moving across state borders and allow us to examine cattle movements at the county scale. The majority of the ICVI sample consisted of small shipments (<20 head) moved for feeding and beef production. Geographically, the central plains states had the most connections, correlated with feeding infrastructure. The entire nation was closely connected when interstate movements were summarized at the state level. At the county level, the U.S. is still well connected geographically, but significant heterogeneities in the location and identity of counties central to the network emerge. Overall, the network of interstate movements is described by a hub structure, with a few counties sending or receiving extremely large numbers of shipments and many counties sending and receiving few shipments. The county-level network also has a very low proportion of reciprocal movements, indicating that high-order network properties may be better at describing a county's importance than simple summaries of the number of shipments or animals sent and received. We suggest that summarizing cattle movements at the state level homogenizes the network and that a county-level approach is most appropriate for examining processes influenced by cattle shipments, such as economic analyses and disease outbreaks.
Source
http://dx.doi.org/10.1016/j.prevetmed.2013.08.002
November 2013

Assessment of paper interstate certificates of veterinary inspection used to support disease tracing in cattle.

J Am Vet Med Assoc 2013 Aug;243(4):555-60

USDA APHIS Veterinary Services, Fort Collins, CO 80526, USA.

Objective: To evaluate the differences among each state's Interstate Certificate of Veterinary Inspection (ICVI) form and the legibility of data on paper ICVIs used to support disease tracing in cattle.

Design: Descriptive retrospective cross-sectional study.

Sample: Examples of ICVIs from 50 states and 7,630 randomly sampled completed paper ICVIs for cattle from 48 states.

Procedures: Differences among paper ICVI forms from all 50 states were determined. Sixteen data elements were selected for further evaluation of their value in tracing cattle. Completed paper ICVIs for interstate cattle exports in 2009 were collected from 48 states. Each of the 16 data elements was recorded as legible, absent, or illegible on forms completed by accredited veterinarians, and results were summarized by state. Mean values for legibility at the state level were used to estimate legibility of data at the national level.

Results: ICVIs were inconsistent among states in regard to data elements requested and availability of legible records. A mean ± SD of 70.0 ± 22.1% of ICVIs in each state had legible origin address information. Legible destination address information was less common, with 55.0 ± 21.4% of records complete. Incomplete address information was most often a result of the field having been left blank. Official animal identification was present on 33.1% of ICVIs.

Conclusions And Clinical Relevance: The inconsistency among state ICVI forms and quality of information provided on paper ICVIs could lead to delays and the need for additional resources to trace cattle, which could result in continued spread of disease. Standardized ICVIs among states and more thorough recording of information by accredited veterinarians or expanded usage of electronic ICVIs could enhance traceability of cattle during an outbreak.
Source
http://dx.doi.org/10.2460/javma.243.4.555
August 2013

Transmission shifts underlie variability in population responses to Yersinia pestis infection.

PLoS One 2011 25;6(7):e22498. Epub 2011 Jul 25.

Department of Biology, Colorado State University, Fort Collins, Colorado, United States of America.

Host populations for the plague bacterium, Yersinia pestis, are highly variable in their response to plague, ranging from near-deterministic extinction (i.e., epizootic dynamics) to a low probability of extinction despite persistent infection (i.e., enzootic dynamics). Much of the work to understand this variability has focused on specific host characteristics, such as population size and resistance, and their role in determining plague dynamics. Here, however, we advance the idea that the relative importance of alternative transmission routes may vary, causing shifts from epizootic to enzootic dynamics. We present a model that incorporates host and flea ecology with multiple transmission hypotheses to study how transmission shifts determine population responses to plague. Our results suggest enzootic persistence relies on infection of an off-host flea reservoir and epizootics rely on transiently maintained flea infection loads through repeated infectious feeds by fleas. In either case, early-phase transmission by fleas (i.e., transmission immediately following an infected blood meal) has been observed in laboratory studies, and we show that it is capable of driving plague dynamics at the population level. Sensitivity analysis of model parameters revealed that host characteristics (e.g., population size and resistance) vary in importance depending on transmission dynamics, suggesting that host ecology may scale differently through different transmission routes, enabling prediction of population responses in a more robust way than using either host characteristics or transmission shifts alone.
Source
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0022498
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3143141
December 2011