Publications by authors named "Ezra Naman"

10 Publications

Association between Taenia solium infection and HIV/AIDS in northern Tanzania: a matched cross-sectional study.

Infect Dis Poverty 2016 Dec 1;5(1):111. Epub 2016 Dec 1.

Department of Neurology, University Hospital, Klinikum rechts der Isar, Technical University Munich (TUM), Ismaninger Str. 22, 81675, Munich, Germany.

Background: The frequency of Taenia solium, a zoonotic helminth, is increasing in many countries of sub-Saharan Africa, where the prevalence of the human immunodeficiency virus (HIV) is also high. However, little is known about how these two infections interact. The aim of this study was to compare the proportion of HIV positive (+) and negative (-) individuals who are infected with Taenia solium (TSOL) and who present with clinical and neurological manifestations of cysticercosis (CC).

Methods: In northern Tanzania, 170 HIV+ individuals and 170 HIV- controls matched for gender, age and village of origin were recruited. HIV staging and serological tests for TSOL antibodies (Ab) and antigen (Ag) were performed. Neurocysticercosis (NCC) was determined by computed tomography (CT) using standard diagnostic criteria. Neurological manifestations were confirmed by a standard neurological examination. In addition, demographic, clinical and neuroimaging data were collected. Further, CD4 cell counts as well as information on highly active antiretroviral treatment (HAART) were noted.

Results: No significant differences between HIV+ and HIV- individuals were detected in the sero-prevalence of taeniosis-Ab (0.6% vs 1.2%), CC-Ab (2.4% vs 2.4%) or CC-Ag (0.6% vs 0.0%). A total of six NCC cases (3 HIV+ and 3 HIV-) were detected among the matched participants. Two individuals (1 HIV+ and 1 HIV-) presented with headache as the main symptom of NCC; the other four had asymptomatic NCC. Among the HIV+ group, TSOL infection was not associated with CD4 cell count, HAART duration or HIV stage.
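
The matched design implies a paired comparison of seroprevalence between the HIV+ and HIV- members of each pair. The abstract does not name the statistical test used; McNemar's test is one standard choice for matched binary data, sketched below with hypothetical pair-level counts (only the marginal totals, 1 vs 2 taeniosis-Ab positives, are given in the abstract).

    # Minimal sketch, not the authors' code: McNemar's exact test on matched pairs.
    # The pair-level table is a hypothetical assumption consistent with 1 seropositive
    # HIV+ participant and 2 seropositive HIV- participants among 170 matched pairs.
    from statsmodels.stats.contingency_tables import mcnemar

    # Rows: HIV+ member seropositive (yes/no); columns: HIV- member seropositive (yes/no).
    pairs = [[0,   1],
             [2, 167]]

    result = mcnemar(pairs, exact=True)  # exact binomial test on the discordant pairs
    print(f"McNemar p-value: {result.pvalue:.3f}")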

Conclusions: This study found a lower prevalence of taeniosis, CC and NCC than previously reported in the region. This low level of infection may explain the failure to detect cross-sectional associations between HIV status and TSOL infection or NCC. Larger sample sizes will be required in future studies in this area to determine whether HIV influences how NCC manifests.
Source
http://dx.doi.org/10.1186/s40249-016-0209-7
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5131417
December 2016

Antiretroviral treatment failure predicts mortality in rural Tanzania.

Int J STD AIDS 2015 Aug 13;26(9):633-9. Epub 2014 Aug 13.

Department of Infectious Diseases, Oslo University Hospital, Oslo, Norway; Vestre Viken HF, Drammen Hospital, Drammen, Norway.

Virological monitoring of HIV-infected patients on antiretroviral treatment (ART) is rarely available in resource-limited settings, and many patients experience unrecognized virological failure. We studied the long-term consequences of virological failure in rural Tanzania. Virological efficacy had previously been measured in this ART-treated cohort. In the present study, patients with virological failure (VF; HIV-RNA >400 copies/ml) were followed up and compared to those with virological response (VR; HIV-RNA <400 copies/ml) with regard to mortality, CD4 change and subsequent virological outcome. Fifty-six patients with VF had a median CD4 count of 358 cells/µl (interquartile range [IQR] 223-635) and a median HIV-RNA of 13,573 copies/ml (IQR 2,326-129,736). Median CD4 count for those with VR was 499 cells/µl (IQR 290-636). During a median follow-up time of 39 months (IQR 18-42), 8 of 56 patients (14.3%) with VF died, compared to 1 of 63 patients (1.6%) with VR (p = 0.009). All registered deaths were HIV-related. Of 55 patients with subsequent HIV-RNA measurements, only 12 of 30 (40%) patients with VF achieved virological suppression, compared to 20 of 25 (80%) patients with VR (p = 0.003). Virological failure predicted death and subsequent virological failure in patients on ART in a resource-limited setting.
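
A note on the mortality comparison above: the abstract reports 8/56 vs 1/63 deaths (p = 0.009) without naming the test. A Pearson chi-square on the 2×2 table, without continuity correction, is one plausible choice and gives a p-value in that neighbourhood; the sketch below is illustrative only, not the authors' code.

    from scipy.stats import chi2_contingency

    #          died  alive
    table = [[8, 48],   # virological failure (VF), n = 56
             [1, 62]]   # virological response (VR), n = 63

    chi2, p, dof, expected = chi2_contingency(table, correction=False)
    print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
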
Source
http://dx.doi.org/10.1177/0956462414548460
August 2015

Cytomegalovirus viremia in dried blood spots is associated with an increased risk of death in HIV-infected patients: a cohort study from rural Tanzania.

Int J Infect Dis 2012 Dec 30;16(12):e879-85. Epub 2012 Sep 30.

Department of Infectious Diseases, Oslo University Hospital, POB 4956 Nydalen, N-0424 Oslo, Norway.

Objectives: The objectives of the study were to assess the utility of dried blood spots (DBS) for the detection of cytomegalovirus (CMV) antibody and viremia in a resource-poor setting, to study the prevalence of CMV antibody and viremia in HIV-infected patients with access to antiretroviral therapy (ART) in Tanzania, and to relate CMV viremia to outcome.

Methods: DBS were prepared from 168 ART-naïve patients at baseline. Demographic, clinical, and laboratory data were obtained from patient records. CMV antibody was analyzed by chemiluminescent microparticle immunoassay and viremia by quantitative PCR.

Results: All patients were CMV-seropositive. At baseline 38 (22.6%) had detectable CMV viremia and 14 (8.3%) had a CMV viral load ≥200 copies/ml. In 135 patients available for follow-up, CMV ≥200 copies/ml was an independent risk factor for death with a hazard ratio of 5.0 (95% confidence interval 2.1-11.9) after adjusting for confounders. Symptoms compatible with CMV disease were common with viremia ≥200 copies/ml and CD4+ T cell counts <100 cells/mm³, but confirmatory diagnostic procedures were unavailable.
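
The adjusted hazard ratio of 5.0 points to a Cox proportional hazards model with baseline CMV viremia ≥200 copies/ml as the exposure. The confounders actually adjusted for are not listed in the abstract, so the sketch below uses simulated data and an assumed CD4 covariate purely for illustration; it is not the study analysis.

    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(0)
    n = 200
    cmv_ge_200 = rng.binomial(1, 0.25, n)                          # ~1 in 4 viremic at baseline
    cd4 = rng.normal(200, 80, n).clip(10)                          # assumed confounder (illustrative)
    time = rng.exponential(np.where(cmv_ge_200 == 1, 15.0, 45.0))  # simulated months to death
    died = (time < 36).astype(int)                                 # event observed within follow-up
    time = np.minimum(time, 36.0)                                  # administrative censoring at 36 months

    df = pd.DataFrame({"time": time, "died": died, "cmv_ge_200": cmv_ge_200, "cd4": cd4})
    cph = CoxPHFitter().fit(df, duration_col="time", event_col="died")
    print(cph.summary["exp(coef)"])  # adjusted hazard ratios for the toy data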

Conclusions: DBS are suitable for the detection of CMV antibody and viremia in HIV patients in resource-poor areas. CMV viremia was frequent and associated with an increased risk of death. Improved diagnosis and treatment of CMV may improve the prognosis for HIV-infected patients in developing countries and should be addressed in future studies.
Source
http://dx.doi.org/10.1016/j.ijid.2012.08.003
December 2012

Antiretroviral treatment reverses HIV-associated anemia in rural Tanzania.

BMC Infect Dis 2011 Jul 11;11:190. Epub 2011 Jul 11.

Department of Infectious Diseases, Oslo University Hospital, Ulleval, Oslo, Norway.

Background: HIV-associated anemia is common and associated with poor prognosis. However, its response to antiretroviral treatment (ART) in rural Africa is poorly understood.

Methods: HIV-infected adults (≥15 years) who enrolled in HIV care at Haydom Lutheran Hospital in northern Tanzania were included in the study. The effect of ART (zidovudine/stavudine + lamivudine + efavirenz/nevirapine) on HIV-associated anemia was studied in a subset of patients who were anemic at ART initiation and had a follow-up hemoglobin measurement 12 months later. Pregnant women were excluded from the study, as were women who had given birth within the past 6 weeks. Anemia was defined as hemoglobin <12 g/dL in women and <13 g/dL in men. We applied paired-sample t-tests to compare hemoglobin levels before and one year after ART initiation, and logistic regression models to identify predictors of persistent anemia.
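
The two analyses named in these methods translate directly into a short script. The sketch below is illustrative only (not the authors' code); the tiny data frame and its column names are hypothetical placeholders, and a single 12 g/dL anemia cut-off is used for simplicity rather than the study's sex-specific cut-offs.

    import pandas as pd
    from scipy.stats import ttest_rel
    import statsmodels.formula.api as smf

    # Hypothetical per-patient data: hemoglobin at ART start and 12 months later.
    df = pd.DataFrame({
        "hb_baseline": [9.1, 10.4, 8.7, 11.2, 9.8, 10.9, 8.2, 10.1],
        "hb_12months": [12.3, 12.8, 10.1, 13.0, 11.5, 11.0, 11.3, 12.6],
        "mcv_low":     [1, 0, 1, 0, 0, 1, 0, 1],   # MCV in lower quartile (<76.0 fL)
        "zidovudine":  [0, 1, 1, 0, 1, 0, 0, 1],   # zidovudine-containing initial regimen
    })

    # Paired-sample t-test: hemoglobin before vs 12 months after ART initiation.
    t, p = ttest_rel(df["hb_12months"], df["hb_baseline"])
    print(f"paired t = {t:.2f}, p = {p:.3f}")

    # Logistic regression for persistent anemia at 12 months.
    df["still_anemic"] = (df["hb_12months"] < 12).astype(int)
    model = smf.logit("still_anemic ~ mcv_low + zidovudine", data=df).fit(disp=0)
    print(model.params)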

Results: At enrollment, mean hemoglobin was 10.3 g/dL, and 649 of 838 patients (77.4%) were anemic. Of the anemic patients, 254 (39.1%) had microcytosis and hypochromia. Among 102 patients who were anemic at ART initiation and had a follow-up hemoglobin measurement after 12 months, the mean hemoglobin increased by 2.5 g/dL (P < 0.001); however, 39 patients (38.2%) were still anemic after 12 months of ART. Independent predictors of persistent anemia were mean cell volume in the lower quartile (<76.0 fL; Odds Ratio [OR] 4.34; 95% confidence interval [CI] 1.22-15.5) and a zidovudine-containing initial regimen (OR 2.91; 95% CI 1.03-8.19).

Conclusions: Most patients had anemia at enrollment, of whom nearly 40% had microcytosis and hypochromia suggestive of iron deficiency. The mean hemoglobin increased significantly in patients who received ART, but more than one third were still anemic 12 months after ART initiation, indicating that additional interventions to treat HIV-associated anemia in rural Africa might be warranted, particularly in patients with microcytosis and those treated with zidovudine.
Source
http://dx.doi.org/10.1186/1471-2334-11-190
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3145581
July 2011

HIV-1 drug resistance testing from dried blood spots collected in rural Tanzania using the ViroSeq HIV-1 Genotyping System.

J Antimicrob Chemother 2011 Feb 25;66(2):260-4. Epub 2010 Nov 25.

Department of Infectious Diseases, Oslo University Hospital, Ulleval, Oslo, Norway.

Objectives: To assess whether the commercial ViroSeq HIV-1 Genotyping System (Abbott Molecular, Des Plaines, IL, USA) can be used in conjunction with dried blood spots (DBS) for clinical monitoring of drug resistance in patients who fail antiretroviral treatment (ART) in rural Tanzania.

Patients And Methods: Patients at Haydom Lutheran Hospital with confirmed treatment failure (viral load >1000 copies/mL) of a first-line ART regimen were selected for resistance testing. DBS were stored with desiccant at -20 °C for a median of 126 days (range 0-203) and shipped at ambient temperature for 20 days. After manual extraction of nucleic acids, the ViroSeq kit was used for amplification and sequencing. DBS-derived genotypes were compared with those of a plasma-based assay.

Results: Seventeen of 36 (47%) DBS specimens were successfully genotyped. Only 2 of 16 (13%) DBS with a viral load <10,000 copies/mL could be amplified, compared with 15 of 20 (75%) DBS with a viral load >10,000 copies/mL (P = 0.001). In samples that yielded a sequence, all 23 clinically significant reverse transcriptase (RT) mutations in plasma were also detected in DBS. One RT mutation was found in DBS only. In the protease region, 77 polymorphisms were found in plasma, of which 70 (91%) were also detected in DBS. Sixteen of 17 (94%) patients had identical resistance profiles to antiretroviral drugs in plasma and DBS.

Conclusions: The ViroSeq kit performed well in patients with a high viral load, but failed to genotype most DBS with a viral load <10,000 copies/mL. In DBS that yielded a genotype, there was high concordance with a plasma-based assay.
Source
http://dx.doi.org/10.1093/jac/dkq433
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3019084
February 2011

HIV type-1 drug resistance testing on dried blood spots is feasible and reliable in patients who fail antiretroviral therapy in rural Tanzania.

Antivir Ther 2010;15(7):1003-9.

Department of Infectious Diseases, Oslo University Hospital, Ullevål, Oslo, Norway.

Background: HIV type-1 (HIV-1) drug resistance testing is rarely available in resource-limited settings because of high costs and stringent requirements for storage and transport of plasma. Dried blood spots (DBS) can be a convenient alternative to plasma, but the use of DBS needs validation under field conditions. We assessed the performance of DBS in genotypic resistance testing of patients who failed first-line antiretroviral therapy (ART) in rural Tanzania.

Methods: A total of 36 ART-experienced patients with viral loads >1,000 copies/ml (median 15,180 copies/ml [range 1,350-3,683,000]) and with various HIV-1 subtypes were selected for resistance testing. DBS were stored with desiccant at ambient temperature for a median of 29 days (range 8-89). Samples were amplified using an in-house reverse transcriptase-nested PCR method and sequenced using the ViroSeq™ assay (Abbott Molecular, Des Plaines, IL, USA). DBS-derived genotypes were compared with genotypes from plasma.

Results: Overall, 34 of 36 (94%) DBS specimens were successfully genotyped. In the protease region, of 142 polymorphisms found in plasma, 132 (93%) were also detected in DBS. In the reverse transcriptase region, of 57 clinically relevant mutations present in plasma, 51 (89%) were also detected in DBS. A total of 30 of 34 (88%) patients had identical resistance profiles to antiretroviral drugs in plasma and DBS.

Conclusions: Genotyping was successful in the vast majority of DBS specimens stored at ambient temperature for up to 3 months, and there was high concordance between mutations found in DBS and plasma. Our study suggests that DBS can be a feasible and reliable tool to monitor HIV-1 drug resistance in patients on ART in resource-limited settings.
Source
http://dx.doi.org/10.3851/IMP1660
February 2011

Drug resistance is widespread among children who receive long-term antiretroviral treatment at a rural Tanzanian hospital.

J Antimicrob Chemother 2010 Sep 24;65(9):1996-2000. Epub 2010 Jun 24.

Department of Infectious Diseases, Oslo University Hospital, Ulleval, Oslo, Norway.

Objectives: To assess long-term virological efficacy and the emergence of drug resistance in children who receive antiretroviral treatment (ART) in rural Tanzania.

Patients And Methods: Haydom Lutheran Hospital has provided ART to HIV-infected individuals since 2003. From February through May 2009, a cross-sectional virological efficacy survey was conducted among children (<15 years) who had completed ≥6 months of first-line non-nucleoside reverse transcriptase inhibitor (NNRTI)-based ART. Genotypic resistance was determined in those with a viral load of >200 copies/mL.

Results: Virological response was measured in 19 of 23 eligible children; 8 of 19 were girls, and median age at ART initiation was 5 years (range 2-14 years). Median duration of ART at the time of the survey was 40 months (range 11-61 months). Only 8 children were virologically suppressed.

Conclusions: Among children on long-term ART in rural Tanzania, >50% harboured drug resistance. Results for children were markedly poorer than for adults attending the same programme, underscoring the need for improved treatment strategies for children in resource-limited settings.
Source
http://dx.doi.org/10.1093/jac/dkq234
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2920178
September 2010

Dried blood spots perform well in viral load monitoring of patients who receive antiretroviral treatment in rural Tanzania.

Clin Infect Dis 2009 Sep;49(6):976-81

Ulleval Department of Infectious Diseases, Oslo University Hospital, Oslo, Norway.

Background: Monitoring of antiretroviral treatment (ART) with human immunodeficiency virus (HIV) viral loads, as recommended in industrialized countries, is rarely available in resource-limited settings because of the high costs and stringent requirements for storage and transport of plasma. Dried blood spots (DBS) can be an alternative to plasma, but the use of DBS has not been assessed under field conditions in rural Africa. The present study investigates the performance of DBS in HIV viral load monitoring of patients who received ART in rural Tanzania.

Patients And Methods: From November 2007 through June 2008, parallel plasma and DBS specimens were obtained from patients who received ART at Haydom Lutheran Hospital in rural Tanzania. DBS specimens were stored at tropical room temperature for 3 weeks before testing with the NucliSENS EasyQ HIV-1 v1.2 assay. Results obtained with DBS were compared with results obtained with use of a gold-standard plasma assay.

Results: Ninety-eight plasma-DBS pairs were compared, and plasma viral loads ranged from <40 to >1,000,000 copies/mL. The correlation between plasma and DBS viral load was strong (R² = 0.75). The mean difference (± standard deviation) was 0.04 ± 0.57 log10 copies/mL, and only 8 samples showed >1 log10 copies/mL difference. HIV type 1 RNA was detected in 7%, 60%, and 100% of DBS specimens with corresponding plasma viral loads of 40-999, 1000-2999, and ≥3000 copies/mL, respectively.
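
The agreement statistics quoted here (R², mean difference ± standard deviation on the log10 scale, and the number of pairs differing by more than 1 log10) can be computed from paired measurements in a few lines; the values below are illustrative placeholders, not study data.

    import numpy as np

    # Hypothetical paired log10 viral loads (copies/mL) from plasma and DBS.
    plasma = np.array([4.2, 5.1, 3.6, 6.0, 4.8, 2.9])
    dbs    = np.array([4.0, 5.3, 3.2, 5.9, 4.6, 3.4])

    r = np.corrcoef(plasma, dbs)[0, 1]   # Pearson correlation
    diff = dbs - plasma                  # Bland-Altman-style paired differences
    print(f"R^2 = {r**2:.2f}")
    print(f"mean difference = {diff.mean():.2f} +/- {diff.std(ddof=1):.2f} log10 copies/mL")
    print(f"pairs differing by >1 log10: {(np.abs(diff) > 1).sum()}")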

Conclusions: DBS, in combination with the NucliSENS EasyQ HIV-1 v1.2 assay, performed well in monitoring HIV viral loads in patients who received ART in rural Tanzania, although the sensitivity was reduced when viral burden was low. The use of DBS can simplify virological monitoring in resource-limited settings.
Source
http://dx.doi.org/10.1086/605502
September 2009

Virological efficacy and emergence of drug resistance in adults on antiretroviral treatment in rural Tanzania.

BMC Infect Dis 2009 Jul 7;9:108. Epub 2009 Jul 7.

Ulleval Department of Infectious Diseases, Oslo University Hospital, Oslo, Norway.

Background: Virological response to antiretroviral treatment (ART) in rural Africa is poorly described. We examined virological efficacy and emergence of drug resistance in adults receiving first-line ART for up to 4 years in rural Tanzania.

Methods: Haydom Lutheran Hospital has provided ART to HIV-infected patients since October 2003. A combination of stavudine or zidovudine with lamivudine and either nevirapine or efavirenz is the standard first-line regimen. Nested in a longitudinal cohort study of patients consecutively starting ART, we carried out a cross-sectional virological efficacy survey between November 2007 and June 2008. HIV viral load was measured in all adults who had completed at least 6 months of first-line ART, and genotypic resistance was determined in patients with a viral load >1000 copies/mL.

Results: Virological response was measured in 212 patients, of whom 158 (74.5%) were women, and median age was 35 years (interquartile range [IQR] 29-43). Median follow-up time was 22.3 months (IQR 14.0-29.9). Virological suppression, defined as <400 copies/mL, was observed in 187 patients (88.2%). Overall, prevalence of ≥1 clinically significant resistance mutation was 3.9, 8.4, 16.7 and 12.5% in patients receiving ART for 1, 2, 3 and 4 years, respectively. Among those successfully genotyped, the most frequent mutations were M184I/V (64%), conferring resistance to lamivudine, and K103N (27%), Y181C (27%) and G190A (27%), conferring resistance to non-nucleoside reverse transcriptase inhibitors (NNRTIs), whereas 23% had thymidine analogue mutations (TAMs), associated with cross-resistance to all nucleoside reverse transcriptase inhibitors (NRTIs). Dual-class resistance, i.e. resistance to both NRTIs and NNRTIs, was found in 64%.

Conclusion: Virological suppression rates were good up to 4 years after initiating ART in a rural Tanzanian hospital. However, drug resistance increased with time, and dual-class resistance was common, raising concerns about exhaustion of future antiretroviral drug options. This study might provide a useful forecast of drug resistance and demand for second-line antiretroviral drugs in rural Africa in the coming years.
Source
http://dx.doi.org/10.1186/1471-2334-9-108
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2713244
July 2009

Predictors of mortality in HIV-infected patients starting antiretroviral therapy in a rural hospital in Tanzania.

BMC Infect Dis 2008 Apr 22;8:52. Epub 2008 Apr 22.

Department of Infectious Diseases, Ulleval University Hospital, Oslo, Norway.

Background: Studies of antiretroviral therapy (ART) programs in Africa have shown high initial mortality. Factors contributing to this high mortality are poorly described. The aim of the present study was to assess mortality and to identify predictors of mortality in HIV-infected patients starting ART in a rural hospital in Tanzania.

Methods: This was a cohort study of 320 treatment-naïve adults who started ART between October 2003 and November 2006. Reliable CD4 cell counts were not available; thus, ART initiation was based on clinical criteria in accordance with WHO and Tanzanian guidelines. Kaplan-Meier models were used to estimate mortality and Cox proportional hazards models to identify predictors of mortality.
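
Both survival methods named here have standard implementations; the sketch below shows how cumulative mortality at 3, 12 and 36 months and a hazard ratio for a single predictor would be obtained. It is not the authors' code, and the small data frame is a hypothetical placeholder.

    import pandas as pd
    from lifelines import KaplanMeierFitter, CoxPHFitter

    # Hypothetical toy data: months to death or censoring, event indicator,
    # and severe anemia (hemoglobin <8 g/dL) at ART initiation.
    df = pd.DataFrame({
        "months":        [1, 3, 7, 12, 20, 30, 36, 36],
        "died":          [1, 1, 0, 1, 0, 1, 0, 0],
        "severe_anemia": [1, 1, 1, 0, 0, 1, 0, 0],
    })

    kmf = KaplanMeierFitter().fit(df["months"], event_observed=df["died"])
    print(1 - kmf.survival_function_at_times([3, 12, 36]))   # estimated cumulative mortality

    cph = CoxPHFitter().fit(df, duration_col="months", event_col="died")
    print(cph.summary["exp(coef)"])                          # hazard ratio for severe anemia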

Results: Patients were followed for a median of 10.9 months (IQR 2.9-19.5). Overall, 95 patients died, among whom 59 died within 3 months of starting ART. Estimated mortality was 19.2, 29.0 and 40.7% at 3, 12 and 36 months, respectively. Independent predictors of mortality were severe anemia (hemoglobin <8 g/dL; adjusted hazard ratio [AHR] 9.20; 95% CI 2.05-41.3), moderate anemia (hemoglobin 8-9.9 g/dL; AHR 7.50; 95% CI 1.77-31.9), thrombocytopenia (platelet count <150 × 10⁹/L; AHR 2.30; 95% CI 1.33-3.99) and severe malnutrition (body mass index <16 kg/m²; AHR 2.12; 95% CI 1.06-4.24). Estimated one-year mortality was 55.2% in patients with severe anemia, compared to 3.7% in patients without anemia (P < 0.001).

Conclusion: Mortality was found to be high, with the majority of deaths occurring within 3 months of starting ART. Anemia, thrombocytopenia and severe malnutrition were strong independent predictors of mortality. A prognostic model based on hemoglobin level appears to be a useful tool for initial risk assessment in resource-limited settings.
Source
http://dx.doi.org/10.1186/1471-2334-8-52
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2364629
April 2008