Publications by authors named "Yahya Daoud"

38 Publications

Association of Cavovarus Foot Alignment With Peroneal Tendon Tears.

Foot Ankle Int 2021 Apr 13:1071100721990348. Epub 2021 Apr 13.

Orthopedic Associates of Dallas, Baylor University Medical Center, Dallas, TX, USA.

Background: Although it is a widely accepted clinical principle that cavovarus deformity predisposes to peroneal tendon problems, there are limited data to support that assumption. This study tested the hypothesis that cavovarus is associated with peroneal tendon tears and evaluated which radiographic measures correlated with that association.

Methods: Radiographic measures of cavovarus in 234 consecutive patients operatively treated for chronically symptomatic peroneal tendon tears were retrospectively compared with those of a matched control group. Measures included calcaneal pitch, anteroposterior (AP) talometatarsal and talocalcaneal angles, and talonavicular coverage angle. A novel coordinate system analyzed the midfoot and hindfoot components of cavovarus. Analysis of variance was used to compare cohorts, and a Tukey-Kramer test was used to analyze the 3 subgroups of brevis tears, longus tears, and concomitant tears.
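As an illustration only (not the authors' code or data), a minimal sketch of the kind of analysis described here, comparing one radiographic measure across tear subgroups with one-way ANOVA and a Tukey-Kramer (Tukey HSD) post hoc test; the group labels and calcaneal pitch values are hypothetical.

```python
import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical calcaneal pitch values (degrees) by group; not study data.
df = pd.DataFrame({
    "tear_group": ["control"]*5 + ["brevis"]*5 + ["longus"]*5 + ["both"]*5,
    "calcaneal_pitch": [18, 19, 20, 17, 21,
                        24, 26, 25, 27, 23,
                        25, 28, 26, 27, 24,
                        26, 29, 27, 28, 25],
})

# One-way ANOVA across all groups.
groups = [g["calcaneal_pitch"].values for _, g in df.groupby("tear_group")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"ANOVA: F = {f_stat:.2f}, P = {p_value:.4f}")

# Tukey-Kramer accommodates unequal subgroup sizes in the pairwise comparisons.
print(pairwise_tukeyhsd(endog=df["calcaneal_pitch"], groups=df["tear_group"]))
```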

Results: The distribution of tears was 73% peroneus brevis, 8% peroneus longus, and 19% both tendons. Compared with controls, the study group and its subgroups had multiple measures of increased cavovarus, including greater calcaneal pitch (P = .0001), decreased AP talo-first metatarsal angle (P = .0001), and increased talonavicular coverage angle (P = .0001). An elevated medial longitudinal arch and rotational changes in the radiographic profiles of the hindfoot were found with the coordinate system described by Yokokura.

Conclusion: This study found a statistically significant association of increased cavovarus deformity with peroneal tendon tears, compared to controls. It documented the relative incidence of tears of peroneus brevis, peroneus longus, and concomitant tears in a large surgical series. It demonstrated which simple radiographic angles and complex coordinate measurements of cavovarus deformity were significantly associated with peroneal tendon tears.

Level Of Evidence: Level III, retrospective comparative cohort study.

Source
http://dx.doi.org/10.1177/1071100721990348
April 2021

Long-term Functional Results of Total Ankle Arthroplasty in Stiff Ankles.

Foot Ankle Int 2021 May 8;42(5):527-535. Epub 2021 Feb 8.

Medical University of South Carolina, Charleston, SC, USA.

Background: Total ankle arthroplasty (TAA) is advocated over ankle arthrodesis to preserve ankle range of motion (ROM). Clinical and gait analysis studies have shown significant improvement after TAA. The role and outcomes of TAA in stiff ankles, which have little motion to preserve, have been the subject of limited investigation. This investigation evaluated the mid- to long-term functional outcomes of TAA in stiff ankles.

Methods: A retrospective study of prospectively collected functional gait data in 33 TAA patients at a mean of 7.6 (range, 5-13) years postoperatively used 1-way analysis of variance and multivariate regression analysis to compare preoperative and postoperative demographic data (age, gender, body mass index, years postsurgery, and diagnosis) and gait parameters across quartiles of preoperative sagittal ROM.

Results: The stiffest ankles had a mean ROM of 7.8 degrees, compared to 14.3 degrees for the middle 2 quartiles, and 21.0 degrees for the most flexible ankles. Patients in the lowest quartile (Q1) also had statistically significantly lower step length, speed, max plantarflexion, and power preoperatively. Postoperatively, they increased step length, speed, max plantarflexion, and ankle power to levels comparable to patients with more flexible ankles preoperatively (Q2, Q3, and Q4). They had the greatest absolute and relative increases in these parameters of any group, but the final total ROM was still statistically significantly the lowest.

Conclusion: Preoperative ROM was predictive of overall postoperative gait function at an average of 7.6 (range 5-13) years. Although greater preoperative sagittal ROM predicted greater postoperative ROM, the stiffest ankles showed the greatest percentage increase in ROM. Patients with the stiffest ankles had the greatest absolute and relative improvements in objective function after TAA, as measured by multiple gait parameters. At intermediate- to long-term follow-up, patients with stiff ankles maintained significant functional improvements after TAA.

Level Of Evidence: Level III, comparative study.

Source
http://dx.doi.org/10.1177/1071100720977847
May 2021

Functional Outcomes of Total Ankle Arthroplasty at a Mean Follow-up of 7.6 Years: A Prospective, 3-Dimensional Gait Analysis.

J Bone Joint Surg Am 2021 Mar;103(6):477-482

Baylor University Medical Center, Dallas, Texas.

Background: In vivo gait analysis provides objective measurement of patient function and can quantify that function before and after ankle reconstruction. Previous gait studies have shown functional improvement for up to 4 years following total ankle arthroplasty (TAA), but to date, there are no published studies assessing function at ≥5 years following TAA. We hypothesized that patients who underwent TAA would show significant improvements in walking function at a minimum follow-up of 5 years, compared with their preoperative function, as measured by changes in temporospatial, kinematic, and kinetic gait parameters.

Methods: Three-dimensional gait analysis with a 12-camera digital motion-capture system and double force plates was utilized to record temporospatial, kinematic, and kinetic measures in 33 patients who underwent TAA with either the Scandinavian Total Ankle Replacement (Stryker; n = 28) or Salto Talaris Ankle (Integra LifeSciences; n = 5). Gait analysis was performed preoperatively and at a minimum follow-up of 5 years (mean, 7.6 years; range, 5 to 13 years).

Results: Significant improvements were observed in multiple gait parameters, with temporospatial increases in cadence (+9.5 steps/min; p < 0.0001), step length (+4.4 cm; p = 0.0013), and walking speed (+0.2 m/s; p < 0.0001), and kinematic increases in total sagittal range of motion (+2.0°; p = 0.0263), plantar flexion at initial contact (+2.7°; p = 0.0044), and maximum plantar flexion (+2.0°; p = 0.0488). Kinetic analysis revealed no loss of peak ankle power, despite patients aging.

Conclusions: To our knowledge, this is the first study to report 7-year functional outcomes of TAA, quantified by objective, in vivo measurements of patient gait. Patients were shown to have sustained improvement in multiple objective parameters of gait compared with preoperative function.

Level Of Evidence: Therapeutic Level IV. See Instructions for Authors for a complete description of levels of evidence.

Source
http://dx.doi.org/10.2106/JBJS.20.00659
March 2021

Correlation of Patient-Reported Outcomes With Physical Function After Total Ankle Arthroplasty.

Foot Ankle Int 2021 May 15;42(5):646-653. Epub 2021 Jan 15.

Baylor University Medical Center, Dallas, TX, USA.

Background: Total ankle arthroplasty (TAA) is successful by both subjective patient-reported outcome measures (PROMs) and objective functional improvements of gait. Each is reproducible and valid, but they are entirely distinct methods. This study investigated the correlation between subjective and objective outcomes of TAA.

Methods: Seventy patients underwent gait analysis preoperatively and 1 year after TAA. The 36-Item Short-Form Health Survey (SF-36), visual analog scale (VAS) score for pain, and American Orthopaedic Foot & Ankle Society (AOFAS) Ankle-Hindfoot Score were recorded at each interval. A Student t test, multivariate regression, and Pearson correlation coefficients were used to measure the correlation between gait parameters and PROMs.
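For orientation, a minimal sketch of the kind of correlation reported in this study: Pearson's r between a PROM (for example, the SF-36 Physical score) and an objective gait parameter (for example, walking speed). The values below are hypothetical placeholders, not the study dataset.

```python
import numpy as np
from scipy.stats import pearsonr

sf36_physical = np.array([38, 42, 45, 50, 55, 47, 60, 52])          # hypothetical scores
walking_speed = np.array([0.9, 1.0, 1.1, 1.2, 1.3, 1.1, 1.4, 1.2])  # m/s, hypothetical

r, p = pearsonr(sf36_physical, walking_speed)
# An r of roughly 0.3-0.5 is conventionally read as a moderate positive correlation.
print(f"Pearson r = {r:.2f}, P = {p:.4f}")
```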

Results: Patients had statistically significant improvements in gait velocity, total range of motion (ROM), maximum plantarflexion, ankle power, and SF-36 Physical, VAS, and AOFAS scores. The SF-36 Physical score had a moderate positive correlation with preoperative walking speed, step length, and ankle power and postoperative walking speed and ankle power. No correlation between VAS score and function was detected. The AOFAS score had a moderate positive correlation with postoperative walking speed, step length, and ankle power, and improvement in walking speed, cadence, and ankle power.

Conclusion: Statistically significant correlations were found between numerous preoperative and postoperative comparisons of PROMs and the AOFAS score with the objective biomechanical outcomes of gait. Walking speed and ankle push-off power correlated most with patient perceptions of function and improvement, while pain and ROM did not. Subjective PROMs and objective biomechanical outcomes were complementary in the assessment of surgical outcomes and, combined, helped to address the dilemma of the confounding effect of other lower extremity pathologies on PROMs.

Level Of Evidence: Level III, comparative series.

Source
http://dx.doi.org/10.1177/1071100720978428
May 2021

Instagram and Pilon Fractures: An Analysis of Social Media and Its Relationship to Patient Injury Perception.

Foot Ankle Spec 2020 Jul 20:1938640020940837. Epub 2020 Jul 20.

Baylor University Medical Center, Dallas, Texas.

The purpose of this study was to investigate social media posts regarding pilon fractures and their relationship to patient injury perception. We evaluated Instagram posts from patients who had suffered pilon fractures for the following variables: gender, tone, discussion of rehabilitation, activities of daily living (ADL) reference, incision/scar reference, pain, post of radiograph/imaging, external fixation reference, discussion of bracing/splinting, pre- or postoperative swelling, and need for reoperation. Results were determined by comparing each variable with the gender and tone of the post to study patient injury perception. Public Instagram posts from within a 1-year time period were isolated and evaluated using the hashtag "#pilonfracture." Individual posts were analyzed by the authors. In total, 241 patient posts were included for investigation and analysis of patient injury perception via social media. Of all included posts, 88% had a positive tone. A majority of the posts (66.8%) mentioned rehabilitation and postoperative progress. There were significant associations between positive tone and rehabilitation (P = .0001), as well as positive tone and ADLs (P = .0361). Reported outcomes after surgical management of pilon fractures are generally poor. Nonetheless, this analysis of patients sharing their experience on social media after open reduction internal fixation of pilon fractures demonstrates a mostly positive attitude toward the injury and recovery. A positive tone of the post was significantly associated with mentions of rehabilitation and ADLs. Level III: Retrospective comparative study.

Source
http://dx.doi.org/10.1177/1938640020940837
July 2020

Effects of Umbilical Cord Milking on Term Infants Delivered by Cesarean Section.

Am J Perinatol 2020 Feb 18. Epub 2020 Feb 18.

Department of Quantitative Sciences, Center for Clinical Effectiveness, Baylor Scott & White Health Care System, Dallas, Texas.

Objective:  Umbilical cord milking (UCM) is an efficient way to achieve optimal placental transfusion in term infants born by cesarean section (CS). However, it is not frequently performed due to concern for short-term adverse effects of increased blood volume, such as polycythemia and hyperbilirubinemia. The aim of this study is to evaluate the short-term effects of UCM on term infants delivered by CS.

Study Design:  We conducted a pre- and postimplementation cohort study comparing term infants delivered by CS who received UCM five times (141 infants, UCM group) during a 6-month period (August 1, 2017 to January 31, 2018) to those who received immediate cord clamping (ICC) during the same time period (105 infants, postimplementation ICC) and during a 3-month period (October 1, 2016 to December 31, 2016) prior to the implementation of UCM (141 infants, preimplementation ICC).

Results:  Mothers were older in the UCM group compared with both ICC groups. There were no significant differences in other maternal or neonatal characteristics. Although this study was not powered to detect differences in outcomes, the occurrence of hyperbilirubinemia needing phototherapy, symptomatic polycythemia, NICU admissions, or readmissions for phototherapy was similar between the groups.

Conclusion:  UCM intervention was not associated with increased incidence of phototherapy or symptomatic polycythemia in term infants delivered by CS.

Source
http://dx.doi.org/10.1055/s-0040-1701617
February 2020

An Institutional Approach to the Management of Asymptomatic Chorioamnionitis-Exposed Infants Born ≥35 Weeks Gestation.

Pediatr Qual Saf 2019 Nov-Dec;4(6):e238. Epub 2019 Dec 5.

Department of Quantitative Sciences, Center for Clinical Effectiveness, Baylor Scott & White Health Care System, Dallas, Tex.

Our newborn practice routinely treated asymptomatic chorioamnionitis-exposed infants born at 35 weeks gestation or greater with empiric antibiotics. Starting April 1, 2017, we implemented an algorithm of not treating, unless there was an abnormal clinical and/or laboratory evaluation. The goal of this quality improvement initiative was to reduce the percentage of chorioamnionitis-exposed infants treated with antibiotics (primary outcome measure) to <50%.

Methods: We compared 123 chorioamnionitis-exposed infants born 1 year before implementation (pre-algorithm group, April 1, 2016, to March 31, 2017) with 111 born 1 year following implementation (post-algorithm group, April 1, 2017, to March 31, 2018). The primary outcome measure was analyzed monthly using a run chart.

Results: The maternal and neonatal characteristics were similar between both groups. Significantly fewer infants in the post-algorithm group received antibiotics compared with the pre-algorithm group (4.5% versus 96.8%; P < 0.01). There were no differences in median hospital length of stay or incidence of neonatal intensive care unit admissions between both groups. There were no positive blood cultures or readmissions within 7 days for early-onset sepsis in either group.

Conclusion: An institutional approach of monitoring chorioamnionitis-exposed infants with a clinical and laboratory evaluation decreased antibiotic utilization in the mother-baby unit by 95% without an increase in hospital length of stay, neonatal intensive care unit admissions, or readmissions for early-onset sepsis.

Source
http://dx.doi.org/10.1097/pq9.0000000000000238
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6946240
December 2019

Effect of delayed cord clamping on very preterm twins.

Early Hum Dev 2018 09 10;124:22-25. Epub 2018 Aug 10.

Department of Obstetrics and Gynecology, Baylor University Medical Center, Dallas, TX, USA.

Background: The very preterm infants of twin births may particularly benefit from delayed cord clamping (DCC), as the likelihood of an unfavorable outcome is greater compared with singletons. Unfortunately, there is a paucity of available information regarding the safety and efficacy of DCC in this group.

Objective: To report the clinical consequences of delayed cord clamping (DCC) in very preterm twins, born between 23 and 31 weeks gestation.

Study Design: In this pre- and post-intervention retrospective cohort study, we compared 30 very preterm infants born from 15 twin deliveries during the historic study period to 32 very preterm infants born from 16 twin deliveries during the DCC study period. During the historic study period (August 19, 2013 to January 31, 2015), the infants included were eligible to receive DCC, but their cords were immediately clamped. The DCC study period (February 1, 2015 to January 31, 2017) included infants who had DCC performed for 60 s after birth.

Results: The Apgar scores and other resuscitation variables were similar between both groups. After adjusting for gestational age and mode of delivery, significantly fewer infants in the DCC cohort needed red blood cell (RBC) transfusions in the first week of life compared to the historic cohort (15.6% vs. 43.3%; P = 0.03). Death and other major neonatal outcomes were similar between both groups.

Conclusion: DCC in very preterm twins was safe, feasible and not associated with any adverse neonatal outcomes compared to early cord clamping. DCC was associated with a significant reduction in early RBC transfusions.

Source
http://dx.doi.org/10.1016/j.earlhumdev.2018.08.002
September 2018

Limited Clinical Relevance of Vertebral Artery Injury in Blunt Trauma.

Ann Vasc Surg 2018 Nov 25;53:53-62. Epub 2018 Jul 25.

Texas Heart Hospital Baylor Plano, Plano, TX; Texas Vascular Associates, Dallas, TX. Electronic address:

Background: Blunt cerebrovascular injury (BCVI), although rare, is more common than previously thought and carries a substantial stroke and mortality risk. The purpose of our study was to evaluate the differences between blunt carotid artery (CA) and vertebral artery (VA) injuries, assess the stroke and death rates related to these injuries, and identify the relationship of Injury Severity Score (ISS) with stroke and mortality in BCVI.

Methods: Using a retrospective review of the trauma registry at a level I trauma center, we identified patients with BCVI. The study period began in January 2003 and ended in July 2014. Demographics, injuries reported, investigative studies performed, and outcomes data were obtained and analyzed. Radiographic images of both blunt CA and VA injuries were reviewed and graded by an independent radiologist, according to the current classification of blunt CA injuries.

Results: BCVI involving 114 vessels was identified in 103 patients. This population consisted of 65 males and 38 females with an average age of 45 years (range, 15-92). The average ISS was 22 (range, 4-75). Cervical spine fracture occurred in 80% of VA injuries (64 total patients). Injuries involved the CA in 33 patients, the VA in 59, and both in 11. The CA group had a higher incidence of traumatic brain injury (61% vs. 46%), a higher ISS (27 vs. 18), and a higher stroke rate (24% vs. 3%) compared to the VA group. Mortality in the CA group was 30% compared to 3% in the VA group. Patients with a high ISS (≥25) had increased stroke rates compared to those with a lower ISS (<25) (19% vs. 6.7%). All mortalities occurred with ISS >25. Logistic regression revealed that the vessel injured, ISS, and Glasgow Coma Scale (GCS) were significant risk factors for mortality. Multivariate analysis demonstrated that carotid injury and lowest GCS were independently associated with mortality.

Conclusions: In this comparison of CA and VA injuries in BCVI, VA injuries were more common and more frequently found with cervical spine fractures than CA injuries. However, VA injuries had a lower incidence of CVA and mortality. A high ISS was associated with stroke and mortality while carotid injury and lowest GCS were independently associated with increased mortality.

Source
http://dx.doi.org/10.1016/j.avsg.2018.05.034
November 2018

Critical Event Intervals in Determining Candidacy for Intravenous Thrombolysis in Acute Stroke.

J Clin Med Res 2018 Jul 4;10(7):582-587. Epub 2018 Jun 4.

Department of Neurology, Baylor University Medical Center, Dallas, TX, USA.

Background: The aim of the study was to determine the optimal set point for the critical event benchmarks described in stroke guidelines and validate the ability of these goals to predict successful administration of intravenous thrombolysis within 60 min of hospital arrival.

Methods: This was a retrospective cohort analysis of patients with acute ischemic stroke who received intravenous thrombolysis following presentation to the emergency department. The national benchmarks for time intervals associated with the completion of critical events required to determine candidacy for thrombolysis were evaluated for the ability to predict successful administration of thrombolysis within 60 min of hospital arrival. Optimal time interval cut points were then estimated using regression and receiver-operator characteristic curve analysis and compared to guidelines.
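To make the cut-point estimation concrete, here is a hedged sketch of one common way to derive an "optimal" threshold for a critical-event interval (for example, arrival-to-imaging time in minutes) against the binary outcome of thrombolysis within 60 minutes, using an ROC curve and the Youden index. The data and the resulting cut point below are made-up placeholders, not the study's values or its exact regression-based method.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

arrival_to_imaging = np.array([10, 12, 15, 18, 20, 22, 25, 30, 35, 40])  # minutes, hypothetical
tpa_within_60 = np.array([1, 1, 1, 1, 1, 0, 1, 0, 0, 0])                 # hypothetical outcome

# Shorter intervals should predict success, so use the negated interval as the score.
fpr, tpr, thresholds = roc_curve(tpa_within_60, -arrival_to_imaging)
youden_j = tpr - fpr
best = np.argmax(youden_j)

print("AUC:", roc_auc_score(tpa_within_60, -arrival_to_imaging))
print("Candidate cut point (minutes):", -thresholds[best])
```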

Results: Of the 523 patients included in the analysis, 229 (43.8%) received intravenous thrombolysis within 60 min of hospital arrival. Of the patients who met the critical event interval goals described in guidelines, only 51.6% received thrombolysis within 60 min. The optimized cut points suggested by the regression analysis aligned with the guideline benchmarks with the only substantial difference being a shortened goal of arrival to neuroimaging start time of 19 min. This difference did not impact the overall predictive value.

Conclusion: The critical event benchmarks proposed in this study by logistic regression closely correlate with the critical event benchmarks described in the AHA/ASA acute stroke guidelines.

Source
http://dx.doi.org/10.14740/jocmr3425w
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5997418
July 2018

Factors among patients receiving prone positioning for the acute respiratory distress syndrome found useful for predicting mortality in the intensive care unit.

Proc (Bayl Univ Med Cent) 2018 Jan 2;31(1):1-5. Epub 2018 Jan 2.

Quantitative Sciences, Center for Clinical Effectiveness, Baylor Scott and White Health, Dallas, Texas.

Optimal mechanical ventilation management in patients with the acute respiratory distress syndrome (ARDS) involves the use of low tidal volumes and limited plateau pressure. Refractory hypoxemia may not respond to this strategy, requiring other interventions. The use of prone positioning in severe ARDS resulted in improvement in 28-day survival. To determine whether mechanical ventilation strategies or other parameters affected survival in patients undergoing prone positioning, a retrospective analysis was conducted of a consecutive series of patients with severe ARDS treated with prone positioning. Demographic and clinical information involving mechanical ventilation strategies, as well as other variables associated with prone positioning, was collected. The rate of in-hospital mortality was obtained, and previously described parameters were compared between survivors and nonsurvivors. Forty-three patients with severe ARDS were treated with prone positioning, and 27 (63%) died in the intensive care unit. Only three parameters were significant predictors of survival: APACHE II score (P = 0.03), plateau pressure (P = 0.02), and driving pressure (P = 0.04). The ability of each of these parameters to predict mortality was assessed with receiver operating characteristic curves. The area under the curve values for APACHE II, plateau pressure, and driving pressure were 0.74, 0.69, and 0.67, respectively. In conclusion, in a group of patients with severe ARDS treated with prone positioning, only APACHE II, plateau pressure, and driving pressure were associated with mortality in the intensive care unit.

Source
http://dx.doi.org/10.1080/08998280.2017.1391560
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5903513
January 2018

Radiographic Results and Return to Activity After Sesamoidectomy for Fracture.

Foot Ankle Int 2017 Oct 11;38(10):1100-1106. Epub 2017 Aug 11.

4 Quantitative Sciences and Center for Clinical Effectiveness, Office of the Chief Quality Officer, Baylor Scott & White Health, Dallas, TX, USA.

Background: Limited data are available comparing the results of lateral sesamoidectomy and medial sesamoidectomy for the treatment of fractures recalcitrant to nonoperative treatment interventions. The hypothesis of this study was that sesamoidectomy for either lateral or medial sesamoid fractures would not change radiographic alignment of the first ray given the use of identical reconstruction of the plantar plate, intersesamoid ligament, and plantar ligament complex at the time of surgery.

Methods: This retrospective cohort study compared the outcomes of 46 consecutive patients treated with sesamoidectomy (24 lateral, 22 medial). Patient demographics, mechanisms of injury, and outcomes were recorded. Preoperative, postoperative, and changes in both hallux valgus angle (HVA) and intermetatarsal angle (IMA) were measured.

Results: No statistically significant difference could be detected for age (P = .577), sex (P = .134), return to activity (P = 1.000), likelihood to undergo the procedure again (P = 1.000), orthotic use postoperatively (P = 1.000), perioperative complications (P = .497), duration of symptoms (P = .711), or length of follow-up (P = .609). While statistically significant changes in preoperative and postoperative alignment were detected for both medial and lateral sesamoidectomy, these changes were not clinically significant. Patients undergoing medial sesamoidectomy had higher preoperative and postoperative HVA and IMA compared with those undergoing lateral sesamoidectomy. Medial sesamoidectomy patients had a net increase in both HVA and IMA, while patients undergoing lateral sesamoidectomy had a net decrease in both HVA and IMA.

Conclusion: Although statistically significant changes in both HVA and IMA were detected, these values were too small to be considered clinically significant. Patient outcomes did not differ between the 2 groups, and sesamoidectomy was used with low patient morbidity for both medial and lateral sesamoid fractures that failed to respond to nonoperative modalities. These data suggest that the underlying mechanics of the foot may be different in patients who sustain medial and lateral sesamoid stress injury, suggesting a possible etiologic difference between medial and lateral sesamoid injuries.

Level Of Evidence: Level III, retrospective cohort study.

Source
http://dx.doi.org/10.1177/1071100717717265
October 2017

Role of Total Ankle Arthroplasty in Stiff Ankles.

Foot Ankle Int 2017 Oct 26;38(10):1070-1077. Epub 2017 Jul 26.

5 Quantitative Sciences and Center for Clinical Effectiveness, Baylor Scott & White Health, Dallas, TX, USA.

Background: The decision tree for the operative treatment of end-stage ankle arthritis involves either ankle arthrodesis (AA) or total ankle arthroplasty (TAA). Although both have documented success providing diminished pain, improved patient-centered outcomes, and improved objective measures of function, arthroplasty is unique in its ability to preserve motion at the tibiotalar joint. Arthroplasty is normally thought of as a motion-sparing surgery rather than a motion-producing procedure, which may limit its success in patients with stiff ankles. Our hypothesis was that there would be improvements in parameters of gait even in patients with a low degree of preoperative total sagittal range of motion.

Methods: A retrospective review was conducted on patients who underwent total ankle arthroplasty with greater than 1-year follow-up. Seventy-six patients were available who underwent isolated TAA for end-stage ankle arthritis with greater than 1-year follow-up. Patient demographics and preoperative and postoperative gait analyses were evaluated. Using a linear regression model, the effect sizes for the variables of age, gender, BMI, preoperative diagnosis, and preoperative total sagittal range of motion were calculated. Multivariate analysis was used to determine the influence each individual variable had on the many parameters of preoperative gait, postoperative gait, and change in gait after surgery. A post hoc analysis was conducted in which patients were divided into 4 quartiles according to preoperative range of motion. A 1-way analysis of variance (ANOVA) was used to compare improvement in parameters of gait for the 4 subgroups.

Results: Although a greater degree of preoperative sagittal range of motion was predictive of greater postoperative sagittal range of motion, patients with limited preoperative range of motion experienced a greater overall improvement in range of motion and clinically meaningful absolute improvements in range of motion and other parameters of gait. The post hoc analysis demonstrated that patients in the lowest quartile of preoperative motion had both statistically and clinically significant greater improvements across numerous parameters of gait, although their absolute values were lower than those of patients with higher preoperative ROM. Age, gender, BMI, and preoperative diagnosis did not correlate with changes in parameters of gait after total ankle arthroplasty.

Conclusion: Preoperative range of motion was predictive of overall postoperative gait function. On one hand, a low preoperative range of motion resulted in a lower absolute postoperative function. On the other hand, patients with stiff ankles preoperatively had a statistically and clinically greater improvement in function as measured by multiple parameters of gait. This suggests that total ankle arthroplasty can offer clinically meaningful improvement in gait function and should be considered for patients with end-stage tibiotalar arthritis even in the setting of limited sagittal range of motion.

Level Of Evidence: Level IV, retrospective case series.

Source
http://dx.doi.org/10.1177/1071100717718130
October 2017

Learning Curve Associated With an Automated Laparoscopic Suturing Device Compared With Laparoscopic Suturing.

Surg Innov 2017 Apr 24;24(2):109-114. Epub 2017 Jan 24.

1 Baylor University Medical Center at Dallas, Dallas, TX, USA.

Background: Laparoscopic suturing has proved to be a challenging skill to master, which may prevent surgical procedures from being started, or completed, in a minimally invasive fashion. The aim of this study was to compare the learning curves of traditional laparoscopic techniques and a novel suturing device.

Methods: In this prospective single blinded nonrandomized controlled crossover study, we recruited 19 general surgery residents ranging from beginner (PGY1-2, n = 12) to advanced beginner (PGY3-5, n = 7). They were assigned to perform a knot tying and suturing task using either Endo360 or traditional laparoscopic technique (TLT) with needle holders before crossing over to the other method. The proficiency standards were developed by collecting the data for task completion time (TCT in seconds), dots on target (DoT in numbers), and total deviation (D in mm) on 5 expert attending surgeons (mean ± 2SD). The test subjects were "proficient" when they reached these standards 2 consecutive times.

Results: The number of attempts to complete the task was collected for Endo360 and TLT. A significant difference was observed between the mean number of attempts to reach proficiency for Endo360 versus TLT (P = .0027) in both groups combined, but this was not statistically significant in the advanced beginner group. TCT was examined for both methods and demonstrated significantly less time to complete the task for Endo360 versus TLT (P < .0001). There were significantly fewer DoT for Endo360 as compared with TLT (P < .0001), which was also associated with significantly less D (P < .0001), indicating lower accuracy with Endo360. However, no significant difference was observed between the groups for increasing number of trials for both DoT and D.

Conclusions: This novel suturing device showed a shorter learning curve with regard to number of attempts to complete a task for the beginner group in our study, but matched the learning curve in the advanced beginner group. With regard to time to complete the task, the device was faster in both groups.

Source
http://dx.doi.org/10.1177/1553350616687903
April 2017

The incidence of cardiomyopathy in BRCA1 and BRCA2 mutation carriers after anthracycline-based adjuvant chemotherapy.

Breast Cancer Res Treat 2017 02 9;162(1):59-67. Epub 2017 Jan 9.

Texas Oncology, Baylor Charles A. Sammons Cancer Center, Baylor University Medical Center, 3410 Worth Street, Suite 400, Dallas, TX, 75246, USA.

Purpose: Breast cancer remains the fourth-leading cause of death in the United States. Nearly 10% of breast cancers are hereditary, with deleterious mutations in BRCA1 and BRCA2 genes being the leading cause. Anthracycline chemotherapy, used commonly for breast cancer, carries cardiotoxicity risk. Recent studies demonstrated anthracycline-induced cardiac failure in homozygous BRCA2-deficient mice and increased rates of heart failure in homozygous BRCA1-deficient mice following ischemic insult. Therefore, we conducted a retrospective matched cohort study to determine the rates of anthracycline-induced cardiomyopathy in breast cancer patients with germline mutation in BRCA1 or BRCA2 genes compared to age-matched patients without a BRCA1 or BRCA2 gene mutation.

Methods: The primary endpoint was to determine the rate of cardiomyopathy, defined as either congestive heart failure or an asymptomatic decline in ejection fraction to <50%. A total of 102 breast cancer patients who were BRCA gene mutation carriers (55 BRCA1, 45 BRCA2, and two with both) and who received anthracycline-based chemotherapy were compared to a matched cohort of breast cancer patients with wild-type BRCA gene status.

Results: We found a 4.9% rate of cardiomyopathy in the BRCA mutation carriers and 5.2% in the matched controls (p = 0.99). Cox proportional hazards model showed that only trastuzumab and hypertension were significantly associated with the development of cardiomyopathy in both groups (p < 0.05).
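For readers unfamiliar with the model behind these hazard ratios, the following is an illustrative Cox proportional hazards fit, shown on lifelines' bundled Rossi dataset because the study's data are not public; in the study, the duration and event columns would correspond to follow-up time and cardiomyopathy (0/1), with BRCA status, trastuzumab use, and hypertension as covariates.

```python
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

df = load_rossi()          # example data: 'week' is the duration, 'arrest' the event indicator
cph = CoxPHFitter()
cph.fit(df, duration_col="week", event_col="arrest")
cph.print_summary()        # hazard ratios, 95% CIs, and p-values for each covariate
```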

Conclusions: Given the limitations of a retrospective study, we saw no increased risk of cardiotoxicity among breast cancer patients with BRCA1 and/or BRCA2 gene mutations treated with standard doses of anthracycline compared to the general population.

Source
http://dx.doi.org/10.1007/s10549-016-4101-8
February 2017

Central venous thrombosis in children with intestinal failure on long-term parenteral nutrition.

J Pediatr Surg 2016 May 12;51(5):790-3. Epub 2016 Feb 12.

Division of Pediatric Surgery, University of Texas Southwestern/Children's Health, Dallas, TX, USA. Electronic address:

Purpose: Central venous thrombosis (CVT) is a serious complication of long-term central venous access for parenteral nutrition (PN) in children with intestinal failure (IF). We reviewed the incidence of CVT and possible risk factors.

Methods: Children with IF on home PN (2010-2014) with central venous imaging were reviewed. Patient demographics, catheter characteristics and related complications, and markers of liver function were compared between children with and without CVT. Serum thrombophilia markers were reviewed for patients with CVT.

Results: Thirty children with central venous imaging were included. Seventeen patients had thrombosis of ≥1 central vein, and twelve had ≥2 thrombosed central veins. Patients with and without CVT had similar demographics and catheter characteristics. Patients with CVT had a significantly lower albumin level (2.76±0.38g/dL vs. 3.12±0.41g/dL, p=0.0223). The most common markers of thrombophilia in children with CVT were antithrombin, protein C and S deficiencies, and elevated factor VIII. There was a statistically significant correlation between a combined protein C and S deficiency and having >1 CVT.

Conclusions: Children with IF on long-term PN are at high risk for CVT, potentially owing to low levels of natural anticoagulant proteins and elevated factor VIII activity, likely a reflection of liver insufficiency and chronic inflammation.

Source
http://dx.doi.org/10.1016/j.jpedsurg.2016.02.024
May 2016

Endoscopic button gastrostomy: Comparing a sutured endoscopic approach to the current techniques.

J Pediatr Surg 2016 Jan 23;51(1):72-5. Epub 2015 Oct 23.

Division of Pediatric Surgery, University of Texas Southwestern/Children's Health, Children's Medical Center, Dallas, TX, USA. Electronic address:

Purpose: Button gastrostomy is the preferred feeding device in children and can be placed open or laparoscopically (LBG). Alternatively, a percutaneous endoscopic gastrostomy (PEG) can be placed initially and exchanged for a button. Endoscopic-assisted button gastrostomy (EBG) combines both techniques, using only one incision and suturing the stomach to the abdominal wall. The long-term outcomes and potential costs for EBG were compared to other techniques.

Methods: Children undergoing EBG, LBG, and PEG (2010-2013) were compared. Patient demographics, procedure duration/complications, and clinic and emergency room (ER) visits for an eight-week follow-up period were compared.

Results: Patient demographics were similar (32 patients/group). Mean procedure time (min) for EBG was 38 ± 9, compared to 58 ± 20 for LBG and 31 ± 10 for PEG (p<0.0001). The most common complications were granulation tissue and infection, with a trend toward fewer infections in the EBG group. The average number of ER visits was similar, but the PEG group had fewer clinic visits. Of the PEG patients, 97% had subsequent visits for exchange to a button gastrostomy.

Conclusions: EBG is safe and comparable to LBG and PEG in terms of complications. It has a shorter procedure time than LBG and does not require laparoscopy, device exchange, or subsequent fluoroscopic confirmation, potentially reducing costs.

Source
http://dx.doi.org/10.1016/j.jpedsurg.2015.10.014
January 2016

Sedation levels during propofol administration for outpatient colonoscopies.

Proc (Bayl Univ Med Cent) 2014 Jan;27(1):12-5

Baylor University Medical Center at Dallas (Ramsay, Jacobson, Richardson, Brown, Hein), Baylor Research Institute (Newman, Rogers), and the Department of Quantitative Sciences, Baylor Health Care System (De Vol, Daoud), Dallas, Texas.

The levels of sedation required for patients to comfortably undergo colonoscopy with propofol were examined. One hundred patients undergoing colonoscopy with propofol were enrolled. In addition to standard-of-care monitoring, sedation level was monitored with the Patient State Index (PSI) obtained from a brain function monitor, transcutaneous carbon dioxide (tcpCO2) was monitored with the TCM TOSCA monitor, and end-tidal carbon dioxide was monitored via nasal cannula. The Ramsay Sedation Score (RSS) was also assessed and recorded. After baseline data were obtained from the first 40 consecutive patients enrolled in the study, the remaining 60 patients were randomized into two groups: in one group the PSI value was blinded from the anesthesiologist, and in the second group the PSI was visible and the impact of this information on the management of sedation was analyzed. Overall, 96% of patients reached levels of deep sedation and 89% reached levels of general anesthesia. When the PSI-blinded and unblinded groups were compared, the blinded group had a significantly lower PSI and higher RSS and tcpCO2, indicating that the blinded group was maintained at a deeper sedation level with more respiratory compromise than the unblinded group. Patients undergoing colonoscopy under propofol sedation delivered by a bolus technique are frequently taken to levels of general anesthesia and are at risk for respiratory depression, airway obstruction, and hemodynamic compromise.

Source
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3862122
http://dx.doi.org/10.1080/08998280.2014.11929037
January 2014

Validation of clinical prediction scores in patients with primary intracerebral hemorrhage.

Neurocrit Care 2013 Dec;19(3):329-35

Department of Emergency Medicine, Baylor University Medical Center, 3500 Gaston Avenue, Dallas, TX, 75246, USA,

Background: Initial reports of the FUNC score suggest that it may accurately identify those patients suffering from intracerebral hemorrhage (ICH) with an ultra-low chance of functional neurologic recovery. This study's aim was to validate the FUNC score and determine whether it accurately identifies the cohort of patients with an ultra-low chance of survival with good neurologic recovery.

Methods: Retrospective review of 501 consecutive primary ICH patients admitted from the Emergency Department to a large healthcare system. Performance of the FUNC, ICH-GS, and oICH scores was determined by calculating areas under the receiver-operator-characteristic curves. Patients with a predicted 100% chance of poor neurologic outcome (PNO) (FUNC <4 and ICH-GS >10) were evaluated to determine whether DNR status impacted 90-day survival or the rate of survival with a Glasgow Outcome Score of <3.

Results: Of 366 cases of primary ICH that presented during the study period, 222 (61%) survived to discharge. Both the FUNC (AUC: 0.873) and ICH-GS (AUC: 0.888) outperformed the oICH (AUC: 0.743) in predicting 90-day mortality (p < 0.001). Of 68 patients with a FUNC score <4, 67 (98.5%) had PNO at discharge. The presence of DNR was not associated with a significant difference in the rate of PNO at discharge (40/40 = 100% vs. 27/28 = 96.4%, p = 0.42) or 90-day mortality (40/40 = 100% vs. 21/28 = 75%, p = 0.06).

Conclusion: The FUNC and ICH-GS appear superior to the oICH in predicting outcome in patients with primary ICH. In addition, the FUNC score appears to accurately identify patients with low chance of functional neurologic recovery at discharge.

Source
http://dx.doi.org/10.1007/s12028-013-9926-y
December 2013

Prospective, blinded exploratory evaluation of the PlayWisely program in children with autism spectrum disorder.

Yale J Biol Med 2013 Jun 13;86(2):157-67. Epub 2013 Jun 13.

Autism Treatment Center, Research Department, Dallas, Texas 75243, USA.

The purpose of the study was to explore a low-cost intervention that targets an increasingly common developmental disorder. The study was a blinded, exploratory evaluation of the PlayWisely program on autism symptoms and essential learning foundation skills (attention, recognition, and memory skills) in children with a diagnosis of autism, autism spectrum disorder (ASD), pervasive developmental disorder - not otherwise specified (PDD-NOS), and Asperger syndrome (AS). Eighteen children, 1 to 10 years of age, were evaluated using the Childhood Autism Rating Scale, Second Edition (CARS2); the PlayWisely Interactive Test of Attention, Recognition, and Memory Skills; the Autism Treatment Evaluation Checklist (ATEC); and the Modified Checklist for Autism in Toddlers (M-CHAT). There were significant treatment effects for the PlayWisely measure on the Yellow Sets that examine recognition; Purple Sets that examine brain region agility and early memory skills; Blue Sets that examine phonemic awareness and recognition; and for the Total Sets, with a similar trend toward improvement in the Green Sets that examine perception and Red Sets that examine attention. No other measures reached statistical significance. The results suggest that PlayWisely can improve recognition, brain region agility, phonemic awareness, letter recognition, and early memory skills in ASD. It was observed by the parents, coaches, and study investigators that the children who were less than 3 years of age showed improvements in autism symptoms; however, the group was too small to reach statistical significance. Future studies are needed to see if this intervention can mitigate autism symptoms in very young children with ASD.

Source
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3670436
June 2013

Comparative effectiveness research using electronic health records: impacts of oral antidiabetic drugs on the development of chronic kidney disease.

Pharmacoepidemiol Drug Saf 2013 Apr 24;22(4):413-22. Epub 2013 Feb 24.

Baylor Health Care System-Institute for Health Care Research and Improvement, Dallas, TX 75206, USA.

Purpose: Little is known about the comparative effects of common oral antidiabetic drugs ([OADs] metformin, sulfonylureas, or thiazolidinediones [THZs]) on chronic kidney disease (CKD) outcomes in patients newly diagnosed with type 2 diabetes (T2DM) and followed in community primary care practices. Electronic health records (EHRs) were used to evaluate the relationships between OAD class use and incident proteinuria and prevention of glomerular filtration rate decline.

Methods: A retrospective cohort study of newly diagnosed T2DM cases requiring OADs documented in the EHRs of two primary care networks between 1998 and 2009 was conducted. CKD outcomes were new-onset proteinuria and estimated GFR (eGFR) falling below 60 mL/min/1.73 m². OAD exposures defined the cohorts. Hazard ratios represent the differential CKD outcome risk per year of OAD class use.

Results: A total of 798 and 977 patients qualified for the proteinuria and eGFR outcome analyses, respectively. With metformin as the reference group, sulfonylurea exposure trended toward association with an increased risk of developing proteinuria ([adjusted hazard ratio; 95% CI] 1.27; 0.93, 1.74); proteinuria risk associated with THZ exposure (1.00; 0.70, 1.42) was similar to that with metformin. Compared with metformin, sulfonylurea exposure was associated with an increased risk of eGFR reduction to <60 mL/min/1.73 m² (1.41; 1.05, 1.91). THZ exposure (1.04; 0.71, 1.50) was not associated with a change in the risk of eGFR decline.

Conclusions: In a primary care population, metformin appeared to decrease the risk of CKD development compared with sulfonylureas; risks of CKD development between metformin and THZs were similar. EHR use in pharmacotherapy comparative effectiveness research creates specific challenges and study limitations.

Source
http://dx.doi.org/10.1002/pds.3413
April 2013

Metanx in type 2 diabetes with peripheral neuropathy: a randomized trial.

Am J Med 2013 Feb 5;126(2):141-9. Epub 2012 Dec 5.

Tulane University Health Sciences Center, New Orleans, LA 70112, USA.

Purpose: To determine whether a combination of L-methylfolate, methylcobalamin, and pyridoxal-5'-phosphate (LMF-MC-PLP [Metanx; Pamlab LLC, Covington, La]) improves sensory neuropathy.

Research Design And Methods: This multicenter, randomized, double-blind, placebo-controlled trial involved 214 patients with type 2 diabetes and neuropathy (baseline vibration perception threshold [VPT]: 25-45 volts), who were randomly assigned to 24 weeks of treatment with either L-methylfolate calcium 3 mg, methylcobalamin 2 mg, and pyridoxal-5'-phosphate 35 mg or placebo. The primary end point was the effect on VPT. Secondary end points included the Neuropathy Total Symptom Score (NTSS-6) and Short Form 36 (SF-36), as well as plasma levels of folate, vitamins B6 and B12, methylmalonic acid (MMA), and homocysteine.

Results: There was no significant effect on VPT. However, patients receiving LMF-MC-PLP consistently reported symptomatic relief, with clinically significant improvement in NTSS-6 scores at week 16 (P=.013 vs placebo) and week 24 (P=.033). Improvement in NTSS scores was related to baseline MMA and inversely related to baseline PLP and metformin use. Quality-of-life measures also improved. Homocysteine decreased by 2.7±3.0 μmol/L with LMF-MC-PLP versus an increase of 0.5±2.4 μmol/L with placebo (P=.0001). Adverse events were infrequent, with no single event occurring in ≥2% of subjects.

Conclusions: LMF-MC-PLP appears to be a safe and effective therapy for alleviation of peripheral neuropathy symptoms, at least in the short term. Additional long-term studies should be conducted, as the trial duration may have been too short to show an effect on VPT. In addition, further research on the effects in patients with cobalamin deficiency would be useful.

Source
http://dx.doi.org/10.1016/j.amjmed.2012.06.022
February 2013

Association between recurrent metastasis from stage II and III primary colorectal tumors and moderate microsatellite instability.

Gastroenterology 2012 Jul 27;143(1):48-50.e1. Epub 2012 Mar 27.

Gastrointestinal Cancer Research Laboratory, Baylor University Medical Center, Dallas, Texas 75246, USA.

Colorectal cancer cells frequently have low levels of microsatellite instability (MSI-L) and elevated microsatellite alterations at selected tetranucleotide repeats (EMAST), but little is known about the clinicopathologic significance of these features. We observed that patients with stage II or III colorectal cancer with MSI-L and/or EMAST had shorter times of recurrence-free survival than patients with high levels of MSI (P = .0084) or with highly stable microsatellites (P = .0415), based on Kaplan-Meier analysis. MSI-L and/or EMAST were independent predictors of recurrent distant metastasis from primary stage II or III colorectal tumors (Cox proportional hazard analysis: hazard ratio, 1.83; 95% confidence interval, 1.06-3.15; P = .0301).
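A minimal sketch, with entirely hypothetical numbers, of the type of analysis described here: Kaplan-Meier recurrence-free survival stratified by microsatellite status with a log-rank comparison, using the lifelines package. It is not the authors' code or data, and the study additionally used Cox regression for the adjusted hazard ratio.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical recurrence-free survival times (months) and recurrence indicators (1 = recurrence).
t_emast = np.array([6, 9, 12, 14, 20, 24, 30, 36, 40, 48])
e_emast = np.array([1, 1, 1, 1, 1, 0, 1, 0, 1, 0])
t_other = np.array([12, 18, 24, 30, 36, 42, 48, 54, 60, 60])
e_other = np.array([1, 0, 1, 0, 0, 1, 0, 0, 0, 0])

kmf = KaplanMeierFitter()
kmf.fit(t_emast, event_observed=e_emast, label="MSI-L and/or EMAST")
ax = kmf.plot_survival_function()
kmf.fit(t_other, event_observed=e_other, label="MSI-H or microsatellite stable")
kmf.plot_survival_function(ax=ax)

res = logrank_test(t_emast, t_other, event_observed_A=e_emast, event_observed_B=e_other)
print(f"Log-rank P = {res.p_value:.4f}")
```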

Source
http://dx.doi.org/10.1053/j.gastro.2012.03.034
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3383934
July 2012

Epigenetic regulation of Delta-Like1 controls Notch1 activation in gastric cancer.

Oncotarget 2011 Dec;2(12):1291-301

Department of Clinical Medicine, University of Bologna, Bologna, Italy.

The Notch signaling pathway drives proliferation, differentiation, apoptosis, cell fate, and maintenance of stem cells in several tissues. Aberrant activation of Notch signaling has been described in several tumours; in gastric cancer (GC), activated Notch1 has been associated with de-differentiation of lineage-committed stomach cells into stem progenitors and with GC progression. However, the specific role of the Notch1 ligand DLL1 in GC has not yet been elucidated. To assess the role of DLL1 in GC, the expression of Notch1 and its ligands DLL1 and Jagged1 was analyzed in 8 gastric cancer cell lines (KATOIII, SNU601, SNU719, AGS, SNU16, MKN1, MKN45, TMK1). DLL1 expression was absent in KATOIII, SNU601, SNU719 and AGS. The lack of DLL1 expression in these cells was associated with promoter hypermethylation, and 5-aza-2'dC caused up-regulation of DLL1. The increase in DLL1 expression was associated with activation of Notch1 signalling, with an increase in cleaved Notch1 intracellular domain (NICD) and Hes1 and a down-regulation of Hath1. Concordantly, Notch1 signalling was activated with the overexpression of DLL1. Moreover, Notch1 signalling together with DLL1 methylation was evaluated in samples from 52 GC patients and 21 healthy controls, as well as in INS-GAS mice infected with H. pylori and randomly treated with eradication therapy. In GC patients, we found a correlation between DLL1 and Hes1 expression, while DLL1 methylation and Hath1 expression were associated with the diffuse and mixed types of gastric cancer. Finally, none of the samples from INS-GAS mice infected with H. pylori, a model of intestinal-type gastric tumorigenesis, showed promoter methylation of DLL1. This study shows that Notch1 activity in gastric cancer is controlled by the epigenetic silencing of the ligand DLL1, and that Notch1 inhibition is associated with the diffuse type of gastric cancer.

Source
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3282085
http://dx.doi.org/10.18632/oncotarget.414
December 2011

Foot strike and injury rates in endurance runners: a retrospective study.

Med Sci Sports Exerc 2012 Jul;44(7):1325-34

Department of Human Evolutionary Biology, Harvard University, Cambridge, MA 02138, USA.

Purpose: This retrospective study tests if runners who habitually forefoot strike have different rates of injury than runners who habitually rearfoot strike.

Methods: We measured the strike characteristics of middle- and long-distance runners from a collegiate cross-country team and quantified their history of injury, including the incidence and rate of specific injuries, the severity of each injury, and the rate of mild, moderate, and severe injuries per mile run.

Results: Of the 52 runners studied, 36 (69%) primarily used a rearfoot strike and 16 (31%) primarily used a forefoot strike. Approximately 74% of runners experienced a moderate or severe injury each year, but those who habitually rearfoot strike had approximately twice the rate of repetitive stress injuries as individuals who habitually forefoot strike. Traumatic injury rates were not significantly different between the two groups. A generalized linear model showed that strike type, sex, race distance, and average miles per week each correlated significantly (P < 0.01) with repetitive injury rates.

Conclusions: Competitive cross-country runners on a college team incur high injury rates, but runners who habitually rearfoot strike have significantly higher rates of repetitive stress injury than those who mostly forefoot strike. This study does not test the causal bases for this general difference. One hypothesis, which requires further research, is that the absence of a marked impact peak in the ground reaction force during a forefoot strike compared with a rearfoot strike may contribute to lower rates of injuries in habitual forefoot strikers.

Source
http://dx.doi.org/10.1249/MSS.0b013e3182465115
July 2012

Electronic health record use to classify patients with newly diagnosed versus preexisting type 2 diabetes: infrastructure for comparative effectiveness research and population health management.

Popul Health Manag 2012 Feb 30;15(1):3-11. Epub 2011 Aug 30.

Baylor Health Care System, Dallas, Texas 75206, USA.

Use of electronic health record (EHR) content for comparative effectiveness research (CER) and population health management requires significant data configuration. A retrospective cohort study was conducted using patients with diabetes followed longitudinally (N=36,353) in the EHR deployed at the outpatient practice networks of 2 health care systems. A data extraction and classification algorithm targeting identification of patients with a new diagnosis of type 2 diabetes mellitus (T2DM) was applied, with the main criterion being a minimum 30-day window between the first visit documented in the EHR and the entry of T2DM on the EHR problem list. Chart reviews (N=144) validated the performance of refining this EHR classification algorithm with external administrative data. Extraction using EHR data alone designated 3205 patients as newly diagnosed with T2DM, with a classification accuracy of 70.1%. Use of external administrative data on that preselected population improved the classification accuracy of cases identified as new T2DM diagnoses (positive predictive value was 91.9% with that step). Laboratory and medication data did not help case classification. The final cohort using this 2-stage classification process comprised 1972 patients with a new diagnosis of T2DM. Data use from current EHR systems for CER and disease management mandates substantial tailoring. The quality of EHR clinical data generated in daily care differs from that required for population health research. As evidenced by this process for classification of newly diagnosed T2DM cases, validation of EHR data with external sources can be a valuable step.
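To illustrate the 30-day classification rule described in this abstract, here is a hedged sketch in pandas: a patient is flagged as "newly diagnosed" only when the T2DM problem-list entry appears at least 30 days after the first documented EHR visit. The table layout, column names, and patients are hypothetical, not the study's data model.

```python
import pandas as pd

# Hypothetical miniature example, not the study data.
visits = pd.DataFrame({
    "patient_id": [1, 1, 2, 3, 3],
    "visit_date": pd.to_datetime(["2005-01-10", "2005-06-01", "2006-03-15",
                                  "2004-11-02", "2005-02-20"]),
})
problems = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "code": ["T2DM", "T2DM", "T2DM"],
    "entry_date": pd.to_datetime(["2005-04-01", "2006-03-20", "2004-11-10"]),
})

first_visit = visits.groupby("patient_id")["visit_date"].min()
t2dm_entry = (problems[problems["code"] == "T2DM"]
              .groupby("patient_id")["entry_date"].min())

cohort = pd.concat([first_visit, t2dm_entry], axis=1).dropna()
# "Newly diagnosed" only if the problem-list entry comes >= 30 days after the first visit.
cohort["new_t2dm"] = (cohort["entry_date"] - cohort["visit_date"]).dt.days >= 30
print(cohort)
```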

Source
http://dx.doi.org/10.1089/pop.2010.0084
February 2012

Association between biologic therapies for chronic plaque psoriasis and cardiovascular events: a meta-analysis of randomized controlled trials.

JAMA 2011 Aug;306(8):864-71

Department of Dermatology, Baylor Research Institute, 3900 Junius St, Ste 125, Dallas, TX 75246, USA.

Context: Ustekinumab and briakinumab, monoclonal antibodies to the shared p40 subunit of interleukin (IL)-12 and IL-23, have shown efficacy in treating chronic plaque psoriasis (CPP). Preliminary reports of major adverse cardiovascular events (MACEs) in psoriasis patients receiving anti-IL-12/23 agents have prompted concern.

Objective: To evaluate a possible association between biologic therapies for CPP and MACEs via meta-analysis.

Data Sources: Randomized controlled trials (RCTs) of anti-IL-12/23 (ustekinumab and briakinumab) agents and anti-tumor necrosis factor α (TNF-α) agents (adalimumab, etanercept, and infliximab) used in treating CPP were reviewed using the Cochrane Central Register of Controlled Trials, ClinicalTrials.gov, and Ovid MEDLINE from database inception to May 2011. The results of registered nonpublished completed studies were procured through abstract publications or poster presentations.

Study Selection: Randomized, placebo-controlled, double-blind, monotherapy studies (with safety outcome data for MACE) of IL-12/23 antibodies and anti-TNF-α agents in adults. Studies of psoriatic arthritis were excluded.

Data Extraction: Two investigators independently searched data while 6 investigators reviewed the abstracted data.

Results: A total of 22 RCTs comprising 10,183 patients met the predefined inclusion criteria. The primary outcome measure was MACE, a composite end point of myocardial infarction, cerebrovascular accident, or cardiovascular death during the placebo-controlled phase of treatment in patients receiving at least 1 dose of study agent or placebo. Absolute risk differences were used as an effect measure. There was no evidence of statistical heterogeneity across the studies using the I² statistic (I² = 0), allowing for combination of trial results using the Mantel-Haenszel fixed-effects method. During the placebo-controlled phases of the anti-IL-12/23 studies, 10 of 3179 patients receiving anti-IL-12/23 therapies experienced MACEs compared with zero events in 1474 patients receiving placebo (Mantel-Haenszel risk difference, 0.012 events/person-year; 95% confidence interval [CI], -0.001 to 0.026; P = .12). In the anti-TNF-α trials, only 1 of 3858 patients receiving anti-TNF-α agents experienced a MACE compared with 1 of 1812 patients receiving placebo (Mantel-Haenszel risk difference, -0.0005 events/person-year; 95% CI, -0.010 to 0.009; P = .94).
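
A minimal sketch, with made-up trial counts rather than the published data, of pooling per-trial absolute risk differences with Mantel-Haenszel weights under a fixed-effects model, as named above.

```python
import numpy as np

# Each row: events_treatment, n_treatment, events_placebo, n_placebo
# (hypothetical placeholder trials).
trials = np.array([
    [1, 300, 0, 150],
    [0, 250, 0, 120],
    [2, 400, 0, 200],
], dtype=float)

e1, n1, e0, n0 = trials.T
rd = e1 / n1 - e0 / n0              # per-trial risk difference
w = n1 * n0 / (n1 + n0)             # Mantel-Haenszel weights for risk differences
rd_mh = (w * rd).sum() / w.sum()    # pooled fixed-effects risk difference
print(f"Pooled Mantel-Haenszel risk difference: {rd_mh:.4f}")
```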

Conclusions: Compared with placebo, there was no significant difference in the rate of MACEs observed in patients receiving anti-IL-12/IL-23 antibodies or anti-TNF-α treatments. This study may have been underpowered to identify a significant difference.
View Article and Find Full Text PDF

Download full-text PDF

Source
http://dx.doi.org/10.1001/jama.2011.1211DOI Listing
August 2011

Comparative effectiveness of a prenatal medical food to prenatal vitamins on hemoglobin levels and adverse outcomes: a retrospective analysis.

Clin Ther 2011 Feb 25;33(2):204-10. Epub 2011 Mar 25.

Women's Clinic Shoals, Sheffield, Alabama 35660, USA.

Background: The role of folate in pregnancy is well established, with most prenatal vitamins (PNVs) on the market containing at least 800 μg of folic acid. Folic acid must be converted in the body to L-methylfolate, the natural and biologically active form of folate. The role of vitamin B12 in pregnancy is less characterized, and most PNV formulations contain only 0 to 12 μg. The present study was undertaken to evaluate whether taking a prenatal medical food containing L-methylfolate and much higher doses of vitamin B12 results in higher hemoglobin levels and thus a lower incidence of anemia during pregnancy.

Objective: The objective of this exploratory study was to evaluate the effects of the prenatal medical food versus standard PNVs on hemoglobin levels and adverse outcomes throughout pregnancy.

Methods: For this retrospective analysis, we reviewed the charts of female patients taking either a prenatal medical food or standard PNV during pregnancy. Hemoglobin levels measured at initiation of prenatal care, end of second trimester, and delivery were recorded. Patients who had received iron supplementation beyond that contained in their prenatal medical food or PNV before anemia screening at the end of the second trimester were excluded from the study. Fisher exact test, χ² test, Student t test, and ANOVA were used to evaluate differences between the treatment groups.
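
A minimal sketch, using fabricated example values rather than the study data, of the kinds of between-group comparisons named above: a t test on hemoglobin levels and a Fisher exact test on anemia counts.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical end-of-second-trimester hemoglobin values (g/dL) per group.
hgb_medical_food = rng.normal(11.8, 1.1, 58)
hgb_pnv = rng.normal(11.3, 1.2, 54)
t_stat, p_ttest = stats.ttest_ind(hgb_medical_food, hgb_pnv)

# Hypothetical 2x2 table of anemia counts: rows = group, cols = anemic / not anemic.
anemia_table = [[23, 35],
                [40, 14]]
odds_ratio, p_fisher = stats.fisher_exact(anemia_table)

print(f"t test P = {p_ttest:.3f}, Fisher exact P = {p_fisher:.3f}")
```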

Results: Data were analyzed from 112 charts: 58 patients (51.8%) were taking the prenatal medical food; 54 patients (48.2%) were taking standard PNVs. Mean (SD) age at first prenatal visit was 27 (4.6) years in the medical food group and 28.8 (3.5) years in the PNV group (P = 0.024). Mean (SD) body mass indices were 29.1 (6.5) and 31.7 (8.9) in the medical food and PNV groups, respectively (P = NS). In the medical food group, 35 women (60.3%) were white/Caucasian, 17 (29.3%) were African American, and 6 (10.4%) were of other races. In the PNV group, 24 women (44.4%) were white/Caucasian, 25 (46.3%) were African American, and 5 (9.3%) were of other races. However, race was not significantly different between the two groups. At end of second trimester and at delivery, mean (SD) hemoglobin levels were higher in the prenatal medical food group (11.8 [1.1] g/dL and 11.8 [1.3] g/dL, respectively) than in the PNV group (11.3 [1.2] g/dL and 10.7 [1.2] g/dL, respectively) (P = 0.011 and P = 0.001, respectively). Significantly fewer cases of anemia were reported at end of second trimester in the prenatal medical food group than in the PNV group (39.7% vs 74.1%; P = 0.001).

Conclusions: In the present study, supplementation with a prenatal medical food containing L-methylfolate and high-dose vitamin B(12) may maintain hemoglobin levels and decrease rates of anemia in pregnancy more effectively than standard prenatal vitamins; however, prospective, controlled studies are warranted. ClinicalTrials.gov identifier: NCT01193192.
View Article and Find Full Text PDF

Download full-text PDF

Source
http://dx.doi.org/10.1016/j.clinthera.2011.02.010DOI Listing
February 2011

Chemoprevention of intestinal polyps in ApcMin/+ mice fed with western or balanced diets by drinking annurca apple polyphenol extract.

Cancer Prev Res (Phila) 2011 Jun 7;4(6):907-15. Epub 2011 Mar 7.

Department of Internal Medicine, Baylor Research Institute, Sammons Cancer Center, USA.

The Western diet (WD) is associated with a higher incidence of colorectal cancer (CRC) than the Mediterranean diet. Polyphenols extracted from Annurca apple showed chemopreventive properties in CRC cells. A multifactorial, four-arm study using wild-type (wt) and ApcMin/+ mice was carried out to evaluate the effect on polyp number and growth of Annurca apple polyphenol extract (APE) treatment (60 μmol/L) given ad libitum in drinking water combined with a WD or a balanced diet (BD) for 12 weeks. Compared with APE treatment, we found a significant drop in body weight (P < 0.0001), severe rectal bleeding (P = 0.0076), presence of extraintestinal tumors, and poorer activity status (P = 0.0034) in water-drinking ApcMin/+ mice, more remarkably in the WD arm. In the BD and WD groups, APE reduced polyp number (35% and 42%, respectively, P < 0.001) and growth (60% and 52%, respectively, P < 0.0001) in both colon and small intestine. Increased antioxidant activity was found in wt animals fed both diets and in ApcMin/+ mice fed WD and drinking APE. Reduced lipid peroxidation was found in ApcMin/+ mice drinking APE fed both diets and in wt mice fed WD. In normal mucosa, mice drinking water had lower global levels of DNA methylation than mice drinking APE. APE treatment is highly effective in reducing polyps in ApcMin/+ mice and supports the concept that a mixture of phytochemicals, as naturally present in foods, represents a plausible chemopreventive agent for CRC, particularly in populations at high risk for colorectal neoplasia.
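
A minimal sketch, under the assumption of hypothetical per-mouse polyp counts, of how the 2 x 2 factorial structure above (diet crossed with drink) could be analyzed with a two-way ANOVA; this is an illustration, not the authors' statistical plan.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical columns: diet in {"WD", "BD"}, drink in {"APE", "water"},
# polyp_count per mouse.
polyps = pd.read_csv("polyp_counts.csv")

model = smf.ols("polyp_count ~ C(diet) * C(drink)", data=polyps).fit()
print(sm.stats.anova_lm(model, typ=2))  # main effects and diet x drink interaction
```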
View Article and Find Full Text PDF

Download full-text PDF

Source
http://dx.doi.org/10.1158/1940-6207.CAPR-10-0359DOI Listing
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3793841PMC
June 2011

L-methylfolate Plus SSRI or SNRI from Treatment Initiation Compared to SSRI or SNRI Monotherapy in a Major Depressive Episode.

Innov Clin Neurosci 2011 Jan;8(1):19-28

Dr. Ginsberg is from Red Oak Psychiatry Associates, PA, Houston, Texas.

Objective: Evaluate the efficacy of L-methylfolate in combination with SSRI or SNRI compared to SSRI or SNRI monotherapy in a major depressive episode.

Design: A retrospective analysis of patient charts comparing L-methylfolate plus SSRI/SNRI initiated at the start of treatment (n=95) with SSRI/SNRI monotherapy (n=147).

Setting: Outpatient, private psychiatric clinic/practice.

Participants: Adults 18 to 70 with major depressive episode (single or recurrent).

Measurements: Clinical Global Impressions-Severity (CGI-S) and safety/tolerability measures.

Results: Major improvement (CGI-S reduced by ≥2 points) was experienced by 18.5 percent of L-methylfolate plus SSRI/SNRI patients (CGI-S=4-5) compared to 7.04 percent of SSRI/SNRI monotherapy patients (p=0.01) at 60 days. Forty percent of L-methylfolate plus SSRI/SNRI patients with greater functional impairment (CGI-S=5) experienced major improvement compared to 16.3 percent of SSRI/SNRI monotherapy patients (p=0.02). Median times to major improvement were 177 days for L-methylfolate plus SSRI/SNRI patients and 231 days for SSRI/SNRI monotherapy patients (p=0.03). Median time to major improvement for L-methylfolate plus SSRI/SNRI patients with greater functional impairment (CGI-S=5) was 85 days, compared to 150 days for SSRI/SNRI monotherapy patients (p=0.018). There were no significant differences between groups in adverse events. Discontinuation due to adverse events was 17.9 percent in L-methylfolate plus SSRI/SNRI patients compared to 34 percent in SSRI/SNRI monotherapy patients over the duration of the study (p=0.0078).
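
A minimal sketch of the outcome definition used above, a CGI-S score reduced by at least 2 points from baseline, computed from a hypothetical chart extract; the file and column names are assumptions.

```python
import pandas as pd

# Hypothetical columns: group ("lmf_plus_ssri_snri" or "monotherapy"),
# cgi_baseline, cgi_day60.
charts = pd.read_csv("cgi_scores.csv")

charts["major_improvement"] = (charts["cgi_baseline"] - charts["cgi_day60"]) >= 2
print(charts.groupby("group")["major_improvement"].mean())  # share improved per group
```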

Conclusion: L-methylfolate plus antidepressant at treatment onset was more effective than antidepressant monotherapy in improving depressive symptoms and function, as measured by CGI-S scores, within 60 days; it also led to major symptomatic improvement more rapidly than SSRI/SNRI monotherapy and was better tolerated.
View Article and Find Full Text PDF

Download full-text PDF

Source
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3036555PMC
January 2011