Publications by authors named "Maria E Mayorga"

22 Publications


High-Quality Masks Reduce COVID-19 Infections and Deaths in the US.

medRxiv 2021 Jan 28. Epub 2021 Jan 28.

Objectives: To evaluate the effectiveness of widespread adoption of masks or face coverings to reduce community transmission of the SARS-CoV-2 virus that causes COVID-19.

Methods: We created an agent-based stochastic network simulation using a variant of the standard SEIR dynamic infectious disease model. We considered a mask order that was initiated 3.5 months after the first confirmed COVID-19 case. We varied the likelihood of individuals wearing masks from 0-100% in steps of 20% (mask adherence) and considered 25% to 90% mask-related reduction in viral transmission (mask efficacy). Sensitivity analyses assessed early (by week 13) versus late (by week 42) adoption of masks and geographic differences in adherence (highest in urban and lowest in rural areas).
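The mask effect in such a model can be illustrated with a compartmental sketch. This is a deterministic SEIR approximation, not the authors' agent-based stochastic network code, and all parameter values below are illustrative assumptions:

```python
# Minimal deterministic SEIR sketch of mask-modified transmission.
# The paper uses an agent-based stochastic network model; this is only
# an illustrative approximation with assumed parameter values.

def simulate_seir(days, beta, sigma, gamma, adherence, efficacy,
                  n=1_000_000, seed_infected=100):
    """Daily-step SEIR; masks scale transmission by (1 - adherence * efficacy)."""
    s, e, i, r = n - seed_infected, 0.0, float(seed_infected), 0.0
    beta_eff = beta * (1.0 - adherence * efficacy)
    for _ in range(days):
        new_e = beta_eff * s * i / n   # new exposures
        new_i = sigma * e              # exposed become infectious
        new_r = gamma * i              # infectious recover
        s, e, i, r = s - new_e, e + new_e - new_i, i + new_i - new_r, r + new_r
    return r / n  # cumulative infection attack rate (recovered fraction)

iar_no_mask = simulate_seir(365, beta=0.30, sigma=0.2, gamma=0.1,
                            adherence=0.0, efficacy=0.5)
iar_mask = simulate_seir(365, beta=0.30, sigma=0.2, gamma=0.1,
                         adherence=0.5, efficacy=0.5)
assert iar_mask < iar_no_mask  # masks reduce the attack rate
```

Sweeping `adherence` over 0 to 1 in steps of 0.2 and `efficacy` over 0.25 to 0.90 reproduces the kind of grid the authors explore, though the agent-based network structure matters for the exact magnitudes.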

Results: Introduction of mask use with 50% efficacy worn by 50% of individuals reduces the cumulative infection attack rate (IAR) by 27%, the peak prevalence by 49%, and population-wide mortality by 29%. If 90% of individuals wear 50% efficacious masks, IAR decreases by 54%, peak prevalence by 75%, and population-wide mortality by 55%; similar improvements hold if 70% of individuals wear 75% efficacious masks. Even late adoption reduces IAR and deaths by 18% or more compared to no adoption. Lower adoption in rural areas than in urban areas would leave rural areas with the highest IAR.

Conclusions: Even after community transmission of SARS-CoV-2 has been established, adoption of mask-wearing by a majority of community-dwelling individuals can meaningfully reduce the number of COVID-19 infections and associated deaths, over and above physical distancing interventions.

Highlights: This paper shows the impact of widespread adoption of masks in response to the COVID-19 pandemic, with varying levels of population adherence, mask efficacy, and timing of mask adoption. The paper's findings help inform messaging to policymakers at the state or local level considering adding or keeping mask mandates, and to communities to promote widespread adoption of high-quality masks. Adoption of masks by at least half of the population can reduce cumulative infections and population deaths by more than 25%, while decreasing peak prevalence by about 50%. Even greater marginal improvements arise with adoption rates above 70%. The benefit of adopting high-quality masks is above that achieved by mobility changes and distancing alone. Rural and suburban areas are at higher relative risk than urban areas, due to less distancing and lower adoption of masks.
DOI: http://dx.doi.org/10.1101/2020.09.27.20199737
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7852241
January 2021

The Joint Impact of COVID-19 Vaccination and Non-Pharmaceutical Interventions on Infections, Hospitalizations, and Mortality: An Agent-Based Simulation.

medRxiv 2021 Jan 10. Epub 2021 Jan 10.

Background: Vaccination against SARS-CoV-2 has the potential to significantly reduce transmission and morbidity and mortality due to COVID-19. This modeling study simulated the comparative and joint impact of COVID-19 vaccine efficacy and coverage with and without non-pharmaceutical interventions (NPIs) on total infections, hospitalizations, and deaths.

Methods: An agent-based simulation model was employed to estimate incident SARS-CoV-2 infections and COVID-19-associated hospitalizations and deaths over 18 months for the State of North Carolina, a population of roughly 10.5 million. Vaccine efficacy of 50% and 90% and vaccine coverage of 25%, 50%, and 75% (at the end of a 6-month distribution period) were evaluated. Six vaccination scenarios were simulated with NPIs (i.e., reduced mobility, school closings, face mask usage) maintained and removed during the period of vaccine distribution.

Results: In the worst-case vaccination scenario (50% efficacy and 25% coverage), 2,231,134 new SARS-CoV-2 infections occurred with NPIs removed and 799,949 infections with NPIs maintained. In contrast, in the best-case scenario (90% efficacy and 75% coverage), there were 450,575 new infections with NPIs maintained and 527,409 with NPIs removed. When NPIs were removed, lower efficacy (50%) and higher coverage (75%) reduced infection risk by a greater magnitude than higher efficacy (90%) and lower coverage (25%) compared to the worst-case scenario (absolute risk reduction 13% and 8%, respectively).
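The coverage-versus-efficacy trade-off has simple first-order intuition: the fraction of the population protected is roughly coverage × efficacy. A back-of-envelope sketch (illustrative arithmetic only, not the agent-based model's output):

```python
# First-order population protection = coverage * efficacy.
# This back-of-envelope figure is not the simulation's output, but it
# shows why high coverage can beat high efficacy, as in the Results.

def population_protection(coverage, efficacy):
    """Approximate fraction of the population protected by vaccination."""
    return coverage * efficacy

high_coverage = population_protection(coverage=0.75, efficacy=0.50)  # 0.375
high_efficacy = population_protection(coverage=0.25, efficacy=0.90)  # 0.225
assert high_coverage > high_efficacy
```

This matches the direction of the simulated finding (absolute risk reductions of 13% vs. 8%), although indirect effects on transmission make the simulated gap model-dependent.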

Conclusion: Simulation results suggest that premature lifting of NPIs while vaccines are distributed may result in substantial increases in infections, hospitalizations, and deaths. Furthermore, as NPIs are removed, higher vaccination coverage with less efficacious vaccines can contribute to a larger reduction in risk of SARS-CoV-2 infection compared to more efficacious vaccines at lower coverage. Our findings highlight the need for well-resourced and coordinated efforts to achieve high vaccine coverage and continued adherence to NPIs before many pre-pandemic activities can be resumed.
DOI: http://dx.doi.org/10.1101/2020.12.30.20248888
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7805476
January 2021

Workflow Differences Affect Data Accuracy in Oncologic EHRs: A First Step Toward Detangling the Diagnosis Data Babel.

JCO Clin Cancer Inform 2020 06;4:529-538

Wake Forest School of Medicine, Winston Salem, NC.

Purpose: Diagnosis (DX) information is key to clinical data reuse, yet accessible structured DX data often lack accuracy. Previous research hints at workflow differences in cancer DX entry, but their link to clinical data quality is unclear. We hypothesized that there is a statistically significant relationship between workflow-describing variables and DX data quality.

Methods: We extracted DX data from encounter and order tables within our electronic health records (EHRs) for a cohort of patients with confirmed brain neoplasms. We built and optimized logistic regressions to predict the odds of fully accurate (ie, correct neoplasm type and anatomic site), inaccurate, and suboptimal (ie, vague) DX entry across clinical workflows. We selected our variables based on correlation strength of each outcome variable.

Results: Both workflow and personnel variables were predictive of DX data quality. For example, a DX entered in departments other than oncology had up to 2.89 times higher odds of being accurate (P < .0001) compared with an oncology department; an outpatient care location had up to 98% lower odds of being inaccurate (P < .0001), but 458 times higher odds of being suboptimal (P < .0001), compared with the main campus, including the cancer center; and a DX recorded by a physician assistant had 85% lower odds of being suboptimal (P = .005) compared with those entered by physicians.
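The "X% lower odds" phrasing maps directly to odds ratios from the logistic models: an odds ratio OR < 1 corresponds to a (1 − OR) × 100% reduction in odds. A small sketch of that conversion (an illustrative helper, not the study's code):

```python
# Convert logistic-regression outputs into the phrasing used in the
# Results. Illustrative helper functions, not study code.
import math

def pct_change_in_odds(odds_ratio):
    """Positive = higher odds; negative = lower odds (as a percentage)."""
    return (odds_ratio - 1.0) * 100.0

def odds_ratio_from_coef(beta):
    """A logistic coefficient beta corresponds to OR = exp(beta)."""
    return math.exp(beta)

assert round(pct_change_in_odds(0.15)) == -85  # OR 0.15 -> "85% lower odds"
assert round(pct_change_in_odds(2.89)) == 189  # OR 2.89 -> 189% higher odds
```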

Conclusion: These results suggest that differences across clinical workflows and the clinical personnel producing EHR data affect clinical data quality. They also suggest that the need for specific structured DX data recording varies across clinical workflows and may be dependent on clinical information needs. Clinicians and researchers reusing oncologic data should consider such heterogeneity when conducting secondary analyses of EHR data.
DOI: http://dx.doi.org/10.1200/CCI.19.00114
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7331128
June 2020

Notice to comply: A systematic review of clinician compliance with guidelines surrounding acute hospital-based infection management.

Am J Infect Control 2020 08 17;48(8):940-947. Epub 2020 Mar 17.

Department of Medicine, Mayo Clinic, Rochester, MN; Department of Biostatistics, Harvard T.H. Chan School of Public Health, Boston, MA.

Purpose: To identify and characterize studies evaluating clinician compliance with infection-related guidelines, and to explore trends in guideline design and implementation strategies.

Data Sources: PubMed database, April 2017. Followed the PRISMA Statement for systematic reviews.

Study Selection: Scope was limited to studies reporting compliance with guidelines pertaining to the prevention, detection, and/or treatment of acute hospital-based infections. Initial search (1,499 titles) was reduced to 49 selected articles.

Data Extraction: Extracted publication and guideline characteristics, outcome measures reported, and any results related to clinician compliance. Primary summary measures were frequencies and distributions of characteristics. Interventions that led to improved compliance results were analyzed to identify trends in guideline design and implementation.

Results Of Data Synthesis: Of the 49 selected studies, 18 (37%), 13 (27%), and 10 (20%) focused on sepsis, pneumonia, and general infection, respectively. Six (12%), 17 (35%), and 26 (53%) studies assessed local, national, and international guidelines, respectively. Twenty studies (41%) reported 1-instance compliance results, 28 studies (57%) reported 2-instance compliance results (either before-and-after studies or control group studies), and 1 study (2%) described compliance qualitatively. Average absolute change in compliance for minimal, decision support, and multimodal interventions was 10%, 14%, and 25%, respectively. Twelve studies (24%) reported no patient outcome alongside compliance.

Conclusions: Multimodal interventions and quality improvement initiatives seem to produce the greatest improvement in compliance, but trends in other factors were inconsistent. Additional research is required to investigate these relationships and understand the implications behind various approaches to guideline design, communication, and implementation, in addition to effectiveness of protocol impact on relevant patient outcomes.
DOI: http://dx.doi.org/10.1016/j.ajic.2020.02.006
August 2020

Estimating the impact of insurance expansion on colorectal cancer and related costs in North Carolina: A population-level simulation analysis.

Prev Med 2019 12 27;129S:105847. Epub 2019 Oct 27.

Department of Health Policy & Management, Gillings School of Global Public Health, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA; Lineberger Comprehensive Cancer Center, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA; Center for Health Promotion & Disease Prevention, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA.

Although screening is effective in reducing incidence, mortality, and costs of treating colorectal cancer (CRC), it remains underutilized, in part due to limited insurance access. We used microsimulation to estimate the health and financial effects of insurance expansion and reduction scenarios in North Carolina (NC). We simulated the full lifetimes of a population of 3,298,265 residents age-eligible for CRC screening (ages 50-75) during a 5-year period starting January 1, 2018, including polyp incidence and progression and CRC screening, diagnosis, treatment, and mortality. Insurance scenarios included: status quo, which in NC includes access to the Health Insurance Exchange (HIE) under the Affordable Care Act (ACA); no ACA; NC Medicaid expansion; and Medicare-for-all. The insurance expansion scenarios would increase the percent up-to-date with screening by 0.3 and 7.1 percentage points for Medicaid expansion and Medicare-for-all, respectively, while insurance reduction would reduce the percent up-to-date by 1.1 percentage points, compared to the status quo (51.7% up-to-date), at the end of the 5-year period. Over these individuals' lifetimes, this change in CRC screening/testing results in an estimated 498 CRC cases averted with Medicaid expansion, 6,031 averted with Medicare-for-all, and an additional 1,782 cases if health insurance gains associated with the ACA are lost. Estimated cost savings - balancing increased CRC screening/testing costs against decreased cancer treatment costs - are approximately $30 million and $970 million for the Medicaid expansion and Medicare-for-all scenarios, respectively, compared to the status quo. Insurance expansion is likely to improve CRC screening both overall and in underserved populations while saving money, with the largest savings realized under Medicare-for-all.
DOI: http://dx.doi.org/10.1016/j.ypmed.2019.105847
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC7065511
December 2019

Mailed FIT (fecal immunochemical test), navigation or patient reminders? Using microsimulation to inform selection of interventions to increase colorectal cancer screening in Medicaid enrollees.

Prev Med 2019 12 18;129S:105836. Epub 2019 Oct 18.

Department of Health Policy & Management, Gillings School of Global Public Health, University of North Carolina at Chapel Hill, Chapel Hill, NC, United States of America; Lineberger Comprehensive Cancer Center, University of North Carolina at Chapel Hill, Chapel Hill, NC, United States of America; Center for Health Promotion & Disease Prevention, University of North Carolina at Chapel Hill, Chapel Hill, NC, United States of America.

Colorectal cancer (CRC) can be effectively prevented or detected with guideline-concordant screening, yet Medicaid enrollees experience disparities. We used microsimulation to project CRC screening patterns, CRC cases averted, and life-years gained in a population of 68,077 Oregon Medicaid enrollees aged 50-64 over a five-year period starting in January 2019. The simulation estimated the cost-effectiveness of five intervention scenarios - academic detailing plus provider audit and feedback (Detailing+), patient reminders (Reminders), mailing a fecal immunochemical test (FIT) directly to the patient's home (Mailed FIT), patient navigation (Navigation), and mailed FIT with navigation (Mailed FIT + Navigation) - compared to usual care. Each intervention scenario raised CRC screening rates compared to usual care, with improvements as high as 11.6 percentage points (Mailed FIT + Navigation) and as low as 2.5 percentage points (Reminders) after one year. Compared to usual care, Mailed FIT + Navigation would raise CRC screening rates 20.2 percentage points after five years - averting nearly 77 cancer cases (a reduction of 113 per 100,000) and exceeding national screening targets. Over a five-year period, Reminders, Mailed FIT, and Mailed FIT + Navigation were expected to be cost-effective if stakeholders were willing to pay $230 or less per additional year up-to-date (at a cost of $22, $59, and $227, respectively), whereas Detailing+ and Navigation were more costly for the same benefits. To approach national CRC screening targets, health system stakeholders are encouraged to implement Mailed FIT with or without Navigation, and Reminders.
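The "cost per additional year up-to-date" figures follow the standard incremental ratio: incremental cost divided by incremental person-years up-to-date. A sketch with hypothetical inputs (the cost and benefit numbers below are assumptions, not the study's data):

```python
# Incremental cost-effectiveness ratio (ICER) per additional
# person-year up-to-date with screening. All numbers are hypothetical.

def icer(cost_intervention, cost_usual, years_intervention, years_usual):
    """Incremental cost divided by incremental person-years up-to-date."""
    extra_cost = cost_intervention - cost_usual
    extra_years = years_intervention - years_usual
    if extra_years <= 0:
        raise ValueError("intervention adds no benefit over usual care")
    return extra_cost / extra_years

# Hypothetical: an intervention costs $1.1M more than usual care and
# adds 50,000 person-years up-to-date -> $22 per additional year,
# under a $230 willingness-to-pay threshold.
value = icer(1_100_000, 0, 50_000, 0)
assert value == 22.0
assert value <= 230  # would be deemed cost-effective at this threshold
```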
DOI: http://dx.doi.org/10.1016/j.ypmed.2019.105836
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6934075
December 2019

Colorectal cancer screening in newly insured Medicaid members: a review of concurrent federal and state policies.

BMC Health Serv Res 2019 May 9;19(1):298. Epub 2019 May 9.

OHSU-PSU School of Public Health, Oregon Health & Science University, Portland, OR, USA.

Background: Colorectal cancer (CRC) screening is underutilized by Medicaid enrollees and the uninsured. Multiple national and state policies were enacted from 2010 to 2014 to increase access to Medicaid and to promote CRC screening among Medicaid enrollees. We aimed to determine the impact of these policies on screening initiation among newly enrolled Oregon Medicaid beneficiaries age-eligible for CRC screening.

Methods: We identified national and state policies affecting Medicaid coverage and preventive services in Oregon during 2010-2014. We used Oregon Medicaid claims data from 2010 to 2015 to conduct a cohort analysis of enrollees who turned 50 and became age-eligible for CRC screening (a prevention milestone, and an age at which guideline-concordant screening can be assessed within a single year) during each year from 2010 to 2014. We calculated risk ratios to assess whether first year of Medicaid enrollment and/or year turned 50 was associated with CRC screening initiation.

Results: We identified 14,576 Oregon Medicaid enrollees who turned 50 during 2010-2014; 2429 (17%) completed CRC screening within 12 months after turning 50. Individuals newly enrolled in Medicaid in 2013 or 2014 were 1.58 and 1.31 times more likely, respectively, to initiate CRC screening than those enrolled by 2010. A primary care visit in the calendar year, having one or more chronic conditions, and being Hispanic were also associated with CRC screening initiation.

Discussion: The increased uptake of CRC screening in 2013 and 2014 is associated with the timing of policies such as Medicaid expansion, enhanced federal matching for preventive services offered to Medicaid enrollees without cost sharing, and formation of Medicaid accountable care organizations, which included CRC screening as an incentivized quality metric.
DOI: http://dx.doi.org/10.1186/s12913-019-4113-2
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6509857
May 2019

Trends in Characteristics of Patients Listed for Liver Transplantation Will Lead to Higher Rates of Waitlist Removal Due to Clinical Deterioration.

Transplantation 2017 10;101(10):2368-2374

1 Operations Research Graduate Program, North Carolina State University, Raleigh, NC. 2 Department of Industrial and Systems Engineering, North Carolina State University, Raleigh, NC. 3 Department of Medicine, Indiana University School of Medicine, Indianapolis, IN. 4 Department of Health Policy and Management, University of North Carolina, Chapel Hill, NC. 5 Department of Medicine, University of North Carolina, Chapel Hill, NC.

Background: Changes in the epidemiology of end-stage liver disease may lead to increased risk of dropout from the liver transplant waitlist. Anticipating the future of liver transplant waitlist characteristics is vital when considering organ allocation policy.

Methods: We performed a discrete event simulation to forecast patient characteristics and rate of waitlist dropout. Estimates were simulated from 2015 to 2025. The model was informed by data from the Organ Procurement and Transplant Network, 2003 to 2014. National data are estimated along with forecasts for 2 regions.
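A discrete event simulation of a waitlist advances time event-by-event (arrivals, transplants, dropouts) through a priority queue of scheduled events. A minimal sketch (the event rates are illustrative assumptions, not the study's OPTN-informed parameters):

```python
# Minimal discrete event simulation of a transplant waitlist.
# Event times are drawn from exponential distributions; all rates are
# illustrative assumptions, not the study's fitted parameters.
import heapq
import random

def simulate_waitlist(years=10, arrival_rate=100.0, transplant_rate=0.5,
                      dropout_rate=0.25, seed=1):
    rng = random.Random(seed)
    events = []  # min-heap of (time, kind)
    t = 0.0
    while t < years:  # schedule Poisson patient arrivals
        t += rng.expovariate(arrival_rate)
        heapq.heappush(events, (t, "arrival"))
    waiting = transplanted = dropped = 0
    while events:
        time, kind = heapq.heappop(events)
        if time > years:
            continue  # beyond the simulation horizon
        if kind == "arrival":
            waiting += 1
            # each patient "races" transplant against dropout
            tx = rng.expovariate(transplant_rate)
            dx = rng.expovariate(dropout_rate)
            outcome = "transplant" if tx < dx else "dropout"
            heapq.heappush(events, (time + min(tx, dx), outcome))
        elif kind == "transplant":
            waiting -= 1
            transplanted += 1
        else:
            waiting -= 1
            dropped += 1
    return waiting, transplanted, dropped

w, tx, dr = simulate_waitlist()
assert tx > 0 and dr > 0 and w >= 0
```

Raising `dropout_rate` relative to `transplant_rate` over successive simulated years is the kind of mechanism that produces the increasing dropout hazard the study forecasts.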

Results: Nonalcoholic steatohepatitis will increase from 18% of waitlist additions to 22% by 2025. Hepatitis C will fall from 30% to 21%. Listings over age 60 years will increase from 36% to 48%. The hazard of dropout will increase from 41% to 46% nationally. Wait times for patients listed with a Model for End-Stage Liver Disease (MELD) score between 22 and 27 will double. Region 5, which transplants at relatively higher MELD scores, will experience an increase from 53% to 64% in waitlist dropout. Region 11, which transplants at lower MELD scores, will have an increase in waitlist dropout from 30% to 44%.

Conclusions: The liver transplant waitlist size will remain static over the next decade due to patient dropout. Liver transplant candidates will be older, more likely to have nonalcoholic steatohepatitis and will wait for transplantation longer even when listed at a competitive MELD score. There will continue to be significant heterogeneity among transplant regions where some patients will be more likely to drop out of the waitlist than receive a transplant.
DOI: http://dx.doi.org/10.1097/TP.0000000000001851
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5667556
October 2017

Changes in cigarette smoking initiation, cessation, and relapse among U.S. adults: a comparison of two longitudinal samples.

Tob Induc Dis 2017 14;15:17. Epub 2017 Mar 14.

Schroeder Institute for Tobacco Research & Policy Studies, Truth Initiative, Washington, DC USA.

Background: The tobacco epidemic in the U.S. has matured in the past decade. However, due to rapidly changing social policy and commercial environments, tailored prevention and interventions are needed to support further reduction in smoking.

Methods: Using Tobacco Use Supplement to the Current Population Survey (TUS-CPS) 2002-2003 and 2010-2011 longitudinal cohorts, five smoking states are defined including daily-heavy, daily-light, non-daily, former and non-smoker. We quantified the changes between smoking states for the two longitudinal cohorts, and used a series of multivariable logistic regression models to examine the association of socio-demographic attributes and initial smoking states on smoking initiation, cessation, and relapse between waves within each cohort.
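Transitions among the five smoking states between waves can be summarized as a row-stochastic matrix, one row per starting state. A sketch with invented proportions (illustrative only, not TUS-CPS estimates):

```python
# Row-stochastic transition matrix over five smoking states.
# All proportions below are invented for illustration; they are NOT
# TUS-CPS estimates.

STATES = ["daily-heavy", "daily-light", "non-daily", "former", "non-smoker"]

# TRANSITION[i][j] = P(state j at wave 2 | state i at wave 1)
TRANSITION = [
    [0.70, 0.15, 0.05, 0.10, 0.00],  # daily-heavy
    [0.10, 0.60, 0.10, 0.20, 0.00],  # daily-light
    [0.05, 0.10, 0.48, 0.37, 0.00],  # non-daily
    [0.02, 0.03, 0.05, 0.90, 0.00],  # former (first three cols = relapse)
    [0.00, 0.01, 0.02, 0.00, 0.97],  # non-smoker (smoking cols = initiation)
]

for name, row in zip(STATES, TRANSITION):
    assert abs(sum(row) - 1.0) < 1e-9, f"row for {name} must sum to 1"

def quit_probability(state_name):
    """P(moving to 'former') from a given current-smoker state."""
    return TRANSITION[STATES.index(state_name)][STATES.index("former")]

assert quit_probability("non-daily") == 0.37
```

Comparing two such matrices, one per cohort, is what quantifies the changes between smoking states reported in the Results.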

Results: The prevalence of adult heavy smoking decreased from 9.9% (95% CI: 9.6%, 10.2%) in 2002 to 7.1% (95% CI: 6.9%, 7.4%) in 2010. Non-daily smokers were less likely to quit in the 2010-2011 cohort than in the 2002-2003 cohort (37.0% vs. 44.9%). Gender, age group, smoker type, race, and marital status exhibited similar patterns in their associations with the odds of initiation, cessation, and relapse between the two cohorts, while education groups showed some inconsistent results between the two cohorts regarding the odds of cessation.

Conclusions: Transitions between smoking states are complex and increasingly unstable, requiring a holistic, population-based perspective to understand the stocks and flows that ultimately dictate the public health impact of cigarette smoking behavior. This knowledge helps to identify groups in need of increased tobacco control prevention and intervention efforts.
DOI: http://dx.doi.org/10.1186/s12971-017-0121-3
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5351179
March 2017

Cost-Effectiveness Analysis of Four Simulated Colorectal Cancer Screening Interventions, North Carolina.

Prev Chronic Dis 2017 02 23;14:E18. Epub 2017 Feb 23.

Department of Health Policy and Management, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina.

Introduction: Colorectal cancer (CRC) screening rates are suboptimal, particularly among the uninsured and the under-insured and among rural and African American populations. Little guidance is available for state-level decision makers to use to prioritize investment in evidence-based interventions to improve their population's health. The objective of this study was to demonstrate use of a simulation model that incorporates synthetic census data and claims-based statistical models to project screening behavior in North Carolina.

Methods: We used individual-based modeling to simulate and compare intervention costs and results under 4 evidence-based and stakeholder-informed intervention scenarios for a 10-year intervention window, from January 1, 2014, through December 31, 2023. We compared the proportion of people living in North Carolina who were aged 50 to 75 years at some point during the window (that is, age-eligible for screening) who were up to date with CRC screening recommendations across intervention scenarios, both overall and among groups with documented disparities in receipt of screening.

Results: We estimated that the costs of the 4 intervention scenarios considered would range from $1.6 million to $3.75 million. Our model showed that mailed reminders for Medicaid enrollees, mass media campaigns targeting African Americans, and colonoscopy vouchers for the uninsured reduced disparities in receipt of screening by 2023, but produced only small increases in overall screening rates (0.2-0.5 percentage-point increases in the percentage of age-eligible adults who were up to date with CRC screening recommendations). Screening gains ranged from 41,709 additional life-years up to date with screening for the voucher intervention to 145,821 for the mass media intervention. Reminders mailed to Medicaid enrollees and the mass media campaign for African Americans were the most cost-effective interventions, with costs per additional life-year up to date with screening of $25 or less. The intervention expanding the number of endoscopy facilities cost more than the other 3 interventions and was less effective in increasing CRC screening.

Conclusion: Cost-effective CRC screening interventions targeting observed disparities are available, but substantial investment (more than $3.75 million) and additional approaches beyond those considered here are required to realize greater population-wide increases.
DOI: http://dx.doi.org/10.5888/pcd14.160158
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5325466
February 2017

Multilevel predictors of colorectal cancer testing modality among publicly and privately insured people turning 50.

Prev Med Rep 2017 Jun 7;6:9-16. Epub 2016 Dec 7.

Lineberger Comprehensive Cancer Center, University of North Carolina at Chapel Hill, 450 West Drive, CB#7295, Chapel Hill, NC 27599-7295, United States; Cecil G Sheps Center for Health Services Research, University of North Carolina at Chapel Hill, 725 Airport Road, CB#7590, Chapel Hill, NC 27599-7590, United States; Center for Health Promotion and Disease Prevention, University of North Carolina at Chapel Hill, 1700 Martin Luther King Jr. Boulevard, CB#7426, Chapel Hill, NC 27599-7426, United States; Division of General Medicine and Clinical Epidemiology, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599, United States.

Understanding multilevel predictors of colorectal cancer (CRC) screening test modality can help inform screening program design and implementation. We used North Carolina Medicare, Medicaid, and private, commercially available, health plan insurance claims data from 2003 to 2008 to ascertain CRC test modality among people who received CRC screening around their 50th birthday, when guidelines recommend that screening should commence for normal risk individuals. We ascertained receipt of colonoscopy, fecal occult blood test (FOBT) and fecal immunochemical test (FIT) from billing codes. Person-level and county-level contextual variables were included in multilevel random intercepts models to understand predictors of CRC test modality, stratified by insurance type. Of 12,570 publicly-insured persons turning 50 during the study period who received CRC testing, 57% received colonoscopy, whereas 43% received FOBT/FIT, with significant regional variation. In multivariable models, females with public insurance had lower odds of colonoscopy than males (odds ratio [OR] = 0.68; p < 0.05). Of 56,151 privately-insured persons turning 50 years old who received CRC testing, 42% received colonoscopy, whereas 58% received FOBT/FIT, with significant regional variation. In multivariable models, females with private insurance had lower odds of colonoscopy than males (OR = 0.43; p < 0.05). People living 10-15 miles away from endoscopy facilities also had lower odds of colonoscopy than those living within 5 miles (OR = 0.91; p < 0.05). Both colonoscopy and FOBT/FIT are widely used in North Carolina among insured persons newly age-eligible for screening. The high level of FOBT/FIT use among privately insured persons and women suggests that renewed emphasis on FOBT/FIT as a viable screening alternative to colonoscopy may be important.
DOI: http://dx.doi.org/10.1016/j.pmedr.2016.11.019
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5300695
June 2017

Declining liver graft quality threatens the future of liver transplantation in the United States.

Liver Transpl 2015 Aug;21(8):1040-50

Department of Medicine, University of North Carolina, Chapel Hill, NC.

National liver transplantation (LT) volume has declined since 2006, in part because of worsening donor organ quality. Trends that degrade organ quality are expected to continue over the next 2 decades. We used the United Network for Organ Sharing (UNOS) database to inform a 20-year discrete event simulation estimating LT volume from 2010 to 2030. Data to inform the model were obtained from deceased organ donors between 2000 and 2009. If donor liver utilization practices remain constant, utilization will fall from 78% to 44% by 2030, resulting in 2230 fewer LTs. If transplant centers increase their risk tolerance for marginal grafts, utilization would decrease to 48%. The institution of "opt-out" organ donation policies to increase the donor pool would still result in 1380 to 1866 fewer transplants. Ex vivo perfusion techniques that increase the use of marginal donor livers may stabilize LT volume. Otherwise, the number of LTs in the United States will decrease substantially over the next 15 years. In conclusion, the transplant community will need to accept inferior grafts and potentially worse posttransplant outcomes and/or develop new strategies for increasing organ donation and utilization in order to maintain the number of LTs at the current level.
DOI: http://dx.doi.org/10.1002/lt.24160
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4566853
August 2015

Erratum to: Prioritization strategies for patient evacuations.

Health Care Manag Sci 2015 Jun;18(2):218

Department of Industrial Engineering, Clemson University, 110 Freeman Hall, Clemson, SC, 29634-0920, USA.

DOI: http://dx.doi.org/10.1007/s10729-015-9325-3
June 2015

Increasing prevalence of diabetes during pregnancy in South Carolina.

J Womens Health (Larchmt) 2015 Apr 18;24(4):316-23. Epub 2015 Mar 18.

1 Department of Medicine, Medical University of South Carolina , Charleston, South Carolina.

Background: The objective of our study was to examine the prevalence of diabetes during pregnancy at the population level in SC from January 1996 through December 2008.

Methods: The study included 387,720 non-Hispanic white (NHW), 232,278 non-Hispanic black (NHB), and 43,454 Hispanic live singleton births. Maternal inpatient hospital discharge codes from delivery (91.59%) and prenatal information (i.e., Medicaid [42.91%] and SC State Health Plan [SHP] [5.98%]) were linked to birth certificate data. Diabetes during pregnancy included gestational and preexisting, defined by prenatal and maternal inpatient International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) diagnostic codes (i.e., 648.01-648.02, 648.81-648.82, or 250.00-250.92) or report on the birth certificate.

Results: Diabetes prevalence from any source increased from 5.02% (95% confidence interval [CI]: 4.82-5.22) in 1996 to 8.37% (95% CI: 8.15-8.60) in 2008. Diabetes prevalence, standardized for maternal age and race/ethnicity from 1996 through 2008, increased from 3.38% (95% CI: 3.29-3.47) to 5.81% (95% CI: 5.71-5.91) using birth certificate data, from 3.99% (95% CI: 3.89-4.10) to 6.69% (95% CI: 6.58-6.80) using hospital discharge data, and from 4.74% (95% CI: 4.52-4.96) to 8.82% (95% CI: 8.61-9.03) using Medicaid data. Comparing birth certificate to hospital discharge, Medicaid, and SHP data, Cohen's kappa in 2008 was 0.73 (95% CI: 0.72-0.75), 0.64 (95% CI: 0.62-0.66), and 0.59 (95% CI: 0.54-0.65), respectively.
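Cohen's kappa measures agreement between two data sources beyond what chance would produce: kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and p_e is chance agreement from the marginals. A sketch computing it from a 2x2 agreement table (the counts are invented for illustration):

```python
# Cohen's kappa from a 2x2 table of diabetes-flag agreement between
# two data sources (e.g., birth certificate vs. hospital discharge).
# The counts used below are invented for illustration.

def cohens_kappa(both_yes, a_only, b_only, both_no):
    n = both_yes + a_only + b_only + both_no
    p_observed = (both_yes + both_no) / n
    # chance agreement from each source's marginal "yes" rate
    a_yes = (both_yes + a_only) / n
    b_yes = (both_yes + b_only) / n
    p_chance = a_yes * b_yes + (1 - a_yes) * (1 - b_yes)
    return (p_observed - p_chance) / (1 - p_chance)

kappa = cohens_kappa(both_yes=60, a_only=15, b_only=10, both_no=915)
assert 0.7 < kappa < 0.9  # substantial agreement for these counts
```

Note that raw percent agreement here is 97.5%, yet kappa is about 0.81 because most records agree on the common "no diabetes" category; this is why the study reports kappa rather than simple agreement.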

Conclusions: An increasing prevalence of diabetes during pregnancy is reported, as well as substantial lack of agreement in reporting of diabetes prevalence across administrative databases. Prevalence of reported diabetes during pregnancy is impacted by screening, diagnostic, and reporting practices across different data sources, as well as by actual changes in prevalence over time.
http://dx.doi.org/10.1089/jwh.2014.4968
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4394884
April 2015

Predicting Liver Transplant Capacity Using Discrete Event Simulation.

Med Decis Making 2015 Aug 12;35(6):784-96. Epub 2014 Nov 12.

Department of Health Policy and Management, University of North Carolina, Chapel Hill, NC (SBW)

The number of liver transplants (LTs) performed in the US increased until 2006 but has since declined despite an ongoing increase in demand. This decline may be due in part to decreased donor liver quality and increasing discard of poor-quality livers. We constructed a discrete event simulation (DES) model informed by current donor characteristics to predict future LT trends through the year 2030. The data source for our model is the United Network for Organ Sharing database, which contains patient-level information on all organ transplants performed in the US. Previous analysis showed that liver discard is increasing and that discarded organs are more often from donors who are older, are obese, have diabetes, and donated after cardiac death. Given that the prevalence of these factors is increasing, the DES model quantifies the reduction in the number of LTs performed through 2030. In addition, the model estimates the total number of future donors needed to maintain the current volume of LTs and the effect of a hypothetical scenario of improved reperfusion technology. We also forecast the number of patients on the waiting list and compare this with the estimated number of LTs to illustrate the impact that decreased LTs will have on patients needing transplants. By altering assumptions about the future donor pool, this model can be used to develop policy interventions to prevent a further decline in this lifesaving therapy. To our knowledge, there are no similar predictive models of future LT use based on epidemiological trends.
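The core of a discrete event simulation like the one described is an event queue ordered by time. The sketch below is a heavily simplified stand-in, not the authors' UNOS-calibrated model: donor offers arrive at random times, and a fixed discard probability `p_discard` (a hypothetical parameter) takes the place of the donor-quality logic.

```python
import heapq
import random

def simulate_donors(n_donors, p_discard, seed=0):
    """Toy DES sketch: donor liver offers arrive at exponentially
    distributed inter-arrival times; each offer is transplanted or
    discarded with probability p_discard (an illustrative stand-in
    for quality-based discard).  Returns the transplant count."""
    rng = random.Random(seed)
    events = []            # min-heap of (arrival_time, donor_id)
    t = 0.0
    for i in range(n_donors):
        t += rng.expovariate(1.0)          # next arrival time
        heapq.heappush(events, (t, i))
    transplants = 0
    while events:                          # process events in time order
        _, donor = heapq.heappop(events)
        if rng.random() >= p_discard:      # offer accepted
            transplants += 1
    return transplants
```

A full model would replace the single discard probability with donor covariates (age, obesity, diabetes, donation after cardiac death) and track a waiting list alongside the donor stream.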
http://dx.doi.org/10.1177/0272989X14559055
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4429044
August 2015

Simulated estimates of pre-pregnancy and gestational diabetes mellitus in the US: 1980 to 2008.

PLoS One 2013 Sep 5;8(9):e73437. Epub 2013 Sep 5.

Department of Industrial & Systems Engineering, North Carolina State University, Raleigh, North Carolina, United States of America.

Purpose: To simulate national estimates of prepregnancy and gestational diabetes mellitus (GDM) in non-Hispanic white (NHW) and non-Hispanic black (NHB) women.

Methods: Prepregnancy diabetes and GDM were estimated as a function of age, race/ethnicity, and body mass index (BMI) using South Carolina live singleton births from 2004-2008. Diabetes risk was applied to a simulated population. Age, natality and BMI were assigned to women according to race- and age-specific US Census, Natality and National Health and Nutrition Examination Surveys (NHANES) data, respectively.

Results: From 1980-2008, estimated GDM prevalence increased from 4.11% to 6.80% [an absolute increase of 2.68% (95% CI 2.58%-2.78%)] in NHW women and from 3.96% to 6.43% [2.47% (95% CI 2.39%-2.55%)] in NHB women. In NHW women, prepregnancy diabetes prevalence increased by 0.90% (95% CI 0.85%-0.95%), from 0.95% in 1980 to 1.85% in 2008. In NHB women, estimated prepregnancy diabetes prevalence increased by 1.51% (95% CI 1.44%-1.57%) over the same period, from 1.66% to 3.16%.

Conclusions: Racial disparities in diabetes prevalence during pregnancy appear to stem from a higher prevalence of prepregnancy diabetes, but not GDM, in NHB women than in NHW women.
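The Methods describe fitting diabetes risk as a function of age, race/ethnicity, and BMI and then applying that risk to a simulated population with Census- and NHANES-based covariates. A toy version of that pipeline, with hypothetical logistic coefficients and covariate distributions rather than the fitted South Carolina estimates:

```python
import math
import random

def simulate_gdm_prevalence(n, beta0, beta_age, beta_bmi, seed=1):
    """Sketch of the simulation approach: draw age and BMI for a
    synthetic cohort of n women, apply a logistic risk model, and
    report the simulated prevalence.  All coefficients and covariate
    distributions here are hypothetical placeholders."""
    rng = random.Random(seed)
    cases = 0
    for _ in range(n):
        age = rng.uniform(18, 44)        # illustrative age distribution
        bmi = rng.gauss(27, 5)           # illustrative BMI distribution
        logit = beta0 + beta_age * age + beta_bmi * bmi
        p = 1 / (1 + math.exp(-logit))   # individual risk of GDM
        if rng.random() < p:
            cases += 1
    return cases / n
```

In the study, race- and age-specific distributions were taken from Census, Natality, and NHANES data, and the risk model was fit on 2004-2008 South Carolina births before being projected back to 1980.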
http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0073437
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3764167
June 2014

Prioritization strategies for patient evacuations.

Health Care Manag Sci 2014 Mar 11;17(1):77-87. Epub 2013 May 11.

Department of Industrial Engineering, Clemson University, 110 Freeman Hall, Clemson, SC 29634-0920, USA.

Evacuation from a health care facility is considered a last resort, and in the event of a complete evacuation, a standard planning assumption is that all patients will be evacuated. A literature review of suggested prioritization strategies for evacuation planning, as well as the transportation priorities used in actual facility evacuations, shows a lack of consensus about whether critical or non-critical care patients should be transferred first. Moreover, these policies are implicitly "greedy": one patient group is given priority, and that group is evacuated completely before any patients from the other group. The purpose of this paper is to present a dynamic programming model for emergency patient evacuations, to show that a greedy, "all-or-nothing" policy is not always optimal, and to discuss insights from the resulting optimal prioritization strategies for unit- or floor-level evacuations.
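A small dynamic program makes the abstract's point concrete. Under an illustrative survival model (one patient moves per period, and a patient of group g evacuated in period t survives with probability p_g ** t), the optimal ordering can interleave the two groups and beat the greedy all-of-one-group-first policy. The model and parameters below are invented for illustration and are not the paper's formulation:

```python
from functools import lru_cache

def optimal_evacuation_value(crit, noncrit, p_crit, p_noncrit):
    """DP sketch for two-group evacuation prioritization.  State
    (c, n) = patients remaining in each group; the next evacuation
    happens in period t = number already evacuated.  Returns the
    maximum expected number of survivors over all orderings."""
    @lru_cache(maxsize=None)
    def best(c, n):
        if c == 0 and n == 0:
            return 0.0
        t = (crit - c) + (noncrit - n)   # period of the next evacuation
        choices = []
        if c > 0:   # evacuate a critical patient now
            choices.append(p_crit ** t + best(c - 1, n))
        if n > 0:   # evacuate a non-critical patient now
            choices.append(p_noncrit ** t + best(c, n - 1))
        return max(choices)
    return best(crit, noncrit)
```

With five critical patients (p = 0.5) and three non-critical patients (p = 0.9), the DP value strictly exceeds the value of evacuating all criticals first, so a greedy all-or-nothing policy is suboptimal in this toy instance, echoing the paper's conclusion.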
http://dx.doi.org/10.1007/s10729-013-9236-0
March 2014

ERα signaling regulates MMP3 expression to induce FasL cleavage and osteoclast apoptosis.

J Bone Miner Res 2013 Feb;28(2):283-90

UCLA and Orthopaedic Hospital Department of Orthopaedic Surgery, Orthopaedic Hospital Research Center, David Geffen School of Medicine, UCLA, Los Angeles, CA, USA.

The benefits of estrogens on bone health are well established; how estrogens signal to regulate bone formation and resorption is less well understood. We show here that 17β-estradiol (E2)-induced apoptosis of bone-resorbing osteoclasts is mediated by cleavage and solubilization of osteoblast-expressed Fas ligand (FasL). U2OS-ERα osteoblast-like cells expressing an EGFP-tagged FasL at the C-terminus showed decreased fluorescence after E2 treatment, indicative of a cleavage event. Treatment of U2OS-ERα cultures with a specific MMP3 inhibitor in the presence of E2 blocked FasL cleavage and showed an increase in the number of EGFP-FasL+ cells. siRNA experiments successfully knocked down MMP3 expression and restored full-length FasL to basal levels. E2 treatment of both human and murine primary osteoblasts showed upregulation of MMP3 mRNA expression, and calvarial organ cultures showed increased expression of MMP3 protein and colocalization with the osteoblast-specific RUNX2 after E2 treatment. In addition, osteoblast cell cultures derived from ERαKO mice showed decreased expression of MMP3 but not MMP7 and ADAM10, two known FasL proteases, demonstrating that ERα signaling regulates MMP3. Also, conditioned media of E2-treated calvarial osteoblasts showed an approximate sixfold increase in the concentration of soluble FasL, indicating extensive cleavage, and soluble FasL concentrations were reduced in the presence of a specific MMP3 inhibitor. Finally, to show the role of soluble FasL in osteoclast apoptosis, human osteoclasts were cocultured with MC3T3 osteoblasts. Both a specific MMP3 inhibitor and an MMP inhibitor cocktail preserved osteoclast differentiation and survival in the presence of E2 and demonstrate the necessity of MMP3 for E2-induced osteoclast apoptosis. These experiments further define the molecular mechanism of estrogen's bone-protective effects by inducing osteoclast apoptosis through upregulation of MMP3 and FasL cleavage.
http://dx.doi.org/10.1002/jbmr.1747
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3524410
February 2013

Estimated number of preterm births and low birth weight children born in the United States due to maternal binge drinking.

Matern Child Health J 2013 May;17(4):677-88

Department of Public Health Sciences, Clemson University, Clemson, SC 29634, USA.

The objective of this study was to estimate the aggregate burden of maternal binge drinking on preterm birth (PTB) and low birth weight (LBW) across American sociodemographic groups in 2008. A simulation model was developed to estimate the number of PTB and LBW cases due to maternal binge drinking. Data inputs for the model included the number of births and rates of preterm birth and LBW from the National Center for Health Statistics; the female population by childbearing age group from the U.S. Census; the increased relative risks of preterm and LBW deliveries due to maternal binge drinking, extracted from the literature; and the adjusted prevalence of binge drinking among pregnant women, estimated in a multivariate logistic regression model using Behavioral Risk Factor Surveillance System survey data. The most conservative estimates attributed 8,701 (95% CI: 7,804-9,598) PTBs (1.75% of all PTBs) and 5,627 (95% CI: 5,121-6,133) LBW deliveries in 2008 to maternal binge drinking, with 3,708 (95% CI: 3,375-4,041) cases of both PTB and LBW. The estimated rate of PTB due to maternal binge drinking was 1.57% of all PTBs among White women, 0.69% among Black women, 3.31% among Hispanic women, and 2.35% among women of other races. Compared to other age groups, women ages 40-44 had the highest adjusted binge drinking rate and the highest rate of PTB due to maternal binge drinking (4.33%). Maternal binge drinking contributed significantly, and differentially across sociodemographic groups, to PTB and LBW.
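Burden estimates of this kind typically combine exposure prevalence and relative risk into a population attributable fraction (PAF) and apply it to the observed case count. A sketch of that arithmetic, using the standard formula PAF = p(RR - 1) / (1 + p(RR - 1)) with placeholder inputs rather than the study's fitted values:

```python
def attributable_cases(total_cases, prevalence, rr):
    """Cases attributable to an exposure, via the population
    attributable fraction.  `prevalence` is the exposure prevalence
    in the population and `rr` the relative risk of the outcome
    among the exposed.  Inputs are illustrative placeholders."""
    paf = prevalence * (rr - 1) / (1 + prevalence * (rr - 1))
    return total_cases * paf
```

The study's estimates refine this by stratifying prevalence and case counts by race/ethnicity and maternal age group before summing, which is why the attributable PTB rate varies from 0.69% to 4.33% across strata.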
http://dx.doi.org/10.1007/s10995-012-1048-1
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3664940
May 2013

Maternal pre-pregnancy weight and gestational weight gain and their association with birthweight with a focus on racial differences.

Matern Child Health J 2013 Jan;17(1):85-94

Department of Medicine, Division of Biostatistics and Epidemiology, Medical University of South Carolina, 135 Cannon Street, Charleston, SC 29425, USA.

Our objectives were to examine the interaction between maternal pre-pregnancy body mass index (BMI) and gestational weight gain (GWG) and their association with birthweight, with a focus on racial differences. We conducted a cross-sectional study using birth certificate data from live singleton births of South Carolina resident mothers who self-reported their race as non-Hispanic white (NHW, n = 140,128) or non-Hispanic black (NHB, n = 82,492) and who delivered at 34-44 weeks of gestation between 2004 and 2008. Linear regression was used to examine the relationship between our exposures (i.e., race, BMI, and GWG) and our outcome, birthweight. Based on 2009 Institute of Medicine guidelines, the prevalence of adequate, inadequate, and excessive GWG was 27.1%, 24.2%, and 48.7%, respectively, in NHW women and 24.2%, 34.8%, and 41.0%, respectively, in NHB women. Adjusting for infant sex, gestational age, maternal age, tobacco use, education, prenatal care, and Medicaid status, the difference in birthweight between excessive and adequate GWG at a maternal BMI of 30 kg/m² was 118 g (95% CI: 109, 127) in NHW women and 101 g (95% CI: 91, 111) in NHB women. Moreover, excessive versus adequate GWG conveyed similar protection against having a small for gestational age infant in NHW [OR = 0.64 (95% CI: 0.61, 0.67)] and NHB women [OR = 0.68 (95% CI: 0.65, 0.72)]. In conclusion, we report a strong association between excessive GWG and higher infant birthweight across maternal BMI classes in NHW and NHB women. Given the high prevalence of excessive GWG, even a small increase in birthweight may have considerable implications at the population level.
http://dx.doi.org/10.1007/s10995-012-0950-x
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3677820
January 2013

Evaluating emergency medical service performance measures.

Health Care Manag Sci 2010 Jun;13(2):124-36

Department of Statistical Sciences & Operations Research, Virginia Commonwealth University, P.O. Box 843083, 1001 W. Main Street, Richmond, VA 23284, USA.

The ultimate goal of emergency medical service systems is to save lives. However, most emergency medical service systems measure performance by the fraction of 911 calls answered within a fixed timeframe (i.e., a response time threshold), rather than by patient outcomes. Response time thresholds are used because they are easy to obtain and to understand. This paper proposes a methodology for evaluating response time thresholds in terms of the patient survival rates they produce. A model that locates ambulances to optimize patient survival rates is used as the basis for comparison. Results are illustrated using real-world data collected from Hanover County, Virginia. They indicate that locating ambulances to maximize coverage within seven- and eight-minute response time thresholds simultaneously maximizes patient survival. Nine- and ten-minute response time thresholds result in more equitable patient outcomes, with improved patient survival rates in rural regions.
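The contrast between threshold-based and outcome-based measures can be illustrated with a response-time survival curve. The logistic decay below is a commonly used functional form for out-of-hospital cardiac arrest survival as a function of response time; its coefficients and the sample response times are illustrative stand-ins, not the model or data from the paper:

```python
import math

def survival_prob(response_min):
    """Illustrative survival curve: probability of survival as a
    logistic decay in response time (minutes).  Coefficients are a
    commonly cited form, used here only as a placeholder."""
    return 1 / (1 + math.exp(-0.679 + 0.262 * response_min))

def threshold_coverage(times, threshold):
    """Threshold metric: fraction of calls answered within the
    response time threshold (minutes)."""
    return sum(t <= threshold for t in times) / len(times)

def expected_survival(times):
    """Outcome metric: mean survival probability over the same calls."""
    return sum(survival_prob(t) for t in times) / len(times)
```

The threshold metric treats a 7-minute and a 7.9-minute response identically and ignores everything past the cutoff, whereas the survival metric credits every minute saved, which is why the two objectives can favor different ambulance locations.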
http://dx.doi.org/10.1007/s10729-009-9115-x
June 2010

Integrating transcriptional and metabolite profiles to direct the engineering of lovastatin-producing fungal strains.

Nat Biotechnol 2003 Feb 21;21(2):150-6. Epub 2003 Jan 21.

Microbia, Inc., 320 Bent Street, Cambridge, MA 02141, USA.

We describe a method to decipher the complex inter-relationships between metabolite production trends and gene expression events, and show how information gleaned from such studies can be applied to yield improved production strains. Genomic fragment microarrays were constructed for the Aspergillus terreus genome, and transcriptional profiles were generated from strains engineered to produce varying amounts of the medically significant natural product lovastatin. Metabolite detection methods were employed to quantify the polyketide-derived secondary metabolites lovastatin and (+)-geodin in broths from fermentations of the same strains. Association analysis of the resulting transcriptional and metabolic data sets provides mechanistic insight into the genetic and physiological control of lovastatin and (+)-geodin biosynthesis, and identifies novel components involved in the production of (+)-geodin, as well as other secondary metabolites. Furthermore, this analysis identifies specific tools, including promoters for reporter-based selection systems, that we employed to improve lovastatin production by A. terreus.
http://dx.doi.org/10.1038/nbt781
February 2003