Social vulnerability index, race, and poverty were associated with increased rates of CDI, calling attention to population-level health disparities.
Results from an ecological study published in Preventive Medicine Reports highlight an association between neighborhood socioeconomic status and Clostridioides difficile infection (CDI) rates, underscoring the role of underlying social determinants in infection disparities.1
“Most efforts toward CDI prevention have focused on antimicrobial stewardship and infection prevention in health care,” wrote lead investigator Jessica Butler, MPH, graduate teaching assistant at the University of Colorado, and colleagues.1 “Much less is known about whether health disparities exist at the population level, and whether public health efforts should address underlying social determinants of health.”
The US Centers for Disease Control and Prevention (CDC) estimates there are approximately 500,000 cases of CDI in the US each year, citing older age, a recent stay at a hospital or nursing home, a weakened immune system, and exposure to Clostridioides difficile as risk factors.2 An estimated two-thirds of CDI cases originate in hospitals, long-term care facilities, or other healthcare settings, although community-associated cases not acquired in healthcare settings are on the rise.3
To assess the association between measures of socioeconomic status and CDI, investigators collected CDI surveillance data from the Colorado Department of Public Health and Environment covering 2,532,982 individuals living in the Denver metropolitan area. All 16,781 CDI cases identified during the surveillance period were included in the analysis. Investigators geocoded cases to the census tract of residence using 2010 US Census boundaries and aggregated them into observed counts within census tracts by sex and age strata.1
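As a rough illustration of this aggregation step only, the sketch below groups geocoded case records into observed counts by census tract and sex-age stratum; the column names and values are hypothetical, not taken from the study.

```python
import pandas as pd

# Hypothetical case-level surveillance data: one row per geocoded CDI case,
# with the 2010 census tract of residence, sex, and age group already assigned.
cases = pd.DataFrame({
    "tract": ["08031000102", "08031000102", "08005005500"],
    "sex": ["F", "M", "F"],
    "age_group": ["65-74", "45-54", "65-74"],
})

# Aggregate cases into observed counts within census tracts by sex-age stratum.
observed = (
    cases.groupby(["tract", "sex", "age_group"])
         .size()
         .rename("observed")
         .reset_index()
)
print(observed)
```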
The overall crude CDI rate between 2016 and 2019 was 155 cases per 100,000 persons. Investigators calculated expected CDI counts for each census tract by applying sex- and age-stratified rates for the catchment area to the corresponding sex- and age-stratified population counts for each tract, then used the expected counts to calculate standardized morbidity ratios. Population denominators for the person-time underlying the standardized morbidity ratios were taken from the 2015-2019 American Community Survey.1
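A minimal sketch of the indirect standardization described here, assuming hypothetical stratum rates and populations: catchment-wide stratum rates are applied to each tract's stratum populations to get expected counts, and the standardized morbidity ratio is observed divided by expected.

```python
import pandas as pd

# Hypothetical stratum rates for the whole catchment area (cases per person),
# stratum populations for one tract, and that tract's observed case count.
catchment_rates = pd.DataFrame({
    "sex": ["F", "M"],
    "age_group": ["65-74", "65-74"],
    "rate": [0.0031, 0.0024],
})
tract_pop = pd.DataFrame({
    "tract": ["08031000102", "08031000102"],
    "sex": ["F", "M"],
    "age_group": ["65-74", "65-74"],
    "population": [410, 385],
})
observed = pd.DataFrame({"tract": ["08031000102"], "observed": [3]})

# Expected count per tract: sum over strata of catchment rate x tract population.
expected = (
    tract_pop.merge(catchment_rates, on=["sex", "age_group"])
             .assign(expected=lambda d: d["rate"] * d["population"])
             .groupby("tract", as_index=False)["expected"].sum()
)

# Standardized morbidity ratio (SMR): observed cases divided by expected cases.
smr = expected.merge(observed, on="tract")
smr["smr"] = smr["observed"] / smr["expected"]
print(smr)
```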
The primary explanatory variable was the social vulnerability index, a measure developed by the CDC/Agency for Toxic Substances and Disease Registry that ranks the relative vulnerability of every US census tract on 15 social factors. Additional explanatory variables, including the percentage living below the federal poverty level, percentage of non-White race, percentage of Hispanic ethnicity, and percentage without health insurance, were evaluated separately.1
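For context, the index expresses each tract's overall ranking as a value from 0 (least vulnerable) toward 1 (most vulnerable). The toy sketch below shows that kind of relative percentile ranking; it uses a made-up composite score and is not the agency's actual computation over the 15 factors.

```python
import pandas as pd

# Hypothetical composite vulnerability scores for a few tracts (the real index
# is built from 15 social factors; this single score stands in for them).
tracts = pd.DataFrame({
    "tract": ["A", "B", "C", "D"],
    "composite_score": [2.1, 5.6, 3.3, 7.4],
})

# Percentile-rank the tracts so the index runs from low (least vulnerable)
# toward 1 (most vulnerable), mirroring the index's relative-ranking idea.
tracts["svi_like_rank"] = tracts["composite_score"].rank(pct=True)
print(tracts)
```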
The median census tract social vulnerability index was 0.4 (range, 0.2-0.7). The median percentage of the population living in poverty was 6.8 (range, 3.7-12.5), the percentage without health insurance was 5.5 (range, 3.0-10.3), the percentage of non-White residents was 14.3 (range, 8.9-24.7), and the percentage of Hispanic residents was 15.7 (range, 8.9-30.7). Investigators pointed out that social vulnerability index was correlated with poverty (ρ = .79), health insurance coverage (ρ = .78), non-White race (ρ = .54), and Hispanic ethnicity (ρ = .80), while the American Community Survey variables were correlated with one another (ρ = .41-.75).1
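Coefficients like these come from a pairwise correlation matrix over the tract-level variables. The sketch below shows that computation on synthetic data; the choice of Spearman correlation is an assumption, as the article does not state which coefficient the study used.

```python
import numpy as np
import pandas as pd

# Synthetic tract-level data standing in for the index and ACS variables,
# generated so the variables are mutually correlated as in the study.
rng = np.random.default_rng(0)
n = 500
poverty = rng.uniform(0, 40, n)
tracts = pd.DataFrame({
    "svi": np.clip(poverty / 40 + rng.normal(0, 0.15, n), 0, 1),
    "pct_poverty": poverty,
    "pct_uninsured": poverty * 0.3 + rng.normal(0, 3, n),
    "pct_nonwhite": poverty * 0.6 + rng.normal(0, 8, n),
})

# Pairwise Spearman correlations among the tract-level explanatory variables.
print(tracts.corr(method="spearman").round(2))
```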
Upon analysis, lower neighborhood socioeconomic status, as indicated by higher social vulnerability index, non-White race, and poverty, was associated with increased rates of CDI. For every 0.1-unit increase in social vulnerability index, the rate of CDI increased by 5% (risk ratio [RR], 1.05; 95% confidence interval [CI], 1.04-1.06). In unadjusted models, poverty (RR, 1.15; 95% CI, 1.10-1.20, per 10% increase), uninsured status (RR, 1.14; 95% CI, 1.07-1.21, per 10% increase), non-White race (RR, 1.09; 95% CI, 1.05-1.12, per 10% increase), and Hispanic ethnicity (RR, 1.04; 95% CI, 1.01-1.06, per 10% increase) were associated with increased CDI rates. In a multivariable model, poverty (adjusted risk ratio [aRR], 1.10; 95% CI, 1.05-1.16, per 10% increase) and non-White race (aRR, 1.05; 95% CI, 1.01-1.09, per 10% increase) remained associated with greater CDI rates.1
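Rate ratios of this kind are commonly estimated with a Poisson model of observed counts offset by the log of the expected counts, so that exponentiated coefficients are rate ratios. The study's exact modeling choices are not detailed in the article, so the sketch below is a generic version on synthetic data with hypothetical column names.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic tract-level data: index expressed in 0.1-unit steps, expected
# counts from indirect standardization, observed counts simulated near RR 1.05.
rng = np.random.default_rng(1)
n = 600
tracts = pd.DataFrame({
    "svi_x10": rng.uniform(0, 10, n),
    "expected": rng.uniform(5, 60, n),
})
tracts["observed"] = rng.poisson(tracts["expected"] * 1.05 ** tracts["svi_x10"])

# Poisson model of observed counts with log(expected) as the offset;
# exp(coefficient) is the rate ratio per 0.1-unit increase in the index.
fit = smf.glm(
    "observed ~ svi_x10",
    data=tracts,
    family=sm.families.Poisson(),
    offset=np.log(tracts["expected"]),
).fit()
print(np.exp(fit.params))
```

The offset fixes each tract's expected count as the baseline, so the fitted coefficients describe how rates deviate from expectation as the explanatory variable changes.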
“Our findings suggest that [social vulnerability index] may be a good single variable with which to identify neighborhoods at increased risk of CDI without the statistical issues associated with multiple correlated [American Community Survey] variables, and that socioeconomic characteristics not directly related to health care are important risk factors for this healthcare-associated infection,” concluded investigators.1
References: