Investigators hope that the model may serve as a foundation for future applications in gout diagnosis.
Credit: Adobe Stock
Researchers have developed an interpretable machine learning (ML) model for predicting gout, establishing a foundation for future applications in supporting gout diagnosis.1
“The integration of ultrasound data into ML algorithms could provide a more comprehensive approach to gout diagnosis, combining the advantages of both imaging diagnostics and artificial intelligence. Furthermore, while existing ML models for gout prediction have shown promise, their clinical applicability has been limited by a lack of interpretability concerning their decision-making processes,” lead investigator Lishan Xiao, Department of Ultrasound, the Affiliated Hospital of Qingdao University, Qingdao, China, and colleagues wrote.1
The investigators included ultrasound data of the first metatarsophalangeal (MTP1) joint from 609 patients across 2 institutions. Data from institution 1 (n = 571) were split into a training cohort (TC) and an internal testing cohort (ITC) in an 8:2 ratio, and data from institution 2 (n = 92) served as an external testing cohort (ETC). Key predictors were selected using Random Forest (RF), Least Absolute Shrinkage and Selection Operator (LASSO), and Extreme Gradient Boosting (XGBoost) algorithms. Xiao and colleagues evaluated 6 ML models using standard performance metrics, with SHapley Additive exPlanations (SHAP) analysis for model interpretation.1
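The core pipeline the article describes (an 8:2 train/test split, a logistic regression classifier, and evaluation by AUC for discrimination and Brier score for calibration) can be sketched in a few lines with scikit-learn. This is a minimal illustrative sketch, not the authors' code: the data below are synthetic stand-ins for the clinical and ultrasound features named in the study, and the feature-selection step (RF/LASSO/XGBoost) is omitted.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, brier_score_loss

rng = np.random.default_rng(0)
n = 600
# Synthetic stand-ins for the 5 predictors reported in the study
# (SUA, DL model prediction, tophus, bone erosion, double contour sign).
X = rng.normal(size=(n, 5))
# Simulated outcome: gout status depends on the features plus noise,
# so the model has real signal to learn. Coefficients are arbitrary.
logits = X @ np.array([1.2, 0.8, 0.6, 0.5, 0.4])
y = (logits + rng.normal(scale=1.0, size=n) > 0).astype(int)

# 8:2 split, mirroring the training / internal-testing split.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = LogisticRegression().fit(X_tr, y_tr)
prob = model.predict_proba(X_te)[:, 1]

auc = roc_auc_score(y_te, prob)        # discrimination
brier = brier_score_loss(y_te, prob)   # calibration (lower is better)
print(f"AUC={auc:.3f}  Brier={brier:.3f}")
```

On real data, the SHAP step would then be run on the fitted model to attribute each prediction to individual features, which is what makes the approach interpretable.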
Xiao and colleagues identified 5 key predictors of gout: serum uric acid (SUA), deep learning (DL) model predictions, tophus, bone erosion, and the double contour sign (DCS). Optimal predictive performance was achieved by a logistic regression (LR) model, which had an area under the curve (AUC) of 0.870 (95% CI, 0.820–0.920) in the ITC and 0.854 (95% CI, 0.804–0.904) in the ETC. The model had Brier scores of 0.138 in the ITC and 0.159 in the ETC, demonstrating good calibration.1
“This study developed an interpretable ML model for gout prediction and utilized SHAP to elucidate feature contributions, establishing a foundation for future applications in clinical decision support for gout diagnosis,” Xiao and colleagues wrote.1
Other recent research on gout prediction found that the neutrophil-to-lymphocyte ratio (NLR) and monocyte-to-lymphocyte ratio (MLR) may predict gout flare risk and increased risk of cardiovascular disease (CVD)-related mortality. In a small cohort of 38 patients with gout, investigators found that levels of the neutrophil activation marker calprotectin correlated with NLR (r = 0.56; P = .0004) and that MLR correlated with the total number of gout attacks (r = 0.39; P = .02). NLR and MLR, but not absolute monocyte or neutrophil counts, were significantly correlated with body mass index and significantly increased in patients with gout at high CVD risk (P < .05). Logistic regression analysis revealed that patients in the upper quartile of NLR or MLR had increased odds of high CVD risk (odds ratio, 7.5 [95% CI, 1.7–33.0]).2
“NLR and MLR are associated with an increased number of gout flares and elevations of either NLR or MLR predict CVD-related morbidity risk. As NLR and MLR are readily available laboratory tests, they may serve as useful biomarkers to assess gout flare risk and identify gout patients at risk of poor CVD outcomes to offer intensified monitoring and treatment,” Stultz and colleagues concluded.2