A study revealed that factors linked to gout flare risk included the absence of tophi, younger age, and higher baseline serum urate. Choosing allopurinol over febuxostat, or vice versa, did not affect flare risk.
The study showed a similar gout flare risk during the initiation and titration of allopurinol and febuxostat when administered as a treat-to-target strategy with optimal anti-inflammatory prophylaxis.1
“After accounting for covariates, we found that for each 1 mg/dl higher baseline sUA concentration, participants had an approximate 10% increase in flare risk during the first 6 months of [urate-lowering therapy],” investigators wrote.
Beginning urate-lowering therapy in gout can provoke gout flares. Flares often accompany a significant reduction in serum urate levels; the greater the reduction, the more likely a flare will occur.2 Patients may receive allopurinol or febuxostat as urate-lowering therapy to control gout, yet the question remained: does either agent carry a lower flare risk during initiation and titration?
Investigators, led by Austin Barry, MD, from the Veterans Affairs Nebraska-Western Iowa Health Care System and the department of internal medicine at the University of Nebraska Medical Center, sought to compare the flare risk during the initiation and titration of allopurinol versus febuxostat, administered as a treat-to-target strategy during urate-lowering therapy.1 The team conducted a post-hoc analysis of the STOP Gout trial, a 72-week randomized, double-blind, placebo-controlled, non-inferiority trial comparing the efficacy of allopurinol and febuxostat in 940 patients with gout undergoing anti-inflammatory prophylaxis. An equal number of participants received allopurinol and febuxostat.
Barry and colleagues examined flares during weeks 0 – 24 of urate-lowering therapy. During that time, the urate-lowering therapy was titrated to a serum urate goal of < 6 mg/dl (< 5 mg/dl if tophi). Investigators assessed flares at regular intervals through structured participant interviews. From there, they examined predictors of flares using multivariable Cox proportional hazards regression.
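To make the modeling approach concrete, the sketch below shows how a multivariable Cox proportional hazards regression of time to first flare might be set up with the open-source lifelines library. This is an illustration only, not the authors' code; the dataset, column names, and covariates are hypothetical.

```python
# Illustrative sketch only -- not the study's analysis code.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "febuxostat": rng.integers(0, 2, n),      # 1 = febuxostat, 0 = allopurinol
    "age_decades": rng.normal(6.2, 1.0, n),   # age expressed per 10 years
    "baseline_sua": rng.normal(8.5, 1.5, n),  # serum urate, mg/dl
    "tophi": rng.integers(0, 2, n),           # 1 = tophi present at baseline
})
# Hypothetical time to first flare over weeks 0-24, censored at week 24.
df["weeks"] = rng.exponential(20, n).clip(max=24)
df["flare"] = (df["weeks"] < 24).astype(int)

cph = CoxPHFitter(penalizer=0.1)  # small penalty stabilizes the fit on toy data
cph.fit(df, duration_col="weeks", event_col="flare")
cph.print_summary()               # exp(coef) column = adjusted hazard ratios
```

In a model of this form, the exponentiated coefficients are the adjusted hazard ratios reported in the study (for example, an aHR near 1.09 per mg/dl of baseline serum urate).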
The sample had a mean age of 62.1 years and was mostly male (98.4%) and white (67.8%). Participants had a mean baseline serum urate of 8.5 mg/dl, and everyone received anti-inflammatory prophylaxis (89.6% colchicine, 4.6% NSAIDs, 3.1% prednisone, and 2.8% other). Among participants taking colchicine, the most common dose was 0.6 mg once daily (88.1%), followed by 0.6 mg every other day (10.3%), and 0.6 mg twice daily (1.6%).
During phase 1 (weeks 0 – 24), ≥ 1 flare was observed in 42.1% of participants receiving allopurinol and 46.2% receiving febuxostat. The mean change in sUA over this period was similar for participants experiencing ≥ 1 flare versus no flare (3.24 ± 1.87 mg/dl vs 3.12 ± 1.68 mg/dl, respectively; P = 0.34).
The multivariable analysis showed no significant difference in the flare risk between febuxostat and allopurinol (adjusted hazard ratio [aHR], 1.17; 95% confidence interval [CI], 0.90 to 1.53). Increasing the dose of urate-lowering therapy did not significantly alter the flare risk compared to not increasing the dose (aHR, 1.18; 95% CI, 0.86 to 1.63).
Additionally, neither prophylaxis type nor coexisting comorbidities were significantly linked to flare risk. The analysis showed no evidence that increasing the urate-lowering therapy dose interacted with other variables to significantly alter flare risk, with the exception of age (aHR, 0.87; 95% CI, 0.77 to 0.99 per 10 years).
Unlike other studies that found subcutaneous tophi predicted a greater flare risk during the first 3 – 6 months of urate-lowering therapy, investigators of the new study observed that tophi had a “protective” effect. Participants with tophi were 30% less likely to have a flare during the first 6 months of therapy (aHR, 0.70; 95% CI, 0.54 to 0.91).
In addition to the absence of tophi, the analysis identified other independent factors linked to flare risk during urate-lowering therapy: younger age (aHR, 0.86; 95% CI, 0.78 to 0.98 per 10 years) and higher baseline serum urate (aHR, 1.09; 95% CI, 1.01 to 1.18 per mg/dl). Stage 3 chronic kidney disease (aHR, 1.24; 95% CI, 0.97 to 1.59) and higher baseline C-reactive protein (aHR, 1.05; 95% CI, 1.00 to 1.10 per 10 mg/L) trended toward a greater flare risk, but neither reached statistical significance.
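As a rough illustration of how these per-unit hazard ratios read (illustrative arithmetic, not a calculation reported in the paper), the estimates multiply across units of the covariate:

```latex
% Illustrative arithmetic using the reported point estimates.
\[
\text{HR}(\text{baseline sUA } 2\ \text{mg/dl higher}) \approx 1.09^{2} \approx 1.19
\quad \text{(about 19\% higher hazard)}
\]
\[
\text{HR}(\text{10 years younger}) \approx \frac{1}{0.86} \approx 1.16
\quad \text{(about 16\% higher hazard)}
\]
```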
“These factors, and others identified in future research, may help to risk stratify patients initiating ULT and identify individuals who might benefit from more robust anti-inflammatory prophylaxis,” investigators concluded.
References