Changes in serum urate levels, the number of gout flares in the year before treatment, and tophus were linked to the number of gout flares.
A higher baseline serum urate, coupled with a greater decrease in serum urate, was associated with a higher risk of gout flares, according to research published in Open Access to Scientific and Medical Research.1
“Higher baseline serum urate or higher initial urate-lowering medication dose increased risk of gout flares during urate-lowering therapy (ULT) initiation,” explained a group of Chinese investigators. “The decrease in serum urate may play a crucial role in this process. Therefore, we aim to explore the relationship between decrease in serum urate and the risk of gout flares during ULT initiation.”
Gout, a metabolic disorder, frequently manifests as recurrent acute arthritis, chronic joint swelling, tophus, and, in severe cases, renal failure and joint deformity. ULT is commonly used to reduce serum urate levels; however, gout flares are often reported during the first few months of ULT.2 Therefore, identifying the risk factors for gout flares during ULT initiation is critical.
The 12-week prospective cohort study evaluated males with gout in China. Patients were grouped by baseline serum urate level, categorized as 7 – 7.9 mg/dL, 8 – 8.9 mg/dL, or ≥9 mg/dL. All patients received febuxostat 20 mg daily at weeks 0 – 4, which was escalated to 40 mg during weeks 4 – 12 if serum urate remained >6 mg/dL.
Of the 282 patients enrolled, 260 completed treatment between March 2021 and December 2021 at the Shandong Provincial Clinical Research Center for Immune Diseases and Gout. Nearly half (44.2%) reported ≥1 gout flare during the treatment period. Primary outcomes included the number of gout flares and decreases in serum urate levels over the 12-week period. Poisson regression analyses were conducted to identify risk factors.
In the multivariate Poisson regression, the change in serum urate levels between weeks 0 and 12 (incidence rate ratio [IRR], 1.184; 95% confidence interval [CI], 1.062 – 1.320; P = .002), the number of gout flares in the year before treatment (IRR, 1.017; 95% CI, 1.010 – 1.024; P < .001), and tophus (IRR, 1.580; 95% CI, 1.023 – 2.440; P = .039) were linked to the number of gout flares. A further regression analysis showed baseline serum urate (IRR, 1.256; 95% CI, 1.050 – 1.503; P = .013) and the number of gout flares in the year before treatment (IRR, 1.014; 95% CI, 1.007 – 1.022; P < .001) were independently linked to the number of gout flares, while the change in serum urate over the first 12 weeks (IRR, 1.055; 95% CI, .923 – 1.207; P = .433) was no longer a risk factor.
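For readers curious how incidence rate ratios of this kind are produced, the following is a minimal sketch, assuming a Poisson model fit with the statsmodels library in Python. The variable names and simulated data are hypothetical illustrations, not the study's dataset or analysis code.

```python
# Illustrative sketch only: obtaining IRRs from a Poisson regression.
# All variable names and data below are hypothetical, not the study's.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 260  # number of completers reported in the study
df = pd.DataFrame({
    "urate_change_0_12": rng.normal(3.0, 1.0, n),  # mg/dL decrease, weeks 0-12
    "flares_prior_year": rng.poisson(5, n),
    "tophus": rng.integers(0, 2, n),
})
# Simulated outcome: flare count over the 12-week follow-up
rate = np.exp(-1.0 + 0.17 * df["urate_change_0_12"]
              + 0.02 * df["flares_prior_year"] + 0.45 * df["tophus"])
df["flares"] = rng.poisson(rate)

model = smf.poisson(
    "flares ~ urate_change_0_12 + flares_prior_year + tophus", data=df
).fit()

# Exponentiating the coefficients yields IRRs; the same transform
# applied to the confidence-interval bounds yields the IRR CIs.
irr = np.exp(model.params).rename("IRR")
irr_ci = np.exp(model.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
print(pd.concat([irr, irr_ci], axis=1))
```

An IRR above 1 (eg, 1.184 per unit change in serum urate) indicates that each one-unit increase in the predictor multiplies the expected flare count by that factor, holding the other covariates constant.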
ULT-induced gout flares were linked to the degree of serum urate decline, which was itself linked to baseline serum urate. In turn, a higher baseline serum urate and a greater decrease in serum urate were associated with a higher risk of gout flares.
Investigators noted the single-center, single-ethnicity design may have hindered the generalizability of the results; future multicenter studies could enroll gout patients of different genders, ethnicities, and regions. Another limitation was the lack of data on patients' dietary intake, the inclusion of which would have enabled adjustment for dietary factors in the statistical analysis.
“For clinics, we suggest that they develop individualized ULT plans for gout patients based on their baseline serum urate levels,” investigators concluded. “For patients with higher serum urate, physicians would start with a lower dose of urate-lowering drugs to minimize the risk of gout flares due to a rapid fall of serum urate during ULT initiation. For related stakeholders, such as the government, we urge them to increase awareness, so that gout patients can learn more about the knowledge that rapid fall of serum urate can increase the risk of gout flares.”
References