Molybdenum’s antioxidative properties may help reduce systemic inflammation, ROS, and uric acid levels.
Urinary molybdenum levels are associated with decreased prevalence of hyperuricemia and gout in adults.1
“In this study, we aimed to understand the antioxidative potential of molybdenum and the association between urinary molybdenum and various kidney-related parameters and comorbidities including gout and hyperuricemia,” lead investigator Joo Hong Joun, Seoul National University Hospital, and colleagues wrote.1
Joun and colleagues investigated the epidemiological relationship between urinary molybdenum, hyperuricemia, and kidney disease-related outcomes in 15,370 adult participants in the National Health and Nutrition Examination Survey (NHANES), with data collected between 1999 and 2016.
The investigators normalized urinary molybdenum levels to urinary creatinine concentrations. They used multivariable linear and logistic regression analyses, adjusting for covariates including age, sex, ethnicity, diabetes mellitus, hypertension, body mass index, and estimated glomerular filtration rate, to assess the association between the urinary molybdenum-to-creatinine ratio and kidney disease-related outcomes. They used antimony and tungsten as control trace metals in the in vitro tests.1
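The adjusted-odds-ratio design described above can be sketched in a few lines. This is an illustrative example on synthetic data only, not the authors' code or the NHANES dataset; every variable name and coefficient is hypothetical. It mimics the reported approach: a multivariable logistic regression of hyperuricemia status on the log urinary molybdenum-to-creatinine ratio, adjusted for covariates such as age and body mass index, with exponentiated coefficients read off as odds ratios.

```python
# Illustrative sketch on synthetic data -- not the authors' code or NHANES data.
import numpy as np

def fit_logistic(X, y, iters=25):
    """Maximum-likelihood logistic regression via Newton-Raphson."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        grad = X.T @ (y - p)                    # score vector
        H = X.T @ (X * (p * (1 - p))[:, None])  # observed information matrix
        beta += np.linalg.solve(H, grad)
    return beta

rng = np.random.default_rng(0)
n = 5000
age = rng.normal(50, 15, n)          # hypothetical covariates
bmi = rng.normal(27, 5, n)
log_mo_cr = rng.normal(0.0, 0.5, n)  # log(urinary Mo/creatinine ratio)

# Simulate an inverse Mo-hyperuricemia association (true coefficient -0.5)
logit = -1.0 + 0.02 * (age - 50) + 0.05 * (bmi - 27) - 0.5 * log_mo_cr
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = np.column_stack([np.ones(n), log_mo_cr, age, bmi])
beta_hat = fit_logistic(X, y)
odds_ratio_mo = float(np.exp(beta_hat[1]))  # adjusted OR per unit log(Mo/Cr)
print(f"adjusted OR for log(Mo/Cr): {odds_ratio_mo:.2f}")
```

An odds ratio below 1 in this setup corresponds to the protective direction the study reports (e.g., OR 0.73 for hyperuricemia).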
Joun and colleagues used HK-2 human kidney proximal tubular epithelial cells to assess molybdenum’s antioxidative properties. The cells were challenged with H2O2-induced oxidative stress, which was measured with a fluorescent microplate assay for reactive oxygen species (ROS). Antioxidative capacity was assessed by measuring the expression of manganese superoxide dismutase. In HK-2 cells, the investigators found that ROS decreased by 34% with 1 μg/mL molybdenum (P < .01), 30% with 2 μg/mL (P < .01), 23% with 5 μg/mL (P < .01), and 29% with 10 μg/mL (P < .001).1
In the study participants, the investigators found that the urinary molybdenum-to-creatinine ratio was significantly associated with decreased serum uric acid concentrations (β, -0.119 [95% CI, -0.148 to -0.090]), decreased prevalence of hyperuricemia (odds ratio [OR], 0.73 [95% CI, 0.64–0.83]), and decreased prevalence of gout (OR, 0.71 [95% CI, 0.52–0.94]). Higher urinary molybdenum levels were also associated with lower levels of systemic oxidative stress (gamma-glutamyltransferase; β, -0.052 [95% CI, -0.067 to -0.037]) and inflammation (C-reactive protein; β, -0.184 [95% CI, -0.220 to -0.148]).1
“We found molybdenum in human kidney proximal tubular epithelial cells provides antioxidative benefits in the presence of oxidative stress and that urinary molybdenum-to-creatinine ratio is associated with lower levels of systemic inflammation and oxidative stress and a reduced prevalence of hyperuricemia and gout in the general US adult population,” Joun and colleagues wrote.1
“Altogether, the antioxidative effects of molybdenum may have contributed to the prevention of hyperuricemia observed epidemiologically. Future studies should confirm via in vitro and animal studies if molybdenum’s antioxidative properties indeed directly increase the tubular excretion of uric acid and examine via human clinical trials whether molybdenum supplements decrease serum uric acid and help prevent hyperuricemia and gout,” Joun and colleagues wrote.1
Other recent research into hyperuricemia found that higher serum vitamin D levels were associated with a lower risk of mortality in patients with gout and hyperuricemia.2
The investigators found that, after multifactorial adjustment, each one-unit increment in natural log-transformed 25(OH)D was associated with a 55% lower risk of all-cause mortality and a 61% lower risk of cardiovascular disease (CVD) mortality in patients with gout (both P ≤ .003). In patients with hyperuricemia, it was associated with a 45% lower risk of cancer mortality (P = .009). Furthermore, restricted cubic splines revealed a U-shaped relationship between 25(OH)D and both all-cause and CVD mortality in patients with hyperuricemia, with inflection points of 72.7 nmol/L and 38.0 nmol/L, respectively (both P ≤ .003). Subgroup and sensitivity analyses corroborated these trends, although not always to statistical significance.2
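For readers parsing the "per one-unit increment in natural log-transformed 25(OH)D" phrasing, the arithmetic behind it can be made concrete. The conversion below is the standard relationship between a log-scale regression coefficient and a percent risk reduction, not study-specific code; the 55% figure is used only as an anchor, and the 30 nmol/L starting level is hypothetical.

```python
import math

# A 55% lower risk per one-unit increase in ln(25(OH)D) corresponds to a
# risk ratio of 0.45, i.e. a log-scale coefficient beta = ln(0.45).
risk_reduction = 0.55
ratio = 1.0 - risk_reduction          # 0.45
beta = math.log(ratio)                # about -0.80 per unit of ln(25(OH)D)

# One unit on the natural-log scale is an e-fold (~2.72x) change in the raw
# concentration, e.g. a hypothetical 30 nmol/L level rising to ~82 nmol/L.
example_level = 30 * math.e
print(f"beta = {beta:.3f}, e-fold of 30 nmol/L = {example_level:.1f} nmol/L")
```

In other words, the reported effect size refers to multiplicative (roughly 2.7-fold) changes in 25(OH)D concentration, not to single-nmol/L increments.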