A review of 11 websites found that information available online to patients varies greatly in terms of quality, accuracy, and readability.
With more and more patients turning to online resources for education about their conditions, many physicians have grown concerned about the quality and accuracy of the information available to their patients.


A recent study that explored this topic by examining information available on 11 diabetic retinopathy websites returned results that suggest the concerns of many ophthalmologists and retina specialists may be warranted.
In an effort to analyze the quality, accuracy, and readability of online information on diabetic retinopathy, investigators from Bascom Palmer Eye Institute at the University of Miami Miller School of Medicine carried out a cross-sectional study of disease-related information on 11 sites returned by a Google search of the phrase "diabetic retinopathy."
Sites examined included those of the American Academy of Ophthalmology, All About Vision, American Optometric Association, American Society of Retinal Specialists, EyeWiki, Mayo Clinic, Medical News Today, MedicineNet, National Eye Institute, WebMD, and Wikipedia.
Investigators designed a 26-item questionnaire based on questions frequently asked by patients in their practice; evaluations were conducted by a vitreoretinal surgeon and 2 vitreoretinal surgery fellows. Investigators established a grading scale of 0 to 4, with 4 being the maximum score, to assess the quality of each answer.
Investigators analyzed the accountability of each website through an evaluation of authorship, attributions, disclosure, and currency, the 4 benchmarks previously outlined by JAMA. Readability was assessed using an online readability tool through an analysis that included the Flesch reading ease score, Flesch-Kincaid grade level, Gunning Fog Index, Coleman-Liau Index, and the Simple Measure of Gobbledygook (SMOG) Index.
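For readers unfamiliar with these metrics, the Flesch formulas are simple functions of average sentence length and average syllables per word. The sketch below is a generic illustration of the standard Flesch reading ease and Flesch-Kincaid grade-level formulas, not the specific online tool the investigators used; the syllable counter is a rough vowel-group heuristic, so scores will only approximate those from a dictionary-based tool.

```python
import re


def count_syllables(word: str) -> int:
    """Rough heuristic: count groups of consecutive vowels,
    subtracting one for a trailing silent 'e'."""
    word = word.lower()
    n = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and n > 1:
        n -= 1
    return max(n, 1)


def flesch_reading_ease(text: str) -> float:
    """Flesch reading ease: higher scores mean easier text.
    FRE = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)."""
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))


def flesch_kincaid_grade(text: str) -> float:
    """Flesch-Kincaid grade level: approximate US school grade.
    FKGL = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z]+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59
```

A short, plain sentence such as "The cat sat on the mat." scores well above 90 (very easy), while dense clinical prose scores far lower, which is why patient-education guidance typically targets a sixth-grade reading level.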
Upon analysis of the 11 sites, investigators found that the mean questionnaire score for all sites was 55.76 (13.38; 95% CI, 47.85-63.67) out of 104 possible points. Additionally, a statistically significant difference in content accuracy and completeness was found among the sites.
Among the sites analyzed, Wikipedia was the highest scoring, with a mean score of 76.67 (0.98) points. WebMD was the lowest-scoring site, with a mean score of 33 (1.10) points. Investigators pointed out that no significant correlation was found between a website's Google search ranking and its content quality (r=0.422; P=.43).
No website achieved all 4 accountability factors, and only 1 website achieved 3 of the 4. No correlation was noted between the content quality of the websites and the JAMA benchmarks.


The mean Flesch reading ease score for the sites was 46.05 (11.92; 95% CI, 39.01-53.10). Significant differences were noted among the mean reading grade levels of the websites analyzed, but no significant correlation was found between website quality and mean reading grade (r=0.445; P=.17).
In an editorial comment published in JAMA Ophthalmology, Rahul Khurana, MD, commended the work of the investigators for highlighting what he sees as a missed opportunity to communicate with patient populations.


“Information for patients must be accurate, free of bias, and at the appropriate reading level to accomplish this. Kloosterboer et al reveal the limitations of the current online information about diabetic retinopathy and show a missed opportunity for patient education,” Khurana wrote.
This study, “Assessment of the Quality, Content, and Readability of Freely Available Online Information for Patients Regarding Diabetic Retinopathy,” was published online in JAMA Ophthalmology.