Published in Vol 7, No 1 (2022): Jan-Mar

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/27221.
Internet-Based Patient Education Materials Regarding Diabetic Foot Ulcers: Readability and Quality Assessment

Authors of this article:

David Michael Lee1; Elysia Grose2; Karen Cross1,3

Original Paper

1Temerty Faculty of Medicine, University of Toronto, Toronto, ON, Canada

2Faculty of Medicine, University of Ottawa, Ottawa, ON, Canada

3Division of Plastic Surgery, St. Michael's Hospital, Toronto, ON, Canada

Corresponding Author:

David Michael Lee, BHSc, MEng

Temerty Faculty of Medicine

University of Toronto

Medical Sciences Building, Room 3157

1 King's College Circle

Toronto, ON, M5S 1A1

Canada

Phone: 1 4165582872

Email: davidm.lee@mail.utoronto.ca


Background: While diabetic foot ulcers (DFU) are a common complication of diabetes, little is known about the content and readability of online patient education materials (PEM) for DFU. The recommended reading grade level for these materials is grades 6-8.

Objective: The aim of this paper was to evaluate the quality and readability of online PEM on DFU.

Methods: A Google search was performed using 4 different search terms related to DFU. Two readability formulas, the Flesch-Kincaid grade level and the Flesch-Kincaid reading ease score, were used to assess the readability of the included PEM. The DISCERN tool was used to determine quality and reliability.

Results: A total of 41 online PEM were included. The average Flesch-Kincaid reading ease score for all PEM was 63.43 (SD 14.21), indicating a standard level of reading difficulty. The average reading grade level was 7.85 (SD 2.38), which is higher than the recommended reading level for PEM. The mean DISCERN score was 45.66 (SD 3.34), and 27% (11/41) of the articles had DISCERN scores of less than 39, corresponding to poor or very poor quality.

Conclusions: The majority of online PEM on DFU are written above the recommended reading levels and have significant deficiencies in quality and reliability. Clinicians and patients should be aware of the shortcomings of these resources and consider the impact they may have on patients’ self-management.

JMIR Diabetes 2022;7(1):e27221

doi:10.2196/27221


Introduction

Diabetes affects 1 in 10 people worldwide and disproportionately affects those who do not have regular access to health care [1,2]. Diabetic foot ulcers (DFU) affect 15-25% of people living with diabetes mellitus at some point in their life [2]. This not only leads to a decreased quality of life and functional limitations but also precedes most lower extremity amputations [3]. Patients with DFU have a 7% risk of amputation 10 years after their diagnosis [4]. As a leading cause of mortality globally, diabetes is 1 of 4 priority noncommunicable diseases targeted for action by the World Health Organization [5].

Patient education is imperative in preventing and managing DFU and, subsequently, lower extremity amputations [6-8]. Foot care practices include inspecting and washing the feet, drying them carefully, choosing suitable socks and footwear, applying lotion to dry skin, cutting nails appropriately, and notifying a health provider if a cut, blister, or sore develops [9]. Patients and their families typically provide 95% of their diabetes foot care themselves [10].

Readability is an objective measure of the reading skill required to comprehend written information [11] and reflects elements such as familiarity, legibility, typography, and the complexity of words and sentences. Readability formulas attempt to assess written information using word and sentence length as surrogates for text complexity [11]. Currently, the National Institutes of Health recommends that patient education material be written for a grade 6 level audience, the estimated reading level of the average North American adult [12]. The Canadian Medical Protective Association and the American Medical Association also recommend that patient education materials (PEM) be written for a grade 6 level audience [13].

The understandability, readability, and actionability of web-based information have been assessed for diabetes mellitus [14]. However, no study has investigated the quality and readability of online PEM regarding DFU. Since DFU are a common complication of diabetes mellitus, clinicians must evaluate the information patients access online about foot care. Self-management of diabetic foot ulcers is critical for clinical outcomes [3], and patients often rely on the plethora of online information available to educate themselves on the self-management of DFU. Therefore, the objective of this study was to assess the quality and readability of online patient education material related to the management and care of diabetic foot ulcers.


Methods

Search and Categorization

This study was exempt from review by the St. Michael’s Hospital Research Ethics Board. The search was conducted on October 1, 2020, in Toronto, Ontario, using the Google (Google Inc) search engine, the most widely used search engine. Four search terms were used: “diabetic foot ulcer care,” “diabetic foot care,” “diabetic wound care,” and “foot care.” The first 20 pages of the search were reviewed for this study. Although most internet users review only the first 20 search results, the search was broadened to offset variability due to previous search history and location [15]. Before initiating the search, the browser was set to incognito mode. All search history, cookies, and cached data were erased from the browser, and location settings were disabled to prevent the search engine from showing personalized results.

All webpages and articles that were PEM about diabetic foot care were included. The exclusion criteria were websites not written in English, websites with access restrictions, nontext media (including videos and audio), news articles, scientific webpages (eg, Science Direct and PubMed), websites targeting medical professionals, webpages containing fewer than 100 words, and websites that did not contain patient information on diabetic foot ulcer care and prevention.

The websites were divided into 6 main categories based on their origin: academic institutions, professional organizations, medical information websites, government websites, private clinics, and miscellaneous websites. Websites were categorized as originating from an academic institution if they were affiliated with a university or a medical center. Examples of professional organizations include the American Diabetes Association, Diabetes Canada, Wounds Canada, the International Working Group on the Diabetic Foot, and Diabetes Action Canada. Examples of medical information websites include WebMD and the Merck Manual. Miscellaneous websites include Wikipedia and patient testimonials. Categorization was completed in duplicate.

Outcome Measures

Readability Evaluation

All websites were downloaded into plain text using Microsoft Word (Microsoft Corp), and formatting elements found on the webpages were removed to avoid skewing the readability results, as recommended by several groups [15-17]. PEM were evaluated for readability using an online readability calculator, Readable (Added Bytes Ltd), which performs the Flesch-Kincaid reading ease (FRE) and Flesch-Kincaid grade level (FKG) readability tests. Each of these tests uses variables such as sentence length, number of words, and number of syllables to estimate readability [18-20]. Multimedia Appendix 1 describes each instrument, the formula used to calculate the score, and the interpretation of the scores generated by each instrument. An FKG score of 6 or lower corresponds to a grade 6 reading level or below. FRE scores range from 0 to 100, with a higher score corresponding to text that is easier to read (Table 1); a score between 60 and 70 corresponds to a standard reading level. The online calculator Readable has been used in other peer-reviewed publications, and the readability formulas it implements have been validated [21,22]. The FRE and FKG scores have been used to evaluate medical literature and are the most applicable readability formulas for health information [18,23-27].

Table 1. Flesch-Kincaid reading ease score interpretation.

Score       Interpretation
90 to 100   Very easy
80 to <90   Easy
70 to <80   Fairly easy
60 to <70   Standard
50 to <60   Fairly difficult
30 to <50   Difficult
0 to <30    Very difficult
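Both scores are standard functions of average sentence length and average syllables per word. The sketch below is for illustration only; the study used the Readable calculator, and the syllable counter here is a crude vowel-group heuristic rather than the dictionary-based counting a production tool would use.

```python
import re

def count_syllables(word: str) -> int:
    # Rough heuristic: one syllable per run of consecutive vowels.
    # Real readability tools use dictionaries or more elaborate rules.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text: str) -> dict:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    words_per_sentence = len(words) / len(sentences)
    syllables_per_word = syllables / len(words)
    return {
        # Flesch reading ease: higher scores indicate easier text (see Table 1).
        "FRE": 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word,
        # Flesch-Kincaid grade level: approximate school grade needed to read the text.
        "FKG": 0.39 * words_per_sentence + 11.8 * syllables_per_word - 15.59,
    }

print(readability("Check your feet every day. Call your doctor if you find a sore."))
```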
Quality of Patient Education Material

DISCERN is a tool designed for patients and health care providers to assess the reliability and quality of written material on treatment choices without the need for medical knowledge [28]. It is a 16-question instrument that covers the reliability of a publication and its description of treatment options, including their benefits and risks. Table 2 describes the interpretation of the total DISCERN scores; a higher score indicates a higher-quality publication. DISCERN scoring was performed independently by a senior medical student trained in using the DISCERN tool. The DISCERN tool has been used by senior medical students in other peer-reviewed publications [29-31].

Table 2. DISCERN scores.

Score range   Quality rating
63-80         Excellent
51-62         Good
39-50         Fair
27-38         Poor
16-26         Very poor
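Because each of the 16 DISCERN items is rated from 1 to 5, totals range from 16 to 80 and map onto the quality bands in Table 2. A minimal sketch of that mapping follows; the item scores in the example are hypothetical placeholders, not data from this study.

```python
def discern_quality(item_scores: list[int]) -> tuple[int, str]:
    # DISCERN has 16 items, each rated 1 (criterion not met) to 5 (fully met).
    assert len(item_scores) == 16 and all(1 <= s <= 5 for s in item_scores)
    total = sum(item_scores)
    bands = [(63, "Excellent"), (51, "Good"), (39, "Fair"), (27, "Poor"), (16, "Very poor")]
    rating = next(label for cutoff, label in bands if total >= cutoff)
    return total, rating

# Hypothetical ratings for a single webpage.
print(discern_quality([3, 4, 4, 2, 3, 3, 2, 2, 3, 3, 3, 3, 2, 3, 3, 3]))  # (46, 'Fair')
```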

Statistical Analyses

Categorical variables were reported using frequencies and proportions. Continuous variables were presented as means (SDs) or medians and interquartile ranges. Separate analyses were conducted to determine whether quality and readability differed depending on the origin of the PEM; these were compared using the Kruskal-Wallis test, followed by Dunn-Bonferroni post hoc tests. Spearman correlation coefficients were used to assess the relationship between DISCERN scores and readability scores. Statistical analyses were performed using SPSS, version 26.0 (IBM Corp), with statistical significance set at P<.05.
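The analyses were run in SPSS; as a non-authoritative sketch, the same comparisons can be reproduced in Python with SciPy (the grouped scores below are hypothetical examples, and the Dunn-Bonferroni post hoc step would require an additional package such as scikit-posthocs).

```python
from scipy import stats

# Hypothetical FKG scores grouped by website origin (not the study data).
groups = {
    "government": [6.1, 6.8, 7.0, 6.3],
    "medical_information": [7.9, 8.4, 8.0, 8.6],
    "private_clinic": [8.8, 9.5, 8.1, 8.3],
}

# Kruskal-Wallis test: do readability scores differ across origins?
h_stat, p_value = stats.kruskal(*groups.values())
print(f"Kruskal-Wallis H={h_stat:.2f}, P={p_value:.3f}")

# Spearman correlation between DISCERN totals and FRE scores (hypothetical values).
discern_totals = [42, 55, 38, 47, 51]
fre_scores = [63.0, 58.5, 70.2, 61.1, 66.4]
rho, p_corr = stats.spearmanr(discern_totals, fre_scores)
print(f"Spearman rho={rho:.2f}, P={p_corr:.3f}")
```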


Results

Search Results

A total of 80 webpages were retrieved from the search. After the removal of 4 duplicates and the exclusion of 35 webpages, 41 webpages met the inclusion criteria. Overall, 63% (26/41) of the included webpages originated from the United States, while 37% (15/41) originated from Canada. Of the included webpages, 2% (1/41) were from an academic institution, 17% (7/41) were from a professional organization, 37% (15/41) were from a medical information website, 17% (7/41) were from a government website, 21% (9/41) were from private clinics, and 4% (2/41) were from miscellaneous websites. Of the excluded webpages, 2% (1/35) were from a blog, 17% (6/35) were from a scientific webpage, 25% (9/35) were from websites targeting medical professionals, and 45% (16/35) were websites without patient information pertaining to diabetic wound care.

Readability Evaluation

The mean FRE score for all included PEM was 63.43 (SD 14.21; range 33.8-84.2), indicating standard reading difficulty. Moreover, 68% of PEM had FRE scores below 60, indicating that they were “fairly difficult” to “very difficult” to read. The mean reading grade level, as determined by the FKG score, was 7.85 (SD 2.38). When PEM from different origins were compared, PEM from government websites had the highest FRE scores, while PEM from private clinics had the highest FKG scores (Table 3). PEM from the United States also appeared to have a higher reading level than Canadian PEM (Table 4).

Table 3. Mean readability scores according to each type of website.

Website category                      FREa score (SD)   FKGb score (SD)
Academic institutions (n=1)           71.9 (—c)         6.7 (—)
Private clinics (n=15)                55.4 (8.4)        8.63 (1.3)
Professional organizations (n=7)      68.9 (7.4)        7.54 (1.8)
Medical information websites (n=7)    61.69 (8.4)       8.15 (1.4)
Government websites (n=9)             73.19 (5.17)      6.46 (1.1)
Miscellaneous (n=2)                   55.1 (21.9)       8.6 (3.9)
Total (N=41)                          63.43 (14.21)     7.85 (2.38)

aFRE: Flesch-Kincaid reading ease.

bFKG: Flesch-Kincaid grade level.

cNot applicable.

Table 4. Mean readability scores according to country of origin.

Country             FREa score (SD)   FKGb score (SD)
Canada              66.52 (6.7)       7.67 (1.1)
The United States   55.4 (5.7)        8.63 (1.0)

aFRE: Flesch-Kincaid reading ease.

bFKG: Flesch-Kincaid grade level.

Quality of Patient Education Material

The mean DISCERN score was 45.66 (SD 3.34; Table 5). The weighted κ statistic for the total DISCERN scores was 0.95. The average scores for each item in the DISCERN instrument are also displayed in Table 5. Twenty-seven percent (11/41) of articles had total DISCERN scores of less than 39, indicating that they were of “poor” or “very poor” quality. Table 6 presents the DISCERN scores for the PEM based on their origin. PEM originating from medical information websites had significantly higher DISCERN scores (P=.01). There was no significant correlation between the DISCERN score and the FRE score (r=0.07, P=.67) or between the DISCERN score and the average reading grade level (r=-0.005, P=.97).

Table 5. Average score for each item in the DISCERN instrument, mean (SD).

Section 1: reliability
  Are the aims clear? 3.1 (0.3)
  Does it achieve its aims? 4.1 (0.3)
  Is it relevant? 3.7 (0.3)
  Is it clear what sources of information were used to compile the publication (other than the author or producer)? 2.6 (0.4)
  Is it clear when the information used or reported in the publication was produced? 2.6 (0.5)
  Is it balanced and unbiased? 2.7 (0.3)
  Does it provide details of additional sources of support and information? 2.6 (0.3)
  Does it refer to areas of uncertainty? 2.0 (0.4)
  Total reliability score: 23.4 (1.8)

Section 2: quality
  Does it describe how each treatment works? 3.4 (0.4)
  Does it describe the benefits of each treatment? 2.8 (0.4)
  Does it describe the risks of each treatment? 3.0 (0.9)
  Does it describe what would happen if no treatment were used? 3.1 (0.4)
  Does it describe how the treatment choices affect overall quality of life? 2.5 (0.3)
  Is it clear that there may be more than one possible treatment choice? 3.1 (0.3)
  Does it provide support for shared decision-making? 3.0 (0.4)
  Total quality score: 19.2 (1.6)

Overall rating of sites: 3.0 (0.3)
Total DISCERN score: 45.7 (3.3)
Table 6. Mean DISCERN score for patient education materials based on their origin.

PEMa origin                     Mean (SD)
Academic institutions           42.00 (—b)
Professional organizations      41.57 (9.52)
Medical information websites    53.53 (5.99)
Government websites             42.43 (5.26)
Private clinics                 39.56 (15.83)
Miscellaneous                   41.50 (6.36)

aPEM: patient education materials.

bNot applicable.


Discussion

Principal Findings

On average, PEM on diabetic foot ulcer care were written at an approximate reading level of grades 7-8, which exceeds the grade 6 reading level recommended by the Canadian Medical Protective Association and the average reading level of a North American adult [14]. Furthermore, 68% of PEM had FRE scores below 60, indicating that they were “fairly difficult” to “very difficult” to read. Similar results have been found in other studies. Lipari et al conducted a study on the readability of online PEM on diabetes mellitus and found that 77% (10/13) of PEM were written above an 8th-grade reading level [14]. Furthermore, PEM from Diabetes Canada were written at approximately a 7th- to 10th-grade reading level. This is an important finding, as some may assume that material originating from credible sources such as academic institutions and professional organizations is better suited for patient education. Our study found that PEM from professional organizations and an academic institution typically exceeded a grade 6 reading level. This is in keeping with a previous study, which found that PEM on diabetes mellitus from US academic institutions and professional organizations were written at a reading level above grade 10 [32].

PEM must also be reliable and comprehensive and must contain evidence-based information. This study assessed reliability and quality using the DISCERN tool. Twenty-seven percent (11/41) of the articles had total DISCERN scores of less than 39, indicating that they were of “poor” or “very poor” quality. A similar study on diabetic retinopathy found that 73% (16/22) of PEM were of “poor” or “very poor” quality [33]. Interestingly, this study also found that academic institutions and medical information websites had significantly higher reliability scores when compared with private clinics. These differences may arise because academic institutions and medical information websites have access to several experts in their respective fields and may have more resources to produce robust PEM. These findings may have important implications when physicians and other allied health professionals refer patients to online resources to learn more about diabetic foot ulcers.

This study performed a correlation analysis to determine the relationship between DISCERN scores and readability. A weak positive correlation was found between DISCERN scores and FRE scores, and a weak negative correlation was found between DISCERN scores and the average reading grade level; neither reached statistical significance. This implies that higher-quality, more reliable PEM were not necessarily more readable. While the target audiences for these websites vary, these were the websites most readily accessible and targeted to patients. This has an important implication: PEM that patients can easily access and comprehend are not necessarily of high quality.

Limitations

This study has several limitations. First, the search strategy used the Google search engine with 4 different search terms to simulate how patients search the internet for health information; it is possible that patients could obtain different resources. However, Google is the most commonly used search engine and has been the sole search engine used in several other readability analyses [14,33,34]. Furthermore, it is not possible to predict which search terms patients will use. However, this study used 4 different search terms that are among the terms most likely to be used by patients searching for diabetic foot ulcer care. We did not account for patients who do not use the internet to access PEM on diabetic foot ulcers or those who do not have access to a computer.

The correlation between readability scores and true reading comprehension cannot be considered perfect, since readability scores have several limitations. Because these scores are based on variables such as the number of syllables or characters per word, they can be skewed by medical terminology such as “vasculature” or “neuropathy.” Titles and headings can also skew the scores, as they may be interpreted as sentences. This study mitigated these limitations by using the readability formulas most suited to medical literature and by appropriately preparing the text from the websites. It is important to note that readability scores are not measures of overall comprehension; rather, they reflect one of the many characteristics of reading skill and the reading ease of materials [35,36]. Suggestions for improving the readability of PEM include minimizing the use of complex words and decreasing sentence length and the number of syllables per word. Readability scores should be considered alongside other indicators when assessing the overall comprehension of written PEM; our study attempted to address this by using readability scores along with the DISCERN tool. Lastly, although the DISCERN tool has been validated and widely applied to patient information on treatment options, it does not directly evaluate the accuracy of the information contained within these PEM. Rather, DISCERN assesses the reliability and quality of published materials.

Conclusion

As the COVID-19 pandemic has placed a greater emphasis on digital health, it is important to assess the readability and quality of online information on DFU to ensure adequate and appropriate patient education. While the internet has allowed ease of access to information for a broad range of patients, our study showed that online PEM on DFU care were written above the recommended reading level for patients. Physicians and other allied health professionals should be aware of the deficiencies in the quality and reliability of internet-based PEM that patients use to inform their care. In the future, PEM authors should consider using these tools to evaluate the readability and quality of their websites.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Instruments and calculations used to assess readability.

DOCX File , 13 KB

  1. Jupiter DC, Thorud JC, Buckley CJ, Shibuya N. The impact of foot ulceration and amputation on mortality in diabetic patients. I: From ulceration to death, a systematic review. Int Wound J 2015 Jan 20;13(5):892-903. [CrossRef]
  2. Siersma V, Thorsen H, Holstein P, Kars M, Apelqvist J, Jude E, et al. Health-related quality of life predicts major amputation and death, but not healing, in people with diabetes presenting with foot ulcers: the Eurodiale study. Diabetes Care Mar 1 2014;37(3):694-700. [CrossRef]
  3. Armstrong DG, Boulton AJ, Bus SA. Diabetic Foot Ulcers and Their Recurrence. N Engl J Med 2017 Jun 15;376(24):2367-2375. [CrossRef]
  4. Unwin N. Epidemiology of lower extremity amputation in centres in Europe, North America and East Asia. Br J Surg 2000 Mar;87(3):328-337. [CrossRef] [Medline]
  5. Roglic G. Global report on diabetes. World Health Organization. 2016.   URL: https://www.ijncd.org/article.asp?issn=2468-8827;year=2016;volume=1;issue=1;spage=3;epage=8;aulast=Roglic [accessed 2021-12-29]
  6. Malone JM, Snyder M, Anderson G, Bernhard VM, Holloway G, Bunt TJ. Prevention of amputation by diabetic education. The American Journal of Surgery 1989 Dec;158(6):520-524. [CrossRef]
  7. Litzelman DK. Reduction of Lower Extremity Clinical Abnormalities in Patients with Non-Insulin-Dependent Diabetes Mellitus. Ann Intern Med 1993 Jul 01;119(1):36. [CrossRef]
  8. Alwahbi A. Impact of a diabetic foot care education program on lower limb amputation rate. VHRM 2010 Oct:923. [CrossRef]
  9. Ahmad Sharoni SK, Minhat HS, Mohd Zulkefli NA, Baharom A. Health education programmes to improve foot self-care practices and foot problems among older people with diabetes: a systematic review. Int J Older People Nurs 2016 Feb 25;11(3):214-239. [CrossRef]
  10. Palaian S, Chhetri A, Mukhyaprana P, Surulivelrajan M, Shanka PR. Role Of Pharmacist In Counseling Diabetes Patients. IJPHARM 2005 Jan;4(1):1-16. [CrossRef]
  11. Badarudeen S, Sabharwal S. Assessing readability of patient education materials: current role in orthopaedics. Clin Orthop Relat Res 2010 Oct;468(10):2572-2580 [FREE Full text] [CrossRef] [Medline]
  12. Weiss BD. Health Literacy and Patient Safety: Help Patients Understand; Manual for Clinicians. AMA Foundation. 2007.   URL: http://www.partnershiphp.org/Providers/HealthServices/Documents/Health%20Education/CandLToolKit/2%20Manual%20for%20Clinicians.pdf [accessed 2022-01-06]
  13. Health literacy — An asset in safer care. The Canadian Medical Protective Association. 2013.   URL: https://www.cmpa-acpm.ca/en/advice-publications/browse-articles/2013/health-literacy-an-asset-in-safer-care [accessed 2022-01-05]
  14. Lipari M, Berlie H, Saleh Y, Hang P, Moser L. Understandability, actionability, and readability of online patient education materials about diabetes mellitus. Am J Health Syst Pharm 2019 Jan 25;76(3):182-186. [CrossRef] [Medline]
  15. Institute of Medicine (US) Committee on Health Literacy. In: Nielsen-Bohlman L, Panzer AM, Kindig DA, editors. Health Literacy: A Prescription to End Confusion. Washington, DC, US: The National Academies Press; 2004:31-58.
  16. Schmitt PJ, Prestigiacomo CJ. Readability of Neurosurgery-Related Patient Education Materials Provided by the American Association of Neurological Surgeons and the National Library of Medicine and National Institutes of Health. World Neurosurgery 2013 Nov;80(5):e33-e39. [CrossRef]
  17. Patel CR, Sanghvi S, Cherla DV, Baredes S, Eloy JA. Readability Assessment of Internet-Based Patient Education Materials Related to Parathyroid Surgery. Ann Otol Rhinol Laryngol 2015 Jan 15;124(7):523-527. [CrossRef]
  18. Kloosterboer A, Yannuzzi NA, Patel NA, Kuriyan AE, Sridhar J. Assessment of the Quality, Content, and Readability of Freely Available Online Information for Patients Regarding Diabetic Retinopathy. JAMA Ophthalmol 2019 Nov 01;137(11):1240. [CrossRef]
  19. Kasabwala K, Misra P, Hansberry DR, Agarwal N, Baredes S, Setzen M, et al. Readability assessment of the American Rhinologic Society patient education materials. International Forum of Allergy & Rhinology 2012 Oct 08;3(4):325-333. [CrossRef]
  20. Misra P, Agarwal N, Kasabwala K, Hansberry DR, Setzen M, Eloy JA. Readability analysis of healthcare-oriented education resources from the american academy of facial plastic and reconstructive surgery. The Laryngoscope 2012 Sep 28;123(1):90-96. [CrossRef]
  21. Worrall AP, Connolly MJ, O’Neill A, O’Doherty M, Thornton KP, McNally C, et al. Readability of online COVID-19 health information: a comparison between four English speaking countries. BMC Public Health 2020 Nov 13;20(1):1-12. [CrossRef]
  22. Kher A, Johnson S, Griffith R. Readability Assessment of Online Patient Education Material on Congestive Heart Failure. Advances in Preventive Medicine 2017;2017:1-8. [CrossRef]
  23. Maciolek KA, Jarrard DF, Abel EJ, Best SL. Systematic Assessment Reveals Lack of Understandability for Prostate Biopsy Online Patient Education Materials. Urology 2017 Nov;109:101-106. [CrossRef]
  24. Dollahite J, Thompson C, McNew R. Readability of printed sources of diet and health information. Patient Education and Counseling 1996 Mar;27(2):123-134. [CrossRef]
  25. Kandula S, Zeng-Treitler Q. Creating a gold standard for the readability measurement of health texts. 2008 Presented at: AMIA Annu Symp Proc; November 6, 2008; Washington, DC, US.
  26. Wong K, Gilad A, Cohen MB, Kirke DN, Jalisi SM. Patient education materials assessment tool for laryngectomy health information. Head & Neck 2017 Aug 16;39(11):2256-2263. [CrossRef]
  27. Zheng J, Yu H. Assessing the Readability of Medical Documents: A Ranking Approach. JMIR Med Inform 2018 Mar 23;6(1):e17. [CrossRef]
  28. Shepperd S, Charnock D. Why DISCERN? Health Expect 1998 Nov 04;1(2):134-135 [FREE Full text] [CrossRef] [Medline]
  29. Ferster APO, Hu A. Evaluating the Quality and Readability of Internet Information Sources regarding the Treatment of Swallowing Disorders. Ear, Nose & Throat Journal 2019 Jan 08;96(3):128-138. [CrossRef]
  30. Ting K, Hu A. Evaluating the Quality and Readability of Thyroplasty Information on the Internet. Journal of Voice 2014 May;28(3):378-381. [CrossRef]
  31. Yi GS, Hu A. Quality and Readability of Online Information on In-Office Vocal Fold Injections. Ann Otol Rhinol Laryngol 2019 Nov 08;129(3):294-300. [CrossRef]
  32. Dorcely B, Agarwal N, Raghuwanshi M. Quality assessment of diabetes online patient education materials from academic institutions. Health Education Journal 2014 Oct 26;74(5):568-577. [CrossRef]
  33. Novin S, Konda SM, Xing B, Bange M, Blodi B, Burckhard B. Diabetic Retinopathy Online: A Powerful Opportunity for Revision. Journal of Consumer Health on the Internet 2020 Sep 01;24(3):251-268. [CrossRef]
  34. Talati K, Upadhyay V, Gupta P, Joshi A. Quality of diabetes related health information on internet: an Indian context. IJEH 2013;7(3):205. [CrossRef]
  35. McGee J. Toolkit Part 7: Using readability formulas: A cautionary note. In: Toolkit for making written material clear and effective. Washington, DC, US: US Department of Health and Human Services; 2010:1-43.
  36. Simply Put: A guide for creating easy-to-understand materials. Centers for Disease Control Prevention. 2009 Apr.   URL: https://www.cdc.gov/healthliteracy/pdf/simply_put.pdf [accessed 2021-04-05]


Abbreviations

DFU: diabetic foot ulcers
FKG: Flesch-Kincaid grade level
FRE: Flesch-Kincaid reading ease
PEM: patient education materials


Edited by K Mizokami-Stout, D Griauzde; submitted 16.01.21; peer-reviewed by J McGuire, S Rush; comments to author 17.03.21; revised version received 11.06.21; accepted 16.10.21; published 11.01.22

Copyright

©David Michael Lee, Elysia Grose, Karen Cross. Originally published in JMIR Diabetes (https://diabetes.jmir.org), 11.01.2022.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Diabetes, is properly cited. The complete bibliographic information, a link to the original publication on https://diabetes.jmir.org/, as well as this copyright and license information must be included.