Abstract
Diabetes self-management plays a major role in controlling blood sugar levels and avoiding chronic complications. In this report, we investigate the strengths and limitations of artificial intelligence chatbots in supporting patients with type 1 diabetes and their families. With the growing accessibility of these constantly evolving tools, front-line providers must advocate for their responsible use.
JMIR Diabetes 2025;10:e76986. doi: 10.2196/76986
Introduction
Type 1 diabetes (T1D) is one of the most common chronic diseases of childhood with significant morbidity and mortality. High-level evidence supports the optimization of blood sugar control to prevent chronic complications []. Even with remarkable advances in diabetes technology, such as continuous glucose monitoring systems and automated insulin delivery pumps, a majority of patients fail to meet glycemic targets [].
Diabetes self-management and education play a major role in glycemic control. However, several challenges limit the potential of traditional educational models: shortages of diabetes educators, reduced access to care for some patient groups (exacerbated by the recent pandemic), and inherent limitations in addressing patients’ and caregivers’ individual needs.
Chatbots are tools powered by artificial intelligence (AI) that have shown promise in many domains []. They are computer programs designed to respond to questions in natural language as if they were a real person. It is anticipated that such tools will be increasingly used by patients and their families because of their ease of access. Clinicians nonetheless remain a key source of trusted health information, and pediatricians and other front-line health care providers must understand the strengths and limitations of such tools.
A recent paper in Diabetes Care [] noted both strengths and inaccuracies of OpenAI’s GPT-4 model when it was asked questions on diet and exercise, glycemic management, and insulin storage and administration in adult patients. Children with T1D have unique self-management needs compared to adults. In this brief report, we sought to evaluate the strengths and weaknesses of an AI tool for pediatric diabetes management.
Methods
We asked a multidisciplinary team at the Pediatric Diabetes Clinic of the Montreal Children’s Hospital at McGill University Health Centre, a tertiary care center caring for 900 pediatric patients with T1D, to help craft a comprehensive list of questions regarding education and self-management of children with T1D spanning 4 key categories (listed below). The multidisciplinary team included a pediatric endocrinologist, a pediatric emergency room physician, a medical student with previous experience in pediatric diabetes research, 3 certified diabetes educators, and 1 parent partner, all affiliated with the Montreal Children’s Hospital.
The questions were posed to the chatbot in both French and English, and its answers were evaluated qualitatively based on whether they reflected what is routinely taught to families and whether they were complete and accurate. An illustrative sketch of how such bilingual querying could be reproduced programmatically appears after the question list below.
New diagnosis
- My child was just diagnosed with T1D. What is this? I have only heard of diabetes in older obese adults.
- Will my child have any restrictions in his life?
- When will my child be ready to go back to school? What should I say to the school teacher?
- Can you recommend reliable websites where I can read more about T1D?
Recognizing and managing hypo- and hyperglycemia
- What is the blood sugar level that the doctors kept mentioning?
- What is a normal blood sugar?
- What symptoms will my child have if his/her blood sugar is high or low?
- What do I do if my child’s blood sugar is high or low?
- My child is vomiting and his blood sugar is high. Should I bring him to the ER?
Glucometers and insulin
- My child’s doctor and nurse showed me how to use a glucometer, but I cannot remember the exact steps. Can you give me step-by-step instructions?
- My child’s doctor and nurse gave me 2 different insulins: a long-acting and a short-acting. What is the difference?
- My child’s doctor and nurse showed me how to use an insulin pen, but I cannot remember the exact steps. Can you give me step-by-step instructions?
- Which body parts should I use to do the insulin injections?
Newer diabetes technologies
- My child’s doctor and nurse mentioned continuous glucose monitors and insulin pumps. What are these? Are they recommended for children with T1D?
- How often do I need to change the CGM and the pump?
- How do I interpret the different metrics on a CGM? What is the goal target in range?
- What type of insulin is given in an insulin pump?
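The report does not specify how the questions were submitted to ChatGPT. The following is only a minimal, illustrative sketch of how such bilingual querying could be reproduced programmatically, assuming the OpenAI Python client; the model name is a placeholder, and the single question pair is drawn from the lists above.

```python
# Minimal illustrative sketch, not the procedure used in this study: posing one
# of the study's questions to a chatbot in both English and French through the
# OpenAI Python client (pip install openai). The model name is a placeholder.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

questions = {
    "en": "My child is vomiting and his blood sugar is high. Should I bring him to the ER?",
    "fr": "Mon enfant vomit et sa glycémie est élevée. Dois-je l'amener à l'urgence?",
}

for lang, question in questions.items():
    response = client.chat.completions.create(
        model="gpt-4",  # placeholder; substitute the model being evaluated
        messages=[{"role": "user", "content": question}],
    )
    print(lang.upper(), "->", response.choices[0].message.content)
```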
Results
The feedback from the multidisciplinary diabetes team is compiled below, highlighting the strengths and limitations of ChatGPT for diabetes self-management and patient education. We found that the answers provided by ChatGPT had the same limitations in both French and English.
Strengths
- Conversation tone imitated a real human-to-human interaction
- Clear instructions in point form
- Easily understood and mostly accurate responses from multiple trustworthy online sources
- No medical jargon
- Most answers safely highlighted the importance of remaining in contact with the child’s health care provider
- Generally useful information, available 24-7
- Potential for immediate translation to support families who face language barriers
Limitations
- New diagnosis
- No reference to the International Society for Pediatric and Adolescent Diabetes, which provides specific guidelines for children, such as a target blood glucose range of 70-180 mg/dL (4-10 mmol/L)
- Recognizing and managing hypo- or hyperglycemia
- No mention of home ketone testing in the management of hyperglycemia, a critical step in assessing the risk of diabetic ketoacidosis while the child is still at home
- No mention that the amount of carbohydrates used to treat hypoglycemia depends on the child’s weight; for example, giving 15 g of carbohydrates to a 20 kg child would cause rebound hyperglycemia, while 5 g of carbohydrates would not be enough for an adolescent. The type of insulin therapy also influences how a low blood glucose is treated; for example, if the child is on standard pump therapy and has a glucose level <3 mmol/L, we recommend suspending insulin delivery until glucose is >4 mmol/L
- Glucometers and insulin analogs
- It would be useful if images accompanied the step-by-step instructions on how to use a glucometer, the insulin pen, and the body parts where insulin injections can be self-administered.
- No mention of Admelog (insulin lispro) or Trurapi (insulin aspart), insulin products approved by Health Canada. No mention of Tresiba, a newer long-acting insulin often used to provide more flexibility.
- Newer diabetes technologies
- Blood glucose values were provided only in mg/dL. ChatGPT did not provide the international consensus targets for time above range (<25%) and time below range (<4%); a worked example of these metrics follows this list.
- Incorrect information about the type of insulin used in insulin pumps; pumps deliver only short- or rapid-acting insulin, and long-acting insulin is not necessary because the pump delivers tiny amounts of insulin every few minutes.
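For context on the consensus metrics mentioned above, the following minimal sketch computes time in range, time above range, and time below range from a list of glucose readings, using the 4-10 mmol/L pediatric target range cited earlier in this report; the readings are fabricated illustrative values, not patient data.

```python
# Minimal sketch of the CGM consensus metrics: percentage of readings in range
# (4-10 mmol/L, per the pediatric target cited in this report), above range
# (consensus target <25%), and below range (consensus target <4%).

def cgm_metrics(readings_mmol, low=4.0, high=10.0):
    """Return time in range, time above range, and time below range (%)."""
    n = len(readings_mmol)
    below = sum(1 for g in readings_mmol if g < low)
    above = sum(1 for g in readings_mmol if g > high)
    in_range = n - below - above
    return {
        "time_in_range_%": 100 * in_range / n,
        "time_above_range_%": 100 * above / n,
        "time_below_range_%": 100 * below / n,
    }

# Example with fabricated readings
print(cgm_metrics([5.2, 7.8, 11.4, 3.6, 6.9, 9.1, 12.3, 4.4]))
```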
Discussion
When asked common questions on T1D, ChatGPT provided useful answers, though they were at times lengthy and lacked pediatric-specific details. It used clear terminology with the potential for immediate translation into different languages. One important deficiency was noted in its counseling for hyperglycemia, as it omitted the need for ketone measurement at home. The chatbot also provided inaccurate information on the type of insulin delivered by pumps (see the limitations listed above).
AI chatbots for diabetes education of patients and families can free up precious health care provider time, allowing clinicians to support the patients with the greatest need for more frequent human touchpoints. A human-AI interaction that integrates safeguards, such as easily interpretable care algorithms, could thus enable more frequent technology-enabled contact.
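As a purely hypothetical illustration, not a system evaluated in this study, such a safeguard could be as simple as a rule layer that checks whether a chatbot reply about hyperglycemia mentions ketone testing, the omission our team flagged, and appends a fixed reminder when it does not. The keyword lists and wording below are assumptions for illustration only.

```python
# Hypothetical rule-based safeguard layered on top of a chatbot: if a reply
# about hyperglycemia never mentions ketone testing (the omission flagged by
# the diabetes team), a fixed reminder is appended. The keywords and reminder
# text are illustrative assumptions, not a validated care algorithm.

HYPERGLYCEMIA_TERMS = ("high blood sugar", "hyperglycemia", "hyperglycémie")
KETONE_TERMS = ("ketone", "cétone")

KETONE_REMINDER = (
    "Reminder: when blood glucose is high, check ketones at home and contact "
    "your child's diabetes team if ketones are present."
)

def add_safeguard(question: str, chatbot_answer: str) -> str:
    """Append a ketone-testing reminder to hyperglycemia answers that omit it."""
    context = (question + " " + chatbot_answer).lower()
    about_hyperglycemia = any(t in context for t in HYPERGLYCEMIA_TERMS)
    mentions_ketones = any(t in chatbot_answer.lower() for t in KETONE_TERMS)
    if about_hyperglycemia and not mentions_ketones:
        return chatbot_answer + "\n\n" + KETONE_REMINDER
    return chatbot_answer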
To understand the limitations of AI tools in diabetes education, we need to consider the way they are developed and trained. They are trained on an extraordinarily large body of text from the internet and generate answers based on the likelihood of words and phrases following one another. They are not trained on databases specific to medical information and are not capable of evaluating the reliability of the answers they provide. Thus, they are prone to the phenomenon of “hallucination”: answers that sound reasonable but are factually incorrect [].
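A toy example can make this concrete: a bigram model that simply counts which word follows which in a small corpus will produce fluent-sounding continuations purely by statistical likelihood, with no mechanism for checking whether the result is true. Production chatbots use far larger neural models, but the underlying principle is the same; the corpus and code below are illustrative only.

```python
# Toy illustration of next-word prediction: a bigram model counts which word
# follows which in a tiny corpus and samples continuations by likelihood alone.
import random
from collections import Counter, defaultdict

corpus = ("insulin pumps deliver rapid acting insulin . "
          "insulin pens deliver long acting insulin .").split()

bigrams = defaultdict(Counter)
for w1, w2 in zip(corpus, corpus[1:]):
    bigrams[w1][w2] += 1

def next_word(word):
    """Sample the next word in proportion to how often it followed `word`."""
    words, weights = zip(*bigrams[word].items())
    return random.choices(words, weights=weights)[0]

# The model happily continues "pumps deliver rapid ..." or "pumps deliver
# long ...", with no mechanism for checking which claim fits which device.
print("pumps deliver", next_word("deliver"))
```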
As with any new technology in health care, it is important to consider ChatGPT’s strengths, limitations, and ethical implications prior to implementing its use in routine clinical practice. It is essential to ensure that AI-based resources are easily accessible to our patients to prevent worsening existing health care disparities. Moreover, transparency in how information is generated and communicated is critical, so that patients can interpret responses with appropriate caution. With the growing ease of access to these constantly evolving tools, front-line providers must advocate for use that is responsible and promotes value-based care [].
Acknowledgments
The pediatric diabetes educators at the Montreal Children’s Hospital, Sara Seguin-Plourde, RN; Sandra Kambites, RN; and Charleen Magee, RN provided valuable feedback on the answers provided by ChatGPT. AM and EO are funded by the Fonds de recherche du Québec-Santé.
Authors' Contributions
FV and AM conceptualized the project, developed the methodology, analyzed the results, and wrote the manuscript. EO, PD, and PK analyzed the findings and reviewed and edited the manuscript.
Conflicts of Interest
None declared.
References
- Effect of intensive diabetes treatment on the development and progression of long-term complications in adolescents with insulin-dependent diabetes mellitus: Diabetes Control and Complications Trial. J Pediatr. Aug 1994;125(2):177-188. [CrossRef] [Medline]
- Miller KM, Beck RW, Foster NC, Maahs DM. HbA1c levels in type 1 diabetes from early childhood to older adults: a deeper dive into the influence of technology and socioeconomic status on HbA1c in the T1D Exchange Clinic Registry findings. Diabetes Technol Ther. Sep 2020;22(9):645-650. [CrossRef] [Medline]
- Bibault JE, Chaix B, Guillemassé A, et al. A chatbot versus physicians to provide information for patients with breast cancer: blind, randomized controlled noninferiority trial. J Med Internet Res. Nov 27, 2019;21(11):e15787. [CrossRef] [Medline]
- Sng GGR, Tung JYM, Lim DYZ, Bee YM. Potential and pitfalls of ChatGPT and natural-language artificial intelligence models for diabetes education. Diabetes Care. May 1, 2023;46(5):e103-e105. [CrossRef] [Medline]
- Chelli M, Descamps J, Lavoué V, et al. Hallucination rates and reference accuracy of ChatGPT and Bard for systematic reviews: comparative analysis. J Med Internet Res. May 22, 2024;26:e53164. [CrossRef] [Medline]
- WHO issues first global report on artificial intelligence (AI) in health and six guiding principles for its design and use. World Health Organization. 2021. URL: https://www.who.int/news/item/28-06-2021-who-issues-first-global-report-on-ai-in-health-and-six-guiding-principles-for-its-design-and-use [Accessed 2025-10-09]
Abbreviations
AI: artificial intelligence
T1D: type 1 diabetes
Edited by Ruxing Zhao; submitted 05.May.2025; peer-reviewed by Chekwube Obianyo, Randa Salah Gomaa Mahmoud; final revised version received 29.Aug.2025; accepted 03.Sep.2025; published 07.Nov.2025.
Copyright© Flavia Voiculescu, Paul Darvasi, Esli Osmanlliu, Preetha Krishnamoorthy, Angeliki Makri. Originally published in JMIR Diabetes (https://diabetes.jmir.org), 7.Nov.2025.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Diabetes, is properly cited. The complete bibliographic information, a link to the original publication on https://diabetes.jmir.org/, as well as this copyright and license information must be included.

