Document Type

Conference Proceedings

Abstract

Objective: To assess the accuracy and completeness of ChatGPT’s health education responses on diabetic foot ulcer prevention compared with the American Diabetes Association (ADA) Standards of Care. Methods: A cross-sectional, rubric-based analysis was conducted using the 2024 ADA foot care guidelines. A six-domain scoring rubric (inspection, hygiene, toenail care, footwear/socks, professional exams, and warning signs) was developed. Eight patient-style prompts were entered into ChatGPT (GPT-4), and two independent reviewers evaluated responses on a 0–2 scale per domain (maximum score = 12). Results: ChatGPT demonstrated strong overall guideline concordance, with three responses achieving perfect scores (12/12). Four responses scored 8–10, typically omitting toenail care or underemphasizing professional exams. One response scored 4/12, providing partial coverage of inspection and warning signs but lacking other essential domains. Conclusion: ChatGPT provides largely accurate, ADA-consistent preventive education for diabetic foot care but demonstrates variability in topic depth. Deficiencies in toenail care and referral guidance highlight the need for clinician oversight. While AI tools like ChatGPT may enhance patient understanding, they should serve as adjuncts to, not replacements for, professional medical counseling.
