Dear Editor,
“Leveraging ChatGPT to aid patient education on coronary angiogram”1 is an interesting article. The study assessed ChatGPT’s ability to provide conversational information about the coronary angiography process, highlighting its advantages and disadvantages. Although ChatGPT provided information in an exhaustive and methodical manner, it also had flaws, including factual errors, omissions and inflexible recommendations. The results imply that although ChatGPT and other natural-language artificial intelligence (AI) models can be useful resources for patient education, they should not replace the individualised guidance and treatment given by medical experts.
One weakness of the study is its reliance on ChatGPT, a single AI model, which may not adequately capture the breadth of natural-language AI options for healthcare applications. It would be useful to compare ChatGPT’s performance with that of other AI models in providing medical information to patients. Furthermore, the study’s focus on coronary angiography, a specific cardiology procedure, raises questions about the generalisability of the findings to other medical specialties or topics. Future studies could examine ChatGPT’s efficacy in providing information on a broader range of healthcare subjects.
The study also raises concerns about the possible effects of natural-language AI on patient education and healthcare delivery. How can medical practitioners ensure the integrity and dependability of information while integrating AI technologies, such as ChatGPT, into patient education campaigns? More importantly, how can AI models be improved to overcome the shortcomings this study found, such as factual errors and rigid recommendations? These questions can guide future research and development in natural-language AI in healthcare.
The study concludes by highlighting the potential benefits and challenges of using ChatGPT and other natural-language AI models for patient education in healthcare. Although ChatGPT showed promise in offering thorough information, it also had shortcomings that should be addressed. Healthcare practitioners should be aware of these strengths and limitations of AI in patient education and work with AI developers to improve the quality and dependability of the information given to patients, in order to optimise the technology’s benefits. Future research could explore the use of AI models across a broader range of healthcare domains and specialties, as well as strategies to improve AI’s ability to provide tailored healthcare recommendations.
Declaration
The authors have no affiliations or financial involvement with any commercial organisation with a direct financial interest in the subject or materials discussed in the manuscript.
References
- Koh SJQ, Yeo KK, Yap JJL. Leveraging ChatGPT to aid patient education on coronary angiogram. Ann Acad Med Singap 2023;52:374-7.
Authors’ reply:
Dear Editor,
We appreciate the insightful comments regarding our article, “Leveraging ChatGPT to aid patient education on coronary angiogram”.1
We agree that ChatGPT represents a single large language model (LLM) and may not fully encompass the diversity of artificial intelligence models available. However, given ChatGPT’s widespread accessibility and popularity, as evidenced by its rapid growth in monthly users and significant market share,2 it is highly relevant as the primary LLM tool for evaluation in this study. Moreover, numerous studies have used ChatGPT as a benchmark, including one that demonstrated its potential medical accuracy through its performance on the United States Medical Licensing Examination (USMLE).3
Coronary angiography, while relatively common, is an invasive procedure that often prompts questions from patients and the public, forming the basis of our article’s assessment. Given ChatGPT’s conversational nature, we have also explored its utility in addressing queries related to end-stage heart failure,4 with similar findings, suggesting that this evaluation can be extended to other medical fields.
We concur that while ChatGPT shows promise, there are potential pitfalls that healthcare practitioners should be aware of. As more patients turn to these platforms for health information, it is essential for healthcare providers to understand the limitations of these models and to anticipate and address potential misinformation.
Samuel Ji Quan Koh, Jonathan Yap
Correspondence:
Dr Samuel Ji Quan Koh and Dr Jonathan Jiunn-Liang Yap, Department of Cardiology, National Heart Centre Singapore, 5 Hospital Dr, Singapore 169609.
Email (Dr Koh): [email protected]
Email (Dr Yap): [email protected]
References
- Koh SJQ, Yeo KK, Yap JJL. Leveraging ChatGPT to aid patient education on coronary angiogram. Ann Acad Med Singap 2023;52:374-7.
- Reuters. ChatGPT sets record for fastest-growing user base – analyst note, 2 February 2023. https://www.reuters.com/technology/chatgpt-sets-record-fastest-growing-user-base-analyst-note-2023-02-01/. Accessed 9 July 2024.
- Kung TH, Cheatham M, Medenilla A, et al. Performance of ChatGPT on USMLE: Potential for AI-assisted medical education using large language models. PLOS Digit Health 2023;2:e0000198.
- Koh SJQ, Sim DKL, SH N. Letter to the Editor: Educating patients with advanced heart failure through Chat Generative Pretrained Transformer and natural-language artificial intelligence: Is now the time for it? J Palliat Med 2023;26:893-5.
The author(s) declare there are no affiliations with or involvement in any organisation or entity with any financial interest in the subject matter or materials discussed in this manuscript.