Vol. 53 No. 7, 454–455, 25 July 2024

Using artificial intelligence as an ethics advisor

Dear Editor,

Ethical dilemmas are common in the practice of medicine and can lead to an array of seemingly reasonable decisions unless policies or regulations mandate certain actions. Choosing the appropriate solution requires not only biomedical evidence but also the balancing of possibly divergent preferences, values, contextual factors and ethical theories. These theories include utilitarianism, which aims to optimise happiness for the greatest number of people, and deontology, which promotes actions based on rules and duties even when these actions do not yield the greatest common good. The inability to find common ground can both delay appropriate care and trigger moral distress among health professionals.1 However, training in ethical reasoning and access to ethics consultations may not be universally available. How then can frontline healthcare teams navigate ethical dilemmas?

Generative artificial intelligence (generative AI) is a form of artificial intelligence that produces text in response to user prompts and includes large language model chatbots such as Chat Generative Pre-trained Transformer (ChatGPT, https://chat.openai.com/). To demonstrate how generative AI can act as an ethics advisor, the freely available ChatGPT version 3.5 was applied to 2 cases involving extracorporeal membrane oxygenation (ECMO)—a high-cost intervention fraught with ethical considerations2—from the AMA Journal of Ethics (Supplementary Material). In both cases, ChatGPT was prompted sequentially, within a single conversation, with 5 queries (an illustrative script follows the list):

  • “What should the clinician do for the case?” (A general, open-ended question to seek practical guidance on the appropriate action to be taken.)
  • “You mentioned ethical principles. Please elaborate.” (A more specific question in response to ChatGPT’s advice to consider ethical principles, seeking to clarify specific principles for the clinician’s understanding.)
  • “What if no palliative care specialists or ethicists are available?” (A question for advice in the absence of specialist consultation, which is a real-world limitation that clinicians may encounter.)
  • “Should ECMO be continued or withdrawn?” (A specific question regarding withdrawing ECMO, aiming to elicit advice on a key course of action.)
  • “In retrospect, should the patient have received ECMO?” (A specific question regarding a counterfactual scenario, to aid decision-making for future patients.)
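
These queries were posed through the free ChatGPT web interface. For readers who wish to replicate this sequential prompting programmatically, a minimal sketch follows; it assumes the OpenAI Python client (openai>=1.0), an API key in the OPENAI_API_KEY environment variable, and gpt-3.5-turbo as the API analogue of the web-based ChatGPT 3.5, with a hypothetical placeholder standing in for the case text.

```python
# Illustrative sketch only: the cases in this letter were run through the free
# ChatGPT 3.5 web interface, not the API. This script assumes the OpenAI Python
# client (openai>=1.0) with an API key in the OPENAI_API_KEY environment
# variable, and gpt-3.5-turbo as the API analogue of ChatGPT 3.5.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical placeholder: paste the full clinical case (e.g. an ECMO case) here.
CASE_DESCRIPTION = "<full clinical case description>"

QUERIES = [
    "What should the clinician do for the case?",
    "You mentioned ethical principles. Please elaborate.",
    "What if no palliative care specialists or ethicists are available?",
    "Should ECMO be continued or withdrawn?",
    "In retrospect, should the patient have received ECMO?",
]

# Keep all prior turns in the message list so each query is answered in the
# context of the case and the earlier answers, mirroring a single chat session.
messages = [{"role": "user", "content": CASE_DESCRIPTION}]
for query in QUERIES:
    messages.append({"role": "user", "content": query})
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=messages,
    )
    answer = response.choices[0].message.content
    messages.append({"role": "assistant", "content": answer})
    print(f"Q: {query}\nA: {answer}\n")
```

Carrying the full message history into each call keeps every answer grounded in the case and the preceding replies, as in the chat sessions described above.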

ChatGPT’s responses provided a framework for ethical case analysis, complementing the general rules and practice recommendations provided by current ECMO guidelines,3 and covered most of the common competencies of communications, ethics and professionalism for medical students and physicians4 (Table 1). Although general concerns about privacy, bias and explainability5,6 in artificial intelligence may still affect generative AI’s role as an ethics advisor, generative AI does not require the input of confidential information and generates textual output that is transparent to human interpretation.

Table 1. Common competencies of communications, ethics and professionalism: coverage by advice provided by ECMO guidelines3 and by ChatGPT.

Values/ethics—appreciation of professional, legal and societal values to guide practice
  • ECMO guidelines3: Partially. Advised inclusion and exclusion criteria for ECMO, prognosis, and the possibility of bridging to advanced therapies. No mention of ethical principles.
  • ChatGPT: Yes. Advised consideration of ethical principles (beneficence, non-maleficence, autonomy, justice), medical condition, prognosis, quality of life, goals of care, patient/family preferences and resource allocation.

Professional responsibilities—appreciating and manifesting these responsibilities
  • ECMO guidelines3: Yes. Advised assessment of the patient’s condition and review of treatment goals.
  • ChatGPT: Yes. Advised continuous reassessment of the patient’s condition, review of treatment goals, and provision of emotional support to family members or surrogates.

Doctor-patient relationships—appreciation of the guiding principles, expectations and ability to nurture these relationships
  • ECMO guidelines3: Yes. Advised family education and discussions regarding possible discontinuation of ECMO support, and provision of appropriate palliative care where ECMO is unlikely to succeed.
  • ChatGPT: Yes. Advised communication, information gathering, goals-of-care discussion, and shared decision-making with family members or surrogates.

Interprofessional team working
  • ECMO guidelines3: Yes. Advised involving palliative or supportive care teams, psychologic counsellors and ethics teams.
  • ChatGPT: Yes. Advised involving a multidisciplinary team and palliative care consultation.

Ethical and legal responsibilities
  • ECMO guidelines3: Partially. Advised addressing ethical considerations beyond merely providing treatment with ECMO. No mention of legal responsibilities.
  • ChatGPT: Yes. Advised addressing ethical considerations and documenting all discussions, decisions and care provided. Advised adherence to legal requirements and institutional policies.

Continuous learning and quality improvement
  • ECMO guidelines3: Yes. Advised that training and retraining be part of the ECMO programme.
  • ChatGPT: Yes. Advised learning from external resources, online resources and colleagues.

ECMO: extracorporeal membrane oxygenation

As demonstrated by the 2 case examples, generative AI such as ChatGPT may be well suited to providing ethical guidance to frontline clinicians. ChatGPT’s responses can prompt clinicians to address critical aspects of navigating ethical dilemmas in healthcare. These include thorough information gathering; effective communication with family members and the multidisciplinary team; discussions about goals of care; ethical principles; consultation with specialist palliative care (where possible); seeking peer input; utilising online resources and guidelines; and maintaining clear documentation. Conceivably, such ethical challenges can extend to decisions about withholding or withdrawing treatment across diverse fields of medicine.

Such support is not meant to replace the essential roles of humans in ethical reasoning, as ChatGPT does not gather information; communicate with patients, families and other clinicians; integrate ethical principles, human values, subjective judgements and contextual issues; or balance arguments and counterarguments.7 Also, as ChatGPT’s output shows, generative AI does not provide binary “yes” or “no” answers to specific ethical questions, which limits its suitability for quick decisions in urgent situations. Rather, by providing guidance on the essential areas for consideration, generative AI shows promise as a collaborative intelligence tool to augment human decision-making for ethical dilemmas.8

Declaration
There are no affiliations or financial involvement with any commercial organisation with a direct financial interest in the subject or materials discussed in the manuscript.

Correspondence: Dr Kay Choong See, Senior Consultant, Division of Respiratory and Critical Care Medicine, Department of Medicine, National University Hospital, Level 10 NUHS Tower Block, 1E Kent Ridge Road, Singapore 119228.
Email: [email protected]

This article was first published online on 25 July 2024 at annals.edu.sg


References

  1. Farrell CM, Hayward BJ. Ethical Dilemmas, Moral Distress, and the Risk of Moral Injury: Experiences of Residents and Fellows During the COVID-19 Pandemic in the United States. Acad Med 2022;97:S55-S60.
  2. Lim JKB, Qadri SK, Toh TSW, et al. Extracorporeal Membrane Oxygenation for Severe Respiratory Failure During Respiratory Epidemics and Pandemics: A Narrative Review. Ann Acad Med Singap 2020;49:199-214.
  3. Lorusso R, Shekar K, MacLaren G, et al. ELSO Interim Guidelines for Venoarterial Extracorporeal Membrane Oxygenation in Adult Cardiac Patients. ASAIO J 2021;67:827-44.
  4. Ting JJQ, Phua GLG, Hong DZ, et al. Evidence-guided approach to portfolio-guided teaching and assessing communications, ethics and professionalism for medical students and physicians: a systematic scoping review. BMJ Open 2023;13:e067048.
  5. Tan EC. Artificial Intelligence and Medical Innovation. Ann Acad Med Singap 2020;49:252-5.
  6. Meier LJ, Hein A, Diepold K, et al. Algorithms for Ethical Decision-Making in the Clinic: A Proof of Concept. Am J Bioeth 2022;22:4-20.
  7. Pearlman RA, Foglia MB, Fox E, et al. Ethics Consultation Quality Assessment Tool: A Novel Method for Assessing the Quality of Ethics Case Consultations Based on Written Records. Am J Bioeth 2016;16:3-14.
  8. See KC. Collaborative intelligence for intensive care units. Crit Care 2021;25:426.