Should Artificial Intelligence be used to support clinical ethical decision-making? A systematic review of reasons.
Decision support systems, clinical
Decision-making, artificial intelligence
Ethics, Clinical
Systematic review
Journal
BMC Medical Ethics
ISSN: 1472-6939
Journal abbreviation: BMC Med Ethics
Country: England
NLM ID: 101088680
Publication information
Publication date: 6 July 2023
History:
received: 6 March 2023
accepted: 28 June 2023
medline: 10 July 2023
pubmed: 7 July 2023
entrez: 6 July 2023
Status: epublish
Abstract
BACKGROUND
Healthcare providers have to make ethically complex clinical decisions, which may be a source of stress. Researchers have recently introduced Artificial Intelligence (AI)-based applications to assist in clinical ethical decision-making. However, the use of such tools is controversial. This review aims to provide a comprehensive overview of the reasons given in the academic literature for and against their use.
METHODS
PubMed, Web of Science, Philpapers.org and Google Scholar were searched for all relevant publications. The resulting set of publications underwent title and abstract screening according to predefined inclusion and exclusion criteria, yielding 44 papers whose full texts were analysed using the Kuckartz method of qualitative text analysis.
RESULTS
Artificial Intelligence might increase patient autonomy by improving the accuracy of predictions and allowing patients to receive their preferred treatment. It is thought to increase beneficence by providing reliable information, thereby supporting surrogate decision-making. Some authors fear that reducing ethical decision-making to statistical correlations may limit autonomy. Others argue that AI may not be able to replicate the process of ethical deliberation because it lacks human characteristics. Concerns have been raised about issues of justice, as AI may replicate existing biases in the decision-making process.
CONCLUSIONS
The prospective benefits of using AI in clinical ethical decision-making are manifold, but its development and use should be undertaken carefully to avoid ethical pitfalls. Several issues that are central to the discussion of Clinical Decision Support Systems, such as justice, explicability, or human-machine interaction, have so far been neglected in the debate on AI for clinical ethics.
TRIAL REGISTRATION
This review is registered at the Open Science Framework (https://osf.io/wvcs9).
Identifiers
pubmed: 37415172
doi: 10.1186/s12910-023-00929-6
pii: 10.1186/s12910-023-00929-6
pmc: PMC10327319
Publication types
Systematic Review
Journal Article
Research Support, Non-U.S. Gov't
Languages
eng
Citation subsets
IM
Pagination
48
Copyright information
© 2023. The Author(s).