Health Care Professionals' Experience of Using AI: Systematic Review With Narrative Synthesis.
Keywords
CDSS
artificial intelligence
clinical decision support systems
clinician experience
decision-making
health care delivery
health care professionals
quality assessment
Journal
Journal of medical Internet research
ISSN: 1438-8871
Abbreviated title: J Med Internet Res
Country: Canada
NLM ID: 100959882
Publication information
Publication date: 30 Oct 2024
History:
received: 22 Dec 2023
accepted: 25 Jul 2024
revised: 10 Jun 2024
medline: 31 Oct 2024
pubmed: 30 Oct 2024
entrez: 30 Oct 2024
Status:
epublish
Abstract
BACKGROUND
There has been a substantial increase in the development of artificial intelligence (AI) tools for clinical decision support. Historically, these were mostly knowledge-based systems, but recent advances include non-knowledge-based systems using some form of machine learning. The ability of health care professionals to trust technology and understand how it benefits patients or improves care delivery is known to be important for their adoption of that technology. For non-knowledge-based AI tools for clinical decision support, these issues are poorly understood.
OBJECTIVE
The aim of this study is to qualitatively synthesize evidence on the experiences of health care professionals in routinely using non-knowledge-based AI tools to support their clinical decision-making.
METHODS
In June 2023, we searched 4 electronic databases, MEDLINE, Embase, CINAHL, and Web of Science, with no language or date limit. We also contacted relevant experts and searched reference lists of the included studies. We included studies of any design that reported the experiences of health care professionals using non-knowledge-based systems for clinical decision support in their work settings. We completed double independent quality assessment for all included studies using the Mixed Methods Appraisal Tool. We used a theoretically informed thematic approach to synthesize the findings.
RESULTS
After screening 7552 titles and 182 full-text articles, we included 25 studies conducted in 9 different countries. Most of the included studies were qualitative (n=13), and the remaining were quantitative (n=9) and mixed methods (n=3). Overall, we identified 7 themes: health care professionals' understanding of AI applications, level of trust and confidence in AI tools, judging the value added by AI, data availability and limitations of AI, time and competing priorities, concern about governance, and collaboration to facilitate the implementation and use of AI. The first 3 themes occurred most frequently. For example, many studies reported that health care professionals were concerned about not understanding the AI outputs or the rationale behind them. There were issues with confidence in the accuracy of the AI applications and their recommendations. Some health care professionals believed that AI provided added value and improved decision-making, some reported that it only served as a confirmation of their clinical judgment, and others did not find it useful at all.
CONCLUSIONS
Our review identified several important issues documented in various studies on health care professionals' use of AI tools in real-world health care settings. Opinions of health care professionals regarding the added value of AI tools for supporting clinical decision-making varied widely, and many professionals had concerns about their understanding of and trust in this technology. The findings of this review emphasize the need for concerted efforts to optimize the integration of AI tools in real-world health care settings.
TRIAL REGISTRATION
PROSPERO CRD42022336359; https://tinyurl.com/2yunvkmb.
Identifiers
pubmed: 39476382
pii: v26i1e55766
doi: 10.2196/55766
Publication types
Journal Article
Systematic Review
Review
Languages
eng
Citation subsets
IM
Pagination
e55766
Copyright information
©Abimbola Ayorinde, Daniel Opoku Mensah, Julia Walsh, Iman Ghosh, Siti Aishah Ibrahim, Jeffry Hogg, Niels Peek, Frances Griffiths. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 30.10.2024.