Active learning using deep Bayesian networks for surgical workflow analysis.
Active learning
Bayesian deep learning
Surgical workflow analysis
Journal
International journal of computer assisted radiology and surgery
ISSN: 1861-6429
Abbreviated title: Int J Comput Assist Radiol Surg
Country: Germany
NLM ID: 101499225
Publication information
Publication date: Jun 2019
History:
received: 30 Jan 2019
accepted: 02 Apr 2019
pubmed: 11 Apr 2019
medline: 3 Sep 2019
entrez: 11 Apr 2019
Status: ppublish
Abstract
For many applications in the field of computer-assisted surgery, such as providing the position of a tumor, specifying the most probable tool required next by the surgeon or determining the remaining duration of surgery, methods for surgical workflow analysis are a prerequisite. Often, machine learning-based approaches serve as the basis for analyzing the surgical workflow. In general, machine learning algorithms, such as convolutional neural networks (CNN), require large amounts of labeled data. While data is often available in abundance, many tasks in surgical workflow analysis need annotations by domain experts, making it difficult to obtain a sufficient amount of annotations. The aim of using active learning to train a machine learning model is to reduce the annotation effort. Active learning methods determine which unlabeled data points would provide the most information according to some metric, such as prediction uncertainty. Experts are then asked to annotate only these data points. The model is retrained with the new data and used to select further data for annotation. Recently, active learning has been applied to CNNs by means of deep Bayesian networks (DBN). These networks make it possible to assign uncertainties to predictions. In this paper, we present a DBN-based active learning approach adapted for image-based surgical workflow analysis tasks. Furthermore, by using a recurrent architecture, we extend this network to video-based surgical workflow analysis. To decide which data points should be labeled next, we explore and compare different metrics for expressing uncertainty. We evaluate these approaches and compare the different metrics on the Cholec80 dataset by performing instrument presence detection and surgical phase segmentation. Here, we are able to show that using a DBN-based active learning approach for selecting which data points to annotate next can significantly outperform a baseline based on randomly selecting data points. In particular, metrics such as entropy and variation ratio perform consistently across the different tasks. We show that DBN-based active learning strategies make it possible to selectively annotate data, thereby reducing the required amount of labeled training data for surgical workflow-related tasks.
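The selection step described in the abstract can be illustrated with a short sketch: run several stochastic forward passes with dropout kept active (Monte Carlo dropout, a common approximation of a deep Bayesian network), compute an acquisition score such as predictive entropy or variation ratio for each unlabeled sample, and hand the highest-scoring samples to a domain expert for annotation. The Python/PyTorch code below is a minimal illustration under these assumptions, not the authors' implementation; names such as `model`, `unlabeled_batches`, `budget`, and `n_mc` are hypothetical placeholders.

```python
# Minimal sketch (not the paper's code): Monte Carlo dropout uncertainty
# with entropy and variation ratio as acquisition scores for one
# active learning selection step. All names are placeholders.
import torch
import torch.nn.functional as F


def enable_mc_dropout(model):
    """Put the model in eval mode but keep dropout layers stochastic."""
    model.eval()
    for m in model.modules():
        if isinstance(m, (torch.nn.Dropout, torch.nn.Dropout2d)):
            m.train()


def mc_dropout_probs(model, x, n_mc=20):
    """Class probabilities from n_mc stochastic forward passes,
    returned with shape (n_mc, batch, classes)."""
    enable_mc_dropout(model)
    with torch.no_grad():
        return torch.stack([F.softmax(model(x), dim=-1) for _ in range(n_mc)])


def entropy_score(probs):
    """Predictive entropy of the MC-averaged distribution (higher = more uncertain)."""
    mean_p = probs.mean(dim=0)  # (batch, classes)
    return -(mean_p * torch.log(mean_p + 1e-12)).sum(dim=-1)


def variation_ratio_score(probs):
    """Variation ratio: 1 - relative frequency of the modal class across passes."""
    n_mc, _, n_classes = probs.shape
    votes = probs.argmax(dim=-1)                     # (n_mc, batch)
    counts = F.one_hot(votes, n_classes).sum(dim=0)  # (batch, classes) vote counts
    return 1.0 - counts.max(dim=-1).values.float() / n_mc


def select_for_annotation(model, unlabeled_batches, budget, score_fn=entropy_score):
    """Score all unlabeled samples and return indices of the `budget`
    most uncertain ones, to be sent to a domain expert for labeling."""
    scores = torch.cat([score_fn(mc_dropout_probs(model, x)) for x in unlabeled_batches])
    return torch.topk(scores, k=budget).indices
```

After the selected samples have been labeled, they are added to the training set, the network is retrained, and the selection step is repeated until the annotation budget is exhausted.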
Identifiers
pubmed: 30968355
doi: 10.1007/s11548-019-01963-9
pii: 10.1007/s11548-019-01963-9
Publication types
Journal Article
Languages
eng
Citation subsets
IM
Pagination
1079-1087