Automatic surgical phase recognition in laparoscopic inguinal hernia repair with artificial intelligence.
Keywords: Artificial intelligence; Automatic surgical phase recognition; Laparoscopic inguinal hernia repair; Learning curve
Journal
Hernia: the journal of hernias and abdominal wall surgery
ISSN: 1248-9204
Abbreviated title: Hernia
Country: France
NLM ID: 9715168
Publication information
Publication date: December 2022
History:
Received: 10 February 2022
Accepted: 21 April 2022
PubMed: 11 May 2022
Medline: 26 November 2022
Entrez: 10 May 2022
Status: ppublish
Abstract sections
BACKGROUND
Because of the complexity of the intra-abdominal anatomy in the posterior approach, a longer learning curve has been observed in laparoscopic transabdominal preperitoneal (TAPP) inguinal hernia repair. Consequently, automatic tools using artificial intelligence (AI) to monitor TAPP procedures and assess learning curves are required. The primary objective of this study was to establish a deep learning-based automated surgical phase recognition system for TAPP. A secondary objective was to investigate the relationship between surgical skills and phase duration.
METHODS
This study enrolled 119 patients who underwent the TAPP procedure. The surgical videos were annotated (delineated in time) and split into seven surgical phases (preparation, peritoneal flap incision, peritoneal flap dissection, hernia dissection, mesh deployment, mesh fixation, and peritoneal flap closure), plus an additional closure phase. An AI model was trained to automatically recognize surgical phases from the videos. The relationship between phase duration and surgical skill was also evaluated.
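Per-frame phase labels of the kind produced by this annotation directly yield per-phase durations, which is what the skill comparison below relies on. A minimal, stdlib-only sketch (the phase names come from the abstract; the frame rate and helper function are assumptions for illustration, not part of the study's pipeline):

```python
from itertools import groupby

# Phase vocabulary as listed in the abstract.
PHASES = [
    "preparation", "peritoneal flap incision", "peritoneal flap dissection",
    "hernia dissection", "mesh deployment", "mesh fixation",
    "peritoneal flap closure", "additional closure",
]

def phase_durations(frame_labels, fps=1.0):
    """Collapse a per-frame label sequence into (phase, seconds) segments."""
    return [(phase, len(list(run)) / fps)
            for phase, run in groupby(frame_labels)]

# Toy annotated sequence: 3 frames of preparation, 2 of incision, at 1 fps.
labels = ["preparation"] * 3 + ["peritoneal flap incision"] * 2
print(phase_durations(labels))
# → [('preparation', 3.0), ('peritoneal flap incision', 2.0)]
```

The same computation applies whether the labels come from human annotation or from the model's frame-wise predictions, which is how AI-detected phase durations can be compared across surgeons.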
RESULTS
A fourfold cross-validation was used to assess the performance of the AI model. The accuracy was 88.81% in unilateral cases and 85.82% in bilateral cases. In unilateral hernia cases, the durations of peritoneal incision (p = 0.003) and hernia dissection (p = 0.014) detected via AI were significantly shorter for experts than for trainees.
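The fourfold cross-validation reported here amounts to partitioning the cases (not individual frames) into four disjoint folds and averaging held-out performance. A minimal stdlib-only sketch of the splitting logic (the function name and the toy case IDs are assumptions for illustration; the study's actual fold assignment is not given in this record):

```python
def four_fold_splits(case_ids, k=4):
    """Yield (train, validation) partitions of case IDs for k-fold CV."""
    folds = [case_ids[i::k] for i in range(k)]  # round-robin fold assignment
    for i in range(k):
        val = folds[i]
        train = [c for j, fold in enumerate(folds) if j != i for c in fold]
        yield train, val

# Toy check: every split covers all cases with no train/val overlap.
cases = list(range(8))
for train, val in four_fold_splits(cases):
    assert sorted(train + val) == cases
    assert not set(train) & set(val)
```

Splitting at the case level rather than the frame level matters: frames from one video are highly correlated, so putting them in both train and validation sets would inflate the reported accuracy.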
CONCLUSION
An automated surgical phase recognition system for TAPP was established using deep learning, with high accuracy. Our AI-based system can be useful for automatically monitoring surgical progress, improving operating room efficiency, evaluating surgical skills, and supporting video-based surgical education. Specific phase durations detected via the AI model were significantly associated with the surgeons' learning curve.
Identifiers
pubmed: 35536371
doi: 10.1007/s10029-022-02621-x
pii: 10.1007/s10029-022-02621-x
Publication types
Journal Article
Languages
eng
Citation subsets
IM
Pagination
1669-1678
Copyright information
© 2022. The Author(s), under exclusive licence to Springer-Verlag France SAS, part of Springer Nature.