Autonomous control of an ultrasound probe for intra-operative ultrasonography using vision-based shape sensing of pneumatically attachable flexible rails.

Keywords: Intra-operative ultrasound; Medical robotics; Robotic-assisted surgery; Shape sensing

Journal

International journal of computer assisted radiology and surgery
ISSN: 1861-6429
Abbreviated title: Int J Comput Assist Radiol Surg
Country: Germany
NLM ID: 101499225

Publication information

Publication date:
22 May 2024
History:
received: 3 March 2024
accepted: 3 May 2024
medline: 23 May 2024
pubmed: 23 May 2024
entrez: 22 May 2024
Status: ahead of print

Abstract

In robotic-assisted minimally invasive surgery, surgeons often use intra-operative ultrasound to visualise endophytic structures and localise resection margins. This task must be performed by a highly skilled surgeon; automating it may reduce the surgeon's cognitive load and improve patient outcomes. We demonstrate vision-based shape sensing of the pneumatically attachable flexible (PAF) rail using colour-dependent image segmentation. The shape-sensing framework is evaluated on a set of known curves, and the vision-based sensor is shown to have sensing accuracy comparable to that of FBGS-based systems. We report the RMSE of the vision-based shape sensing of the PAF rail with respect to ground truth. We propose a framework for autonomous intra-operative US scanning that uses vision-based shape sensing to inform path planning. Ultrasound images were evaluated by clinicians for sharpness, clarity of visible structures, and contrast between solid and fluid areas; clinicians rated the robot-acquired images as superior to human-acquired images on all metrics. Future work will translate the framework to a da Vinci surgical robot.
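The abstract's pipeline (colour-dependent segmentation of the rail, centreline extraction, RMSE against a known ground-truth curve) can be illustrated with a minimal self-contained sketch. Everything here is an assumption for illustration: the function names, the red-rail colour thresholds, and the synthetic parabolic test image are not taken from the paper, which does not specify its implementation details.

```python
import numpy as np

def segment_rail(rgb, r_min=200, g_max=80, b_max=80):
    """Boolean mask of pixels matching an (assumed) red rail colour."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return (r >= r_min) & (g <= g_max) & (b <= b_max)

def centreline(mask):
    """Per-column centroid of mask rows -> sampled 2-D rail curve."""
    cols = np.flatnonzero(mask.any(axis=0))
    rows = np.array([np.mean(np.flatnonzero(mask[:, c])) for c in cols])
    return cols, rows

def rmse(estimate, ground_truth):
    """Root-mean-square error between estimated and true curves."""
    return float(np.sqrt(np.mean((estimate - ground_truth) ** 2)))

# Synthetic test image: a red, parabola-shaped "rail" on a dark background.
h, w = 64, 128
img = np.zeros((h, w, 3), dtype=np.uint8)
xs = np.arange(w)
true_rows = (20 + 0.002 * (xs - w / 2) ** 2).astype(int)
for x, y in zip(xs, true_rows):
    img[y - 1:y + 2, x] = (255, 30, 30)  # 3-px-thick red stroke

mask = segment_rail(img)
cols, rows = centreline(mask)
err = rmse(rows, true_rows[cols])
```

On the synthetic image the per-column centroid recovers the stroke centre exactly, so `err` is near zero; on real endoscopic images the paper's reported RMSE would instead reflect lighting, occlusion, and calibration effects.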

Identifiers

pubmed: 38777945
doi: 10.1007/s11548-024-03178-z
pii: 10.1007/s11548-024-03178-z

Publication types

Journal Article

Languages

eng

Citation subsets

IM

Grants

Agency: Wellcome / EPSRC Centre for Interventional and Surgical Sciences
ID: 203145/Z/16/Z
Agency: Engineering and Physical Sciences Research Council
ID: EP/P027938/1, EP/R004080/1, EP/P012841/1
Agency: Royal Academy of Engineering Chair in Emerging Technologies Scheme
ID: CiET1819/2/36

Copyright information

© 2024. The Author(s).

References

Hussain A, Malik A, Halim MU, Ali AM (2014) The use of robotics in surgery: a review. Int J Clin Pract 68(11):1376–1382
doi: 10.1111/ijcp.12492 pubmed: 25283250
Bramhe S, Pathak SS (2022) Robotic surgery: a narrative review. Cureus 14(9):e29179
pubmed: 36258968 pmcid: 9573327
Soomro NA, Hashimoto DA, Porteous AJ, Ridley CJA, Marsh WJ, Ditto R, Roy S (2020) Systematic review of learning curves in robot-assisted surgery. BJS Open 4(1):27–44
doi: 10.1002/bjs5.50235 pubmed: 32011823
Hanly EJ, Talamini MA (2004) Robotic abdominal surgery. Am J Surg 188(4):19–26
doi: 10.1016/j.amjsurg.2004.08.020
Benway BM, Bhayani SB, Rogers CG, Porter JR, Buffi NM, Figenshau RS, Mottrie A (2010) Robot-assisted partial nephrectomy: an international experience. Eur Urol 57(5):815–820
doi: 10.1016/j.eururo.2010.01.011 pubmed: 20116163
Sun Y, Wang W, Zhang Q, Zhao X, Xu L, Guo H (2021) Intraoperative ultrasound: technique and clinical experience in robotic-assisted renal partial nephrectomy for endophytic renal tumors. Int Urol Nephrol 53(3):455–463
doi: 10.1007/s11255-020-02664-y pubmed: 33006090
D'Ettorre C, Stilli A, Dwyer G, Neves JB, Tran M, Stoyanov D (2019) Semi-autonomous interventional manipulation using pneumatically attachable flexible rails. In: IEEE international conference on intelligent robots and systems, pp 1347–1354
Stilli A, Dimitrakakis E, D’ettorre C, Tran M, Stoyanov D (2019) Pneumatically attachable flexible rails for track-guided ultrasound scanning in robotic-assisted partial nephrectomy - a preliminary design study. IEEE Robot Autom Lett 4(2):1208–1215
doi: 10.1109/LRA.2019.2894499
McDonald-Bowyer A, Dietsch S, Dimitrakakis E, Coote JM, Lindenroth L, Stoyanov D, Stilli A (2023) Organ curvature sensing using pneumatically attachable flexible rails in robotic-assisted laparoscopic surgery. Front Robot AI 9:1099275
doi: 10.3389/frobt.2022.1099275 pubmed: 36686214 pmcid: 9849801
McDonald-Bowyer A, Dietsch S, Dimitrakakis E, Coote J, Lindenroth L, Stoyanov D, Stilli A (2022) Towards autonomous robotic ultrasound scanning using pneumatically attachable flexible rails. EasyChair
Dietsch S, McDonald-Bowyer A, Dimitrakakis E, Coote JM, Lindenroth L, Stilli A, Stoyanov D (2022) Localization of interaction using fibre-optic shape sensing in soft-robotic surgery tools. In: IEEE international conference on intelligent robots and systems, pp 8057–8063
Pierrot F, Dombre E, Dégoulange E, Urbain L, Caron P, Boudet S, Gariépy J, Mégnien JL (1999) Hippocrate: a safe robot arm for medical applications with force feedback. Med Image Anal 3(3):285–300. https://doi.org/10.1016/S1361-8415(99)80025-5
doi: 10.1016/S1361-8415(99)80025-5 pubmed: 10710297
Elek R, Nagy TD, Nagy DA, Takacs B, Galambos P, Rudas I, Haidegger T (2017) Robotic platforms for ultrasound diagnostics and treatment. In: 2017 IEEE international conference on systems, man, and cybernetics (SMC), pp 1752–1757. https://doi.org/10.1109/SMC.2017.8122869
Huang Y, Xiao W, Wang C, Liu H, Huang R, Sun Z (2021) Towards fully autonomous ultrasound scanning robot with imitation learning based on clinical protocols. IEEE Robot Autom Lett 6(2):3671–3678. https://doi.org/10.1109/LRA.2021.3064283
doi: 10.1109/LRA.2021.3064283
Pratt P, Hughes-Hallett A, Zhang L, Patel N, Mayer E, Darzi A, Yang GZ (2015) Autonomous ultrasound-guided tissue dissection. Lecture notes in computer science 9349:249–257. https://doi.org/10.1007/978-3-319-24553-9_31
Schneider C, Nguan C, Rohling R, Salcudean S (2016) Tracked "pick-up" ultrasound for robot-assisted minimally invasive surgery. IEEE Trans Biomed Eng 63(2):260–268. https://doi.org/10.1109/TBME.2015.2453173
doi: 10.1109/TBME.2015.2453173 pubmed: 26168430
Marahrens N, Scaglioni B, Jones D, Prasad R, Biyani CS, Valdastri P (2022) Towards autonomous robotic minimally invasive ultrasound scanning and vessel reconstruction on non-planar surfaces. Front Robot AI 9:940062
doi: 10.3389/frobt.2022.940062 pubmed: 36304794 pmcid: 9594548
Jiang Z, Li Z, Grimm M, Zhou M, Esposito M, Wein W, Stechele W, Wendler T, Navab N (2022) Autonomous robotic screening of tubular structures based only on real-time ultrasound imaging feedback. IEEE Trans Ind Electron 69(7):7064–7075. arXiv:2011.00099. https://doi.org/10.1109/TIE.2021.3095787
Huang Q, Zhou J, Li Z (2023) Review of robot-assisted medical ultrasound imaging systems: technology and clinical applications. Neurocomputing 559:126790
doi: 10.1016/j.neucom.2023.126790
Camarillo DB, Loewke KE, Carlson CR, Salisbury JK (2008) Vision based 3-D shape sensing of flexible manipulators. In: Proceedings - IEEE international conference on robotics and automation, pp 2940–2947
Ferrier NJ, Brockett RW (2000) Reconstructing the shape of a deformable membrane from image data. Int J Robot Res 19(9):795–816
doi: 10.1177/02783640022067184
Stassi S, Cauda V, Canavese G, Pirri CF (2014) Flexible tactile sensing based on piezoresistive composites: a review. Sensors 14(3):5296–5332
doi: 10.3390/s140305296 pubmed: 24638126 pmcid: 4003994
Sareh S, Noh Y, Li M, Ranzani T, Liu H, Althoefer K (2015) Macrobend optical sensing for pose measurement in soft robot arms. Smart Mater Struct 24:125024
doi: 10.1088/0964-1726/24/12/125024
Song S, Li Z, Yu H, Ren H (2015) Electromagnetic positioning for tip tracking and shape sensing of flexible robots. IEEE Sens J 15(8):4565–4575
doi: 10.1109/JSEN.2015.2424228
Kamiyama K, Vlack K, Mizota T, Kajimoto H, Kawakami N, Tachi S (2005) Vision-based sensor for real-time measuring of surface traction fields. IEEE Comput Graphics Appl 25(1):68–75
Chorley C, Melhuish C, Pipe T, Rossiter J (2010) Tactile edge detection. In: Proceedings of IEEE sensors, pp 2593–2598
Wang R, Wang S, Du S, Xiao E, Yuan W, Feng C (2020) Real-time soft body 3D proprioception via deep vision-based sensing. IEEE Robot Autom Lett 5(2):3382–3389
doi: 10.1109/LRA.2020.2975709
Zhang Z, Dequidt J, Duriez C (2018) Vision-based sensing of external forces acting on soft robots using finite element method. IEEE Robot Autom Lett 3(3):1529–1536
Stilli A, Dimitrakakis E, Tran M, Stoyanov D (2018) Track-guided ultrasound scanning for tumour margins outlining in robot-assisted partial nephrectomy. In: Althoefer K, Vander Poorten E (eds) Proceedings of the 8th joint workshop on new technology for computer/robot assisted surgery (CRAS). CRAS, London
Thompson S, Dowrick T, Ahmad M, Xiao G, Koo B, Bonmati E, Kahl K, Clarkson MJ (2020) SciKit-surgery: compact libraries for surgical navigation. Int J Comput Assist Radiol Surg 15(7):1075–1084
doi: 10.1007/s11548-020-02180-5 pubmed: 32436132 pmcid: 7316849
Salimi N, Gonzalez-Fiol A, Yanez D, Fardelmann K, Harmon E, Kohari K, Abdel-Razeq S, Magriples U, Alian A (2022) Ultrasound image quality comparison between a handheld ultrasound transducer and mid-range ultrasound machine. POCUS J 7(1):154–159
doi: 10.24908/pocus.v7i1.15052 pubmed: 36896280 pmcid: 9979954

Authors

Aoife McDonald-Bowyer (A)

WEISS, UCL, London, UK. aoife.mcdonald-bowyer.19@ucl.ac.uk.

Tom Syer (T)

Department of Radiology, University of Cambridge, Cambridge, UK.

Adam Retter (A)

Centre for Medical Imaging, UCL, London, UK.

Danail Stoyanov (D)

WEISS, UCL, London, UK.

Agostino Stilli (A)

WEISS, UCL, London, UK.

MeSH classifications