USPoint: Self-Supervised Interest Point Detection and Description for Ultrasound-Probe Motion Estimation During Fine-Adjustment Standard Fetal Plane Finding.

Keywords: Local detector and descriptor; Obstetric US; Probe motion

Journal

Medical image computing and computer-assisted intervention : MICCAI ... International Conference on Medical Image Computing and Computer-Assisted Intervention
Abbreviated title: Med Image Comput Comput Assist Interv
Country: Germany
NLM ID: 101249582

Publication information

Publication date:
17 Sep 2022
History:
medline: 24 May 2023
pubmed: 24 May 2023
entrez: 24 May 2023
Status: ppublish

Abstract

Ultrasound (US)-probe motion estimation is a fundamental problem in automated standard plane localization during obstetric US diagnosis. Most existing works employ a deep neural network (DNN) to regress the probe motion. However, these deep regression-based methods use the DNN to overfit to the specific training data, and so naturally lack generalization ability for clinical application. In this paper, we return to generalized US feature learning rather than deep parameter regression. We propose a self-supervised learned local detector and descriptor, named USPoint, for US-probe motion estimation during the fine-adjustment phase of fetal plane acquisition. Specifically, a hybrid neural architecture is designed to simultaneously extract local features and further estimate the probe motion. By embedding differentiable USPoint-based motion estimation inside the proposed network architecture, USPoint learns the keypoint detector, scores and descriptors from motion error alone, which does not require expensive human annotation of local features. The two tasks, local feature learning and motion estimation, are jointly learned in a unified framework to enable collaborative learning for mutual benefit. To the best of our knowledge, this is the first learned local detector and descriptor tailored to US images. Experimental evaluation on real clinical data demonstrates the resultant performance improvement on feature matching and motion estimation, showing potential clinical value. A video demo can be found online: https://youtu.be/JGzHuTQVlBs.
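The key ingredient that lets the network learn keypoints from motion error alone is a motion-estimation step that is differentiable with respect to the matched keypoints and their scores. The paper's exact formulation is not given in this record; below is a minimal illustrative sketch of one standard such component, a weighted closed-form rigid alignment (weighted Kabsch), whose function name, weighting scheme, and 3-D setup are assumptions for illustration only.

```python
import numpy as np

def weighted_rigid_transform(src, dst, w):
    """Closed-form weighted least-squares rigid alignment (weighted Kabsch).

    Finds R, t minimising  sum_i w[i] * ||R @ src[i] + t - dst[i]||^2
    for 3-D point sets. Because it is a composition of differentiable
    operations (weighted means, SVD), a solver of this kind can be
    embedded inside a network and trained end-to-end, so keypoint
    scores w can be supervised by the downstream motion error.
    """
    w = w / w.sum()
    mu_s = (w[:, None] * src).sum(axis=0)             # weighted centroids
    mu_d = (w[:, None] * dst).sum(axis=0)
    S = (dst - mu_d).T @ (w[:, None] * (src - mu_s))  # cross-covariance
    U, _, Vt = np.linalg.svd(S)
    d = np.sign(np.linalg.det(U @ Vt))                # guard against reflection
    R = U @ np.diag([1.0, 1.0, d]) @ Vt
    t = mu_d - R @ mu_s
    return R, t

# Usage: recover a known probe rotation/translation from noiseless matches.
rng = np.random.default_rng(0)
src = rng.normal(size=(50, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -0.2, 1.0])
dst = src @ R_true.T + t_true
R, t = weighted_rigid_transform(src, dst, np.ones(50))
```

With uniform weights this reduces to the ordinary Kabsch solution; in a learned pipeline the weights would come from the predicted keypoint scores, letting gradient descent down-weight unreliable matches.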

Identifiers

pubmed: 37223131
doi: 10.1007/978-3-031-16449-1_11
pmc: PMC7614558
mid: EMS159397

Publication types

Journal Article

Languages

eng

Pagination

104-114

References

Med Image Anal. 2006 Apr;10(2):137-49
pubmed: 16143560
Med Image Anal. 2018 Aug;48:187-202
pubmed: 29936399
Med Image Comput Comput Assist Interv. 2020 Oct;12263:583-592
pubmed: 33103163
Med Image Comput Comput Assist Interv. 2021 Sep 21;12908:670-679
pubmed: 35373220

Authors

Cheng Zhao (C)

Institute of Biomedical Engineering, University of Oxford, Oxford, UK.

Richard Droste (R)

Institute of Biomedical Engineering, University of Oxford, Oxford, UK.

Lior Drukker (L)

Nuffield Department of Women's and Reproductive Health, University of Oxford, Oxford, UK.

Aris T Papageorghiou (AT)

Nuffield Department of Women's and Reproductive Health, University of Oxford, Oxford, UK.

J Alison Noble (J)

Institute of Biomedical Engineering, University of Oxford, Oxford, UK.

MeSH classifications