Self-Supervised Robust Feature Matching Pipeline for Teach and Repeat Navigation.

artificial neural network; computer vision; deep learning; long-term autonomy; mobile robot; self-supervised machine learning; visual teach and repeat navigation

Journal

Sensors (Basel, Switzerland)
ISSN: 1424-8220
Abbreviated title: Sensors (Basel)
Country: Switzerland
NLM ID: 101204366

Publication information

Publication date: 7 April 2022
History:
Received: 1 March 2022
Revised: 28 March 2022
Accepted: 31 March 2022
Entrez: 23 April 2022
PubMed: 24 April 2022
MEDLINE: 27 April 2022
Status: epublish

Abstract

The performance of deep neural networks and the low cost of computational hardware have made computer vision a popular choice in many robotic systems. An attractive feature of deep-learned methods is their ability to cope with appearance changes caused by day-night cycles and seasonal variations. However, training deep neural networks typically relies on large numbers of hand-annotated images, which requires significant effort for data collection and annotation. We present a method that allows autonomous, self-supervised training of a neural network in visual teach-and-repeat (VT&R) tasks, where a mobile robot has to traverse a previously taught path repeatedly. Our method is based on a fusion of two image registration schemes: one based on a Siamese neural network and another on point-feature matching. As the robot traverses the taught paths, it uses the results of feature-based matching to train the neural network, which, in turn, provides coarse registration estimates to the feature matcher. We show that as the neural network gets trained, the accuracy and robustness of the navigation increase, making the robot capable of dealing with significant changes in the environment. This method can significantly reduce the data annotation effort when designing new robotic systems or introducing robots into new environments. Moreover, the method provides annotated datasets that can be deployed in other navigation systems. To promote the reproducibility of the research presented herein, we provide our datasets, code and trained models online.
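The mutual-supervision loop summarized in the abstract (point-feature matching produces displacement estimates that train a Siamese network, which in turn hands a coarse prior back to the feature matcher) can be illustrated with a minimal sketch. The code below is not the authors' implementation: the network architecture, the use of ORB features, the median-shift heuristic, and all names (SiameseShiftNet, repeat_step, feature_displacement) are assumptions made purely for illustration.

# Minimal, illustrative sketch of the self-supervised loop described in the abstract.
# Not the authors' code; shapes, loss, and heuristics are assumptions.
import cv2
import numpy as np
import torch
import torch.nn as nn


def feature_displacement(taught_img, current_img, prior_shift=0.0, window=64):
    """Estimate horizontal image shift from ORB feature matches,
    keeping only matches close to the prior suggested by the network."""
    orb = cv2.ORB_create(500)
    k1, d1 = orb.detectAndCompute(taught_img, None)
    k2, d2 = orb.detectAndCompute(current_img, None)
    if d1 is None or d2 is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    shifts = []
    for m in matcher.match(d1, d2):
        dx = k2[m.trainIdx].pt[0] - k1[m.queryIdx].pt[0]
        if abs(dx - prior_shift) < window:      # coarse prior narrows the search
            shifts.append(dx)
    return float(np.median(shifts)) if shifts else None


class SiameseShiftNet(nn.Module):
    """Tiny Siamese CNN regressing the horizontal shift between two images."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 8, 5, stride=2), nn.ReLU(),
            nn.Conv2d(8, 16, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten())
        self.head = nn.Linear(2 * 16 * 4 * 4, 1)

    def forward(self, a, b):
        return self.head(torch.cat([self.encoder(a), self.encoder(b)], dim=1))


def to_tensor(img):
    """Grayscale uint8 image -> [1, 1, H, W] float tensor in [0, 1]."""
    return torch.from_numpy(img).float().unsqueeze(0).unsqueeze(0) / 255.0


net = SiameseShiftNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()


def repeat_step(taught_img, current_img):
    """One traversal step: the network gives a coarse prior, features refine it,
    and the refined shift becomes a self-supervised training target."""
    a, b = to_tensor(taught_img), to_tensor(current_img)
    prior = net(a, b).item()
    shift = feature_displacement(taught_img, current_img, prior_shift=prior)
    if shift is not None:                       # train only when features agree
        opt.zero_grad()
        loss_fn(net(a, b), torch.tensor([[shift]])).backward()
        opt.step()
    return shift if shift is not None else prior

The sketch captures only the mutual-supervision idea (matcher supervises network, network seeds matcher); the full pipeline in the paper fuses the two registration schemes in a more elaborate way and handles confidence and failure cases that are omitted here.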

Identifiers

pubmed: 35458823
pii: s22082836
doi: 10.3390/s22082836
pmc: PMC9032253

Publication types

Journal Article

Languages

eng

Citation subsets

IM

Grants

Agency: Czech Science Foundation
ID: 20-27034J

Authors

Tomáš Rouček (T)

Artificial Intelligence Center, Faculty of Electrical Engineering, Czech Technical University in Prague, 166 27 Prague 6, Czech Republic.

Arash Sadeghi Amjadi (AS)

Artificial Intelligence Center, Faculty of Electrical Engineering, Czech Technical University in Prague, 166 27 Prague 6, Czech Republic.

Zdeněk Rozsypálek (Z)

Artificial Intelligence Center, Faculty of Electrical Engineering, Czech Technical University in Prague, 166 27 Prague 6, Czech Republic.

George Broughton (G)

Artificial Intelligence Center, Faculty of Electrical Engineering, Czech Technical University in Prague, 166 27 Prague 6, Czech Republic.

Jan Blaha (J)

Artificial Intelligence Center, Faculty of Electrical Engineering, Czech Technical University in Prague, 166 27 Prague 6, Czech Republic.

Keerthy Kusumam (K)

Department of Computer Science, University of Nottingham, Jubilee Campus, 7301 Wollaton Rd, Lenton, Nottingham NG8 1BB, UK.

Tomáš Krajník (T)

Artificial Intelligence Center, Faculty of Electrical Engineering, Czech Technical University in Prague, 166 27 Prague 6, Czech Republic.
