Environment Classification for Robotic Leg Prostheses and Exoskeletons Using Deep Convolutional Neural Networks.

Keywords

artificial intelligence; biomechatronics; computer vision; deep learning; exoskeletons; prosthetics; rehabilitation robotics; wearables

Journal

Frontiers in neurorobotics
ISSN: 1662-5218
Abbreviated title: Front Neurorobot
Country: Switzerland
NLM ID: 101477958

Publication Information

Publication date:
2021
History:
Received: 25 June 2021
Accepted: 20 December 2021
Entrez: 21 February 2022
PubMed: 22 February 2022
MEDLINE: 22 February 2022
Status: epublish

Abstract

Robotic leg prostheses and exoskeletons can provide powered locomotor assistance to older adults and/or persons with physical disabilities. However, the current locomotion mode recognition systems being developed for automated high-level control and decision-making rely on mechanical, inertial, and/or neuromuscular sensors, which inherently have limited prediction horizons (i.e., analogous to walking blindfolded). Inspired by the human vision-locomotor control system, we developed an environment classification system powered by computer vision and deep learning to predict the oncoming walking environments prior to physical interaction, thereby allowing for more accurate and robust high-level control decisions. In this study, we first reviewed the development of our "ExoNet" database, the largest and most diverse open-source dataset of wearable camera images of indoor and outdoor real-world walking environments, which were annotated using a hierarchical labeling architecture. We then trained and tested over a dozen state-of-the-art deep convolutional neural networks (CNNs) on the ExoNet database for image classification and automatic feature engineering, including EfficientNetB0, InceptionV3, MobileNet, MobileNetV2, VGG16, VGG19, Xception, ResNet50, ResNet101, ResNet152, DenseNet121, DenseNet169, and DenseNet201. Finally, we quantitatively compared the benchmarked CNN architectures and their environment classification predictions using an operational metric called "NetScore," which balances the image classification accuracy with the computational and memory storage requirements (i.e., important for onboard real-time inference with mobile computing devices). Our comparative analyses showed that the EfficientNetB0 network achieved the highest test accuracy; VGG16 the fastest inference time; and MobileNetV2 the best NetScore, which can inform the optimal architecture design or selection depending on the desired performance. Overall, this study provides a large-scale benchmark and reference for next-generation environment classification systems for robotic leg prostheses and exoskeletons.
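To make the benchmarking setup and the NetScore trade-off described in the abstract concrete, the following is a minimal Python sketch, not the authors' released code. It assumes TensorFlow/Keras ImageNet-pretrained backbones (the architectures named above are available in tf.keras.applications), 224 x 224 RGB inputs, a hypothetical 12-class environment label set, and the commonly used NetScore exponents alpha = 2, beta = 0.5, gamma = 0.5. All numeric values are illustrative and are not results reported in the paper.

# Sketch: (1) attach a new classification head to an ImageNet-pretrained CNN backbone
# for environment classification, and (2) compute a NetScore-style figure of merit that
# balances accuracy against parameter count and compute.
# Assumptions: TensorFlow/Keras, 224x224 RGB inputs, hypothetical class count and
# accuracy/MAC values; not the authors' implementation.
import math
import tensorflow as tf

NUM_CLASSES = 12   # hypothetical number of environment classes

def build_classifier(backbone_name="EfficientNetB0"):
    """Instantiate a pretrained backbone and add a softmax head for environment classes."""
    backbone_fn = getattr(tf.keras.applications, backbone_name)
    backbone = backbone_fn(include_top=False, weights="imagenet",
                           input_shape=(224, 224, 3), pooling="avg")
    outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(backbone.output)
    return tf.keras.Model(backbone.input, outputs, name=f"{backbone_name}_env")

def net_score(accuracy_pct, params_millions, macs_millions,
              alpha=2.0, beta=0.5, gamma=0.5):
    """NetScore-style metric: higher is better; rewards accuracy, penalizes size and compute."""
    return 20.0 * math.log10(accuracy_pct ** alpha /
                             (params_millions ** beta * macs_millions ** gamma))

if __name__ == "__main__":
    model = build_classifier("MobileNetV2")
    params_m = model.count_params() / 1e6
    # Illustrative numbers only: 90% test accuracy and 300M multiply-accumulate operations.
    print(f"{model.name}: {params_m:.1f}M params, "
          f"NetScore = {net_score(90.0, params_m, 300.0):.1f}")

In practice, the relative ranking produced by such a metric depends on the chosen exponents and on how parameters and compute are measured, which is why the paper reports accuracy, inference time, and NetScore separately so that an architecture can be selected for the desired performance trade-off.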

Identifiers

pubmed: 35185507
doi: 10.3389/fnbot.2021.730965
pmc: PMC8855111

Publication Types

Journal Article

Languages

eng

Pagination

730965

Copyright Information

Copyright © 2022 Laschowski, McNally, Wong and McPhee.

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.


Authors

Brokoslaw Laschowski (B)

Department of Systems Design Engineering, University of Waterloo, Waterloo, ON, Canada.
Waterloo Artificial Intelligence Institute, University of Waterloo, Waterloo, ON, Canada.

William McNally (W)

Department of Systems Design Engineering, University of Waterloo, Waterloo, ON, Canada.
Waterloo Artificial Intelligence Institute, University of Waterloo, Waterloo, ON, Canada.

Alexander Wong (A)

Department of Systems Design Engineering, University of Waterloo, Waterloo, ON, Canada.
Waterloo Artificial Intelligence Institute, University of Waterloo, Waterloo, ON, Canada.

John McPhee (J)

Department of Systems Design Engineering, University of Waterloo, Waterloo, ON, Canada.
Waterloo Artificial Intelligence Institute, University of Waterloo, Waterloo, ON, Canada.

MeSH Classifications