Learning personalized ADL recognition models from few raw data.
Activity of daily living
EHealth
Few-shot learning
Gated recurrent units
Inertial measurement unit
Matching networks
Journal
Artificial intelligence in medicine
ISSN: 1873-2860
Abbreviated title: Artif Intell Med
Country: Netherlands
NLM ID: 8915031
Publication information
Publication date: July 2020
History:
received: 2019-11-18
revised: 2020-04-25
accepted: 2020-06-23
entrez: 2020-08-24
pubmed: 2020-08-24
medline: 2021-08-19
Status: ppublish
Abstract
Recognition of activities of daily living (ADL) is an essential component of assisted living systems based on actigraphy. This task can nowadays be performed by machine learning models that automatically extract and learn relevant features but, most of the time, must be trained on large amounts of data collected from several users. In this paper, we propose an approach to learn personalized ADL recognition models from few raw data, based on a specific type of neural network called a matching network. The interest of this few-shot learning approach is three-fold. Firstly, people perform activities in their own way, so general models may average out important individual characteristics, whereas personalized models can preserve them and thus achieve better performance. Secondly, gathering large quantities of annotated data from one user is time-consuming and threatens privacy in a medical context. Thirdly, matching networks are by nature weakly dependent on the classes they are trained on and can generalize easily to new activities without extra training, making them very versatile for real applications. Our results show the effectiveness of the proposed approach compared to general neural network models, even in situations with little training data.
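The classification step of a matching network, as described in the abstract, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the embedding vectors stand in for features a trained encoder (e.g. a GRU over inertial-sensor windows) would produce, and the prediction is an attention-weighted vote over a few labeled "support" examples, so new activity classes only require new support examples, not retraining.

```python
import numpy as np

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8)

def matching_predict(query, support_embs, support_labels, n_classes):
    # Softmax attention over similarities to the support set,
    # then a weighted vote of the support labels.
    sims = np.array([cosine(query, s) for s in support_embs])
    attn = np.exp(sims) / np.exp(sims).sum()
    probs = np.zeros(n_classes)
    for a, y in zip(attn, support_labels):
        probs[y] += a
    return probs

# Toy support set: two labeled embeddings per class (hand-picked vectors,
# standing in for encoder outputs).
support = [np.array([1.0, 0.0, 0.0]), np.array([0.9, 0.1, 0.0]),
           np.array([0.0, 1.0, 0.0]), np.array([0.0, 0.9, 0.1]),
           np.array([0.0, 0.0, 1.0]), np.array([0.1, 0.0, 0.9])]
labels = [0, 0, 1, 1, 2, 2]

# A query embedding close to the class-1 examples.
query = np.array([0.0, 1.0, 0.05])
probs = matching_predict(query, support, labels, n_classes=3)
print(probs.argmax())  # → 1
```

Because the prediction is a function of similarities to the support set rather than fixed output weights, swapping in support examples for unseen activities changes the set of recognizable classes with no additional training, which is the versatility the abstract highlights.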
Identifiers
pubmed: 32828455
pii: S0933-3657(19)31137-6
doi: 10.1016/j.artmed.2020.101916
Publication type
Journal Article
Language
eng
Citation subset
IM
Pagination
101916
Copyright information
Copyright © 2020 Elsevier B.V. All rights reserved.