Ensemble or pool: A comprehensive study on transfer learning for c-VEP BCI during interpersonal interaction.
Brain–computer interface
Code-modulated visual evoked potential
Subject transfer framework
Transfer learning
Journal
Journal of Neuroscience Methods
ISSN: 1872-678X
Abbreviated title: J Neurosci Methods
Country: Netherlands
NLM ID: 7905558
Publication information
Publication date: 01 Sep 2020
History:
received: 17 Feb 2020
revised: 08 May 2020
accepted: 04 Jul 2020
pubmed: 10 Jul 2020
medline: 22 Jun 2021
entrez: 10 Jul 2020
Status: ppublish
Abstract
BACKGROUND
To reduce the calibration time of brain-computer interfaces (BCIs), or even to implement zero-training BCIs, researchers have been studying how to apply transfer learning effectively in this field. To thoroughly investigate the performance of transfer learning in BCI and the key factors that affect it, we carried out a comprehensive study.
NEW METHOD
In general, knowledge transfer in BCI is implemented in one of two ways: ensemble or pool. In this work, we propose two transfer approaches: the first transfers the information of all channels as a whole from the source subjects to a target subject, and the second transfers the information of corresponding channels between subjects. A subject transfer framework is built by combining these two approaches with the ensemble and pool strategies.
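For orientation only, the sketch below illustrates the two generic transfer strategies named above: pool (merge all source subjects' trials and fit a single model) and ensemble (fit one model per source subject and combine their decisions). It is not the authors' implementation: the c-VEP feature extraction, the channel-wise transfer variant, and the classifiers used in the paper are omitted, and the data layout (one trials-by-features array per subject) and the use of scikit-learn's LinearDiscriminantAnalysis are assumptions made for illustration.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def pool_transfer(source_data, source_labels, target_X):
        # "Pool": merge the trials of every source subject and fit one classifier.
        X = np.vstack(source_data)
        y = np.concatenate(source_labels)
        return LinearDiscriminantAnalysis().fit(X, y).predict(target_X)

    def ensemble_transfer(source_data, source_labels, target_X):
        # "Ensemble": fit one classifier per source subject, then average their
        # class probabilities for the target trials and pick the best class.
        # Assumes every source subject's labels cover the same class set, so
        # classes_ is ordered identically across the fitted models.
        clfs = [LinearDiscriminantAnalysis().fit(X, y)
                for X, y in zip(source_data, source_labels)]
        mean_proba = np.mean([c.predict_proba(target_X) for c in clfs], axis=0)
        return clfs[0].classes_[np.argmax(mean_proba, axis=1)]

The channel-wise transfer approach described above would apply the same pool or ensemble logic separately to the features of each corresponding channel rather than to all channels as a whole.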
RESULTS
We investigated the performance of eight implementations of this framework on a data set acquired in an interpersonal interaction (Chicken Game) experiment based on a code-modulated visual evoked potential (c-VEP) BCI. The results show that transfer learning generally provides acceptable classification performance. In addition, an in-depth analysis reveals that a target subject usually shares different brain-signal distributions with different source subjects; in fact, this is a hypothesis usually implied by this kind of research.
CONCLUSIONS
Transfer learning for c-VEP BCI is suitable for reducing calibration time, or for starting BCI recognition before sufficient data from the subject's own recordings are available. In addition, our findings support the validity of the hypothesis underlying knowledge transfer in BCI.
Identifiers
pubmed: 32645409
pii: S0165-0270(20)30278-8
doi: 10.1016/j.jneumeth.2020.108855
Publication types
Journal Article
Research Support, Non-U.S. Gov't
Languages
eng
Citation subsets
IM
Pagination
108855
Copyright information
Copyright © 2020 Elsevier B.V. All rights reserved.