NeuroVAD: Real-Time Voice Activity Detection from Non-Invasive Neuromagnetic Signals.
Keywords: LSTM-RNN; MEG; SVM; VAD; brain-computer interface; speech-BCI; wavelet
Journal
Sensors (Basel, Switzerland)
ISSN: 1424-8220
Abbreviated title: Sensors (Basel)
Country: Switzerland
ID NLM: 101204366
Publication information
Date of publication: 16 Apr 2020
History:
Received: 19 Mar 2020
Revised: 11 Apr 2020
Accepted: 14 Apr 2020
Entrez: 23 Apr 2020
PubMed: 23 Apr 2020
MEDLINE: 23 Feb 2021
Status: epublish
Abstract
Neural speech decoding-driven brain-computer interface (BCI), or speech-BCI, is a novel paradigm for restoring communication to locked-in (fully paralyzed but aware) patients. Speech-BCIs aim to map neural signals directly to text or speech, which has the potential for a higher communication rate than current BCIs. Although recent progress has demonstrated the potential of speech-BCIs from either invasive or non-invasive neural signals, the majority of the systems developed so far still assume that the onset and offset of the speech utterances within the continuous neural recordings are known. This lack of real-time voice/speech activity detection (VAD) is an obstacle to future applications of neural speech decoding in which BCI users hold continuous conversations with other speakers. To address this issue, in this study we attempted to detect voice/speech activity automatically and directly from neural signals recorded using magnetoencephalography (MEG). First, we classified whole segments of pre-speech, speech, and post-speech in the neural signals using a support vector machine (SVM). Second, for continuous prediction, we used a long short-term memory recurrent neural network (LSTM-RNN) to decode voice activity at each time point via its sequential pattern-learning mechanism. Experimental results demonstrated the feasibility of real-time VAD directly from non-invasive neural signals with approximately 88% accuracy.
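The continuous-prediction step described above, decoding a voice-activity label at each time point with an LSTM, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the feature dimensions, random weights, and the `lstm_vad` helper are placeholders standing in for learned parameters over MEG features.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def dot(w, v):
    return sum(wi * vi for wi, vi in zip(w, v))

def lstm_vad(X, params):
    """X: list of T feature vectors; returns T per-time-point speech probabilities."""
    Wi, Wf, Wo, Wg, w_out, b_out = params
    h_dim = len(Wi)
    h = [0.0] * h_dim  # hidden state
    c = [0.0] * h_dim  # cell state
    probs = []
    for x in X:
        xh = x + h  # concatenate current input with previous hidden state
        i = [sigmoid(dot(row, xh)) for row in Wi]    # input gate
        f = [sigmoid(dot(row, xh)) for row in Wf]    # forget gate
        o = [sigmoid(dot(row, xh)) for row in Wo]    # output gate
        g = [math.tanh(dot(row, xh)) for row in Wg]  # candidate cell state
        c = [fj * cj + ij * gj for fj, cj, ij, gj in zip(f, c, i, g)]
        h = [oj * math.tanh(cj) for oj, cj in zip(o, c)]
        probs.append(sigmoid(dot(w_out, h) + b_out))  # VAD probability at this time point
    return probs

# Illustrative dimensions: d input features per time point, T time points.
d, h_dim, T = 4, 8, 20
rand_mat = lambda rows, cols: [[random.uniform(-0.1, 0.1) for _ in range(cols)]
                               for _ in range(rows)]
params = (rand_mat(h_dim, d + h_dim), rand_mat(h_dim, d + h_dim),
          rand_mat(h_dim, d + h_dim), rand_mat(h_dim, d + h_dim),
          [random.uniform(-0.1, 0.1) for _ in range(h_dim)], 0.0)

X = [[random.gauss(0, 1) for _ in range(d)] for _ in range(T)]  # stand-in for MEG features
probs = lstm_vad(X, params)
labels = [int(p > 0.5) for p in probs]  # 1 = speech, 0 = non-speech at each time point
```

Because the hidden and cell states are updated one time step at a time, this formulation supports streaming (real-time) prediction: each incoming feature frame yields a voice-activity decision without waiting for the end of the utterance.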
Identifiers
pubmed: 32316162
pii: s20082248
doi: 10.3390/s20082248
pmc: PMC7218843
Publication types
Journal Article
Languages
eng
Citation subsets
IM
Grants
Agency: University of Texas System Brain Research Grant
ID: 362221
Agency: NIH HHS
ID: R03DC013990; R01DC016621
Country: United States