Measurement of the N170 during facial neuromuscular electrical stimulation (fNMES).
Artefacts
EEG
Electrical stimulation
fNMES
N170
Journal
Journal of neuroscience methods
ISSN: 1872-678X
Abbreviated title: J Neurosci Methods
Country: Netherlands
NLM ID: 7905558
Publication information
Publication date: 2023-06-01
History:
received: 2023-01-26
revised: 2023-04-06
accepted: 2023-05-08
medline: 2023-06-14
pubmed: 2023-05-12
entrez: 2023-05-11
Status: ppublish
Abstract sections
BACKGROUND
Studies on facial feedback effects typically employ props or posed facial expressions, which often lack temporal precision and muscle specificity.
NEW METHOD
Facial Neuromuscular Electrical Stimulation (fNMES) allows for a controlled influence of contractions of facial muscles, and may be used to advance our understanding of facial feedback effects, especially when combined with Electroencephalography (EEG). However, electrical stimulation introduces significant interference that can mask underlying brain dynamics. Whether established signal processing methods can allow for a reduction of said interference whilst retaining effects of interest, remains unexplored.
RESULTS
We addressed these questions focusing on the classic N170 visual evoked potential, a face-sensitive brain component: 20 participants viewed images of houses, and of sad, happy, and neutral faces. On half of the trials, fNMES was delivered to bilateral lower-face muscles during the presentation of visual stimuli. A larger N170 amplitude was found for faces relative to houses. Interestingly, this was the case both without and during fNMES, regardless of whether the fNMES artefact was removed or not. Moreover, sad facial expressions elicited a larger N170 amplitude relative to neutral facial expressions, both with and without fNMES.
COMPARISON WITH EXISTING METHODS
fNMES offers a more precise way of manipulating proprioceptive feedback from facial muscles, which affords greater diversity in experimental design for studies on facial feedback effects.
CONCLUSIONS
We show that combining fNMES and EEG is feasible and may serve as a powerful means of exploring the impact of controlled proprioceptive inputs on various types of cognitive processing.
Identifiers
pubmed: 37169226
pii: S0165-0270(23)00096-1
doi: 10.1016/j.jneumeth.2023.109877
Publication types
Journal Article
Research Support, Non-U.S. Gov't
Languages
eng
Citation subsets
IM
Pagination: 109877
Copyright information
Copyright © 2023 The Authors. Published by Elsevier B.V. All rights reserved.
Conflict of interest statement
The authors declare no conflict of interest.