Hearing, seeing, and feeling speech: the neurophysiological correlates of trimodal speech perception.

Keywords: EEG; audio-tactile speech perception; audio-visual speech perception; auditory evoked potentials; multisensory integration; trimodal speech perception

Journal

Frontiers in Human Neuroscience
ISSN: 1662-5161
Abbreviated title: Front Hum Neurosci
Country: Switzerland
NLM ID: 101477954

Publication information

Publication date:
2023
History:
Received: 20 May 2023
Accepted: 8 August 2023
MEDLINE: 14 September 2023
PubMed: 14 September 2023
Entrez: 14 September 2023
Status: epublish

Abstract

To perceive speech, our brains process information from different sensory modalities. Previous electroencephalography (EEG) research has established that audio-visual information provides an advantage compared to auditory-only information during early auditory processing. In addition, behavioral research has shown that auditory speech perception is enhanced not only by visual information but also by tactile information, transmitted by puffs of air arriving at the skin in alignment with the speech signal. The current EEG study aimed to investigate whether the behavioral benefits of bimodal audio-aerotactile and trimodal audio-visual-aerotactile speech presentation are reflected in cortical auditory event-related neurophysiological responses. To examine the influence of multimodal information on speech perception, 20 listeners completed a two-alternative forced-choice syllable identification task at three different signal-to-noise ratios. Behavioral results showed increased syllable identification accuracy when auditory information was complemented with visual information, but no corresponding effect for the addition of tactile information. Similarly, EEG results showed an amplitude suppression of the auditory N1 and P2 event-related potentials for the audio-visual and audio-visual-aerotactile modalities compared to the auditory and audio-aerotactile presentations of the syllable /pa/. No statistically significant difference was present between the audio-aerotactile and auditory-only modalities. The current findings are consistent with past EEG research showing a visually induced amplitude suppression during early auditory processing. In addition, the significant neurophysiological effect of audio-visual but not audio-aerotactile presentation is in line with the large benefit of visual information, and the comparatively much smaller effect of aerotactile information, on auditory speech perception previously identified in behavioral research.
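
For readers who want to relate the reported N1/P2 suppression to a concrete analysis, the sketch below shows one plausible way to extract mean N1 and P2 amplitudes per presentation modality from epoched EEG data with MNE-Python. It is a minimal illustration, not the authors' pipeline: the file name, trigger codes, electrode choice, and component time windows are assumptions.

```python
# Minimal sketch (not the authors' pipeline) of extracting N1/P2 mean
# amplitudes per modality with MNE-Python. File name, event codes,
# electrode, and time windows are illustrative assumptions.
import mne

# Load a hypothetical preprocessed recording and band-pass for ERPs.
raw = mne.io.read_raw_fif("sub-01_syllables_raw.fif", preload=True)
raw.filter(l_freq=0.1, h_freq=30.0)

# Assumed trigger codes for the four presentation modalities.
event_id = {"A": 1, "AV": 2, "AT": 3, "AVT": 4}
events = mne.find_events(raw)  # requires a stim channel in the recording

# Epoch around syllable onset with a pre-stimulus baseline.
epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=-0.2, tmax=0.5, baseline=(-0.2, 0.0),
                    preload=True)

# Mean amplitude at Cz in assumed N1 (80-150 ms) and P2 (150-250 ms) windows.
for cond in event_id:
    evoked = epochs[cond].average().pick(["Cz"])
    n1 = evoked.copy().crop(0.08, 0.15).data.mean() * 1e6  # volts -> microvolts
    p2 = evoked.copy().crop(0.15, 0.25).data.mean() * 1e6
    print(f"{cond}: N1 = {n1:.2f} uV, P2 = {p2:.2f} uV")
```

In such an analysis, the visually induced suppression reported above would appear as N1/P2 amplitudes closer to zero for the audio-visual (AV) and audio-visual-aerotactile (AVT) conditions than for the auditory-only (A) and audio-aerotactile (AT) conditions.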

Identifiers

PubMed: 37706173
DOI: 10.3389/fnhum.2023.1225976
PMC: PMC10495990

Publication types

Journal Article

Languages

eng

Pagination

1225976

Copyright information

Copyright © 2023 Hansmann, Derrick and Theys.

Conflict of interest statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.


Authors

Doreen Hansmann (D)

School of Psychology, Speech and Hearing, University of Canterbury, Christchurch, New Zealand.

Donald Derrick (D)

New Zealand Institute of Language, Brain and Behaviour, University of Canterbury, Christchurch, New Zealand.

Catherine Theys (C)

School of Psychology, Speech and Hearing, University of Canterbury, Christchurch, New Zealand.
New Zealand Institute of Language, Brain and Behaviour, University of Canterbury, Christchurch, New Zealand.
