Modulated stimuli demonstrate asymmetric interactions between hearing and vision.
Journal
Scientific Reports
ISSN: 2045-2322
Abbreviated title: Sci Rep
Country: England
NLM ID: 101563288
Publication information
Date of publication: 2019-05-20
History:
received: 2018-07-27
accepted: 2019-05-08
entrez: 2019-05-22
pubmed: 2019-05-22
medline: 2020-10-21
Status: epublish
Abstract
The nature of interactions between the senses is a topic of intense interest in neuroscience, but an unresolved question is how sensory information from hearing and vision is combined when the two senses interact. A problem for testing auditory-visual interactions is devising stimuli and tasks that are equivalent in both modalities. Here we report a novel paradigm in which we first equated the discriminability of the stimuli in each modality, then tested how a distractor in the other modality affected performance. Participants discriminated pairs of amplitude-modulated tones or size-modulated visual objects in the form of a cuboid shape, alone or when a similarly modulated distractor stimulus of the other modality occurred with one of the pair. Discrimination of sound modulation depth was affected by a modulated cuboid only when their modulation rates were the same. In contrast, discrimination of cuboid modulation depth was little affected by an equivalently modulated sound. Our results suggest that what observers perceive when auditory and visual signals interact is not determined simply by the discriminability of the individual sensory inputs, but also by factors that increase the perceptual binding of these inputs, such as temporal synchrony.
Identifiers
pubmed: 31110202
doi: 10.1038/s41598-019-44079-5
pii: 10.1038/s41598-019-44079-5
pmc: PMC6527605
Publication types
Journal Article
Research Support, Non-U.S. Gov't
Languages
eng
Citation subsets
IM
Pagination
7605
Grants
Agency: Wellcome Trust
Country: United Kingdom
Agency: Wellcome Trust
ID: 102558/Z/13/Z
Country: United Kingdom