Shape detection beyond the visual field using a visual-to-auditory sensory augmentation device.

Keywords

auditory spatial perception; multisensory perception; multisensory spatial perception; sensory substitution; sensory substitution device (SSD); spatial perception; visual-auditory; visual-spatial perception

Journal

Frontiers in human neuroscience
ISSN: 1662-5161
Abbreviated title: Front Hum Neurosci
Country: Switzerland
NLM ID: 101477954

Publication information

Publication date:
2023
History:
received: 2022-09-30
accepted: 2023-01-09
entrez: 2023-03-20
pubmed: 2023-03-21
medline: 2023-03-21
Status: epublish

Abstract

Current advancements in technology and science allow us to manipulate our sensory modalities in new and unexpected ways. In the present study, we explore the potential of expanding what we perceive through our natural senses by utilizing a visual-to-auditory sensory substitution device (SSD), the EyeMusic, an algorithm that converts images to sound. The EyeMusic was initially developed to allow blind individuals to create a spatial representation of information arriving from a video feed at a slow sampling rate. Here, in an initial proof-of-concept study, we used the EyeMusic to cover the areas outside the visual field of sighted individuals, testing their ability to combine visual information with surrounding auditory sonification representing visual information. Participants were tasked with recognizing and adequately placing stimuli, using sound to represent the areas outside the standard human visual field. They were asked to report each shape's identity as well as its spatial orientation (front/right/back/left), so that successful performance required combining visual (90° frontal) and auditory (the remaining 270°) input; content in both vision and audition was presented in a sweeping clockwise motion around the participant. After a brief 1-h online training session and one on-site training session averaging 20 min, participants performed well above chance, and in some cases could even draw a 2D representation of the image. Participants could also generalize, recognizing new shapes they were not explicitly trained on. Our findings provide an initial proof of concept that sensory augmentation devices and techniques can be combined with natural sensory information to expand the natural fields of sensory perception.
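The abstract describes the image-to-sound conversion only at a high level. As a rough illustration of how such a visual-to-auditory mapping can work, the sketch below sonifies a binary image by sweeping its columns left to right in time and mapping higher rows to higher pitches. The function name, parameters, and frequency mapping are illustrative assumptions for this sketch, not the EyeMusic's actual encoding (which, among other things, uses musical notes played by different instruments).

```python
import numpy as np

def sonify_image(image, sweep_duration=2.0, f_min=220.0, semitones_per_row=2):
    """Simplified column-sweep sonification (illustrative, not the
    EyeMusic's published encoding): each column becomes a time slice,
    left-to-right becomes early-to-late, and each active pixel becomes
    a tone whose pitch rises with height in the image."""
    n_rows, n_cols = image.shape
    events = []  # (onset time in seconds, tone frequency in Hz)
    for col in range(n_cols):
        t = col / n_cols * sweep_duration  # onset time of this column
        for row in range(n_rows):
            if image[row, col]:
                # row 0 is the top of the image -> highest pitch
                steps = (n_rows - 1 - row) * semitones_per_row
                freq = f_min * 2 ** (steps / 12)  # equal-tempered spacing
                events.append((round(t, 3), round(freq, 1)))
    return events

# A 5x5 "L" shape: vertical bar on the left, horizontal bar at the bottom
img = np.zeros((5, 5), dtype=bool)
img[:, 0] = True
img[4, :] = True
events = sonify_image(img)
```

Listening to the resulting tone sequence, a vertical bar sounds as a chord at one moment in the sweep, while a horizontal bar sounds as a single sustained pitch across the sweep; this is the kind of spectro-temporal signature participants learn to associate with shapes.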

Identifiers

pubmed: 36936618
doi: 10.3389/fnhum.2023.1058617
pmc: PMC10017858

Publication types

Journal Article

Languages

eng

Pagination

1058617

Copyright information

Copyright © 2023 Shvadron, Snir, Maimon, Yizhar, Harel, Poradosu and Amedi.

Conflict of interest statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Authors

Shira Shvadron (S)

Baruch Ivcher School of Psychology, The Baruch Ivcher Institute for Brain, Cognition, and Technology, Reichman University, Herzliya, Israel.
The Ruth and Meir Rosenthal, Brain Imaging Center, Reichman University, Herzliya, Israel.

Adi Snir (A)

Baruch Ivcher School of Psychology, The Baruch Ivcher Institute for Brain, Cognition, and Technology, Reichman University, Herzliya, Israel.
The Ruth and Meir Rosenthal, Brain Imaging Center, Reichman University, Herzliya, Israel.

Amber Maimon (A)

Baruch Ivcher School of Psychology, The Baruch Ivcher Institute for Brain, Cognition, and Technology, Reichman University, Herzliya, Israel.
The Ruth and Meir Rosenthal, Brain Imaging Center, Reichman University, Herzliya, Israel.

Or Yizhar (O)

Baruch Ivcher School of Psychology, The Baruch Ivcher Institute for Brain, Cognition, and Technology, Reichman University, Herzliya, Israel.
The Ruth and Meir Rosenthal, Brain Imaging Center, Reichman University, Herzliya, Israel.
Research Group Adaptive Memory and Decision Making, Max Planck Institute for Human Development, Berlin, Germany.
Max Planck Dahlem Campus of Cognition (MPDCC), Max Planck Institute for Human Development, Berlin, Germany.

Sapir Harel (S)

Baruch Ivcher School of Psychology, The Baruch Ivcher Institute for Brain, Cognition, and Technology, Reichman University, Herzliya, Israel.
The Ruth and Meir Rosenthal, Brain Imaging Center, Reichman University, Herzliya, Israel.

Keinan Poradosu (K)

Baruch Ivcher School of Psychology, The Baruch Ivcher Institute for Brain, Cognition, and Technology, Reichman University, Herzliya, Israel.
The Ruth and Meir Rosenthal, Brain Imaging Center, Reichman University, Herzliya, Israel.
Weizmann Institute of Science, Rehovot, Israel.

Amir Amedi (A)

Baruch Ivcher School of Psychology, The Baruch Ivcher Institute for Brain, Cognition, and Technology, Reichman University, Herzliya, Israel.
The Ruth and Meir Rosenthal, Brain Imaging Center, Reichman University, Herzliya, Israel.

MeSH classifications