Identification of Language-Induced Mental Load from Eye Behaviors in Virtual Reality.
Keywords
cognitive load
eye tracking
flow state
language task
listening comprehension
mental load
virtual reality
Journal
Sensors (Basel, Switzerland)
ISSN: 1424-8220
Abbreviated title: Sensors (Basel)
Country: Switzerland
NLM ID: 101204366
Publication information
Publication date: 25 Jul 2023
History:
received: 15 May 2023
revised: 3 Jul 2023
accepted: 14 Jul 2023
medline: 16 Aug 2023
pubmed: 12 Aug 2023
entrez: 12 Aug 2023
Status:
epublish
Abstract
Experiences of virtual reality (VR) can easily break if the method of evaluating subjective user states is intrusive. Behavioral measures are increasingly used to avoid this problem. One such measure is eye tracking, which has recently become more standard in VR and is often used for content-dependent analyses. This research explores content-independent eye metrics, such as pupil size and blinks, for identifying mental load in VR users. We induced mental load independently of the visuals through auditory stimuli. We also defined and measured a new eye metric, focus offset, which seeks to capture the phenomenon of "staring into the distance" without focusing on a specific surface. In the experiment, VR-experienced participants listened to two native-language and two foreign-language stimuli inside a virtual phone booth. The results show that with increasing mental load, relative pupil size increased by 0.512 SDs (0.118 mm) on average, with 57% reduced variance. To a lesser extent, mental load led to fewer fixations, less voluntary gazing at distracting content, and a larger focus offset, as if looking through surfaces (about 0.343 SDs, 5.10 cm). These results are in agreement with previous studies. Overall, we encourage further research on content-independent eye metrics, and we hope that future hardware and algorithms will further increase tracking stability.
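As a rough illustration of how such content-independent metrics could be computed, here is a minimal sketch in Python. It assumes per-sample pupil diameters and estimated gaze depths from an eye tracker; the function names, signatures, and synthetic data are hypothetical illustrations, not the paper's actual pipeline.

```python
import numpy as np

def relative_pupil_size(task_mm: np.ndarray, baseline_mm: np.ndarray) -> np.ndarray:
    """Pupil diameter z-scored against a resting baseline, so the metric
    is comparable across participants (units: standard deviations)."""
    return (task_mm - baseline_mm.mean()) / baseline_mm.std(ddof=1)

def focus_offset(gaze_depth_m: np.ndarray, surface_depth_m: np.ndarray) -> np.ndarray:
    """Signed distance between the estimated 3D focus point and the surface
    the gaze ray hits; positive values correspond to focusing 'behind'
    (through) the surface, as when staring into the distance."""
    return gaze_depth_m - surface_depth_m

# Illustrative use with synthetic samples (not the study's data):
rng = np.random.default_rng(seed=1)
baseline = rng.normal(3.5, 0.25, size=1000)   # resting pupil diameter, mm
task = rng.normal(3.62, 0.15, size=1000)      # diameter under auditory load, mm
z = relative_pupil_size(task, baseline)
print(f"mean relative pupil size: {z.mean():+.3f} SD")
```

Under this reading, the abstract's reported 0.512 SD shift corresponding to 0.118 mm implies a baseline pupil-size standard deviation of roughly 0.23 mm; likewise, a 0.343 SD focus offset of 5.10 cm implies a spread of about 15 cm in estimated gaze depth.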
Identifiers
pubmed: 37571449
pii: s23156667
doi: 10.3390/s23156667
pmc: PMC10422404
Publication types
Journal Article
Languages
eng
Citation subsets
IM
References
Appl Ergon. 2019 Nov;81:102883
pubmed: 31422246
Neuroimage. 2016 Apr 1;129:25-39
pubmed: 26673115
Front Psychol. 2017 Jun 30;8:1092
pubmed: 28713304
Behav Res Methods. 2020 Oct;52(5):2232-2255
pubmed: 32291732
Psychol Bull. 1982 Mar;91(2):276-92
pubmed: 7071262
Cogn Affect Behav Neurosci. 2016 Aug;16(4):601-15
pubmed: 27038165
PLoS One. 2011 Mar 25;6(3):e18298
pubmed: 21464969
Front Hum Neurosci. 2021 Feb 26;15:593108
pubmed: 33716689
IEEE Trans Vis Comput Graph. 2017 Nov;23(11):2378-2388
pubmed: 28809700
Atten Percept Psychophys. 2020 Oct;82(7):3432-3444
pubmed: 32500390
Front Psychol. 2022 Apr 07;13:815665
pubmed: 35465560
J Vis. 2008 Mar 28;8(3):33.1-30
pubmed: 18484839
Front Robot AI. 2020 Feb 21;7:20
pubmed: 33501189
Optom Vis Sci. 2003 Jun;80(6):467-73
pubmed: 12808408
Cogn Sci. 2021 Apr;45(4):e12977
pubmed: 33877694
IEEE Trans Vis Comput Graph. 2018 Apr;24(4):1633-1642
pubmed: 29553930
Behav Res Methods. 2018 Apr;50(2):834-852
pubmed: 28593606
Conscious Cogn. 2017 Aug;53:165-175
pubmed: 28689088
J Eye Mov Res. 2019 Apr 05;12(1):
pubmed: 33828721
Front Psychol. 2021 Dec 31;12:650693
pubmed: 35035362
Psychon Bull Rev. 2015 Dec;22(6):1814-9
pubmed: 26268431