EyeT4Empathy: Dataset of foraging for visual information, gaze typing and empathy assessment.
Journal
Scientific Data
ISSN: 2052-4463
Abbreviated title: Sci Data
Country: England
NLM ID: 101640192
Publication information
Publication date: 3 December 2022
History:
received: 22 February 2022
accepted: 23 November 2022
entrez: 3 December 2022
pubmed: 4 December 2022
medline: 7 December 2022
Status: epublish
Abstract
We present a dataset of eye-movement recordings collected from 60 participants, along with their empathy levels towards people with movement impairments. During each round of gaze recording, participants were divided into two groups, each completing one task. One group performed free exploration of structureless images, and the second group performed gaze typing, i.e. writing sentences using eye-gaze movements on a cardboard. The eye-tracking data recorded during both tasks are stored in two datasets which, besides gaze position, also include pupil diameter measurements. The empathy levels of participants towards non-verbal movement-impaired people were assessed twice through a questionnaire, before and after each task. The questionnaire comprises forty questions, extending an established questionnaire of cognitive and affective empathy. Finally, our dataset presents an opportunity for analysing and evaluating, among others, the statistical features of eye-gaze trajectories in free viewing, as well as how empathy is reflected in eye features.
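The kind of trajectory statistics the abstract alludes to (e.g. step-length distributions of free-viewing gaze traces, which later references examine for Lévy-like behaviour) can be sketched as follows. This is a hedged illustration on synthetic data, not the authors' code; the real recordings are the Figshare datasets listed in the data records, and all variable names here are assumptions.

```python
import numpy as np

# Synthesise a short gaze trace in place of the published recordings,
# which store gaze position and pupil diameter per sample.
rng = np.random.default_rng(0)
gaze_x = np.cumsum(rng.normal(0.0, 5.0, size=1000))  # horizontal gaze position (px)
gaze_y = np.cumsum(rng.normal(0.0, 5.0, size=1000))  # vertical gaze position (px)
pupil = 3.0 + 0.1 * rng.standard_normal(1000)        # pupil diameter (mm, illustrative)

# Euclidean step lengths between consecutive gaze samples: the quantity
# whose distribution is inspected when testing for Levy-like statistics.
steps = np.hypot(np.diff(gaze_x), np.diff(gaze_y))

print(f"{steps.size} steps, mean {steps.mean():.2f} px, max {steps.max():.2f} px")
```

Replacing the synthetic arrays with columns read from the Figshare files would apply the same analysis to the actual recordings.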
Identifiers
pubmed: 36463232
doi: 10.1038/s41597-022-01862-w
pii: 10.1038/s41597-022-01862-w
pmc: PMC9719458
Publication types
Dataset
Journal Article
Languages
eng
Citation subsets
IM
Pagination
752
Copyright information
© 2022. The Author(s).
References
Zamani, H., Abas, A. & Amin, M. Eye tracking application on emotion analysis for marketing strategy. Journal of Telecommunication, Electronic and Computer Engineering (JTEC) 8, 87–91 (2016).
Wang, L. Test and evaluation of advertising effect based on EEG and eye tracker. Translational Neuroscience 10, 14–18 (2019).
pubmed: 31098306
pmcid: 6487787
doi: 10.1515/tnsci-2019-0003
Neomániová, K. et al. The use of eye-tracker and face reader as useful consumer neuroscience tools within logo creation. Acta Universitatis Agriculturae et Silviculturae Mendelianae Brunensis 67, 1061–1070 (2019).
doi: 10.11118/actaun201967041061
Hessels, R. S. & Hooge, I. T. Eye tracking in developmental cognitive neuroscience–the good, the bad and the ugly. Developmental Cognitive Neuroscience 40, 100710 (2019).
pubmed: 31593909
pmcid: 6974897
doi: 10.1016/j.dcn.2019.100710
Hu, Z. et al. DGaze: CNN-based gaze prediction in dynamic scenes. IEEE Transactions on Visualization and Computer Graphics 26, 1902–1911 (2020).
pubmed: 32070980
doi: 10.1109/TVCG.2020.2973473
Clay, V., König, P. & Koenig, S. Eye tracking in virtual reality. Journal of Eye Movement Research 12 (2019).
Ulahannan, A., Jennings, P., Oliveira, L. & Birrell, S. Designing an adaptive interface: Using eye tracking to classify how information usage changes over time in partially automated vehicles. IEEE Access 8, 16865–16875 (2020).
doi: 10.1109/ACCESS.2020.2966928
Spataro, R., Ciriacono, M., Manno, C. & La Bella, V. The eye-tracking computer device for communication in amyotrophic lateral sclerosis. Acta Neurologica Scandinavica 130, 40–45 (2014).
pubmed: 24350578
doi: 10.1111/ane.12214
Loch, F. et al. An adaptive virtual training system based on universal design. IFAC-PapersOnLine 51, 335–340 (2019).
doi: 10.1016/j.ifacol.2019.01.023
Burrell, J., Hornberger, M., Carpenter, R., Kiernan, M. & Hodges, J. Saccadic abnormalities in frontotemporal dementia. Neurology 78, 1816–1823 (2012).
pubmed: 22573637
doi: 10.1212/WNL.0b013e318258f75c
Perneczky, R. et al. Saccadic latency in Parkinson’s disease correlates with executive function and brain atrophy, but not motor severity. Neurobiology of Disease 43, 79–85 (2011).
pubmed: 21310235
pmcid: 3102178
doi: 10.1016/j.nbd.2011.01.032
Antoniades, C. A., Xu, Z., Mason, S. L., Carpenter, R. & Barker, R. A. Huntington's disease: changes in saccades and hand-tapping over 3 years. Journal of Neurology 257, 1890–1898 (2010).
pubmed: 20585954
doi: 10.1007/s00415-010-5632-2
Chandna, A., Chandrasekharan, D. P., Ramesh, A. V. & Carpenter, R. Altered interictal saccadic reaction time in migraine: a cross-sectional study. Cephalalgia 32, 473–480 (2012).
pubmed: 22492423
doi: 10.1177/0333102412441089
Pouget, P., Wattiez, N., Rivaud-Péchoux, S. & Gaymard, B. Rapid development of tolerance to sub-anaesthetic dose of ketamine: An oculomotor study in macaque monkeys. Psychopharmacology 209, 313–318 (2010).
pubmed: 20309530
doi: 10.1007/s00213-010-1797-8
Antoniades, C. et al. An internationally standardised antisaccade protocol. Vision Research 84, 1–5 (2013).
pubmed: 23474300
doi: 10.1016/j.visres.2013.02.007
Rucci, M. & Poletti, M. Control and functions of fixational eye movements. Annual review of vision science 1, 499–518 (2015).
pubmed: 27795997
pmcid: 5082990
doi: 10.1146/annurev-vision-082114-035742
Caligari, M., Godi, M., Guglielmetti, S., Franchignoni, F. & Nardone, A. Eye tracking communication devices in amyotrophic lateral sclerosis: Impact on disability and quality of life. Amyotrophic Lateral Sclerosis and Frontotemporal Degeneration 14, 546–552 (2013).
pubmed: 23834069
doi: 10.3109/21678421.2013.803576
Proudfoot, M. et al. Eye-tracking in amyotrophic lateral sclerosis: A longitudinal study of saccadic and cognitive tasks. Amyotrophic Lateral Sclerosis and Frontotemporal Degeneration 17, 101–111 (2016).
doi: 10.3109/21678421.2015.1054292
Otero, S. C., Weekes, B. S. & Hutton, S. B. Pupil size changes during recognition memory. Psychophysiology 48, 1346–1353 (2011).
pubmed: 21575007
doi: 10.1111/j.1469-8986.2011.01217.x
Kret, M. E. The role of pupil size in communication. Is there room for learning? Cognition and Emotion 32, 1139–1145 (2018).
pubmed: 28857664
doi: 10.1080/02699931.2017.1370417
Kret, M. E. & Sjak-Shie, E. E. Preprocessing pupil size data: Guidelines and code. Behavior Research Methods 51, 1336–1342 (2019).
pubmed: 29992408
doi: 10.3758/s13428-018-1075-y
Harrison, N. A., Wilson, C. E. & Critchley, H. D. Processing of observed pupil size modulates perception of sadness and predicts empathy. Emotion 7, 724 (2007).
pubmed: 18039039
doi: 10.1037/1528-3542.7.4.724
Egawa, S., Sejima, Y., Sato, Y. & Watanabe, T. A laughing-driven pupil response system for inducing empathy. In 2016 IEEE/SICE International Symposium on System Integration (SII), 520–525 (IEEE, 2016).
Cosme, G. et al. Pupil dilation reflects the authenticity of received nonverbal vocalizations. Scientific Reports 11, 1–14 (2021).
doi: 10.1038/s41598-021-83070-x
Bhurtel, S., Lind, P. G. & Mello, G. B. M. For a new protocol to promote empathy towards users of communication technologies. In International Conference on Human-Computer Interaction, 3–10 (Springer, 2021).
Griffith, H., Lohr, D., Abdulin, E. & Komogortsev, O. GazeBase, a large-scale, multi-stimulus, longitudinal eye movement dataset. Scientific Data 8, 1–9 (2021).
doi: 10.1038/s41597-021-00959-y
Wilming, N. et al. An extensive dataset of eye movements during viewing of complex images. Scientific Data 4, 1–11 (2017).
doi: 10.1038/sdata.2016.126
Kümmerer, M., Wallis, T. S. A. & Bethge, M. Saliency benchmarking made easy: Separating models, maps and metrics. In Ferrari, V., Hebert, M., Sminchisescu, C. & Weiss, Y. (eds.) Computer Vision – ECCV 2018, Lecture Notes in Computer Science, 798–814 (Springer International Publishing, 2018).
Błażejczyk, P. & Magdziarz, M. Stochastic modeling of Lévy-like human eye movements. Chaos: An Interdisciplinary Journal of Nonlinear Science 31, 043129 (2021).
doi: 10.1063/5.0036491
Brockmann, D. & Geisel, T. The ecology of gaze shifts. Neurocomputing 32, 643–650 (2000).
doi: 10.1016/S0925-2312(00)00227-7
Brockmann, D. & Geisel, T. Are human scanpaths Lévy flights? In 9th International Conference on Artificial Neural Networks: ICANN, 263–268 (IET, 1999).
Stephen, D. G., Mirman, D., Magnuson, J. S. & Dixon, J. A. Lévy-like diffusion in eye movements during spoken-language comprehension. Physical Review E 79, 056114 (2009).
doi: 10.1103/PhysRevE.79.056114
Viswanathan, G. M. et al. Lévy flight search patterns of wandering albatrosses. Nature 381, 413–415 (1996).
doi: 10.1038/381413a0
Sims, D., Humphries, N., Bradford, R. & Bruce, B. Lévy flight and Brownian search patterns of a free-ranging predator reflect different prey field characteristics. Journal of Animal Ecology 81, 432–442 (2012).
pubmed: 22004140
doi: 10.1111/j.1365-2656.2011.01914.x
Raichlen, D. A. et al. Evidence of Lévy walk foraging patterns in human hunter–gatherers. Proceedings of the National Academy of Sciences 111, 728–733 (2014).
doi: 10.1073/pnas.1318616111
Rhee, I. et al. On the Lévy-walk nature of human mobility. IEEE/ACM Transactions on Networking 19, 630–643 (2011).
doi: 10.1109/TNET.2011.2120618
Bénichou, O., Loverdo, C., Moreau, M. & Voituriez, R. Intermittent search strategies. Reviews of Modern Physics 83, 81 (2011).
doi: 10.1103/RevModPhys.83.81
Boccignone, G. & Ferraro, M. Feed and fly control of visual scanpaths for foveation image processing. Annals of Telecommunications-Annales des Télécommunications 68, 201–217 (2013).
doi: 10.1007/s12243-012-0316-9
Goto, Y. et al. Saccade eye movements as a quantitative measure of frontostriatal network in children with ADHD. Brain and Development 32, 347–355 (2010).
pubmed: 19505783
doi: 10.1016/j.braindev.2009.04.017
Fernández-Martínez, M., Sánchez-Granero, M., Segovia, J. T. & Román-Sánchez, I. An accurate algorithm to calculate the Hurst exponent of self-similar processes. Physics Letters A 378, 2355–2362 (2014).
doi: 10.1016/j.physleta.2014.06.018
Marlow, C. A. et al. Temporal structure of human gaze dynamics is invariant during free viewing. PLoS ONE 10, e0139379 (2015).
pubmed: 26421613
pmcid: 4589360
doi: 10.1371/journal.pone.0139379
Freije, M. et al. Multifractal detrended fluctuation analysis of eye-tracking data. In European Congress on Computational Methods in Applied Sciences and Engineering, 476–484 (Springer, 2017).
Suman, A. A. et al. Spatial and time domain analysis of eye-tracking data during screening of brain magnetic resonance images. PLoS ONE 16, e0260717 (2021).
pubmed: 34855867
pmcid: 8639086
doi: 10.1371/journal.pone.0260717
Unema, P. J., Pannasch, S., Joos, M. & Velichkovsky, B. M. Time course of information processing during scene perception: The relationship between saccade amplitude and fixation duration. Visual cognition 12, 473–494 (2005).
doi: 10.1080/13506280444000409
Fuhl, W., Bozkir, E. & Kasneci, E. Reinforcement learning for the privacy preservation and manipulation of eye tracking data. In International Conference on Artificial Neural Networks, 595–607 (Springer, 2021).
Majaranta, P., Ahola, U.-K. & Špakov, O. Fast gaze typing with an adjustable dwell time. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 357–360 (2009).
Lim, J. Z., Mountstephens, J. & Teo, J. Emotion recognition using eye-tracking: taxonomy, review and current challenges. Sensors 20, 2384 (2020).
pubmed: 32331327
pmcid: 7219342
doi: 10.3390/s20082384
Villani, D. et al. Visual exploration patterns of human figures in action: an eye tracker study with art paintings. Frontiers in Psychology 6, 1636 (2015).
pubmed: 26579021
pmcid: 4620395
doi: 10.3389/fpsyg.2015.01636
Reniers, R. L., Corcoran, R., Drake, R., Shryane, N. M. & Völlm, B. A. The QCAE: A questionnaire of cognitive and affective empathy. Journal of Personality Assessment 93, 84–95 (2011).
pubmed: 21184334
doi: 10.1080/00223891.2010.528484
Olsen, A. The Tobii I-VT fixation filter. Tobii Technology 21 (2012).
Komogortsev, O. V. et al. Standardization of automated analyses of oculomotor fixation and saccadic behaviors. IEEE Transactions on Biomedical Engineering 57, 2635–2645 (2010).
doi: 10.1109/TBME.2010.2057429
Lencastre, P. Eye tracker data. Figshare https://doi.org/10.6084/m9.figshare.19729636.v2 (2022).
Lencastre, P. Raw data. Figshare https://doi.org/10.6084/m9.figshare.19209714.v1 (2022).
Lencastre, P. Questionnaires. Figshare https://doi.org/10.6084/m9.figshare.19657323.v2 (2022).
Feng, Y. et al. Virtual pointer for gaze guidance in laparoscopic surgery. Surgical Endoscopy 34, 3533–3539 (2020).
pubmed: 31586251
doi: 10.1007/s00464-019-07141-x
Shi, Y., Zheng, Y., Du, J., Zhu, Q. & Liu, X. The impact of engineering information complexity on working memory development of construction workers: An eye-tracking investigation. In Construction Research Congress 2020: Infrastructure Systems and Sustainability, 89–98 (American Society of Civil Engineers Reston, VA, 2020).
Vrabič, N., Juroš, B. & Pompe, M. T. Automated visual acuity evaluation based on preferential looking technique and controlled with remote eye tracking. Ophthalmic Research 64, 389–397 (2021).
pubmed: 33080607
doi: 10.1159/000512395
Netzel, R. et al. Comparative eye-tracking evaluation of scatterplots and parallel coordinates. Visual Informatics 1, 118–131 (2017).
doi: 10.1016/j.visinf.2017.11.001
Niehorster, D. C., Andersson, R. & Nyström, M. Titta: A toolbox for creating Psychtoolbox and PsychoPy experiments with Tobii eye trackers. Behavior Research Methods 52, 1970–1979 (2020).
pubmed: 32128697
pmcid: 7575480
doi: 10.3758/s13428-020-01358-8
Zhou, L. & Xue, F. Show products or show people: An eye-tracking study of visual branding strategy on Instagram. Journal of Research in Interactive Marketing (2021).
Fayed, K., Franken, B. & Berkling, K. Understanding the use of eye-tracking recordings to measure and classify reading ability in elementary school children. CALL for Widening Participation: Short Papers from EUROCALL 2020 69 (2020).
Krohn, O. A., Varankian, V., Lind, P. G. & Mello, G. B. M. Construction of an inexpensive eye tracker for social inclusion and education. In International Conference on Human-Computer Interaction, 60–78 (Springer, 2020).
Tobii-AB. Tobii Pro X3-120 eye tracker user's manual. Available at https://www.tobiipro.com/siteassets/tobii-pro/user-manuals/tobii-pro-x3-120-user-manual.pdf/?v=1.0.9 (2019).
Schmitz, S., Krummenauer, F., Henn, S. & Dick, H. B. Comparison of three different technologies for pupil diameter measurement. Graefe’s Archive for Clinical and Experimental Ophthalmology 241, 472–477 (2003).
pubmed: 12739174
doi: 10.1007/s00417-003-0669-x
Brisson, J. et al. Pupil diameter measurement errors as a function of gaze direction in corneal reflection eyetrackers. Behavior Research Methods 45, 1322–1331 (2013).
pubmed: 23468182
doi: 10.3758/s13428-013-0327-0
Lencastre, P. Code to read data. Figshare https://doi.org/10.6084/m9.figshare.21608238.v1 (2022).