Eye Tracking in Virtual Reality.
Eye movements
Head movements
Virtual reality
Journal
Current topics in behavioral neurosciences
ISSN: 1866-3370
Abbreviated title: Curr Top Behav Neurosci
Country: Germany
NLM ID: 101535383
Publication information
Publication date: 2023
History:
medline: 2023-10-23
pubmed: 2023-01-30
entrez: 2023-01-29
Status: ppublish
Abstract
This chapter explores the current state of the art in eye tracking within 3D virtual environments. It begins with the motivation for eye tracking in Virtual Reality (VR) in psychological research, followed by descriptions of the hardware and software used for presenting virtual environments as well as for tracking eye and head movements in VR. This is followed by a detailed description of an example project on eye and head tracking while observers look at 360° panoramic scenes. The example is illustrated with descriptions of the user interface and program excerpts to show the measurement of eye and head movements in VR. The chapter continues with fundamentals of data analysis, in particular methods for the determination of fixations and saccades when viewing spherical displays. We then extend these methodological considerations to determining the spatial and temporal coordination of the eyes and head in VR perception. The chapter concludes with a discussion of outstanding problems and future directions for conducting eye- and head-tracking research in VR. We hope that this chapter will serve as a primer for those intending to implement VR eye tracking in their own research.
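The abstract refers to velocity-based identification of fixations and saccades when viewing spherical displays (see Salvucci and Goldberg 2000 in the reference list). The sketch below is a minimal illustration of that general approach, assuming gaze arrives as timestamped unit direction vectors from a VR headset; the function names, the 100 deg/s saccade threshold, and the 100 ms minimum fixation duration are illustrative assumptions and not the chapter's actual implementation.

```python
# Minimal sketch (not the chapter's code): velocity-threshold (I-VT)
# classification of fixations and saccades from gaze direction vectors
# sampled in VR. All parameter values are illustrative assumptions.
import numpy as np

def angular_velocity(timestamps, gaze_dirs):
    """Angular speed (deg/s) between consecutive unit gaze-direction vectors.

    timestamps : (N,) array of sample times in seconds
    gaze_dirs  : (N, 3) array of unit vectors (e.g., eye-in-world directions)
    """
    v = gaze_dirs / np.linalg.norm(gaze_dirs, axis=1, keepdims=True)
    # Great-circle angle between successive samples on the viewing sphere
    cos_theta = np.clip(np.sum(v[:-1] * v[1:], axis=1), -1.0, 1.0)
    theta_deg = np.degrees(np.arccos(cos_theta))
    dt = np.diff(timestamps)
    return theta_deg / dt

def classify_ivt(timestamps, gaze_dirs, sacc_thresh=100.0, min_fix_dur=0.1):
    """Label inter-sample intervals as saccade (True) or fixation (False),
    then keep only fixation runs lasting at least min_fix_dur seconds."""
    speed = angular_velocity(timestamps, gaze_dirs)
    is_saccade = speed > sacc_thresh          # deg/s; illustrative threshold
    fixations, start = [], None               # fixation episodes (start, end)
    for i, sac in enumerate(is_saccade):
        if not sac and start is None:
            start = timestamps[i]
        elif sac and start is not None:
            if timestamps[i] - start >= min_fix_dur:
                fixations.append((start, timestamps[i]))
            start = None
    if start is not None and timestamps[-1] - start >= min_fix_dur:
        fixations.append((start, timestamps[-1]))
    return is_saccade, fixations

# Example with synthetic 90 Hz data: slow drift interrupted by one fast shift
if __name__ == "__main__":
    t = np.arange(0, 1.0, 1 / 90)
    az = np.where(t < 0.5, 0.001 * t, 0.3 + 0.001 * t)   # azimuth in radians
    dirs = np.stack([np.sin(az), np.zeros_like(az), np.cos(az)], axis=1)
    _, fix = classify_ivt(t, dirs)
    print(fix)   # two fixation episodes separated by the rapid shift
```

Comparing consecutive samples via the great-circle angle between unit gaze vectors keeps the velocity estimate correct on the sphere, avoiding the distortions that arise when equirectangular image coordinates of 360° panoramic scenes are treated as planar.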
Identifiers
pubmed: 36710302
doi: 10.1007/7854_2022_409
Publication types
Journal Article
Languages
eng
Citation subsets
IM
Pagination
73-100
Copyright information
© 2023. The Author(s), under exclusive license to Springer Nature Switzerland AG.
References
Anderson NC, Bischof WF, Foulsham T, Kingstone A (2020) Turning the (virtual) world around: patterns in saccade direction vary with picture orientation and shape in virtual reality. J Vis 20(8):21
doi: 10.1167/jov.20.8.21
pmcid: 7443121
Backhaus D, Engbert R, Rothkegel LOM, Trukenbrod HA (2020) Task-dependence in scene perception: head unrestrained viewing using mobile eye-tracking. J Vis 20(5):3. https://doi.org/10.1167/jov.20.5.3
doi: 10.1167/jov.20.5.3
pubmed: 32392286
pmcid: 7409614
Barnes GR (1979) Vestibulo-ocular function during co-ordinated head and eye movements to acquire visual targets. J Physiol 287(1):127–147. https://doi.org/10.1113/jphysiol.1979.sp012650
doi: 10.1113/jphysiol.1979.sp012650
pubmed: 311828
pmcid: 1281486
Batschelet E (1981) Circular statistics in biology. Academic Press, New York, p 388
Bebko AO, Troje N (2020) bmlTUX: design and control of experiments in virtual reality and beyond. PsyArXiv. https://doi.org/10.31234/osf.io/arvkf
Bischof WF, Anderson NC, Doswell MT, Kingstone A (2020) Visual exploration of omnidirectional panoramic scenes. J Vis 20(7):23. https://doi.org/10.1167/jov.20.7.23
doi: 10.1167/jov.20.7.23
pubmed: 32692829
pmcid: 7424099
Blue Trident IMU | Inertial Sensor by Vicon | Biomechanic Tracking (2022) Vicon. Retrieved March 30, 2022, from https://www.vicon.com/hardware/blue-trident/
Brookes J, Warburton M, Alghadier M, Mon-Williams M, Mushtaq F (2020) Studying human behavior with virtual reality: the unity experiment framework. Behav Res Methods 52(2):455–463. https://doi.org/10.3758/s13428-019-01242-0
doi: 10.3758/s13428-019-01242-0
pubmed: 31012061
Core – Pupil Player (2022) Pupil Labs. Retrieved March 31, 2022, from https://docs.pupil-labs.com
David EJ, Beitner J, Võ ML-H (2020) Effects of transient loss of vision on head and eye movements during visual search in a virtual environment. Brain Sci 10(11):841. https://doi.org/10.3390/brainsci10110841
doi: 10.3390/brainsci10110841
pubmed: 33198116
pmcid: 7696943
David EJ, Beitner J, Võ ML-H (2021) The importance of peripheral vision when searching 3D real-world scenes: a gaze-contingent study in virtual reality. J Vis 21(7):3
doi: 10.1167/jov.21.7.3
pubmed: 34251433
pmcid: 8287039
David EJ, Lebranchu P, Perreira Da Silva M, Le Callet P (2022) What are the visuo-motor tendencies of omnidirectional scene free-viewing in virtual reality? J Vis 22(4):12. https://doi.org/10.1167/jov.22.4.12
doi: 10.1167/jov.22.4.12
pubmed: 35323868
pmcid: 8963670
De Abreu A, Ozcinar C, Smolic A (2017) Look around you: saliency maps for omnidirectional images in VR applications. 2017 Ninth international conference on quality of multimedia experience (QoMEX), pp 1–6
Delreux V, Abeele SV, Lefevre P, Roucoux A (1991) Eye–head coordination: influence of eye position on the control of head movement amplitude. Brain and space, pp 38–48
Doshi A, Trivedi MM (2012) Head and eye gaze dynamics during visual attention shifts in complex environments. J Vis 12(2):9
doi: 10.1167/12.2.9
pubmed: 22323822
Dosso JA, Kingstone A (2018) Social modulation of object-directed but not image-directed actions. PLoS One 13(10):e0205830. https://doi.org/10.1371/journal.pone.0205830
doi: 10.1371/journal.pone.0205830
pubmed: 30352061
pmcid: 6198971
Dufaux F, Konrad J (2000) Efficient, robust, and fast global motion estimation for video coding. IEEE Trans Image Process 9(3):497–501
doi: 10.1109/83.826785
pubmed: 18255419
Flindall J, Sara A, Kingstone A (2021) Head and eye movements are each facilitated by the offset of a central fixation point in a virtual gap paradigm. Exp Brain Res 239(1):117–126. https://doi.org/10.1007/s00221-020-05905-9
doi: 10.1007/s00221-020-05905-9
pubmed: 33118041
Folk CL, Remington RW, Johnston JC (1992) Involuntary covert orienting is contingent on attentional control settings. J Exp Psychol Hum Percept Perform 18:1030–1044
doi: 10.1037/0096-1523.18.4.1030
pubmed: 1431742
Foulsham T, Kingstone A, Underwood G (2008) Turning the world around: patterns in saccade direction vary with picture orientation. Vision Res 48(17):1777–1790
doi: 10.1016/j.visres.2008.05.018
pubmed: 18599105
Foulsham T, Walker E, Kingstone A (2011) The where, what and when of gaze allocation in the lab and the natural environment. Vision Res 51(17):1920–1931
doi: 10.1016/j.visres.2011.07.002
pubmed: 21784095
Foulsham T, Kingstone A (2017) Are fixations in static natural scenes a useful predictor of attention in the real world? Can J Exp Psychol / Revue canadienne de psychologie expérimentale 71(2):172–181
doi: 10.1037/cep0000125
pubmed: 28604053
Freedman EG (2008) Coordination of the eyes and head during visual orienting. Exp Brain Res 190(4):369
doi: 10.1007/s00221-008-1504-8
pubmed: 18704387
pmcid: 2605952
Freud E, Macdonald SN, Chen J, Quinlan DJ, Goodale MA, Culham JC (2018) Getting a grip on reality: grasping movements directed to real objects and images rely on dissociable neural representations. Cortex 98:34–48. https://doi.org/10.1016/j.cortex.2017.02.020
doi: 10.1016/j.cortex.2017.02.020
pubmed: 28431740
Fuller J (1992a) Comparison of head movement strategies among mammals. The head-neck sensory motor system. Oxford University Press, New York, pp 101–112
doi: 10.1093/acprof:oso/9780195068207.003.0013
Fuller J (1992b) Head movement propensity. Exp Brain Res 92(1):152–164
doi: 10.1007/BF00230391
pubmed: 1486950
Gallup AC, Vasilyev D, Anderson N, Kingstone A (2019) Contagious yawning in virtual reality is affected by actual, but not simulated, social presence. Sci Rep 9(1):294. https://doi.org/10.1038/s41598-018-36570-2
doi: 10.1038/s41598-018-36570-2
pubmed: 30670744
pmcid: 6342947
Goldring JE, Dorris MC, Corneil BD, Ballantyne PA, Munoz DP (1996) Combined eye-head gaze shifts to visual and auditory targets in humans. Exp Brain Res 111(1):68–78
doi: 10.1007/BF00229557
pubmed: 8891638
Henderson JM (2016) Gaze control as prediction. Trends Cogn Sci. http://www.sciencedirect.com/science/article/pii/S1364661316301929
Hessels RS, Niehorster DC, Nyström M, Andersson R, Hooge IT (2018) Is the eye-movement field confused about fixations and saccades? A survey among 124 researchers. R Soc Open Sci 5(8):180502
doi: 10.1098/rsos.180502
pubmed: 30225041
pmcid: 6124022
Holmqvist K, Andersson R (2017) Eye tracking: a comprehensive guide to methods, paradigms and measures
Hooge IT, Hessels RS, Niehorster DC, Diaz GJ, Duchowski AT, Pelz JB (2019) From lab-based studies to eye-tracking in virtual and real worlds: conceptual and methodological problems and solutions
Jacobs O, Anderson NC, Bischof WF, Kingstone A (2020) Into the unknown: head-based selection is less dependent on peripheral information than gaze-based selection in 360-degree virtual reality scenes. PsyArXiv. https://doi.org/10.31234/osf.io/2qtcw
Kassner M, Patera W, Bulling A (2014) Pupil: an open source platform for pervasive eye tracking and mobile gaze-based interaction. Proceedings of the 2014 ACM international joint conference on pervasive and ubiquitous computing: adjunct publication, pp 1151–1160
Kingstone A, Smilek D, Eastwood JD (2008) Cognitive ethology: a new approach for studying human cognition. Br J Psychol 99(3):317–340
doi: 10.1348/000712607X251243
pubmed: 17977481
Komogortsev OV, Gobert DV, Jayarathna S, Gowda SM (2010) Standardization of automated analyses of oculomotor fixation and saccadic behaviors. IEEE Trans Biomed Eng 57(11):2635–2645
doi: 10.1109/TBME.2010.2057429
Land MF, Hayhoe M (2001) In what ways do eye movements contribute to everyday activities? Vision Res 41(25):3559–3565
doi: 10.1016/S0042-6989(01)00102-X
pubmed: 11718795
Mathôt S, Schreij D, Theeuwes J (2012) OpenSesame: an open-source, graphical experiment builder for the social sciences. Behav Res Methods 44(2):314–324
doi: 10.3758/s13428-011-0168-7
pubmed: 22083660
Ohayon S, Rivlin E (2006) Robust 3d head tracking using camera pose estimation. 18th international conference on pattern recognition (ICPR’06), vol 1, pp 1063–1066
Peirce J, Gray JR, Simpson S, MacAskill M, Höchenberger R, Sogo H, Kastman E, Lindeløv JK (2019) PsychoPy2: experiments in behavior made easy. Behav Res Methods 51(1):195–203. https://doi.org/10.3758/s13428-018-01193-y
doi: 10.3758/s13428-018-01193-y
pubmed: 30734206
pmcid: 6420413
Pupil Invisible – Eye tracking glasses for the real world – Pupil Labs (2022) Retrieved March 30, 2022, from https://pupil-labs.com/products/invisible/
Rai Y, Gutiérrez J, Le Callet P (2017) A dataset of head and eye movements for 360 degree images. Proceedings of the 8th ACM on multimedia systems conference, pp 205–210
Risko EF, Richardson DC, Kingstone A (2016) Breaking the fourth wall of cognitive science: real-world social attention and the dual function of gaze. Curr Dir Psychol Sci 25(1):70–74
doi: 10.1177/0963721415617806
Salvucci DD, Goldberg JH (2000) Identifying fixations and saccades in eye-tracking protocols. Proceedings of the 2000 symposium on eye tracking research and applications, pp 71–78
SensoMotoric (2017) SensoMotoric Instruments. [Apparatus and software]. https://en.wikipedia.org/wiki/SensoMotoric_Instruments
Silvis JD, Donk M (2014) The effects of saccade-contingent changes on oculomotor capture: salience is important even beyond the first oculomotor response. Atten Percept Psychophys 76(6):1803–1814
doi: 10.3758/s13414-014-0688-1
pubmed: 24874259
Sitzmann V, Serrano A, Pavel A, Agrawala M, Gutierrez D, Masia B, Wetzstein G (2018) Saliency in VR: how do people explore virtual environments? IEEE Trans Vis Comput Graph 24(4):1633–1642. https://doi.org/10.1109/TVCG.2018.2793599
doi: 10.1109/TVCG.2018.2793599
pubmed: 29553930
Solman GJ, Foulsham T, Kingstone A (2017) Eye and head movements are complementary in visual selection. R Soc Open Sci 4(1):160569
doi: 10.1098/rsos.160569
pubmed: 28280554
pmcid: 5319320
Solman GJ, Kingstone A (2014) Balancing energetic and cognitive resources: memory use during search depends on the orienting effector. Cognition 132(3):443–454. https://doi.org/10.1016/j.cognition.2014.05.005
doi: 10.1016/j.cognition.2014.05.005
pubmed: 24946208
Stahl JS (2001) Eye-head coordination and the variation of eye-movement accuracy with orbital eccentricity. Exp Brain Res 136(2):200–210
doi: 10.1007/s002210000593
pubmed: 11206282
’t Hart BM, Vockeroth J, Schumann F, Bartl K, Schneider E, König P, Einhäuser W (2009) Gaze allocation in natural stimuli: comparing free exploration to head-fixed viewing conditions. Vis Cogn 17(6–7):1132–1158. https://doi.org/10.1080/13506280902812304
doi: 10.1080/13506280902812304
Theeuwes J (1994) Endogenous and exogenous control of visual selection. Perception 23(4):429–440
doi: 10.1068/p230429
pubmed: 7991343
Torralba A, Oliva A, Castelhano MS, Henderson JM (2006) Contextual guidance of eye movements and attention in real-world scenes: the role of global features in object search. Psychol Rev 113(4):766–786. https://doi.org/10.1037/0033-295X.113.4.766
doi: 10.1037/0033-295X.113.4.766
pubmed: 17014302
Unity Essentials (2022) Unity learn. Retrieved March 31, 2022, from https://learn.unity.com/pathway/unity-essentials
V120:Duo – an optical tracking system in a single, plug-and-play package (n.d.) OptiTrack. Retrieved March 30, 2022, from http://optitrack.com/cameras/v120-duo/index.html
van Zoest W, Donk M, Theeuwes J (2004) The role of stimulus-driven and goal-driven control in saccadic visual selection. J Exp Psychol Hum Percept Perform 30(4):746
doi: 10.1037/0096-1523.30.4.749
pubmed: 15305440
Vasser M, Kängsepp M, Magomedkerimov M, Kilvits K, Stafinjak V, Kivisik T, Vicente R, Aru J (2017) VREX: an open-source toolbox for creating 3D virtual reality experiments. BMC Psychol 5(1):4. https://doi.org/10.1186/s40359-017-0173-4
doi: 10.1186/s40359-017-0173-4
pubmed: 28196507
pmcid: 5307765
Visbox, Inc. (2022) Retrieved March 30, 2022, from http://www.visbox.com/
Vizard | Virtual Reality Software for Researchers (2022) Retrieved March 30, 2022, from https://www.worldviz.com/vizard-virtual-reality-software
Watson MR, Voloh B, Thomas C, Hasan A, Womelsdorf T (2019) USE: an integrative suite for temporally-precise psychophysical experiments in virtual environments for human, nonhuman, and artificially intelligent agents. J Neurosci Methods 326:108374. https://doi.org/10.1016/j.jneumeth.2019.108374
doi: 10.1016/j.jneumeth.2019.108374
pubmed: 31351974
Zangemeister WH, Stark L (1982) Types of gaze movement: variable interactions of eye and head movements. Exp Neurol 77(3):563–577. https://doi.org/10.1016/0014-4886(82)90228-X
doi: 10.1016/0014-4886(82)90228-X
pubmed: 7117463