Imaging Time Series of Eye Tracking Data to Classify Attentional States.
Keywords
Augmented Reality
Gramian Angular Fields
Imaging Time Series
Markov Transition Fields
attention
classification
convolutional neural network
eye tracking
Journal
Frontiers in Neuroscience
ISSN: 1662-4548
Abbreviated title: Front Neurosci
Country: Switzerland
NLM ID: 101478481
Publication information
Date of publication: 2021
History:
Received: 05 February 2021
Accepted: 03 May 2021
Entrez: 14 June 2021
PubMed: 15 June 2021
MEDLINE: 15 June 2021
Status: epublish
Abstract
Several previous studies have shown that conclusions about the human mental state can be drawn from eye gaze behavior. For this reason, eye tracking recordings are suitable as input data for attentional state classifiers. In current state-of-the-art studies, the extracted eye tracking feature set usually consists of descriptive statistics about specific eye movement characteristics (i.e., fixations, saccades, blinks, vergence, and pupil dilation). To improve classification accuracy, we suggest an Imaging Time Series approach for eye tracking data, followed by classification with a convolutional neural network. We compared multiple algorithms that used the one-dimensional statistical summary feature set as input with two different implementations of the newly suggested method on three data sets that target different aspects of attention. The results show that our two-dimensional image features combined with the convolutional neural network outperform the classical classifiers in most analyses, especially with regard to generalization across participants and tasks. We conclude that current eye-tracking-based attentional state classifiers can be optimized by adjusting the feature set while requiring less feature engineering. Our future work will focus on a more detailed investigation of this approach for other scenarios and data sets.
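For readers unfamiliar with the Imaging Time Series approach named in the keywords, the following is a minimal sketch of the Gramian Angular Summation Field (GASF) encoding applied to a gaze trace. The function name, sample data, and series length are illustrative assumptions, not the authors' implementation; libraries such as pyts provide equivalent GramianAngularField and MarkovTransitionField transformers.

```python
import numpy as np

def gramian_angular_summation_field(x: np.ndarray) -> np.ndarray:
    # Illustrative sketch, not the paper's code.
    # Rescale the series to [-1, 1] so that arccos is defined.
    x_scaled = 2.0 * (x - x.min()) / (x.max() - x.min()) - 1.0
    # Represent each sample as an angle in polar coordinates.
    phi = np.arccos(np.clip(x_scaled, -1.0, 1.0))
    # GASF[i, j] = cos(phi_i + phi_j), computed as an outer sum.
    return np.cos(phi[:, None] + phi[None, :])

# A synthetic 128-sample gaze coordinate trace becomes a 128x128
# image that a convolutional neural network can classify.
gaze_x = np.sin(np.linspace(0.0, 4.0 * np.pi, 128))
image = gramian_angular_summation_field(gaze_x)
print(image.shape)  # (128, 128)
```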
Identifiers
pubmed: 34121994
doi: 10.3389/fnins.2021.664490
pmc: PMC8193942
Publication types
Journal Article
Languages
eng
Pagination
664490
Grants
Agency: Austrian Science Fund FWF
ID: P 29801
Country: Austria
Copyright information
Copyright © 2021 Vortmann, Knychalla, Annerer-Walcher, Benedek and Putze.
Conflict of interest statement
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.