Classifying Attention Types with Thermal Imaging and Eye Tracking

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published
  • Yomna Abdelrahman
  • Anam Ahmad Khan
  • Joshua Newn
  • Eduardo Velloso
  • Sherine Ashraf Safwat
  • James Bailey
  • Andreas Bulling
  • Frank Vetere
  • Albrecht Schmidt
Article number: 69
Journal publication date: 9/09/2019
Journal: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
Issue number: 3
Volume: 3
Number of pages: 27
Publication status: Published
Original language: English

Abstract

Despite the importance of attention in user performance, current methods for attention classification do not distinguish between different attention types. We propose a novel method that combines thermal imaging and eye tracking to unobtrusively classify four types of attention: sustained, alternating, selective, and divided. We collected a data set in a user study (N = 22) in which we stimulated these four attention types using combinations of audio and visual stimuli while measuring users' facial temperature and eye movements. Using logistic regression on features extracted from both sensing technologies, we classify the four attention types with AUC scores of up to 75.7% for user-independent, condition-independent prediction; 87% for user-independent, condition-dependent prediction; and 77.4% for user-dependent prediction. Our findings not only demonstrate the potential of thermal imaging and eye tracking for unobtrusive classification of different attention types but also pave the way for novel applications in attentive user interfaces and attention-aware computing.
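
To make the evaluation setup concrete, below is a minimal, hypothetical sketch in Python (scikit-learn) of the kind of pipeline the abstract describes: a logistic-regression classifier over features extracted from thermal imaging and eye tracking, scored with multi-class AUC. The feature matrix, labels, and cross-validation scheme here are placeholders; the paper's user-independent and condition-dependent splits would additionally require grouping windows by participant and stimulus condition. This is not the authors' implementation.

```python
# Illustrative sketch only: logistic regression over combined (placeholder)
# thermal-imaging and eye-tracking features, evaluated with one-vs-rest AUC.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import StratifiedKFold, cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Placeholder features: e.g. facial-temperature statistics concatenated with
# fixation/saccade statistics, one row per labelled time window.
n_windows, n_features = 400, 12
X = rng.normal(size=(n_windows, n_features))
# Labels: 0 = sustained, 1 = alternating, 2 = selective, 3 = divided.
y = rng.integers(0, 4, size=n_windows)

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

# Cross-validated class probabilities (a stand-in for the paper's
# user-dependent / user-independent evaluation protocols).
proba = cross_val_predict(
    clf, X, y,
    cv=StratifiedKFold(n_splits=5, shuffle=True, random_state=0),
    method="predict_proba",
)

# Macro-averaged one-vs-rest AUC over the four attention classes.
auc = roc_auc_score(y, proba, multi_class="ovr", average="macro")
print(f"Macro one-vs-rest AUC: {auc:.3f}")
```

With random placeholder data the reported AUC will hover around chance (0.5); substituting real thermal and gaze features and participant-aware splits is what produces scores in the range reported in the abstract.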