
GazeCast: Using Mobile Devices to Allow Gaze-based Interaction on Public Displays

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published
Publication date: 30/05/2023
Host publication: Proceedings - ETRA 2023: ACM Symposium on Eye Tracking Research and Applications
Editors: Stephen N. Spencer
Publisher: Association for Computing Machinery (ACM)
Pages: 1-8
Number of pages: 8
ISBN (electronic): 9798400701504
Original language: English
Event: ETRA '23: 2023 Symposium on Eye Tracking Research and Applications - Tübingen, Germany
Duration: 30/05/2023 - 2/06/2023

Conference

Conference: ETRA '23
Country/Territory: Germany
City: Tübingen
Period: 30/05/23 - 2/06/23

Publication series

Name: Eye Tracking Research and Applications Symposium (ETRA)

Abstract

Gaze is promising for natural and spontaneous interaction with public displays, but current gaze-enabled displays require movement-hindering stationary eye trackers or cumbersome head-mounted eye trackers. We propose and evaluate GazeCast - a novel system that leverages users' handheld mobile devices to allow gaze-based interaction with surrounding displays. In a user study (N = 20), we compared GazeCast to a standard webcam for gaze-based interaction using Pursuits. We found that while selection using GazeCast requires more time and physical demand, participants value GazeCast's high accuracy and flexible positioning. We conclude by discussing how mobile computing can facilitate the adoption of gaze interaction with pervasive displays.
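The abstract refers to Pursuits, a gaze interaction technique that selects a moving on-screen target by correlating the user's gaze trajectory with each target's trajectory. The paper's implementation is not reproduced here; as an illustration only, the sketch below shows a minimal Pursuits-style selection step using Pearson correlation. All names and parameters (pursuits_select, threshold, the example trajectories) are hypothetical and not taken from GazeCast.

```python
import numpy as np

def pearson(a, b):
    """Pearson correlation between two 1-D sequences (0.0 if either is constant)."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    if a.std() == 0 or b.std() == 0:
        return 0.0
    return float(np.corrcoef(a, b)[0, 1])

def pursuits_select(gaze_xy, targets_xy, threshold=0.8):
    """Pursuits-style selection sketch.

    Correlate the gaze trajectory with each moving target's trajectory over the
    same time window and report the target whose mean x/y correlation is highest,
    provided it exceeds the threshold.

    gaze_xy    -- array of shape (n_samples, 2) with gaze points
    targets_xy -- dict mapping target id -> array of shape (n_samples, 2)
    threshold  -- minimum correlation required to trigger a selection
    """
    gaze = np.asarray(gaze_xy, dtype=float)
    best_id, best_corr = None, -1.0
    for target_id, traj in targets_xy.items():
        traj = np.asarray(traj, dtype=float)
        corr = 0.5 * (pearson(gaze[:, 0], traj[:, 0]) +
                      pearson(gaze[:, 1], traj[:, 1]))
        if corr > best_corr:
            best_id, best_corr = target_id, corr
    return (best_id, best_corr) if best_corr >= threshold else (None, best_corr)

# Example: the gaze follows a circular target rather than a linear one.
t = np.linspace(0, 2 * np.pi, 90)                      # ~1.5 s at 60 Hz
circle = np.column_stack([np.cos(t), np.sin(t)])
line = np.column_stack([t, np.zeros_like(t)])
noisy_gaze = circle + np.random.normal(0, 0.05, circle.shape)
print(pursuits_select(noisy_gaze, {"circle": circle, "line": line}))
```

In a GazeCast-like setup the gaze samples would come from the handheld device's camera instead of a display-mounted webcam, but the correlation-based selection step is the same in principle.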