
Prediction of gaze estimation error for error-aware gaze-based interfaces

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published
Publication date: 14/03/2016
Host publication: ETRA '16 Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications
Place of Publication: New York
Publisher: ACM
Pages: 275-278
Number of pages: 4
ISBN (print): 9781450341257
Original language: English

Abstract

Gaze estimation error is inherent in head-mounted eye trackers and seriously impacts the performance, usability, and user experience of gaze-based interfaces. Particularly in mobile settings, this error varies constantly as users move in front of, and look at different parts of, a display. We envision a new class of gaze-based interfaces that are aware of the gaze estimation error and adapt to it in real time. As a first step towards this vision we introduce an error model that is able to predict the gaze estimation error. Our method covers the major building blocks of mobile gaze estimation, specifically the mapping of pupil positions to scene camera coordinates, marker-based display detection, and the mapping of gaze from scene camera to on-screen coordinates. We develop our model through a series of principled measurements of a state-of-the-art head-mounted eye tracker.
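The first building block named above, mapping pupil positions to scene camera coordinates, is commonly implemented in head-mounted eye tracking as a polynomial regression fitted during calibration; the residual of that fit is one source of the estimation error the paper models. The sketch below is purely illustrative and not the authors' implementation: the choice of a second-order polynomial, the function names, and the error metric are all assumptions for demonstration.

```python
import numpy as np

def _design_matrix(pupil_pts):
    """Second-order polynomial features of pupil (x, y) positions.

    NOTE: the feature set is an illustrative assumption, not the
    specific mapping function used in the paper.
    """
    x, y = pupil_pts[:, 0], pupil_pts[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def fit_pupil_to_scene(pupil_pts, scene_pts):
    """Least-squares fit of a polynomial mapping from pupil positions
    (N x 2) to scene camera coordinates (N x 2), as done during a
    typical eye tracker calibration."""
    coeffs, *_ = np.linalg.lstsq(_design_matrix(pupil_pts), scene_pts, rcond=None)
    return coeffs  # shape (6, 2): one column per scene coordinate

def map_pupil_to_scene(coeffs, pupil_pts):
    """Apply the fitted mapping to new pupil positions."""
    return _design_matrix(pupil_pts) @ coeffs

def mean_gaze_error(coeffs, pupil_pts, true_scene_pts):
    """Mean Euclidean distance between mapped and ground-truth scene
    points -- a simple stand-in for per-point gaze estimation error."""
    pred = map_pupil_to_scene(coeffs, pupil_pts)
    return float(np.mean(np.linalg.norm(pred - true_scene_pts, axis=1)))
```

In a full pipeline of the kind the abstract describes, the output of this stage would then be transformed to on-screen coordinates, e.g. via a homography estimated from detected display markers, and each stage would contribute its own error term.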