
Prediction of gaze estimation error for error-aware gaze-based interfaces

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published

Standard

Prediction of gaze estimation error for error-aware gaze-based interfaces. / Barz, Michael; Daiber, Florian Johannes; Bulling, Andreas.
ETRA '16 Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications. New York: ACM, 2016. p. 275-278.


Harvard

Barz, M, Daiber, FJ & Bulling, A 2016, Prediction of gaze estimation error for error-aware gaze-based interfaces. in ETRA '16 Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications. ACM, New York, pp. 275-278. https://doi.org/10.1145/2857491.2857493

APA

Barz, M., Daiber, F. J., & Bulling, A. (2016). Prediction of gaze estimation error for error-aware gaze-based interfaces. In ETRA '16 Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications (pp. 275-278). ACM. https://doi.org/10.1145/2857491.2857493

Vancouver

Barz M, Daiber FJ, Bulling A. Prediction of gaze estimation error for error-aware gaze-based interfaces. In ETRA '16 Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications. New York: ACM. 2016. p. 275-278. doi: 10.1145/2857491.2857493

Author

Barz, Michael ; Daiber, Florian Johannes ; Bulling, Andreas. / Prediction of gaze estimation error for error-aware gaze-based interfaces. ETRA '16 Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications. New York : ACM, 2016. pp. 275-278

BibTeX

@inproceedings{9f2f6350629e44879b4351da7f37a7eb,
title = "Prediction of gaze estimation error for error-aware gaze-based interfaces",
abstract = "Gaze estimation error is inherent in head-mounted eye trackers and seriously impacts performance, usability, and user experience of gaze-based interfaces. Particularly in mobile settings, this error varies constantly as users move in front and look at different parts of a display. We envision a new class of gaze-based interfaces that are aware of the gaze estimation error and adapt to it in real time. As a first step towards this vision we introduce an error model that is able to predict the gaze estimation error. Our method covers major building blocks of mobile gaze estimation, specifically mapping of pupil positions to scene camera coordinates, marker-based display detection, and mapping of gaze from scene camera to on-screen coordinates. We develop our model through a series of principled measurements of a state-of-the-art head-mounted eye tracker.",
author = "Michael Barz and Daiber, {Florian Johannes} and Andreas Bulling",
year = "2016",
month = mar,
day = "14",
doi = "10.1145/2857491.2857493",
language = "English",
isbn = "9781450341257",
pages = "275--278",
booktitle = "ETRA '16 Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications",
publisher = "ACM",
}

RIS

TY - GEN

T1 - Prediction of gaze estimation error for error-aware gaze-based interfaces

AU - Barz, Michael

AU - Daiber, Florian Johannes

AU - Bulling, Andreas

PY - 2016/3/14

Y1 - 2016/3/14

N2 - Gaze estimation error is inherent in head-mounted eye trackers and seriously impacts performance, usability, and user experience of gaze-based interfaces. Particularly in mobile settings, this error varies constantly as users move in front and look at different parts of a display. We envision a new class of gaze-based interfaces that are aware of the gaze estimation error and adapt to it in real time. As a first step towards this vision we introduce an error model that is able to predict the gaze estimation error. Our method covers major building blocks of mobile gaze estimation, specifically mapping of pupil positions to scene camera coordinates, marker-based display detection, and mapping of gaze from scene camera to on-screen coordinates. We develop our model through a series of principled measurements of a state-of-the-art head-mounted eye tracker.

AB - Gaze estimation error is inherent in head-mounted eye trackers and seriously impacts performance, usability, and user experience of gaze-based interfaces. Particularly in mobile settings, this error varies constantly as users move in front and look at different parts of a display. We envision a new class of gaze-based interfaces that are aware of the gaze estimation error and adapt to it in real time. As a first step towards this vision we introduce an error model that is able to predict the gaze estimation error. Our method covers major building blocks of mobile gaze estimation, specifically mapping of pupil positions to scene camera coordinates, marker-based display detection, and mapping of gaze from scene camera to on-screen coordinates. We develop our model through a series of principled measurements of a state-of-the-art head-mounted eye tracker.

U2 - 10.1145/2857491.2857493

DO - 10.1145/2857491.2857493

M3 - Conference contribution/Paper

SN - 9781450341257

SP - 275

EP - 278

BT - ETRA '16 Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications

PB - ACM

CY - New York

ER -