
Online-EYE: Multimodal Implicit Eye Tracking Calibration for XR

Research output: Contribution in Book/Report/Proceedings with ISBN/ISSN · Conference contribution/Paper · Peer-reviewed

Published
Publication date: 25/04/2025
Host publication: Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems
Place of publication: New York, NY, USA
Publisher: The Association for Computing Machinery
Pages: 1-16
Number of pages: 16
ISBN (electronic): 9798400713941
Original language: English

Publication series

Name: CHI '25
Publisher: Association for Computing Machinery

Abstract

Unlike other inputs for extended reality (XR) that work out of the box, eye tracking typically requires custom calibration per user or session. We present a multimodal-input approach for implicit eye tracker calibration in VR, leveraging UI interaction for continuous, background calibration. Our method analyzes gaze data alongside controller interactions with UI elements and, employing machine-learning techniques, continuously refines the calibration matrix without interrupting users' current tasks, potentially eliminating the need for explicit calibration. We demonstrate the accuracy and effectiveness of this implicit approach across various tasks and real-time applications, achieving eye tracking accuracy comparable to native, explicit calibration. While our evaluation focuses on VR and controller-based interactions, we anticipate the broader applicability of this approach to various XR devices and input modalities.
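
To illustrate the core idea described in the abstract, the sketch below shows one plausible shape of implicit calibration: each controller click on a UI element is treated as a labelled sample (the raw gaze point paired with the element's centre), and a correction matrix is re-fitted in the background as samples accumulate. This is not the authors' implementation; the class name, methods, and the use of a simple least-squares affine fit in place of the paper's machine-learning refinement are all illustrative assumptions.

```python
# Minimal sketch of implicit eye-tracker calibration (not the paper's code).
# Assumption: at each controller click on a UI element, the user's true gaze
# target is approximately the element's centre, yielding a labelled pair
# (raw_gaze, ui_target). A homogeneous 2D affine correction is refitted by
# ordinary least squares as an illustrative stand-in for the paper's ML model.

import numpy as np

class ImplicitCalibrator:
    def __init__(self):
        self.raw = []        # raw 2D gaze samples captured at click time
        self.target = []     # matching UI-element centres
        self.A = np.eye(3)   # affine correction matrix, starts as identity

    def add_interaction(self, raw_gaze, ui_target):
        """Record one (gaze, target) pair from a controller click and refit."""
        self.raw.append(raw_gaze)
        self.target.append(ui_target)
        if len(self.raw) >= 4:   # wait for a few pairs before refitting
            self._refit()

    def _refit(self):
        # Solve target ~= A @ [raw_x, raw_y, 1] in the least-squares sense.
        X = np.hstack([np.asarray(self.raw), np.ones((len(self.raw), 1))])
        Y = np.asarray(self.target)
        coeffs, *_ = np.linalg.lstsq(X, Y, rcond=None)  # shape (3, 2)
        self.A[:2, :] = coeffs.T                        # update top two rows

    def correct(self, raw_gaze):
        """Apply the current correction to a raw gaze sample."""
        v = self.A @ np.array([raw_gaze[0], raw_gaze[1], 1.0])
        return v[:2]
```

In a real XR pipeline the pairs would stream in from UI selection events during normal use, and the plain least-squares fit would be replaced by the learned, outlier-robust refinement the paper evaluates; the background, interruption-free update loop is the part this sketch is meant to convey.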