Online-EYE: Multimodal Implicit Eye Tracking Calibration for XR

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN > Conference contribution/Paper > peer-review

Published

Standard

Online-EYE: Multimodal Implicit Eye Tracking Calibration for XR. / Hou, Baosheng James; Abramyan, Lucy; Gurumurthy, Prasanthi et al.
Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems. New York, NY, USA: The Association for Computing Machinery, 2025. p. 1-16, Article 550 (CHI '25).


Harvard

Hou, BJ, Abramyan, L, Gurumurthy, P, Adams, H, Tosic Rodgers, I, Gonzalez, EJ, Patel, K, Colaço, A, Pfeuffer, K, Gellersen, H, Ahuja, K & Gonzalez-Franco, M 2025, Online-EYE: Multimodal Implicit Eye Tracking Calibration for XR. in Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems., 550, CHI '25, The Association for Computing Machinery, New York, NY, USA, pp. 1-16. https://doi.org/10.1145/3706598.3713461

APA

Hou, B. J., Abramyan, L., Gurumurthy, P., Adams, H., Tosic Rodgers, I., Gonzalez, E. J., Patel, K., Colaço, A., Pfeuffer, K., Gellersen, H., Ahuja, K., & Gonzalez-Franco, M. (2025). Online-EYE: Multimodal Implicit Eye Tracking Calibration for XR. In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems (pp. 1-16). Article 550 (CHI '25). The Association for Computing Machinery. https://doi.org/10.1145/3706598.3713461

Vancouver

Hou BJ, Abramyan L, Gurumurthy P, Adams H, Tosic Rodgers I, Gonzalez EJ et al. Online-EYE: Multimodal Implicit Eye Tracking Calibration for XR. In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems. New York, NY, USA: The Association for Computing Machinery. 2025. p. 1-16. 550. (CHI '25). doi: 10.1145/3706598.3713461

Author

Hou, Baosheng James ; Abramyan, Lucy ; Gurumurthy, Prasanthi et al. / Online-EYE: Multimodal Implicit Eye Tracking Calibration for XR. Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems. New York, NY, USA : The Association for Computing Machinery, 2025. pp. 1-16 (CHI '25).

Bibtex

@inproceedings{41a7d7f0aa4a41d39c76b53dd07d61d4,
title = "Online-EYE: Multimodal Implicit Eye Tracking Calibration for XR",
abstract = "Unlike other inputs for extended reality (XR) that work out of the box, eye tracking typically requires custom calibration per user or session. We present a multimodal-input approach for implicit calibration of eye trackers in VR, leveraging UI interaction for continuous, background calibration. Our method analyzes gaze data alongside controller interactions with UI elements and employs ML techniques to continuously refine the calibration matrix without interrupting users' current tasks, potentially eliminating the need for explicit calibration. We demonstrate the accuracy and effectiveness of this implicit approach across various tasks and real-time applications, achieving eye tracking accuracy comparable to native, explicit calibration. While our evaluation focuses on VR and controller-based interactions, we anticipate the broader applicability of this approach to various XR devices and input modalities.",
keywords = "Gaze estimation, implicit calibration, eye tracking",
author = "Hou, {Baosheng James} and Lucy Abramyan and Prasanthi Gurumurthy and Haley Adams and {Tosic Rodgers}, Ivana and Gonzalez, {Eric J} and Khushman Patel and Andrea Cola{\c c}o and Ken Pfeuffer and Hans Gellersen and Karan Ahuja and Mar Gonzalez-Franco",
year = "2025",
month = apr,
day = "25",
doi = "10.1145/3706598.3713461",
language = "English",
series = "CHI '25",
publisher = "The Association for Computing Machinery",
pages = "1--16",
booktitle = "Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems",

}

RIS

TY - GEN

T1 - Online-EYE: Multimodal Implicit Eye Tracking Calibration for XR

AU - Hou, Baosheng James

AU - Abramyan, Lucy

AU - Gurumurthy, Prasanthi

AU - Adams, Haley

AU - Tosic Rodgers, Ivana

AU - Gonzalez, Eric J

AU - Patel, Khushman

AU - Colaço, Andrea

AU - Pfeuffer, Ken

AU - Gellersen, Hans

AU - Ahuja, Karan

AU - Gonzalez-Franco, Mar

PY - 2025/4/25

Y1 - 2025/4/25

N2 - Unlike other inputs for extended reality (XR) that work out of the box, eye tracking typically requires custom calibration per user or session. We present a multimodal-input approach for implicit calibration of eye trackers in VR, leveraging UI interaction for continuous, background calibration. Our method analyzes gaze data alongside controller interactions with UI elements and employs ML techniques to continuously refine the calibration matrix without interrupting users' current tasks, potentially eliminating the need for explicit calibration. We demonstrate the accuracy and effectiveness of this implicit approach across various tasks and real-time applications, achieving eye tracking accuracy comparable to native, explicit calibration. While our evaluation focuses on VR and controller-based interactions, we anticipate the broader applicability of this approach to various XR devices and input modalities.

AB - Unlike other inputs for extended reality (XR) that work out of the box, eye tracking typically requires custom calibration per user or session. We present a multimodal-input approach for implicit calibration of eye trackers in VR, leveraging UI interaction for continuous, background calibration. Our method analyzes gaze data alongside controller interactions with UI elements and employs ML techniques to continuously refine the calibration matrix without interrupting users' current tasks, potentially eliminating the need for explicit calibration. We demonstrate the accuracy and effectiveness of this implicit approach across various tasks and real-time applications, achieving eye tracking accuracy comparable to native, explicit calibration. While our evaluation focuses on VR and controller-based interactions, we anticipate the broader applicability of this approach to various XR devices and input modalities.

KW - Gaze estimation

KW - implicit calibration

KW - eye tracking

U2 - 10.1145/3706598.3713461

DO - 10.1145/3706598.3713461

M3 - Conference contribution/Paper

T3 - CHI '25

SP - 1

EP - 16

BT - Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems

PB - The Association for Computing Machinery

CY - New York, NY, USA

ER -
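
The abstract describes refining a calibration correction from gaze samples paired with UI interactions (e.g., pairing the raw gaze point at the moment of a controller click with the clicked element's known position). As a loose illustration of that idea only, not the paper's actual method or code, a minimal per-axis gain/offset correction could be fit by ordinary least squares from such pairs:

```python
# Hypothetical sketch: fit a per-axis linear correction (gain, offset)
# from (raw gaze, clicked UI target) pairs gathered in the background.
# All names and the per-axis model are illustrative assumptions, not
# the implementation described in the paper.

def fit_axis(raw, target):
    """Closed-form simple linear regression: target ~ gain * raw + offset."""
    n = len(raw)
    mean_r = sum(raw) / n
    mean_t = sum(target) / n
    cov = sum((r - mean_r) * (t - mean_t) for r, t in zip(raw, target))
    var = sum((r - mean_r) ** 2 for r in raw)
    gain = cov / var
    offset = mean_t - gain * mean_r
    return gain, offset

def fit_calibration(samples):
    """samples: list of ((raw_x, raw_y), (target_x, target_y)) pairs."""
    raw_x = [r[0] for r, _ in samples]
    raw_y = [r[1] for r, _ in samples]
    tgt_x = [t[0] for _, t in samples]
    tgt_y = [t[1] for _, t in samples]
    return fit_axis(raw_x, tgt_x), fit_axis(raw_y, tgt_y)

def apply_calibration(gaze, calib):
    """Map a raw gaze point through the fitted correction."""
    (gx, ox), (gy, oy) = calib
    return (gx * gaze[0] + ox, gy * gaze[1] + oy)
```

In an online setting, `fit_calibration` would be re-run (or updated incrementally) as new click-gaze pairs accumulate, which is one plausible reading of "continuously refines the calibration matrix" in the abstract; the paper itself uses ML techniques and a richer model than this sketch.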