
Electronic data

  • Final Accepted Version

    Rights statement: © ACM, 2019. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in UIST '19 Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology http://doi.acm.org/10.1145/3332165.3347921

    Accepted author manuscript, 3.71 MB, PDF document

    Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License


Eye&Head: Synergetic Eye and Head Movement for Gaze Pointing and Selection

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published

Standard

Eye&Head: Synergetic Eye and Head Movement for Gaze Pointing and Selection. / Sidenmark, Ludwig; Gellersen, Hans.

UIST '19 Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology. New York: ACM, 2019. p. 1161-1174.


Harvard

Sidenmark, L & Gellersen, H 2019, Eye&Head: Synergetic Eye and Head Movement for Gaze Pointing and Selection. in UIST '19 Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology. ACM, New York, pp. 1161-1174. https://doi.org/10.1145/3332165.3347921

APA

Sidenmark, L., & Gellersen, H. (2019). Eye&Head: Synergetic Eye and Head Movement for Gaze Pointing and Selection. In UIST '19 Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology (pp. 1161-1174). ACM. https://doi.org/10.1145/3332165.3347921

Vancouver

Sidenmark L, Gellersen H. Eye&Head: Synergetic Eye and Head Movement for Gaze Pointing and Selection. In UIST '19 Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology. New York: ACM. 2019. p. 1161-1174. https://doi.org/10.1145/3332165.3347921

Author

Sidenmark, Ludwig; Gellersen, Hans. / Eye&Head: Synergetic Eye and Head Movement for Gaze Pointing and Selection. UIST '19 Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology. New York: ACM, 2019. pp. 1161-1174

Bibtex

@inproceedings{cc25d2c0c5d14e089877cdcafb9ef19f,
title = "Eye{\&}Head: Synergetic Eye and Head Movement for Gaze Pointing and Selection",
abstract = "Eye gaze involves the coordination of eye and head movement to acquire gaze targets, but existing approaches to gaze pointing are based on eye-tracking in abstraction from head motion. We propose to leverage the synergetic movement of eye and head, and identify design principles for Eye{\&}Head gaze interaction. We introduce three novel techniques that build on the distinction of head-supported versus eyes-only gaze, to enable dynamic coupling of gaze and pointer, hover interaction, visual exploration around pre-selections, and iterative and fast confirmation of targets. We demonstrate Eye{\&}Head interaction on applications in virtual reality, and evaluate our techniques against baselines in pointing and confirmation studies. Our results show that Eye{\&}Head techniques enable novel gaze behaviours that provide users with more control and flexibility in fast gaze pointing and selection.",
keywords = "Gaze interaction, 3D Interaction, Eye-head coordination, Eye tracking, Target selection, Virtual Reality",
author = "Ludwig Sidenmark and Hans Gellersen",
note = "{\textcopyright} ACM, 2019. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in UIST '19 Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology http://doi.acm.org/10.1145/3332165.3347921",
year = "2019",
month = oct,
day = "20",
doi = "10.1145/3332165.3347921",
language = "English",
pages = "1161--1174",
booktitle = "UIST '19 Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology",
publisher = "ACM",
}
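The BibTeX entry above can be dropped into a bibliography file and cited by its key. A minimal sketch of how it would be used, assuming the entry is saved as `refs.bib` (the filename and the `plain` bibliography style are illustrative choices, not part of the record):

```latex
% minimal-example.tex -- cites the repository's BibTeX entry, assumed saved in refs.bib
\documentclass{article}
\begin{document}
Gaze pointing benefits from coordinated eye and head
movement~\cite{cc25d2c0c5d14e089877cdcafb9ef19f}.
\bibliographystyle{plain} % any installed style works; ACM venues typically supply their own
\bibliography{refs}
\end{document}
```

Note that the `&` characters in the title and abstract must stay escaped as `{\&}`; an unescaped `&` in a BibTeX field triggers a "Misplaced alignment tab" error when LaTeX typesets the bibliography.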

RIS

TY - GEN

T1 - Eye&Head

T2 - Synergetic Eye and Head Movement for Gaze Pointing and Selection

AU - Sidenmark, Ludwig

AU - Gellersen, Hans

N1 - © ACM, 2019. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in UIST '19 Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology http://doi.acm.org/10.1145/3332165.3347921

PY - 2019/10/20

Y1 - 2019/10/20

N2 - Eye gaze involves the coordination of eye and head movement to acquire gaze targets, but existing approaches to gaze pointing are based on eye-tracking in abstraction from head motion. We propose to leverage the synergetic movement of eye and head, and identify design principles for Eye&Head gaze interaction. We introduce three novel techniques that build on the distinction of head-supported versus eyes-only gaze, to enable dynamic coupling of gaze and pointer, hover interaction, visual exploration around pre-selections, and iterative and fast confirmation of targets. We demonstrate Eye&Head interaction on applications in virtual reality, and evaluate our techniques against baselines in pointing and confirmation studies. Our results show that Eye&Head techniques enable novel gaze behaviours that provide users with more control and flexibility in fast gaze pointing and selection.

AB - Eye gaze involves the coordination of eye and head movement to acquire gaze targets, but existing approaches to gaze pointing are based on eye-tracking in abstraction from head motion. We propose to leverage the synergetic movement of eye and head, and identify design principles for Eye&Head gaze interaction. We introduce three novel techniques that build on the distinction of head-supported versus eyes-only gaze, to enable dynamic coupling of gaze and pointer, hover interaction, visual exploration around pre-selections, and iterative and fast confirmation of targets. We demonstrate Eye&Head interaction on applications in virtual reality, and evaluate our techniques against baselines in pointing and confirmation studies. Our results show that Eye&Head techniques enable novel gaze behaviours that provide users with more control and flexibility in fast gaze pointing and selection.

KW - Gaze interaction

KW - 3D Interaction

KW - Eye-head coordination

KW - Eye tracking

KW - Target selection

KW - Virtual Reality

U2 - 10.1145/3332165.3347921

DO - 10.1145/3332165.3347921

M3 - Conference contribution/Paper

SP - 1161

EP - 1174

BT - UIST '19 Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology

PB - ACM

CY - New York

ER -