Rights statement: © ACM, 2019. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in UIST '19 Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology http://doi.acm.org/10.1145/3332165.3347921
Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License
Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review
TY - GEN
T1 - Eye&Head
T2 - Synergetic Eye and Head Movement for Gaze Pointing and Selection
AU - Sidenmark, Ludwig
AU - Gellersen, Hans
N1 - © ACM, 2019. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in UIST '19 Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology http://doi.acm.org/10.1145/3332165.3347921
PY - 2019/10/20
Y1 - 2019/10/20
N2 - Eye gaze involves the coordination of eye and head movement to acquire gaze targets, but existing approaches to gaze pointing are based on eye-tracking in abstraction from head motion. We propose to leverage the synergetic movement of eye and head, and identify design principles for Eye&Head gaze interaction. We introduce three novel techniques that build on the distinction of head-supported versus eyes-only gaze, to enable dynamic coupling of gaze and pointer, hover interaction, visual exploration around pre-selections, and iterative and fast confirmation of targets. We demonstrate Eye&Head interaction on applications in virtual reality, and evaluate our techniques against baselines in pointing and confirmation studies. Our results show that Eye&Head techniques enable novel gaze behaviours that provide users with more control and flexibility in fast gaze pointing and selection.
AB - Eye gaze involves the coordination of eye and head movement to acquire gaze targets, but existing approaches to gaze pointing are based on eye-tracking in abstraction from head motion. We propose to leverage the synergetic movement of eye and head, and identify design principles for Eye&Head gaze interaction. We introduce three novel techniques that build on the distinction of head-supported versus eyes-only gaze, to enable dynamic coupling of gaze and pointer, hover interaction, visual exploration around pre-selections, and iterative and fast confirmation of targets. We demonstrate Eye&Head interaction on applications in virtual reality, and evaluate our techniques against baselines in pointing and confirmation studies. Our results show that Eye&Head techniques enable novel gaze behaviours that provide users with more control and flexibility in fast gaze pointing and selection.
KW - Gaze interaction
KW - 3D Interaction
KW - Eye-head coordination
KW - Eye tracking
KW - Target selection
KW - Virtual Reality
U2 - 10.1145/3332165.3347921
DO - 10.1145/3332165.3347921
M3 - Conference contribution/Paper
SP - 1161
EP - 1174
BT - UIST '19 Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology
PB - ACM
CY - New York
ER -