Rights statement: © ACM, 2019. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in ACM Transactions on Computer-Human Interaction, 27, 1, 2019 http://doi.acm.org/10.1145/3361218
Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License
Research output: Contribution to Journal/Magazine › Journal article › peer-review
TY - JOUR
T1 - Eye, Head and Torso Coordination During Gaze Shifts in Virtual Reality
AU - Sidenmark, Ludwig
AU - Gellersen, Hans
N1 - © ACM, 2019. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in ACM Transactions on Computer-Human Interaction, 27, 1, 2019 http://doi.acm.org/10.1145/3361218
PY - 2019/12/17
Y1 - 2019/12/17
N2 - Humans perform gaze shifts naturally through a combination of eye, head and body movements. Although gaze has long been studied as an input modality for interaction, prior work has largely ignored the coordination of the eyes, head and body. This article reports a study of gaze shifts in virtual reality (VR) aimed at addressing this gap and informing design. We identify general eye, head and torso coordination patterns and provide an analysis of the movements' relative contribution and temporal alignment. We quantify the effects of target distance, direction and user posture, describe preferred eye-in-head motion ranges, and identify high variability in head movement tendency. The study's insights lead us to propose gaze zones that reflect different levels of contribution from eye, head and body. We discuss design implications for HCI and VR, and in conclusion argue for treating gaze as multimodal input, and eye, head and body movement as synergetic in interaction design.
KW - Eye gaze
KW - Gaze shifts
KW - Eye-head coordination
KW - Eye, head and body movement
KW - Eye tracking
KW - Gaze interaction
KW - Multimodal interaction
U2 - 10.1145/3361218
DO - 10.1145/3361218
M3 - Journal article
VL - 27
JO - ACM Transactions on Computer-Human Interaction (TOCHI)
JF - ACM Transactions on Computer-Human Interaction (TOCHI)
SN - 1073-0516
IS - 1
M1 - 4
ER -