Electronic data

  • Eye__Head_and_Torso_Coordination_During_Gaze_Shifts_in_Virtual_Reality

    Rights statement: © ACM, 2019. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in ACM Transactions on Computer-Human Interaction, 27, 1, 2019 http://doi.acm.org/10.1145/3361218

    Accepted author manuscript, 2.5 MB, PDF document

    Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License

Eye, Head and Torso Coordination During Gaze Shifts in Virtual Reality

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

Eye, Head and Torso Coordination During Gaze Shifts in Virtual Reality. / Sidenmark, Ludwig; Gellersen, Hans.
In: ACM Transactions on Computer-Human Interaction (TOCHI), Vol. 27, No. 1, 4, 17.12.2019.

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Harvard

Sidenmark, L & Gellersen, H 2019, 'Eye, Head and Torso Coordination During Gaze Shifts in Virtual Reality', ACM Transactions on Computer-Human Interaction (TOCHI), vol. 27, no. 1, 4. https://doi.org/10.1145/3361218

APA

Sidenmark, L., & Gellersen, H. (2019). Eye, Head and Torso Coordination During Gaze Shifts in Virtual Reality. ACM Transactions on Computer-Human Interaction (TOCHI), 27(1), Article 4. https://doi.org/10.1145/3361218

Vancouver

Sidenmark L, Gellersen H. Eye, Head and Torso Coordination During Gaze Shifts in Virtual Reality. ACM Transactions on Computer-Human Interaction (TOCHI). 2019 Dec 17;27(1):4. doi: 10.1145/3361218

Author

Sidenmark, Ludwig ; Gellersen, Hans. / Eye, Head and Torso Coordination During Gaze Shifts in Virtual Reality. In: ACM Transactions on Computer-Human Interaction (TOCHI). 2019 ; Vol. 27, No. 1.

Bibtex

@article{f354059d848c472ca1b77dc63eb6781d,
title = "Eye, Head and Torso Coordination During Gaze Shifts in Virtual Reality",
abstract = "Humans perform gaze shifts naturally through a combination of eye, head and body movements. Although gaze has been long studied as input modality for interaction, this has previously ignored the coordination of the eyes, head and body. This article reports a study of gaze shifts in virtual reality (VR) aimed to address the gap and inform design. We identify general eye, head and torso coordination patterns and provide an analysis of the relative movements' contribution and temporal alignment. We quantify effects of target distance, direction and user posture, describe preferred eye-in-head motion ranges, and identify a high variability in head movement tendency. Study insights lead us to propose gaze zones that reflect different levels of contribution from eye, head and body. We discuss design implications for HCI and VR, and in conclusion argue to treat gaze as multimodal input, and eye, head and body movement as synergetic in interaction design.",
keywords = "Eye gaze, Gaze shifts, Eye-head coordination, Eye, head and body movement, Eye tracking, Gaze interaction, Multimodal interaction",
author = "Ludwig Sidenmark and Hans Gellersen",
note = "{\textcopyright} ACM, 2019. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in ACM Transactions on Computer-Human Interaction, 27, 1, 2019 http://doi.acm.org/10.1145/3361218",
year = "2019",
month = dec,
day = "17",
doi = "10.1145/3361218",
language = "English",
volume = "27",
journal = "ACM Transactions on Computer-Human Interaction (TOCHI)",
issn = "1073-0516",
publisher = "Association for Computing Machinery (ACM)",
number = "1",
}

RIS

TY - JOUR

T1 - Eye, Head and Torso Coordination During Gaze Shifts in Virtual Reality

AU - Sidenmark, Ludwig

AU - Gellersen, Hans

N1 - © ACM, 2019. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in ACM Transactions on Computer-Human Interaction, 27, 1, 2019 http://doi.acm.org/10.1145/3361218

PY - 2019/12/17

Y1 - 2019/12/17

N2 - Humans perform gaze shifts naturally through a combination of eye, head and body movements. Although gaze has been long studied as input modality for interaction, this has previously ignored the coordination of the eyes, head and body. This article reports a study of gaze shifts in virtual reality (VR) aimed to address the gap and inform design. We identify general eye, head and torso coordination patterns and provide an analysis of the relative movements' contribution and temporal alignment. We quantify effects of target distance, direction and user posture, describe preferred eye-in-head motion ranges, and identify a high variability in head movement tendency. Study insights lead us to propose gaze zones that reflect different levels of contribution from eye, head and body. We discuss design implications for HCI and VR, and in conclusion argue to treat gaze as multimodal input, and eye, head and body movement as synergetic in interaction design.

AB - Humans perform gaze shifts naturally through a combination of eye, head and body movements. Although gaze has been long studied as input modality for interaction, this has previously ignored the coordination of the eyes, head and body. This article reports a study of gaze shifts in virtual reality (VR) aimed to address the gap and inform design. We identify general eye, head and torso coordination patterns and provide an analysis of the relative movements' contribution and temporal alignment. We quantify effects of target distance, direction and user posture, describe preferred eye-in-head motion ranges, and identify a high variability in head movement tendency. Study insights lead us to propose gaze zones that reflect different levels of contribution from eye, head and body. We discuss design implications for HCI and VR, and in conclusion argue to treat gaze as multimodal input, and eye, head and body movement as synergetic in interaction design.

KW - Eye gaze

KW - Gaze shifts

KW - Eye-head coordination

KW - Eye, head and body movement

KW - Eye tracking

KW - Gaze interaction

KW - Multimodal interaction

U2 - 10.1145/3361218

DO - 10.1145/3361218

M3 - Journal article

VL - 27

JO - ACM Transactions on Computer-Human Interaction (TOCHI)

JF - ACM Transactions on Computer-Human Interaction (TOCHI)

SN - 1073-0516

IS - 1

M1 - 4

ER -