Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review
TY - GEN
T1 - Gaze + Pinch interaction in virtual reality
AU - Pfeuffer, Ken
AU - Mayer, Benedikt
AU - Mardanbegi, Diako
AU - Gellersen, Hans
N1 - Publisher Copyright: © 2017 Association for Computing Machinery.
PY - 2017/10/16
Y1 - 2017/10/16
AB - Virtual reality affords experimentation with human abilities beyond what’s possible in the real world, toward novel senses of interaction. In many interactions, the eyes naturally point at objects of interest while the hands skilfully manipulate in 3D space. We explore a particular combination for virtual reality, the Gaze + Pinch interaction technique. It integrates eye gaze to select targets, and indirect freehand gestures to manipulate them. This keeps the gesture use intuitive like direct physical manipulation, but the gesture’s effect can be applied to any object the user looks at - whether located near or far. In this paper, we describe novel interaction concepts and an experimental system prototype that bring together interaction technique variants, menu interfaces, and applications into one unified virtual experience. Proof-of-concept application examples were developed and informally tested, such as 3D manipulation, scene navigation, and image zooming, illustrating a range of advanced interaction capabilities on targets at any distance, without relying on extra controller devices.
KW - Eye tracking
KW - Freehand gesture
KW - Gaze
KW - Interaction technique
KW - Menu
KW - Multimodal interface
KW - Pinch
KW - Virtual reality
DO - 10.1145/3131277.3132180
M3 - Conference contribution/Paper
AN - SCOPUS:85037053971
T3 - SUI 2017 - Proceedings of the 2017 Symposium on Spatial User Interaction
SP - 99
EP - 108
BT - SUI 2017 - Proceedings of the 2017 Symposium on Spatial User Interaction
PB - Association for Computing Machinery, Inc
T2 - 5th ACM Symposium on Spatial User Interaction, SUI 2017
Y2 - 16 October 2017 through 17 October 2017
ER -