
Gaze + Pinch interaction in virtual reality

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published

Standard

Gaze + Pinch interaction in virtual reality. / Pfeuffer, Ken; Mayer, Benedikt; Mardanbegi, Diako et al.
SUI 2017 - Proceedings of the 2017 Symposium on Spatial User Interaction. Association for Computing Machinery, Inc, 2017. p. 99-108 (SUI 2017 - Proceedings of the 2017 Symposium on Spatial User Interaction).

Harvard

Pfeuffer, K, Mayer, B, Mardanbegi, D & Gellersen, H 2017, Gaze + Pinch interaction in virtual reality. in SUI 2017 - Proceedings of the 2017 Symposium on Spatial User Interaction. SUI 2017 - Proceedings of the 2017 Symposium on Spatial User Interaction, Association for Computing Machinery, Inc, pp. 99-108, 5th ACM Symposium on Spatial User Interaction, SUI 2017, Brighton, United Kingdom, 16/10/17. https://doi.org/10.1145/3131277.3132180

APA

Pfeuffer, K., Mayer, B., Mardanbegi, D., & Gellersen, H. (2017). Gaze + Pinch interaction in virtual reality. In SUI 2017 - Proceedings of the 2017 Symposium on Spatial User Interaction (pp. 99-108). (SUI 2017 - Proceedings of the 2017 Symposium on Spatial User Interaction). Association for Computing Machinery, Inc. https://doi.org/10.1145/3131277.3132180

Vancouver

Pfeuffer K, Mayer B, Mardanbegi D, Gellersen H. Gaze + Pinch interaction in virtual reality. In SUI 2017 - Proceedings of the 2017 Symposium on Spatial User Interaction. Association for Computing Machinery, Inc. 2017. p. 99-108. (SUI 2017 - Proceedings of the 2017 Symposium on Spatial User Interaction). doi: 10.1145/3131277.3132180

Author

Pfeuffer, Ken ; Mayer, Benedikt ; Mardanbegi, Diako et al. / Gaze + Pinch interaction in virtual reality. SUI 2017 - Proceedings of the 2017 Symposium on Spatial User Interaction. Association for Computing Machinery, Inc, 2017. pp. 99-108 (SUI 2017 - Proceedings of the 2017 Symposium on Spatial User Interaction).

Bibtex

@inproceedings{afa6a1e13d924deaa6ef6821f0e27916,
title = "Gaze + Pinch interaction in virtual reality",
abstract = "Virtual reality affords experimentation with human abilities beyond what{\textquoteright}s possible in the real world, toward novel senses of interaction. In many interactions, the eyes naturally point at objects of interest while the hands skilfully manipulate in 3D space. We explore a particular combination for virtual reality, the Gaze + Pinch interaction technique. It integrates eye gaze to select targets, and indirect freehand gestures to manipulate them. This keeps the gesture use intuitive like direct physical manipulation, but the gesture{\textquoteright}s effect can be applied to any object the user looks at - whether located near or far. In this paper, we describe novel interaction concepts and an experimental system prototype that bring together interaction technique variants, menu interfaces, and applications into one unified virtual experience. Proof-of-concept application examples were developed and informally tested, such as 3D manipulation, scene navigation, and image zooming, illustrating a range of advanced interaction capabilities on targets at any distance, without relying on extra controller devices.",
keywords = "Eye tracking, Freehand gesture, Gaze, Interaction technique, Menu, Multimodal interface, Pinch, Virtual reality",
author = "Ken Pfeuffer and Benedikt Mayer and Diako Mardanbegi and Hans Gellersen",
note = "Publisher Copyright: {\textcopyright} 2017 Association for Computing Machinery. Copyright: Copyright 2017 Elsevier B.V., All rights reserved.; 5th ACM Symposium on Spatial User Interaction, SUI 2017 ; Conference date: 16-10-2017 Through 17-10-2017",
year = "2017",
month = oct,
day = "16",
doi = "10.1145/3131277.3132180",
language = "English",
series = "SUI 2017 - Proceedings of the 2017 Symposium on Spatial User Interaction",
publisher = "Association for Computing Machinery, Inc",
pages = "99--108",
booktitle = "SUI 2017 - Proceedings of the 2017 Symposium on Spatial User Interaction",
}
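The BibTeX record above is plain machine-readable text. As an illustration (not part of the repository export), the quoted `field = "value"` pairs can be pulled out with a few lines of Python's standard library; note this simple sketch skips unquoted fields such as `month = oct`:

```python
import re

# Abbreviated copy of the repository's BibTeX entry for demonstration.
BIBTEX = '''@inproceedings{afa6a1e13d924deaa6ef6821f0e27916,
  title = "Gaze + Pinch interaction in virtual reality",
  doi = "10.1145/3131277.3132180",
  pages = "99--108",
  year = "2017",
}'''

def parse_fields(entry: str) -> dict:
    """Extract quoted field = "value" pairs from a BibTeX entry string."""
    return dict(re.findall(r'(\w+)\s*=\s*"([^"]*)"', entry))

fields = parse_fields(BIBTEX)
print(fields["doi"])    # 10.1145/3131277.3132180
print(fields["pages"])  # 99--108
```

For production use, a dedicated parser (e.g. the `bibtexparser` package) handles braces, escapes like `{\textcopyright}`, and unquoted values more robustly than this regex sketch.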

RIS

TY - GEN

T1 - Gaze + Pinch interaction in virtual reality

AU - Pfeuffer, Ken

AU - Mayer, Benedikt

AU - Mardanbegi, Diako

AU - Gellersen, Hans

N1 - Publisher Copyright: © 2017 Association for Computing Machinery. Copyright: Copyright 2017 Elsevier B.V., All rights reserved.

PY - 2017/10/16

Y1 - 2017/10/16

N2 - Virtual reality affords experimentation with human abilities beyond what’s possible in the real world, toward novel senses of interaction. In many interactions, the eyes naturally point at objects of interest while the hands skilfully manipulate in 3D space. We explore a particular combination for virtual reality, the Gaze + Pinch interaction technique. It integrates eye gaze to select targets, and indirect freehand gestures to manipulate them. This keeps the gesture use intuitive like direct physical manipulation, but the gesture’s effect can be applied to any object the user looks at - whether located near or far. In this paper, we describe novel interaction concepts and an experimental system prototype that bring together interaction technique variants, menu interfaces, and applications into one unified virtual experience. Proof-of-concept application examples were developed and informally tested, such as 3D manipulation, scene navigation, and image zooming, illustrating a range of advanced interaction capabilities on targets at any distance, without relying on extra controller devices.

AB - Virtual reality affords experimentation with human abilities beyond what’s possible in the real world, toward novel senses of interaction. In many interactions, the eyes naturally point at objects of interest while the hands skilfully manipulate in 3D space. We explore a particular combination for virtual reality, the Gaze + Pinch interaction technique. It integrates eye gaze to select targets, and indirect freehand gestures to manipulate them. This keeps the gesture use intuitive like direct physical manipulation, but the gesture’s effect can be applied to any object the user looks at - whether located near or far. In this paper, we describe novel interaction concepts and an experimental system prototype that bring together interaction technique variants, menu interfaces, and applications into one unified virtual experience. Proof-of-concept application examples were developed and informally tested, such as 3D manipulation, scene navigation, and image zooming, illustrating a range of advanced interaction capabilities on targets at any distance, without relying on extra controller devices.

KW - Eye tracking

KW - Freehand gesture

KW - Gaze

KW - Interaction technique

KW - Menu

KW - Multimodal interface

KW - Pinch

KW - Virtual reality

U2 - 10.1145/3131277.3132180

DO - 10.1145/3131277.3132180

M3 - Conference contribution/Paper

AN - SCOPUS:85037053971

T3 - SUI 2017 - Proceedings of the 2017 Symposium on Spatial User Interaction

SP - 99

EP - 108

BT - SUI 2017 - Proceedings of the 2017 Symposium on Spatial User Interaction

PB - Association for Computing Machinery, Inc

T2 - 5th ACM Symposium on Spatial User Interaction, SUI 2017

Y2 - 16 October 2017 through 17 October 2017

ER -