Hands-on, Hands-off: Gaze-Assisted Bimanual 3D Interaction

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Publication status: Published
Publication date: 21/10/2024
Host publication: UIST '24: Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology
Place of publication: New York
Publisher: Association for Computing Machinery (ACM)
Pages: 1-12
Number of pages: 12
ISBN (electronic): 9798400706288
Original language: English

Abstract

Extended Reality (XR) systems with hand-tracking support direct manipulation of objects with both hands. A common interaction in this context is for the non-dominant hand (NDH) to orient an object for input by the dominant hand (DH). We explore bimanual interaction with gaze through three new modes of interaction where the input of the NDH, DH, or both hands is indirect based on Gaze+Pinch. These modes enable a new dynamic interplay between our hands, allowing flexible alternation between and pairing of complementary operations. Through applications, we demonstrate several use cases in the context of 3D modelling, where users exploit occlusion-free, low-effort, and fluid two-handed manipulation. To gain a deeper understanding of each mode, we present a user study on an asymmetric rotate-translate task. Most participants preferred indirect input with both hands for lower physical effort, without a penalty on user performance. Otherwise, they preferred modes where the NDH oriented the object directly, supporting preshaping of the hand, which is more challenging with indirect gestures. The insights gained are of relevance for the design of XR interfaces that aim to leverage eye and hand input in tandem.
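To make the idea of "indirect" input concrete, below is a minimal conceptual sketch (not code from the paper) of how one indirect leg of a Gaze+Pinch interaction could be wired up: on pinch-down the gaze ray acquires the target, and while the pinch is held the hand's relative motion is relayed to that object, so the hand never has to reach or occlude it. The class and field names (IndirectGazePinch, HandState, Object3D) are illustrative assumptions, not the authors' API.

```python
# Conceptual sketch of indirect Gaze+Pinch manipulation (illustrative only;
# names and structure are assumptions, not taken from the paper).
from dataclasses import dataclass, field
from typing import Optional

import numpy as np


@dataclass
class Object3D:
    position: np.ndarray                                   # world-space position (x, y, z)
    rotation: np.ndarray = field(default_factory=lambda: np.eye(3))  # 3x3 rotation matrix


@dataclass
class HandState:
    pinching: bool
    position: np.ndarray   # pinch (thumb-index) midpoint in world space
    rotation: np.ndarray   # 3x3 hand/wrist orientation


class IndirectGazePinch:
    """On pinch-down, acquire whatever object the gaze ray currently hits;
    while the pinch is held, apply the hand's *relative* motion to that
    remote object instead of requiring direct contact."""

    def __init__(self) -> None:
        self.target: Optional[Object3D] = None
        self.last_pos: Optional[np.ndarray] = None
        self.last_rot: Optional[np.ndarray] = None

    def update(self, hand: HandState, gaze_target: Optional[Object3D]) -> None:
        if hand.pinching and self.target is None:
            # Pinch just started: gaze selects, the hand will manipulate.
            self.target = gaze_target
            self.last_pos, self.last_rot = hand.position, hand.rotation
        elif hand.pinching and self.target is not None:
            # Relay incremental hand motion onto the gaze-selected object.
            delta_pos = hand.position - self.last_pos
            delta_rot = hand.rotation @ self.last_rot.T
            self.target.position = self.target.position + delta_pos
            self.target.rotation = delta_rot @ self.target.rotation
            self.last_pos, self.last_rot = hand.position, hand.rotation
        else:
            # Pinch released: drop the target until the next pinch-down.
            self.target = None
```

In this reading, the paper's three modes correspond to running such an indirect controller for the NDH, the DH, or both hands at once, while the remaining hand (if any) manipulates its object directly.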