
Hands-on, Hands-off: Gaze-Assisted Bimanual 3D Interaction

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published

Standard

Hands-on, Hands-off: Gaze-Assisted Bimanual 3D Interaction. / Lystbæk, Mathias; Mikkelsen, Thorbjørn; Krisztandl, Roland et al.
UIST'24: Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology. New York: Association for Computing Machinery (ACM), 2024. pp. 1-12, Article 80.

Harvard

Lystbæk, M, Mikkelsen, T, Krisztandl, R, Gonzalez, EJ, Gonzalez-Franco, M, Gellersen, H & Pfeuffer, K 2024, Hands-on, Hands-off: Gaze-Assisted Bimanual 3D Interaction. in UIST'24: Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology., 80, Association for Computing Machinery (ACM), New York, pp. 1-12. https://doi.org/10.1145/3654777.3676331

APA

Lystbæk, M., Mikkelsen, T., Krisztandl, R., Gonzalez, E. J., Gonzalez-Franco, M., Gellersen, H., & Pfeuffer, K. (2024). Hands-on, Hands-off: Gaze-Assisted Bimanual 3D Interaction. In UIST'24: Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology (pp. 1-12). Article 80. Association for Computing Machinery (ACM). https://doi.org/10.1145/3654777.3676331

Vancouver

Lystbæk M, Mikkelsen T, Krisztandl R, Gonzalez EJ, Gonzalez-Franco M, Gellersen H et al. Hands-on, Hands-off: Gaze-Assisted Bimanual 3D Interaction. In UIST'24: Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology. New York: Association for Computing Machinery (ACM). 2024. p. 1-12. 80. doi: 10.1145/3654777.3676331

Author

Lystbæk, Mathias ; Mikkelsen, Thorbjørn ; Krisztandl, Roland et al. / Hands-on, Hands-off : Gaze-Assisted Bimanual 3D Interaction. UIST'24: Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology. New York : Association for Computing Machinery (ACM), 2024. pp. 1-12

Bibtex

@inproceedings{fe613a08064f4a40b44279025dd1ce52,
title = "Hands-on, Hands-off: Gaze-Assisted Bimanual 3D Interaction",
abstract = "Extended Reality (XR) systems with hand-tracking support direct manipulation of objects with both hands. A common interaction in this context is for the non-dominant hand (NDH) to orient an object for input by the dominant hand (DH). We explore bimanual interaction with gaze through three new modes of interaction where the input of the NDH, DH, or both hands is indirect based on Gaze+Pinch. These modes enable a new dynamic interplay between our hands, allowing flexible alternation between and pairing of complementary operations. Through applications, we demonstrate several use cases in the context of 3D modelling, where users exploit occlusion-free, low-effort, and fluid two-handed manipulation. To gain a deeper understanding of each mode, we present a user study on an asymmetric rotate-translate task. Most participants preferred indirect input with both hands for lower physical effort, without a penalty on user performance. Otherwise, they preferred modes where the NDH oriented the object directly, supporting preshaping of the hand, which is more challenging with indirect gestures. The insights gained are of relevance for the design of XR interfaces that aim to leverage eye and hand input in tandem.",
author = "Mathias Lystb{\ae}k and Thorbj{\o}rn Mikkelsen and Roland Krisztandl and Gonzalez, {Eric J.} and Mar Gonzalez-Franco and Hans Gellersen and Ken Pfeuffer",
year = "2024",
month = oct,
day = "21",
doi = "10.1145/3654777.3676331",
language = "English",
pages = "1--12",
booktitle = "UIST'24: Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology",
publisher = "Association for Computing Machinery (ACM)",
address = "United States",
}
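The core idea in the abstract, where gaze selects the target and a pinch triggers indirect manipulation, so that hand motion applies to the gazed object rather than one held directly, can be sketched in a few lines. This is a minimal illustration only: the class names, the single-axis rotation, and the task setup are assumptions for the sketch, not the authors' implementation.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Object3D:
    name: str
    position: Tuple[float, float, float] = (0.0, 0.0, 0.0)
    rotation_deg: float = 0.0  # simplified: rotation about a single axis

@dataclass
class GazePinchHand:
    """A hand operating indirectly: pinching acquires whatever the eyes rest on."""
    grabbed: Optional[Object3D] = None

    def pinch_down(self, gazed: Optional[Object3D]) -> None:
        self.grabbed = gazed          # gaze selects, pinch confirms

    def move(self, delta: Tuple[float, float, float]) -> None:
        if self.grabbed:              # hand motion maps onto the gazed object
            x, y, z = self.grabbed.position
            dx, dy, dz = delta
            self.grabbed.position = (x + dx, y + dy, z + dz)

    def pinch_up(self) -> None:
        self.grabbed = None           # release

# Simplified asymmetric rotate-translate task: the non-dominant hand (NDH)
# translates while the dominant hand (DH) rotates the same gazed object.
cube = Object3D("cube")
ndh, dh = GazePinchHand(), GazePinchHand()
ndh.pinch_down(gazed=cube)
ndh.move((0.1, 0.0, 0.2))
dh.pinch_down(gazed=cube)
if dh.grabbed:
    dh.grabbed.rotation_deg += 45.0
print(cube.position, cube.rotation_deg)  # (0.1, 0.0, 0.2) 45.0
```

Because both hands hold only a reference to the gazed object, either one can release and re-acquire a different target without the other losing its grip, which is the flexible pairing of operations the abstract describes.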

RIS

TY - GEN

T1 - Hands-on, Hands-off

T2 - Gaze-Assisted Bimanual 3D Interaction

AU - Lystbæk, Mathias

AU - Mikkelsen, Thorbjørn

AU - Krisztandl, Roland

AU - Gonzalez, Eric J.

AU - Gonzalez-Franco, Mar

AU - Gellersen, Hans

AU - Pfeuffer, Ken

PY - 2024/10/21

Y1 - 2024/10/21

N2 - Extended Reality (XR) systems with hand-tracking support direct manipulation of objects with both hands. A common interaction in this context is for the non-dominant hand (NDH) to orient an object for input by the dominant hand (DH). We explore bimanual interaction with gaze through three new modes of interaction where the input of the NDH, DH, or both hands is indirect based on Gaze+Pinch. These modes enable a new dynamic interplay between our hands, allowing flexible alternation between and pairing of complementary operations. Through applications, we demonstrate several use cases in the context of 3D modelling, where users exploit occlusion-free, low-effort, and fluid two-handed manipulation. To gain a deeper understanding of each mode, we present a user study on an asymmetric rotate-translate task. Most participants preferred indirect input with both hands for lower physical effort, without a penalty on user performance. Otherwise, they preferred modes where the NDH oriented the object directly, supporting preshaping of the hand, which is more challenging with indirect gestures. The insights gained are of relevance for the design of XR interfaces that aim to leverage eye and hand input in tandem.

AB - Extended Reality (XR) systems with hand-tracking support direct manipulation of objects with both hands. A common interaction in this context is for the non-dominant hand (NDH) to orient an object for input by the dominant hand (DH). We explore bimanual interaction with gaze through three new modes of interaction where the input of the NDH, DH, or both hands is indirect based on Gaze+Pinch. These modes enable a new dynamic interplay between our hands, allowing flexible alternation between and pairing of complementary operations. Through applications, we demonstrate several use cases in the context of 3D modelling, where users exploit occlusion-free, low-effort, and fluid two-handed manipulation. To gain a deeper understanding of each mode, we present a user study on an asymmetric rotate-translate task. Most participants preferred indirect input with both hands for lower physical effort, without a penalty on user performance. Otherwise, they preferred modes where the NDH oriented the object directly, supporting preshaping of the hand, which is more challenging with indirect gestures. The insights gained are of relevance for the design of XR interfaces that aim to leverage eye and hand input in tandem.

U2 - 10.1145/3654777.3676331

DO - 10.1145/3654777.3676331

M3 - Conference contribution/Paper

SP - 1

EP - 12

BT - UIST'24: Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology

PB - Association for Computing Machinery (ACM)

CY - New York

ER -