
Exploring Gaze for Assisting Freehand Selection-based Text Entry in AR

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

Exploring Gaze for Assisting Freehand Selection-based Text Entry in AR. / Lystbæk, Mathias; Pfeuffer, Ken; Grønbæk, Jens Emil et al.
In: Proceedings of the ACM on Human-Computer Interaction, Vol. 6, No. ETRA, 141, 13.05.2022, p. 1-16.

Harvard

Lystbæk, M, Pfeuffer, K, Grønbæk, JE & Gellersen, H 2022, 'Exploring Gaze for Assisting Freehand Selection-based Text Entry in AR', Proceedings of the ACM on Human-Computer Interaction, vol. 6, no. ETRA, 141, pp. 1-16. https://doi.org/10.1145/3530882

APA

Lystbæk, M., Pfeuffer, K., Grønbæk, J. E., & Gellersen, H. (2022). Exploring Gaze for Assisting Freehand Selection-based Text Entry in AR. Proceedings of the ACM on Human-Computer Interaction, 6(ETRA), 1-16. Article 141. https://doi.org/10.1145/3530882

Vancouver

Lystbæk M, Pfeuffer K, Grønbæk JE, Gellersen H. Exploring Gaze for Assisting Freehand Selection-based Text Entry in AR. Proceedings of the ACM on Human-Computer Interaction. 2022 May 13;6(ETRA):1-16. 141. doi: 10.1145/3530882

Author

Lystbæk, Mathias ; Pfeuffer, Ken ; Grønbæk, Jens Emil et al. / Exploring Gaze for Assisting Freehand Selection-based Text Entry in AR. In: Proceedings of the ACM on Human-Computer Interaction. 2022 ; Vol. 6, No. ETRA. pp. 1-16.

Bibtex

@article{10b0004e4a934632832b1ce1aac52e80,
title = "Exploring Gaze for Assisting Freehand Selection-based Text Entry in AR",
abstract = "With eye-tracking increasingly available in Augmented Reality, we explore how gaze can be used to assist freehand gestural text entry. Here the eyes are often coordinated with manual input across the spatial positions of the keys. Inspired by this, we investigate gaze-assisted selection-based text entry through the concept of spatial alignment of both modalities. Users can enter text by aligning both gaze and manual pointer at each key, as a novel alternative to existing dwell-time or explicit manual triggers. We present a text entry user study comparing two of such alignment techniques to a gaze-only and a manual-only baseline. The results show that one alignment technique reduces physical finger movement by more than half compared to standard in-air finger typing, and is faster and exhibits less perceived eye fatigue than an eyes-only dwell-time technique. We discuss trade-offs between uni and multimodal text entry techniques, pointing to novel ways to integrate eye movements to facilitate virtual text entry.",
author = "Mathias Lystb{\ae}k and Ken Pfeuffer and Gr{\o}nb{\ae}k, {Jens Emil} and Hans Gellersen",
year = "2022",
month = may,
day = "13",
doi = "10.1145/3530882",
language = "English",
volume = "6",
pages = "1--16",
journal = "Proceedings of the ACM on Human-Computer Interaction - CSCW",
issn = "2573-0142",
publisher = "Association for Computing Machinery (ACM)",
number = "ETRA",

}

RIS

TY  - JOUR
T1  - Exploring Gaze for Assisting Freehand Selection-based Text Entry in AR
AU  - Lystbæk, Mathias
AU  - Pfeuffer, Ken
AU  - Grønbæk, Jens Emil
AU  - Gellersen, Hans
PY  - 2022/5/13
Y1  - 2022/5/13
N2  - With eye-tracking increasingly available in Augmented Reality, we explore how gaze can be used to assist freehand gestural text entry. Here the eyes are often coordinated with manual input across the spatial positions of the keys. Inspired by this, we investigate gaze-assisted selection-based text entry through the concept of spatial alignment of both modalities. Users can enter text by aligning both gaze and manual pointer at each key, as a novel alternative to existing dwell-time or explicit manual triggers. We present a text entry user study comparing two such alignment techniques to a gaze-only and a manual-only baseline. The results show that one alignment technique reduces physical finger movement by more than half compared to standard in-air finger typing, and is faster and exhibits less perceived eye fatigue than an eyes-only dwell-time technique. We discuss trade-offs between uni- and multimodal text entry techniques, pointing to novel ways to integrate eye movements to facilitate virtual text entry.
AB  - With eye-tracking increasingly available in Augmented Reality, we explore how gaze can be used to assist freehand gestural text entry. Here the eyes are often coordinated with manual input across the spatial positions of the keys. Inspired by this, we investigate gaze-assisted selection-based text entry through the concept of spatial alignment of both modalities. Users can enter text by aligning both gaze and manual pointer at each key, as a novel alternative to existing dwell-time or explicit manual triggers. We present a text entry user study comparing two such alignment techniques to a gaze-only and a manual-only baseline. The results show that one alignment technique reduces physical finger movement by more than half compared to standard in-air finger typing, and is faster and exhibits less perceived eye fatigue than an eyes-only dwell-time technique. We discuss trade-offs between uni- and multimodal text entry techniques, pointing to novel ways to integrate eye movements to facilitate virtual text entry.
U2  - 10.1145/3530882
DO  - 10.1145/3530882
M3  - Journal article
VL  - 6
SP  - 1
EP  - 16
JO  - Proceedings of the ACM on Human-Computer Interaction
JF  - Proceedings of the ACM on Human-Computer Interaction
SN  - 2573-0142
IS  - ETRA
M1  - 141
ER  -
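
The abstract describes selection by spatial alignment: a key is entered only when the gaze point and the manual (finger) pointer rest on the same key, replacing dwell time or an explicit trigger gesture. The minimal Python sketch below illustrates that concept only; it is not the authors' implementation, and the key geometry, coordinates, and names (Key, aligned_selection) are assumptions made for illustration.

# Illustrative sketch of alignment-based selection, assuming 2D key
# targets in the AR keyboard plane. Not the paper's code.
from dataclasses import dataclass

@dataclass
class Key:
    label: str
    x: float            # key centre (metres, hypothetical units)
    y: float
    half: float = 0.02  # half-width of the square key target

    def contains(self, px: float, py: float) -> bool:
        # True if point (px, py) falls inside this key's target area.
        return abs(px - self.x) <= self.half and abs(py - self.y) <= self.half

def aligned_selection(keys, gaze, finger):
    """Return the key hit by BOTH the gaze point and the finger pointer,
    or None when the two modalities do not agree on a single key."""
    gaze_key = next((k for k in keys if k.contains(*gaze)), None)
    finger_key = next((k for k in keys if k.contains(*finger)), None)
    return gaze_key if gaze_key is not None and gaze_key is finger_key else None

keyboard = [Key("A", 0.0, 0.0), Key("S", 0.05, 0.0)]
print(aligned_selection(keyboard, (0.01, 0.0), (-0.01, 0.005)))  # both on 'A' -> selected
print(aligned_selection(keyboard, (0.01, 0.0), (0.05, 0.0)))     # modalities disagree -> None

Because selection fires only when both modalities coincide on one key, neither gaze alone (no dwell timeout) nor the finger alone (no pinch or tap gesture) can trigger input by itself, which is the trade-off between unimodal and multimodal entry the abstract discusses.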