
Electronic data

  • Fitts_Law_Uta

    Accepted author manuscript, 7.01 MB, PDF document


A Fitts’ Law Study of Gaze-Hand Alignment for Selection in 3D User Interfaces

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published
  • Uta Wagner
  • Mathias Lystbæk
  • Pavel Manakhov
  • Jens Emil Grønbæk
  • Ken Pfeuffer
  • Hans Gellersen
Publication date: 19/04/2023
Host publication: 2023 CHI Conference on Human Factors in Computing Systems
Place of Publication: New York
Publisher: ACM
Original language: English

Abstract

Gaze-Hand Alignment has recently been proposed for multimodal selection in 3D. The technique takes advantage of gaze for target pre-selection, as gaze naturally precedes manual input. Selection is then completed when manual input aligns with gaze on the target, without the need for an additional click method. In this work we evaluate two alignment techniques, Gaze&Finger and Gaze&Handray, which combine gaze with image-plane pointing and raycasting respectively, in comparison with hands-only baselines and with Gaze&Pinch as an established multimodal technique. We used a Fitts’ Law study design with targets presented at different depths in the visual scene to assess the effect of parallax on performance. The alignment techniques outperformed their respective hands-only baselines. Gaze&Finger is efficient when targets are close to the image plane but less performant with increasing target depth due to parallax.
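
For readers unfamiliar with Fitts’ Law study designs, the sketch below shows the standard Shannon formulation of the index of difficulty and the throughput measure commonly derived from it. This is a generic illustration, not code or data from the paper; the distance, width, and movement-time values are purely hypothetical.

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits:
    ID = log2(D / W + 1), where D is movement amplitude and W is target width."""
    return math.log2(distance / width + 1)

def throughput(distance: float, width: float, movement_time: float) -> float:
    """Throughput in bits per second for one selection,
    computed as ID divided by movement time (seconds)."""
    return index_of_difficulty(distance, width) / movement_time

# Hypothetical example: a 0.30 m reach to a 0.05 m target selected in 0.6 s
print(index_of_difficulty(0.30, 0.05))  # ~2.81 bits
print(throughput(0.30, 0.05, 0.6))      # ~4.68 bits/s
```

Larger distances and smaller targets raise the index of difficulty, so comparing movement times across conditions with matched difficulty is what allows selection techniques such as those above to be compared fairly.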