
Electronic data

  • Outline Pursuits

    Rights statement: © ACM, 2020. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in CHI'20 Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems http://doi.acm.org/10.1145/3313831.3376438

    Accepted author manuscript, 4.44 MB, PDF document

    Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License

Links

Text available via DOI: http://doi.acm.org/10.1145/3313831.3376438


Outline Pursuits: Gaze-assisted Selection of Occluded Objects in Virtual Reality

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN > Conference contribution/Paper > peer-review

Published
Publication date: 23/04/2020
Host publication: CHI '20: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems
Place of Publication: New York
Publisher: ACM
ISBN (print): 9781450367080
Original language: English
Event: CHI 2020 - Honolulu, Hawaii
Duration: 25/04/2020 - 30/04/2020
https://chi2020.acm.org/

Conference

Conference: CHI 2020
Period: 25/04/20 - 30/04/20
Internet address: https://chi2020.acm.org/

Abstract

In 3D environments, objects can be difficult to select when they overlap, as this affects available target area and increases selection ambiguity. We introduce Outline Pursuits which extends a primary pointing modality for gaze-assisted selection of occluded objects. Candidate targets within a pointing cone are presented with an outline that is traversed by a moving stimulus. This affords completion of the selection by gaze attention to the intended target's outline motion, detected by matching the user's smooth pursuit eye movement. We demonstrate two techniques implemented based on the concept, one with a controller as the primary pointer, and one in which Outline Pursuits are combined with head pointing for hands-free selection. Compared with conventional raycasting, the techniques require less movement for selection as users do not need to reposition themselves for a better line of sight, and selection time and accuracy are less affected when targets become highly occluded.
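The pursuit-matching step described in the abstract is commonly realised by correlating the recent gaze trajectory with the trajectory of each candidate's moving outline stimulus over a sliding time window. The sketch below illustrates that general idea in Python; the function name pursuit_match, the correlation threshold, and the per-axis Pearson test are illustrative assumptions and not necessarily the exact detector used in the paper.

```python
import numpy as np

def pursuit_match(gaze_xy, candidate_trajectories, threshold=0.8):
    """Pick the candidate whose outline stimulus the gaze follows most closely.

    gaze_xy: (N, 2) array of gaze samples over a sliding time window.
    candidate_trajectories: dict mapping a candidate id to an (N, 2) array of
        the stimulus positions along that candidate's outline over the same window.
    Returns the best-matching candidate id, or None if no score exceeds the threshold.
    """
    best_id, best_score = None, threshold
    for cid, stim_xy in candidate_trajectories.items():
        # Pearson correlation between gaze and stimulus, computed per axis.
        rx = np.corrcoef(gaze_xy[:, 0], stim_xy[:, 0])[0, 1]
        ry = np.corrcoef(gaze_xy[:, 1], stim_xy[:, 1])[0, 1]
        score = min(rx, ry)  # require the gaze to follow the stimulus on both axes
        if score > best_score:
            best_id, best_score = cid, score
    return best_id
```

In use, the window would be refreshed each frame with the latest gaze samples and the current stimulus positions along every visible candidate outline, and a selection would be confirmed once one candidate's score exceeds the threshold.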

Bibliographic note

© ACM, 2020. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in CHI'20 Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems http://doi.acm.org/10.1145/3313831.3376438