
Snap, Pursuit and Gain: Virtual Reality Viewport Control by Gaze

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published
Publication date: 11/05/2024
Host publication: CHI'24: Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems
Place of Publication: New York
Publisher: ACM
Number of pages: 14
ISBN (electronic): 9798400703300
Original language: English
Event: CHI 2024: Surfing the World - Hawaiʻi Convention Center, Oahu, United States
Duration: 11/05/2024 – 16/05/2024
https://chi2024.acm.org/

Conference

Conference: CHI 2024
Country/Territory: United States
City: Oahu
Period: 11/05/24 – 16/05/24
Internet address: https://chi2024.acm.org/

Abstract

Head-mounted displays let users explore virtual environments through a viewport that is coupled with head movement. In this work, we investigate gaze as an alternative modality for viewport control, enabling exploration of virtual worlds with less head movement.
We designed three techniques that leverage gaze based on different eye movements: Dwell Snap for viewport rotation in discrete steps, Gaze Gain for amplified viewport rotation based on gaze angle, and Gaze Pursuit for central viewport alignment of gaze targets.
All three techniques enable 360-degree viewport control through naturally coordinated eye and head movement. We evaluated the techniques in comparison with controller snap and head amplification baselines, for both coarse and precise viewport control, and found them to be as fast and accurate. We observed a high variance in performance, which may be attributable to the different degrees to which humans tend to support gaze shifts with head movement.
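
To illustrate the kind of mapping the abstract describes, the sketch below shows one possible reading of Gaze Gain, where the viewport yaw is the head yaw plus an amplified gaze offset. The function name, gain value, and clamping behaviour are assumptions chosen for illustration, not the authors' implementation.

```python
# Illustrative sketch of a Gaze Gain-style mapping: the viewport rotation
# follows the head, with the gaze offset (relative to the head) amplified
# by a gain factor. Parameter names and values are assumptions.

def gaze_gain_yaw(head_yaw_deg: float, gaze_yaw_deg: float,
                  gain: float = 2.0, max_offset_deg: float = 180.0) -> float:
    """Return a viewport yaw combining head yaw with an amplified gaze offset.

    head_yaw_deg: current head orientation (yaw), in degrees.
    gaze_yaw_deg: gaze direction relative to the head, in degrees
                  (0 = looking straight ahead).
    gain:         amplification applied to the gaze offset (assumed value).
    """
    offset = gain * gaze_yaw_deg
    # Clamp the amplified offset so the combined rotation stays within a
    # full 360-degree sweep around the head orientation.
    offset = max(-max_offset_deg, min(max_offset_deg, offset))
    return (head_yaw_deg + offset) % 360.0


if __name__ == "__main__":
    # Looking 20 degrees to the right of the head with a gain of 2 rotates
    # the viewport 40 degrees beyond the head orientation: 10 + 40 = 50.
    print(gaze_gain_yaw(head_yaw_deg=10.0, gaze_yaw_deg=20.0))  # 50.0
```

With a gain greater than 1, small eye movements produce larger viewport rotations, which is consistent with the abstract's goal of exploring virtual worlds with less head movement; the exact gain function used in the paper is not specified here.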