Coordinated Eye and Head Movements for Gaze Interaction in 3D Environments

Research output: Thesis › Doctoral Thesis

Published
Publication date: 2023
Qualification: PhD
Awarding Institution
Supervisors/Advisors
Award date: 23/11/2022
Publisher
  • Lancaster University
Original language: English

Abstract

Gaze is attractive for interaction, as we naturally look at the objects we are interested in. As a result, gaze has received significant attention within human-computer interaction as an input modality. However, gaze input has typically been reduced either to eye movements alone, in situations where head movements are not expected, or to head movements used as an approximation of gaze when an eye tracker is unavailable. From these observations arise an opportunity and a challenge: we propose to treat gaze as multimodal, in line with psychology and neuroscience research, to represent user movements more accurately. The natural coordination of eye and head movements could then enable novel interaction techniques that further the possibilities of gaze as an input modality. However, knowledge of eye-head coordination in 3D environments, and of how to use it in interaction design, is limited.

This thesis explores eye-head coordination and its potential for interaction in 3D environments by developing interaction techniques that tackle established gaze-interaction issues. We study fundamental eye, head, and body movements during gaze shifts in virtual reality. From the study results, we design interaction techniques and applications that avoid the Midas touch issue, allow expressive gaze-based interaction, and handle eye-tracking accuracy issues. We ground the evaluation of our interaction techniques in empirical studies.

From these techniques and study results, we define three design principles for coordinated eye and head interaction: distinguishing between eye-only and head-supported gaze shifts, using eye-head alignment as input, and separating head movements performed as deliberate gestures from head movements that naturally occur to support gaze. We showcase new directions for gaze-based interaction and present a new way to think about gaze, taking a more comprehensive approach to gaze interaction and showing that there is more to gaze than just the eyes.
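
To make the first principle concrete, the sketch below classifies a completed gaze shift as eye-only or head-supported based on how much the head rotated during the shift. This is a minimal illustration, not code from the thesis; the GazeShift structure, the yaw-only simplification, and the 5-degree threshold are assumptions chosen for the example.

```python
# Illustrative sketch (not thesis code): separating eye-only from
# head-supported gaze shifts. The data layout, yaw-only simplification,
# and 5-degree threshold are assumptions for this example.
from dataclasses import dataclass


@dataclass
class GazeShift:
    eye_yaw_deg: float   # eye-in-head rotation accumulated over the shift
    head_yaw_deg: float  # head rotation accumulated over the same shift


HEAD_SUPPORT_THRESHOLD_DEG = 5.0  # assumed cutoff for a "head-supported" shift


def classify_gaze_shift(shift: GazeShift) -> str:
    """Label a completed gaze shift by how much the head contributed."""
    if abs(shift.head_yaw_deg) >= HEAD_SUPPORT_THRESHOLD_DEG:
        return "head-supported"
    return "eye-only"


# Example: a gaze shift carried almost entirely by the eyes.
print(classify_gaze_shift(GazeShift(eye_yaw_deg=18.0, head_yaw_deg=2.0)))  # eye-only
```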