
Electronic data

  • YanxiaZhang-PhDThesis

    Accepted author manuscript, 33.8 MB, PDF document

    Available under license: CC BY: Creative Commons Attribution 4.0 International License


Eye tracking and gaze interface design for pervasive displays

Research output: Thesis › Doctoral Thesis

Publication status: Published
Publication date: 2015
Number of pages: 225
Qualification: PhD
Awarding Institution:
Supervisors/Advisors:
Publisher:
  • Lancaster University
Original language: English

Abstract

Eye tracking for pervasive displays in everyday computing is an emerging research area. There is an increasing number of pervasive displays in our surroundings, such as large displays in public spaces, digital boards in offices and smart televisions at home. Gaze is an attractive input modality for these displays, as people naturally look at objects of interest and use their eyes to seek information. Existing research has applied eye tracking in a variety of fields, but it has tended to be confined to constrained environments and laboratory applications.

This thesis investigates how to enable robust gaze sensing in pervasive contexts and how eye tracking can be applied for pervasive displays that we encounter in our daily life. To answer these questions, we identify the technical and design challenges posed by using gaze for pervasive displays.

Firstly, in out-of-lab environments, interactions are usually spontaneous: users and systems are unaware of each other beforehand. This poses the technical requirement that gaze sensing must work without prior user training and remain robust in unconstrained environments. To address this issue, we develop novel vision-based systems that require only off-the-shelf RGB cameras.
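As a rough illustration of what off-the-shelf RGB hardware and standard vision tooling can provide, the sketch below uses OpenCV's bundled Haar cascades to locate the eyes in a webcam frame and derive a coarse, calibration-free gaze cue from the pupil position. It is a minimal, assumption-laden example for orientation only, not the sensing method developed in the thesis; the threshold values and the left/centre/right split are arbitrary choices.

```python
# Minimal sketch (not the thesis's method): a coarse, calibration-free gaze cue
# from an off-the-shelf RGB webcam using OpenCV's bundled Haar cascades.
# It locates the eyes, isolates the dark pupil region, and reports whether the
# pupil sits left, centre, or right within the eye box.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def coarse_gaze(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (fx, fy, fw, fh) in face_cascade.detectMultiScale(gray, 1.3, 5):
        face = gray[fy:fy + fh, fx:fx + fw]
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(face):
            eye = face[ey:ey + eh, ex:ex + ew]
            # The pupil/iris is roughly the darkest blob in the eye region.
            _, mask = cv2.threshold(eye, 40, 255, cv2.THRESH_BINARY_INV)
            m = cv2.moments(mask)
            if m["m00"] == 0:
                continue
            cx = m["m10"] / m["m00"]   # pupil centre, x in eye-box coordinates
            rel = cx / ew              # 0 = far left of the eye box, 1 = far right
            if rel < 0.4:
                return "left"
            if rel > 0.6:
                return "right"
            return "centre"
    return None

cap = cv2.VideoCapture(0)              # default webcam
ok, frame = cap.read()
if ok:
    print("coarse gaze:", coarse_gaze(frame))
cap.release()
```

A real pervasive-display system would need far more than this single-frame heuristic, such as head-pose handling, temporal smoothing and a mapping onto screen coordinates, but the example shows why commodity RGB cameras are a plausible starting point.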

Secondly, in pervasive contexts, users are usually unaware of the gaze interactivity of pervasive displays and of the technical restrictions of gaze sensing systems. However, there is little knowledge about how to enable people to use gaze interactive systems in daily life. Thus, we design novel interfaces that allow novice users to interact with content on pervasive displays, and we study the usage of our systems through field deployments. We demonstrate that people can walk up to a gaze interactive system and start using it immediately without human assistance.
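One widely used walk-up-and-use pattern for gaze interfaces is dwell-time selection, where resting the gaze on a target for a short period activates it, so a passer-by needs no mouse, touch or prior instruction. The sketch below illustrates that general idea; the `DwellSelector` class, the 0.8 s threshold and the simulated gaze samples are hypothetical choices for this example and are not taken from the interface designs studied in the thesis.

```python
# Minimal sketch (a common gaze-interaction pattern, assumed here rather than
# taken from the thesis): dwell-time selection, where looking at a target for
# a short period triggers it.
import time

DWELL_SECONDS = 0.8              # how long the gaze must rest on a target

class DwellSelector:
    def __init__(self, dwell=DWELL_SECONDS):
        self.dwell = dwell
        self.current = None      # target currently under the gaze
        self.since = None        # when the gaze first landed on it

    def update(self, target, now=None):
        """Feed the target under the latest gaze sample; returns the target
        name when its dwell threshold is reached, else None."""
        now = time.monotonic() if now is None else now
        if target != self.current:
            self.current, self.since = target, now
            return None
        if target is not None and now - self.since >= self.dwell:
            self.since = now     # re-arm so the target can be selected again
            return target
        return None

# Usage with simulated gaze samples: the gaze rests on "news" long enough.
selector = DwellSelector()
for t, target in [(0.0, "news"), (0.3, "news"), (0.9, "news"), (1.0, "weather")]:
    hit = selector.update(target, now=t)
    if hit:
        print("selected:", hit)
```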

Lastly, pervasive displays can also support co-located multi-user collaboration. We explore the use of gaze for collaborative tasks. Our results show that sharing gaze information on shared displays can ease communication and improve collaboration.
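As a simple illustration of what gaze sharing can mean in practice, the sketch below overlays each collaborator's current gaze point as a labelled cursor on a shared canvas. The user names, coordinates and colours are invented for the example and do not describe the thesis's collaborative system.

```python
# Minimal sketch (assumed design, not the thesis's system): overlaying each
# collaborator's live gaze point as a coloured, labelled cursor on a shared display.
import cv2
import numpy as np

W, H = 1280, 720
canvas = np.full((H, W, 3), 255, np.uint8)             # the shared display content

# Hypothetical gaze samples (pixel coordinates) streamed from two eye trackers.
gaze_points = {"alice": (420, 300), "bob": (900, 510)}
colours = {"alice": (0, 0, 255), "bob": (255, 0, 0)}    # BGR

for user, (x, y) in gaze_points.items():
    cv2.circle(canvas, (x, y), 18, colours[user], 2)    # gaze cursor ring
    cv2.putText(canvas, user, (x + 24, y + 6),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, colours[user], 2)

cv2.imwrite("shared_display_with_gaze.png", canvas)
```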

Although we demonstrate the benefits of using gaze for pervasive displays, open challenges remain in enabling gaze interaction in everyday computing, and they require further investigation. Our research provides a foundation for the rapidly growing field of eye tracking for pervasive displays.