Final published version
Research output: Contribution to Journal/Magazine › Journal article › peer-review
Journal publication date | 18/08/2015 |
---|---|
Journal | Personal and Ubiquitous Computing |
Issue number | 5-6 |
Volume | 19 |
Number of pages | 15 |
Pages (from-to) | 967-981 |
Publication status | Published |
Early online date | 3/07/2015 |
Original language | English |
In public display contexts, interactions are spontaneous and have to work without preparation. We propose gaze as a modality for such contexts, as gaze is always at the ready and a natural indicator of the user’s interest. We present GazeHorizon, a system that demonstrates spontaneous gaze interaction, enabling users to walk up to a display and navigate content using their eyes only. GazeHorizon is extemporaneous and optimised for instantaneous usability by any user, without prior configuration, calibration or training. The system provides interactive assistance to bootstrap gaze interaction with unaware users, employs a single off-the-shelf web camera and computer vision for person-independent tracking of horizontal gaze direction, and maps this input to rate-controlled navigation of horizontally arranged content. We have evaluated GazeHorizon through a series of field studies, culminating in a four-day deployment in a public environment during which over a hundred passers-by interacted with it, unprompted and unassisted. We found that because eye movements are subtle, users cannot learn gaze interaction merely by observing others, and guidance is therefore required.