Research output: Contribution to Journal/Magazine › Journal article › peer-review
TY - JOUR
T1 - Eye tracking for public displays in the wild
AU - Zhang, Yanxia
AU - Chong, Ming Ki
AU - Müller, Jörg
AU - Bulling, Andreas
AU - Gellersen, Hans
PY - 2015/8/18
Y1 - 2015/8/18
N2 - In public display contexts, interactions are spontaneous and have to work without preparation. We propose gaze as a modality for such contexts, as gaze is always at the ready, and a natural indicator of the user’s interest. We present GazeHorizon, a system that demonstrates spontaneous gaze interaction, enabling users to walk up to a display and navigate content using their eyes only. GazeHorizon is extemporaneous and optimised for instantaneous usability by any user without prior configuration, calibration or training. The system provides interactive assistance to bootstrap gaze interaction with unaware users, employs a single off-the-shelf web camera and computer vision for person-independent tracking of the horizontal gaze direction and maps this input to rate-controlled navigation of horizontally arranged content. We have evaluated GazeHorizon through a series of field studies, culminating in a 4-day deployment in a public environment during which over a hundred passers-by interacted with it, unprompted and unassisted. We realised that since eye movements are subtle, users cannot learn gaze interaction from only observing others and as a result guidance is required.
AB - In public display contexts, interactions are spontaneous and have to work without preparation. We propose gaze as a modality for such contexts, as gaze is always at the ready, and a natural indicator of the user’s interest. We present GazeHorizon, a system that demonstrates spontaneous gaze interaction, enabling users to walk up to a display and navigate content using their eyes only. GazeHorizon is extemporaneous and optimised for instantaneous usability by any user without prior configuration, calibration or training. The system provides interactive assistance to bootstrap gaze interaction with unaware users, employs a single off-the-shelf web camera and computer vision for person-independent tracking of the horizontal gaze direction and maps this input to rate-controlled navigation of horizontally arranged content. We have evaluated GazeHorizon through a series of field studies, culminating in a 4-day deployment in a public environment during which over a hundred passers-by interacted with it, unprompted and unassisted. We realised that since eye movements are subtle, users cannot learn gaze interaction from only observing others and as a result guidance is required.
KW - Calibration-free
KW - Deployment
KW - Eye tracking
KW - Gaze interaction
KW - In-the-wild study
KW - Public displays
KW - Scrolling
U2 - 10.1007/s00779-015-0866-8
DO - 10.1007/s00779-015-0866-8
M3 - Journal article
AN - SCOPUS:84939252890
VL - 19
SP - 967
EP - 981
JO - Personal and Ubiquitous Computing
JF - Personal and Ubiquitous Computing
SN - 1617-4909
IS - 5-6
ER -