

Gaze-touch: combining gaze with multi-touch for interaction on the same surface

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Publication status: Published
Publication date: 10/2014
Host publication: UIST '14: Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology
Place of publication: New York
Publisher: ACM
Pages: 509-518
Number of pages: 10
ISBN (print): 9781450330695
Original language: English

Abstract

Gaze has the potential to complement multi-touch for interaction on the same surface. We present gaze-touch, a technique that combines the two modalities based on the principle of "gaze selects, touch manipulates". Gaze is used to select a target, and coupled with multi-touch gestures that the user can perform anywhere on the surface. Gaze-touch enables users to manipulate any target from the same touch position, for whole-surface reachability and rapid context switching. Conversely, gaze-touch enables manipulation of the same target from any touch position on the surface, for example to avoid occlusion. Gaze-touch is designed to complement direct-touch as the default interaction on multi-touch surfaces. We provide a design space analysis of the properties of gaze-touch versus direct-touch, and present four applications that explore how gaze-touch can be used alongside direct-touch. The applications demonstrate use cases for interchangeable, complementary and alternative use of the two modes of interaction, and introduce novel techniques arising from the combination of gaze-touch and conventional multi-touch.
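The "gaze selects, touch manipulates" principle can be illustrated with a minimal sketch: a gaze event picks the target, and any subsequent touch gesture, started anywhere on the surface, is routed to that target. All names below (GazeTouchSurface, Target, the event handlers) are hypothetical and do not reflect the authors' implementation.

```python
# Illustrative sketch of "gaze selects, touch manipulates" (not the paper's code).
from dataclasses import dataclass


@dataclass
class Target:
    name: str
    x: float = 0.0
    y: float = 0.0

    def translate(self, dx: float, dy: float) -> None:
        # A multi-touch drag gesture applied to this target.
        self.x += dx
        self.y += dy


class GazeTouchSurface:
    """Routes touch gestures to whichever target the user is currently looking at."""

    def __init__(self, targets: list[Target]):
        self.targets = targets
        self.gazed_target: Target | None = None

    def on_gaze(self, gaze_x: float, gaze_y: float) -> None:
        # Gaze selects: pick the target nearest the current gaze point.
        self.gazed_target = min(
            self.targets,
            key=lambda t: (t.x - gaze_x) ** 2 + (t.y - gaze_y) ** 2,
        )

    def on_touch_drag(self, dx: float, dy: float) -> None:
        # Touch manipulates: the gesture may start anywhere on the surface,
        # but it acts on the gaze-selected target (indirect manipulation).
        if self.gazed_target is not None:
            self.gazed_target.translate(dx, dy)


# Usage: look at a distant target, then drag from a comfortable touch position.
surface = GazeTouchSurface([Target("map"), Target("photo", x=800, y=600)])
surface.on_gaze(790, 610)      # gaze falls near the "photo" target
surface.on_touch_drag(-15, 4)  # a drag anywhere on the surface moves that target
```

This separation of selection (gaze) from manipulation (touch) is what gives the technique whole-surface reachability and occlusion avoidance: the touch point never needs to coincide with the target.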