Multi-user Gaze-based Interaction Techniques on Collaborative Touchscreens

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Publication status: Published
Publication date: 25/05/2021
Host publication: ETRA '21 Short Papers: ACM Symposium on Eye Tracking Research and Applications
Place of publication: New York
Publisher: Association for Computing Machinery (ACM)
Number of pages: 7
ISBN (electronic): 9781450383455
Original language: English
Event: 2021 ACM Symposium on Eye Tracking Research and Applications, ETRA 2021 - Virtual, Online, United Kingdom
Duration: 24/05/2021 – 27/05/2021

Conference

Conference: 2021 ACM Symposium on Eye Tracking Research and Applications, ETRA 2021
Country/Territory: United Kingdom
City: Virtual, Online
Period: 24/05/21 – 27/05/21

Abstract

Eye-gaze is an implicit, fast, and hands-free input modality for a variety of use cases, though the majority of techniques focus on single-user contexts. In this work, we present an exploration of gaze techniques for users interacting together on the same surface. We explore interaction concepts that exploit two states in an interactive system: 1) users visually attending to the same object in the UI, or 2) users focusing on separate targets. As eye-tracking becomes increasingly available, interfaces can exploit these states, for example to dynamically personalise content on the UI for each user, and to provide a merged or compromise view of an object when both users' gaze falls upon it. These concepts are explored with a prototype horizontal interface that tracks the gaze of two users facing each other. We build three applications that illustrate different mappings of gaze to multi-user support: an indoor map with gaze-highlighted information, an interactive tree-of-life visualisation that dynamically expands based on users' gaze, and a world map application with gaze-aware fisheye zooming. We conclude with insights from a public deployment of this system, pointing toward engaging and seamless ways in which eye-based input integrates into collaborative interaction.
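
The core mechanism the abstract describes, deciding whether two users' gaze falls on the same UI object or on separate targets, can be sketched in a few lines. The following Python sketch is an illustration only, not the authors' implementation: all names (GazeState, Rect, hit_test, classify_gaze_state) and the point-in-rectangle hit test are assumptions made for exposition.

from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional


class GazeState(Enum):
    SHARED = auto()     # both users visually attend the same object
    SEPARATE = auto()   # users focus on separate targets
    UNDEFINED = auto()  # at least one gaze point hits no object


@dataclass(frozen=True)
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h


def hit_test(objects: dict[str, Rect], gaze: tuple[float, float]) -> Optional[str]:
    """Return the id of the first object containing the gaze point, if any."""
    for obj_id, bounds in objects.items():
        if bounds.contains(*gaze):
            return obj_id
    return None


def classify_gaze_state(objects: dict[str, Rect],
                        gaze_a: tuple[float, float],
                        gaze_b: tuple[float, float]) -> GazeState:
    """Map two users' gaze points to the shared/separate states."""
    target_a = hit_test(objects, gaze_a)
    target_b = hit_test(objects, gaze_b)
    if target_a is None or target_b is None:
        return GazeState.UNDEFINED
    return GazeState.SHARED if target_a == target_b else GazeState.SEPARATE


# Example: two regions on a shared horizontal surface (coordinates are made up).
ui = {"map_region": Rect(0, 0, 800, 600), "legend": Rect(800, 0, 200, 600)}
print(classify_gaze_state(ui, (120, 300), (640, 250)))  # GazeState.SHARED
print(classify_gaze_state(ui, (120, 300), (850, 100)))  # GazeState.SEPARATE

An interface could poll such a classifier each frame and, following the abstract's concepts, present a merged or compromise view of the object on SHARED and per-user personalised content on SEPARATE.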