Multi-user Gaze-based Interaction Techniques on Collaborative Touchscreens

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN. Conference contribution/Paper, peer-reviewed.

Published

Standard

Multi-user Gaze-based Interaction Techniques on Collaborative Touchscreens. / Pfeuffer, Ken; Alexander, Jason; Gellersen, Hans.
ETRA '21 Short Papers: ACM Symposium on Eye Tracking Research and Applications. New York: Association for Computing Machinery (ACM), 2021. 26.

Harvard

Pfeuffer, K, Alexander, J & Gellersen, H 2021, Multi-user Gaze-based Interaction Techniques on Collaborative Touchscreens. in ETRA '21 Short Papers: ACM Symposium on Eye Tracking Research and Applications., 26, Association for Computing Machinery (ACM), New York, 2021 ACM Symposium on Eye Tracking Research and Applications, ETRA 2021, Virtual, Online, United Kingdom, 24/05/21. https://doi.org/10.1145/3448018.3458016

APA

Pfeuffer, K., Alexander, J., & Gellersen, H. (2021). Multi-user Gaze-based Interaction Techniques on Collaborative Touchscreens. In ETRA '21 Short Papers: ACM Symposium on Eye Tracking Research and Applications (Article 26). Association for Computing Machinery (ACM). https://doi.org/10.1145/3448018.3458016

Vancouver

Pfeuffer K, Alexander J, Gellersen H. Multi-user Gaze-based Interaction Techniques on Collaborative Touchscreens. In ETRA '21 Short Papers: ACM Symposium on Eye Tracking Research and Applications. New York: Association for Computing Machinery (ACM). 2021. 26. doi: 10.1145/3448018.3458016

Author

Pfeuffer, Ken ; Alexander, Jason ; Gellersen, Hans. / Multi-user Gaze-based Interaction Techniques on Collaborative Touchscreens. ETRA '21 Short Papers: ACM Symposium on Eye Tracking Research and Applications. New York : Association for Computing Machinery (ACM), 2021.

Bibtex

@inproceedings{5e0fbfbbdf5c495db4d60017a041f9c0,
title = "Multi-user Gaze-based Interaction Techniques on Collaborative Touchscreens",
abstract = "Eye-gaze is a technology for implicit, fast, and hands-free input for a variety of use cases, with the majority of techniques focusing on single-user contexts. In this work, we present an exploration into gaze techniques of users interacting together on the same surface. We explore interaction concepts that exploit two states in an interactive system: 1) users visually attending to the same object in the UI, or 2) users focusing on separate targets. Interfaces can exploit these states with increasing availability of eye-tracking. For example, to dynamically personalise content on the UI to each user, and to provide a merged or compromised view on an object when both users' gaze falls upon it. These concepts are explored with a prototype horizontal interface that tracks gaze of two users facing each other. We build three applications that illustrate different mappings of gaze to multi-user support: an indoor map with gaze-highlighted information, an interactive tree-of-life visualisation that dynamically expands on users' gaze, and a worldmap application with gaze-aware fisheye zooming. We conclude with insights from a public deployment of this system, pointing toward the engaging and seamless ways in which eye-based input integrates into collaborative interaction.",
keywords = "collaboration, eye-tracking, gaze input, multi-user interaction, shared user interface",
author = "Ken Pfeuffer and Jason Alexander and Hans Gellersen",
year = "2021",
month = may,
day = "25",
doi = "10.1145/3448018.3458016",
language = "English",
booktitle = "ETRA '21 Short Papers: ACM Symposium on Eye Tracking Research and Applications",
publisher = "Association for Computing Machinery (ACM)",
address = "New York",
note = "2021 ACM Symposium on Eye Tracking Research and Applications, ETRA 2021 ; Conference date: 24-05-2021 Through 27-05-2021",

}
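The abstract's central mechanism, detecting whether two users are attending to the same UI object or to separate targets, can be illustrated with a minimal sketch. This is not the authors' implementation; the rectangle hit-testing and the helper names (`Rect`, `attention_state`) are illustrative assumptions, standing in for whatever target geometry and eye-tracking pipeline the real system uses:

```python
# Illustrative sketch only (not the paper's code): classifying the two-user
# gaze state described in the abstract from two gaze points on a shared screen.
from dataclasses import dataclass


@dataclass
class Rect:
    """Axis-aligned UI target region (hypothetical target model)."""
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h


def hit_target(gaze, targets):
    """Return the index of the first target the gaze point falls on, else None."""
    for i, target in enumerate(targets):
        if target.contains(*gaze):
            return i
    return None


def attention_state(gaze_a, gaze_b, targets):
    """Classify the two states the abstract names:
    'shared'   - both users visually attend to the same object,
    'separate' - each user focuses on a different target,
    'none'     - at least one gaze point hits no target."""
    a = hit_target(gaze_a, targets)
    b = hit_target(gaze_b, targets)
    if a is None or b is None:
        return "none"
    return "shared" if a == b else "separate"
```

An interface could branch on this state, for example showing a merged view of an object in the 'shared' state and per-user personalised content in the 'separate' state, as the abstract outlines.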

RIS

TY - GEN

T1 - Multi-user Gaze-based Interaction Techniques on Collaborative Touchscreens

AU - Pfeuffer, Ken

AU - Alexander, Jason

AU - Gellersen, Hans

PY - 2021/5/25

Y1 - 2021/5/25

N2 - Eye-gaze is a technology for implicit, fast, and hands-free input for a variety of use cases, with the majority of techniques focusing on single-user contexts. In this work, we present an exploration into gaze techniques of users interacting together on the same surface. We explore interaction concepts that exploit two states in an interactive system: 1) users visually attending to the same object in the UI, or 2) users focusing on separate targets. Interfaces can exploit these states with increasing availability of eye-tracking. For example, to dynamically personalise content on the UI to each user, and to provide a merged or compromised view on an object when both users' gaze falls upon it. These concepts are explored with a prototype horizontal interface that tracks gaze of two users facing each other. We build three applications that illustrate different mappings of gaze to multi-user support: an indoor map with gaze-highlighted information, an interactive tree-of-life visualisation that dynamically expands on users' gaze, and a worldmap application with gaze-aware fisheye zooming. We conclude with insights from a public deployment of this system, pointing toward the engaging and seamless ways in which eye-based input integrates into collaborative interaction.

AB - Eye-gaze is a technology for implicit, fast, and hands-free input for a variety of use cases, with the majority of techniques focusing on single-user contexts. In this work, we present an exploration into gaze techniques of users interacting together on the same surface. We explore interaction concepts that exploit two states in an interactive system: 1) users visually attending to the same object in the UI, or 2) users focusing on separate targets. Interfaces can exploit these states with increasing availability of eye-tracking. For example, to dynamically personalise content on the UI to each user, and to provide a merged or compromised view on an object when both users' gaze falls upon it. These concepts are explored with a prototype horizontal interface that tracks gaze of two users facing each other. We build three applications that illustrate different mappings of gaze to multi-user support: an indoor map with gaze-highlighted information, an interactive tree-of-life visualisation that dynamically expands on users' gaze, and a worldmap application with gaze-aware fisheye zooming. We conclude with insights from a public deployment of this system, pointing toward the engaging and seamless ways in which eye-based input integrates into collaborative interaction.

KW - collaboration

KW - eye-tracking

KW - gaze input

KW - multi-user interaction

KW - shared user interface

U2 - 10.1145/3448018.3458016

DO - 10.1145/3448018.3458016

M3 - Conference contribution/Paper

AN - SCOPUS:85107537767

BT - ETRA '21 Short Papers: ACM Symposium on Eye Tracking Research and Applications

PB - Association for Computing Machinery (ACM)

CY - New York

T2 - 2021 ACM Symposium on Eye Tracking Research and Applications, ETRA 2021

Y2 - 24 May 2021 through 27 May 2021

ER -