
Electronic data

  • pn0505-vellosoA

    Final published version, 1.12 MB, PDF document

    Available under license: CC BY: Creative Commons Attribution 4.0 International License

Links

Text available via DOI:


AmbiGaze: direct control of ambient devices by gaze

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published
Publication date: 4/06/2016
Host publication: DIS '16: Proceedings of the 2016 ACM Conference on Designing Interactive Systems
Place of publication: New York
Publisher: ACM
Pages: 812-817
Number of pages: 6
ISBN (print): 9781450340311
Original language: English

Abstract

Eye tracking offers many opportunities for direct device control in smart environments, but issues such as the need for calibration and the Midas touch problem make it impractical. In this paper, we propose AmbiGaze, a smart environment that animates targets to give users gaze-only control of devices through smooth pursuit tracking. We propose a design space of techniques for exposing functionality through movement and illustrate the concept with four prototypes. We evaluated the system in a user study and found that AmbiGaze enables robust gaze-only interaction with many devices, from multiple positions in the environment, in a spontaneous and comfortable manner.
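The core idea of smooth-pursuit selection is that when a user's eyes follow a moving target, the gaze trajectory correlates with that target's motion, so no per-user calibration is needed. The sketch below illustrates this matching principle only; it is not the paper's implementation, and the target names, the sliding-window correlation, and the 0.8 threshold are illustrative assumptions (correlation-based matching in this style follows the earlier "Pursuits" line of work).

```python
import math

def pearson(a, b):
    """Pearson correlation between two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    if sa == 0 or sb == 0:
        return 0.0
    return cov / (sa * sb)

def match_pursuit(gaze_xy, targets, threshold=0.8):
    """Return the name of the animated target whose motion best
    correlates with the gaze trajectory, or None if no target's
    correlation exceeds the threshold.

    gaze_xy: list of (x, y) gaze samples over a sliding window.
    targets: dict mapping target name -> list of (x, y) positions
             sampled at the same timestamps.
    (Threshold and window handling are illustrative assumptions.)
    """
    gx = [p[0] for p in gaze_xy]
    gy = [p[1] for p in gaze_xy]
    best_name, best_score = None, threshold
    for name, traj in targets.items():
        tx = [p[0] for p in traj]
        ty = [p[1] for p in traj]
        # Take the weaker of the per-axis correlations so that the
        # gaze must follow the target on both axes.
        score = min(pearson(gx, tx), pearson(gy, ty))
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy example: the gaze follows a circular target ("fan"),
# not a linearly moving one ("lamp").
ts = [i / 10 for i in range(30)]
circle = [(math.cos(t), math.sin(t)) for t in ts]
linear = [(t, 0.5 * t) for t in ts]
gaze = [(math.cos(t) + 0.01, math.sin(t) - 0.01) for t in ts]
print(match_pursuit(gaze, {"fan": circle, "lamp": linear}))  # → fan
```

Because only the *shape* of the trajectories matters, a constant offset between gaze and target (as in the toy example) does not hurt the match, which is why pursuit-based selection works without calibration.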