
Electronic data

  • GazeShifting

    Rights statement: ©ACM, 2015. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in UIST '15 Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology http://dx.doi.org/10.1145/2807442.2807460

    Accepted author manuscript, 7.21 MB, PDF document

    Available under license: CC BY: Creative Commons Attribution 4.0 International License


Gaze-shifting: direct-indirect input with pen and touch modulated by gaze

Research output: Contribution in Book/Report/Proceedings with ISBN/ISSN › Conference contribution/Paper › peer-review

Published

Abstract

Modalities such as pen and touch are associated with direct input but can also be used for indirect input. We propose combining the two modes for direct-indirect input modulated by gaze. We introduce gaze-shifting as a novel mechanism for switching the input mode based on the alignment of manual input and the user's visual attention. Input in the user's area of attention results in direct manipulation, whereas input offset from the user's gaze is redirected to the visual target. The technique is generic and can be used in the same manner with different input modalities. We show how gaze-shifting enables novel direct-indirect techniques with pen, touch, and combinations of pen and touch input.
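The mode-switching rule described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the `attention_radius` threshold, the `Point` type, and the `resolve_input` function are all hypothetical, standing in for whatever alignment test the paper's system uses.

```python
import math
from dataclasses import dataclass


@dataclass
class Point:
    x: float
    y: float


def distance(a: Point, b: Point) -> float:
    return math.hypot(a.x - b.x, a.y - b.y)


def resolve_input(touch: Point, gaze: Point, attention_radius: float = 100.0):
    """Return the input mode and effective target for a pen/touch event.

    attention_radius is an assumed threshold (in pixels) approximating the
    user's area of visual attention around the current gaze point.
    """
    if distance(touch, gaze) <= attention_radius:
        # Manual input aligned with gaze: direct manipulation in place.
        return ("direct", touch)
    # Manual input offset from gaze: redirect the action to the gaze target.
    return ("indirect", gaze)
```

Under this sketch, a tap where the user is looking manipulates the object directly, while a tap elsewhere acts as an indirect controller for the gazed-at target, matching the behavior the abstract describes.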
