
Electronic data

  • CHI_16BimanualInputGaze

    Rights statement: © {Owner/Author ACM}, 2016. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in CHI '16 Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems http://dx.doi.org/10.1145/2858036.2858201

    Accepted author manuscript, 0.99 MB, PDF document

    Available under license: CC BY: Creative Commons Attribution 4.0 International License


Partially-indirect Bimanual Input with Gaze, Pen, and Touch for Pan, Zoom, and Ink Interaction

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published
Publication date: 7/05/2016
Host publication: CHI '16 Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems
Place of Publication: New York
Publisher: ACM Press
Pages: 2845-2856
Number of pages: 12
ISBN (print): 9781450333627
Original language: English
Event: CHI 2016 - San Jose, California, United States
Duration: 7/05/2016 → 12/05/2016

Conference

Conference: CHI 2016
Country/Territory: United States
City: San Jose
Period: 7/05/16 → 12/05/16

Abstract

Bimanual pen and touch UIs are mainly based on the direct manipulation paradigm. Alternatively, we propose partially-indirect bimanual input, where direct pen input is used with the dominant hand, and indirect-touch input with the non-dominant hand. As direct and indirect inputs do not overlap, users can interact in the same space without interference. We investigate two indirect-touch techniques combined with direct pen input: the first redirects touches to the user’s gaze position, and the second redirects touches to the pen position. In this paper, we present an empirical user study where we compare both partially-indirect techniques to direct pen and touch input in bimanual pan, zoom, and ink tasks. Our experimental results show that users are comparatively fast with the indirect techniques, but more accurate, as users can dynamically change the zoom target during indirect zoom gestures. Further, our studies reveal that direct and indirect zoom gestures have distinct characteristics regarding spatial use, gestural use, and bimanual parallelism.
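To make the gaze-redirection idea concrete, the following is a minimal illustrative sketch in Python (with entirely hypothetical names and a simple offset/scale viewport model assumed here; it is not the authors' prototype). It shows an indirect two-finger pinch whose zoom target is the current gaze point rather than the touch centroid: the canvas point under the gaze stays fixed while the scale changes.

    # Illustrative sketch only: redirecting an indirect pinch so that
    # zooming is anchored at the gaze point on screen. All names are
    # hypothetical; the viewport is modelled as an offset plus a scale.
    from dataclasses import dataclass

    @dataclass
    class Viewport:
        offset_x: float  # canvas coordinate shown at the screen's top-left corner
        offset_y: float
        scale: float     # zoom factor: screen pixels per canvas unit

        def screen_to_canvas(self, sx, sy):
            # Invert the view transform: screen = (canvas - offset) * scale
            return (sx / self.scale + self.offset_x,
                    sy / self.scale + self.offset_y)

    def zoom_about_gaze(view, pinch_scale, gaze_x, gaze_y):
        """Apply an indirect pinch: the zoom target is the gaze point,
        so the canvas point under the gaze keeps its screen position."""
        # Canvas point currently under the gaze
        cx, cy = view.screen_to_canvas(gaze_x, gaze_y)
        new_scale = view.scale * pinch_scale
        # Choose the new offset so (cx, cy) still projects to (gaze_x, gaze_y)
        return Viewport(offset_x=cx - gaze_x / new_scale,
                        offset_y=cy - gaze_y / new_scale,
                        scale=new_scale)

    # Example: a 1.2x pinch while the user looks at screen position (800, 450)
    view = Viewport(offset_x=0.0, offset_y=0.0, scale=1.0)
    view = zoom_about_gaze(view, 1.2, gaze_x=800, gaze_y=450)
    print(view)  # the canvas point that was under the gaze remains under it

An analogous variant would anchor the zoom at the pen's screen position instead of the gaze point, corresponding to the second indirect technique described in the abstract.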

Bibliographic note

© {Owner/Author ACM}, 2016. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in CHI '16 Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems http://dx.doi.org/10.1145/2858036.2858201