
Electronic data

  • 0-Master-Document

    Rights statement: © 2017 Copyright held by the owner/author(s). Publication rights licensed to Association for Computing Machinery. 2017. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 1, 3, 2017 http://doi.acm.org/10.1145/3130910

    Accepted author manuscript, 11.2 MB, PDF document

    Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License


Remote Control by Body Movement in Synchrony with Orbiting Widgets: an Evaluation of TraceMatch

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published
Article number: 45
Journal publication date: 1/09/2017
Journal: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
Issue number: 3
Volume: 1
Number of pages: 22
Publication status: Published
Original language: English

Abstract

In this work we consider how users can employ body movement for remote control with minimal effort and maximum flexibility. TraceMatch is a novel technique in which the interface displays available controls as circular widgets with orbiting targets, and users trigger a control by mimicking the displayed motion. The technique uses computer vision to detect circular motion as a uniform type of input, yet is highly appropriable, as users can produce matching motion with any part of their body. We present three studies that investigate input performance with different parts of the body, user preferences, and spontaneous choice of movements for input in realistic application scenarios. The results show that users can provide effective input with their head, their hands, or while holding objects; that multiple controls can be distinguished effectively by differences in the presented phase and direction of movement; and that users choose and switch modes of input seamlessly.
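To illustrate the core idea of matching user motion to orbiting targets, the sketch below correlates a tracked point's trajectory with each widget's expected orbit and distinguishes widgets by phase and direction. This is a minimal, hypothetical reconstruction of the motion-correlation principle, not the authors' implementation: the function names, the sampling rate, and the use of per-axis Pearson correlation are illustrative assumptions.

```python
import math

def pearson(a, b):
    """Pearson correlation of two equal-length sequences (0.0 if degenerate)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    if va == 0 or vb == 0:
        return 0.0
    return cov / (va * vb)

def orbit(t, freq_hz, phase, direction):
    """Unit-circle position of an orbiting target at time t (direction is +1 or -1)."""
    ang = direction * 2 * math.pi * freq_hz * t + phase
    return math.cos(ang), math.sin(ang)

def match_score(trace, times, freq_hz, phase, direction):
    """Correlate a tracked feature's trajectory with one widget's orbit.

    trace: list of (x, y) positions of any tracked body feature.
    Returns the lower of the x- and y-axis correlations; values near 1.0
    mean the user's motion follows this widget's target.
    """
    expected = [orbit(t, freq_hz, phase, direction) for t in times]
    ex, ey = zip(*expected)
    ox, oy = zip(*trace)
    return min(pearson(ox, ex), pearson(oy, ey))

# Example: a user tracing widget A's clockwise orbit matches A but not
# widget B, which orbits in the opposite direction with opposite phase.
times = [i / 30 for i in range(60)]                      # 2 s at 30 fps
trace = [orbit(t, 0.5, 0.0, +1) for t in times]          # motion following widget A
score_a = match_score(trace, times, 0.5, 0.0, +1)        # ≈ 1.0
score_b = match_score(trace, times, 0.5, math.pi, -1)    # well below threshold
```

Because Pearson correlation is invariant to scale and offset, the same matching works whether the user moves their head slightly or sweeps a held object over a large arc, which is consistent with the appropriability the abstract describes.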
