
Electronic data

  • 0-Master-Document

    Rights statement: © 2017 Copyright held by the owner/author(s). Publication rights licensed to Association for Computing Machinery. 2017. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 1, 3, 2017 http://doi.acm.org/10.1145/3130910

    Accepted author manuscript, 11.2 MB, PDF document

    Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License

Text available via DOI: 10.1145/3130910

Remote Control by Body Movement in Synchrony with Orbiting Widgets: an Evaluation of TraceMatch

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

Remote Control by Body Movement in Synchrony with Orbiting Widgets: an Evaluation of TraceMatch. / Clarke, Christopher; Bellino, Alessio; Abreu Esteves, Augusto Emanuel et al.
In: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Vol. 1, No. 3, 45, 01.09.2017.


Harvard

APA

Vancouver

Clarke C, Bellino A, Abreu Esteves AE, Gellersen H-WG. Remote Control by Body Movement in Synchrony with Orbiting Widgets: an Evaluation of TraceMatch. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies. 2017 Sept 1;1(3):45. doi: 10.1145/3130910

Author

Clarke, Christopher ; Bellino, Alessio ; Abreu Esteves, Augusto Emanuel et al. / Remote Control by Body Movement in Synchrony with Orbiting Widgets: an Evaluation of TraceMatch. In: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies. 2017 ; Vol. 1, No. 3.

Bibtex

@article{a89ccf404a6d42329dbe9f23a3efa396,
title = "Remote Control by Body Movement in Synchrony with Orbiting Widgets: an Evaluation of TraceMatch",
abstract = "In this work we consider how users can use body movement for remote control with minimal effort and maximum flexibility. TraceMatch is a novel technique where the interface displays available controls as circular widgets with orbiting targets, and where users can trigger a control by mimicking the displayed motion. The technique uses computer vision to detect circular motion as a uniform type of input, but is highly appropriable as users can produce matching motion with any part of their body. We present three studies that investigate input performance with different parts of the body, user preferences, and spontaneous choice of movements for input in realistic application scenarios. The results show that users can provide effective input with their head, hands and while holding objects, that multiple controls can be effectively distinguished by the difference in presented phase and direction of movement, and that users choose and switch modes of input seamlessly.",
author = "Christopher Clarke and Alessio Bellino and {Abreu Esteves}, {Augusto Emanuel} and Gellersen, {Hans-Werner Georg}",
note = "{\textcopyright} 2017 Copyright held by the owner/author(s). Publication rights licensed to Association for Computing Machinery. 2017. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 1, 3, 2017 http://doi.acm.org/10.1145/3130910",
year = "2017",
month = sep,
day = "1",
doi = "10.1145/3130910",
language = "English",
volume = "1",
journal = "Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies",
issn = "2474-9567",
publisher = "Association for Computing Machinery (ACM)",
number = "3",
pages = "45",
}

RIS

TY - JOUR

T1 - Remote Control by Body Movement in Synchrony with Orbiting Widgets

T2 - an Evaluation of TraceMatch

AU - Clarke, Christopher

AU - Bellino, Alessio

AU - Abreu Esteves, Augusto Emanuel

AU - Gellersen, Hans-Werner Georg

N1 - © 2017 Copyright held by the owner/author(s). Publication rights licensed to Association for Computing Machinery. 2017. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 1, 3, 2017 http://doi.acm.org/10.1145/3130910

PY - 2017/9/1

Y1 - 2017/9/1

N2 - In this work we consider how users can use body movement for remote control with minimal effort and maximum flexibility. TraceMatch is a novel technique where the interface displays available controls as circular widgets with orbiting targets, and where users can trigger a control by mimicking the displayed motion. The technique uses computer vision to detect circular motion as a uniform type of input, but is highly appropriable as users can produce matching motion with any part of their body. We present three studies that investigate input performance with different parts of the body, user preferences, and spontaneous choice of movements for input in realistic application scenarios. The results show that users can provide effective input with their head, hands and while holding objects, that multiple controls can be effectively distinguished by the difference in presented phase and direction of movement, and that users choose and switch modes of input seamlessly.

AB - In this work we consider how users can use body movement for remote control with minimal effort and maximum flexibility. TraceMatch is a novel technique where the interface displays available controls as circular widgets with orbiting targets, and where users can trigger a control by mimicking the displayed motion. The technique uses computer vision to detect circular motion as a uniform type of input, but is highly appropriable as users can produce matching motion with any part of their body. We present three studies that investigate input performance with different parts of the body, user preferences, and spontaneous choice of movements for input in realistic application scenarios. The results show that users can provide effective input with their head, hands and while holding objects, that multiple controls can be effectively distinguished by the difference in presented phase and direction of movement, and that users choose and switch modes of input seamlessly.

U2 - 10.1145/3130910

DO - 10.1145/3130910

M3 - Journal article

VL - 1

JO - Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies

JF - Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies

SN - 2474-9567

IS - 3

M1 - 45

ER -
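
To give a concrete feel for the motion-matching idea described in the abstract — triggering a control by mimicking an orbiting target's motion, with multiple controls distinguished by phase and direction — here is a minimal, hypothetical sketch. It is not the authors' implementation (the paper's detection pipeline is not reproduced in this record); it simply correlates a tracked body-point trajectory with each widget's orbit, so that motion matching in phase and direction scores near 1 while an opposite-direction or opposite-phase orbit scores negatively. All names are illustrative.

```python
import math

def orbit_positions(n, period, radius=1.0, phase=0.0, clockwise=False):
    """Sample n positions of a target orbiting a circle (one sample per frame)."""
    sign = -1.0 if clockwise else 1.0
    return [(radius * math.cos(phase + sign * 2 * math.pi * t / period),
             radius * math.sin(phase + sign * 2 * math.pi * t / period))
            for t in range(n)]

def pearson(a, b):
    """Pearson correlation of two equal-length sample sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    if sa == 0 or sb == 0:
        return 0.0
    return cov / (sa * sb)

def match_score(trajectory, target):
    """Correlate a tracked body-point trajectory with an orbiting target.

    Both arguments are lists of (x, y) samples over the same time window.
    Taking the minimum of the per-axis correlations means the score is high
    only when the movement matches the orbit in both phase and direction;
    an opposite-direction orbit flips the sign on one axis and scores low.
    """
    tx, ty = zip(*trajectory)
    ox, oy = zip(*target)
    return min(pearson(tx, ox), pearson(ty, oy))
```

Distinguishing widgets by direction falls out of the trigonometry: reversing the orbit negates the y-component (sin) while leaving x (cos) unchanged, so the minimum of the two axis correlations drops below zero for the mismatched widget.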