
Electronic data

  • Radi-Eye Accepted Version

    Rights statement: © ACM, 2021. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in CHI '21: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 2021 http://doi.acm.org/10.1145/3411764.3445697

    Accepted author manuscript, 6.21 MB, PDF document

    Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License


Radi-Eye: Hands-Free Radial Interfaces for 3D Interaction using Gaze-Activated Head-Crossing

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published
Publication date: 8/05/2021
Host publication: CHI '21: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems
Place of Publication: New York
Publisher: ACM
Pages: 1-11
Number of pages: 11
ISBN (electronic): 9781450380966
Original language: English

Abstract

Eye gaze and head movement are attractive for hands-free 3D interaction in head-mounted displays, but existing interfaces afford only limited control. Radi-Eye is a novel pop-up radial interface designed to maximise expressiveness with input from only the eyes and head. Radi-Eye provides widgets for discrete and continuous input and scales to support larger feature sets. Widgets can be selected with Look & Cross, using gaze for pre-selection followed by head-crossing as trigger and for manipulation. The technique leverages natural eye-head coordination where eye and head move at an offset unless explicitly brought into alignment, enabling interaction without risk of unintended input. We explore Radi-Eye in three augmented and virtual reality applications, and evaluate the effect of radial interface scale and orientation on performance with Look & Cross. The results show that Radi-Eye provides users with fast and accurate input while opening up a new design space for hands-free fluid interaction.
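The Look & Cross mechanism described above (gaze pre-selects a widget, a subsequent head-pointer crossing triggers it) can be sketched in a few lines of Python. This is only an illustrative reconstruction from the abstract, not the paper's implementation: widgets are simplified to circular regions, and all names (`Widget`, `LookAndCross`, `update`) are hypothetical.

```python
from dataclasses import dataclass


@dataclass
class Widget:
    """A selectable region, modeled as a circle (hypothetical simplification)."""
    name: str
    cx: float
    cy: float
    r: float

    def contains(self, x: float, y: float) -> bool:
        return (x - self.cx) ** 2 + (y - self.cy) ** 2 <= self.r ** 2


class LookAndCross:
    """Sketch of gaze pre-selection followed by head-crossing as the trigger."""

    def __init__(self, widgets):
        self.widgets = widgets
        self.preselected = None       # widget the gaze currently dwells on
        self.head_was_inside = False  # head-pointer state for crossing detection

    def update(self, gaze, head):
        """Feed one frame of (x, y) gaze and head-pointer samples.

        Returns the selected widget's name on a crossing event, else None.
        """
        gx, gy = gaze
        hx, hy = head
        # Gaze pre-selects whichever widget it currently dwells on.
        hit = next((w for w in self.widgets if w.contains(gx, gy)), None)
        if hit is not self.preselected:
            self.preselected = hit
            self.head_was_inside = hit is not None and hit.contains(hx, hy)
            return None
        if self.preselected is None:
            return None
        # Head crossing: pointer moves from outside the widget to inside it
        # while the widget is pre-selected by gaze.
        inside = self.preselected.contains(hx, hy)
        crossed = inside and not self.head_was_inside
        self.head_was_inside = inside
        return self.preselected.name if crossed else None
```

Because selection requires gaze and head to be deliberately brought into alignment on the same widget, incidental head movement alone cannot trigger input, which mirrors the abstract's claim about avoiding unintended activation.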
