
Electronic data

  • Mobile_GES_AR: Accepted author manuscript, 10.7 MB, PDF document

Links

Text available via DOI:

User-Elicited Surface and Motion Gestures for Object Manipulation in Mobile Augmented Reality

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published

Publication date: 28/09/2022
Host publication: MobileHCI '22: Adjunct Publication of the 24th International Conference on Human-Computer Interaction with Mobile Devices and Services
Place of publication: New York
Publisher: ACM
Pages: 13:1-13:6
Number of pages: 6
ISBN (electronic): 9781450393416
Original language: English

Publication series

Name: MobileHCI 2022 Adjunct - Publication of the 24th ACM International Conference on Human-Computer Interaction with Mobile Devices and Services

Abstract

Recent advancements in mobile and AR technology can facilitate powerful and practical solutions for six degrees of freedom (6DOF) manipulation of 3D objects on mobile devices. However, existing 6DOF manipulation research typically focuses on surface gestures, relying on widgets for modal interaction to segment manipulations and degrees of freedom, at the cost of efficiency and intuitiveness. In this paper, we explore a combination of surface and motion gestures to present an implicit modal interaction method for 6DOF manipulation of 3D objects in Mobile Augmented Reality (MAR). We conducted a guessability study focused on key object manipulations, resulting in a set of user-defined motion and surface gestures. Our results indicate that the user-defined gestures achieve reasonable degrees of agreement while also being easy to use. Additionally, we present a prototype system that makes use of a consensus set of gestures and leverages user mobility for manipulating virtual objects in MAR.
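
The abstract reports "degrees of agreement" for the elicited gestures, but this listing does not reproduce the metric. As background, the sketch below shows the agreement rate commonly used in gesture-elicitation studies (Vatavu & Wobbrock, CHI 2015); the function name, gesture labels, and example data are illustrative assumptions, not taken from the paper.

    from collections import Counter

    def agreement_rate(proposals):
        # Agreement rate AR(r) for one referent (Vatavu & Wobbrock, 2015):
        # AR(r) = sum over groups of identical proposals P_i of
        #         (|P_i|/|P|) * ((|P_i|-1)/(|P|-1)).
        # `proposals` holds one gesture label per participant.
        n = len(proposals)
        if n < 2:
            return 0.0  # AR is undefined for fewer than two proposals
        return sum((k / n) * ((k - 1) / (n - 1))
                   for k in Counter(proposals).values())

    # Hypothetical referent "rotate object" with eight participants:
    rotate = ["twist", "twist", "twist", "drag-arc", "drag-arc",
              "twist", "two-finger-turn", "twist"]
    print(f"AR(rotate) = {agreement_rate(rotate):.3f}")  # -> AR(rotate) = 0.393

Higher AR values indicate stronger consensus for a referent; elicitation studies typically compute AR per referent and take the most frequent proposal in each group to form a consensus gesture set, as the abstract describes.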