Electronic data

  • Mobile_GES_AR

    Accepted author manuscript, 10.7 MB, PDF document

User-Elicited Surface and Motion Gestures for Object Manipulation in Mobile Augmented Reality

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN > Conference contribution/Paper > peer-review

Published

Standard

User-Elicited Surface and Motion Gestures for Object Manipulation in Mobile Augmented Reality. / Harris, Daniel; Potts, Dominic; Houben, Steven.
MobileHCI '22: Adjunct Publication of the 24th International Conference on Human-Computer Interaction with Mobile Devices and Services. New York: ACM, 2022. p. 13:1-13:6 13 (MobileHCI 2022 Adjunct - Publication of the 24th ACM International Conference on Human-Computer Interaction with Mobile Devices and Services).
Harvard

Harris, D, Potts, D & Houben, S 2022, User-Elicited Surface and Motion Gestures for Object Manipulation in Mobile Augmented Reality. in MobileHCI '22: Adjunct Publication of the 24th International Conference on Human-Computer Interaction with Mobile Devices and Services., 13, MobileHCI 2022 Adjunct - Publication of the 24th ACM International Conference on Human-Computer Interaction with Mobile Devices and Services, ACM, New York, pp. 13:1-13:6. https://doi.org/10.1145/3528575.3551443

APA

Harris, D., Potts, D., & Houben, S. (2022). User-Elicited Surface and Motion Gestures for Object Manipulation in Mobile Augmented Reality. In MobileHCI '22: Adjunct Publication of the 24th International Conference on Human-Computer Interaction with Mobile Devices and Services (pp. 13:1-13:6). Article 13 (MobileHCI 2022 Adjunct - Publication of the 24th ACM International Conference on Human-Computer Interaction with Mobile Devices and Services). ACM. https://doi.org/10.1145/3528575.3551443

Vancouver

Harris D, Potts D, Houben S. User-Elicited Surface and Motion Gestures for Object Manipulation in Mobile Augmented Reality. In MobileHCI '22: Adjunct Publication of the 24th International Conference on Human-Computer Interaction with Mobile Devices and Services. New York: ACM. 2022. p. 13:1-13:6. 13. (MobileHCI 2022 Adjunct - Publication of the 24th ACM International Conference on Human-Computer Interaction with Mobile Devices and Services). doi: 10.1145/3528575.3551443

Author

Harris, Daniel ; Potts, Dominic ; Houben, Steven. / User-Elicited Surface and Motion Gestures for Object Manipulation in Mobile Augmented Reality. MobileHCI '22: Adjunct Publication of the 24th International Conference on Human-Computer Interaction with Mobile Devices and Services. New York : ACM, 2022. pp. 13:1-13:6 (MobileHCI 2022 Adjunct - Publication of the 24th ACM International Conference on Human-Computer Interaction with Mobile Devices and Services).

Bibtex

@inproceedings{177a76a53f6746679f11ed50df7de276,
title = "User-Elicited Surface and Motion Gestures for Object Manipulation in Mobile Augmented Reality",
abstract = "Recent advancements in mobile and AR technology can facilitate powerful and practical solutions for six degrees of freedom (6DOF) manipulation of 3D objects on mobile devices. However, existing 6DOF manipulation research typically focuses on surface gestures, relying on widgets for modal interaction to segment manipulations and degrees of freedom at the cost of efficiency and intuitiveness. In this paper, we explore a combination of surface and motion gestures to present an implicit modal interaction method for 6DOF manipulation of 3D objects in Mobile Augmented Reality (MAR). We conducted a guessability study that focused on key object manipulations, resulting in a set of user-defined motion and surface gestures. Our results indicate that user-defined gestures both have reasonable degrees of agreement whilst also being easy to use. Additionally, we present a prototype system that makes use of a consensus set of gestures that leverage user mobility for manipulating virtual objects in MAR.",
keywords = "Mobile Computing, Augmented Reality, Interaction Techniques, Gesture Elicitation",
author = "Daniel Harris and Dominic Potts and Steven Houben",
year = "2022",
month = sep,
day = "28",
doi = "10.1145/3528575.3551443",
language = "English",
series = "MobileHCI 2022 Adjunct - Publication of the 24th ACM International Conference on Human-Computer Interaction with Mobile Devices and Services",
publisher = "ACM",
pages = "13:1--13:6",
booktitle = "MobileHCI '22: Adjunct Publication of the 24th International Conference on Human-Computer Interaction with Mobile Devices and Services",
address = "New York",
}

RIS

TY - GEN

T1 - User-Elicited Surface and Motion Gestures for Object Manipulation in Mobile Augmented Reality

AU - Harris, Daniel

AU - Potts, Dominic

AU - Houben, Steven

PY - 2022/9/28

Y1 - 2022/9/28

N2 - Recent advancements in mobile and AR technology can facilitate powerful and practical solutions for six degrees of freedom (6DOF) manipulation of 3D objects on mobile devices. However, existing 6DOF manipulation research typically focuses on surface gestures, relying on widgets for modal interaction to segment manipulations and degrees of freedom at the cost of efficiency and intuitiveness. In this paper, we explore a combination of surface and motion gestures to present an implicit modal interaction method for 6DOF manipulation of 3D objects in Mobile Augmented Reality (MAR). We conducted a guessability study that focused on key object manipulations, resulting in a set of user-defined motion and surface gestures. Our results indicate that user-defined gestures both have reasonable degrees of agreement whilst also being easy to use. Additionally, we present a prototype system that makes use of a consensus set of gestures that leverage user mobility for manipulating virtual objects in MAR.

AB - Recent advancements in mobile and AR technology can facilitate powerful and practical solutions for six degrees of freedom (6DOF) manipulation of 3D objects on mobile devices. However, existing 6DOF manipulation research typically focuses on surface gestures, relying on widgets for modal interaction to segment manipulations and degrees of freedom at the cost of efficiency and intuitiveness. In this paper, we explore a combination of surface and motion gestures to present an implicit modal interaction method for 6DOF manipulation of 3D objects in Mobile Augmented Reality (MAR). We conducted a guessability study that focused on key object manipulations, resulting in a set of user-defined motion and surface gestures. Our results indicate that user-defined gestures both have reasonable degrees of agreement whilst also being easy to use. Additionally, we present a prototype system that makes use of a consensus set of gestures that leverage user mobility for manipulating virtual objects in MAR.

KW - Mobile Computing

KW - Augmented Reality

KW - Interaction Techniques

KW - Gesture Elicitation

U2 - 10.1145/3528575.3551443

DO - 10.1145/3528575.3551443

M3 - Conference contribution/Paper

T3 - MobileHCI 2022 Adjunct - Publication of the 24th ACM International Conference on Human-Computer Interaction with Mobile Devices and Services

SP - 13:1-13:6

BT - MobileHCI '22: Adjunct Publication of the 24th International Conference on Human-Computer Interaction with Mobile Devices and Services

PB - ACM

CY - New York

ER -