

Generalizing wave gestures from sparse examples for real-time character control

Research output: Contribution to journal › Journal article

  • Helge Rhodin
  • James Tompkin
  • Kwang In Kim
  • Edilson de Aguiar
  • Hanspeter Pfister
  • Hans-Peter Seidel
  • Christian Theobalt
Article number: 181
Journal publication date: 11/2015
Journal: ACM Transactions on Graphics (Proc. SIGGRAPH Asia)
Issue number: 6
Volume: 34
Publication status: Published
Original language: English

Abstract

Motion-tracked real-time character control is important for games and VR, but current solutions are limited: retargeting is hard for non-human characters, with locomotion bound to the sensing volume; and pose mappings are ambiguous and not robust with consumer trackers, with dynamic motion properties unwieldy. We robustly estimate wave properties — amplitude, frequency, and phase — for a set of interactively defined gestures, by mapping user motions to a low-dimensional independent representation. The mapping both separates simultaneous or intersecting gestures, and extrapolates gesture variations from single training examples. For animation control, e.g., locomotion, wave properties map naturally to stride length, step frequency, and progression, and allow smooth animation from standing, to walking, to running. Simultaneous gestures are disambiguated successfully. Interpolating out-of-phase locomotions is hard, e.g., quadruped legs between walks and runs, so we introduce a new time-interpolation scheme to reduce artifacts. These improvements to real-time motion-tracked character control are particularly important for common cyclic animations, which we validate in a user study, with versatility to apply to part and full body motions across a variety of sensors.
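To illustrate the idea of wave-property estimation driving locomotion, here is a minimal sketch. It is not the paper's method (which learns a low-dimensional representation that separates simultaneous gestures); all function names, the FFT-based estimator, and the normalization constants `max_amp` and `max_freq` are hypothetical, chosen only to show how amplitude and frequency of a cyclic gesture signal could map to stride length and step rate.

```python
import numpy as np

def estimate_wave_properties(signal, dt):
    """Estimate amplitude, frequency, and phase of a roughly sinusoidal
    1-D motion signal (illustrative stand-in, not the paper's estimator)."""
    x = signal - signal.mean()            # remove constant offset
    amplitude = np.sqrt(2.0) * x.std()    # RMS -> sine amplitude
    spectrum = np.fft.rfft(x)             # frequency from dominant FFT bin
    freqs = np.fft.rfftfreq(len(x), d=dt)
    k = np.argmax(np.abs(spectrum[1:])) + 1   # skip the DC bin
    frequency = freqs[k]
    phase = np.angle(spectrum[k]) + np.pi / 2  # sine-phase convention
    return amplitude, frequency, phase

def to_locomotion_params(amplitude, frequency, max_amp=0.3, max_freq=3.0):
    """Map wave properties to animation controls as the abstract describes:
    amplitude -> stride length, frequency -> step rate (both normalized)."""
    stride = float(np.clip(amplitude / max_amp, 0.0, 1.0))
    step_rate = float(np.clip(frequency / max_freq, 0.0, 1.0))
    return stride, step_rate

# Synthetic waving motion: 0.15 m amplitude at 2 Hz, sampled at 60 fps.
dt = 1.0 / 60.0
t = np.arange(0.0, 4.0, dt)
y = 0.15 * np.sin(2 * np.pi * 2.0 * t)
amp, freq, _ = estimate_wave_properties(y, dt)
stride, rate = to_locomotion_params(amp, freq)
```

With this synthetic input the estimator recovers roughly the true amplitude (0.15) and frequency (2 Hz), giving a mid-range stride and step rate; the paper's contribution is making such estimates robust for noisy consumer trackers and overlapping gestures, which a plain FFT does not handle.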

Bibliographic note

Date of Acceptance: 31/08/2015