TY - JOUR
T1 - Generalizing wave gestures from sparse examples for real-time character control
AU - Rhodin, Helge
AU - Tompkin, James
AU - Kim, Kwang In
AU - de Aguiar, Edilson
AU - Pfister, Hanspeter
AU - Seidel, Hans-Peter
AU - Theobalt, Christian
N1 - Date of Acceptance: 31/08/2015
PY - 2015/11
Y1 - 2015/11
N2 - Motion-tracked real-time character control is important for games and VR, but current solutions are limited: retargeting is hard for non-human characters, with locomotion bound to the sensing volume; and pose mappings are ambiguous and not robust with consumer trackers, with dynamic motion properties unwieldy. We robustly estimate wave properties — amplitude, frequency, and phase — for a set of interactively-defined gestures, by mapping user motions to a low-dimensional independent representation. The mapping both separates simultaneous or intersecting gestures, and extrapolates gesture variations from single training examples. For animation control, e.g., locomotion, wave properties map naturally to stride length, step frequency, and progression, and allow smooth animation from standing, to walking, to running. Simultaneous gestures are disambiguated successfully. Interpolating out-of-phase locomotions is hard, e.g., quadruped legs between walks and runs, so we introduce a new time-interpolation scheme to reduce artifacts. These improvements to real-time motion-tracked character control are particularly important for common cyclic animations, which we validate in a user study, with versatility to apply to part and full body motions across a variety of sensors.
AB - Motion-tracked real-time character control is important for games and VR, but current solutions are limited: retargeting is hard for non-human characters, with locomotion bound to the sensing volume; and pose mappings are ambiguous and not robust with consumer trackers, with dynamic motion properties unwieldy. We robustly estimate wave properties — amplitude, frequency, and phase — for a set of interactively-defined gestures, by mapping user motions to a low-dimensional independent representation. The mapping both separates simultaneous or intersecting gestures, and extrapolates gesture variations from single training examples. For animation control, e.g., locomotion, wave properties map naturally to stride length, step frequency, and progression, and allow smooth animation from standing, to walking, to running. Simultaneous gestures are disambiguated successfully. Interpolating out-of-phase locomotions is hard, e.g., quadruped legs between walks and runs, so we introduce a new time-interpolation scheme to reduce artifacts. These improvements to real-time motion-tracked character control are particularly important for common cyclic animations, which we validate in a user study, with versatility to apply to part and full body motions across a variety of sensors.
U2 - 10.1145/2816795.2818082
DO - 10.1145/2816795.2818082
M3 - Journal article
VL - 34
JO - ACM Transactions on Graphics (Proc. SIGGRAPH Asia)
JF - ACM Transactions on Graphics (Proc. SIGGRAPH Asia)
SN - 0730-0301
IS - 6
M1 - 181
ER -