TY - GEN
T1 - AutoBAP
T2 - ACII '13 Affective Computing and Intelligent Interaction 2013
AU - Velloso, Eduardo
AU - Bulling, Andreas
AU - Gellersen, Hans
PY - 2013
Y1 - 2013
N2 - Manual annotation of human body movement is an integral part of research on non-verbal communication and computational behaviour analysis but also a very time-consuming and tedious task. In this paper we present AutoBAP, a system that automates the coding of bodily expressions according to the body action and posture (BAP) coding scheme. Our system takes continuous body motion and gaze behaviour data as its input. The data is recorded using a full body motion tracking suit and a wearable eye tracker. From the data our system automatically generates a labelled XML file that can be visualised and edited with off-the-shelf video annotation tools. We evaluate our system in a laboratory-based user study with six participants performing scripted sequences of 184 actions. Results from the user study show that our prototype system is able to annotate 172 out of the 274 labels of the full BAP coding scheme with good agreement with a manual annotator (Cohen’s kappa > 0.6).
AB - Manual annotation of human body movement is an integral part of research on non-verbal communication and computational behaviour analysis but also a very time-consuming and tedious task. In this paper we present AutoBAP, a system that automates the coding of bodily expressions according to the body action and posture (BAP) coding scheme. Our system takes continuous body motion and gaze behaviour data as its input. The data is recorded using a full body motion tracking suit and a wearable eye tracker. From the data our system automatically generates a labelled XML file that can be visualised and edited with off-the-shelf video annotation tools. We evaluate our system in a laboratory-based user study with six participants performing scripted sequences of 184 actions. Results from the user study show that our prototype system is able to annotate 172 out of the 274 labels of the full BAP coding scheme with good agreement with a manual annotator (Cohen’s kappa > 0.6).
U2 - 10.1109/ACII.2013.29
DO - 10.1109/ACII.2013.29
M3 - Conference contribution/Paper
SN - 9780769550480
SP - 135
EP - 140
BT - Proceedings of the 5th International Conference on Affective Computing and Intelligent Interaction (ACII’13)
PB - IEEE
CY - Piscataway, N.J.
Y2 - 2 September 2013 through 5 September 2013
ER -