
AutoBAP: automatic coding of body action and posture units from wearable sensors

Research output: Contribution in Book/Report/Proceedings › Paper

Published

Publication date: 2013
Host publication: Proceedings of the 5th International Conference on Affective Computing and Intelligent Interaction (ACII '13)
Place of publication: Piscataway, N.J.
Publisher: IEEE
Pages: 135-140
Number of pages: 6
ISBN (Print): 9780769550480
Original language: English

Conference

Conference: ACII '13 Affective Computing and Intelligent Interaction 2013
Country: Switzerland
City: Geneva
Period: 2/09/13 – 5/09/13

Abstract

Manual annotation of human body movement is an integral part of research on non-verbal communication and computational behaviour analysis, but it is also a very time-consuming and tedious task. In this paper, we present AutoBAP, a system that automates the coding of bodily expressions according to the body action and posture (BAP) coding scheme. Our system takes continuous body motion and gaze behaviour data as its input, recorded using a full body motion tracking suit and a wearable eye tracker. From these data, our system automatically generates a labelled XML file that can be visualised and edited with off-the-shelf video annotation tools. We evaluate our system in a laboratory-based user study with six participants performing scripted sequences of 184 actions. Results from the user study show that our prototype system is able to annotate 172 out of the 274 labels of the full BAP coding scheme with good agreement with a manual annotator (Cohen's kappa > 0.6).
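The agreement figure reported above is Cohen's kappa, which corrects raw label agreement for agreement expected by chance. The sketch below is not from the paper; it is a minimal illustration of how such a kappa score can be computed from two per-segment label sequences (the example labels are hypothetical placeholders, not actual BAP codes).

```python
from collections import Counter

def cohen_kappa(auto_labels, manual_labels):
    """Cohen's kappa: chance-corrected agreement between two coders."""
    assert len(auto_labels) == len(manual_labels) and auto_labels
    n = len(auto_labels)
    # Observed agreement: fraction of segments where both coders give the same label.
    p_o = sum(a == m for a, m in zip(auto_labels, manual_labels)) / n
    # Expected chance agreement, from each coder's marginal label distribution.
    auto_freq = Counter(auto_labels)
    manual_freq = Counter(manual_labels)
    p_e = sum(auto_freq[label] * manual_freq[label] for label in auto_freq) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical per-segment labels: automatic coder (e.g. AutoBAP output) vs. human coder.
auto = ["arm_raise", "head_turn", "rest", "arm_raise", "rest"]
manual = ["arm_raise", "head_turn", "rest", "rest", "rest"]
print(f"kappa = {cohen_kappa(auto, manual):.2f}")  # kappa = 0.69
```

Values above roughly 0.6 are conventionally read as substantial agreement, which is the threshold the abstract uses to count a label as successfully annotated.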