
Interaction Recognition Through Body Parts Relation Reasoning

Research output: Conference contribution/Paper in Book/Report/Proceedings (with ISBN/ISSN), peer-reviewed

Published
Publication date: 23/02/2020
Host publication: Pattern Recognition: ACPR 2019
Editors: S. Palaiahnakote, G. Sanniti di Baja, L. Wang, W. Yan
Place of publication: Cham
Publisher: Springer
Pages: 268-280
Number of pages: 13
ISBN (electronic): 9783030414047
ISBN (print): 9783030414030
Original language: English

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Publisher: Springer
Volume: 12046
ISSN (print): 0302-9743
ISSN (electronic): 1611-3349

Abstract

Person-person mutual action recognition (also referred to as interaction recognition) is an important branch of human activity analysis. Early solutions relied on carefully designed local points and hand-crafted features; the field then progressed to deep learning architectures such as CNNs and LSTMs. These solutions often involve complicated architectures and mechanisms that embed the relationship between the two persons in the architecture itself, to ensure the interaction patterns can be properly learned. Our contribution is a simpler yet very powerful architecture, named Interaction Relational Network, which uses minimal prior knowledge about the structure of the data. We drive the network to learn how to relate the body parts of the interacting persons in order to better discriminate among the possible interactions. By breaking down the body parts across frames into sets of independent joints, and with a few augmentations to our architecture that explicitly extract meaningful extra information from each pair of joints, our solution achieves state-of-the-art performance on the traditional interaction recognition dataset SBU, as well as on the mutual actions of the large-scale dataset NTU RGB+D.
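The pairwise joint reasoning the abstract describes follows the general relation-network recipe: apply a shared relation function to every pair of joints drawn from the two persons, pool the results order-invariantly, and classify the pooled feature. The following is a minimal NumPy sketch of that recipe under assumed toy dimensions; the function names, weights, and sizes are illustrative placeholders, not the paper's actual Interaction Relational Network.

```python
import numpy as np

rng = np.random.default_rng(0)

def relation_network_logits(joints_a, joints_b, w_rel, w_cls):
    """Score interactions by reasoning over all inter-person joint pairs.

    joints_a, joints_b: (J, D) arrays of per-person joint coordinates.
    w_rel: (2*D, H) weights of a toy one-layer relation function g.
    w_cls: (H, C) weights of a toy classifier f over the pooled relations.
    (All shapes and weights are illustrative, not from the paper.)
    """
    # Build every inter-person joint pair as one concatenated vector.
    pairs = np.array([np.concatenate([ja, jb])
                      for ja in joints_a for jb in joints_b])  # (J*J, 2*D)
    # Shared relation function g applied to each pair (linear + ReLU).
    relations = np.maximum(pairs @ w_rel, 0.0)                 # (J*J, H)
    # Order-invariant pooling: sum over all pairs, so the model needs
    # no prior knowledge of which pairs matter.
    pooled = relations.sum(axis=0)                             # (H,)
    # Classifier f over the aggregated relational feature.
    return pooled @ w_cls                                      # (C,)

# Toy setup: 15 joints per person, 3-D coordinates, 8 interaction classes.
J, D, H, C = 15, 3, 16, 8
logits = relation_network_logits(rng.normal(size=(J, D)),
                                 rng.normal(size=(J, D)),
                                 rng.normal(size=(2 * D, H)),
                                 rng.normal(size=(H, C)))
print(logits.shape)  # (8,)
```

Because the sum over pairs is permutation-invariant, the same shared weights handle any joint pairing, which is why such an architecture needs so little built-in structural prior.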