
Electronic data

  • PersonalityTOMM

    Accepted author manuscript, 1.61 MB, PDF document

    Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License

Links

Text available via DOI: http://doi.acm.org/10.1145/3233184


Personalized emotion recognition by personality-aware high-order learning of physiological signals

Research output: Contribution to Journal/Magazine › Journal article › peer-review

  • S. Zhao
  • A. Gholaminejad
  • G. Ding
  • Y. Gao
  • J. Han
  • K. Keutzer
Article number: 14
Journal publication date: 1/02/2019
Journal: ACM Transactions on Multimedia Computing, Communications, and Applications
Issue number: 1S
Volume: 15
Number of pages: 18
Publication status: Published
Original language: English

Abstract

Because different subjects respond subjectively to the same physical stimuli, emotion recognition from physiological signals is increasingly personalized. Existing works have mainly focused on modeling each subject's physiological data, without considering psychological factors such as interest and personality; the latent correlations among different subjects have also rarely been examined. In this article, we investigate the influence of personality on emotional behavior in a hypergraph learning framework. Treating each vertex as a compound tuple (subject, stimulus), multi-modal hypergraphs can be constructed based on the personality correlations among different subjects and on the physiological correlations among the corresponding stimuli. To capture the differing importance of vertices, hyperedges, and modalities, we learn a weight for each of them. Because the hypergraphs connect different subjects through the compound vertices, the emotions of multiple subjects can be recognized simultaneously. The constructed hypergraphs are therefore vertex-weighted, multi-modal, and multi-task. The estimated factors, referred to as emotion relevance, are employed for emotion recognition. Extensive experiments on the ASCERTAIN dataset demonstrate the superiority of the proposed method over state-of-the-art emotion recognition approaches.
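The abstract describes the hypergraph construction only at a high level. The following minimal NumPy sketch illustrates one plausible reading of it: compound (subject, stimulus) vertices joined by two modality-specific hyperedge families, assembled into an incidence matrix. The k-nearest-neighbour hyperedge rule, the feature dimensions, and all names here are illustrative assumptions, not the authors' implementation; in particular, the learned vertex, hyperedge, and modality weights central to the paper are not reproduced.

```python
# A rough sketch, assuming k-NN hyperedges and random stand-in features.
import numpy as np

rng = np.random.default_rng(0)

n_subjects, n_stimuli = 5, 8
personality = rng.normal(size=(n_subjects, 5))   # stand-in for per-subject personality scores
physiology = rng.normal(size=(n_stimuli, 16))    # stand-in for per-stimulus physiological features

# Compound vertices: one per (subject, stimulus) pair, as in the abstract.
vertices = [(s, t) for s in range(n_subjects) for t in range(n_stimuli)]

def knn_hyperedges(features, k=3):
    """One hyperedge per item, grouping it with its k nearest neighbours."""
    d = np.linalg.norm(features[:, None] - features[None, :], axis=-1)
    return [set(np.argsort(d[i])[: k + 1]) for i in range(len(features))]

subj_edges = knn_hyperedges(personality)   # personality-modality hyperedges over subjects
stim_edges = knn_hyperedges(physiology)    # physiology-modality hyperedges over stimuli

# Incidence matrix H over compound vertices: vertex (s, t) joins a subject
# hyperedge if s is in that neighbourhood, and a stimulus hyperedge if t is.
edges = [("subj", m) for m in subj_edges] + [("stim", m) for m in stim_edges]
H = np.zeros((len(vertices), len(edges)))
for vi, (s, t) in enumerate(vertices):
    for ei, (kind, members) in enumerate(edges):
        H[vi, ei] = 1.0 if (s if kind == "subj" else t) in members else 0.0

print(H.shape)  # (40, 13): n_subjects * n_stimuli vertices, n_subjects + n_stimuli hyperedges
```

A full implementation along the paper's lines would go on to learn weights for vertices, hyperedges, and modalities over this incidence structure, and to estimate the per-vertex emotion relevance used for recognition.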

Bibliographic note

© ACM, 2018. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in ACM Transactions on Multimedia Computing, Communications, and Applications (TOMM) - Special Section on Deep Learning for Intelligent Multimedia Analytics and Special Section on Multi-Modal Understanding of Social, Affective and Subjective Attributes of Data, 15, 1s, 2019 http://doi.acm.org/10.1145/3233184