


Real-time facial expression recognition based on iterative transfer learning and efficient attention network

Research output: Contribution to Journal/Magazine › Journal article › peer-review

  • Y. Kong
  • S. Zhang
  • K. Zhang
  • Q. Ni
  • J. Han
Journal publication date: 31/05/2022
Journal: IET Image Processing
Issue number: 6
Number of pages: 15
Pages (from-to): 1694-1708
Publication status: Published
Early online date: 16/02/2022
Original language: English


Real-time facial expression recognition is the basis for computers to understand human emotions and detect abnormalities in time. To address server overload and leakage of private information, this paper proposes a real-time facial expression recognition method based on iterative transfer learning and an efficient attention network (EAN) for edge resource-constrained scenes. First, an EAN is designed whose parameter count and computational cost are strictly limited through depthwise separable convolution and a local channel attention mechanism. Then, following the idea of knowledge distillation, the EAN produces soft labels for the facial expression data, providing additional supervision during training. Finally, an iterative teacher-student (T-S) transfer learning method is proposed; it refines the soft labels of the teacher network and further improves the recognition accuracy of the student network. Tests on the public datasets FER2013 and RAF-DB show that the method significantly reduces model complexity while achieving high recognition accuracy. Compared with other advanced methods, the proposed method strikes a good balance between complexity and accuracy, and meets the real-time deployment requirements of facial expression recognition technology in edge resource-constrained scenes. © 2022 The Authors. IET Image Processing published by John Wiley & Sons Ltd on behalf of The Institution of Engineering and Technology.
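The soft-label supervision described in the abstract follows the standard knowledge-distillation recipe: the teacher's temperature-softened class probabilities supplement the one-hot ground-truth labels when training the student. The sketch below illustrates that generic loss in NumPy; the temperature `T`, weight `alpha`, and function names are illustrative assumptions, not values or code from the paper.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax: higher T yields a softer distribution,
    # exposing the teacher's relative confidence across wrong classes.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, hard_label, T=4.0, alpha=0.7):
    """Blend of a soft-label term (teacher) and a hard-label term (ground truth).

    T and alpha are hypothetical hyperparameters for illustration only.
    """
    p_teacher = softmax(teacher_logits, T)          # teacher's soft labels
    log_p_student = np.log(softmax(student_logits, T))
    soft_term = -np.sum(p_teacher * log_p_student)  # cross-entropy vs soft labels
    log_q = np.log(softmax(student_logits, 1.0))
    hard_term = -log_q[hard_label]                  # standard cross-entropy
    # The T**2 factor compensates for the 1/T scaling of soft-target
    # gradients, as in the usual distillation formulation.
    return alpha * (T ** 2) * soft_term + (1 - alpha) * hard_term
```

A student whose logits match the teacher's (and whose argmax matches the hard label) incurs a lower loss than one that disagrees, which is what drives the student toward the teacher's refined labels in each iteration of the T-S scheme.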