

Robust human face tracking in eigenspace for perceptual human-robot interaction

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Chapter

Published
Publication date: 1/12/2010
Host publication: Computer Vision for Multimedia Applications: Methods and Solutions
Editors: Jinjun Wang, Jian Cheng, Shuqiang Jiang
Publisher: IGI Global
Pages: 60-72
Number of pages: 13
ISBN (electronic): 9781609600266
ISBN (print): 9781609600242
Original language: English

Abstract

This chapter introduces a robust human face tracking scheme for vision-based human-robot interaction. Detected face-like regions in the video sequence are tracked with an unscented Kalman filter (UKF), and face occlusions are handled by an online appearance-based scheme built on principal component analysis (PCA). Experiments on a standard test video validate that the proposed PCA-based face tracking remains robust under face occlusions.
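
The occlusion-handling idea described in the abstract — projecting the observed face patch into a learned eigenspace and flagging a large reconstruction residual as a likely occlusion — can be sketched as follows. This is a minimal illustration, not the chapter's implementation: the synthetic "face" vectors, the component count, and the residual threshold logic are all assumptions for the demo; in the full scheme the PCA model would be updated online and the flag would gate the UKF measurement update.

```python
import numpy as np

def fit_pca(patches, n_components=5):
    """Learn a PCA appearance model from vectorized face patches (one per row)."""
    mean = patches.mean(axis=0)
    centered = patches - mean
    # SVD of the centered data: rows of vt are the principal directions.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_components]

def reconstruction_error(patch, mean, components):
    """Project a patch into the eigenspace and measure the residual norm.

    A large residual means the observation no longer matches the learned
    face appearance — e.g. because part of the face is occluded.
    """
    centered = patch - mean
    coeffs = components @ centered          # eigenspace coefficients
    recon = components.T @ coeffs           # back-projection
    return float(np.linalg.norm(centered - recon))

# Demo on synthetic data: train on noisy copies of a "face" vector,
# then compare a clean observation against a half-occluded one.
rng = np.random.default_rng(0)
face = rng.normal(size=64)                       # stand-in for a face patch
train = face + 0.05 * rng.normal(size=(20, 64))  # training appearances
mean, comps = fit_pca(train, n_components=5)

clean = face + 0.05 * rng.normal(size=64)
occluded = clean.copy()
occluded[:32] = 0.0                              # occluder covers half the patch

err_clean = reconstruction_error(clean, mean, comps)
err_occluded = reconstruction_error(occluded, mean, comps)
# The occluded patch reconstructs much worse than the clean one,
# which is the cue used to detect occlusion.
```

In a tracker, `err_occluded`-style residuals above a threshold would mark the measurement as unreliable, letting the UKF coast on its motion prediction until the face reappears.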