



Robust online multi-target visual tracking using a HISP filter with discriminative deep appearance learning

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Article number: 102952
Journal publication date: 31/05/2021
Journal: Journal of Visual Communication and Image Representation
Number of pages: 13
Publication status: Published
Early online date: 5/05/21
Original language: English


We propose a novel online multi-target visual tracker based on the recently developed Hypothesized and Independent Stochastic Population (HISP) filter. The HISP filter combines the advantages of traditional tracking approaches such as the MHT and point-process-based approaches such as the PHD filter, and it has linear complexity while maintaining track identities. We apply this filter to track multiple targets in video sequences acquired under varying environmental conditions and target densities, using a tracking-by-detection approach. We also adopt a deep CNN appearance representation by training a verification-identification network (VerIdNet) on large-scale person re-identification data sets. We construct an augmented likelihood in a principled manner using these deep CNN appearance features and spatio-temporal information. Furthermore, we solve the problem of two or more targets having identical labels by considering the weight propagated with each confirmed hypothesis. Extensive experiments on the MOT16 and MOT17 benchmark data sets show that our tracker significantly outperforms several state-of-the-art trackers in terms of tracking accuracy.
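To illustrate the kind of fusion the abstract describes, the sketch below combines a CNN appearance affinity (cosine similarity between embeddings) with a spatio-temporal cue (a Gaussian likelihood on predicted position) into a single augmented score. This is a minimal illustrative sketch, not the paper's actual HISP likelihood: the weighting `alpha`, the Gaussian scale `sigma`, and the simple convex combination are all assumptions introduced here for clarity.

```python
import numpy as np

def cosine_similarity(a, b):
    # Appearance affinity between two CNN embedding vectors, in [-1, 1].
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def spatial_likelihood(pred_pos, det_pos, sigma=20.0):
    # Isotropic Gaussian likelihood of a detection given the track's
    # predicted position; sigma (pixels) is an assumed scale parameter.
    d2 = np.sum((np.asarray(pred_pos, float) - np.asarray(det_pos, float)) ** 2)
    return float(np.exp(-d2 / (2.0 * sigma ** 2)))

def augmented_likelihood(track_emb, det_emb, pred_pos, det_pos,
                         alpha=0.5, sigma=20.0):
    # Fuse appearance and spatio-temporal cues into one score in [0, 1].
    # alpha is a hypothetical mixing weight, not from the paper.
    app = 0.5 * (cosine_similarity(track_emb, det_emb) + 1.0)  # map to [0, 1]
    spa = spatial_likelihood(pred_pos, det_pos, sigma)
    return alpha * app + (1.0 - alpha) * spa
```

In a tracking-by-detection loop, such a score would be computed for every track–detection pair before data association; a matching detection (similar embedding, nearby position) scores close to 1, while a mismatched, distant one scores near 0.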