
The influence of multi-sensor video fusion on object tracking using a particle filter

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published
Publication date: 2/10/2006
Host publication: Informatik für Menschen
Editors: Christian Hochberger
Place of Publication: Bonn
Publisher: Gesellschaft für Informatik
Pages: 354-358
Number of pages: 5
ISBN (print): 978-3-88579-187-4
Original language: English
Event: LNCS from the 2nd Workshop on Multiple Sensor Data Fusion: Solutions, Applications - Dresden, Germany
Duration: 2/10/2006 - 6/10/2006

Conference

Conference: LNCS from the 2nd Workshop on Multiple Sensor Data Fusion: Solutions, Applications
City: Dresden, Germany
Period: 2/10/06 - 6/10/06

Abstract

This paper investigates how object tracking performance is affected by the fusion quality of videos from visible (VIZ) and infrared (IR) surveillance cameras, compared with tracking in single-modality videos. The videos were fused using simple averaging and various multiresolution techniques. Tracking was performed by means of a particle filter using colour and edge cues. The highest tracking accuracy was obtained in the IR sequences, whereas the VIZ video was affected by many artifacts and showed the worst tracking performance. Among the fused videos, the complex wavelet and averaging techniques offered the best tracking performance, comparable to that of IR. Thus, of all the methods investigated, the fused videos, which contain complementary contextual information from both single-modality input videos, are the best source for further analysis by a human observer or a computer program.
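
For reference, the sketch below illustrates the simplest of the fusion schemes named in the abstract: pixel-wise averaging of co-registered visible and infrared frames. This is not the authors' code; the function name, the greyscale/uint8 assumptions, and the synthetic test frames are illustrative only, and real use would require frames that are already spatially registered.

```python
# Minimal sketch of simple averaging fusion: each fused frame is the
# pixel-wise mean of the co-registered visible (VIZ) and infrared (IR)
# frames. Assumes both inputs are greyscale uint8 arrays of equal size.
import numpy as np

def fuse_average(viz_frame: np.ndarray, ir_frame: np.ndarray) -> np.ndarray:
    """Return the pixel-wise average of two co-registered greyscale frames."""
    if viz_frame.shape != ir_frame.shape:
        raise ValueError("frames must be co-registered to the same size")
    # Compute in float to avoid uint8 overflow, then round back to uint8.
    fused = (viz_frame.astype(np.float32) + ir_frame.astype(np.float32)) / 2.0
    return np.clip(np.rint(fused), 0, 255).astype(np.uint8)

if __name__ == "__main__":
    # Synthetic stand-ins for one VIZ frame and one IR frame.
    viz = np.full((240, 320), 40, dtype=np.uint8)   # dark visible frame
    ir = np.full((240, 320), 200, dtype=np.uint8)   # bright infrared frame
    print(fuse_average(viz, ir)[0, 0])              # -> 120
```

The multiresolution methods studied in the paper (e.g. complex wavelet fusion) replace this per-pixel mean with a combination of transform coefficients across scales, but the averaging rule above is the baseline against which they are compared.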