Electronic data

  • 1570113613

    Rights statement: ©2015 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

    Accepted author manuscript, 1.13 MB, PDF document

    Available under license: CC BY: Creative Commons Attribution 4.0 International License

Data fusion for unsupervised video object detection, tracking and geo-positioning

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published
Publication date: 4/07/2015
Host publication: Information Fusion (Fusion), 2015 18th International Conference on
Publisher: IEEE
Pages: 142-149
Number of pages: 8
ISBN (Print): 9781479974047
Original language: English
Event: International Conference on Information Fusion 2015 - Washington DC, United States
Duration: 4/07/2015 - 8/07/2015

Conference

Conference: International Conference on Information Fusion 2015
Country/Territory: United States
City: Washington DC
Period: 4/07/15 - 8/07/15


Abstract

In this work we describe a system and propose a novel algorithm for moving object detection and tracking based on a video feed. Unlike many well-known algorithms, it performs detection in an unsupervised manner, using a velocity criterion to detect objects. The algorithm uses data from a single camera and Inertial Measurement Unit (IMU) sensors, fusing the video and sensory data captured from the UAV. It combines object detection and tracking, augmented by estimation of each object's geographical co-ordinates. The algorithm can be generalised to any particular video sensor and is not restricted to specific applications. For object tracking, a Bayesian filtering scheme combined with approximate inference is used. Object localisation in real-world co-ordinates is based on the tracking results and the IMU sensor measurements.
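The paper's tracker is a Bayesian filter with approximate inference; the exact scheme is not detailed in this abstract. As an illustrative stand-in only, the sketch below implements the exact linear-Gaussian special case of Bayesian filtering, a constant-velocity Kalman filter tracking one object's pixel position. All names (`make_cv_kalman`, `kalman_step`) and noise parameters are hypothetical, not from the paper.

```python
import numpy as np

def make_cv_kalman(dt=1.0, q=1e-2, r=1.0):
    """Constant-velocity Kalman filter matrices for a 2-D pixel track.
    State: [x, y, vx, vy]; measurement: [x, y]. (Illustrative parameters.)"""
    F = np.eye(4)
    F[0, 2] = F[1, 3] = dt                       # position += velocity * dt
    H = np.zeros((2, 4))
    H[0, 0] = H[1, 1] = 1.0                      # observe position only
    Q = q * np.eye(4)                            # process noise covariance
    R = r * np.eye(2)                            # measurement noise covariance
    return F, H, Q, R

def kalman_step(x, P, z, F, H, Q, R):
    """One Bayesian predict/update cycle (exact for the linear-Gaussian case)."""
    # Predict: propagate state and covariance through the motion model
    x = F @ x
    P = F @ P @ F.T + Q
    # Update: fuse the new pixel measurement z
    S = H @ P @ H.T + R                          # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)               # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

F, H, Q, R = make_cv_kalman()
x, P = np.zeros(4), np.eye(4)
# Simulated object moving one pixel per frame along the image x-axis
for t in range(1, 20):
    z = np.array([float(t), 0.0])
    x, P = kalman_step(x, P, z, F, H, Q, R)
print(np.round(x, 2))                            # state estimate [x, y, vx, vy]
```

In the full system, the filtered image-plane track would then be combined with the IMU pose to back-project each estimate into geographical co-ordinates; that fusion step depends on camera calibration details not given in this abstract.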