Rights statement: ©2015 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.
Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License
Final published version
Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review
TY - GEN
T1 - Edge flow
AU - Morris, Gruff
AU - Angelov, Plamen Parvanov
N1 - ©2015 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.
PY - 2015/10/9
Y1 - 2015/10/9
N2 - In this paper we introduce a new data-driven method for novelty detection and object definition in dynamic video streams that indiscriminately detects both static and moving objects in the scene. A sliding-window density estimation is introduced in order to reliably detect texture edges. A Sobel filtering process is used to extract the gradient of edges. Using this new approach, the detection of object textures can be done accurately and in real time. In this paper we demonstrate the capabilities of the algorithm on video scenarios, and show that object textures in the scene are reliably detected. We are able to show clearly the capability of the algorithm to be robust in occlusion scenarios, working in real time, and defining clear objects where other techniques attribute such small detections to noise.
AB - In this paper we introduce a new data-driven method for novelty detection and object definition in dynamic video streams that indiscriminately detects both static and moving objects in the scene. A sliding-window density estimation is introduced in order to reliably detect texture edges. A Sobel filtering process is used to extract the gradient of edges. Using this new approach, the detection of object textures can be done accurately and in real time. In this paper we demonstrate the capabilities of the algorithm on video scenarios, and show that object textures in the scene are reliably detected. We are able to show clearly the capability of the algorithm to be robust in occlusion scenarios, working in real time, and defining clear objects where other techniques attribute such small detections to noise.
KW - cybernetics
KW - image processing
U2 - 10.1109/SMC.2015.339
DO - 10.1109/SMC.2015.339
M3 - Conference contribution/Paper
SN - 9781479986972
SP - 1942
EP - 1948
BT - Proceedings of the 2015 IEEE International Conference on Systems, Man, and Cybernetics (SMC)
PB - IEEE
ER -
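The abstract mentions a Sobel filtering process used to extract the gradient of edges. A minimal sketch of that standard step is given below, using NumPy only; the function name and the plain 3x3 Sobel kernels are generic assumptions, not the paper's actual implementation (which pairs this step with a sliding-window density estimation not reproduced here).

```python
import numpy as np

def sobel_gradient(img):
    """Gradient magnitude of a 2-D grayscale image via 3x3 Sobel kernels."""
    kx = np.array([[-1, 0, 1],
                   [-2, 0, 2],
                   [-1, 0, 1]], dtype=float)  # horizontal gradient kernel
    ky = kx.T                                  # vertical gradient kernel
    padded = np.pad(img.astype(float), 1, mode="edge")
    gx = np.zeros(img.shape, dtype=float)
    gy = np.zeros(img.shape, dtype=float)
    # Correlate each kernel tap with the corresponding shifted window.
    for i in range(3):
        for j in range(3):
            patch = padded[i:i + img.shape[0], j:j + img.shape[1]]
            gx += kx[i, j] * patch
            gy += ky[i, j] * patch
    return np.hypot(gx, gy)  # per-pixel gradient magnitude

# Usage: a vertical step edge yields a strong response along the step.
img = np.zeros((5, 5))
img[:, 3:] = 1.0
grad = sobel_gradient(img)
```

On this synthetic step image, the magnitude peaks on the two columns straddling the edge and is zero in the flat regions, which is the edge response the abstract's pipeline would then feed into its density-based texture detection.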