
Electronic data

  • EdgeFlow_v4_LatesPreview

    Rights statement: ©2015 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

    4.58 MB, PDF document

    Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License

Links

Text available via DOI: https://doi.org/10.1109/SMC.2015.339


Edge flow

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published

Standard

Edge flow. / Morris, Gruff; Angelov, Plamen Parvanov.
Proceedings of the 2015 IEEE International Conference on Systems, Man, and Cybernetics (SMC). IEEE, 2015. p. 1942-1948.

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Harvard

Morris, G & Angelov, PP 2015, Edge flow. in Proceedings of the 2015 IEEE International Conference on Systems, Man, and Cybernetics (SMC). IEEE, pp. 1942-1948. https://doi.org/10.1109/SMC.2015.339

APA

Morris, G., & Angelov, P. P. (2015). Edge flow. In Proceedings of the 2015 IEEE International Conference on Systems, Man, and Cybernetics (SMC) (pp. 1942-1948). IEEE. https://doi.org/10.1109/SMC.2015.339

Vancouver

Morris G, Angelov PP. Edge flow. In: Proceedings of the 2015 IEEE International Conference on Systems, Man, and Cybernetics (SMC). IEEE; 2015. p. 1942-1948. doi: 10.1109/SMC.2015.339

Author

Morris, Gruff ; Angelov, Plamen Parvanov. / Edge flow. Proceedings of the 2015 IEEE International Conference on Systems, Man, and Cybernetics (SMC). IEEE, 2015. pp. 1942-1948

Bibtex

@inproceedings{30459fd61b1e4abb98f4b1528b7e2e79,
title = "Edge flow",
abstract = "In this paper we introduce a new data driven method to novelty detection and object definition in dynamic video streams that indiscriminately detects both static and moving objects in the scene. A sliding window density estimation is introduced in order to reliably detect texture edges. A Sobelfiltering process is used to extract gradient of edges. Using this new approach, the detection of object textures1 can be done accurately and in real-time. In this paper we demonstrate the capabilities of the algorithm on video scenarios, and show that object textures in the scene are reliably detected. We are able to show clearly the capability of the algorithm to be robust in occlusion scenarios; working in real-time, and defining clear objects where other techniques attribute such small detections tonoise.",
keywords = "cybernetics, image processing",
author = "Gruff Morris and Angelov, {Plamen Parvanov}",
note = "{\textcopyright}2015 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.",
year = "2015",
month = oct,
day = "9",
doi = "10.1109/SMC.2015.339",
language = "English",
isbn = "9781479986972",
pages = "1942--1948",
booktitle = "Proceedings of the 2015 IEEE International Conference on Systems, Man, and Cybernetics (SMC)",
publisher = "IEEE",

}

RIS

TY - GEN

T1 - Edge flow

AU - Morris, Gruff

AU - Angelov, Plamen Parvanov

N1 - ©2015 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

PY - 2015/10/9

Y1 - 2015/10/9

N2 - In this paper we introduce a new data-driven method for novelty detection and object definition in dynamic video streams that indiscriminately detects both static and moving objects in the scene. A sliding-window density estimation is introduced in order to reliably detect texture edges. A Sobel filtering process is used to extract the gradient of edges. Using this new approach, the detection of object textures can be done accurately and in real time. We demonstrate the capabilities of the algorithm on video scenarios and show that object textures in the scene are reliably detected. We clearly show that the algorithm is robust in occlusion scenarios, works in real time, and defines clear objects where other techniques attribute such small detections to noise.

AB - In this paper we introduce a new data-driven method for novelty detection and object definition in dynamic video streams that indiscriminately detects both static and moving objects in the scene. A sliding-window density estimation is introduced in order to reliably detect texture edges. A Sobel filtering process is used to extract the gradient of edges. Using this new approach, the detection of object textures can be done accurately and in real time. We demonstrate the capabilities of the algorithm on video scenarios and show that object textures in the scene are reliably detected. We clearly show that the algorithm is robust in occlusion scenarios, works in real time, and defines clear objects where other techniques attribute such small detections to noise.

KW - cybernetics

KW - image processing

U2 - 10.1109/SMC.2015.339

DO - 10.1109/SMC.2015.339

M3 - Conference contribution/Paper

SN - 9781479986972

SP - 1942

EP - 1948

BT - Proceedings of the 2015 IEEE International Conference on Systems, Man, and Cybernetics (SMC)

PB - IEEE

ER -
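
Illustrative sketch

The abstract describes two main ingredients: Sobel filtering to extract per-frame edge gradients, and a sliding-window density estimate used to flag texture edges in real time. The Python sketch below is only an illustration of that idea under assumptions not taken from the paper: it uses NumPy/SciPy, treats a per-pixel mean and standard deviation over the window as a crude stand-in for the density estimate, and the window size and threshold k are hypothetical parameters. It is not the authors' implementation.

# Minimal sketch: per-frame Sobel gradients plus a sliding-window,
# per-pixel statistic used as a stand-in for the paper's density estimate.
# Window size, threshold k, and the mean/std "density" proxy are
# illustrative assumptions, not the authors' method.
from collections import deque

import numpy as np
from scipy import ndimage


def sobel_magnitude(frame):
    # Gradient magnitude of a greyscale frame via Sobel filtering.
    gx = ndimage.sobel(frame.astype(float), axis=1)
    gy = ndimage.sobel(frame.astype(float), axis=0)
    return np.hypot(gx, gy)


def edge_novelty(frames, window_size=25, k=3.0):
    # Yields a boolean novelty mask per frame. A pixel is flagged when its
    # current gradient magnitude lies more than k standard deviations from
    # the sliding-window mean (hypothetical rule; the paper's recursive
    # density estimate is not reproduced here).
    window = deque(maxlen=window_size)
    for frame in frames:
        grad = sobel_magnitude(frame)
        if len(window) >= 2:
            stack = np.stack(window)
            mean = stack.mean(axis=0)
            std = stack.std(axis=0) + 1e-6  # avoid division by zero
            mask = np.abs(grad - mean) > k * std
        else:
            mask = np.zeros_like(grad, dtype=bool)
        window.append(grad)
        yield mask

Feeding the generator a sequence of greyscale frames (e.g. NumPy arrays read from a video) yields one novelty mask per frame; because only per-pixel running statistics over the window are kept, the cost per frame stays linear in the image size, which is consistent with the real-time claim in the abstract.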