
Electronic data

  • EagleSenseCHI2017

    Rights statement: ©The Authors. 2017. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in CHI '17 Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems http://dx.doi.org/10.1145/3025453.3025562

    Accepted author manuscript, 4.89 MB, PDF document

    Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License

Links

Text available via DOI:


EagleSense: tracking people and devices in interactive spaces using real-time top-view depth-sensing

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published

Standard

EagleSense: tracking people and devices in interactive spaces using real-time top-view depth-sensing. / Wu, Chi-Jui; Houben, Steven; Marquardt, Nicolai.
CHI '17 Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. New York: ACM, 2017. p. 3929-3942.


Harvard

Wu, C-J, Houben, S & Marquardt, N 2017, EagleSense: tracking people and devices in interactive spaces using real-time top-view depth-sensing. in CHI '17 Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. ACM, New York, pp. 3929-3942. https://doi.org/10.1145/3025453.3025562

APA

Wu, C-J., Houben, S., & Marquardt, N. (2017). EagleSense: tracking people and devices in interactive spaces using real-time top-view depth-sensing. In CHI '17 Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (pp. 3929-3942). ACM. https://doi.org/10.1145/3025453.3025562

Vancouver

Wu C-J, Houben S, Marquardt N. EagleSense: tracking people and devices in interactive spaces using real-time top-view depth-sensing. In CHI '17 Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. New York: ACM. 2017. p. 3929-3942. doi: 10.1145/3025453.3025562

Author

Wu, Chi-Jui ; Houben, Steven ; Marquardt, Nicolai. / EagleSense : tracking people and devices in interactive spaces using real-time top-view depth-sensing. CHI '17 Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. New York : ACM, 2017. pp. 3929-3942

Bibtex

@inproceedings{18b6b9f75d1645c0a1433a2ce7bfce0f,
title = "EagleSense: tracking people and devices in interactive spaces using real-time top-view depth-sensing",
abstract = "Real-time tracking of people's location, orientation and activities is increasingly important for designing novel ubiquitous computing applications. Top-view camera-based tracking avoids occlusion when tracking people while collaborating, but often requires complex tracking systems and advanced computer vision algorithms. To facilitate the prototyping of ubiquitous computing applications for interactive spaces, we developed EagleSense, a real-time human posture and activity recognition system with a single top-view depth sensing camera. We contribute our novel algorithm and processing pipeline, including details for calculating silhouette extremities features and applying gradient tree boosting classifiers for activity recognition optimised for top-view depth sensing. EagleSense provides easy access to the real-time tracking data and includes tools for facilitating the integration into custom applications. We report the results of a technical evaluation with 12 participants and demonstrate the capabilities of EagleSense with application case studies.",
author = "Chi-Jui Wu and Steven Houben and Nicolai Marquardt",
note = "{\textcopyright}The Authors. 2017. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in CHI '17 Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems http://dx.doi.org/10.1145/3025453.3025562",
year = "2017",
month = may,
day = "6",
doi = "10.1145/3025453.3025562",
language = "English",
isbn = "9781450346559",
pages = "3929--3942",
booktitle = "CHI '17 Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems",
publisher = "ACM",
}

RIS

TY - GEN

T1 - EagleSense

T2 - tracking people and devices in interactive spaces using real-time top-view depth-sensing

AU - Wu, Chi-Jui

AU - Houben, Steven

AU - Marquardt, Nicolai

N1 - ©The Authors. 2017. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in CHI '17 Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems http://dx.doi.org/10.1145/3025453.3025562

PY - 2017/5/6

Y1 - 2017/5/6

N2 - Real-time tracking of people's location, orientation and activities is increasingly important for designing novel ubiquitous computing applications. Top-view camera-based tracking avoids occlusion when tracking people while collaborating, but often requires complex tracking systems and advanced computer vision algorithms. To facilitate the prototyping of ubiquitous computing applications for interactive spaces, we developed EagleSense, a real-time human posture and activity recognition system with a single top-view depth sensing camera. We contribute our novel algorithm and processing pipeline, including details for calculating silhouette extremities features and applying gradient tree boosting classifiers for activity recognition optimised for top-view depth sensing. EagleSense provides easy access to the real-time tracking data and includes tools for facilitating the integration into custom applications. We report the results of a technical evaluation with 12 participants and demonstrate the capabilities of EagleSense with application case studies.

AB - Real-time tracking of people's location, orientation and activities is increasingly important for designing novel ubiquitous computing applications. Top-view camera-based tracking avoids occlusion when tracking people while collaborating, but often requires complex tracking systems and advanced computer vision algorithms. To facilitate the prototyping of ubiquitous computing applications for interactive spaces, we developed EagleSense, a real-time human posture and activity recognition system with a single top-view depth sensing camera. We contribute our novel algorithm and processing pipeline, including details for calculating silhouette extremities features and applying gradient tree boosting classifiers for activity recognition optimised for top-view depth sensing. EagleSense provides easy access to the real-time tracking data and includes tools for facilitating the integration into custom applications. We report the results of a technical evaluation with 12 participants and demonstrate the capabilities of EagleSense with application case studies.

U2 - 10.1145/3025453.3025562

DO - 10.1145/3025453.3025562

M3 - Conference contribution/Paper

SN - 9781450346559

SP - 3929

EP - 3942

BT - CHI '17 Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems

PB - ACM

CY - New York

ER -