Electronic data

  • EagleSenseCHI2017

    Rights statement: ©The Authors. 2017. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in CHI '17 Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems http://dx.doi.org/10.1145/3025453.3025562

    Accepted author manuscript, 4.89 MB, PDF document

    Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License

Links

Text available via DOI: http://dx.doi.org/10.1145/3025453.3025562

EagleSense: tracking people and devices in interactive spaces using real-time top-view depth-sensing

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published
Publication date: 6/05/2017
Host publication: CHI '17 Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems
Place of publication: New York
Publisher: ACM
Pages: 3929-3942
Number of pages: 14
ISBN (print): 9781450346559
Original language: English

Abstract

Real-time tracking of people's location, orientation and activities is increasingly important for designing novel ubiquitous computing applications. Top-view camera-based tracking avoids occlusion when tracking people while collaborating, but often requires complex tracking systems and advanced computer vision algorithms. To facilitate the prototyping of ubiquitous computing applications for interactive spaces, we developed EagleSense, a real-time human posture and activity recognition system with a single top-view depth sensing camera. We contribute our novel algorithm and processing pipeline, including details for calculating silhouette extremities features and applying gradient tree boosting classifiers for activity recognition optimised for top-view depth sensing. EagleSense provides easy access to the real-time tracking data and includes tools for facilitating the integration into custom applications. We report the results of a technical evaluation with 12 participants and demonstrate the capabilities of EagleSense with application case studies.
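
The pipeline outlined in the abstract pairs hand-crafted silhouette-extremity features with gradient tree boosting for activity classification. The Python sketch below illustrates that general approach on synthetic data only; the feature layout, the activity labels and the use of scikit-learn's GradientBoostingClassifier are assumptions made for illustration, not the authors' implementation.

    # Illustrative sketch: gradient tree boosting over hypothetical
    # silhouette-extremity features, in the spirit of the pipeline the
    # abstract describes. All data here is synthetic.
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Hypothetical per-frame feature vectors, e.g. positions and depths of
    # silhouette extremities (head, hands) relative to the body centroid.
    n_frames, n_features = 1000, 16
    X = rng.normal(size=(n_frames, n_features))

    # Hypothetical activity labels (e.g. standing, phone, tablet, laptop).
    y = rng.integers(0, 4, size=n_frames)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    clf = GradientBoostingClassifier(n_estimators=200, max_depth=3,
                                     learning_rate=0.1)
    clf.fit(X_train, y_train)
    print("held-out accuracy:", clf.score(X_test, y_test))

In a real deployment the feature vectors would be computed per depth frame from the tracked top-view silhouette rather than sampled at random, as the abstract outlines.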
