
Out of sight: a toolkit for tracking occluded human joint positions

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

Out of sight: a toolkit for tracking occluded human joint positions. / Wu, Chi-Jui; Quigley, Aaron; Harris-Birtill, David.
In: Personal and Ubiquitous Computing, Vol. 21, No. 1, 02.2017, p. 125-135.

Harvard

Wu, C-J, Quigley, A & Harris-Birtill, D 2017, 'Out of sight: a toolkit for tracking occluded human joint positions', Personal and Ubiquitous Computing, vol. 21, no. 1, pp. 125-135. https://doi.org/10.1007/s00779-016-0997-6

APA

Wu, C-J., Quigley, A., & Harris-Birtill, D. (2017). Out of sight: a toolkit for tracking occluded human joint positions. Personal and Ubiquitous Computing, 21(1), 125-135. https://doi.org/10.1007/s00779-016-0997-6

Vancouver

Wu C-J, Quigley A, Harris-Birtill D. Out of sight: a toolkit for tracking occluded human joint positions. Personal and Ubiquitous Computing. 2017 Feb;21(1):125-135. Epub 2016 Dec 2. doi: 10.1007/s00779-016-0997-6

Author

Wu, Chi-Jui; Quigley, Aaron; Harris-Birtill, David. / Out of sight: a toolkit for tracking occluded human joint positions. In: Personal and Ubiquitous Computing. 2017; Vol. 21, No. 1. pp. 125-135.

Bibtex

@article{745937e59f954d5f81a1eab1844b2683,
title = "Out of sight: a toolkit for tracking occluded human joint positions",
abstract = "Real-time identification and tracking of the joint positions of people can be achieved with off-the-shelf sensing technologies such as the Microsoft Kinect, or other camera-based systems with computer vision. However, tracking is constrained by the system{\textquoteright}s field of view of people. When a person is occluded from the camera view, their position can no longer be followed. Out of Sight addresses the occlusion problem in depth-sensing tracking systems. Our new tracking infrastructure provides human skeleton joint positions during occlusion, by combining the field of view of multiple Kinects using geometric calibration and affine transformation. We verified the technique{\textquoteright}s accuracy through a system evaluation consisting of 20 participants in stationary position and in motion, with two Kinects positioned parallel, 45{\textdegree}, and 90{\textdegree} apart. Results show that our skeleton matching is accurate to within 16.1 cm (s.d. = 5.8 cm), which is within a person{\textquoteright}s personal space. In a realistic scenario study, groups of two people quickly occlude each other, and occlusion is resolved for 85% of the participants. A RESTful API was developed to allow distributed access of occlusion-free skeleton joint positions. As a further contribution, we provide the system as open source.",
keywords = "Kinect, Occlusion, Toolkit",
author = "Chi-Jui Wu and Aaron Quigley and David Harris-Birtill",
year = "2017",
month = feb,
doi = "10.1007/s00779-016-0997-6",
language = "English",
volume = "21",
pages = "125--135",
journal = "Personal and Ubiquitous Computing",
issn = "1617-4909",
publisher = "Springer Verlag London Ltd",
number = "1",

}

RIS

TY - JOUR

T1 - Out of sight

T2 - a toolkit for tracking occluded human joint positions

AU - Wu, Chi-Jui

AU - Quigley, Aaron

AU - Harris-Birtill, David

PY - 2017/2

Y1 - 2017/2

N2 - Real-time identification and tracking of the joint positions of people can be achieved with off-the-shelf sensing technologies such as the Microsoft Kinect, or other camera-based systems with computer vision. However, tracking is constrained by the system’s field of view of people. When a person is occluded from the camera view, their position can no longer be followed. Out of Sight addresses the occlusion problem in depth-sensing tracking systems. Our new tracking infrastructure provides human skeleton joint positions during occlusion, by combining the field of view of multiple Kinects using geometric calibration and affine transformation. We verified the technique’s accuracy through a system evaluation consisting of 20 participants in stationary position and in motion, with two Kinects positioned parallel, 45°, and 90° apart. Results show that our skeleton matching is accurate to within 16.1 cm (s.d. = 5.8 cm), which is within a person’s personal space. In a realistic scenario study, groups of two people quickly occlude each other, and occlusion is resolved for 85% of the participants. A RESTful API was developed to allow distributed access of occlusion-free skeleton joint positions. As a further contribution, we provide the system as open source.

AB - Real-time identification and tracking of the joint positions of people can be achieved with off-the-shelf sensing technologies such as the Microsoft Kinect, or other camera-based systems with computer vision. However, tracking is constrained by the system’s field of view of people. When a person is occluded from the camera view, their position can no longer be followed. Out of Sight addresses the occlusion problem in depth-sensing tracking systems. Our new tracking infrastructure provides human skeleton joint positions during occlusion, by combining the field of view of multiple Kinects using geometric calibration and affine transformation. We verified the technique’s accuracy through a system evaluation consisting of 20 participants in stationary position and in motion, with two Kinects positioned parallel, 45°, and 90° apart. Results show that our skeleton matching is accurate to within 16.1 cm (s.d. = 5.8 cm), which is within a person’s personal space. In a realistic scenario study, groups of two people quickly occlude each other, and occlusion is resolved for 85% of the participants. A RESTful API was developed to allow distributed access of occlusion-free skeleton joint positions. As a further contribution, we provide the system as open source.

KW - Kinect

KW - Occlusion

KW - Toolkit

U2 - 10.1007/s00779-016-0997-6

DO - 10.1007/s00779-016-0997-6

M3 - Journal article

VL - 21

SP - 125

EP - 135

JO - Personal and Ubiquitous Computing

JF - Personal and Ubiquitous Computing

SN - 1617-4909

IS - 1

ER -
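
Notes

The abstract describes two reproducible components: combining the fields of view of multiple Kinects through geometric calibration and an affine transformation between sensor coordinate frames, and a RESTful API that serves the merged, occlusion-free joint positions. The sketch below is a minimal illustration of the calibration idea only, not the authors' released toolkit; the function names and the least-squares fitting approach are assumptions. It estimates a 3D affine transform from joint positions observed simultaneously by two sensors, then maps points from one sensor's frame into the other's.

import numpy as np

def fit_affine_3d(src, dst):
    # Least-squares affine transform mapping src points onto dst points.
    # src, dst: (N, 3) arrays of corresponding 3D joint positions seen by
    # two depth sensors at the same instants; N >= 4 non-coplanar points
    # are needed for a unique solution.
    n = src.shape[0]
    src_h = np.hstack([src, np.ones((n, 1))])  # homogeneous [x y z 1]
    # Solve src_h @ A = dst in the least-squares sense; A has shape (4, 3).
    A, *_ = np.linalg.lstsq(src_h, dst, rcond=None)
    T = np.eye(4)
    T[:3, :] = A.T  # pack into a standard 4x4 transform matrix
    return T

def apply_affine(T, points):
    # Map (N, 3) points through the 4x4 affine transform T.
    n = points.shape[0]
    ph = np.hstack([points, np.ones((n, 1))])
    return (ph @ T.T)[:, :3]

Fitted once during a calibration phase while a person is visible to both sensors, such a transform lets a tracking system substitute one sensor's joint estimates, mapped into the other's coordinate frame, whenever the primary sensor loses the person behind an occluder; the paper's RESTful API then exposes the merged joint positions to distributed clients.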