
Discrimination of gaze directions using low-level eye image features

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published

Standard

Discrimination of gaze directions using low-level eye image features. / Zhang, Yanxia; Bulling, Andreas; Gellersen, Hans.
Proceedings of the 1st international workshop on pervasive eye tracking & mobile eye-based interaction. New York: ACM, 2011. p. 9-14 (PETMEI '11).


Harvard

Zhang, Y, Bulling, A & Gellersen, H 2011, Discrimination of gaze directions using low-level eye image features. in Proceedings of the 1st international workshop on pervasive eye tracking & mobile eye-based interaction. PETMEI '11, ACM, New York, pp. 9-14. https://doi.org/10.1145/2029956.2029961

APA

Zhang, Y., Bulling, A., & Gellersen, H. (2011). Discrimination of gaze directions using low-level eye image features. In Proceedings of the 1st international workshop on pervasive eye tracking & mobile eye-based interaction (pp. 9-14). (PETMEI '11). ACM. https://doi.org/10.1145/2029956.2029961

Vancouver

Zhang Y, Bulling A, Gellersen H. Discrimination of gaze directions using low-level eye image features. In Proceedings of the 1st international workshop on pervasive eye tracking & mobile eye-based interaction. New York: ACM. 2011. p. 9-14. (PETMEI '11). doi: 10.1145/2029956.2029961

Author

Zhang, Yanxia ; Bulling, Andreas ; Gellersen, Hans. / Discrimination of gaze directions using low-level eye image features. Proceedings of the 1st international workshop on pervasive eye tracking & mobile eye-based interaction. New York : ACM, 2011. pp. 9-14 (PETMEI '11).

Bibtex

@inproceedings{59b2f26904ce42cf8c6a5f4ac8e58284,
title = "Discrimination of gaze directions using low-level eye image features",
abstract = "In mobile daily life settings, video-based gaze tracking faces challenges associated with changes in lighting conditions and artefacts in the video images caused by head and body movements. These challenges call for the development of new methods that are robust to such influences. In this paper we investigate the problem of gaze estimation, more specifically how to discriminate different gaze directions from eye images. In a 17 participant user study we record eye images for 13 different gaze directions from a standard webcam. We extract a total of 50 features from these images that encode information on color, intensity and orientations. Using mRMR feature selection and a k-nearest neighbor (kNN) classifier we show that we can estimate these gaze directions with a mean recognition performance of 86%.",
author = "Yanxia Zhang and Andreas Bulling and Hans Gellersen",
year = "2011",
doi = "10.1145/2029956.2029961",
language = "English",
isbn = "978-1-4503-0930-1",
series = "PETMEI '11",
publisher = "ACM",
pages = "9--14",
booktitle = "Proceedings of the 1st international workshop on pervasive eye tracking \& mobile eye-based interaction",

}

RIS

TY - GEN

T1 - Discrimination of gaze directions using low-level eye image features

AU - Zhang, Yanxia

AU - Bulling, Andreas

AU - Gellersen, Hans

PY - 2011

Y1 - 2011

N2 - In mobile daily life settings, video-based gaze tracking faces challenges associated with changes in lighting conditions and artefacts in the video images caused by head and body movements. These challenges call for the development of new methods that are robust to such influences. In this paper we investigate the problem of gaze estimation, more specifically how to discriminate different gaze directions from eye images. In a 17 participant user study we record eye images for 13 different gaze directions from a standard webcam. We extract a total of 50 features from these images that encode information on color, intensity and orientations. Using mRMR feature selection and a k-nearest neighbor (kNN) classifier we show that we can estimate these gaze directions with a mean recognition performance of 86%.

AB - In mobile daily life settings, video-based gaze tracking faces challenges associated with changes in lighting conditions and artefacts in the video images caused by head and body movements. These challenges call for the development of new methods that are robust to such influences. In this paper we investigate the problem of gaze estimation, more specifically how to discriminate different gaze directions from eye images. In a 17 participant user study we record eye images for 13 different gaze directions from a standard webcam. We extract a total of 50 features from these images that encode information on color, intensity and orientations. Using mRMR feature selection and a k-nearest neighbor (kNN) classifier we show that we can estimate these gaze directions with a mean recognition performance of 86%.

U2 - 10.1145/2029956.2029961

DO - 10.1145/2029956.2029961

M3 - Conference contribution/Paper

SN - 978-1-4503-0930-1

T3 - PETMEI '11

SP - 9

EP - 14

BT - Proceedings of the 1st international workshop on pervasive eye tracking & mobile eye-based interaction

PB - ACM

CY - New York

ER -
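
The abstract describes a pipeline of low-level feature extraction, mRMR feature selection, and a k-nearest-neighbor classifier over 13 gaze directions. A minimal sketch of that pipeline shape is below; it is not the authors' implementation. scikit-learn has no mRMR, so mutual-information ranking via `SelectKBest` stands in for it, and the 390 samples, feature values, and class structure are entirely synthetic placeholders matching only the paper's dimensions (50 features, 13 directions).

```python
# Hedged sketch of a feature-selection + kNN gaze-direction classifier.
# mRMR is approximated by mutual-information ranking; all data is synthetic.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_samples, n_features, n_directions = 390, 50, 13

# Synthetic labels (gaze directions) and features: a few class-dependent
# dimensions buried in noise, mimicking "informative vs. redundant" features.
y = rng.integers(0, n_directions, size=n_samples)
X = rng.normal(size=(n_samples, n_features))
X[:, :5] += y[:, None] * 0.5  # make the first 5 features informative

clf = make_pipeline(
    SelectKBest(mutual_info_classif, k=10),  # stand-in for mRMR selection
    KNeighborsClassifier(n_neighbors=5),     # kNN as in the paper
)
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```

With real eye-image features the selection step matters more: mRMR additionally penalizes redundancy between selected features, which plain mutual-information ranking does not.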