
Pupil-canthi-ratio: a calibration-free method for tracking horizontal gaze direction

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published

Standard

Pupil-canthi-ratio: a calibration-free method for tracking horizontal gaze direction. / Zhang, Yanxia; Bulling, Andreas; Gellersen, Hans.
AVI '14 Proceedings of the 2014 International Working Conference on Advanced Visual Interfaces. New York: ACM, 2014. p. 129-132.

Harvard

Zhang, Y, Bulling, A & Gellersen, H 2014, Pupil-canthi-ratio: a calibration-free method for tracking horizontal gaze direction. in AVI '14 Proceedings of the 2014 International Working Conference on Advanced Visual Interfaces. ACM, New York, pp. 129-132. https://doi.org/10.1145/2598153.2598186

APA

Zhang, Y., Bulling, A., & Gellersen, H. (2014). Pupil-canthi-ratio: a calibration-free method for tracking horizontal gaze direction. In AVI '14 Proceedings of the 2014 International Working Conference on Advanced Visual Interfaces (pp. 129-132). ACM. https://doi.org/10.1145/2598153.2598186

Vancouver

Zhang Y, Bulling A, Gellersen H. Pupil-canthi-ratio: a calibration-free method for tracking horizontal gaze direction. In AVI '14 Proceedings of the 2014 International Working Conference on Advanced Visual Interfaces. New York: ACM. 2014. p. 129-132. doi: 10.1145/2598153.2598186

Author

Zhang, Yanxia ; Bulling, Andreas ; Gellersen, Hans. / Pupil-canthi-ratio : a calibration-free method for tracking horizontal gaze direction. AVI '14 Proceedings of the 2014 International Working Conference on Advanced Visual Interfaces. New York : ACM, 2014. pp. 129-132

Bibtex

@inproceedings{2c473b1d7f074de7a09ed0ee89d3dcf5,
title = "Pupil-canthi-ratio: a calibration-free method for tracking horizontal gaze direction",
abstract = "Eye tracking is compelling for hands-free interaction with pervasive displays. However, most existing eye tracking systems require specialised hardware and explicit calibrations of equipment and individual users, which inhibit their widespread adoption. In this work, we present a light-weight and calibration-free gaze estimation method that leverages only an off-the-shelf camera to track users' gaze horizontally. We introduce pupil-canthi-ratio (PCR), a novel measure for estimating gaze directions. By using the displacement vector between the inner eye corner and the pupil centre of an eye, PCR is calculated as the ratio of the displacement vectors from both eyes. We establish a mapping between PCR to gaze direction by Gaussian process regression, which inherently infers averted horizontal gaze directions of users. We present a study to identify the characteristics of PCR. The results show that PCR achieved an average accuracy of 3.9 degrees across different people. Finally, we show examples of real-time applications of PCR that allow users to interact with a display by moving only their eyes.",
keywords = "Gaussian regression, calibration-free, eye tracking , gaze-based interaction, pervasive displays, vision-based",
author = "Yanxia Zhang and Andreas Bulling and Hans Gellersen",
year = "2014",
doi = "10.1145/2598153.2598186",
language = "English",
isbn = "9781450327756",
pages = "129--132",
booktitle = "AVI '14 Proceedings of the 2014 International Working Conference on Advanced Visual Interfaces",
publisher = "ACM",

}

RIS

TY - GEN

T1 - Pupil-canthi-ratio

T2 - a calibration-free method for tracking horizontal gaze direction

AU - Zhang, Yanxia

AU - Bulling, Andreas

AU - Gellersen, Hans

PY - 2014

Y1 - 2014

N2 - Eye tracking is compelling for hands-free interaction with pervasive displays. However, most existing eye tracking systems require specialised hardware and explicit calibrations of equipment and individual users, which inhibit their widespread adoption. In this work, we present a light-weight and calibration-free gaze estimation method that leverages only an off-the-shelf camera to track users' gaze horizontally. We introduce pupil-canthi-ratio (PCR), a novel measure for estimating gaze directions. By using the displacement vector between the inner eye corner and the pupil centre of an eye, PCR is calculated as the ratio of the displacement vectors from both eyes. We establish a mapping from PCR to gaze direction by Gaussian process regression, which inherently infers averted horizontal gaze directions of users. We present a study to identify the characteristics of PCR. The results show that PCR achieved an average accuracy of 3.9 degrees across different people. Finally, we show examples of real-time applications of PCR that allow users to interact with a display by moving only their eyes.

AB - Eye tracking is compelling for hands-free interaction with pervasive displays. However, most existing eye tracking systems require specialised hardware and explicit calibrations of equipment and individual users, which inhibit their widespread adoption. In this work, we present a light-weight and calibration-free gaze estimation method that leverages only an off-the-shelf camera to track users' gaze horizontally. We introduce pupil-canthi-ratio (PCR), a novel measure for estimating gaze directions. By using the displacement vector between the inner eye corner and the pupil centre of an eye, PCR is calculated as the ratio of the displacement vectors from both eyes. We establish a mapping from PCR to gaze direction by Gaussian process regression, which inherently infers averted horizontal gaze directions of users. We present a study to identify the characteristics of PCR. The results show that PCR achieved an average accuracy of 3.9 degrees across different people. Finally, we show examples of real-time applications of PCR that allow users to interact with a display by moving only their eyes.

KW - Gaussian regression

KW - calibration-free

KW - eye tracking

KW - gaze-based interaction

KW - pervasive displays

KW - vision-based

U2 - 10.1145/2598153.2598186

DO - 10.1145/2598153.2598186

M3 - Conference contribution/Paper

SN - 9781450327756

SP - 129

EP - 132

BT - AVI '14 Proceedings of the 2014 International Working Conference on Advanced Visual Interfaces

PB - ACM

CY - New York

ER -
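
The abstract above describes computing PCR from the displacement between each eye's inner corner and pupil centre, then mapping PCR to horizontal gaze direction with Gaussian process regression. The sketch below is only an illustrative reading of that description, not the authors' implementation: the landmark coordinates, the use of the horizontal displacement components for the ratio, the kernel choice, and the scikit-learn API are all assumptions made for demonstration.

```python
# Illustrative sketch only: the paper does not ship reference code, so the
# ratio definition (horizontal components), sample data, and GP kernel here
# are assumptions, not the published method's exact details.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel


def pupil_canthi_ratio(pupil_left, canthus_left, pupil_right, canthus_right):
    """Compute a PCR-like value from 2D landmark positions (pixels).

    Each displacement vector runs from the inner eye corner (canthus) to the
    pupil centre; the ratio is taken over the horizontal components here,
    which is one plausible reading of the abstract.
    """
    d_left = np.asarray(pupil_left, float) - np.asarray(canthus_left, float)
    d_right = np.asarray(pupil_right, float) - np.asarray(canthus_right, float)
    return d_left[0] / d_right[0]  # ratio of horizontal displacements


# Hypothetical training data: PCR values paired with known horizontal gaze
# angles (degrees) collected while looking at on-screen targets.
pcr_samples = np.array([[0.45], [0.70], [1.00], [1.40], [2.10]])
gaze_angles = np.array([-20.0, -10.0, 0.0, 10.0, 20.0])

# Gaussian process regression maps PCR to gaze direction; RBF + white noise
# is a common default kernel, not necessarily the paper's choice.
gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gpr.fit(pcr_samples, gaze_angles)

# Estimate the horizontal gaze angle for a newly observed PCR value.
new_pcr = pupil_canthi_ratio((310, 205), (290, 210), (370, 205), (345, 210))
angle, std = gpr.predict([[new_pcr]], return_std=True)
print(f"estimated gaze: {angle[0]:.1f} deg (+/- {std[0]:.1f})")
```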