
Electronic data

  • pn0505-vellosoA

    Final published version, 1.12 MB, PDF document

    Available under license: CC BY: Creative Commons Attribution 4.0 International License


AmbiGaze: direct control of ambient devices by gaze

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published

Standard

AmbiGaze: direct control of ambient devices by gaze. / Velloso, Eduardo; Wirth, Markus; Weichel, Christian et al.
DIS '16: Proceedings of the 2016 ACM Conference on Designing Interactive Systems. New York: ACM, 2016. p. 812-817.


Harvard

Velloso, E, Wirth, M, Weichel, C, Abreu Esteves, AE & Gellersen, H-WG 2016, AmbiGaze: direct control of ambient devices by gaze. in DIS '16: Proceedings of the 2016 ACM Conference on Designing Interactive Systems. ACM, New York, pp. 812-817. https://doi.org/10.1145/2901790.2901867

APA

Velloso, E., Wirth, M., Weichel, C., Abreu Esteves, A. E., & Gellersen, H-W. G. (2016). AmbiGaze: direct control of ambient devices by gaze. In DIS '16: Proceedings of the 2016 ACM Conference on Designing Interactive Systems (pp. 812-817). ACM. https://doi.org/10.1145/2901790.2901867

Vancouver

Velloso E, Wirth M, Weichel C, Abreu Esteves AE, Gellersen H-WG. AmbiGaze: direct control of ambient devices by gaze. In DIS '16: Proceedings of the 2016 ACM Conference on Designing Interactive Systems. New York: ACM. 2016. p. 812-817. doi: 10.1145/2901790.2901867

Author

Velloso, Eduardo; Wirth, Markus; Weichel, Christian et al. / AmbiGaze: direct control of ambient devices by gaze. DIS '16: Proceedings of the 2016 ACM Conference on Designing Interactive Systems. New York: ACM, 2016. pp. 812-817

Bibtex

@inproceedings{d9f39d1f36c64db9b8dc12020235b158,
title = "AmbiGaze: direct control of ambient devices by gaze",
abstract = "Eye tracking offers many opportunities for direct device control in smart environments, but issues such as the need for calibration and the Midas touch problem make it impractical. In this paper, we propose AmbiGaze, a smart environment that employs the animation of targets to provide users with direct control of devices by gaze only through smooth pursuit tracking. We propose a design space of means of exposing functionality through movement and illustrate the concept through four prototypes. We evaluated the system in a user study and found that AmbiGaze enables robust gaze-only interaction with many devices, from multiple positions in the environment, in a spontaneous and comfortable manner.",
author = "Velloso, Eduardo and Wirth, Markus and Weichel, Christian and {Abreu Esteves}, Augusto Emanuel and Gellersen, Hans-Werner Georg",
year = "2016",
month = jun,
day = "4",
doi = "10.1145/2901790.2901867",
language = "English",
isbn = "9781450340311",
pages = "812--817",
booktitle = "DIS '16: Proceedings of the 2016 ACM Conference on Designing Interactive Systems",
publisher = "ACM",
address = "New York",
}

RIS

TY - GEN

T1 - AmbiGaze

T2 - direct control of ambient devices by gaze

AU - Velloso, Eduardo

AU - Wirth, Markus

AU - Weichel, Christian

AU - Abreu Esteves, Augusto Emanuel

AU - Gellersen, Hans-Werner Georg

PY - 2016/6/4

Y1 - 2016/6/4

N2 - Eye tracking offers many opportunities for direct device control in smart environments, but issues such as the need for calibration and the Midas touch problem make it impractical. In this paper, we propose AmbiGaze, a smart environment that employs the animation of targets to provide users with direct control of devices by gaze only through smooth pursuit tracking. We propose a design space of means of exposing functionality through movement and illustrate the concept through four prototypes. We evaluated the system in a user study and found that AmbiGaze enables robust gaze-only interaction with many devices, from multiple positions in the environment, in a spontaneous and comfortable manner.

AB - Eye tracking offers many opportunities for direct device control in smart environments, but issues such as the need for calibration and the Midas touch problem make it impractical. In this paper, we propose AmbiGaze, a smart environment that employs the animation of targets to provide users with direct control of devices by gaze only through smooth pursuit tracking. We propose a design space of means of exposing functionality through movement and illustrate the concept through four prototypes. We evaluated the system in a user study and found that AmbiGaze enables robust gaze-only interaction with many devices, from multiple positions in the environment, in a spontaneous and comfortable manner.

U2 - 10.1145/2901790.2901867

DO - 10.1145/2901790.2901867

M3 - Conference contribution/Paper

SN - 9781450340311

SP - 812

EP - 817

BT - DIS '16

PB - ACM

CY - New York

ER -
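The abstract describes selecting devices by smooth-pursuit tracking: each device exposes an animated target, and the system matches the user's gaze trajectory against each target's trajectory, so no per-user calibration is needed. A common way to implement pursuit matching (used by the Pursuits line of work this builds on, though not necessarily the paper's exact method) is Pearson correlation between gaze and target positions over a sliding window. A minimal sketch, with hypothetical function names and a made-up threshold:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

def select_target(gaze, targets, threshold=0.8):
    """Return the index of the moving target whose trajectory best matches
    the gaze trajectory (correlated on x and y independently), or None if
    no target exceeds the threshold."""
    best, best_score = None, threshold
    for i, traj in enumerate(targets):
        cx = pearson([p[0] for p in gaze], [p[0] for p in traj])
        cy = pearson([p[1] for p in gaze], [p[1] for p in traj])
        score = min(cx, cy)  # both axes must correlate
        if score > best_score:
            best, best_score = i, score
    return best

# Two targets moving on circles with different speed and phase;
# the simulated gaze follows target 0 with a constant offset (uncalibrated).
ts = [t / 30 for t in range(30)]
target0 = [(math.cos(2 * math.pi * t), math.sin(2 * math.pi * t)) for t in ts]
target1 = [(math.cos(4 * math.pi * t + 1), math.sin(4 * math.pi * t + 1)) for t in ts]
gaze = [(x + 0.05, y - 0.03) for x, y in target0]
print(select_target(gaze, [target0, target1]))  # 0
```

Because Pearson correlation is invariant to a constant offset, the offset gaze still matches target 0 perfectly, which is why pursuit selection works without calibrating the tracker to the user.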