Faces of Focus: A Study on the Facial Cues of Attentional States

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › Peer-reviewed

Published

Standard

Faces of Focus: A Study on the Facial Cues of Attentional States. / Babaei, Ebrahim; Srivastava, Namrata; Newn, Joshua et al.
CHI '20: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. New York: ACM, 2020. p. 1-13, Article 439.


Harvard

Babaei, E, Srivastava, N, Newn, J, Zhou, Q, Dingler, T & Velloso, E 2020, Faces of Focus: A Study on the Facial Cues of Attentional States. in CHI '20: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems., 439, ACM, New York, pp. 1-13. https://doi.org/10.1145/3313831.3376566

APA

Babaei, E., Srivastava, N., Newn, J., Zhou, Q., Dingler, T., & Velloso, E. (2020). Faces of Focus: A Study on the Facial Cues of Attentional States. In CHI '20: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1-13). Article 439. ACM. https://doi.org/10.1145/3313831.3376566

Vancouver

Babaei E, Srivastava N, Newn J, Zhou Q, Dingler T, Velloso E. Faces of Focus: A Study on the Facial Cues of Attentional States. In CHI '20: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. New York: ACM. 2020. p. 1-13. Article 439. doi: 10.1145/3313831.3376566

Author

Babaei, Ebrahim ; Srivastava, Namrata ; Newn, Joshua et al. / Faces of Focus : A Study on the Facial Cues of Attentional States. CHI '20: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. New York : ACM, 2020. pp. 1-13

Bibtex

@inproceedings{dcb9a7b69646461dae3174c9fffd4059,
title = "Faces of Focus: A Study on the Facial Cues of Attentional States",
abstract = "Automatically detecting attentional states is a prerequisite for designing interventions to manage attention - knowledge workers' most critical resource. As a first step towards this goal, it is necessary to understand how different attentional states are made discernible through visible cues in knowledge workers. In this paper, we demonstrate the important facial cues to detect attentional states by evaluating a data set of 15 participants that we tracked over a whole workday, which included their challenge and engagement levels. Our evaluation shows that gaze, pitch, and lips part action units are indicators of engaged work; while pitch, gaze movements, gaze angle, and upper-lid raiser action units are indicators of challenging work. These findings reveal a significant relationship between facial cues and both engagement and challenge levels experienced by our tracked participants. Our work contributes to the design of future studies to detect attentional states based on facial cues.",
author = "Ebrahim Babaei and Namrata Srivastava and Joshua Newn and Qiushi Zhou and Tilman Dingler and Eduardo Velloso",
year = "2020",
month = apr,
day = "21",
doi = "10.1145/3313831.3376566",
language = "English",
isbn = "9781450367080",
pages = "1--13",
booktitle = "CHI '20: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems",
publisher = "ACM",
address = "New York",
}
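The BibTeX entry above stores its metadata as quoted `field = "value"` pairs. As a rough illustration, a few lines of Python with the standard-library `re` module can pull those quoted fields out of a single entry. This is a sketch, not a full BibTeX parser: it ignores unquoted values such as `month = apr` and brace-delimited values, and the sample entry below is abbreviated from the record above.

```python
import re

def parse_bibtex_fields(entry):
    """Roughly extract quoted field values from a single BibTeX entry.

    Matches field = "value" pairs; unquoted values (e.g. month = apr)
    and brace-delimited values are intentionally not handled.
    """
    return dict(re.findall(r'(\w+)\s*=\s*"([^"]*)"', entry))

entry = '''@inproceedings{dcb9a7b69646461dae3174c9fffd4059,
title = "Faces of Focus: A Study on the Facial Cues of Attentional States",
year = "2020",
doi = "10.1145/3313831.3376566",
pages = "1--13",
}'''

fields = parse_bibtex_fields(entry)
print(fields["year"])   # 2020
print(fields["pages"])  # 1--13
```

For anything beyond a quick lookup, a dedicated BibTeX library is the safer choice, since the format allows nested braces, string macros, and concatenation that a single regex cannot capture.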

RIS

TY - GEN

T1 - Faces of Focus

T2 - A Study on the Facial Cues of Attentional States

AU - Babaei, Ebrahim

AU - Srivastava, Namrata

AU - Newn, Joshua

AU - Zhou, Qiushi

AU - Dingler, Tilman

AU - Velloso, Eduardo

PY - 2020/4/21

Y1 - 2020/4/21

N2 - Automatically detecting attentional states is a prerequisite for designing interventions to manage attention - knowledge workers' most critical resource. As a first step towards this goal, it is necessary to understand how different attentional states are made discernible through visible cues in knowledge workers. In this paper, we demonstrate the important facial cues to detect attentional states by evaluating a data set of 15 participants that we tracked over a whole workday, which included their challenge and engagement levels. Our evaluation shows that gaze, pitch, and lips part action units are indicators of engaged work; while pitch, gaze movements, gaze angle, and upper-lid raiser action units are indicators of challenging work. These findings reveal a significant relationship between facial cues and both engagement and challenge levels experienced by our tracked participants. Our work contributes to the design of future studies to detect attentional states based on facial cues.

AB - Automatically detecting attentional states is a prerequisite for designing interventions to manage attention - knowledge workers' most critical resource. As a first step towards this goal, it is necessary to understand how different attentional states are made discernible through visible cues in knowledge workers. In this paper, we demonstrate the important facial cues to detect attentional states by evaluating a data set of 15 participants that we tracked over a whole workday, which included their challenge and engagement levels. Our evaluation shows that gaze, pitch, and lips part action units are indicators of engaged work; while pitch, gaze movements, gaze angle, and upper-lid raiser action units are indicators of challenging work. These findings reveal a significant relationship between facial cues and both engagement and challenge levels experienced by our tracked participants. Our work contributes to the design of future studies to detect attentional states based on facial cues.

U2 - 10.1145/3313831.3376566

DO - 10.1145/3313831.3376566

M3 - Conference contribution/Paper

SN - 9781450367080

SP - 1

EP - 13

BT - CHI '20: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems

PB - ACM

CY - New York

ER -
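The RIS record above is a simple line-oriented tag-value format: each line carries a two-character tag, a separator, and a value, with `ER` closing the record and repeatable tags such as `AU` listing one author per line. As a minimal sketch, this can be folded into a dictionary with standard-library Python only. Note one assumption: the separator here is written as a single-spaced `" - "`, whereas the RIS specification uses two spaces before the hyphen, so a production parser should accept both.

```python
def parse_ris(text):
    """Parse one RIS record (as rendered above) into a dict.

    Repeatable tags (here just AU) are collected into a list;
    all other tags keep their last value.
    """
    record = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("ER"):  # end-of-record marker
            continue
        tag, _, value = line.partition(" - ")
        tag, value = tag.strip(), value.strip()
        if tag == "AU":  # authors repeat; accumulate them
            record.setdefault("AU", []).append(value)
        else:
            record[tag] = value
    return record

record = parse_ris("""TY - GEN
T1 - Faces of Focus
AU - Babaei, Ebrahim
AU - Srivastava, Namrata
SP - 1
EP - 13
ER -""")
print(record["T1"])       # Faces of Focus
print(len(record["AU"]))  # 2
```

Reference managers export records exactly like this one, so the same loop extends naturally to multi-record files by splitting on the `ER` marker instead of skipping it.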