
Electronic data

  • INTERACT23-EOG-EyeExpressions

    Accepted author manuscript, 1.26 MB, PDF document

    Embargo ends: 26/08/25

    Available under license: CC BY-NC: Creative Commons Attribution-NonCommercial 4.0 International License

Links

Text available via DOI:


Exploring Eye Expressions for Enhancing EOG-Based Interaction

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published
Publication date: 26/08/2023
Host publication: Human-Computer Interaction – INTERACT 2023: 19th IFIP TC13 International Conference, York, UK, August 28 – September 1, 2023, Proceedings, Part IV
Editors: José Abdelnour Nocera, Marta Kristín Lárusdóttir, Helen Petrie, Antonio Piccinno, Marco Winckler
Place of publication: Cham
Publisher: Springer
Pages: 68-79
Number of pages: 12
ISBN (electronic): 9783031422935
ISBN (print): 9783031422928
Original language: English
Event: Interact 2023: Design for Equality and Justice - University of York, York, United Kingdom
Duration: 28/08/2023 – 1/09/2023
https://interact2023.org/

Conference

Conference: Interact 2023
Country/Territory: United Kingdom
City: York
Period: 28/08/23 – 1/09/23
Internet address: https://interact2023.org/

Abstract

This paper explores the classification of eye expressions for EOG-based interaction using JINS MEME, an off-the-shelf eye-tracking device. Previous studies have demonstrated the potential of electrooculography (EOG) for hands-free human-computer interaction using eye movements (directional, smooth pursuit) and eye expressions (blinking, winking). We collected a comprehensive set of 14 eye gestures to explore how well both types of eye gestures can be classified together in a machine learning model. Using a Random Forest classifier trained on 15 engineered features computed from our collected data, we obtained an overall classification performance of 0.77 (AUC). Our results show that we can reliably classify eye expressions, enhancing the range of available eye gestures for hands-free interaction. With continued development and refinement of EOG-based technology, our findings have long-term implications for improving its usability, particularly for individuals who require a richer vocabulary of eye gestures to interact hands-free.
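
For context, the pipeline the abstract describes (engineered features → Random Forest → AUC) can be sketched as below. This is a minimal illustration on synthetic placeholder data, not the authors' code: the counts of 15 features and 14 gesture classes come from the abstract, while the data shapes, hyperparameters, and the one-vs-rest macro AUC averaging are assumptions made for the sketch.

```python
# Minimal sketch of the setup described in the abstract: a Random Forest
# trained on engineered features from EOG gesture recordings, scored with
# a multi-class AUC. All data here are random placeholders; only the
# feature and class counts are taken from the abstract.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

n_samples, n_features, n_gestures = 1400, 15, 14   # 15 features, 14 gestures
X = rng.normal(size=(n_samples, n_features))       # placeholder for engineered EOG features
y = rng.integers(0, n_gestures, size=n_samples)    # placeholder gesture labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# One-vs-rest AUC averaged over the gesture classes; the paper reports an
# overall figure of 0.77 on its real data (this toy set will score ~0.5).
proba = clf.predict_proba(X_test)
auc = roc_auc_score(y_test, proba, multi_class="ovr", average="macro")
print(f"macro one-vs-rest AUC: {auc:.2f}")
```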