
Multimodal recognition of reading activity in transit using body-worn sensors

Research output: Contribution to Journal/Magazine › Journal article › peer-review

Published

Standard

Multimodal recognition of reading activity in transit using body-worn sensors. / Bulling, Andreas; Ward, Jamie A.; Gellersen, Hans.
In: ACM Transactions on Applied Perception, Vol. 9, No. 1, Article 2, 01.03.2012.



Vancouver

Bulling A, Ward JA, Gellersen H. Multimodal recognition of reading activity in transit using body-worn sensors. ACM Transactions on Applied Perception. 2012 Mar 1;9(1):2. doi: 10.1145/2134203.2134205


BibTeX

@article{df6a72a598394fc5903cf27f094e7838,
title = "Multimodal recognition of reading activity in transit using body-worn sensors",
abstract = "Reading is one of the most well-studied visual activities. Vision research traditionally focuses on understanding the perceptual and cognitive processes involved in reading. In this work we recognize reading activity by jointly analyzing eye and head movements of people in an everyday environment. Eye movements are recorded using an electrooculography (EOG) system; body movements using body-worn inertial measurement units. We compare two approaches for continuous recognition of reading: String matching (STR) that explicitly models the characteristic horizontal saccades during reading, and a support vector machine (SVM) that relies on 90 eye movement features extracted from the eye movement data. We evaluate both methods in a study performed with eight participants reading while sitting at a desk, standing, walking indoors and outdoors, and riding a tram. We introduce a method to segment reading activity by exploiting the sensorimotor coordination of eye and head movements during reading. Using person-independent training, we obtain an average precision for recognizing reading of 88.9% (recall 72.3%) using STR and of 87.7% (recall 87.9%) using SVM over all participants. We show that the proposed segmentation scheme improves the performance of recognizing reading events by more than 24%. Our work demonstrates that the joint analysis of eye and body movements is beneficial for reading recognition and opens up discussion on the wider applicability of a multimodal recognition approach to other visual and physical activities.",
author = "Bulling, Andreas and Ward, {Jamie A.} and Gellersen, Hans",
year = "2012",
month = mar,
day = "1",
doi = "10.1145/2134203.2134205",
language = "English",
volume = "9",
journal = "ACM Transactions on Applied Perception",
issn = "1544-3558",
publisher = "Association for Computing Machinery (ACM)",
number = "1",
}

RIS

TY - JOUR

T1 - Multimodal recognition of reading activity in transit using body-worn sensors

AU - Bulling, Andreas

AU - Ward, Jamie A.

AU - Gellersen, Hans

PY - 2012/3/1

Y1 - 2012/3/1

N2 - Reading is one of the most well-studied visual activities. Vision research traditionally focuses on understanding the perceptual and cognitive processes involved in reading. In this work we recognize reading activity by jointly analyzing eye and head movements of people in an everyday environment. Eye movements are recorded using an electrooculography (EOG) system; body movements using body-worn inertial measurement units. We compare two approaches for continuous recognition of reading: String matching (STR) that explicitly models the characteristic horizontal saccades during reading, and a support vector machine (SVM) that relies on 90 eye movement features extracted from the eye movement data. We evaluate both methods in a study performed with eight participants reading while sitting at a desk, standing, walking indoors and outdoors, and riding a tram. We introduce a method to segment reading activity by exploiting the sensorimotor coordination of eye and head movements during reading. Using person-independent training, we obtain an average precision for recognizing reading of 88.9% (recall 72.3%) using STR and of 87.7% (recall 87.9%) using SVM over all participants. We show that the proposed segmentation scheme improves the performance of recognizing reading events by more than 24%. Our work demonstrates that the joint analysis of eye and body movements is beneficial for reading recognition and opens up discussion on the wider applicability of a multimodal recognition approach to other visual and physical activities.

AB - Reading is one of the most well-studied visual activities. Vision research traditionally focuses on understanding the perceptual and cognitive processes involved in reading. In this work we recognize reading activity by jointly analyzing eye and head movements of people in an everyday environment. Eye movements are recorded using an electrooculography (EOG) system; body movements using body-worn inertial measurement units. We compare two approaches for continuous recognition of reading: String matching (STR) that explicitly models the characteristic horizontal saccades during reading, and a support vector machine (SVM) that relies on 90 eye movement features extracted from the eye movement data. We evaluate both methods in a study performed with eight participants reading while sitting at a desk, standing, walking indoors and outdoors, and riding a tram. We introduce a method to segment reading activity by exploiting the sensorimotor coordination of eye and head movements during reading. Using person-independent training, we obtain an average precision for recognizing reading of 88.9% (recall 72.3%) using STR and of 87.7% (recall 87.9%) using SVM over all participants. We show that the proposed segmentation scheme improves the performance of recognizing reading events by more than 24%. Our work demonstrates that the joint analysis of eye and body movements is beneficial for reading recognition and opens up discussion on the wider applicability of a multimodal recognition approach to other visual and physical activities.

U2 - 10.1145/2134203.2134205

DO - 10.1145/2134203.2134205

M3 - Journal article

VL - 9

JO - ACM Transactions on Applied Perception

JF - ACM Transactions on Applied Perception

SN - 1544-3558

IS - 1

M1 - 2

ER -