Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review
TY - GEN
T1 - Motion Coupling of Earable Devices in Camera View
AU - Clarke, Christopher
AU - Ehrich, Peter
AU - Gellersen, Hans
N1 - © ACM, 2020. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in MUM 2020: 19th International Conference on Mobile and Ubiquitous Multimedia 2020 https://dl.acm.org/doi/proceedings/10.1145/3428361
PY - 2020/11/1
Y1 - 2020/11/1
N2 - Earables, earphones augmented with inertial sensors and real-time data access, provide the opportunity for private audio channels in public settings. One of the main challenges in achieving this goal is correctly associating each device with its wearer without prior information. In this paper, we explore how the motion of an earable, as measured by its on-board accelerometer, can be correlated against faces detected by a webcam to accurately determine which user is wearing the device. We conduct a data collection study to explore which types of user movement can be accurately detected with this approach, and investigate how varying the speed of a movement affects detection rates. Our results show that the approach achieves higher detection rates for faster movements, and that it can differentiate the same movement across different participants with a detection rate of 86%, rising to 92% when differentiating a movement against other movements.
AB - Earables, earphones augmented with inertial sensors and real-time data access, provide the opportunity for private audio channels in public settings. One of the main challenges in achieving this goal is correctly associating each device with its wearer without prior information. In this paper, we explore how the motion of an earable, as measured by its on-board accelerometer, can be correlated against faces detected by a webcam to accurately determine which user is wearing the device. We conduct a data collection study to explore which types of user movement can be accurately detected with this approach, and investigate how varying the speed of a movement affects detection rates. Our results show that the approach achieves higher detection rates for faster movements, and that it can differentiate the same movement across different participants with a detection rate of 86%, rising to 92% when differentiating a movement against other movements.
U2 - 10.1145/3428361.3428470
DO - 10.1145/3428361.3428470
M3 - Conference contribution/Paper
SN - 9781450388702
SP - 13
EP - 17
BT - MUM 2020: 19th International Conference on Mobile and Ubiquitous Multimedia
PB - ACM
CY - New York
ER -
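The abstract describes matching an earable to its wearer by correlating the device's accelerometer signal against the motion of faces detected in a webcam view. A minimal sketch of that idea, using Pearson correlation over a time window, could look like the following. This is an illustration only, not the paper's actual pipeline: the signal representations (`accel_signal`, `face_signals`), the choice of Pearson correlation, and the example data are all assumptions introduced here.

```python
import numpy as np

def match_earable_to_face(accel_signal, face_signals):
    """Return the index of the detected face whose motion best
    correlates with the earable's accelerometer signal, plus the
    per-face correlation scores.

    accel_signal: 1-D array of accelerometer magnitudes over a window.
    face_signals: list of 1-D arrays, each tracking one detected
                  face's position over the same window.
    """
    scores = []
    for face in face_signals:
        # np.corrcoef returns a 2x2 correlation matrix; the
        # off-diagonal entry is the Pearson correlation between
        # the two signals, used here as the similarity score.
        r = np.corrcoef(accel_signal, face)[0, 1]
        scores.append(r)
    return int(np.argmax(scores)), scores

# Hypothetical example: face 1 moves in sync with the device,
# face 0 moves randomly, so face 1 should be selected.
t = np.linspace(0, 2 * np.pi, 100)
accel = np.sin(3 * t)
faces = [
    np.random.default_rng(0).normal(size=100),  # unrelated motion
    np.sin(3 * t) + 0.1,                        # synchronised motion
]
best, scores = match_earable_to_face(accel, faces)
```

In practice, a system like the one described would also need to align the camera and accelerometer streams in time and handle faces entering or leaving the view; those steps are omitted from this sketch.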