Electronic data

  • Motion_Coupling_of_Earable_Devices_in_Camera_View___MUM2020_Note

    Rights statement: © ACM, 2020. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in MUM 2020: 19th International Conference on Mobile and Ubiquitous Multimedia 2020 https://dl.acm.org/doi/proceedings/10.1145/3428361

    Accepted author manuscript, 3.71 MB, PDF document

Motion Coupling of Earable Devices in Camera View

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published

Standard

Motion Coupling of Earable Devices in Camera View. / Clarke, Christopher; Ehrich, Peter; Gellersen, Hans.
MUM 2020: 19th International Conference on Mobile and Ubiquitous Multimedia. New York: ACM, 2020. p. 13–17.

Harvard

Clarke, C, Ehrich, P & Gellersen, H 2020, Motion Coupling of Earable Devices in Camera View. in MUM 2020: 19th International Conference on Mobile and Ubiquitous Multimedia. ACM, New York, pp. 13–17. https://doi.org/10.1145/3428361.3428470

APA

Clarke, C., Ehrich, P., & Gellersen, H. (2020). Motion Coupling of Earable Devices in Camera View. In MUM 2020: 19th International Conference on Mobile and Ubiquitous Multimedia (pp. 13–17). ACM. https://doi.org/10.1145/3428361.3428470

Vancouver

Clarke C, Ehrich P, Gellersen H. Motion Coupling of Earable Devices in Camera View. In MUM 2020: 19th International Conference on Mobile and Ubiquitous Multimedia. New York: ACM. 2020. p. 13–17 doi: 10.1145/3428361.3428470

Author

Clarke, Christopher ; Ehrich, Peter ; Gellersen, Hans. / Motion Coupling of Earable Devices in Camera View. MUM 2020: 19th International Conference on Mobile and Ubiquitous Multimedia. New York : ACM, 2020. pp. 13–17

Bibtex

@inproceedings{efdb1d42f2f74b198dc4677013e6cd6b,
title = "Motion Coupling of Earable Devices in Camera View",
abstract = "Earables, earphones augmented with inertial sensors and real-time data accessibility, provide the opportunity for private audio channels in public settings. One of the main challenges of achieving this goal is to correctly associate which device belongs to which user without prior information. In this paper, we explore how motion of an earable, as measured by the on-board accelerometer, can be correlated against detected faces from a webcam to accurately match which user is wearing the device. We conduct a data collection and explore which type of user movement can be accurately detected using this approach, and investigate how varying the speed of the movement affects detection rates. Our results show that the approach achieves greater detection results for faster movements, and that it can differentiate the same movement across different participants with a detection rate of 86%, increasing to 92% when differentiating a movement against others.",
author = "Christopher Clarke and Peter Ehrich and Hans Gellersen",
note = "{\textcopyright} ACM, 2020. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in MUM 2020: 19th International Conference on Mobile and Ubiquitous Multimedia 2020 https://dl.acm.org/doi/proceedings/10.1145/3428361",
year = "2020",
month = nov,
day = "1",
doi = "10.1145/3428361.3428470",
language = "English",
isbn = "9781450388702",
pages = "13--17",
booktitle = "MUM 2020: 19th International Conference on Mobile and Ubiquitous Multimedia",
publisher = "ACM",
address = "New York",
}

RIS

TY - GEN

T1 - Motion Coupling of Earable Devices in Camera View

AU - Clarke, Christopher

AU - Ehrich, Peter

AU - Gellersen, Hans

N1 - © ACM, 2020. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in MUM 2020: 19th International Conference on Mobile and Ubiquitous Multimedia 2020 https://dl.acm.org/doi/proceedings/10.1145/3428361

PY - 2020/11/1

Y1 - 2020/11/1

N2 - Earables, earphones augmented with inertial sensors and real-time data accessibility, provide the opportunity for private audio channels in public settings. One of the main challenges of achieving this goal is to correctly associate which device belongs to which user without prior information. In this paper, we explore how motion of an earable, as measured by the on-board accelerometer, can be correlated against detected faces from a webcam to accurately match which user is wearing the device. We conduct a data collection and explore which type of user movement can be accurately detected using this approach, and investigate how varying the speed of the movement affects detection rates. Our results show that the approach achieves greater detection results for faster movements, and that it can differentiate the same movement across different participants with a detection rate of 86%, increasing to 92% when differentiating a movement against others.

AB - Earables, earphones augmented with inertial sensors and real-time data accessibility, provide the opportunity for private audio channels in public settings. One of the main challenges of achieving this goal is to correctly associate which device belongs to which user without prior information. In this paper, we explore how motion of an earable, as measured by the on-board accelerometer, can be correlated against detected faces from a webcam to accurately match which user is wearing the device. We conduct a data collection and explore which type of user movement can be accurately detected using this approach, and investigate how varying the speed of the movement affects detection rates. Our results show that the approach achieves greater detection results for faster movements, and that it can differentiate the same movement across different participants with a detection rate of 86%, increasing to 92% when differentiating a movement against others.

U2 - 10.1145/3428361.3428470

DO - 10.1145/3428361.3428470

M3 - Conference contribution/Paper

SN - 9781450388702

SP - 13

EP - 17

BT - MUM 2020: 19th International Conference on Mobile and Ubiquitous Multimedia

PB - ACM

CY - New York

ER -
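
The core matching step described in the abstract — correlating an earable's accelerometer-derived motion against motion signals extracted from detected faces in a webcam feed, and picking the best-correlated face — can be illustrated with a minimal sketch. This is not the authors' implementation; the function names, the use of Pearson correlation, and the 1-D signal representation are illustrative assumptions.

```python
import numpy as np

def pearson(a: np.ndarray, b: np.ndarray) -> float:
    """Pearson correlation of two equal-length 1-D signals."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float(np.mean(a * b))

def match_earable_to_face(accel_signal: np.ndarray,
                          face_signals: list[np.ndarray]) -> tuple[int, list[float]]:
    """Return the index of the face track whose motion signal correlates
    best with the earable's accelerometer signal, plus all scores.

    accel_signal : 1-D motion signal derived from the on-board accelerometer
    face_signals : one 1-D motion signal per face detected in the camera view
                   (e.g. vertical displacement of the face bounding box),
                   resampled to the same rate and length as accel_signal
    """
    scores = [pearson(accel_signal, f) for f in face_signals]
    return int(np.argmax(scores)), scores
```

In practice the two signals come from different sensors at different rates, so alignment (resampling, lag compensation) and a decision threshold on the winning score would be needed before declaring a match; the paper's detection rates (86%, rising to 92%) reflect such a full pipeline, not this sketch.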