

3D Printed Brain-Controlled Robot-Arm Prosthetic via Embedded Deep Learning from sEMG Sensors

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published

Standard

3D Printed Brain-Controlled Robot-Arm Prosthetic via Embedded Deep Learning from sEMG Sensors. / Lonsdale, David; Zhang, Li; Jiang, Richard.
2020 International Conference on Machine Learning and Cybernetics (ICMLC). IEEE, 2021. p. 247-253 9469532 (Proceedings - International Conference on Machine Learning and Cybernetics; Vol. 2020-December).


Harvard

Lonsdale, D, Zhang, L & Jiang, R 2021, 3D Printed Brain-Controlled Robot-Arm Prosthetic via Embedded Deep Learning from sEMG Sensors. in 2020 International Conference on Machine Learning and Cybernetics (ICMLC)., 9469532, Proceedings - International Conference on Machine Learning and Cybernetics, vol. 2020-December, IEEE, pp. 247-253. https://doi.org/10.1109/ICMLC51923.2020.9469532

APA

Lonsdale, D., Zhang, L., & Jiang, R. (2021). 3D Printed Brain-Controlled Robot-Arm Prosthetic via Embedded Deep Learning from sEMG Sensors. In 2020 International Conference on Machine Learning and Cybernetics (ICMLC) (pp. 247-253). Article 9469532 (Proceedings - International Conference on Machine Learning and Cybernetics; Vol. 2020-December). IEEE. https://doi.org/10.1109/ICMLC51923.2020.9469532

Vancouver

Lonsdale D, Zhang L, Jiang R. 3D Printed Brain-Controlled Robot-Arm Prosthetic via Embedded Deep Learning from sEMG Sensors. In 2020 International Conference on Machine Learning and Cybernetics (ICMLC). IEEE. 2021. p. 247-253. 9469532. (Proceedings - International Conference on Machine Learning and Cybernetics). Epub 2020 Dec 2. doi: 10.1109/ICMLC51923.2020.9469532

Author

Lonsdale, David ; Zhang, Li ; Jiang, Richard. / 3D Printed Brain-Controlled Robot-Arm Prosthetic via Embedded Deep Learning from sEMG Sensors. 2020 International Conference on Machine Learning and Cybernetics (ICMLC). IEEE, 2021. pp. 247-253 (Proceedings - International Conference on Machine Learning and Cybernetics).

Bibtex

@inproceedings{08077eb0403b43bcbb0c12d581950752,
title = "3D Printed Brain-Controlled Robot-Arm Prosthetic via Embedded Deep Learning from sEMG Sensors",
abstract = "In this paper, we present our work on developing a robot-arm prosthetic via deep learning. Our work applies transfer learning to the Google Inception model, retraining its final layer for surface electromyography (sEMG) classification. Data were collected using the Thalmic Labs Myo Armband and used to generate graph images of 8 subplots per image, each subplot containing 40 data points from one sensor, corresponding to the array of 8 sEMG sensors in the armband. The captured data were then classified into four categories (Fist, Thumbs Up, Open Hand, Rest) using a deep learning model, Inception-v3, trained via transfer learning for accurate prediction on real-time input of new data. The trained model was then deployed to an ARM-processor-based embedded system to enable the brain-controlled robot-arm prosthetic manufactured with our 3D printer. To test the functionality of the method, a robotic arm was produced using a 3D printer together with off-the-shelf hardware to control it. SSH communication protocols are employed to execute Python files hosted on an embedded Raspberry Pi with an ARM processor, triggering the predicted gesture's movement on the robot arm.",
author = "David Lonsdale and Li Zhang and Richard Jiang",
year = "2021",
month = jul,
day = "5",
doi = "10.1109/ICMLC51923.2020.9469532",
language = "English",
series = "Proceedings - International Conference on Machine Learning and Cybernetics",
publisher = "IEEE",
pages = "247--253",
booktitle = "2020 International Conference on Machine Learning and Cybernetics (ICMLC)",

}
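
The abstract describes an image-based sEMG classification pipeline: each 8-channel, 40-sample Myo Armband window is plotted as an 8-subplot graph image and classified by an Inception-v3 whose final layer has been retrained on the four gestures. The sketch below illustrates that step only under stated assumptions; the model file, the randomly generated window, and all helper names are hypothetical stand-ins, not code from the paper.

# Minimal sketch (assumed, not from the paper): render one sEMG window as the
# 8-subplot graph image described in the abstract, then classify it with a
# transfer-learned Inception-v3. Window size (40) and the four gesture labels
# come from the abstract; file and function names are placeholders.
import numpy as np
import matplotlib
matplotlib.use("Agg")                     # render off-screen (e.g. headless Pi)
import matplotlib.pyplot as plt
import tensorflow as tf

GESTURES = ["Fist", "Thumbs Up", "Open Hand", "Rest"]   # four classes from the abstract
SENSORS, WINDOW = 8, 40                                  # 8 sEMG channels, 40 points each

def emg_window_to_image(window, path="window.png"):
    """Plot one (8, 40) sEMG window as the 8-subplot graph image fed to the CNN."""
    fig, axes = plt.subplots(SENSORS, 1, figsize=(3, 6))
    for ch, ax in enumerate(axes):
        ax.plot(window[ch])
        ax.set_axis_off()
    fig.savefig(path)
    plt.close(fig)
    return path

def classify_image(path, model):
    """Resize the graph image to Inception-v3's 299x299 input and predict a gesture."""
    img = tf.keras.utils.load_img(path, target_size=(299, 299))
    x = tf.keras.utils.img_to_array(img)[None, ...]
    x = tf.keras.applications.inception_v3.preprocess_input(x)
    probs = model.predict(x, verbose=0)[0]
    return GESTURES[int(np.argmax(probs))]

if __name__ == "__main__":
    # 'retrained_inception_v3.h5' stands in for the transfer-learned model;
    # the random integers stand in for a real Myo Armband capture.
    model = tf.keras.models.load_model("retrained_inception_v3.h5")
    window = np.random.randint(-128, 128, size=(SENSORS, WINDOW))
    print(classify_image(emg_window_to_image(window), model))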

RIS

TY - GEN

T1 - 3D Printed Brain-Controlled Robot-Arm Prosthetic via Embedded Deep Learning from sEMG Sensors

AU - Lonsdale, David

AU - Zhang, Li

AU - Jiang, Richard

PY - 2021/7/5

Y1 - 2021/7/5

N2 - In this paper, we present our work on developing a robot-arm prosthetic via deep learning. Our work applies transfer learning to the Google Inception model, retraining its final layer for surface electromyography (sEMG) classification. Data were collected using the Thalmic Labs Myo Armband and used to generate graph images of 8 subplots per image, each subplot containing 40 data points from one sensor, corresponding to the array of 8 sEMG sensors in the armband. The captured data were then classified into four categories (Fist, Thumbs Up, Open Hand, Rest) using a deep learning model, Inception-v3, trained via transfer learning for accurate prediction on real-time input of new data. The trained model was then deployed to an ARM-processor-based embedded system to enable the brain-controlled robot-arm prosthetic manufactured with our 3D printer. To test the functionality of the method, a robotic arm was produced using a 3D printer together with off-the-shelf hardware to control it. SSH communication protocols are employed to execute Python files hosted on an embedded Raspberry Pi with an ARM processor, triggering the predicted gesture's movement on the robot arm.

AB - In this paper, we present our work on developing a robot-arm prosthetic via deep learning. Our work applies transfer learning to the Google Inception model, retraining its final layer for surface electromyography (sEMG) classification. Data were collected using the Thalmic Labs Myo Armband and used to generate graph images of 8 subplots per image, each subplot containing 40 data points from one sensor, corresponding to the array of 8 sEMG sensors in the armband. The captured data were then classified into four categories (Fist, Thumbs Up, Open Hand, Rest) using a deep learning model, Inception-v3, trained via transfer learning for accurate prediction on real-time input of new data. The trained model was then deployed to an ARM-processor-based embedded system to enable the brain-controlled robot-arm prosthetic manufactured with our 3D printer. To test the functionality of the method, a robotic arm was produced using a 3D printer together with off-the-shelf hardware to control it. SSH communication protocols are employed to execute Python files hosted on an embedded Raspberry Pi with an ARM processor, triggering the predicted gesture's movement on the robot arm.

U2 - 10.1109/ICMLC51923.2020.9469532

DO - 10.1109/ICMLC51923.2020.9469532

M3 - Conference contribution/Paper

T3 - Proceedings - International Conference on Machine Learning and Cybernetics

SP - 247

EP - 253

BT - 2020 International Conference on Machine Learning and Cybernetics (ICMLC)

PB - IEEE

ER -
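
The abstract also notes that the predicted gesture is executed by running Python files on the Raspberry Pi over SSH. The sketch below shows one plausible way to do that with the paramiko SSH library under stated assumptions; the host, credentials, script paths, and per-gesture script names are hypothetical and not taken from the paper.

# Minimal sketch (assumed, not from the paper): send the predicted gesture to
# the Raspberry Pi over SSH and run the matching motor-control script to move
# the 3D-printed arm. All names and paths below are placeholders.
import paramiko

GESTURE_SCRIPTS = {                     # hypothetical per-gesture scripts hosted on the Pi
    "Fist": "fist.py",
    "Thumbs Up": "thumbs_up.py",
    "Open Hand": "open_hand.py",
    "Rest": "rest.py",
}

def trigger_gesture(gesture, host="raspberrypi.local", user="pi", password="raspberry"):
    """Run the gesture's control script on the Pi over SSH and return its output."""
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username=user, password=password)
    try:
        _, stdout, _ = client.exec_command(
            f"python3 /home/pi/arm/{GESTURE_SCRIPTS[gesture]}"
        )
        return stdout.read().decode()
    finally:
        client.close()

if __name__ == "__main__":
    print(trigger_gesture("Open Hand"))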