3D Printed Brain-Controlled Robot-Arm Prosthetic via Embedded Deep Learning from sEMG Sensors

Research output: Contribution in Book/Report/Proceedings - With ISBN/ISSN › Conference contribution/Paper › peer-review

Published
Publication date: 5/07/2021
Host publication: 2020 International Conference on Machine Learning and Cybernetics (ICMLC)
Publisher: IEEE
Pages: 247-253
Number of pages: 7
ISBN (electronic): 9780738124261
Original language: English

Publication series

Name: Proceedings - International Conference on Machine Learning and Cybernetics
Volume: 2020-December
ISSN (Print): 2160-133X
ISSN (electronic): 2160-1348

Abstract

In this paper, we present our work on developing a robot-arm prosthetic controlled via deep learning. We propose applying transfer learning to the Google Inception model, retraining its final layer for surface electromyography (sEMG) classification. Data were collected using the Thalmic Labs Myo Armband and used to generate graph images of 8 subplots each, one per sensor in the armband's array of 8 sEMG sensors, with each subplot containing 40 data points of sEMG data. The captured data were then classified into four categories (Fist, Thumbs Up, Open Hand, Rest) using a deep learning model, Inception-v3, trained via transfer learning for accurate prediction on real-time input of new data. The trained model was then deployed to an ARM-processor-based embedded system to enable the brain-controlled robot-arm prosthetic manufactured on our 3D printer. To test the functionality of the method, a robotic arm was produced using a 3D printer together with off-the-shelf hardware to control it. SSH communication protocols are employed to execute Python files hosted on an embedded Raspberry Pi with an ARM processor, triggering the predicted gesture's movement on the robot arm.