Neurosleeve

Deep Learning For Hand Gesture Signal Classification

This is an implementation of a ConvNet that predicts hand position from four Double-Differential sEMG signals recorded on the forearm. To stream the sEMG signals we used an OpenBCI Ganglion board and the networking widget of the OpenBCI GUI.
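As a rough illustration of the model side, here is a minimal 1D ConvNet sketch in Keras, assuming raw input windows of 50 samples x 4 channels and seven classes (six gestures plus neutral); the layer sizes are illustrative, not the exact architecture from the notebooks:

```python
# Minimal 1D ConvNet sketch for 4-channel sEMG windows (illustrative only).
# Assumes windows of 50 samples x 4 channels and 7 classes (6 gestures + neutral).
import tensorflow as tf

def build_model(window_len=50, n_channels=4, n_classes=7):
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(window_len, n_channels)),
        tf.keras.layers.Conv1D(32, kernel_size=5, activation="relu"),
        tf.keras.layers.MaxPooling1D(2),
        tf.keras.layers.Conv1D(64, kernel_size=3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling1D(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(n_classes, activation="softmax"),
    ])

model = build_model()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```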

The repository includes:

  • Source code for two ConvNets: one that works on spectrograms and one that works on the raw data from the four channels (see the spectrogram sketch after this list)
  • Script to collect data
  • Dataset built by josecyc and DanielCordovaV
  • Pre-trained weights for this dataset
  • Demo that maps model predictions to specific keys
  • Resources on which we based our project
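For the spectrogram variant, here is a minimal sketch of how per-channel spectrograms could be computed with SciPy; the FFT parameters (nperseg, noverlap) are assumptions, not the exact values used in the notebooks:

```python
# Compute per-channel spectrograms from a 4-channel sEMG recording
# sampled at 200 Hz. nperseg/noverlap are illustrative assumptions.
import numpy as np
from scipy.signal import spectrogram

fs = 200                               # Ganglion sample rate used here
recording = np.random.randn(2000, 4)   # placeholder: 10 s x 4 channels

specs = []
for ch in range(recording.shape[1]):
    f, t, Sxx = spectrogram(recording[:, ch], fs=fs, nperseg=64, noverlap=32)
    specs.append(Sxx)

# Stack into (channels, freq_bins, time_bins) for the ConvNet input.
specs = np.stack(specs)
print(specs.shape)
```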

Training on Your Own Dataset

Our data acquisition method:

Our dataset includes about 1100 examples for every hand gesture. We are open-sourcing it in the hope of benefiting the community and obtaining feedback. To build the dataset we settled on six hand positions (labels) as well as a neutral position:
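The gesture names themselves are defined in the LABELS variable of the collection script; as a purely hypothetical example (these names are placeholders, not the actual labels):

```python
# Hypothetical label set: six gestures plus a neutral position.
# The actual names live in the LABELS variable of the collection scripts.
LABELS = ["neutral", "fist", "open_hand", "flexion",
          "extension", "pinch", "point"]
```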

Electrode Placement

We placed four pairs of electrodes to measure the Double-Differential sEMG signal from the forearm muscles. After thorough research we settled on a location at 90% of the distance from the wrist to the elbow, targeting the Flexor Digitorum Profundus, Extensor Digitorum Communis, Extensor Carpi Radialis Longus, and Flexor Carpi Radialis; the precise locations can be seen in the pictures. We did not notice a substantial difference in prediction quality from minor differences in placement across subsequent sessions. For each example we held the same position for ten seconds while sampling at 200 Hz, then split the recording into 40 sub-samples.
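That segmentation, 10 s at 200 Hz giving 2000 samples split into 40 sub-samples of 50 samples each, can be sketched with NumPy:

```python
# Split a 10 s, 200 Hz, 4-channel recording into 40 sub-samples.
import numpy as np

fs, duration, n_windows = 200, 10, 40
recording = np.random.randn(fs * duration, 4)   # placeholder: 2000 x 4

# 2000 samples / 40 windows = 50 samples per window.
windows = recording.reshape(n_windows, -1, recording.shape[1])
print(windows.shape)  # (40, 50, 4)
```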

To replicate our procedure:

  • First download the OpenBCI GUI from this site
  • Define a position for each label (you can change the number of labels and their names by modifying the LABELS variable inside the scripts)
  • Run the osc_collect_data.py script and hold the position you defined for each label for ten seconds (you have five seconds after each label appears to adjust the position)
  • Run the Jupyter notebooks inside the models directory
  • Test your results using the demo script (a sketch of the underlying OSC streaming appears after this list)
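The collection and demo scripts consume the stream that the GUI networking widget sends over OSC; here is a minimal receiver sketch using the python-osc package (the address pattern and port are assumptions and must match your widget settings):

```python
# Minimal OSC receiver for the OpenBCI GUI networking widget.
# Address pattern and port are assumptions; match them to the widget settings.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_sample(address, *channels):
    # Each message carries one sample per sEMG channel.
    print(address, channels)

dispatcher = Dispatcher()
dispatcher.map("/openbci", on_sample)

server = BlockingOSCUDPServer(("127.0.0.1", 12345), dispatcher)
server.serve_forever()  # Ctrl+C to stop
```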

Requirements

Install required modules:

  • pip install -r requirements.txt

Resources

Includes the research papers that we built upon for this project.

Authors:

Special thanks to:

  • Taylor Yang, who provided the code for most of the streaming part of this project with his openbci_stream repo
  • 42 Silicon Valley, specifically 42 Robotics, for providing the environment in which this project was developed
