Read data from the Myo armband and build datasets for training a machine-learning model. Achieve real-time gesture recognition to control a humanoid manipulator (uHand 2.0).


MYO-Gesture-Control

Requirements:

  • myo-python
  • NumPy
  • scikit-learn
  • TensorFlow

Usage:

dataset.py: Record EMG signals from the Myo armband to build datasets for model training
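The Myo armband streams 8 EMG channels at 200 Hz, and gesture datasets are typically built by cutting that stream into short fixed-length windows. The sketch below shows that windowing step only; the window and step sizes, and the `make_windows` helper itself, are illustrative assumptions, not taken from dataset.py.

```python
import numpy as np

def make_windows(emg_stream, window_len=50, step=25):
    """Split an (n_samples, 8) EMG array into overlapping windows.

    At the Myo's 200 Hz EMG rate, a 50-sample window covers 250 ms.
    Sizes here are illustrative, not taken from dataset.py.
    """
    emg = np.asarray(emg_stream)
    windows = []
    for start in range(0, len(emg) - window_len + 1, step):
        windows.append(emg[start:start + window_len])
    return np.stack(windows)  # shape: (n_windows, window_len, 8)

# Example on synthetic 8-channel EMG (Myo EMG values are int8, -128..127)
fake_emg = np.random.randint(-128, 128, size=(200, 8))
X = make_windows(fake_emg)
print(X.shape)  # (7, 50, 8)
```

Each window then gets one gesture label, producing the (example, label) pairs the training script consumes.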

train_model.py: Train the CNN model
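For a sense of what a CNN over such EMG windows can look like, here is a minimal Keras sketch: a 1-D convolutional network over (window_len, 8) inputs with 7 output classes. The layer sizes and architecture are assumptions for illustration, not the actual network in train_model.py.

```python
import tensorflow as tf

def build_model(window_len=50, n_channels=8, n_classes=7):
    """Small 1-D CNN over EMG windows; layer sizes are illustrative."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(window_len, n_channels)),
        tf.keras.layers.Conv1D(32, 5, activation="relu"),
        tf.keras.layers.MaxPooling1D(2),
        tf.keras.layers.Conv1D(64, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling1D(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(n_classes, activation="softmax"),
    ])

model = build_model()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

With integer gesture labels, `model.fit(X, y)` on windows from dataset.py trains the classifier.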

ges_rec_online.py: Run real-time gesture recognition to control a humanoid manipulator (uHand 2.0)
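Per-window predictions in a live stream tend to flicker, so real-time controllers commonly act only on the majority label over the last few windows. This smoothing sketch is a standard technique offered as an assumption; it is not necessarily what ges_rec_online.py does.

```python
from collections import Counter, deque

class VoteSmoother:
    """Emit the majority gesture label over the last n predictions."""
    def __init__(self, n=5):
        self.history = deque(maxlen=n)

    def update(self, label):
        self.history.append(label)
        # most_common(1) returns the label seen most often in the window
        return Counter(self.history).most_common(1)[0][0]

smoother = VoteSmoother(n=5)
stream = ["relax", "fist", "fist", "relax", "fist", "fist"]
print([smoother.update(g) for g in stream])
```

Larger `n` gives steadier output at the cost of extra latency before the manipulator reacts.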

decode_grip_strength.py: Decode the relative grip strength from the EMG signal amplitude
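One standard amplitude measure for this kind of decoding is the RMS of the EMG window, normalized against a calibrated maximum. The sketch below assumes that approach; the `max_rms` value and the exact formula are illustrative, not taken from decode_grip_strength.py.

```python
import numpy as np

def grip_strength(window, max_rms=60.0):
    """Relative grip strength in [0, 1] from an (n_samples, 8) EMG window.

    Uses RMS over all samples and channels; max_rms is a per-user
    calibration value (the 60.0 default is illustrative only).
    """
    rms = np.sqrt(np.mean(np.square(window, dtype=float)))
    return min(rms / max_rms, 1.0)

quiet = np.zeros((50, 8))
print(grip_strength(quiet))  # 0.0
```

The resulting 0-to-1 value can be mapped directly onto the manipulator's finger-closure range.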

7 gestures:

< fist > < finger_spread > < thumb > < 1_finger_type > < 2_finger_type > < V_gesture > < relax >
