
Position Calibration tools to minimise cross session variance. #10

Open
PerlinWarp opened this issue Nov 11, 2021 · 0 comments
Labels
enhancement New feature or request

Sensor positioning is hard, and solving it is part of solving cross-session generalisation. Although there are fancy techniques that can be applied after the data is gathered, it would be worth having a discussion on the simple things. Here are some of my rough thoughts:

Calibration Methods

Once a placement is chosen, I would originally sharpie an outline of the whole Myo onto my arm. It's important to mark the position of each sensor pod, not just one: the Myo can rotate around the arm, slide up and down it, or tilt.

One simple way I calibrate is to place the Myo on the proximal forearm of my right arm, wave my wrist to the right, and check that this movement peaks only one channel, e.g. channel 3.
If the movement peaks two channels, I know the relevant muscle lies between two EMG sensors, so I rotate the Myo slightly to minimise crosstalk and maximise the amount of signal picked up only by channel 3.
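This check could be automated. Below is a minimal NumPy sketch (not part of this repo; the function name and the 8-channel array layout are assumptions) that takes a recorded window of EMG samples, reports which channel peaks, and gives a crosstalk ratio — the second-highest peak over the highest — where values near 0 mean the movement is picked up almost entirely by one channel:

```python
import numpy as np

def dominant_channel(emg_window):
    """emg_window: (n_samples, 8) array of raw EMG readings.
    Returns (index of the channel with the largest peak, crosstalk ratio).
    A low ratio means one channel clearly dominates the movement."""
    peaks = np.abs(emg_window).max(axis=0)   # per-channel peak amplitude
    order = np.argsort(peaks)[::-1]          # channels, strongest first
    return int(order[0]), float(peaks[order[1]] / peaks[order[0]])

# Synthetic demo: a burst on channel 3 over background noise elsewhere.
rng = np.random.default_rng(0)
window = rng.normal(0, 1, size=(200, 8))
window[:, 3] += 20 * np.abs(np.sin(np.linspace(0, np.pi, 200)))
channel, crosstalk = dominant_channel(window)
print(channel, round(crosstalk, 2))
```

A calibration tool could run this on a short live buffer after each wrist wave and prompt the user to rotate the Myo until the ratio drops below some threshold.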

Once this placement is set, a kNN classifier can be trained on relevant gestures, e.g. 5-finger flexion. Then, in the next session, after manual calibration is attempted, the placement can be fine-tuned until the kNN classifier works.
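The kNN check could look something like the sketch below — a self-contained NumPy illustration, not the repo's API. It assumes each sample is a feature vector (e.g. mean absolute value per channel) with a gesture label, and the synthetic clusters stand in for real session data:

```python
import numpy as np

def knn_predict(X_train, y_train, X, k=5):
    """Classify each row of X by majority vote of its k nearest
    training samples (Euclidean distance)."""
    dists = np.linalg.norm(X[:, None, :] - X_train[None, :, :], axis=2)
    neighbours = np.argsort(dists, axis=1)[:, :k]
    return np.array([np.bincount(y_train[i]).argmax() for i in neighbours])

# Session 1: one synthetic cluster of feature vectors per gesture (0-4).
rng = np.random.default_rng(1)
centres = rng.normal(0, 5, size=(5, 8))
X_train = np.vstack([c + rng.normal(0, 1, (30, 8)) for c in centres])
y_train = np.repeat(np.arange(5), 30)

# Session 2: after re-donning, gather a few labelled samples per gesture
# and check accuracy; a low score means the placement needs more tuning.
X_new = np.vstack([c + rng.normal(0, 1, (10, 8)) for c in centres])
y_new = np.repeat(np.arange(5), 10)
accuracy = np.mean(knn_predict(X_train, y_train, X_new) == y_new)
print(accuracy)
```

In practice the session-2 samples would come from quickly re-performing each gesture once, and the accuracy printout becomes the feedback signal for fine-tuning the placement.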

Search for something better, or at least more scientific

Some tools could be made to tell the user how to rotate the Myo by showing how similar the current readings are to previously gathered data.
Adding live PCA plotting to the live classifiers, with a printout of between/within-group variance, may help to quickly try out different positionings for different gestures.
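The between/within-group variance idea can be sketched as follows — PCA via SVD plus a Fisher-style scatter ratio, all in NumPy; the function name and thresholds are illustrative, and the 2D projection `Z` is what a live plot would scatter:

```python
import numpy as np

def pca_separability(X, y, n_components=2):
    """Project features onto the top principal components (via SVD) and
    return the projection plus the ratio of between-group to
    within-group variance — higher means the gestures separate better
    under the current sensor placement."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ Vt[:n_components].T             # 2D points to plot live
    grand_mean = Z.mean(axis=0)
    between = within = 0.0
    for g in np.unique(y):
        Zg = Z[y == g]
        between += len(Zg) * np.sum((Zg.mean(axis=0) - grand_mean) ** 2)
        within += np.sum((Zg - Zg.mean(axis=0)) ** 2)
    return Z, between / within

# Two well-separated synthetic gesture clusters should give a high ratio.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 1, (50, 8)), rng.normal(6, 1, (50, 8))])
y = np.repeat([0, 1], 50)
Z, ratio = pca_separability(X, y)
print(Z.shape, ratio > 1.0)
```

Printing this ratio alongside the scatter plot would let the user compare placements at a glance: move the Myo, re-record a few gestures, and keep the position with the highest ratio.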
