Universal Gestures: Lab

This repository houses the machine learning/neural network implementation of the Universal Gestures project. See the Technical Specification for more details.

Related: Universal Gestures Unity Project and Scrum Board.

Setup

Python

Install Python 3.12.3 or later.
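A quick way to confirm the interpreter meets this requirement is a small version check; the minimum version comes from this README, while the helper name and warning text are illustrative:

```python
import sys

# Minimum version stated in the README's Setup section.
REQUIRED = (3, 12, 3)

def check_python(version=None):
    """Return True if the given (or running) interpreter version is new enough."""
    version = version or sys.version_info[:3]
    return tuple(version) >= REQUIRED

if not check_python():
    print("Warning: Python %d.%d.%d or later is recommended" % REQUIRED)
```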

requirements.txt

From the root directory, run the following command to install the required packages:

pip install -r requirements.txt

Usage

  1. Populate data/ with JSON data collected from the Unity data collection scene.
  2. Run process_data.py to split the dataset into train and test sets.
  3. Run model.py to train the model, or model_two_hands.py to train a two-handed model. Ensure the number of input features to the model is correct.
  4. Find the trained weights in trained_model/. The model is exported in both .onnx and .json formats. The .onnx model can then be imported into the Universal Gestures Unity project and used by the inference script.
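Step 2 above can be sketched as follows. This is a minimal, self-contained illustration, not the actual process_data.py: the 80/20 split ratio, the fixed seed, and the assumption that each JSON file in data/ holds a list of samples are all assumptions made for the example.

```python
import json
import random
from pathlib import Path

def split_dataset(records, train_frac=0.8, seed=42):
    """Shuffle records deterministically and split into (train, test) lists."""
    rng = random.Random(seed)
    shuffled = records[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_frac)
    return shuffled[:cut], shuffled[cut:]

def load_records(data_dir="data"):
    """Load every JSON file captured by the Unity data collection scene."""
    records = []
    for path in Path(data_dir).glob("*.json"):
        with open(path) as f:
            records.extend(json.load(f))  # assumes each file holds a list of samples
    return records

if __name__ == "__main__":
    train, test = split_dataset(load_records())
    print(f"{len(train)} train / {len(test)} test samples")
```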

About

PyTorch research repository for Universal Gestures.
