# Universal Gestures: Lab

This repository houses the machine learning/neural network implementation of the Universal Gestures project. See the Technical Specification for more details.

Related: Universal Gestures Unity Project and Scrum Board.

## Setup

### Python

Install Python 3.12.3 or later.

### requirements.txt

From the root directory, run the following command to install the required packages:

```
pip install -r requirements.txt
```

## Usage

1. Populate `data/` with JSON data collected from the Unity data collection scene.
2. Run `process_data.py` to split the dataset into training and test sets.
3. Run `model.py` to train the model, or run `model_two_hands.py` to train a two-handed model. Ensure the number of input features to the model is correct.
4. Find the output weights in `trained_model/`. The model is exported in both `.onnx` and `.json` formats; the `.onnx` model can then be imported into the Universal Gestures Unity project and used by the inference script (see the sanity-check sketch below).
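As an optional sanity check before importing the exported model into Unity, the `.onnx` file can be loaded and run on a dummy input with `onnxruntime` (which may need to be installed separately if it is not in `requirements.txt`). The snippet below is a minimal sketch, assuming the export is written to `trained_model/model.onnx`; adjust the path to match the actual output file name from your training run.

```python
# Hypothetical sanity check for the exported ONNX model (path is an assumption).
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("trained_model/model.onnx")

# Build a dummy input matching the model's declared input shape,
# replacing any symbolic/dynamic dimensions (e.g. batch size) with 1.
model_input = session.get_inputs()[0]
shape = [d if isinstance(d, int) else 1 for d in model_input.shape]
dummy_input = np.random.rand(*shape).astype(np.float32)

# Run inference and print the raw output tensor.
outputs = session.run(None, {model_input.name: dummy_input})
print("Model output:", outputs[0])
```

If this runs without errors and prints an output of the expected size, the model should be ready to import into the Unity project.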