# hcam-torch

Implementation of the paper *Towards mental time travel: a hierarchical memory for reinforcement learning agents*, using CleanRL.
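For intuition only, below is a minimal PyTorch sketch (not taken from this repository) of the hierarchical chunk attention idea the paper describes: a query first attends over per-chunk summaries of memory to pick the most relevant chunks, then attends in detail within those chunks. All names here (`hierarchical_attention`, `chunk_size`, `top_k`) are illustrative assumptions, not this repo's API.

```python
import torch
import torch.nn.functional as F


def hierarchical_attention(query, memory, chunk_size=8, top_k=2):
    """Toy single-query, single-head sketch of hierarchical chunk attention.

    query:  (d,) tensor.
    memory: (T, d) tensor with T divisible by chunk_size.
    """
    d = query.shape[-1]
    chunks = memory.view(-1, chunk_size, d)        # (num_chunks, chunk_size, d)
    summaries = chunks.mean(dim=1)                 # (num_chunks, d) chunk summaries

    # Coarse pass: score each chunk by its summary and keep the top-k chunks.
    chunk_scores = summaries @ query / d ** 0.5    # (num_chunks,)
    top_idx = chunk_scores.topk(min(top_k, chunk_scores.numel())).indices

    # Fine pass: standard dot-product attention within the selected chunks only.
    selected = chunks[top_idx].reshape(-1, d)      # (top_k * chunk_size, d)
    weights = F.softmax(selected @ query / d ** 0.5, dim=0)
    return weights @ selected                      # (d,) retrieved memory


if __name__ == "__main__":
    mem = torch.randn(32, 16)   # 32 timesteps of 16-dim memories -> 4 chunks of 8
    q = torch.randn(16)
    print(hierarchical_attention(q, mem).shape)    # torch.Size([16])
```

In the paper the fine-grained results are also reweighted by the chunk-level relevance scores; the sketch keeps only the top-k selection for brevity.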

## Contributors

- Euijin Jeong 💻
- Ownfos 💻

This project follows the all-contributors specification.

## Progress

## Get Started

Tested on Python 3.10.

We recommend using Anaconda (or Miniconda) to run the project in a virtual environment:

```bash
conda create -n hcam python=3.10 -y
conda activate hcam
```

Clone the repo:

```bash
git clone https://github.com/jinPrelude/hcam-torch.git
cd hcam-torch
```

Install Poetry and run `poetry install` to install all required dependencies:

```bash
pip install poetry && poetry install
```

Once the installation completes, try `ballet_lstm_lang_only.py`:

```bash
# --track enables logging to wandb.ai
python ballet_lstm_lang_only.py --track
```

## Benchmark

### LSTM Agent

Tested on an i9-11900K + RTX 3090:

Playing BalletEnv (2_delay16_easy)
Trained in ≈15M total frames (≈3 hours).