Neural-Processes

This repository contains PyTorch implementations of the following Neural Process variants:

  • Recurrent Attentive Neural Process (ANPRNN)
  • Neural Processes (NPs)
  • Attentive Neural Processes (ANPs)

The model architectures follow those proposed in the original papers.

This is the authors' implementation of a NeurIPS 2019 workshop paper.

Installation & Requirements

Python 3 is recommended, preferably inside a virtual environment. The following packages are required:

  • PyTorch
  • NumPy
  • Matplotlib

Descriptions

  • The Neural Process models are under the /neural_process_models folder.
  • /neural_process_models/modules contains the building blocks for NP networks, such as linear MLPs and attention modules; a rough sketch of how these pieces fit together follows this list.
  • The data-generation functions are under /misc.
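
As a rough illustration of how such building blocks combine (a minimal sketch with hypothetical names, not the classes actually defined under /neural_process_models), a deterministic Neural Process can be written as an encoder MLP that embeds each (x, y) context pair, a mean aggregation over the context embeddings, and a decoder MLP that maps the aggregated representation plus a target x to a predictive mean and scale:

```python
# Minimal sketch of a deterministic Neural Process (hypothetical class;
# not one of the models defined under /neural_process_models).
import torch
import torch.nn as nn
import torch.nn.functional as F


class ToyNP(nn.Module):
    def __init__(self, x_dim=1, y_dim=1, r_dim=64, h_dim=64):
        super().__init__()
        # Encoder: embeds each (x, y) context pair into a representation r_i.
        self.encoder = nn.Sequential(
            nn.Linear(x_dim + y_dim, h_dim), nn.ReLU(),
            nn.Linear(h_dim, r_dim),
        )
        # Decoder: maps (aggregated r, target x) to a predictive mean and scale.
        self.decoder = nn.Sequential(
            nn.Linear(r_dim + x_dim, h_dim), nn.ReLU(),
            nn.Linear(h_dim, 2 * y_dim),
        )

    def forward(self, x_context, y_context, x_target):
        # Encode every context point, mean-aggregate into a single r,
        # and broadcast it across the target points.
        r_i = self.encoder(torch.cat([x_context, y_context], dim=-1))
        r = r_i.mean(dim=1, keepdim=True).expand(-1, x_target.size(1), -1)
        out = self.decoder(torch.cat([r, x_target], dim=-1))
        mean, raw_scale = out.chunk(2, dim=-1)
        return mean, 0.1 + 0.9 * F.softplus(raw_scale)  # positive std


# Shapes are (batch, num_points, dim).
model = ToyNP()
x_ctx, y_ctx = torch.rand(8, 10, 1), torch.rand(8, 10, 1)  # 10 context points
x_tgt = torch.rand(8, 50, 1)                               # 50 target points
mean, std = model(x_ctx, y_ctx, x_tgt)
print(mean.shape, std.shape)                               # both (8, 50, 1)
```

An Attentive NP replaces the mean aggregation with cross-attention from target inputs to the context points; the recurrent variant additionally incorporates a recurrent component.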

Results

1D function regression

Figure: model predictions given 10 context points

Usage

A simple example of training an Attentive Neural Process on 1D function regression:

python3 main_ANP_1d_regression.py
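
Neural Process training generally follows the same recipe: sample a batch of functions, split each into context and target points, and maximize the predictive log-likelihood at the targets. The loop below is a hedged sketch of that recipe using the hypothetical ToyNP module from the Descriptions section and randomly generated sine curves; the actual script and the data generators in /misc may differ in the details.

```python
# Hedged sketch of a training loop on random sine curves (illustrative only;
# see main_ANP_1d_regression.py and /misc for the real script and data).
import torch
from torch.distributions import Normal


def sample_sine_batch(batch=16, num_points=60):
    # Random-amplitude, random-phase sine functions on [-3, 3].
    x = torch.rand(batch, num_points, 1) * 6 - 3
    amp = torch.rand(batch, 1, 1) * 2 + 0.5
    phase = torch.rand(batch, 1, 1) * 3.14
    return x, amp * torch.sin(x + phase)


model = ToyNP()  # the hypothetical module sketched in the Descriptions section
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(1000):
    x, y = sample_sine_batch()
    n_context = torch.randint(3, 20, (1,)).item()
    x_ctx, y_ctx = x[:, :n_context], y[:, :n_context]  # context subset
    mean, std = model(x_ctx, y_ctx, x)                 # predict at all points
    loss = -Normal(mean, std).log_prob(y).mean()       # negative log-likelihood
    opt.zero_grad()
    loss.backward()
    opt.step()
```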

For handwritten digit inpainting trained on MNIST data:

python3 main_ANP_mnist.py
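
For context, image inpainting is typically cast as NP regression by treating pixel coordinates as inputs x and pixel intensities as outputs y, with the observed pixels serving as the context set. The snippet below is a hedged sketch of that conversion (hypothetical helper, not the repository's actual data pipeline):

```python
# Hedged sketch: turning an image into (x, y) pairs for NP-style inpainting
# (hypothetical helper, not the repository's actual data pipeline).
import torch


def image_to_xy(img):
    # img: (H, W) tensor of pixel intensities in [0, 1].
    h, w = img.shape
    ys, xs = torch.meshgrid(torch.arange(h), torch.arange(w), indexing="ij")
    coords = torch.stack([xs, ys], dim=-1).reshape(-1, 2).float()
    coords = coords / torch.tensor([w - 1, h - 1], dtype=torch.float)  # to [0, 1]
    values = img.reshape(-1, 1)
    return coords, values  # x: (H*W, 2), y: (H*W, 1)


img = torch.rand(28, 28)             # stand-in for an MNIST digit
x, y = image_to_xy(img)
keep = torch.rand(x.size(0)) < 0.3   # observe ~30% of pixels as context
x_ctx, y_ctx = x[keep], y[keep]      # the model predicts the remaining pixels
```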

See the models in /neural_process_models for detailed examples of constructing NP models for specific tasks.

Acknowledgements

For any question, please refer to

License

MIT