
Neural Processes in PyTorch

What is this?

This is a PyTorch implementation of several Neural Process (NP) variants: standard NPs [1], attentive NPs [2], and NPs with Bayesian aggregation [3].

Plots taken from [3].

[1] Garnelo et al., "Neural Processes", ICML 2018 Workshop on Theoretical Foundations and Applications of Deep Generative Models

[2] Kim et al., "Attentive Neural Processes", ICLR 2019

[3] Volpp et al., "Bayesian Context Aggregation for Neural Processes", ICLR 2021, cf. https://github.com/boschresearch/bayesian-context-aggregation
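The variants above differ mainly in how the per-context-point encodings are combined into a latent description of the task. The following is a rough, self-contained sketch of two of these schemes, not this repository's API: the function names, shapes, and the Gaussian-observation parameterization of Bayesian aggregation are illustrative assumptions based on [1] and [3].

```python
import numpy as np

def mean_aggregation(r):
    """Standard NP [1] (sketch): average the per-context-point encodings r_i."""
    return r.mean(axis=0)

def bayesian_aggregation(r, sigma2, mu0, var0):
    """Bayesian aggregation [3] (sketch): treat each encoding r_i as a noisy
    Gaussian observation of the latent z with per-point variance sigma2_i,
    and compute the Gaussian posterior over z in closed form."""
    post_var = 1.0 / (1.0 / var0 + np.sum(1.0 / sigma2, axis=0))
    post_mu = mu0 + post_var * np.sum((r - mu0) / sigma2, axis=0)
    return post_mu, post_var

rng = np.random.default_rng(0)
n_ctx, d_z = 5, 3
r = rng.normal(size=(n_ctx, d_z))        # hypothetical per-point encodings
sigma2 = np.ones((n_ctx, d_z))           # hypothetical per-point variances
mu0, var0 = np.zeros(d_z), np.ones(d_z)  # prior over the latent z

z_mean = mean_aggregation(r)
z_mu, z_var = bayesian_aggregation(r, sigma2, mu0, var0)
```

Note that under this parameterization the posterior variance shrinks as more context points are observed, which is the main qualitative difference from plain mean aggregation.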

Getting Started

First, install the metalearning_benchmarks package from its repository.

Then clone this repository and run

pip install . 

from the source directory.

To get familiar with the code, have a look at the example script ./scripts/run_neural_process.py.

Notes

This code is still under development, so not all features are thoroughly tested, and some features may change in the future. It has been tested only with the package versions listed in ./setup.cfg.

License

This code is licensed under the AGPL-3.0 license; see the license text for the terms and conditions of use.