Neural Processes in PyTorch

What is this?

This is a PyTorch implementation of various Neural Process (NP) variants, including Standard NPs [1], Self-attentive NPs [2], and NPs with Bayesian Aggregation [3]. A minimal sketch contrasting the aggregation schemes follows the references below.

[Figure: plots taken from [3]]

[1] Garnelo et al., "Neural Processes", ICML 2018 Workshop on Theoretical Foundations and Applications of Deep Generative Models

[2] Kim et al., "Attentive Neural Processes", ICLR 2019

[3] Volpp et al., "Bayesian Context Aggregation for Neural Processes", ICLR 2021, cf. https://github.com/boschresearch/bayesian-context-aggregation
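
To make the difference between the variants concrete, here is a minimal PyTorch sketch of the two aggregation schemes (a rough illustration, not this repository's API; all names are hypothetical). Standard NPs [1] average the per-context-point encodings, while Bayesian Aggregation [3] treats each encoding as a Gaussian observation of the latent variable and computes its Gaussian posterior in closed form.

```python
import torch

def mean_aggregation(r: torch.Tensor) -> torch.Tensor:
    # Standard NP [1]: aggregate the context set by a simple mean.
    # r: (batch, num_context, d_r) per-context-point encodings
    return r.mean(dim=1)

def bayesian_aggregation(r, sigma_r2, mu_0, sigma_0_2):
    # Bayesian Aggregation [3]: each encoding r_i is treated as a noisy
    # Gaussian observation of the latent z with variance sigma_r2_i;
    # the Gaussian posterior over z then has a closed form.
    # r, sigma_r2: (batch, num_context, d_z); mu_0, sigma_0_2: (d_z,)
    precision = 1.0 / sigma_0_2 + (1.0 / sigma_r2).sum(dim=1)
    sigma_z2 = 1.0 / precision                                   # posterior variance
    mu_z = mu_0 + sigma_z2 * ((r - mu_0) / sigma_r2).sum(dim=1)  # posterior mean
    return mu_z, sigma_z2

# Toy check: 8 tasks, 5 context points, 16-dimensional latent
r = torch.randn(8, 5, 16)
sigma_r2 = torch.rand(8, 5, 16) + 0.1  # strictly positive variances
mu_z, sigma_z2 = bayesian_aggregation(r, sigma_r2, torch.zeros(16), torch.ones(16))
print(mu_z.shape, sigma_z2.shape)  # torch.Size([8, 16]) torch.Size([8, 16])
```

Note how the posterior variance shrinks as context points are added, so the aggregated representation also encodes how informative the context set is.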

Getting Started

First install the metalearning_benchmarks package from https://github.com/michaelvolpp/metalearning_benchmarks.

Then clone this repository and run

pip install . 

from the source directory.
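
Put together, a typical installation looks like this (assuming the default GitHub URL of this repository):

```
git clone https://github.com/michaelvolpp/neural_process.git
cd neural_process
pip install .
```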

To get familiar with the code, have a look at the example script ./scripts/run_neural_process.py.
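
For instance, from the repository root:

```
python scripts/run_neural_process.py
```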

Notes

This code is still under development, so not all features are thoroughly tested and some may change in the future. It has been tested only with the package versions listed in ./setup.cfg.

License

This code is licensed under the AGPL-3.0 license and is free to use under the terms of that license.
