Neural Processes
A framework for composing Neural Processes in Python.
Installation
pip install neuralprocesses tensorflow tensorflow-probability # For use with TensorFlow
pip install neuralprocesses torch # For use with PyTorch
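The module you import depends on the backend you installed. As a minimal sketch (the TensorFlow import path is an assumption that mirrors the PyTorch one used in the example below; only the PyTorch path appears in this README):

import neuralprocesses.torch as nps         # with the PyTorch backend
# import neuralprocesses.tensorflow as nps  # with the TensorFlow backend (assumed mirror of the above)

# Models are then constructed through the `nps` interface, e.g. the ConvGNP
# constructor used in the example further down:
convcnp = nps.construct_convgnp(dim_x=1, dim_y=2, likelihood="het")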
If something is not working or unclear, please feel free to open an issue.
Documentation
See here.
TL;DR! Just Get me Started!
Here you go:
import torch

import neuralprocesses.torch as nps

# Construct a ConvCNP.
convcnp = nps.construct_convgnp(dim_x=1, dim_y=2, likelihood="het")

# Construct optimiser.
opt = torch.optim.Adam(convcnp.parameters(), 1e-3)

# Training: optimise the model for 32 batches.
for _ in range(32):
    # Sample a batch of new context and target sets. Replace this with your data.
    # The shapes are `(batch_size, dimensionality, num_data)`.
    xc = torch.randn(16, 1, 10)  # Context inputs
    yc = torch.randn(16, 2, 10)  # Context outputs
    xt = torch.randn(16, 1, 15)  # Target inputs
    yt = torch.randn(16, 2, 15)  # Target outputs

    # Compute the loss and update the model parameters.
    loss = -torch.mean(nps.loglik(convcnp, xc, yc, xt, yt, normalise=True))
    opt.zero_grad(set_to_none=True)
    loss.backward()
    opt.step()

# Testing: make some predictions.
mean, var, noiseless_samples, noisy_samples = nps.predict(
    convcnp,
    torch.randn(16, 1, 10),  # Context inputs
    torch.randn(16, 2, 10),  # Context outputs
    torch.randn(16, 1, 15),  # Target inputs
)
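The `torch.randn` calls above are placeholders for real data. As a hedged sketch (the `to_model_format` helper below is made up for illustration and is not part of the library), converting a batch of tasks stored as NumPy arrays into the `(batch_size, dimensionality, num_data)` layout that the model expects could look like this:

import numpy as np
import torch

def to_model_format(xs, ys):
    # `xs`: list of arrays of shape (num_data,), one task per entry.
    # `ys`: list of arrays of shape (num_data, 2), two output channels as above.
    # Stack into `(batch_size, dimensionality, num_data)` tensors.
    x = torch.tensor(np.stack(xs), dtype=torch.float32).unsqueeze(1)      # (B, 1, N)
    y = torch.tensor(np.stack(ys), dtype=torch.float32).transpose(1, 2)   # (B, 2, N)
    return x, y

# Example: a batch of 16 tasks with 10 context points each.
xc, yc = to_model_format(
    [np.linspace(-2, 2, 10) for _ in range(16)],
    [np.random.randn(10, 2) for _ in range(16)],
)
print(xc.shape, yc.shape)  # torch.Size([16, 1, 10]) torch.Size([16, 2, 10])

The `mean` and `var` returned by `nps.predict` should then follow the same layout over the target inputs, with one channel per output dimension.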