
Probabilistic programming using PyTorch.

Project description

Borch


Getting Started | Documentation | Contributing

Borch is a universal probabilistic programming language (PPL) framework developed by Desupervised that builds on and integrates with PyTorch. It was designed with special attention to supporting Bayesian neural networks natively. Further, it is designed to

  • be a flexible and scalable framework,
  • support neural networks out of the box, and
  • have the bells and whistles a universal PPL needs.

It can be installed with

pip install borch

Usage

See our full tutorials here.

As a quick example, let's look at the neural network interface. The module borch.nn provides implementations of neural network modules used for deep probabilistic programming, with an interface almost identical to that of torch.nn. In many cases it is possible to simply switch

import torch.nn as nn

to

import borch.nn as nn

and a network defined in torch becomes probabilistic without any other changes to the model specification; the only additional change is to switch the loss function to infer.vi.vi_loss (a sketch of a training step using it follows the example below).

For example, a convolutional neural network can be written as

import torch
import torch.nn.functional as F
from borch import nn

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        # Two convolutional layers: 1 input channel -> 6 -> 16 feature maps,
        # both with 5x5 kernels.
        self.conv1 = nn.Conv2d(1, 6, 5)
        self.conv2 = nn.Conv2d(6, 16, 5)
        # Three fully connected layers mapping the flattened features to 10 classes.
        self.fc1 = nn.Linear(16 * 5 * 5, 120)
        self.fc2 = nn.Linear(120, 84)
        self.fc3 = nn.Linear(84, 10)

    def forward(self, x):
        # Convolution -> ReLU -> 2x2 max pooling, twice.
        x = F.max_pool2d(F.relu(self.conv1(x)), (2, 2))
        x = F.max_pool2d(F.relu(self.conv2(x)), 2)
        # Flatten everything except the batch dimension before the linear layers.
        x = x.view(-1, self.num_flat_features(x))
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        x = self.fc3(x)
        return x

    def num_flat_features(self, x):
        # Number of features per sample once the tensor is flattened.
        size = x.size()[1:]  # all dimensions except the batch dimension
        num_features = 1
        for s in size:
            num_features *= s
        return num_features
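
A training step could then look roughly like the sketch below. Only the swap of the loss function to infer.vi.vi_loss comes from this page; the exact arguments passed to it, the dummy data, and the optimizer choice are illustrative assumptions, so consult the Borch documentation for the actual interface.

import torch
from borch import infer

# Minimal, illustrative training step for the Net defined above.
net = Net()
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)

# Dummy batch: 32x32 single-channel images match the 16 * 5 * 5 flattening above.
images = torch.randn(4, 1, 32, 32)
labels = torch.randint(0, 10, (4,))

optimizer.zero_grad()
logits = net(images)
# Assumed call signature -- this page only states that the loss function
# must be changed to infer.vi.vi_loss.
loss = infer.vi.vi_loss(logits, labels)
loss.backward()
optimizer.step()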

Installation

Borch can be installed using

pip install borch

Docker

The Borch Docker images are available in both CPU and GPU versions at gitlab.com/desupervised/borch/container_registry. The latest CPU image can be run with

docker run registry.gitlab.com/desupervised/borch/cpu:master
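
Since GPU images are published alongside the CPU ones, a GPU container can presumably be started the same way; note that the gpu image path below is an assumption that simply mirrors the CPU naming, and --gpus all is the standard Docker flag for exposing NVIDIA GPUs.

docker run --gpus all registry.gitlab.com/desupervised/borch/gpu:master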

Contributing

Please read the contribution guidelines in CONTRIBUTING.md.

Citation

If you use this software for your research or business, please cite us and help the package grow!

@misc{borch,
  author       = {Belcher, Lewis and Gudmundsson, Johan and Green, Michael},
  title        = {Borch},
  howpublished = {https://gitlab.com/desupervised/borch},
  month        = {Apr},
  year         = {2021},
  note         = {v0.1.0}
}

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

borch-0.1.5.tar.gz (66.2 kB)

Uploaded Source

Built Distribution

borch-0.1.5-py3-none-any.whl (82.5 kB)

Uploaded Python 3

File details

Details for the file borch-0.1.5.tar.gz.

File metadata

  • Download URL: borch-0.1.5.tar.gz
  • Upload date:
  • Size: 66.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.7.14

File hashes

Hashes for borch-0.1.5.tar.gz

  • SHA256: c7b28e3c39c104d46c0374ebb0fe1dd8fc2e1cd5363ed45363eb9a3901fefff8
  • MD5: 6961b11ebb28ec2ddf73e492ea9c603a
  • BLAKE2b-256: 9dd3b7598719cef78f06d2d73b81a2314e476ff81c7745875499d4cdd1b2645c

See more details on using hashes here.
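
To check a downloaded archive against the SHA256 digest above, one option is to fetch the sdist with pip and hash it locally; the commands below are a generic sketch using standard tooling rather than anything Borch-specific.

pip download borch==0.1.5 --no-deps --no-binary :all:
sha256sum borch-0.1.5.tar.gz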

File details

Details for the file borch-0.1.5-py3-none-any.whl.

File metadata

  • Download URL: borch-0.1.5-py3-none-any.whl
  • Upload date:
  • Size: 82.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.7.14

File hashes

Hashes for borch-0.1.5-py3-none-any.whl

  • SHA256: 472b7291973e8ae4a0d46d3c04c0878182799dfdfedd1c5da037e1297d84479d
  • MD5: 8e5fe839e66cb84a4c5df5743b25dd93
  • BLAKE2b-256: 376ee2a5547f686d2cacf199022082a8fea7b5f3aa05c8c5ccd540be75c2cea4

See more details on using hashes here.
