
A library for differentiable nonlinear optimization.

Project description



Paper | Video | Twitter | Webpage | Tutorials

Theseus is an efficient, application-agnostic library for building custom nonlinear optimization layers in PyTorch, supporting the construction of various problems in robotics and vision as end-to-end differentiable architectures.

Differentiable nonlinear optimization provides a general scheme for encoding inductive priors: the objective function can be parameterized partly by neural models and partly by expert, domain-specific differentiable models. The ability to compute gradients end-to-end is retained by differentiating through the optimizer, which lets neural models train on the final task loss while also taking advantage of the priors captured by the optimizer.
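As an illustration of this idea (a generic PyTorch sketch with made-up names, not the Theseus API), one way to differentiate through an inner optimizer is to unroll its iterations so that the outer parameter receives gradients through the inner solution:

```python
import torch

# Generic sketch (not Theseus code): differentiate through an inner optimizer
# by unrolling gradient-descent steps on an inner objective whose residual
# depends on an outer parameter `theta`.

def inner_solve(theta, y_obs, steps=50, lr=0.1):
    """Minimize ||y_obs - v * theta||^2 over v, keeping the autograd graph."""
    v = torch.zeros(1, requires_grad=True)
    for _ in range(steps):
        loss = ((y_obs - v * theta) ** 2).sum()
        (grad_v,) = torch.autograd.grad(loss, v, create_graph=True)
        v = v - lr * grad_v  # the graph is kept, so gradients can flow to theta
    return v

theta = torch.tensor([2.0], requires_grad=True)  # outer (e.g., neural-net) parameter
y_obs = torch.tensor([4.0])                      # observation
v_star = inner_solve(theta, y_obs)               # inner solution, approx. y_obs / theta
outer_loss = (v_star - 1.0) ** 2                 # task loss on the inner solution
outer_loss.backward()                            # theta.grad is populated through the unroll
```

Unrolling is the simplest backward mode; Theseus also supports cheaper alternatives such as implicit differentiation, used in the example further below.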

See the list of papers published using Theseus for examples across various application domains.


Current Features

Application agnostic interface

Our implementation provides an easy-to-use interface for building custom optimization layers and plugging them into any neural architecture. Several differentiable features are currently available.

Efficiency based design

We support several features that improve computation times and memory consumption.

Getting Started

Prerequisites

  • We strongly recommend you install Theseus in a venv or conda environment with Python 3.8-3.10.
  • Theseus requires a torch installation. To install for your particular CPU/CUDA configuration, follow the instructions on the PyTorch website.
  • For GPU support, Theseus requires nvcc to compile custom CUDA operations. Make sure its version (check with nvcc --version) matches the version used to compile PyTorch. If not, install it and ensure its location is on your system's $PATH variable.
  • Theseus also requires suitesparse, which you can install via:
    • sudo apt-get install libsuitesparse-dev (Ubuntu).
    • conda install -c conda-forge suitesparse (Mac).
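To check the match described above, compare the toolkit release printed by nvcc --version with the CUDA version PyTorch was built against (a quick check, assuming torch is already installed):

```python
import torch

# CUDA version PyTorch was built against (None for CPU-only builds);
# it should match the release shown by `nvcc --version`.
print(torch.version.cuda)
```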

Installing

  • pypi

    pip install theseus-ai
    

    We currently provide wheels with our CUDA extensions compiled using CUDA 11.6, for Python 3.8-3.10. For other CUDA versions, consider installing from source or using our build script.

    Note that pypi installation doesn't include our experimental Theseus Labs. For this, please install from source.

  • From source

    The simplest way to install Theseus from source is by running the following (see further below to also include BaSpaCho):

    git clone https://github.com/facebookresearch/theseus.git && cd theseus
    pip install -e .
    

    If you are interested in contributing to Theseus, instead install:

    pip install -e ".[dev]"
    pre-commit install
    

    and follow the more detailed instructions in CONTRIBUTING.

  • Installing BaSpaCho extensions from source

    By default, installing from source doesn't include our BaSpaCho sparse solver extension. For this, follow these steps:

    1. Compile BaSpaCho from source following the instructions in its repository. We recommend using the flags -DBLA_STATIC=ON -DBUILD_SHARED_LIBS=OFF.

    2. Run

      git clone https://github.com/facebookresearch/theseus.git && cd theseus
      BASPACHO_ROOT_DIR=<path/to/root/baspacho/dir> pip install -e .
      

      where the BaSpaCho root dir must have the binaries in the subdirectory build.

Running unit tests (requires dev installation)

python -m pytest tests

By default, unit tests include tests for our CUDA extensions. If you installed without CUDA support, add the option -m "not cudaext" to skip them. Additionally, the tests for the BaSpaCho sparse solver are automatically skipped when its extlib is not compiled.

Examples

Simple example. This example fits the curve $y = v e^x$ to a dataset of $N$ observations $(x,y) \sim D$. The problem is modeled as an Objective with a single CostFunction that computes the residual $y - v e^x$. The Objective and a GaussNewton optimizer are encapsulated in a TheseusLayer. With Adam and an MSE loss, $x$ is learned by differentiating through the TheseusLayer.

import torch
import theseus as th

x_true, y_true, v_true = read_data() # shapes (1, N), (1, N), (1, 1)
x = th.Variable(torch.randn_like(x_true), name="x")
y = th.Variable(y_true, name="y")
v = th.Vector(1, name="v") # a manifold subclass of Variable for optim_vars

def error_fn(optim_vars, aux_vars): # returns y - v * exp(x)
    x, y = aux_vars
    return y.tensor - optim_vars[0].tensor * torch.exp(x.tensor)

objective = th.Objective()
cost_function = th.AutoDiffCostFunction(
    [v], error_fn, y_true.shape[1], aux_vars=[x, y],
    cost_weight=th.ScaleCostWeight(1.0))
objective.add(cost_function)
layer = th.TheseusLayer(th.GaussNewton(objective, max_iterations=10))

phi = torch.nn.Parameter(x_true + 0.1 * torch.ones_like(x_true))
outer_optimizer = torch.optim.Adam([phi], lr=0.001)
for epoch in range(10):
    solution, info = layer.forward(
        input_tensors={"x": phi.clone(), "v": torch.ones(1, 1)},
        optimizer_kwargs={"backward_mode": "implicit"})
    outer_loss = torch.nn.functional.mse_loss(solution["v"], v_true)
    outer_optimizer.zero_grad()  # clear gradients accumulated in prior epochs
    outer_loss.backward()
    outer_optimizer.step()
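The example leaves read_data() undefined. A minimal stand-in (hypothetical, with made-up defaults) that synthesizes noiseless observations of $y = v e^x$ with the stated shapes could look like:

```python
import torch

def read_data(num_points=100, v_value=0.5, seed=0):
    # Hypothetical stand-in for the undefined read_data() above:
    # sample x, pick a ground-truth v, and generate y = v * exp(x).
    g = torch.Generator().manual_seed(seed)
    x_true = torch.randn(1, num_points, generator=g)  # shape (1, N)
    v_true = torch.full((1, 1), v_value)              # shape (1, 1)
    y_true = v_true * torch.exp(x_true)               # shape (1, N)
    return x_true, y_true, v_true
```

Real data would of course be noisy; this version is only meant to make the snippet above runnable end to end.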

See tutorials, and robotics and vision examples to learn about the API and usage.

Citing Theseus

If you use Theseus in your work, please cite the paper with the BibTeX below.

@article{pineda2022theseus,
  title   = {{Theseus: A Library for Differentiable Nonlinear Optimization}},
  author  = {Luis Pineda and Taosha Fan and Maurizio Monge and Shobha Venkataraman and Paloma Sodhi and Ricky TQ Chen and Joseph Ortiz and Daniel DeTone and Austin Wang and Stuart Anderson and Jing Dong and Brandon Amos and Mustafa Mukadam},
  journal = {Advances in Neural Information Processing Systems},
  year    = {2022}
}

License

Theseus is MIT licensed. See the LICENSE for details.

Additional Information

Theseus is made possible by its contributors. (Contributor image made with contrib.rocks.)



Download files

Download the file for your platform.

Source Distribution

theseus_ai-0.2.3.tar.gz (147.2 kB)

Built Distributions


theseus_ai-0.2.3-cp310-cp310-manylinux_2_17_x86_64.whl (7.3 MB; CPython 3.10, manylinux: glibc 2.17+ x86-64)

theseus_ai-0.2.3-cp39-cp39-manylinux_2_17_x86_64.whl (7.3 MB; CPython 3.9, manylinux: glibc 2.17+ x86-64)

theseus_ai-0.2.3-cp38-cp38-manylinux_2_17_x86_64.whl (16.0 MB; CPython 3.8, manylinux: glibc 2.17+ x86-64)

File details

Details for the file theseus_ai-0.2.3.tar.gz.

File metadata

  • Download URL: theseus_ai-0.2.3.tar.gz
  • Upload date:
  • Size: 147.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.10

File hashes

Hashes for theseus_ai-0.2.3.tar.gz:

  • SHA256: 19f22638dd0cecb763d1d6f948974426b5150733cdcae1195434b2f10c59e2b5
  • MD5: aeaedebafd9e8d16bb060a64e77d94b6
  • BLAKE2b-256: 3682a8176b553c2487c3705aa8aaa374fc5d68bc93814c8e648636aecd0a952c

File details

Details for the file theseus_ai-0.2.3-cp310-cp310-manylinux_2_17_x86_64.whl.

File hashes

Hashes for theseus_ai-0.2.3-cp310-cp310-manylinux_2_17_x86_64.whl:

  • SHA256: 45ed9d078f0cd38e63a7c7b9d288a59ccb300c4c3d4e2fff23e2a2b7ee3ef481
  • MD5: a321df404e39de406eaad80116e8ae5c
  • BLAKE2b-256: 42f8d766fda4a2d84f3dc989d20c303155ccfb795f79b8e4708b26688a0c190d

File details

Details for the file theseus_ai-0.2.3-cp39-cp39-manylinux_2_17_x86_64.whl.

File hashes

Hashes for theseus_ai-0.2.3-cp39-cp39-manylinux_2_17_x86_64.whl:

  • SHA256: 91449d9afffad856dce23c52489ed809f3307a4190c0cb54f6797da4a7514339
  • MD5: 409986b97b71ee7104eb0b5a08cf5c6d
  • BLAKE2b-256: 54905b4705a6eb18c725c142dd6d13dd06addb87bdd1bd665b57d911aedfb80e

File details

Details for the file theseus_ai-0.2.3-cp38-cp38-manylinux_2_17_x86_64.whl.

File hashes

Hashes for theseus_ai-0.2.3-cp38-cp38-manylinux_2_17_x86_64.whl:

  • SHA256: 745a38bbbda1c22faf73e78b7400b1e2fa0397d85dc61c65e1bfe19b1380e8a9
  • MD5: 217665109a589b82f5932f3907491909
  • BLAKE2b-256: bbde7acd1750a0f7ebd2252d210bd4ae9f550af1b14d0e26e15c273046bda5b2
