PyTorch Multimodal Library

Project description

TorchMultimodal (Beta Release)

Introduction

TorchMultimodal is a PyTorch library for training state-of-the-art multimodal multi-task models at scale. It provides:

  • A repository of modular and composable building blocks (models, fusion layers, loss functions, datasets and utilities).
  • A repository of examples that show how to combine these building blocks with components and common infrastructure from across the PyTorch Ecosystem to replicate state-of-the-art models published in the literature. These examples should serve as baselines for ongoing research in the field, as well as a starting point for future work.

As a first open source example, researchers will be able to train and extend FLAVA using TorchMultimodal.

Installation

TorchMultimodal requires Python >= 3.7. The library can be installed with or without CUDA support. The following assumes conda is installed.

Prerequisites

  1. Install conda environment

    conda create -n torch-multimodal python=<python_version>
    conda activate torch-multimodal
    
  2. Install pytorch, torchvision, and torchtext. See PyTorch documentation.

    # Use the current CUDA version as listed at https://pytorch.org/get-started/locally/
    # Select the nightly PyTorch build, Linux as the OS, and conda. Pick the most recent CUDA version.
    conda install pytorch torchvision torchtext pytorch-cuda=<cuda_version> -c pytorch-nightly -c nvidia
    
    # For CPU-only install
    conda install pytorch torchvision torchtext cpuonly -c pytorch-nightly
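
Once the environment is set up, a quick sanity check (plain PyTorch, no TorchMultimodal-specific APIs) confirms the install and shows whether a CUDA device is visible:

```python
# Sanity check for the freshly created environment: prints the PyTorch
# version and whether CUDA is available (False on CPU-only installs).
import torch

print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
```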
    

Install from binaries

Nightly binaries for Python 3.7, 3.8, and 3.9 on Linux can be installed via pip wheels. For now we only support the Linux platform through PyPI.

python -m pip install torchmultimodal-nightly

Building from Source

Alternatively, you can build from our source code and run our examples:

git clone --recursive https://github.com/facebookresearch/multimodal.git multimodal
cd multimodal

pip install -e .

For developers, please follow the development installation instructions.

Documentation

The library builds on the following concepts:

  • Architectures: These are general and composable classes that capture the core logic associated with a family of models. In most cases these take modules as inputs instead of flat arguments (see Models below). Examples include LateFusion, FLAVA, and CLIP. Users should either reuse an existing architecture or contribute a new one. We avoid inheritance as much as possible.

  • Models: These are specific instantiations of a given architecture implemented using builder functions. The builder functions take as input all of the parameters for constructing the modules needed to instantiate the architecture. See cnn_lstm.py for an example.

  • Modules: These are self-contained components that can be stitched together in various ways to build an architecture. See lstm_encoder.py for an example.
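
The split above can be sketched in a few lines of plain PyTorch. The names below (`LateFusionSketch`, `build_toy_late_fusion`) are hypothetical stand-ins, not the library's actual API; they only illustrate the pattern of an architecture that takes modules as inputs and a builder function that turns flat arguments into a fully wired model:

```python
# Illustrative sketch of the architecture/model/module split. These are
# hypothetical names mirroring the composition pattern, not library classes.
import torch
from torch import nn


class LateFusionSketch(nn.Module):
    """An 'architecture': accepts modules as inputs, no flat hyperparameters."""

    def __init__(self, image_encoder: nn.Module, text_encoder: nn.Module,
                 fusion: nn.Module, head: nn.Module):
        super().__init__()
        self.image_encoder = image_encoder
        self.text_encoder = text_encoder
        self.fusion = fusion
        self.head = head

    def forward(self, image: torch.Tensor, text: torch.Tensor) -> torch.Tensor:
        # Encode each modality separately, fuse the encodings, then classify.
        encoded = torch.cat(
            [self.image_encoder(image), self.text_encoder(text)], dim=-1
        )
        return self.head(self.fusion(encoded))


def build_toy_late_fusion(image_dim: int, text_dim: int,
                          hidden_dim: int, num_classes: int) -> LateFusionSketch:
    """A 'model' builder: flat arguments in, wired-up architecture out."""
    return LateFusionSketch(
        image_encoder=nn.Linear(image_dim, hidden_dim),
        text_encoder=nn.Linear(text_dim, hidden_dim),
        fusion=nn.Sequential(nn.Linear(2 * hidden_dim, hidden_dim), nn.ReLU()),
        head=nn.Linear(hidden_dim, num_classes),
    )


model = build_toy_late_fusion(image_dim=16, text_dim=8, hidden_dim=32, num_classes=2)
logits = model(torch.randn(4, 16), torch.randn(4, 8))
print(logits.shape)  # torch.Size([4, 2])
```

Because the architecture only sees nn.Module inputs, swapping the toy Linear encoders for real image or text encoders requires no change to the architecture class itself, which is why composition is preferred over inheritance here.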

Contributing

See the CONTRIBUTING file for how to help out.

License

TorchMultimodal is BSD licensed, as found in the LICENSE file.


Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distributions

torchmultimodal_nightly-2022.12.17-py39-none-any.whl (126.8 kB)

Uploaded Python 3.9

torchmultimodal_nightly-2022.12.17-py38-none-any.whl (126.8 kB)

Uploaded Python 3.8

torchmultimodal_nightly-2022.12.17-py37-none-any.whl (126.8 kB)

Uploaded Python 3.7

File details

Details for the file torchmultimodal_nightly-2022.12.17-py39-none-any.whl.

File metadata

File hashes

Hashes for torchmultimodal_nightly-2022.12.17-py39-none-any.whl
Algorithm Hash digest
SHA256 88d985eb492b92d13c5c1f8311e02d9ecc709f9178006f93dcd02cabb05fa7d8
MD5 4c7a6a3899c07117dd7e75d7113d66e6
BLAKE2b-256 b4d2281db8e4124f767d0949b63b13a1b986855ddcefedc4cf116e36f31c0bda

See more details on using hashes here.
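
The published digests can be checked locally before installing. A minimal standard-library sketch (`sha256_of` is a hypothetical helper name, not a packaging tool):

```python
# Compute the SHA256 digest of a downloaded wheel so it can be compared
# against the digest published on this page.
import hashlib


def sha256_of(path: str) -> str:
    """Return the hex SHA256 digest of the file at `path`."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large wheels do not need to fit in memory.
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()


# Example usage (compare the result to the SHA256 value listed above):
# sha256_of("torchmultimodal_nightly-2022.12.17-py39-none-any.whl")
```

For fully automated checking, pip's hash-checking mode (`pip install --require-hashes -r requirements.txt`) performs the same comparison at install time.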

File details

Details for the file torchmultimodal_nightly-2022.12.17-py38-none-any.whl.

File metadata

File hashes

Hashes for torchmultimodal_nightly-2022.12.17-py38-none-any.whl
Algorithm Hash digest
SHA256 29a0928c683da33c3416d36a4067daa4490c5ae78d4d9b578780bb42e5e0c863
MD5 ee620e83d42002a85f61d246e7c354be
BLAKE2b-256 fbd04bbd8c5e81a563ce00396e41cba7f3d9a92396183d615f7753c2c545bb52

See more details on using hashes here.

File details

Details for the file torchmultimodal_nightly-2022.12.17-py37-none-any.whl.

File metadata

File hashes

Hashes for torchmultimodal_nightly-2022.12.17-py37-none-any.whl
Algorithm Hash digest
SHA256 b82a91526b7c79f9507a2acbf0f67785ab6565f9d16170ed24a07d77e890cf1f
MD5 cda036a5baf7de8036c8f2b0287cf39b
BLAKE2b-256 3a4cfd14d95c2598c55220644255dae81aacb6c4198f59fe336f3a3d33959ec2

See more details on using hashes here.
