
PyTorch Multimodal Library

Project description

TorchMultimodal (Beta Release)

Introduction

TorchMultimodal is a PyTorch library for training state-of-the-art multimodal multi-task models at scale. It provides:

  • A repository of modular and composable building blocks (models, fusion layers, loss functions, datasets and utilities).
  • A repository of examples that show how to combine these building blocks with components and common infrastructure from across the PyTorch Ecosystem to replicate state-of-the-art models published in the literature. These examples should serve as baselines for ongoing research in the field, as well as a starting point for future work.

As a first open source example, researchers will be able to train and extend FLAVA using TorchMultimodal.

Installation

TorchMultimodal requires Python >= 3.7. The library can be installed with or without CUDA support. The following assumes conda is installed.

Prerequisites

  1. Install conda environment

    conda create -n torch-multimodal python=<python_version>
    conda activate torch-multimodal
    
  2. Install pytorch, torchvision, and torchtext. See PyTorch documentation. A quick sanity check for the finished environment follows this list.

    # Use the current CUDA version as seen [here](https://pytorch.org/get-started/locally/)
    # Select the nightly PyTorch build, Linux as the OS, and conda. Pick the most recent CUDA version.
    conda install pytorch torchvision torchtext pytorch-cuda=<cuda_version> -c pytorch-nightly -c nvidia
    
    # For CPU-only install
    conda install pytorch torchvision torchtext cpuonly -c pytorch-nightly
    
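To verify the environment, a sanity check such as the one below (a minimal snippet, not part of the official instructions) confirms that the core packages import and reports whether CUDA is visible:

    # Run inside the activated torch-multimodal environment.
    import torch
    import torchvision
    import torchtext

    print("torch:", torch.__version__)
    print("torchvision:", torchvision.__version__)
    print("torchtext:", torchtext.__version__)
    print("CUDA available:", torch.cuda.is_available())  # False for the CPU-only install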

Install from binaries

Nightly binaries for Python 3.7, 3.8, and 3.9 on Linux can be installed via pip wheels. For now we only support the Linux platform through PyPI.

python -m pip install torchmultimodal-nightly
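
To confirm the wheel installed correctly, a minimal import check (an illustrative snippet, not taken from the project documentation) is:

    # Raises ImportError if the install failed; otherwise prints where the package lives.
    import torchmultimodal
    print(torchmultimodal.__file__)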

Building from Source

Alternatively, you can build from source and run our examples:

git clone --recursive https://github.com/facebookresearch/multimodal.git multimodal
cd multimodal

pip install -e .

For developers, please follow the development installation instructions.

Documentation

The library builds on the following concepts:

  • Architectures: These are general and composable classes that capture the core logic associated with a family of models. In most cases these take modules as inputs instead of flat arguments (see Models below). Examples include LateFusion, FLAVA, and CLIP. Users should either reuse an existing architecture or contribute a new one. We avoid inheritance as much as possible.

  • Models: These are specific instantiations of a given architecture implemented using builder functions. The builder functions take as input all of the parameters for constructing the modules needed to instantiate the architecture. See cnn_lstm.py for an example; a toy sketch of this pattern also appears after this list.

  • Modules: These are self-contained components that can be stitched up in various ways to build an architecture. See lstm_encoder.py as an example.
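
The following is an illustrative, self-contained sketch of this architecture/builder split using only torch.nn; the class SimpleLateFusion and the builder simple_late_fusion_classifier are hypothetical stand-ins, not TorchMultimodal APIs:

    # Hypothetical sketch of the architecture/builder pattern (not TorchMultimodal code).
    import torch
    from torch import nn

    class SimpleLateFusion(nn.Module):
        """Architecture: composed from modules passed in as arguments."""

        def __init__(self, image_encoder: nn.Module, text_encoder: nn.Module,
                     fusion: nn.Module, head: nn.Module):
            super().__init__()
            self.image_encoder = image_encoder
            self.text_encoder = text_encoder
            self.fusion = fusion
            self.head = head

        def forward(self, image: torch.Tensor, text: torch.Tensor) -> torch.Tensor:
            # Encode each modality, concatenate, fuse, then classify.
            encoded = torch.cat([self.image_encoder(image), self.text_encoder(text)], dim=-1)
            return self.head(self.fusion(encoded))

    def simple_late_fusion_classifier(image_dim: int, text_dim: int,
                                      hidden_dim: int, num_classes: int) -> SimpleLateFusion:
        """Model builder: takes flat hyperparameters and constructs the modules."""
        return SimpleLateFusion(
            image_encoder=nn.Linear(image_dim, hidden_dim),
            text_encoder=nn.Linear(text_dim, hidden_dim),
            fusion=nn.Linear(2 * hidden_dim, hidden_dim),
            head=nn.Linear(hidden_dim, num_classes),
        )

    # Usage: build from flat arguments, then run a forward pass on dummy inputs.
    model = simple_late_fusion_classifier(image_dim=512, text_dim=300, hidden_dim=256, num_classes=10)
    logits = model(torch.randn(4, 512), torch.randn(4, 300))  # shape: (4, 10)

Here the modules are plain nn.Linear layers; in the library they would be richer components such as the encoder in lstm_encoder.py.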

Contributing

See the CONTRIBUTING file for how to help out.

License

TorchMultimodal is BSD licensed, as found in the LICENSE file.

Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distributions

  • torchmultimodal_nightly-2022.11.11-py39-none-any.whl (126.6 kB, uploaded for Python 3.9)
  • torchmultimodal_nightly-2022.11.11-py38-none-any.whl (126.6 kB, uploaded for Python 3.8)
  • torchmultimodal_nightly-2022.11.11-py37-none-any.whl (126.6 kB, uploaded for Python 3.7)

File details

Hashes for torchmultimodal_nightly-2022.11.11-py39-none-any.whl:

  SHA256      9230310d9a2c7a95875acc78d596f48bc423c109afb2555f81cc9f200e78c3b9
  MD5         586626683b199b2fd38c8c8b71bbed84
  BLAKE2b-256 ed8eeb9a2dac2c1438f6fe8a9e04d25d8fcdb31c206d9f18922d14a6f6c82eba

Hashes for torchmultimodal_nightly-2022.11.11-py38-none-any.whl:

  SHA256      dea0426b08c8605a6cd1cb224fd9a91d6bb62397bfc2542ca07c7631d08b2f90
  MD5         b9cff6d498b29a5663ffba1f3cca7e7d
  BLAKE2b-256 071bc05980095bdcb73266b0bb6d5986bde568b8646faf11b385aacb567d842b

Hashes for torchmultimodal_nightly-2022.11.11-py37-none-any.whl:

  SHA256      9eea07ee600e8cf5afcb5af0e9b96d88a318af24965328feed4d2a89392eeaea
  MD5         2a4e7d9d10bc9785f81c2b2584706412
  BLAKE2b-256 43ca1a9249e6b86f9a56cce34264c32f10fafbf7d3479ac43e306e0c548c2bd6
