PyTorch Multimodal Library
Project description
TorchMultimodal (Beta Release)
Introduction
TorchMultimodal is a PyTorch library for training state-of-the-art multimodal multi-task models at scale. It provides:
- A repository of modular and composable building blocks (models, fusion layers, loss functions, datasets and utilities).
- A repository of examples that show how to combine these building blocks with components and common infrastructure from across the PyTorch Ecosystem to replicate state-of-the-art models published in the literature. These examples should serve as baselines for ongoing research in the field, as well as a starting point for future work.
As a first open source example, researchers will be able to train and extend FLAVA using TorchMultimodal.
Installation
TorchMultimodal requires Python >= 3.8. The library can be installed with or without CUDA support. The following assumes conda is installed.
Prerequisites
- Install conda environment:

  ```shell
  conda create -n torch-multimodal python=<python_version>
  conda activate torch-multimodal
  ```
- Install PyTorch, torchvision, and torchtext. See the PyTorch documentation.

  ```shell
  # Use the current CUDA version as seen here: https://pytorch.org/get-started/locally/
  # Select the nightly PyTorch build, Linux as the OS, and conda. Pick the most recent CUDA version.
  conda install pytorch torchvision torchtext pytorch-cuda=<cuda_version> -c pytorch-nightly -c nvidia

  # For a CPU-only install:
  conda install pytorch torchvision torchtext cpuonly -c pytorch-nightly
  ```
Install from binaries
Nightly binaries for Python 3.8 and 3.9 on Linux can be installed via pip wheels. For now, we only support the Linux platform through PyPI.

```shell
python -m pip install torchmultimodal-nightly
```
Building from Source
Alternatively, you can build from our source code and run our examples:

```shell
git clone --recursive https://github.com/facebookresearch/multimodal.git multimodal
cd multimodal
pip install -e .
```
Developers should follow the development installation instructions.
Documentation
The library builds on the following concepts:
- Architectures: These are general, composable classes that capture the core logic associated with a family of models. In most cases these take modules as inputs instead of flat arguments (see Models below). Examples include LateFusion, FLAVA, and CLIP. Users should either reuse an existing architecture or contribute a new one. We avoid inheritance as much as possible.
- Models: These are specific instantiations of a given architecture, implemented using builder functions. The builder functions take as input all of the parameters needed to construct the modules that instantiate the architecture. See cnn_lstm.py for an example.
- Modules: These are self-contained components that can be stitched together in various ways to build an architecture. See lstm_encoder.py for an example.
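To make the architecture/model/module distinction concrete, here is a minimal, framework-free sketch. The class and function names below are hypothetical illustrations of the concepts, not TorchMultimodal's actual APIs (see the library's late_fusion and cnn_lstm code for the real implementations):

```python
from typing import Any, Callable, Dict, List

class LateFusionSketch:
    """Architecture: composes injected encoder modules and a fusion module.

    It takes modules (callables) as inputs rather than flat hyperparameters,
    mirroring the "architecture" concept above. Hypothetical sketch only.
    """

    def __init__(self, encoders: Dict[str, Callable], fusion: Callable):
        self.encoders = encoders
        self.fusion = fusion

    def __call__(self, inputs: Dict[str, Any]) -> float:
        # Encode each modality independently, then fuse the embeddings.
        embeddings: List[float] = [
            self.encoders[name](x) for name, x in inputs.items()
        ]
        return self.fusion(embeddings)

def build_toy_late_fusion() -> LateFusionSketch:
    """Model: a builder function that constructs the concrete modules from
    flat parameters and instantiates the architecture (analogous in spirit
    to the cnn_lstm.py builder mentioned above)."""

    # Modules: self-contained components stitched into the architecture.
    def image_encoder(pixels: List[float]) -> float:
        return sum(pixels) / len(pixels)  # toy "embedding": mean pixel value

    def text_encoder(tokens: List[str]) -> float:
        return float(len(tokens))  # toy "embedding": token count

    def fusion(embeddings: List[float]) -> float:
        return sum(embeddings)  # trivial late fusion: sum the embeddings

    return LateFusionSketch(
        encoders={"image": image_encoder, "text": text_encoder},
        fusion=fusion,
    )

model = build_toy_late_fusion()
out = model({"image": [0.0, 1.0], "text": ["a", "b", "c"]})
print(out)  # 0.5 (mean pixel) + 3.0 (token count) = 3.5
```

The key design point carried over from the library: the architecture never constructs its own submodules, so the same `LateFusionSketch` class can be reused with entirely different encoders simply by writing a new builder.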
Contributing
See the CONTRIBUTING file for how to help out.
License
TorchMultimodal is BSD licensed, as found in the LICENSE file.
File details
Details for the file torchmultimodal_nightly-2023.3.22-py39-none-any.whl.
File metadata
- Download URL: torchmultimodal_nightly-2023.3.22-py39-none-any.whl
- Upload date:
- Size: 128.6 kB
- Tags: Python 3.9
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.9.16
File hashes
Algorithm | Hash digest
---|---
SHA256 | 2862205eacb51af0c033290f6c76a46c76d23bdee3cfcf920b970d6d5909fb70
MD5 | 809a80aa986fa0f6ecc08e17331fece7
BLAKE2b-256 | 02c15f8dea640892632a2e7c700c86733f42b1185e2e623f70616f542856dd86
File details
Details for the file torchmultimodal_nightly-2023.3.22-py38-none-any.whl.
File metadata
- Download URL: torchmultimodal_nightly-2023.3.22-py38-none-any.whl
- Upload date:
- Size: 128.6 kB
- Tags: Python 3.8
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.8.16
File hashes
Algorithm | Hash digest
---|---
SHA256 | 5d2a3eca4c4042f47f89da9652af5950bb5f0ddea7d4c6d6f330e8628b766448
MD5 | 3f528eb99d5b849c8f35884134e4f486
BLAKE2b-256 | 1c9a60eede36f276e31468198c2a041c17e77ebba9181a51c6a03985585771d6