
PyTorch Lightning Bolts is a community contribution for ML researchers.

Project description


PyTorch Lightning Bolts

Pretrained SOTA Deep Learning models, callbacks and more for research and production with PyTorch Lightning and PyTorch


Continuous Integration

| System / PyTorch ver. | 1.4 (min. req.) | 1.5 (latest) |
| --- | --- | --- |
| Linux py3.6 / py3.7 / py3.8 | CI testing | CI testing |
| OSX py3.6 / py3.7 / py3.8 | CI testing | CI testing |
| Windows py3.6 / py3.7 / py3.8 | wip | wip |

Install

pip install pytorch-lightning-bolts

Docs

What is Bolts

Bolts is a deep learning research and production toolbox of:

  • SOTA pretrained models.
  • Model components.
  • Callbacks.
  • Losses.
  • Datasets.

Main Goals of Bolts

The main goal of Bolts is to enable rapid model idea iteration.

Example 1: Finetuning on data

from torch.utils.data import DataLoader

from pl_bolts.models.self_supervised import SimCLR
from pl_bolts.models.self_supervised.simclr.transforms import SimCLRTrainDataTransform, SimCLREvalDataTransform
import pytorch_lightning as pl

# data (MyDataset is your own torch Dataset)
train_data = DataLoader(MyDataset(transforms=SimCLRTrainDataTransform(input_height=32)))
val_data = DataLoader(MyDataset(transforms=SimCLREvalDataTransform(input_height=32)))

# model
model = SimCLR(pretrained='imagenet2012')

# train!
trainer = pl.Trainer(gpus=8)
trainer.fit(model, train_data, val_data)
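Under the hood, the SimCLR transforms produce two independently augmented views of each sample, which the model contrasts against each other. A minimal pure-Python sketch of that idea (`TwoViewTransform` and the toy augmentation are illustrative, not Bolts APIs):

```python
class TwoViewTransform:
    """Illustrative stand-in for SimCLRTrainDataTransform: run the
    same stochastic augmentation pipeline twice to get two views."""

    def __init__(self, augment):
        self.augment = augment

    def __call__(self, sample):
        # two independent passes through the augmentation pipeline
        return self.augment(sample), self.augment(sample)

# toy "augmentation" on strings, just to show the call pattern
views = TwoViewTransform(str.upper)("img")
print(views)  # ('IMG', 'IMG')
```

In the real transform each call applies random crops, flips, and color jitter, so the two views differ; the contrastive loss then pulls their representations together.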

Example 2: Subclass and ideate

from pl_bolts.models import ImageGPT
from pl_bolts.models.self_supervised import SimCLR

class VideoGPT(ImageGPT):

    def training_step(self, batch, batch_idx):
        x, y = batch
        x = _shape_input(x)  # your helper: fold video frames into the batch dim

        logits = self.gpt(x)
        simclr_features = self.simclr(x)

        # -----------------
        # do something new with GPT logits + simclr_features
        # -----------------

        loss = self.criterion(logits.view(-1, logits.size(-1)), x.view(-1).long())

        logs = {"loss": loss}
        return {"loss": loss, "log": logs}

Who is Bolts for?

  • Corporate production teams
  • Professional researchers
  • Ph.D. students
  • Linear + Logistic regression heroes

I don't need deep learning

Great! We have LinearRegression and LogisticRegression implementations, with numpy and sklearn bridges for datasets. Unlike the sklearn versions, our implementations work on multiple GPUs and TPUs and scale to much larger datasets.

Check out our Linear Regression on TPU demo

import pytorch_lightning as pl
from sklearn.datasets import load_boston

from pl_bolts.models.regression import LinearRegression
from pl_bolts.datamodules import SklearnDataModule

# sklearn dataset
X, y = load_boston(return_X_y=True)
loaders = SklearnDataModule(X, y)

model = LinearRegression(input_dim=13)
trainer = pl.Trainer(num_tpu_cores=1)
trainer.fit(model, loaders.train_dataloader(), loaders.val_dataloader())
trainer.test(test_dataloaders=loaders.test_dataloader())
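SklearnDataModule's job is to wrap plain numpy arrays into train/val/test dataloaders. A dependency-free sketch of the splitting it automates (the fractions and end-of-array slicing strategy here are assumptions for illustration, not the module's exact defaults):

```python
def train_val_test_split(X, y, val_frac=0.1, test_frac=0.1):
    """Hypothetical sketch: carve val and test slices off the end
    of the arrays, leaving the rest for training."""
    n = len(X)
    n_test = int(n * test_frac)
    n_val = int(n * val_frac)
    n_train = n - n_val - n_test
    return (
        (X[:n_train], y[:n_train]),                                 # train
        (X[n_train:n_train + n_val], y[n_train:n_train + n_val]),   # val
        (X[n_train + n_val:], y[n_train + n_val:]),                 # test
    )

X, y = list(range(10)), list(range(10))
train, val, test = train_val_test_split(X, y)
print(len(train[0]), len(val[0]), len(test[0]))  # 8 1 1
```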

Is this another model zoo?

No!

Bolts is unique because models are implemented using PyTorch Lightning and structured so that they can be easily subclassed and iterated on.

For example, you can override the elbo loss of a VAE, or the generator_step of a GAN, to quickly try out a new idea. The best part is that all the models are benchmarked, so you won't waste time trying to "reproduce" results or hunting for bugs in your implementation.
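The override pattern itself is plain Python subclassing: because the training loop delegates to small step methods, changing one method is enough to test an idea. A dependency-free sketch (class and method names echo the Bolts style, but the bodies are illustrative):

```python
class GAN:
    """Stand-in for a Bolts GAN: training_step delegates to small,
    overridable step methods."""

    def generator_step(self, batch):
        # default "loss": mean of the batch (toy placeholder)
        return sum(batch) / len(batch)

    def training_step(self, batch, batch_idx):
        return self.generator_step(batch)

class MyGAN(GAN):
    def generator_step(self, batch):
        # try a new idea by changing one method only
        return 2 * super().generator_step(batch)

print(MyGAN().training_step([1.0, 2.0, 3.0], 0))  # 4.0
```

The rest of the training machinery (optimizers, logging, distributed execution) is inherited unchanged, which is what makes the iteration fast.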

Team

Bolts is supported by the PyTorch Lightning team and the PyTorch Lightning community!

Project details


Download files

Download the file for your platform.

Source Distribution

pytorch-lightning-bolts-0.1.1.tar.gz (100.9 kB)

Built Distribution


pytorch_lightning_bolts-0.1.1-py3-none-any.whl (174.1 kB)

File details

Details for the file pytorch-lightning-bolts-0.1.1.tar.gz.

File metadata

  • Download URL: pytorch-lightning-bolts-0.1.1.tar.gz
  • Size: 100.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/49.3.1 requests-toolbelt/0.9.1 tqdm/4.48.2 CPython/3.8.5

File hashes

Hashes for pytorch-lightning-bolts-0.1.1.tar.gz
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 8454fa2ec710c9edb5d27a2ff7303d8088bc393ba5967e73a29c100fa05579f5 |
| MD5 | adf971e4e987c965e25df2365dedfc21 |
| BLAKE2b-256 | 030e845c84754b1d177caaaf12632ced764c0e07297976de7f392170a0d168b1 |


File details

Details for the file pytorch_lightning_bolts-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: pytorch_lightning_bolts-0.1.1-py3-none-any.whl
  • Size: 174.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/49.3.1 requests-toolbelt/0.9.1 tqdm/4.48.2 CPython/3.8.5

File hashes

Hashes for pytorch_lightning_bolts-0.1.1-py3-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | dfa672ae71ce67862de86620d4756fe23a905c4536162000e782b2f1822f1192 |
| MD5 | 2396188b24720992771d557b4d4a9978 |
| BLAKE2b-256 | 98275376b533536bf095f0e2d10c7fc5629b1ee324d8e0e41edc942b3326b6bd |

