
WARNING: Deprecated in favor of Lightning Bolts - https://pypi.org/project/lightning-bolts

Pretrained SOTA deep learning models, callbacks, and more for research and production with PyTorch Lightning and PyTorch


Website • Installation • Main goals • latest Docs • stable Docs • Community • Grid AI • Licence



Continuous Integration

CI testing

System / PyTorch ver.   1.6 (min. req.)   1.8 (latest)
Linux py3.{6,8}         full testing      full testing
OSX py3.{6,8}           full testing      full testing
Windows py3.7*          base testing      base testing

  • * Windows tests cover only the package itself; the full test suite (the tests folder) is skipped.

Install


Simple installation from PyPI

pip install lightning-bolts

Install bleeding-edge (no guarantees)

pip install git+https://github.com/PytorchLightning/lightning-bolts.git@master --upgrade

If you want the full experience, you can install all optional packages at once:

pip install lightning-bolts["extra"]

What is Bolts

Bolts is a deep learning research and production toolbox of (a short import sketch follows this list):

  • SOTA pretrained models.
  • Model components.
  • Callbacks.
  • Losses.
  • Datasets.
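
A quick sketch of what a few of these pieces look like in code; the module paths below are from pl_bolts 0.3.x and are meant as illustration:

from pl_bolts.callbacks import PrintTableMetricsCallback  # a callback
from pl_bolts.datamodules import CIFAR10DataModule        # a dataset wrapper
from pl_bolts.models.autoencoders import VAE              # a pretrainable model
import pytorch_lightning as pl

dm = CIFAR10DataModule('.')
model = VAE(input_height=32)
trainer = pl.Trainer(callbacks=[PrintTableMetricsCallback()])
trainer.fit(model, datamodule=dm)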

Main Goals of Bolts

The main goal of Bolts is to enable rapid model idea iteration.

Example 1: Finetuning on data

from pl_bolts.models.self_supervised import SimCLR
from pl_bolts.models.self_supervised.simclr.transforms import SimCLRTrainDataTransform, SimCLREvalDataTransform
from torch.utils.data import DataLoader
import pytorch_lightning as pl

# data (MyDataset stands in for your own torch Dataset)
train_data = DataLoader(MyDataset(transforms=SimCLRTrainDataTransform(input_height=32)))
val_data = DataLoader(MyDataset(transforms=SimCLREvalDataTransform(input_height=32)))

# model: load pretrained SimCLR weights, then freeze the encoder
weight_path = 'https://pl-bolts-weights.s3.us-east-2.amazonaws.com/simclr/bolts_simclr_imagenet/simclr_imagenet.ckpt'
simclr = SimCLR.load_from_checkpoint(weight_path, strict=False)

simclr.freeze()

# finetune (see the sketch below)
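
To make the finetune step concrete, here is a minimal sketch: a linear head trained on the frozen encoder's features. The 10-class task, the 2048-dim feature size (the ResNet-50 backbone behind this checkpoint), and the (image, label) batch format are assumptions for illustration:

import torch
from torch import nn

class FineTuner(pl.LightningModule):
    # minimal sketch: a linear probe on top of the frozen SimCLR encoder

    def __init__(self, encoder, num_classes=10):  # num_classes is illustrative
        super().__init__()
        self.encoder = encoder
        self.head = nn.Linear(2048, num_classes)  # 2048 = ResNet-50 feature dim (assumed)

    def training_step(self, batch, batch_idx):
        x, y = batch  # assumes (image, label) batches
        with torch.no_grad():  # encoder stays frozen
            feats = self.encoder(x)
        loss = nn.functional.cross_entropy(self.head(feats), y)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.head.parameters(), lr=1e-3)

trainer = pl.Trainer(max_epochs=5)
trainer.fit(FineTuner(simclr), train_data, val_data)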

Example 2: Subclass and ideate

from pl_bolts.models import ImageGPT
from pl_bolts.models.self_supervised import SimCLR

class VideoGPT(ImageGPT):

    def training_step(self, batch, batch_idx):
        x, y = batch
        x = _shape_input(x)  # reshape video batches into image batches (sketched below)

        logits = self.gpt(x)              # GPT backbone inherited from ImageGPT
        simclr_features = self.simclr(x)  # frozen SimCLR encoder (sketched below)

        # -----------------
        # do something new with GPT logits + simclr_features
        # -----------------

        loss = self.criterion(logits.view(-1, logits.size(-1)), x.view(-1).long())

        logs = {"loss": loss}
        return {"loss": loss, "log": logs}
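
The subclass above still needs self.simclr and _shape_input; neither comes with ImageGPT. A minimal, hedged way to supply them, reusing the pretrained checkpoint from Example 1 and folding the time dimension of a video batch into the batch dimension (both are illustrative, not part of the Bolts API):

def _shape_input(x):
    # illustrative helper: (batch, time, channels, height, width) -> (batch * time, channels, height, width)
    return x.reshape(-1, *x.shape[2:])

class VideoGPT(ImageGPT):

    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        # assumption: reuse the pretrained SimCLR encoder as a frozen feature extractor
        weight_path = 'https://pl-bolts-weights.s3.us-east-2.amazonaws.com/simclr/bolts_simclr_imagenet/simclr_imagenet.ckpt'
        self.simclr = SimCLR.load_from_checkpoint(weight_path, strict=False)
        self.simclr.freeze()

    # training_step as above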

Who is Bolts for?

  • Corporate production teams
  • Professional researchers
  • Ph.D. students
  • Linear + Logistic regression heroes

I don't need deep learning

Great! We have LinearRegression and LogisticRegression implementations, with NumPy and sklearn bridges for datasets! But our implementations run on multiple GPUs and TPUs, and scale dramatically.

Check out our Linear Regression on TPU demo

from pl_bolts.models.regression import LinearRegression
from pl_bolts.datamodules import SklearnDataModule
from sklearn.datasets import load_boston
import pytorch_lightning as pl

# sklearn dataset
X, y = load_boston(return_X_y=True)
loaders = SklearnDataModule(X, y)

model = LinearRegression(input_dim=13)  # the Boston dataset has 13 features

# try with gpus=4!
# trainer = pl.Trainer(gpus=4)
trainer = pl.Trainer()
trainer.fit(model, train_dataloader=loaders.train_dataloader(), val_dataloaders=loaders.val_dataloader())
trainer.test(test_dataloaders=loaders.test_dataloader())
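
The same SklearnDataModule bridge works for classification. A hedged sketch with LogisticRegression; the iris dataset and hyperparameters are illustrative:

from pl_bolts.models.regression import LogisticRegression
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)
dm = SklearnDataModule(X, y)

clf = LogisticRegression(input_dim=4, num_classes=3)  # 4 iris features, 3 classes

trainer = pl.Trainer(max_epochs=10)
trainer.fit(clf, train_dataloader=dm.train_dataloader(), val_dataloaders=dm.val_dataloader())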

Is this another model zoo?

No!

Bolts is unique because models are implemented using PyTorch Lightning and structured so that they can be easily subclassed and iterated on.

For example, you can override the elbo loss of a VAE, or the generator_step of a GAN, to quickly try out a new idea (a sketch follows). The best part is that all the models are benchmarked, so you won't waste time trying to "reproduce" results or hunting for bugs in your own implementation.
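
As a hedged sketch of the GAN case (constructor arguments omitted; the extra-regularizer idea is purely illustrative):

from pl_bolts.models.gans import GAN

class FancyGAN(GAN):
    # try a new generator objective without touching the data, logging, or the discriminator

    def generator_step(self, x):
        loss = super().generator_step(x)  # stock generator loss
        # ... add your new idea here, e.g. an extra regularizer on generated samples ...
        return loss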

Team

Bolts is supported by the PyTorch Lightning team and the PyTorch Lightning community!


Licence

Please observe the Apache 2.0 license listed in this repository. In addition, the Lightning framework is patent pending.

Citation

To cite Bolts, use:

@article{falcon2020framework,
  title={A Framework For Contrastive Self-Supervised Learning And Designing A New Approach},
  author={Falcon, William and Cho, Kyunghyun},
  journal={arXiv preprint arXiv:2009.00104},
  year={2020}
}

To cite other contributed models or modules, please cite the authors directly (if they don't have a BibTeX entry, ping them in a GitHub issue).
