Modules, operations and models for computer vision in PyTorch

Holocron

Implementations of recent Deep Learning tricks in Computer Vision, easily paired up with your favorite framework and model zoo.

Holocrons were information-storage datacron devices used by both the Jedi Order and the Sith that contained ancient lessons or valuable information in holographic form.

Source: Wookieepedia

Table of Contents

  • Getting started
  • Usage
  • Technical roadmap
  • Documentation
  • Contributing
  • License

Note: support for the activation mapper and model summary has been dropped and outsourced to standalone packages (torch-cam & torch-scan) to clarify the project scope.

Getting started

Prerequisites

  • Python 3.6 (or more recent)
  • pip

Installation

Install the package in developer mode:

git clone https://github.com/frgfm/Holocron.git
pip install -e Holocron/

Note: a pip package release is also available on PyPI (published as pylocron).
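
Assuming the pylocron distribution on PyPI matches this repository, the latest release can be installed with:

pip install pylocron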

Usage

nn

Usage

Usage is similar to torch.nn:

import torch.nn as nn
from holocron.nn import Mish, NLReLU

# Both modules inherit from torch.nn.Module and can be used as such
model = nn.Sequential(nn.Conv2d(3, 64, (3, 3)),
                      Mish(),
                      nn.Conv2d(64, 128, (3, 3)),
                      NLReLU(),)
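
As a quick sanity check, here is a minimal sketch of a forward pass through the model defined above (the 32x32 input size is an arbitrary choice for illustration):

import torch

# Dummy batch: one RGB image of 32x32 pixels
x = torch.rand(1, 3, 32, 32)
out = model(x)
# Two unpadded 3x3 convolutions shrink each spatial dim by 4
print(out.shape)  # torch.Size([1, 128, 28, 28])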

models

Usage

Using the models module, you can easily load torch modules or full models:

from holocron.models.res2net import res2net
# Load pretrained Res2net
model = res2net(depth=50, num_classes=10, pretrained=True).eval()
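
Once loaded, the model behaves like any torch.nn.Module. A minimal inference sketch with a random input (the 224x224 resolution is an assumption for illustration, not a documented requirement):

import torch

# Random stand-in for a preprocessed image batch
img = torch.rand(1, 3, 224, 224)
with torch.no_grad():
    logits = model(img)
# Turn logits into class probabilities
probs = torch.softmax(logits, dim=1)
print(probs.shape)  # torch.Size([1, 10]) since num_classes=10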

ops

Usage

Usage is similar to torchvision.ops:

import torch
from holocron.ops.boxes import box_ciou

boxes1 = torch.tensor([[0, 0, 100, 100], [100, 100, 200, 200]], dtype=torch.float32)
boxes2 = torch.tensor([[50, 50, 150, 150]], dtype=torch.float32)

box_ciou(boxes1, boxes2)
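
For reference, a short sketch contrasting CIoU with plain IoU on the same boxes (assuming box_ciou follows the pairwise N x M convention of torchvision.ops.box_iou):

from torchvision.ops import box_iou

# CIoU subtracts penalties for center distance and aspect-ratio
# mismatch from IoU, so each entry is <= the corresponding IoU
iou = box_iou(boxes1, boxes2)
ciou = box_ciou(boxes1, boxes2)
print(iou, ciou)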

optim

Usage

The optimizer wrapper can be used on any torch.optim.Optimizer object:

from torchvision.models.resnet import resnet18
from holocron.optim import Lookahead, RaLars

model = resnet18()
# Common usage of optimizer
optimizer = RaLars(model.parameters(), lr=3e-4)
# Wrap it with Lookahead
optimizer = Lookahead(optimizer, sync_rate=0.5, sync_period=6)
# Now use it just like your base optimizer
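
A minimal training step with the wrapped optimizer (dummy data and a standard cross-entropy loss, assuming the Lookahead wrapper exposes the usual zero_grad/step interface):

import torch
import torch.nn.functional as F

# Dummy batch: 4 random images with random ImageNet-style labels
x = torch.rand(4, 3, 224, 224)
y = torch.randint(0, 1000, (4,))

optimizer.zero_grad()
loss = F.cross_entropy(model(x), y)
loss.backward()
optimizer.step()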

You can use the OneCycleScheduler as follows:

from torchvision.models.resnet import resnet18
from torch.optim import Adam
from holocron.optim.lr_scheduler import OneCycleScheduler

model = resnet18()
# Let's have different LRs for weight and biases for instance
bias_params, weight_params = [], []
for n, p in model.named_parameters():
    if n.endswith('.bias'):
        bias_params.append(p)
    else:
        weight_params.append(p)
# We pass the parameters to the optimizer
optimizer = Adam([dict(params=weight_params, lr=2e-4), dict(params=bias_params, lr=1e-4)])

steps = 500
scheduler = OneCycleScheduler(optimizer, steps, cycle_momentum=False)
# Let's record the evolution of LR in each group
lrs = [[], []]
for step in range(steps):
    for idx, group in enumerate(optimizer.param_groups):
        lrs[idx].append(group['lr'])
    # Train your model and call optimizer.step() here
    scheduler.step()

# And plot the result
import matplotlib.pyplot as plt
plt.plot(lrs[0], label='Weight LR')
plt.plot(lrs[1], label='Bias LR')
plt.legend()
plt.show()

(Figure: learning rate schedule produced by the One Cycle policy for both parameter groups)

Technical roadmap

The project is currently under development; here are the objectives for the next releases:

  • Standardize models: standardize model implementations by task.
  • Speed benchmark: compare the execution speed of holocron.nn functions.
  • Reference scripts: add reference training scripts.

Documentation

The full package documentation is available here for detailed specifications. The documentation was built with Sphinx using a theme provided by Read the Docs.

Contributing

Please refer to CONTRIBUTING if you wish to contribute to this project.

License

Distributed under the MIT License. See LICENSE for more information.
