
Collection of useful (PyTorch) functions.

Project description

tajer

Taller [taˈʎer] is the Spanish word for workshop, a good place to store tools such as useful (PyTorch) functions.

You can easily install it with

pip3 install git+https://github.com/joh-fischer/tajer.git#egg=tajer

Utils

See tajer/utils.py for more information.

Neural network layers

from tajer.nn import ResidualBlock

# depthwise separable convolution (https://arxiv.org/abs/1704.04861)
from tajer.nn import DepthwiseSeparableConv2D

# attention layers (https://arxiv.org/abs/1706.03762)
from tajer.nn import MultiHeadAttention, ConvAttention

# linear attention (https://arxiv.org/abs/1812.01243)
from tajer.nn import LinearConvAttention

# Convolutional block attention module (https://arxiv.org/abs/1807.06521)
from tajer.nn import CBAM

# 1D sinusoidal time embedding (https://arxiv.org/abs/1706.03762)
from tajer.nn import TimeEmbedding
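
These layers are regular torch.nn.Module subclasses and can be dropped into a model like any other layer. As a quick sketch only: the constructor arguments below are assumptions made for illustration, not the documented signatures, so check tajer/nn for the actual parameters.

import torch
from tajer.nn import DepthwiseSeparableConv2D

# NOTE: the argument names below are assumptions for illustration only;
# see tajer/nn for the actual constructor signatures.
layer = DepthwiseSeparableConv2D(in_channels=32, out_channels=64, kernel_size=3)

x = torch.randn(8, 32, 64, 64)   # (batch, channels, height, width)
y = layer(x)                     # unchanged spatial size is assumed here
print(y.shape)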

Distributed PyTorch

In tajer/distributed/min_DDP.py you can find a minimal working example of single-node, multi-GPU training with PyTorch, along with a README.md that shows how to use it. All communication between processes, as well as the multiprocessing spawn, is handled by the functions defined in distributed_pytorch.py.
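
For orientation, the sketch below shows the standard single-node, multi-GPU pattern (a torch.multiprocessing spawn plus torch.distributed process groups) that min_DDP.py builds on. It uses only plain PyTorch calls and is not tajer's API; the exact helper names and arguments live in tajer/distributed.

import os
import torch
import torch.distributed as dist
import torch.multiprocessing as mp

def worker(rank, world_size):
    # one process per GPU; 'rank' indexes the GPU on this node
    os.environ.setdefault('MASTER_ADDR', 'localhost')
    os.environ.setdefault('MASTER_PORT', '12355')
    dist.init_process_group('nccl', rank=rank, world_size=world_size)
    torch.cuda.set_device(rank)

    # ... build the model, wrap it in torch.nn.parallel.DistributedDataParallel,
    # and run the training loop here ...

    dist.destroy_process_group()

if __name__ == '__main__':
    world_size = torch.cuda.device_count()
    mp.spawn(worker, args=(world_size,), nprocs=world_size)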

Logging

Command line and txt logger

This function returns a logger that prints to the command line and also writes all output to a text log file.

from tajer.log import get_logger

logger = get_logger('log_dir', dist_rank=0)

logger.info("...")
logger.warning("...")

Logger class

Here is a small example of how it works.

import torch
from tajer.log import Logger

logger = Logger('./logs',
                # creates the log folder, e.g. './logs/experiment_name/22-07-07_121028'
                'experiment_name', timestamp=True,
                # include a tensorboard SummaryWriter
                tensorboard=True)

logger.log_hparams({'lr': 1e-4,
                    'optimizer': 'Adam'})

for epoch in range(2):
    logger.init_epoch(epoch)  # initialize epoch to aggregate values

    # training
    for step in range(4):
        logger.log_metrics({'loss': torch.rand(1), 'acc': torch.rand(1)},
                           phase='train', aggregate=True)

    # write to tensorboard
    logger.tensorboard.add_scalar('train/loss', logger.epoch['loss'].avg)

    # validation simulation
    for step in range(2):
        logger.log_metrics({'val_loss': torch.rand(1)},
                           phase='val', aggregate=True)

        print('Running average:', logger.epoch['val_loss'].avg)
        print('Running sum:', logger.epoch['val_loss'].sum)

logger.save()

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

tajer-1.2.3.tar.gz (16.5 kB)

File details

Details for the file tajer-1.2.3.tar.gz.

File metadata

  • Download URL: tajer-1.2.3.tar.gz
  • Upload date:
  • Size: 16.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.6

File hashes

Hashes for tajer-1.2.3.tar.gz

  • SHA256: 164a4293e9ba4af9d3d34984d1e1711acc3a0df26f60299599fa0a6325404c56
  • MD5: 31641c67750c289bf034644dad3f5048
  • BLAKE2b-256: 488bc589fec4ede089ffdb460e5f915344534015b5c75a852dd276e64d596d66
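
To check a downloaded archive against the SHA256 digest above, a minimal verification with Python's standard library could look like this (the local file path is an assumption):

import hashlib

path = 'tajer-1.2.3.tar.gz'   # adjust to wherever the archive was downloaded

with open(path, 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print(digest == '164a4293e9ba4af9d3d34984d1e1711acc3a0df26f60299599fa0a6325404c56')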

