
Collection of useful (PyTorch) functions.

Project description

tajer

Taller [taˈʎer] is the Spanish word for workshop, a good place to store tools such as useful (PyTorch) functions.

You can easily install it with

pip3 install git+https://github.com/joh-fischer/tajer.git#egg=tajer

Utils

See tajer/utils.py for more information.

Neural network layers

# residual block with skip connection
from tajer.nn import ResidualBlock

# depthwise separable convolution (https://arxiv.org/abs/1704.04861)
from tajer.nn import DepthwiseSeparableConv2D

# attention layers (https://arxiv.org/abs/1706.03762)
from tajer.nn import MultiHeadAttention, ConvAttention

# linear attention (https://arxiv.org/abs/1812.01243)
from tajer.nn import LinearConvAttention

# Convolutional block attention module (https://arxiv.org/abs/1807.06521)
from tajer.nn import CBAM

# 1D sinusoidal time embedding (https://arxiv.org/abs/1706.03762)
from tajer.nn import TimeEmbedding
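
For reference, a depthwise separable convolution factors a standard convolution into a per-channel (depthwise) convolution followed by a 1x1 (pointwise) convolution, which cuts parameters and compute. The sketch below shows the idea in plain PyTorch; it is independent of tajer's own DepthwiseSeparableConv2D, whose constructor signature may differ.

import torch
import torch.nn as nn


class DepthwiseSeparable(nn.Module):
    """Depthwise (per-channel) conv followed by a 1x1 pointwise conv."""

    def __init__(self, in_ch, out_ch, kernel_size=3, padding=1):
        super().__init__()
        # groups=in_ch makes each filter operate on a single input channel
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size,
                                   padding=padding, groups=in_ch)
        # the 1x1 convolution mixes information across channels
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1)

    def forward(self, x):
        return self.pointwise(self.depthwise(x))


x = torch.randn(1, 32, 64, 64)               # (batch, channels, height, width)
print(DepthwiseSeparable(32, 64)(x).shape)   # torch.Size([1, 64, 64, 64])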

Distributed PyTorch

In tajer/distributed/min_DDP.py you can find a minimal working example of single-node, multi-GPU training with PyTorch, as well as a README.md that shows you how to use it. All communication between processes, as well as the multiprocess spawn, is handled by the functions defined in distributed_pytorch.py. The general pattern such an example follows is sketched below.
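
The following sketch shows the usual single-node DDP setup using only torch.distributed and torch.multiprocessing directly; it does not reproduce tajer's own helper functions, whose names and signatures may differ, and it assumes at least one CUDA device is available. tajer/distributed/min_DDP.py and its README remain the reference for the library's interface.

import os
import torch
import torch.distributed as dist
import torch.multiprocessing as mp
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP


def train(rank, world_size):
    # one process per GPU; NCCL handles the inter-GPU communication
    os.environ.setdefault("MASTER_ADDR", "localhost")
    os.environ.setdefault("MASTER_PORT", "12355")
    dist.init_process_group("nccl", rank=rank, world_size=world_size)

    model = nn.Linear(10, 1).to(rank)
    ddp_model = DDP(model, device_ids=[rank])   # gradients are all-reduced automatically

    optimizer = torch.optim.SGD(ddp_model.parameters(), lr=1e-3)
    loss = ddp_model(torch.randn(8, 10, device=rank)).sum()
    loss.backward()
    optimizer.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    world_size = torch.cuda.device_count()
    mp.spawn(train, args=(world_size,), nprocs=world_size)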

Logging

Command line and txt logger

This function returns a logger that prints to the command line and also writes all output to a text log file.

from tajer.log import get_logger

logger = get_logger('log_dir', dist_rank=0)

logger.info("...")
logger.warning("...")

Logger class

Here is a small example of how it works.

import torch
from tajer.log import Logger

logger = Logger('./logs',
                'experiment_name',
                # creates the log folder './logs/experiment_name/22-07-07_121028'
                timestamp=True,
                # include a tensorboard SummaryWriter
                tensorboard=True)

logger.log_hparams({'lr': 1e-4,
                    'optimizer': 'Adam'})

for epoch in range(2):
    logger.init_epoch(epoch)  # initialize epoch to aggregate values

    # training
    for step in range(4):
        logger.log_metrics({'loss': torch.rand(1), 'acc': torch.rand(1)},
                           phase='train', aggregate=True)

    # write to tensorboard
    logger.tensorboard.add_scalar('train/loss', logger.epoch['loss'].avg, global_step=epoch)

    # validation simulation
    for step in range(2):
        logger.log_metrics({'val_loss': torch.rand(1)},
                           phase='val', aggregate=True)

        print('Running average:', logger.epoch['val_loss'].avg)
        print('Running sum:', logger.epoch['val_loss'].sum)

logger.save()



Download files


Source Distribution

tajer-1.2.2.tar.gz (15.9 kB)


File details

Details for the file tajer-1.2.2.tar.gz.

File metadata

  • Download URL: tajer-1.2.2.tar.gz
  • Upload date:
  • Size: 15.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.6

File hashes

Hashes for tajer-1.2.2.tar.gz:

  • SHA256: bcc22984e73b7d8ac7abd73dfaaf0c9cacfef1415b614e9799545a3f4b4af47b
  • MD5: e87f1bb43f8a3cde807c7ecd03a7fb0b
  • BLAKE2b-256: 8c5de1f6b0d772b4cecb279dbda81cd2cdf4097fb8328703a631b4fff7f40f96
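
To check a downloaded archive against the SHA256 digest above, a quick verification with Python's standard hashlib looks like this (the path assumes the tarball sits in the current directory):

import hashlib

expected = 'bcc22984e73b7d8ac7abd73dfaaf0c9cacfef1415b614e9799545a3f4b4af47b'

# hash the downloaded source distribution and compare against the published digest
with open('tajer-1.2.2.tar.gz', 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print('OK' if digest == expected else 'hash mismatch')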

