
Collection of useful (PyTorch) functions.

Project description

tajer

Taller [taˈʎer] is the Spanish word for workshop, a good place to store tools such as useful (PyTorch) functions.

You can easily install it with

pip3 install git+https://github.com/joh-fischer/tajer.git#egg=tajer

Utils

See tajer/utils.py for more information...

Neural network layers

from tajer.nn import ResidualBlock

# depthwise separable convolution (https://arxiv.org/abs/1704.04861)
from tajer.nn import DepthwiseSeparableConv2D

# attention layers (https://arxiv.org/abs/1706.03762)
from tajer.nn import MultiHeadAttention, ConvAttention

# linear attention (https://arxiv.org/abs/1812.01243)
from tajer.nn import LinearConvAttention

# Convolutional block attention module (https://arxiv.org/abs/1807.06521)
from tajer.nn import CBAM

# 1D sinusoidal time embedding (https://arxiv.org/abs/1706.03762)
from tajer.nn import TimeEmbedding
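
These layers are meant to be used like any other PyTorch module. The snippet below is only an illustrative sketch: the constructor arguments for DepthwiseSeparableConv2D are assumptions, not documented signatures, so check the class definitions in tajer/nn before relying on them.

import torch
from tajer.nn import DepthwiseSeparableConv2D

# Illustrative sketch only; the constructor arguments are assumed,
# not taken from the tajer documentation.
conv = DepthwiseSeparableConv2D(in_channels=3, out_channels=64, kernel_size=3)
x = torch.randn(8, 3, 32, 32)   # (batch, channels, height, width)
out = conv(x)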

Distributed PyTorch

In tajer/distributed/min_DDP.py you can find a minimal working example of single-node, multi-GPU training with PyTorch, along with a README.md that shows how to use it. All communication between processes, as well as the multiprocessing spawn, is handled by the functions defined in distributed_pytorch.py.
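
For orientation, the following is a minimal single-node, multi-GPU skeleton written with plain PyTorch primitives (mp.spawn, init_process_group, DistributedDataParallel). It only sketches the kind of boilerplate the helpers in distributed_pytorch.py are meant to wrap; it is not tajer's API, and the worker function and its arguments are made up for this example.

import os
import torch
import torch.distributed as dist
import torch.multiprocessing as mp
from torch.nn.parallel import DistributedDataParallel as DDP


def worker(rank, world_size):
    # one process per GPU; rank identifies this process
    os.environ.setdefault('MASTER_ADDR', 'localhost')
    os.environ.setdefault('MASTER_PORT', '12355')
    dist.init_process_group('nccl', rank=rank, world_size=world_size)
    torch.cuda.set_device(rank)

    model = DDP(torch.nn.Linear(10, 10).to(rank), device_ids=[rank])
    # ... training loop with a DistributedSampler goes here ...

    dist.destroy_process_group()


if __name__ == '__main__':
    world_size = torch.cuda.device_count()
    mp.spawn(worker, args=(world_size,), nprocs=world_size)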

Logging

Command line and txt logger

This function returns a logger that prints to the command line and also writes all output to a text log file.

from tajer.log import get_logger

logger = get_logger('log_dir', dist_rank=0)

logger.info("...")
logger.warning("...")

Logger class

Here is a small example of how it works.

import torch
from tajer.log import Logger

logger = Logger('./logs',
                # creates the log folder './logs/experiment_name/22-07-07_121028'
                'experiment_name', timestamp=True,
                # include a tensorboard SummaryWriter
                tensorboard=True)

logger.log_hparams({'lr': 1e-4,
                    'optimizer': 'Adam'})

for epoch in range(2):
    logger.init_epoch(epoch)  # initialize epoch to aggregate values

    # training
    for step in range(4):
        logger.log_metrics({'loss': torch.rand(1), 'acc': torch.rand(1)},
                           phase='train', aggregate=True)

    # write to tensorboard
    logger.tensorboard.add_scalar('train/loss', logger.epoch['loss'].avg, epoch)

    # validation simulation
    for step in range(2):
        logger.log_metrics({'val_loss': torch.rand(1)},
                           phase='val', aggregate=True)

        print('Running average:', logger.epoch['val_loss'].avg)
        print('Running sum:', logger.epoch['val_loss'].sum)

logger.save()



Download files


Source Distribution

tajer-1.2.1.tar.gz (15.9 kB)


File details

Details for the file tajer-1.2.1.tar.gz.

File metadata

  • Download URL: tajer-1.2.1.tar.gz
  • Upload date:
  • Size: 15.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.9

File hashes

Hashes for tajer-1.2.1.tar.gz:

  • SHA256: 6af2cb43bc35a0ae4d61f4c1fd2bb5d094a95ece251a25ca04a10f9ddf0a81e4
  • MD5: bb079dd5a89bd7b9e2dd5a2622bfda73
  • BLAKE2b-256: 9fdd995e536c8017d47c9f1250673d9e8460ea87723fc99008a5cbbb6ecf4609

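To check the download against the SHA256 digest above, a minimal sketch using Python's standard hashlib module looks like this (the file name and digest are taken from the listing above):

import hashlib

expected = '6af2cb43bc35a0ae4d61f4c1fd2bb5d094a95ece251a25ca04a10f9ddf0a81e4'

# hash the downloaded source distribution and compare it against the published digest
with open('tajer-1.2.1.tar.gz', 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print('OK' if digest == expected else 'hash mismatch')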
