
Kostyl Toolkit

Kickass Orchestration System for Training, Yielding & Logging — a batteries-included toolbox that glues PyTorch Lightning, Hugging Face Transformers, and ClearML into a single workflow.

Overview

  • Rapidly bootstrap Lightning experiments with opinionated defaults (KostylLightningModule, custom schedulers, grad clipping and metric formatting).
  • Keep model configs source-controlled via Pydantic mixins, with ClearML syncing out of the box (ConfigLoadingMixin, ClearMLConfigMixin).
  • Reuse Lightning checkpoints directly inside Transformers models through LightningCheckpointLoaderMixin.
  • Ship distributed-friendly utilities (deterministic logging, FSDP helpers, LR scaling, ClearML tag management).

Installation

# Latest release from PyPI
pip install kostyl-toolkit

# or with uv
uv pip install kostyl-toolkit

Development setup:

uv sync                # creates the virtualenv declared in pyproject.toml
source .venv/bin/activate  # or .venv/bin/activate.fish under fish
pre-commit install     # optional but recommended

Quick Start

from lightning import Trainer
from transformers import AutoModelForSequenceClassification

from kostyl.ml_core.configs.hyperparams import HyperparamsConfig
from kostyl.ml_core.configs.training_params import TrainingParams
from kostyl.ml_core.lightning.extenstions.custom_module import KostylLightningModule


class TextClassifier(KostylLightningModule):
    def __init__(self, hyperparams: HyperparamsConfig):
        super().__init__()
        self.hyperparams = hyperparams  # grad clipping + scheduler knobs
        self.model = AutoModelForSequenceClassification.from_pretrained(
            "distilbert-base-uncased",
            num_labels=2,
        )

    def training_step(self, batch, batch_idx):
        outputs = self.model(**batch)
        self.log("train/loss", outputs.loss)
        return outputs.loss

train_cfg = TrainingParams.from_file("configs/training.yaml")
hyperparams = HyperparamsConfig.from_file("configs/hyperparams.yaml")

module = TextClassifier(hyperparams)

trainer = Trainer(**train_cfg.trainer.model_dump())
trainer.fit(module)
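`TrainingParams.from_file` reads a YAML file whose fields are forwarded to the Lightning `Trainer` via `model_dump()`. A hypothetical `configs/training.yaml` might look like the following (field names are illustrative; check the `TrainingParams` schema for the actual keys):

```yaml
trainer:
  max_epochs: 3
  accelerator: auto
  devices: 1
  gradient_clip_val: 1.0
```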

Restoring a plain Transformers model from a Lightning checkpoint:

from kostyl.ml_core.lightning.extenstions.pretrained_model import LightningCheckpointLoaderMixin


model = LightningCheckpointLoaderMixin.from_lighting_checkpoint(
    "checkpoints/epoch=03-step=500.ckpt",
    config_key="config",
    weights_prefix="model.",
)
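Under the hood, loading Lightning weights into a bare Transformers model mostly amounts to reading the checkpoint's `state_dict` and stripping the LightningModule attribute prefix (`weights_prefix` above). A stdlib-only sketch of that key remapping (a hypothetical helper, not the kostyl implementation):

```python
def strip_prefix(state_dict: dict, prefix: str = "model.") -> dict:
    """Keep only keys under `prefix` and drop the prefix, so the
    remaining keys line up with the bare model's own state_dict."""
    return {
        key[len(prefix):]: value
        for key, value in state_dict.items()
        if key.startswith(prefix)
    }


# A Lightning checkpoint stores weights under the LightningModule's
# attribute name, e.g. "model.classifier.weight" for `self.model`.
ckpt_state = {
    "model.classifier.weight": "tensor",
    "model.classifier.bias": "tensor",
    "hyperparams_cache": "not a weight",
}
weights = strip_prefix(ckpt_state)  # keys: "classifier.weight", "classifier.bias"
```

Keys that don't match the prefix (optimizer state, cached hyperparams) are dropped rather than passed to the model.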

Components

  • Configurations (kostyl/ml_core/configs): strongly-typed training, optimizer, and scheduler configs with ClearML syncing helpers.
  • Lightning Extensions (kostyl/ml_core/lightning): custom LightningModule base class, callbacks, logging bridges, and the checkpoint loader mixin.
  • Schedulers (kostyl/ml_core/schedulers): extensible LR schedulers (base/composite/cosine) with serialization helpers and on-step logging.
  • ClearML Utilities (kostyl/ml_core/clearml): tag/version helpers and logging bridges for ClearML Tasks.
  • Distributed + Metrics Utils (kostyl/ml_core/dist_utils.py, metrics_formatting.py): world-size-aware LR scaling, rank-aware metric naming, and per-class formatting.
  • Logging Helpers (kostyl/utils/logging.py): rank-aware Loguru setup and uniform handling of incompatible checkpoint keys.
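"World-size-aware LR scaling" usually refers to the linear scaling rule: as the number of participating processes (and hence the effective batch size) grows, the learning rate grows proportionally. A minimal sketch of what a helper like the one in `dist_utils.py` might do (the function name is illustrative):

```python
def scale_lr(base_lr: float, world_size: int) -> float:
    """Linear scaling rule: the effective batch grows with world_size,
    so the learning rate is multiplied by the same factor."""
    if world_size < 1:
        raise ValueError("world_size must be >= 1")
    return base_lr * world_size
```

In practice such helpers read the world size from the distributed environment (e.g. `torch.distributed.get_world_size()`) rather than taking it as an argument.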

Project Layout

kostyl/
    ml_core/
        configs/        # Pydantic configs + ClearML mixins
        lightning/      # Lightning module, callbacks, loggers, extensions
        schedulers/     # Base + composite/cosine schedulers
        clearml/        # Logging + pulling utilities
    utils/              # Dict helpers, logging utilities

