Atalaya

Flexible experiment logging helpers for PyTorch projects.
Atalaya is a lightweight toolkit that helps you keep PyTorch experiments organised. It provides:
- a `Writer` built on top of `tensorboardX`, with handy helpers for scalars, models, CSV exports, and optional integrations with Weights & Biases, Neptune, Comet, and ClearML;
- automatic `info.json` metadata snapshots (command, Python version, Git details) so you can replay runs later;
- a colour-aware `terminal` helper for structured CLI output with timestamps;
- a `Timer` utility to track wall-clock timing for blocks of code or functions;
- optional console log capture so that your scripts remain reproducible.
The writer uses tensorboardX as its event backend, so you get native TensorBoard support while still being able to plug in third-party experiment trackers such as Weights & Biases, Neptune, ClearML, or Comet.
| Integration | Status |
|---|---|
| TensorBoardX | Works (core backend) |
| Weights & Biases | Works (Writer.with_wandb) |
| Neptune | Not fully validated yet (Writer.with_neptune) |
| ClearML | Not fully validated yet (Writer.with_clearml) |
| Comet ML | Not fully validated yet (Writer.with_comet) |
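Because several trackers are not fully validated yet, it can help to check that an optional backend's package is importable before calling the corresponding `with_*` method. A small stdlib-only sketch of that guard (this is an illustrative pattern, not part of Atalaya's API):

```python
import importlib.util


def tracker_available(module_name: str) -> bool:
    """Return True when the optional tracker's package can be imported."""
    return importlib.util.find_spec(module_name) is not None


# A stdlib module is always importable:
print(tracker_available("json"))             # True
# A package that is not installed:
print(tracker_available("no_such_tracker"))  # False
```

You can then call, say, `writer.with_wandb(...)` only when the check passes, so a run without the extra installed still logs to TensorBoard.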
Installation
Install the core package from PyPI:
```bash
pip install atalaya
```
Extras are available if you want to enable third-party integrations:
```bash
# Enable Weights & Biases support
pip install "atalaya[wandb]"

# Install everything (Neptune, Comet, ClearML, matplotlib, seaborn)
pip install "atalaya[all]"
```
Quick Start
TensorBoard logging
```python
from atalaya.writer import Writer

writer = Writer(
    name="baseline",
    project="my-awesome-project",
    logdir="logs",
    add_time=True,
    save_as_csv=True,
    output_catcher=True,
    log_git_info=True,
)

# Optional: sync logs with Weights & Biases
writer.with_wandb(group="experiments", entity="my-team")

for epoch in range(10):
    metrics = {"loss": 0.1 * epoch, "accuracy": 0.5 + 0.05 * epoch}
    writer.add_scalars(metrics, global_step=epoch, prefix="train")

    # WandB-style convenience API
    writer.log({"train/loss": metrics["loss"]}, step=epoch)

    # Log entire models to track parameter and gradient histograms
    writer.add_models(
        {"encoder": encoder, "decoder": decoder},
        global_step=epoch,
        log_type="all",  # "parameters", "gradients", or "all"
    )

writer.close()
```
CSV logging is enabled when `save_as_csv=True`, and setting `output_catcher=True` mirrors everything printed to the console into `log.txt` inside the run folder.

Every run writes an `info.json` file with the launch command, Python version, and other metadata to the log directory. When `log_git_info` is enabled and a Git repository is detected, the same file also includes Git details plus ready-to-run checkout/clone commands so experiments can be reproduced. You can merge your own metadata into the file via the `info` parameter (for example `info={"dataset": "cifar10"}`).
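As a rough illustration of what such a snapshot contains, here is a stdlib-only sketch; the field names below are hypothetical, and Atalaya's actual `info.json` schema may differ:

```python
import json
import sys
import time

# Illustrative sketch of the kind of data captured in info.json;
# these keys are assumptions, not the library's exact schema.
info = {
    "command": " ".join(sys.argv),
    "python_version": sys.version.split()[0],
    "timestamp": time.strftime("%Y-%m-%d %H:%M:%S"),
}

# User metadata passed as info={"dataset": "cifar10"} would be merged in:
info.update({"dataset": "cifar10"})

print(json.dumps(info, indent=2))
```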
Optional integrations
```python
writer.with_wandb(group="experiments", entity="my-team")
writer.with_neptune(entity="my-workspace")
```

Install the matching extras first (for example `poetry add atalaya[wandb]` or `pip install atalaya[neptune]`).
Timing utilities
```python
from atalaya.time_manager import Timer

timer = Timer("training")
with timer:
    run_training_loop()

timer.report(report_type="total_with_stats")  # prints a coloured summary to the terminal
```
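Since a block timer is, at heart, a context manager around `time.perf_counter`, the idea can be sketched in a few lines of stdlib Python (this is an illustration of the concept, not Atalaya's implementation):

```python
import time


class BlockTimer:
    """Minimal sketch of a wall-clock block timer (not Atalaya's Timer)."""

    def __init__(self, name):
        self.name = name
        self.laps = []  # elapsed seconds for each timed block

    def __enter__(self):
        self._start = time.perf_counter()
        return self

    def __exit__(self, *exc):
        self.laps.append(time.perf_counter() - self._start)
        return False  # never swallow exceptions from the timed block


timer = BlockTimer("training")
for _ in range(3):
    with timer:
        time.sleep(0.01)  # stand-in for real work

print(f"{timer.name}: total {sum(timer.laps):.3f}s over {len(timer.laps)} laps")
```

Reusing the same instance across iterations is what makes a statistics-style report (total, mean, per-lap timings) possible.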
Terminal helper
```python
from atalaya.terminal import terminal

# Optional: persist terminal output without a Writer output catcher
terminal.set_log_file("logs/terminal.log")

# Override colours either by name or raw ANSI code
terminal.set_named_color("orange", "\033[38;5;208m")
terminal.set_color("warning", "orange")

terminal.print_info("Loading data...")
terminal.print_warning("Loss is plateauing...", color="orange")
terminal.print_ok("Training finished successfully!")
```
Messages are timestamped with the process uptime by default.
When you enable `Writer(..., output_catcher=True)` the log file is handled automatically, so calling `set_log_file` is not required.
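The colour overrides shown above are plain ANSI escape sequences: `\033[38;5;208m` selects colour 208 (an orange) from the 256-colour palette. A stdlib sketch of how such a code wraps text (assuming a terminal with 256-colour support):

```python
ORANGE = "\033[38;5;208m"  # 256-colour palette, index 208
RESET = "\033[0m"          # restore default terminal attributes


def colourise(text: str, code: str) -> str:
    """Wrap text in an ANSI colour code followed by a reset."""
    return f"{code}{text}{RESET}"


print(colourise("Loss is plateauing...", ORANGE))
```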
Example project
The example/ directory ships with a small PyTorch multi-layer perceptron that demonstrates writer usage end-to-end:
```bash
python example/mlp.py
```
Logs are saved in example/logs/, and the script shows how to combine Writer, Timer, and the terminal helper in a real training loop.
Development and Publishing
This project is managed with Poetry.
```bash
# Install dependencies (add --with dev to use Poe tasks)
poetry install --with dev

# Clean previous builds and publish to PyPI
poe publish

# Or run the Poetry command directly
# poetry publish --build
```
The package metadata (version, dependencies, classifiers) lives in pyproject.toml.
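For orientation, here is a minimal sketch of what such a Poetry `pyproject.toml` might look like; the dependency constraints and extras below are illustrative assumptions, not the package's actual metadata:

```toml
[tool.poetry]
name = "atalaya"
version = "0.2.8"
description = "Flexible experiment logging helpers for PyTorch projects."

[tool.poetry.dependencies]
# Version constraints here are assumed for illustration.
python = ">=3.8,<4.0"
tensorboardX = "*"
wandb = { version = "*", optional = true }

[tool.poetry.extras]
wandb = ["wandb"]

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
```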
Project details
File details
Details for the file atalaya-0.2.8.tar.gz.
File metadata
- Download URL: atalaya-0.2.8.tar.gz
- Upload date:
- Size: 15.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.1.3 CPython/3.10.19 Darwin/25.0.0
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 6fa8e35f31a712e341b9894d484f5de1a00573727d7973bc74740c39d48105b1 |
| MD5 | f24c47346f6ef5c9d219ab99d4fddd2a |
| BLAKE2b-256 | b458ccb586f118de9c76365ec2a6a51a7b3a81b65f9a441f413df79b1bf7e15f |
File details
Details for the file atalaya-0.2.8-py3-none-any.whl.
File metadata
- Download URL: atalaya-0.2.8-py3-none-any.whl
- Upload date:
- Size: 16.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.1.3 CPython/3.10.19 Darwin/25.0.0
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 7b52d77e47b51f8d2b6e5c4d6123fe99062df3a52c3e87dbf40ac522784e9d2f |
| MD5 | 37e19729ad5ac91f2cea5af64a514374 |
| BLAKE2b-256 | 307b49b86bc09d21e860db7912ba1be54c49c8601e77f4eeb6a57cf44507e4ea |