A small tool package for qq

✨qqtools✨


A lightweight library, crafted and battle-tested daily by qq, to make PyTorch life a little easier.

I’ve gathered the repetitive parts of my day-to-day work and refined them into this slim utility library. It serves as my personal toolkit for handling data, training, and experiments, designed to keep projects moving fast with cleaner code and smoother workflows (and hopefully yours too!).

Built for me, shared for you.

What it includes

At its core, qqtools is a collection of small utilities I use around PyTorch projects:

  • data containers such as qDict and qData
  • dataset and dataloader helpers such as qDictDataset and qDictDataloader
  • small neural network helpers such as qMLP
  • a lightweight training framework, qpipeline
  • a command-line experiment queue for Linux, qexp
  • config and serialization helpers for YAML, JSON, pickle, and LMDB

In short, it is a practical toolbox for the repetitive parts of experiment work.

Install

# Core install
pip install qqtools

# Full install (quote the extras so zsh doesn't expand the brackets)
pip install "qqtools[full]"

# Experiment-queue extras only
pip install "qqtools[exp]"

Some parts still work with torch==1.x, but torch>=2.4 is recommended.

qDict

qDict is mainly there for cleaner attribute access in batch-like code:

import qqtools as qt

# Instead of bracket access:
# batch["input_ids"], batch["attention_mask"]

# use clean attribute access:
batch = qt.qDict({"input_ids": input_ids, "attention_mask": attention_mask})
out = model(batch.input_ids)
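The pattern behind this kind of attribute access is a small dict subclass. A minimal sketch of the idea (illustrative only, not qqtools' actual implementation):

```python
class AttrDict(dict):
    """A dict whose keys are also readable/writable as attributes."""

    def __getattr__(self, name):
        try:
            return self[name]
        except KeyError:
            # Raise AttributeError so hasattr() and friends behave correctly.
            raise AttributeError(name) from None

    def __setattr__(self, name, value):
        self[name] = value


batch = AttrDict({"input_ids": [1, 2], "attention_mask": [1, 1]})
print(batch.input_ids)   # [1, 2]
batch.labels = [0, 1]    # attribute writes land in the dict
print(batch["labels"])   # [0, 1]
```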

Context scope and qt.use_ctx

qt.ctx provides a lightweight scoped context. Values set inside with qt.ctx(...) are visible only in that scope and its nested calls, and the outer state is restored automatically when the block exits.

Exiting the scope restores the previous key bindings. In-place mutation of a shared mutable object through the live context is caller-managed behavior, however, and may remain visible outside the block.

import qqtools as qt

with qt.ctx(dim=512):
    print(qt.ctx.dim)  # 512

print(qt.ctx.get("dim"))  # None
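The restore-on-exit behavior can be pictured as a stack of scopes: entering a block pushes the outer bindings overlaid with the new ones, and exiting pops back. A rough sketch of that mechanism (not qqtools' actual implementation):

```python
import contextlib


class ScopedCtx:
    """Sketch of a scoped context: values set in a `with` block are visible
    inside it (and in nested calls), and restored when the block exits."""

    def __init__(self):
        self._stack = [{}]  # bottom scope is empty

    @contextlib.contextmanager
    def __call__(self, **kwargs):
        # New scope = outer bindings overlaid with the new ones.
        self._stack.append({**self._stack[-1], **kwargs})
        try:
            yield
        finally:
            self._stack.pop()  # restore the outer bindings

    def get(self, key, default=None):
        return self._stack[-1].get(key, default)


ctx = ScopedCtx()
with ctx(dim=512):
    print(ctx.get("dim"))  # 512
print(ctx.get("dim"))      # None
```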

@qt.use_ctx is the simplest way to inject context values into a class constructor:

import qqtools as qt


@qt.use_ctx
class AttentionLayer:
    def __init__(self, dim=64, heads=8):
        self.dim = dim
        self.heads = heads


with qt.ctx(dim=512, heads=16):
    layer = AttentionLayer()
    print(layer.dim, layer.heads)  # 512 16

Manual constructor arguments still take precedence over injected context values.
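That precedence rule can be sketched with a hypothetical decorator that fills in only the constructor parameters the caller did not supply (an illustration of the idea, not the real qt.use_ctx):

```python
import functools
import inspect


def use_ctx_sketch(ctx):
    """Hypothetical helper: inject values from a context dict into __init__,
    but only for parameters the caller left unspecified."""

    def decorate(cls):
        orig_init = cls.__init__
        sig = inspect.signature(orig_init)

        @functools.wraps(orig_init)
        def __init__(self, *args, **kwargs):
            bound = sig.bind_partial(self, *args, **kwargs)
            for name in sig.parameters:
                # Inject only parameters the caller did not supply.
                if name not in bound.arguments and name in ctx:
                    kwargs[name] = ctx[name]
            orig_init(self, *args, **kwargs)

        cls.__init__ = __init__
        return cls

    return decorate


ctx = {"dim": 512, "heads": 16}


@use_ctx_sketch(ctx)
class AttentionLayer:
    def __init__(self, dim=64, heads=8):
        self.dim, self.heads = dim, heads


print(AttentionLayer().dim)         # 512 (injected from context)
print(AttentionLayer(dim=128).dim)  # 128 (explicit argument wins)
```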

qexp

qexp is a lightweight experiment queue for Linux hosts. It is built around a shared project root and can coordinate multiple machines, each with multiple GPUs.

Quick start:

qexp init --shared-root /mnt/share/myproject/.qexp --machine gpu-a
qexp submit --name demo1 -- python train.py -c config1.yaml
qexp submit --name demo2 -- python train.py -c config2.yaml
qexp submit --name demo3 -- python train.py -c config3.yaml
# 3 tasks will be queued and run sequentially

After init, qexp saves the current shared_root and machine as CLI context, so you usually do not need to repeat them on every command.
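Shared-root queues of this kind are commonly built on atomic filesystem operations: tasks live as files under the shared root, and a worker claims one by renaming it into a per-machine directory, so two machines never grab the same task. A minimal sketch of that general design (illustrative only, not qexp's actual implementation; all names are hypothetical):

```python
import json
import os
import pathlib
import uuid


def submit(root, name, command):
    """Drop a task file into <root>/pending and return its id."""
    pending = pathlib.Path(root) / "pending"
    pending.mkdir(parents=True, exist_ok=True)
    task_id = f"{name}-{uuid.uuid4().hex[:8]}"
    (pending / f"{task_id}.json").write_text(
        json.dumps({"task_id": task_id, "command": command})
    )
    return task_id


def claim_next(root, machine):
    """Atomically claim one pending task for this machine, or return None."""
    root = pathlib.Path(root)
    running = root / "running" / machine
    running.mkdir(parents=True, exist_ok=True)
    for task_file in sorted((root / "pending").glob("*.json")):
        target = running / task_file.name
        try:
            os.rename(task_file, target)  # atomic claim on one filesystem
        except FileNotFoundError:
            continue  # another machine claimed it first
        return json.loads(target.read_text())
    return None
```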

Python API:

from qqtools.plugins import qexp

task = qexp.submit(
    qexp.load_root_config("/mnt/share/myproject/.qexp", "gpu-a"),
    command=["python", "train.py", "--epochs", "10"],
    name="demo",
)
print(task.task_id)

Note: Run pip install "qqtools[exp]" before using the qexp command.

qpipeline

qpipeline is a minimal training loop scaffold. It doesn't try to be a heavy framework. You write the project-specific model and task logic, and qpipeline handles the repetitive boilerplate: config-driven startup, train/val loops, metric aggregation, and checkpointing.

A tight training entry:

import torch
from qqtools.plugins.qpipeline import prepare_cmd_args, qPipeline
from qqtools.nn import qMLP

class MyTask:
    def __init__(self, args):
        # Your custom data logic goes here
        self.train_loader, self.val_loader = build_loaders(args)

    def batch_forward(self, model, batch):
        return {"pred": model(batch.x)}

    def batch_loss(self, out, batch):
        loss = torch.nn.functional.mse_loss(out["pred"], batch.y)
        return {"loss": (loss, len(batch.y))}

    def batch_metric(self, out, batch):
        mae = (out["pred"] - batch.y).abs().mean()
        return {"mae": (mae, len(batch.y))}

    def post_metric_to_err(self, result):
        return result["mae"]

class MyPipeline(qPipeline):
    @staticmethod
    def prepare_model(args):
        return qMLP([16, 8, 1])

    @staticmethod
    def prepare_task(args):
        return MyTask(args)

if __name__ == "__main__":
    args = prepare_cmd_args()
    pipe = MyPipeline(args, train=True)
    pipe.fit()
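Note that batch_loss and batch_metric return {"name": (value, count)} pairs, which lets a pipeline compute sample-weighted epoch averages rather than naive per-batch means. A sketch of that bookkeeping (illustrative, not qpipeline's actual code):

```python
def aggregate(batch_results):
    """Combine per-batch {"name": (value, count)} dicts into
    sample-weighted averages over the whole epoch."""
    totals = {}
    for result in batch_results:
        for name, (value, count) in result.items():
            s, n = totals.get(name, (0.0, 0))
            totals[name] = (s + float(value) * count, n + count)
    return {name: s / n for name, (s, n) in totals.items()}


# A batch of 30 samples counts three times as much as a batch of 10:
batches = [{"mae": (0.5, 10)}, {"mae": (0.3, 30)}]
print(aggregate(batches))  # {'mae': 0.35}
```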

Because qpipeline enforces a stable entry contract, it pairs perfectly with qexp for queued execution:

qexp submit -- python entry.py --config configs/train.yaml

Configuration follows a standard YAML structure. See qConfig.md for details.

Plugin modules

Under src/qqtools/plugins/, there are also:

  • qchem - tools for reading and processing quantum chemistry outputs
  • qpipeline - a training pipeline framework built on top of the core torch utilities
  • qhyperconnect - an implementation of Hyper-Connection for PyTorch

Test

Run the test suite with tox:

tox
