
A small tool package for qq


✨qqtools✨


A lightweight library, crafted and battle-tested daily by qq, to make PyTorch life a little easier.

I’ve gathered the repetitive parts of my day-to-day work and refined them into this slim utility library. It serves as my personal toolkit for handling data, training, and experiments, designed to keep projects moving fast with cleaner code and smoother workflows (and hopefully yours too!).

Built for me, shared for you.

What it includes

At its core, qqtools is a collection of small utilities I use around PyTorch projects:

  • data containers such as qDict and qData
  • dataset and dataloader helpers such as qDictDataset and qDictDataloader
  • small neural network helpers such as qMLP
  • a lightweight training framework, qpipeline
  • a command-line experiment queue for Linux, qexp
  • config and serialization helpers for YAML, JSON, pickle, and LMDB

In short, it is a practical toolbox for the repetitive parts of experiment work.
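As a baseline for the kind of boilerplate the serialization helpers wrap, here is a plain-stdlib round trip through JSON and pickle. This uses only the standard library, not qqtools' own helper names, which are not documented here:

```python
import json
import pickle
import tempfile
from pathlib import Path

# Round-trip a config dict through JSON and pickle with the standard
# library; qqtools' config/serialization helpers streamline this kind
# of pattern (its exact helper names may differ).
config = {"lr": 1e-3, "epochs": 10, "model": {"hidden": [16, 8]}}

with tempfile.TemporaryDirectory() as d:
    json_path = Path(d) / "config.json"
    json_path.write_text(json.dumps(config, indent=2))
    assert json.loads(json_path.read_text()) == config

    pkl_path = Path(d) / "config.pkl"
    pkl_path.write_bytes(pickle.dumps(config))
    assert pickle.loads(pkl_path.read_bytes()) == config
```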

Install

# Core install
pip install qqtools

# Full install
pip install qqtools[full]

# If you only want the experiment queue extras:
pip install qqtools[exp]

While some parts still work with torch==1.x, torch>=2.4 is recommended.

qDict

qDict is mainly there for cleaner attribute access in batch-like code:

import qqtools as qt

# Instead of dict bracket access:
# batch["input_ids"], batch["attention_mask"]

# use clean attribute access:
batch = qt.qDict({"input_ids": input_ids, "attention_mask": attention_mask})
out = model(batch.input_ids)
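To illustrate the attribute-access pattern itself (this is a minimal stand-in, not qqtools' actual implementation, which offers more than this), a dict subclass with `__getattr__`/`__setattr__` is enough:

```python
class AttrDict(dict):
    """Minimal illustration of the qDict idea: a dict whose keys are
    also readable and writable as attributes. Sketch only."""

    def __getattr__(self, name):
        # Called only when normal attribute lookup fails.
        try:
            return self[name]
        except KeyError as e:
            raise AttributeError(name) from e

    def __setattr__(self, name, value):
        # Route attribute assignment into the dict itself.
        self[name] = value

batch = AttrDict({"input_ids": [1, 2, 3], "attention_mask": [1, 1, 1]})
assert batch.input_ids == [1, 2, 3]          # attribute access
assert batch["attention_mask"] == [1, 1, 1]  # still a normal dict
batch.labels = [0]
assert batch["labels"] == [0]
```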

qexp

qexp is a lightweight experiment queue for Linux hosts. It is built around a shared project root and can coordinate multiple machines, each with multiple GPUs.

Quick start:

qexp init --shared-root /mnt/share/myproject/.qexp --machine gpu-a
qexp submit --name demo1 -- python train.py -c config1.yaml
qexp submit --name demo2 -- python train.py -c config2.yaml
qexp submit --name demo3 -- python train.py -c config3.yaml
# 3 tasks will be queued and run sequentially

After init, qexp saves the current shared_root and machine as CLI context, so you usually do not need to repeat them on every command.

Python API:

from qqtools.plugins import qexp

task = qexp.submit(
    qexp.load_root_config("/mnt/share/myproject/.qexp", "gpu-a"),
    command=["python", "train.py", "--epochs", "10"],
    name="demo",
)
print(task.task_id)
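The coordination model behind this kind of tool, a queue directory on a shared filesystem that workers pull from, can be sketched with plain files. The on-disk layout below is hypothetical and does not reflect qexp's real format:

```python
import json
import tempfile
import time
import uuid
from pathlib import Path

def submit(root: Path, command: list[str], name: str) -> str:
    """Drop a task file into a shared queue directory.
    Hypothetical layout -- qexp's actual format is not documented here."""
    task_id = uuid.uuid4().hex[:8]
    queue = root / "queue"
    queue.mkdir(parents=True, exist_ok=True)
    # Timestamp prefix so lexicographic order is submission order (FIFO).
    path = queue / f"{time.time_ns()}-{task_id}.json"
    path.write_text(json.dumps({"id": task_id, "name": name, "command": command}))
    return task_id

def next_task(root: Path):
    """A worker claims the oldest queued task by renaming its file."""
    pending = sorted((root / "queue").glob("*.json"))
    if not pending:
        return None
    task = json.loads(pending[0].read_text())
    pending[0].rename(pending[0].with_suffix(".claimed"))
    return task

with tempfile.TemporaryDirectory() as d:
    root = Path(d)
    tid = submit(root, ["python", "train.py"], "demo")
    task = next_task(root)
    assert task["id"] == tid and task["name"] == "demo"
    assert next_task(root) is None  # queue drained
```

A real queue would also need atomic claiming across machines (e.g. rename as a claim, which is atomic on most shared POSIX filesystems) and failure handling; this sketch only shows the shared-root idea.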

Note: Run pip install qqtools[exp] before using the qexp command.

qpipeline

qpipeline is a minimal training loop scaffold. It doesn't try to be a heavy framework. You write the project-specific model and task logic, and qpipeline handles the repetitive boilerplate: config-driven startup, train/val loops, metric aggregation, and checkpointing.

A tight training entry:

import torch
from qqtools.plugins.qpipeline import prepare_cmd_args, qPipeline
from qqtools.nn import qMLP

class MyTask:
    def __init__(self, args):
        # Your custom data logic goes here
        self.train_loader, self.val_loader = build_loaders(args)

    def batch_forward(self, model, batch):
        return {"pred": model(batch.x)}

    def batch_loss(self, out, batch):
        loss = torch.nn.functional.mse_loss(out["pred"], batch.y)
        return {"loss": (loss, len(batch.y))}

    def batch_metric(self, out, batch):
        mae = (out["pred"] - batch.y).abs().mean()
        return {"mae": (mae, len(batch.y))}

    def post_metric_to_err(self, result):
        return result["mae"]

class MyPipeline(qPipeline):
    @staticmethod
    def prepare_model(args):
        return qMLP([16, 8, 1])

    @staticmethod
    def prepare_task(args):
        return MyTask(args)

if __name__ == "__main__":
    args = prepare_cmd_args()
    pipe = MyPipeline(args, train=True)
    pipe.fit()
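Note that batch_loss and batch_metric return (value, count) pairs rather than bare scalars. Presumably this lets the pipeline aggregate per-batch values into a correct epoch-level figure via a count-weighted mean, even when batch sizes vary; that assumed aggregation rule can be sketched as:

```python
def weighted_mean(pairs):
    """Aggregate per-batch (value, count) pairs into one epoch value
    using a count-weighted mean. Assumed aggregation rule; sketch only."""
    total = sum(value * count for value, count in pairs)
    n = sum(count for _, count in pairs)
    return total / n

# Three batches of different sizes with per-batch MAE values:
pairs = [(0.5, 10), (0.3, 30), (0.4, 20)]
# Weighted mean = (0.5*10 + 0.3*30 + 0.4*20) / 60
assert abs(weighted_mean(pairs) - 22.0 / 60.0) < 1e-12
```

A plain average of the three batch values (0.4) would overweight the small first batch; the weighted mean (≈0.367) does not.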

Because qpipeline enforces a stable entry contract, it pairs perfectly with qexp for queued execution:

qexp submit -- python entry.py --config configs/train.yaml

Configuration follows a standard YAML structure. See qConfig.md for details.

Plugin modules

Under src/qqtools/plugins/, there are also:

  • qchem - tools for reading and processing quantum chemistry outputs
  • qpipeline - a training pipeline framework built on top of the core torch utilities
  • qhyperconnect - an implementation of Hyper-Connection for PyTorch

Test

tox


