A small tool package for qq
✨qqtools✨
A lightweight library, crafted and battle-tested daily by qq, to make PyTorch life a little easier.
I’ve gathered the repetitive parts of my day-to-day work and refined them into this slim utility library. It serves as my personal toolkit for handling data, training, and experiments, designed to keep projects moving fast with cleaner code and smoother workflows (and hopefully yours too!).
Built for me, shared for you.
What it includes
At its core, qqtools is a collection of small utilities I use around PyTorch projects:
- data containers such as `qDict` and `qData`
- dataset and dataloader helpers such as `qDictDataset` and `qDictDataloader`
- small neural network helpers such as `qMLP`
- a lightweight training framework, `qpipeline`
- a command-line experiment queue for Linux, `qexp`
- config and serialization helpers for YAML, JSON, pickle, and LMDB
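The serialization helpers themselves are not documented on this page. As a rough idea of the kind of round-trip they typically wrap, here is a plain-stdlib sketch (illustrative only, not qqtools' API):

```python
import json
import tempfile
from pathlib import Path

config = {"lr": 1e-3, "epochs": 10, "model": {"hidden": [16, 8]}}

# JSON round-trip: the kind of save/load a config helper usually wraps
with tempfile.TemporaryDirectory() as tmp:
    path = Path(tmp) / "config.json"
    path.write_text(json.dumps(config, indent=2))
    loaded = json.loads(path.read_text())

assert loaded == config  # the round-trip preserves the structure
```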
In short, it is a practical toolbox for the repetitive parts of experiment work.
Install
```shell
# Core install
pip install qqtools

# Full install
pip install qqtools[full]

# If you only want the experiment queue extras:
pip install qqtools[exp]
```
While some parts still work with `torch==1.x`, `torch>=2.4` is recommended.
qDict
qDict is mainly there for cleaner attribute access in batch-like code:
```python
import qqtools as qt

# Instead of dict brackets:
# batch["input_ids"], batch["attention_mask"]
# use clean attribute access:
batch = qt.qDict({"input_ids": input_ids, "attention_mask": attention_mask})
out = model(batch.input_ids)
```
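For intuition, an attribute-access dict can be sketched in a few lines. This is an illustrative stand-in, not qqtools' actual implementation:

```python
class AttrDict(dict):
    """Minimal dict subclass with attribute access (illustrative only)."""

    def __getattr__(self, name):
        try:
            return self[name]
        except KeyError:
            raise AttributeError(name)

    def __setattr__(self, name, value):
        self[name] = value

batch = AttrDict({"input_ids": [1, 2, 3]})
assert batch.input_ids == batch["input_ids"]  # both access styles work
```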
qexp
qexp is a lightweight experiment queue for Linux hosts. It is built around a shared project root and works across multiple machines, each with multiple GPUs.
Quick start:
```shell
qexp init --shared-root /mnt/share/myproject/.qexp --machine gpu-a
qexp submit --name demo1 -- python train.py -c config1.yaml
qexp submit --name demo2 -- python train.py -c config2.yaml
qexp submit --name demo3 -- python train.py -c config3.yaml
# 3 tasks will be queued and run sequentially
```
After init, qexp saves the current shared_root and machine as CLI context, so you usually do not need to repeat them on every command.
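One way a CLI can persist such context between invocations is a small JSON file; the sketch below is an assumption about the mechanism (hypothetical path and format), not qexp's actual storage:

```python
import json
import tempfile
from pathlib import Path

def save_context(path, shared_root, machine):
    """Persist CLI context so later commands can omit the flags."""
    Path(path).write_text(json.dumps({"shared_root": shared_root, "machine": machine}))

def load_context(path):
    """Read the saved context back for the next invocation."""
    return json.loads(Path(path).read_text())

with tempfile.TemporaryDirectory() as tmp:
    ctx_path = Path(tmp) / "context.json"  # hypothetical location
    save_context(ctx_path, "/mnt/share/myproject/.qexp", "gpu-a")
    ctx = load_context(ctx_path)

assert ctx["machine"] == "gpu-a"
```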
Python API:
```python
from qqtools.plugins import qexp

task = qexp.submit(
    qexp.load_root_config("/mnt/share/myproject/.qexp", "gpu-a"),
    command=["python", "train.py", "--epochs", "10"],
    name="demo",
)
print(task.task_id)
```
Note: run `pip install qqtools[exp]` before using the `qexp` command.
qpipeline
qpipeline is a minimal training loop scaffold. It doesn't try to be a heavy framework. You write the project-specific model and task logic, and qpipeline handles the repetitive boilerplate: config-driven startup, train/val loops, metric aggregation, and checkpointing.
A tight training entry:
```python
import torch

from qqtools.nn import qMLP
from qqtools.plugins.qpipeline import prepare_cmd_args, qPipeline


class MyTask:
    def __init__(self, args):
        # Your custom data logic goes here
        self.train_loader, self.val_loader = build_loaders(args)

    def batch_forward(self, model, batch):
        return {"pred": model(batch.x)}

    def batch_loss(self, out, batch):
        loss = torch.nn.functional.mse_loss(out["pred"], batch.y)
        return {"loss": (loss, len(batch.y))}

    def batch_metric(self, out, batch):
        mae = (out["pred"] - batch.y).abs().mean()
        return {"mae": (mae, len(batch.y))}

    def post_metric_to_err(self, result):
        return result["mae"]


class MyPipeline(qPipeline):
    @staticmethod
    def prepare_model(args):
        return qMLP([16, 8, 1])

    @staticmethod
    def prepare_task(args):
        return MyTask(args)


if __name__ == "__main__":
    args = prepare_cmd_args()
    pipe = MyPipeline(args, train=True)
    pipe.fit()
```
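The `(value, count)` pairs returned by `batch_loss` and `batch_metric` suggest count-weighted averaging across batches. A minimal sketch of that aggregation (an assumption about the mechanism, not qpipeline's internal code):

```python
def aggregate(pairs):
    """Count-weighted mean over (value, count) pairs, one pair per batch."""
    total = sum(value * count for value, count in pairs)
    n = sum(count for _, count in pairs)
    return total / n

# two batches: mean 2.0 over 4 samples, mean 4.0 over 2 samples
avg = aggregate([(2.0, 4), (4.0, 2)])
assert abs(avg - 16 / 6) < 1e-9  # weighted, not a plain mean of 3.0
```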
Because qpipeline enforces a stable entry contract, it pairs perfectly with qexp for queued execution:
```shell
qexp submit -- python entry.py --config configs/train.yaml
```
Configuration follows a standard YAML structure. See qConfig.md for details.
Plugin modules
Under src/qqtools/plugins/, there are also:
- `qchem` - tools for reading and processing quantum chemistry outputs
- `qpipeline` - a training pipeline framework built on top of the core torch utilities
- `qhyperconnect` - an implementation of Hyper-Connection for PyTorch
Test
```shell
tox
```
Project details
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file qqtools-1.2.15.tar.gz.
File metadata
- Download URL: qqtools-1.2.15.tar.gz
- Upload date:
- Size: 188.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `e6d20b1ba0154f423fe34f507a1c54b2422abb0ec1df4b1216347e296c53bf62` |
| MD5 | `4d4d0cd0c48a43d980985e57f442915a` |
| BLAKE2b-256 | `5e7aac4a7c9d42c9df7d149a87e917a56f524da54a38678b081fb8017df91eda` |
File details
Details for the file qqtools-1.2.15-py3-none-any.whl.
File metadata
- Download URL: qqtools-1.2.15-py3-none-any.whl
- Upload date:
- Size: 231.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `b61f47a9f9ac3cd387f11607bd47d4faed3bd44b5fdbf188bc8b7526c432510e` |
| MD5 | `4eecab4f7ce98323bf932a00bd55b7a3` |
| BLAKE2b-256 | `f4a29faeae042c917b643e64178a7ac9b984419e19d18cb7cf2b6e1cac3703c9` |