
A small tool package for qq


✨qqtools✨


A lightweight library, crafted and battle-tested daily by qq, to make PyTorch life a little easier.

I’ve gathered the repetitive parts of my day-to-day work and refined them into this slim utility library. It is my personal toolkit for handling data, training, and experiments, designed to keep projects (and hopefully yours too!) moving fast with cleaner code and smoother workflows.

Built for me, shared for you.

Requirements

  • torch>=2.0 for full functionality
    • Some components remain backward compatible with torch==1.x
    • Recommended: torch>=2.4
  • pyyaml>=6.0
    • YAML is the recommended format for all configuration files; using it throughout gives a single, consistent way to drive and manage workflow settings.
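To illustrate the YAML-first configuration style, a hypothetical config file might look like the fragment below. The keys are made up for the example and are not a schema defined by qqtools:

```yaml
# config.yaml -- illustrative training config (keys are hypothetical)
model:
  hidden_dims: [5, 5, 1]
  activation: relu
optimizer:
  lr: 1.0e-4
  weight_decay: 0.01
training:
  epochs: 100
```

With pyyaml installed, such a file loads into plain nested dicts via `yaml.safe_load`.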

To get started quickly, install it via pip:

pip install qqtools

Install with full features:

pip install qqtools[full]

Data Format Support

Non-torch formats:

qDict : An enhanced version of the built-in dict.
qScalaDict : Dict[str, num]. A dict mapping string keys to scalar values.
qListData : List[dict]. A list of dicts.
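In plain-Python terms, the two data shapes above look like this; the `column` helper is an illustration of the List[dict] pattern, not qqtools API:

```python
# A Dict[str, num] "scalar dict", e.g. per-epoch metrics
metrics = {"loss": 0.37, "acc": 0.91}

# A List[dict]: one dict per sample/record
records = [
    {"name": "a", "score": 1.0},
    {"name": "b", "score": 3.0},
]

# A typical operation on a list of dicts: pull out one field as a column
def column(records, key):
    return [r[key] for r in records]

print(column(records, "score"))  # [1.0, 3.0]
```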

Torch-related formats:

qData
qBatchList
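A common pattern behind batch containers like these is collating a list of per-sample dicts into a single dict of stacked fields. The sketch below shows that pattern in plain Python; it is an assumption about the general idea, not qBatchList's actual implementation:

```python
def collate(samples):
    """Turn [{'x': ..., 'y': ...}, ...] into {'x': [...], 'y': [...]}."""
    keys = samples[0].keys()
    return {k: [s[k] for s in samples] for k in keys}

batch = collate([{"x": 1, "y": 10}, {"x": 2, "y": 20}])
print(batch)  # {'x': [1, 2], 'y': [10, 20]}
```

In a torch setting, the inner list comprehension would typically become a `torch.stack` over per-sample tensors.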

Simple Training Loop

For Jupyter users

import qqtools as qt
qt.import_common(globals())

x = np.random.rand(100, 5)
y = np.random.rand(100)

# dataset wrap: one qData dict per sample
data_list = [qt.qData({'x': x[i], 'y': y[i]}) for i in range(len(x))]
dataset = qt.qDictDataset(data_list=data_list)
dataloader = qt.qDictDataloader(dataset)

# model
model = qt.nn.qMLP([5,5,1], activation="relu")
loss_fn = torch.nn.MSELoss()
optimizer = torch.optim.AdamW(model.parameters(), lr=1.0e-4, weight_decay=0.01)

# device
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)

# loop
for epoch in range(100):
    for batch in dataloader:
        batch.to(device)
        out = model(batch.x)
        loss = loss_fn(out.squeeze(-1), batch.y)  # squeeze (B, 1) output to match y
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}  loss {loss.item():.6f}")
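For comparison, here is roughly the same loop written in plain PyTorch with `TensorDataset`/`DataLoader` and no qqtools; a minimal CPU sketch, shortened to a couple of epochs:

```python
import numpy as np
import torch
from torch.utils.data import TensorDataset, DataLoader

x = torch.tensor(np.random.rand(100, 5), dtype=torch.float32)
y = torch.tensor(np.random.rand(100, 1), dtype=torch.float32)
loader = DataLoader(TensorDataset(x, y), batch_size=16, shuffle=True)

model = torch.nn.Sequential(
    torch.nn.Linear(5, 5), torch.nn.ReLU(), torch.nn.Linear(5, 1)
)
loss_fn = torch.nn.MSELoss()
opt = torch.optim.AdamW(model.parameters(), lr=1e-4, weight_decay=0.01)

for epoch in range(2):  # kept short for the sketch
    for xb, yb in loader:
        loss = loss_fn(model(xb), yb)
        opt.zero_grad()
        loss.backward()
        opt.step()
```

The qqtools version above trades this boilerplate for the dict-based dataset/dataloader wrappers.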

Individual Modules

The following modules build on the core functionality provided by this package. Each is designed to be independent and can be imported on its own.

under plugins/

  • qchem
  • qpipeline
  • qhyperconnect

Test

Run the test suite with tox:

tox

