
A small tool package for qq

Project description

✨qqtools✨


A lightweight library, crafted and battle-tested daily by qq, to make PyTorch life a little easier.

I’ve gathered the repetitive parts of my day-to-day work and refined them into this slim utility library. It serves as my personal toolkit for handling data, training, and experiments, designed to keep projects moving fast with cleaner code and smoother workflows (and hopefully yours too!).

Built for me, shared for you.

Requirements

  • torch>=2.0 for full functionality
    • Some components maintain backward compatibility with torch==1.x
    • Recommended: torch>=2.4
  • pyyaml>=6.0
    • Recommended to use YAML format for all configuration files.

Using YAML throughout gives a single, unified way to drive and manage all workflow operations.
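For illustration, a training config in this style might look like the following (the field names here are hypothetical, not an actual qqtools schema):

```yaml
# hypothetical training config (illustrative only)
model:
  hidden_dims: [5, 5, 1]
  activation: relu
optim:
  lr: 1.0e-4
  weight_decay: 0.01
train:
  epochs: 100
  device: cuda
```

Keeping hyperparameters in one YAML file like this makes experiments easy to diff and reproduce.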

To get started quickly, install it via pip:

pip install qqtools

Install with full features:

pip install qqtools[full]

Data Format Support

Non-torch formats:

qDict : an enhanced version of the basic dict.
qScalaDict : Dict[str, num]. A dict mapping string keys to scalar numbers.
qListData : List[dict]. A list of dicts.
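As a rough illustration of the Dict[str, num] shape that qScalaDict represents, here is a plain-Python sketch (it does not use the qqtools API) that averages per-batch metric dicts, a typical use of this pattern:

```python
# Plain-Python illustration of the Dict[str, num] pattern:
# accumulate per-batch scalar metrics, then average them key-by-key.

def mean_scalar_dicts(dicts):
    """Average a list of {str: number} dicts key-by-key."""
    totals = {}
    for d in dicts:
        for k, v in d.items():
            totals[k] = totals.get(k, 0.0) + v
    n = len(dicts)
    return {k: v / n for k, v in totals.items()}

batch_metrics = [
    {"loss": 0.4, "acc": 0.8},
    {"loss": 0.2, "acc": 0.9},
]
print(mean_scalar_dicts(batch_metrics))
```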

Torch-related data formats

qData
qBatchList
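qData and qBatchList are the torch-aware containers. The general idea behind turning a list of per-sample dicts into a single batch can be sketched in plain Python (this is only an illustration of the concept, not the actual qqtools implementation):

```python
# Collate a list of per-sample dicts into one dict of lists --
# the shape a batch container typically exposes to the model.

def collate(samples):
    """[{'x': ..., 'y': ...}, ...] -> {'x': [...], 'y': [...]}"""
    keys = samples[0].keys()
    return {k: [s[k] for s in samples] for k in keys}

batch = collate([{"x": 1, "y": 10}, {"x": 2, "y": 20}])
print(batch)  # {'x': [1, 2], 'y': [10, 20]}
```

In a torch setting, the inner lists would additionally be stacked into tensors so the model can consume `batch['x']` directly.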

Simple Training Loop

For Jupyter users

import qqtools as qt
qt.import_common(globals())

# toy data (float32 to match the model's default dtype;
# y is shaped (100, 1) to match the model's output shape)
x = np.random.rand(100, 5).astype(np.float32)
y = np.random.rand(100, 1).astype(np.float32)

# dataset wrap: one qData dict per sample
data_list = [qt.qData({'x': x[i], 'y': y[i]}) for i in range(len(x))]
dataset = qt.qDictDataset(data_list=data_list)
dataloader = qt.qDictDataloader(dataset)  # iterate the dataset in batches

# model
model = qt.nn.qMLP([5,5,1], activation="relu")
loss_fn = torch.nn.MSELoss()
optimizer = torch.optim.AdamW(model.parameters(), lr=1.0e-4, weight_decay=0.01)

# device
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)

# loop
for epoch in range(100):
    for batch in dataloader:
        batch.to(device)
        out = model(batch.x)
        loss = loss_fn(out, batch.y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    print(f"{epoch} {loss.item():4.6f}")
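Conceptually, a dict-style dataloader just yields fixed-size chunks of the sample list. A minimal plain-Python sketch of that idea (again, an illustration, not the actual qDictDataloader):

```python
# Yield successive fixed-size chunks of a sample list,
# the core behavior any simple dataloader provides.

def iter_batches(data, batch_size):
    """Yield consecutive slices of `data` of length `batch_size` (last may be shorter)."""
    for start in range(0, len(data), batch_size):
        yield data[start:start + batch_size]

samples = list(range(10))
sizes = [len(b) for b in iter_batches(samples, 4)]
print(sizes)  # [4, 4, 2]
```

A real dataloader adds shuffling and collation on top, but the iteration contract is the same.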

Individual Modules

The following modules build on the core functionality provided by this package. Each is designed to be independent, so it can be imported on its own.

under plugins/

  • qchem
  • qpipeline
  • qhyperconnect

Test

tox



Download files

Download the file for your platform.

Source Distribution

qqtools-1.1.34.tar.gz (125.6 kB)

Uploaded Source

Built Distribution


qqtools-1.1.34-py3-none-any.whl (156.5 kB)

Uploaded Python 3

File details

Details for the file qqtools-1.1.34.tar.gz.

File metadata

  • Download URL: qqtools-1.1.34.tar.gz
  • Upload date:
  • Size: 125.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for qqtools-1.1.34.tar.gz
Algorithm Hash digest
SHA256 b010df22738e4328998ba9269d9d3fa6ab728de4e152d8a6a306bcf462a28684
MD5 9ebaf11c7d51a1ba816092d2610ee96b
BLAKE2b-256 9dadc826f6bd82a2f463683d3cadc6dc27d9babbe4bcd1243e2ac4e14efe8343


File details

Details for the file qqtools-1.1.34-py3-none-any.whl.

File metadata

  • Download URL: qqtools-1.1.34-py3-none-any.whl
  • Upload date:
  • Size: 156.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for qqtools-1.1.34-py3-none-any.whl
Algorithm Hash digest
SHA256 08a15fb070c6782a33ae11ad0962bfd0b52b837694afd0c89ae3ae19f9237934
MD5 e67ffd94a60b0f065d4e45e0ca1c854e
BLAKE2b-256 04491c55ecb18773c3ece01579ad5d0e623bbb32716641bfcefde62a3d27e67a

