
Super-resolution research framework for PyTorch with a focus on simplicity and config-driven flexibility.


SR-Forge

Structured Research Framework for Organized Research & Guided Experiments

SR-Forge is a modular, config-driven PyTorch framework for deep learning research. It handles the repetitive plumbing — data routing, component wiring, configuration management — so you can focus on what matters: your models, your data, and your experiments.

How It Works

Dataset --> [Entry] --> Transforms --> [Entry] --> Model --> [Entry] --> Loss
  1. A Dataset loads raw data and wraps it in an Entry — a dictionary-like container that carries tensors, metadata, and any other fields through the pipeline
  2. Transforms preprocess the data (normalize, augment, reshape)
  3. A Model runs the neural network computation
  4. Results are written back to the Entry
  5. A Loss function evaluates the prediction against the target

Every component reads from and writes to Entry objects. This uniform interface is what makes everything interchangeable — swap any component and the rest of the pipeline doesn't change.
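The flow above can be sketched in plain Python. This is a conceptual sketch only: a plain dict stands in for Entry, and the `transform`, `model`, and `loss` functions below are hypothetical stand-ins, not SR-Forge's actual API.

```python
# Conceptual sketch of Entry-style data flow: every stage reads fields
# from the same dict-like container and writes new fields back to it.

def make_entry(raw_value):
    # Dataset step: wrap raw data in a dict standing in for Entry.
    return {"input": raw_value}

def transform(entry):
    # Transform step: preprocess (here, scale an 8-bit value into [0, 1]).
    entry["normalized"] = entry["input"] / 255.0
    return entry

def model(entry):
    # Model step: run the "network" (a toy doubling) and write the result back.
    entry["prediction"] = entry["normalized"] * 2.0
    return entry

def loss(entry, target):
    # Loss step: evaluate the prediction against the target.
    return abs(entry["prediction"] - target)

entry = make_entry(128.0)
entry = transform(entry)
entry = model(entry)
print(sorted(entry), loss(entry, 1.0))
```

Because each stage only reads and writes named fields, swapping out `transform` or `model` leaves the rest of the chain untouched, which is the interchangeability the Entry interface buys.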

Key Features

  • Entry-based data flow — All data lives in Entry objects. Every component reads from Entry fields and writes back to them, so components are interchangeable without glue code.
  • IO binding — Components declare abstract port names ("I need an image") that get mapped to concrete Entry fields ("read from input_rgb"). Write a component once, reuse it with any data layout.
  • Pipeline composition — Chain models and transforms with a simple arrow syntax: image -> encoder -> features -> decoder -> output.
  • Configuration over code — Define entire experiments in YAML. Reproduce any experiment by sharing a config file.
  • Built-in models — FSRCNN, DSen2, RAMS, TR-MISR, MagNAt, plus a registry for custom architectures.
  • Unified metrics — L1, L2, SSIM, LPIPS, schedulable loss combiners, and straightforward logging.
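To illustrate the configuration-over-code idea, an experiment file might look roughly like this. The keys below are hypothetical and only convey the shape, not SR-Forge's actual schema; the generated configs/train-cfg.yaml is the authoritative reference.

```yaml
# Hypothetical experiment config -- illustrative shape only,
# not SR-Forge's actual schema.
dataset:
  name: my_dataset
  root: data/train
transforms:
  - normalize
  - random_crop
model:
  name: fsrcnn           # one of the built-in architectures
pipeline: image -> encoder -> features -> decoder -> output
loss:
  - l1
  - ssim
```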

Installation

Prerequisites

Install PyTorch before installing SR-Forge. Follow the official instructions at pytorch.org to pick the right build for your OS and GPU.

# Example: CUDA 12.8
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu128

# Example: CPU only
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cpu

Install SR-Forge

pip install srforge

Graph neural network support (optional)

For graph-based models (e.g., MagNAt), install PyTorch Geometric first, then the graph extra:

pip install torch-geometric
pip install pyg-lib torch-scatter torch-sparse torch-cluster torch-spline-conv \
    -f https://data.pyg.org/whl/torch-2.7.0+cu128.html

pip install srforge[graph]

Quick Start

mkdir my-experiment && cd my-experiment
srforge init

This generates a complete training script and config file:

my-experiment/
├── train.py              # Complete training script
└── configs/
    └── train-cfg.yaml    # Sample config with all settings

Edit configs/train-cfg.yaml to point at your data, then run:

python train.py

The generated files are a starting point — modify them to fit your workflow. The config supports multi-GPU, mixed precision, dataset caching, W&B tracking, loss scheduling, and checkpointing out of the box.

A Taste of SR-Forge

from srforge.models import Model
from srforge.data import Entry
import torch

class Upscaler(Model):
    def __init__(self):
        super().__init__()
        self.net = torch.nn.Conv2d(3, 3, 3, padding=1)

    def _forward(self, image):
        return self.net(image)

model = Upscaler()
model.set_io({"inputs": {"image": "input"}, "outputs": "prediction"})

entry = Entry({"input": torch.randn(1, 3, 64, 64)})
result = model(entry)
print(result.prediction.shape)  # torch.Size([1, 3, 64, 64])

The model reads from entry["input"], runs the network, and stores the result in entry["prediction"].

Documentation

Full documentation is available at tarasiewicztomasz.gitlab.io/sr-forge.

For Developers

Install PyTorch and PyG with CUDA wheels first (see above), then clone and install in editable mode:

git clone https://gitlab.com/tarasiewicztomasz/sr-forge.git
cd sr-forge
pip install -e ".[dev,graph]"

Run the test suite:

pytest tests/ -v

Some tests require optional dependencies (e.g., torch_geometric). These are automatically skipped if the dependency is missing.

Contributing

SR-Forge is under active development. Contributions welcome!

  • Report issues on GitLab
  • Submit merge requests
  • Share your models and experiments
