
A super-resolution research framework for PyTorch, focused on simplicity and flexibility through config files.


SR-Forge

Structured Research Framework for Organized Research & Guided Experiments

SR-Forge is a modular, config-driven PyTorch framework for deep learning research. It handles the repetitive plumbing — data routing, component wiring, configuration management — so you can focus on what matters: your models, your data, and your experiments.

How It Works

Dataset --> [Entry] --> Transforms --> [Entry] --> Model --> [Entry] --> Loss
  1. A Dataset loads raw data and wraps it in an Entry — a dictionary-like container that carries tensors, metadata, and any other fields through the pipeline
  2. Transforms preprocess the data (normalize, augment, reshape)
  3. A Model runs the neural network computation
  4. Results are written back to the Entry
  5. A Loss function evaluates the prediction against the target

Every component reads from and writes to Entry objects. This uniform interface is what makes everything interchangeable — swap any component and the rest of the pipeline doesn't change.
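The real `Entry` class lives in `srforge.data` and does more than this, but as a rough mental model (an illustrative sketch, not srforge's actual implementation), an Entry behaves like a dictionary whose fields are also readable as attributes, and every pipeline stage reads fields from it and writes results back:

```python
# Illustrative sketch only -- srforge's real Entry (srforge.data.Entry)
# is richer; this just shows the dictionary-like data flow.
class Entry(dict):
    """A dict-like container whose keys are also readable as attributes."""
    def __getattr__(self, name):
        try:
            return self[name]
        except KeyError as err:
            raise AttributeError(name) from err

entry = Entry({"input": [1.0, 2.0, 3.0], "scene_id": "S2-0042"})

# A "transform" reads a field, processes it, and writes the result back.
def normalize(entry):
    total = sum(entry["input"])
    entry["normalized"] = [v / total for v in entry["input"]]
    return entry

entry = normalize(entry)
print(entry.normalized)  # [0.166..., 0.333..., 0.5]
```

Because `normalize` only touches named fields of the container, it neither knows nor cares what else the Entry carries — which is exactly what makes components swappable.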

Key Features

  • Entry-based data flow — All data lives in Entry objects. Every component reads from Entry fields and writes back to them, so components are interchangeable without glue code.
  • IO binding — Components declare abstract port names ("I need an image") that get mapped to concrete Entry fields ("read from input_rgb"). Write a component once, reuse it with any data layout.
  • Pipeline composition — Chain models and transforms with a simple arrow syntax: image -> encoder -> features -> decoder -> output.
  • Configuration over code — Define entire experiments in YAML. Reproduce any experiment by sharing a config file.
  • Built-in models — FSRCNN, DSen2, RAMS, TR-MISR, MagNAt, plus a registry for custom architectures.
  • Unified metrics — L1, L2, SSIM, LPIPS, schedulable loss combiners, and straightforward logging.
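To make the arrow syntax concrete, here is a toy interpreter for it — a conceptual sketch, not srforge's actual parser. It assumes the spec alternates Entry field names and component names, so each component reads the field to its left and writes the field to its right:

```python
# Conceptual sketch of the arrow syntax -- NOT srforge's implementation.
# Assumes the spec alternates field names and component names.
def run_pipeline(spec, components, entry):
    """Execute 'field -> component -> field -> ...' against a dict-like entry."""
    parts = [p.strip() for p in spec.split("->")]
    for i in range(1, len(parts) - 1, 2):
        src, comp, dst = parts[i - 1], parts[i], parts[i + 1]
        entry[dst] = components[comp](entry[src])
    return entry

components = {
    "encoder": lambda x: [v * 2 for v in x],   # stand-in for a real model
    "decoder": lambda x: [v + 1 for v in x],
}
entry = {"image": [1, 2, 3]}
run_pipeline("image -> encoder -> features -> decoder -> output", components, entry)
print(entry["output"])  # [3, 5, 7]
```

Intermediate fields (`features` here) stay on the entry, so later stages or losses can reference them by name.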

Installation

Prerequisites

Install PyTorch before installing SR-Forge. Follow the official instructions at pytorch.org to pick the right build for your OS and GPU.

# Example: CUDA 12.8
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu128

# Example: CPU only
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cpu

Install SR-Forge

pip install srforge

Graph neural network support (optional)

For graph-based models (e.g., MagNAt), install PyTorch Geometric first, then the graph extra:

pip install torch-geometric
pip install pyg-lib torch-scatter torch-sparse torch-cluster torch-spline-conv \
    -f https://data.pyg.org/whl/torch-2.7.0+cu128.html

pip install srforge[graph]

Quick Start

mkdir my-experiment && cd my-experiment
srforge init

This generates a complete training script and config file:

my-experiment/
├── train.py              # Complete training script
└── configs/
    └── train-cfg.yaml    # Sample config with all settings

Edit configs/train-cfg.yaml to point at your data, then run:

python train.py

The generated files are a starting point — modify them to fit your workflow. The config supports multi-GPU, mixed precision, dataset caching, W&B tracking, loss scheduling, and checkpointing out of the box.

A Taste of SR-Forge

from srforge.models import Model
from srforge.data import Entry
import torch

class Upscaler(Model):
    def __init__(self):
        super().__init__()
        self.net = torch.nn.Conv2d(3, 3, 3, padding=1)

    def _forward(self, image):
        return self.net(image)

model = Upscaler()
model.set_io({"inputs": {"image": "input"}, "outputs": "prediction"})

entry = Entry({"input": torch.randn(1, 3, 64, 64)})
result = model(entry)
print(result.prediction.shape)  # torch.Size([1, 3, 64, 64])

The model reads from entry["input"], runs the network, and stores the result in entry["prediction"].
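To see why this IO binding makes components reusable, here is a toy reimplementation of the dispatch — loosely modeled on the `set_io` call above, with illustrative names, not srforge's actual code. The mapping translates abstract port names into concrete Entry fields before calling `_forward`:

```python
# Toy dispatch loosely modeled on the set_io example above -- the real
# srforge.models.Model is more capable; class names here are illustrative.
class ToyModel:
    def set_io(self, io):
        self._inputs = io["inputs"]    # abstract port -> Entry field
        self._output = io["outputs"]   # Entry field to write

    def __call__(self, entry):
        kwargs = {port: entry[field] for port, field in self._inputs.items()}
        entry[self._output] = self._forward(**kwargs)
        return entry

class Doubler(ToyModel):
    def _forward(self, image):
        return [v * 2 for v in image]

model = Doubler()
model.set_io({"inputs": {"image": "input_rgb"}, "outputs": "prediction"})
entry = {"input_rgb": [1, 2, 3]}
entry = model(entry)
print(entry["prediction"])  # [2, 4, 6]
```

Rebinding the same model to a different data layout is just another `set_io` call — `_forward` never changes.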

Documentation

Full documentation is available at tarasiewicztomasz.gitlab.io/sr-forge.

For Developers

Install PyTorch and PyG with CUDA wheels first (see above), then clone and install in editable mode:

git clone https://gitlab.com/tarasiewicztomasz/sr-forge.git
cd sr-forge
pip install -e ".[dev,graph]"

Run the test suite:

pytest tests/ -v

Some tests require optional dependencies (e.g., torch_geometric). These are automatically skipped if the dependency is missing.

Contributing

SR-Forge is under active development. Contributions welcome!

  • Report issues on GitLab
  • Submit merge requests
  • Share your models and experiments
