
A wrapper library for training deep neural networks with the PyTorch framework.


deep-ml


deep-ml is a high-level PyTorch training framework that simplifies deep learning workflows for computer vision tasks. It provides easy-to-use trainers with distributed training support, comprehensive task implementations, and seamless experiment tracking.

Key Features

Multiple Training Backends

  • FabricTrainer: Lightning Fabric for distributed training (recommended for multi-GPU)
  • AcceleratorTrainer: HuggingFace Accelerate integration (alternative multi-GPU backend)
  • Learner: Classic PyTorch trainer (single-device, notebook-friendly)

Pre-built Task Implementations

  • Image Classification (single & multi-label)
  • Semantic Segmentation (binary & multiclass)
  • Image Regression
  • Custom tasks via extensible base classes

Experiment Tracking

  • TensorBoard integration (default)
  • MLflow support
  • Weights & Biases (wandb) integration
  • Custom logger interface

Advanced Training Features

  • ✅ Automatic Mixed Precision (AMP)
  • ✅ Gradient accumulation & clipping
  • ✅ Learning rate scheduling with warmup
  • ✅ Multi-GPU and distributed training
  • ✅ Checkpoint management
  • ✅ Progress bars and real-time metrics

Installation

Basic Installation

pip install deepml

With Optional Dependencies

# For Lightning Fabric
pip install deepml lightning-fabric

# For HuggingFace Accelerate
pip install deepml accelerate

# For MLflow tracking
pip install deepml mlflow

# For Weights & Biases
pip install deepml wandb

# For Albumentations (segmentation)
pip install deepml albumentations

Quick Start

Image Classification

from deepml.tasks import ImageClassification
from deepml.fabric_trainer import FabricTrainer
import torch
from torch.optim import Adam
from torchvision.models import resnet18

# 1. Define your model
model = resnet18(num_classes=10)

# 2. Create a task
task = ImageClassification(
    model=model,
    model_dir="./checkpoints",
    classes=['cat', 'dog', 'bird', ...]  # Optional
)

# 3. Setup optimizer and loss
optimizer = Adam(model.parameters(), lr=1e-3)
criterion = torch.nn.CrossEntropyLoss()

# 4. Create trainer
trainer = FabricTrainer(
    task=task,
    optimizer=optimizer,
    criterion=criterion,
    accelerator="auto",  # Use GPU if available
    devices="auto",      # Use all available devices
    precision="16-mixed" # Mixed precision training
)

# 5. Train!
trainer.fit(
    train_loader=train_loader,
    val_loader=val_loader,
    epochs=50
)

# 6. Visualize predictions
task.show_predictions(loader=val_loader, samples=9)
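
The quick start assumes `train_loader` and `val_loader` already exist. A minimal, self-contained sketch of building them with plain PyTorch utilities (synthetic tensors stand in for a real image dataset here):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Synthetic stand-in data: 64 fake 3x32x32 "images" with labels in [0, 10)
images = torch.randn(64, 3, 32, 32)
labels = torch.randint(0, 10, (64,))

train_loader = DataLoader(TensorDataset(images, labels), batch_size=16, shuffle=True)
val_loader = DataLoader(TensorDataset(images, labels), batch_size=16)
```

For real data, any `torch.utils.data.Dataset` that yields `(image_tensor, label)` pairs works the same way.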

Semantic Segmentation

import torch
from deepml.tasks import Segmentation
from deepml.fabric_trainer import FabricTrainer
from deepml.losses import JaccardLoss

# Define model (e.g., a U-Net; assumed defined or imported elsewhere)
model = UNet(in_channels=3, out_channels=1)

# Create task
task = Segmentation(
    model=model,
    model_dir="./checkpoints",
    mode="binary",
    num_classes=1,
    threshold=0.5
)

# Setup training
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = torch.nn.BCEWithLogitsLoss()  # or JaccardLoss() for IoU-based training

trainer = FabricTrainer(task=task, optimizer=optimizer, criterion=criterion)

# Train
trainer.fit(
    train_loader=train_loader,
    val_loader=val_loader,
    epochs=100
)

Documentation

Full documentation is available at: [Documentation Link]

  • Getting Started: Installation and quick start guide
  • User Guide: Detailed guides for trainers, tasks, datasets, etc.
  • API Reference: Complete API documentation
  • Tutorials: Step-by-step tutorials for common use cases
  • Examples: Complete example projects

Tutorials

Available Tutorials

  1. Image Classification: Train ResNet on CIFAR-10
  2. Transfer Learning: Fine-tune pre-trained models
  3. Semantic Segmentation: U-Net for binary segmentation
  4. Multi-GPU Training: Distributed training across GPUs
  5. Hyperparameter Tuning: Optimize with Optuna
  6. Model Deployment: Export to TorchScript/ONNX

See the tutorials documentation for complete guides.

💡 Advanced Features

Distributed Training

# Multi-GPU training with DDP
trainer = FabricTrainer(
    task=task,
    optimizer=optimizer,
    criterion=criterion,
    accelerator="gpu",
    strategy="ddp",
    devices="auto"  # Use all GPUs
)

Gradient Accumulation

# Simulate larger batch sizes
trainer.fit(
    train_loader=train_loader,
    val_loader=val_loader,
    epochs=50,
    gradient_accumulation_steps=4  # Effective batch = 4x
)
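
Under the hood, gradient accumulation follows the standard PyTorch pattern: scale each loss by the accumulation factor and only step the optimizer every N batches. A framework-independent sketch (a toy linear model and random data stand in for a real network):

```python
import torch

# Toy stand-ins for a real model and data
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = torch.nn.CrossEntropyLoss()
batches = [(torch.randn(8, 4), torch.randint(0, 2, (8,))) for _ in range(8)]

accumulation_steps = 4
num_updates = 0

optimizer.zero_grad()
for step, (inputs, targets) in enumerate(batches, start=1):
    loss = criterion(model(inputs), targets)
    # Scale the loss so the accumulated gradient matches one large-batch step
    (loss / accumulation_steps).backward()
    if step % accumulation_steps == 0:
        optimizer.step()
        optimizer.zero_grad()
        num_updates += 1
```

With 8 batches and `accumulation_steps=4`, only 2 optimizer updates occur, each equivalent to one step over an effective batch of 32 samples.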

Learning Rate Scheduling

from deepml.lr_scheduler_utils import setup_one_cycle_lr_scheduler_with_warmup

lr_scheduler_fn = lambda opt: setup_one_cycle_lr_scheduler_with_warmup(
    optimizer=opt,
    steps_per_epoch=len(train_loader),
    warmup_ratio=0.1,
    num_epochs=50,
    max_lr=1e-3
)

trainer = FabricTrainer(
    ...,
    lr_scheduler_fn=lr_scheduler_fn
)
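
The exact schedule produced by deepml's helper isn't shown here, but a roughly equivalent setup with PyTorch's built-in `OneCycleLR` looks like this, where `pct_start` plays the role of `warmup_ratio`:

```python
import torch
from torch.optim.lr_scheduler import OneCycleLR

model = torch.nn.Linear(4, 2)  # toy stand-in model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

steps_per_epoch, num_epochs = 100, 50
scheduler = OneCycleLR(
    optimizer,
    max_lr=1e-3,
    total_steps=steps_per_epoch * num_epochs,
    pct_start=0.1,  # fraction of steps spent ramping up to max_lr
)

# Call scheduler.step() once per batch, after optimizer.step()
```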

Experiment Tracking

from deepml.tracking import MLFlowLogger, WandbLogger

# MLflow
logger = MLFlowLogger(
    experiment_name='my-experiment',
    tracking_uri='./mlruns'
)

# Weights & Biases
logger = WandbLogger(
    project='my-project',
    name='experiment-1'
)

trainer.fit(..., logger=logger)

Supported Tasks

| Task                          | Description                 | Typical Use Cases                   |
| ----------------------------- | --------------------------- | ----------------------------------- |
| ImageClassification           | Single-label classification | CIFAR-10, ImageNet                  |
| MultiLabelImageClassification | Multi-label classification  | Object attributes                   |
| Segmentation                  | Pixel-level classification  | Medical imaging, autonomous driving |
| ImageRegression               | Continuous value prediction | Age estimation, depth prediction    |
| NeuralNetTask                 | Generic task template       | Custom tasks                        |

Custom Loss Functions

  • JaccardLoss: IoU loss for segmentation
  • RMSELoss: Root mean squared error
  • WeightedBCEWithLogitsLoss: Weighted binary cross-entropy
  • ContrastiveLoss: For siamese networks
  • AngularPenaltySMLoss: ArcFace, SphereFace, CosFace for face recognition
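
As an illustration of the idea behind an IoU loss (this is a generic sketch, not deepml's exact `JaccardLoss` implementation), a soft Jaccard loss for binary segmentation can be written as:

```python
import torch

class SoftJaccardLoss(torch.nn.Module):
    """Illustrative soft IoU loss for binary segmentation logits."""

    def __init__(self, eps: float = 1e-7):
        super().__init__()
        self.eps = eps

    def forward(self, logits: torch.Tensor, targets: torch.Tensor) -> torch.Tensor:
        probs = torch.sigmoid(logits)
        intersection = (probs * targets).sum()
        union = probs.sum() + targets.sum() - intersection
        # Loss approaches 0 for perfect overlap, 1 for no overlap
        return 1.0 - (intersection + self.eps) / (union + self.eps)
```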

Metrics

  • Classification: Accuracy, BinaryAccuracy
  • Segmentation: IoU, Dice Coefficient, Pixel Accuracy
  • Custom: Easy to implement custom metrics
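
deepml's metric interface isn't detailed here, but a custom metric typically reduces to a small tensor function; a framework-agnostic accuracy, for example:

```python
import torch

def accuracy(logits: torch.Tensor, targets: torch.Tensor) -> float:
    """Fraction of samples whose argmax prediction matches the target label."""
    preds = logits.argmax(dim=1)
    return (preds == targets).float().mean().item()
```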

Datasets

  • ImageDataFrameDataset: Load from pandas DataFrame
  • ImageRowDataFrameDataset: Flattened arrays in DataFrame
  • SegmentationDataFrameDataset: Images + masks with Albumentations
  • ImageListDataset: Directory of images
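
The exact signatures of these dataset classes aren't reproduced here, but the underlying idea of a DataFrame-backed dataset (as in `ImageRowDataFrameDataset`) is a standard `torch.utils.data.Dataset`; a hypothetical sketch where each row holds a flattened image plus a `label` column:

```python
import numpy as np
import pandas as pd
import torch
from torch.utils.data import Dataset

class RowDataFrameDataset(Dataset):
    """Illustrative sketch: each row is a flattened image; 'label' holds the class index."""

    def __init__(self, df: pd.DataFrame, image_shape=(1, 8, 8)):
        self.features = df.drop(columns=["label"]).to_numpy(dtype=np.float32)
        self.labels = df["label"].to_numpy(dtype=np.int64)
        self.image_shape = image_shape

    def __len__(self):
        return len(self.labels)

    def __getitem__(self, idx):
        image = torch.from_numpy(self.features[idx]).reshape(self.image_shape)
        return image, int(self.labels[idx])
```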

Contributing

Contributions are welcome! See CONTRIBUTING.md for guidelines.

Development Setup

git clone https://github.com/sagar100rathod/deep-ml.git
cd deep-ml
pip install -e ".[dev]"
pytest  # Run tests

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments

  • PyTorch team for the amazing framework
  • Lightning AI for Lightning Fabric
  • HuggingFace for Accelerate
  • All contributors to this project

⭐ Star History

If you find this project useful, please consider giving it a star!

Citation

If you use deep-ml in your research, please cite:

@software{deepml2026,
  author = {Rathod, Sagar},
  title = {deep-ml: PyTorch Training Framework},
  year = {2026},
  url = {https://github.com/sagar100rathod/deep-ml}
}

Download files

Source Distribution

  • deepml-3.0.0.tar.gz (177.4 kB), uploaded via poetry/2.2.1 CPython/3.12.13 Linux/6.17.0-1008-azure (Trusted Publishing: No)
      SHA256: 017c311f9a2dc23d4fd797c6c77af1c2e541ec774a9b73c7f1c60be5aaaf59d8
      MD5: 51ed9ca32fb84dbf10dc7056658777eb
      BLAKE2b-256: 8dd8afa3f2ecce174263718522aa6eef491eb46cbd43d065ed07faba8593123e

Built Distribution

  • deepml-3.0.0-py3-none-any.whl (183.0 kB), uploaded via poetry/2.2.1 CPython/3.12.13 Linux/6.17.0-1008-azure (Trusted Publishing: No)
      SHA256: 7fda61f205e1ec73e3c320c7e533f0862cd55d24e03b7532b0211fd669b4dc1d
      MD5: 7eb59aab8d32b45023c6e1b0ddd93d6b
      BLAKE2b-256: 555ce3a387cfd937017d0c679ad34a5a2a11bb78a7c666e7aa13e342e0cfcf66
