
LeibNetz


A lightweight, modular library for rapidly building PyTorch models, with a focus on image segmentation and convolutional neural networks.

Features

  • 🧱 Modular Architecture: Build networks using composable node-based components
  • 🔧 Pre-built Networks: Ready-to-use implementations of U-Net, ScaleNet, and AttentiveScaleNet
  • 📐 Automatic Shape Propagation: Smart shape calculation and management throughout the network
  • 🎯 Specialized for Segmentation: Optimized for image segmentation tasks
  • 🔬 Biologically-Inspired Learning: Local learning rules including Hebbian, Oja's, and Krotov's rules
  • ⚡ PyTorch Integration: Seamless integration with the PyTorch ecosystem

Installation

From PyPI (Recommended)

pip install leibnetz

From Source

git clone https://github.com/janelia-cellmap/LeibNetz.git
cd LeibNetz
pip install -e .

Development Installation

git clone https://github.com/janelia-cellmap/LeibNetz.git
cd LeibNetz
pip install -e ".[dev]"

Quick Start

Building a Simple U-Net

import torch
from leibnetz import build_unet

# Create a U-Net for 4-class segmentation
model = build_unet(
    input_nc=1,        # Single input channel (e.g., grayscale)
    output_nc=4,       # 4 output classes
    base_nc=64,        # Base number of features
    max_nc=512,        # Maximum number of features
    num_levels=4       # Number of resolution levels
)

# Forward pass
x = torch.randn(1, 1, 256, 256)  # Batch, channels, height, width
output = model(x)
print(f"Output shape: {output.shape}")  # [1, 4, 256, 256]

Using the Modular Node System

import torch
from leibnetz import LeibNet
from leibnetz.nodes import ConvPassNode, ResampleNode

# Build a custom network using nodes
nodes = [
    ConvPassNode(input_nc=1, output_nc=32, kernel_size=3),
    ResampleNode(scale_factor=0.5, mode="area"),  # Downsample
    ConvPassNode(input_nc=32, output_nc=64, kernel_size=3),
    ResampleNode(scale_factor=2.0, mode="nearest"),  # Upsample
    ConvPassNode(input_nc=64, output_nc=4, kernel_size=1)  # Final classification
]

model = LeibNet(nodes)

# Use the model
x = torch.randn(1, 1, 128, 128)
output = model(x)

ScaleNet for Multi-Scale Processing

import torch
from leibnetz import build_scalenet

# Create a ScaleNet with multiple processing scales
model = build_scalenet(
    input_nc=1,
    output_nc=4,
    base_nc=32,
    subnet_dict_list=[
        {"input_shape": (64, 64), "num_levels": 3},
        {"input_shape": (128, 128), "num_levels": 4},
        {"input_shape": (256, 256), "num_levels": 4}
    ]
)

# Process different scales (required input shape depends on the configured subnets)
x = torch.randn(1, 1, 256, 256)
outputs = model(x)

Core Components

Networks (leibnetz.nets)

  • U-Net: Classic encoder-decoder architecture for segmentation
  • ScaleNet: Multi-scale processing network for handling different resolutions
  • AttentiveScaleNet: ScaleNet enhanced with attention mechanisms
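The encoder-decoder idea behind U-Net can be sketched in a few lines of plain PyTorch. This is an illustration of the architecture only, not LeibNetz's implementation; all names here are made up for the example:

```python
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    """Minimal two-level encoder-decoder with one skip connection."""
    def __init__(self, in_ch=1, out_ch=4, base=16):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(in_ch, base, 3, padding=1), nn.ReLU())
        self.down = nn.MaxPool2d(2)
        self.bottleneck = nn.Sequential(nn.Conv2d(base, base * 2, 3, padding=1), nn.ReLU())
        self.up = nn.Upsample(scale_factor=2, mode="nearest")
        # The decoder sees upsampled bottleneck features concatenated with the skip
        self.dec = nn.Sequential(nn.Conv2d(base * 2 + base, base, 3, padding=1), nn.ReLU())
        self.head = nn.Conv2d(base, out_ch, 1)

    def forward(self, x):
        skip = self.enc(x)                        # full-resolution features
        z = self.bottleneck(self.down(skip))      # half-resolution features
        z = torch.cat([self.up(z), skip], dim=1)  # skip connection
        return self.head(self.dec(z))

model = TinyUNet()
y = model(torch.randn(1, 1, 64, 64))
print(y.shape)  # torch.Size([1, 4, 64, 64])
```

LeibNetz's build_unet assembles the same pattern from nodes, with depth and feature counts controlled by num_levels, base_nc, and max_nc.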

Nodes (leibnetz.nodes)

Building blocks for custom architectures:

  • ConvPassNode: Convolutional layers with optional normalization and activation
  • ResampleNode: Upsampling/downsampling operations
  • ConvResampleNode: Combined convolution and resampling
  • AdditiveAttentionGateNode: Attention mechanism for feature gating
  • WrapperNode: Wraps existing PyTorch modules as nodes
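The wrapper pattern that WrapperNode implies can be illustrated in plain PyTorch: adapt an arbitrary nn.Module to a node-style interface with shape bookkeeping. The class and method names below are hypothetical, not LeibNetz's actual API:

```python
import torch
import torch.nn as nn

class ModuleWrapper(nn.Module):
    """Adapt any nn.Module to a node-style interface (illustrative sketch)."""
    def __init__(self, module):
        super().__init__()
        self.module = module

    def forward(self, x):
        return self.module(x)

    def get_output_from_input_shape(self, input_shape):
        # Probe the wrapped module with a dummy tensor to infer the output shape
        with torch.no_grad():
            dummy = torch.zeros(1, *input_shape)
            return tuple(self.module(dummy).shape[1:])

node = ModuleWrapper(nn.Conv2d(3, 8, kernel_size=3))  # unpadded conv shrinks by 2
print(node.get_output_from_input_shape((3, 32, 32)))  # (8, 30, 30)
```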

Model Management

  • LeibNet: Main class for composing nodes into networks
  • ModelWrapper: Utilities for model management and deployment

Local Learning Rules

Biologically-inspired learning algorithms:

from leibnetz.local_learning import convert_to_bio, HebbsRule, OjasRule, KrotovsRule

# Apply Hebbian learning to a model
convert_to_bio(model, rule=HebbsRule())
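As a concrete picture of what a Hebbian update does, here is the classic rule Δw = η · yᵀx in plain PyTorch. This is a sketch of the rule itself, not of what convert_to_bio does internally:

```python
import torch

def hebbian_update(weight, x, y, lr=0.01):
    """Hebb's rule: weights grow where pre- and post-synaptic activity
    coincide (delta_w = lr * y^T x); no gradients or loss involved."""
    with torch.no_grad():
        weight += lr * y.t() @ x
    return weight

# One linear "layer": 3 inputs -> 2 outputs
w = torch.zeros(2, 3)
x = torch.tensor([[1.0, 0.0, 1.0]])  # presynaptic activity (1 x 3)
y = torch.tensor([[1.0, 0.0]])       # postsynaptic activity (1 x 2)
hebbian_update(w, x, y, lr=0.1)
print(w)  # only row 0 (the active output unit) moves toward x
```

Oja's and Krotov's rules refine this basic update with normalization and competition, respectively, to keep weights bounded.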

Advanced Usage

Custom Node Creation

import torch
from leibnetz.nodes import Node

class CustomProcessingNode(Node):
    def __init__(self, channels):
        super().__init__()
        self.conv = torch.nn.Conv2d(channels, channels, 3, padding=1)
        self.norm = torch.nn.BatchNorm2d(channels)
        self.activation = torch.nn.ReLU()

    def forward(self, x):
        return self.activation(self.norm(self.conv(x)))

    def get_output_from_input_shape(self, input_shape):
        # Shape preserved through convolution
        return input_shape

    def get_input_from_output_shape(self, output_shape):
        # Inverse shape calculation
        return output_shape
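For nodes that change spatial size, the two shape methods must be inverses of each other. For an unpadded ("valid") convolution, for example, each spatial dimension shrinks by kernel − 1. A quick standalone check of that arithmetic (plain Python, not tied to LeibNetz's Node API):

```python
KERNEL = 3  # an unpadded 3x3 convolution shrinks each spatial dim by 2

def get_output_from_input_shape(input_shape):
    # Valid convolution: out = in - (kernel - 1) per spatial dimension
    return tuple(s - (KERNEL - 1) for s in input_shape)

def get_input_from_output_shape(output_shape):
    # Inverse: in = out + (kernel - 1)
    return tuple(s + (KERNEL - 1) for s in output_shape)

shape = (128, 128)
out = get_output_from_input_shape(shape)
print(out)  # (126, 126)
assert get_input_from_output_shape(out) == shape  # the pair round-trips
```

Keeping these two methods consistent is what lets the network compute valid input/output sizes automatically across many stacked nodes.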

Network Visualization

import matplotlib.pyplot as plt
from leibnetz import build_unet

# Visualize network structure
model = build_unet(input_nc=1, output_nc=4)
model.visualize_network()
plt.show()

Examples

Complete training examples are available in the examples/ directory.

Testing

Run the test suite:

# Install test dependencies
pip install pytest pytest-cov

# Run all tests
pytest tests/ -v

# Run with coverage
pytest tests/ -v --cov --cov-report=term-missing

# Run specific test categories
pytest tests/ -m "not slow"  # Skip slow tests
pytest tests/ -m "unit"      # Run only unit tests

Development

Code Quality

The project uses several tools to maintain code quality:

# Format code
black src/

# Type checking
mypy src/

# Run linting
flake8 src/

# Sort imports
isort src/

Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Make your changes
  4. Add tests for new functionality
  5. Ensure all tests pass (pytest tests/)
  6. Format your code (black src/)
  7. Commit your changes (git commit -m 'Add amazing feature')
  8. Push to the branch (git push origin feature/amazing-feature)
  9. Open a Pull Request

Requirements

  • Python 3.10+
  • PyTorch 1.9+
  • NumPy
  • NetworkX

License

This project is licensed under the BSD 3-Clause License; see the LICENSE file for details.

Citation

If you use LeibNetz in your research, please cite:

@software{leibnetz2024,
  author = {Jeff Rhoades and Larissa Heinrich},
  title = {LeibNetz: A Lightweight and Modular Library for Deep Learning},
  url = {https://github.com/janelia-cellmap/LeibNetz},
  year = {2024}
}
