LeibNetz

A lightweight and modular library for rapidly developing and constructing PyTorch models for deep learning, specifically focused on image segmentation and convolutional neural networks.

Features

  • 🧱 Modular Architecture: Build networks using composable node-based components
  • 🔧 Pre-built Networks: Ready-to-use implementations of U-Net, ScaleNet, and AttentiveScaleNet
  • 📐 Automatic Shape Propagation: Smart shape calculation and management throughout the network
  • 🎯 Specialized for Segmentation: Optimized for image segmentation tasks
  • 🔬 Biologically-Inspired Learning: Local learning rules including Hebbian, Oja's, and Krotov's rules
  • ⚡ PyTorch Integration: Built on standard torch.nn modules, so models work with the rest of the PyTorch ecosystem

Installation

From PyPI (Recommended)

pip install leibnetz

From Source

git clone https://github.com/janelia-cellmap/LeibNetz.git
cd LeibNetz
pip install -e .

Development Installation

git clone https://github.com/janelia-cellmap/LeibNetz.git
cd LeibNetz
pip install -e ".[dev]"

Quick Start

Building a Simple U-Net

import torch
from leibnetz import build_unet

# Create a U-Net for 4-class segmentation
model = build_unet(
    input_nc=1,        # Single input channel (e.g., grayscale)
    output_nc=4,       # 4 output classes
    base_nc=64,        # Base number of features
    max_nc=512,        # Maximum number of features
    num_levels=4       # Number of resolution levels
)

# Forward pass
x = torch.randn(1, 1, 256, 256)  # Batch, channels, height, width
output = model(x)
print(f"Output shape: {output.shape}")  # [1, 4, 256, 256]
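With these settings, a conventional U-Net encoder doubles its feature count at each level until it reaches max_nc. A quick sketch of that convention (an assumption about build_unet's internals, but the standard U-Net pattern):

```python
def unet_channels(base_nc=64, max_nc=512, num_levels=4):
    """Feature count per level: doubles each level, capped at max_nc."""
    return [min(base_nc * 2**level, max_nc) for level in range(num_levels)]

print(unet_channels())  # [64, 128, 256, 512]
```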

Using the Modular Node System

import torch
from leibnetz import LeibNet
from leibnetz.nodes import ConvPassNode, ResampleNode

# Build a custom network using nodes
nodes = [
    ConvPassNode(input_nc=1, output_nc=32, kernel_size=3),
    ResampleNode(scale_factor=0.5, mode="area"),  # Downsample
    ConvPassNode(input_nc=32, output_nc=64, kernel_size=3),
    ResampleNode(scale_factor=2.0, mode="nearest"),  # Upsample
    ConvPassNode(input_nc=64, output_nc=4, kernel_size=1)  # Final classification
]

model = LeibNet(nodes)

# Use the model
x = torch.randn(1, 1, 128, 128)
output = model(x)
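The two ResampleNodes above cancel out spatially: halving the resolution and then doubling it returns the original size. The shape arithmetic can be sketched in plain Python (illustrative only, independent of the library's internal shape handling):

```python
def resampled_shape(shape, scale_factor):
    # spatial dims are scaled and truncated to integers
    return tuple(int(s * scale_factor) for s in shape)

shape = (128, 128)
down = resampled_shape(shape, 0.5)  # halve each spatial dim
up = resampled_shape(down, 2.0)     # double them back
print(down, up)
```

Note that for odd sizes the round trip is lossy (63 → 31 → 62), which is one reason input shapes are usually chosen to be divisible by the total downsampling factor.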

ScaleNet for Multi-Scale Processing

import torch
from leibnetz import build_scalenet

# Create a ScaleNet with multiple processing scales
model = build_scalenet(
    input_nc=1,
    output_nc=4,
    base_nc=32,
    subnet_dict_list=[
        {"input_shape": (64, 64), "num_levels": 3},
        {"input_shape": (128, 128), "num_levels": 4},
        {"input_shape": (256, 256), "num_levels": 4}
    ]
)

# Forward pass (input sized for the largest subnet scale)
x = torch.randn(1, 1, 256, 256)
outputs = model(x)
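Each subnet downsamples num_levels - 1 times, so a useful rule of thumb (an assumption here, not a documented LeibNetz requirement) is that every spatial dimension of a subnet's input_shape should divide evenly by 2**(num_levels - 1):

```python
def valid_for_levels(shape, num_levels):
    # each downsampling level halves the spatial size, so every dim
    # should divide evenly by 2**(num_levels - 1)
    factor = 2 ** (num_levels - 1)
    return all(s % factor == 0 for s in shape)

subnets = [((64, 64), 3), ((128, 128), 4), ((256, 256), 4)]
print([valid_for_levels(shape, n) for shape, n in subnets])  # [True, True, True]
```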

Core Components

Networks (leibnetz.nets)

  • U-Net: Classic encoder-decoder architecture for segmentation
  • ScaleNet: Multi-scale processing network for handling different resolutions
  • AttentiveScaleNet: ScaleNet enhanced with attention mechanisms

Nodes (leibnetz.nodes)

Building blocks for custom architectures:

  • ConvPassNode: Convolutional layers with optional normalization and activation
  • ResampleNode: Upsampling/downsampling operations
  • ConvResampleNode: Combined convolution and resampling
  • AdditiveAttentionGateNode: Attention mechanism for feature gating
  • WrapperNode: Wraps existing PyTorch modules as nodes

Model Management

  • LeibNet: Main class for composing nodes into networks
  • ModelWrapper: Utilities for model management and deployment

Local Learning Rules

Biologically-inspired learning algorithms:

from leibnetz.local_learning import convert_to_bio, HebbsRule, OjasRule, KrotovsRule

# Apply Hebbian learning to a model
convert_to_bio(model, rule=HebbsRule())
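The classic Hebbian update strengthens a weight in proportion to the product of its pre- and post-synaptic activity, Δw = η · post · pre. A framework-free sketch of that rule (illustrative only; the library's HebbsRule applies an analogous local update to the model's layers):

```python
def hebbian_update(w, pre, post, lr=0.01):
    """Return updated weights with w[i][j] += lr * post[i] * pre[j]."""
    return [[wij + lr * post[i] * pre[j] for j, wij in enumerate(row)]
            for i, row in enumerate(w)]

w = [[0.0, 0.0], [0.0, 0.0]]
w = hebbian_update(w, pre=[1.0, 2.0], post=[1.0, -1.0])
print(w)
```

Because the update depends only on locally available activity, no backpropagated gradient is required, which is what makes these rules "local".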

Advanced Usage

Custom Node Creation

import torch
from leibnetz.nodes import Node

class CustomProcessingNode(Node):
    def __init__(self, channels):
        super().__init__()
        self.conv = torch.nn.Conv2d(channels, channels, 3, padding=1)
        self.norm = torch.nn.BatchNorm2d(channels)
        self.activation = torch.nn.ReLU()

    def forward(self, x):
        return self.activation(self.norm(self.conv(x)))

    def get_output_from_input_shape(self, input_shape):
        # Shape preserved through convolution
        return input_shape

    def get_input_from_output_shape(self, output_shape):
        # Inverse shape calculation
        return output_shape
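For a node that changes spatial size, the two shape methods must be exact inverses so shapes can be propagated in either direction through the network. A sketch for a hypothetical unpadded ("valid") 3×3 convolution with stride 1, using plain tuples rather than the library's shape types:

```python
KERNEL = 3

def get_output_from_input_shape(input_shape):
    # a valid conv with stride 1 shrinks each spatial dim by kernel - 1
    return tuple(s - (KERNEL - 1) for s in input_shape)

def get_input_from_output_shape(output_shape):
    # inverse: grow each spatial dim back by kernel - 1
    return tuple(s + (KERNEL - 1) for s in output_shape)

shape = (64, 64)
roundtrip = get_input_from_output_shape(get_output_from_input_shape(shape))
print(get_output_from_input_shape(shape), roundtrip)
```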

Network Visualization

import matplotlib.pyplot as plt
from leibnetz import build_unet

# Visualize network structure
model = build_unet(input_nc=1, output_nc=4)
model.visualize_network()
plt.show()

Examples

Complete training examples are available in the examples/ directory.

Testing

Run the test suite:

# Install test dependencies
pip install pytest pytest-cov

# Run all tests
pytest tests/ -v

# Run with coverage
pytest tests/ -v --cov --cov-report=term-missing

# Run specific test categories
pytest tests/ -m "not slow"  # Skip slow tests
pytest tests/ -m "unit"      # Run only unit tests

Development

Code Quality

The project uses several tools to maintain code quality:

# Format code
black src/

# Type checking
mypy src/

# Run linting
flake8 src/

# Sort imports
isort src/

Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Make your changes
  4. Add tests for new functionality
  5. Ensure all tests pass (pytest tests/)
  6. Format your code (black src/)
  7. Commit your changes (git commit -m 'Add amazing feature')
  8. Push to the branch (git push origin feature/amazing-feature)
  9. Open a Pull Request

Requirements

  • Python 3.10+
  • PyTorch 1.9+
  • NumPy
  • NetworkX

License

This project is licensed under the MIT License - see the LICENSE file for details.

Citation

If you use LeibNetz in your research, please cite:

@software{leibnetz2024,
  author = {Jeff Rhoades and Larissa Heinrich},
  title = {LeibNetz: A Lightweight and Modular Library for Deep Learning},
  url = {https://github.com/janelia-cellmap/LeibNetz},
  version = {0.2.0},
  year = {2024}
}

