
A From-Scratch Neural Network Framework for Educational Purposes


forgeNN



Installation

pip install forgeNN

Overview

forgeNN is a neural network framework built from scratch by a solo developer learning ML. It uses vectorized NumPy operations for high-speed training.

Key Features

  • Vectorized Operations: NumPy-powered batch processing (100x+ speedup)
  • Dynamic Computation Graphs: Automatic differentiation with gradient tracking
  • Complete Neural Networks: From simple neurons to complex architectures
  • Production Loss Functions: Cross-entropy, MSE with numerical stability
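
The numerical-stability point above usually means computing cross-entropy with the log-sum-exp trick. A generic NumPy sketch of the idea (not forgeNN's actual implementation):

```python
import numpy as np

def stable_cross_entropy(logits, labels):
    """Mean cross-entropy over a batch, using the log-sum-exp trick."""
    # Subtracting the per-row max keeps exp() from overflowing
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    # Negative log-likelihood of the true class, averaged over the batch
    return -log_probs[np.arange(len(labels)), labels].mean()

logits = np.array([[1000.0, 0.0], [0.0, 1000.0]])  # naive exp() would overflow here
labels = np.array([0, 1])
print(stable_cross_entropy(logits, labels))  # finite, near 0 (confident correct predictions)
```

Without the max subtraction, `np.exp(1000.0)` overflows to `inf` and the loss becomes `nan`.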

Quick Start

High-Performance Training

import forgeNN
from sklearn.datasets import make_classification

# Generate dataset
X, y = make_classification(n_samples=1000, n_features=20, n_classes=3)

# Create vectorized model
model = forgeNN.VectorizedMLP(20, [64, 32], 3)
optimizer = forgeNN.VectorizedOptimizer(model.parameters(), lr=0.01)

# Convert inputs to a tensor once; labels stay a NumPy array
x_batch = forgeNN.Tensor(X)

# Fast batch training
for epoch in range(10):
    # Forward pass
    logits = model(x_batch)
    loss = forgeNN.cross_entropy_loss(logits, y)

    # Backward pass
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    acc = forgeNN.accuracy(logits, y)
    print(f"Epoch {epoch}: Loss = {loss.data:.4f}, Acc = {acc*100:.1f}%")

Architecture

  • Main API: forgeNN.Tensor, forgeNN.VectorizedMLP (production use)
  • Legacy API: forgeNN.legacy.* (educational purposes)
  • Functions: Complete activation and loss function library
  • Examples: example.py - Complete MNIST classification demo

Performance

Implementation   Speed                  MNIST Accuracy
Vectorized       38,000+ samples/sec    93%+ in <2s

Highlights:

  • 100x+ speedup over scalar implementations
  • Production-ready performance with educational clarity
  • Memory efficient vectorized operations
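
The 100x+ figure comes from replacing Python-level loops with NumPy batch operations. A small self-contained demonstration (pure NumPy, independent of forgeNN):

```python
import time
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 64))   # batch of 200 inputs
W = rng.standard_normal((64, 32))    # one dense layer's weights

def matmul_loops(X, W):
    """Scalar-style matrix multiply: the slow path vectorization replaces."""
    out = np.zeros((X.shape[0], W.shape[1]))
    for i in range(X.shape[0]):
        for j in range(W.shape[1]):
            for k in range(X.shape[1]):
                out[i, j] += X[i, k] * W[k, j]
    return out

t0 = time.perf_counter(); slow = matmul_loops(X, W); t_loop = time.perf_counter() - t0
t0 = time.perf_counter(); fast = X @ W;              t_vec = time.perf_counter() - t0

print(f"loop: {t_loop:.4f}s  vectorized: {t_vec:.6f}s  (same result)")
```

The exact speedup depends on hardware and array sizes, but the gap grows with batch size.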

Complete Example

See example.py for a full MNIST classification demo achieving professional results.

TODO List

Based on a comprehensive comparison with PyTorch and NumPy:

CRITICAL MISSING FEATURES (High Priority):

  1. TENSOR SHAPE OPERATIONS:

    • reshape() : Change tensor dimensions (tensor.reshape(2, -1))
    • transpose() : Swap dimensions (tensor.transpose(0, 1))
    • view() : Memory-efficient reshape (tensor.view(-1, 5))
    • flatten() : Convert to 1D (tensor.flatten())
    • squeeze() : Remove size-1 dims (tensor.squeeze())
    • unsqueeze() : Add size-1 dims (tensor.unsqueeze(0))
  2. MATRIX OPERATIONS:

    • matmul() / @ : Matrix multiplication with broadcasting
    • dot() : Vector dot product
  3. TENSOR COMBINATION:

    • cat() : Join along existing dim (torch.cat([a, b], dim=0))
    • stack() : Join along new dim (torch.stack([a, b]))
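
For reference, the intended semantics of these operations match their NumPy counterparts one-for-one. A plain-NumPy sketch of the target behavior (forgeNN's eventual method names may differ):

```python
import numpy as np

t = np.arange(6)                     # [0 1 2 3 4 5]
r = t.reshape(2, -1)                 # (2, 3); -1 infers the remaining dim
tr = r.T                             # transpose -> (3, 2)
f = r.flatten()                      # back to 1D -> (6,)
u = r[np.newaxis, ...]               # unsqueeze -> (1, 2, 3)
s = u.squeeze()                      # drop size-1 dims -> (2, 3)

a, b = np.ones((2, 3)), np.zeros((2, 3))
c = np.concatenate([a, b], axis=0)   # cat along an existing dim -> (4, 3)
st = np.stack([a, b])                # stack along a new dim -> (2, 2, 3)
mm = a @ b.T                         # matmul with broadcasting -> (2, 2)
```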

IMPORTANT FEATURES (Medium Priority):

  1. ADVANCED ACTIVATIONS:

    • lrelu() : AVAILABLE as forgeNN.functions.activation.LRELU (needs fixing)
    • swish() : AVAILABLE as forgeNN.functions.activation.SWISH (needs fixing)
    • gelu() : Gaussian Error Linear Unit (missing)
    • elu() : Exponential Linear Unit (missing)
  2. TENSOR UTILITIES:

    • split() : Split into chunks
    • chunk() : Split into equal pieces
    • permute() : Rearrange dimensions
  3. INDEXING:

    • Boolean indexing: tensor[tensor > 0]
    • Fancy indexing: tensor[indices]
    • gather() : Select along dimension
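
The missing activations are one-liners in NumPy. A sketch of the standard formulas (using the common tanh approximation for GELU; forgeNN's versions may differ in defaults):

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # Pass positives through, scale negatives by alpha
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    # Smooth exponential curve below zero
    return np.where(x > 0, x, alpha * (np.exp(x) - 1))

def swish(x):
    # x * sigmoid(x), also known as SiLU
    return x / (1 + np.exp(-x))

def gelu(x):
    # Tanh approximation of the Gaussian Error Linear Unit
    return 0.5 * x * (1 + np.tanh(np.sqrt(2 / np.pi) * (x + 0.044715 * x**3)))

x = np.array([-2.0, 0.0, 2.0])
print(leaky_relu(x), elu(x), swish(x), gelu(x))
```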

NICE-TO-HAVE (Lower Priority):

  1. LINEAR ALGEBRA:

    • norm() : Vector/matrix norms
    • det() : Matrix determinant
    • inverse() : Matrix inverse
  2. CONVENIENCE:

    • clone() : Deep copy
    • detach() : Remove from computation graph
    • requires_grad_(): In-place grad requirement change
  3. INFRASTRUCTURE:

    • Better error messages for shape mismatches
    • Memory-efficient operations
    • API consistency improvements
    • Comprehensive documentation
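
The linear-algebra items above map directly onto np.linalg; shown here in plain NumPy rather than a forgeNN API:

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 4.0]])

fro = np.linalg.norm(A)    # Frobenius norm: sqrt of the sum of squares
d = np.linalg.det(A)       # determinant
A_inv = np.linalg.inv(A)   # inverse, so A @ A_inv is the identity

print(fro, d, A_inv)
```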

PRIORITY ORDER:

  1. Shape operations (reshape, transpose, flatten)
  2. Matrix multiplication (matmul, @)
  3. Tensor combination (cat, stack)
  4. More activations (leaky_relu, gelu)
  5. Documentation and error handling

Contributing

I am not currently accepting contributions, but I'm always open to suggestions and feedback!

Acknowledgments

  • Inspired by educational automatic differentiation tutorials
  • Built for both learning and production use
  • Optimized with modern NumPy practices
  • Available on PyPI: pip install forgeNN
