
Powerful, simple neural-network building blocks based on MyGrad

Project description

License: MIT

A small, extensible, and lightweight framework for building powerful neural networks in Python, built on mygrad.

HelixNet is designed to be a transparent and easy-to-understand tool for learning and experimentation. It is powerful enough to run complex models like CNNs and LSTMs, but simple enough to run efficiently on modest hardware.


Key Features

  • Lightweight & Simple: No complex compilation or heavy dependencies. Just plug and play.
  • Extensible by Design: A clean, object-oriented structure (Layer, Optimiser) makes it easy to create your own custom layers and optimizers.
  • Modern Architecture: Includes common layers like Dense, Conv2D, MaxPooling2D, LSTM, and Embedding.
  • Powerful Optimizers: Comes with robust implementations of SGD (with momentum), RMSProp, and Adam.
  • Full Documentation: Comprehensive documentation available here.
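The Layer/Optimiser design described above can be sketched in plain NumPy. Note that this is a hypothetical mock, not HelixNet's actual API: the base-class name and the forward method are assumptions made here for illustration only.

```python
import numpy as np

class Layer:
    """Hypothetical base class mirroring the object-oriented design described above."""
    def forward(self, x):
        raise NotImplementedError

class Scale(Layer):
    """A toy custom layer with one learnable per-feature scale parameter."""
    def __init__(self, size):
        # Initialise every scale factor to 1 (identity behaviour)
        self.weight = np.ones(size)

    def forward(self, x):
        # Element-wise rescaling of each input feature
        return x * self.weight

layer = Scale(3)
out = layer.forward(np.array([[1.0, 2.0, 3.0]]))
```

In a real subclass you would use mygrad tensors for the parameters so that gradients flow through them; the point here is only the shape of the class hierarchy.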

Installation

# For the latest stable version
pip install HelixNet

# For development (editable) mode
pip install -e .

Quickstart: Training a Model

Here's a quick example of how to build and train a model on the classic "spiral" dataset.

import numpy as np
import mygrad as mg
import helixnet.layers as layers
import helixnet.optimisers as optimisers
import helixnet.activations as activations
import helixnet.models as models
from nnfs.datasets import spiral_data

# Create dataset
X, y = spiral_data(samples=100, classes=3)
X = mg.tensor(X)
y = mg.tensor(y, dtype=int)

# Build model
model = models.Sequential([
    layers.Dense(2, 64, activation=activations.ReLU),
    layers.Dense(64, 3, activation=(lambda x: x)) # Logits
])

optim = optimisers.Adam(learning_rate=0.05, decay=5e-7)

# Train the model
for epoch in range(10001):
    logits = model.forward(X)
    loss = mg.nnet.losses.softmax_crossentropy(logits, y)

    loss.backward()
    optim.optimise(model)
    model.null_grads()

    if epoch % 100 == 0:
        accuracy = np.mean(np.argmax(logits.data, axis=1) == y.data)
        print(f'Epoch: {epoch}, Loss: {loss.data:.4f}, Accuracy: {accuracy:.4f}')
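The loss used in the loop above is softmax cross-entropy over raw logits. As a reference for what that call computes, here is a minimal NumPy sketch (written from the standard definition, not taken from mygrad's source): softmax over each row, then the mean negative log-likelihood of the true classes.

```python
import numpy as np

def softmax_crossentropy(logits, targets):
    # Numerically stable softmax: subtract each row's max before exponentiating
    shifted = logits - logits.max(axis=1, keepdims=True)
    exp = np.exp(shifted)
    probs = exp / exp.sum(axis=1, keepdims=True)
    # Mean negative log-probability assigned to the correct class
    n = len(targets)
    return -np.log(probs[np.arange(n), targets]).mean()

logits = np.array([[2.0, 1.0, 0.1],
                   [0.5, 2.5, 0.3]])
targets = np.array([0, 1])
loss = softmax_crossentropy(logits, targets)
```

With three classes and all-zero logits the softmax is uniform, so the loss is exactly log(3); confident correct predictions drive it toward zero.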

Documentation

For a full guide to all layers, optimizers, and functionalities, please see the Full HelixNet Documentation.

Contributing

Contributions are welcome! If you find a bug or have a feature request, please open an issue. If you'd like to contribute code, please see the notes on our own contributions to mygrad (#445) for an example of our development philosophy.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

helixnet-0.6.2.tar.gz (18.8 kB view details)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

helixnet-0.6.2-py3-none-any.whl (22.0 kB view details)

Uploaded Python 3

File details

Details for the file helixnet-0.6.2.tar.gz.

File metadata

  • Download URL: helixnet-0.6.2.tar.gz
  • Upload date:
  • Size: 18.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for helixnet-0.6.2.tar.gz
  • SHA256: 370a2bfb8cc621d4d819284bc4a8a1b1ed4b9bfba7184bc6eeb76e7fc3e481a4
  • MD5: 627317653fe1ae6ef77fa442ba512112
  • BLAKE2b-256: 7132a7bfd5c7a89287b4fad741ba16dbb0b71d194a5595ffde21cd48912c8342

See more details on using hashes here.
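To check a downloaded file against the digests above, you can compute its hash locally. A minimal sketch using Python's standard hashlib (the file path assumes the sdist is in the current directory):

```python
import hashlib

def sha256_of(path, chunk=8192):
    # Hash the file incrementally so large files need no extra memory
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

# Compare against the SHA256 digest from the table above:
# sha256_of("helixnet-0.6.2.tar.gz") ==
#     "370a2bfb8cc621d4d819284bc4a8a1b1ed4b9bfba7184bc6eeb76e7fc3e481a4"
```

The same check works for the wheel; just substitute the file name and its digest.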

Provenance

The following attestation bundles were made for helixnet-0.6.2.tar.gz:

Publisher: python-publish.yml on TheIridiumMan/HelixNet

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file helixnet-0.6.2-py3-none-any.whl.

File metadata

  • Download URL: helixnet-0.6.2-py3-none-any.whl
  • Upload date:
  • Size: 22.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for helixnet-0.6.2-py3-none-any.whl
  • SHA256: fa627c3bd8d6678adac2bf5103f5ae8096c180025318c76b1687ff44501d06a8
  • MD5: fddb1dd119a7bca6281fa1f8f73057ce
  • BLAKE2b-256: 0f3a30fd427465b9f06c95c0701cb6a1e123eb9d8c845f9bd0b2967a74e0b40e

See more details on using hashes here.

Provenance

The following attestation bundles were made for helixnet-0.6.2-py3-none-any.whl:

Publisher: python-publish.yml on TheIridiumMan/HelixNet

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
