
Pybernetics

Pybernetics is a lightweight Python toolkit for developing and training neural networks from scratch. It is designed to be a self-contained library, avoiding the use of third-party machine learning or deep learning frameworks. Pybernetics relies on NumPy for matrix operations and incorporates handcrafted implementations for common neural network components such as layers, activation functions, and optimizers.

Key Features:

  • Lightweight and Modular: Provides essential tools for building and training neural networks while maintaining simplicity and flexibility.
  • Custom Activation Functions: Includes a variety of activation functions implemented using NumPy for high performance and easy customization.
  • Dataset Integration: Offers utilities to generate synthetic datasets or fetch real-world datasets via scikit-learn's fetch_openml (used solely for dataset retrieval).
  • Utilities for NLP: Supports tokenization, bag-of-words, Markov chains, and other natural language processing methods tailored for neural network use cases.
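As a concept illustration of the bag-of-words feature mentioned above (not the library's actual API, which is only documented in its docstrings), a minimal bag-of-words representation can be built with Python's `collections.defaultdict`, one of the built-in modules the library relies on:

```python
from collections import defaultdict

def bag_of_words(text):
    """Count token occurrences in a lowercased, whitespace-tokenized text."""
    counts = defaultdict(int)
    for token in text.lower().split():
        counts[token] += 1
    return dict(counts)

print(bag_of_words("the cat sat on the mat"))
# {'the': 2, 'cat': 1, 'sat': 1, 'on': 1, 'mat': 1}
```

The resulting dictionary maps each token to its frequency, which can then be turned into a fixed-length feature vector for a neural network.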

Modules and Classes:

  • _Utils: Internal utility functions for mathematical operations and helper methods, including:

    • Maths: Implements activation functions such as ReLU, sigmoid, softmax, and their derivatives.
    • Helpers: Provides methods for element-wise operations on NumPy arrays.
  • Dataset: Generates or fetches datasets for training, including synthetic datasets like spirals or real-world datasets using OpenML.

  • NaturalLanguageProcessing: A collection of NLP tools including tokenizers, Markov chains, bag-of-words representations, and character/word predictors.

  • Layers: Contains classes for building neural network layers, including:

    • Dense: Fully connected layers with customizable input and output sizes.
    • Sigmoid: Implements the sigmoid activation function for neural network layers.
    • ReLU: Implements the ReLU activation function for neural network layers.
    • Tanh: Implements the tanh activation function for neural network layers.
    • Binary: Implements a binary step activation function.
    • LeakyReLU: Implements the leaky ReLU activation function with a customizable alpha parameter.
    • Swish: Implements the Swish activation function with a customizable beta parameter.
    • ELU: Implements the ELU activation function with a customizable alpha parameter.
    • Softmax: Implements the softmax activation function for probability distributions.
    • SELU: Implements the SELU activation function with alpha and scale parameters.
    • GELU: Implements the Gaussian Error Linear Unit activation function.
    • Softplus: Implements the softplus activation function.
    • Arctan: Implements the arctan activation function.
    • Signum: Implements the sign function for activation.
    • Hardmax: Implements the hardmax activation function.
    • LogSigmoid: Implements the log-sigmoid activation function.
    • ReLU6: Implements the ReLU6 activation function with output clipping between 0 and 6.
    • TReLU: Implements the thresholded ReLU (TReLU) activation function.
    • Clip: Clips inputs to a defined minimum and maximum value range.
    • Normalize: Normalizes inputs to a specified range.
    • Dropout: Implements the dropout layer for regularization during training.
    • ZeroCenteredSigmoid: A custom activation function designed by the author.
    • Custom: Allows defining custom activation functions and their derivatives.
    • Conv1D: Implements a 1D convolution, applying a sliding kernel to the input.
  • Loss: Defines loss functions for neural network training, including:

    • CategoricalCrossentropy: Computes the cross-entropy loss for classification tasks.
    • MeanSquaredError: Calculates the mean squared error for regression tasks.
  • Optimizers: Provides optimization algorithms for training neural networks, including:

    • SGD: Stochastic Gradient Descent optimizer with customizable learning rate.
  • Training: Contains classes for training neural networks, including:

    • Loop: The main loop for training neural networks with specified optimizers, loss functions, and layers.
  • Models: Defines high-level models for training neural networks, including:

    • Sequential: A feedforward neural network model that can be trained on datasets.
    • load: Loads a saved pybernetics neural network.
  • _Typing: (Internal) Type hints for classes and functions, including custom types for neural network components.

  • DataTypes: Custom data types for neural network components.
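The `Maths` helpers described above are handcrafted NumPy implementations of activation functions and their derivatives. A minimal sketch of what such functions typically look like (illustrative only; the library's exact internals may differ):

```python
import numpy as np

def relu(x):
    # Element-wise max(0, x)
    return np.maximum(0, x)

def relu_derivative(x):
    # Gradient of ReLU: 1 where x > 0, else 0
    return (x > 0).astype(x.dtype)

def sigmoid(x):
    # Logistic function; maps inputs to (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # Subtract the row max for numerical stability before exponentiating
    shifted = x - np.max(x, axis=-1, keepdims=True)
    exps = np.exp(shifted)
    return exps / np.sum(exps, axis=-1, keepdims=True)
```

Each function operates element-wise on NumPy arrays, so it applies equally to a single sample or a whole batch.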

Dependencies:

  • NumPy: Core dependency for numerical computations.

Built-in modules:

  • typing: Type hints for all classes and functions
  • re: Regular expressions, used for fast (non-pure-Python) text filtering and substitution
  • collections: defaultdict, used in the NLP module

Metadata:

  • Author: Marco Farruggio
  • License: MIT
  • Version: 4.5.3
  • Status: Development
  • Created: 2024-11-28
  • Platform: Cross-platform

Usage:

Import pybernetics and utilize its modular components to design, train, and evaluate neural networks or integrate its NLP tools into your projects.

Example:

import pybernetics as pb

sgd_optimizer = pb.Optimizers.SGD(0.01)       # SGD with a learning rate of 0.01
cc_loss = pb.Loss.CC()                        # Categorical cross-entropy loss
sd_dataset = pb.Datasets.spiral_data(100, 3)  # Synthetic spiral dataset

pbnn = pb.Models.Sequential([
    pb.Layers.Dense(2, 3, "random"),
    pb.Layers.Sigmoid(-750, 750),
    pb.Layers.Dense(3, 3, "random"),
    pb.Layers.Tanh(),
    pb.Layers.Dense(3, 3, "random")],
    optimizer=sgd_optimizer,
    loss_function=cc_loss)

pbnn.fit(sd_dataset, 1000)  # Train for 1000 epochs

For full documentation and examples, refer to the class-level docstrings or future project documentation.

Importing

It is recommended to import the library using the following syntax:

import pybernetics as pb

Most docstrings omit the full module path, so using this import style makes the examples easier to follow. The 'pb' alias is also recommended: it is shorter to type, shortens attribute chains, and keeps code readable.

Maths

  • All mathematical operations are performed on NumPy arrays, which every function expects as input.
  • Memory usage is optimized through in-place operations and variable reuse.
  • Speed is optimized by using NumPy, which performs array computations at native C speed.
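The in-place optimization mentioned above typically relies on NumPy's `out=` parameter, which writes a result into an existing array instead of allocating a new one. A generic illustration (not pybernetics-specific code):

```python
import numpy as np

x = np.array([1.0, -2.0, 3.0])
w = np.array([0.5, 0.5, 0.5])

# Out-of-place: allocates a fresh array for the result
y = x * w

# In-place: reuses x's existing buffer, avoiding a new allocation
np.multiply(x, w, out=x)
print(x)  # [ 0.5 -1.   1.5]
```

For large arrays inside a training loop, reusing buffers this way avoids repeated allocations and reduces peak memory usage.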

Dedication

  • Sam Blight
