
A hand made machine learning library from scratch, by Thiago Macedo

Project description

SkyNet: A Numpy-powered, 100% Hand Made, Machine Learning Library 🚀

Welcome to SkyNet! This is my personal machine learning library, built entirely from scratch using only NumPy. No professional machine learning libraries like Scikit-Learn, TensorFlow, or PyTorch are allowed; everything is implemented by applying the fundamental concepts of machine learning. It supports classical algorithms, both supervised and unsupervised, as well as deep learning. The project is in its early stages, so there is a lot more to come. It serves as a showcase of my skills and conceptual knowledge in machine learning, calculus, linear algebra, and statistics. Stay tuned, because there is a lot more to be implemented here!


Disclaimer: SkyNet is, at its heart, a project of passion and learning. Please refrain from deploying it in a professional setting. While meticulously crafted, it doesn't leverage state-of-the-art optimization and lacks GPU support.

Table of Contents

  1. Features
  2. Installation
  3. Usage
  4. Acknowledgements

Features

  • NumPy Powered: SkyNet is built entirely with NumPy, showcasing a true understanding of the algorithms.
  • Classical Machine Learning: SkyNet supports both supervised and unsupervised machine learning models.
  • Custom Neural Nets: SkyNet allows you to create custom neural networks with customizable layers, multiple activation functions, and optimization methods.
  • Transparency: SkyNet's code is well-documented with detailed comments and docstrings to guide you through each part of the code.
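To give a taste of what "NumPy powered" means in practice, here is a minimal sketch of the idea behind a dense layer with a leaky ReLU activation. This is a conceptual illustration, not SkyNet's actual internals; all names here are illustrative:

```python
import numpy as np

def leaky_relu(z, alpha=0.01):
    # leaky ReLU: pass positives through, scale negatives by alpha
    return np.where(z > 0, z, alpha * z)

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(4, 3))   # weights: 4 inputs -> 3 units
b = np.zeros(3)                          # one bias per unit
x = rng.normal(size=(2, 4))              # a batch of 2 samples

a = leaky_relu(x @ W + b)                # forward pass through one dense layer
print(a.shape)                           # (2, 3): batch size x units
```

Every layer, loss, and optimizer in the library boils down to vectorized NumPy operations like these.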

Installation

Install skynet_ml directly from PyPI:

pip install skynet-ml

Usage

Getting started with SkyNet is pretty straightforward. Here's a step-by-step guide:

  1. Set Up Your Environment:

Before diving in, ensure you have NumPy installed:

pip install numpy

  2. Enter skynet:

Navigate to the SkyNet directory:

cd skynet

  3. Make Magic!:
# import all the stuff you need 
import numpy as np
from keras.datasets import mnist
from keras.utils import to_categorical
from skynet_ml.nn.models import Sequential
from skynet_ml.nn.layers import Dense
from skynet_ml.nn.optimizers import Adam
from skynet_ml.metrics import ConfusionMatrix
from skynet_ml.utils.nn.early_stopping import EarlyStopping

# read the mnist dataset
(x_train, y_train), (x_test, y_test) = mnist.load_data()

num_labels = len(np.unique(y_train))
y_train = to_categorical(y_train)
y_test = to_categorical(y_test)
image_size = x_train.shape[1]
input_size = image_size**2

x_train = np.reshape(x_train, [-1, input_size])
x_train = x_train.astype('float32') / 255
x_test = np.reshape(x_test, [-1, input_size])
x_test = x_test.astype('float32') / 255

# create the model
model = Sequential()

# add layers to the model
model.add(Dense(n_units=150, activation="leaky_relu", input_dim=input_size))
model.add(Dense(n_units=150, activation="leaky_relu"))
model.add(Dense(n_units=num_labels, activation="softmax"))

# compile the model
opt = Adam(learning_rate=0.001, beta1=0.9, beta2=0.999)
model.compile(loss="categorical_crossentropy", optimizer=opt, regularizer="l2")

# fit your model
model.fit(x_train=x_train, 
          y_train=y_train, 
          x_val=x_test, 
          y_val=y_test, 
          metrics=["accuracy", "precision", "recall", "fscore"], 
          early_stopping=EarlyStopping(patience=10, min_delta=0.0001),
          epochs=5, 
          batch_size=32,
          save_training_history_in="testing/logs/mnist_sequential.csv")

# predict with the model
y_pred = model.predict(x_test)

# compute the confusion matrix
cf = ConfusionMatrix(task_type="multiclass").compute(y_test, y_pred)
ConfusionMatrix(task_type="multiclass").plot(cf, save_in="testing/logs/confusion_matrix.png")

# save the model (save_model, plot_model, and plot_training_history are
# SkyNet utility helpers; import them from skynet_ml's utils before this part)
save_model(model, "testing/logs/mnist_sequential.pkl")
plot_model(model, save_in="testing/logs/mnist_sequential_model.txt")

# plot the training history
plot_training_history("testing/logs/mnist_sequential.csv", save_in="testing/logs/mnist_sequential.png")
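Under the hood, a multiclass confusion matrix like the one ConfusionMatrix produces can be computed in a few lines of plain NumPy. This is a conceptual sketch, not SkyNet's actual implementation; it assumes one-hot labels and predictions, which are argmax-ed back to class indices first:

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    # rows = true class, columns = predicted class
    t = np.argmax(y_true, axis=1)        # one-hot -> class indices
    p = np.argmax(y_pred, axis=1)
    cm = np.zeros((n_classes, n_classes), dtype=int)
    np.add.at(cm, (t, p), 1)             # count each (true, predicted) pair
    return cm

y_true = np.eye(3)[[0, 1, 2, 2]]         # true classes 0, 1, 2, 2 as one-hot rows
y_pred = np.eye(3)[[0, 2, 2, 2]]         # one misclassification: 1 -> 2
print(confusion_matrix(y_true, y_pred, 3))
```

The off-diagonal entry at row 1, column 2 records the single misclassification; the diagonal holds the correct predictions.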

Acknowledgements

  • Big thanks to NumPy for being the foundation of this project.
  • Grateful to my professor Lucas Kupssinsku, who is teaching me all this stuff.
  • Big thanks to Ian Goodfellow for writing a bible about deep learning.
  • Big thanks to my boy ChatGPT, who wrote 90% of the docstrings because I'm way too lazy (including this docstring hehe).


Disclaimer: SkyNet is a personal project and is not designed for professional use. Also, please don't sue me for using the same name as the evil A.I. in The Terminator; the name is a joke.

Download files

Download the file for your platform.

Source Distribution

skynet_ml-1.2.0.7.tar.gz (49.8 kB)


Built Distribution


skynet_ml-1.2.0.7-py3-none-any.whl (75.6 kB)


File details

Details for the file skynet_ml-1.2.0.7.tar.gz.

File metadata

  • Download URL: skynet_ml-1.2.0.7.tar.gz
  • Size: 49.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.6

File hashes

Hashes for skynet_ml-1.2.0.7.tar.gz
  • SHA256: c65507bd3889cce2612103ca95e70faae27b0d42ab600c168941f4a3e37d1e64
  • MD5: a2d9267625269ddd4da41f0f0c603dbc
  • BLAKE2b-256: 97b4762a38064680d6758165c1c07d2804e86d2b1769ce64c9b683238141a5e6


File details

Details for the file skynet_ml-1.2.0.7-py3-none-any.whl.

File metadata

  • Download URL: skynet_ml-1.2.0.7-py3-none-any.whl
  • Size: 75.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.6

File hashes

Hashes for skynet_ml-1.2.0.7-py3-none-any.whl
  • SHA256: 891afed4e08d5ed9e1d14707eeafd724171a3fa6a77ff17fd0df77af385671f4
  • MD5: 52dfe7cd41fb6e893d5ca13209c87c1a
  • BLAKE2b-256: 231c26e4bdf3ca7953c98fcb919e43548e751f7f7c838c2f6be68368427be69d

